IIS vs Apache: Which is the Right Choice?
Last Updated: 08/26/2024
Both Apache and Microsoft IIS (Internet Information Services) are well suited to hosting many kinds of websites for many kinds of people and businesses. If you are looking into running your own web server, you have probably come across the old IIS vs. Apache war.
True, Apache has the broadest compatibility with existing websites, mainly through .htaccess files and support for older web applications, but IIS is a powerful, capable web server as well (it supports ASP.NET, a powerful web application framework).
Over the years, IIS has gained much attention from the web hosting crowd for its support of popular web applications (e.g., WordPress). Also, PHP continues to support Windows, as it has for years.
Both IIS and Apache can be installed and used instantly, out of the box, for hosting HTML files. However, both need to be configured to make use of other technologies (such as PHP or Perl).
IIS also sandboxes people’s websites from each other (Application Pool Isolation) and allows for separate security permissions via the Access Control Lists built into the NTFS file system and the rest of the Windows operating system.
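To give a feel for what that isolation buys you, here is a minimal sketch using Python’s multiprocessing module (purely as an analogy; IIS actually does this with separate w3wp.exe worker processes, and the site_worker function and site names below are made up for illustration). Because each site lives in its own OS process, one site crashing does not take the other down:

import multiprocessing
import time

def site_worker(name, should_crash):
    # Stands in for one website's worker process (one "application pool").
    if should_crash:
        raise RuntimeError(name + " crashed!")
    for _ in range(3):
        print(name + " is still serving requests")
        time.sleep(0.5)

if __name__ == "__main__":
    a = multiprocessing.Process(target=site_worker, args=("site-a", True))
    b = multiprocessing.Process(target=site_worker, args=("site-b", False))
    a.start()
    b.start()
    a.join()
    b.join()
    # site-a dies with a traceback, but site-b keeps serving:
    print("site-a exit code:", a.exitcode, "/ site-b exit code:", b.exitcode)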
Here are my opinions on which web server performs the best in certain areas.
Specific Area | Winner
Easy Website Sandboxing (websites hosted on the same web server are protected from each other) | IIS (via Application Pool Isolation)
Quick and Easy Initial Setup | IIS & Apache (tie)
Easier to Manage | IIS (because of its powerful graphical user interface)
Most Compatible with Websites (excluding ASP and ASP.NET websites) | Apache
Amount of Available Internet Support | Apache
Best PHP Performer (assuming FastCGI is used) | IIS & Apache (tie)
Lighter on your System Resources | IIS
Native ASP and ASP.NET Support | IIS (there is .NET Core for Linux, but it is not 1:1 with the full .NET Framework)
Immune to the Slowloris Attack | IIS (Apache can be configured to resist the Slowloris attack, but without a rewrite it cannot be made immune)
Notes
- (Application Pool Isolation) IIS application pools isolate different web applications from each other. This means that if one application crashes or its security is compromised, it does not affect others running on the same server.
- FastCGI allows servers to serve PHP-enabled websites faster by keeping the PHP process (or processes) alive instead of shutting them down when not in use. Creating and then terminating a process for every request is resource intensive when a server has many requests to deal with (the first sketch after these notes illustrates the difference).
- There is Mono for Apache, but that does not count, since Mono only reimplements ASP.NET; it is not the authentic ASP.NET framework.
- The Slowloris attack is a type of Denial-of-Service attack that temporarily takes a website offline when it runs on an affected web server (e.g., Apache).
- The attack uses up all the connection slots on the web server, so legitimate web traffic cannot get through. Unfortunately, Apache can never be immune to this attack without a rewrite of its code.
- There is an interesting Apache module you can get to help mitigate a Slowloris attack. In my testing, the mod_antiloris Apache module appears to mitigate the attacks quite effectively.
- In addition, putting Apache behind a reverse proxy (e.g., Caddy) will also stop the Slowloris attack from affecting your Apache web server (the second sketch below shows the basic idea behind these defences).
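Here is a rough sketch of the FastCGI note above, using Python subprocesses to stand in for PHP processes (an analogy only; the exact numbers depend on your operating system and interpreter). The “CGI style” loop pays the process start-up cost on every request, while the “FastCGI style” loop pays it once and reuses the worker:

import subprocess
import sys
import time

REQUESTS = 20

# CGI style: launch a brand-new interpreter process for every "request".
start = time.perf_counter()
for _ in range(REQUESTS):
    subprocess.run([sys.executable, "-c", "print('response')"],
                   capture_output=True)
cgi_time = time.perf_counter() - start

# FastCGI style: one persistent worker process answers every "request".
start = time.perf_counter()
worker = subprocess.Popen(
    [sys.executable, "-u", "-c",
     "import sys\nfor line in sys.stdin:\n    print('response')"],
    stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True)
for _ in range(REQUESTS):
    worker.stdin.write("request\n")
    worker.stdin.flush()
    worker.stdout.readline()
fastcgi_time = time.perf_counter() - start
worker.stdin.close()
worker.wait()

print("new process per request: %.3fs" % cgi_time)
print("one persistent process:  %.3fs" % fastcgi_time)

The persistent worker should come out well ahead, which is exactly the saving FastCGI gives PHP under load.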
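And here is a toy sketch of the basic defence against Slowloris. The attack works by opening many connections and dribbling out the request headers a few bytes at a time, so each connection hogs a worker slot indefinitely. A server that enforces a deadline for receiving the complete headers simply drops the slow client (a conceptual illustration only, not production code; the handle function and HEADER_TIMEOUT value are mine, and a real server would service connections concurrently):

import socket

HEADER_TIMEOUT = 10  # seconds allowed to deliver the complete request headers

def handle(conn):
    conn.settimeout(HEADER_TIMEOUT)
    data = b""
    try:
        while b"\r\n\r\n" not in data:  # wait for the end of the headers
            chunk = conn.recv(1024)
            if not chunk:  # client went away
                return
            data += chunk
        conn.sendall(b"HTTP/1.1 200 OK\r\nContent-Length: 2\r\n\r\nok")
    except socket.timeout:
        # A Slowloris client never finishes its headers in time; closing
        # the socket frees the connection slot it was hogging.
        pass
    finally:
        conn.close()

srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 8080))
srv.listen()
while True:
    conn, _addr = srv.accept()
    handle(conn)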
Now am I saying that Apache is not any good? No, not at all. I personally view web servers – and any other server software – as tools. Just as a mechanic has different tools for his work, so does a server administrator have different tools at his disposal. If you feel Apache gets the job done, use Apache. If IIS gets the job done, use IIS.
Posted in Internet and Servers, Software
Do I Need a Web Hosting Control Panel?
If you are looking into running your own web server, you probably have heard about web hosting control panels before.
A web hosting control panel is software that runs on a web server and allows you and others to manage web domains, e-mail accounts, FTP accounts, MySQL databases, and so on.
Here are four questions to ask yourself if you are wondering whether you need a web-based control panel:
1) Are you experienced with the Linux command line? If you plan to use Windows, have you ever administered a Windows server before?
2) Do you have intermediate knowledge of how to set up and run a web server?
3) Are you good at problem-solving?
4) Do you have many websites to manage?
If you answered “no” to questions #1, #2, or #3, and/or you answered “yes” to question #4, then you will be more comfortable using a web hosting control panel. Likewise, if you plan to sell web hosting or have a lot of websites to host, a control panel may be easier than doing everything manually. Otherwise, just forgo the web hosting control panel and do it yourself.
Please keep in mind that by using a web hosting control panel you are potentially making a targeted attack easier, since one little panel script with a vulnerability in it can compromise your entire server (not joking).
If someone does hack the web-based control panel and gets into your accounts, they can do some serious damage, and you had better hope you have a current backup that restores successfully.
Here is an example of a web hosting control panel causing a security problem (I know the linked article is old, but it still proves my point): https://krebsonsecurity.com/2012/07/plesk-0day-for-sale-as-thousands-of-sites-hacked/
Web hosting panels are there to make your job easier, usually at the expense of flexibility with your server. When using a web hosting control panel, you are “locked in” to whatever the panel allows you to do. It is basically a choice between convenience and flexibility with your server.
It is not advisable to “do your own thing” (doing something the web hosting panel does not support, or going around the panel to do it), since this can cause problems down the road. It is best to stick with whatever the web hosting control panel provides, so you had better pick the right one the first time.
Posted in Computers, Internet and Servers, Software
5 Reasons Open Source Software is not Universally Used
Before anyone starts getting the pitchforks and torches out, let me say that I do use open source software. I use it on my home Internet router. I use it when I program web scripts for my websites. I use it for my online web-mail. I do not completely throw out open source; I just don’t think it has the potential that people may think it has.
Reason 1: There is no constant, responsive technical support for open source software
People who are for open source may tell you that open source projects have their own forums where users can receive support. The problem is, no one is obligated to help you with the software. Why? Because it is FREE! You are not paying them a dime (donations do not count), so what real incentive does anyone have to help you with the software?
If you are fortunate, you might have one to five people who stick around and help users out frequently. However, life happens, and these people may not be around forever.
Also, support from Red Hat or Novell does not count. You are paying them for support. That is totally different from free community support.
Reason 2: Nothing is really free.
If you studied economics in school, you know that nothing is really free. It still took money, time, and effort to make whatever it is that is free. Open source projects like CentOS and Ubuntu still take money to operate. Don’t ever think that they don’t. They have to pay for things like servers (to run the website), equipment to test things on, and maybe even a few friends for their time helping with the project. Everything in the open source world somehow, someway costs money!
Reason 3: People can take a particular open source project, add/modify/delete parts of it, and release it under a different name.
The act of taking an open source project and making your own version of it is called forking. Programmers fork projects because they want to “control” the project; since they cannot control the main project, they make a copy of it and modify it to their own needs and desires. Then they release it, sometimes under a different name.
There is nothing inherently wrong with forking an open source project. However, it can confuse someone who wants to use the software: instead of choosing from only what the main developer(s) made, they now have to choose between the forked project(s) and the main project. Which project has the best features for my needs? Which project has the best free support? Which project will still be around in 3 years?
Reason 4: Several “clones” of the same thing.
With open source software, you will notice some projects appear to be clones of each other. The best example of this is desktop Linux operating systems. There are many desktop Linux operating systems out there (not even counting the smaller ones), but why so many? How is someone supposed to choose between Ubuntu, CentOS, Linux Mint, Debian, Fedora, Slackware, Arch Linux, Gentoo, Mageia, OpenSUSE, PCLinuxOS 2011, etc.?
Reason 5: No responsibility taken.
You will find that most open source software is not under any kind of warranty. The developers of open source software will, most of the time, not take responsibility for any damage, loss, or anything else that happens due to bugs or whatnot.
Everyone is free to choose what kind of software they use, but remember that free does not always mean better.
Posted in Software