Response to “Antivirus – Community Ubuntu Documentation”

This is a response (as of 10-06-2013) to the following sections of the Community Ubuntu Documentation wiki page “Antivirus” (https://help.ubuntu.com/community/Antivirus). No disrespect is intended with my replies.

1) “Possible reasons Linux is less prone to malware”

2) “Root User vs normal usage”

3) “Market Share Myth”

The Ubuntu documentation is quoted first, with my replies following. All quotes from the wiki are direct quotes.

——————-

Possible reasons linux is less prone to malware

  1. Programs are run as normal user, not Root User
  2. More eyeballs on the code, nowhere for malware to hide
  3. Vast diversity makes it difficult to reproduce flaws in a system
  4. All software and drivers are frequently updated by Package Managers
  5. Software is generally installed from vast Repositories not from unfamiliar websites
  6. Developers/programmers are recognised as Rock Gods rather than treated with contempt
  7. Elegant, secure code is admired & aspired to. Hasty kludges are an embarrassment

Response to #1:  Both Windows (2000/XP/Vista/7/8/8.1/10) and Ubuntu Linux can run software as a normal user.
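
To make this concrete: on either OS, a program can detect whether it was launched with elevated rights, and both default to unprivileged accounts for everyday use. A minimal cross-platform sketch in Python (my own illustration, not from the wiki):

```python
import os

def is_privileged() -> bool:
    """Return True if this process has root (Linux) or administrator (Windows) rights."""
    if os.name == "posix":
        # On Ubuntu/Linux, root has an effective user ID of 0.
        return os.geteuid() == 0
    # On Windows, ask the shell whether our access token carries admin rights.
    import ctypes
    return ctypes.windll.shell32.IsUserAnAdmin() != 0

print("privileged" if is_privileged() else "normal user")
```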

Response to #2:  Myth. If anything, there is so much code (as in the Linux kernel) that no one could constantly go through all of it to make sure that no “monkey wrenches” have been thrown into the works. 🙂

Take a look at: http://scalibq.wordpress.com/2013/05/15/the-myth-of-linuxopen-source-security/

Response to #3: I assume you mean many different types of hardware when you said “vast diversity”. That is not always true. If there is a flaw in the Linux kernel, technically it could affect all Linux systems that have not been patched.

Response to #4:  This does not guarantee that no viruses can take over your system. This is a poor argument.

Response to #5:  You are assuming that the servers hosting the files for the repositories are not infected with viruses. This does not guarantee that no viruses can make their way into your system. This is a poor argument.

Response to #6:  …no comment…

Response to #7:  Not all software for Linux is secure. For example, the BIND DNS server has had multiple security issues over a 15+ year span. Not good.

“A computer virus, like a biological virus, must have a reproduction rate that exceeds its death (eradication) rate in order to spread. Each of the above obstacles significantly reduces the reproduction rate of the Linux virus. If the reproduction rate falls below the threshold necessary to replace the existing population, the virus is doomed from the beginning — even before news reports start to raise the awareness level of potential victims.” by Ray of http://librenix.com

A virus, if programmed correctly, could simply lie dormant until other computers that can be infected are detected. Most viruses, in my opinion, will only get as far as the computer they infected (whether on Windows or Linux).
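
Ray’s reproduction-vs-eradication argument is easy to sketch numerically. Assuming each infection spawns new copies at a rate r per cycle while a fraction d of infections is eradicated per cycle (both rates invented purely for illustration), the infected population survives only when r exceeds d:

```python
def infected_after(population: float, r: float, d: float, cycles: int) -> float:
    """Each cycle, every infection spawns r new ones and a fraction d is eradicated."""
    for _ in range(cycles):
        population *= (1 + r) * (1 - d)
    return population

# Hypothetical rates, for illustration only.
print(infected_after(1000, r=0.05, d=0.10, cycles=50))  # r < d: dies out (~59 left)
print(infected_after(1000, r=0.10, d=0.05, cycles=50))  # r > d: spreads (~9,000)
```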

Root User vs normal usage

“For a Linux binary virus to infect executables, those executables must be writeable by the user activating the virus. That is not likely to be the case. Chances are, the programs are owned by root and the user is running from a non-privileged account. Further, the less experienced the user, the lower the likelihood that he actually owns any executable programs. Therefore, the users who are the least savvy about such hazards are also the ones with the least fertile home directories for viruses.” by Ray of http://librenix.com

If the virus uses an exploit in the Linux kernel, it may not matter whether or not the current user has permission to access other files. If you have SELinux enabled (assuming you are using a distribution that includes it), that may help prevent the virus from functioning (or at least from functioning correctly).
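
The quoted permissions argument is easy to check on your own machine. Here is a minimal sketch (my own, making no claim about any particular distribution) that lists programs on your PATH which the current user could overwrite; run as a normal user on a stock install, it should find few, if any:

```python
import os

def writable_executables():
    """Yield files on $PATH that the current user can both execute and overwrite."""
    for directory in os.environ.get("PATH", "").split(os.pathsep):
        if not os.path.isdir(directory):
            continue
        for name in os.listdir(directory):
            path = os.path.join(directory, name)
            # A binary virus needs a target that is executable *and*
            # writable by the user running the infected program.
            if os.path.isfile(path) and os.access(path, os.X_OK | os.W_OK):
                yield path

for hit in writable_executables():
    print(hit)
```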

Market Share Myth

Some people say that linux suffers less from malware because it has less than 1% of the desktop market compared to Windows 90% & suggest that if linux ever increases in popularity then it will suffer just as badly. This argument is deeply flawed & not just by the spurious statistics. Linux dominates server markets (NB: this link is dead). Why struggle to write a virus that might knock out a few thousand desktops when knocking out a few thousand servers could knock out a continent? Yet it is the desktop machines that are commonly exploited.

If 90% of computer users switched to Linux overnight, you would see a huge difference in the amount of malware you have for Linux.

What I think you do not understand is that hackers will go after targets that are easy and rich in “bounty”. In my opinion, most Windows users do not understand computer security (and the same goes for Mac OS X and several Linux users). They will click on just about anything, download just about anything, open e-mail attachments without checking whether anything is out of the ordinary, etc. It is not that Windows is easier to hack than Linux; it is that so many users are not knowledgeable about computer security, which makes it easier for hackers to gain access to Windows computers.

Hackers know they have a better chance with Windows users than with others. If even 50% of Windows users suddenly went to Linux, you would see such an increase in Linux malware (albeit not as large an increase as if 90% switched over) that you might not be ready for it.

I used to use Linux to run a DNS resolver for the house and shop, but that does not mean that the DNS resolver was 100% secure just because I ran it on Linux. I ran it on Linux to save RAM, not for security. If I had let it go (without running any updates), I would have eventually gotten hacked.

“Why struggle to write a virus that might knock out a few thousand desktops when knocking out a few thousand servers could knock out a continent?”

That is speculation. How do you know that all the computers running the power grid, gas systems, etc. are all running Linux? Some could be running UNIX, Mac OS X, or even Windows.


Posted in Computers, Internet and Servers, Operating Systems, Software

Should You Self-Host Your Blog or Website?

First let me make something very clear.

While not all ISPs (Internet Service Providers; the people you get your Internet connection from) allow you to host off of your Internet connection, there are ones that will let you run your own server from your house or office.

For the rest of this blog post, I’ll assume you are using an ISP that allows you to run your own server.
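
If you are unsure whether yours does, one quick test is to stand up a throwaway server and try to reach it from outside your network (from a phone on cellular data, for example). A minimal sketch using Python’s standard library; port 8080 is an arbitrary choice here, and you may need to forward the port on your router first:

```python
from http.server import HTTPServer, SimpleHTTPRequestHandler

# Bind to all interfaces, then browse to http://<your-public-ip>:8080/
# from outside your network. If the page never loads, your ISP or
# router is likely blocking inbound connections.
HTTPServer(("0.0.0.0", 8080), SimpleHTTPRequestHandler).serve_forever()
```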


I have read forum posts from people who say that running a web-server from your own house is a bad idea. Well, if they mean that it is bad to host your own server at the home/office in every single circumstance…then they are wrong. If you think about it, there are some really good reasons to run a server from your house (or even office).

  1. Since your web-server is running from your house, you have better control of what happens to the server. On the other hand, running your web-server (or renting a web-server, which means you don’t even own a server then!) from a data center somewhere in the US does not really give you control of what happens to the server.
  2. You get more privacy when hosting yourself. You do not have to worry about someone copying your server data off onto some other computer to snoop through your information.
  3. When it comes to fire, floods, theft, etc. both the home/office and the data center are pretty much equivalent.
  4. If you have a business-grade connection from your ISP, you may get even better bandwidth than if you hosted from a data center, since the data center would be hosting 100s if not 1000s of servers.
  5. You get to choose all your server hardware that you want to use when you host at the home/office.
  6. Both the home/office and data centers can deal with power-loss issues. Of course, a data center will be better equipped to handle power outages. However, if you have a battery backup on your home server, it will last for a bit. That said, if you have bad power (e.g., the power goes off every other day), then I would not try to self-host.
  7. In my opinion, you would get about the same (if not better) uptime hosting yourself than having a very busy data center try to host your server along with everyone else’s.

Conclusion:  Assuming that you have the equipment and an ISP that lets you do it, there is really no huge difference between hosting a server at a data center and hosting one at the home/office for your personal or small-business use.


Posted in Internet and Servers

IIS vs Apache: Which is the Right Choice?

Last Updated: 08/26/2024

Both Apache and Microsoft IIS (Internet Information Services) are fully capable of hosting many kinds of websites for many kinds of people and businesses. If you are looking into starting your own web-server, you have probably come across the old IIS vs Apache war.

True…Apache does have the most compatibility with websites, mainly because of .htaccess files and support for older web applications, but IIS is a powerful, capable web-server as well (it supports the ASP.Net framework, a powerful web application framework).

Over the years, IIS has gained much attention from the web hosting crowd for its support of popular web applications (e.g., WordPress). Also, PHP still supports Windows (as it has for years already).

Both IIS and Apache can be installed and used instantly, out of the box, to host HTML files. However, both need to be configured to make use of other technologies (such as PHP or Perl).
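
To illustrate that last point: once either server is configured to hand requests to an interpreter (via CGI, FastCGI, or a module), the very same script can run behind both. A toy CGI example in Python (my own sketch; it assumes CGI handling for .py files has already been enabled on the server):

```python
#!/usr/bin/env python3
# Runs unchanged under Apache (mod_cgi) or IIS (the CGI feature), once the
# server knows to execute .py files with the Python interpreter.
print("Content-Type: text/html")
print()  # A blank line ends the CGI response headers.
print("<html><body><p>Hello from whichever web server you configured.</p></body></html>")
```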

IIS also sandboxes people’s websites from each other (Application Pool Isolation), and allows for separate security permissions via the Access Control Lists built into the NTFS file-system and the rest of the Windows operating system.


Here are my opinions on which web-server software performs the best in certain areas.

Specific Area | Winner
Easy Website Sand-boxing (websites hosted on the same web server, protected from each other) | IIS (Application Pool Isolation)
Quick and Easy Initial Setup | IIS & Apache (tie)
Easier to Manage | IIS (because of its powerful graphical user interface)
Most Compatible with Websites (excluding ASP and ASP.Net websites) | Apache
Amount of Available Internet Support | Apache
Best PHP Performer (assuming Fast-CGI is used) | IIS & Apache (tie)
Lighter on Your System Resources | IIS
Native ASP and ASP.Net Support | IIS (there is .NET Core for Linux, but it is not 1:1 with the full .NET Framework)
Immune to the Slowloris Attack | IIS (Apache can be configured to be resistant to Slowloris, but without a rewrite it cannot be immune)

Notes

  • (Application Pool Isolation) IIS application pools isolate different web applications from each other. This means that if one application crashes or its security is compromised, it does not affect others running on the same server.
  • Fast-CGI allows servers to serve PHP-enabled websites faster by keeping one or more PHP processes running instead of shutting them down when not in use; creating and then terminating a process for every request is resource-intensive when a server has many requests to deal with.
  • There is Mono for Apache, but that does not count, since Mono is emulating ASP.Net. It’s not an authentic ASP.Net framework.
  • The Slowloris attack is a type of Denial-of-Service attack that causes a website to be temporarily taken offline when using an affected web server (e.g., Apache).
    • The attack uses up all the connection slots on the web server, so legitimate web traffic cannot get through. Unfortunately, Apache can never be immune to this attack without a rewrite of its code; the usual defense is to time out clients that are slow to finish their requests (see the sketch after these notes).
    • There is an interesting Apache module you can get to help mitigate a Slowloris attack. In my testing, the mod_antiloris Apache module appears to mitigate the attacks quite effectively.
    • In addition, putting Apache behind a reverse proxy (e.g., Caddy) will also stop the Slowloris attack from affecting your Apache web server.
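
The common defense is exactly what the notes above describe: refuse to let one client hold a connection slot indefinitely. Here is a toy sketch of the idea (not production code; the five-second limit is chosen arbitrarily for illustration), a server that drops any client that has not finished sending its request headers in time:

```python
import socket
import threading

HEADER_TIMEOUT = 5.0  # Seconds a client gets to finish its request headers.

def handle(conn: socket.socket) -> None:
    conn.settimeout(HEADER_TIMEOUT)
    try:
        data = b""
        while b"\r\n\r\n" not in data:  # Wait for the complete header block.
            chunk = conn.recv(1024)
            if not chunk:
                return
            data += chunk
        conn.sendall(b"HTTP/1.1 200 OK\r\nContent-Length: 2\r\n\r\nok")
    except socket.timeout:
        pass  # A Slowloris-style client: drop it instead of holding the slot.
    finally:
        conn.close()

with socket.socket() as server:
    server.bind(("127.0.0.1", 8080))
    server.listen()
    while True:
        conn, _ = server.accept()
        threading.Thread(target=handle, args=(conn,), daemon=True).start()
```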

Now am I saying that Apache is not any good? No, not at all. I personally view web servers – and any other server software – as tools. Just as a mechanic has different tools for his work, so does a server administrator have different tools at his disposal. If you feel Apache gets the job done, use Apache. If IIS gets the job done, use IIS.


Posted in Internet and Servers, Software

Is There Anything Wrong with Using Linux as a Server?

I have used both Windows and Linux on servers. They both are capable operating systems. What you need to ask yourself is “What do I need and/or want?”

I cannot (nor can anyone else) tell you “you need to use Windows…or…you need to use Linux”. If you know what your goals are, then it will make it easier for you to decide which OS to use as a server.

Here are some tips on which OS to use, based upon some possible reasons you have for choosing one OS over the other. Please note these are based upon my own opinions from using both for several years.

Specific Area | Winner
Supports the Most Popular Web Technologies | Windows / Linux (a tie; Windows does support ASP & ASP.Net, whereas Linux officially does not)
Makes Better Use of Your CPU | Windows
Makes Better Use of Your Memory | Linux
More Flexible (not counting file-system security permissions) | Linux
Out-of-the-Box Security | Windows
More Stable | Windows / Linux (a tie)*
Availability of Free Server Software | Linux
Available Online Support | Windows / Linux (a tie)
Flexibility of File-system Security | Windows**
More User-Friendly | Windows

* In my opinion, 99% of crashes on Windows are due to faulty hardware and/or drivers. However, both Windows (NT family) and Linux are stable operating systems, when using good, stable hardware and good, stable drivers.

** Due to the fact that Windows uses ACLs (Access Control Lists) by default. ACLs are much more flexible than UNIX Read/Write/Execute bits.
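
To make the comparison concrete: the classic UNIX model reduces to nine bits per file, read/write/execute for exactly three subjects (owner, group, other), which you can decode straight from the file mode. Granting a fourth, unrelated user access requires an ACL on top of that. A small Python sketch of reading those bits:

```python
import os
import stat

def classic_permissions(path: str) -> str:
    """Decode the owner/group/other read-write-execute bits of a file."""
    return stat.filemode(os.stat(path).st_mode)  # e.g. '-rw-r--r--'

# Only three subjects exist in this model; per-user exceptions
# ("also let this one other user write") require ACLs, which
# Windows/NTFS provides by default.
print(classic_permissions("/etc/passwd"))
```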


Posted in Internet and Servers, Operating Systems