Everything Should be Open Source?

Have you ever heard the phrase “everything should be open source”? Do you know why you use open-source software, or do you just use it because a friend recommended it or because it is “the thing to do”?

A few years ago, I frequently sought out open-source software. My attitude on that has changed. I now, for the most part, use whatever I need to get the job done – open source or not.

For example, I use WordPress. That is open-source web software. I use it because it is free and because it fits my needs. Could I use a proprietary solution instead? Probably, but why would I, since WordPress fits my needs?

Ask yourself these six questions if you are frequently tempted to always choose open source. If you can answer “Yes” to at least two of them, then in my opinion, you are justified in choosing the open-source solution over a possible free closed-source or paid solution.

1. Do I know anything about the programming language(s) that this open-source software is written in?
2. Do I really need to make any changes to the open-source software, or is having it open source just “the thing to do”?
3. Can I do without paid technical support?
4. Does the open-source solution offer features that are even close to the features of the paid (or free closed-source) solution?
5. Does the open-source software have good documentation?
6. Does the open-source software work with the operating system you are most comfortable using (whether Windows, Linux, FreeBSD, Mac OS X, etc.)?


Posted in Computers, Internet and Servers, Operating Systems, Software

Response to “Antivirus – Community Ubuntu Documentation”

This is a response (as of 10-06-2013) to the following sections of the Community Ubuntu Documentation wiki page “Antivirus” (https://help.ubuntu.com/community/Antivirus). No disrespect is intended with my replies.

1) “Possible reasons Linux is less prone to malware”

2) “Root User vs normal usage”

3) “Market Share Myth”

The Ubuntu documentation is quoted (originally shown in red, with my replies in black). All quotes from the wiki are direct quotes.

——————-

Possible reasons linux is less prone to malware

  1. Programs are run as normal user, not Root User
  2. More eyeballs on the code, nowhere for malware to hide
  3. Vast diversity makes it difficult to reproduce flaws in a system
  4. All software and drivers are frequently updated by Package Managers
  5. Software is generally installed from vast Repositories not from unfamiliar websites
  6. Developers/programmers are recognised as Rock Gods rather than treated with contempt
  7. Elegant, secure code is admired & aspired to. Hasty kludges are an embarrassment

Response to #1:  Both Windows (2000/XP/Vista/7/8/8.1/10) and Ubuntu Linux can run software as a normal user.

Response to #2:  Myth. If anything, there is so much code (as in the Linux kernel) that no one could continuously review all of it to make sure that no “monkey wrenches” have been thrown into the works. 🙂

Take a look at: http://scalibq.wordpress.com/2013/05/15/the-myth-of-linuxopen-source-security/

Response to #3: I assume you mean many different types of hardware when you say “vast diversity”. That is not always true: if there is a flaw in the Linux kernel itself, it could affect every Linux system that has not been patched.

Response to #4:  Frequent updates do not guarantee that no viruses can take over your system. This is a poor argument.

Response to #5:  You are assuming that the servers hosting the files for the repositories are not infected with a virus. This does not guarantee that no viruses can make their way into your system. This is a poor argument.

Response to #6:  …no comment…

Response to #7:  Not all software for Linux is secure. For example, the BIND DNS server has had multiple security issues over a 15+ year span. Not good.

“A computer virus, like a biological virus, must have a reproduction rate that exceeds its death (eradication) rate in order to spread. Each of the above obstacles significantly reduces the reproduction rate of the Linux virus. If the reproduction rate falls below the threshold necessary to replace the existing population, the virus is doomed from the beginning — even before news reports start to raise the awareness level of potential victims.” by Ray of http://librenix.com

A virus, if programmed correctly, could simply lie dormant until other computers that can be infected are detected. Most viruses, in my opinion, will only get as far as the computer they infected (whether on Windows or Linux).

Root User vs normal usage

“For a Linux binary virus to infect executables, those executables must be writeable by the user activating the virus. That is not likely to be the case. Chances are, the programs are owned by root and the user is running from a non-privileged account. Further, the less experienced the user, the lower the likelihood that he actually owns any executable programs. Therefore, the users who are the least savvy about such hazards are also the ones with the least fertile home directories for viruses.” by Ray of http://librenix.com

If the virus uses an exploit in the Linux kernel, it may not matter whether or not the current user has permission to access other files. If you have SELinux enabled (assuming you are using a distribution that includes it), that may help prevent the virus from functioning (or at least, from functioning correctly).
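The file-permission barrier the quote above describes is easy to demonstrate from a shell. A small illustrative sketch (using a temporary file to stand in for a root-owned binary, and GNU stat):

```shell
# A binary without the write bit cannot be modified by an ordinary user,
# which is the barrier the quoted paragraph relies on.
f=$(mktemp)
chmod 555 "$f"        # r-xr-xr-x: executable and readable, but not writable
stat -c %a "$f"       # prints the octal mode: 555
rm -f "$f"
```

Only root (or a kernel exploit, as noted above) gets past a mode like 555.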

Market Share Myth

Some people say that linux suffers less from malware because it has less than 1% of the desktop market compared to Windows 90% & suggest that if linux ever increases in popularity then it will suffer just as badly. This argument is deeply flawed & not just by the spurious statistics. Linux dominates server markets (NB: this link is dead). Why struggle to write a virus that might knock out a few thousand desktops when knocking out a few thousand servers could knock out a continent? Yet it is the desktop machines that are commonly exploited.

If 90% of computer users switched to Linux overnight, you would see a huge difference in the amount of malware targeting Linux.

What I think you do not understand is that hackers will go after targets that are easy and rich in “bounty”. In my opinion, most Windows users do not understand computer security (and the same goes for Mac OS X and some Linux users). They will click on just about anything, download just about anything, open e-mail attachments without noticing if anything is out of the ordinary, etc. It is not that Windows is easier to hack than Linux; it is that so many users are not knowledgeable about computer security, which makes it easier for hackers to gain access to Windows computers.

Hackers know they have a better chance with Windows users than others. If even 50% of the Windows users suddenly went to Linux, you would have such an increase in malware (albeit not as much of an increase as you would have with 90% of Windows users switching over to Linux), that you may not be ready for it.

I used to use Linux to run a DNS resolver for the house and shop, but that does not mean that the DNS resolver was 100% secure just because I ran it on Linux. I ran it on Linux to save RAM, not for security. If I had let it go (without running any updates), I would have eventually gotten hacked.

“Why struggle to write a virus that might knock out a few thousand desktops when knocking out a few thousand servers could knock out a continent?”

That is speculation. How do you know that all the computers running the power grid, gas systems, etc. are all running Linux? Some could be running UNIX, Mac OS X, or even Windows.


Posted in Computers, Internet and Servers, Operating Systems, Software

IIS vs Apache: Which is the Right Choice?

Last Updated: 08/26/2024

Both Apache and Microsoft IIS (Internet Information Services) are well able to host many kinds of websites for many kinds of people and businesses. If you are looking into starting your own web server, you have probably come across the old IIS vs Apache war.

True, Apache does have the broadest compatibility with websites, mainly through .htaccess files and support for older web applications, but IIS is a powerful, capable web server as well (it supports ASP.NET, a powerful web application framework).

Over the years, IIS has gained much attention from the web hosting crowd for supporting web applications (e.g., WordPress). Also, PHP still supports Windows (as it has for years).

Both IIS and Apache can be installed and used instantly, out of the box, for hosting HTML files. However, both need to be configured to make use of other technologies (such as PHP or Perl).
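As a rough sketch of what that extra configuration can look like on the Apache side, here is a minimal fragment that hands PHP requests to a PHP-FPM process over FastCGI. The socket path is an assumption; it varies by distribution and PHP version.

```apache
# Hand .php requests to a PHP-FPM FastCGI socket.
# The socket path below is an assumption; check your distribution's
# php-fpm pool configuration for the real one.
<FilesMatch "\.php$">
    SetHandler "proxy:unix:/run/php/php-fpm.sock|fcgi://localhost"
</FilesMatch>
```

This also requires the mod_proxy and mod_proxy_fcgi modules to be enabled.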

IIS also sandboxes people’s websites from each other (Application Pool Isolation) and allows for separate security permissions via the Access Control Lists built into the NTFS file system and the rest of the Windows operating system.


Here are my opinions on which web-server software performs the best in certain areas.

Specific Area  |  Winner
Easy Website Sand-boxing (websites hosted on the same web server are protected from each other)  |  IIS (Application Pool Isolation)
Quick and Easy Initial Setup  |  IIS & Apache (tie)
Easier to Manage  |  IIS (because of its powerful graphical user interface)
Most Compatible with Websites (excluding ASP and ASP.NET websites)  |  Apache
Amount of Available Internet Support  |  Apache
Best PHP Performer (assuming FastCGI is used)  |  IIS & Apache (tie)
Lighter on Your System Resources  |  IIS
Native ASP and ASP.NET Support  |  IIS (there is .NET Core for Linux, but it is not 1:1 with the full .NET Framework)
Immune to the Slowloris Attack  |  IIS (Apache can be configured to resist the Slowloris attack, but without a rewrite it cannot be immune.)

Notes

  • (Application Pool Isolation) IIS application pools isolate different web applications from each other. This means that if one application crashes or its security is compromised, it does not affect others running on the same server.
  • FastCGI allows servers to serve PHP-enabled websites faster by keeping the PHP process or processes running instead of shutting them down when not in use; creating and then terminating a process for every request is resource-intensive when a server has many requests to deal with.
  • There is Mono for Apache, but that does not count, since Mono reimplements ASP.NET; it is not the authentic ASP.NET framework.
  • The Slowloris attack is a type of Denial-of-Service attack that causes a website to be temporarily taken offline when it is run against an affected web server (e.g., Apache).
    • The attack uses up all the connection slots on the web server, so legitimate web traffic cannot get through. Unfortunately, Apache can never be immune to this attack without a rewrite of its code.
    • There is an interesting Apache module you can get to help mitigate a Slowloris attack. In my testing, the mod_antiloris Apache module appears to mitigate the attacks quite effectively.
    • In addition, putting Apache behind a reverse proxy (e.g., Caddy) will also stop the Slowloris attack from affecting your Apache web server.
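Alongside mod_antiloris or a reverse proxy, the stock mod_reqtimeout module can also raise the bar. A configuration sketch (the timeout values here are illustrative, not a tested recommendation):

```apache
# Drop clients that trickle their request in slowly - the core of Slowloris.
# Give clients 20 seconds (extendable to 40) to send their headers, and
# require at least 500 bytes/second once data starts flowing; same idea
# for the request body.
RequestReadTimeout header=20-40,MinRate=500 body=20,MinRate=500
```

As with the other options, this mitigates the attack rather than making Apache immune to it.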

Now am I saying that Apache is not any good? No, not at all. I personally view web servers – and any other server software – as tools. Just as a mechanic has different tools for his work, so does a server administrator have different tools at his disposal. If you feel Apache gets the job done, use Apache. If IIS gets the job done, use IIS.


Posted in Internet and Servers, Software

Do I Need a Web Hosting Control Panel?

If you are looking into running your own web server, you probably have heard about web hosting control panels before.

Web hosting control panels are software, running on a web server, that allows you and others to manage web domains, e-mail accounts, FTP accounts, MySQL databases, etc.


Here are four questions to ask yourself, if you are wondering if you need a web-based control panel:

1) Are you experienced with the Linux command line? If you plan to use Windows, have you ever administered a Windows server before?

2) Do you have intermediate knowledge of how to set up and run a web server?

3) Are you good at problem-solving?

4) Do you have many websites to manage?

If you answered “no” to question #1, #2, or #3, and/or you answered “yes” to question #4, then you will be more comfortable using a web hosting control panel. Also, if you are planning to sell web hosting or have a lot of websites to host, then using a web hosting panel may be easier than doing everything manually. Otherwise, just forgo a web hosting control panel and do it yourself.


Please keep in mind that by using a web hosting control panel, you are potentially making a targeted attack easier, since one little web panel script with a vulnerability in it can potentially compromise your server (not joking).

If someone does hack the web-based control panel and gets into your accounts, they can do some serious damage, and you had better hope you have a current backup that restores successfully.

Here is an example of a web hosting control panel giving you a security problem (I know the linked article is old, but it still proves my point): https://krebsonsecurity.com/2012/07/plesk-0day-for-sale-as-thousands-of-sites-hacked/


Web hosting panels are there to make your job easier, usually at the expense of flexibility with your server. When using a web hosting control panel, you are “locked in” to whatever the panel allows you to do. It is basically a trade of convenience for flexibility with your server.

It is not advisable to “do your own thing” (doing something that the web hosting panel does not support by going around the control panel), since this can cause problems down the road. It is best to just stick with whatever the web hosting control panel provides you, so you had better pick the right one the first time.


Posted in Computers, Internet and Servers, Software