Critical crypto bug leaves Linux, hundreds of apps open to eavesdropping

http://arstechnica.com/security/2014/03/critical-crypto-bug-leaves-linux-hundreds-of-apps-open-to-eavesdropping/

The link above goes to an article about a vulnerability in the GnuTLS cryptography library that allows attackers to eavesdrop on SSL and TLS communications in websites and applications that use the library.

From the article:

“The bug in the GnuTLS library makes it trivial for attackers to bypass secure sockets layer (SSL) and Transport Layer Security (TLS) protections available on websites that depend on the open source package. Initial estimates included in Internet discussions such as this one indicate that more than 200 different operating systems or applications rely on GnuTLS to implement crucial SSL and TLS operations, but it wouldn’t be surprising if the actual number is much higher. Web applications, e-mail programs, and other code that use the library are vulnerable to exploits that allow attackers monitoring connections to silently decode encrypted traffic passing between end users and servers.”

Oops! Now this does not mean the Linux kernel is the problem, but it does show that a single library can bring security to its knees (and that goes for any operating system, not just Linux).
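To illustrate the general class of bug involved, here is a simplified sketch (hypothetical code, not the actual GnuTLS source): certificate-checking functions that use C-style negative error codes can be misread by callers that treat any nonzero value as success.

```python
# Simplified illustration of an error-code confusion bug
# (hypothetical code, NOT the actual GnuTLS source).

CERT_INVALID = -1  # C-style convention: negative means failure


def check_certificate(cert_ok: bool) -> int:
    """Returns 1 on success, a negative error code on failure."""
    return 1 if cert_ok else CERT_INVALID


def buggy_caller(cert_ok: bool) -> bool:
    # BUG: truthiness check -- a negative error code is also truthy,
    # so an invalid certificate is accepted as valid.
    return bool(check_certificate(cert_ok))


def fixed_caller(cert_ok: bool) -> bool:
    # FIX: explicitly test for the positive (success) return value.
    return check_certificate(cert_ok) > 0


print(buggy_caller(False))  # True  -- invalid cert accepted!
print(fixed_caller(False))  # False -- invalid cert rejected
```

One mismatched convention like this is enough to silently disable certificate validation, which is why a single library can undermine everything built on top of it.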

This also shows that “many eyes” does not equal security! Remember that ALL software can have security problems, whether it is Windows, Mac OS X, Linux, UNIX, an e-mail server, a DNS server, forum software (phpBB, vBulletin, etc.), or even a desktop word processor. Any of these can contain bad code that presents a security risk to your computer.

Worse, the open source community was warned ahead of time (back in 2008!) that GnuTLS was not safe to use (http://www.openldap.org/lists/openldap-devel/200802/msg00072.html)! Did they not get the message? Did they ignore the warning? Who knows!


Posted in Computers, Internet and Servers, Operating Systems, Software

Everything Should be Open Source?

Have you ever heard the phrase “everything should be open source”? Do you know why you use open-source software, or do you just use open-source software because a friend recommended it to you or it is the “thing to do”?

I used to frequently look for open-source software a few years ago. My attitude on that changed. I now, for the most part, use whatever I need to get the job done – open source or not.

For example, I use WordPress. That is open-source web software. I use it because it is free and because it fits my needs. Can I use a proprietary solution? Probably, but why would I do that since WordPress fits my needs?

Ask yourself these six questions if you are frequently tempted to always choose open source. If you can answer “Yes” to at least two of them, then in my opinion you are justified in choosing the open-source solution over a possible free closed-source or paid solution.

1. Do I know anything about the programming language(s) that this open-source software is written in?
2. Do I really need to make any changes to the open-source software, or is having it open source just “the thing to do”?
3. Can I get by without paid technical support?
4. Does the open-source solution offer features that are even close to the features of the paid (or free closed-source) solution?
5. Does the open-source software have good documentation?
6. Does the open-source software work with the operating system you are most comfortable using (whether Windows, Linux, FreeBSD, MacOS X, etc.)?


Posted in Computers, Internet and Servers, Operating Systems, Software

Response to “Antivirus – Community Ubuntu Documentation”

This is a response (as of 10-06-2013) to the following sections on the Community Ubuntu Documentation wiki page “Antivirus” (https://help.ubuntu.com/community/Antivirus):  No disrespect is intended with my replies.

1) “Possible reasons Linux is less prone to malware”

2) “Root User vs normal usage”

3) “Market Share Myth”

The Ubuntu documentation is in red and my replies are in black. All quotes from the wiki are direct quotes.

——————-

Possible reasons linux is less prone to malware

  1. Programs are run as normal user, not Root User
  2. More eyeballs on the code, nowhere for malware to hide
  3. Vast diversity makes it difficult to reproduce flaws in a system
  4. All software and drivers are frequently updated by Package Managers
  5. Software is generally installed from vast Repositories not from unfamiliar websites
  6. Developers/programmers are recognised as Rock Gods rather than treated with contempt
  7. Elegant, secure code is admired & aspired to. Hasty kludges are an embarrassment

Response to #1:  Both Windows (2000/XP/Vista/7/8/8.1/10) and Ubuntu Linux can run software as a normal user.

Response to #2:  Myth. If anything, there is so much code (as in the Linux kernel) that no one could constantly go through all of it to make sure that no “monkey wrenches” have been thrown into the works. 🙂

Take a look at: http://scalibq.wordpress.com/2013/05/15/the-myth-of-linuxopen-source-security/

Response to #3: I assume you mean many different types of hardware when you say “vast diversity”. That is not always true: if there is a flaw in the Linux kernel, it could affect every Linux system that has not been patched.

Response to #4:  This does not guarantee that no viruses can take over your system. This is a poor argument.

Response to #5:  You are assuming that the servers hosting the repository files are not infected with a virus. This does not guarantee that no virus can make its way onto your system. This is a poor argument.

Response to #6:  …no comment…

Response to #7:  Not all software for Linux is secure. For example, the BIND DNS server has had multiple security issues over a 15+ year span. Not good.

“A computer virus, like a biological virus, must have a reproduction rate that exceeds its death (eradication) rate in order to spread. Each of the above obstacles significantly reduces the reproduction rate of the Linux virus. If the reproduction rate falls below the threshold necessary to replace the existing population, the virus is doomed from the beginning — even before news reports start to raise the awareness level of potential victims.” by Ray of http://librenix.com

A virus, if programmed correctly, could lie dormant until other computers that can be infected are detected. Most viruses, in my opinion, will only get as far as the computer they infected (whether on Windows or Linux).
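The reproduction-rate argument quoted above can be sketched numerically. This is a toy model with made-up rates, purely illustrative: the outbreak grows or dies depending on whether the per-step growth factor exceeds 1.

```python
# Toy model of the quoted argument: a virus spreads only if its
# reproduction rate exceeds its eradication (death) rate.
# All numbers here are made up for illustration.

def simulate(infected: float, reproduction_rate: float,
             eradication_rate: float, steps: int) -> float:
    """Each step, every infected machine infects reproduction_rate
    new machines, then eradication_rate of machines are cleaned."""
    for _ in range(steps):
        infected += infected * reproduction_rate   # new infections
        infected -= infected * eradication_rate    # machines cleaned
        # net growth factor per step:
        # (1 + reproduction_rate) * (1 - eradication_rate)
    return infected

# Reproduction below the eradication threshold: the outbreak dies out.
print(simulate(1000, reproduction_rate=0.05, eradication_rate=0.10, steps=100))
# Reproduction above the threshold: the outbreak grows.
print(simulate(1000, reproduction_rate=0.10, eradication_rate=0.05, steps=100))
```

The point Ray makes holds in the model: the same starting population either collapses or explodes depending only on which side of the threshold the rates fall.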

Root User vs normal usage

“For a Linux binary virus to infect executables, those executables must be writeable by the user activating the virus. That is not likely to be the case. Chances are, the programs are owned by root and the user is running from a non-privileged account. Further, the less experienced the user, the lower the likelihood that he actually owns any executable programs. Therefore, the users who are the least savvy about such hazards are also the ones with the least fertile home directories for viruses.” by Ray of http://librenix.com

If the virus uses an exploit in the Linux kernel, it may not matter whether the current user has permission to access other files. If you have SELinux enabled (assuming you are using a distribution that includes it), that may help prevent the virus from functioning at all (or at least from functioning correctly).
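The quoted point about writable executables is easy to check on your own system. Here is a small sketch (assuming a Unix-like system with a standard PATH) that lists the directories on your PATH where the current user could overwrite executables, which is exactly where a binary virus running as that user could infect programs:

```python
import os


def writable_path_dirs() -> list:
    """Return PATH directories the current user can write to.
    A binary virus running as this user could only infect
    executables in writable locations like these."""
    dirs = os.environ.get("PATH", "").split(os.pathsep)
    return [d for d in dirs if os.path.isdir(d) and os.access(d, os.W_OK)]


if __name__ == "__main__":
    risky = writable_path_dirs()
    if risky:
        print("User-writable PATH directories:", risky)
    else:
        print("No PATH directories are writable by this user.")
```

On a typical desktop install, system directories such as /usr/bin are owned by root and will not appear in the list, which matches Ray's argument; anything user-owned that does appear is the "fertile ground" he describes.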

Market Share Myth

Some people say that linux suffers less from malware because it has less than 1% of the desktop market compared to Windows 90% & suggest that if linux ever increases in popularity then it will suffer just as badly. This argument is deeply flawed & not just by the spurious statistics. Linux dominates server markets(NB: this link dead). Why struggle to write a virus that might knock out a few thousand desktops when knocking out a few thousand servers could knock out a continent? Yet it is the desktop machines that are commonly exploited.

If 90% of computer users switched to Linux overnight, you would see a huge difference in the amount of malware you have for Linux.

What I think you do not understand is that hackers will go after targets that are easy and rich in “bounty”. In my opinion, most Windows users do not understand computer security (and the same would go for Mac OS X and several Linux users). They will click on just about anything, download just about anything, open e-mail attachments without observing if anything is out of the ordinary, etc. It is not that Windows is easier to hack than Linux. It is because there are many users that are not knowledgeable about computer security that makes it easier for the hackers to gain access to Windows computers.

Hackers know they have a better chance with Windows users than others. If even 50% of the Windows users suddenly went to Linux, you would have such an increase in malware (albeit not as much of an increase as you would have with 90% of Windows users switching over to Linux), that you may not be ready for it.

I used to use Linux to run a DNS resolver for the house and shop, but that does not mean that the DNS resolver was 100% secure just because I ran it on Linux. I ran it on Linux to save RAM, not for security. If I had let it go (without running any updates), I would have eventually gotten hacked.

“Why struggle to write a virus that might knock out a few thousand desktops when knocking out a few thousand servers could knock out a continent?”

That is speculation. How do you know that all the computers running the power grid, gas systems, etc. are all running Linux? Some could be running UNIX, Mac OS X, or even Windows.


Posted in Computers, Internet and Servers, Operating Systems, Software

IIS vs Apache: Which is the Right Choice?

Last Updated: 08/26/2024

Both Apache and Microsoft IIS (Internet Information Services) have great abilities to host many kinds of websites for many kinds of people and businesses. If you are looking into starting your own web server, you have probably come across the old IIS vs Apache war.

True, Apache does have the most compatibility with websites, mainly through .htaccess files and older web applications, but IIS is a powerful, capable web server as well (it supports the ASP.Net framework, a powerful web application framework).

Over the years, IIS has gained much attention from the web hosting crowd for its support of web applications (e.g., WordPress). PHP also continues to support Windows (as it has for years already).

Both IIS and Apache can host HTML files out of the box, immediately after installation. However, both need to be configured to make use of other technologies (such as PHP or Perl).
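For example, wiring PHP into Apache typically means handing .php requests to a PHP-FPM process over FastCGI. A minimal sketch of that configuration follows; the module requirements are standard, but the socket path is an assumption that varies by distribution and PHP version.

```apacheconf
# Minimal sketch: route *.php requests to PHP-FPM via mod_proxy_fcgi.
# Requires mod_proxy and mod_proxy_fcgi to be enabled; the socket
# path below is an assumption and varies by distribution/PHP version.
<FilesMatch "\.php$">
    SetHandler "proxy:unix:/run/php/php-fpm.sock|fcgi://localhost/"
</FilesMatch>
```

IIS takes the equivalent step through its FastCGI handler mapping, configured in the IIS Manager GUI or in web.config.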

IIS also sandboxes people’s websites from each other (Application Pool Isolation) and allows for separate security permissions via Access Control Lists in the NTFS file system, as well as in the rest of the Windows operating system.


Here are my opinions on which web-server software performs the best in certain areas.

Specific Area  |  Winner
Easy Website Sand-boxing (websites, hosted on the same web server, protected from each other)  |  IIS (Application Pool Isolation)
Quick and Easy Initial Setup  |  IIS & Apache (tie)
Easier to Manage  |  IIS (because of its powerful graphical user interface)
Most Compatible with Websites (excluding ASP and ASP.Net websites)  |  Apache
Amount of Available Internet Support  |  Apache
Best PHP Performer (assuming Fast-CGI is used)  |  IIS & Apache (tie)
Lighter on Your System Resources  |  IIS
Native ASP and ASP.Net Support  |  IIS (there is .NET Core for Linux, but it is not 1:1 with the full .NET Framework)
Immune to the Slowloris Attack  |  IIS (Apache can be configured to resist the Slowloris attack, but without a rewrite it cannot be immune to this type of attack.)

Notes

  • (Application Pool Isolation) IIS application pools isolate different web applications from each other. This means that if one application crashes or its security is compromised, it does not affect others running on the same server.
  • Fast-CGI allows servers to serve PHP-enabled websites faster by keeping the PHP process (or processes) running instead of shutting them down when not in use. Creating and then terminating a process for every request is resource intensive when a server has many requests to deal with.
  • There is Mono for Apache, but that does not count, since Mono is a reimplementation of ASP.Net. It’s not the authentic ASP.Net framework.
  • The Slowloris attack is a type of Denial-of-Service that causes a website to be temporarily taken offline when using an affected web server (e.g., Apache).
    • The attack uses up all the connection slots on the web server, so legitimate web traffic cannot get through. Unfortunately, Apache can never be immune to this attack without a rewrite of its code.
    • There is an interesting Apache module you can get to help mitigate a Slowloris attack. In my testing, the mod_antiloris Apache module appears to mitigate the attacks quite effectively.
    • In addition, putting Apache behind a reverse proxy (e.g., Caddy) will also stop the Slowloris attack from affecting your Apache web server.
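Beyond mod_antiloris, stock Apache ships with mod_reqtimeout, which drops clients that send request headers or bodies too slowly, the core Slowloris technique. A sketch follows; the timeout and rate values are illustrative, not tuned recommendations.

```apacheconf
# Sketch using the stock mod_reqtimeout module: drop connections that
# feed request headers/bodies too slowly (the Slowloris technique).
# The timeout and rate values below are illustrative, not recommendations.
<IfModule reqtimeout_module>
    # Allow 20-40 seconds to send headers, extending the window only
    # while the client sustains at least 500 bytes/second; the same
    # idea applies to the request body.
    RequestReadTimeout header=20-40,MinRate=500 body=20,MinRate=500
</IfModule>
```

This mitigates rather than eliminates the problem: the connection slots are still finite, which is why the reverse-proxy approach above remains the stronger defense.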

Now am I saying that Apache is not any good? No, not at all. I personally view web servers – and any other server software – as tools. Just as a mechanic has different tools for his work, so does a server administrator have different tools at his disposal. If you feel Apache gets the job done, use Apache. If IIS gets the job done, use IIS.


Posted in Internet and Servers, Software