5 Common Computer Mistakes to Avoid

Here are five things I recommend you avoid when fixing or using your computer. Avoiding them may save you time and trouble in the future.

  1. Avoid running Windows registry cleaners on your computer

Simply put, a Windows registry cleaner supposedly “cleans out” the Windows registry by removing anything that does not need to be in there.

The problem?  The registry cleaner really does not have any clue whether a registry entry is truly needed or not, so it just guesses. Instead of helping your computer, registry cleaners may end up messing up your computer even more than it already was to begin with.

In addition, there is rarely a good reason to clean out the registry in the first place. I have used Windows for over 18 years, and not once has the registry become corrupt unless I intentionally did something that ended up causing the corruption.

Bottom Line:  Please avoid Windows registry cleaners. They really are not needed, nor are they guaranteed to fix your problems.

  2. Avoid turning off your anti-virus scanner just because something is malfunctioning on your computer

You may encounter a problem with a software program on your computer, and the support desk person asks you to try running their software with your anti-virus (anti-malware) software turned off.

This is not wise: you are assuming their software contains no malware, and while your protection is disabled you are also exposing your computer to other potential threats.

Only in very specific circumstances do I ever recommend that someone turn off their anti-malware software, and then only for a very short time.

This “please try our software with the anti-malware turned off” business is really a generic response from a support person. They have no idea why their software is not working properly on your computer to begin with.

Please note that you can run a computer without any anti-malware software and be just fine, but without a scanner running in the background you will have no way of knowing whether malware is actually on your system. This applies to every operating system: Windows, macOS, Linux, UNIX, and so on are all capable of being infected with malware.

Bottom Line:  It is unwise to disable your anti-malware software to work around a problem, unless it is a last (and I mean last) resort.

  3. “Rebooting fixes everything”

People get this idea that if they reboot their system, all of the problems they had will go away. This is not necessarily true. Sometimes rebooting does fix a problem, but other times rebooting is just putting a “Band-Aid” on the problem, and the problem will eventually resurface.

With problems that resurface after a reboot, you will need to use trial-and-error (along with Internet research) to figure out what is possibly wrong with your computer.

Yes, this part of the computer problem solving business is not fun, but it is necessary if you want to fix your computer without having to hire someone else to fix it.

Bottom Line:  Rebooting does not always fix your computer problems, nor should you assume that your computer problems have been fixed just by rebooting.

  4. “Buying an SSD will always make my computer run faster”

While it is true nowadays that someone can go down to the computer store and pick up an SSD (Solid State Drive) for a good price, SSDs are not guaranteed to always speed up a computer.

Why? There is more to the performance of a computer than just the storage drive. Anyone using a computer with 512 MB of RAM and an old 1 GHz CPU, running Windows 7, is going to have a miserable time, even if they are using an SSD with fast random-access speeds.

A computer with too little RAM and an old, slow CPU will crawl on a modern desktop OS, regardless of whether it is using an SSD.

Bottom Line:  SSDs (Solid State Drives) are a great way of drastically improving the data access (read and write) performance of a computer system, but they are not the only deciding factor for a computer’s performance.

  5. Avoid turning off your operating system’s automatic updates

I suspect many people run their operating systems without the latest updates installed. This is bad for stability, performance, and security (and that goes for any OS, not just Windows). Updates are there for a reason. Ignoring them is not wise, unless you have a really good reason to ignore them.

Bottom Line:  Leave your operating system’s automatic updates on, unless you have a really good reason not to.


Posted in Computers, Internet and Servers, Operating Systems, Software

Difference between TCP and UDP

If you work with computers, you probably have heard about the TCP and UDP protocols. While they both are mechanisms to transmit data to other computers, they do not operate in the same manner. Below I’ll show you some differences between the two data protocols.

TCP (stands for “Transmission Control Protocol”)

  • TCP is a connection-oriented data protocol
  • TCP is best used for applications that require high reliability
  • There is more overhead (more computer resources used) when using TCP
  • Higher-level protocols such as HTTP, HTTPS, FTP, and SMTP run on top of TCP
  • TCP makes sure that the order in which data is received is the same order in which it was originally sent
  • TCP is typically slower than UDP
  • TCP allows for “flow control”
  • TCP checks for errors in the data transmission
  • TCP acknowledges segments
  • TCP has both error checking and options to recover in case of an error
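The properties above can be seen in a few lines of code. This is a minimal sketch using Python's standard `socket` module: the client must explicitly connect (the three-way handshake), and `sendall()`/`recv()` give a reliable, ordered byte stream. The addresses and message are just illustrative.

```python
# Minimal TCP echo exchange: connection-oriented, reliable, ordered.
# Server and client run in one process for demonstration purposes.
import socket
import threading

def serve(server_sock):
    conn, _addr = server_sock.accept()       # TCP: explicit connection setup
    with conn:
        data = conn.recv(1024)
        conn.sendall(data)                   # sendall keeps sending until all bytes are queued

server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)  # SOCK_STREAM = TCP
server.bind(("127.0.0.1", 0))                # port 0: let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]

t = threading.Thread(target=serve, args=(server,))
t.start()

client = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
client.connect(("127.0.0.1", port))          # three-way handshake happens here
client.sendall(b"hello over TCP")
reply = client.recv(1024)                    # bytes arrive in order, or the call fails
print(reply.decode())

client.close()
t.join()
server.close()
```

Notice that nothing in the application code handles retransmission or ordering; TCP does that for you, which is exactly where its extra overhead comes from.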

UDP (stands for “User Datagram Protocol”)

  • Not a connection-oriented protocol
  • UDP is useful for applications that need fast transmission of data (regardless of data integrity)
  • Less overhead when using UDP, since UDP is a connectionless protocol
  • Higher-level protocols and applications such as DNS, DHCP, and VoIP make use of UDP
  • UDP does not make sure that data received is in the same order that it was originally transmitted (less reliable, but faster)
  • UDP is typically faster than TCP
  • UDP has no “flow control”
  • UDP performs only minimal error checking (a checksum) and never retransmits (less reliable, but faster)
  • UDP does not acknowledge segments
  • UDP can detect a corrupted datagram via its checksum, but has no way to recover from errors it detects
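For comparison, here is the same exchange over UDP, again a sketch with the standard `socket` module and illustrative addresses. There is no connect/accept step; each `sendto()` is an independent datagram that, on a real network, could be lost, duplicated, or reordered (loopback happens to be reliable).

```python
# Minimal UDP exchange: connectionless, no delivery or ordering guarantees.
import socket

receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)  # SOCK_DGRAM = UDP
receiver.bind(("127.0.0.1", 0))              # port 0: let the OS pick a free port
port = receiver.getsockname()[1]

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sender.sendto(b"hello over UDP", ("127.0.0.1", port))  # no handshake needed

data, _addr = receiver.recvfrom(1024)        # one whole datagram, not a byte stream
print(data.decode())

sender.close()
receiver.close()
```

The lack of a handshake and acknowledgements is precisely why UDP has less overhead: the trade-off is that the application must tolerate (or handle) lost datagrams itself.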

Posted in Computers, Internet and Servers, Operating Systems

Web Hosting at Home – Pros and Cons

Have you ever thought of hosting your own website from home? This is a question many people on the Internet have asked before. While there are many answers to the question I wrote above, I will give my basic opinion on the matter.

Hosting from home is not always easy. You must handle:

  • Your Internet connection (at least to a certain degree)
  • Your Internet router
  • Server hardware
  • Software (including securing your server OS installation and keeping it up to date)
  • Dealing with hacking attempts
  • Cleaning up after successful hacking attempts (this rarely happens on a properly set up server)
  • Keeping consistent backups of your data (I do three backups daily on my server)
  • Installing battery backups and power surge protectors
  • Dealing with support requests

My context when saying “hosting from home” is not running a large web-hosting business. I mean hosting your own private/business website as well as a few friends’ websites (including email). Obviously, you cannot run a large “GoDaddy”-style web host out of your home. That is not a realistic expectation.


The advantages of hosting from home are the following:

1) Privacy of your data

If set up correctly, no one can easily snoop on the private, confidential data that is on your own server.

2) You can choose your own server hardware

You are not limited by a web host’s hardware options for your server.

3) You can easily deal with server hardware failures

You don’t have to wait for a technician to fix your server.

4) Possibly cheaper for you in the long run

A decent Virtual Private Server purchased online could easily run $20.00/month, and a decent dedicated server could be as much as $100.00/month minimum!

5) No commitments to a web hosting company

You do not have to worry about some web hosting company making unreasonable demands (e.g., 50,000 file limit — on some large websites, you could easily go over a limit like this).

The disadvantages of hosting from home are the following:

1) Possible higher Internet service cost

Your ISP may require you to purchase a business Internet plan instead of a residential one; this can be expensive, but is not guaranteed to be.

2) You may not have enough upstream bandwidth

To host efficiently from home, you will need at least 3 Mbps of upload speed; with any less, you will notice degraded performance when loading your websites from other locations.

Please keep in mind that if you want to host several large videos, I recommend either lowering the bitrate of your videos – so they will stream faster for your users – or using a third-party hosting service just for the videos themselves. Most self-hosters will not have enough upload bandwidth to properly serve 1 GB+ video files.
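A quick back-of-the-envelope calculation shows why large videos are a problem on a home connection. This sketch uses the 3 Mbps figure from above and a hypothetical 1 GB file; real-world throughput will be somewhat lower due to protocol overhead.

```python
# Rough estimate: how long one viewer takes to download a 1 GB video
# over a 3 Mbps (megabits per second) upstream link. Illustrative only.
FILE_SIZE_BYTES = 1 * 1000**3       # 1 GB (decimal gigabyte)
UPLOAD_MBPS = 3                     # upstream bandwidth in megabits/second

bits_to_send = FILE_SIZE_BYTES * 8  # bytes -> bits
seconds = bits_to_send / (UPLOAD_MBPS * 1_000_000)
print(f"~{seconds / 60:.0f} minutes for a single download")
```

That works out to roughly 44 minutes for one viewer, and the link is saturated the entire time, so every other visitor to your site suffers too.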

3) Unstable home power

Some people do not have stable power at their house or business, and thus their server goes on and off; this can easily be fixed by installing a battery backup for your server.

4) Reliability of your Internet Service Provider

Unless you have a signed agreement with your ISP, they are under no obligation to keep your Internet connection up 24/7.

5) More manual labor required

On a hosted solution online, someone else does the hard work with maintaining your server; when you host from home, you must do it all yourself.


I hope the above helps you decide on whether to self-host web services. I know there are a lot of people online who say it is “bad, silly, stupid, not smart, wouldn’t recommend it” when it comes to self-hosting web services.

With all due respect to those people, most of them have never done web hosting by themselves before, and are trashing something they have never done (which is silly and bad in itself!).


Posted in Computers, Internet and Servers

Akamai Discovers Linux Botnet that Hits with 150 Gbps DDoS Attacks

According to a web article, Akamai (a Content Delivery Network company) discovered a massive Linux botnet. A botnet is basically a collection of compromised computers under an attacker’s remote control, allowing the attacker to perform tasks that would be virtually impossible without all of those compromised machines.

The botnet spreads in the form of a Trojan that targets Linux systems (including network routers). Once it gets into a system, it proceeds to download software that connects the computer to the botnet. The botnet is reportedly capable of delivering DDoS** attacks of up to 150 Gbps.

As I have said on my blog repeatedly, Linux is not immune to security problems. No operating system on the planet is immune to security problems. In this case, the root cause is people using weak, insecure passwords on their Linux boxes.

If I set my Windows box’s Administrator password to ‘password123’ or ‘qwerty’, enabled remote desktop on my computer, and allowed remote desktop through the firewall, I would eventually get hacked. Would that be Windows’ fault or Microsoft’s fault? No, of course not. It would be my fault for setting a bad password on my computer.
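To see how little effort a weak password demands from an attacker, here is a hedged sketch of a dictionary attack against a hashed password. The word list and the use of plain SHA-256 are simplifications for illustration; real attackers use word lists with millions of entries (and real systems use salted, slow password hashes).

```python
# Tiny dictionary-attack demo: a weak password falls on the first pass
# through even a five-entry word list. Illustrative only.
import hashlib

def sha256_hex(s: str) -> str:
    """Hash a string with SHA-256 (simplified stand-in for a password hash)."""
    return hashlib.sha256(s.encode()).hexdigest()

stored_hash = sha256_hex("password123")   # the weak password from the example above

wordlist = ["123456", "qwerty", "letmein", "password123", "admin"]
found = None
for guess in wordlist:
    if sha256_hex(guess) == stored_hash:  # compare hash of each guess
        found = guess
        break

print(f"cracked: {found}")
```

Every common password is in every attacker’s word list, which is why a bad password defeats an otherwise secure operating system.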

Many people say “Linux is more secure than Windows”, but if you notice – most of the time – they do not give any technical arguments to back up what they said.

For example, part of one comment posted online said (direct quote):
“The primary attack vector to take over these systems is default or weak login passwords, and allowing internet-facing remote root. That has no bearing on Linux suddenly being less secure than it was yesterday, or in any way magically now just as insecure as Windows.”

Notice he said “That has no bearing on Linux suddenly being less secure than it was yesterday, or in any way magically now just as insecure as Windows.”, but he did not give any technical arguments to back up what he said. How is Windows “insecure”? How is Linux “more secure”? I have seen this dozens of times (no kidding).

What is worse is people will listen to them, assuming they are correct (e.g., Linux is more secure than Windows), and go off and repeat the same misinformation around on the Internet without even bothering to check if the information they received is in fact accurate.

Web article link: https://www.engadget.com/2015/09/29/linux-botnet-hits-with-150-gbps-ddos/

** Simply put, a DDoS (Distributed Denial of Service) attack floods a victim with traffic from many machines at once, using up the victim’s available bandwidth or other resources. This prevents the victim’s computers from functioning correctly when communicating with the outside world and the internal network.


Posted in Computers, Internet and Servers, Operating Systems, Software