Security
I read an article on the Microsoft web site that said that NT "security" is sufficient for most users. At first, I was appalled that they would make that statement, considering the large number of security holes that exist (I have found hundreds myself). However, the statement is true. Most users are not concerned with security. Therefore, the extreme ease with which any user can gain Administrator access on an NT system right out of the box is of no concern to most users.
However, for the system administrator the issue is completely different. There are dozens of directories and hundreds of files that the administrator has to change by hand in order to make NT secure enough for most businesses. This takes anywhere from 30 minutes to an hour per machine to ensure that the security is correct. Even at their least secure levels, all UNIX variants are more secure than NT.
For the company president or CEO, NT "security" is also not sufficient. Because of the ease of gaining administrator access, all company secrets are at risk. I know many companies that let their users work on Windows NT but run the mission-critical applications on UNIX. Even leaving aside the advantages that UNIX has in terms of scalability and reliability, the default security of NT scares away all but the die-hard Microsoft fans. Even a cursory inspection shows that it is not worth risking your business for the "pretty" Windows GUI.
Yes, there are holes in UNIX and Linux, but to exploit them you need to be either a UNIX guru or an expert programmer (or both). Anyone can easily exploit the holes in NT.
Another important aspect to consider is that the NT security model is stricter than that of Linux. That is, the concepts behind Windows NT security could lead to a more strictly controlled system. However, the implementation has left Windows NT open to all sorts of attacks. The gaping holes in Internet Explorer are just one example. (Well, actually, there are quite a few.)
Linux
Advantages
* Security has had years to be tested and verified.
* Security is tied to the file itself. You can reinstall the system without fear of having to re-create security information manually.
* Can easily tell if a file on your system has been changed. This includes mundane things like the permissions, but also more important things like the checksum. (See below.)
* The standard attack on any system is the "dictionary attack". Tools to test for this vulnerability are easy to create. (See below.)
* Able to merge systems/domains.
* Firewall functionality is built into the server.
* Can easily check if a new user has logged in and changed the default password. (See below.)
Disadvantages
* File access currently limited to READ-WRITE-EXECUTE for USER-GROUP-OTHER.
* Security is not as strict as NT's model allows.
* No auditing.
NT
Advantages
* Stricter security available. (See below)
* Larger number of choices for access permissions.
* Auditing of security events.
Disadvantages
* Security is still in its infancy, and infantile mistakes are still being made. (See below.)
* Security is bound to the name of the machine and domain. If you reinstall, all security information is gone. (See below.)
* Have to check by hand for any changes to files.
* No tools to check for "dictionary attack" vulnerability.
* Systems have to be reinstalled when merging domains. (Security ID is dependent on the currently installed copy of the system.)
* Microsoft sells an extra Firewall product.
* No easy way to check whether a new user has logged in and changed the default password. (See below.)
* There are literally hundreds of holes that allow anyone to create a Trojan horse without any special programming skills. (See below.)
* The list goes on.
The default permissions on drive shares under Windows NT (C:, D:, not C$, D$) are FULL CONTROL for everyone. This means I can access them without even logging into the domain. The default permission for the system root directory (e.g., C:\winnt) is also FULL CONTROL for everyone. In five minutes I can create a dozen Trojan horses to give any user Administrator access.
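To illustrate just how open this is, from any Linux box on the network you could try connecting to such a share with no credentials at all. This is only a sketch: "ntserver" is a made-up machine name, and it assumes the smbclient tool from Samba is installed.

    # Connect to the open drive share without supplying a password (-N).
    # "ntserver" is a hypothetical host; "C" is the drive share from above.
    smbclient //ntserver/C -N -c 'dir'

If the share really carries the default FULL CONTROL for Everyone, the directory listing comes back without the server ever asking who you are.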
Assume you have discovered that someone has broken into your network. On just a single machine, how long will it take to check five administration-related groups (such as the Administrators or Account Operators groups) to see if any additional users have been added? UNIX: 5 seconds per machine. NT: 5 minutes per machine.
* A script runs once a day and checks the sum of /etc/passwd. If it has changed, a message is sent to the sysadmin. (It would also react when a password was changed; a sketch follows this list.)
* The script could count the number of entries to see if one was added. It could also check the group file for changes and report them.
* A quick script could be written to check whether there are groups with no users, or whether users were added to groups.
* This could be expanded to check any set of files. You cannot automate these tasks on NT, as everything is hidden in the registry and there are no tools built in. You could create a script to check NT for additions to groups, etc., but the output of "net user" and "net group" is extremely difficult to parse.
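As a rough illustration of how little effort this takes on Linux, here is a minimal sketch of the daily check described above. The alert address and the baseline path are assumptions; adjust them for your site.

    #!/bin/sh
    # Daily sketch: alert the sysadmin if /etc/passwd has changed.
    ADMIN=root                     # who gets the alert (assumption)
    BASELINE=/var/tmp/passwd.sum   # yesterday's checksum (assumption)

    sum /etc/passwd > /tmp/passwd.sum.new
    if [ -f "$BASELINE" ] && ! cmp -s "$BASELINE" /tmp/passwd.sum.new; then
        mail -s "/etc/passwd has changed" "$ADMIN" < /tmp/passwd.sum.new
    fi
    mv /tmp/passwd.sum.new "$BASELINE"

    # The "5 seconds per machine" check: list who is in an
    # administration-related group.
    grep '^wheel:' /etc/group

The same pattern (checksum, compare, report) extends to /etc/group or any other file you care about.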
NT security is generally considered better than UNIX security, but "stricter" might be a more appropriate word than "better." It is theoretically harder to crack NT passwords, as they use a larger encryption key. However, the LM hash used to encrypt passwords actually breaks the password into two seven-character pieces. This makes it easier to crack than the eight-character UNIX algorithm, as you can crack each half individually. Plus, the encryption algorithm is the same for every user: if two people have the same password, you do not even need to crack it to see this, as the encrypted versions are identical. In addition, the standard attack is still a dictionary attack, and that works effectively no matter how large the key is. The major problem with NT security is that you cannot get around it; there have been several cases where this security mechanism became more of a problem than it was worth.
In addition, the encryption is the same every time, because NT uses no "salt." Using tools freely available on the Internet, you can dump the encrypted passwords and compare them; if two users have the same password, the encrypted password will be the same for both. On Linux, there are 4096 different ways to encrypt any given password, which makes the odds very low that two users will ever have the same encrypted password. Using basic techniques of intelligence analysis, you can quickly narrow down the possible passwords, making guessing NT passwords easier. I did this on one system and found several encrypted passwords that matched. I assumed, correctly, that the password had something to do with the company.
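You can see the effect of the salt for yourself. This little sketch uses Perl's built-in crypt(); the password "secret" and the salts "ab" and "xy" are arbitrary examples.

    # The same password encrypted under two different two-character salts:
    perl -e 'print crypt("secret", "ab"), "\n"'   # one hash
    perl -e 'print crypt("secret", "xy"), "\n"'   # a completely different hash
    # Under NT's unsalted scheme, both users would show identical hashes.

Two salt characters with 64 possible values each gives the 4096 (64 x 64) variations mentioned above.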
Using the latest version of l0phtcrack, I was able to crack about 10% of all the passwords in our domain within a few hours, and 25% within a couple of days. Many were passwords I would have considered safe a few years ago. However, because of the simplicity of the NT encryption mechanism, no password is safe. (Note that cracking NT passwords is something Microsoft said was just "theoretical.")
We made the mistake of rotating our backups every week in some of our offices; that is, we only had five tapes. We discovered that although NT Backup and the Event Viewer reported all was well, it wasn't. The system crashed and we had to reinstall, and we then discovered that the tape was unreadable, even though it had worked fine when we originally installed. Since our data was on another drive it was untouched, but all was not well: the permissions were based on the original installation. Although we could re-create the users, the permissions on the files and directories were no longer valid. As far as NT was concerned, these were different systems. Therefore, once again, we had to re-create the permissions by hand.
How can you tell if a file on an NT machine has changed? With what tool? You can't even think about using a batch script. In five minutes, I could write a script that writes the checksum of each file on the system into a file, and another script (or the same one with a different flag) could then check all of the files to see if they have changed. However, why bother, when rpm already does it?
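For the record, here is a minimal sketch of that pair of scripts, rolled into one with a flag. The baseline location and the /etc example tree are assumptions.

    #!/bin/sh
    # Usage: filecheck create | filecheck check
    BASELINE=/var/tmp/md5.baseline   # keep this somewhere read-only!

    case "$1" in
        create)
            # Record a checksum for every file under /etc (example tree).
            find /etc -type f -exec md5sum {} + > "$BASELINE"
            ;;
        check)
            # Report every file whose checksum no longer matches.
            md5sum -c "$BASELINE" 2>/dev/null | grep -v ': OK$'
            ;;
        *)
            echo "usage: $0 create|check" >&2
            ;;
    esac

On an RPM-based system, "rpm -Va" already does this job for every packaged file, reporting changed checksums, sizes, permissions, and owners.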
I have heard comments that because the Linux source code is freely available, it is less secure. The opposite is true. There are tens of thousands of people out there who look through the code, and when a bug is discovered (related to security or not), it is made public and fixed. The xcmd bug (mentioned above) is an example of a potential security problem that was detected by someone who had the code and was then fixed by the developer. Since the developers use the code they develop (not necessarily true for NT), they have a stake in making it secure. With 10% of all Internet servers running Linux (still more than NT), it has to be secure. There is a reason the program that could bring down any NT machine was called "winnuke" and not "linuxnuke." Granted, the "ping of death" did affect Linux machines, but since the patch was on the net within a couple of hours, does it really count?
When new users are created, they typically get a standard password. If they never log in, there is an account with a known password on the system. Under NT, there is no easy way to check for this. Under Linux, a check after a specific period of time can easily be made part of the user-creation process.
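Here is a sketch of such a check. It assumes traditional DES-crypt hashes in /etc/shadow and must run as root; the default password "changeme" stands in for whatever your site actually hands out.

    #!/bin/sh
    # Flag accounts that still carry the standard new-user password.
    DEFAULT=changeme   # the site's standard password (assumption)

    while IFS=: read user hash rest; do
        # Skip locked accounts and empty password fields.
        case "$hash" in ''|'*'|'!'*) continue ;; esac
        # The first two characters of a DES-crypt hash are its salt.
        salt=`echo "$hash" | cut -c1-2`
        check=`perl -e 'print crypt($ARGV[0], $ARGV[1])' "$DEFAULT" "$salt"`
        if [ "$check" = "$hash" ]; then
            echo "$user has not changed the default password"
        fi
    done < /etc/shadow

Run from cron a few days after each batch of accounts is created, this catches exactly the accounts described above.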
The default permissions on Windows NT out of the box provide very little security. Everyone, even users not logged into the domain, has complete control over the system root directory (normally C:\WINNT). In addition, everyone has the ability to change the C:\WINNT\SYSTEM32 directory, which contains a number of very important system programs and files. All a hacker needs to do is replace one of these with a Trojan horse, and the next time it is run by an administrator, that hacker gets administrator access to the system.
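For contrast, finding the equivalent exposure on a Linux system (system files that anyone could replace with a Trojan horse) takes one command; the directory list here is just an example.

    # Look for world-writable files in the system directories;
    # -perm -002 matches any file that "other" can write to.
    find /bin /sbin /usr/bin /usr/sbin -type f -perm -002

On a sanely installed Linux system this prints nothing. On NT, as just described, the equivalent list would cover most of the system directory.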
A normal user can also change the file associations. That means that when the administrator clicks on a .TXT file, it isn't NOTEPAD.EXE that starts, and when the administrator clicks on a .DOC file, it isn't WINWORD.EXE, but rather a Trojan horse. It can even be a batch file that adds the hacker to the Administrators group and then starts the intended application.
One important aspect of NT "security" is that most security bugs uncovered on NT are really naive. While many have been solved (sometimes amazingly fast for Microsoft: within a few weeks), the simplicity with which these bugs can be exploited is incredible. This gives the impression that NT is built by a bunch of kids, or at least by people who have no clue about real-world security, only the theory.
Although there are security bugs in Linux (such as in sendmail), they are fairly sophisticated and, even with complete instructions, out of reach of most computer users. The NT bugs can be exploited by anyone, with the tools that NT provides by default!
Another key problem is that, so often, the bug fixes (security and otherwise) address just a single aspect of the problem and not the underlying mistake. For example, in Windows 95 and NT it was possible to enter a public share, "jump behind" its root on the server, and then get access to any directory on that server. The trick was simply to enter the public share in a command window and then issue a "cd ..." in the root of that share. This was not trapped by the server, and it gave access to the parent directory and every other directory. The reaction from Microsoft was to fix just this one stupid error: in the next service pack it was no longer possible, but doing "cd .." did it again. Plus, the same problem existed in the Internet Information Server. When Microsoft fixed the ".." problem in IIS, you could still get around it by accessing a URL using "../.." (the parent of the parent). Not good security.
For more really stupid NT security holes, check out the Insecure.Org website.
Microsoft tries to market its domain concept as an improvement. Anyone who has tried to implement it in a large network knows that this is far from the truth. Since the NT domain concept is the only security NT knows about, it is an all-or-nothing deal: either the shares are open to the entire world ("everyone"), or you have to add machines to domains, create "trust relationships" between domains, and so forth. Remember that trust relationships grow as n*(n-1); with just five domains, you already have 20 relationships to manage, and even that few can be a real problem. Because of this shortcoming and the obvious difficulty of managing so many relationships, a lot of administrators simply make shares and other resources available to everyone. In other words, the NT shortcomings cause administrators to make their systems insecure.
On November 15, 2002, the web site osopinion.com published an article entitled "Study: Linux' Security Problems Outstrip Microsoft," which claimed that there were actually more security problems with Linux than with Windows.
Reading through the article, I see nothing different from what Microsoft supporters have been saying about relative security for quite a long time. There is a lot of handwaving in the article, and numbers are thrown out as if they meant something. Note that the article compares "Linux software" with "Microsoft products." A number of the bugs/problems listed in the CERT advisories apply to both Linux and Windows; since the affected software does not come from Microsoft, it is not a "Microsoft product," but because it runs on Linux it is classified as "Linux software." The same bug should be counted against Microsoft, but it isn't, simply because the software was not produced by Microsoft. Counting such bugs as "Linux problems" is like calling a PC-cillin bug a Windows bug simply because PC-cillin runs **only** on Windows.
Another very important issue is that many of the Linux (i.e., open source) bugs are buffer overruns, which have the potential for executing "random" code. They are found more easily in open source software simply because it is open source: you can see them! No one is going to try 4386 different combinations of text to force notepad.exe to overrun a buffer; that's why such buffer overruns are never found. However, in open source software you can see the overrun, and it is almost always reported as a security bug. There are more reports of Linux security bugs because anyone can find them.
Note that CERT advisories are issued based on both the severity of the problem and its impact. Since most of the software on the Internet is based on open source, many, many times more people are affected than by a bug in a desktop OS. Therefore, an open source bug has a greater impact, as it affects more people, and is more likely to be reported. As a result, the statistics get skewed.
Check out the CERT site. If you sort the bugs by severity, 18 of the top 25 are in "Microsoft products," while fewer than half that number are in open source software. A buffer overrun is something that could execute code a hacker wanted, but it usually requires a lot of trial and error to figure out just what happens and how to exploit it. Microsoft bugs are typically things people discover almost by accident, for example, accessing the parent directory of your web server's root simply by using a URL with "..". Since anyone can see the code for open source, the developers cannot hide it when they discover bugs themselves. Microsoft can and does hide the fact that there is a security bug: it simply fixes them, without telling anyone, in the next service pack. Again, the statistics get skewed.
Look again at the term "Microsoft products." There are thousands more open source projects than "Microsoft products," so it makes sense that there will be numerically more bugs in open source software than in "Microsoft products." However, if you look at it in terms of rates (based on the CERT advisories), there was less than one open source bug for every 3000 "products," but one bug for every 35 "Microsoft products." That is almost 1:100!
However, those are pure statistics, and you can manipulate them any way you want. Of the thousands of open source projects, many never get off the ground or produce a single line of code. Many more never become part of a standard Linux distribution, so it would be unfair to count them, because they simply skew the statistics.
You also need to look behind the numbers. One thing both the article and the Aberdeen report failed to mention is that a single advisory often contains multiple vulnerabilities; that is, multiple security bugs appear in a single advisory, which Aberdeen counted only once. If you look at the 12 advisories that belong to Linux, there are 17 individual vulnerabilities, roughly 1.4 per advisory. However, the seven advisories belonging to the "Microsoft products" platform contain 25 individual vulnerabilities, roughly 3.6 per advisory.
Consider also the fact that this was one of Aberdeen's "sponsored" reports. The Aberdeen Group has been caught before publishing an "independent" report blasting one company while being paid by that company's competition. Considering the source, my experience with both "products," and the numbers (not the statistics), the article sounds like a lot of paid Microsoft propaganda.