Hi Guys,

There have been a number of interesting threads on data security, PC security and secure data deletion of late, so I thought I would run this past you. Sorry it is so long; I guess I never use one word when I can use ten!


Basic PC data security can be divided into two distinct elements:

1. Physical Security
2. Electronic Security


If an unauthorised person gains physical access to your machine, you are compromised, must expect the worst, and react accordingly. This would generally require removal from any network and Internet connection, formatting the hard drive and re-installation of the software. This is really the only way to ensure that you have negated any malicious software that the person might have loaded.

Where a machine contains sensitive data it should be kept under lock and key, and not left unattended whilst activated. Physical access control systems are commonplace for mainframes through to server farms, but desktop PCs are distributed and therefore far more vulnerable.

In these situations it is common to use a password-protected screensaver that locks the User out after a set period of inactivity (say 5 minutes). The correct password must be entered to re-activate the machine, or you have to re-boot and log on as a valid User.

In the circumstances, it is important to use a secure operating system and set your workgroups and User security/authority policies at appropriate levels. Obviously there is no security where everyone has administrator rights. Similarly, it is advisable to set up your own account names and dispose of any defaults, as these will certainly be known to hackers.

Secure machines frequently do not have CD or floppy drives, USB ports, or any other ports that are not essential. This is to prevent the removal of data, loading of unauthorised software, and attachment of unauthorised peripherals.

The BIOS should not permit the User to run “set-up” or to boot from any source other than that in the security policy. It is widely known that the BIOS can be reset using a jumper switch on the motherboard or by removing the battery. This will usually take it back to a relatively insecure factory default position. To avoid this, the case should be locked to prevent easy access to the motherboard/battery. A better solution for larger organisations is to have a custom BIOS that reflects their security policies, such that resetting gets an intruder nowhere, as it just restores the custom settings. The BIOS should also prevent the attachment of unauthorised peripherals.

A machine that is so secured is far less vulnerable to a direct physical intrusion.

Whilst this might be considered good practice, it must not be considered foolproof. I once encountered an ex-corporate machine that was protected in this way, and I did not know the keystrokes or password to enter set-up. The 3.5” floppy drive had failed, and the system would not let the User replace it with another. I eventually resorted to inserting an incompatible memory module, booting, powering off and removing the offending memory, then re-booting. The machine recovered from the shock, then recognised the new floppy drive!!!

Because internal Users might be inclined to load inappropriate software or otherwise compromise security, it is important for organisations to have a strict usage policy that is well publicised, and in the employees’ contracts. It really boils down to education in safe computing, and acceptable behaviour in the workplace.

Random audits are a good method of enforcing these policies. I am not being draconian: in many countries the employer would be liable for unlicensed software on their machines, and fined heavily if it were detected. In any case, all software should be approved by the IT department; otherwise there is no point in having reference (lab) machines and testing new releases or new software. If you are going to permit anarchy in this area, I do not see how you can have any hope of maintaining a secure environment.

A final thought on physical security is the use of removable hard-drives. These slide in and out of the PC and can be safely locked away when not in use. If your BIOS is protected and the machine is not allowed to boot from anything other than the hard drive, physical access does not present that much of a security risk. These devices frequently have cooling fans, which is an added bonus.


Electronic security is generally aimed at protecting the privacy of your data and applications.

The only notable exception that comes to mind is software that monitors the internet connection. If the machine is stolen and connected on another line, it will attempt to alert the owner, or some security centre.

The most common electronic security is in the form of workgroups with authorities and passwords to access the system, and particular drives, files and applications. There is a lot of User education required: they must use strong passwords (letters, numbers, symbols, uppercase and lowercase), and they must be obliged to change them regularly. They must not leave them lying around or give them to other people. Whilst on this point, it is worth noting that, provided you follow the principles above, the longer a password is the stronger it is, as it can then only be cracked by the “brute force” method, which must try every possible combination of characters.

You have to assume that all passwords can be cracked, and you are just buying time. If it takes 3 months to crack your password file and your policy is to change passwords every 14 days, you will be “safe”. If it is the other way round, you are vulnerable.
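
To illustrate the time-buying argument, here is a toy calculation in Python. The guessing rate of one billion attempts per second is an assumption purely for illustration; real attack speeds vary enormously with hardware and hashing scheme.

```python
def keyspace(alphabet_size, length):
    """Number of candidate passwords a brute-force attack must try
    in the worst case: the search space grows exponentially with length."""
    return alphabet_size ** length

def days_to_crack(alphabet_size, length, guesses_per_sec):
    """Worst-case cracking time in days at an assumed guessing rate."""
    return keyspace(alphabet_size, length) / guesses_per_sec / 86400

# 8 lowercase letters vs 12 mixed characters (letters, digits, symbols),
# at an assumed rate of one billion guesses per second:
short = days_to_crack(26, 8, 1e9)
long_ = days_to_crack(94, 12, 1e9)
```

The point is not the absolute numbers, which depend entirely on the assumed rate, but the ratio: each extra character multiplies the attacker's work by the size of the alphabet.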

It is possible to add an extra layer of security called “pre-boot validation”. An example of this is CompuSec v4.15, which gives the validation and full hard drive encryption. The idea is that even if the BIOS power up security is bypassed, you still need a password to load the operating system, and the relevant authorities and passwords after that.

I do not intend to discuss data encryption and compression at any length, as there are many articles on the Internet regarding this subject, and a vast array of products that provide it. I would comment that if you are using them it would be wise to have a RAID1 array, where two hard drives are used and one is an exact mirror image of the other. I would not like to attempt data recovery from a crashed, encrypted and compressed hard drive, particularly if the data were striped (RAID0)!

There is a similar consideration as for passwords. If all else is equal, the longer the encryption key, the longer it will take to crack it, but decryption is possible given enough time and resources. The basic principle is that once sensitive data are no longer required, they should be securely destroyed.

One area that is of particular interest is the concept of secure data removal. Most people are aware that data do not really get “deleted” from a computer. The file is merely marked as deleted, the file pointer removed and the space flagged as available for re-use (free space). There are numerous software tools that will allow this information to be retrieved afterwards.

Similarly, there are many tools that claim to “wipe”, “erase” or “shred” data. The way these work is to overwrite the target data with something else that is meaningless.

In my opinion, once a file has been overwritten even once, it is not recoverable by software means, as the tool has no way of determining what was there previously. Military security tends to require overwriting 7 times: three passes of 0s, three passes of 1s, then a final pass of random 0s and 1s. The largest number of overwriting passes I have seen was 99, but most products at least meet the military standard of 7.
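
A minimal sketch of such a multi-pass overwrite might look like the following. This is illustrative only: journalling filesystems, wear-levelled SSDs and bad-sector remapping can all leave copies of the data that an in-place overwrite never touches, which is why serious products work at the device level.

```python
import os

def wipe_file(path, passes=(b"\x00", b"\xff", b"\x00", b"\xff",
                            b"\x00", b"\xff", None)):
    """Overwrite a file in place, then delete it.  The default schedule
    mimics the 7-pass scheme described above: alternating 0s and 1s,
    with None meaning a final pass of random data."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for pattern in passes:
            f.seek(0)
            data = os.urandom(size) if pattern is None else pattern * size
            f.write(data)
            f.flush()
            os.fsync(f.fileno())  # force each pass out to the device
    os.remove(path)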

The underlying logic of multiple overwriting is to flip the magnetic domains back and forth, so that the original data disappears into the general “noise” on the disk. It might be useful to consider the perceived “threat” in a bit more detail.

The problem lies in two aspects of magnetic media:

1. Magnetic Remanence
2. Track Overlay

When you overwrite magnetic media, a trace or remnant of the previous data is left behind. This does not affect the function of the hard drive, as it reads the strongest signal, which is the last one. As an example, suppose you write three “1”s: the first, on unused space, gives a value of 1.00; the second, over a zero, gives a value of 0.95; and the third, over another one, gives a value of 1.05. It is thus relatively simple to deduce that the original data were blank, zero, one. The hard drive itself would report 1, 1, 1, which is why I believe that a software recovery attempt would fail.
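
The deduction in that example can be written out as a toy model. The signal values (1.00, 0.95, 1.05) and the tolerance are the hypothetical figures from the example above, not real drive measurements:

```python
def previous_bit(measured, written, tol=0.02):
    """Toy model of the remanence example: infer what was under a freshly
    written bit from a hypothetical analogue read-back.  A written 1 reads
    ~1.00 over blank media, ~0.95 over an old 0, ~1.05 over an old 1."""
    nominal = float(written)
    delta = measured - nominal
    if abs(delta) <= tol:
        return "blank"
    return "1" if delta > 0 else "0"

# The drive itself reports 1, 1, 1 -- but the analogue values betray history:
history = [previous_bit(v, 1) for v in (1.00, 0.95, 1.05)]
# history == ["blank", "0", "1"]
```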

The second problem stems from the fact that the heads on the hard drive do not track in exactly the same place each time, so a thin strip of the previous data can be left behind. Whilst this is a problem for hard drives, it is a much more serious one for floppies, particularly if different drives have been used, as they are much less accurate in their tracking.

The only safe way to deal with a floppy is to burn it or pulverise it, that is, grind it to powder. As an aside, it is much more environmentally friendly to remove the disk from the floppy first, and recycle the empty plastic case in the normal way.

I must admit that I am not a great fan of degaussing as a means of sanitising magnetic media. The standard equipment is not designed for secure deletion, only to prepare media for re-use. A strong enough field would probably ruin the media anyway, it certainly would with a hard drive, and there are much more cost effective methods of total destruction. Another problem with degaussing is that it is less effective the longer the data have been written to the media, so one could never be absolutely sure.

The method that potentially allows recovery of data is called “magnetic force scanning tunnelling microscopy”. It would require physical access to the media in laboratory clean room conditions, which would mean that either your media has been stolen, or you are in big trouble with the FEDS. As it is a physical method, it won’t work on a copy as far as I can tell, at least not a copy that someone could take from your workplace. However, you must consider the possibility of someone cloning your drive and leaving the clone in your machine whilst taking the original for analysis. Would you be likely to notice that?

The basic principle seems to be to measure the absolute magnetic value of the data and compare this to the nominal value of the current data. This will certainly work for one overwrite and maybe even two or three, but once you get beyond that, the interpretation algorithms become exponentially more complex. That is NOT to say impossible. Like most security, all you are doing is buying time, if someone has the will, the time and the money, you are history.

This brings us to the secure deletion software. This relies on overwriting the data a sufficient number of times to deter anyone from attempting a physical reconstruction, on the grounds that it would be too complex, time consuming and costly.

There does not seem to be any real agreement amongst software suppliers and security experts as to what the best number of overwrites is. The “magic number” of 7 is mentioned quite frequently, but that is probably because it complies with DoD standard 5220.22-M (and possibly 28-M?). It is also the value chosen by the German military.

Peter Gutmann, on the other hand, suggests a generic cycle of 35 passes. This is to cover the three most common hard disk encoding schemes, so if you know what your encoding scheme is, you can achieve the same result with fewer passes. It is interesting to note that he includes 8 random data passes.
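
The shape of such a schedule is easy to sketch: random passes bracketing a block of fixed, encoding-specific patterns. The placeholder byte patterns below are just that, placeholders, not the actual 27 sequences from Gutmann's paper:

```python
import os

def build_schedule(fixed_patterns, lead_random=4, tail_random=4):
    """Gutmann-style pass schedule: random passes before and after the
    fixed, encoding-specific patterns.  Each entry is a callable that
    yields one sector's worth (512 bytes) of overwrite data."""
    rand = lambda: os.urandom(512)
    schedule = [rand for _ in range(lead_random)]
    schedule += [(lambda p=p: p * 512) for p in fixed_patterns]
    schedule += [rand for _ in range(tail_random)]
    return schedule

# 4 random + 27 fixed + 4 random = 35 passes, 8 of them random:
passes = build_schedule([bytes([i]) for i in range(27)])
# len(passes) == 35
```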

I believe that care needs to be taken to sufficiently vary and randomise the passes. If I know that you overwrote 50 times alternately with 0 and 1, I would not have to be a genius to calculate the original data values, provided that I had the equipment to measure the absolute values. Apart from the measurement question, you have really done little better than overwriting once. My message is that I would not trust a tool that used fixed patterns, on the grounds that it would be too predictable.

I am also inclined to favour systems that allow you to choose your hard disk encoding scheme, as they are much quicker. They generally have an option to do all 35 passes, but this is usually because you don’t know your disk’s scheme, so you have to go for them all.

I am inclined to argue that the DoD standard is probably adequate for most purposes, particularly as a “due diligence” defence.

At the end of the day, it is up to the individual to decide how secure the erasing needs to be, and how determined and likely a physical attack is. To put it bluntly, if your data are that sensitive, should you have them on your hard drive in the first place?

The kind of features I would look for in an erasing tool are:

1. DoD compliance as a minimum, preferably Gutmann as well
2. Sufficiently random in its overwriting to make prediction difficult
3. Ability to erase individual files, folders and whole drives
4. Ability to wipe free space
5. Provides confirmation, and error messages if it fails to erase
6. Scrambles and removes file names and dates

I would now like to consider one or two peripheral issues that relate to this kind of software.

As previously mentioned, your hard drive will have an amount of “free” or available space on it. This space is quite likely to contain the data from files that have been “deleted” in the normal way. There can also be all sorts of temporary and work files that Windows and your applications have created “on the fly”, then “deleted” in the standard way. It is quite possible for sensitive data to be in amongst this, so it is important that your chosen tool has the facility to erase this as well.

My main paranoia here is that although you probably keep sensitive data in encrypted folders or virtual drives, when you pull them out into an application they are in your main system in an unencrypted form. They could well find their way into one of these temporary files in that unencrypted form. You put your data back into their safe location, and a copy is lying around on your system, waiting to be exploited. This also applies to data in a restricted-access application; its security has likewise been compromised.

For this reason, I suggest that your security tool should have the facility to wipe free space as a stand-alone task. I would suggest that you might want to schedule this to run on a daily cycle.
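
The usual technique for wiping free space is crude but effective: grow a junk file of random data until the volume is full, then delete it. A minimal sketch (illustrative only, it does not touch file slack, and on a system volume you would want to leave some headroom):

```python
import os

def wipe_free_space(directory, max_bytes=None, chunk=1024 * 1024):
    """Fill free space on the volume containing `directory` with random
    data, then remove the junk file.  max_bytes caps the amount written
    (handy for testing); None means run until the disk is full."""
    junk = os.path.join(directory, "_wipe.tmp")
    written = 0
    try:
        with open(junk, "wb") as f:
            while max_bytes is None or written < max_bytes:
                n = chunk if max_bytes is None else min(chunk, max_bytes - written)
                try:
                    f.write(os.urandom(n))
                except OSError:  # disk full -- which is the goal
                    break
                written += n
    finally:
        if os.path.exists(junk):
            os.remove(junk)
    return written
```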

Following on from this is the concept of “file slack”. Here we have a file system that is composed of many “clusters”, or units of data storage. The system only “thinks” in terms of whole clusters, and will move to the start of the next cluster for its next storage allocation. If your cluster size is 16KB and you wish to store a 35KB file, it will take three (3) clusters. This means there is an unused space of 13KB between the end of the file and the start of the next cluster, and that space is neither used nor overwritten. This “slack space” can easily contain sensitive data from a previous usage.
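
The arithmetic is simple enough to write down. Rounding the file size up to a whole number of clusters gives the allocation; the difference is the slack:

```python
import math

def slack_bytes(file_size, cluster_size):
    """Clusters allocated for a file, and the slack left in the last one."""
    clusters = math.ceil(file_size / cluster_size)
    slack = clusters * cluster_size - file_size
    return clusters, slack

# The example above: a 35KB file on 16KB clusters takes 3 clusters,
# leaving 13KB of slack.
clusters, slack = slack_bytes(35 * 1024, 16 * 1024)
# clusters == 3, slack == 13 * 1024
```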

Obviously, the smaller the clusters, the less likely it is that meaningful data will be exposed, but the potential risk remains. The file wiping or “sanitisation” software should also overwrite slack space. It would also be highly desirable if it were to ignore space that has never been written to, as sanitising a 200GB drive with very little on it could take an inordinate amount of time, with no tangible security benefit!

I think that a lot of people are aware that to perform a “good” defragmentation of a hard drive requires 20-30% free space. This is because the defragmentation software uses it to assemble and move the files being defragmented. I wonder what is left in that space afterwards, even though it is flagged in the system as being available? Now, most defragmenters will not handle an encrypted/compressed file very well, particularly if it is on a hidden virtual drive. You have to defragment with the drive “mounted” and therefore exposed. This may well leave your data exposed.

I believe that there is much to be said for a system that properly wipes free space.

There are two remaining areas that I will just mention at this point, as I have not quite satisfied myself.

1. Sectors that have become damaged in use may contain sensitive data. It might be possible to recover these data using the ECC (error correction) facilities of the hard drive?
2. The Swap or Page file can contain all sorts of sensitive data. With the older DOS based operating systems, they could be wiped, just like any other part of the disk. I have yet to satisfy myself that the utilities within the later operating systems make a thorough job of this task.

Please note that this document just represents some of my personal paranoias, and should in no way be used to manage life-critical systems or thermonuclear weapons. Or is that Java?