    nihil (Senior Member; joined Jul 2003; United Kingdom: Bridlington)

    Low AV detection rates

    I saw this article and found it interesting:


    A novel if not particularly scientific method of testing.

    The vendors whine (as they always do) but not very convincingly in some cases:

    Randy Abrams, director of technical education at ESET, said that the report 'is a textbook example of how to do anti-malware testing fundamentally wrong'. He said that a sample set of 1,708 unconfirmed malicious files was used, and as ESET sees around 200,000 unique new samples each day, 1,708 is not a statistically significant sample set.
    Hmmm... 200,000 × 365 = 73,000,000 per year... does anyone actually believe that?

    A lot of what he sees will be the same malware: variations on a theme, or obfuscations of older samples. There is no way they are all "unique" or "new".

    I believe that when you are talking about zero-day or obfuscated malware, 1,708 is a statistically significant sample. However, if they had really confirmed that these items were malicious, then all of them should have remained in the test, detected or not.
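
    As a sanity check on both sets of numbers, here is a quick back-of-the-envelope calculation (mine, not the report's): 200,000 samples a day does multiply out to 73 million a year, and a standard binomial confidence interval suggests that 1,708 samples pins down a detection rate to within a few percentage points. The 37% figure used below is the best initial detection rate mentioned later in this post.

```python
import math

# ESET's claimed intake: 200,000 "unique new" samples per day.
per_day = 200_000
per_year = per_day * 365
print(per_year)  # 73,000,000 -- the figure questioned above

# 95% margin of error for a detection rate estimated from n = 1,708
# samples, using the normal (Wald) approximation to the binomial.
n = 1_708
p = 0.37  # best observed initial detection rate in the test
margin = 1.96 * math.sqrt(p * (1 - p) / n)
print(f"detection rate {p:.0%} +/- {margin:.1%}")
```

    The margin works out at roughly ±2.3 percentage points, so the sample count itself is ample for estimating an overall detection rate; the legitimate objection would be about how representative the samples were, not how many there were.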

    Basically, the test didn't demonstrate anything we didn't already know: signature-based detection is going to struggle if it doesn't have a signature.

    Also, isn't it as much about prevention as detection? A lot of vendors actually sell security suites that include sandboxes, behavioural analysis and the like, which may deny access to the main system or kill processes that are about to do something malicious.

    It is not entirely clear from the report, but it looks as if they just loaded the files and checked whether the scanners flagged anything suspicious.

    A more robust test would be to actually open/execute them and see if they were allowed to run.
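
    A sketch of what such a dynamic test could look like (entirely hypothetical; the function names and outcome labels are mine, not anything from the report): each sample is executed in an isolated environment, and the harness tallies whether the product blocked it, warned about it, or let it run.

```python
from collections import Counter

def run_dynamic_test(samples, execute_in_sandbox):
    """Tally outcomes from executing each sample in isolation.

    execute_in_sandbox(sample) is a stand-in for the real isolation
    layer and must return one of: 'blocked', 'warned', 'ran'.
    """
    tally = Counter(execute_in_sandbox(s) for s in samples)
    n = len(samples)
    # Report each outcome as a fraction of all samples tested.
    return {outcome: tally[outcome] / n
            for outcome in ("blocked", "warned", "ran")}

# Demonstration with a fake sandbox that cycles through the outcomes.
fake_outcomes = ["blocked", "warned", "ran"]
rates = run_dynamic_test(range(9), lambda s: fake_outcomes[s % 3])
print(rates)
```

    The point of recording 'warned' separately is exactly the gap noted below: a test has to say up front how it treats a mere warning, otherwise blocked and warned-but-allowed results get lumped together.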


    Abrams' complaint also rings a little hollow, as ESET (NOD32) seem to have had the best initial detection rate: 37%.

    Link to the test results is here:


    This reminds me of a test an American consumer magazine ran a couple of years back: they got people to write new or obfuscated malware (about 4,500 specimens).

    The results were equally appalling, but once again I don't believe they actually tried to run the malware.

    Like this latest test, they didn't say how they treated warnings that a file looked suspicious.

    I am also a little disappointed that Avira and Panda were not included.
    Last edited by nihil; August 12th, 2010 at 03:37 PM.

