Over the last few weeks there has been a flurry of test results. The first report, in October, came from AV-Comparatives, who carried out a performance test. The result for Sophos was okay: our performance with default settings was in the same ballpark as our major competitors. It was noted that with the most thorough settings we took longer, but here is the nub of the problem. Just how much scanning is going on under the hood when all those options are enabled? For instance, how many archive formats are being scanned?
How do you tell whether one product is actually doing a better job than another? These are the questions that are likely to be asked and, unfortunately, there are no simple answers. Providing quality detection is a trade-off between speed of product, depth of detection, size of detection data and risk of false positives. Default settings are those judged to provide the best combination of these factors for the majority of systems. They are not always the best in every situation, and some systems definitely warrant changes to optimise detection and performance. Perhaps the best way to interpret a test is not to look at just one test but at several, and build up an overall picture – which leads me to the next test.
Another view on the conundrum of which product is best is to consider how well a product might perform against new malware when the product has been artificially kept at a date older than that malware. This is exactly the test that AV-Comparatives have done in their latest retrospective test. They froze all the products on one date and then looked at how well they detected new malware that emerged in the week after, and in the four weeks after. I am pleased to report that Sophos outperformed most of our major competitors. I will admit that Avira, Kaspersky, GDATA and ESET all fared better, though their drop between the one-week and four-week detection rates was greater than ours. This kind of test is at the mercy of new malware that is completely different from anything seen before, but every product has to face that issue.
AV-Comparatives did a good job of the test, but this type of testing excludes any advantages gained from runtime detection, and we know from our own internal testing that enabling Sophos runtime detection offers significant improvements over and above generic detection. Cliff wrote an excellent article about it here, demonstrating how, without data updates, the product protected customers against the latest malware exploiting a new Microsoft vulnerability. It is a great article to read if you want to consider whether you have the best settings for the product in your environment.
The most recent test, out just yesterday, was the VB100 test carried out by Virus Bulletin. Graham Cluley has already talked about it here at his blog. Suffice to say that, yet again, Sophos were awarded the VB100 – a test in which a product has to detect all the malware currently on the WildList, with no false positives and using default settings. Tucked away inside the report is a table on performance, which also shows Sophos to have good throughput when scanning files.
So, what do all these tests tell us? From where I sit, I know that the product is meeting the certification requirements for WildList testing, it has good proactive detection, it has a low false positive rate, and it does all that quickly with limited impact on system resources. I also know that the runtime detection technology is very good at stopping new malware. Unfortunately, due to the complexity and time-consuming nature of such testing, no one is yet running a test that compares the different runtime detection offerings. AMTSO has recently issued a Best Practices document on conducting dynamic testing, so it is hoped that one of the testers out there will devise a method of performing a fair and meaningful test in the near future.
None of these results mean that we can rest on our laurels. I know there are areas we want to improve to keep our detection top notch, and there are people hard at work in the labs looking for new ways to tackle the problems that the latest malware brings.