VB100 comparative testing

Latest Comparative Test

February 2015: This month the VB test team put six products through their paces on Red Hat Enterprise Linux. John Hawes has the details.

VB100 comparative testing is a regular independent comparison of anti-malware solutions.

Each test report combines the unique VB100 certification scheme with in-depth analysis of product performance across a range of measures.

VB100 certification

The VB100 award is a certification of products that meet the basic standards required to be recognised as legitimate and properly functioning anti-malware solutions.

To display a VB100 logo, a product must:

  • prove it can detect 100% of malware samples listed as 'In the Wild' by the WildList Organization
  • generate no false positives when scanning an extensive test set of clean samples

All this must be done with default, out-of-the-box settings in the VB lab environment.
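The two certification criteria above are strict pass/fail conditions. As a minimal sketch (the function name and inputs are illustrative, not part of the VB100 scheme), the logic reduces to:

```python
def vb100_pass(wildlist_detected, wildlist_total, false_positives):
    """Illustrative VB100 pass/fail check.

    A product passes only if it detects every 'In the Wild' sample
    (100% detection) AND raises zero false positives on the clean set.
    """
    return wildlist_detected == wildlist_total and false_positives == 0

# A single missed WildList sample, or a single false positive, fails the test.
print(vb100_pass(1000, 1000, 0))  # full detection, no FPs -> True
print(vb100_pass(999, 1000, 0))   # one miss -> False
print(vb100_pass(1000, 1000, 1))  # one false positive -> False
```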

All solutions are submitted for testing by their developers. Full procedures of the certification scheme can be found here, with detailed test methodology here.

Results history

An archive of VB100 test results is available, allowing users to view the performance of a vendor or solution over time. While a single test can only show a snapshot of a product's capabilities at a specific moment in time, this long-term view can be used as a guide to how vendors are likely to perform in future. Listings of results by product are here.

RAP testing

The unique RAP (Reactive and Proactive) tests measure detection rates using the freshest samples available at the time products are submitted to the test, as well as samples not seen until after product databases are frozen. This provides a measure of both the vendors' ability to handle newly emerging malware and their accuracy in detecting previously unknown malware.

The four-test RAP averages quadrant allows at-a-glance comparison of detection rates by these criteria. Full details of the RAP scheme are here.
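The averaging behind the quadrant can be sketched as follows. This assumes the common RAP arrangement of three weekly sample sets collected before the product-database freeze (reactive) and one collected after it (proactive); the function and parameter names are illustrative:

```python
def detection_rate(detected, total):
    """Detection rate for one sample set, as a percentage."""
    return 100.0 * detected / total

def rap_scores(reactive_sets, proactive_set):
    """Compute the two RAP quadrant coordinates.

    reactive_sets: list of (detected, total) pairs, one per weekly set
                   gathered before the database freeze.
    proactive_set: a single (detected, total) pair for samples first
                   seen after the freeze.
    Returns (reactive_average, proactive_rate) in percent.
    """
    reactive_avg = sum(detection_rate(d, t) for d, t in reactive_sets) / len(reactive_sets)
    proactive = detection_rate(*proactive_set)
    return reactive_avg, proactive

# Three reactive weeks at 90%, 80% and 70%, and a 60% proactive score,
# place the product at (80.0, 60.0) on the quadrant.
print(rap_scores([(90, 100), (80, 100), (70, 100)], (60, 100)))
```

The reactive average forms one axis of the quadrant and the proactive rate the other, which is what allows the at-a-glance comparison described above.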

Performance measures

Each VB100 comparative review is run on a different platform, allowing us to measure the performance of solutions in a wide range of environments.

We continue to expand and refine our measures of scanning speed, file access overheads, system resource usage and more, to provide a complete picture of how solutions impact the systems they are protecting as well as the quality of the protection provided. Full details of performance measures can be found in each comparative review; a list of comparative reviews by platform can be found here.

Test schedule

The full schedule of upcoming comparatives can be found here.

Any developers interested in submitting anti-malware products for review are advised to contact John Hawes, Anti-Malware Test Director (john.hawes@virusbtn.com).

Recent VB100 award winners

 Agnitum (Outpost)
 Check Point (ZoneAlarm)
 CYREN (Command)
 G Data
 Hauri (ViRobot)
 K7 Computing
 Kaspersky Lab
 Lavasoft (Ad-aware)
 Optimal Security
 PC Pitstop
 Quick Heal
 ThreatTrack Security
 Total Defense Inc