VB100 comparative testing

Latest Comparative Test

February 2014: This month the VB test team put six products through their paces on SUSE Linux. John Hawes has the details.

VB100 comparative testing is a regular independent comparison of anti-malware solutions.

Each test report combines the unique VB100 certification scheme with in-depth analysis of product performance across a range of measures.

VB100 certification

The VB100 award is a certification of products which meet the basic standards required to be recognised as legitimate and properly functioning anti-malware solutions.

To display a VB100 logo, a product must:

  • prove it can detect 100% of malware samples listed as 'In the Wild' by the WildList Organization
  • generate no false positives when scanning an extensive test set of clean samples

All this must be done with default, out-of-the-box settings in the VB lab environment.
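The two criteria above amount to a simple pass/fail rule. As a rough illustration only (not VB's actual tooling; the function and field names here are hypothetical), the logic can be sketched as:

```python
def vb100_pass(wildlist_detected, clean_set_flagged):
    """Hypothetical sketch of the VB100 pass criteria.

    wildlist_detected: list of booleans, one per 'In the Wild' sample,
        True if the product detected that sample.
    clean_set_flagged: list of booleans, one per clean-set file,
        True if the product (wrongly) flagged that file.
    """
    # Criterion 1: 100% detection of the WildList samples.
    all_detected = all(wildlist_detected)
    # Criterion 2: zero false positives on the clean set.
    no_false_positives = not any(clean_set_flagged)
    return all_detected and no_false_positives

# A single missed WildList sample, or a single false positive, fails.
print(vb100_pass([True, True, True], [False, False]))   # True
print(vb100_pass([True, True, False], [False, False]))  # False
print(vb100_pass([True, True, True], [False, True]))    # False
```

Note that both criteria must hold simultaneously, under default settings: a product with perfect detection still fails certification on a single false positive.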

All solutions are submitted for testing by their developers. Full procedures of the certification scheme can be found here, with the detailed test methodology here.

Results history

An archive of VB100 test results is available, allowing users to view the performance of a vendor or solution over time. While a single test can only show a snapshot of a product's capabilities at a specific moment in time, this long-term view can be used as a guide to how vendors are likely to perform in future. Listings of results by product are here.

Buy Now

You can now purchase individual comparative tests for $19.95 each. Look for the button on test pages. You will also receive access to the full comparative article for that month.

RAP testing

The unique RAP (Reactive and Proactive) tests measure detection rates using the freshest samples available at the time products are submitted to the test, as well as samples not seen until after product databases are frozen. This provides a measure of both the vendors' ability to handle newly emerging malware and their accuracy in detecting previously unknown malware.

The four-test RAP averages quadrant allows at-a-glance comparison of detection rates by these criteria. Full details of the RAP scheme are here.
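As an illustration of how a point on the RAP quadrant might be derived (a sketch under the assumption of several reactive sample sets plus one proactive set; the exact number of sets and any weighting are defined in the full RAP methodology linked above):

```python
def rap_quadrant_point(reactive_rates, proactive_rate):
    """Sketch of a RAP quadrant coordinate.

    reactive_rates: detection rates on sample sets collected before
        product submission (the 'reactive' measure).
    proactive_rate: detection rate on samples first seen after product
        databases were frozen (the 'proactive' measure).
    Returns an (x, y) pair: average reactive rate vs. proactive rate.
    """
    reactive_average = sum(reactive_rates) / len(reactive_rates)
    return reactive_average, proactive_rate

# Example: 90%, 85% and 80% on three reactive sets, 70% proactively.
x, y = rap_quadrant_point([0.90, 0.85, 0.80], 0.70)
print(round(x, 2), y)  # 0.85 0.7
```

Plotting the reactive average on one axis against the proactive rate on the other is what makes the at-a-glance comparison possible: products towards the top right both track current malware and generalise to unseen samples.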

Performance measures

Each VB100 comparative review is run on a different platform, allowing us to measure the performance of solutions in a wide range of environments.

We continue to expand and refine our measures of scanning speed, file access overheads, system resource usage and more, to provide a complete picture of how solutions impact the systems they are protecting as well as the quality of the protection provided. Full details of performance measures can be found in each comparative review; a list of comparative reviews by platform can be found here.

Test schedule

The next comparative review will be published in the June 2014 issue of Virus Bulletin, covering the Windows Server 2012 platform. The full schedule of upcoming comparatives can be found here.

Any developers interested in submitting anti-malware products for review are advised to contact John Hawes, Anti-Malware Test Director (john.hawes@virusbtn.com).

Recent VB100 award winners

  • Agnitum (Outpost)
  • BeyondTrust (formerly eEye)
  • Check Point (ZoneAlarm)
  • Commtouch (formerly Authentium)
  • Digital Defender
  • G Data
  • Hauri (ViRobot)
  • Inca nProtect
  • K7 Computing
  • Lavasoft (Ad-aware)
  • PC Booster
  • PC Pitstop
  • Quick Heal
  • ThreatTrack Security
  • Total Defense Inc
  • Vexx Guard


February 2014 results

For the first time in living memory, this test saw a clean sweep of certification passes, with all products reaching the required standard for a VB100 badge, and most also doing well in terms of stability.
See full results.
