VB100 (virus) test procedures

The VB100 (virus) award is granted to any product that passes the test criteria under test conditions in the VB lab as part of the formal VB comparative review process.

The basic requirements are that a product detects all malware listed as 'In the Wild' by the WildList Organization during the review period, and generates no false positives when scanning a set of clean files.

Various other tests are also carried out as part of the comparative review process, including speed and performance measurements, and results and conclusions are included with the review. The results of these secondary tests do not affect a product's qualification for VB100 certification.

Product submissions

Products must be submitted for review by their developers. Deadlines for product submission, along with details of platforms and the WildLists from which the test sets will be compiled, will be announced in advance of the test.

Submissions must be received by VB by the deadline set, with all required components including updates to virus definitions and detection engines. Where possible, all software should be made available ready to install, update and run in an 'offline' situation, as some testing is performed in a sealed environment without access to external networks. The core certification tests are run with full internet access, using the latest updates available at the time each test is run and with access to 'cloud' lookup systems where applicable. Vendors wishing to opt out of the offline components may do so by giving advance notice of that intention along with their submission.

Submissions are accepted in the form of web downloads (the preferred format), email attachments or hard copies sent by post or courier, as long as they arrive before the deadline. Full licences or activation codes are preferred where applicable, but trial versions are also acceptable.

Testing and certification are open to any product vendor, entirely free of charge for up to two products per vendor per test; additional products may be submitted subject to a testing fee. By submitting a product for testing, vendors agree to have their product reviewed and analysed as VB sees fit, within the scope of the testing methodology presented below. Once a product has been accepted for testing, it may not be withdrawn from the review by the vendor. However, VB reserves the right to refuse to test any product without further explanation.

Award criteria

The requirements for VB100 (virus) certification are:

  • 100% detection of malware listed as 'In the Wild' by the WildList Organization.

    The WildList to be used for each test will be the latest available at the time of the test set deadline. This deadline will be between five and seven days prior to the product submission deadline; both dates will be communicated to potential participants and publicized on the VB website approximately two weeks prior to the deadline for each test. The certification test set includes both the traditional WildList and the Extended WildList. All samples in the WildList collection are verified and, where applicable, replicated from originals provided by the WildList Organization.

    'Detection' in this case is accepted if the product clearly marks a file as infected in its log or on-screen display, or denies access to it during on-access testing. If such logging or blocking is unavailable or deemed unusable for test purposes, deletion or attempted disinfection of samples will also be an accepted indicator of detection.

  • No false positives when scanning VB's collection of known-clean files.

    The collection of known-clean files includes the test sets used for speed measurements, and is subject to regular and unannounced updating and enlargement. A false positive will be counted if a product is considered to have clearly flagged a file as infected in its log or on-screen display.

    A false positive will not be recorded if a file is labelled as something other than malware, such as adware or a legitimate item of software with potentially dangerous uses. All other alerts on clean files will be counted as false positives.

    Each type of flag will be adjudged to indicate either detection, in which case any file marked with it is counted as a detection in the infected set and as a false positive in the clean sets, or mere suspicion, in which case neither a detection nor a false positive is recorded. There is no overlap between the two, as sketched immediately after this list.
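
By way of illustration, the following minimal Python sketch shows how the flag adjudication described above might be expressed. The label strings and function names are assumptions invented for this example; they are not VB's actual tooling and do not correspond to any product's real log format.

    # Hypothetical illustration only: labels and functions are assumptions,
    # not VB's actual tooling or any product's real log format.

    # Example verdict labels a product might write to its log.
    DETECTION_LABELS = {"infected", "virus", "trojan", "malware"}
    SUSPICION_LABELS = {"suspicious", "heuristic"}
    NON_MALWARE_LABELS = {"adware", "riskware", "potentially unwanted"}


    def counts_as_detection(label: str) -> bool:
        """Return True if a flag of this type counts as a detection.

        A flag type counts in exactly one way: if it is adjudged a detection,
        it is counted as a detection on infected samples and as a false
        positive on clean files; suspicion-only and non-malware labels count
        as neither.
        """
        return label.lower() in DETECTION_LABELS


    def is_false_positive(label_on_clean_file: str) -> bool:
        """A clean file flagged with a detection-type label is a false positive."""
        return counts_as_detection(label_on_clean_file)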

The WildList set is tested both on demand and on access. (During introductory tests, only on-demand coverage will be required for the Extended WildList.) Testing will take place at three distinct points during the testing period, using the latest updates available at each of those points. A miss of any sample in the official set during any of these test runs is enough to deny a product certification; 100% detection must be maintained throughout the test. The clean set is divided into three parts, and one part is scanned on demand at each of the three points during the test; any false positive recorded in any of these scans will result in a product failing to qualify for a VB100 award.
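
The pass/fail logic described above can be summarised in a short sketch. Again, this is purely illustrative: the data structure and function names are assumptions made for this example, not part of VB's actual test harness.

    # Hypothetical illustration only: names are assumptions, not VB's harness.
    from dataclasses import dataclass


    @dataclass
    class RunResult:
        """Outcome of one of the three test points during the review period."""
        wildlist_missed: int        # WildList samples missed, on demand or on access
        clean_false_positives: int  # false positives in the clean-set portion scanned


    def qualifies_for_vb100(runs: list) -> bool:
        """A product qualifies only if every run detects 100% of the WildList
        set and records no false positives against the clean set."""
        return all(r.wildlist_missed == 0 and r.clean_false_positives == 0
                   for r in runs)


    # A single miss or false positive at any of the three points denies the award.
    print(qualifies_for_vb100([RunResult(0, 0), RunResult(1, 0), RunResult(0, 0)]))  # False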

Other tests

Results of a range of additional tests will be included in comparative test reports, with the nature and design of these tests subject to change without notice. Detection tests may include testing with live internet access and the latest updates, or may be run offline with frozen updates in the case of retrospective tests. In the latter case, vendors may choose to withdraw from such tests, to accommodate those products which require online connectivity to function properly. Speed and performance measures will be taken based on a range of metrics, depending on the requirements of individual platforms and product types. The way the data gathered in these additional tests is presented and interpreted may be subject to modification and adjustment as appropriate.

Default settings

A product's default installation settings will be used for all tests, with the following exceptions:

  • Adjustments may be made to logging settings to allow adequate information to be gathered to analyse test results.
  • Adjustments may be made to how products respond to detections, where such adjustments will not affect the detection rate, to facilitate large-scale testing.
  • On-access scanning will be disabled, where possible, during on-demand testing.
  • If a sample appears to be missed because a file type is not scanned by default (a common occurrence with, for example, archives not being scanned on access by some products), it may be rescanned with altered settings to verify this, in order to inform VB readers of the cause of the miss. Some adjustments to the file types scanned may be made during speed testing, in order to present more informative comparative data. Any false positive raised as a result of such alterations to default settings will not count against a product for certification, but may be recorded in the review text.

Three chances

Should the reviewer be unable to make a product function adequately, either wholly or in part, or should any event occur which appears to be the result of a problem with the installation or operation of a product, tests may be repeated up to a maximum of three times, on three different test machines, using a clean image for each attempt.

Review text

Each product submitted for testing will be described to some extent in the text of the comparative review published in Virus Bulletin magazine and on www.virusbtn.com, with attention paid to design, usability, features and other criteria considered by the reviewer to be of interest. For the purposes of this analysis, product settings may be adjusted and additional testing carried out at the discretion of the reviewer. Any comments thus made are the opinion of the reviewer at the time of the review.

Right to reply

Should any vendor have any queries concerning the results of the tests, they are encouraged to contact VB for clarification and further analysis where necessary (email john.hawes@virusbtn.com).

VB100 award

A VB100 award means that a product has passed our tests, no more and no less. The failure to attain a VB100 award is not a declaration that a product cannot provide adequate protection in the real world if administered by a professional. VB urges any potential customer, when looking at the VB100 record of any software, not simply to consider passes and fails, but to read the small print in the reviews.
