VB100 comparative review on Windows 7 Professional

2010-12-01

John Hawes

Virus Bulletin
Editor: Helen Martin

Abstract

This month the VB lab team put a monster haul of products to test on Windows 7 Professional, but were disappointed by the level of unreliability, untrustworthiness and flakiness they encountered. John Hawes has all the gory details.


Table of contents
Introduction
Platform and test sets
Results
Agnitum Outpost Security Suite Pro 7.0.4
AhnLab V3 Internet Security 8.0.3.23
Arcabit ArcaVir 10.10.3708.4
Avast Software avast! 5.0.677
Avertive VirusTect 1.1.21
AVG Internet Security 2010 10.0.1152
Avira AntiVir Personal 10.0.0.567
Avira AntiVir Professional 10.0.0.918
BitDefender Business Client 11.0.22
Bkis BKAV Home Plus 2010 3090
CA Internet Security Suite Plus 7.0.0.107
CA Total Defense r12 12.0.193
Celeritas Software Company WinSafeGuard 1.1.21
Central Command Vexira 6.3.14
Clearsight AntiVirus 2.1.21
Commtouch Command Anti-Malware 5.1.10
Comodo AntiVirus 5.0.163652.1142
Comodo Internet Security 5.0.163652.1142
Coranti 2010 1.001.00011
Defenx Security Suite 2011 3389.519.1244
Digital Defender Antivirus Full 2.1.21
eEye Digital Security Blink 4.7.1
Emsisoft Anti-Malware 5.0.0.84
eScan Internet Security for Windows 11.0.1139.843
ESET NOD32 Antivirus 4.2.64.12
Filseclab Twister AntiVirus V7 R3 7.3.4.9985
Fortinet FortiClient 4.1.3.143
Frisk F-PROT Antivirus for Windows 6.0.9.4
F-Secure Client Security 9.01 build 122
F-Secure Internet Security 10.50 build 197
G DATA Antivirus 2011 21.1.0.5
Hauri ViRobot 5.5
Ikarus virus.utilities 1.0.227
Iolo System Shield 4.1.0
K7 Total Security Desktop Edition 10.0.057
Kaspersky Antivirus 6 for Windows 6.0.4.1212a
Kaspersky Internet Security 2011 11.0.2.556
Keniu Antivirus 1.0
Kingsoft Internet Security 2011 Advanced Edition 2008.11.6.63
Kingsoft Internet Security 2011 Standard Edition 2008.11.6.63
Lavasoft Ad-Aware Professional 8.3.4
Lavasoft Ad-Aware Total Security 21.1.0.28
McAfee VirusScan Enterprise 8.7i
Microsafe Avira Premium Security Suite 10.0.0.60
Microsoft Security Essentials 1.0.2498.0
MKS MKS_vir 10 b151
Nifty Corporation Security24 5.62
Norman Security Suite 8.00
Optenet Security Suite v.10.09.69
PC Booster AV Booster 1.1.21
PC Tools Internet Security 2011 8.0.0.608
PC Tools Spyware Doctor 8.0.0.608
Preventon AntiVirus 4.3.21
Qihoo 360 Antivirus 1.1.0.1313
Quick Heal Total Security 2011 12.00 (5.0.0.1)
Returnil System Safe 2011 3.2.10878.5466
Rising Internet Security 2010 22.71.02.03
Sophos Endpoint Security and Control 9.5.4
SPAMfighter VIRUSfighter 7.100.11
Sunbelt (now GFI) VIPRE 4.0.3904
Trustport Antivirus 2011 11.0.0.4565
VirusBuster VirusBuster Professional 6.3.14
Webroot Internet Security Complete 7.0.5.210
ZeoBIT PCKeeper 1.1.49.3149
Results tables
Conclusions
Technical details
Appendix – test methodology
Core goals
Malware detection measures
Performance measures
Sample selection and validation
Reviews and comments

Introduction

After the last comparative review – on a server platform – saw no let-up in the ever-increasing number of products eager to join our tests, the return to Windows 7 was always likely to bring in a monster haul of submissions. Along with the hardcore regulars, we expected a selection of newcomers – dominated as always by re-workings of existing engines but with a handful of entirely new technologies to add extra interest. As submissions streamed in on the test deadline, we were disappointed by a few notable no-shows – the world’s largest security provider among them – but gratified, surprised and eventually terrified by the huge number of entries.

The final tally came in at 64 submissions, breaking our previous record by a handful. The numbers were bulked up by a number of rebrandings of one of the most popular engines in the OEM market. While many of this month’s entries were from known and trusted providers, we spotted a few names on the list with a reputation for a lack of decent configuration controls, unreliable logging and general disorderliness, while several of the new faces were completely unknown to us, with the potential to cause all manner of headaches. With a long road ahead we bravely shut ourselves away in the test lab, anticipating a long and draining month, praying to all available deities that problems would be kept to a minimum and our work would prove smooth and pleasant. Some hope, you might say – let’s see how it went.

Platform and test sets

Windows 7 is no longer the fresh-faced new kid on the block, having matured into a solid and widely trusted platform with strong growth in usage figures. While most measures admit to some degree of inaccuracy, estimates suggest that around 20% of desktops worldwide are now running Microsoft’s latest operating system. The decline in use of the evergreen XP appears to be gathering pace – although for now it remains the most widely used platform – and Windows 7 seems in with a chance of exceeding XP’s popularity within the next 12 months.

The installation of Windows 7 was reasonably straightforward, with as usual only the bare contents of the install media used and no more recent updates – a brief connection to the Internet was required for activation, but updates were disabled prior to this, to ensure equality between all test machines and to minimize unpredictable impact on performance measures. No additional software or drivers were required to fully support our current test hardware, and only a handful of extra tools were added to facilitate the testing process. These included PDF readers to peruse any instructions and help files provided; additional dependencies would be applied as required, on a per-product basis. While a few additional areas were tweaked slightly – mainly to prevent unwanted interference with speed measurements – for the most part the platform was left with its out-of-the-box settings, including User Account Control (UAC). We expected to see UAC interposing itself from time to time, but were interested in observing its interaction with the solutions under test.
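
As an illustration of the kind of tweak involved, here is a minimal sketch – assuming the standard ‘wuauserv’ Windows Update service name and an elevated prompt, and not our exact lab scripts – of stopping and disabling updates before a machine is imaged:

    # Minimal sketch: stop and disable Windows Update on a test machine.
    # Assumes the standard 'wuauserv' service name and an elevated prompt;
    # illustrative only, not the lab's actual imaging scripts.
    import subprocess

    subprocess.run(["net", "stop", "wuauserv"], check=False)
    subprocess.run(["sc", "config", "wuauserv", "start=", "disabled"], check=False)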

Test sets were built and installed on the test systems in the usual manner. The deadline for product submissions was 27 October, with the official test set deadline on 22 October. The core certification set was built around the latest official WildList available on this date, which was the September list, released on 19 October. The list comprised the usual selection of password stealers targeting online banks and gamers, alongside the standard complement of worms, bots and similar nasties. Several of the strains of W32/Virut that have been causing problems in recent comparatives fell off this month’s list, but were replaced by yet more variants.

We ceased all updates to our clean test sets on 22 October as well, with a wide range of new items having been added in the weeks running up to this – additions mainly focused on popular download software, but also included a selection of major business software. Older and less significant items were removed from the sets as usual.

The remaining test sets were adjusted along the normal lines, with a selection of new items added to the polymorphic set and some older ones retired. The sets of trojans and worms were for the most part rebuilt from scratch with items first seen by us since the end of the last test. As usual, the RAP sets were put together in four weekly batches covering the three weeks leading up to the product submission deadline and the week following it. After some sorting, classification and validation, final lists of approved samples were produced and used to calculate detection scores for the products taking part.
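
For readers unfamiliar with how the headline figures are derived, the arithmetic is simple: a detection score is the proportion of validated samples a product flags, and a RAP average is the mean of the four weekly batches. The minimal sketch below illustrates this with made-up figures – it is not our actual tooling:

    # Illustrative sketch of the detection-score arithmetic: a score is the
    # percentage of validated samples flagged, and the RAP average is the
    # mean of the four weekly batches. All figures here are invented.
    def detection_rate(detected, total):
        return 100.0 * detected / total if total else 0.0

    # Three reactive weekly batches plus one proactive batch.
    rap_batches = {
        "week -3": (9421, 10000),   # (samples detected, samples in batch)
        "week -2": (9105, 10000),
        "week -1": (8872, 10000),
        "week +1": (8317, 10000),
    }

    for name, (hit, total) in rap_batches.items():
        print(f"{name}: {detection_rate(hit, total):.2f}%")

    average = sum(detection_rate(h, t) for h, t in rap_batches.values()) / len(rap_batches)
    print(f"RAP average: {average:.2f}%")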

In addition to the standard data provided, we decided this month to include some extra details of product versions where available, and to comment more closely on the number of errors, hangs, crashes etc. we experienced with the products, as well as to give an approximation of the total time required to get each product through the test suite. When planning our testing schedule we usually assume that a well-behaved product can be tested within a 24-hour period, allowing the main scans to run overnight. We hope it will be of interest to our readers to see which products conformed with our expectations.

With the many enhancements and adjustments made to our tests in recent months, a thorough overhaul of the detailed online methodology of our tests is required and will be completed as soon as possible. For those who do not regularly follow these reports, though, we have put together a brief synopsis of the test method which will be included with each set of published results, in the form of an appendix to the test report. We advise all readers to take this information on board as an aid to the comprehension and interpretation of the test results.

In the meantime, we present the full rundown of results and product reports for this month’s comparative, in all its exhaustive and occasionally gory detail.

Results

Agnitum Outpost Security Suite Pro 7.0.4

Additional version information: 3403.520.1244, database 27/10/2010.

Apparently not content with producing one of the most highly regarded personal firewall solutions on the market, Agnitum has integrated malware detection – courtesy of the hugely popular VirusBuster engine – into its security suite with considerable finesse. The result is a version of the protective technology which is superior in many respects to that provided by the engine’s developer itself. The product, measuring a little over 100MB in all its parts, installs via a reasonably lengthy process and requires a reboot to complete.

The interface has had something of a facelift recently, and looks thoroughly at home in the glossy surroundings of the Windows 7 environment. The layout is clear and easy to navigate, providing no more than the basic requirements as far as options are concerned, but doing so with clarity and simplicity. Speed tests showed some fairly slow scanning speeds initially on demand, but with superb improvements on return visits thanks to some clever caching of results. On-access overheads were fairly average, while resource usage was impressively low – particularly CPU use while busy. Detection rates were pretty decent across the sets, with a reasonably consistent showing in the RAP sets.

Helped by the clever caching which worked on detections as well as clean files, all tests were complete within a single day of testing, and throughout the test period the product’s stability was flawless.
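
For the curious, the sketch below gives a generic illustration of this style of result caching – a minimal sketch of the concept, not Agnitum’s actual implementation: verdicts are remembered per file identity and database version, so unchanged files can be skipped on repeat runs.

    # Generic illustration of scan-result caching (not Agnitum's actual code):
    # a verdict is remembered against the file's path, size, mtime and the
    # detection database version, so unchanged files are skipped on warm runs.
    import hashlib
    from pathlib import Path

    _cache = {}

    def file_key(path, db_version):
        st = path.stat()
        return (str(path), st.st_size, st.st_mtime, db_version)

    def scan(path, db_version):
        key = file_key(path, db_version)
        if key in _cache:               # warm run: reuse the cached verdict
            return _cache[key]
        verdict = expensive_scan(path)  # cold run: full engine pass
        _cache[key] = verdict
        return verdict

    def expensive_scan(path):
        # Stand-in for the real engine, so the demo actually runs.
        return "clean" if hashlib.sha256(path.read_bytes()).hexdigest() else "unknown"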

A well-earned VB100 award goes to Agnitum thanks to complete coverage of the WildList and a clear run through the clean sets.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 83.06%
Worms & bots: 96.56%
Polymorphic: 90.52%
False positives: 0

AhnLab V3 Internet Security 8.0.3.23

Additional version information: Build 741, 2010.10.27.30.

AhnLab’s current product arrived as a fairly hefty 150MB installer, which ran through with little input required from the operator. The process was not super rapid however, thanks to a rather lengthy pause at the outset as it got itself into the right mood. The interface is fairly clear and pleasant to use, with some sensibly laid out options providing a little more than the basics in a very usable manner.

The on-demand speed tests took a fair amount of time, with longish scans in all the sets and minimal speed-up on repeated runs. File access lag times were a fraction above the average, as was CPU use, although memory drain was not excessive. Detection rates were pretty solid in the main sets, and not bad in the RAP sets either – fairly steady through the weeks with an interesting dip in the final reactive week (‘week -1’), recovering to previous heights in the proactive week (‘week +1’).

At the end of the on-access run over the main test sets – probably the most taxing portion of the test suite – the interface became unresponsive, but recovered after a reboot, and this was the only stability issue noted in a test period lasting less than 24 hours in total.

No problems were observed in the WildList or clean sets, and AhnLab earns a VB100 award after a very respectable performance.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 94.09%
Worms & bots: 96.64%
Polymorphic: 99.64%
False positives: 0

Arcabit ArcaVir 10.10.3708.4

Additional version information: Bases 2010.10.27 10:35:16.

Arcabit’s product was provided as an extra-large 227MB install package, and its set-up process is rather lengthy and occasionally bewildering. After a brief appearance the window vanishes for a spell, before running through the installation of the C++ redistributable package. This is followed by another lengthy spell of apparent inactivity, but eventually things get moving again. The product defaults at first to Polish, but this is easy to adjust, and once the installation is complete a reboot is requested. The login process felt somewhat longer than usual after this, but that may simply have been the result of a general sense of sluggishness picked up during the set-up procedure.

The main interface is divided into simple and advanced modes, from which we chose the more sophisticated version for most of our activities. This provides controls in a branching tree format down the left side, which gave reasonably simple access to a solid range of configuration controls. Scanning speeds were fairly sluggish over the archive and binaries sets, but fairly zippy through the other sets. On-access lags were fairly low too, although CPU use was quite high at busy times.

Detection rates were no more than reasonable in the standard sets, with a rather disappointing showing in the RAP sets, starting at a lowish level and dropping away rather sharply. The WildList was handled without problems, but in the clean sets a handful of items were mislabelled as malware, including several popular freeware tools and a file from Oracle which was considered so unlikely to be detected that it was included in the speed sets. As a result, Arcabit doesn’t quite make the grade for a VB100 award this month, despite good stability and getting through all the tests well within the expected 24 hours.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 69.75%
Worms & bots: 86.45%
Polymorphic: 84.78%
False positives: 8

Avast Software avast! 5.0.677

Additional version information: Definitions 101027-1.

After a batch of fairly hefty products, the hugely popular free version of avast! surprised us by arriving as a mere 50MB install package, including all required updates. The installation process was very simple, with the offer to join a community feedback scheme and the creation of a system restore point the only items of note. With no reboot required, the whole process was over in less than 30 seconds.

The interface is simply delightful – easy on the eye and the mind alike, providing ample configuration options without being overwhelming. Despite its free-for-home-use nature, the product includes a pretty thorough range of additional protection layers as would be expected of a fully fledged security suite. Running through the tests proved as pleasing as ever, with splendidly fast scanning speeds and similarly impressive on-access measures. RAM usage was fairly low, but CPU consumption a little higher than expected. Detection rates were also excellent, in the RAP sets as well as the standard ones, and with no problems in the clean or WildList sets the product easily earns a VB100 award.

Stability, responsiveness and general good design also earn Avast a respectful nod of approval from the lab team – the fact that all tests were complete not long after lunch on the same day they were started brought an additional smile.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 98.48%
Worms & bots: 97.90%
Polymorphic: 94.41%
False positives: 0

Avertive VirusTect 1.1.21

Additional version information: Definitions version 12.70.6, definitions date 26/10/2010.

Avertive submitted the first of what promised to be several pretty similar products this month, all based on the same SDK built around the VirusBuster engine. The desktop AV product, VirusTect, was provided as an 80MB installer including fresh updates, and its installation process was simple and unchallenging. One thing which slowed things down initially was the need to be online in order to run the installer, but this requirement appeared to apply only for the first few moments, for the application of a licence code which enables the more advanced settings. These proved not to be especially advanced, covering little more than the basics but going further than a few of this month’s products. The layout is clear and fairly lucid, although we found differentiating between ‘detect only’ and ‘try disinfect first’ options a little confusing.

Scanning speeds were medium on demand and on access, with performance measures coming in pretty low. Running through the sets was reasonably painless and problem-free, finishing comfortably within the one-day period allocated.

Detection rates in the main sets were solid, with middling rates in the RAP sets; the clean sets threw up only a single item alerted on as being packed with the Themida packer tool (popular with malware authors), and the WildList sets were handled without problems on demand. On access however, as with other branches of this product line in previous tests, a handful of items were missed despite being spotted on demand, and no VB100 award can be granted despite a decent showing in general.

ItW: 100.00%
ItW (o/a): 97.72%
Trojans: 81.60%
Worms & bots: 94.48%
Polymorphic: 90.51%
False positives: 0

AVG Internet Security 2010 10.0.1152

Additional version information: Virus database version 424/3220, release date 26 October 2010, 06:34.

Back with the larger installers, AVG’s comes in at 141MB, but does promise a complete suite. The set-up process is quite lengthy, and includes the offer of a toolbar which provides Yahoo! searching alongside the security features. No reboot is needed at the end, but the set-up is followed by some additional configuration stages, including registration of the user’s personal information and the option to join a community feedback scheme. The interface – which is also accessible via a funky modern desktop gizmo – has had a bit of a facelift since its appearance in recent tests, and looks sharp and crisp, although still somewhat cluttered by the large number of modules. Configuration is provided in considerable depth, but is generally straightforward to access and the layout makes good sense.

Previous tests have seen some rather sluggish scanning speeds and we were prepared for more of the same, but the facelift noted above has clearly gone deeper than the surface, providing some considerable improvements at the operational layer too. Initial scan times were pretty decent, and repeat runs lightning fast, indicating a smart approach to known-clean items. Even with the settings turned up from their initial default level, which delves deep into archive files but trusts in file extensions to decide what to scan, speeds remained more than respectable. A similarly impressive speed-up was observed in the on-access tests, and RAM use was perhaps just a fraction above the month’s average, but CPU use appeared fairly high in comparison to the rest of the field.

Detection rates were excellent in the main sets, and made a solid start in the RAP sets too, dropping off fairly steadily through the weeks but never dipping below a reasonable level. The suite includes a thorough range of additional protective layers to cover more recently emerging threats.

Stability was flawless, and testing was complete within the 24-hour period hoped for. With perfect coverage of the WildList and clean sets, a VB100 award is comfortably earned by AVG.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 95.41%
Worms & bots: 99.33%
Polymorphic: 99.33%
False positives: 0

Avira AntiVir Personal 10.0.0.567

Additional version information: Search engine 8.02.04.86, virus definition file 7.10.13.44.

Avira’s free-for-home-use product was provided as a 43MB main installer with 45MB of updates, and ran through fairly rapidly. It informs the user that Windows Defender may no longer be useful, but doesn’t go as far as removing it. It also offers an optional registration system, and fills the screen with a large advertisement encouraging the user to upgrade to the full paid edition. No reboot is needed to complete.

The interface is fairly simple and not overwhelmingly attractive, but provides a solid range of configuration options, many of the more interesting ones tucked away in the ‘advanced’ area. Default settings are sensible, although the scheduled scan job is fairly unusual in being set up ready to go but not enabled by default. Scanning speeds were pretty decent – although there was no sign of speed-up on repeat runs – and file access times were similarly good. Resource usage was on the low side of average.

The infected sets were powered through in splendid time, although a couple of items in the RAP sets appeared to snag the scanner somewhat; these needed to be removed to allow the scans to complete, but even with this interruption the product completed all tests without even needing an overnight stay. Detection rates were as superb as ever, with the RAP scores declining only very slightly into the later weeks.

The WildList presented no difficulties, and with the clean sets handled well too, a VB100 award is comfortably earned by Avira.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 99.13%
Worms & bots: 99.82%
Polymorphic: 100.00%
False positives: 0

Avira AntiVir Professional 10.0.0.918

Additional version information: Search engine 8.02.04.86, virus definition file 7.10.13.44.

The professional (paid-for) version of the Avira solution seemed pretty similar to the free version on the surface, with the installer comparable in size and the same updater used for both versions. The installation process includes many of the same stages, but uses a licence key file rather than the optional registration and nag screens. It’s all very clear, progresses quickly and needs no reboot to complete.

Looking more closely at the interface, a few additional protective modules are available, as well as more in-depth configuration options in some areas. Rather surprisingly, scanning speeds were a little slower than the free version in most cases, and on-access times noticeably higher, but performance figures were fairly close. Detection rates were identical, thanks to the shared updater. This meant that, once again, we needed to remove a brace of files from the RAP sets to prevent snagging, but the product quickly racked up some more superb scores, devouring the infected sets in truly awesome time and barely missing a thing, finishing the same working day as it started.

With no problems in the core certification sets Avira picks up another VB100, with our thanks for a speedy and reasonably stable performance throughout.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 99.13%
Worms & bots: 99.82%
Polymorphic: 100.00%
False positives: 0

BitDefender Business Client 11.0.22

Additional version information: N/A

BitDefender provided its business solution for this month’s test, which arrived as a 137MB package with all updates included. The set-up process is short and sweet, including proud mention of the awards earned by the company, and ends with a request to reboot the system. The interface is divided into simple and advanced versions, both of which are fairly clean, simple and businesslike; the advanced version offers an impeccable degree of fine-tuning options for the more demanding user.

Running through the tests proved unproblematic, if a little less rapid than expected. On-demand scanning showed no sign of the speed-up on repeat runs we have come to expect from the BitDefender range, but even so was considerably faster than some of this month’s competitors. In the on-access measures – where such techniques are perhaps more significant – the speed-ups were impressive, with lowish RAM usage too, although a fair number of CPU cycles were used when processing files at speed. The decent speeds and good stability ensured comfortable completion of the full test suite within 24 hours.

Detection rates were excellent, with superb scores in the main sets and a solid level across the RAP sets, declining very gradually across the weeks. No issues emerged in the certification sets, and with a thoroughly solid and respectable performance BitDefender is a worthy VB100 winner.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 97.20%
Worms & bots: 99.78%
Polymorphic: 100.00%
False positives: 0

Bkis BKAV Home Plus 2010 3090

Additional version information: Engine 3.5.6, pattern codes 3.337.949, update 25/10/2010.

Bkis is a fairly fresh face in our VB100 tests, but has shown impressive improvements in the few tests it has appeared in, and we looked forward to seeing further growth. The company’s home-user product was entered this month, weighing in at a fairly large 272MB including updates. The installation process was remarkably fast and simple though, requiring only a couple of clicks and a very brief wait (accompanied by an informative slideshow) to get things going. A reboot was needed to round things off.

The somewhat glaring orange interface looks a little simplistic, but provides a basic range of options very lucidly, making everything easy to find and operate. It proved responsive and stable throughout our stressful suite of tests. Scanning speeds through the clean sets were fairly sluggish, apart from in the archive sets where little was scanned internally, even with all options enabled. On-access measures were similarly hefty, and although RAM use was not much above average, CPU use was pretty high.

This was more than compensated for by the detection rates however, which proved truly remarkable across all the sets, including the RAP sets, with all three reactive weeks handled excellently and a step down to merely highly impressive in the proactive set.

The WildList presented no difficulties, and not a single problem appeared in the clean sets either; a superb showing earns Bkis another VB100 award, and our congratulations on one of the best performances of the month.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 87.76%
Worms & bots: 94.94%
Polymorphic: 83.87%
False positives: 0

CA Internet Security Suite Plus 7.0.0.107

Additional version information: Security center version 7.0.0.107, anti-malware SDK version 1.4.0.1499, signature file version 3998.0.0.0.

CA once again entered both consumer and business solutions for this test, and once again insisted on both being installed, activated and updated online on the deadline day. Our scheduling meant that the consumer version was updated fairly early in the day, with version 3998 of the signatures acquired; no time was available to re-check, although we are informed that another set of updates was released later the same day.

The initial installer file was 146MB, and after a fairly quick, colourful and funky set-up process it spent rather a long time downloading at least 25MB of additional updates. Having endured the wait for this to complete, ignored the offer of a Yahoo! toolbar, and witnessed a remarkably rapid scan which seemed to present its results almost instantly, a reboot was required.

On restart, further work was needed to register and license the product, with a fair amount of personal information needing to be entered. The interface design is iconoclastic and somewhat bizarre in places, with some fairly confusing options, but most of the things we needed were available after some searching and a little head-scratching.

On-demand scans were a little hard to monitor as they provided no progress information, but the speed tests completed without incident. Some slowish speeds were recorded in the archive sets, but good speeds elsewhere, with some solid speed-ups on repeated runs. On-access measures showed a similar pattern with decent times on initial viewing which were enhanced on return visits, while RAM use was fairly high, but CPU drain no more than average.

Detection scores were a little harder to come by, apparently thanks to an adjustment in how scan results are stored. In previous tests, CA’s solutions have routinely shown themselves to be among the fastest to plough through our large infected sets, but this time a scan left running overnight was found the next morning to have stopped less than halfway through, providing an error message and an interface announcing 50,000 detections but no logging of them to be found on disk. Given the 800MB of RAM in use by the scanner process, we assumed that scan results were instead being stored in memory.

Re-running the scans in smaller chunks proved a little better – they still slowed down notably as the number of detections rose, and after a few tens of thousands of detections the entire system became slow to respond. However, these circumstances would be fairly unlikely in the real world, so it is hard to complain about this odd change too strongly. The wasted time and additional work meant that testing overran considerably, taking up close to three of our precious test days.
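
For illustration, a chunked scan driver needs little more than the sketch below; ‘scanner.exe’ and its flags are placeholders invented for the example, not CA’s actual command line:

    # Hypothetical sketch of the 'smaller chunks' workaround: each top-level
    # subdirectory of the test set is scanned separately, so the memory held
    # by any one scan stays bounded. 'scanner.exe' and its flags are
    # placeholders, not CA's real command line.
    import subprocess
    from pathlib import Path

    TEST_SET = Path(r"C:\testsets\main")

    for chunk in sorted(p for p in TEST_SET.iterdir() if p.is_dir()):
        subprocess.run(
            ["scanner.exe", "/scan", str(chunk), "/report", chunk.name + ".log"],
            check=False,  # carry on even if one chunk errors out
        )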

In the end, some decent results were obtained in the standard sets, with RAP scores more mediocre. A couple of items were alerted on in the clean sets including Google’s Desktop Search package, and in the WildList set a handful of W32/Virut samples were not detected. Oddly, these were not from the most recent batch, but from those included in the last test – which at the time were covered by CA products. This suggests that some adjustment to the detection for this strain had left a gap in protection. Either way, CA’s home solution does not quite make the grade for VB100 certification this month.

ItW: 99.999%
ItW (o/a): 99.999%
Trojans: 77.03%
Worms & bots: 91.67%
Polymorphic: 96.25%
False positives: 2

CA Total Defense r12 12.0.193

Additional version information: Anti-malware version 1.3.3.1262, engine 36.1.0.6, signature 36.1.1.4001.

CA’s second entry this month is its business offering – a staple of our comparatives for many years and one which, for at least the last four years, has remained virtually unchanged despite our occasional complaints. This time, however, we were finally treated to a new business solution: Total Defense r12.

As usual, the company insisted on our installing and updating with Internet access, meaning it all had to be done on the deadline day, but despite our repeated requests to get things started well in advance the link did not arrive until the morning of the deadline itself. This was a little problematic to say the least, as the solution can apparently only be provided as a complete DVD image, measuring well over 3GB. This was the largest submission for this month’s test by more than ten times and also the slowest to download by a considerable margin, taking almost seven hours to make its way to the lab.

With this initial hurdle overcome, the rest of the set-up process was also far from plain sailing. There were a number of dependencies to resolve, including such security-friendly items as Adobe Flash and Reader, some confusing and demanding forms in the installation process (not least the insistence on changing the system’s admin password to something stronger before the install could complete), the failure of one install attempt with little information as to why, and, after two reboots and an hour-long update, a message which insisted that the licence key applied just minutes earlier had now expired. An overnight wait and some kind of check with licensing servers (run at 2 a.m.) overcame this, and we were finally able to get our first look at the product itself.

The main client interface is fairly pleasant and clearly designed, with most of the standard options provided in easily accessible positions with a general air of thoroughness and diligence. It also seemed sturdy and responsive, compared to the previous offering. An administration console was provided, which was again browser-based and heavily reliant on Flash, but it looked fairly well laid out and not too difficult to navigate. We didn’t really explore this in too much depth though, staying with it just long enough to grant rights to the local client to run scans and change settings.

Moving on to the test, speed measures went well, with initial scans fairly zippy and repeat visits lightning fast. Things were a little slower with full depth scanning enabled, but that’s only to be expected. On-access times were pretty decent, and while RAM consumption was fairly high – perhaps accounted for by the additional management tools required – CPU use was remarkably low.

The detection tests proved problematic, with on-demand scans taking huge amounts of time and using vast amounts of memory – over 1GB by the end of the main set scan – although this time it did at least complete without crashing. With the machine barely responding, even after a reboot, we didn’t dare revisit the admin GUI to harvest results, instead relying on ripping them out of a raw SQL file we managed to dig up. On-access tests, run after the test machine had been reimaged to a clean state, were a little less tricky, but harder to gather results for, as not only did the product disobey our explicit instruction not to clean or remove any files (thus rendering the logs kept by our access tools somewhat unreliable), but it also seemed to be a little inaccurate in its own logging. Several retries later, we eventually pulled together a set of figures which we hope are reasonably complete, showing similar scores to the consumer version in most sets, right down to the handful of Virut samples not covered in the WildList set.
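
Recovering detections from such a dump amounts to little more than pattern-matching the INSERT statements; the sketch below is purely illustrative, with an invented table layout rather than CA’s real schema:

    # Illustrative only: pull (path, threat) pairs out of a raw SQL dump by
    # matching INSERT statements. The table and column layout is invented
    # for the example; the real schema was CA's own.
    import re

    pattern = re.compile(
        r"INSERT INTO\s+detections\s+VALUES\s*\('(?P<path>[^']+)',\s*'(?P<threat>[^']+)'\)",
        re.IGNORECASE,
    )

    with open("results_dump.sql", encoding="utf-8", errors="replace") as fh:
        for line in fh:
            m = pattern.search(line)
            if m:
                print(m.group("path"), "->", m.group("threat"))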

Thus, after giving us a lot of headaches and taking up more than five full days of hands-on testing time, CA’s shiny new solution fails to earn VB100 certification at its first attempt.

ItW: 99.999%
ItW (o/a): 99.999%
Trojans: 73.62%
Worms & bots: 69.84%
Polymorphic: 96.25%
False positives: 0

Celeritas Software Company WinSafeGuard 1.1.21

Additional version information: Definitions version 12.70.6, definitions date 26/10/2010.

WinSafeGuard is the second of several similar clones based on the VirusBuster engine and Preventon GUI this month. Celeritas (properly referred to as ‘Celeritas Software Company’, to avoid confusion with other similarly named enterprises) also produces optimization and privacy-cleaning tools, as well as a tool to locate, update and fix drivers. The company’s version of the AV solution comes in a crisp blue-and-white colour scheme, with the expected fairly simple installation process and decent set of controls.

The testing process followed the lines laid down by our first attempt at testing a similar solution, and completed, as expected, the morning after the initial install. The capping of the logs at a size just too small for our test sets meant some periodic harvesting was required to ensure all data was kept for processing. The results showed no surprises, with the expected pretty decent showing in the standard sets, a reasonable set of RAP scores, a single suspicious packer noted in the clean sets and solid coverage of the WildList on demand. On access things unravelled once again though, with the same handful of samples mysteriously missed; the odd issue denies Celeritas a VB100 award, despite a generally decent showing.
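
The harvesting itself is simple enough – copy the capped log aside faster than the product can truncate it. The sketch below shows the general idea, with hypothetical paths and an arbitrary interval:

    # Minimal sketch of periodic log harvesting: the capped log is copied
    # aside at intervals before the product truncates it. The log path and
    # interval are hypothetical.
    import shutil
    import time
    from pathlib import Path

    LOG = Path(r"C:\ProgramData\product\scan.log")   # hypothetical location
    DEST = Path(r"C:\harvested")
    DEST.mkdir(exist_ok=True)

    for i in range(48):                 # e.g. every 10 minutes over 8 hours
        if LOG.exists():
            shutil.copy2(LOG, DEST / f"scan_{i:03d}.log")
        time.sleep(600)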

ItW: 100.00%
ItW (o/a): 97.72%
Trojans: 81.60%
Worms & bots: 94.48%
Polymorphic: 90.51%
False positives: 0

Central Command Vexira 6.3.14

Additional version information: Engine 5.1.1, databases 12.70.8.

Central Command’s Vexira is yet another product that uses the ubiquitous VirusBuster engine, but goes a step further by using a clone of its interface too, with only the colour scheme and branding to tell the two apart. Provided as a 67MB installer with 85MB of updates, the set-up process includes more stages than many but is reasonably clear and painless, with no reboot needed to complete. The garish red of the interface is a little trying on the eyes at first but one soon becomes inured to it. Similarly, the layout seems clunky and awkward initially, but after some practice it is reasonably straightforward to operate; a decent, if not exhaustive level of configuration is provided.

On-demand scanning speeds were pretty good, remaining steady across multiple attempts and slowing down somewhat in the archive set once archive handling was activated. Although the option to check compressed files on access appears in the GUI, it could not be made to produce any results. Resource use and file access lags were fairly low, and stability was solid throughout, with all tests finished within 24 hours of initial installation.

Results were much as expected, with a very decent showing in the standard sets and pretty reasonable, and again very steady scores in the RAP sets. With a single Themida-packed item alerted on in the clean sets and no problems at all in the WildList, Central Command once again earns a VB100 award quite comfortably.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 82.40%
Worms & bots: 96.64%
Polymorphic: 90.52%
False positives: 0

Clearsight AntiVirus 2.1.21

Additional version information: Definitions version 12.70.6, definitions date 26/10/2010.

Three in a row here for the VirusBuster engine, with another solution using the Preventon interface – this one from Clearsight, a company that seems to be focused on keeping things simple. A free edition of the product is available, along with a ‘Pro’ version that has extra configuration controls; as usual we required these, so had to connect to the web to apply an activation code before continuing.

Running through the tests quickly became formulaic, having practised a few times already, and once again it took up most of a day for the main tests and finished after an overnight scan job. Speeds were as expected – not unreasonable, but not super-fast, with a lightish touch in terms of resource use. On-access times were a little odd: super-light in some areas, but above average in others. Detection rates, as predicted, were decent in most areas, with once again the WildList handled fine on demand but a handful of infected samples not spotted on access; thus, another reasonable performance does not quite make the grade for certification.

ItW: 100.00%
ItW (o/a): 97.72%
Trojans: 81.60%
Worms & bots: 94.48%
Polymorphic: 90.51%
False positives: 0

Commtouch Command Anti-Malware 5.1.10

Additional version information: Engine version 5.2.12, DAT file ID 201010270229.

The Command solution, recently taken over by Commtouch, had some issues in the last comparative, which were eventually discovered to be due to a miscommunication at the submission stage which led to a rather aged version of the engine being used. Watching closely for such issues this time, we installed the slender 13MB main package and added the similarly compact 24MB updater with caution, but all seemed in good order after a slowish but low-interaction set-up process.

The interface is fairly simple, with a button marked ‘advanced’ which grants access to a fairly basic set of configuration options. What controls there are can be accessed easily, and setting up the various jobs required was simple and rapid. It did not take too long to run through the speed tests, with reasonable and very steady scanning speeds on demand and not overly heavy overheads on access, while CPU use was no higher than most and RAM consumption fairly low.

Running through the infected sets proved a little more time consuming – not so much for the scans themselves, but more for the time needed to display and export results. On several occasions this took such a long time that we assumed it had completely failed, and on one occasion we observed an actual crash, possibly caused by an on-access pop-up appearing while the product was straining to decipher its log data. However, the data was stored in Access database format, so we were able to rescue it where necessary, and testing completed in pretty reasonable time.
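
Rescuing data from an Access store is straightforward using the standard Microsoft Access ODBC driver, as the sketch below shows; the database path, table and column names are guesses for illustration, not Command’s actual schema:

    # Hedged sketch of reading detections straight from an Access database
    # (requires the third-party pyodbc package and the Microsoft Access ODBC
    # driver). The path, table and column names are invented for illustration.
    import pyodbc

    conn = pyodbc.connect(
        r"DRIVER={Microsoft Access Driver (*.mdb, *.accdb)};"
        r"DBQ=C:\ProgramData\product\results.mdb"
    )
    cursor = conn.cursor()
    for row in cursor.execute("SELECT FilePath, Verdict FROM ScanResults"):
        print(row.FilePath, row.Verdict)
    conn.close()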

On processing the data retrieved, we found some pretty decent scores across the sets, with a fairly steady level across the RAP sets – achieving their peak in the ‘week -1’ set. No problems were spotted in the clean or WildList sets, and a VB100 award is duly earned.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 71.97%
Worms & bots: 88.73%
Polymorphic: 100.00%
False positives: 0

Comodo AntiVirus 5.0.163652.1142

Additional version information: Virus signature database version 6526.

Comodo put in an impressive debut performance in the last test, although it did not quite achieve certification. Once again this month both the plain anti-virus and full suite solutions were entered for testing. The AntiVirus product was provided as a 51MB installer package but required online updating, which fetched an additional 111MB of data after a fairly lengthy, multi-stage set-up process. The set-up includes the offer to use Comodo’s own secure DNS servers, ‘in-the-cloud’ validation of running applications, and a wide range of possible languages – some of the translations being provided by the active user base. A reboot is required to complete the process.

The interface displayed at the end of the installation process has seen a significant redesign since the last time it graced our test bench. It looks slick, clean and attractive, with large, clear and well-labelled controls, providing a reasonable if not exhaustive level of fine tuning. The solution includes considerably more than the basics of traditional anti-malware however, with the ‘Defense+’ tab providing a pretty impressive range of additional intrusion prevention measures.

Testing ran fairly smoothly, with both on-access and on-demand scanning speeds around the norm for this month’s figures, and resource consumption similarly average. At one point during the big scan of the main infected sets the product showed a very polite message suggesting it had crashed, but the scan appeared to complete and no gaps were noted with real-time protection either. The real-time tests were a little more difficult to get through, as the product insisted on removing every trace it spotted. The job started on a Friday afternoon and was only just finishing at lunchtime the following Monday, meaning the product took slightly more than the hoped-for average in terms of machine hours, but thanks to careful scheduling not much hands-on time was actually wasted.

Scores in the main test sets were fairly decent, with a little work to do in covering some items in the polymorphic sets, and RAP scores were at the lower end of the middle of the field. In the clean sets a single file from a version of the popular Nero CD burning suite was flagged as a virus, with an additional item labelled suspicious, while a handful of WildList files were not picked up. Thus Comodo is denied VB100 certification once again despite a generally reasonable showing.

ItW: 99.19%
ItW (o/a): 99.19%
Trojans: 85.34%
Worms & bots: 90.72%
Polymorphic: 64.76%
False positives: 1

Comodo Internet Security 5.0.163652.1142

Additional version information: Virus signature database version 6526.

With a set-up package and process almost identical to its sibling product, Comodo’s suite solution also needed online updates and a reboot to get things going. The main addition that makes this a full suite is Comodo’s well-regarded firewall, but this made for little extra work in the installation.

The GUI is similar, clear and clean with a nice layout and ample configuration for most of its features, without appearing cluttered or awkward to navigate. Scanning speeds were reasonable on demand, while on access they were notably faster than the previous product, although CPU use was higher to compensate. Again, our on-access scan over the main sets took an extremely long time, but we were ready and ran it over a weekend, and this time no stability issues were observed despite the long duration. Detection scores were reasonable in general, but a single false positive and a handful of misses in the WildList ensure Comodo misses out on VB100 certification after a promising showing.

ItW: 99.19%
ItW (o/a): 99.19%
Trojans: 85.73%
Worms & bots: 91.02%
Polymorphic: 64.76%
False positives: 1

Coranti 2010 1.001.00011

Additional version information: Updated 27/10/2010 1400 GMT.

Coranti’s multi-engine approach – which includes technologies from BitDefender, F-PROT, Norman and Lavasoft – meant that the original 47MB installer package needed to be augmented with a large quantity of update data. Some 300MB came down in a 30-minute period after the fairly simple and speedy set-up process, which needed no reboot to complete. The interface is fairly busy, providing lots of information and an excellent degree of configuration, but is rationally laid out and reasonably simple to operate.

Scanning speeds were not very fast, as might be expected, but not terrible either. On-access overheads and resource consumption were very heavy. Despite this, getting through the full test suite took a day and a night as hoped, and showed the expected excellent detection rates across all sets, with very gradual declines through the RAP sets.

The clean set brought up a number of alerts, some of them reporting adware items while others said little more than that an item had been ‘reported’, but these were allowed as suspicious alerts only; with the WildList covered effortlessly, Coranti earns another VB100 award for its efforts.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 99.14%
Worms & bots: 99.99%
Polymorphic: 100.00%
False positives: 0

Defenx Security Suite 2011 3389.519.1244

Additional version information: Malware database 27/10/2010.

Defenx provided its product as a 108MB installation package with all updates rolled in. The set-up process ran through a fair number of stages, including the setting of a system restore point and installation of Visual C++ Runtime components, checking network connections and running applications before finally requesting a reboot to complete. The interface, which is similar to the Agnitum solution on which it is based, is clear and logical, providing a reasonable level of configuration for the anti-malware module which is just one of several protective layers included in the product.

Scanning speeds were slowish at first but improved splendidly on repeat runs, while on-access overheads were reasonable and resource usage fairly low. Detection rates were much as might be expected from the VirusBuster engine underlying the anti-malware component, with good results in the main sets and a reasonable showing in the RAP sets. Smart caching of results extended to the infected sets, where the on-access run over the main sets completed in less than 15 minutes – something of a record and a delight in a month where a handful of solutions required several days to complete the same task.

With all tests completed well inside the allotted period, and no issues more serious than a (quite accurate) warning of a Themida-protected file in the clean sets, Defenx easily earns another VB100 award.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 83.45%
Worms & bots: 96.63%
Polymorphic: 90.52%
False positives: 0

Digital Defender Antivirus Full 2.1.21

Additional version information: Definitions version 12.70.6, definitions date 26/10/2010.

Yet another of the swathe of similar VirusBuster/Preventon-based solutions, Digital Defender has entered several tests in the past year or so and has its first VB100 well under its belt, although the last few tests have seen some bad luck. The installation and set-up process has already been covered in several previous entries this month, the only difference here being the company logo and colour scheme. Speeds, resource consumption and detection rates were all pretty reasonable, testing ran for almost exactly 24 hours without incident, and once again that handful of items in the WildList spoiled what would otherwise have been a very decent performance.

ItW: 100.00%
ItW (o/a): 97.72%
Trojans: 81.60%
Worms & bots: 94.48%
Polymorphic: 90.51%
False positives: 0

eEye Digital Security Blink 4.7.1

Additional version information: Rule version 1603, anti-virus version 1.1.1257.

The Blink solution includes a wealth of extra protective layers above and beyond the anti-malware component provided by the Norman engine, and the installation package is thus a fair size at 158MB, with an additional 72MB of updates to add. The installation process is not complex, but takes some time – much of it taken up by some Visual C++ components – and completes without the need for a reboot. The interface is sharp and serious, with a decent level of controls.

Running through the on-demand tests was rather a chore, as the default setting for such scans is ‘idle’ priority. They thus strolled languorously through the speed sets, in no great hurry to get anywhere, but completed with only a couple of suspicious item warnings in the clean set. On-access times were similarly sluggish, but resource consumption was not outlandishly high. The infected sets also took a fair amount of time (despite the priority being adjusted upwards to hurry things along), mainly thanks to the in-depth sandboxing provided.

In the end, after several days of hands-on time and a weekend in between to run long scans, full results were gathered without any sign of stability issues, and showed decent scores in the main sets and a somewhat disappointing showing in the RAP sets. Happily for eEye though, after a run of bad luck in recent tests the WildList came up all clear, and a VB100 award is earned without undue difficulty.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 69.81%
Worms & bots: 90.35%
Polymorphic: 85.40%
False positives: 0

Emsisoft Anti-Malware 5.0.0.84

Additional version information: N/A.

Emsisoft’s solution has grown into a mature and stylish looker, with a record of solid scores thanks to the Ikarus engine underlying it. The installation package, weighing in at 100MB including latest updates, runs through a fairly standard set of stages with no reboot needed to complete. It then runs a set-up wizard to finalize the last few stages of configuration. The interface is attractive and clean, with some configuration options, although it can be a little confusing in places; the behaviour of the main menu pane is particularly unsettling.

Scans ran fairly slowly with no sign of improvement on repeat runs, but on-access overheads were quite light, no doubt thanks in part to a very limited selection of file types being analysed. Memory usage was on the low side, but CPU perhaps a little higher than average, and as expected detection rates were pretty solid, with only the polymorphic set leaving much room for improvement. Acquiring these scores was far from easy however, as scans of large numbers of infected items tended to be rather slow, and in one instance a scan left running over the weekend froze without leaving any details of what it had done so far. Even after rebooting the machine, the product seemed shy about coming back online, and in the end we had to reinstall it on another system to complete the tests.

With this slowness and instability, testing took several days either side of a long, wasted weekend, but many of these issues would only affect the most demanding of users, and the scores were good enough to make up for it. With no problems in the WildList and no false alarms, Emsisoft earns a VB100 award, having put us to quite some pains.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 89.02%
Worms & bots: 99.42%
Polymorphic: 81.84%
False positives: 0

eScan Internet Security for Windows 11.0.1139.843

Additional version information: Date of virus signatures 27 Oct 2010 11:52.

The latest version of eScan’s suite arrived as a 144MB installer, including all updates needed for the test. The install ran through the standard set of stages, including disabling the Windows firewall, and ended with a reboot.

The interface is fancy and stylish, with Mac-style icons which enlarge on mouse rollover, but under the hood a splendid level of configuration control is provided to satisfy even the most specialist of requirements. Operation proved fairly simple and pleasant, but slow scanning speeds tried our patience. On-demand speeds were generally slow but unpredictable, with some scans taking twice as long as other checks of the same sample sets run just minutes earlier. On-access overheads were fairly light however, and resource use not too heavy either.

Getting results for the detection sets took some time, with a scan of just the main sets and clean sets taking almost 60 hours to complete, and the RAP sets not much less. With the best part of a week taken up it needed more than its fair share of testing time and resources, but in the end showed a solid set of scores, with excellent levels in the main sets, a slow decline from a high starting position in the RAP sets, and no issues in the WildList or clean sets. After a long and arduous test run, eScan earns a VB100 award.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 97.96%
Worms & bots: 99.77%
Polymorphic: 100.00%
False positives: 0

ESET NOD32 Antivirus 4.2.64.12

Additional version information: Virus signature database 5568 (20101027).

One of our most consistent performers, ESET’s NOD32 is just about guaranteed to make an appearance in any VB100 comparative, and this month is no exception. The product comes as a slender 41MB package including all required updates, and the installation process is zippy and simple, enlivened by the offer to join a community feedback scheme and the choice of whether or not to detect greyware. No reboot is needed to finish.

The interface has been stable for some time now, and needs no changing; it has a good clean design, seeming sturdy and serious at all times but never ugly, and providing an excellent level of fine-tuning controls. In places it is perhaps a little repetitive, with seemingly the same items appearing in several places, and we found the scheduler a little difficult to track down, but it was generally a breeze to operate.

Scanning speeds were medium on initial runs but seemed to speed up considerably for the ‘warm’ measures, while on-access overheads were perhaps a fraction higher than the average. CPU usage was fairly low, while RAM use was higher than many this month. Stability was decent, and testing completed in good time, with on-demand scans of the infected sets taking a while thanks to the in-depth heuristics being applied, but all completing within a day and a night.

Final results were as splendid as ever, with strong scores across all sets and a particularly solid showing in the RAP sets. The clean sets turned up their usual handful of greyware alerts, which are doubtless quite accurate and mainly point out toolbars included with trial versions of popular apps. Nothing upset things in the WildList set, and ESET extends its unbroken run of VB100 success by yet another month.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 92.28%
Worms & bots: 96.15%
Polymorphic: 99.95%
False positives: 0

Filseclab Twister AntiVirus V7 R3 7.3.4.9985

Additional version information: Definition version 12.13447846, definition date 26/10/2010 17:00:38.

Filseclab’s product came as a free downloadable trial from the company’s website, at 53MB for the main installer and 41MB of updates, also easily accessed. The set-up process was fast and simple, but needed a reboot to complete. The interface is fairly clear and appealing, with a decent level of configuration, although some of the options in the interface – notably adding to the depth of archives scanned – seemed to have no effect. Operation proved fairly simple, and the tests rolled along nicely, with some fairly slow speeds in the on-demand tests but average overheads and low resource use, particularly in terms of CPU cycles.

Filseclab’s on-access component seems not to fully intercept all file reads, although some blocking was evident, so instead we gathered all on-access data by copying files around the system. Logging also seemed only to be active if the user responded to a prompt (unless the product was set to automatically apply actions), so we ended up with various copies of our test sets, in various states of repair, scattered across the test machine. Things were somewhat simpler on demand, and didn’t take too long, so testing didn’t overrun the allotted time slot by more than half a day or so, although it was more hands-on than most solutions.
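
In essence, the copy-based fallback attempts to copy each sample and treats a blocked or vanished copy as a detection; the sketch below illustrates the idea with hypothetical paths, and is not Filseclab’s or our exact tooling:

    # Minimal sketch of copy-based on-access testing: a sample counts as
    # detected if the guard blocks the copy or removes the copied file.
    # Paths are hypothetical.
    import shutil
    from pathlib import Path

    SAMPLES = Path(r"C:\testsets\wildlist")
    SCRATCH = Path(r"C:\scratch")
    SCRATCH.mkdir(exist_ok=True)

    detected, missed = [], []
    for sample in SAMPLES.rglob("*"):
        if not sample.is_file():
            continue
        target = SCRATCH / sample.name
        try:
            shutil.copy2(sample, target)
        except OSError:                 # read blocked by the on-access guard
            detected.append(sample)
            continue
        # The guard may allow the copy but quarantine or delete it afterwards.
        (detected if not target.exists() else missed).append(sample)

    print(f"detected on access: {len(detected)}, missed: {len(missed)}")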

Detection rates proved fairly decent, including a fairly good showing in the RAP sets, but as usual a fair number of WildList samples were not covered – most, but not all of them from the most recent strains of W32/Virut. We also saw a handful of false alarms in the clean sets, notably the popular VLC media player and some items from major business software house SAP. Thus Filseclab still does not quite make the grade for VB100 certification, but continues to show improvement.

ItW: 97.64%
ItW (o/a): 97.64%
Trojans: 88.66%
Worms & bots: 92.84%
Polymorphic: 43.30%
False positives: 6

Fortinet FortiClient 4.1.3.143

Additional version information: Virus signatures version 56.405, anti-virus engine 4.2.253.

Fortinet’s client solution came as a fairly large 91MB main package with an even larger 156MB of updates, but the set-up was fairly fast and simple, with only a warning that network connectivity may be interrupted temporarily to distinguish it from the average installation process. No reboot was needed to complete.

The interface is clear and efficient, fast to navigate, and makes it easy to set up jobs. On-demand speeds were not very quick, but on-access lag times were OK and RAM usage was fairly low. CPU use, on the other hand, was a little on the high side. No problems with stability were encountered, and testing completed in good time.

Results were very solid in the standard sets, but a little random in the RAP sets, zooming up and down like a rollercoaster. This led us to re-run some scans, but the same results were seen in multiple runs on several systems. No issues emerged in the WildList or clean sets, and Fortinet earns a VB100 award, with our gratitude for giving us so little to complain about.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 92.90%
Worms & bots: 98.71%
Polymorphic: 99.28%
False positives: 0

Frisk F-PROT Antivirus for Windows 6.0.9.4

Additional version information: Scanning engine 4.6.1, virus signature file 26/10/2010, 19:48.

F-PROT was its usual slim and compact self, the main installer just 29MB with 22MB of updates to apply, and the set-up process was super-fast and very painless, although a reboot was required at the end. The interface remains unchanged after several years – still as simple, chilly and crisp as ever, providing only basic controls.

This didn’t get in the way of testing, however, which ran along nicely through the speed tests, with some reasonable scan times on demand and low overheads on access; CPU use was surprisingly high, although RAM consumption was negligible even under heavy strain.
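
For reference, the resource measures quoted throughout this review can be approximated by sampling the product’s processes while a heavy workload runs. A rough sketch using the third-party psutil module follows, with a placeholder process name rather than any product’s real service name:

    import time
    import psutil  # third-party: pip install psutil

    def sample_usage(proc_names, duration=60.0, interval=1.0):
        """Return average combined CPU% and peak RSS (MB) of the named
        processes, sampled once per interval over the given duration."""
        procs = [p for p in psutil.process_iter(["name"])
                 if p.info["name"] in proc_names]
        for p in procs:
            p.cpu_percent(None)               # prime the per-process counters
        cpu_samples, peak_rss = [], 0
        end = time.monotonic() + duration
        while time.monotonic() < end:
            time.sleep(interval)
            try:
                cpu_samples.append(sum(p.cpu_percent(None) for p in procs))
                peak_rss = max(peak_rss, sum(p.memory_info().rss for p in procs))
            except psutil.NoSuchProcess:
                break                         # a product process went away
        avg_cpu = sum(cpu_samples) / max(len(cpu_samples), 1)
        return avg_cpu, peak_rss / 2**20

    # Run this while, say, a large file-copy job exercises the on-access hook.
    avg_cpu, peak_mb = sample_usage({"avservice.exe"})  # placeholder name
    print(f"avg CPU: {avg_cpu:.1f}%  peak RSS: {peak_mb:.1f}MB")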

The detection tests also seemed to be progressing nicely, but an overnight job proved too much and the scanner froze part way through, needing a reboot to get back to business. This seemed to be due to sheer weight of traffic rather than any particular file snarling things up, however, as re-running the remaining portions in smaller chunks produced no further issues, and testing completed by the end of the second day.

With decent scores in the main sets and an average showing in the RAP sets, Frisk also handled the WildList and clean sets with aplomb, earning a VB100 award.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 69.71%
Worms & bots: 88.91%
Polymorphic: 100.00%
False positives: 0

F-Secure Client Security 9.01 build 122

Additional version information: Anti-virus 9.20 build 16071.

F-Secure as usual entered a brace of products. First up is the company’s client solution. The 58MB installer is supplemented by 115MB of updates, shared by the two products, and runs through the standard stages to complete in good time, needing a reboot to finish. A hotfix package was also provided, and applied without difficulty, and the updates were similarly untroublesome. The interface is cool and stylish but can be a little tricky to navigate, since it is rather different from the average product and does not use standard styles and layouts. However, after some exploring, what limited configuration is available can be found fairly easily. Initial scanning speeds were good, and repeat runs lightning-fast, while on-access lags were very light indeed, partly thanks to the limitation of the types of files scanned. Resource usage was also fairly light.

Running through the test sets was smooth and unproblematic, although once again the logging proved unsuited to our unusual requirements, with the HTML log files taking a long time to build at the end of each scan. At least the product proved capable of running to completion, though, as we have seen problems in this area in the past.

When logs were finally ready, we saw some splendid scores, dropping fairly rapidly in the RAP sets but picking up a fraction in the ‘week +1’ set, as several products did this month. The WildList and clean sets presented no problems, and a VB100 award is duly earned.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 97.49%
Worms & bots: 99.73%
Polymorphic: 100.00%
False positives: 0

F-Secure Internet Security 10.50 build 197

Additional version information: Anti-virus 9.30 build 16250.

F-Secure’s main consumer suite product was just about indistinguishable from the client solution, with an installer of similar size and an installation process along very similar lines. The interface is likewise hard to tell apart from the client version, with the same quirky design and initial learning curve; options are just as limited. Scanning speeds were again good to start with and awesome on repeat views, with superbly low on-access overheads. RAM use was low, although CPU use was a little higher than the client version.

With time pressing, we opted to use the command-line scanner included in this product, with the same settings as the main GUI scanner, to avoid the extra half a day needed for the GUI to produce logs. We saw pretty similar scores, unsurprisingly as both products used the same updater. Again, no problems emerged in the certification sets, and F-Secure secures a pair of passes this month.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 98.29%
Worms & bots: 99.83%
Polymorphic: 100.00%
False positives: 0

G DATA Antivirus 2011 21.1.0.5

Additional version information: Update 10/25/2010.

G DATA is another regular entrant in our comparatives, with a strong record of high scores and solid performances. The company’s 2011 solution arrived as a 287MB package including all updates for both engines used, and installed simply in a few steps with little waiting around. A reboot was needed to complete. The interface is busy and informative but not cluttered, and provides the usual wealth of configuration options.

Scanning speeds, as usual, were no more than medium on the first run, but quickly became super-zippy in the ‘warm’ runs. On-access measures were a little heavy, but again showed signs of improvement once the product had familiarized itself with the system and its contents. Resource usage was impressively low throughout.

Testing generally ran smoothly and rapidly, although at one point the scanner GUI froze after a fairly simple scan, refusing to close down nicely and requiring a reboot to recover. After the reboot all was fine, however, and no repeat of the incident was observed. In the final reckoning, as ever for G DATA, scores were stratospheric, demolishing all the sets with ease, including an excellent showing in the RAP sets. No false positive issues, combined with flawless coverage of the WildList, earn G DATA another VB100 award after another remarkable performance.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 99.97%
Worms & bots: 99.95%
Polymorphic: 100.00%
False positives: 0

Hauri ViRobot 5.5

Additional version information: Engine version 2010-10-25.01(6374897).

Hauri had been missing from our tests for some time until its recent reappearance (see VB, October 2010, p.29). There were a few problems in its last appearance and we hoped to see a better performance this time around. The installer is fairly large at 317MB, although that includes all required updates. It runs through fairly easily, including the option to scan running processes before the installation begins. It completes rapidly, with no reboot needed, and the product interface is well designed and very professional, with a simple, pleasant style and the required controls and options in all the right places.

On-demand speeds were fairly mediocre, and on-access overheads also a little heavy, although RAM use was decidedly low. Running through the on-demand scans went OK, although saving logs at the end of larger jobs occasionally took longer than running the job itself, and on occasion may have failed, leading to some worries about the completeness of results. Analysing the logs showed some good scores though, with excellent coverage of the standard sets and very good scores in the RAP sets too.

Moving on to the on-access run over the standard sets, we soon observed that something was amiss when the opener tool completed its run in a matter of minutes. The logs showed that only the very first item had been blocked; it appeared that, as with several other products from China and Korea this month, ‘real time’ does not mean actual real time. In this case, files seemed to be queued for checking while access was granted in the meantime, pending a decision by the scanner. Pop-ups and log entries claimed that the product had blocked access to files, but as these appeared hours after the access took place this seemed more than a little inaccurate. Indeed, some 48 hours after the initial accessing, the ‘blocking’ process had only reached around 10% of the way through the set, and during that time files could be read, written to, and even executed.
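
The opener tool itself is very simple in essence: it attempts to read every sample and records a timestamped verdict, so that any ‘blocked’ pop-ups or log entries the product produces later can be compared against the moment access was actually granted. A minimal sketch of the idea, with hypothetical paths:

    import datetime
    from pathlib import Path

    SAMPLES = Path(r"D:\testsets\standard")   # hypothetical sample location

    with open("opener_log.txt", "w") as log:
        for sample in sorted(SAMPLES.rglob("*")):
            if not sample.is_file():
                continue
            stamp = datetime.datetime.now().isoformat(timespec="seconds")
            try:
                with open(sample, "rb") as fh:
                    fh.read(4096)             # a true real-time hook should block this read
                verdict = "READ OK"           # access granted: not blocked in real time
            except OSError:
                verdict = "BLOCKED"           # the scanner denied the open/read
            log.write(f"{stamp}\t{sample}\t{verdict}\n")

A log of this kind is what exposed the hours-long gap between access and ‘blocking’ described above.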

Not having anything like the time required to wait for the full results of the test, and not having any confidence in the protection supposedly provided, we decided to put a stop to things on the third day, writing off on-access results as a lost cause.

No VB100 award could thus be granted, and this decision was made easier by a false positive noted in the clean sets, in a piece of software from a leading manufacturer of mobile phones.

ItW: 100.00%
ItW (o/a): n/a
Trojans: 99.72%
Worms & bots: 99.88%
Polymorphic: 100.00%
False positives: 1

Ikarus virus.utilities 1.0.227

Additional version information: Update version 1.0.227, scan engine version 1.1.90, virus database 77036.

After a number of near misses over the last few years, Ikarus achieved its first VB100 certification in the summer, and following the success of another product using its engine in this test, things looked all set for a repeat performance. The product came as a 200MB ISO image of an install CD, with an additional 73MB of updates, and installed rapidly with the standard set of stages. The only unusual addition was the installation of the .NET framework, which added a little to the installation time, but no reboot was needed to complete.

The interface, using .NET, remains a little clunky and occasionally slow to respond, with a tendency to misbehave under heavy pressure, but is fairly simple to operate, providing a minimal selection of options. The speed tests ran through in decent time, and overheads were not too heavy, with below average memory consumption but a fair amount of CPU drain. The main detection tests went smoothly, with the full suite completed within 24 hours, although after some big scans the GUI became unresponsive and a reboot was needed to right things.

Checking the results showed some good scores across the board, with a gradual decline through the weeks of the RAP test, and with no problems in the core certification sets, Ikarus earns its second VB100 award.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 85.96%
Worms & bots: 99.29%
Polymorphic: 81.84%
False positives: 0

Iolo System Shield 4.1.0

Additional version information: Definitions date: Tuesday, October 26, 2010, 18:48.

Iolo produces a wide range of software solutions, including various optimization and clean-up tools, and the company’s security offerings have made a few sporadic appearances in our tests over the last few years.

Iolo has generally been unlucky in its timing or with the operation of the product in our environment, with several entries having been abandoned due to set-up problems. We almost gave up this time too, after the 48MB installer – which set up simply with no difficult questions and needed a reboot to finish off – refused to update online on the deadline day, apparently due to the set-up of our lab’s Internet connection. Nevertheless we persevered, and discovered that we could simply drop in the detection databases without the need for an Internet connection. As the product is based on technology from Commtouch (formerly Authentium), which in turn licenses the Frisk engine, we hoped to see a change in Iolo’s luck this month.

The product itself is glossy and attractive, with large, clear buttons providing access to a range of functions, including a decent level of configuration. Usage is fairly simple and it seemed generally stable and responsive.

Scanning speeds were not bad, although overheads seemed a little heavy and CPU use was quite high. On-access tests ran smoothly, but the interface reported far fewer detections than had actually been blocked, and logging for the on-demand component proved difficult to decipher from its unusual format. Nevertheless, with some workarounds – including allowing the product to delete samples and checking what was left behind, as well as using the on-access component for some on-demand measures – we achieved a fairly accurate set of results. These showed the expected decent rates in the main sets, reasonable and very stable coverage of the RAP samples, and no problems in the core certification sets, earning Iolo its first VB100 award and our congratulations.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 70.19%
Worms & bots: 88.88%
Polymorphic: 100.00%
False positives: 0

K7 Total Security Desktop Edition 10.0.057

Additional version information: Virus definition 9.66.2845.

K7 has become one of the regular and reliable performers in recent tests, and returns once more to the fray. The solution came as a slimline 57MB installer, which ran through very quickly with just a handful of steps, and no reboot was needed.

The interface is pleasant, clean and simple on the surface, with ample options presented in a clear and well-organized manner underneath, and it met with universal approval from the lab team. Running through the tests proved rapid and problem-free, with good on-demand speeds, low on-access overheads and low memory consumption, although CPU use was around average.

Detection scores were obtained without fuss, and showed decent rates in the main sets, with RAP scores a little below expectations, picking up a little in the ‘week +1’ set. Nevertheless, the WildList was handled well and the clean set threw up no surprises, earning K7 another VB100 award, and our thanks for another simple and painless day of testing.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 68.12%
Worms & bots: 90.76%
Polymorphic: 100.00%
False positives: 0

Kaspersky Antivirus 6 for Windows 6.0.4.1212a

Additional version information: N/A.

Kaspersky once again entered both its version 6 product and its latest suite, with version 6 up first. The installer came as a 78MB package, and took its updates from a large bundle of 157MB, although this included data for the full range of products.

The installation process was of reasonable speed and minimal complexity, included the option to disable the Windows firewall, and ended with a reboot of the system. The interface is sparkly and attractive without becoming too cluttered, and provides the full range of controls suitable for any purpose.

On-demand scanning speeds started a little below average, thanks to full-depth defaults, but sped up enormously later on, while on-access overheads were reasonable, increasing considerably when the settings were turned all the way up, as might be expected. Resource usage was admirably low throughout.

Running the detection tests proved fairly speedy, but in the RAP sets a number of files were found which seemed to cause some problems; scans repeatedly came to a halt, with one overnight job found next day to be estimating a further eight days until completion.

Eventually, after removing several such problematic files and restarting the scan numerous times, we got through to the end and managed to export the results – another job which took quite some time. In the end, completing all tests took more than two full days.

It all proved worthwhile though, with some very good scores in all sets and a strong showing in the RAP tests. The core certification components presented no difficulties, and Kaspersky earns a VB100 award despite the hold-ups along the way.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 91.69%
Worms & bots: 97.87%
Polymorphic: 100.00%
False positives: 0

Kaspersky Internet Security 2011 11.0.2.556

Additional version information: Database release date 20/10/2010 12:31:00.

The latest version of Kaspersky’s ever-popular consumer suite solution was provided as a slightly larger 110MB installation package, and used the same set of bases for its updates. The installation process zipped through rapidly – all done in half a minute with no need to reboot – and presented the latest interface in all its glory. The trademark green has been toned down somewhat from recent editions, ditching the shiny metallic look for a more autumnal, foresty shade, and the product itself has a number of other more technical innovations rolled in. These include another Windows 7 desktop gewgaw, and a snazzy drag-and-drop scanning system, but all the old fine-tuning controls are still available under the bonnet, in their usual slightly quirky presentation style.

Again, scanning speeds started off average and sped up massively for the warm jobs, and on-access times were similarly enhanced after initial inspection. RAM use was a little higher than for the version 6 edition, but CPU use was way down. We saw the same batches of samples snagging the scanner – most of them small installation packages, the bulk of which were excluded from the final RAP lists in the later stages of validation – but we were ready this time and removed most of them as soon as we saw the issue re-emerge. It was interesting to note that the option to abort scanning a file after 30 seconds seemed not to help with this issue. Also recurring was the extreme slowness of displaying and exporting logs, but perhaps this is forgivable given that our log data is orders of magnitude larger than any that a real-world user would need to handle.

In the final reckoning, after a day and a half or so of work completing the tests, scores were again superb, a few notches higher than the older version as one might expect. RAP scores in particular were pretty stellar, and the core certification sets proved a breeze, with another VB100 award going to Kaspersky this month.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 95.49%
Worms & bots: 98.02%
Polymorphic: 100.00%
False positives: 0

Keniu Antivirus 1.0

Additional version information: 2010.10.19.0650.

As a Chinese solution based on the Kaspersky engine, we hoped that Keniu would handle the handful of nasties lurking in our RAP sets as we began installing the 82MB package. The set-up was fast and simple, with a very brief ‘system analysis’ phase but no messing around and no need to reboot; we soon had the simple, minimal interface up and running. With its plain colour scheme and large buttons it is fairly basic to operate, but provides a few options in an ‘advanced’ area, and proved admirably suited to running through our tests.

On-demand scanning speeds were rather on the slow side, lacking the advanced tricks used by others to help things along on repeat viewings, but lag times were light and resource usage below average. On-access tests produced a few odd results, and had to be repeated, but this was fairly speedy and simple and didn’t stretch our time allowance too much.

In the on-demand tests, we saw a number of files catch the scanner out, sticking it in a loop from which it refused to emerge. In one case even rebooting the system didn’t seem to help, with the scanner appearing to run along but failing to detect anything further. That installation had to be abandoned as irrevocably broken, and along with numerous stop-start scans, a reinstallation with several known-dangerous files removed in advance was needed to get to the end of testing. After several days’ hard work we got things as finished as possible, with solid scores in the standard sets and a good start in the RAP sets, which declined fairly rapidly after the first week and remained fairly steady from there on. An early freezing of updates for submission, along with the problems encountered, probably explains the lower-than-expected scores.

The WildList set was ably handled in the end though, and with no problems in the clean sets Keniu earns a VB100 award, having given us plenty to do to get there.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 93.23%
Worms & bots: 97.93%
Polymorphic: 100.00%
False positives: 0

Kingsoft Internet Security 2011 Advanced Edition 2008.11.6.63

Additional version information: Engine version 2009.02.05.15, data stream 2007.03.29.18, virus definitions 2010.10.28.01.

Kingsoft as usual entered both ‘Standard’ and ‘Advanced’ editions of its suite solution, and as usual there was very little difference between the two. We start with the ‘Advanced’ edition purely for alphabetical reasons, and note that the 69MB installer is significantly larger than that of the ‘Standard’ version. The installation process is rapid and simple, with no reboot required, leading into a set-up wizard which gives options on settings, the use of ‘in-the-cloud’ resources, and providing feedback.

The interface is clean and clear and seems to use much nicer fonts than the previous versions tested. Navigation is simple and options are good, although translation remains a little clunky and hard to follow in places. Running through the test presented few problems, with some slowish speeds on demand, notably in the archive sets where many compression systems are unpacked in some depth, but file access lag times were light and system resource usage not too heavy either. Initial runs through the test sets seemed to show that logging is capped at a certain size or length, but no information or options were found regarding this, and so testing was split into chunks to ensure complete information.

Detection scores were pretty low in the trojans and RAP sets, with only the set of worms and bots producing a respectable set of figures, but the clean sets were handled well. Stability was rock-solid throughout the tests, even under heavy stress and over samples which caused serious problems for many products this month. All looked well until we spotted a single undetected item in the WildList set: one sample out of 2,500 replications of the latest W32/Virut strain spoiled Kingsoft’s chances of reclaiming its award, despite a tester-friendly, if not overly impressive, showing.

ItW: 99.9999%
ItW (o/a): 99.9999%
Trojans: 28.48%
Worms & bots: 63.24%
Polymorphic: 62.79%
False positives: 0

Kingsoft Internet Security 2011 Standard Edition 2008.11.6.63

Additional version information: Engine version 2009.02.05.15, data stream 2007.03.29.18, virus definitions 2010.10.24.01.

As mentioned above, the ‘Standard’ edition of Kingsoft’s product is pretty much identical to the ‘Advanced’ product on the surface, but we noted the far smaller 51MB installer, and also the updates included, which appeared to be several days older than those of the ‘Advanced’ product. The installation process and user experience in general were light, fast, simple and clear, and stability was again rock-solid throughout all tests, allowing us to get both products done in the same 24-hour period, on adjacent test machines. Scanning speeds were pretty similar, but for this version access times were a little lighter, and resource consumption a fraction heavier.

Detection rates were again disappointing – notably lower than the ‘Advanced’ edition, with the older updates doubtless contributing. Again, the clean sets were handled without problems, but again that single Virut sample in the WildList set put paid to any hopes of a VB100 award for the product.

ItW: 99.9999%
ItW (o/a): 99.9999%
Trojans: 8.30%
Worms & bots: 53.35%
Polymorphic: 62.64%
False positives: 0

Lavasoft AdAware Professional 8.3.4

Additional version information: N/A.

Lavasoft returned to the test bench this month hoping for a repeat of its performance in this summer’s Vista test, in which it achieved its first VB100 award (see VB, August 2010, p.21). The product looks much the same, the 128MB installer doing its business in good time, offering to install the Google Chrome browser for more secure web browsing, and rebooting to finish off the process. A friendly, colourful interface is presented, with large, clear icons for the various sections. An ‘advanced’ version is available for those seeking finer controls, but this offers little real configuration of the kind required for our testing, and most jobs were done with the settings entirely unchanged.

This made for some acceptable scanning speeds on demand and excellent speeds on access, with resource consumption perhaps a little above average, but the infected sets proved far more of a struggle. On-demand jobs were long and slow and had to be repeated several times after seizing up or stopping altogether, while on-access measures of the infected sets would run for days, leaving the test system unstable and behaving very oddly.

Eventually, after well over a full week’s testing time, running on several machines at once by the end of the month and the last to finish by some way, we managed to get what looked like a full set of results – showing the solid scores we would expect from the Sunbelt engine that does most of the heavy lifting here. In the on-access measures, we noted a handful of items not being blocked, and thought perhaps there was some asynchronous unpacking or emulation of complex files going on, as observed in previous tests. However, in this case after numerous efforts to persuade the product to spot them we could see no sign of detection in any of the product’s highly ephemeral logs, nor any indication of action to remove them when written to the system folder, and we had to assume no detection. Thus, despite decent scores elsewhere and no issues in the clean sets, Lavasoft is not awarded VB100 certification for its standard product.

ItW: 100.00%
ItW (o/a): 99.19%
Trojans: 95.54%
Worms & bots: 98.93%
Polymorphic: 79.30%
False positives: 0

Lavasoft AdAware Total Security 21.1.0.28

Additional version information: Update 10/25/2010.

Lavasoft’s second entry this month is a whole different kettle of fish. Based on the G DATA product with some additional detection skills from Lavasoft’s in-house team, it came in at a hefty 418MB in total, including updates. The multi-stage installation process took a couple of minutes to get through.

The interface itself is very similar to that of G DATA’s solution, with a little rebranding, looking very crisp and efficient with its detailed status information on the front page and superb configuration settings which are easily accessible. Scanning speeds benefited from some smart caching of results both on demand and on access, and while CPU cycle usage was a little on the high side, RAM drain was fairly low.

The product powered through the tests with no sign of stability issues, effortlessly brushing aside the sample sets. Scores – once yanked out of the slightly fiddly logs – were really quite stupendous, with barely anything missed in the standard sets and some excellent scores across the RAP weeks. The WildList was demolished in short order, and all looked to be going perfectly until a single item in the clean sets – a popular media player which was downloaded a quarter of a million times in the previous week from a single major download site – was alerted on as the Viking worm, which it clearly wasn’t. Thus a small faux pas scuppered Lavasoft Total’s chances of VB100 certification this month, undermining what would otherwise have been one of the most impressive performances of the month.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 99.98%
Worms & bots: 99.96%
Polymorphic: 100.00%
False positives: 1

McAfee VirusScan Enterprise 8.7i

Additional version information: Scan engine version 5400.1158, DAT version 6149.000, DAT created on 27 October 2010.

McAfee’s business product has been another long-term high achiever in our tests, regularly praised in these pages for its no-nonsense approach and simple usability. The company has missed a few tests recently, and had some problems with complex polymorphic file infectors a few months ago, and after considerable work assisting diagnosis we were hopeful of a change in fortunes this month.

The product arrived as a 27MB installation bundle, with an additional 13MB of patches and 79MB of updates, all in easily applied executable formats. It ran through its set-up fairly quickly and easily – the most interesting moment being the offer of ‘standard’ or ‘maximum’ protection. At the end it announced that, while a reboot was not strictly required right away, it would be needed for some components to operate fully, so we restarted immediately.

The interface, which requires a response to a UAC prompt each time it is opened, remains its austere, businesslike self, with no unnecessary glitz or clutter. Controls are well designed and simple to operate, and full configuration is available in all areas. On-demand speeds were good with the defaults, and not bad with the settings turned up to full, and while on-access scanning times were perhaps a shade above average, RAM use was low and CPU use in busy periods not excessive either.

The detection tests (which do not measure the extra protection provided by the product’s cloud-based Artemis system) ran smoothly, and logging was clear and reliable. The only problem we observed – which caused us to re-run some of our on-access tests – was one we have commented on in these pages before, but which seemed more pronounced this month: when the on-access settings are changed, there is a noticeable period when the protection seems to go down and restart. We observed this in the main on-access test: having noticed the informative pop-up busily reporting numerous detections and worrying that it might hinder progress, we set the notification option to off; on checking the logs of our opener tool, we saw that several hundred samples (which we knew the product should detect) were not blocked during this period, implying that protection had been off for a good 10–20 seconds. This is unlikely to be a major problem, as most people will not be regularly tweaking their settings and it would be pretty unlikely for anything to penetrate a system during one of these brief spells, but it is still a little worrying.

That aside, we gathered a full set of results in under the allotted 24 hours. We saw some solid scores in the standard sets and decent rates in the RAP sets too – even without the benefit of the cloud resources intended to bolster protection against the latest threats. The clean set was handled smoothly, but in the WildList set a single sample of W32/Virut went undetected. Generating several thousand more samples to provide to the developers turned up no further misses, so it was clear that this was a most unlikely combination of circumstances – but it was still enough to deny McAfee a VB100 award once again.

ItW: 99.9999%
ItW (o/a): 99.9999%
Trojans: 81.51%
Worms & bots: 94.59%
Polymorphic: 100.00%
False positives: 0

Microsafe Avira Premium Security Suite 10.0.0.60

Additional version information: scan engine (‘motor de análisis’) 8.02.04.86, virus signature file (‘fichero de firmas de virus’) 7.10.13.44.

Eagle-eyed readers will have observed that, while Microsafe is a new name in our roster, the product we’re looking at here is well known to us, albeit in a different language. Microsafe provides a rebranded version of Avira’s highly regarded suite, translated into Spanish and Portuguese, along with some extras of its own – including the rare offer of insurance against malware getting past the product.

The Spanish version of the product came in at around 58MB with 45MB of updates, and was fairly simple to set up despite the language not being one of our lab team’s many specialities. The interface was simple to operate, in part thanks to familiarity, and in part due to its simplicity and well-ordered, fairly intuitive design.

Tests zoomed through in excellent time, completing the same working day they started, with zippy on-demand times and below average overheads. Detection rates were superb, with some excellent scores in all sets and a particularly strong showing in the RAP sets. With no issues in the core certification sets Microsafe earns a VB100 award on its first attempt.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 99.13%
Worms & bots: 99.82%
Polymorphic: 100.00%
False positives: 0

Microsoft Security Essentials 1.0.2498.0

Additional version information: Anti-malware client version: 2.1.6805.0, engine version 1.1.6301.0, anti-virus definitions 1.93.441.0, anti-spyware definitions 1.93.441.0.

Microsoft’s free-for-home use consumer package is another regular fixture in our desktop comparatives. The product is relatively small, with the main program weighing in at only 8MB and updates an additional 57MB. The set-up process is pretty simple and fast, with only two or three clicks of the ‘next’ button required and the whole job done in under a minute, with no reboot needed. The GUI is similarly simple and unfussy, with a basic set of configuration options presented in a wordy, but fairly understandable manner.

Running the first few parts of the test suite didn’t take too long. On-demand speeds were on the low side, but on-access overheads were reasonable at first and quickly sped up once the solution had settled in. Resource use was very light, with CPU use barely registering. The clean set was handled fine, and again at reasonable speed.

On hitting the infected sets, things began to slow down a little. Knowing the product’s reputation for thoroughness from previous tests, we left it to run over a weekend, the whole of which was required to get through the full on-demand jobs. The on-access scans also took several days to complete. At one point the product seemed to have ground to a complete halt – so we gave the machine a reboot and restarted from where we had left off – but eventually we managed to gather a full set of results. Of course, this issue would only affect the most insanely badly infected of users in the real world.

In the final reckoning, the standard sets were dealt with excellently, and some very decent scores were recorded in the RAP sets too. With the WildList also handled nicely, Microsoft easily earns a VB100 award.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 91.88%
Worms & bots: 98.56%
Polymorphic: 99.85%
False positives: 0

MKS MKS_vir 10 b151

Additional version information: 16.0 b147.

It may be a new name in these pages, but MKS has been around for some time. Indeed, the company has submitted its product for previous tests, but on those occasions we were unable to get things operating well enough to include any results. Hoping for better things this time, we ran the 79MB installer, which welcomed us with a friendly cartoon worm character, ran through a few steps and installed at good speed. No reboot was requested, but the interface that appeared defaulted to Polish, despite the installer having been in English, and it took us a few moments of burrowing through the GUI to find the settings to change it back. The GUI made it clear that this was a beta version, which may explain these small glitches, as well as some of the bigger ones to come.

Some initial tests ran into problems fairly quickly, when the product crashed while trying to display the logs of our archive test. After a reboot, we tried running some on-access tests but could get no response. On finally finding the on-access controls – buried in the system tray menu but nowhere to be seen in the main interface – we found on-access scanning was off, and trying to switch it on brought up a never-ending progress bar. After reinstalling several times, on several different systems, we repeatedly hit the same wall, and eventually gave up trying to achieve any on-access or performance results.

Having gone this far, it seemed worth our while continuing as far as we could with on-demand results, and scanning speeds were fairly reasonable. Running over the infected sets proved a little more tricky, with scans repeatedly stopping at random, clearly not having covered all the areas requested, but by dint of repeated and arduous running and re-running of scans, we finally gathered a reasonably complete set of figures. These showed some rather weak scores in most areas. RAP scores were the most disappointing, and there were large numbers of false alarms in the clean set and several of the speed sets. The majority of these were from a handful of threat IDs, all Virut variants, implying that the heuristic rules for these particular signatures are a little on the loose side to say the least. The lack of on-access protection and the false positives mean that MKS still needs to do a fair bit of work to reach the required standard for VB100 certification.

ItW: 97.07%
ItW (o/a): n/a
Trojans: 25.16%
Worms & bots: 43.90%
Polymorphic: 57.46%
False positives: 2428

Nifty Corporation Security24 5.62

Additional version information: 3.0.1.1015.

Nifty has become a semi-regular participant in our comparatives over the last few years, and with the company’s solution based on the generally solid Kaspersky engine, it has usually done pretty well. Aimed exclusively at the Japanese market with no translated version available, testing Security24 is always a bit of an adventure, and one we generally look forward to with equal measures of excitement and trepidation.

The installer is surprisingly small at only 83MB, and runs fairly slowly, with most of the stages requiring a blind, hopeful click of the ‘next’ button (while some of the messages are readable, others seem to rely on characters provided by the operating system, which in our case were not available, resulting in mangled gibberish). When finally done, a reboot is initiated, and on restart we got to see the unusual, but not unattractive, interface, and to note that a browser toolbar of some complexity had also been installed. Not much can be said about the configuration options, as most were impossible to decipher, but there do seem to be a few fine-tuning controls.

Running the on-demand tests was quick and painless, with good speed-ups in the warm measures, and on-access speeds were light in the executables and slightly slower in media and other file types; resource consumption seemed universally low. The infected sets were something of a monster chore, with the expected slowness (niftiness not being Nifty’s strong point) worse than usual, exacerbated by the issues observed with the engine this month. Several scans which had run at the speed of a geriatric snail for days on end finally came to a complete halt on a selection of files in the RAP sets, and a reboot of the system was required to allow us to restart scans. In the end we resorted to removing chunks of the sets to ensure we could gather as much data as possible in the given time, as well as running on several machines at once. Eventually, after close to 10 machine-days and with the deadline for this report already upon us, we got everything we needed. We found some solid scores in the standard sets, as expected, with some decent scores in the RAP sets too, tailing off somewhat more in the later weeks than other products with the same engine – most likely due to an earlier submission with slightly older updates.

The core test sets presented no difficulties, and after a lengthy struggle Nifty earns another VB100 award.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 92.50%
Worms & bots: 97.96%
Polymorphic: 100.00%
False positives: 0

Norman Security Suite 8.00

Additional version information: Scanner engine version 6.06.10, last updated 2010/10/25 12:50; anti-virus version 8.00, last updated 2010/10/14.

Norman’s suite edition arrived as a 94MB package with updates included, and had a fast and simple installation process. As the installer had warned, a reboot was requested to complete things a short while after the process seemed to have finished. The interface is a little quirky, occasionally opening in ‘blurry’ mode as it prepares for action, and at times appearing a little flaky – several times we were informed that anti-malware components including the on-demand scanner were not installed, and links to the appropriate sections had fallen off the GUI entirely, but protection appeared to remain active. The GUI is also a little baffling in places, and we couldn’t figure out how to run on-demand scans from there at all, although quite complex jobs can be set up using the task editor section – apparently these are for scheduled operation only. Thus most on-demand tests were run from the context menu option.

The main speed tests took an age, thanks to the sandboxing and unpacking of many archive types to extreme depth. On-access overheads were pretty hefty too, as was CPU use, although memory consumption was not much above average. Opting to run the main scans over a weekend, we were disappointed to find, come Monday morning, that the scanner had spent most of the intervening couple of days waiting for a decision as to what to do about a ‘Commercial’ item found in the clean sets, delaying continuation until we returned. This was a little frustrating: many users would expect scheduled jobs to run unattended and report back, rather than waiting for the user to decide what to do – especially if the settings had been set to merely log detections. This setting seemed not to work in other ways either, with samples deleted and disinfected despite explicit instructions not to do so.

Eventually, after another few days of waiting for the scans to complete, a full set of results was acquired with no evidence of instability under pressure. Scores were reasonable in the main sets, and a little low in the RAPs, with considerable fluctuations from week to week. Two items were marked as suspicious in the clean sets, but there were no full-blown false positives, and the WildList was covered completely, thus earning Norman a second VB100 award in a row after a previous spell of bad luck.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 69.86%
Worms & bots: 90.36%
Polymorphic: 85.40%
False positives: 0

Optenet Security Suite v.10.09.69

Additional version information: Build 3304, last update 27 October 2010.

Yet another new name on our lists, Optenet produces a pretty comprehensive suite solution covering all the major bases of firewall, anti-spam, anti-phishing, web filtering and anti-malware, with the latter component provided courtesy of the Kaspersky engine. The installer weighed in at 94MB and ran through in a fair number of steps, which included setting a password to protect the settings and providing an email address in case the password is forgotten. At the end, Windows presented a dialog suggesting that the product might not have installed correctly, but it seemed to be running fine after the required reboot.

The browser-based interface is fairly well designed and clear, with a few quirks of language to become accustomed to and the occasional annoyance as the login session expires. Scanning speeds were not bad given the depth of analysis going on, and lag times and RAM use were fairly low, although CPU use was a little on the high side. Running through the test sets hit a couple of snags on nasty files, as expected, but not as many as some other products this month. In the end a good set of results was obtained without too much difficulty, with all testing just fitting into the hoped-for 24-hour period. Scores were splendid in the main sets, and not bad in the RAP sets either. A clear run through the WildList and clean sets made for an impressive showing all round, and easily earns Optenet VB100 certification on its first attempt.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 82.53%
Worms & bots: 96.06%
Polymorphic: 100.00%
False positives: 0

PC Booster AV Booster 1.1.21

Additional version information: Definitions version 12.70.6, definitions date 26/10/2010.

Observant readers who have made their way this far through the report may recognize the version information here – yes, yet another from the cluster of clone products based on the VirusBuster engine and SDK. PC Booster, as the name suggests, provides a range of optimization and tune-up utilities, and has recently decided to add anti-malware to the stable too. The solution arrived as the familiar 81MB installer, and ran through the standard steps, with no reboot required, to present us with the familiar interface – this time with a crisp and slightly more angular look than some of the others.

With the GUI design and layout now more than familiar, working with its simple and sensible set-up was smooth and trouble-free. We ran through the tests in good time, once again taking just an afternoon and an overnight run to complete the set.

Results were much as expected, with average on-demand speeds and overheads, resource usage on the low side, and detection rates generally pretty respectable. Once again, however, that handful of WildList samples went undetected on access, and PC Booster is denied a VB100 award by a stroke of bad luck.

ItW: 100.00%
ItW (o/a): 97.72%
Trojans: 81.60%
Worms & bots: 93.38%
Polymorphic: 94.48%
False positives: 0

PC Tools Internet Security 2011 8.0.0.608

Additional version information: Database version 6.16180.

PC Tools is another regular in our desktop platform reviews, and as usual both the full suite and Spyware Doctor products were provided for us to look at. The suite came as a 186MB package and took some time to install, running through the standard steps rapidly but then trundling quietly away for a few minutes before reappearing to ask if we trusted our local network, then going back to work for another minute or so and finally completing.

The shiny blue interface has remained fairly unchanged over the last few years, with its large buttons and information on the main screen, and controls for the scanner, multiple guard types, firewall and anti-spam components buried underneath. Not much configuration is provided, and some of it is a little confusing, but it’s generally fairly easy to operate. One unusual feature which we always have to remember when testing this product is that details of scan results are only kept locally if there is no network connection, so when running big scans we have to disconnect from the lab’s internal systems.

The speed tests were not superb to begin with, but improved massively on second and subsequent runs, and while lag times were not too excessive, RAM use was notably high. No issues were noted with stability however, and testing took no longer than expected – in the end producing a very creditable set of scores in the standard sets and a fairly decent showing in the RAP sets. The WildList and clean sets presented no problems, and PC Tools earns a VB100 award quite comfortably.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 82.78%
Worms & bots: 93.73%
Polymorphic: 100.00%
False positives: 0

PC Tools Spyware Doctor 8.0.0.608

Additional version information: Database version 6.16180.

With identical version details, and a pretty similar-looking interface, Spyware Doctor is essentially the PC Tools suite minus the firewall and anti-spam components. Even the installer is only 1MB smaller than its stable mate.

The set-up process was again not the fastest, although no reboot was needed, and scanning speeds were slow to start off with but much quicker in the warm runs. On-access speeds seemed much quicker though, making us wonder if perhaps slightly different settings were used which prevented our test scripts from operating as normal. Memory and CPU usage were both along similar lines to the suite product, but slightly lower in each case.

Testing proceeded without incident, completing in a day and a night, and showed the same sort of scores – solid in the standard sets and not bad in the RAP sets. With no problems in the clean or WildList sets PC Tools earns a second certification this month.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 82.80%
Worms & bots: 93.73%
Polymorphic: 100.00%
False positives: 0

Preventon AntiVirus 4.3.21

Additional version information: Definitions version 12.70.6, definitions date 26/10/2010.

Preventon is the original developer of the VirusBuster-based OEM solution on which so many of this month’s entries are in turn built. Unsurprisingly, we found the set-up and usage similar to our experiences with all the others. Things are kept simple and run smoothly, with good stability, reasonable speeds and decent scores.

However, once again there were some small problems in the WildList set. Having experienced similar issues with the same product in previous tests, we carried out some hurried investigations, eventually concluding that the issue lay in the way the builds were put together for testing, and that these problems would not affect real-world users. Even with updates carried out online this month, though, we could not persuade the detection to work on access (although it remained fully functional on demand), and we had no choice but to deny Preventon VB100 certification this month.

ItW: 100.00%
ItW (o/a): 97.72%
Trojans: 81.60%
Worms & bots: 94.48%
Polymorphic: 90.51%
False positives: 0

Qihoo 360 Antivirus 1.1.0.1313

Additional version information: Signature date 2010-10-24.

Qihoo’s solution is based on the BitDefender engine, and its installer comes in at 105MB. It runs through fairly quickly, with no reboot needed, and on presenting its interface offers the opportunity to join a cloud scheme. The GUI is stylish and attractive, with some nice large buttons and plenty of good configuration options, lucidly presented, under the surface.

Scanning speeds were not too slow, and on-access lag times were extremely low, although we noted that the on-access module – as with several this month – does not properly intercept read operations, rendering these measures less than fully useful. Despite this, RAM and CPU use were not much below average during the test period. On-demand scans ran smoothly, producing some very decent scores in all sets, but the on-access measure proved a little more tricky: while all files read were actually checked, the product did not stop them being accessed, instead slowly providing pop-ups and logging detections a while later. In the end, the final sample spotted was not alerted on until more than a day after it had been opened. At least during this period some protection seemed to remain in place, and when set to delete or disinfect things were a little faster.

With the good scores extending to the WildList set, and no issues emerging in the clean sets either, Qihoo earns another VB100 award.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 99.58%
Worms & bots: 99.81%
Polymorphic: 100.00%
False positives: 0

Quick Heal Total Security 2011 12.00 (5.0.0.1)

Additional version information: Virus database 27 October 2010.

This is the first appearance on our test bench for Quick Heal’s 2011 edition, and an attractive beast it is too. The installer is on the large side at 178MB, but only takes a few steps and under a minute to run, with no reboot needed. The new GUI is in a pleasant combination of warm green and cool blue shades, with the currently fashionable large icons arrayed across the main screen representing the various functions. In this case they are divided into ‘Files and folders’ (the anti-malware component), ‘Emails’ (anti-spam and mail anti-malware), ‘Internet and Network’ (firewalling and anti-phishing) and ‘External Drives and Devices’ (covering the scanning of attached drives and protection against autorun attacks). The selection of components is thus reasonably complete, and presented in a clear and simple way. Beneath each section is a good array of controls, here much more closely resembling previous editions, with wordy, but fairly detailed and usable options dialogs. A fair number of additional tools are also included.

This clarity and completeness helped us get through the test in good time, with some fairly decent speeds on demand and not bad on-access lag times; resource consumption was a little higher than average in all categories. The detection tests ran through without incident, although logs were rather slow to display above a certain size, and all tests completed in good time. Results were not too bad, with some solid scores in the standard sets and RAP scores showing a steady drop across the weeks before rising again in the ‘week +1’ set, as several others did this month. The WildList and clean sets presented no problems though, and Quick Heal earns a VB100 award without undue difficulty.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 74.03%
Worms & bots: 93.54%
Polymorphic: 99.95%
False positives: 0

Returnil System Safe 2011 3.2.10878.5466

Additional version information: REL 8.

Returnil’s intriguing solution, with its virtualization speciality alongside the anti-malware protection provided by the F-PROT engine, comes as a small 35MB installer with just 23MB of updates. The installer is a little long and fiddly, but gets through in a reasonable time without problems, and finishes off with a reboot request. Scanning speeds were less than supersonic in most sets, and file access lags a little sluggish, but resource usage was perhaps a fraction below average. Getting through the infected sets took some time, and quite a lot of RAM was used up as the GUI keeps all its logging data on hand, but this is unlikely to affect the average user.

Final results were not bad, with good scores in the standard sets and decent, very dependable rates across the RAP sets. The clean sets threw up a number of packer and encrypted file warnings, but nothing serious, and with the WildList set handled without problems Returnil easily makes the grade for VB100 certification.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 75.36%
Worms & bots: 89.73%
Polymorphic: 100.00%
False positives: 0

Rising Internet Security 201022.71.02.03

Additional version information: N/A.

Rising’s product arrived as a 109MB package, which installed fairly speedily, warning about a temporary loss of network connectivity while it put its components in place. After the required reboot, a configuration wizard takes the user through a number of further set-up stages. We were sad to see that the ‘Rising assistant’, aka the dancing cartoon lion that usually adorns desktops, was not in evidence this month.

The interface is wordy and a little cluttered but reasonably simple to find one’s way around, and enabled fairly easy running of our tests. On-demand speeds were on the slow side, but not extremely so, and on-access lags were fairly hefty, but RAM use was fairly low and CPU use not too high either.

Detection rates were reasonable in the standard sets and fairly mediocre in the RAP sets, with considerable fluctuation from week to week. The clean set was handled well, but in the WildList set a number of items were not spotted, including a large swathe of rather old W32/Polip samples, and as a result no VB100 award can be granted this month.

ItW: 96.91%
ItW (o/a): 96.91%
Trojans: 51.35%
Worms & bots: 76.03%
Polymorphic: 73.93%
False positives: 0

Sophos Endpoint Security and Control 9.5.4

Additional version information: Detection engine 3.13.1, detection data 4.59G.

Sophos’s latest version includes a number of new features, which seem slowly to be filling an interface that once looked a little bare and deserted. While its appearance has none of the gloss and shine of some of the more decorative consumer products, the layout is rational and efficient, as befits its intended business market. The 67MB installer needed only 2.8MB of additional updates, making it one of the smaller solutions this month. The installation process – which includes the offer to remove third-party products – is all done in around a minute, with no reboot required, although we did restart anyway to ensure manual application of updates was completed properly.

Operation is simple and sensible, and with some decent scanning speeds the first tests were completed in good time. File access lags were on the heavy side – more so of course with the settings turned up – but RAM use was fairly low and CPU use extremely low. When running the detection tests last time around we found scans to be slower than we were used to – which it emerged was not due to the ‘live’ cloud-based checking added recently, but instead a result of additional checking of related areas when finding certain malware files. To disable this, we delved into the advanced settings area (into which we do not generally stray), and found an Aladdin’s cave of super-fine tuning controls, which the user is advised to adjust only on the instruction of a trained expert.

With some adjustments made here to suit our specialist requirements the tests ran through to completion in short order, and final processing showed the usual excellent levels in the main sets, with RAP scores a little down on expectations but highly consistent across the weeks. The WildList and clean sets presented no difficulties, other than an alert of potentially unwanted items in a popular game, and Sophos easily earns another VB100 award.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 94.03%
Worms & bots: 98.02%
Polymorphic: 100.00%
False positives: 0

SPAMfighter VIRUSfighter 7.100.11

Additional version information: Updated 26/10/2010 16:17:03.

SPAMfighter’s solution is the last of the swathe based on the same set-up, and in this case things are a little different, with the company’s own interface laid on top. This has caused a few problems in the past, but this month we saw a new-look solution which promised to fare better. The 82MB installer takes half a dozen steps to finish, plus an online activation, and no reboot. The new look is closer to other similar solutions but retains its own style, including the company’s army helmet logo. A sensible, if basic, range of options is provided, although in some places presentation is a little baffling – such as radio buttons labelled ‘turn on/off’ with no indication as to what state is being turned on/off.

Speeds were much as expected, as were overheads and resource consumption, and our well-practised testing procedures got the tests completed within 24 hours. Although the product GUI seemed to have died at some point during our large overnight scan, there was no sign of interruption and results seemed complete, showing a decent performance throughout until that handful of pesky WildList samples was missed on access, with the same minor bug once again denying SPAMfighter a VB100 award.

ItW: 100.00%
ItW (o/a): 97.72%
Trojans: 81.60%
Worms & bots: 94.48%
Polymorphic: 90.51%
False positives: 0

Sunbelt (now GFI) VIPRE 4.0.3904

Additional version information: Definitions version 7153 (27/10/2010 16:00:00), VIPRE engine version 3.9.2456.2.

VIPRE is another fairly slimline product, with the main package only 16MB, plus 64MB of updates. The installation is fairly rapid and unsurprising, but ends with a reboot, and afterwards a configuration wizard is presented, which offers a demo video at the end. The main interface is fairly clear and sensible, but does not provide a lot of detailed configuration.

Running through the initial tests proved fairly straightforward. Scanning speeds were very rapid over executable files, but slower than most in some areas – notably the set of media and document samples. On-access lags showed a similar pattern; RAM consumption was very low, although CPU use was on the high side at busy times.

Running through the clean sets took some time – almost a full 24 hours – but somewhat strangely, the infected sets ran through much more rapidly. A few re-runs were needed after some slightly odd results, and in the final reckoning the figures were pretty decent, with an excellent showing in the standard sets and very solid RAP scores. In the WildList set, the same handful of files that upset one of the Lavasoft solutions was not immediately blocked, but in this case it was clear that the asynchronous scanning was operating; while on-access logging is not kept for long enough to be of any use, we found that writing the samples to the C: partition saw them alerted on and deleted within a minute or so. Thus, with the clean sets handled fine and no problems elsewhere, VIPRE earns another VB100 award.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 95.05%
Worms & bots: 98.93%
Polymorphic: 79.30%
False positives: 0

Trustport Antivirus 2011 11.0.0.4565

Additional version information: BitDefender engine version 7.3444, updated 27/10/2010 07:52:12; AVG engine version 1.7.9856, updated 27/10/2010 07:34:00.

Trustport has become a regular and reliable participant in our tests, routinely achieving some splendid scores, and we were looking forward to getting our hands on the 2011 version in the lab. The 174MB installer doesn’t take too long to install but spends some time at the end ‘integrating’ with the system. After a reboot it runs a brief set-up wizard, which mainly concerns itself with licensing issues. The interface has a friendly, colourful main screen, filled with status information, but we mainly worked from an ‘Expert settings’ screen which more closely resembled previous incarnations of the product. This provided ample controls in a clear fashion.

Running the tests proved reasonably simple, with on-demand speeds and on-access overheads on the slow side, and fairly high use of RAM – as may be expected from a dual-engine solution – but CPU use was not exceptional. Stability was fine throughout, and all tests completed in a little more than the standard day.

Final figures were as excellent as ever – close to perfect in the standard sets and splendid in the RAP sets too, slowly creeping down as the samples grew fresher but never less than impressive. The core certification requirements were easily met, and Trustport comfortably earns a VB100 award with another excellent performance.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 99.44%
Worms & bots: 99.96%
Polymorphic: 100.00%
False positives: 0

VirusBuster VirusBuster Professional 6.3.14

Additional version information: Virus scan engine 5.1.1, virus database 12.70.8 27/10/2010.

VirusBuster’s engine has dominated this month’s test thanks to the numerous solutions making use of it, and even the interface has already made an appearance in the shape of the rebadged and recoloured Vexira. However, the original remains little changed after several years of regular and reliable appearances on the test bench, these days coming in at 69MB for the main package and 81MB for updates. The installer is fairly sluggish, though it requires little interaction, and the on-access scanner is clearly fully operational long before the install is finished, so no reboot is required here. The interface is awkward and quirky, but somehow loveable, and with the benefit of familiarity does provide a pretty thorough range of controls.

Scanning speeds were not the fastest, but were at least consistent, while on-access lags were a little higher than some but resource usage fairly low. Stability was rock-solid, as usual, in all tests, and the entire suite of tests was completed as planned within 24 hours. Results were much as expected, with good scores in the standard sets and a decent showing in the RAPs, and with no issues in the certification sets another VB100 award is duly granted to VirusBuster.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 80.55%
Worms & bots: 94.07%
Polymorphic: 90.52%
False positives: 0

Webroot Internet Security Complete 7.0.5.210

Additional version information: Security definitions version 1811, virus engine version 3.12.1.

Webroot has produced a shiny new version of its product, complete with new title, which arrived as a fairly large 265MB zip file containing all the required updates as well as the main product. Installation was quite a slow process but not too arduous, and required a reboot followed by some set-up stages to complete.

The new interface is fairly pretty, going for the modish row-of-large-icons look, and includes a number of components including system clean-up tools, privacy protection and a ‘sync and sharing’ section as well as the security area. The lab team found it rather difficult to navigate the controls, of which few are provided. Fortunately, the developers provided some extra controls for our specialist requirements, and we soon got things moving along.

On-demand scanning speeds were slow to start with but very quick in the warm scans, while on-access overheads and resource requirements were low from the off. Getting through the detection tests took an interminable amount of time – more than three full days for the main scans (fortunately we had anticipated this from past experience and set the job to run over a weekend) and not much less for the on-access test. With the Sophos engine providing much of the detection we saw some solid scores in the standard sets and a decent showing in the RAP sets, and with no problems in the core sets a VB100 award is granted to Webroot.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 94.58%
Worms & bots: 98.13%
Polymorphic: 100.00%
False positives: 0

ZeoBIT PCKeeper 1.1.49.3149

Additional version information: Engine version 8.2.4.84, virus database version 7.10.13.49.

Rounding off this epic report is yet another new face, albeit one with a well-known personality. ZeoBIT provides a pair of solutions, MacKeeper and PCKeeper, which aim to combine a wide range of useful utilities in a single package. The product arrived at the last minute for this test, but news that its anti-malware component is based on the Avira engine made us hopeful of another speedy and simple test run.

We were provided with a small downloader file which proceeded to fetch the main product from the Internet, and after a few initial steps spent some 25 minutes doing so. Once it was done we needed to apply a licence key to go any further, but oddly the web page where this could be done repeatedly crashed the IE8 browser on our test system. Eventually, we resorted to installing Firefox, where no further problems were found. No reboot was requested at any stage, but as the initial set-up was done on the deadline day and we needed to image the system and wait a while before running tests, it got one anyway.

The interface is quite attractive, again favouring the row-of-icons look, but also including some status information along the bottom. It has a slick, shiny appearance. Alongside the anti-virus module are sections labelled ‘Utilities’, ‘Services’ and ‘Settings’, and these provide a wealth of additional tools – online backup, disk checking and defragmentation, secure data shredding, software uninstallation, deduplication and recovery tools barely scratching the surface of the long list of handy items bundled here. Operation proved fairly simple, with a decent level of control made available in what seemed to be a lucid and sensible manner.

Some initial problems with logging were quickly solved by the responsive developers, and some issues with the on-access measures were just as speedily diagnosed as being due to a misunderstanding of the settings. An option marked ‘ignore’, which in most cases would make a real-time scanner simply deny access and record an infection, actually denied access on the first visit only, then permanently ‘ignored’ or whitelisted the file in question. Setting to delete instead proved far more effective, and the product powered through the sets at remarkable speed, getting through all the jobs well within the scheduled period despite some initial issues.

Detection results proved as excellent as expected, with barely a thing missed, and with a good show in the core certification areas a VB100 award is duly earned, making for a happy end to this month’s comparative.

ItW: 100.00%
ItW (o/a): 100.00%
Trojans: 99.28%
Worms & bots: 99.81%
Polymorphic: 100.00%
False positives: 0

Results tables

Conclusions

Praise be for the English winter. Had we run this test surrounded by blue skies, sultry evenings and smiling people in skimpy clothing, it would surely have been an extremely unpleasant chore. As it was, in the season of no warmth, no light, no joy, November (to paraphrase the poet), we didn’t really mind too much being shut away in a cramped and rather cluttered test lab, heated far beyond the required temperature by roaring test machines working their little hearts out, for many long days and longer evenings.

What we did find rather upsetting was the level of unreliability, untrustworthiness and downright flakiness seen in this month’s clutch of products. On more than one occasion one team member was heard to comment: ‘surely [company name removed out of kindness] must have QA processes’, only to draw a mournful sigh and an ‘I’m not sure that’s such a simple assumption’ from a sympathetic colleague.

Windows 7 is far from new, and certainly not an obscure platform. As the most current offering from the world’s largest provider of operating systems, developers ought to be paying considerable attention to the way their products run on it, and ensuring that the (presumably fairly large) proportion of their customers who use the platform have access to solutions with a reasonable chance of making it through a day or two without crashing, hanging, freezing, behaving in a bizarre and unpredictable manner or just generally freaking out. Apparently, however, this is too much to expect from some of the allegedly professional developers of so-called security solutions.

Perhaps our draining month’s work has left me judgemental, not to say tetchy. There were, of course, some good things noted this month. A fair proportion of products did make it through the fairly stressful experience of our in-depth suite of tests with dignity and honour intact. To those we owe our greatest respect and gratitude. In general, these were the best performers in terms of detection rates too, indicating that, for the most part, quality will out. Of course a few sterling detectors suffered instabilities, while a few of the most stable and reliable achieved mediocre scores at best. We also saw some of the most trustworthy and test-friendly products denied certification by the narrowest of margins, with relatively minor false alarms causing problems for a few. For others, there were some much more glaring problems with false alarms on major and common items, which would surely have caused some serious issues for their clientele. Once again, the WildList – derided by some as too small, too simple and too widely available to present any challenge for solution developers – has wrought havoc, upsetting many otherwise reasonable performances even from some of the most well-known of vendors.

Alongside this month’s comparative we have run some interesting experimental tests, currently purely for our own consumption but many of which we hope to see emerging fully fledged in upcoming months. Given this month’s experiences, it seems even more important to provide clear insight into the stability and reliability of the solutions under test, and perhaps some kind of simple table giving an indication of the numbers of crashes and other serious errors encountered in the process of testing is in order.

Having spent much of the month surviving mainly by looking forward to the next test – on a Linux platform and thus orders of magnitude smaller than this month’s epic – we will use the downtime to work on these expansions and improvements for the benefit of our readers. Any comments, queries or suggestions are, as always, most welcome.

Technical details

Test environment. All products were tested on identical machines with AMD Phenom II X2 550 processors, 4GB RAM, dual 80GB and 1TB hard drives, running Microsoft Windows 7 Professional, 32-bit edition.

Any developers interested in submitting products for VB's comparative reviews should contact [email protected]. The current schedule for the publication of VB comparative reviews can be found at http://www.virusbtn.com/vb100/about/schedule.xml.

Appendix – test methodology

The following is a brief précis of how our tests are conducted. More detail is available at http://www.virusbtn.com/vb100/about/100procedure.xml.

Core goals

The purpose of the VB100 comparative is to provide insight into the relative performance of the solutions taking part in our tests, covering as wide a range of areas as possible within the limitations of time and available resources. The results of our tests should not be taken as a definitive indicator of the potential of any product reviewed, as all solutions may contain additional features not covered by our tests, and may offer more or less protection depending on how they are configured and operated in a specific setting.

VB100 certification is designed to be an indicator of general quality and should be monitored over a period of time. Achieving certification in a single comparative can only show that the solution in question has met the certification requirements in that specific test. A pattern of regular certification and few or no failed attempts should be understood to indicate that the solution’s developers have strong quality control processes and strong ties to industry-wide sample sharing initiatives – ensuring constant access to and coverage of the most prevalent threats.

Alongside the pass/fail data, we recommend taking into account the additional information provided in each report, and also suggest consultation of other reputable independent testing and certification organizations, links to many of which can be found on the Virus Bulletin website at http://www.virusbtn.com/resources/links/index?test.

Malware detection measures

In all cases, details of malware detection rates recorded in this report cover only static detection of inactive malware present on the hard drive of the test system, not active infections or infection vectors.

For on-demand tests, products are directed to scan sample sets using the standard on-demand scan from the product interface. Where no option to scan a single folder is provided a context-menu or ‘right-click’ scan is used; if this is not possible either, any available command-line scanning tool is used as a last resort.

In all cases the default settings are used, with the exception of automatic cleaning/quarantining/removal, which is disabled where possible, and logging options, which are adjusted where applicable to ensure the full details of scan results are kept for later processing.

In on-access measures, sample sets are accessed using bespoke tools which prompt products with on-read protection capabilities to check, and where necessary block access to, malicious files. Again, automatic cleaning and removal is disabled where possible. In solutions which provide on-write but not on-read detection, sample sets are copied from one partition of the test system to another, or written to the test system from a remote machine. In the case of solutions which offer on-read detection but default to other methods only, settings may be changed to enable on-read scanning of the malicious test sets to facilitate testing.
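By way of illustration, the following is a minimal Python sketch of the kind of bespoke file-opening tool described above, written under stated assumptions: it simply walks a sample set, attempts a partial read of each file, and treats an access-denied error as an on-access block and a vanished file as an automatic removal. These error-handling conventions are illustrative choices, not a description of the lab’s actual tooling.

    import os
    import sys

    def open_samples(root):
        # Attempt to read each file; resident on-read protection should be
        # triggered by the open/read, blocking or removing detected items.
        blocked, opened, removed = [], [], []
        for dirpath, _, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                try:
                    with open(path, 'rb') as handle:
                        handle.read(4096)  # a partial read suffices to trigger on-read scanners
                    opened.append(path)
                except FileNotFoundError:
                    removed.append(path)   # product deleted or quarantined the sample
                except OSError:
                    blocked.append(path)   # access denied: treated as a detection
        return blocked, opened, removed

    if __name__ == '__main__':
        blocked, opened, removed = open_samples(sys.argv[1])
        print('blocked: %d  opened: %d  removed: %d'
              % (len(blocked), len(opened), len(removed)))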

It is important in this setting to understand the difference between detection and protection. The results we report show only the core detection capabilities of traditional malware technology. Many of the products under test may include additional protective layers to supplement this, including but not limited to: firewalls, spam filters, web and email content filters and parental controls, software and device whitelisting, URL and file reputation filtering including online lookup systems, behavioural/dynamic monitoring, HIPS, integrity checking, sandboxing, virtualization systems, backup facilities, encryption tools, data leak prevention and vulnerability scanning. The additional protection offered by these diverse components is not measured in our tests. Users may also obtain more or less protection than we observe by adjusting product settings to fit their specific requirements.

Performance measures

The performance data included in our tests is intended as a guide only, and should not be taken as an indicator of the exact speeds and resource consumptions a user can expect to observe on their own systems. Much of the data is presented in the form of relative values compared to baselines recorded while performing identical activities on identical hardware, and is thus not appropriate for inferring specific performances in other settings; it should instead be used to provide insight into how products perform compared to other solutions available.

On-demand speed figures are provided as a simple throughput rate, calculated by measuring the length of time taken to scan a standard set of clean sample files using the standard on-demand scan from the product interface. The size of the sample set is divided by the time taken, to give a value in megabytes of data processed per second. On-access speeds are gathered by running a file-opening tool over the same sets; times are recorded by the tool and compared with the time taken to perform the same action on an unprotected system (these baselines are taken several times and an average baseline time is used for all calculations). The difference in the times is divided by the size of the sample set, to give the additional time taken to open the samples in seconds per megabyte of data.
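For clarity, the arithmetic behind both measures can be expressed in a few lines of Python; the sample figures below are invented purely for illustration.

    # On-demand throughput: set size divided by scan time (hypothetical values).
    set_size_mb = 2048.0       # size of the clean sample set in megabytes
    scan_seconds = 256.0       # time taken by the on-demand scan
    print('throughput: %.2f MB/s' % (set_size_mb / scan_seconds))  # 8.00 MB/s

    # On-access lag: extra opening time over baseline, per megabyte of data.
    baseline_seconds = 40.0    # average time to open the set on an unprotected system
    protected_seconds = 120.0  # time to open the set with the product installed
    lag = (protected_seconds - baseline_seconds) / set_size_mb
    print('on-access lag: %.4f s/MB' % lag)                        # 0.0391 s/MB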

Both on-demand and on-access measures are made with the default settings, with an initial ‘cold’ measure showing performance on first sight of the sample sets and ‘warm’ measures showing the average of several subsequent scans over the same sets. This indicates whether products are using smart caching techniques to avoid re-scanning items that have already been checked.

An additional run is performed with the settings adjusted, where possible, to include all types of files and to scan inside archive files. This is done to allow closer comparison between products with more or less thorough settings by default. The level of settings used by default and available is shown in the archive type table. These results are based on scanning and accessing a set of archives in which the Eicar test file is embedded at different depths. An uncompressed copy of the file is also included in the archives with its file extension changed to a random one not used by any executable file type, to show whether solutions rely on file extensions to determine whether or not to check them.
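The archive test files are straightforward to reproduce in outline. The Python sketch below nests the industry-standard EICAR test string inside zip archives at increasing depths, and writes an uncompressed copy under an arbitrary non-executable extension; the depths chosen and the ‘.qzx’ extension are illustrative assumptions (our own sets also cover other archive formats).

    import io
    import zipfile

    # The standard EICAR anti-malware test string.
    EICAR = (b'X5O!P%@AP[4\\PZX54(P^)7CC)7}$'
             b'EICAR-STANDARD-ANTIVIRUS-TEST-FILE!$H+H*')

    def nested_zip(depth):
        """Return zip archive bytes with eicar.com buried `depth` archives deep."""
        payload, name = EICAR, 'eicar.com'
        for level in range(1, depth + 1):
            buf = io.BytesIO()
            with zipfile.ZipFile(buf, 'w', zipfile.ZIP_DEFLATED) as archive:
                archive.writestr(name, payload)
            payload, name = buf.getvalue(), 'depth%d.zip' % level
        return payload

    for depth in (1, 2, 5, 10):    # hypothetical depths
        with open('eicar_depth%d.zip' % depth, 'wb') as out:
            out.write(nested_zip(depth))

    # Uncompressed copy with a random, non-executable extension, revealing
    # whether a scanner decides what to check purely by file extension.
    with open('eicar_renamed.qzx', 'wb') as out:
        out.write(EICAR)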

System resource usage figures are recorded using the Windows performance monitor tool. Levels of memory and CPU usage are recorded every five seconds during each of several tasks. The on-access speed test periods plus an additional on-access run over the system partition are used for the ‘heavy file access’ measures, and periods of inactivity for the ‘idle system’ measures. During all these measures the solution’s main interface, a single instance of Windows Explorer and a single command prompt window are open on the system, as well as any additional windows required by the testing tools. The results are compared with baseline figures obtained during the same baseline test runs used for the on-access speed calculations, to produce the final results showing the percentage increase in resource usage during the various activities covered.
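The final figures are simple ratios of averages. Assuming a handful of invented counter readings, the calculation would look like this:

    from statistics import mean

    # Hypothetical counter samples (e.g. % of RAM in use, polled every five seconds).
    protected = [41.2, 43.8, 45.1, 44.0]   # heavy file access, product installed
    baseline = [30.9, 31.4, 31.1, 31.0]    # same activity on the unprotected baseline

    increase = (mean(protected) / mean(baseline) - 1.0) * 100
    print('RAM use increase during heavy file access: %.1f%%' % increase)  # 40.0%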

Sample selection and validation

The sample sets for the speed tests are built by harvesting all available files from a selection of clean systems and dividing them into categories of file types, as described in the test results. They should thus represent a reasonable approximation of the ratios of different types of files on a normal system. The remaining portion of the false positive sample set is made up of a selection of items from a wide range of sources, including popular software download sites, the download areas of major software development houses, software included on pre-installed computers, and CDs and DVDs provided with hardware and magazines.

In all cases packages used in the clean sets are installed on test systems to check for obvious signs of malware infiltration, and false positives are confirmed by solution developers prior to publication wherever possible. Samples used are rated for significance in terms of user base, and any item adjudged too obscure or rare is discarded from the set. The set is also regularly cleaned of items considered too old to remain significant.

Samples used in the infected test set also come from a range of sources. The WildList samples used for the core certification set stem from the master samples maintained by the WildList Organization. These are validated in our own lab, and in the case of true viruses, only fresh replications generated by us are included in the test sets (rather than the original samples themselves). The polymorphic virus set includes a range of complex viruses, selected either for their current or recent prevalence or for their interest value as presenting particular difficulties in detection; again all samples are replicated and verified in our own lab.

For the other sets, including the RAP sets, any sample gathered by our labs in the appropriate time period and confirmed as malicious by us is considered fair game for inclusion. Sources include the sharing systems of malware labs and other testing bodies, independent organizations and corporations, and individual contributors as well as our own direct gathering systems. All samples are marked with the date on which they are first seen by our lab. The RAP collection period begins three weeks prior to the product submission deadline for each test, and runs until one week after that deadline; the deadline date itself is considered the last day of ‘week -1’.
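To make the collection window concrete, the sketch below lays out the RAP week boundaries relative to a hypothetical deadline, assuming the weeks run in consecutive seven-day blocks with ‘week -1’ ending on the deadline itself.

    from datetime import date, timedelta

    deadline = date(2010, 10, 27)   # hypothetical product submission deadline

    def rap_week(week):
        """Return the (start, end) dates of a RAP week; week -1 ends on the deadline."""
        if week < 0:
            end = deadline - timedelta(days=7 * (abs(week) - 1))
        else:   # week +1
            end = deadline + timedelta(days=7 * week)
        return end - timedelta(days=6), end

    for week in (-3, -2, -1, 1):
        start, end = rap_week(week)
        print('week %+d: %s to %s' % (week, start, end))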

The sets of trojans and ‘worms and bots’ are rebuilt for each test using samples gathered by our labs in the period from the closing of the previous RAP set until the start of the current one. An exception is made for the ‘worms and bots’ set, which also includes a number of samples that have appeared on WildLists in the past 18 months.

All samples are verified and classified in our own labs using both in-house and commercially available tools. To be included in our test sets all samples must satisfy our requirements for malicious behaviour; adware and other ‘grey’ items of potentially unwanted nature are excluded from both the malicious and clean sets as far as possible.

Reviews and comments

The product descriptions, test reports and conclusions included in the comparative review aim to be as accurate as possible to the experiences of the test team in running the tests. Of necessity, some degree of subjective opinion is included in these comments, and readers may find that their own feelings towards and opinions of certain aspects of the solutions tested differ from those of the lab test team. We recommend reading the comments, conclusions and additional information in full wherever possible, and congratulate those whose diligence has brought them this far.
