VB100 June 2008 - Ubuntu Linux 8.04LTS Server Edition

2008-06-01

John Hawes

Virus Bulletin
Editor: Helen Martin

Abstract

John Hawes dusts off his Linux skills for a comparative review of anti-malware products on the Ubuntu Server platform. Check out which of the 15 products were awarded VB100 status and which failed to make the grade.


Introduction

Once again the VB100 review rolls around to its annual visit to a Linux platform, and once again the same questions arise. The ever-growing hordes of Linux and open-source aficionados continue to revel in the relative impenetrability of their security model and in the scant attention paid to them by malware creators. Why, I am asked on what seems like a daily basis, would I want to run anti-virus on my Linux box? What have I to fear? A tiny handful of malware with puny penetration levels is surely a risk worth taking, runs the standard argument.

Of course, this is quite beside the point; while Linux as a desktop operating system continues to nurture its small, but generally keen and committed user base, it is when running on a server that it is really at home, continuing to dominate at gateways and scattered throughout corporate and academic networks. Fileserver systems, with their arrays of Windows clients storing and transferring all manner of things thanks to the delights of Samba, can be nasty breeding grounds for network-wide malware infestations if they are not properly protected. However, they can also be used as effective blockades, preventing malicious code from being passed to new targets. Even where desktops feature their own anti-malware systems, corporate policies often (rightly) insist on thorough regimes of protection with no system allowed to operate without malware scanning, no matter how secure it may seem. For this reason, the Linux VB100 generates a great deal of interest from our readers in the corporate world, who are not above demanding tests on far more esoteric platforms.

Ubuntu Linux is a relative newcomer to the scene compared to the likes of SUSE and Red Hat, the even more venerable Slackware and, of course, Debian, from which Ubuntu evolved. The distribution’s first release was in 2004, and since then it has seen massive growth, with strong financial backing through its links to Canonical Ltd and its entrepreneur founder Mark Shuttleworth, the first African to venture into space.

Ubuntu has shied away from the duality of commercial and free versions adopted by other big-money distros, and embraced the open-source philosophy wholeheartedly. Accompanied by numerous offshoot projects focusing on specific desktop systems and user groups, Ubuntu’s focus on friendly usability, stability and consistent updating has brought strong penetration of desktops – a poll held last summer found over 30% of respondents were using it, with its nearest rival, OpenSUSE, at 19% and Debian at 11%. At the server level fewer details are available, but the server edition seemed more appropriate to our purposes; of course this selection brought with it the likelihood of some compromises in usability, with any cuddly ease of use likely to have been stripped away in favour of efficiency, security and robustness.

The challenge of learning a new platform should, I hoped, be somewhat mitigated by the fairly small number of entries this month – a mere 15 products providing a relatively easy ride between the mammoth Vista test last time around and what is likely to be an even more gargantuan array of products in the XP test scheduled to take place later in the summer. Looking forward to simple command-line interfaces providing easy access to configuration options, I burned the install image, dusted off my rather neglected Linux skills, and ventured bravely into the lab.

Platform and test sets

Installation of Ubuntu was a pretty straightforward process, guided by a pleasant graphical setup routine. As usual in VB100 testing I tried to keep things as simple as possible, sticking to the default settings to get as close as possible to an out-of-the-box setup. Of course this policy couldn’t be applied perfectly, with some stages such as disk partitioning requiring specific tuning for my needs, but the final setup provided the basic Ubuntu fileserver. As expected, this didn’t include a desktop environment, so those products with attractive interfaces would not have their full range of offerings investigated, but the command-line style is generally preferred at the server level anyway, with a minimum of ‘magic’ going on to open potential security holes and drain resources.

Less expected was the absence of other useful items, including an NFS implementation, but a little investigation showed that a large range of extra goodies were available on the install CD. Beyond a few basic steps such as configuring networking, connecting to the lab servers and client systems and copying test samples to the local machines, very little further work was required before taking snapshot images and getting down to business. In the previous Linux test (see VB, April 2007, p.11) a copy of dazuko, the open-source file-hooking software used by many Linux products, was prepared on the test machines in advance, but in this case one of the submissions had thoughtfully included a pre-built binary so this step was not necessary (although I had little doubt that some compilation would eventually be required).
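
Installing any of the missing pieces from the disc is simple enough; the following is a minimal sketch of pulling in an NFS server from the install CD, assuming the CD-based apt source and the Ubuntu 8.04 package name (nfs-kernel-server) – details may differ on other setups.

    # register the install CD as a package source (prompts for the disc)
    sudo apt-cdrom add
    # install the NFS server components from the CD repository
    sudo apt-get update
    sudo apt-get install nfs-kernel-server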

Client systems were also prepared, using a standard Windows XP SP2 image with the Samba shares of the test servers mapped, with all on-access tests planned to be run from these. To avoid unfairness, network activity was kept to a minimum during the tests, which were run one at a time to further reduce the possibility of unequal treatment.

The test sets were aligned with the March 2008 WildList, which was released a few weeks prior to the test set deadline of 2 May and the product submission deadline of 5 May. The latest WildList included a fairly large number of new additions, but these were concentrated in a few families – most notably a large swathe of W32/OnlineGames trojans, showing further evolution of the WildList into the cybercrime-ridden modern world. Quite a few older items fell from the list, including several strains of W32/Mytob and W32/MyDoom, and also the veteran W32/Nimda, which finally dropped off the list after a marathon stint of officially being in the wild.

My attendance at numerous meetings and conferences this month hampered any efforts to expand the other test sets by more than a minimal amount, with only a handful of items added to the clean and infected sets and the meagre set of Linux samples dusted off. The most significant addition was the insertion of a set of Linux files into the clean set, to form an extra part of the speed measurements. This time the samples were taken from a separate system from those running the tests, one which had been in heavy use for some time and thus held a more eclectic range of items. The entire contents of /bin, /sbin, /etc and /opt were included, making the new set on a par with the others in terms of size on disk, but considerably ahead of them in the number of files it contained.
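
For anyone curious about the balance between those two measures, a quick sketch using standard tools gives the size on disk and the file count of the directories in question when run on the donor system:

    # total size on disk of the directories copied into the new clean set
    du -sch /bin /sbin /etc /opt
    # number of individual files those directories contain
    find /bin /sbin /etc /opt -type f | wc -l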

Finally, the standard archive test set consisted of the EICAR test file embedded in a selection of archive types at a range of depths. These would, as usual, be scanned with the default settings and with ‘all files’ and ‘scan archives’ settings enabled where possible, and detection at a depth of five or more levels on at least half of the set would be considered adequate for a product’s inclusion in the full archive graphs. With everything ready to go, it was time to see how the selection of products fared.
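
For reference, nested samples of this kind are easy to reproduce: the sketch below writes the standard EICAR test string to a file and wraps it in zip archives to a depth of five (the file and archive names here are arbitrary, and any of the other archive formats in the set could be substituted).

    # create the standard 68-byte EICAR test file
    printf '%s' 'X5O!P%@AP[4\PZX54(P^)7CC)7}$EICAR-STANDARD-ANTIVIRUS-TEST-FILE!$H+H*' > eicar.com
    # wrap it in successively deeper zip archives, depth 1 to 5
    zip depth1.zip eicar.com
    for i in 2 3 4 5; do
        zip "depth$i.zip" "depth$((i-1)).zip"
    done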

Results

Alwil avast! for Linux 3.1.0

Alwil’s product arrived as several archive files, which when unpacked were found to contain simple installer scripts which did all the work of setting things up very nicely. However, the on-access component was a little less straightforward – it needed some compilation and access to the dazuko sources, which caused a headache and required calls to the developers for advice as the various requisites were set up. The libdazuko library, not built by default by the standard setup process, was also needed, but when at last everything was in place testing proceeded without further incident. Configuration, both of the command-line scanner and the on-access components, operated in a straightforward and standard fashion, with ample documentation available to guide the novice user.
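
For those unfamiliar with the process, the dazuko build boils down to a standard configure-and-make against the running kernel; a rough sketch is shown below, assuming the kernel headers and build tools are already installed (the tarball name is a placeholder, and the extra step needed to produce libdazuko varies between dazuko releases, so it is not shown).

    # unpack the dazuko sources and build the kernel module
    tar xzf dazuko-2.x.x.tar.gz && cd dazuko-2.x.x    # version is a placeholder
    ./configure
    make
    # load the freshly built module into the running kernel
    sudo insmod dazuko.ko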

Scanning speeds were about what I should have expected, my hopes of seeing testing times cut drastically as a result of using a pared-down operating system having quickly been dashed. On-access scanning speeds in particular were somewhat slower than I had hoped, doubtless due in large part to the test being run across the network. Avast!’s detection rates were little changed from previous tests, although detection was added for the single file-infector on the WildList that was missed last time around, and the core set was covered without problems.

In the clean sets, however, a couple of items were mislabelled as malware, including one which has tripped up a series of products in the past year, and thus Alwil will have to wait a little longer before reclaiming its place on the VB100 podium.

ItW: 100.00%
ItW (o/a): 100.00%
Linux: 96.67%
Macro: %
Standard: %
Polymorphic: 87.13%

AVG Anti-Virus 7.5.51

AVG’s product was considerably simpler to install, coming as a single .deb installer package which set everything up in a few moments, the majority of which were spent entering a licence key. The product again uses the dazuko file-hooking system, but this time all the installer required was for the kernel module to be in place. The design conformed to Linux norms, with straightforward syntax for the command-line scanner and the configuration files for the on-access monitor. Guidance and information were also ample and properly implemented.
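
For a simple .deb package like this one, installation amounts to a single dpkg command; a sketch with an illustrative package filename follows (on a fully networked system, apt-get can then pull in anything dpkg reports as missing – a situation that became relevant for some of the later products).

    # install the vendor-supplied package (the filename here is illustrative)
    sudo dpkg -i avg-scanner_7.5.51_i386.deb
    # resolve any missing dependencies reported by dpkg, where a network is available
    sudo apt-get -f install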

Speeds were a little disappointing, even more so with scanning of all files and archives enabled, but detection rates were good, with no WildList samples missed and no false positives raised, and thus AVG earns a VB100 award.

ItW: 100.00%
ItW (o/a): 100.00%
Linux: 88.33%
Macro: %
Standard: %
Polymorphic: 73.89%

Avira AntiVir for Linux 2.1.12-31

Avira developed the dazuko system and continues to fund its maintenance and development. It is not surprising, therefore, that the company’s product is among those making use of the file-hooking software.

The submission came as an archive file containing an install script, as well as the pre-built dazuko module – I suspect this addition is not generally provided for customers, but many Linux distributions come with the binary package available. The installer offered the delights of centralized management systems and graphical interfaces, which I was forced to turn down, and again the design, settings and documentation were excellent.

Scanning speeds this time were a little more impressive, and once again detection rates were superb, with most of the missed items merely being the result of rare file types not being scanned with the default settings. With flawless coverage of the WildList and not a hint of a false positive, Avira easily qualifies for a VB100 award.

ItW: 100.00%
ItW (o/a): 100.00%
Linux: 66.67%
Macro: %
Standard: %
Polymorphic: 100.00%

BitDefender Security for Linux 3.0.0.80505

BitDefender’s product was another using the .deb format, this time with a built-in installation routine too. This caused some issues initially, as older versions of C++ libraries were required, which in turn required the installation of several other dependencies. Presumably on a fully networked system this would all have been handled by the package manager, reaching out to the web for any requirements.

Once over these hurdles things went very easily however, with a Samba VFS module used for the on-access component – this was the standard alternative to the dazuko system in the last Linux test and its operation proved simple, with a small change to the Samba configuration to point it at the new scanning object the only requirement.
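
For readers unfamiliar with the VFS approach, hooking a scanner in this way typically comes down to a single line in smb.conf naming the vendor’s scanning module for the share to be protected; the excerpt below is a sketch with an illustrative share path and module name, since each vendor supplies its own.

    # /etc/samba/smb.conf – excerpt for the share being protected
    [testshare]
        path = /srv/testshare
        # load the vendor's on-access scanning module (module name is illustrative)
        vfs objects = vendor_av_scan

The change then takes effect after a Samba reload (e.g. sudo /etc/init.d/samba reload on this platform).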

In the previous Linux test several products using the VFS method encountered difficulties with speed and stability, but there were no such issues here, with things running along at excellent speeds without so much as a wobble until on-demand scanning of the archive set brought up a few segmentation-fault crashes. A handful of items were removed and the test completed successfully; the issue could not be reproduced in isolation. Detection was excellent, and with no samples missed in the WildList set and no false positives, BitDefender also wins another VB100 award.

ItW: 100.00%
ItW (o/a): 100.00%
Linux: 93.33%
Macro: %
Standard: %
Polymorphic: 100.00%

Doctor Web Dr. Web for Linux 4.44.0

The Dr.Web product was a little more pared-down than the others, with a few simple .tgz archives which just needed extracting into the system root to drop their files into the right spots. After some teething problems with permissions – the result of inadequate perusal of the documentation on my part – things got trotting along nicely. I found the syntax of the command-line scanner a little quirky, but soon mastered it, along with the implementation of the on-access scanner SpIDerGuard, which again made use of Samba’s built-in VFS objects system.
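
A sketch of the general approach, with an illustrative archive name, is simply:

    # unpack the vendor archive directly over the filesystem root
    # (the archive name is illustrative; check file permissions afterwards,
    #  as noted above)
    sudo tar xzf drweb-linux-scanner.tar.gz -C /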

The product’s extreme thoroughness in analysing archives meant that scan times on some sets were rather long, but hugely detailed logs were produced, packed with information on the files which had been scanned. These included alerts on a range of ‘riskware’ and ‘hacktool’ products which I may not have wanted to have around had I been a genuine network administrator.

On more normal files speeds were very impressive, and detection rates were also extremely high, but once again a handful of items from the WildList set were not covered, and a couple of items in the clean set were mislabelled as malware, thus denying Dr.Web a VB100 award this time.

ItW: 97.55%
ItW (o/a): 97.55%
Linux: 100.00%
Macro: %
Standard: %
Polymorphic: 100.00%

ESET Security 3.0.3

The ESET installation process returned to the .deb package method, and proved fast and efficient. On-access scanning could be implemented using either dazuko or the Samba VFS path, and the latter was adopted at the request of the developers. This proved simple to get working once I had navigated my way around the setup, and again the command-line scanner was a joy to operate.

Scanning speeds were not as eye-watering as usual, but they seemed much quicker on the infected sets, suggesting that the clean items were being subjected to some thorough probing. With excellent detection and no false positive issues, ESET storms its way to a record 50th VB100 award.

ItW: 100.00%
ItW (o/a): 100.00%
Linux: 100.00%
Macro: %
Standard: %
Polymorphic: 100.00%

Frisk F-PROT Antivirus 6.2.1.4252

Installation of F-PROT took the simple route of extracting an archive onto the system and poking around inside it for the required tools and daemons. Man pages and other hints were plentiful and clear, and setup was a painless process, as was testing itself.

Having become accustomed to the dragged-out nature of the tests so far, F-PROT’s scanning speeds seemed lightning-quick, with detection rates equally remarkable. But, just as everything was looking rosy for F-PROT, a single item in the clean set – a rather specialist text editing tool – was labelled as a backdoor program, and F-PROT therefore fails to make the VB100 grade by a whisker.

ItW: 100.00%
ItW (o/a): 100.00%
Linux: 100.00%
Macro: %
Standard: %
Polymorphic: 100.00%

F-Secure Linux Security 7.00.71615

F-Secure’s product is a lot bigger and shinier, with a complex installation process involving first setting up several dependencies, then running the initial installer and finally a secondary configuration program. Part of the reason for this bulkiness is the complexity of the product – in addition to the simple anti-virus scanner provided by most submissions this month, F-Secure’s product includes its own firewall and a series of intrusion-prevention and integrity-checking tools.

Getting things running was initially a little tricky, requiring in-depth perusal of a lengthy PDF manual included inside the install packages – of course, with no desktop environment on the test systems, this required copying it back to the client machine (and installing PDF viewing software) to read it. On first attempt the product claimed its on-access component was active, but it seemed to be having no effect, and the web interface – which appeared to be the only means of accessing much of the configuration – was inaccessible beyond the login page.

On second attempt things went much better, however – the on-access scanner, apparently based on a custom version of the dazuko technology, worked fine and the web interface presented a pleasant and well-laid-out experience.

I operated on-demand scanning via the command line as usual, and other tests also proceeded normally after a few tweaks to the settings via the GUI. Scanning speeds were fairly languid once again, but scanning levels were thorough and detection equally in-depth. Just when all was looking up, the same tool that tripped up F-PROT was alerted on, this time being labelled an Ircbot trojan. This meant that F-Secure was also denied a VB100 award this month, and as the alert was marked as originating from the AVP engine, more upsets were expected.

ItW: 100.00%
ItW (o/a): 100.00%
Linux: 100.00%
Macro: %
Standard: %
Polymorphic: 99.88%

Kaspersky Anti-Virus for Samba Servers for Linux 5.5

Kaspersky uses the .deb package method for its install, but this time it seemed to do little more than place the required software in the right spots; exactly where these spots might be was left somewhat unclear. After some rummaging around I found the manual pages and linked them in with the man system, which shed some light on how to proceed. Post-install scripts had doctored my Samba configuration to include the VFS on-access scanner, which was fairly simple to configure. The command-line scanner operated in a fairly normal way too, although it had an unwieldy title and needed to be run as root to access its own configuration files.
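
Linking vendor-supplied manual pages into the system man path is a quick one-off job along the following lines – the source path here is purely illustrative:

    # make the product's man pages visible to man and apropos (paths illustrative)
    sudo ln -s /opt/vendor/man/man1/*.1 /usr/local/share/man/man1/
    sudo mandb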

Once things were up and running everything went fairly smoothly, with speeds not too bad in a slow month like this. Detection rates were excellent as ever, including complete coverage of the WildList, but as expected that pesky false positive cropped up once more and Kaspersky Lab fails to add another VB100 award to its collection.

ItW: 100.00%
ItW (o/a): 100.00%
Linux: 100.00%
Macro: %
Standard: %
Polymorphic: 99.88%

MicroWorld eScan for Linux Server 2.0-16

After the performance of the last few products, things did not bode well for MicroWorld, which is another product based on Kaspersky’s AVP engine.

The installation process was another complicated monster, with an enormous list of dependencies thoughtfully provided by the developers. While much of the list could be acquired from repositories on the platform’s install CD, many more items had to be scavenged from the Internet. Many of them seemed to relate to the graphics display, so I assumed they were there to support some kind of interface, which would not be needed here. Aided by detailed instructions provided by the developers I gathered them up and eventually managed to get all the eScan components, most of which were provided as .deb packages, to install and run.

Once everything was happy, and after another tweak to a Samba configuration file to activate a VFS object, the test chugged along at a fairly laid-back pace, the default settings being extremely thorough. In the end things went much as predicted, with excellent detection rates throughout and just that single pesky little file mislabelled in the clean set spoiling MicroWorld’s hopes for a VB100 award.

Please refer to the PDF for test data

Norman Virus Control 5.701

With Norman we returned once more to the simple and trusty method of dropping an archive load of files into the filesystem root. This did a splendid job, setting everything up just so, including the man pages, which enabled fast and easy navigation of the configuration process. With the dazuko module loaded once again, everything looked set to go in record time.

Getting through the on-demand tests proved a breeze, thanks to the lucid instructions and logical controls. The on-access monitoring seemed solid and functional, although slightly lower in detection of some file-infecting viruses thanks to the Sandbox technology playing less of a role by default. When I looked to change these settings, all the advice I could find referred me to a Java-based interface, but getting access to this would require considerable effort and time – which was running short.

Skipping the added speed measurements with full scanning enabled, a quick look through the logging showed that Norman had brought a run of recent misses to an end with excellent WildList detection and no false positives, bringing it proudly back to VB100 certified status.

ItW: 100.00%
ItW (o/a): 100.00%
Linux: 100.00%
Macro: %
Standard: %
Polymorphic: 73.47%

Quick Heal 9.50

Quick Heal is another dazuko product, which also had a few other dependencies to fill. This proved to be a fairly straightforward job thanks to some tips from the developers.

The setup process was clear, helpful and even colourful, making a surprising and impressive difference to the clarity of an installation, which can often get swamped in a mass of samey text. Getting to grips with the command-line scanner proved a little less smooth, with a rather unusual syntax required, including placing the path of the files to be scanned before all other arguments. A GUI is also available, for those hedonists running glitzy desktop environments, but the command line served ably once its intricacies had been mastered.

The product lived up to its name with its scanning speeds, which were helped in the on-access test by not scanning archives by default. Upon investigating this, I found various options for the configuration of on-access logs and other sundries, but little concerning the actual types of files scanned. Of course, I may have been looking in the wrong place. Moving swiftly on to the results, I found detection rates at their usual decent level with excellent scores on the WildList and other more recent samples, but once more a couple of items in the clean set were alerted on and Quick Heal misses out on a VB100 award this month.

ItW: 100.00%
ItW (o/a): 100.00%
Linux: 66.67%
Macro: %
Standard: %
Polymorphic: 83.86%

Sophos Anti-Virus for Linux 6.3.3

Sophos’s product is unusual in this test in using its own file-hooking setup. This goes by the name of Talpa and apparently, like dazuko, has been made open-source but is not as widely implemented. This gave rise to a few worries, as the platform under test is rather new and as yet not officially supported. However, after some confusion over which version I should be using, things went like a dream.

The product is supplied as a simple .tgz archive and an installation script which prepares and sets up everything very neatly, including pleasantly accessible documentation.

The command-line scanner uses pretty straightforward and humanly readable syntax, and scanning times were excellent with the bare, no-options settings. Tweaking them up a bit still produced good speed, and the on-access scanner was similarly zippy, although working out the rather less straightforward configuration system took a few moments. In the end, detection rates were top-notch, false positives absent, and Sophos put some upsets in recent tests behind it by winning a VB100 award with ease.

ItW: 100.00%
ItW (o/a): 100.00%
Linux: 65.00%
Macro: %
Standard: %
Polymorphic: 100.00%

Symantec AntiVirus 1.0.4.516

I approached Symantec’s product with some dread, remembering a rather traumatic experience with its Byzantine configuration system in last year’s Linux test. Searching inside the pair of archives provided, I found a selection of .deb packages and some documentation in PDF files, which were shipped over to the workstation for some in-depth browsing. Eventually, after a few false starts thanks to conflicting instructions at different points, things were up and running pretty solidly.

Next came the equally arduous task of navigating my way around the product, carried out mostly by passing arguments to the ‘sav’ command, which the scanning and monitoring daemons would then silently act upon. This made monitoring the progress of scans rather tricky, relying mainly on watching the tail of the syslog – the only logging method available as far as I could tell – and keeping an eye on the hard drive activity lights.
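
In practice this meant leaving something like the following running on the server console while scans were in progress (the filter pattern is illustrative):

    # follow scanner activity as it appears in the system log
    tail -f /var/log/syslog | grep -i --line-buffered 'sav\|scan'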

On-access monitoring defaults to removing or disinfecting files, so to speed things along I delved bravely into the full glory of the configuration system, digging up secrets gleaned from tech support gurus for the last test. A listing of the configuration settings, which take the form of a registry-style database of keys, provided a little illumination, but their true meaning was far from clear. Passing in commands proved a lengthy and difficult process: simple changes required huge commands to be built up, and the absence of any feedback confirming that an instruction had been accepted meant repeated trawls through the data to check that changes had indeed taken effect.

Once everything was under control, things moved along nicely, with excellent scanning speeds and the usual impeccable detection. Archive settings were a little low to measure fairly against others on the speed graphs, but changing them would have required more visits to the config system, and the resultant wear and tear on keyboards would have eaten heavily into my hardware budget. Leaving things as they were, Symantec’s perfect detection scores across all test sets and absence of false positives earns it yet another VB100 award.

ItW: 100.00%
ItW (o/a): 100.00%
Linux: 100.00%
Macro: %
Standard: %
Polymorphic: 100.00%

VirusBuster SambaShield 1.2.0_10

The final product on the list came from VirusBuster and proved a much more pleasant experience. Provided as a pair of archives with Perl installation scripts, the setup ran through speedily and without difficulty. Another Samba VFS object managed the on-access side of things, and the layout and syntax seemed generally well thought out and sensible.

The tests zoomed through at an impressive rate, and detection levels showed continued improvement. Again a lack of time prevented in-depth investigation of the on-access configuration system to enable full archive scanning, and the only other problem encountered was the layout of the logging, which kept the scanned path on a separate line from the detection information and so added a few extra stages to my results parsing. Beyond this minor quibble, though, the WildList was covered without problems and the rest of the sets were handled pretty well too; with no false positives either, VirusBuster earns a VB100 award.

ItW: 100.00%
ItW (o/a): 100.00%
Linux: 83.33%
Macro: %
Standard: %
Polymorphic: 79.29%

Results tables

Conclusions

Once more the Linux test has proved to be the domain of the hardened VB100 experts, the small list of products participating in this test consisting entirely of names made familiar from consistent and dogged appearances in the test month after month. A few regulars were notable by their absence, with some of the larger, more corporate-focused companies yet to implement support for the platform selected.

After a string of low-scoring tests I had hoped that this month might finally see a clean sweep with all products passing, and as far as the WildList went we nearly made it, with only one product having trouble in this area. Once again, however, the test’s strict false positive rules played a major part, with just four files scuppering the chances of a VB100 award for six of the products.

Beyond the basics of the scores, the products themselves displayed a dizzying variation in style and implementation, with some remaining extremely simple while others have expanded their functionality in a range of new ways. Both paths proved capable of providing stable, rapid and usable products as well as confusing, sluggish and wobbly protection, with documentation – or at least accessing it – being the most significant factor as far as ease of use was concerned. Of course, submissions were not necessarily made in the same format as paying customers would receive, and the more obvious installation instructions and user manuals that customers are likely to get would make a big difference in some cases. With a little work, however, all products were made to function sufficiently well to get through the tests, and all provided a decent level of configurability, albeit in some cases in a rather bizarre and arcane fashion.

The added complexity of the installation and navigation of various products meant that this month’s comparative was not the quick and restful experience I had hoped for between two much larger tests. It has highlighted the pace with which most products are keeping up with our test sets, and the need for more rapid expansion of those test collections to provide a more accurate gauge of their capabilities. Hopefully, June will grant time to ensure the test sets are well enlarged in time for the forthcoming XP comparative, due to commence at the start of July and to appear in the August issue of VB. Perhaps that test, which I expect to break the 40-product mark, will finally see that clean sweep with no failures – I can but hope.

Technical details

Test environment. Tests were run on identical machines with AMD Athlon64 3800+ dual core processors, 1 GB RAM, 40 GB and 200 GB dual hard disks, DVD/CD-ROM and 3.5-inch floppy drive, running Ubuntu Linux 8.04LTS Server edition.

Client machines. Client machines had 1.6 GHz Intel Pentium processors with 512 MB RAM, 20 GB dual hard disks, CD-ROM and 3.5-inch floppy drive running Microsoft Windows XP Professional SP2, connected via Samba 3.0.28a.

Any developers interested in submitting products for VB's comparative reviews should contact [email protected]. The current schedule for the publication of VB comparative reviews can be found at http://www.virusbtn.com/vb100/about/schedule.xml.
