============================================================
 File 0XECSUM.TXT: "EXECUTIVE SUMMARY"
 AntiVirus/AntiMalware Product Test "2001-04"
 Virus Test Center (VTC), University of Hamburg
============================================================
[Formatted with non-proportional font (Courier), 72 columns]

**********************************************************************
Content of this file:
**********************************************************************
 0. Editor's Foreword
 1. Background of this test; Malware Threats
      Table ES0: Development of viral/malware threats
 2. VTC Testbeds used in VTC test "2001-04"
      Table ES1: Content of VTC test databases in test 2001-04
 3. Products participating in Test "2001-04"
      Table ES2: List of AV products in test 2001-04
 4. A serious problem: Flaw in Microsoft FindFirst/FindNext routine?
 5. Results of AV products under DOS
      Table DOS-A: Development of DOS scanners from 1997-02 to 2001-04
      Findings DOS.1 - DOS.8
      Grading DOS products according to their detection performance
 6. Results of on-demand detection under Windows-NT
      Table WNT-A: Development of W-NT scanners from 1997-07 to 2001-04
      Findings WNT.1 - WNT.8
      Grading WNT products according to their detection performance
 7. Results of on-demand detection under Windows-98
      Table W98-A: Development of W-98 scanners from 1998-10 to 2001-04
      Findings W98.1 - W98.8
      Grading W98 products according to their detection performance
 8. Results of on-demand detection under Windows-2000
      Table W2k-A: Development of W-2k scanners from 2000-08 to 2001-04
      Findings W2k.1 - W2k.8
      Grading W2k products according to their detection performance
 9. Comparison of detection behaviour for W32 platforms
      Grading AV products concerning W32-harmonical behaviour
10. Results of on-demand detection under Linux (SuSE)
      Table Lin-A: Development of Linux scanners in 2001-04
      Findings LIN.1 - LIN.8
      Grading LINUX products according to their detection performance
11. Conclusion: In Search of the "Perfect AV/AM product"
12. Availability of full test results
13. Copyright, License, and Disclaimer
***********************************************************************

0. Editor's Foreword:
=====================

VTC test "2001-04" (started mid-December 2000) proved - unexpectedly -
to be the most complex and challenging task so far. This test produced
more data and results than any VTC test before, not only because of
the large zoo testbeds and the fact that we tested 5 different
platforms. With the growth of our testbeds, we also experienced many
more problems, because products behaved "abnormally" (see
8problms.txt). Moreover, some products (and our test crew) suffered
from a known but uncorrected flaw in Microsoft's FindFirst/FindNext
routines, which required several postscans (see 4).

Evidently, the time of DOS-based AntiVirus/AntiMalware products -
formerly the reference products for measuring detection rates on W32
platforms - is passing. Not only is the number of DOS products
decreasing, but worse: detection rates tend to fall. Consequently,
DOS products can no longer be regarded as reference products. This
forced the editor to develop a general framework for the evaluation
of each single product (see 7EVAL texts).

With the deployment of new W32 platforms (including W-XP), customers
moving from one W32 platform to another will assume that related
AV/AM products behave with IDENTICAL detection rates. We tested this
assumption (which we call "W32-harmonical behaviour", see 9) and
found that it presently holds only for FEW products.
Here, AV/AM producers must invest significantly more intelligence and
work.

One serious concern from our results is that AV producers concentrate
more on detection of In-The-Wild (ITW) viruses than on zoo viruses.
Indeed, one AV company informed us that they don't wish to participate
in our test as they concentrate on ITW detection and are aware that
their products will produce "unfavourable results" for our zoo
testbeds (see 3). For many other AV products, detection rates of ITW
viruses are perfect (100%) or excellent (>99%), but detection of zoo
viruses is often significantly lower. Evidently, AV producers focusing
on ITW detection forget that any ITW virus has been a zoo virus before
going In-The-Wild. It can hardly surprise that customers of such
products painfully experience how neglect of zoo virus detection
affects their IT services when a hitherto unknown zoo virus is
deployed broadly (the author of this test report had to advise several
victims of such ill-advised "ITW-mindedness", aka "zoo-blindness").
And the first victims - often large companies - cannot be saved by any
fast exchange of newly "wildering" code: such an exchange always comes
too late for some!

This test - as all previous ones - has been performed by students of
Hamburg University's Faculty for Informatics with a special interest
in IT Security (see our 4-semester curriculum, started in 1988, on our
homepage). Unlike other tests, where submitters of products have to
pay a fee for being admitted, VTC tests are "free of fee". This
implies that students who have to complete their examinations and
usually also work to earn their living are only "partially available"
for tests. Moreover, our hardware, which is essentially funded by
Faculty support (sometimes also by donation of new machines, usually
more powerful than those which we can buy from university money),
canNOT compete with the technical equipment in other test labs. We
regret that these circumstances cause delays in performing and
publishing our regular test reports, but instead of hurrying to meet
dates and expectations, we insist that the assessed quality of our
test results shall have - also in the future - highest priority.

Most work in VTC tests rests on the shoulders of our test crew, and
the editor wishes to thank them all for their devotion and hard work.

1. Background of this test: Malware Threats:
============================================

Malicious software (malware), including viruses (= self-replicating
malware), Trojan horses (= pure payload without self-replication),
virus droppers and network malware (e.g. worms and hostile applets),
is regarded as a serious threat to PC users, esp. when connected to
intranets and the Internet. The development of malicious software can
well be studied in view of the growth of VTC (zoo and In-The-Wild)
testbeds.
The following table summarizes, for previous and current VTC tests (indicated by their year and month of publication), the size of virus and malware (full = "zoo") databases (indicating each the different viruses and number of instantiations of a virus or malware and having in mind that some revisions of testbeds were made): Table ES0: Development of threats as present in VTC test databases: =================================================================== = File viruses= = Boot Viruses= =Macro Viruses= == Malware == =ScriptViruses= Test# Number Infected Number Infected Number Infected Number Malware Number Number Viruses objects viruses objects viruses objects file macro viruses objects ------------------------------------------------------------------------------------- 1997-07: 12,826 83,910 938 3,387 617 2,036 213 72 --- --- 1998-03: 14,596 106,470 1,071 4,464 1,548 4,436 323 459 --- --- 1998-10: 13,993 112,038 881 4,804 2,159 9,033 3,300 191 --- --- 1999-03: 17,148 128,534 1,197 4,746 2,875 7,765 3,853 200 --- --- + 5 146,640 (VKIT+4*Poly) 1999-09: 17,561 132,576 1,237 5,286 3,546 9,731 6,217 329 --- --- + 7 166,640 (VKit+6*Poly) 2000-04: 18,359 135,907 1,237 5,379 4,525 12,918 6,639 260 --- --- + 7 166,640 (VKit+6*Poly) 2000-08: --- --- --- --- 5,418 15,720 --- 500 306 527 2001-04: 20,564 140,703 1,311 5,723 6,233 19,387 12,160 627 477 904 + 7 166,640 (Vkit+6*Poly) ------------------------------------------------------------------------------------- Remark: Before test 1998-10, an ad-hoc cleaning operation was applied to remove samples where virality could not be proved easily. Since test 1999-03, separate tests are performed to evaluate detection rates of VKIT-generated and selected polymorphic file viruses. With annual deployment of more than 5,000 viruses and about 1,000 Trojan horses, many of which are available from Internet, and in the absence of inherent protection against such dysfunctional software, users must rely on AntiMalware and esp. AntiVirus software to detect and eradicate - where possible - such malicious software. Hence, the detection quality of AntiMalware esp. including AntiVirus products becomes an essential prerequisite of protecting customer productivity and data. Virus Test Center (VTC) at Hamburg University´s Faculty for Informatics performs regular tests of AntiMalware and esp. AntiVirus Software. VTC recently tested current versions of on-demand scanners for their ability to identify PC viruses. Tests were performed on VTCs malware databases, which were frozen on their status as of *** October 31, 2000 *** to give AV/AM producers a fair chance to support updates within the 8 weeks submission period (product submission date: December 11, 2000). The main test goal was to determine detection rates, reliability (=consistency) of malware identification and reliability of detection rates for submitted or publicly available scanners. Special tests were devoted to detection of multiple generations of 6 polymorphic file viruses (Maltese.Amoeba, Mte.Encroacher.B, Natas, Tremor Tequila and One-Half) and of viruses generated with the "VKIT" file virus generator. It was also tested whether viruses packed with 6 popular compressing methods (PKZIP, ARJ, LHA, RAR, WinRAR and CAB) would be detected (and to what degree) by scanners. Moreover, avoidance of False Positive alarms on "clean" (=non-viral and non- malicious) objects was also determined. 
Finally, a set of selected non-viral file and macro malware (droppers, Trojan horses, intended viruses etc) was used to determine whether and to what degree AntiVirus products may be used for protecting customers against Trojan horses and other forms of malware. VTC maintains, in close and secure connection with AV experts worldwide, collections of boot, file and macro viruses as well as related malware ("zoo") which have been reported to VTC or AV labs. Moreover, following the list of "In-The-Wild Viruses" (published on regular basis by Wildlist.org), a collection of viruses reported to be broadly visible is maintained to allow for comparison with other tests; presently, this list does not report ITW Malware. 2. VTC Testbeds used in VTC test "2001-04": =========================================== The current sizes of different VTC testbeds (developped from previous testbeds through inclusion of new viruses and malware and some revision) is given in the following table (for detailed indices of VTC testbeds, see file "a3testbed.zip") Table ES1: Content of VTC test databases: ========================================================================== "Full Zoo":20,564 File Viruses in 140,703 infected files, 60,000 Instantiations of 6 polymorphic file viruses 10,706 Variations of file viruses generated with VKIT 12,160 different File Malware in 6,250 file objects 664 Clean Files (in 27 directories) used for False Positive Detection 1,311 System (Boot etc) Viruses in 5,723 infected images, 6,233 Macro (VBA) Viruses in 19,387 infected documents, 403 different Macro Malware in 627 macro objects 329 Clean macro objects (in 26 directories) used for False Positive test 477 different script (VBS etc) viruses in 904 infected objects 115 different families "exotic" viruses/trojans (274 variants) ----------------------------------------------------------------- "ITW Zoo": 20 File Viruses in 409 infected files, 22 System Viruses in 300 infected images, 147 Macro Viruses in 1,347 infected documents 16 Script Viruses in 133 infected objects ============================================================================ For a survey of platforms, see A4tstdir.txt, and for the content of the resp. testbeds see A3TSTBED.zip (available for download). Concerning quality of viral testbeds, it is sometimes difficult to assess the "virality" (=ability of a given sample to replicate at least twice under given constraints) of large "viral zoo" databases, esp. as some viruses work under very specific conditions. We are glad to report, that colleagues such as Dr. Vesselin Bontchev, Fridrik Skulason, Igor Muttik and Righard Zwienenberg (to name only some) helped us with critical and constructive comments to establish viral testbeds, the residual non-viral part of which should be very small. We also wish to thank "WildList Organisation" for supporting us with their set of In-The-Wild viruses; the related results may support users in comparing VTC tests with other ITW-only tests. 3. 
Products participating in Test "2001-04":
=============================================

For test "2001-04", the following *** 25 *** AntiVirus products
(addressed in subsequent tables by a 3-letter abbreviation) were
tested under DOS, Windows-98, Windows-NT, Windows-2000 and LINUX,
in 77 different versions:

Table ES2: List of AV products in test "2001-04"
================================================
Abbreviation/Product/Version         Tested under Platform
----------------------------------------------------------------
ADO = AntiDote                       W-98
AT5 = Anti-Trojan 5                  W-98
AVA = AVAST v7.70                    DOS
AV3 = AVAST32 v3.0                   W-NT W-98 W-2k
AVG = AVG 6.220                      DOS W-NT W-98 W-2k
AVK = AVK 3.0                        DOS W-NT W-98 W-2k
AVP = AVP Platinum                   DOS W-NT W-98 W-2k Linux
CLE = Cleaner 3.2                    W-98
CMD = Command Software AV 4.60       W-NT W-98 W-2k Linux
DRW = DrWeb 4.21                     DOS W-NT W-98
DSE = Dr Solomon Emergency AV        W-98
FPR = FProt 3.08b                    DOS W-NT W-98 W-2k
FPW = FProt FP-WIN 3.05              W-NT W-98 W-2k
FSE = FSAV 5.21                      W-NT W-98 W-2k Linux
INO = Inoculan 4.53 Enterprise Ed.   DOS W-NT W-98 W-2k
MR2 = MR2S 1.14                      DOS W-98
NAV = NAV 5.01.01                    DOS W-NT W-98 W-2k
NVC = NVC 4.8/4.9                    DOS W-NT W-98 W-2k
PAV = PAV 3.0                        DOS W-NT W-98 W-2k
PER = Perivian AntiVirus             W-NT W-98 W-2k
PRO = Protector 7.70                 W-NT W-98 W-2k
QHL = QuickHeal 6.0                  W-98
RAV = RAV 8.1                        W-NT W-98 W-2k Linux
SCN = NAI VirusScan 4.12.0           DOS W-NT W-98 W-2k Linux
VSP = VSP 12.02.2                    DOS W-NT W-98 W-2k
-----------------------------------------------------------------

For details of AV products, including options used to determine
optimum detection rates, see A3SCNLS.TXT. For scanners where results
are missing, see 8problms.txt.

In general, AV products were either submitted or, when test versions
were available on the Internet, downloaded from the respective
ftp/http sites. A few scanners were not available, either in general
(e.g. TNT) or for this test; some of these were announced for
participation in a future test. Finally, a very few AV producers
answered VTC's requests for scanner submission with electronic
silence.

Concerning frequently asked questions, some AV producers deliberately
don't submit their products: TrendMicro Germany has informed the
author of this report that they are NOT interested in VTC test
participation, as their scanner is deliberately trimmed to on-access
scanning and detection of In-The-Wild viruses. As VTC tests also
emphasize the detection of zoo viruses, where their products would
produce "unfavourable results", there is no intent to submit their
products. Following this clarification, VTC refrains from inviting
TrendMicro for future test participation. Panda has permitted tests
to any institution and university *except VTC*. Regrettably, we had
to ask Sophos, producer of Sophos AntiVirus (aka Sweep), to refrain
from submitting their products, as VTC does not wish to support an
enterprise which deliberately advocates and practices virus eXchange
(vX), whether rapid or not, which according to VTC's Code of Conduct
is an attitude practiced by malevolent collectors and authors of
malicious software. Sophos followed VTC's suggestion but prefers
calling this request "exclusion". So far, there is no indication that
Sophos has changed their attitude concerning "vX". In contrast, VTC
welcomes that both MR2S and VSP are submitted for the test again.
These products were excluded for one test as one former co-author
(Mr. Marx) had publicly accused VTC of bad testing when his own
product (TSC) produced "inadequate results" (including multiple
crashes).
As Mr. Marx is no longer involved, we are glad to continue supporting
MR2S and VSP improvement.

The following paragraphs survey essential findings in comparison with
previous VTC tests (performance over time), as well as some relative
"grading" of scanners for detection of file and macro viruses, both in
full "zoo" and "In-The-Wild" testbeds, of file and macro malware, as
well as detection of ITW file and macro viruses in objects packed with
ARJ, LHA, ZIP, RAR, WinRAR and CAB. Special tests were performed for a
large set of viruses generated with the VKit virus generator, and the
ability to detect polymorphic file viruses was also analysed. For DOS
(only), the detection of boot/MBR infectors was also tested. Finally,
the ability of AV products to avoid False Positive alarms is also
analysed.

With the growing diversity of platforms, many PC clients and servers
are used to transmit code for non-native platforms; often devices
operating under totally different platforms (UNIX variants, OS/2, but
also Personal Digital Assistants or mobile/cell phones) are connected
to PCs. When PCs host code (active content or executables) for such
platforms, they can also serve as entry gates for viral or malicious
code. Consequently, we have introduced a test for "exotic malware"
(viral or trojanic) on alien platforms such as Linux, OS/2, Java et al.

Detailed results, including precision and reliability of virus and
malware identification (including the grids used to assign a
performance level to each product), are presented in the following
platform-specific files:
   for DOS:      6bdosfil.txt, 6cdosmac.txt, 6ddosboo.txt
   for W32:      6gwnt.txt, 6fw98.txt, 6hw2k.txt
   comparison:   6mcmp32.txt
   for Linux:    6xlin.txt

In a rather detailed analysis, detection rates are presented for each
platform (operating system), and product behaviour is graded in
comparison with all products tested on the resp. platform:
   evaluation/grading for DOS products:    7evaldos.txt
                      for W-NT products:   7evalwnt.txt
                      for W-98 products:   7evalw98.txt
                      for W-2k products:   7evalw2k.txt
                      for LINUX products:  7evallin.txt

Under the scope of VTC's grading system, a "Perfect AV/AM product"
would have the following characteristics:

Definition (1): A "Perfect AntiVirus (AV) product"
--------------------------------------------------
 1) Will detect ALL viral samples "In-The-Wild" AND at least 99% of
    zoo samples, in ALL categories (file, boot, macro and script-based
    viruses), with always the same high precision of identification
    and in every infected sample,
 2) Will detect ALL ITW viral samples in compressed objects for all
    (6) popular packers, and
 3) Will NEVER issue a False Positive alarm on any sample which is
    not viral.
 Remark: detection of "exotic viruses" is presently NOT rated.

Definition (2): A "Perfect AntiMalware (AM) product"
----------------------------------------------------
 1) Will be a "Perfect AntiVirus product", that is:
    100% ITW detection AND >99% zoo detection
    AND high precision of identification
    AND high precision of detection
    AND 100% detection of ITW viruses in compressed objects
    AND 0% False-Positive rate,
 2) AND it will also detect essential forms of malicious software,
    at least in unpacked forms, reliably at high rates (>90%).
 Remark: detection of "exotic malware" is presently NOT rated.

4. A serious problem: Flaw in Microsoft FindFirst/FindNext routine?
===================================================================

Since VTC tests started (including the work of Vesselin Bontchev in
the early 1990s), we have experienced many problems.
Products were often difficult to install and manage (see
8problms.txt). With the growing size and diversity of testbeds, it was
expected that problems would again grow. But there is one problem
which affects not only testers with large viral databases (a situation
which customers will hardly experience). More than ever before, we
found after completion of some test run that AV products had NOT
TOUCHED all parts of the directory, for no obvious reason and always
without any diagnosis (no exception etc). In such cases, we determined
those parts of the testbed which had not been processed and restarted
the product ("postscan"). When after completion some remainder was
still untouched, we started a 2nd postscan for that remainder.

The most probable reason for such behaviour of SEVERAL products which
otherwise behave "smoothly" is: the methods offered by Microsoft to
traverse a directory, esp. the routines FindFirst and FindNext, DO NOT
WORK RELIABLY on large directories. This effect was first reported by
Eugene Kaspersky, but we have seen NO IMPROVEMENT OR CORRECTION.
Evidently, this problem seems to be related to the invocation of those
routines (FF/FN), and this may be affected by some compiler or
assembler. Only through extensive inspection of the resulting test
logs ("test quality assurance") could we reduce the impact of this MS
flaw on our test results: but it is "NOT NATURAL" that anyone must
start an AV product more than once to be sure that ALL potentially
malicious objects have been checked! (An illustrative sketch of the
affected traversal pattern is given below, after the introduction to
section 5.)

This problem may not only show its dirty face for large virus/malware
testbeds. With growing sizes of customer directories, the likelihood
grows that NOT ALL OBJECTS are touched by any method using FF/FN. As
this is a problem also for many companies with large directories, WE
STRONGLY REQUEST THAT MICROSOFT CURES THIS FLAW.

5. Results of AV products under DOS:
====================================

This is a summary of the essential findings for AV/AM products under
DOS. For details see 7evaldos.txt. Meant as a perspective of product
results, the following table (DOS-A) lists all results of DOS scanners
for zoo detection of file, macro and script viruses in the last 9 VTC
tests. Moreover, differences ("delta") in the resp. detection rates
for those products which participated in the last 2 tests are also
given, and mean values are calculated.
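Referring back to the FindFirst/FindNext issue described in section 4:
the following minimal sketch (our own illustration, NOT taken from any
tested product) shows the Win32 traversal pattern in question together
with the kind of completeness check ("postscan") we had to apply. The
two-pass comparison and the non-recursive enumeration are assumptions
made for brevity; a real scanner would recurse and scan each file.

    /*
     * Illustrative sketch only: enumerate the entries of one directory
     * with FindFirstFile/FindNextFile, check WHY the enumeration ended,
     * and compare two passes ("postscan") to detect incomplete traversal.
     */
    #include <windows.h>
    #include <stdio.h>
    #include <string.h>

    static long count_entries(const char *dir)
    {
        char pattern[MAX_PATH];
        WIN32_FIND_DATAA fd;
        HANDLE h;
        long n = 0;

        _snprintf(pattern, sizeof(pattern), "%s\\*.*", dir);
        h = FindFirstFileA(pattern, &fd);
        if (h == INVALID_HANDLE_VALUE) {
            fprintf(stderr, "FindFirstFile failed, error %lu\n",
                    GetLastError());
            return -1;
        }
        do {
            if (strcmp(fd.cFileName, ".") != 0 &&
                strcmp(fd.cFileName, "..") != 0)
                n++;                     /* here a scanner would scan */
        } while (FindNextFileA(h, &fd));

        /* Anything other than ERROR_NO_MORE_FILES means the          */
        /* enumeration ended prematurely - exactly the silent effect  */
        /* described in section 4.                                    */
        if (GetLastError() != ERROR_NO_MORE_FILES)
            fprintf(stderr, "enumeration aborted, error %lu\n",
                    GetLastError());
        FindClose(h);
        return n;
    }

    int main(int argc, char *argv[])
    {
        const char *dir = (argc > 1) ? argv[1] : ".";
        long pass1 = count_entries(dir);    /* normal scan            */
        long pass2 = count_entries(dir);    /* "postscan" for control */

        printf("pass 1: %ld entries, pass 2: %ld entries\n",
               pass1, pass2);
        if (pass1 != pass2)
            printf("WARNING: passes differ - traversal incomplete\n");
        return 0;
    }

Note that the sketch only counts entries instead of scanning them; it
is meant to show where a product must check the result of FindNextFile
and why a second pass over the same directory can reveal silently
skipped objects.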
Table DOS-A: File/Macro/Script Virus Detection Rate in last 9 VTC tests under DOS: ================================================================================== --------- File Virus ----------------------- + ------------------- Macro Virus ----------------- + ScriptVirusDet- SCAN9702 9707 9802 9810 9903 9909 0004 0104 DeltaI 9702 9707 9802 9810 9903 9909 0004 0008 0104 DeltaI 0008 0104 Delta NER % % % % % % % % % I % % % % % % % % % % I % % % -------------------------------------------------+---------------------------------------------------+---------------- ALE 98.8 94.1 89.4 - - - - - - I 96.5 66.0 49.8 - - - - - - - I - - - ANT 73.4 80.6 84.6 75.7 - - 92.8 - - I 58.0 68.6 80.4 56.6 - - 85.9 93.3 - - I 55.2 - - AVA 98.9 97.4 97.4 97.9 97.6 97.4 97.5 95.2 -2.3 I 99.3 98.2 80.4 97.2 95.9 94.6 93.7 - 92.3 - I - 30.0 - AVG 79.2 85.3 84.9 87.6 87.1 86.6 - 81.9 - I 25.2 71.0 27.1 81.6 82.5 96.6 - - 98.3 - I - 57.9 - AVK - - - 90.0 75.0 - - 99.7 - I - - - 99.7 99.6 - - 100~ 100~ 0.0 I 91.5 99.4 +7.9 AVP 98.5 98.4 99.3 99.7 99.7 99.8 99.6 99.8 +0.2 I 99.3 99.0 99.9 100% 99.8 100% 99.9 - 100~ - I - 99.8 - CMD - - - - - - 99.5 - - I - - - - - 99.5 100% 100~ - - I 93.5 - - DRW 93.2 93.8 92.8 93.1 98.2 98.3 - - - I 90.2 98.1 94.3 99.3 98.3 - 98.4 97.6 98.0 +0.4 I 60.8 95.6 +34.8 DSE 99.7 99.6 99.9 99.9 99.8 - - - - I 97.9 98.9 100% 100% 100% - - - - - I - - - FMA - - - - - - - - - I 98.6 98.2 99.9 - - - - - - - I - - - FPR 90.7 89.0 96.0 95.5 98.7 99.2 99.6 97.8 -1.8 I 43.4 36.1 99.9 99.8 99.8 99.7 100% 100~ 100% 0.0 I 90.5 96.9 +5.9 FSE - - 99.4 99.7 97.6 99.3 99.9 - - I - - 99.9 90.1 99.6 97.6 99.9 - - - I - - - FWN - - - - - - - - I 97.2 96.4 91.0 85.7 - - - - - - I - - - HMV - - - - - - - - I - - 98.2 99.0 99.5 - - - - - I - - - IBM 93.6 95.2 96.5 - - - - - - I 65.0 88.8 99.6 - - - - - - - I - - - INO - - 92.0 93.5 98.1 94.7 94.6 91.0 - I - - 90.3 95.2 99.8 99.5 99.7 99.7 99.3 -0.4 I 77.8 66.0 -11.8 IRS - 81.4 74.2 - 51.6 - - - - I - 69.5 48.2 - 89.1 - - - - - I - - - ITM - 81.0 81.2 65.8 64.2 - - - - I - 81.8 58.2 68.6 76.3 - - - - - I - - - IVB 8.3 - - - 96.9 - - - - I - - - - - - - - - - I - - - MR2 - - - - - 65.4 - - - I - - - - - 69.6 - - 44.2 - I - 85.1 - NAV 66.9 67.1 97.1 98.1 77.2 96.0 93.3 90.8 -2.5 I 80.7 86.4 98.7 99.8 99.7 98.6 97.4 97.0 93.8 -3.2 I 24.8 31.2 +6.6 NOD - - - 96.9 - 96.9 98.3 - - I - - - - 99.8 100% 99.4 - - - I - - - NVC 87.4 89.7 94.1 93.8 97.6 - 99.1 - - I 13.3 96.6 99.2 90.8 - 99.6 99.9 99.9 99.8 -0.1 I 83.7 88.5 +4.8 PAN - - 67.8 - - - - - - I - - 73.0 - - - - - - - I - - - PAV - 96.6 98.8 - 73.7 98.8 98.7 99.9 +1.2 I - - 93.7 100% 99.5 98.8 99.9 - 100~ - I - 99.8 - PCC - - - - - - - - - I - 67.6 - - - - - - - - I - - - PCV 67.9 - - - - - - - - I - - - - - - - - - - I - - - PRO - - - - 35.5 - - - - I - - - - 81.5 - - - - - I - - - RAV - - - 71.0 - - - - - I - - - 99.5 99.2 - - - - - I - - - SCN 83.9 93.5 90.7 87.8 99.8 97.1 99.9 99.8 -0.1 I 95.1 97.6 99.0 98.6 100% 100% 100% 100~ 100% 0.0 I 85.6 100% +14.4 SWP 95.9 94.5 96.8 98.4 - 99.0 98.4 - - I 87.4 89.1 98.4 98.6 - 98.4 98.4 - - - I - - - TBA 95.5 93.7 92.1 93.2 - - - - - I 72.0 96.1 99.5 98.7 - - - - - - I - - - TSC - - 50.4 56.1 39.5 51.6 - - - I - - 81.9 76.5 59.5 69.6 - - - - I - - - TNT 58.0 - - - - - - - - I - - - - - - - - - - I - - - VDS - 44.0 37.1 - - - - - - I 16.1 9.9 8.7 - - - - - - - I - - - UKV - - - - - - - - - I - - - - - - - 0.0 - - I 0.0 - - VET - 64.9 - - 65.3 - - - - I - 94.0 97.3 97.5 97.6 - - - - - I - - - VIT - - - - - - 7.6 - - I - - - - - - - - - - I - - - VRX - - - - - - - - - 
I - - - - - - - - - - I - - - VBS 43.1 56.6 - 35.5 - - - - - I - - - - - - - - - - I - - - VHU 19.3 - - - - - - - - I - - - - - - - - - - I - - - VSA - - 56.9 - - - - - - I - - 80.6 - - - - - - - I - - - VSP - - - 76.1 71.7 79.6 - - - I - - - - - - - - 0.0 - I - 85.3 - VSW - - 56.9 - - - - - - I - - 83.0 - - - - - - - I - - - VTR 45.5 - - - - - - - - I 6.3 - - - - - - - - - I - - - XSC 59.5 - - - - - - - - I - - - - - - - - - - I - - - -------------------------------------------------+---------------------------------------------------+----------------- Mean74.2 84.8 84.4 85.4 81.2 90.6 98.3 95.1%-0.9%I 69.6 80.9 83.8 89.6 93.6 88.2 98.0 98.6 93.8%-0.5%I 66.4 79.7% +8.9% -------------------------------------------------+---------------------------------------------------+----------------- Generally, the ability of DOS scanners to detect file zoo viruses "in the mean" has a visible tendency to decrease. Moreover, those scanners present in last test reduced detection rates by -0.9% (mean). Concerning macro viruses, "mean" detection rate is also reduced and tends to become insufficient. Here, products having participated in last VTC tests reduced their detection rates "only" moderately (-0.5%). Concerning script viruses which is presently the fastest growing sector, the detection rate is very low (79.7% mean). But those (7) scanners which participated in last test succeeded to improve their detection rate by impressing 8.9% "in the mean". Findings DOS.1: General development of file/macro/script zoo detection rates: ----------------------------------------------------------------------------- For DOS, file and macro zoo virus detection rates in the mean are decreasing. 4 (of 9) scanners detects almost all zoo file viruses (>99.7%), whereas 2 (out of 13) scanners detect ALL macro zoo viruses (and 3 more almost all). Detection rates for script viruses are, in the mean, still inacceptably low (79.7%) but the mean detection rate of those scanners which also participated in last VTC test is significantly improving, though only 1 (of 13) scanners detect ALL zoo script viruses. Findings DOS.2: Development of ITW file/macro/script virus detection rates: --------------------------------------------------------------------------- 4 AV products (out of 13) detect ALL In-The-Wild file, macro and zoo viruses in ALL instantiations (files): AVK, AVP, PAV, SCN 4 more products can be rated "perfect" concerning detection of file and macro viruses but they still fail to detect all script viral files (objects): AVG, DRW, FPR, NVC Findings DOS.3: Assessment of overall (ITW/zoo) detection rates: ---------------------------------------------------------------- No DOS product is overall rated "perfect". 
4 "excellent" overall scanners: SCN, PAV,AVP, AVK 1 "very good" overall scanner : FPR Finding DOS.4: Performance of DOS scanners by virus classes: ------------------------------------------------------------ Perfect scanners for file zoo: =NONE= Excellent scanners for file zoo: PAV,AVP,SCN,AVK Perfect scanners for macro zoo: FPR,SCN Perfect scanners for script zoo: SCN Perfect scanners for polymorphic set:AVG,AVK,AVP,DRW,PAV Perfect scanners for VKit set: =NONE= Excellent scanners for VKit set: AVK,AVP,PAV,SCN Perfect scanners for boot viruses: =NONE= Excellent scanners for boot viruses: PAV,AVP,SCN,AVK Findings DOS.5: Detection of packed viral (ITW) objects: -------------------------------------------------------- Detection of packed viral objects needs improvement Perfect packed file/macro virus DOS detector: SCN Perfect packed file virus detector: AVK, AVP, PAV, SCN Perfect packed macro virus detector: SCN Findings DOS.6: Avoidance of Fales Alarms: ------------------------------------------ Avoidance of False-Positive Alarms is improving though still regarded insufficient. FP-avoiding perfect DOS scanners: AVA, AVG, AVK, INO, PAV, SCN Findings DOS.7: Detection rates for file/macro malware: ------------------------------------------------------- File/Macro Malware detection under DOS is stable and needs further improvement. NO product is "perfect" (in last test: 2 products!) but 5 products are rated "excellent" (>90% detection rate): AVP, PAV, AVK, FPR, SCN Concerning only macro malware detection, NO product is rated "perfect": =NONE= Concerning macro malware detection only, 6 more products are rated "excellent": FPR, SCN, AVK, AVP, PAV, NVC Findings DOS.8: Detection rates for exotic viruses: --------------------------------------------------- Exotic viruses are detected by scanners under DOS only to a lesser degree. The following products detect at least 70%: FSE, PAV, AVK, AVP, SCN Grading DOS products according to their detection performance: ============================================================== Test category: "Perfect" "Excellent" -------------------------------------------------------------- DOS zoo file test: --- PAV,AVP,SCN,AVK DOS zoo macro test: FPR,SCN --- DOS zoo script test: SCN --- DOS zoo boot test: --- PAV,AVP,SCN,AVK DOS zoo Poly test: AVG,AVK,AVP,DRW,PAV --- DOS zoo VKit test: --- AVK,AVP,PAV,SCN DOS ITW tests: AVK,AVP,PAV,SCN AVG,DRW,FPR,NVC DOS pack-tests: SCN --- DOS FP avoidance: AVA,AVG,AVK,INO,PAV,SCN --- DOS Malware Test: --- AVP,PAV,AVK,FPR,SCN --------------------------------------------------------------- ************************************************************** "Perfect" DOS AntiVirus product: =NONE= "Excellent" DOS AV products: 1st place: SCN (13 points) 2nd place: AVK,PAV ( 9 points) 4th place: AVP ( 7 points) 5th place: AVG ( 5 points) 6th place: FPR,DRW ( 3 points) 8th place: AVA,INO ( 2 points) 10th place: NVC ( 1 point ) ************************************************************** "Perfect" DOS AntiMalware product: =NONE= "Excellent" DOS AntiMalware product: 1st place: SCN (14 points) 2nd place: AVK,PAV (10 points) 4th place: AVP ( 8 points) 5th place: FPR ( 4 points) ************************************************************ 6. Results of on-demand detection under Windows-NT: =================================================== This is a summary of the essential findings for AV/AM products under W-NT. For details see 7evalwnt.txt. 
Meant as a perspective of product results, the following table (WNT-A) lists all results of WNT scanners for zoo detection of file, macro and script viruses, in last 9 VTC tests. Moreover, differences ("delta") in resp. detection rates for those products which participated in last 2 tests are also given, and mean values are calculated. Table WNT-A: File/Macro/Script Virus Detection Rate in last 8 VTC tests under W-NT: =================================================================================== Scan ======== File Virus Detection ========== + ========= Macro Virus Detection ============= + =ScriptVirusDet= ner 9707 9802 9810 9903 9909 0004 0104 Delta I 9707 9802 9810 9903 9909 0004 0008 0104 Delta I 0008 0104 Delta -----------------------------------------------+-----------------------------------------------+----------------- ANT 88.9 69.2 91.3 - 87.2 92.8 - - I 92.2 - 85.7 - 89.3 90.2 96.4 - - I 55.2 - - ANY - - 69.7 - - - - - I - - 70.5 - - - - - - I - - - ATD - - - - - 100% - - I - - - - - 99.9 - - - I - - - AVA - 97.4 96.6 97.1 97.4 97.2 95.2 -2.0 I - 91.9 97.2 95.2 93.3 94.3 94.1 95.7 +1.6 I 15.0 29.1 +14.1 AVG - - - 87.3 87.0 85.4 81.9 -3.5 I - - - 82.5 96.6 97.5 97.9 98.3 +0.4 I 45.8 57.9 +12.1 AVK - - 99.6 90.2 99.8 99.7 99.8 +0.1 I - - 99.6 99.6 100% 99.9 100~ 100~ 0.0 I 91.8 99.8 +7.0 AVP - - 83.7 99.9 99.8 99.9 99.9 0.0 I - - 100% 99.2 100% 99.9 100~ 100~ 0.0 I 88.2 99.8 +11.6 AVX - - - - - - - - I - - - - - - - 99.0 - I 61.4 - - AW - 56.4 - - - - - - I - 61.0 - - - - - - - I - - - CLE - - - - - - - - I - - - - - - - - - I 4.2 - - CMD - - - - - 99.6 97.8 -1.8 I - - - - - 100% 100% 100% 0.0 I 93.5 96.9 +3.4 DRW/DWW - - 93.3 98.3 98.3 - - I - - - 98.3 98.8 98.4 97.5 - - I 59.8 - - DSS/E 99.6 99.7 99.9 99.3 - - - - I 99.0 100% 100% 100% - - - - - I - - - ESA - - - - - 58.0 - - I - - - - - 88.9 - - - I - - - FPR/FMA - 96.1 - 98.7 99.4 - 97.8 - I - 99.9 99.8 99.8 99.7 - - 100% - I - 97.1 - FPW - - - - - 99.6 97.8 -1.8 I - - - - 99.7 100% 100% 100% 0.0 I 90.8 96.9 +6.1 FSE - 85.3 99.8 100% 99.9 100% 51.1 -48.9#I - - 99.9 100% 100% 100% 100% 100% 0.0 I 96.7 100% +3.3 FWN - - - - - - - - I - - 99.6 99.7 - 99.9 - - - I - - - HMV - - - - - - - - I - - 99.0 99.5 - - - - - I - - - IBM 95.2 95.2 77.2 * * * * * I 92.9 92.6 98.6 * * * * * * I * * * INO - 92.8 - 98.1 98.0 98.7 97.8 -0.9 I - 89.7 - 99.8 99.7 99.7 99.8 99.7 +0.1 I 78.1 92.7 +14.6 IRS - 96.3 - 97.6 - - - - I - 99.1 - 99.5 - - - - - I - - - IVB - - - - - - - - I - - 92.8 95.0 - - - - - I - - - MKS - - - - - 78.0 - - I - - - - - 97.1 - - - I - - - MR2 - - - - 61.9 - - - I - - - - 69.6 - - - - I - - - NAV 86.5 97.1 - 98.0 97.6 96.8 93.9 -2.9 I 95.6 98.7 99.9 99.7 98.7 98.0 97.7 97.0 -0.7 I 36.6 54.5 +17.9 NOD - - - 97.6 98.2 98.3 - - I - - - 99.8 100% 99.4 - - - I - - - NVC 89.6 93.8 93.6 96.4 - 99.1 98.1 -1.0 I 96.6 99.2 - 98.9 98.9 99.9 99.9 99.8 -0.1 I 83.7 88.5 +4.8 NVN - - - - 99.0 - - - I - - - - 99.5 - - - - I - - - PAV 97.7 98.7 98.4 97.2 99.6 100% 99.9 -0.1 I 93.5 98.8 99.5 99.4 99.7 99.9 100~ 100~ 0.0 I 90.2 99.8 +9.6 PCC 63.1 - - - - - - - I - 94.8 - - - - - - - I - - - PER - - - - - - - - I - 91.0 - - - - 85.0 68.2 -16.8 I 0.0 22.0 +22.0 PRO - - - 37.3 42.4 45.6 70.5 +14.9 I - - - 58.0 61.9 67.4 69.1 67.1 -2.0 I 13.1 35.8 +22.7 QHL - - - - - - - - I - - - - - 0.0 - - I 6.9 - - RAV - 81.6 84.9 85.5 - 88.0 93.5 +5.5 I - 98.9 99.5 99.2 - 97.9 96.9 99.6 +2.7 I 47.1 84.9 - RA7 - - - 89.3 - - - - I - - - 99.2 - - - - - I - - - SCN 94.2 91.6 71.4 99.1 99.8 99.8 99.8 0.0 I 97.6 99.1 97.7 100% 100% 100% 100% 100% 0.0 I 95.8 
100% +4.2 SWP 94.5 96.8 98.4 - 99.0 99.6 - - I 89.1 98.4 97.5 - 98.4 98.6 - - - I - - - TBA - 93.8 92.6 * * * - - I 96.1 - 98.7 * * * * * * I * * * TNT - - - * * * - - I - - 44.4 * * * * * * I * * * VET 64.9 - - 65.4 * * - - I - 94.0 - 94.9 * * * * * I * * * VSA - 56.7 - - - - - - I - 84.4 - - - - - - - I - - - VSP - - - 87.0 69.8 78.1 74.5 -3.6 I - - - 86.7 0.3 0.0 - 0.0 0.0 I - 85.3 - -----------------------------------------------+-----------------------------------------------+----------------- Mean: 87.4 88.1 89.0 89.2 90,0 91.0 90.6% -2.8%I 94.7 95.9 91.6 95.3 95,1 96.5 96.3 95.3 -0.9%I 57.7 78.9 +11.0% -----------------------------------------------+-----------------------------------------------+----------------- Remark #: this result is "influenced" by the fact that the scanner didnot access even within 2 post-scans several (many) viral directories. The same may apply, though to a much lesser degree, to other products. This may be a side-effect of some software fault in the operating system when transversing the directory tree (FindFirst/FindNext routines). Generally, the ability of W-NT scanners to detect file zoo viruses "in the mean" is stable with a tendency to decrease. The detection level is still insufficient (<91%). Moreover, those scanners present in last test reduced detection rates by 2.8% (mean). Concerning macro viruses, "mean" detection rate is slightly reduced on a still acceptable level (95.3%); here, products having participated in last VTC tests succeed in following the growth by keeping detection rates (+0.1%) on high levels, where spectacular improvements are difficult. Concerning script viruses which is presently the fastest growing sector, the detection rate is very low (78.9% mean). But those (14) scanners which participated in last test succeeded to improve their detection rate by impressing 11% "in the mean". Findings WNT.1: General development of file/macro/script zoo detection rates: ----------------------------------------------------------------------------- For W-NT, file and macro zoo virus detection rates in the mean are decreasing. 4 (of 16) scanners detects almost all zoo file viruses (>99.8%), whereas 5 (out of 17) scanners detect ALL macro zoo viruses (and 3 more almost all). Detection rates for script viruses are, in the mean, still inacceptably low (78.9%) but the mean detection rate of those scanners which also parti- cipated in last VTC test is significantly improving, and 2 (of 17) scanners detect ALL zoo script viruses. Findings WNT.2: Development of ITW file/macro/script virus detection rates: --------------------------------------------------------------------------- 3 AV products (out of 17) detect ALL In-The-Wild file, macro and script viruses in ALL instantiations (files): AVK, AVP, SCN 7 more products can be rated "perfect" concerning detection of file and macro viruses but they still fail to detect all script viral files (objects): AVG, CMD, FPR, FPW, FSE, INO, NAV, NVC Findings WNT.3: Assessment of overall (ITW/zoo) detection rates: ---------------------------------------------------------------- No WNT product is overall rated "perfect". 
3 "excellent" overall scanners: SCN, AVP, AVK 3 "very good" overall scanners: FPR, CMD, FPW Finding WNT.4: Performance of WNT scanners by virus classes: ------------------------------------------------------------ Perfect scanners for file zoo: =NONE= Excellent scanners for file zoo: AVP,PAV,AVK,SCN Perfect scanners for macro zoo: CMD, FPR, FPW, FSE, PAV,SCN Perfect scanners for script zoo: FSE,SCN Perfect scanners for polymorphic set: AVG,AVP,FSE,RAV Perfect scanners for VKit set: FSE,SCN Findings WNT.5: Detection of packed viral (ITW) objects: -------------------------------------------------------- Detection of packed viral objects needs improvement Perfect packed file/macro virus WNT detector: AVK, SCN Perfect packed file macro detector: AVK, AVP, FSE, PAV, SCN Perfect packed macro virus detector: AVK, SCN Findings WNT.6: Avoidance of Fales Alarms: ------------------------------------------ Avoidance of False-Positive Alarms is improving though still regarded insufficient. FP-avoiding perfect W-NT scanners: AV3,AVG,AVK,INO,PRO,SCN Findings WNT.7: Detection rates for file/macro malware: ------------------------------------------------------- File/Macro Malware detection under WNT is slowly improving. NO product is "perfect" but 8 products are rated "excellent" (>90% detection rate): FSE,SCN,AVP,PAV,AVK,FPR,FPW,CMD Concerning only macro malware detection, 2 products are rated "perfect": FSE,SCN And concerning macro malware detection only, 9 more products are rated "excellent": CMD,FPR,FPW,AVK,AVP,PAV,NVC,RAV,INO Findings WNT.8: Detection rates for exotic viruses: --------------------------------------------------- Exotic viruses are detected by scanners under WNT only to a lesser degree. The following products detect at least 70%: FSE,PAV,AVK,AVP,SCN Grading WNT products according to their detection performance: ============================================================== Test category: "Perfect" "Excellent" --------------------------------------------------------------- WNT zoo file test: --- AVP,PAV,AVK,SCN WNT zoo macro test: CMD,FPR,FPW,FSE,PAV,SCN --- WNT zoo script test: FSE, SCN --- WNT zoo Poly test: AVG,AVP,FSE,RAV --- WNT zoo VKit test: FSE, SCN --- WNT ITW tests: AVK,AVP,SCN AVG,CMD,FPR,FPW,FSE, INO,NAV,NVC WNT pack-tests: AVK,SCN --- WNT FP avoidance: AV3,AVG,AVK,INO,PRO,SCN --- WNT Malware Test: --- FSE,SCN,AVP,PAV,AVK, FPR,FPW,CMD ---------------------------------------------------------------- ************************************************************** "Perfect" WNT AntiVirus product: =NONE= "Excellent" WNT AV products: 1st place: SCN (13 points) 2nd place: FSE ( 9 points) 3rd place: AVK ( 7 points) 4th place: AVG, AVP ( 5 points) 6th place: CMD,FPR,FPW,INO,PAV ( 3 points) 11th place: AV3,PRO,RAV ( 2 points) ************************************************************* "Perfect" WNT AntiMalware product: =NONE= "Excellent" WNT AntiMalware product: 1st place: SCN (14 points) 2nd place: FSE (10 points) 3rd place: AVK ( 8 points) 4th place: AVP ( 6 points) 5th place: CMD,FPR,FPW,PAV ( 4 points) ************************************************************* 7. Results of on-demand detection under Windows-98: =================================================== This is a summary of the essential findings for AV/AM products under W-98. For details see 7evalw98.txt. Meant as a perspective of product results, the following table (W98-A) lists all results of W98 scanners for zoo detection of file, macro and script viruses, in last 6 VTC tests. 
Moreover, differences ("delta") in resp. detection rates for those products which participated in last 2 tests are also given, and mean values are calculated. Table W98-A: Comparison: File/Macro/Script Virus Detection Rate in last 6 VTC tests under W-98: ================================================================================================ Scan ------- File Virus Detection -------+---------- Macro Virus Detection ----------+ --ScriptVirusDet-- ner 98/10 99/03 99/09 00/04 01/04 DELTA I 98/10 99/03 99/09 00/04 00/08 01/04 DELTA I 00/08 01/04 DELTA ---------------------------------------------+-------------------------------------------+------------------- ACU - - - - - I - 97.6 - - - - - I - - - ADO - - - - 99.9 - I - - - - - 99.9 - I - 99.8 - AN5 - - 87.2 - - - I - - 89.3 - - - - I - - - ANT 91.3 - 86.5 92.8 - - I 84.3 - 89.5 90.2 96.4 - - I 55.2 - - ANY - - - - - - I 70.7 - - - - - - I - - - ATR - - - - - - I - - - - - - - I - 2.7 - AVA/3 96.6 97.6 97.2 97.5 95.2 -2.3 I 96.7 95.9 93.9 94.3 94.1 95.7 +1.6 I 15.0 30.0 +15.0 AVG - 87.3 87.0 85.4 81.9 -3.3 I - 82.5 96.6 97.5 97.9 98.3 0.9 I - 57.9 - AVK 99.6 90.8 99.8 99.7 99.8 +0.1 I 99.6 99.6 100.0 99.9 100~ 100~ 0.0 I 91.2 99.8 +8.6 AVP 99.9 99.9 99.8 99.9 99.9 0.0 I 100.0 99.2 100.0 99.9 100~ 100~ 0.0 I 88.2 99.8 +10.6 AVX - 74.2 75.7 77.4 - - I - - 98.7 94.5 99.0 - - I 61.4 - - CLE - - - - 0.1 - I - - - - - 0.0 - I 4.2 6.3 +2.1 CMD - - 98.4 99.6 97.8 -1.6 I - - 99.6 100.0 100% 100% 0.0 I 93.5 96.9 +3.4 DSS/DSE 99.9 99.9 * 99.8 99.9 +0.1 I 100.0 100.0 * 100.0 100% 99.9 -0.1 I 95.8 100% +3.2 DRW/DWW - 89.5 98.3 96.7 98.5 +1.8 I - 98.3 98.8 98.4 - 98.0 - I - 95.6 - ESA - - - 58.0 - - I - - - 88.9 - - - I - - - FPR/FMA - 93.9 99.4 99.7 97.8 -1.9 I 92.4 99.8 99.7 100.0 - 100% - I - 96.9 - FPW - - 99.2 99.6 97.8 -1.6 I - - 99.9 100.0 100% 100% 0.0 I 90.8 96.9 +6.1 FSE 99.8 100.0 99.9 100.0 99.7 0.1 I 100.0 100.0 100.0 100.0 100% 100% 0.0 I 96.7 100% +3.3 FWN - - - - - I 99.6 99.7 99.9 99.8 - - - I - - - HMV - - - - - I - 99.5 - - - - - I - - - IBM 92.8 * * * * * I 94.5 * * * - - - I - - - INO 93.5 98.1 97.1 98.7 97.9 -0.8 I 88.1 99.8 98.1 99.7 99.8 99.7 -0.1 I 78.1 92.7 +14.6 IRS 96.7 97.6 - - - - I 99.0 99.5 - - - - - I - - - ITM - 64.2 - - - - I - - - - - - - I - - - IVB - - - - - - I 92.8 95.0 - - - - - I - - - MKS - - - - - - I - - - 97.1 - 44.2 - I - - - MR2 - - 65.9 - 50.1 - I - - 64.9 - - - - I - 85.1 - NAV - 96.8 97.6 96.8 93.9 -3.1 I 95.3 99.7 98.7 98.0 97.7 97.0 -0.7 I 36.6 65.5 +28.9 NOD - 97.6 98.3 98.3 - - I - 99.8 100.0 99.4 - - - I - - - NV5 - - 99.0 - - - I - - 99.6 - - - - I - - - NVC 93.6 97.0 99.0 99.1 98.1 0.1 I - 99.1 99.6 99.9 99.9 99.8 -0.1 I 83.7 88.5 +1.8 PAV 98.4 99.9 99.6 100.0 99.7 0.4 I 99.5 99.5 86.7 99.9 100~ 99.5 -0.5 I 90.2 99.8 +9.6 PCC - 81.2 - - - - I - 98.0 - - - - - I - - - PER - - - - - - I - - - 53.7 67.2 68.5 +1.3 I 18.0 22.0 +4.0 PRO - 37.3 39.8 44.6 69.9 +25.3 I - 58.0 61.9 67.4 69.1 67.1 -2.0 I 12.1 40.7 +28.6 QHL - - - - - - I - - - 0.0 - 0.0 - I 6.9 - - RAV 84.9 - 86.9 86.5 93.6 +7.1 I 92.2 - 98.1 97.9 96.9 99.6 +2.7 I 47.1 84.9 +37.8 SCN 86.6 99.8 99.7 100.0 99.9 -0.1 I 97.7 100.0 99.8 100.0 100% 100% 0.0 I 95.8 100% +4.2 SWP 98.4 - 99.0 99.6 - - I 98.6 - 98.5 98.6 - - - I - - - TBA 92.6 * * * * * I 98.7 * * * * * - I - - - TSC - 55.3 53.8 - - - I - 76.5 64.9 - - - - I - - - VBS - - - - - - I 41.5 - - - - - - I - - - VBW - 26.5 - - - - I 93.4 - - - - - - I - - - VET - 66.3 * * * * I - 97.6 * * * * - I - - - VSP - 86.4 79.7 78.1 64.9 -13.2 I - 0.4 0.3 - - 0.0 - I - 85.3 - 
---------------------------------------------+-------------------------------------------+------------------- Mean 95.0 84.2 89.7 91.6 87.4% +0.4%I 92.1 90.3 93.5 95.0 95.6 84.7% +0.2%I 61.0 76.0 +11.4% ---------------------------------------------+-------------------------------------------+------------------- The number of scanners in test has grown to 22. But as some new products reached very low detection rates, mean detection rates for file and macro virus detection are significantly reduced. Generally, the ability of W-98 scanners to detect file zoo viruses "in the mean" is decreasing to an insufficient level. But those products which also participated in the last test (where the mean detection rate was 91.6) improved their detection rates modestly (+0.4% mean). Concerning macro viruses, "mean" detection rate is significantly reduced (esp. due to several products with very insufficient detection rates), but those products which participated in last test on a still acceptable level (95.6%) further improved their detection rates slightly (+0.2%). Concerning script viruses which is presently the fastest growing sector, the detection rate is significantly improved though still low (76% mean). But those (16) scanners which participated in last test improved their detection rate by impressing 11.4% "in the mean". Findings W98.1: General development of file/macro/script zoo detection rates: ----------------------------------------------------------------------------- For W-98, file and macro zoo virus detection rates are decreasing in the mean. 5 (of 22) scanners detect almost all zoo file viruses (>99.8%), whereas 4 (of 22) scanners detect ALL macro zoo viruses (and 2 more almost all). Detection rates for script viruses are, in the mean, still inacceptably low (76%) but the mean detection rate of those scanners which also participated in last VTC test is significantly improving, and 3 (of 16) scanners detect ALL zoo script viruses. Findings W98.2: Development of ITW file/macro/script virus detection rates: --------------------------------------------------------------------------- 3 AV products (out of 22) detect ALL In-The-Wild file, macro and zoo viruses in ALL instantiations (files): AVK, DSE, SCN 12 more products can be rated "perfect" concerning detection of file and macro viruses but they still fail to detect all script viral files (objects): ADO,AVG,AVP,CMD,DRW,FPR,FPW,FSE,INO,NAV,NVC,PAV Findings W98.3: Assessment of overall (ITW/zoo) detection rates: ---------------------------------------------------------------- No W98 product is overall rated "perfect". 
7 "excellent" overall scanners: AVK,DSE,SCN,ADO,FSE,PAV,DSE 4 "very good" overall scanners: CMD,FPR,FPW,DRW Finding W98.4: Performance of W98 scanners by virus classes: ------------------------------------------------------------ Perfect scanners for file zoo: =NONE= Excellent scanners for file zoo: ADO,AVP,AVK,SCN,FSE,PAV,DSE Perfect scanners for macro zoo: CMD,FPR,FPW,FSE,SCN Perfect scanners for script zoo: DSE,FSE,SCN Perfect scanners for polymorphic set: ADO,AVG,AVK,AVP,DRW,FSE,SCN Perfect scanners for VKit set: =NONE= Excellent scanners for VKit set: ADO,AVK,AVP,DSE,FSE,PAV,SCN Findings W98.5: Detection of packed viral (ITW) objects: -------------------------------------------------------- Detection of packed viral objects needs improvement "Perfect" packed file/macro virus W98 detector: ADO,SCN "Excellent" packed file/macro virus W98 detectors:AVK,AVP,PAV "Very Good" packed file/macro virus W98 detectors: AVG,CMD,DRW,FPR,FPW,NAV "Perfect" for packed file virus only: ADO,AVK,AVP,PAV,SCN "Perfect" for packed macro virus only: ADO,SCN Findings W98.6: Avoidance of Fales Alarms: ------------------------------------------ Avoidance of False-Positive Alarms has significantly improved for many scanners: FP-avoiding perfect W-98 scanners: AV3,AVG,AVK,AVP,DSE,INO,NAV,PRO,RAV,SCN,VSP Findings W98.7: Detection rates for file/macro malware: ------------------------------------------------------- File/Macro Malware detection under W-98 is slowly improving. NO product is "perfect" but 9 products are rated "excellent" (>90% detection rate): FSE,AVP,AVK,ADO,PAV,FPW,FPR,CMD,SCN Concerning only macro malware detection, 1 products is rated "perfect": FSE And concerning macro malware detection only, 10 more products are rated "excellent": CMD,FPW,AVK,AVP,PAV,ADO,NVC,FPR,DSE,SCN Findings W98.8: Detection rates for exotic viruses: --------------------------------------------------- Exotic viruses are detected by scanners under W-98 only to a lesser degree. 
The following products detect at least 70%: FSE,AVK,AVP,ADO,PAV,SCN Grading W98 products according to their detection performance: ============================================================== Test category: "Perfect" "Excellent" ------------------------------------------------------------------ W98 zoo file test: --- ADO,AVP,AVK,SCN,FSE,PAV,DSE W98 zoo macro test: CMD,FPR,FPW,FSE,SCN --- W98 zoo script test: DSE,FSE,SCN --- W98 zoo Poly test: ADO,AVG,AVK,AVP,DRW,FSE,SCN --- W98 zoo VKit test: --- ADO,AVK,AVP,DSE,FSE,PAV,SCN W98 ITW tests: AVK,DSE,SCN ADO,AVG,AVP,CMD,DRW,FPR, FPW,FSE,INO,NAV,NVC,PAV W98 pack-tests: ADO,SCN AVK,AVP,PAV W98 FP avoidance: AV3,AVG,AVK,AVP,DSE, --- INO,NAV,PRO,RAV,SCN,VSP W98 Malware Test: --- FSE,AVP,AVK,ADO,PAV,FPW, FPR,CMD,SCN ----------------------------------------------------------------- ************************************************************** "Perfect" W-98 AntiVirus product: =NONE= "Excellent" W-98 AntiVirus products: 1st place: SCN (14 points) 2nd place: AVK,FSE ( 9 points) 4th place: AVP,DSE ( 8 points) 6th place: ADO ( 7 points) 7th place: AVG ( 5 points) 8th place: PAV ( 4 points) 9th place: CMD,DRW,FPR,FPW,INO,NAV ( 3 points) 15th place: AV3,PRO,RAV,VSP ( 2 points) ************************************************************ "Perfect" W-98 AntiMalware product: =NONE= "Excellent" W-98 AntiMalware products: 1st place: SCN (15 points) 2nd place: AVK,FSE (10 points) 4th place: AVP ( 9 points) 5th place: ADO ( 8 points) 6th place: PAV ( 5 points) 7th place: CMD,FPR,FPW ( 4 points) ************************************************************** 8. Results of on-demand detection under Windows-2000: ===================================================== This is a summary of the essential findings for AV/AM products under W-2k. For details see 7evalw2k.txt. Meant as a perspective of product results, the following table (W2k-A) lists all results of W2k scanners for zoo detection of file, macro and script viruses, in last 9 VTC tests. Moreover, differences ("delta") in resp. detection rates for those products which participated in last 2 tests are also given, and mean values are calculated. 
Table W2k-A: Comparison: File/Macro/Script Virus Detection Rate: ================================================================ Scan I = File Virus = + === Macro Virus === + = Script Virus = ner I Detection I Detection I Detection -----+-----------------+---------------------+-------------------- Test I 0104 Delta I 0008 0104 Delta I 0008 0104 Delta -----+-----------------+---------------------+-------------------- ANT I - - I 93.3 - - I 53.9 - - AV3 I 95.0 - I 94.1 95.7 +1.6 I 15.0 29.1 +14.1 AVG I 81.9 - I 97.9 98.3 +0.4 I 45.8 57.9 +12.1 AVK I 99.8 - I 100.0~ 100.0~ 0.0 I 91.5 99.8 +8.3 AVP I 99.9 - I 100.0~ 100.0~ 0.0 I 88.2 99.8 +11.6 AVX I - - I 99.0 - - I 61.4 - - CLE I - - I - - - I 4.2 - - CMD I 97.8 - I 100.0% 100.0% 0.0 I 93.5 96.9 +3.4 DRW I - - I 97.5 - - I 59.8 - - FPR I 97.8 - I - 100.0% - I - 96.9 - FPW I 97.8 - I 100.0% 100.0% 0.0 I 90.8 96.9 +6.1 FSE I - - I 100.0% 100.0% 0.0 I 96.7 100% +3.3 INO I 97.9 - I 99.8 99.7 -0.1 I 78.1 93.1 +15.0 NAV I 93.9 - I 97.7 97.0 -0.7 I 36.6 54.5 +17.9 NVC I 98.1 - I 99.9 99.8 -0.1 I 83.7 88.5 +4.8 PAV I 97.5 - I 100.0~ 99.4 -0.6 I 90.2 98.5 +8.3 PER I - - I 85.0 68.2 -16.8 I 0.0 22.0 +22.0 PRO I 70.6 - I 69.1 67.1 -2.0 I 12.1 40.7 +28.6 QHL I - - I 0.0 - - I 6.9 - - RAV I 93.5 - I 96.9 99.6 3.3 I 47.1 84.9 +37.8 SCN I 89.0 - I 100.0% 100.0% 0.0 I 95.8 100% +4.2 VSP I - - I - 0.0 - I - 85.3 - -----+-----------------+---------------------+--------------------- Mean : 97.6% - I 99.9% 89.7% -1.0 I 57.6 79.4 +13.2% -----+-----------------+---------------------+--------------------- Generally, the ability of W2k scanners to detect file zoo viruses "in the mean" is on an acceptable level. Concerning macro viruses, "mean" detection rate is significantly reduced to an inacceptably low level (<90%) as almost all scanners detected less (mean -1.0) macro viruses than in last test (2000-08). But 5 scanners (CMD,FPR,FPW,FSE,SCN) detected all macro viruses. Concerning script viruses which is presently the fastest growing sector, detection rate is still very low (79.4% mean) but those (15) products which also participated in last VTC test have improved their detection rates by impressing figures (+13.2% mean). And now, 2 scanners (FSE,SCN) detect all script viruses. Findings W2k.1: General development of file/macro/script zoo detection rates: ----------------------------------------------------------------------------- For W-2000, file detection rates are acceptable but macro and script zoo virus detection rates need essential improvement. No scanner is rated perfect overall as NO scanner detects ALL file viruses. But 5 (of 17) scanners detect ALL zoo macro viruses: CMD,FPR,FPW,FSE,SCN And 2 (out of 17) scanners detect ALL zoo script viruses: FSE,SCN Findings W2k.2: Development of ITW file/macro/script virus detection rates: --------------------------------------------------------------------------- 5 AV products (out of 17) detect ALL In-The-Wild file, macro and zoo viruses in ALL instantiations (files): AVK,AVP,INO,PAV,SCN And 12 products can be rated "perfect" concerning detection of file and macro viruses but they still fail to detect all script viral files (objects): AVG,AVK,AVP,CMD,FPR,FPW,FSE,INO,NAV,NVC,PAV,SCN Findings W2k.3: Assessment of overall (ITW/zoo) detection rates: ---------------------------------------------------------------- No W2k product is overall rated "perfect". 
2 "excellent" overall scanners: AVK,AVP 4 "very good" overall scanners: PAV,CMD,FPR,FPW Findings W2k.4: Performance of W2k scanners by virus classes: ------------------------------------------------------------ Perfect scanners for file zoo: =NONE= Excellent scanners for file zoo: AVP,AVK Perfect scanners for macro zoo: CMD,FPR,FPW,FSE,SCN Perfect scanners for script zoo: FSE,SCN Perfect scanners for polymorphic set: AVG,AVK,AVP,FSE,PAV Perfect scanners for VKit set: =NONE= Excellent scanners for VKit set: AVK,AVP,FSE,PAV,SCN Findings W2k.5: Detection of packed viral (ITW) objects: -------------------------------------------------------- Detection of packed viral objects needs improvement Perfect packed file/macro virus WNT detector: AVK,AVP,PAV,SCN Perfect packed file macro detector: AVK,AVP,FSE,PAV,SCN Perfect packed macro virus detector: AVK,AVP,PAV,SCN Findings W2k.6: Avoidance of Fales Alarms: ------------------------------------------ Avoidance of False-Positive Alarms is improving though still regarded insufficient. FP-avoiding perfect W-2k scanners: AV3,AVG,AVK,INO,PRO,RAV,SCN,VSP Findings W2k.7: Detection rates for file/macro malware: ------------------------------------------------------- File/Macro Malware detection under W2k is slowly improving. NO product is "perfect" but 7 products are rated "excellent" (>90% detection rate): FSE,AVP,AVK,CMD,FPR,FPW,PAV *************************************************** Concerning only macro malware detection, 2 products are rated "perfect": FSE,SCN And concerning macro malware detection only, 9 more products are rated "excellent": CMD,FPR,FPW,AVK,AVP,PAV,NVC,INO,RAV Findings W2k.8: Detection rates for exotic viruses: --------------------------------------------------- Exotic viruses are detected by scanners under W2k only to a lesser degree. The following products detect at least 70%: FSE,AVP,AVK,PAV,SCN Grading W2k products according to their detection performance: ============================================================== Test category: "Perfect" "Excellent" -------------------------------------------------------------- W2k zoo file test: --- AVP,AVK W2k zoo macro test: CMD,FPR,FPW,FSE,SCN --- W2k zoo script test: FSE,SCN --- W2k zoo Poly test: AVG,AVK,AVP,FSE,PAV --- W2k zoo VKit test: --- AVK,AVP,FSE,PAV,SCN W2k ITW tests: AVK,AVP,INO,PAV,SCN --- W2k pack-tests: AVK,AVP,PAV,SCN --- W2k FP avoidance: AV3,AVG,AVK,INO,PRO, --- RAV,SCN,VSP W2k Malware Test: --- FSE,AVP,AVK,PAV,SCN --------------------------------------------------------------- ************************************************************** "Perfect" W-2000 AntiVirus product: =NONE= "Excellent" W-2000 AV products: 1st place: SCN (11 points) 2nd place: AVK (10 points) 3rd place: AVP ( 8 points) 4th place: FSE,PAV ( 7 points) 6th place: AVG,INO ( 4 points) 8th place: AV3,CMD,FPR,FPW,PRO,RAV,VSP ( 2 point) ************************************************************ "Perfect" W-2000 AntiMalware product: =NONE= "Excellent" W-2000 AntiMalware product: 1st place: SCN (12 points) 2nd place: AVK (11 points) 3rd place: AVP ( 9 points) 4th place: FSE,PAV ( 8 points) ************************************************************** 9. Comparison of detection results under Windows-32 platforms: ============================================================== This is a summary of the comparison of AV/AM products under different W32 platforms (W-NT, W-98, W-2k). For details see 7evalw32.txt. 
9. Comparison of detection results under Windows-32 platforms:
==============================================================

This is a summary of the comparison of AV/AM products under
different W32 platforms (W-NT, W-98, W-2k). For details see
7evalw32.txt.

With the fast deployment of new versions of Microsoft Windows-32
(in the past 5 years from W-NT to W-95, W-98, W-2000 and soon W-XP),
both customers needing protection and producers of security-enhancing
software (esp. AntiVirus and AntiMalware) can only cope with the pace
when they essentially re-use engines prepared for previous W32
platforms and simply "adapt" them to the intrinsics of the new
platforms. Otherwise, "rewriting" the resp. software would consume
too much time and effort, and customers would receive "adapted"
products only with some delay.

AV/AM testers cannot determine the characteristics of the algorithms
in scanning engines, either for legal reasons (most copyright laws
prohibit reverse-engineering of proprietary code, except for specific
purposes such as collecting evidence for a court case or teaching
related techniques, as in the Hamburg university IT Security
curriculum) or because of the sheer complexity of the related code
(and, in many cases, insufficient professional knowledge of testers).
It is therefore worthwhile to analyse whether those AV/AM products
which are available in versions for all W32 platforms behave EQUALLY
concerning detection and identification of viral and malicious code.

Test Hypothesis: "W32-harmonical behaviour" of W32 products:
============================================================
We assume that those products which participate for all W32 platforms
(WNT, W98 and W2k) shall yield IDENTICAL results for ALL categories.
We call product behaviour following this hypothesis "W32-harmonical".

Finding W32.1: Equality of results for all W32 platforms:
---------------------------------------------------------
Few W-32 scanners perform equally on W-NT/W-98/W-2k in ALL categories
and can be called "W32-harmonical". When looking at specific
categories only, about half of the products can be regarded as
"W32-harmonical" for macro and script viruses and malware.
---------------------------------------------------------
For ALL categories, the following *2* W32 scanners (of 17) always
yield identical results:
     AVK,FPW
---------------------------------------------------------
The following *4* W32 scanners yield identical results for all
file (zoo,ITW) viruses/malware:
     AVK,FPR,FPW,NAV
The following *8* W32 scanners yield identical results for all
macro (zoo,ITW) viruses/malware:
     AVG,AVK,CMD,FPR,FPW,NAV,NVC,PRO
The following *8* products yield identical results for all
script (zoo,ITW) viruses/malware:
     AVK,CMD,FPW,FSE,NVC,PER,SCN,VSP
---------------------------------------------------------

The following grid is used to grade W32 products concerning their
ability for overall identical detection on all W32 platforms:
  A "perfect" product will yield identical results for all 3
    categories (file, macro, script virus/malware).
  An "excellent" product will yield identical results for 2
    categories.

Grading W32 products according to identical detection rates:
============================================================
Test category:                "Perfect"        "Excellent"
-----------------------------------------------------------
W32 identical detection        AVK,FPW             ---
===========================================================

************************************************************
 "Perfect" W32-harmonical AntiVirus products:
        1st place:  AVK,FPW                      ( 2 points)
************************************************************
 "Perfect" W32-harmonical AntiMalware products:
        1st place:  AVK,FPW                      ( 2 points)
************************************************************
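Remark: the "W32-harmonical" criterion above can be stated compactly:
a product is W32-harmonical in a category (file, macro, script) if
its results are identical on every W32 platform tested, and
W32-harmonical overall if that holds for all three categories. The
following minimal Python sketch illustrates the check; the two
products and their rates are hypothetical, not VTC results.

    # Minimal sketch of the "W32-harmonical" check (hypothetical
    # detection rates, not VTC results).

    PLATFORMS = ("WNT", "W98", "W2k")
    CATEGORIES = ("file", "macro", "script")

    # results[product][platform][category] = detection rate in %
    results = {
        "XYZ": {"WNT": {"file": 97.8, "macro": 100.0, "script": 96.9},
                "W98": {"file": 97.8, "macro": 100.0, "script": 96.9},
                "W2k": {"file": 97.8, "macro": 100.0, "script": 96.9}},
        "ABC": {"WNT": {"file": 99.1, "macro": 99.8, "script": 88.0},
                "W98": {"file": 98.7, "macro": 99.8, "script": 88.0},
                "W2k": {"file": 99.1, "macro": 99.8, "script": 87.5}},
    }

    def harmonical_categories(per_platform):
        """Categories with identical results on every W32 platform."""
        return [c for c in CATEGORIES
                if len({per_platform[p][c] for p in PLATFORMS}) == 1]

    for name, per_platform in sorted(results.items()):
        cats = harmonical_categories(per_platform)
        if len(cats) == len(CATEGORIES):
            verdict = "W32-harmonical"
        else:
            verdict = "harmonical for: " + (", ".join(cats) or "none")
        print("%-4s %s" % (name, verdict))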
10. Results of on-demand detection under Linux(SuSe):
=====================================================

This is a summary of the essential findings for AV/AM products under
Linux. For details see 7evallin.txt.

To give a perspective of product results, the following table (LIN-A)
lists all results of Linux scanners for zoo detection of file, macro
and script viruses in VTC test 2001-04, the first VTC test to include
this platform; mean values are calculated. (The "delta" columns,
which would give differences to the resp. detection rates in a
previous test, therefore remain empty.)

Table LIN-A: Performance of LINUX scanners in Test 2001-04:
============================================================
Scan I = File Virus =   + = Macro Virus =  + = Script Virus =
ner  I   Detection      I   Detection      I   Detection
-----+------------------+------------------+-----------------
Test I  0104    Delta   I  0104    Delta   I  0104    Delta
-----+------------------+------------------+-----------------
AVP  I  99.9      -     I  100~      -     I  99.8      -
CMD  I  97.8      -     I  100%      -     I  96.9      -
FSE  I  97.1      -     I  100~      -     I  96.9      -
RAV  I  93.5      -     I  99.6      -     I  84.9      -
SCN  I  99.7      -     I  100%      -     I  99.8      -
-----+------------------+------------------+-----------------
Mean :  97.6%     -     I  99.9%     -     I  95.7%     -
-----+------------------+------------------+-----------------

Findings LIN.1: General development of file/macro/script zoo
                detection rates:
-----------------------------------------------------------------------------
For the few (5) Linux scanners, mean detection rates for file, macro
and script (zoo) viruses (though statistically less relevant) were
significantly better than those of the products for other platforms.
No scanner is rated "perfect" (100% detection) for file and script
viruses, but 2 (of 5) scanners are "perfect" concerning macro virus
detection:
     CMD,SCN

Findings LIN.2: Development of ITW file/macro/script virus detection rates:
---------------------------------------------------------------------------
No AV product detects "perfectly" all ITW file, macro and script
viruses in all files, but 2 scanners detect 100% of the viruses and
files, though not always reliably:
     AVP,SCN

Findings LIN.3: Assessment of overall (ITW/zoo) detection rates:
----------------------------------------------------------------
No LINUX product is overall rated "perfect".
2 "excellent" overall scanners:   AVP,SCN

Findings LIN.4: Performance of LIN scanners by virus classes:
-------------------------------------------------------------
Perfect scanners for file zoo:          =NONE=
Excellent scanners for file zoo:        AVP,SCN
Perfect scanners for macro zoo:         CMD,SCN
Perfect scanners for script zoo:        =NONE=
Excellent scanners for script zoo:      AVP,SCN
Perfect scanners for polymorphic set:   AVP
Perfect scanners for VKit set:          =NONE=
Excellent scanners for VKit set:        FSE,SCN

Findings LIN.5: Detection of packed viral (ITW) objects:
--------------------------------------------------------
Detection of packed viral objects needs improvement.
Perfect packed file/macro virus LINUX detector:  CMD,SCN
Perfect packed file virus detector:              AVP,CMD,SCN
Perfect packed macro virus detector:             CMD,SCN

Findings LIN.6: Avoidance of False Alarms:
------------------------------------------
Avoidance of False-Positive Alarms is insufficient and needs
improvement.
FP-avoiding perfect LINUX scanners:  AVP,RAV,SCN

Findings LIN.7: Detection rates for file/macro malware:
-------------------------------------------------------
NO LINUX product is "perfect", but 2 products are rated "excellent"
(>90% detection rate):
     AVP,CMD

Findings LIN.8: Detection rates for exotic viruses:
---------------------------------------------------
Exotic viruses are detected by scanners under LINUX only to an
insufficient degree. Only 1 product detects at least 70%:
     AVP

Grading LIN products according to their detection performance:
==============================================================
Test category:          "Perfect"      "Excellent"
---------------------------------------------------------------
LINUX zoo file test:     ---            AVP,SCN
LINUX zoo macro test:    CMD,SCN        ---
LINUX zoo script test:   ---            AVP,SCN
LINUX zoo Poly test:     AVP            ---
LINUX zoo VKit test:     ---            FSE,SCN
LINUX ITW tests:         ---            AVP,SCN
LINUX pack-tests:        CMD,SCN        ---
LINUX FP avoidance:      AVP,RAV,SCN    ---
LINUX Malware Test:      ---            ---
----------------------------------------------------------------

************************************************************
 "Perfect" LINUX AntiVirus product:      =NONE=
 "Excellent" LINUX AV products:
        1st place:  SCN                          (10 points)
        2nd place:  AVP                          ( 7 points)
        3rd place:  CMD                          ( 4 points)
        4th place:  RAV                          ( 2 points)
        5th place:  FSE                          ( 1 point)
************************************************************
 "Perfect" LINUX AntiMalware product:    =NONE=
 "Excellent" LINUX AntiMalware product:  =NONE=
************************************************************

11. Final remark: In search of the "Perfect AV/AM product":
===========================================================

This test includes 3 groups of platforms whose engines are hardly
comparable, namely DOS (16-bit engines), WNT/W98/W2k (comparable
32-bit engines) and LINUX. Moreover, several manufacturers submitted
products only for specific platforms.

*********************************************************
 In general, there is NO AntiVirus nor AntiMalware product
 which can be rated "PERFECT" in all categories for all
 different testbeds.
*********************************************************
***********************************************************
 But when differentiating for categories, there are several
 products which can be rated either "perfect" or "excellent".
************************************************************

Instead of calculating an overall value (e.g.
the sum of points yielded divided by the number of products in test
for a given platform), the following TABLES list the products by
place, sorted by assigned points:

Table SUM-AV: Survey of Results for AntiVirus Products:
-------------------------------------------------------
 ================== AntiVirus Products ====================
           DOS (13)  WNT (18)  W98 (24)  W2k (17)  W32 (17) LINUX(5)
 ----------------------------------------------------------
1st place: SCN (13)  SCN (13)  SCN (14)  SCN (11)  AVK ( 2) SCN (10)
         : AVK ( 9)  FSE ( 9)  AVK ( 9)  AVK (10)  FPW ( 2) AVP ( 7)
         : PAV ( 9)  AVK ( 7)  FSE ( 9)  AVP ( 8)           CMD ( 4)
         : AVP ( 7)  AVG ( 5)  AVP ( 8)  FSE ( 7)           RAV ( 2)
         : AVG ( 5)  AVP ( 5)  DSE ( 8)  PAV ( 7)           FSE ( 1)
         : FPR ( 3)  CMD ( 3)  ADO ( 7)  AVG ( 4)
         : DRW ( 3)  FPR ( 3)  AVG ( 5)  INO ( 4)
         : AVA ( 2)  FPW ( 3)  PAV ( 4)  AV3 ( 2)
         : INO ( 2)  INO ( 3)  CMD ( 3)  CMD ( 2)
         : NVC ( 1)  PAV ( 3)  DRW ( 3)  FPR ( 2)
         :           AV3 ( 2)  FPR ( 3)  FPW ( 2)
         :           PRO ( 2)  FPW ( 3)  PRO ( 2)
         :           RAV ( 2)  INO ( 3)  RAV ( 2)
         :                     NAV ( 3)  VSP ( 2)
         :                     AV3 ( 2)
         :                     PRO ( 2)
         :                     RAV ( 2)
         :                     VSP ( 2)
 -----------------------------------------------------------
Remark: Numbers in the platform headings give the number of products
        in test for that platform. Numbers behind the products give
        the points assigned for that platform.

Table SUM-AM: Survey of Results for AntiMalware Products:
---------------------------------------------------------
 ================ AntiMalware Products =================
           DOS (13)  WNT (18)  W98 (24)  W2k (17)  W32 (17) LINUX(5)
 -------------------------------------------------------
1st place: SCN (14)  SCN (14)  SCN (15)  SCN (12)  AVK ( 2)   ---
         : AVK (10)  FSE (10)  AVK (10)  AVK (11)  FPW ( 2)
         : PAV (10)  AVK ( 8)  FSE (10)  AVP ( 9)
         : AVP ( 8)  AVP ( 6)  AVP ( 9)  FSE ( 8)
         : FPR ( 4)  CMD ( 4)  ADO ( 8)  PAV ( 8)
         :           FPR ( 4)  PAV ( 5)
         :           FPW ( 4)  CMD ( 4)
         :           PAV ( 4)  FPR ( 4)
         :                     FPW ( 4)
 -----------------------------------------------------------
Remark: Numbers in the platform headings give the number of products
        in test for that platform. Numbers behind the products give
        the points assigned for that platform.

Generally, we hope that these rather detailed results help AV
producers to adapt their products to growing threats and thus to
protect their customers.

12. Availability of full test results:
======================================

Much more information about this test, its methods and viral
databases, as well as detailed test results, is available for
anonymous FTP download from VTC's homepage (VTC is part of Working
Group AGN):
     ftp://agn-www.informatik.uni-hamburg.de/vtc
Any comment and critical remark which helps VTC to improve its test
methods will be warmly welcomed.

Further to this test, we follow suggestions of AV producers to test
the heuristic detection ability of scanners for those viruses which
were first detected only after product submission. In this
"pro-active test", we will test the heuristic detection quality of
products submitted for W-NT, for macro and script viruses detected
between November 2000 and January 2001 as well as between February
2001 and April 2001.

The next comparative test will evaluate macro (VBA/VBA5) and script
virus detection; this test is planned for June to September 2001,
with viral databases frozen on April 30, 2001. Any AV producer
wishing to participate in the forthcoming test is invited to submit
related products.

On behalf of the VTC Test Crew:
Dr. Klaus Brunnstein (May 31, 2001)
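Remark: the results directory can be retrieved with any standard FTP
client. As an illustration, the following minimal Python sketch
lists the directory via anonymous FTP using the standard ftplib
module; it assumes the server still accepts anonymous logins and
takes host and directory from the URL given in section 12.

    # Minimal sketch: listing the VTC results directory via
    # anonymous FTP (host and directory from the URL above;
    # assumes anonymous logins are still accepted).

    from ftplib import FTP

    with FTP("agn-www.informatik.uni-hamburg.de") as ftp:
        ftp.login()                # anonymous login
        ftp.cwd("/vtc")            # VTC test results directory
        for name in ftp.nlst():    # list available entries
            print(name)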
13. Copyright, License, and Disclaimer:
=======================================

This publication is (C) Copyright 2001 by Klaus Brunnstein and the
Virus Test Center (VTC) at University of Hamburg, Germany.

Permission (Copy-Left) is granted to everybody to distribute copies
of this information in electronic form, provided that this is done
for free, that the contents of the information are not changed in
any way, and that the origin of this information is explicitly
mentioned. It is esp. permitted to store and distribute this set of
text files at university or other public mirror sites where
security/safety related information is stored for unrestricted
public access for free.

Any other use, esp. including distribution of these text files on
CD-ROMs or any publication in whole or in part, is ONLY permitted
after contact with the supervisor, Prof. Dr. Klaus Brunnstein, or
authorized members of the Virus Test Center at Hamburg University,
and such agreement must be given explicitly in writing, prior to any
publication.

No responsibility is assumed by the author(s) for any injury and/or
damage to persons or property as a matter of products liability,
negligence or otherwise, or from any use or operation of any
methods, products, instructions or ideas contained in the material
herein.

Prof. Dr. Klaus Brunnstein
University of Hamburg, Germany (May 31, 2001)