============================================================
 File 0XECSUM.TXT: "EXECUTIVE SUMMARY"
 VTC University of Hamburg AntiMalware Product Test "2000-04"
============================================================
[Formatted with non-proportional font (Courier), 72 columns]

Content of this text:
=====================
 1. Background of this test; Malware Threats
 2. VTC Testbeds used in VTC test "2000-04"
----------------------------------------------------------
 3. Summary #1: Development of DOS AV product detection rates:
    Result #1:  For DOS, both file and macro virus detection rates
                improved significantly. Several scanners detect more
                than 99% of zoo file and macro viruses but no scanner
                is graded "perfect".
-----------------------------------------------------------
 4. Summary #2: Performance of DOS AV products on ZOO testbeds,
    including Polymorphic and VKIT virus detection:
    Result #2:  Quality of best DOS scanners not yet perfect.
                Excellent DOS scanners: AVP and FPR (as before) and
                SCN, FSE, CMD and NVC (last test: "very good").
----------------------------------------------------------
 5. Summary #3: Performance of DOS scanners on ITW testbeds:
    Result #3:  High ITW detection rates and implied risks:
                11 products "perfect" for ITW viruses under DOS:
                AVP,CMD,DRW,FPR,FSE,INO,NOD,NVC,PAV,SCN,SWP.
----------------------------------------------------------
 6. Summary #4: Performance of DOS scanners by virus classes:
    Result #4:  Performance of DOS scanners by virus classes:
                Perfect scanners for macro zoo: CMD, FPR, SCN
                Perfect scanners for boot zoo:  AVP, NOD
                Perfect scanners for Polymorphic virus set:
                                                AVP, FSE, NAV, NOD, PAV.
                NO perfect scanner for file and VKit zoo viruses.
---------------------------------------------------------
 7. Summary #5: Detection of packed viral objects needs improvement:
    Result #5:  Detection of packed viral objects needs improvement
                Perfect packed file/macro virus DOS detector: FSE,PAV,SCN
                Packed macro virus only:
                   "Very Good" detector: NOD
                   "Good" detectors:     CMD, FPR
----------------------------------------------------------
 8. Summary #6: False Positive avoidance (DOS/Win-NT):
    Result #6:  Avoidance of False-Positive Alarms is insufficient.
                FP-avoiding perfect DOS scanners:  AVA, PAV, SCN, SWP
                FP-avoiding perfect W-NT scanners: AVA, AVG, AVK, SCN, SWP
----------------------------------------------------------
 9. Summary #7: Detection of File/Macro Malware (DOS/Win-NT):
    Result #7:  AntiMalware detection under DOS/W-NT improving
                No "perfect" file/macro malware detector
                "Excellent" file/macro malware detectors:
                FPR/FPW, SCN, CMD, FSE and PAV
----------------------------------------------------------
10. Summary #8: File/Macro virus detection under Win-NT:
    Result #8:  Virus detection rates under W-NT on high level
                1 "perfect" Windows-NT zoo scanner: FSE
                9 "excellent" scanners: ATD, PAV, AVP, SCN, AVK, CMD,
                                        FPW, SWP and NVC
                4 Macro-only "perfect" products: CMD,FPW,FSE,SCN
----------------------------------------------------------
11. Summary #9: File/Macro Virus detection under 32-bit engines:
    Result #9:  Several W-32 scanners perform equally on W-98/W-NT
----------------------------------------------------------
12. Summary #10: Malware detection under Windows 98/NT:
    Result #10: AntiMalware quality of AV products is developing:
                NO "perfect" AM product for W-98 and W-NT for file and
                macro malware detection BUT 7 scanners with grade
                "excellent" (>90%): AVK, AVP, CMD, FPR, FSE, PAV and SCN.
                Concerning macro malware detection under W-98/W-NT:
                3 products are "perfect":    CMD, FPR and FSE
                12 products are "excellent": SCN, ATD, AVK, PAV, INO,
                FWN, AVP, NOD, NVC, RAV, SWP and AVX
**********************************************************
13. Conclusion: Searching for the "Perfect AV/AM product":
    Result #11: Best AV products in test: FSE and SCN
                Best AM products in test: SCN
**********************************************************
14. Availability of full test results
15. Copyright, License, and Disclaimer

Tables:
=======
Table ES0: Development of viral/malware threats
Table ES1: Content of VTC test databases in test 2000-04
Table ES2: List of AV products in test 2000-04
Table ES3: Development of DOS scanners from 1997-02 to 2000-04
Table ES4: Development of W-NT scanners from 1997-07 to 2000-04


1. Background of this test: Malware Threats:
============================================

Malicious software (malware), including viruses (=self-replicating
malware), trojan horses (=pure payload without self-replication),
virus droppers and network malware (e.g. worms and hostile applets),
is regarded as a serious threat to PC users, esp. when connected to
Intranetworks and Internet.

The development of malicious software can be studied well from the
growth of the VTC testbeds. The following table summarizes, for
previous and current VTC tests (indicated by their year and month of
publication), the size of the virus and malware (full = "zoo")
databases, indicating for each the number of different viruses and
the number of instantiations of a virus or malware, and bearing in
mind that some revisions of the testbeds were made:

Table ES0: Development of threats as present in VTC test databases:
===================================================================
          = File viruses =   = Boot Viruses =  = Macro Viruses =  == Malware ==
Test#     Number   Infected  Number  Infected  Number  Infected   Number Malware
          viruses  objects   viruses objects   viruses objects     file   macro
-----------------------------------------------------------------------
1997-07:  12,826    83,910     938    3,387      617    2,036       213     72
1998-03:  14,596   106,470   1,071    4,464    1,548    4,436       323    459
1998-10:  13,993   112,038     881    4,804    2,159    9,033     3,300    191
1999-03:  17,148   128,534   1,197    4,746    2,875    7,765     3,853    200
             + 5   146,640   (VKIT+4*Poly)
1999-09:  17,561   132,576   1,237    5,286    3,546    9,731     6,217    329
             + 7   166,640   (VKit+6*Poly)
2000-04:  18,359   135,907   1,237    5,379    4,525   12,918     6,639    260
             + 7   166,640   (VKit+6*Poly)
-----------------------------------------------------------------------

Remark: Before test 1998-10, an ad-hoc cleaning operation was applied
to remove samples where virality could not be proved easily. Since
test 1999-03, separate tests are performed to evaluate detection
rates of VKIT-generated and selected polymorphic file viruses.

With more than 5,000 new viruses and about 1,000 Trojan horses
appearing per year, many of which are available from Internet, and in
the absence of inherent protection against such dysfunctional
software, users must rely on AntiMalware and esp. AntiVirus software
to detect and eradicate - where possible - such malicious software.
Hence, the detection quality of AntiMalware and esp. AntiVirus
products becomes an essential prerequisite of protecting customer
productivity and data.

Virus Test Center (VTC) at Hamburg University's Faculty for
Informatics performs regular tests of AntiMalware and esp. AntiVirus
software. VTC recently tested current versions of on-demand scanners
for their ability to identify PC viruses.
Tests were performed on VTCs malware databases, which were frozen in
their status as of *** October 31, 1999 *** to give AV/AM producers a
fair chance to support updates within the 8-week submission period.
The main test goal was to determine detection rates, reliability
(=consistency) of malware identification and reliability of detection
rates for submitted or publicly available scanners. Special tests
were devoted to detection of multiple generations of 6 polymorphic
file viruses (Maltese.Amoeba, Mte.Encroacher.B, Natas, Tremor,
Tequila and One-Half) and of viruses generated with the "VKIT" file
virus generator. It was also tested whether viruses packed with 4
popular compression methods (PKZIP, ARJ, LHA and RAR) would be
detected (and to what degree) by scanners. Moreover, avoidance of
False Positive alarms on "clean" (=non-viral and non-malicious)
objects was also determined. Finally, a set of selected non-viral
file and macro malware (droppers, Trojan horses, intended viruses
etc) was used to determine whether and to what degree AntiVirus
products may be used for protecting customers against Trojan horses
and other forms of malware.

VTC maintains, in close and secure connection with AV experts
worldwide, collections of boot, file and macro viruses as well as
related malware ("zoo") which have been reported to VTC or AV labs.
Moreover, following the list of "In-The-Wild Viruses" (published on a
regular basis by Wildlist.org), a collection of viruses reported to
be broadly visible is maintained to allow for comparison with other
tests; presently, this list does not report ITW Malware.


2. VTC Testbeds used in VTC test "2000-04":
===========================================

The current sizes of the VTC testbeds (developed from the previous
testbeds through inclusion of new viruses and malware and some
revision) are given in the following table:

Table ES1: Content of VTC test databases:
=================================================================
"Full Zoo": 18,357 File Viruses in 135,907 infected files,
            60,000 Instantiations of 6 polymorphic file viruses
            10,706 Variations of file viruses generated with VKIT
             4,282 different File Malware in 6,639 file objects
            31,851 Clean Files used for False Positive Detection
             1,237 System (Boot etc) Viruses in 5,379 infected images,
             4,525 Macro Viruses in 12,918 infected documents,
               260 different Macro Malware in 394 macro objects
               329 Clean macro objects used for False Positive test
-----------------------------------------------------
"ITW Zoo":      39 File Viruses in 1,047 infected files,
                33 System Viruses in 423 infected images, and
                80 Macro Viruses in 672 infected documents
==================================================================

Remark: The organisation which collects worldwide information on
viruses "in-the-Wild" reorganized its "in-the-Wild" database in early
1999; consequently, the number of ITW viruses is significantly
smaller than in previous tests.

(For detailed indices of VTC testbeds, see file "a3testbed.zip")

Concerning the quality of viral testbeds, it is sometimes difficult
to assess the "virality" (=ability of a given sample to replicate at
least twice under given constraints) of large "viral zoo" databases,
esp. as some viruses work under very specific conditions. We are glad
to report that colleagues such as Dr. Vesselin Bontchev, Fridrik
Skulason, Igor Muttik and Righard Zwienenberg (to name only some)
helped us with critical and constructive comments to establish viral
testbeds, the residual non-viral part of which should be very small.
We also wish to thank the "WildList Organisation" for supporting us
with their set of In-The-Wild viruses; the related results may
support users in comparing VTC tests with other ITW-only tests.

For test "2000-04", the following *** 28 *** AntiVirus products
(addressed in subsequent tables by a 3-letter abbreviation) under
DOS, Windows-98 and Windows-NT were tested:

Table ES2: List of AV products in test "2000-04"
================================================
Abbreviation/Product/Version              Tested under Platform
----------------------------------------------------------
ANT = AntiVir 5.21 (VDF 27.11.1999)       for DOS, W-98, W-NT
ATD = AntiDote 1.50 (Vintage Sol)         for W-98, W-NT
AVA = AVAST v7.70, AVAST v3.0             for DOS, W-98, W-NT
AVK = AVK 9 (November 30,1999)            for W-98, W-NT
AVP = AVP v3.0                            for W-98, W-NT
CLE = Cleaner 3.0                         for W-98, W-NT
CMD = Command Software AV 4.58            for DOS, W-98, W-NT
DRW = DrWeb 4.14                          for DOS, W-98, W-NT
DSE = Dr Solomon Emergency AV 4.03        for W-98
ESA = eSafe 2.1 (Aladdin)                 for W-98, W-NT
FPR = F-PROT 3.06c                        for DOS, W-98
FPW = F-PROT for Windows 3.05             for W-98, W-NT
FSE = FSAV v3.0 and 4.06                  for DOS, W-98, W-NT
FWN = FWin32                              for W-98, W-NT
INO = Inoculan 4.53 (11/24/99)            for DOS, W-98, W-NT
MKS = MKS-vir (Polonia)                   for W-98, W-NT
NAV = NAV 1.0, 5.01.01 (Nov.1999)         for DOS, W-98, W-NT
NOD = NOD 32 V.129                        for DOS, W-98, W-NT
NVC = NVC 4.73                            for DOS, W-98, W-NT
PAV = PAV v3.0 (November 30,1999)         for DOS, W-98, W-NT
PER = PER AntiVirus (Peru)                for W-98
PRO = Protector 7.7 B05                   for W-98, W-NT
QHL = Quickheal Version 5.21              for W-98, W-NT
RAV = RAV 7.6                             for W-98, W-NT
SCN = NAI VirusScan 4.50, 4.03            for DOS, W-98, W-NT
SWP = Sweep 3.28                          for DOS, W-98, W-NT
VIT = VirIT Explorer Lite 2.6.26          for DOS, W-98
VSP = VirScanPlus 11.90.04                for DOS, W-98, W-NT
------------------------------------------------------------

For details of AV products, including the options used to determine
optimum detection rates, see A3SCNLS.TXT. For scanners where results
are missing, see 8problms.txt.

In general, AV products were either submitted or, when test versions
were available on Internet, downloaded from the respective ftp/http
sites. A few scanners were not available, either in general (e.g.
TNT) or for this test; some of these were announced to participate in
some future test. Finally, a very few AV producers answered VTCs
request to submit scanners with electronic silence.

Concerning often asked questions: some AV producers deliberately
don't submit their products. This applies esp. to TrendMicro, whose
representative asked, after a less acceptable test result in some
previous test, to delay participation until a "new engine" promising
better results is available. As TrendMicro has not submitted any
product, we no longer ask for their submission. Other AV companies
have been deliberately excluded due to malpractice. This applies to
Panda, which made its participation dependent upon receiving VTCs
whole database, which is incompatible with VTC rules. Moreover, MR2
(formerly TSC) has been excluded after its producer (Mr. Marx)
accused VTC of "bad testing" although his own product showed severe
problems in accessing viral samples for detection.

The following paragraphs survey essential findings in comparison with
previous VTC tests (performance over time), as well as some relative
"grading" of scanners for detection of file and macro viruses, both
in full "zoo" and "In-The-Wild" testbeds, and of file and macro
malware, as well as detection of ITW file and macro viruses in
objects packed with ARJ, LHA, ZIP and RAR.
Finally, the ability of AV products to avoid False Positive alarms is
also analysed.

Detailed results, including precision and reliability of virus
identification and results for boot/MBR infectors, are described in
the overview table "6a-sumov.txt" and the related tables for DOS
(boot+file+macro), Win-98 and Win-NT (file+macro) detection. An
additional test was performed to measure the ability of AV products
to detect all polymorphic instantiations of 6 selected polymorphic
file viruses, and one more test was devoted to detecting any of the
10,706 file virus variations generated with the virus development kit
"VKIT" (first detected in fall 1998). A rather detailed evaluation of
test results including progress and shortcomings is given in
7EVAL.TXT.


3. Summary #1: Development of DOS scanner detection rates:
==========================================================

Concerning performance of DOS scanners, a comparison of virus
detection results in tests from "1997-02" until the present test
"2000-04" shows how scanners behave and how manufacturers succeed in
adapting their products to the growing threat of new viruses. The
following table lists the development of detection rates of scanners
(most current versions in each test), and it calculates changes ("+"
indicating improvement) in detection rates. For reasons of fairness,
it must be noted that improvement is much more difficult to achieve
for those products which have already reached a very high level of
detection and quality (say: more than 95%) than for those products
which reached lower detection rates. Some products have incorporated
new engines (esp. for 32-bit platforms) and included formerly
separate scanners (e.g. for macro viruses), which leads to improved
performance. Generally, changes on the order of about +-1.5% are less
significant, as this is about the growth rate of new viruses per
month, so detection depends strongly upon whether some virus is
reported (and analysed and included) just before a new update is
delivered.

Table ES3 lists developments for detection of file and macro viruses;
for details as well as for boot virus detection, see the result
tables (6b-6g) as well as the overview (6asumov.txt) and the
evaluation (7eval.txt).
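As a purely illustrative reading aid for the DELTA columns of the
tables that follow (this sketch is not part of the VTC test tooling,
and the example figures are illustrative only): the change is simply
the difference between the current and the previous test's detection
rate, and changes within the +-1.5% band mentioned above should not
be over-interpreted.

    # Minimal illustration of how a DELTA value and its significance
    # can be read; the +-1.5% noise band corresponds to the approximate
    # monthly growth rate of new viruses mentioned above.
    NOISE_BAND = 1.5  # percentage points

    def delta(previous_rate: float, current_rate: float):
        """Return (change in percentage points, significant?)."""
        change = round(current_rate - previous_rate, 1)
        return change, abs(change) > NOISE_BAND

    if __name__ == "__main__":
        print(delta(90.6, 98.3))   # (7.7, True)   - clearly significant
        print(delta(99.8, 99.6))   # (-0.2, False) - within the noise band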
Table ES3: Improvement of DOS scanners from 1997-02 to 2000-04:
===============================================================
      -------- File Virus Detection ----------  ------ Macro Virus Detection -----------
SCAN  9702 9707 9802 9810 9903 9909 0004 DELTA  9702 9707 9802 9810 9903 9909 0004 DELTA
NER     %    %    %    %    %    %    %           %    %    %    %    %    %    %
---------------------------------------------------------------------------------------
ALE   98.8 94.1 89.4   -    -    -    -    -    96.5 66.0 49.8   -    -    -    -    -
ANT   73.4 80.6 84.6 75.7   -    -  92.8   -    58.0 68.6 80.4 56.6   -    -  85.9   -
AVA   98.9 97.4 97.4 97.9 97.6 97.4 97.5  0.1   99.3 98.2 80.4 97.2 95.9 94.6 93.7 -0.9
AVG   79.2 85.3 84.9 87.6 87.1 86.6   -    -    25.2 71.0 27.1 81.6 82.5 96.6   -    -
AVK     -    -    -  90.0 75.0   -    -    -      -    -    -  99.7 99.6   -    -    -
AVP   98.5 98.4 99.3 99.7 99.7 99.8 99.6 -0.2   99.3 99.0 99.9 100% 99.8 100% 99.9 -0.1
CMD     -    -    -    -    -    -  99.5   -      -    -    -    -    -  99.5 100%  0.5
DRW   93.2 93.8 92.8 93.1 98.2 98.3   -   0.1   90.2 98.1 94.3 99.3 98.3   -  98.4   -
DSE   99.7 99.6 99.9 99.9 99.8   -    -    -    97.9 98.9 100% 100% 100%   -    -    -
FMA     -    -    -    -    -    -    -    -    98.6 98.2 99.9   -    -    -    -    -
FPR   90.7 89.0 96.0 95.5 98.7 99.2 99.6  0.4   43.4 36.1 99.9 99.8 99.8 99.7 100%  0.3
FSE     -    -  99.4 99.7 97.6 99.3 99.9  0.6     -    -  99.9 90.1 99.6 97.6 99.9  2.3
FWN     -    -    -    -    -    -    -    -    97.2 96.4 91.0 85.7   -    -    -    -
HMV     -    -    -    -    -    -    -    -      -    -  98.2 99.0 99.5   -    -    -
IBM   93.6 95.2 96.5   -    -    -    -    -    65.0 88.8 99.6   -    -    -    -    -
INO     -    -  92.0 93.5 98.1 94.7 94.6 -0.3     -    -  90.3 95.2 99.8 99.5 99.7  0.2
IRS     -  81.4 74.2   -  51.6   -    -    -      -  69.5 48.2   -  89.1   -    -    -
ITM     -  81.0 81.2 65.8 64.2   -    -    -      -  81.8 58.2 68.6 76.3   -    -    -
IVB    8.3   -    -    -  96.9   -    -    -      -    -    -    -    -    -    -    -
MR2     -    -    -    -    -  65.4   -    -      -    -    -    -    -  69.6   -    -
NAV   66.9 67.1 97.1 98.1 77.2 96.0 93.3 -2.7   80.7 86.4 98.7 99.8 99.7 98.6 97.4 -1.2
NOD     -    -    -  96.9   -  96.9 98.3  1.4     -    -    -    -  99.8 100% 99.4 -0.6
NVC   87.4 89.7 94.1 93.8 97.6   -  99.1   -    13.3 96.6 99.2 90.8   -  99.6 99.9  0.3
PAN     -    -  67.8   -    -    -    -    -      -    -  73.0   -    -    -    -    -
PAV     -  96.6 98.8   -  73.7 98.8 98.7 -0.1     -    -  93.7 100% 99.5 98.8 99.9  1.1
PCC     -    -    -    -    -    -    -    -      -  67.6   -    -    -    -    -    -
PCV   67.9   -    -    -    -    -    -    -      -    -    -    -    -    -    -    -
PRO     -    -    -    -  35.5   -    -    -      -    -    -    -  81.5   -    -    -
RAV     -    -    -  71.0   -    -    -    -      -    -    -  99.5 99.2   -    -    -
SCN   83.9 93.5 90.7 87.8 99.8 97.1 99.9  2.8   95.1 97.6 99.0 98.6 100% 100% 100%  0.0
SWP   95.9 94.5 96.8 98.4   -  99.0 98.4 -0.6   87.4 89.1 98.4 98.6   -  98.4 98.4  0.0
TBA   95.5 93.7 92.1 93.2   -    -    -    -    72.0 96.1 99.5 98.7   -    -    -    -
TSC     -    -  50.4 56.1 39.5 51.6   -    -      -    -  81.9 76.5 59.5 69.6   -    -
TNT   58.0   -    -    -    -    -    -    -      -    -    -    -    -    -    -    -
VDS     -  44.0 37.1   -    -    -    -    -    16.1  9.9  8.7   -    -    -    -    -
VET     -  64.9   -    -  65.3   -    -    -      -  94.0 97.3 97.5 97.6   -    -    -
VIT     -    -    -    -    -    -   7.6   -      -    -    -    -    -    -    -    -
VRX     -    -    -    -    -    -    -    -      -    -    -    -    -    -    -    -
VBS   43.1 56.6   -  35.5   -    -    -    -      -    -    -    -    -    -    -    -
VHU   19.3   -    -    -    -    -    -    -      -    -    -    -    -    -    -    -
VSA     -    -  56.9   -    -    -    -    -      -    -  80.6   -    -    -    -    -
VSP     -    -    -  76.1 71.7 79.6   -   7.9     -    -    -    -    -    -    -    -
VSW     -    -  56.9   -    -    -    -    -      -    -  83.0   -    -    -    -    -
VTR   45.5   -    -    -    -    -    -    -     6.3   -    -    -    -    -    -    -
XSC   59.5   -    -    -    -    -    -    -      -    -    -    -    -    -    -    -
---------------------------------------------------------------------------------------
Mean  74.2 84.8 84.4 85.4 81.2 90.6 98.3  1.7   69.6 80.9 83.8 89.6 93.6 88.2 98.0 +0.1
---------------------------------------------------------------------------------------

Remark: For abbreviations and details of products present only in
previous tests, see the related parts of former VTC test reports.

******************************************************************
Result #1: For DOS, both file and macro virus detection rates
           improved significantly. Several scanners detect more
           than 99% of zoo file and macro viruses but no scanner
           is graded "perfect".
******************************************************************

Results
#1.1) Under DOS, the good news is that the ability of scanners to
      properly detect file viruses has been significantly improved in
      comparison with the last test; from the last test's good mean
      value (90.6% mean detection rate), file viruses in VTCs large
      collection ("zoo") are now detected much better (98.3%). Most
      products with a high detection level maintain their rates, but
      no single product presently detects ALL file zoo viruses.
      Concerning In-The-Wild file viruses, 13 (out of 16) products
      reach the level "perfect" (100% detection).
#1.2) Also under DOS, detection of zoo macro viruses has reached the
      highest level ever (98.0%), with a significant improvement
      (almost 10%). Now, 3 products (CMD, FPR and SCN) detect ALL zoo
      samples, and 4 more products reach a 99.9% detection level. And
      11 (out of 15) detect all In-The-Wild viruses.
#1.3) Summarizing the DOS situation, AV producers should aim at
      maintaining this high level of detection.
********************************************************************


4. Summary #2: Performance of DOS scanners on zoo testbeds:
============================================================

Concerning the rating of DOS scanners, the following grid is applied
to classify scanners:
 - detection rate =100.0%   : scanner is graded "perfect"
 - detection rate above 95% : scanner is graded "excellent"
 - detection rate above 90% : scanner is graded "very good"
 - detection rate of 80-90% : scanner is graded "good enough"
 - detection rate of 70-80% : scanner is graded "not good enough"
 - detection rate of 60-70% : scanner is graded "rather bad"
 - detection rate of 50-60% : scanner is graded "very bad"
 - detection rate below 50% : scanner is graded "useless"

To assess an "overall grade" (including file and macro virus
detection), the lowest of the related results is used to classify the
respective scanners. (A short illustrative sketch of these grading
grids follows the introduction of Summary #4 below.) If several
scanners of the same producer have been tested, grading is applied to
the most current version (which is, in most cases, the version with
the highest detection rates). Only scanners where all tests were
completed are considered; here, the most current version with
completed tests was selected. (For problems in test: see
8problms.txt).

The following list indicates those scanners graded into one of the
upper three categories, with file and macro virus detection rates in
unpacked samples, and with perfect ITW virus detection (rate=100%):

                              (file/macro zoo; file/macro ITW)
"Perfect" DOS scanners:       NONE                                Change:
                                                                  -------
"Excellent" DOS scanners:     SCN ( 99.9% 100.0%; 100.0% 100.0%)    (+)
                              FPR ( 99.6% 100.0%; 100.0% 100.0%)    (=)
                              CMD ( 99.5% 100.0%; 100.0% 100.0%)    (+)
                              FSE ( 99.9%  99.9%; 100.0% 100.0%)    (+)
                              AVP ( 99.6%  99.9%; 100.0% 100.0%)    (=)
                              NVC ( 99.1%  99.9%; 100.0% 100.0%)    (+)
"Very Good" DOS scanners:     PAV ( 98.7%  99.9%; 100.0% 100.0%)    (=)
                              NOD ( 98.3%  99.4%; 100.0% 100.0%)    (=)
                              SWP ( 98.4%  98.4%; 100.0% 100.0%)    (=)

**************************************************************
Finding #2:  Quality of best DOS scanners not yet perfect
             Excellent DOS scanners: SCN, FPR, CMD, FSE, AVP, NVC
***************************************************************

Findings
#2.1) The overall virus detection quality of the best DOS scanners
      has reached a very acceptable level, both for file and macro
      viruses which are not "in-the-wild".
#2.2) 6 DOS scanners - SCN, FPR, CMD, FSE, AVP, NVC - are almost
      perfect.
#2.3) 3 more products are "very good" (same level as in last test):
      PAV, NOD and SWP.
**************************************************************


5. Summary #3: Performance of DOS scanners on ITW testbeds:
===========================================================

Concerning "In-The-Wild" viruses, a much more rigid grid must be
applied to classify scanners, as the likelihood is significant that a
user may find such a virus on her/his machine. The following grid is
applied:
 - detection rate is 100% : scanner is "perfect"
 - detection rate is >99% : scanner is "excellent"
 - detection rate is >95% : scanner is "very good"
 - detection rate is >90% : scanner is "good"
 - detection rate is <90% : scanner is "risky"

100% detection of In-the-Wild viruses is now an absolute requirement.
The following 11 DOS products reach 100% for boot, file and macro
virus detection and are rated "perfect" in this category
(alphabetically ordered):

                             ( File   Macro   Boot )
                             -----------------------
"Perfect" DOS ITW scanners:  AVP (100.0% 100.0% 100.0%)  (=)
                             CMD (100.0% 100.0% 100.0%)  (=)
                             DRW (100.0% 100.0% 100.0%)  (+)
                             FPR (100.0% 100.0% 100.0%)  (=)
                             FSE (100.0% 100.0% 100.0%)  (=)
                             INO (100.0% 100.0% 100.0%)  (+)
                             NOD (100.0% 100.0% 100.0%)  (=)
                             NVC (100.0% 100.0% 100.0%)  (+)
                             PAV (100.0% 100.0% 100.0%)  (=)
                             SCN (100.0% 100.0% 100.0%)  (=)
                             SWP (100.0% 100.0% 100.0%)  (=)
                             ------------------------

**************************************************************
Result #3:  High ITW detection rates and implied risks:
            11 products "perfect" for ITW viruses under DOS:
            AVP,CMD,DRW,FPR,FSE,INO,NOD,NVC,PAV,SCN,SWP.
**************************************************************

Results
#3.1) 11 DOS scanners are "perfect" for ITW viruses:
      AVP,CMD,DRW,FPR,FSE,INO,NOD,NVC,PAV,SCN,SWP.
#3.2) In-The-Wild detection of the best DOS scanners has been
      slightly improved since the last test. The number of perfect
      scanners (in this category) has jumped from 10 to 11.
#3.3) But the concentration of some AV producers on reaching 100%
      In-The-Wild detection rates does NOT guarantee high detection
      rates of "zoo" viruses.
**************************************************************


6. Summary #4: Performance of DOS scanners by virus classes:
============================================================

Some scanners are specialised in detecting some class of viruses
(either by deliberately limiting themselves to one class, esp. macro
viruses, or by detecting one class significantly better than others).
It is therefore worth noting which scanners perform best in detecting
file, boot and macro viruses. Compared to the last test (1999-09),
the number of "excellent" macro virus detectors has significantly
grown (as has the class of "good" ones, which is not listed here); in
contrast, "standard" file (and even more: boot) viruses seem to be
handled comparably less carefully in product upgrading.

Two special tests of file viruses were also performed to determine
the quality of AV product maintenance. One test was concerned with
almost 11,000 viruses generated with the VKIT virus generator. Some
AV products count each of the potentially 15,000 viruses as a new
variant while others count all VKIT viruses just as ONE virus.
Fortunately, a high proportion of tested products detects these
viruses (see 4.5), although reliability of detection is significantly
lower than normal (see 6BDOSFIL.TXT). Another special test was
devoted to the detection of 10,000 polymorphic generations each of
the following polymorphic viruses: Maltese.Amoeba, MTE.Encroacher.B,
NATAS, TREMOR, One-Half and Tequila. Detection rates were "almost
perfect".
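The verbal grades used throughout the remainder of this report follow
the two grids above (the "zoo" grid of Summary #2 and the stricter
In-The-Wild grid of Summary #3), with the overall zoo grade taken
from the lowest individual result. The following minimal sketch
(Python) merely illustrates that mapping; the function names are
invented, boundary cases are handled only approximately, and it is
not part of VTC's test tooling:

    # Illustrative only: map detection rates (in percent) to the verbal
    # grades defined in Summaries #2 and #3; the overall zoo grade is
    # the lowest of the file and macro grades.
    ZOO_GRID = [(100.0, "perfect"), (95.0, "excellent"),
                (90.0, "very good"), (80.0, "good enough"),
                (70.0, "not good enough"), (60.0, "rather bad"),
                (50.0, "very bad"), (0.0, "useless")]

    ITW_GRID = [(100.0, "perfect"), (99.0, "excellent"),
                (95.0, "very good"), (90.0, "good"), (0.0, "risky")]

    def grade(rate, grid):
        """Return the first grade whose threshold the rate reaches."""
        for threshold, name in grid:
            if rate >= threshold:
                return name
        return grid[-1][1]

    def overall_zoo_grade(file_rate, macro_rate):
        """Overall grade: the lower of the file and macro zoo grades."""
        order = [name for _, name in ZOO_GRID]       # best ... worst
        file_grade = grade(file_rate, ZOO_GRID)
        macro_grade = grade(macro_rate, ZOO_GRID)
        return max(file_grade, macro_grade, key=order.index)

    # Example with the SCN figures of Summary #2 (99.9% file zoo,
    # 100.0% macro zoo): overall_zoo_grade(99.9, 100.0) -> "excellent".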
Products rated "perfect" (=100%), "excellent" (>99%) and "very good"
(>95%) are listed.

6.1 Grading the Detection of zoo file viruses:
----------------------------------------------
"Perfect" DOS file scanners:     === NONE ===   (=)
"Excellent" DOS file scanners:   FSE ( 99.9%)   (=)
                                 SCN ( 99.9%)   (+)
                                 AVP ( 99.6%)   (=)
                                 FPR ( 99.6%)   (=)
                                 CMD ( 99.5%)   (+)
                                 NVC ( 99.1%)   (+)
"Very Good" DOS file scanners:   PAV ( 98.7%)   (=)
                                 SWP ( 98.4%)   (-)
                                 NOD ( 98.3%)   (=)
                                 AVA ( 97.5%)   (=)

6.2 Grading the Detection of zoo macro viruses:
-----------------------------------------------
"Perfect" DOS macro scanners:    CMD (100.0%)   (+)
                                 FPR (100.0%)   (+)
                                 SCN (100.0%)   (=)
"Excellent" DOS macro scanners:  AVP ( 99.9%)   (-)
                                 FSE ( 99.9%)   (+)
                                 NVC ( 99.9%)   (=)
                                 INO ( 99.7%)   (=)
                                 NOD ( 99.4%)   (+)
"Very Good" DOS macro scanners:  DRW ( 98.4%)   (+)
                                 SWP ( 98.4%)   (=)
                                 NAV ( 97.4%)   (=)

6.3 Grading the Detection of zoo boot viruses:
----------------------------------------------
"Perfect" DOS boot scanners:     AVP (100.0%)   (+)
                                 NOD (100.0%)   (+)
"Excellent" DOS boot scanners:   CMD ( 99.9%)   (+)
                                 FPR ( 99.9%)   (=)
                                 FSE ( 99.9%)   (=)
                                 NVC ( 99.9%)   (+)
                                 SCN ( 99.9%)   (=)
                                 PAV ( 99.8%)   (+)
                                 SWP ( 99.4%)   (+)
"Very Good" DOS boot scanners:   INO ( 98.1%)   (+)

6.4 Grading of Poly-virus detection:
------------------------------------
Based on the detection data (see 6BDOSFIL.TXT, table FDOS.FA), and
under the additional conditions that
  1) all infected objects for all viruses were detected
  2) with full reliability of identification and detection,
the following products can be rated as "perfect" Poly-detectors:

"Perfect" Poly-detectors:        AVP (100.0)    (=)
                                 DRW (100.0)    (+)
                                 FSE (100.0)    (=)
                                 NAV (100.0)    (=)
                                 NOD (100.0)    (=)
                                 PAV (100.0)    (=)

The following products are "almost perfect" as they reach a 100%
detection rate (exactly) but with less precise identification:

"Almost Perfect" Poly detectors: ANT (+), AVA (+), CMD (=), DRW (=),
                                 FPR (=), INO (=), NVC (=), SCN (+),
                                 SWP (=) and VSP (+).

6.5 Grading of VKit virus detection:
------------------------------------
Based on the detection data (see 6BDOSFIL.TXT, table FDOS.FB), and
under the additional conditions that
  1) all infected objects for all viruses were detected
  2) with full reliability of identification and detection,
NO product was "perfect", but several detected almost all samples
(rounded to 100.0%), though with some unreliability of
identification:

"Perfect" VKIT-detectors:        NONE (=)
"Almost Perfect" VKIT detectors: ANT (+), AVA (+), AVP (=), CMD (+),
                                 DRW (+), FPR (+), FSE (=), NOD (+),
                                 PAV (=), SCN (=) and SWP (+).

****************************************************************
Result #4:  Performance of DOS scanners by virus classes:
            ---------------------------------------------
            Perfect scanners for macro zoo: CMD, FPR, SCN
            Perfect scanners for boot zoo:  AVP, NOD
            Perfect scanners for Polymorphic virus set:
                                            AVP, FSE, NAV, NOD, PAV.
            NO perfect scanner for file and VKit zoo viruses.
****************************************************************

Results
#4.1) Specialised scanners (esp. those specialising on macro viruses)
      are not superior to the best overall scanners, even concerning
      large collections such as VTCs "zoo" testbeds.
****************************************************************


7. Summary #5: Detection of viruses in packed objects under DOS:
================================================================

Detection of file and macro viruses within packed objects becomes
essential for on-access scanning, esp. for incoming email possibly
loaded with malicious objects.
It therefore seems reasonable to test whether at least ITW viral
objects compressed with popular packing methods (PKZIP, ARJ, LHA and
RAR) are also detected. Tests are performed only on In-The-Wild
viruses packed once (no recursive packing). As the last test showed
that AV products are rather far from perfect detection of packed
viruses, the testbed has been left essentially unchanged to ease
comparison and improvement.

Results (see 6BDOSFIL.TXT, 6DDOSMAC.TXT) are AGAIN rather
DISAPPOINTING, esp. as we have to report major problems of products
in scanning the whole testbed (although it is not very large), as
reported in 8PROBLMS.TXT.

A "perfect" product would detect ALL packed viral samples (100%)
(file AND macro) for all packers:
--------------------------------------------------
"Perfect" packed virus detectors: FSE, PAV and SCN
--------------------------------------------------

A "very good" product would reach 100% detection of packed viral
samples (file and macro) for at least 3 packers:
---------------------------------------------------------------
"Very good" packed macro virus detector: NOD (ZIP, ARJ and RAR)
---------------------------------------------------------------

A "good" product would detect viral samples (ITW file and macro) for
at least 2 packers:
---------------------------------------------
"Good" packed macro virus detectors: CMD, FPR
---------------------------------------------

********************************************************************
Result #5:  Detection of packed viral objects needs improvement
            Perfect packed file/macro virus DOS detector: FSE,PAV,SCN
            Packed macro virus only:
               "Very Good" detector: NOD
               "Good" detectors:     CMD, FPR
********************************************************************

Results
#5.1) 3 products = FSE, PAV and SCN = can be rated "perfect"
      concerning detection of infected packed objects, at least on
      the level of ITW file and macro viruses.
#5.2) Only 3 more products have reached an acceptable level of
      detecting viruses in packed infected objects with 2 or 3
      compression methods. Significant investment of work is needed
      here.
********************************************************************


8. Summary #6: False-Positive avoidance of DOS and Win-NT scanners:
===================================================================

Regarding the ability of scanners to avoid FP alarms, the following
AV products running under DOS reported NO SINGLE False Positive alarm
BOTH in the file and macro zoo testbeds and are therefore rated
"perfect":
--------------------------------------------------
FP-avoiding "perfect" DOS scanners:
      AVA (+), PAV (+), SCN (=) and SWP (+)
--------------------------------------------------

Several DOS scanners gave NO FP alarm EITHER on clean files OR on
clean macros:
-----------------------------------------------------
Perfect FP-avoidance on DOS clean file testbed:
      AVA (+), AVP (=), CMD (=), FPR (=), NAV (=), NVC (=),
      PAV (=), SCN (=), SWP (=), VIT (+)
-----------------------------------------------------
Perfect FP-avoidance on DOS clean macro testbed:
      ANT (+), AVA (=), FSE (+), NOD (+), PAV (+), SCN (=),
      SWP (=), VSP (+).
-----------------------------------------------------

In comparison to the DOS results, an analysis of FP-avoidance for
Windows-NT based scanners is slightly more promising. Concerning
avoidance of ANY FP-alarm BOTH for file and macro viruses, 6 products
are rated "perfect":
--------------------------------------------------------------------
FP-avoiding "perfect" W-NT scanners: AVA, AVG, AVK, MKS, SCN, SWP.
--------------------------------------------------------------------

Several more W-NT scanners also gave NO FP alarm EITHER on clean
files OR on clean macros:
-------------------------------------------------------------
Perfect FP-avoidance under Win-NT for clean file objects:
      ATD (+), AVA (+), AVG (+), AVK (=), AVP (=), CMD (+),
      ESA (+), FPW (=), FSE (=), MKS (+), NAV (=), NVC (+),
      PAV (=), PRO (+), RAV (+), SCN (=), SWP (=).
-------------------------------------------------------------
Perfect FP-avoidance under Win-NT for clean macro objects:
      AVA (=), AVG (+), AVK (+), MKS (+), NOD (+), QHL (+),
      SCN (=), SWP (=), VSP (=).
-------------------------------------------------------------

Concerning avoidance of False-Positive alarms BOTH under DOS AND
Windows-NT, only 3 products can be rated as "perfect":
------------------------------------------------------
"Perfect" FP-avoiding scanners both under DOS and W-NT:
      AVA, SCN and SWP.
------------------------------------------------------

(Remark: a direct comparison of 16-bit scan engines for DOS and
32-bit scan engines for W-NT is not possible. The argument concerning
an "overall perfect product" applies more to the suite of software
than to single products. Indeed, FPW and FPR are different engines in
the Frisk Software suite, as the SCN engines are in NAIs suite.)

****************************************************************
Result #6:  Avoidance of False-Positive Alarms is insufficient.
            FP-avoiding perfect DOS scanners:  AVA, PAV, SCN, SWP
            FP-avoiding perfect W-NT scanners: AVA, AVG, AVK, SCN, SWP
****************************************************************

Results
#6.1) Several products reliably avoid ANY False Positive alarm on
      clean file and macro objects, either under DOS or under Win-NT.
#6.2) Only 3 products avoid ANY false-positive alarm BOTH under DOS
      and Windows-NT: AVA, SCN and SWP!
#6.3) The number of scanners avoiding FP alarms has slightly
      increased since the last test.
#6.4) AV producers should intensify work to avoid FP alarms.
*****************************************************************


9. Summary #7: Detection of File and Macro Malware (DOS/Win-NT):
================================================================

Since test "1997-07", VTC has also tested the ability of AV products
to detect non-viral malware. An essential argument for this category
is that customers are interested in being warned about and protected
against not only viruses but also other malicious objects such as
trojans etc., the payload of which may be disastrous to their work
(e.g. stealing passwords). Regrettably, the awareness of AV producers
of the need to protect their users against related threats is still
underdeveloped. Manifold arguments are presented why AV products are
not the best protection against non-viral malware; from a technical
point of view, these arguments may seem conclusive, but at the same
time almost nothing is done to support customers with adequate
AntiMalware software. On the other hand, AV methods (such as scanning
for the presence or absence of characteristic features) are also
applicable - though not ideal - to detect non-viral malware.

Since VTC test "1999-03", malware detection is a mandatory part of
VTC tests, both for submitted products and for those downloaded as
free evaluation copies. As in the last test, still NO product can be
rated a "perfect AM detector". Compared to the last test, only 1
product (formerly 4, one of which "left" the market) can now be rated
"very good". The development for file malware (e.g.
trojans stealing passwords) is disappointing, as detection rates are
generally decreasing, whereas detection of macro malware (which is
well defined in VTCs "List of Known Macro Viruses/Malware") is
improving:

                                  ===== Malware Detection =====
                                  = under DOS ==  = under W-NT =
                                  (File/Macro-mw; File/Macro-mw)
-----------------------------------
"Perfect" DOS/W-NT mw scanners:   NONE
-----------------------------------
"Excellent" DOS/W-NT scanners:    FPR/FPW (95.3% 100.0%; 91.5% 100.0%) (+)
                                  SCN     (93.4%  99.6%; 93.3%  99.6%) (=)
                                  CMD     (92.8% 100.0%; 95.3% 100.0%) (+)
                                  FSE     (94.6%  96.2%; 98.7% 100.0%) (+)
                                  PAV     (90.8%  98.8%; 91.6%  98.8%) (+)
----------------------------------

Concerning macro malware detection, the situation is better, as there
are "perfect" products (100% detection):
  3 products detect macro malware under W-NT at 100%: CMD, FPR/FPW and FSE.
  2 products detect macro malware under DOS at 100%:  CMD and FPR/FPW.
But otherwise, detection rates still need development:

                                  ===== Malware Detection =====
                                  = under DOS ==  = under W-NT =
                                  (File/Macro-mw; File/Macro-mw)
-----------------------------------
"Perfect" macro mw (DOS/W-NT):    FPR/FPW (95.3% 100.0%; 91.5% 100.0%) (+)
                                  CMD     (92.8% 100.0%; 95.3% 100.0%) (+)
-----------------------------------
"Excellent" macro mw (DOS/W-NT):  SCN     (93.4%  99.6%; 93.3%  99.6%) (=)
                                  PAV     (90.8%  98.8%; 91.6%  98.8%) (+)
                                  AVP     (83.2%  96.9%; 91.6%  98.8%) (=)
                                  FSE     (94.6%  96.2%; 98.7% 100.0%) (+)
                                  NOD     (77.6%  96.2%; 78.3%  97.3%) (-)
                                  NVC     (64.9%  95.4%; 77.6%  96.2%) (=)
                                  INO     (74.4%  95.0%;   -    98.1%) (=)
                                  SWP     (78.3%  95.0%; 78.3%  95.0%) (=)
----------------------------------

**************************************************************
Result #7:  AntiMalware detection under DOS/W-NT improving
            No "perfect" file/macro malware detector
            "Excellent" file/macro malware detectors:
            FPR/FPW, SCN, CMD, FSE and PAV
**************************************************************

Results
#7.1: The ability of AV products to also detect non-viral malware is
      improving only for macro malware, while the ability to detect
      file malware has decreased since the last tests.
#7.2: Concerning file and macro malware, 5 products can be rated
      "excellent": FPR/FPW, SCN, CMD, FSE, PAV.
#7.3: With the continuing growth of malware testbeds and growing
      threats to customers, AV producers MUST improve their products
      also in this area.
**************************************************************


10. Summary #8: File/Macro virus detection under Windows-NT:
============================================================

The number of scanners running under Windows NT is still small,
though growing. Significantly fewer products were available for these
tests, compared with the traditional DOS scene.
The following table summarizes results of file and macro virus
detection under Windows-NT in the last 6 VTC tests:

Scan     ====== File Virus Detection =======  ===== Macro Virus Detection =======
ner      9707 9802 9810 9903 9909 0004 Delta  9707 9802 9810 9903 9909 0004 Delta
------------------------------------------------------------------------------
ANT      88.9 69.2 91.3   -  87.2 92.8 12.6   92.2   -  85.7   -  89.3 90.2  0.9
ANY        -    -  69.7   -    -    -    -      -    -  70.5   -    -    -    -
ATD        -    -    -    -    -  100%   -      -    -    -    -    -  99.9   -
AVA        -  97.4 96.6 97.1 97.4 97.2 -0.2     -  91.9 97.2 95.2 93.3 94.3  1.0
AVG        -    -    -  87.3 87.0 85.4 -1.6     -    -    -  82.5 96.6 97.5  0.9
AVK        -    -  99.6 90.2 99.8 99.7 -0.1     -    -  99.6 99.6 100% 99.9 -0.1
AVP        -    -  83.7 99.9 99.8 99.9  0.1     -    -  100% 99.2 100% 99.9 -0.1
AVX        -    -    -  74.2 75.2 80.4  5.2     -    -    -  98.9 98.7 94.5 -4.2
AW         -  56.4   -    -    -    -    -      -  61.0   -    -    -    -    -
CMD        -    -    -    -    -  99.6   -      -    -    -    -    -  100%   -
DRW/DWW    -    -    -  93.3 98.3 98.3  0.0     -    -    -  98.3 98.8 98.4 -0.4
DSS/E    99.6 99.7 99.9 99.3   *    -    -    99.0 100% 100% 100%   *    -    -
ESA        -    -    -    -    -  58.0   -      -    -    -    -    -  88.9   -
FPR/FMA    -  96.1   -  98.7 99.4   -    -      -  99.9 99.8 99.8 99.7   -    -
FPW        -    -    -    -    -  99.6   -      -    -    -    -  99.7 100%  0.3
FSE        -  85.3 99.8 100% 99.9 100%  0.1     -    -  99.9 100% 100% 100%  0.0
FWN        -    -    -    -    -    -    -      -    -  99.6 99.7   -  99.9   -
HMV        -    -    -    -    -    -    -      -    -  99.0 99.5   -    -    -
IBM      95.2 95.2 77.2   *    *    *    *    92.9 92.6 98.6   *    *    *    *
INO        -  92.8   -  98.1 98.0 98.7  0.7     -  89.7   -  99.8 99.7 99.7  0.0
IRS        -  96.3   -  97.6   -    -    -      -  99.1   -  99.5   -    -    -
IVB        -    -    -    -    -    -    -      -    -  92.8 95.0   -    -    -
MKS        -    -    -    -    -  78.0   -      -    -    -    -    -  97.1   -
MR2        -    -    -    -  61.9   -    -      -    -    -    -  69.6   -    -
NAV      86.5 97.1   -  98.0 97.6 96.8 -0.8   95.6 98.7 99.9 99.7 98.7 98.0 -0.7
NOD        -    -    -  97.6 98.2 98.3  0.1     -    -    -  99.8 100% 99.4 -0.6
NVC      89.6 93.8 93.6 96.4   -  99.1   -    96.6 99.2   -  98.9 98.9 99.9  1.0
NVN        -    -    -    -  99.0   -    -      -    -    -    -  99.5   -    -
PAV      97.7 98.7 98.4 97.2 99.6 100%  0.4   93.5 98.8 99.5 99.4 99.7 99.9  0.2
PRO        -    -    -  37.3 42.4 45.6  3.2     -    -    -  58.0 61.9 67.4  5.5
QHL        -    -    -    -    -    -    -      -    -    -    -    -   0.0   -
RAV        -  81.6 84.9 85.5   -  88.0   -      -  98.9 99.5 99.2   -  97.9   -
RA7        -    -    -  89.3   -    -    -      -    -    -  99.2   -    -    -
PCC      63.1   -    -    -    -    -    -      -  94.8   -    -    -    -    -
PER        -    -    -    -    -    -    -      -  91.0   -    -    -    -    -
SCN      94.2 91.6 71.4 99.1 99.8 99.8  0.0   97.6 99.1 97.7 100% 100% 100%  0.0
SWP      94.5 96.8 98.4   -  99.0 99.6  0.6   89.1 98.4 97.5   -  98.4 98.6  0.2
TBA        -  93.8 92.6   *    *    *         -  96.1   -  98.7   *    *    *
TNT        -    -    -    *    *    *         -    -    -  44.4   *    *    *
VET      64.9   -    -  65.4   *    *         -    -  94.0   -  94.9   *    *
VSA        -  56.7   -    -    -    -         -    -  84.4   -    -    -    -
VSP        -    -    -  87.0 69.8 78.1  8.3     -    -    -  86.7  0.3  0.0 -0.3
------------------------------------------------------------------------------
Mean:    87.4 88.1 89.0 89.2 90.0 91.0 1.8%   94.7 95.9 91.6 95.3 95.1 96.5 0.2%
------------------------------------------------------------------------------

Generally, the ability of W-NT scanners to detect file zoo viruses
"in the mean" is stable but remains on an insufficient level (91.0%);
those scanners present in the last test pushed detection rates up by
1.8% (mean). On the side of macro viruses, the "mean" detection rate
is stable on an acceptable level (96.3%); here, products which
participated in the last VTC tests succeed in following the growth by
keeping detection rates (+0.2%) on high levels, from where
spectacular improvements are less easy.

The same grid as for the DOS and W-98 classification is applied to
grade scanners according to their ability to detect file and macro
viruses under Windows NT. Under W-NT, just ONE product (in both
cases: FSE) reached a 100% detection rate for both file and macro
viruses, both zoo and In-The-Wild, and is rated "perfect".
But 9 scanners reach grade "Excellent" (>99% detection), and 4
scanners are rated "very good" (>95%):

                            (file/macro zoo; file/macro ITW)
                            --------------------------------
"Perfect" W-NT scanners:    FSE (100.0% 100.0%; 100.0% 100.0%)  (+)
                            --------------------------------
"Excellent" W-NT scanners:  ATD (100.0%  99.9%; 100.0% 100.0%)  (+)
                            PAV (100.0%  99.9%; 100.0% 100.0%)  (=)
                            AVP ( 99.9%  99.9%; 100.0% 100.0%)  (=)
                            SCN ( 99.8% 100.0%; 100.0% 100.0%)  (=)
                            AVK ( 99.7%  99.9%; 100.0% 100.0%)  (=)
                            CMD ( 99.6% 100.0%; 100.0% 100.0%)  (+)
                            FPW ( 99.6% 100.0%; 100.0% 100.0%)  (=)
                            SWP ( 99.6%  98.6%; 100.0% 100.0%)  (=)
                            NVC ( 99.1%  99.9%; 100.0% 100.0%)  (+)
                            -------------------------------
"Very Good" W-NT scanners:  INO ( 98.7%  99.7%; 100.0% 100.0%)  (=)
                            DRW ( 98.3%  98.4%; 100.0% 100.0%)  (=)
                            NOD ( 98.3%  99.4%; 100.0% 100.0%)  (=)
                            NAV ( 96.8%  98.0%; 100.0% 100.0%)  (=)
                            -------------------------------

For detection of macro viruses under Windows NT, the following 17
scanners detect at least 95% of zoo macro viruses and 100% of ITW
viruses:
------------------------------------------------------
"Perfect"   (100%  100%): CMD, FPW, FSE and SCN
"Excellent" (>99%  100%): ATD, AVK, AVP, FWN, INO, NOD, NVC and PAV
"Very Good" (>95%  100%): AVG, DRW, NAV, RAV and SWP
-------------------------------------------------------

**************************************************************
Result #8:  Virus detection rates under W-NT on high level
            1 "perfect" Windows-NT zoo scanner: FSE
            9 "excellent" scanners: ATD, PAV, AVP, SCN, AVK, CMD,
                                    FPW, SWP and NVC
            4 Macro-only "perfect" products: CMD,FPW,FSE,SCN
**************************************************************

Results
#8.1: Detection rates for file and esp. macro viruses for scanners
      under Windows NT have reached a fairly high level, similar to
      W-98:
          Perfect scanner    (100%): 1 (last test: 0)
          Excellent scanners (>99%): 8 (last test: 8)
#8.2: AV producers should invest more work into file virus detection,
      esp. into VKIT virus detection (where results are not equally
      promising), as well as detection of viruses in compressed
      objects (essential for on-access scanning).
**************************************************************


11. Summary #9: File/Macro Virus detection under 32-bit engines:
================================================================

Concerning 32-bit engines as used in Windows-98 and Windows-NT, it is
interesting to test the validity of the hypothesis that related
engines produce the same detection and identification quality. (For
details see 6HCOMP32.TXT.)

When comparing results from the related tests, the good news is that
32-bit engines increasingly behave equally well on W-98 and W-NT
platforms:
----------------------------------------------------------
Equal detection of zoo file viruses:  13 (of 23) products
          of ITW file viruses:        22 (of 23) products
          of zoo macro viruses:       21 (of 23) products
          of ITW macro viruses:       21 (of 23) products
----------------------------------------------------------

*****************************************************************
Result #9: Several W-32 scanners perform equally on W-98/W-NT
*****************************************************************

Results
#9.1: The assumption that 32-bit engines in scanners produce the same
      detection rate for different instantiations of 32-bit operating
      systems (esp. for Windows-98 and Windows-NT) is now correct for
      almost all scanners.
#9.2: Analysis of ITW detection rates is NOT sufficient to determine
      the behaviour of 32-bit engines and does not guarantee equal
      detection rates for different W-32 platforms (esp. W-98/W-NT).
*****************************************************************


12. Summary #10: Malware detection under Windows 98/NT:
=======================================================

As Windows 98 and Windows-NT are often used for downloading
potentially hazardous objects from Internet, it is interesting to
measure the ability of AntiVirus products to also act as AntiMalware
products. The same grid as for the grading of DOS AM products is
applied.

Similar to DOS, NO AV product can presently be graded as "Perfect"
(all rates 100.0%), but 7 scanners (compared to last test: 2) perform
as AM products with grade "Excellent" (>90%), both for W-98 and W-NT;
several scanners now reach the same level as in test 1999-03, after
having lost this grade due to insufficient detection rates in test
"1999-09".

                                   ===== Malware Detection =====
                                   == File malw ==  Macro malw ==
                                   (W-98    W-NT ;  W-98    W-NT)
-----------------------------------
"Perfect" W98/W-NT mw scanners:    NONE
-----------------------------------
"Excellent" W98/W-NT mw scanners:  AVK (91.1% 91.1%;  98.8%  98.8%)  (+)
                                   AVP (91.5% 91.6%;  96.9%  96.9%)  (+)
                                   CMD (95.3% 95.3%; 100.0% 100.0%)  (+)
                                   FPW (91.5% 91.5%; 100.0% 100.0%)  (+)
                                   FSE (98.7% 98.7%; 100.0% 100.0%)  (=)
                                   PAV (91.6% 91.6%;  98.8%  98.8%)  (+)
                                   SCN (94.3% 93.3%; 100.0%  99.6%)  (=)
-----------------------------------

Detection of macro malware is evidently better supported than file
malware detection, as several more AV products detect macro malware
under W-98 (14 products) and W-NT (16 products). For macro malware
detection alone, the following products reached a "perfect" or
"excellent" level under both Win-98 and Win-NT:

Detection of macro malware under W-98/W-NT
--------------------------------------------
"Perfect":    CMD, FPW, FSE  (100.0% 100.0%)
"Excellent":  SCN            (100.0%  99.6%)
              ATD, AVK, PAV  ( 98.8%  98.8%)
              INO            ( 97.3%  97.3%)
              FWN            ( 96.9%  98.1%)
              AVP            ( 96.9%  96.9%)
              NOD            ( 96.2%  96.2%)
              NVC, RAV       ( 95.4%  95.4%)
              SWP            ( 95.0%  95.0%)
              AVX            ( 94.2%  94.2%)
-------------------------------------------

***************************************************************
Result #10: AntiMalware quality of AV products is developing:
            NO "perfect" AM product for W-98 and W-NT for file and
            macro malware detection but 7 scanners with grade
            "excellent" (>90%): AVK, AVP, CMD, FPR, FSE, PAV and SCN.
            Concerning macro malware detection under W-98/W-NT:
            3 products are "perfect":    CMD, FPR and FSE
            12 products are "excellent": SCN, ATD, AVK, PAV, INO, FWN,
                                         AVP, NOD, NVC, RAV, SWP and AVX
***************************************************************

Results
#10.1: Several AntiMalware producers help customers to also detect
       non-viral malware under 32-bit operating systems, esp. Win-98
       and Win-NT.
#10.2: The ability to detect macro malware is more developed than the
       detection of file malware.
#10.3: Much more work must be invested to reliably detect file and
       macro malware and protect customers from downloading trojans
       etc.
***************************************************************
13. Final remark: Searching for the "Perfect AV/AM product"
===========================================================

Under the scope of VTCs grading system, a "Perfect AV/AM product"
would have the following characteristics:

Definition: A "Perfect AntiVirus (AV) product"
----------------------------------------------
 1) Will detect ALL viral samples "In-The-Wild" AND at least 99% of
    zoo samples, in ALL categories (file, boot and script-based
    viruses), with always the same high precision of identification
    and in every infected sample,
 2) Will detect ALL ITW viral samples in compressed objects for all
    (4) popular packers, and
 3) Will NEVER issue a False Positive alarm on any sample which is
    not viral.

Definition: A "Perfect AntiMalware (AM) product"
------------------------------------------------
 1) Will be a "Perfect AntiVirus product", that is:
       100% ITW detection AND >99% zoo detection
       AND high precision of identification
       AND high precision of detection
       AND 100% detection of ITW viruses in compressed objects,
       AND 0% False-Positive rate,
 2) AND it will also detect essential forms of malicious software, at
    least in unpacked forms, reliably at high rates (>90%).

*************************************************************
  In VTC test "2000-04", we found *** NO perfect AV product ***
                    and we found *** No perfect AM product ***
*************************************************************

But several products seem to approach our definition on a rather high
level (taking into account the highest value, "perfect", defined on
the 100% level, and "Excellent" defined by 99% for virus detection
and 90% for malware detection):

Test category:        "Perfect"                "Excellent"
--------------------------------------------------------------
DOS zoo tests:           ---                   AVP,FPR
DOS ITW tests:        AVP,CMD,DRW,FPR,FSE,
                      INO,NOD,NVC,PAV,SCN,SWP
DOS pack-tests:       FSE,PAV,SCN
FP avoidance DOS:     AVA,PAV,SCN,SWP
FP avoidance W-NT:    AVA,AVG,AVK,SCN,SWP
W-NT zoo tests:       FSE
W-32 uniformity:                               AVK,AVP,CMD,FPR,FSE,PAV,SCN
---------------------------------------------------------------
Malware NT/DOS:          ---                   SCN
Malware W-98/W-NT:       ---                   AVK,AVP,CMD,FPR,FSE,PAV,SCN
---------------------------------------------------------------

We regard it as an indication of good work that the number of
products which were graded with at least 1 point (that is: being at
least once in the category "perfect", counting 2 points, or in the
category "excellent", counting 1 point) has grown from 4 (last test)
to now 14!
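As a purely illustrative aid, the following short sketch (Python; not
VTC's own evaluation tooling, and the example ratings are an invented
excerpt rather than the full category table above) shows how such a
point scheme can be computed: every "perfect" entry counts 2 points,
every "excellent" entry 1 point, and products are ranked by their
totals.

    from collections import Counter

    POINTS = {"perfect": 2, "excellent": 1}

    def rank(ratings):
        """ratings: product abbreviation -> list of category grades."""
        totals = Counter({product: sum(POINTS.get(g, 0) for g in grades)
                          for product, grades in ratings.items()})
        return totals.most_common()      # sorted by points, best first

    if __name__ == "__main__":
        example = {                      # invented excerpt, illustration only
            "SCN": ["perfect"] * 4 + ["excellent"],   # -> 9 points
            "FSE": ["perfect"] * 3 + ["excellent"],   # -> 7 points
            "AVA": ["perfect"] * 2,                   # -> 4 points
        }
        for product, points in rank(example):
            print(product, points)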
In order to support the race for more customer protection, we
evaluate the order of performance in this test with a simple
algorithm, by counting the majority of places (weighting "perfect"
twice and "excellent" once), for the first places (also indicating
changes in places versus the last test, 1999-09):

************************************************************
"Perfect" AntiVirus product:  NONE

"Excellent" AV products:
    1st place:  SCN(+)                               ( 9 points)
    2nd place:  PAV(+),FSE(=)                        ( 7 points)
    4th place:  SWP(+)                               ( 6 points)
    5th place:  AVA(+),AVP(-),FPR(=)                 ( 4 points)
    8th place:  AVK(+),CMD(+)                        ( 3 points)
   10th place:  AVG(+),DRW(+),INO(+),NOD(+),NVC(+)   ( 2 points)
************************************************************
"Perfect" AntiMalware product:  NONE

"Excellent" AntiMalware products:
    1st place:  SCN(=)                               (11 points)
    2nd place:  FSE(+),PAV(+)                        ( 9 points)
    4th place:  FPR(+)                               ( 6 points)
    5th place:  AVP(+),CMD(+)                        ( 5 points)
    7th place:  AVK(+)                               ( 4 points)
************************************************************

Generally, we hope that these rather detailed results help AV
producers to adapt their products to growing threats and thus to
protect their customers.


14. Availability of full test results:
======================================

Much more information about this test, its methods and viral
databases, as well as detailed test results, is available for
anonymous FTP download from VTCs HomePage (VTC is part of Working
Group AGN):

    ftp://agn-www.informatik.uni-hamburg.de/vtc

Any comment and critical remark which helps VTC to improve its test
methods will be warmly welcomed.

The next comparative test will evaluate macro (VBA/VBA5) and VBS
virus detection; this test is planned for May to July 2000, with
viral databases frozen on May 10, 2000. Any AV producer wishing to
participate in the forthcoming test is invited to submit related
products.

On behalf of the VTC Test Crew:
Dr. Klaus Brunnstein (May 26, 2000)
(minor corrections: June 14, 2000)


15. Copyright, License, and Disclaimer:
=======================================

This publication is (C) Copyright 2000 by Klaus Brunnstein and the
Virus Test Center (VTC) at University of Hamburg, Germany.

Permission (Copy-Left) is granted to everybody to distribute copies
of this information in electronic form, provided that this is done
for free, that the contents of the information are not changed in any
way, and that the origin of this information is explicitly mentioned.
It is esp. permitted to store and distribute this set of text files
at university or other public mirror sites where security/safety
related information is stored for unrestricted public access for
free.

Any other use, esp. including distribution of these text files on
CD-ROMs or any publication as a whole or in parts, is ONLY permitted
after contact with the supervisor, Prof. Dr. Klaus Brunnstein, or
authorized members of the Virus Test Center at Hamburg University,
and such an agreement must be made explicitly in writing, prior to
any publication.

No responsibility is assumed by the author(s) for any injury and/or
damage to persons or property as a matter of products liability,
negligence or otherwise, or from any use or operation of any methods,
products, instructions or ideas contained in the material herein.

Prof. Dr. Klaus Brunnstein
University of Hamburg, Germany
(May 26, 2000)