=====================================================
File 6ASUMOV.TXT:
Overview: Results of VTC Test "2000-04" (April 2000):
=====================================================
Formatted with non-proportional font (Courier)

Content of this file:
=====================
1) General/Background
2) Special problems/experiences in test "2000-04"
3) Results and evaluation
4) Overview Tables
   Table A0: Products/Versions in VTC test "2000-04"
   Table A1: Detection Rate of Boot/File/Macro Viruses in Full Test (DOS)
   Table AA: Comparison: File/Macro Virus Detection Rate in last 7 VTC
             tests (DOS)
   Table A2: Detection Rate of Boot/File/Macro Viruses in In-The-Wild
             Test (DOS)
   Table A3: Detection Rate of Infected DOS Objects (Files/Images/
             Documents)
   Table A4: Consistency/Reliability of Virus Detection in Full Test (DOS)
   Table A5: Detection Rate of Non-Viral File & Macro Malware
   Table AB: Comparison: File/Macro Virus Detection Rate in last 4 VTC
             tests under Windows 98
   Table A6: Detection Rate of File/Macro Viruses in Full/ITW Test
             (Windows 98)
   Table AC: Comparison: File/Macro Virus Detection Rate in last 6 VTC
             tests under Windows NT
   Table A7: Detection Rate of File/Macro Viruses in Full/ITW Test
             (Windows NT)

1) General/Background:
======================
The test presented here builds on the shoulders of previous VTC tests
performed by Vesselin Bontchev (his last test, "1994-07", is available
for comparison from another entry on VTC's ftp site) and of the regular
VTC tests published since 1997-07; it is an update of VTC's last test,
"1999-09", published in November 1999. For details of previous tests,
see VTC's homepage:
        http://agn-www.informatik.uni-hamburg.de/vtc
Concerning operating platforms, tests were performed for AntiVirus
(AntiMalware) products running under DOS, Windows 98 and Windows NT,
respectively. As in previous tests, VTC tested on-demand AntiVirus
products under DOS for their ability to detect boot, file and macro
viruses in the resp. virus databases.
File and macro virus detection (both in the full databases and in the
"In-the-Wild" testbeds) was tested for scanner versions running under
Windows 98 and Windows NT. For these tests, the virus databases were
significantly updated compared with the last tests. A separate test was
performed on multiple generations of 6 polymorphic file viruses; this
is a subset generated by VTC's dynamic generation procedure for the
detection of polymorphic viruses. Another separate test is based on
file viruses generated by the VKIT file virus generator. In addition to
the macro and file malware tests, this test also determined the quality
of detection of viruses in objects packed with four popular packers,
and it analysed the ability of AV products to avoid "false positive"
alarms.

The protocols produced by each scanner were analysed for indications of
how many viruses and infected objects were detected, and of whether
identification is consistent (that is: producing the same name for all
infected files, images and documents, respectively) and reliable
(detecting all infected files, images and documents).

The databases of file, boot and macro viruses were frozen at their
status as known and available to VTC on October 31, 1999. For a
detailed index of the respective virus databases, see: A3TSTBED.ZIP.
Following discussions about the virality of all samples in the boot and
file virus databases, there was a careful cleaning process in which all
samples of potentially questionable virality were moved to a special
database, from where self-reproduction experiments were started. We
also received some information from experts of tested products which
led us to move a few more samples to the database of doubtful samples
for further inspection. No questions or doubts were raised about the
macro virus testbed. It is VTC's policy to move samples of questionable
virality or malicity back to the related testbeds when we can
definitively prove that they are viral or otherwise malicious.
After a series of pre-tests during which test procedures were installed
(esp. for products with new engines and those participating for the
first time), tests were performed in two phases:
  - in phase 1, macro viruses and macro malware were tested; related
    results were published on February 26, 2000 (labeled "pre-release
    2000-02")
  - in phase 2, file and boot viruses and file malware were tested;
    due to severe problems with many crashes (see file 8PROBLMS.TXT),
    this test required much post-processing

The following products participated in this test:

Table A0: Products/Versions in VTC test "2000-04":
==================================================
ANT = AntiVir 5.21 (VDF 27.11.1999)     for DOS, W-98, W-NT
ATD = AntiDote 1.50 (Vintage Sol)       for      W-98, W-NT
AVA = AVAST v7.70, AVAST v3.0           for DOS, W-98, W-NT
AVK = AVK 9 (November 30, 1999)         for      W-98, W-NT
AVP = AVP v3.0                          for      W-98, W-NT
CLE = Cleaner 3.0                       for      W-98, W-NT
CMD = Command Software AV 4.58          for DOS, W-98, W-NT
DRW = DrWeb 4.14                        for DOS, W-98, W-NT
DSE = Dr Solomon Emergency AV 4.03      for      W-98
ESA = eSafe 2.1 (Aladdin)               for      W-98, W-NT
FPR = F-PROT 3.06c                      for DOS, W-98
FPW = F-PROT for Windows 3.05           for      W-98, W-NT
FSE = FSAV v3.0 and 4.06                for DOS, W-98, W-NT
FWN = FWin32                            for      W-98, W-NT
INO = Inoculan 4.53 (11/24/99)          for DOS, W-98, W-NT
MKS = MKS-vir (Polonia)                 for      W-98, W-NT
NAV = NAV 1.0, 5.01.01 (Nov. 1999)      for DOS, W-98, W-NT
NOD = NOD 32 V.129                      for DOS, W-98, W-NT
NVC = NVC 4.73                          for DOS, W-98, W-NT
PAV = PAV v3.0 (November 30, 1999)      for DOS, W-98, W-NT
PER = PER AntiVirus (Peru)              for      W-98
PRO = Protector 7.7 B05                 for      W-98, W-NT
QHL = Quickheal Version 5.21            for      W-98, W-NT
RAV = RAV 7.6                           for      W-98, W-NT
SCN = NAI VirusScan 4.50, 4.03          for DOS, W-98, W-NT
SWP = Sweep 3.28                        for DOS, W-98, W-NT
VIT = VirIT Explorer Lite 2.6.26        for DOS, W-98
VSP = VirScanPlus 11.90.04              for DOS, W-98, W-NT
-----------------------------------------------------------

Those tables which summarize the development of detection rates over
several VTC tests comprise some products which were either no longer
available (*) or which were not submitted or available for this test
(-). For details of related AV products (which are always addressed
with the same scanner code), see the related test reports (available
from VTC's www/ftp site).

2) Special problems/experiences in test "2000-04":
==================================================
Again, the complexity of the test and problems with scanners and
systems delayed publication by about 8 weeks. Indeed, more products
than before needed "special care" (aka "spoon-feeding"), as they either
crashed or refused to scan several directories, for unknown reasons.
Partially, such problems may be related to the MS-NTFS problem as
described in a previous test report ("1999-03", file "3asumov.txt":
Problem #1). While VTC testbeds differ from user data collections,
which should usually contain at most very few viruses, it is our
sincere hope that such products behave in a more "user-friendly" manner
in "normal environments".

Concerning other impacts on timeliness, VTC's tests have to fit into
university schedules as well as the personal schedules of the students
who essentially run these tests. A special impediment during winter
1999 was Y2k-related student activity, which somewhat reduced the
priority of VTC test procedures (but this reason no longer applies).
Apart from those problems documented in "8problms.txt", however, the
test procedures work with sufficient stability.

3) Results and evaluation:
==========================
For details of results of scanner tests under DOS, Windows 98 and
Windows NT, see test documents 6B-6F and 6H. For a detailed analysis
of performances, see 7EVAL.TXT. All documents are in ASCII, formatted
(and best reproducible) with non-proportional fonts (Courier), 72
columns.
4) Overview Tables:
===================
In order to extract optimum information from the test results,
comparative tables were produced for the essential sets, both for the
"full (zoo) test" and for the subset regarded as equivalent to Joe
Wells' and the Wildlist Organisation's "In-The-Wild" list. Result
tables are given in ASCII, in a form from which an EXCEL or LOTUS
spreadsheet can simply be derived by deleting the headline and
importing the "pure" table into a related spreadsheet. VTC deliberately
did NOT follow suggestions to (optionally) present the results in XLS
form, in order to avoid ANY possible macro-viral side effect (e.g. when
some mirror site inadvertently implants an XLS virus during
preprocessing, e.g. when adding some information on the mirror site).
Concerning the report on VTC's website, we have added graphical
representations (gifs embedded in html pages) both for the comparative
tables and for each product.

In order to determine whether and to what degree AntiVirus products
also help users to identify non-replicating malicious software such as
trojan horses, virus generators and intended (though not replicating)
viruses, a special test was performed to detect known non-viral malware
related to macro and file viruses. Macro malware was selected as
published in VTC's monthly "List of Known Macro Viruses". For this
test, known network malware (e.g. worms, hostile applets and malicious
ActiveX) was deliberately excluded, but will be incorporated into a
future malware test. Different from some previous tests (where we
accepted not to publish malware detection results for those products
whose producers requested abstention), related malware tests are now a
mandatory part of VTC tests. We regret that we cannot admit AV products
whose developers are evidently not willing to protect their customers
from such threats, or who forbid that a related capability of their
product be tested by an independent institution.
Fortunately, all relevant AV producers agreed with the related VTC
test conditions.

Detailed results are collected in separate files:

Operating system = DOS:
-----------------------
  6bdosfil.txt: Detection Rates of File Viruses (full/ITW) and File
                Malware, as well as packed objects
  6cdosboo.txt: Detection Rates of Boot Viruses (full/ITW)
  6ddosmac.txt: Detection Rates of Macro Viruses (full/ITW) and Macro
                Malware, as well as packed objects

Operating system = Windows 98:
------------------------------
  6ewin98.txt:  Detection Rates of File/Macro Viruses (full/ITW)

Operating system = Windows NT:
------------------------------
  6fwinnt.txt:  Detection Rates of File/Macro Viruses (full/ITW)

Much more information is available from the detailed tables, including
detection of viruses related to goat files, images and documents (see
the related chapters, as referred to in "1CONTENT.TXT").

------------------------ Overview Table A1: -----------------------
Table A1 contains detection rates for boot, file and macro viruses
under platform DOS for all products tested, including those where
several subsequent versions were received during the test period.
Table A1: Detection Rate of Boot/File/Macro Viruses in Full DOS Test:
=====================================================================
Scanner    Boot Viruses     File Viruses     Macro Viruses
Testbed     1237 100.0      18359 100.0       4525 100.0
-------------------------------------------------------
ANT         1159  93.7      17040  92.8       3888  85.9
AVA                         17906  97.5       4239  93.7
AVP         1237 100.0      18294  99.6       4522  99.9
CMD         1236  99.9      18271  99.5       4525 100.0
DRW                                           4452  98.4
FPR         1236  99.9      18288  99.6       4525 100.0
FSE         1233  99.7      18347  99.9       4521  99.9
INO         1213  98.1      17360  94.6       4512  99.7
NAV         1104  89.2      17131  93.3       4408  97.4
NOD         1237 100.0      18053  98.3       4500  99.4
NVC         1236  99.9      18192  99.1       4521  99.9
PAV         1235  99.8      18121  98.7       4522  99.9
SCN         1236  99.9      18338  99.9       4525 100.0
SWP         1230  99.4      18073  98.4       4453  98.4
VIT                          1398   7.6
VSP                                              0   0.0
------------------------------------------------------
Mean value:      91.3%            97.8%            91.9%
..in last test:  90.6%            91.1%            90.3%
------------------------------------------------------

Explanation of the different columns:
1) "Scanner" is the code name of the scanner as listed in file
   A2SCANLS.TXT (see also Table A0).
2) "File Viruses (%)" is the number of different file-infecting
   *viruses* in the virus collection used during the tests which were
   detected by the particular scanner. Their percentage of the full
   set of viruses in the collection used for the tests is given in
   parentheses. We define two viruses as being different if they
   differ in at least one bit in their non-modifiable parts. For
   variably encrypted viruses, the virus body has to be decrypted
   before the comparison is performed. For polymorphic viruses, the
   part of the virus which is modified during the replication process
   additionally has to be ignored.
3) "Boot Viruses (%)": as in (2), but for boot/MBR infectors.
4) "Macro Viruses (%)" is the number of different macro *viruses* from
   the collection used for the test that the scanner detects. This
   field is analogous to field 2, apart from listing macro viruses,
   not file-infecting viruses.
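The detection-rate columns just described are simple ratios: the number of distinct viruses a scanner reported, divided by the size of the frozen testbed. The following minimal sketch illustrates the arithmetic; the scanner codes and counts are taken from Table A1, but the data structures and helper are illustrative only, not VTC's actual evaluation tooling.

```python
# Illustrative computation of the percentages in Table A1 (macro virus
# column). Counts are the distinct viruses each scanner reported.

MACRO_TESTBED_SIZE = 4525  # macro virus testbed size in test "2000-04"

detected = {
    "SCN": 4525,  # distinct macro viruses found in the scanner protocol
    "ANT": 3888,
}

def detection_rate(found: int, testbed: int) -> float:
    """Percentage of distinct testbed viruses detected, one decimal."""
    return round(100.0 * found / testbed, 1)

for scanner, found in detected.items():
    rate = detection_rate(found, MACRO_TESTBED_SIZE)
    print(f"{scanner}: {found} ({rate}%)")
```

Running this reproduces the Table A1 entries, e.g. ANT's 3888 detected macro viruses correspond to 85.9%.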
------------------------ Overview Table AA: -----------------------
Table AA indicates, for the scanners tested (most current version each
time) in the last tests (1997-02/07 until 2000-04), how the results of
file and macro virus detection rates developed. The results of these
tests are given, and the difference DELTA between the last 2 tests is
calculated. A positive value indicates that the resp. scanner improved
in the related category, whereas a "-" sign indicates that the present
result is not as good as the previous one. Results of +-0.5% are
regarded as "statistical", as this may depend upon differences in
signature updates. In some cases, comparison is impossible due to
problems in the previous or present test.

Table AA: Comparison: File/Macro Virus Detection Rate in last 7 VTC
          tests under DOS:
=====================================================
      -------- File Virus Detection ----------  ------ Macro Virus Detection ----------
SCAN  9702 9707 9802 9810 9903 9909 0004 DELTA  9702 9707 9802 9810 9903 9909 0004 DELTA
NER     %    %    %    %    %    %    %    %      %    %    %    %    %    %    %    %
---------------------------------------------------------------------------------------
ALE   98.8 94.1 89.4  -    -    -    -    -     96.5 66.0 49.8  -    -    -    -    -
ANT   73.4 80.6 84.6 75.7  -    -   92.8  -     58.0 68.6 80.4 56.6  -    -   85.9  -
AVA   98.9 97.4 97.4 97.9 97.6 97.4 97.5  0.1   99.3 98.2 80.4 97.2 95.9 94.6 93.7 -0.9
AVG   79.2 85.3 84.9 87.6 87.1 86.6  -    -     25.2 71.0 27.1 81.6 82.5 96.6  -    -
AVK    -    -    -   90.0 75.0  -    -    -      -    -    -   99.7 99.6  -    -    -
AVP   98.5 98.4 99.3 99.7 99.7 99.8 99.6 -0.2   99.3 99.0 99.9 100% 99.8 100% 99.9 -0.1
CMD    -    -    -    -    -    -   99.5  -      -    -    -    -    -   99.5 100%  0.5
DRW   93.2 93.8 92.8 93.1 98.2 98.3  -    0.1   90.2 98.1 94.3 99.3 98.3  -   98.4  -
DSE   99.7 99.6 99.9 99.9 99.8  -    -    -     97.9 98.9 100% 100% 100%  -    -    -
FMA    -    -    -    -    -    -    -    -     98.6 98.2 99.9  -    -    -    -    -
FPR   90.7 89.0 96.0 95.5 98.7 99.2 99.6  0.4   43.4 36.1 99.9 99.8 99.8 99.7 100%  0.3
FSE    -    -   99.4 99.7 97.6 99.3 99.9  0.6    -    -   99.9 90.1 99.6 97.6 99.9  2.3
FWN    -    -    -    -    -    -    -          97.2 96.4 91.0 85.7  -    -    -    -
HMV    -    -    -    -    -    -    -    -      -   98.2 99.0 99.5  -    -    -
IBM   93.6 95.2 96.5  -    -    -    -    -     65.0 88.8 99.6  -    -    -    -    -
INO    -    -   92.0 93.5 98.1 94.7 94.6 -0.3    -    -   90.3 95.2 99.8 99.5 99.7  0.2
IRS    -   81.4 74.2  -   51.6  -    -    -      -   69.5 48.2  -   89.1  -    -    -
ITM    -   81.0 81.2 65.8 64.2  -    -    -      -   81.8 58.2 68.6 76.3  -    -    -
IVB    8.3  -    -    -   96.9  -    -    -      -    -    -    -    -    -    -    -
MR2    -    -    -    -    -   65.4  -    -      -    -    -    -    -   69.6  -    -
NAV   66.9 67.1 97.1 98.1 77.2 96.0 93.3 -2.7   80.7 86.4 98.7 99.8 99.7 98.6 97.4 -1.2
NOD    -    -    -   96.9  -   96.9 98.3  1.4    -    -    -    -   99.8 100% 99.4 -0.6
NVC   87.4 89.7 94.1 93.8 97.6  -   99.1  -     13.3 96.6 99.2 90.8  -   99.6 99.9  0.3
PAN    -    -   67.8  -    -    -    -    -      -    -   73.0  -    -    -    -    -
PAV    -   96.6 98.8  -   73.7 98.8 98.7 -0.1    -    -   93.7 100% 99.5 98.8 99.9  1.1
PCC    -    -    -    -    -    -    -    -      -   67.6  -    -    -    -    -    -
PCV   67.9  -    -    -    -    -    -    -      -    -    -    -    -    -    -    -
PRO    -    -    -    -   35.5  -    -    -      -    -    -    -   81.5  -    -    -
RAV    -    -    -   71.0  -    -    -    -      -    -    -   99.5 99.2  -    -    -
SCN   83.9 93.5 90.7 87.8 99.8 97.1 99.9  2.8   95.1 97.6 99.0 98.6 100% 100% 100%  0.0
SWP   95.9 94.5 96.8 98.4  -   99.0 98.4 -0.6   87.4 89.1 98.4 98.6  -   98.4 98.4  0.0
TBA   95.5 93.7 92.1 93.2  -    -    -    -     72.0 96.1 99.5 98.7  -    -    -    -
TSC    -    -   50.4 56.1 39.5 51.6  -   12.1    -    -   81.9 76.5 59.5 69.6  -    -
TNT   58.0  -    -    -    -    -    -    -      -    -    -    -    -    -    -    -
VDS    -   44.0 37.1  -    -    -    -    -     16.1  9.9  8.7  -    -    -    -    -
VET    -   64.9  -    -   65.3  -    -    -      -   94.0 97.3 97.5 97.6  -    -    -
VIT    -    -    -    -    -    -    7.6  -      -    -    -    -    -    -    -    -
VRX    -    -    -    -    -    -    -    -      -    -    -    -    -    -    -    -
VBS   43.1 56.6  -   35.5  -    -    -    -      -    -    -    -    -    -    -    -
VHU   19.3  -    -    -    -    -    -    -      -    -    -    -    -    -    -    -
VSA    -    -   56.9  -    -    -    -    -      -    -   80.6  -    -    -    -    -
VSP    -    -    -   76.1 71.7 79.6  -    7.9    -    -    -    -    -    -    -    -
VSW    -    -   56.9  -    -    -    -    -      -    -   83.0  -    -    -    -    -
VTR   45.5  -    -    -    -    -    -    -      6.3  -    -    -    -    -    -    -
XSC   59.5  -    -    -    -    -    -    -      -    -    -    -    -    -    -    -
---------------------------------------------------------------------------------------
Mean  74.2 84.8 84.4 85.4 81.2 90.6 98.3  1.7   69.6 80.9 83.8 89.6 93.6 88.2 98.0 +0.1
---------------------------------------------------------------------------------------

Explanation of new column:
5) DELTA (=Change) is the relative difference between the results of
   tests "1999-09" and "2000-04".

Remark: concerning acronyms of AV products tested in previous tests,
see the related test reports (on the VTC website) for related
information.

------------------------ Overview Table A2: -----------------------
Table A2 indicates how many viruses belonging to the "In-The-Wild"
subset of the full virus databases were found by the respective
scanner. The optimum measure is 100%. For detailed results, see
"6bdosfil.txt", "6cdosboo.txt" and "6ddosmac.txt".

Table A2: Detection Rate of Boot/File/Macro Viruses in DOS-ITW Test:
====================================================================
Scanner    Boot Viruses     File Viruses     Macro Viruses
Testbed       33 100.0         39 100.0         80 100.0
------------------------------------------------------
AVA           33 100.0         39 100.0         79  98.8
AVP           33 100.0         39 100.0         80 100.0
CMD           33 100.0         39 100.0         80 100.0
DRW           33 100.0         39 100.0         80 100.0
FPR           33 100.0         39 100.0         80 100.0
FSE           33 100.0         39 100.0         80 100.0
INO           33 100.0         39 100.0         80 100.0
NAV           33 100.0         39 100.0         79  98.8
NOD           33 100.0         39 100.0         80 100.0
NVC           33 100.0         39 100.0         80 100.0
PAV           33 100.0         39 100.0         80 100.0
SCN           33 100.0         39 100.0         80 100.0
SWP           33 100.0         39 100.0         80 100.0
VIT                            24  61.5          0   0.0
-------------------------------------------------------
Mean value:      100.0%           97.3%            92.7%
..last test:      98.3%           96.8%            92.2%
-------------------------------------------------------

------------------------ Overview Table A3: -----------------------
Table A3 indicates how many infected objects (files, boot/MBR images,
Word and EXCEL documents) were found by the respective scanner in the
full database. The optimum measure is 100%. For detailed results, see
"6bdosfil.txt", "6cdosboo.txt" and "6ddosmac.txt".
Table A3: Detection Rate of Infected DOS Objects (Files/Images/Documents):
==========================================================================
Scanner    Boot Images       Infected Files     Documents
Testbed     5379 100.0       135907 100.0      12918 100.0
-----------------------------------------------------
ANT         4782  88.9       128322  94.4      11262  87.2
AVA                          133199  98.0      12193  94.4
AVP         5370  99.8       135257  99.5      12906  99.9
CMD         5378 100.0       135698  99.8      12918 100.0
DRW                                            12759  98.8
FPR         5378 100.0       135750  99.9      12918 100.0
FSE         5367  99.8       135863 100.0      12904  99.9
INO         5152  95.8       128477  94.5      12890  99.8
NAV         4812  89.5       127800  94.0      12613  97.6
NOD         5379 100.0       134283  98.8      12857  99.5
NVC         5378 100.0       134822  99.2      12907  99.9
PAV         5367  99.8       134107  98.7      12906  99.9
SCN         5378 100.0       135636  99.8      12918 100.0
SWP         5345  99.4       134631  99.1      12775  98.9
VIT                            9169   6.7
VSP                                                0   0.0
------------------------------------------------------
Mean value:      97.8%              91.1%            98.3%
..last test:     89.3%              90.5%            90.5%
------------------------------------------------------

Explanation of the different columns (see also 1-4 at Table A1):
6) "Number (%) of objects infected with file viruses" is the number of
   *files* infected with file-infecting viruses from the test set which
   are detected by that particular scanner as being infected. The
   percentage of those files out of the full set of files is given in
   parentheses. We often have more than one infected file per virus,
   but not all viruses are represented by the same number of files, so
   this number does not give a good impression of the real detection
   rate of the scanner. It is included here only for completeness. Of
   course, it still *does* provide some information - usually, the
   better a scanner, the more files it will detect as infected.
7) "Number (%) of objects infected with boot viruses" is the number of
   infected boot sectors in the test set that the scanner detects as
   infected. This field is analogous to field 6, though it lists
   infected boot sectors, not files.
8) "Number of objects infected with macro viruses" is the number of
   infected documents in the test set that the scanner detects as
   infected. This field is analogous to field 6, though it lists
   infected documents, not files.

(*) Remark: concerning the mean value of macro virus detection: for a
    fair basis of comparison, the unusually low detection values of VIT
    and VSP were not counted.

------------------------ Overview Table A4: -----------------------
Table A4 provides information about the "quality" of detection.
Inconsistent identification means that some virus is identified with
different names in different objects belonging to the same virus.
Unreliable detection means that some virus is identified at least
once, though not in all objects infected with the related virus.
The optimum measures both for inconsistency and for unreliability are
0%. For detailed results, see "6bdosfil.txt", "6cdosboo.txt" and
"6ddosmac.txt".

Table A4: Consistency/Reliability of Virus Detection in Full DOS Test:
======================================================================
Scanner   Inconsistent Identification:    Unreliable Detection:
           Boot(%)  File(%)  Macro(%)    Boot(%)  File(%)  Macro(%)
---------------------------------------------------------------
ANT          26.2      7.6      2.8         6.9      2.6      0.7
AVA                    4.4      0.7                  0.7      0.4
AVP          71.7      2.2      1.7         0.6      0.6      0.0
CMD           0.6      0.3      1.2         0.0      0.0      0.0
DRW                             1.2                           0.4
FPR           0.2      0.1      0.0         0.0      0.0      0.0
FSE          14.1      2.5      1.7         0.5      0.1      0.0
INO           0.0      3.2      1.7         6.6      1.0      0.1
NAV          34.3      0.0      0.0         0.0      2.5      0.2
NOD           9.0     11.8      1.1         0.0      1.1      0.1
NVC           1.7      6.9      1.1         0.0      0.6      0.0
PAV          69.5      2.4      1.7         0.6      0.0      0.0
SCN           0.9      3.3      2.1         0.0      0.1      0.0
SWP           3.3      4.7      0.8         0.5      0.3      0.3
VIT                    0.1      2.1
VSP                             0.0                           0.0
--------------------------------------------------------------
Mean value:  19.3%     3.5%     1.3%        1.3%     0.8%     0.2%
..last test:  5.4%     4.5%     1.2%        3.3%     1.5%     0.3%
--------------------------------------------------------------

More explanation of the different columns (see 1-8 at tables A1+A3):
9)  The fields "Inconsistent Identification" measure the relative
    amount (%) of those viruses to which different names were assigned
    in different infected objects. This is, to some extent, a measure
    of how precise the identification capability of the resp. scanner
    is; the optimum measure is 0%.
10) The fields "Unreliable Detection" measure the relative amount (%)
    of viruses which were only partly detected. The definition of
    unreliable detection is that at least one sample of the virus *is*
    detected and at least one sample of the virus is *not* detected.
    In some sense, unreliable detection is more dangerous than a
    scanner missing the virus completely, because an unreliably
    detected virus may be a hidden source of continuous viral
    infections.

------------------------ Overview Table A5: -----------------------
Table A5 indicates whether some AntiVirus DOS products also detect
non-viral malware, esp. including virus generators, trojans and
intended (though not self-replicating) viruses. A published reference
exists only for macro malware, where VTC's "List of Known Macro
Malware" displays the current status of all known malicious threats.
For detailed results see "6bdosfil.txt" and "6ddosmac.txt".
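Before turning to the Table A5 figures, the two quality measures defined in (9) and (10) above can be made concrete with a small sketch. This is hypothetical illustration code, not VTC's evaluation tooling: for one scanner, each virus maps to the list of names the scanner reported for that virus's samples, with None marking a missed sample.

```python
# Illustrative computation of the Table A4 quality measures for one
# scanner. Input: virus -> list of per-sample results, where each
# result is the name the scanner reported, or None if the sample was
# missed.

def quality_measures(results: dict) -> tuple:
    """Return (% inconsistently identified, % unreliably detected)."""
    inconsistent = unreliable = 0
    for virus, names in results.items():
        reported = [n for n in names if n is not None]
        # Unreliable: at least one sample detected AND at least one missed.
        if reported and len(reported) < len(names):
            unreliable += 1
        # Inconsistent: the same virus reported under different names.
        if len(set(reported)) > 1:
            inconsistent += 1
    total = len(results)
    return (100.0 * inconsistent / total, 100.0 * unreliable / total)

sample = {
    "Virus.A": ["A", "A", "A"],      # consistent, reliable
    "Virus.B": ["B", "B.var", "B"],  # inconsistent naming
    "Virus.C": ["C", None, "C"],     # unreliably detected
    "Virus.D": [None, None],         # missed entirely (neither measure)
}
print(quality_measures(sample))  # -> (25.0, 25.0)
```

Note that a completely missed virus counts toward neither measure, which is exactly why, as point 10 argues, unreliable detection is the more insidious failure mode.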
Table A5: DOS Detection Rate of Non-Viral File & Macro Malware
==============================================================
Scanner    File Malware     Macro Malware
Testbed     4282 100.0        394 100.0
---------------------------------------
ANT                           284  72.1
AVA         3757  56.6        315  79.9
AVP         5588  84.2        386  98.0
CMD         6174  93.0        394 100.0
DRW                           316  80.2
FPR         6309  95.0        394 100.0
FSE         6346  95.6        384  97.5
INO         5069  76.4        378  95.9
NAV         4908  73.9        317  80.5
NOD         5228  78.7        381  96.7
NVC         4434  66.8        364  92.4
PAV         6131  92.3        391  99.2
SCN         6269  94.4        393  99.7
SWP         5138  77.4        377  95.7
VIT          378   5.7          8   2.0
VSP         3020  45.5          1   0.3
---------------------------------------
Mean value:       72.1%            80.6%
..last test:      71.1%            88.0%
---------------------------------------

------------------------ Overview Table AB: -----------------------
Table AB indicates, for the scanners tested (most current version each
time) in the last 4 tests (1998-10, when detection under Windows 98
was first tested, 1999-03, 1999-09 and 2000-04), how the results of
file and macro virus detection rates under Windows 98 developed.
(Concerning the format of this table, see Table AA.)
Table AB: Comparison: Zoo File/Macro Virus Detection Rate in last 4
          VTC tests under Windows 98:
============================================================
        ---- File Virus Detection ---   ---- Macro Virus Detection --
SCAN    98/10 99/03 99/09 00/04 DELTA   98/10 99/03 99/09 00/04 DELTA
NER       %     %     %     %     %       %     %     %     %     %
----------------------------------------------------------------------
ACU        -     -     -     -     -       -  97.6     -     -     -
AN5        -     -  87.2     -     -       -     -  89.3     -     -
ANT     91.3     -  86.5  92.8     -    84.3     -  89.5  90.2   1.7
ANY        -     -     -     -     -    70.7     -     -     -     -
AVA     96.6  97.6  97.2  97.5   0.3    96.7  95.9  93.9  94.3   0.4
AVG        -  87.3  87.0  85.4  -1.6       -  82.5  96.6  97.5   0.9
AVK     99.6  90.8  99.8  99.7  -0.1    99.6  99.6 100.0  99.9  -0.1
AVP     99.9  99.9  99.8  99.9   0.1   100.0  99.2 100.0  99.9  -0.1
AVX        -  74.2  75.7  77.4   1.7       -     -  98.7  94.5  -4.2
CMD        -     -  98.4  99.6   1.2       -     -  99.6 100.0   0.4
DSS/DSE 99.9  99.9     *  99.8     -   100.0 100.0     * 100.0     -
DRW/DWW    -  89.5  98.3  96.7  -1.6       -  98.3  98.8  98.4  -0.4
ESA        -     -     -  58.0     -       -     -     -  88.9     -
FPR/FMA    -  93.9  99.4  99.7   0.3    92.4  99.8  99.7 100.0   0.3
FPW        -     -  99.2  99.6   0.4       -     -  99.9 100.0   0.1
FSE     99.8 100.0  99.9 100.0   0.1   100.0 100.0 100.0 100.0   0.0
FWN        -     -     -     -          99.6  99.7  99.9  99.8  -0.1
HMV        -     -     -     -             -  99.5     -     -     -
IBM     92.8     *     *     *     -    94.5     *     *     *     -
INO     93.5  98.1  97.1  98.7   1.6    88.1  99.8  98.1  99.7   1.6
IRS     96.7  97.6     -     -     -    99.0  99.5     -     -     -
ITM        -  64.2     -     -     -       -     -     -     -     -
IVB        -     -     -     -     -    92.8  95.0     -     -     -
MKS        -     -     -     -     -       -     -     -  97.1     -
MR2        -     -  65.9     -     -       -     -  64.9     -     -
NAV        -  96.8  97.6  96.8  -0.8    95.3  99.7  98.7  98.0  -0.7
NOD        -  97.6  98.3  98.3   0.0       -  99.8 100.0  99.4  -0.6
NV5        -     -  99.0     -     -       -     -  99.6     -     -
NVC     93.6  97.0  99.0  99.1   0.1       -  99.1  99.6  99.9   0.3
PAV     98.4  99.9  99.6 100.0   0.4    99.5  99.5  86.7  99.9  13.2
PCC        -  81.2     -     -     -       -  98.0     -     -     -
PER        -     -     -     -     -       -     -     -  53.7     -
PRO        -  37.3  39.8  44.6   4.8       -  58.0  61.9  67.4   5.5
QHL        -     -     -     -     -       -     -     -   0.0     -
RAV     84.9     -  86.9  86.5  -0.4    92.2     -  98.1  97.9  -0.2
SCN     86.6  99.8  99.7 100.0   0.3    97.7 100.0  99.8 100.0   0.2
SWP     98.4     -  99.0  99.6   0.6    98.6     -  98.5  98.6   0.1
TBA     92.6     *     *     *     -    98.7     *     *     *     -
TSC        -  55.3  53.8     -     -       -  76.5  64.9     -     -
VBS        -     -     -     -     -    41.5     -     -     -     -
VBW        -  26.5     -     -     -    93.4     -     -     -     -
VET        -  66.3     *     *     *       -  97.6     *     *     -
VSP        -  86.4  79.7  78.1  -1.6       -   0.4   0.3     -     -
-----------------------------------------------------------------------
Mean    95.0% 84.2% 89.7% 91.6%  0.3%   92.1% 90.3% 93.5% 95.0%  0.9%
-----------------------------------------------------------------------

Remark: those products marked with "*" are no longer supported but
have been migrated into other products which are included in the
tests.

------------------------ Overview Table A6: -----------------------
Table A6 summarizes the results of the current tests under Windows 98.
Tests were performed for detection of file and macro viruses. In
addition, detection of file- and macro-related malware was also
tested. For detailed results see "6ewin98.txt".

Table A6: Detection Rate of File/Macro Viruses and Malware in Zoo and
          ITW Tests for Windows 98:
=======================================================
Scanner   File Viruses    File Malware    Macro Viruses   Macro Malware
Testbed   18359 100.0      4282 100.0      4525 100.0      260 100.0
----------------------------------------------------------------------
AVA       17905  97.5      2418  56.5      4266  94.3      212  81.5
AVG       15684  85.4      2259  52.8      4410  97.5      203  78.1
AVK       18295  99.7      3901  91.1      4522  99.9      257  98.8
AVP       18349  99.9      3920  91.5      4522  99.9      252  96.9
AVX       14211  77.4      2512  58.7      4276  94.5      245  94.2
CLE                         206   4.8                        0   0.0
CMD       18287  99.6      4080  95.3      4525 100.0      260 100.0
DRW       17751  96.7      2861  66.8      4453  98.4      204  78.5
DSE       18330  99.8      4000  93.4      4525 100.0      259  99.6
ESA       10647  58.0      1397  32.6      4022  88.9      148  56.9
FPR       18295  99.7      4100  95.7      4525 100.0      260 100.0
FPW       18287  99.6      3916  91.5      4525 100.0      260 100.0
FSE       18350 100.0      4226  98.7      4525 100.0      260 100.0
FWN                                        4516  99.8      252  96.9
INO       18114  98.7      3352  78.3      4513  99.7      253  97.3
MKS                        1586  37.0      4393  97.1      225  86.5
MR2       14270  77.7      2157  50.4      2688  59.4      122  46.9
NAV       17768  96.8      3273  76.4      4435  98.0      214  82.3
NOD       18053  98.3      3323  77.6      4500  99.4      250  96.2
NVC       18193  99.1      2777  64.9      4521  99.9      248  95.4
PAV       18351 100.0      3923  91.6      4522  99.9      257  98.8
PER                         445  10.4      2429  53.7      112  43.1
PRO        8187  44.6       583  13.6      3048  67.4       64  24.6
QHL                                           0   0.0        0   0.0
RAV       15881  86.5      1994  46.6      4428  97.9      248  95.4
SCN       18352 100.0      4039  94.3      4525 100.0      260 100.0
SWP       18278  99.6      3352  78.3      4463  98.6      247  95.0
VIT                                          23   0.5        9   3.5
VSP       14333  78.1      2169  50.7         0   0.0        1   0.4
---------------------------------------------------------------------
Mean value:      91.0%           63.0%           86.8%          74.0%
..last test:     89.9%           68.3%           93.3%          87.4%
---------------------------------------------------------------------

------------------------ Overview Table AC: -----------------------
Table AC indicates, for the scanners tested (most current version each
time) in the last 6 tests (since 1997-07, when detection under W-NT
was first tested, until 2000-04), how the results of file and macro
virus detection rates under Windows NT developed. (Concerning the
format of this table, see Table AA.)

Table AC: Comparison: File/Macro Virus Detection Rate in last 6 VTC
          tests under Windows-NT:
===========================================================
Scan    ====== File Virus Detection =======    ===== Macro Virus Detection =======
ner     9707 9802 9810 9903 9909 0004 Delta    9707 9802 9810 9903 9909 0004 Delta
------------------------------------------------------------------------------
ANT     88.9 69.2 91.3  -   87.2 92.8 12.6     92.2  -   85.7  -   89.3 90.2  0.9
ANY      -    -   69.7  -    -    -    -        -    -   70.5  -    -    -    -
ATD      -    -    -    -    -   100%  -        -    -    -    -    -   99.9  -
AVA      -   97.4 96.6 97.1 97.4 97.2 -0.2      -   91.9 97.2 95.2 93.3 94.3  1.0
AVG      -    -    -   87.3 87.0 85.4 -1.6      -    -    -   82.5 96.6 97.5  0.9
AVK      -    -   99.6 90.2 99.8 99.7 -0.1      -    -   99.6 99.6 100% 99.9 -0.1
AVP      -    -   83.7 99.9 99.8 99.9  0.1      -    -   100% 99.2 100% 99.9 -0.1
AVX      -    -    -   74.2 75.2 80.4  5.2      -    -    -   98.9 98.7 94.5 -4.2
AW       -   56.4  -    -    -    -    -        -   61.0  -    -    -    -    -
CMD      -    -    -    -    -   99.6  -        -    -    -    -    -   100%  -
DRW/DWW  -    -   93.3 98.3 98.3  0.0           -    -    -   98.3 98.8 98.4 -0.4
DSS/E   99.6 99.7 99.9 99.3  *    -    -       99.0 100% 100% 100%  *    -    -
ESA      -    -    -    -    -   58.0  -        -    -    -    -    -   88.9  -
FPR/FMA  -   96.1  -   98.7 99.4  -    -        -   99.9 99.8 99.8 99.7  -    -
FPW      -    -    -    -    -   99.6  -        -    -    -    -   99.7 100%  0.3
FSE      -   85.3 99.8 100% 99.9 100%  0.1      -    -   99.9 100% 100% 100%  0.0
FWN      -    -    -    -    -    -    -        -    -   99.6 99.7  -   99.9  -
HMV      -    -    -    -    -    -    -        -    -   99.0 99.5  -    -    -
IBM     95.2 95.2 77.2  *    *    *    *       92.9 92.6 98.6  *    *    *    *
INO      -   92.8  -   98.1 98.0 98.7  0.7      -   89.7  -   99.8 99.7 99.7  0.0
IRS      -   96.3  -   97.6  -    -    -        -   99.1  -   99.5  -    -    -
IVB      -    -    -    -    -    -    -        -    -   92.8 95.0  -    -    -
MKS      -    -    -    -    -   78.0  -        -    -    -    -    -   97.1  -
MR2      -    -    -    -   61.9  -    -        -    -    -    -   69.6  -    -
NAV     86.5 97.1  -   98.0 97.6 96.8 -0.8     95.6 98.7 99.9 99.7 98.7 98.0 -0.7
NOD      -    -    -   97.6 98.2 98.3  0.1      -    -    -   99.8 100% 99.4 -0.6
NVC     89.6 93.8 93.6 96.4  -   99.1  -       96.6 99.2  -   98.9 98.9 99.9  1.0
NVN      -    -    -    -   99.0  -    -        -    -    -    -   99.5  -    -
PAV     97.7 98.7 98.4 97.2 99.6 100%  0.4     93.5 98.8 99.5 99.4 99.7 99.9  0.2
PRO      -    -    -   37.3 42.4 45.6  3.2      -    -    -   58.0 61.9 67.4  5.5
QHL      -    -    -    -    -    -    -        -    -    -    -    -    0.0  -
RAV      -   81.6 84.9 85.5  -   88.0  -        -   98.9 99.5 99.2  -   97.9  -
RA7      -    -    -   89.3  -    -    -        -    -    -   99.2  -    -    -
PCC     63.1  -    -    -    -    -    -        -   94.8  -    -    -    -    -
PER      -    -    -    -    -    -    -        -   91.0  -    -    -    -    -
SCN     94.2 91.6 71.4 99.1 99.8 99.8  0.0     97.6 99.1 97.7 100% 100% 100%  0.0
SWP     94.5 96.8 98.4  -   99.0 99.6  0.6     89.1 98.4 97.5  -   98.4 98.6  0.2
TBA      -   93.8 92.6  *    *    *             -   96.1  -   98.7  *    *    *
TNT      -    -    -    *    *    *             -    -    -   44.4  *    *    *
VET     64.9  -    -   65.4  *    *             -    -   94.0  -   94.9  *    *
VSA      -   56.7  -    -    -    -             -    -   84.4  -    -    -    -
VSP      -    -    -   87.0 69.8 78.1  8.3      -    -    -   86.7  0.3  0.0 -0.3
------------------------------------------------------------------------------
Mean:   87.4 88.1 89.0 89.2 90.0 91.0  1.8%    94.7 95.9 91.6 95.3 95.1 96.5  0.2%
------------------------------------------------------------------------------

------------------------ Overview Table A7: -----------------------
Table A7 summarizes the results of the tests under Windows NT. Tests
were performed for detection of file and macro viruses. In addition,
detection of file- and macro-related malware was also tested. Those
scanners which were submitted as being identical for Windows 98 and
Windows NT were only tested under Windows 98 (see Table A6). For
detailed results see "6fwinnt.txt".

Table A7: Detection Rate of File/Macro Viruses and Malware in Zoo and
          ITW Tests for Windows NT:
=======================================================
Scanner   File Viruses    File Malware    Macro Viruses   Macro Malware
Testbed   18359 100.0      4282 100.0      4525 100.0      260 100.0
----------------------------------------------------------------------
ANT       17040  92.8      2794  65.2      4081  90.2      181  69.6
ATD       18351 100.0      3923  91.6      4522  99.9      257  98.8
AVG       15681  85.4      2259  52.8      4410  97.5      203  78.1
AVK       18295  99.7      3901  91.1      4522  99.9      257  98.8
AVP       18349  99.9      3921  91.6      4522  99.9      252  96.9
AVS       17840  97.2      2384  55.7      4266  94.3      212  81.5
AVX       14756  80.4      2520  58.9      4276  94.5      245  94.2
CLE                         205   4.8                        0   0.0
CMD       18287  99.6      4080  95.3      4525 100.0      260 100.0
DRW       18054  98.3      2861  66.8      4453  98.4      204  78.5
ESA       10647  58.0      1397  32.6      4022  88.9      148  56.9
FPW       18287  99.6      3916  91.5      4525 100.0      260 100.0
FSE       18350 100.0      4226  98.7      4525 100.0      260 100.0
FWN                                        4522  99.9      255  98.1
INO       18114  98.7      3352  78.3      4513  99.7      253  97.3
MKS       14311  78.0      1586  37.0      4393  97.1      226  86.9
MR2       14270  77.7      2157  50.4      2688  59.4      122  46.9
NAV       17768  96.8      3273  76.4      4435  98.0      214  82.3
NOD       18053  98.3      3323  77.6      4500  99.4      250  96.2
NVC       18193  99.1      2777  64.9      4521  99.9      248  95.4
PAV       18351 100.0      3923  91.6      4522  99.9      257  98.8
PRO        8368  45.6       583  13.6      3048  67.4       64  24.6
QHL                                           0   0.0        0   0.0
RAV       16147  88.0      1994  46.6      4428  97.9      248  95.4
SCN       18330  99.8      3996  93.3      4525 100.0      259  99.6
SWP       18278  99.6      3352  78.3      4463  98.6      247  95.0
VSP       14333  78.1      2167  50.6         0   0.0        1   0.4
----------------------------------------------------------------------
Mean value:      90.4%           66.2%           87.7%          76.7%
..last test:     90.8%           72.4%           95.5%          90.1% (*)
----------------------------------------------------------------------

(*) Remark: concerning the mean values of macro virus/malware
    detection: for a fair basis of comparison, the unusually low
    detection values of VSP were not counted (mean values including
    VSP: macro virus detection: 91.4%; macro malware detection:
    86.2%).
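The footnoted mean computation, i.e. a column mean taken over all products but optionally leaving out named outliers such as VSP whose near-zero macro results would distort the comparison, can be sketched as follows. The helper and the small subset of Table A7 macro-virus rates used in the example are illustrative, not VTC's actual tooling.

```python
# Illustrative sketch of a column mean with named outliers excluded,
# as done for the Table A7 macro virus/malware means.

def mean_rate(rates: dict, exclude=frozenset()) -> float:
    """Mean of the rates whose scanner code is not in `exclude`."""
    vals = [r for name, r in rates.items() if name not in exclude]
    return round(sum(vals) / len(vals), 1)

# Subset of the Table A7 macro virus detection column:
rates = {"SCN": 100.0, "FSE": 100.0, "PRO": 67.4, "VSP": 0.0}

print(mean_rate(rates))                   # mean over all products
print(mean_rate(rates, exclude={"VSP"}))  # mean with VSP left out
```

Reporting both variants, as the remark above does, lets the reader see how strongly a single outlier product pulls the column mean down.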