============================================================
File 0XECPRE.TXT:
PRE-RELEASED EXECUTIVE SUMMARY: Macro Virus/Malware Results
Released: February 26, 2000     Updated: March 1, 2000
Virus Test Center (VTC), University of Hamburg
AntiMalware Product Millennium Test "2000-02"
============================================================
[Formatted with non-proportional font (Courier), 72 columns]

Content of this text:
=====================
 0. Foreword for pre-released "Millennium" test report
 1. Background of this test; Malware Threats
 2. Products included in VTC "Millennium Test 2000-02"
---------------------------------------------------------
 3. Summary #1: Zoo Macro Virus detection quality of DOS AV products:
    #1.1) !3! scanners are "perfect" as they detect ALL zoo
          viruses: CMD, FPR and SCN! (last test: 3).
    #1.2) The mean detection rate of the (14) scanners tested
          has reached an impressive 97.8%.
---------------------------------------------------------
 4. Summary #2: ITW Macro Virus detection quality of DOS AV products:
    #2.1) !11! scanners are "perfect" as they detect ALL ITW
          viruses: AVP, CMD, DRW, FPR, FSE, INO, NOD, NVC,
          PAV, SCN and SWP.
          (In the last test, 10 scanners were rated "perfect".)
---------------------------------------------------------
 5. Summary #3: Zoo Macro Virus detection quality of W-NT AV products:
    #3.1) !4! "perfect" Windows-NT zoo scanners:
          CMD, FPW, FSE and SCN.
    #3.2) Plus 9 "excellent" scanners:
          ATD, AVK, AVP, FWN, NVC, PAV, INO, NOD, SWP.
---------------------------------------------------------
 6. Summary #4: ITW Macro Virus detection quality of W-NT AV products:
    #4.1) !19! "perfect" Windows-NT ITW macro scanners:
          ATD, AVA, AVG, AVK, AVP, AVX, CMD, DRW, FPR, FSE,
          FWN, INO, NAV, NOD, NVC, PAV, RAV, SCN and SWP.
---------------------------------------------------------
 7. Summary #5: Detection of packed viral objects under W-NT:
    #5.1) The best scanners detect packed macro viruses for ALL
          4 methods: ATD, AVK, AVP, AVX, FSE and PAV.
    #5.2) But there is still need for improvement.
---------------------------------------------------------
 8. Summary #6: False Positive avoidance (DOS/Win-NT):
    #6.1) FP-avoiding "perfect" DOS/W-NT products (4):
          AVA, NOD, SCN and SWP.
    #6.2) FP-avoiding perfect DOS scanners (7):
          ANT, AVA, FSE, NOD, PAV, SCN and SWP.
          FP-avoiding perfect W-NT scanners (7):
          AVA, AVG, AVK, NOD, QHL, SCN and SWP.
----------------------------------------------------------
 9. Summary #7: Detection of Macro Malware (DOS/Win-NT):
    #7.1) "Perfect" AntiMalware products for DOS/W-NT (2):
          CMD, FPR/FPW.
    #7.2) "Excellent" AntiMalware products for DOS or W-NT:
          FSE, SCN, PAV, AVP, NOD, NVC, INO, SWP.
----------------------------------------------------------
10. Availability of pre-released test results
11. Copyright, License, and Disclaimer

Tables:
=======
Table ES0: Development of viral/malware threats
Table ES2: List of AV products in test 2000-02
Table ES3: Development of DOS scanners from 1997-02 to 2000-02
Table ES4: Development of W-NT scanners from 1997-07 to 2000-02
1. Background of this test: Malware Threats:
============================================

Malicious software (malware), including viruses (= self-replicating
malware), Trojan horses (= pure payload without self-replication),
virus droppers and network malware (e.g. worms and hostile applets),
is regarded as a serious threat to PC users, esp. when connected to
intranetworks and the Internet.

The development of malicious software can well be studied in the
growth of the VTC testbeds. The following table summarizes, for
previous VTC tests and the current one (indicated by their year and
month of publication), the size of the virus and malware (full =
"zoo") databases, giving in each case the number of different
viruses and the number of infected objects (instantiations) of a
virus or malware, and bearing in mind that some revisions of the
testbeds were made:

Table ES0: Development of threats as present in VTC test databases:
===================================================================
          = File viruses =  = Boot viruses =  = Macro viruses =  == Malware ==
Test#     Number  Infected  Number  Infected  Number  Infected   Number Malware
          viruses objects   viruses objects   viruses objects    file   macro
-------------------------------------------------------------------------------
1997-07:  12,826   83,910      938    3,387      617    2,036      213      72
1998-03:  14,596  106,470    1,071    4,464    1,548    4,436      323     459
1998-10:  13,993  112,038      881    4,804    2,159    9,033    3,300     191
1999-03:  17,148  128,534    1,197    4,746    2,875    7,765    3,853     200
             + 5  146,640   (VKit + 4*Poly)
1999-09:  17,561  132,576    1,237    5,286    3,546    9,731    6,217     329
             + 7  166,640   (VKit + 6*Poly)
2000-02:  ******  *******    *****    *****    4,525   12,918    *****     394
-------------------------------------------------------------------------------
****** Details to be included in the final report.
-------------------------------------------------------------------------------
Remark: Before test 1998-10, an ad-hoc cleaning operation was
applied to remove samples where virality could not be proved easily.
Since test 1999-03, separate tests are performed to evaluate the
detection rates of VKit-generated and selected polymorphic file
viruses.

With more than 5,000 new viruses and several hundred Trojan horses
deployed annually, many of which are available from the Internet,
and in the absence of inherent protection against such dysfunctional
software, users must rely on AntiMalware and esp. AntiVirus software
to detect and eradicate - where possible - such malicious software.
Hence, the detection quality of AntiMalware and esp. AntiVirus
products becomes an essential prerequisite for protecting customer
productivity and data.

The Virus Test Center (VTC) at Hamburg University's Faculty for
Informatics performs regular tests of AntiMalware and esp. AntiVirus
software. VTC recently tested current versions of on-demand scanners
for their ability to identify PC viruses. Tests were performed on
VTC's malware databases, which were frozen at their status as of
*** September 30, 1999 *** to give AV/AM producers a fair chance to
supply updates within the 8-week submission period. Scanners were
submitted (downloaded) on November 30, 1999.

The main test goal was to determine detection rates, reliability
(= consistency) of malware identification, and reliability of
detection rates for submitted or publicly available scanners.
Special tests were devoted to the detection of multiple generations
of 6 polymorphic file viruses (Maltese.Amoeba, Mte.Encroacher.B,
Natas, Tremor, Tequila and One-Half) and of viruses generated with
the "VKit" file virus generator.
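To make the two figures just named concrete - the detection rate
over different viruses, and the consistency with which ALL
instantiations of a virus are flagged - the following minimal sketch
computes both over a testbed. It is an illustration only, not VTC's
actual tooling: the directory layout (one subdirectory per virus,
one file per infected object) and the set of scanner-flagged paths
are assumptions.

   # Minimal sketch (illustration only): compute detection rate and
   # consistency over a testbed laid out as one directory per virus.
   from pathlib import Path

   def rates(testbed: Path, flagged: set) -> tuple:
       """flagged: set of Paths the scanner reported as infected."""
       viruses = [d for d in testbed.iterdir() if d.is_dir()]
       hit = consistent = 0
       for virus_dir in viruses:
           samples = [p for p in virus_dir.iterdir() if p.is_file()]
           n = sum(1 for p in samples if p in flagged)
           hit += (n > 0)                      # virus detected at all
           consistent += (n == len(samples))   # all instantiations found
       return (100.0 * hit / len(viruses),
               100.0 * consistent / len(viruses))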
It was also tested whether (and to what degree) viruses packed with
4 popular compression methods (PKZIP, ARJ, LHA and RAR) would be
detected by scanners. Moreover, avoidance of False Positive alarms
on "clean" (= non-viral and non-malicious) objects was also
determined. Finally, a set of selected non-viral file and macro
malware (droppers, Trojan horses, intended viruses, etc.) was used
to determine whether and to what degree AntiVirus products may be
used for protecting customers against Trojan horses and other forms
of malware.

VTC maintains, in close and secure connection with AV experts
worldwide, collections of boot, file and macro viruses as well as
related malware ("zoo") which have been reported to VTC or AV labs.
Moreover, following the list of "In-The-Wild Viruses" (published on
a regular basis by Wildlist.org), a collection of viruses reported
to be broadly visible is maintained to allow for comparison with
other tests; presently, this list does not report ITW malware.

2. Products included in VTC "Millennium Test 2000-02"
=====================================================

For test "2000-02", the following *** 31 *** AntiVirus products
(addressed in subsequent tables by a 3-letter abbreviation) under
DOS, Windows-98 or Windows-NT were tested:

Table ES2: List of AV products in test "2000-02"
================================================
Abbreviation/Product/Version Tested               under Platform
-------------------------------------------------------------------
ANT = AntiVir 5.21 (VDF 27.11.1999) (CD)          for DOS, W-98, W-NT
ATD = AntiDote 1.50 (Vintage Sol) (http)          for W-98, W-NT
AVA = AVAST v7.70 (34 Nov.99) (CD/diskettes)      for DOS/W3.x
    = AVAST32 v3.0 (197 Nov.99) (CD/diskettes)    for W-98, W-NT
AVG = AVG 6.0 (CD 99-09/ftp)                      for W-98, W-NT
AVK = AVK 9 (November 30, 1999) (CD)              for W-98, W-NT
AVP = AVP (email)                                 for DOS, W-98, W-NT
AVX = AVX 5.0 (ftp)                               for W-98, W-NT
CLE = Cleaner 3.0 (http)                          for W-98, W-NT
CMD = Command Software AV (ftp)                   for DOS, W-98, W-NT
DRW = DrWeb (email)                               for DOS, W-98, W-NT
DSE = Dr Solomon Emergency AV (CD)                for DOS, W-98
ESA = eSafe 21 (Aladdin) (http)                   for W-98, W-NT
FPR = FProt 3.06c (ftp)                           for DOS, W-98
FPW = FProt FP-WIN 3.05 (ftp)                     for DOS, W-98, W-NT
FSE = FSAV (ftp)                                  for DOS, W-98, W-NT
FWN = FWin32 (email)                              for W-98, W-NT
INO = Inoculan 4.53 (11/24/99) (CD)               for DOS, W-98, W-NT
MKS = MKS-vir (Polonia) (http)                    for W-98
NAV = NAV/Sig Nov.1999 (http)                     for DOS, W-98, W-NT
NOD = NOD 32 V.129 (email)                        for DOS, W-98, W-NT
NVC = NVC (email)                                 for DOS, W-98, W-NT
PAV = PAV 2000 (November 30, 1999) (CD)           for DOS, W-98, W-NT
PER = PER AntiVirus (Peru) (http)                 for W-98
PRO = Protector (http)                            for W-98, W-NT
QHL = Quickheal Version 5.21 (http)               for W-98, W-NT
RAV = RAV 7.6 (email)                             for W-98, W-NT
SCN = NAI VirusScan (CD/http)                     for DOS, W-98, W-NT
SWP = Sweep (http)                                for DOS, W-98, W-NT
VIT = VirIT Explorer Lite 2.6.26 (http)           for DOS, W-98
-------------------------------------------------------------------

Details of products (versions etc.) will be described in
A2SCANLS.TXT. Problems observed during tests (which may have
adversely influenced published results) will be described in
8PROBLMS.TXT (cf. final report).

In general, AV products were either submitted or, when test versions
were available on the Internet, downloaded from the respective
ftp/http sites. A few scanners were not available, either in general
(TNT) or for this test; some of these were announced to participate
in some future test. Finally, very few AV producers answered VTC's
bids for submitting scanners - with electronic silence.
The following paragraphs summarize those findings related to the
detection of macro viruses and non-replicating malware. For details,
see files 6DDOSPRE.TXT, 6FW98PRE.TXT and 6GWNTPRE.TXT.

3. Summary #1: Development of DOS scanner detection rates:
==========================================================

Concerning the performance of DOS scanners, a comparison of virus
detection results in tests from "1997-02" until the present test
"2000-02" shows how scanners behave and how manufacturers succeed in
adapting their products to the growing threat of new viruses. The
following table lists the development of detection rates of scanners
(most recent versions in each test), and it calculates changes ("+"
indicating improvement) in detection rates.

For reasons of fairness, it must be noted that improvement is much
more difficult to achieve for those products which have already
reached a very high level of detection quality (say: more than 95%)
than for those products with lower detection rates. Some products
have incorporated new engines (esp. for 32-bit platforms) and
included formerly separate scanners (e.g. for macro viruses), which
led to improved performance. Generally, changes on the order of
about +/- 1.5% are less significant, as this is about the growth
rate of new viruses per month, so detection depends strongly upon
whether some virus is reported (and analysed and included) just
before a new update is delivered.

Table ES3 lists developments for the detection of file and macro
viruses; for details as well as for boot virus detection, see the
result tables (6b-6g) as well as the overview (6asumov.txt) and the
evaluation (7eval.txt).

Table ES3: Improvement of DOS scanners from 1997-02 to 2000-02:
===============================================================
     ------- File Virus Detection -------  -------- Macro Virus Detection ---------
SCAN 9702 9707 9802 9810 9903 9909 DELTA   9702 9707 9802 9810 9903 9909 0002 DELTA
NER    %    %    %    %    %    %    %       %    %    %    %    %    %    %     %
-----------------------------------------------------------------------------------
ALE  98.8 94.1 89.4    -    -    -    -    96.5 66.0 49.8    -    -    -    -     -
ANT  73.4 80.6 84.6 75.7    -    -    -    58.0 68.6 80.4 56.6    -    - 85.9     -
AVA  98.9 97.4 97.4 97.9 97.6 97.4 -0.2    99.3 98.2 80.4 97.2 95.9 94.6 93.7  -0.9
AVG  79.2 85.3 84.9 87.6 87.1 86.6 -0.5    25.2 71.0 27.1 81.6 82.5 96.6    -     -
AVK     -    -    - 90.0 75.0    -    -       -    - 99.7 99.6    -    -    -     -
AVP  98.5 98.4 99.3 99.7 99.7 99.8  0.1    99.3 99.0 99.9 100% 99.8 100% 99.9  -0.1
CMD     -    -    -    -    -    -    -       -    -    -    -    - 99.5 100%   0.5
DRW  93.2 93.8 92.8 93.1 98.2 98.3  0.1    90.2 98.1 94.3 99.3 98.3    - 98.4     -
DSS  99.7 99.6 99.9 99.9 99.8    -    -    97.9 98.9 100% 100% 100%    -    -     -
FMA     -    -    -    -    -    -    -    98.6 98.2 99.9    -    -    -    -     -
FPR  90.7 89.0 96.0 95.5 98.7 99.2  0.5    43.4 36.1 99.9 99.8 99.8 99.7 100%   0.3
FSE     -    - 99.4 99.7 97.6 99.3  1.7       -    - 99.9 90.1 99.6 97.6 99.9   2.3
FWN     -    -    -    -    -    -    -    97.2 96.4 91.0 85.7    -    -    -     -
HMV     -    -    -    -    -    -    -       -    - 98.2 99.0 99.5    -    -     -
IBM  93.6 95.2 96.5    -    -    -    -    65.0 88.8 99.6    -    -    -    -     -
INO     -    - 92.0 93.5 98.1 94.7 -3.4       -    - 90.3 95.2 99.8 99.5 99.7   0.2
IRS     - 81.4 74.2    - 51.6    -    -       - 69.5 48.2    - 89.1    -    -     -
ITM     - 81.0 81.2 65.8 64.2    -    -       - 81.8 58.2 68.6 76.3    -    -     -
IVB   8.3    -    -    - 96.9    -    -       -    -    -    -    -    -    -     -
MR2     -    -    -    -    - 65.4    -       -    -    -    -    - 69.6    -     -
NAV  66.9 67.1 97.1 98.1 77.2 96.0 18.8    80.7 86.4 98.7 99.8 99.7 98.6 97.4  -1.2
NOD     -    -    - 96.9    - 96.9    -       -    -    -    - 99.8 100% 99.4  -0.6
NVC  87.4 89.7 94.1 93.8 97.6    -    -    13.3 96.6 99.2 90.8    - 99.6 99.9   0.3
PAN     -    - 67.8    -    -    -    -       -    - 73.0    -    -    -    -     -
PAV     - 96.6 98.8    - 73.7 98.8 25.1       -    - 93.7 100% 99.5 98.8 99.9   1.1
PCC     -    -    -    -    -    -    -       - 67.6    -    -    -    -    -     -
PCV  67.9    -    -    -    -    -    -       -    -    -    -    -    -    -     -
PRO     -    -    -    - 35.5    -    -       -    -    -    - 81.5    -    -     -
RAV     -    -    - 71.0    -    -    -       -    -    - 99.5 99.2    -    -     -
SCN  83.9 93.5 90.7 87.8 99.8 97.1 -2.7    95.1 97.6 99.0 98.6 100% 100% 100%   0.0
SWP  95.9 94.5 96.8 98.4    - 99.0    -    87.4 89.1 98.4 98.6    - 98.4 98.4   0.0
TBA  95.5 93.7 92.1 93.2    -    -    -    72.0 96.1 99.5 98.7    -    -    -     -
TSC     -    - 50.4 56.1 39.5 51.6 12.1       -    - 81.9 76.5 59.5 69.6    -     -
TNT  58.0    -    -    -    -    -    -       -    -    -    -    -    -    -     -
VDS     - 44.0 37.1    -    -    -    -    16.1  9.9  8.7    -    -    -    -     -
VET     - 64.9    -    - 65.3    -    -       - 94.0 97.3 97.5 97.6    -    -     -
VRX     -    -    -    -    -    -    -       -    -    -    -    -    -    -     -
VBS  43.1 56.6    - 35.5    -    -    -       -    -    -    -    -    -    -     -
VHU  19.3    -    -    -    -    -    -       -    -    -    -    -    -    -     -
VSA     -    - 56.9    -    -    -    -       -    - 80.6    -    -    -    -     -
VSP     -    -    - 76.1 71.7 79.6  7.9       -    -    -    -    -    -    -     -
VSW     -    - 56.9    -    -    -    -       -    - 83.0    -    -    -    -     -
VTR  45.5    -    -    -    -    -    -     6.3    -    -    -    -    -    -     -
XSC  59.5    -    -    -    -    -    -       -    -    -    -    -    -    -     -
-----------------------------------------------------------------------------------
Mean 74.2 84.8 84.4 85.4 81.2 90.6 +5.0%   69.6 80.9 83.8 89.6 93.6 88.2 98.0  +0.1
-----------------------------------------------------------------------------------
Remark: for abbreviations and details of products present only in
previous tests, see the related parts of the VTC test report.
Results for file virus detection will be published in the final
report.

Concerning the rating of DOS scanners, the following grid is applied
to classify scanners (a small illustrative sketch follows at the end
of this section):

 - detection rate = 100.0%  : scanner is graded "perfect"
 - detection rate above 95% : scanner is graded "excellent"
 - detection rate above 90% : scanner is graded "very good"
 - detection rate of 80-90% : scanner is graded "good enough"
 - detection rate of 70-80% : scanner is graded "not good enough"
 - detection rate of 60-70% : scanner is graded "rather bad"
 - detection rate of 50-60% : scanner is graded "very bad"
 - detection rate below 50% : scanner is graded "useless"

To assess an "overall grade" (including ... macro virus detection),
the lowest of the related results is used to classify the respective
scanners. If several scanners of the same producer have been tested,
grading is applied to the most recent version (which is, in most
cases, the version with the highest detection rates). Only scanners
for which all tests were completed are considered; here, the most
recent version with completed tests was selected. (For problems in
the test, see 8problms.txt.)

The following list indicates those scanners graded into one of the
upper three categories, with macro virus detection rates on unpacked
objects; "perfect" means zoo virus detection at a rate of 100%:

Grading the Detection of zoo macro viruses:
-------------------------------------------
"Perfect" DOS macro scanners:      CMD (100.0%) (+)
                                   FPR (100.0%) (+)
                                   SCN (100.0%) (=)
"Excellent" DOS macro scanners:    AVP ( 99.9%) (-)
                                   FSE ( 99.9%) (+)
                                   PAV ( 99.9%) (+)
                                   NVC ( 99.9%) (+)
                                   INO ( 99.8%) (+)
                                   NOD ( 99.5%) (-)
"Very Good" DOS macro scanners:    SWP ( 98.9%) (+)
                                   DRW ( 98.8%) (+)
                                   NAV ( 98.6%) (-)

Remark: "+" indicates that the product performed better than in the
last test, "=" means an equal detection rate, and "-" indicates a
lower detection rate.

****************************************************************
Results #1: Concerning detection of zoo macro viruses:
       ------------------------------------------
       #1.1) !3! scanners are "perfect" as they detect ALL zoo
             viruses: CMD, FPR and SCN! (last test: 3).
       #1.2) The mean detection rate of the (14) scanners tested
             has reached an impressive 97.8%.
       ------------------------------------------
       Remark: the removal of one "weak" product has surely
       influenced the mean detection rate!
****************************************************************
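As an illustration of the grading grid above, here is a minimal
sketch in Python (not part of VTC's tooling); the handling of the
exact boundaries (90%, 80%, ...) follows one plausible reading of
the stated ranges:

   def grade_zoo(rate):
       """Map a zoo virus detection rate (percent) onto the grid above."""
       if rate == 100.0: return "perfect"
       if rate > 95.0:   return "excellent"
       if rate > 90.0:   return "very good"
       if rate >= 80.0:  return "good enough"
       if rate >= 70.0:  return "not good enough"
       if rate >= 60.0:  return "rather bad"
       if rate >= 50.0:  return "very bad"
       return "useless"

   # The "overall grade" takes the lowest of the related results, e.g.:
   #   grade_zoo(min(file_rate, macro_rate))
   assert grade_zoo(97.8) == "excellent"   # mean rate from Result #1.2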
4. Summary #2: Performance of DOS scanners on ITW testbeds:
===========================================================

Concerning "In-The-Wild" viruses, a much more rigid grid must be
applied to classify scanners, as the likelihood is significant that
a user may find such a virus on her/his machine. The following grid
is applied:

 - detection rate is 100% : scanner is "perfect"
 - detection rate is >99% : scanner is "excellent"
 - detection rate is >95% : scanner is "very good"
 - detection rate is >90% : scanner is "good"
 - detection rate is <90% : scanner is "risky"

100% detection of In-the-Wild viruses is now an absolute
requirement. The following 11 DOS products reach 100% for macro
virus detection and are rated "perfect" in this category
(alphabetically ordered):

Grading the Detection of ITW macro viruses (DOS):
-------------------------------------------------
"Perfect" DOS ITW macro scanners:    AVP (100.0%) (+)
                                     CMD (100.0%) (=)
                                     DRW (100.0%) (+)
                                     FPR (100.0%) (=)
                                     FSE (100.0%) (+)
                                     INO (100.0%) (+)
                                     NOD (100.0%) (+)
                                     NVC (100.0%) (+)
                                     PAV (100.0%) (+)
                                     SCN (100.0%) (=)
                                     SWP (100.0%) (+)
"Excellent" DOS ITW macro scanners:  NONE
"Very Good" DOS ITW macro scanners:  AVA ( 98.8%) (+)
                                     NAV ( 98.8%) (+)
                                     ANT ( 97.5%) (+)

****************************************************************
Results #2: Concerning detection of ITW macro viruses:
       ------------------------------------------
       #2.1) !11! scanners are "perfect" as they detect ALL ITW
             viruses: AVP, CMD, DRW, FPR, FSE, INO, NOD, NVC,
             PAV, SCN and SWP.
             (last test: 10 scanners were rated "perfect")
****************************************************************

5. Summary #3: Zoo Macro Virus detection quality of W-NT AV products:
=====================================================================

The number of scanners running under Windows NT is still small,
though growing. Significantly fewer products were available for
these tests compared with the traditional DOS scene.
The following table summarizes the results of file and macro virus
detection under Windows-NT in the last six VTC tests:

Table ES4: Comparison: Macro Virus Detection Rates in the last
           6 VTC tests under Windows NT:
===========================================================
Scan ==== File Virus Detection ====   ========= Macro Virus Detection =========
ner   97/07 98/02 98/10 99/03 Delta   97/07 98/02 98/10 99/03 99/09 00/02 Delta
-------------------------------------------------------------------------------
ANT    88.9  69.2  91.3     -     -    92.2     -  85.7     -  87.2  90.2   3.0
ANY       -     -  69.7     -     -       -     -  70.5     -     -     -     -
ATD       -     -     -     -     -       -     -     -     -     -  99.9     -
AVA       -  97.4  96.6  97.1  +0.5       -  91.9  97.2  95.2  97.4  94.3  -3.1
AVG       -     -     -  87.3     -       -     -     -  82.5  87.0  97.5  10.5
AVK       -     -  99.6  90.2  -9.4       -     -  99.6  99.6  99.8  99.9   0.1
AVP       -     -  83.7  99.9 +16.2       -     - 100.0  99.2  99.8  99.9   0.1
AVX       -     -     -  74.2     -       -     -     -  98.9  75.2  94.5  19.3
AW        -  56.4     -     -     -       -     -  61.0     -     -     -     -
CMD       -     -     -     -     -       -     -     -     -  98.4  100%   1.6
DRW       -     -     -  93.3     -       -     -     -  98.3     -     -     -
DWW       -     -     -     -     -       -     -     -  98.2  98.3  98.4   0.1
DSS    99.6  99.7  99.9  99.3  -0.6    99.0 100.0 100.0 100.0     -     -     -
ESA       -     -     -     -     -       -     -     -     -     -  88.9     -
FPR/
FPW       -  96.1     -  98.7     -       -  99.9  99.8  99.8  99.4  100%   0.5
FSE       -  85.3  99.8 100.0  +0.2       -     -  99.9 100.0  99.9  100%   0.1
FWN       -     -     -     -     -       -     -  99.6  99.7     -  99.9     -
HMV       -     -     -     -     -       -     -  99.0  99.5     -     -     -
IBM    95.2  95.2  77.2     -     -    92.9  92.6  98.6     -     -     -     -
INO       -  92.8     -  98.1     -       -  89.7     -  99.8  98.0  99.7   1.7
IRS       -  96.3     -  97.6     -       -  99.1     -  99.5     -     -     -
IVB       -     -     -     -     -       -     -  92.8  95.0     -     -     -
MR2       -     -     -     -     -       -     -     -  61.9     -     -     -
NAV    86.5  97.1     -  98.0     -    95.6  98.7  99.9  99.7  97.6  98.0   0.4
NOD       -     -     -  97.6     -       -     -     -  99.8  98.2  99.4   1.2
NVC    89.6  93.8  93.6  96.4  +2.8    96.6  99.2     -  98.9  99.0  99.9   0.9
PAV    97.7  98.7  98.4  97.2  -1.2    93.5  98.8  99.5  99.4  99.6  99.9   0.3
PRO       -     -     -  37.3     -       -     -     -  58.0  42.4  67.4  25.0
RAV       -  81.6  84.9  85.5  +0.6       -  98.9  99.5  99.2     -  97.9     -
RA7       -     -     -  89.3     -       -     -     -  99.2     -     -     -
PCC    63.1     -     -     -     -       -  94.8     -     -     -     -     -
PER       -     -     -     -     -       -  91.0     -     -     -     -     -
SCN    94.2  91.6  71.4  99.1 +27.7    97.6  99.1  97.7 100.0  99.8  100%   0.2
SWP    94.5  96.8  98.4     -     -    89.1  98.4  97.5     -  99.0  98.6  -0.4
TBA       -  93.8  92.6     -     -    96.1     -  98.7     -     -     -     -
TNT       -     -     -     -     -       -     -  44.4     -     -     -     -
VET    64.9     -     -  65.4     -       -  94.0     -  94.9     -     -     -
VSA       -  56.7     -     -     -       -  84.4     -     -     -     -     -
VSP       -     -     -  87.0     -       -     -     -  86.7  69.8     -     -
-------------------------------------------------------------------------------
Mean: 87.4% 88.1% 89.0% 89.2% +4.0%    94.7% 95.9% 91.6% 95.3% 91.9% 96.6% +3.4%
-------------------------------------------------------------------------------

Generally, the ability of W-NT scanners to detect zoo macro viruses
"in the mean" now reaches 96.6%, the best result ever. But it should
be noted that some of the mean improvement is essentially due to the
fact that some scanners (which failed before) have been adapted. The
improved rate still leaves some space for further improvement. Now,
4 scanners detect ALL zoo macro viruses (100% detection rate).

The same grid as for the DOS classification (see 3.) is applied to
grade scanners according to their ability to detect macro viruses
under Windows NT. For the detection of macro viruses under Windows
NT, the following 18 scanners detect at least 95% of zoo macro
viruses and 100% of ITW viruses:

"Perfect"   (100%): CMD, FPW, FSE and SCN.
"Excellent" (>99%): ATD, AVK, AVP, FWN, NVC, PAV, INO, NOD, SWP.
"Very Good" (>95%): DRW, RAV, NAV, AVG, AVX.

**************************************************************
Results #3: Macro virus detection rates under W-NT on high level
       ----------------------------------------------------
       #3.1) !4! "perfect" Windows-NT zoo scanners:
             CMD, FPW, FSE and SCN.
       #3.2) Plus 9 "excellent" scanners:
             ATD, AVK, AVP, FWN, NVC, PAV, INO, NOD, SWP.
**************************************************************

6. Summary #4: ITW Macro Virus detection quality of W-NT AV products:
=====================================================================

Applying the same grid as for the detection of ITW viruses under
DOS, ITW detection rates have been significantly improved: 19 (of
23) scanners are now rated "perfect":

Grading the Detection of ITW macro viruses (W-NT):
--------------------------------------------------
"Perfect" W-NT ITW macro scanners:    ATD (100.0%) (+)
                                      AVA (100.0%) (+)
                                      AVG (100.0%) (+)
                                      AVK (100.0%) (+)
                                      AVP (100.0%) (=)
                                      AVX (100.0%) (+)
                                      CMD (100.0%) (=)
                                      DRW (100.0%) (=)
                                      FPR (100.0%) (=)
                                      FSE (100.0%) (=)
                                      FWN (100.0%) (+)
                                      INO (100.0%) (=)
                                      NAV (100.0%) (+)
                                      NOD (100.0%) (=)
                                      NVC (100.0%) (=)
                                      PAV (100.0%) (=)
                                      RAV (100.0%) (+)
                                      SCN (100.0%) (=)
                                      SWP (100.0%) (=)
"Excellent" W-NT ITW macro scanners:  NONE
"Very Good" W-NT ITW macro scanners:  ANT ( 97.5%) (=)
                                      ESA ( 97.5%) (+)
                                      PRO ( 97.5%) (+)

**************************************************************
Results #4: Macro virus detection rates under W-NT on high level
       ----------------------------------------------------
       #4.1) !19! "perfect" Windows-NT ITW macro scanners:
             ATD, AVA, AVG, AVK, AVP, AVX, CMD, DRW, FPR,
             FSE, FWN, INO, NAV, NOD, NVC, PAV, RAV, SCN
             and SWP.
**************************************************************

7. Summary #5: Detection of packed viral objects under W-NT:
============================================================

As many macro objects are transferred over insecure networks in
compressed form, it is essential that AV products are able to detect
macro viruses in packed objects as well. VTC tests the related
detection quality of AV products by packing ITW macro viruses with
4 broadly used packers: PKZIP, LHA, ARJ and RAR. Scanners both under
DOS and W-NT are only partly successful in detecting packed ITW
macro viruses (which they reliably detect "perfectly" in unpacked
form). The following grid is applied:

 A "perfect" scanner would detect all ITW viruses
   in all 4 packed forms with 100%.
 An "excellent" scanner would detect all ITW viruses
   in at least 3 packed forms with 100%.
 A "very good" scanner would detect all ITW viruses
   in at least 2 packed forms with 100%.

The results of this test demonstrate that significant progress has
been made since the last VTC test (1999-09), as 6 products now
detect ALL ITW macro viruses packed with ALL 4 packers. The
following products (including detected packing methods) are graded
according to this schema:

"Perfect" W-NT scanners:   ATD, AVK, AVP, AVX, FSE and PAV
                           (all: ZIP, LHA, ARJ and RAR)
"Excellent" W-NT scanners: AVG, DRW and NOD (all: ZIP, ARJ, RAR)
                           INO, NAV and RAV (all: ZIP, LHA, ARJ)
"Very Good" W-NT scanners: SCN (ZIP, LHA)
                           CMD, FPW (ZIP, ARJ)
                           FWN (ZIP, RAR)
                           SWP (ZIP, RAR)

While almost all products detect ZIPped ITW macro viruses, most
products seem to have some preferences for specific packers, which
may make them eligible for those enterprises which permit some
packers and forbid others.
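To make the procedure concrete, the following minimal sketch shows
the ZIP leg of such a packed-object test. It is an illustration
under stated assumptions, not VTC's tooling: the standard zipfile
module stands in for PKZIP, run_scanner is a hypothetical callable
wrapping the on-demand scanner under test, and the ARJ, LHA and RAR
legs would require the corresponding external archivers.

   # Minimal sketch: re-pack each ITW sample as a ZIP archive and
   # measure how many archives the scanner still flags.
   import zipfile
   from pathlib import Path

   def pack_zip(sample: Path, outdir: Path) -> Path:
       archive = outdir / (sample.name + ".zip")
       with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
           zf.write(sample, arcname=sample.name)
       return archive

   def packed_detection_rate(samples, outdir, run_scanner):
       # run_scanner(path) -> bool is a hypothetical scanner wrapper
       archives = [pack_zip(s, outdir) for s in samples]
       detected = sum(1 for a in archives if run_scanner(a))
       return 100.0 * detected / len(archives)

A product would then be graded by counting the packers for which
this rate reaches 100%.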
It is also interesting to observe that the DOS, W-98 and W-NT
versions of some products (where related versions were tested for
all platforms) behave differently (for reasons which the testers
have no means to determine):

Product:   DOS version   W-98 version   W-NT version
           detects       detects        detects
-------------------------------------------------------
AVA            0              1              1
AVP            0              4              4
CMD            2              2              2
DRW            0              3              3
FPR            2              2              2
*** FSE        4              4              4 ***
INO            2              3              3
NAV            0              3              3
NOD            3              3              3
NVC            1              1              1
*** PAV        4              4              4 ***
SCN            4              4              3
SWP            0              3              3
-------------------------------------------------------
Remark: This table lists only products for which versions on all 3
platforms were available. The numbers count those packing methods
for which the respective detection rate was 100%.
***: These 2 products detect ALL ITW macro viruses packed with ANY
method.

*******************************************************************
Results #5: Detection of packed viral objects under W-NT improving!
       ------------------------------------------------------
       #5.1) The best scanners detect packed macro viruses for ALL
             4 methods: ATD, AVK, AVP, AVX, FSE and PAV.
       #5.2) But there is still need for improvement.
*******************************************************************

8. Summary #6: False Positive avoidance (DOS/Win-NT):
=====================================================

Regarding the ability of scanners to avoid FP alarms, the situation
is significantly better than in the last test: now, 7 products gave
no FP alarm on the testbed (in the last test, NO scanner was
"perfect"):

FP-avoiding "perfect" DOS scanners:
     ANT, AVA, FSE, NOD, PAV, SCN and SWP.

For scanners under W-NT, likewise 7 products gave no FP alarm:

"Perfect" FP-avoiding W-NT scanners:
     AVA, AVG, AVK, NOD, QHL, SCN and SWP.

Concerning avoidance of False-Positive alarms BOTH under DOS AND
Windows-NT, 4 products can be rated as "perfect":

"Perfect" FP-avoiding scanners both under DOS and W-NT:
     AVA, NOD, SCN and SWP.

(Remark: a direct comparison of 16-bit scan engines for DOS and
32-bit scan engines for W-NT is not possible. The argument
concerning an "overall perfect product" applies more to the suite
of software than to single products. Indeed, FPW and FPR are
different engines in the Frisk Software suite, as are the SCN
engines in NAI's suite.)

****************************************************************
Results #6: Avoidance of False-Positive alarms is improving:
       -------------------------------------------------
       #6.1) FP-avoiding "perfect" DOS/W-NT products (4):
             AVA, NOD, SCN and SWP.
       #6.2) FP-avoiding perfect DOS scanners (7):
             ANT, AVA, FSE, NOD, PAV, SCN and SWP.
             FP-avoiding perfect W-NT scanners (7):
             AVA, AVG, AVK, NOD, QHL, SCN and SWP.
****************************************************************
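The four dual-platform products in Result #6.1 are exactly the
intersection of the two seven-product lists in Result #6.2; a
three-line sketch makes the derivation explicit:

   dos_perfect = {"ANT", "AVA", "FSE", "NOD", "PAV", "SCN", "SWP"}
   wnt_perfect = {"AVA", "AVG", "AVK", "NOD", "QHL", "SCN", "SWP"}
   print(sorted(dos_perfect & wnt_perfect))  # ['AVA', 'NOD', 'SCN', 'SWP']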
9. Summary #7: Detection of Macro Malware (DOS/Win-NT):
=======================================================

Since test "1997-07", VTC has also tested the ability of AV products
to detect non-viral malware. An essential argument for this category
is that customers are interested in being warned about and protected
from not only viruses but also other malicious objects such as
Trojans etc., the payload of which may be disastrous to their work
(e.g. stealing passwords). Regrettably, the consciousness of AV
producers of the need to protect their users against related threats
is still underdeveloped.

Manifold arguments are presented why AV products are not the best
protection against non-viral malware; from a technical point of
view, these arguments may seem conclusive, but at the same time
almost nothing is done to support customers with adequate
AntiMalware software. On the other hand, AV methods (such as
scanning for the presence or absence of characteristic features)
are also applicable - though not ideal - to detecting non-viral
malware.

Since VTC test "1999-03", malware detection is a mandatory part of
VTC tests, both for submitted products and for those downloaded as
free evaluation copies. Looking only at the macro malware results
(which have been better than the related file malware results in
all previous tests), some AV products succeed in properly detecting
ALL non-viral malware samples, at least either under DOS or W-NT.
For the first time, 2 AV products (indeed serving as "AntiMalware"
products) detect ALL malware samples in VTC's (deliberately small)
malware testbed, and 8 more AM products detect malware samples at a
high level ("Excellent": detection rates >= 95.0%):

                                 ===== Malware Detection =====
                                 = under DOS ==  = under W-NT =
                                 Macro-Malware   Macro-Malware
                                 -----------------------------
"Perfect" DOS/W-NT mw scanners:
                    CMD          (100.0%          100.0%)
                    FPR/FPW      (100.0%          100.0%)
                                 -----------------------------
"Perfect" DOS mw scanner:
                    FSE          (100.0%           96.2%)
                                 -----------------------------
"Excellent" DOS/W-NT scanners:
                    SCN          ( 99.6%           99.6%)
                    PAV          ( 98.8%           98.8%)
                    AVP          ( 96.9%           96.9%)
                    NOD          ( 96.2%           96.2%)
                    NVC          ( 95.4%           95.4%)
                    INO          ( 95.0%           97.3%)
                    SWP          ( 95.0%           95.0%)
                                 -----------------------------

**************************************************************
Results #7: AntiMalware detection under DOS/W-NT improving:
       -----------------------------------------------
       #7.1) "Perfect" AntiMalware products for DOS/W-NT (2):
             CMD, FPR/FPW.
       #7.2) "Excellent" AntiMalware products for DOS or W-NT:
             FSE, SCN, PAV, AVP, NOD, NVC, INO, SWP.
**************************************************************

10. Availability of pre-released test results:
==============================================

Much more information about this test, its methods and viral
databases, as well as detailed test results, is available for
anonymous FTP download from VTC's homepage (VTC is part of Working
Group AGN):

     ftp://agn-www.informatik.uni-hamburg.de/vtc

Any comment and critical remark which helps VTC learn to improve
its test methods will be warmly welcomed.

The next comparative test is planned for May to July 2000, with
viral databases frozen on March 31, 2000. Any AV producer wishing
to participate in the forthcoming test is invited to submit related
products.

On behalf of the VTC Test Crew:
Dr. Klaus Brunnstein (February 26, 2000)

11. Copyright, License, and Disclaimer:
=======================================

This publication is (C) Copyright 2000 by Klaus Brunnstein and the
Virus Test Center (VTC) at University of Hamburg, Germany.

Permission (Copy-Left) is granted to everybody to distribute copies
of this information in electronic form, provided that this is done
for free, that the contents of the information are not changed in
any way, and that the origin of this information is explicitly
mentioned. It is esp. permitted to store and distribute this set of
text files at university or other public mirror sites where
security/safety related information is stored for unrestricted
public access for free. Any other use, esp.
including distribution of these text files on CD-ROMs or any
publication as a whole or in parts, is ONLY permitted after contact
with the supervisor, Prof. Dr. Klaus Brunnstein, or authorized
members of the Virus Test Center at Hamburg University, and this
agreement must be made explicitly in writing, prior to any
publication.

No responsibility is assumed by the author(s) for any injury and/or
damage to persons or property as a matter of products liability,
negligence or otherwise, or from any use or operation of any
methods, products, instructions or ideas contained in the material
herein.

Prof. Dr. Klaus Brunnstein
University of Hamburg, Germany
(February 26, 2000)