=========================================
File 7EVAL.TXT
Evaluation of VTC Scanner Test "2000-04"
=========================================
Formatted with non-proportional font (Courier)

Content of this file:
=====================
Eval #1:  Development of DOS Scanner Detection Rates
Eval #2:  Evaluation of overall DOS AV detection rates
Eval #3:  In-The-Wild Detection under DOS
Eval #4:  Evaluation of detection by virus classes under DOS
          4.1 Grading the Detection of file viruses
          4.2 Grading the Detection of macro viruses
          4.3 Grading the Detection of boot viruses
          4.4 Grading of Poly-virus detection
          4.5 Grading of VKit virus detection
Eval #5:  Detection of Packed File and Macro Viruses under DOS
Eval #6:  False Positive Detection in Clean Files and Macros (DOS/W-NT)
Eval #7:  Evaluation of File and Macro Malware detection (DOS/W-NT)
Eval #8:  Overall virus detection rates under Windows-98
Eval #9:  Overall virus detection rates under Windows-NT
Eval #10: File/Macro Virus detection under 32-bit engines
Eval #11: Evaluation of malware detection under Windows 98/NT

This part of the VTC "2000-04" test report evaluates the detailed
results given in the following sections (files):
   6ASUMOV.TXT   Overview and Summary                 DOS/W-98/W-NT
   6BDOSFIL.TXT  File Virus/Malware results           DOS
   6CDOSBOO.TXT  Boot Virus results                   DOS
   6DDOSMAC.TXT  Macro Viruses/Malware results        DOS
   6FW98.TXT     File/Macro Viruses/Malware results   W-98
   6GWNT.TXT     File/Macro Viruses/Malware results   W-NT
   6HCMP32.TXT   Comparison File/Macro results        W-98/NT

Eval #1: Development of DOS Scanner Detection Rates:
====================================================
Concerning the performance of DOS scanners, a comparison of virus
detection results in the previous 7 tests ("1997-02" until "1999-09")
with "2000-04" shows how scanners behave and how manufacturers adapt
their products to the growing threat of new viruses and malware.
Table E1 lists the development of the detection rates of scanners
(most current version in each test) and the change ("+" indicating
improvement) in detection rates between the last test (1999-09) and
the current one (2000-04). Finally, the "mean change", in both
absolute and relative terms, is also given (last row of Table E1).

This comparison concentrates on file and macro virus detection
quality. VTC tests do NOT TEST for physical boot sector detection
(see 4testcon.txt), so results may be unfair to those scanners which
analyse the physical layout of boot viruses. Therefore, boot virus
detection results are not discussed here in detail (related results
are available in 6CDOSBOO.TXT).

For reasons of fairness, it must be noted that improvement is much
more difficult to achieve for those products which have already
reached a very high level of detection and quality (say: more than
95%) than for those products with lower detection rates. Some
products have incorporated new engines (esp. for 32-bit platforms)
and integrated formerly separate scanners (e.g. for macro viruses),
which leads to improved performance.

Generally, changes in the order of about +-1.5% are of limited
significance, as this is roughly the growth rate of new viruses per
month; detection therefore depends strongly upon whether some virus
is reported (and analysed and included) just before a new update is
delivered.
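The change and mean columns described above are simple arithmetic; a
minimal sketch of the computation (the function names are
illustrative, not part of the report's tooling):

```python
def delta(prev, curr):
    """Change in detection rate between two tests, in percentage
    points ("+" indicating improvement, as in Table E1)."""
    return round(curr - prev, 1)

def mean_rate(rates):
    """Mean detection rate over all scanners present in a test;
    scanners not tested (marked "-" in the table) are skipped."""
    vals = [r for r in rates if r is not None]
    return round(sum(vals) / len(vals), 1)

# AVP file zoo detection: 99.8% (1999-09) -> 99.6% (2000-04)
print(delta(99.8, 99.6))              # -0.2
# mean over a test in which one scanner did not take part
print(mean_rate([99.9, None, 98.3]))  # 99.1
```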
Table E1: Improvement of DOS scanners from 1997-02 to 2000-04:
==============================================================
SCAN  -------- File Virus Detection ----------   ------ Macro Virus Detection -----------
NER   9702 9707 9802 9810 9903 9909 0004 Delta   9702 9707 9802 9810 9903 9909 0004 Delta
        %    %    %    %    %    %    %    %       %    %    %    %    %    %    %    %
---------------------------------------------------------------------------------------
ALE   98.8 94.1 89.4    -    -    -    -    -    96.5 66.0 49.8    -    -    -    -    -
ANT   73.4 80.6 84.6 75.7    -    - 92.8    -    58.0 68.6 80.4 56.6    -    - 85.9    -
AVA   98.9 97.4 97.4 97.9 97.6 97.4 97.5  0.1    99.3 98.2 80.4 97.2 95.9 94.6 93.7 -0.9
AVG   79.2 85.3 84.9 87.6 87.1 86.6    -    -    25.2 71.0 27.1 81.6 82.5 96.6    -    -
AVK      -    -    - 90.0 75.0    -    -    -       -    -    - 99.7 99.6    -    -    -
AVP   98.5 98.4 99.3 99.7 99.7 99.8 99.6 -0.2    99.3 99.0 99.9 100% 99.8 100% 99.9 -0.1
CMD      -    -    -    -    -    - 99.5    -       -    -    -    -    - 99.5 100%  0.5
DRW   93.2 93.8 92.8 93.1 98.2 98.3    -  0.1    90.2 98.1 94.3 99.3 98.3    - 98.4    -
DSE   99.7 99.6 99.9 99.9 99.8    -    -    -    97.9 98.9 100% 100% 100%    -    -    -
FMA      -    -    -    -    -    -    -    -    98.6 98.2 99.9    -    -    -    -    -
FPR   90.7 89.0 96.0 95.5 98.7 99.2 99.6  0.4    43.4 36.1 99.9 99.8 99.8 99.7 100%  0.3
FSE      -    - 99.4 99.7 97.6 99.3 99.9  0.6       -    - 99.9 90.1 99.6 97.6 99.9  2.3
FWN      -    -    -    -    -    -    -    -    97.2 96.4 91.0 85.7    -    -    -    -
HMV      -    -    -    -    -    -    -    -       - 98.2 99.0 99.5    -    -    -    -
IBM   93.6 95.2 96.5    -    -    -    -    -    65.0 88.8 99.6    -    -    -    -    -
INO      -    - 92.0 93.5 98.1 94.7 94.6 -0.3       -    - 90.3 95.2 99.8 99.5 99.7  0.2
IRS      - 81.4 74.2    - 51.6    -    -    -       - 69.5 48.2    - 89.1    -    -    -
ITM      - 81.0 81.2 65.8 64.2    -    -    -       - 81.8 58.2 68.6 76.3    -    -    -
IVB    8.3    -    -    - 96.9    -    -    -       -    -    -    -    -    -    -    -
MR2      -    -    -    -    - 65.4    -    -       -    -    -    -    - 69.6    -    -
NAV   66.9 67.1 97.1 98.1 77.2 96.0 93.3 -2.7    80.7 86.4 98.7 99.8 99.7 98.6 97.4 -1.2
NOD      -    -    - 96.9    - 96.9 98.3  1.4       -    -    -    - 99.8 100% 99.4 -0.6
NVC   87.4 89.7 94.1 93.8 97.6    - 99.1    -    13.3 96.6 99.2 90.8    - 99.6 99.9  0.3
PAN      -    - 67.8    -    -    -    -    -       -    - 73.0    -    -    -    -    -
PAV      - 96.6 98.8    - 73.7 98.8 98.7 -0.1       -    - 93.7 100% 99.5 98.8 99.9  1.1
PCC      -    -    -    -    -    -    -    -       - 67.6    -    -    -    -    -    -
PCV   67.9    -    -    -    -    -    -    -       -    -    -    -    -    -    -    -
PRO      -    -    -    - 35.5    -    -    -       -    -    -    - 81.5    -    -    -
RAV      -    -    - 71.0    -    -    -    -       -    -    - 99.5 99.2    -    -    -
SCN   83.9 93.5 90.7 87.8 99.8 97.1 99.9  2.8    95.1 97.6 99.0 98.6 100% 100% 100%  0.0
SWP   95.9 94.5 96.8 98.4    - 99.0 98.4 -0.6    87.4 89.1 98.4 98.6    - 98.4 98.4  0.0
TBA   95.5 93.7 92.1 93.2    -    -    -    -    72.0 96.1 99.5 98.7    -    -    -    -
TSC      -    - 50.4 56.1 39.5 51.6    -    -       -    - 81.9 76.5 59.5 69.6    -    -
TNT   58.0    -    -    -    -    -    -    -       -    -    -    -    -    -    -    -
VDS      - 44.0 37.1    -    -    -    -    -    16.1  9.9  8.7    -    -    -    -    -
VET      - 64.9    -    - 65.3    -    -    -       - 94.0 97.3 97.5 97.6    -    -    -
VIT      -    -    -    -    -    -  7.6    -       -    -    -    -    -    -    -    -
VRX      -    -    -    -    -    -    -    -       -    -    -    -    -    -    -    -
VBS   43.1 56.6    - 35.5    -    -    -    -       -    -    -    -    -    -    -    -
VHU   19.3    -    -    -    -    -    -    -       -    -    -    -    -    -    -    -
VSA      -    - 56.9    -    -    -    -    -       -    - 80.6    -    -    -    -    -
VSP      -    -    - 76.1 71.7 79.6    -  7.9       -    -    -    -    -    -    -    -
VSW      -    - 56.9    -    -    -    -    -       -    - 83.0    -    -    -    -    -
VTR   45.5    -    -    -    -    -    -    -     6.3    -    -    -    -    -    -    -
XSC   59.5    -    -    -    -    -    -    -       -    -    -    -    -    -    -    -
---------------------------------------------------------------------------------------
Mean  74.2 84.8 84.4 85.4 81.2 90.6 98.3  1.7    69.6 80.9 83.8 89.6 93.6 88.2 98.0 +0.1
---------------------------------------------------------------------------------------

******************************************************************
Findings #1: For DOS, both file and macro virus detection rates
             improved significantly. Several scanners detect more
             than 99% of zoo file and macro viruses, but no
             scanner is graded "perfect".
******************************************************************
Findings #1.1) Under DOS, the good news is that the ability of
     scanners to properly detect file viruses has improved
     significantly in comparison with the last test: from the last
     test's already good mean value (90.6% mean detection rate),
     file viruses in VTC's large collection ("zoo") are now
     detected much better (98.3%). Most products with a high
     detection level maintain their rates, but no single product
     presently detects ALL zoo file viruses. Concerning
     In-The-Wild file viruses, 13 (out of 16) products reach the
     level "perfect" (100% detection).
#1.2) Also under DOS, detection of zoo macro viruses has reached
     the highest level ever (98.0%), a significant improvement
     (almost 10%). Now, 3 products (CMD, FPR and SCN) detect ALL
     zoo samples, and 4 more products reach a 99.9% detection
     level. And 11 (out of 15) detect all In-The-Wild viruses.
#1.3) Summarizing the DOS situation, AV producers should aim at
     maintaining this high level of detection.
********************************************************************

Eval #2: Evaluation of overall DOS AV detection rates:
======================================================
The following grid is applied to classify scanners:
  - detection rate =100%     : scanner is "perfect"
  - detection rate above 99% : scanner is graded "excellent"
  - detection rate above 95% : scanner is graded "very good"
  - detection rate above 90% : scanner is graded "good"
  - detection rate of 80-90% : scanner is graded "good enough"
  - detection rate of 70-80% : scanner is graded "not good enough"
  - detection rate of 60-70% : scanner is graded "rather bad"
  - detection rate of 50-60% : scanner is graded "very bad"
  - detection rate below 50% : scanner is graded "useless"

Besides grading products in related categories according to their
performance, it is interesting to compare how products developed.
In comparison with previous results (VTC test "1999-03"), it is
noted whether a product remained in the same category (=), improved
into a higher category (+) or lost some grade (-).

Eval #2.1: Overall grades of DOS scanners:
==========================================
To assess an "overall AV grade" (including file and macro virus
detection, for unpacked objects), the lowest of the related results
is used to classify each scanner. If several scanners of the same
producer have been tested, grading is applied to the most current
version (which is in most cases the version with the highest
detection rates). Only scanners for which all tests were completed
are considered.
(For problems in the test: see 8problms.txt). The following list
indicates those scanners graded into one of the upper three
categories, with file and macro virus detection rates for unpacked
samples, and with perfect ITW virus detection (rate=100%):

                              (file/macro zoo; file/macro ITW)
"Perfect" DOS scanners:   NONE                     Change: -------

"Excellent" DOS scanners:
     SCN  ( 99.9%  100.0%;  100.0%  100.0%)  (+)
     FPR  ( 99.6%  100.0%;  100.0%  100.0%)  (=)
     CMD  ( 99.5%  100.0%;  100.0%  100.0%)  (+)
     FSE  ( 99.9%   99.9%;  100.0%  100.0%)  (+)
     AVP  ( 99.6%   99.9%;  100.0%  100.0%)  (=)
     NVC  ( 99.1%   99.9%;  100.0%  100.0%)  (+)

"Very Good" DOS scanners:
     PAV  ( 98.7%   99.9%;  100.0%  100.0%)  (=)
     NOD  ( 98.3%   99.4%;  100.0%  100.0%)  (=)
     SWP  ( 98.4%   98.4%;  100.0%  100.0%)  (=)

**************************************************************
Finding #2: Quality of best DOS scanners not yet perfect
            Excellent DOS scanners: SCN, FPR, CMD, FSE, AVP, NVC
**************************************************************
Findings #2.1) The overall virus detection quality of the best DOS
     scanners has reached a very acceptable level, both for file
     and macro viruses which are not "in-the-wild".
#2.2) 6 DOS scanners - SCN, FPR, AVP, FSE, CMD, NVC - are almost
     perfect.
#2.3) 3 more products are "very good" (same level as in last
     test): PAV, NOD and SWP.
**************************************************************

Eval #3: In-The-Wild Detection under DOS:
=========================================
Concerning "In-The-Wild" viruses, the following grid is applied:
  - detection rate is 100% : scanner is "perfect"
  - detection rate is >99% : scanner is "excellent"
  - detection rate is >95% : scanner is "very good"
  - detection rate is >90% : scanner is "good"
  - detection rate is <90% : scanner is "risky"
100% detection of In-The-Wild viruses is now an absolute
requirement.
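The grading grids used here are plain threshold mappings; a minimal
sketch of the Eval #2 overall grading (the function name is
illustrative), using the lower of the file and macro zoo rates as
described in Eval #2.1:

```python
def grade_overall(file_rate, macro_rate):
    """Grade a DOS scanner on the Eval #2 grid (rates in percent).

    The overall AV grade uses the LOWEST of the related results,
    i.e. the weaker of file and macro zoo detection."""
    rate = min(file_rate, macro_rate)
    if rate == 100.0: return "perfect"
    if rate > 99.0:   return "excellent"
    if rate > 95.0:   return "very good"
    if rate > 90.0:   return "good"
    if rate >= 80.0:  return "good enough"
    if rate >= 70.0:  return "not good enough"
    if rate >= 60.0:  return "rather bad"
    if rate >= 50.0:  return "very bad"
    return "useless"

# SCN in this test: 99.9% file zoo, 100.0% macro zoo
print(grade_overall(99.9, 100.0))   # excellent
# PAV in this test: 98.7% file zoo, 99.9% macro zoo
print(grade_overall(98.7, 99.9))    # very good
```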
The following 11 DOS products (of 14) reach 100% for boot, file and
macro virus detection and are rated "perfect" in this category
(alphabetically ordered):

                                 ( File   Macro   Boot )
   -----------------------------------------------------
   "Perfect" DOS ITW scanners:
        AVP  (100.0%  100.0%  100.0%)  (=)
        CMD  (100.0%  100.0%  100.0%)  (=)
        DRW  (100.0%  100.0%  100.0%)  (+)
        FPR  (100.0%  100.0%  100.0%)  (=)
        FSE  (100.0%  100.0%  100.0%)  (=)
        INO  (100.0%  100.0%  100.0%)  (+)
        NOD  (100.0%  100.0%  100.0%)  (=)
        NVC  (100.0%  100.0%  100.0%)  (+)
        PAV  (100.0%  100.0%  100.0%)  (=)
        SCN  (100.0%  100.0%  100.0%)  (=)
        SWP  (100.0%  100.0%  100.0%)  (=)
   -----------------------------------------------------

It is good to observe that now 11 DOS scanners are "perfect" ITW
virus detectors (last test: 10 products). While 3 products have
improved into this category, 2 previously "perfect" products lost
quality and placement.

**************************************************************
Findings #3: High ITW detection rates and implied risks.
**************************************************************
Findings #3.1) 11 DOS scanners are "perfect" for ITW viruses:
     AVP, CMD, DRW, FPR, FSE, INO, NOD, NVC, PAV, SCN, SWP.
#3.2) In-The-Wild detection of the best DOS scanners has slightly
     improved since the last test. The number of perfect scanners
     (in this category) has jumped from 10 to 11.
#3.3) But the concentration of some AV producers on reaching 100%
     In-The-Wild detection rates does NOT guarantee high detection
     rates for "zoo" viruses.
**************************************************************

Eval #4: Evaluation of detection by virus classes under DOS:
============================================================
Some scanners are specialised in detecting some class of viruses
(either by deliberately limiting themselves to one class, esp.
macro viruses, or by detecting one class significantly better than
others). It is therefore worth noting which scanners perform best
in detecting file, boot and macro viruses.
Compared to the last test (1999-09), the number of "excellent"
macro virus detectors has grown significantly (as has the class of
"good" ones, which is not listed here); in contrast, "standard"
file (and even more: boot) viruses seem to be handled comparably
less carefully in product upgrading.

Two special tests of file viruses were also performed to determine
the quality of AV product maintenance. One test was concerned with
almost 11,000 viruses generated with the VKIT virus generator. Some
AV products count each of the potential 14,000 viruses as a new
variant, while others count all VKIT viruses as just ONE virus.
Fortunately, a high proportion of the tested products detects these
viruses (see 4.5), although reliability of detection is
significantly lower than normal (see 6BDOSFIL.TXT).

Another special test was devoted to the detection of 10,000
polymorphic generations each of the following polymorphic viruses:
Maltese.Amoeba, MTE.Encroacher.B, NATAS, TREMOR, One-Half and
Tequila. Detection rates were "almost perfect".

Products rated "perfect" (=100%), "excellent" (>99%) and
"very good" (>95%) are listed.
4.1 Grading the Detection of zoo file viruses:
----------------------------------------------
"Perfect" DOS file scanners:   === NONE ===   (=)
"Excellent" DOS file scanners:
     FSE  ( 99.9%)  (=)
     SCN  ( 99.9%)  (+)
     AVP  ( 99.6%)  (=)
     FPR  ( 99.6%)  (=)
     CMD  ( 99.5%)  (+)
     NVC  ( 99.1%)  (+)
"Very Good" DOS file scanners:
     PAV  ( 98.7%)  (=)
     SWP  ( 98.4%)  (-)
     NOD  ( 98.3%)  (=)
     AVA  ( 97.5%)  (=)

4.2 Grading the Detection of zoo macro viruses:
-----------------------------------------------
"Perfect" DOS macro scanners:
     CMD  (100.0%)  (+)
     FPR  (100.0%)  (+)
     SCN  (100.0%)  (=)
"Excellent" DOS macro scanners:
     AVP  ( 99.9%)  (-)
     FSE  ( 99.9%)  (+)
     NVC  ( 99.9%)  (=)
     INO  ( 99.7%)  (=)
     NOD  ( 99.4%)  (+)
"Very Good" DOS macro scanners:
     DRW  ( 98.4%)  (+)
     SWP  ( 98.4%)  (=)
     NAV  ( 97.4%)  (=)

4.3 Grading the Detection of zoo boot viruses:
----------------------------------------------
"Perfect" DOS boot scanners:
     AVP  (100.0%)  (+)
     NOD  (100.0%)  (+)
"Excellent" DOS boot scanners:
     CMD  ( 99.9%)  (+)
     FPR  ( 99.9%)  (=)
     NVC  ( 99.9%)  (+)
     SCN  ( 99.9%)  (=)
     PAV  ( 99.8%)  (+)
     FSE  ( 99.7%)  (=)
     SWP  ( 99.4%)  (+)
"Very Good" DOS boot scanners:
     INO  ( 98.1%)  (+)

4.4 Grading of Poly-virus detection:
------------------------------------
Based on the detection data (see 6BDOSFIL.TXT, table FDOS.FA), and
with the additional conditions that
  1) all infected objects for all viruses were detected
  2) with full reliability of identification and detection,
the following products can be rated as "perfect" Poly-detectors:

"Perfect" Poly-detectors:
     AVP  (100.0%)  (=)
     DRW  (100.0%)  (+)
     FSE  (100.0%)  (=)
     NAV  (100.0%)  (=)
     NOD  (100.0%)  (=)
     PAV  (100.0%)  (=)

The following products are "almost perfect" as they reach a 100%
detection rate (exactly) but with less precise identification:

"Almost Perfect" Poly-detectors:
     ANT (+), AVA (+), CMD (=), FPR (=), INO (=), NVC (=),
     SCN (+), SWP (=) and VSP (+).
4.5 Grading of VKit virus detection:
------------------------------------
Based on the detection data (see 6BDOSFIL.TXT, table FDOS.FB), and
with the additional conditions that
  1) all infected objects for all viruses were detected
  2) with full reliability of identification and detection,
NO product was "perfect", but several detected almost all samples
(rounded to 100.0%), though with some unreliability of
identification:

"Perfect" VKIT detectors:  NONE  (=)
"Almost Perfect" VKIT detectors:
     ANT (+), AVA (+), AVP (=), CMD (+), DRW (+), FPR (+),
     FSE (=), NOD (+), PAV (=), SCN (=) and SWP (+).

****************************************************************
Finding #4: Performance of DOS scanners by virus classes:
     ---------------------------------------------------------
     Perfect scanners for macro zoo:             CMD, FPR, SCN
     Perfect scanners for boot zoo:              AVP, NOD
     Perfect scanners for Polymorphic virus set: AVP, DRW, FSE,
                                                 NAV, NOD, PAV
     NO perfect scanner for file and VKit zoo viruses.
****************************************************************
Finding #4.1) Specialised scanners (esp. those specialising in
     macro viruses) are not superior to the best overall scanners,
     even concerning large collections such as VTC's "zoo"
     testbeds.
****************************************************************

Eval #5: Detection of Packed File and Macro Viruses under DOS:
==============================================================
Detection of file and macro viruses within packed objects is
becoming essential for on-access scanning, esp. for incoming email
possibly loaded with malicious objects. It therefore seems
reasonable to test whether at least ITW viral objects compressed
with given popular methods (PKZIP, ARJ, LHA and RAR) are also
detected. Tests are performed only on In-The-Wild viruses packed
once (no recursive packing). As the last test showed that AV
products are rather far from perfect detection of packed viruses,
the testbed has remained essentially unchanged to ease comparison
and measurement of improvement.
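The packed-virus grades used in the following evaluation reduce to
counting the packers for which a product reaches full detection on
the packed ITW set ("perfect" = all 4 packers, "very good" = at
least 3, "good" = at least 2, reading "detect" as 100% throughout).
A minimal sketch (the function name and the sample rates are
illustrative):

```python
# Packers used in the VTC packed-virus test
PACKERS = ("PKZIP", "ARJ", "LHA", "RAR")

def grade_packed(rates):
    """Grade packed-virus detection per the Eval #5 rules.

    `rates` maps packer name -> detection rate (%) over packed ITW
    file and macro samples; a packer counts as covered only at
    100% detection."""
    covered = sum(1 for p in PACKERS if rates.get(p) == 100.0)
    if covered == len(PACKERS):
        return "perfect"
    if covered >= 3:
        return "very good"
    if covered >= 2:
        return "good"
    return "needs improvement"

# NOD in this test covers ZIP, ARJ and RAR; the LHA rate shown
# here is illustrative, not a measured value.
print(grade_packed({"PKZIP": 100.0, "ARJ": 100.0,
                    "LHA": 97.0, "RAR": 100.0}))   # very good
```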
Results (see 6BDOSFIL.TXT, 6DDOSMAC.TXT) are AGAIN rather
DISAPPOINTING, esp. as we have to report major problems of products
in scanning the whole testbed (although it is not very large), as
reported in 8PROBLMS.TXT.

A "perfect" product would detect ALL packed viral samples (100%)
(file AND macro) for all packers:
   --------------------------------------------------
   "Perfect" packed virus detectors: FSE, PAV and SCN
   --------------------------------------------------

A "very good" product would reach 100% detection of packed viral
samples (file and macro) for at least 3 packers:
   ---------------------------------------------------------------
   "Very good" packed macro virus detector: NOD (ZIP, ARJ and RAR)
   ---------------------------------------------------------------

A "good" product would detect viral samples (ITW file and macro)
for at least 2 packers:
   ---------------------------------------------
   "Good" packed macro virus detectors: CMD, FPR
   ---------------------------------------------

Remark: Much more data were collected on precision and reliability
of virus detection in packed objects. But in the present state, it
seems NOT justified to add differentiation to the results discussed
here.

********************************************************************
Findings #5: Detection of packed viral objects needs improvement
     Perfect packed file/macro virus DOS detectors: FSE, PAV, SCN
     Packed macro virus only:
          "Very Good" detector: NOD
          "Good" detectors:     CMD, FPR
********************************************************************
Findings #5.1) 3 products = FSE, PAV and SCN = can be rated
     "perfect" concerning detection of infected packed objects, at
     least at the level of ITW file and macro viruses.
#5.2) Only 3 other products have reached an acceptable level of
     detecting viruses in packed infected objects with 2 or 3
     compression methods. Significant investment of work is needed
     here.
********************************************************************

Eval #6: False-Positive Detection in Clean Files and Macros:
============================================================
First introduced in VTC test "1998-10", a set of clean (and
non-malicious) objects has been added to the file and macro virus
testbeds to determine the ability of scanners to avoid
False-Positive (FP) alarms. This ability is essential for
"excellent" and "very good" scanners, as there is no automatic aid
for customers to handle such cases (besides the psychological
impact on the customer's work). Therefore, the grid used for
grading AV products must be significantly more rigid than the one
used for detection (see Eval #2).

The following grid is applied to classify scanners:
  - False Positive rate =  0.0%: scanner is graded "perfect"
  - False Positive rate <  0.5%: scanner is graded "excellent"
  - False Positive rate <  2.5%: scanner is graded "very good"
  - False Positive rate <  5.0%: scanner is graded "good enough"
  - False Positive rate < 10.0%: scanner is graded "rather bad"
  - False Positive rate < 20.0%: scanner is graded "very bad"
  - False Positive rate > 20.0%: scanner is graded "useless"

Regarding the ability of scanners to avoid FP alarms, the following
AV products running under DOS reported NO SINGLE False-Positive
alarm in BOTH the file and macro zoo testbeds and are therefore
rated "perfect":

FP-avoiding "perfect" DOS scanners:
     AVA (+), PAV (+), SCN (=) and SWP (+)
     (VSP also avoids FPs, but at a low level of virus detection).

Several DOS scanners gave NO FP alarm EITHER on clean files OR on
clean macros:

Perfect FP-avoidance on the DOS clean file testbed:
     AVA (+), AVP (=), CMD (=), FPR (=), NAV (=), NVC (=),
     PAV (=), SCN (=), SWP (=), VIT (+)

Perfect FP-avoidance on the DOS clean macro testbed:
     ANT (+), AVA (=), FSE (+), NOD (+), PAV (+), SCN (=),
     SWP (=), VSP (+).

In comparison to the DOS results, an analysis of FP-avoidance for
Windows-NT based scanners is slightly more promising.
Concerning avoidance of ANY FP alarm BOTH for clean files and
macros, 6 products are rated "perfect":

FP-avoiding "perfect" W-NT scanners:
     AVA, AVG, AVK, MKS, SCN, SWP.

Several more W-NT scanners gave NO FP alarm EITHER on clean files
OR on clean macros:

Perfect FP-avoidance under Win-NT for clean file objects:
     ATD (+), AVA (+), AVG (+), AVK (=), AVP (=), CMD (+),
     ESA (+), FPW (=), FSE (=), MKS (+), NAV (=), NVC (+),
     PAV (=), PRO (+), RAV (+), SCN (=), SWP (=).

Perfect FP-avoidance under Win-NT for clean macro objects:
     AVA (=), AVG (+), AVK (+), MKS (+), NOD (+), QHL (+),
     SCN (=), SWP (=), VSP (=).

As in the last test, avoidance of false-positive alarms is LESS
advanced for macro objects (see 6DDOSMAC.TXT, table FDOS.M4).
Moreover, some products (which were well rated in the last test)
showed serious problems during testing.

Concerning avoidance of False-Positive alarms BOTH under DOS AND
Windows-NT, only 3 products can be rated "perfect":

"Perfect" FP-avoiding scanners both under DOS and W-NT:
     AVA, SCN and SWP.

(Remark: a direct comparison of 16-bit scan engines for DOS and
32-bit scan engines for W-NT is not possible. The argument
concerning an "overall perfect product" applies more to the suite
of software than to single products. Indeed, FPW and FPR are
different engines in Frisk Software's suite, as the SCN engines
are in NAI's suite.)

****************************************************************
Findings #6: Avoidance of False-Positive alarms is insufficient.
     FP-avoiding perfect DOS scanners:  AVA, PAV, SCN, SWP
     FP-avoiding perfect W-NT scanners: AVA, AVG, AVK, MKS,
                                        SCN, SWP
****************************************************************
Findings #6.1) Several products reliably avoid ANY False-Positive
     alarm on clean file and macro objects, either under DOS or
     under Win-NT.
#6.2) Only 3 products avoid ANY False-Positive alarm BOTH under
     DOS and Windows-NT: !AVA, SCN and SWP!
#6.3) The number of scanners avoiding FP alarms has slightly
     improved since the last test.
#6.4) AV producers should intensify their work to avoid FP alarms.
*****************************************************************

Eval #7: Evaluation of File and Macro Malware detection (DOS/W-NT):
===================================================================
Since test "1997-07", VTC has also tested the ability of AV
products to detect non-viral malware. An essential argument for
this category is that customers are interested in being warned
about and protected from not only viruses but also other malicious
objects such as trojans etc., the payload of which may be
disastrous to their work (e.g. stealing passwords). Regrettably,
the consciousness of AV producers of the need to protect their
users against related threats is still underdeveloped. Manifold
arguments are presented why AV products are not the best protection
against non-viral malware; from a technical point of view, these
arguments may seem conclusive, but at the same time almost nothing
is done to support customers with adequate AntiMalware software. On
the other side, AV methods (such as scanning for the presence or
absence of characteristic features) are also applicable - though
not ideal - to the detection of non-viral malware.

Since VTC test "1999-03", malware detection has been a mandatory
part of VTC tests, both for submitted products and for those
downloaded as free evaluation copies. A growing number of scanners
is indeed able to detect non-viral malware.

The following grid (admittedly with reduced granularity) is applied
to classify detection of file and macro malware:
  - detection rate =100%     : scanner is "perfect"
  - detection rate > 90%     : scanner is "excellent"
  - detection rate of 80-90% : scanner is "very good"
  - detection rate of 60-80% : scanner is "good enough"
  - detection rate of < 60%  : scanner is "not good enough"

As in the last test, still NO product can be rated a "perfect AM
detector". Compared to the last test, only 1 product (formerly 4,
one of which "left" the market) can now be rated "very good".
The development for file malware (e.g.
trojans stealing passwords) is disappointing, as detection rates
are generally decreasing, whereas detection of macro malware (which
is well defined in VTC's "List of Known Macro Viruses/Malware") is
improving:

                                 ===== Malware Detection =====
                                 = under DOS ==  = under W-NT =
                                 (File/Macro-mw; File/Macro-mw)
   ------------------------------------------------------------
   "Perfect" DOS/W-NT mw scanners:  NONE
   ------------------------------------------------------------
   "Excellent" DOS/W-NT scanners:
        FPR/FPW  (95.3% 100.0%;  91.5% 100.0%)  (+)
        SCN      (93.4%  99.6%;  93.3%  99.6%)  (=)
        CMD      (92.8% 100.0%;  95.3% 100.0%)  (+)
        FSE      (94.6%  96.2%;  98.7% 100.0%)  (+)
        PAV      (90.8%  98.8%;  91.6%  98.8%)  (+)
   ------------------------------------------------------------

Concerning macro malware detection, the situation is better, as
there are "perfect" products (100% detection):
   3 products detect macro malware under W-NT at 100%:
        CMD, FPR/FPW and FSE.
   2 products detect macro malware under DOS at 100%:
        CMD and FPR/FPW.
But otherwise, detection rates still need development:

                                 ===== Malware Detection =====
                                 = under DOS ==  = under W-NT =
                                 (File/Macro-mw; File/Macro-mw)
   ------------------------------------------------------------
   "Perfect" macro mw (DOS/W-NT):
        FPR/FPW  (95.3% 100.0%;  91.5% 100.0%)  (+)
        CMD      (92.8% 100.0%;  95.3% 100.0%)  (+)
   ------------------------------------------------------------
   "Excellent" macro mw (DOS/W-NT):
        SCN      (93.4%  99.6%;  93.3%  99.6%)  (=)
        PAV      (90.8%  98.8%;  91.6%  98.8%)  (+)
        AVP      (83.2%  96.9%;  91.6%  98.8%)  (=)
        FSE      (94.6%  96.2%;  98.7% 100.0%)  (+)
        NOD      (77.6%  96.2%;  78.3%  97.3%)  (-)
        NVC      (64.9%  95.4%;  77.6%  96.2%)  (=)
        INO      (74.4%  95.0%;    -    98.1%)  (=)
        SWP      (78.3%  95.0%;  78.3%  95.0%)  (=)
   ------------------------------------------------------------

**************************************************************
Findings #7: AntiMalware detection under DOS/W-NT is improving
     No "perfect" file/macro malware detector.
     "Excellent" file/macro malware detectors:
          FPR/FPW, SCN, CMD, FSE and PAV
**************************************************************
Findings #7.1: The ability of AV products to also detect non-viral
     malware is improving only for macro malware, while
the ability to detect file malware has decreased since the last
     tests.
#7.2: Concerning file and macro malware, 5 products can be rated
     "excellent": FPR/FPW, SCN, CMD, FSE, PAV.
#7.3: With the continuing growth of malware testbeds and growing
     threats to customers, AV producers MUST improve their
     products in this area as well.
**************************************************************

Eval #8: Overall virus detection rates under Windows-98:
========================================================
The following table summarizes the results of file and macro virus
detection under Windows 98 since 1998-10, including the relative
improvement (DELTA) from the last test to the current results:

Table AB: Comparison: File/Macro Virus Detection Rate
          in last 4 VTC tests under Windows 98:
============================================================
        ---- File Virus Detection ---   ---- Macro Virus Detection --
SCAN    98/10 99/03 99/09 00/04 DELTA   98/10 99/03 99/09 00/04 DELTA
NER       %     %     %     %     %       %     %     %     %     %
----------------------------------------------------------------------
ACU        -     -     -     -     -       -  97.6     -     -     -
AN5        -     -  87.2     -     -       -     -  89.3     -     -
ANT     91.3     -  86.5  92.8     -    84.3     -  89.5  90.2   1.7
ANY        -     -     -     -     -    70.7     -     -     -     -
AVA     96.6  97.6  97.2  97.5   0.3    96.7  95.9  93.9  94.3   0.4
AVG        -  87.3  87.0  85.4  -1.6       -  82.5  96.6  97.5   0.9
AVK     99.6  90.8  99.8  99.7  -0.1    99.6  99.6 100.0  99.9  -0.1
AVP     99.9  99.9  99.8  99.9   0.1   100.0  99.2 100.0  99.9  -0.1
AVX        -  74.2  75.7  77.4   1.7       -     -  98.7  94.5  -4.2
CMD        -     -  98.4  99.6   1.2       -     -  99.6 100.0   0.4
DSS/DSE 99.9  99.9     *  99.8     -   100.0 100.0     * 100.0     -
DRW/DWW    -  89.5  98.3  96.7  -1.6       -  98.3  98.8  98.4  -0.4
ESA        -     -     -  58.0     -       -     -     -  88.9     -
FPR/FMA    -  93.9  99.4  99.7   0.3    92.4  99.8  99.7 100.0   0.3
FPW        -     -  99.2  99.6   0.4       -     -  99.9 100.0   0.1
FSE     99.8 100.0  99.9 100.0   0.1   100.0 100.0 100.0 100.0   0.0
FWN        -     -     -     -     -    99.6  99.7  99.9  99.8  -0.1
HMV        -     -     -     -     -    99.5     -     -     -     -
IBM     92.8     *     *     *     -    94.5     *     *     *     -
INO     93.5  98.1  97.1  98.7   1.6    88.1  99.8  98.1  99.7   1.6
IRS     96.7  97.6     -     -     -    99.0  99.5     -     -     -
ITM        -  64.2     -     -     -       -     -     -     -     -
IVB        -     -     -     -     -    92.8  95.0     -     -     -
MKS        -     -     -     -     -       -     -     -  97.1     -
MR2        -     -  65.9     -     -       -     -  64.9     -     -
NAV        -  96.8  97.6  96.8  -0.8    95.3  99.7  98.7  98.0  -0.7
NOD        -  97.6  98.3  98.3   0.0       -  99.8 100.0  99.4  -0.6
NV5        -     -  99.0     -     -       -     -  99.6     -     -
NVC     93.6  97.0  99.0  99.1   0.1       -  99.1  99.6  99.9   0.3
PAV     98.4  99.9  99.6 100.0   0.4    99.5  99.5  86.7  99.9  13.2
PCC        -  81.2     -     -     -       -  98.0     -     -     -
PER        -     -     -     -     -       -     -     -  53.7     -
PRO        -  37.3  39.8  44.6   4.8       -  58.0  61.9  67.4   5.5
QHL        -     -     -     -     -       -     -     -   0.0     -
RAV     84.9     -  86.9  86.5  -0.4    92.2     -  98.1  97.9  -0.2
SCN     86.6  99.8  99.7 100.0   0.3    97.7 100.0  99.8 100.0   0.2
SWP     98.4     -  99.0  99.6   0.6    98.6     -  98.5  98.6   0.1
TBA     92.6     *     *     *     -    98.7     *     *     *     -
TSC        -  55.3  53.8     -     -       -  76.5  64.9     -     -
VBS        -     -     -     -     -    41.5     -     -     -     -
VBW        -  26.5     -     -     -    93.4     -     -     -     -
VET        -  66.3     *     *     *       -  97.6     *     *     -
VSP        -  86.4  79.7  78.1  -1.6       -   0.4   0.3     -     -
-----------------------------------------------------------------------
Mean    95.0% 84.2% 89.7% 91.6%  0.3%   92.1% 90.3% 93.5% 95.0%  0.9%
-----------------------------------------------------------------------

Generally, the ability of W-98 scanners to detect file and macro
zoo viruses "in the mean" has further improved, for file viruses to
now 91.6% and for macro viruses to now 95.0%. On the other side,
the comparison for products participating in the last 2 tests
(which include essentially the market-leading products) is less
positive: the mean improvement of file virus detection is only
0.37%, and mean macro virus detection rates are moderately up
(+0.9%), but generally these products still reach a high level of
detection (>95%).

The same grid as for the DOS classification is applied to classify
scanners according to their ability to detect file and macro
viruses under Windows 98. Both zoo and In-The-Wild rates are used
to grade products (a 100% detection rate for file and macro ITW
viruses is an absolute requirement, which surprisingly some
otherwise good scanners did not meet). This time, FSE and SCN
reached a 100% detection rate for both file and macro viruses, both
in the zoo and In-The-Wild testbeds, and are rated "perfect" (last
test: 0 products).
The following list indicates which scanners reach "perfect",
"excellent" (>99%) and "very good" (>95%) grades:

                              (file zoo/ITW;   macro zoo/ITW)
   ----------------------------------------------------------
   "Perfect" W-98 scanners:
        FSE  (100.0% 100.0%;  100.0% 100.0%)  (+)
        SCN  (100.0% 100.0%;  100.0% 100.0%)  (+)
   ----------------------------------------------------------
   "Excellent" W-98 scanners:
        PAV  (100.0% 100.0%;   99.9% 100.0%)  (+)
        AVP  ( 99.9% 100.0%;   99.9% 100.0%)  (=)
        DSE  ( 99.8% 100.0%;  100.0% 100.0%)  (+)
        FPR  ( 99.7% 100.0%;  100.0% 100.0%)  (=)
        AVK  ( 99.7% 100.0%;   99.9% 100.0%)  (=)
        CMD  ( 99.6% 100.0%;  100.0% 100.0%)  (+)
        NVC  ( 99.1% 100.0%;   99.9% 100.0%)  (=)
   ----------------------------------------------------------
   "Very Good" W-98 scanners:
        SWP  ( 99.6% 100.0%;   98.6% 100.0%)  (=)
        INO  ( 98.7% 100.0%;   99.7% 100.0%)  (+)
        NOD  ( 98.3% 100.0%;   99.4% 100.0%)  (=)
        NAV  ( 96.8% 100.0%;   98.0% 100.0%)  (=)
        DRW  ( 96.7% 100.0%;   98.4% 100.0%)  (=)
   ----------------------------------------------------------

For detection of macro viruses under Windows 98, the following 15
scanners detect at least 95% of zoo macro viruses and 100% of ITW
viruses:
   "Perfect"   (100%  100%): CMD, DSE, FPR, FPW, FSE and SCN
   "Excellent" (>99%  100%): ATD, AVK, AVP, CMD, DSE, FPR, FPW,
                             FSE, FWN, INO, NOD, NVC, PAV and SCN
   "Very Good" (>95%  100%): AVG, DRW, NAV, RAV and SWP.

**************************************************************
Findings #8: Virus detection rates under W-98 on high level
**************************************************************
Findings #8.1: Detection rates for file and esp. macro viruses for
     scanners under Windows 98 are rather stable at a fairly high,
     though not perfect, level:
          Perfect scanners   (100%): 3 (last test: 0)
          Excellent scanners (>99%): 7 (last test: 6)
          Very Good scanners (>95%): 5 (last test: 6)
Findings #8.2: AV producers should invest more work into file
     virus detection, esp. into VKIT virus detection (where
     results are not equally promising), as well as into the
     detection of viruses in compressed objects (essential for
     on-access scanning).
**************************************************************

Eval #9: Overall virus detection rates under Windows-NT:
========================================================

The number of scanners running under Windows NT is still small,
though growing. Significantly fewer products were available for
these tests, compared with the traditional DOS scene. The following
table summarizes the results of file and macro virus detection under
Windows NT in the last six VTC tests:

Table AC: Comparison of File/Macro Virus Detection Rates
          in the last six VTC tests under Windows NT:
===========================================================
Scan ====== File Virus Detection ======= ===== Macro Virus Detection =======
ner  9707 9802 9810 9903 9909 0004 Delta 9707 9802 9810 9903 9909 0004 Delta
------------------------------------------------------------------------------
ANT     88.9 69.2 91.3  -   87.2 92.8  12.6  92.2  -   85.7  -   89.3 90.2  0.9
ANY      -    -   69.7  -    -    -     -     -    -   70.5  -    -    -    -
ATD      -    -    -    -    -   100%   -     -    -    -    -    -   99.9  -
AVA      -   97.4 96.6 97.1 97.4 97.2  -0.2   -   91.9 97.2 95.2 93.3 94.3  1.0
AVG      -    -    -   87.3 87.0 85.4  -1.6   -    -    -   82.5 96.6 97.5  0.9
AVK      -    -   99.6 90.2 99.8 99.7  -0.1   -    -   99.6 99.6 100% 99.9 -0.1
AVP      -    -   83.7 99.9 99.8 99.9   0.1   -    -   100% 99.2 100% 99.9 -0.1
AVX      -    -    -   74.2 75.2 80.4   5.2   -    -    -   98.9 98.7 94.5 -4.2
AW       -   56.4  -    -    -    -     -     -   61.0  -    -    -    -    -
CMD      -    -    -    -    -   99.6   -     -    -    -    -    -   100%  -
DRW/DWW  -    -    -   93.3 98.3 98.3   0.0   -    -    -   98.3 98.8 98.4 -0.4
DSS/E   99.6 99.7 99.9 99.3  *    -     -    99.0 100% 100% 100%  *    -    -
ESA      -    -    -    -    -   58.0   -     -    -    -    -    -   88.9  -
FPR/FMA  -   96.1  -   98.7 99.4  -     -     -   99.9 99.8 99.8 99.7  -    -
FPW      -    -    -    -    -   99.6   -     -    -    -    -   99.7 100%  0.3
FSE      -   85.3 99.8 100% 99.9 100%   0.1   -    -   99.9 100% 100% 100%  0.0
FWN      -    -    -    -    -    -     -     -    -   99.6 99.7  -   99.9  -
HMV      -    -    -    -    -    -     -     -    -   99.0 99.5  -    -    -
IBM     95.2 95.2 77.2  *    *    *     *    92.9 92.6 98.6  *    *    *    *
INO      -   92.8  -   98.1 98.0 98.7   0.7   -   89.7  -   99.8 99.7 99.7  0.0
IRS      -   96.3  -   97.6  -    -     -     -   99.1  -   99.5  -    -    -
IVB      -    -    -    -    -    -     -     -    -   92.8 95.0  -    -    -
MKS      -    -    -    -    -   78.0   -     -    -    -    -    -   97.1  -
MR2      -
         -    -    -   61.9  -     -    -    -    -    -   69.6  -    -
NAV     86.5 97.1  -   98.0 97.6 96.8  -0.8  95.6 98.7 99.9 99.7 98.7 98.0 -0.7
NOD      -    -    -   97.6 98.2 98.3   0.1   -    -    -   99.8 100% 99.4 -0.6
NVC     89.6 93.8 93.6 96.4  -   99.1   -    96.6 99.2  -   98.9 98.9 99.9  1.0
NVN      -    -    -    -   99.0  -     -     -    -    -    -   99.5  -    -
PAV     97.7 98.7 98.4 97.2 99.6 100%   0.4  93.5 98.8 99.5 99.4 99.7 99.9  0.2
PRO      -    -    -   37.3 42.4 45.6   3.2   -    -    -   58.0 61.9 67.4  5.5
QHL      -    -    -    -    -    -     -     -    -    -    -    -    0.0  -
RAV      -   81.6 84.9 85.5  -   88.0   -     -   98.9 99.5 99.2  -   97.9  -
RA7      -    -    -   89.3  -    -     -     -    -    -   99.2  -    -    -
PCC     63.1  -    -    -    -    -     -     -   94.8  -    -    -    -    -
PER      -    -    -    -    -    -     -     -   91.0  -    -    -    -    -
SCN     94.2 91.6 71.4 99.1 99.8 99.8   0.0  97.6 99.1 97.7 100% 100% 100%  0.0
SWP     94.5 96.8 98.4  -   99.0 99.6   0.6  89.1 98.4 97.5  -   98.4 98.6  0.2
TBA      -   93.8 92.6  *    *    *     -    96.1  -   98.7  *    *    *    -
TNT      -    -    -    *    *    *     -     -    -   44.4  *    *    *    -
VET     64.9  -    -   65.4  *    *     -     -   94.0  -   94.9  *    *    -
VSA      -   56.7  -    -    -    -     -     -   84.4  -    -    -    -    -
VSP      -    -    -   87.0 69.8 78.1   8.3   -    -    -   86.7  0.3  0.0 -0.3
------------------------------------------------------------------------------
Mean:   87.4 88.1 89.0 89.2 90.0 91.0  1.8%  94.7 95.9 91.6 95.3 95.1 96.5 0.2%
------------------------------------------------------------------------------

Generally, the ability of W-NT scanners to detect file zoo viruses
"in the mean" is stable, but at an insufficient level (91.0%);
however, those scanners present in the last test pushed their
detection rates up by 1.8% (mean). For macro viruses, the "mean"
detection rate is stable at an acceptable level (96.5%); here,
products that participated in the last VTC tests succeeded in
following the growth of the virus population by keeping their
detection rates (+0.2%) at high levels, where spectacular
improvements are harder to achieve.

The same grid as for the DOS and W-98 classification is applied to
grade scanners according to their ability to detect file and macro
viruses under Windows NT. Under W-NT, just ONE product (FSE) reached
a 100% detection rate for both file and macro viruses, both zoo and
In-The-Wild, and is rated "perfect".
But 9 scanners reach grade "excellent" (>99%), and 4 scanners are
rated "very good" (>95%):

                               (file/macro zoo; file/macro ITW)
--------------------------------
"Perfect" W-NT scanners:    FSE  (100.0% 100.0%; 100.0% 100.0%)  (+)
--------------------------------
"Excellent" W-NT scanners:  ATD  (100.0%  99.9%; 100.0% 100.0%)  (+)
                            PAV  (100.0%  99.9%; 100.0% 100.0%)  (=)
                            AVP  ( 99.9%  99.9%; 100.0% 100.0%)  (=)
                            SCN  ( 99.8% 100.0%; 100.0% 100.0%)  (=)
                            AVK  ( 99.7%  99.9%; 100.0% 100.0%)  (=)
                            CMD  ( 99.6% 100.0%; 100.0% 100.0%)  (+)
                            FPW  ( 99.6% 100.0%; 100.0% 100.0%)  (=)
                            SWP  ( 99.6%  98.6%; 100.0% 100.0%)  (=)
                            NVC  ( 99.1%  99.9%; 100.0% 100.0%)  (+)
--------------------------------
"Very Good" W-NT scanners:  INO  ( 98.7%  99.7%; 100.0% 100.0%)  (=)
                            DRW  ( 98.3%  98.4%; 100.0% 100.0%)  (=)
                            NOD  ( 98.3%  99.4%; 100.0% 100.0%)  (=)
                            NAV  ( 96.8%  98.0%; 100.0% 100.0%)  (=)
--------------------------------

For the detection of macro viruses under Windows NT, the following 17
scanners detect at least 95% of zoo macro viruses and 100% of ITW
macro viruses:

   "Perfect"   (100% 100%): CMD, FPW, FSE and SCN
   "Excellent" (>99% 100%): ATD, AVK, AVP, FWN, INO, NOD, NVC and PAV
   "Very Good" (>95% 100%): AVG, DRW, NAV, RAV and SWP

**************************************************************
Findings #9:  Virus detection rates under W-NT on a high level
              1 "perfect" Windows-NT zoo scanner:  FSE
              9 "excellent" scanners:  ATD, PAV, AVP, SCN, AVK,
                CMD, FPW, SWP and NVC
              4 macro-only "perfect" products:  CMD, FPW, FSE, SCN
**************************************************************
Findings #9.1: Detection rates for file and especially macro viruses
               for scanners under Windows NT have reached a fairly
               high level, similar to W-98:
                  Perfect scanners   (100%): 1 (last test: 0)
                  Excellent scanners (>99%): 9 (last test: 8)
                  Very Good scanners (>95%): 4 (last test: 5)
Findings #9.2: AV producers should invest more work into file virus
               detection, esp.
               into VKIT virus detection (where results are not
               equally promising), as well as into detection of
               viruses in compressed objects (essential for
               on-access scanning).
**************************************************************

Eval #10: File/Macro Virus detection under 32-bit engines:
==========================================================

Concerning the 32-bit engines used under both Windows 98 and Windows
NT, it is interesting to test the validity of the hypothesis that
related engines produce the same detection and identification
quality. (For details see 6HCMP32.TXT.)

When comparing results from the related tests, the good news is that
32-bit engines increasingly behave identically on the W-98 and W-NT
platforms:

   Equal detection of zoo file viruses:   13 (of 23) products
   Equal detection of ITW file viruses:   22 (of 23) products
   Equal detection of zoo macro viruses:  21 (of 23) products
   Equal detection of ITW macro viruses:  21 (of 23) products

*****************************************************************
Findings #10: Several W-32 scanners perform equally on W-98/W-NT
*****************************************************************
Findings #10.1: The assumption that the 32-bit engines in scanners
                produce the same detection rates on different
                instantiations of 32-bit operating systems (esp.
                Windows 98 and Windows NT) now holds for almost all
                scanners.
Findings #10.2: Analysis of ITW detection rates alone is NOT
                sufficient to determine the behaviour of 32-bit
                engines and does not guarantee equal detection rates
                on different W-32 platforms (esp. W-98/W-NT).
*****************************************************************

Eval #11: Evaluation of malware detection under Windows 98/NT:
==============================================================

As Windows 98 and Windows NT are often used for downloading
potentially hazardous objects from the Internet, it is interesting
to measure the ability of AntiVirus products to also act as
AntiMalware products. The same grading grid is applied as for DOS
AM products.
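Applied mechanically, the ">90% on both platforms" criterion used
below for an "excellent" AntiMalware grade amounts to a simple
filter. A minimal sketch (not VTC tooling): the FSE and CMD rates
are file-malware results quoted in this report, while "XYZ" is a
made-up product included only to show a failing entry:

```python
# Sketch of the ">90% file-malware detection on both W-98 and W-NT"
# criterion for grading an AV product as an "excellent" AM product.

malware_rates = {  # product: (W-98 file malware %, W-NT file malware %)
    "FSE": (98.7, 98.7),
    "CMD": (95.3, 95.3),
    "XYZ": (67.4, 61.9),   # hypothetical low-scoring product
}

excellent = sorted(p for p, (w98, wnt) in malware_rates.items()
                   if w98 > 90.0 and wnt > 90.0)
print(excellent)   # ['CMD', 'FSE']
```

The same filter with a 100% threshold would yield the "perfect"
grade, which no product reaches for file malware in this test.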
Similar to DOS, NO AV product can presently be graded "Perfect" (all
rates 100.0%), but 7 scanners (compared to last test: 2) perform as
AM products with grade "excellent" (>90%), both for W-98 and W-NT;
several scanners now regain the level they had reached in test
"1999-03", after having lost this grade due to insufficient
detection rates in test "1999-09".

                                   ===== Malware Detection =====
                                   == File malw ==  Macro malw ==
                                   (W-98   W-NT  ;  W-98   W-NT)
-----------------------------------
"Perfect" W-98/W-NT mw scanners:   NONE
-----------------------------------
"Excellent" W-98/W-NT mw scanners:
                               AVK (91.1%  91.1%;  98.8%  98.8%)  (+)
                               AVP (91.5%  91.6%;  96.6%  96.6%)  (+)
                               CMD (95.3%  95.3%; 100.0% 100.0%)  (+)
                               FPR (91.5%  91.5%; 100.0% 100.0%)  (+)
                               FSE (98.7%  98.7%; 100.0% 100.0%)  (=)
                               PAV (91.6%  91.6%;  98.8%  98.8%)  (+)
                               SCN (94.3%  93.3%; 100.0%  99.6%)  (=)
-----------------------------------

Detection of macro malware is evidently better supported than file
malware detection, as several more AV products detect macro malware
under W-98 (14 products) and W-NT (16 products). 3 products were
rated "perfect", and 12 further products reached an "excellent"
level of macro malware detection under both W-98 and W-NT:

   Detection of macro malware under W-98/W-NT
   --------------------------------------------
   "Perfect":    CMD, FPW, FSE  (100.0% 100.0%)
   "Excellent":  SCN            (100.0%  99.6%)
                 ATD, AVK, PAV  ( 98.8%  98.8%)
                 INO            ( 97.3%  97.3%)
                 FWN            ( 96.9%  98.1%)
                 AVP            ( 96.9%  96.9%)
                 NOD            ( 96.2%  96.2%)
                 NVC, RAV       ( 95.4%  95.4%)
                 SWP            ( 95.0%  95.0%)
                 AVX            ( 94.2%  94.2%)
   --------------------------------------------

It is also interesting to observe that malware detection is less
platform-dependent, as DOS, W-98 and W-NT detection rates are often
consistent. Generally, some AV products are able to help protect
users by detecting file and macro-related malware at a significant
level. Fortunately, the related products also show good to excellent
results in detecting viral malware.
***************************************************************
Findings #11: AntiMalware quality of AV products is developing
              No "perfect" AM product for W-98 and W-NT for file
              and macro malware detection, but 7 scanners with
              grade "excellent" (>90%): AVK, AVP, CMD, FPR, FSE,
              PAV and SCN.
              Concerning macro malware detection under W-98/W-NT:
              3 products are "perfect":    CMD, FPW and FSE
              12 products are "excellent": SCN, ATD, AVK, PAV, INO,
                 FWN, AVP, NOD, NVC, RAV, SWP and AVX
***************************************************************
Findings #11.1: Several AntiMalware producers also help customers
                detect non-viral malware under 32-bit operating
                systems, esp. Windows 98 and Windows NT.
Findings #11.2: The ability to detect macro malware is more
                developed than the detection of file malware.
Findings #11.3: Much more work must be invested to reliably detect
                file and macro malware and to protect customers from
                downloading trojans etc.
***************************************************************