=========================================
File 7EVALDOS.TXT
Evaluation of VTC Scanner Test "2001-04":
=========================================
Formatted with non-proportional font (Courier)

Content of this file:
=====================
Eval A: Development of detection rates under DOS:
-------------------------------------------------
Eval DOS.01:  Development of DOS Scanner Detection Rates
              Table DOS-A: File/Macro/Script Virus Detection Rate
                           in last 9 VTC tests
Eval DOS.02:  In-The-Wild Detection under DOS
Eval DOS.03:  Evaluation of overall DOS AV detection rates
Eval DOS.04:  Evaluation of detection by virus classes under DOS
              DOS.04.1  Grading the Detection of file viruses under DOS
              DOS.04.2  Grading the Detection of macro viruses under DOS
              DOS.04.3  Grading the Detection of script viruses under DOS
              DOS.04.4  Grading of Poly-virus detection
              DOS.04.5  Grading of VKit virus detection
              DOS.04.6  Grading the detection of boot viruses under DOS
Eval DOS.05:  Detection of Packed File and Macro Viruses under DOS
Eval DOS.06:  Avoidance of False Alarms (File, Macro) under DOS
Eval DOS.07:  Detection of File and Macro Malware under DOS
Eval DOS.08:  Detection of "Exotic" Malware
Eval DOS.SUM: Grading of DOS products

**********************************************************************
This part of the VTC "2001-04" test report evaluates the detailed
results as given in the following sections (files):
     6BDOSFIL.TXT   File virus/malware results under DOS
     6CDOSBOO.TXT   Boot virus results under DOS
     6DDOSMAC.TXT   Macro virus/malware results under DOS

The following 13 products participated in this test under DOS:
-----------------------------------------------------------------
Products submitted for aVTC test under DOS:
-----------------------------------------------------------------
   AVA V7.70                  Database: VPS 7.70-47, Dec.4,2000
   AVG 6.220                  Database: 105
   AVK Version 3.0 Build 133  Database: Dec.01,2000
   AVP Version 3.0 Build 133
   DRW 4.21                   Database: Sep.25,2000
   FPR 3.08b                  Database: Dec.11,2000
   INO V4.5 n(s)              Database: Dec.11,2000
   MR2 1.14 (Dec.2000)        Database: Dec.11,2000
   NAV                        Database: Dec.07,2000
   NVC 4.90.00
   PAV 3.0 Build 131          Database: Dec.08,2000
   SCN v4.12.0                Database: Dec.06,2000
   VSP 12.02.2
-----------------------------------------------------------------

Eval DOS.01: Development of Scanner Detection Rates under DOS:
==============================================================

The number of scanners running under DOS is gradually decreasing; for
the first time in VTC tests, the number of WNT scanners (18) was
larger than the number of DOS scanners (13). Evidently, AV producers
now invest more work in the development of W32-related platforms. It
can readily be assumed that the importance of DOS scanners will be
reduced further, in favour of W32 products.
The following table summarizes the results of file, macro and script
virus detection under DOS in the last 9 VTC tests:

Table DOS-A: File/Macro/Script Virus Detection Rate in last 9 VTC tests under DOS:
==================================================================================
     ------------------ File Virus ------------------ I ---------------------- Macro Virus ---------------------- I - Script Virus -
SCAN 9702 9707 9802 9810 9903 9909 0004 0104 Delta I 9702 9707 9802 9810 9903 9909 0004 0008 0104 Delta I 0008 0104 Delta
NER     %    %    %    %    %    %    %    %    %  I    %    %    %    %    %    %    %    %    %    %  I    %    %    %
-------------------------------------------------+----------------------------------------------------+----------------
ALE  98.8 94.1 89.4    -    -    -    -    -    - I 96.5 66.0 49.8    -    -    -    -    -    -    - I    -    -    -
ANT  73.4 80.6 84.6 75.7    -    - 92.8    -    - I 58.0 68.6 80.4 56.6    -    - 85.9 93.3    -    - I 55.2    -    -
AVA  98.9 97.4 97.4 97.9 97.6 97.4 97.5 95.2 -2.3 I 99.3 98.2 80.4 97.2 95.9 94.6 93.7    - 92.3    - I    - 30.0    -
AVG  79.2 85.3 84.9 87.6 87.1 86.6    - 81.9    - I 25.2 71.0 27.1 81.6 82.5 96.6    -    - 98.3    - I    - 57.9    -
AVK     -    -    - 90.0 75.0    -    - 99.7    - I    -    -    - 99.7 99.6    -    - 100~ 100~  0.0 I 91.5 99.4 +7.9
AVP  98.5 98.4 99.3 99.7 99.7 99.8 99.6 99.8 +0.2 I 99.3 99.0 99.9 100% 99.8 100% 99.9    - 100~    - I    - 99.8    -
CMD     -    -    -    -    -    - 99.5    -    - I    -    -    -    -    - 99.5 100% 100~    -    - I 93.5    -    -
DRW  93.2 93.8 92.8 93.1 98.2 98.3    -    -    - I 90.2 98.1 94.3 99.3 98.3    - 98.4 97.6 98.0 +0.4 I 60.8 95.6 +34.8
DSE  99.7 99.6 99.9 99.9 99.8    -    -    -    - I 97.9 98.9 100% 100% 100%    -    -    -    -    - I    -    -    -
FMA     -    -    -    -    -    -    -    -    - I 98.6 98.2 99.9    -    -    -    -    -    -    - I    -    -    -
FPR  90.7 89.0 96.0 95.5 98.7 99.2 99.6 97.8 -1.8 I 43.4 36.1 99.9 99.8 99.8 99.7 100% 100~ 100%  0.0 I 90.5 96.9 +5.9
FSE     -    - 99.4 99.7 97.6 99.3 99.9    -    - I    -    - 99.9 90.1 99.6 97.6 99.9    -    -    - I    -    -    -
FWN     -    -    -    -    -    -    -    -    - I 97.2 96.4 91.0 85.7    -    -    -    -    -    - I    -    -    -
HMV     -    -    -    -    -    -    -    -    - I    -    - 98.2 99.0 99.5    -    -    -    -    - I    -    -    -
IBM  93.6 95.2 96.5    -    -    -    -    -    - I 65.0 88.8 99.6    -    -    -    -    -    -    - I    -    -    -
INO     -    - 92.0 93.5 98.1 94.7 94.6 91.0    - I    -    - 90.3 95.2 99.8 99.5 99.7 99.7 99.3 -0.4 I 77.8 66.0 -11.8
IRS     - 81.4 74.2    - 51.6    -    -    -    - I    - 69.5 48.2    - 89.1    -    -    -    -    - I    -    -    -
ITM     - 81.0 81.2 65.8 64.2    -    -    -    - I    - 81.8 58.2 68.6 76.3    -    -    -    -    - I    -    -    -
IVB   8.3    -    -    - 96.9    -    -    -    - I    -    -    -    -    -    -    -    -    -    - I    -    -    -
MR2     -    -    -    -    - 65.4    -    -    - I    -    -    -    -    - 69.6    -    - 44.2    - I    - 85.1    -
NAV  66.9 67.1 97.1 98.1 77.2 96.0 93.3 90.8 -2.5 I 80.7 86.4 98.7 99.8 99.7 98.6 97.4 97.0 93.8 -3.2 I 24.8 31.2 +6.6
NOD     -    -    - 96.9    - 96.9 98.3    -    - I    -    -    -    - 99.8 100% 99.4    -    -    - I    -    -    -
NVC  87.4 89.7 94.1 93.8 97.6    - 99.1    -    - I 13.3 96.6 99.2 90.8    - 99.6 99.9 99.9 99.8 -0.1 I 83.7 88.5 +4.8
PAN     -    - 67.8    -    -    -    -    -    - I    -    - 73.0    -    -    -    -    -    -    - I    -    -    -
PAV     - 96.6 98.8    - 73.7 98.8 98.7 99.9 +1.2 I    -    - 93.7 100% 99.5 98.8 99.9    - 100~    - I    - 99.8    -
PCC     -    -    -    -    -    -    -    -    - I    - 67.6    -    -    -    -    -    -    -    - I    -    -    -
PCV  67.9    -    -    -    -    -    -    -    - I    -    -    -    -    -    -    -    -    -    - I    -    -    -
PRO     -    -    -    - 35.5    -    -    -    - I    -    -    -    - 81.5    -    -    -    -    - I    -    -    -
RAV     -    -    - 71.0    -    -    -    -    - I    -    -    - 99.5 99.2    -    -    -    -    - I    -    -    -
SCN  83.9 93.5 90.7 87.8 99.8 97.1 99.9 99.8 -0.1 I 95.1 97.6 99.0 98.6 100% 100% 100% 100~ 100%  0.0 I 85.6 100% +14.4
SWP  95.9 94.5 96.8 98.4    - 99.0 98.4    -    - I 87.4 89.1 98.4 98.6    - 98.4 98.4    -    -    - I    -    -    -
TBA  95.5 93.7 92.1 93.2    -    -    -    -    - I 72.0 96.1 99.5 98.7    -    -    -    -    -    - I    -    -    -
TSC     -    - 50.4 56.1 39.5 51.6    -    -    - I    -    - 81.9 76.5 59.5 69.6    -    -    -    - I    -    -    -
TNT  58.0    -    -    -    -    -    -    -    - I    -    -    -    -    -    -    -    -    -    - I    -    -    -
VDS     - 44.0 37.1    -    -    -    -    -    - I 16.1  9.9  8.7    -    -    -    -    -    -    - I    -    -    -
UKV     -    -    -    -    -    -    -    -    - I    -    -    -    -    -    -    -  0.0    -    - I  0.0    -    -
VET     - 64.9    -    - 65.3    -    -    -    - I    - 94.0 97.3 97.5 97.6    -    -    -    -    - I    -    -    -
VIT     -    -    -    -    -    -  7.6    -    - I    -    -    -    -    -    -    -    -    -    - I    -    -    -
VRX     -    -    -    -    -    -    -    -    - I    -    -    -    -    -    -    -    -    -    - I    -    -    -
VBS  43.1 56.6    - 35.5    -    -    -    -    - I    -    -    -    -    -    -    -    -    -    - I    -    -    -
VHU  19.3    -    -    -    -    -    -    -    - I    -    -    -    -    -    -    -    -    -    - I    -    -    -
VSA     -    - 56.9    -    -    -    -    -    - I    -    - 80.6    -    -    -    -    -    -    - I    -    -    -
VSP     -    -    - 76.1 71.7 79.6    -    -    - I    -    -    -    -    -    -    -    -  0.0    - I    - 85.3    -
VSW     -    - 56.9    -    -    -    -    -    - I    -    - 83.0    -    -    -    -    -    -    - I    -    -    -
VTR  45.5    -    -    -    -    -    -    -    - I  6.3    -    -    -    -    -    -    -    -    - I    -    -    -
XSC  59.5    -    -    -    -    -    -    -    - I    -    -    -    -    -    -    -    -    -    - I    -    -    -
-------------------------------------------------+----------------------------------------------------+----------------
Mean 74.2 84.8 84.4 85.4 81.2 90.6 98.3 95.1% -0.9% I 69.6 80.9 83.8 89.6 93.6 88.2 98.0 98.6 93.8% -0.5% I 66.4 79.7% +8.9%
-------------------------------------------------+----------------------------------------------------+----------------

Generally, the mean ability of DOS scanners to detect file zoo
viruses shows a visible tendency to decrease. Moreover, those
scanners that were also present in the last test reduced their
detection rates by 0.9% in the mean.

Concerning macro viruses, the mean detection rate is also reduced and
tends to become insufficient. Here, products that participated in the
last VTC test reduced their detection rates "only" moderately (-0.5%).

Concerning script viruses, presently the fastest growing sector, the
detection rate is very low (79.7% in the mean). But those (7)
scanners which also participated in the last test succeeded in
improving their detection rate by an impressive 8.9% "in the mean".

**************************************************************
Findings DOS.1: For DOS, file and macro zoo virus detection rates
                are, in the mean, decreasing.
                4 (of 9) scanners detect almost all zoo file viruses
                (>99.7%), whereas 2 (out of 13) scanners detect ALL
                macro zoo viruses (and 3 more almost all).
                Detection rates for script viruses are, in the mean,
                still unacceptably low (79.7%), but the mean detection
                rate of those scanners which also participated in the
                last VTC test is improving significantly, though only
                1 (of 13) scanners detects ALL zoo script viruses.
********************************************************************

Eval DOS.02: In-The-Wild (File,Macro,Script) Detection under DOS
================================================================

Concerning "In-The-Wild" viruses, the following grid is applied:
   - detection rate is 100% : scanner is "perfect"
   - detection rate is >99% : scanner is "excellent"
   - detection rate is >95% : scanner is "very good"
   - detection rate is >90% : scanner is "good"
   - detection rate is <90% : scanner is "risky"

100% detection of In-The-Wild viruses, especially including the
detection of ALL instantiations (files) of those viruses, is now an
ABSOLUTE REQUIREMENT for file, macro and script viruses.
The following 4 DOS products (of 13) reach 100% detection of file,
macro and script In-The-Wild viruses, including all viral files
(instantiations), and are rated "perfect" in this category
(alphabetically ordered):

                            ITW Viruses&Files ( File    Macro   Script)
   -------------------------
   "Perfect" DOS scanners:    AVK             (100.0%  100.0%  100.0%)
                              AVP             (100.0%  100.0%  100.0%)
                              PAV             (100.0%  100.0%  100.0%)
                              SCN             (100.0%  100.0%  100.0%)
   ------------------------

The following scanners reached the grade "perfect" for file and macro
ITW virus detection and also detected all ITW script viruses, though
not all viral instantiations (files):

                            ITW Viruses&Files ( File    Macro )
   ---------------------
   "Perfect" file/macro only ITW scanners:
                              AVG             (100.0%  100.0%)
                              DRW             (100.0%  100.0%)
                              FPR             (100.0%  100.0%)
                              NVC             (100.0%  100.0%)
   ---------------------

************************************************************
Findings DOS.2: 4 AV products (out of 13) detect ALL In-The-Wild
                file, macro and script viruses in ALL instantiations
                (files):   AVK, AVP, PAV, SCN
                4 more products can be rated "perfect" concerning
                detection of file and macro viruses, but they still
                fail to detect all script viral files (objects):
                           AVG, DRW, FPR, NVC
************************************************************

Eval DOS.03: Evaluation of overall DOS AV detection rates (zoo,ITW)
===================================================================

The following grid is applied to classify scanners:
   - detection rate =100%     : scanner is graded "perfect"
   - detection rate above 99% : scanner is graded "excellent"
   - detection rate above 95% : scanner is graded "very good"
   - detection rate above 90% : scanner is graded "good"
   - detection rate of 80-90% : scanner is graded "good enough"
   - detection rate of 70-80% : scanner is graded "not good enough"
   - detection rate of 60-70% : scanner is graded "rather bad"
   - detection rate of 50-60% : scanner is graded "very bad"
   - detection rate below 50% : scanner is graded "useless"

To assess an "overall AV grade" (including file, macro and script
virus detection, for unpacked objects), the lowest of the related
results is used to classify each scanner (see the sketch at the end
of this section). Only scanners for which all tests were completed
are considered (for problems in the test, see 8problms.txt).

The following list indicates those scanners graded into one of the
upper three categories, with file, macro and script virus detection
rates for unpacked zoo samples, and with perfect ITW virus detection
(rate = 100%).

Under DOS, NO product reached a 100% detection rate for file, macro
and script viruses, both zoo and In-The-Wild, and could thus be rated
"perfect" (last time, 1 product was rated "perfect"). But 4 scanners
are graded "excellent" (>99%), and 1 more scanner is rated
"very good" (>95%):

                             (zoo: file/macro/script ; file/macro/script: ITW)
   ----------------------------------------------
   "Perfect" DOS scanners:    =NONE=
   ----------------------------------------------
   "Excellent" DOS scanners:  SCN ( 99.8  100%  100%  ;  100%  100%  100% )
                              PAV ( 99.9  100~  99.8  ;  100%  100%  100% )
                              AVP ( 99.8  100~  99.8  ;  100%  100%  100% )
                              AVK ( 99.7  100~  99.4  ;  100%  100%  100% )
   ----------------------------------------------
   "Very Good" DOS scanners:  FPR ( 97.8  100%  96.9  ;  100%  100%  100% )
   ----------------------------------------------

*****************************************************************
Findings DOS.3: No DOS product is overall rated "perfect".
                4 "excellent" overall scanners: SCN, PAV, AVP, AVK
                1 "very good" overall scanner : FPR
*****************************************************************
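To illustrate how the grading grid and the "lowest result" rule
combine, the following minimal Python sketch recomputes an overall
grade from the three zoo detection rates. It is not part of the VTC
tooling; the thresholds and grade names are copied from the grid
above, the function names are illustrative, and the additional ITW
requirement (100%) used for the listing is omitted for brevity.

# Minimal sketch (not VTC tooling) of the Eval DOS.03 grading scheme:
# a detection rate is mapped to a grade, and the overall AV grade is
# taken from the LOWEST of the file/macro/script zoo results.

def grade(rate):
    """Map a detection rate (in percent) to the VTC grade names above."""
    if rate == 100.0:
        return "perfect"
    if rate > 99.0:
        return "excellent"
    if rate > 95.0:
        return "very good"
    if rate > 90.0:
        return "good"
    if rate >= 80.0:
        return "good enough"
    if rate >= 70.0:
        return "not good enough"
    if rate >= 60.0:
        return "rather bad"
    if rate >= 50.0:
        return "very bad"
    return "useless"

def overall_grade(file_rate, macro_rate, script_rate):
    """Overall AV grade = grade of the lowest of the three zoo results."""
    return grade(min(file_rate, macro_rate, script_rate))

# Example taken from the list above: FPR with zoo rates 97.8 / 100 / 96.9
print(overall_grade(97.8, 100.0, 96.9))   # -> "very good"
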
Eval DOS.04: Evaluation of detection by virus classes under DOS:
================================================================

Some scanners are specialised in detecting certain classes of viruses
(either by deliberately limiting themselves to one class, esp. macro
viruses, or by detecting one class significantly better than others).
It is therefore worth noting which scanners perform best in detecting
file, macro and script viruses. Moreover, boot virus detection was
also tested under this platform.

Two special tests of file viruses were also performed to determine
the quality of AV product maintenance. One test was concerned with
almost 11,000 viruses generated with the VKIT virus generator. Some
AV products count each of the potentially 14,000 viruses as a new
variant, while others count all VKIT viruses as just ONE virus.
Fortunately, a high proportion of the tested products detects these
viruses (see DOS.04.5), although the reliability of detection is
significantly lower than normal (see 6BDOSFIL.TXT).

Another special test was devoted to the detection of 10,000
polymorphic generations each of the following 6 polymorphic viruses:
Maltese.Amoeba, MTE.Encroacher.B, NATAS, TREMOR, One-Half and
Tequila. Detection rates were "almost perfect".

Products rated "perfect" (=100%), "excellent" (>99%) and "very good"
(>95%) are listed.

DOS.04.1 Grading the Detection of file viruses under DOS:
---------------------------------------------------------
   "Perfect" DOS file scanners:     === NONE ===
   "Excellent" DOS file scanners:   PAV ( 99.9%)
                                    AVP ( 99.8%)
                                    SCN ( 99.8%)
                                    AVK ( 99.7%)
   "Very Good" DOS file scanners:   FPR ( 97.8%)
                                    AVA ( 95.2%)

DOS.04.2 Grading the Detection of macro viruses under DOS
----------------------------------------------------------
   "Perfect" DOS macro scanners:    FPR (100.0%)
                                    SCN (100.0%)
   "Excellent" DOS macro scanners:  AVK ( 100~ )
                                    AVP ( 100~ )
                                    PAV ( 100~ )
                                    NVC ( 99.8%)
                                    INO ( 99.3%)
   "Very Good" DOS macro scanners:  AVG ( 98.3%)
                                    DRW ( 98.0%)

DOS.04.3 Grading the Detection of Script viruses under DOS:
-----------------------------------------------------------
   "Perfect" DOS script scanners:   SCN (100.0%)
   "Excellent" DOS script scanners: AVP ( 99.8%)
                                    PAV ( 99.8%)
                                    AVK ( 99.4%)
   "Very Good" DOS script scanners: FPR ( 96.9%)
                                    DRW ( 95.6%)

DOS.04.4 Grading of Poly-virus detection under DOS
--------------------------------------------------
   "Perfect" Poly-detectors which detect all instantiations of all
   (6) polymorphic file viruses, always reliably:
                                    AVG, AVK, AVP, DRW, PAV
   "Excellent" Poly-detectors which detect all instantiations of all
   (6) polymorphic file viruses, but not always reliably:
                                    FPR, INO, NAV, NVC, SCN

DOS.04.5 Grading of VKit virus detection:
-----------------------------------------
   "Perfect" VKit detectors which detect all instantiations of all
   VKit viruses, always reliably:   =NONE=
   "Excellent" VKit detectors which detect all instantiations of all
   VKit viruses, but not always reliably:
                                    AVK, AVP, PAV, SCN

DOS.04.6 Grading the detection of boot viruses under DOS:
---------------------------------------------------------
   "Perfect" DOS boot scanners:     =NONE=
   "Excellent" DOS boot scanners:   PAV ( 99.4%)
                                    AVP ( 99.2%)
                                    SCN ( 99.2%)
                                    AVK ( 99.1%)
   "Very Good" DOS boot scanners:   NVC ( 98.2%)
                                    FPR ( 97.2%)
                                    INO ( 97.2%)
                                    NAV ( 95.9%)

********************************************************************
Finding DOS.4: Performance of DOS scanners by virus classes:
               ---------------------------------------------
               Perfect scanners for file zoo:        =NONE=
               Excellent scanners for file zoo:      PAV,AVP,SCN,AVK
               Perfect scanners for macro zoo:       FPR,SCN
               Perfect scanners for script zoo:      SCN
               Perfect scanners for polymorphic set: AVG,AVK,AVP,DRW,PAV
               Perfect scanners for VKit set:        =NONE=
               Excellent scanners for VKit set:      AVK,AVP,PAV,SCN
               Perfect scanners for boot viruses:    =NONE=
               Excellent scanners for boot viruses:  PAV,AVP,SCN,AVK
********************************************************************

Eval DOS.05: Detection of Packed File and Macro Viruses under DOS
=================================================================

Detection of file and macro viruses within packed objects becomes
essential for on-access scanning, esp. for incoming email possibly
loaded with malicious objects. It therefore seems reasonable to test
whether at least ITW viral objects compressed with popular packing
methods are also detected. In addition to the 4 packers used in
previous tests (PKZIP, ARJ, LHA, RAR), the following packers were
added to the detection tests: WinRAR and CAB. Tests are performed
only on In-The-Wild viruses packed once (no recursive packing). As
the last test showed that AV products are rather far from perfect
detection of packed viruses, the testbed remained essentially
unchanged to ease comparison and improvement.

A "perfect" product would detect ALL packed viral samples (100%),
file AND macro, for all (6) packers:
   ---------------------------------------------
   "Perfect" packed virus detectors:   SCN
   ---------------------------------------------

An "excellent" product would reach 100% detection of packed viral
samples (file & macro) for at least 5 packers:
   --------------------------------------------------------
   "Excellent" packed virus detectors: AVK, AVP, PAV
   --------------------------------------------------------
   Remark: these 3 products reach 100% detection rates for all 6
   packers, though not reliably for CABed macro viruses.

A "very good" product would detect all packed viral samples
(ITW file & macro) for at least 4 packers:
   -------------------------------------------------
   "Very Good" packed virus detectors: DRW, FPR
   -------------------------------------------------

Concerning only the detection of packed FILE virus samples, the
following products can be rated "perfect" as they detect ALL samples:
   ---------------------------------------------------------
   "Perfect" packed file virus detectors:  AVK, AVP, PAV, SCN
   ---------------------------------------------------------

Concerning only the detection of packed MACRO virus samples, the only
product rated "perfect" is the one (1) which detects ALL (file AND
macro) viral samples:
   -------------------------------------------
   "Perfect" packed macro virus detectors: SCN
   -------------------------------------------

Remark: Much more data were collected on the precision and
reliability of virus detection in packed objects. But in the present
state, it does NOT seem justified to add further differentiation to
the results discussed here.
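The packer-related grades above can be summarised mechanically. The
following minimal Python sketch is not part of the VTC tooling; the
packer list and the thresholds (all 6, at least 5, at least 4 packers
with 100% detection) are taken from this section, while the input
data and names are purely illustrative.

# Illustrative sketch (not VTC tooling): deriving the packed-virus grade
# from per-packer detection results as described in Eval DOS.05.

PACKERS = ["PKZIP", "ARJ", "LHA", "RAR", "WinRAR", "CAB"]

def packed_grade(rates_by_packer):
    """rates_by_packer: detection rate (%) of packed ITW file & macro samples per packer."""
    full = sum(1 for p in PACKERS if rates_by_packer.get(p, 0.0) == 100.0)
    if full == len(PACKERS):
        return "perfect"
    if full >= 5:
        return "excellent"
    if full >= 4:
        return "very good"
    return "below very good"

# Hypothetical example: 100% for every packer except CAB
example = {p: 100.0 for p in PACKERS}
example["CAB"] = 98.0
print(packed_grade(example))   # -> "excellent"
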
********************************************************************
Findings DOS.5: Detection of packed viral objects needs improvement.
                Perfect packed file/macro virus DOS detector:  SCN
                Perfect packed file virus detectors:  AVK, AVP, PAV, SCN
                Perfect packed macro virus detector:  SCN
********************************************************************

Eval DOS.06: Avoidance of False Alarms (File, Macro) under DOS:
===============================================================

First introduced in VTC test "1998-10", a set of clean (and
non-malicious) objects has been added to the file and macro virus
testbeds to determine the ability of scanners to avoid False-Positive
(FP) alarms. This ability is essential for "excellent" and
"very good" scanners, as there is no automatic aid for customers to
handle such cases (besides the psychological impact on customers'
work). Therefore, the grid used for grading AV products must be
significantly more rigid than the one used for detection.

The following grid is applied to classify scanners:
   - False Positive rate = 0.0% : scanner is graded "perfect"
   - False Positive rate < 0.5% : scanner is graded "excellent"
   - False Positive rate < 2.5% : scanner is graded "very good"
   - False Positive rate < 5.0% : scanner is graded "good enough"
   - False Positive rate <10.0% : scanner is graded "rather bad"
   - False Positive rate <20.0% : scanner is graded "very bad"
   - False Positive rate >20.0% : scanner is graded "useless"

Regarding the ability of scanners to avoid FP alarms, 6 (out of 13)
products in the test reported NOT A SINGLE False Positive alarm in
the file and macro clean testbeds and are therefore rated "perfect":
   ----------------------------------------------------------------
   "Perfect" FP avoiding DOS scanners: AVA, AVG, AVK, INO, PAV, SCN
   ----------------------------------------------------------------

Several more DOS scanners (11 in total, including the 6 listed above)
gave NO FP alarm on clean files:
   -------------------------------------------------------------
   "Perfect" file FP avoiding DOS scanners:
            AVA, AVG, AVK, AVP, FPR, INO, MR2, NAV, NVC, PAV, SCN
   -------------------------------------------------------------

****************************************************************
Findings DOS.6: Avoidance of False-Positive alarms is improving,
                though it is still regarded as insufficient.
                FP-avoiding perfect DOS scanners:
                          AVA, AVG, AVK, INO, PAV, SCN
****************************************************************

Eval DOS.07: Detection of File and Macro Malware under DOS
==========================================================

Since test "1997-07", VTC also tests the ability of AV products to
detect non-viral malware. An essential argument for this category is
that customers are interested in also being warned about, and
protected from, non-viral and non-worm malicious objects such as
trojans, the payload of which may be disastrous to their work (e.g.
stealing passwords). Since VTC test "1999-03", malware detection is a
mandatory part of VTC tests, both for submitted products and for
those downloaded as free evaluation copies. A growing number of
scanners is indeed able to detect non-viral malware.
The following grid (admittedly with reduced granularity) is applied
to classify the detection of file and macro malware:
   - detection rate = 100%    : scanner is "perfect"
   - detection rate >  90%    : scanner is "excellent"
   - detection rate of 80-90% : scanner is "very good"
   - detection rate of 60-80% : scanner is "good enough"
   - detection rate  <  60%   : scanner is "not good enough"

Concerning file AND macro malware detection:
   --------------------------------------------------------------
   "Perfect" file/macro malware detectors under DOS:  === NONE ===
   --------------------------------------------------------------
   ---------------------------------------------------
   "Excellent" file/macro malware detectors under DOS:
                                      File    Macro
                                AVP  (96.4%   99.3%)
                                PAV  (96.2%   99.3%)
                                AVK  (96.0%   99.3%)
                                FPR  (94.0%   99.8%)
                                SCN  (90.4%   99.8%)
   ---------------------------------------------------

Concerning only macro malware detection, NO product is rated
"perfect", while 6 reach the grade "excellent":
   ---------------------------------------------------
   "Perfect" macro malware detectors under DOS:  =NONE=
   ---------------------------------------------------
   "Excellent" macro malware detectors under DOS:
                                FPR  ( 99.8%)
                                SCN  ( 99.8%)
                                AVK  ( 99.3%)
                                AVP  ( 99.3%)
                                PAV  ( 99.3%)
                                NVC  ( 99.0%)
   ----------------------------------------------------

*******************************************************************
Findings DOS.7: File/Macro malware detection under DOS is stable
                but needs further improvement.
                NO product is "perfect" (in the last test: 2
                products!), but 5 products are rated "excellent"
                (>90% detection rate):  AVP, PAV, AVK, FPR, SCN
                Concerning only macro malware detection, NO product
                is rated "perfect", and 6 products are rated
                "excellent":  FPR, SCN, AVK, AVP, PAV, NVC
*******************************************************************

Eval DOS.08: Detection of "Exotic" malware under DOS
====================================================

With the growing exchange of objects which may be activated under
other platforms (esp. including Java, Linux etc.), scanners must also
detect related malware to warn customers before such malcode is
activated. For the first time in VTC tests, a selected (small)
testbed of viruses active on other platforms - we presently call
these "exotic" viruses - has been used to determine the detection
quality of contemporary on-demand scanners (engines, signatures).

Even with "reduced" quality requirements, the following products
detect exotic malware on an initially low level of at least 70%
(viruses, files):
                                     virus    file
                                PAV  (92.2%   92.7%)
                                AVP  (90.4%   92.0%)
                                AVK  (89.6%   91.6%)
                                SCN  (70.4%   80.3%)

*******************************************************************
Findings DOS.8: Exotic viruses are detected by scanners under DOS
                only to a lesser degree.
                The following products detect at least 70%:
                          PAV, AVK, AVP, SCN
*******************************************************************

Eval DOS.SUM: Grading of DOS products:
======================================

Under the scope of VTC's grading system, a "Perfect DOS AV/AM
product" would have the following characteristics:

Definition (1): A "Perfect AntiVirus (AV) product"
--------------------------------------------------
 1) will detect ALL viral samples "In-The-Wild" AND at least 99% of
    zoo samples, in ALL categories (file, boot, macro and
    script-based viruses), always with the same high precision of
    identification and in every infected sample,
 2) will detect ALL ITW viral samples in compressed objects for all
    (6) popular packers, and
 3) will NEVER issue a False Positive alarm on any sample which is
    not viral.
 Remark: detection of "exotic viruses" is presently NOT rated.

Definition (2): A "Perfect AntiMalware (AM) product"
----------------------------------------------------
 1) will be a "Perfect AntiVirus product", that is:
    100% ITW detection AND >99% zoo detection AND high precision of
    identification AND high precision of detection AND 100% detection
    of ITW viruses in compressed objects AND 0% False-Positive rate,
 2) AND it will also detect essential forms of malicious software,
    at least in unpacked forms, reliably at high rates (>90%).
 Remark: detection of "exotic malware" is presently NOT rated.

*****************************************************************
In VTC test "2001-04", we found  *** NO perfect DOS AV product ***
             and we found        *** NO perfect DOS AM product ***
*****************************************************************

But several products seem to approach our definitions on a rather
high level (taking into account the highest grade "perfect" defined
at the 100% level, and "excellent" defined as >99% for virus
detection and >90% for malware detection):

Test category:        "Perfect"                  "Excellent"
----------------------------------------------------------------
DOS zoo file test:     ---                        PAV,AVP,SCN,AVK
DOS zoo macro test:    FPR,SCN                    ---
DOS zoo script test:   SCN                        ---
DOS zoo boot test:     ---                        PAV,AVP,SCN,AVK
DOS zoo Poly test:     AVG,AVK,AVP,DRW,PAV        ---
DOS zoo VKit test:     ---                        AVK,AVP,PAV,SCN
DOS ITW tests:         AVK,AVP,PAV,SCN            AVG,DRW,FPR,NVC
DOS pack-tests:        SCN                        ---
DOS FP avoidance:      AVA,AVG,AVK,INO,PAV,SCN    ---
DOS Malware Test:      ---                        AVP,PAV,AVK,FPR,SCN
----------------------------------------------------------------

In order to support the race for more customer protection, we
evaluate the order of performance in this DOS test with a simple
algorithm: we count places, weighting every "perfect" entry twice
and every "excellent" entry once, to determine the first places
(a small sketch of this point count follows the rankings below):

************************************************************
"Perfect" DOS AntiVirus product:    =NONE=
"Excellent" DOS AV products:
     1st place:  SCN      (13 points)
     2nd place:  AVK, PAV ( 9 points)
     4th place:  AVP      ( 7 points)
     5th place:  AVG      ( 5 points)
     6th place:  FPR, DRW ( 3 points)
     8th place:  AVA, INO ( 2 points)
    10th place:  NVC      ( 1 point )
************************************************************
"Perfect" DOS AntiMalware product:  =NONE=
"Excellent" DOS AntiMalware products:
     1st place:  SCN      (14 points)
     2nd place:  AVK, PAV (10 points)
     4th place:  AVP      ( 8 points)
     5th place:  FPR      ( 4 points)
************************************************************
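The point count behind these rankings can be reproduced mechanically.
The following minimal Python sketch is not part of the VTC tooling;
it copies the category table from Eval DOS.SUM above and assumes,
consistently with the listed totals (e.g. SCN: 13 AV points, 14 AM
points), that the AV ranking leaves out the Malware row while the AM
ranking includes it:

# Illustrative sketch (not VTC tooling) of the place counting used above:
# every "perfect" entry counts 2 points, every "excellent" entry 1 point.

CATEGORIES = {
    # category        ("perfect" products,         "excellent" products)
    "zoo file":       ("",                          "PAV,AVP,SCN,AVK"),
    "zoo macro":      ("FPR,SCN",                   ""),
    "zoo script":     ("SCN",                       ""),
    "zoo boot":       ("",                          "PAV,AVP,SCN,AVK"),
    "zoo Poly":       ("AVG,AVK,AVP,DRW,PAV",       ""),
    "zoo VKit":       ("",                          "AVK,AVP,PAV,SCN"),
    "ITW":            ("AVK,AVP,PAV,SCN",           "AVG,DRW,FPR,NVC"),
    "pack":           ("SCN",                       ""),
    "FP avoidance":   ("AVA,AVG,AVK,INO,PAV,SCN",   ""),
    "Malware":        ("",                          "AVP,PAV,AVK,FPR,SCN"),
}

def points(include_malware):
    """Sum 2 points per "perfect" and 1 point per "excellent" entry."""
    scores = {}
    for category, (perfect, excellent) in CATEGORIES.items():
        if category == "Malware" and not include_malware:
            continue  # assumption: the AV ranking excludes the Malware row
        for name in filter(None, perfect.split(",")):
            scores[name] = scores.get(name, 0) + 2
        for name in filter(None, excellent.split(",")):
            scores[name] = scores.get(name, 0) + 1
    return sorted(scores.items(), key=lambda item: -item[1])

print(points(include_malware=False))  # AV: SCN 13, PAV 9, AVK 9, AVP 7, ...
print(points(include_malware=True))   # AM: SCN 14, PAV 10, AVK 10, AVP 8, ...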