==========================================
File 7EVALLIN.TXT

Evaluation of results for File, Macro and Script Virus/Malware
detection under LINUX in VTC Test "2001-04":
==========================================

Formatted with non-proportional font (Courier)

Content of this file:
=====================
Eval A: Development of detection rates under Linux:
**********************************************************************
Eval LIN.01: Development of Linux Scanner Detection Rates
Eval LIN.02: In-The-Wild Detection under Linux
Eval LIN.03: Evaluation of overall Linux AV detection rates
Eval LIN.04: Evaluation of detection by virus classes under Linux
             LIN.04.1 Grading the Detection of file viruses
             LIN.04.2 Grading the Detection of macro viruses
             LIN.04.3 Grading the Detection of script viruses
             LIN.04.4 Grading of Poly-virus detection
             LIN.04.5 Grading of VKit virus detection
Eval LIN.05: Detection of Packed File and Macro Viruses under Linux
Eval LIN.06: Avoidance of False Alarms (File, Macro) under Linux
Eval LIN.07: Detection of File and Macro Malware under Linux
Eval LIN.08: Detection of "Exotic" Malware under Linux
Eval LIN.SUM: Grading of LINUX products
**********************************************************************

This part of the VTC "2001-04" test report evaluates the detailed
results as given in section (file):
   6JLINUX.TXT   File/Macro Viruses/Malware results LINUX

The following (5) products participated in this scanner test for
Linux products:

--------------------------------------------------------
Products submitted for aVTC test under Linux (SuSE):
--------------------------------------------------------
 AVP  v: 3.0 build 135.2         sig: Dec.05, 2000
 CMD  v: 4.60.0                  sig: Dec.11, 2000
 FSE  v: 4.09 build 2351         sig: Dec.07, 2000
 RAV  v: 8.0.005 scaneng 8.3     sig: Dec.11, 2000
 SCN  v: 4.12.0 scaneng 4.1.20   sig: Dec.06, 2000
--------------------------------------------------------

Eval LIN.01: Development of Linux Scanner Detection Rates:
==========================================================
Linux-based scanners were tested in "2001-04" for the first time.
Only 5 scanners were available for testing, so the mean values
(which are significantly better than those for other operating
platforms) are of limited statistical relevance.

Table E-Lin-1: Performance of LINUX scanners in Test 2001-04:
=============================================================
Scan I = File Virus =  + = Macro Virus = + = Script Virus =
ner  I   Detection     I   Detection     I   Detection
-----+-----------------+-----------------+-----------------
Test I  0104   Delta   I  0104   Delta   I  0104   Delta
-----+-----------------+-----------------+-----------------
AVP  I  99.9     -     I  100~     -     I  99.8     -
CMD  I  97.8     -     I  100%     -     I  96.9     -
FSE  I  97.1     -     I  100~     -     I  96.9     -
RAV  I  93.5     -     I  99.6     -     I  84.9     -
SCN  I  99.7     -     I  100%     -     I  99.8     -
-----+-----------------+-----------------+-----------------
Mean I  97.6%    -     I  99.9%    -     I  95.7%    -
-----+-----------------+-----------------+-----------------

Remark: For abbreviations of products (code names), see appendix
        A5CodNam.txt. "100~" denotes a detection rate which rounds
        to 100% but is below 100.0%.

******************************************************************
Findings LIN.01: For those few (5) Linux scanners, mean detection
      rates for file, macro and script (zoo) viruses (though
      statistically less relevant) were significantly better than
      those of products for other platforms.
      No scanner is rated "perfect" (100% detection) for file and
      script viruses, but 2 (of 5) scanners are "perfect"
      concerning macro virus detection: CMD and SCN.
******************************************************************
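
As a cross-check of the mean values in Table E-Lin-1, the following
minimal Python sketch (illustrative only, not part of the VTC test
tooling; "100~" entries are approximated as 100.0) reproduces the
per-category means:

   # Reproduce the per-category mean detection rates of Table E-Lin-1.
   rates = {
       #        file   macro  script  (detection in %)
       "AVP": ( 99.9, 100.0,  99.8),
       "CMD": ( 97.8, 100.0,  96.9),
       "FSE": ( 97.1, 100.0,  96.9),
       "RAV": ( 93.5,  99.6,  84.9),
       "SCN": ( 99.7, 100.0,  99.8),
   }

   for i, category in enumerate(("file", "macro", "script")):
       mean = sum(r[i] for r in rates.values()) / len(rates)
       print(f"{category:6s} mean: {mean:5.1f}%")   # 97.6, 99.9, 95.7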
Eval LIN.02: In-The-Wild (File, Macro, Script) Detection under LINUX
====================================================================
Concerning "In-The-Wild" viruses, the following grid is applied:
  - detection rate is 100% : scanner is "perfect"
  - detection rate is >99% : scanner is "excellent"
  - detection rate is >95% : scanner is "very good"
  - detection rate is >90% : scanner is "good"
  - detection rate is <90% : scanner is "risky"

100% detection of In-The-Wild viruses, and especially detection of
ALL instantiations of those viruses, is now an ABSOLUTE REQUIREMENT
for file, macro and script viruses, always with high reliability.
Presently, NO scanner is "perfect", but 2 (out of 5) products detect
all viruses and all infected objects (though not with highest
reliability and precision):

                                      ITW Viruses & Files
                                     (File    Macro   Script)
   ----------------------------------------------------------
   "Perfect" LINUX ITW scanner:      =NONE=
   "Excellent" LINUX ITW scanners:
                   AVP              (100.0%  100.0%  100.0%)
                   SCN              (100.0%  100.0%  100.0%)
   ----------------------------------------------------------

************************************************************
Findings LIN.2:  No AV product detects "perfectly" all ITW
      file, macro and script viruses in all files, but
      2 scanners detect viruses and files at 100%, though
      not always reliably: AVP, SCN
************************************************************

Eval LIN.03: Evaluation of overall LINUX AV detection rates (zoo, ITW)
======================================================================
The following grid is applied to classify scanners:
  - detection rate =100%     : scanner is graded "perfect"
  - detection rate above 99% : scanner is graded "excellent"
  - detection rate above 95% : scanner is graded "very good"
  - detection rate above 90% : scanner is graded "good"
  - detection rate of 80-90% : scanner is graded "good enough"
  - detection rate of 70-80% : scanner is graded "not good enough"
  - detection rate of 60-70% : scanner is graded "rather bad"
  - detection rate of 50-60% : scanner is graded "very bad"
  - detection rate below 50% : scanner is graded "useless"

To assess an "overall AV grade" (including file, macro and script
virus detection, for unpacked objects), the lowest of the related
results is used to classify each scanner. Only scanners for which
all tests were completed are considered. (For problems in the
test: see 8problms.txt.)

The following list indicates those scanners graded into one of the
upper three categories, with file, macro and script virus detection
rates in unpacked samples, and with perfect ITW virus detection
(rate = 100%). Under LINUX, NO product reached a 100% detection
rate for file, macro and script viruses, both zoo and In-The-Wild,
so no product could be rated "perfect". Those 2 scanners (out of 5)
which detected ALL ITW file/macro/script viruses are graded
"excellent" concerning overall detection:

             (zoo: file/macro/script ; ITW: file/macro/script)
   -----------------------------------------------------------
   "Perfect" LINUX scanners:      =NONE=
   -----------------------------------------------------------
   "Excellent" LINUX scanners:
             AVP  ( 99.9   100~   99.8 ; 100%  100%  100% )
             SCN  ( 99.7   100%   99.8 ; 100%  100%  100% )
   -----------------------------------------------------------

**************************************************************
Findings LIN.3:  No LINUX product is overall rated "perfect".
      2 "excellent" overall scanners: AVP, SCN
**************************************************************
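
The overall grading rule (grade the lowest of the three category
rates with the grid above) can be expressed as a minimal Python
sketch; this is an illustration of the rule as stated, not VTC's
actual evaluation code:

   # Map a detection rate (%) to a VTC grade, per the grid above.
   def grade(rate: float) -> str:
       if rate >= 100.0: return "perfect"
       if rate >  99.0:  return "excellent"
       if rate >  95.0:  return "very good"
       if rate >  90.0:  return "good"
       if rate >= 80.0:  return "good enough"
       if rate >= 70.0:  return "not good enough"
       if rate >= 60.0:  return "rather bad"
       if rate >= 50.0:  return "very bad"
       return "useless"

   def overall_grade(file_rate, macro_rate, script_rate):
       # The lowest of the related results classifies the scanner.
       return grade(min(file_rate, macro_rate, script_rate))

   print(overall_grade(99.7, 100.0, 99.8))  # SCN zoo rates -> "excellent"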
2 "excellent" overall scanners: AVP, SCN ************************************************************** Eval LIN.04: Evaluation of detection by virus classes under LINUX: ================================================================== Some scanners are specialised on detecting some class of viruses (either in deliberately limiting themselves to one class, esp. macro viruses, or in detecting one class significantly better than others). It is therefore worth notifying which scanners perform best in detecting file, macro and script viruses. Two special tests of file viruses were also performed to determine the quality of AV product maintenance. One test was concerned with almost 11,000 viruses generated from the VKIT virus generator. Some AV products count each of the potential 14,000 viruses as new variant while others count all VKIT viruses just as ONE virus. Fortunately, a high proportion of tested products detects these viruses (see 4.5), although reliability of detection is significantly less than normally. Another special test was devoted to the detection of 10,000 polymorphic generations each of the following 6 polymorphic viruses: Maltese.Amoeba, MTE.Encroacher.B, NATAS, TREMOR, One-Half and Tequila. Detection rates were "almost perfect". Products rated "perfect" (=100%), "excellent" (>99%) and "very good" (>95%) are listed. LIN.04.1 Grading the Detection of file viruses under LINUX: ----------------------------------------------------------- "Perfect" LINUX scanner: === NONE === "Excellent" LINUX scanners: AVP ( 99.9%) SCN ( 99.7%) "Very Good" LINUX file scanners: CMD ( 97.8%) FSE ( 97.1%) LIN.04.2 Grading the Detection of macro viruses under LINUX: ------------------------------------------------------------- "Perfect" LINUX macro scanners: CMD (100.0%) SCN (100.0%) "Excellent" LINUX macro scanners:AVP ( 100~ ) FSE ( 100~ ) RAV ( 99.6%) LIN.04.3 Grading the Detection of Script viruses under LINUX: ------------------------------------------------------------- "Perfect" LINUX script scanners: ===NONE=== "Excellent" LINUX script scanners:AVP ( 99.8%) SCN ( 99.8%) "Very Good" LINUX script scanners:CMD ( 96.9%) FSE ( 96.9%) LIN.04.4 Grading of Poly-virus detection under LINUX: ----------------------------------------------------- "Perfect" Poly-detectors which detect all instantiations of all (6) polymorphic file viruses always reliable: AVP "Excellent" Poly-detectors which detect all instantiations of all (6) polymorphic file viruses but not always reliably: CMD, FSE, RAV, SCN LIN.04.5 Grading of VKit virus detection: ----------------------------------------- "Perfect" VKit detectors which detect all generations of VKit viruses always reliably: ===NONE=== "Excellent" Kit detectors which detect all generations of VKit viruses viruses but not always reliably: AVP, SCN **************************************************************** Finding LIN.4: Performance of LINUX scanners by virus classes: ----------------------------------------------- Perfect scanners for file zoo: =NONE= Excellent scanners for file zoo: AVP,SCN Perfect scanners for macro zoo: CMD, SCN Perfect scanners for script zoo: =NONE= Excellent scanners for script zoo: AVP, SCN Perfect scanners for polymorphic set: AVP Perfect scanners for VKit set: =NONE= Excellent scanners for VKit set: FSE, SCN **************************************************************** Eval LIN.05: Detection of Packed File and Macro Viruses under LINUX: ==================================================================== Detection of file and 
Eval LIN.05: Detection of Packed File and Macro Viruses under LINUX:
====================================================================
Detection of file and macro viruses within packed objects becomes
essential for on-access scanning, esp. for incoming email possibly
loaded with malicious objects. It therefore seems reasonable to
test whether at least ITW viral objects compressed with popular
packing methods are also detected. In addition to the 4 packers
used in previous tests (PKZIP, ARJ, LHA, RAR), the following
packers were added to the detection tests: WinRAR and CAB. Tests
are performed only on In-The-Wild viruses packed once (no recursive
packing). As the last test showed that AV products are rather far
from perfect detection of packed viruses, the testbed has remained
essentially unchanged to ease comparison and measurement of
improvement.

A "perfect" product would detect ALL packed viral samples (100%),
file AND macro, for all (6) packers:
   ---------------------------------------------
   "Perfect" packed virus detectors: CMD and SCN
   ---------------------------------------------

An "excellent" product would reach 100% detection of packed viral
samples (file & macro) for at least 5 packers:
   -----------------------------------------------
   "Excellent" packed virus detectors:     =NONE=
   -----------------------------------------------

A "very good" product would detect all viral samples (ITW file &
macro) for at least 4 packers:
   -----------------------------------------------
   "Very Good" packed virus detectors:     =NONE=
   -----------------------------------------------

Concerning only detection of packed FILE virus samples, the
following products can be rated "perfect" as they detect ALL
samples:
   ------------------------------------------------------
   "Perfect" packed file virus detectors: AVP, CMD, SCN
   ------------------------------------------------------

Concerning only detection of packed MACRO virus samples, the only
products rated "perfect" are those (2) which detect ALL (file AND
macro) viral samples:
   ------------------------------------------------
   "Perfect" packed macro virus detectors: CMD, SCN
   ------------------------------------------------

Remark: Much more data were collected on precision and reliability
of virus detection in packed objects. But in the present state, it
seems NOT justified to add such differentiation to the results
discussed here.

*************************************************************************
Findings LIN.5:  Detection of packed viral objects needs improvement.
      Perfect packed file/macro virus LINUX detectors: CMD, SCN
      Perfect packed file virus detectors:             AVP, CMD, SCN
      Perfect packed macro virus detectors:            CMD, SCN
*************************************************************************
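
How these packed-virus grades follow from per-packer results can be
sketched as follows in Python (illustrative only; the data layout
is hypothetical, but the thresholds - all 6 packers for "perfect",
at least 5 for "excellent", at least 4 for "very good" - are the
ones stated above):

   # Grade packed-virus detection from per-packer results. A packer
   # counts as fully handled only if BOTH packed file virus samples
   # AND packed macro virus samples are detected at 100%.
   PACKERS = ("PKZIP", "ARJ", "LHA", "RAR", "WinRAR", "CAB")

   def packed_grade(results: dict) -> str:
       # results: packer -> (file_rate_percent, macro_rate_percent)
       full = sum(1 for p in PACKERS
                  if results[p] == (100.0, 100.0))
       if full == len(PACKERS): return "perfect"
       if full >= 5: return "excellent"
       if full >= 4: return "very good"
       return "below 'very good'"

   # Hypothetical example: a product missing only CAB-packed macro samples.
   sample = {p: (100.0, 100.0) for p in PACKERS}
   sample["CAB"] = (100.0, 97.0)
   print(packed_grade(sample))   # -> "excellent"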
Eval LIN.06: Avoidance of False Alarms (File, Macro) under LINUX:
=================================================================
First introduced in VTC test "1998-10", a set of clean (and
non-malicious) objects has been added to the file and macro virus
testbeds to determine the ability of scanners to avoid
False-Positive (FP) alarms. This ability is essential for
"excellent" and "very good" scanners, as there is no automatic aid
for customers to handle such cases (besides the psychological
impact on customers' work). Therefore, the grid used for grading
AV products must be significantly more rigid than the one used for
detection.

The following grid is applied to classify scanners:
  - False Positive rate = 0.0% : scanner is graded "perfect"
  - False Positive rate < 0.5% : scanner is graded "excellent"
  - False Positive rate < 2.5% : scanner is graded "very good"
  - False Positive rate < 5.0% : scanner is graded "good enough"
  - False Positive rate <10.0% : scanner is graded "rather bad"
  - False Positive rate <20.0% : scanner is graded "very bad"
  - False Positive rate >20.0% : scanner is graded "useless"

Regarding the ability of scanners to avoid FP alarms, 3 (out of 5)
products in the test reported NO SINGLE False-Positive alarm in the
file and macro zoo testbeds and are therefore rated "perfect":
   ---------------------------------------------------
   "Perfect" FP-avoiding LINUX scanners: AVP, RAV, SCN
   ---------------------------------------------------

All (5) LINUX scanners gave NO FP alarm on clean files:
   ------------------------------------------------------------------
   "Perfect" file FP-avoiding LINUX scanners: AVP, CMD, FSE, RAV, SCN
   ------------------------------------------------------------------

*****************************************************************
Findings LIN.6:  Avoidance of False-Positive alarms is
      insufficient and needs improvement.
      FP-avoiding "perfect" LINUX scanners: AVP, RAV, SCN
*****************************************************************
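
As an illustration of this stricter grid, a small Python sketch
(the counts in the example are hypothetical, not actual VTC data)
computing an FP rate from clean-testbed results and grading it:

   # Compute a False-Positive rate from clean-testbed counts and
   # grade it with the (stricter) FP grid above.
   def fp_grade(false_alarms: int, clean_objects: int) -> str:
       rate = 100.0 * false_alarms / clean_objects
       if rate == 0.0: return "perfect"
       if rate <  0.5: return "excellent"
       if rate <  2.5: return "very good"
       if rate <  5.0: return "good enough"
       if rate < 10.0: return "rather bad"
       if rate < 20.0: return "very bad"
       return "useless"

   # Hypothetical example: 3 false alarms on a 2,000-object clean set.
   print(fp_grade(3, 2000))   # 0.15% -> "excellent"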
Eval LIN.07: Detection of File and Macro Malware under LINUX
============================================================
Since test "1997-07", VTC also tests the ability of AV products to
detect non-viral malware. An essential argument for this category
is that customers are interested in also being warned about and
protected from non-viral and non-wormy malicious objects such as
trojans etc., the payload of which may be disastrous to their work
(e.g. stealing passwords). Since VTC test "1999-03", malware
detection is a mandatory part of VTC tests, both for submitted
products and for those downloaded as free evaluation copies. A
growing number of scanners is indeed able to detect non-viral
malware.

The following grid (admittedly with reduced granularity) is applied
to classify detection of file and macro malware:
  - detection rate =100%     : scanner is "perfect"
  - detection rate > 90%     : scanner is "excellent"
  - detection rate of 80-90% : scanner is "very good"
  - detection rate of 60-80% : scanner is "good enough"
  - detection rate of < 60%  : scanner is "not good enough"

Presently, there is NO product which "perfectly" detects all
(non-viral) file and macro malware in the VTC testbed, but 2
products are rated "excellent".

Concerning file AND macro malware detection:
   ----------------------------------------------------------------
   "Perfect" file/macro malware detectors under LINUX: === NONE ===
   "Excellent" file/macro malware detectors:     File  / Macro
                                           AVP  (96.4%   99.3%)
                                           CMD  (94.0%   99.8%)
   ----------------------------------------------------------------

Concerning file malware detection only, no product is rated
"perfect" and 2 products reach grade "excellent"; concerning macro
malware detection only, no product is rated "perfect" and all (5)
reach grade "excellent":
   -------------------------------------------------------
   "Perfect" file malware detectors:    =NONE=
   "Excellent" file malware detectors:  AVP (96.4%)
                                        CMD (94.0%)
   -------------------------------------------------------
   "Perfect" macro malware detectors:   =NONE=
   "Excellent" macro malware detectors: CMD (99.8%)
                                        FSE (99.8%)
                                        SCN (99.8%)
                                        AVP (99.3%)
                                        RAV (97.0%)
   -------------------------------------------------------

*******************************************************************
Findings LIN.7:  NO LINUX product is "perfect", but 2 products
      are rated "excellent" (>90% detection rate): AVP, CMD
*******************************************************************

Eval LIN.08: Detection of "Exotic" Malware under LINUX
======================================================
With the growing exchange of objects which may be activated under
other platforms (esp. including Java, Linux etc.), scanners must
also detect related malware to warn customers before they activate
such malcode. For the first time in VTC tests, a selected (small)
testbed of viruses active on other platforms - we presently call
these "exotic" viruses - has been used to determine the detection
quality of contemporary on-demand scanners (engines, signatures).

Even with "reduced" quality requirements, only 1 (out of 5)
products detects exotic malware at an (initially low) level of at
least 70% (viruses, files):

                                    virus / file
                            AVP    (90.4%  92.0%)

*******************************************************************
Findings LIN.8:  Exotic viruses are detected by scanners under
      LINUX to an insufficient degree. Only 1 product detects
      at least 70%: AVP
*******************************************************************
Definition (2): A "Perfect AntiMalware (AM) product" ---------------------------------------------------- 1) Will be a "Perfect AntiVirus product", That is: 100% ITW detection AND >99% zoo detection AND high precision of identification AND high precision of detection AND 100% detection of ITW viruses in compressed objects, AND 0% False-Positive rate, 2) AND it will also detect essential forms of malicious software, at least in unpacked forms, reliably at high rates (>90%). Remark: detection of "exotic malware" is presently NOT rated. ******************************************************************* In VTC test "2001-04", we found *** NO perfect LINUX AV product *** and we found *** No perfect LINUX AM product *** ******************************************************************* But several products seem to approach our definition on a rather high level (taking into account the highest value of "perfect" defined on 100% level and "Excellent" defined by 99% for virus detection, and 90% for malware detection): Test category: "Perfect" "Excellent" --------------------------------------------------------------- LINUX zoo file test: --- AVP,SCN LINUX zoo macro test: CMD,SCN --- LINUX zoo script test: --- AVP,SCN LINUX zoo Poly test: AVP --- LINUX zoo VKit test: --- FSE,SCN LINUX ITW tests: --- AVP,SCN LINUX pack-tests: CMD,SCN --- LINUX FP avoidance: AVP,RAV,SCN --- LINUX Malware Test: --- --- ---------------------------------------------------------------- In order to support the race for more customer protection, we evaluate the order of performance in this WNT test with a simple algorithm, by counting the majority of places (weighing "perfect" twice and "excellent" once), for the first places: ************************************************************ "Perfect" LINUX AntiVirus product: =NONE= "Excellent" LINUX AV products: 1st place: SCN (10 points) 2nd place: AVP ( 7 points) 3rd place: CMD ( 4 points) 4th place: RAV ( 2 points) 5th place: FSE ( 1 points) ************************************************************ "Perfect" LINUX AntiMalware product: =NONE= "Excellent" LINUX AntiMalware product: =NONE= ************************************************************