=====================================
Results of VTC test for Computer Bild
=====================================

Date: June 15, 1998
===================

In this test, file and macro virus detection as well as file and macro
malware detection were tested for 11 products, as selected by Computer
Bild. The test was based on VTC's virus/malware database as frozen on
its status of April 30, 1998. Tests were performed under Windows 95.

Project management:  Prof. Dr. Klaus Brunnstein
                     Tanja Hofmann
Database management: Jörn Dierks
                     Martin Kittel
Test crew:           Thomas Buck
                     Martin Kittel
                     Jan Maier
                     Sven Meinhardt
                     Mario Ticak

Report content:
  1) Products/versions/companies/dates tested
  2) Virus and malware databases
  3) Overall performance
  4) Results of file virus test
  5) Results of macro virus test
  6) Results of file malware test
  7) Results of macro malware test
  8) Problems in tests

1) Products/versions/companies/dates tested:
============================================

The following products were selected by "Computer Bild", a German
bi-monthly magazine with a circulation of about 1.6 million copies.

  AVK     8.02      G Data AntiVirenKit               5.5.98
  DRPC              Flexform Doctor-PC                7.3.98
  DRPC-U            Flexform Doctor-PC               27.5.98
  DTVW    330       Trend Micro Desktop Virus Wall    3.11.97
  DTVW-U  371       Trend Micro Desktop Virus Wall   19.5.98
  FSEC    4.01      Percomp F-Secure                 19.5.98
  INOC              Cheyenne Inoculan                11.2.98
  NOR     4.0       Symantec Norton Antivirus         2.9.97
  NOR-U   4.0       Symantec Norton Antivirus         1.6.98
  PAN               Omega See Panda                  24.4.98
  PAN-U             Omega See Panda                   7.6.98
  PAV     3.0/119   G Data Power Anti Virus           5.5.98
  SCAN              McAfee VirusScan                 23.5.98
  SOL     7.81      Dr Solomon's Antivirus Toolkit   31.1.98
  SOL-U   7.84      Dr Solomon's Antivirus Toolkit   30.4.98
  TBAV    8.05      Promus Thunderbyte               25.5.98

Remark #1: For evaluation, the "updated" (-U) signatures were used
           where available.
Remark #2: Two more products selected by Computer Bild were
           Dr Solomon's Emergency Antivirus and CDV AntiVirus.
Both could not be tested as they could not be "convinced" to scan
network drives (as required by VTC test methods; see the VTC test
1998-02 report). Moreover, according to Dr Solomon Germany, both
engines are identical to the DSS engine as tested.

2) Virus and Malware Databases:
===============================

The virus and malware databases reflect the status of viruses and
malware known to VTC on April 30, 1998. The following testbeds were
used:

  File viruses:  17720 with 23272 infected files (COM/EXE)
  Macro viruses:  2159 with  9034 infected objects (DOC/XLS)
  File malware:    213
  Macro malware:   191

In order to reduce the time needed for the file virus tests, only one
infected COM or EXE file per virus was stored. This reduced the testbed
from its normal size of about 140,000 infected objects to about 24,000
infected files. As no problems were expected in the macro virus test,
the complete database prepared for the "large" VTC test "1998-07" was
used. File and macro malware include trojans, droppers, intended and
first-generation viruses, etc.

3) Overall Performance:
=======================

Only one product reached an "excellent" result (>99% in virus
detection, >95% in malware detection) in every test category
(file/macro virus; file/macro malware):

  Perfect (100% each):  no product
  Overall excellent:    SOL-U  (99.7%  99.6%... corrected: 99.7%  99.1%  98.8%  96.4%)
  Overall very good:    AVK    (99.5%  99.6%  87.7%  95.5%)
                        PAV    (99.0%  99.5%  87.1%  95.5%)

  Remark: FSEC  (CRASHED  100% (rounded)  98.2%  99.1%)

4) Results of File Virus Test:
==============================

The following table summarizes the results of the file virus tests. It
lists the number of detected file viruses (both in absolute figures and
as a percentage) as well as the number of detected infected files
(equally in absolute and relative measures). Moreover, the reliability
of detection and identification is given (which is less relevant here,
as a shortened version of the database was used). In this test, no
product was "perfect".
The following products reached "excellent" (>99% detection of viruses)
and "very good" (>95%) detection measures:

  Perfect (=100%):   no product
  Excellent (>99%):  SOL-U  99.7%
                     AVK    99.5%
                     PAV    99.0%
  Very good (>95%):  NOR-U  95.7%

The following products were untestable as they crashed:
DRPC (original and updated), FSEC, PAN (original and updated).

                       This includes ---- unreliably ----
           Viruses                                          Files
Scanner    detected      identified     detected            detected
---------------------------------------------------------------------
AVK        17634  99.5    369  2.1       16  0.1    23160  99.5
DRPC-U         0   0.0      0  0.0        0  0.0        0   0.0  CRASHED
DRPC           0   0.0      0  0.0        0  0.0        0   0.0  CRASHED
DTVW-U     11777  66.5   1600  9.0      155  0.9    15514  66.7
DTVW       11456  64.7   1524  8.6      155  0.9    15080  64.8
FSEC           0   0.0      0  0.0        0  0.0        0   0.0  CRASHED
INOC       15572  87.9    245  1.4       67  0.4    20422  87.8
NOR-U      16963  95.7      3  0.0      108  0.6    22266  95.7
NOR        15943  90.0      3  0.0      211  1.2    20753  89.2
PAN-U          0   0.0      0  0.0        0  0.0        0   0.0  CRASHED
PAN            0   0.0      0  0.0        0  0.0        0   0.0  CRASHED
PAV        17543  99.0    309  1.7       31  0.2    23047  99.1
SCAN       15708  88.6    465  2.6      125  0.7    20601  88.5
SOL-U      17673  99.7    184  1.0       14  0.1    23208  99.7
SOL        17538  99.0    206  1.2       22  0.1    23033  99.0
TBAV       13415  75.7      0  0.0      200  1.1    17475  75.1
---------------------------------------------------------------------

5) Results of Macro Virus Test:
===============================

The following table summarizes the results of the macro virus tests. It
lists the number of detected macro viruses (both in absolute figures
and as a percentage) as well as the number of detected infected objects
(equally in absolute and relative measures). Moreover, the reliability
of detection and identification is given (which IS relevant here, as
this is the full VTC testbed). In this test, no product was "perfect"
(=100.00%), although FSEC comes close (100% after rounding).
The following products reached "excellent" (>99% detection of viruses)
and "very good" (>95%) detection measures:

  Perfect (=100%):   no product
  Almost perfect:    FSEC   100.0% (rounded)
  Excellent (>99%):  AVK     99.6%
                     PAV     99.5%
                     SOL-U   99.1%
  Very good (>95%):  SCAN    97.7%
                     NOR-U   97.0%
                     DTVW-U  96.0%

                       This includes ---- unreliably ----
           Viruses                                          Files
Scanner    detected      identified     detected            detected
---------------------------------------------------------------------
AVK         2151  99.6      48  2.2       0  0.0     9014  99.8
DRPC-U       804  37.2      27  1.3      20  0.9     3447  38.2
DRPC         804  37.2      27  1.3      20  0.9     3447  38.2
DTVW-U      2072  96.0      86  4.0       6  0.3     8733  96.7
DTVW        1982  91.8      90  4.2       7  0.3     8466  93.7
FSEC        2158 100.0(*)   38  1.8       0  0.0     9032 100.0(*)
INOC        1810  83.8      11  0.5       5  0.2     7628  84.4
NOR-U       2095  97.0      27  1.3       3  0.1     8859  98.1
NOR         1888  87.4      31  1.4      13  0.6     8075  89.4
PAN-U       1428  66.1      29  1.3       2  0.1     6335  70.1
PAN         1417  65.6      30  1.4       2  0.1     6300  69.7
PAV         2148  99.5      48  2.2       1  0.0     9011  99.7
SCAN        2110  97.7      30  1.4       2  0.1     8893  98.4
SOL-U       2140  99.1      52  2.4       2  0.1     8989  99.5
SOL         2099  97.2      44  2.0       7  0.3     8812  97.5
TBAV        1921  89.0       0  0.0       9  0.4     8364  92.6
---------------------------------------------------------------------
Remark: (*) rounded to 100%
---------------------------------------------------------------------

6) Results of File Malware Test:
================================

The following table summarizes the results of the file malware tests.
It lists the number of detected single file malware specimens (both in
absolute figures and as a percentage) as well as the number of detected
malware files (equally in absolute and relative measures). Due to the
difficulties of malware detection, VTC's rating here differs from the
virus detection rating: products are rated "excellent" if a detection
rate of >95% is achieved; "very good" means detection of >85%.
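The percentages and rating bands used in the malware sections can be
reproduced with a short sketch (a minimal illustration only: the counts
are taken from the file malware table, while the function names and the
"unrated" label are not part of any VTC tooling):

```python
def detection_rate(detected: int, testbed_size: int) -> float:
    """Detection rate as a percentage, rounded to one decimal place."""
    return round(100.0 * detected / testbed_size, 1)

def malware_rating(rate: float) -> str:
    """Malware rating bands as stated in sections 6 and 7."""
    if rate == 100.0:
        return "perfect"
    if rate > 95.0:
        return "excellent"
    if rate > 85.0:
        return "very good"
    return "unrated"  # below the bands the report names

# Example: SOL-U detected 211 of the 213 file malware files.
rate = detection_rate(211, 213)    # -> 99.1
print(rate, malware_rating(rate))  # -> 99.1 excellent
```

Note that the first percentage column of each malware table is relative
to the number of distinct malware specimens, the last to the number of
malware files/objects.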
  Perfect (=100%):   no product
  Excellent (>95%):  SOL-U  98.8%
                     FSEC   98.2%
  Very good (>85%):  INOC   88.3%
                     AVK    87.7%
                     PAV    87.1%

           File      This includes ---- unreliably ----    File
           malware                                         malware
Scanner    detected    identified     detected             detected
---------------------------------------------------------------------
AVK         143  87.7    10   6.1       2  1.2       189  88.7
DRPC-U       18  11.0     0   0.0       3  1.8        29  13.6
DRPC         18  11.0     0   0.0       3  1.8        28  13.1
DTVW-U       96  58.9     3   1.8       2  1.2       128  60.1
DTVW         95  58.3     3   1.8       2  1.2       127  59.6
FSEC        160  98.2     2   1.2       0  0.0       210  98.6
INOC        144  88.3     5   3.1       2  1.2       191  89.7
NOR-U        47  28.8     0   0.0       7  4.3        65  30.5
NOR          45  27.6     0   0.0       6  3.7        63  29.6
PAN-U        57  35.0     2   1.2       5  3.1        82  38.5
PAN          56  34.4     2   1.2       5  3.1        83  39.0
PAV         142  87.1    21  12.9       3  1.8       184  86.4
SCAN         69  42.3     3   1.8       5  3.1       105  49.3
SOL-U       161  98.8     6   3.7       0  0.0       211  99.1
SOL         161  98.8     6   3.7       0  0.0       211  99.1
TBAV         49  30.1     0   0.0       4  2.5        69  32.4
---------------------------------------------------------------------

7) Results of Macro Malware Test:
=================================

The following table summarizes the results of the macro malware tests.
It lists the number of detected single macro malware specimens (both in
absolute figures and as a percentage) as well as the number of detected
malware objects (equally in absolute and relative measures). As in the
file malware test, products are rated "excellent" if a detection rate
of >95% is achieved; "very good" means detection of >85%.
  Perfect (=100%):   no product
  Excellent (>95%):  FSEC   99.1%
                     SOL-U  96.4%
                     AVK    95.5%
                     PAV    95.5%
  Very good (>85%):  SCAN   87.5%

           Macro     This includes ---- unreliably ----    Macro
           malware                                         malware
Scanner    detected    identified     detected             detected
---------------------------------------------------------------------
AVK         107  95.5     0   0.0       1  0.9       186  96.9
DRPC-U       25  22.3     1   0.9       0  0.0        49  25.5
DRPC         25  22.3     1   0.9       0  0.0        49  25.5
DTVW-U       91  81.3     4   3.6       0  0.0       156  81.3
DTVW         82  73.2     1   0.9       2  1.8       143  74.5
FSEC        111  99.1     1   0.9       0  0.0       191  99.5
INOC         86  76.8     1   0.9       2  1.8       156  81.3
NOR-U        83  74.1     0   0.0       1  0.9       150  78.1
NOR          64  57.1     0   0.0       2  1.8       122  63.5
PAN-U        89  79.5     1   0.9       0  0.0       161  83.9
PAN          83  74.1     1   0.9       0  0.0       152  79.2
PAV         107  95.5     1   0.9       1  0.9       186  96.9
SCAN         98  87.5     0   0.0       1  0.9       174  90.6
SOL-U       108  96.4     2   1.8       1  0.9       186  96.9
SOL          94  83.9     2   1.8       0  0.0       168  87.5
TBAV         88  78.6     0   0.0       2  1.8       160  83.3
---------------------------------------------------------------------

8) Problems in Tests:
=====================

For the following products, problems were observed which partly made
either reporting or the evaluation of reports impossible or only
partly successful:

F-Secure:  Crashed several times in the file test. In the macro scan:
           program error "Fpwsrv32"; continued with "ignore". Problems
           in formatting the report: "Internal Error Condition Test
           Failure: (rc>=0) at: Source\ureplbox.cpp:501" (then
           continues).
Panda:     Stops scanning before the end of the directory, at some
           file which could not be determined; same behaviour even
           when scanning specified directory entries (instead of
           "scan whole partition").
DRPC:      Stops scanning before the end, also when directory entries
           are specified. Macro scan: stopped after scanning 7665
           files.
CDV 7.83:  Could not scan network drives.
Dr. Turbo 2.30 / Carmel: Multiple crashes (partly upon simultaneous
           access from several test clients); incomplete report.
Inoculan:  The original version worked only on specific clients. After
           attempting a signature update, the product did not work at
           all (error message: "cannot allocate memory").