Executive Summary:
VTC University of Hamburg
AntiMalware Product Test "1998-02"

Malicious software (malware), including viruses (self-replicating malware), trojan horses (pure payload without self-replication), virus droppers, and network malware (e.g. worms and hostile applets), is regarded as a serious threat to PC users. With more than 4,000 new viruses and several hundred new trojan horses appearing annually, many of which are available on the Internet, and in the absence of inherent protection against such dysfunctional software, users must rely on AntiMalware and especially AntiVirus software to detect and eradicate malicious software. Hence, the quality of AntiMalware products becomes an essential means of protecting customer productivity and data.

The Virus Test Center (VTC) at Hamburg University's Faculty for Informatics performs regular tests of AntiMalware and (especially) AntiVirus software. VTC recently tested on-demand scanners for their ability to identify PC viruses. Tests were performed on VTC's malware databases, which were frozen at their November 30, 1997 status to give AV producers a fair chance to supply updates within 7 weeks. The test goal was to determine detection rates, reliability (consistency) of malware identification, and reliability of detection of submitted or publicly available scanners. Moreover, the development of detection quality was measured where more than one update was available during the test period (including updates up to January 23, 1998).

VTC maintains collections of boot, file and macro viruses as well as related malware ("zoo"). Moreover, following the monthly list of "In-The-Wild Viruses" (published by Joe Wells of IBM), a separate collection of viruses reported to be broadly visible is maintained to allow for comparison with other tests; presently, this list does not report ITW malware.


 ***************************************************************
 "Full Zoo":14,596 File Viruses   in 106,470 infected files,
               323 File Malware   
             1,071 System Viruses in   4,464 infected images,
             1,548 Macro Viruses  in   4,436 infected documents,
               459 Macro Malware
            -----------------------------------------------------
 "ITW Zoo":     98 File Viruses   in   3,128 infected files,
                85 System Viruses in     702 infected images, and
                66 Macro Viruses  in     515 infected documents.
 ****************************************************************   
 Content of VTC testbeds is contained in the file "a3testbed.zip"
 ****************************************************************
 
For test "1998-02", the following AntiVirus products (manufacturer in parentheses) were tested under DOS/Windows, Windows 95 and Windows NT:

	Alert (Look), AVAST! (Alwil), AVG (Grisoft), AVP (KAMI Ltd), 
	AntiVir (H+B EDV), Anyware (Anyware), DSAV (Dr. Solomon),
	DrWeb (Dialogue Science), F-SECURE (Data Fellows),
	F-Prot (Frisk Software), F/Win (Kurtzhals), 
   	HMVS (Valky,Vrtik), IBM AV (IBM), Inoculan (Cheyenne), 
	Integrity Master (Stiller Research), Norman Virus Control (Norman Data) 
	Norton AV (Symantec), Panda (Panda), Perforin ()
	Power Antivirus (G-Data), RAV (Romanian AV), Scan (McAfee), 
	Sweep (Sophos),	TBAV (ThunderByte), TSCAN (Marx),
	VDS (Advanced Research Group), VETMAC (CYBEC), VirusSafe (Eliashim), 
	VirusSweep (Quarterdeck).
AV products were either submitted or, when test versions were available on the Internet, downloaded from the respective ftp/http sites. A few more scanners were withdrawn from VTC tests in general or from this test, some of which were announced to participate in the next test. Finally, a very few AV producers answered VTC's invitations to submit scanners with electronic silence.

Concerning malware detection, some AV producers asked that their product NOT be tested in this category. While VTC regards malware as an essential threat to users and therefore suggests including malware detection in ANY AV product, it followed such requests; therefore, some AV products are missing in the malware detection tables.

The following text surveys essential findings in comparison with previous VTC tests (performance over time), and gives a relative "grading" of scanners for detection of file and macro viruses as well as related malware, both in the full "zoo" and "In-The-Wild" testbeds in unpacked form, and (for the first time in a VTC test) detection of file and macro viruses in objects packed with ARJ, LHA and ZIP. Detailed results, including precision and reliability of virus identification as well as results for boot/MBR infectors, are given in the overview table "6a-sumov.txt" and the related tables for DOS (boot+file+macro), Windows 95 and Windows NT detection.

Summary #1: Evaluation of DOS Scanner Improvement during last tests:

Concerning the performance of DOS scanners, a comparison of virus detection results in tests "1997-02" and "1997-07" with the new test "1998-02" shows how scanners behave and how manufacturers work to adapt their products to the growing threat of new viruses. The following table lists the development of the detection rate of each scanner (most recent version in each test) and calculates the change (+ indicating improvement) in detection rates.

For reasons of fairness, it must be noted that improvement is much more difficult to achieve for products which have already reached a very high level of detection quality (say, more than 90 or 95%) than for products with lower detection rates. Moreover, changes on the order of +-2% are not significant, as this is about the growth rate of the collection per month; detection therefore depends strongly on whether a given virus was reported (and analysed and included) just before a new update was delivered.
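For illustration, the CHANGE column of the table below and the +-2% significance rule can be sketched as follows (a minimal sketch, not part of the original test procedure; function names are our own):

```python
# Sketch: computing a scanner's detection-rate change between two tests.
# A change within about +-2% is treated as insignificant, since that is
# roughly the monthly growth rate of the virus collection.

def detection_change(rate_previous, rate_current):
    """Return (change, significant) for two detection rates in percent."""
    change = round(rate_current - rate_previous, 1)
    significant = abs(change) > 2.0
    return change, significant

# Example with AVP's file-virus rates (97/7: 98.4%, 98/2: 99.3%):
change, significant = detection_change(98.4, 99.3)
print(f"{change:+.1f}%", "significant" if significant else "not significant")
# -> +0.9% not significant
```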

The following table lists developments for detection of file and macro viruses respectively; for details as well as for boot virus detection, see result tables (6b-6f).


SCANNER      File Virus Detection        Macro Virus Detection                       
	  97/2   97/7   98/2 CHANGE   97/2   97/7   98/2 CHANGE       
----------------------------------------------------------------
ALERT     98.8%  94.1%  89.4%  -4.7%  96.5%  66.0%  49.8% -16.8%
AVAST     98.9%  97.4%  97.4%   0.0%  99.3%  98.2%  80.4% -17.8%
AVG       79.2%  85.3%  84.9%  -0.4%  25.2%  71.0%  27.1% -43.9%
AVP       98.5%  98.4%  99.3%  +0.9%  99.3%  99.0%  99.9%  +0.9%
ANTVIR    73.4%  80.6%  84.6%  +4.0%  58.0%  68.6%  80.4% +11.8%
DRWEB     93.2%  93.8%  92.8%  -1.0%  90.2%  98.1%  94.3%  -3.8%
DrSol     99.7%  99.6%  99.9%  +0.3%  97.9%  98.9% 100.0%  +1.1%
FMACRO       -      -      -      -   98.6%  98.2%  99.9%  +1.7%
FPROT     90.7%  89.0%  96.0%  +7.0%  43.4%  36.1%  99.9% +63.8%
FSEC         -      -   99.7%     -      -      -  100.0%     - 
FWIN         -      -      -      -   97.2%  96.4%  91.0%  -5.4%
IBM       93.6%  95.2%  96.5%  +1.3%  65.0%  88.8%  99.6% +10.8%
INOC         -      -   92.0%     -      -      -   90.3%     - 
IRIS         -   81.4%  74.2%  -7.2%     -   69.5%  48.2% -22.3%
ITM          -   81.0%  81.2%  +0.2%  81.8%  58.2%  68.6% +10.4%
IVB        8.3%     -      -      -      -      -      -      - 
HMVS         -      -      -      -      -      -   98.2%     - 
NAV       66.9%  67.1%  97.1% +30.0%  80.7%  86.4%  98.7% +12.3%
NVC       87.4%  89.7%  94.1%  +4.4%  13.3%  96.6%  99.2%  +2.6%
PANDA        -      -   67.8%     -      -      -   73.0%     - 
PAV          -   96.6%  98.8%  +2.2%     -   93.7% 100.0%  +6.3%
PCC          -      -      -      -      -   67.6%     -      - 
PCVP      67.9%     -      -      -      -      -      -      - 
SCN       83.9%  93.5%  90.7%  -2.8%  95.1%  97.6%  99.0%  +2.4%
SWP       95.9%  94.5%  96.8%  +2.3%  87.4%  89.1%  98.4%  +9.3%
TBAV      95.5%  93.7%  92.1%  -1.6%  72.0%  96.1%  99.5%  +3.4%
TSCAN        -      -   50.4%     -      -      -   81.9%     - 
TNT       58.0%     -      -      -      -      -      -      - 
VDS          -   44.0%  37.1%  -6.9%  16.1%   9.9%   8.7%  -1.2%
VET          -   64.9%     -      -      -   94.0%  97.3%  +3.3%
Virex        -      -      -      -      -      -      -      - 
VBster    43.1%  56.6%     -      -      -      -      -      - 
VHnter    19.3%     -      -      -      -      -      -      - 
VSAFE        -      -   56.9%     -      -      -   80.6%     - 
VSWP         -      -   56.9%     -      -      -   83.0%     - 
VTrack    45.5%     -      -      -    6.3%     -      -      - 
XSCAN     59.5%     -      -      -      -      -      -      - 
----------------------------------------------------------------
Result #1:
The general impression is that most DOS scanners improved their performance or remained at a rather high level, with few exceptions. This implies that most AV producers handle the growing number of viruses rather well. It is especially worth noting that the quality of macro virus detection has improved since the last test.

Concerning rating of DOS scanners, the following grid is applied to classify scanners:

  • detection rate above 95% : the scanner is graded "excellent"
  • detection rate above 90% : the scanner is graded "very good"
  • detection rate of 80-90% : the scanner is graded "good enough"
  • detection rate of 70-80% : the scanner is graded "not good enough"
  • detection rate of 60-70% : the scanner is graded "rather bad"
  • detection rate of 50-60% : the scanner is graded "very bad"
  • detection rate below 50% : the scanner is graded "useless"
To assess an "overall grade" (including file and macro virus detection), the lowest of the related results is used to classify the respective scanner. If several scanners of the same producer have been tested, grading is applied to the most recent version (which is, in most cases, the version with the highest detection rates). Only scanners for which all tests were completed are considered; here, the most recent version with all tests completed was selected.
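The grading grid and the overall-grade rule above can be sketched as follows (a minimal illustration, not part of the test procedure; function names are our own):

```python
# Sketch: the DOS grading grid, with the overall grade determined by the
# lowest (worst) of the file- and macro-virus detection results.

def grade(rate):
    """Map a detection rate (in percent) to the report's verbal grade."""
    if rate > 95.0:
        return "excellent"
    if rate > 90.0:
        return "very good"
    if rate >= 80.0:
        return "good enough"
    if rate >= 70.0:
        return "not good enough"
    if rate >= 60.0:
        return "rather bad"
    if rate >= 50.0:
        return "very bad"
    return "useless"

def overall_grade(file_rate, macro_rate):
    """The lowest of the related results classifies the scanner."""
    return grade(min(file_rate, macro_rate))

# Example: a scanner with 99.9% file and 100.0% macro detection
print(overall_grade(99.9, 100.0))  # -> excellent
```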

The following list indicates those scanners graded into one of the upper two categories, with file and macro virus detection rates in unpacked form, and with perfect ITW virus detection (rate=100%):

"Excellent" DOS scanners:
            (file/macro zoo; file/macro ITW)

DSS 780     (99.9%  100.0%; 100.0%  100.0%) 
FSEC 3017   (99.4%   99.9%; 100.0%  100.0%)
AVPD 117    (99.3%   99.9%; 100.0%  100.0%) 
PAV 30      (98.9%   99.9%; 100.0%  100.0%)                         
FPR 3045    (96.0%   99.9%; 100.0%  100.0%) 
IBM 30BA    (96.5%   99.6%; 100.0%  100.0%) 
NAV 40      (97.1%   98.7%; 100.0%  100.0%)
SWP 306     (96.8%   98.4%; 100.0%  100.0%) 
The following scanner missed the level of "excellence" only by a small margin:
FSEC 3017   (99.4%   99.9%;  99.7%  100.0%)

Result #1A:
The overall virus detection quality of DOS scanners has reached a very acceptable level, also for viruses which are not "In-The-Wild".

Concerning "In-The-Wild" viruses, a much more rigid grid must be applied to classify scanners, as the likelihood is significant that a user may find such a virus on her/his machine. The following grid is applied:

  • detection rate is 100% : scanner is "excellent"
  • detection rate is >95% : scanner is "very good"
  • detection rate is >90% : scanner is "good"
  • detection rate of 90% or below : scanner is "risky"
The following DOS products reach 100% both for file and macro virus detection and are rated "excellent" in this category (in alphabetical order):

AVPD 117    (100% 100%) 
DrWeb 326   (100% 100%)
DSS 780     (100% 100%)
FProt 3045  (100% 100%)
FSEC 3017   (100% 100%)
IBMAV 30BA  (100% 100%)
NAV 40      (100% 100%)
PAV 30      (100% 100%)
Scan 31401  (100% 100%)
Sweep 306   (100% 100%)
A few scanners marginally miss the highest category for ITW detection; these include:
AV 77001    ( 99.0% 100%) 
TBAV 805    ( 99.0% 100%) 
NVC 435     ( 96.9% 100%) 
As a macro-only product, VETMacro also reaches 100% ITW detection.
Result #1B:
In-The-Wild detection of the best DOS scanners has been significantly improved since the last tests.

Summary #2: Evaluation for detection by virus classes under DOS:

Some scanners are specialised in detecting one class of viruses (either by deliberately limiting themselves to one class, especially macro viruses, or because that part is significantly better than the others). It is therefore worth noting which scanners perform best at detecting file, boot and macro viruses. Compared to the last test, the number of "excellent" macro virus detectors has grown significantly (as has the class of "good" ones, which is not listed here); in contrast, "standard" file viruses and (even more so) boot viruses seem to be less attractive targets for product upgrading.

Those products with grade "excellent" are listed below.

2.1 Detection of file viruses:
"Excellent" DOS scanners:

DSS 780       (99.9%)
FSEC 3017     (99.4%)
AVPD 113      (99.3%)
PAV 30        (98.8%)
AV 77001      (97.4%)
NAV 40        (97.1%)
SWP 306       (96.8%)
IBM 30BA      (96.5%)
FPR 3045      (96.0%)
2.2 Detection of macro viruses:
"Excellent" DOS scanners:
DSS 780      (100.0%)
AVPD 117      (99.9%)
FPR 3045      (99.9%)
FSEC 3017     (99.9%)
PAV 30        (99.9%)
IBM 30BA      (99.6%)
TBAV 805      (99.5%)
NVC 435       (99.2%)
Scan 31401    (99.0%)
NAV 40        (98.7%)
SWP 306       (98.4%)
HMVS          (98.0%)
VETMacro      (97.3%)

Result #2: Specialised scanners (especially on macro viruses) are no longer superior to the best overall scanners, even on large collections such as the VTC "zoo".
Summary #3: Detection of viruses in packed objects under DOS:

For the first time, VTC tests include testing the detection of viruses in packed objects. Three different packing methods were selected: ARJ, LHA and PKZIP. For every virus, one infected object was packed with each of these three methods, and detection was tested.
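As an illustration of how such a packed testbed can be built (a sketch under our own assumptions, not VTC's actual tooling; ARJ and LHA require external packers, so only the ZIP case is shown, using Python's standard library):

```python
# Sketch: packing each infected sample into its own archive, one archive
# per packing method. Only ZIP is shown; ARJ and LHA would be created
# analogously with external packer tools.

import zipfile
from pathlib import Path

def pack_sample_zip(sample: Path, out_dir: Path) -> Path:
    """Pack a single infected sample file into its own ZIP archive."""
    archive = out_dir / (sample.name + ".zip")
    with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.write(sample, arcname=sample.name)
    return archive

# Usage: pack every sample of a (hypothetical) testbed directory
# for sample in Path("testbed").iterdir():
#     pack_sample_zip(sample, Path("packed_zip"))
```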

No product can presently be rated "excellent" in this category. Those scanners belonging to the class "very good" are listed below (ordered by detection rates for packed file and macro viruses):

"Very good" at packed scanning:
             (file v. macro v.)
DSS 780       (99.8%   95.6%)
FSEC 3017     (99.3%   95.1%)

Almost "very good" at packed scanning:
AVPD 117      (92.9%  100.0%)
FPR 3045      (93.3%   95.0%)
NVC 435       (92.9%   92.8%)
DrWeb 326     (92.5%   94.3%)

Result #3: A few products have reached a very good level of detecting viruses in packed infected objects with the given compression methods. Here, more improvement is needed.


Summary #4: Evaluation of DOS File and Macro Malware detection:

Several scanners are also able to detect non-viral malware. The related test includes 49 macro-related and 163 file-related malware specimens. The following grid is applied to classify detection of macro malware:

  • detection rate > 90% : the scanner is graded "excellent"
  • detection rate of 80-90% : the scanner is graded "very good"
  • detection rate of 60-80% : the scanner is graded "good enough"
  • detection rate of < 60% : the scanner is graded "not good enough"
Only one product can be rated "excellent" in this category, but as this is an emerging area, a few more scanners belonging to the class "very good" also have a good chance to assist users in detecting DOS malware:
                                         (file-m. macro-m.)
"Excellent" DOS scanners:     DSS 780      (98.1%  98.6%)

"Very Good" DOS scanners:     PAV 30       (87.0%  95.8%)
                              FPR 3045     (86.4%  97.2%)
                              AVPD 117     (86.4%  95.8%)
                              Inoculan 404 (90.1%  83.3%)
Result #4: Concerning DOS malware, some scanners have good potential to develop into overall AntiMalware products. In this area, the awareness of AV producers must be developed.

Summary #5: Evaluation for overall virus detection rates under Windows 95 and Windows NT:

The number of scanners running under Windows 95 and Windows NT is still small, though growing. Significantly fewer products were available for these tests, compared with the traditional DOS scene.

The same grid as for the DOS classification is applied to classify scanners according to their ability to detect file and macro viruses under Windows 95 and Windows NT.

The following list indicates those scanners under Windows 95 graded into one of the upper categories ("excellent" and "very good") for detecting file and macro viruses:

"Excellent" Windows 95 scanners:

DSS 780      (99.9% 100.0%)
PAV 605      (99.4% 100.0%)
AVPD 117     (97.2%  99.9%)
NAV 40       (97.1%  98.7%)
IBM 30BA     (96.5%  99.8%)
SWP 306      (96.8%  98.4%)
Result #5A: With still few scanners working under Windows 95, only a few have reached a level of excellence. With the growing deployment of Windows 95 systems, there is significant need for product improvement.

The following list indicates those scanners under Windows NT graded into one of the upper categories ("excellent" and "very good") for detecting file and macro viruses:

"Excellent" Windows NT scanners:

DSS 780      (99.7% 100.0%)
PAV 605      (98.7%  98.8%)
AVPD 117     (97.4%  99.9%)
NAV 40       (97.1%  98.7%)
FPR 3045     (96.1%  99.9%)
IRIS 14062   (96.1%  99.1%)
IBM 30BA     (95.2%  99.8%) 
SWP 306      (96.8%  98.4%)
Result #5B: With still few scanners working under Windows NT, only a few have reached a level of excellence. With the growing deployment of Windows NT systems, especially in commercial applications, there is a particularly strong need for product improvement.

Most Windows 95 and Windows NT scanners use the same engine, often generically named a "32-bit engine". Some AV experts tend to assume that such engines yield equal detection results, as they "only" differ in operating-system-specific details which should not affect detection behaviour.

It is therefore interesting to observe that Windows 95 and Windows NT results differ for several products, though usually not by much. This indicates that AV producers may not have fully ensured that Win-32 engines behave identically on all related 32-bit platforms.

Many more details are available in the test report tables, including precision of detection and identification under all platforms and for all categories; this may be valuable information for AV producers to improve the quality of their products. Moreover, detection of packed infected objects as well as malware detection is also reported for Windows 95 and Windows NT; the related results reflect the same general findings as for DOS, though usually at a lower level of detection performance. Generally, Windows 95 and Windows NT products deserve significantly more care.

Final remark: More detailed information about the test, its methods and viral databases, as well as detailed test results, is available for anonymous FTP download from VTC's homepage (VTC is part of the Working Group AGN).

Any comments and critical remarks which help VTC improve its test methods will be warmly welcomed. The next comparative test is planned for May-June 1998, with viral databases to be frozen on April 30, 1998. Any AV producer wishing to participate in that test is invited to submit related products.

On behalf of the VTC Test Crew:
Dr. Klaus Brunnstein (March 16,1998)