=========================================
	Virus Test Center, University of Hamburg:
	      AntiVirus Repair Test (ART 1.0)
		 (c) 2000 Klaus Brunnstein
              Test Report (November 19, 2000)
	=========================================


Content:
========	
	1) Executive Summary
	2) Background of this test
	3) Test methods
	4) Discussion of test results
	5) Ranking AV products
	6) Copyright, License, and Disclaimer

	A) Appendix A: ART (3.0) specification
	B) Appendix B: Details of products tested
	C) Appendix C: Results in detail (tables)
	D) Appendix D: Testbed of Word/Excel viruses 

  Detailed information about the test including logs of detection and repair 
  processes for all AV products is available for anonymous FTP download from: 
  ftp://agn-www.exvtc.de/pub/texts/tests/pc-av/2000-11/

	
	**********************************************************************
	                      1) Executive Summary:
	**********************************************************************

	With growing importance of the exchange of documents generated with 
	Microsoft Office tools (esp. including Word and Excel), malicious code 
	becomes an equally growing threat to enterprises, organisations and 
	individuals. 

	Different from executables for which uninfected "original" copies may 
	be recovered from an adequately organised archive, documents are 
	usually under steady development. If malicious infections happen
	during such phases of development, recoverable versions hardly exist.

	Consequently, repair of infected Word and Excel documents must be
	guaranteed at least for those viruses which are found "in-the-wild".

	In their Diplom thesis at the Faculty for Informatics, Hamburg
	University, Martin Retsch and Stefan Tode investigated (assisted
	by the author of this report) in some detail how different AntiVirus
	products behave in repairing infected documents. As no information
	about the inherent detection and repair algorithms is available,
	a specific test method was developed which makes it possible to
	distinguish essential classes of repaired objects (for details,
	see parts 2-6 of this report, as well as appendices A-D).

	Based on a set of 40 Word (W97M) viruses and 8 Excel (X97M) viruses,
	a testbed of infected objects was generated, where 3 different goat
	types were used to distinguish different repair methods.

	19 products for Windows NT which were submitted to VTC test 2000-09 
	were tested. As reported in VTC test 2000-09, several products were 
	either not submitted or have been excluded from test due to serious 
	reasons. For details, see related VTC test report, as well as 
	"reactions" on VTC web site.

	A set of "user requirements" was developed, and a catalog of 8 criteria
	was derived, the observance of which shall guarantee that each document
	within the tested classes infected with any of the given viruses shall
	be perfectly repaired. That is: usage of a repaired document shall not
	be distinguishable from usage of an uninfected document.

	Summarizing the multitude of detailed results (discussed in the report),
	the following 2 general conclusions can be drawn:

	     !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
	     Conclusion 1: Generally, products are much more successful
	    	           in repairing Excel documents than Word documents.
             !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
	     Conclusion 2: Several products fall short of perfectly repairing
	                   Word and Excel documents only on a few of the
	                   given criteria.
	     !!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!	

	Based on a specific ranking system, the ability to repair was measured
	with sufficient granularity to distinguish between classes of failure
	to properly repair infected samples. A mandatory prerequisite was
	perfect (100%) detection and reliable (100%) identification of all
	viruses in the testbed. Taking only those (8 of 19) AV products 
	into account which repair ALL infected samples on at least a 
	"very good" basis, the following major results were achieved:

		
		
		!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
		Conclusion 3: One product - Symantec's NAV - repaired 
			      ALL documents for ALL viruses "PERFECTLY".
		!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

		!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
		Conclusion 4: One more product - NAI's SCN - repaired ALL 
			      documents for ALL viruses almost perfectly, 
			      being graded as "EXCELLENT".
		!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!

		!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
		Conclusion 5: Six more products - GData's AVK and PAV, 
                              Kaspersky Lab's AVP, Command's CMD, 
                              Frisk Software's FPW, and F-Secure's
			      FSE (all for Windows NT) - repaired 
			      ALL documents for ALL viruses with 
			      "VERY GOOD" results.
		!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
	     	 

	The following table lists the results:
		!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
		      Classifi-  Product    AV       Overall
		Rank   cation     Name    Company     Points
		----------------------------------------------
		  1    perfect     NAV    Symantec       9
		----------------------------------------------
		  2   excellent    SCN     NAI          8.5
		----------------------------------------------
		  3   very good    AVK      GData        8
		  3   very good    AVP   Kaspersky Lab   8
		  3   very good    CMD     Command       8
		  3   very good    FPW  Frisk Software   8
		  3   very good    FSE    F-Secure       8
		  3   very good    PAV      GData        8
		!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!


	Final remark: we hope that our detailed analysis helps AV producers
	to identify weaknesses of present versions of their products and to
	improve the ability of future versions to repair Word and Excel
	documents.

	Finally, I wish to thank both Martin Retsch and Stefan Tode for their
	valuable work. Moreover, my thanks go to the VTC test team which
	laid the basis for this investigation, and to Marian Kassovic for
	his support in managing VTC tests and polishing this test report.


		November 15, 2000
					   Dr. Klaus Brunnstein
			        Professor for Applications of Informatics
					 Faculty for Informatics
	                              University of Hamburg, Germany

			      contact: brunnstein@exvtc.de
	
	**********************************************************************
	****************** End of 1) Executive Summary ***********************
	**********************************************************************


2) Background of this test:
===========================
Numbers and threats of self-replicating malicious code - namely 
"viruses" (self-reproducing within a locally determined perimeter) 
and "worms" (self-replicating beyond any reasonable perimeter) - 
still grow at significant rates. Users are increasingly interested
in having any such malware reliably detected and, whenever possible,
in having maliciously affected objects repaired reliably, or at
least removed in a way that work with such objects is no longer
affected.

Unfortunately, it is not always possible to reliably "clean"
objects. This is esp. obvious for cleaning WORD and EXCEL objects
(aka documents and spreadsheets), as macro viruses influence
such objects both as VBA/VBA5 code and as byte code. From a user's
point of view, a "repaired" object must behave as if it had not been 
infected at all. Therefore, the following requirements shall hold:


		      Table 1: User requirements:
	--------------------------------------------------------------
	        Requirements for a "perfectly" repaired 
	   document or spreadsheet from a user's viewpoint:
	--------------------------------------------------------------
	1. The virus is removed completely from the document file.
        2. The document file is still readable after disinfection AND
           the document file can still be saved after disinfection AND
           the VB editor can still be invoked (all this without
           warning messages occurring) AND
           in case of a Word document it is not a template any more.
        3. The document file contains user macros and macros are still 
           working (provided the macro virus permits this due to its 
           conception and characteristics under normal circumstances).
	---------------------------------------------------------------

In order to study problems and successes esp. in the
removal of macro viruses, Martin Retsch and Stefan Tode (then 
students at the Faculty for Informatics, Hamburg University) 
developed a method ("ART") which helps to analyse different 
mechanisms of macro virus removal and to evaluate the results
which several products produced on a given set of viruses.

The following 19 AV products for Windows NT (which were submitted 
for VTC's test "VTC 2000-09"; see VTC website) participated in 
this test:


		           Table 2: Products in ARTest:
	-----------------------------------------------------------
	Code Product                Manufacturer
	Name  Name	                Name
	-----------------------------------------------------------
	ANT  AntiVir	       	    H+B EDV Datentechnik, Germany
	AVA  AVAST!	            ALWIL Software, Czech Republic
	AVG  AVG                    GriSoft, Czech Republic
	AVK  AntiVirenKit 8         GData Software, Germany
	AVP  AntiViral Toolkit Pro  Kaspersky Lab, Russia
	AVX  AntiVirus eXpert 2000  Softwin, Bucharest, Romania
	CMD  Command Antivirus	    Command Software Systems, USA
	DRW  DrWeb		    DialogueScience, Russia
	FPW  F-PROT for Windows     Frisk Software, Iceland
	FSE  F-Secure		    F-Secure Corporation, Finland
	INO  InoculateIT            Computer Associates, USA   
	NAV  Norton Antivirus	    Symantec, Cupertino (CA), USA
	NVC  Norman Virus Control   Norman Data, Germany
	PAV  Power Antivirus        GData Software, Germany
	PER  Per Antivirus          PER Systems, Peru
	PRO  Protector Plus         Proland Software, Bangalore, India
	QHL  QuickHeal              Cat Computer Services, India
	RAV  RAV Antivirus          GeCAD, Bucharest, Romania
	SCN  McAfee VirusScan       Network Associates, 
                                               Santa Clara (CA), USA      
        ------------------------------------------------------------

	As reported in VTC test 2000-09, several products were either not
	submitted or have been excluded from test due to serious reasons.
	For details, see related VTC test report, as well as "reactions"
	on VTC website.


Tests were performed on a special testbed which included only
viruses reported "In-The-Wild". We wish to thank the Wildlist
Organisation for their support, as we used their April 2000
Wildlist (most samples of which are still in-the-wild). 

For this test, a set of 48 macro (VBA5) viruses,  esp. including
Word (W97M) and EXCEL (X97M) viruses, was selected:


		Table 3: Testbed of ITW-Viruses:
 		------------------------------------
	 	Summary:     40 Word  (W97M) viruses
			 in 204 Word documents
		              8 Excel (X97M) viruses
                         in  42 Excel documents
                ------------------------------------
                Total:       48 (VBA5) macro viruses
                         in 246 infected documents
                ------------------------------------   

The list of macro viruses is given in Appendix D.

	

3) Test methods:
================
The test progressed in the following 4 phases:

	Phase I:   generating goat objects which are adequate for
                   analysing desired features of repaired objects,
        Phase II:  generating a testbed of infected goats,
        Phase III: testing whether an AntiVirus product was able
	           to detect the resp. virus in any object, and
	Phase IV:  running an AntiVirus in repair mode.

In Phase I: 3 kinds of (VBA5) goat objects were produced:
	    I.1) one with no user macro,
            I.2) one with one user macro in its own module, and
            I.3) one with the user macro in the "ThisDocument" module;
                 this goat object will not be infected with 
                 "snatching" macro viruses.

In Phase II: Each virus was consecutively replicated *5 times* into all 
             goat objects. This is based on the (rather strict) definition 
             that some self-reproducing code is called a "virus" only if it 
             replicates over at least three generations. Generations 1 and 2 
             of successfully replicated viruses are used in the subsequent 
             test phases.

In Phase III: Each AntiVirus product was executed *twice* over the
             testbed:
             III.1) First, the ability of each AV product to detect 
		    the respective virus reliably was tested; hence
                    the detection rate was determined in this phase.
	     III.2) In the second run, each AV product was executed
                    in its repair mode (for details, see Appendix B).

                    Logs of detection and repair phases are available
                    from the ftp site (see /DETECT and /REPAIR).

In Phase IV: In this phase, results of the repair process were 
             analysed in detail. Here, the "user requirements" 
             (see table 1) play a significant role.

    In order to evaluate the "precision of repair", the following 
    objectives must be fulfilled:

        Table 4: Criteria for successfully repaired documents:
    ---------------------------------------------------------------
                 Mandatory Criteria:                
    Criterion 1: The document is disinfected (AV diagnosis).
    Criterion 2: Disinfected document contains only macros which 
                 correspond to those in the original goat file.
    ---------------------------------------------------------------
                 Desirable Criteria:
    Criterion 3: Disinfected document can be opened and saved.
    Criterion 4: The VB editor can still be invoked.
    Criterion 5: User macro inside the disinfected document 
                 is still working.
    Criterion 6: No warning messages occur during opening, 
                 closing or saving the document, starting the VB editor 
                 or running the user macro.
    Criterion 7: In case of a Word document, it is not a template 
                 after disinfection any more.
    Criterion 8: The macro virus protection dialog does not appear 
                 when opening the document.
    ----------------------------------------------------------------

    Concerning criterion 2: some AV products generally delete all 
    macros (or attempt to do so), whether virus-related or not. 
    This criterion covers cases where, after disinfection, some 
    macro code (e.g. byte code) is still found, or where macro code 
    is found which was not part of the original goat object before 
    repair. 

    For each fulfilled criterion, a product is awarded *1 point*,
    with the exception that fulfilment of criterion 5 is awarded 
    *2 points*. The fulfilment of criteria 1 and 2 is MANDATORY; 
    products which don't fulfil these criteria will not be evaluated 
    further.
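
    To make this scoring rule concrete, the following short Perl sketch
    (a hypothetical illustration; the actual VTC evaluation scripts are
    not reproduced here) computes the points for one repaired document
    from the set of criteria fulfilled for it:

        #!/usr/bin/perl
        # Hypothetical sketch of the per-document scoring rule (table 4):
        # 1 point per fulfilled criterion, 2 points for criterion 5;
        # criteria 1 and 2 are mandatory - without them, no points at all.
        use strict;
        use warnings;

        sub score_document {
            my %fulfilled = @_;   # criterion number => 0/1 (applicable ones)
            return 0 unless $fulfilled{1} && $fulfilled{2};   # mandatory
            my $points = 0;
            for my $c (keys %fulfilled) {
                next unless $fulfilled{$c};
                $points += ($c == 5) ? 2 : 1;   # criterion 5 counts double
            }
            return $points;
        }

        # Example: all criteria fulfilled except 7 (still a template)
        print score_document(1=>1, 2=>1, 3=>1, 4=>1, 5=>1, 6=>1, 7=>0, 8=>1),
              "\n";   # prints 8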

    Several methods and tools are used for assessing the fulfilment
    of these criteria:

    a) for assessing the non-existence of the original virus,
       the following 3 AV products were used: F-Prot, AVPlite, FWin.
    b) the source code of the repaired documents was extracted
       with related tools (VBA5SRC, VBASCAN from NAI; HMVS); then,
       source codes of the original goat object, the infected goat
       object and the repaired goat object were compared to determine
       whether the virus was properly removed. 

For details of the evaluation process, see Appendix A.
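
The comparison step b) can be pictured with the following Perl sketch
(file names are hypothetical; the real scripts drive VBA5SRC, VBASCAN
and HMVS and compare their output):

    # Hypothetical sketch of the source comparison: an external
    # extractor is assumed to have dumped the macro source of the
    # original goat and of the repaired document into text files.
    use strict;
    use warnings;

    sub read_source {
        my ($file) = @_;
        open my $fh, '<', $file or die "cannot open $file: $!";
        local $/;                  # slurp: read the whole file at once
        my $src = <$fh>;
        close $fh;
        return $src;
    }

    my $goat     = read_source('goat_original.src');   # hypothetical names
    my $repaired = read_source('goat_repaired.src');
    print $goat eq $repaired
        ? "repaired document contains exactly the original macros\n"
        : "macro code differs from the original goat object\n";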

Finally, results of phase IV were used to grade each AV product
according to the following "ranking system":

		Table 5: Ranking System for Repair Ability:
        -------------------------------------------------------
        ranking  detection rate  points  repairing rate  points
        -------------------------------------------------------
        1             100%        6           100%        3
        2            >=99%        5          >=95%        2.5
        3            >=95%        4          >=90%        2
        4            >=90%        3          >=85%        1.5
        5            >=85%        2          >=80%        1
        6            >=80%        1          >=75%        0.5
        7             <80%        0           <75%        0
        -------------------------------------------------------

The ranking system deliberately gives double points to detection
(which is a prerequisite for repair).


4) Discussion of test results:
==============================

  4.1) Detection of ITW viruses in testbed:
  -----------------------------------------
       (For details, see Appendix C, table ART.1a)

       Out of 19 AV products, the following 16 detected
       ALL 48 viruses in ALL 246 infected macro objects,
       with full precision and reliable identification:

             ANT, AVA, AVG, AVK, AVP, AVX, CMD, DRW, 
             FPW, FSE, INO, NAV, NVC, PAV, RAV, SCN.
	
       Only 3 products failed to detect ALL viruses in ALL infected 
       documents:

       PRO detected 100.0% of ITW viruses in 97.6% of documents
       QHL detected  89.6% of ITW viruses in 90.2% of documents
       PER detected  81.3% of ITW viruses in 81.3% of documents
       
  4.2) Maximum points (maxpoints) per product:
  --------------------------------------------
       The maximum number of points (=maxpoint) which a product may 
       reach in the repair process is determined by the infected 
       objects which it reliably detected:

	      ================================================
              maxpoint = the sum of points achievable (cf.
                         table 4) over all documents in which
                         an infection was reliably detected
	      ================================================

       Points are given for all those criteria relevant to the object
       (document). For any product, only those documents are counted
       in which an infection was properly detected.
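
       As a worked example (cf. table ART.2a in Appendix C): for Word
       documents, SCN reached 1304 points of its maxpoint of 1326, i.e.
       a repair rate of 1304/1326 = 98.3%.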


   4.3)	Repair of Word/Excel/all documents:
	-----------------------------------
	(For details of data, see Appendix C/tables ART.2a-2c)

	It is evident that AV products are significantly more
	successful in repairing EXCEL than WORD documents. In
	comparison with the optimum result, 
		
		only NAV repairs all infected documents 
		both of WORD and EXCEL type "perfectly".

			   
		    AV-       ===== Repair rate =====
		Product(s)      WORD    EXCEL   Overall
		----------------------------------------
		    NAV        100.0%  100.0%   100.0%
		    SCN         98.3%  100.0%    98.6%
		AVK,AVP,PAV     92.2%   99.3%    93.4%
		    CMD         91.4%  100.0%    92.9%
		  FSE,FPW       90.3%  100.0%    92.0%
		----------------------------------------


   4.4) Repair Rates for Word/Excel Viruses for different goat types:
        -------------------------------------------------------------
	(For details of data, see Appendix C/tables ART.3a-3f)

	For an in-depth analysis of repair abilities, it is interesting
	to analyse the repair performance against the 3 different goat types
	used in the testbed:

		goat type 1: a document with no user macros
		goat type 2: a document with user macros inside a module
		goat type 3: a document with user macros inside the
			       "ThisDocument" module.

	Taking only those AV products with best Repair Rates (Perfect=100%
	and all rates >90%) into account, only 2 products - NAV and SCN - 
        reach high scores (>90%) for ALL goat types, both for Word and Excel 
        viruses:

				   
	       AV-       =Goat Type 1= I =Goat Type 2= I =Goat Type 3= 
             Product      WORD  EXCEL  I  WORD  EXCEL  I  WORD  EXCEL    
	     --------------------------+---------------+--------------
	       NAV      100.0% 100.0%  I 100.0% 100.0% I 100.0% 100.0%
	     --------------------------+---------------+-------------- 
	       SCN       98.8% 100.0%  I  98.9% 100.0% I  96.8% 100.0%
             --------------------------+---------------+--------------

	Several more products are able to reach Repair Rate=100% for at least one
	goat type. It is esp. interesting to observe that the ability to repair
	Excel viruses is significantly more developed (evidently, Excel virus
	repair is much easier) than for Word viruses, as the following tables
	indicate:	 
	
				Products with RR=100% for Word viruses:
                                --------------------------------------- 
		Goat type 1:    AVA, FPW, FSE, INO, NAV (PRO)
		Goat Type 2:    NAV
		Goat type 3:    NAV
				----------------------------------------
                                Remark: PRO did not detect all ITW viruses
                                        but correctly repaired those found.

	In contrast, a larger number of products reach RR=100% for Excel viruses:

				Products with RR=100% for Excel viruses:
                                --------------------------------------------
		Goat type 1:    ANT, AVA, AVG, CMD, DRW, FPW, FSE, INO, NAV, 
                                     NVC, RAV, SCN (PER, QHL)
		Goat Type 2:    AVK, AVP, CMD, FPW, FSE, INO, NAV, PAV, SCN
		Goat type 3:    AVK, AVP, CMD, FPW, FSE, INO, NAV, PAV, SCN
                                --------------------------------------------
				Remark: PER and QHL did not detect all ITW 
                                        viruses but correctly repaired those found.
	

   4.5) Repair Rates for infected Word/Excel documents:
        -----------------------------------------------
        (For details of data, see Appendix C/tables ART.4a-4c)

	Concerning completely correct repair of Word and Excel DOCUMENTS (of ALL 
        goat types), only 2 products reach a high score (RR>=90%):	

			    AV-        === Documents ===
        	   	  Product      WORD   EXCEL   ALL
	 	 	  ---------------------------------
		    	   NAV        100.0% 100.0% 100.0%
		   	   SCN         90.2% 100.0%  91.9% 
        	    	 ----------------------------------
	
	Repair rates for Excel documents only are significantly better:
		RR=100.0% reach: CMD, FPW, FSE, INO, NAV, SCN
		RR= 95.2% reach: AVK, AVP, PAV

	But concerning overall DOCUMENT REPAIR, the distance from the 2 best
	products (RR>=90%) is rather large: next best product is INO (RR=75.6%).


   4.6) Repair Rates for Word/Excel viruses:
        -------------------------------------
        (For details of data, see Appendix C/tables ART.5a-5c)

	Concerning completely correct and reliable repair of Word and Excel 
        VIRUSES, the result is similar to that one for Word/Excel documents: 
        only 2 products reach a high score (RR>=90%): 


			    AV-        ===== Viruses ====
        	   	  Product      WORD   EXCEL   ALL
	 	 	  ---------------------------------
		    	   NAV        100.0% 100.0% 100.0%
		   	   SCN         90.0% 100.0%  91.7% 
        	    	 ----------------------------------
	
	Again, repair rates for Excel viruses only are significantly better:
		RR=100.0% reach: CMD, FPW, FSE, INO, NAV, SCN

	Concerning overall VIRUS REPAIR, the distance from the 2 best products
	(RR>=90%) is rather large: next best product is INO (RR=64.6%).
		


   4.7) Loss of points for Criteria 3-8 for Word/Excel documents:
        ---------------------------------------------------------
        (For details of data, see Appendix C/tables ART.6a-6b)

	It is interesting to analyse where repair problems originate. Here, test 
        results were analysed with respect to the different criteria (see phase IV, 
        table 4). While fulfilment of Criteria 1-2 was absolutely required, 
        fulfilment of other criteria is evidently not fully guaranteed:

	   For Word documents:
		Criterion 5 (user macro handling) is NOT fulfilled by 15 (of 19) products
		Criterion 7 (template bit)   is also NOT fulfilled by 15 (of 19) products
		Criterion 3 (save)                is NOT fulfilled by  7 (of 19) products
		Criterion 4 (VB editor works)     is NOT fulfilled by  6 (of 19) products
		Criterion 6 (no warning)          is NOT fulfilled by  6 (of 19) products
		Criterion 8 (macro protection)    is NOT fulfilled by  2 (of 19) products

	   For Excel documents, the situation is much better:
		Criterion 5 (user macro handling) is NOT fulfilled by 10 (of 19) products
		Criterion 8 (macro protection)    is NOT fulfilled by  4 (of 19) products
		Criterion 3 (save)                is NOT fulfilled by  1 (of 19) product
		Criterion 6 (no warning)          is NOT fulfilled by  1 (of 19) product
		Criterion 4 (VB editor works)     is     FULFILLED by    ALL     products

	Only NAV fulfils ALL criteria.

	The 2nd best product, SCN, has problems only with Word virus repair, 
        concerning criteria 5 and 7.



   4.8) Detection Rates versus Repair Rates:
        ------------------------------------
        (For details of data, see Appendix C/table ART.7a)

	First, 16 (out of 19) products detect ALL ITW viruses (DR=100.0%) - this is
	the mandatory requirement for VTC test ranking. Therefore, all reach the
	maximum possible 6 points.

	Concerning repair, only NAV reaches the maximum number of 6+3 points, 
        followed by SCN (6+2.5 points) and 6 more products (6+2 points), as shown 
        in the following table:

		NAV: Points for maximum detection rate =  6
		     Points for maximum   repair  rate = 3.0
		     ---------------------------------------
			Rating:  9  points of 9: "Perfect"
		     ---------------------------------------
		SCN: Points for maximum detection rate =  6
		     Points for maximum   repair  rate = 2.5 
		     ---------------------------------------
			Rating: 8.5 points of 9: "Excellent"
		     ---------------------------------------
            6 products: AVK, AVP, CMD, FPW, FSE and PAV:
		     Points for maximum detection rate =  6
		     Points for maximum   repair  rate = 2.0 
		     ---------------------------------------
			Rating: 8.0 points of 9: "Very Good"
		     ---------------------------------------
	

5) Ranking AV products:
=======================

Based on the results as discussed (Appendix C/table 8a), and with
reference to the Ranking System (table 5), the following grades 
are assigned to the quality of AV products in this test:


         Classifi-  Product    AV       Overall 
   Rank   cation     Name    Company     Points  
   ----------------------------------------------
     1    perfect     NAV    Symantec       9       
   ---------------------------------------------- 
     2   excellent    SCN     NAI          8.5       
   ----------------------------------------------
     3   very good    AVK      GData        8        
     3   very good    AVP   Kaspersky Lab   8        
     3   very good    CMD     Command       8        
     3   very good    FPW  Frisk Software   8           
     3   very good    FSE    F-Secure       8           
     3   very good    PAV      GData        8    
   ----------------------------------------------
     9     good       AVA      ALWIL        7
     9     good       AVG     GriSoft       7    
     9     good       DRW Dialogue Science  7        
     9     good       INO       CAI         7 
     9     good       NVC   Norman Data     7
     9     good       RAV     GeCAD         7
   ----------------------------------------------
    15     fair       ANT    H+B EDV        6       
    15     fair       AVX    Softwin        6   
   ----------------------------------------------
    17   average      PRO Proland Software  5 
   ----------------------------------------------   
    18 below average  QHL Cat Comp.Service 3.5
   ----------------------------------------------
    19   very poor    PER  PER Systems      2
   ----------------------------------------------


In order to help AV companies improve their detection and repair
rates, they receive "repaired objects" and original goat objects for
their analysis (as far as a secure path can be established for the 
transfer of such code).



6) Copyright, License, and Disclaimer:
======================================

This publication is (C) Copyright 2000 by Klaus Brunnstein and the 
Virus Test Center (VTC) at University of Hamburg, Germany.

Permission (Copy-Left) is granted to everybody to distribute copies of
this information in electronic form, provided that this is done for
free, that contents of the information are not changed in any way, and
that origin of this information is explicitly mentioned. It is esp.
permitted to store and distribute this set of text files at university
or other public mirror sites where security/safety related information
is stored for unrestricted public access for free. 

Any other use, esp. including distribution of these text files on
CD-ROMs or any publication as a whole or in parts, is ONLY permitted
after contact with the supervisor, Prof. Dr. Klaus Brunnstein, or
authorized members of the Virus Test Center at Hamburg University, and
this agreement must be in explicit writing, prior to any publication. 

No responsibility is assumed by the author(s) for any injury and/or
damage to persons or property as a matter of products liability,
negligence or otherwise, or from any use or operation of any methods,
products, instructions or ideas contained in the material herein.

             Prof. Dr. Klaus Brunnstein, Hamburg University, Germany
                        brunnstein@exvtc.de
                                 (November 19, 2000)
 
------------------------ Appendix A -------------------------------

A) ART (3.0) specification:
===========================
The following specification was developed in several rounds of
discussions with experts in the fields of macro virus detection 
and repair. We wish to thank this community for their critical 
though constructive support in preparing ART methodologies.

===========================================
Description of ART  = Antivirus Repair Test
  Version 3.0 (Status: September 07, 2000)
===========================================
      (c) Martin Retsch, Stefan Tode
        (assisted by Klaus Brunnstein)

For the first time within a VTC antivirus test, we will perform a repair 
test of document files. For this test, we concentrate our work on Word97 
and Excel97 document viruses.

The test includes the following steps:

1. Selection of the most widespread (ITW) W97 and X97 macro viruses
   from the "wild-list".
2. Replication of each sample over 5 generations in our replicator.
   We are using 3 different kinds of our goat files,
   a) one with no user macro,
   b) one with the user macro in its own module and
   c) one with the user macro in the "ThisDocument" module.
   Goat c) will not be infected with "snatching" macro viruses.
3. For the test database, we are using the first 2 generations of 
   successfully replicated viruses (viruses are automagically replicated
   5 times and generations 1 and 2 are used in the testbed).
4. Each AntiVirus-Scanner runs twice over the database, once for measuring 
   the detection rate and the second time for repairing the files.
5. Results are examined with regard to detection of the viruses and
   correct repair of the documents.

To automate the test, we are using Perl scripts to run the tools, to parse
their output, to run the function tests in real Office applications, and
to generate the report.
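
The driver loop can be sketched as follows (a minimal, hypothetical
sketch: the repair switches are those listed in Appendix B, while the
command lines and log names are placeholders, not the real ones):

    # Hypothetical sketch of the automation loop: each scanner is run
    # twice over the testbed, first in detection mode, then with its
    # repair switch (cf. Appendix B).
    use strict;
    use warnings;

    my %repair_switch = (    # repair switches as listed in Appendix B
        AVG => '/CLEAN',
        CMD => '/DISINF',
        NVC => '/cl',
    );

    for my $scn (sort keys %repair_switch) {
        run_scanner($scn, 'D:\testbed', '',                   "$scn.det");
        run_scanner($scn, 'D:\testbed', $repair_switch{$scn}, "$scn.rep");
    }

    sub run_scanner {
        my ($scn, $path, $switch, $log) = @_;
        # placeholder invocation; real products use their own options
        system("$scn $path $switch > $log") == 0
            or warn "$scn run failed: $?\n";
    }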


Definition of the evaluation criteria and scales:

    Concerning the ability of a scanner to repair document viruses 
    correctly, we define a "perfect" AntiVirus product as follows:

    The "perfect" repair component of an AntiVirus product is able to 
    disinfect any macro virus which it detects, in such a way, that:

        1. the virus is removed completely from the document file.
        2. the document file is still readable after disinfection and
           the document file can still be saved after disinfection and
           the VB editor can still be invoked (all this without
           warning messages occurring) and
           in case of a Word document it is not a template any more.
        3. the document file contains user macros and macros are still working 
           (provided that the macro virus permits this due to its conception
           and characteristics under normal circumstances).

The whole evaluation of a product must be seen in the context of its 
detection rate for macro viruses. A product which, e.g., detects only 20% 
of the macro viruses but which is able to repair 100% of those canNOT be 
rated as a perfect product. Therefore the weighting should be clearly 
related to the detection rate. Files in which a scanner did not detect 
the virus are removed from the repair test for this scanner.

For simplifying the classification of a product, we have developed a 
rating system for the evaluation, where we apply the following eight
criteria:

    Criterion 1: The document is disinfected.
    Criterion 2: Disinfected document contains only macros which 
                 correspond to those in the original goat file.               
    Criterion 3: Disinfected document can be opened and saved.
    Criterion 4: The VB editor can still be invoked.
    Criterion 5: User macro inside the disinfected document is still 
                 working.
    Criterion 6: No warning messages occur during opening, closing or
                 saving the document, starting the VB editor or running 
                 the user macro.
    Criterion 7: In case of a Word document, it is not a template after
                 disinfection any more.
    Criterion 8: The macro virus protection dialog does not appear when
                 opening the document.

For each criterion, a product in test is awarded one point; for
criterion 5, two points. The fulfilment of the first and second criteria 
is MANDATORY to reach any points for the other criteria.

Those (8) criteria cannot be checked for all macro viruses or goat files.

For Criterion 1, we will use 3 different AntiVirus programs 
(F-Prot, AVPlite, FWin) to test the documents.

For Criterion 2, we use two different tools:
   A) VBA5SRC and VBASCAN from NAI, and
   B) HMVS,
each to extract the source code part of the documents. We compare the 
source code of the original goat files, the infected goat samples and 
the disinfected goat samples, to see if the virus was removed.

Criterion 5 will only be checked if the infected goat file contains our 
user macro. In Word, the user macro is started with the command-line
option /m. In Excel, the user macro is started from inside Excel with
the VBA command ThisWorkbook.$usermacro.

Criterion 7 will only be evaluated if the infected goat file is a Word 
template (our original Word goat files are all documents). For the test of
this criterion, we use "oledecode" to extract the WordDocument stream and
evaluate the template bit.
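
A minimal sketch of this check in Perl, assuming (hypothetically, from
the layout of the FIB header) that the template flag (fDot) is bit 0 of
the 16-bit field at offset 0x0A of the extracted WordDocument stream:

    # Hypothetical sketch: after "oledecode" has extracted the
    # WordDocument stream to a file, test the template (fDot) flag.
    use strict;
    use warnings;

    open my $fh, '<:raw', 'WordDocument' or die "cannot open stream: $!";
    seek $fh, 0x0A, 0 or die "seek failed: $!";
    read($fh, my $flags, 2) == 2 or die "read failed: $!";
    close $fh;
    print +(unpack('v', $flags) & 0x0001)    # little-endian 16-bit word
        ? "template bit set: criterion 7 NOT fulfilled\n"
        : "template bit clear: criterion 7 fulfilled\n";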

Criterion 8 will only be evaluated if the original goat file doesn't contain
macros. In that case, the built-in macro virus protection of Word/Excel is
switched on before opening that document. Then, we test whether a macro 
warning appears.

Consequently, between 5 and 8 points can be achieved for each tested 
document file in case of Word Documents, and 5 to 7 points in case of 
Excel Documents. Summing up all points and comparing the result with 
the maximum number of points yields the evaluation rate.

If one product reaches the highest number of points, it is rated 
"perfect" concerning its ability to repair documents; otherwise, lesser
grades are assigned.

The evaluation process, consisting of detection rate and repairing 
rate, is as follows:

    As the detection rate (which is a prerequisite for reliable repair)
    is rated higher than the repair rate, we are awarding twice as many 
    points for the detection rate as for the repair rate. The
    distribution of points is listed in the following table:

        -------------------------------------------------------
        ranking  detection rate  points  repairing rate  points
        -------------------------------------------------------
        1             100%        6           100%        3
        2            >=99%        5          >=95%        2.5
        3            >=95%        4          >=90%        2
        4            >=90%        3          >=85%        1.5
        5            >=85%        2          >=80%        1
        6            >=80%        1          >=75%        0.5
        7             <80%        0           <75%        0
        -------------------------------------------------------


As the detection rate is the dominant factor, it is impossible for a 
product to reach a rank that is higher than the rank of the detection 
rate.

Examples (2):
    1) A product which has a detection rate of 95% and a repair rate of 
       100% therefore gets 4+2 = 6 points, as the rank of the detection 
       rate is only 3.

    2) A product which has a detection rate of 100% and a repair rate 
       of 80% therefore gets 6+1 = 7 points.
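
Both examples can be reproduced with a small Perl sketch of the ranking
system (a hypothetical re-implementation of the table above, including
the rule that the repair rank cannot be better than the detection rank):

    # Hypothetical sketch of the ranking system: rates (in percent) are
    # mapped to (rank, points); detection dominates the repair rank.
    use strict;
    use warnings;

    my @det = ([100, 6], [99, 5], [95, 4], [90, 3], [85, 2], [80, 1], [0, 0]);
    my @rep = ([100, 3], [95, 2.5], [90, 2], [85, 1.5], [80, 1], [75, 0.5], [0, 0]);

    sub rank_of {                 # returns (rank, points) for a rate
        my ($rate, $table) = @_;
        for my $i (0 .. $#$table) {
            return ($i + 1, $table->[$i][1]) if $rate >= $table->[$i][0];
        }
    }

    sub total_points {
        my ($dr, $rr) = @_;
        my ($d_rank, $d_pts) = rank_of($dr, \@det);
        my ($r_rank, $r_pts) = rank_of($rr, \@rep);
        # the repair rank may not be better (smaller) than the detection rank
        $r_pts = $rep[$d_rank - 1][1] if $r_rank < $d_rank;
        return $d_pts + $r_pts;
    }

    print total_points(95, 100), " ", total_points(100, 80), "\n";  # 6 7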


We assign a verbal grade (from "perfect" to "very poor") to the overall
number of points, as defined in the following table:

                    ranking             points
                    --------------------------
                    1 = perfect             =9
                    2 = excellent           >8
                    3 = very good           >7
                    4 = good                >6
                    5 = fair                >5
                    6 = only average        >4
                    7 = below average       >3
                    8 = poor                >2
                    9 = very poor           >=0
                    ---------------------------
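
A direct transcription of this scale as a Perl sketch (hypothetical
helper; cf. the rankings in section 5):

    # Hypothetical sketch: map the overall number of points (0..9)
    # to the verbal grade of the table above.
    use strict;
    use warnings;

    sub grade {
        my ($points) = @_;
        return 'perfect'       if $points == 9;
        return 'excellent'     if $points > 8;
        return 'very good'     if $points > 7;
        return 'good'          if $points > 6;
        return 'fair'          if $points > 5;
        return 'only average'  if $points > 4;
        return 'below average' if $points > 3;
        return 'poor'          if $points > 2;
        return 'very poor';
    }

    print grade(9), " / ", grade(8.5), " / ", grade(3.5), "\n";
    # prints: perfect / excellent / below average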

------------------------ Appendix B -------------------------------

B) Details of products in test:
===============================
For details (e.g. manufacturer data), see file A2SCNLS.txt
in VTC test "VTC 2000-09".

ANT: 	Product: AntiVir   
	Version: 06.02.00.04
        Engine: 06.02.00.03
        Signature: Version 06.02.00.00, 30.5.2000
        Repair Switch: Reparieren ohne Rückfrage (repair without prompting)
                       Infizierte zerstörte Dateien ignorieren 
                           (ignore infected, destroyed files)
                       Formatvorlage immer konvertieren 
                           (always convert templates)
                       alle verdächtigen Makros löschen 
                           (delete all suspicious macros)
                       Formattabelle komprimieren (compress format table)
	Company: H+B EDV Datentechnik GmbH, Tettnang, Germany

AVA     Product: AVAST!
	Avast32
        Signature 05/29/2000
        Start: GUI
        Repair Options: scan for executables and macroviruses
        		scan files standard and polymorphic
        		ignore virus selectiveness
        		test complete files
        		scan allfiles, report allfiles, report errors
        		virus alert continue
        		all archives
        		virus alert: remove it - remove all macros
	Company: ALWIL Software, Praha (Prague), Czech Republic

AVG:    Product: AVG
	Version: 6.0.159
        Engine:
        Signature: v 73, 31.5.2000
        Repair Switch: /CLEAN
	Company: GriSoft, Brno, Czech Republic

AVK:    Product: AntiVirenKit 8
	Version: 
        Engine: 9n
        Signature: 2.6.2000
        Start: GUI
        Repair Switch: Remove virus immediately
	Company: GData Software GmbH, Bochum, Germany

AVP:    Product: AntiViral Toolkit Pro
	Version: Antiviral Toolkit Pro Version 3.5.1.0
        Scanner Version 3.0.132.4
        Signature:  AVP Antivirus Bases Last Update 15.05.2000
        Start: Gui       
        avp objects: no memory, no sectors
        	     allfiles
        options: warnings, code analyzer, show clean objects, 
        	 show pack info in log, no redundant
        	 disinfect automatically
	Company: Kaspersky Lab, Moscow, Russia
        
AVX     Product: AntiVirus eXpert 2000 Desktop
	Antivirus Expert desktop 2000 Version 5.5 23.05.2000
        custom all options
        Commandline Disinfect: avxc D:\ /all /files /arc /mail 
                               /hed /log=avx.ful /disinfect /auto
	Company: Softwin, Bucharest, Romania

CMD     Product: Command Antivirus
	Version: 4.59
        Signature: sign.def 30.5.2000, macro.def 31.5.2000
        Start: Batch
        Repairswitch: /DISINF
	Company: Command Software Systems, Jupiter (FL), USA

DRW     Product: DrWeb
	Version: 4.17 03/24/2000
        Signature: DRW41708 05/26/2000
        Repair Switch: 
        DrWebWCL D:\ /AL /GO /HA /OK /SD /UP /AR /NM /CU /RPdrw.FUL
	Company: DialogueScience Inc., Moscow, Russia
        

FPW     Product: F-PROT for Windows
	Version: 5.07d
        F-Prot for Windows FP-Win Version: 3.07b
        Signature:  macro.def 06/02/2000, sign.def 05/18/2000, sign2.def 06/02/2000
        Start: Gui
        Options:
        Advanced, Report all scanned objects, Subfolders, Dumb scan,
        use heuristics, compressed files, inside archives (Zip, Arj),
        Action Report Only      
        Repair Switch: Attempt disinfection, 
		       If disinfection fails: Report only
                       Remove all macros: Never
	Company: Frisk Software International, Reykjavik, Iceland

FSE     Product: F-SECURE
	Version: 5.10, Build 6171
        Signature: sign.def 30.5.2000, fsmacro.def 31.5.2000, avp 31.5.2000
        Repair Switch: Disinfect automatically
	Company: F-Secure Corporation, Espoo, Finland

INO     Product: InoculateIT
	Inoculan 4.0/InoculateIT 4.5x for Windows NT
        Virus Signature: 12.15
        Signature Date: virsig.dat 05/31/2000
        Engine Ver 12.07, Date: 05/30/2000
        Mode: 	Start GUI
        	heuristic, Reviewer, Files
        	no Bootsector, no prompt, no sound, 
		no scan migrated files
        	Cure File Remove infected macro only, 
		no copy no rename 
	Company: Computer Associates International (CAI), Islandia (NY), USA       

NAV     Product: Norton Antivirus 
	Version: 5.01
        Signature: 05.06.2000
        Start: Batch
        Repair Switch: Repair automatically (switched on Gui)
	Company: Symantec, Cupertino (CA), USA

NVC     Product: Norman Virus Control
	Version: 4.80
        Signature: V. 4.70, Engine v.4.70.56, 30.5.2000
        Start: Batch
        Repairswitch: /cl
	Company: Norman Data Defense Systems, Solingen, Germany

PAV	Product: Power Antivirus
	GData PowerAntiVirus - AVP for Win 95/98 and NT
        Version: 3.0, build 129
	Signature: 05/27/2000
	Start: GUI
        Objects: all on, no memory, no system area, all files
        Action: Report only
        Options: warnings, heuristic, clean objects, pack info
                 no redundant check
        Repair Options: automatic clean
	Company: GData Software GmbH, Bochum, Germany

PER:    Product: PER AntiVirus
        Version: 1.60 Evaluation Version
        Signature: 06/12/2000 resp. 04/23/1999 in program info
        Start: GUI
	Company: PER Systems S.A., Lima, Peru

PRO:    Product: Protector Plus
	Protector Plus Version: 6.7.C27
        Start: Gui
        Options: no cure Virus, no prompt, Suspicious macro check 
        	 scan Email attachments, scan compressed files enhanced
        Repair Switch: Cure Virus, no prompt before action
	Company: Proland Software, Bangalore, India

QHL:    Product: QuickHeal
	Version: 5.24
        Signature: 22.4.2000
	Start: GUI
        Repair Switch: Repair
	Company: Cat Computer Services (P) Ltd., India
       
RAV	Product: RAV Antivirus v7
	Version RAV Antivirus for Win32 7.6.02
	Start: c:\rav>ravav.exe o:\ /ALL /RPTALL /LISTALL 
                 /REPORT RAV7.REP /UNZIP /CLEAN
        Company: GeCAD The Software Company, Bucharest, Romania

SCN     Product: McAfee VirusScan
	Version: VirusScan v4.5 Anti-Virus Software
        Virus Definitions 4.0.4080
        created on 05/31/2000, Scan engine 4.0.70
        Started: GUI Version
        Options GUI version: all options 
                           + enable macro and program heuristics 
        Company: Network Associates, Santa Clara (CA), USA      

------------------------ Appendix C ----------------------------------------

Table ART.1a: Detection Rates of ITW-Repair testbed
===================================================                          
                           This  includes        
             Viruses    ---- unreliably ----       Files
Scanner     detected    identified   detected    detected
----------------------------------------------------------
Maximum      48 100.0                            246 100.0
----------------------------------------------------------
ANT          48 100.0     0   0.0     0   0.0    246 100.0 
AVA          48 100.0     0   0.0     0   0.0    246 100.0 
AVG          48 100.0     0   0.0     0   0.0    246 100.0 
AVK          48 100.0     0   0.0     0   0.0    246 100.0 
AVP          48 100.0     0   0.0     0   0.0    246 100.0 
AVX          48 100.0     0   0.0     0   0.0    246 100.0 
CMD          48 100.0     0   0.0     0   0.0    246 100.0 
DRW          48 100.0     0   0.0     0   0.0    246 100.0 
FPW          48 100.0     0   0.0     0   0.0    246 100.0 
FSE          48 100.0     0   0.0     0   0.0    246 100.0 
INO          48 100.0     0   0.0     0   0.0    246 100.0 
NAV          48 100.0     0   0.0     0   0.0    246 100.0 
NVC          48 100.0     0   0.0     0   0.0    246 100.0 
PAV          48 100.0     0   0.0     0   0.0    246 100.0 
PER          39  81.3     4   4.2     0   0.0    200  81.3 (*)
PRO          48 100.0     0   0.0     4   4.2    240  97.6 (*) 
QHL          43  89.6     4   4.2     2   2.1    222  90.2 (*)
RAV          48 100.0     0   0.0     0   0.0    246 100.0 
SCN          48 100.0     0   0.0     0   0.0    246 100.0
---------------------------------------------------------- 
Remark: Format of tables as in VTC detection tests.


Table ART.2a: Repair Rates for Word Documents
=============================================
Scanner   maxpoint   reached   percentage
-----------------------------------------
ANT         1326       952        71.8%
AVA         1326      1118        84.3%
AVG         1326      1100        83.0%
AVK         1326      1222        92.2%
AVP         1326      1222        92.2%
AVX         1326       286        21.6%
CMD         1326      1212        91.4%
DRW         1326      1100        83.0%
FPW         1326      1198        90.3%
FSE         1326      1198        90.3%
INO         1326      1070        80.7%
NAV         1326      1326       100.0%
NVC         1326      1068        80.5%
PAV         1326      1222        92.2%
PER         1254      1046        83.4% (*)
PRO         1326      1118        84.3% (*)
QHL         1210       962        79.5% (*)
RAV         1326      1100        83.0%
SCN         1326      1304        98.3%
-----------------------------------------
Remark #1: Maxpoint is the maximum number of
           points which a product may reach; it is 
           the sum of points for all goat objects 
           in which a virus is reliably detected.
Remark #2: AV products marked (*) did not detect
           all ITW viruses (see table ART.1a)


Table ART.2b: Repair Rates for Excel Documents
==============================================
Scanner   maxpoint   reached   percentage
-----------------------------------------
ANT          278       174        62.6%
AVA          278       226        81.3%
AVG          278       226        81.3%
AVK          278       276        99.3%
AVP          278       276        99.3%
AVX          278       194        69.8%
CMD          278       278       100.0%
DRW          278       226        81.3%
FPW          278       278       100.0%
FSE          278       278       100.0%
INO          278       278       100.0%
NAV          278       278       100.0%
NVC          278       226        81.3%
PAV          278       276        99.3%
PER           40        32        80.0% (*)
PRO          236       202        85.6% (*)
QHL          238       194        81.5% (*)
RAV          278       226        81.3%
SCN          278       278       100.0%
-----------------------------------------
Remark #1: Maxpoint is the maximum number of
           points which a product may reach; it is 
           the sum of points for all goat objects 
           in which a virus is reliably detected.
Remark #2: AV products marked (*) did not detect
           all ITW viruses (see table ART.1a)


Table ART.2c: Repair Rates for documents overall
================================================
Scanner   maxpoint   reached   percentage
-----------------------------------------
ANT         1604       1126       70.2%
AVA         1604       1344       83.8%
AVG         1604       1326       82.7%
AVK         1604       1498       93.4%
AVP         1604       1498       93.4%
AVX         1604        480       29.9%
CMD         1604       1490       92.9%
DRW         1604       1326       82.7%
FPW         1604       1476       92.0%
FSE         1604       1476       92.0%
INO         1604       1348       84.0%
NAV         1604       1604      100.0%
NVC         1604       1294       80.7%
PAV         1604       1498       93.4%
PER         1294       1078       83.3% (*)
PRO         1562       1320       84.5% (*)
QHL         1448       1156       79.8% (*)
RAV         1604       1326       82.7%
SCN         1604       1582       98.6%
-----------------------------------------
Remark #1: Maxpoint is the maximum number of
           points which a product may reach; it is 
           the sum of points for all goat objects 
           in which a virus is reliably detected.
Remark #2: AV products marked (*) did not detect
           all ITW viruses (see table ART.1a)


Table ART.3a: Repair Rates for Word documents of goat type 1
============================================================
           total  correct
Scanner   number  number      %    maxpoint  reached      %
---------------------------------------------------------------
ANT         80     72       90.0%     486      468      96.3%
AVA         80     80      100.0%     486      486     100.0%
AVG         80     74       92.5%     486      480      98.8%
AVK         80     74       92.5%     486      480      98.8%
AVP         80     74       92.5%     486      480      98.8%
AVX         80      0        0.0%     486      246      50.6%
CMD         80     74       92.5%     486      480      98.8%
DRW         80     74       92.5%     486      480      98.8%
FPW         80     80      100.0%     486      486     100.0%
FSE         80     80      100.0%     486      486     100.0%
INO         80     80      100.0%     486      486     100.0%
NAV         80     80      100.0%     486      486     100.0%
NVC         80     72       90.0%     486      468      96.3%
PAV         80     74       92.5%     486      480      98.8%
PER         76     72       94.7%     460      456      99.1% (*)
PRO         80     80      100.0%     486      486     100.0% (*)
QHL         70     32       45.7%     416      382      91.8% (*)
RAV         80     74       92.5%     486      480      98.8%
SCN         80     74       92.5%     486      480      98.8%
---------------------------------------------------------------
Remark #0: goat type 1: a document with no user macros
Remark #1: total number:   number of files of goat type 1
           correct number: count of files where the 
                           AV product reached all points
Remark #2: AV products marked (*) did not detect
           all ITW viruses (see table ART.1a)


Table ART.3b: Repair Rates for Excel documents of goat type 1
=============================================================
           total  correct
Scanner   number  number      %    maxpoint  reached      %
---------------------------------------------------------------
ANT         16     16      100.0%      96       96     100.0%
AVA         16     16      100.0%      96       96     100.0%
AVG         16     16      100.0%      96       96     100.0%
AVK         16     14       87.5%      96       94      97.9%
AVP         16     14       87.5%      96       94      97.9%
AVX         16     14       87.5%      96       84      87.5%
CMD         16     16      100.0%      96       96     100.0%
DRW         16     16      100.0%      96       96     100.0%
FPW         16     16      100.0%      96       96     100.0%
FSE         16     16      100.0%      96       96     100.0%
INO         16     16      100.0%      96       96     100.0%
NAV         16     16      100.0%      96       96     100.0%
NVC         16     16      100.0%      96       96     100.0%
PAV         16     14       87.5%      96       94      97.9%
PER          2      2      100.0%      12       12     100.0% (*)
PRO         16     14       87.5%      96       94      97.9% (*)
QHL         14     14      100.0%      84       84     100.0% (*)
RAV         16     16      100.0%      96       96     100.0%
SCN         16     16      100.0%      96       96     100.0%
---------------------------------------------------------------
Remark #0: goat type 1: a document with no user macros
Remark #1: total number:   number of files of goat type 1
           correct number: count of files where the 
                           AV product reached all points
Remark #2: AV products marked (*) did not detect
           all ITW viruses (see table ART.1a)


Table ART.3c: Repair Rates for Word documents of goat type 2
============================================================
           total  correct
Scanner   number  number      %    maxpoint  reached      %
---------------------------------------------------------------
ANT         76      4        5.3%     526      318      60.5%
AVA         76      6        7.9%     526      386      73.4%
AVG         76      6        7.9%     526      380      72.2%
AVK         76     50       65.8%     526      482      91.6%
AVP         76     50       65.8%     526      482      91.6%
AVX         76      0        0.0%     526        0       0.0%
CMD         76     24       31.6%     526      428      81.4%
DRW         76      6        7.9%     526      380      72.2%
FPW         76     46       60.5%     526      468      89.0%
FSE         76     46       60.5%     526      468      89.0%
INO         76     50       65.8%     526      414      78.7%
NAV         76     76      100.0%     526      526     100.0%
NVC         76      4        5.3%     526      370      70.3%
PAV         76     50       65.8%     526      482      91.6%
PER         72      6        8.3%     496      360      72.6% (*)
PRO         76      6        7.9%     526      386      73.4% (*)
QHL         68      6        8.8%     470      340      72.3% (*)
RAV         76      6        7.9%     526      380      72.2%
SCN         76     70       92.1%     526      520      98.9%
---------------------------------------------------------------
Remark #0: goat type 2: document with user macros inside a module
Remark #1: total number:   number of files of goat type 2
           correct number: count of files where the 
                           AV product reached all points
Remark #2: AV products marked (*) did not detect
           all ITW viruses (see table ART.1a)


Table ART.3d: Repair Rates for Excel documents of goat type 2
=============================================================
           total  correct
Scanner   number  number      %    maxpoint  reached      %
---------------------------------------------------------------
ANT         16      0        0.0%     112       48      42.9%
AVA         16      0        0.0%     112       80      71.4%
AVG         16      0        0.0%     112       80      71.4%
AVK         16     16      100.0%     112      112     100.0%
AVP         16     16      100.0%     112      112     100.0%
AVX         16      0        0.0%     112       70      62.5%
CMD         16     16      100.0%     112      112     100.0%
DRW         16      0        0.0%     112       80      71.4%
FPW         16     16      100.0%     112      112     100.0%
FSE         16     16      100.0%     112      112     100.0%
INO         16     16      100.0%     112      112     100.0%
NAV         16     16      100.0%     112      112     100.0%
NVC         16      0        0.0%     112       80      71.4%
PAV         16     16      100.0%     112      112     100.0%
PER          2      0        0.0%      14       10      71.4% (*)
PRO         12      2       16.7%      84       64      76.2% (*)
QHL         14      0        0.0%      98       70      71.4% (*)
RAV         16      0        0.0%     112       80      71.4%
SCN         16     16      100.0%     112      112     100.0%
---------------------------------------------------------------
Remark #0: goat type 2: a document with user macros inside a module
Remark #1: total number:   number of files of goat type 2
           correct number: count of files where the AV product
                           reached all points
Remark #2: AV products marked (*) did not detect
           all ITW viruses (see table ART 1.a)


Table ART.3e: Repair Rates for Word documents of goat type 3
============================================================
           total  correct
Scanner   number  number      %    maxpoint  reached      %
---------------------------------------------------------------
ANT         48     14       29.2%     314      166      52.9%
AVA         48     14       29.2%     314      246      78.3%
AVG         48     14       29.2%     314      240      76.4%
AVK         48     16       33.3%     314      260      82.8%
AVP         48     16       33.3%     314      260      82.8%
AVX         48      0        0.0%     314       40      12.7%
CMD         48     40       83.3%     314      304      96.8%
DRW         48     14       29.2%     314      240      76.4%
FPW         48     14       29.2%     314      244      77.7%
FSE         48     14       29.2%     314      244      77.7%
INO         48     14       29.2%     314      170      54.1%
NAV         48     48      100.0%     314      314     100.0%
NVC         48     14       29.2%     314      230      73.2%
PAV         48     16       33.3%     314      260      82.8%
PER         46     14       30.4%     298      230      77.2% (*)
PRO         48     14       29.2%     314      246      78.3% (*)
QHL         48     14       29.2%     314      240      76.4% (*)
RAV         48     14       29.2%     314      240      76.4%
SCN         48     40       83.3%     314      304      96.8%
---------------------------------------------------------------
Remark #0: goat type 3: a document with user macros 
                        inside the "ThisDocument" module
Remark #1: total number:   number of files of goat type 3
           correct number: count of files where the AV product
                           reached all points
Remark #2: AV products marked (*) did not detect
           all ITW viruses (see table ART 1.a)


Table ART.3f: Repair Rates for Excel documents of goat type 3
=============================================================
           total  correct
Scanner   number  number      %    maxpoint  reached      %
---------------------------------------------------------------
ANT         10      0        0.0%      70      30       42.9%
AVA         10      0        0.0%      70      50       71.4%
AVG         10      0        0.0%      70      50       71.4%
AVK         10     10      100.0%      70      70      100.0%
AVP         10     10      100.0%      70      70      100.0%
AVX         10      0        0.0%      70      40       57.1%
CMD         10     10      100.0%      70      70      100.0%
DRW         10      0        0.0%      70      50       71.4%
FPW         10     10      100.0%      70      70      100.0%
FSE         10     10      100.0%      70      70      100.0%
INO         10     10      100.0%      70      70      100.0%
NAV         10     10      100.0%      70      70      100.0%
NVC         10      0        0.0%      70      50       71.4%
PAV         10     10      100.0%      70      70      100.0%
PER          2      0        0.0%      14      10       71.4% (*)
PRO          8      2       25.0%      56      44       78.6% (*)
QHL          8      0        0.0%      56      40       71.4% (*)
RAV         10      0        0.0%      70      50       71.4%
SCN         10     10      100.0%      70      70      100.0%
---------------------------------------------------------------
Remark #0: goat type 3: a document with user macros 
                        inside the "ThisDocument" module
Remark #1: total number:   number of files of goat type 3
           correct number: count of files where the AV product
                           reached all points
Remark #2: AV products marked (*) did not detect
           all ITW viruses (see table ART 1.a)


Table ART.4a: Repair Rate for Word documents
============================================
Scanner documents fullpoints      %
-------------------------------------------
ANT        204       90         44.1%
AVA        204      100         49.0%
AVG        204       94         46.1%
AVK        204      140         68.6%
AVP        204      140         68.6%
AVX        204        0          0.0%
CMD        204      138         67.6%
DRW        204       94         46.1%
FPW        204      140         68.6%
FSE        204      140         68.6%
INO        204      144         70.6%
NAV        204      204        100.0%
NVC        204       90         44.1%
PAV        204      140         68.6%
PER        194       92         47.4% (*)
PRO        204      100         49.0% (*)
QHL        186       52         28.0% (*)
RAV        204       94         46.1%
SCN        204      184         90.2%
-------------------------------------------
Remark #1: documents:  number of documents tested
           fullpoints: number of documents where a product
                       reached all points during repair
Remark #2: AV products marked (*) did not detect
           all ITW viruses (see table ART 1.a)


Table ART.4b: Repair Rate for Excel documents
=============================================
Scanner documents fullpoints      %
-------------------------------------------
ANT         42       16         38.1%
AVA         42       16         38.1%
AVG         42       16         38.1%
AVK         42       40         95.2%
AVP         42       40         95.2%
AVX         42       14         33.3%
CMD         42       42        100.0%
DRW         42       16         38.1%
FPW         42       42        100.0%
FSE         42       42        100.0%
INO         42       42        100.0%
NAV         42       42        100.0%
NVC         42       16         38.1%
PAV         42       40         95.2%
PER          6        2         33.3% (*)
PRO         36       18         50.0% (*)
QHL         36       14         38.9% (*)
RAV         42       16         38.1%
SCN         42       42        100.0%
-------------------------------------------
Remark #1: documents:  number of documents tested
           fullpoints: number of documents where a product
                       reached all points during repair
Remark #2: AV products marked (*) did not detect
           all ITW viruses (see table ART 1.a)


Table ART.4c: Repair Rate for all documents
===========================================
Scanner documents fullpoints      %
-------------------------------------------
ANT        246      106         43.1%
AVA        246      116         47.2%
AVG        246      110         44.7%
AVK        246      180         73.2%
AVP        246      180         73.2%
AVX        246       14          5.7%
CMD        246      180         73.2%
DRW        246      110         44.7%
FPW        246      182         74.0%
FSE        246      182         74.0%
INO        246      186         75.6%
NAV        246      246        100.0%
NVC        246      106         43.1%
PAV        246      180         73.2%
PER        200       94         47.0% (*)
PRO        240      118         49.2% (*)
QHL        222       66         29.7% (*)
RAV        246      110         44.7%
SCN        246      226         91.9%
-------------------------------------------
Remark #1: documents:  number of documents tested
           fullpoints: number of documents where a product
                       reached all points during repair
Remark #2: AV products marked (*) did not detect
           all ITW viruses (see table ART 1.a)
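
Note: table ART.4c is the column-wise sum of tables ART.4a and ART.4b.
A quick check on the AVK row (Python, values taken from the tables):

    documents  = 204 + 42                   # Word + Excel -> 246
    fullpoints = 140 + 40                   # Word + Excel -> 180
    print(100.0 * fullpoints / documents)   # 73.17... -> 73.2%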


Table ART.5a: Repair Rate for Word viruses
==========================================
Scanner   viruses  fullpoints       %
------------------------------------------
ANT         40         1           2.5%
AVA         40         1           2.5%
AVG         40         1           2.5%
AVK         40        23          57.5%
AVP         40        23          57.5%
AVX         40         0           0.0%
CMD         40        14          35.0%
DRW         40         1           2.5%
FPW         40        21          52.5%
FSE         40        21          52.5%
INO         40        23          57.5%
NAV         40        40         100.0%
NVC         40         1           2.5%
PAV         40        23          57.5%
PER         38         1           2.6% (*)
PRO         40         1           2.5% (*)
QHL         36         1           2.8% (*)
RAV         40         1           2.5%
SCN         40        36          90.0%
------------------------------------------
Remark #1:  viruses:    number of different viruses
            fullpoints: number of viruses where a product
                        reached all points during repair
Remark #2: AV products marked (*) did not detect
           all ITW viruses (see table ART 1.a)


Table ART.5b: Repair Rate for Excel viruses
===========================================
Scanner   viruses  fullpoints       %
------------------------------------------
ANT          8         0          0.0%
AVA          8         0          0.0%
AVG          8         0          0.0%
AVK          8         7         87.5%
AVP          8         7         87.5%
AVX          8         0          0.0%
CMD          8         8        100.0%
DRW          8         0          0.0%
FPW          8         8        100.0%
FSE          8         8        100.0%
INO          8         8        100.0%
NAV          8         8        100.0%
NVC          8         0          0.0%
PAV          8         7         87.5%
PER          1         0          0.0% (*)
PRO          8         2         25.0% (*)
QHL          7         0          0.0% (*)
RAV          8         0          0.0%
SCN          8         8        100.0%
------------------------------------------
Remark #1:  viruses:    number of different viruses
            fullpoints: number of viruses where a product
                        reached all points during repair
Remark #2: AV products marked (*) did not detect
           all ITW viruses (see table ART 1.a)


Table ART.5c: Repair Rate for all viruses
=========================================
Scanner   viruses  fullpoints       %
------------------------------------------
ANT         48          1          2.1%
AVA         48          1          2.1%
AVG         48          1          2.1%
AVK         48         30         62.5%
AVP         48         30         62.5%
AVX         48          0          0.0%
CMD         48         22         45.8%
DRW         48          1          2.1%
FPW         48         29         60.4%
FSE         48         29         60.4%
INO         48         31         64.6%
NAV         48         48        100.0%
NVC         48          1          2.1%
PAV         48         30         62.5%
PER         39          1          2.6% (*)
PRO         48          3          6.3% (*)
QHL         43          1          2.3% (*)
RAV         48          1          2.1%
SCN         48         44         91.7%
------------------------------------------
Remark #1:  viruses:    number of different viruses
            fullpoints: number of viruses where a product
                        reached all points during repair
Remark #2: AV products marked (*) did not detect
           all ITW viruses (see table ART 1.a)
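
The per-virus view of tables ART.5a-5c is stricter than the per-document
view: following Remark #1, a virus counts under "fullpoints" only if the
product reached all points for every document infected with it. A sketch
of this aggregation (Python; the data layout is illustrative):

    def virus_fullpoints(results):
        # results: {virus_name: [True/False per infected goat file]}
        return sum(1 for goats in results.values() if all(goats))

    # A virus with goats [True, True, False] earns no fullpoint; this is
    # why e.g. SCN repairs 226 of 246 documents (91.9%, table ART.4c)
    # but only 44 of 48 viruses completely (91.7%, table ART.5c).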


Table ART.6a: Loss of points for Criteria 3-8 for Word documents
================================================================
        Criterion 8 I Criterion 5 I Criterion 3 I Criterion 4 I Criterion 6 I Criterion 7
--------------------+-------------+-------------+-------------+-------------+-------------
         -- macro - I --- user -- I --- save -- I  --- VB --- I No warning  I - template -
Scanner  protection I -- macro -- I --  test -- I  - Editor - I - message - I ---- bit ---
total    80     %   I 208     %   I  204    %   I  204    %   I  204    %   I  18      %   
--------------------+-------------+-------------+-------------+-------------+-------------
ANT       0    0.0% I 204   98.1% I   58  28.4% I    0   0.0% I   58  28.4% I  18   100.0%
AVA       0    0.0% I 208  100.0% I    0   0.0% I    0   0.0% I    0   0.0% I   0     0.0%
AVG       0    0.0% I 208  100.0% I    0   0.0% I    0   0.0% I    0   0.0% I  18   100.0%
AVK       0    0.0% I  48   23.1% I   10   4.9% I   24  11.8% I    4   2.0% I  18   100.0%
AVP       0    0.0% I  48   23.1% I   10   4.9% I   24  11.8% I    4   2.0% I  18   100.0%
AVX      60   75.0% I   0    0.0% I    0   0.0% I    0   0.0% I   70  34.3% I   0     0.0%
CMD       0    0.0% I  96   46.2% I    0   0.0% I    0   0.0% I    0   0.0% I  18   100.0%
DRW       0    0.0% I 208  100.0% I    0   0.0% I    0   0.0% I    0   0.0% I  18   100.0%
FPW       0    0.0% I   0    0.0% I   16   7.8% I   36  17.6% I   64  31.4% I  12    66.7%
FSE       0    0.0% I   0    0.0% I   16   7.8% I   36  17.6% I   64  31.4% I  12    66.7%
INO       0    0.0% I 120   57.7% I   60  29.4% I    4   2.0% I   60  29.4% I  12    66.7%
NAV       0    0.0% I   0    0.0% I    0   0.0% I    0   0.0% I    0   0.0% I   0     0.0%
NVC       0    0.0% I 204   98.1% I    0   0.0% I    0   0.0% I    0   0.0% I  18   100.0%
PAV       0    0.0% I  48   23.1% I   10   4.9% I   24  11.8% I    4   2.0% I  18   100.0%
PER       0    0.0% I 196  100.0% I    0   0.0% I    0   0.0% I    0   0.0% I  12   100.0% (*)
PRO       0    0.0% I 208  100.0% I    0   0.0% I    0   0.0% I    0   0.0% I   0     0.0% (*)
QHL      38   54.3% I 192  100.0% I    0   0.0% I    0   0.0% I    0   0.0% I  18   100.0% (*)
RAV       0    0.0% I 208  100.0% I    0   0.0% I    0   0.0% I    0   0.0% I  18   100.0%
SCN       0    0.0% I   4    1.9% I    0   0.0% I    0   0.0% I    0   0.0% I  18   100.0%
--------------------+-------------+-------------+-------------+-------------+-------------
Remark #1: the "total" line shows the maximum number of points that can be lost
Remark #2: AV products marked (*) did not detect
           all ITW viruses (see table ART 1.a)
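
Note that the per-criterion percentages are taken relative to each
product's own achievable maximum, which is lower for products marked (*)
since they scanned fewer goat files (cf. their reduced totals in tables
ART.3a-3f). A worked example (Python, values from table ART.6a):

    # PER, criterion 5 (user macros): PER lost 196 points, shown as
    # 100.0%, i.e. 196 was the maximum PER could lose there, not the
    # full-testbed 208 of the "total" line.
    print(100.0 * 196 / 196)    # -> 100.0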


Table ART.6b: Loss of points for Criteria 3-6 and 8 for Excel documents
=======================================================================
        Criterion 8 I Criterion 5 I Criterion 3 I Criterion 4 I Criterion 6
--------------------+-------------+-------------+-------------+------------
         -- macro - I --- user -- I --- save -- I  --- VB --- I No warning
Scanner  protection I -- macro -- I --  test -- I  - Editor - I - message -
total    16     %   I  52     %   I  42    %    I  42    %    I  42    %
--------------------+-------------+-------------+-------------+------------
ANT       0    0.0% I  52  100.0% I  26   61.9% I   0   0.0%  I  26  61.9%
AVA       0    0.0% I  52  100.0% I   0    0.0% I   0   0.0%  I   0   0.0%
AVG       0    0.0% I  52  100.0% I   0    0.0% I   0   0.0%  I   0   0.0%
AVK       2   12.5% I   0    0.0% I   0    0.0% I   0   0.0%  I   0   0.0%
AVP       2   12.5% I   0    0.0% I   0    0.0% I   0   0.0%  I   0   0.0%
AVX       0    0.0% I  44   84.6% I   0    0.0% I   0   0.0%  I   0   0.0%
CMD       0    0.0% I   0    0.0% I   0    0.0% I   0   0.0%  I   0   0.0%
DRW       0    0.0% I  52  100.0% I   0    0.0% I   0   0.0%  I   0   0.0%
FPW       0    0.0% I   0    0.0% I   0    0.0% I   0   0.0%  I   0   0.0%
FSE       0    0.0% I   0    0.0% I   0    0.0% I   0   0.0%  I   0   0.0%
INO       0    0.0% I   0    0.0% I   0    0.0% I   0   0.0%  I   0   0.0%
NAV       0    0.0% I   0    0.0% I   0    0.0% I   0   0.0%  I   0   0.0%
NVC       0    0.0% I  52  100.0% I   0    0.0% I   0   0.0%  I   0   0.0%
PAV       2   12.5% I   0    0.0% I   0    0.0% I   0   0.0%  I   0   0.0%
PER       0    0.0% I   8  100.0% I   0    0.0% I   0   0.0%  I   0   0.0% (*)
PRO       2   12.5% I  32   80.0% I   0    0.0% I   0   0.0%  I   0   0.0% (*)
QHL       0    0.0% I  44  100.0% I   0    0.0% I   0   0.0%  I   0   0.0% (*)
RAV       0    0.0% I  52  100.0% I   0    0.0% I   0   0.0%  I   0   0.0%
SCN       0    0.0% I   0    0.0% I   0    0.0% I   0   0.0%  I   0   0.0%
--------------------+-------------+-------------+-------------+-----------
Remark #1: the "total" line shows the maximum number of points that can be lost
Remark #2: AV products marked (*) did not detect
           all ITW viruses (see table ART 1.a)
  

Table ART.7a: Detection Rate versus Repair Rate
===============================================
Scanner  detection  points for   rank of   repair  points for  rank of
           rate      detection  detection   rate     repair     repair
----------------------------------------------------------------------
ANT        100.0%        6          1       70.2%      0          7 
AVA        100.0%        6          1       83.8%      1          5 
AVG        100.0%        6          1       82.7%      1          5 
AVK        100.0%        6          1       93.4%      2          3 
AVP        100.0%        6          1       93.4%      2          3 
AVX        100.0%        6          1       29.9%      0          7 
CMD        100.0%        6          1       92.9%      2          3 
DRW        100.0%        6          1       82.7%      1          5 
FPW        100.0%        6          1       92.0%      2          3 
FSE        100.0%        6          1       92.0%      2          3 
INO        100.0%        6          1       84.0%      1          5 
NAV        100.0%        6          1      100.0%      3          1 
NVC        100.0%        6          1       80.7%      1          5 
PAV        100.0%        6          1       93.4%      2          3 
PER         81.3%        1          6       83.3%      1          5 (*)
PRO         97.6%        4          3       84.5%      1          5 (*)
QHL         90.2%        3          4       79.8%     0.5         6 (*)
RAV        100.0%        6          1       82.7%      1          5 
SCN        100.0%        6          1       98.6%     2.5         2 

Remark:    AV products marked (*) did not detect
           all ITW viruses (see table ART 1.a)


Table ART.8a: Overall Detection/Repair results
==============================================
Scanner  overall      overall  classification
          points        rank 
-----------------------------------------------
ANT         6           5          fair
AVA         7           4          good
AVG         7           4          good
AVK         8           3          very good
AVP         8           3          very good
AVX         6           5          fair
CMD         8           3          very good
DRW         7           4          good
FPW         8           3          very good
FSE         8           3          very good
INO         7           4          good
NAV         9           1          perfect
NVC         7           4          good
PAV         8           3          very good
PER         2           9          very poor     (*)
PRO         5           6          only average  (*)
QHL        3.5          7          below average (*)
RAV         7           4          good
SCN        8.5          2          excellent
-----------------------------------------------
Remark:    AV products marked (*) did not detect
           all ITW viruses (see table ART 1.a)
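
The overall points of table ART.8a are consistent with simply adding the
detection and repair points of table ART.7a, for example (Python):

    assert 6 + 3   == 9      # NAV: rank 1, "perfect"
    assert 6 + 2.5 == 8.5    # SCN: rank 2, "excellent"
    assert 3 + 0.5 == 3.5    # QHL: rank 7, "below average"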

---------------------------- Appendix D ------------------------------------

D) ART testbed: Index of W97M and X97M viruses
==============================================

The testbed for the AntiVirus Repair test consisted of the following
40 Word and 8 Excel viruses:

 Index      Number of	       Type of	        Virus (variant)
Number   infected objects  infected object            name
----------------------------------------------------------------
00000		4		Word		O97M/Halfcross.A
00001		4		Word		O97M/Jerk.A
00003		6		Word		O97M/Tristate.C
00004		6		Word		W97M/Appder.A
00005		6		Word		W97M/Astia.L
00006		6		Word		W97M/Bablas.A
00007		4		Word		W97M/Brenda.A
00009		4		Word		W97M/Chack.H
00011		4		Word		W97M/Class.B
00015		4		Word		W97M/Claud.A
00016		4		Word		W97M/Coldape.A
00018		4		Word		W97M/Cont.A
00019		4		Word		W97M/Ded.A
00022		4		Word		W97M/Eight941.E
00025		4		Word		W97M/Ethan.A
00030		6		Word		W97M/Footer.A
00031		6		Word		W97M/Groov.A
00035		6		Word		W97M/Hubad.A
00036		2		Word		W97M/Locale.A
00041		4		Word		W97M/Marker.A
00053		6		Word		W97M/Melissa.A
00058		4		Word		W97M/Myna.B
00060		6		Word		W97M/Nono.A
00062		6		Word		W97M/Nottice.A
00064		6		Word		W97M/Odious.A
00065		6		Word		W97M/Opey.A
00067		6		Word		W97M/Ozwer.F
00068		6		Word		W97M/Panther.A
00069		4		Word		W97M/Pri.A
00072		6		Word		W97M/Proteced.A
00073		6		Word		W97M/Rv.A
00074		4		Word		W97M/Story.A
00075		6		Word		W97M/Thus.A
00076		6		Word		W97M/Turn.A
00077		6		Word		W97M/Twno.AC
00078		6		Word		W97M/Verlor.A
00079		6		Word		W97M/Visor.A
00080		6		Word		W97M/Vmpck1.BY
00081		4		Word		W97M/Walker.D
00083		6		Word		W97M/Wrench.C
-------------------------------------------------------------------
00084		4		Excel		O97M/Halfcross.A
00085		4		Excel		O97M/Tristate.C
00086		6		Excel		X97M/Clonar.A
00087		4		Excel		X97M/Divi.A
00090		6		Excel		X97M/Laroux.A
00103		6		Excel		X97M/Manalo.E
00104		6		Excel		X97M/Pth.D
00105		6		Excel		X97M/Vcx.A
-------------------------------------------------------------------


---------------------- End of Appendices A-D ----------------------