Anti-Virus Comparative




On-demand Detection of
Malicious Software
includes false alarm test



Language: English
March 2012

Last Revision: 10th April 2012

www.av-comparatives.org




  Table of Contents

   Tested Products
   Conditions for participation and test methodology
   Tested product versions
   Comments
   Graph of missed samples
   Results
   False positive/alarm test
   Award levels reached in this test
   Copyright and Disclaimer








Tested Products



       AhnLab V3 Internet Security 8.0            G DATA AntiVirus 2012
       avast! Free Antivirus 7.0                  GFI Vipre Antivirus 2012
       AVG Anti-Virus 2012                        Kaspersky Anti-Virus 2012
       AVIRA Antivirus Premium 2012               McAfee AntiVirus Plus 2012
       BitDefender Antivirus Plus 2012            Microsoft Security Essentials 2.1
       BullGuard Antivirus 12                     Panda Cloud Free Antivirus 1.5.2
       eScan Anti-Virus 11.0                      PC Tools Spyware Doctor with AV 9.0
       ESET NOD32 Antivirus 5.0                   Sophos Anti-Virus 10.0
       F-Secure Anti-Virus 2012                   Trend Micro Titanium AntiVirus+ 2012
       Fortinet FortiClient Lite 4.3              Webroot SecureAnywhere AV 8.0








Conditions for participation and test methodology

The conditions for participation in our tests are listed in the methodology document at
http://www.av-comparatives.org/seiten/ergebnisse/methodology.pdf. Before proceeding with this
report, readers are advised to first read the above-mentioned document.

Participation is limited to a maximum of 20 well-known international anti-virus products whose vendors
agreed to be tested and included in the public test series of 2012.




Tested Product Versions

The malware test set was frozen on 16th February 2012 and consisted of 291,388 sample variants. The
products were updated on 1st March 2012. The following twenty up-to-date products were included in
this public test:

       AhnLab V3 Internet Security 8.0.5.19            G DATA AntiVirus 22.1.0.2
       avast! Free Antivirus 7.0.1407                  GFI Vipre Antivirus 5.0.5134
       AVG Anti-Virus 2012.0.1913                      Kaspersky Anti-Virus 12.0.0.374
       AVIRA Antivirus Premium 12.0.0.915              McAfee AntiVirus Plus 11.0.654
       Bitdefender Anti-Virus+ 15.0.36.1530            Microsoft Security Essentials 2.1.1116.0
       BullGuard Antivirus 12.0.215                    Panda Cloud Free Antivirus 1.5.2
       eScan Anti-Virus 11.0.1139.1146                 PC Tools Spyware Doctor with Antivirus 9.0.0.909
       ESET NOD32 Antivirus 5.0.95.0                   Sophos Anti-Virus 10.0
       F-Secure Anti-Virus 12.49.104                   Trend Micro Titanium AntiVirus Plus 5.0.1280
       Fortinet FortiClient Lite 4.3.3.0436            Webroot SecureAnywhere 8.0.1.95




Please try the products1 on your own system before making a purchase decision based on these tests.
There are also other program features and important factors (e.g. price, ease of use/management,
compatibility, graphical user interface, language, support, etc.) to consider. Although very important,
the file detection rate of a product is only one aspect of a complete anti-virus product.
AV-Comparatives also provides a whole-product dynamic "real-world" protection test, as well as other
test reports which cover different aspects/features of the products. Please visit our website to find
the other tests and reports.




1
  Information about additional third-party engines/signatures used inside the products: BullGuard, eScan and
F-Secure are based on the Bitdefender engine. G DATA is based on the Avast and Bitdefender engines. PC Tools
uses the signatures of Symantec.





Most products run with their highest settings by default. Certain products switch to their highest
settings automatically when malware is found, which makes it impossible to test against a variety of
malware with real "default" settings. In order to get comparable results, we set the few remaining
products to their highest settings, or leave them at lower settings, in agreement with the respective
vendors. We kindly ask vendors to provide stronger settings by default, i.e. to set their default
settings to the highest levels of detection, especially for scheduled scans and scans initiated by the
user; this is usually already the case for on-access and/or on-execution scans. We allow the remaining
vendors (which do not use the highest settings in on-demand scans) to choose to be tested with the
higher settings they already use by default for on-access/on-execution scans, so that the results of
the file detection test are closer to real-world usage. We also ask vendors to remove paranoid settings
from the user interface which are too aggressive ever to be of any benefit to common users. Below are
some notes about the settings used (scan all files, scan archives, etc. being enabled):

AVG, AVIRA: asked us not to enable/count the informational warnings for packers as detections.
Accordingly, we did not count them as detections (neither on the malware set nor on the clean set).

Avast, AVIRA, Kaspersky: tested with heuristic set to high/advanced.

F-Secure, Sophos: asked to be tested and awarded based on their default settings (i.e. without their
advanced heuristics / suspicious-detection settings).

Several products make use of cloud technologies, which require an active internet connection; our
tests are performed with an active internet connection. We no longer show the baseline detection rates
without the cloud; instead we show only the results with the cloud active. Users should be aware that
detection rates may in some cases be drastically lower if the scan is performed offline (or when the
cloud is unreachable for various reasons). The cloud should be considered an additional benefit/feature
that increases detection rates (as well as response times and false alarm suppression), not a full
replacement for local offline detection. Vendors should make sure that users are warned if connectivity
to the cloud2 is lost, e.g. during a scan, as this may considerably affect the protection provided and,
for example, render an initiated scan useless.

This report no longer contains the split detection categories. As announced last year, we also no
longer provide the on-demand scanning speed test. We invite users to look at our other tests too, and
not only at this type of test. This is to make users aware of the other types of tests we have been
providing for some years, such as our Whole-Product "Real-World" Protection Test.

The test set was built by consulting telemetry data, in an attempt to include prevalent samples from
recent months that are (or were) hitting users in the field. Due to the increasing amount of
polymorphic malware, we applied a clustering method to classify similar files. This allows us to
evaluate prevalent polymorphic samples while reducing the size of the set, without introducing bias.
Each miss therefore represents one missed group/variant of files. We decided to apply this method
because it helps reduce the influence of, e.g., a disproportionate number of samples of the same family
in the set. We compared the results/rankings with and without clustering the files. Using fewer samples
(one per variant) will also reduce the workload for everyone in future tests, as a smaller set can be
analyzed more thoroughly.
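
To make the variant-grouping idea concrete, here is a minimal sketch in Python (our illustration, not
AV-Comparatives' actual tooling): it profiles each file by its byte n-grams, merges files whose Jaccard
similarity exceeds a threshold, and returns one group per variant. The 4-byte shingle size and the 0.8
threshold are illustrative assumptions.

    # Sketch: cluster near-identical malware variants so each group counts once.
    # Assumption: 4-byte shingles and a 0.8 Jaccard threshold are illustrative only.
    import itertools
    from pathlib import Path

    def shingles(data: bytes, n: int = 4) -> set:
        """Byte n-gram profile of a file."""
        return {data[i:i + n] for i in range(len(data) - n + 1)}

    def jaccard(a: set, b: set) -> float:
        return len(a & b) / len(a | b) if (a | b) else 1.0

    def cluster_variants(paths, threshold: float = 0.8):
        profiles = {p: shingles(Path(p).read_bytes()) for p in paths}
        parent = {p: p for p in paths}            # union-find over files

        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]     # path compression
                x = parent[x]
            return x

        for a, b in itertools.combinations(paths, 2):
            if jaccard(profiles[a], profiles[b]) >= threshold:
                parent[find(a)] = find(b)         # single-linkage merge

        groups = {}
        for p in paths:
            groups.setdefault(find(p), []).append(p)
        return list(groups.values())              # keep one representative per group

Pairwise comparison does not scale to hundreds of thousands of files; production tooling would use
something like locality-sensitive hashing instead, but the grouping principle is the same.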


2
  A good paper about the pros and cons of cloud-based scanning can be found here:
http://www.av-test.org/fileadmin/pdf/publications/vb_2009_avtest_paper_why_in-the-cloud_scanning_is_not_a_solution.pdf





The malware detection rates are grouped by the testers after looking at the clusters built with the
hierarchical clustering method. By using clusters, there are no fixed thresholds to reach, as the
thresholds change based on the results. The testers may adjust the groupings rationally rather than
relying solely on the clusters; this avoids a situation where, for example, all products score badly in
a future test and yet still receive high rankings.
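
As an illustration of this grouping step (a sketch only: scipy is our choice of tool here, and the cut
height is an assumed parameter, not the testers' actual method), the following clusters the detection
rates published later in this report and cuts the resulting tree into groups, which mirrors the red
line drawn through the dendrogram:

    # Sketch: group one-dimensional detection rates via hierarchical clustering.
    # The rates are this report's published results; the cut height t=3.0 is an
    # illustrative assumption, not the testers' actual parameter.
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    rates = np.array([99.7, 99.4, 99.3, 98.9, 98.6, 98.5, 98.2, 98.0,
                      97.6, 97.2, 97.0, 96.4, 95.6, 94.0, 93.1])

    # Ward linkage on the 1-D observations; criterion="distance" cuts the
    # dendrogram at a fixed height, as the red line in the figure does.
    tree = linkage(rates.reshape(-1, 1), method="ward")
    groups = fcluster(tree, t=3.0, criterion="distance")
    print(groups)  # rates sharing a label fall into the same cluster/group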


                                            Detection Rate Clusters/Groups
                              (given by the testers after consulting statistical methods)
                                               4          3           2           1

   Very few (0-2 FPs) or Few (3-15 FPs)      TESTED    STANDARD    ADVANCED    ADVANCED+

   Many (16-100 FPs)                         TESTED    TESTED      STANDARD    ADVANCED

   Very many (101-500 FPs)                   TESTED    TESTED      STANDARD    STANDARD

   Crazy many (over 500 FPs)                 TESTED    TESTED      TESTED      TESTED
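
The matrix above translates directly into a small lookup table. The sketch below encodes it; the
cluster numbering (1 = best, 4 = worst) follows the table header, and the function names are ours:

    # Sketch: map a product's detection cluster and false-alarm count to an award,
    # exactly as in the matrix above.
    AWARDS = {  # rows: FP band; columns: clusters 4, 3, 2, 1
        "very_few":   ["TESTED", "STANDARD", "ADVANCED", "ADVANCED+"],
        "few":        ["TESTED", "STANDARD", "ADVANCED", "ADVANCED+"],
        "many":       ["TESTED", "TESTED",   "STANDARD", "ADVANCED"],
        "very_many":  ["TESTED", "TESTED",   "STANDARD", "STANDARD"],
        "crazy_many": ["TESTED", "TESTED",   "TESTED",   "TESTED"],
    }

    def fp_band(fps: int) -> str:
        if fps <= 2:   return "very_few"
        if fps <= 15:  return "few"
        if fps <= 100: return "many"
        if fps <= 500: return "very_many"
        return "crazy_many"

    def award(cluster: int, fps: int) -> str:
        """cluster is 1 (best) to 4 (worst), as in the table header."""
        return AWARDS[fp_band(fps)][4 - cluster]

    # Example: 28 false alarms fall into the "many" band, which lowers a
    # cluster-1 product from ADVANCED+ to ADVANCED.
    assert award(1, 28) == "ADVANCED"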




Graph of missed samples (lower is better)




Files have been clustered; each miss represents a missed variant. All tested products scored quite well
and missed less than 10% of the test set, so we decided to cluster the detection rates into three
groups.







The results of our file detection tests are not applicable to on-execution protection technologies
(such as HIPS, behaviour blockers, etc.). Such technologies are evaluated in our new
heuristic/behavioural tests, which we will provide during this year. Please do not miss the second part
of this report (to be published in a few months), containing the new heuristic/behavioural test. It
evaluates how good products are at detecting and blocking completely new/unknown malware (by the
on-demand/on-access scanner with local heuristics and generic signatures, without the cloud, as well as
by blocking malicious behaviour when files are executed).

A good file detection rate is still one of the most important, deterministic and reliable features of
an anti-virus product. Additionally, most products provide at least some kind of HIPS, behaviour-based,
reputation-based or other functionality to block (or at least warn about the possibility of) malicious
actions, e.g. during the execution of malware, when all other on-access and on-demand
detection/protection mechanisms have failed.

Although we deliver various tests and show different aspects of anti-virus software, users are advised
to evaluate the software for themselves and form their own opinion about it. Test data and reviews
simply provide guidance on aspects that users cannot evaluate themselves. We suggest and encourage
readers to also research other independent test results provided by well-known and established
independent testing organizations, in order to get a better overview of the detection and protection
capabilities of the various products across different test scenarios and test sets.








Results

Please also consider the false alarm rates when looking at the file detection rates below3.

Total detection rates (clustered in groups):
    1.   G DATA                          99.7%
    2.   AVIRA                           99.4%
    3.   Kaspersky                       99.3%
    4.   Sophos                          98.9%
    5.   F-Secure, Panda, Bitdefender,
         BullGuard, McAfee               98.6%
    6.   Fortinet, eScan                 98.5%
    7.   Webroot                         98.2%
    8.   Avast                           98.0%
    9.   ESET                            97.6%
   10.   PC Tools                        97.2%
   11.   GFI                             97.0%
   12.   AVG                             96.4%
   13.   Trend Micro                     95.6%
   14.   AhnLab                          94.0%
   15.   Microsoft                       93.1%

The test set used contained almost 300,000 recent/prevalent samples from the last few months.

Hierarchical Cluster Analysis

This dendrogram shows the results of the cluster analysis4. It indicates at what level of similarity
the clusters are joined. The red dashed line marks the chosen level of similarity; each intersection
with it indicates a group (in this case, three groups).

3
 We estimate the remaining error margin on the final set (after applying the clusters) to be under 0.2%.
4
 For more information about cluster analysis, see e.g. this easy-to-understand tutorial:
http://strata.uga.edu/software/pdf/clusterTutorial.pdf






False positive/alarm test
In order to better evaluate the quality of the file detection capabilities of anti-virus products
(distinguishing good files from malicious ones), we provide a false alarm test. False alarms can
sometimes cause as much trouble as a real infection. Please consider the false alarm rate when looking
at the detection rates, as a product that is prone to false alarms achieves higher detection scores
more easily. All discovered false alarms were reported to the respective anti-virus vendors and should
by now have been fixed.
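
Conceptually, the false alarm count is simply the number of files in the clean set that a product
flags. A minimal sketch (the scan callback is a hypothetical wrapper around a product's command-line
scanner; no real product API is implied):

    # Sketch: count false alarms over a clean-file corpus.
    # `scan(path) -> bool` is hypothetical: True means the product flagged the file.
    from pathlib import Path

    def count_false_alarms(clean_dir: str, scan) -> int:
        return sum(1 for f in Path(clean_dir).rglob("*")
                   if f.is_file() and scan(f))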

False Positive Results

Number of false alarms found in our set of clean files (lower is better):

     1.   Microsoft                    0
     2.   ESET                         2      very few FPs
     3.   BitDefender, F-Secure        4
     4.   BullGuard                    5
     5.   Kaspersky                    9
     6.   Panda                       10      few FPs
     7.   eScan                       11
     8.   G DATA                      13
     9.   Avast, Sophos               14
    10.   AVIRA                       15
    11.   PC Tools                    22
    12.   McAfee                      28
    13.   Fortinet                    32
    14.   AVG                         38      many FPs
    15.   AhnLab                      64
    16.   GFI                         79
    17.   Trend Micro                166
    18.   Webroot                    428      very many FPs

Details about the discovered false alarms (including their assumed prevalence) can be seen in a
separate report available at: http://www.av-comparatives.org/images/stories/test/fp/avc_fp_mar2012.pdf








Award levels reached in this test

AV-Comparatives provides ranking awards (STANDARD, ADVANCED and ADVANCED+). As this report also
contains the raw detection rates, and not only the awards, expert users who, for example, do not care
about false alarms can rely on the detection score alone if they wish.
              AWARDS                               PRODUCTS
        (based on detection rates and false alarms)

   ADVANCED+        G DATA, AVIRA, Kaspersky, Sophos, F-Secure, Panda,
                    Bitdefender, BullGuard, eScan, Avast, ESET

   ADVANCED         McAfee*, Fortinet*

   STANDARD         PC Tools*, GFI*, AVG*, Microsoft, Trend Micro*, Webroot*

   TESTED           AhnLab*

   *: these products received lower awards due to false alarms

The awards are based not only on detection rates; false positives found in our set of clean files are
also considered. The table of detection-rate clusters and false-alarm ranges earlier in this report
shows how the awards are given.

A product that succeeds in detecting a high percentage of malicious files, but suffers from false
alarms, is not necessarily better than a product which detects fewer malicious files but generates
fewer false alarms.








Copyright and Disclaimer

This publication is Copyright © 2012 by AV-Comparatives e.V.®. Any use of the results, etc., in whole
or in part, is ONLY permitted with the explicit written agreement of the management board of
AV-Comparatives e.V., prior to any publication. AV-Comparatives e.V. and its testers cannot be held
liable for any damage or loss which might occur as a result of, or in connection with, the use of the
information provided in this paper. We take every possible care to ensure the correctness of the basic
data, but liability for the correctness of the test results cannot be taken by any representative of
AV-Comparatives e.V. We do not give any guarantee of the correctness, completeness, or suitability for
a specific purpose of any of the information/content provided at any given time. No one else involved
in creating, producing or delivering the test results shall be liable for any indirect, special or
consequential damage, or loss of profits, arising out of, or related to, the use of or inability to use
the services provided by the website, test documents or any related data. AV-Comparatives e.V. is a
registered Austrian non-profit organization.

For more information about AV-Comparatives and the testing methodologies, please visit our website.

                                                                      AV-Comparatives e.V. (April 2012)



