Anti-Virus Comparative




File Detection Test
of Malicious Software
includes false alarm test



Language: English
September 2012

Last Revision: 5th October 2012

www.av-comparatives.org




  Table of Contents

   Tested Products

   Conditions for participation and test methodology

   Tested product versions

   Comments

   Graph of missed samples

   Results

   False positive/alarm test

   Award levels reached in this test

   Copyright and Disclaimer








Tested Products



       AhnLab V3 Internet Security 8.0
       avast! Free Antivirus 7.0
       AVG Anti-Virus 2012
       AVIRA Antivirus Premium 2012
       BitDefender Antivirus Plus 2013
       BullGuard Antivirus 12
       eScan Anti-Virus 11.0
       ESET NOD32 Antivirus 5.2
       F-Secure Anti-Virus 2012
       Fortinet FortiClient Lite 4.3
       G DATA AntiVirus 2013
       GFI Vipre Antivirus 2012
       Kaspersky Anti-Virus 2013
       McAfee AntiVirus Plus 2012
       Microsoft Security Essentials 4.0
       Panda Cloud Free Antivirus 2.0.0
       PC Tools Spyware Doctor with AV 9.0
       Sophos Anti-Virus 10.0
       Trend Micro Titanium AntiVirus+ 2013
       Webroot SecureAnywhere AV 8.0








Conditions for participation and test methodology

The conditions for participation in our tests are listed in the methodology document at
http://www.av-comparatives.org/seiten/ergebnisse/methodology.pdf. Before proceeding with this
report, readers are advised to first read the above-mentioned document. Participation is limited
to no more than 20 well-known international anti-virus products whose vendors agreed to be tested
and included in the public test series of 2012.

Tested Product Versions

The malware set was frozen on 21st August 2012 and consisted of 240,859 sample variants. The
products were updated on 28th August 2012. The following twenty up-to-date products were
included in this public test (the most current versions available at the time of testing):

       AhnLab V3 Internet Security 8.0.6.13
       avast! Free Antivirus 7.0.1466
       AVG Anti-Virus 2012.0.2197
       AVIRA Antivirus Premium 12.0.0.1183
       Bitdefender Anti-Virus+ 16.18.0.1406
       BullGuard Antivirus 12.0.231
       eScan Anti-Virus 11.0.1139.1225
       ESET NOD32 Antivirus 5.2.9.12
       F-Secure Anti-Virus 12.56.100
       Fortinet FortiClient Lite 4.3.5.472
       G DATA AntiVirus 23.0.3.2
       GFI Vipre Antivirus 5.0.5162
       Kaspersky Anti-Virus 13.0.1.4190
       McAfee AntiVirus Plus 11.6.385
       Microsoft Security Essentials 4.0.1526.0
       Panda Cloud Free Antivirus 2.0.0
       PC Tools Spyware Doctor with Antivirus 9.0.0.2308
       Sophos Anti-Virus 10.0.8
       Trend Micro Titanium AntiVirus Plus 6.0.1215
       Webroot SecureAnywhere AV 8.0.1.233

Please try the products1 on your own system before making a purchase decision based on these tests.
There are also other program features and important factors (e.g. price, ease of
use/management, compatibility, graphical user interface, language, support, etc.) to consider.
Although very important, the file detection rate of a product is only one aspect of a complete anti-
virus product. AV-Comparatives also provides a whole-product dynamic "real-world" protection test, as
well as other test reports which cover different aspects/features of the products.

We invite users to look not only at this type of test but also at our other tests, to make them aware
of the other types of tests and reports that we have already been providing for several years, such as
our Whole-Product "Real-World" Protection Test. A good file detection rate is still one of the most
important, deterministic and reliable features of an anti-virus product. Additionally, most products
provide at least some functionality to block (or at least warn about the possibility of)
malicious actions, e.g. during the execution of malware, when all other on-access and on-demand
detection/protection mechanisms have failed (these protection features are evaluated in other types of
tests that we provide on our website).


1
  Information about additional third-party engines/signatures used inside the products: BullGuard, eScan and
F-Secure are based on the Bitdefender engine. G DATA is based on the Avast and Bitdefender engines. PC Tools
uses the signatures of Symantec.





Most products run with their highest settings by default. Certain products switch to highest settings
automatically when malware is found, which makes it impossible to test against various malware with
real "default" settings. In order to get comparable results, we set the few remaining products to their
highest settings, or leave them at lower settings, in accordance with the respective vendors. We
kindly ask vendors to provide stronger settings by default, i.e. to set their default settings to the
highest levels of detection, especially for scheduled scans or scans initiated by the user. This is
usually already the case for on-access and/or on-execution scans. We allow the remaining vendors
(which do not use the highest settings in on-demand scans) to choose to be tested with higher
settings, as they e.g. already use higher settings by default for on-access/on-execution scans; this
brings the results of the file detection test closer to real-world usage. We ask vendors to remove
paranoid settings from the user interface which are too high to ever be of any benefit to common
users. Below are some notes about the settings used ("scan all files", "scan archives", etc. are enabled):

AVG, AVIRA: asked us not to enable/count the informational warnings for packers as detections.
Accordingly, we did not count them as detections (neither in the malware set nor in the clean set).

F-Secure, Sophos: asked to be tested and awarded based on their default settings (i.e. without using
their advanced heuristics / suspicious-detection settings).

AVIRA, Kaspersky: asked to be tested with heuristics set to high/advanced.

Several products make use of cloud technologies, which require an active internet connection. Our
tests are performed using an active internet connection. Users should be aware that detection rates
may in some cases be drastically lower if the scan is performed while offline (or when the cloud is
unreachable for various reasons). The cloud should be considered an additional benefit/feature to
increase detection rates (as well as response times and false alarm suppression), not a full
replacement for local offline detections. Vendors should make sure that users are warned if
connectivity to the cloud is lost, e.g. during a scan, as this may considerably affect the protection
provided and e.g. render an initiated scan useless.

The test set was built by consulting telemetry data, in an attempt to include prevalent samples from
recent weeks/months which are (or were) hitting users in the field. We applied a clustering method
to classify similar files, keeping one sample (the most prevalent one) per variant. This allows us to
evaluate prevalent, similar samples while reducing the size of the set without introducing bias; each
miss is therefore intended to represent one missed group/variant of files. As we used even more
recent and more prevalent malware samples than previously, the test set is smaller than before (and
its samples should be easier for AV products to detect).

As usual, all test results were peer-verified before publishing. We also include crosschecks, e.g.
checking cloud connectivity from different IPs, service providers and countries, using various product
licenses, changing the test environment and related variables, testing over a period of time, etc.
Furthermore, we sometimes provide random sets of samples to other trusted independent labs to let
them check whether the products behave the same and whether they get the same results as we do.







The malware detection rates are grouped by the testers after looking at the clusters built with the
hierarchical clustering method. By using clusters, there are no fixed thresholds to reach, as the
thresholds change based on the results. The testers may group the clusters rationally rather than
relying solely on the clusters, to avoid a situation in which, if e.g. all products were to score badly in
the future, they would still receive high rankings. As we also reduced the clean set, we updated the
awarding system as well, because some products manage to produce far too many false alarms even in
the reduced clean set. We are still evaluating whether to apply clusters to the FP ranges as well, or to
set next year's threshold for "Many" at 11 instead of 16.

                                              Detection Rate Clusters/Groups
                                  (given by the testers after consulting statistical methods)

                                                   4            3            2            1

   Very few (0-2 FPs)
   Few (3-15 FPs)                               TESTED      STANDARD     ADVANCED     ADVANCED+

   Many (16-50 FPs)                             TESTED       TESTED      STANDARD     ADVANCED

   Very many (51-100 FPs)                       TESTED       TESTED       TESTED      STANDARD

   Crazy many (over 100 FPs)                    TESTED       TESTED       TESTED       TESTED
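As a reading aid, the matrix above can be expressed as a simple lookup. The sketch below is not official scoring code; it merely encodes the printed table and assumes the detection-rate cluster (1 = best, 4 = worst) has already been assigned by the testers.

```python
# Sketch of the award matrix above (a reading aid, not official scoring code).
AWARD_LEVELS = ["ADVANCED+", "ADVANCED", "STANDARD", "TESTED"]

def fp_band(false_positives: int) -> int:
    """0 = very few/few (0-15), 1 = many (16-50), 2 = very many (51-100), 3 = crazy many (>100)."""
    if false_positives <= 15:
        return 0
    if false_positives <= 50:
        return 1
    if false_positives <= 100:
        return 2
    return 3

def award(detection_cluster: int, false_positives: int) -> str:
    """detection_cluster: 1 (best) .. 4 (worst), as assigned by the testers."""
    # Each step down in FP band lowers the award by one level,
    # capped at the lowest level, TESTED.
    level = (detection_cluster - 1) + fp_band(false_positives)
    return AWARD_LEVELS[min(level, len(AWARD_LEVELS) - 1)]

print(award(1, 5))    # cluster 1, few FPs   -> ADVANCED+
print(award(1, 23))   # cluster 1, many FPs  -> ADVANCED
print(award(3, 40))   # cluster 3, many FPs  -> TESTED
```

Note that the "very few" and "few" FP ranges map to the same row of the matrix, which is why they share one band in this sketch.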



Graph of missed samples (lower is better)




Even though we deliver various tests and show different aspects of anti-virus software, users are
advised to evaluate the software themselves and form their own opinion of it. Test data and reviews
merely provide guidance on aspects that users cannot evaluate by themselves. We suggest and
encourage readers to also research other independent test results provided by various well-known and
established independent testing organizations, in order to get a better overview of the detection
and protection capabilities of the various products across different test scenarios and test sets.







Results

Please also consider the false alarm rates when looking at the file detection rates below2.

Total detection rates (clustered in groups):
    1.   G DATA                          99.9%
    2.   AVIRA                           99.8%
    3.   Panda, Trend Micro              99.6%
    4.   F-Secure                        99.3%
    5.   Kaspersky, BitDefender,         99.2%
         BullGuard, Fortinet, eScan
    6.   McAfee                          98.8%
    7.   Sophos                          98.7%
    8.   Avast                           98.6%
    9.   GFI Vipre                       98.5%
    10.  AVG                             98.0%
    11.  ESET                            97.4%
    12.  AhnLab                          95.6%
    13.  Microsoft                       94.9%
    14.  PC Tools                        94.4%
    15.  Webroot                         below 80%3
The test set contained around 240,000 recent/prevalent samples from recent weeks/months.

Hierarchical Cluster Analysis



This dendrogram shows the results of the cluster analysis4. It indicates at what level of similarity the
clusters are joined. The red dashed line marks the level of similarity used; each intersection with it
indicates a group (in this case, four groups).




2
  We estimate the remaining error margin on the final percentages to be below 0.2%.
3
  The results and misses were confirmed with several tests and also by the vendor. Even two weeks after the
test date, the detection rate was still lower than 80%. A representative set of misses was also sent to an
independent party for peer review.
4
  For more information about cluster analysis, see e.g. this easy-to-understand tutorial:
http://strata.uga.edu/software/pdf/clusterTutorial.pdf
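
For readers who want to reproduce this kind of grouping on their own data, the sketch below uses SciPy's hierarchical clustering on the (rounded) detection rates from the results table. It is illustrative only: the published four groups were assigned by the testers, and the Webroot value used here is a placeholder, since the report only states "below 80%".

```python
# Illustrative hierarchical cluster analysis of file detection rates.
# Rates are the rounded figures from the results table; Webroot's value is a
# placeholder (the report only states "below 80%"). The published grouping was
# done by the testers and need not coincide exactly with this automated cut.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster, dendrogram
import matplotlib.pyplot as plt

products = ["G DATA", "AVIRA", "Panda", "Trend Micro", "F-Secure", "Kaspersky",
            "BitDefender", "BullGuard", "Fortinet", "eScan", "McAfee", "Sophos",
            "Avast", "GFI Vipre", "AVG", "ESET", "AhnLab", "Microsoft",
            "PC Tools", "Webroot"]
rates = np.array([99.9, 99.8, 99.6, 99.6, 99.3, 99.2,
                  99.2, 99.2, 99.2, 99.2, 98.8, 98.7,
                  98.6, 98.5, 98.0, 97.4, 95.6, 94.9,
                  94.4, 79.9]).reshape(-1, 1)   # 79.9 = placeholder for "below 80%"

Z = linkage(rates, method="average")                 # build the cluster hierarchy
groups = fcluster(Z, t=4, criterion="maxclust")      # cut it into four groups

for name, g in sorted(zip(products, groups), key=lambda x: x[1]):
    print(f"group {g}: {name}")

dendrogram(Z, labels=products)                       # plot of the kind shown above
plt.tight_layout()
plt.show()
```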






False positive/alarm test
In order to better evaluate the quality of the file detection capabilities (distinguishing good files from
malicious files) of anti-virus products, we provide a false alarm test. False alarms can sometimes cause
as much trouble as a real infection. Please consider the false alarm rate when looking at the
detection rates, as a product which is prone to causing false alarms achieves higher detection scores
more easily. All discovered false alarms were reported/sent to the respective anti-virus vendors and
should by now have been fixed.

False Positive Results

Number of false alarms found in our set of clean files (lower is better):
     1.   Microsoft                                       0         very few FPs
     2.   ESET                                            4         few FPs
     3.   Kaspersky                                       5
     4.   Trend Micro                                     7
     5.   AVIRA, BitDefender, BullGuard                  10
     6.   Avast, Fortinet                                11
     7.   McAfee                                         12
     8.   eScan                                          14
     9.   F-Secure, PC Tools                             15
     10.  Sophos                                         19         many FPs
     11.  AhnLab, Panda                                  20
     12.  G DATA                                         23
     13.  GFI Vipre                                      34
     14.  AVG                                            36
     15.  Webroot                                       210         crazy many FPs

Details about the discovered false alarms (including their assumed prevalence) can be seen in a
separate report available at: http://www.av-comparatives.org/images/docs/avc_fps_201209_en.pdf








Award levels reached in this test

AV-Comparatives provides a ranking award. As this report also contains the raw detection rates, and
not only the awards, expert users who e.g. do not care about false alarms can rely on that score alone
if they wish.

                                           AWARDS                             PRODUCTS
                             (based on detection rates and false alarms)

                                         ADVANCED+                           AVIRA
                                                                             Trend Micro
                                                                             F-Secure
                                                                             Kaspersky
                                                                             BitDefender
                                                                             BullGuard
                                                                             eScan
                                                                             McAfee
                                                                             Avast

                                         ADVANCED                            G DATA*
                                                                             Panda*
                                                                             Fortinet*
                                                                             Sophos*
                                                                             GFI Vipre*
                                                                             ESET

                                         STANDARD                            AVG*
                                                                             Microsoft
                                                                             PC Tools

                                         TESTED                              AhnLab*
                                                                             Webroot

                     *: these products received lower awards due to false alarms

The awards are based not only on detection rates; false positives found in our set of clean files are
also considered. The award matrix in the Comments section of this report shows how the awards are
given.

A product that is successful at detecting a high percentage of malicious files but suffers from false
alarms is not necessarily better than a product which detects fewer malicious files but generates
fewer false alarms.







Copyright and Disclaimer

This publication is Copyright © 2012 by AV-Comparatives e.V. ®. Any use of the results, etc. in whole
or in part, is ONLY permitted after the explicit written agreement of the management board of AV-
Comparatives e.V., prior to any publication. AV-Comparatives e.V. and its testers cannot be held liable
for any damage or loss which might occur as a result of, or in connection with, the use of the
information provided in this paper. We take every possible care to ensure the correctness of the basic
data, but a liability for the correctness of the test results cannot be taken by any representative of
AV-Comparatives e.V. We do not give any guarantee of the correctness, completeness, or suitability
for a specific purpose of any of the information/content provided at any given time. No one else
involved in creating, producing or delivering test results shall be liable for any indirect, special or
consequential damage, or loss of profits, arising out of, or related to, the use or inability to use, the
services provided by the website, test documents or any related data. AV-Comparatives e.V. is a
registered Austrian Non-Profit-Organization.

For more information about AV-Comparatives and the testing methodologies, please visit our website.

                                                                         AV-Comparatives e.V. (October 2012)



