AV-TEST Product Review and Certification Report – Q1/2011

During January, February and March 2011 we continuously evaluated 22 security products using their default settings. We always used the most current publicly available version of each product, and the products were allowed to update themselves at any time and to query their in-the-cloud services. We focused on realistic test scenarios and challenged the products against real-world threats. Products had to demonstrate their capabilities using all of their components and protection layers.

Certified

Internet Security

Version: 10.0
Platform: Windows 7 (RTM, 32 bit)
Report: 110971
Date: Q1/2011

Protection

Protection against malware infections
(such as viruses, worms or Trojan horses)

Protection against 0-day malware attacks from the internet, inclusive of web and e-mail threats (Real-World Testing); 107 samples used
  Industry average: 84%   January: 74.0%   February: 77.0%   March: 72.0%

Blocking of malware on or post execution (Dynamic Detection Testing); 29 samples used
  Industry average: 62%   Result: 66.0%

Detection of a representative set of malware discovered in the last 2-3 months (AV-TEST reference set); 522,634 samples used
  Industry average: 97%   January: 99.0%   February: 100%   March: 100%

Detection of widespread malware (according to the WildList); 24,106 samples used
  Industry average: 100%   January: 100%   February: 100%   March: 100%

Protection Score: 4.0/6.0

Repair

Cleaning and repair of a malware-infected computer

Removal of all active components of widespread malware (according to the WildList) from a computer; 21 samples used
  Industry average: 86%   Result: 81.0%

Removal of further malicious components and remediation of critical system modifications made by malware; 21 samples used
  Industry average: 48%   Result: 57.0%

Detection of deliberately hidden active malware (Rootkits and stealth malware); 14 samples used
  Industry average: 79%   Result: 86.0%

Removal of deliberately hidden active malware (Rootkits and stealth malware); 14 samples used
  Industry average: 57%   Result: 43.0%

Repair Score: 3.5/6.0

Usability

Impact of the security software on the usability of the whole computer
(lower values indicate better results)

Average slow-down of the computer by the security software in daily use; number of test cases: 13
  Industry average: 171%   Result: 138 s

False detections of legitimate software as malware during a system scan (false positives); 841,767 samples used
  Industry average: 7   January: 2   February: 6   March: 0

False warnings of certain actions during the installation and use of legitimate software; 20 samples used
  Industry average: 1   Result: 7

False blockings of certain actions during the installation and use of legitimate software; 20 samples used
  Industry average: 0   Result: 1

Usability Score: 4.0/6.0
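
The three category scores, each out of a possible 6.0 points, add up to the product's overall result. The total below is simply the sum of the scores listed above, shown as a reference calculation under the assumption of the usual 18-point AV-TEST scale:

  4.0 (Protection) + 3.5 (Repair) + 4.0 (Usability) = 11.5 of 18.0 points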
