AV-TEST Product Review Report – Q4/2010

During October, November and December 2010 we continuously evaluated 22 security products using their default settings. We always used the most current publicly available version of each product for testing. The products were allowed to update themselves at any time and to query their in-the-cloud services. We focused on realistic test scenarios and challenged the products against real-world threats. Products had to demonstrate their capabilities using all of their components and protection layers.

Internet Security Suite

Version: 2011
Platform: Windows Vista (SP2, 32 bit)
Report: 104889
Date: Q4/2010

Protection

Protection against malware infections
(such as viruses, worms or Trojan horses)

Protection against 0-day malware attacks from the internet, inclusive of web and e-mail threats (Real-World Testing)
  Samples used: 77 | Industry average: 82% | October: 46.0% | November: 45.0% | December: 39.0%
Blocking of malware on or post execution (Dynamic Detection Testing)
  Samples used: 20 | Industry average: 55% | Result: 20.0%
Detection of a representative set of malware discovered in the last 2-3 months (AV-TEST reference set)
  Samples used: 642,693 | Industry average: 96% | October: 89.0% | November: 86.0% | December: 91.0%
Detection of widespread malware (according to the WildList)
  Samples used: 30,532 | Industry average: 100% | October: 100% | November: 100% | December: 100%
Protection Score: 2.0/6.0
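
To put the monthly figures above in context, the short Python sketch below recomputes a simple, unweighted quarterly mean per test and compares it with the published industry average. This is purely an illustration of the numbers printed in the table; the report does not disclose how AV-TEST derives the Protection Score itself.

```python
# Illustrative only: recomputes simple quarterly means from the Protection
# figures printed above. This is NOT AV-TEST's own scoring formula.

protection_results = {
    # test: (industry average in %, monthly/quarterly product results in %)
    "0-day malware attacks (Real-World Testing)": (82.0, [46.0, 45.0, 39.0]),
    "Blocking on/post execution (Dynamic Detection Testing)": (55.0, [20.0]),
    "Recent malware (AV-TEST reference set)": (96.0, [89.0, 86.0, 91.0]),
    "Widespread malware (WildList)": (100.0, [100.0, 100.0, 100.0]),
}

for test, (industry_avg, results) in protection_results.items():
    mean = sum(results) / len(results)
    print(f"{test}: product {mean:.1f}% vs. industry {industry_avg:.1f}% "
          f"({mean - industry_avg:+.1f} percentage points)")
```

For the Real-World test, for example, the unweighted mean of 46.0%, 45.0% and 39.0% is 43.3%, roughly 39 percentage points below the industry average of 82%, which is consistent with the low Protection Score of 2.0/6.0.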

Repair

Cleaning and repair of a malware-infected computer

Removal of all active components of widespread malware (according to the WildList) from a computer
  Samples used: 22 | Industry average: 82% | Result: 68.0%
Removal of further malicious components and remediation of critical system modifications made by malware
  Samples used: 22 | Industry average: 45% | Result: 36.0%
Detection of deliberately hidden active malware (Rootkits and stealth malware)
  Samples used: 10 | Industry average: 100% | Result: 90.0%
Removal of deliberately hidden active malware (Rootkits and stealth malware)
  Samples used: 10 | Industry average: 60% | Result: 30.0%
Repair Score: 2.5/6.0
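
Because the repair tests use small sample sets, the percentages above correspond to a handful of actual samples. The hedged sketch below converts them back into approximate counts, assuming each percentage is simply successes divided by samples used (an assumption; the report only states the percentages).

```python
# Illustrative only: converts the repair percentages above into approximate
# sample counts, assuming result % = successes / samples used.

repair_results = [
    # (test, samples used, industry average %, product result %)
    ("Removal of active WildList malware",            22, 82.0, 68.0),
    ("Removal of further components / remediation",   22, 45.0, 36.0),
    ("Detection of hidden active malware (rootkits)", 10, 100.0, 90.0),
    ("Removal of hidden active malware (rootkits)",   10, 60.0, 30.0),
]

for test, samples, industry, product in repair_results:
    handled = round(product / 100 * samples)
    print(f"{test}: about {handled} of {samples} samples ({product:.0f}% "
          f"vs. industry {industry:.0f}%)")
```

On this reading, roughly 3 of the 10 hidden-malware samples were fully removed, versus 9 of 10 detected.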

Usability

Impact of the security software on the usability of the whole computer
(lower values indicate better results)

Average slow-down of the computer by the security software in daily use
  Number of test cases: 13 | Industry average: 221% | Result: 369 s
False detections of legitimate software as malware during a system scan (false positives)
  Samples used: 533,690 | Industry average: 7 | October: 10 | November: 15 | December: 9
False warnings of certain actions during the installation and use of legitimate software
  Samples used: 20 | Industry average: 1 | Result: 2
False blockings of certain actions during the installation and use of legitimate software
  Samples used: 20 | Industry average: 1 | Result: 1
Usability Score: 2.0/6.0
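
Raw false-positive counts are easier to judge when set against the size of the clean-software set that was scanned. The sketch below (not part of the report) expresses the counts above as rates relative to the 533,690 samples used in the system-scan test.

```python
# Illustrative only: expresses the false-positive counts above as rates
# relative to the 533,690 clean samples scanned during the system-scan test.

CLEAN_SAMPLES = 533_690

false_positives = {
    "Industry average": 7,
    "October": 10,
    "November": 15,
    "December": 9,
}

for label, count in false_positives.items():
    rate = count / CLEAN_SAMPLES * 100
    print(f"{label}: {count} false positives ({rate:.4f}% of scanned samples)")
```

Even the worst month (15 false positives in November) corresponds to well under 0.01% of the scanned set, although that is still roughly double the industry average of 7.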

All tested manufacturers

AhnLab
Avast
AVG
Avira
Baidu
Bitdefender
BullGuard
Check Point
Comodo
Cylance
Emsisoft
ESET
F-Secure
Fortinet
G Data
Heimdal Security
K7 Computing
Kaspersky
Lavasoft
Malwarebytes
McAfee
Microsoft
MicroWorld
Norton
Panda Security
PC Matic
PC Tools
Protected.net
Qihoo 360
Quick Heal
Sophos
Tencent (PC)
ThreatTrack
Total Defense
Trend Micro
VIPRE Security
Webroot