AV-TEST Product Review and Certification Report – Q3/2010

During July, August and September 2010 we continuously evaluated 16 security products using their default settings. They were allowed to update themselves at any time and query their in-the-cloud services. We focused on realistic test scenarios and challenged the products against real-world threats. Products had to demonstrate their capabilities using all components and protection layers.

AV-TEST certified

Internet Security

Version: 2010 & 2011
Platform: Windows XP (SP3, 32 bit)
Report: 103581
Date: Q3/2010

Protection

Protection against malware infections
(such as viruses, worms or Trojan horses)

Protection against 0-day malware attacks from the internet, including web and e-mail threats (Real-World Testing), 66 samples used:
  Industry average 77% | July 95.0% | August 94.0% | September 86.0%
Blocking of malware on or after execution (Dynamic Detection Testing), 20 samples used:
  Industry average 50% | Result 100%
Detection of a representative set of malware discovered in the last 2-3 months (AV-TEST reference set), 1,038,795 samples used:
  Industry average 96% | July 98.0% | August 96.0% | September 98.0%
Detection of widespread malware (according to the WildList), 28,838 samples used:
  Industry average 100% | July 100% | August 100% | September 100%

Protection Score: 5.5/6.0

Repair

Cleaning and repair of a malware-infected computer

Removal of all active components of widespread malware (according to the WildList) from a computer, 20 samples used:
  Industry average 80% | Result 90.0%
Removal of further malicious components and remediation of critical system modifications made by malware, 20 samples used:
  Industry average 50% | Result 85.0%
Detection of deliberately hidden active malware (rootkits and stealth malware), 18 samples used:
  Industry average 83% | Result 89.0%
Removal of deliberately hidden active malware (rootkits and stealth malware), 18 samples used:
  Industry average 56% | Result 83.0%

Repair Score: 5.5/6.0

Usability

Impact of the security software on the usability of the whole computer
(lower values indicate better results)

Average slow-down of the computer by the security software in daily use, number of test cases: 13:
  Industry average 236 s | Result 309 s
False detections of legitimate software as malware during a system scan (false positives), 530,462 samples used:
  Industry average 3 | July 1 | August 1 | September 3
False warnings of certain actions during the installation and use of legitimate software, 20 samples used:
  Industry average 1 | Result 0
False blockings of certain actions during the installation and use of legitimate software, 20 samples used:
  Industry average 1 | Result 0

Usability Score: 4.5/6.0
