AV-TEST Product Review Report – Q1/2011

During January, February, and March 2011 we continuously evaluated 22 security products using their default settings. We always used the most current publicly available version of each product, and the products were allowed to update themselves at any time and to query their in-the-cloud services. We focused on realistic test scenarios and challenged the products against real-world threats. Products had to demonstrate their capabilities using all of their components and protection layers.

Total Protection

Version: 2011
Platform: Windows 7 (RTM, 32-bit)
Report: 110925
Date: Q1/2011

Protection

Protection against malware infections
(such as viruses, worms or Trojan horses)

Protection against 0-day malware attacks from the internet, inclusive of web and e-mail threats (Real-World Testing)
(107 samples used) Industry average: 84% | January: 70.0% | February: 80.0% | March: 72.0%

Blocking of malware on or post execution (Dynamic Detection Testing)
(29 samples used) Industry average: 62% | Result: 28.0%

Detection of a representative set of malware discovered in the last 2-3 months (AV-TEST reference set)
(522,634 samples used) Industry average: 97% | January: 93.0% | February: 100% | March: 100%

Detection of widespread malware (according to the WildList)
(24,106 samples used) Industry average: 100% | January: 100% | February: 100% | March: 100%

Protection Score: 3.0 / 6.0

Repair

Cleaning and repair of a malware-infected computer

Removal of all active components of widespread malware (according to the WildList) from a computer
(21 samples used) Industry average: 86% | Result: 76.0%

Removal of further malicious components and remediation of critical system modifications made by malware
(21 samples used) Industry average: 48% | Result: 24.0%

Detection of deliberately hidden active malware (rootkits and stealth malware)
(14 samples used) Industry average: 79% | Result: 57.0%

Removal of deliberately hidden active malware (rootkits and stealth malware)
(14 samples used) Industry average: 57% | Result: 43.0%

Repair Score: 2.0 / 6.0

Usability

Impact of the security software on the usability of the whole computer
(lower values indicate better results)

Average slow-down of the computer by the security software in daily use
(13 test cases) Industry average: 171% | Result: 128 s

False detections of legitimate software as malware during a system scan (false positives)
(841,767 samples used) Industry average: 7 | January: 19 | February: 7 | March: 5

False warnings of certain actions during the installation and use of legitimate software
(20 samples used) Industry average: 1 | Result: 3

False blockings of certain actions during the installation and use of legitimate software
(20 samples used) Industry average: 0 | Result: 0

Usability Score: 3.5 / 6.0
