AV-TEST Product Review and Certification Report – Q2/2011

During April, May and June 2011 we continuously evaluated 22 security products using their default settings. We always used the most current publicly available version of each product for the testing. The products were allowed to update themselves at any time and to query their in-the-cloud services. We focused on realistic test scenarios and challenged the products against real-world threats. Products had to demonstrate their capabilities using all components and protection layers.

certified

Smart Security
Version: 4.2
Platform: Windows XP (SP3, 32 bit)
Report: 112256
Date: Q2/2011

Protection

Protection against malware infections
(such as viruses, worms or Trojan horses)

Protection against 0-day malware attacks from the internet, inclusive of web and e-mail threats (Real-World Testing), 108 samples used
  Industry average: 81% | April: 87.0% | May: 90.0% | June: 83.0%
Blocking of malware on or post execution (Dynamic Detection Testing), 34 samples used
  Industry average: 65% | Result: 53.0%
Detection of a representative set of malware discovered in the last 2-3 months (AV-TEST reference set), 424,860 samples used
  Industry average: 98% | April: 98.0% | May: 98.0% | June: 98.0%
Detection of widespread malware (according to the WildList), 10,224 samples used
  Industry average: 100% | April: 100% | May: 100% | June: 100%
Protection Score 4.0/6.0

Repair

Cleaning and repair of a malware-infected computer

Removal of all active components of widespread malware (according to the WildList) from a computer, 23 samples used
  Industry average: 96% | Result: 96.0%
Removal of further malicious components and remediation of critical system modifications made by malware, 23 samples used
  Industry average: 70% | Result: 70.0%
Detection of deliberately hidden active malware (rootkits and stealth malware), 18 samples used
  Industry average: 78% | Result: 67.0%
Removal of deliberately hidden active malware (rootkits and stealth malware), 18 samples used
  Industry average: 44% | Result: 17.0%
Repair Score 3.5/6.0

Usability

Impact of the security software on the usability of the whole computer
(lower values indicate better results)

Average slow-down of the computer by the security software in daily use, 13 test cases
  Industry average: 140% | Result: 72 s
False detections of legitimate software as malware during a system scan (false positives), 699,760 samples used
  Industry average: 9 | April: 5 | May: 3 | June: 0
False warnings of certain actions during the installation and use of legitimate software, 22 samples used
  Industry average: 1 | Result: 0
False blockings of certain actions during the installation and use of legitimate software, 22 samples used
  Industry average: 0 | Result: 0
Usability Score 5.5/6.0

Evaluation based on a point system

All products can achieve a maximum of 6 points in each of the three categories: protection, repair and usability. This means 18 points is the best possible test result.
At 10 points or higher, a product is awarded the AV-TEST seal of approval.
At 17.5 points or higher, AV-TEST also issues the "TOP PRODUCT" award.
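The point system above reduces to simple arithmetic; as a minimal sketch, the following Python function (the name and structure are illustrative, but the thresholds of 10 and 17.5 points are taken from this report) sums the three category scores and applies the two award thresholds:

```python
def evaluate(protection: float, repair: float, usability: float):
    """Sum the three category scores (max 6.0 each, 18.0 total) and
    apply the award thresholds stated in the report."""
    total = protection + repair + usability
    certified = total >= 10.0       # AV-TEST seal of approval
    top_product = total >= 17.5     # additional "TOP PRODUCT" award
    return total, certified, top_product

# Scores from this report: Protection 4.0, Repair 3.5, Usability 5.5
total, certified, top = evaluate(4.0, 3.5, 5.5)
print(total, certified, top)  # 13.0 True False
```

With 13.0 of 18 points, the product reviewed here clears the certification threshold but not the "TOP PRODUCT" threshold.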


Rating scale: outstanding / satisfactory / insufficient
