AV-TEST Product Review and Certification Report – Q2/2011
During April, May and June 2011 we continuously evaluated 22 security products using their default settings. We always used the most current publicly-available version of all products for the testing. They were allowed to update themselves at any time and query their in-the-cloud services. We focused on realistic test scenarios and challenged the products against real-world threats. Products had to demonstrate their capabilities using all components and protection layers.

Security Essentials
| | |
|---|---|
| Version | 2.0 |
| Platform | Windows XP (SP3, 32-bit) |
| Report | 112233 |
| Date | Q2/2011 |
Protection
Protection against malware infections (such as viruses, worms or Trojan horses)

| | Industry average | April | May | June |
|---|---|---|---|---|
| Protection against 0-day malware attacks from the internet, inclusive of web and e-mail threats (Real-World Testing); 108 samples used | 81% | 44.0% | 63.0% | 59.0% |
| Blocking of malware on or post execution (Dynamic Detection Testing); 34 samples used | 65% | 32.0% | | |
| Detection of a representative set of malware discovered in the last 2-3 months (AV-TEST reference set); 424,860 samples used | 98% | 96.0% | 99.0% | 98.0% |
| Detection of widespread malware (according to the WildList); 10,224 samples used | 100% | 100% | 100% | 100% |
| **Protection Score** | 2.5 / 6.0 | | | |
Repair
Cleaning and repair of a malware-infected computer
| | Industry average | April | May | June |
|---|---|---|---|---|
| Removal of all active components of widespread malware (according to the WildList) from a computer; 23 samples used | 96% | 91.0% | | |
| Removal of further malicious components and remediation of critical system modifications made by malware; 23 samples used | 70% | 87.0% | | |
| Detection of deliberately hidden active malware (rootkits and stealth malware); 18 samples used | 78% | 89.0% | | |
| Removal of deliberately hidden active malware (rootkits and stealth malware); 18 samples used | 44% | 50.0% | | |
| **Repair Score** | 4.5 / 6.0 | | | |
Usability
Impact of the security software on the usability of the whole computer (lower values indicate better results)

| | Industry average | April | May | June |
|---|---|---|---|---|
| Average slow-down of the computer by the security software in daily use; 13 test cases | 140% | 168s | | |
| False detections of legitimate software as malware during a system scan (false positives); 699,760 samples used | 9 | 0 | 0 | 0 |
| False warnings of certain actions during the installation and use of legitimate software; 22 samples used | 1 | 0 | | |
| False blockings of certain actions during the installation and use of legitimate software; 22 samples used | 0 | 0 | | |
| **Usability Score** | 5.0 / 6.0 | | | |
Evaluation based on a point system
All products can achieve a maximum of 6 points in each of the three categories of protection, repair and usability. This means 18 points is the best possible test result.
At 10 points or higher, a product is awarded the AV-TEST seal of approval.
At 17.5 points or higher, AV-TEST also issues the "TOP PRODUCT" award.
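The point system above can be sketched as a short calculation. This is an illustrative example, not part of the AV-TEST methodology itself; the function name is my own, the thresholds (10 for the seal, 17.5 for "TOP PRODUCT") and the category scores (2.5, 4.5, 5.0) are taken from this report.

```python
def evaluate(protection, repair, usability):
    """Sum the three category scores (max 6.0 each, 18.0 total)
    and apply the award thresholds described in the report."""
    total = protection + repair + usability
    certified = total >= 10.0    # AV-TEST seal of approval
    top_product = total >= 17.5  # "TOP PRODUCT" award
    return total, certified, top_product

# Scores from this report: Protection 2.5, Repair 4.5, Usability 5.0
total, certified, top_product = evaluate(2.5, 4.5, 5.0)
print(total, certified, top_product)  # 12.0 True False
```

With 12.0 of 18 points, the product clears the 10-point certification threshold but not the 17.5-point "TOP PRODUCT" bar.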