AV-TEST Product Review and Certification Report – Q3/2010
During July, August and September 2010 we continuously evaluated 16 security products using their default settings. They were allowed to update themselves at any time and query their in-the-cloud services. We focused on realistic test scenarios and challenged the products against real-world threats. Products had to demonstrate their capabilities using all components and protection layers.
Protection
Protection against malware infections (such as viruses, worms or Trojan horses)
Test (samples used) | Industry average | July | August | September
Protection against 0-day malware attacks from the internet, inclusive of web and e-mail threats (Real-World Testing) (66 samples) | 77% | 76.0% | 88.0% | 79.0%
Blocking of malware on or post execution (Dynamic Detection Testing) (20 samples) | 50% | 30.0% | n/a | n/a
Detection of a representative set of malware discovered in the last 2-3 months (AV-TEST reference set) (1,038,795 samples) | 96% | 96.0% | 94.0% | 89.0%
Detection of widespread malware (according to the WildList) (28,838 samples) | 100% | 100% | 100% | 100%
Protection Score: 3.5/6.0
Repair
Cleaning and repair of a malware-infected computer
Test (samples used) | Industry average | July | August | September
Removal of all active components of widespread malware (according to the WildList) from a computer (20 samples) | 80% | 90.0% | n/a | n/a
Removal of further malicious components and remediation of critical system modifications made by malware (20 samples) | 50% | 60.0% | n/a | n/a
Detection of deliberately hidden active malware (rootkits and stealth malware) (18 samples) | 83% | 94.0% | n/a | n/a
Removal of deliberately hidden active malware (rootkits and stealth malware) (18 samples) | 56% | 44.0% | n/a | n/a
Repair Score: 4.5/6.0
Usability
Impact of the security software on the usability of the whole computer (lower values indicate better results)
Test (samples used) | Industry average | July | August | September
Average slow-down of the computer by the security software in daily use (13 test cases) | 236% | 151 s | n/a | n/a
False detections of legitimate software as malware during a system scan (false positives) (530,462 samples) | 3 | 1 | 0 | 6
False warnings of certain actions during the installation and use of legitimate software (20 samples) | 1 | 0 | n/a | n/a
False blockings of certain actions during the installation and use of legitimate software (20 samples) | 1 | 0 | n/a | n/a
Usability Score: 5.0/6.0
Evaluation based on a point system
All products can achieve a maximum of 6 points in each of the three categories of protection, repair and usability. This means 18 points is the best possible test result.
At 10 points or higher, a product is awarded the AV-TEST seal of approval.
At 17.5 points or higher, AV-TEST also issues the "TOP PRODUCT" award.
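The point system above reduces to simple threshold arithmetic. The following sketch applies the quoted thresholds to this report's category scores; the function name and structure are illustrative, not AV-TEST's own tooling.

```python
def award(protection: float, repair: float, usability: float) -> str:
    """Classify a product under the point system described above:
    each category contributes up to 6.0 points (18.0 maximum)."""
    for score in (protection, repair, usability):
        if not 0.0 <= score <= 6.0:
            raise ValueError("category scores range from 0.0 to 6.0")
    total = protection + repair + usability
    if total >= 17.5:
        return "TOP PRODUCT"        # 17.5 points or higher
    if total >= 10.0:
        return "AV-TEST certified"  # seal-of-approval threshold
    return "not certified"

# The product reviewed here scored 3.5 + 4.5 + 5.0 = 13.0 points:
print(award(3.5, 4.5, 5.0))  # AV-TEST certified
```

With 13.0 of 18 possible points, the product clears the 10-point certification threshold but falls short of the 17.5-point "TOP PRODUCT" mark.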
Score rating scale: outstanding | satisfactory | insufficient