AV-TEST Product Review and Certification Report – Q2/2010
During April, May and June 2010 we continuously evaluated 17 security products using their default settings. They were allowed to update themselves at any time and query their in-the-cloud services. We focused on realistic test scenarios and challenged the products against real-world threats. Products had to demonstrate their capabilities using all components and protection layers.
Platform: Windows 7 (RTM, 32-bit)
Protection against malware infections (such as viruses, worms or Trojan horses)
| Test | Samples used | Reported results |
|---|---|---|
| Protection against 0-day malware attacks from the internet, including web and e-mail threats (real-world testing) | 59 | 75% / 71.0% / 95.0% / 83.0% |
| Blocking of malware on or after execution (dynamic detection testing) | 16 | 63% / 88.0% |
| Detection of a representative set of malware discovered in the last 2-3 months (AV-TEST reference set) | 1,359,862 | 95% / 96.0% / 98.0% / 90.0% |
| Detection of widespread malware (according to the WildList) | 26,964 | 100% / 100% / 100% / 100% |
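The detection percentages in the table are simply the share of samples a product detected out of the total tested. A minimal sketch of that arithmetic, rounded to one decimal to match the report's presentation (the detected count below is hypothetical, chosen only to reproduce the 95% reference-set figure):

```python
def detection_rate(detected: int, total: int) -> float:
    """Share of samples detected, as a percentage rounded to one decimal."""
    return round(100 * detected / total, 1)

# Hypothetical example: roughly 1.29 million of the 1,359,862
# reference-set samples detected corresponds to the reported 95%.
print(detection_rate(1_291_869, 1_359_862))  # → 95.0
```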
Cleaning and repair of a malware-infected computer
| Test | Samples used | Reported results |
|---|---|---|
| Removal of all active components of widespread malware (according to the WildList) from a computer | 20 | 90% / 85.0% |
| Removal of further malicious components and remediation of critical system modifications made by malware | 20 | 50% / 60.0% |
| Detection of deliberately hidden active malware (rootkits and stealth malware) | 7 | 100% / 100% |
| Removal of deliberately hidden active malware (rootkits and stealth malware) | 7 | 71% / 100% |
Impact of the security software on the usability of the whole computer (lower values indicate better results)
| Test | Test cases / samples | Reported results |
|---|---|---|
| Average slow-down of the computer by the security software in daily use | 13 test cases | 251% / 204 s |
| False detections of legitimate software as malware during a system scan (false positives) | 611,548 samples | 4 / 20 / 18 / 2 |
| False warnings of certain actions during the installation and use of legitimate software | 20 samples | 2 / 0 |
| False blockings of certain actions during the installation and use of legitimate software | 20 samples | 1 / 0 |
Evaluation based on a point system
Each product can achieve a maximum of 6 points in each of the three categories of protection, performance and usability, so 18 points is the best possible test result.
At 10 points or higher, a product is awarded the AV-TEST seal of approval.
At 17.5 points or higher, AV-TEST also issues the "TOP PRODUCT" award.
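The point system above reduces to simple threshold arithmetic. A minimal sketch, assuming only what the text states (0-6 points per category, seal of approval at 10 or more, "TOP PRODUCT" at 17.5 or more); the function name is hypothetical, not part of AV-TEST's own tooling:

```python
def award(protection: float, performance: float, usability: float) -> str:
    """Each category is scored 0-6, so 18 points is the maximum total."""
    for score in (protection, performance, usability):
        if not 0 <= score <= 6:
            raise ValueError("category scores range from 0 to 6")
    total = protection + performance + usability
    if total >= 17.5:
        return "TOP PRODUCT"        # certified and best-in-class
    if total >= 10:
        return "AV-TEST certified"  # seal of approval
    return "not certified"

print(award(6.0, 5.5, 6.0))  # → TOP PRODUCT (17.5 points)
print(award(4.0, 3.5, 3.0))  # → AV-TEST certified (10.5 points)
```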