AV-TEST Product Review and Certification Report – Sep-Oct/2012
During September and October 2012 we continuously evaluated 24 home user security products using their default settings. We always used the most current publicly available version of each product for the testing. The products were allowed to update themselves at any time and to query their in-the-cloud services. We focused on realistic test scenarios and challenged the products against real-world threats. Products had to demonstrate their capabilities using all of their components and protection layers.
Ad-Aware Pro Security
| Version | 10.2 & 10.3 |
| Platform | Windows 7 (SP1, 32-bit) |
| Report | 123643 |
| Date | Sep-Oct/2012 |
Protection
Protection against malware infections
(such as viruses, worms or Trojan horses)
| Test | Industry average | September | October |
| Protection against 0-day malware attacks, inclusive of web and e-mail threats (Real-World Testing), 102 samples used | 89% | 92.0% | 80.0% |
| Detection of a representative set of malware discovered in the last 2-3 months (AV-TEST reference set), 272,799 samples used | 97% | 99.0% | 98.0% |
| Detection of widespread and prevalent malware (according to AV-TEST data), 5,000 samples used | 100% | 100% | 100% |
Protection Score | 4.0/6.0 |
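The percentages in the table above follow directly from the share of samples a product handled in each test set. A minimal sketch of that arithmetic (the detected counts below are hypothetical back-calculations, not figures from the report):

```python
def detection_rate(detected: int, total: int) -> float:
    """Share of malware samples detected, as a percentage rounded to one decimal."""
    return round(100.0 * detected / total, 1)

# Hypothetical example: 270,071 detections out of the 272,799 samples in the
# AV-TEST reference set would correspond to the 99.0% September figure.
print(detection_rate(270071, 272799))  # -> 99.0

# Detecting all 5,000 widespread-malware samples yields the 100% row.
print(detection_rate(5000, 5000))  # -> 100.0
```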
Repair
Cleaning and repair of a malware-infected computer
| Test | Industry average | September | October |
| Detection of actively running widespread malware (including rootkits and stealth malware), 40 samples used | 95% | 98.0% | |
| Removal of all active components of widespread malware (including rootkits and stealth malware), 40 samples used | 85% | 90.0% | |
| Removal of further malicious components and remediation of critical system modifications, 40 samples used | 60% | 70.0% | |
Repair Score | 4.0/6.0 |
Usability
Impact of the security software on the usability of the whole computer
(lower values indicate better results)
| Test | Industry average | September | October |
| Average slow-down of the computer by the security software in daily use, number of test cases: 5 | 10% | 12s | |
| False detections of legitimate software as malware during a system scan (false positives), 657,250 samples used | 4 | 3 | 4 |
| False warnings of certain actions during download, installation and use of legitimate software, 27 samples used | 1 | 0 | |
| False blockings of certain actions during download, installation and use of legitimate software, 27 samples used | 1 | 0 | |
Usability Score | 4.5/6.0 |
Evaluation based on a point system
Each product can achieve a maximum of 6 points in each of the three categories of protection, repair and usability, so 18 points is the best possible test result.
At 10 points or higher, a product is awarded the AV-TEST seal of approval.
At 17.5 points or higher, AV-TEST also issues the "TOP PRODUCT" award.
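The point system described above can be sketched as follows (assumed logic based on the thresholds stated in this report, not official AV-TEST code):

```python
def certification(protection: float, repair: float, usability: float) -> str:
    """Total the three category scores (max 6.0 each, 18.0 overall) and
    map the result to the award tiers stated in the report."""
    total = protection + repair + usability
    if total >= 17.5:
        return "TOP PRODUCT"
    if total >= 10.0:
        return "AV-TEST certified"
    return "not certified"

# Ad-Aware Pro Security in this report: 4.0 + 4.0 + 4.5 = 12.5 points.
print(certification(4.0, 4.0, 4.5))  # -> AV-TEST certified
```

With 12.5 points the product clears the 10-point certification threshold but falls short of the 17.5 points required for the "TOP PRODUCT" award.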
(Chart legend: outstanding / satisfactory / insufficient)