AV-TEST Product Review and Certification Report – Q3/2010

During July, August and September 2010 we continuously evaluated 16 security products using their default settings. They were allowed to update themselves at any time and query their in-the-cloud services. We focused on realistic test scenarios and challenged the products against real-world threats. Products had to demonstrate their capabilities using all components and protection layers.

Certified

Internet Security Suite
Version: 2010 & 2011
Platform: Windows XP (SP3, 32 bit)
Report: 103595
Date: Q3/2010

Protection

Protection against malware infections
(such as viruses, worms or Trojan horses)

Protection against 0-day malware attacks from the internet, inclusive of web and e-mail threats (Real-World Testing), 66 samples used
  Industry average: 77% | July: 81.0% | August: 71.0% | September: 64.0%

Blocking of malware on or post execution (Dynamic Detection Testing), 20 samples used
  Industry average: 50% | Result: 55.0%

Detection of a representative set of malware discovered in the last 2-3 months (AV-TEST reference set), 1,038,795 samples used
  Industry average: 96% | July: 100% | August: 99.0% | September: 98.0%

Detection of widespread malware (according to the WildList), 28,838 samples used
  Industry average: 100% | July: 100% | August: 100% | September: 100%

Protection Score: 4.5/6.0

Repair

Cleaning and repair of a malware-infected computer

Removal of all active components of widespread malware (according to the WildList) from a computer, 20 samples used
  Industry average: 80% | Result: 70.0%

Removal of further malicious components and remediation of critical system modifications made by malware, 20 samples used
  Industry average: 50% | Result: 30.0%

Detection of deliberately hidden active malware (Rootkits and stealth malware), 18 samples used
  Industry average: 83% | Result: 78.0%

Removal of deliberately hidden active malware (Rootkits and stealth malware), 18 samples used
  Industry average: 56% | Result: 44.0%

Repair Score: 2.0/6.0

Usability

Impact of the security software on the usability of the whole computer
(lower values indicate better results)

Average slow-down of the computer by the security software in daily use, 13 test cases
  Industry average: 236% | Result: 126 s

False detections of legitimate software as malware during a system scan (false positives), 530,462 samples used
  Industry average: 3 | July: 0 | August: 0 | September: 1

False warnings of certain actions during the installation and use of legitimate software, 20 samples used
  Industry average: 1 | Result: 2

False blockings of certain actions during the installation and use of legitimate software, 20 samples used
  Industry average: 1 | Result: 0

Usability Score: 5.5/6.0

Evaluation based on a point system

All products can achieve a maximum of 6 points in each of the three categories: protection, repair and usability. This means 18 points is the best possible test result.
At 10 points or higher, a product is awarded the AV-TEST seal of approval.
At 17.5 points or higher, AV-TEST also issues the "TOP PRODUCT" award.
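
As a worked illustration of this point system (a minimal sketch in Python; the award helper below is hypothetical, not AV-TEST tooling), the category scores from this report, Protection 4.5, Repair 2.0 and Usability 5.5, add up to 12.0 of 18.0 points, which clears the 10-point certification threshold but not the 17.5-point "TOP PRODUCT" threshold:

    def award(protection, repair, usability):
        """Each category is scored out of 6.0, for a maximum of 18.0 points."""
        total = protection + repair + usability
        if total >= 17.5:
            return "TOP PRODUCT ({:.1f}/18.0)".format(total)
        if total >= 10.0:
            return "AV-TEST certified ({:.1f}/18.0)".format(total)
        return "not certified ({:.1f}/18.0)".format(total)

    # Category scores from this report (Protection / Repair / Usability)
    print(award(4.5, 2.0, 5.5))  # -> AV-TEST certified (12.0/18.0)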

   

Chart legend: outstanding / satisfactory / insufficient
