Win XP, 7 & 8.1: Internet Security Suites Complete an Endurance Test Lasting 6 Months
Although the endurance test involves a large amount of time and effort, it delivers results that are unique worldwide: 24 Internet security suites completed six rounds of testing within a period of six months. Three additional suites also took part, but each was only able to complete two months of testing and is therefore recorded separately from the overall result.
Each individual test can only capture the performance of a protection package at a given moment in time. A test lasting six months, on the other hand, is a completely different kettle of fish, and it revealed a number of ups and downs for several products. Nevertheless, the endurance test also clearly shows which protection solutions deliver top-level performance over the long term.
Who protects Windows the best?
All of the protection packages underwent three tests lasting two months each on the Windows XP, Windows 7 and Windows 8.1 operating systems. The basic protection provided by Windows served as a baseline in each of these tests, in the form of either Windows Defender or Security Essentials combined with the Windows Firewall, depending on the system in question.
Although the table shows that the basic protection from Windows is better than nothing, the use of an external protection solution is, all things considered, a must. This conclusion can also be expressed in numbers: out of a maximum possible total of 18 points, Internet Security from Kaspersky Lab scored 17.8 points overall, taking first place. In comparison, the basic protection from Windows achieved only 10.3 points, coming in at 17th place.
Another interesting aspect is the comparison between free and purchasable products: the top 5 places in the overall result feature only purchasable products, but the free protection package from Panda Security managed to achieve joint sixth place. Although the purchasable version of Avira’s Internet Security package came in third, its result cannot simply be applied to the free version also available from Avira.
The Test Candidates
A total of 27 protection packages from the developers AhnLab, Avast, AVG (purchasable and free versions), Avira (purchasable version), Baidu, Bitdefender, BullGuard, Check Point, Comodo, ESET, F-Secure, G Data, K7 Computing, Kaspersky Lab, Kingsoft, McAfee, Microworld, Norman, Panda Security, PCKeeper, Qihoo, Symantec, Tencent, ThreatTrack and Trend Micro were involved in the test.
The products from Baidu, Check Point and PCKeeper were only tested in two of the six months of testing and are therefore listed separately in the results table. The scores achieved by the other 24 packages, including the comparative values from Microsoft, are listed in the table as overall results.
Test Hurdles and Continuous Protection
The testers at AV-TEST awarded the products a maximum of 6 points in each of three test hurdles: the categories of Protection, System Load and Usability. The maximum total available in the endurance test was therefore 18 points.
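As a minimal sketch of this scoring scheme (the function and values below are illustrative, not AV-TEST's actual tooling), the overall result is simply the sum of the three category scores, each capped at 6 points:

```python
# Illustrative sketch (not AV-TEST's actual tooling): the endurance test
# awards up to 6 points in each of three categories, for an 18-point maximum.
CATEGORIES = ("Protection", "System Load", "Usability")
MAX_PER_CATEGORY = 6.0

def overall_score(category_scores: dict) -> float:
    """Sum the three category scores, capping each at the 6-point maximum."""
    return sum(min(category_scores.get(c, 0.0), MAX_PER_CATEGORY)
               for c in CATEGORIES)

# Hypothetical category values resembling the winning result in the article:
print(round(overall_score({"Protection": 6.0,
                           "System Load": 5.8,
                           "Usability": 6.0}), 1))
```

A perfect run in all three categories would yield the 18-point maximum mentioned above.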
One of the most important categories in the test is Protection. The products’ score in this part of the test combines their detection rates in the real-world test and against the reference set. While detecting the reference set is an absolute must for all products, the real-world test can be viewed as the freestyle discipline: it requires the protection packages to identify approximately 400 brand-new, previously unknown pieces of malware, so-called zero-day malware. The products’ performance in the real-world test is assessed by visiting freshly infected websites or opening dangerous e-mail attachments.
The test using the reference set, on the other hand, requires the products to detect nearly 60,000 known and widespread pieces of malware without making any errors. All protection solutions should actually be able to master this challenge with a detection rate of 100%.
Separating the Wheat from the Chaff in Malware Detection
Only the packages from Bitdefender, F-Secure, Kaspersky Lab, G Data, Trend Micro and Comodo were able to detect nearly 400 pieces of unknown malware without making any errors.
These were followed by the packages from Avira, Microworld and Panda Security, which made a few detection errors and therefore only achieved a detection rate of 99 percent.
These nine protection packages also all achieved a detection rate of 100% when required to identify the reference set.
The basic protection from Microsoft, on the other hand, achieved a detection rate of just 74 percent, meaning roughly one in every four pieces of unknown malware snuck its way onto the system. Its detection rate for the reference set of known malware was a meagre 90 percent.
Only purchasable products achieved detection rates of 100 percent for both sets of malware. The best free package in this part of the test came from Panda Security.
System Load Is Still a Problem
Having the best protection package doesn’t help a user much if the solution massively slows down the Windows system in day-to-day use. Although manufacturers really ought to have the system load of their programs under control, this test category also revealed a number of serious differences. The protection packages slow the system down most noticeably when copying files or installing and launching applications.
The best suite in the endurance test was the protection package from Kaspersky Lab, which was one of a number of solutions to achieve top marks when detecting malware, but also managed to maintain a low system load. Kaspersky Lab therefore also achieved the top result in this test category, scoring 5.8 of a maximum possible total of 6 points. It was closely followed by the protection package from Bitdefender with 5.5 points.
An example from among the other suites with the highest detection rates is Trend Micro, which consumed considerably more resources than other packages and therefore received only 4 out of 6 points in the System Load category. The basic protection from Windows achieved a similar score for slowing down the system, while the products with the heaviest system load scored as few as 2 or 3 points.
A High Level of Usability
Many of the system guardians used the test category of Usability as a welcome source of points for their overall results. In fact, only 3 of the 24 suites tested scored as low as 4.2 or 4.3 points, while all other packages achieved between 5 and 6 points.
These results confirm that most of the suites correctly identified the more than 860,000 clean programs tested as harmless and did not annoy their users with false positives. The packages also rarely blocked the wrong websites or prevented legitimate applications from launching.
Although the basic protection from Microsoft also scored 6 points in this category, it failed to score a single point for its malware detection performance in the Protection category.
Summary: Good Protection Packages for Windows
The comparison of the products only factors in the protection packages that completed all 6 months of the test between September 2013 and February 2014. The top products in the overall results table are the suites from Kaspersky Lab and Bitdefender with 17.8 and 17.5 of a maximum possible total of 18 points respectively.
With its overall score of 16.3 points, the purchasable version of the Avira software achieved third place ahead of G Data, which scored a total of 16.2 points, despite the fact that G Data offers a much better detection rate.
Although Trend Micro and F-Secure performed strongly in the Protection category, achieving the maximum total of 6 points, both programs have a severe impact on the day-to-day use of Windows systems in terms of system load.
The basic protection from Microsoft, namely Windows Defender or Security Essentials, combined with the Windows Firewall, is an insecure option. Its result in the test category of Protection is simply too poor.
22 of the 24 packages involved in the test were awarded the AV-TEST certificate for tested security. The product from AhnLab, on the other hand, failed to achieve a certificate due to its insufficient performance in the Protection category. The Microsoft results were only recorded for comparison and were therefore not included in the certification process.