

The most important category for assessing the protective effect of products is the test against current online threats: protection against zero-day malware attacks from the Internet, including web and e-mail threats (Real-World Testing). This involves accessing known malicious websites or e-mails in order to test whether the protection product is able to ward off the attack.

The test procedure:

  1. The products are installed, updated and started up using standard/default settings. The protection program has complete Internet access at all times.
  2. AV-TEST uses the analysis program Sunshine, which it developed itself, to produce a map of the non-infected system.
  3. It then attempts to access the malicious website or e-mail.
  4. If access to the website or e-mail is blocked or the protection program displays a message, this is documented. The point at which access is blocked or the technique used to do so does not play a role in this stage of the procedure:

    1. Access to the URL is blocked.
    2. The exploit on the website is identified and blocked.
    3. Download of malicious components is blocked.
    4. Use of malicious components is blocked.

  5. Given that the detection of malicious components or actions is not always synonymous with successful blocking, Sunshine constantly monitors all actions on the computer in order to determine whether the attack was completely blocked, partially blocked or not blocked at all.
  6. A result for the test case is then determined from the detection documented by the protection program and the actions on the system recorded by Sunshine.
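The decision logic in steps 4 to 6 can be sketched in Python. All names here are hypothetical, since AV-TEST does not publish Sunshine's internals; the sketch only illustrates the principle that a detection message alone does not count as a block unless the system monitor confirms that the malicious actions were actually prevented.

```python
from enum import Enum

class Outcome(Enum):
    FULLY_BLOCKED = "fully blocked"
    PARTIALLY_BLOCKED = "partially blocked"
    NOT_BLOCKED = "not blocked"

def classify_test_case(product_detected: bool,
                       malicious_actions_observed: int,
                       malicious_actions_total: int) -> Outcome:
    """Combine the product's own detection report with the monitor's
    findings to score one test case (hypothetical logic for illustration).

    - No malicious action ran on the system: the attack was fully
      blocked, regardless of where in the chain it was stopped.
    - The product reported a detection but some actions still ran:
      only a partial block.
    - Otherwise: the attack was not blocked.
    """
    if malicious_actions_observed == 0:
        return Outcome.FULLY_BLOCKED
    if product_detected and malicious_actions_observed < malicious_actions_total:
        return Outcome.PARTIALLY_BLOCKED
    return Outcome.NOT_BLOCKED
```

A product that displays a warning but still lets part of the payload run would thus be scored as a partial block rather than a full one.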

This procedure is carried out on all tested programs and all test cases at the same time in order to ensure that all protection programs face exactly the same test conditions. If a test case is no longer available, can no longer be run during the test procedure, or behaves differently under different protection programs (which can be clearly determined from the Sunshine analyses), the test case is deleted. This ensures that all products are tested against exactly the same test scenarios. All test cases are obtained solely from internal AV-TEST sources and are always fully analysed by AV-TEST. We never resort to test cases or analyses provided by manufacturers or other external sources.
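The test-case validation described above can be sketched as a simple filter: a case is kept only if it could be run for every product and behaved identically everywhere. The data layout and the idea of a behavioural "fingerprint" per case are assumptions for illustration, not AV-TEST's actual format.

```python
def filter_valid_cases(results_by_product: dict) -> set:
    """Keep only test cases that ran for every product and showed the
    same threat behaviour everywhere; drop the rest so that all
    products are scored on an identical set of scenarios.

    results_by_product maps a product name to a dict of
    {case_id: observed-behaviour fingerprint}; a case missing from a
    product's dict means it could not be run for that product.
    """
    products = list(results_by_product)
    all_cases = set().union(*(results_by_product[p] for p in products))
    valid = set()
    for case in all_cases:
        # Collect the behaviour seen under each product for this case.
        fingerprints = {results_by_product[p].get(case, "unavailable")
                        for p in products}
        # Drop the case if it was unavailable anywhere or varied.
        if "unavailable" not in fingerprints and len(fingerprints) == 1:
            valid.add(case)
    return valid
```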

Because this test only uses real, current threats, it closely reflects the actual risk situation and threat potential. Given the complexity of the test procedure and the analysis of the results, the number of test cases must be limited to a reasonable amount. This typically has no influence on the validity of the test, because the thousands of different threats present online are generally just variants of a small number of specific malware families.

In order to increase the statistical relevance of the tests, further analyses are carried out on a large number of current threats. For these, the complexity of the test is reduced, which in turn allows the number of test cases to be increased many times over. This test concerns the static detection of files, which includes detection via signatures, heuristics and in-the-cloud queries. AV-TEST uses two different test sets for these analyses:

  1. All malicious files discovered by AV-TEST in the 6–8 weeks prior to the start of the test (the AV-TEST reference set: detection of a representative set of malware discovered in the last 2–3 months): around 100,000–150,000 files.
  2. Extremely widespread malicious files discovered by AV-TEST in the 6–8 weeks prior to the start of the test (detection of widespread malware): around 2,000–2,500 files.
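The three static-detection layers mentioned above (signatures, heuristics and in-the-cloud queries) are typically chained, with the cheapest check first. A minimal sketch, with an assumed heuristic threshold and a hypothetical cloud-lookup callback:

```python
def static_detect(sample_hash: str,
                  signature_db: set,
                  heuristic_score: float,
                  cloud_lookup) -> bool:
    """Hypothetical static-detection chain (for illustration only):
    exact signature match first, then a heuristic score against an
    assumed threshold, then an in-the-cloud reputation query."""
    if sample_hash in signature_db:          # signature detection
        return True
    if heuristic_score >= 0.8:               # heuristic detection (assumed threshold)
        return True
    return bool(cloud_lookup(sample_hash))   # in-the-cloud query
```

This ordering also shows why such tests are run with full Internet access: a product may miss a sample locally yet still detect it via the cloud query.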

Both test sets contain only files that have been discovered and analysed by AV-TEST. To prevent manufacturers from influencing the test sets in their favour, data from manufacturers are not incorporated into the tests. As a result, the independent analysis carried out by AV-TEST achieves a very high level of quality.

The test procedure:

  1. The products are installed, updated and started up using standard/default settings.
  2. An on-demand scan of the respective test set is then carried out, during which the product has full Internet access.
  3. After the scan, the report files are evaluated in detail in order to log detection and scan errors and to determine the overall result.
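Step 3 can be sketched as a small report evaluator. The tab-separated line format and the status labels are assumptions made for illustration; real products write reports in their own formats.

```python
def evaluate_scan_report(report_lines: list, total_samples: int) -> dict:
    """Tally detections and scan errors from a (hypothetical)
    line-oriented report file and compute the detection rate.

    Assumed line format: "<sample-path>\t<status>", where status is
    one of "detected", "clean" or "error".
    """
    detected = errors = 0
    for line in report_lines:
        _, status = line.rsplit("\t", 1)
        if status == "detected":
            detected += 1
        elif status == "error":
            errors += 1
    return {
        "detected": detected,
        "errors": errors,
        "detection_rate": detected / total_samples if total_samples else 0.0,
    }
```

Logging scan errors separately matters: a sample the scanner crashed on is not the same as a sample it examined and judged clean.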

It is also important in this test that all products are analysed simultaneously against the same test cases, in order to ensure that all products have been identically updated. Furthermore, the products are allowed to update themselves at any time during the test. This is important in order to integrate products that rely on in-the-cloud queries into the test environment sensibly and fairly.