
Analysis of 160 Million Websites: Are Google and Other Search Engines Platforms for Distributing Malware?

Search engines such as Google provide access to more than 1 billion websites and handle 4 to 6 billion search queries globally – every day. But how many malware threats lurk among the search results? In 2015 and 2016, AV-TEST analyzed 80 million websites per year and discovered an unsettling trend.

Do search engines also deliver links with malware threats in their results?

The examination of 80 million links per year reveals: the number of infected links is continuously increasing.

The results from many different search engines, along with Twitter tweets, serve as the data source.


As far back as 2013, the laboratory at AV-TEST examined how many infected web pages were found in the results of search engines. It was already clear back then that search engine operators such as Google and others endeavor to filter the results, but they are not able to get a handle on the flood of malware threats. In the current examination, the collected results from January 2015 to August 2016 were used as a database.

Fact: the number of infected results has been increasing year by year since 2013, despite the fact that search engine operators use many tools and technologies to try to filter them out.

80 million search engine results analyzed annually

The database collated by AV-TEST over the past 20 months is gigantic: throughout 2015, the lab analyzed over 80 million websites for malware threats, and the examination continued in 2016, with more than 80 million additional websites checked over the past 8 months. This provides a good basis for comparing the results. The analyzed websites originate in various proportions from the search engines Google, Bing, Yandex and Faroo. Additionally, over 315 million Twitter tweets in 2015 and over 200 million in 2016 were examined for malicious links.

The operators of search engines pre-filter the results and sort out infected links. Google claims to provide better protection with its Safe Browsing tools, which work either in the search interface or via Firefox and Chrome when a search engine other than Google is used. But do the Safe Browsing tools really perform the additional function of reliably sorting out malicious links? This was analyzed as well.
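The article does not describe how AV-TEST queried Safe Browsing internally. As an illustration only, Google also publishes a Safe Browsing Lookup API (v4) that lets any client check URLs against Google's threat lists; the sketch below builds the request body for its `threatMatches:find` endpoint. The client name and the sample URL are placeholders, and a real request would additionally need an API key.

```python
import json

# Public Google Safe Browsing Lookup API (v4) endpoint; an API key is
# appended as a query parameter in a real request.
SAFE_BROWSING_ENDPOINT = "https://safebrowsing.googleapis.com/v4/threatMatches:find"

def build_lookup_payload(urls):
    """Build the JSON body for a v4 threatMatches:find request."""
    return {
        # clientId/clientVersion are placeholders for illustration
        "client": {"clientId": "example-client", "clientVersion": "1.0"},
        "threatInfo": {
            "threatTypes": ["MALWARE", "SOCIAL_ENGINEERING"],
            "platformTypes": ["ANY_PLATFORM"],
            "threatEntryTypes": ["URL"],
            "threatEntries": [{"url": u} for u in urls],
        },
    }

# Example payload for a single (placeholder) URL
payload = build_lookup_payload(["http://example.test/suspicious"])
print(json.dumps(payload, indent=2))
```

A non-empty `matches` array in the API's response would indicate that Google flags the URL; an empty response means no match in its lists.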

The testers examined the websites on several levels.

How many infected websites are there now?

The evaluations from AV-TEST covering 2015 through August 2016 ultimately yield two important final results (without Google Safe Browsing):

  • 2015, 80 million websites examined: 18,280 infected web pages
  • 2016 (up to Aug.), 81 million websites examined: 29,632 infected web pages

By comparison: already in 2013, among roughly 40 million web pages examined, 5,060 malware threats were found. You don't need to be a mathematician to see this clear growth trend.

The result of the analysis is bad enough on its own. Here is how the results look when the Google Safe Browsing tools examine the same millions of pages:

  • 2015, 80 million websites examined: 9,725 warnings
  • 2016 (up to Aug.), 81 million websites examined: 19,794 warnings

Interestingly enough, these are not exactly the same websites that were filtered out with the help of the VTEST multi-scanner, because VTEST sorted out both the pages that led directly to a downloadable malware threat and the web pages that directly attack the visitor. The latter includes phishing pages that try to capture data.

That is why the laboratory conducted a counter test: All of the pages with malware threats found by AV-TEST were visited using the Google Safe Browsing tools. They reported the following:

  • 2015: 18,280 pages with malware threats, 555 Google warnings 
  • 2016: 29,632 pages with malware threats, 1,337 Google warnings
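Put as a rate, the counter test shows how few of the pages AV-TEST classified as malicious triggered a Safe Browsing warning. A short calculation over the figures listed above:

```python
# Counter-test figures from the article:
# year -> (pages AV-TEST found malicious, Google Safe Browsing warnings)
counter_test = {
    "2015": (18_280, 555),
    "2016": (29_632, 1_337),
}

for year, (malicious, warned) in counter_test.items():
    rate = warned / malicious * 100
    print(f"{year}: Safe Browsing warned on {rate:.1f}% of known-malicious pages")
```

That works out to warnings on only about 3.0% (2015) and 4.5% (2016) of the pages the lab had identified as carrying malware threats.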

Additional danger: Twitter tweets

A large part of the over 80 million links examined comes from Twitter tweets. In 2015, over 315 million tweets were examined, from which almost 23 million links to websites were extracted and checked. In 2016, there were more than 200 million tweets with some 25 million links. With over 1,100 malware threats in 2015 and 1,500 in 2016, links in tweets are infected at almost the same frequency as links already filtered by Google. Twitter thus appears to examine the links and filter malicious tweets as well – but in the final analysis it fails to catch everything, exactly as the search engines do.
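The tweet figures above can likewise be normalized per million extracted links (the article's counts are approximate, so these rates are rough):

```python
# Twitter figures from the article:
# year -> (links extracted from tweets, malicious links found)
twitter = {
    "2015": (23_000_000, 1_100),
    "2016": (25_000_000, 1_500),
}

for year, (links, malicious) in twitter.items():
    per_million = malicious / links * 1_000_000
    print(f"{year}: about {per_million:.0f} malicious links per million tweeted links")
```

Roughly 48 malicious links per million in 2015 and 60 per million in 2016 – low in relative terms, but a growing rate on a platform distributing tens of millions of links.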

What malware is lurking in websites?

Among the 80 million websites examined, all types of threats could be found. For 2 million links examined in 2015 and 2.2 million in 2016, there was no website at all; a download was launched immediately. In some 10,000 cases in 2015 and just under 18,000 cases in 2016, a malware threat was delivered via such a direct download.

In roughly 60% of the cases, the attack was attempted directly with a file. In the remaining 40% of the attacks, code snippets and exploits targeting vulnerabilities in Java, Flash and other software were used.

The laboratory registered the file types used in attacks and compiled a Top 5 list. First place, as expected, goes to EXE files. Here is the order:

  • EXE: executable EXE files
  • ZIP: compressed archive files
  • RAR: compressed archive files
  • SWF: Adobe Flash multimedia file
  • MSI: Microsoft Windows installer file

Beyond this ranking, many of the other file types on record are documented with 10 or fewer specimens each.
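The article does not describe AV-TEST's tooling, but a ranking like this can be produced by tallying the file extensions of the direct-download URLs. A minimal sketch with invented sample URLs (the real pipeline and data are AV-TEST's own):

```python
from collections import Counter
from urllib.parse import urlparse
import os

# Invented sample URLs purely for illustration
download_urls = [
    "http://example.test/setup.exe",
    "http://example.test/invoice.zip",
    "http://example.test/player.swf",
    "http://example.test/update.exe",
    "http://example.test/archive.rar",
]

def extension(url):
    """Return the uppercase file extension of a URL's path, or 'NONE'."""
    path = urlparse(url).path
    return os.path.splitext(path)[1].lstrip(".").upper() or "NONE"

# Tally extensions and take the five most common
top = Counter(extension(u) for u in download_urls).most_common(5)
print(top)
```

In this toy sample, EXE leads the tally, matching the ordering reported in the article.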

Search engines deliver infected websites

The number of infected websites discovered in the test is in fact not very high, yet their potential reach must be considered. Google alone handles some 2 to 3 billion search queries per day, drawn from a pool of an estimated 1.1 billion websites. It is important to keep in mind that a popular sports website, for example, may be called up 100,000 times, whereas a website on breeding dwarf hamsters may be searched for only 100 times. Accordingly, one infected website can be delivered thousands of times per day, whereas another may be delivered only 10 times.

Estimating how many malware-infected websites are actually circulating on the web is impossible. But the test findings collected over the years indicate that the number of infected websites is continually growing. From 2015 to August 2016 alone, there was an increase of over 60% within the comparatively small pool of 80 million examined websites!

Experts continue to recommend security software

Despite the efforts of search engine operators and tools such as Google Safe Browsing, the fact remains that malware threats on the Internet continue to be on the upswing. The experts from AV-TEST can therefore only continue to recommend the use of a security solution for PCs and mobile devices – something users need to take to heart. The laboratory regularly publishes materials on this topic under Tests of security software for Windows, Mac, iOS and Android on its website.

Attackers use the dynamic growth of the web

Maik Morgenstern

Although AV-TEST has now been recording and constantly evaluating websites for 20 months, each evaluation remains only a snapshot in time. The Internet is far too dynamic for rigid long-term statistics.

Even for an institute like AV-TEST, research is not always easy. An excellent example of this is the final interpretation of the study data on malware in search results. The study does indeed reliably indicate how many websites or links delivered by search engines have infected content or malware threats. But it cannot record how long a website with a malware threat has been on the web and how often it has been delivered.

The lab attempted to analyze what happens when current search terms are entered, the websites found are evaluated, and the procedure is repeated after 14 days. The result: roughly 25 to 30% new websites joined those already present, and these new websites showed precisely the same percentage of infection as the initial search query. There was a new crop of websites with new malware threats.

The Internet is constantly in flux and therefore cannot be captured in static statistics. What's more, research is becoming more difficult: search engine operators, for example, demand a volume-based service fee for accessing their databases via API. Google already requires it, and Bing will charge it in the near future.

It is important to point out that search engine operators are not virus hunters. Filtering is an additional job, and one they do not perform entirely on a voluntary basis: if users increasingly receive infected links and malicious files in their results, they might consider switching to a different search engine.
