Google Keeps You In Personal “Search Bubble”

It has long been known that Google, Bing, and other commercial search engines analyze users' browsing and search history, and clever algorithms use that data to predict what will "be useful" to a person the next time they search.

When this technology was in its infancy, it seemed logical and harmless, but today the approach looks far more troubling. Representatives of Google, the world's most popular search engine, used to argue that simply logging out of your account was enough to get rid of the "bias" in search results and receive neutral results from the service.

In their study, the DuckDuckGo team claims that even when you log out and switch to private (incognito) mode, Google continues to adjust search results based on the information it has previously collected about you. In other words, the company leaves people no choice and still tailors the results, even when a user's actions clearly indicate that they want a "neutral" output.

Even allowing for the fact that DuckDuckGo is itself a search engine and may be heavily biased against Google and its algorithms, the results are food for thought. To test Google's search behavior, the DuckDuckGo team gathered a group of 87 volunteers and ended up analyzing 76 sets of search results.

  • The participants first entered a predetermined set of queries, identical for everyone, in private (incognito) mode while logged out, which in theory should yield a sample free from the influence of factors such as search history, and then repeated the same queries while signed in to their accounts. All volunteers searched at the same time (the experiment was run on June 24, 2018, at 21:00), and only US residents took part, to rule out differences introduced by per-state filtering.
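For concreteness, here is a minimal sketch of how each volunteer's search snapshot could be recorded for later comparison. The field names and data layout are assumptions made for illustration, not the format actually used by the study.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

# Hypothetical record layout for one search-engine results page (SERP) snapshot;
# the study's real data format may differ.
@dataclass
class SerpSnapshot:
    participant_id: str          # anonymized volunteer identifier
    query: str                   # e.g. "gun control", "immigration", "vaccinations"
    mode: str                    # "incognito_logged_out" or "signed_in"
    captured_at: datetime        # all volunteers searched at the same agreed time
    country: str                 # only US residents took part
    links: List[str] = field(default_factory=list)  # ordered organic result URLs

# Example snapshot for one volunteer's incognito run.
snapshot = SerpSnapshot(
    participant_id="p01",
    query="gun control",
    mode="incognito_logged_out",
    captured_at=datetime(2018, 6, 24, 21, 0),
    country="US",
    links=["https://example.org/a", "https://example.org/b"],
)
print(snapshot.query, snapshot.mode, len(snapshot.links))
```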

Under Control

Under these conditions, one would expect all users to receive more or less the same search results, since the queries were made at roughly the same time, from the same country, and in the same order. However, most volunteers received unique search results even in private mode, along with personalized advertising links in the first 3-4 positions. Here are the numbers for the three queries used:

  • Gun control: 62 variations of the result set; 52 out of 76 participants (68%) got results unique to them.
  • Immigration: 57 variations of the result set; 43 out of 76 participants (57%) got results unique to them.
  • Vaccinations: 73 variations of the result set; 70 out of 76 participants (92%) got results unique to them.

Logically, if there is this much variation in "anonymous" mode without a login, then the uniqueness of results when signed in to an account should go off the scale. However, for regular signed-in search (with search history in play), the uniqueness picture barely changed:

  • Gun control: 58 variations of the result set; 45 out of 76 participants (59%) got results unique to them.
  • Immigration: 59 variations of the result set; 48 out of 76 participants (63%) got results unique to them.
  • Vaccinations: 73 variations of the result set; 70 out of 76 participants (92%) got results unique to them.
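As a rough illustration only (the study's own Python analysis code is linked at the end of the article), figures like these can be derived from the collected result lists by treating each participant's ordered list of links as a fingerprint: the number of distinct fingerprints gives the count of variations, and a participant's results are "unique" if their fingerprint occurs exactly once. The function below is a minimal sketch under that assumed definition, not the study's actual code.

```python
from collections import Counter

def uniqueness_stats(results_by_participant):
    """results_by_participant maps a participant id to the ordered list of
    result URLs that participant saw for one query (hypothetical layout)."""
    # Treat each participant's ordered list of links as a "fingerprint".
    fingerprints = {pid: tuple(links) for pid, links in results_by_participant.items()}
    counts = Counter(fingerprints.values())

    # A participant's results count as unique if nobody else saw the same ordered list.
    unique = [pid for pid, fp in fingerprints.items() if counts[fp] == 1]

    n = len(results_by_participant)
    return {
        "variations": len(counts),            # distinct result orderings observed
        "unique_participants": len(unique),   # people whose ordering nobody else saw
        "unique_share": round(len(unique) / n, 2) if n else 0.0,
    }

# Toy example: three participants, two of whom saw identical result orderings.
sample = {
    "p1": ["a.com", "b.com", "c.com"],
    "p2": ["a.com", "b.com", "c.com"],
    "p3": ["a.com", "c.com", "d.com"],
}
print(uniqueness_stats(sample))
# -> {'variations': 2, 'unique_participants': 1, 'unique_share': 0.33}
```

Running the same function once on the incognito snapshots and once on the signed-in snapshots would give the two sets of figures compared above.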

What does this mean? It means that Google's claim that it had solved the problem of biased results back in the fall of this year is simply not true.

The DuckDuckGo team also analyzed the news links shown in the infobox. Some sources were shown to all participants of the experiment, yet "for some reason" Google occasionally slipped in individual links to certain people that only they saw. A few people did not see the infobox at all, which is also "very strange." Detailed numbers can be found in the original post.

In an ideal world of anonymous search, every user who does not want to hand Google their personal data would receive the same set of the most relevant links for their query as any other user from the same country. Instead, Google continues to track its users and impose "smart" results on them in order to sell advertisers' products and services. Big Brother knows best what you need.

The instructions the volunteers followed are available as an XLS file, the full results of the study can be found in a separate .xls file, and the Python code used to analyze the results is in a repository on GitHub.