Photo Gallery: The Dangers of Autocomplete

Autocompleting Bettina Wulff: Can a Google Function Be Libelous?

Search engines attempt to guess what users are looking for and offer them suggested words. But can these terms constitute defamation? Former German first lady Bettina Wulff says they do, and is suing Google over searches that pair her name with terms like "prostitute." Google maintains it is an automatic mechanism.
By Stefan Niggemeier

The problem has its roots in the American service mentality. Picture Google as a somewhat overzealous assistant. "Rest your fingers," says the friendly search engine provider. From the very first letter we type in the search box, it rushes to guess what we might be looking for. "S." Is it SPIEGEL? Samsung? Savings and loan? Skype?

It's pure service-mindedness, but for Bettina Wulff it's a nightmare. The wife of former German President Christian Wulff wants the search engine to cease suggesting terms that she finds defamatory. This has nothing to do with the search results, but rather with the recommendations made by Google's "Autocomplete" function, a service that is also offered by competitors like Bing and Yahoo. All one has to do is type her first name and the first letter of her last name to get search suggestions such as "Bettina Wulff prostitute," "Bettina Wulff escort" and "Bettina Wulff red-light district."

Google acts as if all this were unavoidable. "The search terms in Google Autocomplete reflect the actual search terms of all users," says a company spokesman. He also spoke of the "algorithmic result of several objective factors, including the popularity of search terms," which sounds far more complex but remains typically vague, and basically amounts to the same shoulder-shrugging response: an automatic mechanism cannot be accused of defamation. The company maintains that the search engine only shows what exists. It's not Google's fault, the company argues, if someone doesn't like the computed results.

How We Perceive the World

Google increasingly influences how we perceive the world. Which should we fear more: the merciless machine behind the computing processes, or the opaque and arbitrary decisions of a large US corporation?

Both are to be feared and, in the case of Google, both come into play. Contrary to what the Google spokesman suggests, the displayed search terms are by no means solely based on objective calculations. And even if that were the case, just because the search engine means no harm, it doesn't mean that it does no harm. The Autocomplete function, the usefulness of which Google so guilelessly praises as a means of giving one's fingers a rest, undeniably helps spread rumors. Assuming that someone unsuspectingly begins to look for information on "Bettina Wulff" and is offered "prostitute," "Hanover" and "dress" as additional search terms -- where, independent of their actual interests, will users most likely click?

And everyone who selects the most exciting suggestion adds to the popularity of this search, and thus increases the probability that others will see this suggestion in the future.
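To see why this loop feeds itself, consider a deliberately simplified sketch, in Python, of a popularity-driven suggestion box. It is an illustration only: Google's real system weighs many more signals, and everything here -- the class, the method names, the sample queries -- is an assumption made for the example, not the company's actual implementation.

```python
from collections import Counter

class AutocompleteSketch:
    """Toy model of a popularity-driven suggestion box (not Google's real system)."""

    def __init__(self, max_suggestions=4):
        self.query_counts = Counter()       # how often each full query was searched
        self.max_suggestions = max_suggestions

    def record_search(self, query):
        """Every executed search -- including one chosen from the suggestion
        box -- raises that query's popularity."""
        self.query_counts[query.lower()] += 1

    def suggest(self, prefix):
        """Return the most popular past queries that start with the typed prefix."""
        prefix = prefix.lower()
        matches = [(q, n) for q, n in self.query_counts.items() if q.startswith(prefix)]
        matches.sort(key=lambda item: item[1], reverse=True)
        return [q for q, _ in matches[:self.max_suggestions]]

# The self-reinforcing loop the article describes: a suggestion, once shown,
# attracts clicks, and each click makes it more likely to be shown again.
engine = AutocompleteSketch()
engine.record_search("bettina wulff dress")
for _ in range(3):
    engine.record_search("bettina wulff rumor")      # a few curious searches...
print(engine.suggest("bettina w"))                    # ...already rank the rumor first
engine.record_search(engine.suggest("bettina w")[0])  # clicking the top suggestion entrenches it
```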

Perhaps this is one reason why we find these functions and their algorithms so unsettling -- because they so relentlessly expose human behavior. Google is a rumormonger for the simple reason that people are rumormongers. When we hear that there is a rumor concerning Bettina Wulff, we want the details.

Diligently Searching

Who looked up these terms so diligently that they became popular enough to appear in the Autocomplete suggestion box in the first place? Indeed, such unsubstantiated rumors don't reach the top of the search list by merely surfacing on some obscure website in a dark corner of the Web. It may well have been the politicians and journalists who spread the false rumor that Ms. Wulff had been a prostitute -- a rumor she has vehemently denied.

For many months, they looked so hard and long for details on the Internet that the algorithms at Google and other search engines eventually concluded that it would be helpful to suggest the term "prostitute" to people who were looking for "Bettina Wulff" -- just as they recommend "iphone 5" to people who type in "iph."

Anyone who looks for "Angela Merkel" will, depending on their location, be given "Zeuthen" as an additional search term. Pursuing the top results, they will find fairly skeptical news stories about the rumor that the chancellor supposedly wants to move to this town southeast of Berlin.

Until recently, anyone who followed the search suggestions on Bettina Wulff found no newspaper articles, no professional search results and no denials, only the rumor itself. Anyone with a little imagination -- and on the Internet there are certainly people who fall into this category -- could see a conspiracy in the deafening silence of the traditional media on a story that appeared to permeate the Web. The fact that the purported story was not being reported made the rumor even more plausible for those who were spreading it.

A Tacit Agreement Not to Report

In fact, there was apparently a tacit agreement among journalists not to report on the rumor, despite the fact that so many people had heard it. Even critical reporting aimed at refuting the rumor was off-limits, no doubt due to concerns that Ms. Wulff would take legal action against the publishers.

This case shows how dangerous it can be in the age of the Internet when the traditional media don't report on an issue -- even if it is with the best of intentions. The rumor of Ms. Wulff's alleged past life took on special importance and apparent credibility when news leaked that her husband, the German president, felt that he was being pressured by the editor-in-chief of the mass-circulation Bild newspaper.

Anyone who had heard the rumor about his wife could suddenly see a reason to suspect that Wulff was not just concerned about reporting on allegations over a loan for his home, but also on his wife's alleged "red-light past." This could not be denied, of course, because the rumor was already off-limits.

Google promotes its Chrome browser with the slogan: "The Web is what you make of it." That is also an accurate description of how the Autocomplete function works. The more we search for dark secrets, the more dark secrets others will discover.

On the other hand, it would be wrong to see the search suggestions as allegations and to see something negative in every factually incorrect term. Anyone who looks up German national soccer team coach "Jogi Löw" on Google is given "gay" in German as an additional suggested search term. The top search results, though, lead to texts in which Löw denies this rumor. Is it good or bad that Google can thus contribute both to spreading and denying rumors?

And which terms should search engines no longer be allowed to suggest in connection with the name Bettina Wulff to avoid leading unsuspecting users to defamatory content and, as she argues in her case against Google, to avoid being guilty of making false allegations? The list of taboo words that her lawyer has presented not only contains the specific name of a bordello, but also the phrase "wild past life." It appears impossible to create a comprehensive list of terms and phrases that eliminates all possible words that could indirectly point to the incriminating rumor.

For the search engines, this is not merely a question of practicality, but rather a matter of principle. They deny that a combination of terms the algorithm generates as a suggestion constitutes an allegation. More importantly, they deny that they are responsible for the word combinations created by such an algorithm. They argue that the displayed content is generated by others -- in this case, by other users of the search engine.

Blocking Certain Words

Google refuses to accept any responsibility for what people search for -- and for what they find.

Nevertheless, search engine providers can intervene in the functioning of the mechanism. Indeed, they already do so. The Autocomplete help page says that Google uses "a narrow set of removal policies for pornography, violence, hate speech, and terms that are frequently used to find content that infringes copyrights."

Google doesn't suggest words in Autocomplete like "bomb," "porn" and "torrent" (a term used when searching for file-sharing websites), no matter how popular they may be. Bing and Yahoo have differing criteria, but also block certain suggestions.
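Conceptually, such a removal policy can sit as a simple filter between the popularity ranking and the suggestion box. The sketch below is again only an illustration: the blocked terms are the examples named above, and the filtering logic is an assumption, since the companies do not disclose their actual criteria.

```python
# Hypothetical illustration of a removal policy applied to autocomplete:
# candidate suggestions are screened against a blocklist before display.
# Blocked terms and helper are assumptions, not Google's real rules.
BLOCKED_TERMS = {"bomb", "porn", "torrent"}  # examples named in the article

def filter_suggestions(candidates):
    """Drop any candidate suggestion containing a blocked term, however popular."""
    return [s for s in candidates
            if not any(term in s.lower().split() for term in BLOCKED_TERMS)]

print(filter_suggestions(["iphone 5", "iphone torrent", "iphone case"]))
# -> ['iphone 5', 'iphone case']
```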

Search engine providers evidently recognize that it can be problematic if they suggest certain terms to their users, and they have taken appropriate action to ensure that this does not happen. But they do not disclose the criteria they use to make those decisions.

Why, for instance, should the interests of the film industry, which wants to avoid suggesting that anyone look for pirated copies of their works, outweigh the personal rights of Ms. Wulff?

On the other hand, the suggestions made by the Google algorithm are often in the interests of users when they are not in the interests of the individuals or companies concerned. For example, a service provider who used dubious methods to convince companies to purchase high-priced listings in a business directory sued Google because his name came up in connection with search suggestions such as "rip-off" and "fraud." A regional appeals court in Munich ruled that the suggestions did not represent Google's own content, but rather "third-party content, namely search requests from previous users of the search engine," and dismissed the claim. Google has lost similar cases, however, in France and Italy.

A Complex Dilemma

The current conflict resembles the legal battle between Max Mosley and Google. The former president of FIA, the governing body for world motor sport, wants Google to stop making it possible to locate illegal photos of him online. He wants them to be automatically filtered out of the search results.

This case alone poses a complex dilemma: would it be good if Google had to monitor its search results -- good for the company, for those concerned, or for society? On its own initiative, the Google subsidiary YouTube has decided to block links in Egypt and Libya to the Islamophobic film "Innocence of Muslims," which has sparked outrage and violence in the Islamic world.

The debate over the Autocomplete function is based on the same fundamental conflict. In fact, the function exacerbates the conflict because individual users do not actively have to search for possibly legally contentious content. Instead, Google brings it to their attention.

When a passive search engine morphs into an active suggestion generator, Google's already controversial role as a medium for perceiving and determining reality becomes even more complex. It is hard to imagine a law or legal decision that would provide a fair and practicable solution to these conflicts.

At the same time, the problems that the Autocomplete function creates for those concerned, and ultimately also for Google, appear out of all proportion to the advantages it offers users: a little convenience, speed and a "rest for the fingers."

Google could simply discontinue this feature, without seriously compromising its functionality as a search engine. But refusing to do so is, of course, also a matter of principle.

Translated from the German by Paul Cohen