
Hong Kong court finds Google liable for defamation via auto-complete suggestions


Things are getting interesting for Google on the legal front. Not long after the ‘right to be forgotten’ ruling and the messy fallout from that, a Hong Kong court has ruled that the company is responsible for auto-complete suggestions when they could be considered defamatory.

MyBroadband (via The Loop) reports:

A Hong Kong court has ruled that a local tycoon can sue Google Inc for defamation because searches for his name on Google suggest adding the word ‘triad’, a reference to Hong Kong’s notorious organized crime groups.

Searches in both English and Chinese for Albert Yeung Sau-shing, the founder and chairman of Hong Kong-based conglomerate Emperor Group, will automatically suggest phrases related to organized crime using Google’s ‘autocomplete’ function.

On Tuesday, the High Court of Hong Kong dismissed Google’s argument that it was not responsible for the autocomplete suggestions related to Yeung and that the court did not have personal jurisdiction over the U.S. search giant … 

Similar rulings have been made in Italy and Japan, with other cases brought in Germany and France – including one by Germany’s former first lady, which was rejected.

It’s an interesting decision because, as Google explains on its support pages, auto-complete suggestions are generated automatically based on user activity.

Autocomplete predictions are automatically generated by an algorithm without any human involvement based on a number of objective factors, including how often past users have searched for a term.

Our algorithm is designed to reflect the diversity of our users’ searches and content on the web. Just like the web, the search terms shown may seem silly, strange, or surprising. The algorithm automatically detects and excludes a small set of search terms for things like pornography, violence, hate speech, illegal and dangerous things, and terms that are frequently used to find content that violates copyrights.

So a search term only makes it to the auto-complete list when enough people carry out that search.
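The mechanism Google describes — popularity-ranked completions with a blocklist layered on top — can be sketched in a few lines. This is purely an illustration under assumptions; the blocklist terms, the minimum-search threshold, and the function names here are all hypothetical, not Google's actual implementation.

```python
from collections import Counter

# Hypothetical blocklist and threshold, for illustration only.
BLOCKLIST = {"fraud"}   # terms excluded regardless of popularity
MIN_SEARCHES = 3        # a completion needs this many past searches to appear

def suggestions(query_log, prefix, limit=5):
    """Return popular completions for `prefix`, most-searched first,
    skipping any query that contains a blocklisted term."""
    counts = Counter(q for q in query_log if q.startswith(prefix))
    eligible = [
        (q, n) for q, n in counts.items()
        if n >= MIN_SEARCHES and not any(b in q for b in BLOCKLIST)
    ]
    # Sort by descending popularity, then alphabetically for stable ties.
    eligible.sort(key=lambda item: (-item[1], item[0]))
    return [q for q, _ in eligible[:limit]]
```

The point of the sketch is that the filtering step is just one extra condition on an already-ranked list — which is why Google's existing pornography and hate-speech filters show the technical capability is there; the dispute is about who decides what goes on the blocklist.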

Google’s statement that it already filters out things like pornography makes clear that it has the technical capability to exclude derogatory terms too. But auto-complete suggestions play a valuable role in searches, and asking Google to police which suggestions should be allowed through and which should be blocked places the company in the same ridiculous situation it faces in the ‘right to be forgotten’ mess. Buy shares in Google’s lawyers now …


You’re reading 9to5Google — experts who break news about Google and its surrounding ecosystem, day after day.