FACT FOCUS: Google autocomplete results around Trump lead to claims of election interference

With fewer than 100 days until the 2024 election, social media users are claiming that a lack of Google autocomplete results about former President Donald Trump and his attempted assassination is evidence of election interference.

Many posts include screenshots showing what the autocomplete feature, which predicts what users are trying to type, has generated for text such as “attempted assassination of tr” or “president donald.” Among the pictured results for the former phrase are references to other assassination attempts, including those of Harry Truman and Gerald Ford, but nothing for Trump. The latter provides two options — “president donald duck” and “president donald regan.”

Multiple high-profile figures, including Trump and sitting members of Congress, promoted the claim across social media platforms, collectively amassing more than 1 million likes and shares by Tuesday. Trump did not immediately respond to a request for comment.

Google attributed the situation to existing protections against autocomplete predictions associated with political violence, noting that “no manual action was taken” to suppress information about Trump.

Search engine experts said there are many reasons that could explain why some autocomplete results concerning the former president were not appearing.

Here’s a closer look at the facts.

CLAIM: Google is engaging in election interference by censoring autocomplete results about former President Donald Trump, including the assassination attempt at his Pennsylvania rally on July 13.

THE FACTS: It is true that Google’s autocomplete feature as of Monday was not finishing certain phrases related to Trump and the assassination attempt as shown in screenshots spreading online, but there is no evidence it was related to election interference.

By Tuesday, some of the same terms were providing relevant autocomplete results. The text “president donald” now also suggests “Donald Trump” as a search option. Similarly, the phrase “attempted assassination of” includes Trump’s name in autocomplete predictions. Adding “tr” to the same phrase, though, makes the option disappear.

Full searches about Trump and the assassination attempt, conducted on both Monday and Tuesday, yielded extensive relevant results regardless of what autocomplete predictions came up.

Google told the AP that its autocomplete feature has automated protections regarding violent topics, including for searches about theoretical assassination attempts. The company further explained that its systems were out of date even prior to July 13, meaning that the protections already in place couldn’t take into account that an actual assassination attempt had occurred.

Additional autocomplete results now appearing about Trump are the result of systemic improvements — rather than targeted manual fixes — that will affect many other topics, according to the company.

“We’re rolling out improvements to our Autocomplete systems to show more up-to-date predictions,” Google told The Associated Press in a statement. “The issues are beginning to resolve, and we’ll continue to make improvements as needed. As always, predictions change over time and there may be some imperfections. Autocomplete helps save people time, but they can always search for whatever they want, and we will continue to connect them with helpful information.”

Search engine experts told the AP that they don’t see evidence of suspicious activities on Google’s part and that there are plenty of other reasons that could explain the lack of autocomplete predictions about Trump.

“It’s very plausible that there’s nothing nefarious here, that it’s other systems that are set up for neutral or good purposes that are causing these query suggestions to not show up,” said Michael Ekstrand, an assistant professor at Drexel University who studies AI-powered information access systems. “I don’t have a reason not to believe Google’s claim that this is just normal systems for other purposes, particularly around political violence.”

Thorsten Joachims, a professor at Cornell University who researches machine learning for search engines, explained that autocomplete tools typically work by looking at queries people make frequently over a certain period of time, providing the most frequent completions of those queries. Beyond that, a search engine may automatically prune predictions based on concerns such as safety and privacy.
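The mechanism Joachims describes — counting which queries appear frequently over some window, suggesting the most frequent completions of a prefix, and then pruning flagged ones — can be sketched in a few lines. This is a toy illustration, not Google’s actual system; the query log, blocklist, and function names are all hypothetical:

```python
from collections import Counter

# Hypothetical query log from a recent time window (real systems
# aggregate billions of queries and update on a slower cadence).
query_log = [
    "weather today", "weather tomorrow", "weather today",
    "weather radar", "weather today",
]

# Stand-in for automated safety/privacy pruning: completions on
# this list are never suggested, no matter how frequent they are.
blocklist = {"weather radar"}

def autocomplete(prefix, log, blocked, k=2):
    """Suggest up to k of the most frequent completions of a prefix,
    skipping anything on the blocklist."""
    counts = Counter(q for q in log if q.startswith(prefix))
    candidates = [q for q, _ in counts.most_common() if q not in blocked]
    return candidates[:k]

print(autocomplete("weather", query_log, blocklist))
# → ['weather today', 'weather tomorrow']
```

The sketch shows why a brand-new event can produce no suggestion at all: if the counting window predates the event, the relevant query simply isn’t frequent enough to be a candidate, and a blocklist for violent topics can suppress it even when it is.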

This means that it’s plausible that Google’s autocomplete feature wouldn’t have accounted for recent searches about the assassination attempt on Trump, especially if its systems indeed had not been updated since before the shooting.

“Depending on how big the window is that they’re averaging over, that may simply not be a frequent query,” Joachims said. “And it may not be a candidate for autocompletion.” He added that it’s typical not to update a search model on a daily basis, given the costs and technical risks involved.

A 2020 Google blog post about its autocomplete feature describes how the system reflects previous searches and why users might not see certain predictions, including those that are violent in nature. The post also explains that predictions may vary based on variables such as a user’s location, the language they speak or rising interest in a topic.

Both Ekstrand and Joachims agreed that proving bias in a complex system like Google’s search engine from the outside would be extremely difficult. It would require much more data than just a couple of searches, for example, and would risk setting off the company’s protections against data scraping, reverse engineering and fraud.

“In general, claims that platforms are taking particular targeted actions against specific people on political bases are hard to substantiate,” Ekstrand said. “They sometimes, I’m sure, happen, but there’s so many other explanations that it’s difficult to substantiate such claims.”

Joachims noted that the demographics of Google’s user base could skew the results of such a study if users leaned toward one side of the political aisle or the other and therefore searched more for their preferred candidates. In other words, the way the system works makes it difficult to probe from the outside.

Technical issues aside, limiting autocomplete predictions as a method of political influence could simply be bad for business.

“Even if Google would like to do that, I think it would be a very bad decision because they could lose a lot of users,” said Ricardo Baeza-Yates, a professor at Northeastern University whose research includes web search and information retrieval.

___

Find AP Fact Checks here: https://apnews.com/APFactCheck.
