Apple AI tool misinterprets the word ‘racist’ as ‘Trump’

Apple is addressing an issue with its speech-to-text feature, known as Dictation, after some users on social media reported a strange problem: when they said the word “racist” into their iPhones, the tool sometimes typed it out as “Trump.” The unexpected behavior raised concerns among users and prompted Apple to investigate.
According to an Apple spokesperson, the company believes the problem stems from its speech recognition model having difficulty distinguishing words that contain the letter “r,” and said a fix would be rolled out right away. A speech recognition expert at the University of Edinburgh, Peter Bell, was skeptical of that explanation. He suggested it was more likely that the software behind Dictation had been deliberately altered than that this was a simple word-recognition glitch.
Users have shared videos online demonstrating the issue. In some, the word “racist” is transcribed accurately; in others, it briefly appears as “Trump” before being corrected. The BBC has not been able to reproduce the problem, which may indicate that Apple’s fix is already taking effect.
Professor Bell argued that Apple’s justification regarding phonetic similarities between “racist” and “Trump” does not hold up. He stated that the words are not close enough in pronunciation to confuse an artificial intelligence (AI) system. Speech recognition technology is usually trained by feeding it recordings of real people speaking along with accurate transcripts of their words. This training helps these systems recognize words in context. For example, the model can differentiate between “cup” and “cut” based on the phrases they are used in.
Prof. Bell pointed out that Apple’s speech recognition model should have been trained on hundreds of thousands of hours of English speech, giving it a high degree of accuracy. Therefore, it is unlikely that a problem with data is causing the issue. He noted that in less-studied languages, AI might struggle due to insufficient training, but in this case, he thought it might indicate tampering with the software.
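The context-based disambiguation Prof. Bell describes can be illustrated with a deliberately tiny sketch (nothing like Apple’s actual system): a recognizer choosing between acoustically similar candidates by scoring each against the preceding word, here using simple bigram counts from a made-up corpus in place of a real language model.

```python
from collections import Counter

# Hypothetical training transcripts (stand-in for real speech data).
corpus = (
    "pour the tea into the cup . "
    "she drank from the cup . "
    "he cut the rope . "
    "please cut the paper ."
).split()

# Count adjacent word pairs as a toy language model.
bigrams = Counter(zip(corpus, corpus[1:]))

def pick_word(prev_word, candidates):
    """Choose the candidate most often seen following prev_word."""
    return max(candidates, key=lambda w: bigrams[(prev_word, w)])

# Acoustically similar candidates, resolved by context:
print(pick_word("the", ["cup", "cut"]))  # → cup
print(pick_word("he", ["cup", "cut"]))   # → cut
```

With enough real training data, a model trained this way rarely confuses such pairs, which is why Bell finds a data problem an unlikely explanation for a language as well resourced as English.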
A former Apple employee, who previously worked on the Siri voice assistant, suggested to the New York Times that the situation looks like a “serious prank,” which raises further questions about the integrity of the software.
This incident is not the first time Apple has had to correct problems with its AI features. Recently, the company decided to pause its AI-generated summaries of news headlines after several media outlets, including the BBC, raised concerns about misleading notifications. One of the errors incorrectly reported that tennis player Rafael Nadal had come out as gay.
Despite these troubles, Apple continues to invest heavily in its technology. The company recently announced plans to invest $500 billion in the U.S. over the next four years, including the construction of a large data center in Texas aimed at boosting its AI capabilities. Additionally, Apple CEO Tim Cook mentioned that the company might need to reassess its policies on diversity, equity, and inclusion in light of remarks made by former President Donald Trump advocating for changes in these programs.
In summary, while Apple works on resolving the speech recognition issue, the incident highlights ongoing challenges in AI and speech technology and raises important questions about the responsibility of technology companies to address such problems.