Apple attributes iPhone voice-to-text mishap to bug that misinterprets ‘racist’ as ‘Trump’

Apple is addressing a problem some iPhone users have encountered with the voice-to-text feature: when they say the word “racist,” Dictation briefly types out “Trump” before correcting itself to “racist.” The issue gained attention after a TikTok user demonstrated the glitch in a video that went viral last week, prompting confusion and outrage online from conservative commentators, including well-known figures like Alex Jones.
Several NBC News reporters confirmed that they could replicate the behavior on different iPhones: when they said “racist” using voice dictation, the text momentarily showed “Trump” before quickly updating to the correct word. Not every attempt produced the glitch, however.
An Apple representative acknowledged the problem in a statement: “We are aware of an issue with the speech recognition model that powers Dictation, and we are rolling out a fix today.” The company explained that the speech recognition model can initially display a word that sounds similar to the one spoken before correcting itself to the intended word. In this case, the glitch had been incorrectly suggesting “Trump” when users said words containing the letter “r.”
This isn’t the first time a major tech company has faced accusations of political bias. Last month, Meta, the parent company of Facebook and Instagram, came under fire when users claimed its platforms were unfairly promoting Donald Trump and his administration after he took office. Some users were alarmed to discover that they were automatically following Trump’s page as well as that of Vice President JD Vance. Meta responded that this was standard practice during presidential transitions.
Meta also fielded complaints about Instagram after users noticed problems when searching for the hashtag #democrat, leading some to believe the platform was blocking results. The company said the problem affected multiple hashtags across the site, not just those tied to left-leaning content.
In September, Amazon faced its own controversy involving Alexa, its virtual assistant. Users noticed that when asked for reasons to vote for Trump, Alexa declined, saying it would not promote any specific political party or candidate, yet when asked the same about Democratic candidate Kamala Harris, it often provided detailed reasons to support her candidacy.
These incidents underscore ongoing debates about how technology shapes the political landscape and the challenges companies face in managing their platforms fairly and without bias.