AI Transcription “Whisper” Found to Invent False Statements
A recent Global News report examined the performance of Whisper, OpenAI’s AI-powered transcription tool, which is used in various industries, including some medical settings. The report found that Whisper frequently generates fabricated text—known in the industry as “hallucinations.” These hallucinations can include invented sentences, inappropriate commentary, and even imagined medical treatments.
According to the investigation, one machine learning engineer discovered hallucinations in roughly half of the more than 100 hours of Whisper transcripts he reviewed. Another developer reported finding hallucinations in nearly every one of the 26,000 transcripts he generated using the tool. These findings raise significant concerns about the reliability of AI transcription, especially in situations where accuracy is critical.
In Ontario, court transcripts must still be produced by a human Authorized Court Transcriptionist (ACT). However, some non-court settings—such as Landlord and Tenant Board hearings, police interviews, and body-worn camera footage—may use AI transcription tools. This creates a real risk: a fabricated sentence inserted into the record of a person’s testimony or statement could have serious consequences.
For this reason, I strongly recommend having your transcript produced by a qualified, licensed, and registered ACT. Human transcription ensures accuracy, confidentiality, and compliance with Ontario’s legal requirements—standards that AI tools cannot meet.
Source: Garance Burke and Hilke Schellmann, “AI-powered transcription tool used in hospitals reportedly invents things no one ever said,” Global News, October 26, 2024.