Researchers Say AI Tool Used in Hospitals Invents Things No One Ever Said

AmiMoJo shares a report: Tech behemoth OpenAI has touted its artificial-intelligence-powered transcription tool Whisper as having near "human level robustness and accuracy." But Whisper has a major flaw: it is prone to making up chunks of text or even entire sentences, according to interviews with more than a dozen software engineers, developers, and academic researchers. Those experts said some of the invented text, known in the industry as hallucinations, can include racial commentary, violent rhetoric, and even imagined medical treatments. Experts said such fabrications are problematic because Whisper is being used in a slew of industries worldwide to translate and transcribe interviews, generate text in popular consumer technologies, and create subtitles for videos. [...] It is impossible to compare Nabla's AI-generated transcript to the original recording because Nabla's tool erases the original audio for "data safety reasons," said Nabla's chief technology officer, Martin Raison.

Read more at Slashdot: https://tech.slashdot.org/story/24/10/28/1510255/researchers-say-ai-tool-used-in-hospitals-invents-things-no-one-ever-said