Concerns have been raised about OpenAI's Whisper, a popular AI transcription tool used by hospitals and medical institutions. An investigation found that Whisper can hallucinate, inserting fabricated text into medical transcripts, which could lead to inaccurate patient records and compromised care. Despite OpenAI's warnings against using Whisper in high-risk domains, over 30,000 medical workers currently use Whisper-based transcription tools. Source: https://arstechnica.com/ai/2024/10/hospitals-adopt-error-prone-ai-transcription-tools-despite-warnings/