Many medical centers use an AI-powered transcription tool built on OpenAI's Whisper model to transcribe patients' interactions with their doctors. But researchers have found that it sometimes invents text, a phenomenon known in the industry as hallucination, raising the possibility of errors such as misdiagnosis. John Yang speaks with Associated Press global investigative reporter Garance Burke to learn more.