Another example of AI being sold at a capability level it has yet to achieve, and of how that overselling hurts us all
Reposted from PBS News
Many medical centers use an AI-powered tool called Whisper to transcribe patients’ interactions with their doctors. But researchers have found that it sometimes invents text, a phenomenon known in the industry as "hallucination," raising the possibility of errors such as misdiagnosis.