AI medical tools found to downplay symptoms of women, ethnic minorities
by Melissa Heikkilä, Financial Times
Artificial intelligence tools used by doctors risk leading to worse health outcomes for women and ethnic minorities, as a growing body of research shows that many large language models downplay the symptoms of these patients.
A series of recent studies has found that the uptake of AI models across the healthcare sector could lead to biased medical decisions, reinforcing patterns of under-treatment that already exist across different groups in Western societies.
The findings by researchers at leading US and UK universities suggest that medical AI tools powered by LLMs tend to understate the severity of symptoms among female patients, while also displaying less "empathy" toward Black and Asian ones.