When AI gets it wrong: ChatGPT ‘nearly kills’ woman by confidently misidentifying a deadly plant
January 28, 2026
A near-fatal incident has sparked fresh concerns about AI chatbots. YouTuber Kristi shared that ChatGPT repeatedly assured her best friend that a deadly poison hemlock plant was a harmless carrot, even after she provided multiple images of it. Kristi wrote, "Chat GPT NEARLY k*lled my best friend by telling her that POISON HEMLOCK was CARROT... its poison hemlock. Which there is NO antidote for and is EXTREMELY deadly."