ChatGPT keeps hallucinating—and that’s bad for your privacy


About a year after a temporary ban in Italy triggered a spike in VPN service downloads, OpenAI faces trouble in the European Union once again. The culprit this time? ChatGPT's hallucination problem.

The popular AI chatbot is infamous for making up false information about individuals, something experts say OpenAI is admittedly unable to fix or control. That's why Austria-based digital rights group Noyb (stylized as noyb, short for "none of your business") filed a complaint with the country's data protection authority on April 29, 2024, alleging that this behavior breaks GDPR rules.
