Hacker News | new | past | comments | ask | show | jobs | submit
End of hallucinations? How Vancouver AI firms achieve accuracy (biv.com)
2 points by ClearwayLaw 75 days ago | hide | past | favorite | 2 comments


Ventures increasingly train AI agents on retrieval-augmented generation (RAG) systems that containerize data in small, curated data sets
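The RAG approach the article describes can be sketched minimally: retrieve the most relevant passages from a small, contained corpus, then force the model to answer only from that retrieved context. This is an illustrative sketch, not the firms' actual pipeline; the keyword-overlap retriever and the `build_prompt` helper are assumptions for demonstration, and real systems typically use vector embeddings instead.

```python
# Minimal RAG sketch (hypothetical, for illustration): keyword-overlap
# retrieval over a small contained corpus, plus a grounded prompt.
import re
from collections import Counter

documents = [
    "Vancouver AI firms constrain models to curated, containerized data sets.",
    "Retrieval augmented generation grounds answers in retrieved passages.",
    "Hallucination drops when the model must cite retrieved source text.",
]

def tokens(text):
    # Lowercase word tokens, punctuation stripped.
    return Counter(re.findall(r"\w+", text.lower()))

def score(query, doc):
    # Count of word tokens shared between query and document.
    return sum((tokens(query) & tokens(doc)).values())

def retrieve(query, docs, k=2):
    # Return the top-k documents ranked by keyword overlap with the query.
    return sorted(docs, key=lambda d: score(query, d), reverse=True)[:k]

def build_prompt(query, docs):
    # Restricting the model to retrieved context is the mechanism
    # that curbs hallucination in a RAG setup.
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

query = "How does retrieval augmented generation reduce hallucination?"
print(build_prompt(query, documents))
```

The prompt built here would then be passed to a language model; because the corpus is small and contained, every answer can be traced back to a retrieved passage.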


It's not that difficult, really.

Considering that AI never hallucinated in the first place.

It basically fucks up and squirts out shit.

It's like putting too much animal feed in a cow's mouth and waiting at the other end with a bucket.

"Hallucinate" is a made-up word for the stuff you eventually get in your bucket.

Excuse me for two minutes while I pop to my toilet to hallucinate a big turd.



