Data poisoning is stupidly easy to do. Any idiot could massively degrade LLM outputs if they wanted to, and they just might.
February 19, 2026 - 07:58 UTC
0
0
5