Data poisoning is stupidly easy to do. Any idiot could massively degrade LLM outputs if they wanted to, and they just might.