I love this, but you haven't "tricked" it. It is working as intended: regurgitating strings of tokens that statistically should come after one another given the input. You have swayed those probabilities by publishing your blog post.
It is an inherent flaw in LLMs. Garbage in, garbage out.
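For anyone unfamiliar, "tokens that statistically should come after one another" can be sketched with a toy bigram model. This is nothing like a real transformer, just the core idea: the model's output is whatever the training text makes most probable, so skewed input skews the output. The corpus and function names here are made up for illustration:

```python
# Toy next-token prediction from bigram counts (illustrative only).
from collections import Counter, defaultdict

corpus = "the cat sat on the mat the cat ran on the mat".split()

# Count which token follows which in the "training data".
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def next_token(prev):
    # The most frequent follower wins. Garbage (or a blog post) in
    # the corpus shifts these counts, so garbage comes back out.
    return follows[prev].most_common(1)[0][0]

print(next_token("the"))
```

Plant enough copies of a phrase in the corpus and `next_token` will dutifully repeat it; that's the whole "trick".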
February 18, 2026 - 20:01 UTC