It's the same error people make when they assume that, because an LLM succeeds at a task humans need their full mental powers to perform well, the LLM must match or exceed those human mental powers in every direction.
March 31, 2026 - 05:53 UTC