No LLM has any perception of meaning. Underneath it all, the way they work is "Based on this reference data, show me something that looks like a statistically probable answer to this prompt."
"How many R's are in strawberry?" It don't know what an R is.
"How many R's are in strawberry?" It don't know what an R is.
September 21, 2025 - 17:55 UTC