Um, is the information the chatbot returns the truth, or some kind of other thing?
It is not a truth or even a falsehood, because to be a falsehood something must be capable of being true, and truth requires intent.

Throwing Scrabble tiles on the ground cannot result in truth, even if the statements they spell out appear to be factually correct.
That's not true at all. True and false are binary concepts, mutually exclusive, and do not require intent.
It returns an answer-shaped object.
And is that answer-shaped object true, or is it false?
They're nondeterministic and just generate the most likely content given the previous context, so one time it could give you the truth and another time a hallucination, because there's no real difference to it.
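That sampling behavior can be shown with a minimal sketch. The token names and probabilities below are made up for illustration; the point is only that the model samples from a distribution and has no internal flag marking one continuation as "true".

```python
import random

# Toy next-token distribution for a context like "The capital of France is".
# Probabilities here are invented for illustration, not from any real model.
next_token_probs = {
    "Paris": 0.80,   # the factually correct continuation
    "Lyon": 0.15,    # a plausible hallucination
    "Berlin": 0.05,  # a less plausible hallucination
}

def sample_next_token(probs, rng):
    """Sample one token by probability; nothing marks any token as 'true'."""
    tokens = list(probs)
    weights = [probs[t] for t in tokens]
    return rng.choices(tokens, weights=weights, k=1)[0]

# Two runs over the same context can give different answers,
# even though nothing about the "question" changed.
run1 = sample_next_token(next_token_probs, random.Random(0))
run2 = sample_next_token(next_token_probs, random.Random(42))
print(run1, run2)
```

Under this sketch the output is factually correct most of the time simply because the correct token carries the most probability mass, not because truth plays any role in the sampling step.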
Right, so would this make the answers on average true, or false?
Some kind of other thing. It is output based on its training data, so pretty much a fancy version of the autocomplete on a mobile phone (before that became LLM-led).

If a topic is widely factual in the training data, the output is more likely to be factual.
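The autocomplete analogy can be sketched with a tiny frequency table. The corpus below is made up; it just shows how "widely factual in the training data" translates directly into "more likely in the output".

```python
from collections import Counter, defaultdict

# Toy "training data" (invented for illustration): the factual continuation
# appears more often than the wrong one.
corpus = (
    "the capital of france is paris . "
    "the capital of france is paris . "
    "the capital of france is lyon . "
).split()

# Phone-style autocomplete: count which word follows each word.
next_word_counts = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    next_word_counts[prev][nxt] += 1

def autocomplete(word):
    """Suggest the continuation seen most often in the corpus."""
    return next_word_counts[word].most_common(1)[0][0]

print(autocomplete("is"))  # "paris": it follows "is" twice, "lyon" only once
```

If the corpus had contained the wrong continuation more often, the same code would suggest it just as confidently, which is the point of the comment above.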