I don't think these systems were designed to render people helpless, truly, but it's definitely a consequence of how people use them. I'm an adversarial researcher on these systems, and they are quite complex, even capable of a degree of synthesis in the right circumstances.
The purpose of a system is what it does. If it consistently does things like offloading cognitive load onto a computer's hallucinations and hurting its users, then that's part of its purpose.
That’s not the traditional definition of “purpose” nor how most people use it in practice. We have the terms “unintended consequences” or “side effects” to describe outputs of a system that were not part of the original intention.
Came here to disagree with the original post's "designed" terminology.

It's not designed to make the population helpless, that's just an important result of how people work.

ChatGPT is designed to create viable sentence structure that consequently seems like it answers questions.
Objectively, what it does is parse information via statistical inference to produce natural language text outputs.
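The "statistical inference to produce natural language text" point can be sketched as toy next-token sampling: score candidate tokens, turn the scores into probabilities with a softmax, and draw one at random. The vocabulary, logits, and function names here are invented for illustration and bear no relation to any real model's internals.

```python
import math
import random

def softmax(logits):
    """Convert raw scores into a probability distribution."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def sample_next_token(vocab, logits, rng=random.random):
    """Sample the next token in proportion to its softmax probability."""
    probs = softmax(logits)
    r = rng()
    cumulative = 0.0
    for token, p in zip(vocab, probs):
        cumulative += p
        if r < cumulative:
            return token
    return vocab[-1]  # guard against floating-point rounding

# Hypothetical scores for the continuation of "The capital of France is ..."
vocab = ["Paris", "London", "banana"]
logits = [4.0, 1.0, -2.0]  # the model happens to score "Paris" highest
print(sample_next_token(vocab, logits))
```

Nothing in this loop checks whether the chosen token is true; high-probability text simply tends to look like an answer, which is the point being made above.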

Hallucinations, true hallucinations, occur at a rate of less than 1% in my research experience.