I think this is poorly written.

Because this 'intuition' is how cops come to think everyone is lying to them and how doctors forget to diagnose rare conditions.
This may be true, but the current model for AI is to aggregate advice from those same cops and doctors. So not only are you getting the same results, you're getting them through a system that erases their culpability.
To get better results you'd still have to train people to train the system. In the meantime you can just train employees directly, without burning down both the environment and the economy in the process.
That's yet another way LLMs can be deceiving, but most applications of AI are not LLMs.

I was just pointing out that this 'intuition' argument is literally how machine learning works, and that intuition (human or machine) develops quirks based on sampling error.
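A toy sketch of that point (the numbers and the base-rate "learner" here are illustrative, not anything from this thread): when a model's training sample under-represents rare cases, the learned "intuition" simply omits them, much like the doctor example above.

```python
import random

random.seed(0)

# Ground truth: a rare condition occurs in 5% of the population.
population = [1] * 50 + [0] * 950

# "Training data" is whoever happens to walk in the door: a small,
# skewed sample in which rare cases are badly under-represented.
sample = random.sample(population, 30)

# A learner that memorizes the base rate of its sample -- a minimal
# stand-in for machine "intuition".
learned_rate = sum(sample) / len(sample)

# With so few (or zero) rare cases seen, the learned intuition is to
# predict the rare condition never happens: a quirk of sampling error,
# not of the underlying reality.
def predict():
    return 1 if learned_rate > 0.5 else 0
```

Because the quirk comes from the sample, not the algorithm, aggregating many such learners trained on similarly skewed samples reproduces the same blind spot.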