Experienced researchers don't need a machine to output answer-shaped objects. LLMs can never do anything but string words together probabilistically. They don't synthesize new knowledge, and they can't be stopped from making up nonsense no matter how well curated the training data is. They're not even a good search engine.
Not for writing papers, just as research tools. But if they can't be good search engines even with proper controls in place, that throws the whole idea out the window.
Look at the number of lawyers getting sanctioned for false citations in their AI-researched filings. No, it's worse for research than for writing, because it creates plausible-looking results that require more vigilance and mental load to double-check than just doing the work yourself.