>>724681413
nta but I don't really see how he's wrong. Researching is a skill in and of itself, and like any skill it will deteriorate if you don't use it enough or at all. It was already becoming a problem at least as early as the creation of search engines, when people developed the habit of just trusting the first result to be the best one and rolling with it. Now we have "AI summaries" of search results which are frequently factually incorrect while confidently linking you to an alleged source article. The link SHOULD be there of course, but its mere presence actually increases people's confidence in the bad information they're given and makes them more likely to take it at face value without bothering to actually look at the source.
Just in my own line of work I've had tons of clients get enormously confused and make huge mistakes based on bad information an LLM confidently gave them. You have to understand that while tons of people have some passing familiarity with AI at this point, most of them aren't dialed in to the "tech space" internet culture or keeping up with the capabilities, the controversies, etc.