This is all true, with one key difference: search results (used to) point you to the actual source. LLMs answer you with that information as if they thought of it themselves, with no attribution. So at least search results offer some benefit to the sources of the indexed content.
I don’t know about all AI products, but I use the Copilot sidebar built into Edge for work and school questions, and it always provides citations to the source information. In fact, if I ask a question for school and add in the prompt to cite all sources with a reference in APA format, it gives me everything I need in the proper format.
Yeah, it’s useful, but double-check your sources and never hand in anything, even the citations, by just copying and pasting without scrutiny. It can make up all kinds of bullshit, pretend cited works say something when they don’t, etc.
You don’t want it to hallucinate you in front of an academic ethics committee. Again, I’m not against using it, but never base anything on stuff it says; only base things on primary sources it helped you find.
Fully agree. Honestly, it’s why I like the Copilot branding Microsoft used. It is a Copilot, not the Captain. You still need to be in control and verify and scrutinize.
That’s not the same. In that case Copilot is also doing a search. They’re talking about the model itself.