The problem is that you need to check those sources to make sure it’s not just making up bullshit, and at that point you didn’t gain anything from the GenAI.
As I said, the links provide some entry points for further research. It’s providing some use to me because I don’t need to check every search result. But to each their own, and I understand the general scepticism of generative “AI”.
If you don’t check every source, it might just be bullshitting you. There are people who followed your approach and got into hot shit with their bosses and judges.
There is absolutely value in something compiling sources for you to personally review. Anyone who cannot use AI efficiently is analogous to someone who can’t see the utility in a graphing calculator. It’s not magic, it’s a tool. And tools need to be used precisely, and for appropriate purposes.
If my plumber fucks up, I don’t blame his wrench. If my lawyers don’t vet their casework, I blame them.
It’s an LLM. Odds are it’s hallucinating the sources and they don’t even exist.
Know what does compile sources for you that are guaranteed to exist and be related to what you’re looking for…? A good old non-LLM-infected search engine.
If my plumber replaces their wrench with a rabid gerbil, claiming it’ll be just as good, I’m definitely changing plumbers.
Spoken like someone who never even tried to use an LLM and just parrots the bad things they hear online.
Lemmy is full of LLM haters, I get where they’re coming from but they take it to the extreme every single time.
I’m not an LLM hater. I run one of the biggest FOSS GenAI services. It’s because of that that I know their limitations.
You said that you’re not going to check every search result, which implies you’re not checking every cited source either, which will eventually lead you to believe some LLM bullshit. And if you’re using an LLM just to compile sources that you then check yourself, it’s no different from a search engine without an LLM.
I’m a different person than the one you were replying to. I know that LLMs do hallucinate/lie/make up sources, but you can quickly check if the sources are real.
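That kind of quick check can even be scripted. A minimal sketch (my own illustration, not anything from this thread; it assumes the cited sources are plain URLs) using only the Python standard library:

```python
# Sketch: given source URLs an LLM cited, check each is well-formed and,
# optionally, that the page actually responds. Note this only confirms the
# page EXISTS -- not that it supports the claim it was cited for.
from urllib.parse import urlparse
from urllib.request import Request, urlopen


def looks_like_url(source: str) -> bool:
    """Cheap offline sanity check: http(s) scheme and a host are present."""
    parts = urlparse(source)
    return parts.scheme in ("http", "https") and bool(parts.netloc)


def source_resolves(source: str, timeout: float = 5.0) -> bool:
    """Network check: does the cited page respond with a non-error status?"""
    if not looks_like_url(source):
        return False
    try:
        req = Request(source, method="HEAD")  # HEAD avoids downloading the body
        with urlopen(req, timeout=timeout) as resp:
            return resp.status < 400
    except OSError:
        # Covers DNS failures, refused connections, timeouts, and HTTP errors
        # (URLError/HTTPError are OSError subclasses).
        return False
```

This catches fully hallucinated links quickly, but as the next reply points out, a real URL is still no guarantee the model actually used it.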
The problem is that even if the sources are real, it’s no guarantee that the LLM even used them
Aren’t the sources the same as the search results? Or at least the top results?