8 points | by doener 2 hours ago
3 comments
In a fair article there would be some attempt to compare it to the competitors. I'm willing to be convinced that Grok is worse, but this article doesn't try.
tl;dr LLMs hallucinate
No. Per Oxford Languages…
> (of an artificial intelligence program or tool)
> produce a response that appears to be accurate or plausible but that contains inaccurate or misleading information.
The examples cited in the article were, in my opinion, neither accurate nor plausible-sounding.
In this case, I would say, LLMs lie.