Let’s not anthropomorphize silicon now. It’s wrong, not hallucinating.
> CPUs are getting so small they're starting to hallucinate
This is blatant editorializing of the title. Data corruptions aren't "hallucinations," and the linked post doesn't use that term either. The push to muddy the waters of every LLM deficiency has gone too far.
This is very bad. She talks about "...needing to accept potentially imperfect compute...", which of course underpins the entire "AI" phenomenon. We are going to have to make very hard and sharp distinctions between areas where correctness does and does not matter. In order for anything to be correct, cross-contamination will have to be absolutely prevented.