

Yeah, totally. It's not even "hallucinating sometimes"; it's fundamentally throwing characters together that just happen to be true and/or useful some of the time. That's why I dislike the "hallucination" terminology: it implies the thing otherwise knows what it's doing. Still, it's interesting that a command like "but do it better" sometimes 'helps'. E.g. "now fix a bug in your output" will probably work occasionally. "Don't lie" is never going to work with LLMs, though (afaik).


I think it was an airplane air inlet duct that melted and collapsed. And it was bought from a 3D-printing supplier, not printed by the owner. The person aboard survived. So the failure was more subtle, which makes it even more insidious: even for a simple plastic tube you need the expensive part, for non-obvious reasons.