“AI”: A Dedicated Fact-Failing Machine, or, Yet Another Reason Not to Trust It For Anything
Dec. 13th, 2025 05:27 pm

I search my name on a regular basis, not only because I am an ego monster (although I don’t pretend that I’m not) but because it’s a good way for me to find reviews, end-of-the-year “best of” lists my book might be on, foreign publication release dates, and other information about my work that I might not otherwise see, and which is useful for me to keep tabs on. In one of those searches I found that Grok (the “AI” of X) attributed to one of my books (The Consuming Fire) a dedication I did not write; not only have I definitively never dedicated a book to the characters of Frozen, I also do not have multiple children, just the one.
Why did Grok misattribute the quote? Well, because nearly all consumer-facing “AI” are essentially “fancy autocomplete,” designed to find the next likely word rather than offer factual accuracy. “AI” is not actually either intelligent or conscious, and doesn’t know when it’s offering bad information; it just runs its processes and gives a statistically likely answer, which may very well be factually wrong. “Statistically likely” does not equal “correct.”
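(A small aside for the programmers in the audience: here’s what “pick the statistically likely next word” looks like in miniature. This is a toy sketch of my own, not any actual product’s code; the probability table and the pick_next function are invented purely for illustration.)

```python
# A toy version of "fancy autocomplete." The probability table is completely
# invented for this illustration; real models learn billions of such
# statistics from text, but the principle is the same: pick the statistically
# likely next word, with no check on whether the result is true.

NEXT_WORD_PROBS = {
    "the":       {"book": 0.6, "dedication": 0.4},
    "book":      {"is": 0.7, "was": 0.3},
    "is":        {"dedicated": 0.8, "about": 0.2},
    "dedicated": {"to": 1.0},
    "to":        {"his": 0.5, "my": 0.3, "Krissy": 0.2},  # "his" wins on raw frequency
    "his":       {"wife": 0.6, "daughter": 0.4},
}

def pick_next(word):
    """Return the most probable next word, greedy-autocomplete style."""
    options = NEXT_WORD_PROBS.get(word)
    return max(options, key=options.get) if options else None

sentence = ["the"]
while (nxt := pick_next(sentence[-1])) is not None:
    sentence.append(nxt)

# Fluent, statistically likely, and never fact-checked:
# prints "the book is dedicated to his wife"
print(" ".join(sentence))
```

The output is a perfectly grammatical sentence, and nothing in the process ever asked whether it was true. That’s the whole problem, scaled down to a dozen lines.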
Still, I was curious to whom other “AI” would claim I had dedicated The Consuming Fire. So I asked. Here’s the answer Google gave me in its search page “AI Overview”:

I do have a daughter, but she would be very surprised to learn that, after nearly 27 years of being called “Athena,” her name was “Corbin.” I mean, Krissy and I enjoy The Fifth Element, but not that much. Also, I did not dedicate the book to my daughter, under any name.
Here’s Copilot, Microsoft’s “AI”:

I have indeed dedicated (or co-dedicated) several books to Krissy, and I’m glad that Copilot did not believe my spouse’s name was “Leeloo.” But in fact I did not dedicate The Consuming Fire to Krissy.
How did ChatGPT fare? Poorly:

I know at least a couple of people named Corey, and a couple named Cory, but I didn’t dedicate The Consuming Fire to any of them. Also, note that ChatGPT not only misattributed to whom I dedicated the book, it also entirely fabricated the dedication itself. I didn’t ask for the text of the dedication, so ChatGPT voluntarily went out of its way to add extra erroneous information to the mix. Which is… a choice!
I also asked Claude, the “AI” of Anthropic, and to its (and/or Anthropic’s) credit, it was the only “AI” of the batch that did not confidently squirt out an incorrect answer. It admitted it did not have reliable information on the question, undertook a few web searches to try to find it, and eventually told me it could not find the answer, offering advice instead on how I could find the information myself. (For the record, the information is findable online; I found it by going to Amazon and searching the book excerpt there.) So good on Claude for knowing what it doesn’t know and admitting it.

Interestingly, when I went to Grok directly and asked to whom the book was dedicated, it also said it couldn’t find that information. When I asked it why a different instance of itself had incorrectly attributed a different dedication to the book, it more or less shrugged, offering what I took to be the equivalent of “dude, it happens.” I also checked Gemini directly (which, as I understand it, powers Google’s Search “AI Overview”) to see if it would likewise say “I can’t find that information.” Nope:

I’m sure this comes as a surprise to both Ms. Rusch and Mr. Smith, who are (at least on my side) collegial acquaintances but not people I would dedicate a book to. And indeed I did not. When I informed Gemini it had gotten it wrong, it apologized, misattributed The Consuming Fire to another author (C. Robert Cargill, who writes great stuff, just not this), and suggested that he dedicated the book to his wife (he did not) and that her name was “Carly” (it is not).
(I also informed Copilot that it had gotten the dedication wrong, and it also tried again, asserting I dedicated it to Athena. I’m glad Copilot got the name of my kid right, but as previously stated, The Consuming Fire is not dedicated to her.)
So: Five different “AI” and two iterations of two of them, and only Claude would not, at any point, offer up incorrect information about the dedication in The Consuming Fire. Which I will note does not get Claude off the hook for hallucinating information. It has done so before when I’ve queried it about things relating to me, and I’m pretty confident I can get it to do it again. But in this one instance, it did not.
None of them, not even Claude, got the information correct (which is different from “offered up incorrect information”). Two of them, when informed they were incorrect, “corrected” by offering even more incorrect information.
I’ve said this before and I will say it again: I ask “AI” things about me all the time, because I know what the actual answer is, and “AI” will consistently and confidently get those things wrong. If I can’t trust it to get right the things I know, I cannot trust it to get right the things I do not know.
Just to make sure this confident misstating of dedication facts was not personal, I picked a random book not by me off my shelf and asked Gemini (which was still open in my browser) to name to whom the book was dedicated.

It certainly feels like Richard Kadrey might dedicate a book in the Sandman Slim series to the lead singer of The Cramps, but in fact Aloha From Hell is not dedicated to him.
Let’s try another:

Daniel H. Wilson’s Robopocalypse may be dedicated to his wife, but if it is, her name is not “Kellie,” as that is not the name in the dedication.
Let’s see if the third time’s the charm:

It’s more accurate to say this was a third strike for Gemini, as G. Willow Wilson did not dedicate Alif the Unseen to a Hasan; she dedicated it to her daughter, whose name is not Hasan.
So it’s not just me: “AI” gets other books’ dedications wrong too, and (at least here) consistently so. These dedications are actual, known facts anyone can ascertain; you can literally just crack open a book to see to whom it is dedicated. And yet these facts are being gotten wrong, consistently and repeatedly, by “AI.” Again, think about all the things “AI” could be getting wrong that you won’t have the wherewithal to check.
What do we learn from this?
One: Don’t use “AI” as a search engine. You’ll get bad information and you might not even know.
Two: Don’t trust “AI” to offer you facts. When it doesn’t know something, it will frequently offer you confidently stated incorrect information, because it’s a statistical engine, not a fact-checker.
Three: Inasmuch as you are going to have to double-check every “fact” that “AI” provides to you, why not eliminate the middleman and just not use “AI”? It’s not decreasing your workload here; it’s adding to it.
Does “AI” have uses? Possibly, just not this one. I don’t blame “AI” for any of this; it’s not those programs’ fault that the people who own and market them, and who know they are statistical matching engines, willfully and, bluntly, deceitfully position them as other things. You don’t blame an electric bread maker when some fool declares that it’s an excellent air filter. But you shouldn’t use it as an air filter, no matter how many billions of dollars are being spent to convince you of its air-filtering acumen. Use an actual air filter, damn it.
I dedicate this essay to everyone out there who will take these lessons to heart and not trust “AI” to tell you things. You are the real ones. And that’s a fact.
— JS

