AI writing is not a calculator. So what is a good analogy?
This post doesn’t go where you think it does
The analogy of AI bots, such as ChatGPT or Claude, as a calculator has gotten a lot of traction. I’ve seen technophiles use it extensively. It’s clearly a bad, terrible, awful, no-good analogy.
There is a massive mismatch between AI bots and calculators.
Calculators yield answers consistently, as this blog post shows; it superbly critiques the very analogy I’m discussing. (Seriously, though, read that blog post. It’s excellent.) Calculators are built on hard numbers. A calculator will never give you different answers to the same equation. More importantly, every calculator made by any company will give you the same answer to a math problem, whether it comes from Google or Sony or Texas Instruments.
But AI bots are NOTHING like this. They use numbers, but they are not built on numbers. They are built on texts, words, and images. They have monstrously large databases.
AI bots are built using probabilistic statistics to predict text. The results they yield, consequently, can be wildly different. Qualitatively different. This means that ChatGPT or Claude or DeepSeek can give you different answers depending on…just about anything. One minor word difference yields a totally different answer. You can even prompt these bots to try again, yielding, yet again, different results. But more to my point, the people at these companies have a say in what the models predict. Each company is trying to build its own “special sauce,” like how Claude is better for creative writing while ChatGPT is better at programming.
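To make the contrast concrete, here is a toy sketch (not any real model’s code): a calculator function is deterministic, while a language model samples its next word from a probability distribution, so the same prompt can produce different output on different runs. The word list and probabilities below are invented for illustration.

```python
import random

# A calculator is deterministic: the same input always gives the same output.
def calculator(a, b):
    return a + b

# A language model, crudely sketched: it predicts the next word by sampling
# from a probability distribution, so repeated runs can differ.
def toy_model(prompt, rng):
    # Hypothetical next-word probabilities, made up for this example.
    next_words = {"different": 0.4, "probabilistic": 0.35, "surprising": 0.25}
    word = rng.choices(list(next_words), weights=list(next_words.values()), k=1)[0]
    return prompt + " " + word

print(calculator(2, 2))                              # always 4
print(toy_model("AI output is", random.Random()))    # varies run to run
```

Real chatbots work at a vastly larger scale, of course, but the asymmetry is the same: one machine computes, the other predicts.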
In other words, each company’s model can yield different results, largely based on proprietary secrets. Let’s try the calculator analogy now.
Imagine: Every K-12 kid brings in their own AI tool to school, with teachers having no idea how any of these tools actually execute tasks or, more importantly, having no way to discern why the tools gave the answers they did. Sometimes the answers are good. Sometimes bad. Sometimes, weird. Most are boring, uninspiring. We would never allow it, precisely because these tools aren’t calculators. The calculator analogy is wrong.
A better analogy: Having AI tools write for you is like going to the gym but having a gym buddy lift the weights for you. When you go to the gym and lift weights, it’s difficult and tiring, but it results in muscle growth. Having someone else lift for you won’t produce that growth. The weight still gets lifted, but you didn’t have anything to do with it. Of course, I can use the atrophy comparison here: If you don’t use the muscle, you lose it. If you don’t write yourself, you lose the craft of writing.
More to the point: The person isn’t weightlifting at all. The gym buddy is. This is the problem of using these AI tools to write for you. They’re doing the writing for you.
But let’s trace this analogy out a bit more. What if the gym buddy didn’t lift the weights for you but instead spotted you? Or coached you? This is why I like the gym-buddy analogy better. AI tools could be a great coach or an excellent spotter. But we’re using these tools all wrong because of who makes them.
We’re using the technologies in the transactional way that corporations prescribe. AI tools are infused with a business model of efficiency over community. Writing is reduced to nothing more than documentation. We don’t live our lives like this, not ideally at least. No one wants “Turned a tidy profit” engraved on their tombstone. They don’t want “I wrote a bunch of documents no one ever read” in their obituary. Or worse, “I answered all the emails on time.”
I can’t help but see the corporate transactional approach as an outgrowth of the COVID lockdowns. It’s true that COVID and AI chatbots emerged at the same time. But I think the connection runs deeper. COVID lockdowns stripped society down to its barest bones. The bare minimum was accepted for work and school. But eventually “just good enough” is not good enough anymore. When you strip down things to their most transactional, most efficient, bare minimum, you strip down life to a hollow simulation.
I can’t help but feel that AI emerged precisely when we needed it least. COVID left gaping holes in our social infrastructure. We needed to rebuild the lost relationships. To rekindle communities displaced and dispersed. But instead, we got AI simulations. Writing and images that are “just good enough.” But eventually, just good enough is not good enough. When and where it becomes not good enough is not clear, which is what so many businesses and technology companies are exploiting. They’re hoping to replace lost relationships with AI content.
Let me return to that gym buddy analogy again. During COVID, gyms shut down. A lot of people I know understandably turned to electronic apps as their gym buddy. But the best gym buddy is not a robot or an app or a spreadsheet or a tracking device. The best gym buddy is a human friend, because a gym buddy is more than a spotter. They are there to get something from working out too! They exercise. They want to be spotted and coached, too. Coaches get something out of coaching.
AI writing isn’t an actual audience that gets anything out of our writing. It’s a simulation of an audience. Machinic and algorithmic audiences are a real thing, no doubt. (This approach has its uses, as I’ve argued across multiple publications.) There’s nothing to be gained on the chatbot’s end.
Allow me to overextend the analogy a bit here. When you replace a human gym buddy with a robot gym buddy, you really don’t have a gym buddy anymore. You have something else. You’re doing something else entirely. It makes going to the gym, and exercise itself, less fulfilling and more of an obligation, a transaction.
When everything is a barter, everyone becomes a commodity.