From diapers to death, Americans are raised to pursue consumer surplus: that sweet space between what we’re willing to pay and what we actually paid. A heat-seeking missile propelled by consumer credit, the American shopper is in constant pursuit of the ‘deal’. Few things spark envy like hearing about a deal you missed out on. Proclaiming our own deals among friends and acquaintances causes us to swell with pride. Ever heard of a little tome called “The Art of the Deal”? It’s a bestseller, and it’s about deals.
The understanding of what is or is not a deal must have broad consensus. Without this recognition, a deal isn’t a deal. Got a great deal on a car? Everyone knows car, everyone understands car. The price-to-value ratio is easy to compute. Got a great price on a limited-edition red-transparent promo copy of Deftones’ 2000 landmark album ‘White Pony’ on vinyl?
Try explaining this to your dentist and see if it resonates. It won’t, because most people don’t value that esoteric thing. At a general-population level, a deal is only a deal if the thing on which the deal was earned is known by most people.
What happens when lots of people start to lose their sixth sense, the one that enables them to smell the aforementioned broad-consensus deals? This is where we find ourselves with AI. AI, as Steve Bannon would say, has “flooded the zone with shit.” We’re surrounded by crap, but it’s crap that’s being used in a way that’s meant to deliver value. Accordingly, AI has begun to impede our ability to perceive value.
The AI most of us interact with–customer service chatbots, image generators, website copy–is demonstrably bad vis-à-vis its human competition. In general, AI’s language is off, its answers aren’t constructive, and the images it produces look corny. And this is where flooding the zone with shit really comes into play–not only is this content bad, it’s easy to access in high volumes. You can chat with a chatbot in your Underoos all day if you want and request images from Google Gemini until your neighbors have to make a 311 call. Mostly, the only visible cost of doing so is time.
Since this AI is used in places we need it–websites, phone trees, etc.–there’s an implication that it’s doing something valuable. However, the return we get on the prompts we give to AI has only the veneer of value. Input a halfway-thought-through prompt or question and out comes a fully finished answer.
Take the above example, where I asked ChatGPT “where do bird?” My prompt has the kernel of a thought, but there’s an auxiliary verb missing its main verb. The language equivalent of a bike whose wheels and handlebars have switched places: you think you might know its purpose, but you can’t tell if it’s broken or if it’s high art. ChatGPT does its best to answer, but devoid of context, it has no way of knowing whether I’ve made a grammatical error, I’m writing poetry, or I earnestly want to know where do bird.
On the face of it, ChatGPT has given me a complete answer, despite myself. There’s a veneer of value here because the answer comes instantaneously and implies completeness. The last line in ChatGPT’s response is telling: “Want to know where a certain bird do?” It’s signaling that we’ve gotten what we need and it’s okay to move on. Dive in deeper, though, and while the product might look finished, it’s mostly styrofoam and office tape. It’s missing its foundation: context, history, and empathy. This example is extreme, but it demonstrates that AI can’t accommodate nuance the way a human can, and therefore can never deliver as much value.
There’s been lots said already about the robots taking our jobs, but it’s worse than that. AI is also devaluing the work of the jobs–still mostly done by people–that its creators aspire for it to do: customer service, sales, design, coding, and so on. The more ubiquitous AI slop becomes, the lower the standard falls and the more human labor is devalued. AI is a caterpillar munching away at both ends of the value leaf–things are getting worse, and so is our ability to recognize what’s worth spending money on. In my experience, caterpillars rarely munch the entire leaf, but the leaves are certainly much worse for wear once the caterpillar’s through with them.
A perfect recent example of the fuzzy AI caterpillar munching away is its effect on search results. A recent Bain & Company study revealed that 80% of searchers relied on AI summaries at least 40% of the time. In other words, a large number of people never click on a search result after inputting their query. Instead, searchers rely on hastily assembled, often incorrect, AI summaries of their queries. When I input ‘how to get stronger’ into Google search, Google gives me an AI summary of how to do so. Several links lower, there’s the Mayo Clinic, the renowned academic medical center (staffed by real people), giving me its response. The Mayo Clinic’s work is cheapened, I learn less than I could, and any strength-training businesses likely lose traffic as well. Search traffic is the lifeblood of the internet, and AI is damming it up.
Remember “enshittification,” the process and rationale for why so many things are getting worse? And shrinkflation, the ongoing phenomenon of prices staying the same while quantities shrink? Think of AI as another combatant in the ultra-competitive ring of ‘things ambiently making your life worse’. Enshittification makes stuff worse, shrinkflation gives us less, and AI…gives us a lot of crap.
Together, all three factors are spinning circles around us, making us dizzy and making it increasingly difficult to ascertain how we’re getting screwed. A buying process involving all three requires more of our time, more of our effort, and potentially more of our money. Remember, for a deal to be a deal, a consumer must be able to sort out the price-to-value ratio. As that ratio gets more opaque, it becomes less likely that anyone will do the requisite computation and research to understand value. That’s because value isn’t only utility versus price; value is also determined by the time and effort it took to discover something in the first place. If it takes ages of searching to find something decent at a low price, those upfront costs must also be counted against consumer surplus.
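To make that arithmetic concrete, here’s a minimal sketch in Python, with entirely made-up numbers, of how discovery costs eat into a “deal”:

```python
# Consumer surplus, naively: what you'd pay minus what you paid.
# The argument above says discovery costs belong in the ledger too.
# All figures below are hypothetical.

willingness_to_pay = 120.00  # the most you'd happily pay for the item
price_paid = 80.00           # what you actually paid
hours_searching = 3.0        # time spent wading through slop to find it
value_of_an_hour = 10.00     # what an hour of your time is worth to you

naive_surplus = willingness_to_pay - price_paid
effective_surplus = naive_surplus - hours_searching * value_of_an_hour

print(naive_surplus)      # 40.0 -- looks like a great deal
print(effective_surplus)  # 10.0 -- much less so once search costs count
```

As the gap between those two numbers grows, the deal quietly evaporates.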
Many people reading this might rightly assert that our valuation of labor, and therefore of the things it produces, has never really been that high. Historically, U.S. workers have had to fight mightily for things like weekends, 40-hour workweeks, and minimum wages. And once all that good stuff got too expensive for certain industries to provide, we shipped the work off to places where workers didn’t demand it. It’s difficult to envision a world where consumer value doesn’t exist, and it’s unlikely that world will ever come to be. Nevertheless, as AI muddies our ability to appreciate value, it’s a possibility worth keeping in mind.