Friday, January 30, 2026

Fast, Cheap, and Selectively Moral
Modern life runs on two unspoken priorities: speed and affordability. We want information instantly, products delivered overnight, food year-round, and solutions that don’t slow us down. Entire systems—economic, technological, and cultural—have been built to meet those expectations.

And yet, every so often, we collectively pause and draw a moral line around a single tool.

Right now, that tool is AI.

Concerns about AI’s environmental impact, labor exploitation, and long-term consequences are real and worth examining. But it’s also worth asking why AI, specifically, gets singled out—especially when it’s often being used to support reasoning, not replace it.

I think back to a second-grade social studies lesson on industrialization. We learned how a small number of sewing machines could replace hundreds of tailors. The lesson wasn’t framed as a moral failure of the machine; it was presented as a fact of progress—machines made production faster, cheaper, and scalable. The human cost was acknowledged, but the trajectory was treated as inevitable.

That framing matters, because it mirrors how society has always absorbed trade-offs.

We already live comfortably inside a world shaped by environmentally costly systems. Cars, air travel, fast fashion, streaming platforms, industrial agriculture, and global supply chains all rely on massive energy use and low-wage labor, often in developing nations. These systems persist not because they’re harmless, but because they deliver what modern life demands: convenience at speed.

Fast and cheap has always won.

Industrialization didn’t just replace labor; it reshaped expectations. Clothing became affordable. Goods became abundant. Productivity became the measure of success. And while entire professions were disrupted—often at great human cost—society didn’t opt out. It adapted, normalized the new tools, and moved forward.

AI fits squarely into that lineage.

What feels different now isn’t the existence of trade-offs, but how selectively we moralize them. Instead of interrogating the systems that demand constant acceleration, we focus on individuals who use the tools those systems incentivize. We criticize the person instead of the structure.

There’s also an added layer of discomfort: AI challenges how we define thinking itself.

When someone uses AI to compare ideas, articulate reasoning, or stress-test a decision, it exposes something we’ve long avoided admitting—human thought has always been supported by tools. Writing externalized memory. Calculators offloaded arithmetic. Search engines reorganized knowledge. None of these eliminated thinking; they changed its shape.

AI simply makes that process visible.

So when AI is used to clarify reasoning rather than replace it, labeling it as intellectual laziness feels less like a principled stance and more like a reaction to unease. Unease with speed. Unease with shifting definitions of competence. Unease with how much modern life already depends on invisible systems doing work for us.

We want things fast and cheap—but we also want to feel ethically intact.

That contradiction fuels selective outrage. We worry about AI’s environmental impact while streaming endlessly. We decry labor exploitation while relying on global supply chains designed around it. We condemn one tool while quietly accepting dozens of others that operate on the same principles.

The question isn’t whether AI has costs. It does.
The question is why we expect individuals to opt out of this particular tool—especially when used thoughtfully—while remaining fully embedded in every other system optimized for speed and scale.

If the sewing machine taught us anything in second grade, it’s that tools don’t replace values—they reveal them. And the value modern society has consistently chosen is efficiency, even when the costs are unevenly distributed.

Until we’re willing to interrogate that, singling out AI isn’t moral clarity. It’s displacement.

And maybe the discomfort we feel around AI isn’t about environmental harm at all, but about recognizing—again—that we’ve built a world where fast and cheap isn’t just preferred. It’s expected.
