Potemkin Software

In 1787, Grigory Potemkin needed to impress Catherine the Great during her tour down the Dnieper River to newly annexed Crimea. The popular version of the story claims that he built fake portable building facades and moved them along the route ahead of her, making the region look dotted with beautiful, populated, established villages. Though this account was likely exaggerated and spread by his political rivals, the reality is almost more interesting: Potemkin did decorate real settlements, stage elaborate spectacles, and dress things up dramatically for the empress's visit. The buildings were real. They just weren't as finished as they looked.

I keep thinking about this story when I watch what's happening with AI-generated software. We're not building fake software. We're building real software that isn't what it appears to be. And most people can't tell the difference.

AI lets anyone build. You describe what you want, and something appears — a dashboard, a workflow tool, a prototype that looks like the real thing. As a proof of concept, this is extraordinary. Going from idea to working prototype in an afternoon, testing whether a concept has legs before investing months of engineering — that's a genuine superpower.

The problem isn't the proof of concept. The problem is that it no longer looks like one.

What AI excels at is generating the visible layer — the part that faces the user, the part that gets screenshotted and pasted into a pitch deck. A proof of concept used to look rough, obviously incomplete. Now it arrives with polished UI, smooth transitions, and a settings page.

In President Barack Obama's 2020 memoir A Promised Land, he identifies a similar dissonance in his writing: "I still like writing things out in longhand, finding that a computer gives even my roughest drafts too smooth a gloss and lends half-baked thoughts the sheen of tidiness." He was talking about prose, but the observation scales. AI-generated software gives the roughest of drafts too smooth a gloss.

Two Biases That Compound Each Other

There's a concept called "completion bias" — the tendency to mistake the appearance of a finished thing for an actually finished thing. We see a polished UI and our brain checks the box. Done. Shipped. Next.

But there's a second force at work: the IKEA effect. People dramatically overvalue things they had a hand in building. You assemble a flat-pack bookshelf and you think it's beautiful — you don't notice the wobble because it's yours.

AI-generated software triggers both simultaneously. You prompted the tool. You described what you wanted. You iterated on it. By the time you have a working prototype, you feel like you built it — because you did. The IKEA effect tells you it's precious. The completion bias tells you it's done. Together they erode your ability to sense the delta between a proof of concept and a production-ready product. That delta is where authentication edge cases, security and privacy vulnerabilities, error handling, data integrity, and graceful failure all live — the 90% of the work that a demo never shows you.

The emotional experience of creating something with AI feels exactly like the emotional experience of finishing something. And that feeling is a liar.

What Actually Helps

AI is extraordinarily good at proving concepts, and we should celebrate this. But we need a hard, bright line between proving a concept and shipping a product. The proof of concept is the beginning of the work, and not even close to the end.

There's an old adage in software: the first 80% of the work takes 20% of the time, and the last 20% takes the remaining 80%. With AI coding, this ratio has gotten more extreme — the first 90% now takes 5% of the time. The last 10% takes 95%. Plan for this — the speed of the prototype isn't evidence that production will be fast. It's a warning that the hardest work hasn't started.

Software engineers can capture this moment by taking the opportunity to educate newly empowered non-engineers on the complexity of building production-ready, safe, durable products. Non-engineers can learn to resist the urge to ship the prototype and instead treat it as what it is — a brilliant starting point that has revealed how much work remains. And everyone on the team can commit to asking the uncomfortable question before anything goes live: what happens when someone actually tries to live in this building?
