04-05-2026

The Vibe Coding Problem Nobody’s Talking About

Shak Schiff

Image by Samuele Schirò from Pixabay

Your team just shipped a landing page in 12 minutes.

The copy was AI-generated. The design was AI-assisted. The build happened in a no-code tool that a junior strategist figured out over lunch. No developer touched it. No QA process existed because there was nothing to QA, right? The tool built it. It looks fine on the screen in front of you.

Except the form doesn’t submit on Safari. The mobile layout breaks at 390px. The privacy policy link points to a 404. The headline font renders differently in Outlook than it does in Gmail. And the ADA-compliance language your pharma client requires is missing from the footer entirely.

Nobody catches this. The campaign goes live. The ad spend starts burning. And every click that hits that broken page is money you lit on fire.

This is the vibe coding problem.

The bottleneck moved. Most agencies haven’t.

For 20 years, the bottleneck in digital campaign production was building. Design, development, integration, staging, revision cycles. It took weeks. It was expensive. And because it was slow, there was time baked into the process for someone to check the work before it shipped.

AI removed that bottleneck. Which is genuinely great.

But it also removed the buffer. The speed that makes AI-assisted production so attractive is the same speed that eliminates the window where a human being looks at the thing, clicks through it, and says: “This is broken.”

Agencies are now producing 3x to 5x more campaign assets than they were two years ago. Landing pages, email variants, ad creatives, microsites. The volume is unprecedented. But the quality control infrastructure has not scaled with it. In most shops, it has actually shrunk, because the assumption is that if AI built it, AI checked it.

It didn’t.

AI can build a page. It cannot judge one.

Generative AI is very good at producing things that look right. It is not good at catching the things that are wrong in context.

It doesn’t know your client’s brand guidelines changed last quarter. It doesn’t know the legal team rejected that specific CTA phrasing. It doesn’t know the form integration requires a hidden field for the CRM to attribute the lead correctly. It doesn’t know that the email renders beautifully in Apple Mail and is completely unreadable in Outlook 2019, which 40% of your client’s audience still uses.

These are not edge cases. They are the normal, daily, predictable failure modes of digital campaign work. And they require a human being who has seen them a thousand times before to catch them before they cost real money.
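To be fair, a few of these failure modes are mechanically checkable before a human ever opens the page. Here is a minimal sketch in Python’s standard library that scans a rendered page for two of the issues described above: required footer copy that went missing, and a hidden form field the CRM needs for attribution. The field name and required text are hypothetical, and a script like this supplements human QA rather than replacing it.

```python
# Minimal pre-launch smoke check (sketch, not a full QA process).
# Scans rendered HTML for two failure modes: missing required copy
# and a missing <input type="hidden"> attribution field.
from html.parser import HTMLParser


class HiddenFieldFinder(HTMLParser):
    """Collects the names of <input type="hidden"> fields in a page."""

    def __init__(self):
        super().__init__()
        self.hidden_fields = set()

    def handle_starttag(self, tag, attrs):
        attr_map = dict(attrs)
        if tag == "input" and attr_map.get("type") == "hidden":
            self.hidden_fields.add(attr_map.get("name"))


def smoke_check(html, required_text, required_hidden_field):
    """Return a list of human-readable issues found in the page."""
    issues = []
    # Check 1: required legal/compliance copy is actually present.
    if required_text not in html:
        issues.append(f"missing required copy: {required_text!r}")
    # Check 2: the hidden field the CRM uses to attribute leads exists.
    finder = HiddenFieldFinder()
    finder.feed(html)
    if required_hidden_field not in finder.hidden_fields:
        issues.append(f"missing hidden field: {required_hidden_field!r}")
    return issues
```

Usage might look like `smoke_check(page_html, "ADA compliance statement", "crm_source")`, failing the build if the returned list is non-empty. What a script like this cannot do is the judgment work the rest of this piece is about: knowing which copy is required, which field the CRM expects, or whether the page reads correctly in Outlook 2019.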

How much money? In one engagement with a major biotech company, across 44 website projects, 79 email campaigns, and 15 HTML5 banner ad builds, structured human QA caught over 800 issues. The estimated avoided cost: $3.8 million.

That is not a rounding error. That is the gap between “we shipped fast” and “we shipped right.”

The missing layer

Here is the contrarian truth that most agencies are not ready to hear: the faster you produce, the more you need human QA, not less.

AI did not eliminate the need for quality control. It made quality control the single most important step in your production workflow. Because now you are shipping more assets, to more channels, at higher velocity, with fewer people reviewing the output before it goes live.

The agencies that figure this out will stop losing money on broken campaigns. The ones that don’t will keep wondering why their conversion rates are underperforming despite “great creative.”

The creative was fine. The execution was broken. Nobody checked.

BadTesting has been the human QA layer for agency campaign production for years. Websites, emails, landing pages, HTML5 ads, CMS integrations. One person who has seen every way these things break and catches what AI cannot.

If your production speed has outrun your quality control, that gap has a cost. And it is almost certainly higher than you think.
