When product managers ship code: AI just broke the software org chart
Our take

Last week, one of our product managers (PMs) built and shipped a feature. Not spec'd it. Not filed a ticket for it. Built it, tested it, and shipped it to production. In a day.
A few days earlier, our designer noticed that the visual appearance of our IDE plugins had drifted from the design system. In the old world, that meant screenshots, a JIRA ticket, a conversation to explain the intent, and a sprint slot. Instead, he opened an agent, adjusted the layout himself, experimented, iterated, and tuned in real time, then pushed the fix. The person with the strongest design intuition fixed the design directly. No translation layer required.
None of this is new in theory. Vibe coding opened the gates of software creation to millions. That was the aspiration. When I shared the data on how our engineers doubled throughput, shifted from coding to validation, and brought design upfront for rapid experimentation, it was still an engineering story. What changed is that the theory became practice. Here's how it actually played out.
The bottleneck moved
When we went AI-first in 2025, implementation cost collapsed. Agents took over scaffolding, tests, and the repetitive glue code that used to eat half the sprint. Cycle times dropped from weeks to days, from days to hours. Engineers started thinking less in files and functions and more in architecture, constraints, and execution plans.
But once engineering capacity stopped being the bottleneck, we noticed something: Decision velocity was. All the coordination mechanisms we'd built to protect engineering time (specs, tickets, handoffs, backlog grooming) were now the slowest part of the system. We were optimizing for a constraint that no longer existed.
What happens when building is cheaper than coordination
We started asking a different question: What would it look like if the people closest to the intent could ship the software directly?
PMs already think in specifications. Designers already define structure, layout, and behavior. They don't think in syntax. They think in outcomes. When the cost of turning intent into working software dropped far enough, these roles didn't need to "learn to code." The cost of implementation simply fell to their level.
I asked one of our PMs, Dmitry, to describe what changed from his perspective. He told me: "While agents are generating tasks in Zenflow, there's a few minutes of idle time. Just dead air. I wanted to build a small game, something to interact with while you wait."
If you've ever run a product team, you know this kind of idea. It doesn't move a KPI. It's impossible to justify in a prioritization meeting. It gets deferred forever. But it adds personality. It makes the product feel like someone cared about the small details. These are exactly the things that get optimized out of every backlog grooming session, and exactly the things users remember.
He built it in a day.
In the past, that idea would have died in a prioritization spreadsheet. Not because it was bad, but because the cost of implementation made it irrational to pursue. When that cost drops to near zero, the calculus changes completely.
Shipping became cheaper than explaining
As more people started building directly, entire layers of process quietly vanished. Fewer tickets. Fewer handoffs. Fewer "can you explain what you mean by..." conversations. Fewer lost-in-translation moments.
For a meaningful class of tasks, it became faster to just build the thing than to describe what you wanted and wait for someone else to build it. Think about that for a second. Every modern software organization is structured around the assumption that implementation is the expensive part. When that assumption breaks, the org has to change with it.
Our designer fixing the plugin UI is a perfect example. The old workflow (screenshot the problem, file a ticket, explain the gap between intent and implementation, wait for a sprint slot, review the result, request adjustments) existed entirely to protect engineering bandwidth. When the person with the design intuition can act on it directly, that whole stack disappears. Not because we eliminated process for its own sake, but because the process was solving a problem that no longer existed.
The compounding effect
Here's what surprised me most: It compounds.
When PMs build their own ideas, their specifications get sharper, because they now understand what the agent needs to execute well. Sharper specs produce better agent output. Better output means fewer iteration cycles. We're seeing velocity compound week over week, not just because the models improved, but because the people using them got closer to the work.
Dmitry put it well: The feedback loop between intent and outcome went from weeks to minutes. When you can see the result of your specification immediately, you learn what precision the system needs, and you start providing it instinctively.
There's a second-order effect that's harder to measure but impossible to miss: Ownership. People stop waiting. They stop filing tickets for things they could just fix. "Builder" stopped being a job title. It became the default behavior.
What this means for the industry
A lot of the "everyone can code" narrative last year was theoretical, or focused on solo founders and tiny teams. What we experienced is different. We have ~50 engineers working in a complex brownfield codebase: Multiple surfaces and programming languages, enterprise integrations, the full weight of a real production system.
I don't think we're unique. I think we're early. And with each new generation of models, the gap between who can build and who can't is closing faster than most organizations realize. Every software company is about to discover that their PMs and designers are sitting on unrealized building capacity, blocked not by skill, but by the cost of implementation. As that cost continues to fall, the organizational implications are profound.
We started with an intent to accelerate software engineering. What we're becoming is something different: A company where everyone ships.
Andrew Filev is founder and CEO of Zencoder.