
Phase 1: Rebuilding a Platform With AI

Leslie Barry

This is Part 2 of a series about what actually happens when you give AI real access to your business. Not theory. Not demos. Real work, real results, and the honest details of what went right and wrong. If you missed it, Part 1 covers why I handed operational work to AI and what changed.

In December 2025, I had a problem that had been weighing on me for months.

Our experimentation platform, the core product behind Exponentially, had been rebranded earlier in the year. We renamed it, migrated it, changed the positioning. It seemed like a good idea at the time. It wasn’t. It confused clients, muddied our messaging, and created a tangle of technical debt that my development team estimated would take months to unwind.

I was paying for a development team, a DevOps team, a content marketing team, and a virtual assistant. Good people doing good work. But the pace didn’t match the urgency. And the cost of maintaining all those teams while the platform sat in a state of confusion was becoming harder to justify.

So I made a decision that, looking back, was either brave or reckless. Probably a bit of both.

I decided to do it myself. With AI.

Starting from zero (well, almost)

I need to be honest about where I was starting from. I used to code. About 30 years ago. Since then I’ve managed development teams, run technical projects, built four businesses. I understand how software gets built. But I hadn’t personally written code in any meaningful way for decades.

I’d heard all the terms. GitHub, Cursor, vibe coding, Codex, Claude Code. I’d seen the demos. But I’d never actually sat down and used any of them to build something real.

December 15 was my first day. And the learning curve was brutal.

Not because the AI couldn’t do the work. It could, and often faster than I expected. The hard part was learning how to direct it. How to break a complex platform into tasks an AI agent could execute reliably. How to review what it produced. How to catch the subtle things it got wrong. How to structure a workflow that held together across a codebase with tens of thousands of lines.

I went from zero to functional in about a week. Not expert. Functional. Enough to start making real progress on the platform migration.

By the end of the first month, something had shifted. I wasn’t just directing the AI anymore. I was having technical conversations with it, by voice, about architecture decisions, security trade-offs, and deployment strategies. I’d gone from someone who hadn’t touched code in 30 years to someone running a production rebuild by talking to an AI agent while making coffee.

That transition is worth understanding, because I think it’s what’s coming for a lot of people. Not “learn to code.” Learn to direct AI that codes for you. The skill isn’t syntax. It’s judgment.

What the AI found that we’d all missed

Before I started rebuilding anything, I did something that turned out to be the most valuable decision of the entire project. I asked the AI to audit the existing codebase first.

The platform had been built and iterated on over several years by different development teams. Features had been added, modified, patched. Nobody had ever done a comprehensive inventory of what was actually there.

The AI went through the entire codebase and came back with a comprehensive audit. The platform had far more depth than we’d been giving it credit for. Multiple service layers, enterprise-grade architecture, full lifecycle management, billing, authentication, the lot. It was a proper self-service innovation platform, not just a tool.

But the audit didn’t just map what was there. It identified where things were slow and why. The database layer, for example, had grown organically over years of different development teams adding to it. The AI recommended a series of normalisation and architecture optimisations that we then implemented together. The result was a 20x improvement in database access speed. That’s not a typo. Twenty times faster, from restructuring and optimising what was already there.
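The article doesn't show the real schema or the specific optimisations, so here is a minimal, purely illustrative sketch of one of the kinds of change described: adding an index so that a frequently filtered column stops forcing full table scans. The table and column names are invented; SQLite's `EXPLAIN QUERY PLAN` makes the before/after difference visible without any timing.

```python
import sqlite3

# Illustrative only: 'experiments' and 'owner_id' are hypothetical names,
# standing in for a table that grew organically without indexes.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE experiments (id INTEGER PRIMARY KEY, owner_id INTEGER, status TEXT)"
)
conn.executemany(
    "INSERT INTO experiments (owner_id, status) VALUES (?, ?)",
    [(i % 100, "active") for i in range(10_000)],
)

def query_plan(sql):
    # EXPLAIN QUERY PLAN reports whether SQLite scans the table or uses an index
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

before = query_plan("SELECT * FROM experiments WHERE owner_id = 42")
conn.execute("CREATE INDEX idx_experiments_owner ON experiments(owner_id)")
after = query_plan("SELECT * FROM experiments WHERE owner_id = 42")

print(before)  # full table scan
print(after)   # index search
```

Indexing is only one lever; the normalisation work described above (splitting overloaded tables, removing duplicated data) compounds with it, which is how restructuring alone can produce order-of-magnitude gains.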

That single insight changed the trajectory of the entire business. We’d been thinking about the platform as something that needed to be rebuilt. In reality, it needed to be understood, consolidated, and properly positioned. We were sitting on something valuable and hadn’t seen it because we were too close to the code.

This is one of those AI moments that doesn’t get enough attention. Everyone talks about AI generating new things. Sometimes the most valuable thing it does is help you see what you already have.

The actual rebuild

With the audit complete, the scope changed completely. Instead of a ground-up rebuild, the work became a focused consolidation:

Production hardening. Taking 12 service modules through security review, error handling improvements, and performance optimisation. The AI ran through each module systematically, identifying vulnerabilities and suggesting fixes. I reviewed every change before it went in.

The rebrand migration. Untangling the naming confusion. Updating every reference, every URL, every email template. But it wasn’t just a find-and-replace job. The AI also brought the platform back to the original Exponentially brand colours from the purple we’d adopted during the rebrand. It then built out a complete design card and style guide from scratch. That design system turned out to be one of the most valuable outputs of the entire phase. Every piece of work that followed, the website migration, the Idea Validator, new feature builds, all of it moved faster because the visual language was already defined and documented.

AI-enablement refactor. This is the one that’s still paying dividends. Part of the consolidation was specifically about restructuring the platform so it could be AI-enabled more easily going forward. Cleaning up the architecture, standardising how modules connected, making it possible to build out new features and experiments faster with AI assistance. That decision has made everything since Phase 1 significantly quicker.
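The article doesn't publish the platform's actual module contracts, but "standardising how modules connected" usually means something like the following sketch: one common interface that every service module implements, so an AI agent (or a human) can add or modify a module without learning twelve bespoke calling conventions. All names here are hypothetical.

```python
from typing import Any, Protocol

class ServiceModule(Protocol):
    """Hypothetical shared contract for the platform's service modules.
    A uniform surface like this is what makes AI-assisted feature work
    faster: every module plugs in the same way."""
    name: str
    def health_check(self) -> bool: ...
    def handle(self, action: str, payload: dict[str, Any]) -> dict[str, Any]: ...

class BillingModule:
    """Invented example module conforming to the contract."""
    name = "billing"
    def health_check(self) -> bool:
        return True
    def handle(self, action: str, payload: dict[str, Any]) -> dict[str, Any]:
        if action == "invoice":
            return {"status": "created", "amount": payload["amount"]}
        raise ValueError(f"unknown action: {action}")

def dispatch(modules: dict[str, ServiceModule], module: str,
             action: str, payload: dict[str, Any]) -> dict[str, Any]:
    # One entry point for every module instead of one bespoke call site each
    return modules[module].handle(action, payload)

registry: dict[str, ServiceModule] = {"billing": BillingModule()}
result = dispatch(registry, "billing", "invoice", {"amount": 100})
print(result)
```

The payoff of a registry-plus-protocol shape is that new modules are additive: nothing in `dispatch` changes when a thirteenth module arrives.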

Security audit. This one deserves its own section (and will get one later in the series). Short version: we ran a three-agent security audit. Scanner, prioritiser, fixer. Found exposed services, open firewall rules, world-readable credentials, tokens in git history. All fixed. Then I manually audited the AI’s work and found things it had missed. The trust-but-verify pattern became a core part of how I work with AI.
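The actual three-agent tooling isn't shown in the article, but one of the findings, world-readable credentials, maps to a check that is easy to illustrate. This sketch shows the scanner and fixer roles on a single finding type; the real audit covered far more (exposed services, firewall rules, git history).

```python
import os
import stat
import tempfile

def scan_world_readable(paths):
    """Scanner role: flag files readable by 'other' users.
    Illustrative stand-in for one check out of many."""
    findings = []
    for path in paths:
        mode = os.stat(path).st_mode
        if mode & stat.S_IROTH:
            findings.append((path, oct(mode & 0o777)))
    return findings

def fix(findings):
    """Fixer role: drop flagged files to owner-only permissions."""
    for path, _ in findings:
        os.chmod(path, 0o600)

# Demo with a temp file standing in for a credentials file
with tempfile.NamedTemporaryFile(delete=False) as f:
    creds = f.name
os.chmod(creds, 0o644)  # world-readable, like the real finding

found = scan_world_readable([creds])
fix(found)
final_mode = oct(os.stat(creds).st_mode & 0o777)
print(found)       # flagged before the fix
print(final_mode)  # owner-only afterwards
os.unlink(creds)
```

The trust-but-verify point from the article applies here too: an automated fixer closes the findings its scanner reports, so the findings it never reports still need a human pass.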

Database cleanup. 15 normalised tables reviewed, indexed properly, data access layer documented. The AI wrote the documentation by reading the code, which meant the docs actually matched the implementation for the first time.

Testing. Reviewed and extended the existing 59 unit tests and 26 end-to-end tests. Added coverage for the security fixes and new hardening work.

The numbers at the end of Phase 1: 798 commits over six weeks. 12 service modules production-ready. Platform migrated, rebranded, secured, and documented.

The team transition

By mid-January, the maths had changed.

I’d been paying for a development team and a DevOps team to maintain and extend this platform. Good people, doing solid work. But I’d just done six weeks of platform work myself, with AI, that would have taken the team significantly longer and cost significantly more.

The decision wasn’t easy, and I want to be honest about that. These were people I’d worked with for years. But the operating model had fundamentally shifted. What used to require coordination across multiple people (sprint planning, code reviews, deployments) I could now do in concentrated sessions by directing AI agents through the work.

I transitioned out the development and DevOps teams in January. The ongoing cost of maintaining the platform dropped to essentially zero beyond my own time and AI token costs.

This is the part of the AI conversation that makes people uncomfortable, and I think it should. I’m not celebrating the fact that AI can replace teams. I’m acknowledging it as a reality that business owners and leaders need to think about honestly. The economics have changed. Pretending they haven’t doesn’t help anyone.

What I learned

Phase 1 taught me three things that shaped everything that followed:

First, the learning curve is real but surprisingly short. I went from not having touched code in 30 years to running a production rebuild in about a week. Not because I’m exceptional. Because the AI handles the syntax and I provide the judgment. The skill that matters isn’t coding. It’s knowing what good looks like and being able to spot when something’s wrong.

Second, AI sees things humans miss. Not because it’s smarter. Because it’s fresh. It doesn’t have the accumulated assumptions that come from years of working on the same codebase. The discovery that our platform was already a complete self-service innovation product came from the AI looking at it without any of the baggage we’d built up around it.

Third, voice changes everything. By the end of Phase 1, most of my technical work was done by voice. Talking through architecture decisions. Describing what I wanted built. Reviewing code by having the AI explain it to me. This isn’t a small thing. It means that the barrier to doing technical work has dropped from “can you code” to “can you think clearly about what needs to happen.” That’s a fundamental shift.

Phase 1 was the foundation. The platform was solid, the costs were down, and I had a workflow that actually worked.

Next came the part that surprised me most: handing off daily operations to an AI agent. That’s Part 3.


If you want to follow along as each piece comes out, I write a monthly newsletter called Experimenter’s Edge where I share what I’m learning about AI, rapid validation, and building differently. You can sign up here.
