I run a software company with 30 engineers. I built this app with AI and none of them.
I run a software company. We have around 30 engineers, designers, and QA people. We build products — some for clients, some our own. B2B tools, internal platforms, that kind of thing. It's been going for over a decade.
When AI coding tools got to the point where you could describe a feature in plain English and get working code back — not autocomplete, actual working features — I couldn't stop thinking about two things.
One was practical: our clients are going to start asking why they need a team of five when this exists. We need to figure that out before they do.
The other was personal. I haven't written production code in years. I run the company, I make product decisions, I review architecture — but I don't ship code. Could I build a real app with just an AI? Not a prototype. A real thing, in the App Store, with actual users.
I wanted to know. So I tried.
The idea
I needed something I actually wanted. Not a to-do app, not a SaaS dashboard — something I'd use every day.
Here's my thing: I'm obsessed with football. Not watching-on-TV obsessed. Going-to-matches obsessed. I travel for games. I collect stadiums the way some people collect countries. I've been to grounds across Europe — big cathedrals and tiny non-league fields with 200 people and a burger van.
For years, I tracked it all in Apple Notes. A messy list: "Anfield, March 2019. Craven Cottage, August 2021." No stats, no map, no way to see the full picture.
There are apps for this. I tried them. One charges you a subscription to log matches you already paid to attend. Another looks like it was designed in 2012 and never updated. A third has 90,000 stadiums in its database — most of which don't exist.
So the project chose itself: a football match tracker. Log every match you've been to. Map every stadium on a world map. Track your stats. Think Strava, but for football fans.
Week 1: "This might actually work"
I started with Claude. Not a team of AI tools — just one. I described what I wanted in plain language, the way I'd brief a senior developer.
The first day was weird. By evening, I had a working app with authentication, a database schema, and a home screen with a map. Ugly as hell, but it ran.
By the end of week one, I had:
- User accounts with Google and Apple sign-in
- A Supabase database with leagues, clubs, stadiums, and matches
- A searchable match catalogue (pull data from an API, let users find any match since 2010)
- A map that lights up stadiums you've visited
- Basic stats: matches attended, stadiums visited, countries
In a normal project at my company, week one is when we're still arguing about the database schema in Confluence. I'm not even exaggerating — I checked our Jira history.
Weeks 2-3: The ugly middle
This is where the honeymoon ended.
AI is brilliant at generating code. It's terrible at making decisions. Every time I gave a vague prompt — "make the add-match flow better" — I got code that was technically correct but wrong for the product. Features that worked but felt off. Screens that had every element but no soul.
Here's what I figured out the painful way: AI is a 10x executor with zero product sense.
It doesn't know that a modal should close and return you to the tab you were on, not dump you on the home screen. It doesn't know that showing a skeleton loader for 8 seconds is worse than showing an error after 3. It doesn't know that fans care about win rate but not about "average goals per match to two decimal places."
I had to catch every single one of these, work out why it felt wrong, and re-prompt until it didn't. You don't need to know how to code. You need to know what good looks like.
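That loader-versus-error rule is one of the few taste calls you can actually enforce in code. Here's a minimal sketch — my own illustration, not code from the app — of a promise wrapper that fails fast instead of letting a spinner run forever; the timeout value is whatever your patience budget is:

```typescript
// Wrap any promise with a deadline. If it hasn't settled in time,
// reject so the UI can show an error instead of an endless skeleton.
// `withTimeout` is an illustrative helper, not an API from the app.
function withTimeout<T>(promise: Promise<T>, timeoutMs: number): Promise<T> {
  return new Promise<T>((resolve, reject) => {
    const timer = setTimeout(
      () => reject(new Error(`timed out after ${timeoutMs}ms`)),
      timeoutMs
    );
    promise.then(
      (value) => { clearTimeout(timer); resolve(value); },
      (err) => { clearTimeout(timer); reject(err); }
    );
  });
}
```

The caller decides what "too long" means (say, 3 seconds for a match search) and renders an error state in the catch branch.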
Other things that went wrong:
- The "works on my machine" problem. AI-generated code worked in development but crashed in production. A background sync that was fine with 10 matches would OOM with 500.
- State management chaos. The AI would add a `useState` after an early return in a React component. React silently breaks. No error, no crash — the button just stops working. I spent an entire day on this before figuring out the pattern.
- The database grew teeth. Started with 4 tables. Now it's 30+. Every time I added a feature, the data model got more complex, and the AI would sometimes write queries that were technically correct but would take 40 seconds on the real dataset. Technically correct is the worst kind of correct when your user is staring at a spinner.
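To see why that `useState` bug fails silently: React stores hook state by call order, so an early return that skips a hook shifts every later hook into the wrong slot. Here's a tiny mock of that mechanism — an illustration of the idea, not React's actual internals:

```typescript
// Minimal mock of hooks-stored-by-call-order. useMockState and render
// are illustrative stand-ins, not React APIs.
const hookStates: unknown[] = [];
let hookIndex = 0;

function useMockState<T>(initial: T): [T, (v: T) => void] {
  const i = hookIndex++; // state is looked up purely by position
  if (!(i in hookStates)) hookStates[i] = initial;
  return [hookStates[i] as T, (v: T) => { hookStates[i] = v; }];
}

function render(component: () => string): string {
  hookIndex = 0; // every render re-reads hooks strictly in call order
  return component();
}

// An early return before a hook: the two render paths call a different
// number of hooks, so slot 0 means different things on each path.
function SaveButton(props: { loading: boolean }): string {
  if (props.loading) {
    const [message] = useMockState("loading…"); // slot 0 on this path
    return message;
  }
  const [count] = useMockState(0);      // slot 0 on this path — collides
  const [label] = useMockState("Save"); // slot 1
  return `${label} (clicked ${count})`;
}

console.log(render(() => SaveButton({ loading: true })));  // "loading…"
console.log(render(() => SaveButton({ loading: false }))); // "Save (clicked loading…)"
```

No exception anywhere — `count` just quietly picks up the loading message's state. That's exactly the "no error, button stops working" failure mode.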
Weeks 4-5: It gets good
After a few weeks of figuring out how to prompt properly — be specific, set constraints, always read what it gives you before accepting — things clicked. And then the speed got ridiculous.
One Thursday, I decided I wanted a "Travel Planner" feature: pick a city and dates, see every upcoming match within 100km. In my company, this would be a 2-week sprint: spec, design, frontend, backend, QA, deploy.
I shipped it in a day. Fully working, with a search UI, geolocation queries, match cards, and mobile-responsive design. Deployed to production before dinner. My wife asked what I did at work today. "Built a travel planner." "The whole thing?" "The whole thing."
That's when it stopped feeling like an experiment.
Here's what the velocity looked like:
| What | Traditional estimate | AI + me |
|---|---|---|
| Match logging with search | 2 weeks | 3 days |
| Interactive stadium map (Mapbox) | 1 week | 1 day |
| Travel Planner tool | 2 weeks | 1 day |
| SEO website with 650+ pages | 1 month | 4 days |
| Multilingual landing pages (7 languages) | 2 weeks | 3 hours |
| CI/CD pipeline + 600 tests | 1 week | 2 days |
I want to be clear: these aren't proof-of-concept demos. This is production code serving real users, with error handling, edge cases, monitoring, and all the boring plumbing that separates a demo from a product.
The numbers (8 weeks in)
As of today:
- 14,000+ stadiums in the database
- 1,200+ leagues across 170+ countries
- 600,000+ fixtures going back to 2010
- iOS + Android + Web — all from one codebase
- 650+ SEO pages — club pages, stadium pages, league pages, country hubs, blog
- 7 language versions of the landing page
- 600+ automated tests in CI
- Sentry monitoring, crash-free rate >99%
- $0 spent on developers, designers, or QA
The app is called Footbeen. It's free — everything that competitors charge for, you get for nothing. That's the upside of building with AI and no team: when your costs are near zero, you can afford to be generous. I built it to solve my own problem. Maybe I've solved yours too.
What AI can't do
I don't want to paint this as some AI miracle story. It's not. Here's what AI is genuinely bad at:
1. Product vision. AI doesn't know what to build. It can build anything you describe, but it can't tell you what users want, what the market needs, or what will make someone open the app daily instead of forgetting about it.
2. Design taste. It generates functional UIs, not delightful ones. Every screen needs a human eye asking "does this feel right?" The answer from AI-generated UI is usually "almost, but not quite."
3. Saying no. If you ask for a feature, AI builds it. It never says "that's a bad idea" or "your users won't care about this." A good senior developer pushes back. AI is a yes-man.
4. System-level thinking. It optimises the code in front of it but doesn't think about how 6 concurrent API requests will exhaust a browser's connection pool. Or that a clever React hook will break if it's called after a conditional return. These are the bugs that take hours to find.
5. Knowing when it's wrong. The most dangerous failures are the confident ones. AI will generate code that looks correct, passes tests, works in dev, and subtly breaks in production with 1,000 users on a slow 3G connection.
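The connection-pool example from point 4 is the kind of system-level guard I had to ask for explicitly. A minimal sketch of one — assuming nothing about the app's real code — is a helper that caps how many requests are in flight at once instead of firing them all:

```typescript
// Run an async function over a list with at most `limit` in flight.
// `mapWithLimit` is an illustrative helper, not code from the app.
async function mapWithLimit<T, R>(
  items: T[],
  limit: number,
  fn: (item: T) => Promise<R>
): Promise<R[]> {
  const results: R[] = new Array(items.length);
  let next = 0;
  async function worker(): Promise<void> {
    // Each worker pulls the next unclaimed index until the list is done.
    while (next < items.length) {
      const i = next++;
      results[i] = await fn(items[i]);
    }
  }
  // Spawn at most `limit` workers sharing the same queue.
  await Promise.all(
    Array.from({ length: Math.min(limit, items.length) }, worker)
  );
  return results;
}
```

Ten lines, and it's the difference between six stadium-photo requests saturating the browser's per-host connection limit and a page that stays responsive. AI writes this happily — but only once you name the problem.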
What it changed about my company
This is the part I'm still processing.
I built a production app — not a simple one — in 8 weeks, alone, with zero engineering headcount. The app has more features, better test coverage, and a faster iteration cycle than most projects my full team delivers.
That's not a comfortable thought when you employ 30 engineers.
We've always done two things: client projects and our own products. The client work pays the bills. Our own products are where we experiment. They've always been in the B2B space — tools, platforms, enterprise stuff. Footbeen is the first consumer product I've personally driven, and it happened almost by accident: the experiment took on a life of its own.
But the bigger realisation is about how we work as a company. Our clients are already asking the question: "If AI can do this, why am I paying for a team of five?" And honestly — they should be asking.
I'm not about to fire everyone. But I am rethinking how we staff projects. The old model was: 1 PM, 2-3 developers, 1 designer, 1 QA = 5-6 people per project. The new model we're experimenting with: 1 senior architect who deeply understands the domain + AI tools = potentially the same output.
That doesn't mean everyone's getting fired. But it does mean the job description is changing. Writing CRUD endpoints is not a career anymore. Understanding systems, having taste, knowing what to build and what to leave out — that's where the value is now.
The part I didn't expect: this side project took over my brain. I started it as a test and now I use it every matchday. I'm adding features because I want them as a fan, not because some roadmap says so. I haven't felt that way about our B2B products in years.
Would I do it again?
Yes. Without hesitation.
Not because it was easy — weeks 2-3 nearly broke me. But because for the first time in my career, the thing I imagined and the thing that exists are almost the same. There's no "we'll get to that next sprint." There's no compromise because the designer and the developer couldn't agree. There's no feature that got cut because we ran out of budget.
If you know your field well enough — doesn't matter if it's football, cooking, music, logistics — you can now build the tool you always wished existed. The hard part was never writing code. It was always knowing what was worth building.
As for me — I have a match to go to this weekend. And for the first time, I won't be logging it in Apple Notes.
Footbeen is free on iOS, Android, and the web. If you've ever kept a list of stadiums you've visited — in a notes app, a spreadsheet, or your head — give it a try. I'd genuinely love to hear what you think.
Stack for the curious: React Native (Expo), Supabase (Postgres + Auth), Mapbox, React Query, Vercel + EAS. Questions about the AI workflow or the product — [email protected].