Nine months ago I joined Linkby with a blank slate.
The data existed — in RDS, ready to be unlocked. What was missing was a modern stack to turn it into production-grade ML infrastructure.
For the first three months, it was just me.
Here is what I shipped in that solo window:
- Full production data platform on Databricks — ingestion, behavioural event pipelines, feature store, model deployment
Then I made my first hire — an ML Engineer. Together we pushed the stack further:
- Complete analytics stack migrated to ClickHouse
- Repo-driven data pipelines in Dagster, built with AI agents accelerating development
Six months in, a Data Scientist joined. By month nine we had shipped a personalised click pricing model live in production — touching Linkby's core revenue mechanism.
Three people total.
Jeff Bezos famously said that if a team cannot be fed by two pizzas, it is too big. The idea is that small, autonomous teams move faster and own more.
Previously I worked on data projects with 5+ engineers. Solid teams, solid output, genuine complexity.
But here is what has changed: the tooling. AI-augmented engineering has fundamentally shifted what a one-pizza team can deliver. Every pipeline, every review, every piece of documentation moves faster because we built AI into the workflow itself. Not as a novelty. As the actual operating model.
What AI-augmented engineering actually means
It does not mean asking an AI to write all your code. It means building AI into the workflow at the right checkpoints so the human time goes to the decisions that require human judgement.
At Linkby, this looked like:
- Claude Code for pipeline scaffolding. Schema definitions, transformation logic, test generation — the boilerplate that takes half a day conventionally. We handled it in an hour, then spent the rest of the day on the data model decisions that actually mattered.
- AI agents for Dagster pipeline development. Our ML Engineer built a system where agents write, test, and deploy pipeline components. Review time dropped significantly. Deployment confidence went up because the tests were generated alongside the code.
- ClickHouse migration accelerated by AI. Migrating an entire analytics stack is normally months of careful data validation. We moved faster because we used AI to generate migration scripts, validate row counts, and surface discrepancies — then focused human attention on the edge cases the scripts could not handle.
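To make the scaffolding point concrete, here is a minimal sketch of the kind of boilerplate the first bullet refers to: a schema definition, a small transformation, and the test generated alongside it. All names and fields are illustrative assumptions, not Linkby's actual code.

```python
from dataclasses import dataclass

# Hypothetical schema for a behavioural click event (illustrative only).
@dataclass
class ClickEvent:
    campaign_id: int
    publisher_id: int
    cost_cents: int

def total_cost_by_campaign(events):
    """Transformation logic: aggregate click cost per campaign."""
    totals = {}
    for e in events:
        totals[e.campaign_id] = totals.get(e.campaign_id, 0) + e.cost_cents
    return totals

# The kind of test generated alongside the transformation, so deployment
# confidence rises with the code rather than lagging behind it.
def test_total_cost_by_campaign():
    events = [
        ClickEvent(campaign_id=1, publisher_id=10, cost_cents=50),
        ClickEvent(campaign_id=1, publisher_id=11, cost_cents=30),
        ClickEvent(campaign_id=2, publisher_id=10, cost_cents=20),
    ]
    assert total_cost_by_campaign(events) == {1: 80, 2: 20}
```

The point is not the code itself — it is that this layer is mechanical enough for AI to produce reliably, which frees the humans to argue about the schema, not the loop.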
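And for the migration bullet, a hypothetical sketch of the validation pattern: compare per-table row counts between the legacy store and the new one, and surface only the discrepancies for human review. In practice the counts would come from live queries against both systems; they are stubbed here, and the table names and numbers are invented.

```python
def find_discrepancies(source_counts, target_counts):
    """Return tables whose row counts differ between source and target."""
    issues = {}
    for table, expected in source_counts.items():
        actual = target_counts.get(table)
        if actual != expected:
            issues[table] = {"source": expected, "target": actual}
    return issues

# Stubbed results of SELECT count(*) per table on each side.
source = {"events": 1_204_331, "campaigns": 842, "publishers": 119}
target = {"events": 1_204_331, "campaigns": 841, "publishers": 119}

# Only 'campaigns' differs: the edge case that gets human attention.
assert find_discrepancies(source, target) == {
    "campaigns": {"source": 842, "target": 841}
}
```

Scripts like this can be AI-generated per table; the human time goes to investigating the one row that went missing, not to writing the comparison loop.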
The thesis, nine months later
My thesis going into Linkby was simple: a one-pizza team with the right AI tooling and MLOps discipline can outship a conventional team three times its size.
Nine months in, the data supports it.
The constraint was never people. It was always leverage.
If you are a founder or CTO wondering whether you can build serious AI infrastructure without a large data team — the answer is yes. But only if you are ruthless about tooling, architecture, and how the team actually works.
The one thing I would do differently: Hire the ML Engineer in month one, not month three. The ClickHouse and Dagster work we built together in months four through six could have started earlier. The solo period was valuable for architecture decisions, but the team velocity unlocked by that first hire was significant.
Happy to share what the actual workflow looks like in practice. Drop a comment on LinkedIn or DM me directly.