The Pieces Post
How much bigger can we really go before “more” starts meaning “less”?

👋 Hey Builders,
This is Antreas, and I’ve been thinking about a pattern we’re all living through right now.
The rise of ChatGPT and GPT-4 was a massive win for AI. But it also pulled almost the entire field into one lane: bigger models, more compute, one architecture to rule them all.
The “Cambrian Explosion” of ideas from the 2010s — GANs vs VAEs, meta-learning, interpretability, reinforcement learning breakthroughs — gave way to what I can only call a Great Amnesia.
We stopped asking “What else?” and doubled down on “How much bigger?”
If scale were the answer, why are we already looking elsewhere for innovation?
Now, the limits of scale are starting to show. Long-context bottlenecks, finite high-quality data, and centralized control are forcing researchers to rediscover smaller, more efficient, more diverse approaches.
Architectures like Mamba, data-quality-first training like Microsoft’s Phi, and the grassroots Local AI movement are proving there’s life — and innovation — beyond the monolith.
🧠 What you'll learn:
How AI’s obsession with scale led to a research monoculture
The tradeoffs of LLM-only thinking
Where the next breakthroughs may come from (hint: smaller, faster, local)
Why architectural diversity and data quality might beat raw size in the long run
What this shift means for the direction of AI progress

The best AI won’t live in a chat window; it will quietly weave into your workflow, anticipating needs and surfacing help only at the perfect moment. That kind of deep, contextual partnership demands privacy, trust, and speed — all of which the cloud alone can’t deliver.
Local-first isn’t a limitation; it’s a catalyst. Without the “just scale it” shortcut, we’re pushed to design smaller, smarter architectures and orchestrations of specialized models that work together seamlessly.
📖 Read the next post here to see where this shift in AI is already taking shape, and join the conversation on what comes next.
Know other developers struggling with context-switching in their AI tools? Join our Ambassador Program to get Pieces swag, participate in giveaways, and even get paid for talking about Pieces!
Ready to give your AI a memory upgrade? → Download Pieces Now
Happy coding!
The Pieces Team
P.S. Join our Discord community to share your experience with long-term memory, get help from the Pieces team, and connect with other developers revolutionizing their AI workflows.