Mainstream LLMs are centralized. What if they weren't?

Today in #Vancouver, at the Codex Community Meetup, I presented llm-node, a project that explores what the future of AI infrastructure could look like.
Here’s the problem most people overlook:
Every time you use tools like ChatGPT, Claude, or Gemini, your prompt goes to a server controlled by a single company. They see it, gate it, and price it. Running your own model? The hardware for a 70B-parameter model can cost $10,000+.
So I asked: what if no one had to own the whole model and you didn’t need thousands of dollars in GPUs to use it?
I built llm-node. Think of BitTorrent, but for #AI. Instead of one machine doing everything, multiple nodes each run a small part of the model. Your laptop or a Raspberry Pi runs a few layers, someone else runs the next, and the network stitches it together.
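The "stitching" above is pipeline-style sharding: the model's layers are split into contiguous slices, each node hosts one slice, and activations flow node to node. Here is a minimal, self-contained sketch of that idea in Rust; the names (`Layer`, `Node`, `partition`) and the scalar "layers" are illustrative toys, not llm-node's actual API.

```rust
/// A stand-in for a transformer block: a toy affine map on a scalar
/// activation. A real layer would operate on tensors.
struct Layer {
    scale: f32,
    bias: f32,
}

impl Layer {
    fn forward(&self, x: f32) -> f32 {
        self.scale * x + self.bias
    }
}

/// A node holds only its assigned slice of the model's layers.
struct Node {
    layers: Vec<Layer>,
}

impl Node {
    /// Run the activation through this node's slice of layers.
    fn forward(&self, mut x: f32) -> f32 {
        for layer in &self.layers {
            x = layer.forward(x);
        }
        x
    }
}

/// Split `layers` into contiguous slices, one per node, as evenly
/// as possible (ceiling division, so early nodes may get one extra).
fn partition(mut layers: Vec<Layer>, n_nodes: usize) -> Vec<Node> {
    let per_node = (layers.len() + n_nodes - 1) / n_nodes;
    let mut nodes = Vec::new();
    while !layers.is_empty() {
        let take = per_node.min(layers.len());
        let rest = layers.split_off(take);
        nodes.push(Node { layers });
        layers = rest;
    }
    nodes
}

fn main() {
    // Eight identical toy layers: f(x) = 1.0 * x + 1.0.
    let layers: Vec<Layer> = (0..8).map(|_| Layer { scale: 1.0, bias: 1.0 }).collect();

    // Three "machines", e.g. a laptop, a Pi, and a desktop.
    let nodes = partition(layers, 3);

    // The pipeline: each node's output becomes the next node's input.
    // In llm-node this hop would cross the network; here it is a loop.
    let mut x = 0.0;
    for node in &nodes {
        x = node.forward(x);
    }
    println!("pipeline output = {}", x);
}
```

The key property: no node ever holds the whole model, yet running the slices in order reproduces the full forward pass exactly.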
The result is a fully functional AI model with no central owner.
Key ideas:
• Nodes auto-discover public nodes (no central server)
• Workload adapts to your hardware
• Onion routing to preserve privacy
• Setup in under 60 seconds
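One plausible way "workload adapts to your hardware" can work is capacity-proportional assignment: each peer advertises a capacity score (say, a quick local benchmark), and layer counts are allocated in proportion. This is a hedged sketch of that scheme, not a description of llm-node's actual scheduler; `assign_layers` and the scoring model are assumptions.

```rust
/// Given per-node capacity scores, decide how many of `total_layers`
/// each node should host. Each node gets the floor of its proportional
/// share; leftover layers go to the highest-capacity nodes first.
fn assign_layers(capacities: &[f32], total_layers: usize) -> Vec<usize> {
    let total_cap: f32 = capacities.iter().sum();

    // Floor of each node's proportional share.
    let mut counts: Vec<usize> = capacities
        .iter()
        .map(|c| ((c / total_cap) * total_layers as f32).floor() as usize)
        .collect();
    let mut assigned: usize = counts.iter().sum();

    // Distribute any remainder, strongest nodes first.
    let mut order: Vec<usize> = (0..capacities.len()).collect();
    order.sort_by(|&a, &b| capacities[b].partial_cmp(&capacities[a]).unwrap());
    let mut i = 0;
    while assigned < total_layers {
        counts[order[i % order.len()]] += 1;
        assigned += 1;
        i += 1;
    }
    counts
}

fn main() {
    // A desktop GPU (score 4.0) and two Raspberry Pis (score 2.0 each)
    // sharing a 32-layer model.
    let counts = assign_layers(&[4.0, 2.0, 2.0], 32);
    println!("layer counts = {:?}", counts);
}
```

A production scheduler would also weigh RAM, bandwidth, and churn, but proportional splitting is the core intuition: slow hardware contributes fewer layers rather than being excluded.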
Built in Rust, with support for #Llama, #Qwen, and #Phi-3 models and zero-config model detection.
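"Zero-config detection" can be as simple as inferring the model family from the weights you point the node at. The sketch below matches on the file name; the real project may well inspect the model's config metadata instead, so treat `detect_family` and its table as purely illustrative.

```rust
/// Guess the model family from a weights path, with no user config.
/// Returns None when the path matches no known family.
fn detect_family(path: &str) -> Option<&'static str> {
    let p = path.to_lowercase();
    // Illustrative needle -> family table covering the three
    // families the post mentions.
    for (needle, family) in [("llama", "Llama"), ("qwen", "Qwen"), ("phi-3", "Phi-3")] {
        if p.contains(needle) {
            return Some(family);
        }
    }
    None
}

fn main() {
    println!("{:?}", detect_family("models/Meta-Llama-3-8B.gguf"));
    println!("{:?}", detect_family("qwen2-7b-instruct.safetensors"));
    println!("{:?}", detect_family("mystery-weights.bin"));
}
```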
But what excites me most is the real-world application:
Imagine public institutions in Nigeria. Instead of spending $200K on GPU clusters or sending sensitive data abroad, agencies could pool existing infrastructure to run a shared, sovereign AI system with no new procurement, no vendor lock-in, and no data leaving the country.
That used to be theoretical; it is exactly what this project enables.
AI is becoming critical infrastructure. The question isn’t just how powerful it gets but who controls it.
Decentralised AI puts ownership back where it belongs: with the people using it.
If you’re thinking about decentralised systems, sovereign compute, or privacy-first AI, I’d love to connect.
Speaker Deck: https://speakerdeck.com/favourchukwuedo/mainstream-llms-are-centralized-what-if-they-werent
Open source. MIT licensed. Look out for the GitHub link in the coming days.
#AI #Decentralization #OpenSource #MachineLearning #Privacy #DistributedSystems #Rust #SovereignAI #GovTech

