Nov 15, 2025
Personal Computing with AI.
Tiny Corp sells a computer called the tinybox to “commoditize the petaflop and enable AI for everyone.” I love the idea of tinygrad and the tinybox, but I don’t think the tinybox will ever run the best LLMs. For many years to come, I won’t be able to afford enough compute (VRAM and FLOP/s) to run the best LLMs myself, so I have to outsource that to model providers (OpenAI and Anthropic, but also third parties like Groq or Cerebras).
In contrast, I can spend compute on providing more parallel environments for the LLMs. Rather than use Codex Cloud or Google’s Jules, I can run coding environments (Docker images) on my personal compute in parallel. I will still be compute-bound, but bound by general-purpose compute (CPU) rather than by matrix multiplies.
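As a rough sketch of what I mean (the image name, environment count, and resource caps here are all placeholders I made up, not a specific setup), something like the following launches several isolated coding environments in parallel on a local machine, assuming Docker is installed:

```python
import subprocess
from concurrent.futures import ThreadPoolExecutor

# Hypothetical setup: each "environment" is a disposable container running a
# placeholder command. In practice it would hold an agent's sandboxed workspace
# (repo checkout, dependencies, test runner, etc.).
IMAGE = "python:3.12-slim"  # assumption: any local image works here
N_ENVS = 8                  # bounded by CPU cores and RAM, not VRAM

def run_env(env_id: int) -> int:
    """Start one throwaway container and wait for it to finish."""
    cmd = [
        "docker", "run", "--rm",
        "--cpus", "1",        # cap each environment at one CPU core
        "--memory", "2g",     # and a fixed slice of RAM
        IMAGE,
        "python", "-c", f"print('environment {env_id} ready')",
    ]
    return subprocess.run(cmd, check=False).returncode

if __name__ == "__main__":
    # Run all environments concurrently; the bottleneck is CPU and RAM,
    # not GPU FLOP/s, which is the point above.
    with ThreadPoolExecutor(max_workers=N_ENVS) as pool:
        results = list(pool.map(run_env, range(N_ENVS)))
    print("exit codes:", results)
```

The resource caps per container are the design choice that matters: they let the number of parallel environments scale with ordinary CPU cores and memory, which is exactly the kind of compute I can afford to own.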
Sam Stevens, 2024