Learn how AI coding tools transform your ideas into code through implicit tools and text generation
Aman Khan reveals what happens after Bolt gets your requirements - breaking down the “agent tools” that make the coding magic happen. These aren’t external APIs or function calls, Aman explains. They’re implicit tools - capabilities defined right in the system prompt that work together to transform your ideas into code.

First, there’s structure the problem - the reasoning output that thinks about what to build. Then framework selection kicks in: the agent examines your task and decides whether to use React, Python, or Next.js. You’ll see this in the chain-of-thought reasoning: “This project is a React TypeScript application.”

While that’s happening, the terminal tool creates the environment, installing packages and running commands to execute the code. Meanwhile, the retrieval tool references React or TypeScript documentation, making sure the agent has the right context for what the code should look like.

“Now why do I keep using this word implicit tools?” Aman asks. The answer reveals how simple the underlying system really is. Everything - the agent tools, context, user input, and system prompt - gets combined and sent to the LLM as one request. (There’s a rough sketch of what that combined request might look like at the end of this post.)

After looking at the code and diving deeper, I found it surprisingly simple - because it is. The LLM just generates text: text for the plan, text for the PRD, text for the code itself, and text for the terminal commands. All of this gets piped into what’s essentially WebAssembly - “ported to Wasm,” as Aman puts it, “basically saying, here’s a ton of text.”

In practice, Bolt operates in two modes: code (writing the actual files) and preview (executing via the terminal). When you ask follow-up questions like “change the color of this box,” Bolt uses what Aman calls “memory” - maintaining your entire state while adding new context to accomplish the specific task.

This architecture explains why my button didn’t work in the demo. The agent can’t search for additional button options or reference the internet for patterns. It’s limited to the tools defined in its prompt. That’s why Bolt open-sources its code - so developers can add more tools and make it smarter for specific use cases.

➡️ AI coding tools are sophisticated text generators producing code, instructions, and commands - all constrained by their implicit tools. Understanding these limitations helps you work within them or extend them for your needs.

Check out Aman’s course (not sponsored, I just learn a ton from him).
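P.S. To make the “one request” idea concrete, here’s a minimal TypeScript sketch of how a tool like Bolt might assemble that request. This is not Bolt’s actual code - the names (`buildRequest`, `callLLM`) and the prompt wording are my own assumptions - but it shows how the implicit tools, the retrieved docs, the prior turns (the “memory”), and your new ask all end up as one block of text handed to the model.

```ts
// Hypothetical sketch - not Bolt's real implementation.
type Message = { role: "system" | "user" | "assistant"; content: string };

// The "implicit tools" live entirely in the system prompt as instructions.
const systemPrompt = `
You are a coding agent. Before writing code:
- Structure the problem: reason about what to build.
- Select a framework (React, Python, or Next.js) based on the task.
- Emit terminal commands to install packages and run the project.
- Rely only on the documentation included below.
`;

// One request = system prompt + retrieved docs + prior turns ("memory") + the new ask.
function buildRequest(
  history: Message[],
  docsContext: string,
  userInput: string
): Message[] {
  return [
    { role: "system", content: `${systemPrompt}\n\nDocs:\n${docsContext}` },
    ...history, // follow-ups ride on top of the full prior state
    { role: "user", content: userInput },
  ];
}

// The reply is only text: a plan, code files, and terminal commands.
// The runtime then writes the files (code mode) or runs the commands (preview mode).
async function callLLM(messages: Message[]): Promise<string> {
  // Placeholder for whatever model API the tool actually calls.
  return "...generated plan, code files, and terminal commands...";
}

// Example follow-up turn: the whole prior conversation is resent as "memory".
const request = buildRequest(
  [{ role: "assistant", content: "...the code generated so far..." }],
  "React + TypeScript docs excerpts go here",
  "Change the color of this box"
);
```

Seen this way, “extending” a tool like Bolt mostly means editing that system prompt and the context you pipe in alongside it - which is exactly what the open-source code lets you do.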