| title: | Introducing Val Town MCP |
|---|---|
| description: | Bring Val Town to your favorite LLM |
| pubDate: | 2025-11-13T00:00:00.000Z |
| author: | Pete Millspaugh |
On Val Town, you can deploy TypeScript apps in milliseconds, and with the new Val Town MCP server you can do that from Claude, ChatGPT, Cursor, VSCode, or wherever you do your AI coding. Try it out!
Val Town MCP and Claude Code creating a Hono+SQLite blog deployed instantly on Val Town. Demo by Jackson, who implemented Val Town MCP.
If you've been following Steve's tweets ("I've gotta rant about LLMs, MCP, and tool-calling for a second", "MCP is overhyped", etc.) you might be surprised by this announcement. But we think MCP is the right AI form factor for Val Town to meet developers where they are (for now), whether that's Cursor, Claude Code, Zed, or wherever.
We have guides for some of the popular LLMs, but it should work with any MCP client. If you'd like a hand with setup, ask in our Discord server or send us an email.
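Setup varies by client, but most MCP clients accept a JSON entry pointing at the server. Here's a minimal sketch assuming a client that reads an mcpServers config block; the URL is a hypothetical placeholder, so check the guides for the real endpoint:

```json
{
  "mcpServers": {
    "valtown": {
      "url": "https://YOUR-VAL-TOWN-MCP-URL"
    }
  }
}
```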
So why MCP?
- Meet developers where they are
- Write once, integrate everywhere (+ bet on open standards)
- Ship faster, thanks to OAuth DCR
Instead of us fast-following every new AI coding assistant with Townie (more on that in a bit), Val Town MCP works with the latest and greatest LLM of your choosing. Users are happier, and it ends the whack-a-mole implementation game.
Whereas tool calling required a different implementation per LLM, betting on MCP
is betting on an open standard. Write once, integrate everywhere. It aligns with
Val Town's other bets on Web standards, like using Deno instead of a custom
runtime with magic like console.email. Just as Val Town shifted from custom
syntax to standard JavaScript, it's now shifting from a custom AI code assistant
to a standard tool that any LLM can pull off the shelf. From a walled garden to
an open ecosystem.
MCP also solves auth in a way that OpenAPI didn't, via OAuth DCR (Dynamic Client Registration). And not only does it solve auth, it unlocks faster MCP improvements: whereas a traditional API requires careful versioning for backward compatibility to prevent breaking changes, an MCP server can change continuously, because LLMs read its spec and run inference at runtime.
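To make DCR concrete, here's a sketch of the handshake from the client's side, per RFC 7591: the client POSTs its own metadata to the server's registration endpoint and receives a client_id back, with no manual API key setup. The field values below are illustrative, not Val Town's actual configuration.

```typescript
// Sketch of OAuth 2.0 Dynamic Client Registration (RFC 7591).
// An MCP client registers itself with the server automatically,
// so users never copy API keys by hand. Values are illustrative.

interface RegistrationRequest {
  client_name: string;
  redirect_uris: string[];
  grant_types: string[];
  token_endpoint_auth_method: string;
}

function buildRegistrationRequest(
  clientName: string,
  redirectUri: string,
): RegistrationRequest {
  return {
    client_name: clientName,
    redirect_uris: [redirectUri],
    grant_types: ["authorization_code"], // standard code flow
    token_endpoint_auth_method: "none",  // public client, PKCE-protected
  };
}

// The client POSTs this body to the server's advertised registration
// endpoint and receives a client_id in the response:
async function registerClient(endpoint: string, body: RegistrationRequest) {
  const res = await fetch(endpoint, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(body),
  });
  if (!res.ok) throw new Error(`registration failed: ${res.status}`);
  return res.json(); // { client_id, ... }
}
```

Because registration happens at runtime, any MCP client a user brings can authorize itself against the server without Val Town pre-registering it.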
MCP is not perfect (again, see Steve's tweets), and the whiplash from AI tooling has real costs.
Before MCP, the happy path of Val Town + AI ran through the
vt CLI. You'd run vt watch, then fire up
Claude Code to have your local edits synced instantly to Val Town. Before
that, it was Townie, an open source app builder
like Bolt or Lovable but with Val Town's underlying infrastructure for instant
deploys, APIs, crons, email, SQLite, etc. So:
- Townie
- vt CLI
- MCP
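For reference, the vt CLI workflow looked roughly like this (a sketch; it assumes the vt CLI is installed and authenticated, and that your agent of choice is Claude Code):

```shell
# Terminal 1: sync local edits to Val Town as you save
vt watch

# Terminal 2: fire up your coding agent, e.g. Claude Code
claude
```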
The common thread running through all of these implementations is that with Val Town, what you and your LLM are coding is live immediately. It's the idea from Bret Victor's Inventing on Principle talk, that "Creators need an immediate connection to what they're creating...when you're making something, if you make a change, or you make a decision, you need to see the effect of that immediately." When you (or your LLM) make an edit, your code is deployed on Val Town immediately. That means your website/API/script/whatever is alive as you (and your LLM) create it.
Val Town's most loyal users (Townies? No, right, that's taken) have been beta testing MCP, and now it's ready for prime time. Bring it to your favorite LLM and let us know what you think.