---
title: Introducing Val Town MCP
description: Bring Val Town to your favorite LLM
pubDate: 2025-11-13T00:00:00.000Z
author: Steve Krouse
---

On Val Town, you can deploy TypeScript apps in milliseconds. And now, with the Val Town MCP server, you can do that from Claude, ChatGPT, Cursor, VSCode, or wherever you do your AI coding. Try it!

[Image: Val Town MCP]

If you've been following my tweets ("I've gotta rant about LLMs, MCP, and tool-calling for a second", "MCP is overhyped", etc.), you might be surprised by this announcement. But we think MCP is the right AI form factor for Val Town to meet developers where they are (for now), whether that's Cursor, Claude Code, Zed, or wherever.

We have guides for some of the popular clients—Claude Code, ChatGPT, Claude Web/Desktop—but the server should work with any MCP client. If you'd like a hand with setup, ask in our Discord server or send us an email.
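Most clients that support remote MCP servers register them with a small JSON config. Here's a minimal sketch of the common `mcpServers` shape; the endpoint URL is a placeholder, and the exact config file location and field names vary by client, so follow the guide for yours:

```json
{
  "mcpServers": {
    "val-town": {
      "url": "https://example.com/mcp"
    }
  }
}
```

Once your client picks up the config, the Val Town tools should show up alongside the rest of its tool list.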

## Why MCP

MCP is not perfect (again, see tweets), and the whiplash from AI tooling is real, but MCP has a few things going for it:

  1. Meet developers where they are
  2. Write once, integrate everywhere (+ bet on open standards)
  3. Ship faster

Instead of fast-following all the best AI coding assistants with Townie (more on that in a bit), Val Town MCP can be used with the latest and greatest LLM of your choosing. Users are happier, and it ends the whack-a-mole implementation game.

MCP also lets us ship faster: whereas traditional APIs require careful versioning to stay backward compatible and avoid breaking changes, an MCP spec can change continuously, because LLMs read the spec and run inference at runtime.
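To make that concrete, here's a minimal sketch of a self-describing MCP tool using the official TypeScript SDK. It is illustrative only, not Val Town's actual server code; the tool name, parameters, and behavior are invented for the example.

```ts
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const server = new McpServer({ name: "example-server", version: "1.0.0" });

// The name, description, and parameter schema are what the LLM "reads"
// at runtime. Change them and connected clients pick up the new shape
// the next time they list tools, with no API version bump.
server.tool(
  "create-val",                                   // hypothetical tool name
  "Create a new val from a name and TypeScript source code.",
  { name: z.string(), code: z.string() },
  async ({ name, code }) => {
    // A real server would call the platform's HTTP API here.
    return {
      content: [{ type: "text", text: `Created val "${name}" (${code.length} bytes)` }],
    };
  },
);

await server.connect(new StdioServerTransport());
```

Because the description and input schema travel with the tool, rewording a description or tightening a parameter is just a deploy; there's no separate API contract for clients to fall behind on.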

## How we got here

Before MCP, the happy path of Val Town + AI ran through the `vt` CLI. You'd run `vt watch`, then fire up Claude Code to have your local edits synced instantly to Val Town. Before that, it was Townie, an open-source app builder like Bolt or Lovable, but with Val Town's underlying infrastructure for instant deploys, APIs, crons, email, SQLite, etc. So:

  1. Townie
  2. `vt` CLI
  3. MCP

The common thread running through all of these implementations is that with Val Town, what you and your LLM are coding is live immediately. It's the idea from Bret Victor's *Inventing on Principle* talk, that "Creators need an immediate connection to what they're creating...when you're making something, if you make a change, or you make a decision, you need to see the effect of that immediately." When you—or your LLM—makes an edit, your code is deployed on Val Town immediately. That means your website/API/script/whatever is alive as you (and your LLM) create it.

Val Town's most loyal users (Townies? No, right, that's taken) have been beta testing MCP, and now it's ready for prime time. Bring it to your favorite LLM and let us know what you think.
