---
title: Introducing Val Town MCP
description: Bring Val Town to your favorite LLM
pubDate: 2025-11-14T00:00:00.000Z
author: Steve Krouse
---

On Val Town, you deploy JavaScript in 100ms. Now with the Val Town MCP server, you can do that from Claude, ChatGPT, Cursor, VSCode, or wherever you do your AI coding. Try it!

[Image: Val Town MCP]

If you've been following my tweets recently – "I've gotta rant about LLMs, MCP, and tool-calling for a second", "MCP is mostly nonsense", "MCP is overhyped" – you might be surprised by this announcement. Well, how did you think I got those salty takes except by building an MCP server?

Yes, I think MCP is dubious as a protocol. But for now, MCP is the right way for Val Town to meet developers where they are – in Cursor, Claude Code, Zed, or wherever. For example, here we use Claude Code to make a CRUD blog app in a couple of prompts.

We have guides for some of the popular LLMs:

  • Claude Code
  • ChatGPT
  • Claude Web/Desktop

But it should work with any MCP client. If you'd like a hand with setup, ask in our Discord server or send us an email.
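
It should work with any client because connecting is just the standard protocol handshake. As an illustration, here's what a bare-bones client looks like with the official MCP TypeScript SDK (`@modelcontextprotocol/sdk`) over streamable HTTP. The endpoint URL below is a placeholder, not the real Val Town MCP address – the guides above have the actual one.

```ts
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StreamableHTTPClientTransport } from "@modelcontextprotocol/sdk/client/streamableHttp.js";

// Placeholder endpoint – substitute the real Val Town MCP URL from the guides.
const transport = new StreamableHTTPClientTransport(new URL("https://example.com/mcp"));

const client = new Client({ name: "my-mcp-client", version: "0.0.1" });
await client.connect(transport);

// Ask the server which tools it exposes.
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));
```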

Why MCP

MCP is not perfect (again, see tweets), and the whiplash from AI tooling is real, but MCP has a few things going for it:

  1. Cheaper – Don't pay us for credits. Pay the inference provider directly.
  2. Better – Use whatever state-of-the-art LLM you want. We at Val Town don't have to fast-follow it.
  3. Val Town everywhere – Get the best parts of Val Town – instant deployments, built-in SQLite, etc. – in your favorite LLM coding tool (see the SQLite sketch after this list).
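
On that last point, here's roughly what built-in SQLite looks like from inside a val. This is a minimal sketch based on Val Town's std library; treat the exact import path and API shape as assumptions and check the docs for the current interface.

```ts
// Minimal sketch of Val Town's built-in SQLite (import path assumed – check the docs).
import { sqlite } from "https://esm.town/v/std/sqlite";

// No database to provision or connect to: just execute SQL.
await sqlite.execute(
  `create table if not exists posts (id integer primary key, title text)`,
);
await sqlite.execute({
  sql: `insert into posts (title) values (?)`,
  args: ["Introducing Val Town MCP"],
});

const result = await sqlite.execute(`select id, title from posts`);
console.log(result.rows);
```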

MCP also allows us to ship faster. Traditional APIs require careful versioning to prevent breaking changes, but an MCP spec can change continuously because LLMs read the spec and run inference at runtime.
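
To make that concrete, here's a sketch of declaring a tool with the official MCP TypeScript SDK. Clients discover the tool's name, description, and input schema at runtime via `tools/list`, so editing the description below "ships" instantly to every connected LLM. The `create_val` tool here is hypothetical, not the actual Val Town MCP surface.

```ts
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const server = new McpServer({ name: "valtown-sketch", version: "0.0.1" });

// Hypothetical tool: the description and schema are served to clients at
// runtime, so changing them requires no client-side upgrade or versioning.
server.tool(
  "create_val",
  "Create a new val from a name and TypeScript source code.",
  { name: z.string(), code: z.string() },
  async ({ name, code }) => ({
    content: [{ type: "text", text: `Created val ${name} (${code.length} bytes)` }],
  }),
);

await server.connect(new StdioServerTransport());
```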

How we got here

There's a common thread running through every feature we build – AI or otherwise. On Val Town, whatever you and your LLM are coding is live immediately. It's the idea from Bret Victor's Inventing on Principle talk:

"Creators need an immediate connection to what they're creating... when you're making something, if you make a change, or you make a decision, you need to see the effect of that immediately."

When you (or your LLM) make an edit, your code is deployed on Val Town immediately. We believe in maintaining a single environment that runs your code: the deployed environment. All code should be always-already deployed – on save, in 100ms or less. Running code locally is a constant tax on collaboration and feedback loops. With Val Town, you can iterate directly in production – either right on prod (when the stakes are low) or in a feature branch.
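
Concretely, an HTTP val is just a module whose default export handles a Request – saving the file is the deploy step. The handler below is a minimal sketch of that documented shape, not a full app:

```ts
// A minimal HTTP val: export a fetch-style handler and save – the new
// behavior is live at the val's URL in roughly 100ms, no deploy command.
export default async function (req: Request): Promise<Response> {
  const { pathname } = new URL(req.url);
  if (pathname === "/health") {
    return Response.json({ ok: true });
  }
  return new Response("Hello from a freshly deployed val!");
}
```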

Val Town isn't an AI company, but this always-deployed model works quite well with LLMs. Just give your favorite LLM a branch, and the code it writes will be alive and sharable by default.

Bring Val Town MCP to your favorite LLM, and let us know what you think.
