Livestorm API MCP Server

This project creates a Model Context Protocol (MCP) server that wraps the Livestorm API, exposing:

  • GET endpoints as Resources
  • POST, PUT, DELETE endpoints as Tools

How it works

  1. The server fetches and parses the Livestorm API's OpenAPI definition
  2. It dynamically creates MCP Resources and Tools based on the API endpoints
  3. When a client reads a Resource or invokes a Tool, the server proxies the request to the Livestorm API
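
In sketch form, the flow looks roughly like this (the OpenAPI URL, base URL, auth header format, and function names below are illustrative assumptions, not the project's actual code in livestorm.ts / mcp.ts):

// Sketch only — placeholder names, not the project's actual identifiers.
const OPENAPI_URL = "https://example.com/livestorm-openapi.json"; // placeholder URL
const API_BASE = "https://api.livestorm.co/v1"; // assumed Livestorm REST base URL

interface Endpoint {
  method: string; // "get" | "post" | "put" | "delete"
  path: string;   // e.g. "/sessions/{id}"
}

// Step 1: fetch and parse the OpenAPI definition into a flat list of endpoints.
async function loadEndpoints(): Promise<Endpoint[]> {
  const spec = await (await fetch(OPENAPI_URL)).json();
  const endpoints: Endpoint[] = [];
  for (const [path, operations] of Object.entries(spec.paths ?? {})) {
    for (const method of Object.keys(operations as Record<string, unknown>)) {
      endpoints.push({ method, path });
    }
  }
  return endpoints;
}

// Step 2: GET endpoints become Resources; POST/PUT/DELETE endpoints become Tools.
const isResource = (e: Endpoint) => e.method === "get";

// Step 3: Resource reads and Tool calls are proxied to the Livestorm API.
async function proxy(e: Endpoint, params: Record<string, string>, body?: unknown) {
  const path = e.path.replace(/\{(\w+)\}/g, (_, name) => params[name] ?? "");
  const res = await fetch(`${API_BASE}${path}`, {
    method: e.method.toUpperCase(),
    headers: {
      // Token comes from the LIVESTORM_API_TOKEN env var (see .env below);
      // check Livestorm's docs for the exact authorization header scheme.
      Authorization: Deno.env.get("LIVESTORM_API_TOKEN") ?? "",
      "Content-Type": "application/json",
    },
    body: body === undefined ? undefined : JSON.stringify(body),
  });
  return await res.json();
}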

MCP Definition

When defining the MCP server in your LLM client application, use one of the following snippets:

Streaming HTTP

{ "mcpServers": { "livestorm-api": { "type": "streamable-http", "url": "https://supagroova--7fab7ae4322911f080e9569c3dd06744.web.val.run/mcp", "note": "For Streamable HTTP connections, add this URL directly in Client" } } }

SSE

{ "mcpServers": { "livestorm-api": { "type": "sse", "serverUrl": "https://supagroova--7fab7ae4322911f080e9569c3dd06744.web.val.run/sse" } } }

Files

  • index.ts: Main entry point with HTTP trigger
  • livestorm.ts: Functions to fetch and parse the OpenAPI definition
  • mcp.ts: MCP server setup and configuration
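
Although the exact contents of these files differ, a rough sketch of how they can fit together (the default-export handler style is inferred from the Val Town-hosted URL above, and handleMcpRequest is a hypothetical name, not necessarily what mcp.ts exports):

// Hypothetical shape of index.ts: a fetch-style handler exported for the hosted
// HTTP trigger, also served locally via Deno.serve when RUN_LOCAL is set.
import { handleMcpRequest } from "./mcp.ts"; // assumed export name

export default async function handler(req: Request): Promise<Response> {
  return await handleMcpRequest(req);
}

if (Deno.env.get("RUN_LOCAL")) {
  const port = Number(Deno.env.get("PORT") ?? "8787");
  Deno.serve({ port }, handler);
}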

Running Locally with Deno

You can run this MCP server locally using the Deno runtime, which supports TypeScript, npm: specifiers, and URL imports out of the box.

Using a .env file

You can store environment variables in a .env file in the project root. They are loaded automatically when the server starts (via Deno std/dotenv) and are useful for API tokens and other local configuration.

Example .env:

RUN_LOCAL=1
PORT=8787
LIVESTORM_API_TOKEN=your-livestorm-api-token-here
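
For reference, automatic .env loading with std/dotenv can be as simple as a side-effect import near the top of the entry point (the exact import specifier used in this project may differ):

// Reads .env and populates Deno.env as a side effect of the import.
import "jsr:@std/dotenv/load";

// Values are then available anywhere via Deno.env:
const token = Deno.env.get("LIVESTORM_API_TOKEN");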

1. Install Deno

With Homebrew (macOS):

brew install deno

2. Run the server

In your project directory, run:

deno task dev

To specify a custom port:

PORT=3000 deno task dev

Deno will automatically fetch and cache all dependencies.
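
The dev task is defined in the project's Deno config; a minimal deno.json sketch of what such a task can look like (the actual flags in this repo may differ):

{
  "tasks": {
    "dev": "deno run --watch --allow-net --allow-env --allow-read index.ts"
  }
}

Here --watch restarts the server on file changes, and the allow flags grant the network, environment, and file access a proxying server typically needs.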