OpenAI-Compatible API Proxy v2.0

A production-ready, non-streaming backend that exposes OpenAI-compatible endpoints for both OpenAI and Anthropic Claude models, including full `tool_calls` (function calling) support.

Base URL

https://tmp100nuaa--b8479a5a310011f1a25642dde27851f2.web.val.run/api/v1

Setup

For Claude models (required)

Add your Anthropic API key as an environment variable in this val:

| Key | Value |
| --- | --- |
| `ANTHROPIC_API_KEY` | `sk-ant-...` |

OpenAI models work out-of-the-box via Val Town's built-in proxy (no key needed).


Endpoints

GET /api/v1/models

Returns all available models in OpenAI format. No auth required.

```bash
curl https://.../api/v1/models
```

POST /api/v1/chat/completions

Non-streaming chat completions with tool_calls support.

Auth: any `Bearer <token>` header is accepted.


Model Reference

OpenAI models (built-in, no key needed)

| Model ID | Description |
| --- | --- |
| `gpt-4o` | GPT-4o |
| `gpt-4o-mini` | GPT-4o Mini |
| `gpt-4.1` / `gpt-4.1-mini` / `gpt-4.1-nano` | GPT-4.1 family |
| `gpt-5-nano` | GPT-5 Nano |

Claude models (requires ANTHROPIC_API_KEY)

| Model ID | Upstream | Description |
| --- | --- | --- |
| `claude-opus-4` | `claude-opus-4-5` | Most capable |
| `claude-sonnet-4` | `claude-sonnet-4-5` | Balanced |
| `claude-3-5-sonnet` | `claude-3-5-sonnet-20241022` | Previous gen |
| `claude-3-5-haiku` | `claude-3-5-haiku-20241022` | Fast & cheap |
| `claude-haiku-3-5` | `claude-haiku-3-5-20241022` | Alias |
| `claude-3-opus` | `claude-3-opus-20240229` | Legacy |

Friendly aliases (auto-mapped)

| Alias | Routes to |
| --- | --- |
| `gpt-5`, `gpt-5-turbo`, `gpt-4`, `gpt-4-turbo` | `gpt-4o` |
| `claude-4.6-sonnet`, `claude-4-sonnet`, `claude-sonnet-4-5` | `claude-sonnet-4` |
| `claude-4-opus`, `claude-4.6-opus` | `claude-opus-4` |
| `claude-3-sonnet` | `claude-3-5-sonnet` |
| `gemini-2.5-pro`, `gemini-1.5-pro` | `gpt-4o` (fallback) |
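Alias resolution is presumably a plain lookup before dispatch. A minimal sketch of the table above as a map (illustrative only; the val's `main.ts` is the source of truth, and names here are assumptions):

```python
# Illustrative alias map mirroring the table above (not the val's actual code).
ALIASES = {
    "gpt-5": "gpt-4o", "gpt-5-turbo": "gpt-4o",
    "gpt-4": "gpt-4o", "gpt-4-turbo": "gpt-4o",
    "claude-4.6-sonnet": "claude-sonnet-4",
    "claude-4-sonnet": "claude-sonnet-4",
    "claude-sonnet-4-5": "claude-sonnet-4",
    "claude-4-opus": "claude-opus-4",
    "claude-4.6-opus": "claude-opus-4",
    "claude-3-sonnet": "claude-3-5-sonnet",
    "gemini-2.5-pro": "gpt-4o",  # fallback
    "gemini-1.5-pro": "gpt-4o",  # fallback
}

def resolve_model(model_id: str) -> str:
    """Map a friendly alias to a served model ID; unknown IDs pass through."""
    return ALIASES.get(model_id, model_id)
```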

Examples

Non-streaming chat (OpenAI)

```bash
curl -X POST https://.../api/v1/chat/completions \
  -H "Authorization: Bearer any-token" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```

Non-streaming chat (Claude)

```bash
curl -X POST https://.../api/v1/chat/completions \
  -H "Authorization: Bearer any-token" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-3-5-sonnet",
    "messages": [
      {"role": "system", "content": "You are a helpful assistant."},
      {"role": "user", "content": "What is 2+2?"}
    ]
  }'
```

Tool calling (Claude)

```bash
curl -X POST https://.../api/v1/chat/completions \
  -H "Authorization: Bearer any-token" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-3-5-sonnet",
    "messages": [{"role": "user", "content": "Whats the weather in Beijing?"}],
    "tools": [{
      "type": "function",
      "function": {
        "name": "get_weather",
        "description": "Get weather for a city",
        "parameters": {
          "type": "object",
          "properties": {
            "city": {"type": "string"},
            "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]}
          },
          "required": ["city"]
        }
      }
    }],
    "tool_choice": "auto"
  }'
```

Multi-turn tool calling (returning tool result)

```bash
curl -X POST https://.../api/v1/chat/completions \
  -H "Authorization: Bearer any-token" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-3-5-sonnet",
    "messages": [
      {"role": "user", "content": "Whats the weather in Beijing?"},
      {"role": "assistant", "content": null, "tool_calls": [{"id": "call_123", "type": "function", "function": {"name": "get_weather", "arguments": "{\"city\":\"Beijing\"}"}}]},
      {"role": "tool", "tool_call_id": "call_123", "content": "{\"temp\": 22, \"condition\": \"Sunny\"}"}
    ]
  }'
```

Python (OpenAI SDK)

```python
from openai import OpenAI

client = OpenAI(
    api_key="any-token",
    base_url="https://tmp100nuaa--b8479a5a310011f1a25642dde27851f2.web.val.run/api/v1",
)

# Simple chat
response = client.chat.completions.create(
    model="claude-3-5-sonnet",
    messages=[{"role": "user", "content": "Hello Claude!"}],
)
print(response.choices[0].message.content)

# With tools
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get current weather",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]
response = client.chat.completions.create(
    model="claude-3-5-sonnet",
    messages=[{"role": "user", "content": "Weather in Shanghai?"}],
    tools=tools,
    tool_choice="auto",
)
msg = response.choices[0].message
if msg.tool_calls:
    print("Tool called:", msg.tool_calls[0].function.name)
    print("Arguments:", msg.tool_calls[0].function.arguments)
```
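To close the tool loop after the SDK reports a `tool_calls` response, run the tool yourself, then send a second request whose message list echoes the assistant's call and attaches the result as a `tool` message. A small helper sketching that list construction (hypothetical name, plain-dict message shapes matching the curl example above):

```python
import json

def append_tool_result(messages, tool_call, result):
    """Extend an OpenAI-style message list with the assistant's tool call
    and the corresponding tool result, ready for the follow-up request."""
    return messages + [
        {
            "role": "assistant",
            "content": None,
            "tool_calls": [{
                "id": tool_call["id"],
                "type": "function",
                "function": tool_call["function"],
            }],
        },
        {
            "role": "tool",
            "tool_call_id": tool_call["id"],
            "content": json.dumps(result),
        },
    ]
```

Pass the extended list to a second `client.chat.completions.create(...)` call to get the model's final answer.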

Notes

  • Streaming is disabled; set `stream: false` or omit it entirely.
  • `tool_choice` mapping: `"auto"` → Anthropic `auto`, `"required"` → Anthropic `any`, `{"type":"function","function":{"name":"..."}}` → Anthropic `tool` type.
  • `system` messages are lifted to Anthropic's top-level `system` parameter.
  • `tool` role messages (tool results) are converted to Anthropic's `tool_result` block format.
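The three conversions above can be sketched in one translation step. This is an illustrative reimplementation, not the val's actual `main.ts`; it assumes standard OpenAI request shapes on input and Anthropic Messages shapes on output:

```python
import json

def to_anthropic(body):
    """Sketch of the OpenAI -> Anthropic request translation described above."""
    system_parts, messages = [], []
    for m in body["messages"]:
        if m["role"] == "system":
            # system messages are lifted to the top-level `system` parameter
            system_parts.append(m["content"])
        elif m["role"] == "tool":
            # tool results become `tool_result` blocks inside a user message
            messages.append({"role": "user", "content": [{
                "type": "tool_result",
                "tool_use_id": m["tool_call_id"],
                "content": m["content"],
            }]})
        elif m["role"] == "assistant" and m.get("tool_calls"):
            # assistant tool_calls become `tool_use` content blocks
            messages.append({"role": "assistant", "content": [{
                "type": "tool_use",
                "id": c["id"],
                "name": c["function"]["name"],
                "input": json.loads(c["function"]["arguments"]),
            } for c in m["tool_calls"]]})
        else:
            messages.append({"role": m["role"], "content": m["content"]})

    # tool_choice mapping: "auto" -> auto, "required" -> any, named function -> tool
    choice = body.get("tool_choice", "auto")
    if choice == "required":
        tool_choice = {"type": "any"}
    elif isinstance(choice, dict):
        tool_choice = {"type": "tool", "name": choice["function"]["name"]}
    else:
        tool_choice = {"type": "auto"}

    out = {"messages": messages, "tool_choice": tool_choice}
    if system_parts:
        out["system"] = "\n".join(system_parts)
    return out
```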