# openai-compatible-proxy
A production-ready, non-streaming backend that exposes OpenAI-compatible endpoints for both OpenAI and Anthropic Claude models — including full tool_calls / function calling support.
**Base URL:** `https://tmp100nuaa--b8479a5a310011f1a25642dde27851f2.web.val.run/api/v1`
## Setup

Add your Anthropic API key as an environment variable in this val:

| Key | Value |
|---|---|
| `ANTHROPIC_API_KEY` | `sk-ant-...` |
OpenAI models work out-of-the-box via Val Town's built-in proxy (no key needed).
## GET /api/v1/models

Returns all available models in OpenAI format. No auth required.

```bash
curl https://.../api/v1/models
```
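Assuming the endpoint returns the standard OpenAI list shape (`{"object": "list", "data": [...]}`), the response is straightforward to parse. The payload below is illustrative, not the val's actual output:

```python
import json

# Illustrative /api/v1/models payload in the standard OpenAI list shape.
sample = json.loads("""
{
  "object": "list",
  "data": [
    {"id": "gpt-4o-mini", "object": "model", "owned_by": "openai"},
    {"id": "claude-3-5-sonnet", "object": "model", "owned_by": "anthropic"}
  ]
}
""")

# Collect just the model IDs, as a client picker would.
model_ids = [m["id"] for m in sample["data"]]
print(model_ids)  # ['gpt-4o-mini', 'claude-3-5-sonnet']
```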
## POST /api/v1/chat/completions

Non-streaming chat completions with `tool_calls` support.

**Auth:** any `Bearer <token>` header is accepted.
### OpenAI models

| Model ID | Description |
|---|---|
| `gpt-4o` | GPT-4o |
| `gpt-4o-mini` | GPT-4o Mini |
| `gpt-4.1` / `gpt-4.1-mini` / `gpt-4.1-nano` | GPT-4.1 family |
| `gpt-5-nano` | GPT-5 Nano |
### Claude models

| Model ID | Upstream | Description |
|---|---|---|
| `claude-opus-4` | `claude-opus-4-5` | Most capable |
| `claude-sonnet-4` | `claude-sonnet-4-5` | Balanced |
| `claude-3-5-sonnet` | `claude-3-5-sonnet-20241022` | Previous gen |
| `claude-3-5-haiku` | `claude-3-5-haiku-20241022` | Fast & cheap |
| `claude-haiku-3-5` | `claude-haiku-3-5-20241022` | Alias |
| `claude-3-opus` | `claude-3-opus-20240229` | Legacy |
### Aliases

| Alias | Routes to |
|---|---|
| `gpt-5`, `gpt-5-turbo`, `gpt-4`, `gpt-4-turbo` | `gpt-4o` |
| `claude-4.6-sonnet`, `claude-4-sonnet`, `claude-sonnet-4-5` | `claude-sonnet-4` |
| `claude-4-opus`, `claude-4.6-opus` | `claude-opus-4` |
| `claude-3-sonnet` | `claude-3-5-sonnet` |
| `gemini-2.5-pro`, `gemini-1.5-pro` | `gpt-4o` (fallback) |
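The alias rows amount to a dictionary lookup before dispatch. A hypothetical sketch (`ALIASES` mirrors the table above; this is not the val's actual source):

```python
# Hypothetical alias table mirroring the README table.
# Unknown model IDs fall through unchanged.
ALIASES = {
    "gpt-5": "gpt-4o",
    "gpt-5-turbo": "gpt-4o",
    "gpt-4": "gpt-4o",
    "gpt-4-turbo": "gpt-4o",
    "claude-4.6-sonnet": "claude-sonnet-4",
    "claude-4-sonnet": "claude-sonnet-4",
    "claude-sonnet-4-5": "claude-sonnet-4",
    "claude-4-opus": "claude-opus-4",
    "claude-4.6-opus": "claude-opus-4",
    "claude-3-sonnet": "claude-3-5-sonnet",
    "gemini-2.5-pro": "gpt-4o",  # fallback
    "gemini-1.5-pro": "gpt-4o",  # fallback
}

def resolve_model(model_id: str) -> str:
    """Map a requested model ID to the ID actually served."""
    return ALIASES.get(model_id, model_id)

print(resolve_model("gpt-5"))            # gpt-4o
print(resolve_model("claude-3-sonnet"))  # claude-3-5-sonnet
print(resolve_model("gpt-4o-mini"))      # gpt-4o-mini (no alias, unchanged)
```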
## Examples

### Basic chat

```bash
curl -X POST https://.../api/v1/chat/completions \
  -H "Authorization: Bearer any-token" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```
### With a system message

```bash
curl -X POST https://.../api/v1/chat/completions \
  -H "Authorization: Bearer any-token" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-3-5-sonnet",
    "messages": [
      {"role": "system", "content": "You are a helpful assistant."},
      {"role": "user", "content": "What is 2+2?"}
    ]
  }'
```
### Function calling

```bash
curl -X POST https://.../api/v1/chat/completions \
  -H "Authorization: Bearer any-token" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-3-5-sonnet",
    "messages": [{"role": "user", "content": "Whats the weather in Beijing?"}],
    "tools": [{
      "type": "function",
      "function": {
        "name": "get_weather",
        "description": "Get weather for a city",
        "parameters": {
          "type": "object",
          "properties": {
            "city": {"type": "string"},
            "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]}
          },
          "required": ["city"]
        }
      }
    }],
    "tool_choice": "auto"
  }'
```
### Returning a tool result

```bash
curl -X POST https://.../api/v1/chat/completions \
  -H "Authorization: Bearer any-token" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-3-5-sonnet",
    "messages": [
      {"role": "user", "content": "Whats the weather in Beijing?"},
      {"role": "assistant", "content": null, "tool_calls": [{"id": "call_123", "type": "function", "function": {"name": "get_weather", "arguments": "{\"city\":\"Beijing\"}"}}]},
      {"role": "tool", "tool_call_id": "call_123", "content": "{\"temp\": 22, \"condition\": \"Sunny\"}"}
    ]
  }'
```
### Python (openai SDK)

```python
from openai import OpenAI

client = OpenAI(
    api_key="any-token",
    base_url="https://tmp100nuaa--b8479a5a310011f1a25642dde27851f2.web.val.run/api/v1"
)

# Simple chat
response = client.chat.completions.create(
    model="claude-3-5-sonnet",
    messages=[{"role": "user", "content": "Hello Claude!"}]
)
print(response.choices[0].message.content)

# With tools
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get current weather",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"]
        }
    }
}]

response = client.chat.completions.create(
    model="claude-3-5-sonnet",
    messages=[{"role": "user", "content": "Weather in Shanghai?"}],
    tools=tools,
    tool_choice="auto"
)

msg = response.choices[0].message
if msg.tool_calls:
    print("Tool called:", msg.tool_calls[0].function.name)
    print("Arguments:", msg.tool_calls[0].function.arguments)
```
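To complete the round-trip after a tool call, the tool's output is appended as a `tool` role message and the conversation is re-sent. A sketch of the message-building step, with a stand-in `get_weather` implementation and a hard-coded tool call for illustration:

```python
import json

def get_weather(city: str) -> dict:
    # Stand-in tool implementation; a real one would call a weather API.
    return {"temp": 22, "condition": "Sunny", "city": city}

# Shape of one entry from choices[0].message.tool_calls,
# flattened to a plain dict for illustration.
tool_call = {
    "id": "call_123",
    "type": "function",
    "function": {"name": "get_weather", "arguments": "{\"city\":\"Beijing\"}"},
}

# Parse the JSON-encoded arguments and run the tool locally.
args = json.loads(tool_call["function"]["arguments"])
result = get_weather(**args)

# Follow-up messages: echo the assistant's tool_calls, then attach the result.
followup = [
    {"role": "user", "content": "Whats the weather in Beijing?"},
    {"role": "assistant", "content": None, "tool_calls": [tool_call]},
    {"role": "tool", "tool_call_id": tool_call["id"], "content": json.dumps(result)},
]
# `followup` is what you pass as `messages` in the next
# client.chat.completions.create(...) call.
```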
## Notes

- Streaming is disabled; set `stream: false` or omit it entirely.
- `tool_choice` mapping: `"auto"` → Anthropic `auto`, `"required"` → Anthropic `any`, `{"type":"function","function":{"name":"..."}}` → Anthropic `tool` type.
- `system` messages are lifted to Anthropic's top-level `system` parameter.
- `tool` role messages (tool results) are converted to Anthropic's `tool_result` block format.
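The `tool_choice` mapping in the notes can be sketched as a small translation function. This is a hypothetical helper written from the mapping as documented, not the val's actual source:

```python
def map_tool_choice(tool_choice):
    """Translate an OpenAI-style tool_choice into Anthropic's format,
    per the mapping documented in the notes."""
    if tool_choice is None or tool_choice == "auto":
        return {"type": "auto"}
    if tool_choice == "required":
        return {"type": "any"}
    if isinstance(tool_choice, dict) and tool_choice.get("type") == "function":
        # Force a specific tool by name.
        return {"type": "tool", "name": tool_choice["function"]["name"]}
    return {"type": "auto"}  # safe default for anything unrecognized

print(map_tool_choice("required"))  # {'type': 'any'}
print(map_tool_choice({"type": "function", "function": {"name": "get_weather"}}))
# {'type': 'tool', 'name': 'get_weather'}
```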