Once you've piped input into a val, you can analyze or classify data with [OpenAI](https://docs.val.town/std/openai/) or Anthropic, enrich it with browser automation via [Browserbase](https://docs.val.town/integrations/browserbase/) or [Kernel](https://docs.val.town/integrations/kernel/), verify phone numbers, and more.

We charge a 50% markup on top of raw LLM costs. If you use $10 in Townie credits, Anthropic will get $6.66 and we'll get $3.33. We think this is fair, sustainable, and transparent. We don't want to be in the business of having murky limits, obfuscated credits, or unsustainable margins.
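As a quick sanity check on those numbers, the split implied by a 50% markup can be computed directly. This helper is purely illustrative (the function name is ours, not part of any Val Town API):

```typescript
// Illustrative only: split a Townie charge into raw LLM cost and margin,
// assuming the charge is the raw provider cost plus a 50% markup (×1.5).
function creditSplit(chargeUsd: number): { providerCost: number; margin: number } {
  const providerCost = chargeUsd / 1.5; // undo the 50% markup
  const margin = chargeUsd - providerCost;
  return { providerCost, margin };
}

// A $10 charge works out to roughly $6.67 for the provider and $3.33 in margin,
// matching the $6.66 / $3.33 split quoted above (up to rounding).
console.log(creditSplit(10));
```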
- [x] Add a "view source" / "send me a PR" link
- [x] Show the HTTP preview in a second column if there is one (and let the user pick which one to preview in the iframe)
- [x] Figure out a convention to teach in the Anthropic prompt mod where the LLM always checks the README for the scope (if not provided) and keeps it up to date with every change
- [x] Ability to create new projects from the interface
- [x] Figure out why OpenTownie can't create HTTP vals. Maybe give it a separate tool for it?
- [x] Start a timer for messages
- [x] Add more indicators that it's "still working"
- [x] Require users to supply their own Anthropic token?
- [x] Add cost indications on messages
- [x] Add a bell noise when the message is done to let us know

```ts
import { readFile } from "https://esm.town/v/std/utils/index.ts";
import { createAnthropic } from "npm:@ai-sdk/anthropic@1.2.12";
import { convertToCoreMessages } from "npm:ai";

const {
  project,
  branchId,
  // anthropicApiKey,
  // bearerToken,
  selectedFiles,
} = await c.req.json();

const apiKey = Deno.env.get("ANTHROPIC_API_KEY");
if (await hasInsufficientCredits({ bearerToken })) {
  // ...
}

const anthropic = createAnthropic({ apiKey });
let tracedModel = anthropic(model);
if (Deno.env.get("POSTHOG_PROJECT_API_KEY")) {
  const traceId = `townie_${rowid}_${Date.now()}`;
  // Wrap the Anthropic model with PostHog tracing
  tracedModel = withTracing(anthropic(model), phClient, {
    posthogDistinctId: user.id,
    posthogTraceId: traceId,
  });
}

// Mark the last message as cacheable so Anthropic reuses the prompt prefix
// @ts-ignore
lastMessage.content.at(-1).providerOptions = {
  anthropic: { cacheControl: { type: "ephemeral" } },
};

// Record token usage, including prompt-cache reads and writes
output_tokens: result.usage.completionTokens,
cache_read_tokens: result.providerMetadata.anthropic.cacheReadInputTokens,
cache_write_tokens: result.providerMetadata.anthropic.cacheCreationInputTokens,
```

Townie is fully open-source and itself runs
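The usage fields above feed Townie's per-message cost indicator. A minimal sketch of how raw token counts might translate into dollars — the rate values and all names below are placeholders we chose for illustration, not Anthropic's actual pricing or Townie's code:

```typescript
// Hypothetical per-million-token rates in USD; real rates vary by model.
interface Rates { input: number; output: number; cacheRead: number; cacheWrite: number }
interface Usage { inputTokens: number; outputTokens: number; cacheReadTokens: number; cacheWriteTokens: number }

function messageCostUsd(u: Usage, r: Rates): number {
  return (
    u.inputTokens * r.input +
    u.outputTokens * r.output +
    u.cacheReadTokens * r.cacheRead +   // cache reads are typically heavily discounted
    u.cacheWriteTokens * r.cacheWrite   // cache writes typically cost more than plain input
  ) / 1_000_000;
}
```

With rates like these, it becomes obvious why prompt caching matters: a long system prompt billed at the cache-read rate instead of the input rate can cut the cost of each follow-up message substantially.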
on Val Town. Pull requests welcome!

To get Townie running in your Val Town account, click the **Remix** button and then add your ANTHROPIC_API_KEY. You can leave all the other environment variables blank.

Authentication in Townie is handled via Val Town OAuth. However, we have not yet opened up our OAuth to anyone else, which currently makes it very awkward to use your own Townie. Here is a temporary workaround:

The application proxies requests to the Anthropic API and Val Town API, allowing Claude to view and edit your project files directly.
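The proxying described above can be sketched as a single fetch handler. Everything here is an assumption for illustration — the function name, the parameterized API key, and the header set are ours, not Townie's actual implementation:

```typescript
// Minimal sketch: forward a client request body to the Anthropic Messages API.
// apiKey is injected by the caller (e.g. read from an environment variable
// on the server), so the key is never exposed to the browser.
async function proxyAnthropic(req: Request, apiKey: string): Promise<Response> {
  const body = await req.text();
  return fetch("https://api.anthropic.com/v1/messages", {
    method: "POST",
    headers: {
      "content-type": "application/json",
      "x-api-key": apiKey,               // server-held key
      "anthropic-version": "2023-06-01", // required API version header
    },
    body,
  });
}
```

Keeping the key server-side is the whole point of the proxy: the browser talks only to your val, and your val talks to Anthropic.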