Search

3,029 results found for anthropic (2141ms)

Code
3,018

Once you've piped input into a val, you could analyze or classify data with
[OpenAI](https://docs.val.town/std/openai/) or Anthropic, enrich it with browser
automation via [Browserbase](https://docs.val.town/integrations/browserbase/) or
[Kernel](https://docs.val.town/integrations/kernel/), verify phone numbers or
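For instance, a minimal sketch of that first step, classifying piped-in text with Val Town's std OpenAI wrapper (the request shape, model choice, and labels here are assumptions for illustration, not the docs' own example):

```ts
// Hypothetical HTTP val: classify whatever text is piped in.
import { OpenAI } from "https://esm.town/v/std/openai";

export default async function (req: Request): Promise<Response> {
  const { text } = await req.json(); // assumed input shape
  const openai = new OpenAI(); // uses Val Town's managed API key
  const completion = await openai.chat.completions.create({
    model: "gpt-4o-mini",
    messages: [
      { role: "system", content: "Classify the sentiment as positive, negative, or neutral." },
      { role: "user", content: text },
    ],
  });
  return Response.json({ label: completion.choices[0].message.content });
}
```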
We charge a 50% markup on top of raw LLM costs. If you use $10 in Townie
credits, Anthropic will get $6.66 and we'll get $3.33. We think this is fair,
sustainable, and transparent. We don't want to be in the business of having
murky limits, obfuscated credits, or unsustainable margins.
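In other words, a credit buys raw cost times 1.5, so the provider's share of C credits is C / 1.5. A quick sketch of that arithmetic (the helper below is ours, not Val Town's):

```ts
// With a 50% markup, credits = rawCost * 1.5, so the provider gets
// credits / 1.5 and the markup is the remaining third.
function splitCredits(credits: number) {
  const provider = credits / 1.5; // raw LLM cost passed through
  const markup = credits - provider; // Val Town's share
  return { provider, markup };
}

splitCredits(10); // { provider: 6.666..., markup: 3.333... } -> the $6.66 / $3.33 above
```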
project,
branchId,
// anthropicApiKey,
// bearerToken,
selectedFiles,
- [x] Add a "view source" / "send me a PR" link
- [x] Show the HTTP preview in second column if there is one (and let the user pick which one to
- [x] Figure out a convention to teach in the anthropic prompt mod where the LLM always checks t
- [x] Ability to create new projects from the interface
- [x] Figure out why OpenTownie can't create HTTP vals. Maybe give it a separate tool for it?
- [x] Start a timer for messages
- [x] Add more indicators that it's "still working"
- [x] Require users to supply their own Anthropic token?
- [x] Add cost indications on messages
- [x] Add a bell noise when the message is done to let us know
import { readFile } from "https://esm.town/v/std/utils/index.ts";
import { createAnthropic } from "npm:@ai-sdk/anthropic@1.2.12";
import {
  convertToCoreMessages,
} from "npm:ai";

} = await c.req.json();
const apiKey = Deno.env.get("ANTHROPIC_API_KEY");
if (await hasInsufficientCredits({ bearerToken })) {
});
const anthropic = createAnthropic({ apiKey });
let tracedModel = anthropic(model);
if (Deno.env.get("POSTHOG_PROJECT_API_KEY")) {
const traceId = `townie_${rowid}_${Date.now()}`;
// Wrap the Anthropic model with PostHog tracing
tracedModel = withTracing(anthropic(model), phClient, {
posthogDistinctId: user.id,
posthogTraceId: traceId,
// @ts-ignore
lastMessage.content.at(-1).providerOptions = {
anthropic: { cacheControl: { type: "ephemeral" } },
};
}
output_tokens: result.usage.completionTokens,
cache_read_tokens: result.providerMetadata.anthropic.cacheReadInputTokens,
cache_write_tokens: result.providerMetadata.anthropic.cacheCreationInputTokens,
});
},
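Pieced together, those fragments follow one pattern from the Vercel AI SDK: create an Anthropic provider, optionally wrap it with PostHog tracing, mark a message for ephemeral prompt caching, then read cache usage back from the provider metadata. A minimal sketch under those assumptions (the model id, distinct id, and prompt are placeholders, not Townie's actual values):

```ts
import { generateText } from "npm:ai";
import { createAnthropic } from "npm:@ai-sdk/anthropic@1.2.12";
import { PostHog } from "npm:posthog-node";
import { withTracing } from "npm:@posthog/ai";

const anthropic = createAnthropic({
  apiKey: Deno.env.get("ANTHROPIC_API_KEY"),
});
let model = anthropic("claude-3-7-sonnet-latest"); // placeholder model id

// Optionally wrap the model so every call shows up as a PostHog trace.
const posthogKey = Deno.env.get("POSTHOG_PROJECT_API_KEY");
if (posthogKey) {
  const phClient = new PostHog(posthogKey);
  model = withTracing(anthropic("claude-3-7-sonnet-latest"), phClient, {
    posthogDistinctId: "user-123", // placeholder; the real code uses the user's id
    posthogTraceId: `townie_${Date.now()}`,
  });
}

const result = await generateText({
  model,
  messages: [
    {
      role: "user",
      content: "Summarize this project.",
      // Mark the message for Anthropic's ephemeral prompt cache.
      providerOptions: { anthropic: { cacheControl: { type: "ephemeral" } } },
    },
  ],
});

// Cache hits and writes come back in the provider metadata, which is
// exactly what the usage-logging fragment above records.
console.log({
  output_tokens: result.usage.completionTokens,
  cache_read_tokens: result.providerMetadata?.anthropic?.cacheReadInputTokens,
  cache_write_tokens: result.providerMetadata?.anthropic?.cacheCreationInputTokens,
});
```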
Townie is fully open-source and itself runs on Val Town. Pull requests welcome!
n account, click the **Remix** button and then add your ANTHROPIC_API_KEY. You can leave all the
Authentication in Townie is handled via Val Town OAuth. However, we have not yet opened up our O
</ul>
<p>
The application proxies requests to the Anthropic API and Val Town API, allowing Claude
project files directly.
</p>
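As a rough illustration of the proxying half, a pass-through HTTP val could look like this (the path and header handling are assumptions; the real app also proxies the Val Town API and layers auth on top):

```ts
// Hypothetical pass-through proxy to the Anthropic API.
export default async function (req: Request): Promise<Response> {
  const url = new URL(req.url);
  const upstream = new URL(url.pathname + url.search, "https://api.anthropic.com");
  const headers = new Headers(req.headers);
  headers.delete("host"); // let fetch set the correct Host for the upstream
  // Inject the server-side key so it never reaches the browser.
  headers.set("x-api-key", Deno.env.get("ANTHROPIC_API_KEY") ?? "");
  headers.set("anthropic-version", "2023-06-01");
  return fetch(upstream, { method: req.method, headers, body: req.body });
}
```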
"sdk",
"ai",
"anthropic"
],
"author": "Cameron Pak <cam@faith.tools>",

Vals

10 results (first 5 shown)

diegoivo/anthropicWorkflow (Public)
diegoivo/sdkAnthropic (Public)
maddy/anthropicProxy (Public)
stevekrouse/anthropicStreamDemo (Public)
toowired/anthropicCaching (Public)

Users

No users found

Docs

No docs found