Search

3,155 results found for "anthropic" (5,300ms)

Code
3,144

import { nanoid } from "https://esm.sh/nanoid@5.0.5";
import { sqlite } from "https://esm.town/v/stevekrouse/sqlite";
import Anthropic from "npm:@anthropic-ai/sdk@0.24.3";
const TABLE_NAME = `memories`;
try {
// Get API key from environment
const apiKey = Deno.env.get("ANTHROPIC_API_KEY");
if (!apiKey) {
console.error("Anthropic API key is not configured.");
return null;
}
// Initialize Anthropic client
const anthropic = new Anthropic({ apiKey });
// Format previous facts for the prompt
console.log({ message });
const response = await anthropic.messages.create({
model: "claude-3-5-sonnet-latest",
max_tokens: 1000,
project,
branchId,
// anthropicApiKey,
// bearerToken,
selectedFiles,
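The snippets above check for `ANTHROPIC_API_KEY` before constructing the client, then call `anthropic.messages.create` with a model and token limit. A minimal, self-contained sketch of that pattern (the helper names and the prompt are illustrative, not from the matched files):

```typescript
// Sketch of the key-check-then-request pattern from the snippets above.
// getApiKey and buildRequest are hypothetical helpers for illustration.
interface MessageRequest {
  model: string;
  max_tokens: number;
  messages: { role: "user" | "assistant"; content: string }[];
}

// Return the key if configured, mirroring the early exit in the snippet.
function getApiKey(env: Record<string, string | undefined>): string | null {
  const apiKey = env["ANTHROPIC_API_KEY"];
  if (!apiKey) {
    console.error("Anthropic API key is not configured.");
    return null;
  }
  return apiKey;
}

// Build the payload that would be passed to anthropic.messages.create.
function buildRequest(userMessage: string): MessageRequest {
  return {
    model: "claude-3-5-sonnet-latest",
    max_tokens: 1000,
    messages: [{ role: "user", content: userMessage }],
  };
}
```

In the real code the payload goes straight into `anthropic.messages.create(...)`; building it separately just makes the request shape easy to inspect.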
- [x] Add a "view source" / "send me a PR" link
- [x] Show the HTTP preview in second column if there is one (and let the user pick which one to …
- [x] Figure out a convention to teach in the anthropic prompt mod where the LLM always checks t…
- [x] Ability to create new projects from the interface
- [x] Figure out why OpenTownie can't create HTTP vals. Maybe give it a separate tool for it?
- [x] Start a timer for messages
- [x] Add more indicators that it's "still working"
- [x] Require users supply their own Anthropic token?
- [x] Add cost indications on messages
- [x] Add a bell noise when the message is done to let us know
import { readFile } from "https://esm.town/v/std/utils@71-main/index.ts";
import { createAnthropic } from "npm:@ai-sdk/anthropic@1.2.12";
import { ValTown } from "npm:@valtown/sdk@0.37.0";
import {
project,
branchId,
anthropicApiKey,
selectedFiles,
images,
// do we want to allow user-provided tokens still
const apiKey = anthropicApiKey || Deno.env.get("ANTHROPIC_API_KEY");
const our_api_token = apiKey === Deno.env.get("ANTHROPIC_API_KEY");
if (our_api_token) {
const traceId = `townie_${rowid}_${Date.now()}`;
const anthropic = createAnthropic({ apiKey });
// Wrap the Anthropic model with PostHog tracing
tracedModel = withTracing(anthropic(model), phClient, {
posthogDistinctId: distinctId,
posthogTraceId: traceId,
console.log("inside posthog" + tracedModel);
} else {
// Fallback to regular Anthropic call if PostHog is not configured
const anthropic = createAnthropic({ apiKey });
tracedModel = anthropic(model);
}
// @ts-ignore
lastMessage.content.at(-1).providerOptions = {
anthropic: { cacheControl: { type: "ephemeral" } },
};
}
output_tokens: result.usage.completionTokens,
cache_read_tokens:
result.providerMetadata.anthropic.cacheReadInputTokens,
cache_write_tokens:
result.providerMetadata.anthropic.cacheCreationInputTokens,
});
output_tokens: result.usage.completionTokens,
cache_read_tokens:
result.providerMetadata.anthropic.cacheReadInputTokens,
cache_write_tokens:
result.providerMetadata.anthropic.cacheCreationInputTokens,
});
},
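The `providerOptions` block above marks the final content part of the last message as `ephemeral`, which asks Anthropic to cache the prompt prefix; the cache read/write token counts then come back via `providerMetadata.anthropic`. A small sketch of the tagging step (message types simplified from the real ai-sdk shapes):

```typescript
// Tag the last content part of the last message for Anthropic prompt caching,
// as in the `cacheControl: { type: "ephemeral" }` snippet above.
type ContentPart = { type: string; text: string; providerOptions?: unknown };
type Message = { role: string; content: ContentPart[] };

function tagLastPartForCaching(messages: Message[]): Message[] {
  const lastPart = messages.at(-1)?.content.at(-1);
  if (lastPart) {
    lastPart.providerOptions = {
      anthropic: { cacheControl: { type: "ephemeral" } },
    };
  }
  return messages;
}
```

Mutating the last part in place matches how the original snippet assigns `providerOptions` directly on `lastMessage.content.at(-1)`.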
Townie is fully open-source and itself runs on Val Town. Pull requests welcome!
…n account, click the **Remix** button and then add your ANTHROPIC_API_KEY. You can leave all the …
Authentication in Townie is handled via Val Town OAuth. However, we have not yet opened up our O…
</ul>
<p>
The application proxies requests to the Anthropic API and Val Town API, allowing Claude … project files directly.
</p>
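The proxying described above can be reduced to a pure function that rewrites the target URL and swaps in the server-side key. The route and header handling here are assumptions for illustration, not Townie's actual implementation; `x-api-key` and `anthropic-version` are the headers the Anthropic API expects.

```typescript
// Hypothetical sketch: compute the upstream URL and headers for proxying a
// client request to the Anthropic API using the server's own key.
function buildProxyTarget(
  pathAndQuery: string,
  clientHeaders: Record<string, string>,
  apiKey: string,
): { url: string; headers: Record<string, string> } {
  const url = new URL(pathAndQuery, "https://api.anthropic.com").toString();
  const headers = { ...clientHeaders };
  delete headers["authorization"]; // never forward the client's own credentials
  headers["x-api-key"] = apiKey;
  headers["anthropic-version"] = "2023-06-01";
  return { url, headers };
}
```

Stripping the inbound `authorization` header before attaching the server key is the important design choice: the client authenticates to the proxy, never to Anthropic directly.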
/** AI models, AI labs */
"https://openai.com/news/rss.xml",
"https://raw.githubusercontent.com/Olshansk/rss-feeds/main/feeds/feed_anthropic_news.xml",
"https://raw.githubusercontent.com/Olshansk/rss-feeds/main/feeds/feed_ollama.xml",
"https://raw.githubusercontent.com/Olshansk/rss-feeds/main/feeds/feed_anthropic_news.xml",
"https://raw.githubusercontent.com/Olshansk/rss-feeds/main/feeds/feed_anthropic_engineering.xml",
"https://raw.githubusercontent.com/Olshansk/rss-feeds/main/feeds/feed_anthropic_research.xml",
/** Individuals */
It is easy to reconfigure the agent to work with other providers that are supported by Vercel's AI SDK.
Anthropic's models should work well, but the system prompt would need to be retuned to work with …
Learn more: https://ai-sdk.dev/docs/foundations/providers-and-models
import { anthropic } from "npm:@ai-sdk/anthropic";
import { openai } from "npm:@ai-sdk/openai";
import { generateText, streamText } from "npm:ai";
const maxSteps = 10;
const model = Deno.env.get("ANTHROPIC_API_KEY") ? anthropic("claude-3-7-sonnet-latest") : open…
const options = {
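The fallback in the snippet above picks Anthropic when its key is configured and another provider otherwise. The same decision as a pure function (the OpenAI model id is an assumption, since the original line is truncated):

```typescript
// Provider fallback: prefer Anthropic when ANTHROPIC_API_KEY is set,
// otherwise fall back to an OpenAI model (model id assumed here).
function pickModelId(env: Record<string, string | undefined>): string {
  return env["ANTHROPIC_API_KEY"] ? "claude-3-7-sonnet-latest" : "gpt-4o";
}
```

Keeping the choice in one function makes it easy to swap in any other provider the AI SDK supports.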

Vals

10
diegoivo/anthropicWorkflow (Public)
diegoivo/sdkAnthropic (Public)
maddy/anthropicProxy (Public)
stevekrouse/anthropicStreamDemo (Public)
toowired/anthropicCaching (Public)

Users

No users found
No docs found