Search

3,155 results found for anthropic (5356ms)

Code (3,144 results)

so the whole thing is kind of recursive. I guess it should also let you put in (for completion
s that claude react artifacts can now make calls to the anthropic api, so you should be able to
it should by default find your influences' top 3-5 influences then stop, then have a button to p
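The idea above — fetch each person's top 3-5 influences, then stop and wait for the user to ask for the next level — can be sketched as a one-level graph expansion. Everything here (the `topInfluences` lookup, the names in the fake data) is a hypothetical stand-in, not the real implementation:

```typescript
// Sketch: expand an influence graph one level at a time, capping each node
// at its top N influences so the recursion is bounded.
type Graph = Map<string, string[]>;

// Hypothetical stand-in for the real API-backed lookup
async function topInfluences(person: string, n: number): Promise<string[]> {
  const fake: Record<string, string[]> = {
    "Ada Lovelace": ["Charles Babbage", "Mary Somerville", "Augustus De Morgan", "Michael Faraday"],
  };
  return (fake[person] ?? []).slice(0, n);
}

// Expand only one level; a "go deeper" button in the UI would call this
// again for each leaf, rather than recursing automatically.
async function expand(root: string, n = 3): Promise<Graph> {
  const graph: Graph = new Map();
  graph.set(root, await topInfluences(root, n));
  return graph;
}
```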
// Anthropic API call to get influences
async function getInfluences(person: string): Promise<{ influences: string[]; relationships: string[] }> {
  const apiKey = Deno.env.get('ANTHROPIC_API_KEY');
  if (!apiKey) {
    throw new Error('ANTHROPIC_API_KEY environment variable is required');
  }
  try {
    const response = await fetch('https://api.anthropic.com/v1/messages', {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        'x-api-key': apiKey,
        'anthropic-version': '2023-06-01'
      },
      body: JSON.stringify({
        // Request body reconstructed for illustration; the model and prompt are assumptions
        model: 'claude-3-5-sonnet-latest',
        max_tokens: 1024,
        messages: [{
          role: 'user',
          content: `List the main influences of ${person} as JSON with "influences" and "relationships" arrays.`
        }]
      })
    });
    if (!response.ok) {
      const errorText = await response.text();
      throw new Error(`Anthropic API error: ${response.status} - ${errorText}`);
    }
    // Parse the model's JSON answer out of the first content block
    const data = await response.json();
    const { influences, relationships } = JSON.parse(data.content[0].text);
    return { influences, relationships };
  } catch (error) {
    console.error('Error calling Anthropic API:', error);
    // Re-throw so callers can surface the failure
    throw error;
  }
}
## Environment Setup
Requires `ANTHROPIC_API_KEY` environment variable for Claude API access.
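For a local Deno run, the variable can be set in the shell before launching; the key value below is a placeholder, and the script name is hypothetical:

```shell
# Placeholder value for illustration — use your real key from the Anthropic console
export ANTHROPIC_API_KEY="sk-ant-placeholder"
echo "$ANTHROPIC_API_KEY"
# Deno only sees the variable when run with --allow-env (plus --allow-net for the API call):
# deno run --allow-env --allow-net getInfluences.ts
```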
// InfluenceResponse is assumed to be defined elsewhere in the val
async function getInfluences(personName: string, context?: string): Promise<InfluenceResponse> {
  const apiKey = Deno.env.get("ANTHROPIC_API_KEY");
  if (!apiKey) {
    throw new Error("ANTHROPIC_API_KEY not found in environment variables");
  }
  try {
    const response = await fetch("https://api.anthropic.com/v1/messages", {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        "x-api-key": apiKey,
        "anthropic-version": "2023-06-01",
      },
      body: JSON.stringify({
        // Request body reconstructed for illustration; the model and prompt are assumptions
        model: "claude-3-5-sonnet-latest",
        max_tokens: 1024,
        messages: [{ role: "user", content: `List the main influences of ${personName}.${context ? ` Context: ${context}` : ""}` }],
      }),
    });
    if (!response.ok) {
      throw new Error(`Anthropic API error: ${response.status} ${response.statusText}`);
    }
    const data = await response.json();
    // The model is asked for JSON; parse it out of the first content block
    return JSON.parse(data.content[0].text) as InfluenceResponse;
  } catch (error) {
    console.error("Error calling Anthropic API:", error);
    throw error;
  }
}
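Both snippets ultimately pull the answer out of the Messages API's `content` array. A minimal parsing helper can be tested offline with a hand-written response object; the `parseInfluences` name and the simplified response shape here are assumptions for illustration:

```typescript
// Simplified shape of a successful /v1/messages response (only the fields used here)
interface MessagesResponse {
  content: { type: string; text: string }[];
}

interface InfluenceResponse {
  influences: string[];
  relationships: string[];
}

// Hypothetical helper: find the text block and parse the JSON the model was asked to emit
function parseInfluences(resp: MessagesResponse): InfluenceResponse {
  const text = resp.content.find((b) => b.type === "text")?.text ?? "{}";
  return JSON.parse(text);
}

// Example with a canned response object — no network call needed
const sample: MessagesResponse = {
  content: [{ type: "text", text: '{"influences":["Seymour Papert"],"relationships":["mentor"]}' }],
};
console.log(parseInfluences(sample).influences[0]); // "Seymour Papert"
```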
Once you've piped input into a val, you could analyze or classify data with
[OpenAI](https://docs.val.town/std/openai/) or Anthropic, enrich it with browser
automation via [Browserbase](https://docs.val.town/integrations/browserbase/) or
[Kernel](https://docs.val.town/integrations/kernel/), verify phone numbers or
We charge a 50% markup on top of raw LLM costs. If you use $10 in Townie
credits, Anthropic will get $6.66 and we'll get $3.33. We think this is fair,
sustainable, and transparent. We don't want to be in the business of having
murky limits, obfuscated credits, or unsustainable margins.
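The split follows from the markup formula: with a 50% markup, price = raw cost × 1.5, so raw cost = price / 1.5. A quick check (the exact cents round to $6.67 / $3.33):

```typescript
// 50% markup: what the user pays = raw LLM cost * 1.5
const credits = 10;                // dollars of Townie credits spent
const rawCost = credits / 1.5;     // portion that goes to the LLM provider
const margin = credits - rawCost;  // portion Val Town keeps
console.log(rawCost.toFixed(2), margin.toFixed(2)); // 6.67 3.33
```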
project,
branchId,
// anthropicApiKey,
// bearerToken,
selectedFiles,
- [x] Add a "view source" / "send me a PR" link
- [x] Show the HTTP preview in second column if there is one (and let the user pick which one to
- [x] Figure out a convention to teach in the anthropic prompt mod where the LLM always checks t
- [x] Ability to create new projects from the interface
- [x] Figure out why OpenTownie can't create HTTP vals. Maybe give it a separate tool for it?
- [x] Start a timer for messages
- [x] Add more indicators that it's "still working"
- [x] Require users to supply their own Anthropic token?
- [x] Add cost indications on messages
- [x] Add a bell noise when the message is done to let us know

Vals (showing 5 of 10)

- diegoivo/anthropicWorkflow (Public)
- diegoivo/sdkAnthropic (Public)
- maddy/anthropicProxy (Public)
- stevekrouse/anthropicStreamDemo (Public)
- toowired/anthropicCaching (Public)

Users: no users found
Docs: no docs found