So the whole thing is kind of recursive. I guess it should also let you put in (for completion of showing your chart to others). My impression is that Claude React artifacts can now make calls to the Anthropic API, so you should be able to make the whole thing happen, and then Mermaid, I guess, for the flowchart. It should by default find your influences' top 3-5 influences and then stop, then have a button to push to go one layer deeper.
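The "top 3-5 then stop, button to go one layer deeper" behavior can be sketched as a single expansion step over the influence graph. This is a sketch, not the actual val: the `expandOneLayer` name, the edge shape, and the synchronous `getTop` lookup are all assumptions (the real version would await the Anthropic-backed lookup instead).

```typescript
// One layer of expansion over the influence graph. By default you'd run
// this once from the root (top 3-5 influences) and stop; the "go one
// layer deeper" button just calls it again on the returned frontier.
// The lookup is synchronous here for clarity; the real val would await
// an Anthropic-backed getInfluences instead.
type Edge = [from: string, to: string];

function expandOneLayer(
  frontier: string[],
  seen: Set<string>,
  edges: Edge[],
  getTop: (person: string) => string[],
): string[] {
  const next: string[] = [];
  for (const person of frontier) {
    for (const influence of getTop(person)) {
      edges.push([person, influence]);
      if (!seen.has(influence)) {
        seen.add(influence); // dedupe: shared influences appear only once
        next.push(influence);
      }
    }
  }
  return next;
}

// Stubbed lookup: Root -> A, B; A -> B (B is shared, so it isn't re-queued)
const stub = (p: string) => (p === "Root" ? ["A", "B"] : p === "A" ? ["B"] : []);
const seen = new Set(["Root"]);
const edges: Edge[] = [];
const layer1 = expandOneLayer(["Root"], seen, edges, stub);
const layer2 = expandOneLayer(layer1, seen, edges, stub);
console.log(layer1, layer2, edges.length); // ["A","B"] [] 3
```

Keeping the edges and the `seen` set outside the function means each button press only pays for the new frontier, and shared influences (the recursive part) converge to one node instead of duplicating.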
```ts
// Anthropic API call to get a person's influences.
// The request payload was truncated in the original snippet; the model
// name and prompt wording below are a reconstruction, not the original.
async function getInfluences(
  person: string,
): Promise<{ influences: string[]; relationships: string[] }> {
  const apiKey = Deno.env.get("ANTHROPIC_API_KEY");
  if (!apiKey) {
    throw new Error("ANTHROPIC_API_KEY environment variable is required");
  }

  try {
    const response = await fetch("https://api.anthropic.com/v1/messages", {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        "x-api-key": apiKey,
        "anthropic-version": "2023-06-01",
      },
      body: JSON.stringify({
        model: "claude-3-5-sonnet-latest",
        max_tokens: 1024,
        messages: [{
          role: "user",
          content:
            `List the top 3-5 influences on ${person} and a one-line description of each relationship, as JSON with "influences" and "relationships" string arrays.`,
        }],
      }),
    });

    if (!response.ok) {
      const errorText = await response.text();
      throw new Error(`Anthropic API error: ${response.status} - ${errorText}`);
    }

    const data = await response.json();
    // The model was asked to reply with JSON; parse it out of the text block
    const { influences, relationships } = JSON.parse(data.content[0].text);
    return { influences, relationships };
  } catch (error) {
    console.error("Error calling Anthropic API:", error);
    throw error;
  }
}
```

## Environment Setup

Requires the `ANTHROPIC_API_KEY` environment variable for Claude API access.

Once you've piped input into a val, you could analyze or classify data with [OpenAI](https://docs.val.town/std/openai/) or Anthropic, enrich it with browser automation via [Browserbase](https://docs.val.town/integrations/browserbase/) or [Kernel](https://docs.val.town/integrations/kernel/), verify phone numbers or …

We charge a 50% markup on top of raw LLM costs. If you use $10 in Townie credits, Anthropic will get $6.66 and we'll get $3.33. We think this is fair, sustainable, and transparent. We don't want to be in the business of having murky limits, obfuscated credits, or unsustainable margins.
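On the chart side, the Mermaid flowchart mentioned earlier can be generated from the collected influence edges with a small pure function. A sketch under assumptions: the `toMermaid` name and the `[string, string]` edge shape are hypothetical, not from the actual val.

```typescript
// Render influence edges as Mermaid flowchart source
// (an edge [from, to] reads as "from was influenced by to").
function toMermaid(edges: [string, string][]): string {
  // Mermaid node ids can't contain spaces; derive a safe id from each name
  const id = (name: string) => name.replace(/[^A-Za-z0-9]/g, "_");
  const lines = ["flowchart TD"];
  for (const [from, to] of edges) {
    lines.push(`  ${id(from)}["${from}"] --> ${id(to)}["${to}"]`);
  }
  return lines.join("\n");
}

const chart = toMermaid([
  ["Brian Eno", "John Cage"],
  ["Brian Eno", "Steve Reich"],
]);
console.log(chart);
// flowchart TD
//   Brian_Eno["Brian Eno"] --> John_Cage["John Cage"]
//   Brian_Eno["Brian Eno"] --> Steve_Reich["Steve Reich"]
```

Because node ids are derived deterministically from names, re-rendering after the "one layer deeper" button merges shared influences into the same node instead of duplicating them.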
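The pricing split quoted above is just the 50% markup formula applied to a $10 spend; a quick check of the arithmetic (the post rounds down to $6.66 / $3.33):

```typescript
// 50% markup: total = raw * 1.5, so raw LLM cost = total / 1.5
const total = 10;
const raw = total / 1.5; // provider's share, ~6.67 (quoted as $6.66)
const margin = total - raw; // platform's share, ~3.33
console.log(raw.toFixed(2), margin.toFixed(2));
```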
```ts
project,
branchId,
// anthropicApiKey,
// bearerToken,
selectedFiles,
```

- [x] Add a "view source" / "send me a PR" link
- [x] Show the HTTP preview in a second column if there is one (and let the user pick which one to preview in the iframe)
- [x] Figure out a convention to teach in the Anthropic prompt mod where the LLM always checks the readme for the scope (if not provided) and keeps it up to date with every change
- [x] Ability to create new projects from the interface
- [x] Figure out why OpenTownie can't create HTTP vals. Maybe give it a separate tool for it?
- [x] Start a timer for messages
- [x] Add more indicators that it's "still working"
- [x] Require users to supply their own Anthropic token?
- [x] Add cost indications on messages
- [x] Add a bell noise when the message is done to let us know