1. When a new message is posted in a configured Slack channel (e.g. #bugs or #support), Slack sends an event to this val.
2. The val makes an OpenAI call to determine whether the message is a bug.
3. If it is, a separate OpenAI call searches GitHub for semantically related open issues.
4. It posts a comment in the Slack thread with links to the related GitHub issues, each annotated with a "Relevance Score".
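The thread reply in step 4 could be assembled with a small helper like the sketch below. The `RelatedIssue` shape and the exact comment formatting are assumptions for illustration, not the val's actual code:

```typescript
interface RelatedIssue {
  title: string;
  url: string;
  relevance: number; // 0..1 score returned by the LLM (assumed range)
}

// Render the Slack thread reply: one line per related issue, highest relevance first.
function formatIssueComment(issues: RelatedIssue[]): string {
  if (issues.length === 0) return "No related open issues found.";
  return [...issues]
    .sort((a, b) => b.relevance - a.relevance)
    .map((i) => `• <${i.url}|${i.title}> (Relevance Score: ${(i.relevance * 100).toFixed(0)}%)`)
    .join("\n");
}
```

The `<url|text>` syntax is Slack's link format for message payloads.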
```ts
// @ts-ignore
import { OpenAI } from "https://esm.town/v/std/openai?v=4";
import { Hono } from "npm:hono@4.4.12";

const app = new Hono();

app.post("/api/analyze", async (c) => {
  try {
    const openai = new OpenAI({ apiKey: Deno.env.get("OPENAI_API_KEY") });
    const body = await c.req.json();
    // … (preview truncated: a prompt template is filled in here)
    //   .replace("{{supplier_database}}", JSON.stringify(body.supplier_database, null, 2));
    const completion = await openai.chat.completions.create({
      model: "gpt-4o",
      messages: [{ role: "user", content: prompt }],
      // … (preview truncated)
```
```ts
import { Hono } from "npm:hono@4.4.12";
// @ts-ignore
import { OpenAI } from "https://esm.town/v/std/openai?v=4";
// @ts-ignore
import { sqlite } from "https://esm.town/v/std/sqlite?v=4";
// …
const openai = new OpenAI();
const completion = await openai.chat.completions.create({
  model: "gpt-4o",
  messages: [
    // … (preview truncated)
```
[ SQLite ](https://docs.val.town/std/sqlite) Store and retrieve structured data
[ Blob Storage ](https://docs.val.town/std/blob) Store and retrieve any data
[ OpenAI ](https://docs.val.town/std/openai) Use the OpenAI API
[ Email ](https://docs.val.town/std/email) Send emails
##### API and SDK
Note: When changing a SQLite table's schema, change the table's name (e.g., add \_2 or \_3) to create a fresh table.
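That rename-on-schema-change convention can be captured in a tiny helper. This is a hypothetical sketch, not part of the std/sqlite API:

```typescript
// Bump the version suffix on a table name:
// "todos" -> "todos_2", "todos_2" -> "todos_3".
// Hypothetical helper illustrating the rename-on-schema-change convention.
function bumpTableVersion(table: string): string {
  const m = table.match(/^(.*)_(\d+)$/);
  if (m) return `${m[1]}_${Number(m[2]) + 1}`;
  return `${table}_2`;
}
```

A new `CREATE TABLE IF NOT EXISTS` against the bumped name then starts from a fresh, empty table with the new schema.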
### OpenAI
```ts
import { OpenAI } from "https://esm.town/v/std/openai";

const openai = new OpenAI();
const completion = await openai.chat.completions.create({
  messages: [{ role: "user", content: "Say hello in a creative way" }],
  model: "gpt-4o-mini",
});
console.log(completion.choices[0].message.content);
```
- [ ] Get OpenTownie or Gemini or Claude or OpenAI to synthesize the core of these patterns into a prompt we can use to make more ReactRouter apps, such as...
- [ ] Convert this or into the basic react router guest book (and preserve this forum app in another project?)
- [ ] To what extent can these patterns be packaged up into a Val Town Router project? Would be neat to get the version pinning thing all centralized; can this as-a-library be that centralized place?
```ts
import { AgentContext, AgentInput, AgentOutput } from "https://esm.town/v/AIWB/agentRegistry/interfaces.ts";
import { fetch } from "https://esm.town/v/std/fetch";
import { OpenAI } from "https://esm.town/v/std/openai";

// Note: the std/openai library handles API-key abstraction via Val Town secrets;
// a direct Deno.env.get("OPENAI_API_KEY") relies on that environment variable being set.

// SummarizerAgent (preview truncated)
const openai = new OpenAI();
log("INFO", "SummarizerAgent", "Generating summary with OpenAI...");
const completion = await openai.chat.completions.create({
  model: "gpt-4o-mini", // explicitly setting a model
  messages: [ /* … */ ],
});
if (summary === "Could not generate summary.") {
  log("WARN", "SummarizerAgent", "OpenAI did not return a valid summary content.");
}

// CombinerAgent (not directly used in the 12-step orchestrator, kept for reference)
export async function combinerAgent(
  input: AgentInput<{ summary?: string; /* … */ }>,
  // …
) {
  const openai = new OpenAI();
  log("INFO", "CombinerAgent", "Combining text with OpenAI...");
  const completion = await openai.chat.completions.create({
    model: "gpt-4o-mini",
    messages: [ /* … */ ],
  });
  const combined = completion.choices[0]?.message?.content?.trim()
    ?? "Could not combine information.";
  if (combined === "Could not combine information.") {
    log("WARN", "CombinerAgent", "OpenAI did not return valid combined content.");
  }
  log("SUCCESS", "CombinerAgent", "Combined successfully");
  return { mandateId, correlationId: taskId, payload: { combined } };
}

// ConfigurationAgent
const { userQuery } = input.payload;
log("INFO", "ConfigurationAgent", `Processing query: ${userQuery}`);
// … gpt-4o-mini call with a system prompt plus the user query …

// RelevanceAssessmentAgent
log("INFO", "RelevanceAssessmentAgent", `Assessing relevance for ${articles.length} articles on topic: ${topic}`);
let assessment = { isRelevant: false, rationale: "LLM assessment failed or unclear." };
// … system prompt: "You are a relevance assessor. Output JSON." …

// SentimentAnalysisAgent
let sentimentScore: FetchedArticle["sentimentScore"] = 0.0;
// … system prompt: "You are a sentiment analysis expert. Output JSON." …

// ThematicAnalysisAgent
let themeResult: { themes: string[]; entities: FetchedArticle["entities"] } = { themes: [], entities: [] };
// … system prompt: "You are a thematic/NER expert. Output JSON." …

// TrendAnalysisAgent
let trendReport: any = { dominantSentiment: "N/A", keyThemes: [], emergingPatterns: "Not analyzed." };
// … system prompt: "Trend analysis expert. Output JSON." …

// AnomalyDetectionAgent
let anomalyReport: any = { anomaliesFound: false, anomalyDetails: [] };
// … system prompt: "Anomaly detection expert. Output JSON." …

// InsightGenerationAgent
let insights: string[] = ["No specific insights generated."];
// … system prompt: "Strategic analyst providing concise insights." …

// ReportCompilationAgent
let finalReport = "Report generation failed.";
// … (the agent code used 'relevantArticles'; 'articlesWithThemes' may be the intended source)

// AlertGenerationAgent
let alertMessage: string | undefined = undefined;
// … system prompt: "Alert generation specialist." …
```
```ts
import { AgentContext, AgentInput, AgentOutput } from "https://esm.town/v/AIWB/agentRegistry/interfaces.ts";
import { fetch } from "https://esm.town/v/std/fetch";
import { OpenAI } from "https://esm.town/v/std/openai";

// Summarizer Agent (unchanged, shown for completeness)
const openai = new OpenAI();
log("INFO", "SummarizerAgent", "Generating summary with OpenAI...");
// … chat.completions.create call (preview truncated) …
if (summary === "Could not generate summary.") {
  log("WARN", "SummarizerAgent", "OpenAI did not return a valid summary content.");
}

// Combiner Agent
// … analogous OpenAI call (preview truncated) …
if (combined === "Could not combine information.") {
  log("WARN", "CombinerAgent", "OpenAI did not return valid combined content.");
}
```
```ts
import { blob } from "https://esm.town/v/std/blob?v=11";
// @ts-ignore
import { OpenAI } from "https://esm.town/v/std/openai?v=4";
// --- CONSTANTS & CONFIGURATION ---
```
```ts
import { OpenAI } from "https://esm.town/v/std/openai";
import { sqlite } from "https://esm.town/v/stevekrouse/sqlite";

/**
 * Practical Implementation of Collective Content Intelligence
 * Bridging advanced AI with collaborative content creation
 */
// … (preview truncated mid-token at "exp")
```
```ts
import { OpenAI } from "https://esm.town/v/std/openai";

export default async function (req: Request): Promise<Response> {
  if (req.method === "OPTIONS") {
    return new Response(null, {
      headers: {
        "Access-Control-Allow-Origin": "*",
      },
    });
  }
  // … (preview truncated)
}
```