Search

3,361 results found for openai (4554ms)

Code: 3,266

1. When a new message is posted in a configured Slack channel (e.g. #bugs or #support), Slack sends the message to the val
2. The val makes an OpenAI call to determine if the message is a bug (a minimal sketch of this step follows below)
3. It searches GitHub for semantically related open issues with a separate OpenAI call
4. It posts a comment in the Slack thread with links to related GitHub issues, with a "Relevance" score for each
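
A minimal sketch of how step 2 might look with the std/openai client; the helper name, prompt wording, and JSON shape are assumptions for illustration, not the val's actual code:

```ts
import { OpenAI } from "https://esm.town/v/std/openai";

// Hypothetical helper: classify a Slack message as a bug report.
// The prompt and output shape are assumed, not taken from the original val.
export async function isBugReport(messageText: string): Promise<boolean> {
  const openai = new OpenAI();
  const completion = await openai.chat.completions.create({
    model: "gpt-4o-mini",
    messages: [
      {
        role: "system",
        content: 'Decide whether the following Slack message describes a software bug. Reply only with JSON: {"isBug": true or false}.',
      },
      { role: "user", content: messageText },
    ],
  });
  try {
    const parsed = JSON.parse(completion.choices[0]?.message?.content ?? "{}");
    return Boolean(parsed.isBug);
  } catch {
    return false; // Unparseable output is treated as "not a bug"
  }
}
```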
join/manu/main.tsx
3 matches
// @ts-ignore
import { OpenAI } from "https://esm.town/v/std/openai?v=4";
import { Hono } from "npm:hono@4.4.12";
app.post("/api/analyze", async (c) => {
try {
const openai = new OpenAI({ apiKey: Deno.env.get("OPENAI_API_KEY") });
const body = await c.req.json();
.replace("{{supplier_database}}", JSON.stringify(body.supplier_database, null, 2));
const completion = await openai.chat.completions.create({
model: "gpt-4o",
messages: [{ role: "user", content: prompt }],
import { Hono } from "npm:hono@4.4.12";
// @ts-ignore
import { OpenAI } from "https://esm.town/v/std/openai?v=4";
// @ts-ignore
import { sqlite } from "https://esm.town/v/std/sqlite?v=4";
`;
const openai = new OpenAI();
const completion = await openai.chat.completions.create({
model: "gpt-4o",
messages: [
[ SQLite ](https://docs.val.town/std/sqlite) Store and retrieve structured data
[ Blob Storage ](https://docs.val.town/std/blob) Store and retrieve any data
[ OpenAI ](https://docs.val.town/std/openai) Use the OpenAI API
[ Email ](https://docs.val.town/std/email) Send emails
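
For a sense of what the Blob Storage and Email helpers look like in use, here is a small sketch; the key name and message text are placeholders, and the exact options should be checked against the linked docs:

```ts
import { blob } from "https://esm.town/v/std/blob";
import { email } from "https://esm.town/v/std/email";

// Store and read back a JSON value under an arbitrary key (key name is illustrative).
await blob.setJSON("daily-report", { generatedAt: new Date().toISOString() });
const report = await blob.getJSON("daily-report");

// Send yourself an email from the val (subject/text are placeholders).
await email({ subject: "Daily report ready", text: JSON.stringify(report, null, 2) });
```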
##### API and SDK
Note: When changing a SQLite table's schema, change the table's name (e.g., add \_2 or \_3) to create a new table.
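
A short sketch of the rename pattern that note describes, using the std/sqlite client that appears elsewhere in these results; the table and column names are made up:

```ts
import { sqlite } from "https://esm.town/v/std/sqlite";

// After a schema change, create `messages_2` rather than altering `messages`.
// Table and column names are illustrative.
await sqlite.execute(`
  CREATE TABLE IF NOT EXISTS messages_2 (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    body TEXT NOT NULL,
    channel TEXT,
    created_at TEXT DEFAULT CURRENT_TIMESTAMP
  )
`);
```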
### OpenAI
```ts
import { OpenAI } from "https://esm.town/v/std/openai";

const openai = new OpenAI();
const completion = await openai.chat.completions.create({
  messages: [{ role: "user", content: "Say hello in a creative way" }],
  model: "gpt-4o-mini",
});
console.log(completion.choices[0].message.content);
```
- [ ] Get OpenTownie or Gemini or Claude or OpenAI to synthesize the core of these patterns into
- [ ] Convert this or into the basic react router guest book (and preserve this forum app in ano
- [ ] To what extent can these patterns be packaged up into a Val Town Router project? Would be
import { AgentContext, AgentInput, AgentOutput } from "https://esm.town/v/AIWB/agentRegistry/int
import { fetch } from "https://esm.town/v/std/fetch";
import { OpenAI } from "https://esm.town/v/std/openai";
// If Deno.env is used, ensure it's for Val Town secrets if possible or clearly noted.
// For direct Deno.env.get('OPENAI_API_KEY'), it relies on the Val Town environment variable being set.
// The std/openai library usually handles API key abstraction using user's Val Town secrets.
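
The two key-handling styles visible across these results, shown side by side; this is a hedged sketch, and the npm import alias is an assumption rather than something from the snippets:

```ts
import { OpenAI } from "https://esm.town/v/std/openai";
import NpmOpenAI from "npm:openai";

// std/openai: no key in code; the std client handles API key abstraction via Val Town.
const viaStd = new OpenAI();

// npm client: the key must come from an environment variable you set yourself.
const viaNpm = new NpmOpenAI({ apiKey: Deno.env.get("OPENAI_API_KEY") });
```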
// Your provided summarizerAgent
}
const openai = new OpenAI(); // Using OpenAI client as per your example
log("INFO", "SummarizerAgent", "Generating summary with OpenAI...");
const completion = await openai.chat.completions.create({
model: "gpt-4o-mini", // Explicitly setting a model
messages: [
if (summary === "Could not generate summary.") {
log("WARN", "SummarizerAgent", "OpenAI did not return a valid summary content.");
}
// Your provided combinerAgent (not directly used in the 12-step orchestrator, but kept for reference)
export async function combinerAgent(
// ... (combinerAgent code as you provided, also updating its OpenAI call) ...
input: AgentInput<{
summary?: string;
}
const openai = new OpenAI(); // Using OpenAI client as per your example
log("INFO", "CombinerAgent", "Combining text with OpenAI...");
const completion = await openai.chat.completions.create({
model: "gpt-4o-mini", // Explicitly setting a model
messages: [
const combined = completion.choices[0]?.message?.content?.trim() ?? "Could not combine information.";
if (combined === "Could not combine information.")
log("WARN", "CombinerAgent", "OpenAI did not return valid combined content.");
log("SUCCESS", "CombinerAgent", "Combined successfully");
return { mandateId, correlationId: taskId, payload: { combined } };
const { userQuery } = input.payload;
log("INFO", "ConfigurationAgent", `Processing query: ${userQuery}`);
const openai = new OpenAI();
const systemPrompt =
try {
const completion = await openai.chat.completions.create({
model: "gpt-4o-mini",
messages: [{ role: "system", content: systemPrompt }, { role: "user", content: userQuery }
const { log } = context;
const { articles, topic, keywords } = input.payload;
const openai = new OpenAI();
let relevantArticles: FetchedArticle[] = [];
log("INFO", "RelevanceAssessmentAgent", `Assessing relevance for ${articles.length} articles o
let assessment = { isRelevant: false, rationale: "LLM assessment failed or unclear." };
try {
const completion = await openai.chat.completions.create({
model: "gpt-4o-mini",
messages: [{ role: "system", content: "You are a relevance assessor. Output JSON." }, {
): Promise<AgentOutput<{ articlesWithSentiment: FetchedArticle[] }>> {
const { log } = context;
const openai = new OpenAI();
let articlesWithSentiment: FetchedArticle[] = [];
if (!input.payload.articles)
let sentimentScore: FetchedArticle["sentimentScore"] = 0.0;
try {
const completion = await openai.chat.completions.create({
model: "gpt-4o-mini",
messages: [{ role: "system", content: "You are a sentiment analysis expert. Output JSON.
): Promise<AgentOutput<{ articlesWithThemes: FetchedArticle[] }>> {
const { log } = context;
const openai = new OpenAI();
let articlesWithThemes: FetchedArticle[] = [];
log(
let themeResult: { themes: string[]; entities: FetchedArticle["entities"] } = { themes: [],
try {
const completion = await openai.chat.completions.create({
model: "gpt-4o-mini",
messages: [{ role: "system", content: "You are a thematic/NER expert. Output JSON." }, {
const { log } = context;
const { articlesWithThemes, topic, historicalContextSummary } = input.payload;
const openai = new OpenAI();
log(
"INFO",
let trendReport: any = { dominantSentiment: "N/A", keyThemes: [], emergingPatterns: "Not analy
try {
const completion = await openai.chat.completions.create({
model: "gpt-4o-mini",
messages: [{ role: "system", content: "Trend analysis expert. Output JSON." }, {
let anomalyReport: any = { anomaliesFound: false, anomalyDetails: [] };
try {
const completion = await openai.chat.completions.create({
model: "gpt-4o-mini",
messages: [{ role: "system", content: "Anomaly detection expert. Output JSON." }, {
const { log } = context;
const { trendReport, anomalyReport, config, articlesCount } = input.payload;
const openai = new OpenAI();
log("INFO", "InsightGenerationAgent", `Generating insights for topic: ${config.topic}`);
const contextSummary =
let insights: string[] = ["No specific insights generated."];
try {
const completion = await openai.chat.completions.create({
model: "gpt-4o-mini",
messages: [{ role: "system", content: "Strategic analyst providing concise insights." }, {
log("INFO", "ReportCompilationAgent", `Compiling report for topic: ${config.topic}`); // Direc
const openai = new OpenAI(); // Ensure API key is configured via environment variables or client configuration
// The agent code used 'relevantArticles'. If 'articlesWithThemes' is the correct data source, use it here instead.
let finalReport = "Report generation failed.";
try {
const completion = await openai.chat.completions.create({
model: "gpt-4o-mini",
messages: [
const { log } = context;
const { anomalyReport, insights, config } = input.payload;
const openai = new OpenAI();
let criticalInfo = "";
if (anomalyReport?.anomaliesFound && anomalyReport.anomalyDetails.length > 0) {
let alertMessage: string | undefined = undefined;
try {
const completion = await openai.chat.completions.create({
model: "gpt-4o-mini",
messages: [{ role: "system", content: "Alert generation specialist." }, { role: "user", co
import { AgentContext, AgentInput, AgentOutput } from "https://esm.town/v/AIWB/agentRegistry/int
import { fetch } from "https://esm.town/v/std/fetch";
import { OpenAI } from "https://esm.town/v/std/openai";
// Summarizer Agent (unchanged, but shown for completeness)
}
const openai = new OpenAI();
log("INFO", "SummarizerAgent", "Generating summary with OpenAI...");
const completion = await openai.chat.completions.create({
messages: [
{
if (summary === "Could not generate summary.") {
log("WARN", "SummarizerAgent", "OpenAI did not return a valid summary content.");
}
}
const openai = new OpenAI();
log("INFO", "CombinerAgent", "Combining text with OpenAI...");
const completion = await openai.chat.completions.create({
messages: [
{
if (combined === "Could not combine information.") {
log("WARN", "CombinerAgent", "OpenAI did not return valid combined content.");
}
import { blob } from "https://esm.town/v/std/blob?v=11";
// @ts-ignore
import { OpenAI } from "https://esm.town/v/std/openai?v=4";
// --- CONSTANTS & CONFIGURATION ---