```ts
import { OpenAI } from "https://esm.town/v/std/openai";

export default async function(req: Request): Promise<Response> {
  const body = await req.json();
  const openai = new OpenAI();
  try {
    if (!body.stream) {
      const completion = await openai.chat.completions.create(body);
      return Response.json(completion);
    }
    const stream = await openai.chat.completions.create(body);
    // Streaming response handling is truncated in the original snippet.
    return new Response(JSON.stringify(stream));
  } catch (e) {
    return new Response(JSON.stringify({ error: String(e) }), { status: 500 });
  }
}
```
```ts
const openCageKey = Deno.env.get("OPENCAGE_API_KEY");
const openAIKey = Deno.env.get("OPENAI_API_KEY");
if (!openCageKey || !openAIKey) {
  return new Response(
    JSON.stringify({ error: "API keys not set in environment variables." }),
    { status: 500, headers: { "Content-Type": "application/json" } },
  );
}

const summary = wikiData.extract || "A place of indescribable intrigue.";

// Prepare prompt for OpenAI
const prompt =
  `You are Richard Ayoade from Travel Man. Write a short, dry, funny fact about this place:\n\nLocation: ${placeName}\nFact: ${summary}`;

// Call OpenAI API for witty fact
const openaiResp = await fetch("https://api.openai.com/v1/chat/completions", {
  method: "POST",
  headers: {
    Authorization: `Bearer ${openAIKey}`,
    "Content-Type": "application/json",
  },
  // The original snippet omitted the request body; the model name here is an assumption.
  body: JSON.stringify({
    model: "gpt-4o-mini",
    messages: [{ role: "user", content: prompt }],
  }),
});

if (!openaiResp.ok) {
  const errorText = await openaiResp.text();
  return new Response(
    JSON.stringify({ error: `OpenAI API error: ${errorText}` }),
    { status: 500, headers: { "Content-Type": "application/json" } },
  );
}

const aiData = await openaiResp.json();
const wittyFact = aiData.choices?.[0]?.message?.content
  || "I couldn’t think of anything witty.";
```
### OpenAI

Do not use the val.town std library; import from https://esm.town/v/cricks_unmixed4u/openai-client/main.tsx and use:

## GreenPTClient

```tsx
import { GreenPTClient } from "https://esm.town/v/cricks_unmixed4u/openai-client/main.tsx";

const client = GreenPTClient("green-l");
```
---
description: You can use openai-client when integrating vals to an LLM
globs:
alwaysApply: false
---

TypeScript interface for interacting with OpenAI's chat models, with optional global rate limiting, using Val Town's SQLite for persistent rate-limit tracking.

### Key Components

- **Message Type**: Defines the structure for chat messages (role and content).
- **ChatOpenAI(model: string)**: Factory function returning an object with an `invoke(messages)` method. This method sends an array of messages to the specified OpenAI chat model and returns the assistant's response.
- **GlobalRateLimitedChatOpenAI(model: string, requestsPerSecond: number)**: Decorator for ChatOpenAI that enforces a global rate limit (requests per second) using a persistent SQLite table.
- **GlobalRateLimiter**: Class that implements the rate-limiting logic. It checks the number of requests in the current time window and throws an error if the limit is exceeded. It uses a table (`global_rate_limit_1`) in Val Town's SQLite.
- **ensureGlobalRateLimitTableExists**: Ensures the rate-limit tracking table exists in the database at startup.

### Usage

- Use `ChatOpenAI(model)` for direct, unlimited access to OpenAI chat completions.
- Use `GlobalRateLimitedChatOpenAI(model, requestsPerSecond)` to enforce a global rate limit on chat completions, suitable for shared or public-facing endpoints.

### Val Town/Platform Notes

- Uses Val Town's standard SQLite API for persistent storage.
- Designed for server-side use (no browser-specific code).
- No secrets are hardcoded; OpenAI API keys are managed by the OpenAI SDK/environment.
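The sliding-window check that GlobalRateLimiter performs can be sketched in memory. This is an illustrative assumption, not the val's actual code: the real implementation persists its request timestamps in the `global_rate_limit_1` SQLite table, while this sketch keeps them in an array, and the class name here is hypothetical.

```typescript
// In-memory sketch of a per-second sliding-window rate limiter.
// Assumption: the real GlobalRateLimiter uses SQLite, not an array.
class SlidingWindowRateLimiter {
  private timestamps: number[] = [];

  constructor(private requestsPerSecond: number) {}

  // Throws if the limit for the current one-second window is exceeded,
  // otherwise records the request. `now` is injectable for testing.
  check(now: number = Date.now()): void {
    const windowStart = now - 1000;
    // Drop requests that have fallen out of the one-second window.
    this.timestamps = this.timestamps.filter((t) => t > windowStart);
    if (this.timestamps.length >= this.requestsPerSecond) {
      throw new Error("Global rate limit exceeded");
    }
    this.timestamps.push(now);
  }
}
```

A GlobalRateLimitedChatOpenAI-style wrapper would run this check before each `invoke(messages)` call and surface the error to the caller.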
Note: When changing a SQLite table's schema, change the table's name (e.g., add `_2` or `_3`) to create a fresh table.

### OpenAI

```ts
import { OpenAI } from "https://esm.town/v/std/openai";

const openai = new OpenAI();
const completion = await openai.chat.completions.create({
  messages: [
    { role: "user", content: "Say hello in a creative way" },
  ],
});
```
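The rename-on-schema-change convention in the note above can be captured with a small helper. The function name is hypothetical; only the naming convention itself comes from the note.

```typescript
// Hypothetical helper: derive a versioned table name so that a schema change
// (bumping the version) targets a fresh table instead of the old one.
function versionedTableName(base: string, version: number): string {
  return `${base}_${version}`;
}
```

For example, `versionedTableName("global_rate_limit", 1)` yields the `global_rate_limit_1` table used by the rate limiter; after a schema change you would write to `global_rate_limit_2`.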
```ts
import { Hono } from "npm:hono";

const app = new Hono();
const GPT_WRAPPER_URL = "https://esm.town/v/cricks_unmixed4u/openai-client/main.tsx";

// The handler body is truncated in the original snippet; delegating to
// app.fetch is the usual Hono-on-Val-Town pattern.
export default async (req: Request) => app.fetch(req);
```
```ts
import { OpenAI } from "https://esm.town/v/std/openai";
import { sqlite } from "https://esm.town/v/stevekrouse/sqlite";

/**
 * Practical Implementation of Collective Content Intelligence
 * Bridging advanced AI with collaborative content creation
 */
exp
```
```ts
import { OpenAI } from "https://esm.town/v/std/openai";

export default async function(req: Request): Promise<Response> {
  if (req.method === "OPTIONS") {
    return new Response(null, {
      headers: {
        "Access-Control-Allow-Origin": "*",
        // The remaining headers are truncated in the original snippet;
        // these are the standard CORS preflight companions.
        "Access-Control-Allow-Methods": "GET, POST, OPTIONS",
        "Access-Control-Allow-Headers": "Content-Type",
      },
    });
  }
  // Non-preflight handling is truncated in the original snippet.
  return new Response("Method not handled", { status: 501 });
}
```