```ts
// @ts-ignore
import { OpenAI } from "https://esm.town/v/std/openai?v=4";
// @ts-ignore
import { Hono } from "npm:hono@4.4.12";
// ...
wizardState.rawRequirements = rawRequirements;
const openai = new OpenAI();
const completion = await openai.chat.completions.create({
  model: "gpt-4o",
  messages: [
// ...
```
```ts
// @ts-ignore
import { OpenAI } from "https://esm.town/v/std/openai?v=4";
// --- TYPE DEFINITIONS ---
// ...
const action = url.searchParams.get("action");
const openai = new OpenAI();
// ...
if (req.method === "POST") {
// ...
throw new Error("Invalid 'goal' provided.");
}
const completion = await openai.chat.completions.create({
  model: "gpt-4o",
  messages: [
// ...
throw new Error("Invalid 'tasks' array provided.");
}
const completion = await openai.chat.completions.create({
  model: "gpt-4o",
  messages: [
// ...
```
### OpenAI

Do not use the Val Town std library; import from https://esm.town/v/cricks_unmixed4u/openai-client/main.tsx

TypeScript interface for interacting with OpenAI's chat models, with optional global rate limiting, using Val Town's SQLite for persistent rate limit tracking.

**Key Components**

- `Message` type: defines the structure for chat messages (role and content).
- `ChatOpenAI(model: string)`: factory function returning an object with an `invoke(messages)` method. This method sends an array of messages to the specified OpenAI chat model and returns the assistant's response.
- `GlobalRateLimitedChatOpenAI(model: string, requestsPerSecond: number)`: decorator for `ChatOpenAI` that enforces a global rate limit (requests per second) using a persistent SQLite table.
- `GlobalRateLimiter`: class that implements the rate limiting logic. It checks the number of requests in the current time window and throws an error if the limit is exceeded. It uses a table (`global_rate_limit_1`) in Val Town's SQLite.
- `ensureGlobalRateLimitTableExists`: ensures the rate limit tracking table exists in the database at startup.

**Usage**

- Use `ChatOpenAI(model)` for direct, unlimited access to OpenAI chat completions.
- Use `GlobalRateLimitedChatOpenAI(model, requestsPerSecond)` to enforce a global rate limit on chat completions, suitable for shared or public-facing endpoints.

**Val Town / Platform Notes**

- Uses Val Town's standard SQLite API for persistent storage.
- Designed for server-side use (no browser-specific code).
- No secrets are hardcoded; OpenAI API keys are managed by the OpenAI SDK/environment.

### Email
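The fixed-window check that `GlobalRateLimiter` performs can be sketched in plain TypeScript. This is a minimal in-memory sketch only: the real class persists its counts in the `global_rate_limit_1` SQLite table, whereas the `FixedWindowRateLimiter` name and the in-memory window object here are illustrative assumptions.

```ts
type Window = { start: number; count: number };

// In-memory sketch of a fixed-window rate limiter. The persistent version
// described above would read/write this window state in a SQLite table.
class FixedWindowRateLimiter {
  private win: Window = { start: 0, count: 0 };

  constructor(private limit: number, private windowMs = 1000) {}

  // Returns true if a request is allowed in the current window.
  tryAcquire(now: number = Date.now()): boolean {
    if (now - this.win.start >= this.windowMs) {
      // The window has elapsed: start a fresh one.
      this.win = { start: now, count: 0 };
    }
    if (this.win.count >= this.limit) return false;
    this.win.count++;
    return true;
  }
}
```

A wrapper like `GlobalRateLimitedChatOpenAI` would call `tryAcquire()` before each `invoke(messages)` and throw when it returns false.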
Note: When changing a SQLite table's schema, change the table's name (e.g., add `_2` or `_3`) to create a fresh table.

### OpenAI

```ts
import { OpenAI } from "https://esm.town/v/std/openai";

const openai = new OpenAI();
const completion = await openai.chat.completions.create({
  messages: [
    { role: "user", content: "Say hello in a creative way" },
// ...
```
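The table-renaming convention in the note can be captured by a small helper. `bumpTableName` is a hypothetical function, not part of the Val Town std library; it just derives the next fresh table name.

```ts
// Hypothetical helper for the schema-change note: derive a fresh table name
// by bumping a numeric suffix (messages -> messages_2 -> messages_3).
function bumpTableName(name: string): string {
  const m = name.match(/^(.*)_(\d+)$/);
  // Already suffixed: increment the number; otherwise start at _2.
  return m ? `${m[1]}_${Number(m[2]) + 1}` : `${name}_2`;
}
```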
```json
{
  "slug": "portkey-openai-to-groq",
  "title": "portkey-openai-to-groq",
  "description": "Portkey is a control panel for AI apps that provides a unified API to connect to 150+ models, view metrics and logs, and implement features like semantic cache and automatic retries.",
  "image": null,
  "username": null,
  "avatarUrl": null,
  "github": "https://github.com/groq/groq-api-cookbook/tree/main/tutorials/portkey-openai-to-groq",
  "demo": null,
  "language": null,
  "date": null,
  "readmeSource": null,
  "ipynbLink": "https://github.com/groq/groq-api-cookbook/blob/main/tutorials/portkey-openai-to-groq/Switch_from_OpenAI_to_Groq.ipynb",
  "ctas": [
    { "icon": "mdi:github", "href": "https://github.com/groq/groq-api-cookbook/tree/main/tutorials/portkey-openai-to-groq" },
    { "icon": "mdi:notebook", "href": "https://github.com/groq/groq-api-cookbook/blob/main/tutorials/portkey-openai-to-groq/Switch_from_OpenAI_to_Groq.ipynb" }
  ],
```
```ts
import "jsr:@std/dotenv/load";

const GROQ_CHAT_URL = "https://api.groq.com/openai/v1/chat/completions";

function stripMarkdownAndHtml(input) {
// ...
const content = stripMarkdownAndHtml(readmeText).slice(0, 12000);
const body = {
  // model: "openai/gpt-oss-20b",
  model: "llama-3.1-8b-instant",
  messages: [
// ...
```
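The snippet above posts an OpenAI-style body to Groq's OpenAI-compatible endpoint. Assembling that request can be sketched as below; `buildGroqRequest` is an illustrative helper not in the original code, and the assumption is that a `GROQ_API_KEY` environment variable supplies the bearer token.

```ts
const GROQ_CHAT_URL = "https://api.groq.com/openai/v1/chat/completions";

// Build the URL and fetch() init for a Groq chat-completions call.
// The body shape mirrors the OpenAI chat completions API.
function buildGroqRequest(apiKey: string, content: string) {
  return {
    url: GROQ_CHAT_URL,
    init: {
      method: "POST",
      headers: {
        "Authorization": `Bearer ${apiKey}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        model: "llama-3.1-8b-instant",
        messages: [{ role: "user", content }],
      }),
    },
  };
}
```

Usage would be `const { url, init } = buildGroqRequest(Deno.env.get("GROQ_API_KEY")!, prompt); await fetch(url, init);`.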
```ts
// It serves a minimal UI on GET and returns title suggestions on POST.
import { OpenAI } from "https://esm.town/v/std/openai"; // Val Town's built-in OpenAI wrapper

const openai = new OpenAI();

export default async function (req: Request): Promise<Response> {
// ...
  `.trim();
  const completion = await openai.chat.completions.create({
    model: "gpt-5-nano",
    max_tokens: 400,
// ...
<body>
  <h1>PDF Title Suggester</h1>
  <p class="muted">Upload one or more PDFs. We extract a small text sample client-side and ask OpenAI for concise title ideas.</p>
  <div class="card">
    <form id="form">
// ...
statusEl.textContent = "Calling OpenAI…";
try {
// ...
```
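The val above extracts a small text sample client-side and builds a prompt from it. That step can be sketched as follows; `sampleText` and `buildPrompt` are hypothetical names for illustration, not functions from the val itself.

```ts
// Collapse whitespace and keep only a short sample of the extracted PDF text,
// so the prompt stays small regardless of document size.
function sampleText(fullText: string, maxChars = 2000): string {
  return fullText.replace(/\s+/g, " ").trim().slice(0, maxChars);
}

// Turn the sample into the user message sent to the chat model.
function buildPrompt(sample: string): string {
  return `Suggest 5 concise, descriptive titles for a PDF that begins:\n\n${sample}`;
}
```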
reconsumeralization:

```ts
import { OpenAI } from "https://esm.town/v/std/openai";
import { sqlite } from "https://esm.town/v/stevekrouse/sqlite";

/**
 * Practical Implementation of Collective Content Intelligence
 * Bridging advanced AI with collaborative content creation
 */
```
lost1991:

```ts
import { OpenAI } from "https://esm.town/v/std/openai";

export default async function(req: Request): Promise<Response> {
  if (req.method === "OPTIONS") {
    return new Response(null, {
      headers: {
        "Access-Control-Allow-Origin": "*",
// ...
```