Search: 3,280 results for "openai" (Code: 3,185)
Note: When changing a SQLite table's schema, change the table's name (e.g., add _2 or _3) to create a fresh table.
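As a minimal sketch of that convention, assuming Val Town's std SQLite client at https://esm.town/v/std/sqlite (the `messages_2` table and its columns are made-up examples):

```ts
// Sketch: version the table name instead of altering the schema in place.
// Assumes Val Town's standard SQLite client; "messages_2" is a hypothetical table.
import { sqlite } from "https://esm.town/v/std/sqlite";

await sqlite.execute(`
  CREATE TABLE IF NOT EXISTS messages_2 (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    role TEXT NOT NULL,
    content TEXT NOT NULL
  )
`);
```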
### OpenAI
```ts
import { OpenAI } from "https://esm.town/v/std/openai";

const openai = new OpenAI();
const completion = await openai.chat.completions.create({
  model: "gpt-4o",
  messages: [
    { role: "user", content: "Say hello in a creative way" },
  ],
});
console.log(completion.choices[0].message.content);

// --- truncated excerpt from another matching val ---
// @ts-ignore
import { OpenAI } from "https://esm.town/v/std/openai?v=4";
// @ts-ignore
import { Hono } from "npm:hono@4.4.12";
// …
wizardState.rawRequirements = rawRequirements;
const openai = new OpenAI();
const completion = await openai.chat.completions.create({
  model: "gpt-4o",
  messages: [
// …

// --- truncated excerpt from another matching val ---
// @ts-ignore
import { OpenAI } from "https://esm.town/v/std/openai?v=4";
// --- TYPE DEFINITIONS ---
// …
const action = url.searchParams.get("action");
const openai = new OpenAI();
if (req.method === "POST") {
// …
  throw new Error("Invalid 'goal' provided.");
}
const completion = await openai.chat.completions.create({
  model: "gpt-4o",
  messages: [
// …
  throw new Error("Invalid 'tasks' array provided.");
}
const completion = await openai.chat.completions.create({
  model: "gpt-4o",
  messages: [
// …
```
### OpenAI
To use this library, import from https://esm.town/v/cricks_unmixed4u/openai-client/main.tsx.

A TypeScript interface for interacting with OpenAI's chat models, with optional global rate limiting.

Key Components

- Message Type: Defines the structure for chat messages (role and content).
- ChatOpenAI(model: string): Factory function returning an object with an invoke(messages) method.
- GlobalRateLimitedChatOpenAI(model: string, requestsPerSecond: number): Decorator for ChatOpenAI that applies a global rate limit.
- GlobalRateLimiter: Class that implements the rate-limiting logic; it checks the number of recent requests against the configured rate before allowing another call.
- ensureGlobalRateLimitTableExists: Ensures the rate-limit tracking table exists in the database, creating it if needed.

Usage

- Use ChatOpenAI(model) for direct, unlimited access to OpenAI chat completions.
- Use GlobalRateLimitedChatOpenAI(model, requestsPerSecond) to enforce a global rate limit on chat completion calls (see the sketch below).

Val Town/Platform Notes

- Uses Val Town’s standard SQLite API for persistent storage.
- Designed for server-side use (no browser-specific code).
- No secrets are hardcoded; OpenAI API keys are managed by the OpenAI SDK/environment.
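A minimal usage sketch of the interface described above; the import path comes from the note at the top of this section, but the exact exports and the return value of invoke() are assumptions, not verified against the library:

```ts
// Sketch only: assumes main.tsx exports ChatOpenAI and GlobalRateLimitedChatOpenAI
// with the signatures described above; the return shape of invoke() is an assumption.
import {
  ChatOpenAI,
  GlobalRateLimitedChatOpenAI,
} from "https://esm.town/v/cricks_unmixed4u/openai-client/main.tsx";

// Direct, unlimited access to chat completions.
const chat = ChatOpenAI("gpt-4o");
const reply = await chat.invoke([
  { role: "user", content: "Summarize this library in one sentence." },
]);
console.log(reply);

// Same interface, throttled to at most 2 requests per second across all callers;
// per the README, the limiter tracks request counts in Val Town SQLite.
const throttled = GlobalRateLimitedChatOpenAI("gpt-4o", 2);
console.log(await throttled.invoke([{ role: "user", content: "Say hello politely." }]));
```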
Another match is a JSON record describing the "portkey-openai-to-groq" tutorial from the Groq API cookbook (excerpt; several fields were cut off by the search view):

```jsonc
{
  "slug": "portkey-openai-to-groq",
  "title": "portkey-openai-to-groq",
  "description": "Portkey is a control panel for AI apps that provides a unified API to …",
  "image": null,
  "username": null,
  "avatarUrl": null,
  // truncated field: "…github.com/groq/groq-api-cookbook/tree/main/tutorials/portkey-openai-to-groq"
  "demo": null,
  "language": null,
  "date": null,
  "readmeSource": null,
  // truncated field: "…github.com/groq/groq-api-cookbook/blob/main/tutorials/portkey-openai-to-groq/Switch_from_OpenAI_to_G…"
  "ctas": [
    {
      "icon": "mdi:github",
      // truncated field: "…github.com/groq/groq-api-cookbook/tree/main/tutorials/portkey-openai-to-groq"
    },
    {
      "icon": "mdi:notebook",
      // truncated field: "…github.com/groq/groq-api-cookbook/blob/main/tutorials/portkey-openai-to-groq/Switch_from_OpenAI_to_G…"
    }
  ],
  // …
}
```
import "jsr:@std/dotenv/load";
const GROQ_CHAT_URL = "https://api.groq.com/openai/v1/chat/completions";
function stripMarkdownAndHtml(input) {
const content = stripMarkdownAndHtml(readmeText).slice(0, 12000);
const body = {
// model: "openai/gpt-oss-20b",
model: "llama-3.1-8b-instant",
messages: [
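The excerpt stops before the request is sent; here is a sketch of how such a body is typically POSTed to Groq's OpenAI-compatible endpoint (the GROQ_API_KEY environment variable and the prompt content are assumptions):

```ts
// Sketch: send a chat-completion body to Groq's OpenAI-compatible API.
// Assumes a GROQ_API_KEY environment variable is set.
const GROQ_CHAT_URL = "https://api.groq.com/openai/v1/chat/completions";

const res = await fetch(GROQ_CHAT_URL, {
  method: "POST",
  headers: {
    "Authorization": `Bearer ${Deno.env.get("GROQ_API_KEY")}`,
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    model: "llama-3.1-8b-instant",
    messages: [
      { role: "user", content: "Summarize this README in two sentences." },
    ],
  }),
});

const data = await res.json();
// Groq mirrors the OpenAI chat-completions response shape.
console.log(data.choices[0].message.content);
```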
Another match is a "PDF Title Suggester" val (excerpt):

```ts
// It serves a minimal UI on GET and returns title suggestions on POST.
import { OpenAI } from "https://esm.town/v/std/openai"; // Val Town’s built-in OpenAI wrapper

const openai = new OpenAI();

export default async function (req: Request): Promise<Response> {
  // …
  `.trim();

  const completion = await openai.chat.completions.create({
    model: "gpt-5-nano",
    max_tokens: 400,
    // …

  // The GET branch serves HTML along these lines (excerpt):
  //   <body>
  //     <h1>PDF Title Suggester</h1>
  //     <p>… PDFs. We extract a small text sample client-side and ask OpenAI for concise title ideas.</p>
  //     <div class="card">
  //       <form id="form">
  //
  // The page's client-side script reports progress before calling the val (excerpt):
  //   statusEl.textContent = "Calling OpenAI…";
  //   try {
```
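To round out the flow the excerpt describes (extract a text sample in the browser, POST it to the val, display OpenAI's suggestions), a hedged sketch of the client-side call; the "status" element id, the textSample value, the "/" endpoint, and the { titles } response shape are all assumptions:

```ts
// Hypothetical client-side call; the "/" endpoint and the { titles: string[] }
// response shape are assumptions, not taken from the val itself.
const statusEl = document.getElementById("status") as HTMLElement;
const textSample = "…text sample extracted from the PDF client-side…";

statusEl.textContent = "Calling OpenAI…";
try {
  const res = await fetch("/", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ textSample }),
  });
  if (!res.ok) throw new Error(`Request failed with status ${res.status}`);
  const { titles } = await res.json();
  statusEl.textContent = titles.join("\n");
} catch (err) {
  statusEl.textContent = `Error: ${(err as Error).message}`;
}
```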