Search

107 results found for openai (289ms)

Code
105

Note: When changing a SQLite table's schema, change the table's name (e.g., add _2 or _3) to create a fresh table.
### OpenAI
```ts
import { OpenAI } from "https://esm.town/v/std/openai";

const openai = new OpenAI();
const completion = await openai.chat.completions.create({
  messages: [
    { role: "user", content: "Say hello in a creative way" },
  ],
  model: "gpt-4o-mini", // snippet was truncated here; model and token limit are assumed
  max_tokens: 30,
});
console.log(completion.choices[0].message.content);
```
const openrouterApiKey2 = Deno.env.get("OPENROUTER_API_KEY2");
// Poe API (new OpenAI-compatible format)
const poeApiKey = Deno.env.get("POE_API_KEY");
];
// Poe vision models (new API - OpenAI compatible)
const POE_VISION_MODELS = [
"Gemini-2.5-Flash-Lite",
];
// ============ POE API (NEW OpenAI-compatible format) ============
async function callPoeApi(
/**
* Vision pipeline:
* 1) Poe Vision models (primary - new OpenAI-compatible API)
* 2) NVIDIA Vision models (fallback)
* 3) OpenRouter Vision models (final fallback)
prompt: string
): Promise<string> {
// 1. Try Poe Vision first (new OpenAI-compatible API)
console.log("Trying Poe vision models first...");
const poeResult = await analyzeImageWithPoe(imageBase64, prompt);
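The snippet above only shows the first hop of that pipeline. As a hedged sketch (the helper names `analyzeImageWithNvidia` and `analyzeImageWithOpenRouter`, and the convention that each helper returns `null` on failure, are assumptions for illustration), the three-tier fallback could look like:

```ts
// Hedged sketch of the fallback chain described in the pipeline comment above.
// Helper names and their null-on-failure behavior are assumptions; the
// original val's helpers may differ.
type VisionProvider = (imageBase64: string, prompt: string) => Promise<string | null>;

async function analyzeImageWithFallback(
  imageBase64: string,
  prompt: string,
  providers: { name: string; call: VisionProvider }[],
): Promise<string> {
  for (const provider of providers) {
    try {
      console.log(`Trying ${provider.name} vision models...`);
      const result = await provider.call(imageBase64, prompt);
      if (result) return result;
    } catch (err) {
      console.error(`${provider.name} failed:`, err);
    }
  }
  throw new Error("All vision providers failed");
}

// Usage mirroring the 1) Poe -> 2) NVIDIA -> 3) OpenRouter order:
// const text = await analyzeImageWithFallback(imageBase64, prompt, [
//   { name: "Poe", call: analyzeImageWithPoe },
//   { name: "NVIDIA", call: analyzeImageWithNvidia },
//   { name: "OpenRouter", call: analyzeImageWithOpenRouter },
// ]);
```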
import { OpenAI } from "https://esm.sh/openai@4.20.1";
export default async function handler(req: Request): Promise<Response> {
console.log(`📸 Processing ${base64Images.length} image(s)`);
const apiKey = Deno.env.get("OPENAI_API_KEY");
if (!apiKey) {
console.error("❌ OPENAI_API_KEY not set");
return new Response(
JSON.stringify({
}
const openai = new OpenAI({ apiKey });
console.log("🤖 Calling OpenAI for fish identification...");
// Language-specific instruction
}
const response = await openai.chat.completions.create({
model: "gpt-4o",
messages: [
if (!content) {
throw new Error("No response from OpenAI");
}
if (error instanceof Error) {
if (
error.message === "No response from OpenAI" ||
error.message === "Invalid response format from AI"
) {
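Pieced together, the handler follows a familiar shape: fail fast if `OPENAI_API_KEY` is missing, send the base64 images to `gpt-4o` as `image_url` content parts, and translate the known error messages into HTTP responses. A minimal sketch under those assumptions (the prompt text, request body shape, and status codes are placeholders, not the val's actual values):

```ts
import { OpenAI } from "https://esm.sh/openai@4.20.1";

export default async function handler(req: Request): Promise<Response> {
  const apiKey = Deno.env.get("OPENAI_API_KEY");
  if (!apiKey) {
    console.error("❌ OPENAI_API_KEY not set");
    return Response.json({ error: "OPENAI_API_KEY not set" }, { status: 500 });
  }

  // Placeholder request shape: a JSON body carrying base64-encoded images.
  const { base64Images = [] } = await req.json();
  console.log(`📸 Processing ${base64Images.length} image(s)`);

  const openai = new OpenAI({ apiKey });
  console.log("🤖 Calling OpenAI for fish identification...");
  try {
    const response = await openai.chat.completions.create({
      model: "gpt-4o",
      messages: [
        {
          role: "user",
          content: [
            { type: "text", text: "Identify the fish species in these images." },
            ...base64Images.map((img: string) => ({
              type: "image_url" as const,
              image_url: { url: `data:image/jpeg;base64,${img}` },
            })),
          ],
        },
      ],
    });

    const content = response.choices[0]?.message?.content;
    if (!content) throw new Error("No response from OpenAI");

    return Response.json({ result: content });
  } catch (error) {
    if (
      error instanceof Error &&
      (error.message === "No response from OpenAI" ||
        error.message === "Invalid response format from AI")
    ) {
      return Response.json({ error: error.message }, { status: 502 });
    }
    throw error;
  }
}
```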
| `TODOS_DB_ID` | Database ID where todos are synced (32-char ID without hyphens)
### Recommended: OpenAI API Key
| Variable | Description |
| ---------------- | ----------------------------------------------------- |
| `OPENAI_API_KEY` | OpenAI API key for AI features (strongly recommended) |
**What AI does:**
- Disambiguates when multiple contacts match a name
**Without `OPENAI_API_KEY`:** Falls back to Val Town's shared OpenAI client (10 req/min limit).
### Required Database Properties
Visit the root URL of your val (`main.http.tsx`) to see a dashboard showing:
- **Connection status**: Notion API and OpenAI connectivity
- **Property mappings**: Which properties are configured and whether they exist in your database
| Variable                | Description                                   |
| ----------------------- | --------------------------------------------- |
| `NOTION_WEBHOOK_SECRET` | API key for protecting endpoints |
| `OPENAI_API_KEY` | For AI features (see Quick Start for details) |
---
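A hedged sketch of that key-or-shared-client fallback (the function name and model choice are illustrative, not taken from the val):

```ts
import OpenAI from "https://esm.sh/openai@4.20.1";

// Prefer the user's own OPENAI_API_KEY; otherwise fall back to Val Town's
// shared std/openai client mentioned above (no key, 10 req/min limit).
async function complete(prompt: string): Promise<string | null> {
  const apiKey = Deno.env.get("OPENAI_API_KEY");
  const messages = [{ role: "user" as const, content: prompt }];
  const model = "gpt-4o-mini"; // illustrative model choice

  if (apiKey) {
    const openai = new OpenAI({ apiKey });
    const res = await openai.chat.completions.create({ model, messages });
    return res.choices[0]?.message?.content ?? null;
  }

  const { OpenAI: StdOpenAI } = await import("https://esm.town/v/std/openai");
  const std = new StdOpenAI();
  const res = await std.chat.completions.create({ model, messages });
  return res.choices[0]?.message?.content ?? null;
}
```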
options = {},
) => {
// Initialize OpenAI API stub with custom configuration for iFlow
const { Configuration, OpenAIApi } = await import(
"https://esm.sh/openai@3.3.0"
);
const configuration = new Configuration({
apiKey: process.env.OPENAI_API_KEY || process.env.OPENAI,
basePath: process.env.OPENAI_BASE_URL || undefined,
});
const openai = new OpenAIApi(configuration);
// Request chat completion
const messages = typeof prompt === "string"
  ? [{ role: "user", content: prompt }] // assumed shape for string prompts; the snippet is truncated here
  : prompt;
const model = process.env.MODEL_NAME || "gpt-3.5-turbo-0613";
const { data } = await openai.createChatCompletion({
model,
messages: messages as any,
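The fragment above targets the legacy `openai@3.3.0` `Configuration`/`OpenAIApi` API. For comparison only, a sketch of the same custom-endpoint setup with the v4 SDK, assuming the same environment variables; this is not part of the original val:

```ts
import OpenAI from "https://esm.sh/openai@4.20.1";

// v4-style client pointed at a custom OpenAI-compatible endpoint (e.g. iFlow).
const client = new OpenAI({
  apiKey: process.env.OPENAI_API_KEY || process.env.OPENAI,
  baseURL: process.env.OPENAI_BASE_URL || undefined,
});

const completion = await client.chat.completions.create({
  model: process.env.MODEL_NAME || "gpt-3.5-turbo-0613",
  messages: [{ role: "user", content: "Hello" }],
});
console.log(completion.choices[0].message.content);
```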
handler
- Environment variables via `Deno.env.get()`
- Val Town std libraries used: `email` (forwarding), `openai` (LLM fallback)
## Code Standards
- **Geocoder**: US Census Bureau (`geocoding.geo.census.gov`) — free, no API
key, handles DC intersections
- **LLM fallback**: OpenAI gpt-4o-mini via Val Town `std/openai` for locations
that resist deterministic parsing
- **Email format**: Location is always in
intersections well)
- **LLM fallback**: When deterministic parsing can't produce a geocodable
address, OpenAI gpt-4o-mini rewrites the location string before a second
geocode attempt
- **Email parsing**: Location is extracted from the pattern
async function geocodeWithLLM(rawLocation: string): Promise<GeoResult | null> {
const { OpenAI } = await import("https://esm.town/v/std/openai");
const openai = new OpenAI();
const completion = await openai.chat.completions.create({
messages: [
{
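That call is cut off mid-arguments. A sketch of how such a helper might be completed, following the bullets above (rewrite the raw location with gpt-4o-mini, then retry the Census geocoder); the `GeoResult` shape and the `geocodeWithCensus` wrapper are assumptions for illustration:

```ts
import { OpenAI } from "https://esm.town/v/std/openai";

// Assumed result shape; the original val's GeoResult may differ.
interface GeoResult {
  lat: number;
  lon: number;
  matchedAddress: string;
}

// Hypothetical wrapper around the Census one-line address geocoder
// (geocoding.geo.census.gov), which is free and needs no API key.
async function geocodeWithCensus(address: string): Promise<GeoResult | null> {
  const url = "https://geocoding.geo.census.gov/geocoder/locations/onelineaddress?" +
    new URLSearchParams({ address, benchmark: "Public_AR_Current", format: "json" });
  const res = await fetch(url);
  if (!res.ok) return null;
  const data = await res.json();
  const match = data?.result?.addressMatches?.[0];
  if (!match) return null;
  return {
    lat: match.coordinates.y,
    lon: match.coordinates.x,
    matchedAddress: match.matchedAddress,
  };
}

async function geocodeWithLLM(rawLocation: string): Promise<GeoResult | null> {
  const openai = new OpenAI();
  const completion = await openai.chat.completions.create({
    model: "gpt-4o-mini",
    messages: [
      {
        role: "system",
        content:
          "Rewrite the user's location description as a single geocodable " +
          "Washington, DC street address or intersection. Reply with the address only.",
      },
      { role: "user", content: rawLocation },
    ],
  });
  const rewritten = completion.choices[0]?.message?.content?.trim();
  if (!rewritten) return null;
  // Second geocode attempt with the LLM-rewritten address.
  return await geocodeWithCensus(rewritten);
}
```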
kidjs/openai-agents: Template to use the OpenAI Agents SDK (Public)
EatPraySin/openai-agents: Template to use the OpenAI Agents SDK (Public)

Users

No users found

Docs

No docs found