
OpenAI

Use OpenAI's chat completions API with std/openai. This integration lets you access OpenAI's language models without needing to acquire your own API keys.

For free Val Town users, all calls are sent to gpt-3.5-turbo.

Streaming is not yet supported. Upvote the HTTP response streaming feature request if you need it!

Usage

```ts
import { OpenAI } from "https://esm.town/v/std/openai";

const openai = new OpenAI();

const completion = await openai.chat.completions.create({
  messages: [
    { role: "user", content: "Say hello in a creative way" },
  ],
  model: "gpt-4",
  max_tokens: 30,
});

console.log(completion.choices[0].message.content);
```

Limits

While our wrapper simplifies the integration of OpenAI, there are a few limitations to keep in mind:

  • Usage Quota: We limit each user to 10 requests per minute (see the retry sketch after this list).
  • Features: Chat completions is the only endpoint available.
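
If a burst of calls goes past the quota, the request will fail. Below is a minimal retry sketch, assuming the rate-limit error surfaces with a `status` of 429 as it does in the underlying npm:openai client; the helper `completeWithRetry` is ours, not part of std/openai:

```ts
import { OpenAI } from "https://esm.town/v/std/openai";

const openai = new OpenAI();

// Hypothetical helper: retry after a pause when the proxy reports a
// rate-limit error (assumed to surface as an error with `status === 429`).
async function completeWithRetry(prompt: string, retries = 2): Promise<string> {
  for (let attempt = 0; ; attempt++) {
    try {
      const completion = await openai.chat.completions.create({
        messages: [{ role: "user", content: prompt }],
        model: "gpt-4",
        max_tokens: 30,
      });
      return completion.choices[0].message.content ?? "";
    } catch (error) {
      if (attempt >= retries || (error as { status?: number }).status !== 429) throw error;
      // Wait before retrying; at 10 requests/minute a few seconds is usually enough.
      await new Promise((resolve) => setTimeout(resolve, 10_000));
    }
  }
}

console.log(await completeWithRetry("Say hello in a creative way"));
```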

If these limits are too low, let us know! You can also get around the limitation by using your own keys:

  1. Create your own API key on OpenAI's website
  2. Create an environment variable named OPENAI_API_KEY
  3. Use the OpenAI client from npm:openai:
```ts
import { OpenAI } from "npm:openai";

const openai = new OpenAI();
```
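
The npm client should pick up OPENAI_API_KEY from the environment by default; if you prefer to be explicit, you can pass the key yourself. A minimal sketch, assuming the environment variable from step 2:

```ts
import { OpenAI } from "npm:openai";

// Pass the key explicitly instead of relying on the default environment lookup.
const openai = new OpenAI({ apiKey: Deno.env.get("OPENAI_API_KEY") });

const completion = await openai.chat.completions.create({
  messages: [{ role: "user", content: "Say hello in a creative way" }],
  model: "gpt-3.5-turbo",
});

console.log(completion.choices[0].message.content);
```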

πŸ“ Edit docs

```ts
import { type ClientOptions, OpenAI as RawOpenAI } from "npm:openai";

/**
 * API Client for interfacing with the OpenAI API. Uses Val Town credentials.
 */
export class OpenAI {
  private rawOpenAIClient: RawOpenAI;

  /**
   * API Client for interfacing with the OpenAI API. Uses Val Town credentials.
   *
   * @param {number} [opts.timeout=10 minutes] - The maximum amount of time (in milliseconds) the client will wait for a response before timing out.
   * @param {number} [opts.httpAgent] - An HTTP agent used to manage HTTP(s) connections.
   * @param {Core.Fetch} [opts.fetch] - Specify a custom `fetch` function implementation.
   * @param {number} [opts.maxRetries=2] - The maximum number of times the client will retry a request.
   * @param {Core.Headers} opts.defaultHeaders - Default headers to include with every request to the API.
   * @param {Core.DefaultQuery} opts.defaultQuery - Default query parameters to include with every request to the API.
   * @param {boolean} [opts.dangerouslyAllowBrowser=false] - By default, client-side use of this library is not allowed, as it risks exposing your secret API credentials to attackers.
   */
  constructor(options: Omit<ClientOptions, "baseURL" | "apiKey" | "organization"> = {}) {
    this.rawOpenAIClient = new RawOpenAI({
      ...options,
      baseURL: "https://std-openaiproxy.web.val.run/v1",
      apiKey: Deno.env.get("valtown"),
      organization: null,
    });
  }

  get chat() {
    return this.rawOpenAIClient.chat;
  }

  readonly beta = {
    get chat(): RawOpenAI["beta"]["chat"] {
      return this.rawOpenAIClient.beta.chat;
    },
  };
}
```
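
Because the constructor forwards everything except baseURL, apiKey, and organization to the underlying npm:openai client, the options documented in the JSDoc above (such as timeout and maxRetries) can be tuned. A minimal sketch:

```ts
import { OpenAI } from "https://esm.town/v/std/openai";

// Shorten the timeout to 30 seconds and disable automatic retries.
const openai = new OpenAI({ timeout: 30_000, maxRetries: 0 });
```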