Note: When changing a SQLite table's schema, change the table's name (e.g., add _2 or _3) to create a fresh table with the new schema.
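The rename-and-copy migration the note describes can be sketched with plain SQLite; the table and column names below are illustrative, not from any real schema:

```python
import sqlite3

# Illustrative sketch of the rename-on-migrate pattern: instead of altering
# users_1 in place, create users_2 with the new schema and copy rows across.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users_1 (id INTEGER PRIMARY KEY, email TEXT)")
conn.execute("INSERT INTO users_1 (email) VALUES ('sam@example.com')")

# New schema adds a column; old rows are copied over unchanged
conn.execute("CREATE TABLE users_2 (id INTEGER PRIMARY KEY, email TEXT, name TEXT)")
conn.execute("INSERT INTO users_2 (id, email) SELECT id, email FROM users_1")

rows = conn.execute("SELECT email, name FROM users_2").fetchall()
print(rows)  # [('sam@example.com', None)]
```

Because the old table is left untouched, you can verify the copied data before dropping it.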
### OpenAI
```ts
import { OpenAI } from "https://esm.town/v/std/openai";

const openai = new OpenAI();
const completion = await openai.chat.completions.create({
  messages: [
    { role: "user", content: "Say hello in a creative way" },
  ],
  model: "gpt-4o-mini",
});
console.log(completion.choices[0].message.content);
```
In addition to tools you make available to the model with function calling, you can give the model access to third-party tools in two ways:

* **Connectors** are OpenAI-maintained MCP wrappers for popular services like Google Workspace.
* **Remote MCP servers** can be any server on the public Internet that implements a remote Model Context Protocol (MCP) server.
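As a sketch of what attaching a remote MCP server looks like in a Responses API request body (the server label, URL, and input text below are hypothetical placeholders, not a real server):

```python
# Hypothetical remote MCP server attached as a tool in a Responses API
# request body; server_label and server_url are placeholders.
remote_mcp_tool = {
    "type": "mcp",
    "server_label": "example_server",
    "server_url": "https://example.com/mcp",
    "require_approval": "never",
}

request_body = {
    "model": "gpt-5",
    "input": "Use the example_server tools to answer.",
    "tools": [remote_mcp_tool],
}
print(request_body["tools"][0]["server_label"])  # example_server
```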
```bash
# The input text below is a generic placeholder
curl https://api.openai.com/v1/responses \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $OPENAI_API_KEY" \
  -d '{
    "model": "gpt-5",
    "input": "Say hello"
  }'
```

```javascript
import OpenAI from "openai";

const client = new OpenAI();
// The input text below is a generic placeholder
const resp = await client.responses.create({
  model: "gpt-5",
  input: "Say hello",
});
console.log(resp.output_text);
```

```python
from openai import OpenAI

client = OpenAI()
# The input text below is a generic placeholder
resp = client.responses.create(
    model="gpt-5",
    input="Say hello",
)
print(resp.output_text)
```
#### Approvals
By default, OpenAI will request your approval before any data is shared with a connector or remote MCP server.
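As a sketch, approval behavior is controlled per tool through the `require_approval` field; the server fields below are placeholders, and the selective `{"never": {"tool_names": [...]}}` shape is an assumption for illustration:

```python
# Placeholder MCP tool; require_approval="always" asks before every call.
tool = {
    "type": "mcp",
    "server_label": "example_server",
    "server_url": "https://example.com/mcp",
    "require_approval": "always",
}

# Assumed selective form: skip approvals only for specific tool names.
tool_selective = dict(tool, require_approval={"never": {"tool_names": ["ping"]}})
print(tool_selective["require_approval"])
```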
The Responses API has built-in support for a limited set of connectors to third-party services. These connectors can be used in the same way as remote MCP servers. Both let an OpenAI model access additional third-party tools and data.
### Available connectors
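As a sketch of how a connector differs from a remote MCP server in a request: it takes a `connector_id` and an OAuth access token in `authorization` rather than a `server_url`. The `connector_dropbox` ID and token below are illustrative placeholders:

```python
# Hypothetical Dropbox connector configuration; the connector_id and
# authorization token are placeholders, not working values.
connector_tool = {
    "type": "mcp",
    "server_label": "Dropbox",
    "connector_id": "connector_dropbox",
    "authorization": "<oauth-access-token>",
    "require_approval": "never",
}
print("server_url" in connector_tool)  # False: connectors use connector_id
```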
Risks and safety
----------------
The MCP tool permits you to connect OpenAI models to external services. This is a powerful feature that comes with some risks.

For connectors, there is a risk of potentially sending sensitive data to OpenAI, or allowing models read access to potentially sensitive data held in those services.

Remote MCP servers carry those same risks, but also have not been verified by OpenAI. These servers can allow models to access, send, and receive data from services you did not intend.

If you come across a malicious MCP server, please report it to `security@openai.com`.

Below are some best practices to consider when integrating connectors and remote MCP servers.

We also recommend logging any data sent to MCP servers. If you're using the Responses API with `store=true`, response data is already logged via the API for 30 days unless Zero Data Retention is enabled for your organization.

Tool definitions and tool outputs from third-party servers may contain hidden instructions (prompt injections) designed to make OpenAI models behave unexpectedly. While OpenAI works to mitigate this, you should carefully review what data is shared with these servers.

MCP servers may update tool behavior unexpectedly, potentially leading to unintended or malicious behavior.

The MCP tool is compatible with Zero Data Retention and Data Residency, but note that MCP servers are third-party services with their own data retention and residency policies. For example, if you're an organization with Data Residency in Europe, OpenAI will limit inference and storage of Customer Content to Europe, but the MCP servers you use are not bound by that guarantee.
Connect to the Realtime API using WebSockets on a server.
WebSockets are a broadly supported API for realtime data transfer, and a great choice for connecting to the OpenAI Realtime API in server-to-server applications.

In a server-to-server integration with Realtime, your backend system will connect via WebSocket directly to the Realtime API, authenticating with a standard API key that never leaves your server.
*Figure: a backend server connecting directly to the Realtime API.*
Connect via WebSocket
---------------------
You will also need to pass an authentication header using your OpenAI API key.

It is possible to use WebSocket in browsers with an ephemeral API token, as shown in the WebRTC connection examples, but for browser-based applications we recommend WebRTC.
```javascript
import WebSocket from "ws";

const url = "wss://api.openai.com/v1/realtime?model=gpt-realtime";
const ws = new WebSocket(url, {
  headers: {
    Authorization: "Bearer " + process.env.OPENAI_API_KEY,
  },
});
```
```python
import os
import websocket  # pip install websocket-client

OPENAI_API_KEY = os.environ.get("OPENAI_API_KEY")
url = "wss://api.openai.com/v1/realtime?model=gpt-realtime"
headers = ["Authorization: Bearer " + OPENAI_API_KEY]

def on_open(ws):
    print("Connected to server.")

ws = websocket.WebSocketApp(url, header=headers, on_open=on_open)
ws.run_forever()
```
```javascript
// Browser WebSocket: authenticate with subprotocols instead of headers
const ws = new WebSocket(
  "wss://api.openai.com/v1/realtime?model=gpt-realtime",
  [
    "realtime",
    // Auth
    "openai-insecure-api-key." + OPENAI_API_KEY,
    // Optional
    "openai-organization." + OPENAI_ORG_ID,
    "openai-project." + OPENAI_PROJECT_ID,
  ]
);
```
Connect to the Realtime API using WebRTC.
WebRTC is a set of standard interfaces for building real-time applications. The OpenAI Realtime API supports connecting from browser and mobile clients over WebRTC. For browser-based voice agents, we recommend starting with the [Agents SDK for TypeScript](https://openai.github.io/openai-agents-js/guides/voice-agents/quickstart/), which handles the connection for you.
When connecting to a Realtime model from the client (like a web browser or mobile device), we recommend a connection flow like this:

1. A browser makes a request to a developer-controlled server to mint an ephemeral API key.
2. That server uses a standard API key to request an ephemeral key from the OpenAI REST API, and returns it to the browser.
3. The browser uses the ephemeral key to authenticate a session directly with the OpenAI Realtime API as a WebRTC peer connection.
*Figure: connecting to the Realtime API via WebRTC with an ephemeral key.*
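Step 2 of this flow amounts to one authenticated POST from your server. This sketch only assembles the request pieces without making a network call; the session shape mirrors this guide's `gpt-realtime` examples:

```python
import json
import os

# Sketch of the server-side request that mints an ephemeral client secret;
# only the standard API key (kept on the server) is used here.
url = "https://api.openai.com/v1/realtime/client_secrets"
headers = {
    "Authorization": "Bearer " + os.environ.get("OPENAI_API_KEY", "sk-placeholder"),
    "Content-Type": "application/json",
}
payload = json.dumps({"session": {"type": "realtime", "model": "gpt-realtime"}})
print(json.loads(payload)["session"]["type"])  # realtime
```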
Initializing a peer connection
```javascript
// Get an ephemeral session token for the OpenAI Realtime API
const tokenResponse = await fetch("/token");
const data = await tokenResponse.json();

// Create a WebRTC peer connection and a local SDP offer
const pc = new RTCPeerConnection();
const offer = await pc.createOffer();
await pc.setLocalDescription(offer);

// Send the offer SDP to the Realtime API and apply its answer
const baseUrl = "https://api.openai.com/v1/realtime/calls";
const model = "gpt-realtime";
const sdpResponse = await fetch(`${baseUrl}?model=${model}`, {
  method: "POST",
  body: offer.sdp,
  headers: {
    Authorization: `Bearer ${data.value}`,
    "Content-Type": "application/sdp",
  },
});
await pc.setRemoteDescription({ type: "answer", sdp: await sdpResponse.text() });
```
Creating an ephemeral token
---------------------------

To mint an ephemeral token, you'll build a small server application (or integrate with an existing one) to make an OpenAI REST API request for a client secret. Below is an example of a simple Node.js [express](https://expressjs.com/) server which mints an ephemeral key for the browser.
```javascript
try {
  const response = await fetch(
    "https://api.openai.com/v1/realtime/client_secrets",
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({ session: { type: "realtime", model: "gpt-realtime" } }),
    }
  );
  res.json(await response.json());
} catch (error) {
  res.status(500).send("Failed to mint ephemeral key");
}
```
This can be done with any web framework or language capable of making server-side HTTP requests. Just ensure that **you only use standard OpenAI API keys on the server, not in the browser**.
Sending and receiving events
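Whether carried over a WebRTC data channel or a WebSocket, Realtime events are JSON objects with a `type` field. This sketch builds a minimal `response.create` client event; the instructions text is an illustrative placeholder:

```python
import json

# Minimal client event asking the model to generate a response; the
# instructions string is an illustrative placeholder.
event = {
    "type": "response.create",
    "response": {"instructions": "Say hello to the user."},
}
wire = json.dumps(event)
print(json.loads(wire)["type"])  # response.create
```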
Check out the WebRTC Realtime API in the lightweight [openai-realtime-console](https://github.com/openai/openai-realtime-console) example app.
If you want to connect a phone number to the Realtime API, use a SIP trunking provider (e.g., Twilio). Configure a [webhook](/docs/guides/webhooks) for incoming calls at platform.openai.com, then point your SIP trunk at OpenAI's SIP endpoint.

When OpenAI receives SIP traffic associated with your project, the webhook that you configured will be fired. This webhook lets you accept or reject the call. When accepting the call, you'll provide the configuration for the Realtime session.
URIs used for interacting with Realtime API and SIP:

|URI|Value|
|---|---|
|SIP URI|`sip:$PROJECT_ID@sip.api.openai.com;transport=tls`|
|Accept URI|`https://api.openai.com/v1/realtime/calls/$CALL_ID/accept`|
|Reject URI|`https://api.openai.com/v1/realtime/calls/$CALL_ID/reject`|
|Refer URI|`https://api.openai.com/v1/realtime/calls/$CALL_ID/refer`|
|Events URI|`wss://api.openai.com/v1/realtime?call_id=$CALL_ID`|
Find your `$CALL_ID` in the `call_id` field of the data object present in the webhook. See the example below.
```python
import asyncio
import json
import os

import requests
import websockets
from flask import Flask, request, Response, jsonify, make_response
from openai import OpenAI, InvalidWebhookSignatureError

app = Flask(__name__)
client = OpenAI(
    webhook_secret=os.environ["OPENAI_WEBHOOK_SECRET"]
)
AUTH_HEADER = {
    "Authorization": "Bearer " + os.getenv("OPENAI_API_KEY")
}

async def websocket_task(call_id):
    try:
        async with websockets.connect(
            "wss://api.openai.com/v1/realtime?call_id=" + call_id,
            additional_headers=AUTH_HEADER,
        ) as websocket:
            # Send and receive Realtime events over this socket
            async for message in websocket:
                print(json.loads(message))
    except Exception as e:
        print(f"WebSocket error: {e}")

@app.route("/", methods=["POST"])
def webhook():
    event = client.webhooks.unwrap(request.data, request.headers)
    if event.type == "realtime.call.incoming":
        # Accept the call and attach a Realtime session to it
        requests.post(
            "https://api.openai.com/v1/realtime/calls/"
            + event.data.call_id
            + "/accept",
            headers={**AUTH_HEADER, "Content-Type": "application/json"},
            json={"type": "realtime", "model": "gpt-realtime"},
        )
        asyncio.run(websocket_task(event.data.call_id))
    return Response(status=200)
```
It's also possible to redirect the call to another number. During the call, make a POST to the `refer` endpoint:

|Field|Value|
|---|---|
|URL|`https://api.openai.com/v1/realtime/calls/$CALL_ID/refer`|
|Payload|JSON with one key, `target_uri`: the value used in the `Refer-To` header. You can use Tel-URI format.|
|Headers|`Authorization: Bearer YOUR_API_KEY`. Substitute YOUR_API_KEY with a standard API key.|
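Putting the table above together, a refer request can be sketched like this; the call ID and phone number are placeholders:

```python
# Placeholder values; a real call uses the call_id from your webhook and a
# standard API key.
call_id = "some_call_id"
url = f"https://api.openai.com/v1/realtime/calls/{call_id}/refer"
body = {"target_uri": "tel:+15550100"}  # Tel-URI format
headers = {"Authorization": "Bearer YOUR_API_KEY"}
print(url.endswith("/refer"))  # True
```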
```javascript
// Connect to a WebSocket for the in-progress call
const url = "wss://api.openai.com/v1/realtime?call_id=" + callId;
const ws = new WebSocket(url, {
  headers: {
    Authorization: "Bearer " + process.env.OPENAI_API_KEY,
  },
});
```
### With SIP
1. A user connects to OpenAI via phone over SIP.
2. OpenAI sends a webhook to your application’s backend webhook URL, notifying your app of the incoming call.
```text
POST https://my_website.com/webhook_endpoint
user-agent: OpenAI/1.0 (+https://platform.openai.com/docs/webhooks)
content-type: application/json
webhook-id: wh_685342e6c53c8190a1be43f081506c52 # unique id for idempotency
```
3. Your app can then open a WebSocket connection for the call using the `call_id` present in the webhook. The URL looks like this: `wss://api.openai.com/v1/realtime?call_id={callId}`.
Our most advanced speech-to-speech model is [gpt-realtime](/docs/models/gpt-realtime). For more information, see the [announcement blog post](https://openai.com/index/introducing-gpt-realtime/).
Update your session to use a prompt
----------------------
For more on prompting, see the [realtime prompting cookbook](https://cookbook.openai.com/examples/realtime_prompting_guide).
### General usage tips
#### 1\. Be precise. Kill conflicts.
You can include sample phrases for preambles to add variety and better tailor to your use case.
#### 9\. Use LLMs to improve your prompt.
This guide is long but not exhaustive! For more in a specific area, see the following resources:

* [Realtime prompting cookbook](https://cookbook.openai.com/examples/realtime_prompting_guide)
* [Inputs and outputs](/docs/guides/realtime-inputs-outputs): Text and audio input requirements and output formats
* [Managing conversations](/docs/guides/realtime-conversations): Learn to manage a conversation over a Realtime session
* [MCP servers](/docs/guides/realtime-mcp): How to use MCP servers to access additional tools in Realtime
* [Realtime transcription](/docs/guides/realtime-transcription): How to transcribe audio with the Realtime API
* [Voice agents](https://openai.github.io/openai-agents-js/guides/voice-agents/quickstart/): A quickstart for building voice agents with the Agents SDK
Build low-latency, multimodal LLM applications with the Realtime API.
The OpenAI Realtime API enables low-latency communication with [models](/docs/models) that natively support speech input and output.
Voice agents
------------
The recommended starting point for this class of applications is the [Agents SDK for TypeScript](https://openai.github.io/openai-agents-js/guides/voice-agents/quickstart/).
```js
import { RealtimeAgent, RealtimeSession } from "@openai/agents/realtime";

const agent = new RealtimeAgent({
  name: "Assistant",
  instructions: "You are a helpful assistant.",
});
```
[Follow the voice agent quickstart to build Realtime agents in the browser.](https://openai.github.io/openai-agents-js/guides/voice-agents/quickstart/)
To use the Realtime API directly outside the context of voice agents, check out the other connection options.
While building voice agents with the Agents SDK is the most convenient path for most use cases, you can also connect to the Realtime API directly. There are three primary supported interfaces for the Realtime API: WebRTC, WebSocket, and SIP.