
18 results found for openai function calling


const INSTRUCTIONS = `
the user in English and tell them that they're using the OpenAI Realtime API, powered by the {{
Give them a very brief summary of the benefits of the Realtime API based on the details below,
When the user says goodbye or you think the call is over, say a brief goodbye and then
invoke the end_call function.
---
5 model offers more reliable instruction following, tool calling, and multilingual accuracy.
Specifically, the model delivers a +5% intelligence lift on Big Bench Audio, which measures re
const HANGUP_TOOL = {
type: "function",
name: "end_call",
description: `
Use this function to hang up the call when the user says goodbye
or otherwise indicates they are about to end the call.`,
// Builds the declarative session configuration for a Realtime API session.
export function makeSession(modelOverride?: string) {
const model = modelOverride || MODEL;
// AgentHandler implements the agent runtime behavior (tool calling, etc).
class AgentHandler implements ObserverHandler {
// Creates the runtime handler for the session.
export function createHandler(client: ObserverClient, callId: string) {
return new AgentHandler(client, callId);
You can chat with LLMs over email; the email thread functions as memory. The biggest thing is th
ke interface with LLMs. Pair that with back-end data and functions and you've got something really
### Tooling
* LLMs can use [tools](https://platform.openai.com/docs/guides/function-calling), meaning you c
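The tool-calling pattern linked above can be sketched in a few lines. This is a hedged illustration, not code from any snippet on this page: the tool name `get_current_weather`, its parameter schema, and the sample arguments string are all assumptions, and no API call is made.

```typescript
// Shape of a "function" tool definition as passed to the OpenAI chat API,
// and parsing of the arguments string the model returns for a tool call.
type FunctionTool = {
  type: "function";
  function: {
    name: string;
    description: string;
    parameters: Record<string, unknown>; // JSON Schema describing the arguments
  };
};

const getWeatherTool: FunctionTool = {
  type: "function",
  function: {
    name: "get_current_weather",
    description: "Get the current weather for a location",
    parameters: {
      type: "object",
      properties: {
        location: { type: "string", description: "City and state, e.g. Boston, MA" },
        unit: { type: "string", enum: ["celsius", "fahrenheit"] },
      },
      required: ["location"],
    },
  },
};

// The model returns tool arguments as a JSON *string*; the caller parses it.
const rawArguments = '{ "location": "Boston, MA", "unit": "celsius" }';
const args = JSON.parse(rawArguments) as { location: string; unit?: string };
console.log(args.location); // "Boston, MA"
```

The key point is that `parameters` is plain JSON Schema and `arguments` arrives as a string, so every tool-calling loop ends up doing a `JSON.parse` before dispatching.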
import { OpenAI } from "npm:openai";
const openai = new OpenAI();
const functionExpression = await openai.chat.completions.create({
"messages": [
],
"functions": [
{
});
console.log(functionExpression);
// TODO pull out function call and initial message
let args = functionExpression.choices[0].message.function_call.arguments;
let functionCallResult = { "temperature": "22", "unit": "celsius", "description": "Sunny" };
const result = await openai.chat.completions.create({
"messages": [
"content": null,
"function_call": { "name": "get_current_weather", "arguments": "{ \"location\": \"Boston,
},
{
"role": "function",
"name": "get_current_weather",
"content": JSON.stringify(functionCallResult),
},
],
"functions": [
{
Migrated from folder: External_APIs/openai/function_calling/gpt4FunctionCallingExample
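The gpt4FunctionCallingExample snippet above uses the legacy `functions`/`function_call` API: the model emits a `function_call`, the app runs the function locally, then appends the result as a `role: "function"` message and calls the API again. A minimal sketch of that second round trip, using the same mock weather payload as the snippet (the user question is an assumption, and no real API call is made):

```typescript
// Messages for the follow-up chat.completions.create call in the legacy
// `functions` API: user question, the model's function_call turn, and the
// locally computed function result fed back as a role:"function" message.
type ChatMessage = {
  role: "user" | "assistant" | "function";
  content: string | null;
  name?: string;
  function_call?: { name: string; arguments: string };
};

const functionCallResult = { temperature: "22", unit: "celsius", description: "Sunny" };

const messages: ChatMessage[] = [
  { role: "user", content: "What is the weather in Boston?" },
  {
    role: "assistant",
    content: null, // content is null when the model chose a function call
    function_call: {
      name: "get_current_weather",
      arguments: JSON.stringify({ location: "Boston, MA" }),
    },
  },
  {
    role: "function",
    name: "get_current_weather",
    content: JSON.stringify(functionCallResult),
  },
];
console.log(messages.length); // 3
```

This `messages` array is what the snippet's second `openai.chat.completions.create` call receives, letting the model phrase the weather result as a natural-language answer.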
export default async function(req: Request) {
if (req.method !== "POST") {
console.log("Calling Groq TTS API...");
// Call Groq Speech API
const response = await fetch("https://api.groq.com/openai/v1/audio/speech", {
method: "POST",
export default async function(req: Request): Promise<Response> {
// Read secrets from x-secrets header
const apiKey = __secrets['openai_api_key'];
if (!apiKey) {
console.error('OpenAI API key not found in environment');
return Response.json(
{ error: 'OpenAI API key not configured' },
{ status: 500 }
console.log('Calling OpenAI API with prompt:', prompt);
// Call OpenAI API
const response = await fetch('https://api.openai.com/v1/chat/completions', {
method: 'POST',
const error = await response.text();
console.error('OpenAI API error:', response.status, error);
return Response.json(
{ error: `OpenAI API error: ${response.status} - ${error}` },
{ status: 500 }
const data = await response.json();
console.log('OpenAI response received');
import { AgentExecutor, createToolCallingAgent } from 'npm:langchain/agents'
import { ChatPromptTemplate } from 'npm:@langchain/core/prompts'
import { OpenAI } from "https://esm.town/v/std/openai";
import { WikipediaQueryRun } from 'npm:@langchain/community/tools/wikipedia_query_run'
export async function agentExample() {
const llm = new OpenAI()
const agent = createToolCallingAgent({ llm: llm, tools: tools, prompt })
const agentExecutor = new AgentExecutor({ agent, tools })
# askSMHI
Using OpenAI chat completion with function calls to [SMHI](https://en.wikipedia.org/wiki/Swedish
* [SMHI, forecast documentation](https://opendata.smhi.se/apidocs/metfcst/get-forecast.html)
* [OpenAI, GPT function calling documentation](https://platform.openai.com/docs/guides/functio
2. Send the question to the OpenAI moderation API
3. Create the tool call by converting the schema to JSON Schema
4. Send the question to OpenAI Chat Completions and expose the tool call
5. Make the API call to the SMHI API with parameters from OpenAI
## Environment variables
* OPENAI_CHAT: Needs to be authorized for chat completions and the moderation API.
## Packages used
* openai: For type-safe API requests and responses
* valibot: For describing the SMHI API response and function API input
* valibot/to-json-schema: Transforms the schema to JSON Schema (readable by the GPT API)
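Steps 3–5 above can be sketched without the real valibot, SMHI, or OpenAI calls. In this hedged sketch the JSON Schema is written by hand to stand in for the valibot-derived one, and `smhiUrl` builds the SMHI point-forecast URL from the model's tool-call arguments; the tool name `get_forecast` and the parameter names `lat`/`lon` are illustrative assumptions.

```typescript
// Step 3: a JSON-Schema tool definition for the forecast function, of the
// kind valibot/to-json-schema would produce from a valibot schema.
const forecastTool = {
  type: "function",
  function: {
    name: "get_forecast",
    description: "Fetch an SMHI point forecast for a coordinate",
    parameters: {
      type: "object",
      properties: {
        lat: { type: "number", description: "Latitude, WGS84" },
        lon: { type: "number", description: "Longitude, WGS84" },
      },
      required: ["lat", "lon"],
    },
  },
};

// Step 5: build the SMHI point-forecast URL from the tool-call arguments
// (endpoint shape per SMHI's metfcst API docs linked above).
function smhiUrl(args: { lat: number; lon: number }): string {
  return `https://opendata-download-metfcst.smhi.se/api/category/pmp3g/version/2/geotype/point/lon/${args.lon}/lat/${args.lat}/data.json`;
}

console.log(smhiUrl({ lat: 59.33, lon: 18.07 }));
```

Generating the schema from valibot (rather than writing it twice) keeps the tool definition and the runtime validation of the model's arguments in sync.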
import { OpenAI } from "https://esm.town/v/std/openai?v=4";
const openai = new OpenAI();
async function runConversation() {
const inputWord = "almond latte";
const response = await openai.chat.completions.create({
messages: [
// for (let i = 0; i < message.tool_calls.length; i++) {
// console.log("[CALLING]", message.tool_calls[i].function);
// const tool = toolbox[message.tool_calls[i].function.name];
// if (tool) {
// const result = await tool.call(JSON.parse(message.tool_calls[i].function.arguments));
// console.log("[RESULT]", truncate(result));
Migrated from folder: openai_function_calling/grayWildfowl
Migrated from folder: openai_function_calling/fetchWebpage
Migrated from folder: openai_function_calling/weatherOfLatLon
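The commented-out loop above dispatches each `tool_calls` entry to a local "toolbox" keyed by function name. A minimal runnable sketch of that pattern, with a fake tool and no OpenAI call (the toolbox contents and the `get_time` tool are assumptions inferred from the commented code):

```typescript
// Shape of one entry in message.tool_calls from a chat completion.
type ToolCall = { function: { name: string; arguments: string } };

// Local toolbox: function name -> implementation. Each tool takes the
// parsed arguments object and returns a string result for the model.
const toolbox: Record<string, (args: unknown) => Promise<string>> = {
  get_time: async () => new Date().toISOString(),
};

// Dispatch every tool call the model requested, skipping unknown names,
// and collect the results to send back as tool messages.
async function dispatch(toolCalls: ToolCall[]): Promise<string[]> {
  const results: string[] = [];
  for (const call of toolCalls) {
    const tool = toolbox[call.function.name];
    if (tool) {
      results.push(await tool(JSON.parse(call.function.arguments)));
    }
  }
  return results;
}

const out = await dispatch([{ function: { name: "get_time", arguments: "{}" } }]);
console.log(out.length); // 1
```

Silently skipping unknown tool names (as the original `if (tool)` guard does) is one design choice; returning an error string to the model instead lets it recover by picking a different tool.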
