Providers

OpenAI Providers

Use Composio with OpenAI's Responses and Chat Completion APIs

The OpenAI Provider is the default provider for the Composio SDK. It transforms Composio tools into a format compatible with OpenAI's function calling capabilities through both the Responses and Chat Completion APIs.

Setup

By default, the OpenAI Provider is installed when you install the Composio SDK. You can also install it manually:

pip install composio_openai
npm install @composio/openai

Responses API

The Responses API is the recommended way to build more agentic flows with the OpenAI API. Read more about it in the OpenAI documentation.

Before executing any tools that require authentication (like Gmail), you'll need to:

  1. Create an Auth Configuration for your integration.
  2. Set up a Connected Account for the user, as sketched below.
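For reference, here is a minimal TypeScript sketch of that setup. It assumes the toolkits.authorize helper and the waitForConnection method on the returned connection request; check the auth configuration docs for the exact flow and options.

import { Composio } from '@composio/core';

const composio = new Composio();
const userId = "your-user-id";

// Start an OAuth connection for the Gmail toolkit on behalf of this user
// (assumes an auth config for Gmail already exists in your Composio project).
const connectionRequest = await composio.toolkits.authorize(userId, "gmail");

// Send the user to this URL to grant access.
console.log("Authorize here:", connectionRequest.redirectUrl);

// Wait until the user completes the OAuth flow and the account is connected.
await connectionRequest.waitForConnection();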
Python

from openai import OpenAI
from composio import Composio
from composio_openai import OpenAIResponsesProvider

# Initialize Composio client with OpenAI Provider
composio = Composio(provider=OpenAIResponsesProvider())
openai = OpenAI()

# Make sure to create an auth config and a connected account for the user with gmail toolkit
# Make sure to replace "your-user-id" with the actual user ID
user_id = "your-user-id"

tools = composio.tools.get(user_id=user_id, tools=["GMAIL_SEND_EMAIL"])

response = openai.responses.create(
    model="gpt-5",
    tools=tools,
    input=[
        {
            "role": "user",
            "content": "Send an email to soham.g@composio.dev with the subject 'Running OpenAI Provider snippet' and body 'Hello from the code snippet in openai docs'"
        }
    ]
)

# Execute the function calls
result = composio.provider.handle_tool_calls(response=response, user_id=user_id)
print(result)
TypeScript

import OpenAI from 'openai';
import { Composio } from '@composio/core';
import { OpenAIResponsesProvider } from '@composio/openai';

// Initialize Composio client with OpenAI Provider
const composio = new Composio({
  provider: new OpenAIResponsesProvider(),
});
const openai = new OpenAI({});

// Make sure to create an auth config and a connected account for the user with the gmail toolkit
// Make sure to replace "your-user-id" with the actual user ID
const userId = "your-user-id";

async function main() {
  try {
    const tools = await composio.tools.get(userId, { tools: ["GMAIL_SEND_EMAIL"] });

    const response = await openai.responses.create({
      model: "gpt-5",
      tools: tools,
      input: [
        {
          role: "user",
          content:
            "Send an email to soham.g@composio.dev with the subject 'Running OpenAI Provider snippet' and body 'Hello from the code snippet in openai docs'",
        },
      ],
    });

    // Execute the function calls
    const result = await composio.provider.handleToolCalls(userId, response.output);
    console.log(result);
  } catch (error) {
    console.error('Error:', error);
  }
}

main();
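The function call outputs returned by handleToolCalls can be sent back to the Responses API so the model can produce a final reply. Here is a minimal sketch of that second turn, continuing the TypeScript example above; it assumes the first response was stored (the default), so previous_response_id can link the two calls.

// Second turn: give the model the tool results and let it answer.
const followUp = await openai.responses.create({
  model: "gpt-5",
  previous_response_id: response.id,
  input: result, // the function_call_output items produced by handleToolCalls
  tools: tools,
});

console.log(followUp.output_text);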

Chat Completion API

The Chat Completion API generates a model response from a list of messages. Read more about it in the OpenAI documentation. The OpenAI Chat Provider is the default provider for the Composio SDK, but you can also initialize it explicitly.

Before executing any tools that require authentication (like Gmail), you'll need to:

  1. Create an Auth Configuration for your integration.
  2. Set up a Connected Account for the user (see the connection sketch in the Responses API section above).
Python

from openai import OpenAI
from composio import Composio
from composio_openai import OpenAIProvider

# Initialize Composio client with OpenAI Provider
composio = Composio(provider=OpenAIProvider())
openai = OpenAI()

# Make sure to create an auth config and a connected account for the user with gmail toolkit
# Make sure to replace "your-user-id" with the actual user ID
user_id = "your-user-id"

tools = composio.tools.get(user_id=user_id, tools=["GMAIL_SEND_EMAIL"])

response = openai.chat.completions.create(
    model="gpt-5",
    tools=tools,
    messages=[
        {"role": "user", "content": "Send an email to soham.g@composio.dev with the subject 'Running OpenAI Provider snippet' and body 'Hello from the code snippet in openai docs'"},
    ],
)

# Execute the function calls
result = composio.provider.handle_tool_calls(response=response, user_id=user_id)
print(result)
TypeScript

import OpenAI from 'openai';
import { Composio } from '@composio/core';
import { OpenAIProvider } from '@composio/openai';

// Initialize Composio client with OpenAI Provider
const composio = new Composio({
  provider: new OpenAIProvider(),
});
const openai = new OpenAI();

// Make sure to create an auth config and a connected account for the user with the gmail toolkit
// Make sure to replace "your-user-id" with the actual user ID
const userId = "your-user-id";

async function main() {
  try {
    const tools = await composio.tools.get(userId, { tools: ["GMAIL_SEND_EMAIL"] });

    const response = await openai.chat.completions.create({
      model: "gpt-5",
      tools: tools,
      messages: [
        {
          role: "user",
          content:
            "Send an email to soham.g@composio.dev with the subject 'Running OpenAI Provider snippet' and body 'Hello from the code snippet in openai docs'",
        },
      ],
    });

    // Execute the function calls
    const result = await composio.provider.handleToolCalls(userId, response);
    console.log(result);
  } catch (error) {
    console.error('Error:', error);
  }
}

main();
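The tool messages returned by handleToolCalls can be appended to the conversation, together with the assistant message that requested the tool calls, and sent back for a final reply. A minimal sketch of that follow-up call, reusing the variables from the example above:

// Second turn: append the assistant tool-call message and the tool results,
// then ask the model for its final reply.
const followUp = await openai.chat.completions.create({
  model: "gpt-5",
  tools: tools,
  messages: [
    { role: "user", content: "Send an email to soham.g@composio.dev ..." }, // same user message as above
    response.choices[0].message, // assistant message containing the tool calls
    ...result, // tool result messages from handleToolCalls
  ],
});

console.log(followUp.choices[0].message.content);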

Modifiers

Modifiers are functions that intercept and optionally modify a tool's schema, the tool call request, and the response from the tool call.

OpenAI provider modifiers are the standard framework modifiers. Read more here: Modifying tool schemas
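For example, a schema modifier can be attached when fetching a tool, letting you adjust its description or parameters before the model sees it. A minimal TypeScript sketch following the modifySchema option of tools.get (the replacement description string is illustrative):

import { Composio } from '@composio/core';
import { OpenAIProvider } from '@composio/openai';

const composio = new Composio({ provider: new OpenAIProvider() });

// Fetch a single tool and tweak its schema before it is handed to the model.
const tools = await composio.tools.get("your-user-id", "GMAIL_SEND_EMAIL", {
  modifySchema: (toolSlug, toolkitSlug, schema) => ({
    ...schema,
    description: "Send an email from the user's connected Gmail account.",
  }),
});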