Create a response using Telnyx’s OpenAI-compatible Responses API. This endpoint is compatible with the OpenAI Responses API and may be used with the OpenAI JS or Python SDK by setting the base URL to https://api.telnyx.com/v2/ai/openai.
The conversation parameter refers to a Telnyx Conversation rather than an OpenAI-hosted conversation object. To persist a thread across turns, first create a conversation with POST /ai/conversations, then pass that conversation’s id in the Responses request as conversation. The endpoint appends the new input, assistant output, reasoning, and tool-call messages to that conversation. Reuse the same conversation id on subsequent Responses requests, including tool-result followups, so the model receives the prior context.
If conversation is omitted, the request is processed without persisting messages to a Telnyx conversation. Use the Conversations API to manage history: list conversations (optionally filtered by metadata), fetch messages for a conversation, and optionally add messages outside the Responses flow.
You can attach arbitrary metadata when creating a conversation (for example to tag the conversation’s source, channel, or user) and later filter by it when listing conversations.
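The persisted-conversation flow above can be sketched as follows. These helpers only build the JSON bodies for the POST /ai/conversations call and the Responses request; sending them with a Bearer-authenticated HTTP client or the SDK is left to the caller, and any field shape beyond the parameters documented on this page is an assumption.

```typescript
// Sketch of the two-step flow: create a conversation once, then reuse its
// id on every Responses request. Body shapes beyond the documented
// parameters (conversation, model, input, instructions) are assumptions.

// Body for POST /ai/conversations. Metadata tags the thread (for example
// by channel or user) so it can be found later when listing conversations.
function conversationCreateBody(metadata: Record<string, string>) {
  return { metadata };
}

// Body for a Responses request that continues a persisted thread.
// `instructions` is only needed on the first turn; later turns can rely
// on the history stored on the conversation.
function responseBody(conversationId: string, userText: string, instructions?: string) {
  return {
    conversation: conversationId,
    model: 'zai-org/GLM-5.1-FP8',
    input: [{ role: 'user' as const, content: [{ type: 'input_text' as const, text: userText }] }],
    ...(instructions ? { instructions } : {}),
  };
}

// First turn creates the thread with instructions; the follow-up reuses
// the same conversation id and omits them.
const first = responseBody(
  '6a09cdc3-8948-47f0-aa62-74ac943d6c58',
  'Hello, world!',
  'You are a friendly chatbot.',
);
const followUp = responseBody('6a09cdc3-8948-47f0-aa62-74ac943d6c58', 'Tell me more.');
```

Reusing the same id on the follow-up is what gives the model the prior context; a fresh id (or no `conversation` at all) starts a stateless turn.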
import Telnyx from 'telnyx';

const client = new Telnyx({
  apiKey: process.env['TELNYX_API_KEY'], // This is the default and can be omitted
});

const response = await client.ai.openai.createResponse({
  conversation: '6a09cdc3-8948-47f0-aa62-74ac943d6c58',
  input: [{ role: 'user', content: [{ type: 'input_text', text: 'Hello, world!' }] }],
  instructions: 'You are a friendly chatbot.',
  model: 'zai-org/GLM-5.1-FP8',
  stream: true,
});
console.log(response);
Bearer authentication header of the form Bearer <token>, where <token> is your auth token.
Model identifier to use for the response, for example zai-org/GLM-5.1-FP8 or another model available from the Telnyx OpenAI-compatible models endpoint.
The input items for this turn, using the OpenAI Responses API input format.
Optional Telnyx Conversation ID from POST /ai/conversations. When provided, Telnyx stores this turn on that conversation and uses the conversation's prior messages as context. Reuse the same ID for subsequent turns and tool-result followups. Omit it for a non-persisted, stateless response.
Optional system/developer instructions for the model. When used with a persisted conversation, send these on the first request that creates the thread; subsequent turns can rely on the stored history.
Set to true to stream Server-Sent Events, matching OpenAI's Responses streaming format.
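When stream is true, the body arrives as Server-Sent Events. A minimal sketch of pulling event payloads out of an SSE chunk, assuming each event carries its JSON on a `data:` line (a `data: [DONE]` sentinel, which some OpenAI-compatible streams emit, is handled defensively):

```typescript
// Minimal SSE payload extractor (a sketch, not Telnyx's own parser).
// Assumes each event's JSON arrives on a `data:` line; `event:` and blank
// lines are skipped, and a `data: [DONE]` sentinel ends the stream.
function parseSseChunk(chunk: string): Array<Record<string, unknown>> {
  const parsed: Array<Record<string, unknown>> = [];
  for (const line of chunk.split('\n')) {
    const trimmed = line.trim();
    if (!trimmed.startsWith('data:')) continue; // skip `event:` names and blank lines
    const payload = trimmed.slice('data:'.length).trim();
    if (payload === '[DONE]') break;
    parsed.push(JSON.parse(payload));
  }
  return parsed;
}

const events = parseSseChunk(
  'event: response.output_text.delta\n' +
    'data: {"type":"response.output_text.delta","delta":"Hello"}\n\n',
);
console.log(events[0]?.delta); // "Hello"
```

In practice you would feed each decoded network chunk through a parser like this (buffering partial lines across chunks) and append the text deltas as they arrive.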
Successful Response
The response is of type object.