POST /ai/openai/responses
JavaScript
import Telnyx from 'telnyx';

const client = new Telnyx({
  apiKey: process.env['TELNYX_API_KEY'], // This is the default and can be omitted
});

const response = await client.ai.openai.createResponse({
  conversation: '6a09cdc3-8948-47f0-aa62-74ac943d6c58',
  input: [{ role: 'user', content: [{ type: 'input_text', text: 'Hello, world!' }] }],
  instructions: 'You are a friendly chatbot.',
  model: 'zai-org/GLM-5.1-FP8',
  stream: true,
});

console.log(response);

Documentation Index

Fetch the complete documentation index at: https://developers.telnyx.com/llms.txt

Use this file to discover all available pages before exploring further.

Authorizations

Authorization (string, header, required)

Bearer authentication header of the form Bearer <token>, where <token> is your auth token.
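For illustration, the same header can be built without the SDK. This is a minimal sketch, assuming the standard `https://api.telnyx.com/v2` base URL and a `TELNYX_API_KEY` environment variable holding your auth token; `buildHeaders` is a hypothetical helper, not part of the Telnyx SDK.

```javascript
// Hypothetical helper: build the headers this endpoint expects,
// including the required Bearer authentication header.
const buildHeaders = (token) => ({
  Authorization: `Bearer ${token}`,
  'Content-Type': 'application/json',
});

// Usage sketch (network call shown for illustration only):
// const res = await fetch('https://api.telnyx.com/v2/ai/openai/responses', {
//   method: 'POST',
//   headers: buildHeaders(process.env.TELNYX_API_KEY),
//   body: JSON.stringify({
//     model: 'zai-org/GLM-5.1-FP8',
//     input: [{ role: 'user', content: [{ type: 'input_text', text: 'Hello' }] }],
//   }),
// });
```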

Body

application/json
model (string)

Model identifier to use for the response, for example zai-org/GLM-5.1-FP8 or another model available from the Telnyx OpenAI-compatible models endpoint.

input (any)

The input items for this turn, using the OpenAI Responses API input format.
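The shape of one input item can be seen in the request example above; the helper below is hypothetical (not part of the SDK) and just builds a single user turn in that format.

```javascript
// Hypothetical helper: build one user turn in the OpenAI Responses
// input format (a role plus an array of typed content parts).
function userTurn(text) {
  return { role: 'user', content: [{ type: 'input_text', text }] };
}

// An input array for one turn:
const input = [userTurn('Hello, world!')];
```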

conversation (string&lt;uuid&gt;)

Optional Telnyx Conversation ID from POST /ai/conversations. When provided, Telnyx stores this turn on that conversation and uses the conversation's prior messages as context. Reuse the same ID for subsequent turns and tool-result followups. Omit it for a non-persisted, stateless response.
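A persisted two-turn exchange can be sketched as follows. The conversation ID below is a placeholder (in practice it comes from POST /ai/conversations), and `turnBody` is a hypothetical helper, not part of the SDK.

```javascript
// Placeholder conversation ID; obtain a real one from POST /ai/conversations.
const conversationId = '6a09cdc3-8948-47f0-aa62-74ac943d6c58';

// Hypothetical helper: every turn in the thread reuses the same
// conversation ID, so Telnyx stores the turn on that conversation and
// supplies its prior messages as context.
function turnBody(text) {
  return {
    model: 'zai-org/GLM-5.1-FP8',
    conversation: conversationId,
    input: [{ role: 'user', content: [{ type: 'input_text', text }] }],
  };
}

const first = turnBody('Hello, world!');
const followUp = turnBody('What did I just say?');
// Both request bodies carry the same conversation ID.
```

Omitting the `conversation` field from `turnBody` would instead yield a stateless, non-persisted request each time.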

instructions (string)

Optional system/developer instructions for the model. When used with a persisted conversation, send these on the first request that creates the thread; subsequent turns can rely on the stored history.

stream (boolean)

Set to true to stream Server-Sent Events, matching OpenAI's Responses streaming format.
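When consuming the stream manually rather than through the SDK, each chunk is standard Server-Sent Events text. A minimal parsing sketch, assuming `data:` lines carry JSON events in OpenAI's Responses streaming format and a `[DONE]` sentinel ends the stream (the sample chunk below is illustrative only):

```javascript
// Parse one SSE chunk into an array of JSON event objects.
function parseSseChunk(chunk) {
  const events = [];
  for (const block of chunk.split('\n\n')) {
    const data = block
      .split('\n')
      .filter((line) => line.startsWith('data: '))
      .map((line) => line.slice('data: '.length))
      .join('\n');
    if (data && data !== '[DONE]') events.push(JSON.parse(data));
  }
  return events;
}

// Illustrative sample chunk (not captured from a live response):
const sample =
  'data: {"type":"response.output_text.delta","delta":"Hi"}\n\n' +
  'data: [DONE]\n\n';
console.log(parseSseChunk(sample)); // one delta event
```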

Response

Successful Response

The response is of type object.