Telnyx's inference API is OpenAI-compatible, so LlamaIndex's OpenAILike LLM works out of the box: point api_base at the Telnyx endpoint and authenticate with your Telnyx API key.

Setup

pip install llama-index-core llama-index-llms-openai-like

Usage

import os
from llama_index.llms.openai_like import OpenAILike
from llama_index.core.llms import ChatMessage

llm = OpenAILike(
    api_base="https://api.telnyx.com/v2/ai/openai",
    api_key=os.getenv("TELNYX_API_KEY"),
    model="moonshotai/Kimi-K2.6",
    is_chat_model=True,  # route requests to the chat completions endpoint
)

# Stream the response and print tokens as they arrive
response = llm.stream_chat([ChatMessage(role="user", content="Help me plan my vacation")])
for chunk in response:
    print(chunk.delta, end="")

RAG with Embeddings

Combine with Telnyx Embeddings for retrieval-augmented generation. See the Embeddings guide for document upload and indexing.
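Under the hood, retrieval ranks indexed documents by vector similarity to the query embedding, and the top matches are passed to the LLM as context. A minimal sketch of that ranking step, using small hand-written toy vectors in place of real Telnyx embeddings (the document names and vector values here are illustrative only):

```python
# Sketch of the retrieval step behind RAG: rank documents by cosine
# similarity between a query embedding and each document embedding.
# Toy 3-dimensional vectors stand in for real embedding API output.
from math import sqrt


def cosine(a, b):
    # Cosine similarity: dot product normalized by vector magnitudes
    dot = sum(x * y for x, y in zip(a, b))
    na = sqrt(sum(x * x for x in a))
    nb = sqrt(sum(y * y for y in b))
    return dot / (na * nb)


docs = {
    "beach packing list": [0.9, 0.1, 0.0],
    "mountain trail map": [0.1, 0.9, 0.1],
    "city museum hours": [0.0, 0.2, 0.9],
}

# Pretend this is the embedding of "what should I bring to the beach?"
query = [0.8, 0.2, 0.1]

# Pick the best-matching document; in a real pipeline its text would be
# prepended to the prompt sent to the LLM above.
best = max(docs, key=lambda name: cosine(query, docs[name]))
print(best)  # -> beach packing list
```

In practice the index handles this ranking for you; the sketch only shows why document and query embeddings must come from the same embedding model.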