Understanding how Edge Compute executes your functions helps you optimize performance and build reliable applications.

Overview

Edge Compute runs your functions in lightweight Linux containers. When a request arrives:
  1. Routing — Request is routed to the nearest edge location
  2. Container selection — An existing warm container handles the request, or a new one starts (cold start)
  3. Execution — Your function processes the request
  4. Response — Result is returned to the caller
  5. Keep-alive — Container stays warm for subsequent requests

Function Lifecycle

Cold Start Phase

When a function receives its first request or scales up, a new container initializes:
// Global initialization (runs once per container)
const { Pool } = require('pg');

// Expensive operations here — only run once
const pool = new Pool({ connectionString: process.env.DATABASE_URL });
const cache = {};

// Function handler (runs per request)
async function handler(req, res) {
    // Fast path — reuse initialized resources
    const result = await pool.query('SELECT * FROM users');
    res.json(result.rows);
}

module.exports = { handler };
Cold start timeline:
  1. Container image pulled (cached at edge)
  2. Runtime initializes (Python/Go/Node/Java)
  3. Global code executes (imports, connections)
  4. First request handled
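You can observe this lifecycle from inside your own code with a module-level flag (a minimal sketch — the `coldStart` flag and `handleRequest` shape are illustrative, not a platform API):

```javascript
// A module-level flag persists for the container's lifetime, so it is
// true only for the very first request each container serves.
let coldStart = true;

function handleRequest(path) {
    const wasCold = coldStart;
    coldStart = false; // all later requests in this container are warm
    return { path, cold: wasCold };
}
```

Logging `cold` per request gives you a rough measure of how often callers actually pay the cold start penalty.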

Warm Execution

Subsequent requests reuse the same container instance:
// Global variables persist between requests
const axios = require('axios');

const client = axios.create({
    timeout: 10000
});

async function handler(req, res) {
    // Fast execution — resources already initialized
    console.log(`Request: ${req.method} ${req.path}`);
    
    const resp = await client.get('https://api.example.com/data');
    res.json(resp.data);
}

Container Recycling

Containers are recycled when:
  • They sit idle for an extended period (to free resources)
  • Memory usage approaches the container limit
  • A new deployment ships
  • The platform makes scaling decisions
Don’t rely on container persistence for critical state. Use KV or external storage for data that must survive restarts.
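To see why: each container gets its own copy of any in-memory "global" state. A toy simulation (container instances modeled as closures — not platform code) shows a counter diverging as soon as traffic is split across containers:

```javascript
// Each container holds its own copy of module-level state; a counter
// kept there diverges across containers and vanishes on recycle.
function makeContainer() {
    let requestCount = 0; // per-container, lost when the container recycles
    return { handle: () => ++requestCount };
}
```

With two containers each taking part of the traffic, neither counter reflects the true total — which is why durable counts belong in KV or external storage.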

Cold Start Optimization

Minimize cold start latency with these patterns:

1. Lazy Initialization

Defer expensive operations until needed:
// Bad — always initializes
const mlModel = loadModel('large_model.pkl'); // 2 seconds

async function handler(req, res) {
    const result = mlModel.predict(req.body);
    res.json(result);
}
// Good — only loads when needed
let mlModel = null;

function getModel() {
    if (!mlModel) {
        mlModel = loadModel('large_model.pkl');
    }
    return mlModel;
}

async function handler(req, res) {
    if (req.path === '/predict') {
        const result = getModel().predict(req.body);
        return res.json(result);
    }
    res.json({ status: 'ok' });
}

2. Minimize Dependencies

// Bad — imports everything
const _ = require('lodash');
const moment = require('moment');

// Good — import only what you need
const pick = require('lodash/pick');
// Or use native: Object.fromEntries, Date
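If you drop lodash entirely, `pick` is easy to replicate with natives (a sketch covering the common case, not every lodash edge case):

```javascript
// Minimal native equivalent of lodash/pick: keep only the listed keys
// that actually exist on the object.
function pick(obj, keys) {
    return Object.fromEntries(
        Object.entries(obj).filter(([key]) => keys.includes(key))
    );
}
```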

3. Connection Pooling

// Initialize pool globally
const { Pool } = require('pg');

const pool = new Pool({
    connectionString: process.env.DATABASE_URL,
    max: 10,
    idleTimeoutMillis: 30000
});

async function handler(req, res) {
    const client = await pool.connect();
    try {
        const result = await client.query('SELECT * FROM users');
        res.json(result.rows);
    } finally {
        client.release();
    }
}

Concurrency

Each container handles one request at a time by default. The platform automatically scales containers based on traffic:
Traffic: 100 requests/second
  → Platform scales to ~100 containers
  → Each container handles ~1 request/second
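The scaling arithmetic above is Little's law: with one in-flight request per container, the containers needed are roughly arrival rate × mean request duration (a back-of-the-envelope estimate, not the platform's actual scheduler):

```javascript
// Containers needed ≈ requests/second × mean seconds per request,
// because each container serves one request at a time.
function containersNeeded(requestsPerSecond, meanDurationSeconds) {
    return Math.ceil(requestsPerSecond * meanDurationSeconds);
}
```

This is also why fast handlers matter: at 100 req/s, 1-second handlers need ~100 containers, while 100 ms handlers need only ~10.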

Scaling Behavior

Traffic Pattern     Platform Response
Traffic spike       New containers start (cold starts)
Sustained load      Containers stay warm
Traffic drops       Containers gradually recycle
Zero traffic        All containers recycle after idle timeout

Request Timeouts

Functions have execution time limits:
Tier        Timeout
Default     30 seconds
Extended    60 seconds (configurable)
Handle timeouts gracefully:
async function handler(req, res) {
    // Set timeout for 25 seconds (5 sec buffer before platform timeout)
    const timeout = new Promise((_, reject) => 
        setTimeout(() => reject(new Error('Timeout')), 25000)
    );
    
    try {
        const result = await Promise.race([
            longRunningOperation(),
            timeout
        ]);
        res.json(result);
    } catch (err) {
        if (err.message === 'Timeout') {
            // getPartialResult() is an app-defined helper for returning partial work
            res.json({ error: 'Operation timed out', partial: getPartialResult() });
        } else {
            throw err;
        }
    }
}

Triggers

Functions can be invoked by:

HTTP Requests

# func.toml
[edge_compute]
func_name = "my-api"

# Accessible at: https://my-api-{orgId}.telnyxcompute.com

Webhooks

Configure Telnyx services to call your function:
const express = require('express');
const app = express();

app.use(express.json());

app.post('/webhook', (req, res) => {
    const event = req.body;
    
    if (event.event_type === 'message.received') {
        handleIncomingMessage(event.data);
    } else if (event.event_type === 'call.initiated') {
        handleCall(event.data);
    }
    
    res.json({ status: 'ok' });
});

Cron Triggers (Coming Soon)

🔜 Scheduled execution via cron expressions is planned for a future release.

Graceful Shutdown

When containers recycle, your function receives a SIGTERM signal:
process.on('SIGTERM', () => {
    console.log('Shutting down...');
    
    // Clean up resources
    pool.end();
    cache.flush();
    
    process.exit(0);
});

Best Practices

  1. Initialize globally — Move expensive setup outside request handlers
  2. Keep handlers fast — Aim for < 100ms p99 latency
  3. Use connection pools — Reuse database/HTTP connections
  4. Handle errors gracefully — Return meaningful error responses
  5. Don’t store state in memory — Use KV or external storage for persistence
  6. Set appropriate timeouts — On outbound requests to prevent hanging
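For item 6, one generic pattern (a sketch — `withTimeout` is not a platform helper) wraps any outbound promise with a deadline so a hung upstream call fails fast instead of consuming the function's whole execution time limit:

```javascript
// Reject if `promise` does not settle within `ms` milliseconds; the
// timer is always cleared so it cannot keep the event loop busy.
function withTimeout(promise, ms, label = 'operation') {
    let timer;
    const deadline = new Promise((_, reject) => {
        timer = setTimeout(
            () => reject(new Error(`${label} timed out after ${ms}ms`)),
            ms
        );
    });
    return Promise.race([promise, deadline]).finally(() => clearTimeout(timer));
}
```

Usage: `await withTimeout(client.get(url), 5000, 'api.example.com')` inside a handler turns a hanging request into a catchable error.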

Next Steps

  • Bindings — Connect to Telnyx platform services
  • Limits — Understand resource constraints
  • Configuration — Set environment variables and secrets