Python SDK

Use the official Lexa Python SDK to integrate Lexa AI into your applications with an OpenAI-compatible interface.

Quick Start

Installation

pip install lexa

Basic Usage

from lexa_sdk import Lexa

# Initialize the client with your API key
client = Lexa(api_key="your-api-key")

# Simple chat completion
response = client.chat.completions.create(
    model="lexa-mml",
    messages=[
        {"role": "user", "content": "Hello! Tell me a joke."}
    ],
    temperature=0.7,
    max_tokens=100
)

print(response["choices"][0]["message"]["content"])
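Since the response is a plain dict, indexing deeply can raise `KeyError` on unexpected shapes. A small helper (not part of the SDK, just a defensive sketch using the response shape shown above) makes extraction safe:

```python
# Hedged helper (not an SDK API): safely pull the first message's content
# out of an OpenAI-style response dict, returning None instead of raising
# when a key is missing.
def first_message_content(response):
    choices = response.get("choices") or []
    if not choices:
        return None
    return choices[0].get("message", {}).get("content")

# Example with the response shape shown above:
sample = {"choices": [{"message": {"role": "assistant", "content": "Hi!"}}]}
print(first_message_content(sample))  # Hi!
print(first_message_content({}))     # None
```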

Features

  • OpenAI-Compatible - Drop-in replacement for OpenAI SDK
  • Async Support - Full async/await support for high-performance applications
  • Type Safety - Comprehensive type hints and validation with Pydantic
  • Streaming - Real-time streaming responses for interactive applications
  • Auto SSL - Automatic SSL certificate handling - works out of the box
  • Multiple Models - Support for all Lexa models (lexa-mml, lexa-x1, lexa-rho)
  • Flexible Configuration - Optional SSL and configuration overrides
  • High Performance - Optimized HTTP clients with connection pooling

Available Models

The SDK provides access to the following Lexa models:

lexa-mml

Multimodal model with vision capabilities (8,192 context window, 4,096 max tokens)

lexa-x1

Fast, lightweight text-based model (4,096 context window, 2,048 max tokens)

lexa-rho

Reasoning model with enhanced capabilities (16,384 context window, 8,192 max tokens)
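The limits above are worth keeping in a small lookup table if you want to validate `max_tokens` before calling the API. The numbers below are taken directly from this page, but the table itself is illustrative, not something the SDK exports:

```python
# Illustrative lookup (not exported by the SDK): context window and
# maximum output tokens per model, as documented above.
LEXA_MODEL_LIMITS = {
    "lexa-mml": {"context_window": 8192, "max_tokens": 4096},
    "lexa-x1": {"context_window": 4096, "max_tokens": 2048},
    "lexa-rho": {"context_window": 16384, "max_tokens": 8192},
}

def clamp_max_tokens(model, requested):
    # Cap a requested completion length at the model's documented maximum.
    limit = LEXA_MODEL_LIMITS[model]["max_tokens"]
    return min(requested, limit)

print(clamp_max_tokens("lexa-x1", 5000))  # 2048
```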

Usage Examples

Basic Chat Completion

from lexa_sdk import Lexa

client = Lexa(api_key="your-api-key")

response = client.chat.completions.create(
    model="lexa-mml",
    messages=[
        {"role": "user", "content": "Hello! Tell me a joke."}
    ],
    temperature=0.7,
    max_tokens=100
)

print(response["choices"][0]["message"]["content"])

Async Support

import asyncio
from lexa_sdk import Lexa

async def main():
    client = Lexa(api_key="your-api-key")

    # Async chat completion
    response = await client.chat.completions.acreate(
        model="lexa-mml",
        messages=[{"role": "user", "content": "Explain quantum computing"}],
        temperature=0.3
    )

    print(response["choices"][0]["message"]["content"])

asyncio.run(main())

Streaming Responses

from lexa_sdk import Lexa

client = Lexa(api_key="your-api-key")

# Streaming chat completion
stream = client.chat.completions.create(
    model="lexa-mml",
    messages=[{"role": "user", "content": "Write a short story"}],
    temperature=0.8,
    stream=True
)

for chunk in stream:
    if chunk["choices"][0]["delta"].get("content"):
        print(chunk["choices"][0]["delta"]["content"], end="", flush=True)
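If you need the full text rather than incremental prints, the loop above generalizes to a small accumulator. This helper is not part of the SDK; it simply joins the delta chunks using the chunk shape shown in the loop:

```python
# Hedged helper (not an SDK API): join streamed delta chunks into the
# complete completion text.
def collect_stream(chunks):
    parts = []
    for chunk in chunks:
        delta = chunk["choices"][0]["delta"]
        content = delta.get("content")
        if content:
            parts.append(content)
    return "".join(parts)

# Works the same on the real stream object or on mock chunks:
mock = [
    {"choices": [{"delta": {"content": "Once "}}]},
    {"choices": [{"delta": {"content": "upon a time"}}]},
    {"choices": [{"delta": {}}]},  # the final chunk may carry no content
]
print(collect_stream(mock))  # Once upon a time
```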

List Available Models

import asyncio
from lexa_sdk import Lexa

client = Lexa(api_key="your-api-key")

# List available models
models = client.models.list()
print(models)

# Async variant
async def get_models():
    models = await client.models.alist()
    print(models)

asyncio.run(get_models())

Advanced Configuration

Custom SSL Configuration

from lexa_sdk import Lexa

# For environments with SSL issues (not recommended for production)
client = Lexa(
    api_key="your-api-key",
    verify_ssl=False  # ⚠️  Only use if necessary
)

# Or use enhanced SSL (default behavior)
client = Lexa(
    api_key="your-api-key",
    enhanced_ssl=True  # Automatically download and use correct certificates
)
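For cases the built-in flags don't cover, you can always construct a verifying SSL context yourself with the standard library. This is a generic stdlib sketch, not an SDK API; whether Lexa accepts a custom context is something to confirm against the SDK's reference before relying on it:

```python
import ssl

# Generic stdlib sketch (not an SDK API): build a default, fully
# verifying SSL context.
ctx = ssl.create_default_context()
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True
print(ctx.check_hostname)                    # True

# To trust an additional internal CA alongside the system bundle:
# ctx.load_verify_locations(cafile="/path/to/internal-ca.pem")
```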

API Reference

Client Methods

  • client.chat.completions.create() - Create chat completion
  • client.chat.completions.acreate() - Async chat completion
  • client.models.list() - List available models
  • client.models.alist() - Async list models

Parameters

  • model: Model to use (required)
  • messages: List of messages (required)
  • temperature: Sampling temperature (0.0 to 2.0)
  • max_tokens: Maximum tokens to generate
  • stream: Enable streaming responses
  • top_p: Nucleus sampling parameter
  • frequency_penalty: Frequency penalty
  • presence_penalty: Presence penalty
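The parameters above map one-to-one onto the keyword arguments of `create()`. As a sketch, the JSON body an OpenAI-compatible client ultimately sends looks roughly like this; the exact wire format is an assumption based on the parameter list, not something this page specifies:

```python
import json

# Rough sketch of an OpenAI-compatible request body built from the
# parameters documented above (wire format is assumed, not confirmed).
payload = {
    "model": "lexa-mml",                                  # required
    "messages": [{"role": "user", "content": "Hello!"}],  # required
    "temperature": 0.7,       # sampling temperature, 0.0 to 2.0
    "max_tokens": 100,        # cap on generated tokens
    "stream": False,          # set True for streaming responses
    "top_p": 1.0,             # nucleus sampling
    "frequency_penalty": 0.0,
    "presence_penalty": 0.0,
}
print(json.dumps(payload, indent=2))
```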

Security & SSL

The Lexa SDK automatically handles SSL certificate verification:
  • Default: Uses enhanced SSL with automatic certificate management
  • Fallback: Gracefully falls back to standard SSL verification
  • Manual Override: Allows custom SSL configuration when needed

Migration from OpenAI

If you’re currently using OpenAI’s SDK, migrating to Lexa is straightforward: swap the import, the client class, and the model name.

Before (OpenAI):

import openai

client = openai.OpenAI(api_key="your-key")

response = client.chat.completions.create(
    messages=[{"role": "user", "content": "Hello!"}],
    model="gpt-3.5-turbo"
)

After (Lexa):

from lexa_sdk import Lexa

client = Lexa(api_key="your-key")

response = client.chat.completions.create(
    messages=[{"role": "user", "content": "Hello!"}],
    model="lexa-mml"
)

Installation Options

  • pip: pip install lexa
  • pip with a pinned version: pip install lexa==<version>
  • From source: clone the repository, then run pip install . from its root

Requirements

  • Python: 3.8 or higher
  • Dependencies: Automatically managed by pip
  • SSL: Automatic certificate handling (no manual setup required)

Get Your API Key

1. Visit Lexa Chat
2. Sign up or log in — create an account or sign in to your existing account
3. Navigate to Account settings — click your avatar → Settings → Account tab
4. Generate API key — find “API Keys” and click “Show” to view your key
5. Use in your application — copy the key and use it in your Lexa SDK initialization
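Rather than hard-coding the key in source, a common pattern is to read it from an environment variable at startup. The variable name LEXA_API_KEY below is a convention of this example, not something the SDK requires:

```python
import os

def load_api_key(var="LEXA_API_KEY"):
    # "LEXA_API_KEY" is a naming convention for this example,
    # not an SDK requirement -- use whatever fits your deployment.
    key = os.environ.get(var)
    if not key:
        raise RuntimeError(f"Set {var} before starting the application")
    return key

# Then initialize the client without embedding the secret in code:
# from lexa_sdk import Lexa
# client = Lexa(api_key=load_api_key())
```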

Support

Reference: lexa on PyPI