JavaScript SDK

Use the official Lexa JavaScript SDK to integrate Lexa AI into your applications with a familiar OpenAI-style API.

Install from NPM

Get the latest version: npm install @robilabs/lexa@latest

Quick Start

Installation

npm install @robilabs/lexa@latest

Basic Usage

import Lexa from '@robilabs/lexa';

const lexa = new Lexa('your-api-key');

const completion = await lexa.chat({
  messages: [
    { role: 'user', content: 'Hello! How are you?' }
  ],
  model: 'lexa-mml',
  temperature: 0.7,
  max_tokens: 100
});

console.log(completion.choices[0].message.content);

Features

  • OpenAI-style API - Familiar interface, easy migration
  • Real Lexa Models - Access to lexa-mml, lexa-x1, lexa-rho
  • TypeScript Support - Full type definitions included (typed sketch after this list)
  • Streaming Support - Real-time text generation
  • Multimodal Capabilities - Vision and text processing
  • Simple Setup - Just install and start using
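
Because the package ships full type definitions, every example in this guide also works unchanged in a .ts file. Below is a minimal typed sketch; the helper function and its annotations are illustrative and not part of the SDK, and the exact types come from the package's own definitions.

import Lexa from '@robilabs/lexa';

const lexa = new Lexa('your-api-key');

// Ordinary TypeScript annotations layered on top of the chat call shown in Quick Start above.
async function ask(question: string): Promise<string> {
  const completion = await lexa.chat({
    messages: [{ role: 'user', content: question }],
    model: 'lexa-x1'
  });
  return completion.choices[0].message.content;
}

console.log(await ask('Summarize the Lexa SDK in one sentence.'));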

Available Models

The SDK provides access to the following Lexa models:

  • lexa-mml: Multimodal model with vision capabilities (8192-token context window)
  • lexa-x1: Fast, lightweight text-based model (4096-token context window)
  • lexa-rho: Reasoning model with enhanced capabilities (16384-token context window)
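
A simple rule of thumb follows from these descriptions: lexa-x1 for quick text-only replies, lexa-mml when images are involved, and lexa-rho for heavier reasoning. The pickModel helper below is a hypothetical illustration of that choice, not part of the SDK:

import Lexa from '@robilabs/lexa';

const lexa = new Lexa('your-api-key');

// Hypothetical helper: map a task category to one of the model IDs listed above.
type Task = 'fast-text' | 'vision' | 'reasoning';

function pickModel(task: Task): string {
  switch (task) {
    case 'vision':
      return 'lexa-mml';  // multimodal, 8192-token context
    case 'reasoning':
      return 'lexa-rho';  // enhanced reasoning, 16384-token context
    default:
      return 'lexa-x1';   // fast and lightweight, 4096-token context
  }
}

const completion = await lexa.chat({
  messages: [{ role: 'user', content: 'Outline a study plan for linear algebra.' }],
  model: pickModel('reasoning')
});
console.log(completion.choices[0].message.content);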

Usage Examples

Basic Chat Completion

import Lexa from '@robilabs/lexa';

const lexa = new Lexa('your-api-key');

const completion = await lexa.chat({
  messages: [
    { role: 'user', content: 'Hello! Can you tell me a joke?' }
  ],
  model: 'lexa-mml',
  temperature: 0.7,
  max_tokens: 100
});

console.log(completion.choices[0].message.content);
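
The SDK's error types are not documented on this page, so the sketch below reuses the lexa client created above and simply wraps the same call in a plain try/catch rather than matching on specific error classes:

try {
  const completion = await lexa.chat({
    messages: [{ role: 'user', content: 'Hello! Can you tell me a joke?' }],
    model: 'lexa-mml'
  });
  console.log(completion.choices[0].message.content);
} catch (error) {
  // Invalid API keys, network failures, and rejected requests all surface here.
  console.error('Lexa request failed:', error);
}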

Conversation with System Message

const completion = await lexa.chat({
  messages: [
    { 
      role: 'system', 
      content: 'You are a helpful assistant that speaks like a pirate.' 
    },
    { 
      role: 'user', 
      content: 'What is the weather like today?' 
    }
  ],
  model: 'lexa-x1',
  temperature: 0.9
});
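
Because the API is message-based, a multi-turn conversation is just a growing messages array. The sketch below assumes, in OpenAI style, that the assistant message returned in choices[0].message can be pushed back into the history as-is:

const messages = [
  { role: 'system', content: 'You are a helpful assistant that speaks like a pirate.' },
  { role: 'user', content: 'What is the weather like today?' }
];

const first = await lexa.chat({ messages, model: 'lexa-x1' });

// Keep the assistant's reply in the history so the follow-up question has context.
messages.push(first.choices[0].message);
messages.push({ role: 'user', content: 'And what should I wear?' });

const second = await lexa.chat({ messages, model: 'lexa-x1' });
console.log(second.choices[0].message.content);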

Streaming Response

const completion = await lexa.chat({
  messages: [
    { role: 'user', content: 'Write a short story.' }
  ],
  model: 'lexa-rho',
  stream: true
});

// Handle streaming response
for await (const chunk of completion.stream) {
  console.log(chunk);
}
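
To keep the full text after streaming ends, accumulate the chunks inside the loop (a variation of the loop above). This assumes each chunk is a printable piece of text, as the example implies; check the actual chunk shape in your own runs:

let fullText = '';
for await (const chunk of completion.stream) {
  process.stdout.write(String(chunk));  // print incrementally without extra newlines
  fullText += chunk;
}
console.log('\nTotal characters received:', fullText.length);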

List Available Models

const models = await lexa.models();
console.log(models.data);
// Output: [
//   { id: 'lexa-mml', object: 'model', owned_by: 'lexa' },
//   { id: 'lexa-x1', object: 'model', owned_by: 'lexa' },
//   { id: 'lexa-rho', object: 'model', owned_by: 'lexa' }
// ]
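
Because models() returns plain objects with an id field, it can double as a quick availability check before you commit to a model. A small sketch:

const models = await lexa.models();
const available = new Set(models.data.map((m) => m.id));

// Prefer the reasoning model, but fall back to the lightweight one if it is missing.
const preferred = 'lexa-rho';
const model = available.has(preferred) ? preferred : 'lexa-x1';

const completion = await lexa.chat({
  messages: [{ role: 'user', content: 'Which model answered this?' }],
  model
});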

API Reference

Constructor

const lexa = new Lexa(apiKey, config);
Parameters:
  • apiKey (string): Your Lexa API key
  • config (object, optional): Configuration options

Chat Completion

const completion = await lexa.chat(options);
Options:
  • messages (array): Array of message objects
  • model (string): Model to use (default: 'lexa-mml')
  • temperature (number, optional): Controls randomness (0-2)
  • max_tokens (number, optional): Maximum tokens to generate
  • stream (boolean, optional): Enable streaming response
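
Putting these options together, a fully specified call looks like the sketch below (the values are illustrative):

const completion = await lexa.chat({
  messages: [
    { role: 'system', content: 'You are a concise assistant.' },
    { role: 'user', content: 'Explain streaming in one paragraph.' }
  ],
  model: 'lexa-rho',   // defaults to 'lexa-mml' if omitted
  temperature: 0.3,    // lower values give more deterministic output (range 0-2)
  max_tokens: 200,     // cap on the number of generated tokens
  stream: false        // set to true to consume chunks as in the streaming example above
});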

List Models

const models = await lexa.models();
Returns available models with their metadata.

Migration from OpenAI

If you’re currently using OpenAI’s SDK, migrating to Lexa is straightforward:
Before (OpenAI):
import OpenAI from 'openai';
const openai = new OpenAI({ apiKey: 'your-key' });

const completion = await openai.chat.completions.create({
  messages: [{ role: 'user', content: 'Hello!' }],
  model: 'gpt-3.5-turbo'
});
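
After (Lexa), the equivalent request uses the chat method from Quick Start; only the import, the client construction (a plain API-key string instead of an options object), the method name, and the model ID change:

import Lexa from '@robilabs/lexa';

const lexa = new Lexa('your-key');

const completion = await lexa.chat({
  messages: [{ role: 'user', content: 'Hello!' }],
  model: 'lexa-mml'  // or 'lexa-x1' / 'lexa-rho'
});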

Installation Options

  • npm: npm install @robilabs/lexa@latest
  • yarn: yarn add @robilabs/lexa@latest
  • pnpm: pnpm add @robilabs/lexa@latest

Get Your API Key

1. Visit Lexa Chat
2. Sign up or log in: create an account or sign in to your existing account.
3. Navigate to Account settings: click your avatar → Settings → Account tab.
4. Generate your API key: find “API Keys” and click “Show” to view your key.
5. Use it in your application: copy the key and pass it to the Lexa SDK when you initialize it.
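
Rather than hard-coding the key, many projects read it from an environment variable. The variable name LEXA_API_KEY below is a convention chosen for this sketch, not something the SDK requires:

import Lexa from '@robilabs/lexa';

const apiKey = process.env.LEXA_API_KEY;
if (!apiKey) {
  throw new Error('Set the LEXA_API_KEY environment variable before running this script.');
}

const lexa = new Lexa(apiKey);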

Support

Reference: @robilabs/lexa on npm