The official Lexa VSCode extension is coming soon. For now, you can integrate Lexa in VSCode using the Continue.dev extension configured to talk to your Lexa endpoint.

Download Extension

You can install the Continue.dev extension from the Visual Studio Marketplace or directly from the Extensions view in VSCode by searching for “Continue”. Once installed, you can access it via the Continue tab in the sidebar.

Setup

Click the assistant selector to the right of the main chat input in Continue.dev. Hover over “Local Assistant” and click the settings icon (⚙️). This opens config.yaml, where you configure your local assistant.
Some Continue.dev providers do not support authentication, so configure Continue.dev to use the openai provider; that way it can authenticate with your Lexa token.
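For orientation, the overall shape of the file is sketched below; this is an abridged version of the full example under Example config, and [api-key] is a placeholder for your Lexa API key.
name: Lexa
version: 1.0.0
schema: v1

models:
  - name: Lexa-MML
    provider: openai
    model: lexa-mml
    apiBase: https://lexa.chat/api
    apiKey: [api-key]

context:
  - provider: code    # add further context providers as needed (see below)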

Get your Lexa API key

1. Go to Lexa: Visit https://www.lexa.chat.
2. Sign up or log in: Create an account or sign in to your existing account.
3. Open your profile menu: Click your avatar (top-right) or the avatar at the bottom of the sidebar.
4. Open Settings: Click “Settings”.
5. Go to the Account tab: Select the “Account” tab.
6. Show your API keys: Find “API Keys” and click “Show”.
7. Copy and use your key: Copy your API key and paste it into your Continue.dev config.yaml (see Example config below). Store it securely.
apiKey: [api-key]
Continue.dev will now use this key, and requests to Lexa should authenticate.
Treat API keys like passwords. Never commit them to source control. Rotate immediately if exposed.

Example config

Below is a Lexa-specific config.yaml. Replace [api-key] with your Lexa API key.
config.yaml
name: Lexa
version: 1.0.0
schema: v1
models:
  - name: Lexa-MML
    provider: openai
    model: lexa-mml
    env:
      useLegacyCompletionsEndpoint: false
    apiBase: https://lexa.chat/api
    apiKey: [api-key]
    roles:
      - chat
      - edit
      - apply
      - summarize
    capabilities:
      - tool_use
      - image_input

context:
  - provider: code
  - provider: docs
  - provider: diff
  - provider: terminal
  - provider: problems
  - provider: folder
  - provider: codebase

Miscellaneous configuration settings

These top-level values are required by the extension; name is simply a display label:
name: Local Assistant
version: 1.0.0
schema: v1
The context section lists the context providers that supply additional information (code, docs, diffs, terminal output, problems, folders, and the codebase) to models:
context:
  - provider: code
  - provider: docs
  - provider: diff
  - provider: terminal
  - provider: problems
  - provider: folder
  - provider: codebase

Models

Define all models you want to use:
models:
  - ...

Name

Sets the display name shown in the Continue.dev chat input:
name: Lexa

Provider

Use the OpenAI-compatible API provided by Lexa:
provider: openai

Model

Use the Lexa model identifier you intend to target. Available identifiers: lexa-mml, lexa-x1, lexa-rho.
model: lexa-mml
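If you want more than one Lexa model available in Continue.dev, a sketch of the models list with one entry per identifier is shown below. The display names Lexa-X1 and Lexa-Rho are placeholders, and each entry can carry its own roles and capabilities (see Roles and capabilities below).
models:
  - name: Lexa-MML
    provider: openai
    model: lexa-mml
    apiBase: https://lexa.chat/api
    apiKey: [api-key]
  - name: Lexa-X1          # placeholder display name
    provider: openai
    model: lexa-x1
    apiBase: https://lexa.chat/api
    apiKey: [api-key]
  - name: Lexa-Rho         # placeholder display name
    provider: openai
    model: lexa-rho
    apiBase: https://lexa.chat/api
    apiKey: [api-key]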

Legacy completions endpoint

Lexa does not require the legacy completions endpoint, so it stays disabled:
env:
  useLegacyCompletionsEndpoint: false

API base

Point Continue.dev to the Lexa endpoint:
apiBase: https://lexa.chat/api

API key

Use your Lexa API key:
apiKey: [api-key]

Roles and capabilities

roles controls which tasks the model may be used for; capabilities enables features such as tool use and image input:
roles:
  - chat
  - edit
  - apply
  - summarize
capabilities:
  - tool_use
  # - image_input  # Only supported by lexa-mml
Image input is supported only by lexa-mml. The lexa-rho and lexa-x1 models do not support image_input.
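For example, an entry for lexa-x1 or lexa-rho would keep tool_use but omit image_input; restricting roles (here to chat and edit) is optional and purely illustrative:
roles:
  - chat
  - edit
capabilities:
  - tool_use   # omit image_input: lexa-x1 and lexa-rho do not support it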
Setup complete. You can now interact with your model(s) via the Continue.dev chat input in VSCode.

Tips

Use concise prompts like “Refactor this function for readability” or “Summarize the open file”. For docs, try “Draft MDX frontmatter for a new integration page”.

Troubleshooting

  • If the Continue panel is missing, reload the VSCode window (run “Developer: Reload Window” from the Command Palette) or restart VSCode.
  • If authentication fails, verify your API base and key, then re-authenticate.
  • For issues, contact support at lexa@robiai.com.