Overview

OpenRouterLLMService provides access to OpenRouter's language models through an OpenAI-compatible interface. It inherits from OpenAILLMService and supports streaming responses, function calling, and context management, giving you access to models from multiple providers through a single API.

Installation

To use OpenRouter services, install the required dependencies:
pip install "pipecat-ai[openrouter]"

Prerequisites

OpenRouter Account Setup

Before using OpenRouter LLM services, you need:
  1. OpenRouter Account: Sign up at OpenRouter
  2. API Key: Generate an API key from your account dashboard
  3. Model Selection: Choose from hundreds of available models from different providers
  4. Credits: Add credits to your account for model usage

Required Environment Variables

  • OPENROUTER_API_KEY: Your OpenRouter API key for authentication

Configuration

  • api_key (str, default: None): OpenRouter API key for authentication. If not provided, the client will attempt to read from environment variables.
  • model (str, default: None, deprecated): Deprecated in v0.0.105. Use settings=OpenRouterLLMService.Settings(model=...) instead.
  • settings (OpenRouterLLMService.Settings, default: None): Runtime-configurable settings. See Settings below.
  • base_url (str, default: "https://openrouter.ai/api/v1"): Base URL for the OpenRouter API endpoint.
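
Because the service speaks the OpenAI chat-completions protocol, the base_url above points at a standard /chat/completions endpoint. As a minimal sketch, the following builds (but does not send) such a request with only the Python standard library; the payload fields mirror the OpenAI schema:

```python
import json
import os
import urllib.request

# Build an OpenAI-style chat-completions request against OpenRouter's
# endpoint. No request is sent here; this only shows the shape of the call.
BASE_URL = "https://openrouter.ai/api/v1"

payload = {
    "model": "openai/gpt-4o-2024-11-20",
    "messages": [{"role": "user", "content": "Hello!"}],
    "stream": False,
}

request = urllib.request.Request(
    f"{BASE_URL}/chat/completions",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {os.getenv('OPENROUTER_API_KEY', '')}",
        "Content-Type": "application/json",
    },
    method="POST",
)
```

Sending the request with urllib.request.urlopen (or any OpenAI-compatible client pointed at the same base_url) returns a standard chat-completion response.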

Settings

Runtime-configurable settings passed via the settings constructor argument using OpenRouterLLMService.Settings(...). These can be updated mid-conversation with LLMUpdateSettingsFrame. See Service Settings for details. This service uses the same settings as OpenAILLMService. See OpenAI LLM Settings for the full parameter reference.

Usage

Basic Setup

import os
from pipecat.services.openrouter import OpenRouterLLMService

llm = OpenRouterLLMService(
    api_key=os.getenv("OPENROUTER_API_KEY"),
    settings=OpenRouterLLMService.Settings(model="openai/gpt-4o-2024-11-20"),
)

With a Different Provider Model

import os
from pipecat.services.openrouter import OpenRouterLLMService

llm = OpenRouterLLMService(
    api_key=os.getenv("OPENROUTER_API_KEY"),
    settings=OpenRouterLLMService.Settings(
        model="anthropic/claude-sonnet-4-20250514",
        temperature=0.7,
        max_completion_tokens=1024,
    ),
)

Notes

  • OpenRouter model identifiers use the provider/model format (e.g., openai/gpt-4o, anthropic/claude-sonnet-4-20250514, google/gemini-pro).
  • When using Gemini models through OpenRouter, the service automatically handles the constraint that only one system message is allowed by converting additional system messages to user messages.
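
The Gemini constraint above can be illustrated with a small sketch. This is not Pipecat's actual implementation, just a plain-Python illustration of keeping the first system message and demoting any later ones to user messages:

```python
def fold_system_messages(messages: list[dict]) -> list[dict]:
    """Keep the first system message; convert later ones to user messages."""
    result = []
    seen_system = False
    for msg in messages:
        if msg["role"] == "system":
            if seen_system:
                # A system message has already been kept, so demote this one.
                result.append({"role": "user", "content": msg["content"]})
                continue
            seen_system = True
        result.append(dict(msg))
    return result
```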
  • The InputParams / params= pattern is deprecated as of v0.0.105. Use Settings / settings= instead. See the Service Settings guide for migration details.