
Overview

QwenLLMService provides access to Alibaba Cloud’s Qwen language models through an OpenAI-compatible interface. It inherits from OpenAILLMService and supports streaming responses, function calling, and context management, with particularly strong capabilities for Chinese language processing.

Installation

To use Qwen services, install the required dependencies:
pip install "pipecat-ai[qwen]"

Prerequisites

Qwen Account Setup

Before using Qwen LLM services, you need:
  1. Alibaba Cloud Account: Sign up at Alibaba Cloud
  2. API Key: Generate an API key from your Model Studio dashboard
  3. Model Selection: Choose from available Qwen models with multilingual capabilities

Required Environment Variables

  • QWEN_API_KEY: Your Qwen API key for authentication
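For local development you can export the key in your shell before starting your app. The value shown is a placeholder; substitute the key generated from your Model Studio dashboard:

```shell
# Placeholder value: replace with the API key from your Model Studio dashboard.
export QWEN_API_KEY="your-dashscope-api-key"
```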

Configuration

  • api_key (str, required): Qwen (DashScope) API key for authentication.
  • base_url (str): Base URL for Qwen API endpoint.
  • model (str, default: None, deprecated): Deprecated in v0.0.105. Use settings=QwenLLMService.Settings(model=...) instead.
  • settings (QwenLLMService.Settings, default: None): Runtime-configurable settings. See Settings below.

Settings

Runtime-configurable settings are passed via the settings constructor argument using QwenLLMService.Settings(...), and can be updated mid-conversation with LLMUpdateSettingsFrame; see Service Settings for details. This service uses the same settings as OpenAILLMService; see OpenAI LLM Settings for the full parameter reference.
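As a minimal sketch of a mid-conversation update: the helper below assumes a running PipelineTask named task (hypothetical) and that LLMUpdateSettingsFrame accepts a Settings object as of v0.0.105; check the Service Settings guide for the exact signature.

```python
from pipecat.frames.frames import LLMUpdateSettingsFrame
from pipecat.services.qwen import QwenLLMService

async def tighten_sampling(task):
    # Lower the temperature for the remainder of the conversation.
    # `task` is assumed to be a running PipelineTask.
    new_settings = QwenLLMService.Settings(temperature=0.3)
    await task.queue_frames([LLMUpdateSettingsFrame(settings=new_settings)])
```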

Usage

Basic Setup

import os
from pipecat.services.qwen import QwenLLMService

llm = QwenLLMService(
    api_key=os.getenv("QWEN_API_KEY"),
    settings=QwenLLMService.Settings(model="qwen-plus"),
)

With Custom Settings

import os

from pipecat.services.qwen import QwenLLMService

llm = QwenLLMService(
    api_key=os.getenv("QWEN_API_KEY"),
    settings=QwenLLMService.Settings(
        model="qwen-plus",
        temperature=0.7,
        top_p=0.9,
        max_completion_tokens=1024,
    ),
)

Notes

  • Qwen models are particularly strong for Chinese language processing and multilingual tasks.
  • Qwen fully supports the OpenAI-compatible parameter set inherited from OpenAILLMService.
  • The API endpoint uses the DashScope international URL by default. For users in mainland China, you may want to override base_url with the domestic endpoint.
  • The InputParams / params= pattern is deprecated as of v0.0.105. Use Settings / settings= instead; see the Service Settings guide for migration details.
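For example, a mainland-China setup might look like the following sketch. The domestic endpoint URL is an assumption here, so verify it against the DashScope documentation before relying on it:

```python
import os

from pipecat.services.qwen import QwenLLMService

llm = QwenLLMService(
    api_key=os.getenv("QWEN_API_KEY"),
    # Assumed domestic DashScope endpoint; confirm in the DashScope docs.
    base_url="https://dashscope.aliyuncs.com/compatible-mode/v1",
    settings=QwenLLMService.Settings(model="qwen-plus"),
)
```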