Overview

DeepSeekLLMService provides access to DeepSeek's language models through an OpenAI-compatible interface. It inherits from OpenAILLMService and supports streaming responses, function calling, and context management, along with DeepSeek's advanced reasoning capabilities.

Installation

To use DeepSeek services, install the required dependency:
pip install "pipecat-ai[deepseek]"

Prerequisites

DeepSeek Account Setup

Before using DeepSeek LLM services, you need:
  1. DeepSeek Account: Sign up at DeepSeek Platform
  2. API Key: Generate an API key from your account dashboard
  3. Model Selection: Choose from available DeepSeek models with reasoning capabilities

Required Environment Variables

  • DEEPSEEK_API_KEY: Your DeepSeek API key for authentication
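Missing credentials otherwise surface as an error on the first API call; a small helper (the function name is ours, not part of Pipecat) can fail fast at startup instead:

```python
import os


def require_deepseek_api_key() -> str:
    # Raise early with a clear message rather than failing
    # deep inside the first LLM request.
    key = os.getenv("DEEPSEEK_API_KEY")
    if not key:
        raise RuntimeError("DEEPSEEK_API_KEY is not set; export it before starting the bot.")
    return key
```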

Configuration

api_key
str
required
DeepSeek API key for authentication.
base_url
str
default:"https://api.deepseek.com/v1"
Base URL for DeepSeek API endpoint.
model
str
default:"None"
deprecated
Model identifier to use. Deprecated in v0.0.105. Use settings=DeepSeekLLMService.Settings(model=...) instead.
settings
DeepSeekLLMService.Settings
default:"None"
Runtime-configurable settings. See Settings below.

Settings

Runtime-configurable settings passed via the settings constructor argument using DeepSeekLLMService.Settings(...). These can be updated mid-conversation with LLMUpdateSettingsFrame. See Service Settings for details. This service uses the same settings as OpenAILLMService. See OpenAI LLM Settings for the full parameter reference.
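The mid-conversation update semantics can be illustrated with plain dataclasses (a conceptual sketch, not the actual Pipecat classes): an LLMUpdateSettingsFrame-style update replaces only the fields it carries and preserves the rest.

```python
from dataclasses import dataclass, replace
from typing import Optional


# Stand-in for DeepSeekLLMService.Settings (illustrative only).
@dataclass(frozen=True)
class Settings:
    model: Optional[str] = None
    temperature: Optional[float] = None
    max_tokens: Optional[int] = None


current = Settings(model="deepseek-chat", temperature=0.7)

# A mid-conversation update carries only the fields to change;
# everything else is kept as-is.
updated = replace(current, temperature=0.2)
```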

Usage

Basic Setup

import os
from pipecat.services.deepseek import DeepSeekLLMService

llm = DeepSeekLLMService(
    api_key=os.getenv("DEEPSEEK_API_KEY"),
    model="deepseek-chat",
)

With Custom Settings

import os
from pipecat.services.deepseek import DeepSeekLLMService
llm = DeepSeekLLMService(
    api_key=os.getenv("DEEPSEEK_API_KEY"),
    settings=DeepSeekLLMService.Settings(
        model="deepseek-chat",
        temperature=0.7,
        top_p=0.9,
        max_tokens=2048,
    ),
)

Notes

  • DeepSeek does not support the seed and max_completion_tokens parameters. Use max_tokens instead.
  • DeepSeek models offer strong reasoning capabilities, particularly the deepseek-reasoner model variant.
  • The InputParams / params= pattern is deprecated as of v0.0.105. Use Settings / settings= instead. See the Service Settings guide for migration details.
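Because seed and max_completion_tokens are unsupported, code that reuses OpenAI-style request payloads may need to normalize them first. A hypothetical helper (the function name is ours) sketching that cleanup:

```python
def normalize_for_deepseek(payload: dict) -> dict:
    # DeepSeek's OpenAI-compatible endpoint rejects these parameters.
    out = dict(payload)
    out.pop("seed", None)
    if "max_completion_tokens" in out:
        # DeepSeek uses max_tokens; keep an explicit max_tokens if already set.
        out.setdefault("max_tokens", out.pop("max_completion_tokens"))
    return out
```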