Overview
SambaNovaLLMService provides access to SambaNova’s language models through an OpenAI-compatible interface. It inherits from OpenAILLMService and supports streaming responses, function calling, and context management with SambaNova’s high-performance inference platform.
- SambaNova LLM API Reference — Pipecat's API methods for SambaNova integration
- Example Implementation — Complete example with function calling
- SambaNova Documentation — Official SambaNova API documentation and features
- SambaNova Cloud — Access models and manage API keys
Installation
To use SambaNova services, install the required dependencies.

Prerequisites
SambaNova Account Setup
Before using SambaNova LLM services, you need:

- SambaNova Account: Sign up at SambaNova Cloud
- API Key: Generate an API key from your account dashboard
- Model Selection: Choose from available high-performance models
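The install step above is typically done with pip's optional extras; the `sambanova` extra name here is an assumption — check the Pipecat installation docs for your version:

```shell
# Install Pipecat with SambaNova support (extra name assumed)
pip install "pipecat-ai[sambanova]"
```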
Required Environment Variables
SAMBANOVA_API_KEY: Your SambaNova API key for authentication
Configuration
- `api_key` — SambaNova API key for authentication.
- `model` — Deprecated in v0.0.105. Use `settings=SambaNovaLLMService.Settings(model=...)` instead.
- `settings` — Runtime-configurable settings. See Settings below.
- `base_url` — Base URL for the SambaNova API endpoint.
Settings
Runtime-configurable settings passed via the `settings` constructor argument using `SambaNovaLLMService.Settings(...)`. These can be updated mid-conversation with `LLMUpdateSettingsFrame`. See Service Settings for details.
This service uses the same settings as OpenAILLMService. See OpenAI LLM Settings for the full parameter reference.
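As a sketch of the mid-conversation update mentioned above (the import paths reflect recent Pipecat versions; the exact settings keys follow the OpenAI LLM settings and are assumptions here):

```python
from pipecat.frames.frames import LLMUpdateSettingsFrame
from pipecat.pipeline.task import PipelineTask


async def lower_temperature(task: PipelineTask) -> None:
    # Queue a settings update into the running pipeline; downstream
    # LLM services apply the new values when the frame arrives.
    await task.queue_frame(LLMUpdateSettingsFrame(settings={"temperature": 0.2}))
```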
Usage
Basic Setup
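A minimal setup might look like the following sketch. The module path and model name are assumptions — confirm both against your Pipecat version and the SambaNova Cloud model catalog:

```python
import os

from pipecat.services.sambanova.llm import SambaNovaLLMService

# Model name is illustrative; choose one from SambaNova Cloud.
llm = SambaNovaLLMService(
    api_key=os.getenv("SAMBANOVA_API_KEY"),
    settings=SambaNovaLLMService.Settings(
        model="Meta-Llama-3.3-70B-Instruct",
    ),
)
```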
With Custom Settings
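Because the settings mirror `OpenAILLMService`'s, additional fields can be passed in the same `Settings` object. The specific fields and values below are illustrative assumptions:

```python
import os

from pipecat.services.sambanova.llm import SambaNovaLLMService

# temperature and max_tokens follow the OpenAI LLM settings;
# values shown are examples, not recommendations.
llm = SambaNovaLLMService(
    api_key=os.getenv("SAMBANOVA_API_KEY"),
    settings=SambaNovaLLMService.Settings(
        model="Meta-Llama-3.3-70B-Instruct",
        temperature=0.7,
        max_tokens=1024,
    ),
)
```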
Notes
- SambaNova does not support the `frequency_penalty`, `presence_penalty`, or `seed` parameters.
- SambaNova has custom handling for tool call indexing; the service includes compatibility logic for processing function calls from the SambaNova API.
- SambaNova is known for high-throughput inference on large language models.