1 change: 1 addition & 0 deletions docs.json
@@ -168,6 +168,7 @@
"server/services/llm/grok",
"server/services/llm/groq",
"server/services/llm/nvidia",
"server/services/llm/novita",
"server/services/llm/ollama",
"server/services/llm/openai",
"server/services/llm/openpipe",
150 changes: 150 additions & 0 deletions server/services/llm/novita.mdx
@@ -0,0 +1,150 @@
---
title: "Novita AI"
description: "LLM service implementation using Novita AI's OpenAI-compatible API"
---

## Overview

`NovitaLLMService` provides access to Novita AI's language models through an OpenAI-compatible interface. It inherits from `OpenAILLMService` and supports streaming responses, function calling, and context management with competitive pricing and a wide selection of open-source models.

<CardGroup cols={2}>
<Card
title="Novita LLM API Reference"
icon="code"
href="https://reference-server.pipecat.ai/en/latest/api/pipecat.services.novita.llm.html"
>
Pipecat's API methods for Novita AI integration
</Card>
<Card
title="Example Implementation"
icon="play"
href="https://github.com/pipecat-ai/pipecat/blob/main/examples/foundational/14z-function-calling-novita.py"
>
Complete example with function calling
</Card>
<Card
title="Novita AI Documentation"
icon="book"
href="https://novita.ai/docs"
>
Official Novita AI API documentation and features
</Card>
<Card title="Novita AI Platform" icon="microphone" href="https://novita.ai">
Access models and manage API keys
</Card>
</CardGroup>

## Installation

To use Novita AI services, install the required dependency:

```bash
pip install "pipecat-ai[novita]"
```

## Prerequisites

### Novita AI Account Setup

Before using Novita AI LLM services, you need:

1. **Novita AI Account**: Sign up at [Novita AI](https://novita.ai)
2. **API Key**: Generate an API key from your account dashboard
3. **Model Selection**: Choose from a wide selection of open-source models

### Required Environment Variables

- `NOVITA_API_KEY`: Your Novita AI API key for authentication
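For local development, the key can be exported in your shell before starting the bot. A minimal sketch (the value below is a placeholder; use the key from your Novita AI dashboard):

```shell
# Placeholder value; replace with the key from your Novita AI dashboard
export NOVITA_API_KEY="sk-your-novita-api-key"
```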

## Configuration

<ParamField path="api_key" type="str" required>
Novita AI API key for authentication.
</ParamField>

<ParamField path="base_url" type="str" default="https://api.novita.ai/openai">
Base URL for Novita AI API endpoint.
</ParamField>

<ParamField path="settings" type="NovitaLLMService.Settings" default="None">
Runtime-configurable settings. See [Settings](#settings) below.
</ParamField>

### Settings

Runtime-configurable settings passed via the `settings` constructor argument using `NovitaLLMService.Settings(...)`. These can be updated mid-conversation with `LLMUpdateSettingsFrame`. See [Service Settings](/guides/fundamentals/service-settings) for details.

This service uses the same settings as `OpenAILLMService`. See [OpenAI LLM Settings](/server/services/llm/openai#settings) for the full parameter reference.

The default model is `"moonshotai/kimi-k2.5"`.
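As a rough sketch of what a mid-conversation update might look like: the settings payload is a plain dict of OpenAI-compatible parameters, and the frame push is shown only as a comment because it assumes a running `PipelineTask` (hypothetical usage, not verified against your pipeline setup):

```python
# Settings payload for a mid-conversation update. The keys mirror the
# OpenAI-compatible parameters this service accepts.
updated_settings = {"temperature": 0.3, "max_tokens": 300}

# In a running pipeline (hypothetical usage):
# from pipecat.frames.frames import LLMUpdateSettingsFrame
# await task.queue_frame(LLMUpdateSettingsFrame(settings=updated_settings))
```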

## Usage

### Basic Setup

```python
import os
from pipecat.services.novita import NovitaLLMService

llm = NovitaLLMService(
api_key=os.getenv("NOVITA_API_KEY"),
settings=NovitaLLMService.Settings(
model="openai/gpt-oss-120b",
),
)
```

### With Custom Settings

```python
llm = NovitaLLMService(
api_key=os.getenv("NOVITA_API_KEY"),
settings=NovitaLLMService.Settings(
model="moonshotai/kimi-k2.5",
temperature=0.7,
max_tokens=500,
),
)
```

### With Function Calling

```python
import os

from pipecat.adapters.schemas.function_schema import FunctionSchema
from pipecat.adapters.schemas.tools_schema import ToolsSchema
from pipecat.processors.aggregators.llm_context import LLMContext
from pipecat.services.llm_service import FunctionCallParams
from pipecat.services.novita import NovitaLLMService

async def get_weather(params: FunctionCallParams):
await params.result_callback({"temperature": "75", "conditions": "sunny"})

llm = NovitaLLMService(
api_key=os.getenv("NOVITA_API_KEY"),
settings=NovitaLLMService.Settings(
model="openai/gpt-oss-120b",
),
)

llm.register_function("get_weather", get_weather)

weather_function = FunctionSchema(
name="get_weather",
description="Get the current weather",
properties={
"location": {
"type": "string",
"description": "City and state, e.g. San Francisco, CA",
},
},
required=["location"],
)

tools = ToolsSchema(standard_tools=[weather_function])
context = LLMContext(tools=tools)
```

## Notes

- Novita AI exposes an OpenAI-compatible API, so the patterns documented for `OpenAILLMService` (streaming responses, function calling, and context management) apply to this service as well
- Model availability depends on your Novita AI account access and pricing tier
1 change: 1 addition & 0 deletions server/services/supported-services.mdx
@@ -73,6 +73,7 @@ LLMs receive text or audio based input and output a streaming text response.
| [Grok](/server/services/llm/grok) | `pip install "pipecat-ai[grok]"` |
| [Groq](/server/services/llm/groq) | `pip install "pipecat-ai[groq]"` |
| [NVIDIA](/server/services/llm/nvidia) | `pip install "pipecat-ai[nvidia]"` |
| [Novita AI](/server/services/llm/novita) | `pip install "pipecat-ai[novita]"` |
| [Ollama](/server/services/llm/ollama) | `pip install "pipecat-ai[ollama]"` |
| [OpenAI](/server/services/llm/openai) | `pip install "pipecat-ai[openai]"` |
| [OpenPipe](/server/services/llm/openpipe) | `pip install "pipecat-ai[openpipe]"` |