Merged
2 changes: 1 addition & 1 deletion guides/fundamentals/custom-frame-processor.mdx
@@ -85,7 +85,7 @@ With this positioning, the `MetricsFrameLogger` FrameProcessor will receive ever

## Key Requirements

- FrameProcessors must inherit from the base `FrameProcessor` class. This ensures that your custom FrameProcessor will correctly handle frames like `StartFrame`, `EndFrame`, `StartInterruptionFrame` without having to write custom logic for those frames. This inheritance also provides it with the ability to `process_frame()` and `push_frame()`:
+ FrameProcessors must inherit from the base `FrameProcessor` class. This ensures that your custom FrameProcessor will correctly handle frames like `StartFrame`, `EndFrame`, `InterruptionFrame` without having to write custom logic for those frames. This inheritance also provides it with the ability to `process_frame()` and `push_frame()`:

- **`process_frame()`** is what allows the FrameProcessor to receive frames and add custom conditional logic based on the frames that are received.
- **`push_frame()`** allows the FrameProcessor to push frames to the pipeline. Normally, frames are pushed DOWNSTREAM, but based on which processors need the output, you can also push UPSTREAM or in both directions.
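The receive-then-forward contract described above can be sketched in plain Python. This is an illustrative stand-in, not the pipecat API: the class names and the simple `next` link are assumptions made for the sketch, and a real `FrameProcessor` subclass would call `super().process_frame()` and use the library's own wiring.

```python
import asyncio
from enum import Enum, auto

class FrameDirection(Enum):
    DOWNSTREAM = auto()
    UPSTREAM = auto()

class Processor:
    """Minimal stand-in for the base FrameProcessor contract (illustrative only)."""
    def __init__(self):
        self.next = None  # downstream neighbor in the pipeline

    async def process_frame(self, frame, direction):
        # Base behavior: forward every frame unchanged.
        await self.push_frame(frame, direction)

    async def push_frame(self, frame, direction):
        # Frames normally travel DOWNSTREAM to the next processor.
        if direction is FrameDirection.DOWNSTREAM and self.next is not None:
            await self.next.process_frame(frame, direction)

class FrameLogger(Processor):
    """Custom processor: record each frame, then pass it along."""
    def __init__(self):
        super().__init__()
        self.seen = []

    async def process_frame(self, frame, direction):
        self.seen.append(frame)  # custom conditional logic goes here
        await self.push_frame(frame, direction)

async def main():
    logger, sink = FrameLogger(), Processor()
    logger.next = sink
    for frame in ("StartFrame", "TextFrame", "EndFrame"):
        await logger.process_frame(frame, FrameDirection.DOWNSTREAM)
    return logger.seen

seen = asyncio.run(main())
print(seen)  # ['StartFrame', 'TextFrame', 'EndFrame']
```

The key point the sketch demonstrates: a processor that overrides `process_frame()` must still push the frame onward, or everything downstream goes silent.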
2 changes: 1 addition & 1 deletion guides/learn/pipeline.mdx
@@ -84,7 +84,7 @@ class ControlFrame(Frame):
# SystemFrames (processed immediately)
InputAudioRawFrame # User audio input
UserStartedSpeakingFrame # Speech detection events
- StartInterruptionFrame # Interruption control
+ InterruptionFrame # Interruption control
ErrorFrame # Error notifications

# DataFrames (queued and ordered)
10 changes: 5 additions & 5 deletions server/pipeline/heartbeats.mdx
@@ -26,7 +26,7 @@ When heartbeats are enabled:
1. The pipeline sends a `HeartbeatFrame` every second
2. The frame traverses through all processors in the pipeline, from source to sink
3. The pipeline monitors how long it takes for heartbeat frames to complete their journey
- 4. If a heartbeat frame isn't received within 5 seconds, a warning is logged
+ 4. If a heartbeat frame isn't received within 10 seconds, a warning is logged

## Monitoring Output

@@ -52,11 +52,11 @@ Heartbeat monitoring is useful for:

## Configuration

- The heartbeat system uses two key timing constants:
+ The heartbeat system uses two timing values:

- - `HEARTBEAT_SECONDS = 1.0` - Interval between heartbeat frames
- - `HEARTBEAT_MONITOR_SECONDS = 10.0` - Time before warning if no heartbeat received
+ - **Interval** (default 1.0s) — how often heartbeat frames are sent. Configurable via `heartbeats_period_secs` in `PipelineParams`.
+ - **Monitor window** (10x the interval) — how long to wait before logging a warning if no heartbeat is received.

<Note>
- These values are currently fixed but may be configurable in future versions.
+ The heartbeat interval is configurable via the `heartbeats_period_secs` parameter in `PipelineParams`. The monitor window is always 10x the interval.
</Note>
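A minimal configuration sketch for the behavior described above, assuming the `PipelineParams` fields named in this page (`enable_heartbeats`, `heartbeats_period_secs`) and a `pipeline` built earlier; check your pipecat version for the exact field names:

```python
from pipecat.pipeline.task import PipelineParams, PipelineTask

params = PipelineParams(
    enable_heartbeats=True,
    heartbeats_period_secs=2.0,  # monitor window would then be 20 seconds (10x)
)
task = PipelineTask(pipeline, params=params)
```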
2 changes: 1 addition & 1 deletion server/pipeline/pipeline-params.mdx
@@ -28,7 +28,7 @@ task = PipelineTask(pipeline, params=params)

## Available Parameters

- <ParamField path="allow_interruptions" type="bool" default="False">
+ <ParamField path="allow_interruptions" type="bool" default="True" deprecated>
<Warning>
DEPRECATED: This parameter is deprecated. Configure interruption behavior
via [User Turn
13 changes: 7 additions & 6 deletions server/pipeline/pipeline-task.mdx
@@ -66,7 +66,7 @@ await runner.run(task)
<ParamField
path="idle_timeout_frames"
type="Tuple[Type[Frame], ...]"
-   default="(BotSpeakingFrame, LLMFullResponseEndFrame)"
+   default="(BotSpeakingFrame, UserSpeakingFrame)"
>
Frame types that should prevent the pipeline from being considered idle. See
[Pipeline Idle Detection](/server/pipeline/pipeline-idle-detection) for
@@ -84,7 +84,7 @@ await runner.run(task)
guide](/server/utilities/opentelemetry) for details.
</ParamField>

- <ParamField path="enable_turn_tracking" type="bool" default="False">
+ <ParamField path="enable_turn_tracking" type="bool" default="True">
Whether to enable turn tracking. See [The OpenTelemetry
guide](/server/utilities/opentelemetry) for details.
</ParamField>
@@ -108,10 +108,11 @@ await runner.run(task)
### Task Lifecycle Management

<ResponseField name="run()" type="async">
- Starts and manages the pipeline execution until completion or cancellation.
+ Starts and manages the pipeline execution until completion or cancellation. Typically called via `PipelineRunner` rather than directly:

```python
- await task.run()
+ runner = PipelineRunner()
+ await runner.run(task)
```

</ResponseField>
@@ -163,7 +164,7 @@ Downstream frames are pushed from the beginning of the pipeline. Upstream frames
await task.queue_frame(TTSSpeakFrame("Hello!"))

# Push a frame upstream from the end of the pipeline
- from pipecat.frames.frames import FrameDirection
+ from pipecat.processors.frame_processor import FrameDirection
await task.queue_frame(UserStoppedSpeakingFrame(), direction=FrameDirection.UPSTREAM)
```

@@ -188,7 +189,7 @@ frames = [TTSSpeakFrame("Hello!"), TTSSpeakFrame("How are you?")]
await task.queue_frames(frames)

# Push frames upstream from the end of the pipeline
- from pipecat.frames.frames import FrameDirection
+ from pipecat.processors.frame_processor import FrameDirection
frames = [TranscriptionFrame("user input"), UserStoppedSpeakingFrame()]
await task.queue_frames(frames, direction=FrameDirection.UPSTREAM)
```
18 changes: 7 additions & 11 deletions server/utilities/dtmf-aggregator.mdx
@@ -41,10 +41,6 @@ aggregator = DTMFAggregator(
Contains a single keypad button press with a KeypadEntry value
</ParamField>

- <ParamField path="StartInterruptionFrame" type="Frame">
-   Flushes any pending aggregation when user interruption begins
- </ParamField>

<ParamField path="EndFrame" type="Frame">
Flushes pending aggregation and stops the aggregation task
</ParamField>
@@ -73,7 +69,7 @@ The aggregator flushes (emits a TranscriptionFrame) when:

1. **Termination digit**: The configured termination digit is pressed (default: `#`)
2. **Timeout**: No new digits received within the timeout period (default: 2 seconds)
- 3. **Interruption**: A `StartInterruptionFrame` is received
+ 3. **Interruption**: An `InterruptionFrame` is received
4. **Pipeline end**: An `EndFrame` is received
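The termination-digit rule (case 1 above) can be modeled in a few lines of plain Python. This is a toy model of the flush behavior, not the pipecat implementation; the function name and return shape are invented for the sketch:

```python
def aggregate_dtmf(presses, termination_digit="#"):
    """Toy model of the termination-digit flush rule: buffer digits and
    emit one transcription string each time the termination digit arrives."""
    buffer, emitted = "", []
    for digit in presses:
        buffer += digit
        if digit == termination_digit:
            emitted.append(f"DTMF: {buffer}")
            buffer = ""
    # Anything left in the buffer would flush on timeout, interruption,
    # or EndFrame in the real aggregator.
    return emitted, buffer

emitted, pending = aggregate_dtmf(["1", "2", "3", "#", "5"])
print(emitted)  # ['DTMF: 123#']
print(pending)  # '5'
```

Note that the emitted text includes the termination digit itself, matching the `"DTMF: 123#"` row in the sequence table below on this page.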

## Usage Examples
@@ -138,12 +134,12 @@ Respond appropriately to both voice and keypad input."""

## Sequence Examples

- | User Input | Aggregation Trigger | Output TranscriptionFrame |
- | ------------------ | ---------------------- | ------------------------- |
- | `1`, `2`, `3`, `#` | Termination digit | `"DTMF: 123#"` |
- | `*`, `0` | 2-second timeout | `"DTMF: *0"` |
- | `5`, interruption | StartInterruptionFrame | `"DTMF: 5"` |
- | `9`, `9`, EndFrame | Pipeline shutdown | `"DTMF: 99"` |
+ | User Input | Aggregation Trigger | Output TranscriptionFrame |
+ | ------------------ | ------------------- | ------------------------- |
+ | `1`, `2`, `3`, `#` | Termination digit | `"DTMF: 123#"` |
+ | `*`, `0` | 2-second timeout | `"DTMF: *0"` |
+ | `5`, interruption | InterruptionFrame | `"DTMF: 5"` |
+ | `9`, `9`, EndFrame | Pipeline shutdown | `"DTMF: 99"` |

## Frame Flow

2 changes: 1 addition & 1 deletion server/utilities/filters/stt-mute.mdx
@@ -102,7 +102,7 @@ The processor is configured using `STTMuteConfig`, which determines when and how
Indicates an interim transcription result (suppressed when muted)
</ParamField>

- <ParamField path="StartInterruptionFrame" type="Frame">
+ <ParamField path="InterruptionFrame" type="Frame">
User interruption start event (suppressed when muted)
</ParamField>

6 changes: 3 additions & 3 deletions server/utilities/observers/debug-observer.mdx
@@ -57,7 +57,7 @@ task = PipelineTask(
Filter frames based on their type and source/destination:

```python
- from pipecat.frames.frames import StartInterruptionFrame, UserStartedSpeakingFrame, LLMTextFrame
+ from pipecat.frames.frames import InterruptionFrame, UserStartedSpeakingFrame, LLMTextFrame
from pipecat.observers.loggers.debug_log_observer import DebugLogObserver, FrameEndpoint
from pipecat.transports.base_output_transport import BaseOutputTransport
from pipecat.services.stt_service import STTService
@@ -67,8 +67,8 @@ task = PipelineTask(
params=PipelineParams(
observers=[
DebugLogObserver(frame_types={
- # Only log StartInterruptionFrame when source is BaseOutputTransport
- StartInterruptionFrame: (BaseOutputTransport, FrameEndpoint.SOURCE),
+ # Only log InterruptionFrame when source is BaseOutputTransport
+ InterruptionFrame: (BaseOutputTransport, FrameEndpoint.SOURCE),

# Only log UserStartedSpeakingFrame when destination is STTService
UserStartedSpeakingFrame: (STTService, FrameEndpoint.DESTINATION),
6 changes: 3 additions & 3 deletions server/utilities/observers/observer-pattern.mdx
@@ -68,7 +68,7 @@ Here's an example observer that logs interruptions and bot speaking events:
```python
from pipecat.observers.base_observer import BaseObserver, FramePushed, FrameProcessed
from pipecat.frames.frames import (
-     StartInterruptionFrame,
+     InterruptionFrame,
BotStartedSpeakingFrame,
BotStoppedSpeakingFrame,
)
@@ -79,7 +79,7 @@ class DebugObserver(BaseObserver):
"""Observer to log interruptions and bot speaking events to the console.

Logs all frame instances of:
-     - StartInterruptionFrame
+     - InterruptionFrame
- BotStartedSpeakingFrame
- BotStoppedSpeakingFrame

@@ -91,7 +91,7 @@ time_sec = data.timestamp / 1_000_000_000
time_sec = data.timestamp / 1_000_000_000
arrow = "→" if data.direction == FrameDirection.DOWNSTREAM else "←"

-         if isinstance(data.frame, StartInterruptionFrame):
+         if isinstance(data.frame, InterruptionFrame):
logger.info(f"⚡ INTERRUPTION START: {data.source} {arrow} {data.destination} at {time_sec:.2f}s")
elif isinstance(data.frame, BotStartedSpeakingFrame):
logger.info(f"🤖 BOT START SPEAKING: {data.source} {arrow} {data.destination} at {time_sec:.2f}s")