Merged
docs/content/docs/api-reference/canvas.mdx (3 additions, 3 deletions)
@@ -94,7 +94,7 @@ chain.apply(queue: Queue | None = None) -> JobResult
```

Execute the chain by enqueuing and waiting for each step sequentially. Returns
-the [`JobResult`](/docs/api-reference/result) of the **last** step.
+the [`JobResult`](/api-reference/result) of the **last** step.

Each step's return value is prepended to the next mutable signature's args.
Immutable signatures (`task.si()`) receive their args as-is.
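
As an aside, the prepend semantics this hunk describes can be sketched in plain Python. `run_chain`, `add`, and `double` below are illustrative stand-ins, not taskito APIs — this only models the behavior, not the implementation:

```python
# Sketch (not the taskito implementation): how a chain threads results.
# A mutable step receives the previous return value prepended to its
# args; an immutable step ignores it and uses its args as-is.

def run_chain(steps):
    """steps: list of (fn, args, immutable) tuples, run sequentially."""
    result = None
    first = True
    for fn, args, immutable in steps:
        if first or immutable:
            call_args = args                  # args used as-is
        else:
            call_args = (result, *args)       # prepend previous result
        result = fn(*call_args)
        first = False
    return result  # the chain returns the *last* step's result

def add(x, y):
    return x + y

def double(x):
    return x * 2

# Roughly (add.s(1, 2) | double.s()) in canvas terms:
print(run_chain([(add, (1, 2), False), (double, (), False)]))  # 6
```
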
@@ -154,7 +154,7 @@ group.apply(queue: Queue | None = None) -> list[JobResult]
```

Enqueue all signatures and return a list of
-[`JobResult`](/docs/api-reference/result) handles. Jobs run concurrently
+[`JobResult`](/api-reference/result) handles. Jobs run concurrently
across available workers.

```python
@@ -209,7 +209,7 @@ chord.apply(queue: Queue | None = None) -> JobResult

Execute the group, wait for all results, then run the callback with the list
of results prepended to its args (unless immutable). Returns the
-[`JobResult`](/docs/api-reference/result) of the callback.
+[`JobResult`](/api-reference/result) of the callback.

```python
@queue.task()
docs/content/docs/api-reference/cli.mdx (1 addition, 1 deletion)
@@ -162,7 +162,7 @@ The server exposes three routes:
taskito scaler --app myapp:queue --port 9091 --target-queue-depth 5
```

-See the [KEDA Integration guide](/docs/guides/operations) for Kubernetes deploy
+See the [KEDA Integration guide](/guides/operations) for Kubernetes deploy
templates.

## Error messages
docs/content/docs/api-reference/context.mdx (3 additions, 3 deletions)
@@ -86,8 +86,8 @@ current_job.update_progress(progress: int) -> None
```

Update the job's progress percentage (0–100). The value is written directly to
-the database and can be read via [`job.progress`](/docs/api-reference/result)
-or [`queue.get_job()`](/docs/api-reference/queue/jobs).
+the database and can be read via [`job.progress`](/api-reference/result)
+or [`queue.get_job()`](/api-reference/queue/jobs).
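
A minimal sketch of the reporting pattern this describes, in plain Python — `report` stands in for `current_job.update_progress()`, which in taskito writes to the database:

```python
# Illustrative only: compute and report a 0-100 progress percentage
# as a batch of items is processed.
def process(items, report):
    done = 0
    for item in items:
        # ... do the real work on `item` here ...
        done += 1
        report(int(done / len(items) * 100))  # always lands in 0-100

seen = []
process(range(4), seen.append)
print(seen)  # [25, 50, 75, 100]
```
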

```python
@queue.task()
@@ -116,7 +116,7 @@ current_job.publish(data: Any) -> None
```

Publish a partial result visible to
-[`job.stream()`](/docs/api-reference/result) consumers. Use this to stream
+[`job.stream()`](/api-reference/result) consumers. Use this to stream
intermediate data from long-running tasks.
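
The publish/stream contract described here can be modeled with an in-memory list standing in for taskito's task-log storage. `FakeJob` is purely illustrative — the real `job.stream()` polls the database:

```python
# Sketch of publish/stream semantics: a producer publishes
# JSON-serializable partial results; a consumer iterates them and
# stops once the job reaches a terminal state.
import json

class FakeJob:
    def __init__(self):
        self._log = []      # stands in for persisted task log entries
        self._done = False

    def publish(self, data):
        # data must be JSON-serializable, as the docs require
        self._log.append(json.dumps(data))

    def finish(self):
        self._done = True   # terminal state

    def stream(self):
        i = 0
        while True:
            while i < len(self._log):
                yield json.loads(self._log[i])
                i += 1
            if self._done:
                return
            # a real implementation would sleep and re-poll storage here

job = FakeJob()
for chunk in ("alpha", "beta", "gamma"):
    job.publish({"chunk": chunk})
job.finish()
print([entry["chunk"] for entry in job.stream()])  # ['alpha', 'beta', 'gamma']
```
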

`data` must be JSON-serializable. It is stored as a task log entry with
docs/content/docs/api-reference/index.mdx (9 additions, 9 deletions)
@@ -23,55 +23,55 @@ signature, parameters, return values, and a short example.
<Card
icon={<Compass />}
title="Overview"
-href="/docs/api-reference/overview"
+href="/api-reference/overview"
description="The shape of the public API — what's stable, what's experimental."
/>
<Card
icon={<Inbox />}
title="Queue"
-href="/docs/api-reference/queue"
+href="/api-reference/queue"
description="Construct queues, enqueue jobs, manage workers, inspect state."
/>
<Card
icon={<ListTodo />}
title="Task"
-href="/docs/api-reference/task"
+href="/api-reference/task"
description="The @queue.task decorator — options, methods, async variants."
/>
<Card
icon={<CheckCircle2 />}
title="Result"
-href="/docs/api-reference/result"
+href="/api-reference/result"
description="JobResult, polling, awaiting, timeouts, sync and async APIs."
/>
<Card
icon={<Variable />}
title="Context"
-href="/docs/api-reference/context"
+href="/api-reference/context"
description="current_job — progress, logging, cancellation, timeouts inside a task."
/>
<Card
icon={<GitFork />}
title="Canvas"
-href="/docs/api-reference/canvas"
+href="/api-reference/canvas"
description="chain, group, chord — the Celery-compatible composition primitives."
/>
<Card
icon={<Workflow />}
title="Workflows"
-href="/docs/api-reference/workflows"
+href="/api-reference/workflows"
description="The DAG builder — fan-out, fan-in, gates, conditions, sub-workflows."
/>
<Card
icon={<TestTube />}
title="Testing"
-href="/docs/api-reference/testing"
+href="/api-reference/testing"
description="test_mode, MockResource, fixtures for synchronous in-process verification."
/>
<Card
icon={<Terminal />}
title="CLI"
-href="/docs/api-reference/cli"
+href="/api-reference/cli"
description="taskito worker, dashboard, info — every flag and subcommand."
/>
</Cards>
docs/content/docs/api-reference/overview.mdx (8 additions, 8 deletions)
@@ -7,11 +7,11 @@ Complete Python API reference for all public classes and methods.

| Class | Description |
|-------|-------------|
-| [Queue](/docs/api-reference/queue) | Central orchestrator — task registration, enqueue, workers, and all queue operations |
-| [TaskWrapper](/docs/api-reference/task) | Handle returned by `@queue.task()` — `delay()`, `apply_async()`, `map()`, signatures |
-| [JobResult](/docs/api-reference/result) | Handle for an enqueued job — status polling, result retrieval, dependencies |
-| [JobContext](/docs/api-reference/context) | Runtime context inside a running task — job ID, retry count, progress updates |
-| [Canvas](/docs/api-reference/canvas) | Workflow primitives — `Signature`, `chain`, `group`, `chord` |
-| [Workflows](/docs/api-reference/workflows) | DAG workflow builder, `WorkflowRun`, gates, sub-workflows |
-| [Testing](/docs/api-reference/testing) | Test mode, `TestResult`, `MockResource` for unit testing tasks |
-| [CLI](/docs/api-reference/cli) | `taskito` command-line interface — `worker`, `info`, `scaler` |
+| [Queue](/api-reference/queue) | Central orchestrator — task registration, enqueue, workers, and all queue operations |
+| [TaskWrapper](/api-reference/task) | Handle returned by `@queue.task()` — `delay()`, `apply_async()`, `map()`, signatures |
+| [JobResult](/api-reference/result) | Handle for an enqueued job — status polling, result retrieval, dependencies |
+| [JobContext](/api-reference/context) | Runtime context inside a running task — job ID, retry count, progress updates |
+| [Canvas](/api-reference/canvas) | Workflow primitives — `Signature`, `chain`, `group`, `chord` |
+| [Workflows](/api-reference/workflows) | DAG workflow builder, `WorkflowRun`, gates, sub-workflows |
+| [Testing](/api-reference/testing) | Test mode, `TestResult`, `MockResource` for unit testing tasks |
+| [CLI](/api-reference/cli) | `taskito` command-line interface — `worker`, `info`, `scaler` |
docs/content/docs/api-reference/queue/index.mdx (10 additions, 10 deletions)
@@ -10,11 +10,11 @@ The central class for creating and managing a task queue.
<Callout type="info" title="Sub-pages">
The Queue API is split across several pages for readability:

-- **[Job Management](/docs/api-reference/queue/jobs)** — get, list, cancel, archive, replay jobs
-- **[Queue & Stats](/docs/api-reference/queue/queues)** — rate limits, concurrency, pause/resume, statistics, dead letters
-- **[Workers & Hooks](/docs/api-reference/queue/workers)** — run workers, lifecycle hooks, circuit breakers, async methods
-- **[Resources & Locking](/docs/api-reference/queue/resources)** — resource system, distributed locks
-- **[Events & Logs](/docs/api-reference/queue/events)** — event callbacks, webhooks, structured logs
+- **[Job Management](/api-reference/queue/jobs)** — get, list, cancel, archive, replay jobs
+- **[Queue & Stats](/api-reference/queue/queues)** — rate limits, concurrency, pause/resume, statistics, dead letters
+- **[Workers & Hooks](/api-reference/queue/workers)** — run workers, lifecycle hooks, circuit breakers, async methods
+- **[Resources & Locking](/api-reference/queue/resources)** — resource system, distributed locks
+- **[Events & Logs](/api-reference/queue/events)** — event callbacks, webhooks, structured logs
</Callout>

## Constructor
@@ -54,7 +54,7 @@ Queue(
| `result_ttl` | `int \| None` | `None` | Auto-cleanup completed/dead jobs older than this many seconds. `None` disables. |
| `middleware` | `list[TaskMiddleware] \| None` | `None` | Queue-level middleware applied to all tasks. |
| `drain_timeout` | `int` | `30` | Seconds to wait for in-flight tasks during graceful shutdown. |
-| `interception` | `str` | `"off"` | Argument interception mode: `"strict"`, `"lenient"`, or `"off"`. See [Resource System](/docs/guides/resources). |
+| `interception` | `str` | `"off"` | Argument interception mode: `"strict"`, `"lenient"`, or `"off"`. See [Resource System](/guides/resources). |
| `max_intercept_depth` | `int` | `10` | Max recursion depth for argument walking. |
| `recipe_signing_key` | `str \| None` | `None` | HMAC-SHA256 key for proxy recipe integrity. Falls back to `TASKITO_RECIPE_SECRET` env var. |
| `max_reconstruction_timeout` | `int` | `10` | Max seconds allowed for proxy reconstruction. |
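
The `recipe_signing_key` row above names HMAC-SHA256 for recipe integrity. A minimal sketch of how such signing and verification works with the standard library — function names here are illustrative, not taskito's:

```python
# Sign serialized recipe bytes with HMAC-SHA256 and verify them with a
# constant-time comparison, as any HMAC integrity check should.
import hashlib
import hmac

def sign_recipe(recipe_bytes: bytes, key: str) -> str:
    return hmac.new(key.encode(), recipe_bytes, hashlib.sha256).hexdigest()

def verify_recipe(recipe_bytes: bytes, key: str, signature: str) -> bool:
    expected = sign_recipe(recipe_bytes, key)
    return hmac.compare_digest(expected, signature)  # timing-safe compare

sig = sign_recipe(b'{"resource": "db"}', "secret-key")
assert verify_recipe(b'{"resource": "db"}', "secret-key", sig)
assert not verify_recipe(b'{"resource": "db"}', "wrong-key", sig)
```
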
@@ -92,7 +92,7 @@ Queue(
) -> TaskWrapper
```

-Register a function as a background task. Returns a [`TaskWrapper`](/docs/api-reference/task).
+Register a function as a background task. Returns a [`TaskWrapper`](/api-reference/task).

| Parameter | Type | Default | Description |
|---|---|---|---|
@@ -109,7 +109,7 @@ Register a function as a background task. Returns a [`TaskWrapper`](/docs/api-re
| `circuit_breaker` | `dict \| None` | `None` | Circuit breaker config: `{"threshold": 5, "window": 60, "cooldown": 120}`. |
| `middleware` | `list[TaskMiddleware] \| None` | `None` | Per-task middleware, applied in addition to queue-level middleware. |
| `expires` | `float \| None` | `None` | Seconds until the job expires if not started. |
-| `inject` | `list[str] \| None` | `None` | Resource names to inject as keyword arguments. See [Resource System](/docs/guides/resources). |
+| `inject` | `list[str] \| None` | `None` | Resource names to inject as keyword arguments. See [Resource System](/guides/resources). |
| `serializer` | `Serializer \| None` | `None` | Per-task serializer override. Falls back to queue-level serializer. |
| `max_concurrent` | `int \| None` | `None` | Max concurrent running instances. `None` = no limit. |

@@ -157,11 +157,11 @@ queue.enqueue(
) -> JobResult
```

-Enqueue a task for execution. Returns a [`JobResult`](/docs/api-reference/result) handle.
+Enqueue a task for execution. Returns a [`JobResult`](/api-reference/result) handle.

| Parameter | Type | Default | Description |
|---|---|---|---|
-| `depends_on` | `str \| list[str] \| None` | `None` | Job ID(s) this job depends on. See [Dependencies](/docs/guides/advanced-execution). |
+| `depends_on` | `str \| list[str] \| None` | `None` | Job ID(s) this job depends on. See [Dependencies](/guides/advanced-execution). |
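
The `depends_on` gating described above can be sketched in plain Python: a job becomes runnable only once every job ID it depends on has completed. `runnable` and the job names are illustrative, not taskito APIs:

```python
# Illustrative dependency gate for depends_on semantics.
def runnable(job_deps, completed):
    """job_deps: dict of job_id -> list of prerequisite job IDs.
    Returns the jobs whose prerequisites are all in `completed`."""
    return [job for job, deps in job_deps.items()
            if job not in completed and all(d in completed for d in deps)]

deps = {"extract": [], "transform": ["extract"], "load": ["transform"]}
print(runnable(deps, completed=set()))        # ['extract']
print(runnable(deps, completed={"extract"}))  # ['transform']
```
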

### `queue.enqueue_many()`

docs/content/docs/api-reference/queue/resources.mdx (1 addition, 1 deletion)
@@ -63,7 +63,7 @@ queue.load_resources(toml_path: str) -> None
```

Load resource definitions from a TOML file. Must be called before `run_worker()`.
-See [TOML configuration](/docs/guides/resources).
+See [TOML configuration](/guides/resources).

### `queue.health_check()`

docs/content/docs/api-reference/result.mdx (8 additions, 8 deletions)
@@ -6,9 +6,9 @@ description: "Handle for an enqueued job — status polling, result retrieval, d
Handle to an enqueued job. Provides methods to check status and retrieve
results, both synchronously and asynchronously.

-Returned by [`task.delay()`](/docs/api-reference/task),
-[`task.apply_async()`](/docs/api-reference/task),
-[`queue.enqueue()`](/docs/api-reference/queue), and canvas operations.
+Returned by [`task.delay()`](/api-reference/task),
+[`task.apply_async()`](/api-reference/task),
+[`queue.enqueue()`](/api-reference/queue), and canvas operations.

## Properties

@@ -45,7 +45,7 @@ job.progress -> int | None
```

Current progress (0–100) if reported by the task via
-[`current_job.update_progress()`](/docs/api-reference/context). Returns `None`
+[`current_job.update_progress()`](/api-reference/context). Returns `None`
if no progress has been reported. Refreshes from the database.

### `job.error`
@@ -88,7 +88,7 @@ job.dependencies -> list[str]
```

List of job IDs this job depends on. Returns an empty list if the job has no
-dependencies. See [Dependencies](/docs/guides/advanced-execution).
+dependencies. See [Dependencies](/guides/advanced-execution).

### `job.dependents`

@@ -108,8 +108,8 @@ job.to_dict() -> dict
```

Return all job fields as a plain dictionary. Useful for JSON serialization
-(e.g. in the [dashboard](/docs/guides/observability) or
-[FastAPI integration](/docs/guides/integrations)).
+(e.g. in the [dashboard](/guides/observability) or
+[FastAPI integration](/guides/integrations)).

### `job.result()`

@@ -192,7 +192,7 @@ job.stream(
```

Iterate over partial results published by the task via
-[`current_job.publish()`](/docs/api-reference/context). Yields each result as
+[`current_job.publish()`](/api-reference/context). Yields each result as
it arrives, stops when the job reaches a terminal state.

| Parameter | Type | Default | Description |
docs/content/docs/api-reference/task.mdx (5 additions, 5 deletions)
@@ -26,7 +26,7 @@ task.delay(*args, **kwargs) -> JobResult
```

Enqueue the task for background execution using the decorator's default options.
-Returns a [`JobResult`](/docs/api-reference/result) handle.
+Returns a [`JobResult`](/api-reference/result) handle.

```python
@queue.task(priority=5)
@@ -68,7 +68,7 @@ falls back to the decorator's default.
| `timeout` | `int \| None` | `None` | Override timeout in seconds |
| `unique_key` | `str \| None` | `None` | Deduplicate active jobs with same key |
| `metadata` | `str \| None` | `None` | Arbitrary JSON metadata to attach |
-| `depends_on` | `str \| list[str] \| None` | `None` | Job ID(s) this job depends on. See [Dependencies](/docs/guides/advanced-execution). |
+| `depends_on` | `str \| list[str] \| None` | `None` | Job ID(s) this job depends on. See [Dependencies](/guides/advanced-execution). |

```python
job = send_email.apply_async(
@@ -102,8 +102,8 @@ print(results) # [3, 7, 11]
task.s(*args, **kwargs) -> Signature
```

-Create a **mutable** [`Signature`](/docs/api-reference/canvas). In a
-[`chain`](/docs/api-reference/canvas), the previous task's return value is
+Create a **mutable** [`Signature`](/api-reference/canvas). In a
+[`chain`](/api-reference/canvas), the previous task's return value is
prepended to `args`.

```python
@@ -118,7 +118,7 @@ sig = add.s(10)
task.si(*args, **kwargs) -> Signature
```

-Create an **immutable** [`Signature`](/docs/api-reference/canvas). Ignores the
+Create an **immutable** [`Signature`](/api-reference/canvas). Ignores the
previous task's result — arguments are used as-is.
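
The mutable/immutable distinction between `task.s()` and `task.si()` can be modeled in a few lines of plain Python. The `Signature` class below is an illustration of the behavior only, not taskito's actual class:

```python
# Sketch: a mutable signature prepends the previous result to its args;
# an immutable one ignores the previous result entirely.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Signature:
    fn: Callable
    args: tuple = ()
    immutable: bool = False

    def apply(self, prev_result=None):
        if self.immutable or prev_result is None:
            return self.fn(*self.args)
        return self.fn(prev_result, *self.args)  # prepend previous result

def add(x, y):
    return x + y

mutable = Signature(add, (10,))                       # like add.s(10)
immutable = Signature(add, (10, 5), immutable=True)   # like add.si(10, 5)

print(mutable.apply(prev_result=1))    # 11 -- previous result prepended
print(immutable.apply(prev_result=1))  # 15 -- previous result ignored
```
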

```python
docs/content/docs/architecture/failure-model.mdx (2 additions, 2 deletions)
@@ -35,7 +35,7 @@ Same behavior as database unavailable. The scheduler retries on the next poll cy
`claim_execution` prevents two workers from picking up the same job simultaneously.
But if a worker crashes *after* starting execution, the job will be retried —
potentially executing the same task twice. Design tasks to be
-[idempotent](/docs/guides/reliability) to handle this safely.
+[idempotent](/guides/reliability) to handle this safely.

## Recovery timeline

@@ -46,7 +46,7 @@ potentially executing the same task twice. Design tasks to be
If a task completes successfully but the result write to the database fails (e.g.,
database full, connection lost), the job stays in `running` status. The stale reaper
eventually marks it failed and retries it. The task will execute again — make sure
-it's [idempotent](/docs/guides/reliability).
+it's [idempotent](/guides/reliability).

## Jobs without timeouts

docs/content/docs/architecture/index.mdx (8 additions, 8 deletions)
@@ -22,49 +22,49 @@ high-level model, then drill into the layer that interests you.
<Card
icon={<Compass />}
title="Overview"
-href="/docs/architecture/overview"
+href="/architecture/overview"
description="The big picture — Python ↔ PyO3 ↔ Rust core, end to end."
/>
<Card
icon={<GitBranch />}
title="Job lifecycle"
-href="/docs/architecture/job-lifecycle"
+href="/architecture/job-lifecycle"
description="Every state a job moves through, from enqueue to result or dead letter."
/>
<Card
icon={<Layers />}
title="Worker pool"
-href="/docs/architecture/worker-pool"
+href="/architecture/worker-pool"
description="OS-thread pool, prefork children, async executor — how each model dispatches work."
/>
<Card
icon={<CalendarClock />}
title="Scheduler"
-href="/docs/architecture/scheduler"
+href="/architecture/scheduler"
description="The Tokio polling loop — how jobs are picked, claimed, and routed to workers."
/>
<Card
icon={<Database />}
title="Storage"
-href="/docs/architecture/storage"
+href="/architecture/storage"
description="The Storage trait, Diesel for SQLite/Postgres, Redis backend, table schemas."
/>
<Card
icon={<Boxes />}
title="Resources"
-href="/docs/architecture/resources"
+href="/architecture/resources"
description="The 3-layer DI pipeline — argument interception, worker injection, proxy reconstruction."
/>
<Card
icon={<ShieldAlert />}
title="Failure model"
-href="/docs/architecture/failure-model"
+href="/architecture/failure-model"
description="Retries, dead letters, circuit breakers, cancellation — what happens when things go wrong."
/>
<Card
icon={<FileCode2 />}
title="Serialization"
-href="/docs/architecture/serialization"
+href="/architecture/serialization"
description="Cloudpickle, JSON, MsgPack — how arguments and results cross the worker boundary."
/>
</Cards>
docs/content/docs/architecture/overview.mdx (7 additions, 7 deletions)
@@ -13,10 +13,10 @@ worker management.

| Page | What it covers |
|---|---|
-| [Job Lifecycle](/docs/architecture/job-lifecycle) | State machine, status codes, transitions |
-| [Worker Pool](/docs/architecture/worker-pool) | Thread architecture, async dispatch, GIL management |
-| [Storage Layer](/docs/architecture/storage) | SQLite pragmas, schema, indexes, Postgres differences |
-| [Scheduler](/docs/architecture/scheduler) | Poll loop, dispatch flow, periodic tasks |
-| [Resource System](/docs/architecture/resources) | Argument interception, DI, proxy reconstruction |
-| [Failure Model](/docs/architecture/failure-model) | Crash recovery, duplicate execution, partial writes |
-| [Serialization](/docs/architecture/serialization) | Pluggable serializers, format details |
+| [Job Lifecycle](/architecture/job-lifecycle) | State machine, status codes, transitions |
+| [Worker Pool](/architecture/worker-pool) | Thread architecture, async dispatch, GIL management |
+| [Storage Layer](/architecture/storage) | SQLite pragmas, schema, indexes, Postgres differences |
+| [Scheduler](/architecture/scheduler) | Poll loop, dispatch flow, periodic tasks |
+| [Resource System](/architecture/resources) | Argument interception, DI, proxy reconstruction |
+| [Failure Model](/architecture/failure-model) | Crash recovery, duplicate execution, partial writes |
+| [Serialization](/architecture/serialization) | Pluggable serializers, format details |
docs/content/docs/architecture/storage.mdx (1 addition, 1 deletion)
@@ -158,7 +158,7 @@ per-connection).
## Postgres differences

taskito also supports PostgreSQL as an alternative storage backend. See the
-[Postgres backend guide](/docs/guides/operations) for full details.
+[Postgres backend guide](/guides/operations) for full details.

- **Connection pooling**: `r2d2` pool with a default of 10 connections (vs. 8 for SQLite)
- **Schema isolation**: all tables are created inside a configurable PostgreSQL schema