diff --git a/AGENTS.md b/AGENTS.md index 341d8a17..fe520bbe 100644 --- a/AGENTS.md +++ b/AGENTS.md @@ -69,22 +69,34 @@ inference-endpoint benchmark from-config --config config.yaml ``` Dataset Manager --> Load Generator --> Endpoint Client --> External Endpoint | - Metrics Collector (EventRecorder + MetricsReporter) + EventPublisher (events PUB) + | + +---------------+---------------+ + | | + EventLoggerService MetricsAggregatorService + (events.jsonl) (registry → publisher) + | + metrics PUB + | + Main process SUB + | + Report.from_snapshot ``` ### Key Components -| Component | Location | Purpose | -| ------------------- | ------------------------------------------------------------- | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | -| **Load Generator** | `src/inference_endpoint/load_generator/` | Central orchestrator: `BenchmarkSession` owns the lifecycle, `Scheduler` controls timing, `LoadGenerator` issues queries | -| **Endpoint Client** | `src/inference_endpoint/endpoint_client/` | Multi-process HTTP workers communicating via ZMQ IPC. `HTTPEndpointClient` is the main entry point | -| **Dataset Manager** | `src/inference_endpoint/dataset_manager/` | Loads JSONL, HuggingFace, CSV, JSON, Parquet datasets. 
`Dataset` base class with `load_sample()`/`num_samples()` interface | -| **Metrics** | `src/inference_endpoint/metrics/` | `EventRecorder` writes to SQLite, `MetricsReporter` reads and aggregates (QPS, latency, TTFT, TPOT) | -| **Config** | `src/inference_endpoint/config/`, `endpoint_client/config.py` | Pydantic-based YAML schema (`schema.py`), `HTTPClientConfig` (single Pydantic model for CLI/YAML/runtime), `RuntimeSettings` | -| **CLI** | `src/inference_endpoint/main.py`, `commands/benchmark/cli.py` | cyclopts-based, auto-generated from `schema.py` and `HTTPClientConfig` Pydantic models. Flat shorthands via `cyclopts.Parameter(alias=...)` | -| **Async Utils** | `src/inference_endpoint/async_utils/` | `LoopManager` (uvloop + eager_task_factory), ZMQ transport layer, event publisher | -| **OpenAI/SGLang** | `src/inference_endpoint/openai/`, `sglang/` | Protocol adapters and response accumulators for different API formats. `openai_completions` adapter (`completions_adapter.py`) sends pre-tokenized token IDs to `/v1/completions`, bypassing the server chat template — required for gpt-oss-120b on vLLM. `sglang` adapter sends to `/generate` via `input_ids`. Both apply `Harmonize()` client-side. | -| **VideoGen** | `src/inference_endpoint/videogen/` | Adapter for video-generation endpoints (e.g. trtllm-serve `POST /v1/videos/generations`, used by MLPerf WAN2.2-T2V-A14B). Defaults to `response_format=video_path` (server saves video to shared storage and returns path) to avoid large byte payloads; switch to `video_bytes` for accuracy mode. Dataset is ingested via the generic JSONL loader. 
| +| Component | Location | Purpose | +| ---------------------- | ----------------------------------------------------------------- | ----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- | +| **Load Generator** | `src/inference_endpoint/load_generator/` | Central orchestrator: `BenchmarkSession` owns the lifecycle, `Scheduler` controls timing, `LoadGenerator` issues queries. Emits `ERROR` before `COMPLETE` for failed queries (metrics aggregator depends on this order). | +| **Endpoint Client** | `src/inference_endpoint/endpoint_client/` | Multi-process HTTP workers communicating via ZMQ IPC. `HTTPEndpointClient` is the main entry point | +| **Dataset Manager** | `src/inference_endpoint/dataset_manager/` | Loads JSONL, HuggingFace, CSV, JSON, Parquet datasets. `Dataset` base class with `load_sample()`/`num_samples()` interface | +| **Metrics Aggregator** | `src/inference_endpoint/async_utils/services/metrics_aggregator/` | Subprocess. Subscribes to events, aggregates per-sample metrics into a `MetricsRegistry` (counters + HDR-histogram series + raw values), publishes `MetricsSnapshot` over IPC PUB at a configurable cadence (`SessionState`: `INITIALIZE` → `LIVE` → `DRAINING` → {`COMPLETE` \| `INTERRUPTED`}). Final snapshot is atomically written to `final_snapshot.json` as the **primary** Report source; the terminal pub/sub frame is a TUI "run finished" signal only. | +| **Report** | `src/inference_endpoint/metrics/report.py` | `Report.from_snapshot(dict)` — pure-function builder consuming the dict form (`snapshot_to_dict`). 
Reads `final_snapshot.json` directly via `json.loads` (no Struct decode). Plumbs `complete = (state == "complete" and n_pending_tasks == 0)`; renders an explicit warning for `INTERRUPTED` runs. | +| **Config** | `src/inference_endpoint/config/`, `endpoint_client/config.py` | Pydantic-based YAML schema (`schema.py`), `HTTPClientConfig` (single Pydantic model for CLI/YAML/runtime), `RuntimeSettings` | +| **CLI** | `src/inference_endpoint/main.py`, `commands/benchmark/cli.py` | cyclopts-based, auto-generated from `schema.py` and `HTTPClientConfig` Pydantic models. Flat shorthands via `cyclopts.Parameter(alias=...)` | +| **Async Utils** | `src/inference_endpoint/async_utils/` | `LoopManager` (uvloop + eager_task_factory), ZMQ transport layer, generic `MessageCodec[T]`-parametrized pub/sub, event publisher | +| **OpenAI/SGLang** | `src/inference_endpoint/openai/`, `sglang/` | Protocol adapters and response accumulators for different API formats. `openai_completions` adapter (`completions_adapter.py`) sends pre-tokenized token IDs to `/v1/completions`, bypassing the server chat template — required for gpt-oss-120b on vLLM. `sglang` adapter sends to `/generate` via `input_ids`. Both apply `Harmonize()` client-side. | +| **VideoGen** | `src/inference_endpoint/videogen/` | Adapter for video-generation endpoints (e.g. trtllm-serve `POST /v1/videos/generations`, used by MLPerf WAN2.2-T2V-A14B). Defaults to `response_format=video_path` (server saves video to shared storage and returns path) to avoid large byte payloads; switch to `video_bytes` for accuracy mode. Dataset is ingested via the generic JSONL loader. 
| ### Hot-Path Architecture @@ -96,6 +108,16 @@ Multi-process, event-loop design optimized for throughput: - CPU affinity support (`cpu_affinity.py`) for performance tuning - Custom HTTP connection pooling (`http.py`) with `httptools` parser +### Metrics Aggregator subprocess (pub/sub) + +The aggregator is a separate process (`python -m inference_endpoint.async_utils.services.metrics_aggregator`) that subscribes to events and publishes `MetricsSnapshot` messages. Key facts for working in this layer: + +- **Series storage**: each `SeriesSampler` keeps three parallel views: O(1) cheap rollups (count/total/min/max/sum_sq, exact), an HDR Histogram (cheap live percentiles), and an in-memory `array.array` of raw values (for exact percentiles in the `COMPLETE` snapshot). Hot path is `registry.record(name, value)` — no allocation, no I/O. +- **Counter API**: `registry.increment(name, delta=1)` for sample-event counters. `registry.set_counter(name, value)` only for the two duration counters (`total_duration_ns` max-of-elapsed, `tracked_duration_ns` sum-of-blocks). +- **Lifecycle**: `INITIALIZE` (constructed, awaiting first `STARTED`) → `LIVE` (run in progress, ticking every `--publish-interval` seconds) → `DRAINING` (set on `ENDED`; tick continues; bounded by the `--drain-timeout` budget, default 60 s) → terminal: `COMPLETE` (clean end via `publish_final`, exact stats) **or** `INTERRUPTED` (SIGTERM-handler-triggered final; best-effort partial stats). A drain timeout is detected by consumers as `state == COMPLETE and n_pending_tasks > 0`; interrupted runs are detected as `state == INTERRUPTED` directly. +- **Final delivery is dual-path with separated concerns**: `publish_final` atomically writes `final_snapshot.json` (`tmp + fsync(file) + rename + fsync(parent_dir)`) — this is the **primary** Report source — AND emits the terminal-state snapshot over pub/sub as a TUI shutdown signal. 
Each path is wrapped in its own try/except so one failure cannot suppress the other. Main process consumer reads `final_snapshot.json` (via `json.loads` to dict, no Struct decode); falls back to the subscriber's `latest` live snapshot only if the file is missing (e.g. SIGKILL / OOM before the signal handler ran). The dict form is the canonical consumer contract (see `snapshot_to_dict`). +- **Histogram bucket edges are dynamic per snapshot**: log-spaced over the observed `[min, max]`. Bucket count is fixed at construction; consumers MUST re-render from the snapshot's `(lo, hi, count)` triples each frame and MUST NOT track bucket-by-index across snapshots. + ### CLI Modes CLI is auto-generated from `config/schema.py` Pydantic models via cyclopts. Fields annotated with `cyclopts.Parameter(alias="--flag")` get flat shorthands; all other fields get auto-generated dotted flags (kebab-case). @@ -172,9 +194,17 @@ src/inference_endpoint/ │ ├── event_publisher.py # Async event pub/sub │ ├── services/ │ │ ├── event_logger/ # EventLoggerService: writes EventRecords to JSONL/SQLite -│ │ └── metrics_aggregator/ # MetricsAggregatorService: real-time metrics (TTFT, TPOT, ISL, OSL) +│ │ └── metrics_aggregator/ # MetricsAggregatorService: subscribes to events, publishes MetricsSnapshot +│ │ ├── __main__.py # Subprocess entry: --metrics-socket, --metrics-output-dir, --publish-interval, --hdr-sig-figs, --n-histogram-buckets +│ │ ├── aggregator.py # MetricsAggregatorService (event router); SessionState lifecycle; tracked_samples_failed +│ │ ├── snapshot.py # MetricsSnapshot wire schema + SessionState enum + msgpack codec +│ │ ├── registry.py # MetricsRegistry, CounterSampler, SeriesSampler (HDR + raw array.array + cheap rollups) +│ │ ├── publisher.py # MetricsPublisher (tick task + atomic disk fallback) +│ │ ├── subscriber.py # MetricsSnapshotSubscriber (latest + COMPLETE snapshot capture) +│ │ ├── metrics_table.py # In-flight sample rows + trigger dispatch (TTFT/TPOT/ISL/OSL) +│ │ └── 
token_metrics.py # TokenizePool (HF tokenizer thread pool for ISL/OSL/TPOT) │ └── transport/ # ZMQ-based IPC transport layer -│ ├── protocol.py # Transport protocols + TransportConfig base +│ ├── protocol.py # Transport protocols + TransportConfig + MessageCodec[T] │ └── zmq/ # ZMQ implementation (context, pubsub, transport, ZMQTransportConfig) ├── dataset_manager/ │ ├── dataset.py # Dataset base class, DatasetFormat enum @@ -182,8 +212,7 @@ src/inference_endpoint/ │ ├── transforms.py # ColumnRemap and other transforms │ └── predefined/ # Built-in datasets (aime25, cnndailymail, gpqa, etc.) ├── metrics/ -│ ├── recorder.py # EventRecorder (SQLite-backed) -│ ├── reporter.py # MetricsReporter (aggregation) +│ ├── report.py # Report.from_snapshot(dict); display + JSON serialization │ └── metric.py # Metric types (Throughput, etc.) ├── config/ │ ├── schema.py # Single source of truth: Pydantic models + cyclopts annotations @@ -283,11 +312,10 @@ See [Development Guide](docs/DEVELOPMENT.md) for full setup and workflow details - `mock_http_oracle_server` — dataset-driven response server - `dummy_dataset` — in-memory test dataset - `hf_squad_dataset` — HuggingFace squad dataset -- `events_db` — pre-populated SQLite events database - `max_throughput_runtime_settings`, `poisson_runtime_settings`, `concurrency_runtime_settings` — preset configs - `clean_sample_event_hooks` — ensures event hooks are cleared between tests -**Test data**: `tests/datasets/dummy_1k.jsonl` (1000 samples), `tests/datasets/squad_pruned/` +**Test data**: `tests/assets/datasets/dummy_1k.jsonl` (1000 samples), `tests/assets/datasets/squad_pruned/` ### Performance Guidelines @@ -304,22 +332,83 @@ These apply especially to code in the hot path (load generator, endpoint client, ### Key Dependencies -| Package | Purpose | -| -------------- | --------------------------------------------------- | -| `uvloop` | Performance-optimized event loop | -| `httptools` | Fast HTTP parser for custom connection 
pool | -| `msgspec` | Fast serialization for core types and ZMQ transport | -| `pyzmq` | ZMQ IPC between main process and workers | -| `pydantic` | Configuration validation | -| `cyclopts` | CLI framework — auto-generates flags from Pydantic | -| `duckdb` | Data aggregation | -| `transformers` | Tokenization for OSL reporting | +| Package | Purpose | +| -------------- | ---------------------------------------------------------------------- | +| `uvloop` | Performance-optimized event loop | +| `httptools` | Fast HTTP parser for custom connection pool | +| `msgspec` | Fast serialization for core types, ZMQ transport, MetricsSnapshot wire | +| `pyzmq` | ZMQ IPC between main process and workers / metrics aggregator | +| `hdrhistogram` | HDR Histogram for live percentiles in metrics aggregator (C-backed) | +| `pydantic` | Configuration validation | +| `cyclopts` | CLI framework — auto-generates flags from Pydantic | +| `duckdb` | Data aggregation | +| `transformers` | Tokenization for OSL reporting | ### Files to NOT Modify - `src/inference_endpoint/openai/openai_types_gen.py` — auto-generated, excluded from ruff/pre-commit - `src/inference_endpoint/openai/openapi.yaml` — OpenAI API spec, excluded from pre-commit +### Documentation references — no local-only artifacts + +**Code, comments, docstrings, tests, and committed Markdown MUST NOT reference paths that aren't in the repository.** This includes anything under `.gitignore`d directories (e.g. `.cursor_artifacts/`, design scratch dirs, untracked working notes), absolute paths to a contributor's workstation, build outputs, or unmerged branch artifacts. A reviewer fetching the PR should be able to follow every reference cited in the diff. + +**Why:** stale references compound — `See foo.md §3` is meaningless once `foo.md` is gone, renamed, or never existed in the merged tree, and rotting cross-references are how docs stop being trusted. 
AI agents reading the codebase later treat dangling pointers as ground truth and propagate confusion. + +**Allowed:** + +- Paths to files committed to the repo (`docs/...`, `src/...`, `tests/...`, `README.md`, etc.). +- External URLs (issue trackers, PRs, RFCs, vendor docs). +- Generic references to environment/setup that the reader is expected to create themselves (e.g. `source .venv/bin/activate` in a setup README, where `.venv` is the user's local venv). + +**Disallowed examples:** + +- `See .cursor_artifacts/foo_design.md §2` — `.cursor_artifacts/` is gitignored. +- `See ~/work/notes/architecture.txt` — contributor-local. +- `Tracked in metrics_pubsub_design_v5.md test impact section` — same gitignored doc. + +If a design doc is worth referencing from the source tree, commit it to `docs/` or inline the relevant content into the code comment / docstring. For one-off rationale that won't survive the conversation, prefer a self-contained explanation in the comment itself rather than a pointer to ephemera. + +### Comments and docstrings — describe current state, not development history + +**Don't write comments or docstrings that narrate iteration on the codebase.** Pointers to abandoned approaches, prior implementations, or design pivots belong in the PR description and `git log`, not in the source tree. They rot quickly: the prior implementation is gone, the reader has no way to evaluate the comparison, and the scaffolding accumulates with every iteration. Future readers — humans and AI agents alike — treat the comment as if it describes load-bearing context when it's actually historical clutter. + +This applies especially to AI-assisted development, where it's tempting to leave a paper trail of "we tried X first, then switched to Y" inside the source. That paper trail belongs in the PR description. 
+ +**Disallowed patterns:** + +- `# Originally used X, but switched to Y for ...` +- `# An earlier implementation did X — this version does Y` +- `# Removed the foo parameter` / `# Replaced bar with baz` +- `# Note: this used to be sync but is now async` +- `# Regression: an earlier shape did X` — even in regression-test docstrings, drop the narrative framing. +- `# An alternative design considered ... but was rejected because ...` (unless the rejected alternative is a _common_ path a future contributor might re-attempt — in that case, frame it as "**don't** do X because Y", not as developer history). + +**Allowed:** + +- **Current rationale**: `# Uses dict dispatch — hot path measured at sub-ms` (describes why the current design exists; no history). +- **Regression context that doesn't narrate the prior bug's discovery**: `# Without this check, value > hdr_high silently corrupts the histogram total` (describes the bug being prevented, framed as a current invariant — not "we used to have a bug here"). +- **Inline TODO/FIXME** pointing at a tracking issue (URL or issue number, not "we plan to do X eventually"). + +**Rule of thumb:** if removing the comment would leave the code's intent unchanged for someone seeing it for the first time, the comment is fine. If the comment only makes sense to someone who saw the prior version, delete it. + +### Comments and docstrings — no line-of-code estimates + +**Don't reference line counts in comments or docstrings.** Phrasing like "one ~20-line block", "this 50-LOC walker", "(~30 LOC)" rots the moment the code is refactored, conveys no actionable meaning, and adds maintenance burden — every edit nearby risks invalidating the count, and there is no warning when it does. 
+ +**Disallowed patterns:** + +- `# Manual mapping (one ~20-line block) is the source of truth` +- `# ~50 LOC of mirror types below` +- `# This function is 30 lines; consider splitting if it grows past 50` + +**Allowed:** + +- Describe _what_ the code does and _why_, not how big it is. +- If size is genuinely the point (e.g. a perf comment about an inlined hot path), name the property that matters: "kept inlined to avoid the call overhead measured at X µs", not "this is 12 lines". + +**Why:** the value of a comment is the invariant it pins. A line count isn't an invariant — it's an accident of formatting and current scope. Future readers will trust the number; it will be wrong; you've now created a misleading comment instead of an absent one. + ## Keeping AGENTS.md Up to Date **This file is the source of truth for AI agents working in this repo.** If it is stale or wrong, every AI-assisted session starts from a broken foundation. diff --git a/README.md b/README.md index a07b9765..4a06a973 100644 --- a/README.md +++ b/README.md @@ -40,13 +40,13 @@ uv run inference-endpoint probe \ uv run inference-endpoint benchmark offline \ --endpoints http://your-endpoint:8000 \ --model Qwen/Qwen3-8B \ - --dataset tests/datasets/dummy_1k.jsonl + --dataset tests/assets/datasets/dummy_1k.jsonl # Run online benchmark (sustained QPS) uv run inference-endpoint benchmark online \ --endpoints http://your-endpoint:8000 \ --model Qwen/Qwen3-8B \ - --dataset tests/datasets/dummy_1k.jsonl \ + --dataset tests/assets/datasets/dummy_1k.jsonl \ --load-pattern poisson \ --target-qps 100 ``` @@ -59,7 +59,7 @@ uv run python -m inference_endpoint.testing.echo_server --port 8765 & uv run inference-endpoint benchmark offline \ --endpoints http://localhost:8765 \ --model test-model \ - --dataset tests/datasets/dummy_1k.jsonl + --dataset tests/assets/datasets/dummy_1k.jsonl pkill -f echo_server ``` diff --git a/docs/CLI_QUICK_REFERENCE.md b/docs/CLI_QUICK_REFERENCE.md index 8a0e563d..e1659346 100644 
--- a/docs/CLI_QUICK_REFERENCE.md +++ b/docs/CLI_QUICK_REFERENCE.md @@ -13,13 +13,13 @@ Command-line reference for all `inference-endpoint` subcommands, flags, load pat inference-endpoint benchmark offline \ --endpoints URL \ --model Qwen/Qwen3-8B \ - --dataset tests/datasets/dummy_1k.jsonl + --dataset tests/assets/datasets/dummy_1k.jsonl # Online (sustained QPS - requires --load-pattern, --target-qps) inference-endpoint benchmark online \ --endpoints URL \ --model Qwen/Qwen3-8B \ - --dataset tests/datasets/dummy_1k.jsonl \ + --dataset tests/assets/datasets/dummy_1k.jsonl \ --load-pattern poisson \ --target-qps 100 @@ -35,14 +35,14 @@ inference-endpoint benchmark offline \ inference-endpoint benchmark offline \ --endpoints URL \ --model Qwen/Qwen3-8B \ - --dataset tests/datasets/dummy_1k.jsonl \ + --dataset tests/assets/datasets/dummy_1k.jsonl \ --report-dir my_benchmark_report # YAML-based inference-endpoint benchmark from-config --config test.yaml ``` -**Default Test Dataset:** Use `tests/datasets/dummy_1k.jsonl` (1000 samples) for local testing. +**Default Test Dataset:** Use `tests/assets/datasets/dummy_1k.jsonl` (1000 samples) for local testing. **Dataset format:** `--dataset [perf|acc:][,key=value...]` — TOML-style dotted paths. Type prefix is optional (defaults to `perf`): @@ -200,7 +200,7 @@ inference-endpoint benchmark offline \ inference-endpoint benchmark offline \ --endpoints http://localhost:8000 \ --model Qwen/Qwen3-8B \ - --dataset tests/datasets/dummy_1k.jsonl + --dataset tests/assets/datasets/dummy_1k.jsonl ``` ### Production Benchmark diff --git a/docs/LOCAL_TESTING.md b/docs/LOCAL_TESTING.md index 1fe51c0b..91e19bda 100644 --- a/docs/LOCAL_TESTING.md +++ b/docs/LOCAL_TESTING.md @@ -6,7 +6,7 @@ How to run and test the CLI locally using the built-in echo server and the inclu ### 1. 
Prepare Test Environment -**Dataset:** The repo includes `tests/datasets/dummy_1k.jsonl` (1000 samples) +**Dataset:** The repo includes `tests/assets/datasets/dummy_1k.jsonl` (1000 samples) **Format:** Automatically inferred from the file extension. Common local formats include `jsonl`, `json`, `csv`, `parquet`, and HuggingFace datasets. ### 2. Start the Echo Server @@ -74,14 +74,14 @@ Waiting for 5 responses... uv run inference-endpoint -v benchmark offline \ --endpoints http://localhost:8765 \ --model Qwen/Qwen3-8B \ - --dataset tests/datasets/dummy_1k.jsonl \ + --dataset tests/assets/datasets/dummy_1k.jsonl \ --duration 0 # Production test with custom params and report generation uv run inference-endpoint -v benchmark offline \ --endpoints http://localhost:8765 \ --model Qwen/Qwen3-8B \ - --dataset tests/datasets/dummy_1k.jsonl \ + --dataset tests/assets/datasets/dummy_1k.jsonl \ --num-samples 5000 \ --workers 4 \ --report-dir benchmark_report @@ -114,7 +114,7 @@ Cleaning up... uv run inference-endpoint -v benchmark online \ --endpoints http://localhost:8765 \ --model Qwen/Qwen3-8B \ - --dataset tests/datasets/dummy_1k.jsonl \ + --dataset tests/assets/datasets/dummy_1k.jsonl \ --duration 0 \ --load-pattern poisson \ --target-qps 100 \ @@ -154,7 +154,7 @@ uv run inference-endpoint validate-yaml --config offline_template.yaml uv run inference-endpoint benchmark offline \ --endpoints http://localhost:8765 \ --model Qwen/Qwen3-8B \ - --dataset tests/datasets/ds_samples.jsonl \ + --dataset tests/assets/datasets/ds_samples.jsonl \ -v ``` @@ -246,7 +246,7 @@ uv run inference-endpoint probe --endpoints http://localhost:8000 --model Qwen/Q uv run inference-endpoint -v benchmark offline \ --endpoints http://localhost:8000 \ --model Qwen/Qwen3-8B \ - --dataset tests/datasets/dummy_1k.jsonl \ + --dataset tests/assets/datasets/dummy_1k.jsonl \ --workers 4 \ --report-dir benchmark_report @@ -261,14 +261,14 @@ pkill -f echo_server uv run inference-endpoint benchmark offline \ 
--endpoints http://localhost:8765 \ --model Qwen/Qwen3-8B \ - --dataset tests/datasets/dummy_1k.jsonl \ + --dataset tests/assets/datasets/dummy_1k.jsonl \ --report-dir offline_report # Online (Poisson distribution) uv run inference-endpoint benchmark online \ --endpoints http://localhost:8765 \ --model Qwen/Qwen3-8B \ - --dataset tests/datasets/dummy_1k.jsonl \ + --dataset tests/assets/datasets/dummy_1k.jsonl \ --load-pattern poisson \ --target-qps 500 \ --report-dir online_report @@ -277,21 +277,21 @@ uv run inference-endpoint benchmark online \ uv run inference-endpoint benchmark offline \ --endpoints http://localhost:8765 \ --model Qwen/Qwen3-8B \ - --dataset tests/datasets/dummy_1k.jsonl \ + --dataset tests/assets/datasets/dummy_1k.jsonl \ --num-samples 500 # Force streaming on for offline mode (to test TTFT metrics) uv run inference-endpoint benchmark offline \ --endpoints http://localhost:8765 \ --model Qwen/Qwen3-8B \ - --dataset tests/datasets/dummy_1k.jsonl \ + --dataset tests/assets/datasets/dummy_1k.jsonl \ --streaming on # Concurrency mode (fixed concurrent requests) uv run inference-endpoint benchmark online \ --endpoints http://localhost:8765 \ --model Qwen/Qwen3-8B \ - --dataset tests/datasets/dummy_1k.jsonl \ + --dataset tests/assets/datasets/dummy_1k.jsonl \ --load-pattern concurrency \ --concurrency 32 ``` @@ -317,7 +317,7 @@ uv run inference-endpoint benchmark online \ - Use `-v` for INFO logging, `-vv` for DEBUG - Echo server mirrors prompts back - perfect for quick testing without real inference - Press `Ctrl+C` to gracefully interrupt benchmarks -- Default test dataset: `tests/datasets/dummy_1k.jsonl` (1000 samples) +- Default test dataset: `tests/assets/datasets/dummy_1k.jsonl` (1000 samples) **Advanced:** diff --git a/examples/02_ServerBenchmarking/README.md b/examples/02_ServerBenchmarking/README.md index 95f4e477..bfb8e5b9 100644 --- a/examples/02_ServerBenchmarking/README.md +++ b/examples/02_ServerBenchmarking/README.md @@ -49,7 +49,7 @@ 
enroot start -e HF_TOKEN=$HF_TOKEN -m $HF_HOME:/root/.cache/huggingface vllm+vll Once the server is up and running, we can send requests to the endpoint by passing in the endpoint address and model name: ``` -uv run inference-endpoint benchmark offline --endpoints http://localhost:8000 --dataset tests/datasets/dummy_1k.jsonl --model ${MODEL_NAME} +uv run inference-endpoint benchmark offline --endpoints http://localhost:8000 --dataset tests/assets/datasets/dummy_1k.jsonl --model ${MODEL_NAME} ``` # Using a config file diff --git a/pyproject.toml b/pyproject.toml index 3f32cfa1..40a33128 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -63,6 +63,9 @@ dependencies = [ "sentencepiece==0.2.1", "protobuf==7.34.1", "openai_harmony==0.0.8", + # HDR Histogram for live percentile/histogram approximations in the + # metrics aggregator (PyPI: hdrhistogram, importable as hdrh.histogram). + "hdrhistogram==0.10.3", # Color support for cross-platform terminals "colorama==0.4.6", # Fix pytz-2024 import warning @@ -131,7 +134,7 @@ Issues = "https://github.com/mlperf/inference-endpoint/issues" target-version = "py312" line-length = 88 exclude = [ - "tests/datasets/*", + "tests/assets/datasets/*", "src/inference_endpoint/openai/openai_types_gen.py", "src/inference_endpoint/openai/openapi.yaml", "datasets/*", diff --git a/scripts/create_dummy_dataset.py b/scripts/create_dummy_dataset.py index d5a75fae..11800eac 100644 --- a/scripts/create_dummy_dataset.py +++ b/scripts/create_dummy_dataset.py @@ -36,7 +36,7 @@ def create_dummy_dataset(num_samples: int = 1000, output_path: str = None): Args: num_samples: Number of samples to generate - output_path: Output file path (default: tests/datasets/dummy_1k.jsonl) + output_path: Output file path (default: tests/assets/datasets/dummy_1k.jsonl) """ # Create varied prompts prompt_templates = [ @@ -122,7 +122,7 @@ def main(): "--output", "-o", type=str, - help="Output file path (default: tests/datasets/dummy_1k.jsonl)", + help="Output file path 
(default: tests/assets/datasets/dummy_1k.jsonl)", ) args = parser.parse_args() diff --git a/scripts/regenerate_templates.py b/scripts/regenerate_templates.py index 5e407768..6ac86c42 100644 --- a/scripts/regenerate_templates.py +++ b/scripts/regenerate_templates.py @@ -74,14 +74,14 @@ PERF_DATASET = { "name": "perf", "type": "performance", - "path": "", + "path": "", "parser": {"prompt": "text_input"}, } ACC_DATASET = { "name": "accuracy", "type": "accuracy", - "path": "", + "path": "", "eval_method": "exact_match", "parser": {"prompt": "question", "system": "system_prompt"}, "accuracy_config": { diff --git a/src/inference_endpoint/async_utils/services/metrics_aggregator/__main__.py b/src/inference_endpoint/async_utils/services/metrics_aggregator/__main__.py index 50a3163d..35a59b45 100644 --- a/src/inference_endpoint/async_utils/services/metrics_aggregator/__main__.py +++ b/src/inference_endpoint/async_utils/services/metrics_aggregator/__main__.py @@ -17,6 +17,9 @@ import argparse import asyncio +import logging +import signal +from collections.abc import Callable from contextlib import AbstractContextManager, nullcontext from pathlib import Path @@ -25,10 +28,72 @@ from inference_endpoint.async_utils.transport.zmq.ready_check import send_ready_signal from inference_endpoint.utils.logging import setup_logging -from .aggregator import MetricsAggregatorService -from .kv_store import BasicKVStore +from .aggregator import MetricCounterKey, MetricsAggregatorService +from .metrics_table import MetricsTable +from .publisher import MetricsPublisher +from .registry import MetricsRegistry +from .snapshot import MetricsSnapshotCodec from .token_metrics import TokenizePool +logger = logging.getLogger(__name__) + + +def _make_sigterm_handler( + *, + loop: asyncio.AbstractEventLoop, + registry: MetricsRegistry, + publisher: MetricsPublisher, + table: MetricsTable, + shutdown_event: asyncio.Event, +) -> tuple[Callable[[], None], set[asyncio.Task]]: + """Build the SIGTERM handler 
that writes the INTERRUPTED final snapshot. + + Returns ``(handler, pending_tasks)``. ``pending_tasks`` is the + strong-reference container that keeps spawned finalize tasks alive + while they run: asyncio tracks tasks only by weakref, so a task + whose only reference is the local variable inside the handler can + be garbage-collected mid-execution (per Python's asyncio docs). + Each spawned task self-removes from the set via + ``add_done_callback`` once it completes. + + Exposed at module level (rather than nested in ``main()``) so the + GC-safety contract is unit-testable without driving the whole + subprocess lifecycle. + """ + pending_tasks: set[asyncio.Task] = set() + + async def _signal_finalize() -> None: + try: + # Mirror the ENDED-driven path: refresh tracked_duration_ns + # from the table BEFORE publish_final, otherwise an + # interrupted run whose STOP_PERFORMANCE_TRACKING never + # fired would report duration_ns=0 and QPS=N/A in the final + # report even after processing many tracked samples. + registry.set_counter( + MetricCounterKey.TRACKED_DURATION_NS.value, + table.total_tracked_duration_ns, + ) + await publisher.publish_final( + registry, + n_pending_tasks=table.in_flight_tasks_count, + interrupted=True, + ) + except Exception: # noqa: BLE001 — best-effort. 
+ logger.exception( + "metrics aggregator: SIGTERM-triggered publish_final failed" + ) + shutdown_event.set() + + def _on_sigterm() -> None: + logger.warning( + "metrics aggregator received SIGTERM; " "writing INTERRUPTED final snapshot" + ) + task = loop.create_task(_signal_finalize()) + pending_tasks.add(task) + task.add_done_callback(pending_tasks.discard) + + return _on_sigterm, pending_tasks + async def main() -> None: parser = argparse.ArgumentParser( @@ -44,13 +109,49 @@ async def main() -> None: "--socket-name", type=str, required=True, - help="Socket name within socket-dir", + help="EventRecord PUB socket name within socket-dir to subscribe to", ) parser.add_argument( - "--metrics-dir", + "--metrics-socket", type=str, required=True, - help="Directory for mmap-backed metric files (created by the parent process)", + help="IPC socket name (within socket-dir) for the metrics PUB output", + ) + parser.add_argument( + "--metrics-output-dir", + type=Path, + required=True, + help="Directory for the final-snapshot disk fallback (created if missing)", + ) + parser.add_argument( + "--publish-interval", + type=float, + default=0.25, + help="Live snapshot publish interval in seconds (default: 0.25, i.e. 4 Hz)", + ) + parser.add_argument( + "--drain-timeout", + type=float, + default=60.0, + help=( + "Wall-clock budget (seconds) to wait for in-flight async tokenize " + "tasks to finish after ENDED before the aggregator cancels them " + "and emits the final snapshot with n_pending_tasks > 0 " + "(default: 60.0). Increase for long-context / low-worker-count " + "tokenize workloads." 
+ ), + ) + parser.add_argument( + "--hdr-sig-figs", + type=int, + default=3, + help="HDR Histogram significant figures (default: 3)", + ) + parser.add_argument( + "--n-histogram-buckets", + type=int, + default=30, + help="Number of dense histogram buckets per series (default: 30)", ) parser.add_argument( "--tokenizer", @@ -85,7 +186,20 @@ async def main() -> None: args = parser.parse_args() setup_logging(level="INFO") - metrics_dir = Path(args.metrics_dir) + # The parent owns directory setup — `commands/benchmark/execute.py` + # creates `/metrics/` and validates it before launching + # this subprocess. Validate here as a fail-fast contract check so a + # misbehaving launcher (or a manual invocation) surfaces a clear + # error in this subprocess's stderr instead of crashing later on + # the atomic-write path. + metrics_output_dir: Path = args.metrics_output_dir + if not metrics_output_dir.is_dir(): + raise SystemExit( + f"FATAL: --metrics-output-dir {metrics_output_dir!s} does not " + "exist or is not a directory. The parent process is responsible " + "for creating it before launching the aggregator subprocess." 
+ ) + shutdown_event = asyncio.Event() loop = LoopManager().default_loop @@ -102,27 +216,102 @@ async def main() -> None: pool_cm as pool, ManagedZMQContext.scoped(socket_dir=args.socket_dir) as zmq_ctx, ): - kv_store = BasicKVStore(metrics_dir) + registry = MetricsRegistry() + publisher = MetricsPublisher( + MetricsSnapshotCodec(), + zmq_ctx, + args.metrics_socket, + loop, + final_snapshot_path=metrics_output_dir / "final_snapshot.json", + ) try: aggregator = MetricsAggregatorService( args.socket_name, zmq_ctx, loop, topics=None, - kv_store=kv_store, + registry=registry, + publisher=publisher, + publish_interval_s=args.publish_interval, + sig_figs=args.hdr_sig_figs, + n_histogram_buckets=args.n_histogram_buckets, tokenize_pool=pool, streaming=args.streaming, shutdown_event=shutdown_event, + drain_timeout_s=args.drain_timeout, ) aggregator.start() + # SIGTERM only — the parent's ServiceLauncher.kill_all uses + # SIGTERM to kill the aggregator child before an ENDED event + # arrives; without this handler that path leaves the Report + # consumer with no final_snapshot file. The signal-triggered + # snapshot is tagged INTERRUPTED so Report can distinguish + # "parent killed the run" from a clean shutdown. + # publish_final is idempotent (see + # MetricsPublisher._finalized), so racing with the + # ENDED-driven call is safe. + # + # SIGINT is deliberately NOT handled in the same way. On an + # interactive ^C, the OS sends SIGINT to the whole + # foreground process group — parent + child both receive + # it. If we finalized eagerly here, the aggregator would + # write final_snapshot.json from whatever state it had at + # signal time, then exit; samples that completed during the + # parent's own graceful shutdown window would never reach + # the file (the parent eventually emits ENDED on its events + # channel, but `_finalized=True` makes that a no-op). 
The + # parent's clean-shutdown path is what we want to drive the + # aggregator's finalize — so we install a no-op handler for + # SIGINT here, which prevents Python's default + # KeyboardInterrupt and lets the parent control the lifecycle. + on_sigterm, _sigterm_tasks = _make_sigterm_handler( + loop=loop, + registry=registry, + publisher=publisher, + table=aggregator._table, + shutdown_event=shutdown_event, + ) + loop.add_signal_handler(signal.SIGTERM, on_sigterm) + # No-op SIGINT handler: silence the default KeyboardInterrupt + # and let the parent's ENDED-driven path drive shutdown. + loop.add_signal_handler( + signal.SIGINT, + lambda: logger.info( + "metrics aggregator received SIGINT — ignoring " + "(parent's ENDED path is authoritative)" + ), + ) + if args.readiness_path: await send_ready_signal(zmq_ctx, args.readiness_path, args.readiness_id) await shutdown_event.wait() finally: - kv_store.close() + # aclose() awaits the tick task before closing the underlying + # transport, avoiding cancelled-tick-vs-socket-close races. + await publisher.aclose() if __name__ == "__main__": - LoopManager().default_loop.run_until_complete(main()) + # Surface startup / bind / tokenizer-load failures with structured + # context. Without this wrap, the parent's ServiceLauncher only sees + # the non-zero exit code and a raw traceback — no diagnostic context + # to correlate against the parent's logs. The except/raise pattern + # preserves the original exit code (1) and traceback while emitting + # the structured logger.exception line before the interpreter prints + # the trace. + try: + LoopManager().default_loop.run_until_complete(main()) + except SystemExit: + # argparse / explicit sys.exit — already user-facing, don't dress up. + raise + except Exception as e: + # Catch Exception (not BaseException) so KeyboardInterrupt / + # SystemExit propagate untouched — those are control-flow + # signals, not crashes, and labeling them as "crashed" would + # mislead operators. 
The exception type goes first in the log + # message so it's grep-able without scrolling through the + # traceback. + logger.exception("metrics aggregator subprocess crashed (%s)", type(e).__name__) + raise diff --git a/src/inference_endpoint/async_utils/services/metrics_aggregator/aggregator.py b/src/inference_endpoint/async_utils/services/metrics_aggregator/aggregator.py index 0186a26d..781e0f5f 100644 --- a/src/inference_endpoint/async_utils/services/metrics_aggregator/aggregator.py +++ b/src/inference_endpoint/async_utils/services/metrics_aggregator/aggregator.py @@ -20,6 +20,7 @@ import asyncio import logging from enum import Enum +from typing import Final from inference_endpoint.async_utils.transport.zmq.pubsub import ( ZmqMessageSubscriber, @@ -32,10 +33,10 @@ SessionEventType, ) -from .kv_store import KVStore from .metrics_table import ( ChunkDeltaTrigger, IslTrigger, + MetricSeriesKey, MetricsTable, OslTrigger, SampleField, @@ -43,6 +44,9 @@ TpotTrigger, TtftTrigger, ) +from .publisher import MetricsPublisher +from .registry import MetricsRegistry +from .snapshot import SessionState from .token_metrics import TokenizePool logger = logging.getLogger(__name__) @@ -60,15 +64,17 @@ class MetricCounterKey(str, Enum): TOTAL_SAMPLES_FAILED = "total_samples_failed" TRACKED_SAMPLES_ISSUED = "tracked_samples_issued" TRACKED_SAMPLES_COMPLETED = "tracked_samples_completed" + # Failed samples that were within a performance-tracking window. + # Counted at ERROR-event time; correctness depends on + # session.py:_handle_response emitting ERROR before COMPLETE so the + # tracked row still exists when the aggregator sees the ERROR. + TRACKED_SAMPLES_FAILED = "tracked_samples_failed" TRACKED_DURATION_NS = "tracked_duration_ns" # Total wall-clock duration since session start. Updated on every event as - # max(current, event_timestamp - session_start) to be defensive against - # non-monotonic timestamps. 
- # - # An alternative design was considered: store session_start_ns once and - # compute duration as (now - start) on read. This is infeasible because - # time.monotonic_ns() has inconsistent epoch per process — a reader in - # another process would get a meaningless value. + # max(current, event_timestamp - session_start). Stored as a counter + # rather than computed from (now - start) at read time because + # time.monotonic_ns() has a process-local epoch — a reader in another + # process would get a meaningless value. TOTAL_DURATION_NS = "total_duration_ns" @@ -82,46 +88,133 @@ class MetricCounterKey(str, Enum): ) +# HDR bounds per series — chosen conservatively so realistic benchmark +# values cannot fall outside [low, high]. Values outside the range are +# clamped on insert and a warning is logged once per series. +_NS_HDR_LOW: Final[int] = 1 +_NS_HDR_HIGH: Final[int] = 3_600_000_000_000 # 1 hour in ns +_TOKEN_HDR_LOW: Final[int] = 1 +_TOKEN_HDR_HIGH: Final[int] = 10_000_000 # 10M tokens + +_DEFAULT_DRAIN_TIMEOUT_S: Final[float] = 60.0 + + class MetricsAggregatorService(ZmqMessageSubscriber[EventRecord]): """Subscribes to EventRecords and computes per-sample metrics in real time. The aggregator is a thin event router. All state management, trigger - dispatch, and row lifecycle are handled by MetricsTable. The KVStore - is shared between the table (for series metrics via triggers) and the - aggregator (for counter metrics like n_issued, n_completed, etc.). + dispatch, and row lifecycle are handled by ``MetricsTable``. The + ``MetricsRegistry`` holds counters and series; the ``MetricsPublisher`` + publishes ``MetricsSnapshot`` over pub/sub at a fixed cadence and + mirrors the final snapshot to disk. 
""" def __init__( self, *args, - kv_store: KVStore, + registry: MetricsRegistry, + publisher: MetricsPublisher, + publish_interval_s: float, + sig_figs: int, + n_histogram_buckets: int, tokenize_pool: TokenizePool | None = None, streaming: bool = False, shutdown_event: asyncio.Event | None = None, + drain_timeout_s: float = _DEFAULT_DRAIN_TIMEOUT_S, **kwargs, ): + # drain_timeout_s is injected (not derived) because the right + # value is workload-dependent: long-context tokenize-heavy runs + # need more headroom than the default 60 s, and the aggregator + # itself can't measure that ahead of time. Keeping it as an arg + # lets the __main__ CLI flag plumb the user's choice through + # without coupling this class to argparse. super().__init__(EventRecordCodec(), *args, **kwargs) - self._kv_store = kv_store + self._registry = registry + self._publisher = publisher + self._publish_interval_s = publish_interval_s self._tokenize_pool = tokenize_pool + self._streaming = streaming self._shutdown_event = shutdown_event self._shutdown_received = False + self._drain_timeout_s = drain_timeout_s - for key in MetricCounterKey: - kv_store.create_key(key.value, "counter") - - self._total_issued = 0 - self._total_completed = 0 - self._total_failed = 0 - self._tracked_issued = 0 - self._tracked_completed = 0 self._session_start_ns: int | None = None self._total_duration_ns: int = 0 self._total_processed = 0 self._last_log_count = 0 + # Tracks the run's lifecycle state, surfaced on the wire as + # MetricsSnapshot.state. Transitions are forward-only: + # INITIALIZE → LIVE (on first STARTED) → DRAINING (on ENDED) → + # COMPLETE (set implicitly via publish_final). + self._session_state: SessionState = SessionState.INITIALIZE + + # Pre-register all metrics on the registry. Tests can introspect via + # registry.has_counter / has_series. 
+ self._register_metrics(streaming, sig_figs, n_histogram_buckets) - self._table = MetricsTable(kv_store) + self._table = MetricsTable(self._registry) self._register_triggers(streaming) + # ------------------------------------------------------------------ + # Registration helpers + # ------------------------------------------------------------------ + + def _register_metrics( + self, streaming: bool, sig_figs: int, n_histogram_buckets: int + ) -> None: + """Register all counters and series on the registry.""" + for key in MetricCounterKey: + self._registry.register_counter(key.value) + + # Always-present series + self._registry.register_series( + MetricSeriesKey.SAMPLE_LATENCY_NS.value, + hdr_low=_NS_HDR_LOW, + hdr_high=_NS_HDR_HIGH, + sig_figs=sig_figs, + n_histogram_buckets=n_histogram_buckets, + ) + self._registry.register_series( + MetricSeriesKey.ISL.value, + hdr_low=_TOKEN_HDR_LOW, + hdr_high=_TOKEN_HDR_HIGH, + sig_figs=sig_figs, + n_histogram_buckets=n_histogram_buckets, + ) + self._registry.register_series( + MetricSeriesKey.OSL.value, + hdr_low=_TOKEN_HDR_LOW, + hdr_high=_TOKEN_HDR_HIGH, + sig_figs=sig_figs, + n_histogram_buckets=n_histogram_buckets, + ) + + # Streaming-only series + if streaming: + self._registry.register_series( + MetricSeriesKey.TTFT_NS.value, + hdr_low=_NS_HDR_LOW, + hdr_high=_NS_HDR_HIGH, + sig_figs=sig_figs, + n_histogram_buckets=n_histogram_buckets, + ) + self._registry.register_series( + MetricSeriesKey.CHUNK_DELTA_NS.value, + hdr_low=_NS_HDR_LOW, + hdr_high=_NS_HDR_HIGH, + sig_figs=sig_figs, + n_histogram_buckets=n_histogram_buckets, + ) + self._registry.register_series( + MetricSeriesKey.TPOT_NS.value, + hdr_low=_NS_HDR_LOW, + hdr_high=_NS_HDR_HIGH, + sig_figs=sig_figs, + n_histogram_buckets=n_histogram_buckets, + dtype=float, + ) + def _register_triggers(self, streaming: bool) -> None: """Register metric triggers on the table. @@ -129,25 +222,31 @@ def _register_triggers(self, streaming: bool) -> None: when ``streaming=True``. 
""" table = self._table - store = self._kv_store + registry = self._registry pool = self._tokenize_pool loop = self.loop # Always registered - table.add_trigger(SampleField.ISSUED_NS, IslTrigger(store, pool, loop)) - table.add_trigger(SampleField.COMPLETE_NS, SampleLatencyTrigger(store)) - table.add_trigger(SampleField.COMPLETE_NS, OslTrigger(store, pool, loop)) + table.add_trigger(SampleField.ISSUED_NS, IslTrigger(registry, pool, loop)) + table.add_trigger(SampleField.COMPLETE_NS, SampleLatencyTrigger(registry)) + table.add_trigger(SampleField.COMPLETE_NS, OslTrigger(registry, pool, loop)) # Streaming-only if streaming: - table.add_trigger(SampleField.RECV_FIRST_NS, TtftTrigger(store)) - table.add_trigger(SampleField.LAST_RECV_NS, ChunkDeltaTrigger(store)) - table.add_trigger(SampleField.COMPLETE_NS, TpotTrigger(store, pool, loop)) + table.add_trigger(SampleField.RECV_FIRST_NS, TtftTrigger(registry)) + table.add_trigger(SampleField.LAST_RECV_NS, ChunkDeltaTrigger(registry)) + table.add_trigger( + SampleField.COMPLETE_NS, TpotTrigger(registry, pool, loop) + ) + + # ------------------------------------------------------------------ + # Event processing + # ------------------------------------------------------------------ async def process(self, records: list[EventRecord]) -> None: saw_shutdown = False table = self._table - store = self._kv_store + registry = self._registry self._total_processed += len(records) if self._total_processed - self._last_log_count >= 10000: @@ -169,7 +268,7 @@ async def process(self, records: list[EventRecord]) -> None: elapsed = record.timestamp_ns - self._session_start_ns if elapsed > self._total_duration_ns: self._total_duration_ns = elapsed - store.update( + registry.set_counter( MetricCounterKey.TOTAL_DURATION_NS.value, self._total_duration_ns, ) @@ -182,10 +281,42 @@ async def process(self, records: list[EventRecord]) -> None: saw_shutdown = True else: if ev == SessionEventType.STARTED: - self._session_start_ns = record.timestamp_ns 
+ if self._session_start_ns is not None: + # A duplicate STARTED is a producer bug: + # re-assigning _session_start_ns would freeze + # total_duration_ns (the max-of-elapsed guard + # never updates once the start moves forward) + # and corrupt every downstream rate calc for + # the rest of the run. Surface loudly and + # ignore — the publisher.start guard already + # rejects the second tick-task spawn, but + # session-state must also be defended here. + logger.error( + "Duplicate STARTED event received " + "(original at ts=%d, duplicate at ts=%d); " + "ignoring — producer must emit STARTED " + "exactly once per session.", + self._session_start_ns, + record.timestamp_ns, + ) + else: + self._session_start_ns = record.timestamp_ns + self._session_state = SessionState.LIVE + # Now that we have an event loop running, start + # the publisher tick task. The callable is + # invoked once per tick to capture the live + # (state, n_pending_tasks) pair at each emit. + self._publisher.start( + registry, + self._publish_interval_s, + get_runtime_state=lambda: ( + self._session_state, + table.in_flight_tasks_count, + ), + ) table.handle_session_event(record) if ev == SessionEventType.STOP_PERFORMANCE_TRACKING: - store.update( + registry.set_counter( MetricCounterKey.TRACKED_DURATION_NS.value, table.total_tracked_duration_ns, ) @@ -193,11 +324,13 @@ async def process(self, records: list[EventRecord]) -> None: continue # --- Error events --- + # Counted BEFORE the COMPLETE event (session.py emits ERROR + # first), so the tracked row still exists for tracked-failed + # detection. 
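The ERROR-before-COMPLETE ordering dependency described above can be made concrete with a toy model (a hypothetical helper, not part of the patch): COMPLETE removes the tracked row, so an ERROR arriving after COMPLETE can no longer be attributed to the tracked window.

```python
def count_tracked_failed(
    events: list[tuple[str, str]], tracked_rows: set[str]
) -> int:
    """Toy model of the aggregator's tracked-failed counting.

    ERROR counts as tracked-failed only while the sample's row still
    exists; COMPLETE removes the row (mirroring MetricsTable.set_field
    on COMPLETE_NS).
    """
    rows = set(tracked_rows)
    tracked_failed = 0
    for kind, uuid in events:
        if kind == "ERROR" and uuid in rows:
            tracked_failed += 1
        elif kind == "COMPLETE":
            rows.discard(uuid)
    return tracked_failed


# ERROR first: the failure is attributed to the tracked window.
assert count_tracked_failed([("ERROR", "a"), ("COMPLETE", "a")], {"a"}) == 1
# COMPLETE first: the row is already gone and the failure is missed.
assert count_tracked_failed([("COMPLETE", "a"), ("ERROR", "a")], {"a"}) == 0
```

This is why the counter's correctness is pinned to `session.py:_handle_response` emitting ERROR before COMPLETE rather than being order-independent.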
if isinstance(ev, ErrorEventType): - self._total_failed += 1 - store.update( - MetricCounterKey.TOTAL_SAMPLES_FAILED.value, self._total_failed - ) + registry.increment(MetricCounterKey.TOTAL_SAMPLES_FAILED.value) + if record.sample_uuid and table.get_row(record.sample_uuid) is not None: + registry.increment(MetricCounterKey.TRACKED_SAMPLES_FAILED.value) logger.debug("Error event: %s", record) continue @@ -214,16 +347,9 @@ async def process(self, records: list[EventRecord]) -> None: if ev == SampleEventType.ISSUED: table.set_field(uuid, SampleField.ISSUED_NS, ts, record) - self._total_issued += 1 - store.update( - MetricCounterKey.TOTAL_SAMPLES_ISSUED.value, self._total_issued - ) + registry.increment(MetricCounterKey.TOTAL_SAMPLES_ISSUED.value) if table.get_row(uuid) is not None: - self._tracked_issued += 1 - store.update( - MetricCounterKey.TRACKED_SAMPLES_ISSUED.value, - self._tracked_issued, - ) + registry.increment(MetricCounterKey.TRACKED_SAMPLES_ISSUED.value) elif ev == SampleEventType.RECV_FIRST: table.set_field(uuid, SampleField.RECV_FIRST_NS, ts, record) table.set_field(uuid, SampleField.LAST_RECV_NS, ts, record) @@ -233,27 +359,55 @@ async def process(self, records: list[EventRecord]) -> None: # Check if tracked before set_field (which removes the row) is_tracked = table.get_row(uuid) is not None table.set_field(uuid, SampleField.COMPLETE_NS, ts, record) - self._total_completed += 1 - store.update( - MetricCounterKey.TOTAL_SAMPLES_COMPLETED.value, - self._total_completed, - ) + registry.increment(MetricCounterKey.TOTAL_SAMPLES_COMPLETED.value) if is_tracked: - self._tracked_completed += 1 - store.update( - MetricCounterKey.TRACKED_SAMPLES_COMPLETED.value, - self._tracked_completed, - ) + registry.increment(MetricCounterKey.TRACKED_SAMPLES_COMPLETED.value) if saw_shutdown: - logger.info("Draining %d async tasks...", len(table._in_flight_tasks)) - await table.drain_tasks() - logger.info("Async tasks drained") - store.update( + # ENDED has been observed; 
transition to DRAINING so any tick + # that fires before publish_final reflects the new state. + self._session_state = SessionState.DRAINING + logger.info("Draining %d async tasks...", table.in_flight_tasks_count) + # drain_tasks owns the timeout + cancel-and-await sequence so + # the pending count is captured BEFORE done-callbacks empty + # the in-flight set. Reading in_flight_tasks_count out here + # would always be 0 (see drain_tasks docstring). + n_pending = await table.drain_tasks(timeout=self._drain_timeout_s) + if n_pending > 0: + logger.warning( + "drain_tasks timed out after %.1fs; %d async tasks " + "did not complete and were cancelled", + self._drain_timeout_s, + n_pending, + ) + logger.info( + "Async tasks drained (n_pending_tasks=%d at finalize)", n_pending + ) + registry.set_counter( MetricCounterKey.TRACKED_DURATION_NS.value, table.total_tracked_duration_ns, ) - self._finalize() + try: + await self._publisher.publish_final(registry, n_pending_tasks=n_pending) + finally: + # Whatever happens above, the aggregator MUST close the + # publisher and signal shutdown — otherwise the main() + # entry point's `await shutdown_event.wait()` hangs + # forever and the subprocess never exits cleanly. Each + # cleanup step is independently wrapped: a failure in + # aclose must not prevent _finalize, since _finalize is + # what sets the shutdown event. + try: + await self._publisher.aclose() + except Exception: # noqa: BLE001 — best-effort cleanup. + logger.exception( + "metrics: publisher.aclose failed during ENDED finalize" + ) + self._finalize() + + # ------------------------------------------------------------------ + # Lifecycle + # ------------------------------------------------------------------ def _finalize(self) -> None: logger.info( @@ -266,5 +420,8 @@ def _finalize(self) -> None: self.loop.stop() def close(self) -> None: - self._kv_store.close() + try: + self._publisher.close() + except Exception: # noqa: BLE001 — close is best-effort during shutdown. 
+ logger.exception("metrics: publisher close failed") super().close() diff --git a/src/inference_endpoint/async_utils/services/metrics_aggregator/fs_check.py b/src/inference_endpoint/async_utils/services/metrics_aggregator/fs_check.py deleted file mode 100644 index fea99811..00000000 --- a/src/inference_endpoint/async_utils/services/metrics_aggregator/fs_check.py +++ /dev/null @@ -1,140 +0,0 @@ -# SPDX-FileCopyrightText: Copyright (c) 2026 NVIDIA CORPORATION & AFFILIATES. All rights reserved. -# SPDX-License-Identifier: Apache-2.0 -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. -# You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. - -"""Filesystem type detection for mmap ordering decisions. - -On tmpfs (/dev/shm), msync() is a no-op because there is no backing store. -On a real on-disk filesystem, msync() flushes dirty pages to the shared page -cache, which provides write ordering for cross-process mmap readers. - -On ARM (weak memory model), we need msync() to act as an ordering mechanism -between the value write and the count update in _SeriesItem.append(). This -only works on a real filesystem — not tmpfs. 
Detecting the filesystem type -lets us: - - Skip the useless msync() syscall on tmpfs (any architecture) - - Warn if ARM code is running on tmpfs (msync won't provide ordering) -""" - -from __future__ import annotations - -import ctypes -import ctypes.util -import logging -import platform -from pathlib import Path - -logger = logging.getLogger(__name__) - -_TMPFS_MAGIC = 0x01021994 -"""Special tmpfs filesystem header value.""" - - -def _is_tmpfs_via_statfs(path: str) -> bool | None: - """Check filesystem type via libc statfs(2). Returns None if unavailable.""" - try: - lib_name = ctypes.util.find_library("c") - if lib_name is None: - return None - libc = ctypes.CDLL(lib_name, use_errno=True) - - # Allocate a large buffer to account for differently sized statfs - # structs across architectures. f_type is always the first field - # (__SWORD_TYPE / long) at offset 0 on all Linux archs. - buf = ctypes.create_string_buffer(256) - if libc.statfs(path.encode(), buf) != 0: - return None - # f_type is a native-endian long at offset 0 - f_type = ctypes.c_long.from_buffer(buf, 0).value - return f_type == _TMPFS_MAGIC - except (OSError, AttributeError, ValueError): - return None - - -def _is_tmpfs_via_proc_mounts(path: str) -> bool | None: - """Check filesystem type via /proc/mounts. Returns None if unavailable.""" - try: - resolved = str(Path(path).resolve()) - best_match = "" - best_fstype = "" - with open("/proc/mounts") as f: - for line in f: - parts = line.split() - if len(parts) < 3: - continue - mount_point, fstype = parts[1], parts[2] - if resolved.startswith(mount_point) and len(mount_point) > len( - best_match - ): - best_match = mount_point - best_fstype = fstype - if not best_match: - return None - return best_fstype == "tmpfs" - except OSError: - return None - - -def is_tmpfs(path: str | Path) -> bool: - """Check if a path resides on a tmpfs filesystem. - - Tries statfs(2) via ctypes first, falls back to /proc/mounts. 
- Returns False if detection fails (safe default — will call msync). - """ - path_str = str(path) - - result = _is_tmpfs_via_statfs(path_str) - if result is not None: - return result - - result = _is_tmpfs_via_proc_mounts(path_str) - if result is not None: - return result - - logger.warning( - "Could not determine filesystem type for %s " - "(statfs and /proc/mounts both unavailable). " - "Assuming non-tmpfs (msync will be called on every series append).", - path_str, - ) - return False - - -def needs_msync(path: str | Path) -> bool: - """Determine if msync() is needed for mmap write ordering at this path. - - Returns True if msync should be called between value write and count - update in series append. This is needed on ARM when the backing store - is a real filesystem (not tmpfs). - - On x86-64 (TSO), store ordering is guaranteed by hardware — msync is - never needed regardless of filesystem type. - - On ARM with tmpfs, msync is a no-op and won't help — log a warning - since the caller should use an on-disk directory for correct ordering. - """ - if platform.machine() == "x86_64": - return False - - on_tmpfs = is_tmpfs(path) - if on_tmpfs: - logger.warning( - "ARM platform with tmpfs-backed metrics at %s. " - "Python does not support memory fences. " - "Use an on-disk metrics directory for correct cross-process reads.", - path, - ) - return False - - return True diff --git a/src/inference_endpoint/async_utils/services/metrics_aggregator/kv_store.py b/src/inference_endpoint/async_utils/services/metrics_aggregator/kv_store.py deleted file mode 100644 index 9f846234..00000000 --- a/src/inference_endpoint/async_utils/services/metrics_aggregator/kv_store.py +++ /dev/null @@ -1,511 +0,0 @@ -# SPDX-FileCopyrightText: Copyright (c) 2026 NVIDIA CORPORATION & AFFILIATES. All rights reserved. -# SPDX-License-Identifier: Apache-2.0 -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. 
-# You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. - -"""Key-value store for metrics with per-key /dev/shm backing files. - -Each key in the store maps to a KVItem backed by an individual mmap'd file. -Two item types are supported: - -- **counter**: A single float64 value (e.g., error_count, n_in_flight). - File layout: [value: 8B float64] - -- **series**: An append-only list of float64 values with a length header - (e.g., ttft_ns, sample_latency_ns). Rollup stats are computed lazily on read. - File layout: [count: 8B uint64] [v0: 8B float64] [v1: 8B float64] ... - -Write protocol (single writer): - Counter: overwrite the 8-byte value. - Series: write float64 at HEADER + count*8, then update count. - On x86-64, aligned 8-byte stores are atomic (TSO), so readers always - see a consistent state. - -Read protocol (any process): - Counter: read 8 bytes. - Series: read count, then values[:count]. Rollup computed lazily with - incremental progress tracking (_last_rollup_idx). 
-""" - -from __future__ import annotations - -import logging -import math -import mmap -import os -import shutil -import struct -from abc import ABC, abstractmethod -from pathlib import Path -from typing import Literal - -from .fs_check import needs_msync - -# --------------------------------------------------------------------------- -# Series rollup stats (computed on read) -# --------------------------------------------------------------------------- - -logger = logging.getLogger(__name__) - -_HEADER_BYTES = 8 # uint64 count for series -_VALUE_BYTES = 8 # 8 bytes per value (uint64 or float64) -_DEFAULT_CAPACITY = 128 * 1024 # pre-allocate for 128k values (~1 MB) -_DEFAULT_FILE_MODE = 0o600 # rw------- - -# Struct format: endian prefix + per-dtype value character -_ENDIAN = "<" -_STRUCT_CHAR: dict[type, str] = { - int: "Q", # unsigned 64-bit integer - float: "d", # 64-bit IEEE 754 float -} - - -class SeriesStats: - """Lazily-computed statistics over a series of values. - - Rollup stats (count, total, min, max, sum_sq) are computed on read, - not on write. ``_last_rollup_idx`` caches progress so subsequent - reads only process newly appended values. - - When ``dtype=int`` (default), accumulators use Python int for arbitrary - precision with uint64 values. When ``dtype=float``, accumulators - use float (for float64 series). 
- """ - - __slots__ = ( - "count", - "total", - "min_val", - "max_val", - "sum_sq", - "values", - "_last_rollup_idx", - ) - - def __init__(self, values: list | None = None, dtype: type = int) -> None: - self.values: list = values if values is not None else [] - self.count: int = 0 - zero = dtype() - self.total: int | float = zero - self.min_val: int | float = math.inf - self.max_val: int | float = -math.inf - self.sum_sq: int | float = zero - self._last_rollup_idx: int = 0 - if self.values: - self._update_rollup() - - def _update_rollup(self) -> None: - """Incrementally update rollup stats from _last_rollup_idx onward.""" - for v in self.values[self._last_rollup_idx :]: - self.total += v - self.sum_sq += v * v - if v < self.min_val: - self.min_val = v - if v > self.max_val: - self.max_val = v - self.count = len(self.values) - self._last_rollup_idx = self.count - - -# --------------------------------------------------------------------------- -# KVStore ABC -# --------------------------------------------------------------------------- - - -class KVStore(ABC): - """Abstract key-value store for metrics. - - Keys are created with a type (counter or series). Values are updated - via update() and read via get() or snapshot(). Implementations may - back keys with /dev/shm files, Prometheus, or in-memory dicts. - """ - - @abstractmethod - def create_key( - self, - key: str, - key_type: Literal["series", "counter"], - dtype: type = int, - ) -> None: - """Register a new key in the store. - - Args: - key: Key name. - key_type: "counter" (single uint64) or "series" (append-only). - dtype: Value type for series keys (int or float). - Ignored for counters (always int/uint64). - """ - raise NotImplementedError - - @abstractmethod - def update(self, key: str, value: int | float) -> None: - """Update a key. For counters, sets the value. 
For series, appends.""" - raise NotImplementedError - - @abstractmethod - def get(self, key: str) -> int | SeriesStats: - """Read the current value of a key.""" - raise NotImplementedError - - @abstractmethod - def snapshot(self) -> dict[str, int | SeriesStats]: - """Return a dict of all keys and their current values.""" - raise NotImplementedError - - @abstractmethod - def close(self) -> None: - """Release resources.""" - raise NotImplementedError - - -# --------------------------------------------------------------------------- -# KVItem implementations (per-key mmap files) -# --------------------------------------------------------------------------- - - -class _CounterItem: - """Single uint64 value backed by an 8-byte mmap file.""" - - _FMT = f"{_ENDIAN}{_STRUCT_CHAR[int]}" - __slots__ = ("_mm", "_path", "_closed") - - def __init__(self, path: Path) -> None: - self._path = path - self._closed = False - fd = os.open(str(path), os.O_CREAT | os.O_RDWR, _DEFAULT_FILE_MODE) - try: - os.ftruncate(fd, _VALUE_BYTES) - self._mm = mmap.mmap(fd, _VALUE_BYTES) - finally: - os.close(fd) - struct.pack_into(_CounterItem._FMT, self._mm, 0, 0) - - def set(self, value: int) -> None: - if not self._closed: - struct.pack_into(_CounterItem._FMT, self._mm, 0, value) - - def get(self) -> int: - return struct.unpack_from(_CounterItem._FMT, self._mm, 0)[0] - - def close(self) -> None: - if not self._closed: - self._closed = True - self._mm.close() - - -class _CounterReader: - """Reader for a counter item.""" - - _FMT = _CounterItem._FMT - __slots__ = ("_fd", "_mm", "_path") - - def __init__(self, path: Path) -> None: - self._path = path - self._fd: int | None = None - self._mm: mmap.mmap | None = None - if path.exists(): - self._open() - - def _open(self) -> None: - fd = os.open(str(self._path), os.O_RDONLY) - try: - self._mm = mmap.mmap(fd, _VALUE_BYTES, prot=mmap.PROT_READ) - self._fd = fd - except Exception: - os.close(fd) - raise - - def get(self) -> int: - if self._mm is None: - 
-            if self._path.exists():
-                self._open()
-            if self._mm is None:
-                return 0
-        return struct.unpack_from(_CounterReader._FMT, self._mm, 0)[0]
-
-    def close(self) -> None:
-        if self._mm is not None:
-            self._mm.close()
-            self._mm = None
-        if self._fd is not None:
-            os.close(self._fd)
-            self._fd = None
-
-
-class _SeriesItem:
-    """Append-only series backed by an mmap file.
-
-    Default dtype is int (uint64 storage, suitable for nanosecond timestamps).
-    Pass dtype=float for floating-point series.
-    """
-
-    __slots__ = (
-        "_mm",
-        "_capacity",
-        "_count",
-        "_path",
-        "_closed",
-        "_dtype",
-        "_char",
-        "_fmt",
-        "_needs_msync",
-    )
-
-    def __init__(
-        self,
-        path: Path,
-        capacity: int = _DEFAULT_CAPACITY,
-        dtype: type = int,
-    ) -> None:
-        self._path = path
-        self._capacity = capacity
-        self._count = 0
-        self._closed = False
-        self._dtype = dtype
-        self._char = _STRUCT_CHAR[dtype]
-        self._fmt = f"{_ENDIAN}{self._char}"
-        self._needs_msync = needs_msync(path.parent)
-        total = _HEADER_BYTES + capacity * _VALUE_BYTES
-        fd = os.open(str(path), os.O_CREAT | os.O_RDWR, _DEFAULT_FILE_MODE)
-        try:
-            os.ftruncate(fd, total)
-            self._mm = mmap.mmap(fd, total)
-        finally:
-            os.close(fd)
-        struct.pack_into("<Q", self._mm, 0, 0)
-
-    def append(self, value: int | float) -> None:
-        if self._closed:
-            logger.warning("append() called on closed series: %s", self._path)
-            return
-        if not isinstance(value, self._dtype):
-            raise TypeError(
-                f"Expected {self._dtype.__name__}, got {type(value).__name__}"
-            )
-        if self._count >= self._capacity:
-            self._grow()
-        offset = _HEADER_BYTES + self._count * _VALUE_BYTES
-        struct.pack_into(self._fmt, self._mm, offset, value)
-        # Cross-process ordering note: msync between value write and count
-        # update is only needed for concurrent readers. In the current
-        # architecture, the reader (Report builder) runs after the writer
-        # process exits, so process exit flushes all dirty pages and
-        # ordering is guaranteed by the kernel. msync is skipped entirely.
-        # If concurrent reading is ever needed, re-enable via needs_msync():
-        #     if self._needs_msync:
-        #         self._mm.flush()
-        # This has been shown to be a considerable bottleneck on ARM systems;
-        # concurrent read/write and live metrics will require a more
-        # sophisticated redesign.
-        self._count += 1
-        struct.pack_into("<Q", self._mm, 0, self._count)
-
-    def get(self) -> SeriesStats:
-        """Read all values from the mmap and return as SeriesStats."""
-        if self._count == 0:
-            return SeriesStats(dtype=self._dtype)
-        raw = self._mm[_HEADER_BYTES : _HEADER_BYTES + self._count * _VALUE_BYTES]
-        values = list(struct.unpack(f"{_ENDIAN}{self._count}{self._char}", raw))
-        return SeriesStats(values, dtype=self._dtype)
-
-    def close(self) -> None:
-        if not self._closed:
-            self._closed = True
-            self._mm.close()
-
-    def _grow(self) -> None:
-        # Concurrency safety: readers in other processes hold their own mmap of
-        # this file. ftruncate() extends the file and zero-fills the new region;
-        # the reader's existing mmap remains valid (the kernel keeps the mapping
-        # alive independently). The reader detects the size change via fstat()
-        # and remaps. Between ftruncate and the next append(), the new region
-        # contains zeros, but readers are safe because they only read up to the
-        # count header value, which hasn't been updated yet.
-        old_mm = self._mm
-        new_capacity = self._capacity * 2
-        total = _HEADER_BYTES + new_capacity * _VALUE_BYTES
-        fd = os.open(str(self._path), os.O_RDWR)
-        try:
-            os.ftruncate(fd, total)
-            self._mm = mmap.mmap(fd, total)
-            self._capacity = new_capacity
-        except Exception:
-            self._mm = old_mm
-            raise
-        finally:
-            os.close(fd)
-        old_mm.close()
-
-
-class _SeriesReader:
-    """Reader for a series item with lazy rollup."""
-
-    __slots__ = ("_fd", "_mm", "_path", "_stats", "_char")
-
-    def __init__(self, path: Path, dtype: type = int) -> None:
-        self._path = path
-        self._char = _STRUCT_CHAR[dtype]
-        self._stats = SeriesStats(dtype=dtype)
-        self._fd: int | None = None
-        self._mm: mmap.mmap | None = None
-        if path.exists():
-            self._open()
-
-    def _open(self) -> None:
-        fd = os.open(str(self._path), os.O_RDONLY)
-        try:
-            size = os.fstat(fd).st_size
-            if size > 0:
-                self._mm = mmap.mmap(fd, 0, prot=mmap.PROT_READ)
-                self._fd = fd
-            else:
-                os.close(fd)
-        except Exception:
-            os.close(fd)
-            raise
-
-    def get(self) -> SeriesStats:
-        if self._mm is None:
-            if self._path.exists():
-                self._open()
-            if self._mm is None:
-                return self._stats
-
-        # Re-map if file grew
-        file_size = os.fstat(self._fd).st_size  # type: ignore[arg-type]
-        if file_size > self._mm.size():
-            self._mm.close()
-            self._mm = mmap.mmap(self._fd, 0, prot=mmap.PROT_READ)  # type: ignore[arg-type]
-
-        count = struct.unpack_from("<Q", self._mm, 0)[0]
-        old_count = self._stats.count
-        if count > old_count:
-            start_offset = _HEADER_BYTES + old_count * _VALUE_BYTES
-            n_new = count - old_count
-            raw = self._mm[start_offset : start_offset + n_new * _VALUE_BYTES]
-            new_vals = list(struct.unpack(f"{_ENDIAN}{n_new}{self._char}", raw))
-            self._stats.values.extend(new_vals)
-            self._stats._update_rollup()
-
-        return self._stats
-
-    def close(self) -> None:
-        if self._mm is not None:
-            self._mm.close()
-            self._mm = None
-        if self._fd is not None:
-            os.close(self._fd)
-            self._fd = None
-
-
-# ---------------------------------------------------------------------------
-# BasicKVStore (mmap-backed)
-# ---------------------------------------------------------------------------
-
-
-class BasicKVStore(KVStore):
-    """KVStore backed by per-key mmap files on /dev/shm (or any directory).
-
-    Each key gets its own file: counters are 8 bytes, series are append-only
-    with a count header. Suitable for single-writer, multi-reader access.
-    """
-
-    def __init__(self, store_dir: Path) -> None:
-        self._dir = store_dir
-        self._dir.mkdir(parents=True, exist_ok=True)
-        self._items: dict[str, _CounterItem | _SeriesItem] = {}
-
-    def create_key(
-        self,
-        key: str,
-        key_type: Literal["series", "counter"],
-        dtype: type = int,
-    ) -> None:
-        if key in self._items:
-            return
-        path = self._dir / f"{key}.kv"
-        if key_type == "counter":
-            self._items[key] = _CounterItem(path)
-        elif key_type == "series":
-            self._items[key] = _SeriesItem(path, dtype=dtype)
-        else:
-            raise ValueError(f"Unknown key type: {key_type}")
-
-    def update(self, key: str, value: int | float) -> None:
-        item = self._items.get(key)
-        if item is None:
-            raise KeyError(f"Key not created: {key}")
-        if isinstance(item, _CounterItem):
-            item.set(int(value))
-        else:
-            item.append(value)
-
-    def get(self, key: str) -> int | SeriesStats:
-        item = self._items.get(key)
-        if item is None:
-            raise KeyError(f"Key not created: {key}")
-        return item.get()
-
-    def snapshot(self) -> dict[str, int | SeriesStats]:
-        return {key: item.get() for key, item in self._items.items()}
-
-    def close(self) -> None:
-        for item in self._items.values():
-            item.close()
-
-    def unlink(self) -> None:
-        """Close all items and remove the store directory."""
-        self.close()
-        shutil.rmtree(self._dir, ignore_errors=True)
-
-
-class BasicKVStoreReader:
-    """Read-only view of a BasicKVStore from another process.
-
-    Lazily opens files and reads values. Each call to get() or snapshot()
-    picks up new values appended by the writer.
-    """
-
-    def __init__(self, store_dir: Path) -> None:
-        self._dir = store_dir
-        self._readers: dict[str, _CounterReader | _SeriesReader] = {}
-
-    def register_key(
-        self,
-        key: str,
-        key_type: Literal["series", "counter"],
-        dtype: type = int,
-    ) -> None:
-        """Register a key to read. Call before get()/snapshot()."""
-        if key in self._readers:
-            return
-        path = self._dir / f"{key}.kv"
-        if key_type == "counter":
-            self._readers[key] = _CounterReader(path)
-        elif key_type == "series":
-            self._readers[key] = _SeriesReader(path, dtype=dtype)
-
-    def get(self, key: str) -> int | SeriesStats:
-        reader = self._readers.get(key)
-        if reader is None:
-            raise KeyError(f"Key not registered: {key}")
-        return reader.get()
-
-    def snapshot(self) -> dict[str, int | SeriesStats]:
-        return {key: reader.get() for key, reader in self._readers.items()}
-
-    def close(self) -> None:
-        for reader in self._readers.values():
-            reader.close()
diff --git a/src/inference_endpoint/async_utils/services/metrics_aggregator/metrics_table.py b/src/inference_endpoint/async_utils/services/metrics_aggregator/metrics_table.py
index a66c1e8d..ae66821a 100644
--- a/src/inference_endpoint/async_utils/services/metrics_aggregator/metrics_table.py
+++ b/src/inference_endpoint/async_utils/services/metrics_aggregator/metrics_table.py
@@ -29,8 +29,8 @@
 from inference_endpoint.core.types import PromptData, TextModelOutput
 
 if TYPE_CHECKING:
-    from inference_endpoint.async_utils.services.metrics_aggregator.kv_store import (
-        KVStore,
+    from inference_endpoint.async_utils.services.metrics_aggregator.registry import (
+        MetricsRegistry,
     )
     from inference_endpoint.async_utils.services.metrics_aggregator.token_metrics import (
         TokenizePool,
@@ -55,7 +55,7 @@ class SampleField(str, Enum):
 
 
 class MetricSeriesKey(str, Enum):
-    """Series metric keys written by triggers to the KV store."""
+    """Series metric keys written by triggers to the registry."""
 
     ISL = "isl"
     OSL = "osl"
@@ -117,24 +117,26 @@ def duration_ns(self) -> int:
 
 
 class EmitTrigger(ABC):
    """A metric computation that fires when a SampleRow field is set.
 
-    Each trigger has a ``metric_name`` and a ``kv_store`` reference.
-    When ``fire()`` computes a value, it writes directly to
-    ``self.kv_store.update(self.metric_name, value)``.
+    Each trigger has a ``metric_name`` and a ``registry`` reference. When
+    ``fire()`` computes a value, it writes directly via
+    ``self.registry.record(self.metric_name, value)``. Series registration
+    (with HDR bounds, dtype, etc.) is the aggregator's responsibility —
+    the trigger itself never registers metrics.
     """
 
     def __init__(
         self,
         metric_name: str,
-        kv_store: KVStore,
+        registry: MetricsRegistry,
         requires: tuple[str, ...] = (),
         dtype: type = int,
     ):
-        # Resolve enum to its value string so KVStore filenames match
-        # what the reader expects (e.g. "ttft_ns" not "MetricSeriesKey.TTFT_NS").
+        # Resolve enum to its value string so registry names match the
+        # registered series names (e.g. "ttft_ns" not "MetricSeriesKey.TTFT_NS").
         self.metric_name = (
             metric_name.value if isinstance(metric_name, Enum) else metric_name
         )
-        self.kv_store = kv_store
+        self.registry = registry
         self.requires = requires
         self.dtype = dtype
@@ -158,14 +160,19 @@ class TimeDeltaTrigger(EmitTrigger):
     not yet opened for this sample).
     """
 
-    def __init__(self, metric_name: str, kv_store: KVStore, delta_start_fieldname: str):
-        super().__init__(metric_name, kv_store, requires=(delta_start_fieldname,))
+    def __init__(
+        self,
+        metric_name: str,
+        registry: MetricsRegistry,
+        delta_start_fieldname: str,
+    ):
+        super().__init__(metric_name, registry, requires=(delta_start_fieldname,))
         self._delta_start_fieldname = delta_start_fieldname
 
     def fire(self, ev_rec, row, pre_change):
         baseline = pre_change.get(self._delta_start_fieldname)
         if baseline is not None:
-            self.kv_store.update(self.metric_name, ev_rec.timestamp_ns - baseline)
+            self.registry.record(self.metric_name, ev_rec.timestamp_ns - baseline)
         return None
@@ -181,13 +188,13 @@ class AsyncTokenTrigger(EmitTrigger):
     def __init__(
         self,
         metric_name: str,
-        kv_store: KVStore,
+        registry: MetricsRegistry,
         tokenize_pool: TokenizePool | None,
         loop: asyncio.AbstractEventLoop | None,
         requires: tuple[str, ...] = (),
         dtype: type = int,
     ):
-        super().__init__(metric_name, kv_store, requires=requires, dtype=dtype)
+        super().__init__(metric_name, registry, requires=requires, dtype=dtype)
         self._pool = tokenize_pool
         self._loop = loop
@@ -212,7 +219,7 @@ def fire(self, ev_rec, row, pre_change):
             return None
 
         pool, loop = self._pool, self._loop
-        store, name = self.kv_store, self.metric_name
+        registry, name = self.registry, self.metric_name
         uuid = row.sample_uuid
 
         async def _tokenize_and_emit() -> None:
@@ -220,7 +227,7 @@ async def _tokenize_and_emit() -> None:
                 count = await pool.token_count_async(text, loop)
                 value = self._compute_value(count, ev_rec, pre_change)
                 if value is not None:
-                    store.update(name, value)
+                    registry.record(name, value)
             except Exception:
                 logger.exception("%s tokenization failed for %s", name, uuid)
@@ -235,10 +242,10 @@ async def _tokenize_and_emit() -> None:
 class TtftTrigger(TimeDeltaTrigger):
     """TTFT = recv_first_ns (new) - issued_ns."""
 
-    def __init__(self, kv_store: KVStore):
+    def __init__(self, registry: MetricsRegistry):
         super().__init__(
             MetricSeriesKey.TTFT_NS,
-            kv_store,
+            registry,
             delta_start_fieldname=SampleField.ISSUED_NS,
         )
@@ -249,10 +256,10 @@ class ChunkDeltaTrigger(TimeDeltaTrigger):
     Skips when pre-change last_recv_ns is None (first recv via RECV_FIRST).
     """
 
-    def __init__(self, kv_store: KVStore):
+    def __init__(self, registry: MetricsRegistry):
         super().__init__(
             MetricSeriesKey.CHUNK_DELTA_NS,
-            kv_store,
+            registry,
             delta_start_fieldname=SampleField.LAST_RECV_NS,
         )
@@ -260,10 +267,10 @@ def __init__(self, kv_store: KVStore):
 class SampleLatencyTrigger(TimeDeltaTrigger):
     """sample_latency_ns = complete_ns (new) - issued_ns."""
 
-    def __init__(self, kv_store: KVStore):
+    def __init__(self, registry: MetricsRegistry):
         super().__init__(
             MetricSeriesKey.SAMPLE_LATENCY_NS,
-            kv_store,
+            registry,
             delta_start_fieldname=SampleField.ISSUED_NS,
         )
@@ -278,16 +285,16 @@ class IslTrigger(AsyncTokenTrigger):
     def __init__(
         self,
-        kv_store: KVStore,
+        registry: MetricsRegistry,
         tokenize_pool: TokenizePool | None,
         loop: asyncio.AbstractEventLoop | None,
     ):
-        super().__init__(MetricSeriesKey.ISL, kv_store, tokenize_pool, loop)
+        super().__init__(MetricSeriesKey.ISL, registry, tokenize_pool, loop)
 
     def fire(self, ev_rec, row, pre_change):
         # Sync fast path: any backend that pre-populates token_ids (e.g. SGLang).
         if isinstance(ev_rec.data, PromptData) and ev_rec.data.token_ids is not None:
-            self.kv_store.update(self.metric_name, len(ev_rec.data.token_ids))
+            self.registry.record(self.metric_name, len(ev_rec.data.token_ids))
             return None
         # Async path: tokenize raw text — used when token_ids are unavailable
         # (e.g. OpenAI-compatible endpoints). Handled by the base class.
@@ -304,11 +311,11 @@ class OslTrigger(AsyncTokenTrigger):
     def __init__(
         self,
-        kv_store: KVStore,
+        registry: MetricsRegistry,
         tokenize_pool: TokenizePool | None,
         loop: asyncio.AbstractEventLoop | None,
     ):
-        super().__init__(MetricSeriesKey.OSL, kv_store, tokenize_pool, loop)
+        super().__init__(MetricSeriesKey.OSL, registry, tokenize_pool, loop)
 
     def _extract_text(self, ev_rec, row, pre_change):
         if isinstance(ev_rec.data, TextModelOutput):
@@ -334,13 +341,13 @@ class TpotTrigger(AsyncTokenTrigger):
     def __init__(
         self,
-        kv_store: KVStore,
+        registry: MetricsRegistry,
         tokenize_pool: TokenizePool | None,
         loop: asyncio.AbstractEventLoop | None,
     ):
         super().__init__(
             MetricSeriesKey.TPOT_NS,
-            kv_store,
+            registry,
             tokenize_pool,
             loop,
             requires=(SampleField.RECV_FIRST_NS,),
@@ -369,9 +376,10 @@ def _compute_value(self, token_count, ev_rec, pre_change):
 
 
 class MetricsTable:
     """Stores in-flight sample rows, session state, and dispatches triggers.
 
-    Takes a KVStore for metric storage. When triggers are registered via
-    add_trigger(), the table creates the key in the store and wires the
-    store onto the trigger.
+    Takes a ``MetricsRegistry`` for metric storage. Triggers are passed to
+    ``add_trigger`` already wired against the registry. The table does NOT
+    register the underlying series — the aggregator pre-registers all
+    series with explicit HDR bounds before constructing triggers.
 
     Row lifecycle is managed internally via ``set_field``:
     - ISSUED: creates the row if tracking is on, assigns block index.
@@ -381,8 +389,8 @@ class MetricsTable:
     Session state is updated via ``handle_session_event``.
     """
 
-    def __init__(self, kv_store: KVStore) -> None:
-        self._kv_store = kv_store
+    def __init__(self, registry: MetricsRegistry) -> None:
+        self._registry = registry
         self._in_flight: dict[str, SampleRow] = {}
         self._triggers: dict[str, list[EmitTrigger]] = {}
         self._in_flight_tasks: set[asyncio.Task] = set()
@@ -397,10 +405,10 @@ def __init__(self, kv_store: KVStore) -> None:
     def add_trigger(self, field_name: str, trigger: EmitTrigger) -> None:
         """Register a trigger for a SampleRow field.
 
-        Creates the trigger's metric key in the KV store as a series,
-        using the trigger's declared dtype.
+        The trigger's underlying series MUST already be registered on the
+        registry by the aggregator (which knows the right HDR bounds and
+        dtype). The table only stores the trigger reference.
         """
-        self._kv_store.create_key(trigger.metric_name, "series", dtype=trigger.dtype)
         self._triggers.setdefault(field_name, []).append(trigger)
 
     # --- Session event handling ---
@@ -481,11 +489,42 @@ def set_field(
 
     # --- Task draining ---
 
-    async def drain_tasks(self) -> None:
-        """Await all in-flight async trigger tasks."""
-        if self._in_flight_tasks:
+    @property
+    def in_flight_tasks_count(self) -> int:
+        """Number of async trigger tasks currently in flight."""
+        return len(self._in_flight_tasks)
+
+    async def drain_tasks(self, *, timeout: float | None = None) -> int:
+        """Await in-flight async trigger tasks.
+
+        With ``timeout``, the pending set at the timeout boundary is
+        cancelled and awaited; the count of those pending tasks is
+        returned (>0 indicates the drain timed out). Without
+        ``timeout``, blocks indefinitely and returns 0 on clean drain.
+
+        The pending count must be captured BEFORE the cancel-and-await
+        step: each task's ``add_done_callback(_in_flight_tasks.discard)``
+        empties ``_in_flight_tasks`` as cancellation propagates, so
+        reading ``in_flight_tasks_count`` after this method returns
+        would always be 0 — making a drain timeout indistinguishable
+        from a clean run.
+        """
+        if not self._in_flight_tasks:
+            return 0
+        if timeout is None:
             await asyncio.gather(*self._in_flight_tasks, return_exceptions=True)
             self._in_flight_tasks.clear()
+            return 0
+        _, still_pending = await asyncio.wait(
+            list(self._in_flight_tasks), timeout=timeout
+        )
+        n_pending = len(still_pending)
+        if still_pending:
+            for t in still_pending:
+                t.cancel()
+            await asyncio.gather(*still_pending, return_exceptions=True)
+        self._in_flight_tasks.clear()
+        return n_pending
 
     # --- Internal ---
diff --git a/src/inference_endpoint/async_utils/services/metrics_aggregator/publisher.py b/src/inference_endpoint/async_utils/services/metrics_aggregator/publisher.py
new file mode 100644
index 00000000..d21973a3
--- /dev/null
+++ b/src/inference_endpoint/async_utils/services/metrics_aggregator/publisher.py
@@ -0,0 +1,321 @@
+# SPDX-FileCopyrightText: Copyright (c) 2026 NVIDIA CORPORATION & AFFILIATES. All rights reserved.
+# SPDX-License-Identifier: Apache-2.0
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""``MetricsPublisher``: publish ``MetricsSnapshot`` over pub/sub + JSON file."""
+
+from __future__ import annotations
+
+import asyncio
+import json
+import logging
+import os
+from collections.abc import Callable
+from pathlib import Path
+
+from inference_endpoint.async_utils.services.metrics_aggregator.registry import (
+    MetricsRegistry,
+)
+from inference_endpoint.async_utils.services.metrics_aggregator.snapshot import (
+    MetricsSnapshot,
+    MetricsSnapshotCodec,
+    SessionState,
+    snapshot_to_dict,
+)
+from inference_endpoint.async_utils.transport.zmq.context import ManagedZMQContext
+from inference_endpoint.async_utils.transport.zmq.pubsub import ZmqMessagePublisher
+
+logger = logging.getLogger(__name__)
+
+
+class MetricsPublisher:
+    """Periodic snapshot publisher: pub/sub for live, JSON file for final.
+
+    The live tick task runs at ``publish_interval_s`` cadence and publishes
+    a non-final snapshot over pub/sub each tick. ``publish_final``:
+
+    1. Cancels the tick task (and awaits its exit).
+    2. Atomically writes the final snapshot as pretty-printed JSON to
+       ``final_snapshot_path`` — this is the **primary** delivery path
+       and what the Report consumer reads.
+    3. Publishes a (msgpack) terminal-state snapshot over pub/sub as a
+       **TUI shutdown signal** — a future TUI can switch to "final view"
+       on seeing this frame without polling the file. The Report consumer
+       does NOT depend on this pub/sub send.
+
+    Decoupling the file from pub/sub means ``conflate=True`` on the SUB
+    side is unambiguously safe (a TUI that drops the COMPLETE frame just
+    needs to notice the file appeared / the publisher socket dropped),
+    and the file artifact is self-contained: ``cat final_snapshot.json``
+    is the canonical source of truth for a finished run.
+
+    Pub/sub publish and disk write are **independent** best-effort
+    paths: a failure in one MUST NOT suppress the other.
+    """
+
+    def __init__(
+        self,
+        codec: MetricsSnapshotCodec,
+        zmq_ctx: ManagedZMQContext,
+        socket_name: str,
+        loop: asyncio.AbstractEventLoop,
+        final_snapshot_path: Path,
+    ) -> None:
+        # final_snapshot_path is the absolute path the JSON file is written
+        # to. Injected (not derived from output_dir) so tests can place it
+        # in tmp_path without recomputing extension/filename.
+        self._publisher: ZmqMessagePublisher[MetricsSnapshot] = ZmqMessagePublisher(
+            codec,
+            socket_name,
+            zmq_ctx,
+            loop=loop,
+            send_threshold=1,
+            sndhwm=4,
+            linger=10_000,
+        )
+        self._loop = loop
+        self._final_snapshot_path = final_snapshot_path
+        self._tick_task: asyncio.Task | None = None
+        self._closed = False
+        # publish_final is idempotent: the SIGTERM/SIGINT handler in
+        # __main__.py and the aggregator's ENDED-driven path can both
+        # call it; the second call must not re-publish or re-write.
+        self._finalized = False
+
+    # ------------------------------------------------------------------
+    # Live tick task
+    # ------------------------------------------------------------------
+
+    def start(
+        self,
+        registry: MetricsRegistry,
+        publish_interval_s: float,
+        get_runtime_state: Callable[[], tuple[SessionState, int]],
+    ) -> None:
+        """Begin publishing live ticks every ``publish_interval_s`` seconds.
+
+        ``get_runtime_state`` returns ``(state, n_pending_tasks)`` for the
+        current moment: the aggregator's session state (``LIVE`` or
+        ``DRAINING``) and the count of in-flight async tokenize tasks. The
+        callable is invoked once per tick and the values are plumbed into
+        the published snapshot. ``COMPLETE`` is emitted only by
+        ``publish_final``, never by the tick task.
+
+        Idempotent on the tick-task slot: a second call (e.g. from a
+        spurious duplicate ``STARTED`` event or a buggy replay producer)
+        is a no-op rather than orphaning the original task. The original
+        task remains the one cancelled by ``publish_final`` / ``aclose``.
+        """
+        if self._tick_task is not None:
+            logger.warning(
+                "MetricsPublisher.start called again while tick task is "
+                "still running (id=%r); ignoring the second start.",
+                id(self._tick_task),
+            )
+            return
+        if publish_interval_s <= 0:
+            raise ValueError(
+                f"publish_interval_s must be positive, got {publish_interval_s}"
+            )
+
+        async def _tick() -> None:
+            while True:
+                try:
+                    await asyncio.sleep(publish_interval_s)
+                    state, n_pending = get_runtime_state()
+                    snap = registry.build_snapshot(
+                        state=state, n_pending_tasks=n_pending
+                    )
+                    self._publisher.publish(snap)
+                except asyncio.CancelledError:
+                    # Graceful cancellation from publish_final/close.
+                    return
+                except Exception:  # noqa: BLE001 — keep ticking on transient errors.
+                    logger.exception("metrics tick failed; continuing")
+
+        self._tick_task = self._loop.create_task(_tick())
+
+    # ------------------------------------------------------------------
+    # Final delivery
+    # ------------------------------------------------------------------
+
+    async def publish_final(
+        self,
+        registry: MetricsRegistry,
+        *,
+        n_pending_tasks: int,
+        interrupted: bool = False,
+    ) -> None:
+        """Write the final snapshot to disk and signal pub/sub consumers.
+
+        ``n_pending_tasks`` is the count of in-flight async tokenize tasks
+        at finalization time. Drain timeout is detected by Report consumers
+        as ``state == COMPLETE and n_pending_tasks > 0``.
+
+        ``interrupted=True`` is set by the signal handler in __main__.py
+        when SIGTERM/SIGINT triggers shutdown before ``ENDED`` arrived;
+        the resulting snapshot is tagged ``state=INTERRUPTED`` so Report
+        can distinguish "user killed the run mid-execution" from a clean
+        end. Stats in an INTERRUPTED snapshot are best-effort partial
+        captures of whatever the aggregator had at signal time.
+
+        Two delivery channels, independent best-effort:
+
+        1. **JSON file at ``final_snapshot_path``** (primary). Atomic
+           write (tmp + fsync(file) + rename + fsync(parent dir)) so the
+           file is either fully present or absent — partial reads are
+           impossible. Pretty-printed for ``cat`` / ``jq`` use. This is
+           what the Report consumer reads.
+        2. **msgpack pub/sub** (TUI signal). A future TUI uses this as
+           the "run is over, switch to final view" cue without polling
+           the file. The Report consumer does NOT read this channel.
+
+        A failure in one channel MUST NOT suppress the other; each is
+        wrapped in its own try/except.
+
+        Awaits tick-task cancellation BEFORE building the snapshot so a
+        late live tick cannot land after the terminal frame on the wire
+        (which would let a conflate-mode TUI see the live tick instead
+        of the terminal state as the last message).
+
+        Idempotent: only the first call writes/publishes; subsequent
+        calls early-return. The SIGTERM/SIGINT handler relies on this to
+        race safely with the ENDED-driven path.
+        """
+        if self._finalized:
+            return
+        self._finalized = True
+        if self._tick_task is not None:
+            self._tick_task.cancel()
+            try:
+                await self._tick_task
+            except asyncio.CancelledError:
+                # Expected: we just cancelled it.
+                pass
+            self._tick_task = None
+
+        terminal_state = (
+            SessionState.INTERRUPTED if interrupted else SessionState.COMPLETE
+        )
+        snap = registry.build_snapshot(
+            state=terminal_state, n_pending_tasks=n_pending_tasks
+        )
+
+        # Primary: atomic JSON file write. Run on a worker thread because
+        # fsync(file) + fsync(parent dir) can block tens-to-hundreds of ms
+        # on a busy host and would otherwise back-pressure any in-flight
+        # event-record processing on the aggregator's event loop.
+        try:
+            # ``allow_nan=False`` makes a producer-side NaN/Inf leak a
+            # hard error here rather than a silent ``NaN`` / ``Infinity``
+            # token in the file (which strict JSON consumers reject).
+            # ``snapshot_to_dict`` already scrubs non-finite floats to
+            # ``None``, so the only way this raises is a genuine bug.
+            payload = json.dumps(
+                snapshot_to_dict(snap), indent=2, allow_nan=False
+            ).encode("utf-8")
+            await asyncio.to_thread(self._write_atomic_json, payload)
+        except Exception:  # noqa: BLE001 — best-effort; pub/sub still needs to fire.
+            logger.exception("metrics: final JSON snapshot write failed")
+
+        # TUI signal: msgpack pub/sub send. Wrapped so a transport bug
+        # doesn't suppress the file write above and so a SUB-side issue
+        # doesn't crash the aggregator on shutdown. Also legitimately
+        # covers the ENDED-vs-SIGTERM race: if a signal-driven
+        # publish_final raced ahead and reached `aclose()` before this
+        # publish call runs, the underlying ZMQ socket is already
+        # closed and the send raises. Dropping the TUI frame in that
+        # race is acceptable — the JSON file written above is the
+        # authoritative Report source.
+        try:
+            self._publisher.publish(snap)
+        except Exception:  # noqa: BLE001 — best-effort; file is the source of truth.
+            logger.exception("metrics: pub/sub final signal failed")
+
+    def _write_atomic_json(self, payload: bytes) -> None:
+        """Write payload atomically to ``final_snapshot_path``.
+
+        Sequence: write tmp + fsync(tmp) → rename → fsync(parent dir) so
+        the rename itself is durable across crashes. The path either
+        contains the new snapshot or contains the old contents (if any)
+        — never partial bytes. The parent directory is the caller's
+        responsibility — `__main__.py` validates it on startup so a
+        missing directory surfaces in the subprocess's own context
+        rather than as a 30 s parent-side launch timeout.
+        """
+        path = self._final_snapshot_path
+        tmp = path.with_suffix(path.suffix + ".tmp")
+        # Wrap so a failure between tmp-write and rename doesn't leak
+        # the .tmp file across runs (e.g. EXDEV cross-device, parent
+        # directory removed mid-write, permission change). The rename
+        # is the atomicity boundary: before it, .tmp is partial state;
+        # after it, .tmp doesn't exist (rename consumed it).
+        try:
+            with tmp.open("wb") as f:
+                f.write(payload)
+                f.flush()
+                os.fsync(f.fileno())
+            os.rename(tmp, path)
+        except BaseException:
+            tmp.unlink(missing_ok=True)
+            raise
+        dir_fd = os.open(path.parent, os.O_RDONLY)
+        try:
+            os.fsync(dir_fd)
+        finally:
+            os.close(dir_fd)
+
+    # ------------------------------------------------------------------
+    # Lifecycle
+    # ------------------------------------------------------------------
+
+    def close(self) -> None:
+        """Cancel tick task and close the underlying publisher.
+
+        Sync, best-effort. The tick task is cancelled but NOT awaited;
+        if a live tick is mid-publish when this runs, it may print a
+        CancelledError-during-shutdown trace before the loop tears down.
+        Prefer :meth:`aclose` from async contexts to avoid that. This
+        sync form exists for error-path / signal-handler fallbacks where
+        no event loop is reasonably available to await on.
+
+        ``ZmqMessagePublisher.close()`` drains pending frames; bounded by
+        the ``linger=10_000`` ms (10 s) set at construction.
+        """
+        if self._closed:
+            return
+        self._closed = True
+        if self._tick_task is not None:
+            self._tick_task.cancel()
+        self._publisher.close()
+
+    async def aclose(self) -> None:
+        """Async-aware close: cancel the tick task and await its exit.
+
+        Preferred over :meth:`close` whenever the caller is running on
+        an event loop. Eliminates the cancelled-tick-task-vs-publisher-
+        close race that the sync :meth:`close` is exposed to.
+        """
+        if self._closed:
+            return
+        self._closed = True
+        if self._tick_task is not None:
+            self._tick_task.cancel()
+            try:
+                await self._tick_task
+            except asyncio.CancelledError:
+                # Expected: we just cancelled it.
+                pass
+            self._tick_task = None
+        self._publisher.close()
diff --git a/src/inference_endpoint/async_utils/services/metrics_aggregator/registry.py b/src/inference_endpoint/async_utils/services/metrics_aggregator/registry.py
new file mode 100644
index 00000000..1268ba9e
--- /dev/null
+++ b/src/inference_endpoint/async_utils/services/metrics_aggregator/registry.py
@@ -0,0 +1,449 @@
+# SPDX-FileCopyrightText: Copyright (c) 2026 NVIDIA CORPORATION & AFFILIATES. All rights reserved.
+# SPDX-License-Identifier: Apache-2.0
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""Sampler hierarchy and registry for the metrics aggregator.
+
+A ``MetricsRegistry`` holds one ``CounterSampler`` per counter and one
+``SeriesSampler`` per series. The aggregator hot path calls
+``registry.increment(...)`` / ``registry.record(...)`` for every event;
+the publisher periodically calls ``registry.build_snapshot(...)`` to
+materialize a ``MetricsSnapshot``.
+
+Series samplers maintain three parallel views:
+
+1. Cheap exact rollups (count/total/min/max/sum_sq) — O(1), exact.
+2. HDR Histogram — supports cheap live percentiles/histogram.
+3. ``array.array`` of raw values — supports exact final percentiles.
+""" + +from __future__ import annotations + +import array +import bisect +import logging +import math +import time +from abc import ABC, abstractmethod +from typing import Final + +import numpy as np +from hdrh.histogram import HdrHistogram + +from .snapshot import CounterStat, MetricsSnapshot, MetricStat, SeriesStat, SessionState + +logger = logging.getLogger(__name__) + + +# array.array typecodes per dtype. 'q' = signed int64, 'd' = float64. +_ARRAY_TYPECODE: Final[dict[type, str]] = {int: "q", float: "d"} +_NUMPY_DTYPE: Final[dict[type, type]] = {int: np.int64, float: np.float64} + + +class MetricSampler(ABC): + """A single named sampler that builds a ``MetricStat`` on demand.""" + + name: str + + @abstractmethod + def build_stat(self, *, exact: bool) -> MetricStat: + """Materialize the current state into a wire ``MetricStat``. + + ``exact=True`` selects the raw-values-driven computation path used + for the ``COMPLETE`` snapshot (sort + np.percentile/histogram). + ``exact=False`` selects the cheap HDR-derived path used for ``LIVE`` + and ``DRAINING`` snapshots. + """ + raise NotImplementedError + + +# --------------------------------------------------------------------------- +# Counter +# --------------------------------------------------------------------------- + + +class CounterSampler(MetricSampler): + """A monotonic (or settable) counter.""" + + __slots__ = ("name", "_value", "_dtype") + + def __init__(self, name: str, dtype: type = int) -> None: + self.name = name + self._dtype = dtype + # Use the dtype to seed the zero so we keep int/float identity. + self._value: int | float = dtype() + + def increment(self, delta: int | float) -> None: + self._value += delta + + def set(self, value: int | float) -> None: # noqa: A003 — domain term. 
+ self._value = value + + def value(self) -> int | float: + return self._value + + def build_stat(self, *, exact: bool) -> CounterStat: # noqa: ARG002 + # Counters are exact at every tick — the ``exact`` flag is part of + # the sampler protocol but has no effect on counter output. + return CounterStat(name=self.name, value=self._value) + + +# --------------------------------------------------------------------------- +# Series +# --------------------------------------------------------------------------- + + +def _log_spaced_edges(low: float, high: float, n_buckets: int) -> list[float]: + """Return ``n_buckets+1`` log-spaced edges over ``[low, high]``. + + ``low`` is clamped to ``max(low, 1)`` so the log is well-defined for + zero-bound metrics (e.g. token counts starting at 1). + """ + safe_low = max(float(low), 1.0) + safe_high = max(float(high), safe_low * 10.0) + log_lo = math.log(safe_low) + log_hi = math.log(safe_high) + step = (log_hi - log_lo) / n_buckets + return [math.exp(log_lo + i * step) for i in range(n_buckets + 1)] + + +class SeriesSampler(MetricSampler): + """An append-only series sampler with cheap rollups + HDR + raw values.""" + + __slots__ = ( + "name", + "_dtype", + "_hdr", + "_hdr_low", + "_hdr_high", + "_raw", + "_n_histogram_buckets", + "_percentiles", + "_count", + "_total", + "_sum_sq", + "_min", + "_max", + "_warned_clamp", + ) + + def __init__( + self, + name: str, + *, + hdr_low: int, + hdr_high: int, + sig_figs: int, + n_histogram_buckets: int, + percentiles: tuple[float, ...], + dtype: type, + ) -> None: + if dtype not in _ARRAY_TYPECODE: + raise ValueError(f"Unsupported series dtype: {dtype!r}") + self.name = name + self._dtype = dtype + # HDR low must be >=1; a bound of 0 is rejected by the C library. 
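A standalone re-derivation of the edge math in `_log_spaced_edges` (same formula, reproduced here for illustration) shows the decade spacing for a `[1, 1000]` range with 3 buckets:

```python
import math


def log_spaced_edges(low: float, high: float, n_buckets: int) -> list[float]:
    # Clamp low to >= 1 so log() is defined for zero-bound metrics.
    safe_low = max(float(low), 1.0)
    safe_high = max(float(high), safe_low * 10.0)
    log_lo = math.log(safe_low)
    step = (math.log(safe_high) - log_lo) / n_buckets
    return [math.exp(log_lo + i * step) for i in range(n_buckets + 1)]


edges = log_spaced_edges(1, 1000, 3)
# edges ≈ [1.0, 10.0, 100.0, 1000.0] — each bucket spans one decade.
```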
+ self._hdr_low = max(int(hdr_low), 1) + self._hdr_high = int(hdr_high) + # hdrhistogram's C constructor requires `high >= 2*low`; the error + # it raises is opaque ("ValueError: Could not allocate..."), so + # validate up front with both values in the message for callers + # who hit this from a custom registration site. + if self._hdr_high < self._hdr_low * 2: + raise ValueError( + f"{name}: HDR high ({self._hdr_high}) must be >= 2 * low " + f"({self._hdr_low}); got high/low={self._hdr_high / self._hdr_low:.2f}" + ) + self._hdr = HdrHistogram(self._hdr_low, self._hdr_high, sig_figs) + self._raw: array.array = array.array(_ARRAY_TYPECODE[dtype]) + # Bucket count is fixed; edges are derived per snapshot from the + # observed [min, max] so the histogram auto-zooms to the data. + self._n_histogram_buckets = n_histogram_buckets + self._percentiles: tuple[float, ...] = percentiles + + self._count: int = 0 + zero = dtype() + self._total: int | float = zero + self._sum_sq: int | float = zero + self._min: int | float = math.inf + self._max: int | float = -math.inf + self._warned_clamp: bool = False + + # -- hot path ---------------------------------------------------------- + + def record(self, value: int | float) -> None: + # 1. Cheap exact rollups. + self._count += 1 + self._total += value + self._sum_sq += value * value + if value < self._min: + self._min = value + if value > self._max: + self._max = value + + # 2. HDR (clamp into [hdr_low, hdr_high]). + if self._dtype is int: + clamped: int | float = max(int(value), self._hdr_low) + else: + clamped = max(float(value), float(self._hdr_low)) + if clamped > self._hdr_high: + clamped = self._hdr_high + if not self._warned_clamp and clamped != value: + logger.warning( + "%s: value %r outside HDR bounds [%d, %d]; clamped (warn-once)", + self.name, + value, + self._hdr_low, + self._hdr_high, + ) + self._warned_clamp = True + # HDR API accepts ints; coerce floats to int for the HDR view. 
+ self._hdr.record_value(int(clamped)) + + # 3. Raw values for exact-final percentile/histogram computation. + self._raw.append(value) + + # -- snapshot construction -------------------------------------------- + + def build_stat(self, *, exact: bool) -> SeriesStat: + if self._count == 0: + # No data → no histogram. Edges are dynamic and only meaningful + # once min/max are observed; consumers should treat an empty + # histogram as "no data yet". + return SeriesStat( + name=self.name, + count=0, + total=self._dtype(), + min=0, + max=0, + sum_sq=self._dtype(), + percentiles={str(p): 0.0 for p in self._percentiles}, + histogram=[], + ) + + if exact: + return self._exact_stat() + return self._hdr_stat() + + def _hdr_stat(self) -> SeriesStat: + perc_dict: dict[str, float] = { + str(p): float(self._hdr.get_value_at_percentile(p)) + for p in self._percentiles + } + + # Dynamic display edges, log-spaced over the observed [min, max]. + # Re-derived per snapshot: edges auto-zoom to data, no wasted + # buckets. Consumers must re-render from (lo, hi, count) triples + # each frame rather than tracking bucket-by-index. + n_buckets = self._n_histogram_buckets + edges = _log_spaced_edges(self._min, self._max, n_buckets) + counts = [0] * n_buckets + + # Bin HDR sub-bucket counts into the display histogram. Walk the + # recorded iterator (length bounded by distinct sub-buckets, + # typically hundreds to thousands per series, not millions). + for it in self._hdr.get_recorded_iterator(): + v = it.value_iterated_to + c = it.count_added_in_this_iter_step + # Place v into the display bucket [edges[idx], edges[idx+1]). 
+ idx = bisect.bisect_right(edges, v) - 1 + if idx < 0: + idx = 0 + elif idx >= n_buckets: + idx = n_buckets - 1 + counts[idx] += c + + histogram: list[tuple[tuple[float, float], int]] = [ + ((edges[i], edges[i + 1]), counts[i]) for i in range(n_buckets) + ] + + return SeriesStat( + name=self.name, + count=self._count, + total=self._total, + min=self._min, + max=self._max, + sum_sq=self._sum_sq, + percentiles=perc_dict, + histogram=histogram, + ) + + def _exact_stat(self) -> SeriesStat: + np_dtype = _NUMPY_DTYPE[self._dtype] + arr = np.frombuffer(self._raw, dtype=np_dtype) + # method="lower" returns observed values (not interpolated) so + # percentiles round-trip through int dtypes cleanly. + perc_values = np.percentile(arr, self._percentiles, method="lower") + perc_dict = { + str(p): float(v) + for p, v in zip(self._percentiles, perc_values, strict=True) + } + + # Dynamic edges from observed [min, max], same as the live HDR path, + # so consumers see consistent edge semantics across LIVE/DRAINING/ + # COMPLETE. ``_log_spaced_edges`` clamps the lower edge to >=1; clip + # values into the resulting edge range so any value below 1 (rare, + # but possible for sub-clamp raw recordings) lands in the first + # bucket instead of being dropped by np.histogram. Total bucket + # count then equals the recorded count. 
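The display-bucket placement above is a plain `bisect_right` over the edge list with clamping at both ends; a minimal sketch with hypothetical values and fixed edges:

```python
import bisect


def bin_counts(values: list[float], edges: list[float]) -> list[int]:
    """Bin values into len(edges)-1 buckets; out-of-range values clamp inward."""
    n = len(edges) - 1
    counts = [0] * n
    for v in values:
        # bisect_right places v in [edges[idx], edges[idx+1]).
        idx = bisect.bisect_right(edges, v) - 1
        idx = min(max(idx, 0), n - 1)   # clamp instead of dropping
        counts[idx] += 1
    return counts


edges = [1.0, 10.0, 100.0, 1000.0]
# 0.5 clamps into the first bucket; 5000 clamps into the last.
print(bin_counts([0.5, 2, 3, 50, 5000], edges))  # [3, 1, 1]
```

The clamping is what makes the bucket counts sum to the number of recorded values even when a value falls outside the derived edge range.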
+ edges = _log_spaced_edges( + float(self._min), float(self._max), self._n_histogram_buckets + ) + arr_clipped = np.clip(arr, edges[0], edges[-1]) + counts, _ = np.histogram(arr_clipped, bins=edges) + histogram: list[tuple[tuple[float, float], int]] = [ + ((float(edges[i]), float(edges[i + 1])), int(counts[i])) + for i in range(len(edges) - 1) + ] + + return SeriesStat( + name=self.name, + count=self._count, + total=self._total, + min=self._min, + max=self._max, + sum_sq=self._sum_sq, + percentiles=perc_dict, + histogram=histogram, + ) + + +# --------------------------------------------------------------------------- +# Registry +# --------------------------------------------------------------------------- + + +_DEFAULT_PERCENTILES: Final[tuple[float, ...]] = ( + 99.9, + 99.0, + 97.0, + 95.0, + 90.0, + 80.0, + 75.0, + 50.0, + 25.0, + 10.0, + 5.0, + 1.0, +) + + +class MetricsRegistry: + """Central registry of all counter and series samplers.""" + + def __init__(self) -> None: + self._counters: dict[str, CounterSampler] = {} + self._series: dict[str, SeriesSampler] = {} + self._seen_names: set[str] = set() + # Monotonic snapshot emit counter; surfaced on the wire as + # MetricsSnapshot.counter for diagnostic use by consumers. + self._counter: int = 0 + + # -- registration ----------------------------------------------------- + + def register_counter(self, name: str, dtype: type = int) -> CounterSampler: + if name in self._seen_names: + raise ValueError(f"Metric name already registered: {name}") + sampler = CounterSampler(name, dtype=dtype) + self._counters[name] = sampler + self._seen_names.add(name) + return sampler + + def register_series( + self, + name: str, + *, + hdr_low: int, + hdr_high: int, + sig_figs: int = 3, + n_histogram_buckets: int = 30, + percentiles: tuple[float, ...] = _DEFAULT_PERCENTILES, + dtype: type = int, + ) -> SeriesSampler: + """Register a new series. 
+ + ``percentiles`` MUST include ``50.0`` (or ``50``) — median is a + mandatory metric on every series's display rollup, and + ``Report._series_to_metric_dict`` reads p50 from this tuple + rather than recomputing it from raw values. Without p50 the + median fallback degrades to ``(min + max) / 2`` (midrange), + which bears no relationship to the actual median; we reject + such registrations at construction time instead of producing + misleading reports downstream. + """ + if name in self._seen_names: + raise ValueError(f"Metric name already registered: {name}") + if 50.0 not in percentiles and 50 not in percentiles: + raise ValueError( + f"register_series({name!r}): percentiles must include 50.0 — " + f"median is a mandatory metric on every series. Got: " + f"{percentiles!r}" + ) + sampler = SeriesSampler( + name, + hdr_low=hdr_low, + hdr_high=hdr_high, + sig_figs=sig_figs, + n_histogram_buckets=n_histogram_buckets, + percentiles=percentiles, + dtype=dtype, + ) + self._series[name] = sampler + self._seen_names.add(name) + return sampler + + # -- hot path --------------------------------------------------------- + # Direct dict lookup, no isinstance dispatch — these are called once per + # event in the aggregator's process() loop. + + def increment(self, name: str, delta: int | float = 1) -> None: + """Increment a counter by ``delta`` (default 1).""" + self._counters[name].increment(delta) + + def set_counter(self, name: str, value: int | float) -> None: + self._counters[name].set(value) + + def record(self, name: str, value: int | float) -> None: + self._series[name].record(value) + + # -- snapshot --------------------------------------------------------- + + def build_snapshot( + self, *, state: SessionState, n_pending_tasks: int + ) -> MetricsSnapshot: + # Exact (raw-values) computation is reserved for the COMPLETE snapshot; + # live and draining snapshots use the cheap HDR path. 
+ exact = state == SessionState.COMPLETE + self._counter += 1 + metrics: list[MetricStat] = [] + for c_sampler in self._counters.values(): + metrics.append(c_sampler.build_stat(exact=exact)) + for s_sampler in self._series.values(): + metrics.append(s_sampler.build_stat(exact=exact)) + return MetricsSnapshot( + counter=self._counter, + timestamp_ns=time.monotonic_ns(), + state=state, + n_pending_tasks=n_pending_tasks, + metrics=metrics, + ) + + # -- introspection (mostly for tests) -------------------------------- + + def has_counter(self, name: str) -> bool: + return name in self._counters + + def has_series(self, name: str) -> bool: + return name in self._series diff --git a/src/inference_endpoint/async_utils/services/metrics_aggregator/snapshot.py b/src/inference_endpoint/async_utils/services/metrics_aggregator/snapshot.py new file mode 100644 index 00000000..95c68ab1 --- /dev/null +++ b/src/inference_endpoint/async_utils/services/metrics_aggregator/snapshot.py @@ -0,0 +1,277 @@ +# SPDX-FileCopyrightText: Copyright (c) 2026 NVIDIA CORPORATION & AFFILIATES. All rights reserved. +# SPDX-License-Identifier: Apache-2.0 +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +"""Wire schema and codec for metrics snapshots published over pub/sub. + +The aggregator subprocess publishes ``MetricsSnapshot`` messages at a fixed +cadence. 
Each snapshot carries a ``SessionState`` (``LIVE`` during the run, +``DRAINING`` between ``ENDED`` and the final publish, ``COMPLETE`` for the +last snapshot). The snapshot is the only public wire format between the +aggregator and any consumer (main process, future TUI). +""" + +from __future__ import annotations + +import math +from enum import Enum +from typing import ClassVar, Final + +import msgspec +import msgspec.msgpack +from inference_endpoint.core.record import TOPIC_FRAME_SIZE + + +class SessionState(str, Enum): + """The aggregator's session state at the time a snapshot was emitted. + + INITIALIZE → aggregator has been constructed but no ``STARTED`` event + has arrived yet. The tick task is not running, so consumers + should not see a snapshot in this state on the wire today; + it exists so the in-process state machine has a well-defined + starting point (and so future setup-phase ticks have a + state to carry). + LIVE → run in progress; tick task publishing live HDR-derived stats. + DRAINING → ``SessionEventType.ENDED`` has been received; the aggregator + is awaiting the in-flight async tokenize tasks (bounded by + the ``--drain-timeout`` budget, default 60 s). Tick task + continues at this stage, still HDR-derived; no new events + will arrive. + COMPLETE → terminal clean state. The ``publish_final()`` snapshot + written from the ``ENDED`` path. Percentiles and histograms + are exact (computed from raw values). + INTERRUPTED → terminal interrupted state. The ``publish_final()`` snapshot + written from a signal handler (SIGTERM / SIGINT) before + ``ENDED`` arrived. Stats are best-effort partial captures of + whatever the aggregator had at signal time — drain didn't + complete and raw values may be missing late samples. + Distinguishes "user killed the run" from "clean shutdown"; + Report renders this with a clear interrupted indicator. 
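The forward-only transition rule can be made executable. A sketch — the `_ALLOWED` table below is an assumption distilled from the docstring (including a direct `LIVE → INTERRUPTED` edge for a signal arriving before `ENDED`), not code from this module:

```python
from enum import Enum


class State(str, Enum):
    INITIALIZE = "initialize"
    LIVE = "live"
    DRAINING = "draining"
    COMPLETE = "complete"
    INTERRUPTED = "interrupted"


# Forward-only edges; COMPLETE / INTERRUPTED are terminal (no outgoing edges).
_ALLOWED: dict[State, set[State]] = {
    State.INITIALIZE: {State.LIVE},
    State.LIVE: {State.DRAINING, State.INTERRUPTED},
    State.DRAINING: {State.COMPLETE, State.INTERRUPTED},
    State.COMPLETE: set(),
    State.INTERRUPTED: set(),
}


def transition(current: State, nxt: State) -> State:
    if nxt not in _ALLOWED[current]:
        raise ValueError(f"illegal transition {current.value} -> {nxt.value}")
    return nxt
```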
+ + Transitions are forward-only: + INITIALIZE → LIVE → DRAINING → {COMPLETE | INTERRUPTED} + No state ever moves backward, and the terminal states (COMPLETE, + INTERRUPTED) are not re-entrant (``MetricsPublisher._finalized`` + enforces a single publish_final call). + + Drain timeout is detected as ``state == COMPLETE and n_pending_tasks > 0``. + Interrupted-run is detected as ``state == INTERRUPTED`` directly. + """ + + INITIALIZE = "initialize" + LIVE = "live" + DRAINING = "draining" + COMPLETE = "complete" + INTERRUPTED = "interrupted" + + +class CounterStat( + msgspec.Struct, + tag="counter", + frozen=True, + array_like=True, +): # type: ignore[call-arg] + """A single counter value (e.g. ``total_samples_issued``).""" + + name: str + value: int | float + + +class SeriesStat( + msgspec.Struct, + tag="series", + frozen=True, + array_like=True, +): # type: ignore[call-arg] + """Aggregated statistics for a single series (e.g. ``ttft_ns``). + + For LIVE/DRAINING snapshots, ``percentiles`` and ``histogram`` come from + a live HDR Histogram. For COMPLETE snapshots they are computed exactly + from the full in-memory raw values. + + Histogram bucket edges are **dynamic per snapshot**: log-spaced over the + observed ``[min, max]`` of the data so far. The bucket count is fixed + at construction (default 30); the edges auto-zoom each frame so no + buckets are wasted on empty range. Empty series (no recordings) emit + ``histogram=[]``. + + Consumers MUST re-render from ``(lo, hi, count)`` triples each frame + and MUST NOT track bucket-by-index across snapshots — bucket ``i`` is + not guaranteed to span the same range in consecutive snapshots. + """ + + name: str + count: int + total: int | float + min: int | float # noqa: A003 — wire field name; collides with builtin only here. + max: int | float # noqa: A003 — wire field name; collides with builtin only here. 
+ sum_sq: int | float + percentiles: dict[str, float] + histogram: list[tuple[tuple[float, float], int]] + + +# Tagged union: msgspec dispatches on the ``tag`` literal at decode time. +MetricStat = CounterStat | SeriesStat + + +class MetricsSnapshot( + msgspec.Struct, + frozen=True, + array_like=True, +): # type: ignore[call-arg] + """A single point-in-time view of all aggregator metrics. + + Fields: + counter: Monotonic emit count, incremented by the producing + ``MetricsRegistry`` on every ``build_snapshot()`` + call. Resets only on aggregator restart. Consumers + can use it to detect dropped/out-of-order delivery + or producer restarts. Diagnostic only — not used + for ordering on the wire. Unrelated to the + ``CounterStat`` metric kind in ``metrics``. + timestamp_ns: ``time.monotonic_ns()`` from the aggregator process + at snapshot composition time. Producer-local; not + comparable across processes. + state: ``SessionState`` enum — ``INITIALIZE``, ``LIVE``, + ``DRAINING``, ``COMPLETE``, or ``INTERRUPTED``. See + the enum docstring. Terminal states (``COMPLETE``, + ``INTERRUPTED``) mark the last snapshot of the run; + for ``COMPLETE`` snapshots percentiles and + histograms are exact, otherwise HDR-derived. + n_pending_tasks: Count of in-flight async tokenize tasks at snapshot + composition time. ``> 0`` during normal load (ISL/ + OSL/TPOT post-processing in flight) and during the + drain phase. **Drain timeout is detected as** + ``state == COMPLETE and n_pending_tasks > 0``: the + aggregator gave up draining; some async-only series + are missing samples that were still being tokenized. + metrics: Tagged union of ``CounterStat`` and ``SeriesStat``, + ordered counters-first then series, registration + order within each. + """ + + counter: int + timestamp_ns: int + state: SessionState + n_pending_tasks: int + metrics: list[MetricStat] + + +# 4-byte topic to match TOPIC_FRAME_SIZE-prefix protocol used by the +# pub/sub layer. 
The topic is null-padded to TOPIC_FRAME_SIZE on the wire. +METRICS_SNAPSHOT_TOPIC: Final[bytes] = b"MET\x00".ljust(TOPIC_FRAME_SIZE, b"\x00") + + +# --------------------------------------------------------------------------- +# Dict form of a snapshot. +# +# This is the shape used by: +# - the persisted ``final_snapshot.json`` file (writer in ``publisher.py``) +# - ``Report.from_snapshot`` as its canonical input +# +# The wire ``MetricsSnapshot`` Struct uses ``array_like=True`` for compact +# msgpack on the pub/sub hot path — that encoding is positional, which is +# wrong for both file storage (unreadable JSON arrays) and for consumer +# code that wants to read fields by name. ``snapshot_to_dict`` is the +# one-way bridge from the wire form to the consumer form. +# +# There is intentionally no inverse: consumers operate on the dict +# directly with ``dict.get(key, default)``. Decoding a dict back into an +# ``array_like=True`` Struct is ergonomically painful (msgspec's decoders +# follow the Struct's array_like flag), and the consumer doesn't need it. +# --------------------------------------------------------------------------- + + +def _scrub_nonfinite(v): + """Map non-finite floats (``NaN`` / ``±Inf``) to ``None``. + + The dict form is consumed by ``json.dumps(..., allow_nan=False)``, + which rejects non-finite floats so the producer-side bug surfaces + loudly rather than silently writing ``NaN`` / ``Infinity`` literals + that ``jq``, Go's ``encoding/json``, and any strict-JSON consumer + refuse to parse. Mapping non-finite to ``None`` keeps the JSON + strict and self-describes the gap to the consumer. + """ + if isinstance(v, float) and not math.isfinite(v): + return None + return v + + +def snapshot_to_dict(snap: MetricsSnapshot) -> dict: + """Convert a wire ``MetricsSnapshot`` to its dict form. + + Manual mapping is the source of truth for the dict schema. 
When + adding a field to ``MetricsSnapshot`` (or ``CounterStat`` / + ``SeriesStat``), update this function so the field appears in both + the persisted JSON file and the input to ``Report.from_snapshot``. + """ + return { + "counter": snap.counter, + "timestamp_ns": snap.timestamp_ns, + "state": snap.state.value, + "n_pending_tasks": snap.n_pending_tasks, + "metrics": [_metric_to_dict(m) for m in snap.metrics], + } + + +def _metric_to_dict(m: MetricStat) -> dict: + if isinstance(m, CounterStat): + return {"type": "counter", "name": m.name, "value": _scrub_nonfinite(m.value)} + return { + "type": "series", + "name": m.name, + "count": m.count, + "total": _scrub_nonfinite(m.total), + "min": _scrub_nonfinite(m.min), + "max": _scrub_nonfinite(m.max), + "sum_sq": _scrub_nonfinite(m.sum_sq), + "percentiles": {k: _scrub_nonfinite(v) for k, v in m.percentiles.items()}, + # Histogram tuples → JSON arrays. Consumers reading the dict can + # iterate the two-element ranges directly without coercion. + # Bucket edges are floats from log-spacing — scrub for safety. + "histogram": [ + [[_scrub_nonfinite(rng[0]), _scrub_nonfinite(rng[1])], c] + for rng, c in m.histogram + ], + } + + +class MetricsSnapshotCodec: + """``MessageCodec[MetricsSnapshot]`` — binds pub/sub layer to msgpack. + + Implements the structural ``MessageCodec`` Protocol from + ``inference_endpoint.async_utils.transport.protocol`` without importing + it (avoids a transport→service back-import). Mirrors the pattern in + ``EventRecordCodec``. 
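The scrub-plus-strict-dump interplay described above is easy to demo in isolation with the stdlib `json` module:

```python
import json
import math


def scrub(v):
    """Map NaN / ±Inf to None so strict JSON encoding succeeds."""
    if isinstance(v, float) and not math.isfinite(v):
        return None
    return v


stats = {"p50": 12.5, "p99": float("nan"), "max": float("inf")}

# allow_nan=False refuses the raw dict outright...
try:
    json.dumps(stats, allow_nan=False)
except ValueError:
    pass  # "Out of range float values are not JSON compliant"

# ...but the scrubbed form is strict-JSON clean, gaps self-described as null.
clean = {k: scrub(v) for k, v in stats.items()}
print(json.dumps(clean, allow_nan=False))  # {"p50": 12.5, "p99": null, "max": null}
```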
+ """ + + __slots__ = () + + _ENCODER: ClassVar = msgspec.msgpack.Encoder() + _DECODER: ClassVar = msgspec.msgpack.Decoder(type=MetricsSnapshot) + + def encode(self, item: MetricsSnapshot) -> tuple[bytes, bytes]: + return METRICS_SNAPSHOT_TOPIC, self._ENCODER.encode(item) + + def decode(self, payload: bytes) -> MetricsSnapshot: + return self._DECODER.decode(payload) + + def on_decode_error(self, payload: bytes, exc: Exception) -> MetricsSnapshot | None: + # Only swallow genuine wire-format failures. Anything else is a bug + # in the decode path and should propagate. + if not isinstance(exc, msgspec.DecodeError): + raise exc + # A malformed metrics frame is always safe to drop: snapshots are + # idempotent and the next live tick or final replaces it. + return None diff --git a/src/inference_endpoint/async_utils/services/metrics_aggregator/subscriber.py b/src/inference_endpoint/async_utils/services/metrics_aggregator/subscriber.py new file mode 100644 index 00000000..c171624b --- /dev/null +++ b/src/inference_endpoint/async_utils/services/metrics_aggregator/subscriber.py @@ -0,0 +1,73 @@ +# SPDX-FileCopyrightText: Copyright (c) 2026 NVIDIA CORPORATION & AFFILIATES. All rights reserved. +# SPDX-License-Identifier: Apache-2.0 +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +"""Subscribe to ``MetricsSnapshot`` from the aggregator subprocess. + +A live-state subscriber for TUI / dashboard consumers. 
Keeps the latest +snapshot in ``self.latest`` and updates it on every tick. Terminal +snapshots (``SessionState.COMPLETE`` / ``INTERRUPTED``) arrive over +pub/sub as a "run finished" signal for consumers that want to switch to +a final-view rendering on the wire event. +""" + +from __future__ import annotations + +import asyncio +import logging + +from inference_endpoint.async_utils.services.metrics_aggregator.snapshot import ( + MetricsSnapshot, + MetricsSnapshotCodec, +) +from inference_endpoint.async_utils.transport.zmq.context import ManagedZMQContext +from inference_endpoint.async_utils.transport.zmq.pubsub import ZmqMessageSubscriber + +logger = logging.getLogger(__name__) + + +class MetricsSnapshotSubscriber(ZmqMessageSubscriber[MetricsSnapshot]): + """Subscriber that tracks the latest ``MetricsSnapshot`` for live views. + + ``latest`` is updated on every received snapshot regardless of state. + A consumer detects "run finished" by observing + ``latest.state in {COMPLETE, INTERRUPTED}`` — both are terminal and + no further snapshots will arrive. + """ + + def __init__( + self, + path: str, + zmq_ctx: ManagedZMQContext, + loop: asyncio.AbstractEventLoop, + *, + conflate: bool = True, + ) -> None: + # conflate=True (default) keeps only the freshest snapshot in the + # SUB queue — the right shape for live consumers that render the + # current state on a timer. Pass conflate=False if a consumer + # needs every intermediate tick (no current callers do). 
+ super().__init__( + MetricsSnapshotCodec(), + path, + zmq_ctx, + loop, + topics=None, + conflate=conflate, + ) + self.latest: MetricsSnapshot | None = None + + async def process(self, items: list[MetricsSnapshot]) -> None: + for snap in items: + self.latest = snap diff --git a/src/inference_endpoint/commands/benchmark/execute.py b/src/inference_endpoint/commands/benchmark/execute.py index 34a9fd40..1abb44c3 100644 --- a/src/inference_endpoint/commands/benchmark/execute.py +++ b/src/inference_endpoint/commands/benchmark/execute.py @@ -26,7 +26,6 @@ import asyncio import json import logging -import platform import shutil import signal import tempfile @@ -37,6 +36,7 @@ from typing import Any from urllib.parse import urljoin +import msgspec import msgspec.json from huggingface_hub import model_info from tqdm import tqdm @@ -48,14 +48,11 @@ ServiceConfig, ServiceLauncher, ) -from inference_endpoint.async_utils.services.metrics_aggregator.aggregator import ( - MetricCounterKey, +from inference_endpoint.async_utils.services.metrics_aggregator.snapshot import ( + snapshot_to_dict, ) -from inference_endpoint.async_utils.services.metrics_aggregator.kv_store import ( - BasicKVStoreReader, -) -from inference_endpoint.async_utils.services.metrics_aggregator.metrics_table import ( - MetricSeriesKey, +from inference_endpoint.async_utils.services.metrics_aggregator.subscriber import ( + MetricsSnapshotSubscriber, ) from inference_endpoint.async_utils.transport.zmq.context import ManagedZMQContext from inference_endpoint.config.runtime_settings import RuntimeSettings @@ -133,7 +130,6 @@ class BenchmarkResult: collector: ResponseCollector report: Report | None tmpfs_dir: Path - metrics_dir: Path | None = None @dataclass @@ -388,26 +384,25 @@ def _build_phases(ctx: BenchmarkContext) -> list[PhaseConfig]: return phases -def _setup_kv_reader( - metrics_dir: Path, - streaming: bool, -) -> BasicKVStoreReader: - """Create a KVStoreReader pre-registered with all metric keys.""" - reader 
= BasicKVStoreReader(metrics_dir) - for counter_key in MetricCounterKey: - reader.register_key(counter_key.value, "counter") - _STREAMING_ONLY = { - MetricSeriesKey.TTFT_NS, - MetricSeriesKey.CHUNK_DELTA_NS, - MetricSeriesKey.TPOT_NS, - } - _FLOAT_SERIES = {MetricSeriesKey.TPOT_NS} - for series_key in MetricSeriesKey: - if series_key in _STREAMING_ONLY and not streaming: - continue - dtype = float if series_key in _FLOAT_SERIES else int - reader.register_key(series_key.value, "series", dtype=dtype) - return reader +def _load_final_snapshot_from_disk(path: Path) -> dict[str, Any] | None: + """Read the persisted ``final_snapshot.json`` written by the aggregator. + + Returns the snapshot in its dict form — the same shape produced by + ``snapshot_to_dict`` and consumed by ``Report.from_snapshot``. No + intermediate Struct decode (see ``Report.from_snapshot`` docstring + for why the dict shape is the consumer contract). + + Returns ``None`` if the file is missing (the aggregator was killed + by an uncatchable signal — SIGKILL, OOM-kill — before its handler + could write) or unreadable. + """ + if not path.exists(): + return None + try: + return json.loads(path.read_bytes()) + except Exception as e: # noqa: BLE001 — best-effort. + logger.warning("Failed to read final snapshot %s: %s", path, e) + return None async def _run_benchmark_async( @@ -432,44 +427,50 @@ async def _run_benchmark_async( publisher = EventPublisherService(zmq_ctx) pub_socket_name = publisher.socket_name - # Tmpfs for high-frequency writes (metrics mmap + event log). - # On ARM, metrics need an on-disk directory so msync provides - # write ordering for cross-process mmap reads. Event logs are - # append-only and don't have ordering requirements, so they - # can stay on tmpfs. + # Tmpfs for high-frequency writes (event log). 
shm = Path("/dev/shm") use_shm = shm.exists() tmpfs_base = shm if use_shm else Path(tempfile.gettempdir()) tmpfs_dir = tmpfs_base / f"benchmark_{session_id}" tmpfs_dir.mkdir(parents=True, exist_ok=True) - # On ARM, mmap write ordering requires msync on a real filesystem. - # msync is a no-op on tmpfs, so metrics must use an on-disk directory. - if use_shm and platform.machine() != "x86_64": - logger.info( - "ARM platform: using on-disk metrics directory for mmap ordering" - ) - metrics_dir = Path( - tempfile.mkdtemp(prefix=f"metrics_{session_id}_", dir=".") - ) - else: - metrics_dir = tmpfs_dir / "metrics" - metrics_dir.mkdir(parents=True, exist_ok=True) - event_log_dir = tmpfs_dir / "events" event_log_dir.mkdir(parents=True, exist_ok=True) - # Launch service subprocesses - launcher = ServiceLauncher(zmq_ctx) + # Metrics-snapshot output (disk fallback for the final snapshot). + # Lives under the report dir so it's preserved with the rest of + # the run artifacts. + metrics_output_dir = ctx.report_dir / "metrics" + metrics_output_dir.mkdir(parents=True, exist_ok=True) + + metrics_socket_name = f"metrics_pub_{uuid.uuid4().hex[:8]}" + + # Connect the metrics-snapshot subscriber BEFORE launching the + # aggregator subprocess that binds the matching PUB socket. ZMQ + # tolerates connect-before-bind on IPC (the connect resolves once + # the binder appears), and starting the SUB reader early gives + # the subscription handshake time to complete during the + # ~1-2 second subprocess-launch window. This eliminates the + # slow-joiner risk of dropping early live ticks (or the worst + # case: missing COMPLETE if the SUB handshake never warms up). 
if zmq_ctx.socket_dir is None: raise RuntimeError("ZMQ socket_dir must be set after publisher bind") + metrics_subscriber = MetricsSnapshotSubscriber( + metrics_socket_name, zmq_ctx, loop + ) + metrics_subscriber.start() + + # Launch service subprocesses + launcher = ServiceLauncher(zmq_ctx) aggregator_args: list[str] = [ "--socket-dir", zmq_ctx.socket_dir, "--socket-name", pub_socket_name, - "--metrics-dir", - str(metrics_dir), + "--metrics-socket", + metrics_socket_name, + "--metrics-output-dir", + str(metrics_output_dir), ] if ctx.enable_streaming: aggregator_args.append("--streaming") @@ -558,25 +559,48 @@ async def _run_benchmark_async( logger.info("Waiting for services to finish processing...") await asyncio.to_thread(launcher.wait_for_exit, None) - # Build report AFTER aggregator has exited — ensures all metrics - # (TTFT, TPOT, OSL, latency) are fully written to KVStore. - try: - kv_reader = _setup_kv_reader(metrics_dir, ctx.enable_streaming) - report = Report.from_kv_reader(kv_reader) - kv_reader.close() - except Exception as e: - logger.warning(f"Failed to build report from metrics: {e}") - + # Source the snapshot dict for Report: + # 1. Preferred: the JSON file the aggregator atomically wrote + # in publish_final (ENDED-driven or signal-handler-driven). + # 2. Fallback: convert the last live snapshot from pub/sub to + # its dict form. Only reached when the aggregator was killed + # by an uncatchable signal (SIGKILL / OOM) before its + # handler could write. Report will be marked incomplete + # because state will be LIVE / DRAINING, not "complete". 
+ snap_dict: dict[str, Any] | None = _load_final_snapshot_from_disk( + metrics_output_dir / "final_snapshot.json" + ) + if snap_dict is not None: + logger.info("Built report from final_snapshot.json") + elif metrics_subscriber.latest is not None: + snap_dict = snapshot_to_dict(metrics_subscriber.latest) + logger.warning( + "No final_snapshot.json on disk; falling back to last " + "pub/sub snapshot (state may or may not be terminal)" + ) + else: + logger.error("No metrics snapshot available; cannot build report") + + if snap_dict is not None: + try: + report = Report.from_snapshot(snap_dict) + if not report.complete: + logger.warning( + "Report is incomplete (state=%s, n_pending_tasks=%d)", + report.state, + snap_dict.get("n_pending_tasks", 0), + ) + except Exception as e: # noqa: BLE001 — best-effort report build. + logger.warning(f"Failed to build report from snapshot: {e}") + + metrics_subscriber.close() pbar.close() - # Track metrics_dir separately if it's not under tmpfs_dir (ARM on-disk case) - separate_metrics = metrics_dir if metrics_dir.parent != tmpfs_dir else None return BenchmarkResult( session=result, collector=collector, report=report, tmpfs_dir=tmpfs_dir, - metrics_dir=separate_metrics, ) @@ -628,16 +652,6 @@ def _salvage_tmpfs(report_dir: Path, tmpfs_dir: Path) -> None: shutil.copy2(src_events, dst_events) logger.debug(f"Copied {src_events} -> {dst_events}") - # metrics mmap files (from MetricsAggregator KVStore) - src_metrics = tmpfs_dir / "metrics" - if src_metrics.exists(): - dst_metrics = report_dir / "metrics" - dst_metrics.mkdir(parents=True, exist_ok=True) - for f in src_metrics.iterdir(): - if f.is_file(): - shutil.copy2(f, dst_metrics / f.name) - logger.debug(f"Copied metrics from {src_metrics} -> {dst_metrics}") - def finalize_benchmark(ctx: BenchmarkContext, bench: BenchmarkResult) -> None: """Score accuracy, aggregate results, write JSON.""" @@ -646,7 +660,7 @@ def finalize_benchmark(ctx: BenchmarkContext, bench: BenchmarkResult) -> None: 
collector = bench.collector report = bench.report - # Display report if available (from MetricsAggregator KVStore) + # Display report if available (from MetricsAggregator pub/sub snapshot) if report is not None: report.display(fn=lambda s: logger.info(s), summary_only=True) report.to_json(save_to=ctx.report_dir / "result_summary.json") @@ -682,7 +696,7 @@ def finalize_benchmark(ctx: BenchmarkContext, bench: BenchmarkResult) -> None: } logger.info(f"Score for {eval_cfg.dataset_name}: {score} ({n_repeats} repeats)") - # Report metrics: prefer Report from KVStore, fall back to SessionResult + # Report metrics: prefer Report from MetricsSnapshot, fall back to SessionResult if report is not None and report.duration_ns is not None: perf_elapsed = report.duration_ns / 1e9 total_issued = report.n_samples_issued @@ -761,6 +775,4 @@ def run_benchmark(config: BenchmarkConfig, test_mode: TestMode) -> None: if bench.tmpfs_dir.exists(): _salvage_tmpfs(ctx.report_dir, bench.tmpfs_dir) shutil.rmtree(bench.tmpfs_dir, ignore_errors=True) - if bench.metrics_dir and bench.metrics_dir.exists(): - shutil.rmtree(bench.metrics_dir, ignore_errors=True) logger.info(f"Partial results saved to {ctx.report_dir}") diff --git a/src/inference_endpoint/config/templates/concurrency_template.yaml b/src/inference_endpoint/config/templates/concurrency_template.yaml index 7b560ed7..db87b71c 100644 --- a/src/inference_endpoint/config/templates/concurrency_template.yaml +++ b/src/inference_endpoint/config/templates/concurrency_template.yaml @@ -5,7 +5,7 @@ model_params: datasets: # Dataset configs - name: perf type: performance # Dataset purpose: performance or accuracy | options: performance, accuracy - path: '' # Dataset file path + path: '' # Dataset file path parser: # Column remapping: {prompt: , system: } prompt: text_input settings: diff --git a/src/inference_endpoint/config/templates/concurrency_template_full.yaml b/src/inference_endpoint/config/templates/concurrency_template_full.yaml index 
b2a0c89d..2e16bc0d 100644 --- a/src/inference_endpoint/config/templates/concurrency_template_full.yaml +++ b/src/inference_endpoint/config/templates/concurrency_template_full.yaml @@ -15,7 +15,7 @@ model_params: datasets: # Dataset configs - name: perf type: performance # Dataset purpose: performance or accuracy | options: performance, accuracy - path: '' # Dataset file path + path: '' # Dataset file path format: null # Dataset format (auto-detected) samples: null # Number of samples to use eval_method: null @@ -24,7 +24,7 @@ datasets: # Dataset configs accuracy_config: null # Accuracy evaluation settings - name: accuracy type: accuracy # Dataset purpose: performance or accuracy | options: performance, accuracy - path: '' # Dataset file path + path: '' # Dataset file path format: null # Dataset format (auto-detected) samples: null # Number of samples to use eval_method: exact_match # Accuracy evaluation method | options: exact_match, contains, judge diff --git a/src/inference_endpoint/config/templates/offline_template.yaml b/src/inference_endpoint/config/templates/offline_template.yaml index 6e83d10f..d672491c 100644 --- a/src/inference_endpoint/config/templates/offline_template.yaml +++ b/src/inference_endpoint/config/templates/offline_template.yaml @@ -5,7 +5,7 @@ model_params: datasets: # Dataset configs - name: perf type: performance # Dataset purpose: performance or accuracy | options: performance, accuracy - path: '' # Dataset file path + path: '' # Dataset file path parser: # Column remapping: {prompt: , system: } prompt: text_input settings: diff --git a/src/inference_endpoint/config/templates/offline_template_full.yaml b/src/inference_endpoint/config/templates/offline_template_full.yaml index 6914ca3c..a3b4ed0a 100644 --- a/src/inference_endpoint/config/templates/offline_template_full.yaml +++ b/src/inference_endpoint/config/templates/offline_template_full.yaml @@ -15,7 +15,7 @@ model_params: datasets: # Dataset configs - name: perf type: performance # 
Dataset purpose: performance or accuracy | options: performance, accuracy - path: '' # Dataset file path + path: '' # Dataset file path format: null # Dataset format (auto-detected) samples: null # Number of samples to use eval_method: null @@ -24,7 +24,7 @@ datasets: # Dataset configs accuracy_config: null # Accuracy evaluation settings - name: accuracy type: accuracy # Dataset purpose: performance or accuracy | options: performance, accuracy - path: '' # Dataset file path + path: '' # Dataset file path format: null # Dataset format (auto-detected) samples: null # Number of samples to use eval_method: exact_match # Accuracy evaluation method | options: exact_match, contains, judge diff --git a/src/inference_endpoint/config/templates/online_template.yaml b/src/inference_endpoint/config/templates/online_template.yaml index d33c1fd5..c8431687 100644 --- a/src/inference_endpoint/config/templates/online_template.yaml +++ b/src/inference_endpoint/config/templates/online_template.yaml @@ -5,7 +5,7 @@ model_params: datasets: # Dataset configs - name: perf type: performance # Dataset purpose: performance or accuracy | options: performance, accuracy - path: '' # Dataset file path + path: '' # Dataset file path parser: # Column remapping: {prompt: , system: } prompt: text_input settings: diff --git a/src/inference_endpoint/config/templates/online_template_full.yaml b/src/inference_endpoint/config/templates/online_template_full.yaml index 0e45267e..d6c44dcc 100644 --- a/src/inference_endpoint/config/templates/online_template_full.yaml +++ b/src/inference_endpoint/config/templates/online_template_full.yaml @@ -15,7 +15,7 @@ model_params: datasets: # Dataset configs - name: perf type: performance # Dataset purpose: performance or accuracy | options: performance, accuracy - path: '' # Dataset file path + path: '' # Dataset file path format: null # Dataset format (auto-detected) samples: null # Number of samples to use eval_method: null @@ -24,7 +24,7 @@ datasets: # Dataset 
configs accuracy_config: null # Accuracy evaluation settings - name: accuracy type: accuracy # Dataset purpose: performance or accuracy | options: performance, accuracy - path: '' # Dataset file path + path: '' # Dataset file path format: null # Dataset format (auto-detected) samples: null # Number of samples to use eval_method: exact_match # Accuracy evaluation method | options: exact_match, contains, judge diff --git a/src/inference_endpoint/load_generator/session.py b/src/inference_endpoint/load_generator/session.py index 3be480cb..0ccfe6a0 100644 --- a/src/inference_endpoint/load_generator/session.py +++ b/src/inference_endpoint/load_generator/session.py @@ -407,6 +407,27 @@ def _handle_response(self, resp: QueryResult | StreamChunk) -> None: if isinstance(resp, QueryResult): query_id = resp.id + # Emit ERROR before COMPLETE for failed queries so downstream + # consumers (notably the metrics aggregator) see the ERROR + # while the in-flight tracked row still exists. COMPLETE + # removes the row, so any state lookup at ERROR time after + # COMPLETE would silently miss tracked failures. + # + # Invariant: the EventPublisher MUST preserve publish-call + # order on the wire (ZMQ PUB→SUB delivers in order to a + # single SUB, and ZmqMessagePublisher batches without + # reordering). Any future transport refactor that breaks + # this property breaks tracked-failure counting — and + # silently, since neither side has an assertion. 
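The ERROR-before-COMPLETE invariant in the comment above can be shown with a toy tracker. Names and structure here are illustrative, not the aggregator's real types; the point is only that COMPLETE removes the in-flight row, so an ERROR arriving afterwards finds nothing to mark and the failure is silently dropped.

```python
from dataclasses import dataclass, field

@dataclass
class Tracker:
    inflight: set[str] = field(default_factory=set)
    failed: int = 0
    completed: int = 0

    def issue(self, uid: str) -> None:
        self.inflight.add(uid)

    def on_event(self, kind: str, uid: str) -> None:
        if kind == "ERROR":
            if uid in self.inflight:      # row must still exist
                self.failed += 1
        elif kind == "COMPLETE":
            self.inflight.discard(uid)    # row removed here
            self.completed += 1

good, bad = Tracker(), Tracker()
good.issue("q1")
bad.issue("q1")

good.on_event("ERROR", "q1")              # correct order: ERROR first
good.on_event("COMPLETE", "q1")

bad.on_event("COMPLETE", "q1")            # wrong order: row already gone,
bad.on_event("ERROR", "q1")               # so the failure is never counted

assert (good.failed, bad.failed) == (1, 0)
```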
+ if resp.error is not None: + self._publisher.publish( + EventRecord( + event_type=ErrorEventType.GENERIC, + timestamp_ns=time.monotonic_ns(), + sample_uuid=query_id, + data=resp.error, + ) + ) self._publisher.publish( EventRecord( event_type=SampleEventType.COMPLETE, @@ -417,15 +438,6 @@ def _handle_response(self, resp: QueryResult | StreamChunk) -> None: data=resp.response_output, ) ) - if resp.error is not None: - self._publisher.publish( - EventRecord( - event_type=ErrorEventType.GENERIC, - timestamp_ns=time.monotonic_ns(), - sample_uuid=query_id, - data=resp.error, - ) - ) if phase_issuer is not None and query_id in phase_issuer.uuid_to_index: phase_issuer.inflight -= 1 if phase_issuer.inflight <= 0: diff --git a/src/inference_endpoint/metrics/report.py b/src/inference_endpoint/metrics/report.py index e24d954b..a6c69c52 100644 --- a/src/inference_endpoint/metrics/report.py +++ b/src/inference_endpoint/metrics/report.py @@ -24,90 +24,78 @@ from typing import Any import msgspec.json -import numpy as np -from inference_endpoint.async_utils.services.metrics_aggregator.kv_store import ( - BasicKVStoreReader, - SeriesStats, -) from inference_endpoint.utils.version import get_version_info from ..utils import monotime_to_datetime -# --------------------------------------------------------------------------- -# Summary computation -# --------------------------------------------------------------------------- - -_DEFAULT_PERCENTILES = (99.9, 99, 97, 95, 90, 80, 75, 50, 25, 10, 5, 1) +def _series_to_metric_dict(stat: dict[str, Any]) -> dict[str, Any]: + """Convert a series-stat dict into the shape ``display()`` expects. -def compute_summary( - stats: SeriesStats, - percentiles: tuple[float, ...] = _DEFAULT_PERCENTILES, - n_histogram_buckets: int = 10, -) -> dict[str, Any]: - """Compute rollup statistics from pre-computed SeriesStats. + Input is the dict form produced by ``snapshot_to_dict``. Derives + ``avg``, ``std_dev``, and ``median`` from the rollups + percentiles. 
+ ``median`` falls back to ``(min + max) / 2`` if the producer didn't + emit p50. - Scalar stats (total, min, max, avg, std_dev) are derived from the - incrementally maintained rollups in SeriesStats. Numpy is only used - for percentiles and histograms, which require the raw values. - - Returns a dict with: total, min, max, avg, std_dev, median, - percentiles (dict), and histogram (buckets + counts). + All field reads use ``.get(...)`` with sensible defaults so a + truncated / partial dict (e.g. an INTERRUPTED snapshot) produces an + honest empty rollup instead of crashing. """ - if stats.count == 0: - return { - "total": 0, - "min": 0, - "max": 0, - "median": 0.0, - "avg": 0.0, - "std_dev": 0.0, - "percentiles": {str(p): 0.0 for p in percentiles}, - "histogram": {"buckets": [], "counts": []}, - } - - # Scalar stats from pre-computed rollups (no numpy needed) - avg = stats.total / stats.count - # Bessel's correction (ddof=1) for sample standard deviation - if stats.count > 1: - n = stats.count - std_dev = math.sqrt((stats.sum_sq - stats.total**2 / n) / (n - 1)) + count = stat.get("count", 0) + if count == 0: + return {} + + total = stat.get("total", 0) + sum_sq = stat.get("sum_sq", 0) + s_min = stat.get("min", 0) + s_max = stat.get("max", 0) + + avg = total / count if count > 0 else 0.0 + if count > 1: + n = count + # Integer-aggregate series (latency in ns) can have very large + # sum_sq and total values; the naive `sum_sq - total^2 / n` + # form loses precision when total^2 / n is close to sum_sq. + # Use the exact integer form `n*sum_sq - total^2` when inputs + # are int, falling back to the float form otherwise. 
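The precision argument in the comment above is easy to demonstrate: for nanosecond-scale values the float numerator `sum_sq - total**2 / n` cancels catastrophically, while Python's arbitrary-precision ints keep `n*sum_sq - total**2` exact. A self-contained check (the latency values are made up):

```python
import math
import statistics

def stddev_from_rollups(total: int, sum_sq: int, n: int) -> float:
    """Sample std dev (ddof=1) from running total and sum of squares,
    using the exact integer numerator n*sum_sq - total**2."""
    if n < 2:
        return 0.0
    var_num = n * sum_sq - total * total        # exact under Python ints
    return math.sqrt(max(0, var_num)) / math.sqrt(n * (n - 1))

# Values near 1e9 ns with a tiny spread: true std dev is sqrt(2.5).
xs = [1_000_000_000 + d for d in (0, 1, 2, 3, 4)]
total = sum(xs)
sum_sq = sum(x * x for x in xs)

exact = stddev_from_rollups(total, sum_sq, len(xs))
assert abs(exact - statistics.stdev(xs)) < 1e-9

# The naive float form: total**2 / n and sum_sq both sit near 5e18,
# where a double's ulp is 1024, so their true difference of ~10 is
# destroyed by rounding before the subtraction even happens.
naive_var = (sum_sq - total**2 / len(xs)) / (len(xs) - 1)
assert abs(naive_var - 2.5) > 1                 # nowhere near the true 2.5
```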
+ if isinstance(total, int) and isinstance(sum_sq, int): + var_num_int = n * sum_sq - total * total + std_dev = math.sqrt(max(0, var_num_int)) / math.sqrt(n * (n - 1)) + else: + var_num = sum_sq - total * total / n + std_dev = math.sqrt(max(0.0, var_num / (n - 1))) else: std_dev = 0.0 - # Percentiles and histogram require raw values - # Don't force float64 — numpy preserves int for uint64 series, - # so percentile(method="lower") returns actual observed values - # in their original type. - arr = np.array(stats.values) - arr.sort() - - # Inject 50th percentile for median if not already requested - need_median = 50 not in percentiles - all_percentiles = (*percentiles, 50) if need_median else percentiles - - perc_values = np.percentile(arr, all_percentiles, method="lower") - perc_dict = { - str(p): v.item() for p, v in zip(all_percentiles, perc_values, strict=True) - } - median = perc_dict.pop("50") if need_median else perc_dict["50"] - - bounds = np.histogram_bin_edges(arr, bins=n_histogram_buckets) - counts, _ = np.histogram(arr, bins=bounds) - hist_buckets = [ - (float(bounds[i]), float(bounds[i + 1])) for i in range(len(bounds) - 1) - ] + # p50 is contractually required on every registered series — see + # ``MetricsRegistry.register_series``, which rejects registrations + # whose percentiles tuple omits 50.0. The midrange fallback below + # only fires for hand-crafted snapshot dicts that bypass the + # registration path (e.g. a manually-edited JSON file), in which + # case the midrange is wrong-but-displayable rather than crashing. + perc = stat.get("percentiles", {}) + if "50" in perc: + median: float = perc["50"] + elif "50.0" in perc: + median = perc["50.0"] + else: + # Approximate-only fallback for non-registry-produced dicts. 
+ median = (s_min + s_max) / 2 + histogram = stat.get("histogram", []) return { - "total": stats.total, - "min": stats.min_val, - "max": stats.max_val, + "total": total, + "min": s_min, + "max": s_max, "median": median, "avg": avg, "std_dev": std_dev, - "percentiles": perc_dict, - "histogram": {"buckets": hist_buckets, "counts": counts.tolist()}, + "percentiles": dict(perc), + "histogram": { + "buckets": [(rng[0], rng[1]) for rng, _ in histogram], + "counts": [c for _, c in histogram], + }, } @@ -126,8 +114,18 @@ class Report(msgspec.Struct, frozen=True): # type: ignore[call-arg] n_samples_completed: int n_samples_failed: int duration_ns: int | None - - # Per-metric rollup dicts (output of compute_summary) + # The terminal SessionState that produced this report. Surfaced as a + # raw string so ``display()`` can render an INTERRUPTED indicator + # without re-parsing the source dict, and so JSON round-trips don't + # depend on Report importing the SessionState enum. + state: str + # True iff state=="complete" AND n_pending_tasks==0. False signals + # partial async metrics — drain timed out (state=="complete", + # n_pending_tasks>0), the run was interrupted (state=="interrupted"), + # or no final snapshot was found and we fell back to a live tick. + complete: bool + + # Per-metric rollup dicts (output of _series_to_metric_dict) ttft: dict[str, Any] tpot: dict[str, Any] latency: dict[str, Any] @@ -147,44 +145,69 @@ def tps(self) -> float | None: return total / (self.duration_ns / 1e9) @classmethod - def from_kv_reader(cls, reader: BasicKVStoreReader) -> Report: - """Build a Report from the current KVStore state. - - Reads counters and series from the reader, computes rollup summaries - (percentiles, histograms) for each series metric, and returns a Report. - - Works identically for live metrics (mid-test) and final reports - (post-drain). The caller decides when to call. + def from_snapshot(cls, snap: dict[str, Any]) -> Report: + """Build a Report from a snapshot dict. 
+ + Input is the dict form produced by + ``inference_endpoint.async_utils.services.metrics_aggregator.snapshot + .snapshot_to_dict``, which is also the shape persisted to + ``final_snapshot.json``. Consumers can therefore feed + ``json.loads(path.read_bytes())`` straight in without an + intermediate Struct decode — this is deliberate, because the + wire ``MetricsSnapshot`` uses ``array_like=True`` for compact + msgpack and decoding a dict back into an array-like Struct is + ergonomically painful (msgspec's decoders follow the Struct's + ``array_like`` flag). + + All field reads use ``.get(...)`` with defaults that produce an + honest "incomplete" report on missing fields instead of crashing: + missing ``state`` defaults to ``"interrupted"`` (worst-case), + missing counters / series to zero / empty. """ - snap = reader.snapshot() + counters: dict[str, int | float] = {} + series: dict[str, dict[str, Any]] = {} + for stat in snap.get("metrics", []): + stat_type = stat.get("type") + name = stat.get("name", "") + if not name: + continue + if stat_type == "counter": + counters[name] = stat.get("value", 0) + elif stat_type == "series": + series[name] = stat def _counter(key: str) -> int: - val = snap.get(key) - return int(val) if isinstance(val, int) else 0 + return int(counters.get(key, 0)) - def _summarize(key: str) -> dict: - val = snap.get(key) - if isinstance(val, SeriesStats) and val.count > 0: - return compute_summary(val) - return {} + def _series_dict(key: str) -> dict[str, Any]: + stat = series.get(key) + if stat is None or stat.get("count", 0) == 0: + return {} + return _series_to_metric_dict(stat) version_info = get_version_info() duration_ns = _counter("tracked_duration_ns") + # Default missing state to "interrupted" — a malformed / partial + # snapshot dict is treated as worst-case (run did not reach a + # clean completion). Drives complete=False and the interrupted + # indicator in display(). 
+ state = snap.get("state", "interrupted") + n_pending_tasks = snap.get("n_pending_tasks", 0) return cls( version=str(version_info.get("version", "unknown")), git_sha=version_info.get("git_sha"), - test_started_at=0, # TODO: add test_started_at counter to aggregator + test_started_at=0, # TODO: surface session_started_ns via snapshot n_samples_issued=_counter("tracked_samples_issued"), n_samples_completed=_counter("tracked_samples_completed"), - # TODO: Add tracked_samples_failed to MetricCounterKey. - # For now, total_samples_failed is the best available. - n_samples_failed=_counter("total_samples_failed"), + n_samples_failed=_counter("tracked_samples_failed"), duration_ns=duration_ns if duration_ns > 0 else None, - ttft=_summarize("ttft_ns"), - tpot=_summarize("tpot_ns"), - latency=_summarize("sample_latency_ns"), - output_sequence_lengths=_summarize("osl"), + state=state, + complete=(state == "complete" and n_pending_tasks == 0), + ttft=_series_dict("ttft_ns"), + tpot=_series_dict("tpot_ns"), + latency=_series_dict("sample_latency_ns"), + output_sequence_lengths=_series_dict("osl"), ) def to_json(self, save_to: os.PathLike | None = None) -> bytes: @@ -201,6 +224,16 @@ def display( newline: str = "", ) -> None: fn(f"----------------- Summary -----------------{newline}") + if self.state == "interrupted": + fn( + "WARNING: run was interrupted (SIGTERM/SIGINT) — " + f"metrics below are best-effort partial data.{newline}" + ) + elif not self.complete: + fn( + "WARNING: report is incomplete (drain timed out or no " + f"final snapshot received) — some async metrics may be missing.{newline}" + ) fn(f"Version: {self.version}{newline}") if self.git_sha: fn(f"Git SHA: {self.git_sha}{newline}") @@ -263,6 +296,16 @@ def _display_metric( scale_factor: float = 1.0, newline: str = "", ) -> None: + # ``_scrub_nonfinite`` (snapshot.py) maps producer-side NaN/±Inf to + # ``None`` so the persisted JSON stays strict. 
Any of the named + # scalars / percentile values below can therefore be ``None`` — + # render an ``N/A`` indicator instead of crashing on + # ``None * scale_factor``. + def _scaled(v: Any) -> str: + if v is None: + return "N/A" + return f"{v * scale_factor:.2f}" + for name, key in [ ("Min", "min"), ("Max", "max"), @@ -270,7 +313,7 @@ def _display_metric( ("Avg.", "avg"), ("Std Dev.", "std_dev"), ]: - fn(f" {name}: {metric_dict[key] * scale_factor:.2f} {unit}{newline}") + fn(f" {name}: {_scaled(metric_dict[key])} {unit}{newline}") fn(f"\n Histogram:{newline}") buckets = metric_dict["histogram"]["buckets"] @@ -278,8 +321,7 @@ def _display_metric( if buckets: bucket_strs = [ - f" [{lo * scale_factor:.2f}, {hi * scale_factor:.2f}" - + ("]" if i == len(buckets) - 1 else ")") + f" [{_scaled(lo)}, {_scaled(hi)}" + ("]" if i == len(buckets) - 1 else ")") for i, (lo, hi) in enumerate(buckets) ] max_count = max(counts) @@ -292,4 +334,4 @@ def _display_metric( fn(f"\n Percentiles:{newline}") for p, val in metric_dict.get("percentiles", {}).items(): - fn(f" {p:>6}: {val * scale_factor:.2f} {unit}{newline}") + fn(f" {p:>6}: {_scaled(val)} {unit}{newline}") diff --git a/tests/datasets/Readme.md b/tests/assets/datasets/Readme.md similarity index 98% rename from tests/datasets/Readme.md rename to tests/assets/datasets/Readme.md index a8ce2319..4315d31a 100644 --- a/tests/datasets/Readme.md +++ b/tests/assets/datasets/Readme.md @@ -63,7 +63,7 @@ Describe space exploration (case 3) in detail uv run inference-endpoint benchmark offline \ --endpoints http://localhost:8765 \ --model Qwen/Qwen3-8B \ - --dataset tests/datasets/dummy_1k.jsonl + --dataset tests/assets/datasets/dummy_1k.jsonl # Test probe uv run inference-endpoint probe \ diff --git a/tests/datasets/ds_samples.jsonl b/tests/assets/datasets/ds_samples.jsonl similarity index 100% rename from tests/datasets/ds_samples.jsonl rename to tests/assets/datasets/ds_samples.jsonl diff --git a/tests/datasets/dummy_1k.jsonl 
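A minimal version of the producer-side scrub that `_scaled` compensates for, assuming the real `_scrub_nonfinite` recurses through dicts and lists similarly (this sketch is not its actual implementation):

```python
import json
import math
from typing import Any

def scrub_nonfinite(obj: Any) -> Any:
    """Recursively replace NaN and +/-Inf floats with None so the
    result survives json.dumps(..., allow_nan=False), i.e. strict
    standards-compliant JSON."""
    if isinstance(obj, float) and not math.isfinite(obj):
        return None
    if isinstance(obj, dict):
        return {k: scrub_nonfinite(v) for k, v in obj.items()}
    if isinstance(obj, list):
        return [scrub_nonfinite(v) for v in obj]
    return obj

stats = {"avg": float("nan"), "max": float("inf"), "min": 3.0}
clean = scrub_nonfinite(stats)
assert clean == {"avg": None, "max": None, "min": 3.0}
json.dumps(clean, allow_nan=False)  # would raise ValueError on raw stats
```

This is why the display side must tolerate `None` in any scalar or percentile slot: the scrub trades a non-finite value for a renderable gap rather than breaking the persisted JSON.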
b/tests/assets/datasets/dummy_1k.jsonl similarity index 100% rename from tests/datasets/dummy_1k.jsonl rename to tests/assets/datasets/dummy_1k.jsonl diff --git a/tests/datasets/squad_pruned/dataset_dict.json b/tests/assets/datasets/squad_pruned/dataset_dict.json similarity index 100% rename from tests/datasets/squad_pruned/dataset_dict.json rename to tests/assets/datasets/squad_pruned/dataset_dict.json diff --git a/tests/datasets/squad_pruned/train/data-00000-of-00001.arrow b/tests/assets/datasets/squad_pruned/train/data-00000-of-00001.arrow similarity index 100% rename from tests/datasets/squad_pruned/train/data-00000-of-00001.arrow rename to tests/assets/datasets/squad_pruned/train/data-00000-of-00001.arrow diff --git a/tests/datasets/squad_pruned/train/dataset_info.json b/tests/assets/datasets/squad_pruned/train/dataset_info.json similarity index 100% rename from tests/datasets/squad_pruned/train/dataset_info.json rename to tests/assets/datasets/squad_pruned/train/dataset_info.json diff --git a/tests/datasets/squad_pruned/train/state.json b/tests/assets/datasets/squad_pruned/train/state.json similarity index 100% rename from tests/datasets/squad_pruned/train/state.json rename to tests/assets/datasets/squad_pruned/train/state.json diff --git a/tests/datasets/squad_pruned/validation/data-00000-of-00001.arrow b/tests/assets/datasets/squad_pruned/validation/data-00000-of-00001.arrow similarity index 100% rename from tests/datasets/squad_pruned/validation/data-00000-of-00001.arrow rename to tests/assets/datasets/squad_pruned/validation/data-00000-of-00001.arrow diff --git a/tests/datasets/squad_pruned/validation/dataset_info.json b/tests/assets/datasets/squad_pruned/validation/dataset_info.json similarity index 100% rename from tests/datasets/squad_pruned/validation/dataset_info.json rename to tests/assets/datasets/squad_pruned/validation/dataset_info.json diff --git a/tests/datasets/squad_pruned/validation/state.json 
b/tests/assets/datasets/squad_pruned/validation/state.json similarity index 100% rename from tests/datasets/squad_pruned/validation/state.json rename to tests/assets/datasets/squad_pruned/validation/state.json diff --git a/tests/assets/tokenizers/char/tokenizer.json b/tests/assets/tokenizers/char/tokenizer.json new file mode 100644 index 00000000..d9eeb653 --- /dev/null +++ b/tests/assets/tokenizers/char/tokenizer.json @@ -0,0 +1,187 @@ +{ + "version": "1.0", + "truncation": null, + "padding": null, + "added_tokens": [ + { + "id": 0, + "content": "", + "single_word": false, + "lstrip": false, + "rstrip": false, + "normalized": false, + "special": true + }, + { + "id": 1, + "content": "", + "single_word": false, + "lstrip": false, + "rstrip": false, + "normalized": false, + "special": true + }, + { + "id": 2, + "content": "", + "single_word": false, + "lstrip": false, + "rstrip": false, + "normalized": false, + "special": true + }, + { + "id": 3, + "content": "", + "single_word": false, + "lstrip": false, + "rstrip": false, + "normalized": false, + "special": true + } + ], + "normalizer": null, + "pre_tokenizer": { + "type": "Split", + "pattern": { + "String": "" + }, + "behavior": "Isolated", + "invert": false + }, + "post_processor": { + "type": "TemplateProcessing", + "single": [ + { + "Sequence": { + "id": "A", + "type_id": 0 + } + } + ], + "pair": [ + { + "Sequence": { + "id": "A", + "type_id": 0 + } + }, + { + "Sequence": { + "id": "B", + "type_id": 1 + } + } + ], + "special_tokens": {} + }, + "decoder": null, + "model": { + "type": "WordLevel", + "vocab": { + "": 0, + "": 1, + "": 2, + "": 3, + "a": 4, + "b": 5, + "c": 6, + "d": 7, + "e": 8, + "f": 9, + "g": 10, + "h": 11, + "i": 12, + "j": 13, + "k": 14, + "l": 15, + "m": 16, + "n": 17, + "o": 18, + "p": 19, + "q": 20, + "r": 21, + "s": 22, + "t": 23, + "u": 24, + "v": 25, + "w": 26, + "x": 27, + "y": 28, + "z": 29, + "A": 30, + "B": 31, + "C": 32, + "D": 33, + "E": 34, + "F": 35, + "G": 36, + "H": 37, + 
"I": 38, + "J": 39, + "K": 40, + "L": 41, + "M": 42, + "N": 43, + "O": 44, + "P": 45, + "Q": 46, + "R": 47, + "S": 48, + "T": 49, + "U": 50, + "V": 51, + "W": 52, + "X": 53, + "Y": 54, + "Z": 55, + "0": 56, + "1": 57, + "2": 58, + "3": 59, + "4": 60, + "5": 61, + "6": 62, + "7": 63, + "8": 64, + "9": 65, + " ": 66, + "\t": 67, + "\n": 68, + "\r": 69, + "!": 70, + "\"": 71, + "#": 72, + "$": 73, + "%": 74, + "&": 75, + "'": 76, + "(": 77, + ")": 78, + "*": 79, + "+": 80, + ",": 81, + "-": 82, + ".": 83, + "/": 84, + ":": 85, + ";": 86, + "<": 87, + "=": 88, + ">": 89, + "?": 90, + "@": 91, + "[": 92, + "\\": 93, + "]": 94, + "^": 95, + "_": 96, + "`": 97, + "{": 98, + "|": 99, + "}": 100, + "~": 101 + }, + "unk_token": "" + } +} diff --git a/tests/assets/tokenizers/char/tokenizer_config.json b/tests/assets/tokenizers/char/tokenizer_config.json new file mode 100644 index 00000000..556684e6 --- /dev/null +++ b/tests/assets/tokenizers/char/tokenizer_config.json @@ -0,0 +1,9 @@ +{ + "backend": "tokenizers", + "bos_token": "", + "eos_token": "", + "model_max_length": 1000000000000000019884624838656, + "pad_token": "", + "tokenizer_class": "TokenizersBackend", + "unk_token": "" +} diff --git a/tests/conftest.py b/tests/conftest.py index d69d3c75..f4005788 100644 --- a/tests/conftest.py +++ b/tests/conftest.py @@ -173,7 +173,7 @@ def ds_dataset_path(): """ Returns the path to the ds_samples.jsonl file. """ - return "tests/datasets/ds_samples.jsonl" + return "tests/assets/datasets/ds_samples.jsonl" @pytest.fixture @@ -189,7 +189,7 @@ def hf_squad_dataset_path(): """ Returns the path to the squad dataset. 
""" - return "tests/datasets/squad_pruned" + return "tests/assets/datasets/squad_pruned" @pytest.fixture diff --git a/tests/integration/async_utils/__init__.py b/tests/integration/async_utils/__init__.py new file mode 100644 index 00000000..46707983 --- /dev/null +++ b/tests/integration/async_utils/__init__.py @@ -0,0 +1,14 @@ +# SPDX-FileCopyrightText: Copyright (c) 2026 NVIDIA CORPORATION & AFFILIATES. All rights reserved. +# SPDX-License-Identifier: Apache-2.0 +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. diff --git a/tests/integration/async_utils/services/__init__.py b/tests/integration/async_utils/services/__init__.py new file mode 100644 index 00000000..46707983 --- /dev/null +++ b/tests/integration/async_utils/services/__init__.py @@ -0,0 +1,14 @@ +# SPDX-FileCopyrightText: Copyright (c) 2026 NVIDIA CORPORATION & AFFILIATES. All rights reserved. +# SPDX-License-Identifier: Apache-2.0 +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. 
diff --git a/tests/integration/async_utils/services/metrics_aggregator/__init__.py b/tests/integration/async_utils/services/metrics_aggregator/__init__.py new file mode 100644 index 00000000..46707983 --- /dev/null +++ b/tests/integration/async_utils/services/metrics_aggregator/__init__.py @@ -0,0 +1,14 @@ +# SPDX-FileCopyrightText: Copyright (c) 2026 NVIDIA CORPORATION & AFFILIATES. All rights reserved. +# SPDX-License-Identifier: Apache-2.0 +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. diff --git a/tests/integration/async_utils/services/metrics_aggregator/test_signal_handling.py b/tests/integration/async_utils/services/metrics_aggregator/test_signal_handling.py new file mode 100644 index 00000000..010536c0 --- /dev/null +++ b/tests/integration/async_utils/services/metrics_aggregator/test_signal_handling.py @@ -0,0 +1,192 @@ +# SPDX-FileCopyrightText: Copyright (c) 2026 NVIDIA CORPORATION & AFFILIATES. All rights reserved. +# SPDX-License-Identifier: Apache-2.0 +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. 
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""Integration tests for the aggregator subprocess's signal handlers.
+
+The aggregator's INTERRUPTED-snapshot path is the only mechanism that
+produces a ``state=interrupted`` ``final_snapshot.json``, and the
+SIGINT-no-op path is the only thing standing between an interactive
+^C and silent sample loss. These tests spawn a real subprocess and
+exercise both paths end-to-end — the unit tests in ``test_publisher.py``
+cover the API surface, not the signal wiring.
+"""
+
+from __future__ import annotations
+
+import json
+import os
+import signal
+import subprocess
+import sys
+import time
+import uuid
+from pathlib import Path
+
+import pytest
+
+
+def _spawn_aggregator(
+    socket_dir: Path,
+    output_dir: Path,
+    *,
+    socket_name: str,
+    metrics_socket: str,
+) -> subprocess.Popen:
+    """Launch the metrics-aggregator subprocess in its own process group.
+
+    A separate process group is critical for the SIGINT-no-op test:
+    sending SIGINT to just this group (not the test runner's group)
+    emulates a user Ctrl-C delivered to the subprocess's foreground
+    process group without also interrupting the test runner.
+    """
+    return subprocess.Popen(
+        [
+            sys.executable,
+            "-m",
+            "inference_endpoint.async_utils.services.metrics_aggregator",
+            "--socket-dir",
+            str(socket_dir),
+            "--socket-name",
+            socket_name,
+            "--metrics-socket",
+            metrics_socket,
+            "--metrics-output-dir",
+            str(output_dir),
+        ],
+        # New process group so we can signal it without disturbing the
+        # test runner.
+        preexec_fn=os.setsid,
+        stdout=subprocess.PIPE,
+        stderr=subprocess.PIPE,
+    )
+
+
+def _wait_for_file(path: Path, timeout: float) -> bool:
+    """Poll for ``path`` existing within ``timeout`` seconds."""
+    deadline = time.monotonic() + timeout
+    while time.monotonic() < deadline:
+        if path.exists():
+            return True
+        time.sleep(0.05)
+    return False
+
+
+@pytest.mark.integration
+class TestAggregatorSignalHandling:
+    def test_sigterm_writes_interrupted_final_snapshot(self, tmp_path: Path):
+        """SIGTERM to the aggregator MUST produce ``final_snapshot.json``
+        with ``state == "interrupted"``. This is the only path that
+        produces an INTERRUPTED snapshot — without it, a parent
+        ``ServiceLauncher.kill_all`` would leave the Report consumer with
+        no final-snapshot file at all.
+        """
+        socket_dir = tmp_path / "sockets"
+        socket_dir.mkdir()
+        output_dir = tmp_path / "output"
+        # The parent owns directory setup — the aggregator subprocess
+        # fails fast (SystemExit) on a missing output dir to surface
+        # contract violations in its own stderr instead of crashing
+        # later on the atomic-write path. Mirror that contract here.
+        output_dir.mkdir()
+        # Use a unique socket name per test to avoid collisions if a
+        # previous test run left an IPC file behind.
+        suffix = uuid.uuid4().hex[:8]
+        proc = _spawn_aggregator(
+            socket_dir,
+            output_dir,
+            socket_name=f"events_{suffix}",
+            metrics_socket=f"metrics_{suffix}",
+        )
+        try:
+            # Give the subprocess time to: parse args, set up ZMQ, bind
+            # sockets, register signal handlers, enter the await loop.
+            # The signal-handler registration is what we're testing, so
+            # we MUST wait for it before sending the signal.
+            time.sleep(2.0)
+            assert (
+                proc.poll() is None
+            ), f"aggregator died early: stderr={(proc.stderr.read() if proc.stderr else b'').decode()}"
+
+            # SIGTERM the process group → triggers _signal_finalize.
+            os.killpg(proc.pid, signal.SIGTERM)
+            proc.wait(timeout=10.0)
+        finally:
+            if proc.poll() is None:
+                os.killpg(proc.pid, signal.SIGKILL)
+                proc.wait(timeout=5.0)
+
+        # The signal handler MUST have written final_snapshot.json
+        # before the subprocess exited.
+        final = output_dir / "final_snapshot.json"
+        assert final.exists(), (
+            f"SIGTERM did not produce final_snapshot.json — "
+            f"stderr: {(proc.stderr.read() if proc.stderr else b'').decode()[-2000:]}"
+        )
+        decoded = json.loads(final.read_bytes())
+        assert decoded["state"] == "interrupted"
+
+    def test_sigint_does_not_finalize_aggregator(self, tmp_path: Path):
+        """SIGINT to the aggregator MUST NOT trigger publish_final.
+
+        On an interactive ^C, the OS sends SIGINT to the whole foreground
+        process group; both parent and child receive it. If the
+        aggregator finalized eagerly here, samples that completed during
+        the parent's clean-shutdown window would never reach the file.
+        The aggregator's contract is: SIGINT is a no-op, the parent's
+        ENDED-driven path is authoritative.
+
+        Verification: send SIGINT, wait long enough for any naive
+        signal-driven write to have happened, then assert the file did
+        NOT appear and the subprocess is still alive.
+        """
+        socket_dir = tmp_path / "sockets"
+        socket_dir.mkdir()
+        output_dir = tmp_path / "output"
+        output_dir.mkdir()  # parent owns dir setup (see sibling test)
+        suffix = uuid.uuid4().hex[:8]
+        proc = _spawn_aggregator(
+            socket_dir,
+            output_dir,
+            socket_name=f"events_{suffix}",
+            metrics_socket=f"metrics_{suffix}",
+        )
+        try:
+            time.sleep(2.0)
+            assert proc.poll() is None, "aggregator died before signal-handler test"
+
+            os.killpg(proc.pid, signal.SIGINT)
+            # Wait a beat — if SIGINT were naively driving publish_final,
+            # the file would appear well within this window.
+ time.sleep(1.0) + + final = output_dir / "final_snapshot.json" + assert not final.exists(), ( + "SIGINT must NOT trigger publish_final; the parent's " + "ENDED-driven path is authoritative on interactive ^C" + ) + assert proc.poll() is None, ( + "aggregator must remain alive after SIGINT; only " + "SIGTERM (parent kill) or ENDED should finalize it" + ) + finally: + # Use SIGTERM (which is the correct shutdown path) for + # cleanup, then SIGKILL as belt-and-suspenders. + if proc.poll() is None: + os.killpg(proc.pid, signal.SIGTERM) + try: + proc.wait(timeout=5.0) + except subprocess.TimeoutExpired: + os.killpg(proc.pid, signal.SIGKILL) + proc.wait(timeout=5.0) diff --git a/tests/integration/commands/test_benchmark_command.py b/tests/integration/commands/test_benchmark_command.py index f052a90b..cf396f1f 100644 --- a/tests/integration/commands/test_benchmark_command.py +++ b/tests/integration/commands/test_benchmark_command.py @@ -16,7 +16,6 @@ """Integration tests for benchmark commands against echo server.""" import json -import os import re from pathlib import Path @@ -184,11 +183,27 @@ def test_mode_logging(self, mock_http_echo_server, ds_dataset_path, caplog): ) +# Local character-level tokenizer fixture used in place of the templates' +# default (which references gated `meta-llama/Llama-3.1-*`). The echo-server +# e2e path doesn't care about the model identity, only that a tokenizer +# loads for the metrics aggregator's ISL/OSL/TPOT triggers. Using a local +# fixture removes the HuggingFace Hub dependency from CI: no network call, +# no ~1 MB download, no HF_TOKEN requirement, and the load completes in +# milliseconds rather than seconds — well inside the parent launcher's +# readiness timeout. ``AutoTokenizer.from_pretrained`` supports local +# directories as a first-class input, so this uses the same production +# code path with no test-only hooks. 
+_TEST_TOKENIZER_DIR = Path(__file__).resolve().parents[2] / "assets/tokenizers/char" +_TEST_MODEL_NAME = str(_TEST_TOKENIZER_DIR) + + def _resolve_template(template_path: Path, server_url: str) -> dict: """Load a template YAML, strip wrappers, and patch for testing. - Only replaces placeholders with working values and caps n_samples_to_issue. - Everything else stays as the template defines it. + Replaces placeholders with working values, swaps the gated default + model for a non-gated tokenizer (so tests run without ``HF_TOKEN``), + and caps ``n_samples_to_issue``. Everything else stays as the template + defines it. """ raw = template_path.read_text() # Strip → value (all templates use eg: form) @@ -197,11 +212,28 @@ def _resolve_template(template_path: Path, server_url: str) -> dict: raw = re.sub(r"http://localhost:\d+", server_url, raw) data = yaml.safe_load(raw) + # Swap the placeholder-default model name for a non-gated tokenizer + # (see _TEST_MODEL_NAME above) so these tests can run in CI without + # HF_TOKEN. + if "model_params" in data and isinstance(data["model_params"], dict): + data["model_params"]["name"] = _TEST_MODEL_NAME + # Cap total samples so test finishes in seconds data.setdefault("settings", {}) data["settings"].setdefault("runtime", {}) data["settings"]["runtime"]["n_samples_to_issue"] = 10 + # Bump the worker-init timeout for CI. The production default (60 s) is + # tight on small CI runners where Python's `spawn`-mode multiprocessing + # pays a full re-import cost per worker on top of ZMQ IPC setup; cold- + # start of the *first* parametrized template (alphabetical, so + # `concurrency_template.yaml`) consistently exceeds the budget in CI. + # The other 5 templates benefit from warm module / IPC caches and don't + # need the headroom. 120 s is a generous safety margin that does not + # change the production default, only this integration test. 
+ data["settings"].setdefault("client", {}) + data["settings"]["client"]["worker_initialization_timeout"] = 120.0 + # Accuracy datasets can't run e2e against echo server (no scorer), so keep only performance datasets. data["datasets"] = [ ds for ds in data.get("datasets", []) if ds.get("type") != "accuracy" @@ -213,10 +245,6 @@ class TestTemplateIntegration: """Verify generated templates run end-to-end against a local server.""" @pytest.mark.integration - @pytest.mark.skipif( - not os.environ.get("HF_TOKEN"), - reason="Templates reference gated HF models; requires HF_TOKEN to fetch tokenizer", - ) @pytest.mark.parametrize("template", _GENERATED_TEMPLATES) def test_template_runs(self, mock_http_echo_server, tmp_path, caplog, template): data = _resolve_template(TEMPLATE_DIR / template, mock_http_echo_server.url) diff --git a/tests/unit/async_utils/services/metrics_aggregator/conftest.py b/tests/unit/async_utils/services/metrics_aggregator/conftest.py index eb80b2ba..db6bcfcf 100644 --- a/tests/unit/async_utils/services/metrics_aggregator/conftest.py +++ b/tests/unit/async_utils/services/metrics_aggregator/conftest.py @@ -13,21 +13,34 @@ # See the License for the specific language governing permissions and # limitations under the License. -"""Shared test doubles and factories for metrics aggregator tests.""" +"""Shared test doubles and factories for metrics aggregator tests. + +Tests that need to inspect emitted values build them directly off a +``MetricsRegistry`` and a ``MetricsSnapshot``. + +The helpers here are intentionally small — most reused-across-tests +construction lives in ``_make_aggregator`` style fixtures local to each +test file (the aggregator's wire surface is small enough that a single +shared fixture would mostly hide it). 
+""" from __future__ import annotations import asyncio -from typing import Literal -from unittest.mock import MagicMock +from unittest.mock import AsyncMock, MagicMock from inference_endpoint.async_utils.services.metrics_aggregator.aggregator import ( MetricsAggregatorService, ) -from inference_endpoint.async_utils.services.metrics_aggregator.kv_store import ( - KVStore, - SeriesStats, +from inference_endpoint.async_utils.services.metrics_aggregator.registry import ( + MetricsRegistry, ) +from inference_endpoint.async_utils.services.metrics_aggregator.snapshot import ( + CounterStat, + SeriesStat, + SessionState, +) +from inference_endpoint.async_utils.transport.zmq.context import ManagedZMQContext from inference_endpoint.core.record import ( EventRecord, SampleEventType, @@ -36,71 +49,7 @@ from inference_endpoint.core.types import TextModelOutput # --------------------------------------------------------------------------- -# In-memory KVStore for tests -# --------------------------------------------------------------------------- - - -class InMemoryKVStore(KVStore): - """In-memory KVStore for unit tests. 
No /dev/shm files needed.""" - - def __init__(self) -> None: - self._counters: dict[str, int] = {} - self._series: dict[str, list] = {} - self._series_dtype: dict[str, type] = {} - self.closed: bool = False - - def create_key( - self, key: str, key_type: Literal["series", "counter"], dtype: type = int - ) -> None: - if key_type == "counter" and key not in self._counters: - self._counters[key] = 0 - elif key_type == "series" and key not in self._series: - self._series[key] = [] - self._series_dtype[key] = dtype - - def update(self, key: str, value: int | float) -> None: - if key in self._counters: - self._counters[key] = int(value) - elif key in self._series: - self._series[key].append(value) - else: - raise KeyError(f"Key not created: {key}") - - def get(self, key: str) -> int | SeriesStats: - if key in self._counters: - return self._counters[key] - if key in self._series: - dtype = self._series_dtype[key] - return SeriesStats(list(self._series[key]), dtype=dtype) - raise KeyError(f"Key not created: {key}") - - def snapshot(self) -> dict[str, int | SeriesStats]: - result: dict[str, int | SeriesStats] = {} - for k, v in self._counters.items(): - result[k] = v - for k, vals in self._series.items(): - dtype = self._series_dtype[k] - result[k] = SeriesStats(list(vals), dtype=dtype) - return result - - def close(self) -> None: - self.closed = True - - # --- Test helpers --- - - def get_series_values(self, key: str) -> list: - return list(self._series.get(key, [])) - - def get_counter(self, key: str) -> int: - return self._counters.get(key, 0) - - def get_all_series(self) -> dict[str, list[float]]: - """All series as {metric_name: [values]}.""" - return {k: list(v) for k, v in self._series.items()} - - -# --------------------------------------------------------------------------- -# Mock TokenizePool +# Mock TokenizePool — used by tests that exercise async triggers directly. 
# --------------------------------------------------------------------------- @@ -129,52 +78,6 @@ def __exit__(self, *args): self.close() -# --------------------------------------------------------------------------- -# Aggregator factories -# --------------------------------------------------------------------------- - - -def mock_zmq_context() -> MagicMock: - """Create a mock ManagedZMQContext that no-ops all ZMQ operations.""" - ctx = MagicMock() - ctx.socket.return_value = MagicMock() - ctx.connect.return_value = "ipc:///mock/socket" - return ctx - - -def make_stub_aggregator( - kv_store: KVStore, - tokenize_pool=None, - streaming: bool = True, -) -> MetricsAggregatorService: - """Create a MetricsAggregatorService with ZMQ mocked out.""" - return MetricsAggregatorService( - "mock_path", - mock_zmq_context(), - MagicMock(spec=asyncio.AbstractEventLoop), - kv_store=kv_store, - tokenize_pool=tokenize_pool, - streaming=streaming, - ) - - -def make_async_stub_aggregator( - kv_store: KVStore, - tokenize_pool, - loop: asyncio.AbstractEventLoop, - streaming: bool = True, -) -> MetricsAggregatorService: - """Create a MetricsAggregatorService with a real loop and mock ZMQ.""" - return MetricsAggregatorService( - "mock_path", - mock_zmq_context(), - loop, - kv_store=kv_store, - tokenize_pool=tokenize_pool, - streaming=streaming, - ) - - # --------------------------------------------------------------------------- # EventRecord factories # --------------------------------------------------------------------------- @@ -196,3 +99,88 @@ def text_output(s: str) -> TextModelOutput: def streaming_text(*chunks: str) -> TextModelOutput: return TextModelOutput(output=tuple(chunks)) + + +# --------------------------------------------------------------------------- +# Registry / snapshot inspection helpers +# --------------------------------------------------------------------------- + + +def snapshot_counters(registry: MetricsRegistry) -> dict[str, int | float]: + """Return all 
counter values from a fresh snapshot. + + State/n_pending values don't matter for counter inspection — they + bypass the exact-vs-HDR fork. Tests that need series inspection + should call ``snapshot_series_values`` instead. + """ + snap = registry.build_snapshot(state=SessionState.LIVE, n_pending_tasks=0) + return {m.name: m.value for m in snap.metrics if isinstance(m, CounterStat)} + + +def snapshot_series_count(registry: MetricsRegistry, name: str) -> int: + """Return ``count`` of a named series from a fresh snapshot. + + Returns 0 if the series is unregistered or has no recordings. + """ + snap = registry.build_snapshot(state=SessionState.LIVE, n_pending_tasks=0) + for m in snap.metrics: + if isinstance(m, SeriesStat) and m.name == name: + return m.count + return 0 + + +def snapshot_series_total(registry: MetricsRegistry, name: str) -> int | float: + """Return ``total`` of a named series from a fresh snapshot.""" + snap = registry.build_snapshot(state=SessionState.LIVE, n_pending_tasks=0) + for m in snap.metrics: + if isinstance(m, SeriesStat) and m.name == name: + return m.total + return 0 + + +# --------------------------------------------------------------------------- +# Aggregator factory +# --------------------------------------------------------------------------- + + +def make_aggregator( + zmq_ctx: ManagedZMQContext, + loop: asyncio.AbstractEventLoop, + socket_name: str, + *, + tokenize_pool=None, + streaming: bool = True, + shutdown_event: asyncio.Event | None = None, +) -> tuple[MetricsAggregatorService, MetricsRegistry, MagicMock]: + """Construct an aggregator wired to a real SUB socket and a mocked publisher. + + The aggregator's ``start()`` is intentionally not called: tests inject + events directly via ``await agg.process([...])``. The publisher is a + ``MagicMock`` so the aggregator's STARTED branch (which calls + ``publisher.start(...)``) and ENDED branch (which calls ``publish_final`` + + ``close``) don't touch real I/O. 
+ + Returns ``(agg, registry, publisher_mock)``. + """ + registry = MetricsRegistry() + # ``publish_final`` and ``aclose`` are awaited by the aggregator's + # ENDED handler, so they must be AsyncMocks. The remaining surface + # (``start``, ``close``) is synchronous and falls back to MagicMock's + # default attribute behavior. + publisher = MagicMock() + publisher.publish_final = AsyncMock() + publisher.aclose = AsyncMock() + agg = MetricsAggregatorService( + socket_name, + zmq_ctx, + loop, + registry=registry, + publisher=publisher, + publish_interval_s=0.25, + sig_figs=3, + n_histogram_buckets=10, + tokenize_pool=tokenize_pool, + streaming=streaming, + shutdown_event=shutdown_event, + ) + return agg, registry, publisher diff --git a/tests/unit/async_utils/services/metrics_aggregator/test_aggregator.py b/tests/unit/async_utils/services/metrics_aggregator/test_aggregator.py index 98674a46..0669a31e 100644 --- a/tests/unit/async_utils/services/metrics_aggregator/test_aggregator.py +++ b/tests/unit/async_utils/services/metrics_aggregator/test_aggregator.py @@ -13,15 +13,28 @@ # See the License for the specific language governing permissions and # limitations under the License. -"""Tests for MetricsAggregatorService.process() logic. +"""Tests for ``MetricsAggregatorService.process()``. -These tests exercise the aggregator's event dispatch and metric computation -without ZMQ transport by calling process() directly. +Events are injected directly via ``await agg.process([...])``; emitted +metrics are inspected by reading the ``MetricsRegistry``'s snapshot +output. The aggregator is constructed with a real SUB socket (so the +``ZmqMessageSubscriber`` base initializes cleanly) and a mocked +``MetricsPublisher`` (so ``STARTED``/``ENDED`` paths don't touch real +I/O). 
""" +from __future__ import annotations + import asyncio import pytest +from inference_endpoint.async_utils.services.metrics_aggregator.aggregator import ( + MetricCounterKey, +) +from inference_endpoint.async_utils.services.metrics_aggregator.metrics_table import ( + MetricSeriesKey, +) +from inference_endpoint.async_utils.transport.zmq.context import ManagedZMQContext from inference_endpoint.core.record import ( ErrorEventType, EventRecord, @@ -31,12 +44,13 @@ from inference_endpoint.core.types import ErrorData, PromptData from .conftest import ( - InMemoryKVStore, MockTokenizePool, - make_async_stub_aggregator, - make_stub_aggregator, + make_aggregator, sample_event, session_event, + snapshot_counters, + snapshot_series_count, + snapshot_series_total, streaming_text, text_output, ) @@ -49,116 +63,176 @@ @pytest.mark.unit class TestTrackingWindow: @pytest.mark.asyncio - async def test_not_tracked_before_start(self): - store = InMemoryKVStore() - agg = make_stub_aggregator(store) - await agg.process( - [ - session_event(SessionEventType.STARTED, ts=0), - sample_event(SampleEventType.ISSUED, "s1", ts=100), - ] - ) - assert agg._table.get_row("s1") is None, ( - "Sample issued before START_PERFORMANCE_TRACKING must not create a " - "table row — warmup samples should be excluded from the tracked set." - ) - assert ( - store.get_series_values("ttft_ns") == [] - ), "No TTFT should be recorded for samples issued before tracking begins." - assert store.get_series_values("sample_latency_ns") == [], ( - "No sample_latency should be recorded for samples issued before " - "tracking begins." 
- ) - - @pytest.mark.asyncio - async def test_tracked_after_start(self): - store = InMemoryKVStore() - agg = make_stub_aggregator(store) - await agg.process( - [ - session_event(SessionEventType.START_PERFORMANCE_TRACKING, ts=0), - sample_event(SampleEventType.ISSUED, "s1", ts=100), - ] - ) - assert agg._table.get_row("s1") is not None, ( - "Sample issued after START_PERFORMANCE_TRACKING must create a table " - "row so its metrics are included in the tracked set." - ) - - @pytest.mark.asyncio - async def test_not_tracked_after_stop(self): - store = InMemoryKVStore() - agg = make_stub_aggregator(store) - await agg.process( - [ - session_event(SessionEventType.START_PERFORMANCE_TRACKING, ts=0), - session_event(SessionEventType.STOP_PERFORMANCE_TRACKING, ts=50), - sample_event(SampleEventType.ISSUED, "s1", ts=100), - ] - ) - assert agg._table.get_row("s1") is None, ( - "Sample issued after STOP_PERFORMANCE_TRACKING must not create a " - "table row — the tracking window has closed." - ) - - @pytest.mark.asyncio - async def test_inflight_sample_continues_after_stop(self): + async def test_not_tracked_before_start(self, tmp_path): + loop = asyncio.get_event_loop() + with ManagedZMQContext.scoped(socket_dir=str(tmp_path)) as ctx: + agg, registry, _ = make_aggregator(ctx, loop, "agg_not_tracked_before") + try: + await agg.process( + [ + session_event(SessionEventType.STARTED, ts=0), + sample_event(SampleEventType.ISSUED, "s1", ts=100), + ] + ) + assert agg._table.get_row("s1") is None, ( + "Sample issued before START_PERFORMANCE_TRACKING must " + "not create a table row — warmup samples should be " + "excluded from the tracked set." 
+ ) + assert ( + snapshot_series_count(registry, MetricSeriesKey.TTFT_NS.value) == 0 + ) + assert ( + snapshot_series_count( + registry, MetricSeriesKey.SAMPLE_LATENCY_NS.value + ) + == 0 + ) + finally: + agg.close() + + @pytest.mark.asyncio + async def test_tracked_after_start(self, tmp_path): + loop = asyncio.get_event_loop() + with ManagedZMQContext.scoped(socket_dir=str(tmp_path)) as ctx: + agg, _, _ = make_aggregator(ctx, loop, "agg_tracked_after_start") + try: + await agg.process( + [ + session_event( + SessionEventType.START_PERFORMANCE_TRACKING, ts=0 + ), + sample_event(SampleEventType.ISSUED, "s1", ts=100), + ] + ) + assert agg._table.get_row("s1") is not None + finally: + agg.close() + + @pytest.mark.asyncio + async def test_not_tracked_after_stop(self, tmp_path): + loop = asyncio.get_event_loop() + with ManagedZMQContext.scoped(socket_dir=str(tmp_path)) as ctx: + agg, _, _ = make_aggregator(ctx, loop, "agg_not_tracked_after_stop") + try: + await agg.process( + [ + session_event( + SessionEventType.START_PERFORMANCE_TRACKING, ts=0 + ), + session_event( + SessionEventType.STOP_PERFORMANCE_TRACKING, ts=50 + ), + sample_event(SampleEventType.ISSUED, "s1", ts=100), + ] + ) + assert agg._table.get_row("s1") is None + finally: + agg.close() + + @pytest.mark.asyncio + async def test_inflight_sample_continues_after_stop(self, tmp_path): """A sample issued during tracking completes normally after STOP.""" - store = InMemoryKVStore() - agg = make_stub_aggregator(store) - await agg.process( - [ - session_event(SessionEventType.START_PERFORMANCE_TRACKING, ts=0), - sample_event(SampleEventType.ISSUED, "s1", ts=100), - session_event(SessionEventType.STOP_PERFORMANCE_TRACKING, ts=200), - sample_event(SampleEventType.RECV_FIRST, "s1", ts=300), - sample_event(SampleEventType.COMPLETE, "s1", ts=500), - ] - ) - assert 200 in store.get_series_values("ttft_ns") - assert 400 in store.get_series_values("sample_latency_ns") - - @pytest.mark.asyncio - async def 
test_restart_tracking_window(self): + loop = asyncio.get_event_loop() + with ManagedZMQContext.scoped(socket_dir=str(tmp_path)) as ctx: + agg, registry, _ = make_aggregator(ctx, loop, "agg_inflight_after_stop") + try: + await agg.process( + [ + session_event( + SessionEventType.START_PERFORMANCE_TRACKING, ts=0 + ), + sample_event(SampleEventType.ISSUED, "s1", ts=100), + session_event( + SessionEventType.STOP_PERFORMANCE_TRACKING, ts=200 + ), + sample_event(SampleEventType.RECV_FIRST, "s1", ts=300), + sample_event(SampleEventType.COMPLETE, "s1", ts=500), + ] + ) + # ttft = 300 - 100 = 200, sample_latency = 500 - 100 = 400 + assert ( + snapshot_series_total(registry, MetricSeriesKey.TTFT_NS.value) + == 200 + ) + assert ( + snapshot_series_total( + registry, MetricSeriesKey.SAMPLE_LATENCY_NS.value + ) + == 400 + ) + finally: + agg.close() + + @pytest.mark.asyncio + async def test_restart_tracking_window(self, tmp_path): """START -> STOP -> START creates a second tracking window.""" - store = InMemoryKVStore() - agg = make_stub_aggregator(store) - await agg.process( - [ - session_event(SessionEventType.START_PERFORMANCE_TRACKING, ts=0), - sample_event(SampleEventType.ISSUED, "s1", ts=100), - session_event(SessionEventType.STOP_PERFORMANCE_TRACKING, ts=200), - sample_event(SampleEventType.ISSUED, "s2", ts=300), # not tracked - session_event(SessionEventType.START_PERFORMANCE_TRACKING, ts=400), - sample_event(SampleEventType.ISSUED, "s3", ts=500), # tracked - sample_event(SampleEventType.COMPLETE, "s1", ts=600), - sample_event(SampleEventType.COMPLETE, "s3", ts=700), - ] - ) - assert agg._table.get_row("s2") is None # never tracked - latencies = store.get_series_values("sample_latency_ns") - assert len(latencies) == 2 # s1 and s3 both completed - - @pytest.mark.asyncio - async def test_tracked_block_durations(self): + loop = asyncio.get_event_loop() + with ManagedZMQContext.scoped(socket_dir=str(tmp_path)) as ctx: + agg, registry, _ = make_aggregator(ctx, loop, 
"agg_restart_tracking") + try: + await agg.process( + [ + session_event( + SessionEventType.START_PERFORMANCE_TRACKING, ts=0 + ), + sample_event(SampleEventType.ISSUED, "s1", ts=100), + session_event( + SessionEventType.STOP_PERFORMANCE_TRACKING, ts=200 + ), + # not tracked + sample_event(SampleEventType.ISSUED, "s2", ts=300), + session_event( + SessionEventType.START_PERFORMANCE_TRACKING, ts=400 + ), + # tracked + sample_event(SampleEventType.ISSUED, "s3", ts=500), + sample_event(SampleEventType.COMPLETE, "s1", ts=600), + sample_event(SampleEventType.COMPLETE, "s3", ts=700), + ] + ) + # s2 was never tracked + assert agg._table.get_row("s2") is None + # Two completed samples (s1 and s3) emitted sample_latency_ns. + assert ( + snapshot_series_count( + registry, MetricSeriesKey.SAMPLE_LATENCY_NS.value + ) + == 2 + ) + finally: + agg.close() + + @pytest.mark.asyncio + async def test_tracked_block_durations(self, tmp_path): """Tracked blocks extend to last sample completion.""" - store = InMemoryKVStore() - agg = make_stub_aggregator(store) - await agg.process( - [ - session_event(SessionEventType.START_PERFORMANCE_TRACKING, ts=0), - sample_event(SampleEventType.ISSUED, "s1", ts=100), - session_event(SessionEventType.STOP_PERFORMANCE_TRACKING, ts=200), - sample_event(SampleEventType.COMPLETE, "s1", ts=700), - session_event(SessionEventType.START_PERFORMANCE_TRACKING, ts=800), - sample_event(SampleEventType.ISSUED, "s2", ts=900), - sample_event(SampleEventType.COMPLETE, "s2", ts=1000), - ] - ) - assert agg._table.tracked_blocks[0].duration_ns == 700 # 700 - 0 - assert agg._table.tracked_blocks[1].duration_ns == 200 # 1000 - 800 - assert agg._table.total_tracked_duration_ns == 900 - assert agg._table.total_completed_tracked_samples == 2 + loop = asyncio.get_event_loop() + with ManagedZMQContext.scoped(socket_dir=str(tmp_path)) as ctx: + agg, _, _ = make_aggregator(ctx, loop, "agg_tracked_block_dur") + try: + await agg.process( + [ + session_event( + 
SessionEventType.START_PERFORMANCE_TRACKING, ts=0 + ), + sample_event(SampleEventType.ISSUED, "s1", ts=100), + session_event( + SessionEventType.STOP_PERFORMANCE_TRACKING, ts=200 + ), + sample_event(SampleEventType.COMPLETE, "s1", ts=700), + session_event( + SessionEventType.START_PERFORMANCE_TRACKING, ts=800 + ), + sample_event(SampleEventType.ISSUED, "s2", ts=900), + sample_event(SampleEventType.COMPLETE, "s2", ts=1000), + ] + ) + assert agg._table.tracked_blocks[0].duration_ns == 700 # 700 - 0 + assert agg._table.tracked_blocks[1].duration_ns == 200 # 1000 - 800 + assert agg._table.total_tracked_duration_ns == 900 + assert agg._table.total_completed_tracked_samples == 2 + finally: + agg.close() # --------------------------------------------------------------------------- @@ -169,76 +243,140 @@ async def test_tracked_block_durations(self): @pytest.mark.unit class TestTimingMetrics: @pytest.mark.asyncio - async def test_ttft_and_sample_latency(self): - store = InMemoryKVStore() - agg = make_stub_aggregator(store) - await agg.process( - [ - session_event(SessionEventType.START_PERFORMANCE_TRACKING, ts=0), - sample_event(SampleEventType.ISSUED, "s1", ts=1000), - sample_event(SampleEventType.RECV_FIRST, "s1", ts=2500), - sample_event(SampleEventType.COMPLETE, "s1", ts=5000), - ] - ) - assert 1500 in store.get_series_values("ttft_ns") - assert 4000 in store.get_series_values("sample_latency_ns") - - @pytest.mark.asyncio - async def test_chunk_deltas(self): - store = InMemoryKVStore() - agg = make_stub_aggregator(store) - await agg.process( - [ - session_event(SessionEventType.START_PERFORMANCE_TRACKING, ts=0), - sample_event(SampleEventType.ISSUED, "s1", ts=1000), - sample_event(SampleEventType.RECV_FIRST, "s1", ts=2000), - sample_event(SampleEventType.RECV_NON_FIRST, "s1", ts=3000), - sample_event(SampleEventType.RECV_NON_FIRST, "s1", ts=4500), - sample_event(SampleEventType.COMPLETE, "s1", ts=5000), - ] - ) - assert store.get_series_values("chunk_delta_ns") == 
[1000, 1500] - - @pytest.mark.asyncio - async def test_non_streaming_latency_only(self): - """Non-streaming sample emits sample_latency_ns and OSL, but no TTFT/chunk_delta/TPOT.""" - store = InMemoryKVStore() - loop = asyncio.get_running_loop() + async def test_ttft_and_sample_latency(self, tmp_path): + loop = asyncio.get_event_loop() + with ManagedZMQContext.scoped(socket_dir=str(tmp_path)) as ctx: + agg, registry, _ = make_aggregator(ctx, loop, "agg_ttft_latency") + try: + await agg.process( + [ + session_event( + SessionEventType.START_PERFORMANCE_TRACKING, ts=0 + ), + sample_event(SampleEventType.ISSUED, "s1", ts=1000), + sample_event(SampleEventType.RECV_FIRST, "s1", ts=2500), + sample_event(SampleEventType.COMPLETE, "s1", ts=5000), + ] + ) + # ttft = 2500-1000 = 1500 + # sample_latency = 5000-1000 = 4000 + assert ( + snapshot_series_total(registry, MetricSeriesKey.TTFT_NS.value) + == 1500 + ) + assert ( + snapshot_series_total( + registry, MetricSeriesKey.SAMPLE_LATENCY_NS.value + ) + == 4000 + ) + finally: + agg.close() + + @pytest.mark.asyncio + async def test_chunk_deltas(self, tmp_path): + loop = asyncio.get_event_loop() + with ManagedZMQContext.scoped(socket_dir=str(tmp_path)) as ctx: + agg, registry, _ = make_aggregator(ctx, loop, "agg_chunk_deltas") + try: + await agg.process( + [ + session_event( + SessionEventType.START_PERFORMANCE_TRACKING, ts=0 + ), + sample_event(SampleEventType.ISSUED, "s1", ts=1000), + sample_event(SampleEventType.RECV_FIRST, "s1", ts=2000), + sample_event(SampleEventType.RECV_NON_FIRST, "s1", ts=3000), + sample_event(SampleEventType.RECV_NON_FIRST, "s1", ts=4500), + sample_event(SampleEventType.COMPLETE, "s1", ts=5000), + ] + ) + # chunk_delta_ns is emitted on each RECV_NON_FIRST: 3000-2000=1000 and + # 4500-3000=1500. 
+ assert ( + snapshot_series_count( + registry, MetricSeriesKey.CHUNK_DELTA_NS.value + ) + == 2 + ) + assert ( + snapshot_series_total( + registry, MetricSeriesKey.CHUNK_DELTA_NS.value + ) + == 2500 + ) + finally: + agg.close() + + @pytest.mark.asyncio + async def test_non_streaming_latency_only(self, tmp_path): + """Non-streaming: emits sample_latency_ns + OSL, no TTFT/chunk_delta/TPOT.""" + loop = asyncio.get_event_loop() pool = MockTokenizePool(delay=0.0) - agg = make_async_stub_aggregator(store, pool, loop) - await agg.process( - [ - session_event(SessionEventType.START_PERFORMANCE_TRACKING, ts=0), - sample_event(SampleEventType.ISSUED, "s1", ts=1000), - sample_event( - SampleEventType.COMPLETE, - "s1", - ts=3000, - data=text_output("hello world"), - ), - ] - ) - await agg._table.drain_tasks() - assert 2000 in store.get_series_values("sample_latency_ns") - assert 2 in store.get_series_values("osl") - assert store.get_series_values("ttft_ns") == [] - assert store.get_series_values("chunk_delta_ns") == [] - assert store.get_series_values("tpot_ns") == [] - - @pytest.mark.asyncio - async def test_chunk_delta_not_emitted_without_last_recv(self): + with ManagedZMQContext.scoped(socket_dir=str(tmp_path)) as ctx: + agg, registry, _ = make_aggregator( + ctx, loop, "agg_non_streaming", tokenize_pool=pool + ) + try: + await agg.process( + [ + session_event( + SessionEventType.START_PERFORMANCE_TRACKING, ts=0 + ), + sample_event(SampleEventType.ISSUED, "s1", ts=1000), + sample_event( + SampleEventType.COMPLETE, + "s1", + ts=3000, + data=text_output("hello world"), + ), + ] + ) + await agg._table.drain_tasks() + # sample_latency = 3000-1000 = 2000 + assert ( + snapshot_series_total( + registry, MetricSeriesKey.SAMPLE_LATENCY_NS.value + ) + == 2000 + ) + # OSL = token_count("hello world") = 2 + assert snapshot_series_total(registry, MetricSeriesKey.OSL.value) == 2 + assert ( + snapshot_series_count(registry, MetricSeriesKey.TTFT_NS.value) == 0 + ) + assert ( + 
snapshot_series_count( + registry, MetricSeriesKey.CHUNK_DELTA_NS.value + ) + == 0 + ) + assert ( + snapshot_series_count(registry, MetricSeriesKey.TPOT_NS.value) == 0 + ) + finally: + agg.close() + + @pytest.mark.asyncio + async def test_chunk_delta_not_emitted_without_last_recv(self, tmp_path): """RECV_NON_FIRST without prior RECV_FIRST: no chunk_delta emitted.""" - store = InMemoryKVStore() - agg = make_stub_aggregator(store) - await agg.process( - [ - session_event(SessionEventType.START_PERFORMANCE_TRACKING, ts=0), - sample_event(SampleEventType.ISSUED, "s1", ts=1000), - ] - ) - row = agg._table.get_row("s1") - assert row is not None - assert row.last_recv_ns is None # No recv events yet + loop = asyncio.get_event_loop() + with ManagedZMQContext.scoped(socket_dir=str(tmp_path)) as ctx: + agg, registry, _ = make_aggregator(ctx, loop, "agg_chunk_delta_no_recv") + try: + await agg.process( + [ + session_event( + SessionEventType.START_PERFORMANCE_TRACKING, ts=0 + ), + sample_event(SampleEventType.ISSUED, "s1", ts=1000), + ] + ) + row = agg._table.get_row("s1") + assert row is not None + assert row.last_recv_ns is None + finally: + agg.close() # --------------------------------------------------------------------------- @@ -249,34 +387,46 @@ async def test_chunk_delta_not_emitted_without_last_recv(self): @pytest.mark.unit class TestIsl: @pytest.mark.asyncio - async def test_issued_with_token_ids_emits_isl_directly(self): + async def test_issued_with_token_ids_emits_isl_directly(self, tmp_path): """SGLang path: PromptData with token_ids emits ISL = len(token_ids).""" - store = InMemoryKVStore() - agg = make_stub_aggregator(store) - await agg.process( - [ - session_event(SessionEventType.START_PERFORMANCE_TRACKING, ts=0), - sample_event( - SampleEventType.ISSUED, - "s1", - ts=1000, - data=PromptData(token_ids=(101, 202, 303, 404, 505)), - ), - ] - ) - assert 5 in store.get_series_values("isl") - - @pytest.mark.asyncio - async def 
test_issued_without_data_no_isl(self): - store = InMemoryKVStore() - agg = make_stub_aggregator(store) - await agg.process( - [ - session_event(SessionEventType.START_PERFORMANCE_TRACKING, ts=0), - sample_event(SampleEventType.ISSUED, "s1", ts=1000), - ] - ) - assert store.get_series_values("isl") == [] + loop = asyncio.get_event_loop() + with ManagedZMQContext.scoped(socket_dir=str(tmp_path)) as ctx: + agg, registry, _ = make_aggregator(ctx, loop, "agg_isl_token_ids") + try: + await agg.process( + [ + session_event( + SessionEventType.START_PERFORMANCE_TRACKING, ts=0 + ), + sample_event( + SampleEventType.ISSUED, + "s1", + ts=1000, + data=PromptData(token_ids=(101, 202, 303, 404, 505)), + ), + ] + ) + assert snapshot_series_total(registry, MetricSeriesKey.ISL.value) == 5 + finally: + agg.close() + + @pytest.mark.asyncio + async def test_issued_without_data_no_isl(self, tmp_path): + loop = asyncio.get_event_loop() + with ManagedZMQContext.scoped(socket_dir=str(tmp_path)) as ctx: + agg, registry, _ = make_aggregator(ctx, loop, "agg_isl_no_data") + try: + await agg.process( + [ + session_event( + SessionEventType.START_PERFORMANCE_TRACKING, ts=0 + ), + sample_event(SampleEventType.ISSUED, "s1", ts=1000), + ] + ) + assert snapshot_series_count(registry, MetricSeriesKey.ISL.value) == 0 + finally: + agg.close() # --------------------------------------------------------------------------- @@ -287,152 +437,332 @@ async def test_issued_without_data_no_isl(self): @pytest.mark.unit class TestEdgeCases: @pytest.mark.asyncio - async def test_untracked_sample_events_ignored(self): - store = InMemoryKVStore() - agg = make_stub_aggregator(store) - await agg.process( - [ - session_event(SessionEventType.START_PERFORMANCE_TRACKING, ts=0), - sample_event(SampleEventType.RECV_FIRST, "unknown", ts=2000), - sample_event(SampleEventType.COMPLETE, "unknown", ts=5000), - ] - ) - assert store.get_series_values("ttft_ns") == [] - assert store.get_series_values("sample_latency_ns") == [] - - 
@pytest.mark.asyncio - async def test_complete_removes_row(self): - store = InMemoryKVStore() - agg = make_stub_aggregator(store) - await agg.process( - [ - session_event(SessionEventType.START_PERFORMANCE_TRACKING, ts=0), - sample_event(SampleEventType.ISSUED, "s1", ts=1000), - sample_event(SampleEventType.COMPLETE, "s1", ts=5000), - ] - ) - assert agg._table.get_row("s1") is None - assert len(agg._table) == 0 - - @pytest.mark.asyncio - async def test_session_ended_closes_store(self): - store = InMemoryKVStore() - agg = make_stub_aggregator(store) - await agg.process( - [ - session_event(SessionEventType.STARTED, ts=0), - session_event(SessionEventType.ENDED, ts=100), - ] - ) - assert store.closed - - @pytest.mark.asyncio - async def test_events_after_ended_are_dropped(self): - store = InMemoryKVStore() - agg = make_stub_aggregator(store) - await agg.process( - [ - session_event(SessionEventType.START_PERFORMANCE_TRACKING, ts=0), - sample_event(SampleEventType.ISSUED, "s1", ts=100), - session_event(SessionEventType.ENDED, ts=200), - sample_event(SampleEventType.RECV_FIRST, "s1", ts=300), - ] - ) - assert store.get_series_values("ttft_ns") == [] - - @pytest.mark.asyncio - async def test_empty_sample_uuid_ignored(self): - store = InMemoryKVStore() - agg = make_stub_aggregator(store) - await agg.process( - [ - session_event(SessionEventType.START_PERFORMANCE_TRACKING, ts=0), - sample_event(SampleEventType.ISSUED, "", ts=1000), - ] - ) - assert len(agg._table) == 0 - - @pytest.mark.asyncio - async def test_multiple_samples_independent(self): - store = InMemoryKVStore() - agg = make_stub_aggregator(store) - await agg.process( - [ - session_event(SessionEventType.START_PERFORMANCE_TRACKING, ts=0), - sample_event(SampleEventType.ISSUED, "s1", ts=1000), - sample_event(SampleEventType.ISSUED, "s2", ts=1500), - sample_event(SampleEventType.RECV_FIRST, "s1", ts=2000), - sample_event(SampleEventType.RECV_FIRST, "s2", ts=3000), - sample_event(SampleEventType.COMPLETE, "s1", 
ts=4000), - sample_event(SampleEventType.COMPLETE, "s2", ts=5000), - ] - ) - ttfts = store.get_series_values("ttft_ns") - latencies = store.get_series_values("sample_latency_ns") - assert 1000 in ttfts - assert 1500 in ttfts - assert 3000 in latencies - assert 3500 in latencies - - @pytest.mark.asyncio - async def test_error_events_ignored(self): - """Error events should not crash the aggregator.""" - store = InMemoryKVStore() - agg = make_stub_aggregator(store) - await agg.process( - [ - session_event(SessionEventType.START_PERFORMANCE_TRACKING, ts=0), - EventRecord( - event_type=ErrorEventType.GENERIC, - timestamp_ns=500, - data=ErrorData(error_type="test", error_message="boom"), - ), - sample_event(SampleEventType.ISSUED, "s1", ts=1000), - sample_event(SampleEventType.COMPLETE, "s1", ts=2000), - ] - ) - assert 1000 in store.get_series_values("sample_latency_ns") - - @pytest.mark.asyncio - async def test_session_started_stores_timestamp(self): - store = InMemoryKVStore() - agg = make_stub_aggregator(store) - await agg.process([session_event(SessionEventType.STARTED, ts=42)]) - assert agg._table.session_started_ns == 42 - - @pytest.mark.asyncio - async def test_process_multiple_batches(self): + async def test_untracked_sample_events_ignored(self, tmp_path): + loop = asyncio.get_event_loop() + with ManagedZMQContext.scoped(socket_dir=str(tmp_path)) as ctx: + agg, registry, _ = make_aggregator(ctx, loop, "agg_untracked_ignored") + try: + await agg.process( + [ + session_event( + SessionEventType.START_PERFORMANCE_TRACKING, ts=0 + ), + sample_event(SampleEventType.RECV_FIRST, "unknown", ts=2000), + sample_event(SampleEventType.COMPLETE, "unknown", ts=5000), + ] + ) + assert ( + snapshot_series_count(registry, MetricSeriesKey.TTFT_NS.value) == 0 + ) + assert ( + snapshot_series_count( + registry, MetricSeriesKey.SAMPLE_LATENCY_NS.value + ) + == 0 + ) + finally: + agg.close() + + @pytest.mark.asyncio + async def test_duplicate_started_logs_error_and_preserves_state( + 
self, tmp_path, caplog + ): + """A duplicate ``STARTED`` event is a producer bug. + + The aggregator MUST NOT re-assign ``_session_start_ns`` on a + second STARTED — doing so freezes ``total_duration_ns`` for the + rest of the run (the max-of-elapsed guard never beats the new + smaller deltas). Verify the error is logged AND the original + start timestamp is preserved. + """ + loop = asyncio.get_event_loop() + with ManagedZMQContext.scoped(socket_dir=str(tmp_path)) as ctx: + agg, _, _ = make_aggregator(ctx, loop, "agg_dup_started") + try: + with caplog.at_level("ERROR"): + await agg.process( + [ + session_event(SessionEventType.STARTED, ts=1_000), + session_event(SessionEventType.STARTED, ts=5_000), + ] + ) + # Original start timestamp must be preserved. + assert agg._session_start_ns == 1_000 + # And an error must have been logged with both timestamps. + error_records = [ + r for r in caplog.records if "Duplicate STARTED" in r.message + ] + assert ( + len(error_records) == 1 + ), "duplicate STARTED must log exactly one error" + assert "1000" in error_records[0].getMessage() + assert "5000" in error_records[0].getMessage() + finally: + agg.close() + + @pytest.mark.asyncio + async def test_complete_removes_row(self, tmp_path): + loop = asyncio.get_event_loop() + with ManagedZMQContext.scoped(socket_dir=str(tmp_path)) as ctx: + agg, _, _ = make_aggregator(ctx, loop, "agg_complete_removes") + try: + await agg.process( + [ + session_event( + SessionEventType.START_PERFORMANCE_TRACKING, ts=0 + ), + sample_event(SampleEventType.ISSUED, "s1", ts=1000), + sample_event(SampleEventType.COMPLETE, "s1", ts=5000), + ] + ) + assert agg._table.get_row("s1") is None + assert len(agg._table) == 0 + finally: + agg.close() + + @pytest.mark.asyncio + async def test_session_ended_calls_publish_final(self, tmp_path): + """ENDED triggers ``publish_final`` and ``close`` on the publisher.""" + loop = asyncio.get_event_loop() + with ManagedZMQContext.scoped(socket_dir=str(tmp_path)) as ctx: 
+ agg, _, publisher = make_aggregator(ctx, loop, "agg_ended_publish_final") + try: + await agg.process( + [ + session_event(SessionEventType.STARTED, ts=0), + session_event(SessionEventType.ENDED, ts=100), + ] + ) + publisher.publish_final.assert_awaited_once() + # close() is invoked twice: once explicitly in the ENDED branch, + # then again from the aggregator's ``close`` (via _finalize). + assert publisher.close.call_count >= 1 + finally: + agg.close() + + @pytest.mark.asyncio + async def test_events_after_ended_are_dropped(self, tmp_path): + loop = asyncio.get_event_loop() + with ManagedZMQContext.scoped(socket_dir=str(tmp_path)) as ctx: + agg, registry, _ = make_aggregator(ctx, loop, "agg_events_after_ended") + try: + await agg.process( + [ + session_event( + SessionEventType.START_PERFORMANCE_TRACKING, ts=0 + ), + sample_event(SampleEventType.ISSUED, "s1", ts=100), + session_event(SessionEventType.ENDED, ts=200), + # Should be dropped — the aggregator stops processing + # at the ENDED record. 
+ sample_event(SampleEventType.RECV_FIRST, "s1", ts=300), + ] + ) + assert ( + snapshot_series_count(registry, MetricSeriesKey.TTFT_NS.value) == 0 + ) + finally: + agg.close() + + @pytest.mark.asyncio + async def test_empty_sample_uuid_ignored(self, tmp_path): + loop = asyncio.get_event_loop() + with ManagedZMQContext.scoped(socket_dir=str(tmp_path)) as ctx: + agg, _, _ = make_aggregator(ctx, loop, "agg_empty_uuid") + try: + await agg.process( + [ + session_event( + SessionEventType.START_PERFORMANCE_TRACKING, ts=0 + ), + sample_event(SampleEventType.ISSUED, "", ts=1000), + ] + ) + assert len(agg._table) == 0 + finally: + agg.close() + + @pytest.mark.asyncio + async def test_multiple_samples_independent(self, tmp_path): + loop = asyncio.get_event_loop() + with ManagedZMQContext.scoped(socket_dir=str(tmp_path)) as ctx: + agg, registry, _ = make_aggregator(ctx, loop, "agg_multi_independent") + try: + await agg.process( + [ + session_event( + SessionEventType.START_PERFORMANCE_TRACKING, ts=0 + ), + sample_event(SampleEventType.ISSUED, "s1", ts=1000), + sample_event(SampleEventType.ISSUED, "s2", ts=1500), + sample_event(SampleEventType.RECV_FIRST, "s1", ts=2000), + sample_event(SampleEventType.RECV_FIRST, "s2", ts=3000), + sample_event(SampleEventType.COMPLETE, "s1", ts=4000), + sample_event(SampleEventType.COMPLETE, "s2", ts=5000), + ] + ) + # ttft: s1 = 1000, s2 = 1500 + # sample_latency: s1 = 3000, s2 = 3500 + assert ( + snapshot_series_count(registry, MetricSeriesKey.TTFT_NS.value) == 2 + ) + assert ( + snapshot_series_total(registry, MetricSeriesKey.TTFT_NS.value) + == 1000 + 1500 + ) + assert ( + snapshot_series_count( + registry, MetricSeriesKey.SAMPLE_LATENCY_NS.value + ) + == 2 + ) + assert ( + snapshot_series_total( + registry, MetricSeriesKey.SAMPLE_LATENCY_NS.value + ) + == 3000 + 3500 + ) + finally: + agg.close() + + @pytest.mark.asyncio + async def test_error_event_increments_total_failed(self, tmp_path): + """ERROR for an untracked event increments 
TOTAL_SAMPLES_FAILED only. + + Tracked-failed paths are covered by ``test_aggregator_error_handler.py``; + here we just confirm the error doesn't crash the loop and the rest of + the batch processes normally. + """ + loop = asyncio.get_event_loop() + with ManagedZMQContext.scoped(socket_dir=str(tmp_path)) as ctx: + agg, registry, _ = make_aggregator(ctx, loop, "agg_error_total") + try: + await agg.process( + [ + session_event( + SessionEventType.START_PERFORMANCE_TRACKING, ts=0 + ), + EventRecord( + event_type=ErrorEventType.GENERIC, + timestamp_ns=500, + data=ErrorData(error_type="test", error_message="boom"), + ), + sample_event(SampleEventType.ISSUED, "s1", ts=1000), + sample_event(SampleEventType.COMPLETE, "s1", ts=2000), + ] + ) + assert ( + snapshot_series_total( + registry, MetricSeriesKey.SAMPLE_LATENCY_NS.value + ) + == 1000 + ) + counters = snapshot_counters(registry) + assert counters[MetricCounterKey.TOTAL_SAMPLES_FAILED.value] == 1 + finally: + agg.close() + + @pytest.mark.asyncio + async def test_session_started_stores_timestamp(self, tmp_path): + loop = asyncio.get_event_loop() + with ManagedZMQContext.scoped(socket_dir=str(tmp_path)) as ctx: + agg, _, _ = make_aggregator(ctx, loop, "agg_started_ts") + try: + await agg.process([session_event(SessionEventType.STARTED, ts=42)]) + assert agg._table.session_started_ns == 42 + finally: + agg.close() + + @pytest.mark.asyncio + async def test_process_multiple_batches(self, tmp_path): """Two sequential process() calls maintain state correctly.""" - store = InMemoryKVStore() - agg = make_stub_aggregator(store) - - await agg.process( - [ - session_event(SessionEventType.START_PERFORMANCE_TRACKING, ts=0), - sample_event(SampleEventType.ISSUED, "s1", ts=1000), - ] - ) - assert agg._table.get_row("s1") is not None - - await agg.process( - [ - sample_event(SampleEventType.RECV_FIRST, "s1", ts=2000), - sample_event(SampleEventType.COMPLETE, "s1", ts=3000), - ] - ) - assert 1000 in store.get_series_values("ttft_ns") 
- assert 2000 in store.get_series_values("sample_latency_ns") - assert agg._table.get_row("s1") is None - - @pytest.mark.asyncio - async def test_ended_in_second_batch(self): + loop = asyncio.get_event_loop() + with ManagedZMQContext.scoped(socket_dir=str(tmp_path)) as ctx: + agg, registry, _ = make_aggregator(ctx, loop, "agg_multi_batch") + try: + await agg.process( + [ + session_event( + SessionEventType.START_PERFORMANCE_TRACKING, ts=0 + ), + sample_event(SampleEventType.ISSUED, "s1", ts=1000), + ] + ) + assert agg._table.get_row("s1") is not None + + await agg.process( + [ + sample_event(SampleEventType.RECV_FIRST, "s1", ts=2000), + sample_event(SampleEventType.COMPLETE, "s1", ts=3000), + ] + ) + assert ( + snapshot_series_total(registry, MetricSeriesKey.TTFT_NS.value) + == 1000 + ) + assert ( + snapshot_series_total( + registry, MetricSeriesKey.SAMPLE_LATENCY_NS.value + ) + == 2000 + ) + assert agg._table.get_row("s1") is None + finally: + agg.close() + + @pytest.mark.asyncio + async def test_ended_in_second_batch(self, tmp_path): """ENDED in a later batch still triggers finalize.""" - store = InMemoryKVStore() - agg = make_stub_aggregator(store) - await agg.process([session_event(SessionEventType.STARTED, ts=0)]) - assert not store.closed - await agg.process([session_event(SessionEventType.ENDED, ts=100)]) - assert store.closed + loop = asyncio.get_event_loop() + with ManagedZMQContext.scoped(socket_dir=str(tmp_path)) as ctx: + agg, _, publisher = make_aggregator(ctx, loop, "agg_ended_second_batch") + try: + await agg.process([session_event(SessionEventType.STARTED, ts=0)]) + publisher.publish_final.assert_not_awaited() + await agg.process([session_event(SessionEventType.ENDED, ts=100)]) + publisher.publish_final.assert_awaited_once() + finally: + agg.close() + + +# --------------------------------------------------------------------------- +# Counter accounting (issued / completed) +# 
--------------------------------------------------------------------------- + + +@pytest.mark.unit +class TestCounterAccounting: + @pytest.mark.asyncio + async def test_total_vs_tracked_counters(self, tmp_path): + """Untracked ISSUED counts toward ``total_samples_issued`` but not + ``tracked_samples_issued``; same for COMPLETED. + """ + loop = asyncio.get_event_loop() + with ManagedZMQContext.scoped(socket_dir=str(tmp_path)) as ctx: + agg, registry, _ = make_aggregator(ctx, loop, "agg_total_vs_tracked") + try: + await agg.process( + [ + session_event(SessionEventType.STARTED, ts=0), + # Untracked: warmup ISSUED before START_PERFORMANCE_TRACKING. + sample_event(SampleEventType.ISSUED, "warmup", ts=10), + sample_event(SampleEventType.COMPLETE, "warmup", ts=20), + session_event( + SessionEventType.START_PERFORMANCE_TRACKING, ts=30 + ), + sample_event(SampleEventType.ISSUED, "s1", ts=100), + sample_event(SampleEventType.COMPLETE, "s1", ts=200), + ] + ) + counters = snapshot_counters(registry) + # Both samples count toward total. + assert counters[MetricCounterKey.TOTAL_SAMPLES_ISSUED.value] == 2 + assert counters[MetricCounterKey.TOTAL_SAMPLES_COMPLETED.value] == 2 + # Only s1 was tracked. 
+ assert counters[MetricCounterKey.TRACKED_SAMPLES_ISSUED.value] == 1 + assert counters[MetricCounterKey.TRACKED_SAMPLES_COMPLETED.value] == 1 + finally: + agg.close() # --------------------------------------------------------------------------- @@ -443,211 +773,350 @@ async def test_ended_in_second_batch(self): @pytest.mark.unit class TestAsyncTriggers: @pytest.mark.asyncio - async def test_isl_text_path_async(self): + async def test_isl_text_path_async(self, tmp_path): """ISL with text prompt triggers async tokenization.""" - store = InMemoryKVStore() - loop = asyncio.get_running_loop() + loop = asyncio.get_event_loop() pool = MockTokenizePool(delay=0.01) - agg = make_async_stub_aggregator(store, pool, loop) - - await agg.process( - [ - session_event(SessionEventType.START_PERFORMANCE_TRACKING, ts=0), - sample_event( - SampleEventType.ISSUED, - "s1", - ts=1000, - data=PromptData(text="hello world foo bar"), - ), - ] - ) - # ISL task is in-flight; drain it - await agg._table.drain_tasks() - assert 4 in store.get_series_values("isl") - - @pytest.mark.asyncio - async def test_osl_emitted_on_complete(self): - """OSL is emitted via async tokenization when COMPLETE carries TextModelOutput.""" - store = InMemoryKVStore() - loop = asyncio.get_running_loop() + with ManagedZMQContext.scoped(socket_dir=str(tmp_path)) as ctx: + agg, registry, _ = make_aggregator( + ctx, loop, "agg_isl_text_async", tokenize_pool=pool + ) + try: + await agg.process( + [ + session_event( + SessionEventType.START_PERFORMANCE_TRACKING, ts=0 + ), + sample_event( + SampleEventType.ISSUED, + "s1", + ts=1000, + data=PromptData(text="hello world foo bar"), + ), + ] + ) + # ISL task is in-flight; drain it + await agg._table.drain_tasks() + assert snapshot_series_total(registry, MetricSeriesKey.ISL.value) == 4 + finally: + agg.close() + + @pytest.mark.asyncio + async def test_osl_emitted_on_complete(self, tmp_path): + """OSL is emitted via async tokenization when COMPLETE carries text.""" + loop = 
asyncio.get_event_loop() pool = MockTokenizePool(delay=0.01) - agg = make_async_stub_aggregator(store, pool, loop) - - await agg.process( - [ - session_event(SessionEventType.START_PERFORMANCE_TRACKING, ts=0), - sample_event(SampleEventType.ISSUED, "s1", ts=1000), - sample_event( - SampleEventType.COMPLETE, - "s1", - ts=5000, - data=text_output("the quick brown fox"), - ), - ] - ) - await agg._table.drain_tasks() - assert 4000 in store.get_series_values("sample_latency_ns") - assert 4 in store.get_series_values("osl") - - @pytest.mark.asyncio - async def test_tpot_emitted_for_streaming(self): + with ManagedZMQContext.scoped(socket_dir=str(tmp_path)) as ctx: + agg, registry, _ = make_aggregator( + ctx, loop, "agg_osl_complete", tokenize_pool=pool + ) + try: + await agg.process( + [ + session_event( + SessionEventType.START_PERFORMANCE_TRACKING, ts=0 + ), + sample_event(SampleEventType.ISSUED, "s1", ts=1000), + sample_event( + SampleEventType.COMPLETE, + "s1", + ts=5000, + data=text_output("the quick brown fox"), + ), + ] + ) + await agg._table.drain_tasks() + # sample_latency_ns = 5000-1000 = 4000 + assert ( + snapshot_series_total( + registry, MetricSeriesKey.SAMPLE_LATENCY_NS.value + ) + == 4000 + ) + # OSL = 4 tokens + assert snapshot_series_total(registry, MetricSeriesKey.OSL.value) == 4 + finally: + agg.close() + + @pytest.mark.asyncio + async def test_tpot_emitted_for_streaming(self, tmp_path): """TPOT is emitted for streaming responses using text_after_first_chunk.""" - store = InMemoryKVStore() - loop = asyncio.get_running_loop() + loop = asyncio.get_event_loop() pool = MockTokenizePool(delay=0.0) - agg = make_async_stub_aggregator(store, pool, loop) - - await agg.process( - [ - session_event(SessionEventType.START_PERFORMANCE_TRACKING, ts=0), - sample_event(SampleEventType.ISSUED, "s1", ts=1000), - sample_event(SampleEventType.RECV_FIRST, "s1", ts=2000), - sample_event( - SampleEventType.COMPLETE, - "s1", - ts=5000, - # Streaming: 3 chunks, 
text_after_first_chunk = "world foo" - data=streaming_text("hello", " world", " foo"), - ), - ] - ) - await agg._table.drain_tasks() - assert 3 in store.get_series_values("osl") # "hello world foo" = 3 tokens - # tpot = (5000 - 2000) / token_count("world foo") = 3000 / 2 = 1500 - assert 1500.0 in store.get_series_values("tpot_ns") - - @pytest.mark.asyncio - async def test_tpot_skipped_when_single_chunk(self): + with ManagedZMQContext.scoped(socket_dir=str(tmp_path)) as ctx: + agg, registry, _ = make_aggregator( + ctx, loop, "agg_tpot_streaming", tokenize_pool=pool + ) + try: + await agg.process( + [ + session_event( + SessionEventType.START_PERFORMANCE_TRACKING, ts=0 + ), + sample_event(SampleEventType.ISSUED, "s1", ts=1000), + sample_event(SampleEventType.RECV_FIRST, "s1", ts=2000), + sample_event( + SampleEventType.COMPLETE, + "s1", + ts=5000, + # Streaming: text_after_first_chunk = "world foo" + data=streaming_text("hello", " world", " foo"), + ), + ] + ) + await agg._table.drain_tasks() + # OSL = "hello world foo" = 3 tokens + assert snapshot_series_total(registry, MetricSeriesKey.OSL.value) == 3 + # tpot = (5000 - 2000) / token_count("world foo") = 3000 / 2 = 1500 + assert snapshot_series_total( + registry, MetricSeriesKey.TPOT_NS.value + ) == pytest.approx(1500.0) + finally: + agg.close() + + @pytest.mark.asyncio + async def test_tpot_skipped_when_single_chunk(self, tmp_path): """TPOT is not emitted when there are no tokens after the first chunk.""" - store = InMemoryKVStore() - loop = asyncio.get_running_loop() + loop = asyncio.get_event_loop() pool = MockTokenizePool(delay=0.0) - agg = make_async_stub_aggregator(store, pool, loop) - - await agg.process( - [ - session_event(SessionEventType.START_PERFORMANCE_TRACKING, ts=0), - sample_event(SampleEventType.ISSUED, "s1", ts=1000), - sample_event(SampleEventType.RECV_FIRST, "s1", ts=2000), - sample_event( - SampleEventType.COMPLETE, - "s1", - ts=5000, - # Single chunk: text_after_first_chunk = "" - 
data=streaming_text("only"), - ), - ] - ) - await agg._table.drain_tasks() - assert 1 in store.get_series_values("osl") - assert store.get_series_values("tpot_ns") == [] - - @pytest.mark.asyncio - async def test_tpot_not_emitted_without_streaming_flag(self): - """TPOT trigger is not registered when streaming=False.""" - store = InMemoryKVStore() - loop = asyncio.get_running_loop() + with ManagedZMQContext.scoped(socket_dir=str(tmp_path)) as ctx: + agg, registry, _ = make_aggregator( + ctx, loop, "agg_tpot_single_chunk", tokenize_pool=pool + ) + try: + await agg.process( + [ + session_event( + SessionEventType.START_PERFORMANCE_TRACKING, ts=0 + ), + sample_event(SampleEventType.ISSUED, "s1", ts=1000), + sample_event(SampleEventType.RECV_FIRST, "s1", ts=2000), + sample_event( + SampleEventType.COMPLETE, + "s1", + ts=5000, + # Single chunk: text_after_first_chunk = "" + data=streaming_text("only"), + ), + ] + ) + await agg._table.drain_tasks() + assert snapshot_series_total(registry, MetricSeriesKey.OSL.value) == 1 + assert ( + snapshot_series_count(registry, MetricSeriesKey.TPOT_NS.value) == 0 + ) + finally: + agg.close() + + @pytest.mark.asyncio + async def test_tpot_not_emitted_without_streaming_flag(self, tmp_path): + """When ``streaming=False``, TPOT/TTFT/chunk_delta series are NOT + registered at all — the aggregator's snapshot has no entry for them. 
+ """ + loop = asyncio.get_event_loop() pool = MockTokenizePool(delay=0.0) - agg = make_async_stub_aggregator(store, pool, loop, streaming=False) - - await agg.process( - [ - session_event(SessionEventType.START_PERFORMANCE_TRACKING, ts=0), - sample_event(SampleEventType.ISSUED, "s1", ts=1000), - sample_event(SampleEventType.RECV_FIRST, "s1", ts=2000), - sample_event( - SampleEventType.COMPLETE, - "s1", - ts=5000, - data=streaming_text("hello", " world", " foo"), - ), - ] - ) - await agg._table.drain_tasks() - assert 4000 in store.get_series_values("sample_latency_ns") - assert 3 in store.get_series_values("osl") - assert store.get_series_values("tpot_ns") == [] - assert store.get_series_values("ttft_ns") == [] - assert store.get_series_values("chunk_delta_ns") == [] - - @pytest.mark.asyncio - async def test_tpot_non_streaming_output_skipped(self): + with ManagedZMQContext.scoped(socket_dir=str(tmp_path)) as ctx: + agg, registry, _ = make_aggregator( + ctx, + loop, + "agg_tpot_no_streaming", + tokenize_pool=pool, + streaming=False, + ) + try: + await agg.process( + [ + session_event( + SessionEventType.START_PERFORMANCE_TRACKING, ts=0 + ), + sample_event(SampleEventType.ISSUED, "s1", ts=1000), + sample_event(SampleEventType.RECV_FIRST, "s1", ts=2000), + sample_event( + SampleEventType.COMPLETE, + "s1", + ts=5000, + data=streaming_text("hello", " world", " foo"), + ), + ] + ) + await agg._table.drain_tasks() + # sample_latency / OSL still emitted in non-streaming mode. + assert ( + snapshot_series_total( + registry, MetricSeriesKey.SAMPLE_LATENCY_NS.value + ) + == 4000 + ) + assert snapshot_series_total(registry, MetricSeriesKey.OSL.value) == 3 + # The streaming-only series are unregistered. 
+ assert not registry.has_series(MetricSeriesKey.TPOT_NS.value) + assert not registry.has_series(MetricSeriesKey.TTFT_NS.value) + assert not registry.has_series(MetricSeriesKey.CHUNK_DELTA_NS.value) + finally: + agg.close() + + @pytest.mark.asyncio + async def test_tpot_non_streaming_output_skipped(self, tmp_path): """TPOT is not emitted for non-streaming (str) TextModelOutput.""" - store = InMemoryKVStore() - loop = asyncio.get_running_loop() + loop = asyncio.get_event_loop() pool = MockTokenizePool(delay=0.0) - agg = make_async_stub_aggregator(store, pool, loop) - - await agg.process( - [ - session_event(SessionEventType.START_PERFORMANCE_TRACKING, ts=0), - sample_event(SampleEventType.ISSUED, "s1", ts=1000), - sample_event(SampleEventType.RECV_FIRST, "s1", ts=2000), - sample_event( - SampleEventType.COMPLETE, - "s1", - ts=5000, - # Non-streaming: str output, text_after_first_chunk = "" - data=text_output("hello world foo"), - ), - ] - ) - await agg._table.drain_tasks() - assert 3 in store.get_series_values("osl") - assert store.get_series_values("tpot_ns") == [] - - @pytest.mark.asyncio - async def test_drain_tasks_awaits_in_flight(self): + with ManagedZMQContext.scoped(socket_dir=str(tmp_path)) as ctx: + agg, registry, _ = make_aggregator( + ctx, loop, "agg_tpot_str_output", tokenize_pool=pool + ) + try: + await agg.process( + [ + session_event( + SessionEventType.START_PERFORMANCE_TRACKING, ts=0 + ), + sample_event(SampleEventType.ISSUED, "s1", ts=1000), + sample_event(SampleEventType.RECV_FIRST, "s1", ts=2000), + sample_event( + SampleEventType.COMPLETE, + "s1", + ts=5000, + # Non-streaming str output: text_after_first_chunk = "" + data=text_output("hello world foo"), + ), + ] + ) + await agg._table.drain_tasks() + assert snapshot_series_total(registry, MetricSeriesKey.OSL.value) == 3 + assert ( + snapshot_series_count(registry, MetricSeriesKey.TPOT_NS.value) == 0 + ) + finally: + agg.close() + + @pytest.mark.asyncio + async def 
test_drain_tasks_awaits_in_flight(self, tmp_path): """drain_tasks() properly awaits all in-flight async trigger tasks.""" - store = InMemoryKVStore() - loop = asyncio.get_running_loop() + loop = asyncio.get_event_loop() pool = MockTokenizePool(delay=0.05) - agg = make_async_stub_aggregator(store, pool, loop) - - await agg.process( - [ - session_event(SessionEventType.START_PERFORMANCE_TRACKING, ts=0), - sample_event( - SampleEventType.ISSUED, - "s1", - ts=1000, - data=PromptData(text="a b c d e"), - ), - ] - ) - # Tasks are in-flight but not yet complete - assert len(agg._table._in_flight_tasks) > 0 - - await agg._table.drain_tasks() - assert len(agg._table._in_flight_tasks) == 0 - assert 5 in store.get_series_values("isl") - - @pytest.mark.asyncio - async def test_shutdown_drains_async_tasks(self): + with ManagedZMQContext.scoped(socket_dir=str(tmp_path)) as ctx: + agg, registry, _ = make_aggregator( + ctx, loop, "agg_drain_in_flight", tokenize_pool=pool + ) + try: + await agg.process( + [ + session_event( + SessionEventType.START_PERFORMANCE_TRACKING, ts=0 + ), + sample_event( + SampleEventType.ISSUED, + "s1", + ts=1000, + data=PromptData(text="a b c d e"), + ), + ] + ) + # Tasks are in-flight but not yet complete + assert agg._table.in_flight_tasks_count > 0 + + await agg._table.drain_tasks() + assert agg._table.in_flight_tasks_count == 0 + assert snapshot_series_total(registry, MetricSeriesKey.ISL.value) == 5 + finally: + agg.close() + + @pytest.mark.asyncio + async def test_shutdown_drains_async_tasks(self, tmp_path): """ENDED drains in-flight async tasks before finalizing.""" - store = InMemoryKVStore() - loop = asyncio.get_running_loop() + loop = asyncio.get_event_loop() pool = MockTokenizePool(delay=0.02) - agg = make_async_stub_aggregator(store, pool, loop) - - await agg.process( - [ - session_event(SessionEventType.START_PERFORMANCE_TRACKING, ts=0), - sample_event( - SampleEventType.ISSUED, - "s1", - ts=1000, - data=PromptData(text="one two three"), - ), 
- session_event(SessionEventType.ENDED, ts=2000), - ] - ) - # After ENDED, drain_tasks was called, so ISL should be emitted - assert 3 in store.get_series_values("isl") - assert store.closed - - # TODO: Add tests for trigger exception handling (logger.exception paths). - # Inject a MockTokenizePool that raises on token_count_async and verify: - # - No metric is emitted for the failing trigger - # - The aggregator does not crash - # - The task set is cleaned up (done_callback fires on failed tasks) + with ManagedZMQContext.scoped(socket_dir=str(tmp_path)) as ctx: + agg, registry, publisher = make_aggregator( + ctx, loop, "agg_shutdown_drain", tokenize_pool=pool + ) + try: + await agg.process( + [ + session_event( + SessionEventType.START_PERFORMANCE_TRACKING, ts=0 + ), + sample_event( + SampleEventType.ISSUED, + "s1", + ts=1000, + data=PromptData(text="one two three"), + ), + session_event(SessionEventType.ENDED, ts=2000), + ] + ) + # After ENDED, drain_tasks ran inside process() — ISL emitted. + assert snapshot_series_total(registry, MetricSeriesKey.ISL.value) == 3 + publisher.publish_final.assert_awaited_once() + finally: + agg.close() + + # NOTE(agents): Trigger exception handling (logger.exception paths) is not + # exercised here. Adding a MockTokenizePool that raises on + # token_count_async would let us assert no metric is emitted, the + # aggregator does not crash, and the task set is cleaned up. + + @pytest.mark.asyncio + async def test_drain_timeout_reports_pending_count(self, tmp_path): + """On drain timeout, publish_final must receive n_pending_tasks > 0. + + AGENTS.md and the ``MetricsSnapshot.n_pending_tasks`` docstring + document the consumer contract: a drain-timeout run is detected + downstream as ``state == COMPLETE and n_pending_tasks > 0``. If + the producer always reports 0 here, the timeout is silently + rebadged as a clean run and the Report shows no warning. 
+ """ + loop = asyncio.get_event_loop() + + class BlockingTokenizePool: + async def token_count_async(self, text, _loop): + await asyncio.sleep(10.0) # exceeds drain timeout + return 0 + + def token_count(self, text): + return 0 + + def close(self): + pass + + def __enter__(self): + return self + + def __exit__(self, *args): + self.close() + + with ManagedZMQContext.scoped(socket_dir=str(tmp_path)) as ctx: + agg, _, publisher = make_aggregator( + ctx, + loop, + "agg_drain_timeout", + tokenize_pool=BlockingTokenizePool(), + ) + agg._drain_timeout_s = 0.05 + try: + await agg.process( + [ + session_event( + SessionEventType.START_PERFORMANCE_TRACKING, ts=0 + ), + sample_event( + SampleEventType.ISSUED, + "s1", + ts=1000, + data=PromptData(text="some text to tokenize"), + ), + ] + ) + assert ( + agg._table.in_flight_tasks_count > 0 + ), "precondition: ISL task must be in-flight before ENDED" + await agg.process([session_event(SessionEventType.ENDED, ts=2000)]) + + publisher.publish_final.assert_awaited_once() + kwargs = publisher.publish_final.await_args.kwargs + assert kwargs["n_pending_tasks"] > 0, ( + f"drain timeout must report stuck tasks; got " + f"n_pending_tasks={kwargs['n_pending_tasks']}" + ) + finally: + agg.close() diff --git a/tests/unit/async_utils/services/metrics_aggregator/test_aggregator_e2e.py b/tests/unit/async_utils/services/metrics_aggregator/test_aggregator_e2e.py index 20319cec..6c0e6f65 100644 --- a/tests/unit/async_utils/services/metrics_aggregator/test_aggregator_e2e.py +++ b/tests/unit/async_utils/services/metrics_aggregator/test_aggregator_e2e.py @@ -13,321 +13,253 @@ # See the License for the specific language governing permissions and # limitations under the License. -"""End-to-end tests for MetricsAggregatorService with real ZMQ pub/sub. +"""End-to-end tests for the metrics aggregator's two delivery channels. 
-These tests launch an EventPublisherService, connect a MetricsAggregatorService -over ZMQ IPC, publish EventRecords, and verify the aggregator computes and -emits the correct metrics into the KVStore. +The aggregator publishes snapshots through two independent paths: + +1. ``final_snapshot.json`` on disk — the **primary** delivery surface + for the Report consumer. Written atomically by ``publish_final``. +2. ZMQ PUB → ``MetricsSnapshotSubscriber`` — live ticks for TUI / live + consumers, plus a terminal-state frame at end-of-run as a + "run is over" signal. + +These tests stand up a real ``MetricsPublisher`` and +``MetricsSnapshotSubscriber`` against a single ``ManagedZMQContext.scoped`` +context and verify both channels deliver the right state. The full event +pipeline (events → aggregator → metrics) is covered in +``test_aggregator.py``; this module is concerned with the publish layer. """ +from __future__ import annotations + import asyncio -import time -from threading import Lock +import json +from pathlib import Path import pytest import zmq -from inference_endpoint.async_utils.event_publisher import EventPublisherService -from inference_endpoint.async_utils.loop_manager import LoopManager -from inference_endpoint.async_utils.services.metrics_aggregator.aggregator import ( - MetricsAggregatorService, +from inference_endpoint.async_utils.services.metrics_aggregator.publisher import ( + MetricsPublisher, ) -from inference_endpoint.async_utils.transport.zmq.context import ManagedZMQContext -from inference_endpoint.core.record import ( - EventRecord, - SampleEventType, - SessionEventType, +from inference_endpoint.async_utils.services.metrics_aggregator.registry import ( + MetricsRegistry, ) +from inference_endpoint.async_utils.services.metrics_aggregator.snapshot import ( + MetricsSnapshotCodec, + SessionState, +) +from inference_endpoint.async_utils.services.metrics_aggregator.subscriber import ( + MetricsSnapshotSubscriber, +) +from 
inference_endpoint.async_utils.transport.zmq.context import ManagedZMQContext -from .conftest import InMemoryKVStore - -# --------------------------------------------------------------------------- -# Signaling KVStore for e2e tests -# --------------------------------------------------------------------------- - - -class SignalingKVStore(InMemoryKVStore): - """InMemoryKVStore that signals an asyncio.Event when a target series count is reached. - - This replaces the old CollectingEmitter.set_wait_target() pattern. Call - set_wait_target(event, count) before publishing records; the event will be - set once the total number of series values across all series keys reaches - the target count. - """ - - def __init__(self) -> None: - super().__init__() - self._target_event: asyncio.Event | None = None - self._target_count: int = 0 - self._lock = Lock() - - def set_wait_target(self, event: asyncio.Event, count: int) -> None: - self._target_event = event - self._target_count = count - - def update(self, key: str, value: float) -> None: - super().update(key, value) - with self._lock: - if self._target_event is not None: - total = sum(len(v) for v in self._series.values()) - if total >= self._target_count: - self._target_event.set() - +# Small but generous: ZMQ's slow-joiner means the subscriber may miss the +# first few publishes; we wait for the first delivered snapshot below +# rather than racing on a wall clock. 
+_WAIT_TIMEOUT = 3.0 -# --------------------------------------------------------------------------- -# Fixtures -# --------------------------------------------------------------------------- -_WAIT_TIMEOUT = 3.0 +async def _wait_for_terminal_state( + subscriber: MetricsSnapshotSubscriber, timeout: float = _WAIT_TIMEOUT +) -> bool: + """Poll ``subscriber.latest`` until a terminal-state frame arrives.""" + deadline = asyncio.get_event_loop().time() + timeout + while asyncio.get_event_loop().time() < deadline: + latest = subscriber.latest + if latest is not None and latest.state in ( + SessionState.COMPLETE, + SessionState.INTERRUPTED, + ): + return True + await asyncio.sleep(0.02) + return False @pytest.fixture -def zmq_context(): - with ManagedZMQContext.scoped() as ctx: +def zmq_ctx_scope(tmp_path: Path): + """Provide a scoped ManagedZMQContext for the duration of a test.""" + with ManagedZMQContext.scoped(socket_dir=str(tmp_path)) as ctx: yield ctx -@pytest.fixture -def publisher(zmq_context): +def _make_pair( + socket_name: str, + zmq_ctx: ManagedZMQContext, + loop: asyncio.AbstractEventLoop, + final_snapshot_path: Path, + *, + conflate: bool = False, +) -> tuple[MetricsPublisher, MetricsSnapshotSubscriber]: + """Bind a publisher then connect a subscriber on the same socket name. + + Order matters for IPC: the publisher binds first so the IPC file + exists before the subscriber connects. ``conflate=False`` (default) + keeps every received message — appropriate for these tests where we + want to count deliveries rather than just observe the freshest. 
+ """ try: - service = EventPublisherService(zmq_context) + publisher = MetricsPublisher( + MetricsSnapshotCodec(), + zmq_ctx, + socket_name, + loop, + final_snapshot_path=final_snapshot_path, + ) except zmq.ZMQError as exc: pytest.skip(f"ZMQ IPC bind unavailable (sandboxed?): {exc}") - yield service - service.close() - - -@pytest.fixture -def aggregator_loop(): - manager = LoopManager() - # Use unique name per test invocation to avoid loop reuse across tests - name = f"test_metrics_agg_{id(object())}" - return manager.create_loop(name) - - -@pytest.fixture -def signaling_store(): - return SignalingKVStore() - - -@pytest.fixture -def shutdown_event(): - return asyncio.Event() - - -@pytest.fixture -def aggregator( - publisher, aggregator_loop, zmq_context, signaling_store, shutdown_event -): - """MetricsAggregatorService connected to the publisher via ZMQ.""" - agg = MetricsAggregatorService( - publisher.bind_path, - zmq_context, - aggregator_loop, - topics=None, - kv_store=signaling_store, - tokenize_pool=None, - streaming=True, - shutdown_event=shutdown_event, + subscriber = MetricsSnapshotSubscriber( + socket_name, zmq_ctx, loop, conflate=conflate ) - aggregator_loop.call_soon_threadsafe(agg.start) - # Allow ZMQ slow-joiner to connect - time.sleep(0.5) - yield agg - if not agg.is_closed: - agg.close() - - -def _publish_and_sleep(publisher, record, delay=0.05): - """Publish a record, flush, and sleep briefly to let the event loop drain.""" - publisher.publish(record) - publisher.flush() - time.sleep(delay) - - -# --------------------------------------------------------------------------- -# E2E tests -# --------------------------------------------------------------------------- + subscriber.start() + return publisher, subscriber -@pytest.mark.integration -class TestAggregatorE2E: +@pytest.mark.unit +class TestFinalSnapshotDelivery: @pytest.mark.asyncio - async def test_single_sample_timing_metrics( - self, publisher, aggregator, signaling_store + async def 
test_publish_final_writes_json_and_signals_pubsub( + self, tmp_path: Path, zmq_ctx_scope: ManagedZMQContext ): - """Full streaming sample lifecycle over real ZMQ pub/sub.""" - done = asyncio.Event() - # Expect: ttft_ns, chunk_delta_ns, sample_latency_ns = 3 series values - signaling_store.set_wait_target(done, 3) - - _publish_and_sleep( - publisher, - EventRecord( - event_type=SessionEventType.START_PERFORMANCE_TRACKING, - timestamp_ns=0, - ), - ) - _publish_and_sleep( - publisher, - EventRecord( - event_type=SampleEventType.ISSUED, - timestamp_ns=1000, - sample_uuid="s1", - ), - ) - _publish_and_sleep( - publisher, - EventRecord( - event_type=SampleEventType.RECV_FIRST, - timestamp_ns=2000, - sample_uuid="s1", - ), - ) - _publish_and_sleep( - publisher, - EventRecord( - event_type=SampleEventType.RECV_NON_FIRST, - timestamp_ns=3000, - sample_uuid="s1", - ), + """``publish_final`` writes the JSON file AND fires the pub/sub signal. + + The JSON file is the primary Report source; the pub/sub frame is + the TUI shutdown signal. Both must land on a clean shutdown. + """ + loop = asyncio.get_event_loop() + target = tmp_path / "final_snapshot.json" + publisher, subscriber = _make_pair( + "test_e2e_final", + zmq_ctx_scope, + loop, + target, ) - _publish_and_sleep( - publisher, - EventRecord( - event_type=SampleEventType.COMPLETE, - timestamp_ns=4000, - sample_uuid="s1", - ), - ) - - await asyncio.wait_for(done.wait(), timeout=_WAIT_TIMEOUT) - - assert 1000 in signaling_store.get_series_values("ttft_ns") - assert 1000 in signaling_store.get_series_values("chunk_delta_ns") - assert 3000 in signaling_store.get_series_values("sample_latency_ns") + try: + registry = MetricsRegistry() + registry.register_counter("total_samples_completed") + registry.increment("total_samples_completed", 7) + + # ZMQ slow-joiner: give the SUB time to attach before publishing. 
+ await asyncio.sleep(0.2) + await publisher.publish_final(registry, n_pending_tasks=0) + + # JSON file landed with the right terminal state. + assert target.exists(), "publish_final must write final_snapshot.json" + decoded = json.loads(target.read_bytes()) + assert decoded["state"] == SessionState.COMPLETE.value + assert decoded["n_pending_tasks"] == 0 + + # Pub/sub signal landed at the subscriber as the most recent frame. + arrived = await _wait_for_terminal_state(subscriber) + assert arrived, "subscriber must receive terminal-state frame" + assert subscriber.latest is not None + assert subscriber.latest.state == SessionState.COMPLETE + finally: + subscriber.close() + publisher.close() @pytest.mark.asyncio - async def test_tracking_window_respected( - self, publisher, aggregator, signaling_store + async def test_live_tick_then_final( + self, tmp_path: Path, zmq_ctx_scope: ManagedZMQContext ): - """Samples issued before START_PERFORMANCE_TRACKING are not tracked.""" - done = asyncio.Event() - # Only s2 should produce metrics (1 metric: sample_latency_ns) - signaling_store.set_wait_target(done, 1) - - # Issue s1 before tracking starts -- should be ignored - _publish_and_sleep( - publisher, - EventRecord( - event_type=SampleEventType.ISSUED, - timestamp_ns=100, - sample_uuid="s1", - ), - ) - _publish_and_sleep( - publisher, - EventRecord( - event_type=SessionEventType.START_PERFORMANCE_TRACKING, - timestamp_ns=200, - ), - ) - _publish_and_sleep( - publisher, - EventRecord( - event_type=SampleEventType.ISSUED, - timestamp_ns=300, - sample_uuid="s2", - ), - ) - _publish_and_sleep( - publisher, - EventRecord( - event_type=SampleEventType.COMPLETE, - timestamp_ns=500, - sample_uuid="s1", - ), - ) - _publish_and_sleep( - publisher, - EventRecord( - event_type=SampleEventType.COMPLETE, - timestamp_ns=600, - sample_uuid="s2", - ), + """Live ticks deliver LIVE-state snapshots; final flips to COMPLETE. 
+ + Tracks the lifecycle a TUI sees: subscriber's ``latest`` is + updated by every live tick, then replaced by the terminal-state + frame at end-of-run. + """ + loop = asyncio.get_event_loop() + publisher, subscriber = _make_pair( + "test_e2e_live_then_final", + zmq_ctx_scope, + loop, + tmp_path / "final_snapshot.json", + # conflate=True mirrors the default subscriber setting — we + # only care which state is *most recent*, not the count. + conflate=True, ) + try: + registry = MetricsRegistry() + registry.register_counter("c") - await asyncio.wait_for(done.wait(), timeout=_WAIT_TIMEOUT) + # Slow-joiner grace. + await asyncio.sleep(0.2) - assert 300 in signaling_store.get_series_values("sample_latency_ns") - # s1 should not have produced any latency values besides s2's - latencies = signaling_store.get_series_values("sample_latency_ns") - assert len(latencies) == 1 + publisher.start( + registry, + publish_interval_s=0.05, + get_runtime_state=lambda: (SessionState.LIVE, 0), + ) - @pytest.mark.asyncio - async def test_session_ended_triggers_shutdown( - self, publisher, aggregator, signaling_store, shutdown_event - ): - """ENDED event causes store close and shutdown signal.""" - _publish_and_sleep( - publisher, - EventRecord( - event_type=SessionEventType.ENDED, - timestamp_ns=1000, - ), - ) - await asyncio.wait_for(shutdown_event.wait(), timeout=_WAIT_TIMEOUT) - assert signaling_store.closed + # Wait for at least one live snapshot to arrive. 
+ for _ in range(50): + await asyncio.sleep(0.05) + if subscriber.latest is not None: + break + assert subscriber.latest is not None, "expected at least one live tick" + assert subscriber.latest.state == SessionState.LIVE + + await publisher.publish_final(registry, n_pending_tasks=0) + arrived = await _wait_for_terminal_state(subscriber) + assert arrived + assert subscriber.latest is not None + assert subscriber.latest.state == SessionState.COMPLETE + finally: + subscriber.close() + publisher.close() @pytest.mark.asyncio - async def test_multiple_samples_concurrent( - self, publisher, aggregator, signaling_store + async def test_multiple_metrics_round_trip( + self, tmp_path: Path, zmq_ctx_scope: ManagedZMQContext ): - """Multiple samples in flight concurrently produce independent metrics.""" - done = asyncio.Event() - # 2 samples x 2 metrics each (ttft_ns + sample_latency_ns) = 4 - signaling_store.set_wait_target(done, 4) - - _publish_and_sleep( - publisher, - EventRecord( - event_type=SessionEventType.START_PERFORMANCE_TRACKING, - timestamp_ns=0, - ), + """Counters and series both land in the JSON file with the right shape. + + Counter values must be exact; series count + total must + round-trip. Histogram bucket geometry is covered in + ``test_registry.py`` and ``test_snapshot.py`` — here we confirm + the on-disk format preserves the shape end-to-end. 
+ """ + loop = asyncio.get_event_loop() + target = tmp_path / "final_snapshot.json" + publisher, subscriber = _make_pair( + "test_e2e_multimetric", + zmq_ctx_scope, + loop, + target, ) - for uuid, issued_ts, recv_ts, complete_ts in [ - ("a", 100, 200, 400), - ("b", 150, 350, 500), - ]: - _publish_and_sleep( - publisher, - EventRecord( - event_type=SampleEventType.ISSUED, - timestamp_ns=issued_ts, - sample_uuid=uuid, - ), - ) - _publish_and_sleep( - publisher, - EventRecord( - event_type=SampleEventType.RECV_FIRST, - timestamp_ns=recv_ts, - sample_uuid=uuid, - ), + try: + registry = MetricsRegistry() + registry.register_counter("tracked_samples_issued") + registry.register_counter("tracked_samples_completed") + registry.register_series( + "sample_latency_ns", + hdr_low=1, + hdr_high=3_600_000_000_000, + sig_figs=3, + n_histogram_buckets=10, + percentiles=(50.0, 99.0), ) - _publish_and_sleep( - publisher, - EventRecord( - event_type=SampleEventType.COMPLETE, - timestamp_ns=complete_ts, - sample_uuid=uuid, - ), - ) - - await asyncio.wait_for(done.wait(), timeout=_WAIT_TIMEOUT) - - ttfts = signaling_store.get_series_values("ttft_ns") - latencies = signaling_store.get_series_values("sample_latency_ns") - assert 100 in ttfts # a: 200 - 100 - assert 300 in latencies # a: 400 - 100 - assert 200 in ttfts # b: 350 - 150 - assert 350 in latencies # b: 500 - 150 + for _ in range(2): + registry.increment("tracked_samples_issued") + registry.increment("tracked_samples_completed") + registry.record("sample_latency_ns", 1_500_000) + registry.record("sample_latency_ns", 2_500_000) + + # Slow-joiner grace. 
+ await asyncio.sleep(0.2) + await publisher.publish_final(registry, n_pending_tasks=0) + + decoded = json.loads(target.read_bytes()) + counters = { + m["name"]: m["value"] + for m in decoded["metrics"] + if m["type"] == "counter" + } + series = {m["name"]: m for m in decoded["metrics"] if m["type"] == "series"} + assert counters["tracked_samples_issued"] == 2 + assert counters["tracked_samples_completed"] == 2 + assert "sample_latency_ns" in series + assert series["sample_latency_ns"]["count"] == 2 + assert series["sample_latency_ns"]["total"] == 4_000_000 + finally: + subscriber.close() + publisher.close() diff --git a/tests/unit/async_utils/services/metrics_aggregator/test_aggregator_error_handler.py b/tests/unit/async_utils/services/metrics_aggregator/test_aggregator_error_handler.py new file mode 100644 index 00000000..4e8222c4 --- /dev/null +++ b/tests/unit/async_utils/services/metrics_aggregator/test_aggregator_error_handler.py @@ -0,0 +1,243 @@ +# SPDX-FileCopyrightText: Copyright (c) 2026 NVIDIA CORPORATION & AFFILIATES. All rights reserved. +# SPDX-License-Identifier: Apache-2.0 +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. +# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. + +"""Targeted regression tests for the aggregator's ERROR-event handler. + +These tests cover the ``TRACKED_SAMPLES_FAILED`` increment path (design +v5 §3) without reviving the broader ``test_aggregator.py`` module. They +construct the aggregator with a mocked publisher and inject events +directly via ``process()``. 
+""" + +from __future__ import annotations + +from unittest.mock import MagicMock + +import pytest +from inference_endpoint.async_utils.services.metrics_aggregator.aggregator import ( + MetricCounterKey, + MetricsAggregatorService, +) +from inference_endpoint.async_utils.services.metrics_aggregator.registry import ( + MetricsRegistry, +) +from inference_endpoint.async_utils.services.metrics_aggregator.snapshot import ( + CounterStat, + SessionState, +) +from inference_endpoint.async_utils.transport.zmq.context import ManagedZMQContext +from inference_endpoint.core.record import ( + ErrorData, + ErrorEventType, + EventRecord, + SampleEventType, + SessionEventType, +) + + +def _counters(registry: MetricsRegistry) -> dict[str, int | float]: + """Read all counter values via a snapshot. State/n_pending don't matter + for counter inspection; we just need the snapshot to materialize values.""" + snap = registry.build_snapshot(state=SessionState.LIVE, n_pending_tasks=0) + return {m.name: m.value for m in snap.metrics if isinstance(m, CounterStat)} + + +def _make_aggregator( + zmq_ctx: ManagedZMQContext, + loop, + socket_name: str, + *, + streaming: bool = False, +) -> tuple[MetricsAggregatorService, MetricsRegistry, MagicMock]: + """Construct an aggregator with a real ZMQ subscriber and a mocked + publisher. ``start()`` is intentionally NOT called — we don't want the + socket reader added to the loop, since we'll inject events directly via + ``process()``. + + ``zmq_ctx`` must have a ``socket_dir`` set (pass via ``ManagedZMQContext. + scoped(socket_dir=...)``) since the aggregator's SUB socket connects on + IPC. 
+ """ + registry = MetricsRegistry() + publisher = MagicMock() + agg = MetricsAggregatorService( + socket_name, + zmq_ctx, + loop, + registry=registry, + publisher=publisher, + publish_interval_s=0.25, + sig_figs=3, + n_histogram_buckets=10, + streaming=streaming, + ) + return agg, registry, publisher + + +@pytest.mark.unit +@pytest.mark.asyncio +async def test_error_event_increments_tracked_failed_when_row_exists(tmp_path): + """ERROR for a tracked, in-flight sample increments BOTH total and + tracked failure counters. + + This only works because session.py emits ERROR before COMPLETE — if + the order regresses, the row is removed by set_field(...COMPLETE...) + before the ERROR handler runs and ``TRACKED_SAMPLES_FAILED`` silently + stays at 0. + """ + import asyncio + + loop = asyncio.get_event_loop() + with ManagedZMQContext.scoped(socket_dir=str(tmp_path)) as zmq_ctx: + agg, registry, _ = _make_aggregator(zmq_ctx, loop, "test_agg_err_in_flight") + try: + ts = 1_000_000_000 + uuid = "tracked-uuid-1" + + await agg.process( + [ + EventRecord(event_type=SessionEventType.STARTED, timestamp_ns=ts), + EventRecord( + event_type=SessionEventType.START_PERFORMANCE_TRACKING, + timestamp_ns=ts, + ), + EventRecord( + event_type=SampleEventType.ISSUED, + timestamp_ns=ts + 100, + sample_uuid=uuid, + ), + ] + ) + # Pre-condition: ISSUED while tracking creates a row. + assert agg._table.get_row(uuid) is not None + + # ERROR arrives while the row is still in flight. 
+ await agg.process( + [ + EventRecord( + event_type=ErrorEventType.GENERIC, + timestamp_ns=ts + 200, + sample_uuid=uuid, + data=ErrorData(error_type="t", error_message="boom"), + ) + ] + ) + + counters = _counters(registry) + assert counters[MetricCounterKey.TOTAL_SAMPLES_FAILED.value] == 1 + assert counters[MetricCounterKey.TRACKED_SAMPLES_FAILED.value] == 1 + finally: + agg.close() + + +@pytest.mark.unit +@pytest.mark.asyncio +async def test_error_after_complete_misses_tracked_failed(tmp_path): + """If COMPLETE arrives before ERROR, the tracked row is gone and the + aggregator cannot tell the failure was tracked. This documents the + failure mode that motivated the session.py event-order swap. + """ + import asyncio + + loop = asyncio.get_event_loop() + with ManagedZMQContext.scoped(socket_dir=str(tmp_path)) as zmq_ctx: + agg, registry, _ = _make_aggregator( + zmq_ctx, loop, "test_agg_err_after_complete" + ) + try: + ts = 1_000_000_000 + uuid = "out-of-order-uuid" + + # Reverse-order delivery: COMPLETE then ERROR. + await agg.process( + [ + EventRecord(event_type=SessionEventType.STARTED, timestamp_ns=ts), + EventRecord( + event_type=SessionEventType.START_PERFORMANCE_TRACKING, + timestamp_ns=ts, + ), + EventRecord( + event_type=SampleEventType.ISSUED, + timestamp_ns=ts + 100, + sample_uuid=uuid, + ), + EventRecord( + event_type=SampleEventType.COMPLETE, + timestamp_ns=ts + 200, + sample_uuid=uuid, + ), + EventRecord( + event_type=ErrorEventType.GENERIC, + timestamp_ns=ts + 201, + sample_uuid=uuid, + data=ErrorData(error_type="t", error_message="boom"), + ), + ] + ) + + counters = _counters(registry) + # Total still increments — the ERROR is observed. + assert counters[MetricCounterKey.TOTAL_SAMPLES_FAILED.value] == 1 + # But tracked DOES NOT — the row was already gone. This is the + # bug the session.py event-order swap was added to prevent. 
+ assert counters[MetricCounterKey.TRACKED_SAMPLES_FAILED.value] == 0 + finally: + agg.close() + + +@pytest.mark.unit +@pytest.mark.asyncio +async def test_error_for_untracked_sample_only_increments_total(tmp_path): + """Sample issued outside a tracking window has no row. ERROR for it + increments TOTAL but not TRACKED. + """ + import asyncio + + loop = asyncio.get_event_loop() + with ManagedZMQContext.scoped(socket_dir=str(tmp_path)) as zmq_ctx: + agg, registry, _ = _make_aggregator(zmq_ctx, loop, "test_agg_err_untracked") + try: + ts = 1_000_000_000 + uuid = "untracked-uuid" + + await agg.process( + [ + EventRecord(event_type=SessionEventType.STARTED, timestamp_ns=ts), + # No START_PERFORMANCE_TRACKING — ISSUED creates no row. + EventRecord( + event_type=SampleEventType.ISSUED, + timestamp_ns=ts + 100, + sample_uuid=uuid, + ), + ] + ) + assert agg._table.get_row(uuid) is None + + await agg.process( + [ + EventRecord( + event_type=ErrorEventType.GENERIC, + timestamp_ns=ts + 200, + sample_uuid=uuid, + data=ErrorData(error_type="t", error_message="boom"), + ) + ] + ) + + counters = _counters(registry) + assert counters[MetricCounterKey.TOTAL_SAMPLES_FAILED.value] == 1 + assert counters[MetricCounterKey.TRACKED_SAMPLES_FAILED.value] == 0 + finally: + agg.close() diff --git a/tests/unit/async_utils/services/metrics_aggregator/test_kv_store.py b/tests/unit/async_utils/services/metrics_aggregator/test_kv_store.py deleted file mode 100644 index f9e23cd7..00000000 --- a/tests/unit/async_utils/services/metrics_aggregator/test_kv_store.py +++ /dev/null @@ -1,395 +0,0 @@ -# SPDX-FileCopyrightText: Copyright (c) 2026 NVIDIA CORPORATION & AFFILIATES. All rights reserved. -# SPDX-License-Identifier: Apache-2.0 -# -# Licensed under the Apache License, Version 2.0 (the "License"); -# you may not use this file except in compliance with the License. 
-# You may obtain a copy of the License at -# -# http://www.apache.org/licenses/LICENSE-2.0 -# -# Unless required by applicable law or agreed to in writing, software -# distributed under the License is distributed on an "AS IS" BASIS, -# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. -# See the License for the specific language governing permissions and -# limitations under the License. - -"""Tests for the KVStore (BasicKVStore + BasicKVStoreReader).""" - -import math -import multiprocessing -import struct -from pathlib import Path - -import pytest -from inference_endpoint.async_utils.services.metrics_aggregator.kv_store import ( - BasicKVStore, - BasicKVStoreReader, - SeriesStats, -) - -# --------------------------------------------------------------------------- -# SeriesStats -# --------------------------------------------------------------------------- - - -@pytest.mark.unit -class TestSeriesStats: - def test_from_values(self): - stats = SeriesStats([10.0, 20.0, 5.0]) - assert stats.count == 3 - assert stats.total == 35.0 - assert stats.min_val == 5.0 - assert stats.max_val == 20.0 - - def test_sum_sq(self): - stats = SeriesStats([3.0, 4.0]) - assert stats.sum_sq == pytest.approx(3.0**2 + 4.0**2) - - def test_empty(self): - stats = SeriesStats() - assert stats.count == 0 - assert stats.total == 0.0 - # Sentinel values for an empty series — compute_summary() is responsible - # for normalizing these to 0 before exposing them to users. 
- assert stats.min_val == math.inf - assert stats.max_val == -math.inf - - def test_incremental_rollup(self): - stats = SeriesStats([1.0, 2.0]) - assert stats._last_rollup_idx == 2 - stats.values.extend([3.0, 4.0]) - stats._update_rollup() - assert stats.count == 4 - assert stats.total == 10.0 - assert stats._last_rollup_idx == 4 - - -# --------------------------------------------------------------------------- -# BasicKVStore (writer) -# --------------------------------------------------------------------------- - - -@pytest.mark.unit -class TestBasicKVStore: - def test_counter(self, tmp_path: Path): - store = BasicKVStore(tmp_path / "kv") - store.create_key("error_count", "counter") - store.update("error_count", 5) - assert store.get("error_count") == 5 - store.update("error_count", 10) - assert store.get("error_count") == 10 - store.close() - - def test_counter_returns_int(self, tmp_path: Path): - store = BasicKVStore(tmp_path / "kv") - store.create_key("c", "counter") - store.update("c", 42) - val = store.get("c") - assert isinstance(val, int) - store.close() - - def test_series_uint64(self, tmp_path: Path): - store = BasicKVStore(tmp_path / "kv") - store.create_key("ttft_ns", "series") - store.update("ttft_ns", 100) - store.update("ttft_ns", 200) - result = store.get("ttft_ns") - assert isinstance(result, SeriesStats) - assert result.count == 2 - assert result.values == [100, 200] - store.close() - - def test_series_float64(self, tmp_path: Path): - store = BasicKVStore(tmp_path / "kv") - store.create_key("ratio", "series", dtype=float) - store.update("ratio", 1.5) - store.update("ratio", 2.5) - result = store.get("ratio") - assert isinstance(result, SeriesStats) - assert result.count == 2 - assert result.values == [1.5, 2.5] - store.close() - - def test_snapshot(self, tmp_path: Path): - store = BasicKVStore(tmp_path / "kv") - store.create_key("n_issued", "counter") - store.create_key("latency", "series") - store.update("n_issued", 42) - store.update("latency", 
150)
-        store.update("latency", 250)
-
-        snap = store.snapshot()
-        assert snap["n_issued"] == 42
-        assert isinstance(snap["latency"], SeriesStats)
-        assert snap["latency"].count == 2
-        store.close()
-
-    def test_snapshot_is_isolated_from_later_writes(self, tmp_path: Path):
-        """Mutations after snapshot() must not alter the captured snapshot."""
-        store = BasicKVStore(tmp_path / "kv")
-        store.create_key("n_issued", "counter")
-        store.create_key("latency", "series")
-        store.update("n_issued", 5)
-        store.update("latency", 100)
-        store.update("latency", 200)
-
-        snap = store.snapshot()
-
-        store.update("n_issued", 99)
-        store.update("latency", 300)
-
-        assert snap["n_issued"] == 5
-        latency_snap = snap["latency"]
-        assert isinstance(latency_snap, SeriesStats)
-        assert latency_snap.count == 2
-        assert latency_snap.values == [100, 200]
-        assert latency_snap.total == 300
-        store.close()
-
-    def test_update_unknown_key_raises(self, tmp_path: Path):
-        store = BasicKVStore(tmp_path / "kv")
-        with pytest.raises(KeyError, match="Key not created"):
-            store.update("missing", 1)
-        store.close()
-
-    def test_create_key_idempotent(self, tmp_path: Path):
-        store = BasicKVStore(tmp_path / "kv")
-        store.create_key("x", "counter")
-        store.update("x", 5)
-        store.create_key("x", "counter")  # should not reset
-        assert store.get("x") == 5
-        store.close()
-
-    def test_unlink(self, tmp_path: Path):
-        store_dir = tmp_path / "kv"
-        store = BasicKVStore(store_dir)
-        store.create_key("a", "counter")
-        assert store_dir.exists()
-        store.unlink()
-        assert not store_dir.exists()
-
-
-# ---------------------------------------------------------------------------
-# BasicKVStoreReader
-# ---------------------------------------------------------------------------
-
-
-@pytest.mark.unit
-class TestBasicKVStoreReader:
-    def test_read_counter(self, tmp_path: Path):
-        store_dir = tmp_path / "kv"
-        writer = BasicKVStore(store_dir)
-        writer.create_key("count", "counter")
-        writer.update("count", 7)
-
-        reader = BasicKVStoreReader(store_dir)
-        reader.register_key("count", "counter")
-        assert reader.get("count") == 7
-
-        reader.close()
-        writer.close()
-
-    def test_read_series(self, tmp_path: Path):
-        store_dir = tmp_path / "kv"
-        writer = BasicKVStore(store_dir)
-        writer.create_key("ttft", "series")
-        writer.update("ttft", 100)
-        writer.update("ttft", 200)
-
-        reader = BasicKVStoreReader(store_dir)
-        reader.register_key("ttft", "series")
-        stats = reader.get("ttft")
-        assert isinstance(stats, SeriesStats)
-        assert stats.count == 2
-        assert stats.values == [100, 200]
-
-        reader.close()
-        writer.close()
-
-    def test_incremental_read(self, tmp_path: Path):
-        store_dir = tmp_path / "kv"
-        writer = BasicKVStore(store_dir)
-        writer.create_key("lat", "series")
-        writer.update("lat", 1000)
-
-        reader = BasicKVStoreReader(store_dir)
-        reader.register_key("lat", "series")
-        s1 = reader.get("lat")
-        assert isinstance(s1, SeriesStats)
-        assert s1.count == 1
-
-        writer.update("lat", 2000)
-        writer.update("lat", 3000)
-        s2 = reader.get("lat")
-        assert isinstance(s2, SeriesStats)
-        assert s2.count == 3
-        assert s2.total == 6000
-
-        reader.close()
-        writer.close()
-
-    def test_snapshot(self, tmp_path: Path):
-        store_dir = tmp_path / "kv"
-        writer = BasicKVStore(store_dir)
-        writer.create_key("n", "counter")
-        writer.create_key("s", "series")
-        writer.update("n", 5)
-        writer.update("s", 10)
-
-        reader = BasicKVStoreReader(store_dir)
-        reader.register_key("n", "counter")
-        reader.register_key("s", "series")
-        snap = reader.snapshot()
-        assert snap["n"] == 5
-        assert isinstance(snap["s"], SeriesStats)
-        assert snap["s"].count == 1
-
-        reader.close()
-        writer.close()
-
-    def test_reader_lazy_open(self, tmp_path: Path):
-        """Reader for a key whose file doesn't exist yet opens lazily."""
-        store_dir = tmp_path / "kv"
-        store_dir.mkdir()
-        reader = BasicKVStoreReader(store_dir)
-        reader.register_key("lat", "series")
-        s = reader.get("lat")
-        assert isinstance(s, SeriesStats)
-        assert s.count == 0
-
-        # Now create the writer and write
-        writer = BasicKVStore(store_dir)
-        writer.create_key("lat", "series")
-        writer.update("lat", 42)
-
-        s = reader.get("lat")
-        assert isinstance(s, SeriesStats)
-        assert s.count == 1
-        assert s.values == [42]
-
-        reader.close()
-        writer.close()
-
-
-# ---------------------------------------------------------------------------
-# Cross-process
-# ---------------------------------------------------------------------------
-
-
-def _child_read(store_dir_str: str, queue: multiprocessing.Queue) -> None:
-    store_dir = Path(store_dir_str)
-    reader = BasicKVStoreReader(store_dir)
-    reader.register_key("n", "counter")
-    reader.register_key("ttft", "series")
-    snap = reader.snapshot()
-    ttft = snap["ttft"]
-    assert isinstance(ttft, SeriesStats)
-    queue.put((snap["n"], ttft.count, ttft.values))
-    reader.close()
-
-
-@pytest.mark.unit
-class TestCrossProcess:
-    def test_cross_process_read(self, tmp_path: Path):
-        store_dir = tmp_path / "kv"
-        writer = BasicKVStore(store_dir)
-        writer.create_key("n", "counter")
-        writer.create_key("ttft", "series")
-        writer.update("n", 2)
-        writer.update("ttft", 42)
-        writer.update("ttft", 99)
-
-        q: multiprocessing.Queue = multiprocessing.Queue()
-        proc = multiprocessing.Process(target=_child_read, args=(str(store_dir), q))
-        proc.start()
-        proc.join(timeout=10)
-
-        assert not q.empty()
-        n, count, values = q.get()
-        assert n == 2
-        assert count == 2
-        assert values == [42, 99]
-
-        writer.close()
-
-
-# ---------------------------------------------------------------------------
-# Integer precision
-# ---------------------------------------------------------------------------
-
-# First integer not exactly representable in IEEE 754 float64 (53-bit mantissa).
-_BEYOND_FLOAT64 = 2**53 + 1
-
-
-@pytest.mark.unit
-class TestIntegerPrecision:
-    """Verify uint64 storage preserves integers that exceed float64 precision."""
-
-    def test_float64_struct_loses_precision(self):
-        """Confirm struct float64 roundtrip is lossy for _BEYOND_FLOAT64.
-
-        If this test fails, the other tests in TestIntegerPrecision lose
-        validity — they depend on _BEYOND_FLOAT64 being unrepresentable
-        in float64.
-        """
-        packed_d = struct.pack("
+def _new_table() -> MetricsTable:
+    """A MetricsTable backed by a fresh, empty MetricsRegistry."""
+    return MetricsTable(MetricsRegistry())
 
 
 @pytest.mark.unit
@@ -66,7 +81,7 @@ def test_completed_samples_increment(self):
 @pytest.mark.unit
 class TestMetricsTable:
     def test_create_and_get_row(self):
-        table = MetricsTable(InMemoryKVStore())
+        table = _new_table()
         table.is_tracking = True
         table.tracked_blocks.append(TrackedBlock(start_ns=0, last_complete_ns=0))
         ev = EventRecord(
@@ -77,7 +92,7 @@ def test_create_and_get_row(self):
         assert len(table) == 1
 
     def test_complete_removes_row(self):
-        table = MetricsTable(InMemoryKVStore())
+        table = _new_table()
         table.is_tracking = True
         table.tracked_blocks.append(TrackedBlock(start_ns=0, last_complete_ns=0))
         issued = EventRecord(
@@ -92,7 +107,7 @@ def test_complete_removes_row(self):
         assert len(table) == 0
 
     def test_set_field_noop_for_untracked(self):
-        table = MetricsTable(InMemoryKVStore())
+        table = _new_table()
         ev = EventRecord(
             event_type=SampleEventType.RECV_FIRST,
             timestamp_ns=200,
@@ -102,7 +117,7 @@ def test_set_field_noop_for_untracked(self):
         assert table.get_row("unknown") is None
 
     def test_issued_noop_when_not_tracking(self):
-        table = MetricsTable(InMemoryKVStore())
+        table = _new_table()
         ev = EventRecord(
             event_type=SampleEventType.ISSUED, timestamp_ns=100, sample_uuid="s1"
         )
@@ -110,7 +125,7 @@ def test_issued_noop_when_not_tracking(self):
         assert table.get_row("s1") is None
 
     def test_duplicate_issued_returns_existing(self):
-        table = MetricsTable(InMemoryKVStore())
+        table = _new_table()
         table.is_tracking = True
         table.tracked_blocks.append(TrackedBlock(start_ns=0, last_complete_ns=0))
         ev1 = EventRecord(
@@ -126,7 +141,7 @@ def test_duplicate_issued_returns_existing(self):
         assert len(table) == 1
 
     def test_multiple_rows(self):
-        table = MetricsTable(InMemoryKVStore())
+        table = _new_table()
        table.is_tracking = True
         table.tracked_blocks.append(TrackedBlock(start_ns=0, last_complete_ns=0))
         for uuid in ("s1", "s2", "s3"):
@@ -139,13 +154,13 @@ def test_multiple_rows(self):
         assert len(table) == 3
 
     def test_handle_session_started(self):
-        table = MetricsTable(InMemoryKVStore())
+        table = _new_table()
         ev = EventRecord(event_type=SessionEventType.STARTED, timestamp_ns=42)
         table.handle_session_event(ev)
         assert table.session_started_ns == 42
 
     def test_handle_start_stop_tracking(self):
-        table = MetricsTable(InMemoryKVStore())
+        table = _new_table()
         assert not table.is_tracking
 
         start = EventRecord(
@@ -163,7 +178,7 @@ def test_handle_start_stop_tracking(self):
         assert not table.is_tracking
 
     def test_duplicate_start_is_noop(self):
-        table = MetricsTable(InMemoryKVStore())
+        table = _new_table()
         start1 = EventRecord(
             event_type=SessionEventType.START_PERFORMANCE_TRACKING, timestamp_ns=100
         )
@@ -175,7 +190,7 @@ def test_duplicate_start_is_noop(self):
         assert len(table.tracked_blocks) == 1
 
     def test_tracked_block_updated_on_complete(self):
-        table = MetricsTable(InMemoryKVStore())
+        table = _new_table()
         start = EventRecord(
             event_type=SessionEventType.START_PERFORMANCE_TRACKING, timestamp_ns=0
         )
@@ -195,7 +210,7 @@ def test_tracked_block_updated_on_complete(self):
         assert table.total_completed_tracked_samples == 1
 
     def test_multiple_tracking_windows(self):
-        table = MetricsTable(InMemoryKVStore())
+        table = _new_table()
 
         # Block 0
         table.handle_session_event(
diff --git a/tests/unit/async_utils/services/metrics_aggregator/test_publisher.py b/tests/unit/async_utils/services/metrics_aggregator/test_publisher.py
new file mode 100644
index 00000000..9e26f734
--- /dev/null
+++ b/tests/unit/async_utils/services/metrics_aggregator/test_publisher.py
@@ -0,0 +1,254 @@
+# SPDX-FileCopyrightText: Copyright (c) 2026 NVIDIA CORPORATION & AFFILIATES. All rights reserved.
+# SPDX-License-Identifier: Apache-2.0
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""Tests for ``MetricsPublisher`` (tick task + final JSON write + pub/sub signal)."""
+
+from __future__ import annotations
+
+import asyncio
+import json
+from pathlib import Path
+from unittest.mock import MagicMock
+
+import pytest
+from inference_endpoint.async_utils.services.metrics_aggregator.publisher import (
+    MetricsPublisher,
+)
+from inference_endpoint.async_utils.services.metrics_aggregator.registry import (
+    MetricsRegistry,
+)
+from inference_endpoint.async_utils.services.metrics_aggregator.snapshot import (
+    MetricsSnapshotCodec,
+    SessionState,
+)
+from inference_endpoint.async_utils.transport.zmq.context import ManagedZMQContext
+
+
+@pytest.fixture
+def zmq_ctx_scope():
+    """Provide a scoped ManagedZMQContext for the duration of a test."""
+    with ManagedZMQContext.scoped() as ctx:
+        yield ctx
+
+
+@pytest.mark.unit
+class TestMetricsPublisher:
+    @pytest.mark.asyncio
+    async def test_start_schedules_tick_task(
+        self, tmp_path: Path, zmq_ctx_scope: ManagedZMQContext
+    ):
+        loop = asyncio.get_event_loop()
+        publisher = MetricsPublisher(
+            MetricsSnapshotCodec(),
+            zmq_ctx_scope,
+            "test_pub_start",
+            loop,
+            final_snapshot_path=tmp_path / "final_snapshot.json",
+        )
+        try:
+            registry = MetricsRegistry()
+            registry.register_counter("c")
+
+            calls = []
+
+            def get_runtime_state() -> tuple[SessionState, int]:
+                calls.append(True)
+                return SessionState.LIVE, 0
+
+            publisher.start(
+                registry,
+                publish_interval_s=0.01,
+                get_runtime_state=get_runtime_state,
+            )
+            assert publisher._tick_task is not None
+            assert not publisher._tick_task.done()
+
+            # Let at least one tick run.
+            await asyncio.sleep(0.05)
+            assert len(calls) >= 1
+        finally:
+            publisher.close()
+
+    @pytest.mark.asyncio
+    async def test_publish_final_writes_json_atomically(
+        self, tmp_path: Path, zmq_ctx_scope: ManagedZMQContext
+    ):
+        loop = asyncio.get_event_loop()
+        target = tmp_path / "final_snapshot.json"
+        publisher = MetricsPublisher(
+            MetricsSnapshotCodec(),
+            zmq_ctx_scope,
+            "test_pub_disk",
+            loop,
+            final_snapshot_path=target,
+        )
+        try:
+            registry = MetricsRegistry()
+            registry.register_counter("c")
+            registry.increment("c", 5)
+
+            await publisher.publish_final(registry, n_pending_tasks=0)
+
+            # The .tmp file MUST NOT exist after the rename.
+            tmp_target = target.with_suffix(target.suffix + ".tmp")
+            assert not tmp_target.exists(), "tmp file should have been renamed"
+            assert target.exists(), "final snapshot should be on disk"
+
+            decoded = json.loads(target.read_bytes())
+            assert decoded["state"] == SessionState.COMPLETE.value
+            assert decoded["n_pending_tasks"] == 0
+        finally:
+            publisher.close()
+
+    @pytest.mark.asyncio
+    async def test_publish_final_writes_interrupted_state(
+        self, tmp_path: Path, zmq_ctx_scope: ManagedZMQContext
+    ):
+        """``interrupted=True`` tags the snapshot INTERRUPTED so consumers
+        can distinguish "user killed the run" from a clean shutdown."""
+        loop = asyncio.get_event_loop()
+        target = tmp_path / "final_snapshot.json"
+        publisher = MetricsPublisher(
+            MetricsSnapshotCodec(),
+            zmq_ctx_scope,
+            "test_pub_interrupted",
+            loop,
+            final_snapshot_path=target,
+        )
+        try:
+            registry = MetricsRegistry()
+            registry.register_counter("c")
+            await publisher.publish_final(registry, n_pending_tasks=3, interrupted=True)
+            decoded = json.loads(target.read_bytes())
+            assert decoded["state"] == SessionState.INTERRUPTED.value
+            assert decoded["n_pending_tasks"] == 3
+        finally:
+            publisher.close()
+
+    @pytest.mark.asyncio
+    async def test_disk_failure_does_not_block_pubsub(
+        self, tmp_path: Path, zmq_ctx_scope: ManagedZMQContext
+    ):
+        """Disk write failure MUST NOT prevent pub/sub publish."""
+        loop = asyncio.get_event_loop()
+        # Point the fallback at a path whose parent is a *file*, not a dir.
+        # Writing into it will fail; pub/sub publish should still complete.
+        bad_parent = tmp_path / "not_a_dir"
+        bad_parent.write_bytes(b"this is a file, not a directory")
+        publisher = MetricsPublisher(
+            MetricsSnapshotCodec(),
+            zmq_ctx_scope,
+            "test_pub_diskfail",
+            loop,
+            final_snapshot_path=bad_parent / "final_snapshot.json",
+        )
+        try:
+            registry = MetricsRegistry()
+            registry.register_counter("c")
+
+            # Stub the inner ZMQ publisher with a recording mock so we can
+            # confirm it was called even though disk fails.
+            inner_mock = MagicMock()
+            publisher._publisher = inner_mock
+            await publisher.publish_final(registry, n_pending_tasks=0)
+
+            assert inner_mock.publish.call_count == 1
+            # Disk should not have been written.
+            assert not (bad_parent / "final_snapshot.json").exists()
+        finally:
+            try:
+                publisher.close()
+            except Exception:
+                # Inner mock may complain on close; we just want the test to
+                # exercise the disk-failure path without hanging.
+                pass
+
+    @pytest.mark.asyncio
+    async def test_publish_final_awaits_tick_task_cancellation(
+        self, tmp_path: Path, zmq_ctx_scope: ManagedZMQContext
+    ):
+        """publish_final MUST NOT return while the tick task could still emit.
+
+        ``self._tick_task.cancel()`` only schedules cancellation at the
+        next await point; without ``await``ing the task, a late live tick
+        landing after the COMPLETE frame would replace it in a
+        ``conflate=True`` SUB queue. publish_final must therefore await
+        cancellation before publishing COMPLETE.
+        """
+        loop = asyncio.get_event_loop()
+        publisher = MetricsPublisher(
+            MetricsSnapshotCodec(),
+            zmq_ctx_scope,
+            "test_pub_finalrace",
+            loop,
+            final_snapshot_path=tmp_path / "final_snapshot.json",
+        )
+        try:
+            registry = MetricsRegistry()
+            registry.register_counter("c")
+
+            publisher.start(
+                registry,
+                publish_interval_s=0.01,
+                get_runtime_state=lambda: (SessionState.LIVE, 0),
+            )
+            tick_task = publisher._tick_task
+            assert tick_task is not None
+            # Allow the tick to begin so we know it's running.
+            await asyncio.sleep(0.02)
+
+            await publisher.publish_final(registry, n_pending_tasks=0)
+
+            # After publish_final returns, the tick task MUST be done.
+            assert (
+                tick_task.done()
+            ), "tick task must be done before publish_final returns"
+            # And the publisher's reference is cleared so close() is a no-op
+            # for the tick path.
+            assert publisher._tick_task is None
+        finally:
+            publisher.close()
+
+    @pytest.mark.asyncio
+    async def test_close_cancels_tick_task(
+        self, tmp_path: Path, zmq_ctx_scope: ManagedZMQContext
+    ):
+        loop = asyncio.get_event_loop()
+        publisher = MetricsPublisher(
+            MetricsSnapshotCodec(),
+            zmq_ctx_scope,
+            "test_pub_close",
+            loop,
+            final_snapshot_path=tmp_path / "final_snapshot.json",
+        )
+
+        registry = MetricsRegistry()
+        registry.register_counter("c")
+        publisher.start(
+            registry,
+            publish_interval_s=0.1,
+            get_runtime_state=lambda: (SessionState.LIVE, 0),
+        )
+        tick_task = publisher._tick_task
+        assert tick_task is not None
+        publisher.close()
+
+        # Give the cancellation a chance to take effect.
+        try:
+            await asyncio.wait_for(tick_task, timeout=1.0)
+        except (asyncio.CancelledError, TimeoutError):
+            # Cancelled: expected. Timeout: also acceptable on slow CI.
+            pass
+        assert tick_task.done()
diff --git a/tests/unit/async_utils/services/metrics_aggregator/test_registry.py b/tests/unit/async_utils/services/metrics_aggregator/test_registry.py
new file mode 100644
index 00000000..d6759973
--- /dev/null
+++ b/tests/unit/async_utils/services/metrics_aggregator/test_registry.py
@@ -0,0 +1,438 @@
+# SPDX-FileCopyrightText: Copyright (c) 2026 NVIDIA CORPORATION & AFFILIATES. All rights reserved.
+# SPDX-License-Identifier: Apache-2.0
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+#     http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""Tests for ``MetricsRegistry`` and its samplers."""
+
+from __future__ import annotations
+
+import pytest
+from inference_endpoint.async_utils.services.metrics_aggregator.registry import (
+    CounterSampler,
+    MetricsRegistry,
+    SeriesSampler,
+)
+from inference_endpoint.async_utils.services.metrics_aggregator.snapshot import (
+    CounterStat,
+    SeriesStat,
+    SessionState,
+)
+
+# 1 hour in ns — same as the aggregator's default for time-series metrics.
+_NS_HIGH = 3_600_000_000_000
+
+
+@pytest.mark.unit
+class TestCounterSampler:
+    def test_increment_and_value(self):
+        c = CounterSampler("c", dtype=int)
+        c.increment(1)
+        c.increment(4)
+        assert c.value() == 5
+
+    def test_set(self):
+        c = CounterSampler("c", dtype=int)
+        c.increment(10)
+        c.set(2)
+        assert c.value() == 2
+
+    def test_build_stat(self):
+        c = CounterSampler("c", dtype=int)
+        c.increment(7)
+        stat = c.build_stat(exact=False)
+        assert isinstance(stat, CounterStat)
+        assert stat.name == "c"
+        assert stat.value == 7
+
+
+@pytest.mark.unit
+class TestSeriesSampler:
+    def _make(self, dtype=int):
+        return SeriesSampler(
+            "s",
+            hdr_low=1,
+            hdr_high=_NS_HIGH,
+            sig_figs=3,
+            n_histogram_buckets=5,
+            percentiles=(50.0, 99.0),
+            dtype=dtype,
+        )
+
+    def test_empty_build_stat(self):
+        s = self._make()
+        stat = s.build_stat(exact=False)
+        assert isinstance(stat, SeriesStat)
+        assert stat.count == 0
+        # No data → empty histogram. Edges are dynamic and only meaningful
+        # once min/max are observed.
+        assert stat.histogram == []
+
+    def test_record_and_rollups(self):
+        s = self._make()
+        for v in [10, 20, 30, 40, 50]:
+            s.record(v)
+        stat = s.build_stat(exact=False)
+        assert stat.count == 5
+        assert stat.total == 150
+        assert stat.min == 10
+        assert stat.max == 50
+        assert stat.sum_sq == 10**2 + 20**2 + 30**2 + 40**2 + 50**2
+
+    def test_hdr_percentiles_within_tolerance(self):
+        s = self._make()
+        for v in range(1, 101):  # 1..100
+            s.record(v * 1000)  # values: 1000..100000
+        stat = s.build_stat(exact=False)
+        # HDR with 3 sig figs is approximate but should be close.
+        # Keys are stringified percentile floats (e.g. "50.0", "99.0").
+        p50 = stat.percentiles.get("50.0", stat.percentiles.get("50"))
+        p99 = stat.percentiles.get("99.0", stat.percentiles.get("99"))
+        assert p50 == pytest.approx(50_000, rel=0.05)
+        assert p99 == pytest.approx(99_000, rel=0.05)
+
+    def test_final_exact_percentiles(self):
+        s = self._make()
+        for v in range(1, 101):
+            s.record(v * 1000)
+        stat = s.build_stat(exact=True)
+        # method="lower" returns observed values.
+        p50 = stat.percentiles.get("50.0", stat.percentiles.get("50"))
+        p99 = stat.percentiles.get("99.0", stat.percentiles.get("99"))
+        assert p50 == 50_000
+        assert p99 == 99_000
+
+    def test_final_histogram_is_dense(self):
+        s = self._make()
+        for v in range(1, 11):
+            s.record(v)
+        stat = s.build_stat(exact=True)
+        # Number of buckets matches what was registered.
+        assert len(stat.histogram) == 5
+        # Final histogram is exact: every recorded value lands in some bucket
+        # (clipped into range when out of bounds), so total == count.
+        total = sum(c for _, c in stat.histogram)
+        assert total == stat.count
+
+    def test_final_histogram_edges_track_observed_range(self):
+        """Dynamic edges span [observed_min, observed_max] of the data —
+        the histogram auto-zooms instead of using fixed [hdr_low, hdr_high].
+        """
+        s = self._make()
+        for v in (1_000_000, 2_000_000, 5_000_000, 10_000_000):
+            s.record(v)
+        stat = s.build_stat(exact=True)
+        # First bucket starts at observed min (or its log-clamp). Last
+        # bucket ends at observed max. Edges should be much tighter than
+        # the [1, _NS_HIGH] HDR bounds.
+        assert stat.histogram[0][0][0] >= 1
+        assert stat.histogram[0][0][0] <= 1_000_000
+        assert stat.histogram[-1][0][1] == pytest.approx(10_000_000)
+        # All values land in some bucket.
+        total = sum(c for _, c in stat.histogram)
+        assert total == stat.count == 4
+
+    def test_final_histogram_handles_zero_value(self):
+        """Sub-clamp raw values (e.g. 0) are clipped into the first bucket,
+        not dropped. Total bucket count equals the recorded count.
+        """
+        s = self._make()
+        s.record(0)
+        s.record(100)
+        s.record(1000)
+        stat = s.build_stat(exact=True)
+        total = sum(c for _, c in stat.histogram)
+        assert total == stat.count == 3
+
+    def test_hdr_histogram_count_matches_total(self):
+        """HDR-derived histogram bucket counts must sum to the recorded count.
+
+        Without this invariant, deriving display-bucket counts via the
+        difference of two ``get_count_at_value`` queries would silently
+        under-count: ``get_count_at_value(v)`` returns the count of the
+        single sub-bucket containing ``v``, not a cumulative count, so
+        the subtraction is meaningless.
+        """
+        s = self._make()
+        for v in range(1, 101):
+            s.record(v * 1000)
+        stat = s.build_stat(exact=False)
+        total = sum(c for _, c in stat.histogram)
+        # Every recorded value must land in exactly one display bucket.
+        assert total == stat.count == 100
+
+    def test_hdr_histogram_distribution_matches_exact(self):
+        """HDR-derived bucket counts approximate the exact counts.
+
+        Within ~5% relative tolerance per non-empty bucket: HDR's bucketing
+        rounds values into its sub-buckets, which can shift a few near a
+        display-bucket boundary, but the bulk shape matches.
+        """
+        # Values clustered into two display buckets so HDR rounding can't
+        # significantly redistribute the totals.
+        s = self._make()
+        # 60 values around 1e4, 40 values around 1e8 — far apart, so they
+        # end up in clearly distinct display buckets.
+        for _ in range(60):
+            s.record(10_000)
+        for _ in range(40):
+            s.record(100_000_000)
+        live = s.build_stat(exact=False)
+        ended = s.build_stat(exact=True)
+
+        live_counts = [c for _, c in live.histogram]
+        ended_counts = [c for _, c in ended.histogram]
+
+        # Both must agree on which buckets are non-empty.
+        assert [c > 0 for c in live_counts] == [c > 0 for c in ended_counts]
+        # And on totals (HDR aggregates exactly across buckets).
+        assert sum(live_counts) == sum(ended_counts) == 100
+
+    def test_float_dtype(self):
+        s = self._make(dtype=float)
+        s.record(1.5)
+        s.record(2.5)
+        s.record(3.5)
+        stat = s.build_stat(exact=True)
+        assert stat.count == 3
+        assert stat.total == pytest.approx(7.5)
+
+
+@pytest.mark.unit
+class TestMetricsRegistry:
+    def test_register_and_increment(self):
+        reg = MetricsRegistry()
+        reg.register_counter("c1")
+        reg.increment("c1", 1)
+        reg.increment("c1", 2)
+        snap = reg.build_snapshot(state=SessionState.LIVE, n_pending_tasks=0)
+        assert snap.counter == 1
+        # Find the counter in the snapshot.
+        counter_stats = [m for m in snap.metrics if isinstance(m, CounterStat)]
+        assert len(counter_stats) == 1
+        assert counter_stats[0].name == "c1"
+        assert counter_stats[0].value == 3
+
+    def test_set_counter(self):
+        reg = MetricsRegistry()
+        reg.register_counter("c1")
+        reg.set_counter("c1", 99)
+        snap = reg.build_snapshot(state=SessionState.LIVE, n_pending_tasks=0)
+        c = next(m for m in snap.metrics if isinstance(m, CounterStat))
+        assert c.value == 99
+
+    def test_record_series(self):
+        reg = MetricsRegistry()
+        reg.register_series(
+            "ttft_ns",
+            hdr_low=1,
+            hdr_high=_NS_HIGH,
+            sig_figs=3,
+            n_histogram_buckets=10,
+            percentiles=(50.0,),
+        )
+        for v in [100, 200, 300]:
+            reg.record("ttft_ns", v)
+        snap = reg.build_snapshot(state=SessionState.COMPLETE, n_pending_tasks=0)
+        s = next(m for m in snap.metrics if isinstance(m, SeriesStat))
+        assert s.count == 3
+        assert s.total == 600
+
+    def test_seq_increments(self):
+        reg = MetricsRegistry()
+        reg.register_counter("c")
+        s1 = reg.build_snapshot(state=SessionState.LIVE, n_pending_tasks=0)
+        s2 = reg.build_snapshot(state=SessionState.LIVE, n_pending_tasks=0)
+        assert s2.counter == s1.counter + 1
+
+    def test_complete_flag_propagates(self):
+        reg = MetricsRegistry()
+        snap = reg.build_snapshot(state=SessionState.COMPLETE, n_pending_tasks=2)
+        assert snap.state == SessionState.COMPLETE
+        assert snap.n_pending_tasks == 2
+
+    def test_name_collision_counter(self):
+        reg = MetricsRegistry()
+        reg.register_counter("dup")
+        with pytest.raises(ValueError, match="already registered"):
+            reg.register_counter("dup")
+
+    def test_name_collision_series(self):
+        reg = MetricsRegistry()
+        reg.register_series("dup", hdr_low=1, hdr_high=_NS_HIGH)
+        with pytest.raises(ValueError, match="already registered"):
+            reg.register_series("dup", hdr_low=1, hdr_high=_NS_HIGH)
+
+    def test_name_collision_cross_kind(self):
+        """A counter and a series MUST NOT share a name."""
+        reg = MetricsRegistry()
+        reg.register_counter("dup")
+        with pytest.raises(ValueError, match="already registered"):
+            reg.register_series("dup", hdr_low=1, hdr_high=_NS_HIGH)
+
+
+@pytest.mark.unit
+class TestSeriesSamplerBoundaries:
+    """Boundary-condition coverage for ``SeriesSampler``.
+
+    The tests below pin behavior at HDR bounds, sig_figs extremes, and
+    the warn-once clamp logic — internal contracts callers shouldn't have
+    to discover by reading source.
+    """
+
+    def _make(
+        self,
+        *,
+        hdr_low: int = 1,
+        hdr_high: int = _NS_HIGH,
+        sig_figs: int = 3,
+        dtype: type = int,
+    ) -> SeriesSampler:
+        return SeriesSampler(
+            "s",
+            hdr_low=hdr_low,
+            hdr_high=hdr_high,
+            sig_figs=sig_figs,
+            n_histogram_buckets=5,
+            percentiles=(50.0, 99.0),
+            dtype=dtype,
+        )
+
+    # -- HDR construction-time validation ----------------------------------
+
+    def test_high_below_2x_low_is_rejected(self):
+        # hdrhistogram requires high >= 2*low; the pre-check catches it
+        # up-front with both values in the message.
+        with pytest.raises(ValueError, match=r"high \(10\) must be >= 2 \* low \(6\)"):
+            self._make(hdr_low=6, hdr_high=10)
+
+    def test_high_equal_to_2x_low_is_accepted(self):
+        # Exact boundary: high == 2*low must succeed.
+        s = self._make(hdr_low=5, hdr_high=10)
+        s.record(7)
+        stat = s.build_stat(exact=True)
+        assert stat.count == 1
+
+    def test_low_zero_is_coerced_to_one(self):
+        # HDR rejects low=0; the sampler silently raises it to 1 to keep
+        # the "anything positive" registration contract.
+        s = self._make(hdr_low=0, hdr_high=100)
+        assert s._hdr_low == 1
+
+    def test_unsupported_dtype_rejected(self):
+        with pytest.raises(ValueError, match="Unsupported series dtype"):
+            self._make(dtype=str)  # type: ignore[arg-type]
+
+    # -- Value clamping at hot-path boundaries -----------------------------
+
+    def test_value_at_hdr_low_is_unclamped(self):
+        s = self._make(hdr_low=10, hdr_high=10_000)
+        s.record(10)
+        # No clamp → warn-once flag stays False.
+        assert s._warned_clamp is False
+        stat = s.build_stat(exact=True)
+        assert stat.min == 10 and stat.max == 10
+
+    def test_value_at_hdr_high_is_unclamped(self):
+        s = self._make(hdr_low=10, hdr_high=10_000)
+        s.record(10_000)
+        assert s._warned_clamp is False
+        stat = s.build_stat(exact=True)
+        assert stat.max == 10_000
+
+    def test_value_below_hdr_low_clamps_and_warns_once(self, caplog):
+        s = self._make(hdr_low=10, hdr_high=10_000)
+        with caplog.at_level("WARNING"):
+            s.record(5)
+            s.record(7)  # second under-clamp should NOT warn again
+        clamp_warnings = [
+            r for r in caplog.records if "outside HDR bounds" in r.message
+        ]
+        assert len(clamp_warnings) == 1
+        assert s._warned_clamp is True
+        # Raw values are preserved un-clamped — only the HDR view is clamped.
+        stat = s.build_stat(exact=True)
+        assert stat.min == 5
+        assert stat.count == 2
+
+    def test_value_above_hdr_high_clamps_and_warns_once(self, caplog):
+        s = self._make(hdr_low=10, hdr_high=1_000)
+        with caplog.at_level("WARNING"):
+            s.record(5_000)
+            s.record(10_000)
+        clamp_warnings = [
+            r for r in caplog.records if "outside HDR bounds" in r.message
+        ]
+        assert len(clamp_warnings) == 1
+        # Raw values preserved.
+        stat = s.build_stat(exact=True)
+        assert stat.max == 10_000
+
+    def test_float_value_uses_float_clamp(self):
+        # The int branch would int-truncate the clamp boundary; the float
+        # path must keep float-precision so 0.5 below an integer low is
+        # still recognized as below-bound.
+        s = self._make(hdr_low=10, hdr_high=10_000, dtype=float)
+        s.record(9.5)  # below low → clamped
+        assert s._warned_clamp is True
+
+    # -- sig_figs extremes -------------------------------------------------
+
+    def test_sig_figs_min(self):
+        # HDR accepts sig_figs in [1, 5]. sig_figs=1 means very coarse
+        # percentiles but must still satisfy the bucket-sum invariant.
+ s = self._make(sig_figs=1) + for v in range(1, 101): + s.record(v * 1000) + stat = s.build_stat(exact=False) + total = sum(c for _, c in stat.histogram) + assert total == stat.count == 100 + + def test_sig_figs_max(self): + # sig_figs=5 is the HDR max; sub-bucket count is largest and memory + # is highest, but construction must still work. + s = self._make(sig_figs=5) + s.record(1000) + s.record(50_000) + stat = s.build_stat(exact=True) + assert stat.count == 2 + + # -- Rollup edges ------------------------------------------------------ + + def test_count_one_rollups(self): + # Single-value series: min == max == total, sum_sq == value^2. + s = self._make() + s.record(42) + stat = s.build_stat(exact=True) + assert stat.count == 1 + assert stat.min == 42 + assert stat.max == 42 + assert stat.total == 42 + assert stat.sum_sq == 42 * 42 + + def test_empty_rollups_have_inf_min_neg_inf_max(self): + # No data: build_stat returns empty histogram and untouched min/max + # sentinels. Consumers MUST check count > 0 before reading min/max. + s = self._make() + stat = s.build_stat(exact=False) + assert stat.count == 0 + assert stat.histogram == [] + + def test_warn_once_resets_per_sampler(self): + # The warn-once flag is per-sampler, not per-process — a separate + # registration starts fresh. + s1 = self._make(hdr_low=10, hdr_high=100) + s2 = self._make(hdr_low=10, hdr_high=100) + s1.record(5) + assert s1._warned_clamp is True + assert s2._warned_clamp is False diff --git a/tests/unit/async_utils/services/metrics_aggregator/test_snapshot.py b/tests/unit/async_utils/services/metrics_aggregator/test_snapshot.py new file mode 100644 index 00000000..ba861094 --- /dev/null +++ b/tests/unit/async_utils/services/metrics_aggregator/test_snapshot.py @@ -0,0 +1,222 @@ +# SPDX-FileCopyrightText: Copyright (c) 2026 NVIDIA CORPORATION & AFFILIATES. All rights reserved. 
+# SPDX-License-Identifier: Apache-2.0
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""Tests for the snapshot wire schema and codec."""
+
+from __future__ import annotations
+
+import msgspec
+import msgspec.msgpack
+import pytest
+from inference_endpoint.async_utils.services.metrics_aggregator.snapshot import (
+    METRICS_SNAPSHOT_TOPIC,
+    CounterStat,
+    MetricsSnapshot,
+    MetricsSnapshotCodec,
+    SeriesStat,
+    SessionState,
+)
+from inference_endpoint.core.record import TOPIC_FRAME_SIZE
+
+
+@pytest.mark.unit
+class TestCounterStat:
+    def test_roundtrip(self):
+        stat = CounterStat(name="total_samples_issued", value=42)
+        encoded = msgspec.msgpack.encode(stat)
+        decoded = msgspec.msgpack.decode(encoded, type=CounterStat)
+        assert decoded == stat
+
+    def test_float_value(self):
+        stat = CounterStat(name="duration_s", value=3.14)
+        decoded = msgspec.msgpack.decode(msgspec.msgpack.encode(stat), type=CounterStat)
+        assert decoded.value == pytest.approx(3.14)
+
+
+@pytest.mark.unit
+class TestSeriesStat:
+    def test_roundtrip(self):
+        stat = SeriesStat(
+            name="ttft_ns",
+            count=5,
+            total=500,
+            min=50,
+            max=150,
+            sum_sq=55000,
+            percentiles={"50": 100.0, "99": 145.0},
+            histogram=[((50.0, 100.0), 2), ((100.0, 150.0), 3)],
+        )
+        encoded = msgspec.msgpack.encode(stat)
+        decoded = msgspec.msgpack.decode(encoded, type=SeriesStat)
+        assert decoded == stat
+
+
+@pytest.mark.unit
+class TestMetricsSnapshot:
+    def test_empty_metrics_roundtrip(self):
+        snap = MetricsSnapshot(
+            counter=1,
+            timestamp_ns=1234,
+            state=SessionState.LIVE,
+            n_pending_tasks=0,
+            metrics=[],
+        )
+        codec = MetricsSnapshotCodec()
+        topic, payload = codec.encode(snap)
+        assert topic == METRICS_SNAPSHOT_TOPIC
+        assert len(topic) == TOPIC_FRAME_SIZE
+        decoded = codec.decode(payload)
+        assert decoded == snap
+
+    def test_tagged_union_dispatch(self):
+        """Decoder must produce the right concrete type per tag."""
+        snap = MetricsSnapshot(
+            counter=2,
+            timestamp_ns=42,
+            state=SessionState.COMPLETE,
+            n_pending_tasks=3,
+            metrics=[
+                CounterStat(name="c1", value=10),
+                SeriesStat(
+                    name="s1",
+                    count=1,
+                    total=10,
+                    min=10,
+                    max=10,
+                    sum_sq=100,
+                    percentiles={"50": 10.0},
+                    histogram=[((1.0, 10.0), 1)],
+                ),
+            ],
+        )
+        codec = MetricsSnapshotCodec()
+        _, payload = codec.encode(snap)
+        decoded = codec.decode(payload)
+        assert isinstance(decoded.metrics[0], CounterStat)
+        assert isinstance(decoded.metrics[1], SeriesStat)
+        assert decoded.metrics[0].name == "c1"
+        assert decoded.metrics[1].name == "s1"
+
+    def test_on_decode_error_drops_malformed(self):
+        codec = MetricsSnapshotCodec()
+        # Decode a clearly malformed payload (truncated msgpack)
+        try:
+            codec.decode(b"\xff\x00")
+        except Exception as e:
+            fallback = codec.on_decode_error(b"\xff\x00", e)
+            assert fallback is None
+
+    def test_on_decode_error_reraises_unknown(self):
+        codec = MetricsSnapshotCodec()
+        # Non-decode errors should propagate.
+        with pytest.raises(RuntimeError):
+            codec.on_decode_error(b"", RuntimeError("not a decode error"))
+
+
+@pytest.mark.unit
+class TestSessionStateTransitions:
+    """The SessionState enum members are the only valid states the
+    aggregator surfaces on the wire. Tests below pin the forward-only
+    transition contract so a future enum addition / reorder doesn't
+    silently break the consumer's drain-timeout detection
+    (``state == COMPLETE and n_pending_tasks > 0``).
+    """
+
+    def test_all_states_are_string_serializable(self):
+        # MetricsSnapshot encodes state as the enum *value* via msgspec's
+        # str-Enum support. Each state must therefore round-trip as a
+        # string literal — protects against accidental int-Enum reorder.
+        for s in SessionState:
+            assert isinstance(s.value, str)
+            assert s.value == s
+
+    def test_expected_member_set(self):
+        # Pin the membership so a future addition is a deliberate review
+        # decision, not an accident. Adding a new state requires updating
+        # this test (and presumably the drain-timeout detection rule).
+        assert {s.value for s in SessionState} == {
+            "initialize",
+            "live",
+            "draining",
+            "complete",
+            "interrupted",
+        }
+
+    def test_forward_ordering_matches_declaration_order(self):
+        # The aggregator transitions in declaration order; consumers can
+        # rely on `list(SessionState)` for "did we move forward?" checks.
+        # COMPLETE and INTERRUPTED are sibling terminal states — either is
+        # reachable from DRAINING but not from each other.
+        assert list(SessionState) == [
+            SessionState.INITIALIZE,
+            SessionState.LIVE,
+            SessionState.DRAINING,
+            SessionState.COMPLETE,
+            SessionState.INTERRUPTED,
+        ]
+
+    @pytest.mark.parametrize(
+        "state, n_pending, complete",
+        [
+            (SessionState.LIVE, 0, False),
+            (SessionState.LIVE, 3, False),
+            (SessionState.DRAINING, 5, False),
+            (SessionState.COMPLETE, 0, True),
+            # Drain-timeout case: COMPLETE arrived but tasks were
+            # cancelled / abandoned. Consumer treats as incomplete.
+            (SessionState.COMPLETE, 1, False),
+            (SessionState.INITIALIZE, 0, False),
+            # INTERRUPTED never satisfies the complete predicate, regardless
+            # of n_pending_tasks — a kill-signal-triggered snapshot is
+            # always partial data.
+            (SessionState.INTERRUPTED, 0, False),
+            (SessionState.INTERRUPTED, 7, False),
+        ],
+    )
+    def test_complete_predicate(self, state, n_pending, complete):
+        """``complete = state == COMPLETE and n_pending_tasks == 0``."""
+        assert (state == SessionState.COMPLETE and n_pending == 0) is complete
+
+    def test_initialize_round_trips_on_the_wire(self):
+        # INITIALIZE is part of the wire schema; a snapshot tagged
+        # INITIALIZE must decode back to the same state. (The aggregator
+        # doesn't emit this state today — the tick task only starts on
+        # the first STARTED — but the codec must still tolerate it for
+        # future setup-phase ticks.)
+        snap = MetricsSnapshot(
+            counter=1,
+            timestamp_ns=1,
+            state=SessionState.INITIALIZE,
+            n_pending_tasks=0,
+            metrics=[],
+        )
+        codec = MetricsSnapshotCodec()
+        _, payload = codec.encode(snap)
+        decoded = codec.decode(payload)
+        assert decoded.state == SessionState.INITIALIZE
+
+    def test_interrupted_round_trips_on_the_wire(self):
+        # INTERRUPTED is emitted by the signal handler's publish_final
+        # call (SIGTERM/SIGINT). The msgpack wire form must round-trip.
+        snap = MetricsSnapshot(
+            counter=42,
+            timestamp_ns=12345,
+            state=SessionState.INTERRUPTED,
+            n_pending_tasks=3,
+            metrics=[CounterStat(name="c", value=7)],
+        )
+        codec = MetricsSnapshotCodec()
+        _, msgpack_payload = codec.encode(snap)
+        assert codec.decode(msgpack_payload).state == SessionState.INTERRUPTED
diff --git a/tests/unit/commands/test_benchmark.py b/tests/unit/commands/test_benchmark.py
index c664234f..8be87463 100644
--- a/tests/unit/commands/test_benchmark.py
+++ b/tests/unit/commands/test_benchmark.py
@@ -310,7 +310,7 @@ def test_from_yaml_file(self):
             model_params:
               name: "test-model"
             datasets:
-              - path: "tests/datasets/dummy_1k.jsonl"
+              - path: "tests/assets/datasets/dummy_1k.jsonl"
             endpoint_config:
               endpoints: ["http://test:8000"]
             """)
diff --git a/tests/unit/commands/test_benchmark_final_snapshot.py b/tests/unit/commands/test_benchmark_final_snapshot.py
new file mode 100644
index 00000000..e3eac7b5
--- /dev/null
+++ b/tests/unit/commands/test_benchmark_final_snapshot.py
@@ -0,0 +1,161 @@
+# SPDX-FileCopyrightText: Copyright (c) 2026 NVIDIA CORPORATION & AFFILIATES. All rights reserved.
+# SPDX-License-Identifier: Apache-2.0
+#
+# Licensed under the Apache License, Version 2.0 (the "License");
+# you may not use this file except in compliance with the License.
+# You may obtain a copy of the License at
+#
+# http://www.apache.org/licenses/LICENSE-2.0
+#
+# Unless required by applicable law or agreed to in writing, software
+# distributed under the License is distributed on an "AS IS" BASIS,
+# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+# See the License for the specific language governing permissions and
+# limitations under the License.
+
+"""Tests for the consumer-side final-snapshot read path in
+``commands/benchmark/execute.py``.
+
+The Report consumer reads ``final_snapshot.json`` as the primary source
+and falls back to the pub/sub subscriber's ``latest`` only if the file
+is missing (the aggregator was killed by an uncatchable signal before
+its handler ran). These tests pin both branches plus the
+malformed-file behavior, since this is the load-bearing path for the
+"JSON file is the canonical Report source" architecture.
+"""
+
+from __future__ import annotations
+
+import json
+from pathlib import Path
+
+import pytest
+from inference_endpoint.async_utils.services.metrics_aggregator.snapshot import (
+    SessionState,
+)
+from inference_endpoint.commands.benchmark.execute import (
+    _load_final_snapshot_from_disk,
+)
+from inference_endpoint.metrics.report import Report
+
+
+def _snapshot_dict(
+    *,
+    state: str = SessionState.COMPLETE.value,
+    n_pending_tasks: int = 0,
+    n_completed: int = 5,
+    duration_ns: int = 10_000_000_000,
+) -> dict:
+    """Build a minimal valid snapshot dict shaped like ``snapshot_to_dict``."""
+    return {
+        "counter": 1,
+        "timestamp_ns": 12345,
+        "state": state,
+        "n_pending_tasks": n_pending_tasks,
+        "metrics": [
+            {
+                "type": "counter",
+                "name": "tracked_samples_completed",
+                "value": n_completed,
+            },
+            {
+                "type": "counter",
+                "name": "tracked_samples_issued",
+                "value": n_completed,
+            },
+            {
+                "type": "counter",
+                "name": "tracked_duration_ns",
+                "value": duration_ns,
+            },
+            {
+                "type": "counter",
+                "name": "tracked_samples_failed",
+                "value": 0,
+            },
+        ],
+    }
+
+
+@pytest.mark.unit
+class TestLoadFinalSnapshotFromDisk:
+    def test_returns_none_if_file_missing(self, tmp_path: Path):
+        """SIGKILL / OOM-kill case: aggregator died before signal handler
+        could write. Loader returns None so the caller can fall back to
+        the live subscriber."""
+        missing = tmp_path / "does_not_exist.json"
+        assert _load_final_snapshot_from_disk(missing) is None
+
+    def test_reads_valid_json_as_dict(self, tmp_path: Path):
+        target = tmp_path / "final_snapshot.json"
+        target.write_text(json.dumps(_snapshot_dict()))
+        loaded = _load_final_snapshot_from_disk(target)
+        assert loaded is not None
+        assert loaded["state"] == SessionState.COMPLETE.value
+        assert loaded["n_pending_tasks"] == 0
+
+    def test_returns_none_on_malformed_json(self, tmp_path: Path, caplog):
+        """A truncated / corrupt file MUST NOT crash the Report build —
+        the caller falls back to the live subscriber and the report is
+        marked incomplete. A warning is logged so the failure is visible."""
+        target = tmp_path / "final_snapshot.json"
+        target.write_bytes(b"{not valid json")
+        with caplog.at_level("WARNING"):
+            result = _load_final_snapshot_from_disk(target)
+        assert result is None
+        assert any("Failed to read final snapshot" in r.message for r in caplog.records)
+
+
+@pytest.mark.unit
+class TestReportFromLoadedSnapshot:
+    """End-to-end: load JSON → build Report. Pins the
+    state→complete-flag→display-warning contract that the consumer
+    relies on across the three terminal states."""
+
+    @pytest.mark.parametrize(
+        "state, n_pending, expected_complete",
+        [
+            (SessionState.COMPLETE.value, 0, True),
+            # Drain-timeout: COMPLETE state but tasks still pending.
+            (SessionState.COMPLETE.value, 3, False),
+            # Interrupted: signal-handler-written snapshot.
+            (SessionState.INTERRUPTED.value, 0, False),
+            (SessionState.INTERRUPTED.value, 7, False),
+        ],
+    )
+    def test_report_complete_flag_matches_state_and_pending(
+        self, tmp_path: Path, state: str, n_pending: int, expected_complete: bool
+    ):
+        target = tmp_path / "final_snapshot.json"
+        target.write_text(
+            json.dumps(_snapshot_dict(state=state, n_pending_tasks=n_pending))
+        )
+        loaded = _load_final_snapshot_from_disk(target)
+        assert loaded is not None
+        report = Report.from_snapshot(loaded)
+        assert report.state == state
+        assert report.complete is expected_complete
+
+    def test_interrupted_display_surfaces_signal_warning(self, tmp_path: Path):
+        """An INTERRUPTED snapshot loaded from disk produces a Report
+        whose ``display()`` prominently calls out the signal-driven
+        shutdown — so a user reading the output knows the data is
+        partial, not just incomplete."""
+        target = tmp_path / "final_snapshot.json"
+        target.write_text(
+            json.dumps(_snapshot_dict(state=SessionState.INTERRUPTED.value))
+        )
+        report = Report.from_snapshot(_load_final_snapshot_from_disk(target) or {})
+        lines: list[str] = []
+        report.display(fn=lines.append, summary_only=True)
+        output = "\n".join(lines)
+        # Must surface the signal cause explicitly.
+        assert "interrupted" in output.lower()
+        assert "SIGTERM" in output or "signal" in output.lower()
+
+    def test_missing_file_path_fallback_yields_no_loaded_snapshot(self, tmp_path: Path):
+        """The contract the caller in execute.py relies on: missing file
+        → None → caller switches to live-snapshot fallback. This pins
+        the precondition the fallback chain depends on."""
+        result = _load_final_snapshot_from_disk(tmp_path / "nope.json")
+        assert result is None
diff --git a/tests/unit/load_generator/test_async_session.py b/tests/unit/load_generator/test_async_session.py
index 38dd014e..9d24618d 100644
--- a/tests/unit/load_generator/test_async_session.py
+++ b/tests/unit/load_generator/test_async_session.py
@@ -558,6 +558,18 @@ async def inject_error():
         # Bug #5: error event should also be published
         assert len(error_events) == 1
 
+        # ERROR must be emitted BEFORE COMPLETE so the metrics aggregator can
+        # observe the in-flight tracked row before set_field(...COMPLETE...)
+        # removes it. Reverting this order would silently zero
+        # tracked_samples_failed.
+        error_idx = publisher.events.index(error_events[0])
+        complete_idx = publisher.events.index(complete_events[0])
+        assert error_idx < complete_idx, (
+            f"ERROR event must be emitted before COMPLETE for metrics "
+            f"aggregator correctness; got error at idx {error_idx}, "
+            f"complete at idx {complete_idx}"
+        )
+
 
 @pytest.mark.unit
 class TestBenchmarkSessionPoissonIntegration:
diff --git a/tests/unit/metrics/test_report_builder.py b/tests/unit/metrics/test_report_builder.py
index d5e66526..587f7752 100644
--- a/tests/unit/metrics/test_report_builder.py
+++ b/tests/unit/metrics/test_report_builder.py
@@ -13,139 +13,143 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 
-"""Tests for report.py and report_builder.py."""
+"""Tests for ``Report.from_snapshot`` and display helpers.
+
+Reports are built from a ``MetricsSnapshot`` produced by a populated
+``MetricsRegistry``.
+"""
+
+from __future__ import annotations
 
 import json
+import math
 from pathlib import Path
 
 import pytest
-from inference_endpoint.async_utils.services.metrics_aggregator.kv_store import (
-    BasicKVStore,
-    BasicKVStoreReader,
-    SeriesStats,
+from inference_endpoint.async_utils.services.metrics_aggregator.aggregator import (
+    MetricCounterKey,
 )
-from inference_endpoint.metrics.report import Report, compute_summary
-
-# ---------------------------------------------------------------------------
-# compute_summary
-# ---------------------------------------------------------------------------
-
-
-@pytest.mark.unit
-class TestComputeSummary:
-    def test_empty(self):
-        s = compute_summary(SeriesStats())
-        assert s["total"] == 0
-        assert s["min"] == 0
-        assert s["max"] == 0
-        assert s["std_dev"] == 0
-        assert s["histogram"]["buckets"] == []
-
-    def test_single_value(self):
-        s = compute_summary(SeriesStats([42.0], dtype=float))
-        assert s["min"] == 42.0
-        assert s["max"] == 42.0
-        assert s["avg"] == 42.0
-        assert s["std_dev"] == 0.0
-
-    def test_multiple_values(self):
-        s = compute_summary(SeriesStats([1.0, 2.0, 3.0, 4.0, 5.0], dtype=float))
-        assert s["min"] == 1.0
-        assert s["max"] == 5.0
-        assert s["total"] == 15.0
-        assert s["avg"] == 3.0
-        assert s["median"] == 3.0
-        assert len(s["histogram"]["buckets"]) > 0
-        assert len(s["percentiles"]) > 0
-
-    def test_percentiles(self):
-        values = list(range(1, 101))  # 1..100
-        s = compute_summary(
-            SeriesStats([float(v) for v in values], dtype=float),
-            percentiles=(50, 90, 99),
-        )
-        assert s["percentiles"]["50"] == pytest.approx(50.5, abs=1)
-        assert s["percentiles"]["90"] == pytest.approx(90.1, abs=1)
-        assert s["percentiles"]["99"] == pytest.approx(99.01, abs=1)
-
-
-# ---------------------------------------------------------------------------
-# Helper: create a populated KVStore writer + reader
-# ---------------------------------------------------------------------------
-
+from inference_endpoint.async_utils.services.metrics_aggregator.metrics_table import (
+    MetricSeriesKey,
+)
+from inference_endpoint.async_utils.services.metrics_aggregator.registry import (
+    MetricsRegistry,
+)
+from inference_endpoint.async_utils.services.metrics_aggregator.snapshot import (
+    MetricsSnapshot,
+    SeriesStat,
+    SessionState,
+    snapshot_to_dict,
+)
+from inference_endpoint.metrics.report import Report
+
+# 1 hour in ns — same as the aggregator's default bound for time-series.
+_NS_HIGH = 3_600_000_000_000
+
+
+def _make_registry(n_samples: int = 50) -> MetricsRegistry:
+    """A registry populated with the metrics ``Report.from_snapshot`` reads.
+
+    Only the metrics consumed by ``Report.from_snapshot`` are registered:
+    the tracked counters (issued/completed/failed/duration) and the four
+    series surfaced on the report (ttft_ns, sample_latency_ns, osl,
+    tpot_ns). ISL/chunk_delta_ns are intentionally not registered to
+    keep the test data minimal — ``Report.from_snapshot`` ignores them.
+    """
+    registry = MetricsRegistry()
+    for key in MetricCounterKey.__members__.values():
+        registry.register_counter(key.value)
+    registry.register_series(
+        MetricSeriesKey.SAMPLE_LATENCY_NS.value,
+        hdr_low=1,
+        hdr_high=_NS_HIGH,
+        sig_figs=3,
+        n_histogram_buckets=10,
+        percentiles=(50.0, 90.0, 99.0),
+    )
+    registry.register_series(
+        MetricSeriesKey.TTFT_NS.value,
+        hdr_low=1,
+        hdr_high=_NS_HIGH,
+        sig_figs=3,
+        n_histogram_buckets=10,
+        percentiles=(50.0, 90.0, 99.0),
+    )
+    registry.register_series(
+        MetricSeriesKey.OSL.value,
+        hdr_low=1,
+        hdr_high=10_000_000,
+        sig_figs=3,
+        n_histogram_buckets=10,
+        percentiles=(50.0, 90.0, 99.0),
+    )
+    registry.register_series(
+        MetricSeriesKey.TPOT_NS.value,
+        hdr_low=1,
+        hdr_high=_NS_HIGH,
+        sig_figs=3,
+        n_histogram_buckets=10,
+        percentiles=(50.0, 90.0, 99.0),
+        dtype=float,
+    )
-def _make_store(tmp_path: Path, n_samples: int = 50):
-    """Create a writer with typical benchmark data and return (writer, reader)."""
-    store_dir = tmp_path / "kv"
-    w = BasicKVStore(store_dir)
-
-    # Counter keys matching MetricCounterKey enum
-    for key in [
-        "total_samples_issued",
-        "total_samples_completed",
-        "total_samples_failed",
-        "tracked_samples_issued",
-        "tracked_samples_completed",
-        "tracked_duration_ns",
-        "total_duration_ns",
-    ]:
-        w.create_key(key, "counter")
-    for key in ["ttft_ns", "sample_latency_ns", "osl", "isl", "chunk_delta_ns"]:
-        w.create_key(key, "series")
-    w.create_key("tpot_ns", "series", dtype=float)
-
-    w.update("tracked_samples_issued", n_samples)
-    w.update("tracked_samples_completed", n_samples)
-    w.update("total_samples_failed", 0)
     if n_samples > 0:
-        w.update("tracked_duration_ns", 10_000_000_000)
-
-    for i in range(n_samples):
-        w.update("ttft_ns", 1_000_000 + i * 10_000)
-        w.update("sample_latency_ns", 5_000_000 + i * 50_000)
-        w.update("osl", 100 + i)
-
-    r = BasicKVStoreReader(store_dir)
-    for key in [
-        "total_samples_issued",
-        "total_samples_completed",
-        "total_samples_failed",
-        "tracked_samples_issued",
-        "tracked_samples_completed",
-        "tracked_duration_ns",
-        "total_duration_ns",
-    ]:
-        r.register_key(key, "counter")
-    for key in ["ttft_ns", "sample_latency_ns", "osl", "isl", "chunk_delta_ns"]:
-        r.register_key(key, "series")
-    r.register_key("tpot_ns", "series", dtype=float)
-
-    return w, r
+        registry.increment(MetricCounterKey.TRACKED_SAMPLES_ISSUED.value, n_samples)
+        registry.increment(MetricCounterKey.TRACKED_SAMPLES_COMPLETED.value, n_samples)
+        registry.set_counter(MetricCounterKey.TRACKED_DURATION_NS.value, 10_000_000_000)
+    for i in range(n_samples):
+        registry.record(MetricSeriesKey.TTFT_NS.value, 1_000_000 + i * 10_000)
+        registry.record(
+            MetricSeriesKey.SAMPLE_LATENCY_NS.value, 5_000_000 + i * 50_000
+        )
+        registry.record(MetricSeriesKey.OSL.value, 100 + i)
+
+    return registry
+
+
+def _build_report(
+    registry: MetricsRegistry,
+    *,
+    state: SessionState = SessionState.COMPLETE,
+    n_pending_tasks: int = 0,
+) -> Report:
+    """Build a Report from a snapshot dict (matches the consumer contract).
+
+    ``Report.from_snapshot`` consumes the dict form produced by
+    ``snapshot_to_dict``; that's also the shape persisted to
+    ``final_snapshot.json``. We deliberately route through the dict
+    here so the tests exercise the same path the production consumer
+    does (loaded JSON file → Report).
+    """
+    snap = registry.build_snapshot(state=state, n_pending_tasks=n_pending_tasks)
+    return Report.from_snapshot(snapshot_to_dict(snap))
 
 
 # ---------------------------------------------------------------------------
-# build_report
+# from_snapshot — happy paths
 # ---------------------------------------------------------------------------
 
 
 @pytest.mark.unit
-class TestBuildReport:
-    def test_empty_store(self, tmp_path: Path):
-        w, r = _make_store(tmp_path, n_samples=0)
-        report = Report.from_kv_reader(r)
+class TestFromSnapshot:
+    def test_empty_registry(self):
+        registry = _make_registry(n_samples=0)
+        report = _build_report(registry)
         assert report.n_samples_issued == 0
+        assert report.n_samples_completed == 0
+        assert report.n_samples_failed == 0
         assert report.duration_ns is None
         assert report.qps() is None
+        # Series with count==0 should produce empty dicts.
         assert report.ttft == {}
         assert report.latency == {}
+        assert report.output_sequence_lengths == {}
+        assert report.tpot == {}
 
-        r.close()
-        w.close()
-
-    def test_with_metrics(self, tmp_path: Path):
-        w, r = _make_store(tmp_path, n_samples=50)
-        report = Report.from_kv_reader(r)
+    def test_with_metrics(self):
+        registry = _make_registry(n_samples=50)
+        report = _build_report(registry)
 
         assert report.n_samples_issued == 50
         assert report.n_samples_completed == 50
@@ -157,23 +161,53 @@ def test_with_metrics(self, tmp_path: Path):
         assert "histogram" in report.ttft
         assert report.ttft["min"] > 0
         assert report.latency["min"] > 0
-        assert report.tpot == {}  # No TPOT values written
-        assert report.tps() is not None  # OSL data present
-
-        r.close()
-        w.close()
+        # No TPOT recordings in the registry → empty dict.
+        assert report.tpot == {}
+        # OSL data was written → tps() is computable.
+        assert report.tps() is not None
+
+    def test_failed_uses_tracked_counter(self):
+        """``n_samples_failed`` reads from ``tracked_samples_failed``, not
+        ``total_samples_failed``. The two diverge when an ERROR fires for
+        an untracked sample (warmup window) — only the tracked count
+        flows into the Report.
+        """
+        registry = _make_registry(n_samples=10)
+        registry.increment(MetricCounterKey.TOTAL_SAMPLES_FAILED.value, 3)
+        registry.increment(MetricCounterKey.TRACKED_SAMPLES_FAILED.value, 1)
+        report = _build_report(registry)
+        assert report.n_samples_failed == 1
+
+    def test_complete_flag_true_when_state_complete_and_no_pending(self):
+        registry = _make_registry(n_samples=5)
+        report = _build_report(registry, state=SessionState.COMPLETE, n_pending_tasks=0)
+        assert report.complete is True
+
+    def test_complete_flag_false_when_drain_timeout(self):
+        """COMPLETE state but n_pending_tasks > 0 → drain timed out, report
+        is partial.
+        """
+        registry = _make_registry(n_samples=5)
+        report = _build_report(registry, state=SessionState.COMPLETE, n_pending_tasks=2)
+        assert report.complete is False
+
+    def test_complete_flag_false_when_state_live(self):
+        """LIVE/DRAINING snapshots produce reports with ``complete=False``."""
+        registry = _make_registry(n_samples=5)
+        report = _build_report(registry, state=SessionState.LIVE, n_pending_tasks=0)
+        assert report.complete is False
 
 
 # ---------------------------------------------------------------------------
-# Report display and serialization
+# Display + JSON serialization
 # ---------------------------------------------------------------------------
 
 
 @pytest.mark.unit
-class TestReport:
-    def test_display_summary(self, tmp_path: Path):
-        w, r = _make_store(tmp_path, n_samples=10)
-        report = Report.from_kv_reader(r)
+class TestReportDisplayAndSerialize:
+    def test_display_summary(self):
+        registry = _make_registry(n_samples=10)
+        report = _build_report(registry)
 
         lines: list[str] = []
         report.display(fn=lines.append, summary_only=True)
@@ -183,12 +217,9 @@ def test_display_summary(self, tmp_path: Path):
         assert "QPS:" in output
         assert "End of Summary" in output
 
-        r.close()
-        w.close()
-
-    def test_display_full(self, tmp_path: Path):
-        w, r = _make_store(tmp_path, n_samples=10)
-        report = Report.from_kv_reader(r)
+    def test_display_full(self):
+        registry = _make_registry(n_samples=10)
+        report = _build_report(registry)
 
         lines: list[str] = []
         report.display(fn=lines.append, summary_only=False)
@@ -199,23 +230,17 @@ def test_display_full(self, tmp_path: Path):
         assert "Histogram" in output
         assert "Percentiles" in output
 
-        r.close()
-        w.close()
-
-    def test_to_json(self, tmp_path: Path):
-        w, r = _make_store(tmp_path, n_samples=5)
-        report = Report.from_kv_reader(r)
+    def test_to_json(self):
+        registry = _make_registry(n_samples=5)
+        report = _build_report(registry)
 
         data = json.loads(report.to_json())
        assert data["n_samples_completed"] == 5
         assert "ttft" in data
 
-        r.close()
-        w.close()
-
     def test_to_json_save(self, tmp_path: Path):
-        w, r = _make_store(tmp_path, n_samples=5)
-        report = Report.from_kv_reader(r)
+        registry = _make_registry(n_samples=5)
+        report = _build_report(registry)
 
         out_path = tmp_path / "report.json"
         report.to_json(save_to=out_path)
@@ -223,9 +248,6 @@ def test_to_json_save(self, tmp_path: Path):
         data = json.loads(out_path.read_bytes())
         assert data["n_samples_completed"] == 5
 
-        r.close()
-        w.close()
-
     def test_qps_none_without_duration(self):
         report = Report(
             version="test",
@@ -235,6 +257,8 @@ def test_qps_none_without_duration(self):
             n_samples_completed=100,
             n_samples_failed=0,
             duration_ns=None,
+            state="complete",
+            complete=True,
             ttft={},
             tpot={},
             latency={},
@@ -253,6 +277,8 @@ def test_display_no_started_at(self):
             n_samples_completed=0,
             n_samples_failed=0,
             duration_ns=None,
+            state="complete",
+            complete=True,
             ttft={},
             tpot={},
             latency={},
@@ -262,3 +288,214 @@ def test_display_no_started_at(self):
         report.display(fn=lines.append, summary_only=True)
         output = "\n".join(lines)
         assert "Test started at" not in output
+
+    def test_display_warns_when_incomplete(self):
+        """Reports with ``complete=False`` surface a WARNING in display()."""
+        report = Report(
+            version="test",
+            git_sha=None,
+            test_started_at=0,
+            n_samples_issued=10,
+            n_samples_completed=10,
+            n_samples_failed=0,
+            duration_ns=1_000_000_000,
+            state="complete",  # drain-timeout case: complete state, n_pending>0
+            complete=False,
+            ttft={},
+            tpot={},
+            latency={},
+            output_sequence_lengths={},
+        )
+        lines: list[str] = []
+        report.display(fn=lines.append, summary_only=True)
+        output = "\n".join(lines)
+        assert "WARNING" in output or "incomplete" in output.lower()
+
+    def test_display_warns_when_interrupted(self):
+        """Reports with ``state == "interrupted"`` surface a distinct WARNING."""
+        report = Report(
+            version="test",
+            git_sha=None,
+            test_started_at=0,
+            n_samples_issued=10,
+            n_samples_completed=5,
+            n_samples_failed=0,
+            duration_ns=1_000_000_000,
+            state="interrupted",
+            complete=False,
+            ttft={},
+            tpot={},
+            latency={},
+            output_sequence_lengths={},
+        )
+        lines: list[str] = []
+        report.display(fn=lines.append, summary_only=True)
+        output = "\n".join(lines)
+        assert "interrupted" in output.lower()
+        assert "SIGTERM" in output or "signal" in output.lower()
+
+
+# ---------------------------------------------------------------------------
+# Direct dict construction — Report.from_snapshot accepts arbitrary dicts
+# (matches the JSON-file → consumer path; defaults absorb partial input).
+# ---------------------------------------------------------------------------
+
+
+@pytest.mark.unit
+class TestFromSnapshotDict:
+    def test_minimal_dict_yields_empty_report(self):
+        """A snapshot dict with no metrics produces a Report whose counters
+        are 0 and whose series dicts are empty. ``duration_ns`` is None
+        because ``tracked_duration_ns`` is missing.
+        """
+        snap = {
+            "counter": 1,
+            "timestamp_ns": 0,
+            "state": "complete",
+            "n_pending_tasks": 0,
+            "metrics": [],
+        }
+        report = Report.from_snapshot(snap)
+        assert report.n_samples_issued == 0
+        assert report.n_samples_completed == 0
+        assert report.n_samples_failed == 0
+        assert report.duration_ns is None
+        assert report.state == "complete"
+        assert report.complete is True
+        assert report.ttft == {}
+
+    def test_empty_dict_defaults_to_interrupted_incomplete(self):
+        """A dict missing every key (e.g. corrupt file, truncated read)
+        produces a non-crashing Report tagged interrupted and incomplete.
+        Defaults: state→interrupted, counters→0, series→empty.
+        """
+        report = Report.from_snapshot({})
+        assert report.state == "interrupted"
+        assert report.complete is False
+        assert report.n_samples_issued == 0
+        assert report.ttft == {}
+
+    def test_interrupted_state_round_trips_to_report(self):
+        """An INTERRUPTED snapshot dict produces a Report flagged as such."""
+        snap = {
+            "counter": 1,
+            "timestamp_ns": 0,
+            "state": "interrupted",
+            "n_pending_tasks": 5,
+            "metrics": [
+                {"type": "counter", "name": "tracked_samples_issued", "value": 100},
+                {"type": "counter", "name": "tracked_samples_completed", "value": 80},
+            ],
+        }
+        report = Report.from_snapshot(snap)
+        assert report.state == "interrupted"
+        assert report.complete is False
+        # Partial counters still surface through.
+        assert report.n_samples_issued == 100
+        assert report.n_samples_completed == 80
+
+    def test_missing_metric_type_is_skipped_not_crashed(self):
+        """A malformed metric entry (no 'type' field) is skipped rather
+        than crashing the whole report build.
+        """
+        snap = {
+            "state": "complete",
+            "n_pending_tasks": 0,
+            "metrics": [
+                {"name": "orphan_no_type", "value": 99},  # missing 'type'
+                {"type": "counter", "name": "tracked_samples_issued", "value": 5},
+            ],
+        }
+        report = Report.from_snapshot(snap)
+        assert report.n_samples_issued == 5
+
+    def test_display_handles_scrubbed_nan_percentiles(self):
+        """``_scrub_nonfinite`` maps producer-side NaN/Inf to ``None`` so the
+        snapshot JSON stays strict. ``Report.display()`` is called from
+        ``finalize_benchmark`` outside the report-build try/except — a
+        ``None * scale_factor`` crash there would take down the whole run.
+
+        Asserts: display() does not raise and renders an N/A indicator
+        for the scrubbed values.
+        """
+        snap = {
+            "counter": 1,
+            "timestamp_ns": 0,
+            "state": "complete",
+            "n_pending_tasks": 0,
+            "metrics": [
+                {
+                    "type": "counter",
+                    "name": "tracked_samples_issued",
+                    "value": 5,
+                },
+                {
+                    "type": "counter",
+                    "name": "tracked_samples_completed",
+                    "value": 5,
+                },
+                {
+                    "type": "counter",
+                    "name": "tracked_duration_ns",
+                    "value": 1_000_000_000,
+                },
+                {
+                    "type": "series",
+                    "name": "ttft_ns",
+                    "count": 5,
+                    "total": 5_000_000,
+                    "min": 1_000_000,
+                    "max": 1_500_000,
+                    "sum_sq": 5_005_000_000_000,
+                    # All percentile values scrubbed from NaN → None.
+                    "percentiles": {"50.0": None, "90.0": None, "99.0": None},
+                    "histogram": [[[1_000_000.0, 1_500_000.0], 5]],
+                },
+            ],
+        }
+        report = Report.from_snapshot(snap)
+
+        lines: list[str] = []
+        # Must not raise: scrubbed None percentiles render as "N/A".
+        report.display(fn=lines.append, summary_only=False)
+        output = "\n".join(lines)
+        assert "TTFT" in output
+        # Scrubbed values surface as a sentinel rather than crashing.
+        assert "N/A" in output
+
+
+@pytest.mark.unit
+def test_scrub_nonfinite_round_trip_yields_none():
+    """End-to-end: a ``SeriesStat`` carrying non-finite percentile values
+    produces a snapshot dict whose percentile entries are ``None`` (not
+    NaN literals). Anchors the producer-side invariant the display-time
+    None-guard depends on.
+    """
+    series = SeriesStat(
+        name="ttft_ns",
+        count=1,
+        total=0.0,
+        min=0.0,
+        max=0.0,
+        sum_sq=0.0,
+        percentiles={
+            "50.0": float("nan"),
+            "90.0": float("inf"),
+            "99.0": float("-inf"),
+        },
+        histogram=[],
+    )
+    snap = MetricsSnapshot(
+        counter=1,
+        timestamp_ns=0,
+        state=SessionState.COMPLETE,
+        n_pending_tasks=0,
+        metrics=[series],
+    )
+    d = snapshot_to_dict(snap)
+    perc = d["metrics"][0]["percentiles"]
+    assert perc == {"50.0": None, "90.0": None, "99.0": None}
+    # And the result must be strict-JSON serializable.
+    json.dumps(d, allow_nan=False)
+    # Sanity: the original values were indeed non-finite.
+    assert not math.isfinite(float("nan"))
diff --git a/uv.lock b/uv.lock
index 86a62d33..44a01855 100644
--- a/uv.lock
+++ b/uv.lock
@@ -1,5 +1,5 @@
 version = 1
-revision = 3
+revision = 2
 requires-python = ">=3.12"
 resolution-markers = [
     "python_full_version >= '3.14' and platform_machine == 'x86_64' and sys_platform == 'linux'",
@@ -46,29 +46,61 @@
 wheels = [
     { url = "https://files.pythonhosted.org/packages/f5/1b/428a7c64687b3b2e9cd293186695affc0e1e54a445d0361743b231f11066/aiohttp-3.13.5-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:15c933ad7920b7d9a20de151efcd05a6e38302cbf0e10c9b2acb9a42210a2416", size = 499557, upload-time = "2026-03-31T21:57:38.236Z" },
     { url = "https://files.pythonhosted.org/packages/29/47/7be41556bfbb6917069d6a6634bb7dd5e163ba445b783a90d40f5ac7e3a7/aiohttp-3.13.5-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:ab2899f9fa2f9f741896ebb6fa07c4c883bfa5c7f2ddd8cf2aafa86fa981b2d2", size = 500258, upload-time = "2026-03-31T21:57:39.923Z" },
     { url = "https://files.pythonhosted.org/packages/67/84/c9ecc5828cb0b3695856c07c0a6817a99d51e2473400f705275a2b3d9239/aiohttp-3.13.5-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:a60eaa2d440cd4707696b52e40ed3e2b0f73f65be07fd0ef23b6b539c9c0b0b4", size = 1749199, upload-time = "2026-03-31T21:57:41.938Z" },
+    { url = "https://files.pythonhosted.org/packages/f0/d3/3c6d610e66b495657622edb6ae7c7fd31b2e9086b4ec50b47897ad6042a9/aiohttp-3.13.5-cp312-cp312-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:55b3bdd3292283295774ab585160c4004f4f2f203946997f49aac032c84649e9", size = 1721013, upload-time = "2026-03-31T21:57:43.904Z" },
+    { url = "https://files.pythonhosted.org/packages/49/a0/24409c12217456df0bae7babe3b014e460b0b38a8e60753d6cb339f6556d/aiohttp-3.13.5-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:c2b2355dc094e5f7d45a7bb262fe7207aa0460b37a0d87027dcf21b5d890e7d5", size = 1781501, upload-time = "2026-03-31T21:57:46.285Z" },
+    { url = "https://files.pythonhosted.org/packages/98/9d/b65ec649adc5bccc008b0957a9a9c691070aeac4e41cea18559fef49958b/aiohttp-3.13.5-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:b38765950832f7d728297689ad78f5f2cf79ff82487131c4d26fe6ceecdc5f8e", size = 1878981, upload-time = "2026-03-31T21:57:48.734Z" },
     { url = "https://files.pythonhosted.org/packages/57/d8/8d44036d7eb7b6a8ec4c5494ea0c8c8b94fbc0ed3991c1a7adf230df03bf/aiohttp-3.13.5-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:b18f31b80d5a33661e08c89e202edabf1986e9b49c42b4504371daeaa11b47c1", size = 1767934, upload-time = "2026-03-31T21:57:51.171Z" },
+    { url = "https://files.pythonhosted.org/packages/31/04/d3f8211f273356f158e3464e9e45484d3fb8c4ce5eb2f6fe9405c3273983/aiohttp-3.13.5-cp312-cp312-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:33add2463dde55c4f2d9635c6ab33ce154e5ecf322bd26d09af95c5f81cfa286", size = 1566671, upload-time = "2026-03-31T21:57:53.326Z" },
     { url = "https://files.pythonhosted.org/packages/41/db/073e4ebe00b78e2dfcacff734291651729a62953b48933d765dc513bf798/aiohttp-3.13.5-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:327cc432fdf1356fb4fbc6fe833ad4e9f6aacb71a8acaa5f1855e4b25910e4a9", size = 1705219, upload-time = "2026-03-31T21:57:55.385Z" },
+    { url = "https://files.pythonhosted.org/packages/48/45/7dfba71a2f9fd97b15c95c06819de7eb38113d2cdb6319669195a7d64270/aiohttp-3.13.5-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:7c35b0bf0b48a70b4cb4fc5d7bed9b932532728e124874355de1a0af8ec4bc88", size = 1743049, upload-time = "2026-03-31T21:57:57.341Z" },
+    { url = "https://files.pythonhosted.org/packages/18/71/901db0061e0f717d226386a7f471bb59b19566f2cae5f0d93874b017271f/aiohttp-3.13.5-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:df23d57718f24badef8656c49743e11a89fd6f5358fa8a7b96e728fda2abf7d3", size = 1749557, upload-time = "2026-03-31T21:57:59.626Z" },
+    { url = "https://files.pythonhosted.org/packages/08/d5/41eebd16066e59cd43728fe74bce953d7402f2b4ddfdfef2c0e9f17ca274/aiohttp-3.13.5-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:02e048037a6501a5ec1f6fc9736135aec6eb8a004ce48838cb951c515f32c80b", size = 1558931, upload-time = "2026-03-31T21:58:01.972Z" },
+    { url = "https://files.pythonhosted.org/packages/30/e6/4a799798bf05740e66c3a1161079bda7a3dd8e22ca392481d7a7f9af82a6/aiohttp-3.13.5-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:31cebae8b26f8a615d2b546fee45d5ffb76852ae6450e2a03f42c9102260d6fe", size = 1774125, upload-time = "2026-03-31T21:58:04.007Z" },
     { url = "https://files.pythonhosted.org/packages/84/63/7749337c90f92bc2cb18f9560d67aa6258c7060d1397d21529b8004fcf6f/aiohttp-3.13.5-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:888e78eb5ca55a615d285c3c09a7a91b42e9dd6fc699b166ebd5dee87c9ccf14", size = 1732427, upload-time = "2026-03-31T21:58:06.337Z" },
     { url = "https://files.pythonhosted.org/packages/78/e9/d76bf503005709e390122d34e15256b88f7008e246c4bdbe915cd4f1adce/aiohttp-3.13.5-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:a5029cc80718bbd545123cd8fe5d15025eccaaaace5d0eeec6bd556ad6163d61", size = 742930, upload-time = 
"2026-03-31T21:58:13.155Z" }, { url = "https://files.pythonhosted.org/packages/57/00/4b7b70223deaebd9bb85984d01a764b0d7bd6526fcdc73cca83bcbe7243e/aiohttp-3.13.5-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:4bb6bf5811620003614076bdc807ef3b5e38244f9d25ca5fe888eaccea2a9832", size = 496927, upload-time = "2026-03-31T21:58:15.073Z" }, { url = "https://files.pythonhosted.org/packages/9c/f5/0fb20fb49f8efdcdce6cd8127604ad2c503e754a8f139f5e02b01626523f/aiohttp-3.13.5-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:a84792f8631bf5a94e52d9cc881c0b824ab42717165a5579c760b830d9392ac9", size = 497141, upload-time = "2026-03-31T21:58:17.009Z" }, { url = "https://files.pythonhosted.org/packages/3b/86/b7c870053e36a94e8951b803cb5b909bfbc9b90ca941527f5fcafbf6b0fa/aiohttp-3.13.5-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:57653eac22c6a4c13eb22ecf4d673d64a12f266e72785ab1c8b8e5940d0e8090", size = 1732476, upload-time = "2026-03-31T21:58:18.925Z" }, + { url = "https://files.pythonhosted.org/packages/b5/e5/4e161f84f98d80c03a238671b4136e6530453d65262867d989bbe78244d0/aiohttp-3.13.5-cp313-cp313-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:e5e5f7debc7a57af53fdf5c5009f9391d9f4c12867049d509bf7bb164a6e295b", size = 1706507, upload-time = "2026-03-31T21:58:21.094Z" }, + { url = "https://files.pythonhosted.org/packages/d4/56/ea11a9f01518bd5a2a2fcee869d248c4b8a0cfa0bb13401574fa31adf4d4/aiohttp-3.13.5-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:c719f65bebcdf6716f10e9eff80d27567f7892d8988c06de12bbbd39307c6e3a", size = 1773465, upload-time = "2026-03-31T21:58:23.159Z" }, + { url = "https://files.pythonhosted.org/packages/eb/40/333ca27fb74b0383f17c90570c748f7582501507307350a79d9f9f3c6eb1/aiohttp-3.13.5-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = 
"sha256:d97f93fdae594d886c5a866636397e2bcab146fd7a132fd6bb9ce182224452f8", size = 1873523, upload-time = "2026-03-31T21:58:25.59Z" }, { url = "https://files.pythonhosted.org/packages/f0/d2/e2f77eef1acb7111405433c707dc735e63f67a56e176e72e9e7a2cd3f493/aiohttp-3.13.5-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:3df334e39d4c2f899a914f1dba283c1aadc311790733f705182998c6f7cae665", size = 1754113, upload-time = "2026-03-31T21:58:27.624Z" }, + { url = "https://files.pythonhosted.org/packages/fb/56/3f653d7f53c89669301ec9e42c95233e2a0c0a6dd051269e6e678db4fdb0/aiohttp-3.13.5-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:fe6970addfea9e5e081401bcbadf865d2b6da045472f58af08427e108d618540", size = 1562351, upload-time = "2026-03-31T21:58:29.918Z" }, { url = "https://files.pythonhosted.org/packages/ec/a6/9b3e91eb8ae791cce4ee736da02211c85c6f835f1bdfac0594a8a3b7018c/aiohttp-3.13.5-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:7becdf835feff2f4f335d7477f121af787e3504b48b449ff737afb35869ba7bb", size = 1693205, upload-time = "2026-03-31T21:58:32.214Z" }, + { url = "https://files.pythonhosted.org/packages/98/fc/bfb437a99a2fcebd6b6eaec609571954de2ed424f01c352f4b5504371dd3/aiohttp-3.13.5-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:676e5651705ad5d8a70aeb8eb6936c436d8ebbd56e63436cb7dd9bb36d2a9a46", size = 1730618, upload-time = "2026-03-31T21:58:34.728Z" }, + { url = "https://files.pythonhosted.org/packages/e4/b6/c8534862126191a034f68153194c389addc285a0f1347d85096d349bbc15/aiohttp-3.13.5-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:9b16c653d38eb1a611cc898c41e76859ca27f119d25b53c12875fd0474ae31a8", size = 1745185, upload-time = "2026-03-31T21:58:36.909Z" }, + { url = "https://files.pythonhosted.org/packages/0b/93/4ca8ee2ef5236e2707e0fd5fecb10ce214aee1ff4ab307af9c558bda3b37/aiohttp-3.13.5-cp313-cp313-musllinux_1_2_riscv64.whl", hash = 
"sha256:999802d5fa0389f58decd24b537c54aa63c01c3219ce17d1214cbda3c2b22d2d", size = 1557311, upload-time = "2026-03-31T21:58:39.38Z" }, + { url = "https://files.pythonhosted.org/packages/57/ae/76177b15f18c5f5d094f19901d284025db28eccc5ae374d1d254181d33f4/aiohttp-3.13.5-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:ec707059ee75732b1ba130ed5f9580fe10ff75180c812bc267ded039db5128c6", size = 1773147, upload-time = "2026-03-31T21:58:41.476Z" }, { url = "https://files.pythonhosted.org/packages/01/a4/62f05a0a98d88af59d93b7fcac564e5f18f513cb7471696ac286db970d6a/aiohttp-3.13.5-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:2d6d44a5b48132053c2f6cd5c8cb14bc67e99a63594e336b0f2af81e94d5530c", size = 1730356, upload-time = "2026-03-31T21:58:44.049Z" }, { url = "https://files.pythonhosted.org/packages/5d/ce/46572759afc859e867a5bc8ec3487315869013f59281ce61764f76d879de/aiohttp-3.13.5-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:eb4639f32fd4a9904ab8fb45bf3383ba71137f3d9d4ba25b3b3f3109977c5b8c", size = 745721, upload-time = "2026-03-31T21:58:50.229Z" }, { url = "https://files.pythonhosted.org/packages/13/fe/8a2efd7626dbe6049b2ef8ace18ffda8a4dfcbe1bcff3ac30c0c7575c20b/aiohttp-3.13.5-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:7e5dc4311bd5ac493886c63cbf76ab579dbe4641268e7c74e48e774c74b6f2be", size = 497663, upload-time = "2026-03-31T21:58:52.232Z" }, { url = "https://files.pythonhosted.org/packages/9b/91/cc8cc78a111826c54743d88651e1687008133c37e5ee615fee9b57990fac/aiohttp-3.13.5-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:756c3c304d394977519824449600adaf2be0ccee76d206ee339c5e76b70ded25", size = 499094, upload-time = "2026-03-31T21:58:54.566Z" }, { url = "https://files.pythonhosted.org/packages/0a/33/a8362cb15cf16a3af7e86ed11962d5cd7d59b449202dc576cdc731310bde/aiohttp-3.13.5-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:ecc26751323224cf8186efcf7fbcbc30f4e1d8c7970659daf25ad995e4032a56", size = 
1726701, upload-time = "2026-03-31T21:58:56.864Z" }, + { url = "https://files.pythonhosted.org/packages/45/0c/c091ac5c3a17114bd76cbf85d674650969ddf93387876cf67f754204bd77/aiohttp-3.13.5-cp314-cp314-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:10a75acfcf794edf9d8db50e5a7ec5fc818b2a8d3f591ce93bc7b1210df016d2", size = 1683360, upload-time = "2026-03-31T21:58:59.072Z" }, + { url = "https://files.pythonhosted.org/packages/23/73/bcee1c2b79bc275e964d1446c55c54441a461938e70267c86afaae6fba27/aiohttp-3.13.5-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:0f7a18f258d124cd678c5fe072fe4432a4d5232b0657fca7c1847f599233c83a", size = 1773023, upload-time = "2026-03-31T21:59:01.776Z" }, + { url = "https://files.pythonhosted.org/packages/c7/ef/720e639df03004fee2d869f771799d8c23046dec47d5b81e396c7cda583a/aiohttp-3.13.5-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:df6104c009713d3a89621096f3e3e88cc323fd269dbd7c20afe18535094320be", size = 1853795, upload-time = "2026-03-31T21:59:04.568Z" }, { url = "https://files.pythonhosted.org/packages/bd/c9/989f4034fb46841208de7aeeac2c6d8300745ab4f28c42f629ba77c2d916/aiohttp-3.13.5-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:241a94f7de7c0c3b616627aaad530fe2cb620084a8b144d3be7b6ecfe95bae3b", size = 1730405, upload-time = "2026-03-31T21:59:07.221Z" }, + { url = "https://files.pythonhosted.org/packages/ce/75/ee1fd286ca7dc599d824b5651dad7b3be7ff8d9a7e7b3fe9820d9180f7db/aiohttp-3.13.5-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:c974fb66180e58709b6fc402846f13791240d180b74de81d23913abe48e96d94", size = 1558082, upload-time = "2026-03-31T21:59:09.484Z" }, { url = "https://files.pythonhosted.org/packages/c3/20/1e9e6650dfc436340116b7aa89ff8cb2bbdf0abc11dfaceaad8f74273a10/aiohttp-3.13.5-cp314-cp314-musllinux_1_2_aarch64.whl", hash = 
"sha256:6e27ea05d184afac78aabbac667450c75e54e35f62238d44463131bd3f96753d", size = 1692346, upload-time = "2026-03-31T21:59:12.068Z" }, + { url = "https://files.pythonhosted.org/packages/d8/40/8ebc6658d48ea630ac7903912fe0dd4e262f0e16825aa4c833c56c9f1f56/aiohttp-3.13.5-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:a79a6d399cef33a11b6f004c67bb07741d91f2be01b8d712d52c75711b1e07c7", size = 1698891, upload-time = "2026-03-31T21:59:14.552Z" }, + { url = "https://files.pythonhosted.org/packages/d8/78/ea0ae5ec8ba7a5c10bdd6e318f1ba5e76fcde17db8275188772afc7917a4/aiohttp-3.13.5-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:c632ce9c0b534fbe25b52c974515ed674937c5b99f549a92127c85f771a78772", size = 1742113, upload-time = "2026-03-31T21:59:17.068Z" }, + { url = "https://files.pythonhosted.org/packages/8a/66/9d308ed71e3f2491be1acb8769d96c6f0c47d92099f3bc9119cada27b357/aiohttp-3.13.5-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:fceedde51fbd67ee2bcc8c0b33d0126cc8b51ef3bbde2f86662bd6d5a6f10ec5", size = 1553088, upload-time = "2026-03-31T21:59:19.541Z" }, + { url = "https://files.pythonhosted.org/packages/da/a6/6cc25ed8dfc6e00c90f5c6d126a98e2cf28957ad06fa1036bd34b6f24a2c/aiohttp-3.13.5-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:f92995dfec9420bb69ae629abf422e516923ba79ba4403bc750d94fb4a6c68c1", size = 1757976, upload-time = "2026-03-31T21:59:22.311Z" }, { url = "https://files.pythonhosted.org/packages/c1/2b/cce5b0ffe0de99c83e5e36d8f828e4161e415660a9f3e58339d07cce3006/aiohttp-3.13.5-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:20ae0ff08b1f2c8788d6fb85afcb798654ae6ba0b747575f8562de738078457b", size = 1712444, upload-time = "2026-03-31T21:59:24.635Z" }, { url = "https://files.pythonhosted.org/packages/79/11/c27d9332ee20d68dd164dc12a6ecdef2e2e35ecc97ed6cf0d2442844624b/aiohttp-3.13.5-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:1efb06900858bb618ff5cee184ae2de5828896c448403d51fb633f09e109be0a", size = 778758, upload-time = 
"2026-03-31T21:59:31.547Z" }, { url = "https://files.pythonhosted.org/packages/04/fb/377aead2e0a3ba5f09b7624f702a964bdf4f08b5b6728a9799830c80041e/aiohttp-3.13.5-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:fee86b7c4bd29bdaf0d53d14739b08a106fdda809ca5fe032a15f52fae5fe254", size = 512883, upload-time = "2026-03-31T21:59:34.098Z" }, { url = "https://files.pythonhosted.org/packages/bb/a6/aa109a33671f7a5d3bd78b46da9d852797c5e665bfda7d6b373f56bff2ec/aiohttp-3.13.5-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:20058e23909b9e65f9da62b396b77dfa95965cbe840f8def6e572538b1d32e36", size = 516668, upload-time = "2026-03-31T21:59:36.497Z" }, { url = "https://files.pythonhosted.org/packages/79/b3/ca078f9f2fa9563c36fb8ef89053ea2bb146d6f792c5104574d49d8acb63/aiohttp-3.13.5-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8cf20a8d6868cb15a73cab329ffc07291ba8c22b1b88176026106ae39aa6df0f", size = 1883461, upload-time = "2026-03-31T21:59:38.723Z" }, + { url = "https://files.pythonhosted.org/packages/b7/e3/a7ad633ca1ca497b852233a3cce6906a56c3225fb6d9217b5e5e60b7419d/aiohttp-3.13.5-cp314-cp314t-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:330f5da04c987f1d5bdb8ae189137c77139f36bd1cb23779ca1a354a4b027800", size = 1747661, upload-time = "2026-03-31T21:59:41.187Z" }, + { url = "https://files.pythonhosted.org/packages/33/b9/cd6fe579bed34a906d3d783fe60f2fa297ef55b27bb4538438ee49d4dc41/aiohttp-3.13.5-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:6f1cbf0c7926d315c3c26c2da41fd2b5d2fe01ac0e157b78caefc51a782196cf", size = 1863800, upload-time = "2026-03-31T21:59:43.84Z" }, + { url = "https://files.pythonhosted.org/packages/c0/3f/2c1e2f5144cefa889c8afd5cf431994c32f3b29da9961698ff4e3811b79a/aiohttp-3.13.5-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = 
"sha256:53fc049ed6390d05423ba33103ded7281fe897cf97878f369a527070bd95795b", size = 1958382, upload-time = "2026-03-31T21:59:46.187Z" }, { url = "https://files.pythonhosted.org/packages/66/1d/f31ec3f1013723b3babe3609e7f119c2c2fb6ef33da90061a705ef3e1bc8/aiohttp-3.13.5-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:898703aa2667e3c5ca4c54ca36cd73f58b7a38ef87a5606414799ebce4d3fd3a", size = 1803724, upload-time = "2026-03-31T21:59:48.656Z" }, + { url = "https://files.pythonhosted.org/packages/0e/b4/57712dfc6f1542f067daa81eb61da282fab3e6f1966fca25db06c4fc62d5/aiohttp-3.13.5-cp314-cp314t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:0494a01ca9584eea1e5fbd6d748e61ecff218c51b576ee1999c23db7066417d8", size = 1640027, upload-time = "2026-03-31T21:59:51.284Z" }, { url = "https://files.pythonhosted.org/packages/25/3c/734c878fb43ec083d8e31bf029daae1beafeae582d1b35da234739e82ee7/aiohttp-3.13.5-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:6cf81fe010b8c17b09495cbd15c1d35afbc8fb405c0c9cf4738e5ae3af1d65be", size = 1806644, upload-time = "2026-03-31T21:59:53.753Z" }, + { url = "https://files.pythonhosted.org/packages/20/a5/f671e5cbec1c21d044ff3078223f949748f3a7f86b14e34a365d74a5d21f/aiohttp-3.13.5-cp314-cp314t-musllinux_1_2_armv7l.whl", hash = "sha256:c564dd5f09ddc9d8f2c2d0a301cd30a79a2cc1b46dd1a73bef8f0038863d016b", size = 1791630, upload-time = "2026-03-31T21:59:56.239Z" }, + { url = "https://files.pythonhosted.org/packages/0b/63/fb8d0ad63a0b8a99be97deac8c04dacf0785721c158bdf23d679a87aa99e/aiohttp-3.13.5-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:2994be9f6e51046c4f864598fd9abeb4fba6e88f0b2152422c9666dcd4aea9c6", size = 1809403, upload-time = "2026-03-31T21:59:59.103Z" }, + { url = "https://files.pythonhosted.org/packages/59/0c/bfed7f30662fcf12206481c2aac57dedee43fe1c49275e85b3a1e1742294/aiohttp-3.13.5-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = 
"sha256:157826e2fa245d2ef46c83ea8a5faf77ca19355d278d425c29fda0beb3318037", size = 1634924, upload-time = "2026-03-31T22:00:02.116Z" }, + { url = "https://files.pythonhosted.org/packages/17/d6/fd518d668a09fd5a3319ae5e984d4d80b9a4b3df4e21c52f02251ef5a32e/aiohttp-3.13.5-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:a8aca50daa9493e9e13c0f566201a9006f080e7c50e5e90d0b06f53146a54500", size = 1836119, upload-time = "2026-03-31T22:00:04.756Z" }, { url = "https://files.pythonhosted.org/packages/78/b7/15fb7a9d52e112a25b621c67b69c167805cb1f2ab8f1708a5c490d1b52fe/aiohttp-3.13.5-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:3b13560160d07e047a93f23aaa30718606493036253d5430887514715b67c9d9", size = 1772072, upload-time = "2026-03-31T22:00:07.494Z" }, ] @@ -191,24 +223,32 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/ea/47/4f61023ea636104d4f16ab488e268b93008c3d0bb76893b1b31db1f96802/cffi-2.0.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:6d02d6655b0e54f54c4ef0b94eb6be0607b70853c45ce98bd278dc7de718be5d", size = 185271, upload-time = "2025-09-08T23:22:44.795Z" }, { url = "https://files.pythonhosted.org/packages/df/a2/781b623f57358e360d62cdd7a8c681f074a71d445418a776eef0aadb4ab4/cffi-2.0.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:8eca2a813c1cb7ad4fb74d368c2ffbbb4789d377ee5bb8df98373c2cc0dee76c", size = 181048, upload-time = "2025-09-08T23:22:45.938Z" }, { url = "https://files.pythonhosted.org/packages/d5/72/12b5f8d3865bf0f87cf1404d8c374e7487dcf097a1c91c436e72e6badd83/cffi-2.0.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:b21e08af67b8a103c71a250401c78d5e0893beff75e28c53c98f4de42f774062", size = 220097, upload-time = "2025-09-08T23:22:48.677Z" }, + { url = "https://files.pythonhosted.org/packages/c2/95/7a135d52a50dfa7c882ab0ac17e8dc11cec9d55d2c18dda414c051c5e69e/cffi-2.0.0-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = 
"sha256:1e3a615586f05fc4065a8b22b8152f0c1b00cdbc60596d187c2a74f9e3036e4e", size = 207983, upload-time = "2025-09-08T23:22:50.06Z" }, + { url = "https://files.pythonhosted.org/packages/3a/c8/15cb9ada8895957ea171c62dc78ff3e99159ee7adb13c0123c001a2546c1/cffi-2.0.0-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:81afed14892743bbe14dacb9e36d9e0e504cd204e0b165062c488942b9718037", size = 206519, upload-time = "2025-09-08T23:22:51.364Z" }, { url = "https://files.pythonhosted.org/packages/78/2d/7fa73dfa841b5ac06c7b8855cfc18622132e365f5b81d02230333ff26e9e/cffi-2.0.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:3e17ed538242334bf70832644a32a7aae3d83b57567f9fd60a26257e992b79ba", size = 219572, upload-time = "2025-09-08T23:22:52.902Z" }, { url = "https://files.pythonhosted.org/packages/07/e0/267e57e387b4ca276b90f0434ff88b2c2241ad72b16d31836adddfd6031b/cffi-2.0.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:3925dd22fa2b7699ed2617149842d2e6adde22b262fcbfada50e3d195e4b3a94", size = 222963, upload-time = "2025-09-08T23:22:54.518Z" }, { url = "https://files.pythonhosted.org/packages/b6/75/1f2747525e06f53efbd878f4d03bac5b859cbc11c633d0fb81432d98a795/cffi-2.0.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:2c8f814d84194c9ea681642fd164267891702542f028a15fc97d4674b6206187", size = 221361, upload-time = "2025-09-08T23:22:55.867Z" }, { url = "https://files.pythonhosted.org/packages/4b/8d/a0a47a0c9e413a658623d014e91e74a50cdd2c423f7ccfd44086ef767f90/cffi-2.0.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:00bdf7acc5f795150faa6957054fbbca2439db2f775ce831222b66f192f03beb", size = 185230, upload-time = "2025-09-08T23:23:00.879Z" }, { url = "https://files.pythonhosted.org/packages/4a/d2/a6c0296814556c68ee32009d9c2ad4f85f2707cdecfd7727951ec228005d/cffi-2.0.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:45d5e886156860dc35862657e1494b9bae8dfa63bf56796f2fb56e1679fc0bca", size = 181043, upload-time = 
"2025-09-08T23:23:02.231Z" }, { url = "https://files.pythonhosted.org/packages/a9/f5/a2c23eb03b61a0b8747f211eb716446c826ad66818ddc7810cc2cc19b3f2/cffi-2.0.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:d48a880098c96020b02d5a1f7d9251308510ce8858940e6fa99ece33f610838b", size = 220101, upload-time = "2025-09-08T23:23:04.792Z" }, + { url = "https://files.pythonhosted.org/packages/f2/7f/e6647792fc5850d634695bc0e6ab4111ae88e89981d35ac269956605feba/cffi-2.0.0-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:f93fd8e5c8c0a4aa1f424d6173f14a892044054871c771f8566e4008eaa359d2", size = 207948, upload-time = "2025-09-08T23:23:06.127Z" }, + { url = "https://files.pythonhosted.org/packages/cb/1e/a5a1bd6f1fb30f22573f76533de12a00bf274abcdc55c8edab639078abb6/cffi-2.0.0-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:dd4f05f54a52fb558f1ba9f528228066954fee3ebe629fc1660d874d040ae5a3", size = 206422, upload-time = "2025-09-08T23:23:07.753Z" }, { url = "https://files.pythonhosted.org/packages/98/df/0a1755e750013a2081e863e7cd37e0cdd02664372c754e5560099eb7aa44/cffi-2.0.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:c8d3b5532fc71b7a77c09192b4a5a200ea992702734a2e9279a37f2478236f26", size = 219499, upload-time = "2025-09-08T23:23:09.648Z" }, { url = "https://files.pythonhosted.org/packages/50/e1/a969e687fcf9ea58e6e2a928ad5e2dd88cc12f6f0ab477e9971f2309b57c/cffi-2.0.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:d9b29c1f0ae438d5ee9acb31cadee00a58c46cc9c0b2f9038c6b0b3470877a8c", size = 222928, upload-time = "2025-09-08T23:23:10.928Z" }, { url = "https://files.pythonhosted.org/packages/36/54/0362578dd2c9e557a28ac77698ed67323ed5b9775ca9d3fe73fe191bb5d8/cffi-2.0.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:6d50360be4546678fc1b79ffe7a66265e28667840010348dd69a314145807a1b", size = 221302, upload-time = "2025-09-08T23:23:12.42Z" }, { url = 
"https://files.pythonhosted.org/packages/92/c4/3ce07396253a83250ee98564f8d7e9789fab8e58858f35d07a9a2c78de9f/cffi-2.0.0-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:fc33c5141b55ed366cfaad382df24fe7dcbc686de5be719b207bb248e3053dc5", size = 185320, upload-time = "2025-09-08T23:23:18.087Z" }, { url = "https://files.pythonhosted.org/packages/59/dd/27e9fa567a23931c838c6b02d0764611c62290062a6d4e8ff7863daf9730/cffi-2.0.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:c654de545946e0db659b3400168c9ad31b5d29593291482c43e3564effbcee13", size = 181487, upload-time = "2025-09-08T23:23:19.622Z" }, { url = "https://files.pythonhosted.org/packages/d6/43/0e822876f87ea8a4ef95442c3d766a06a51fc5298823f884ef87aaad168c/cffi-2.0.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:24b6f81f1983e6df8db3adc38562c83f7d4a0c36162885ec7f7b77c7dcbec97b", size = 220049, upload-time = "2025-09-08T23:23:20.853Z" }, + { url = "https://files.pythonhosted.org/packages/b4/89/76799151d9c2d2d1ead63c2429da9ea9d7aac304603de0c6e8764e6e8e70/cffi-2.0.0-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:12873ca6cb9b0f0d3a0da705d6086fe911591737a59f28b7936bdfed27c0d47c", size = 207793, upload-time = "2025-09-08T23:23:22.08Z" }, + { url = "https://files.pythonhosted.org/packages/bb/dd/3465b14bb9e24ee24cb88c9e3730f6de63111fffe513492bf8c808a3547e/cffi-2.0.0-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:d9b97165e8aed9272a6bb17c01e3cc5871a594a446ebedc996e2397a1c1ea8ef", size = 206300, upload-time = "2025-09-08T23:23:23.314Z" }, { url = "https://files.pythonhosted.org/packages/47/d9/d83e293854571c877a92da46fdec39158f8d7e68da75bf73581225d28e90/cffi-2.0.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:afb8db5439b81cf9c9d0c80404b60c3cc9c3add93e114dcae767f1477cb53775", size = 219244, upload-time = "2025-09-08T23:23:24.541Z" }, { url = 
"https://files.pythonhosted.org/packages/2b/0f/1f177e3683aead2bb00f7679a16451d302c436b5cbf2505f0ea8146ef59e/cffi-2.0.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:737fe7d37e1a1bffe70bd5754ea763a62a066dc5913ca57e957824b72a85e205", size = 222828, upload-time = "2025-09-08T23:23:26.143Z" }, { url = "https://files.pythonhosted.org/packages/c6/0f/cafacebd4b040e3119dcb32fed8bdef8dfe94da653155f9d0b9dc660166e/cffi-2.0.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:38100abb9d1b1435bc4cc340bb4489635dc2f0da7456590877030c9b3d40b0c1", size = 220926, upload-time = "2025-09-08T23:23:27.873Z" }, { url = "https://files.pythonhosted.org/packages/3e/61/c768e4d548bfa607abcda77423448df8c471f25dbe64fb2ef6d555eae006/cffi-2.0.0-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:9a67fc9e8eb39039280526379fb3a70023d77caec1852002b4da7e8b270c4dd9", size = 188773, upload-time = "2025-09-08T23:23:29.347Z" }, { url = "https://files.pythonhosted.org/packages/2c/ea/5f76bce7cf6fcd0ab1a1058b5af899bfbef198bea4d5686da88471ea0336/cffi-2.0.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:7a66c7204d8869299919db4d5069a82f1561581af12b11b3c9f48c584eb8743d", size = 185013, upload-time = "2025-09-08T23:23:30.63Z" }, { url = "https://files.pythonhosted.org/packages/be/b4/c56878d0d1755cf9caa54ba71e5d049479c52f9e4afc230f06822162ab2f/cffi-2.0.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:7cc09976e8b56f8cebd752f7113ad07752461f48a58cbba644139015ac24954c", size = 221593, upload-time = "2025-09-08T23:23:31.91Z" }, + { url = "https://files.pythonhosted.org/packages/e0/0d/eb704606dfe8033e7128df5e90fee946bbcb64a04fcdaa97321309004000/cffi-2.0.0-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.whl", hash = "sha256:92b68146a71df78564e4ef48af17551a5ddd142e5190cdf2c5624d0c3ff5b2e8", size = 209354, upload-time = "2025-09-08T23:23:33.214Z" }, + { url = 
"https://files.pythonhosted.org/packages/d8/19/3c435d727b368ca475fb8742ab97c9cb13a0de600ce86f62eab7fa3eea60/cffi-2.0.0-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.whl", hash = "sha256:b1e74d11748e7e98e2f426ab176d4ed720a64412b6a15054378afdb71e0f37dc", size = 208480, upload-time = "2025-09-08T23:23:34.495Z" }, { url = "https://files.pythonhosted.org/packages/d0/44/681604464ed9541673e486521497406fadcc15b5217c3e326b061696899a/cffi-2.0.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:28a3a209b96630bca57cce802da70c266eb08c6e97e5afd61a75611ee6c64592", size = 221584, upload-time = "2025-09-08T23:23:36.096Z" }, { url = "https://files.pythonhosted.org/packages/25/8e/342a504ff018a2825d395d44d63a767dd8ebc927ebda557fecdaca3ac33a/cffi-2.0.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:7553fb2090d71822f02c629afe6042c299edf91ba1bf94951165613553984512", size = 224443, upload-time = "2025-09-08T23:23:37.328Z" }, { url = "https://files.pythonhosted.org/packages/e1/5e/b666bacbbc60fbf415ba9988324a132c9a7a0448a9a8f125074671c0f2c3/cffi-2.0.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:6c6c373cfc5c83a975506110d17457138c8c63016b563cc9ed6e056a82f13ce4", size = 223437, upload-time = "2025-09-08T23:23:38.945Z" }, @@ -231,23 +271,55 @@ sdist = { url = "https://files.pythonhosted.org/packages/e7/a1/67fe25fac3c764272 wheels = [ { url = "https://files.pythonhosted.org/packages/0c/eb/4fc8d0a7110eb5fc9cc161723a34a8a6c200ce3b4fbf681bc86feee22308/charset_normalizer-3.4.7-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:eca9705049ad3c7345d574e3510665cb2cf844c2f2dcfe675332677f081cbd46", size = 311328, upload-time = "2026-04-02T09:26:24.331Z" }, { url = "https://files.pythonhosted.org/packages/f8/e3/0fadc706008ac9d7b9b5be6dc767c05f9d3e5df51744ce4cc9605de7b9f4/charset_normalizer-3.4.7-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = 
"sha256:6178f72c5508bfc5fd446a5905e698c6212932f25bcdd4b47a757a50605a90e2", size = 208061, upload-time = "2026-04-02T09:26:25.568Z" }, + { url = "https://files.pythonhosted.org/packages/42/f0/3dd1045c47f4a4604df85ec18ad093912ae1344ac706993aff91d38773a2/charset_normalizer-3.4.7-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:e1421b502d83040e6d7fb2fb18dff63957f720da3d77b2fbd3187ceb63755d7b", size = 229031, upload-time = "2026-04-02T09:26:26.865Z" }, + { url = "https://files.pythonhosted.org/packages/dc/67/675a46eb016118a2fbde5a277a5d15f4f69d5f3f5f338e5ee2f8948fcf43/charset_normalizer-3.4.7-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:edac0f1ab77644605be2cbba52e6b7f630731fc42b34cb0f634be1a6eface56a", size = 225239, upload-time = "2026-04-02T09:26:28.044Z" }, { url = "https://files.pythonhosted.org/packages/4b/f8/d0118a2f5f23b02cd166fa385c60f9b0d4f9194f574e2b31cef350ad7223/charset_normalizer-3.4.7-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:5649fd1c7bade02f320a462fdefd0b4bd3ce036065836d4f42e0de958038e116", size = 216589, upload-time = "2026-04-02T09:26:29.239Z" }, + { url = "https://files.pythonhosted.org/packages/b1/f1/6d2b0b261b6c4ceef0fcb0d17a01cc5bc53586c2d4796fa04b5c540bc13d/charset_normalizer-3.4.7-cp312-cp312-manylinux_2_31_armv7l.whl", hash = "sha256:203104ed3e428044fd943bc4bf45fa73c0730391f9621e37fe39ecf477b128cb", size = 202733, upload-time = "2026-04-02T09:26:30.5Z" }, + { url = "https://files.pythonhosted.org/packages/6f/c0/7b1f943f7e87cc3db9626ba17807d042c38645f0a1d4415c7a14afb5591f/charset_normalizer-3.4.7-cp312-cp312-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:298930cec56029e05497a76988377cbd7457ba864beeea92ad7e844fe74cd1f1", size = 212652, upload-time = "2026-04-02T09:26:31.709Z" }, { url = 
"https://files.pythonhosted.org/packages/38/dd/5a9ab159fe45c6e72079398f277b7d2b523e7f716acc489726115a910097/charset_normalizer-3.4.7-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:708838739abf24b2ceb208d0e22403dd018faeef86ddac04319a62ae884c4f15", size = 211229, upload-time = "2026-04-02T09:26:33.282Z" }, + { url = "https://files.pythonhosted.org/packages/d5/ff/531a1cad5ca855d1c1a8b69cb71abfd6d85c0291580146fda7c82857caa1/charset_normalizer-3.4.7-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:0f7eb884681e3938906ed0434f20c63046eacd0111c4ba96f27b76084cd679f5", size = 203552, upload-time = "2026-04-02T09:26:34.845Z" }, + { url = "https://files.pythonhosted.org/packages/c1/4c/a5fb52d528a8ca41f7598cb619409ece30a169fbdf9cdce592e53b46c3a6/charset_normalizer-3.4.7-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:4dc1e73c36828f982bfe79fadf5919923f8a6f4df2860804db9a98c48824ce8d", size = 230806, upload-time = "2026-04-02T09:26:36.152Z" }, + { url = "https://files.pythonhosted.org/packages/59/7a/071feed8124111a32b316b33ae4de83d36923039ef8cf48120266844285b/charset_normalizer-3.4.7-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:aed52fea0513bac0ccde438c188c8a471c4e0f457c2dd20cdbf6ea7a450046c7", size = 212316, upload-time = "2026-04-02T09:26:37.672Z" }, + { url = "https://files.pythonhosted.org/packages/fd/35/f7dba3994312d7ba508e041eaac39a36b120f32d4c8662b8814dab876431/charset_normalizer-3.4.7-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:fea24543955a6a729c45a73fe90e08c743f0b3334bbf3201e6c4bc1b0c7fa464", size = 227274, upload-time = "2026-04-02T09:26:38.93Z" }, { url = "https://files.pythonhosted.org/packages/8a/2d/a572df5c9204ab7688ec1edc895a73ebded3b023bb07364710b05dd1c9be/charset_normalizer-3.4.7-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:bb6d88045545b26da47aa879dd4a89a71d1dce0f0e549b1abcb31dfe4a8eac49", size = 218468, upload-time = "2026-04-02T09:26:40.17Z" }, { url = 
"https://files.pythonhosted.org/packages/c1/3b/66777e39d3ae1ddc77ee606be4ec6d8cbd4c801f65e5a1b6f2b11b8346dd/charset_normalizer-3.4.7-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:f496c9c3cc02230093d8330875c4c3cdfc3b73612a5fd921c65d39cbcef08063", size = 309627, upload-time = "2026-04-02T09:26:45.198Z" }, { url = "https://files.pythonhosted.org/packages/2e/4e/b7f84e617b4854ade48a1b7915c8ccfadeba444d2a18c291f696e37f0d3b/charset_normalizer-3.4.7-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:0ea948db76d31190bf08bd371623927ee1339d5f2a0b4b1b4a4439a65298703c", size = 207008, upload-time = "2026-04-02T09:26:46.824Z" }, + { url = "https://files.pythonhosted.org/packages/c4/bb/ec73c0257c9e11b268f018f068f5d00aa0ef8c8b09f7753ebd5f2880e248/charset_normalizer-3.4.7-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:a277ab8928b9f299723bc1a2dabb1265911b1a76341f90a510368ca44ad9ab66", size = 228303, upload-time = "2026-04-02T09:26:48.397Z" }, + { url = "https://files.pythonhosted.org/packages/85/fb/32d1f5033484494619f701e719429c69b766bfc4dbc61aa9e9c8c166528b/charset_normalizer-3.4.7-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:3bec022aec2c514d9cf199522a802bd007cd588ab17ab2525f20f9c34d067c18", size = 224282, upload-time = "2026-04-02T09:26:49.684Z" }, { url = "https://files.pythonhosted.org/packages/fa/07/330e3a0dda4c404d6da83b327270906e9654a24f6c546dc886a0eb0ffb23/charset_normalizer-3.4.7-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:e044c39e41b92c845bc815e5ae4230804e8e7bc29e399b0437d64222d92809dd", size = 215595, upload-time = "2026-04-02T09:26:50.915Z" }, + { url = "https://files.pythonhosted.org/packages/e3/7c/fc890655786e423f02556e0216d4b8c6bcb6bdfa890160dc66bf52dee468/charset_normalizer-3.4.7-cp313-cp313-manylinux_2_31_armv7l.whl", hash = 
"sha256:f495a1652cf3fbab2eb0639776dad966c2fb874d79d87ca07f9d5f059b8bd215", size = 201986, upload-time = "2026-04-02T09:26:52.197Z" }, + { url = "https://files.pythonhosted.org/packages/d8/97/bfb18b3db2aed3b90cf54dc292ad79fdd5ad65c4eae454099475cbeadd0d/charset_normalizer-3.4.7-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:e712b419df8ba5e42b226c510472b37bd57b38e897d3eca5e8cfd410a29fa859", size = 211711, upload-time = "2026-04-02T09:26:53.49Z" }, { url = "https://files.pythonhosted.org/packages/6f/a5/a581c13798546a7fd557c82614a5c65a13df2157e9ad6373166d2a3e645d/charset_normalizer-3.4.7-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:7804338df6fcc08105c7745f1502ba68d900f45fd770d5bdd5288ddccb8a42d8", size = 210036, upload-time = "2026-04-02T09:26:54.975Z" }, + { url = "https://files.pythonhosted.org/packages/8c/bf/b3ab5bcb478e4193d517644b0fb2bf5497fbceeaa7a1bc0f4d5b50953861/charset_normalizer-3.4.7-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:481551899c856c704d58119b5025793fa6730adda3571971af568f66d2424bb5", size = 202998, upload-time = "2026-04-02T09:26:56.303Z" }, + { url = "https://files.pythonhosted.org/packages/e7/4e/23efd79b65d314fa320ec6017b4b5834d5c12a58ba4610aa353af2e2f577/charset_normalizer-3.4.7-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:f59099f9b66f0d7145115e6f80dd8b1d847176df89b234a5a6b3f00437aa0832", size = 230056, upload-time = "2026-04-02T09:26:57.554Z" }, + { url = "https://files.pythonhosted.org/packages/b9/9f/1e1941bc3f0e01df116e68dc37a55c4d249df5e6fa77f008841aef68264f/charset_normalizer-3.4.7-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:f59ad4c0e8f6bba240a9bb85504faa1ab438237199d4cce5f622761507b8f6a6", size = 211537, upload-time = "2026-04-02T09:26:58.843Z" }, + { url = "https://files.pythonhosted.org/packages/80/0f/088cbb3020d44428964a6c97fe1edfb1b9550396bf6d278330281e8b709c/charset_normalizer-3.4.7-cp313-cp313-musllinux_1_2_s390x.whl", hash = 
"sha256:3dedcc22d73ec993f42055eff4fcfed9318d1eeb9a6606c55892a26964964e48", size = 226176, upload-time = "2026-04-02T09:27:00.437Z" }, { url = "https://files.pythonhosted.org/packages/6a/9f/130394f9bbe06f4f63e22641d32fc9b202b7e251c9aef4db044324dac493/charset_normalizer-3.4.7-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:64f02c6841d7d83f832cd97ccf8eb8a906d06eb95d5276069175c696b024b60a", size = 217723, upload-time = "2026-04-02T09:27:02.021Z" }, { url = "https://files.pythonhosted.org/packages/97/c8/c67cb8c70e19ef1960b97b22ed2a1567711de46c4ddf19799923adc836c2/charset_normalizer-3.4.7-cp314-cp314-macosx_10_15_universal2.whl", hash = "sha256:c36c333c39be2dbca264d7803333c896ab8fa7d4d6f0ab7edb7dfd7aea6e98c0", size = 309234, upload-time = "2026-04-02T09:27:07.194Z" }, { url = "https://files.pythonhosted.org/packages/99/85/c091fdee33f20de70d6c8b522743b6f831a2f1cd3ff86de4c6a827c48a76/charset_normalizer-3.4.7-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1c2aed2e5e41f24ea8ef1590b8e848a79b56f3a5564a65ceec43c9d692dc7d8a", size = 208042, upload-time = "2026-04-02T09:27:08.749Z" }, + { url = "https://files.pythonhosted.org/packages/87/1c/ab2ce611b984d2fd5d86a5a8a19c1ae26acac6bad967da4967562c75114d/charset_normalizer-3.4.7-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:54523e136b8948060c0fa0bc7b1b50c32c186f2fceee897a495406bb6e311d2b", size = 228706, upload-time = "2026-04-02T09:27:09.951Z" }, + { url = "https://files.pythonhosted.org/packages/a8/29/2b1d2cb00bf085f59d29eb773ce58ec2d325430f8c216804a0a5cd83cbca/charset_normalizer-3.4.7-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:715479b9a2802ecac752a3b0efa2b0b60285cf962ee38414211abdfccc233b41", size = 224727, upload-time = "2026-04-02T09:27:11.175Z" }, { url = 
"https://files.pythonhosted.org/packages/47/5c/032c2d5a07fe4d4855fea851209cca2b6f03ebeb6d4e3afdb3358386a684/charset_normalizer-3.4.7-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:bd6c2a1c7573c64738d716488d2cdd3c00e340e4835707d8fdb8dc1a66ef164e", size = 215882, upload-time = "2026-04-02T09:27:12.446Z" }, + { url = "https://files.pythonhosted.org/packages/2c/c2/356065d5a8b78ed04499cae5f339f091946a6a74f91e03476c33f0ab7100/charset_normalizer-3.4.7-cp314-cp314-manylinux_2_31_armv7l.whl", hash = "sha256:c45e9440fb78f8ddabcf714b68f936737a121355bf59f3907f4e17721b9d1aae", size = 200860, upload-time = "2026-04-02T09:27:13.721Z" }, + { url = "https://files.pythonhosted.org/packages/0c/cd/a32a84217ced5039f53b29f460962abb2d4420def55afabe45b1c3c7483d/charset_normalizer-3.4.7-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:3534e7dcbdcf757da6b85a0bbf5b6868786d5982dd959b065e65481644817a18", size = 211564, upload-time = "2026-04-02T09:27:15.272Z" }, { url = "https://files.pythonhosted.org/packages/44/86/58e6f13ce26cc3b8f4a36b94a0f22ae2f00a72534520f4ae6857c4b81f89/charset_normalizer-3.4.7-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:e8ac484bf18ce6975760921bb6148041faa8fef0547200386ea0b52b5d27bf7b", size = 211276, upload-time = "2026-04-02T09:27:16.834Z" }, + { url = "https://files.pythonhosted.org/packages/8f/fe/d17c32dc72e17e155e06883efa84514ca375f8a528ba2546bee73fc4df81/charset_normalizer-3.4.7-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:a5fe03b42827c13cdccd08e6c0247b6a6d4b5e3cdc53fd1749f5896adcdc2356", size = 201238, upload-time = "2026-04-02T09:27:18.229Z" }, + { url = "https://files.pythonhosted.org/packages/6a/29/f33daa50b06525a237451cdb6c69da366c381a3dadcd833fa5676bc468b3/charset_normalizer-3.4.7-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:2d6eb928e13016cea4f1f21d1e10c1cebd5a421bc57ddf5b1142ae3f86824fab", size = 230189, upload-time = "2026-04-02T09:27:19.445Z" }, + 
{ url = "https://files.pythonhosted.org/packages/b6/6e/52c84015394a6a0bdcd435210a7e944c5f94ea1055f5cc5d56c5fe368e7b/charset_normalizer-3.4.7-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:e74327fb75de8986940def6e8dee4f127cc9752bee7355bb323cc5b2659b6d46", size = 211352, upload-time = "2026-04-02T09:27:20.79Z" }, + { url = "https://files.pythonhosted.org/packages/8c/d7/4353be581b373033fb9198bf1da3cf8f09c1082561e8e922aa7b39bf9fe8/charset_normalizer-3.4.7-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:d6038d37043bced98a66e68d3aa2b6a35505dc01328cd65217cefe82f25def44", size = 227024, upload-time = "2026-04-02T09:27:22.063Z" }, { url = "https://files.pythonhosted.org/packages/30/45/99d18aa925bd1740098ccd3060e238e21115fffbfdcb8f3ece837d0ace6c/charset_normalizer-3.4.7-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:7579e913a5339fb8fa133f6bbcfd8e6749696206cf05acdbdca71a1b436d8e72", size = 217869, upload-time = "2026-04-02T09:27:23.486Z" }, { url = "https://files.pythonhosted.org/packages/94/09/7e8a7f73d24dba1f0035fbbf014d2c36828fc1bf9c88f84093e57d315935/charset_normalizer-3.4.7-cp314-cp314t-macosx_10_15_universal2.whl", hash = "sha256:effc3f449787117233702311a1b7d8f59cba9ced946ba727bdc329ec69028e24", size = 330133, upload-time = "2026-04-02T09:27:29.474Z" }, { url = "https://files.pythonhosted.org/packages/8d/da/96975ddb11f8e977f706f45cddd8540fd8242f71ecdb5d18a80723dcf62c/charset_normalizer-3.4.7-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:fbccdc05410c9ee21bbf16a35f4c1d16123dcdeb8a1d38f33654fa21d0234f79", size = 216257, upload-time = "2026-04-02T09:27:30.793Z" }, + { url = "https://files.pythonhosted.org/packages/e5/e8/1d63bf8ef2d388e95c64b2098f45f84758f6d102a087552da1485912637b/charset_normalizer-3.4.7-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:733784b6d6def852c814bce5f318d25da2ee65dd4839a0718641c696e09a2960", size = 234851, upload-time = 
"2026-04-02T09:27:32.44Z" }, + { url = "https://files.pythonhosted.org/packages/9b/40/e5ff04233e70da2681fa43969ad6f66ca5611d7e669be0246c4c7aaf6dc8/charset_normalizer-3.4.7-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:a89c23ef8d2c6b27fd200a42aa4ac72786e7c60d40efdc76e6011260b6e949c4", size = 233393, upload-time = "2026-04-02T09:27:34.03Z" }, { url = "https://files.pythonhosted.org/packages/be/c1/06c6c49d5a5450f76899992f1ee40b41d076aee9279b49cf9974d2f313d5/charset_normalizer-3.4.7-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:6c114670c45346afedc0d947faf3c7f701051d2518b943679c8ff88befe14f8e", size = 223251, upload-time = "2026-04-02T09:27:35.369Z" }, + { url = "https://files.pythonhosted.org/packages/2b/9f/f2ff16fb050946169e3e1f82134d107e5d4ae72647ec8a1b1446c148480f/charset_normalizer-3.4.7-cp314-cp314t-manylinux_2_31_armv7l.whl", hash = "sha256:a180c5e59792af262bf263b21a3c49353f25945d8d9f70628e73de370d55e1e1", size = 206609, upload-time = "2026-04-02T09:27:36.661Z" }, + { url = "https://files.pythonhosted.org/packages/69/d5/a527c0cd8d64d2eab7459784fb4169a0ac76e5a6fc5237337982fd61347e/charset_normalizer-3.4.7-cp314-cp314t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:3c9a494bc5ec77d43cea229c4f6db1e4d8fe7e1bbffa8b6f0f0032430ff8ab44", size = 220014, upload-time = "2026-04-02T09:27:38.019Z" }, { url = "https://files.pythonhosted.org/packages/7e/80/8a7b8104a3e203074dc9aa2c613d4b726c0e136bad1cc734594b02867972/charset_normalizer-3.4.7-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:8d828b6667a32a728a1ad1d93957cdf37489c57b97ae6c4de2860fa749b8fc1e", size = 218979, upload-time = "2026-04-02T09:27:39.37Z" }, + { url = "https://files.pythonhosted.org/packages/02/9a/b759b503d507f375b2b5c153e4d2ee0a75aa215b7f2489cf314f4541f2c0/charset_normalizer-3.4.7-cp314-cp314t-musllinux_1_2_armv7l.whl", hash = 
"sha256:cf1493cd8607bec4d8a7b9b004e699fcf8f9103a9284cc94962cb73d20f9d4a3", size = 209238, upload-time = "2026-04-02T09:27:40.722Z" }, + { url = "https://files.pythonhosted.org/packages/c2/4e/0f3f5d47b86bdb79256e7290b26ac847a2832d9a4033f7eb2cd4bcf4bb5b/charset_normalizer-3.4.7-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:0c96c3b819b5c3e9e165495db84d41914d6894d55181d2d108cc1a69bfc9cce0", size = 236110, upload-time = "2026-04-02T09:27:42.33Z" }, + { url = "https://files.pythonhosted.org/packages/96/23/bce28734eb3ed2c91dcf93abeb8a5cf393a7b2749725030bb630e554fdd8/charset_normalizer-3.4.7-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = "sha256:752a45dc4a6934060b3b0dab47e04edc3326575f82be64bc4fc293914566503e", size = 219824, upload-time = "2026-04-02T09:27:43.924Z" }, + { url = "https://files.pythonhosted.org/packages/2c/6f/6e897c6984cc4d41af319b077f2f600fc8214eb2fe2d6bcb79141b882400/charset_normalizer-3.4.7-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:8778f0c7a52e56f75d12dae53ae320fae900a8b9b4164b981b9c5ce059cd1fcb", size = 233103, upload-time = "2026-04-02T09:27:45.348Z" }, { url = "https://files.pythonhosted.org/packages/76/22/ef7bd0fe480a0ae9b656189ec00744b60933f68b4f42a7bb06589f6f576a/charset_normalizer-3.4.7-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:ce3412fbe1e31eb81ea42f4169ed94861c56e643189e1e75f0041f3fe7020abe", size = 225194, upload-time = "2026-04-02T09:27:46.706Z" }, { url = "https://files.pythonhosted.org/packages/db/8f/61959034484a4a7c527811f4721e75d02d653a35afb0b6054474d8185d4c/charset_normalizer-3.4.7-py3-none-any.whl", hash = "sha256:3dce51d0f5e7951f8bb4900c257dad282f49190fdbebecd4ba99bcc41fef404d", size = 61958, upload-time = "2026-04-02T09:28:37.794Z" }, ] @@ -282,30 +354,40 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/be/45/adfee365d9ea3d853550b2e735f9d66366701c65db7855cd07621732ccfc/contourpy-1.3.3-cp312-cp312-macosx_10_13_x86_64.whl", hash = 
"sha256:b08a32ea2f8e42cf1d4be3169a98dd4be32bafe4f22b6c4cb4ba810fa9e5d2cb", size = 293419, upload-time = "2025-07-26T12:01:21.16Z" }, { url = "https://files.pythonhosted.org/packages/53/3e/405b59cfa13021a56bba395a6b3aca8cec012b45bf177b0eaf7a202cde2c/contourpy-1.3.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:556dba8fb6f5d8742f2923fe9457dbdd51e1049c4a43fd3986a0b14a1d815fc6", size = 273979, upload-time = "2025-07-26T12:01:22.448Z" }, { url = "https://files.pythonhosted.org/packages/d4/1c/a12359b9b2ca3a845e8f7f9ac08bdf776114eb931392fcad91743e2ea17b/contourpy-1.3.3-cp312-cp312-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:92d9abc807cf7d0e047b95ca5d957cf4792fcd04e920ca70d48add15c1a90ea7", size = 332653, upload-time = "2025-07-26T12:01:24.155Z" }, + { url = "https://files.pythonhosted.org/packages/63/12/897aeebfb475b7748ea67b61e045accdfcf0d971f8a588b67108ed7f5512/contourpy-1.3.3-cp312-cp312-manylinux_2_26_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:b2e8faa0ed68cb29af51edd8e24798bb661eac3bd9f65420c1887b6ca89987c8", size = 379536, upload-time = "2025-07-26T12:01:25.91Z" }, + { url = "https://files.pythonhosted.org/packages/43/8a/a8c584b82deb248930ce069e71576fc09bd7174bbd35183b7943fb1064fd/contourpy-1.3.3-cp312-cp312-manylinux_2_26_s390x.manylinux_2_28_s390x.whl", hash = "sha256:626d60935cf668e70a5ce6ff184fd713e9683fb458898e4249b63be9e28286ea", size = 384397, upload-time = "2025-07-26T12:01:27.152Z" }, { url = "https://files.pythonhosted.org/packages/cc/8f/ec6289987824b29529d0dfda0d74a07cec60e54b9c92f3c9da4c0ac732de/contourpy-1.3.3-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:4d00e655fcef08aba35ec9610536bfe90267d7ab5ba944f7032549c55a146da1", size = 362601, upload-time = "2025-07-26T12:01:28.808Z" }, { url = "https://files.pythonhosted.org/packages/05/0a/a3fe3be3ee2dceb3e615ebb4df97ae6f3828aa915d3e10549ce016302bd1/contourpy-1.3.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = 
"sha256:451e71b5a7d597379ef572de31eeb909a87246974d960049a9848c3bc6c41bf7", size = 1331288, upload-time = "2025-07-26T12:01:31.198Z" }, { url = "https://files.pythonhosted.org/packages/33/1d/acad9bd4e97f13f3e2b18a3977fe1b4a37ecf3d38d815333980c6c72e963/contourpy-1.3.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:459c1f020cd59fcfe6650180678a9993932d80d44ccde1fa1868977438f0b411", size = 1403386, upload-time = "2025-07-26T12:01:33.947Z" }, { url = "https://files.pythonhosted.org/packages/68/35/0167aad910bbdb9599272bd96d01a9ec6852f36b9455cf2ca67bd4cc2d23/contourpy-1.3.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:177fb367556747a686509d6fef71d221a4b198a3905fe824430e5ea0fda54eb5", size = 293257, upload-time = "2025-07-26T12:01:39.367Z" }, { url = "https://files.pythonhosted.org/packages/96/e4/7adcd9c8362745b2210728f209bfbcf7d91ba868a2c5f40d8b58f54c509b/contourpy-1.3.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:d002b6f00d73d69333dac9d0b8d5e84d9724ff9ef044fd63c5986e62b7c9e1b1", size = 274034, upload-time = "2025-07-26T12:01:40.645Z" }, { url = "https://files.pythonhosted.org/packages/73/23/90e31ceeed1de63058a02cb04b12f2de4b40e3bef5e082a7c18d9c8ae281/contourpy-1.3.3-cp313-cp313-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:348ac1f5d4f1d66d3322420f01d42e43122f43616e0f194fc1c9f5d830c5b286", size = 334672, upload-time = "2025-07-26T12:01:41.942Z" }, + { url = "https://files.pythonhosted.org/packages/ed/93/b43d8acbe67392e659e1d984700e79eb67e2acb2bd7f62012b583a7f1b55/contourpy-1.3.3-cp313-cp313-manylinux_2_26_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:655456777ff65c2c548b7c454af9c6f33f16c8884f11083244b5819cc214f1b5", size = 381234, upload-time = "2025-07-26T12:01:43.499Z" }, + { url = "https://files.pythonhosted.org/packages/46/3b/bec82a3ea06f66711520f75a40c8fc0b113b2a75edb36aa633eb11c4f50f/contourpy-1.3.3-cp313-cp313-manylinux_2_26_s390x.manylinux_2_28_s390x.whl", hash = 
"sha256:644a6853d15b2512d67881586bd03f462c7ab755db95f16f14d7e238f2852c67", size = 385169, upload-time = "2025-07-26T12:01:45.219Z" }, { url = "https://files.pythonhosted.org/packages/4b/32/e0f13a1c5b0f8572d0ec6ae2f6c677b7991fafd95da523159c19eff0696a/contourpy-1.3.3-cp313-cp313-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:4debd64f124ca62069f313a9cb86656ff087786016d76927ae2cf37846b006c9", size = 362859, upload-time = "2025-07-26T12:01:46.519Z" }, { url = "https://files.pythonhosted.org/packages/33/71/e2a7945b7de4e58af42d708a219f3b2f4cff7386e6b6ab0a0fa0033c49a9/contourpy-1.3.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:a15459b0f4615b00bbd1e91f1b9e19b7e63aea7483d03d804186f278c0af2659", size = 1332062, upload-time = "2025-07-26T12:01:48.964Z" }, { url = "https://files.pythonhosted.org/packages/12/fc/4e87ac754220ccc0e807284f88e943d6d43b43843614f0a8afa469801db0/contourpy-1.3.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:ca0fdcd73925568ca027e0b17ab07aad764be4706d0a925b89227e447d9737b7", size = 1403932, upload-time = "2025-07-26T12:01:51.979Z" }, { url = "https://files.pythonhosted.org/packages/c0/b3/f8a1a86bd3298513f500e5b1f5fd92b69896449f6cab6a146a5d52715479/contourpy-1.3.3-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:88df9880d507169449d434c293467418b9f6cbe82edd19284aa0409e7fdb933d", size = 306730, upload-time = "2025-07-26T12:01:57.051Z" }, { url = "https://files.pythonhosted.org/packages/3f/11/4780db94ae62fc0c2053909b65dc3246bd7cecfc4f8a20d957ad43aa4ad8/contourpy-1.3.3-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:d06bb1f751ba5d417047db62bca3c8fde202b8c11fb50742ab3ab962c81e8216", size = 287897, upload-time = "2025-07-26T12:01:58.663Z" }, { url = "https://files.pythonhosted.org/packages/ae/15/e59f5f3ffdd6f3d4daa3e47114c53daabcb18574a26c21f03dc9e4e42ff0/contourpy-1.3.3-cp313-cp313t-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:e4e6b05a45525357e382909a4c1600444e2a45b4795163d3b22669285591c1ae", 
size = 326751, upload-time = "2025-07-26T12:02:00.343Z" }, + { url = "https://files.pythonhosted.org/packages/0f/81/03b45cfad088e4770b1dcf72ea78d3802d04200009fb364d18a493857210/contourpy-1.3.3-cp313-cp313t-manylinux_2_26_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:ab3074b48c4e2cf1a960e6bbeb7f04566bf36b1861d5c9d4d8ac04b82e38ba20", size = 375486, upload-time = "2025-07-26T12:02:02.128Z" }, + { url = "https://files.pythonhosted.org/packages/0c/ba/49923366492ffbdd4486e970d421b289a670ae8cf539c1ea9a09822b371a/contourpy-1.3.3-cp313-cp313t-manylinux_2_26_s390x.manylinux_2_28_s390x.whl", hash = "sha256:6c3d53c796f8647d6deb1abe867daeb66dcc8a97e8455efa729516b997b8ed99", size = 388106, upload-time = "2025-07-26T12:02:03.615Z" }, { url = "https://files.pythonhosted.org/packages/9f/52/5b00ea89525f8f143651f9f03a0df371d3cbd2fccd21ca9b768c7a6500c2/contourpy-1.3.3-cp313-cp313t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:50ed930df7289ff2a8d7afeb9603f8289e5704755c7e5c3bbd929c90c817164b", size = 352548, upload-time = "2025-07-26T12:02:05.165Z" }, { url = "https://files.pythonhosted.org/packages/32/1d/a209ec1a3a3452d490f6b14dd92e72280c99ae3d1e73da74f8277d4ee08f/contourpy-1.3.3-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:4feffb6537d64b84877da813a5c30f1422ea5739566abf0bd18065ac040e120a", size = 1322297, upload-time = "2025-07-26T12:02:07.379Z" }, { url = "https://files.pythonhosted.org/packages/bc/9e/46f0e8ebdd884ca0e8877e46a3f4e633f6c9c8c4f3f6e72be3fe075994aa/contourpy-1.3.3-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:2b7e9480ffe2b0cd2e787e4df64270e3a0440d9db8dc823312e2c940c167df7e", size = 1391023, upload-time = "2025-07-26T12:02:10.171Z" }, { url = "https://files.pythonhosted.org/packages/72/8b/4546f3ab60f78c514ffb7d01a0bd743f90de36f0019d1be84d0a708a580a/contourpy-1.3.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:fde6c716d51c04b1c25d0b90364d0be954624a0ee9d60e23e850e8d48353d07a", size = 292189, upload-time = 
"2025-07-26T12:02:16.095Z" }, { url = "https://files.pythonhosted.org/packages/fd/e1/3542a9cb596cadd76fcef413f19c79216e002623158befe6daa03dbfa88c/contourpy-1.3.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:cbedb772ed74ff5be440fa8eee9bd49f64f6e3fc09436d9c7d8f1c287b121d77", size = 273251, upload-time = "2025-07-26T12:02:17.524Z" }, { url = "https://files.pythonhosted.org/packages/b1/71/f93e1e9471d189f79d0ce2497007731c1e6bf9ef6d1d61b911430c3db4e5/contourpy-1.3.3-cp314-cp314-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:22e9b1bd7a9b1d652cd77388465dc358dafcd2e217d35552424aa4f996f524f5", size = 335810, upload-time = "2025-07-26T12:02:18.9Z" }, + { url = "https://files.pythonhosted.org/packages/91/f9/e35f4c1c93f9275d4e38681a80506b5510e9327350c51f8d4a5a724d178c/contourpy-1.3.3-cp314-cp314-manylinux_2_26_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:a22738912262aa3e254e4f3cb079a95a67132fc5a063890e224393596902f5a4", size = 382871, upload-time = "2025-07-26T12:02:20.418Z" }, + { url = "https://files.pythonhosted.org/packages/b5/71/47b512f936f66a0a900d81c396a7e60d73419868fba959c61efed7a8ab46/contourpy-1.3.3-cp314-cp314-manylinux_2_26_s390x.manylinux_2_28_s390x.whl", hash = "sha256:afe5a512f31ee6bd7d0dda52ec9864c984ca3d66664444f2d72e0dc4eb832e36", size = 386264, upload-time = "2025-07-26T12:02:21.916Z" }, { url = "https://files.pythonhosted.org/packages/04/5f/9ff93450ba96b09c7c2b3f81c94de31c89f92292f1380261bd7195bea4ea/contourpy-1.3.3-cp314-cp314-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:f64836de09927cba6f79dcd00fdd7d5329f3fccc633468507079c829ca4db4e3", size = 363819, upload-time = "2025-07-26T12:02:23.759Z" }, { url = "https://files.pythonhosted.org/packages/3e/a6/0b185d4cc480ee494945cde102cb0149ae830b5fa17bf855b95f2e70ad13/contourpy-1.3.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:1fd43c3be4c8e5fd6e4f2baeae35ae18176cf2e5cced681cca908addf1cdd53b", size = 1333650, upload-time = "2025-07-26T12:02:26.181Z" }, { 
url = "https://files.pythonhosted.org/packages/43/d7/afdc95580ca56f30fbcd3060250f66cedbde69b4547028863abd8aa3b47e/contourpy-1.3.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:6afc576f7b33cf00996e5c1102dc2a8f7cc89e39c0b55df93a0b78c1bd992b36", size = 1404833, upload-time = "2025-07-26T12:02:28.782Z" }, { url = "https://files.pythonhosted.org/packages/1f/42/38c159a7d0f2b7b9c04c64ab317042bb6952b713ba875c1681529a2932fe/contourpy-1.3.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:33c82d0138c0a062380332c861387650c82e4cf1747aaa6938b9b6516762e772", size = 306769, upload-time = "2025-07-26T12:02:34.2Z" }, { url = "https://files.pythonhosted.org/packages/c3/6c/26a8205f24bca10974e77460de68d3d7c63e282e23782f1239f226fcae6f/contourpy-1.3.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:ea37e7b45949df430fe649e5de8351c423430046a2af20b1c1961cae3afcda77", size = 287892, upload-time = "2025-07-26T12:02:35.807Z" }, { url = "https://files.pythonhosted.org/packages/66/06/8a475c8ab718ebfd7925661747dbb3c3ee9c82ac834ccb3570be49d129f4/contourpy-1.3.3-cp314-cp314t-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:d304906ecc71672e9c89e87c4675dc5c2645e1f4269a5063b99b0bb29f232d13", size = 326748, upload-time = "2025-07-26T12:02:37.193Z" }, + { url = "https://files.pythonhosted.org/packages/b4/a3/c5ca9f010a44c223f098fccd8b158bb1cb287378a31ac141f04730dc49be/contourpy-1.3.3-cp314-cp314t-manylinux_2_26_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:ca658cd1a680a5c9ea96dc61cdbae1e85c8f25849843aa799dfd3cb370ad4fbe", size = 375554, upload-time = "2025-07-26T12:02:38.894Z" }, + { url = "https://files.pythonhosted.org/packages/80/5b/68bd33ae63fac658a4145088c1e894405e07584a316738710b636c6d0333/contourpy-1.3.3-cp314-cp314t-manylinux_2_26_s390x.manylinux_2_28_s390x.whl", hash = "sha256:ab2fd90904c503739a75b7c8c5c01160130ba67944a7b77bbf36ef8054576e7f", size = 388118, upload-time = "2025-07-26T12:02:40.642Z" }, { url = 
"https://files.pythonhosted.org/packages/40/52/4c285a6435940ae25d7410a6c36bda5145839bc3f0beb20c707cda18b9d2/contourpy-1.3.3-cp314-cp314t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:b7301b89040075c30e5768810bc96a8e8d78085b47d8be6e4c3f5a0b4ed478a0", size = 352555, upload-time = "2025-07-26T12:02:42.25Z" }, { url = "https://files.pythonhosted.org/packages/24/ee/3e81e1dd174f5c7fefe50e85d0892de05ca4e26ef1c9a59c2a57e43b865a/contourpy-1.3.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:2a2a8b627d5cc6b7c41a4beff6c5ad5eb848c88255fda4a8745f7e901b32d8e4", size = 1322295, upload-time = "2025-07-26T12:02:44.668Z" }, { url = "https://files.pythonhosted.org/packages/3c/b2/6d913d4d04e14379de429057cd169e5e00f6c2af3bb13e1710bcbdb5da12/contourpy-1.3.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:fd6ec6be509c787f1caf6b247f0b1ca598bef13f4ddeaa126b7658215529ba0f", size = 1391027, upload-time = "2025-07-26T12:02:47.09Z" }, @@ -321,31 +403,51 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/5d/96/5238b1efc5922ddbdc9b0db9243152c09777804fb7c02ad1741eb18a11c0/coverage-7.13.4-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:40aa8808140e55dc022b15d8aa7f651b6b3d68b365ea0398f1441e0b04d859c3", size = 219810, upload-time = "2026-02-09T12:56:56.33Z" }, { url = "https://files.pythonhosted.org/packages/5d/a0/2ea570925524ef4e00bb6c82649f5682a77fac5ab910a65c9284de422600/coverage-7.13.4-cp312-cp312-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:2c048ea43875fbf8b45d476ad79f179809c590ec7b79e2035c662e7afa3192e3", size = 254052, upload-time = "2026-02-09T12:56:59.754Z" }, { url = "https://files.pythonhosted.org/packages/e8/ac/45dc2e19a1939098d783c846e130b8f862fbb50d09e0af663988f2f21973/coverage-7.13.4-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b7b38448866e83176e28086674fe7368ab8590e4610fb662b44e345b86d63ffa", size = 255165, upload-time = 
"2026-02-09T12:57:01.287Z" }, + { url = "https://files.pythonhosted.org/packages/2d/4d/26d236ff35abc3b5e63540d3386e4c3b192168c1d96da5cb2f43c640970f/coverage-7.13.4-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:de6defc1c9badbf8b9e67ae90fd00519186d6ab64e5cc5f3d21359c2a9b2c1d3", size = 257432, upload-time = "2026-02-09T12:57:02.637Z" }, + { url = "https://files.pythonhosted.org/packages/ec/55/14a966c757d1348b2e19caf699415a2a4c4f7feaa4bbc6326a51f5c7dd1b/coverage-7.13.4-cp312-cp312-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:7eda778067ad7ffccd23ecffce537dface96212576a07924cbf0d8799d2ded5a", size = 251716, upload-time = "2026-02-09T12:57:04.056Z" }, { url = "https://files.pythonhosted.org/packages/77/33/50116647905837c66d28b2af1321b845d5f5d19be9655cb84d4a0ea806b4/coverage-7.13.4-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:e87f6c587c3f34356c3759f0420693e35e7eb0e2e41e4c011cb6ec6ecbbf1db7", size = 253089, upload-time = "2026-02-09T12:57:05.503Z" }, + { url = "https://files.pythonhosted.org/packages/51/24/8cd73dd399b812cc76bb0ac260e671c4163093441847ffe058ac9fda1e32/coverage-7.13.4-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:25381386e80ae727608e662474db537d4df1ecd42379b5ba33c84633a2b36d47", size = 255299, upload-time = "2026-02-09T12:57:08.245Z" }, + { url = "https://files.pythonhosted.org/packages/03/94/0a4b12f1d0e029ce1ccc1c800944a9984cbe7d678e470bb6d3c6bc38a0da/coverage-7.13.4-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:ee756f00726693e5ba94d6df2bdfd64d4852d23b09bb0bc700e3b30e6f333985", size = 250796, upload-time = "2026-02-09T12:57:10.142Z" }, { url = "https://files.pythonhosted.org/packages/73/44/6002fbf88f6698ca034360ce474c406be6d5a985b3fdb3401128031eef6b/coverage-7.13.4-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:fdfc1e28e7c7cdce44985b3043bc13bbd9c747520f94a4d7164af8260b3d91f0", size = 252673, upload-time = "2026-02-09T12:57:12.197Z" }, { url = 
"https://files.pythonhosted.org/packages/db/23/aad45061a31677d68e47499197a131eea55da4875d16c1f42021ab963503/coverage-7.13.4-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:b66a2da594b6068b48b2692f043f35d4d3693fb639d5ea8b39533c2ad9ac3ab9", size = 219474, upload-time = "2026-02-09T12:57:19.332Z" },
{ url = "https://files.pythonhosted.org/packages/a5/70/9b8b67a0945f3dfec1fd896c5cefb7c19d5a3a6d74630b99a895170999ae/coverage-7.13.4-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:3599eb3992d814d23b35c536c28df1a882caa950f8f507cef23d1cbf334995ac", size = 219844, upload-time = "2026-02-09T12:57:20.66Z" },
{ url = "https://files.pythonhosted.org/packages/e4/dc/b2442d10020c2f52617828862d8b6ee337859cd8f3a1f13d607dddda9cf7/coverage-7.13.4-cp313-cp313-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:b720ce6a88a2755f7c697c23268ddc47a571b88052e6b155224347389fdf6a3b", size = 253434, upload-time = "2026-02-09T12:57:23.339Z" },
{ url = "https://files.pythonhosted.org/packages/5a/88/6728a7ad17428b18d836540630487231f5470fb82454871149502f5e5aa2/coverage-7.13.4-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:7b322db1284a2ed3aa28ffd8ebe3db91c929b7a333c0820abec3d838ef5b3525", size = 254676, upload-time = "2026-02-09T12:57:24.774Z" },
+ { url = "https://files.pythonhosted.org/packages/7c/bc/21244b1b8cedf0dff0a2b53b208015fe798d5f2a8d5348dbfece04224fff/coverage-7.13.4-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:f4594c67d8a7c89cf922d9df0438c7c7bb022ad506eddb0fdb2863359ff78242", size = 256807, upload-time = "2026-02-09T12:57:26.125Z" },
+ { url = "https://files.pythonhosted.org/packages/97/a0/ddba7ed3251cff51006737a727d84e05b61517d1784a9988a846ba508877/coverage-7.13.4-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:53d133df809c743eb8bce33b24bcababb371f4441340578cd406e084d94a6148", size = 251058, upload-time = "2026-02-09T12:57:27.614Z" },
{ url = "https://files.pythonhosted.org/packages/9b/55/e289addf7ff54d3a540526f33751951bf0878f3809b47f6dfb3def69c6f7/coverage-7.13.4-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:76451d1978b95ba6507a039090ba076105c87cc76fc3efd5d35d72093964d49a", size = 252805, upload-time = "2026-02-09T12:57:29.066Z" },
+ { url = "https://files.pythonhosted.org/packages/94/44/1093b8f93018f8b41a8cf29636c9292502f05e4a113d4d107d14a3acd044/coverage-7.13.4-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:1731dc33dc276dafc410a885cbf5992f1ff171393e48a21453b78727d090de80", size = 254923, upload-time = "2026-02-09T12:57:31.946Z" },
+ { url = "https://files.pythonhosted.org/packages/8b/55/ea2796da2d42257f37dbea1aab239ba9263b31bd91d5527cdd6db5efe174/coverage-7.13.4-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:bd60d4fe2f6fa7dff9223ca1bbc9f05d2b6697bc5961072e5d3b952d46e1b1ea", size = 250591, upload-time = "2026-02-09T12:57:33.842Z" },
{ url = "https://files.pythonhosted.org/packages/d4/fa/7c4bb72aacf8af5020675aa633e59c1fbe296d22aed191b6a5b711eb2bc7/coverage-7.13.4-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:9181a3ccead280b828fae232df12b16652702b49d41e99d657f46cc7b1f6ec7a", size = 252364, upload-time = "2026-02-09T12:57:35.743Z" },
{ url = "https://files.pythonhosted.org/packages/52/57/ee93ced533bcb3e6df961c0c6e42da2fc6addae53fb95b94a89b1e33ebd7/coverage-7.13.4-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:40d74da8e6c4b9ac18b15331c4b5ebc35a17069410cad462ad4f40dcd2d50c0d", size = 220165, upload-time = "2026-02-09T12:57:41.639Z" },
{ url = "https://files.pythonhosted.org/packages/c5/e0/969fc285a6fbdda49d91af278488d904dcd7651b2693872f0ff94e40e84a/coverage-7.13.4-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:4223b4230a376138939a9173f1bdd6521994f2aff8047fae100d6d94d50c5a12", size = 220516, upload-time = "2026-02-09T12:57:44.215Z" },
{ url = "https://files.pythonhosted.org/packages/8a/f3/e63df6d500314a2a60390d1989240d5f27318a7a68fa30ad3806e2a9323e/coverage-7.13.4-cp313-cp313t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:200dea7d1e8095cc6e98cdabe3fd1d21ab17d3cee6dab00cadbb2fe35d9c15b9", size = 263885, upload-time = "2026-02-09T12:57:47.42Z" },
{ url = "https://files.pythonhosted.org/packages/f3/67/7654810de580e14b37670b60a09c599fa348e48312db5b216d730857ffe6/coverage-7.13.4-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b8eb931ee8e6d8243e253e5ed7336deea6904369d2fd8ae6e43f68abbf167092", size = 266308, upload-time = "2026-02-09T12:57:49.345Z" },
+ { url = "https://files.pythonhosted.org/packages/37/6f/39d41eca0eab3cc82115953ad41c4e77935286c930e8fad15eaed1389d83/coverage-7.13.4-cp313-cp313t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:75eab1ebe4f2f64d9509b984f9314d4aa788540368218b858dad56dc8f3e5eb9", size = 267452, upload-time = "2026-02-09T12:57:50.811Z" },
+ { url = "https://files.pythonhosted.org/packages/50/6d/39c0fbb8fc5cd4d2090811e553c2108cf5112e882f82505ee7495349a6bf/coverage-7.13.4-cp313-cp313t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:c35eb28c1d085eb7d8c9b3296567a1bebe03ce72962e932431b9a61f28facf26", size = 261057, upload-time = "2026-02-09T12:57:52.447Z" },
{ url = "https://files.pythonhosted.org/packages/a4/a2/60010c669df5fa603bb5a97fb75407e191a846510da70ac657eb696b7fce/coverage-7.13.4-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:eb88b316ec33760714a4720feb2816a3a59180fd58c1985012054fa7aebee4c2", size = 263875, upload-time = "2026-02-09T12:57:53.938Z" },
+ { url = "https://files.pythonhosted.org/packages/70/bf/69f86ba1ad85bc3ad240e4c0e57a2e620fbc0e1645a47b5c62f0e941ad7f/coverage-7.13.4-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:fb26a934946a6afe0e326aebe0730cdff393a8bc0bbb65a2f41e30feddca399c", size = 265212, upload-time = "2026-02-09T12:57:57.5Z" },
+ { url = "https://files.pythonhosted.org/packages/ae/f2/5f65a278a8c2148731831574c73e42f57204243d33bedaaf18fa79c5958f/coverage-7.13.4-cp313-cp313t-musllinux_1_2_riscv64.whl", hash = "sha256:dae88bc0fc77edaa65c14be099bd57ee140cf507e6bfdeea7938457ab387efb0", size = 260398, upload-time = "2026-02-09T12:57:59.027Z" },
{ url = "https://files.pythonhosted.org/packages/ef/80/6e8280a350ee9fea92f14b8357448a242dcaa243cb2c72ab0ca591f66c8c/coverage-7.13.4-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:845f352911777a8e722bfce168958214951e07e47e5d5d9744109fa5fe77f79b", size = 262584, upload-time = "2026-02-09T12:58:01.129Z" },
{ url = "https://files.pythonhosted.org/packages/92/11/a9cf762bb83386467737d32187756a42094927150c3e107df4cb078e8590/coverage-7.13.4-cp314-cp314-macosx_10_15_x86_64.whl", hash = "sha256:300deaee342f90696ed186e3a00c71b5b3d27bffe9e827677954f4ee56969601", size = 219522, upload-time = "2026-02-09T12:58:08.623Z" },
{ url = "https://files.pythonhosted.org/packages/d3/28/56e6d892b7b052236d67c95f1936b6a7cf7c3e2634bf27610b8cbd7f9c60/coverage-7.13.4-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:29e3220258d682b6226a9b0925bc563ed9a1ebcff3cad30f043eceea7eaf2689", size = 219855, upload-time = "2026-02-09T12:58:10.176Z" },
{ url = "https://files.pythonhosted.org/packages/06/90/2cdab0974b9b5bbc1623f7876b73603aecac11b8d95b85b5b86b32de5eab/coverage-7.13.4-cp314-cp314-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:0dd7ab8278f0d58a0128ba2fca25824321f05d059c1441800e934ff2efa52129", size = 253396, upload-time = "2026-02-09T12:58:14.615Z" },
{ url = "https://files.pythonhosted.org/packages/ac/15/ea4da0f85bf7d7b27635039e649e99deb8173fe551096ea15017f7053537/coverage-7.13.4-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:78cdf0d578b15148b009ccf18c686aa4f719d887e76e6b40c38ffb61d264a552", size = 254745, upload-time = "2026-02-09T12:58:16.162Z" },
+ { url = "https://files.pythonhosted.org/packages/99/11/bb356e86920c655ca4d61daee4e2bbc7258f0a37de0be32d233b561134ff/coverage-7.13.4-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:48685fee12c2eb3b27c62f2658e7ea21e9c3239cba5a8a242801a0a3f6a8c62a", size = 257055, upload-time = "2026-02-09T12:58:17.892Z" },
+ { url = "https://files.pythonhosted.org/packages/c9/0f/9ae1f8cb17029e09da06ca4e28c9e1d5c1c0a511c7074592e37e0836c915/coverage-7.13.4-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:4e83efc079eb39480e6346a15a1bcb3e9b04759c5202d157e1dd4303cd619356", size = 250911, upload-time = "2026-02-09T12:58:19.495Z" },
{ url = "https://files.pythonhosted.org/packages/89/3a/adfb68558fa815cbc29747b553bc833d2150228f251b127f1ce97e48547c/coverage-7.13.4-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:ecae9737b72408d6a950f7e525f30aca12d4bd8dd95e37342e5beb3a2a8c4f71", size = 252754, upload-time = "2026-02-09T12:58:21.064Z" },
+ { url = "https://files.pythonhosted.org/packages/c7/95/383609462b3ffb1fe133014a7c84fc0dd01ed55ac6140fa1093b5af7ebb1/coverage-7.13.4-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:6fdef321fdfbb30a197efa02d48fcd9981f0d8ad2ae8903ac318adc653f5df98", size = 254994, upload-time = "2026-02-09T12:58:24.548Z" },
+ { url = "https://files.pythonhosted.org/packages/f7/ba/1761138e86c81680bfc3c49579d66312865457f9fe405b033184e5793cb3/coverage-7.13.4-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:2b0f6ccf3dbe577170bebfce1318707d0e8c3650003cb4b3a9dd744575daa8b5", size = 250531, upload-time = "2026-02-09T12:58:26.271Z" },
{ url = "https://files.pythonhosted.org/packages/f8/8e/05900df797a9c11837ab59c4d6fe94094e029582aab75c3309a93e6fb4e3/coverage-7.13.4-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:75fcd519f2a5765db3f0e391eb3b7d150cce1a771bf4c9f861aeab86c767a3c0", size = 252189, upload-time = "2026-02-09T12:58:27.807Z" },
{ url = "https://files.pythonhosted.org/packages/a7/e4/c884a405d6ead1370433dad1e3720216b4f9fd8ef5b64bfd984a2a60a11a/coverage-7.13.4-cp314-cp314t-macosx_10_15_x86_64.whl", hash = "sha256:85480adfb35ffc32d40918aad81b89c69c9cc5661a9b8a81476d3e645321a056", size = 220246, upload-time = "2026-02-09T12:58:34.181Z" },
{ url = "https://files.pythonhosted.org/packages/81/5c/4d7ed8b23b233b0fffbc9dfec53c232be2e695468523242ea9fd30f97ad2/coverage-7.13.4-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:79be69cf7f3bf9b0deeeb062eab7ac7f36cd4cc4c4dd694bd28921ba4d8596cc", size = 220514, upload-time = "2026-02-09T12:58:35.704Z" },
{ url = "https://files.pythonhosted.org/packages/09/aa/b672a647bbe1556a85337dc95bfd40d146e9965ead9cc2fe81bde1e5cbce/coverage-7.13.4-cp314-cp314t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:14375934243ee05f56c45393fe2ce81fe5cc503c07cee2bdf1725fb8bef3ffaf", size = 264004, upload-time = "2026-02-09T12:58:39.492Z" },
{ url = "https://files.pythonhosted.org/packages/79/a1/aa384dbe9181f98bba87dd23dda436f0c6cf2e148aecbb4e50fc51c1a656/coverage-7.13.4-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:25a41c3104d08edb094d9db0d905ca54d0cd41c928bb6be3c4c799a54753af55", size = 266408, upload-time = "2026-02-09T12:58:41.852Z" },
+ { url = "https://files.pythonhosted.org/packages/53/5e/5150bf17b4019bc600799f376bb9606941e55bd5a775dc1e096b6ffea952/coverage-7.13.4-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:6f01afcff62bf9a08fb32b2c1d6e924236c0383c02c790732b6537269e466a72", size = 267544, upload-time = "2026-02-09T12:58:44.093Z" },
+ { url = "https://files.pythonhosted.org/packages/e0/ed/f1de5c675987a4a7a672250d2c5c9d73d289dbf13410f00ed7181d8017dd/coverage-7.13.4-cp314-cp314t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:eb9078108fbf0bcdde37c3f4779303673c2fa1fe8f7956e68d447d0dd426d38a", size = 260980, upload-time = "2026-02-09T12:58:45.721Z" },
{ url = "https://files.pythonhosted.org/packages/b3/e3/fe758d01850aa172419a6743fe76ba8b92c29d181d4f676ffe2dae2ba631/coverage-7.13.4-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:0e086334e8537ddd17e5f16a344777c1ab8194986ec533711cbe6c41cde841b6", size = 263871, upload-time = "2026-02-09T12:58:47.334Z" },
+ { url = "https://files.pythonhosted.org/packages/14/9e/caedb1679e73e2f6ad240173f55218488bfe043e38da577c4ec977489915/coverage-7.13.4-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:3c06f0f1337c667b971ca2f975523347e63ec5e500b9aa5882d91931cd3ef750", size = 265210, upload-time = "2026-02-09T12:58:51.178Z" },
+ { url = "https://files.pythonhosted.org/packages/3a/10/0dd02cb009b16ede425b49ec344aba13a6ae1dc39600840ea6abcb085ac4/coverage-7.13.4-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = "sha256:590c0ed4bf8e85f745e6b805b2e1c457b2e33d5255dd9729743165253bc9ad39", size = 260319, upload-time = "2026-02-09T12:58:53.081Z" },
{ url = "https://files.pythonhosted.org/packages/92/8e/234d2c927af27c6d7a5ffad5bd2cf31634c46a477b4c7adfbfa66baf7ebb/coverage-7.13.4-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:eb30bf180de3f632cd043322dad5751390e5385108b2807368997d1a92a509d0", size = 262638, upload-time = "2026-02-09T12:58:55.258Z" },
{ url = "https://files.pythonhosted.org/packages/0d/4a/331fe2caf6799d591109bb9c08083080f6de90a823695d412a935622abb2/coverage-7.13.4-py3-none-any.whl", hash = "sha256:1af1641e57cf7ba1bd67d677c9abdbcd6cc2ab7da3bca7fa1e2b7e50e65f2ad0", size = 211242, upload-time = "2026-02-09T12:59:02.032Z" },
]
@@ -545,35 +647,65 @@ wheels = [
{ url = "https://files.pythonhosted.org/packages/2b/94/5c8a2b50a496b11dd519f4a24cb5496cf125681dd99e94c604ccdea9419a/frozenlist-1.8.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:f833670942247a14eafbb675458b4e61c82e002a148f49e68257b79296e865c4", size = 50448, upload-time = "2025-10-06T05:36:08.78Z" },
{ url = "https://files.pythonhosted.org/packages/6a/bd/d91c5e39f490a49df14320f4e8c80161cfcce09f1e2cde1edd16a551abb3/frozenlist-1.8.0-cp312-cp312-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:494a5952b1c597ba44e0e78113a7266e656b9794eec897b19ead706bd7074383", size = 242411, upload-time = "2025-10-06T05:36:09.801Z" },
{ url = "https://files.pythonhosted.org/packages/8f/83/f61505a05109ef3293dfb1ff594d13d64a2324ac3482be2cedc2be818256/frozenlist-1.8.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:96f423a119f4777a4a056b66ce11527366a8bb92f54e541ade21f2374433f6d4", size = 243014, upload-time = "2025-10-06T05:36:11.394Z" },
+ { url = "https://files.pythonhosted.org/packages/d8/cb/cb6c7b0f7d4023ddda30cf56b8b17494eb3a79e3fda666bf735f63118b35/frozenlist-1.8.0-cp312-cp312-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:3462dd9475af2025c31cc61be6652dfa25cbfb56cbbf52f4ccfe029f38decaf8", size = 234909, upload-time = "2025-10-06T05:36:12.598Z" },
+ { url = "https://files.pythonhosted.org/packages/31/c5/cd7a1f3b8b34af009fb17d4123c5a778b44ae2804e3ad6b86204255f9ec5/frozenlist-1.8.0-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:c4c800524c9cd9bac5166cd6f55285957fcfc907db323e193f2afcd4d9abd69b", size = 250049, upload-time = "2025-10-06T05:36:14.065Z" },
+ { url = "https://files.pythonhosted.org/packages/c0/01/2f95d3b416c584a1e7f0e1d6d31998c4a795f7544069ee2e0962a4b60740/frozenlist-1.8.0-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:d6a5df73acd3399d893dafc71663ad22534b5aa4f94e8a2fabfe856c3c1b6a52", size = 256485, upload-time = "2025-10-06T05:36:15.39Z" },
{ url = "https://files.pythonhosted.org/packages/ce/03/024bf7720b3abaebcff6d0793d73c154237b85bdf67b7ed55e5e9596dc9a/frozenlist-1.8.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:405e8fe955c2280ce66428b3ca55e12b3c4e9c336fb2103a4937e891c69a4a29", size = 237619, upload-time = "2025-10-06T05:36:16.558Z" },
+ { url = "https://files.pythonhosted.org/packages/69/fa/f8abdfe7d76b731f5d8bd217827cf6764d4f1d9763407e42717b4bed50a0/frozenlist-1.8.0-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:908bd3f6439f2fef9e85031b59fd4f1297af54415fb60e4254a95f75b3cab3f3", size = 250320, upload-time = "2025-10-06T05:36:17.821Z" },
+ { url = "https://files.pythonhosted.org/packages/f5/3c/b051329f718b463b22613e269ad72138cc256c540f78a6de89452803a47d/frozenlist-1.8.0-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:294e487f9ec720bd8ffcebc99d575f7eff3568a08a253d1ee1a0378754b74143", size = 246820, upload-time = "2025-10-06T05:36:19.046Z" },
+ { url = "https://files.pythonhosted.org/packages/0f/ae/58282e8f98e444b3f4dd42448ff36fa38bef29e40d40f330b22e7108f565/frozenlist-1.8.0-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:74c51543498289c0c43656701be6b077f4b265868fa7f8a8859c197006efb608", size = 250518, upload-time = "2025-10-06T05:36:20.763Z" },
{ url = "https://files.pythonhosted.org/packages/8f/96/007e5944694d66123183845a106547a15944fbbb7154788cbf7272789536/frozenlist-1.8.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:776f352e8329135506a1d6bf16ac3f87bc25b28e765949282dcc627af36123aa", size = 239096, upload-time = "2025-10-06T05:36:22.129Z" },
{ url = "https://files.pythonhosted.org/packages/2d/40/0832c31a37d60f60ed79e9dfb5a92e1e2af4f40a16a29abcc7992af9edff/frozenlist-1.8.0-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:8d92f1a84bb12d9e56f818b3a746f3efba93c1b63c8387a73dde655e1e42282a", size = 85717, upload-time = "2025-10-06T05:36:27.341Z" },
{ url = "https://files.pythonhosted.org/packages/30/ba/b0b3de23f40bc55a7057bd38434e25c34fa48e17f20ee273bbde5e0650f3/frozenlist-1.8.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:96153e77a591c8adc2ee805756c61f59fef4cf4073a9275ee86fe8cba41241f7", size = 49651, upload-time = "2025-10-06T05:36:28.855Z" },
{ url = "https://files.pythonhosted.org/packages/0c/ab/6e5080ee374f875296c4243c381bbdef97a9ac39c6e3ce1d5f7d42cb78d6/frozenlist-1.8.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:f21f00a91358803399890ab167098c131ec2ddd5f8f5fd5fe9c9f2c6fcd91e40", size = 49417, upload-time = "2025-10-06T05:36:29.877Z" },
{ url = "https://files.pythonhosted.org/packages/d5/4e/e4691508f9477ce67da2015d8c00acd751e6287739123113a9fca6f1604e/frozenlist-1.8.0-cp313-cp313-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:fb30f9626572a76dfe4293c7194a09fb1fe93ba94c7d4f720dfae3b646b45027", size = 234391, upload-time = "2025-10-06T05:36:31.301Z" },
{ url = "https://files.pythonhosted.org/packages/40/76/c202df58e3acdf12969a7895fd6f3bc016c642e6726aa63bd3025e0fc71c/frozenlist-1.8.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:eaa352d7047a31d87dafcacbabe89df0aa506abb5b1b85a2fb91bc3faa02d822", size = 233048, upload-time = "2025-10-06T05:36:32.531Z" },
+ { url = "https://files.pythonhosted.org/packages/f9/c0/8746afb90f17b73ca5979c7a3958116e105ff796e718575175319b5bb4ce/frozenlist-1.8.0-cp313-cp313-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:03ae967b4e297f58f8c774c7eabcce57fe3c2434817d4385c50661845a058121", size = 226549, upload-time = "2025-10-06T05:36:33.706Z" },
+ { url = "https://files.pythonhosted.org/packages/7e/eb/4c7eefc718ff72f9b6c4893291abaae5fbc0c82226a32dcd8ef4f7a5dbef/frozenlist-1.8.0-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:f6292f1de555ffcc675941d65fffffb0a5bcd992905015f85d0592201793e0e5", size = 239833, upload-time = "2025-10-06T05:36:34.947Z" },
+ { url = "https://files.pythonhosted.org/packages/c2/4e/e5c02187cf704224f8b21bee886f3d713ca379535f16893233b9d672ea71/frozenlist-1.8.0-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:29548f9b5b5e3460ce7378144c3010363d8035cea44bc0bf02d57f5a685e084e", size = 245363, upload-time = "2025-10-06T05:36:36.534Z" },
{ url = "https://files.pythonhosted.org/packages/1f/96/cb85ec608464472e82ad37a17f844889c36100eed57bea094518bf270692/frozenlist-1.8.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:ec3cc8c5d4084591b4237c0a272cc4f50a5b03396a47d9caaf76f5d7b38a4f11", size = 229314, upload-time = "2025-10-06T05:36:38.582Z" },
+ { url = "https://files.pythonhosted.org/packages/5d/6f/4ae69c550e4cee66b57887daeebe006fe985917c01d0fff9caab9883f6d0/frozenlist-1.8.0-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:517279f58009d0b1f2e7c1b130b377a349405da3f7621ed6bfae50b10adf20c1", size = 243365, upload-time = "2025-10-06T05:36:40.152Z" },
+ { url = "https://files.pythonhosted.org/packages/7a/58/afd56de246cf11780a40a2c28dc7cbabbf06337cc8ddb1c780a2d97e88d8/frozenlist-1.8.0-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:db1e72ede2d0d7ccb213f218df6a078a9c09a7de257c2fe8fcef16d5925230b1", size = 237763, upload-time = "2025-10-06T05:36:41.355Z" },
+ { url = "https://files.pythonhosted.org/packages/cb/36/cdfaf6ed42e2644740d4a10452d8e97fa1c062e2a8006e4b09f1b5fd7d63/frozenlist-1.8.0-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:b4dec9482a65c54a5044486847b8a66bf10c9cb4926d42927ec4e8fd5db7fed8", size = 240110, upload-time = "2025-10-06T05:36:42.716Z" },
{ url = "https://files.pythonhosted.org/packages/03/a8/9ea226fbefad669f11b52e864c55f0bd57d3c8d7eb07e9f2e9a0b39502e1/frozenlist-1.8.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:21900c48ae04d13d416f0e1e0c4d81f7931f73a9dfa0b7a8746fb2fe7dd970ed", size = 233717, upload-time = "2025-10-06T05:36:44.251Z" },
{ url = "https://files.pythonhosted.org/packages/d2/5c/3bbfaa920dfab09e76946a5d2833a7cbdf7b9b4a91c714666ac4855b88b4/frozenlist-1.8.0-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:e25ac20a2ef37e91c1b39938b591457666a0fa835c7783c3a8f33ea42870db94", size = 89235, upload-time = "2025-10-06T05:36:48.78Z" },
{ url = "https://files.pythonhosted.org/packages/d2/d6/f03961ef72166cec1687e84e8925838442b615bd0b8854b54923ce5b7b8a/frozenlist-1.8.0-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:07cdca25a91a4386d2e76ad992916a85038a9b97561bf7a3fd12d5d9ce31870c", size = 50742, upload-time = "2025-10-06T05:36:49.837Z" },
{ url = "https://files.pythonhosted.org/packages/1e/bb/a6d12b7ba4c3337667d0e421f7181c82dda448ce4e7ad7ecd249a16fa806/frozenlist-1.8.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:4e0c11f2cc6717e0a741f84a527c52616140741cd812a50422f83dc31749fb52", size = 51725, upload-time = "2025-10-06T05:36:50.851Z" },
{ url = "https://files.pythonhosted.org/packages/bc/71/d1fed0ffe2c2ccd70b43714c6cab0f4188f09f8a67a7914a6b46ee30f274/frozenlist-1.8.0-cp313-cp313t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:b3210649ee28062ea6099cfda39e147fa1bc039583c8ee4481cb7811e2448c51", size = 284533, upload-time = "2025-10-06T05:36:51.898Z" },
{ url = "https://files.pythonhosted.org/packages/c9/1f/fb1685a7b009d89f9bf78a42d94461bc06581f6e718c39344754a5d9bada/frozenlist-1.8.0-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:581ef5194c48035a7de2aefc72ac6539823bb71508189e5de01d60c9dcd5fa65", size = 292506, upload-time = "2025-10-06T05:36:53.101Z" },
+ { url = "https://files.pythonhosted.org/packages/e6/3b/b991fe1612703f7e0d05c0cf734c1b77aaf7c7d321df4572e8d36e7048c8/frozenlist-1.8.0-cp313-cp313t-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:3ef2d026f16a2b1866e1d86fc4e1291e1ed8a387b2c333809419a2f8b3a77b82", size = 274161, upload-time = "2025-10-06T05:36:54.309Z" },
+ { url = "https://files.pythonhosted.org/packages/ca/ec/c5c618767bcdf66e88945ec0157d7f6c4a1322f1473392319b7a2501ded7/frozenlist-1.8.0-cp313-cp313t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:5500ef82073f599ac84d888e3a8c1f77ac831183244bfd7f11eaa0289fb30714", size = 294676, upload-time = "2025-10-06T05:36:55.566Z" },
+ { url = "https://files.pythonhosted.org/packages/7c/ce/3934758637d8f8a88d11f0585d6495ef54b2044ed6ec84492a91fa3b27aa/frozenlist-1.8.0-cp313-cp313t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:50066c3997d0091c411a66e710f4e11752251e6d2d73d70d8d5d4c76442a199d", size = 300638, upload-time = "2025-10-06T05:36:56.758Z" },
{ url = "https://files.pythonhosted.org/packages/fc/4f/a7e4d0d467298f42de4b41cbc7ddaf19d3cfeabaf9ff97c20c6c7ee409f9/frozenlist-1.8.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:5c1c8e78426e59b3f8005e9b19f6ff46e5845895adbde20ece9218319eca6506", size = 283067, upload-time = "2025-10-06T05:36:57.965Z" },
+ { url = "https://files.pythonhosted.org/packages/dc/48/c7b163063d55a83772b268e6d1affb960771b0e203b632cfe09522d67ea5/frozenlist-1.8.0-cp313-cp313t-musllinux_1_2_armv7l.whl", hash = "sha256:eefdba20de0d938cec6a89bd4d70f346a03108a19b9df4248d3cf0d88f1b0f51", size = 292101, upload-time = "2025-10-06T05:36:59.237Z" },
+ { url = "https://files.pythonhosted.org/packages/9f/d0/2366d3c4ecdc2fd391e0afa6e11500bfba0ea772764d631bbf82f0136c9d/frozenlist-1.8.0-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:cf253e0e1c3ceb4aaff6df637ce033ff6535fb8c70a764a8f46aafd3d6ab798e", size = 289901, upload-time = "2025-10-06T05:37:00.811Z" },
+ { url = "https://files.pythonhosted.org/packages/b8/94/daff920e82c1b70e3618a2ac39fbc01ae3e2ff6124e80739ce5d71c9b920/frozenlist-1.8.0-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:032efa2674356903cd0261c4317a561a6850f3ac864a63fc1583147fb05a79b0", size = 289395, upload-time = "2025-10-06T05:37:02.115Z" },
{ url = "https://files.pythonhosted.org/packages/e3/20/bba307ab4235a09fdcd3cc5508dbabd17c4634a1af4b96e0f69bfe551ebd/frozenlist-1.8.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:6da155091429aeba16851ecb10a9104a108bcd32f6c1642867eadaee401c1c41", size = 283659, upload-time = "2025-10-06T05:37:03.711Z" },
{ url = "https://files.pythonhosted.org/packages/f1/c8/85da824b7e7b9b6e7f7705b2ecaf9591ba6f79c1177f324c2735e41d36a2/frozenlist-1.8.0-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:cee686f1f4cadeb2136007ddedd0aaf928ab95216e7691c63e50a8ec066336d0", size = 86127, upload-time = "2025-10-06T05:37:08.438Z" },
{ url = "https://files.pythonhosted.org/packages/8e/e8/a1185e236ec66c20afd72399522f142c3724c785789255202d27ae992818/frozenlist-1.8.0-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:119fb2a1bd47307e899c2fac7f28e85b9a543864df47aa7ec9d3c1b4545f096f", size = 49698, upload-time = "2025-10-06T05:37:09.48Z" },
{ url = "https://files.pythonhosted.org/packages/a1/93/72b1736d68f03fda5fdf0f2180fb6caaae3894f1b854d006ac61ecc727ee/frozenlist-1.8.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:4970ece02dbc8c3a92fcc5228e36a3e933a01a999f7094ff7c23fbd2beeaa67c", size = 49749, upload-time = "2025-10-06T05:37:10.569Z" },
{ url = "https://files.pythonhosted.org/packages/a7/b2/fabede9fafd976b991e9f1b9c8c873ed86f202889b864756f240ce6dd855/frozenlist-1.8.0-cp314-cp314-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:cba69cb73723c3f329622e34bdbf5ce1f80c21c290ff04256cff1cd3c2036ed2", size = 231298, upload-time = "2025-10-06T05:37:11.993Z" },
{ url = "https://files.pythonhosted.org/packages/3a/3b/d9b1e0b0eed36e70477ffb8360c49c85c8ca8ef9700a4e6711f39a6e8b45/frozenlist-1.8.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:778a11b15673f6f1df23d9586f83c4846c471a8af693a22e066508b77d201ec8", size = 232015, upload-time = "2025-10-06T05:37:13.194Z" },
+ { url = "https://files.pythonhosted.org/packages/dc/94/be719d2766c1138148564a3960fc2c06eb688da592bdc25adcf856101be7/frozenlist-1.8.0-cp314-cp314-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:0325024fe97f94c41c08872db482cf8ac4800d80e79222c6b0b7b162d5b13686", size = 225038, upload-time = "2025-10-06T05:37:14.577Z" },
+ { url = "https://files.pythonhosted.org/packages/e4/09/6712b6c5465f083f52f50cf74167b92d4ea2f50e46a9eea0523d658454ae/frozenlist-1.8.0-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:97260ff46b207a82a7567b581ab4190bd4dfa09f4db8a8b49d1a958f6aa4940e", size = 240130, upload-time = "2025-10-06T05:37:15.781Z" },
+ { url = "https://files.pythonhosted.org/packages/f8/d4/cd065cdcf21550b54f3ce6a22e143ac9e4836ca42a0de1022da8498eac89/frozenlist-1.8.0-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:54b2077180eb7f83dd52c40b2750d0a9f175e06a42e3213ce047219de902717a", size = 242845, upload-time = "2025-10-06T05:37:17.037Z" },
{ url = "https://files.pythonhosted.org/packages/62/c3/f57a5c8c70cd1ead3d5d5f776f89d33110b1addae0ab010ad774d9a44fb9/frozenlist-1.8.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:2f05983daecab868a31e1da44462873306d3cbfd76d1f0b5b69c473d21dbb128", size = 229131, upload-time = "2025-10-06T05:37:18.221Z" },
+ { url = "https://files.pythonhosted.org/packages/6c/52/232476fe9cb64f0742f3fde2b7d26c1dac18b6d62071c74d4ded55e0ef94/frozenlist-1.8.0-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:33f48f51a446114bc5d251fb2954ab0164d5be02ad3382abcbfe07e2531d650f", size = 240542, upload-time = "2025-10-06T05:37:19.771Z" },
+ { url = "https://files.pythonhosted.org/packages/5f/85/07bf3f5d0fb5414aee5f47d33c6f5c77bfe49aac680bfece33d4fdf6a246/frozenlist-1.8.0-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:154e55ec0655291b5dd1b8731c637ecdb50975a2ae70c606d100750a540082f7", size = 237308, upload-time = "2025-10-06T05:37:20.969Z" },
+ { url = "https://files.pythonhosted.org/packages/11/99/ae3a33d5befd41ac0ca2cc7fd3aa707c9c324de2e89db0e0f45db9a64c26/frozenlist-1.8.0-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:4314debad13beb564b708b4a496020e5306c7333fa9a3ab90374169a20ffab30", size = 238210, upload-time = "2025-10-06T05:37:22.252Z" },
{ url = "https://files.pythonhosted.org/packages/b2/60/b1d2da22f4970e7a155f0adde9b1435712ece01b3cd45ba63702aea33938/frozenlist-1.8.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:073f8bf8becba60aa931eb3bc420b217bb7d5b8f4750e6f8b3be7f3da85d38b7", size = 231972, upload-time = "2025-10-06T05:37:23.5Z" },
{ url = "https://files.pythonhosted.org/packages/c0/c7/43200656ecc4e02d3f8bc248df68256cd9572b3f0017f0a0c4e93440ae23/frozenlist-1.8.0-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:d3bb933317c52d7ea5004a1c442eef86f426886fba134ef8cf4226ea6ee1821d", size = 89238, upload-time = "2025-10-06T05:37:29.373Z" },
{ url = "https://files.pythonhosted.org/packages/d1/29/55c5f0689b9c0fb765055629f472c0de484dcaf0acee2f7707266ae3583c/frozenlist-1.8.0-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:8009897cdef112072f93a0efdce29cd819e717fd2f649ee3016efd3cd885a7ed", size = 50738, upload-time = "2025-10-06T05:37:30.792Z" },
{ url = "https://files.pythonhosted.org/packages/ba/7d/b7282a445956506fa11da8c2db7d276adcbf2b17d8bb8407a47685263f90/frozenlist-1.8.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:2c5dcbbc55383e5883246d11fd179782a9d07a986c40f49abe89ddf865913930", size = 51739, upload-time = "2025-10-06T05:37:32.127Z" },
{ url = "https://files.pythonhosted.org/packages/62/1c/3d8622e60d0b767a5510d1d3cf21065b9db874696a51ea6d7a43180a259c/frozenlist-1.8.0-cp314-cp314t-manylinux1_x86_64.manylinux_2_28_x86_64.manylinux_2_5_x86_64.whl", hash = "sha256:39ecbc32f1390387d2aa4f5a995e465e9e2f79ba3adcac92d68e3e0afae6657c", size = 284186, upload-time = "2025-10-06T05:37:33.21Z" },
{ url = "https://files.pythonhosted.org/packages/2d/14/aa36d5f85a89679a85a1d44cd7a6657e0b1c75f61e7cad987b203d2daca8/frozenlist-1.8.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:92db2bf818d5cc8d9c1f1fc56b897662e24ea5adb36ad1f1d82875bd64e03c24", size = 292196, upload-time = "2025-10-06T05:37:36.107Z" },
+ { url = "https://files.pythonhosted.org/packages/05/23/6bde59eb55abd407d34f77d39a5126fb7b4f109a3f611d3929f14b700c66/frozenlist-1.8.0-cp314-cp314t-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:2dc43a022e555de94c3b68a4ef0b11c4f747d12c024a520c7101709a2144fb37", size = 273830, upload-time = "2025-10-06T05:37:37.663Z" },
+ { url = "https://files.pythonhosted.org/packages/d2/3f/22cff331bfad7a8afa616289000ba793347fcd7bc275f3b28ecea2a27909/frozenlist-1.8.0-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:cb89a7f2de3602cfed448095bab3f178399646ab7c61454315089787df07733a", size = 294289, upload-time = "2025-10-06T05:37:39.261Z" },
+ { url = "https://files.pythonhosted.org/packages/a4/89/5b057c799de4838b6c69aa82b79705f2027615e01be996d2486a69ca99c4/frozenlist-1.8.0-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:33139dc858c580ea50e7e60a1b0ea003efa1fd42e6ec7fdbad78fff65fad2fd2", size = 300318, upload-time = "2025-10-06T05:37:43.213Z" },
{ url = "https://files.pythonhosted.org/packages/30/de/2c22ab3eb2a8af6d69dc799e48455813bab3690c760de58e1bf43b36da3e/frozenlist-1.8.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:168c0969a329b416119507ba30b9ea13688fafffac1b7822802537569a1cb0ef", size = 282814, upload-time = "2025-10-06T05:37:45.337Z" },
+ { url = "https://files.pythonhosted.org/packages/59/f7/970141a6a8dbd7f556d94977858cfb36fa9b66e0892c6dd780d2219d8cd8/frozenlist-1.8.0-cp314-cp314t-musllinux_1_2_armv7l.whl", hash = "sha256:28bd570e8e189d7f7b001966435f9dac6718324b5be2990ac496cf1ea9ddb7fe", size = 291762, upload-time = "2025-10-06T05:37:46.657Z" },
+ { url = "https://files.pythonhosted.org/packages/c1/15/ca1adae83a719f82df9116d66f5bb28bb95557b3951903d39135620ef157/frozenlist-1.8.0-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:b2a095d45c5d46e5e79ba1e5b9cb787f541a8dee0433836cea4b96a2c439dcd8", size = 289470, upload-time = "2025-10-06T05:37:47.946Z" },
+ { url = "https://files.pythonhosted.org/packages/ac/83/dca6dc53bf657d371fbc88ddeb21b79891e747189c5de990b9dfff2ccba1/frozenlist-1.8.0-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:eab8145831a0d56ec9c4139b6c3e594c7a83c2c8be25d5bcf2d86136a532287a", size = 289042, upload-time = "2025-10-06T05:37:49.499Z" },
{ url = "https://files.pythonhosted.org/packages/96/52/abddd34ca99be142f354398700536c5bd315880ed0a213812bc491cff5e4/frozenlist-1.8.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:974b28cf63cc99dfb2188d8d222bc6843656188164848c4f679e63dae4b0708e", size = 283148, upload-time = "2025-10-06T05:37:50.745Z" },
{ url = "https://files.pythonhosted.org/packages/9a/9a/e35b4a917281c0b8419d4207f4334c8e8c5dbf4f3f5f9ada73958d937dcc/frozenlist-1.8.0-py3-none-any.whl", hash = "sha256:0c18a16eab41e82c295618a77502e17b195883241c563b00f0aa5106fc4eaa0d", size = 13409, upload-time = "2025-10-06T05:38:16.721Z" },
]
@@ -600,21 +732,29 @@ sdist = { url = "https://files.pythonhosted.org/packages/a3/51/1664f6b78fc6ebbd9
wheels = [
{ url = "https://files.pythonhosted.org/packages/ea/ab/1608e5a7578e62113506740b88066bf09888322a311cff602105e619bd87/greenlet-3.3.2-cp312-cp312-macosx_11_0_universal2.whl", hash = "sha256:ac8d61d4343b799d1e526db579833d72f23759c71e07181c2d2944e429eb09cd", size = 280358, upload-time = "2026-02-20T20:17:43.971Z" },
{ url = "https://files.pythonhosted.org/packages/a5/23/0eae412a4ade4e6623ff7626e38998cb9b11e9ff1ebacaa021e4e108ec15/greenlet-3.3.2-cp312-cp312-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:3ceec72030dae6ac0c8ed7591b96b70410a8be370b6a477b1dbc072856ad02bd", size = 601217, upload-time = "2026-02-20T20:47:31.462Z" },
+ { url = "https://files.pythonhosted.org/packages/f8/16/5b1678a9c07098ecb9ab2dd159fafaf12e963293e61ee8d10ecb55273e5e/greenlet-3.3.2-cp312-cp312-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:a2a5be83a45ce6188c045bcc44b0ee037d6a518978de9a5d97438548b953a1ac", size = 611792, upload-time = "2026-02-20T20:55:58.423Z" },
+ { url = "https://files.pythonhosted.org/packages/5c/c5/cc09412a29e43406eba18d61c70baa936e299bc27e074e2be3806ed29098/greenlet-3.3.2-cp312-cp312-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:ae9e21c84035c490506c17002f5c8ab25f980205c3e61ddb3a2a2a2e6c411fcb", size = 626250, upload-time = "2026-02-20T21:02:46.596Z" },
{ url = "https://files.pythonhosted.org/packages/50/1f/5155f55bd71cabd03765a4aac9ac446be129895271f73872c36ebd4b04b6/greenlet-3.3.2-cp312-cp312-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:43e99d1749147ac21dde49b99c9abffcbc1e2d55c67501465ef0930d6e78e070", size = 613875, upload-time = "2026-02-20T20:21:01.102Z" },
{ url = "https://files.pythonhosted.org/packages/fc/dd/845f249c3fcd69e32df80cdab059b4be8b766ef5830a3d0aa9d6cad55beb/greenlet-3.3.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:4c956a19350e2c37f2c48b336a3afb4bff120b36076d9d7fb68cb44e05d95b79", size = 1571467, upload-time = "2026-02-20T20:49:33.495Z" },
{ url = "https://files.pythonhosted.org/packages/2a/50/2649fe21fcc2b56659a452868e695634722a6655ba245d9f77f5656010bf/greenlet-3.3.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:6c6f8ba97d17a1e7d664151284cb3315fc5f8353e75221ed4324f84eb162b395", size = 1640001, upload-time = "2026-02-20T20:21:09.154Z" },
{ url = "https://files.pythonhosted.org/packages/ac/48/f8b875fa7dea7dd9b33245e37f065af59df6a25af2f9561efa8d822fde51/greenlet-3.3.2-cp313-cp313-macosx_11_0_universal2.whl", hash = "sha256:aa6ac98bdfd716a749b84d4034486863fd81c3abde9aa3cf8eff9127981a4ae4", size = 279120, upload-time = "2026-02-20T20:19:01.9Z" },
{ url = "https://files.pythonhosted.org/packages/49/8d/9771d03e7a8b1ee456511961e1b97a6d77ae1dea4a34a5b98eee706689d3/greenlet-3.3.2-cp313-cp313-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:ab0c7e7901a00bc0a7284907273dc165b32e0d109a6713babd04471327ff7986", size = 603238, upload-time = "2026-02-20T20:47:32.873Z" },
+ { url = "https://files.pythonhosted.org/packages/59/0e/4223c2bbb63cd5c97f28ffb2a8aee71bdfb30b323c35d409450f51b91e3e/greenlet-3.3.2-cp313-cp313-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:d248d8c23c67d2291ffd47af766e2a3aa9fa1c6703155c099feb11f526c63a92", size = 614219, upload-time = "2026-02-20T20:55:59.817Z" },
+ { url = "https://files.pythonhosted.org/packages/94/2b/4d012a69759ac9d77210b8bfb128bc621125f5b20fc398bce3940d036b1c/greenlet-3.3.2-cp313-cp313-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:ccd21bb86944ca9be6d967cf7691e658e43417782bce90b5d2faeda0ff78a7dd", size = 628268, upload-time = "2026-02-20T21:02:48.024Z" },
{ url = "https://files.pythonhosted.org/packages/7a/34/259b28ea7a2a0c904b11cd36c79b8cef8019b26ee5dbe24e73b469dea347/greenlet-3.3.2-cp313-cp313-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:b6997d360a4e6a4e936c0f9625b1c20416b8a0ea18a8e19cabbefc712e7397ab", size = 616774, upload-time = "2026-02-20T20:21:02.454Z" },
{ url = "https://files.pythonhosted.org/packages/0a/03/996c2d1689d486a6e199cb0f1cf9e4aa940c500e01bdf201299d7d61fa69/greenlet-3.3.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:64970c33a50551c7c50491671265d8954046cb6e8e2999aacdd60e439b70418a", size = 1571277, upload-time = "2026-02-20T20:49:34.795Z" },
{ url = "https://files.pythonhosted.org/packages/d9/c4/2570fc07f34a39f2caf0bf9f24b0a1a0a47bc2e8e465b2c2424821389dfc/greenlet-3.3.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:1a9172f5bf6bd88e6ba5a84e0a68afeac9dc7b6b412b245dd64f52d83c81e55b", size = 1640455, upload-time = "2026-02-20T20:21:10.261Z" },
{ url = "https://files.pythonhosted.org/packages/3f/ae/8bffcbd373b57a5992cd077cbe8858fff39110480a9d50697091faea6f39/greenlet-3.3.2-cp314-cp314-macosx_11_0_universal2.whl", hash = "sha256:8d1658d7291f9859beed69a776c10822a0a799bc4bfe1bd4272bb60e62507dab", size = 279650, upload-time = "2026-02-20T20:18:00.783Z" },
{ url = "https://files.pythonhosted.org/packages/d1/c0/45f93f348fa49abf32ac8439938726c480bd96b2a3c6f4d949ec0124b69f/greenlet-3.3.2-cp314-cp314-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:18cb1b7337bca281915b3c5d5ae19f4e76d35e1df80f4ad3c1a7be91fadf1082", size = 650295, upload-time = "2026-02-20T20:47:34.036Z" },
+ { url = "https://files.pythonhosted.org/packages/b3/de/dd7589b3f2b8372069ab3e4763ea5329940fc7ad9dcd3e272a37516d7c9b/greenlet-3.3.2-cp314-cp314-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:c2e47408e8ce1c6f1ceea0dffcdf6ebb85cc09e55c7af407c99f1112016e45e9", size = 662163, upload-time = "2026-02-20T20:56:01.295Z" },
+ { url = "https://files.pythonhosted.org/packages/cd/ac/85804f74f1ccea31ba518dcc8ee6f14c79f73fe36fa1beba38930806df09/greenlet-3.3.2-cp314-cp314-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:e3cb43ce200f59483eb82949bf1835a99cf43d7571e900d7c8d5c62cdf25d2f9", size = 675371, upload-time = "2026-02-20T21:02:49.664Z" },
{ url = "https://files.pythonhosted.org/packages/d2/d8/09bfa816572a4d83bccd6750df1926f79158b1c36c5f73786e26dbe4ee38/greenlet-3.3.2-cp314-cp314-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:63d10328839d1973e5ba35e98cccbca71b232b14051fd957b6f8b6e8e80d0506", size = 664160, upload-time = "2026-02-20T20:21:04.015Z" },
{ url = "https://files.pythonhosted.org/packages/48/cf/56832f0c8255d27f6c35d41b5ec91168d74ec721d85f01a12131eec6b93c/greenlet-3.3.2-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:8e4ab3cfb02993c8cc248ea73d7dae6cec0253e9afa311c9b37e603ca9fad2ce", size = 1619181, upload-time = "2026-02-20T20:49:36.052Z" },
{ url =
"https://files.pythonhosted.org/packages/0a/23/b90b60a4aabb4cec0796e55f25ffbfb579a907c3898cd2905c8918acaa16/greenlet-3.3.2-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:94ad81f0fd3c0c0681a018a976e5c2bd2ca2d9d94895f23e7bb1af4e8af4e2d5", size = 1687713, upload-time = "2026-02-20T20:21:11.684Z" }, { url = "https://files.pythonhosted.org/packages/98/6d/8f2ef704e614bcf58ed43cfb8d87afa1c285e98194ab2cfad351bf04f81e/greenlet-3.3.2-cp314-cp314t-macosx_11_0_universal2.whl", hash = "sha256:e26e72bec7ab387ac80caa7496e0f908ff954f31065b0ffc1f8ecb1338b11b54", size = 286617, upload-time = "2026-02-20T20:19:29.856Z" }, { url = "https://files.pythonhosted.org/packages/5e/0d/93894161d307c6ea237a43988f27eba0947b360b99ac5239ad3fe09f0b47/greenlet-3.3.2-cp314-cp314t-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8b466dff7a4ffda6ca975979bab80bdadde979e29fc947ac3be4451428d8b0e4", size = 655189, upload-time = "2026-02-20T20:47:35.742Z" }, + { url = "https://files.pythonhosted.org/packages/f5/2c/d2d506ebd8abcb57386ec4f7ba20f4030cbe56eae541bc6fd6ef399c0b41/greenlet-3.3.2-cp314-cp314t-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:b8bddc5b73c9720bea487b3bffdb1840fe4e3656fba3bd40aa1489e9f37877ff", size = 658225, upload-time = "2026-02-20T20:56:02.527Z" }, + { url = "https://files.pythonhosted.org/packages/d1/67/8197b7e7e602150938049d8e7f30de1660cfb87e4c8ee349b42b67bdb2e1/greenlet-3.3.2-cp314-cp314t-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:59b3e2c40f6706b05a9cd299c836c6aa2378cabe25d021acd80f13abf81181cf", size = 666581, upload-time = "2026-02-20T21:02:51.526Z" }, { url = "https://files.pythonhosted.org/packages/8e/30/3a09155fbf728673a1dea713572d2d31159f824a37c22da82127056c44e4/greenlet-3.3.2-cp314-cp314t-manylinux_2_24_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:b26b0f4428b871a751968285a1ac9648944cea09807177ac639b030bddebcea4", size = 657907, upload-time = "2026-02-20T20:21:05.259Z" }, { url = 
"https://files.pythonhosted.org/packages/f3/fd/d05a4b7acd0154ed758797f0a43b4c0962a843bedfe980115e842c5b2d08/greenlet-3.3.2-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:1fb39a11ee2e4d94be9a76671482be9398560955c9e568550de0224e41104727", size = 1618857, upload-time = "2026-02-20T20:49:37.309Z" }, { url = "https://files.pythonhosted.org/packages/6f/e1/50ee92a5db521de8f35075b5eff060dd43d39ebd46c2181a2042f7070385/greenlet-3.3.2-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:20154044d9085151bc309e7689d6f7ba10027f8f5a8c0676ad398b951913d89e", size = 1680010, upload-time = "2026-02-20T20:21:13.427Z" }, @@ -629,6 +769,20 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/04/4b/29cac41a4d98d144bf5f6d33995617b185d14b22401f75ca86f384e87ff1/h11-0.16.0-py3-none-any.whl", hash = "sha256:63cf8bbe7522de3bf65932fda1d9c2772064ffb3dae62d55932da54b31cb6c86", size = 37515, upload-time = "2025-04-24T03:35:24.344Z" }, ] +[[package]] +name = "hdrhistogram" +version = "0.10.3" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "pbr", marker = "(platform_machine == 'arm64' and sys_platform == 'darwin') or (platform_machine == 'x86_64' and sys_platform == 'darwin') or (platform_machine == 'aarch64' and sys_platform == 'linux') or (platform_machine == 'x86_64' and sys_platform == 'linux')" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/c2/79/674aad5279dd1a77b85efa1cbf8dcead209dc5f38f55cbbfd75bc20cc65b/hdrhistogram-0.10.3.tar.gz", hash = "sha256:f3890df0a6f3c582a0a8b2a49a568729cb319f1600683e4458cc98b68ca32841", size = 60077, upload-time = "2023-08-11T04:00:36.003Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/05/60/4d12ce18d95c815553751ace3936bccc54d67f47c7a2ebcd94c7fc89ca7f/hdrhistogram-0.10.3-cp312-cp312-macosx_10_9_x86_64.whl", hash = "sha256:088d3ef64c2004fc3cd4b21c4292efe4648367a1ce98c554bf7c5730a0ba018e", size = 36661, upload-time = "2023-08-11T03:59:31.173Z" }, + { url = 
"https://files.pythonhosted.org/packages/b1/8a/ca7b687c70409aec9a524e3ce7c044274f5108fd9c33cc93635237279b70/hdrhistogram-0.10.3-cp312-cp312-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:f2ba2550e8a392a543e727a4875f76f7131d1dd04ebe7c03d3cbe44b83fc130b", size = 47987, upload-time = "2023-08-11T03:59:33.84Z" }, + { url = "https://files.pythonhosted.org/packages/a4/9d/c3ba5788f3feed8b2198a8a5461706f174912bb59595af616595a7cefd98/hdrhistogram-0.10.3-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:ad6d3ca8bcec581b8cf936608f79f6dd619e2690d1135c1978d80b01318e19e3", size = 52533, upload-time = "2023-08-11T03:59:37.027Z" }, +] + [[package]] name = "hf-xet" version = "1.4.3" @@ -777,6 +931,7 @@ dependencies = [ { name = "cyclopts", marker = "(platform_machine == 'arm64' and sys_platform == 'darwin') or (platform_machine == 'x86_64' and sys_platform == 'darwin') or (platform_machine == 'aarch64' and sys_platform == 'linux') or (platform_machine == 'x86_64' and sys_platform == 'linux')" }, { name = "datasets", marker = "(platform_machine == 'arm64' and sys_platform == 'darwin') or (platform_machine == 'x86_64' and sys_platform == 'darwin') or (platform_machine == 'aarch64' and sys_platform == 'linux') or (platform_machine == 'x86_64' and sys_platform == 'linux')" }, { name = "duckdb", marker = "(platform_machine == 'arm64' and sys_platform == 'darwin') or (platform_machine == 'x86_64' and sys_platform == 'darwin') or (platform_machine == 'aarch64' and sys_platform == 'linux') or (platform_machine == 'x86_64' and sys_platform == 'linux')" }, + { name = "hdrhistogram", marker = "(platform_machine == 'arm64' and sys_platform == 'darwin') or (platform_machine == 'x86_64' and sys_platform == 'darwin') or (platform_machine == 'aarch64' and sys_platform == 'linux') or (platform_machine == 'x86_64' and sys_platform == 'linux')" }, { name = "httptools", marker = "(platform_machine == 'arm64' and sys_platform == 'darwin') or 
(platform_machine == 'x86_64' and sys_platform == 'darwin') or (platform_machine == 'aarch64' and sys_platform == 'linux') or (platform_machine == 'x86_64' and sys_platform == 'linux')" }, { name = "msgspec", marker = "(platform_machine == 'arm64' and sys_platform == 'darwin') or (platform_machine == 'x86_64' and sys_platform == 'darwin') or (platform_machine == 'aarch64' and sys_platform == 'linux') or (platform_machine == 'x86_64' and sys_platform == 'linux')" }, { name = "numpy", marker = "(platform_machine == 'arm64' and sys_platform == 'darwin') or (platform_machine == 'x86_64' and sys_platform == 'darwin') or (platform_machine == 'aarch64' and sys_platform == 'linux') or (platform_machine == 'x86_64' and sys_platform == 'linux')" }, @@ -841,6 +996,7 @@ requires-dist = [ { name = "cyclopts", specifier = "==4.10.0" }, { name = "datasets", specifier = "==4.8.4" }, { name = "duckdb", specifier = "==1.5.1" }, + { name = "hdrhistogram", specifier = "==0.10.3" }, { name = "httptools", specifier = "==0.7.1" }, { name = "hypothesis", marker = "extra == 'test'", specifier = "==6.151.10" }, { name = "inference-endpoint", extras = ["sql"], marker = "extra == 'test'" }, @@ -918,35 +1074,65 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/99/9f/795fedf35634f746151ca8839d05681ceb6287fbed6cc1c9bf235f7887c2/kiwisolver-1.5.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:ed3a984b31da7481b103f68776f7128a89ef26ed40f4dc41a2223cda7fb24819", size = 64068, upload-time = "2026-03-09T13:13:25.878Z" }, { url = "https://files.pythonhosted.org/packages/c4/13/680c54afe3e65767bed7ec1a15571e1a2f1257128733851ade24abcefbcc/kiwisolver-1.5.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:bb5136fb5352d3f422df33f0c879a1b0c204004324150cc3b5e3c4f310c9049f", size = 1477934, upload-time = "2026-03-09T13:13:27.166Z" }, { url = 
"https://files.pythonhosted.org/packages/c8/2f/cebfcdb60fd6a9b0f6b47a9337198bcbad6fbe15e68189b7011fd914911f/kiwisolver-1.5.0-cp312-cp312-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b2af221f268f5af85e776a73d62b0845fc8baf8ef0abfae79d29c77d0e776aaf", size = 1278537, upload-time = "2026-03-09T13:13:28.707Z" }, + { url = "https://files.pythonhosted.org/packages/f2/0d/9b782923aada3fafb1d6b84e13121954515c669b18af0c26e7d21f579855/kiwisolver-1.5.0-cp312-cp312-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:b0f172dc8ffaccb8522d7c5d899de00133f2f1ca7b0a49b7da98e901de87bf2d", size = 1296685, upload-time = "2026-03-09T13:13:30.528Z" }, + { url = "https://files.pythonhosted.org/packages/27/70/83241b6634b04fe44e892688d5208332bde130f38e610c0418f9ede47ded/kiwisolver-1.5.0-cp312-cp312-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:6ab8ba9152203feec73758dad83af9a0bbe05001eb4639e547207c40cfb52083", size = 1346024, upload-time = "2026-03-09T13:13:32.818Z" }, + { url = "https://files.pythonhosted.org/packages/e4/db/30ed226fb271ae1a6431fc0fe0edffb2efe23cadb01e798caeb9f2ceae8f/kiwisolver-1.5.0-cp312-cp312-manylinux_2_39_riscv64.whl", hash = "sha256:cdee07c4d7f6d72008d3f73b9bf027f4e11550224c7c50d8df1ae4a37c1402a6", size = 987241, upload-time = "2026-03-09T13:13:34.435Z" }, { url = "https://files.pythonhosted.org/packages/ec/bd/c314595208e4c9587652d50959ead9e461995389664e490f4dce7ff0f782/kiwisolver-1.5.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:7c60d3c9b06fb23bd9c6139281ccbdc384297579ae037f08ae90c69f6845c0b1", size = 2227742, upload-time = "2026-03-09T13:13:36.4Z" }, + { url = "https://files.pythonhosted.org/packages/c1/43/0499cec932d935229b5543d073c2b87c9c22846aab48881e9d8d6e742a2d/kiwisolver-1.5.0-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:e315e5ec90d88e140f57696ff85b484ff68bb311e36f2c414aa4286293e6dee0", size = 2323966, upload-time = "2026-03-09T13:13:38.204Z" }, + { url = 
"https://files.pythonhosted.org/packages/3d/6f/79b0d760907965acfd9d61826a3d41f8f093c538f55cd2633d3f0db269f6/kiwisolver-1.5.0-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:1465387ac63576c3e125e5337a6892b9e99e0627d52317f3ca79e6930d889d15", size = 1977417, upload-time = "2026-03-09T13:13:39.966Z" }, + { url = "https://files.pythonhosted.org/packages/ab/31/01d0537c41cb75a551a438c3c7a80d0c60d60b81f694dac83dd436aec0d0/kiwisolver-1.5.0-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:530a3fd64c87cffa844d4b6b9768774763d9caa299e9b75d8eca6a4423b31314", size = 2491238, upload-time = "2026-03-09T13:13:41.698Z" }, { url = "https://files.pythonhosted.org/packages/e4/34/8aefdd0be9cfd00a44509251ba864f5caf2991e36772e61c408007e7f417/kiwisolver-1.5.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:1d9daea4ea6b9be74fe2f01f7fbade8d6ffab263e781274cffca0dba9be9eec9", size = 2294947, upload-time = "2026-03-09T13:13:43.343Z" }, { url = "https://files.pythonhosted.org/packages/9d/69/024d6711d5ba575aa65d5538042e99964104e97fa153a9f10bc369182bc2/kiwisolver-1.5.0-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:fd40bb9cd0891c4c3cb1ddf83f8bbfa15731a248fdc8162669405451e2724b09", size = 123166, upload-time = "2026-03-09T13:13:48.032Z" }, { url = "https://files.pythonhosted.org/packages/ce/48/adbb40df306f587054a348831220812b9b1d787aff714cfbc8556e38fccd/kiwisolver-1.5.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:c0e1403fd7c26d77c1f03e096dc58a5c726503fa0db0456678b8668f76f521e3", size = 66395, upload-time = "2026-03-09T13:13:49.365Z" }, { url = "https://files.pythonhosted.org/packages/a8/3a/d0a972b34e1c63e2409413104216cd1caa02c5a37cb668d1687d466c1c45/kiwisolver-1.5.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:dda366d548e89a90d88a86c692377d18d8bd64b39c1fb2b92cb31370e2896bbd", size = 64065, upload-time = "2026-03-09T13:13:50.562Z" }, { url = 
"https://files.pythonhosted.org/packages/2b/0a/7b98e1e119878a27ba8618ca1e18b14f992ff1eda40f47bccccf4de44121/kiwisolver-1.5.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:332b4f0145c30b5f5ad9374881133e5aa64320428a57c2c2b61e9d891a51c2f3", size = 1477903, upload-time = "2026-03-09T13:13:52.084Z" }, { url = "https://files.pythonhosted.org/packages/18/d8/55638d89ffd27799d5cc3d8aa28e12f4ce7a64d67b285114dbedc8ea4136/kiwisolver-1.5.0-cp313-cp313-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:0c50b89ffd3e1a911c69a1dd3de7173c0cd10b130f56222e57898683841e4f96", size = 1278751, upload-time = "2026-03-09T13:13:54.673Z" }, + { url = "https://files.pythonhosted.org/packages/b8/97/b4c8d0d18421ecceba20ad8701358453b88e32414e6f6950b5a4bad54e65/kiwisolver-1.5.0-cp313-cp313-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:4db576bb8c3ef9365f8b40fe0f671644de6736ae2c27a2c62d7d8a1b4329f099", size = 1296793, upload-time = "2026-03-09T13:13:56.287Z" }, + { url = "https://files.pythonhosted.org/packages/c4/10/f862f94b6389d8957448ec9df59450b81bec4abb318805375c401a1e6892/kiwisolver-1.5.0-cp313-cp313-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:0b85aad90cea8ac6797a53b5d5f2e967334fa4d1149f031c4537569972596cb8", size = 1346041, upload-time = "2026-03-09T13:13:58.269Z" }, + { url = "https://files.pythonhosted.org/packages/a3/6a/f1650af35821eaf09de398ec0bc2aefc8f211f0cda50204c9f1673741ba9/kiwisolver-1.5.0-cp313-cp313-manylinux_2_39_riscv64.whl", hash = "sha256:d36ca54cb4c6c4686f7cbb7b817f66f5911c12ddb519450bbe86707155028f87", size = 987292, upload-time = "2026-03-09T13:13:59.871Z" }, { url = "https://files.pythonhosted.org/packages/de/19/d7fb82984b9238115fe629c915007be608ebd23dc8629703d917dbfaffd4/kiwisolver-1.5.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:38f4a703656f493b0ad185211ccfca7f0386120f022066b018eb5296d8613e23", size = 2227865, upload-time = "2026-03-09T13:14:01.401Z" }, + { url = 
"https://files.pythonhosted.org/packages/7f/b9/46b7f386589fd222dac9e9de9c956ce5bcefe2ee73b4e79891381dda8654/kiwisolver-1.5.0-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:3ac2360e93cb41be81121755c6462cff3beaa9967188c866e5fce5cf13170859", size = 2324369, upload-time = "2026-03-09T13:14:02.972Z" }, + { url = "https://files.pythonhosted.org/packages/92/8b/95e237cf3d9c642960153c769ddcbe278f182c8affb20cecc1cc983e7cc5/kiwisolver-1.5.0-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:c95cab08d1965db3d84a121f1c7ce7479bdd4072c9b3dafd8fecce48a2e6b902", size = 1977989, upload-time = "2026-03-09T13:14:04.503Z" }, + { url = "https://files.pythonhosted.org/packages/1b/95/980c9df53501892784997820136c01f62bc1865e31b82b9560f980c0e649/kiwisolver-1.5.0-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:fc20894c3d21194d8041a28b65622d5b86db786da6e3cfe73f0c762951a61167", size = 2491645, upload-time = "2026-03-09T13:14:06.106Z" }, { url = "https://files.pythonhosted.org/packages/cb/32/900647fd0840abebe1561792c6b31e6a7c0e278fc3973d30572a965ca14c/kiwisolver-1.5.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:7a32f72973f0f950c1920475d5c5ea3d971b81b6f0ec53b8d0a956cc965f22e0", size = 2295237, upload-time = "2026-03-09T13:14:08.891Z" }, { url = "https://files.pythonhosted.org/packages/b0/69/ce68dd0c85755ae2de490bf015b62f2cea5f6b14ff00a463f9d0774449ff/kiwisolver-1.5.0-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:db485b3847d182b908b483b2ed133c66d88d49cacf98fd278fadafe11b4478d1", size = 125700, upload-time = "2026-03-09T13:14:14.636Z" }, { url = "https://files.pythonhosted.org/packages/74/aa/937aac021cf9d4349990d47eb319309a51355ed1dbdc9c077cdc9224cb11/kiwisolver-1.5.0-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:be12f931839a3bdfe28b584db0e640a65a8bcbc24560ae3fdb025a449b3d754e", size = 67537, upload-time = "2026-03-09T13:14:15.808Z" }, { url = 
"https://files.pythonhosted.org/packages/ee/20/3a87fbece2c40ad0f6f0aefa93542559159c5f99831d596050e8afae7a9f/kiwisolver-1.5.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:16b85d37c2cbb3253226d26e64663f755d88a03439a9c47df6246b35defbdfb7", size = 65514, upload-time = "2026-03-09T13:14:18.035Z" }, { url = "https://files.pythonhosted.org/packages/f0/7f/f943879cda9007c45e1f7dba216d705c3a18d6b35830e488b6c6a4e7cdf0/kiwisolver-1.5.0-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:4432b835675f0ea7414aab3d37d119f7226d24869b7a829caeab49ebda407b0c", size = 1584848, upload-time = "2026-03-09T13:14:19.745Z" }, { url = "https://files.pythonhosted.org/packages/37/f8/4d4f85cc1870c127c88d950913370dd76138482161cd07eabbc450deff01/kiwisolver-1.5.0-cp313-cp313t-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1b0feb50971481a2cc44d94e88bdb02cdd497618252ae226b8eb1201b957e368", size = 1391542, upload-time = "2026-03-09T13:14:21.54Z" }, + { url = "https://files.pythonhosted.org/packages/04/0b/65dd2916c84d252b244bd405303220f729e7c17c9d7d33dca6feeff9ffc4/kiwisolver-1.5.0-cp313-cp313t-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:56fa888f10d0f367155e76ce849fa1166fc9730d13bd2d65a2aa13b6f5424489", size = 1404447, upload-time = "2026-03-09T13:14:23.205Z" }, + { url = "https://files.pythonhosted.org/packages/39/5c/2606a373247babce9b1d056c03a04b65f3cf5290a8eac5d7bdead0a17e21/kiwisolver-1.5.0-cp313-cp313t-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:940dda65d5e764406b9fb92761cbf462e4e63f712ab60ed98f70552e496f3bf1", size = 1455918, upload-time = "2026-03-09T13:14:24.74Z" }, + { url = "https://files.pythonhosted.org/packages/d5/d1/c6078b5756670658e9192a2ef11e939c92918833d2745f85cd14a6004bdf/kiwisolver-1.5.0-cp313-cp313t-manylinux_2_39_riscv64.whl", hash = "sha256:89fc958c702ee9a745e4700378f5d23fddbc46ff89e8fdbf5395c24d5c1452a3", size = 1072856, upload-time = "2026-03-09T13:14:26.597Z" }, { url = 
"https://files.pythonhosted.org/packages/cb/c8/7def6ddf16eb2b3741d8b172bdaa9af882b03c78e9b0772975408801fa63/kiwisolver-1.5.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:9027d773c4ff81487181a925945743413f6069634d0b122d0b37684ccf4f1e18", size = 2333580, upload-time = "2026-03-09T13:14:28.237Z" }, + { url = "https://files.pythonhosted.org/packages/9e/87/2ac1fce0eb1e616fcd3c35caa23e665e9b1948bb984f4764790924594128/kiwisolver-1.5.0-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:5b233ea3e165e43e35dba1d2b8ecc21cf070b45b65ae17dd2747d2713d942021", size = 2423018, upload-time = "2026-03-09T13:14:30.018Z" }, + { url = "https://files.pythonhosted.org/packages/67/13/c6700ccc6cc218716bfcda4935e4b2997039869b4ad8a94f364c5a3b8e63/kiwisolver-1.5.0-cp313-cp313t-musllinux_1_2_riscv64.whl", hash = "sha256:ce9bf03dad3b46408c08649c6fbd6ca28a9fce0eb32fdfffa6775a13103b5310", size = 2062804, upload-time = "2026-03-09T13:14:32.888Z" }, + { url = "https://files.pythonhosted.org/packages/1b/bd/877056304626943ff0f1f44c08f584300c199b887cb3176cd7e34f1515f1/kiwisolver-1.5.0-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:fc4d3f1fb9ca0ae9f97b095963bc6326f1dbfd3779d6679a1e016b9baaa153d3", size = 2597482, upload-time = "2026-03-09T13:14:34.971Z" }, { url = "https://files.pythonhosted.org/packages/75/19/c60626c47bf0f8ac5dcf72c6c98e266d714f2fbbfd50cf6dab5ede3aaa50/kiwisolver-1.5.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:f443b4825c50a51ee68585522ab4a1d1257fac65896f282b4c6763337ac9f5d2", size = 2394328, upload-time = "2026-03-09T13:14:36.816Z" }, { url = "https://files.pythonhosted.org/packages/e4/d7/060f45052f2a01ad5762c8fdecd6d7a752b43400dc29ff75cd47225a40fd/kiwisolver-1.5.0-cp314-cp314-macosx_10_15_universal2.whl", hash = "sha256:8df31fe574b8b3993cc61764f40941111b25c2d9fea13d3ce24a49907cd2d615", size = 123231, upload-time = "2026-03-09T13:14:41.323Z" }, { url = 
"https://files.pythonhosted.org/packages/c2/a7/78da680eadd06ff35edef6ef68a1ad273bad3e2a0936c9a885103230aece/kiwisolver-1.5.0-cp314-cp314-macosx_10_15_x86_64.whl", hash = "sha256:1d49a49ac4cbfb7c1375301cd1ec90169dfeae55ff84710d782260ce77a75a02", size = 66489, upload-time = "2026-03-09T13:14:42.534Z" }, { url = "https://files.pythonhosted.org/packages/49/b2/97980f3ad4fae37dd7fe31626e2bf75fbf8bdf5d303950ec1fab39a12da8/kiwisolver-1.5.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:0cbe94b69b819209a62cb27bdfa5dc2a8977d8de2f89dfd97ba4f53ed3af754e", size = 64063, upload-time = "2026-03-09T13:14:44.759Z" }, { url = "https://files.pythonhosted.org/packages/e7/f9/b06c934a6aa8bc91f566bd2a214fd04c30506c2d9e2b6b171953216a65b6/kiwisolver-1.5.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:80aa065ffd378ff784822a6d7c3212f2d5f5e9c3589614b5c228b311fd3063ac", size = 1475913, upload-time = "2026-03-09T13:14:46.247Z" }, { url = "https://files.pythonhosted.org/packages/6b/f0/f768ae564a710135630672981231320bc403cf9152b5596ec5289de0f106/kiwisolver-1.5.0-cp314-cp314-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:4e7f886f47ab881692f278ae901039a234e4025a68e6dfab514263a0b1c4ae05", size = 1282782, upload-time = "2026-03-09T13:14:48.458Z" }, + { url = "https://files.pythonhosted.org/packages/e2/9f/1de7aad00697325f05238a5f2eafbd487fb637cc27a558b5367a5f37fb7f/kiwisolver-1.5.0-cp314-cp314-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:5060731cc3ed12ca3a8b57acd4aeca5bbc2f49216dd0bec1650a1acd89486bcd", size = 1300815, upload-time = "2026-03-09T13:14:50.721Z" }, + { url = "https://files.pythonhosted.org/packages/5a/c2/297f25141d2e468e0ce7f7a7b92e0cf8918143a0cbd3422c1ad627e85a06/kiwisolver-1.5.0-cp314-cp314-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:7a4aa69609f40fce3cbc3f87b2061f042eee32f94b8f11db707b66a26461591a", size = 1347925, upload-time = "2026-03-09T13:14:52.304Z" }, + { url = 
"https://files.pythonhosted.org/packages/b9/d3/f4c73a02eb41520c47610207b21afa8cdd18fdbf64ffd94674ae21c4812d/kiwisolver-1.5.0-cp314-cp314-manylinux_2_39_riscv64.whl", hash = "sha256:d168fda2dbff7b9b5f38e693182d792a938c31db4dac3a80a4888de603c99554", size = 991322, upload-time = "2026-03-09T13:14:54.637Z" }, { url = "https://files.pythonhosted.org/packages/7b/46/d3f2efef7732fcda98d22bf4ad5d3d71d545167a852ca710a494f4c15343/kiwisolver-1.5.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:413b820229730d358efd838ecbab79902fe97094565fdc80ddb6b0a18c18a581", size = 2232857, upload-time = "2026-03-09T13:14:56.471Z" }, + { url = "https://files.pythonhosted.org/packages/3f/ec/2d9756bf2b6d26ae4349b8d3662fb3993f16d80c1f971c179ce862b9dbae/kiwisolver-1.5.0-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:5124d1ea754509b09e53738ec185584cc609aae4a3b510aaf4ed6aa047ef9303", size = 2329376, upload-time = "2026-03-09T13:14:58.072Z" }, + { url = "https://files.pythonhosted.org/packages/8f/9f/876a0a0f2260f1bde92e002b3019a5fabc35e0939c7d945e0fa66185eb20/kiwisolver-1.5.0-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:e4415a8db000bf49a6dd1c478bf70062eaacff0f462b92b0ba68791a905861f9", size = 1982549, upload-time = "2026-03-09T13:14:59.668Z" }, + { url = "https://files.pythonhosted.org/packages/6c/4f/ba3624dfac23a64d54ac4179832860cb537c1b0af06024936e82ca4154a0/kiwisolver-1.5.0-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:d618fd27420381a4f6044faa71f46d8bfd911bd077c555f7138ed88729bfbe79", size = 2494680, upload-time = "2026-03-09T13:15:01.364Z" }, { url = "https://files.pythonhosted.org/packages/39/b7/97716b190ab98911b20d10bf92eca469121ec483b8ce0edd314f51bc85af/kiwisolver-1.5.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:5092eb5b1172947f57d6ea7d89b2f29650414e4293c47707eb499ec07a0ac796", size = 2297905, upload-time = "2026-03-09T13:15:03.925Z" }, { url = 
"https://files.pythonhosted.org/packages/17/01/7dc8c5443ff42b38e72731643ed7cf1ed9bf01691ae5cdca98501999ed83/kiwisolver-1.5.0-cp314-cp314t-macosx_10_15_universal2.whl", hash = "sha256:d1ffeb80b5676463d7a7d56acbe8e37a20ce725570e09549fe738e02ca6b7e1e", size = 125794, upload-time = "2026-03-09T13:15:10.525Z" }, { url = "https://files.pythonhosted.org/packages/46/8a/b4ebe46ebaac6a303417fab10c2e165c557ddaff558f9699d302b256bc53/kiwisolver-1.5.0-cp314-cp314t-macosx_10_15_x86_64.whl", hash = "sha256:bc4d8e252f532ab46a1de9349e2d27b91fce46736a9eedaa37beaca66f574ed4", size = 67646, upload-time = "2026-03-09T13:15:12.016Z" }, { url = "https://files.pythonhosted.org/packages/60/35/10a844afc5f19d6f567359bf4789e26661755a2f36200d5d1ed8ad0126e5/kiwisolver-1.5.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:6783e069732715ad0c3ce96dbf21dbc2235ab0593f2baf6338101f70371f4028", size = 65511, upload-time = "2026-03-09T13:15:13.311Z" }, { url = "https://files.pythonhosted.org/packages/f8/8a/685b297052dd041dcebce8e8787b58923b6e78acc6115a0dc9189011c44b/kiwisolver-1.5.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.whl", hash = "sha256:e7c4c09a490dc4d4a7f8cbee56c606a320f9dc28cf92a7157a39d1ce7676a657", size = 1584858, upload-time = "2026-03-09T13:15:15.103Z" }, { url = "https://files.pythonhosted.org/packages/9e/80/04865e3d4638ac5bddec28908916df4a3075b8c6cc101786a96803188b96/kiwisolver-1.5.0-cp314-cp314t-manylinux_2_24_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:2a075bd7bd19c70cf67c8badfa36cf7c5d8de3c9ddb8420c51e10d9c50e94920", size = 1392539, upload-time = "2026-03-09T13:15:16.661Z" }, + { url = "https://files.pythonhosted.org/packages/ba/01/77a19cacc0893fa13fafa46d1bba06fb4dc2360b3292baf4b56d8e067b24/kiwisolver-1.5.0-cp314-cp314t-manylinux_2_24_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:bdd3e53429ff02aa319ba59dfe4ceeec345bf46cf180ec2cf6fd5b942e7975e9", size = 1405310, upload-time = "2026-03-09T13:15:18.229Z" }, + { url = 
"https://files.pythonhosted.org/packages/53/39/bcaf5d0cca50e604cfa9b4e3ae1d64b50ca1ae5b754122396084599ef903/kiwisolver-1.5.0-cp314-cp314t-manylinux_2_24_s390x.manylinux_2_28_s390x.whl", hash = "sha256:3cdcb35dc9d807259c981a85531048ede628eabcffb3239adf3d17463518992d", size = 1456244, upload-time = "2026-03-09T13:15:20.444Z" }, + { url = "https://files.pythonhosted.org/packages/d0/7a/72c187abc6975f6978c3e39b7cf67aeb8b3c0a8f9790aa7fd412855e9e1f/kiwisolver-1.5.0-cp314-cp314t-manylinux_2_39_riscv64.whl", hash = "sha256:70d593af6a6ca332d1df73d519fddb5148edb15cd90d5f0155e3746a6d4fcc65", size = 1073154, upload-time = "2026-03-09T13:15:22.039Z" }, { url = "https://files.pythonhosted.org/packages/c7/ca/cf5b25783ebbd59143b4371ed0c8428a278abe68d6d0104b01865b1bbd0f/kiwisolver-1.5.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:377815a8616074cabbf3f53354e1d040c35815a134e01d7614b7692e4bf8acfa", size = 2334377, upload-time = "2026-03-09T13:15:23.741Z" }, + { url = "https://files.pythonhosted.org/packages/4a/e5/b1f492adc516796e88751282276745340e2a72dcd0d36cf7173e0daf3210/kiwisolver-1.5.0-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:0255a027391d52944eae1dbb5d4cc5903f57092f3674e8e544cdd2622826b3f0", size = 2425288, upload-time = "2026-03-09T13:15:25.789Z" }, + { url = "https://files.pythonhosted.org/packages/e6/e5/9b21fbe91a61b8f409d74a26498706e97a48008bfcd1864373d32a6ba31c/kiwisolver-1.5.0-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = "sha256:012b1eb16e28718fa782b5e61dc6f2da1f0792ca73bd05d54de6cb9561665fc9", size = 2063158, upload-time = "2026-03-09T13:15:27.63Z" }, + { url = "https://files.pythonhosted.org/packages/b1/02/83f47986138310f95ea95531f851b2a62227c11cbc3e690ae1374fe49f0f/kiwisolver-1.5.0-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:0e3aafb33aed7479377e5e9a82e9d4bf87063741fc99fc7ae48b0f16e32bdd6f", size = 2597260, upload-time = "2026-03-09T13:15:29.421Z" }, { url = 
"https://files.pythonhosted.org/packages/07/18/43a5f24608d8c313dd189cf838c8e68d75b115567c6279de7796197cfb6a/kiwisolver-1.5.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:e7a116ae737f0000343218c4edf5bd45893bfeaff0993c0b215d7124c9f77646", size = 2394403, upload-time = "2026-03-09T13:15:31.517Z" }, { url = "https://files.pythonhosted.org/packages/1c/fa/2910df836372d8761bb6eff7d8bdcb1613b5c2e03f260efe7abe34d388a7/kiwisolver-1.5.0-graalpy312-graalpy250_312_native-macosx_10_13_x86_64.whl", hash = "sha256:5ae8e62c147495b01a0f4765c878e9bfdf843412446a247e28df59936e99e797", size = 130262, upload-time = "2026-03-09T13:15:35.629Z" }, { url = "https://files.pythonhosted.org/packages/0f/41/c5f71f9f00aabcc71fee8b7475e3f64747282580c2fe748961ba29b18385/kiwisolver-1.5.0-graalpy312-graalpy250_312_native-macosx_11_0_arm64.whl", hash = "sha256:f6764a4ccab3078db14a632420930f6186058750df066b8ea2a7106df91d3203", size = 138036, upload-time = "2026-03-09T13:15:36.894Z" }, @@ -1019,31 +1205,41 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/9a/81/7e4e08678a1f98521201c3079f77db69fb552acd56067661f8c2f534a718/markupsafe-3.0.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:1872df69a4de6aead3491198eaf13810b565bdbeec3ae2dc8780f14458ec73ce", size = 12020, upload-time = "2025-09-27T18:36:31.971Z" }, { url = "https://files.pythonhosted.org/packages/1e/2c/799f4742efc39633a1b54a92eec4082e4f815314869865d876824c257c1e/markupsafe-3.0.3-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:3a7e8ae81ae39e62a41ec302f972ba6ae23a5c5396c8e60113e9066ef893da0d", size = 24332, upload-time = "2025-09-27T18:36:32.813Z" }, { url = "https://files.pythonhosted.org/packages/3c/2e/8d0c2ab90a8c1d9a24f0399058ab8519a3279d1bd4289511d74e909f060e/markupsafe-3.0.3-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:d6dd0be5b5b189d31db7cda48b91d7e0a9795f31430b7f271219ab30f1d3ac9d", size = 22947, upload-time = 
"2025-09-27T18:36:33.86Z" }, + { url = "https://files.pythonhosted.org/packages/2c/54/887f3092a85238093a0b2154bd629c89444f395618842e8b0c41783898ea/markupsafe-3.0.3-cp312-cp312-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:94c6f0bb423f739146aec64595853541634bde58b2135f27f61c1ffd1cd4d16a", size = 21962, upload-time = "2025-09-27T18:36:35.099Z" }, { url = "https://files.pythonhosted.org/packages/c9/2f/336b8c7b6f4a4d95e91119dc8521402461b74a485558d8f238a68312f11c/markupsafe-3.0.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:be8813b57049a7dc738189df53d69395eba14fb99345e0a5994914a3864c8a4b", size = 23760, upload-time = "2025-09-27T18:36:36.001Z" }, + { url = "https://files.pythonhosted.org/packages/32/43/67935f2b7e4982ffb50a4d169b724d74b62a3964bc1a9a527f5ac4f1ee2b/markupsafe-3.0.3-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:83891d0e9fb81a825d9a6d61e3f07550ca70a076484292a70fde82c4b807286f", size = 21529, upload-time = "2025-09-27T18:36:36.906Z" }, { url = "https://files.pythonhosted.org/packages/89/e0/4486f11e51bbba8b0c041098859e869e304d1c261e59244baa3d295d47b7/markupsafe-3.0.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:77f0643abe7495da77fb436f50f8dab76dbc6e5fd25d39589a0f1fe6548bfa2b", size = 23015, upload-time = "2025-09-27T18:36:37.868Z" }, { url = "https://files.pythonhosted.org/packages/38/2f/907b9c7bbba283e68f20259574b13d005c121a0fa4c175f9bed27c4597ff/markupsafe-3.0.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:e1cf1972137e83c5d4c136c43ced9ac51d0e124706ee1c8aa8532c1287fa8795", size = 11622, upload-time = "2025-09-27T18:36:41.777Z" }, { url = "https://files.pythonhosted.org/packages/9c/d9/5f7756922cdd676869eca1c4e3c0cd0df60ed30199ffd775e319089cb3ed/markupsafe-3.0.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:116bb52f642a37c115f517494ea5feb03889e04df47eeff5b130b1808ce7c219", size = 12029, upload-time = "2025-09-27T18:36:43.257Z" }, { url = 
"https://files.pythonhosted.org/packages/00/07/575a68c754943058c78f30db02ee03a64b3c638586fba6a6dd56830b30a3/markupsafe-3.0.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:133a43e73a802c5562be9bbcd03d090aa5a1fe899db609c29e8c8d815c5f6de6", size = 24374, upload-time = "2025-09-27T18:36:44.508Z" }, { url = "https://files.pythonhosted.org/packages/a9/21/9b05698b46f218fc0e118e1f8168395c65c8a2c750ae2bab54fc4bd4e0e8/markupsafe-3.0.3-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:ccfcd093f13f0f0b7fdd0f198b90053bf7b2f02a3927a30e63f3ccc9df56b676", size = 22980, upload-time = "2025-09-27T18:36:45.385Z" }, + { url = "https://files.pythonhosted.org/packages/7f/71/544260864f893f18b6827315b988c146b559391e6e7e8f7252839b1b846a/markupsafe-3.0.3-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:509fa21c6deb7a7a273d629cf5ec029bc209d1a51178615ddf718f5918992ab9", size = 21990, upload-time = "2025-09-27T18:36:46.916Z" }, { url = "https://files.pythonhosted.org/packages/c2/28/b50fc2f74d1ad761af2f5dcce7492648b983d00a65b8c0e0cb457c82ebbe/markupsafe-3.0.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:a4afe79fb3de0b7097d81da19090f4df4f8d3a2b3adaa8764138aac2e44f3af1", size = 23784, upload-time = "2025-09-27T18:36:47.884Z" }, + { url = "https://files.pythonhosted.org/packages/ed/76/104b2aa106a208da8b17a2fb72e033a5a9d7073c68f7e508b94916ed47a9/markupsafe-3.0.3-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:795e7751525cae078558e679d646ae45574b47ed6e7771863fcc079a6171a0fc", size = 21588, upload-time = "2025-09-27T18:36:48.82Z" }, { url = "https://files.pythonhosted.org/packages/b5/99/16a5eb2d140087ebd97180d95249b00a03aa87e29cc224056274f2e45fd6/markupsafe-3.0.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:8485f406a96febb5140bfeca44a73e3ce5116b2501ac54fe953e488fb1d03b12", size = 23041, upload-time = "2025-09-27T18:36:49.797Z" }, { url = 
"https://files.pythonhosted.org/packages/e4/d7/e05cd7efe43a88a17a37b3ae96e79a19e846f3f456fe79c57ca61356ef01/markupsafe-3.0.3-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:218551f6df4868a8d527e3062d0fb968682fe92054e89978594c28e642c43a73", size = 11658, upload-time = "2025-09-27T18:36:54.819Z" }, { url = "https://files.pythonhosted.org/packages/99/9e/e412117548182ce2148bdeacdda3bb494260c0b0184360fe0d56389b523b/markupsafe-3.0.3-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:3524b778fe5cfb3452a09d31e7b5adefeea8c5be1d43c4f810ba09f2ceb29d37", size = 12066, upload-time = "2025-09-27T18:36:55.714Z" }, { url = "https://files.pythonhosted.org/packages/bc/e6/fa0ffcda717ef64a5108eaa7b4f5ed28d56122c9a6d70ab8b72f9f715c80/markupsafe-3.0.3-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:4e885a3d1efa2eadc93c894a21770e4bc67899e3543680313b09f139e149ab19", size = 25639, upload-time = "2025-09-27T18:36:56.908Z" }, { url = "https://files.pythonhosted.org/packages/96/ec/2102e881fe9d25fc16cb4b25d5f5cde50970967ffa5dddafdb771237062d/markupsafe-3.0.3-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:8709b08f4a89aa7586de0aadc8da56180242ee0ada3999749b183aa23df95025", size = 23569, upload-time = "2025-09-27T18:36:57.913Z" }, + { url = "https://files.pythonhosted.org/packages/4b/30/6f2fce1f1f205fc9323255b216ca8a235b15860c34b6798f810f05828e32/markupsafe-3.0.3-cp313-cp313t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:b8512a91625c9b3da6f127803b166b629725e68af71f8184ae7e7d54686a56d6", size = 23284, upload-time = "2025-09-27T18:36:58.833Z" }, { url = "https://files.pythonhosted.org/packages/58/47/4a0ccea4ab9f5dcb6f79c0236d954acb382202721e704223a8aafa38b5c8/markupsafe-3.0.3-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:9b79b7a16f7fedff2495d684f2b59b0457c3b493778c9eed31111be64d58279f", size = 24801, upload-time = "2025-09-27T18:36:59.739Z" }, + { url = 
"https://files.pythonhosted.org/packages/6a/70/3780e9b72180b6fecb83a4814d84c3bf4b4ae4bf0b19c27196104149734c/markupsafe-3.0.3-cp313-cp313t-musllinux_1_2_riscv64.whl", hash = "sha256:12c63dfb4a98206f045aa9563db46507995f7ef6d83b2f68eda65c307c6829eb", size = 22769, upload-time = "2025-09-27T18:37:00.719Z" }, { url = "https://files.pythonhosted.org/packages/98/c5/c03c7f4125180fc215220c035beac6b9cb684bc7a067c84fc69414d315f5/markupsafe-3.0.3-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:8f71bc33915be5186016f675cd83a1e08523649b0e33efdb898db577ef5bb009", size = 23642, upload-time = "2025-09-27T18:37:01.673Z" }, { url = "https://files.pythonhosted.org/packages/33/8a/8e42d4838cd89b7dde187011e97fe6c3af66d8c044997d2183fbd6d31352/markupsafe-3.0.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:eaa9599de571d72e2daf60164784109f19978b327a3910d3e9de8c97b5b70cfe", size = 11619, upload-time = "2025-09-27T18:37:06.342Z" }, { url = "https://files.pythonhosted.org/packages/b5/64/7660f8a4a8e53c924d0fa05dc3a55c9cee10bbd82b11c5afb27d44b096ce/markupsafe-3.0.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:c47a551199eb8eb2121d4f0f15ae0f923d31350ab9280078d1e5f12b249e0026", size = 12029, upload-time = "2025-09-27T18:37:07.213Z" }, { url = "https://files.pythonhosted.org/packages/da/ef/e648bfd021127bef5fa12e1720ffed0c6cbb8310c8d9bea7266337ff06de/markupsafe-3.0.3-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:f34c41761022dd093b4b6896d4810782ffbabe30f2d443ff5f083e0cbbb8c737", size = 24408, upload-time = "2025-09-27T18:37:09.572Z" }, { url = "https://files.pythonhosted.org/packages/41/3c/a36c2450754618e62008bf7435ccb0f88053e07592e6028a34776213d877/markupsafe-3.0.3-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:457a69a9577064c05a97c41f4e65148652db078a3a509039e64d3467b9e7ef97", size = 23005, upload-time = "2025-09-27T18:37:10.58Z" }, + { url = 
"https://files.pythonhosted.org/packages/bc/20/b7fdf89a8456b099837cd1dc21974632a02a999ec9bf7ca3e490aacd98e7/markupsafe-3.0.3-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:e8afc3f2ccfa24215f8cb28dcf43f0113ac3c37c2f0f0806d8c70e4228c5cf4d", size = 22048, upload-time = "2025-09-27T18:37:11.547Z" }, { url = "https://files.pythonhosted.org/packages/9a/a7/591f592afdc734f47db08a75793a55d7fbcc6902a723ae4cfbab61010cc5/markupsafe-3.0.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:ec15a59cf5af7be74194f7ab02d0f59a62bdcf1a537677ce67a2537c9b87fcda", size = 23821, upload-time = "2025-09-27T18:37:12.48Z" }, + { url = "https://files.pythonhosted.org/packages/7d/33/45b24e4f44195b26521bc6f1a82197118f74df348556594bd2262bda1038/markupsafe-3.0.3-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:0eb9ff8191e8498cca014656ae6b8d61f39da5f95b488805da4bb029cccbfbaf", size = 21606, upload-time = "2025-09-27T18:37:13.485Z" }, { url = "https://files.pythonhosted.org/packages/ff/0e/53dfaca23a69fbfbbf17a4b64072090e70717344c52eaaaa9c5ddff1e5f0/markupsafe-3.0.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:2713baf880df847f2bece4230d4d094280f4e67b1e813eec43b4c0e144a34ffe", size = 23043, upload-time = "2025-09-27T18:37:14.408Z" }, { url = "https://files.pythonhosted.org/packages/3c/f0/57689aa4076e1b43b15fdfa646b04653969d50cf30c32a102762be2485da/markupsafe-3.0.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:1353ef0c1b138e1907ae78e2f6c63ff67501122006b0f9abad68fda5f4ffc6ab", size = 11661, upload-time = "2025-09-27T18:37:18.453Z" }, { url = "https://files.pythonhosted.org/packages/89/c3/2e67a7ca217c6912985ec766c6393b636fb0c2344443ff9d91404dc4c79f/markupsafe-3.0.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:1085e7fbddd3be5f89cc898938f42c0b3c711fdcb37d75221de2666af647c175", size = 12069, upload-time = "2025-09-27T18:37:19.332Z" }, { url = 
"https://files.pythonhosted.org/packages/f0/00/be561dce4e6ca66b15276e184ce4b8aec61fe83662cce2f7d72bd3249d28/markupsafe-3.0.3-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:1b52b4fb9df4eb9ae465f8d0c228a00624de2334f216f178a995ccdcf82c4634", size = 25670, upload-time = "2025-09-27T18:37:20.245Z" }, { url = "https://files.pythonhosted.org/packages/50/09/c419f6f5a92e5fadde27efd190eca90f05e1261b10dbd8cbcb39cd8ea1dc/markupsafe-3.0.3-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:fed51ac40f757d41b7c48425901843666a6677e3e8eb0abcff09e4ba6e664f50", size = 23598, upload-time = "2025-09-27T18:37:21.177Z" }, + { url = "https://files.pythonhosted.org/packages/22/44/a0681611106e0b2921b3033fc19bc53323e0b50bc70cffdd19f7d679bb66/markupsafe-3.0.3-cp314-cp314t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:f190daf01f13c72eac4efd5c430a8de82489d9cff23c364c3ea822545032993e", size = 23261, upload-time = "2025-09-27T18:37:22.167Z" }, { url = "https://files.pythonhosted.org/packages/5f/57/1b0b3f100259dc9fffe780cfb60d4be71375510e435efec3d116b6436d43/markupsafe-3.0.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:e56b7d45a839a697b5eb268c82a71bd8c7f6c94d6fd50c3d577fa39a9f1409f5", size = 24835, upload-time = "2025-09-27T18:37:23.296Z" }, + { url = "https://files.pythonhosted.org/packages/26/6a/4bf6d0c97c4920f1597cc14dd720705eca0bf7c787aebc6bb4d1bead5388/markupsafe-3.0.3-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = "sha256:f3e98bb3798ead92273dc0e5fd0f31ade220f59a266ffd8a4f6065e0a3ce0523", size = 22733, upload-time = "2025-09-27T18:37:24.237Z" }, { url = "https://files.pythonhosted.org/packages/14/c7/ca723101509b518797fedc2fdf79ba57f886b4aca8a7d31857ba3ee8281f/markupsafe-3.0.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:5678211cb9333a6468fb8d8be0305520aa073f50d17f089b5b4b477ea6e67fdc", size = 23672, upload-time = "2025-09-27T18:37:25.271Z" }, ] @@ 
-1198,36 +1394,66 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/fe/cf/18ef143a81610136d3da8193da9d80bfe1cb548a1e2d1c775f26b23d024a/multidict-6.7.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:3fccb473e87eaa1382689053e4a4618e7ba7b9b9b8d6adf2027ee474597128cd", size = 45456, upload-time = "2026-01-26T02:43:53.893Z" }, { url = "https://files.pythonhosted.org/packages/a9/65/1caac9d4cd32e8433908683446eebc953e82d22b03d10d41a5f0fefe991b/multidict-6.7.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:b0fa96985700739c4c7853a43c0b3e169360d6855780021bfc6d0f1ce7c123e7", size = 43872, upload-time = "2026-01-26T02:43:55.041Z" }, { url = "https://files.pythonhosted.org/packages/fd/80/c959c5933adedb9ac15152e4067c702a808ea183a8b64cf8f31af8ad3155/multidict-6.7.1-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:eb0ce7b2a32d09892b3dd6cc44877a0d02a33241fafca5f25c8b6b62374f8b75", size = 258883, upload-time = "2026-01-26T02:43:57.499Z" }, + { url = "https://files.pythonhosted.org/packages/86/85/7ed40adafea3d4f1c8b916e3b5cc3a8e07dfcdcb9cd72800f4ed3ca1b387/multidict-6.7.1-cp312-cp312-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:c3a32d23520ee37bf327d1e1a656fec76a2edd5c038bf43eddfa0572ec49c60b", size = 242413, upload-time = "2026-01-26T02:43:58.755Z" }, + { url = "https://files.pythonhosted.org/packages/d2/57/b8565ff533e48595503c785f8361ff9a4fde4d67de25c207cd0ba3befd03/multidict-6.7.1-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:9c90fed18bffc0189ba814749fdcc102b536e83a9f738a9003e569acd540a733", size = 268404, upload-time = "2026-01-26T02:44:00.216Z" }, + { url = "https://files.pythonhosted.org/packages/e0/50/9810c5c29350f7258180dfdcb2e52783a0632862eb334c4896ac717cebcb/multidict-6.7.1-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = 
"sha256:da62917e6076f512daccfbbde27f46fed1c98fee202f0559adec8ee0de67f71a", size = 269456, upload-time = "2026-01-26T02:44:02.202Z" }, { url = "https://files.pythonhosted.org/packages/f3/8d/5e5be3ced1d12966fefb5c4ea3b2a5b480afcea36406559442c6e31d4a48/multidict-6.7.1-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:bfde23ef6ed9db7eaee6c37dcec08524cb43903c60b285b172b6c094711b3961", size = 256322, upload-time = "2026-01-26T02:44:03.56Z" }, { url = "https://files.pythonhosted.org/packages/31/6e/d8a26d81ac166a5592782d208dd90dfdc0a7a218adaa52b45a672b46c122/multidict-6.7.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:3758692429e4e32f1ba0df23219cd0b4fc0a52f476726fff9337d1a57676a582", size = 253955, upload-time = "2026-01-26T02:44:04.845Z" }, + { url = "https://files.pythonhosted.org/packages/59/4c/7c672c8aad41534ba619bcd4ade7a0dc87ed6b8b5c06149b85d3dd03f0cd/multidict-6.7.1-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:398c1478926eca669f2fd6a5856b6de9c0acf23a2cb59a14c0ba5844fa38077e", size = 251254, upload-time = "2026-01-26T02:44:06.133Z" }, + { url = "https://files.pythonhosted.org/packages/fa/ba/f5449385510825b73d01c2d4087bf6d2fccc20a2d42ac34df93191d3dd03/multidict-6.7.1-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:a088b62bd733e2ad12c50dad01b7d0166c30287c166e137433d3b410add807a6", size = 263588, upload-time = "2026-01-26T02:44:09.382Z" }, + { url = "https://files.pythonhosted.org/packages/d7/11/afc7c677f68f75c84a69fe37184f0f82fce13ce4b92f49f3db280b7e92b3/multidict-6.7.1-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:3d51ff4785d58d3f6c91bdbffcb5e1f7ddfda557727043aa20d20ec4f65e324a", size = 259642, upload-time = "2026-01-26T02:44:10.73Z" }, { url = "https://files.pythonhosted.org/packages/2b/17/ebb9644da78c4ab36403739e0e6e0e30ebb135b9caf3440825001a0bddcb/multidict-6.7.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:fc5907494fccf3e7d3f94f95c91d6336b092b5fc83811720fae5e2765890dfba", 
size = 251377, upload-time = "2026-01-26T02:44:12.042Z" }, { url = "https://files.pythonhosted.org/packages/f2/22/929c141d6c0dba87d3e1d38fbdf1ba8baba86b7776469f2bc2d3227a1e67/multidict-6.7.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:2b41f5fed0ed563624f1c17630cb9941cf2309d4df00e494b551b5f3e3d67a23", size = 76174, upload-time = "2026-01-26T02:44:18.509Z" }, { url = "https://files.pythonhosted.org/packages/c7/75/bc704ae15fee974f8fccd871305e254754167dce5f9e42d88a2def741a1d/multidict-6.7.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:84e61e3af5463c19b67ced91f6c634effb89ef8bfc5ca0267f954451ed4bb6a2", size = 45116, upload-time = "2026-01-26T02:44:19.745Z" }, { url = "https://files.pythonhosted.org/packages/79/76/55cd7186f498ed080a18440c9013011eb548f77ae1b297206d030eb1180a/multidict-6.7.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:935434b9853c7c112eee7ac891bc4cb86455aa631269ae35442cb316790c1445", size = 43524, upload-time = "2026-01-26T02:44:21.571Z" }, { url = "https://files.pythonhosted.org/packages/f6/32/befed7f74c458b4a525e60519fe8d87eef72bb1e99924fa2b0f9d97a221e/multidict-6.7.1-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:e82d14e3c948952a1a85503817e038cba5905a3352de76b9a465075d072fba23", size = 256952, upload-time = "2026-01-26T02:44:24.306Z" }, + { url = "https://files.pythonhosted.org/packages/03/d6/c878a44ba877f366630c860fdf74bfb203c33778f12b6ac274936853c451/multidict-6.7.1-cp313-cp313-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:4cfb48c6ea66c83bcaaf7e4dfa7ec1b6bbcf751b7db85a328902796dfde4c060", size = 240317, upload-time = "2026-01-26T02:44:25.772Z" }, + { url = "https://files.pythonhosted.org/packages/68/49/57421b4d7ad2e9e60e25922b08ceb37e077b90444bde6ead629095327a6f/multidict-6.7.1-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = 
"sha256:1d540e51b7e8e170174555edecddbd5538105443754539193e3e1061864d444d", size = 267132, upload-time = "2026-01-26T02:44:27.648Z" }, + { url = "https://files.pythonhosted.org/packages/b7/fe/ec0edd52ddbcea2a2e89e174f0206444a61440b40f39704e64dc807a70bd/multidict-6.7.1-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:273d23f4b40f3dce4d6c8a821c741a86dec62cded82e1175ba3d99be128147ed", size = 268140, upload-time = "2026-01-26T02:44:29.588Z" }, { url = "https://files.pythonhosted.org/packages/b0/73/6e1b01cbeb458807aa0831742232dbdd1fa92bfa33f52a3f176b4ff3dc11/multidict-6.7.1-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:9d624335fd4fa1c08a53f8b4be7676ebde19cd092b3895c421045ca87895b429", size = 254277, upload-time = "2026-01-26T02:44:30.902Z" }, { url = "https://files.pythonhosted.org/packages/6a/b2/5fb8c124d7561a4974c342bc8c778b471ebbeb3cc17df696f034a7e9afe7/multidict-6.7.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:12fad252f8b267cc75b66e8fc51b3079604e8d43a75428ffe193cd9e2195dfd6", size = 252291, upload-time = "2026-01-26T02:44:32.31Z" }, + { url = "https://files.pythonhosted.org/packages/5a/96/51d4e4e06bcce92577fcd488e22600bd38e4fd59c20cb49434d054903bd2/multidict-6.7.1-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:03ede2a6ffbe8ef936b92cb4529f27f42be7f56afcdab5ab739cd5f27fb1cbf9", size = 250156, upload-time = "2026-01-26T02:44:33.734Z" }, + { url = "https://files.pythonhosted.org/packages/44/a3/ec5b5bd98f306bc2aa297b8c6f11a46714a56b1e6ef5ebda50a4f5d7c5fb/multidict-6.7.1-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:5c4b9bfc148f5a91be9244d6264c53035c8a0dcd2f51f1c3c6e30e30ebaa1c84", size = 262221, upload-time = "2026-01-26T02:44:36.604Z" }, + { url = "https://files.pythonhosted.org/packages/cd/f7/e8c0d0da0cd1e28d10e624604e1a36bcc3353aaebdfdc3a43c72bc683a12/multidict-6.7.1-cp313-cp313-musllinux_1_2_s390x.whl", hash = 
"sha256:401c5a650f3add2472d1d288c26deebc540f99e2fb83e9525007a74cd2116f1d", size = 258664, upload-time = "2026-01-26T02:44:38.008Z" }, { url = "https://files.pythonhosted.org/packages/52/da/151a44e8016dd33feed44f730bd856a66257c1ee7aed4f44b649fb7edeb3/multidict-6.7.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:97891f3b1b3ffbded884e2916cacf3c6fc87b66bb0dde46f7357404750559f33", size = 249490, upload-time = "2026-01-26T02:44:39.386Z" }, { url = "https://files.pythonhosted.org/packages/6d/b3/e6b21c6c4f314bb956016b0b3ef2162590a529b84cb831c257519e7fde44/multidict-6.7.1-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:c76c4bec1538375dad9d452d246ca5368ad6e1c9039dadcf007ae59c70619ea1", size = 83175, upload-time = "2026-01-26T02:44:44.894Z" }, { url = "https://files.pythonhosted.org/packages/fb/76/23ecd2abfe0957b234f6c960f4ade497f55f2c16aeb684d4ecdbf1c95791/multidict-6.7.1-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:57b46b24b5d5ebcc978da4ec23a819a9402b4228b8a90d9c656422b4bdd8a963", size = 48460, upload-time = "2026-01-26T02:44:46.106Z" }, { url = "https://files.pythonhosted.org/packages/c4/57/a0ed92b23f3a042c36bc4227b72b97eca803f5f1801c1ab77c8a212d455e/multidict-6.7.1-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:e954b24433c768ce78ab7929e84ccf3422e46deb45a4dc9f93438f8217fa2d34", size = 46930, upload-time = "2026-01-26T02:44:47.278Z" }, { url = "https://files.pythonhosted.org/packages/58/18/64f5a795e7677670e872673aca234162514696274597b3708b2c0d276cce/multidict-6.7.1-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:253282d70d67885a15c8a7716f3a73edf2d635793ceda8173b9ecc21f2fb8292", size = 250031, upload-time = "2026-01-26T02:44:50.544Z" }, + { url = "https://files.pythonhosted.org/packages/c8/ed/e192291dbbe51a8290c5686f482084d31bcd9d09af24f63358c3d42fd284/multidict-6.7.1-cp313-cp313t-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = 
"sha256:0b4c48648d7649c9335cf1927a8b87fa692de3dcb15faa676c6a6f1f1aabda43", size = 228596, upload-time = "2026-01-26T02:44:51.951Z" }, + { url = "https://files.pythonhosted.org/packages/1e/7e/3562a15a60cf747397e7f2180b0a11dc0c38d9175a650e75fa1b4d325e15/multidict-6.7.1-cp313-cp313t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:98bc624954ec4d2c7cb074b8eefc2b5d0ce7d482e410df446414355d158fe4ca", size = 257492, upload-time = "2026-01-26T02:44:53.902Z" }, + { url = "https://files.pythonhosted.org/packages/24/02/7d0f9eae92b5249bb50ac1595b295f10e263dd0078ebb55115c31e0eaccd/multidict-6.7.1-cp313-cp313t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:1b99af4d9eec0b49927b4402bcbb58dea89d3e0db8806a4086117019939ad3dd", size = 255899, upload-time = "2026-01-26T02:44:55.316Z" }, { url = "https://files.pythonhosted.org/packages/00/e3/9b60ed9e23e64c73a5cde95269ef1330678e9c6e34dd4eb6b431b85b5a10/multidict-6.7.1-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:6aac4f16b472d5b7dc6f66a0d49dd57b0e0902090be16594dc9ebfd3d17c47e7", size = 247970, upload-time = "2026-01-26T02:44:56.783Z" }, { url = "https://files.pythonhosted.org/packages/3e/06/538e58a63ed5cfb0bd4517e346b91da32fde409d839720f664e9a4ae4f9d/multidict-6.7.1-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:21f830fe223215dffd51f538e78c172ed7c7f60c9b96a2bf05c4848ad49921c3", size = 245060, upload-time = "2026-01-26T02:44:58.195Z" }, + { url = "https://files.pythonhosted.org/packages/b2/2f/d743a3045a97c895d401e9bd29aaa09b94f5cbdf1bd561609e5a6c431c70/multidict-6.7.1-cp313-cp313t-musllinux_1_2_armv7l.whl", hash = "sha256:f5dd81c45b05518b9aa4da4aa74e1c93d715efa234fd3e8a179df611cc85e5f4", size = 235888, upload-time = "2026-01-26T02:44:59.57Z" }, + { url = 
"https://files.pythonhosted.org/packages/20/1f/9d2327086bd15da2725ef6aae624208e2ef828ed99892b17f60c344e57ed/multidict-6.7.1-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:c9035dde0f916702850ef66460bc4239d89d08df4d02023a5926e7446724212c", size = 252341, upload-time = "2026-01-26T02:45:02.484Z" }, + { url = "https://files.pythonhosted.org/packages/e8/2c/2a1aa0280cf579d0f6eed8ee5211c4f1730bd7e06c636ba2ee6aafda302e/multidict-6.7.1-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:af959b9beeb66c822380f222f0e0a1889331597e81f1ded7f374f3ecb0fd6c52", size = 246391, upload-time = "2026-01-26T02:45:03.862Z" }, { url = "https://files.pythonhosted.org/packages/e5/03/7ca022ffc36c5a3f6e03b179a5ceb829be9da5783e6fe395f347c0794680/multidict-6.7.1-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:41f2952231456154ee479651491e94118229844dd7226541788be783be2b5108", size = 243422, upload-time = "2026-01-26T02:45:05.296Z" }, { url = "https://files.pythonhosted.org/packages/91/cc/db74228a8be41884a567e88a62fd589a913708fcf180d029898c17a9a371/multidict-6.7.1-cp314-cp314-macosx_10_15_universal2.whl", hash = "sha256:8f333ec9c5eb1b7105e3b84b53141e66ca05a19a605368c55450b6ba208cb9ee", size = 75190, upload-time = "2026-01-26T02:45:10.651Z" }, { url = "https://files.pythonhosted.org/packages/d5/22/492f2246bb5b534abd44804292e81eeaf835388901f0c574bac4eeec73c5/multidict-6.7.1-cp314-cp314-macosx_10_15_x86_64.whl", hash = "sha256:a407f13c188f804c759fc6a9f88286a565c242a76b27626594c133b82883b5c2", size = 44486, upload-time = "2026-01-26T02:45:11.938Z" }, { url = "https://files.pythonhosted.org/packages/f1/4f/733c48f270565d78b4544f2baddc2fb2a245e5a8640254b12c36ac7ac68e/multidict-6.7.1-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:0e161ddf326db5577c3a4cc2d8648f81456e8a20d40415541587a71620d7a7d1", size = 43219, upload-time = "2026-01-26T02:45:14.346Z" }, { url = 
"https://files.pythonhosted.org/packages/a7/f9/44d4b3064c65079d2467888794dea218d1601898ac50222ab8a9a8094460/multidict-6.7.1-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:97231140a50f5d447d3164f994b86a0bed7cd016e2682f8650d6a9158e14fd31", size = 252420, upload-time = "2026-01-26T02:45:17.293Z" }, + { url = "https://files.pythonhosted.org/packages/8b/13/78f7275e73fa17b24c9a51b0bd9d73ba64bb32d0ed51b02a746eb876abe7/multidict-6.7.1-cp314-cp314-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:6b10359683bd8806a200fd2909e7c8ca3a7b24ec1d8132e483d58e791d881048", size = 233510, upload-time = "2026-01-26T02:45:19.356Z" }, + { url = "https://files.pythonhosted.org/packages/4b/25/8167187f62ae3cbd52da7893f58cb036b47ea3fb67138787c76800158982/multidict-6.7.1-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:283ddac99f7ac25a4acadbf004cb5ae34480bbeb063520f70ce397b281859362", size = 264094, upload-time = "2026-01-26T02:45:20.834Z" }, + { url = "https://files.pythonhosted.org/packages/a1/e7/69a3a83b7b030cf283fb06ce074a05a02322359783424d7edf0f15fe5022/multidict-6.7.1-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:538cec1e18c067d0e6103aa9a74f9e832904c957adc260e61cd9d8cf0c3b3d37", size = 260786, upload-time = "2026-01-26T02:45:22.818Z" }, { url = "https://files.pythonhosted.org/packages/fe/3b/8ec5074bcfc450fe84273713b4b0a0dd47c0249358f5d82eb8104ffe2520/multidict-6.7.1-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:7eee46ccb30ff48a1e35bb818cc90846c6be2b68240e42a78599166722cea709", size = 248483, upload-time = "2026-01-26T02:45:24.368Z" }, { url = "https://files.pythonhosted.org/packages/48/5a/d5a99e3acbca0e29c5d9cba8f92ceb15dce78bab963b308ae692981e3a5d/multidict-6.7.1-cp314-cp314-musllinux_1_2_aarch64.whl", hash = 
"sha256:fa263a02f4f2dd2d11a7b1bb4362aa7cb1049f84a9235d31adf63f30143469a0", size = 248403, upload-time = "2026-01-26T02:45:25.982Z" }, + { url = "https://files.pythonhosted.org/packages/35/48/e58cd31f6c7d5102f2a4bf89f96b9cf7e00b6c6f3d04ecc44417c00a5a3c/multidict-6.7.1-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:2e1425e2f99ec5bd36c15a01b690a1a2456209c5deed58f95469ffb46039ccbb", size = 240315, upload-time = "2026-01-26T02:45:27.487Z" }, + { url = "https://files.pythonhosted.org/packages/64/f2/6e1107d226278c876c783056b7db43d800bb64c6131cec9c8dfb6903698e/multidict-6.7.1-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:233b398c29d3f1b9676b4b6f75c518a06fcb2ea0b925119fb2c1bc35c05e1601", size = 258784, upload-time = "2026-01-26T02:45:30.503Z" }, + { url = "https://files.pythonhosted.org/packages/4d/c1/11f664f14d525e4a1b5327a82d4de61a1db604ab34c6603bb3c2cc63ad34/multidict-6.7.1-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:93b1818e4a6e0930454f0f2af7dfce69307ca03cdcfb3739bf4d91241967b6c1", size = 251980, upload-time = "2026-01-26T02:45:32.603Z" }, { url = "https://files.pythonhosted.org/packages/e1/9f/75a9ac888121d0c5bbd4ecf4eead45668b1766f6baabfb3b7f66a410e231/multidict-6.7.1-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:f33dc2a3abe9249ea5d8360f969ec7f4142e7ac45ee7014d8f8d5acddf178b7b", size = 243602, upload-time = "2026-01-26T02:45:34.043Z" }, { url = "https://files.pythonhosted.org/packages/e0/6d/f9293baa6146ba9507e360ea0292b6422b016907c393e2f63fc40ab7b7b5/multidict-6.7.1-cp314-cp314t-macosx_10_15_universal2.whl", hash = "sha256:0458c978acd8e6ea53c81eefaddbbee9c6c5e591f41b3f5e8e194780fe026581", size = 82401, upload-time = "2026-01-26T02:45:40.254Z" }, { url = "https://files.pythonhosted.org/packages/7a/68/53b5494738d83558d87c3c71a486504d8373421c3e0dbb6d0db48ad42ee0/multidict-6.7.1-cp314-cp314t-macosx_10_15_x86_64.whl", hash = "sha256:c0abd12629b0af3cf590982c0b413b1e7395cd4ec026f30986818ab95bfaa94a", size = 48143, upload-time = 
"2026-01-26T02:45:41.635Z" }, { url = "https://files.pythonhosted.org/packages/37/e8/5284c53310dcdc99ce5d66563f6e5773531a9b9fe9ec7a615e9bc306b05f/multidict-6.7.1-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:14525a5f61d7d0c94b368a42cff4c9a4e7ba2d52e2672a7b23d84dc86fb02b0c", size = 46507, upload-time = "2026-01-26T02:45:42.99Z" }, { url = "https://files.pythonhosted.org/packages/41/75/4ad0973179361cdf3a113905e6e088173198349131be2b390f9fa4da5fc6/multidict-6.7.1-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:7a7e590ff876a3eaf1c02a4dfe0724b6e69a9e9de6d8f556816f29c496046e59", size = 246884, upload-time = "2026-01-26T02:45:47.167Z" }, + { url = "https://files.pythonhosted.org/packages/c3/9c/095bb28b5da139bd41fb9a5d5caff412584f377914bd8787c2aa98717130/multidict-6.7.1-cp314-cp314t-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:5fa6a95dfee63893d80a34758cd0e0c118a30b8dcb46372bf75106c591b77889", size = 225878, upload-time = "2026-01-26T02:45:48.698Z" }, + { url = "https://files.pythonhosted.org/packages/07/d0/c0a72000243756e8f5a277b6b514fa005f2c73d481b7d9e47cd4568aa2e4/multidict-6.7.1-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:a0543217a6a017692aa6ae5cc39adb75e587af0f3a82288b1492eb73dd6cc2a4", size = 253542, upload-time = "2026-01-26T02:45:50.164Z" }, + { url = "https://files.pythonhosted.org/packages/c0/6b/f69da15289e384ecf2a68837ec8b5ad8c33e973aa18b266f50fe55f24b8c/multidict-6.7.1-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:f99fe611c312b3c1c0ace793f92464d8cd263cc3b26b5721950d977b006b6c4d", size = 252403, upload-time = "2026-01-26T02:45:51.779Z" }, { url = "https://files.pythonhosted.org/packages/a2/76/b9669547afa5a1a25cd93eaca91c0da1c095b06b6d2d8ec25b713588d3a1/multidict-6.7.1-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = 
"sha256:9004d8386d133b7e6135679424c91b0b854d2d164af6ea3f289f8f2761064609", size = 244889, upload-time = "2026-01-26T02:45:53.27Z" }, { url = "https://files.pythonhosted.org/packages/7e/a9/a50d2669e506dad33cfc45b5d574a205587b7b8a5f426f2fbb2e90882588/multidict-6.7.1-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:e628ef0e6859ffd8273c69412a2465c4be4a9517d07261b33334b5ec6f3c7489", size = 241982, upload-time = "2026-01-26T02:45:54.919Z" }, + { url = "https://files.pythonhosted.org/packages/c5/bb/1609558ad8b456b4827d3c5a5b775c93b87878fd3117ed3db3423dfbce1b/multidict-6.7.1-cp314-cp314t-musllinux_1_2_armv7l.whl", hash = "sha256:841189848ba629c3552035a6a7f5bf3b02eb304e9fea7492ca220a8eda6b0e5c", size = 232415, upload-time = "2026-01-26T02:45:56.981Z" }, + { url = "https://files.pythonhosted.org/packages/a1/29/fdc6a43c203890dc2ae9249971ecd0c41deaedfe00d25cb6564b2edd99eb/multidict-6.7.1-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:b26684587228afed0d50cf804cc71062cc9c1cdf55051c4c6345d372947b268c", size = 248788, upload-time = "2026-01-26T02:46:00.862Z" }, + { url = "https://files.pythonhosted.org/packages/a9/14/a153a06101323e4cf086ecee3faadba52ff71633d471f9685c42e3736163/multidict-6.7.1-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:9f9af11306994335398293f9958071019e3ab95e9a707dc1383a35613f6abcb9", size = 242842, upload-time = "2026-01-26T02:46:02.824Z" }, { url = "https://files.pythonhosted.org/packages/41/5f/604ae839e64a4a6efc80db94465348d3b328ee955e37acb24badbcd24d83/multidict-6.7.1-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:b4938326284c4f1224178a560987b6cf8b4d38458b113d9b8c1db1a836e640a2", size = 240237, upload-time = "2026-01-26T02:46:05.898Z" }, { url = "https://files.pythonhosted.org/packages/81/08/7036c080d7117f28a4af526d794aab6a84463126db031b007717c1a6676e/multidict-6.7.1-py3-none-any.whl", hash = "sha256:55d97cc6dae627efa6a6e548885712d4864b81110ac76fa4e534c03819fa4a56", size = 12319, upload-time = "2026-01-26T02:46:44.004Z" }, ] 
@@ -1332,8 +1558,11 @@ sdist = { url = "https://files.pythonhosted.org/packages/3e/92/2d038d096f29179c7 wheels = [ { url = "https://files.pythonhosted.org/packages/45/c6/2502f416d46be3ec08bb66d696cccffb57781a499e3ff2e4d7c174af4e8f/openai_harmony-0.0.8-cp38-abi3-macosx_11_0_arm64.whl", hash = "sha256:029ec25ca74abe48fdb58eb9fdd2a8c1618581fc33ce8e5653f8a1ffbfbd9326", size = 2627806, upload-time = "2025-11-05T19:06:57.063Z" }, { url = "https://files.pythonhosted.org/packages/d3/d2/ce6953ca87db9cae3e775024184da7d1c5cb88cead19a2d75b42f00a959c/openai_harmony-0.0.8-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e4f709815924ec325b9a890e6ab2bbb0ceec8e319a4e257328eb752cf36b2efc", size = 2948463, upload-time = "2025-11-05T19:06:48.17Z" }, + { url = "https://files.pythonhosted.org/packages/fa/4c/b553c9651662d6ce102ca7f3629d268b23df1abe5841e24bed81e8a8e949/openai_harmony-0.0.8-cp38-abi3-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:5cfcfd963b50a41fc656c84d3440ca6eecdccd6c552158ce790b8f2e33dfb5a9", size = 2704083, upload-time = "2025-11-05T19:06:50.205Z" }, + { url = "https://files.pythonhosted.org/packages/11/3c/33f3374e4624e0e776f6b13b73c45a7ead7f9c4529f8369ed5bfcaa30cac/openai_harmony-0.0.8-cp38-abi3-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:b4d5cfa168e74d08f8ba6d58a7e49bc7daef4d58951ec69b66b0d56f4927a68d", size = 3427031, upload-time = "2025-11-05T19:06:51.829Z" }, { url = "https://files.pythonhosted.org/packages/25/3f/1a192b93bb47c6b44cd98ba8cc1d3d2a9308f1bb700c3017e6352da11bda/openai_harmony-0.0.8-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c007d277218a50db8839e599ed78e0fffe5130f614c3f6d93ae257f282071a29", size = 2953260, upload-time = "2025-11-05T19:06:55.406Z" }, { url = "https://files.pythonhosted.org/packages/5b/f8/93b582cad3531797c3db7c2db5400fd841538ccddfd9f5e3df61be99a630/openai_harmony-0.0.8-cp38-abi3-musllinux_1_2_aarch64.whl", hash = 
"sha256:8565d4f5a0638da1bffde29832ed63c9e695c558611053add3b2dc0b56c92dbc", size = 3127044, upload-time = "2025-11-05T19:06:59.553Z" }, + { url = "https://files.pythonhosted.org/packages/1d/10/4327dbf87f75ae813405fd9a9b4a5cde63d506ffed0a096a440a4cabd89c/openai_harmony-0.0.8-cp38-abi3-musllinux_1_2_armv7l.whl", hash = "sha256:cbaa3bda75ef0d8836e1f8cc84af62f971b1d756d740efc95c38c3e04c0bfde2", size = 2932931, upload-time = "2025-11-05T19:07:01.437Z" }, { url = "https://files.pythonhosted.org/packages/60/c3/3d1e01e2dba517a91760e4a03e4f20ffc75039a6fe584d0e6f9b5c78fd15/openai_harmony-0.0.8-cp38-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:007b0476a1f331f8130783f901f1da6f5a7057af1a4891f1b6a31dec364189b5", size = 3205080, upload-time = "2025-11-05T19:07:05.078Z" }, ] @@ -1397,6 +1626,18 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/e4/a5/2f6ed612056819de445a433ca1f2821ac3dab7f150d569a59e9cc105de1d/pandas-3.0.2-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:734be7551687c00fbd760dc0522ed974f82ad230d4a10f54bf51b80d44a08702", size = 11815274, upload-time = "2026-03-31T06:48:22.695Z" }, ] +[[package]] +name = "pbr" +version = "7.0.3" +source = { registry = "https://pypi.org/simple" } +dependencies = [ + { name = "setuptools", marker = "(platform_machine == 'arm64' and sys_platform == 'darwin') or (platform_machine == 'x86_64' and sys_platform == 'darwin') or (platform_machine == 'aarch64' and sys_platform == 'linux') or (platform_machine == 'x86_64' and sys_platform == 'linux')" }, +] +sdist = { url = "https://files.pythonhosted.org/packages/5e/ab/1de9a4f730edde1bdbbc2b8d19f8fa326f036b4f18b2f72cfbea7dc53c26/pbr-7.0.3.tar.gz", hash = "sha256:b46004ec30a5324672683ec848aed9e8fc500b0d261d40a3229c2d2bbfcedc29", size = 135625, upload-time = "2025-11-03T17:04:56.274Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/c0/db/61efa0d08a99f897ef98256b03e563092d36cc38dc4ebe4a85020fe40b31/pbr-7.0.3-py2.py3-none-any.whl", hash = 
"sha256:ff223894eb1cd271a98076b13d3badff3bb36c424074d26334cd25aebeecea6b", size = 131898, upload-time = "2025-11-03T17:04:54.875Z" }, +] + [[package]] name = "pillow" version = "12.2.0" @@ -1411,9 +1652,6 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/43/e3/fdc657359e919462369869f1c9f0e973f353f9a9ee295a39b1fea8ee1a77/pillow-12.2.0-cp312-cp312-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:62f5409336adb0663b7caa0da5c7d9e7bdbaae9ce761d34669420c2a801b2780", size = 7087215, upload-time = "2026-04-01T14:43:26.758Z" }, { url = "https://files.pythonhosted.org/packages/8b/f8/2f6825e441d5b1959d2ca5adec984210f1ec086435b0ed5f52c19b3b8a6e/pillow-12.2.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:01afa7cf67f74f09523699b4e88c73fb55c13346d212a59a2db1f86b0a63e8c5", size = 6509783, upload-time = "2026-04-01T14:43:29.56Z" }, { url = "https://files.pythonhosted.org/packages/67/f9/029a27095ad20f854f9dba026b3ea6428548316e057e6fc3545409e86651/pillow-12.2.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:fc3d34d4a8fbec3e88a79b92e5465e0f9b842b628675850d860b8bd300b159f5", size = 7212112, upload-time = "2026-04-01T14:43:32.091Z" }, - { url = "https://files.pythonhosted.org/packages/4a/01/53d10cf0dbad820a8db274d259a37ba50b88b24768ddccec07355382d5ad/pillow-12.2.0-cp313-cp313-ios_13_0_arm64_iphoneos.whl", hash = "sha256:8297651f5b5679c19968abefd6bb84d95fe30ef712eb1b2d9b2d31ca61267f4c", size = 4100837, upload-time = "2026-04-01T14:43:41.506Z" }, - { url = "https://files.pythonhosted.org/packages/0f/98/f3a6657ecb698c937f6c76ee564882945f29b79bad496abcba0e84659ec5/pillow-12.2.0-cp313-cp313-ios_13_0_arm64_iphonesimulator.whl", hash = "sha256:50d8520da2a6ce0af445fa6d648c4273c3eeefbc32d7ce049f22e8b5c3daecc2", size = 4176528, upload-time = "2026-04-01T14:43:43.773Z" }, - { url = "https://files.pythonhosted.org/packages/69/bc/8986948f05e3ea490b8442ea1c1d4d990b24a7e43d8a51b2c7d8b1dced36/pillow-12.2.0-cp313-cp313-ios_13_0_x86_64_iphonesimulator.whl", 
hash = "sha256:766cef22385fa1091258ad7e6216792b156dc16d8d3fa607e7545b2b72061f1c", size = 3640401, upload-time = "2026-04-01T14:43:45.87Z" }, { url = "https://files.pythonhosted.org/packages/34/46/6c717baadcd62bc8ed51d238d521ab651eaa74838291bda1f86fe1f864c9/pillow-12.2.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:5d2fd0fa6b5d9d1de415060363433f28da8b1526c1c129020435e186794b3795", size = 5308094, upload-time = "2026-04-01T14:43:48.438Z" }, { url = "https://files.pythonhosted.org/packages/71/43/905a14a8b17fdb1ccb58d282454490662d2cb89a6bfec26af6d3520da5ec/pillow-12.2.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:56b25336f502b6ed02e889f4ece894a72612fe885889a6e8c4c80239ff6e5f5f", size = 4695402, upload-time = "2026-04-01T14:43:51.292Z" }, { url = "https://files.pythonhosted.org/packages/73/dd/42107efcb777b16fa0393317eac58f5b5cf30e8392e266e76e51cff28c3d/pillow-12.2.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:f1c943e96e85df3d3478f7b691f229887e143f81fedab9b20205349ab04d73ed", size = 6280005, upload-time = "2026-04-01T14:43:54.242Z" }, @@ -1430,9 +1668,6 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/36/a3/f9a77144231fb8d40ee27107b4463e205fa4677e2ca2548e14da5cf18dce/pillow-12.2.0-cp313-cp313t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:efd8c21c98c5cc60653bcb311bef2ce0401642b7ce9d09e03a7da87c878289d4", size = 7115667, upload-time = "2026-04-01T14:44:32.773Z" }, { url = "https://files.pythonhosted.org/packages/c1/fc/ac4ee3041e7d5a565e1c4fd72a113f03b6394cc72ab7089d27608f8aaccb/pillow-12.2.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:9f08483a632889536b8139663db60f6724bfcb443c96f1b18855860d7d5c0fd4", size = 6538966, upload-time = "2026-04-01T14:44:35.252Z" }, { url = "https://files.pythonhosted.org/packages/c0/a8/27fb307055087f3668f6d0a8ccb636e7431d56ed0750e07a60547b1e083e/pillow-12.2.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = 
"sha256:dac8d77255a37e81a2efcbd1fc05f1c15ee82200e6c240d7e127e25e365c39ea", size = 7238241, upload-time = "2026-04-01T14:44:37.875Z" }, - { url = "https://files.pythonhosted.org/packages/bf/98/4595daa2365416a86cb0d495248a393dfc84e96d62ad080c8546256cb9c0/pillow-12.2.0-cp314-cp314-ios_13_0_arm64_iphoneos.whl", hash = "sha256:3adc9215e8be0448ed6e814966ecf3d9952f0ea40eb14e89a102b87f450660d8", size = 4100848, upload-time = "2026-04-01T14:44:48.48Z" }, - { url = "https://files.pythonhosted.org/packages/0b/79/40184d464cf89f6663e18dfcf7ca21aae2491fff1a16127681bf1fa9b8cf/pillow-12.2.0-cp314-cp314-ios_13_0_arm64_iphonesimulator.whl", hash = "sha256:6a9adfc6d24b10f89588096364cc726174118c62130c817c2837c60cf08a392b", size = 4176515, upload-time = "2026-04-01T14:44:51.353Z" }, - { url = "https://files.pythonhosted.org/packages/b0/63/703f86fd4c422a9cf722833670f4f71418fb116b2853ff7da722ea43f184/pillow-12.2.0-cp314-cp314-ios_13_0_x86_64_iphonesimulator.whl", hash = "sha256:6a6e67ea2e6feda684ed370f9a1c52e7a243631c025ba42149a2cc5934dec295", size = 3640159, upload-time = "2026-04-01T14:44:53.588Z" }, { url = "https://files.pythonhosted.org/packages/71/e0/fb22f797187d0be2270f83500aab851536101b254bfa1eae10795709d283/pillow-12.2.0-cp314-cp314-macosx_10_15_x86_64.whl", hash = "sha256:2bb4a8d594eacdfc59d9e5ad972aa8afdd48d584ffd5f13a937a664c3e7db0ed", size = 5312185, upload-time = "2026-04-01T14:44:56.039Z" }, { url = "https://files.pythonhosted.org/packages/ba/8c/1a9e46228571de18f8e28f16fabdfc20212a5d019f3e3303452b3f0a580d/pillow-12.2.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:80b2da48193b2f33ed0c32c38140f9d3186583ce7d516526d462645fd98660ae", size = 4695386, upload-time = "2026-04-01T14:44:58.663Z" }, { url = "https://files.pythonhosted.org/packages/70/62/98f6b7f0c88b9addd0e87c217ded307b36be024d4ff8869a812b241d1345/pillow-12.2.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.whl", hash = "sha256:22db17c68434de69d8ecfc2fe821569195c0c373b25cccb9cbdacf2c6e53c601", size = 
6280384, upload-time = "2026-04-01T14:45:01.5Z" }, @@ -1550,36 +1785,61 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/76/47/8ccf75935f51448ba9a16a71b783eb7ef6b9ee60f5d14c7f8a8a79fbeed7/propcache-0.4.1-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:cd547953428f7abb73c5ad82cbb32109566204260d98e41e5dfdc682eb7f8403", size = 46037, upload-time = "2025-10-08T19:46:47.23Z" }, { url = "https://files.pythonhosted.org/packages/0a/b6/5c9a0e42df4d00bfb4a3cbbe5cf9f54260300c88a0e9af1f47ca5ce17ac0/propcache-0.4.1-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:f048da1b4f243fc44f205dfd320933a951b8d89e0afd4c7cacc762a8b9165207", size = 47324, upload-time = "2025-10-08T19:46:48.384Z" }, { url = "https://files.pythonhosted.org/packages/9e/d3/6c7ee328b39a81ee877c962469f1e795f9db87f925251efeb0545e0020d0/propcache-0.4.1-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:ec17c65562a827bba85e3872ead335f95405ea1674860d96483a02f5c698fa72", size = 225505, upload-time = "2025-10-08T19:46:50.055Z" }, + { url = "https://files.pythonhosted.org/packages/01/5d/1c53f4563490b1d06a684742cc6076ef944bc6457df6051b7d1a877c057b/propcache-0.4.1-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:405aac25c6394ef275dee4c709be43745d36674b223ba4eb7144bf4d691b7367", size = 230242, upload-time = "2025-10-08T19:46:51.815Z" }, + { url = "https://files.pythonhosted.org/packages/20/e1/ce4620633b0e2422207c3cb774a0ee61cac13abc6217763a7b9e2e3f4a12/propcache-0.4.1-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:0013cb6f8dde4b2a2f66903b8ba740bdfe378c943c4377a200551ceb27f379e4", size = 238474, upload-time = "2025-10-08T19:46:53.208Z" }, { url = "https://files.pythonhosted.org/packages/46/4b/3aae6835b8e5f44ea6a68348ad90f78134047b503765087be2f9912140ea/propcache-0.4.1-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = 
"sha256:15932ab57837c3368b024473a525e25d316d8353016e7cc0e5ba9eb343fbb1cf", size = 221575, upload-time = "2025-10-08T19:46:54.511Z" }, { url = "https://files.pythonhosted.org/packages/6e/a5/8a5e8678bcc9d3a1a15b9a29165640d64762d424a16af543f00629c87338/propcache-0.4.1-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:031dce78b9dc099f4c29785d9cf5577a3faf9ebf74ecbd3c856a7b92768c3df3", size = 216736, upload-time = "2025-10-08T19:46:56.212Z" }, + { url = "https://files.pythonhosted.org/packages/f1/63/b7b215eddeac83ca1c6b934f89d09a625aa9ee4ba158338854c87210cc36/propcache-0.4.1-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:ab08df6c9a035bee56e31af99be621526bd237bea9f32def431c656b29e41778", size = 213019, upload-time = "2025-10-08T19:46:57.595Z" }, + { url = "https://files.pythonhosted.org/packages/57/74/f580099a58c8af587cac7ba19ee7cb418506342fbbe2d4a4401661cca886/propcache-0.4.1-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:4d7af63f9f93fe593afbf104c21b3b15868efb2c21d07d8732c0c4287e66b6a6", size = 220376, upload-time = "2025-10-08T19:46:59.067Z" }, + { url = "https://files.pythonhosted.org/packages/c4/ee/542f1313aff7eaf19c2bb758c5d0560d2683dac001a1c96d0774af799843/propcache-0.4.1-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:cfc27c945f422e8b5071b6e93169679e4eb5bf73bbcbf1ba3ae3a83d2f78ebd9", size = 226988, upload-time = "2025-10-08T19:47:00.544Z" }, { url = "https://files.pythonhosted.org/packages/8f/18/9c6b015dd9c6930f6ce2229e1f02fb35298b847f2087ea2b436a5bfa7287/propcache-0.4.1-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:35c3277624a080cc6ec6f847cbbbb5b49affa3598c4535a0a4682a697aaa5c75", size = 215615, upload-time = "2025-10-08T19:47:01.968Z" }, { url = "https://files.pythonhosted.org/packages/bf/df/6d9c1b6ac12b003837dde8a10231a7344512186e87b36e855bef32241942/propcache-0.4.1-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:43eedf29202c08550aac1d14e0ee619b0430aaef78f85864c1a892294fbc28cf", size = 77750, upload-time = 
"2025-10-08T19:47:07.648Z" }, { url = "https://files.pythonhosted.org/packages/8b/e8/677a0025e8a2acf07d3418a2e7ba529c9c33caf09d3c1f25513023c1db56/propcache-0.4.1-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:d62cdfcfd89ccb8de04e0eda998535c406bf5e060ffd56be6c586cbcc05b3311", size = 44780, upload-time = "2025-10-08T19:47:08.851Z" }, { url = "https://files.pythonhosted.org/packages/89/a4/92380f7ca60f99ebae761936bc48a72a639e8a47b29050615eef757cb2a7/propcache-0.4.1-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:cae65ad55793da34db5f54e4029b89d3b9b9490d8abe1b4c7ab5d4b8ec7ebf74", size = 46308, upload-time = "2025-10-08T19:47:09.982Z" }, { url = "https://files.pythonhosted.org/packages/2d/48/c5ac64dee5262044348d1d78a5f85dd1a57464a60d30daee946699963eb3/propcache-0.4.1-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:333ddb9031d2704a301ee3e506dc46b1fe5f294ec198ed6435ad5b6a085facfe", size = 208182, upload-time = "2025-10-08T19:47:11.319Z" }, + { url = "https://files.pythonhosted.org/packages/c6/0c/cd762dd011a9287389a6a3eb43aa30207bde253610cca06824aeabfe9653/propcache-0.4.1-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:fd0858c20f078a32cf55f7e81473d96dcf3b93fd2ccdb3d40fdf54b8573df3af", size = 211215, upload-time = "2025-10-08T19:47:13.146Z" }, + { url = "https://files.pythonhosted.org/packages/30/3e/49861e90233ba36890ae0ca4c660e95df565b2cd15d4a68556ab5865974e/propcache-0.4.1-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:678ae89ebc632c5c204c794f8dab2837c5f159aeb59e6ed0539500400577298c", size = 218112, upload-time = "2025-10-08T19:47:14.913Z" }, { url = "https://files.pythonhosted.org/packages/f1/8b/544bc867e24e1bd48f3118cecd3b05c694e160a168478fa28770f22fd094/propcache-0.4.1-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = 
"sha256:d472aeb4fbf9865e0c6d622d7f4d54a4e101a89715d8904282bb5f9a2f476c3f", size = 204442, upload-time = "2025-10-08T19:47:16.277Z" }, { url = "https://files.pythonhosted.org/packages/50/a6/4282772fd016a76d3e5c0df58380a5ea64900afd836cec2c2f662d1b9bb3/propcache-0.4.1-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:4d3df5fa7e36b3225954fba85589da77a0fe6a53e3976de39caf04a0db4c36f1", size = 199398, upload-time = "2025-10-08T19:47:17.962Z" }, + { url = "https://files.pythonhosted.org/packages/3e/ec/d8a7cd406ee1ddb705db2139f8a10a8a427100347bd698e7014351c7af09/propcache-0.4.1-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:ee17f18d2498f2673e432faaa71698032b0127ebf23ae5974eeaf806c279df24", size = 196920, upload-time = "2025-10-08T19:47:19.355Z" }, + { url = "https://files.pythonhosted.org/packages/f6/6c/f38ab64af3764f431e359f8baf9e0a21013e24329e8b85d2da32e8ed07ca/propcache-0.4.1-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:580e97762b950f993ae618e167e7be9256b8353c2dcd8b99ec100eb50f5286aa", size = 203748, upload-time = "2025-10-08T19:47:21.338Z" }, + { url = "https://files.pythonhosted.org/packages/d6/e3/fa846bd70f6534d647886621388f0a265254d30e3ce47e5c8e6e27dbf153/propcache-0.4.1-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:501d20b891688eb8e7aa903021f0b72d5a55db40ffaab27edefd1027caaafa61", size = 205877, upload-time = "2025-10-08T19:47:23.059Z" }, { url = "https://files.pythonhosted.org/packages/e2/39/8163fc6f3133fea7b5f2827e8eba2029a0277ab2c5beee6c1db7b10fc23d/propcache-0.4.1-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:9a0bd56e5b100aef69bd8562b74b46254e7c8812918d3baa700c8a8009b0af66", size = 199437, upload-time = "2025-10-08T19:47:24.445Z" }, { url = "https://files.pythonhosted.org/packages/83/ce/a31bbdfc24ee0dcbba458c8175ed26089cf109a55bbe7b7640ed2470cfe9/propcache-0.4.1-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:92d1935ee1f8d7442da9c0c4fa7ac20d07e94064184811b685f5c4fada64553b", size = 81451, upload-time = 
"2025-10-08T19:47:29.445Z" }, { url = "https://files.pythonhosted.org/packages/25/9c/442a45a470a68456e710d96cacd3573ef26a1d0a60067e6a7d5e655621ed/propcache-0.4.1-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:473c61b39e1460d386479b9b2f337da492042447c9b685f28be4f74d3529e566", size = 46374, upload-time = "2025-10-08T19:47:30.579Z" }, { url = "https://files.pythonhosted.org/packages/f4/bf/b1d5e21dbc3b2e889ea4327044fb16312a736d97640fb8b6aa3f9c7b3b65/propcache-0.4.1-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:c0ef0aaafc66fbd87842a3fe3902fd889825646bc21149eafe47be6072725835", size = 48396, upload-time = "2025-10-08T19:47:31.79Z" }, { url = "https://files.pythonhosted.org/packages/f4/04/5b4c54a103d480e978d3c8a76073502b18db0c4bc17ab91b3cb5092ad949/propcache-0.4.1-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:f95393b4d66bfae908c3ca8d169d5f79cd65636ae15b5e7a4f6e67af675adb0e", size = 275950, upload-time = "2025-10-08T19:47:33.481Z" }, + { url = "https://files.pythonhosted.org/packages/b4/c1/86f846827fb969c4b78b0af79bba1d1ea2156492e1b83dea8b8a6ae27395/propcache-0.4.1-cp313-cp313t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:c07fda85708bc48578467e85099645167a955ba093be0a2dcba962195676e859", size = 273856, upload-time = "2025-10-08T19:47:34.906Z" }, + { url = "https://files.pythonhosted.org/packages/36/1d/fc272a63c8d3bbad6878c336c7a7dea15e8f2d23a544bda43205dfa83ada/propcache-0.4.1-cp313-cp313t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:af223b406d6d000830c6f65f1e6431783fc3f713ba3e6cc8c024d5ee96170a4b", size = 280420, upload-time = "2025-10-08T19:47:36.338Z" }, { url = "https://files.pythonhosted.org/packages/07/0c/01f2219d39f7e53d52e5173bcb09c976609ba30209912a0680adfb8c593a/propcache-0.4.1-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = 
"sha256:a78372c932c90ee474559c5ddfffd718238e8673c340dc21fe45c5b8b54559a0", size = 263254, upload-time = "2025-10-08T19:47:37.692Z" }, { url = "https://files.pythonhosted.org/packages/2d/18/cd28081658ce597898f0c4d174d4d0f3c5b6d4dc27ffafeef835c95eb359/propcache-0.4.1-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:564d9f0d4d9509e1a870c920a89b2fec951b44bf5ba7d537a9e7c1ccec2c18af", size = 261205, upload-time = "2025-10-08T19:47:39.659Z" }, + { url = "https://files.pythonhosted.org/packages/7a/71/1f9e22eb8b8316701c2a19fa1f388c8a3185082607da8e406a803c9b954e/propcache-0.4.1-cp313-cp313t-musllinux_1_2_armv7l.whl", hash = "sha256:17612831fda0138059cc5546f4d12a2aacfb9e47068c06af35c400ba58ba7393", size = 247873, upload-time = "2025-10-08T19:47:41.084Z" }, + { url = "https://files.pythonhosted.org/packages/4a/65/3d4b61f36af2b4eddba9def857959f1016a51066b4f1ce348e0cf7881f58/propcache-0.4.1-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:41a89040cb10bd345b3c1a873b2bf36413d48da1def52f268a055f7398514874", size = 262739, upload-time = "2025-10-08T19:47:42.51Z" }, + { url = "https://files.pythonhosted.org/packages/2a/42/26746ab087faa77c1c68079b228810436ccd9a5ce9ac85e2b7307195fd06/propcache-0.4.1-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:e35b88984e7fa64aacecea39236cee32dd9bd8c55f57ba8a75cf2399553f9bd7", size = 263514, upload-time = "2025-10-08T19:47:43.927Z" }, { url = "https://files.pythonhosted.org/packages/94/13/630690fe201f5502d2403dd3cfd451ed8858fe3c738ee88d095ad2ff407b/propcache-0.4.1-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:6f8b465489f927b0df505cbe26ffbeed4d6d8a2bbc61ce90eb074ff129ef0ab1", size = 257781, upload-time = "2025-10-08T19:47:45.448Z" }, { url = "https://files.pythonhosted.org/packages/8e/5c/bca52d654a896f831b8256683457ceddd490ec18d9ec50e97dfd8fc726a8/propcache-0.4.1-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:3f7124c9d820ba5548d431afb4632301acf965db49e666aa21c305cbe8c6de12", size = 78152, upload-time = 
"2025-10-08T19:47:51.051Z" }, { url = "https://files.pythonhosted.org/packages/65/9b/03b04e7d82a5f54fb16113d839f5ea1ede58a61e90edf515f6577c66fa8f/propcache-0.4.1-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:c0d4b719b7da33599dfe3b22d3db1ef789210a0597bc650b7cee9c77c2be8c5c", size = 44869, upload-time = "2025-10-08T19:47:52.594Z" }, { url = "https://files.pythonhosted.org/packages/b2/fa/89a8ef0468d5833a23fff277b143d0573897cf75bd56670a6d28126c7d68/propcache-0.4.1-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:9f302f4783709a78240ebc311b793f123328716a60911d667e0c036bc5dcbded", size = 46596, upload-time = "2025-10-08T19:47:54.073Z" }, { url = "https://files.pythonhosted.org/packages/86/bd/47816020d337f4a746edc42fe8d53669965138f39ee117414c7d7a340cfe/propcache-0.4.1-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:c80ee5802e3fb9ea37938e7eecc307fb984837091d5fd262bb37238b1ae97641", size = 206981, upload-time = "2025-10-08T19:47:55.715Z" }, + { url = "https://files.pythonhosted.org/packages/df/f6/c5fa1357cc9748510ee55f37173eb31bfde6d94e98ccd9e6f033f2fc06e1/propcache-0.4.1-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:ed5a841e8bb29a55fb8159ed526b26adc5bdd7e8bd7bf793ce647cb08656cdf4", size = 211490, upload-time = "2025-10-08T19:47:57.499Z" }, + { url = "https://files.pythonhosted.org/packages/80/1e/e5889652a7c4a3846683401a48f0f2e5083ce0ec1a8a5221d8058fbd1adf/propcache-0.4.1-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:55c72fd6ea2da4c318e74ffdf93c4fe4e926051133657459131a95c846d16d44", size = 215371, upload-time = "2025-10-08T19:47:59.317Z" }, { url = "https://files.pythonhosted.org/packages/b2/f2/889ad4b2408f72fe1a4f6a19491177b30ea7bf1a0fd5f17050ca08cfc882/propcache-0.4.1-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = 
"sha256:8326e144341460402713f91df60ade3c999d601e7eb5ff8f6f7862d54de0610d", size = 201424, upload-time = "2025-10-08T19:48:00.67Z" }, { url = "https://files.pythonhosted.org/packages/27/73/033d63069b57b0812c8bd19f311faebeceb6ba31b8f32b73432d12a0b826/propcache-0.4.1-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:060b16ae65bc098da7f6d25bf359f1f31f688384858204fe5d652979e0015e5b", size = 197566, upload-time = "2025-10-08T19:48:02.604Z" }, + { url = "https://files.pythonhosted.org/packages/dc/89/ce24f3dc182630b4e07aa6d15f0ff4b14ed4b9955fae95a0b54c58d66c05/propcache-0.4.1-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:89eb3fa9524f7bec9de6e83cf3faed9d79bffa560672c118a96a171a6f55831e", size = 193130, upload-time = "2025-10-08T19:48:04.499Z" }, + { url = "https://files.pythonhosted.org/packages/a9/24/ef0d5fd1a811fb5c609278d0209c9f10c35f20581fcc16f818da959fc5b4/propcache-0.4.1-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:dee69d7015dc235f526fe80a9c90d65eb0039103fe565776250881731f06349f", size = 202625, upload-time = "2025-10-08T19:48:06.213Z" }, + { url = "https://files.pythonhosted.org/packages/f5/02/98ec20ff5546f68d673df2f7a69e8c0d076b5abd05ca882dc7ee3a83653d/propcache-0.4.1-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:5558992a00dfd54ccbc64a32726a3357ec93825a418a401f5cc67df0ac5d9e49", size = 204209, upload-time = "2025-10-08T19:48:08.432Z" }, { url = "https://files.pythonhosted.org/packages/a0/87/492694f76759b15f0467a2a93ab68d32859672b646aa8a04ce4864e7932d/propcache-0.4.1-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:c9b822a577f560fbd9554812526831712c1436d2c046cedee4c3796d3543b144", size = 197797, upload-time = "2025-10-08T19:48:09.968Z" }, { url = "https://files.pythonhosted.org/packages/99/85/9ff785d787ccf9bbb3f3106f79884a130951436f58392000231b4c737c80/propcache-0.4.1-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:824e908bce90fb2743bd6b59db36eb4f45cd350a39637c9f73b1c1ea66f5b75f", size = 81455, upload-time = 
"2025-10-08T19:48:15.16Z" }, { url = "https://files.pythonhosted.org/packages/90/85/2431c10c8e7ddb1445c1f7c4b54d886e8ad20e3c6307e7218f05922cad67/propcache-0.4.1-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:c2b5e7db5328427c57c8e8831abda175421b709672f6cfc3d630c3b7e2146393", size = 46372, upload-time = "2025-10-08T19:48:16.424Z" }, { url = "https://files.pythonhosted.org/packages/01/20/b0972d902472da9bcb683fa595099911f4d2e86e5683bcc45de60dd05dc3/propcache-0.4.1-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:6f6ff873ed40292cd4969ef5310179afd5db59fdf055897e282485043fc80ad0", size = 48411, upload-time = "2025-10-08T19:48:17.577Z" }, { url = "https://files.pythonhosted.org/packages/e2/e3/7dc89f4f21e8f99bad3d5ddb3a3389afcf9da4ac69e3deb2dcdc96e74169/propcache-0.4.1-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:49a2dc67c154db2c1463013594c458881a069fcf98940e61a0569016a583020a", size = 275712, upload-time = "2025-10-08T19:48:18.901Z" }, + { url = "https://files.pythonhosted.org/packages/20/67/89800c8352489b21a8047c773067644e3897f02ecbbd610f4d46b7f08612/propcache-0.4.1-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:005f08e6a0529984491e37d8dbc3dd86f84bd78a8ceb5fa9a021f4c48d4984be", size = 273557, upload-time = "2025-10-08T19:48:20.762Z" }, + { url = "https://files.pythonhosted.org/packages/e2/a1/b52b055c766a54ce6d9c16d9aca0cad8059acd9637cdf8aa0222f4a026ef/propcache-0.4.1-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:5c3310452e0d31390da9035c348633b43d7e7feb2e37be252be6da45abd1abcc", size = 280015, upload-time = "2025-10-08T19:48:22.592Z" }, { url = "https://files.pythonhosted.org/packages/48/c8/33cee30bd890672c63743049f3c9e4be087e6780906bfc3ec58528be59c1/propcache-0.4.1-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = 
"sha256:4c3c70630930447f9ef1caac7728c8ad1c56bc5015338b20fed0d08ea2480b3a", size = 262880, upload-time = "2025-10-08T19:48:23.947Z" }, { url = "https://files.pythonhosted.org/packages/0c/b1/8f08a143b204b418285c88b83d00edbd61afbc2c6415ffafc8905da7038b/propcache-0.4.1-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:8e57061305815dfc910a3634dcf584f08168a8836e6999983569f51a8544cd89", size = 260938, upload-time = "2025-10-08T19:48:25.656Z" }, + { url = "https://files.pythonhosted.org/packages/cf/12/96e4664c82ca2f31e1c8dff86afb867348979eb78d3cb8546a680287a1e9/propcache-0.4.1-cp314-cp314t-musllinux_1_2_armv7l.whl", hash = "sha256:521a463429ef54143092c11a77e04056dd00636f72e8c45b70aaa3140d639726", size = 247641, upload-time = "2025-10-08T19:48:27.207Z" }, + { url = "https://files.pythonhosted.org/packages/18/ed/e7a9cfca28133386ba52278136d42209d3125db08d0a6395f0cba0c0285c/propcache-0.4.1-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:120c964da3fdc75e3731aa392527136d4ad35868cc556fd09bb6d09172d9a367", size = 262510, upload-time = "2025-10-08T19:48:28.65Z" }, + { url = "https://files.pythonhosted.org/packages/f5/76/16d8bf65e8845dd62b4e2b57444ab81f07f40caa5652b8969b87ddcf2ef6/propcache-0.4.1-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:d8f353eb14ee3441ee844ade4277d560cdd68288838673273b978e3d6d2c8f36", size = 263161, upload-time = "2025-10-08T19:48:30.133Z" }, { url = "https://files.pythonhosted.org/packages/e7/70/c99e9edb5d91d5ad8a49fa3c1e8285ba64f1476782fed10ab251ff413ba1/propcache-0.4.1-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:ab2943be7c652f09638800905ee1bab2c544e537edb57d527997a24c13dc1455", size = 257393, upload-time = "2025-10-08T19:48:31.567Z" }, { url = "https://files.pythonhosted.org/packages/5b/5a/bc7b4a4ef808fa59a816c17b20c4bef6884daebbdf627ff2a161da67da19/propcache-0.4.1-py3-none-any.whl", hash = "sha256:af2a6052aeb6cf17d3e46ee169099044fd8224cbaf75c76a2ef596e8163e2237", size = 13305, upload-time = "2025-10-08T19:49:00.792Z" }, ] 
@@ -1592,6 +1852,7 @@ sdist = { url = "https://files.pythonhosted.org/packages/6b/6b/a0e95cad1ad7cc3f2 wheels = [ { url = "https://files.pythonhosted.org/packages/ec/11/3325d41e6ee15bf1125654301211247b042563bcc898784351252549a8ad/protobuf-7.34.1-cp310-abi3-macosx_10_9_universal2.whl", hash = "sha256:d8b2cc79c4d8f62b293ad9b11ec3aebce9af481fa73e64556969f7345ebf9fc7", size = 429247, upload-time = "2026-03-20T17:34:37.024Z" }, { url = "https://files.pythonhosted.org/packages/eb/9d/aa69df2724ff63efa6f72307b483ce0827f4347cc6d6df24b59e26659fef/protobuf-7.34.1-cp310-abi3-manylinux2014_aarch64.whl", hash = "sha256:5185e0e948d07abe94bb76ec9b8416b604cfe5da6f871d67aad30cbf24c3110b", size = 325753, upload-time = "2026-03-20T17:34:38.751Z" }, + { url = "https://files.pythonhosted.org/packages/92/e8/d174c91fd48e50101943f042b09af9029064810b734e4160bbe282fa1caa/protobuf-7.34.1-cp310-abi3-manylinux2014_s390x.whl", hash = "sha256:403b093a6e28a960372b44e5eb081775c9b056e816a8029c61231743d63f881a", size = 340198, upload-time = "2026-03-20T17:34:39.871Z" }, { url = "https://files.pythonhosted.org/packages/53/1b/3b431694a4dc6d37b9f653f0c64b0a0d9ec074ee810710c0c3da21d67ba7/protobuf-7.34.1-cp310-abi3-manylinux2014_x86_64.whl", hash = "sha256:8ff40ce8cd688f7265326b38d5a1bed9bfdf5e6723d49961432f83e21d5713e4", size = 324267, upload-time = "2026-03-20T17:34:41.1Z" }, { url = "https://files.pythonhosted.org/packages/88/95/608f665226bca68b736b79e457fded9a2a38c4f4379a4a7614303d9db3bc/protobuf-7.34.1-py3-none-any.whl", hash = "sha256:bb3812cd53aefea2b028ef42bd780f5b96407247f20c6ef7c679807e9d188f11", size = 170715, upload-time = "2026-03-20T17:34:45.384Z" }, ] @@ -1713,26 +1974,42 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/5f/5d/5f6c63eebb5afee93bcaae4ce9a898f3373ca23df3ccaef086d0233a35a7/pydantic_core-2.41.5-cp312-cp312-macosx_10_12_x86_64.whl", hash = "sha256:f41a7489d32336dbf2199c8c0a215390a751c5b014c2c1c5366e817202e9cdf7", size = 2110990, upload-time = 
"2025-11-04T13:39:58.079Z" }, { url = "https://files.pythonhosted.org/packages/aa/32/9c2e8ccb57c01111e0fd091f236c7b371c1bccea0fa85247ac55b1e2b6b6/pydantic_core-2.41.5-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:070259a8818988b9a84a449a2a7337c7f430a22acc0859c6b110aa7212a6d9c0", size = 1896003, upload-time = "2025-11-04T13:39:59.956Z" }, { url = "https://files.pythonhosted.org/packages/68/b8/a01b53cb0e59139fbc9e4fda3e9724ede8de279097179be4ff31f1abb65a/pydantic_core-2.41.5-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e96cea19e34778f8d59fe40775a7a574d95816eb150850a85a7a4c8f4b94ac69", size = 1919200, upload-time = "2025-11-04T13:40:02.241Z" }, + { url = "https://files.pythonhosted.org/packages/38/de/8c36b5198a29bdaade07b5985e80a233a5ac27137846f3bc2d3b40a47360/pydantic_core-2.41.5-cp312-cp312-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:ed2e99c456e3fadd05c991f8f437ef902e00eedf34320ba2b0842bd1c3ca3a75", size = 2052578, upload-time = "2025-11-04T13:40:04.401Z" }, + { url = "https://files.pythonhosted.org/packages/00/b5/0e8e4b5b081eac6cb3dbb7e60a65907549a1ce035a724368c330112adfdd/pydantic_core-2.41.5-cp312-cp312-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:65840751b72fbfd82c3c640cff9284545342a4f1eb1586ad0636955b261b0b05", size = 2208504, upload-time = "2025-11-04T13:40:06.072Z" }, + { url = "https://files.pythonhosted.org/packages/77/56/87a61aad59c7c5b9dc8caad5a41a5545cba3810c3e828708b3d7404f6cef/pydantic_core-2.41.5-cp312-cp312-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:e536c98a7626a98feb2d3eaf75944ef6f3dbee447e1f841eae16f2f0a72d8ddc", size = 2335816, upload-time = "2025-11-04T13:40:07.835Z" }, { url = "https://files.pythonhosted.org/packages/0d/76/941cc9f73529988688a665a5c0ecff1112b3d95ab48f81db5f7606f522d3/pydantic_core-2.41.5-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:eceb81a8d74f9267ef4081e246ffd6d129da5d87e37a77c9bde550cb04870c1c", size = 
2075366, upload-time = "2025-11-04T13:40:09.804Z" }, { url = "https://files.pythonhosted.org/packages/b1/87/41f3202e4193e3bacfc2c065fab7706ebe81af46a83d3e27605029c1f5a6/pydantic_core-2.41.5-cp312-cp312-musllinux_1_1_aarch64.whl", hash = "sha256:c23e27686783f60290e36827f9c626e63154b82b116d7fe9adba1fda36da706c", size = 2132603, upload-time = "2025-11-04T13:40:13.868Z" }, + { url = "https://files.pythonhosted.org/packages/49/7d/4c00df99cb12070b6bccdef4a195255e6020a550d572768d92cc54dba91a/pydantic_core-2.41.5-cp312-cp312-musllinux_1_1_armv7l.whl", hash = "sha256:482c982f814460eabe1d3bb0adfdc583387bd4691ef00b90575ca0d2b6fe2294", size = 2329591, upload-time = "2025-11-04T13:40:15.672Z" }, { url = "https://files.pythonhosted.org/packages/cc/6a/ebf4b1d65d458f3cda6a7335d141305dfa19bdc61140a884d165a8a1bbc7/pydantic_core-2.41.5-cp312-cp312-musllinux_1_1_x86_64.whl", hash = "sha256:bfea2a5f0b4d8d43adf9d7b8bf019fb46fdd10a2e5cde477fbcb9d1fa08c68e1", size = 2319068, upload-time = "2025-11-04T13:40:17.532Z" }, { url = "https://files.pythonhosted.org/packages/87/06/8806241ff1f70d9939f9af039c6c35f2360cf16e93c2ca76f184e76b1564/pydantic_core-2.41.5-cp313-cp313-macosx_10_12_x86_64.whl", hash = "sha256:941103c9be18ac8daf7b7adca8228f8ed6bb7a1849020f643b3a14d15b1924d9", size = 2120403, upload-time = "2025-11-04T13:40:25.248Z" }, { url = "https://files.pythonhosted.org/packages/94/02/abfa0e0bda67faa65fef1c84971c7e45928e108fe24333c81f3bfe35d5f5/pydantic_core-2.41.5-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:112e305c3314f40c93998e567879e887a3160bb8689ef3d2c04b6cc62c33ac34", size = 1896206, upload-time = "2025-11-04T13:40:27.099Z" }, { url = "https://files.pythonhosted.org/packages/15/df/a4c740c0943e93e6500f9eb23f4ca7ec9bf71b19e608ae5b579678c8d02f/pydantic_core-2.41.5-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:0cbaad15cb0c90aa221d43c00e77bb33c93e8d36e0bf74760cd00e732d10a6a0", size = 1919307, upload-time = "2025-11-04T13:40:29.806Z" }, + { url = 
"https://files.pythonhosted.org/packages/9a/e3/6324802931ae1d123528988e0e86587c2072ac2e5394b4bc2bc34b61ff6e/pydantic_core-2.41.5-cp313-cp313-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:03ca43e12fab6023fc79d28ca6b39b05f794ad08ec2feccc59a339b02f2b3d33", size = 2063258, upload-time = "2025-11-04T13:40:33.544Z" }, + { url = "https://files.pythonhosted.org/packages/c9/d4/2230d7151d4957dd79c3044ea26346c148c98fbf0ee6ebd41056f2d62ab5/pydantic_core-2.41.5-cp313-cp313-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:dc799088c08fa04e43144b164feb0c13f9a0bc40503f8df3e9fde58a3c0c101e", size = 2214917, upload-time = "2025-11-04T13:40:35.479Z" }, + { url = "https://files.pythonhosted.org/packages/e6/9f/eaac5df17a3672fef0081b6c1bb0b82b33ee89aa5cec0d7b05f52fd4a1fa/pydantic_core-2.41.5-cp313-cp313-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:97aeba56665b4c3235a0e52b2c2f5ae9cd071b8a8310ad27bddb3f7fb30e9aa2", size = 2332186, upload-time = "2025-11-04T13:40:37.436Z" }, { url = "https://files.pythonhosted.org/packages/cf/4e/35a80cae583a37cf15604b44240e45c05e04e86f9cfd766623149297e971/pydantic_core-2.41.5-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:406bf18d345822d6c21366031003612b9c77b3e29ffdb0f612367352aab7d586", size = 2073164, upload-time = "2025-11-04T13:40:40.289Z" }, { url = "https://files.pythonhosted.org/packages/75/c7/20bd7fc05f0c6ea2056a4565c6f36f8968c0924f19b7d97bbfea55780e73/pydantic_core-2.41.5-cp313-cp313-musllinux_1_1_aarch64.whl", hash = "sha256:01a3d0ab748ee531f4ea6c3e48ad9dac84ddba4b0d82291f87248f2f9de8d740", size = 2137788, upload-time = "2025-11-04T13:40:44.752Z" }, + { url = "https://files.pythonhosted.org/packages/3a/8d/34318ef985c45196e004bc46c6eab2eda437e744c124ef0dbe1ff2c9d06b/pydantic_core-2.41.5-cp313-cp313-musllinux_1_1_armv7l.whl", hash = "sha256:6561e94ba9dacc9c61bce40e2d6bdc3bfaa0259d3ff36ace3b1e6901936d2e3e", size = 2340133, upload-time = "2025-11-04T13:40:46.66Z" }, { url 
= "https://files.pythonhosted.org/packages/9c/59/013626bf8c78a5a5d9350d12e7697d3d4de951a75565496abd40ccd46bee/pydantic_core-2.41.5-cp313-cp313-musllinux_1_1_x86_64.whl", hash = "sha256:915c3d10f81bec3a74fbd4faebe8391013ba61e5a1a8d48c4455b923bdda7858", size = 2324852, upload-time = "2025-11-04T13:40:48.575Z" }, { url = "https://files.pythonhosted.org/packages/ea/28/46b7c5c9635ae96ea0fbb779e271a38129df2550f763937659ee6c5dbc65/pydantic_core-2.41.5-cp314-cp314-macosx_10_12_x86_64.whl", hash = "sha256:3f37a19d7ebcdd20b96485056ba9e8b304e27d9904d233d7b1015db320e51f0a", size = 2119622, upload-time = "2025-11-04T13:40:56.68Z" }, { url = "https://files.pythonhosted.org/packages/74/1a/145646e5687e8d9a1e8d09acb278c8535ebe9e972e1f162ed338a622f193/pydantic_core-2.41.5-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:1d1d9764366c73f996edd17abb6d9d7649a7eb690006ab6adbda117717099b14", size = 1891725, upload-time = "2025-11-04T13:40:58.807Z" }, { url = "https://files.pythonhosted.org/packages/23/04/e89c29e267b8060b40dca97bfc64a19b2a3cf99018167ea1677d96368273/pydantic_core-2.41.5-cp314-cp314-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:25e1c2af0fce638d5f1988b686f3b3ea8cd7de5f244ca147c777769e798a9cd1", size = 1915040, upload-time = "2025-11-04T13:41:00.853Z" }, + { url = "https://files.pythonhosted.org/packages/84/a3/15a82ac7bd97992a82257f777b3583d3e84bdb06ba6858f745daa2ec8a85/pydantic_core-2.41.5-cp314-cp314-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:506d766a8727beef16b7adaeb8ee6217c64fc813646b424d0804d67c16eddb66", size = 2063691, upload-time = "2025-11-04T13:41:03.504Z" }, + { url = "https://files.pythonhosted.org/packages/74/9b/0046701313c6ef08c0c1cf0e028c67c770a4e1275ca73131563c5f2a310a/pydantic_core-2.41.5-cp314-cp314-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:4819fa52133c9aa3c387b3328f25c1facc356491e6135b459f1de698ff64d869", size = 2213897, upload-time = "2025-11-04T13:41:05.804Z" }, + { url = 
"https://files.pythonhosted.org/packages/8a/cd/6bac76ecd1b27e75a95ca3a9a559c643b3afcd2dd62086d4b7a32a18b169/pydantic_core-2.41.5-cp314-cp314-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2b761d210c9ea91feda40d25b4efe82a1707da2ef62901466a42492c028553a2", size = 2333302, upload-time = "2025-11-04T13:41:07.809Z" }, { url = "https://files.pythonhosted.org/packages/4c/d2/ef2074dc020dd6e109611a8be4449b98cd25e1b9b8a303c2f0fca2f2bcf7/pydantic_core-2.41.5-cp314-cp314-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:22f0fb8c1c583a3b6f24df2470833b40207e907b90c928cc8d3594b76f874375", size = 2064877, upload-time = "2025-11-04T13:41:09.827Z" }, { url = "https://files.pythonhosted.org/packages/d3/9e/3ce66cebb929f3ced22be85d4c2399b8e85b622db77dad36b73c5387f8f8/pydantic_core-2.41.5-cp314-cp314-musllinux_1_1_aarch64.whl", hash = "sha256:0177272f88ab8312479336e1d777f6b124537d47f2123f89cb37e0accea97f90", size = 2138960, upload-time = "2025-11-04T13:41:14.627Z" }, + { url = "https://files.pythonhosted.org/packages/a6/62/205a998f4327d2079326b01abee48e502ea739d174f0a89295c481a2272e/pydantic_core-2.41.5-cp314-cp314-musllinux_1_1_armv7l.whl", hash = "sha256:63510af5e38f8955b8ee5687740d6ebf7c2a0886d15a6d65c32814613681bc07", size = 2339102, upload-time = "2025-11-04T13:41:16.868Z" }, { url = "https://files.pythonhosted.org/packages/3c/0d/f05e79471e889d74d3d88f5bd20d0ed189ad94c2423d81ff8d0000aab4ff/pydantic_core-2.41.5-cp314-cp314-musllinux_1_1_x86_64.whl", hash = "sha256:e56ba91f47764cc14f1daacd723e3e82d1a89d783f0f5afe9c364b8bb491ccdb", size = 2326039, upload-time = "2025-11-04T13:41:18.934Z" }, { url = "https://files.pythonhosted.org/packages/92/ed/77542d0c51538e32e15afe7899d79efce4b81eee631d99850edc2f5e9349/pydantic_core-2.41.5-cp314-cp314t-macosx_10_12_x86_64.whl", hash = "sha256:8566def80554c3faa0e65ac30ab0932b9e3a5cd7f8323764303d468e5c37595a", size = 2120255, upload-time = "2025-11-04T13:41:28.569Z" }, { url = 
"https://files.pythonhosted.org/packages/bb/3d/6913dde84d5be21e284439676168b28d8bbba5600d838b9dca99de0fad71/pydantic_core-2.41.5-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:b80aa5095cd3109962a298ce14110ae16b8c1aece8b72f9dafe81cf597ad80b3", size = 1863760, upload-time = "2025-11-04T13:41:31.055Z" }, { url = "https://files.pythonhosted.org/packages/5a/f0/e5e6b99d4191da102f2b0eb9687aaa7f5bea5d9964071a84effc3e40f997/pydantic_core-2.41.5-cp314-cp314t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:3006c3dd9ba34b0c094c544c6006cc79e87d8612999f1a5d43b769b89181f23c", size = 1878092, upload-time = "2025-11-04T13:41:33.21Z" }, + { url = "https://files.pythonhosted.org/packages/71/48/36fb760642d568925953bcc8116455513d6e34c4beaa37544118c36aba6d/pydantic_core-2.41.5-cp314-cp314t-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:72f6c8b11857a856bcfa48c86f5368439f74453563f951e473514579d44aa612", size = 2053385, upload-time = "2025-11-04T13:41:35.508Z" }, + { url = "https://files.pythonhosted.org/packages/20/25/92dc684dd8eb75a234bc1c764b4210cf2646479d54b47bf46061657292a8/pydantic_core-2.41.5-cp314-cp314t-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:5cb1b2f9742240e4bb26b652a5aeb840aa4b417c7748b6f8387927bc6e45e40d", size = 2218832, upload-time = "2025-11-04T13:41:37.732Z" }, + { url = "https://files.pythonhosted.org/packages/e2/09/f53e0b05023d3e30357d82eb35835d0f6340ca344720a4599cd663dca599/pydantic_core-2.41.5-cp314-cp314t-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:bd3d54f38609ff308209bd43acea66061494157703364ae40c951f83ba99a1a9", size = 2327585, upload-time = "2025-11-04T13:41:40Z" }, { url = "https://files.pythonhosted.org/packages/aa/4e/2ae1aa85d6af35a39b236b1b1641de73f5a6ac4d5a7509f77b814885760c/pydantic_core-2.41.5-cp314-cp314t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:2ff4321e56e879ee8d2a879501c8e469414d948f4aba74a2d4593184eb326660", size = 2041078, upload-time = 
"2025-11-04T13:41:42.323Z" }, { url = "https://files.pythonhosted.org/packages/02/7a/f999a6dcbcd0e5660bc348a3991c8915ce6599f4f2c6ac22f01d7a10816c/pydantic_core-2.41.5-cp314-cp314t-musllinux_1_1_aarch64.whl", hash = "sha256:a39455728aabd58ceabb03c90e12f71fd30fa69615760a075b9fec596456ccc3", size = 2129560, upload-time = "2025-11-04T13:41:47.474Z" }, + { url = "https://files.pythonhosted.org/packages/3a/b1/6c990ac65e3b4c079a4fb9f5b05f5b013afa0f4ed6780a3dd236d2cbdc64/pydantic_core-2.41.5-cp314-cp314t-musllinux_1_1_armv7l.whl", hash = "sha256:239edca560d05757817c13dc17c50766136d21f7cd0fac50295499ae24f90fdf", size = 2329244, upload-time = "2025-11-04T13:41:49.992Z" }, { url = "https://files.pythonhosted.org/packages/d9/02/3c562f3a51afd4d88fff8dffb1771b30cfdfd79befd9883ee094f5b6c0d8/pydantic_core-2.41.5-cp314-cp314t-musllinux_1_1_x86_64.whl", hash = "sha256:2a5e06546e19f24c6a96a129142a75cee553cc018ffee48a460059b1185f4470", size = 2331955, upload-time = "2025-11-04T13:41:54.079Z" }, { url = "https://files.pythonhosted.org/packages/09/32/59b0c7e63e277fa7911c2fc70ccfb45ce4b98991e7ef37110663437005af/pydantic_core-2.41.5-graalpy312-graalpy250_312_native-macosx_10_12_x86_64.whl", hash = "sha256:7da7087d756b19037bc2c06edc6c170eeef3c3bafcb8f532ff17d64dc427adfd", size = 2110495, upload-time = "2025-11-04T13:42:49.689Z" }, { url = "https://files.pythonhosted.org/packages/aa/81/05e400037eaf55ad400bcd318c05bb345b57e708887f07ddb2d20e3f0e98/pydantic_core-2.41.5-graalpy312-graalpy250_312_native-macosx_11_0_arm64.whl", hash = "sha256:aabf5777b5c8ca26f7824cb4a120a740c9588ed58df9b2d196ce92fba42ff8dc", size = 1915388, upload-time = "2025-11-04T13:42:52.215Z" }, @@ -1890,24 +2167,28 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/d1/33/422b98d2195232ca1826284a76852ad5a86fe23e31b009c9886b2d0fb8b2/pyyaml-6.0.3-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:7f047e29dcae44602496db43be01ad42fc6f1cc0d8cd6c83d342306c32270196", size = 182063, upload-time = 
"2025-09-25T21:32:11.445Z" }, { url = "https://files.pythonhosted.org/packages/89/a0/6cf41a19a1f2f3feab0e9c0b74134aa2ce6849093d5517a0c550fe37a648/pyyaml-6.0.3-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:fc09d0aa354569bc501d4e787133afc08552722d3ab34836a80547331bb5d4a0", size = 173973, upload-time = "2025-09-25T21:32:12.492Z" }, { url = "https://files.pythonhosted.org/packages/ed/23/7a778b6bd0b9a8039df8b1b1d80e2e2ad78aa04171592c8a5c43a56a6af4/pyyaml-6.0.3-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:9149cad251584d5fb4981be1ecde53a1ca46c891a79788c0df828d2f166bda28", size = 775116, upload-time = "2025-09-25T21:32:13.652Z" }, + { url = "https://files.pythonhosted.org/packages/65/30/d7353c338e12baef4ecc1b09e877c1970bd3382789c159b4f89d6a70dc09/pyyaml-6.0.3-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:5fdec68f91a0c6739b380c83b951e2c72ac0197ace422360e6d5a959d8d97b2c", size = 844011, upload-time = "2025-09-25T21:32:15.21Z" }, { url = "https://files.pythonhosted.org/packages/8b/9d/b3589d3877982d4f2329302ef98a8026e7f4443c765c46cfecc8858c6b4b/pyyaml-6.0.3-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:ba1cc08a7ccde2d2ec775841541641e4548226580ab850948cbfda66a1befcdc", size = 807870, upload-time = "2025-09-25T21:32:16.431Z" }, { url = "https://files.pythonhosted.org/packages/05/c0/b3be26a015601b822b97d9149ff8cb5ead58c66f981e04fedf4e762f4bd4/pyyaml-6.0.3-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:8dc52c23056b9ddd46818a57b78404882310fb473d63f17b07d5c40421e47f8e", size = 761089, upload-time = "2025-09-25T21:32:17.56Z" }, { url = "https://files.pythonhosted.org/packages/be/8e/98435a21d1d4b46590d5459a22d88128103f8da4c2d4cb8f14f2a96504e1/pyyaml-6.0.3-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:41715c910c881bc081f1e8872880d3c650acf13dfa8214bad49ed4cede7c34ea", size = 790181, upload-time = 
"2025-09-25T21:32:18.834Z" }, { url = "https://files.pythonhosted.org/packages/d1/11/0fd08f8192109f7169db964b5707a2f1e8b745d4e239b784a5a1dd80d1db/pyyaml-6.0.3-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:8da9669d359f02c0b91ccc01cac4a67f16afec0dac22c2ad09f46bee0697eba8", size = 181669, upload-time = "2025-09-25T21:32:23.673Z" }, { url = "https://files.pythonhosted.org/packages/b1/16/95309993f1d3748cd644e02e38b75d50cbc0d9561d21f390a76242ce073f/pyyaml-6.0.3-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:2283a07e2c21a2aa78d9c4442724ec1eb15f5e42a723b99cb3d822d48f5f7ad1", size = 173252, upload-time = "2025-09-25T21:32:25.149Z" }, { url = "https://files.pythonhosted.org/packages/50/31/b20f376d3f810b9b2371e72ef5adb33879b25edb7a6d072cb7ca0c486398/pyyaml-6.0.3-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:ee2922902c45ae8ccada2c5b501ab86c36525b883eff4255313a253a3160861c", size = 767081, upload-time = "2025-09-25T21:32:26.575Z" }, + { url = "https://files.pythonhosted.org/packages/49/1e/a55ca81e949270d5d4432fbbd19dfea5321eda7c41a849d443dc92fd1ff7/pyyaml-6.0.3-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:a33284e20b78bd4a18c8c2282d549d10bc8408a2a7ff57653c0cf0b9be0afce5", size = 841159, upload-time = "2025-09-25T21:32:27.727Z" }, { url = "https://files.pythonhosted.org/packages/74/27/e5b8f34d02d9995b80abcef563ea1f8b56d20134d8f4e5e81733b1feceb2/pyyaml-6.0.3-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:0f29edc409a6392443abf94b9cf89ce99889a1dd5376d94316ae5145dfedd5d6", size = 801626, upload-time = "2025-09-25T21:32:28.878Z" }, { url = "https://files.pythonhosted.org/packages/f9/11/ba845c23988798f40e52ba45f34849aa8a1f2d4af4b798588010792ebad6/pyyaml-6.0.3-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:f7057c9a337546edc7973c0d3ba84ddcdf0daa14533c2065749c9075001090e6", size = 753613, upload-time = 
"2025-09-25T21:32:30.178Z" }, { url = "https://files.pythonhosted.org/packages/3d/e0/7966e1a7bfc0a45bf0a7fb6b98ea03fc9b8d84fa7f2229e9659680b69ee3/pyyaml-6.0.3-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:eda16858a3cab07b80edaf74336ece1f986ba330fdb8ee0d6c0d68fe82bc96be", size = 794115, upload-time = "2025-09-25T21:32:31.353Z" }, { url = "https://files.pythonhosted.org/packages/9d/8c/f4bd7f6465179953d3ac9bc44ac1a8a3e6122cf8ada906b4f96c60172d43/pyyaml-6.0.3-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:8d1fab6bb153a416f9aeb4b8763bc0f22a5586065f86f7664fc23339fc1c1fac", size = 181814, upload-time = "2025-09-25T21:32:35.712Z" }, { url = "https://files.pythonhosted.org/packages/bd/9c/4d95bb87eb2063d20db7b60faa3840c1b18025517ae857371c4dd55a6b3a/pyyaml-6.0.3-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:34d5fcd24b8445fadc33f9cf348c1047101756fd760b4dacb5c3e99755703310", size = 173809, upload-time = "2025-09-25T21:32:36.789Z" }, { url = "https://files.pythonhosted.org/packages/92/b5/47e807c2623074914e29dabd16cbbdd4bf5e9b2db9f8090fa64411fc5382/pyyaml-6.0.3-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:501a031947e3a9025ed4405a168e6ef5ae3126c59f90ce0cd6f2bfc477be31b7", size = 766454, upload-time = "2025-09-25T21:32:37.966Z" }, + { url = "https://files.pythonhosted.org/packages/02/9e/e5e9b168be58564121efb3de6859c452fccde0ab093d8438905899a3a483/pyyaml-6.0.3-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:b3bc83488de33889877a0f2543ade9f70c67d66d9ebb4ac959502e12de895788", size = 836355, upload-time = "2025-09-25T21:32:39.178Z" }, { url = "https://files.pythonhosted.org/packages/88/f9/16491d7ed2a919954993e48aa941b200f38040928474c9e85ea9e64222c3/pyyaml-6.0.3-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:c458b6d084f9b935061bc36216e8a69a7e293a2f1e68bf956dcd9e6cbcd143f5", size = 794175, upload-time = 
"2025-09-25T21:32:40.865Z" }, { url = "https://files.pythonhosted.org/packages/dd/3f/5989debef34dc6397317802b527dbbafb2b4760878a53d4166579111411e/pyyaml-6.0.3-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:7c6610def4f163542a622a73fb39f534f8c101d690126992300bf3207eab9764", size = 755228, upload-time = "2025-09-25T21:32:42.084Z" }, { url = "https://files.pythonhosted.org/packages/d7/ce/af88a49043cd2e265be63d083fc75b27b6ed062f5f9fd6cdc223ad62f03e/pyyaml-6.0.3-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:5190d403f121660ce8d1d2c1bb2ef1bd05b5f68533fc5c2ea899bd15f4399b35", size = 789194, upload-time = "2025-09-25T21:32:43.362Z" }, { url = "https://files.pythonhosted.org/packages/7c/f9/07dd09ae774e4616edf6cda684ee78f97777bdd15847253637a6f052a62f/pyyaml-6.0.3-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:02893d100e99e03eda1c8fd5c441d8c60103fd175728e23e431db1b589cf5ab3", size = 189108, upload-time = "2025-09-25T21:32:44.377Z" }, { url = "https://files.pythonhosted.org/packages/4e/78/8d08c9fb7ce09ad8c38ad533c1191cf27f7ae1effe5bb9400a46d9437fcf/pyyaml-6.0.3-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:c1ff362665ae507275af2853520967820d9124984e0f7466736aea23d8611fba", size = 183641, upload-time = "2025-09-25T21:32:45.407Z" }, { url = "https://files.pythonhosted.org/packages/7b/5b/3babb19104a46945cf816d047db2788bcaf8c94527a805610b0289a01c6b/pyyaml-6.0.3-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:6adc77889b628398debc7b65c073bcb99c4a0237b248cacaf3fe8a557563ef6c", size = 831901, upload-time = "2025-09-25T21:32:48.83Z" }, + { url = "https://files.pythonhosted.org/packages/8b/cc/dff0684d8dc44da4d22a13f35f073d558c268780ce3c6ba1b87055bb0b87/pyyaml-6.0.3-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:a80cb027f6b349846a3bf6d73b5e95e782175e52f22108cfa17876aaeff93702", size = 861132, upload-time = "2025-09-25T21:32:50.149Z" }, { url = 
"https://files.pythonhosted.org/packages/b1/5e/f77dc6b9036943e285ba76b49e118d9ea929885becb0a29ba8a7c75e29fe/pyyaml-6.0.3-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:00c4bdeba853cc34e7dd471f16b4114f4162dc03e6b7afcc2128711f0eca823c", size = 839261, upload-time = "2025-09-25T21:32:51.808Z" }, { url = "https://files.pythonhosted.org/packages/ce/88/a9db1376aa2a228197c58b37302f284b5617f56a5d959fd1763fb1675ce6/pyyaml-6.0.3-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:66e1674c3ef6f541c35191caae2d429b967b99e02040f5ba928632d9a7f0f065", size = 805272, upload-time = "2025-09-25T21:32:52.941Z" }, { url = "https://files.pythonhosted.org/packages/da/92/1446574745d74df0c92e6aa4a7b0b3130706a4142b2d1a5869f2eaa423c6/pyyaml-6.0.3-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:16249ee61e95f858e83976573de0f5b2893b3677ba71c9dd36b9cf8be9ac6d65", size = 829923, upload-time = "2025-09-25T21:32:54.537Z" }, @@ -1927,6 +2208,7 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/f8/9b/c108cdb55560eaf253f0cbdb61b29971e9fb34d9c3499b0e96e4e60ed8a5/pyzmq-27.1.0-cp312-abi3-manylinux_2_26_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:43ad9a73e3da1fab5b0e7e13402f0b2fb934ae1c876c51d0afff0e7c052eca31", size = 840995, upload-time = "2025-09-08T23:08:08.396Z" }, { url = "https://files.pythonhosted.org/packages/c2/bb/b79798ca177b9eb0825b4c9998c6af8cd2a7f15a6a1a4272c1d1a21d382f/pyzmq-27.1.0-cp312-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:0de3028d69d4cdc475bfe47a6128eb38d8bc0e8f4d69646adfbcd840facbac28", size = 1642070, upload-time = "2025-09-08T23:08:09.989Z" }, { url = "https://files.pythonhosted.org/packages/46/bd/2d45ad24f5f5ae7e8d01525eb76786fa7557136555cac7d929880519e33a/pyzmq-27.1.0-cp312-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:f30f395a9e6fbca195400ce833c731e7b64c3919aa481af4d88c3759e0cb7496", size = 1878550, upload-time = "2025-09-08T23:08:13.513Z" }, + { url = 
"https://files.pythonhosted.org/packages/dc/b6/94414759a69a26c3dd674570a81813c46a078767d931a6c70ad29fc585cb/pyzmq-27.1.0-cp313-cp313-android_24_x86_64.whl", hash = "sha256:fbb4f2400bfda24f12f009cba62ad5734148569ff4949b1b6ec3b519444342e6", size = 1156301, upload-time = "2025-09-08T23:08:22.47Z" }, { url = "https://files.pythonhosted.org/packages/a5/ad/15906493fd40c316377fd8a8f6b1f93104f97a752667763c9b9c1b71d42d/pyzmq-27.1.0-cp313-cp313t-macosx_10_15_universal2.whl", hash = "sha256:e343d067f7b151cfe4eb3bb796a7752c9d369eed007b91231e817071d2c2fec7", size = 1341197, upload-time = "2025-09-08T23:08:24.286Z" }, { url = "https://files.pythonhosted.org/packages/69/2d/d83dd6d7ca929a2fc67d2c3005415cdf322af7751d773524809f9e585129/pyzmq-27.1.0-cp313-cp313t-manylinux_2_26_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:d54530c8c8b5b8ddb3318f481297441af102517602b569146185fa10b63f4fa9", size = 660469, upload-time = "2025-09-08T23:08:27.623Z" }, { url = "https://files.pythonhosted.org/packages/3e/cd/9822a7af117f4bc0f1952dbe9ef8358eb50a24928efd5edf54210b850259/pyzmq-27.1.0-cp313-cp313t-manylinux_2_26_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:6f3afa12c392f0a44a2414056d730eebc33ec0926aae92b5ad5cf26ebb6cc128", size = 847961, upload-time = "2025-09-08T23:08:29.672Z" }, @@ -1949,36 +2231,66 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/92/0a/7dcffeebe0fcac45a1f9caf80712002d3cbd66d7d69d719315ee142b280f/regex-2026.3.32-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:3f5747501b69299c6b0b047853771e4ed390510bada68cb16da9c9c2078343f7", size = 292078, upload-time = "2026-03-28T21:46:29.789Z" }, { url = "https://files.pythonhosted.org/packages/e3/ec/988486058ef49eb931476419bae00f164c4ceb44787c45dc7a54b7de0ea4/regex-2026.3.32-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:db976be51375bca900e008941639448d148c655c9545071965d0571ecc04f5d0", size = 289786, upload-time = "2026-03-28T21:46:31.415Z" }, { url = 
"https://files.pythonhosted.org/packages/4a/cf/1955bb5567bc491bd63068e17f75ab0c9ff5e9d08466beec7e347f5e768d/regex-2026.3.32-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:66a5083c3ffe5a5a95f8281ea47a88072d4f24001d562d1d9d28d4cdc005fec5", size = 796431, upload-time = "2026-03-28T21:46:33.101Z" }, + { url = "https://files.pythonhosted.org/packages/27/8a/67fcbca511b792107540181ee0690df6de877bfbcb41b7ecae7028025ca5/regex-2026.3.32-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:e83ce8008b48762be296f1401f19afd9ea29f3d035d1974e0cecb74e9afbd1df", size = 865785, upload-time = "2026-03-28T21:46:35.053Z" }, + { url = "https://files.pythonhosted.org/packages/c2/59/0677bc44f2c28305edcabc11933777b9ad34e9e8ded7ba573d24e4bc3ee7/regex-2026.3.32-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:b3aa21bad31db904e0b9055e12c8282df62d43169c4a9d2929407060066ebc74", size = 913593, upload-time = "2026-03-28T21:46:36.835Z" }, { url = "https://files.pythonhosted.org/packages/0a/fe/661043d1c263b0d9d10c6ff4e9c9745f3df9641c62b51f96a3473638e7ce/regex-2026.3.32-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:f54840bea73541652f1170dc63402a5b776fc851ad36a842da9e5163c1f504a0", size = 801512, upload-time = "2026-03-28T21:46:38.587Z" }, + { url = "https://files.pythonhosted.org/packages/ff/27/74c986061380e1811a46cf04cdf9c939db9f8c0e63953eddfe37ffd633ea/regex-2026.3.32-cp312-cp312-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:2ffbadc647325dd4e3118269bda93ded1eb5f5b0c3b7ba79a3da9fbd04f248e9", size = 776182, upload-time = "2026-03-28T21:46:40.69Z" }, { url = "https://files.pythonhosted.org/packages/b6/c8/d833397b70cd1bacfcdc0a611f0e2c1f5b91fee8eedd88affcee770cbbb6/regex-2026.3.32-cp312-cp312-musllinux_1_2_aarch64.whl", hash = 
"sha256:66d3126afe7eac41759cd5f0b3b246598086e88e70527c0d68c9e615b81771c4", size = 785837, upload-time = "2026-03-28T21:46:42.926Z" }, + { url = "https://files.pythonhosted.org/packages/e0/53/fa226b72989b5b93db6926fab5478115e085dfcf077e18d2cb386be0fd23/regex-2026.3.32-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:f785f44a44702dea89b28bce5bc82552490694ce4e144e21a4f0545e364d2150", size = 860612, upload-time = "2026-03-28T21:46:44.8Z" }, + { url = "https://files.pythonhosted.org/packages/04/28/bdd2fc0c055a1b15702bd4084829bbb6b06095f27990e5bee52b2898ea03/regex-2026.3.32-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:b7836aa13721dbdef658aebd11f60d00de633a95726521860fe1f6be75fa225a", size = 765285, upload-time = "2026-03-28T21:46:46.625Z" }, + { url = "https://files.pythonhosted.org/packages/b4/da/21f5e2a35a191b27e5a47cccb3914c99e139b49b1342d3f36e64e8cc60f7/regex-2026.3.32-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:5336b1506142eb0f23c96fb4a34b37c4fefd4fed2a7042069f3c8058efe17855", size = 851963, upload-time = "2026-03-28T21:46:48.341Z" }, { url = "https://files.pythonhosted.org/packages/18/f4/04ed04ebf335a44083695c22772be6a42efa31900415555563acf02cb4de/regex-2026.3.32-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:b56993a7aeb4140c4770f4f7965c9e5af4f024457d06e23c01b0d47501cb18ed", size = 788332, upload-time = "2026-03-28T21:46:50.454Z" }, { url = "https://files.pythonhosted.org/packages/bd/ba/9c1819f302b42b5fbd4139ead6280e9ec37d19bbe33379df0039b2a57bb4/regex-2026.3.32-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:c6d9c6e783b348f719b6118bb3f187b2e138e3112576c9679eb458cc8b2e164b", size = 490394, upload-time = "2026-03-28T21:46:58.112Z" }, { url = "https://files.pythonhosted.org/packages/5b/0b/f62b0ce79eb83ca82fffea1736289d29bc24400355968301406789bcebd2/regex-2026.3.32-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:0f21ae18dfd15752cdd98d03cbd7a3640be826bfd58482a93f730dbd24d7b9fb", size = 291993, upload-time = 
"2026-03-28T21:47:00.198Z" }, { url = "https://files.pythonhosted.org/packages/e7/d8/ba0f8f81f88cd20c0b27acc123561ac5495ea33f800f0b8ebed2038b23eb/regex-2026.3.32-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:844d88509c968dd44b30daeefac72b038b1bf31ac372d5106358ab01d393c48b", size = 289618, upload-time = "2026-03-28T21:47:02.269Z" }, { url = "https://files.pythonhosted.org/packages/fd/0d/b47a0e68bc511c195ff129c0311a4cd79b954b8676193a9d03a97c623a91/regex-2026.3.32-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:8fc918cd003ba0d066bf0003deb05a259baaaab4dc9bd4f1207bbbe64224857a", size = 796427, upload-time = "2026-03-28T21:47:04.096Z" }, + { url = "https://files.pythonhosted.org/packages/51/d7/32b05aa8fde7789ba316533c0f30e87b6b5d38d6d7f8765eadc5aab84671/regex-2026.3.32-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:bbc458a292aee57d572075f22c035fa32969cdb7987d454e3e34d45a40a0a8b4", size = 865850, upload-time = "2026-03-28T21:47:05.982Z" }, + { url = "https://files.pythonhosted.org/packages/dc/67/828d8095501f237b83f630d4069eea8c0e5cb6a204e859cf0b67c223ce12/regex-2026.3.32-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:987cdfcfb97a249abc3601ad53c7de5c370529f1981e4c8c46793e4a1e1bfe8e", size = 913578, upload-time = "2026-03-28T21:47:08.172Z" }, { url = "https://files.pythonhosted.org/packages/0f/f8/acf1eb80f58852e85bd39a6ddfa78ce2243ddc8de8da7582e6ba657da593/regex-2026.3.32-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:a5d88fa37ba5e8a80ca8d956b9ea03805cfa460223ac94b7d4854ee5e30f3173", size = 801536, upload-time = "2026-03-28T21:47:10.206Z" }, + { url = "https://files.pythonhosted.org/packages/9f/05/986cdf8d12693451f5889aaf4ea4f65b2c49b1152ae814fa1fb75439e40b/regex-2026.3.32-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = 
"sha256:4d082be64e51671dd5ee1c208c92da2ddda0f2f20d8ef387e57634f7e97b6aae", size = 776226, upload-time = "2026-03-28T21:47:12.891Z" }, { url = "https://files.pythonhosted.org/packages/32/02/945a6a2348ca1c6608cb1747275c8affd2ccd957d4885c25218a86377912/regex-2026.3.32-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:c1d7fa44aece1fa02b8927441614c96520253a5cad6a96994e3a81e060feed55", size = 785933, upload-time = "2026-03-28T21:47:14.795Z" }, + { url = "https://files.pythonhosted.org/packages/53/12/c5bab6cc679ad79a45427a98c4e70809586ac963c5ad54a9217533c4763e/regex-2026.3.32-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:d478a2ca902b6ef28ffc9521e5f0f728d036abe35c0b250ee8ae78cfe7c5e44e", size = 860671, upload-time = "2026-03-28T21:47:16.985Z" }, + { url = "https://files.pythonhosted.org/packages/bf/68/8d85f98c2443469facabef62b82b851d369b13f92bec2ca7a3808deaa47b/regex-2026.3.32-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:2820d2231885e97aff0fcf230a19ebd5d2b5b8a1ba338c20deb34f16db1c7897", size = 765335, upload-time = "2026-03-28T21:47:18.872Z" }, + { url = "https://files.pythonhosted.org/packages/89/a7/d8a9c270916107a501fca63b748547c6c77e570d19f16a29b557ce734f3d/regex-2026.3.32-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:fc8ced733d6cd9af5e412f256a32f7c61cd2d7371280a65c689939ac4572499f", size = 851913, upload-time = "2026-03-28T21:47:20.793Z" }, { url = "https://files.pythonhosted.org/packages/f4/8e/03d392b26679914ccf21f83d18ad4443232d2f8c3e2c30a962d4e3918d9c/regex-2026.3.32-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:847087abe98b3c1ebf1eb49d6ef320dbba75a83ee4f83c94704580f1df007dd4", size = 788447, upload-time = "2026-03-28T21:47:22.628Z" }, { url = "https://files.pythonhosted.org/packages/58/08/e38372da599dc1c39c599907ec535016d110034bd3701ce36554f59767ef/regex-2026.3.32-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:5d86e3fb08c94f084a625c8dc2132a79a3a111c8bf6e2bc59351fa61753c2f6e", size = 494495, upload-time = 
"2026-03-28T21:47:30.642Z" }, { url = "https://files.pythonhosted.org/packages/5f/27/6e29ece8c9ce01001ece1137fa21c8707529c2305b22828f63623b0eb262/regex-2026.3.32-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:b6f366a5ef66a2df4d9e68035cfe9f0eb8473cdfb922c37fac1d169b468607b0", size = 293988, upload-time = "2026-03-28T21:47:32.553Z" }, { url = "https://files.pythonhosted.org/packages/e1/98/8752e18bb87a2fe728b73b0f83c082eb162a470766063f8028759fb26844/regex-2026.3.32-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:b8fca73e16c49dd972ce3a88278dfa5b93bf91ddef332a46e9443abe21ca2f7c", size = 292634, upload-time = "2026-03-28T21:47:34.651Z" }, { url = "https://files.pythonhosted.org/packages/7f/7b/d7729fe294e23e9c7c3871cb69d49059fa7d65fd11e437a2cbea43f6615d/regex-2026.3.32-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:b953d9d496d19786f4d46e6ba4b386c6e493e81e40f9c5392332458183b0599d", size = 810532, upload-time = "2026-03-28T21:47:36.839Z" }, + { url = "https://files.pythonhosted.org/packages/fd/49/4dae7b000659f611b17b9c1541fba800b0569e4060debc4635ef1b23982c/regex-2026.3.32-cp313-cp313t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:b565f25171e04d4fad950d1fa837133e3af6ea6f509d96166eed745eb0cf63bc", size = 871919, upload-time = "2026-03-28T21:47:39.192Z" }, + { url = "https://files.pythonhosted.org/packages/83/85/aa8ad3977b9399861db3df62b33fe5fef6932ee23a1b9f4f357f58f2094b/regex-2026.3.32-cp313-cp313t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:f28eac18a8733a124444643a66ac96fef2c0ad65f50034e0a043b90333dc677f", size = 916550, upload-time = "2026-03-28T21:47:41.618Z" }, { url = "https://files.pythonhosted.org/packages/c8/c0/6379d7f5b59ff0656ba49cf666d5013ecee55e83245275b310b0ffc79143/regex-2026.3.32-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = 
"sha256:7cdd508664430dd51b8888deb6c5b416d8de046b2e11837254378d31febe4a98", size = 814988, upload-time = "2026-03-28T21:47:43.681Z" }, + { url = "https://files.pythonhosted.org/packages/2c/af/2dfddc64074bd9b70e27e170ee9db900542e2870210b489ad4471416ba86/regex-2026.3.32-cp313-cp313t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:5c35d097f509cf7e40d20d5bee548d35d6049b36eb9965e8d43e4659923405b9", size = 786337, upload-time = "2026-03-28T21:47:46.076Z" }, { url = "https://files.pythonhosted.org/packages/eb/2f/4eb8abd705236402b4fe0e130971634deffb1855e2028bf02a2b7c0e841c/regex-2026.3.32-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:85c9b0c131427470a6423baa0a9330be6fd8c3630cc3ee6fdee03360724cbec5", size = 800029, upload-time = "2026-03-28T21:47:48.356Z" }, + { url = "https://files.pythonhosted.org/packages/3e/2c/77d9ca2c9df483b51b4b1291c96d79c9ae301077841c4db39bc822f6b4c6/regex-2026.3.32-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:e50af656c15e2723eeb7279c0837e07accc594b95ec18b86821a4d44b51b24bf", size = 865843, upload-time = "2026-03-28T21:47:50.762Z" }, + { url = "https://files.pythonhosted.org/packages/48/10/306f477a509f4eed699071b1f031d89edd5a2b5fa28c8ede5b2638eaba82/regex-2026.3.32-cp313-cp313t-musllinux_1_2_riscv64.whl", hash = "sha256:4bc32b4dbdb4f9f300cf9f38f8ea2ce9511a068ffaa45ac1373ee7a943f1d810", size = 772473, upload-time = "2026-03-28T21:47:52.771Z" }, + { url = "https://files.pythonhosted.org/packages/f4/f6/54bd83ec46ac037de2beb049afc9dd5d2769c6ecaadf7856254ce610e62a/regex-2026.3.32-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:e3e5d1802cba785210a4a800e63fcee7a228649a880f3bf7f2aadccb151a834b", size = 856805, upload-time = "2026-03-28T21:47:55.04Z" }, { url = "https://files.pythonhosted.org/packages/37/e8/ee0e7d14de1fc6582d5782f072db6c61465a38a4142f88e175dda494b536/regex-2026.3.32-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:ef250a3f5e93182193f5c927c5e9575b2cb14b80d03e258bc0b89cc5de076b60", size = 
801875, upload-time = "2026-03-28T21:47:57.434Z" }, { url = "https://files.pythonhosted.org/packages/32/68/ff024bf6131b7446a791a636dbbb7fa732d586f33b276d84b3460ea49393/regex-2026.3.32-cp314-cp314-macosx_10_13_universal2.whl", hash = "sha256:a416ee898ecbc5d8b283223b4cf4d560f93244f6f7615c1bd67359744b00c166", size = 490430, upload-time = "2026-03-28T21:48:05.654Z" }, { url = "https://files.pythonhosted.org/packages/61/72/039d9164817ee298f2a2d0246001afe662241dcbec0eedd1fe03e2a2555e/regex-2026.3.32-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:d76d62909bfb14521c3f7cfd5b94c0c75ec94b0a11f647d2f604998962ec7b6c", size = 291948, upload-time = "2026-03-28T21:48:07.666Z" }, { url = "https://files.pythonhosted.org/packages/06/9d/77f684d90ffe3e99b828d3cabb87a0f1601d2b9decd1333ff345809b1d02/regex-2026.3.32-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:631f7d95c83f42bccfe18946a38ad27ff6b6717fb4807e60cf24860b5eb277fc", size = 289786, upload-time = "2026-03-28T21:48:09.562Z" }, { url = "https://files.pythonhosted.org/packages/83/70/bd76069a0304e924682b2efd8683a01617a7e1da9b651af73039d8da76a4/regex-2026.3.32-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:12917c6c6813ffcdfb11680a04e4d63c5532b88cf089f844721c5f41f41a63ad", size = 796672, upload-time = "2026-03-28T21:48:11.568Z" }, + { url = "https://files.pythonhosted.org/packages/80/31/c2d7d9a5671e111a2c16d57e0cb03e1ce35b28a115901590528aa928bb5b/regex-2026.3.32-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:3e221b615f83b15887636fcb90ed21f1a19541366f8b7ba14ba1ad8304f4ded4", size = 866556, upload-time = "2026-03-28T21:48:14.081Z" }, + { url = "https://files.pythonhosted.org/packages/d7/b9/9921a31931d0bc3416ac30205471e0e2ed60dcbd16fc922bbd69b427322b/regex-2026.3.32-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = 
"sha256:4f9ae4755fa90f1dc2d0d393d572ebc134c0fe30fcfc0ab7e67c1db15f192041", size = 912787, upload-time = "2026-03-28T21:48:16.548Z" }, { url = "https://files.pythonhosted.org/packages/41/ab/2c1bc8ab99f63cdabdbc7823af8f4cfcd6ddbb2babf01861826c3f1ad44d/regex-2026.3.32-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:a094e9dcafedfb9d333db5cf880304946683f43a6582bb86688f123335122929", size = 800879, upload-time = "2026-03-28T21:48:18.971Z" }, + { url = "https://files.pythonhosted.org/packages/49/e5/0be716eb2c0b2ae3a439e44432534e82b2f81848af64cb21c0473ad8ae46/regex-2026.3.32-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:c1cecea3e477af105f32ef2119b8d895f297492e41d317e60d474bc4bffd62ff", size = 776332, upload-time = "2026-03-28T21:48:21.163Z" }, { url = "https://files.pythonhosted.org/packages/26/80/114a61bd25dec7d1070930eaef82aadf9b05961a37629e7cca7bc3fc2257/regex-2026.3.32-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:f26262900edd16272b6360014495e8d68379c6c6e95983f9b7b322dc928a1194", size = 786384, upload-time = "2026-03-28T21:48:23.277Z" }, + { url = "https://files.pythonhosted.org/packages/0c/78/be0a6531f8db426e8e60d6356aeef8e9cc3f541655a648c4968b63c87a88/regex-2026.3.32-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:1cb22fa9ee6a0acb22fc9aecce5f9995fe4d2426ed849357d499d62608fbd7f9", size = 861381, upload-time = "2026-03-28T21:48:25.371Z" }, + { url = "https://files.pythonhosted.org/packages/45/b1/e5076fbe45b8fb39672584b1b606d512f5bd3a43155be68a95f6b88c1fc5/regex-2026.3.32-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:9b9118a78e031a2e4709cd2fcc3028432e89b718db70073a8da574c249b5b249", size = 765434, upload-time = "2026-03-28T21:48:27.494Z" }, + { url = "https://files.pythonhosted.org/packages/a3/da/fd65d68b897f8b52b1390d20d776fa753582484724a9cb4f4c26de657ae5/regex-2026.3.32-cp314-cp314-musllinux_1_2_s390x.whl", hash = 
"sha256:b193ed199848aa96618cd5959c1582a0bf23cd698b0b900cb0ffe81b02c8659c", size = 851501, upload-time = "2026-03-28T21:48:29.884Z" }, { url = "https://files.pythonhosted.org/packages/e8/d6/1e9c991c32022a9312e9124cc974961b3a2501338de2cd1cce75a3612d7a/regex-2026.3.32-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:10fb2aaae1aaadf7d43c9f3c2450404253697bf8b9ce360bd5418d1d16292298", size = 788076, upload-time = "2026-03-28T21:48:32.025Z" }, { url = "https://files.pythonhosted.org/packages/30/6e/87caccd608837a1fa4f8c7edc48e206103452b9bbc94fc724fa39340e807/regex-2026.3.32-cp314-cp314t-macosx_10_13_universal2.whl", hash = "sha256:51fb7e26f91f9091fd8ec6a946f99b15d3bc3667cb5ddc73dd6cb2222dd4a1cc", size = 494506, upload-time = "2026-03-28T21:48:41.327Z" }, { url = "https://files.pythonhosted.org/packages/16/53/a922e6b24694d70bdd68fc3fd076950e15b1b418cff9d2cc362b3968d86f/regex-2026.3.32-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:51a93452034d671b0e21b883d48ea66c5d6a05620ee16a9d3f229e828568f3f0", size = 293986, upload-time = "2026-03-28T21:48:43.481Z" }, { url = "https://files.pythonhosted.org/packages/60/e4/0cb32203c1aebad0577fcd5b9af1fe764869e617d5234bc6a0ad284299ea/regex-2026.3.32-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:03c2ebd15ff51e7b13bb3dc28dd5ac18cd39e59ebb40430b14ae1a19e833cff1", size = 292677, upload-time = "2026-03-28T21:48:45.772Z" }, { url = "https://files.pythonhosted.org/packages/f0/f8/5006b70291469d4174dd66ad162802e2f68419c0f2a7952d0c76c1288cfa/regex-2026.3.32-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5bf2f3c2c5bd8360d335c7dcd4a9006cf1dabae063ee2558ee1b07bbc8a20d88", size = 810661, upload-time = "2026-03-28T21:48:48.147Z" }, + { url = "https://files.pythonhosted.org/packages/b2/9b/438763a20d22cd1f65f95c8f030dd25df2d80a941068a891d21a5f240456/regex-2026.3.32-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = 
"sha256:8a4a3189a99ecdd1c13f42513ab3fc7fa8311b38ba7596dd98537acb8cd9acc3", size = 872156, upload-time = "2026-03-28T21:48:50.739Z" }, + { url = "https://files.pythonhosted.org/packages/6c/5b/1341287887ac982ed9f5f60125e440513ffe354aa7e3681940495af7c12a/regex-2026.3.32-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:3c0bbfbd38506e1ea96a85da6782577f06239cb9fcf9696f1ea537c980c0680b", size = 916749, upload-time = "2026-03-28T21:48:53.57Z" }, { url = "https://files.pythonhosted.org/packages/42/e2/1d2b48b8e94debfffc6fefb84d2a86a178cc208652a1d6493d5f29821c70/regex-2026.3.32-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:8aaf8ee8f34b677f90742ca089b9c83d64bdc410528767273c816a863ed57327", size = 814788, upload-time = "2026-03-28T21:48:55.905Z" }, + { url = "https://files.pythonhosted.org/packages/a6/d9/7dacb34c43adaeb954518d851f3e5d3ce495ac00a9d6010e3b4b59917c4a/regex-2026.3.32-cp314-cp314t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:3ea568832eca219c2be1721afa073c1c9eb8f98a9733fdedd0a9747639fc22a5", size = 786594, upload-time = "2026-03-28T21:48:58.404Z" }, { url = "https://files.pythonhosted.org/packages/ea/72/28295068c92dbd6d3ce4fd22554345cf504e957cc57dadeda4a64fa86a57/regex-2026.3.32-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:8e4c8fa46aad1a11ae2f8fcd1c90b9d55e18925829ac0d98c5bb107f93351745", size = 800167, upload-time = "2026-03-28T21:49:01.226Z" }, + { url = "https://files.pythonhosted.org/packages/ca/17/b10745adeca5b8d52da050e7c746137f5d01dabc6dbbe6e8d9d821dc65c1/regex-2026.3.32-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:0cec365d44835b043d7b3266487797639d07d621bec9dc0ea224b00775797cc1", size = 865906, upload-time = "2026-03-28T21:49:03.484Z" }, + { url = "https://files.pythonhosted.org/packages/45/9d/1acbcce765044ac0c87f453f4876e0897f7a61c10315262f960184310798/regex-2026.3.32-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = 
"sha256:09e26cad1544d856da85881ad292797289e4406338afe98163f3db9f7fac816c", size = 772642, upload-time = "2026-03-28T21:49:06.811Z" }, + { url = "https://files.pythonhosted.org/packages/24/41/1ef8b4811355ad7b9d7579d3aeca00f18b7bc043ace26c8c609b9287346d/regex-2026.3.32-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:6062c4ef581a3e9e503dccf4e1b7f2d33fdc1c13ad510b287741ac73bc4c6b27", size = 856927, upload-time = "2026-03-28T21:49:09.373Z" }, { url = "https://files.pythonhosted.org/packages/97/b1/0dc1d361be80ec1b8b707ada041090181133a7a29d438e432260a4b26f9a/regex-2026.3.32-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:88ebc0783907468f17fca3d7821b30f9c21865a721144eb498cb0ff99a67bcac", size = 801910, upload-time = "2026-03-28T21:49:11.818Z" }, ] @@ -2038,11 +2350,17 @@ version = "0.15.8" source = { registry = "https://pypi.org/simple" } sdist = { url = "https://files.pythonhosted.org/packages/14/b0/73cf7550861e2b4824950b8b52eebdcc5adc792a00c514406556c5b80817/ruff-0.15.8.tar.gz", hash = "sha256:995f11f63597ee362130d1d5a327a87cb6f3f5eae3094c620bcc632329a4d26e", size = 4610921, upload-time = "2026-03-26T18:39:38.675Z" } wheels = [ + { url = "https://files.pythonhosted.org/packages/4a/92/c445b0cd6da6e7ae51e954939cb69f97e008dbe750cfca89b8cedc081be7/ruff-0.15.8-py3-none-linux_armv6l.whl", hash = "sha256:cbe05adeba76d58162762d6b239c9056f1a15a55bd4b346cfd21e26cd6ad7bc7", size = 10527394, upload-time = "2026-03-26T18:39:41.566Z" }, { url = "https://files.pythonhosted.org/packages/eb/92/f1c662784d149ad1414cae450b082cf736430c12ca78367f20f5ed569d65/ruff-0.15.8-py3-none-macosx_10_12_x86_64.whl", hash = "sha256:d3e3d0b6ba8dca1b7ef9ab80a28e840a20070c4b62e56d675c24f366ef330570", size = 10905693, upload-time = "2026-03-26T18:39:30.364Z" }, { url = "https://files.pythonhosted.org/packages/ca/f2/7a631a8af6d88bcef997eb1bf87cc3da158294c57044aafd3e17030613de/ruff-0.15.8-py3-none-macosx_11_0_arm64.whl", hash = 
"sha256:6ee3ae5c65a42f273f126686353f2e08ff29927b7b7e203b711514370d500de3", size = 10323044, upload-time = "2026-03-26T18:39:33.37Z" }, { url = "https://files.pythonhosted.org/packages/67/18/1bf38e20914a05e72ef3b9569b1d5c70a7ef26cd188d69e9ca8ef588d5bf/ruff-0.15.8-py3-none-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:fdce027ada77baa448077ccc6ebb2fa9c3c62fd110d8659d601cf2f475858d94", size = 10629135, upload-time = "2026-03-26T18:39:44.142Z" }, + { url = "https://files.pythonhosted.org/packages/d2/e9/138c150ff9af60556121623d41aba18b7b57d95ac032e177b6a53789d279/ruff-0.15.8-py3-none-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:12e617fc01a95e5821648a6df341d80456bd627bfab8a829f7cfc26a14a4b4a3", size = 10348041, upload-time = "2026-03-26T18:39:52.178Z" }, + { url = "https://files.pythonhosted.org/packages/10/11/6da2e538704e753c04e8d86b1fc55712fdbdcc266af1a1ece7a51fff0d10/ruff-0.15.8-py3-none-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:d910ae974b7a06a33a057cb87d2a10792a3b2b3b35e33d2699fdf63ec8f6b17a", size = 11951057, upload-time = "2026-03-26T18:39:19.18Z" }, + { url = "https://files.pythonhosted.org/packages/83/f0/c9208c5fd5101bf87002fed774ff25a96eea313d305f1e5d5744698dc314/ruff-0.15.8-py3-none-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:2033f963c43949d51e6fdccd3946633c6b37c484f5f98c3035f49c27395a8ab8", size = 11464613, upload-time = "2026-03-26T18:40:06.301Z" }, { url = "https://files.pythonhosted.org/packages/f8/22/d7f2fabdba4fae9f3b570e5605d5eb4500dcb7b770d3217dca4428484b17/ruff-0.15.8-py3-none-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:0f29b989a55572fb885b77464cf24af05500806ab4edf9a0fd8977f9759d85b1", size = 11257557, upload-time = "2026-03-26T18:39:57.972Z" }, + { url = "https://files.pythonhosted.org/packages/71/8c/382a9620038cf6906446b23ce8632ab8c0811b8f9d3e764f58bedd0c9a6f/ruff-0.15.8-py3-none-manylinux_2_31_riscv64.whl", hash = 
"sha256:ac51d486bf457cdc985a412fb1801b2dfd1bd8838372fc55de64b1510eff4bec", size = 11169440, upload-time = "2026-03-26T18:39:22.205Z" }, { url = "https://files.pythonhosted.org/packages/4d/0d/0994c802a7eaaf99380085e4e40c845f8e32a562e20a38ec06174b52ef24/ruff-0.15.8-py3-none-musllinux_1_2_aarch64.whl", hash = "sha256:c9861eb959edab053c10ad62c278835ee69ca527b6dcd72b47d5c1e5648964f6", size = 10605963, upload-time = "2026-03-26T18:39:46.682Z" }, + { url = "https://files.pythonhosted.org/packages/19/aa/d624b86f5b0aad7cef6bbf9cd47a6a02dfdc4f72c92a337d724e39c9d14b/ruff-0.15.8-py3-none-musllinux_1_2_armv7l.whl", hash = "sha256:8d9a5b8ea13f26ae90838afc33f91b547e61b794865374f114f349e9036835fb", size = 10357484, upload-time = "2026-03-26T18:39:49.176Z" }, { url = "https://files.pythonhosted.org/packages/f0/51/ab20b322f637b369383adc341d761eaaa0f0203d6b9a7421cd6e783d81b9/ruff-0.15.8-py3-none-musllinux_1_2_x86_64.whl", hash = "sha256:75e5cd06b1cf3f47a3996cfc999226b19aa92e7cce682dcd62f80d7035f98f49", size = 11345125, upload-time = "2026-03-26T18:39:27.799Z" }, ] @@ -2055,8 +2373,12 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/fa/47/aef6c06649039accf914afef490268e1067ed82be62bcfa5b7e886ad15e8/safetensors-0.7.0-cp38-abi3-macosx_10_12_x86_64.whl", hash = "sha256:c82f4d474cf725255d9e6acf17252991c3c8aac038d6ef363a4bf8be2f6db517", size = 467781, upload-time = "2025-11-19T15:18:35.84Z" }, { url = "https://files.pythonhosted.org/packages/e8/00/374c0c068e30cd31f1e1b46b4b5738168ec79e7689ca82ee93ddfea05109/safetensors-0.7.0-cp38-abi3-macosx_11_0_arm64.whl", hash = "sha256:94fd4858284736bb67a897a41608b5b0c2496c9bdb3bf2af1fa3409127f20d57", size = 447058, upload-time = "2025-11-19T15:18:34.416Z" }, { url = "https://files.pythonhosted.org/packages/f1/06/578ffed52c2296f93d7fd2d844cabfa92be51a587c38c8afbb8ae449ca89/safetensors-0.7.0-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:e07d91d0c92a31200f25351f4acb2bc6aff7f48094e13ebb1d0fb995b54b6542", size = 
491748, upload-time = "2025-11-19T15:18:09.79Z" }, + { url = "https://files.pythonhosted.org/packages/ae/33/1debbbb70e4791dde185edb9413d1fe01619255abb64b300157d7f15dddd/safetensors-0.7.0-cp38-abi3-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:8469155f4cb518bafb4acf4865e8bb9d6804110d2d9bdcaa78564b9fd841e104", size = 503881, upload-time = "2025-11-19T15:18:16.145Z" }, + { url = "https://files.pythonhosted.org/packages/8e/1c/40c2ca924d60792c3be509833df711b553c60effbd91da6f5284a83f7122/safetensors-0.7.0-cp38-abi3-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:54bef08bf00a2bff599982f6b08e8770e09cc012d7bba00783fc7ea38f1fb37d", size = 623463, upload-time = "2025-11-19T15:18:21.11Z" }, + { url = "https://files.pythonhosted.org/packages/9b/3a/13784a9364bd43b0d61eef4bea2845039bc2030458b16594a1bd787ae26e/safetensors-0.7.0-cp38-abi3-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:42cb091236206bb2016d245c377ed383aa7f78691748f3bb6ee1bfa51ae2ce6a", size = 532855, upload-time = "2025-11-19T15:18:25.719Z" }, { url = "https://files.pythonhosted.org/packages/a0/60/429e9b1cb3fc651937727befe258ea24122d9663e4d5709a48c9cbfceecb/safetensors-0.7.0-cp38-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:dac7252938f0696ddea46f5e855dd3138444e82236e3be475f54929f0c510d48", size = 507152, upload-time = "2025-11-19T15:18:33.023Z" }, { url = "https://files.pythonhosted.org/packages/06/87/d26d8407c44175d8ae164a95b5a62707fcc445f3c0c56108e37d98070a3d/safetensors-0.7.0-cp38-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:cdab83a366799fa730f90a4ebb563e494f28e9e92c4819e556152ad55e43591b", size = 674060, upload-time = "2025-11-19T15:18:37.211Z" }, + { url = "https://files.pythonhosted.org/packages/11/f5/57644a2ff08dc6325816ba7217e5095f17269dada2554b658442c66aed51/safetensors-0.7.0-cp38-abi3-musllinux_1_2_armv7l.whl", hash = "sha256:672132907fcad9f2aedcb705b2d7b3b93354a2aec1b2f706c4db852abe338f85", size = 771715, upload-time = 
"2025-11-19T15:18:38.689Z" }, { url = "https://files.pythonhosted.org/packages/4a/d8/0c8a7dc9b41dcac53c4cbf9df2b9c83e0e0097203de8b37a712b345c0be5/safetensors-0.7.0-cp38-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:b0f6d66c1c538d5a94a73aa9ddca8ccc4227e6c9ff555322ea40bdd142391dd4", size = 677368, upload-time = "2025-11-19T15:18:41.627Z" }, ] @@ -2144,6 +2466,15 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/7e/aa/553dbe4178b5f23eb28e59393dddd64186178b56b81d9b8d5c3ff1c28395/sentencepiece-0.2.1-cp314-cp314t-manylinux_2_27_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:733e59ff1794d26db706cd41fc2d7ca5f6c64a820709cb801dc0ea31780d64ab", size = 1387458, upload-time = "2025-08-12T07:00:44.56Z" }, ] +[[package]] +name = "setuptools" +version = "82.0.1" +source = { registry = "https://pypi.org/simple" } +sdist = { url = "https://files.pythonhosted.org/packages/4f/db/cfac1baf10650ab4d1c111714410d2fbb77ac5a616db26775db562c8fab2/setuptools-82.0.1.tar.gz", hash = "sha256:7d872682c5d01cfde07da7bccc7b65469d3dca203318515ada1de5eda35efbf9", size = 1152316, upload-time = "2026-03-09T12:47:17.221Z" } +wheels = [ + { url = "https://files.pythonhosted.org/packages/9d/76/f789f7a86709c6b087c5a2f52f911838cad707cc613162401badc665acfe/setuptools-82.0.1-py3-none-any.whl", hash = "sha256:a59e362652f08dcd477c78bb6e7bd9d80a7995bc73ce773050228a348ce2e5bb", size = 1006223, upload-time = "2026-03-09T12:47:15.026Z" }, +] + [[package]] name = "shellingham" version = "1.5.4" @@ -2347,8 +2678,12 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/92/97/5dbfabf04c7e348e655e907ed27913e03db0923abb5dfdd120d7b25630e1/tokenizers-0.22.2-cp39-abi3-macosx_10_12_x86_64.whl", hash = "sha256:544dd704ae7238755d790de45ba8da072e9af3eea688f698b137915ae959281c", size = 3100275, upload-time = "2026-01-05T10:41:02.158Z" }, { url = "https://files.pythonhosted.org/packages/2e/47/174dca0502ef88b28f1c9e06b73ce33500eedfac7a7692108aec220464e7/tokenizers-0.22.2-cp39-abi3-macosx_11_0_arm64.whl", 
hash = "sha256:1e418a55456beedca4621dbab65a318981467a2b188e982a23e117f115ce5001", size = 2981472, upload-time = "2026-01-05T10:41:00.276Z" }, { url = "https://files.pythonhosted.org/packages/d6/84/7990e799f1309a8b87af6b948f31edaa12a3ed22d11b352eaf4f4b2e5753/tokenizers-0.22.2-cp39-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2249487018adec45d6e3554c71d46eb39fa8ea67156c640f7513eb26f318cec7", size = 3290736, upload-time = "2026-01-05T10:40:32.165Z" }, + { url = "https://files.pythonhosted.org/packages/78/59/09d0d9ba94dcd5f4f1368d4858d24546b4bdc0231c2354aa31d6199f0399/tokenizers-0.22.2-cp39-abi3-manylinux_2_17_armv7l.manylinux2014_armv7l.whl", hash = "sha256:25b85325d0815e86e0bac263506dd114578953b7b53d7de09a6485e4a160a7dd", size = 3168835, upload-time = "2026-01-05T10:40:38.847Z" }, + { url = "https://files.pythonhosted.org/packages/e0/fa/89f4cb9e08df770b57adb96f8cbb7e22695a4cb6c2bd5f0c4f0ebcf33b66/tokenizers-0.22.2-cp39-abi3-manylinux_2_17_ppc64le.manylinux2014_ppc64le.whl", hash = "sha256:1c774b1276f71e1ef716e5486f21e76333464f47bece56bbd554485982a9e03e", size = 3724818, upload-time = "2026-01-05T10:40:44.507Z" }, + { url = "https://files.pythonhosted.org/packages/64/04/ca2363f0bfbe3b3d36e95bf67e56a4c88c8e3362b658e616d1ac185d47f2/tokenizers-0.22.2-cp39-abi3-manylinux_2_17_s390x.manylinux2014_s390x.whl", hash = "sha256:df6c4265b289083bf710dff49bc51ef252f9d5be33a45ee2bed151114a56207b", size = 3379195, upload-time = "2026-01-05T10:40:51.139Z" }, { url = "https://files.pythonhosted.org/packages/2e/76/932be4b50ef6ccedf9d3c6639b056a967a86258c6d9200643f01269211ca/tokenizers-0.22.2-cp39-abi3-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:369cc9fc8cc10cb24143873a0d95438bb8ee257bb80c71989e3ee290e8d72c67", size = 3274982, upload-time = "2026-01-05T10:40:58.331Z" }, { url = "https://files.pythonhosted.org/packages/1d/28/5f9f5a4cc211b69e89420980e483831bcc29dade307955cc9dc858a40f01/tokenizers-0.22.2-cp39-abi3-musllinux_1_2_aarch64.whl", 
hash = "sha256:29c30b83d8dcd061078b05ae0cb94d3c710555fbb44861139f9f83dcca3dc3e4", size = 9478245, upload-time = "2026-01-05T10:41:04.053Z" }, + { url = "https://files.pythonhosted.org/packages/6c/fb/66e2da4704d6aadebf8cb39f1d6d1957df667ab24cff2326b77cda0dcb85/tokenizers-0.22.2-cp39-abi3-musllinux_1_2_armv7l.whl", hash = "sha256:37ae80a28c1d3265bb1f22464c856bd23c02a05bb211e56d0c5301a435be6c1a", size = 9560069, upload-time = "2026-01-05T10:45:10.673Z" }, { url = "https://files.pythonhosted.org/packages/05/a1/d62dfe7376beaaf1394917e0f8e93ee5f67fea8fcf4107501db35996586b/tokenizers-0.22.2-cp39-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:38337540fbbddff8e999d59970f3c6f35a82de10053206a7562f1ea02d046fa5", size = 10033429, upload-time = "2026-01-05T10:45:14.333Z" }, ] @@ -2533,32 +2868,52 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/9a/07/d9412f3d7d462347e4511181dea65e47e0d0e16e26fbee2ea86a2aefb657/xxhash-3.6.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:01362c4331775398e7bb34e3ab403bc9ee9f7c497bc7dee6272114055277dd3c", size = 32744, upload-time = "2025-10-02T14:34:34.622Z" }, { url = "https://files.pythonhosted.org/packages/79/35/0429ee11d035fc33abe32dca1b2b69e8c18d236547b9a9b72c1929189b9a/xxhash-3.6.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:b7b2df81a23f8cb99656378e72501b2cb41b1827c0f5a86f87d6b06b69f9f204", size = 30816, upload-time = "2025-10-02T14:34:36.043Z" }, { url = "https://files.pythonhosted.org/packages/4c/ed/6224ba353690d73af7a3f1c7cdb1fc1b002e38f783cb991ae338e1eb3d79/xxhash-3.6.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:93f107c673bccf0d592cdba077dedaf52fe7f42dcd7676eba1f6d6f0c3efffd2", size = 212914, upload-time = "2025-10-02T14:34:38.6Z" }, + { url = "https://files.pythonhosted.org/packages/38/86/fb6b6130d8dd6b8942cc17ab4d90e223653a89aa32ad2776f8af7064ed13/xxhash-3.6.0-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = 
"sha256:2aa5ee3444c25b69813663c9f8067dcfaa2e126dc55e8dddf40f4d1c25d7effa", size = 212163, upload-time = "2025-10-02T14:34:39.872Z" }, + { url = "https://files.pythonhosted.org/packages/ee/dc/e84875682b0593e884ad73b2d40767b5790d417bde603cceb6878901d647/xxhash-3.6.0-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:f7f99123f0e1194fa59cc69ad46dbae2e07becec5df50a0509a808f90a0f03f0", size = 445411, upload-time = "2025-10-02T14:34:41.569Z" }, { url = "https://files.pythonhosted.org/packages/11/4f/426f91b96701ec2f37bb2b8cec664eff4f658a11f3fa9d94f0a887ea6d2b/xxhash-3.6.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:49e03e6fe2cac4a1bc64952dd250cf0dbc5ef4ebb7b8d96bce82e2de163c82a2", size = 193883, upload-time = "2025-10-02T14:34:43.249Z" }, { url = "https://files.pythonhosted.org/packages/53/5a/ddbb83eee8e28b778eacfc5a85c969673e4023cdeedcfcef61f36731610b/xxhash-3.6.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:bd17fede52a17a4f9a7bc4472a5867cb0b160deeb431795c0e4abe158bc784e9", size = 210392, upload-time = "2025-10-02T14:34:45.042Z" }, + { url = "https://files.pythonhosted.org/packages/58/ca/faa05ac19b3b622c7c9317ac3e23954187516298a091eb02c976d0d3dd45/xxhash-3.6.0-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:843b52f6d88071f87eba1631b684fcb4b2068cd2180a0224122fe4ef011a9374", size = 210655, upload-time = "2025-10-02T14:34:47.571Z" }, + { url = "https://files.pythonhosted.org/packages/d4/7a/06aa7482345480cc0cb597f5c875b11a82c3953f534394f620b0be2f700c/xxhash-3.6.0-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:7d14a6cfaf03b1b6f5f9790f76880601ccc7896aff7ab9cd8978a939c1eb7e0d", size = 414001, upload-time = "2025-10-02T14:34:49.273Z" }, { url = "https://files.pythonhosted.org/packages/23/07/63ffb386cd47029aa2916b3d2f454e6cc5b9f5c5ada3790377d5430084e7/xxhash-3.6.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = 
"sha256:418daf3db71e1413cfe211c2f9a528456936645c17f46b5204705581a45390ae", size = 191431, upload-time = "2025-10-02T14:34:50.798Z" }, { url = "https://files.pythonhosted.org/packages/33/76/35d05267ac82f53ae9b0e554da7c5e281ee61f3cad44c743f0fcd354f211/xxhash-3.6.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:599e64ba7f67472481ceb6ee80fa3bd828fd61ba59fb11475572cc5ee52b89ec", size = 32738, upload-time = "2025-10-02T14:34:55.839Z" }, { url = "https://files.pythonhosted.org/packages/31/a8/3fbce1cd96534a95e35d5120637bf29b0d7f5d8fa2f6374e31b4156dd419/xxhash-3.6.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:7d8b8aaa30fca4f16f0c84a5c8d7ddee0e25250ec2796c973775373257dde8f1", size = 30821, upload-time = "2025-10-02T14:34:57.219Z" }, { url = "https://files.pythonhosted.org/packages/ba/0c/71435dcb99874b09a43b8d7c54071e600a7481e42b3e3ce1eb5226a5711a/xxhash-3.6.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:858dc935963a33bc33490128edc1c12b0c14d9c7ebaa4e387a7869ecc4f3e263", size = 212975, upload-time = "2025-10-02T14:35:00.816Z" }, + { url = "https://files.pythonhosted.org/packages/84/7a/c2b3d071e4bb4a90b7057228a99b10d51744878f4a8a6dd643c8bd897620/xxhash-3.6.0-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:ba284920194615cb8edf73bf52236ce2e1664ccd4a38fdb543506413529cc546", size = 212241, upload-time = "2025-10-02T14:35:02.207Z" }, + { url = "https://files.pythonhosted.org/packages/81/5f/640b6eac0128e215f177df99eadcd0f1b7c42c274ab6a394a05059694c5a/xxhash-3.6.0-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:4b54219177f6c6674d5378bd862c6aedf64725f70dd29c472eaae154df1a2e89", size = 445471, upload-time = "2025-10-02T14:35:03.61Z" }, { url = 
"https://files.pythonhosted.org/packages/5e/1e/3c3d3ef071b051cc3abbe3721ffb8365033a172613c04af2da89d5548a87/xxhash-3.6.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:42c36dd7dbad2f5238950c377fcbf6811b1cdb1c444fab447960030cea60504d", size = 193936, upload-time = "2025-10-02T14:35:05.013Z" }, { url = "https://files.pythonhosted.org/packages/2c/bd/4a5f68381939219abfe1c22a9e3a5854a4f6f6f3c4983a87d255f21f2e5d/xxhash-3.6.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:f22927652cba98c44639ffdc7aaf35828dccf679b10b31c4ad72a5b530a18eb7", size = 210440, upload-time = "2025-10-02T14:35:06.239Z" }, + { url = "https://files.pythonhosted.org/packages/d7/fd/2c0a00c97b9e18f72e1f240ad4e8f8a90fd9d408289ba9c7c495ed7dc05c/xxhash-3.6.0-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:6f2580ffab1a8b68ef2b901cde7e55fa8da5e4be0977c68f78fc80f3c143de42", size = 210689, upload-time = "2025-10-02T14:35:09.438Z" }, + { url = "https://files.pythonhosted.org/packages/93/86/5dd8076a926b9a95db3206aba20d89a7fc14dd5aac16e5c4de4b56033140/xxhash-3.6.0-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:40c391dd3cd041ebc3ffe6f2c862f402e306eb571422e0aa918d8070ba31da11", size = 414068, upload-time = "2025-10-02T14:35:11.162Z" }, { url = "https://files.pythonhosted.org/packages/af/3c/0bb129170ee8f3650f08e993baee550a09593462a5cddd8e44d0011102b1/xxhash-3.6.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:f205badabde7aafd1a31e8ca2a3e5a763107a71c397c4481d6a804eb5063d8bd", size = 191495, upload-time = "2025-10-02T14:35:12.971Z" }, { url = "https://files.pythonhosted.org/packages/f3/30/25e5321c8732759e930c555176d37e24ab84365482d257c3b16362235212/xxhash-3.6.0-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:a42e633d75cdad6d625434e3468126c73f13f7584545a9cf34e883aa1710e702", size = 32956, upload-time = "2025-10-02T14:35:17.413Z" }, { url = 
"https://files.pythonhosted.org/packages/9f/3c/0573299560d7d9f8ab1838f1efc021a280b5ae5ae2e849034ef3dee18810/xxhash-3.6.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:568a6d743219e717b07b4e03b0a828ce593833e498c3b64752e0f5df6bfe84db", size = 31072, upload-time = "2025-10-02T14:35:18.844Z" }, { url = "https://files.pythonhosted.org/packages/e3/8e/c6d158d12a79bbd0b878f8355432075fc82759e356ab5a111463422a239b/xxhash-3.6.0-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:78e7f2f4c521c30ad5e786fdd6bae89d47a32672a80195467b5de0480aa97b1f", size = 215736, upload-time = "2025-10-02T14:35:21.616Z" }, + { url = "https://files.pythonhosted.org/packages/bc/68/c4c80614716345d55071a396cf03d06e34b5f4917a467faf43083c995155/xxhash-3.6.0-cp313-cp313t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:3ed0df1b11a79856df5ffcab572cbd6b9627034c1c748c5566fa79df9048a7c5", size = 214833, upload-time = "2025-10-02T14:35:23.32Z" }, + { url = "https://files.pythonhosted.org/packages/7e/e9/ae27c8ffec8b953efa84c7c4a6c6802c263d587b9fc0d6e7cea64e08c3af/xxhash-3.6.0-cp313-cp313t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:0e4edbfc7d420925b0dd5e792478ed393d6e75ff8fc219a6546fb446b6a417b1", size = 448348, upload-time = "2025-10-02T14:35:25.111Z" }, { url = "https://files.pythonhosted.org/packages/d7/6b/33e21afb1b5b3f46b74b6bd1913639066af218d704cc0941404ca717fc57/xxhash-3.6.0-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:fba27a198363a7ef87f8c0f6b171ec36b674fe9053742c58dd7e3201c1ab30ee", size = 196070, upload-time = "2025-10-02T14:35:26.586Z" }, { url = "https://files.pythonhosted.org/packages/96/b6/fcabd337bc5fa624e7203aa0fa7d0c49eed22f72e93229431752bddc83d9/xxhash-3.6.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:794fe9145fe60191c6532fa95063765529770edcdd67b3d537793e8004cabbfd", size = 212907, upload-time 
= "2025-10-02T14:35:28.087Z" }, + { url = "https://files.pythonhosted.org/packages/0d/98/e8de5baa5109394baf5118f5e72ab21a86387c4f89b0e77ef3e2f6b0327b/xxhash-3.6.0-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:f01375c0e55395b814a679b3eea205db7919ac2af213f4a6682e01220e5fe292", size = 213304, upload-time = "2025-10-02T14:35:31.222Z" }, + { url = "https://files.pythonhosted.org/packages/7b/1d/71056535dec5c3177eeb53e38e3d367dd1d16e024e63b1cee208d572a033/xxhash-3.6.0-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:d706dca2d24d834a4661619dcacf51a75c16d65985718d6a7d73c1eeeb903ddf", size = 416930, upload-time = "2025-10-02T14:35:32.517Z" }, { url = "https://files.pythonhosted.org/packages/dc/6c/5cbde9de2cd967c322e651c65c543700b19e7ae3e0aae8ece3469bf9683d/xxhash-3.6.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:5f059d9faeacd49c0215d66f4056e1326c80503f51a1532ca336a385edadd033", size = 193787, upload-time = "2025-10-02T14:35:33.827Z" }, { url = "https://files.pythonhosted.org/packages/7e/5e/0138bc4484ea9b897864d59fce9be9086030825bc778b76cb5a33a906d37/xxhash-3.6.0-cp314-cp314-macosx_10_13_x86_64.whl", hash = "sha256:a40a3d35b204b7cc7643cbcf8c9976d818cb47befcfac8bbefec8038ac363f3e", size = 32754, upload-time = "2025-10-02T14:35:38.245Z" }, { url = "https://files.pythonhosted.org/packages/18/d7/5dac2eb2ec75fd771957a13e5dda560efb2176d5203f39502a5fc571f899/xxhash-3.6.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:a54844be970d3fc22630b32d515e79a90d0a3ddb2644d8d7402e3c4c8da61405", size = 30846, upload-time = "2025-10-02T14:35:39.6Z" }, { url = "https://files.pythonhosted.org/packages/e7/3b/52badfb2aecec2c377ddf1ae75f55db3ba2d321c5e164f14461c90837ef3/xxhash-3.6.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:4f6f72232f849eb9d0141e2ebe2677ece15adfd0fa599bc058aad83c714bb2c6", size = 213074, upload-time = "2025-10-02T14:35:42.29Z" }, + { url = 
"https://files.pythonhosted.org/packages/a2/2b/ae46b4e9b92e537fa30d03dbc19cdae57ed407e9c26d163895e968e3de85/xxhash-3.6.0-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:63275a8aba7865e44b1813d2177e0f5ea7eadad3dd063a21f7cf9afdc7054063", size = 212388, upload-time = "2025-10-02T14:35:43.929Z" }, + { url = "https://files.pythonhosted.org/packages/f5/80/49f88d3afc724b4ac7fbd664c8452d6db51b49915be48c6982659e0e7942/xxhash-3.6.0-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:3cd01fa2aa00d8b017c97eb46b9a794fbdca53fc14f845f5a328c71254b0abb7", size = 445614, upload-time = "2025-10-02T14:35:45.216Z" }, { url = "https://files.pythonhosted.org/packages/ed/ba/603ce3961e339413543d8cd44f21f2c80e2a7c5cfe692a7b1f2cccf58f3c/xxhash-3.6.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:0226aa89035b62b6a86d3c68df4d7c1f47a342b8683da2b60cedcddb46c4d95b", size = 194024, upload-time = "2025-10-02T14:35:46.959Z" }, { url = "https://files.pythonhosted.org/packages/78/d1/8e225ff7113bf81545cfdcd79eef124a7b7064a0bba53605ff39590b95c2/xxhash-3.6.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:c6e193e9f56e4ca4923c61238cdaced324f0feac782544eb4c6d55ad5cc99ddd", size = 210541, upload-time = "2025-10-02T14:35:48.301Z" }, + { url = "https://files.pythonhosted.org/packages/11/38/5eab81580703c4df93feb5f32ff8fa7fe1e2c51c1f183ee4e48d4bb9d3d7/xxhash-3.6.0-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:c1ce4009c97a752e682b897aa99aef84191077a9433eb237774689f14f8ec152", size = 210848, upload-time = "2025-10-02T14:35:50.877Z" }, + { url = "https://files.pythonhosted.org/packages/5e/6b/953dc4b05c3ce678abca756416e4c130d2382f877a9c30a20d08ee6a77c0/xxhash-3.6.0-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:8cb2f4f679b01513b7adbb9b1b2f0f9cdc31b70007eaf9d59d0878809f385b11", size = 414142, upload-time = "2025-10-02T14:35:52.15Z" }, { url = 
"https://files.pythonhosted.org/packages/08/a9/238ec0d4e81a10eb5026d4a6972677cbc898ba6c8b9dbaec12ae001b1b35/xxhash-3.6.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:653a91d7c2ab54a92c19ccf43508b6a555440b9be1bc8be553376778be7f20b5", size = 191547, upload-time = "2025-10-02T14:35:53.547Z" }, { url = "https://files.pythonhosted.org/packages/2c/db/0e99732ed7f64182aef4a6fb145e1a295558deec2a746265dcdec12d191e/xxhash-3.6.0-cp314-cp314t-macosx_10_13_x86_64.whl", hash = "sha256:c5294f596a9017ca5a3e3f8884c00b91ab2ad2933cf288f4923c3fd4346cf3d4", size = 32955, upload-time = "2025-10-02T14:35:58.267Z" }, { url = "https://files.pythonhosted.org/packages/55/f4/2a7c3c68e564a099becfa44bb3d398810cc0ff6749b0d3cb8ccb93f23c14/xxhash-3.6.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:1cf9dcc4ab9cff01dfbba78544297a3a01dafd60f3bde4e2bfd016cf7e4ddc67", size = 31072, upload-time = "2025-10-02T14:35:59.382Z" }, { url = "https://files.pythonhosted.org/packages/63/93/b21590e1e381040e2ca305a884d89e1c345b347404f7780f07f2cdd47ef4/xxhash-3.6.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:51a73fb7cb3a3ead9f7a8b583ffd9b8038e277cdb8cb87cf890e88b3456afa0b", size = 215854, upload-time = "2025-10-02T14:36:02.207Z" }, + { url = "https://files.pythonhosted.org/packages/ce/b8/edab8a7d4fa14e924b29be877d54155dcbd8b80be85ea00d2be3413a9ed4/xxhash-3.6.0-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:b9c6df83594f7df8f7f708ce5ebeacfc69f72c9fbaaababf6cf4758eaada0c9b", size = 214965, upload-time = "2025-10-02T14:36:03.507Z" }, + { url = "https://files.pythonhosted.org/packages/27/67/dfa980ac7f0d509d54ea0d5a486d2bb4b80c3f1bb22b66e6a05d3efaf6c0/xxhash-3.6.0-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:627f0af069b0ea56f312fd5189001c24578868643203bca1abbc2c52d3a6f3ca", size = 448484, upload-time = "2025-10-02T14:36:04.828Z" }, { url = 
"https://files.pythonhosted.org/packages/8c/63/8ffc2cc97e811c0ca5d00ab36604b3ea6f4254f20b7bc658ca825ce6c954/xxhash-3.6.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:aa912c62f842dfd013c5f21a642c9c10cd9f4c4e943e0af83618b4a404d9091a", size = 196162, upload-time = "2025-10-02T14:36:06.182Z" }, { url = "https://files.pythonhosted.org/packages/4b/77/07f0e7a3edd11a6097e990f6e5b815b6592459cb16dae990d967693e6ea9/xxhash-3.6.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:b465afd7909db30168ab62afe40b2fcf79eedc0b89a6c0ab3123515dc0df8b99", size = 213007, upload-time = "2025-10-02T14:36:07.733Z" }, + { url = "https://files.pythonhosted.org/packages/26/a5/d749334130de9411783873e9b98ecc46688dad5db64ca6e04b02acc8b473/xxhash-3.6.0-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:9b3222c686a919a0f3253cfc12bb118b8b103506612253b5baeaac10d8027cf6", size = 213401, upload-time = "2025-10-02T14:36:10.585Z" }, + { url = "https://files.pythonhosted.org/packages/89/72/abed959c956a4bfc72b58c0384bb7940663c678127538634d896b1195c10/xxhash-3.6.0-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:c5aa639bc113e9286137cec8fadc20e9cd732b2cc385c0b7fa673b84fc1f2a93", size = 417083, upload-time = "2025-10-02T14:36:12.276Z" }, { url = "https://files.pythonhosted.org/packages/0c/b3/62fd2b586283b7d7d665fb98e266decadf31f058f1cf6c478741f68af0cb/xxhash-3.6.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:5c1343d49ac102799905e115aee590183c3921d475356cb24b4de29a4bc56518", size = 193913, upload-time = "2025-10-02T14:36:14.025Z" }, ] @@ -2577,36 +2932,76 @@ wheels = [ { url = "https://files.pythonhosted.org/packages/e3/6f/c6554045d59d64052698add01226bc867b52fe4a12373415d7991fdca95d/yarl-1.23.0-cp312-cp312-macosx_10_13_x86_64.whl", hash = "sha256:411225bae281f114067578891bc75534cfb3d92a3b4dfef7a6ca78ba354e6069", size = 87029, upload-time = "2026-03-01T22:05:14.376Z" }, { url = 
"https://files.pythonhosted.org/packages/19/2a/725ecc166d53438bc88f76822ed4b1e3b10756e790bafd7b523fe97c322d/yarl-1.23.0-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:13a563739ae600a631c36ce096615fe307f131344588b0bc0daec108cdb47b25", size = 86310, upload-time = "2026-03-01T22:05:15.71Z" }, { url = "https://files.pythonhosted.org/packages/99/30/58260ed98e6ff7f90ba84442c1ddd758c9170d70327394a6227b310cd60f/yarl-1.23.0-cp312-cp312-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:9cbf44c5cb4a7633d078788e1b56387e3d3cf2b8139a3be38040b22d6c3221c8", size = 97587, upload-time = "2026-03-01T22:05:17.384Z" }, + { url = "https://files.pythonhosted.org/packages/76/0a/8b08aac08b50682e65759f7f8dde98ae8168f72487e7357a5d684c581ef9/yarl-1.23.0-cp312-cp312-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:53ad387048f6f09a8969631e4de3f1bf70c50e93545d64af4f751b2498755072", size = 92528, upload-time = "2026-03-01T22:05:18.804Z" }, + { url = "https://files.pythonhosted.org/packages/52/07/0b7179101fe5f8385ec6c6bb5d0cb9f76bd9fb4a769591ab6fb5cdbfc69a/yarl-1.23.0-cp312-cp312-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:4a59ba56f340334766f3a4442e0efd0af895fae9e2b204741ef885c446b3a1a8", size = 105339, upload-time = "2026-03-01T22:05:20.235Z" }, + { url = "https://files.pythonhosted.org/packages/d3/8a/36d82869ab5ec829ca8574dfcb92b51286fcfb1e9c7a73659616362dc880/yarl-1.23.0-cp312-cp312-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:803a3c3ce4acc62eaf01eaca1208dcf0783025ef27572c3336502b9c232005e7", size = 105061, upload-time = "2026-03-01T22:05:22.268Z" }, { url = "https://files.pythonhosted.org/packages/66/3e/868e5c3364b6cee19ff3e1a122194fa4ce51def02c61023970442162859e/yarl-1.23.0-cp312-cp312-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = 
"sha256:a3d2bff8f37f8d0f96c7ec554d16945050d54462d6e95414babaa18bfafc7f51", size = 100132, upload-time = "2026-03-01T22:05:23.638Z" }, + { url = "https://files.pythonhosted.org/packages/cf/26/9c89acf82f08a52cb52d6d39454f8d18af15f9d386a23795389d1d423823/yarl-1.23.0-cp312-cp312-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:c75eb09e8d55bceb4367e83496ff8ef2bc7ea6960efb38e978e8073ea59ecb67", size = 99289, upload-time = "2026-03-01T22:05:25.749Z" }, { url = "https://files.pythonhosted.org/packages/6f/54/5b0db00d2cb056922356104468019c0a132e89c8d3ab67d8ede9f4483d2a/yarl-1.23.0-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:877b0738624280e34c55680d6054a307aa94f7d52fa0e3034a9cc6e790871da7", size = 96950, upload-time = "2026-03-01T22:05:27.318Z" }, + { url = "https://files.pythonhosted.org/packages/f6/40/10fa93811fd439341fad7e0718a86aca0de9548023bbb403668d6555acab/yarl-1.23.0-cp312-cp312-musllinux_1_2_armv7l.whl", hash = "sha256:b5405bb8f0e783a988172993cfc627e4d9d00432d6bbac65a923041edacf997d", size = 93960, upload-time = "2026-03-01T22:05:28.738Z" }, + { url = "https://files.pythonhosted.org/packages/bc/d2/8ae2e6cd77d0805f4526e30ec43b6f9a3dfc542d401ac4990d178e4bf0cf/yarl-1.23.0-cp312-cp312-musllinux_1_2_ppc64le.whl", hash = "sha256:1c3a3598a832590c5a3ce56ab5576361b5688c12cb1d39429cf5dba30b510760", size = 104703, upload-time = "2026-03-01T22:05:30.438Z" }, + { url = "https://files.pythonhosted.org/packages/2f/0c/b3ceacf82c3fe21183ce35fa2acf5320af003d52bc1fcf5915077681142e/yarl-1.23.0-cp312-cp312-musllinux_1_2_riscv64.whl", hash = "sha256:8419ebd326430d1cbb7efb5292330a2cf39114e82df5cc3d83c9a0d5ebeaf2f2", size = 98325, upload-time = "2026-03-01T22:05:31.835Z" }, + { url = "https://files.pythonhosted.org/packages/9d/e0/12900edd28bdab91a69bd2554b85ad7b151f64e8b521fe16f9ad2f56477a/yarl-1.23.0-cp312-cp312-musllinux_1_2_s390x.whl", hash = "sha256:be61f6fff406ca40e3b1d84716fde398fc08bc63dd96d15f3a14230a0973ed86", size = 105067, upload-time = 
"2026-03-01T22:05:33.358Z" }, { url = "https://files.pythonhosted.org/packages/15/61/74bb1182cf79c9bbe4eb6b1f14a57a22d7a0be5e9cedf8e2d5c2086474c3/yarl-1.23.0-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:3ceb13c5c858d01321b5d9bb65e4cf37a92169ea470b70fec6f236b2c9dd7e34", size = 100285, upload-time = "2026-03-01T22:05:35.4Z" }, { url = "https://files.pythonhosted.org/packages/9a/4b/a0a6e5d0ee8a2f3a373ddef8a4097d74ac901ac363eea1440464ccbe0898/yarl-1.23.0-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:16c6994ac35c3e74fb0ae93323bf8b9c2a9088d55946109489667c510a7d010e", size = 123796, upload-time = "2026-03-01T22:05:41.412Z" }, { url = "https://files.pythonhosted.org/packages/67/b6/8925d68af039b835ae876db5838e82e76ec87b9782ecc97e192b809c4831/yarl-1.23.0-cp313-cp313-macosx_10_13_x86_64.whl", hash = "sha256:4a42e651629dafb64fd5b0286a3580613702b5809ad3f24934ea87595804f2c5", size = 86547, upload-time = "2026-03-01T22:05:42.841Z" }, { url = "https://files.pythonhosted.org/packages/ae/50/06d511cc4b8e0360d3c94af051a768e84b755c5eb031b12adaaab6dec6e5/yarl-1.23.0-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:7c6b9461a2a8b47c65eef63bb1c76a4f1c119618ffa99ea79bc5bb1e46c5821b", size = 85854, upload-time = "2026-03-01T22:05:44.85Z" }, { url = "https://files.pythonhosted.org/packages/c4/f4/4e30b250927ffdab4db70da08b9b8d2194d7c7b400167b8fbeca1e4701ca/yarl-1.23.0-cp313-cp313-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:2569b67d616eab450d262ca7cb9f9e19d2f718c70a8b88712859359d0ab17035", size = 98351, upload-time = "2026-03-01T22:05:46.836Z" }, + { url = "https://files.pythonhosted.org/packages/86/fc/4118c5671ea948208bdb1492d8b76bdf1453d3e73df051f939f563e7dcc5/yarl-1.23.0-cp313-cp313-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:e9d9a4d06d3481eab79803beb4d9bd6f6a8e781ec078ac70d7ef2dcc29d1bea5", size = 92711, upload-time = "2026-03-01T22:05:48.316Z" }, + { url = 
"https://files.pythonhosted.org/packages/56/11/1ed91d42bd9e73c13dc9e7eb0dd92298d75e7ac4dd7f046ad0c472e231cd/yarl-1.23.0-cp313-cp313-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:f514f6474e04179d3d33175ed3f3e31434d3130d42ec153540d5b157deefd735", size = 106014, upload-time = "2026-03-01T22:05:50.028Z" }, + { url = "https://files.pythonhosted.org/packages/ce/c9/74e44e056a23fbc33aca71779ef450ca648a5bc472bdad7a82339918f818/yarl-1.23.0-cp313-cp313-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:fda207c815b253e34f7e1909840fd14299567b1c0eb4908f8c2ce01a41265401", size = 105557, upload-time = "2026-03-01T22:05:51.416Z" }, { url = "https://files.pythonhosted.org/packages/66/fe/b1e10b08d287f518994f1e2ff9b6d26f0adeecd8dd7d533b01bab29a3eda/yarl-1.23.0-cp313-cp313-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:34b6cf500e61c90f305094911f9acc9c86da1a05a7a3f5be9f68817043f486e4", size = 101559, upload-time = "2026-03-01T22:05:52.872Z" }, + { url = "https://files.pythonhosted.org/packages/72/59/c5b8d94b14e3d3c2a9c20cb100119fd534ab5a14b93673ab4cc4a4141ea5/yarl-1.23.0-cp313-cp313-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:d7504f2b476d21653e4d143f44a175f7f751cd41233525312696c76aa3dbb23f", size = 100502, upload-time = "2026-03-01T22:05:54.954Z" }, { url = "https://files.pythonhosted.org/packages/77/4f/96976cb54cbfc5c9fd73ed4c51804f92f209481d1fb190981c0f8a07a1d7/yarl-1.23.0-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:578110dd426f0d209d1509244e6d4a3f1a3e9077655d98c5f22583d63252a08a", size = 98027, upload-time = "2026-03-01T22:05:56.409Z" }, + { url = "https://files.pythonhosted.org/packages/63/6e/904c4f476471afdbad6b7e5b70362fb5810e35cd7466529a97322b6f5556/yarl-1.23.0-cp313-cp313-musllinux_1_2_armv7l.whl", hash = "sha256:609d3614d78d74ebe35f54953c5bbd2ac647a7ddb9c30a5d877580f5e86b22f2", size = 95369, upload-time = "2026-03-01T22:05:58.141Z" 
}, + { url = "https://files.pythonhosted.org/packages/9d/40/acfcdb3b5f9d68ef499e39e04d25e141fe90661f9d54114556cf83be8353/yarl-1.23.0-cp313-cp313-musllinux_1_2_ppc64le.whl", hash = "sha256:4966242ec68afc74c122f8459abd597afd7d8a60dc93d695c1334c5fd25f762f", size = 105565, upload-time = "2026-03-01T22:06:00.286Z" }, + { url = "https://files.pythonhosted.org/packages/5e/c6/31e28f3a6ba2869c43d124f37ea5260cac9c9281df803c354b31f4dd1f3c/yarl-1.23.0-cp313-cp313-musllinux_1_2_riscv64.whl", hash = "sha256:e0fd068364a6759bc794459f0a735ab151d11304346332489c7972bacbe9e72b", size = 99813, upload-time = "2026-03-01T22:06:01.712Z" }, + { url = "https://files.pythonhosted.org/packages/08/1f/6f65f59e72d54aa467119b63fc0b0b1762eff0232db1f4720cd89e2f4a17/yarl-1.23.0-cp313-cp313-musllinux_1_2_s390x.whl", hash = "sha256:39004f0ad156da43e86aa71f44e033de68a44e5a31fc53507b36dd253970054a", size = 105632, upload-time = "2026-03-01T22:06:03.188Z" }, { url = "https://files.pythonhosted.org/packages/a3/c4/18b178a69935f9e7a338127d5b77d868fdc0f0e49becd286d51b3a18c61d/yarl-1.23.0-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:e5723c01a56c5028c807c701aa66722916d2747ad737a046853f6c46f4875543", size = 101895, upload-time = "2026-03-01T22:06:04.651Z" }, { url = "https://files.pythonhosted.org/packages/9c/fc/119dd07004f17ea43bb91e3ece6587759edd7519d6b086d16bfbd3319982/yarl-1.23.0-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:aecfed0b41aa72b7881712c65cf764e39ce2ec352324f5e0837c7048d9e6daaa", size = 130719, upload-time = "2026-03-01T22:06:11.708Z" }, { url = "https://files.pythonhosted.org/packages/e6/0d/9f2348502fbb3af409e8f47730282cd6bc80dec6630c1e06374d882d6eb2/yarl-1.23.0-cp313-cp313t-macosx_10_13_x86_64.whl", hash = "sha256:a41bcf68efd19073376eb8cf948b8d9be0af26256403e512bb18f3966f1f9120", size = 89690, upload-time = "2026-03-01T22:06:13.429Z" }, { url = 
"https://files.pythonhosted.org/packages/50/93/e88f3c80971b42cfc83f50a51b9d165a1dbf154b97005f2994a79f212a07/yarl-1.23.0-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:cde9a2ecd91668bcb7f077c4966d8ceddb60af01b52e6e3e2680e4cf00ad1a59", size = 89851, upload-time = "2026-03-01T22:06:15.53Z" }, { url = "https://files.pythonhosted.org/packages/1c/07/61c9dd8ba8f86473263b4036f70fb594c09e99c0d9737a799dfd8bc85651/yarl-1.23.0-cp313-cp313t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:5023346c4ee7992febc0068e7593de5fa2bf611848c08404b35ebbb76b1b0512", size = 95874, upload-time = "2026-03-01T22:06:17.553Z" }, + { url = "https://files.pythonhosted.org/packages/9e/e9/f9ff8ceefba599eac6abddcfb0b3bee9b9e636e96dbf54342a8577252379/yarl-1.23.0-cp313-cp313t-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:d1009abedb49ae95b136a8904a3f71b342f849ffeced2d3747bf29caeda218c4", size = 88710, upload-time = "2026-03-01T22:06:19.004Z" }, + { url = "https://files.pythonhosted.org/packages/eb/78/0231bfcc5d4c8eec220bc2f9ef82cb4566192ea867a7c5b4148f44f6cbcd/yarl-1.23.0-cp313-cp313t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:a8d00f29b42f534cc8aa3931cfe773b13b23e561e10d2b26f27a8d309b0e82a1", size = 101033, upload-time = "2026-03-01T22:06:21.203Z" }, + { url = "https://files.pythonhosted.org/packages/cd/9b/30ea5239a61786f18fd25797151a17fbb3be176977187a48d541b5447dd4/yarl-1.23.0-cp313-cp313t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:95451e6ce06c3e104556d73b559f5da6c34a069b6b62946d3ad66afcd51642ea", size = 100817, upload-time = "2026-03-01T22:06:22.738Z" }, { url = "https://files.pythonhosted.org/packages/62/e2/a4980481071791bc83bce2b7a1a1f7adcabfa366007518b4b845e92eeee3/yarl-1.23.0-cp313-cp313t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = 
"sha256:531ef597132086b6cf96faa7c6c1dcd0361dd5f1694e5cc30375907b9b7d3ea9", size = 97482, upload-time = "2026-03-01T22:06:24.21Z" }, + { url = "https://files.pythonhosted.org/packages/e5/1e/304a00cf5f6100414c4b5a01fc7ff9ee724b62158a08df2f8170dfc72a2d/yarl-1.23.0-cp313-cp313t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:88f9fb0116fbfcefcab70f85cf4b74a2b6ce5d199c41345296f49d974ddb4123", size = 95949, upload-time = "2026-03-01T22:06:25.697Z" }, { url = "https://files.pythonhosted.org/packages/68/03/093f4055ed4cae649ac53bca3d180bd37102e9e11d048588e9ab0c0108d0/yarl-1.23.0-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:e7b0460976dc75cb87ad9cc1f9899a4b97751e7d4e77ab840fc9b6d377b8fd24", size = 95839, upload-time = "2026-03-01T22:06:27.309Z" }, + { url = "https://files.pythonhosted.org/packages/b9/28/4c75ebb108f322aa8f917ae10a8ffa4f07cae10a8a627b64e578617df6a0/yarl-1.23.0-cp313-cp313t-musllinux_1_2_armv7l.whl", hash = "sha256:115136c4a426f9da976187d238e84139ff6b51a20839aa6e3720cd1026d768de", size = 90696, upload-time = "2026-03-01T22:06:29.048Z" }, + { url = "https://files.pythonhosted.org/packages/23/9c/42c2e2dd91c1a570402f51bdf066bfdb1241c2240ba001967bad778e77b7/yarl-1.23.0-cp313-cp313t-musllinux_1_2_ppc64le.whl", hash = "sha256:ead11956716a940c1abc816b7df3fa2b84d06eaed8832ca32f5c5e058c65506b", size = 100865, upload-time = "2026-03-01T22:06:30.525Z" }, + { url = "https://files.pythonhosted.org/packages/74/05/1bcd60a8a0a914d462c305137246b6f9d167628d73568505fce3f1cb2e65/yarl-1.23.0-cp313-cp313t-musllinux_1_2_riscv64.whl", hash = "sha256:fe8f8f5e70e6dbdfca9882cd9deaac058729bcf323cf7a58660901e55c9c94f6", size = 96234, upload-time = "2026-03-01T22:06:32.692Z" }, + { url = "https://files.pythonhosted.org/packages/90/b2/f52381aac396d6778ce516b7bc149c79e65bfc068b5de2857ab69eeea3b7/yarl-1.23.0-cp313-cp313t-musllinux_1_2_s390x.whl", hash = "sha256:a0e317df055958a0c1e79e5d2aa5a5eaa4a6d05a20d4b0c9c3f48918139c9fc6", size = 100295, upload-time = 
"2026-03-01T22:06:34.268Z" }, { url = "https://files.pythonhosted.org/packages/e5/e8/638bae5bbf1113a659b2435d8895474598afe38b4a837103764f603aba56/yarl-1.23.0-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:6f0fd84de0c957b2d280143522c4f91a73aada1923caee763e24a2b3fda9f8a5", size = 97784, upload-time = "2026-03-01T22:06:35.864Z" }, { url = "https://files.pythonhosted.org/packages/90/98/b85a038d65d1b92c3903ab89444f48d3cee490a883477b716d7a24b1a78c/yarl-1.23.0-cp314-cp314-macosx_10_15_universal2.whl", hash = "sha256:21d1b7305a71a15b4794b5ff22e8eef96ff4a6d7f9657155e5aa419444b28912", size = 124455, upload-time = "2026-03-01T22:06:43.615Z" }, { url = "https://files.pythonhosted.org/packages/39/54/bc2b45559f86543d163b6e294417a107bb87557609007c007ad889afec18/yarl-1.23.0-cp314-cp314-macosx_10_15_x86_64.whl", hash = "sha256:85610b4f27f69984932a7abbe52703688de3724d9f72bceb1cca667deff27474", size = 86752, upload-time = "2026-03-01T22:06:45.425Z" }, { url = "https://files.pythonhosted.org/packages/24/f9/e8242b68362bffe6fb536c8db5076861466fc780f0f1b479fc4ffbebb128/yarl-1.23.0-cp314-cp314-macosx_11_0_arm64.whl", hash = "sha256:23f371bd662cf44a7630d4d113101eafc0cfa7518a2760d20760b26021454719", size = 86291, upload-time = "2026-03-01T22:06:46.974Z" }, { url = "https://files.pythonhosted.org/packages/ea/d8/d1cb2378c81dd729e98c716582b1ccb08357e8488e4c24714658cc6630e8/yarl-1.23.0-cp314-cp314-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:c4a80f77dc1acaaa61f0934176fccca7096d9b1ff08c8ba9cddf5ae034a24319", size = 99026, upload-time = "2026-03-01T22:06:48.459Z" }, + { url = "https://files.pythonhosted.org/packages/0a/ff/7196790538f31debe3341283b5b0707e7feb947620fc5e8236ef28d44f72/yarl-1.23.0-cp314-cp314-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:bd654fad46d8d9e823afbb4f87c79160b5a374ed1ff5bde24e542e6ba8f41434", size = 92355, upload-time = "2026-03-01T22:06:50.306Z" }, + { url = 
"https://files.pythonhosted.org/packages/c1/56/25d58c3eddde825890a5fe6aa1866228377354a3c39262235234ab5f616b/yarl-1.23.0-cp314-cp314-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:682bae25f0a0dd23a056739f23a134db9f52a63e2afd6bfb37ddc76292bbd723", size = 106417, upload-time = "2026-03-01T22:06:52.1Z" }, + { url = "https://files.pythonhosted.org/packages/51/8a/882c0e7bc8277eb895b31bce0138f51a1ba551fc2e1ec6753ffc1e7c1377/yarl-1.23.0-cp314-cp314-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:a82836cab5f197a0514235aaf7ffccdc886ccdaa2324bc0aafdd4ae898103039", size = 106422, upload-time = "2026-03-01T22:06:54.424Z" }, { url = "https://files.pythonhosted.org/packages/42/2b/fef67d616931055bf3d6764885990a3ac647d68734a2d6a9e1d13de437a2/yarl-1.23.0-cp314-cp314-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = "sha256:1c57676bdedc94cd3bc37724cf6f8cd2779f02f6aba48de45feca073e714fe52", size = 101915, upload-time = "2026-03-01T22:06:55.895Z" }, + { url = "https://files.pythonhosted.org/packages/18/6a/530e16aebce27c5937920f3431c628a29a4b6b430fab3fd1c117b26ff3f6/yarl-1.23.0-cp314-cp314-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:c7f8dc16c498ff06497c015642333219871effba93e4a2e8604a06264aca5c5c", size = 100690, upload-time = "2026-03-01T22:06:58.21Z" }, { url = "https://files.pythonhosted.org/packages/88/08/93749219179a45e27b036e03260fda05190b911de8e18225c294ac95bbc9/yarl-1.23.0-cp314-cp314-musllinux_1_2_aarch64.whl", hash = "sha256:5ee586fb17ff8f90c91cf73c6108a434b02d69925f44f5f8e0d7f2f260607eae", size = 98750, upload-time = "2026-03-01T22:06:59.794Z" }, + { url = "https://files.pythonhosted.org/packages/d9/cf/ea424a004969f5d81a362110a6ac1496d79efdc6d50c2c4b2e3ea0fc2519/yarl-1.23.0-cp314-cp314-musllinux_1_2_armv7l.whl", hash = "sha256:17235362f580149742739cc3828b80e24029d08cbb9c4bda0242c7b5bc610a8e", size = 94685, upload-time = "2026-03-01T22:07:01.375Z" }, 
+ { url = "https://files.pythonhosted.org/packages/e2/b7/14341481fe568e2b0408bcf1484c652accafe06a0ade9387b5d3fd9df446/yarl-1.23.0-cp314-cp314-musllinux_1_2_ppc64le.whl", hash = "sha256:0793e2bd0cf14234983bbb371591e6bea9e876ddf6896cdcc93450996b0b5c85", size = 106009, upload-time = "2026-03-01T22:07:03.151Z" }, + { url = "https://files.pythonhosted.org/packages/0a/e6/5c744a9b54f4e8007ad35bce96fbc9218338e84812d36f3390cea616881a/yarl-1.23.0-cp314-cp314-musllinux_1_2_riscv64.whl", hash = "sha256:3650dc2480f94f7116c364096bc84b1d602f44224ef7d5c7208425915c0475dd", size = 100033, upload-time = "2026-03-01T22:07:04.701Z" }, + { url = "https://files.pythonhosted.org/packages/0c/23/e3bfc188d0b400f025bc49d99793d02c9abe15752138dcc27e4eaf0c4a9e/yarl-1.23.0-cp314-cp314-musllinux_1_2_s390x.whl", hash = "sha256:f40e782d49630ad384db66d4d8b73ff4f1b8955dc12e26b09a3e3af064b3b9d6", size = 106483, upload-time = "2026-03-01T22:07:06.231Z" }, { url = "https://files.pythonhosted.org/packages/72/42/f0505f949a90b3f8b7a363d6cbdf398f6e6c58946d85c6d3a3bc70595b26/yarl-1.23.0-cp314-cp314-musllinux_1_2_x86_64.whl", hash = "sha256:94f8575fbdf81749008d980c17796097e645574a3b8c28ee313931068dad14fe", size = 102175, upload-time = "2026-03-01T22:07:08.4Z" }, { url = "https://files.pythonhosted.org/packages/dd/8d/d2fad34b1c08aa161b74394183daa7d800141aaaee207317e82c790b418d/yarl-1.23.0-cp314-cp314t-macosx_10_15_universal2.whl", hash = "sha256:9a18d6f9359e45722c064c97464ec883eb0e0366d33eda61cb19a244bf222679", size = 131019, upload-time = "2026-03-01T22:07:14.903Z" }, { url = "https://files.pythonhosted.org/packages/19/ff/33009a39d3ccf4b94d7d7880dfe17fb5816c5a4fe0096d9b56abceea9ac7/yarl-1.23.0-cp314-cp314t-macosx_10_15_x86_64.whl", hash = "sha256:2803ed8b21ca47a43da80a6fd1ed3019d30061f7061daa35ac54f63933409412", size = 89894, upload-time = "2026-03-01T22:07:17.372Z" }, { url = 
"https://files.pythonhosted.org/packages/0c/f1/dab7ac5e7306fb79c0190766a3c00b4cb8d09a1f390ded68c85a5934faf5/yarl-1.23.0-cp314-cp314t-macosx_11_0_arm64.whl", hash = "sha256:394906945aa8b19fc14a61cf69743a868bb8c465efe85eee687109cc540b98f4", size = 89979, upload-time = "2026-03-01T22:07:19.361Z" }, { url = "https://files.pythonhosted.org/packages/aa/b1/08e95f3caee1fad6e65017b9f26c1d79877b502622d60e517de01e72f95d/yarl-1.23.0-cp314-cp314t-manylinux2014_aarch64.manylinux_2_17_aarch64.manylinux_2_28_aarch64.whl", hash = "sha256:71d006bee8397a4a89f469b8deb22469fe7508132d3c17fa6ed871e79832691c", size = 95943, upload-time = "2026-03-01T22:07:21.266Z" }, + { url = "https://files.pythonhosted.org/packages/c0/cc/6409f9018864a6aa186c61175b977131f373f1988e198e031236916e87e4/yarl-1.23.0-cp314-cp314t-manylinux2014_armv7l.manylinux_2_17_armv7l.manylinux_2_31_armv7l.whl", hash = "sha256:62694e275c93d54f7ccedcfef57d42761b2aad5234b6be1f3e3026cae4001cd4", size = 88786, upload-time = "2026-03-01T22:07:23.129Z" }, + { url = "https://files.pythonhosted.org/packages/76/40/cc22d1d7714b717fde2006fad2ced5efe5580606cb059ae42117542122f3/yarl-1.23.0-cp314-cp314t-manylinux2014_ppc64le.manylinux_2_17_ppc64le.manylinux_2_28_ppc64le.whl", hash = "sha256:a31de1613658308efdb21ada98cbc86a97c181aa050ba22a808120bb5be3ab94", size = 101307, upload-time = "2026-03-01T22:07:24.689Z" }, + { url = "https://files.pythonhosted.org/packages/8f/0d/476c38e85ddb4c6ec6b20b815bdd779aa386a013f3d8b85516feee55c8dc/yarl-1.23.0-cp314-cp314t-manylinux2014_s390x.manylinux_2_17_s390x.manylinux_2_28_s390x.whl", hash = "sha256:fb1e8b8d66c278b21d13b0a7ca22c41dd757a7c209c6b12c313e445c31dd3b28", size = 100904, upload-time = "2026-03-01T22:07:26.287Z" }, { url = "https://files.pythonhosted.org/packages/72/32/0abe4a76d59adf2081dcb0397168553ece4616ada1c54d1c49d8936c74f8/yarl-1.23.0-cp314-cp314t-manylinux2014_x86_64.manylinux_2_17_x86_64.manylinux_2_28_x86_64.whl", hash = 
"sha256:50f9d8d531dfb767c565f348f33dd5139a6c43f5cbdf3f67da40d54241df93f6", size = 97728, upload-time = "2026-03-01T22:07:27.906Z" }, + { url = "https://files.pythonhosted.org/packages/b7/35/7b30f4810fba112f60f5a43237545867504e15b1c7647a785fbaf588fac2/yarl-1.23.0-cp314-cp314t-manylinux_2_31_riscv64.manylinux_2_39_riscv64.whl", hash = "sha256:575aa4405a656e61a540f4a80eaa5260f2a38fff7bfdc4b5f611840d76e9e277", size = 95964, upload-time = "2026-03-01T22:07:30.198Z" }, { url = "https://files.pythonhosted.org/packages/2d/86/ed7a73ab85ef00e8bb70b0cb5421d8a2a625b81a333941a469a6f4022828/yarl-1.23.0-cp314-cp314t-musllinux_1_2_aarch64.whl", hash = "sha256:041b1a4cefacf65840b4e295c6985f334ba83c30607441ae3cf206a0eed1a2e4", size = 95882, upload-time = "2026-03-01T22:07:32.132Z" }, + { url = "https://files.pythonhosted.org/packages/19/90/d56967f61a29d8498efb7afb651e0b2b422a1e9b47b0ab5f4e40a19b699b/yarl-1.23.0-cp314-cp314t-musllinux_1_2_armv7l.whl", hash = "sha256:d38c1e8231722c4ce40d7593f28d92b5fc72f3e9774fe73d7e800ec32299f63a", size = 90797, upload-time = "2026-03-01T22:07:34.404Z" }, + { url = "https://files.pythonhosted.org/packages/72/00/8b8f76909259f56647adb1011d7ed8b321bcf97e464515c65016a47ecdf0/yarl-1.23.0-cp314-cp314t-musllinux_1_2_ppc64le.whl", hash = "sha256:d53834e23c015ee83a99377db6e5e37d8484f333edb03bd15b4bc312cc7254fb", size = 101023, upload-time = "2026-03-01T22:07:35.953Z" }, + { url = "https://files.pythonhosted.org/packages/ac/e2/cab11b126fb7d440281b7df8e9ddbe4851e70a4dde47a202b6642586b8d9/yarl-1.23.0-cp314-cp314t-musllinux_1_2_riscv64.whl", hash = "sha256:2e27c8841126e017dd2a054a95771569e6070b9ee1b133366d8b31beb5018a41", size = 96227, upload-time = "2026-03-01T22:07:37.594Z" }, + { url = "https://files.pythonhosted.org/packages/c2/9b/2c893e16bfc50e6b2edf76c1a9eb6cb0c744346197e74c65e99ad8d634d0/yarl-1.23.0-cp314-cp314t-musllinux_1_2_s390x.whl", hash = "sha256:76855800ac56f878847a09ce6dba727c93ca2d89c9e9d63002d26b916810b0a2", size = 100302, upload-time = 
"2026-03-01T22:07:39.334Z" }, { url = "https://files.pythonhosted.org/packages/28/ec/5498c4e3a6d5f1003beb23405671c2eb9cdbf3067d1c80f15eeafe301010/yarl-1.23.0-cp314-cp314t-musllinux_1_2_x86_64.whl", hash = "sha256:e09fd068c2e169a7070d83d3bde728a4d48de0549f975290be3c108c02e499b4", size = 98202, upload-time = "2026-03-01T22:07:41.717Z" }, { url = "https://files.pythonhosted.org/packages/69/68/c8739671f5699c7dc470580a4f821ef37c32c4cb0b047ce223a7f115757f/yarl-1.23.0-py3-none-any.whl", hash = "sha256:a2df6afe50dea8ae15fa34c9f824a3ee958d785fd5d089063d960bae1daa0a3f", size = 48288, upload-time = "2026-03-01T22:07:51.388Z" }, ]