Problem
99problems get -q currently paginates search results sequentially across providers. For multi-page queries, end-to-end runtime is dominated by sequential network round trips, one per page.
Design
Implement phased pagination optimization for -q:
- parallelize safe page-number/offset endpoints with bounded in-flight requests,
- keep deterministic output ordering,
- handle token/cursor-based endpoints conservatively (pipeline or sequential fallback),
- preserve existing error categories and behavior.
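The bounded-parallel, deterministic-order idea above can be sketched with std-only Rust. This is a minimal illustration, not the planned implementation: `fetch_page` is a hypothetical stand-in for a provider's page-number endpoint, and threads plus a channel stand in for whatever async runtime the scheduler actually uses. Completions may arrive out of order, so they are buffered and emitted as a contiguous page prefix.

```rust
use std::collections::BTreeMap;
use std::sync::mpsc;
use std::thread;

// Hypothetical stand-in for a provider's page-number endpoint.
fn fetch_page(page: usize) -> Vec<String> {
    (0..2).map(|i| format!("p{page}-item{i}")).collect()
}

// Fetch `total` pages with at most `max_in_flight` concurrent requests,
// emitting results in page order regardless of completion order.
fn paginate_bounded(total: usize, max_in_flight: usize) -> Vec<String> {
    let (tx, rx) = mpsc::channel();
    let mut next = 0; // next page to launch
    let mut in_flight = 0;
    let mut buffered: BTreeMap<usize, Vec<String>> = BTreeMap::new();
    let mut emit_at = 0; // next page index to emit
    let mut out = Vec::new();

    while emit_at < total {
        // Launch requests until the in-flight bound is reached.
        while in_flight < max_in_flight && next < total {
            let tx = tx.clone();
            let page = next;
            thread::spawn(move || {
                let _ = tx.send((page, fetch_page(page)));
            });
            next += 1;
            in_flight += 1;
        }
        // Buffer out-of-order completions; emit the contiguous prefix.
        let (page, items) = rx.recv().expect("worker dropped");
        in_flight -= 1;
        buffered.insert(page, items);
        while let Some(items) = buffered.remove(&emit_at) {
            out.extend(items);
            emit_at += 1;
        }
    }
    out
}

fn main() {
    println!("{:?}", paginate_bounded(5, 3));
}
```

The reorder buffer is what keeps output deterministic under the acceptance criteria: parallelism changes only completion order, never emit order.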
Scope
- Add internal bounded pagination scheduler for query/search paths.
- Roll out phased search pagination optimization across providers.
- Preserve deterministic emit order by page/index.
- Keep existing provider auth/rate-limit/error semantics.
- Add tests and trace-level diagnostics for pagination scheduling behavior.
Boundary
- No new CLI flags/config keys in this phase.
- No by-id hydration/comment-pagination changes in this issue.
- No aggressive fan-out for token/cursor-constrained endpoints.
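For token/cursor-constrained endpoints, the conservative fallback named in the Design section is a plain sequential loop, since each request depends on the cursor returned by the previous one. A minimal sketch, where `fetch_cursor` is a hypothetical stand-in for such an endpoint:

```rust
// Hypothetical cursor-based endpoint: returns a page of items
// plus the next cursor, or None when exhausted.
fn fetch_cursor(cursor: Option<&str>) -> (Vec<String>, Option<String>) {
    match cursor {
        None => (vec!["a".into()], Some("c1".into())),
        Some("c1") => (vec!["b".into()], None),
        Some(_) => (vec![], None),
    }
}

// Sequential fallback: no fan-out is possible without guessing tokens,
// so each page is fetched only after the previous cursor arrives.
fn paginate_cursor() -> Vec<String> {
    let mut out = Vec::new();
    let mut cursor: Option<String> = None;
    loop {
        let (items, next) = fetch_cursor(cursor.as_deref());
        out.extend(items);
        match next {
            Some(c) => cursor = Some(c),
            None => break,
        }
    }
    out
}

fn main() {
    println!("{:?}", paginate_cursor());
}
```

Keeping this path strictly sequential is what "no aggressive fan-out" means in practice: it preserves provider rate-limit and error semantics at the cost of forgoing parallel speedup on these endpoints.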
Acceptance Criteria
- Multi-page -q runs show lower wall-clock time in representative scenarios.
- Emitted conversation ordering stays stable and deterministic.
- Search result completeness remains unchanged versus baseline.
- Error categories/exit codes remain consistent with current behavior.
- Tests and pedantic clippy checks pass.
Context
Recent provider hydration parallelization delivered measurable improvements. Query pagination is the next latency bottleneck for larger searches and is a natural follow-up optimization area.