
feat(tasks): migrate GET /api/apify/runs/{runId} #145

Merged
sweetmantech merged 1 commit into main from feat/migrate-apify-scraper on Apr 23, 2026
Conversation

@arpitgupta1214 (Collaborator) commented Apr 23, 2026

Cuts the Apify-scraper polling helper over from the legacy api.recoupable.com/api/apify/scraper?runId=X URL to ${NEW_API_BASE_URL}/api/apify/runs/${runId} with an x-api-key header. Response schemas are updated for the datasetId → dataset_id rename; pollScraperResults reads the new field and falls back to the upstream-known run value when it is null. Depends on api#463.

Test plan

  • pnpm test passes (356/356 locally)
  • Trigger.dev dev: run pro-artist-social-profiles-scrape against a preview and observe that a full poll cycle completes

Summary by cubic

Switches Apify scraper polling to the new authenticated GET endpoint /api/apify/runs/{runId} and updates response handling to snake_case with a safe fallback when dataset_id is null.

  • Refactors

    • Moves from https://api.recoupable.com/api/apify/scraper?runId=X to ${NEW_API_BASE_URL}/api/apify/runs/${runId} (URL-encoded) with x-api-key: RECOUP_API_KEY.
    • Updates schema from datasetId to dataset_id (nullable); pollScraperResults falls back to run.datasetId when null.
    • Adds unit tests for URL, headers, in-progress/completed payloads, non-ok responses, empty runId, and URL encoding.
  • Migration

    • Ensure NEW_API_BASE_URL and RECOUP_API_KEY are set in the environment.

Written for commit 40b02ad. Summary will update on new commits.
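The dataset_id fallback described above can be sketched as a small pure function. This is illustrative only: the field names `dataset_id` and `datasetId` come from the PR, but the type shapes and the `resolveDatasetId` helper name are assumptions, not the actual implementation.

```typescript
// Hypothetical shapes for the new snake_case response and the
// upstream-known run value (names other than the two fields are invented).
interface ApifyRunResponse {
  status: string;
  dataset_id: string | null; // renamed from datasetId; may be null
}

interface KnownRun {
  datasetId: string; // value already known upstream
}

// Prefer the new endpoint's dataset_id; fall back to the
// upstream-known value when the response field is null.
export function resolveDatasetId(resp: ApifyRunResponse, run: KnownRun): string {
  return resp.dataset_id ?? run.datasetId;
}
```

Using nullish coalescing (`??`) rather than `||` matters here: it falls back only on `null`/`undefined`, never on an empty string.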

- URL: `https://api.recoupable.com/api/apify/scraper?runId=X` →
  `${NEW_API_BASE_URL}/api/apify/runs/${runId}` (path segment,
  url-encoded).
- Auth: adds `x-api-key: RECOUP_API_KEY` header — new endpoint
  requires auth.
- Wire shape: `datasetId` → `dataset_id` (snake_case) in the
  response schemas; `pollScraperResults` adapted to read the new
  field and fall back to the upstream-known `run.datasetId` when
  the response field is null.
- Adds `src/recoup/__tests__/getScraperResults.test.ts` covering
  URL, headers, in-progress / completed / non-ok paths, empty
  runId, and url-encoding.
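The URL and auth changes in the list above can be sketched as a request builder. The env var names `NEW_API_BASE_URL` and `RECOUP_API_KEY` and the `x-api-key` header come from the PR; the `buildScraperRunRequest` helper and its return shape are hypothetical, shown only to make the path-segment encoding concrete.

```typescript
// Defaults here are placeholders so the sketch is self-contained.
const NEW_API_BASE_URL = process.env.NEW_API_BASE_URL ?? "https://api.example.com";
const RECOUP_API_KEY = process.env.RECOUP_API_KEY ?? "test-key";

// Build the GET /api/apify/runs/{runId} request: runId is a path
// segment now (not a query param), so it must be URL-encoded.
export function buildScraperRunRequest(runId: string): {
  url: string;
  headers: Record<string, string>;
} {
  if (!runId) throw new Error("runId is required");
  return {
    url: `${NEW_API_BASE_URL}/api/apify/runs/${encodeURIComponent(runId)}`,
    headers: { "x-api-key": RECOUP_API_KEY },
  };
}
```

A caller would then pass `url` and `headers` straight to `fetch`; the empty-runId guard mirrors the "empty runId" case the new unit tests cover.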

coderabbitai Bot commented Apr 23, 2026

Warning: rate limit exceeded

@arpitgupta1214 has exceeded the hourly limit for commits that can be reviewed. After 36 minutes and 36 seconds, a review can be triggered with the @coderabbitai review command as a PR comment, or by pushing new commits. CodeRabbit enforces hourly per-developer rate limits; spacing out commits avoids hitting them.

ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: c2439c6d-dac0-4949-a381-2ffb23743978

📥 Commits

Reviewing files that changed from the base of the PR and between 242d1b5 and 40b02ad.

📒 Files selected for processing (3)
  • src/polling/pollScraperResults.ts
  • src/recoup/__tests__/getScraperResults.test.ts
  • src/recoup/getScraperResults.ts


arpitgupta1214 changed the title from "feat: cut getScraperResults to GET /api/apify/runs/{runId}" to "feat: cut getScraperResults caller to /api/apify/runs/{runId} with x-api-key" on Apr 23, 2026
arpitgupta1214 changed the title from "feat: cut getScraperResults caller to /api/apify/runs/{runId} with x-api-key" to "feat(tasks): migrate GET /api/apify/runs/{runId}" on Apr 23, 2026
@sweetmantech (Contributor) commented:

Dev test via Trigger.dev — passing

Triggered pro-artist-social-profiles-scrape (run_cmoc1topg008m0in5aydyaatq) in the Recoup-Chat dev env against this branch's code. Dashboard.

How I know it works

The migrated path is pollScraperResults → getScraperResults(runId) → GET /api/apify/runs/{runId}. Each scrape batch runs this exact sequence:

Started scrapes for batch N of 58       ← POST /api/socials/{id}/scrape
Polling batch N runs to completion      ← NEW code: getScraperResults hits /api/apify/runs/{runId}
wait.for() ~10s
Batch N completed                       ← poll returned SUCCEEDED + dataset items
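The cycle above can be sketched as a generic poll loop. This is a sketch of the described behavior, not the repo's `pollScraperResults`: the ~10s interval and the SUCCEEDED terminal status come from the comment, while `pollUntilDone`, its parameters, and the status union are assumptions.

```typescript
// Hypothetical run statuses; only "SUCCEEDED" is named in the comment.
type RunStatus = "RUNNING" | "SUCCEEDED" | "FAILED";

// Poll getStatus until the run leaves RUNNING, waiting between
// attempts (wait is injected so tests can skip the real delay).
async function pollUntilDone(
  getStatus: () => Promise<RunStatus>,
  wait: (ms: number) => Promise<void>,
  maxPolls = 60,
): Promise<RunStatus> {
  for (let i = 0; i < maxPolls; i++) {
    const status = await getStatus();
    if (status !== "RUNNING") return status;
    await wait(10_000); // ~10s between polls, matching the observed cadence
  }
  throw new Error("poll timed out");
}
```

Injecting `wait` mirrors how Trigger.dev's wait primitive sits between polls in the logged sequence, and keeps the loop unit-testable without real delays.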

Observed during the run:

  • Batch 1 completed
  • Batch 2 completed
  • Batches 3+ in flight at time of writing, all 58 batches invoking the new endpoint per cycle
  • Zero error spans across 443 trace lines

If the migration were broken (e.g. getScraperResults still hitting the deleted /api/apify/scraper), each Polling batch N… span would 404 on the first poll and the batch would fail. Instead batches are completing cleanly on ~10s intervals — confirming the new path resolves, authenticates (admin auth passes), and returns usable Apify data to the caller.

Not covered

  • Full run hadn't completed at the time I drafted this — still cycling through the remaining batches. Worth eyeballing the final run status in the dashboard for COMPLETED.

🤖 Tested with Claude Code via Trigger.dev MCP

@sweetmantech sweetmantech merged commit 59447a6 into main Apr 23, 2026
3 checks passed