
refactor(ci): extract mise/asdf caching to e2e-only cache #1124

Draft
malept wants to merge 1 commit into main from malept/dt-5282/refactor/extract-e2e-cache

Conversation

Member

@malept malept commented Apr 9, 2026

What this PR does / why we need it

PR #1117 added ~/.local/share/mise to the shared daily cache, bloating it for every job (~500MB-1GB extra) even though only e2e jobs (machine executor) need cached tool installs. Non-e2e jobs run in Docker where tools are pre-installed.

This PR splits the daily cache into two:

  • v1-daily-cache: ~/.cache, ~/.outreach/.cache — restored by all jobs
  • v1-e2e-daily-cache: ~/.asdf, ~/.local/share/mise — restored only by e2e jobs (opt-in via the restore_e2e_cache parameter)

The v2 key prefix from #1117 is reverted to v1. No manual cache invalidation needed -- stale v2 entries age out in 15 days since caches regenerate daily via {{ epoch }}.
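
A minimal sketch of what the split could look like in `.circleci/config.yml`. The key names and the `restore_e2e_cache` parameter come from this PR's description; the command name, step layout, and key templates are assumptions, not copied from the repo's actual config:

```yaml
# Illustrative sketch only -- command name and key templates are assumed.
commands:
  setup_environment:
    parameters:
      restore_e2e_cache:
        type: boolean
        default: false
    steps:
      # Slim daily cache, restored by every job.
      - restore_cache:
          keys:
            - v1-daily-cache-  # prefix match picks the newest entry
      # Tool-install cache, restored only when the job opts in.
      - when:
          condition: << parameters.restore_e2e_cache >>
          steps:
            - restore_cache:
                keys:
                  - v1-e2e-daily-cache-
```

An e2e job would then call `setup_environment` with `restore_e2e_cache: true`, while Docker jobs keep the default `false` and skip the second restore entirely.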

Jira ID

DT-5282

Notes for your reviewers

  • The save_cache job runs on a Docker executor but ~/.asdf is populated via the pre-installed toolchain. Docker and machine executors share OS/arch, so cached binaries are portable.
  • restore_e2e_cache defaults to false rather than being inferred from machine: true, keeping the coupling loose.

Rovo Dev code review: Rovo Dev has reviewed this pull request
Any suggestions or improvements have been posted as pull request comments.

Split the shared daily cache into a slim daily cache (for all jobs)
and an e2e-specific cache (for machine-executor jobs that need tool
binaries). This reduces cache restore size for non-e2e Docker jobs
while preserving tool caching for e2e jobs.

- Revert v2 cache key prefix back to v1 (stale v2 entries expire in 15d)
- Remove ~/.asdf and ~/.local/share/mise from daily cache
- Add v1-e2e-daily-cache with ~/.asdf and ~/.local/share/mise
- Add restore_e2e_cache parameter to setup_environment (default: false)
- Pass restore_e2e_cache: true from e2e job

Refs: DT-5282
Assisted-By: claude-opus 4.6 via OpenCode
@malept malept requested a review from a team as a code owner April 9, 2026 00:53
Contributor

@marnagy marnagy left a comment


Since we will be reverting the daily cache key, is it possible that there is an OLD cache in CI that might get used between us releasing an rc version of this repo and some people running their pipelines?
I'm not yet fully sure how the cache in CI works.

Contributor

palootcenas-outreach commented Apr 9, 2026

I tested this on an internal repo (detailed results) and ran into a few problems.

All jobs run mise install, not just e2e

Docker images don't ship repo-specific tool versions. Every job runs mise install (and asdf install where applicable) during setup_environment. Without cached ~/.local/share/mise and ~/.asdf, Docker jobs reinstall everything from scratch: 204s setup vs 8.7s with #1117. Saving 73s on a smaller cache restore doesn't offset a 195s setup regression.

The e2e cache is unreachable by e2e jobs

save_cache runs on Docker where {{ arch }} evaluates to arch1-linux-amd64-6_85. E2e runs on a machine executor where it's arch1-linux-amd64-6_106. These never match, so e2e jobs can't find the cache at all. The split doesn't help them either.
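
The mismatch is easiest to see by expanding the key template on each executor. The template below is an assumption (the actual config may differ); the expanded `{{ arch }}` values are the ones observed in my test run:

```yaml
# Assumed key template containing {{ arch }}:
#   v1-e2e-daily-cache-{{ arch }}-...
#
# Expanded on the Docker executor (where save_cache runs):
#   v1-e2e-daily-cache-arch1-linux-amd64-6_85-...
#
# Expanded on the machine executor (where e2e restore_cache runs):
#   v1-e2e-daily-cache-arch1-linux-amd64-6_106-...
#
# restore_cache matches keys by prefix against saved entries, so the
# machine-side lookup can never find the Docker-side entry.
```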

~/.asdf has been in the daily cache since it was introduced

This PR removes it for all non-e2e jobs. That's a regression from even the pre-#1117 state.

Numbers

Total stored: 3.3 GiB (1.3 + 2.0) vs 3.2 GiB with #1117. Docker jobs go from 195s to 608s. E2e jobs get nothing due to the arch mismatch.

I think #1117's single cache approach is simpler and works. If cache size near the 3 GiB limit becomes a concern, keying on checksum("mise.toml")-checksum("mise.lock") instead of {{ epoch }} would deduplicate entries without removing content that all jobs depend on.
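
A hedged sketch of the checksum-based key. The file names come from the suggestion above; the surrounding step layout and the `{{ arch }}` component are assumptions about the existing config:

```yaml
# Illustrative only: keyed on tool-version files instead of {{ epoch }}.
- save_cache:
    key: v1-daily-cache-{{ arch }}-{{ checksum "mise.toml" }}-{{ checksum "mise.lock" }}
    paths:
      - ~/.cache
      - ~/.outreach/.cache
      - ~/.asdf
      - ~/.local/share/mise
```

With this keying, a new cache entry is only written when the tool versions actually change, so identical days deduplicate to a single entry instead of one entry per day.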

@malept malept marked this pull request as draft April 9, 2026 14:38