Conversation
Note: Reviews paused. It looks like this branch is under active development. To avoid overwhelming you with review comments due to an influx of new commits, CodeRabbit has automatically paused this review. You can configure this behavior in the review settings.
📝 Walkthrough

Adds per-test exclude_nodes support across SLURM node resolution and command generation; introduces optional git submodule initialization in installers with cleanup on failure; adjusts Megatron-Bridge repo mounting and launcher Slurm params; updates configs and adds tests for these behaviors.
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~45 minutes
🚥 Pre-merge checks: ✅ Passed checks (2 passed)
✏️ Tip: You can configure your own custom pre-merge checks in the settings.
Greptile Summary

This PR introduces several enhancements to the Megatron-Bridge workload and Slurm infrastructure for CloudAI, including propagating the resolved node list into the inner Slurm job.
Confidence Score: 3/5
Important Files Changed
Outdated review thread (resolved): src/cloudai/workloads/megatron_bridge/slurm_command_gen_strategy.py
Actionable comments posted: 2
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
conf/experimental/megatron_bridge/test/gb200/megatron_bridge_qwen_30b.toml (1)
28-40: ⚠️ Potential issue | 🟠 Major — Do not leave `hf_token` empty in this test recipe.

This updated recipe still ships with `hf_token = ""`, which makes the config unusable unless callers patch the TOML out-of-band. Please wire this to a user-supplied secret/input, or explicitly exempt this scenario from the non-empty validation path. Based on learnings: for Megatron Bridge test TOML configurations under `conf/experimental/megatron_bridge/test/`, `hf_token` must be non-empty; tests should fail validation if `hf_token` is left empty, to ensure users supply their own Hugging Face token.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@conf/experimental/megatron_bridge/test/gb200/megatron_bridge_qwen_30b.toml` around lines 28 - 40, The test TOML leaves the [cmd_args] key hf_token empty which breaks validation; change the recipe to require a user-supplied token by wiring hf_token to a runtime secret/input (e.g., use a placeholder token variable like HF_TOKEN referenced by the deployment system) or update the config validation logic to reject empty hf_token for Megatron Bridge test configs; specifically modify the [cmd_args] hf_token entry and/or the validation path that checks test recipes under conf/experimental/megatron_bridge/test to enforce non-empty hf_token so tests fail if not provided.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@src/cloudai/systems/slurm/slurm_system.py`:
- Around line 722-725: The current logic in
parse_node_list/parse_nodes/get_nodes_by_spec can produce an empty node list
after applying exclude_nodes and then fall back to an unconstrained allocation;
modify the code so that after filtering expanded_nodes = [n for n in
expanded_nodes if n not in exclude_nodes] you detect if expanded_nodes is empty
and immediately raise a clear exception (e.g., ValueError/RuntimeError) that
includes the offending node_spec and exclude_nodes, and ensure get_nodes_by_spec
propagates this error instead of returning (num_nodes, []). This prevents
SlurmCommandGenStrategy from seeing an empty nodelist and emitting -N for
unconstrained placement; alternatively, if a non-exceptional path is required,
return a sentinel that preserves the exclusion constraint and is handled by
SlurmCommandGenStrategy rather than an empty list fallback.
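The guard suggested above can be sketched as a standalone function; the function and parameter names here are illustrative, not CloudAI's actual API:

```python
def filter_nodes(expanded_nodes: list[str], exclude_nodes: set[str], node_spec: str) -> list[str]:
    """Drop excluded nodes; fail loudly instead of falling back to an empty list."""
    filtered = [n for n in expanded_nodes if n not in exclude_nodes]
    if not filtered:
        # Raising here prevents the caller from silently emitting -N for
        # unconstrained placement when the exclusion empties the pool.
        raise ValueError(
            f"Node spec {node_spec!r} resolves to no nodes after excluding {sorted(exclude_nodes)}"
        )
    return filtered
```

The exception message includes both the offending spec and the exclusion list, as the prompt requests, so the failure is diagnosable from the log alone.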
In `@tests/workloads/megatron_bridge/test_command_gen_strategy_slurm.py`:
- Around line 323-351: The tests are using the defaults from the test factory so
they pass regardless of the override plumbing; update the failing tests to use
non-default values so the overrides are actually exercised: in
test_mount_as_adds_repo_to_container_mounts call make_test_run with a distinct
mount_as (e.g. "/opt/custom-megatron") and assert the wrapper_content contains
f"-cm {repo_path.absolute()}:/opt/custom-megatron" (use the same mount_as
string), in test_gpus_per_node_passed_as_additional_slurm_param pass
cmd_args_overrides={"gpus_per_node": 2} (or another non-default number) and
assert "gpus-per-node=2" appears in wrapper_content; keep using
MegatronBridgeSlurmCommandGenStrategy, make_test_run, and _wrapper_content to
locate where to change the values.
---
Outside diff comments:
In `@conf/experimental/megatron_bridge/test/gb200/megatron_bridge_qwen_30b.toml`:
- Around line 28-40: The test TOML leaves the [cmd_args] key hf_token empty
which breaks validation; change the recipe to require a user-supplied token by
wiring hf_token to a runtime secret/input (e.g., use a placeholder token
variable like HF_TOKEN referenced by the deployment system) or update the config
validation logic to reject empty hf_token for Megatron Bridge test configs;
specifically modify the [cmd_args] hf_token entry and/or the validation path
that checks test recipes under conf/experimental/megatron_bridge/test to enforce
non-empty hf_token so tests fail if not provided.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: ASSERTIVE
Plan: Pro
Run ID: b086916f-352d-4c4c-b94f-f9fa9c3846a0
📒 Files selected for processing (15)
- conf/experimental/megatron_bridge/test/gb200/megatron_bridge_qwen_30b.toml
- src/cloudai/_core/test_scenario.py
- src/cloudai/models/scenario.py
- src/cloudai/systems/kubernetes/kubernetes_installer.py
- src/cloudai/systems/slurm/slurm_command_gen_strategy.py
- src/cloudai/systems/slurm/slurm_installer.py
- src/cloudai/systems/slurm/slurm_system.py
- src/cloudai/test_scenario_parser.py
- src/cloudai/workloads/megatron_bridge/megatron_bridge.py
- src/cloudai/workloads/megatron_bridge/slurm_command_gen_strategy.py
- tests/ref_data/megatron-bridge.sbatch
- tests/systems/slurm/test_allocation.py
- tests/systems/slurm/test_system.py
- tests/test_git_repo_installer.py
- tests/workloads/megatron_bridge/test_command_gen_strategy_slurm.py
Outdated review thread (resolved): tests/workloads/megatron_bridge/test_command_gen_strategy_slurm.py
Actionable comments posted: 1
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@tests/workloads/megatron_bridge/test_command_gen_strategy_slurm.py`:
- Around line 323-334: The test function
test_mount_as_adds_repo_to_container_mounts declares an unused tmp_path fixture
parameter; remove tmp_path from the test signature so it becomes def
test_mount_as_adds_repo_to_container_mounts(self, configured_slurm_system:
SlurmSystem, make_test_run: Callable[..., TestRun]) -> None: and update any
references if the test was called elsewhere (none expected) to eliminate the
unused parameter warning; locate this function by name in
tests/workloads/megatron_bridge/test_command_gen_strategy_slurm.py and adjust
the signature accordingly.
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: ASSERTIVE
Plan: Pro
Run ID: d59eb2c8-9f04-4000-8d8d-493717387607
📒 Files selected for processing (6)
- src/cloudai/systems/slurm/slurm_command_gen_strategy.py
- src/cloudai/systems/slurm/slurm_system.py
- src/cloudai/workloads/megatron_bridge/slurm_command_gen_strategy.py
- tests/systems/slurm/test_command_gen_strategy.py
- tests/systems/slurm/test_system.py
- tests/workloads/megatron_bridge/test_command_gen_strategy_slurm.py
Outdated review threads (resolved): src/cloudai/workloads/megatron_bridge/slurm_command_gen_strategy.py (×2), conf/experimental/megatron_bridge/test/gb200/megatron_bridge_qwen_30b.toml
Actionable comments posted: 3
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
src/cloudai/_core/installables.py (1)
2-2: ⚠️ Potential issue | 🟡 Minor — Fix copyright year to resolve CI failure.

The pipeline reports that the copyright year is invalid. It should be `2025-2026` to match the header pattern.

🔧 Proposed fix

```diff
-# Copyright (c) 2025 NVIDIA CORPORATION & AFFILIATES. All rights reserved.
+# Copyright (c) 2025-2026 NVIDIA CORPORATION & AFFILIATES. All rights reserved.
```

🤖 Prompt for AI Agents
♻️ Duplicate comments (1)
tests/workloads/megatron_bridge/test_command_gen_strategy_slurm.py (1)
323-325: ⚠️ Potential issue | 🟡 Minor — Remove unused `tmp_path` fixture from test signature.

`tmp_path` is unused in this method (Line 324), and keeping it triggers ARG002 noise.

🧹 Proposed fix

```diff
 def test_mount_as_adds_repo_to_container_mounts(
-    self, configured_slurm_system: SlurmSystem, make_test_run: Callable[..., TestRun], tmp_path: Path
+    self, configured_slurm_system: SlurmSystem, make_test_run: Callable[..., TestRun]
 ) -> None:
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@tests/workloads/megatron_bridge/test_command_gen_strategy_slurm.py` around lines 323 - 325, The test function test_mount_as_adds_repo_to_container_mounts currently declares an unused tmp_path fixture which triggers ARG002; remove tmp_path from the function signature so it reads def test_mount_as_adds_repo_to_container_mounts(self, configured_slurm_system: SlurmSystem, make_test_run: Callable[..., TestRun]) -> None:, and ensure there are no remaining references to tmp_path in the body of that test (adjust imports or fixtures if needed).
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@tests/test_git_repo_installer.py`:
- Around line 281-300: The tests were reformatted by ruff-format; run the
formatter and pre-commit hooks to fix CI failures by running the project's
formatting hooks (e.g., pre-commit run --all-files or ruff format) and commit
the changes; ensure the test functions
(test_submodules_skipped_when_not_requested, test_submodules_run_when_requested)
and the installer methods referenced (installer._install_one_git_repo,
installer._init_submodules, installer._clone_repository,
installer._checkout_commit) remain unchanged in behavior after formatting so the
assertions and mocks still match the corrected file.
In `@tests/workloads/megatron_bridge/test_command_gen_strategy_slurm.py`:
- Around line 362-370: The test test_test_run_extra_srun_args_forwarded
currently only checks for the existence of "--additional_slurm_params" which can
be present by default; update it to assert that the value derived from
TestRun.extra_srun_args is actually forwarded by
MegatronBridgeSlurmCommandGenStrategy. Specifically, set tr.extra_srun_args =
"--exclusive" (already done), call _wrapper_content(cmd_gen) as before, and
replace the loose assert with one that checks the wrapper_content contains the
forwarded flag and value (e.g., that the string representing the additional
slurm params includes "--exclusive" or an exact parameter encoding produced by
the command generator) so the test fails if extra_srun_args is ignored;
reference the test function name, the TestRun.extra_srun_args field, the
MegatronBridgeSlurmCommandGenStrategy constructor, and the _wrapper_content
helper to locate the code.
- Around line 335-343: Replace the brittle blanket check for "-cm" in
test_no_mount_as_skips_repo_container_mount with a precise assertion that the
Megatron-Bridge repo mount is not present: remove or change the assert "-cm" not
in wrapper_content to assert that the specific mount target or mount flag for
the repo (for example a string containing ':/opt/Megatron-Bridge' or the '-v
...:/opt/Megatron-Bridge' pattern) is not in wrapper_content, keeping the
existing assert "/opt/Megatron-Bridge" absence check or tightening it to match
the exact mount syntax emitted by MegatronBridgeSlurmCommandGenStrategy.
---
Outside diff comments:
In `@src/cloudai/_core/installables.py`:
- Line 2: Update the file header line that currently reads "Copyright (c) 2025
NVIDIA CORPORATION & AFFILIATES. All rights reserved." to use the required year
range "2025-2026" so it matches the project's header pattern; locate the
top-of-file copyright string in src/cloudai/_core/installables.py and replace
the single year with "2025-2026".
---
Duplicate comments:
In `@tests/workloads/megatron_bridge/test_command_gen_strategy_slurm.py`:
- Around line 323-325: The test function
test_mount_as_adds_repo_to_container_mounts currently declares an unused
tmp_path fixture which triggers ARG002; remove tmp_path from the function
signature so it reads def test_mount_as_adds_repo_to_container_mounts(self,
configured_slurm_system: SlurmSystem, make_test_run: Callable[..., TestRun]) ->
None:, and ensure there are no remaining references to tmp_path in the body of
that test (adjust imports or fixtures if needed).
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: ASSERTIVE
Plan: Pro
Run ID: 57739f39-82cc-44a7-add5-3cc1080fcf41
📒 Files selected for processing (17)
- conf/experimental/megatron_bridge/test/b200/megatron_bridge_qwen_30b.toml
- conf/experimental/megatron_bridge/test/gb200/megatron_bridge_qwen_30b.toml
- conf/experimental/megatron_bridge/test/gb300/megatron_bridge_qwen_30b.toml
- conf/experimental/megatron_bridge/test/h100/megatron_bridge_qwen_30b.toml
- src/cloudai/_core/installables.py
- src/cloudai/_core/test_scenario.py
- src/cloudai/models/scenario.py
- src/cloudai/systems/kubernetes/kubernetes_installer.py
- src/cloudai/systems/slurm/slurm_command_gen_strategy.py
- src/cloudai/systems/slurm/slurm_installer.py
- src/cloudai/systems/slurm/slurm_system.py
- src/cloudai/workloads/megatron_bridge/slurm_command_gen_strategy.py
- tests/systems/slurm/test_allocation.py
- tests/systems/slurm/test_command_gen_strategy.py
- tests/systems/slurm/test_system.py
- tests/test_git_repo_installer.py
- tests/workloads/megatron_bridge/test_command_gen_strategy_slurm.py
Outdated review thread (resolved): src/cloudai/workloads/megatron_bridge/slurm_command_gen_strategy.py
Actionable comments posted: 4
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@src/cloudai/_core/installables.py`:
- Line 94: The early return in both _install_one_git_repo() implementations
skips submodule setup for existing clones, so ensure submodules are initialized
when item.init_submodules is true even if repo_path.exists() and commit matches;
move or duplicate the item.init_submodules handling out of the post-clone-only
block (or add an explicit check after the early-return path) so that when
repo_path.exists() and item.init_submodules is true you call the same submodule
init/update sequence used after cloning (reference symbols:
_install_one_git_repo, repo_path.exists(), item.init_submodules).
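The suggested control flow can be sketched as follows. All names are illustrative stand-ins for the installer internals, with collaborators injected so the fast path is easy to see and test:

```python
from pathlib import Path
from typing import Callable


def install_one_git_repo(
    repo_path: Path,
    init_submodules_requested: bool,
    commit_matches: Callable[[Path], bool],
    clone: Callable[[Path], None],
    init_submodules: Callable[[Path], None],
) -> str:
    """Sketch: honor init_submodules on the existing-clone fast path too."""
    if repo_path.exists() and commit_matches(repo_path):
        if init_submodules_requested:
            init_submodules(repo_path)  # previously skipped on this early-return path
        return "already installed"
    clone(repo_path)
    if init_submodules_requested:
        init_submodules(repo_path)
    return "installed"
```

The key point is that the submodule step appears on both branches, not only after a fresh clone.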
In `@src/cloudai/workloads/megatron_bridge/slurm_command_gen_strategy.py`:
- Around line 111-133: The _parse_srun_args_as_slurm_params function incorrectly
uses str.split() which breaks quoted shell arguments; replace tokens =
srun_args.split() with tokens = shlex.split(srun_args) so quoted values (e.g.
--comment "nightly run") are treated as a single token and the rest of the
parsing logic (the while loop handling --key=val, --key value, and standalone
flags) remains unchanged; shlex is already imported so just swap the split call
to shlex.split.
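A standalone sketch of that parsing loop with the suggested `shlex.split` swap (the real method lives on `MegatronBridgeSlurmCommandGenStrategy`; this version is illustrative):

```python
import shlex


def parse_srun_args_as_slurm_params(srun_args: str) -> list[str]:
    """Turn '--key=val', '--key val', and bare flags into 'key=val' / 'key' entries.

    shlex.split keeps quoted values intact, so --comment "nightly run"
    yields a single 'comment=nightly run' entry.
    """
    tokens = shlex.split(srun_args)
    params: list[str] = []
    i = 0
    while i < len(tokens):
        key = tokens[i].lstrip("-")
        if "=" in key:  # --key=val
            params.append(key)
            i += 1
        elif i + 1 < len(tokens) and not tokens[i + 1].startswith("--"):  # --key val
            params.append(f"{key}={tokens[i + 1]}")
            i += 2
        else:  # standalone flag, e.g. --exclusive
            params.append(key)
            i += 1
    return params
```

With `str.split()` the same input would split "nightly run" into two tokens and mis-parse the pair.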
In `@tests/test_git_repo_installer.py`:
- Around line 281-299: Add a regression test that covers the "existing repo"
fast path with init_submodules enabled: create a test (in
tests/test_git_repo_installer.py) that pre-creates the repository directory
referenced by the GitRepo fixture (repo_path), set git.init_submodules = True,
mock installer._verify_commit to return InstallStatusResult(True) and mock
installer._clone_repository/_checkout_commit as needed, call
installer._install_one_git_repo(git), assert the result is successful and that
installer._init_submodules.assert_called_once() to ensure submodules are
initialized on the pre-existing-repo path.
In `@tests/workloads/megatron_bridge/test_command_gen_strategy_slurm.py`:
- Around line 375-385: Add a new unit test in the same test file exercising
MegatronBridgeSlurmCommandGenStrategy._parse_srun_args_as_slurm_params with a
quoted value, e.g. call it with '--comment "nightly run" --reservation
my_reserv' and assert the returned list includes 'comment=nightly run' and
'reservation=my_reserv' (ordering-insensitive or match exact expected order used
in existing tests); this ensures the parser correctly strips quotes and
preserves spaces for values like "nightly run".
ℹ️ Review info
⚙️ Run configuration
Configuration used: Path: .coderabbit.yaml
Review profile: ASSERTIVE
Plan: Pro
Run ID: 001e2203-7e96-421a-ad8b-79c332c8123d
📒 Files selected for processing (5)
- conf/experimental/megatron_bridge/test_scenario/megatron_bridge_qwen_30b.toml
- src/cloudai/_core/installables.py
- src/cloudai/workloads/megatron_bridge/slurm_command_gen_strategy.py
- tests/test_git_repo_installer.py
- tests/workloads/megatron_bridge/test_command_gen_strategy_slurm.py
Summary
This PR introduces several enhancements to the CloudAI Megatron Bridge workload and Slurm infrastructure:
Node placement propagation to Megatron Bridge: The resolved node list from CloudAI's scenario config is now forwarded to Megatron Bridge's `setup_experiment.py` via `--additional_slurm_params nodelist=...`. This ensures that when users specify explicit `partition:group:count` node specs in the scenario, the exact placement is honored inside the inner Slurm job. For example, given a scenario file with such specs, CloudAI resolves the partition:group entries into concrete node names, excludes `node01`, selects `count` nodes from the remaining pool, and passes the final list as `--additional_slurm_params 'nodelist=node02,node03,...'` so Megatron-Bridge schedules on exactly those nodes.
Reserved node allocations: Slurm nodes in the `resv` (reserved) state are now treated as allocatable, so that jobs with `--reservation` configured in `extra_srun_args` can resolve their node specifications instead of failing with "0 nodes available."

GPU resource requests: `gpus_per_node` and `gres` are now forwarded as `--additional_slurm_params gpus-per-node=N;gres=gpu:N` alongside the nodelist when the system supports the GPU directive. For example, setting `gpus_per_node = 4` with a nodelist produces `--additional_slurm_params 'gpus-per-node=4;gres=gpu:4;nodelist=node02,node03'`. This ensures the GPU resources are requested explicitly, as some Slurm systems may not allocate them automatically.
Git submodule initialization: If `init_submodules = True` is set on a `GitRepo`, both `SlurmInstaller` and `KubernetesInstaller` now run `git submodule update --init --recursive` after cloning git repos. This is required for repos like Megatron-Bridge that bundle Megatron-LM as a submodule. Failures clean up the cloned directory.
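The submodule step with cleanup could look roughly like this; it is a sketch, with the `run` parameter injected for testability, and the actual installer API may differ:

```python
import shutil
import subprocess
from pathlib import Path


def init_submodules_with_cleanup(repo_path: Path, run=subprocess.run) -> None:
    """Run `git submodule update --init --recursive`; remove the clone on failure."""
    try:
        run(["git", "submodule", "update", "--init", "--recursive"], cwd=repo_path, check=True)
    except subprocess.CalledProcessError:
        # Do not leave a half-initialized clone behind.
        shutil.rmtree(repo_path, ignore_errors=True)
        raise
```

Re-raising after cleanup lets the installer report the failure while guaranteeing a fresh clone on the next attempt.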
Optional container mount for Megatron Bridge: When `mount_as` is set on the git repo entry (e.g. `mount_as = "/opt/Megatron-Bridge"`), the locally cloned repo is bind-mounted into the container via `-cm /path/to/clone:/opt/Megatron-Bridge`, overriding the image's built-in copy. Omitting `mount_as` uses the container's built-in copy as-is.

Qwen recipe update: Updated the GB200 Qwen 30B recipe to use tag `r0.3.0` and container `nvcr.io#nvidia/nemo:26.02`, and added `PYTHONPATH` for the bundled Megatron-LM submodule.

Test Plan
Additional Notes

Megatron-Bridge r0.3.0 sets `PYTHONPATH = "/opt/Megatron-Bridge/3rdparty/Megatron-LM:${PYTHONPATH}"` to pick up the bundled Megatron-LM.