To keep documentation consistent and easy to review, please follow these rules when editing or adding Markdown files.
Action documentation lives under `docs/actions`. Keep these files in sync with their corresponding implementations in `scripts`.
- Run `npm run verify:docs` to check that documented inputs match each action's `action.yml`.
Use `docs/template-action.md` as the starting point for new or updated action and workflow docs. Each document should include Purpose, Parameters, Examples, and Return Codes sections, with GitHub Action inputs mapped to CLI parameters and example code blocks for both CLI and GitHub usage.
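For reference, a document following that template might be laid out as below. The action name, input, parameter, and return codes shown are placeholders for illustration, not a real action in this repository:

````markdown
# my-action

## Purpose

One or two sentences describing what the action does.

## Parameters

| GitHub Action input | CLI parameter | Description          |
| ------------------- | ------------- | -------------------- |
| `target-path`       | `-TargetPath` | Path to operate on.  |

## Examples

```powershell
./scripts/my-action.ps1 -TargetPath ./src
```

```yaml
- uses: ./actions/my-action
  with:
    target-path: ./src
```

## Return Codes

| Code | Meaning            |
| ---- | ------------------ |
| 0    | Success            |
| 1    | Validation failure |
````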
- Run `npm run lint:md` to lint Markdown formatting.
- Run `npx --yes linkinator README.md docs scripts --config linkinator.config.json` to check links before submitting changes.
- Keep one `#`-level heading at the top of each file and increment heading levels sequentially; do not skip levels.
- Use `#` for the document title, then `##`, `###`, and so on.
- Avoid jumping from a `##` heading directly to `####`.
- Use fenced code blocks with triple backticks.
- Specify the language for syntax highlighting (for example, start a PowerShell block with `` ```powershell ``).
- Use `text` for blocks that show output rather than code.
- Use relative links for files within this repository.
- Provide descriptive link text instead of raw URLs.
- Check that external links resolve correctly.
- Run a spell checker or Markdown linter (if available) before opening a pull request to catch formatting and spelling issues early.
You can use MkDocs to preview documentation changes on your machine.
- Install MkDocs and the Material theme:

  ```shell
  pip install mkdocs mkdocs-material
  ```

- Start a local server:

  ```shell
  mkdocs serve
  ```
MkDocs serves the site at http://127.0.0.1:8000/ by default. The server automatically rebuilds when files change, so refresh the browser to see your latest edits.
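MkDocs reads its configuration from a `mkdocs.yml` file at the repository root. If you need a minimal one for local preview, something like the following works; the site name and nav entries here are placeholders, so adjust them to the repository's actual pages:

```yaml
site_name: Project Documentation
theme:
  name: material   # provided by the mkdocs-material package
nav:
  - Home: index.md
  - Actions: actions/index.md
```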
The CI pipeline collects JUnit XML output from both Node and PowerShell tests. `scripts/generate-ci-summary.ts` parses these files to build the requirement traceability report. Use `npm run test:ci` to produce the Node JUnit report when verifying documentation updates; the results appear under `test-results/` and must be committed with your pull request. Then run:
```shell
npm run derive:registry
TEST_RESULTS_GLOBS='test-results/*junit*.xml' npm run generate:summary
npm run check:traceability
```

Commit `test-results/*` and `artifacts/linux/*` along with your source changes. The validate-artifacts job in CI verifies these files but does not generate them. By default, the summary script only searches `artifacts/` for JUnit XML files; if your results are elsewhere, pass a glob via `TEST_RESULTS_GLOBS`.
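To make the traceability flow concrete, the core of what a summary collector does can be sketched as below. This is not the actual implementation of `scripts/generate-ci-summary.ts`; it assumes requirement metadata is emitted as JUnit `<property>` elements and uses a regex rather than a real XML parser, purely for illustration:

```typescript
// Simplified sketch of a traceability collector. NOT the real
// scripts/generate-ci-summary.ts -- assumes requirement metadata appears
// as JUnit <property> elements and uses a regex instead of an XML parser.

interface TraceEntry {
  test: string;
  requirement: string;
}

function extractTraceEntries(junitXml: string): TraceEntry[] {
  const entries: TraceEntry[] = [];
  // Match each <testcase ...> ... </testcase> block and capture its name.
  const testcaseRe = /<testcase\b[^>]*\bname="([^"]*)"[^>]*>([\s\S]*?)<\/testcase>/g;
  for (const m of junitXml.matchAll(testcaseRe)) {
    const [, name, body] = m;
    // Look for a Requirement property inside the testcase body.
    const prop = body.match(/<property\b[^>]*\bname="Requirement"[^>]*\bvalue="([^"]*)"/);
    if (prop) entries.push({ test: name, requirement: prop[1] });
  }
  return entries;
}

const sample = `
<testsuite>
  <testcase name="[REQ-123] does something">
    <properties>
      <property name="Requirement" value="REQ-123"/>
    </properties>
  </testcase>
  <testcase name="unrelated test"/>
</testsuite>`;
```

A test case without a Requirement property, like the second one in the sample, simply produces no traceability entry, which is why recording the metadata matters.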
Pester tests should record traceability metadata by adding `Add-TestResult -Property` calls in each `It` block. At minimum, include an Owner, a Requirement ID, and an Evidence path so the framework can link tests back to requirements:
```powershell
It "[REQ-123] does something" {
    Add-TestResult -Property @{ Owner = 'DevTools'; Requirement = 'REQ-123'; Evidence = 'tests/pester/example.Tests.ps1' }
    # test body
}
```

For other test frameworks, prefix the test name with `[REQ-123]` or use an equivalent mechanism to embed the requirement ID. These properties are preferred over naming conventions when `scripts/generate-ci-summary.ts` builds the CI report.