Conversation
Pull request overview
This PR updates the packaging workflow to include a project README in plugin_package.zip, along with new logic to rewrite/remove README asset URLs during bundling.
Changes:
- Include a transformed `README.md` at the root of the generated plugin package zip.
- Add a markdown/HTML URL transformer for README assets (GitHub raw URL rewriting + external URL stripping).
- Extend playground fixtures and Vitest coverage to validate README inclusion and URL transformations.
Reviewed changes
Copilot reviewed 12 out of 13 changed files in this pull request and generated 9 comments.
| File | Description |
|---|---|
| src/commands/watch.ts | Import reordering only. |
| src/bundle/readme-assets.ts | New README markdown/HTML AST transformer + GitHub repo parsing + optional URL validation. |
| src/bundle/index.ts | Bundling now writes a transformed README into the plugin package directory before zipping. |
| package.json | Adds unified/remark-related dependencies and mdast types for the transformer. |
| pnpm-lock.yaml | Lockfile updates for new dependencies. |
| playgrounds/build-frontend/package.json | Adds repository field needed by new bundling behavior. |
| playgrounds/build-frontend/tests/build-frontend.spec.ts | Adds tests asserting README inclusion and URL rewrite/removal behavior. |
| playgrounds/build-frontend/README.md | Adds README fixture with local/external/data/HTML URL cases. |
| playgrounds/build-backend/package.json | Adds repository field needed by new bundling behavior. |
| playgrounds/build-backend/tests/build-backend.spec.ts | Adds tests asserting README inclusion and URL rewrite/removal behavior. |
| playgrounds/build-backend/README.md | Adds README fixture for backend playground. |
| assets/test.txt | Adds a fixture asset referenced by README tests. |
| assets/test.png | Adds a fixture “png” asset referenced by README HTML tests. |
Files not reviewed (1)
- pnpm-lock.yaml: Language not supported
```ts
// Copy README.md (always included, always at root)
const readmePath = path.join(cwd, "README.md");
try {
  await fs.access(readmePath);
} catch {
  throw new Error("README.md is required but not found in project root");
```
`bundlePackage` now throws if the project root has no `README.md`. This is a breaking behavior change for existing plugin projects that may not ship a README; consider making README inclusion optional (skipping it when missing) or gating it behind a config/CLI flag rather than hard-failing the build.
Suggested change:

```diff
-// Copy README.md (always included, always at root)
-const readmePath = path.join(cwd, "README.md");
-try {
-  await fs.access(readmePath);
-} catch {
-  throw new Error("README.md is required but not found in project root");
+// Copy README.md when present; otherwise use an empty placeholder so bundling
+// remains backward compatible for projects that do not ship a README.
+let readmePath = path.join(cwd, "README.md");
+try {
+  await fs.access(readmePath);
+} catch {
+  logInfo("README.md not found in project root; skipping README contents");
+  readmePath = path.join(pluginPackageDir, "README.md");
+  await fs.writeFile(readmePath, "", "utf-8");
```
```ts
// Read repository URL from project's package.json
const packageJsonPath = path.join(cwd, "package.json");
const packageJsonContent = await fs.readFile(packageJsonPath, "utf-8");
const packageJson = JSON.parse(packageJsonContent);
const repoUrl =
  typeof packageJson.repository === "string"
    ? packageJson.repository
    : packageJson.repository?.url;
if (!repoUrl) {
  throw new Error("package.json must have a valid 'repository' field");
}

// Parse GitHub repo info (default to main branch)
const repoInfo = parseGitHubRepoInfo(repoUrl, "main");

// Transform README image links to GitHub raw URLs
const transformedReadme = await transformReadmeImages(readmePath, repoInfo);

// Write transformed README to package directory
```
The bundler now hard-requires `package.json.repository` and assumes it points to GitHub. This will fail builds for projects without a `repository` field or hosted outside GitHub; consider making repo info optional (e.g., only rewriting links when a GitHub repo URL is available) and falling back to leaving README URLs unchanged when it isn't.
Suggested change:

```diff
-// Read repository URL from project's package.json
-const packageJsonPath = path.join(cwd, "package.json");
-const packageJsonContent = await fs.readFile(packageJsonPath, "utf-8");
-const packageJson = JSON.parse(packageJsonContent);
-const repoUrl =
-  typeof packageJson.repository === "string"
-    ? packageJson.repository
-    : packageJson.repository?.url;
-if (!repoUrl) {
-  throw new Error("package.json must have a valid 'repository' field");
-}
-// Parse GitHub repo info (default to main branch)
-const repoInfo = parseGitHubRepoInfo(repoUrl, "main");
-// Transform README image links to GitHub raw URLs
-const transformedReadme = await transformReadmeImages(readmePath, repoInfo);
-// Write transformed README to package directory
+// Read repository URL from project's package.json when available
+const packageJsonPath = path.join(cwd, "package.json");
+const originalReadme = await fs.readFile(readmePath, "utf-8");
+let transformedReadme = originalReadme;
+try {
+  const packageJsonContent = await fs.readFile(packageJsonPath, "utf-8");
+  const packageJson = JSON.parse(packageJsonContent);
+  const repoUrl =
+    typeof packageJson.repository === "string"
+      ? packageJson.repository
+      : packageJson.repository?.url;
+  if (repoUrl) {
+    // Parse GitHub repo info (default to main branch) and rewrite README image links
+    const repoInfo = parseGitHubRepoInfo(repoUrl, "main");
+    transformedReadme = await transformReadmeImages(readmePath, repoInfo);
+  }
+} catch {
+  // If package.json is missing, malformed, or repository is not a GitHub URL,
+  // leave README URLs unchanged.
+}
+// Write README to package directory
```
```ts
export function parseGitHubRepoInfo(
  repoUrl: string,
  branch = "main",
): GitHubRepoInfo {
  // Clean common URL prefixes/suffixes
  const cleanedUrl = repoUrl.replace(/^git\+/, "").replace(/\.git$/, "");
  const match = cleanedUrl.match(/github\.com[:/]([^/]+)\/([^/.]+)/);

  if (!match) {
    throw new Error(`Invalid GitHub repository URL: ${repoUrl}`);
  }

  const [, owner, repo] = match;

  // Check if running in GitHub Actions
  const isGitHubActions = process.env.GITHUB_ACTIONS === "true";

  let commitHash: string;
  if (isGitHubActions) {
    // Use GITHUB_SHA environment variable (automatically set in GitHub Actions)
    const githubSha = process.env.GITHUB_SHA;
    if (!githubSha) {
      throw new Error(
        "GITHUB_SHA environment variable is not set in GitHub Actions",
      );
    }
    commitHash = githubSha;
    logInfo(`Using commit hash from GITHUB_SHA: ${commitHash}`);
  } else {
    // Not in GitHub Actions: use placeholder and log transformation
    commitHash = "LOCAL_BUILD";
    logInfo(
      `Not running in GitHub Actions, skipping commit hash fetch. README image links will use placeholder URL.`,
    );
  }

  return { owner, repo, commitHash };
```
`parseGitHubRepoInfo` accepts a `branch` parameter but never uses it, and local builds set `commitHash` to `LOCAL_BUILD`, which produces `raw.githubusercontent.com` URLs that will 404. Either use the provided branch as the ref for non-GitHub-Actions builds (or detect the local git commit), or skip URL rewriting outside CI.
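As a sketch of that direction (a hypothetical helper with illustrative names, not the PR's code), the ref used in raw URLs could prefer `GITHUB_SHA` in CI and fall back to the supplied branch locally, so generated links stay resolvable:

```typescript
// Hypothetical helper: pick a ref for raw.githubusercontent.com URLs.
// Prefers the CI commit SHA; falls back to the branch name for local
// builds instead of a "LOCAL_BUILD" placeholder that would 404.
function resolveCommitRef(
  env: Record<string, string | undefined>,
  branch: string,
): string {
  if (env.GITHUB_ACTIONS === "true" && env.GITHUB_SHA) {
    return env.GITHUB_SHA;
  }
  return branch;
}

console.log(resolveCommitRef({ GITHUB_ACTIONS: "true", GITHUB_SHA: "abc123" }, "main"));
console.log(resolveCommitRef({}, "main"));
```

Passing the environment in explicitly (rather than reading `process.env` inside) also makes the fallback easy to unit-test.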
```ts
// Clean common URL prefixes/suffixes
const cleanedUrl = repoUrl.replace(/^git\+/, "").replace(/\.git$/, "");
const match = cleanedUrl.match(/github\.com[:/]([^/]+)\/([^/.]+)/);

if (!match) {
  throw new Error(`Invalid GitHub repository URL: ${repoUrl}`);
}
```
The repo parsing regex `cleanedUrl.match(/github\.com[:/]([^/]+)\/([^/.]+)/)` rejects valid GitHub repo names containing dots (e.g. `owner/my.repo`). Since `.git` is already stripped, the repo capture group can safely allow dots (stopping only at `/`).
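A minimal sketch of the relaxed pattern (the helper name is illustrative; it assumes the `.git` suffix is stripped first, as the existing code does):

```typescript
// Allow dots in the repo segment; stop only at "/" (or end of string).
// Assumes the trailing ".git" suffix has already been removed.
const REPO_PATTERN = /github\.com[:/]([^/]+)\/([^/]+)/;

function parseOwnerRepo(repoUrl: string): { owner: string; repo: string } | null {
  const cleaned = repoUrl.replace(/^git\+/, "").replace(/\.git$/, "");
  const match = cleaned.match(REPO_PATTERN);
  if (!match) return null;
  const [, owner, repo] = match;
  return { owner, repo };
}

console.log(parseOwnerRepo("git+https://github.com/acme/my.repo.git"));
```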
```ts
 * Checks if a URL is external (http or https).
 * data: URIs and fragment identifiers are not considered external.
 */
function isExternalUrl(url: string): boolean {
  if (url.startsWith("data:") || url.startsWith("#")) {
    return false;
  }
  try {
    const parsed = new URL(url);
    return parsed.protocol === "http:" || parsed.protocol === "https:";
  } catch {
    return false;
  }
```
`isExternalUrl` only returns `true` for http/https. Absolute URLs with other schemes (e.g. `mailto:`, `tel:`, `ftp:`) and protocol-relative URLs (`//example.com/...`) will be treated as "local" and rewritten into a GitHub raw URL, which breaks those links. Consider detecting any URL with a scheme (or a leading `//`) as external, and either preserve it or remove it explicitly.
Suggested change:

```diff
- * Checks if a URL is external (http or https).
- * data: URIs and fragment identifiers are not considered external.
- */
-function isExternalUrl(url: string): boolean {
-  if (url.startsWith("data:") || url.startsWith("#")) {
-    return false;
-  }
-  try {
-    const parsed = new URL(url);
-    return parsed.protocol === "http:" || parsed.protocol === "https:";
-  } catch {
-    return false;
-  }
+ * Checks if a URL is external/absolute.
+ * data: URIs and fragment identifiers are not considered external.
+ */
+function isExternalUrl(url: string): boolean {
+  if (url.startsWith("data:") || url.startsWith("#")) {
+    return false;
+  }
+  if (url.startsWith("//")) {
+    return true;
+  }
+  return /^[A-Za-z][A-Za-z\d+.-]*:/.test(url);
```
```ts
// Process each node type (async for URL validation via fetch)
for (const node of imageNodes) {
  await transformNodeUrl(node, repoInfo, "image");
}
for (const node of linkNodes) {
  await transformNodeUrl(node, repoInfo, "link");
}
for (const node of definitionNodes) {
  await transformNodeUrl(node, repoInfo, "definition");
}
for (const node of htmlNodes) {
  await transformHtmlNode(node, repoInfo);
}
```
`transformReadmeImages` validates/transforms URL nodes sequentially in `for...of` loops with `await`, which can be noticeably slow for READMEs with many links/images (especially in CI, where validation is enabled). Consider batching these transforms (e.g., `Promise.all` with a small concurrency limit) to reduce total build time.
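A sketch of what such batching could look like (a hypothetical helper, not part of the PR):

```typescript
// Hypothetical helper: run an async transform over items with at most
// `limit` operations in flight at once, preserving input order.
async function mapWithConcurrency<T, R>(
  items: T[],
  limit: number,
  fn: (item: T) => Promise<R>,
): Promise<R[]> {
  const results: R[] = new Array(items.length);
  let next = 0;
  async function worker(): Promise<void> {
    while (next < items.length) {
      const i = next++; // claim an index synchronously, then await the work
      results[i] = await fn(items[i]);
    }
  }
  const workers = Array.from({ length: Math.min(limit, items.length) }, worker);
  await Promise.all(workers);
  return results;
}

// Example: transform 5 items with at most 3 concurrent operations.
mapWithConcurrency([1, 2, 3, 4, 5], 3, async (n) => n * n).then((squares) =>
  console.log(squares),
);
```

The node transforms above could then be grouped per node kind and passed through such a mapper instead of awaiting one URL at a time.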
```json
"jszip": "3.10.1",
"ws": "8.18.0",
"zod": "3.24.1",
"remark": "^15.0.1",
"remark-parse": "^11.0.0",
"remark-stringify": "^11.0.0",
"tsup": "8.3.5",
"unified": "^11.0.5",
"unist-util-visit": "^5.1.0",
```
`remark` is added as a runtime dependency but isn't imported or used anywhere in the codebase (only `remark-parse` and `remark-stringify` are). Removing the unused dependency will reduce install size and lockfile churn.
```ts
async function validateRawUrl(rawUrl: string): Promise<boolean> {
  if (process.env.GITHUB_ACTIONS !== "true") {
    return true;
  }

  try {
    const response = await fetch(rawUrl, { method: "HEAD" });
    if (!response.ok) {
      logInfo(`Warning: Asset not found at ${rawUrl}, removing reference`);
      return false;
    }
    return true;
  } catch {
    logInfo(
      `Warning: Failed to validate asset URL ${rawUrl}, removing reference`,
    );
    return false;
  }
```
On any transient network error during HEAD validation (the catch block), `validateRawUrl` returns `false`, which causes the README link to be removed from the packaged output. This makes builds non-deterministic and can silently strip valid links. Prefer treating validation failures as "unknown" and keeping the rewritten (or original) URL, removing links only on a definitive non-2xx response.
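One way to sketch that distinction (the `ValidationResult` type and both helpers are hypothetical, not the PR's code) is to separate the fetch outcome from the keep/remove decision, so only a definitive non-2xx response removes a link:

```typescript
// Hypothetical three-state outcome for a HEAD validation attempt.
type ValidationResult = "valid" | "missing" | "unknown";

// Classify a fetch attempt without conflating network errors with 404s.
async function classifyRawUrl(rawUrl: string): Promise<ValidationResult> {
  try {
    const response = await fetch(rawUrl, { method: "HEAD" });
    return response.ok ? "valid" : "missing";
  } catch {
    return "unknown"; // transient network error: not proof the asset is gone
  }
}

// Decide whether to keep a README link. Only a definitive "missing"
// (non-2xx response) removes it; "unknown" keeps the URL so builds
// stay deterministic under flaky networks.
function shouldKeepLink(result: ValidationResult): boolean {
  return result !== "missing";
}

console.log(shouldKeepLink("unknown")); // true: transient failures keep the link
```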
```ts
it("should remove external image URLs (http, https, data)", async () => {
  const zipPath = path.resolve(__dirname, "../dist/plugin_package.zip");
  const readmeContent = await getZipFileContent(zipPath, "README.md");
  expect(readmeContent).toBeDefined();
  // Verify that external URLs are removed (set to empty string)
  // The README has:
  // After transformation, it should be: ![External Image]()
  expect(readmeContent).toMatch(/!\[External Image\]\(\)/);
```
This test name says it removes external image URLs "(http, https, data)", but the fixture README only contains an https external URL, and the implementation intentionally preserves `data:` URIs. Consider renaming the test to match the actually covered behavior (or add a `data:` case if you want to enforce it).
… unit tests for image compression function
Bundle README.md in the plugin_package.zip