fix: streamline pr and release workflow (#289)

Co-authored-by: sck_0 <samujackson1337@gmail.com>
Author: sickn33
Date: 2026-03-13 14:20:49 +01:00
Committed by: GitHub
parent 5655f9b0a8
commit e325b0ee30
17 changed files with 1100 additions and 172 deletions


@@ -35,7 +35,11 @@ If you touch **any of these**:
- Running `npm run chain` is **NOT optional**.
- Running `npm run catalog` is **NOT optional**.
For contributor PRs, generated drift is now **informational** in CI because shared registry artifacts are auto-synced on `main` after merge. Contributors should still run the chain locally so the PR content is reviewable and maintainers can reproduce the generated output when needed.
For contributor PRs, the contract is now **source-only**:
- contributors should not commit `CATALOG.md`, `skills_index.json`, or `data/*.json`
- PR CI previews generated drift but does not require those files in the branch
- `main` remains the only canonical owner of derived registry artifacts
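A quick local check for the source-only contract is to diff the branch against `main` restricted to the derived paths. A minimal sketch (file list taken from the contract above; a local `main` ref and the illustrative name `check_source_only` are assumptions):

```bash
# Sketch: list derived registry files a feature branch touches vs. main.
# Empty output means the branch honors the source-only contract.
derived_paths="CATALOG.md skills_index.json data/skills_index.json \
data/catalog.json data/bundles.json data/aliases.json"

check_source_only() {
  base="${1:-main}"
  # word splitting of $derived_paths is intentional
  git diff --name-only "$base...HEAD" -- $derived_paths
}
```

Any file this prints should be dropped from the branch before the PR is opened.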
If `main` CI fails with:
@@ -78,11 +82,11 @@ Before ANY commit that adds/modifies skills, run the chain:
3. **COMMIT GENERATED FILES**:
```bash
git add README.md skills_index.json data/catalog.json data/bundles.json data/aliases.json CATALOG.md
git add README.md skills_index.json data/skills_index.json data/catalog.json data/bundles.json data/aliases.json CATALOG.md
git commit -m "chore: sync generated files"
```
> 🔴 **CRITICAL for direct `main` work**: If you skip this on maintainer work that lands directly on `main`, CI will fail with "Detected uncommitted changes".
> For contributor PRs, generated drift is allowed in CI and is auto-synced after merge.
> For contributor PRs, do **not** include derived registry artifacts. CI blocks direct edits to those files and previews drift separately.
> See [`docs/maintainers/ci-drift-fix.md`](../docs/maintainers/ci-drift-fix.md) for details.
### B. When You Merge a PR (Step-by-Step)
@@ -92,14 +96,14 @@ Before ANY commit that adds/modifies skills, run the chain:
**Before merging:**
1. **CI is green** — Validation, reference checks, tests, and generated artifact steps passed (see [`.github/workflows/ci.yml`](workflows/ci.yml)).
2. **Generated drift understood** — On pull requests, generator drift is informational only. Do not block a good PR solely because `README.md`, `CATALOG.md`, or catalog/index files would be regenerated. `main` auto-syncs those artifacts after merge.
2. **Generated drift understood** — On pull requests, generator drift is informational only. Do not block a good PR solely because canonical artifacts would be regenerated. Also do not accept PRs that directly edit `CATALOG.md`, `skills_index.json`, or `data/*.json`; those files are `main`-owned.
3. **Quality Bar** — PR description confirms the [Quality Bar Checklist](.github/PULL_REQUEST_TEMPLATE.md) (metadata, risk label, credits if applicable).
4. **Issue link** — If the PR fixes an issue, the PR description should contain `Closes #N` or `Fixes #N` so GitHub auto-closes the issue on merge.
**How you merge:**
- **Always merge via GitHub** so the PR shows as **Merged** and the contributor gets credit. Use **"Squash and merge"**. Do **not** integrate locally and then close the PR — that would show "Closed" and the contributor would not get proper attribution.
- **If the PR has merge conflicts:** Resolve them **on the PR branch** (you or the contributor: merge `main` into the PR branch, fix conflicts, run `npm run chain` and `npm run catalog` if needed, push). For generated registry files, prefer keeping `main`'s side and regenerating rather than hand-editing conflicts. Then use **"Squash and merge"** on GitHub. Full steps: [docs/maintainers/merging-prs.md](../docs/maintainers/merging-prs.md).
- **If the PR has merge conflicts:** Resolve them **on the PR branch** (you or the contributor: merge `main` into the PR branch, fix conflicts, drop derived registry files from the branch if they appear, push). For generated registry files, prefer keeping `main`'s side rather than hand-editing conflicts. Then use **"Squash and merge"** on GitHub. Full steps: [docs/maintainers/merging-prs.md](../docs/maintainers/merging-prs.md).
- **Rare exception:** Only if merging via GitHub is not possible, you may integrate locally and close the PR; in that case you **must** add a Co-authored-by line to the commit and explain in a comment. Prefer to avoid this so PRs are always **Merged**.
**If a PR was closed after local integration (reopen and merge):**
@@ -116,8 +120,8 @@ If a PR was integrated via local squash and then **closed** (so it shows "Closed
```bash
git merge origin/main -m "chore: merge main to resolve conflicts"
```
For conflicts in generated/registry files (`README.md`, `CATALOG.md`, `data/catalog.json`, etc.), keep **main's version**:
`git checkout --theirs README.md CATALOG.md data/catalog.json` (and any other conflicted files), then `git add` them.
For conflicts in generated/registry files (`CATALOG.md`, `data/catalog.json`, etc.), keep **main's version** and remove those derived files from the PR branch:
`git checkout --theirs CATALOG.md data/catalog.json` (and any other derived files), then `git add` them.
4. **Commit the merge** (if not already done):
`git commit -m "chore: merge main to resolve conflicts" --no-edit`
5. **Push to the contributor's fork.** Add their fork as a remote if needed (replace `USER` and `BRANCH` with the PR head owner and branch from the PR page):
@@ -250,27 +254,24 @@ Reject any PR that fails this:
When cutting a new version, follow the maintainer playbook in [`docs/maintainers/release-process.md`](../docs/maintainers/release-process.md).
**Release checklist (order matters):**
Operational verification → Changelog → Bump `package.json` (and README if needed) → Commit & push → Create GitHub Release with tag matching `package.json` → npm publish (manual or via CI) → Close remaining linked issues.
Preflight verification → Changelog → `npm run release:prepare -- X.Y.Z` → `npm run release:publish -- X.Y.Z` → npm publish (manual or via CI) → Close remaining linked issues.
---
1. **Run release verification**:
```bash
npm run validate
npm run validate:references
npm run sync:all
npm run test
npm run app:build
npm run release:preflight
```
Optional diagnostic pass:
```bash
npm run validate:strict
```
2. **Update Changelog**: Add the new release section to `CHANGELOG.md`.
3. **Bump Version**:
- Update `package.json` → `"version": "X.Y.Z"` (source of truth for npm).
- Update version header in `README.md` if it displays the number.
- One-liner: `npm version patch` (or `minor`/`major`) — bumps `package.json` and creates a git tag; then amend if you need to tag after release.
3. **Prepare commit and tag locally**:
```bash
npm run release:prepare -- X.Y.Z
```
This validates the release, aligns versioned files, writes the release notes artifact, creates the release commit, and creates the local tag.
4. **Create GitHub Release** (REQUIRED):
> ⚠️ **CRITICAL**: Pushing a tag (`git push --tags`) is NOT enough. You must create a **GitHub Release Object** for it to appear in the sidebar and trigger the NPM publish workflow.
@@ -278,9 +279,7 @@ Operational verification → Changelog → Bump `package.json` (and README if ne
Use the GitHub CLI:
```bash
# Prepare release notes (copy the new section from CHANGELOG.md into docs/maintainers/release-process.md, or use CHANGELOG excerpt)
# Then create the tag AND the release page (tag must match package.json version, e.g. v4.1.0)
gh release create v4.0.0 --title "v4.0.0 - [Theme Name]" --notes-file docs/maintainers/release-process.md
npm run release:publish -- X.Y.Z
```
**Important:** The release tag must match `package.json`'s version. The [Publish to npm](workflows/publish-npm.yml) workflow runs on **Release published** and will run `npm publish`; npm rejects republishing the same version.
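Since the tag is what triggers the publish workflow, the match can be guarded mechanically before the release is created. A minimal sketch (plain shell with `sed` only; `tag_matches_pkg` is an illustrative name, and the release scripts presumably perform an equivalent check):

```bash
# Sketch: check that a candidate tag (e.g. v4.1.0) matches package.json.
tag_matches_pkg() {
  pkg_version=$(sed -n 's/.*"version": *"\([^"]*\)".*/\1/p' package.json | head -n 1)
  [ "v$pkg_version" = "$1" ]
}
```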


@@ -2,6 +2,18 @@
Please include a summary of the change and which skill is added or fixed.
## Change Classification
- [ ] Skill PR
- [ ] Docs PR
- [ ] Infra PR
## Issue Link (Optional)
Use this only when the PR should auto-close an issue:
`Closes #N` or `Fixes #N`
## Quality Bar Checklist ✅
**All items must be checked before merging.**
@@ -12,13 +24,9 @@ Please include a summary of the change and which skill is added or fixed.
- [ ] **Triggers**: The "When to use" section is clear and specific.
- [ ] **Security**: If this is an _offensive_ skill, I included the "Authorized Use Only" disclaimer.
- [ ] **Local Test**: I have verified the skill works locally.
- [ ] **Repo Checks**: I ran `npm run validate:references` if my change affected docs, bundles, workflows, or generated artifacts.
- [ ] **Repo Checks**: I ran `npm run validate:references` if my change affected docs, workflows, or infrastructure.
- [ ] **Source-Only PR**: I did not manually include generated registry artifacts (`CATALOG.md`, `skills_index.json`, `data/*.json`) in this PR.
- [ ] **Credits**: I have added the source credit in `README.md` (if applicable).
## Type of Change
- [ ] New Skill (Feature)
- [ ] Documentation Update
- [ ] Infrastructure
- [ ] **Maintainer Edits**: I enabled **Allow edits from maintainers** on the PR.
## Screenshots (if applicable)


@@ -1,8 +1,5 @@
name: Skills Registry CI
permissions:
contents: write
on:
push:
branches: ["main"]
@@ -10,9 +7,65 @@ on:
branches: ["main"]
workflow_dispatch:
permissions:
contents: read
jobs:
validate-and-build:
pr-policy:
if: github.event_name == 'pull_request'
runs-on: ubuntu-latest
outputs:
primary_category: ${{ steps.intake.outputs.primary_category }}
categories: ${{ steps.intake.outputs.categories }}
requires_references: ${{ steps.intake.outputs.requires_references }}
direct_derived_changes_count: ${{ steps.intake.outputs.direct_derived_changes_count }}
has_quality_checklist: ${{ steps.intake.outputs.has_quality_checklist }}
has_issue_link: ${{ steps.intake.outputs.has_issue_link }}
steps:
- uses: actions/checkout@v4
with:
fetch-depth: 0
- name: Set up Node
uses: actions/setup-node@v4
with:
node-version: "lts/*"
- name: Fetch base branch
run: git fetch origin "${{ github.base_ref }}"
- name: Intake PR change
id: intake
run: |
node tools/scripts/pr_preflight.js \
--base "origin/${{ github.base_ref }}" \
--head "HEAD" \
--event-path "$GITHUB_EVENT_PATH" \
--no-run \
--write-github-output \
--write-step-summary
- name: Enforce PR source-only contract
run: |
if [ "${{ steps.intake.outputs.direct_derived_changes_count }}" != "0" ]; then
echo "Pull requests must stay source-only."
echo "Remove derived files and let main regenerate them after merge."
exit 1
fi
if [ "${{ steps.intake.outputs.has_quality_checklist }}" != "true" ]; then
echo "PR body must include the Quality Bar Checklist from the template."
exit 1
fi
if [ "${{ steps.intake.outputs.has_issue_link }}" != "true" ]; then
echo "::notice::No Closes/Fixes issue link detected in the PR body."
fi
source-validation:
if: github.event_name == 'pull_request'
runs-on: ubuntu-latest
needs: pr-policy
steps:
- uses: actions/checkout@v4
@@ -21,9 +74,8 @@ jobs:
with:
python-version: "3.10"
- name: Install dependencies
run: |
pip install pyyaml
- name: Install Python dependencies
run: pip install pyyaml
- name: Set up Node
uses: actions/setup-node@v4
@@ -42,21 +94,12 @@ jobs:
test -f README.md
test -f CONTRIBUTING.md
- name: 🔍 Validate Skills (Soft Mode)
run: |
npm run validate
- name: Validate source changes
run: npm run validate
- name: 🔗 Validate References
run: |
npm run validate:references
- name: 🏗️ Generate Index
run: |
npm run index
- name: 📝 Update README
run: |
npm run readme
- name: Validate references
if: needs.pr-policy.outputs.requires_references == 'true'
run: npm run validate:references
- name: Audit npm dependencies
run: npm audit --audit-level=high
@@ -67,67 +110,152 @@ jobs:
ENABLE_NETWORK_TESTS: "1"
run: npm run test
- name: 📦 Build catalog
artifact-preview:
if: github.event_name == 'pull_request'
runs-on: ubuntu-latest
needs: [pr-policy, source-validation]
steps:
- uses: actions/checkout@v4
- name: Set up Python
uses: actions/setup-python@v5
with:
python-version: "3.10"
- name: Install Python dependencies
run: pip install pyyaml
- name: Set up Node
uses: actions/setup-node@v4
with:
node-version: "lts/*"
- name: Install npm dependencies
run: npm ci
- name: Generate canonical artifacts preview
run: |
npm run chain
npm run catalog
- name: Report generated drift
run: |
managed_files=$(node tools/scripts/generated_files.js --shell --include-mixed)
drift_files=$(git diff --name-only -- $managed_files)
{
echo "## Artifact Preview"
echo
echo "- Primary change: \`${{ needs.pr-policy.outputs.primary_category }}\`"
echo "- Categories: \`${{ needs.pr-policy.outputs.categories }}\`"
echo "- Derived-file policy: PRs remain source-only; main will canonicalize final generated outputs."
echo
} >> "$GITHUB_STEP_SUMMARY"
if [ -z "$drift_files" ]; then
echo "No generated drift detected after preview."
echo "- Generated drift: none" >> "$GITHUB_STEP_SUMMARY"
exit 0
fi
echo "::notice::Generated drift detected in artifact preview."
{
echo "- Generated drift: detected"
echo
echo "Predicted file updates:"
printf '%s\n' "$drift_files" | sed 's/^/- `/; s/$/`/'
} >> "$GITHUB_STEP_SUMMARY"
main-validation-and-sync:
if: github.event_name != 'pull_request'
runs-on: ubuntu-latest
permissions:
contents: write
steps:
- uses: actions/checkout@v4
- name: Set up Python
uses: actions/setup-python@v5
with:
python-version: "3.10"
- name: Install Python dependencies
run: pip install pyyaml
- name: Set up Node
uses: actions/setup-node@v4
with:
node-version: "lts/*"
- name: Install npm dependencies
run: npm ci
- name: Verify directory structure
run: |
test -d skills/
test -d apps/web-app/
test -d tools/scripts/
test -d tools/lib/
test -f README.md
test -f CONTRIBUTING.md
- name: Validate skills
run: npm run validate
- name: Validate references
run: npm run validate:references
- name: Generate index
run: npm run index
- name: Update README
run: npm run readme
- name: Audit npm dependencies
run: npm audit --audit-level=high
continue-on-error: true
- name: Run tests
env:
ENABLE_NETWORK_TESTS: "1"
run: npm run test
- name: Build catalog
run: npm run catalog
- name: Set up GitHub credentials (for auto-sync)
- name: Set up GitHub credentials
if: github.event_name == 'push' && github.ref == 'refs/heads/main'
run: |
git config user.name 'github-actions[bot]'
git config user.email 'github-actions[bot]@users.noreply.github.com'
git remote set-url origin https://x-access-token:${{ secrets.GITHUB_TOKEN }}@github.com/${{ github.repository }}.git
- name: Auto-commit registry drift (main only)
- name: Auto-commit canonical artifacts
if: github.event_name == 'push' && github.ref == 'refs/heads/main'
run: |
# If no changes, exit successfully
managed_files=$(node tools/scripts/generated_files.js --shell --include-mixed)
git diff --quiet && exit 0
# Pull with rebase to integrate remote changes
git pull origin main --rebase || true
git add $managed_files || true
git add README.md skills_index.json data/skills_index.json data/catalog.json data/bundles.json data/aliases.json CATALOG.md || true
# If nothing to commit, exit successfully
git diff --cached --quiet && exit 0
git commit -m "chore: sync generated registry files [ci skip]"
git push origin HEAD
- name: Report generated drift (PRs only)
if: github.event_name == 'pull_request'
run: |
if git diff --quiet; then
echo "No generated drift detected after validation/build."
exit 0
fi
echo "::notice::Generated registry/readme drift detected on this PR."
echo "This is informational only on pull requests because main auto-syncs generated artifacts after merge."
echo "Files changed by generators:"
git diff --name-only
{
echo "## Generated Drift"
echo
echo "This PR changes source files that regenerate shared registry artifacts."
echo "The drift is allowed on pull requests and will be auto-synced on \`main\` after merge."
echo
echo "Changed generated files:"
git diff --name-only | sed 's/^/- `/; s/$/`/'
} >> "$GITHUB_STEP_SUMMARY"
- name: 🚨 Check for Uncommitted Drift
- name: Check for uncommitted drift
if: github.event_name == 'push' && github.ref == 'refs/heads/main'
run: |
if ! git diff --quiet; then
echo "❌ Detected uncommitted changes produced by registry/readme/catalog scripts."
echo
echo "Main must be self-healing after the auto-sync step."
echo "To fix locally, run the FULL Validation Chain, then commit and push:"
echo "To fix locally, run the canonical maintainer flow:"
echo " npm run release:preflight"
echo " npm run chain"
echo " npm run catalog"
echo " git add README.md skills_index.json data/skills_index.json data/catalog.json data/bundles.json data/aliases.json CATALOG.md"
echo " git commit -m \"chore: sync generated registry files\""
echo " git push"
echo " git status"
exit 1
fi


@@ -29,6 +29,8 @@ git commit -m "feat: add my-awesome-skill for [purpose]"
git push origin my-branch
```
Open the PR with the default template and enable **Allow edits from maintainers** so conflicts can be resolved without extra back-and-forth.
If you only want to improve docs, editing directly in GitHub is still perfectly fine.
---
@@ -219,6 +221,15 @@ More examples...
Recommended validation path:
For a **skill-only PR**:
```bash
npm install
npm run validate
```
For **docs / workflows / infra changes**:
```bash
npm install
npm run validate
@@ -226,6 +237,12 @@ npm run validate:references
npm test
```
Optional maintainer-style preflight:
```bash
npm run pr:preflight
```
Python-only fallback:
```bash
@@ -239,6 +256,15 @@ This checks:
- ✅ Description exists
- ✅ Reference data and docs bundles stay coherent
Do **not** commit generated registry artifacts in a normal PR. These files are canonicalized on `main` after merge:
- `CATALOG.md`
- `skills_index.json`
- `data/skills_index.json`
- `data/catalog.json`
- `data/bundles.json`
- `data/aliases.json`
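If any of these were committed by accident, they can be untracked before the PR is opened. A minimal sketch (same file list as above; `drop_derived` is an illustrative name):

```bash
# Sketch: untrack derived registry artifacts on the current branch.
# --cached leaves the working-copy files on disk;
# --ignore-unmatch keeps this safe when a file was never committed.
drop_derived() {
  git rm -q --cached --ignore-unmatch \
    CATALOG.md skills_index.json data/skills_index.json \
    data/catalog.json data/bundles.json data/aliases.json
}
```

Follow with a commit so the branch no longer carries the files.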
Optional hardening pass:
```bash
@@ -386,8 +412,10 @@ Before submitting your contribution:
- [ ] I've included examples
- [ ] I've tested the skill with an AI assistant
- [ ] I've run `npm run validate`
- [ ] I've run `npm run validate:references` and `npm test` when my change affects docs, bundles, workflows, or generated artifacts
- [ ] I've run `npm run validate:references` and `npm test` when my change affects docs, workflows, or infrastructure
- [ ] I did **not** include generated registry artifacts (`CATALOG.md`, `skills_index.json`, `data/*.json`) in this PR
- [ ] My commit message is clear (e.g., "feat: add docker-compose skill")
- [ ] I enabled **Allow edits from maintainers** on the PR
- [ ] I've checked for typos and grammar
---


@@ -13,7 +13,7 @@ Scripts like `tools/scripts/generate_index.py`, `tools/scripts/update_readme.py`
## Pull Requests vs Main
- **Pull requests**: generated drift is reported as an informational notice only. Shared files like `README.md`, `CATALOG.md`, and `data/catalog.json` can legitimately move as other PRs merge. Do not treat PR drift as a merge blocker by itself.
- **Pull requests**: PRs are now **source-only**. Contributors should not commit derived registry artifacts (`CATALOG.md`, `skills_index.json`, `data/*.json`). CI blocks those direct edits and reports generated drift as an informational preview only.
- **`main` pushes**: drift is still strict. `main` must end the workflow clean after the auto-sync step.
## How to Fix on `main`
@@ -34,16 +34,16 @@ Scripts like `tools/scripts/generate_index.py`, `tools/scripts/update_readme.py`
3. Commit and push any updates:
```bash
git add README.md skills_index.json data/catalog.json data/bundles.json data/aliases.json CATALOG.md
git add README.md skills_index.json data/skills_index.json data/catalog.json data/bundles.json data/aliases.json CATALOG.md
git commit -m "chore: sync generated registry files"
git push
```
## Maintainer guidance for PRs
- Validate the source change.
- If merge conflicts touch generated registry files, keep `main`'s version for those files, rerun `npm run chain` and `npm run catalog`, and push the refreshed branch.
- Let `main` auto-sync the final generated artifact set after merge.
- Validate the source change, not the absence of committed generated artifacts.
- If a contributor PR includes direct edits to `CATALOG.md`, `skills_index.json`, or `data/*.json`, ask them to drop those files from the PR or remove them while refreshing the branch.
- If merge conflicts touch generated registry files, keep `main`'s version for those files and let `main` auto-sync the final generated artifact set after merge.
**Summary**:
Use generator drift as a hard failure only on `main`. On PRs, it is expected maintenance noise around shared generated artifacts and should be handled with branch refreshes, not blanket rejection.
Use generator drift as a hard failure only on `main`. On PRs, the contract is simpler: source-only changes are reviewed, generated output is previewed, and `main` produces the final canonical artifact set.


@@ -14,8 +14,9 @@ Resolve conflicts **on the PR branch** so the PR becomes mergeable, then use "Sq
### Generated files policy
- Treat `README.md`, `CATALOG.md`, `skills_index.json`, and `data/*.json` as **derived artifacts**, not hand-edited source of truth.
- If those files conflict during a PR refresh, prefer **`main`'s side**, then rerun `npm run chain` and `npm run catalog`.
- Treat `CATALOG.md`, `skills_index.json`, and `data/*.json` as **derived artifacts**, not contributor-owned source files.
- `README.md` has mixed ownership: contributor prose edits are allowed, but workflow-managed metadata is canonicalized on `main`.
- If derived files appear in a PR refresh or merge conflict, prefer **`main`'s side** and remove them from the PR branch instead of hand-maintaining them there.
- Do not block a PR only because shared generated files would be regenerated differently after other merges. `main` auto-syncs the final state after merge.
### Steps (maintainer resolves conflicts on the contributor's branch)
@@ -26,9 +27,9 @@ Resolve conflicts **on the PR branch** so the PR becomes mergeable, then use "Sq
`git checkout pr-<PR_NUMBER>`
3. **Merge `main` into it**
`git merge origin/main`
Resolve any conflicts in the working tree. For generated registry files (`README.md`, `CATALOG.md`, `data/*.json`, `skills_index.json`), prefer `main`'s version and regenerate:
`git checkout --theirs README.md CATALOG.md data/catalog.json skills_index.json`
Then run `npm run chain` and `npm run catalog`, and `git add` the refreshed generated files.
Resolve any conflicts in the working tree. For generated registry files (`CATALOG.md`, `data/*.json`, `skills_index.json`), prefer `main`'s version and remove them from the contributor branch:
`git checkout --theirs CATALOG.md data/catalog.json skills_index.json`
If `README.md` conflicts only because of workflow-managed metadata, prefer `main`'s side there too. Keep contributor prose edits when they are real source changes.
4. **Commit the merge**
`git add .` then `git commit -m "chore: merge main to resolve conflicts"` (or leave the default merge message).
5. **Push to the same branch the PR is from**
@@ -46,8 +47,8 @@ Ask them to:
git checkout <their-branch>
git fetch origin main
git merge origin/main
# resolve conflicts, then:
npm run chain && npm run catalog # if they touched skills/ or registry
# resolve conflicts, then drop derived files from the PR if they appear:
# CATALOG.md, skills_index.json, data/*.json
git add .
git commit -m "chore: merge main to resolve conflicts"
git push origin <their-branch>
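One subtlety in the merge steps above: because `main` is merged *into* the PR branch, `--theirs` refers to `main`'s side of a conflicted file. A minimal sketch demonstrating that direction (`keep_mains_side` is an illustrative helper, not a repo script):

```bash
# Sketch: while a merge of main into the PR branch is conflicted,
# keep main's version of the given files ("theirs" == main here).
keep_mains_side() {
  for f in "$@"; do
    git checkout --theirs -- "$f" && git add -- "$f"
  done
}
```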


@@ -4,21 +4,17 @@ This is the maintainer playbook for cutting a repository release. Historical rel
## Preconditions
- The working tree is clean, or you have explicitly isolated the release changes.
- `package.json` contains the version you intend to publish.
- Generated registry files are synchronized.
- The tracked working tree is clean.
- You are on `main`.
- `CHANGELOG.md` already contains the release section you intend to publish.
- README counts, badges, and acknowledgements are up to date.
## Release Checklist
1. Run the operational verification suite:
1. Run the scripted preflight:
```bash
npm run validate
npm run validate:references
npm run sync:all
npm run test
npm run app:build
npm run release:preflight
```
2. Optional hardening pass:
@@ -35,26 +31,38 @@ Use this as a diagnostic signal. It is useful for spotting legacy quality debt,
- Confirm `README.md` reflects the current version and generated counts.
- Confirm Credits & Sources, contributors, and support links are still correct.
4. Create the release commit and tag:
4. Prepare the release commit and tag locally:
```bash
git add README.md CHANGELOG.md CATALOG.md data/ skills_index.json package.json package-lock.json
git commit -m "chore: release vX.Y.Z"
git tag vX.Y.Z
npm run release:prepare -- X.Y.Z
```
This command:
- checks `CHANGELOG.md` for `X.Y.Z`
- aligns `package.json` / `package-lock.json`
- runs the full release suite
- refreshes release metadata in `README.md`
- stages canonical release files
- creates `chore: release vX.Y.Z`
- creates the local tag `vX.Y.Z`
5. Publish the GitHub release:
```bash
gh release create vX.Y.Z --title "vX.Y.Z" --notes-file CHANGELOG.md
npm run release:publish -- X.Y.Z
```
This command pushes `main`, pushes `vX.Y.Z`, and creates the GitHub release object from the matching `CHANGELOG.md` section.
6. Publish to npm if needed:
```bash
npm publish
```
Normally this still happens via the existing GitHub release workflow after the GitHub release is published.
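The changelog requirement enforced by `release:prepare` amounts to a heading lookup in `CHANGELOG.md`. A minimal sketch of the idea (assumed behavior, not the script's exact implementation):

```bash
# Sketch: succeed only if CHANGELOG.md has a "## [X.Y.Z]" section.
changelog_has_version() {
  grep -q "^## \[$1\]" CHANGELOG.md
}
```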
## Rollback Notes
- If the release tag is wrong, delete the tag locally and remotely before republishing.


@@ -14,6 +14,10 @@
"sync:all": "npm run sync:metadata && npm run chain",
"catalog": "node tools/scripts/build-catalog.js",
"build": "npm run chain && npm run catalog",
"pr:preflight": "node tools/scripts/pr_preflight.js",
"release:preflight": "node tools/scripts/release_workflow.js preflight",
"release:prepare": "node tools/scripts/release_workflow.js prepare",
"release:publish": "node tools/scripts/release_workflow.js publish",
"test": "node tools/scripts/tests/run-test-suite.js",
"test:local": "node tools/scripts/tests/run-test-suite.js --local",
"test:network": "node tools/scripts/tests/run-test-suite.js --network",


@@ -0,0 +1,19 @@
{
"derivedFiles": [
"CATALOG.md",
"skills_index.json",
"data/skills_index.json",
"data/catalog.json",
"data/bundles.json",
"data/aliases.json"
],
"mixedFiles": [
"README.md"
],
"releaseManagedFiles": [
"CHANGELOG.md",
"package.json",
"package-lock.json",
"README.md"
]
}


@@ -0,0 +1,181 @@
const fs = require("fs");
const path = require("path");
const { findProjectRoot } = require("./project-root");
const DOC_PREFIXES = ["docs/"];
const DOC_FILES = new Set(["README.md", "CONTRIBUTING.md", "CHANGELOG.md", "walkthrough.md"]);
const INFRA_PREFIXES = [".github/", "tools/", "apps/"];
const INFRA_FILES = new Set(["package.json", "package-lock.json"]);
const REFERENCES_PREFIXES = ["docs/", ".github/", "tools/", "apps/", "data/"];
const REFERENCES_FILES = new Set([
"README.md",
"CONTRIBUTING.md",
"CHANGELOG.md",
"walkthrough.md",
"package.json",
"package-lock.json",
]);
function normalizeRepoPath(filePath) {
return String(filePath || "").replace(/\\/g, "/").replace(/^\.\//, "");
}
function escapeRegExp(value) {
return value.replace(/[.*+?^${}()|[\]\\]/g, "\\$&");
}
function loadWorkflowContract(startDir = __dirname) {
const projectRoot = findProjectRoot(startDir);
const configPath = path.join(projectRoot, "tools", "config", "generated-files.json");
const rawConfig = fs.readFileSync(configPath, "utf8");
const config = JSON.parse(rawConfig);
return {
projectRoot,
configPath,
derivedFiles: config.derivedFiles.map(normalizeRepoPath),
mixedFiles: config.mixedFiles.map(normalizeRepoPath),
releaseManagedFiles: config.releaseManagedFiles.map(normalizeRepoPath),
};
}
function getManagedFiles(contract, options = {}) {
const includeMixed = Boolean(options.includeMixed);
const includeReleaseManaged = Boolean(options.includeReleaseManaged);
const managedFiles = [...contract.derivedFiles];
if (includeMixed) {
managedFiles.push(...contract.mixedFiles);
}
if (includeReleaseManaged) {
managedFiles.push(...contract.releaseManagedFiles);
}
return [...new Set(managedFiles.map(normalizeRepoPath))];
}
function isDerivedFile(filePath, contract) {
return contract.derivedFiles.includes(normalizeRepoPath(filePath));
}
function isMixedFile(filePath, contract) {
return contract.mixedFiles.includes(normalizeRepoPath(filePath));
}
function isDocLikeFile(filePath) {
const normalized = normalizeRepoPath(filePath);
return normalized.endsWith(".md") || DOC_FILES.has(normalized) || DOC_PREFIXES.some((prefix) => normalized.startsWith(prefix));
}
function isInfraLikeFile(filePath) {
const normalized = normalizeRepoPath(filePath);
return (
INFRA_FILES.has(normalized) ||
INFRA_PREFIXES.some((prefix) => normalized.startsWith(prefix))
);
}
function classifyChangedFiles(changedFiles, contract) {
const categories = new Set();
const normalizedFiles = changedFiles.map(normalizeRepoPath).filter(Boolean);
for (const filePath of normalizedFiles) {
if (isDerivedFile(filePath, contract)) {
continue;
}
const isSkillPath = filePath.startsWith("skills/");
if (isSkillPath) {
categories.add("skill");
}
if (!isSkillPath && (isDocLikeFile(filePath) || isMixedFile(filePath, contract))) {
categories.add("docs");
}
if (isInfraLikeFile(filePath)) {
categories.add("infra");
}
}
const orderedCategories = ["skill", "docs", "infra"].filter((category) => categories.has(category));
let primaryCategory = "none";
if (orderedCategories.includes("infra")) {
primaryCategory = "infra";
} else if (orderedCategories.includes("skill")) {
primaryCategory = "skill";
} else if (orderedCategories.includes("docs")) {
primaryCategory = "docs";
}
return {
categories: orderedCategories,
primaryCategory,
};
}
function getDirectDerivedChanges(changedFiles, contract) {
return changedFiles
.map(normalizeRepoPath)
.filter(Boolean)
.filter((filePath) => isDerivedFile(filePath, contract));
}
function requiresReferencesValidation(changedFiles, contract) {
return changedFiles
.map(normalizeRepoPath)
.filter(Boolean)
.some((filePath) => {
if (isDerivedFile(filePath, contract) || isMixedFile(filePath, contract)) {
return true;
}
return (
REFERENCES_FILES.has(filePath) ||
REFERENCES_PREFIXES.some((prefix) => filePath.startsWith(prefix))
);
});
}
function extractChangelogSection(content, version) {
const headingExpression = new RegExp(`^## \\[${escapeRegExp(version)}\\].*$`, "m");
const headingMatch = headingExpression.exec(content);
if (!headingMatch) {
throw new Error(`CHANGELOG.md does not contain a section for version ${version}.`);
}
const startIndex = headingMatch.index;
const remainder = content.slice(startIndex + headingMatch[0].length);
const nextSectionRelativeIndex = remainder.search(/^## \[/m);
const endIndex =
nextSectionRelativeIndex === -1
? content.length
: startIndex + headingMatch[0].length + nextSectionRelativeIndex;
return `${content.slice(startIndex, endIndex).trim()}\n`;
}
function hasQualityChecklist(body) {
return /quality bar checklist/i.test(String(body || ""));
}
function hasIssueLink(body) {
return /(?:closes|fixes)\s+#\d+/i.test(String(body || ""));
}
module.exports = {
classifyChangedFiles,
extractChangelogSection,
getDirectDerivedChanges,
getManagedFiles,
hasIssueLink,
hasQualityChecklist,
isDerivedFile,
isMixedFile,
loadWorkflowContract,
normalizeRepoPath,
requiresReferencesValidation,
};


@@ -0,0 +1,35 @@
#!/usr/bin/env node
const { getManagedFiles, loadWorkflowContract } = require("../lib/workflow-contract");
function parseArgs(argv) {
return {
includeMixed: argv.includes("--include-mixed"),
includeReleaseManaged: argv.includes("--include-release-managed"),
json: argv.includes("--json"),
shell: argv.includes("--shell"),
};
}
function main() {
const args = parseArgs(process.argv.slice(2));
const contract = loadWorkflowContract(__dirname);
const files = getManagedFiles(contract, {
includeMixed: args.includeMixed,
includeReleaseManaged: args.includeReleaseManaged,
});
if (args.json) {
process.stdout.write(`${JSON.stringify(files, null, 2)}\n`);
return;
}
if (args.shell) {
process.stdout.write(`${files.join(" ")}\n`);
return;
}
process.stdout.write(`${files.join("\n")}\n`);
}
main();


@@ -0,0 +1,250 @@
#!/usr/bin/env node
const fs = require("fs");
const path = require("path");
const { spawnSync } = require("child_process");
const { findProjectRoot } = require("../lib/project-root");
const {
classifyChangedFiles,
getDirectDerivedChanges,
hasIssueLink,
hasQualityChecklist,
loadWorkflowContract,
normalizeRepoPath,
requiresReferencesValidation,
} = require("../lib/workflow-contract");
function parseArgs(argv) {
const args = {
base: null,
head: "HEAD",
eventPath: null,
checkPolicy: false,
noRun: false,
writeGithubOutput: false,
writeStepSummary: false,
json: false,
};
for (let index = 0; index < argv.length; index += 1) {
const arg = argv[index];
if (arg === "--base") {
args.base = argv[index + 1];
index += 1;
} else if (arg === "--head") {
args.head = argv[index + 1];
index += 1;
} else if (arg === "--event-path") {
args.eventPath = argv[index + 1];
index += 1;
} else if (arg === "--check-policy") {
args.checkPolicy = true;
} else if (arg === "--no-run") {
args.noRun = true;
} else if (arg === "--write-github-output") {
args.writeGithubOutput = true;
} else if (arg === "--write-step-summary") {
args.writeStepSummary = true;
} else if (arg === "--json") {
args.json = true;
}
}
return args;
}
function runGit(args, options = {}) {
const result = spawnSync("git", args, {
cwd: options.cwd,
encoding: "utf8",
stdio: options.capture ? ["ignore", "pipe", "pipe"] : "inherit",
});
if (result.error) {
throw result.error;
}
if (typeof result.status !== "number" || result.status !== 0) {
const stderr = options.capture ? result.stderr.trim() : "";
throw new Error(stderr || `git ${args.join(" ")} failed with status ${result.status}`);
}
return options.capture ? result.stdout.trim() : "";
}
function runCommand(command, args, cwd) {
console.log(`[pr:preflight] ${command} ${args.join(" ")}`);
const result = spawnSync(command, args, {
cwd,
stdio: "inherit",
shell: process.platform === "win32",
});
if (result.error) {
throw result.error;
}
if (typeof result.status !== "number" || result.status !== 0) {
process.exit(result.status || 1);
}
}
function resolveBaseRef(projectRoot) {
for (const candidate of ["origin/main", "main"]) {
const result = spawnSync("git", ["rev-parse", "--verify", candidate], {
cwd: projectRoot,
stdio: "ignore",
});
if (result.status === 0) {
return candidate;
}
}
return "HEAD";
}
function getChangedFiles(projectRoot, baseRef, headRef) {
if (baseRef === headRef) {
return [];
}
const diffOutput = runGit(["diff", "--name-only", `${baseRef}...${headRef}`], {
cwd: projectRoot,
capture: true,
});
return [...new Set(diffOutput.split(/\r?\n/).map(normalizeRepoPath).filter(Boolean))];
}
function loadPullRequestBody(eventPath) {
if (!eventPath) {
return null;
}
const rawEvent = fs.readFileSync(path.resolve(eventPath), "utf8");
const event = JSON.parse(rawEvent);
// Non-PR events carry no pull_request payload; return null so callers can
// distinguish "no PR body available" from "PR body is empty".
if (!event.pull_request) {
return null;
}
return event.pull_request.body || "";
}
function appendGithubOutput(result) {
const outputPath = process.env.GITHUB_OUTPUT;
if (!outputPath) {
return;
}
const lines = [
`primary_category=${result.primaryCategory}`,
`categories=${result.categories.join(",")}`,
`requires_references=${String(result.requiresReferencesValidation)}`,
`direct_derived_changes_count=${String(result.directDerivedChanges.length)}`,
`direct_derived_changes=${JSON.stringify(result.directDerivedChanges)}`,
`changed_files_count=${String(result.changedFiles.length)}`,
`has_quality_checklist=${String(result.prBody.hasQualityChecklist)}`,
`has_issue_link=${String(result.prBody.hasIssueLink)}`,
];
fs.appendFileSync(outputPath, `${lines.join("\n")}\n`, "utf8");
}
function appendStepSummary(result) {
const summaryPath = process.env.GITHUB_STEP_SUMMARY;
if (!summaryPath) {
return;
}
const derivedSummary =
result.directDerivedChanges.length === 0
? "none"
: result.directDerivedChanges.map((filePath) => `\`${filePath}\``).join(", ");
const lines = [
"## PR Workflow Intake",
"",
`- Primary change: \`${result.primaryCategory}\``,
`- Categories: ${result.categories.length > 0 ? result.categories.map((category) => `\`${category}\``).join(", ") : "\`none\`"}`,
`- Changed files: ${result.changedFiles.length}`,
`- Direct derived-file edits: ${derivedSummary}`,
`- \`validate:references\` required: ${result.requiresReferencesValidation ? "yes" : "no"}`,
`- PR template checklist: ${result.prBody.hasQualityChecklist ? "present" : "missing"}`,
`- Issue auto-close link: ${result.prBody.hasIssueLink ? "detected" : "not detected"}`,
"",
"> Generated drift is reported separately in the artifact preview job and remains informational on pull requests.",
];
fs.appendFileSync(summaryPath, `${lines.join("\n")}\n`, "utf8");
}
function main() {
const args = parseArgs(process.argv.slice(2));
const projectRoot = findProjectRoot(__dirname);
const contract = loadWorkflowContract(__dirname);
const baseRef = args.base || resolveBaseRef(projectRoot);
const changedFiles = getChangedFiles(projectRoot, baseRef, args.head);
const classification = classifyChangedFiles(changedFiles, contract);
const directDerivedChanges = getDirectDerivedChanges(changedFiles, contract);
const pullRequestBody = loadPullRequestBody(args.eventPath);
const result = {
baseRef,
headRef: args.head,
changedFiles,
categories: classification.categories,
primaryCategory: classification.primaryCategory,
directDerivedChanges,
requiresReferencesValidation: requiresReferencesValidation(changedFiles, contract),
prBody: {
available: pullRequestBody !== null,
hasQualityChecklist: hasQualityChecklist(pullRequestBody),
hasIssueLink: hasIssueLink(pullRequestBody),
},
};
if (args.writeGithubOutput) {
appendGithubOutput(result);
}
if (args.writeStepSummary) {
appendStepSummary(result);
}
if (args.json) {
process.stdout.write(`${JSON.stringify(result, null, 2)}\n`);
} else {
console.log(`[pr:preflight] Base ref: ${baseRef}`);
console.log(`[pr:preflight] Changed files: ${changedFiles.length}`);
console.log(
`[pr:preflight] Classification: ${result.categories.length > 0 ? result.categories.join(", ") : "none"}`,
);
}
if (args.checkPolicy) {
if (directDerivedChanges.length > 0) {
console.error(
[
"Pull requests are source-only.",
"Remove derived files from the PR and let main regenerate them after merge.",
`Derived files detected: ${directDerivedChanges.join(", ")}`,
].join(" "),
);
process.exit(1);
}
if (pullRequestBody !== null && !result.prBody.hasQualityChecklist) {
console.error("PR body must include the Quality Bar Checklist section from the template.");
process.exit(1);
}
}
if (!args.noRun) {
runCommand("npm", ["run", "validate"], projectRoot);
if (result.requiresReferencesValidation) {
runCommand("npm", ["run", "validate:references"], projectRoot);
}
runCommand("npm", ["run", "test"], projectRoot);
}
}
main();


@@ -1,66 +1,8 @@
#!/bin/bash
# Release Cycle Automation Script
# Enforces protocols from .github/MAINTENANCE.md
set -e
GREEN='\033[0;32m'
RED='\033[0;31m'
YELLOW='\033[1;33m'
NC='\033[0m'
echo "release_cycle.sh is now a thin wrapper around the scripted release workflow."
echo "Use \`npm run release:preflight\` directly for the supported entrypoint."
echo -e "${YELLOW}🤖 Initiating Antigravity Release Protocol...${NC}"
# 1. Validation Chain
echo -e "\n${YELLOW}Step 1: Running Validation Chain...${NC}"
echo "Running validate_skills.py..."
python3 tools/scripts/validate_skills.py
echo "Running generate_index.py..."
python3 tools/scripts/generate_index.py
echo "Running update_readme.py..."
python3 tools/scripts/update_readme.py
# 2. Catalog (required for CI)
echo -e "\n${YELLOW}Step 2: Build catalog...${NC}"
npm run catalog
# 3. Stats Consistency Check
echo -e "\n${YELLOW}Step 3: Verifying Stats Consistency...${NC}"
JSON_COUNT=$(python3 -c "import json; print(len(json.load(open('skills_index.json'))))")
echo "Skills in Registry (JSON): $JSON_COUNT"
# Check README Intro
README_CONTENT=$(cat README.md)
if [[ "$README_CONTENT" != *"$JSON_COUNT high-performance"* ]]; then
echo -e "${RED}❌ ERROR: README.md intro consistency failure!${NC}"
echo "Expected: '$JSON_COUNT high-performance'"
echo "Found mismatch. Please grep for 'high-performance' in README.md and fix it."
exit 1
fi
echo -e "${GREEN}✅ Stats Consistent.${NC}"
# 4. Version check (package.json is source of truth for npm)
echo -e "\n${YELLOW}Step 4: Version check${NC}"
PKG_VERSION=$(node -p "require('./package.json').version")
echo "package.json version: $PKG_VERSION"
echo "Ensure this version is bumped before 'npm publish' (npm forbids republishing the same version)."
# 5. Contributor Check
echo -e "\n${YELLOW}Step 5: Contributor Check${NC}"
echo "Recent commits by author (check against README 'Repo Contributors'):"
git shortlog -sn --since="1 month ago" --all --no-merges | head -n 10
echo -e "${YELLOW}⚠️ MANUAL VERIFICATION REQUIRED:${NC}"
echo "1. Are all PR authors above listed in 'Repo Contributors'?"
echo "2. Are all External Sources listed in 'Credits & Sources'?"
read -p "Type 'yes' to confirm you have verified contributors: " CONFIRM_CONTRIB
if [ "$CONFIRM_CONTRIB" != "yes" ]; then
echo -e "${RED}❌ Verification failed. Aborting.${NC}"
exit 1
fi
echo -e "\n${GREEN}✅ Release Cycle Checks Passed. You may now commit and push.${NC}"
echo -e "${YELLOW}After tagging a release: run \`npm publish\` from repo root (or use GitHub Release + NPM_TOKEN for CI).${NC}"
exit 0
node tools/scripts/release_workflow.js preflight


@@ -0,0 +1,252 @@
#!/usr/bin/env node
const fs = require("fs");
const path = require("path");
const { spawnSync } = require("child_process");
const { findProjectRoot } = require("../lib/project-root");
const {
extractChangelogSection,
getManagedFiles,
loadWorkflowContract,
} = require("../lib/workflow-contract");
function parseArgs(argv) {
const [command, version] = argv;
return {
command,
version: version || null,
};
}
function runCommand(command, args, cwd, options = {}) {
const result = spawnSync(command, args, {
cwd,
encoding: "utf8",
stdio: options.capture ? ["ignore", "pipe", "pipe"] : "inherit",
shell: options.shell ?? process.platform === "win32",
});
if (result.error) {
throw result.error;
}
if (typeof result.status !== "number" || result.status !== 0) {
const stderr = options.capture ? result.stderr.trim() : "";
throw new Error(stderr || `${command} ${args.join(" ")} failed with status ${result.status}`);
}
return options.capture ? result.stdout.trim() : "";
}
function ensureOnMain(projectRoot) {
const currentBranch = runCommand("git", ["rev-parse", "--abbrev-ref", "HEAD"], projectRoot, {
capture: true,
});
if (currentBranch !== "main") {
throw new Error(`Release workflow must run from main. Current branch: ${currentBranch}`);
}
}
function ensureCleanWorkingTree(projectRoot, message) {
const status = runCommand("git", ["status", "--porcelain", "--untracked-files=no"], projectRoot, {
capture: true,
});
if (status) {
throw new Error(message || "Working tree has tracked changes. Commit or stash them first.");
}
}
function ensureTagMissing(projectRoot, tagName) {
const result = spawnSync("git", ["rev-parse", "--verify", tagName], {
cwd: projectRoot,
stdio: "ignore",
});
if (result.status === 0) {
throw new Error(`Tag ${tagName} already exists.`);
}
}
function ensureTagExists(projectRoot, tagName) {
const result = spawnSync("git", ["rev-parse", "--verify", tagName], {
cwd: projectRoot,
stdio: "ignore",
});
if (result.status !== 0) {
throw new Error(`Tag ${tagName} does not exist. Run release:prepare first.`);
}
}
function ensureGithubReleaseMissing(projectRoot, tagName) {
const result = spawnSync("gh", ["release", "view", tagName], {
cwd: projectRoot,
stdio: "ignore",
});
if (result.status === 0) {
throw new Error(`GitHub release ${tagName} already exists.`);
}
}
function readPackageVersion(projectRoot) {
const packagePath = path.join(projectRoot, "package.json");
const packageJson = JSON.parse(fs.readFileSync(packagePath, "utf8"));
return packageJson.version;
}
function ensureChangelogSection(projectRoot, version) {
const changelogPath = path.join(projectRoot, "CHANGELOG.md");
const changelogContent = fs.readFileSync(changelogPath, "utf8");
return extractChangelogSection(changelogContent, version);
}
function writeReleaseNotes(projectRoot, version, sectionContent) {
const releaseNotesDir = path.join(projectRoot, ".tmp", "releases");
const notesPath = path.join(releaseNotesDir, `v${version}.md`);
fs.mkdirSync(releaseNotesDir, { recursive: true });
fs.writeFileSync(notesPath, sectionContent, "utf8");
return notesPath;
}
function runReleaseSuite(projectRoot) {
runCommand("npm", ["run", "validate"], projectRoot);
runCommand("npm", ["run", "validate:references"], projectRoot);
runCommand("npm", ["run", "sync:all"], projectRoot);
runCommand("npm", ["run", "test"], projectRoot);
runCommand("npm", ["run", "app:build"], projectRoot);
}
function runReleasePreflight(projectRoot) {
ensureOnMain(projectRoot);
ensureCleanWorkingTree(projectRoot, "release:preflight requires a clean tracked working tree.");
const version = readPackageVersion(projectRoot);
ensureChangelogSection(projectRoot, version);
runReleaseSuite(projectRoot);
ensureCleanWorkingTree(
projectRoot,
"release:preflight left tracked changes. Sync and commit them before releasing.",
);
console.log(`[release] Preflight passed for version ${version}.`);
}
function stageReleaseFiles(projectRoot, contract) {
const filesToStage = getManagedFiles(contract, {
includeMixed: true,
includeReleaseManaged: true,
});
runCommand("git", ["add", ...filesToStage], projectRoot);
}
function prepareRelease(projectRoot, version) {
if (!version) {
throw new Error("Usage: npm run release:prepare -- X.Y.Z");
}
ensureOnMain(projectRoot);
ensureCleanWorkingTree(projectRoot, "release:prepare requires a clean tracked working tree.");
ensureTagMissing(projectRoot, `v${version}`);
ensureChangelogSection(projectRoot, version);
const currentVersion = readPackageVersion(projectRoot);
if (currentVersion !== version) {
runCommand("npm", ["version", version, "--no-git-tag-version"], projectRoot);
} else {
console.log(`[release] package.json already set to ${version}; keeping current version.`);
}
runReleaseSuite(projectRoot);
runCommand(
"npm",
["run", "sync:metadata", "--", "--refresh-volatile"],
projectRoot,
);
const refreshedReleaseNotes = ensureChangelogSection(projectRoot, version);
const notesPath = writeReleaseNotes(projectRoot, version, refreshedReleaseNotes);
const contract = loadWorkflowContract(projectRoot);
stageReleaseFiles(projectRoot, contract);
const stagedFiles = runCommand("git", ["diff", "--cached", "--name-only"], projectRoot, {
capture: true,
});
if (!stagedFiles) {
throw new Error("release:prepare did not stage any files. Nothing to commit.");
}
runCommand("git", ["commit", "-m", `chore: release v${version}`], projectRoot);
runCommand("git", ["tag", `v${version}`], projectRoot);
console.log(`[release] Prepared v${version}.`);
console.log(`[release] Notes file: ${notesPath}`);
console.log(`[release] Next step: npm run release:publish -- ${version}`);
}
function publishRelease(projectRoot, version) {
if (!version) {
throw new Error("Usage: npm run release:publish -- X.Y.Z");
}
ensureOnMain(projectRoot);
ensureCleanWorkingTree(projectRoot, "release:publish requires a clean tracked working tree.");
const packageVersion = readPackageVersion(projectRoot);
if (packageVersion !== version) {
throw new Error(`package.json version ${packageVersion} does not match requested release ${version}.`);
}
const tagName = `v${version}`;
ensureTagExists(projectRoot, tagName);
ensureGithubReleaseMissing(projectRoot, tagName);
const tagCommit = runCommand("git", ["rev-list", "-n", "1", tagName], projectRoot, {
capture: true,
});
const headCommit = runCommand("git", ["rev-parse", "HEAD"], projectRoot, {
capture: true,
});
if (tagCommit !== headCommit) {
throw new Error(`${tagName} does not point at HEAD. Refusing to publish.`);
}
const notesPath = writeReleaseNotes(projectRoot, version, ensureChangelogSection(projectRoot, version));
runCommand("git", ["push", "origin", "main"], projectRoot);
runCommand("git", ["push", "origin", tagName], projectRoot);
runCommand("gh", ["release", "create", tagName, "--title", tagName, "--notes-file", notesPath], projectRoot);
console.log(`[release] Published ${tagName}.`);
}
function main() {
const args = parseArgs(process.argv.slice(2));
const projectRoot = findProjectRoot(__dirname);
if (args.command === "preflight") {
runReleasePreflight(projectRoot);
return;
}
if (args.command === "prepare") {
prepareRelease(projectRoot, args.version);
return;
}
if (args.command === "publish") {
publishRelease(projectRoot, args.version);
return;
}
throw new Error(
"Usage: node tools/scripts/release_workflow.js <preflight|prepare|publish> [X.Y.Z]",
);
}
try {
main();
} catch (error) {
console.error(`[release] ${error.message}`);
process.exit(1);
}


@@ -10,6 +10,7 @@ const TOOL_TESTS = path.join(TOOL_SCRIPTS, "tests");
const LOCAL_TEST_COMMANDS = [
[path.join(TOOL_TESTS, "jetski_gemini_loader.test.js")],
[path.join(TOOL_TESTS, "validate_skills_headings.test.js")],
[path.join(TOOL_TESTS, "workflow_contracts.test.js")],
[path.join(TOOL_SCRIPTS, "run-python.js"), path.join(TOOL_TESTS, "test_validate_skills_headings.py")],
];
const NETWORK_TEST_COMMANDS = [


@@ -0,0 +1,68 @@
const assert = require("assert");
const {
classifyChangedFiles,
extractChangelogSection,
getDirectDerivedChanges,
hasIssueLink,
hasQualityChecklist,
requiresReferencesValidation,
} = require("../../lib/workflow-contract");
const contract = {
derivedFiles: [
"CATALOG.md",
"skills_index.json",
"data/skills_index.json",
"data/catalog.json",
"data/bundles.json",
"data/aliases.json",
],
mixedFiles: ["README.md"],
releaseManagedFiles: ["CHANGELOG.md", "package.json", "package-lock.json", "README.md"],
};
const skillOnly = classifyChangedFiles(["skills/example/SKILL.md"], contract);
assert.deepStrictEqual(skillOnly.categories, ["skill"]);
assert.strictEqual(skillOnly.primaryCategory, "skill");
assert.strictEqual(requiresReferencesValidation(["skills/example/SKILL.md"], contract), false);
const docsOnly = classifyChangedFiles(["README.md", "docs/users/faq.md"], contract);
assert.deepStrictEqual(docsOnly.categories, ["docs"]);
assert.strictEqual(docsOnly.primaryCategory, "docs");
assert.strictEqual(requiresReferencesValidation(["README.md"], contract), true);
const infraChange = classifyChangedFiles([".github/workflows/ci.yml", "tools/scripts/pr_preflight.js"], contract);
assert.deepStrictEqual(infraChange.categories, ["infra"]);
assert.strictEqual(infraChange.primaryCategory, "infra");
assert.strictEqual(requiresReferencesValidation(["tools/scripts/pr_preflight.js"], contract), true);
const mixedChange = classifyChangedFiles(["skills/example/SKILL.md", "README.md"], contract);
assert.deepStrictEqual(mixedChange.categories, ["skill", "docs"]);
assert.strictEqual(mixedChange.primaryCategory, "skill");
assert.deepStrictEqual(
getDirectDerivedChanges(["skills/example/SKILL.md", "data/catalog.json"], contract),
["data/catalog.json"],
);
const changelog = [
"## [7.7.0] - 2026-03-13 - \"Merge Friction Reduction\"",
"",
"- Line one",
"",
"## [7.6.0] - 2026-03-01 - \"Older Release\"",
"",
"- Older line",
"",
].join("\n");
assert.strictEqual(
extractChangelogSection(changelog, "7.7.0"),
"## [7.7.0] - 2026-03-13 - \"Merge Friction Reduction\"\n\n- Line one\n",
);
assert.strictEqual(hasQualityChecklist("## Quality Bar Checklist\n- [x] Standards"), true);
assert.strictEqual(hasQualityChecklist("No template here"), false);
assert.strictEqual(hasIssueLink("Fixes #123"), true);
assert.strictEqual(hasIssueLink("Related to #123"), false);


@@ -24,3 +24,7 @@
- `python3 tools/scripts/sync_repo_metadata.py --dry-run`
- `npm run readme`
- `npm run validate:references`
- Added `tools/config/generated-files.json` as the single contract for derived registry artifacts so CI, maintainer scripts, and docs share the same file list.
- Added scripted workflow entrypoints: `npm run pr:preflight`, `npm run release:preflight`, `npm run release:prepare -- X.Y.Z`, and `npm run release:publish -- X.Y.Z`.
- Split PR CI into `pr-policy`, `source-validation`, and `artifact-preview` so PRs stay source-only, policy failures are explicit, and generated drift is previewed separately from source validation.
- Updated `CONTRIBUTING.md` and `.github/PULL_REQUEST_TEMPLATE.md` so contributors are told not to commit derived files and to enable `Allow edits from maintainers`.
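As a sketch of how the shared contract in `tools/config/generated-files.json` is meant to be consumed: the shape below follows the test fixture in `workflow_contracts.test.js`, and the option handling is an assumption mirroring how `stageReleaseFiles` calls `getManagedFiles`, not the verbatim implementation.

```javascript
// Hypothetical mirror of getManagedFiles: derived artifacts are always managed;
// mixed and release-managed files are opt-in via flags.
const contract = {
  derivedFiles: ["CATALOG.md", "skills_index.json", "data/catalog.json"],
  mixedFiles: ["README.md"],
  releaseManagedFiles: ["CHANGELOG.md", "package.json"],
};

function managedFiles({ derivedFiles, mixedFiles, releaseManagedFiles }, options = {}) {
  const files = new Set(derivedFiles);
  if (options.includeMixed) mixedFiles.forEach((file) => files.add(file));
  if (options.includeReleaseManaged) releaseManagedFiles.forEach((file) => files.add(file));
  return [...files].sort();
}

console.log(managedFiles(contract, { includeMixed: true }));
// → [ 'CATALOG.md', 'README.md', 'data/catalog.json', 'skills_index.json' ]
```

Because every consumer (CI, `list-managed-files`, the release staging step) reads the same JSON, adding a new derived artifact means editing one contract file rather than hunting down hard-coded lists.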