Merge pull request #502 from alirezarezvani/dev

This commit is contained in:
Alireza Rezvani
2026-04-09 00:34:20 +02:00
committed by GitHub
14 changed files with 2260 additions and 3 deletions


@@ -3,7 +3,7 @@
"name": "claude-code-skills",
"description": "Production-ready skill packages for AI agents - Marketing, Engineering, Product, C-Level, PM, and RA/QM",
"repository": "https://github.com/alirezarezvani/claude-skills",
"total_skills": 194,
"total_skills": 195,
"skills": [
{
"name": "contract-and-proposal-writer",
@@ -659,6 +659,12 @@
"category": "engineering-advanced",
"description": "Run hypothesis tests, analyze A/B experiment results, calculate sample sizes, and interpret statistical significance with effect sizes. Use when you need to validate whether observed differences are real, size an experiment correctly before launch, or interpret test results with confidence."
},
{
"name": "tc-tracker",
"source": "../../engineering/tc-tracker",
"category": "engineering-advanced",
"description": "Use when the user asks to track technical changes, create change records, manage TC lifecycles, or hand off work between AI sessions. Covers init/create/update/status/resume/close/export workflows for structured code change documentation."
},
{
"name": "tech-debt-tracker",
"source": "../../engineering/tech-debt-tracker",
@@ -1187,7 +1193,7 @@
"description": "Software engineering and technical skills"
},
"engineering-advanced": {
"count": 43,
"count": 44,
"source": "../../engineering",
"description": "Advanced engineering skills - agents, RAG, MCP, CI/CD, databases, observability"
},

.codex/skills/tc-tracker Symbolic link

@@ -0,0 +1 @@
../../engineering/tc-tracker


@@ -77,7 +77,10 @@ jobs:
- name: Python syntax check (blocking)
run: |
-          python -m compileall marketing-skill product-team c-level-advisor engineering-team ra-qm-team engineering business-growth finance project-management scripts
+          python -m compileall \
+            marketing-skill product-team c-level-advisor \
+            engineering-team ra-qm-team engineering \
+            business-growth finance project-management scripts
- name: Run test suite
run: |

commands/tc.md Normal file

@@ -0,0 +1,146 @@
---
name: tc
description: Track technical changes with structured records, a state machine, and session handoff. Usage: /tc <init|create|update|status|resume|close|export|dashboard> [args]
---
# /tc — Technical Change Tracker
Dispatch a TC (Technical Change) command. Arguments: `$ARGUMENTS`.
If `$ARGUMENTS` is empty, print this menu and stop:
```
/tc init Initialize TC tracking in this project
/tc create <name> Create a new TC record
/tc update <tc-id> [...] Update fields, status, files, handoff
/tc status [tc-id] Show one TC or the registry summary
/tc resume <tc-id> Resume a TC from a previous session
/tc close <tc-id> Transition a TC to deployed
/tc export Re-render derived artifacts
/tc dashboard Re-render the registry summary
```
Otherwise, parse `$ARGUMENTS` as `<subcommand> <rest>` and dispatch to the matching protocol below. All scripts live at `engineering/tc-tracker/scripts/`.
## Subcommands
### `init`
1. Run:
```bash
python3 engineering/tc-tracker/scripts/tc_init.py --root . --json
```
2. If status is `already_initialized`, report current statistics and stop.
3. Otherwise report what was created and suggest `/tc create <name>` as the next step.
### `create <name>`
1. Parse `<name>` as a kebab-case slug. If missing, ask the user for one.
2. Prompt the user (one question at a time) for:
- Title (5-120 chars)
- Scope: `feature | bugfix | refactor | infrastructure | documentation | hotfix | enhancement`
- Priority: `critical | high | medium | low` (default `medium`)
- Summary (10+ chars)
- Motivation
3. Run:
```bash
python3 engineering/tc-tracker/scripts/tc_create.py --root . \
--name "<slug>" --title "<title>" --scope <scope> --priority <priority> \
--summary "<summary>" --motivation "<motivation>" --json
```
4. Report the new TC ID and the path to the record.
### `update <tc-id> [intent]`
1. If `<tc-id>` is missing, list active TCs (status `in_progress` or `blocked`) from `tc_status.py --all` and ask which one.
2. Determine the user's intent from natural language:
- **Status change** → `--set-status <state>` with `--reason "<why>"`
- **Add files** → one or more `--add-file path[:action]`
- **Add a test** → `--add-test "<title>" --test-procedure "<step>" --test-expected "<result>"`
- **Update handoff** → any combination of `--handoff-progress`, `--handoff-next`, `--handoff-blocker`, `--handoff-context`
- **Add a note** → `--note "<text>"`
- **Add a tag** → `--tag <tag>`
3. Run:
```bash
python3 engineering/tc-tracker/scripts/tc_update.py --root . --tc-id <tc-id> [flags] --json
```
4. If exit code is non-zero, surface the error verbatim. The state machine and validator will reject invalid moves — do not retry blindly.
### `status [tc-id]`
- If `<tc-id>` is provided:
```bash
python3 engineering/tc-tracker/scripts/tc_status.py --root . --tc-id <tc-id>
```
- Otherwise:
```bash
python3 engineering/tc-tracker/scripts/tc_status.py --root . --all
```
### `resume <tc-id>`
1. Run:
```bash
python3 engineering/tc-tracker/scripts/tc_status.py --root . --tc-id <tc-id> --json
```
2. Display the handoff block prominently: `progress_summary`, `next_steps` (numbered), `blockers`, `key_context`.
3. Ask: "Resume <tc-id> and pick up at step 1 of `next_steps`? (y/n)"
4. If yes, run an update to record the resumption:
```bash
python3 engineering/tc-tracker/scripts/tc_update.py --root . --tc-id <tc-id> \
--note "Session resumed" --reason "session handoff"
```
5. Begin executing the first item in `next_steps`. Do NOT re-derive context — trust the handoff.
### `close <tc-id>`
1. Read the record via `tc_status.py --tc-id <tc-id> --json`.
2. Verify the current status is `tested`. If not, refuse and tell the user which transitions are still required.
3. Check `test_cases`: warn if any are `pending`, `fail`, or `blocked`.
4. Ask the user:
- "Who is approving? (your name, or 'self')"
- "Approval notes (optional):"
- "Test coverage status: none / partial / full"
5. Run:
```bash
python3 engineering/tc-tracker/scripts/tc_update.py --root . --tc-id <tc-id> \
--set-status deployed --reason "Approved by <approver>" --note "Approval: <approver> — <notes>"
```
If your script version supports editing the `approval` block directly, apply the approval fields in a follow-up update; otherwise instruct the user to record the approval in `notes`.
6. Report: "TC-NNN closed and deployed."
### `export`
There is no automatic HTML export in this skill. Re-validate everything instead:
1. Read the registry.
2. For each record, run:
```bash
python3 engineering/tc-tracker/scripts/tc_validator.py --record <path> --json
```
3. Run:
```bash
python3 engineering/tc-tracker/scripts/tc_validator.py --registry docs/TC/tc_registry.json --json
```
4. Report: total records validated, any errors, paths to anything invalid.
### `dashboard`
Run the all-records summary:
```bash
python3 engineering/tc-tracker/scripts/tc_status.py --root . --all
```
## Iron Rules
1. **Never edit `tc_record.json` by hand.** Always use `tc_update.py` so revision history is appended and validation runs.
2. **Never skip the state machine.** Walk forward through states even if it feels redundant.
3. **Never delete a TC.** History is append-only — add a final revision and tag it `[CANCELLED]`.
4. **Background bookkeeping.** When mid-task, spawn a background subagent to update the TC. Do not pause coding to do paperwork.
5. **Validate before reporting success.** If a script exits non-zero, surface the error and stop.
## Related Skills
- `engineering/tc-tracker` — Full SKILL.md with schema reference, lifecycle diagrams, and the handoff format.
- `engineering/changelog-generator` — Pair with TC tracker: TCs for the per-change audit trail, changelog for user-facing release notes.
- `engineering/tech-debt-tracker` — For tracking long-lived debt rather than discrete code changes.


@@ -0,0 +1,72 @@
# TC Tracker
Structured tracking for technical changes (TCs) with a strict state machine, append-only revision history, and a session-handoff block that lets a new AI session resume in-progress work cleanly.
## Quick Start
```bash
# 1. Initialize tracking in your project
python3 scripts/tc_init.py --project "My Project" --root .
# 2. Create a new TC
python3 scripts/tc_create.py --root . \
--name "user-auth" \
--title "Add JWT authentication" \
--scope feature --priority high \
--summary "Adds JWT login + middleware" \
--motivation "Required for protected endpoints"
# 3. Move it to in_progress and record some work
python3 scripts/tc_update.py --root . --tc-id <TC-ID> \
--set-status in_progress --reason "Starting implementation"
python3 scripts/tc_update.py --root . --tc-id <TC-ID> \
--add-file src/auth.py:created \
--add-file src/middleware.py:modified
# 4. Write a session handoff before stopping
python3 scripts/tc_update.py --root . --tc-id <TC-ID> \
--handoff-progress "JWT middleware wired up" \
--handoff-next "Write integration tests" \
--handoff-blocker "Waiting on test fixtures"
# 5. Check status
python3 scripts/tc_status.py --root . --all
```
## Included Scripts
- `scripts/tc_init.py` — Initialize `docs/TC/` in a project (idempotent)
- `scripts/tc_create.py` — Create a new TC record with sequential ID
- `scripts/tc_update.py` — Update fields, status, files, handoff, with atomic writes
- `scripts/tc_status.py` — View a single TC or the full registry
- `scripts/tc_validator.py` — Validate a record or registry against schema + state machine
All scripts:
- Use Python stdlib only
- Support `--help` and `--json`
- Use exit codes 0 (ok) / 1 (warnings) / 2 (errors)
## References
- `references/tc-schema.md` — JSON schema reference
- `references/lifecycle.md` — State machine and transitions
- `references/handoff-format.md` — Session handoff structure
## Slash Command
When installed with the rest of this repo, the `/tc <subcommand>` slash command (defined at `commands/tc.md`) dispatches to these scripts.
## Installation
### Claude Code
```bash
cp -R engineering/tc-tracker ~/.claude/skills/tc-tracker
```
### OpenAI Codex
```bash
cp -R engineering/tc-tracker ~/.codex/skills/tc-tracker
```


@@ -0,0 +1,207 @@
---
name: "tc-tracker"
description: "Use when the user asks to track technical changes, create change records, manage TC lifecycles, or hand off work between AI sessions. Covers init/create/update/status/resume/close/export workflows for structured code change documentation."
---
# TC Tracker
Track every code change with structured JSON records, an enforced state machine, and a session handoff format that lets a new AI session resume work cleanly when a previous one expires.
## Overview
A Technical Change (TC) is a structured record that captures **what** changed, **why** it changed, **who** changed it, **when** it changed, **how it was tested**, and **where work stands** for the next session. Records live as JSON in `docs/TC/` inside the target project, validated against a strict schema and a state machine.
**Use this skill when the user:**
- Asks to "track this change" or wants an audit trail for code modifications
- Wants to hand off in-progress work to a future AI session
- Needs structured release notes that go beyond commit messages
- Onboards an existing project and wants retroactive change documentation
- Asks for `/tc init`, `/tc create`, `/tc update`, `/tc status`, `/tc resume`, or `/tc close`
**Do NOT use this skill when:**
- The user only wants a changelog from git history (use `engineering/changelog-generator`)
- The user only wants to track tech debt items (use `engineering/tech-debt-tracker`)
- The change is trivial (typo, formatting) and won't affect behavior
## Storage Layout
Each project stores TCs at `{project_root}/docs/TC/`:
```
docs/TC/
├── tc_config.json # Project settings
├── tc_registry.json # Master index + statistics
├── records/
│ └── TC-001-04-05-26-user-auth/
│ └── tc_record.json # Source of truth
└── evidence/
└── TC-001/ # Log snippets, command output, screenshots
```
## TC ID Convention
- **Parent TC:** `TC-NNN-MM-DD-YY-functionality-slug` (e.g., `TC-001-04-05-26-user-authentication`)
- **Sub-TC:** `TC-NNN.A` or `TC-NNN.A.1` (letter = revision, digit = sub-revision)
- `NNN` is sequential, `MM-DD-YY` is the creation date, slug is kebab-case.
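The ID patterns above can be checked with a small regex sketch. This is an illustrative helper, not part of the shipped scripts:

```python
import re

# Parent TC: TC-NNN-MM-DD-YY-kebab-slug, e.g. TC-001-04-05-26-user-authentication
PARENT_TC = re.compile(r"^TC-\d{3}-\d{2}-\d{2}-\d{2}-[a-z0-9]+(?:-[a-z0-9]+)*$")
# Sub-TC: TC-NNN.A or TC-NNN.A.N (letter = revision, digit = sub-revision)
SUB_TC = re.compile(r"^TC-\d{3}\.[A-Z](?:\.\d+)?$")

def is_valid_tc_id(tc_id: str) -> bool:
    """Return True if tc_id matches either the parent or sub-TC pattern."""
    return bool(PARENT_TC.match(tc_id) or SUB_TC.match(tc_id))
```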
## State Machine
```
planned -> in_progress -> implemented -> tested -> deployed
  |  ^        |  ^            |            |           |
  v  |        v  |            +------------+-----------+
+----------------------+                   |
|        blocked       |                   v
+----------------------+    in_progress (rework / hotfix)
```
> See [references/lifecycle.md](references/lifecycle.md) for the full transition table and recovery flows.
## Workflow Commands
The skill ships five Python scripts that perform deterministic, stdlib-only operations on TC records. Each one supports `--help` and `--json`.
### 1. Initialize tracking in a project
```bash
python3 scripts/tc_init.py --project "My Project" --root .
```
Creates `docs/TC/`, `docs/TC/records/`, `docs/TC/evidence/`, `tc_config.json`, and `tc_registry.json`. Idempotent — re-running reports "already initialized" with current stats.
### 2. Create a new TC record
```bash
python3 scripts/tc_create.py \
--root . \
--name "user-authentication" \
--title "Add JWT-based user authentication" \
--scope feature \
--priority high \
--summary "Adds JWT login + middleware" \
--motivation "Required for protected endpoints"
```
Generates the next sequential TC ID, creates the record directory, writes a fully populated `tc_record.json` (status `planned`, R1 creation revision), and updates the registry.
### 3. Update a TC record
```bash
# Status transition (validated against the state machine)
python3 scripts/tc_update.py --root . --tc-id TC-001-04-05-26-user-auth \
--set-status in_progress --reason "Starting implementation"
# Add a file
python3 scripts/tc_update.py --root . --tc-id TC-001-04-05-26-user-auth \
--add-file src/auth.py:created
# Append handoff data
python3 scripts/tc_update.py --root . --tc-id TC-001-04-05-26-user-auth \
--handoff-progress "JWT middleware wired up" \
--handoff-next "Write integration tests" \
--handoff-next "Update README"
```
Every change appends a sequential `R<n>` revision entry, refreshes `updated`, and re-validates against the schema before writing atomically (`.tmp` then rename).
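The atomic-write step described here (write to `.tmp`, then rename) can be sketched as follows; the function name is illustrative, not the script's actual internals:

```python
import json
import os

def atomic_write_json(path: str, data: dict) -> None:
    """Write JSON to <path>.tmp, then rename over the target.

    os.replace is atomic on both POSIX and Windows, so a reader
    never observes a half-written record.
    """
    tmp_path = path + ".tmp"
    with open(tmp_path, "w", encoding="utf-8") as fh:
        json.dump(data, fh, indent=2)
        fh.flush()
        os.fsync(fh.fileno())  # make sure bytes hit disk before the rename
    os.replace(tmp_path, path)
```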
### 4. View status
```bash
# Single TC
python3 scripts/tc_status.py --root . --tc-id TC-001-04-05-26-user-auth
# All TCs (registry summary)
python3 scripts/tc_status.py --root . --all --json
```
### 5. Validate a record or registry
```bash
python3 scripts/tc_validator.py --record docs/TC/records/TC-001-.../tc_record.json
python3 scripts/tc_validator.py --registry docs/TC/tc_registry.json
```
The validator enforces the schema, checks state-machine legality, verifies sequential `R<n>` and `T<n>` IDs, and asserts approval consistency (`approved=true` requires `approved_by` and `approved_date`).
> See [references/tc-schema.md](references/tc-schema.md) for the full schema.
## Slash-Command Dispatcher
The repo ships a `/tc` slash command at `commands/tc.md` that dispatches to these scripts based on subcommand:
| Command | Action |
|---------|--------|
| `/tc init` | Run `tc_init.py` for the current project |
| `/tc create <name>` | Prompt for fields, run `tc_create.py` |
| `/tc update <tc-id>` | Apply user-described changes via `tc_update.py` |
| `/tc status [tc-id]` | Run `tc_status.py` |
| `/tc resume <tc-id>` | Display handoff, archive prior session, start a new one |
| `/tc close <tc-id>` | Transition to `deployed`, set approval |
| `/tc export` | Re-render all derived artifacts |
| `/tc dashboard` | Re-render the registry summary |
The slash command is the user interface; the Python scripts are the engine.
## Session Handoff Format
The handoff block lives at `session_context.handoff` inside each TC and is the single most important field for AI continuity. It contains:
- `progress_summary` — what has been done
- `next_steps` — ordered list of remaining actions
- `blockers` — anything preventing progress
- `key_context` — critical decisions, gotchas, patterns the next bot must know
- `files_in_progress` — files being edited and their state (`editing`, `needs_review`, `partially_done`, `ready`)
- `decisions_made` — architectural decisions with rationale and timestamp
> See [references/handoff-format.md](references/handoff-format.md) for the full structure and fill-out rules.
## Validation Rules (Always Enforced)
1. **State machine** — only valid transitions are allowed.
2. **Sequential IDs** — `revision_history` uses `R1, R2, R3...`; `test_cases` uses `T1, T2, T3...`.
3. **Append-only history** — revision entries are never modified or deleted.
4. **Approval consistency** — `approved=true` requires `approved_by` and `approved_date`.
5. **TC ID format** — must match `TC-NNN-MM-DD-YY-slug`.
6. **Sub-TC ID format** — must match `TC-NNN.A` or `TC-NNN.A.N`.
7. **Atomic writes** — JSON is written to `.tmp` then renamed.
8. **Registry stats** — recomputed on every registry write.
## Non-Blocking Bookkeeping Pattern
TC tracking must NOT interrupt the main workflow.
- **Never stop to update TC records inline.** Keep coding.
- At natural milestones, spawn a background subagent to update the record.
- Surface questions only when genuinely needed ("This work doesn't match any active TC — create one?"), and ask once per session, not per file.
- At session end, write a final handoff block before closing.
## Retroactive Bulk Creation
For onboarding an existing project with undocumented history, build a `retro_changelog.json` (one entry per logical change) and feed it to `tc_create.py` in a loop, or extend the script for batch mode. Group commits by feature, not by file.
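The batch loop described above could look like the following sketch. The entry fields (`name`, `title`, `scope`, `priority`, `summary`, `motivation`) are an assumed shape for `retro_changelog.json`, mirroring `tc_create.py`'s flags:

```python
import json
import subprocess

CREATE_SCRIPT = "engineering/tc-tracker/scripts/tc_create.py"

def build_create_cmd(entry: dict) -> list:
    """Translate one retro-changelog entry into a tc_create.py invocation."""
    return [
        "python3", CREATE_SCRIPT, "--root", ".",
        "--name", entry["name"],
        "--title", entry["title"],
        "--scope", entry["scope"],
        "--priority", entry.get("priority", "medium"),
        "--summary", entry["summary"],
        "--motivation", entry["motivation"],
        "--json",
    ]

def bulk_create(changelog_path: str = "retro_changelog.json") -> None:
    with open(changelog_path, encoding="utf-8") as fh:
        entries = json.load(fh)
    for entry in entries:
        # check=True: stop on the first failure so the registry stays consistent
        subprocess.run(build_create_cmd(entry), check=True)
```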
## Anti-Patterns
| Anti-pattern | Why it's bad | Do this instead |
|--------------|--------------|-----------------|
| Editing `revision_history` to "fix" a typo | History is append-only — tampering destroys the audit trail | Add a new revision that corrects the field |
| Skipping the state machine ("just set status to deployed") | Bypasses validation and hides skipped phases | Walk through `in_progress -> implemented -> tested -> deployed` |
| Creating one TC per file changed | Fragments related work and explodes the registry | One TC per logical unit (feature, fix, refactor) |
| Updating TC inline between every code edit | Slows the main agent, wastes context | Spawn a background subagent at milestones |
| Marking `approved=true` without `approved_by` | Validator will reject; misleading audit trail | Always set `approved_by` and `approved_date` together |
| Overwriting `tc_record.json` directly with a text editor | Risks corruption mid-write and skips validation | Use `tc_update.py` (atomic write + schema check) |
| Putting secrets in `notes` or evidence | Records are committed to the repo | Reference an env var or external secret store |
| Reusing TC IDs after deletion | Breaks the sequential guarantee and confuses history | Increment forward only — never recycle |
| Letting `next_steps` go stale | Defeats the purpose of handoff | Update on every milestone, even if it's "nothing changed" |
## Cross-References
- `engineering/changelog-generator` — Generates Keep-a-Changelog release notes from Conventional Commits. Pair it with TC tracker: TC for the granular per-change audit trail, changelog for user-facing release notes.
- `engineering/tech-debt-tracker` — For tracking long-lived debt items rather than discrete code changes.
- `engineering/focused-fix` — When a bug fix needs systematic feature-wide repair, run `/focused-fix` first then capture the result as a TC.
- `project-management/decision-log` — Architectural decisions made inside a TC's `decisions_made` block can also be promoted to a project-wide decision log.
- `engineering-team/code-reviewer` — Pre-merge review fits naturally into the `tested -> deployed` transition; capture the reviewer in `approval.approved_by`.
## References in This Skill
- [references/tc-schema.md](references/tc-schema.md) — Full JSON schema for TC records and the registry.
- [references/lifecycle.md](references/lifecycle.md) — State machine, valid transitions, and recovery flows.
- [references/handoff-format.md](references/handoff-format.md) — Session handoff structure and best practices.


@@ -0,0 +1,139 @@
# Session Handoff Format
The handoff block is the most important part of a TC for AI continuity. When a session expires, the next session reads this block to resume work cleanly without re-deriving context.
## Where it lives
`session_context.handoff` inside `tc_record.json`.
## Structure
```json
{
"progress_summary": "string",
"next_steps": ["string", "..."],
"blockers": ["string", "..."],
"key_context": ["string", "..."],
"files_in_progress": [
{
"path": "src/foo.py",
"state": "editing|needs_review|partially_done|ready",
"notes": "string|null"
}
],
"decisions_made": [
{
"decision": "string",
"rationale": "string",
"timestamp": "ISO 8601"
}
]
}
```
## Field-by-field rules
### `progress_summary` (string)
A 1-3 sentence narrative of what has been done. Past tense. Concrete.
GOOD:
> "Implemented JWT signing with HS256, wired the auth middleware into the main router, and added two passing unit tests for the happy path."
BAD:
> "Working on auth." (too vague)
> "Wrote a bunch of code." (no specifics)
### `next_steps` (array of strings)
Ordered list of remaining actions. Each step should be small enough to complete in 5-15 minutes. Use imperative mood.
GOOD:
- "Add integration test for invalid token (401)"
- "Update README with the new POST /login endpoint"
- "Run `pytest tests/auth/` and capture output as evidence T2"
BAD:
- "Finish the feature" (not actionable)
- "Make it better" (no measurable outcome)
### `blockers` (array of strings)
Things preventing progress RIGHT NOW. If empty, the TC should not be in `blocked` status.
GOOD:
- "Test fixtures for the user model do not exist; need to create `tests/fixtures/user.py`"
- "Waiting for product to confirm whether refresh tokens are in scope (asked in #product channel)"
BAD:
- "It's hard." (not a blocker)
- "I'm tired." (not a blocker)
### `key_context` (array of strings)
Critical decisions, gotchas, patterns, or constraints the next session MUST know. Things that took the current session significant effort to discover.
GOOD:
- "The `legacy_auth` module is being phased out — do NOT extend it. New code goes in `src/auth/`."
- "We use HS256 (not RS256) because the secret rotation tooling does not support asymmetric keys yet."
- "There is a hidden import cycle if you import `User` from `models.user` instead of `models`. Always use `from models import User`."
BAD:
- "Be careful." (not specific)
- "There might be bugs." (not actionable)
### `files_in_progress` (array of objects)
Files currently mid-edit or partially complete. Include the state so the next session knows whether to read, edit, or review.
| state | meaning |
|-------|---------|
| `editing` | Actively being modified, may not compile |
| `needs_review` | Changes complete but unverified |
| `partially_done` | Some functions done, others stubbed |
| `ready` | Complete and tested |
### `decisions_made` (array of objects)
Architectural decisions taken during the current session, with rationale and timestamp. These should also be promoted to a project-wide decision log when significant.
```json
{
"decision": "Use HS256 instead of RS256 for JWT signing",
"rationale": "Secret rotation tooling does not support asymmetric keys; we accept the tradeoff because token lifetime is 15 minutes",
"timestamp": "2026-04-05T14:32:00+00:00"
}
```
## Handoff Lifecycle
### When to write the handoff
- At every natural milestone (feature complete, tests passing, EOD)
- BEFORE the session is likely to expire
- Whenever a blocker is hit
- Whenever a non-obvious decision is made
### How to write it (non-blocking)
Spawn a background subagent so the main agent doesn't pause:
> "Read `docs/TC/records/<TC-ID>/tc_record.json`. Update the handoff section with: progress_summary='...'; add next_step '...'; add blocker '...'. Use `tc_update.py` so revision history is appended. Then update `last_active` and write atomically."
### How the next session reads it
1. Read `docs/TC/tc_registry.json` and find TCs with status `in_progress` or `blocked`.
2. Read `tc_record.json` for each.
3. Display the handoff block to the user.
4. Ask: "Resume <TC-ID>? (y/n)"
5. If yes:
- Archive the previous session's `current_session` into `session_history` with an `ended` timestamp and a summary.
- Create a new `current_session` for the new bot.
- Append a revision: "Session resumed by <platform/model>".
- Walk through `next_steps` in order.
## Quality Bar
A handoff is "good" if a fresh AI session, with no other context, can pick up the work and make progress within 5 minutes of reading the record. If the next session has to ask "what was I doing?" or "what does this code do?", the previous handoff failed.
## Anti-patterns
| Anti-pattern | Why it's bad |
|--------------|--------------|
| Empty handoff at session end | Defeats the entire purpose |
| `next_steps: ["continue"]` | Not actionable |
| Handoff written but never updated as work progresses | Goes stale within an hour |
| Decisions buried in `notes` instead of `decisions_made` | Loses the rationale |
| Files mid-edit but not listed in `files_in_progress` | Next session reads stale code |
| Blockers in `notes` instead of `blockers` array | TC status cannot be set to `blocked` |


@@ -0,0 +1,98 @@
# TC Lifecycle and State Machine
A TC moves through six implementation states. Transitions are validated on every write — invalid moves are rejected with a clear error.
## State Diagram
```
              +-----------+
              |  planned  |
              +-----------+
                    |
                    v
              +-------------+
+---------+   |             |
| blocked |<->| in_progress |<-------+
+---------+   |             |        |
    |         +-------------+        |
    v                |               |
+---------+          v               |
| planned |   +-------------+        |
+---------+   | implemented |        |
              +-------------+        |
                     |               |
                     v               |
               +--------+            |
               | tested |------------+
               +--------+
                     |
                     v
               +----------+
               | deployed |
               +----------+
                     |
                     v
        in_progress (rework / hotfix)
```
## Transition Table
| From | Allowed Transitions |
|------|---------------------|
| `planned` | `in_progress`, `blocked` |
| `in_progress` | `blocked`, `implemented` |
| `blocked` | `in_progress`, `planned` |
| `implemented` | `tested`, `in_progress` |
| `tested` | `deployed`, `in_progress` |
| `deployed` | `in_progress` |
Same-status transitions are no-ops and always allowed. Anything else is an error.
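The transition table above can be expressed as a small lookup. This is a sketch of the validation logic, not the shipped script's internals:

```python
# Allowed transitions, mirroring the table above
TRANSITIONS = {
    "planned": {"in_progress", "blocked"},
    "in_progress": {"blocked", "implemented"},
    "blocked": {"in_progress", "planned"},
    "implemented": {"tested", "in_progress"},
    "tested": {"deployed", "in_progress"},
    "deployed": {"in_progress"},
}

def check_transition(current: str, target: str) -> None:
    """Raise ValueError on an illegal move; same-status is a no-op."""
    if current == target:
        return  # no-op, always allowed
    if target not in TRANSITIONS.get(current, set()):
        raise ValueError(f"Illegal transition: {current} -> {target}")
```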
## State Definitions
| State | Meaning | Required Before Moving Forward |
|-------|---------|--------------------------------|
| `planned` | TC has been created with description and motivation | Decide implementation approach |
| `in_progress` | Active development | Code changes captured in `files_affected` |
| `blocked` | Cannot proceed (dependency, decision needed) | At least one entry in `handoff.blockers` |
| `implemented` | Code complete, awaiting tests | All target files in `files_affected` |
| `tested` | Test cases executed, results recorded | At least one `test_case` with status `pass` (or explicit `skip` with rationale) |
| `deployed` | Approved and shipped | `approval.approved=true` with `approved_by` and `approved_date` |
## Recovery Flows
### "I committed before testing"
1. Status is `implemented`.
2. Write tests, run them, set `test_cases[*].status = pass`.
3. Transition `implemented -> tested`.
### "Production bug in a deployed TC"
1. Open the deployed TC.
2. Transition `deployed -> in_progress`.
3. Add a new revision summarizing the rework.
4. Walk forward through `implemented -> tested -> deployed` again.
### "Blocked, then unblocked"
1. From `in_progress`, transition to `blocked`. Add blockers to `handoff.blockers`.
2. When unblocked, transition `blocked -> in_progress` and clear the blockers or move them to `notes`.
### "Cancelled work"
There is no `cancelled` state. If a TC is abandoned:
1. Add a final revision: "Cancelled — reason: ...".
2. Move to `blocked`.
3. Add a `[CANCELLED]` tag.
4. Leave the record in place — never delete it (history is append-only).
## Status Field Discipline
- Update `status` ONLY through `tc_update.py --set-status`. Never edit JSON by hand.
- Every status change creates a new revision entry with `field` = `status`, `action` = `changed`, and `reason` populated.
- The registry's `statistics.by_status` is recomputed on every write.
## Anti-patterns
| Anti-pattern | Why it's wrong |
|--------------|----------------|
| Skipping `tested` and going straight to `deployed` | Bypasses validation; misleads downstream consumers |
| Deleting a record to "cancel" a TC | History is append-only; deletion breaks the audit trail |
| Re-using a TC ID after deletion | Sequential numbering must be preserved |
| Changing status without a `--reason` | Future maintainers cannot reconstruct intent |
| Long-lived `in_progress` TCs (weeks+) | Either too big — split into sub-TCs — or stalled and should be marked `blocked` |


@@ -0,0 +1,204 @@
# TC Record Schema
A TC record is a JSON object stored at `docs/TC/records/<TC-ID>/tc_record.json`. Every record is validated against this schema and a state machine on every write.
## Top-Level Fields
| Field | Type | Required | Notes |
|-------|------|----------|-------|
| `tc_id` | string | yes | Pattern: `TC-NNN-MM-DD-YY-slug` |
| `parent_tc` | string \| null | no | For sub-TCs only |
| `title` | string | yes | 5-120 characters |
| `status` | enum | yes | One of: `planned`, `in_progress`, `blocked`, `implemented`, `tested`, `deployed` |
| `priority` | enum | yes | `critical`, `high`, `medium`, `low` |
| `created` | ISO 8601 | yes | UTC timestamp |
| `updated` | ISO 8601 | yes | UTC timestamp, refreshed on every write |
| `created_by` | string | yes | Author identifier (e.g., `user:micha`, `ai:claude-opus`) |
| `project` | string | yes | Project name (denormalized from registry) |
| `description` | object | yes | See below |
| `files_affected` | array | yes | See below |
| `revision_history` | array | yes | Append-only, sequential `R<n>` IDs |
| `sub_tcs` | array | no | Child TCs |
| `test_cases` | array | yes | Sequential `T<n>` IDs |
| `approval` | object | yes | See below |
| `session_context` | object | yes | See below |
| `tags` | array<string> | yes | Freeform tags |
| `related_tcs` | array<string> | yes | Cross-references |
| `notes` | string | yes | Freeform notes |
| `metadata` | object | yes | See below |
## description
```json
{
"summary": "string (10+ chars)",
"motivation": "string (1+ chars)",
"scope": "feature|bugfix|refactor|infrastructure|documentation|hotfix|enhancement",
"detailed_design": "string|null",
"breaking_changes": ["string", "..."],
"dependencies": ["string", "..."]
}
```
## files_affected (array of objects)
```json
{
"path": "src/auth.py",
"action": "created|modified|deleted|renamed",
"description": "string|null",
"lines_added": "integer|null",
"lines_removed": "integer|null"
}
```
## revision_history (array of objects, append-only)
```json
{
"revision_id": "R1",
"timestamp": "2026-04-05T12:34:56+00:00",
"author": "ai:claude-opus",
"summary": "Created TC record",
"field_changes": [
{
"field": "status",
"action": "set|changed|added|removed",
"old_value": "planned",
"new_value": "in_progress",
"reason": "Starting implementation"
}
]
}
```
**Rules:**
- IDs are sequential: R1, R2, R3, ... no gaps allowed.
- The first entry is always the creation event.
- Existing entries are NEVER modified or deleted.
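The sequential-ID rule can be checked with a sketch like this (an illustrative helper, not the validator's actual code):

```python
def check_sequential_revisions(revision_history: list) -> None:
    """Verify IDs run R1, R2, R3, ... with no gaps, per the rules above."""
    for index, entry in enumerate(revision_history, start=1):
        expected = f"R{index}"
        actual = entry.get("revision_id")
        if actual != expected:
            raise ValueError(f"Expected {expected} at position {index}, got {actual!r}")
```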
## test_cases (array of objects)
```json
{
"test_id": "T1",
"title": "Login returns JWT for valid credentials",
"procedure": ["POST /login", "with valid creds"],
"expected_result": "200 + token in body",
"actual_result": "string|null",
"status": "pending|pass|fail|skip|blocked",
"evidence": [
{
"type": "log_snippet|screenshot|file_reference|command_output",
"description": "string",
"content": "string|null",
"path": "string|null",
"timestamp": "ISO|null"
}
],
"tested_by": "string|null",
"tested_date": "ISO|null"
}
```
## approval
```json
{
"approved": false,
"approved_by": "string|null",
"approved_date": "ISO|null",
"approval_notes": "string",
"test_coverage_status": "none|partial|full"
}
```
**Consistency rule:** if `approved=true`, both `approved_by` and `approved_date` MUST be set.
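The rule lends itself to a direct check before any write. A minimal sketch (`check_approval` is a hypothetical helper, not part of the shipped scripts):

```python
def check_approval(approval):
    """Return an error string if the approval block violates the consistency rule."""
    if approval.get("approved"):
        if not approval.get("approved_by") or not approval.get("approved_date"):
            return "approved=true but approved_by/approved_date is missing"
    return None  # consistent
```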
## session_context
```json
{
"current_session": {
"session_id": "string",
"platform": "claude_code|claude_web|api|other",
"model": "string",
"started": "ISO",
"last_active": "ISO|null"
},
"handoff": {
"progress_summary": "string",
"next_steps": ["string", "..."],
"blockers": ["string", "..."],
"key_context": ["string", "..."],
"files_in_progress": [
{
"path": "src/foo.py",
"state": "editing|needs_review|partially_done|ready",
"notes": "string|null"
}
],
"decisions_made": [
{
"decision": "string",
"rationale": "string",
"timestamp": "ISO"
}
]
},
"session_history": [
{
"session_id": "string",
"platform": "string",
"model": "string",
"started": "ISO",
"ended": "ISO",
"summary": "string",
"changes_made": ["string", "..."]
}
]
}
```
## metadata
```json
{
"project": "string",
"created_by": "string",
"last_modified_by": "string",
"last_modified": "ISO",
"estimated_effort": "trivial|small|medium|large|epic|null"
}
```
## Registry Schema (`tc_registry.json`)
```json
{
"project_name": "string",
"created": "ISO",
"updated": "ISO",
"next_tc_number": 1,
"records": [
{
"tc_id": "TC-001-...",
"title": "string",
"status": "enum",
"scope": "enum",
"priority": "enum",
"created": "ISO",
"updated": "ISO",
"path": "records/TC-001-.../tc_record.json"
}
],
"statistics": {
"total": 0,
"by_status": { "planned": 0, "in_progress": 0, "blocked": 0, "implemented": 0, "tested": 0, "deployed": 0 },
"by_scope": { "feature": 0, "bugfix": 0, "refactor": 0, "infrastructure": 0, "documentation": 0, "hotfix": 0, "enhancement": 0 },
"by_priority": { "critical": 0, "high": 0, "medium": 0, "low": 0 }
}
}
```
Statistics are recomputed on every registry write. Never edit them by hand.
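Because statistics are derived data, every registry write rebuilds them from `records`. A minimal sketch of that recompute, mirroring the `compute_stats` helper in the scripts (status buckets only, for brevity):

```python
def recompute_statistics(records):
    """Rebuild the statistics block from the registry's records array."""
    statuses = ("planned", "in_progress", "blocked", "implemented", "tested", "deployed")
    stats = {"total": len(records), "by_status": {s: 0 for s in statuses}}
    for rec in records:
        status = rec.get("status", "")
        if status in stats["by_status"]:  # unknown values are ignored, not counted
            stats["by_status"][status] += 1
    return stats
```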


@@ -0,0 +1,277 @@
#!/usr/bin/env python3
"""TC Create — Create a new Technical Change record.
Generates the next sequential TC ID, scaffolds the record directory, writes a
fully populated tc_record.json (status=planned, R1 creation revision), and
appends a registry entry with recomputed statistics.
Usage:
python3 tc_create.py --root . --name user-auth \\
--title "Add JWT authentication" --scope feature --priority high \\
--summary "Adds JWT login + middleware" \\
--motivation "Required for protected endpoints"
Exit codes:
0 = created
1 = warnings (e.g. validation soft warnings)
2 = critical error (registry missing, bad args, schema invalid)
"""
from __future__ import annotations
import argparse
import json
import os
import re
import sys
from datetime import datetime, timezone
from pathlib import Path
VALID_STATUSES = ("planned", "in_progress", "blocked", "implemented", "tested", "deployed")
VALID_SCOPES = ("feature", "bugfix", "refactor", "infrastructure", "documentation", "hotfix", "enhancement")
VALID_PRIORITIES = ("critical", "high", "medium", "low")
def now_iso() -> str:
return datetime.now(timezone.utc).isoformat(timespec="seconds")
def slugify(text: str) -> str:
text = text.lower().strip()
text = re.sub(r"[^a-z0-9\s-]", "", text)
text = re.sub(r"[\s_]+", "-", text)
text = re.sub(r"-+", "-", text)
return text.strip("-")
def date_slug(dt: datetime) -> str:
return dt.strftime("%m-%d-%y")
def write_json_atomic(path: Path, data: dict) -> None:
tmp = path.with_suffix(path.suffix + ".tmp")
tmp.write_text(json.dumps(data, indent=2) + "\n", encoding="utf-8")
tmp.replace(path)
def compute_stats(records: list) -> dict:
stats = {
"total": len(records),
"by_status": {s: 0 for s in VALID_STATUSES},
"by_scope": {s: 0 for s in VALID_SCOPES},
"by_priority": {p: 0 for p in VALID_PRIORITIES},
}
for rec in records:
for key, bucket in (("status", "by_status"), ("scope", "by_scope"), ("priority", "by_priority")):
v = rec.get(key, "")
if v in stats[bucket]:
stats[bucket][v] += 1
return stats
def build_record(tc_id: str, title: str, scope: str, priority: str, summary: str,
motivation: str, project_name: str, author: str, session_id: str,
platform: str, model: str) -> dict:
ts = now_iso()
return {
"tc_id": tc_id,
"parent_tc": None,
"title": title,
"status": "planned",
"priority": priority,
"created": ts,
"updated": ts,
"created_by": author,
"project": project_name,
"description": {
"summary": summary,
"motivation": motivation,
"scope": scope,
"detailed_design": None,
"breaking_changes": [],
"dependencies": [],
},
"files_affected": [],
"revision_history": [
{
"revision_id": "R1",
"timestamp": ts,
"author": author,
"summary": "TC record created",
"field_changes": [
{"field": "status", "action": "set", "new_value": "planned", "reason": "initial creation"},
],
}
],
"sub_tcs": [],
"test_cases": [],
"approval": {
"approved": False,
"approved_by": None,
"approved_date": None,
"approval_notes": "",
"test_coverage_status": "none",
},
"session_context": {
"current_session": {
"session_id": session_id,
"platform": platform,
"model": model,
"started": ts,
"last_active": ts,
},
"handoff": {
"progress_summary": "",
"next_steps": [],
"blockers": [],
"key_context": [],
"files_in_progress": [],
"decisions_made": [],
},
"session_history": [],
},
"tags": [],
"related_tcs": [],
"notes": "",
"metadata": {
"project": project_name,
"created_by": author,
"last_modified_by": author,
"last_modified": ts,
"estimated_effort": None,
},
}
def main() -> int:
parser = argparse.ArgumentParser(description="Create a new TC record.")
parser.add_argument("--root", default=".", help="Project root (default: current directory)")
parser.add_argument("--name", required=True, help="Functionality slug (kebab-case, e.g. user-auth)")
parser.add_argument("--title", required=True, help="Human-readable title (5-120 chars)")
parser.add_argument("--scope", required=True, choices=VALID_SCOPES, help="Change category")
parser.add_argument("--priority", default="medium", choices=VALID_PRIORITIES, help="Priority level")
parser.add_argument("--summary", required=True, help="Concise summary (10+ chars)")
parser.add_argument("--motivation", required=True, help="Why this change is needed")
parser.add_argument("--author", default=None, help="Author identifier (defaults to config default_author)")
parser.add_argument("--session-id", default=None, help="Session identifier (default: auto)")
parser.add_argument("--platform", default="claude_code", choices=("claude_code", "claude_web", "api", "other"))
parser.add_argument("--model", default="unknown", help="AI model identifier")
parser.add_argument("--json", action="store_true", help="Output as JSON")
args = parser.parse_args()
root = Path(args.root).resolve()
tc_dir = root / "docs" / "TC"
config_path = tc_dir / "tc_config.json"
registry_path = tc_dir / "tc_registry.json"
if not config_path.exists() or not registry_path.exists():
msg = f"TC tracking not initialized at {tc_dir}. Run tc_init.py first."
print(json.dumps({"status": "error", "error": msg}) if args.json else f"ERROR: {msg}")
return 2
try:
config = json.loads(config_path.read_text(encoding="utf-8"))
registry = json.loads(registry_path.read_text(encoding="utf-8"))
except (OSError, json.JSONDecodeError) as e:
msg = f"Failed to read config/registry: {e}"
print(json.dumps({"status": "error", "error": msg}) if args.json else f"ERROR: {msg}")
return 2
project_name = config.get("project_name", "Unknown Project")
author = args.author or config.get("default_author", "Claude")
session_id = args.session_id or f"session-{int(datetime.now().timestamp())}-{os.getpid()}"
if len(args.title) < 5 or len(args.title) > 120:
msg = "Title must be 5-120 characters."
print(json.dumps({"status": "error", "error": msg}) if args.json else f"ERROR: {msg}")
return 2
if len(args.summary) < 10:
msg = "Summary must be at least 10 characters."
print(json.dumps({"status": "error", "error": msg}) if args.json else f"ERROR: {msg}")
return 2
name_slug = slugify(args.name)
if not name_slug:
msg = "Invalid name slug."
print(json.dumps({"status": "error", "error": msg}) if args.json else f"ERROR: {msg}")
return 2
next_num = registry.get("next_tc_number", 1)
today = datetime.now()
tc_id = f"TC-{next_num:03d}-{date_slug(today)}-{name_slug}"
record_dir = tc_dir / "records" / tc_id
if record_dir.exists():
msg = f"Record directory already exists: {record_dir}"
print(json.dumps({"status": "error", "error": msg}) if args.json else f"ERROR: {msg}")
return 2
record = build_record(
tc_id=tc_id,
title=args.title,
scope=args.scope,
priority=args.priority,
summary=args.summary,
motivation=args.motivation,
project_name=project_name,
author=author,
session_id=session_id,
platform=args.platform,
model=args.model,
)
try:
record_dir.mkdir(parents=True, exist_ok=False)
(tc_dir / "evidence" / tc_id).mkdir(parents=True, exist_ok=True)
write_json_atomic(record_dir / "tc_record.json", record)
except OSError as e:
msg = f"Failed to write record: {e}"
print(json.dumps({"status": "error", "error": msg}) if args.json else f"ERROR: {msg}")
return 2
registry_entry = {
"tc_id": tc_id,
"title": args.title,
"status": "planned",
"scope": args.scope,
"priority": args.priority,
"created": record["created"],
"updated": record["updated"],
"path": f"records/{tc_id}/tc_record.json",
}
registry["records"].append(registry_entry)
registry["next_tc_number"] = next_num + 1
registry["updated"] = now_iso()
registry["statistics"] = compute_stats(registry["records"])
try:
write_json_atomic(registry_path, registry)
except OSError as e:
msg = f"Failed to update registry: {e}"
print(json.dumps({"status": "error", "error": msg}) if args.json else f"ERROR: {msg}")
return 2
result = {
"status": "created",
"tc_id": tc_id,
"title": args.title,
"scope": args.scope,
"priority": args.priority,
"record_path": str(record_dir / "tc_record.json"),
}
if args.json:
print(json.dumps(result, indent=2))
else:
print(f"Created {tc_id}")
print(f" Title: {args.title}")
print(f" Scope: {args.scope}")
print(f" Priority: {args.priority}")
print(f" Record: {record_dir / 'tc_record.json'}")
print()
print(f"Next: tc_update.py --root {args.root} --tc-id {tc_id} --set-status in_progress")
return 0
if __name__ == "__main__":
sys.exit(main())


@@ -0,0 +1,196 @@
#!/usr/bin/env python3
"""TC Init — Initialize TC tracking inside a project.
Creates docs/TC/ with tc_config.json, tc_registry.json, records/, and evidence/.
Idempotent: re-running on an already-initialized project reports current stats
and exits cleanly.
Usage:
python3 tc_init.py --project "My Project" --root .
python3 tc_init.py --project "My Project" --root /path/to/project --json
Exit codes:
0 = initialized OR already initialized
1 = warnings (e.g. partial state)
2 = bad CLI args / I/O error
"""
from __future__ import annotations
import argparse
import json
import sys
from datetime import datetime, timezone
from pathlib import Path
VALID_STATUSES = ("planned", "in_progress", "blocked", "implemented", "tested", "deployed")
VALID_SCOPES = ("feature", "bugfix", "refactor", "infrastructure", "documentation", "hotfix", "enhancement")
VALID_PRIORITIES = ("critical", "high", "medium", "low")
def now_iso() -> str:
return datetime.now(timezone.utc).isoformat(timespec="seconds")
def detect_project_name(root: Path) -> str:
"""Try CLAUDE.md heading, package.json name, pyproject.toml name, then directory basename."""
claude_md = root / "CLAUDE.md"
if claude_md.exists():
try:
for line in claude_md.read_text(encoding="utf-8").splitlines():
line = line.strip()
if line.startswith("# "):
return line[2:].strip()
except OSError:
pass
pkg = root / "package.json"
if pkg.exists():
try:
data = json.loads(pkg.read_text(encoding="utf-8"))
name = data.get("name")
if isinstance(name, str) and name.strip():
return name.strip()
except (OSError, json.JSONDecodeError):
pass
pyproject = root / "pyproject.toml"
if pyproject.exists():
try:
for line in pyproject.read_text(encoding="utf-8").splitlines():
stripped = line.strip()
                if "=" in stripped and stripped.split("=", 1)[0].strip() == "name":
                    value = stripped.split("=", 1)[1].strip().strip('"').strip("'")
if value:
return value
except OSError:
pass
return root.resolve().name
def build_config(project_name: str) -> dict:
return {
"project_name": project_name,
"tc_root": "docs/TC",
"created": now_iso(),
"auto_track": True,
"default_author": "Claude",
"categories": list(VALID_SCOPES),
}
def build_registry(project_name: str) -> dict:
return {
"project_name": project_name,
"created": now_iso(),
"updated": now_iso(),
"next_tc_number": 1,
"records": [],
"statistics": {
"total": 0,
"by_status": {s: 0 for s in VALID_STATUSES},
"by_scope": {s: 0 for s in VALID_SCOPES},
"by_priority": {p: 0 for p in VALID_PRIORITIES},
},
}
def write_json_atomic(path: Path, data: dict) -> None:
"""Write JSON to a temp file and rename, to avoid partial writes."""
tmp = path.with_suffix(path.suffix + ".tmp")
tmp.write_text(json.dumps(data, indent=2) + "\n", encoding="utf-8")
tmp.replace(path)
def main() -> int:
parser = argparse.ArgumentParser(description="Initialize TC tracking in a project.")
parser.add_argument("--root", default=".", help="Project root directory (default: current directory)")
parser.add_argument("--project", help="Project name (auto-detected if omitted)")
parser.add_argument("--force", action="store_true", help="Re-initialize even if config exists (preserves registry)")
parser.add_argument("--json", action="store_true", help="Output as JSON")
args = parser.parse_args()
root = Path(args.root).resolve()
if not root.exists() or not root.is_dir():
msg = f"Project root does not exist or is not a directory: {root}"
print(json.dumps({"status": "error", "error": msg}) if args.json else f"ERROR: {msg}")
return 2
tc_dir = root / "docs" / "TC"
config_path = tc_dir / "tc_config.json"
registry_path = tc_dir / "tc_registry.json"
if config_path.exists() and not args.force:
try:
cfg = json.loads(config_path.read_text(encoding="utf-8"))
except (OSError, json.JSONDecodeError) as e:
msg = f"Existing tc_config.json is unreadable: {e}"
print(json.dumps({"status": "error", "error": msg}) if args.json else f"ERROR: {msg}")
return 2
stats = {}
if registry_path.exists():
try:
reg = json.loads(registry_path.read_text(encoding="utf-8"))
stats = reg.get("statistics", {})
except (OSError, json.JSONDecodeError):
stats = {}
result = {
"status": "already_initialized",
"project_name": cfg.get("project_name"),
"tc_root": str(tc_dir),
"statistics": stats,
}
if args.json:
print(json.dumps(result, indent=2))
else:
print(f"TC tracking already initialized for project '{cfg.get('project_name')}'.")
print(f" TC root: {tc_dir}")
if stats:
print(f" Total TCs: {stats.get('total', 0)}")
return 0
project_name = args.project or detect_project_name(root)
try:
tc_dir.mkdir(parents=True, exist_ok=True)
(tc_dir / "records").mkdir(exist_ok=True)
(tc_dir / "evidence").mkdir(exist_ok=True)
write_json_atomic(config_path, build_config(project_name))
if not registry_path.exists() or args.force:
write_json_atomic(registry_path, build_registry(project_name))
except OSError as e:
msg = f"Failed to create TC directories or files: {e}"
print(json.dumps({"status": "error", "error": msg}) if args.json else f"ERROR: {msg}")
return 2
result = {
"status": "initialized",
"project_name": project_name,
"tc_root": str(tc_dir),
"files_created": [
str(config_path),
str(registry_path),
str(tc_dir / "records"),
str(tc_dir / "evidence"),
],
}
if args.json:
print(json.dumps(result, indent=2))
else:
print(f"Initialized TC tracking for project '{project_name}'")
print(f" TC root: {tc_dir}")
print(f" Config: {config_path}")
print(f" Registry: {registry_path}")
print(f" Records: {tc_dir / 'records'}")
print(f" Evidence: {tc_dir / 'evidence'}")
print()
print("Next: python3 tc_create.py --root . --name <slug> --title <title> --scope <scope> ...")
return 0
if __name__ == "__main__":
sys.exit(main())


@@ -0,0 +1,200 @@
#!/usr/bin/env python3
"""TC Status — Show TC status for one record or the entire registry.
Usage:
# Single TC
python3 tc_status.py --root . --tc-id <TC-ID>
python3 tc_status.py --root . --tc-id <TC-ID> --json
# All TCs (registry summary)
python3 tc_status.py --root . --all
python3 tc_status.py --root . --all --json
Exit codes:
0 = ok
1 = warnings (e.g. validation issues found while reading)
2 = critical error (file missing, parse error, bad args)
"""
from __future__ import annotations
import argparse
import json
import sys
from pathlib import Path
def find_record_path(tc_dir: Path, tc_id: str) -> Path | None:
direct = tc_dir / "records" / tc_id / "tc_record.json"
if direct.exists():
return direct
for entry in (tc_dir / "records").glob("*"):
if entry.is_dir() and entry.name.startswith(tc_id):
candidate = entry / "tc_record.json"
if candidate.exists():
return candidate
return None
def render_single(record: dict) -> str:
lines = []
lines.append(f"TC: {record.get('tc_id')}")
lines.append(f" Title: {record.get('title')}")
lines.append(f" Status: {record.get('status')}")
lines.append(f" Priority: {record.get('priority')}")
desc = record.get("description", {}) or {}
lines.append(f" Scope: {desc.get('scope')}")
lines.append(f" Created: {record.get('created')}")
lines.append(f" Updated: {record.get('updated')}")
lines.append(f" Author: {record.get('created_by')}")
lines.append("")
summary = desc.get("summary") or ""
if summary:
lines.append(f" Summary: {summary}")
motivation = desc.get("motivation") or ""
if motivation:
lines.append(f" Motivation: {motivation}")
lines.append("")
files = record.get("files_affected", []) or []
lines.append(f" Files affected: {len(files)}")
for f in files[:10]:
lines.append(f" - {f.get('path')} ({f.get('action')})")
if len(files) > 10:
lines.append(f" ... and {len(files) - 10} more")
lines.append("")
tests = record.get("test_cases", []) or []
pass_count = sum(1 for t in tests if t.get("status") == "pass")
fail_count = sum(1 for t in tests if t.get("status") == "fail")
lines.append(f" Tests: {pass_count} pass / {fail_count} fail / {len(tests)} total")
lines.append("")
revs = record.get("revision_history", []) or []
lines.append(f" Revisions: {len(revs)}")
if revs:
latest = revs[-1]
lines.append(f" Latest: {latest.get('revision_id')} {latest.get('timestamp')}")
lines.append(f" {latest.get('author')}: {latest.get('summary')}")
lines.append("")
handoff = (record.get("session_context", {}) or {}).get("handoff", {}) or {}
if any(handoff.get(k) for k in ("progress_summary", "next_steps", "blockers", "key_context")):
lines.append(" Handoff:")
if handoff.get("progress_summary"):
lines.append(f" Progress: {handoff['progress_summary']}")
if handoff.get("next_steps"):
lines.append(" Next steps:")
for s in handoff["next_steps"]:
lines.append(f" - {s}")
if handoff.get("blockers"):
lines.append(" Blockers:")
for b in handoff["blockers"]:
lines.append(f" ! {b}")
if handoff.get("key_context"):
lines.append(" Key context:")
for c in handoff["key_context"]:
lines.append(f" * {c}")
appr = record.get("approval", {}) or {}
lines.append("")
lines.append(f" Approved: {appr.get('approved')} ({appr.get('test_coverage_status')} coverage)")
if appr.get("approved"):
lines.append(f" By: {appr.get('approved_by')} on {appr.get('approved_date')}")
return "\n".join(lines)
def render_registry(registry: dict) -> str:
lines = []
lines.append(f"Project: {registry.get('project_name')}")
lines.append(f"Updated: {registry.get('updated')}")
stats = registry.get("statistics", {}) or {}
lines.append(f"Total TCs: {stats.get('total', 0)}")
by_status = stats.get("by_status", {}) or {}
lines.append("By status:")
for status, count in by_status.items():
if count:
lines.append(f" {status:12} {count}")
lines.append("")
records = registry.get("records", []) or []
if records:
lines.append(f"{'TC ID':40} {'Status':14} {'Scope':14} {'Priority':10} Title")
lines.append("-" * 100)
for rec in records:
lines.append("{:40} {:14} {:14} {:10} {}".format(
rec.get("tc_id", "")[:40],
rec.get("status", "")[:14],
rec.get("scope", "")[:14],
rec.get("priority", "")[:10],
rec.get("title", ""),
))
else:
lines.append("No TC records yet. Run tc_create.py to add one.")
return "\n".join(lines)
def main() -> int:
parser = argparse.ArgumentParser(description="Show TC status.")
parser.add_argument("--root", default=".", help="Project root (default: current directory)")
group = parser.add_mutually_exclusive_group(required=True)
group.add_argument("--tc-id", help="Show this single TC")
group.add_argument("--all", action="store_true", help="Show registry summary for all TCs")
parser.add_argument("--json", action="store_true", help="Output as JSON")
args = parser.parse_args()
root = Path(args.root).resolve()
tc_dir = root / "docs" / "TC"
registry_path = tc_dir / "tc_registry.json"
if not registry_path.exists():
msg = f"TC tracking not initialized at {tc_dir}. Run tc_init.py first."
print(json.dumps({"status": "error", "error": msg}) if args.json else f"ERROR: {msg}")
return 2
try:
registry = json.loads(registry_path.read_text(encoding="utf-8"))
except (OSError, json.JSONDecodeError) as e:
msg = f"Failed to read registry: {e}"
print(json.dumps({"status": "error", "error": msg}) if args.json else f"ERROR: {msg}")
return 2
if args.all:
if args.json:
print(json.dumps({
"status": "ok",
"project_name": registry.get("project_name"),
"updated": registry.get("updated"),
"statistics": registry.get("statistics", {}),
"records": registry.get("records", []),
}, indent=2))
else:
print(render_registry(registry))
return 0
record_path = find_record_path(tc_dir, args.tc_id)
if record_path is None:
msg = f"TC not found: {args.tc_id}"
print(json.dumps({"status": "error", "error": msg}) if args.json else f"ERROR: {msg}")
return 2
try:
record = json.loads(record_path.read_text(encoding="utf-8"))
except (OSError, json.JSONDecodeError) as e:
msg = f"Failed to read record: {e}"
print(json.dumps({"status": "error", "error": msg}) if args.json else f"ERROR: {msg}")
return 2
if args.json:
print(json.dumps({"status": "ok", "record": record}, indent=2))
else:
print(render_single(record))
return 0
if __name__ == "__main__":
sys.exit(main())


@@ -0,0 +1,361 @@
#!/usr/bin/env python3
"""TC Update — Update an existing TC record.
Each invocation appends a sequential R<n> revision entry, refreshes the
`updated` timestamp, validates the resulting record, and writes atomically.
Usage:
# Status transition (validated against state machine)
python3 tc_update.py --root . --tc-id <TC-ID> \\
--set-status in_progress --reason "Starting implementation"
# Add files
python3 tc_update.py --root . --tc-id <TC-ID> \\
--add-file src/auth.py:created \\
--add-file src/middleware.py:modified
# Add a test case
python3 tc_update.py --root . --tc-id <TC-ID> \\
--add-test "Login returns JWT" \\
--test-procedure "POST /login with valid creds" \\
--test-expected "200 + token in body"
# Append handoff data
python3 tc_update.py --root . --tc-id <TC-ID> \\
--handoff-progress "JWT middleware wired up" \\
--handoff-next "Write integration tests" \\
--handoff-next "Update README" \\
--handoff-blocker "Waiting on test fixtures"
# Append a freeform note
python3 tc_update.py --root . --tc-id <TC-ID> --note "Decision: use HS256"
Exit codes:
0 = updated
1 = warnings (e.g. validation produced errors but write skipped)
2 = critical error (file missing, invalid transition, parse error)
"""
from __future__ import annotations
import argparse
import json
import sys
from datetime import datetime, timezone
from pathlib import Path
VALID_STATUSES = ("planned", "in_progress", "blocked", "implemented", "tested", "deployed")
VALID_TRANSITIONS = {
"planned": ["in_progress", "blocked"],
"in_progress": ["blocked", "implemented"],
"blocked": ["in_progress", "planned"],
"implemented": ["tested", "in_progress"],
"tested": ["deployed", "in_progress"],
"deployed": ["in_progress"],
}
VALID_FILE_ACTIONS = ("created", "modified", "deleted", "renamed")
VALID_TEST_STATUSES = ("pending", "pass", "fail", "skip", "blocked")
VALID_SCOPES = ("feature", "bugfix", "refactor", "infrastructure", "documentation", "hotfix", "enhancement")
VALID_PRIORITIES = ("critical", "high", "medium", "low")
def now_iso() -> str:
return datetime.now(timezone.utc).isoformat(timespec="seconds")
def write_json_atomic(path: Path, data: dict) -> None:
tmp = path.with_suffix(path.suffix + ".tmp")
tmp.write_text(json.dumps(data, indent=2) + "\n", encoding="utf-8")
tmp.replace(path)
def find_record_path(tc_dir: Path, tc_id: str) -> Path | None:
direct = tc_dir / "records" / tc_id / "tc_record.json"
if direct.exists():
return direct
for entry in (tc_dir / "records").glob("*"):
if entry.is_dir() and entry.name.startswith(tc_id):
candidate = entry / "tc_record.json"
if candidate.exists():
return candidate
return None
def validate_transition(current: str, new: str) -> str | None:
if current == new:
return None
allowed = VALID_TRANSITIONS.get(current, [])
if new not in allowed:
return f"Invalid transition '{current}' -> '{new}'. Allowed: {', '.join(allowed) or 'none'}"
return None
def next_revision_id(record: dict) -> str:
return f"R{len(record.get('revision_history', [])) + 1}"
def next_test_id(record: dict) -> str:
return f"T{len(record.get('test_cases', [])) + 1}"
def compute_stats(records: list) -> dict:
stats = {
"total": len(records),
"by_status": {s: 0 for s in VALID_STATUSES},
"by_scope": {s: 0 for s in VALID_SCOPES},
"by_priority": {p: 0 for p in VALID_PRIORITIES},
}
for rec in records:
for key, bucket in (("status", "by_status"), ("scope", "by_scope"), ("priority", "by_priority")):
v = rec.get(key, "")
if v in stats[bucket]:
stats[bucket][v] += 1
return stats
def parse_file_arg(spec: str) -> tuple[str, str]:
"""Parse 'path:action' or just 'path' (default action: modified)."""
if ":" in spec:
path, action = spec.rsplit(":", 1)
action = action.strip()
if action not in VALID_FILE_ACTIONS:
raise ValueError(f"Invalid file action '{action}'. Must be one of {VALID_FILE_ACTIONS}")
return path.strip(), action
return spec.strip(), "modified"
def main() -> int:
parser = argparse.ArgumentParser(description="Update an existing TC record.")
parser.add_argument("--root", default=".", help="Project root (default: current directory)")
parser.add_argument("--tc-id", required=True, help="Target TC ID (full or prefix)")
parser.add_argument("--author", default=None, help="Author for this revision (defaults to config)")
parser.add_argument("--reason", default="", help="Reason for the change (recorded in revision)")
parser.add_argument("--set-status", choices=VALID_STATUSES, help="Transition status (state machine enforced)")
parser.add_argument("--add-file", action="append", default=[], metavar="path[:action]",
help="Add a file. Action defaults to 'modified'. Repeatable.")
parser.add_argument("--add-test", help="Add a test case with this title")
parser.add_argument("--test-procedure", action="append", default=[],
help="Procedure step for the test being added. Repeatable.")
parser.add_argument("--test-expected", help="Expected result for the test being added")
parser.add_argument("--handoff-progress", help="Set progress_summary in handoff")
parser.add_argument("--handoff-next", action="append", default=[], help="Append to next_steps. Repeatable.")
parser.add_argument("--handoff-blocker", action="append", default=[], help="Append to blockers. Repeatable.")
parser.add_argument("--handoff-context", action="append", default=[], help="Append to key_context. Repeatable.")
parser.add_argument("--note", help="Append a freeform note (with timestamp)")
parser.add_argument("--tag", action="append", default=[], help="Add a tag. Repeatable.")
parser.add_argument("--json", action="store_true", help="Output as JSON")
args = parser.parse_args()
root = Path(args.root).resolve()
tc_dir = root / "docs" / "TC"
config_path = tc_dir / "tc_config.json"
registry_path = tc_dir / "tc_registry.json"
if not config_path.exists() or not registry_path.exists():
msg = f"TC tracking not initialized at {tc_dir}. Run tc_init.py first."
print(json.dumps({"status": "error", "error": msg}) if args.json else f"ERROR: {msg}")
return 2
record_path = find_record_path(tc_dir, args.tc_id)
if record_path is None:
msg = f"TC not found: {args.tc_id}"
print(json.dumps({"status": "error", "error": msg}) if args.json else f"ERROR: {msg}")
return 2
try:
config = json.loads(config_path.read_text(encoding="utf-8"))
registry = json.loads(registry_path.read_text(encoding="utf-8"))
record = json.loads(record_path.read_text(encoding="utf-8"))
except (OSError, json.JSONDecodeError) as e:
msg = f"Failed to read JSON: {e}"
print(json.dumps({"status": "error", "error": msg}) if args.json else f"ERROR: {msg}")
return 2
author = args.author or config.get("default_author", "Claude")
ts = now_iso()
field_changes = []
summary_parts = []
if args.set_status:
current = record.get("status")
new = args.set_status
err = validate_transition(current, new)
if err:
print(json.dumps({"status": "error", "error": err}) if args.json else f"ERROR: {err}")
return 2
if current != new:
record["status"] = new
field_changes.append({
"field": "status", "action": "changed",
"old_value": current, "new_value": new, "reason": args.reason or None,
})
summary_parts.append(f"status: {current} -> {new}")
for spec in args.add_file:
try:
path, action = parse_file_arg(spec)
except ValueError as e:
print(json.dumps({"status": "error", "error": str(e)}) if args.json else f"ERROR: {e}")
return 2
record.setdefault("files_affected", []).append({
"path": path, "action": action, "description": None,
"lines_added": None, "lines_removed": None,
})
field_changes.append({
"field": "files_affected", "action": "added",
"new_value": {"path": path, "action": action},
"reason": args.reason or None,
})
summary_parts.append(f"+file {path} ({action})")
if args.add_test:
if not args.test_procedure or not args.test_expected:
msg = "--add-test requires at least one --test-procedure and --test-expected"
print(json.dumps({"status": "error", "error": msg}) if args.json else f"ERROR: {msg}")
return 2
test_id = next_test_id(record)
new_test = {
"test_id": test_id,
"title": args.add_test,
"procedure": list(args.test_procedure),
"expected_result": args.test_expected,
"actual_result": None,
"status": "pending",
"evidence": [],
"tested_by": None,
"tested_date": None,
}
record.setdefault("test_cases", []).append(new_test)
field_changes.append({
"field": "test_cases", "action": "added",
"new_value": test_id, "reason": args.reason or None,
})
summary_parts.append(f"+test {test_id}: {args.add_test}")
handoff = record.setdefault("session_context", {}).setdefault("handoff", {
"progress_summary": "", "next_steps": [], "blockers": [],
"key_context": [], "files_in_progress": [], "decisions_made": [],
})
if args.handoff_progress is not None:
old = handoff.get("progress_summary", "")
handoff["progress_summary"] = args.handoff_progress
field_changes.append({
"field": "session_context.handoff.progress_summary",
"action": "changed", "old_value": old, "new_value": args.handoff_progress,
"reason": args.reason or None,
})
summary_parts.append("handoff: updated progress_summary")
for step in args.handoff_next:
handoff.setdefault("next_steps", []).append(step)
field_changes.append({
"field": "session_context.handoff.next_steps",
"action": "added", "new_value": step, "reason": args.reason or None,
})
summary_parts.append(f"handoff: +next_step '{step}'")
for blk in args.handoff_blocker:
handoff.setdefault("blockers", []).append(blk)
field_changes.append({
"field": "session_context.handoff.blockers",
"action": "added", "new_value": blk, "reason": args.reason or None,
})
summary_parts.append(f"handoff: +blocker '{blk}'")
for ctx in args.handoff_context:
handoff.setdefault("key_context", []).append(ctx)
field_changes.append({
"field": "session_context.handoff.key_context",
"action": "added", "new_value": ctx, "reason": args.reason or None,
})
        summary_parts.append(f"handoff: +context '{ctx}'")
if args.note:
existing = record.get("notes", "") or ""
addition = f"[{ts}] {args.note}"
record["notes"] = (existing + "\n" + addition).strip() if existing else addition
field_changes.append({
"field": "notes", "action": "added",
"new_value": args.note, "reason": args.reason or None,
})
summary_parts.append("note appended")
for tag in args.tag:
if tag not in record.setdefault("tags", []):
record["tags"].append(tag)
field_changes.append({
"field": "tags", "action": "added",
"new_value": tag, "reason": args.reason or None,
})
summary_parts.append(f"+tag {tag}")
if not field_changes:
msg = "No changes specified. Use --set-status, --add-file, --add-test, --handoff-*, --note, or --tag."
print(json.dumps({"status": "noop", "message": msg}) if args.json else msg)
return 0
revision = {
"revision_id": next_revision_id(record),
"timestamp": ts,
"author": author,
"summary": "; ".join(summary_parts) if summary_parts else "TC updated",
"field_changes": field_changes,
}
record.setdefault("revision_history", []).append(revision)
record["updated"] = ts
meta = record.setdefault("metadata", {})
meta["last_modified"] = ts
meta["last_modified_by"] = author
cs = record.setdefault("session_context", {}).setdefault("current_session", {})
cs["last_active"] = ts
try:
write_json_atomic(record_path, record)
except OSError as e:
msg = f"Failed to write record: {e}"
print(json.dumps({"status": "error", "error": msg}) if args.json else f"ERROR: {msg}")
return 2
for entry in registry.get("records", []):
if entry.get("tc_id") == record["tc_id"]:
entry["status"] = record["status"]
entry["updated"] = ts
break
registry["updated"] = ts
registry["statistics"] = compute_stats(registry.get("records", []))
try:
write_json_atomic(registry_path, registry)
except OSError as e:
msg = f"Failed to update registry: {e}"
print(json.dumps({"status": "error", "error": msg}) if args.json else f"ERROR: {msg}")
return 2
result = {
"status": "updated",
"tc_id": record["tc_id"],
"revision": revision["revision_id"],
"summary": revision["summary"],
"current_status": record["status"],
}
if args.json:
print(json.dumps(result, indent=2))
else:
print(f"Updated {record['tc_id']} ({revision['revision_id']})")
print(f" {revision['summary']}")
print(f" Status: {record['status']}")
return 0
if __name__ == "__main__":
sys.exit(main())


@@ -0,0 +1,347 @@
#!/usr/bin/env python3
"""TC Validator — Validate a TC record or registry against the schema and state machine.
Enforces:
* Schema shape (required fields, types, enum values)
* State machine transitions (planned -> in_progress -> implemented -> tested -> deployed)
* Sequential R<n> revision IDs and T<n> test IDs
* TC ID format (TC-NNN-MM-DD-YY-slug)
* Sub-TC ID format (TC-NNN.A or TC-NNN.A.N)
* Approval consistency (approved=true requires approved_by + approved_date)
Usage:
python3 tc_validator.py --record path/to/tc_record.json
python3 tc_validator.py --registry path/to/tc_registry.json
python3 tc_validator.py --record path/to/tc_record.json --json
Exit codes:
0 = valid
1 = validation errors
2 = file not found / JSON parse error / bad CLI args
"""
from __future__ import annotations
import argparse
import json
import re
import sys
from datetime import datetime
from pathlib import Path
VALID_STATUSES = ("planned", "in_progress", "blocked", "implemented", "tested", "deployed")
VALID_TRANSITIONS = {
"planned": ["in_progress", "blocked"],
"in_progress": ["blocked", "implemented"],
"blocked": ["in_progress", "planned"],
"implemented": ["tested", "in_progress"],
"tested": ["deployed", "in_progress"],
"deployed": ["in_progress"],
}
VALID_SCOPES = ("feature", "bugfix", "refactor", "infrastructure", "documentation", "hotfix", "enhancement")
VALID_PRIORITIES = ("critical", "high", "medium", "low")
VALID_FILE_ACTIONS = ("created", "modified", "deleted", "renamed")
VALID_TEST_STATUSES = ("pending", "pass", "fail", "skip", "blocked")
VALID_EVIDENCE_TYPES = ("log_snippet", "screenshot", "file_reference", "command_output")
VALID_FIELD_CHANGE_ACTIONS = ("set", "changed", "added", "removed")
VALID_PLATFORMS = ("claude_code", "claude_web", "api", "other")
VALID_COVERAGE = ("none", "partial", "full")
VALID_FILE_IN_PROGRESS_STATES = ("editing", "needs_review", "partially_done", "ready")
TC_ID_PATTERN = re.compile(r"^TC-\d{3}-\d{2}-\d{2}-\d{2}-[a-z0-9]+(-[a-z0-9]+)*$")
SUB_TC_PATTERN = re.compile(r"^TC-\d{3}\.[A-Z](\.\d+)?$")
REVISION_ID_PATTERN = re.compile(r"^R(\d+)$")
TEST_ID_PATTERN = re.compile(r"^T(\d+)$")
def _enum(value, valid, name):
if value not in valid:
return [f"Field '{name}' has invalid value '{value}'. Must be one of: {', '.join(str(v) for v in valid)}"]
return []
def _string(value, name, min_length=0, max_length=None):
errors = []
if not isinstance(value, str):
return [f"Field '{name}' must be a string, got {type(value).__name__}"]
if len(value) < min_length:
errors.append(f"Field '{name}' must be at least {min_length} characters, got {len(value)}")
if max_length is not None and len(value) > max_length:
errors.append(f"Field '{name}' must be at most {max_length} characters, got {len(value)}")
return errors
def _iso(value, name):
if value is None:
return []
if not isinstance(value, str):
return [f"Field '{name}' must be an ISO 8601 datetime string"]
try:
datetime.fromisoformat(value)
except ValueError:
return [f"Field '{name}' is not a valid ISO 8601 datetime: '{value}'"]
return []
def _required(record, fields, prefix=""):
errors = []
for f in fields:
if f not in record:
path = f"{prefix}.{f}" if prefix else f
errors.append(f"Missing required field: '{path}'")
return errors
def validate_tc_id(tc_id):
"""Validate a TC identifier."""
if not isinstance(tc_id, str):
return [f"tc_id must be a string, got {type(tc_id).__name__}"]
if not TC_ID_PATTERN.match(tc_id):
return [f"tc_id '{tc_id}' does not match pattern TC-NNN-MM-DD-YY-slug"]
return []
def validate_state_transition(current, new):
"""Validate a state machine transition. Same-status is a no-op."""
errors = []
if current not in VALID_STATUSES:
errors.append(f"Current status '{current}' is invalid")
if new not in VALID_STATUSES:
errors.append(f"New status '{new}' is invalid")
if errors:
return errors
if current == new:
return []
allowed = VALID_TRANSITIONS.get(current, [])
if new not in allowed:
return [f"Invalid transition '{current}' -> '{new}'. Allowed from '{current}': {', '.join(allowed) or 'none'}"]
return []
def validate_tc_record(record):
"""Validate a TC record dict against the schema."""
errors = []
if not isinstance(record, dict):
return [f"TC record must be a JSON object, got {type(record).__name__}"]
top_required = [
"tc_id", "title", "status", "priority", "created", "updated",
"created_by", "project", "description", "files_affected",
"revision_history", "test_cases", "approval", "session_context",
"tags", "related_tcs", "notes", "metadata",
]
errors.extend(_required(record, top_required))
if "tc_id" in record:
errors.extend(validate_tc_id(record["tc_id"]))
if "title" in record:
errors.extend(_string(record["title"], "title", 5, 120))
if "status" in record:
errors.extend(_enum(record["status"], VALID_STATUSES, "status"))
if "priority" in record:
errors.extend(_enum(record["priority"], VALID_PRIORITIES, "priority"))
for ts in ("created", "updated"):
if ts in record:
errors.extend(_iso(record[ts], ts))
if "created_by" in record:
errors.extend(_string(record["created_by"], "created_by", 1))
if "project" in record:
errors.extend(_string(record["project"], "project", 1))
desc = record.get("description")
if isinstance(desc, dict):
errors.extend(_required(desc, ["summary", "motivation", "scope"], "description"))
if "summary" in desc:
errors.extend(_string(desc["summary"], "description.summary", 10))
if "motivation" in desc:
errors.extend(_string(desc["motivation"], "description.motivation", 1))
if "scope" in desc:
errors.extend(_enum(desc["scope"], VALID_SCOPES, "description.scope"))
elif "description" in record:
errors.append("Field 'description' must be an object")
files = record.get("files_affected")
if isinstance(files, list):
for i, f in enumerate(files):
prefix = f"files_affected[{i}]"
if not isinstance(f, dict):
errors.append(f"{prefix} must be an object")
continue
errors.extend(_required(f, ["path", "action"], prefix))
if "action" in f:
errors.extend(_enum(f["action"], VALID_FILE_ACTIONS, f"{prefix}.action"))
elif "files_affected" in record:
errors.append("Field 'files_affected' must be an array")
revs = record.get("revision_history")
if isinstance(revs, list):
if len(revs) < 1:
errors.append("revision_history must have at least 1 entry")
for i, rev in enumerate(revs):
prefix = f"revision_history[{i}]"
if not isinstance(rev, dict):
errors.append(f"{prefix} must be an object")
continue
errors.extend(_required(rev, ["revision_id", "timestamp", "author", "summary"], prefix))
rid = rev.get("revision_id")
if isinstance(rid, str):
m = REVISION_ID_PATTERN.match(rid)
if not m:
errors.append(f"{prefix}.revision_id '{rid}' must match R<n>")
elif int(m.group(1)) != i + 1:
errors.append(f"{prefix}.revision_id is '{rid}' but expected 'R{i + 1}' (must be sequential)")
if "timestamp" in rev:
errors.extend(_iso(rev["timestamp"], f"{prefix}.timestamp"))
elif "revision_history" in record:
errors.append("Field 'revision_history' must be an array")
tests = record.get("test_cases")
if isinstance(tests, list):
for i, tc in enumerate(tests):
prefix = f"test_cases[{i}]"
if not isinstance(tc, dict):
errors.append(f"{prefix} must be an object")
continue
errors.extend(_required(tc, ["test_id", "title", "procedure", "expected_result", "status"], prefix))
tid = tc.get("test_id")
if isinstance(tid, str):
m = TEST_ID_PATTERN.match(tid)
if not m:
errors.append(f"{prefix}.test_id '{tid}' must match T<n>")
elif int(m.group(1)) != i + 1:
errors.append(f"{prefix}.test_id is '{tid}' but expected 'T{i + 1}' (must be sequential)")
if "status" in tc:
errors.extend(_enum(tc["status"], VALID_TEST_STATUSES, f"{prefix}.status"))
appr = record.get("approval")
if isinstance(appr, dict):
errors.extend(_required(appr, ["approved", "test_coverage_status"], "approval"))
if appr.get("approved") is True:
if not appr.get("approved_by"):
errors.append("approval.approved_by is required when approval.approved is true")
if not appr.get("approved_date"):
errors.append("approval.approved_date is required when approval.approved is true")
if "test_coverage_status" in appr:
errors.extend(_enum(appr["test_coverage_status"], VALID_COVERAGE, "approval.test_coverage_status"))
elif "approval" in record:
errors.append("Field 'approval' must be an object")
ctx = record.get("session_context")
if isinstance(ctx, dict):
errors.extend(_required(ctx, ["current_session"], "session_context"))
cs = ctx.get("current_session")
if isinstance(cs, dict):
errors.extend(_required(cs, ["session_id", "platform", "model", "started"], "session_context.current_session"))
if "platform" in cs:
errors.extend(_enum(cs["platform"], VALID_PLATFORMS, "session_context.current_session.platform"))
if "started" in cs:
errors.extend(_iso(cs["started"], "session_context.current_session.started"))
meta = record.get("metadata")
if isinstance(meta, dict):
errors.extend(_required(meta, ["project", "created_by", "last_modified_by", "last_modified"], "metadata"))
if "last_modified" in meta:
errors.extend(_iso(meta["last_modified"], "metadata.last_modified"))
return errors
def validate_registry(registry):
"""Validate a TC registry dict."""
errors = []
if not isinstance(registry, dict):
return [f"Registry must be an object, got {type(registry).__name__}"]
errors.extend(_required(registry, ["project_name", "created", "updated", "next_tc_number", "records", "statistics"]))
if "next_tc_number" in registry:
v = registry["next_tc_number"]
if not isinstance(v, int) or v < 1:
errors.append(f"next_tc_number must be a positive integer, got {v}")
if isinstance(registry.get("records"), list):
for i, rec in enumerate(registry["records"]):
prefix = f"records[{i}]"
if not isinstance(rec, dict):
errors.append(f"{prefix} must be an object")
continue
errors.extend(_required(rec, ["tc_id", "title", "status", "scope", "priority", "created", "updated", "path"], prefix))
if "status" in rec:
errors.extend(_enum(rec["status"], VALID_STATUSES, f"{prefix}.status"))
if "scope" in rec:
errors.extend(_enum(rec["scope"], VALID_SCOPES, f"{prefix}.scope"))
if "priority" in rec:
errors.extend(_enum(rec["priority"], VALID_PRIORITIES, f"{prefix}.priority"))
return errors
def slugify(text):
"""Convert text to a kebab-case slug."""
text = text.lower().strip()
text = re.sub(r"[^a-z0-9\s-]", "", text)
text = re.sub(r"[\s_]+", "-", text)
text = re.sub(r"-+", "-", text)
return text.strip("-")
def compute_registry_statistics(records):
"""Recompute registry statistics from the records array."""
stats = {
"total": len(records),
"by_status": {s: 0 for s in VALID_STATUSES},
"by_scope": {s: 0 for s in VALID_SCOPES},
"by_priority": {p: 0 for p in VALID_PRIORITIES},
}
for rec in records:
for key, bucket in (("status", "by_status"), ("scope", "by_scope"), ("priority", "by_priority")):
v = rec.get(key, "")
if v in stats[bucket]:
stats[bucket][v] += 1
return stats
def main():
parser = argparse.ArgumentParser(description="Validate a TC record or registry.")
group = parser.add_mutually_exclusive_group(required=True)
group.add_argument("--record", help="Path to tc_record.json")
group.add_argument("--registry", help="Path to tc_registry.json")
parser.add_argument("--json", action="store_true", help="Output results as JSON")
args = parser.parse_args()
target = args.record or args.registry
path = Path(target)
if not path.exists():
msg = f"File not found: {path}"
print(json.dumps({"status": "error", "error": msg}) if args.json else f"ERROR: {msg}")
return 2
try:
data = json.loads(path.read_text(encoding="utf-8"))
except json.JSONDecodeError as e:
msg = f"Invalid JSON in {path}: {e}"
print(json.dumps({"status": "error", "error": msg}) if args.json else f"ERROR: {msg}")
return 2
errors = validate_registry(data) if args.registry else validate_tc_record(data)
if args.json:
result = {
"status": "valid" if not errors else "invalid",
"file": str(path),
"kind": "registry" if args.registry else "record",
"error_count": len(errors),
"errors": errors,
}
print(json.dumps(result, indent=2))
else:
if errors:
print(f"VALIDATION ERRORS ({len(errors)}):")
for i, err in enumerate(errors, 1):
print(f" {i}. {err}")
else:
print("VALID")
return 1 if errors else 0
if __name__ == "__main__":
sys.exit(main())
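
The state machine and ID rules enforced above can be exercised standalone. The sketch below mirrors `VALID_TRANSITIONS` and `TC_ID_PATTERN` from `tc_validator.py` in a self-contained demo (the table and regex are copied here only for illustration, not imported from the module):

```python
import re

# Mirrors VALID_TRANSITIONS in tc_validator.py (copied verbatim for a standalone demo).
VALID_TRANSITIONS = {
    "planned": ["in_progress", "blocked"],
    "in_progress": ["blocked", "implemented"],
    "blocked": ["in_progress", "planned"],
    "implemented": ["tested", "in_progress"],
    "tested": ["deployed", "in_progress"],
    "deployed": ["in_progress"],
}

def can_transition(current, new):
    # Same-status updates are no-ops, matching validate_state_transition().
    return current == new or new in VALID_TRANSITIONS.get(current, [])

# Happy path: planned -> in_progress -> implemented -> tested -> deployed.
path = ["planned", "in_progress", "implemented", "tested", "deployed"]
assert all(can_transition(a, b) for a, b in zip(path, path[1:]))

# Skipping straight from planned to deployed is rejected.
assert not can_transition("planned", "deployed")

# TC ID format, mirroring TC_ID_PATTERN: TC-NNN-MM-DD-YY-slug.
TC_ID_PATTERN = re.compile(r"^TC-\d{3}-\d{2}-\d{2}-\d{2}-[a-z0-9]+(-[a-z0-9]+)*$")
assert TC_ID_PATTERN.match("TC-001-04-09-26-add-login-form")  # hypothetical ID
assert not TC_ID_PATTERN.match("TC-1-bad-id")
print("ok")
```

Running the script prints `ok`; any invalid transition or malformed ID would trip an assertion, which is the same behavior `validate_state_transition` and `validate_tc_id` surface as error strings.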