feat(code-to-prd): add analysis scripts, references, and tooling docs
- frontend_analyzer.py: scans codebase for routes, APIs, enums, framework detection
- prd_scaffolder.py: generates PRD directory with README, page stubs, appendix
- references/framework-patterns.md: React, Next.js, Vue, Nuxt, Angular, Svelte patterns
- references/prd-quality-checklist.md: validation checklist for generated PRDs
- SKILL.md updated with tooling section

Both scripts are stdlib-only (no pip install).
@@ -337,3 +337,37 @@ Each page's Markdown should be **standalone** — reading just that file gives c
| Ignoring dynamic route params | `/order/:id` = page requires an order ID to load |
| Forgetting permission controls | Document which roles see which buttons/pages |
| Assuming all APIs are real | Check for mock data patterns before documenting endpoints |
---

## Tooling

### Scripts

| Script | Purpose | Usage |
|--------|---------|-------|
| `scripts/frontend_analyzer.py` | Scan codebase → extract routes, APIs, enums, structure | `python3 frontend_analyzer.py /path/to/project` |
| `scripts/prd_scaffolder.py` | Generate PRD directory skeleton from analysis JSON | `python3 prd_scaffolder.py analysis.json` |

**Recommended workflow:**

```bash
# 1. Analyze the project (JSON output)
python3 scripts/frontend_analyzer.py /path/to/project -o analysis.json

# 2. Review the analysis (markdown summary)
python3 scripts/frontend_analyzer.py /path/to/project -f markdown

# 3. Scaffold the PRD directory with stubs
python3 scripts/prd_scaffolder.py analysis.json -o prd/ -n "My App"

# 4. Fill in TODO sections page-by-page using the SKILL.md workflow
```

Both scripts are **stdlib-only** — no pip install needed.

### References

| File | Contents |
|------|----------|
| `references/prd-quality-checklist.md` | Validation checklist for completeness, accuracy, readability |
| `references/framework-patterns.md` | Framework-specific patterns for routes, state, APIs, forms, permissions |
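Before scaffolding, it can help to sanity-check the analysis output programmatically. A minimal stdlib-only sketch (it assumes the `summary` keys emitted by `frontend_analyzer.py`):

```python
import json

def summarize(analysis: dict) -> str:
    """Return a one-line digest of an analysis.json payload."""
    s = analysis.get("summary", {})
    return (f"{s.get('pages', 0)} pages, "
            f"{s.get('api_endpoints', 0)} APIs "
            f"({s.get('api_mock', 0)} mock), "
            f"{s.get('enums', 0)} enums")

# Typical use: summarize(json.load(open("analysis.json")))
print(summarize({"summary": {"pages": 12, "api_endpoints": 30,
                             "api_mock": 4, "enums": 7}}))
# 12 pages, 30 APIs (4 mock), 7 enums
```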
121
product-team/code-to-prd/references/framework-patterns.md
Normal file
@@ -0,0 +1,121 @@
# Framework-Specific Patterns

Quick reference for identifying routes, components, state, and APIs across frontend frameworks.

## React (CRA / Vite)

| Aspect | Where to Look |
|--------|--------------|
| Routes | `react-router-dom` — `<Route path="...">` or `createBrowserRouter` |
| Components | `.tsx` / `.jsx` files, default exports |
| State | Redux (`store/`), Zustand, Jotai, Recoil, React Context |
| API | `axios`, `fetch`, TanStack Query (`useQuery`), SWR (`useSWR`) |
| Forms | React Hook Form, Formik, Ant Design Form, custom `useState` |
| i18n | `react-i18next`, `react-intl` |

## Next.js (App Router)

| Aspect | Where to Look |
|--------|--------------|
| Routes | `app/` directory — `page.tsx` = route, folders = segments |
| Layouts | `layout.tsx` per directory |
| Loading | `loading.tsx`, `error.tsx`, `not-found.tsx` |
| API routes | `app/api/` or `pages/api/` (Pages Router) |
| Server actions | `"use server"` directive |
| Middleware | `middleware.ts` at root |

## Next.js (Pages Router)

| Aspect | Where to Look |
|--------|--------------|
| Routes | `pages/` directory — filename = route |
| Data fetching | `getServerSideProps`, `getStaticProps`, `getStaticPaths` |
| API routes | `pages/api/` |

## Vue 3

| Aspect | Where to Look |
|--------|--------------|
| Routes | `vue-router` — `routes` array in `router/index.ts` |
| Components | `.vue` SFCs (`<template>`, `<script setup>`, `<style>`) |
| State | Pinia (`stores/`), Vuex (`store/`) |
| API | `axios`, `fetch`, Vue Query |
| Forms | VeeValidate, FormKit, custom `ref()` / `reactive()` |
| i18n | `vue-i18n` |

## Nuxt 3

| Aspect | Where to Look |
|--------|--------------|
| Routes | `pages/` directory (file-system routing) |
| Layouts | `layouts/` |
| API routes | `server/api/` |
| Data fetching | `useFetch`, `useAsyncData`, `$fetch` |
| State | `useState`, Pinia |
| Middleware | `middleware/` |

## Angular

| Aspect | Where to Look |
|--------|--------------|
| Routes | `app-routing.module.ts` or `Routes` array |
| Components | `@Component` decorator, `*.component.ts` |
| State | NgRx (`store/`), services with `BehaviorSubject` |
| API | `HttpClient` in services |
| Forms | Reactive Forms (`FormGroup`), template-driven forms |
| i18n | `@angular/localize`, `ngx-translate` |
| Guards | `CanActivate`, `CanDeactivate` |

## Svelte / SvelteKit

| Aspect | Where to Look |
|--------|--------------|
| Routes | `src/routes/` (file-system routing with `+page.svelte`) |
| Layouts | `+layout.svelte` |
| Data loading | `+page.ts` / `+page.server.ts` (`load` function) |
| API routes | `+server.ts` |
| State | Svelte stores (`writable`, `readable`, `derived`) |

## Common Patterns Across Frameworks

### Mock Detection

```
# Likely mock
setTimeout(() => resolve(data), 500)
Promise.resolve(mockData)
import { data } from './fixtures'
faker.name.firstName()

# Likely real
axios.get('/api/users')
fetch('/api/data')
httpClient.post(url, body)
useSWR('/api/resource')
```

### Permission Patterns

```
# React
{hasPermission('admin') && <Button>Delete</Button>}
<ProtectedRoute roles={['admin', 'manager']}>

# Vue
v-if="user.role === 'admin'"
v-permission="'user:delete'"

# Angular
*ngIf="authService.hasRole('admin')"
canActivate: [AuthGuard]
```

### Form Validation

```
# React Hook Form
{ required: 'Name is required', maxLength: { value: 50, message: 'Too long' } }

# VeeValidate (Vue)
rules="required|email|max:100"

# Angular Reactive Forms
Validators.required, Validators.minLength(3), Validators.pattern(...)
```
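The mock/real signals above can be folded into a small classifier. A minimal sketch (the patterns here are illustrative, not the exact set the analyzer ships with):

```python
import re

# Illustrative subsets of the mock/real signal lists above
MOCK = [r"Promise\.resolve\s*\(", r"\bfaker\.", r"\bmock[A-Z_]", r"fixtures?/"]
REAL = [r"\baxios\.", r"\bfetch\s*\(", r"\buseSWR\b", r"\buseQuery\b"]

def classify(source: str) -> str:
    """Classify a code snippet as 'mock', 'real', or 'unknown'."""
    if any(re.search(p, source) for p in MOCK):
        return "mock"
    if any(re.search(p, source) for p in REAL):
        return "real"
    return "unknown"

print(classify("return Promise.resolve(mockData)"))        # mock
print(classify("const r = await axios.get('/api/users')")) # real
```

Mock signals are checked first, mirroring the analyzer's rule that a file with mock markers is not counted as integrated.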
54
product-team/code-to-prd/references/prd-quality-checklist.md
Normal file
@@ -0,0 +1,54 @@
# PRD Quality Checklist

Use this checklist to validate generated PRDs before delivery.

## Completeness

- [ ] Every route/page has a corresponding document
- [ ] All form fields listed with type, required, validation, default
- [ ] All table columns listed with format, sortable, filterable
- [ ] All action buttons documented with visibility conditions
- [ ] All API endpoints listed with method, path, trigger, params
- [ ] Mock vs integrated APIs clearly distinguished
- [ ] All enums exhaustively listed with every value
- [ ] Page load behavior documented for every page
- [ ] Page relationships mapped (inbound, outbound, data coupling)

## Accuracy

- [ ] Route paths match actual code
- [ ] Field names match UI labels (not variable names)
- [ ] Validation rules match actual code logic
- [ ] Permission conditions match auth guard implementations
- [ ] API paths match actual service layer calls
- [ ] Enum values match source constants (no fabrication)
- [ ] Uncertain items marked `[TBC]` with explanation

## Readability

- [ ] Business language used (not implementation details)
- [ ] Each page doc is self-contained
- [ ] No component names used as page names
- [ ] Interactions described as user action → system response
- [ ] Modals/drawers documented within their parent page
- [ ] README system overview written for non-technical reader

## Structure

- [ ] `prd/README.md` exists with system overview + page inventory
- [ ] `prd/pages/` contains numbered page files
- [ ] `prd/appendix/enum-dictionary.md` exists
- [ ] `prd/appendix/api-inventory.md` exists
- [ ] `prd/appendix/page-relationships.md` exists
- [ ] Cross-references use relative links

## Common Issues to Watch

| Issue | How to Detect | Fix |
|-------|--------------|-----|
| Missing modal content | Search for `Modal`, `Dialog`, `Drawer` components | Add as subsection in parent page |
| Undocumented field linking | Search for conditional renders based on field values | Add to interaction logic |
| Hidden permissions | Search for `v-if`, `v-show`, role checks, auth guards | Add visibility conditions |
| Stale mock data | Compare mock shapes with API types/interfaces | Flag as `[Mock - verify with backend]` |
| Missing error states | Search for error boundaries, catch blocks, toast errors | Add failure paths to interactions |
| Unlinked pages | Cross-reference route params with navigation calls | Complete page relationships |
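The "How to Detect" column mostly reduces to pattern searches. A hedged stdlib-only sketch (the detector patterns and the sample source line are illustrative, not exhaustive):

```python
import re

# Illustrative detectors for two of the issues above
DETECTORS = {
    "modal": re.compile(r"\b(Modal|Dialog|Drawer)\b"),
    "permission": re.compile(r"hasPermission\(|v-if=\"[^\"]*role|CanActivate"),
}

def detect_issues(source: str) -> list:
    """Return the issue categories a source snippet triggers."""
    return [name for name, rx in DETECTORS.items() if rx.search(source)]

print(detect_issues("{hasPermission('admin') && <Modal title=\"Delete\" />}"))
# ['modal', 'permission']
```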
501
product-team/code-to-prd/scripts/frontend_analyzer.py
Executable file
@@ -0,0 +1,501 @@
#!/usr/bin/env python3
"""Analyze a frontend codebase and extract page inventory, routes, APIs, and project structure.

Stdlib only — no third-party dependencies. Outputs JSON for downstream PRD generation.

Usage:
    python3 frontend_analyzer.py /path/to/project
    python3 frontend_analyzer.py /path/to/project --output prd-analysis.json
    python3 frontend_analyzer.py /path/to/project --format markdown
"""

from __future__ import annotations

import argparse
import json
import os
import re
from collections import defaultdict
from pathlib import Path
from typing import Any, Dict, List, Set, Tuple

IGNORED_DIRS = {
    ".git", "node_modules", ".next", "dist", "build", "coverage",
    "venv", ".venv", "__pycache__", ".nuxt", ".output", ".cache",
    ".turbo", ".vercel", "out", "storybook-static",
}

FRAMEWORK_SIGNALS = {
    "react": ["react", "react-dom"],
    "next": ["next"],
    "vue": ["vue"],
    "nuxt": ["nuxt"],
    "angular": ["@angular/core"],
    "svelte": ["svelte"],
    "sveltekit": ["@sveltejs/kit"],
    "solid": ["solid-js"],
    "astro": ["astro"],
    "remix": ["@remix-run/react"],
}

ROUTE_FILE_PATTERNS = [
    "**/router.{ts,tsx,js,jsx}",
    "**/routes.{ts,tsx,js,jsx}",
    "**/routing.{ts,tsx,js,jsx}",
    "**/app-routing*.{ts,tsx,js,jsx}",
]

ROUTE_DIR_PATTERNS = [
    "pages", "views", "routes", "app",
    "src/pages", "src/views", "src/routes", "src/app",
]

API_DIR_PATTERNS = [
    "api", "services", "requests", "endpoints", "client",
    "src/api", "src/services", "src/requests",
]

STATE_DIR_PATTERNS = [
    "store", "stores", "models", "context", "state",
    "src/store", "src/stores", "src/models", "src/context",
]

I18N_DIR_PATTERNS = [
    "locales", "i18n", "lang", "translations", "messages",
    "src/locales", "src/i18n", "src/lang",
]

MOCK_SIGNALS = [
    r"setTimeout\s*\(.*\breturn\b",
    r"Promise\.resolve\s*\(",
    r"\.mock\.",
    r"__mocks__",
    r"mockData",
    r"mock[A-Z]",
    r"faker\.",
    r"fixtures?/",
]

REAL_API_SIGNALS = [
    r"\baxios\b",
    r"\bfetch\s*\(",
    r"httpGet|httpPost|httpPut|httpDelete|httpPatch",
    r"\.get\s*\(\s*['\"`/]",
    r"\.post\s*\(\s*['\"`/]",
    r"\.put\s*\(\s*['\"`/]",
    r"\.delete\s*\(\s*['\"`/]",
    r"\.patch\s*\(\s*['\"`/]",
    r"useSWR|useQuery|useMutation",
    r"\$http\.",
    r"this\.http\.",
]

ROUTE_PATTERNS = [
    # React Router: <Route path="...">
    r'<Route\s+[^>]*path\s*=\s*["\']([^"\']+)["\']',
    # Object-style route configs (React Router, Vue Router, Angular): path: "..."
    r'path\s*:\s*["\']([^"\']+)["\']',
]

API_PATH_PATTERNS = [
    r'["\'](?:GET|POST|PUT|DELETE|PATCH)["\'].*?["\'](/[a-zA-Z0-9/_\-:{}]+)["\']',
    r'(?:get|post|put|delete|patch)\s*\(\s*["\'](/[a-zA-Z0-9/_\-:{}]+)["\']',
    r'(?:url|path|endpoint|baseURL)\s*[:=]\s*["\'](/[a-zA-Z0-9/_\-:{}]+)["\']',
    r'fetch\s*\(\s*[`"\'](?:https?://[^/]+)?(/[a-zA-Z0-9/_\-:{}]+)',
]

COMPONENT_EXTENSIONS = {".tsx", ".jsx", ".vue", ".svelte", ".astro"}
CODE_EXTENSIONS = {".ts", ".tsx", ".js", ".jsx", ".vue", ".svelte", ".astro"}


def detect_framework(project_root: Path) -> Dict[str, Any]:
    """Detect frontend framework from package.json."""
    pkg_path = project_root / "package.json"
    if not pkg_path.exists():
        return {"framework": "unknown", "dependencies": {}}

    try:
        with open(pkg_path) as f:
            pkg = json.load(f)
    except (json.JSONDecodeError, IOError):
        return {"framework": "unknown", "dependencies": {}}

    all_deps = {}
    for key in ("dependencies", "devDependencies", "peerDependencies"):
        all_deps.update(pkg.get(key, {}))

    detected = []
    for framework, signals in FRAMEWORK_SIGNALS.items():
        if any(s in all_deps for s in signals):
            detected.append(framework)

    # Prefer specific over generic (next > react, nuxt > vue)
    priority = ["sveltekit", "next", "nuxt", "remix", "astro", "angular", "svelte", "vue", "react", "solid"]
    framework = "unknown"
    for fw in priority:
        if fw in detected:
            framework = fw
            break

    return {
        "framework": framework,
        "name": pkg.get("name", ""),
        "version": pkg.get("version", ""),
        "detected_frameworks": detected,
        "dependency_count": len(all_deps),
        "key_deps": {k: v for k, v in all_deps.items()
                     if any(s in k for s in ["router", "redux", "vuex", "pinia", "zustand",
                                             "mobx", "recoil", "jotai", "tanstack", "swr",
                                             "axios", "tailwind", "material", "ant",
                                             "chakra", "shadcn", "i18n", "intl"])},
    }


def find_dirs(root: Path, patterns: List[str]) -> List[Path]:
    """Find directories matching common patterns."""
    found = []
    for pattern in patterns:
        candidate = root / pattern
        if candidate.is_dir():
            found.append(candidate)
    return found


def walk_files(root: Path, extensions: Set[str] = CODE_EXTENSIONS) -> List[Path]:
    """Walk project tree, skip ignored dirs, return files matching extensions."""
    results = []
    for dirpath, dirnames, filenames in os.walk(root):
        dirnames[:] = [d for d in dirnames if d not in IGNORED_DIRS]
        for fname in filenames:
            if Path(fname).suffix in extensions:
                results.append(Path(dirpath) / fname)
    return results


def extract_routes_from_file(filepath: Path) -> List[Dict[str, Any]]:
    """Extract route definitions from a file."""
    routes = []
    try:
        content = filepath.read_text(errors="replace")
    except IOError:
        return routes

    for pattern in ROUTE_PATTERNS:
        for match in re.finditer(pattern, content):
            path = match.group(1)
            if path and not path.startswith("http") and len(path) < 200:
                routes.append({
                    "path": path,
                    "source": str(filepath),
                    "line": content[:match.start()].count("\n") + 1,
                })
    return routes


def extract_routes_from_filesystem(pages_dir: Path, root: Path) -> List[Dict[str, Any]]:
    """Infer routes from file-system routing (Next.js, Nuxt, SvelteKit)."""
    routes = []
    for filepath in sorted(pages_dir.rglob("*")):
        if filepath.is_file() and filepath.suffix in CODE_EXTENSIONS:
            rel = filepath.relative_to(pages_dir)
            route = "/" + str(rel.with_suffix("")).replace("\\", "/")
            # Normalize index routes
            route = re.sub(r"/index$", "", route) or "/"
            # Convert [...rest] to *rest and [param] to :param
            route = re.sub(r"\[\.\.\.(\w+)\]", r"*\1", route)
            route = re.sub(r"\[(\w+)\]", r":\1", route)
            routes.append({
                "path": route,
                "source": str(filepath),
                "filesystem": True,
            })
    return routes


def extract_apis_from_file(filepath: Path) -> List[Dict[str, Any]]:
    """Extract API calls from a file."""
    apis = []
    try:
        content = filepath.read_text(errors="replace")
    except IOError:
        return apis

    is_mock = any(re.search(p, content) for p in MOCK_SIGNALS)
    is_real = any(re.search(p, content) for p in REAL_API_SIGNALS)

    for pattern in API_PATH_PATTERNS:
        for match in re.finditer(pattern, content):
            path = match.group(1) if match.lastindex else match.group(0)
            if path and len(path) < 200:
                # Try to detect HTTP method from the surrounding context
                context = content[max(0, match.start() - 100):match.end()]
                method = "UNKNOWN"
                for m in ["GET", "POST", "PUT", "DELETE", "PATCH"]:
                    if m.lower() in context.lower():
                        method = m
                        break

                apis.append({
                    "path": path,
                    "method": method,
                    "source": str(filepath),
                    "line": content[:match.start()].count("\n") + 1,
                    "integrated": is_real and not is_mock,
                    "mock_detected": is_mock,
                })
    return apis


def extract_enums(filepath: Path) -> List[Dict[str, Any]]:
    """Extract enum/constant definitions."""
    enums = []
    try:
        content = filepath.read_text(errors="replace")
    except IOError:
        return enums

    # TypeScript enums
    for match in re.finditer(r"enum\s+(\w+)\s*\{([^}]+)\}", content):
        name = match.group(1)
        body = match.group(2)
        values = re.findall(r"(\w+)\s*=\s*['\"]?([^,'\"\n]+)", body)
        enums.append({
            "name": name,
            "type": "enum",
            "values": {k.strip(): v.strip().rstrip(",") for k, v in values},
            "source": str(filepath),
        })

    # Object constant maps (const STATUS_MAP = { ... })
    for match in re.finditer(
        r"(?:const|export\s+const)\s+(\w*(?:MAP|STATUS|TYPE|ENUM|OPTION|ROLE|STATE)\w*)\s*[:=]\s*\{([^}]+)\}",
        content, re.IGNORECASE
    ):
        name = match.group(1)
        body = match.group(2)
        values = re.findall(r"['\"]?(\w+)['\"]?\s*:\s*['\"]([^'\"]+)['\"]", body)
        if values:
            enums.append({
                "name": name,
                "type": "constant_map",
                "values": dict(values),
                "source": str(filepath),
            })

    return enums


def count_components(files: List[Path]) -> Dict[str, int]:
    """Count components by type."""
    counts: Dict[str, int] = defaultdict(int)
    for f in files:
        if f.suffix in COMPONENT_EXTENSIONS:
            counts["components"] += 1
        elif f.suffix in {".ts", ".js"}:
            counts["modules"] += 1
    return dict(counts)


def analyze_project(project_root: Path) -> Dict[str, Any]:
    """Run full analysis on a frontend project."""
    root = Path(project_root).resolve()
    if not root.is_dir():
        return {"error": f"Not a directory: {root}"}

    # 1. Framework detection
    framework_info = detect_framework(root)

    # 2. File inventory
    all_files = walk_files(root)
    component_counts = count_components(all_files)

    # 3. Directory structure
    route_dirs = find_dirs(root, ROUTE_DIR_PATTERNS)
    api_dirs = find_dirs(root, API_DIR_PATTERNS)
    state_dirs = find_dirs(root, STATE_DIR_PATTERNS)
    i18n_dirs = find_dirs(root, I18N_DIR_PATTERNS)

    # 4. Routes
    routes = []
    # Config-based routes
    for f in all_files:
        if any(p in f.name.lower() for p in ["router", "routes", "routing"]):
            routes.extend(extract_routes_from_file(f))

    # File-system routes (Next.js, Nuxt, SvelteKit)
    if framework_info["framework"] in ("next", "nuxt", "sveltekit", "remix", "astro"):
        for d in route_dirs:
            routes.extend(extract_routes_from_filesystem(d, root))

    # Deduplicate routes by path
    seen_paths: Set[str] = set()
    unique_routes = []
    for r in routes:
        if r["path"] not in seen_paths:
            seen_paths.add(r["path"])
            unique_routes.append(r)
    routes = sorted(unique_routes, key=lambda r: r["path"])

    # 5. API calls
    apis = []
    for f in all_files:
        apis.extend(extract_apis_from_file(f))

    # Deduplicate APIs by path+method
    seen_apis: Set[Tuple[str, str]] = set()
    unique_apis = []
    for a in apis:
        key = (a["path"], a["method"])
        if key not in seen_apis:
            seen_apis.add(key)
            unique_apis.append(a)
    apis = sorted(unique_apis, key=lambda a: a["path"])

    # 6. Enums
    enums = []
    for f in all_files:
        enums.extend(extract_enums(f))

    # 7. Summary
    mock_count = sum(1 for a in apis if a.get("mock_detected"))
    real_count = sum(1 for a in apis if a.get("integrated"))

    analysis = {
        "project": {
            "root": str(root),
            "name": framework_info.get("name", root.name),
            "framework": framework_info["framework"],
            "detected_frameworks": framework_info.get("detected_frameworks", []),
            "key_dependencies": framework_info.get("key_deps", {}),
        },
        "structure": {
            "total_files": len(all_files),
            "components": component_counts,
            "route_dirs": [str(d) for d in route_dirs],
            "api_dirs": [str(d) for d in api_dirs],
            "state_dirs": [str(d) for d in state_dirs],
            "i18n_dirs": [str(d) for d in i18n_dirs],
        },
        "routes": {
            "count": len(routes),
            "pages": routes,
        },
        "apis": {
            "total": len(apis),
            "integrated": real_count,
            "mock": mock_count,
            "endpoints": apis,
        },
        "enums": {
            "count": len(enums),
            "definitions": enums,
        },
        "summary": {
            "pages": len(routes),
            "api_endpoints": len(apis),
            "api_integrated": real_count,
            "api_mock": mock_count,
            "enums": len(enums),
            "has_i18n": len(i18n_dirs) > 0,
            "has_state_management": len(state_dirs) > 0,
        },
    }

    return analysis


def format_markdown(analysis: Dict[str, Any]) -> str:
    """Format analysis as markdown summary."""
    lines = []
    proj = analysis["project"]
    summary = analysis["summary"]

    lines.append(f"# Frontend Analysis: {proj['name'] or 'Project'}")
    lines.append("")
    lines.append(f"**Framework:** {proj['framework']}")
    lines.append(f"**Total files:** {analysis['structure']['total_files']}")
    lines.append(f"**Pages:** {summary['pages']}")
    lines.append(f"**API endpoints:** {summary['api_endpoints']} "
                 f"({summary['api_integrated']} integrated, {summary['api_mock']} mock)")
    lines.append(f"**Enums:** {summary['enums']}")
    lines.append(f"**i18n:** {'Yes' if summary['has_i18n'] else 'No'}")
    lines.append(f"**State management:** {'Yes' if summary['has_state_management'] else 'No'}")
    lines.append("")

    if analysis["routes"]["pages"]:
        lines.append("## Pages / Routes")
        lines.append("")
        lines.append("| # | Route | Source |")
        lines.append("|---|-------|--------|")
        for i, r in enumerate(analysis["routes"]["pages"], 1):
            src = r.get("source", "").split("/")[-1]
            fs = " (fs)" if r.get("filesystem") else ""
            lines.append(f"| {i} | `{r['path']}` | {src}{fs} |")
        lines.append("")

    if analysis["apis"]["endpoints"]:
        lines.append("## API Endpoints")
        lines.append("")
        lines.append("| Method | Path | Integrated | Source |")
        lines.append("|--------|------|-----------|--------|")
        for a in analysis["apis"]["endpoints"]:
            src = a.get("source", "").split("/")[-1]
            status = "✅" if a.get("integrated") else "⚠️ Mock"
            lines.append(f"| {a['method']} | `{a['path']}` | {status} | {src} |")
        lines.append("")

    if analysis["enums"]["definitions"]:
        lines.append("## Enums & Constants")
        lines.append("")
        for e in analysis["enums"]["definitions"]:
            lines.append(f"### {e['name']} ({e['type']})")
            if e["values"]:
                lines.append("| Key | Value |")
                lines.append("|-----|-------|")
                for k, v in e["values"].items():
                    lines.append(f"| {k} | {v} |")
            lines.append("")

    if proj.get("key_dependencies"):
        lines.append("## Key Dependencies")
        lines.append("")
        for dep, ver in sorted(proj["key_dependencies"].items()):
            lines.append(f"- `{dep}`: {ver}")
        lines.append("")

    return "\n".join(lines)


def main():
    parser = argparse.ArgumentParser(
        description="Analyze frontend codebase for PRD generation"
    )
    parser.add_argument("project", help="Path to frontend project root")
    parser.add_argument("-o", "--output", help="Output file (default: stdout)")
    parser.add_argument(
        "-f", "--format",
        choices=["json", "markdown"],
        default="json",
        help="Output format (default: json)",
    )
    args = parser.parse_args()

    analysis = analyze_project(Path(args.project))
    # Bail out early on a bad project path instead of crashing in the formatter
    if "error" in analysis:
        parser.error(analysis["error"])

    if args.format == "markdown":
        output = format_markdown(analysis)
    else:
        output = json.dumps(analysis, indent=2, ensure_ascii=False)

    if args.output:
        Path(args.output).write_text(output)
        print(f"Written to {args.output}")
    else:
        print(output)


if __name__ == "__main__":
    main()
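As a quick illustration of the enum extraction above, the same TypeScript-enum regexes can be run standalone (the `OrderStatus` sample is hypothetical):

```python
import re

# Hypothetical TypeScript source for demonstration
TS = """
enum OrderStatus {
  Pending = 'pending',
  Shipped = 'shipped',
}
"""

def parse_enums(content: str) -> dict:
    """Parse TypeScript-style enums into {name: {key: value}}."""
    out = {}
    for m in re.finditer(r"enum\s+(\w+)\s*\{([^}]+)\}", content):
        pairs = re.findall(r"(\w+)\s*=\s*['\"]?([^,'\"\n]+)", m.group(2))
        out[m.group(1)] = {k.strip(): v.strip().rstrip(",") for k, v in pairs}
    return out

print(parse_enums(TS))
```

Note the `[^}]+` body match means nested braces inside an enum would cut the match short; that is an accepted limitation of the regex approach.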
341
product-team/code-to-prd/scripts/prd_scaffolder.py
Executable file
@@ -0,0 +1,341 @@
|
|||||||
|
#!/usr/bin/env python3
|
||||||
|
"""Scaffold PRD directory structure from frontend_analyzer.py output.
|
||||||
|
|
||||||
|
Reads analysis JSON and creates the prd/ directory with README.md,
|
||||||
|
per-page stubs, and appendix files pre-populated with extracted data.
|
||||||
|
|
||||||
|
Stdlib only — no third-party dependencies.
|
||||||
|
|
||||||
|
Usage:
|
||||||
|
python3 frontend_analyzer.py /path/to/project -o analysis.json
|
||||||
|
python3 prd_scaffolder.py analysis.json
|
||||||
|
python3 prd_scaffolder.py analysis.json --output-dir ./prd --project-name "My App"
|
||||||
|
"""
|
||||||
|
|
||||||
|
from __future__ import annotations
|
||||||
|
|
||||||
|
import argparse
|
||||||
|
import json
|
||||||
|
import re
|
||||||
|
from datetime import datetime
|
||||||
|
from pathlib import Path
|
||||||
|
from typing import Any, Dict, List
|
||||||
|
|
||||||
|
|
||||||
|
def slugify(text: str) -> str:
|
||||||
|
"""Convert text to a filename-safe slug."""
|
||||||
|
text = text.strip().lower()
|
||||||
|
text = re.sub(r"[/:{}*?\"<>|]", "-", text)
|
||||||
|
text = re.sub(r"[^a-z0-9\-]", "-", text)
|
||||||
|
text = re.sub(r"-+", "-", text)
|
||||||
|
return text.strip("-")
|
||||||
|
|
||||||
|
|
||||||
|
def route_to_page_name(route: str) -> str:
    """Convert a route path to a human-readable page name."""
    if route == "/" or route == "":
        return "Home"
    parts = route.strip("/").split("/")
    # Remove dynamic segments for naming
    clean = [p for p in parts if not p.startswith(":") and not p.startswith("*")]
    if not clean:
        clean = [p.lstrip(":*") for p in parts]
    return " ".join(w.capitalize() for w in "-".join(clean).replace("_", "-").split("-"))

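The two naming helpers above are pure functions, so their rules are easy to verify in isolation. A minimal self-contained check (both function bodies are copied verbatim from this file; the sample routes are illustrative):

```python
import re


def slugify(text: str) -> str:
    """Convert text to a filename-safe slug."""
    text = text.strip().lower()
    text = re.sub(r"[/:{}*?\"<>|]", "-", text)
    text = re.sub(r"[^a-z0-9\-]", "-", text)
    text = re.sub(r"-+", "-", text)
    return text.strip("-")


def route_to_page_name(route: str) -> str:
    """Convert a route path to a human-readable page name."""
    if route == "/" or route == "":
        return "Home"
    parts = route.strip("/").split("/")
    # Dynamic segments (:id, *) are dropped when a static segment exists;
    # otherwise they are kept with the prefix stripped.
    clean = [p for p in parts if not p.startswith(":") and not p.startswith("*")]
    if not clean:
        clean = [p.lstrip(":*") for p in parts]
    return " ".join(w.capitalize() for w in "-".join(clean).replace("_", "-").split("-"))


print(route_to_page_name("/order/:id"))      # → Order
print(route_to_page_name("/user_settings"))  # → User Settings
print(route_to_page_name("/:id"))            # → Id
print(slugify("User Settings"))              # → user-settings
```

Note the fallback branch: a route made only of dynamic segments (like `/:id`) still yields a usable page name instead of an empty string.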
def generate_readme(project_name: str, routes: List[Dict], summary: Dict, date: str) -> str:
    """Generate the PRD README.md."""
    lines = [
        f"# {project_name} — Product Requirements Document",
        "",
        f"> Generated: {date}",
        "",
        "## System Overview",
        "",
        f"<!-- TODO: Describe what {project_name} does, its business context, and primary users -->",
        "",
        "## Summary",
        "",
        "| Metric | Count |",
        "|--------|-------|",
        f"| Pages | {summary.get('pages', 0)} |",
        f"| API Endpoints | {summary.get('api_endpoints', 0)} |",
        f"| Integrated APIs | {summary.get('api_integrated', 0)} |",
        f"| Mock APIs | {summary.get('api_mock', 0)} |",
        f"| Enums/Constants | {summary.get('enums', 0)} |",
        f"| i18n | {'Yes' if summary.get('has_i18n') else 'No'} |",
        f"| State Management | {'Yes' if summary.get('has_state_management') else 'No'} |",
        "",
        "## Module Overview",
        "",
        "| Module | Pages | Core Functionality |",
        "|--------|-------|--------------------|",
        "| <!-- TODO: Group pages into modules --> | | |",
        "",
        "## Page Inventory",
        "",
        "| # | Page Name | Route | Module | Doc Link |",
        "|---|-----------|-------|--------|----------|",
    ]

    for i, route in enumerate(routes, 1):
        path = route.get("path", "/")
        name = route_to_page_name(path)
        slug = slugify(name) or f"page-{i}"
        filename = f"{i:02d}-{slug}.md"
        lines.append(f"| {i} | {name} | `{path}` | <!-- TODO --> | [→](./pages/{filename}) |")

    lines.extend([
        "",
        "## Global Notes",
        "",
        "### Permission Model",
        "<!-- TODO: Summarize auth/role system if present -->",
        "",
        "### Common Interaction Patterns",
        "<!-- TODO: Global rules — delete confirmations, default sort, etc. -->",
        "",
    ])

    return "\n".join(lines)

def generate_page_stub(route: Dict, index: int, date: str) -> str:
    """Generate a per-page PRD stub."""
    path = route.get("path", "/")
    name = route_to_page_name(path)
    source = route.get("source", "unknown")

    return f"""# {name}

> **Route:** `{path}`
> **Module:** <!-- TODO -->
> **Source:** `{source}`
> **Generated:** {date}

## Overview
<!-- TODO: 2-3 sentences — core function and use case -->

## Layout
<!-- TODO: Region breakdown — search area, table, detail panel, action bar, etc. -->

## Fields

### Search / Filters
| Field | Type | Required | Options / Enum | Default | Notes |
|-------|------|----------|---------------|---------|-------|
| <!-- TODO --> | | | | | |

### Data Table
| Column | Format | Sortable | Filterable | Notes |
|--------|--------|----------|-----------|-------|
| <!-- TODO --> | | | | |

### Actions
| Button | Visibility Condition | Behavior |
|--------|---------------------|----------|
| <!-- TODO --> | | |

## Interactions

### Page Load
<!-- TODO: What happens on mount — default queries, preloaded data -->

### Search
- **Trigger:** <!-- TODO -->
- **Behavior:** <!-- TODO -->
- **Special rules:** <!-- TODO -->

### Create / Edit
- **Trigger:** <!-- TODO -->
- **Modal/drawer content:** <!-- TODO -->
- **Validation:** <!-- TODO -->
- **On success:** <!-- TODO -->

### Delete
- **Trigger:** <!-- TODO -->
- **Confirmation:** <!-- TODO -->
- **On success:** <!-- TODO -->

## API Dependencies

| API | Method | Path | Trigger | Integrated | Notes |
|-----|--------|------|---------|-----------|-------|
| <!-- TODO --> | | | | | |

## Page Relationships
- **From:** <!-- TODO: Source pages + params -->
- **To:** <!-- TODO: Target pages + params -->
- **Data coupling:** <!-- TODO: Cross-page refresh triggers -->

## Business Rules
<!-- TODO: Anything that doesn't fit above -->
"""

def generate_enum_dictionary(enums: List[Dict]) -> str:
    """Generate the enum dictionary appendix."""
    lines = [
        "# Enum & Constant Dictionary",
        "",
        "All enums, status codes, and type mappings extracted from the codebase.",
        "",
    ]

    if not enums:
        lines.append("*No enums detected. Manual review recommended.*")
        return "\n".join(lines)

    for e in enums:
        lines.append(f"## {e['name']}")
        lines.append(f"**Type:** {e.get('type', 'unknown')} | **Source:** `{e.get('source', 'unknown').split('/')[-1]}`")
        lines.append("")
        if e.get("values"):
            lines.append("| Key | Value |")
            lines.append("|-----|-------|")
            for k, v in e["values"].items():
                lines.append(f"| `{k}` | {v} |")
        lines.append("")

    return "\n".join(lines)

def generate_api_inventory(apis: List[Dict]) -> str:
    """Generate the API inventory appendix."""
    lines = [
        "# API Inventory",
        "",
        "All API endpoints detected in the codebase.",
        "",
    ]

    if not apis:
        lines.append("*No API calls detected. Manual review recommended.*")
        return "\n".join(lines)

    integrated = [a for a in apis if a.get("integrated")]
    mocked = [a for a in apis if a.get("mock_detected") and not a.get("integrated")]
    unknown = [a for a in apis if not a.get("integrated") and not a.get("mock_detected")]

    for label, group in [("Integrated APIs", integrated), ("Mock / Stub APIs", mocked), ("Unknown Status", unknown)]:
        if group:
            lines.append(f"## {label}")
            lines.append("")
            lines.append("| Method | Path | Source | Notes |")
            lines.append("|--------|------|--------|-------|")
            for a in group:
                src = a.get("source", "").split("/")[-1]
                lines.append(f"| {a.get('method', '?')} | `{a.get('path', '?')}` | {src} | |")
            lines.append("")

    return "\n".join(lines)

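The API inventory generator partitions endpoints on two flags, with `integrated` taking precedence over `mock_detected` and everything else falling into "Unknown Status". The partition logic below is copied from `generate_api_inventory`; the sample endpoint dicts are illustrative:

```python
apis = [
    {"method": "GET", "path": "/api/orders", "integrated": True},
    {"method": "POST", "path": "/api/orders", "mock_detected": True},
    {"method": "GET", "path": "/api/profile"},  # neither flag set
]

# Same three-way partition as generate_api_inventory: "integrated" wins
# over "mock_detected", and unflagged endpoints land in "unknown".
integrated = [a for a in apis if a.get("integrated")]
mocked = [a for a in apis if a.get("mock_detected") and not a.get("integrated")]
unknown = [a for a in apis if not a.get("integrated") and not a.get("mock_detected")]

print([a["path"] for a in integrated])  # → ['/api/orders']
print(len(mocked), len(unknown))        # → 1 1
```

The three predicates are mutually exclusive and exhaustive, so every endpoint appears in exactly one section of the inventory.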
def generate_page_relationships(routes: List[Dict]) -> str:
    """Generate page relationships appendix stub."""
    lines = [
        "# Page Relationships",
        "",
        "Navigation flow and data coupling between pages.",
        "",
        "## Navigation Map",
        "",
        "<!-- TODO: Fill in after page-by-page analysis -->",
        "",
        "```",
        "Home",
    ]

    for r in routes[:20]:  # Cap at 20 for readability
        name = route_to_page_name(r.get("path", "/"))
        lines.append(f" ├── {name}")

    if len(routes) > 20:
        lines.append(f" └── ... ({len(routes) - 20} more)")

    lines.extend([
        "```",
        "",
        "## Cross-Page Data Dependencies",
        "",
        "| Source Page | Target Page | Trigger | Data Passed |",
        "|-----------|------------|---------|------------|",
        "| <!-- TODO --> | | | |",
        "",
    ])

    return "\n".join(lines)

def scaffold(analysis: Dict[str, Any], output_dir: Path, project_name: str | None = None):
    """Create the full PRD directory structure."""
    date = datetime.now().strftime("%Y-%m-%d")
    name = project_name or analysis.get("project", {}).get("name", "Project")
    routes = analysis.get("routes", {}).get("pages", [])
    apis = analysis.get("apis", {}).get("endpoints", [])
    enums = analysis.get("enums", {}).get("definitions", [])
    summary = analysis.get("summary", {})

    # Create directories
    pages_dir = output_dir / "pages"
    appendix_dir = output_dir / "appendix"
    pages_dir.mkdir(parents=True, exist_ok=True)
    appendix_dir.mkdir(parents=True, exist_ok=True)

    # README.md
    readme = generate_readme(name, routes, summary, date)
    (output_dir / "README.md").write_text(readme)
    print(" Created: README.md")

    # Per-page stubs
    for i, route in enumerate(routes, 1):
        page_name = route_to_page_name(route.get("path", "/"))
        slug = slugify(page_name) or f"page-{i}"
        filename = f"{i:02d}-{slug}.md"
        content = generate_page_stub(route, i, date)
        (pages_dir / filename).write_text(content)
        print(f" Created: pages/{filename}")

    # Appendix
    (appendix_dir / "enum-dictionary.md").write_text(generate_enum_dictionary(enums))
    print(" Created: appendix/enum-dictionary.md")

    (appendix_dir / "api-inventory.md").write_text(generate_api_inventory(apis))
    print(" Created: appendix/api-inventory.md")

    (appendix_dir / "page-relationships.md").write_text(generate_page_relationships(routes))
    print(" Created: appendix/page-relationships.md")

    print(f"\n✅ PRD scaffold complete: {output_dir}")
    print(f" {len(routes)} page stubs, {len(apis)} API endpoints, {len(enums)} enums")
    print("\n Next: Review each page stub and fill in the TODO sections.")

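`scaffold()` reads only a handful of keys from the analysis document, each through `.get()` with a default, so a hand-written JSON works as input. A minimal sketch of the expected shape (keys inferred from the `.get()` calls above; the project name, paths, and values are hypothetical):

```python
import json

# Minimal analysis document accepted by scaffold(). Every top-level key
# is read with a default, so any of them may be omitted entirely.
analysis = {
    "project": {"name": "Demo Shop"},
    "routes": {"pages": [
        {"path": "/", "source": "src/App.tsx"},
        {"path": "/order/:id", "source": "src/pages/Order.tsx"},
    ]},
    "apis": {"endpoints": [
        {"method": "GET", "path": "/api/orders",
         "source": "src/api/orders.ts", "integrated": True},
    ]},
    "enums": {"definitions": [
        {"name": "OrderStatus", "type": "object",
         "source": "src/constants/order.ts",
         "values": {"PENDING": 0, "SHIPPED": 1}},
    ]},
    "summary": {"pages": 2, "api_endpoints": 1, "api_integrated": 1,
                "api_mock": 0, "enums": 1, "has_i18n": False,
                "has_state_management": True},
}

# Write it out in the format frontend_analyzer.py would produce.
with open("analysis.json", "w") as f:
    json.dump(analysis, f, indent=2)
```

Running `python3 prd_scaffolder.py analysis.json` against this input should produce a `prd/` directory with a README, two page stubs, and the three appendix files.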
def main():
    parser = argparse.ArgumentParser(
        description="Scaffold PRD directory from frontend analysis"
    )
    parser.add_argument("analysis", help="Path to analysis JSON from frontend_analyzer.py")
    parser.add_argument("-o", "--output-dir", default="prd", help="Output directory (default: prd/)")
    parser.add_argument("-n", "--project-name", help="Override project name")
    args = parser.parse_args()

    analysis_path = Path(args.analysis)
    if not analysis_path.exists():
        print(f"Error: {analysis_path} not found")
        return

    with open(analysis_path) as f:
        analysis = json.load(f)

    if "error" in analysis:
        print(f"Error in analysis: {analysis['error']}")
        return

    output_dir = Path(args.output_dir)
    print(f"Scaffolding PRD in {output_dir}/...\n")
    scaffold(analysis, output_dir, args.project_name)


if __name__ == "__main__":
    main()