* chore: update gitignore for audit reports and playwright cache

* fix: add YAML frontmatter (name + description) to all SKILL.md files
  - Added frontmatter to 34 skills that were missing it entirely (0% Tessl score)
  - Fixed name field format to kebab-case across all 169 skills
  - Resolves #284

* chore: sync codex skills symlinks [automated]

* fix: optimize 14 low-scoring skills via Tessl review (#290)
  Tessl optimization: 14 skills improved from ≤69% to 85%+. Closes #285, #286.

* chore: sync codex skills symlinks [automated]

* fix: optimize 18 skills via Tessl review + compliance fix (closes #287) (#291)
  Phase 1: 18 skills optimized via Tessl (avg 77% → 95%). Closes #287.

* feat: add scripts and references to 4 prompt-only skills + Tessl optimization (#292)
  Phase 2: 3 new scripts + 2 reference files for prompt-only skills. Tessl 45-55% → 94-100%.

* feat: add 6 agents + 5 slash commands for full coverage (v2.7.0) (#293)
  Phase 3: 6 new agents (all 9 categories covered) + 5 slash commands.

* fix: Phase 5 verification fixes + docs update (#294)

* chore: sync codex skills symlinks [automated]

* fix: marketplace audit — all 11 plugins validated by Claude Code (#295)
  Marketplace audit: all 11 plugins validated + installed + tested in Claude Code

* fix: restore 7 removed plugins + revert playwright-pro name to pw
  Reverts two overly aggressive audit changes:
  - Restored content-creator, demand-gen, fullstack-engineer, aws-architect, product-manager, scrum-master, skill-security-auditor to marketplace
  - Reverted playwright-pro plugin.json name back to 'pw' (intentional short name)

* refactor: split 21 over-500-line skills into SKILL.md + references (#296)

* chore: sync codex skills symlinks [automated]

* docs: update all documentation with accurate counts and regenerated skill pages
  - Update skill count to 170, Python tools to 213, references to 314 across all docs
  - Regenerate all 170 skill doc pages from latest SKILL.md sources
  - Update CLAUDE.md with v2.1.1 highlights, accurate architecture tree, and roadmap
  - Update README.md badges and overview table
  - Update marketplace.json metadata description and version
  - Update mkdocs.yml, index.md, getting-started.md with correct numbers

* fix: add root-level SKILL.md and .codex/instructions.md to all domains (#301)
  Root cause: CLI tools (ai-agent-skills, agent-skills-cli) look for SKILL.md at the
  specified install path. 7 of 9 domain directories were missing this file, causing
  "Skill not found" errors for bundle installs like:
  npx ai-agent-skills install alirezarezvani/claude-skills/engineering-team
  Fix:
  - Add root-level SKILL.md with YAML frontmatter to 7 domains
  - Add .codex/instructions.md to 8 domains (for Codex CLI discovery)
  - Update INSTALLATION.md with accurate skill counts (53→170)
  - Add troubleshooting entry for "Skill not found" error
  All 9 domains now have: SKILL.md + .codex/instructions.md + plugin.json
  Closes #301
  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat: add Gemini CLI + OpenClaw support, fix Codex missing 25 skills
  Gemini CLI:
  - Add GEMINI.md with activation instructions
  - Add scripts/gemini-install.sh setup script
  - Add scripts/sync-gemini-skills.py (194 skills indexed)
  - Add .gemini/skills/ with symlinks for all skills, agents, commands
  - Remove phantom medium-content-pro entries from sync script
  - Add top-level folder filter to prevent gitignored dirs from leaking
  Codex CLI:
  - Fix sync-codex-skills.py missing "engineering" domain (25 POWERFUL skills)
  - Regenerate .codex/skills-index.json: 124 → 149 skills
  - Add 25 new symlinks in .codex/skills/
  OpenClaw:
  - Add OpenClaw installation section to INSTALLATION.md
  - Add ClawHub install + manual install + YAML frontmatter docs
  Documentation:
  - Update INSTALLATION.md with all 4 platforms + accurate counts
  - Update README.md: "three platforms" → "four platforms" + Gemini quick start
  - Update CLAUDE.md with Gemini CLI support in v2.1.1 highlights
  - Update SKILL-AUTHORING-STANDARD.md + SKILL_PIPELINE.md with Gemini steps
  - Add OpenClaw + Gemini to installation locations reference table
  Marketplace: all 18 plugins validated — sources exist, SKILL.md present
  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* feat(product,pm): world-class product & PM skills audit — 6 scripts, 5 agents, 7 commands, 23 references/assets
  Phase 1 — Agent & Command Foundation:
  - Rewrite cs-project-manager agent (55→515 lines, 4 workflows, 6 skill integrations)
  - Expand cs-product-manager agent (408→684 lines, orchestrates all 8 product skills)
  - Add 7 slash commands: /rice, /okr, /persona, /user-story, /sprint-health, /project-health, /retro
  Phase 2 — Script Gap Closure (2,779 lines):
  - jira-expert: jql_query_builder.py (22 patterns), workflow_validator.py
  - confluence-expert: space_structure_generator.py, content_audit_analyzer.py
  - atlassian-admin: permission_audit_tool.py
  - atlassian-templates: template_scaffolder.py (Confluence XHTML generation)
  Phase 3 — Reference & Asset Enrichment:
  - 9 product references (competitive-teardown, landing-page-generator, saas-scaffolder)
  - 6 PM references (confluence-expert, atlassian-admin, atlassian-templates)
  - 7 product assets (templates for PRD, RICE, sprint, stories, OKR, research, design system)
  - 1 PM asset (permission_scheme_template.json)
  Phase 4 — New Agents:
  - cs-agile-product-owner, cs-product-strategist, cs-ux-researcher
  Phase 5 — Integration & Polish:
  - Related Skills cross-references in 8 SKILL.md files
  - Updated product-team/CLAUDE.md (5→8 skills, 6→9 tools, 4 agents, 5 commands)
  - Updated project-management/CLAUDE.md (0→12 scripts, 3 commands)
  - Regenerated docs site (177 pages), updated homepage and getting-started
  Quality audit: 31 files reviewed, 29 PASS, 2 fixed (copy-frameworks.md, governance-framework.md)
  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix: audit and repair all plugins, agents, and commands
  - Fix 12 command files: correct CLI arg syntax, script paths, and usage docs
  - Fix 3 agents with broken script/reference paths (cs-content-creator, cs-demand-gen-specialist, cs-financial-analyst)
  - Add complete YAML frontmatter to 5 agents (cs-growth-strategist, cs-engineering-lead, cs-senior-engineer, cs-financial-analyst, cs-quality-regulatory)
  - Fix cs-ceo-advisor related agent path
  - Update marketplace.json metadata counts (224 tools, 341 refs, 14 agents, 12 commands)
  Verified: all 19 scripts pass --help, all 14 agent paths resolve, mkdocs builds clean.
  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix: repair 25 Python scripts failing --help across all domains
  - Fix Python 3.10+ syntax (float | None → Optional[float]) in 2 scripts
  - Add argparse CLI handling to 9 marketing scripts using raw sys.argv
  - Fix 10 scripts crashing at module level (wrap in __main__, add argparse)
  - Make yaml/prefect/mcp imports conditional with stdlib fallbacks (4 scripts)
  - Fix f-string backslash syntax in project_bootstrapper.py
  - Fix -h flag conflict in pr_analyzer.py
  - Fix tech-debt.md description (score → prioritize)
  All 237 scripts now pass python3 --help verification.
  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* fix(product-team): close 3 verified gaps in product skills
  - Fix competitive-teardown/SKILL.md: replace broken references DATA_COLLECTION.md → references/data-collection-guide.md and TEMPLATES.md → references/analysis-templates.md (workflow was broken at steps 2 and 4)
  - Upgrade landing_page_scaffolder.py: add TSX + Tailwind output format (--format tsx) matching SKILL.md promise of Next.js/React components. 4 design styles (dark-saas, clean-minimal, bold-startup, enterprise). TSX is now default; HTML preserved via --format html
  - Rewrite README.md: fix stale counts (was 5 skills/15+ tools, now accurately shows 8 skills/9 tools), remove 7 ghost scripts that never existed (sprint_planner.py, velocity_tracker.py, etc.)
  - Fix tech-debt.md description (score → prioritize)
  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

* release: v2.1.2 — landing page TSX output, brand voice integration, docs update
  - Landing page generator defaults to Next.js TSX + Tailwind CSS (4 design styles)
  - Brand voice analyzer integrated into landing page generation workflow
  - CHANGELOG, CLAUDE.md, README.md updated for v2.1.2
  - All 13 plugin.json + marketplace.json bumped to 2.1.2
  - Gemini/Codex skill indexes re-synced
  - Backward compatible: --format html preserved, no breaking changes
  Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

---------

Co-authored-by: alirezarezvani <5697919+alirezarezvani@users.noreply.github.com>
Co-authored-by: Leo <leo@openclaw.ai>
Co-authored-by: Claude Opus 4.6 <noreply@anthropic.com>
442 lines
14 KiB
Python
#!/usr/bin/env python3
"""Project Bootstrapper — Generate SaaS project scaffolding from config.

Creates project directory structure with boilerplate files, README,
docker-compose, environment configs, and CI/CD templates.

Usage:
    python project_bootstrapper.py config.json --output-dir ./my-project
    python project_bootstrapper.py config.json --format json --dry-run
"""

import argparse
import json
import os
from datetime import datetime
from typing import Any, Dict


# Per-stack templates: lazily rendered config files, directories to create,
# and stack-specific source files.
STACK_TEMPLATES = {
    "nextjs": {
        "package.json": lambda c: json.dumps({
            "name": c.get("name", "my-project"),
            "version": "0.1.0",
            "private": True,
            "scripts": {
                "dev": "next dev",
                "build": "next build",
                "start": "next start",
                "lint": "next lint",
                "test": "jest",
                "test:watch": "jest --watch"
            },
            "dependencies": {
                "next": "^14.0.0",
                "react": "^18.0.0",
                "react-dom": "^18.0.0"
            },
            "devDependencies": {
                "typescript": "^5.0.0",
                "@types/react": "^18.0.0",
                "@types/node": "^20.0.0",
                "eslint": "^8.0.0",
                "eslint-config-next": "^14.0.0"
            }
        }, indent=2),
        "tsconfig.json": lambda c: json.dumps({
            "compilerOptions": {
                "target": "es5",
                "lib": ["dom", "dom.iterable", "esnext"],
                "allowJs": True,
                "skipLibCheck": True,
                "strict": True,
                "forceConsistentCasingInFileNames": True,
                "noEmit": True,
                "esModuleInterop": True,
                "module": "esnext",
                "moduleResolution": "bundler",
                "resolveJsonModule": True,
                "isolatedModules": True,
                "jsx": "preserve",
                "incremental": True,
                "paths": {"@/*": ["./src/*"]}
            },
            "include": ["next-env.d.ts", "**/*.ts", "**/*.tsx"],
            "exclude": ["node_modules"]
        }, indent=2),
        "dirs": ["src/app", "src/components", "src/lib", "src/styles", "public", "tests"],
        "files": {
            "src/app/layout.tsx": "export default function RootLayout({ children }: { children: React.ReactNode }) {\n  return <html lang=\"en\"><body>{children}</body></html>;\n}\n",
            "src/app/page.tsx": "export default function Home() {\n  return <main><h1>Welcome</h1></main>;\n}\n",
        }
    },
    "express": {
        "package.json": lambda c: json.dumps({
            "name": c.get("name", "my-project"),
            "version": "0.1.0",
            "main": "src/index.ts",
            "scripts": {
                "dev": "tsx watch src/index.ts",
                "build": "tsc",
                "start": "node dist/index.js",
                "test": "jest",
                "lint": "eslint src/"
            },
            "dependencies": {
                "express": "^4.18.0",
                "cors": "^2.8.5",
                "helmet": "^7.0.0",
                "dotenv": "^16.0.0"
            },
            "devDependencies": {
                "typescript": "^5.0.0",
                "@types/express": "^4.17.0",
                "@types/cors": "^2.8.0",
                "@types/node": "^20.0.0",
                "tsx": "^4.0.0",
                "jest": "^29.0.0",
                "@types/jest": "^29.0.0",
                "eslint": "^8.0.0"
            }
        }, indent=2),
        "dirs": ["src/routes", "src/middleware", "src/models", "src/services", "src/utils", "tests"],
        "files": {
            "src/index.ts": "import express from 'express';\nimport cors from 'cors';\nimport helmet from 'helmet';\nimport { config } from 'dotenv';\n\nconfig();\n\nconst app = express();\nconst PORT = process.env.PORT || 3000;\n\napp.use(helmet());\napp.use(cors());\napp.use(express.json());\n\napp.get('/health', (req, res) => res.json({ status: 'ok' }));\n\napp.listen(PORT, () => console.log(`Server running on port ${PORT}`));\n",
        }
    },
    "fastapi": {
        "requirements.txt": lambda c: "fastapi>=0.100.0\nuvicorn[standard]>=0.23.0\npydantic>=2.0.0\npython-dotenv>=1.0.0\nsqlalchemy>=2.0.0\nalembic>=1.12.0\npytest>=7.0.0\nhttpx>=0.24.0\n",
        "dirs": ["app/api", "app/models", "app/services", "app/core", "tests", "alembic"],
        "files": {
            "app/__init__.py": "",
            "app/main.py": "from fastapi import FastAPI\nfrom app.core.config import settings\n\napp = FastAPI(title=settings.PROJECT_NAME)\n\n@app.get('/health')\ndef health(): return {'status': 'ok'}\n",
            "app/core/__init__.py": "",
            "app/core/config.py": "from pydantic_settings import BaseSettings\n\nclass Settings(BaseSettings):\n    PROJECT_NAME: str = 'API'\n    DATABASE_URL: str = 'sqlite:///./app.db'\n    class Config:\n        env_file = '.env'\n\nsettings = Settings()\n",
        }
    }
}


def generate_readme(config: Dict[str, Any]) -> str:
    """Generate README.md content."""
    name = config.get("name", "my-project")
    desc = config.get("description", "A SaaS application")
    stack = config.get("stack", "nextjs")

    return f"""# {name}

{desc}

## Tech Stack

- **Framework**: {stack}
- **Database**: {config.get('database', 'PostgreSQL')}
- **Auth**: {config.get('auth', 'JWT')}

## Getting Started

### Prerequisites

- Node.js 18+ / Python 3.11+
- Docker & Docker Compose

### Development

```bash
# Clone the repo
git clone <repo-url>
cd {name}

# Copy environment variables
cp .env.example .env

# Start with Docker
docker compose up -d

# Or run locally
{'npm install && npm run dev' if stack in ('nextjs', 'express') else 'pip install -r requirements.txt && uvicorn app.main:app --reload'}
```

### Testing

```bash
{'npm test' if stack in ('nextjs', 'express') else 'pytest'}
```

## Project Structure

```
{name}/
├── {'src/' if stack in ('nextjs', 'express') else 'app/'}
├── tests/
├── docker-compose.yml
├── .env.example
└── README.md
```

## License

MIT
"""


def generate_env_example(config: Dict[str, Any]) -> str:
    """Generate .env.example file."""
    lines = [
        "# Application",
        f"APP_NAME={config.get('name', 'my-app')}",
        "NODE_ENV=development",
        "PORT=3000",
        "",
        "# Database",
    ]
    db = config.get("database", "postgresql")
    if db == "postgresql":
        lines.extend(["DATABASE_URL=postgresql://user:password@localhost:5432/mydb", ""])
    elif db == "mongodb":
        lines.extend(["MONGODB_URI=mongodb://localhost:27017/mydb", ""])
    elif db == "mysql":
        lines.extend(["DATABASE_URL=mysql://user:password@localhost:3306/mydb", ""])

    if config.get("auth"):
        lines.extend([
            "# Auth",
            "JWT_SECRET=change-me-in-production",
            "JWT_EXPIRY=7d",
            ""
        ])

    if config.get("features", {}).get("email"):
        lines.extend(["# Email", "SMTP_HOST=smtp.example.com", "SMTP_PORT=587", "SMTP_USER=", "SMTP_PASS=", ""])

    if config.get("features", {}).get("storage"):
        lines.extend(["# Storage", "S3_BUCKET=", "S3_REGION=us-east-1", "AWS_ACCESS_KEY_ID=", "AWS_SECRET_ACCESS_KEY=", ""])

    return "\n".join(lines)


def generate_docker_compose(config: Dict[str, Any]) -> str:
    """Generate docker-compose.yml (manual YAML output to avoid a pyyaml dependency)."""
    db = config.get("database", "postgresql")

    # f-strings cannot contain backslashes before Python 3.12, so build
    # newline-bearing fragments outside the template.
    nl = "\n"
    # Only declare a dependency on the db service when one is actually
    # rendered below (generate_db_service returns "" for e.g. mysql).
    depends_on = f"    depends_on:{nl}      - db{nl}" if db in ("postgresql", "mongodb") else ""
    vol_line = "  pgdata:" if db == "postgresql" else "  mongodata:" if db == "mongodb" else "  {}"
    return f"""version: '3.8'

services:
  app:
    build: .
    ports:
      - "3000:3000"
    env_file:
      - .env
    volumes:
      - .:/app
{depends_on}
{generate_db_service(db)}{generate_redis_service(config)}volumes:
{vol_line}
"""


def generate_db_service(db: str) -> str:
    """Render the database service block for docker-compose, if any."""
    if db == "postgresql":
        return """  db:
    image: postgres:16-alpine
    ports:
      - "5432:5432"
    environment:
      POSTGRES_USER: user
      POSTGRES_PASSWORD: password
      POSTGRES_DB: mydb
    volumes:
      - pgdata:/var/lib/postgresql/data
"""
    if db == "mongodb":
        return """  db:
    image: mongo:7
    ports:
      - "27017:27017"
    volumes:
      - mongodata:/data/db
"""
    return ""


def generate_redis_service(config: Dict[str, Any]) -> str:
    """Render the redis service block when the feature flag is set."""
    if config.get("features", {}).get("redis"):
        return """  redis:
    image: redis:7-alpine
    ports:
      - "6379:6379"
"""
    return ""


def generate_gitignore(stack: str) -> str:
    """Generate .gitignore (currently stack-agnostic; covers Node and Python)."""
    return "node_modules/\n.env\n.env.local\ndist/\nbuild/\n.next/\n*.log\n.DS_Store\ncoverage/\n__pycache__/\n*.pyc\n.pytest_cache/\n.venv/\n"


def generate_dockerfile(config: Dict[str, Any]) -> str:
    """Generate Dockerfile."""
    stack = config.get("stack", "nextjs")
    if stack == "fastapi":
        return """FROM python:3.11-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
EXPOSE 3000
CMD ["uvicorn", "app.main:app", "--host", "0.0.0.0", "--port", "3000"]
"""
    return """FROM node:20-alpine
WORKDIR /app
COPY package*.json ./
RUN npm ci
COPY . .
RUN npm run build
EXPOSE 3000
CMD ["npm", "start"]
"""


def scaffold_project(config: Dict[str, Any], output_dir: str, dry_run: bool = False) -> Dict[str, Any]:
    """Generate project scaffolding."""
    stack = config.get("stack", "nextjs")
    template = STACK_TEMPLATES.get(stack, STACK_TEMPLATES["nextjs"])
    files_created = []

    # Create directories
    for d in template.get("dirs", []):
        path = os.path.join(output_dir, d)
        if not dry_run:
            os.makedirs(path, exist_ok=True)
        files_created.append({"path": d + "/", "type": "directory"})

    # Create template files
    all_files = {}

    # Package/requirements manifest and tsconfig (rendered lazily per config)
    for key in ("package.json", "requirements.txt", "tsconfig.json"):
        if key in template:
            all_files[key] = template[key](config)

    # Stack-specific files
    all_files.update(template.get("files", {}))

    # Common files
    all_files["README.md"] = generate_readme(config)
    all_files[".env.example"] = generate_env_example(config)
    all_files[".gitignore"] = generate_gitignore(stack)
    all_files["docker-compose.yml"] = generate_docker_compose(config)
    all_files["Dockerfile"] = generate_dockerfile(config)

    # Write files
    for filepath, content in all_files.items():
        full_path = os.path.join(output_dir, filepath)
        if not dry_run:
            os.makedirs(os.path.dirname(full_path), exist_ok=True)
            with open(full_path, "w") as f:
                f.write(content)
        files_created.append({"path": filepath, "type": "file", "size": len(content)})

    return {
        "generated_at": datetime.now().isoformat(),
        "project_name": config.get("name", "my-project"),
        "stack": stack,
        "output_dir": output_dir,
        "files_created": files_created,
        "total_files": len([f for f in files_created if f["type"] == "file"]),
        "total_dirs": len([f for f in files_created if f["type"] == "directory"]),
        "dry_run": dry_run
    }


def main():
    parser = argparse.ArgumentParser(description="Bootstrap SaaS project from config")
    parser.add_argument("input", help="Path to project config JSON")
    parser.add_argument("--output-dir", type=str, default="./my-project", help="Output directory")
    parser.add_argument("--format", choices=["json", "text"], default="text", help="Output format")
    parser.add_argument("--dry-run", action="store_true", help="Preview without creating files")

    args = parser.parse_args()

    with open(args.input) as f:
        config = json.load(f)

    result = scaffold_project(config, args.output_dir, args.dry_run)

    if args.format == "json":
        print(json.dumps(result, indent=2))
    else:
        print(f"Project '{result['project_name']}' scaffolded at {result['output_dir']}")
        print(f"Stack: {result['stack']}")
        print(f"Created: {result['total_files']} files, {result['total_dirs']} directories")
        if result["dry_run"]:
            print("\n[DRY RUN] No files were created.")
        print("\nFiles:")
        for f in result["files_created"]:
            prefix = "📁" if f["type"] == "directory" else "📄"
            size = f" ({f.get('size', 0)} bytes)" if f.get("size") else ""
            print(f"  {prefix} {f['path']}{size}")


if __name__ == "__main__":
    main()
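A minimal sketch of driving the script end to end. The config keys below (`name`, `description`, `stack`, `database`, `auth`, `features`) match what `scaffold_project` and the generators read; the project name and values are made up for illustration:

```python
import json
import os
import tempfile

# Hypothetical config exercising the keys project_bootstrapper.py reads.
config = {
    "name": "acme-saas",
    "description": "Example SaaS app",
    "stack": "fastapi",
    "database": "postgresql",
    "auth": "JWT",
    "features": {"redis": True, "email": False, "storage": False},
}

# Write the config to disk, then preview the scaffold without touching
# the filesystem:
#   python project_bootstrapper.py config.json --dry-run --output-dir ./acme-saas
with tempfile.TemporaryDirectory() as tmp:
    path = os.path.join(tmp, "config.json")
    with open(path, "w") as f:
        json.dump(config, f, indent=2)
    with open(path) as f:
        loaded = json.load(f)
```

With `--dry-run` the script reports every file and directory it would create; dropping the flag writes them under `--output-dir`, and `--format json` emits the same report as machine-readable JSON.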