feat: Unified create command + consolidated enhancement flags
This commit includes two major improvements and a bug fix:
## 1. Unified Create Command (v3.0.0 feature)
- Auto-detects source type (web, GitHub, local, PDF, config)
- Three-tier argument organization (universal, source-specific, advanced)
- Routes to existing scrapers (100% backward compatible)
- Progressive disclosure: 15 universal flags in default help
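The commit message describes the auto-detection but does not show it; a minimal sketch of the routing it implies (the function name and check order are assumptions, not the actual `source_detector.py` code):

```python
import os
import re


def detect_source_type(source: str) -> str:
    """Guess which scraper should handle `source` (hypothetical sketch)."""
    if re.match(r"https?://(www\.)?github\.com/", source):
        return "github"  # GitHub repository URL
    if re.match(r"https?://", source):
        return "web"     # any other documentation website
    if source.lower().endswith(".pdf"):
        return "pdf"     # PDF document
    if source.lower().endswith(".json"):
        return "config"  # scraper config file
    if os.path.exists(source):
        return "local"   # local directory or file
    raise ValueError(f"Cannot detect source type for: {source}")
```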
**New files:**
- src/skill_seekers/cli/source_detector.py - Auto-detection logic
- src/skill_seekers/cli/arguments/create.py - Argument definitions
- src/skill_seekers/cli/create_command.py - Main orchestrator
- src/skill_seekers/cli/parsers/create_parser.py - Parser integration
**Tests:**
- tests/test_source_detector.py (35 tests)
- tests/test_create_arguments.py (30 tests)
- tests/test_create_integration_basic.py (10 tests)
## 2. Enhanced Flag Consolidation (Phase 1)
- Consolidated 3 flags (--enhance, --enhance-local, --enhance-level) → 1 flag
- --enhance-level 0-3 with auto-detection of API vs LOCAL mode
- Default: --enhance-level 2 (balanced enhancement)
**Modified files:**
- arguments/{common,create,scrape,github,analyze}.py - Added enhance_level
- {doc_scraper,github_scraper,config_extractor,main}.py - Updated logic
- create_command.py - Uses consolidated flag
**Auto-detection:**
- If ANTHROPIC_API_KEY set → API mode
- Else → LOCAL mode (Claude Code)
## 3. PresetManager Bug Fix
- Fixed module naming conflict (presets.py vs presets/ directory)
- Moved presets.py → presets/manager.py
- Updated __init__.py exports
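The conflict existed because a `presets.py` module and a `presets/` directory cannot share one import name. A self-contained demonstration of the fixed layout (built in a temp directory here; the real package lives under `src/skill_seekers/`):

```python
import os
import sys
import tempfile

# Recreate the fixed layout: a presets/ package with manager.py inside,
# plus an __init__.py that re-exports PresetManager.
pkg_root = tempfile.mkdtemp()
os.makedirs(os.path.join(pkg_root, "presets"))
with open(os.path.join(pkg_root, "presets", "manager.py"), "w") as f:
    f.write("class PresetManager:\n    pass\n")
with open(os.path.join(pkg_root, "presets", "__init__.py"), "w") as f:
    f.write("from .manager import PresetManager\n")  # the re-export

sys.path.insert(0, pkg_root)
from presets import PresetManager  # old import path still works
```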
**Test Results:**
- All 160+ tests passing
- Zero regressions
- 100% backward compatible
Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
Diff of the scrape subcommand parser:

```diff
@@ -1,6 +1,11 @@
-"""Scrape subcommand parser."""
+"""Scrape subcommand parser.
+
+Uses shared argument definitions from arguments.scrape to ensure
+consistency with the standalone doc_scraper module.
+"""
 
 from .base import SubcommandParser
+from skill_seekers.cli.arguments.scrape import add_scrape_arguments
 
 
 class ScrapeParser(SubcommandParser):
@@ -19,24 +24,12 @@ class ScrapeParser(SubcommandParser):
         return "Scrape documentation website and generate skill"
 
     def add_arguments(self, parser):
-        """Add scrape-specific arguments."""
-        parser.add_argument("url", nargs="?", help="Documentation URL (positional argument)")
-        parser.add_argument("--config", help="Config JSON file")
-        parser.add_argument("--name", help="Skill name")
-        parser.add_argument("--description", help="Skill description")
-        parser.add_argument(
-            "--max-pages",
-            type=int,
-            dest="max_pages",
-            help="Maximum pages to scrape (override config)",
-        )
-        parser.add_argument(
-            "--skip-scrape", action="store_true", help="Skip scraping, use cached data"
-        )
-        parser.add_argument("--enhance", action="store_true", help="AI enhancement (API)")
-        parser.add_argument("--enhance-local", action="store_true", help="AI enhancement (local)")
-        parser.add_argument("--dry-run", action="store_true", help="Dry run mode")
-        parser.add_argument(
-            "--async", dest="async_mode", action="store_true", help="Use async scraping"
-        )
-        parser.add_argument("--workers", type=int, help="Number of async workers")
+        """Add scrape-specific arguments.
+
+        Uses shared argument definitions to ensure consistency
+        with doc_scraper.py (standalone scraper).
+        """
+        # Add all scrape arguments from shared definitions
+        # This ensures the unified CLI has exactly the same arguments
+        # as the standalone scraper - they CANNOT drift out of sync
+        add_scrape_arguments(parser)
```
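The pattern the diff adopts, defining every scrape flag once in a shared module that both the unified CLI and the standalone doc_scraper call, can be illustrated like this (a sketch; the real `arguments/scrape.py` is not shown in this commit, and the flag subset here mirrors the removed definitions):

```python
import argparse


def add_scrape_arguments(parser: argparse.ArgumentParser) -> None:
    """Register scrape flags in one place so the two CLIs cannot drift."""
    parser.add_argument("url", nargs="?", help="Documentation URL")
    parser.add_argument("--config", help="Config JSON file")
    parser.add_argument(
        "--max-pages",
        type=int,
        dest="max_pages",
        help="Maximum pages to scrape (override config)",
    )
    parser.add_argument("--dry-run", action="store_true", help="Dry run mode")
```

Any parser that calls `add_scrape_arguments` picks up the same flags, so adding a flag in one place updates both entry points at once.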