fix: improve MiniMax adaptor from PR #318 review (#319)

* feat: add MiniMax AI as LLM platform adaptor

Original implementation by octo-patch in PR #318.
This commit includes comprehensive improvements and documentation.

Code Improvements:
- Fix API key validation to properly check JWT format (eyJ prefix)
- Add specific exception handling for timeout and connection errors
- Remove unused variable in upload method

Dependencies:
- Add MiniMax to [all-llms] extra group in pyproject.toml

Tests:
- Remove duplicate setUp method in integration test class
- Add 5 new test methods:
  * test_package_excludes_backup_files
  * test_upload_success_mocked (with OpenAI mocking)
  * test_upload_network_error
  * test_upload_connection_error
  * test_validate_api_key_jwt_format
- Update test_validate_api_key_valid to use JWT format keys
- Fix test assertions for error message matching
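
The mocked upload tests follow a standard pattern: patch the OpenAI-backed
client so its upload call raises, then assert the adaptor surfaces a clear
error. A minimal stdlib-only sketch of that pattern — the MiniMaxAdaptor
shape, its client attribute, and the use of ConnectionError as a stand-in
for the SDK's connection error are all assumptions, not the project's
actual code:

```python
from unittest import mock

class MiniMaxAdaptor:
    """Stand-in for the real adaptor under test (hypothetical shape)."""

    def __init__(self, client):
        self.client = client

    def upload(self, path):
        try:
            # The real adaptor wraps an OpenAI-compatible client call here.
            return self.client.files.create(file=path)
        except ConnectionError as exc:  # stand-in for the SDK's error type
            raise RuntimeError(f"Upload failed: {exc}") from exc

def test_upload_connection_error():
    # Mock the client so the upload call raises a connection error.
    client = mock.Mock()
    client.files.create.side_effect = ConnectionError("connection refused")
    adaptor = MiniMaxAdaptor(client)
    try:
        adaptor.upload("skill.zip")
    except RuntimeError as exc:
        assert "Upload failed" in str(exc)
    else:
        raise AssertionError("expected RuntimeError")

test_upload_connection_error()
```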

Documentation:
- Create comprehensive MINIMAX_INTEGRATION.md guide (380+ lines)
- Update MULTI_LLM_SUPPORT.md with MiniMax platform entry
- Update 01-installation.md extras table
- Update INTEGRATIONS.md AI platforms table
- Update AGENTS.md adaptor import pattern example
- Fix README.md platform count from 4 to 5

All tests pass (33 passed, 3 skipped)
Lint checks pass

Co-authored-by: octo-patch <octo-patch@users.noreply.github.com>

* fix: improve MiniMax adaptor — typed exceptions, key validation, tests, docs

- Remove invalid "minimax" self-reference from all-llms dependency group
- Use typed OpenAI exceptions (APITimeoutError, APIConnectionError)
  instead of string-matching on generic Exception
- Replace incorrect JWT assumption in validate_api_key with length check
- Use DEFAULT_API_ENDPOINT constant instead of hardcoded URLs (3 sites)
- Add Path() cast for output_path before .is_dir() call
- Add sys.modules mock to test_enhance_missing_library
- Add mocked test_enhance_success with backup/content verification
- Update test assertions for new exception types and key validation
- Add MiniMax to __init__.py docstrings (module, get_adaptor, list_platforms)
- Add MiniMax sections to MULTI_LLM_SUPPORT.md (install, format, API key,
  workflow example, export-to-all)
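
The key-validation and path-handling changes above can be sketched roughly
as follows; the function names and the minimum-length threshold are
illustrative assumptions, not the project's actual values:

```python
from pathlib import Path

# Hypothetical threshold; the real value lives in the adaptor.
MIN_API_KEY_LENGTH = 32

def validate_api_key(key) -> bool:
    # MiniMax keys are not guaranteed to be JWTs, so check plausibility
    # by length rather than by an "eyJ" prefix.
    return isinstance(key, str) and len(key.strip()) >= MIN_API_KEY_LENGTH

def resolve_output_dir(output_path) -> Path:
    # Cast to Path first so str arguments also support .is_dir().
    path = Path(output_path)
    return path if path.is_dir() else path.parent
```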

Follows up on PR #318 by @octo-patch (feat: add MiniMax AI as LLM platform adaptor).

Co-Authored-By: Octopus <octo-patch@users.noreply.github.com>
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>

---------

Co-authored-by: octo-patch <octo-patch@users.noreply.github.com>
Co-authored-by: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Author: yusyus
Date: 2026-03-20 22:12:23 +03:00
Committed by: GitHub
Parent: 37a23e6c6d
Commit: 4f87de6b56
12 changed files with 1676 additions and 2359 deletions


@@ -112,6 +112,7 @@ Upload documentation as custom skills to AI chat platforms:
 | **[Claude](CLAUDE.md)** | Anthropic | ZIP + YAML | Claude.ai Projects | [Setup →](CLAUDE.md) |
 | **[Gemini](GEMINI_INTEGRATION.md)** | Google | tar.gz | Gemini AI | [Setup →](GEMINI_INTEGRATION.md) |
 | **[ChatGPT](OPENAI_INTEGRATION.md)** | OpenAI | ZIP + Vector Store | GPT Actions | [Setup →](OPENAI_INTEGRATION.md) |
+| **[MiniMax](MINIMAX_INTEGRATION.md)** | MiniMax | ZIP | MiniMax AI Platform | [Setup →](MINIMAX_INTEGRATION.md) |

 **Quick Example:**
 ```bash
@@ -139,7 +140,7 @@ skill-seekers upload output/vue-claude.zip --target claude
 | **AI coding (flow-based)** | Windsurf | Unique flow paradigm, Codeium AI | 5 min |
 | **AI coding (VS Code ext)** | Cline | Claude in VS Code, MCP integration | 10 min |
 | **AI coding (any IDE)** | Continue.dev | Works everywhere, open-source | 5 min |
-| **Chat with documentation** | Claude/Gemini/ChatGPT | Direct upload as custom skill | 3 min |
+| **Chat with documentation** | Claude/Gemini/ChatGPT/MiniMax | Direct upload as custom skill | 3 min |

 ### By Technical Requirements