fix: harden registry tooling, make tests hermetic, and restore metadata consistency (#168)

* chore: upgrade maintenance scripts to robust PyYAML parsing

- Replaces fragile regex frontmatter parsing with PyYAML/yaml library
- Ensures multi-line descriptions and complex characters are handled safely
- Normalizes quoting and field ordering across all maintenance scripts
- Updates validator to strictly enforce description quality

* fix: restore and refine truncated skill descriptions

- Recovered 223+ truncated descriptions from git history (6.5.0 regression)
- Refined long descriptions into concise, complete sentences (<200 chars)
- Added missing descriptions for brainstorming and orchestration skills
- Manually fixed imagen skill description
- Resolved dangling links in competitor-alternatives skill

* chore: sync generated registry files and document fixes

- Regenerated skills index with normalized forward-slash paths
- Updated README and CATALOG to reflect restored descriptions
- Documented restoration and script improvements in CHANGELOG.md

* fix: restore missing skill and align metadata for full 955 count

- Renamed SKILL.MD to SKILL.md in andruia-skill-smith to ensure indexing
- Fixed risk level and missing section in andruia-skill-smith
- Synchronized all registry files for final 955 skill count

* chore(scripts): add cross-platform runners and hermetic test orchestration

* fix(scripts): harden utf-8 output and clone target writeability

* fix(skills): add missing date metadata for strict validation

* chore(index): sync generated metadata dates

* fix(catalog): normalize skill paths to prevent CI drift

* chore: sync generated registry files

* fix: enforce LF line endings for generated registry files
This commit is contained in:
Ares
2026-03-01 08:38:25 +00:00
committed by GitHub
parent c9a76a2d94
commit 4a5f1234bb
258 changed files with 4296 additions and 1809 deletions

View File

@@ -50,6 +50,8 @@ jobs:
continue-on-error: true
- name: Run tests
env:
ENABLE_NETWORK_TESTS: "1"
run: npm run test
- name: 📦 Build catalog

1
.gitignore vendored
View File

@@ -2,6 +2,7 @@ node_modules/
__pycache__/
.ruff_cache/
.worktrees/
.tmp/
.DS_Store
# npm pack artifacts

File diff suppressed because it is too large Load Diff

View File

@@ -39,6 +39,10 @@ Check prototypes and generated code for structural flaws, hidden technical debt,
## 📦 Improvements
- **Skill Description Restoration**: Recovered 223+ truncated descriptions from git history that were corrupted in release 6.5.0.
- **Robust YAML Tooling**: Replaced fragile regex parsing with `PyYAML` across all maintenance scripts (`manage_skill_dates.py`, `validate_skills.py`, etc.) to prevent future data loss.
- **Refined Descriptions**: Standardized all skill descriptions to be under 200 characters while maintaining grammatical correctness and functional value.
- **Cross-Platform Index**: Normalized `skills_index.json` to use forward slashes for universal path compatibility.
- **Skill Validation Fixes**: Corrected invalid description lengths and `risk` fields in `copywriting`, `videodb-skills`, and `vibe-code-auditor`. (Fixes #157, #158)
- **Documentation**: New dedicated `docs/SEC_SKILLS.md` indexing all 128 security skills.
- **README Quality**: Cleaned up inconsistencies, deduplicated lists, updated stats (954+ total skills).

View File

@@ -1,6 +1,6 @@
# 🌌 Antigravity Awesome Skills: 954+ Agentic Skills for Claude Code, Gemini CLI, Cursor, Copilot & More
# 🌌 Antigravity Awesome Skills: 955+ Agentic Skills for Claude Code, Gemini CLI, Cursor, Copilot & More
> **The Ultimate Collection of 954+ Universal Agentic Skills for AI Coding Assistants — Claude Code, Gemini CLI, Codex CLI, Antigravity IDE, GitHub Copilot, Cursor, OpenCode, AdaL**
> **The Ultimate Collection of 955+ Universal Agentic Skills for AI Coding Assistants — Claude Code, Gemini CLI, Codex CLI, Antigravity IDE, GitHub Copilot, Cursor, OpenCode, AdaL**
[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT)
[![Claude Code](https://img.shields.io/badge/Claude%20Code-Anthropic-purple)](https://claude.ai)
@@ -42,7 +42,7 @@ This repository provides essential skills to transform your AI assistant into a
- [🎁 Curated Collections (Bundles)](#curated-collections)
- [🧭 Antigravity Workflows](#antigravity-workflows)
- [📦 Features & Categories](#features--categories)
- [📚 Browse 954+ Skills](#browse-954-skills)
- [📚 Browse 955+ Skills](#browse-955-skills)
- [🤝 How to Contribute](#how-to-contribute)
- [💬 Community](#community)
- [☕ Support the Project](#support-the-project)
@@ -341,7 +341,7 @@ The repository is organized into specialized domains to transform your AI into a
Counts change as new skills are added. For the current full registry, see [CATALOG.md](CATALOG.md).
## Browse 954+ Skills
## Browse 955+ Skills
We have moved the full skill registry to a dedicated catalog to keep this README clean, and we've also introduced an interactive **Web App**!

View File

@@ -7,7 +7,7 @@
"agent-orchestration-optimize": "agent-orchestration-multi-agent-optimize",
"android-jetpack-expert": "android-jetpack-compose-expert",
"api-testing-mock": "api-testing-observability-api-mock",
"templates": "app-builder/templates",
"templates": "app-builder\\templates",
"application-performance-optimization": "application-performance-performance-optimization",
"azure-ai-dotnet": "azure-ai-agents-persistent-dotnet",
"azure-ai-java": "azure-ai-agents-persistent-java",
@@ -65,24 +65,24 @@
"frontend-mobile-scaffold": "frontend-mobile-development-component-scaffold",
"frontend-mobile-scan": "frontend-mobile-security-xss-scan",
"full-stack-feature": "full-stack-orchestration-full-stack-feature",
"2d-games": "game-development/2d-games",
"3d-games": "game-development/3d-games",
"game-art": "game-development/game-art",
"game-audio": "game-development/game-audio",
"game-design": "game-development/game-design",
"mobile-games": "game-development/mobile-games",
"multiplayer": "game-development/multiplayer",
"pc-games": "game-development/pc-games",
"vr-ar": "game-development/vr-ar",
"web-games": "game-development/web-games",
"2d-games": "game-development\\2d-games",
"3d-games": "game-development\\3d-games",
"game-art": "game-development\\game-art",
"game-audio": "game-development\\game-audio",
"game-design": "game-development\\game-design",
"mobile-games": "game-development\\mobile-games",
"multiplayer": "game-development\\multiplayer",
"pc-games": "game-development\\pc-games",
"vr-ar": "game-development\\vr-ar",
"web-games": "game-development\\web-games",
"git-pr-workflow": "git-pr-workflows-git-workflow",
"incident-response": "incident-response-incident-response",
"javascript-typescript-scaffold": "javascript-typescript-typescript-scaffold",
"base": "libreoffice/base",
"calc": "libreoffice/calc",
"draw": "libreoffice/draw",
"impress": "libreoffice/impress",
"writer": "libreoffice/writer",
"base": "libreoffice\\base",
"calc": "libreoffice\\calc",
"draw": "libreoffice\\draw",
"impress": "libreoffice\\impress",
"writer": "libreoffice\\writer",
"llm-application-assistant": "llm-application-dev-ai-assistant",
"llm-application-agent": "llm-application-dev-langchain-agent",
"llm-application-optimize": "llm-application-dev-prompt-optimize",
@@ -100,11 +100,11 @@
"security-scanning-dependencies": "security-scanning-security-dependencies",
"security-scanning-hardening": "security-scanning-security-hardening",
"security-scanning-sast": "security-scanning-security-sast",
"aws-compliance-checker": "security/aws-compliance-checker",
"aws-iam-best-practices": "security/aws-iam-best-practices",
"security/aws-iam-practices": "security/aws-iam-best-practices",
"aws-secrets-rotation": "security/aws-secrets-rotation",
"aws-security-audit": "security/aws-security-audit",
"aws-compliance-checker": "security\\aws-compliance-checker",
"aws-iam-best-practices": "security\\aws-iam-best-practices",
"security\\aws-iam-practices": "security\\aws-iam-best-practices",
"aws-secrets-rotation": "security\\aws-secrets-rotation",
"aws-security-audit": "security\\aws-security-audit",
"seo-forensic-response": "seo-forensic-incident-response",
"startup-business-case": "startup-business-analyst-business-case",
"startup-business-projections": "startup-business-analyst-financial-projections",

View File

@@ -27,15 +27,21 @@
"azure-ai-agents-persistent-java",
"azure-ai-anomalydetector-java",
"azure-ai-contentsafety-java",
"azure-ai-contentsafety-py",
"azure-ai-contentunderstanding-py",
"azure-ai-formrecognizer-java",
"azure-ai-ml-py",
"azure-ai-projects-java",
"azure-ai-projects-py",
"azure-ai-projects-ts",
"azure-ai-transcription-py",
"azure-ai-translation-ts",
"azure-ai-vision-imageanalysis-java",
"azure-ai-voicelive-java",
"azure-ai-voicelive-py",
"azure-ai-voicelive-ts",
"azure-appconfiguration-java",
"azure-appconfiguration-py",
"azure-appconfiguration-ts",
"azure-communication-callautomation-java",
"azure-communication-callingserver-java",
@@ -43,34 +49,64 @@
"azure-communication-common-java",
"azure-communication-sms-java",
"azure-compute-batch-java",
"azure-containerregistry-py",
"azure-cosmos-db-py",
"azure-cosmos-java",
"azure-cosmos-py",
"azure-cosmos-rust",
"azure-cosmos-ts",
"azure-data-tables-java",
"azure-data-tables-py",
"azure-eventgrid-java",
"azure-eventgrid-py",
"azure-eventhub-java",
"azure-eventhub-py",
"azure-eventhub-rust",
"azure-eventhub-ts",
"azure-functions",
"azure-identity-java",
"azure-identity-py",
"azure-identity-rust",
"azure-identity-ts",
"azure-keyvault-certificates-rust",
"azure-keyvault-keys-rust",
"azure-keyvault-keys-ts",
"azure-keyvault-py",
"azure-keyvault-secrets-rust",
"azure-keyvault-secrets-ts",
"azure-messaging-webpubsub-java",
"azure-messaging-webpubsubservice-py",
"azure-mgmt-apicenter-dotnet",
"azure-mgmt-apicenter-py",
"azure-mgmt-apimanagement-dotnet",
"azure-mgmt-apimanagement-py",
"azure-mgmt-botservice-py",
"azure-mgmt-fabric-py",
"azure-monitor-ingestion-java",
"azure-monitor-ingestion-py",
"azure-monitor-opentelemetry-exporter-java",
"azure-monitor-opentelemetry-exporter-py",
"azure-monitor-opentelemetry-py",
"azure-monitor-opentelemetry-ts",
"azure-monitor-query-java",
"azure-monitor-query-py",
"azure-postgres-ts",
"azure-search-documents-py",
"azure-search-documents-ts",
"azure-security-keyvault-keys-java",
"azure-security-keyvault-secrets-java",
"azure-servicebus-py",
"azure-servicebus-ts",
"azure-speech-to-text-rest-py",
"azure-storage-blob-java",
"azure-storage-blob-py",
"azure-storage-blob-rust",
"azure-storage-blob-ts",
"azure-storage-file-datalake-py",
"azure-storage-file-share-py",
"azure-storage-file-share-ts",
"azure-storage-queue-py",
"azure-storage-queue-ts",
"azure-web-pubsub-ts",
"backend-architect",
"backend-dev-guidelines",
@@ -97,6 +133,7 @@
"documentation",
"documentation-generation-doc-generate",
"documentation-templates",
"dotnet-architect",
"dotnet-backend",
"dotnet-backend-patterns",
"exa-search",
@@ -117,7 +154,7 @@
"frontend-security-coder",
"frontend-slides",
"frontend-ui-dark-ts",
"game-development/mobile-games",
"game-development\\mobile-games",
"gemini-api-dev",
"go-concurrency-patterns",
"go-playwright",
@@ -132,6 +169,8 @@
"javascript-testing-patterns",
"javascript-typescript-typescript-scaffold",
"launch-strategy",
"m365-agents-py",
"m365-agents-ts",
"makepad-skills",
"manifest",
"memory-safety-patterns",
@@ -170,15 +209,17 @@
"react-patterns",
"react-state-management",
"react-ui-patterns",
"reference-builder",
"remotion-best-practices",
"ruby-pro",
"rust-async-patterns",
"rust-pro",
"security-audit",
"security/aws-secrets-rotation",
"security\\aws-secrets-rotation",
"senior-architect",
"senior-fullstack",
"shopify-apps",
"shopify-development",
"slack-automation",
"slack-bot-builder",
"stitch-ui-design",
@@ -217,6 +258,7 @@
"auth-implementation-patterns",
"aws-penetration-testing",
"azure-cosmos-db-py",
"azure-keyvault-py",
"azure-keyvault-secrets-rust",
"azure-keyvault-secrets-ts",
"azure-security-keyvault-keys-dotnet",
@@ -239,25 +281,34 @@
"ethical-hacking-methodology",
"find-bugs",
"firebase",
"firmware-analyst",
"framework-migration-deps-upgrade",
"frontend-mobile-security-xss-scan",
"frontend-security-coder",
"gdpr-data-handling",
"graphql-architect",
"k8s-manifest-generator",
"k8s-security-policies",
"laravel-expert",
"laravel-security-audit",
"legal-advisor",
"linkerd-patterns",
"loki-mode",
"m365-agents-dotnet",
"m365-agents-py",
"malware-analyst",
"mobile-security-coder",
"nestjs-expert",
"network-engineer",
"nextjs-supabase-auth",
"nodejs-best-practices",
"notebooklm",
"openapi-spec-generation",
"payment-integration",
"pci-compliance",
"pentest-checklist",
"plaid-fintech",
"quant-analyst",
"risk-manager",
"risk-metrics-calculation",
"sast-configuration",
@@ -271,10 +322,10 @@
"security-scanning-security-dependencies",
"security-scanning-security-hardening",
"security-scanning-security-sast",
"security/aws-compliance-checker",
"security/aws-iam-best-practices",
"security/aws-secrets-rotation",
"security/aws-security-audit",
"security\\aws-compliance-checker",
"security\\aws-iam-best-practices",
"security\\aws-secrets-rotation",
"security\\aws-security-audit",
"service-mesh-expert",
"solidity-security",
"stride-analysis-patterns",
@@ -282,6 +333,7 @@
"threat-mitigation-mapping",
"threat-modeling-expert",
"top-web-vulnerabilities",
"ui-visual-validator",
"varlock-claude-skill",
"vulnerability-scanner",
"web-design-guidelines",
@@ -294,8 +346,15 @@
"description": "Kubernetes and service mesh essentials.",
"skills": [
"azure-cosmos-db-py",
"azure-identity-dotnet",
"azure-identity-java",
"azure-identity-py",
"azure-identity-ts",
"azure-messaging-webpubsubservice-py",
"azure-mgmt-botservice-dotnet",
"azure-mgmt-botservice-py",
"azure-servicebus-dotnet",
"azure-servicebus-py",
"azure-servicebus-ts",
"chrome-extension-developer",
"cloud-devops",
@@ -308,6 +367,7 @@
"k8s-security-policies",
"kubernetes-architect",
"kubernetes-deployment",
"legal-advisor",
"linkerd-patterns",
"linux-troubleshooting",
"microservices-patterns",
@@ -326,20 +386,36 @@
"analytics-tracking",
"angular-ui-patterns",
"appdeploy",
"azure-ai-document-intelligence-dotnet",
"azure-ai-document-intelligence-ts",
"azure-ai-textanalytics-py",
"azure-cosmos-db-py",
"azure-cosmos-java",
"azure-cosmos-py",
"azure-cosmos-rust",
"azure-cosmos-ts",
"azure-data-tables-java",
"azure-data-tables-py",
"azure-eventhub-java",
"azure-eventhub-rust",
"azure-eventhub-ts",
"azure-maps-search-dotnet",
"azure-monitor-ingestion-java",
"azure-monitor-ingestion-py",
"azure-monitor-query-java",
"azure-monitor-query-py",
"azure-postgres-ts",
"azure-resource-manager-mysql-dotnet",
"azure-resource-manager-postgresql-dotnet",
"azure-resource-manager-sql-dotnet",
"azure-security-keyvault-secrets-java",
"azure-storage-file-datalake-py",
"blockrun",
"business-analyst",
"cc-skill-backend-patterns",
"cc-skill-clickhouse-io",
"claude-d3js-skill",
"content-marketer",
"data-engineer",
"data-engineering-data-driven-feature",
"data-engineering-data-pipeline",
@@ -365,9 +441,11 @@
"google-analytics-automation",
"googlesheets-automation",
"graphql",
"ios-developer",
"kpi-dashboard-design",
"libreoffice/base",
"libreoffice/calc",
"legal-advisor",
"libreoffice\\base",
"libreoffice\\calc",
"loki-mode",
"mailchimp-automation",
"ml-pipeline-workflow",
@@ -376,13 +454,18 @@
"nextjs-best-practices",
"nodejs-backend-patterns",
"pci-compliance",
"php-pro",
"postgres-best-practices",
"postgresql",
"postgresql-optimization",
"prisma-expert",
"programmatic-seo",
"pydantic-models-py",
"quant-analyst",
"rag-implementation",
"react-ui-patterns",
"scala-pro",
"schema-markup",
"segment-cdp",
"sendgrid-automation",
"senior-architect",
@@ -409,6 +492,9 @@
"aws-serverless",
"azd-deployment",
"azure-ai-anomalydetector-java",
"azure-mgmt-applicationinsights-dotnet",
"azure-mgmt-arizeaiobservabilityeval-dotnet",
"azure-mgmt-weightsandbiases-dotnet",
"azure-microsoft-playwright-testing-ts",
"azure-monitor-opentelemetry-ts",
"backend-development-feature-development",
@@ -425,6 +511,7 @@
"devops-troubleshooter",
"distributed-debugging-debug-trace",
"distributed-tracing",
"django-pro",
"docker-expert",
"e2e-testing-patterns",
"error-debugging-error-analysis",
@@ -432,7 +519,8 @@
"error-diagnostics-error-analysis",
"error-diagnostics-error-trace",
"expo-deployment",
"game-development/game-art",
"flutter-expert",
"game-development\\game-art",
"git-pr-workflows-git-workflow",
"gitlab-ci-patterns",
"gitops-workflow",
@@ -443,12 +531,15 @@
"incident-response-smart-fix",
"incident-runbook-templates",
"kpi-dashboard-design",
"kubernetes-architect",
"kubernetes-deployment",
"langfuse",
"llm-app-patterns",
"loki-mode",
"machine-learning-ops-ml-pipeline",
"malware-analyst",
"manifest",
"ml-engineer",
"ml-pipeline-workflow",
"observability-engineer",
"observability-monitoring-monitor-setup",
@@ -464,6 +555,8 @@
"service-mesh-expert",
"service-mesh-observability",
"slo-implementation",
"temporal-python-pro",
"unity-developer",
"vercel-deploy-claimable",
"vercel-deployment"
]

File diff suppressed because it is too large Load Diff

View File

@@ -4,17 +4,19 @@
"description": "900+ agentic skills for Claude Code, Gemini CLI, Cursor, Antigravity & more. Installer CLI.",
"license": "MIT",
"scripts": {
"validate": "python3 scripts/validate_skills.py",
"validate:strict": "python3 scripts/validate_skills.py --strict",
"index": "python3 scripts/generate_index.py",
"readme": "python3 scripts/update_readme.py",
"validate": "node scripts/run-python.js scripts/validate_skills.py",
"validate:strict": "node scripts/run-python.js scripts/validate_skills.py --strict",
"index": "node scripts/run-python.js scripts/generate_index.py",
"readme": "node scripts/run-python.js scripts/update_readme.py",
"chain": "npm run validate && npm run index && npm run readme",
"catalog": "node scripts/build-catalog.js",
"build": "npm run chain && npm run catalog",
"test": "node scripts/tests/validate_skills_headings.test.js && python3 scripts/tests/test_validate_skills_headings.py && python3 scripts/tests/inspect_microsoft_repo.py && python3 scripts/tests/test_comprehensive_coverage.py",
"sync:microsoft": "python3 scripts/sync_microsoft_skills.py",
"test": "node scripts/tests/run-test-suite.js",
"test:local": "node scripts/tests/run-test-suite.js --local",
"test:network": "node scripts/tests/run-test-suite.js --network",
"sync:microsoft": "node scripts/run-python.js scripts/sync_microsoft_skills.py",
"sync:all-official": "npm run sync:microsoft && npm run chain",
"update:skills": "python3 scripts/generate_index.py && copy skills_index.json web-app/public/skills.json",
"update:skills": "node scripts/run-python.js scripts/generate_index.py && node scripts/copy-file.js skills_index.json web-app/public/skills.json",
"app:setup": "node scripts/setup_web.js",
"app:install": "cd web-app && npm install",
"app:dev": "npm run app:setup && cd web-app && npm run dev",

View File

@@ -128,8 +128,10 @@ def categorize_skill(skill_name, description):
return None
import yaml
def auto_categorize(skills_dir, dry_run=False):
"""Auto-categorize skills and update generate_index.py"""
"""Auto-categorize skills and update SKILL.md files"""
skills = []
categorized_count = 0
already_categorized = 0
@@ -146,17 +148,19 @@ def auto_categorize(skills_dir, dry_run=False):
with open(skill_path, 'r', encoding='utf-8') as f:
content = f.read()
# Extract name and description from frontmatter
# Extract frontmatter and body
fm_match = re.search(r'^---\s*\n(.*?)\n---', content, re.DOTALL)
if not fm_match:
continue
fm_text = fm_match.group(1)
metadata = {}
for line in fm_text.split('\n'):
if ':' in line and not line.strip().startswith('#'):
key, val = line.split(':', 1)
metadata[key.strip()] = val.strip().strip('"').strip("'")
body = content[fm_match.end():]
try:
metadata = yaml.safe_load(fm_text) or {}
except yaml.YAMLError as e:
print(f"⚠️ {skill_id}: YAML error - {e}")
continue
skill_name = metadata.get('name', skill_id)
description = metadata.get('description', '')
@@ -186,32 +190,12 @@ def auto_categorize(skills_dir, dry_run=False):
})
if not dry_run:
# Update the SKILL.md file - add or replace category
fm_start = content.find('---')
fm_end = content.find('---', fm_start + 3)
metadata['category'] = new_category
new_fm = yaml.dump(metadata, sort_keys=False, allow_unicode=True, width=1000).strip()
new_content = f"---\n{new_fm}\n---" + body
if fm_start >= 0 and fm_end > fm_start:
frontmatter = content[fm_start:fm_end+3]
body = content[fm_end+3:]
# Check if category exists in frontmatter
if 'category:' in frontmatter:
# Replace existing category
new_frontmatter = re.sub(
r'category:\s*\w+',
f'category: {new_category}',
frontmatter
)
else:
# Add category before the closing ---
new_frontmatter = frontmatter.replace(
'\n---',
f'\ncategory: {new_category}\n---'
)
new_content = new_frontmatter + body
with open(skill_path, 'w', encoding='utf-8') as f:
f.write(new_content)
with open(skill_path, 'w', encoding='utf-8') as f:
f.write(new_content)
categorized_count += 1
else:

View File

@@ -628,7 +628,8 @@ function buildCatalog() {
category,
tags,
triggers,
path: path.relative(ROOT, skill.path),
// Normalize separators for deterministic cross-platform output.
path: path.relative(ROOT, skill.path).split(path.sep).join("/"),
});
}

71
scripts/copy-file.js Normal file
View File

@@ -0,0 +1,71 @@
#!/usr/bin/env node
'use strict';
const fs = require('node:fs');
const path = require('node:path');
const args = process.argv.slice(2);
if (args.length !== 2) {
console.error('Usage: node scripts/copy-file.js <source> <destination>');
process.exit(1);
}
const [sourceInput, destinationInput] = args;
const projectRoot = path.resolve(__dirname, '..');
const sourcePath = path.resolve(projectRoot, sourceInput);
const destinationPath = path.resolve(projectRoot, destinationInput);
const destinationDir = path.dirname(destinationPath);
function fail(message) {
console.error(message);
process.exit(1);
}
function isInsideProjectRoot(targetPath) {
const relativePath = path.relative(projectRoot, targetPath);
return relativePath === '' || (!relativePath.startsWith('..') && !path.isAbsolute(relativePath));
}
if (!isInsideProjectRoot(sourcePath) || !isInsideProjectRoot(destinationPath)) {
fail('Source and destination must resolve inside the project root.');
}
if (sourcePath === destinationPath) {
fail('Source and destination must be different files.');
}
if (!fs.existsSync(sourcePath)) {
fail(`Source file not found: ${sourceInput}`);
}
let sourceStats;
try {
sourceStats = fs.statSync(sourcePath);
} catch (error) {
fail(`Unable to read source file "${sourceInput}": ${error.message}`);
}
if (!sourceStats.isFile()) {
fail(`Source is not a file: ${sourceInput}`);
}
let destinationDirStats;
try {
destinationDirStats = fs.statSync(destinationDir);
} catch {
fail(`Destination directory not found: ${path.relative(projectRoot, destinationDir)}`);
}
if (!destinationDirStats.isDirectory()) {
fail(`Destination parent is not a directory: ${path.relative(projectRoot, destinationDir)}`);
}
try {
fs.copyFileSync(sourcePath, destinationPath);
} catch (error) {
fail(`Copy failed (${sourceInput} -> ${destinationInput}): ${error.message}`);
}
console.log(`Copied ${sourceInput} -> ${destinationInput}`);

View File

@@ -1,5 +1,6 @@
import os
import re
import yaml
def fix_skills(skills_dir):
for root, dirs, files in os.walk(skills_dir):
@@ -14,33 +15,31 @@ def fix_skills(skills_dir):
continue
fm_text = fm_match.group(1)
body = content[fm_match.end():]
folder_name = os.path.basename(root)
new_fm_lines = []
try:
metadata = yaml.safe_load(fm_text) or {}
except yaml.YAMLError as e:
print(f"⚠️ {skill_path}: YAML error - {e}")
continue
changed = False
for line in fm_text.split('\n'):
if line.startswith('name:'):
old_name = line.split(':', 1)[1].strip().strip('"').strip("'")
if old_name != folder_name:
new_fm_lines.append(f"name: {folder_name}")
changed = True
else:
new_fm_lines.append(line)
elif line.startswith('description:'):
desc = line.split(':', 1)[1].strip().strip('"').strip("'")
if len(desc) > 200:
# trim to 197 chars and add "..."
short_desc = desc[:197] + "..."
new_fm_lines.append(f'description: "{short_desc}"')
changed = True
else:
new_fm_lines.append(line)
else:
new_fm_lines.append(line)
# 1. Fix Name
if metadata.get('name') != folder_name:
metadata['name'] = folder_name
changed = True
# 2. Fix Description length
desc = metadata.get('description', '')
if isinstance(desc, str) and len(desc) > 200:
metadata['description'] = desc[:197] + "..."
changed = True
if changed:
new_fm_text = '\n'.join(new_fm_lines)
new_content = content[:fm_match.start(1)] + new_fm_text + content[fm_match.end(1):]
new_fm = yaml.dump(metadata, sort_keys=False, allow_unicode=True, width=1000).strip()
new_content = f"---\n{new_fm}\n---" + body
with open(skill_path, 'w', encoding='utf-8') as f:
f.write(new_content)
print(f"Fixed {skill_path}")

View File

@@ -1,9 +1,9 @@
import os
import re
import json
import yaml
def fix_yaml_quotes(skills_dir):
print(f"Scanning for YAML quoting errors in {skills_dir}...")
print(f"Normalizing YAML frontmatter in {skills_dir}...")
fixed_count = 0
for root, dirs, files in os.walk(skills_dir):
@@ -21,42 +21,24 @@ def fix_yaml_quotes(skills_dir):
continue
fm_text = fm_match.group(1)
new_fm_lines = []
changed = False
body = content[fm_match.end():]
for line in fm_text.split('\n'):
if line.startswith('description:'):
key, val = line.split(':', 1)
val = val.strip()
# Store original to check if it matches the fixed version
orig_val = val
# Strip matching outer quotes if they exist
if val.startswith('"') and val.endswith('"') and len(val) >= 2:
val = val[1:-1]
elif val.startswith("'") and val.endswith("'") and len(val) >= 2:
val = val[1:-1]
# Now safely encode using JSON to handle internal escapes
safe_val = json.dumps(val)
if safe_val != orig_val:
new_line = f"description: {safe_val}"
new_fm_lines.append(new_line)
changed = True
continue
new_fm_lines.append(line)
try:
# safe_load and then dump will normalize quoting automatically
metadata = yaml.safe_load(fm_text) or {}
new_fm = yaml.dump(metadata, sort_keys=False, allow_unicode=True, width=1000).strip()
if changed:
new_fm_text = '\n'.join(new_fm_lines)
new_content = content[:fm_match.start(1)] + new_fm_text + content[fm_match.end(1):]
with open(file_path, 'w', encoding='utf-8') as f:
f.write(new_content)
print(f"Fixed quotes in {os.path.relpath(file_path, skills_dir)}")
fixed_count += 1
# Check if it actually changed something significant (beyond just style)
# but normalization is good anyway. We'll just compare the fm_text.
if new_fm.strip() != fm_text.strip():
new_content = f"---\n{new_fm}\n---" + body
with open(file_path, 'w', encoding='utf-8') as f:
f.write(new_content)
fixed_count += 1
except yaml.YAMLError as e:
print(f"⚠️ {file_path}: YAML error - {e}")
print(f"Total files fixed: {fixed_count}")
print(f"Total files normalized: {fixed_count}")
if __name__ == '__main__':
base_dir = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))

View File

@@ -59,9 +59,11 @@ def generate_index(skills_dir, output_file):
parent_dir = os.path.basename(os.path.dirname(root))
# Default values
rel_path = os.path.relpath(root, os.path.dirname(skills_dir))
# Force forward slashes for cross-platform JSON compatibility
skill_info = {
"id": dir_name,
"path": os.path.relpath(root, os.path.dirname(skills_dir)),
"path": rel_path.replace(os.sep, '/'),
"category": parent_dir if parent_dir != "skills" else None, # Will be overridden by frontmatter if present
"name": dir_name.replace("-", " ").title(),
"description": "",
@@ -117,7 +119,7 @@ def generate_index(skills_dir, output_file):
# Sort validation: by name
skills.sort(key=lambda x: (x["name"].lower(), x["id"].lower()))
with open(output_file, 'w', encoding='utf-8') as f:
with open(output_file, 'w', encoding='utf-8', newline='\n') as f:
json.dump(skills, f, indent=2)
print(f"✅ Generated rich index with {len(skills)} skills at: {output_file}")

View File

@@ -18,20 +18,19 @@ def get_project_root():
"""Get the project root directory."""
return os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
import yaml
def parse_frontmatter(content):
"""Parse frontmatter from SKILL.md content."""
"""Parse frontmatter from SKILL.md content using PyYAML."""
fm_match = re.search(r'^---\s*\n(.*?)\n---', content, re.DOTALL)
if not fm_match:
return None
fm_text = fm_match.group(1)
metadata = {}
for line in fm_text.split('\n'):
if ':' in line and not line.strip().startswith('#'):
key, val = line.split(':', 1)
metadata[key.strip()] = val.strip().strip('"').strip("'")
return metadata
try:
return yaml.safe_load(fm_text) or {}
except yaml.YAMLError:
return None
def generate_skills_report(output_file=None, sort_by='date'):
"""Generate a report of all skills with their metadata."""

View File

@@ -26,45 +26,39 @@ def get_project_root():
"""Get the project root directory."""
return os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
import yaml
def parse_frontmatter(content):
"""Parse frontmatter from SKILL.md content."""
"""Parse frontmatter from SKILL.md content using PyYAML."""
fm_match = re.search(r'^---\s*\n(.*?)\n---', content, re.DOTALL)
if not fm_match:
return None, content
fm_text = fm_match.group(1)
metadata = {}
for line in fm_text.split('\n'):
if ':' in line and not line.strip().startswith('#'):
key, val = line.split(':', 1)
metadata[key.strip()] = val.strip().strip('"').strip("'")
return metadata, content
try:
metadata = yaml.safe_load(fm_text) or {}
return metadata, content
except yaml.YAMLError as e:
print(f"⚠️ YAML parsing error: {e}")
return None, content
def reconstruct_frontmatter(metadata):
"""Reconstruct frontmatter from metadata dict."""
lines = ["---"]
# Order: id, name, description, category, risk, source, tags, date_added
priority_keys = ['id', 'name', 'description', 'category', 'risk', 'source', 'tags']
"""Reconstruct frontmatter from metadata dict using PyYAML."""
# Ensure important keys are at the top if they exist
ordered = {}
priority_keys = ['id', 'name', 'description', 'category', 'risk', 'source', 'tags', 'date_added']
for key in priority_keys:
if key in metadata:
val = metadata[key]
if isinstance(val, list):
# Handle list fields like tags
lines.append(f'{key}: {val}')
elif ' ' in str(val) or any(c in str(val) for c in ':#"'):
lines.append(f'{key}: "{val}"')
else:
lines.append(f'{key}: {val}')
ordered[key] = metadata[key]
# Add date_added at the end
if 'date_added' in metadata:
lines.append(f'date_added: "{metadata["date_added"]}"')
lines.append("---")
return '\n'.join(lines)
# Add any remaining keys
for key, value in metadata.items():
if key not in ordered:
ordered[key] = value
fm_text = yaml.dump(ordered, sort_keys=False, allow_unicode=True, width=1000).strip()
return f"---\n{fm_text}\n---"
def update_skill_frontmatter(skill_path, metadata):
"""Update a skill's frontmatter with new metadata."""

View File

@@ -14,6 +14,9 @@ const ALLOWED_FIELDS = new Set([
'compatibility',
'metadata',
'allowed-tools',
'date_added',
'category',
'id',
]);
function isPlainObject(value) {
@@ -122,7 +125,8 @@ function normalizeSkill(skillId) {
if (!modified) return false;
const ordered = {};
for (const key of ['name', 'description', 'license', 'compatibility', 'allowed-tools', 'metadata']) {
const order = ['id', 'name', 'description', 'category', 'risk', 'source', 'license', 'compatibility', 'date_added', 'allowed-tools', 'metadata'];
for (const key of order) {
if (updated[key] !== undefined) {
ordered[key] = updated[key];
}

90
scripts/run-python.js Normal file
View File

@@ -0,0 +1,90 @@
#!/usr/bin/env node
'use strict';
const { spawn, spawnSync } = require('node:child_process');
const args = process.argv.slice(2);
if (args.length === 0) {
console.error('Usage: node scripts/run-python.js <script.py> [args...]');
process.exit(1);
}
function uniqueCandidates(candidates) {
const seen = new Set();
const unique = [];
for (const candidate of candidates) {
const key = candidate.join('\u0000');
if (!seen.has(key)) {
seen.add(key);
unique.push(candidate);
}
}
return unique;
}
function getPythonCandidates() {
// Optional override for CI/local pinning without editing scripts.
const configuredPython =
process.env.ANTIGRAVITY_PYTHON || process.env.npm_config_python;
const candidates = [
configuredPython ? [configuredPython] : null,
// Keep this ordered list easy to update if project requirements change.
['python3'],
['python'],
['py', '-3'],
].filter(Boolean);
return uniqueCandidates(candidates);
}
function canRun(candidate) {
const [command, ...baseArgs] = candidate;
const probe = spawnSync(
command,
[...baseArgs, '-c', 'import sys; raise SystemExit(0 if sys.version_info[0] == 3 else 1)'],
{
stdio: 'ignore',
shell: false,
},
);
return probe.error == null && probe.status === 0;
}
const pythonCandidates = getPythonCandidates();
const selected = pythonCandidates.find(canRun);
if (!selected) {
console.error(
'Unable to find a Python 3 interpreter. Tried: python3, python, py -3',
);
process.exit(1);
}
const [command, ...baseArgs] = selected;
const child = spawn(command, [...baseArgs, ...args], {
stdio: 'inherit',
shell: false,
});
child.on('error', (error) => {
console.error(`Failed to start Python interpreter "${command}": ${error.message}`);
process.exit(1);
});
child.on('exit', (code, signal) => {
if (signal) {
try {
process.kill(process.pid, signal);
} catch {
process.exit(1);
}
return;
}
process.exit(code ?? 1);
});

View File

@@ -59,8 +59,10 @@ def cleanup_previous_sync():
return removed_count
import yaml
def extract_skill_name(skill_md_path: Path) -> str | None:
"""Extract the 'name' field from SKILL.md YAML frontmatter."""
"""Extract the 'name' field from SKILL.md YAML frontmatter using PyYAML."""
try:
content = skill_md_path.read_text(encoding="utf-8")
except Exception:
@@ -70,13 +72,11 @@ def extract_skill_name(skill_md_path: Path) -> str | None:
if not fm_match:
return None
for line in fm_match.group(1).splitlines():
match = re.match(r"^name:\s*(.+)$", line)
if match:
value = match.group(1).strip().strip("\"'")
if value:
return value
return None
try:
data = yaml.safe_load(fm_match.group(1)) or {}
return data.get('name')
except Exception:
return None
def generate_fallback_name(relative_path: Path) -> str:

View File

@@ -5,13 +5,61 @@ Shows the repository layout, skill locations, and what flat names would be gener
"""
import re
import io
import shutil
import subprocess
import sys
import tempfile
import traceback
import uuid
from pathlib import Path
MS_REPO = "https://github.com/microsoft/skills.git"
def create_clone_target(prefix: str) -> Path:
"""Return a writable, non-existent path for git clone destination."""
repo_tmp_root = Path(__file__).resolve().parents[2] / ".tmp" / "tests"
candidate_roots = (repo_tmp_root, Path(tempfile.gettempdir()))
last_error: OSError | None = None
for root in candidate_roots:
try:
root.mkdir(parents=True, exist_ok=True)
probe_file = root / f".{prefix}write-probe-{uuid.uuid4().hex}.tmp"
with probe_file.open("xb"):
pass
probe_file.unlink()
return root / f"{prefix}{uuid.uuid4().hex}"
except OSError as exc:
last_error = exc
if last_error is not None:
raise last_error
raise OSError("Unable to determine clone destination")
def configure_utf8_output() -> None:
"""Best-effort UTF-8 stdout/stderr on Windows without dropping diagnostics."""
for stream_name in ("stdout", "stderr"):
stream = getattr(sys, stream_name)
try:
stream.reconfigure(encoding="utf-8", errors="backslashreplace")
continue
except Exception:
pass
buffer = getattr(stream, "buffer", None)
if buffer is not None:
setattr(
sys,
stream_name,
io.TextIOWrapper(
buffer, encoding="utf-8", errors="backslashreplace"
),
)
def extract_skill_name(skill_md_path: Path) -> str | None:
"""Extract the 'name' field from SKILL.md YAML frontmatter."""
try:
@@ -37,18 +85,26 @@ def inspect_repo():
print("🔍 Inspecting Microsoft Skills Repository Structure")
print("=" * 60)
with tempfile.TemporaryDirectory() as temp_dir:
temp_path = Path(temp_dir)
repo_path: Path | None = None
try:
repo_path = create_clone_target(prefix="ms-skills-")
print("\n1⃣ Cloning repository...")
subprocess.run(
["git", "clone", "--depth", "1", MS_REPO, str(temp_path)],
check=True,
capture_output=True,
)
try:
subprocess.run(
["git", "clone", "--depth", "1", MS_REPO, str(repo_path)],
check=True,
capture_output=True,
text=True,
)
except subprocess.CalledProcessError as exc:
print("\n❌ git clone failed.", file=sys.stderr)
if exc.stderr:
print(exc.stderr.strip(), file=sys.stderr)
raise
# Find all SKILL.md files
all_skill_mds = list(temp_path.rglob("SKILL.md"))
all_skill_mds = list(repo_path.rglob("SKILL.md"))
print(f"\n2⃣ Total SKILL.md files found: {len(all_skill_mds)}")
# Show flat name mapping
@@ -59,7 +115,7 @@ def inspect_repo():
for skill_md in sorted(all_skill_mds, key=lambda p: str(p)):
try:
rel = skill_md.parent.relative_to(temp_path)
rel = skill_md.parent.relative_to(repo_path)
except ValueError:
rel = skill_md.parent
@@ -87,12 +143,18 @@ def inspect_repo():
f"\n4⃣ ✅ No name collisions — all {len(names_seen)} names are unique!")
print("\n✨ Inspection complete!")
finally:
if repo_path is not None:
shutil.rmtree(repo_path, ignore_errors=True)
if __name__ == "__main__":
configure_utf8_output()
try:
inspect_repo()
except subprocess.CalledProcessError as exc:
sys.exit(exc.returncode or 1)
except Exception as e:
print(f"\n❌ Error: {e}")
import traceback
traceback.print_exc()
print(f"\n❌ Error: {e}", file=sys.stderr)
traceback.print_exc(file=sys.stderr)
sys.exit(1)

View File

@@ -0,0 +1,76 @@
#!/usr/bin/env node
const { spawnSync } = require("child_process");
const NETWORK_TEST_ENV = "ENABLE_NETWORK_TESTS";
const ENABLED_VALUES = new Set(["1", "true", "yes", "on"]);
const LOCAL_TEST_COMMANDS = [
["scripts/tests/validate_skills_headings.test.js"],
["scripts/run-python.js", "scripts/tests/test_validate_skills_headings.py"],
];
const NETWORK_TEST_COMMANDS = [
["scripts/run-python.js", "scripts/tests/inspect_microsoft_repo.py"],
["scripts/run-python.js", "scripts/tests/test_comprehensive_coverage.py"],
];
function isNetworkTestsEnabled() {
const value = process.env[NETWORK_TEST_ENV];
if (!value) {
return false;
}
return ENABLED_VALUES.has(String(value).trim().toLowerCase());
}
function runNodeCommand(args) {
const result = spawnSync(process.execPath, args, { stdio: "inherit" });
if (result.error) {
throw result.error;
}
if (result.signal) {
process.kill(process.pid, result.signal);
}
if (typeof result.status !== "number") {
process.exit(1);
}
if (result.status !== 0) {
process.exit(result.status);
}
}
function runCommandSet(commands) {
for (const commandArgs of commands) {
runNodeCommand(commandArgs);
}
}
function main() {
const mode = process.argv[2];
if (mode === "--local") {
runCommandSet(LOCAL_TEST_COMMANDS);
return;
}
if (mode === "--network") {
runCommandSet(NETWORK_TEST_COMMANDS);
return;
}
runCommandSet(LOCAL_TEST_COMMANDS);
if (!isNetworkTestsEnabled()) {
console.log(
`[tests] Skipping network integration tests. Set ${NETWORK_TEST_ENV}=1 to enable.`,
);
return;
}
console.log(`[tests] ${NETWORK_TEST_ENV} enabled; running network integration tests.`);
runCommandSet(NETWORK_TEST_COMMANDS);
}
main();

View File

@@ -5,14 +5,62 @@ Ensures all skills are captured and no directory name collisions exist.
"""
import re
import io
import shutil
import subprocess
import sys
import tempfile
import traceback
import uuid
from pathlib import Path
from collections import defaultdict
MS_REPO = "https://github.com/microsoft/skills.git"
def create_clone_target(prefix: str) -> Path:
"""Return a writable, non-existent path for git clone destination."""
repo_tmp_root = Path(__file__).resolve().parents[2] / ".tmp" / "tests"
candidate_roots = (repo_tmp_root, Path(tempfile.gettempdir()))
last_error: OSError | None = None
for root in candidate_roots:
try:
root.mkdir(parents=True, exist_ok=True)
probe_file = root / f".{prefix}write-probe-{uuid.uuid4().hex}.tmp"
with probe_file.open("xb"):
pass
probe_file.unlink()
return root / f"{prefix}{uuid.uuid4().hex}"
except OSError as exc:
last_error = exc
if last_error is not None:
raise last_error
raise OSError("Unable to determine clone destination")
def configure_utf8_output() -> None:
"""Best-effort UTF-8 stdout/stderr on Windows without dropping diagnostics."""
for stream_name in ("stdout", "stderr"):
stream = getattr(sys, stream_name)
try:
stream.reconfigure(encoding="utf-8", errors="backslashreplace")
continue
except Exception:
pass
buffer = getattr(stream, "buffer", None)
if buffer is not None:
setattr(
sys,
stream_name,
io.TextIOWrapper(
buffer, encoding="utf-8", errors="backslashreplace"
),
)
def extract_skill_name(skill_md_path: Path) -> str | None:
"""Extract the 'name' field from SKILL.md YAML frontmatter."""
try:
@@ -41,27 +89,35 @@ def analyze_skill_locations():
print("🔬 Comprehensive Skill Coverage & Uniqueness Analysis")
print("=" * 60)
with tempfile.TemporaryDirectory() as temp_dir:
temp_path = Path(temp_dir)
repo_path: Path | None = None
try:
repo_path = create_clone_target(prefix="ms-skills-")
print("\n1⃣ Cloning repository...")
subprocess.run(
["git", "clone", "--depth", "1", MS_REPO, str(temp_path)],
check=True,
capture_output=True,
)
try:
subprocess.run(
["git", "clone", "--depth", "1", MS_REPO, str(repo_path)],
check=True,
capture_output=True,
text=True,
)
except subprocess.CalledProcessError as exc:
print("\n❌ git clone failed.", file=sys.stderr)
if exc.stderr:
print(exc.stderr.strip(), file=sys.stderr)
raise
# Find ALL SKILL.md files
all_skill_files = list(temp_path.rglob("SKILL.md"))
all_skill_files = list(repo_path.rglob("SKILL.md"))
print(f"\n2⃣ Total SKILL.md files found: {len(all_skill_files)}")
# Categorize by location
location_types = defaultdict(list)
for skill_file in all_skill_files:
path_str = str(skill_file)
if ".github/skills" in path_str:
path_str = skill_file.as_posix()
if ".github/skills/" in path_str:
location_types["github_skills"].append(skill_file)
elif ".github/plugins" in path_str:
elif ".github/plugins/" in path_str:
location_types["github_plugins"].append(skill_file)
elif "/skills/" in path_str:
location_types["skills_dir"].append(skill_file)
@@ -81,7 +137,7 @@ def analyze_skill_locations():
for skill_file in all_skill_files:
try:
rel = skill_file.parent.relative_to(temp_path)
rel = skill_file.parent.relative_to(repo_path)
except ValueError:
rel = skill_file.parent
@@ -163,9 +219,13 @@ def analyze_skill_locations():
"invalid_names": len(invalid_names),
"passed": is_pass,
}
finally:
if repo_path is not None:
shutil.rmtree(repo_path, ignore_errors=True)
if __name__ == "__main__":
configure_utf8_output()
try:
results = analyze_skill_locations()
@@ -176,14 +236,18 @@ if __name__ == "__main__":
if results["passed"]:
print("\n✅ V4 FLAT STRUCTURE IS VALID")
print(" All names are unique and valid directory names!")
sys.exit(0)
else:
print("\n⚠️ V4 FLAT STRUCTURE NEEDS FIXES")
if results["collisions"] > 0:
print(f" {results['collisions']} name collisions to resolve")
if results["invalid_names"] > 0:
print(f" {results['invalid_names']} invalid directory names")
sys.exit(1)
except subprocess.CalledProcessError as exc:
sys.exit(exc.returncode or 1)
except Exception as e:
print(f"\n❌ Error: {e}")
import traceback
traceback.print_exc()
print(f"\n❌ Error: {e}", file=sys.stderr)
traceback.print_exc(file=sys.stderr)
sys.exit(1)

View File

@@ -1,7 +1,31 @@
#!/usr/bin/env python3
import io
import json
import os
import re
import sys
def configure_utf8_output() -> None:
"""Best-effort UTF-8 stdout/stderr on Windows without dropping diagnostics."""
if sys.platform != "win32":
return
for stream_name in ("stdout", "stderr"):
stream = getattr(sys, stream_name)
try:
stream.reconfigure(encoding="utf-8", errors="backslashreplace")
continue
except Exception:
pass
buffer = getattr(stream, "buffer", None)
if buffer is not None:
setattr(
sys,
stream_name,
io.TextIOWrapper(buffer, encoding="utf-8", errors="backslashreplace"),
)
def update_readme():
@@ -55,11 +79,12 @@ def update_readme():
content,
)
with open(readme_path, "w", encoding="utf-8") as f:
with open(readme_path, "w", encoding="utf-8", newline="\n") as f:
f.write(content)
print("✅ README.md updated successfully.")
if __name__ == "__main__":
configure_utf8_output()
update_readme()

View File

@@ -2,6 +2,29 @@ import os
import re
import argparse
import sys
import io
def configure_utf8_output() -> None:
"""Best-effort UTF-8 stdout/stderr on Windows without dropping diagnostics."""
if sys.platform != "win32":
return
for stream_name in ("stdout", "stderr"):
stream = getattr(sys, stream_name)
try:
stream.reconfigure(encoding="utf-8", errors="backslashreplace")
continue
except Exception:
pass
buffer = getattr(stream, "buffer", None)
if buffer is not None:
setattr(
sys,
stream_name,
io.TextIOWrapper(buffer, encoding="utf-8", errors="backslashreplace"),
)
WHEN_TO_USE_PATTERNS = [
re.compile(r"^##\s+When\s+to\s+Use", re.MULTILINE | re.IGNORECASE),
@@ -12,39 +35,37 @@ WHEN_TO_USE_PATTERNS = [
def has_when_to_use_section(content):
return any(pattern.search(content) for pattern in WHEN_TO_USE_PATTERNS)
import yaml
def parse_frontmatter(content, rel_path=None):
"""
Simple frontmatter parser using regex to avoid external dependencies.
Returns a dict of key-values.
Parse frontmatter using PyYAML for robustness.
Returns a dict of key-values and a list of error messages.
"""
fm_match = re.search(r'^---\s*\n(.*?)\n---', content, re.DOTALL)
if not fm_match:
return None, []
return None, ["Missing or malformed YAML frontmatter"]
fm_text = fm_match.group(1)
metadata = {}
lines = fm_text.split('\n')
fm_errors = []
for i, line in enumerate(lines):
if ':' in line:
key, val = line.split(':', 1)
metadata[key.strip()] = val.strip().strip('"').strip("'")
# Check for multi-line description issue (problem identification for the user)
if key.strip() == "description":
stripped_val = val.strip()
if (stripped_val.startswith('"') and stripped_val.endswith('"')) or \
(stripped_val.startswith("'") and stripped_val.endswith("'")):
if i + 1 < len(lines) and lines[i+1].startswith(' '):
fm_errors.append(f"description is wrapped in quotes but followed by indented lines. This causes YAML truncation.")
# Check for literal indicators wrapped in quotes
if stripped_val in ['"|"', "'>'", '"|"', "'>'"]:
fm_errors.append(f"description uses a block indicator {stripped_val} inside quotes. Remove quotes for proper YAML block behavior.")
return metadata, fm_errors
try:
metadata = yaml.safe_load(fm_text) or {}
# Identification of the specific regression issue for better reporting
if "description" in metadata:
desc = metadata["description"]
if not desc or (isinstance(desc, str) and not desc.strip()):
fm_errors.append("description field is empty or whitespace only.")
elif desc == "|":
fm_errors.append("description contains only the YAML block indicator '|', likely due to a parsing regression.")
return metadata, fm_errors
except yaml.YAMLError as e:
return None, [f"YAML Syntax Error: {e}"]
def validate_skills(skills_dir, strict_mode=False):
configure_utf8_output()
print(f"🔍 Validating skills in: {skills_dir}")
print(f"⚙️ Mode: {'STRICT (CI)' if strict_mode else 'Standard (Dev)'}")
@@ -90,12 +111,15 @@ def validate_skills(skills_dir, strict_mode=False):
elif metadata["name"] != os.path.basename(root):
errors.append(f"{rel_path}: Name '{metadata['name']}' does not match folder name '{os.path.basename(root)}'")
if "description" not in metadata:
if "description" not in metadata or metadata["description"] is None:
errors.append(f"{rel_path}: Missing 'description' in frontmatter")
else:
# agentskills-ref checks for short descriptions
if len(metadata["description"]) > 200:
errors.append(f"{rel_path}: Description is oversized ({len(metadata['description'])} chars). Must be concise.")
desc = metadata["description"]
if not isinstance(desc, str):
errors.append(f"{rel_path}: 'description' must be a string, got {type(desc).__name__}")
elif len(desc) > 300: # increased limit for multi-line support
errors.append(f"{rel_path}: Description is oversized ({len(desc)} chars). Must be concise.")
# Risk Validation (Quality Bar)
if "risk" not in metadata:

View File

@@ -3,12 +3,16 @@ id: 10-andruia-skill-smith
name: 10-andruia-skill-smith
description: "Ingeniero de Sistemas de Andru.ia. Diseña, redacta y despliega nuevas habilidades (skills) dentro del repositorio siguiendo el Estándar de Diamante."
category: andruia
risk: official
risk: safe
source: personal
date_added: "2026-02-25"
---
# 🔨 Andru.ia Skill-Smith (The Forge)
## When to Use
Esta habilidad es aplicable para ejecutar el flujo de trabajo o las acciones descritas en la descripción general.
## 📝 Descripción
Soy el Ingeniero de Sistemas de Andru.ia. Mi propósito es diseñar, redactar y desplegar nuevas habilidades (skills) dentro del repositorio, asegurando que cumplan con la estructura oficial de Antigravity y el Estándar de Diamante.
@@ -38,4 +42,4 @@ Generar el código para los siguientes archivos:
## ⚠️ Reglas de Oro
- **Prefijos Numéricos:** Asignar un número correlativo a la carpeta (ej. 11, 12, 13) para mantener el orden.
- **Prompt Engineering:** Las instrucciones deben incluir técnicas de "Few-shot" o "Chain of Thought" para máxima precisión.
- **Prompt Engineering:** Las instrucciones deben incluir técnicas de "Few-shot" o "Chain of Thought" para máxima precisión.

View File

@@ -1,9 +1,9 @@
---
name: ai-engineer
description: |
description: Build production-ready LLM applications, advanced RAG systems, and intelligent agents. Implements vector search, multimodal AI, agent orchestration, and enterprise AI integrations.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
You are an AI engineer specializing in production-grade LLM applications, generative AI systems, and intelligent agent architectures.

View File

@@ -1,9 +1,9 @@
---
name: ai-product
description: "Every product will be AI-powered. The question is whether you'll build it right or ship a demo that falls apart in production. This skill covers LLM integration patterns, RAG architecture, prompt ..."
description: Every product will be AI-powered. The question is whether you'll build it right or ship a demo that falls apart in production. This skill covers LLM integration patterns, RAG architecture, prompt ...
risk: unknown
source: "vibeship-spawner-skills (Apache 2.0)"
date_added: "2026-02-27"
source: vibeship-spawner-skills (Apache 2.0)
date_added: '2026-02-27'
---
# AI Product Development

View File

@@ -1,9 +1,9 @@
---
name: analytics-tracking
description: >
description: Design, audit, and improve analytics tracking systems that produce reliable, decision-ready data.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Analytics Tracking & Measurement Strategy

View File

@@ -3,6 +3,7 @@ name: android_ui_verification
description: Automated end-to-end UI testing and verification on an Android Emulator using ADB.
risk: safe
source: community
date_added: "2026-02-28"
---
# Android UI Verification Skill

View File

@@ -1,9 +1,9 @@
---
name: angular
description: >-
description: Modern Angular (v20+) expert with deep knowledge of Signals, Standalone Components, Zoneless applications, SSR/Hydration, and reactive patterns.
risk: safe
source: self
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Angular Expert

View File

@@ -1,9 +1,9 @@
---
name: api-documenter
description: |
description: Master API documentation with OpenAPI 3.1, AI-powered tools, and modern developer experience practices. Create interactive docs, generate SDKs, and build comprehensive developer portals.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
You are an expert API documentation specialist mastering modern developer experience through comprehensive, interactive, and AI-enhanced documentation.

View File

@@ -1,9 +1,9 @@
---
name: arm-cortex-expert
description: >
description: Senior embedded software engineer specializing in firmware and driver development for ARM Cortex-M microcontrollers (Teensy, STM32, nRF52, SAMD).
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# @arm-cortex-expert

View File

@@ -1,9 +1,9 @@
---
name: azure-ai-agents-persistent-dotnet
description: |
description: Azure AI Agents Persistent SDK for .NET. Low-level SDK for creating and managing AI agents with threads, messages, runs, and tools.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure.AI.Agents.Persistent (.NET)

View File

@@ -1,9 +1,9 @@
---
name: azure-ai-agents-persistent-java
description: |
description: Azure AI Agents Persistent SDK for Java. Low-level SDK for creating and managing AI agents with threads, messages, runs, and tools.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure AI Agents Persistent SDK for Java

View File

@@ -1,9 +1,9 @@
---
name: azure-ai-contentsafety-py
description: |
description: Azure AI Content Safety SDK for Python. Use for detecting harmful content in text and images with multi-severity classification.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure AI Content Safety SDK for Python

View File

@@ -1,9 +1,9 @@
---
name: azure-ai-contentunderstanding-py
description: |
description: Azure AI Content Understanding SDK for Python. Use for multimodal content extraction from documents, images, audio, and video.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure AI Content Understanding SDK for Python

View File

@@ -1,9 +1,9 @@
---
name: azure-ai-document-intelligence-dotnet
description: |
description: Azure AI Document Intelligence SDK for .NET. Extract text, tables, and structured data from documents using prebuilt and custom models.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure.AI.DocumentIntelligence (.NET)

View File

@@ -1,9 +1,9 @@
---
name: azure-ai-ml-py
description: |
description: Azure Machine Learning SDK v2 for Python. Use for ML workspaces, jobs, models, datasets, compute, and pipelines.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure Machine Learning SDK v2 for Python

View File

@@ -1,9 +1,9 @@
---
name: azure-ai-openai-dotnet
description: |
description: Azure OpenAI SDK for .NET. Client library for Azure OpenAI and OpenAI services. Use for chat completions, embeddings, image generation, audio transcription, and assistants.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure.AI.OpenAI (.NET)

View File

@@ -1,9 +1,9 @@
---
name: azure-ai-projects-dotnet
description: |
description: Azure AI Projects SDK for .NET. High-level client for Azure AI Foundry projects including agents, connections, datasets, deployments, evaluations, and indexes.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure.AI.Projects (.NET)

View File

@@ -1,9 +1,9 @@
---
name: azure-ai-projects-java
description: |
description: Azure AI Projects SDK for Java. High-level SDK for Azure AI Foundry project management including connections, datasets, indexes, and evaluations.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure AI Projects SDK for Java

View File

@@ -1,9 +1,9 @@
---
name: azure-ai-textanalytics-py
description: |
description: Azure AI Text Analytics SDK for sentiment analysis, entity recognition, key phrases, language detection, PII, and healthcare NLP. Use for natural language processing on text.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure AI Text Analytics SDK for Python

View File

@@ -1,9 +1,9 @@
---
name: azure-ai-transcription-py
description: |
description: Azure AI Transcription SDK for Python. Use for real-time and batch speech-to-text transcription with timestamps and diarization.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure AI Transcription SDK for Python

View File

@@ -1,9 +1,9 @@
---
name: azure-ai-translation-document-py
description: |
description: Azure AI Document Translation SDK for batch translation of documents with format preservation. Use for translating Word, PDF, Excel, PowerPoint, and other document formats at scale.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure AI Document Translation SDK for Python

View File

@@ -1,9 +1,9 @@
---
name: azure-ai-translation-text-py
description: |
description: Azure AI Text Translation SDK for real-time text translation, transliteration, language detection, and dictionary lookup. Use for translating text content in applications.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure AI Text Translation SDK for Python

View File

@@ -1,9 +1,9 @@
---
name: azure-ai-vision-imageanalysis-py
description: |
description: Azure AI Vision Image Analysis SDK for captions, tags, objects, OCR, people detection, and smart cropping. Use for computer vision and image understanding tasks.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure AI Vision Image Analysis SDK for Python

View File

@@ -1,9 +1,9 @@
---
name: azure-ai-voicelive-dotnet
description: |
description: Azure AI Voice Live SDK for .NET. Build real-time voice AI applications with bidirectional WebSocket communication.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure.AI.VoiceLive (.NET)

View File

@@ -1,9 +1,9 @@
---
name: azure-ai-voicelive-java
description: |
description: Azure AI VoiceLive SDK for Java. Real-time bidirectional voice conversations with AI assistants using WebSocket.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure AI VoiceLive SDK for Java

View File

@@ -1,9 +1,9 @@
---
name: azure-ai-voicelive-ts
description: |
description: Azure AI Voice Live SDK for JavaScript/TypeScript. Build real-time voice AI applications with bidirectional WebSocket communication.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# @azure/ai-voicelive (JavaScript/TypeScript)

View File

@@ -1,9 +1,9 @@
---
name: azure-appconfiguration-java
description: |
description: Azure App Configuration SDK for Java. Centralized application configuration management with key-value settings, feature flags, and snapshots.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure App Configuration SDK for Java

View File

@@ -1,9 +1,9 @@
---
name: azure-appconfiguration-py
description: |
description: Azure App Configuration SDK for Python. Use for centralized configuration management, feature flags, and dynamic settings.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure App Configuration SDK for Python

View File

@@ -1,9 +1,9 @@
---
name: azure-compute-batch-java
description: |
description: Azure Batch SDK for Java. Run large-scale parallel and HPC batch jobs with pools, jobs, tasks, and compute nodes.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure Batch SDK for Java

View File

@@ -1,9 +1,9 @@
---
name: azure-containerregistry-py
description: |
description: Azure Container Registry SDK for Python. Use for managing container images, artifacts, and repositories.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure Container Registry SDK for Python

View File

@@ -1,9 +1,9 @@
---
name: azure-cosmos-java
description: |
description: Azure Cosmos DB SDK for Java. NoSQL database operations with global distribution, multi-model support, and reactive patterns.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure Cosmos DB SDK for Java

View File

@@ -1,9 +1,9 @@
---
name: azure-cosmos-py
description: |
description: Azure Cosmos DB SDK for Python (NoSQL API). Use for document CRUD, queries, containers, and globally distributed data.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure Cosmos DB SDK for Python

View File

@@ -1,9 +1,9 @@
---
name: azure-cosmos-rust
description: |
description: Azure Cosmos DB SDK for Rust (NoSQL API). Use for document CRUD, queries, containers, and globally distributed data.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure Cosmos DB SDK for Rust

View File

@@ -1,9 +1,9 @@
---
name: azure-cosmos-ts
description: |
description: Azure Cosmos DB JavaScript/TypeScript SDK (@azure/cosmos) for data plane operations. Use for CRUD operations on documents, queries, bulk operations, and container management.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# @azure/cosmos (TypeScript/JavaScript)

View File

@@ -1,9 +1,9 @@
---
name: azure-data-tables-py
description: |
description: Azure Tables SDK for Python (Storage and Cosmos DB). Use for NoSQL key-value storage, entity CRUD, and batch operations.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure Tables SDK for Python

View File

@@ -1,9 +1,9 @@
---
name: azure-eventgrid-dotnet
description: |
description: Azure Event Grid SDK for .NET. Client library for publishing and consuming events with Azure Event Grid. Use for event-driven architectures, pub/sub messaging, CloudEvents, and EventGridEvents.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure.Messaging.EventGrid (.NET)

View File

@@ -1,9 +1,9 @@
---
name: azure-eventgrid-py
description: |
description: Azure Event Grid SDK for Python. Use for publishing events, handling CloudEvents, and event-driven architectures.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure Event Grid SDK for Python

View File

@@ -1,9 +1,9 @@
---
name: azure-eventhub-dotnet
description: |
description: Azure Event Hubs SDK for .NET.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure.Messaging.EventHubs (.NET)

View File

@@ -1,9 +1,9 @@
---
name: azure-eventhub-py
description: |
description: Azure Event Hubs SDK for Python streaming. Use for high-throughput event ingestion, producers, consumers, and checkpointing.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure Event Hubs SDK for Python

View File

@@ -1,9 +1,9 @@
---
name: azure-eventhub-rust
description: |
description: Azure Event Hubs SDK for Rust. Use for sending and receiving events, streaming data ingestion.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure Event Hubs SDK for Rust

View File

@@ -1,9 +1,9 @@
---
name: azure-identity-dotnet
description: |
description: Azure Identity SDK for .NET. Authentication library for Azure SDK clients using Microsoft Entra ID. Use for DefaultAzureCredential, managed identity, service principals, and developer credentials.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure.Identity (.NET)

View File

@@ -1,9 +1,9 @@
---
name: azure-identity-py
description: |
description: Azure Identity SDK for Python authentication. Use for DefaultAzureCredential, managed identity, service principals, and token caching.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure Identity SDK for Python

View File

@@ -1,9 +1,9 @@
---
name: azure-identity-rust
description: |
description: Azure Identity SDK for Rust authentication. Use for DeveloperToolsCredential, ManagedIdentityCredential, ClientSecretCredential, and token-based authentication.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure Identity SDK for Rust

View File

@@ -1,9 +1,9 @@
---
name: azure-keyvault-certificates-rust
description: |
description: Azure Key Vault Certificates SDK for Rust. Use for creating, importing, and managing certificates.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure Key Vault Certificates SDK for Rust

View File

@@ -1,9 +1,9 @@
---
name: azure-keyvault-keys-rust
description: |
description: 'Azure Key Vault Keys SDK for Rust. Use for creating, managing, and using cryptographic keys. Triggers: "keyvault keys rust", "KeyClient rust", "create key rust", "encrypt rust", "sign rust".'
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure Key Vault Keys SDK for Rust

View File

@@ -1,9 +1,9 @@
---
name: azure-keyvault-py
description: |
description: Azure Key Vault SDK for Python. Use for secrets, keys, and certificates management with secure storage.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure Key Vault SDK for Python

View File

@@ -1,9 +1,9 @@
---
name: azure-keyvault-secrets-rust
description: |
description: 'Azure Key Vault Secrets SDK for Rust. Use for storing and retrieving secrets, passwords, and API keys. Triggers: "keyvault secrets rust", "SecretClient rust", "get secret rust", "set secret rust".'
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure Key Vault Secrets SDK for Rust

View File

@@ -1,9 +1,9 @@
---
name: azure-maps-search-dotnet
description: |
description: Azure Maps SDK for .NET. Location-based services including geocoding, routing, rendering, geolocation, and weather. Use for address search, directions, map tiles, IP geolocation, and weather data.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure Maps (.NET)

View File

@@ -1,9 +1,9 @@
---
name: azure-messaging-webpubsubservice-py
description: |
description: Azure Web PubSub Service SDK for Python. Use for real-time messaging, WebSocket connections, and pub/sub patterns.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure Web PubSub Service SDK for Python

View File

@@ -1,9 +1,9 @@
---
name: azure-mgmt-apicenter-dotnet
description: |
description: Azure API Center SDK for .NET. Centralized API inventory management with governance, versioning, and discovery.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure.ResourceManager.ApiCenter (.NET)

View File

@@ -1,9 +1,9 @@
---
name: azure-mgmt-apicenter-py
description: |
description: Azure API Center Management SDK for Python. Use for managing API inventory, metadata, and governance across your organization.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure API Center Management SDK for Python

View File

@@ -1,9 +1,9 @@
---
name: azure-mgmt-apimanagement-dotnet
description: |
description: Azure Resource Manager SDK for API Management in .NET.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure.ResourceManager.ApiManagement (.NET)

View File

@@ -1,9 +1,9 @@
---
name: azure-mgmt-apimanagement-py
description: |
description: Azure API Management SDK for Python. Use for managing APIM services, APIs, products, subscriptions, and policies.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure API Management SDK for Python

View File

@@ -1,9 +1,9 @@
---
name: azure-mgmt-applicationinsights-dotnet
description: |
description: Azure Application Insights SDK for .NET. Application performance monitoring and observability resource management.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure.ResourceManager.ApplicationInsights (.NET)

View File

@@ -1,9 +1,9 @@
---
name: azure-mgmt-arizeaiobservabilityeval-dotnet
description: |
description: Azure Resource Manager SDK for Arize AI Observability and Evaluation (.NET).
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure.ResourceManager.ArizeAIObservabilityEval

View File

@@ -1,9 +1,9 @@
---
name: azure-mgmt-botservice-dotnet
description: |
description: Azure Resource Manager SDK for Bot Service in .NET. Management plane operations for creating and managing Azure Bot resources, channels (Teams, DirectLine, Slack), and connection settings.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure.ResourceManager.BotService (.NET)

View File

@@ -1,9 +1,9 @@
---
name: azure-mgmt-botservice-py
description: |
description: Azure Bot Service Management SDK for Python. Use for creating, managing, and configuring Azure Bot Service resources.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure Bot Service Management SDK for Python

View File

@@ -1,9 +1,9 @@
---
name: azure-mgmt-fabric-dotnet
description: |
description: Azure Resource Manager SDK for Fabric in .NET.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure.ResourceManager.Fabric (.NET)

View File

@@ -1,9 +1,9 @@
---
name: azure-mgmt-fabric-py
description: |
description: Azure Fabric Management SDK for Python. Use for managing Microsoft Fabric capacities and resources.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure Fabric Management SDK for Python

View File

@@ -1,9 +1,9 @@
---
name: azure-mgmt-weightsandbiases-dotnet
description: |
description: Azure Weights & Biases SDK for .NET. ML experiment tracking and model management via Azure Marketplace. Use for creating W&B instances, managing SSO, marketplace integration, and ML observability.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure.ResourceManager.WeightsAndBiases (.NET)

View File

@@ -1,9 +1,9 @@
---
name: azure-monitor-ingestion-java
description: |
description: Azure Monitor Ingestion SDK for Java. Send custom logs to Azure Monitor via Data Collection Rules (DCR) and Data Collection Endpoints (DCE).
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure Monitor Ingestion SDK for Java

View File

@@ -1,9 +1,9 @@
---
name: azure-monitor-ingestion-py
description: |
description: Azure Monitor Ingestion SDK for Python. Use for sending custom logs to Log Analytics workspace via Logs Ingestion API.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure Monitor Ingestion SDK for Python

View File

@@ -1,9 +1,9 @@
---
name: azure-monitor-opentelemetry-exporter-java
description: |
description: Azure Monitor OpenTelemetry Exporter for Java. Export OpenTelemetry traces, metrics, and logs to Azure Monitor/Application Insights.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure Monitor OpenTelemetry Exporter for Java

View File

@@ -1,9 +1,9 @@
---
name: azure-monitor-opentelemetry-exporter-py
description: |
description: Azure Monitor OpenTelemetry Exporter for Python. Use for low-level OpenTelemetry export to Application Insights.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure Monitor OpenTelemetry Exporter for Python

View File

@@ -1,9 +1,9 @@
---
name: azure-monitor-opentelemetry-py
description: |
description: Azure Monitor OpenTelemetry Distro for Python. Use for one-line Application Insights setup with auto-instrumentation.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure Monitor OpenTelemetry Distro for Python

View File

@@ -1,9 +1,9 @@
---
name: azure-monitor-query-java
description: |
description: Azure Monitor Query SDK for Java. Execute Kusto queries against Log Analytics workspaces and query metrics from Azure resources.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure Monitor Query SDK for Java

View File

@@ -1,9 +1,9 @@
---
name: azure-monitor-query-py
description: |
description: Azure Monitor Query SDK for Python. Use for querying Log Analytics workspaces and Azure Monitor metrics.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure Monitor Query SDK for Python

View File

@@ -1,9 +1,9 @@
---
name: azure-postgres-ts
description: |
description: Connect to Azure Database for PostgreSQL Flexible Server from Node.js/TypeScript using the pg (node-postgres) package.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure PostgreSQL for TypeScript (node-postgres)

View File

@@ -1,9 +1,9 @@
---
name: azure-resource-manager-cosmosdb-dotnet
description: |
description: Azure Resource Manager SDK for Cosmos DB in .NET.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure.ResourceManager.CosmosDB (.NET)

View File

@@ -1,9 +1,9 @@
---
name: azure-resource-manager-durabletask-dotnet
description: |
description: Azure Resource Manager SDK for Durable Task Scheduler in .NET.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure.ResourceManager.DurableTask (.NET)

View File

@@ -1,9 +1,9 @@
---
name: azure-resource-manager-mysql-dotnet
description: |
description: Azure MySQL Flexible Server SDK for .NET. Database management for MySQL Flexible Server deployments.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure.ResourceManager.MySql (.NET)

View File

@@ -1,9 +1,9 @@
---
name: azure-resource-manager-playwright-dotnet
description: |
description: Azure Resource Manager SDK for Microsoft Playwright Testing in .NET.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure.ResourceManager.Playwright (.NET)

View File

@@ -1,9 +1,9 @@
---
name: azure-resource-manager-postgresql-dotnet
description: |
description: Azure PostgreSQL Flexible Server SDK for .NET. Database management for PostgreSQL Flexible Server deployments.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure.ResourceManager.PostgreSql (.NET)

View File

@@ -1,9 +1,9 @@
---
name: azure-resource-manager-redis-dotnet
description: |
description: Azure Resource Manager SDK for Redis in .NET.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure.ResourceManager.Redis (.NET)

View File

@@ -1,9 +1,9 @@
---
name: azure-resource-manager-sql-dotnet
description: |
description: Azure Resource Manager SDK for Azure SQL in .NET.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure.ResourceManager.Sql (.NET)

View File

@@ -1,9 +1,9 @@
---
name: azure-search-documents-dotnet
description: |
description: Azure AI Search SDK for .NET (Azure.Search.Documents). Use for building search applications with full-text, vector, semantic, and hybrid search.
risk: unknown
source: community
date_added: "2026-02-27"
date_added: '2026-02-27'
---
# Azure.Search.Documents (.NET)

Some files were not shown because too many files have changed in this diff Show More