Add Trinity Core tasks and Gemini MCP consultation

Tasks Added:
- Task #109: MCP Logging in Trinity Console (full spec)
- Task #110: Uptime Kuma monitor cleanup
- Task #111: Claude Desktop MCP integration

Consultations:
- gemini-mcp-connector-2026-04-11.md - Full MCP protocol guidance
- gemini-social-api-strategy-2026-04-10.md - Social sync strategy

Key insights from Gemini:
- Claude.ai web doesn't support custom MCP connectors yet
- Use Claude Desktop + local wrapper script for now
- Trinity Core REST API works as-is, no rewrite needed
- Future: SSE support when Anthropic opens remote MCP

Chronicler #76
Author: Claude
Date: 2026-04-11 07:28:48 +00:00
parent 9f68a4f661
commit bb68e37504
6 changed files with 1314 additions and 0 deletions


@@ -0,0 +1,271 @@
# Gemini Consultation: Trinity Core MCP Connector Integration
**Date:** April 11, 2026 @ 2:30 AM CT
**From:** Michael (The Wizard) + Claude (Chronicler #76)
**To:** Gemini (Architectural Partner)
**Re:** How to connect Claude to our custom MCP server (Trinity Core)
---
## Hey Gemini! 👋
Quick late-night question for you. We just built something cool and want your input on the next step.
---
## The Situation
Tonight we deployed **Trinity Core** — a Raspberry Pi 4B in Michael's house running:
- Node.js/Express MCP server on port 3000
- Cloudflare Tunnel exposing it at `https://mcp.firefrostgaming.com`
- SSH access to all 7 Firefrost servers
- API token authentication + command logging
**It works great!** We can execute commands on any server via:
```bash
curl -X POST https://mcp.firefrostgaming.com/exec \
  -H "Authorization: Bearer FFG-Trinity-2026-Core-Access" \
  -H "Content-Type: application/json" \
  -d '{"server":"command-center","command":"hostname"}'
```
Returns:
```json
{"server":"command-center","command":"hostname","success":true,"stdout":"command-center","stderr":"","error":null}
```
---
## What We're Trying to Do
We want **Claude to call Trinity Core directly** — like how Claude currently calls Canva, Stripe, and Cloudflare via MCP connectors.
Right now, Claude's sandbox can't make outbound requests to `mcp.firefrostgaming.com`. Michael has to run curl commands manually, which defeats the purpose.
The dream: Claude says "I'll check the server status" → calls Trinity Core tool → gets response → continues conversation.
---
## Specific Questions
1. **How do custom MCP connectors work?** Is there a public spec for the Model Context Protocol that we can implement on Trinity Core?
2. **What's the simplest path to integration?** Options we're considering:
- Implement full MCP protocol on Trinity Core
- Build a Cloudflare Worker as a translation layer
- Register Trinity Core somehow with Claude.ai
- Wait for Anthropic to open up custom connectors
3. **Is there an MCP server template or SDK** we could use as a starting point? We saw references to `@modelcontextprotocol/sdk`, but it's unclear whether it works with Claude.ai specifically.
4. **Authentication concerns** — How do MCP connectors handle auth? Would our current Bearer token approach work, or does MCP expect something different (OAuth, API keys, etc.)?
5. **Any gotchas** we should know about before going down this path?
---
## Context That Might Help
- **Current endpoints:**
- `GET /` — health check (no auth)
- `GET /servers` — list servers (auth required)
- `POST /exec` — execute command (auth required)
- **Claude already has MCP connectors for:** Google Calendar, Gmail, Canva, Cloudflare, Stripe, Mermaid Chart
- **Our stack:** Node.js/Express on Pi, Cloudflare Tunnel, simple REST API
- **Goal:** Give Claude operational access to Firefrost infrastructure with proper audit trail
---
Thanks Gemini! 🔥❄️
— Michael + Claude (Chronicler #76)
---
## Gemini's Response (April 11, 2026 @ 2:35 AM CT)
**Summary:** Use Claude Desktop App with a local MCP wrapper script that bridges to our existing REST API. Claude.ai web doesn't support custom MCP connectors yet.
### Key Points
1. **Local MCP vs Managed MCP** — Claude.ai uses "Managed MCP" (Anthropic-hosted connectors like Canva). For custom servers, use **Claude Desktop App** which allows local config files.
2. **Bridge Approach (Recommended)** — Don't rewrite Trinity Core. Create a small Node.js wrapper script that:
- Implements MCP protocol via `@modelcontextprotocol/sdk`
- Receives commands from Claude Desktop via stdio
- Forwards to our existing REST API at `mcp.firefrostgaming.com`
3. **MCP SDK Structure:**
```javascript
import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
const server = new Server({
  name: "trinity-core",
  version: "1.0.0",
}, {
  capabilities: { tools: {} }
});
// Define tools, handle requests...
```
4. **Claude Desktop Config** (`~/Library/Application Support/Claude/claude_desktop_config.json`):
```json
{
  "mcpServers": {
    "trinity": {
      "command": "node",
      "args": ["/path/to/your/trinity-mcp-wrapper.js"],
      "env": {
        "TRINITY_TOKEN": "FFG-Trinity-2026-Core-Access"
      }
    }
  }
}
```
5. **Authentication** — Handled via environment variables in the config. Our Bearer token approach works perfectly.
6. **Gotcha Warning** — Gemini recommends a **Command Allowlist** on Trinity Core until we're confident in the persona's stability. Valid concern.
---
## Thank You Response
**Date:** April 11, 2026 @ 2:40 AM CT
---
## Hey Gemini!
Perfect — this is exactly what we needed. The "bridge script" approach is brilliant because we don't have to rewrite Trinity Core at all.
**Quick follow-up thoughts:**
1. **Claude Desktop vs Claude.ai** — This is a key insight. Michael primarily uses Claude.ai (web), so we have two paths:
- Use Claude Desktop for MCP access (requires switching workflows)
- Wait for Anthropic to open custom connectors on web (unknown timeline)
2. **Command Allowlist** — Totally agree. We discussed this earlier tonight and Michael explicitly said "I trust you with rm, apt, and reboot" — but your point about persona stability across Chronicler instances is valid. We'll implement tiered permissions:
- **Safe (no approval):** `hostname`, `uptime`, `df`, `free`, `systemctl status`, `docker ps`, `cat` (logs), `git pull`
- **Operational (no approval):** `systemctl restart <known-service>`, deploy scripts
- **Dangerous (log + notify):** `rm`, `apt`, `reboot`, raw shell
3. **Implementation Plan:**
- Create the MCP wrapper script
- Test on Michael's machine with Claude Desktop
- Document for successor Chroniclers
- Later: evaluate if we want to run the wrapper on the Pi itself
**One question:** Is there any way to make this work with Claude.ai web in the future? Like, could we potentially get Trinity Core added to the MCP connector marketplace, or is that only for commercial integrations?
Thanks for the late-night wisdom! 🔥❄️
— Michael + Claude (Chronicler #76)
---
## Conclusion
**Path Forward:** Claude Desktop + MCP SDK wrapper script is the way.
**Next Steps:**
1. Install Claude Desktop on Michael's machine
2. Create MCP wrapper script using `@modelcontextprotocol/sdk`
3. Configure Claude Desktop to point to wrapper
4. Implement command allowlist on Trinity Core
5. Test the full loop: Claude Desktop → MCP → Trinity Core → Server → Response
**Documented as Task #111.**
---
## Gemini's Follow-up Response (April 11, 2026 @ 2:45 AM CT)
**Summary:** Provided complete wrapper script code and clarified future Web UI path.
### Future of Claude.ai Web + Custom MCP
Two likely paths:
1. **Remote MCP ("Bring Your Own URL")** — Anthropic may allow users to input remote URLs directly. This would require Trinity Core to support **SSE (Server-Sent Events)** instead of stdio.
2. **Enterprise/Team Gateway** — Shared resource registration for organizations.
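For reference, SSE itself is simple: a long-lived HTTP response that streams `data:` frames. A minimal sketch of what an SSE-style endpoint on Trinity Core might involve — the route name and event shape here are our assumptions, not part of any published spec:

```javascript
// Illustrative only: the route and event format below are assumptions,
// not a published remote-MCP spec.

// An SSE frame is plain text: "data: <payload>\n\n"
function formatSseEvent(payload) {
  return `data: ${JSON.stringify(payload)}\n\n`;
}

// Hypothetical Express route sketch (not wired up here):
// app.get("/sse", auth, (req, res) => {
//   res.set({
//     "Content-Type": "text/event-stream",
//     "Cache-Control": "no-cache",
//     Connection: "keep-alive"
//   });
//   res.write(formatSseEvent({ status: "Trinity Core Online" }));
// });

console.log(formatSseEvent({ status: "Trinity Core Online" }));
```

The upshot: if the remote path opens up, the REST core stays the same and only the transport layer changes.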
### Complete Wrapper Script
```javascript
// trinity-wrapper.js
import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { CallToolRequestSchema, ListToolsRequestSchema } from "@modelcontextprotocol/sdk/types.js";
import axios from "axios";

const server = new Server({ name: "trinity-bridge", version: "1.0.0" }, { capabilities: { tools: {} } });
const TRINITY_URL = "https://mcp.firefrostgaming.com/exec";
const AUTH_TOKEN = process.env.TRINITY_TOKEN;

server.setRequestHandler(ListToolsRequestSchema, async () => ({
  tools: [{
    name: "run_command",
    description: "Execute a command on a Firefrost server",
    inputSchema: {
      type: "object",
      properties: {
        server: { type: "string" },
        command: { type: "string" }
      },
      required: ["server", "command"]
    }
  }]
}));

server.setRequestHandler(CallToolRequestSchema, async (request) => {
  if (request.params.name === "run_command") {
    const { server, command } = request.params.arguments;
    // Simple Allowlist Check
    const forbidden = ["rm ", "format", "mkfs"];
    if (forbidden.some(word => command.includes(word))) {
      return { content: [{ type: "text", text: "Error: Safety block triggered for command." }], isError: true };
    }
    const response = await axios.post(TRINITY_URL, { server, command }, {
      headers: { Authorization: `Bearer ${AUTH_TOKEN}` }
    });
    return { content: [{ type: "text", text: response.data.stdout || response.data.error }] };
  }
  // Report unknown tools instead of returning undefined
  return { content: [{ type: "text", text: `Unknown tool: ${request.params.name}` }], isError: true };
});

const transport = new StdioServerTransport();
await server.connect(transport);
```
### Tiered Security ("Trinity Shield")
| Tier | Commands | Handling |
|------|----------|----------|
| **Safe** | `uptime`, `df -h`, `docker ps`, `hostname` | Immediate execution |
| **Operational** | `git pull`, `systemctl restart` | Log + Execute |
| **Dangerous** | `rm`, `fdisk`, `reboot`, `mkfs` | Block or require `--force-wizard` flag |
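The tier table above could map onto a small classifier in the wrapper. A sketch — the tier names follow the table, but the exact patterns are placeholder starting points, not a final policy:

```javascript
// Sketch of the Trinity Shield tiers from the table above.
// Pattern lists are illustrative; dangerous patterns are checked first,
// and anything unrecognized falls through to "dangerous" by default.
const TIERS = [
  { name: "dangerous", patterns: [/^rm\b/, /^fdisk\b/, /^reboot\b/, /^mkfs\b/] },
  { name: "operational", patterns: [/^git pull\b/, /^systemctl restart\b/] },
  { name: "safe", patterns: [/^uptime\b/, /^df\b/, /^docker ps\b/, /^hostname\b/] }
];

function classifyCommand(command) {
  const cmd = command.trim();
  for (const tier of TIERS) {
    if (tier.patterns.some(p => p.test(cmd))) return tier.name;
  }
  return "dangerous"; // default-deny for anything we don't recognize
}
```

Defaulting unknown commands to the dangerous tier keeps the failure mode conservative: a new command has to be explicitly promoted to safe or operational.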
### Strategy Summary
1. **Keep Trinity Core as-is** — REST API on Pi works great
2. **Desktop Bridge** — Run wrapper locally on Michael's machine
3. **Future Web Transition** — If Anthropic opens remote MCP, update Pi to support SSE, delete local bridge
---
## Final Conclusion
The path is clear:
- **Now:** Claude Desktop + local MCP wrapper → Trinity Core REST API
- **Future:** Direct Web UI integration when Anthropic supports remote MCP (SSE)
Gemini's approach hits the right balance — Michael gets autonomy, safety rails exist for dangerous commands, and the architecture is ready to evolve.


@@ -0,0 +1,93 @@
# Gemini Consultation: Social Platform API Integration Strategy
**Date:** April 10, 2026
**From:** Michael (The Wizard) + Claude (Chronicler #76)
**To:** Gemini (Architectural Partner)
**Re:** Best approach to import analytics data from TikTok, Facebook, Instagram, X, and Bluesky into Trinity Console
---
## Hey Gemini! 👋
We just deployed the Social Analytics module to Trinity Console (your architecture review earlier today was spot-on — thanks!). Manual entry is working great, but now we want to automate data ingestion from all 5 platforms.
We need your guidance on the best path forward for each platform. Michael is 5 days from soft launch, so we need to prioritize smartly.
---
## Context
**What We Built:**
- Social Analytics module in Trinity Console (`/admin/social`)
- Database tables: `social_posts` and `social_account_snapshots`
- Supports 5 platforms: TikTok, Facebook, Instagram, X, Bluesky
- Manual entry working now
- `platform_post_id` column ready for API matching (per your earlier recommendation)
**Current Stack:**
- Arbiter (Node.js/Express) on Command Center
- PostgreSQL database
- Already have OAuth flows for Discord and Stripe
- n8n available on TX1 for workflow automation
**The Goal:**
Automatically pull post analytics (views, likes, comments, shares, watch time, etc.) into our unified `social_posts` table — ideally daily via cron or webhook.
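Whatever fetch path we pick, the daily sync should be idempotent: re-running it updates metrics rather than inserting duplicates. A sketch of the upsert, assuming a unique index on `(platform, platform_post_id)` — the metric column names here are illustrative, not our actual schema:

```sql
-- Sketch: idempotent daily metrics upsert into social_posts.
-- Assumes UNIQUE (platform, platform_post_id); metric columns are illustrative.
INSERT INTO social_posts (platform, platform_post_id, views, likes, comments, shares, synced_at)
VALUES ($1, $2, $3, $4, $5, $6, NOW())
ON CONFLICT (platform, platform_post_id)
DO UPDATE SET
  views = EXCLUDED.views,
  likes = EXCLUDED.likes,
  comments = EXCLUDED.comments,
  shares = EXCLUDED.shares,
  synced_at = NOW();
```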
---
## The Platforms
### 1. TikTok
- **Current state:** Best performing platform (1.9K views, 98% from For You page)
- **What we know:** Business API requires verified business + possible ad spend history. Analytics API needs OAuth 2.0, developer approval takes days/weeks.
- **Question:** Is there a faster path? Third-party services? Scraping? Or is the official API the only real option?
### 2. Facebook
- **Current state:** Posting Reels, decent performance
- **What we know:** Meta Graph API covers both FB and IG. Requires business verification.
- **Question:** What's the approval timeline? Any gotchas for a gaming community vs. e-commerce?
### 3. Instagram
- **Current state:** Cross-posting from Facebook
- **What we know:** Same Meta Graph API as Facebook
- **Question:** Can we get both FB and IG with one integration, or do they require separate auth flows?
### 4. X (Twitter)
- **Current state:** Active, part of FOMO campaign
- **What we know:** API is now paid ($100/mo basic tier). Free tier is severely limited.
- **Question:** Is the $100/mo worth it for our scale? Any free alternatives that provide analytics?
### 5. Bluesky
- **Current state:** Growing platform, we're early
- **What we know:** AT Protocol is open and free. Most developer-friendly of the bunch.
- **Question:** Is there an official analytics endpoint, or do we need to calculate engagement from post data?
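As far as we know there's no dedicated Bluesky analytics endpoint, so engagement would be aggregated from post records. A minimal sketch based on the item shape returned by `app.bsky.feed.getAuthorFeed` — treat the endpoint and field names as assumptions to verify:

```javascript
// Sketch: aggregate engagement from Bluesky feed items.
// Field names (likeCount, repostCount, replyCount) follow the
// app.bsky.feed.getAuthorFeed response as we understand it — verify first.
function summarizeEngagement(feedItems) {
  return feedItems.reduce(
    (totals, item) => {
      const post = item.post || {};
      totals.likes += post.likeCount || 0;
      totals.reposts += post.repostCount || 0;
      totals.replies += post.replyCount || 0;
      return totals;
    },
    { likes: 0, reposts: 0, replies: 0 }
  );
}

// Hypothetical fetch against the public endpoint (no auth, as far as we know):
// const res = await fetch("https://public.api.bsky.app/xrpc/app.bsky.feed.getAuthorFeed?actor=<handle>");
// const { feed } = await res.json();
// console.log(summarizeEngagement(feed));
```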
---
## Specific Questions
1. **Priority order:** Given limited time before soft launch, which platform should we tackle first for API integration? Which gives us the best ROI on development time?
2. **n8n vs. direct integration:** Should we build API integrations directly into Arbiter, or use n8n workflows to fetch data and push to our database? Pros/cons?
3. **Third-party aggregators:** Are there services (like Buffer, Hootsuite, or specialized analytics APIs) that aggregate multiple platforms into one API? Would that be faster than integrating each platform separately?
4. **Rate limits and quotas:** What should we watch out for with daily polling? Any platforms where webhooks are better than polling?
5. **What are we missing?** Any platforms or approaches we haven't considered that would help a gaming community specifically?
---
## Additional Context
- Soft launch is April 15 (5 days)
- Main goal is visibility, not real-time — daily sync is fine
- Meg (Community Manager) is currently doing manual entry
- We'd rather do it right than do it fast and break it
- Michael has 57 years of pattern recognition but limited patience for platform bureaucracy
---
Thanks Gemini! 🔥❄️
— Michael + Claude (Chronicler #76)


@@ -0,0 +1,42 @@
---
task_number: 109
title: Trinity Core MCP Logging in Trinity Console
status: Planned
priority: P2-Medium
is_blocker: false
owner: Michael
tags:
- trinity-core
- trinity-console
- logging
- infrastructure
estimated_hours: 4
---
# Trinity Core MCP Logging
Add MCP command logging to Trinity Console. All commands executed through Trinity Core will be logged to PostgreSQL and viewable at `/admin/mcp-logs` with filtering by server, status, and date.
## Why
- Centralized audit trail for all remote commands
- No Discord noise — silent logging
- Searchable/filterable history
- Consistent with Trinity Console architecture
## Components
1. `mcp_logs` PostgreSQL table
2. Arbiter `/api/internal/mcp/log` endpoint
3. Trinity Core updated to POST logs to Arbiter
4. Trinity Console `/admin/mcp-logs` page with filters
## Full Spec
See `docs/tasks/task-109-mcp-logging/README.md` for complete implementation details including SQL schema, API code, and UI template.
## Prerequisites
- Trinity Core deployed ✅
- Arbiter running ✅
- INTERNAL_API_TOKEN configured ✅


@@ -0,0 +1,120 @@
---
task_number: 110
title: Uptime Kuma Monitor Cleanup and Update
status: Planned
priority: P2-Medium
is_blocker: false
owner: Michael
tags:
- infrastructure
- monitoring
- uptime-kuma
estimated_hours: 2
---
# Uptime Kuma Monitor Cleanup
Audit and update Uptime Kuma monitors at status.firefrostgaming.com to reflect current infrastructure.
## Current State (April 11, 2026)
**Location:** Command Center (63.143.34.217)
**Service:** systemd `uptime-kuma.service`
**Data:** `/opt/uptime-kuma/data/kuma.db`
**URL:** https://status.firefrostgaming.com
### Existing Monitors (18 total)
| ID | Name | Type | Status |
|----|------|------|--------|
| 1 | TX1 | ping | ✅ Keep |
| 3 | Gitea - Git Repository | http | ✅ Keep |
| 4 | Pterodactyl Panel | ping | ✅ Keep |
| 5 | Command Center | ping | ✅ Keep |
| 6 | Paymenter Billing | ping | ❌ **DELETE** - Decommissioned |
| 7 | Ghost CMS | ping | ❌ **DELETE** - Decommissioned |
| 9 | NC1 | ping | ✅ Keep |
| 10 | Stoneblock 4 - TX | port | ✅ Keep |
| 11 | FoundryVTT - TX1 | port | ✅ Keep |
| 12 | Reclamation - TX1 | port | ⚠️ **VERIFY** - Is this still running? |
| 13 | Society: Sunlit Valley - TX1 | port | ✅ Keep |
| 14 | Vanilla 1.21.11 - TX1 | port | ⚠️ **VERIFY** - Current version? |
| 15 | All The Mons - TX1 | port | ✅ Keep |
| 16 | The Ember Project - NC1 | port | ⚠️ **VERIFY** - Is this still running? |
| 17 | Minecolonies - NC1 | port | ⚠️ **VERIFY** - Is this still running? |
| 18 | All The Mods 10 - NC1 | port | ✅ Keep |
### Monitors to DELETE
| Name | Reason |
|------|--------|
| Paymenter Billing | Decommissioned April 2026, replaced by Stripe direct |
| Ghost CMS | Decommissioned April 2026, replaced by 11ty |
### Monitors to ADD
| Name | Type | Target | Notes |
|------|------|--------|-------|
| Trinity Core MCP | http | https://mcp.firefrostgaming.com | New Pi gateway |
| Trinity Console | http | https://discord-bot.firefrostgaming.com/admin | Admin dashboard |
| Website | http | https://firefrostgaming.com | Main website (11ty) |
| Arbiter API | http | https://discord-bot.firefrostgaming.com/health | Backend health |
| Wiki VPS | ping | 64.50.188.14 | Wiki.js server |
| Services VPS | ping | 38.68.14.188 | Mailcow server |
| Dev Panel | ping | 64.50.188.128 | Development server |
| Vaultwarden | http | https://vault.firefrostgaming.com | Password manager |
| Mythcraft 5 | port | TBD | Game server (verify port) |
### Monitors to VERIFY
Check if these game servers still exist in Pterodactyl:
- Reclamation - TX1
- Vanilla 1.21.11 - TX1 (version current?)
- The Ember Project - NC1
- Minecolonies - NC1
---
## Implementation Steps
### Via Uptime Kuma Web UI (Easiest)
1. Log into https://status.firefrostgaming.com
2. Delete obsolete monitors (Paymenter, Ghost CMS)
3. Add new monitors listed above
4. Verify game server monitors are accurate
5. Test all monitors show green
### Via SQLite (If needed)
Delete monitors:
```sql
DELETE FROM monitor WHERE id IN (6, 7);
```
Note: Uptime Kuma may need restart after direct DB changes.
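Before running the delete, it's worth confirming that the IDs in the live database still match the table above (monitor names can drift). A quick check against the same `monitor` table:

```sql
-- Run against /opt/uptime-kuma/data/kuma.db to confirm IDs before deleting
SELECT id, name FROM monitor ORDER BY id;
```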
---
## Access
- **Web UI:** https://status.firefrostgaming.com
- **Credentials:** In Vaultwarden under "Uptime Kuma"
- **Service:** `sudo systemctl restart uptime-kuma`
---
## Post-Update Checklist
- [ ] Paymenter monitor deleted
- [ ] Ghost CMS monitor deleted
- [ ] Trinity Core MCP monitor added and green
- [ ] Trinity Console monitor added and green
- [ ] Website monitor added and green
- [ ] All infrastructure servers monitored
- [ ] Game server list matches Pterodactyl
- [ ] Status page looks clean and accurate
---
**Fire + Frost + Foundation = Where Love Builds Legacy** 💙🔥❄️


@@ -0,0 +1,244 @@
---
task_number: 111
title: Claude Desktop MCP Integration for Trinity Core
status: Planned
priority: P3-Low
is_blocker: false
owner: Michael
tags:
- trinity-core
- mcp
- claude-desktop
- infrastructure
estimated_hours: 2
---
# Claude Desktop MCP Integration
Connect Claude Desktop to Trinity Core via MCP protocol, enabling Claude to execute commands on Firefrost servers without manual curl commands.
## Why
Currently, Claude (web) can't call Trinity Core directly — Michael must run curl commands. With Claude Desktop + MCP wrapper, Claude gains native tool access to the server fleet.
## Prerequisites
- Trinity Core deployed ✅
- Claude Desktop installed on Michael's machine
- Node.js on Michael's machine
## Architecture
```
Claude Desktop
↓ MCP (stdio)
Local Wrapper Script (trinity-wrapper.js)
↓ HTTPS + Bearer Token
Trinity Core (mcp.firefrostgaming.com)
↓ SSH
Target Server
```
## Implementation
### 1. Install Dependencies (Michael's Machine)
```bash
mkdir ~/trinity-mcp && cd ~/trinity-mcp
npm init -y
npm install @modelcontextprotocol/sdk axios
```
### 2. Create Wrapper Script
Create `~/trinity-mcp/trinity-wrapper.js`:
```javascript
import { Server } from "@modelcontextprotocol/sdk/server/index.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { CallToolRequestSchema, ListToolsRequestSchema } from "@modelcontextprotocol/sdk/types.js";
import axios from "axios";

const server = new Server(
  { name: "trinity-bridge", version: "1.0.0" },
  { capabilities: { tools: {} } }
);

const TRINITY_URL = "https://mcp.firefrostgaming.com/exec";
const AUTH_TOKEN = process.env.TRINITY_TOKEN;

// List available tools
server.setRequestHandler(ListToolsRequestSchema, async () => ({
  tools: [
    {
      name: "run_command",
      description: "Execute a shell command on a Firefrost Gaming server. Available servers: command-center, tx1-dallas, nc1-charlotte, panel-vps, dev-panel, wiki-vps, services-vps",
      inputSchema: {
        type: "object",
        properties: {
          server: {
            type: "string",
            description: "Server name (e.g., command-center, tx1-dallas)",
            enum: ["command-center", "tx1-dallas", "nc1-charlotte", "panel-vps", "dev-panel", "wiki-vps", "services-vps"]
          },
          command: {
            type: "string",
            description: "The Linux command to execute"
          }
        },
        required: ["server", "command"]
      }
    },
    {
      name: "list_servers",
      description: "List all available Firefrost servers",
      inputSchema: {
        type: "object",
        properties: {}
      }
    }
  ]
}));

// Handle tool calls
server.setRequestHandler(CallToolRequestSchema, async (request) => {
  const { name, arguments: args } = request.params;
  if (name === "list_servers") {
    return {
      content: [{
        type: "text",
        text: JSON.stringify({
          servers: [
            { name: "command-center", host: "63.143.34.217", description: "Gitea, Arbiter, Uptime Kuma, Vaultwarden" },
            { name: "tx1-dallas", host: "38.68.14.26", description: "Game servers, Wings, FoundryVTT, n8n" },
            { name: "nc1-charlotte", host: "216.239.104.130", description: "Game servers, Wings" },
            { name: "panel-vps", host: "45.94.168.138", description: "Pterodactyl Panel" },
            { name: "dev-panel", host: "64.50.188.128", description: "Development Pterodactyl + Blueprint" },
            { name: "wiki-vps", host: "64.50.188.14", description: "Wiki.js instances, MkDocs" },
            { name: "services-vps", host: "38.68.14.188", description: "Mailcow" }
          ]
        }, null, 2)
      }]
    };
  }
  if (name === "run_command") {
    const { server, command } = args;
    // Trinity Shield - Safety blocks
    const forbidden = ["rm -rf /", "mkfs", "fdisk", "dd if=", "> /dev/sd"];
    if (forbidden.some(pattern => command.includes(pattern))) {
      return {
        content: [{ type: "text", text: "⛔ BLOCKED: Destructive command detected. Execute manually via SSH if needed." }],
        isError: true
      };
    }
    try {
      const response = await axios.post(TRINITY_URL, { server, command }, {
        headers: {
          Authorization: `Bearer ${AUTH_TOKEN}`,
          "Content-Type": "application/json"
        },
        timeout: 35000
      });
      const data = response.data;
      if (data.success) {
        return {
          content: [{
            type: "text",
            text: `✅ [${server}] Command succeeded\n\n${data.stdout || "(no output)"}`
          }]
        };
      } else {
        return {
          content: [{
            type: "text",
            text: `❌ [${server}] Command failed\n\nSTDERR: ${data.stderr || "(none)"}\nError: ${data.error || "Unknown"}`
          }],
          isError: true
        };
      }
    } catch (err) {
      return {
        content: [{ type: "text", text: `❌ Connection error: ${err.message}` }],
        isError: true
      };
    }
  }
  return {
    content: [{ type: "text", text: `Unknown tool: ${name}` }],
    isError: true
  };
});

// Start server
const transport = new StdioServerTransport();
await server.connect(transport);
```
### 3. Add to package.json
```json
{
  "name": "trinity-mcp",
  "version": "1.0.0",
  "type": "module",
  "main": "trinity-wrapper.js"
}
```
### 4. Configure Claude Desktop
**macOS:** `~/Library/Application Support/Claude/claude_desktop_config.json`
**Windows:** `%APPDATA%\Claude\claude_desktop_config.json`
```json
{
  "mcpServers": {
    "trinity": {
      "command": "node",
      "args": ["/path/to/trinity-mcp/trinity-wrapper.js"],
      "env": {
        "TRINITY_TOKEN": "FFG-Trinity-2026-Core-Access"
      }
    }
  }
}
```
### 5. Restart Claude Desktop
After restarting, a hammer icon (🔨) should appear, indicating MCP tools are available.
## Testing
In Claude Desktop, try:
- "List my Firefrost servers"
- "Check uptime on command-center"
- "What's the disk usage on tx1-dallas?"
## Trinity Shield (Safety Tiers)
| Tier | Commands | Behavior |
|------|----------|----------|
| Safe | `hostname`, `uptime`, `df`, `free`, `docker ps` | Execute immediately |
| Operational | `git pull`, `systemctl restart <service>` | Execute + Log |
| Blocked | `rm -rf /`, `mkfs`, `fdisk`, destructive patterns | Return error, require manual SSH |
## Future: Claude.ai Web Integration
When Anthropic enables "Remote MCP" for web:
1. Update Trinity Core to support SSE (Server-Sent Events)
2. Register URL directly in Claude.ai settings
3. Delete local wrapper script
See `docs/consultations/gemini-mcp-connector-2026-04-11.md` for full Gemini consultation.
---
**Fire + Frost + Foundation = Where Love Builds Legacy** 💙🔥❄️


@@ -0,0 +1,544 @@
# Task #109: Trinity Core MCP Logging in Trinity Console
**Status:** Planned
**Priority:** P2-Medium
**Owner:** Michael
**Created:** April 11, 2026 by Chronicler #76
**Estimated Hours:** 3-4
---
## Overview
Add MCP command logging to Trinity Console instead of Discord notifications. All commands executed through Trinity Core will be logged to Arbiter's PostgreSQL database and viewable in a new Trinity Console page with filtering.
---
## Architecture
```
Trinity Core (Pi)
↓ POST /api/internal/mcp/log
Arbiter (Command Center)
↓ INSERT
PostgreSQL (mcp_logs table)
↓ SELECT
Trinity Console /admin/mcp-logs
```
---
## Implementation
### 1. Database Schema
Run on Command Center:
```sql
CREATE TABLE mcp_logs (
  id SERIAL PRIMARY KEY,
  server VARCHAR(50) NOT NULL,
  command TEXT NOT NULL,
  success BOOLEAN NOT NULL,
  stdout TEXT,
  stderr TEXT,
  error TEXT,
  executed_at TIMESTAMP WITH TIME ZONE DEFAULT NOW(),
  execution_time_ms INTEGER
);
CREATE INDEX idx_mcp_logs_server ON mcp_logs(server);
CREATE INDEX idx_mcp_logs_executed_at ON mcp_logs(executed_at DESC);
CREATE INDEX idx_mcp_logs_success ON mcp_logs(success);
```
### 2. Arbiter Internal API Endpoint
Add to `/home/claude/firefrost-services/services/arbiter-3.0/src/routes/api.js`:
```javascript
// MCP Log endpoint
router.post('/internal/mcp/log', authenticateInternal, async (req, res) => {
  try {
    const { server, command, success, stdout, stderr, error, execution_time_ms } = req.body;
    if (!server || !command || success === undefined) {
      return res.status(400).json({ error: 'Missing required fields' });
    }
    const result = await pool.query(
      `INSERT INTO mcp_logs (server, command, success, stdout, stderr, error, execution_time_ms)
       VALUES ($1, $2, $3, $4, $5, $6, $7)
       RETURNING id`,
      [server, command, success, stdout || '', stderr || '', error || null, execution_time_ms || null]
    );
    res.json({ success: true, id: result.rows[0].id });
  } catch (err) {
    console.error('MCP log error:', err);
    res.status(500).json({ error: 'Failed to log command' });
  }
});

// MCP Logs list endpoint (for Trinity Console)
router.get('/internal/mcp/logs', authenticateInternal, async (req, res) => {
  try {
    const { server, success, limit = 100, offset = 0 } = req.query;
    let query = 'SELECT * FROM mcp_logs WHERE 1=1';
    const params = [];
    let paramCount = 0;
    if (server) {
      paramCount++;
      query += ` AND server = $${paramCount}`;
      params.push(server);
    }
    if (success !== undefined) {
      paramCount++;
      query += ` AND success = $${paramCount}`;
      params.push(success === 'true');
    }
    query += ' ORDER BY executed_at DESC';
    paramCount++;
    query += ` LIMIT $${paramCount}`;
    params.push(parseInt(limit));
    paramCount++;
    query += ` OFFSET $${paramCount}`;
    params.push(parseInt(offset));
    const result = await pool.query(query, params);

    // Get total count for pagination
    let countQuery = 'SELECT COUNT(*) FROM mcp_logs WHERE 1=1';
    const countParams = [];
    let countParamNum = 0;
    if (server) {
      countParamNum++;
      countQuery += ` AND server = $${countParamNum}`;
      countParams.push(server);
    }
    if (success !== undefined) {
      countParamNum++;
      countQuery += ` AND success = $${countParamNum}`;
      countParams.push(success === 'true');
    }
    const countResult = await pool.query(countQuery, countParams);

    res.json({
      logs: result.rows,
      total: parseInt(countResult.rows[0].count),
      limit: parseInt(limit),
      offset: parseInt(offset)
    });
  } catch (err) {
    console.error('MCP logs fetch error:', err);
    res.status(500).json({ error: 'Failed to fetch logs' });
  }
});
```
### 3. Update Trinity Core MCP Server
Replace `/home/claude_executor/mcp-server/index.js` on the Pi:
```javascript
const express = require('express');
const { exec } = require('child_process');
const fs = require('fs');

const app = express();
app.use(express.json());

const API_TOKEN = 'FFG-Trinity-2026-Core-Access';
const LOG_FILE = '/home/claude_executor/mcp-server/command.log';
const ARBITER_URL = 'https://discord-bot.firefrostgaming.com/api/internal/mcp/log';
const ARBITER_TOKEN = '6fYF1akCRW6pM2F8n3S3RxeIod4YgRniUJNEQurvBP4=';

const SERVERS = {
  'command-center': { host: '63.143.34.217', user: 'root' },
  'tx1-dallas': { host: '38.68.14.26', user: 'root' },
  'nc1-charlotte': { host: '216.239.104.130', user: 'root' },
  'panel-vps': { host: '45.94.168.138', user: 'root' },
  'dev-panel': { host: '64.50.188.128', user: 'root' },
  'wiki-vps': { host: '64.50.188.14', user: 'architect' },
  'services-vps': { host: '38.68.14.188', user: 'root' }
};

function log(msg) {
  const line = `[${new Date().toISOString()}] ${msg}\n`;
  fs.appendFileSync(LOG_FILE, line);
  console.log(line.trim());
}

// Ship log entries to Arbiter (uses the global fetch available in Node 18+)
async function logToArbiter(data) {
  try {
    await fetch(ARBITER_URL, {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        'X-Internal-Token': ARBITER_TOKEN
      },
      body: JSON.stringify(data)
    });
  } catch (err) {
    log(`Failed to log to Arbiter: ${err.message}`);
  }
}

function auth(req, res, next) {
  const token = req.headers.authorization?.replace('Bearer ', '');
  if (token !== API_TOKEN) {
    log(`AUTH FAILED from ${req.ip}`);
    return res.status(401).json({ error: 'Unauthorized' });
  }
  next();
}

app.get('/', (req, res) => {
  res.json({ status: 'Trinity Core Online', timestamp: new Date().toISOString() });
});

app.get('/servers', auth, (req, res) => {
  res.json({ servers: Object.entries(SERVERS).map(([name, info]) => ({ name, ...info })) });
});

app.post('/exec', auth, (req, res) => {
  const { command, server } = req.body;
  if (!command || !server) {
    return res.status(400).json({ error: 'Missing command or server' });
  }
  const target = SERVERS[server];
  if (!target) {
    return res.status(400).json({ error: `Unknown server: ${server}` });
  }
  log(`EXEC [${server}] ${command}`);
  const sshCmd = `ssh -o ConnectTimeout=10 ${target.user}@${target.host} "${command.replace(/"/g, '\\"')}"`;
  const startTime = Date.now();
  exec(sshCmd, { timeout: 30000 }, async (error, stdout, stderr) => {
    const success = !error;
    const executionTime = Date.now() - startTime;
    log(`RESULT [${server}] success=${success} time=${executionTime}ms`);
    // Log to Arbiter (async, don't block response)
    logToArbiter({
      server,
      command,
      success,
      stdout: stdout.trim(),
      stderr: stderr.trim(),
      error: error ? error.message : null,
      execution_time_ms: executionTime
    });
    res.json({
      server,
      command,
      success,
      stdout: stdout.trim(),
      stderr: stderr.trim(),
      error: error ? error.message : null
    });
  });
});

const PORT = 3000;
app.listen(PORT, () => {
  log('Trinity Core MCP Server started');
});
```
### 4. Trinity Console UI
Create `/home/claude/firefrost-services/services/arbiter-3.0/src/views/admin/mcp-logs.ejs`:
```html
<%- include('../partials/header', { title: 'MCP Logs' }) %>
<div class="container mt-4">
<div class="d-flex justify-content-between align-items-center mb-4">
<h1>🖥️ MCP Command Logs</h1>
<a href="/admin" class="btn btn-outline-secondary">← Back to Dashboard</a>
</div>
<!-- Filters -->
<div class="card mb-4">
<div class="card-body">
<form method="GET" class="row g-3">
<div class="col-md-3">
<label class="form-label">Server</label>
<select name="server" class="form-select">
<option value="">All Servers</option>
<option value="command-center" <%= query.server === 'command-center' ? 'selected' : '' %>>Command Center</option>
<option value="tx1-dallas" <%= query.server === 'tx1-dallas' ? 'selected' : '' %>>TX1 Dallas</option>
<option value="nc1-charlotte" <%= query.server === 'nc1-charlotte' ? 'selected' : '' %>>NC1 Charlotte</option>
<option value="panel-vps" <%= query.server === 'panel-vps' ? 'selected' : '' %>>Panel VPS</option>
<option value="dev-panel" <%= query.server === 'dev-panel' ? 'selected' : '' %>>Dev Panel</option>
<option value="wiki-vps" <%= query.server === 'wiki-vps' ? 'selected' : '' %>>Wiki VPS</option>
<option value="services-vps" <%= query.server === 'services-vps' ? 'selected' : '' %>>Services VPS</option>
</select>
</div>
<div class="col-md-3">
<label class="form-label">Status</label>
<select name="success" class="form-select">
<option value="">All</option>
<option value="true" <%= query.success === 'true' ? 'selected' : '' %>>Success</option>
<option value="false" <%= query.success === 'false' ? 'selected' : '' %>>Failed</option>
</select>
</div>
<div class="col-md-3 d-flex align-items-end">
<button type="submit" class="btn btn-primary">Filter</button>
<a href="/admin/mcp-logs" class="btn btn-outline-secondary ms-2">Reset</a>
</div>
</form>
</div>
</div>
<!-- Stats -->
<div class="row mb-4">
<div class="col-md-3">
<div class="card bg-light">
<div class="card-body text-center">
<div class="text-muted small">Total Commands</div>
<div class="fs-3 fw-bold"><%= total %></div>
</div>
</div>
</div>
<div class="col-md-3">
<div class="card bg-success bg-opacity-10">
<div class="card-body text-center">
<div class="text-muted small">Successful</div>
<div class="fs-3 fw-bold text-success"><%= successCount %></div>
</div>
</div>
</div>
<div class="col-md-3">
<div class="card bg-danger bg-opacity-10">
<div class="card-body text-center">
<div class="text-muted small">Failed</div>
<div class="fs-3 fw-bold text-danger"><%= failCount %></div>
</div>
</div>
</div>
<div class="col-md-3">
<div class="card bg-info bg-opacity-10">
<div class="card-body text-center">
<div class="text-muted small">Avg Execution</div>
<div class="fs-3 fw-bold text-info"><%= avgTime %>ms</div>
</div>
</div>
</div>
</div>
<!-- Logs Table -->
<div class="card">
<div class="card-body p-0">
<table class="table table-hover mb-0">
<thead class="table-light">
<tr>
<th>Time</th>
<th>Server</th>
<th>Command</th>
<th>Status</th>
<th>Duration</th>
<th></th>
</tr>
</thead>
<tbody>
<% logs.forEach(log => { %>
<tr>
<td class="text-muted small"><%= new Date(log.executed_at).toLocaleString() %></td>
<td><span class="badge bg-secondary"><%= log.server %></span></td>
<td><code class="small"><%= log.command.substring(0, 50) %><%= log.command.length > 50 ? '...' : '' %></code></td>
<td>
<% if (log.success) { %>
<span class="badge bg-success">✓ Success</span>
<% } else { %>
<span class="badge bg-danger">✗ Failed</span>
<% } %>
</td>
<td class="text-muted small"><%= log.execution_time_ms || '-' %>ms</td>
<td>
<button class="btn btn-sm btn-outline-secondary"
data-bs-toggle="modal"
data-bs-target="#logModal<%= log.id %>">
Details
</button>
</td>
</tr>
<% }) %>
<% if (logs.length === 0) { %>
<tr>
<td colspan="6" class="text-center text-muted py-4">No logs found</td>
</tr>
<% } %>
</tbody>
</table>
</div>
</div>
<!-- Detail modals — rendered after the table, since a <div> is not valid inside <tbody> -->
<% logs.forEach(log => { %>
<div class="modal fade" id="logModal<%= log.id %>" tabindex="-1">
<div class="modal-dialog modal-lg">
<div class="modal-content">
<div class="modal-header">
<h5 class="modal-title">Command Details</h5>
<button type="button" class="btn-close" data-bs-dismiss="modal"></button>
</div>
<div class="modal-body">
<p><strong>Server:</strong> <%= log.server %></p>
<p><strong>Command:</strong></p>
<pre class="bg-dark text-light p-3 rounded"><%= log.command %></pre>
<p><strong>STDOUT:</strong></p>
<pre class="bg-light p-3 rounded" style="max-height: 200px; overflow: auto;"><%= log.stdout || '(empty)' %></pre>
<% if (log.stderr) { %>
<p><strong>STDERR:</strong></p>
<pre class="bg-warning bg-opacity-10 p-3 rounded"><%= log.stderr %></pre>
<% } %>
<% if (log.error) { %>
<p><strong>Error:</strong></p>
<pre class="bg-danger bg-opacity-10 p-3 rounded"><%= log.error %></pre>
<% } %>
</div>
</div>
</div>
</div>
<% }) %>
<!-- Pagination -->
<% if (total > limit) { %>
<nav class="mt-4">
<ul class="pagination justify-content-center">
<% const totalPages = Math.ceil(total / limit); %>
<% const currentPage = Math.floor(offset / limit) + 1; %>
<% for (let i = 1; i <= totalPages && i <= 10; i++) { %>
<li class="page-item <%= currentPage === i ? 'active' : '' %>">
<a class="page-link" href="?<%= new URLSearchParams({...query, offset: (i-1) * limit}).toString() %>"><%= i %></a>
</li>
<% } %>
</ul>
</nav>
<% } %>
</div>
<%- include('../partials/footer') %>
```
### 5. Add Route
Add to `/home/claude/firefrost-services/services/arbiter-3.0/src/routes/admin/index.js`:
```javascript
// MCP Logs page
router.get('/mcp-logs', requireAuth, async (req, res) => {
try {
const { server, success, limit = 50, offset = 0 } = req.query;
// Build a shared WHERE clause so the table, total, and stat cards all honor the filters
let where = 'WHERE 1=1';
const params = [];
if (server) {
params.push(server);
where += ` AND server = $${params.length}`;
}
if (success !== undefined && success !== '') {
params.push(success === 'true');
where += ` AND success = $${params.length}`;
}
// Get logs
const logsResult = await pool.query(
`SELECT * FROM mcp_logs ${where} ORDER BY executed_at DESC LIMIT $${params.length + 1} OFFSET $${params.length + 2}`,
[...params, parseInt(limit, 10), parseInt(offset, 10)]
);
// Get total count
const countResult = await pool.query(`SELECT COUNT(*) FROM mcp_logs ${where}`, params);
const total = parseInt(countResult.rows[0].count, 10);
// Get success/fail counts and average duration (same filters as the table)
const statsResult = await pool.query(`
SELECT
COUNT(*) FILTER (WHERE success = true) as success_count,
COUNT(*) FILTER (WHERE success = false) as fail_count,
AVG(execution_time_ms) as avg_time
FROM mcp_logs ${where}
`, params);
res.render('admin/mcp-logs', {
logs: logsResult.rows,
total,
limit: parseInt(limit),
offset: parseInt(offset),
query: req.query,
successCount: statsResult.rows[0].success_count || 0,
failCount: statsResult.rows[0].fail_count || 0,
avgTime: Math.round(statsResult.rows[0].avg_time || 0)
});
} catch (err) {
console.error('MCP logs error:', err);
res.status(500).send('Error loading MCP logs');
}
});
```
### 6. Add to Dashboard Navigation
Add link in dashboard sidebar/nav to `/admin/mcp-logs`.
---
## Deployment Steps
1. **Database:** Run SQL schema on Command Center
2. **Arbiter:** Add API endpoints, add route, add view, deploy
3. **Trinity Core:** Update index.js on Pi, restart mcp-server
4. **Test:** Execute a command, verify it appears in Trinity Console
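Step 4 can be scripted roughly as follows. `TRINITY_TOKEN` and `INTERNAL_API_TOKEN` are placeholder env var names (set them to the real tokens first), and the `/api/internal/mcp/logs` read endpoint is assumed from the testing checklist:

```shell
# Rough end-to-end check for step 4 — skips the network calls if no token is set.
PAYLOAD='{"server":"command-center","command":"hostname"}'

if [ -n "$TRINITY_TOKEN" ]; then
  # 1) Execute a command through Trinity Core
  curl -sS -X POST https://mcp.firefrostgaming.com/exec \
    -H "Authorization: Bearer $TRINITY_TOKEN" \
    -H "Content-Type: application/json" \
    -d "$PAYLOAD"

  # 2) Confirm the entry reached Arbiter (read endpoint assumed from the checklist)
  curl -sS -H "Authorization: Bearer $INTERNAL_API_TOKEN" \
    "https://discord-bot.firefrostgaming.com/api/internal/mcp/logs?limit=1"
fi
```

If both calls return JSON and the second shows the `hostname` entry, the full pipeline (Trinity Core → Arbiter → Trinity Console) is working.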
---
## Testing Checklist
- [ ] Database table created
- [ ] POST /api/internal/mcp/log works
- [ ] GET /api/internal/mcp/logs works with filters
- [ ] Trinity Core sends logs to Arbiter
- [ ] Trinity Console shows logs
- [ ] Filters work (server, success/fail)
- [ ] Details modal shows full output
- [ ] Pagination works
---
## Dependencies
- Trinity Core must be online and have network access to discord-bot.firefrostgaming.com
- INTERNAL_API_TOKEN must be configured
---
## Notes
- Local file logging on Pi remains as backup
- Arbiter logging is async (doesn't slow down command execution)
- Logs are kept indefinitely (add retention policy later if needed)
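If a retention policy is added later, a single scheduled delete is probably enough — e.g. a nightly cron entry on Command Center (the 90-day window and `DATABASE_URL` are placeholders to adjust):

```shell
# Hypothetical crontab entry: prune MCP logs older than 90 days at 04:00 daily
0 4 * * * psql "$DATABASE_URL" -c "DELETE FROM mcp_logs WHERE executed_at < NOW() - INTERVAL '90 days'"
```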
---
**Fire + Frost + Foundation = Where Love Builds Legacy** 💙🔥❄️