WHAT WAS DONE:

- Task #92: Desktop MCP + Dispatch Architecture
  - Complete Node.js MCP server code (config, ssh_helper, index.js)
  - Express webhook listener for mobile dispatch
  - Cloudflare Tunnel setup instructions
  - 6 tools: server_status, restart_service, docker_compose_restart, git_pull, tail_logs, pterodactyl_power
  - Frostwall security rules documented
  - Claude Desktop configuration
- Task #93: Trinity Codex (Shared Knowledge Base)
  - Dify/Qdrant RAG architecture
  - Three lineages: Wizard's, Emissary's, Catalyst's Chroniclers
  - Gitea -> n8n -> Dify ingestion pipeline
  - MCP connector for Michael (heavy use)
  - Dify Web App for Meg/Holly (light use)
  - Chunking strategy per content type
  - Security and access levels

ARCHITECTURE DECISIONS (via Gemini consultation):

- Claude Web cannot dispatch webhooks - use Discord bot + n8n instead
- Build Codex (Task #93) FIRST - read-only, lower risk
- Separate Discord Ops Bot from Arbiter for security
- Meg/Holly use Dify Web App, not local MCP

STATUS: Ready for implementation next session

Signed-off-by: Claude (Chronicler #60) <claude@firefrostgaming.com>
docs/tasks/task-092-desktop-mcp/README.md (new file, 551 lines)

# Task #92: Desktop MCP + Dispatch Architecture

**Created:** April 5, 2026
**Created By:** Chronicler #60 + Gemini AI
**Status:** READY FOR IMPLEMENTATION
**Priority:** High (Accessibility accommodation)
**Assignee:** Michael

---

## Overview

Build a local MCP server on Michael's always-on Windows laptop that enables Claude Desktop to execute SSH commands on all Firefrost servers. Include a mobile dispatch system via Discord bot + n8n + Cloudflare Tunnel.

## The Problem

Michael has hand/arm limitations from reconstructive surgery. Copy-pasting commands between Claude and terminals is physically taxing. Claude's web sandbox blocks SSH regardless of port number — it detects the protocol, not just the port.

## The Solution

1. **Claude Desktop** on always-on laptop with local MCP server
2. **Local MCP Server** (Node.js) with SSH access to all servers
3. **Cloudflare Tunnel** exposing a webhook listener (no router ports needed)
4. **Discord Ops Bot** for mobile dispatch (separate from Arbiter)
5. **n8n bridge** connecting Discord → Laptop

---

## Architecture

```
MOBILE DISPATCH:
Phone (Discord) → Firefrost Ops Bot → n8n (Command Center) →
Cloudflare Tunnel → Express Listener (Laptop) → SSH (Destination Server)

DESKTOP DIRECT:
Claude Desktop (Laptop) → Local MCP Server → SSH (Destination Server)
```

---

## Infrastructure Reference

| Server | IP | SSH User |
|--------|-----|----------|
| Command Center | 63.143.34.217 | root |
| Panel VPS | 45.94.168.138 | root |
| TX1 Dallas | 38.68.14.26 | root |
| NC1 Charlotte | 216.239.104.130 | root |
| Services VPS | 38.68.14.188 | root |
| Wiki VPS | 64.50.188.14 | architect |

---

## Implementation Steps

### Step 1: SSH Key Setup (Windows)

Open PowerShell and generate a dedicated Ed25519 key:

```powershell
ssh-keygen -t ed25519 -C "laptop-mcp-automation" -f "$HOME\.ssh\mcp_rsa"
```

Then append the public key to `~/.ssh/authorized_keys` on each server:
- root@63.143.34.217 (Command Center)
- root@45.94.168.138 (Panel VPS)
- root@38.68.14.26 (TX1)
- root@216.239.104.130 (NC1)
- root@38.68.14.188 (Services VPS)
- architect@64.50.188.14 (Wiki VPS)

### Step 2: MCP Server Setup

Create the project directory and install dependencies:

```powershell
cd C:\Firefrost
mkdir mcp-server
cd mcp-server
npm init -y
npm install @modelcontextprotocol/sdk ssh2 dotenv express axios
```

### Step 3: Create Config File

**File: `C:\Firefrost\mcp-server\config.js`**

```javascript
// config.js
const os = require('os');
const path = require('path');

module.exports = {
  servers: {
    command_center: { host: '63.143.34.217', user: 'root' },
    panel: { host: '45.94.168.138', user: 'root' },
    tx1: { host: '38.68.14.26', user: 'root' },
    nc1: { host: '216.239.104.130', user: 'root' },
    services: { host: '38.68.14.188', user: 'root' },
    wiki: { host: '64.50.188.14', user: 'architect' }
  },
  sshKeyPath: path.join(os.homedir(), '.ssh', 'mcp_rsa')
};
```

### Step 4: Create SSH Helper

**File: `C:\Firefrost\mcp-server\ssh_helper.js`**

```javascript
// ssh_helper.js
const { Client } = require('ssh2');
const fs = require('fs');
const config = require('./config');

async function executeCommand(serverName, command) {
  return new Promise((resolve, reject) => {
    const target = config.servers[serverName];
    if (!target) return reject(new Error(`Server ${serverName} not found in config.`));

    const conn = new Client();
    conn.on('ready', () => {
      conn.exec(command, (err, stream) => {
        if (err) { conn.end(); return reject(err); }
        let output = '';
        stream.on('close', (code, signal) => {
          conn.end();
          resolve({ code, output });
        }).on('data', (data) => { output += data; })
          .stderr.on('data', (data) => { output += `ERROR: ${data}`; });
      });
    }).on('error', (err) => {
      reject(err);
    }).connect({
      host: target.host,
      port: 22,
      username: target.user,
      privateKey: fs.readFileSync(config.sshKeyPath)
    });
  });
}

module.exports = { executeCommand };
```

### Step 5: Create MCP Server

**File: `C:\Firefrost\mcp-server\index.js`**

```javascript
// index.js
const { Server } = require("@modelcontextprotocol/sdk/server/index.js");
const { StdioServerTransport } = require("@modelcontextprotocol/sdk/server/stdio.js");
const { CallToolRequestSchema, ListToolsRequestSchema } = require("@modelcontextprotocol/sdk/types.js");
const { executeCommand } = require('./ssh_helper');
const axios = require('axios');
require('dotenv').config(); // loads PTERO_API_KEY from the .env file created in Step 7

const server = new Server(
  { name: "firefrost-mcp", version: "1.0.0" },
  { capabilities: { tools: {} } }
);

// Tool Definitions
server.setRequestHandler(ListToolsRequestSchema, async () => ({
  tools: [
    {
      name: "server_status",
      description: "Returns uptime, memory, and disk usage for a server",
      inputSchema: {
        type: "object",
        properties: {
          server: {
            type: "string",
            enum: ["command_center", "panel", "tx1", "nc1", "services", "wiki"]
          }
        },
        required: ["server"]
      }
    },
    {
      name: "restart_service",
      description: "Restarts a systemd service",
      inputSchema: {
        type: "object",
        properties: {
          server: { type: "string" },
          service: { type: "string" }
        },
        required: ["server", "service"]
      }
    },
    {
      name: "docker_compose_restart",
      description: "Restarts a docker compose stack",
      inputSchema: {
        type: "object",
        properties: {
          server: { type: "string" },
          stack: { type: "string" }
        },
        required: ["server", "stack"]
      }
    },
    {
      name: "git_pull",
      description: "Pulls latest changes on a repository",
      inputSchema: {
        type: "object",
        properties: {
          server: { type: "string" },
          repo_path: { type: "string" }
        },
        required: ["server", "repo_path"]
      }
    },
    {
      name: "tail_logs",
      description: "Returns last N lines of a service log",
      inputSchema: {
        type: "object",
        properties: {
          server: { type: "string" },
          service: { type: "string" },
          lines: { type: "number", default: 50 }
        },
        required: ["server", "service"]
      }
    },
    {
      name: "pterodactyl_power",
      description: "Start/stop/restart a game server via Pterodactyl API",
      inputSchema: {
        type: "object",
        properties: {
          server_id: { type: "string" },
          action: {
            type: "string",
            enum: ["start", "stop", "restart", "kill"]
          }
        },
        required: ["server_id", "action"]
      }
    }
  ]
}));
```

**File: `C:\Firefrost\mcp-server\index.js` (continued - execution logic)**

```javascript
// Tool Execution
server.setRequestHandler(CallToolRequestSchema, async (request) => {
  const { name, arguments: args } = request.params;

  try {
    let cmd = '';
    let result;

    if (name === "server_status") {
      cmd = "uptime && free -m && df -h /";
      result = await executeCommand(args.server, cmd);

    } else if (name === "restart_service") {
      if (!/^[a-zA-Z0-9_-]+$/.test(args.service)) {
        throw new Error("Invalid service name format");
      }
      cmd = `systemctl restart ${args.service}`;
      result = await executeCommand(args.server, cmd);

    } else if (name === "docker_compose_restart") {
      if (!/^[a-zA-Z0-9_-]+$/.test(args.stack)) {
        throw new Error("Invalid stack name format");
      }
      cmd = `cd /opt/${args.stack} && docker compose restart`;
      result = await executeCommand(args.server, cmd);

    } else if (name === "git_pull") {
      const safePath = args.repo_path.replace(/[^a-zA-Z0-9_\-\/]/g, '');
      cmd = `cd ${safePath} && git pull origin main`;
      result = await executeCommand(args.server, cmd);

    } else if (name === "tail_logs") {
      const lines = parseInt(args.lines) || 50;
      if (!/^[a-zA-Z0-9_-]+$/.test(args.service)) {
        throw new Error("Invalid service name format");
      }
      cmd = `journalctl -u ${args.service} -n ${lines} --no-pager`;
      result = await executeCommand(args.server, cmd);

    } else if (name === "pterodactyl_power") {
      const pteroUrl = `https://panel.firefrostgaming.com/api/client/servers/${args.server_id}/power`;
      await axios.post(pteroUrl, { signal: args.action }, {
        headers: {
          'Authorization': `Bearer ${process.env.PTERO_API_KEY}`,
          'Content-Type': 'application/json',
          'Accept': 'application/json'
        }
      });
      return {
        content: [{ type: "text", text: `Power command '${args.action}' sent to server ${args.server_id}` }]
      };

    } else {
      throw new Error("Unknown tool");
    }

    return {
      content: [{ type: "text", text: `Exit Code: ${result.code}\nOutput:\n${result.output}` }]
    };
  } catch (error) {
    return {
      content: [{ type: "text", text: `Error: ${error.toString()}` }],
      isError: true
    };
  }
});

const transport = new StdioServerTransport();
server.connect(transport).catch(console.error);
```

### Step 6: Create Webhook Listener

**File: `C:\Firefrost\mcp-server\webhook_listener.js`**

```javascript
// webhook_listener.js
require('dotenv').config(); // loads WEBHOOK_SECRET and PORT from the .env file created in Step 7
const express = require('express');
const { executeCommand } = require('./ssh_helper');
const app = express();
app.use(express.json());

const WEBHOOK_SECRET = process.env.WEBHOOK_SECRET || 'CHANGE_ME_IN_PRODUCTION';

// Frostwall: Bearer Token validation
app.use((req, res, next) => {
  const authHeader = req.headers.authorization;
  if (!authHeader || authHeader !== `Bearer ${WEBHOOK_SECRET}`) {
    console.log('Frostwall blocked unauthorized request');
    return res.status(403).json({ error: 'Frostwall blocked: Unauthorized' });
  }
  next();
});

// Action allowlist
const ALLOWED_ACTIONS = [
  'server_status',
  'restart_service',
  'docker_compose_restart',
  'git_pull',
  'tail_logs'
];

app.post('/dispatch', async (req, res) => {
  const { action, server, target, extra } = req.body;

  // Frostwall: Action allowlist
  if (!ALLOWED_ACTIONS.includes(action)) {
    return res.status(400).json({ error: 'Action not in allowlist' });
  }

  try {
    let cmd = '';

    if (action === 'server_status') {
      cmd = "uptime && free -m && df -h /";

    } else if (action === 'restart_service') {
      if (!/^[a-zA-Z0-9_-]+$/.test(target)) {
        throw new Error("Invalid service name format");
      }
      cmd = `systemctl restart ${target}`;

    } else if (action === 'docker_compose_restart') {
      if (!/^[a-zA-Z0-9_-]+$/.test(target)) {
        throw new Error("Invalid stack name format");
      }
      cmd = `cd /opt/${target} && docker compose restart`;

    } else if (action === 'git_pull') {
      const safePath = target.replace(/[^a-zA-Z0-9_\-\/]/g, '');
      cmd = `cd ${safePath} && git pull origin main`;

    } else if (action === 'tail_logs') {
      const lines = parseInt(extra?.lines) || 50;
      if (!/^[a-zA-Z0-9_-]+$/.test(target)) {
        throw new Error("Invalid service name format");
      }
      cmd = `journalctl -u ${target} -n ${lines} --no-pager`;
    }

    const result = await executeCommand(server, cmd);
    res.json({ success: true, code: result.code, output: result.output });

  } catch (error) {
    console.error('Dispatch error:', error);
    res.status(500).json({ success: false, error: error.message });
  }
});

// Health check endpoint
app.get('/health', (req, res) => {
  res.json({ status: 'ok', timestamp: new Date().toISOString() });
});

const PORT = process.env.PORT || 3000;
app.listen(PORT, () => {
  console.log(`Firefrost Ops Listener running on port ${PORT}`);
});
```

### Step 7: Environment Variables

**File: `C:\Firefrost\mcp-server\.env`**

```env
WEBHOOK_SECRET=your-secure-32-char-token-from-vaultwarden
PTERO_API_KEY=your-pterodactyl-client-api-key
DIFY_API_KEY=your-dify-api-key
```

### Step 8: Claude Desktop Configuration

**File: `%APPDATA%\Claude\claude_desktop_config.json`**

```json
{
  "mcpServers": {
    "firefrost-admin": {
      "command": "node",
      "args": [
        "C:\\Firefrost\\mcp-server\\index.js"
      ]
    }
  }
}
```

### Step 9: Cloudflare Tunnel Setup

1. Download `cloudflared-windows-amd64.msi` from Cloudflare
2. Install and authenticate via browser
3. In Cloudflare Zero Trust dashboard, create tunnel for `ops.firefrostgaming.com`
4. Run the provided command to install as Windows Service
5. Tunnel auto-starts on boot

### Step 10: Windows Power Settings

1. Settings → System → Power & battery
2. Set "Screen and sleep" to **Never** when plugged in
3. Optional: Install Microsoft PowerToys → Enable **Awake** tool

### Step 11: Discord Ops Bot (Separate from Arbiter)

Create a new Discord bot for server operations:
- Restrict to hidden `#ops-dispatch` channel
- Trinity-only permissions
- Commands trigger n8n webhooks
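
Whatever framework the bot uses, its command handler ultimately just has to turn a slash command into the JSON payload the webhook listener's `/dispatch` route expects. A minimal sketch of that mapping, assuming hypothetical option names (`action`, `server`, `target`, `lines`) that would need to match however the Ops Bot's commands are actually registered:

```javascript
// Sketch: map Discord slash-command options to the /dispatch payload.
// Option names here are illustrative, not a committed command schema.
const OPS_ALLOWED_ACTIONS = [
  'server_status', 'restart_service', 'docker_compose_restart', 'git_pull', 'tail_logs'
];

function buildDispatchPayload(options) {
  // options: plain object pulled from the interaction, e.g.
  // { action: 'restart_service', server: 'tx1', target: 'nginx' }
  if (!OPS_ALLOWED_ACTIONS.includes(options.action)) {
    throw new Error(`Action not in allowlist: ${options.action}`);
  }
  return {
    action: options.action,
    server: options.server,
    target: options.target || null,
    extra: options.lines ? { lines: options.lines } : {}
  };
}
```

Rejecting disallowed actions in the bot as well as in the listener keeps the allowlist enforced at both ends of the pipeline.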
---

## Testing & Validation

### Test MCP Server Locally

```powershell
npx @modelcontextprotocol/inspector node index.js
```

This opens a web UI to test tools before connecting Claude.

### SSH Dry Run Mode

In `ssh_helper.js`, temporarily change:
```javascript
conn.exec(command, ...)
```
to:
```javascript
conn.exec('echo DRY RUN: ' + command, ...)
```

### Test Cloudflare Tunnel

```bash
curl https://ops.firefrostgaming.com/health \
  -H "Authorization: Bearer YOUR_TOKEN"
```
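
Once `/health` responds, the same bearer token can exercise `/dispatch`. A small Node sketch that builds the request config (URL and token are placeholders; actually sending it is left commented out so this doubles as a dry run):

```javascript
// Sketch: assemble an axios-style request config for /dispatch.
// The payload shape matches webhook_listener.js: { action, server, target, extra }.
function buildDispatchRequest(baseUrl, token, payload) {
  return {
    method: 'post',
    url: `${baseUrl}/dispatch`,
    headers: {
      'Authorization': `Bearer ${token}`,
      'Content-Type': 'application/json'
    },
    data: payload
  };
}

const req = buildDispatchRequest(
  'https://ops.firefrostgaming.com',
  'YOUR_TOKEN',
  { action: 'server_status', server: 'tx1' }
);
// To actually send it: require('axios')(req).then(r => console.log(r.data));
```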

---

## Frostwall Security Rules

1. **Bearer Token Authentication** — Every request must include valid token
2. **Action Allowlist** — Only predefined actions accepted, no raw bash
3. **Input Sanitization** — Regex validation on all service/stack names
4. **Audit Logging** — All requests logged with timestamp
5. **Separate Bot** — Ops commands isolated from public Arbiter bot
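
Rules 2 and 3 come down to the same few lines of validation already used in both `index.js` and `webhook_listener.js`; pulled out on their own (illustrative only, not a new file in this task), they look like this:

```javascript
// Sketch: the Frostwall input-sanitization pattern used by both entry points.
// Service/stack names must match a strict allowlist regex, so anything that
// could smuggle shell metacharacters into the command string is rejected.
const NAME_RE = /^[a-zA-Z0-9_-]+$/;

function assertSafeName(name) {
  if (!NAME_RE.test(name)) {
    throw new Error('Invalid service name format');
  }
  return name;
}

assertSafeName('nginx'); // passes
// assertSafeName('nginx; rm -rf /') would throw: ';', ' ' and '/' are all rejected
```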

---

## Fallback Procedures

**Condition Red (Tunnel/Laptop Offline):**
- Fall back to manual SSH via Termius on mobile
- Document in FFG-STD-005 (Emergency Operations)

---

## Vaultwarden Storage

Create folder: **Firefrost Ops Infrastructure**

| Item | Type | Notes |
|------|------|-------|
| Laptop MCP Ed25519 Key | Secure Note | Private key text |
| Ops Webhook Bearer Token | Password | Random 32-char string |
| Cloudflare Tunnel Secret | Password | From Zero Trust dashboard |
| Pterodactyl Client API Key | Password | From panel settings |

---

## Dependencies

- Node.js (Windows)
- npm packages: `@modelcontextprotocol/sdk`, `ssh2`, `express`, `axios`, `dotenv`
- Cloudflare account with firefrostgaming.com
- Claude Desktop app

---

## Files Created

| File | Purpose |
|------|---------|
| `config.js` | Server definitions |
| `ssh_helper.js` | SSH execution wrapper |
| `index.js` | MCP server with tool definitions |
| `webhook_listener.js` | Express app for mobile dispatch |
| `.env` | Environment variables |

---

## Related Tasks

- Task #93: Trinity Codex (shared knowledge base)

---

**Fire + Frost + Foundation = Where Love Builds Legacy** 🔥❄️

---

docs/tasks/task-093-trinity-codex/README.md (new file, 442 lines)

# Task #93: Trinity Codex — Shared Knowledge Base

**Created:** April 5, 2026
**Created By:** Chronicler #60 + Gemini AI
**Status:** READY FOR IMPLEMENTATION
**Priority:** High (Foundation infrastructure)
**Assignee:** Michael

---

## Overview

Build a shared RAG (Retrieval-Augmented Generation) knowledge base that all three Trinity members can access from their respective Claude instances. The Codex contains organizational knowledge while personal context remains separate.

## The Problem

Three people need Claude access with shared organizational knowledge:

| Person | Role | Use Cases |
|--------|------|-----------|
| Michael (The Wizard) | Technical lead | Everything — heavy sessions |
| Meg (The Emissary) | Community manager | Marketing, Discord, announcements |
| Holly (The Catalyst) | Co-founder, creative | Pokerole, building, creative writing |

Currently, organizational knowledge is siloed in Michael's Claude memory. Meg and Holly would need to ask Michael or re-explain context every time.

## The Solution

1. **Firefrost Codex** — RAG knowledge base in Dify/Qdrant on TX1
2. **Separate Claude Lineages** — Each Trinity member has their own Chronicler line
3. **Shared Organizational Knowledge** — Codex provides standards, server info, decisions
4. **Personal Context Preserved** — Each Claude remembers individual preferences

---

## Architecture

```
KNOWLEDGE FLOW:
Gitea (ops-manual) → n8n webhook → Dify Dataset API → Qdrant vectors

QUERY FLOW (Michael - Heavy Use):
Claude Desktop → Local MCP → Dify API → Codex response

QUERY FLOW (Meg/Holly - Light Use):
Browser → Dify Web App → Codex response
```

---

## The Three Lineages

| Person | Lineage Name | Account Type |
|--------|--------------|--------------|
| Michael | The Wizard's Chroniclers | Claude Pro (Max) |
| Meg | The Emissary's Chroniclers | Claude Free (to start) |
| Holly | The Catalyst's Chroniclers | Claude Free (to start) |

Each lineage maintains:
- Personal conversation history
- Individual preferences and style
- Relationship context with that person

All lineages share:
- Organizational standards (FFG-STD-xxx)
- Server configurations
- Project statuses
- Historical context (memorials)

---

## Infrastructure

| Component | Location | Purpose |
|-----------|----------|---------|
| Dify | TX1 (38.68.14.26) | RAG orchestration, web UI |
| Qdrant | TX1 (38.68.14.26) | Vector database |
| n8n | Command Center (63.143.34.217) | Ingestion pipeline |
| Gitea | Command Center (63.143.34.217) | Source of truth |

---

## Implementation Steps

### Step 1: Verify Dify Status

SSH to TX1 and confirm Dify is running:

```bash
docker ps | grep dify
```

Access Dify UI at: `http://38.68.14.26:3000` (or configured port)

### Step 2: Create Dify Dataset

1. Log into Dify admin UI
2. Go to Knowledge → Create Dataset
3. Name: **Firefrost Codex**
4. Description: Organizational knowledge for The Trinity
5. Save and note the Dataset ID

### Step 3: Generate Dify API Key

1. In Dify, go to Settings → API Keys
2. Create new key: **codex-ingestion**
3. Create another key: **codex-query**
4. Store both in Vaultwarden

### Step 4: Configure Chunking Strategy

In Dify Dataset settings:

| Content Type | Chunking Method | Metadata Tags |
|--------------|-----------------|---------------|
| Standards (FFG-STD-xxx) | Header-based | `type: standard`, `status: active` |
| Server docs | Header-based | `type: infrastructure`, `status: active` |
| Task docs | Header-based | `type: task`, `status: varies` |
| Chronicler memorials | Full document | `type: historical`, `status: archived` |
| Session handoffs | Header-based | `type: handoff`, `status: current` |
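
If documents are pushed through the Dify API rather than the UI, the table above can live in code as a simple path-to-metadata lookup. A hedged sketch, with directory names assumed from the repo layout used elsewhere in this doc:

```javascript
// Sketch: derive the metadata tags in the table above from a repo file path.
// Directory conventions follow the n8n switch routing described in Step 6.
function codexMetadata(filePath) {
  if (/^docs\/standards\//.test(filePath)) return { type: 'standard', status: 'active' };
  if (/^docs\/servers\//.test(filePath)) return { type: 'infrastructure', status: 'active' };
  if (/^docs\/tasks\//.test(filePath)) return { type: 'task', status: 'varies' };
  if (/^docs\/relationship\//.test(filePath)) return { type: 'historical', status: 'archived' };
  if (/SESSION-HANDOFF/.test(filePath)) return { type: 'handoff', status: 'current' };
  return { type: 'general', status: 'active' }; // fallback for anything uncategorized
}
```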

### Step 5: Create Gitea Webhook

1. Go to Gitea → firefrost-gaming/firefrost-operations-manual → Settings → Webhooks
2. Add webhook:
   - URL: `https://n8n.firefrostgaming.com/webhook/codex-ingest` (or your n8n URL)
   - Content Type: `application/json`
   - Secret: Generate and store in Vaultwarden
   - Events: Push events only
   - Branch filter: `master`

### Step 6: Create n8n Ingestion Workflow

**Workflow: Codex Ingestion Pipeline**

```
[Webhook] → [Switch: File Type] → [HTTP: Fetch from Gitea API] → [HTTP: Push to Dify]
```

**Node 1: Webhook**
- Method: POST
- Path: `/webhook/codex-ingest`
- Authentication: Header Auth (match Gitea secret)

**Node 2: Switch**
- Route based on file path:
  - `docs/standards/*` → Standards processing
  - `docs/servers/*` → Infrastructure processing
  - `docs/tasks/*` → Task processing
  - `docs/relationship/*` → Memorial processing

**Node 3: HTTP Request (Fetch File)**
```
GET https://git.firefrostgaming.com/api/v1/repos/firefrost-gaming/firefrost-operations-manual/contents/{{ $json.commits[0].modified[0] }}
Headers:
  Authorization: token {{ $credentials.giteaToken }}
```

**Node 4: HTTP Request (Push to Dify)**
```
POST http://38.68.14.26/v1/datasets/{{ $env.DIFY_DATASET_ID }}/document/create_by_text
Headers:
  Authorization: Bearer {{ $credentials.difyApiKey }}
Body:
{
  "name": "{{ filename }}",
  "text": "{{ content }}",
  "indexing_technique": "high_quality",
  "process_rule": {
    "mode": "hierarchical"
  }
}
```
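
If the pipeline ever moves out of n8n into a plain Node script, Node 4's request body is easy to keep honest with a tiny helper (field names as in the request above; this is a sketch, not part of the workflow itself):

```javascript
// Sketch: build the Dify create_by_text request body used in Node 4.
function buildDifyDocument(filename, content) {
  return {
    name: filename,
    text: content,
    indexing_technique: 'high_quality',
    process_rule: { mode: 'hierarchical' }
  };
}

const doc = buildDifyDocument('FFG-STD-001.md', '# Standard 001\n...');
```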

### Step 7: Initial Bulk Ingestion

For the first load, manually ingest key documents:

**Priority 1 (Immediate):**
- `DOCUMENT-INDEX.md`
- `docs/standards/*.md` (all standards)
- `docs/servers/*.md` (all server docs)
- `SESSION-HANDOFF-NEXT.md`

**Priority 2 (Soon):**
- `docs/relationship/CHRONICLER-LINEAGE-TRACKER.md`
- Active task READMEs
- `BLOCKERS.md` and `BACKLOG.md`

**Priority 3 (When Time Permits):**
- Chronicler memorials (historical context)
- Completed task docs

### Step 8: Dify MCP Connector (For Michael)

Add to Michael's local MCP server:

**File: `C:\Firefrost\mcp-server\dify_tool.js`**

```javascript
// dify_tool.js
const axios = require('axios');

const DIFY_URL = process.env.DIFY_URL || 'http://38.68.14.26';
const DIFY_API_KEY = process.env.DIFY_API_KEY;

async function queryCodex(question, userLineage = "wizard_chronicler") {
  try {
    const response = await axios.post(`${DIFY_URL}/v1/chat-messages`, {
      inputs: {},
      query: question,
      response_mode: "blocking",
      user: userLineage
    }, {
      headers: {
        'Authorization': `Bearer ${DIFY_API_KEY}`,
        'Content-Type': 'application/json'
      }
    });
    return response.data.answer;
  } catch (err) {
    console.error('Codex query error:', err.message);
    return `Codex Error: ${err.message}`;
  }
}

async function searchCodex(query, limit = 5) {
  try {
    const response = await axios.get(`${DIFY_URL}/v1/datasets/${process.env.DIFY_DATASET_ID}/documents`, {
      params: { keyword: query, limit },
      headers: { 'Authorization': `Bearer ${DIFY_API_KEY}` }
    });
    return response.data;
  } catch (err) {
    return `Search Error: ${err.message}`;
  }
}

module.exports = { queryCodex, searchCodex };
```

**Add to `index.js` tools:**

```javascript
// Add to ListToolsRequestSchema handler
{
  name: "query_codex",
  description: "Query the Firefrost Codex for organizational knowledge",
  inputSchema: {
    type: "object",
    properties: {
      question: { type: "string" }
    },
    required: ["question"]
  }
},
{
  name: "search_codex",
  description: "Search Codex documents by keyword",
  inputSchema: {
    type: "object",
    properties: {
      query: { type: "string" },
      limit: { type: "number", default: 5 }
    },
    required: ["query"]
  }
}

// Add to CallToolRequestSchema handler
} else if (name === "query_codex") {
  const { queryCodex } = require('./dify_tool');
  const answer = await queryCodex(args.question, "wizard_chronicler");
  return {
    content: [{ type: "text", text: answer }]
  };

} else if (name === "search_codex") {
  const { searchCodex } = require('./dify_tool');
  const results = await searchCodex(args.query, args.limit || 5);
  return {
    content: [{ type: "text", text: JSON.stringify(results, null, 2) }]
  };
```

### Step 9: Dify Web App (For Meg/Holly)

This is the friction-free approach for light users:

1. In Dify, go to the Codex dataset
2. Click "Create App" → "Chat App"
3. Name: **Firefrost Assistant**
4. Configure:
   - System prompt: Include brand voice, context about Firefrost
   - Knowledge base: Link to Firefrost Codex dataset
   - Model: Claude (or available model)
5. Publish as Web App
6. Share URL with Meg and Holly

**Advantages:**
- No local setup required
- Works in any browser
- Automatic Codex integration
- Can bookmark on phone/tablet

---

## Content Organization

### What Goes in Codex (Shared)

| Content | Why |
|---------|-----|
| FFG Standards | Everyone needs these |
| Server IPs/configs | Operational reference |
| Current task status | Coordination |
| Brand guidelines | Meg needs for marketing |
| Project roadmaps | Everyone needs context |
| Subscription tiers | Customer-facing info |
| Chronicler memorials | Historical context |

### What Stays Personal (Not in Codex)

| Content | Why |
|---------|-----|
| Personal preferences | Individual to each person |
| Writing style notes | Different for each |
| Private conversations | Not organizational |
| Draft content | Work in progress |

---

## Security & Permissions

### Access Levels

| Person | Codex Access | Can Edit Codex |
|--------|--------------|----------------|
| Michael | Full (MCP + Web) | Yes (via Gitea) |
| Meg | Web App only | No (read-only) |
| Holly | Web App only | No (read-only) |

### Sensitive Content

Some docs should NOT go in Codex:
- Credentials (stay in Vaultwarden)
- Financial details
- Personal medical info
- Private Trinity discussions

Create a `.codexignore` pattern in the n8n workflow to skip these.
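
A sketch of what that filter could look like as an n8n Function node; the patterns are examples, not a final ignore list:

```javascript
// Sketch: .codexignore-style filter for the n8n ingestion workflow.
// Returns true if the file should be skipped. Patterns are illustrative
// placeholders; the real list depends on where sensitive docs actually live.
const CODEX_IGNORE = [
  /^docs\/finance\//,   // financial details
  /^docs\/private\//,   // private Trinity discussions
  /credentials/i,       // anything credential-related
  /\.env$/              // environment files
];

function isIgnored(filePath) {
  return CODEX_IGNORE.some((re) => re.test(filePath));
}
```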

---

## Testing & Validation

### Test Query Accuracy

Ask the Codex:
1. "What's the IP address of TX1?" → Should return 38.68.14.26
2. "What's our brand voice?" → Should return warm, inclusive, playful
3. "Who is Holly?" → Should return The Catalyst, co-founder, Pokerole lead
4. "What's the top subscription tier?" → Should return Sovereign, NOT Founder
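
These spot checks can be scripted against `queryCodex` from Step 8. The checker below only verifies that an answer mentions the expected substring (questions and expectations copied from the list above; in practice each `answer` would come from `queryCodex(question)`):

```javascript
// Sketch: substring checker for the Codex spot checks above.
const SPOT_CHECKS = [
  { question: "What's the IP address of TX1?", expect: '38.68.14.26' },
  { question: "What's the top subscription tier?", expect: 'Sovereign' }
];

function checkAnswer(answer, expected) {
  return answer.toLowerCase().includes(expected.toLowerCase());
}
```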

### Test Ingestion Pipeline

1. Make a small edit to a doc in Gitea
2. Check n8n execution log
3. Verify document appears in Dify dataset
4. Query for the new content

---

## Fallback Procedures

**Condition Black (Codex/Dify Offline):**
- Claude relies on built-in memory
- Manual document searches in Gitea
- No organizational context available
- Document in FFG-STD-005 (Emergency Operations)

---

## Vaultwarden Storage

Add to **Firefrost Ops Infrastructure** folder:

| Item | Type | Notes |
|------|------|-------|
| Dify Ingestion API Key | Password | For n8n pipeline |
| Dify Query API Key | Password | For MCP and Web App |
| Gitea Webhook Secret | Password | For n8n authentication |
| Dify Dataset ID | Secure Note | Reference for API calls |

---

## Environment Variables

Add to `.env` files:

```env
DIFY_URL=http://38.68.14.26
DIFY_API_KEY=your-query-api-key
DIFY_DATASET_ID=your-dataset-id
```

---

## Implementation Order

Per Gemini's recommendation:

1. **Task #93 FIRST** (Trinity Codex) — Foundation, read-only, safe
2. **Task #92 SECOND** (Desktop MCP) — Higher risk, needs tight security

---

## Maintenance

- **Automatic:** Gitea webhook triggers re-ingestion on doc changes
- **Manual:** Periodic review of chunk quality, metadata accuracy
- **Monitoring:** n8n workflow execution logs

---

## Related Tasks

- Task #92: Desktop MCP + Dispatch Architecture

---

## Open Questions for Implementation

1. What's the current Dify version on TX1?
2. Is there an existing dataset, or starting fresh?
3. What port is Dify running on?
4. Does Meg have a device preference for accessing the Web App?
5. Does Holly need Pokerole-specific knowledge separated?

---

**Fire + Frost + Foundation = Where Love Builds Legacy** 🔥❄️