docs: Gemini Round 4 response — versioning, order, deployment

Versioning locked in:
- Platform: Trinity Console 2.0
- Migrated modules: 1.0.0
- New modules: 0.1.0 until stable
- About page: Part of System module

Migration verified:
- 30-40 min/module confirmed
- Servers module.json example provided
- View path gotcha documented
- Strangler Fig coexistence confirmed

Development order:
Phase 1: Dashboard → System → Servers (proof of concept)
Phase 2: Tasks (high value, new build)
Phase 3: Players → Financials (heavy migration)
Phase 4: Everything else

Critical deployment note:
- Run 001_rbac_tables.sql and 002_feature_flags.sql MANUALLY
  before first Trinity 2.0 deploy (don't rely on boot sequence)

Signed-off-by: Claude (Chronicler #61) <claude@firefrostgaming.com>
# Gemini Response: Trinity Console 2.0 — Round 4 Answers
**Date:** April 5, 2026
**From:** Gemini (Architectural Partner)
**To:** Michael (The Wizard) + Claude (Chronicler #61)
**Re:** Versioning, migration verification, development order, deployment risks
---
This is an excellent final planning phase, Michael and Chronicler #61. You are asking exactly the right questions to ensure the deployment and transition are as smooth as the architecture itself.
Let's lock in the strategy so you can get to work.
---
## 1. Versioning Strategy
**Platform Versioning: Go with Trinity Console 2.0.**
Your instinct is 100% correct. You did the hard work of building v1 (the monolith). It proved the concept, validated the need, and handles production traffic. Version 2.0 honors that history and signals to the team (Meg and Holly) that this is a major architectural leap forward.
**Module Versioning:**
* **Migrated Modules:** Start them at `1.0.0`. Even though the code existed before, this is their first release as a *standalone, self-registering plugin*.
* **Brand New Modules (e.g., Tasks):** Start them at `0.1.0`. Keep them under `1.0.0` until you, Meg, and Holly have actively used them for a week and ironed out the workflow bugs. Once stable, bump to `1.0.0`.
**The "About" Module (System Information):**
* **Yes, absolutely build this.** Do not make it a separate module, though. Build it directly into the **System** module you already defined.
* **What to include:** Trinity Console Version, Node.js version, Postgres version, Uptime (calculate from Node process start), and a dynamic table listing all loaded modules and their versions (fetchable via `registry.getAllModules()`). This is invaluable when you are debugging from the RV on a poor connection.
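The About page can be assembled from data the runtime already has. A minimal sketch, assuming `registry.getAllModules()` returns the loaded `module.json` manifests (the mock registry below and the payload shape are illustrative; the Postgres version would come from a `SELECT version()` query and is omitted here):

```javascript
// Mock of the module registry; the real one is populated by the loader.
const registry = {
  getAllModules: () => [
    { id: 'servers', name: 'Servers & Scheduling', version: '1.0.0' },
    { id: 'tasks', name: 'Tasks', version: '0.1.0' },
  ],
};

// Build the data the About view would render.
function buildAboutPayload() {
  return {
    platform: 'Trinity Console 2.0',
    node: process.version,                       // Node.js version string
    uptimeSeconds: Math.floor(process.uptime()), // seconds since process start
    modules: registry.getAllModules().map((m) => ({
      id: m.id,
      name: m.name,
      version: m.version,
    })),
  };
}

const about = buildAboutPayload();
```

The route handler in the System module would just pass this payload to its view.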
---
## 2. Migration Verification (Servers Module Test)
Claude's estimate of 30-40 minutes per module is highly accurate, assuming the core engine is fully stable.
**The Servers `module.json`:**
Notice how we combined "Servers" and "Scheduler" as we discussed in the consolidation phase.
```json
{
"id": "servers",
"name": "Servers & Scheduling",
"description": "Game server status, whitelist sync, and restart automation",
"version": "1.0.0",
"author": "Trinity",
"dependencies": {},
"icon": "server",
"nav": {
"section": "Infrastructure",
"position": 1
},
"permissions": [
{
"key": "servers.view",
"name": "View Servers",
"description": "See server matrix and status"
},
{
"key": "servers.manage",
"name": "Manage Servers",
"description": "Sync whitelists and toggle maintenance mode"
},
{
"key": "servers.scheduler",
"name": "Manage Scheduler",
"description": "Deploy global restart schedules and nuke conflicts"
}
],
"routes": "/servers"
}
```
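Before the loader registers a manifest like the one above, it pays to sanity-check it. A minimal sketch; the required-field list and function name here are assumptions for illustration, not the engine's actual contract:

```javascript
// Illustrative required fields; adjust to whatever the real loader enforces.
const REQUIRED_FIELDS = ['id', 'name', 'version', 'routes', 'permissions'];

// Throw early with a clear message rather than failing mid-boot.
function validateManifest(manifest) {
  const missing = REQUIRED_FIELDS.filter((f) => !(f in manifest));
  if (missing.length > 0) {
    throw new Error(`module.json missing fields: ${missing.join(', ')}`);
  }
  if (!/^\d+\.\d+\.\d+$/.test(manifest.version)) {
    throw new Error(`invalid semver: ${manifest.version}`);
  }
  return true;
}

const serversManifest = {
  id: 'servers',
  name: 'Servers & Scheduling',
  version: '1.0.0',
  routes: '/servers',
  permissions: [{ key: 'servers.view' }],
};
```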
**Gotchas to Watch For:**
* **View Paths:** In the monolith, your `res.render()` calls likely referenced paths from the root `views` directory. In the plugin architecture, you need to ensure the path correctly points to the module's localized view folder (e.g., `res.render('../src/modules/servers/views/index')`).
* **The Strangler Fig Coexistence:** Existing monolithic routes *will* coexist perfectly with the new dynamic routes, **provided the prefixes don't clash**. If the old monolith mounted the servers page at `app.use('/admin/servers', ...)` and your new module declares `"routes": "/servers"`, they will live side by side. You can test the new one, and if it breaks, the old one is still there until you delete the old route.
---
## 3. Module Development Order
Here is the safest, most logical path to migrate and build.
### Phase 1: The Core & Proof of Concept (Migration)
1. **Dashboard:** The easiest migration. Proves the UI, navigation builder, and basic routing work.
2. **System:** Build this *now* to house your RBAC management UI and the "About" page. You need the UI to assign permissions before migrating complex modules.
3. **Servers (The Test Case):** Prove that complex logic (the Scheduler we built) works inside the module system.
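The navigation builder that Phase 1 proves out can be exercised early with a small sketch: group the loaded manifests by their `nav.section` and sort within each section by `nav.position`. The function name and output shape are assumptions; only the `nav` fields come from the `module.json` format above:

```javascript
// Build sidebar sections from each module's "nav" block in module.json.
function buildNavigation(manifests) {
  const sections = {};
  for (const m of manifests) {
    if (!m.nav) continue; // modules without a nav entry stay hidden
    (sections[m.nav.section] ||= []).push({
      id: m.id,
      name: m.name,
      href: m.routes,
      position: m.nav.position,
    });
  }
  for (const items of Object.values(sections)) {
    items.sort((a, b) => a.position - b.position);
  }
  return sections;
}

const nav = buildNavigation([
  { id: 'dashboard', name: 'Dashboard', routes: '/', nav: { section: 'Core', position: 1 } },
  { id: 'servers', name: 'Servers & Scheduling', routes: '/servers', nav: { section: 'Infrastructure', position: 1 } },
  { id: 'system', name: 'System', routes: '/system', nav: { section: 'Core', position: 2 } },
]);
```

If Dashboard renders this structure correctly, every later module gets navigation for free by declaring its `nav` block.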
### Phase 2: High Value / New Builds
4. **Tasks:** Build this immediately after Phase 1. It is a brand new module (`0.1.0`), has no legacy code to untangle, and provides instant value to team coordination.
### Phase 3: The Heavy Lifters (Migration)
5. **Players:** This is a thick module (merging Subscriptions, Grace Period, Bans, Role Audit). Do this carefully.
6. **Financials:** Migrate existing MRR/ARR charts.
### Phase 4: The Rest (New Builds)
7. **Team, Docs, Marketing, Infrastructure, Health, Chroniclers.** Build these sequentially as the need arises. They are independent and low risk.
---
## 4. Deployment Confirmation & Risks
Your deployment plan is generally solid, but there is one critical risk regarding the database.
**The Shared Database Risk:**
Yes, the new Trinity Console 2.0 and the old Arbiter can share `arbiter_db`. **However, DO NOT rely on the boot sequence to run the initial core schema changes (`001_rbac_tables.sql` and `002_feature_flags.sql`) for the very first time on production.**
* **Why:** If the boot sequence hits an unexpected error or timeout during the table creation, it could leave the database in a locked or partially migrated state, which might inadvertently affect the old Arbiter's existing tables.
* **The Fix:** Run the initial `001_rbac_tables.sql` and `002_feature_flags.sql` manually via your PostgreSQL client (pgAdmin, psql, or your VPS panel) *before* you deploy Trinity 2.0. Because these tables are entirely new, they will not disrupt the old Arbiter. Once the tables exist, the boot sequence will simply `DO NOTHING` on conflict and handle future module migrations safely.
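The "DO NOTHING on conflict" safety depends on the migrations being idempotent. A hedged sketch of the pattern; the table and column names below are illustrative only, not the actual contents of `001_rbac_tables.sql`:

```sql
-- Illustrative only: the real 001_rbac_tables.sql may differ.
-- CREATE TABLE IF NOT EXISTS makes a second run a no-op,
-- and ON CONFLICT DO NOTHING keeps seed rows from duplicating.
CREATE TABLE IF NOT EXISTS roles (
    id   SERIAL PRIMARY KEY,
    name TEXT NOT NULL UNIQUE
);

INSERT INTO roles (name)
VALUES ('admin'), ('viewer')
ON CONFLICT (name) DO NOTHING;
```

If both scripts follow this pattern, running them manually first and letting the boot sequence re-run them later is safe in either order.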
---
## Final Check
Everything looks clear. You have the architecture, the code, the versioning strategy, and the rollout order. You are ready to establish the new foundation.
**Execute Task #95 and let's get Trinity Console 2.0 online!**
---
*Response received April 5, 2026*