Reorg: Move uptime-kuma deployment to docs/deployment/
# Uptime Kuma Deployment - Service 2/5

**Service:** Uptime Kuma (Infrastructure Monitoring)
**Domain:** status.firefrostgaming.com
**Location:** TX1 Dallas (38.68.14.26)
**Deployed:** February 9, 2026
**Status:** ✅ OPERATIONAL

---
## Service Overview

**Purpose:** Real-time infrastructure monitoring with Discord notifications

**Monitors Active (6 total):**
- TX1 Dallas (Ping - <1ms) ✅
- Pterodactyl Panel (Ping - 41ms) ✅
- Gitea - Git Repository (HTTPS) ✅
- Command Center (Ping) ✅
- Paymenter Billing (Ping) ✅
- Ghost CMS (Ping) ✅

**NC1 Charlotte:** Excluded due to a network routing issue between the TX1 and NC1 datacenters

---
## Installation Details

**Version:** 2.1.0
**Database:** SQLite
**Internal Port:** 3001
**External Access:** HTTPS via Nginx reverse proxy
**SSL Certificate:** Let's Encrypt (expires 2026-05-10)
## Technical Installation

### Prerequisites Installed
- Node.js 20.20.0 (via NodeSource)
- npm 10.8.2
- Nginx 1.24.0 (already present)

### Installation Steps
1. Created system user: `uptime-kuma`
2. Cloned official repository to `/opt/uptime-kuma`
3. Ran `npm run setup` (version 2.1.0)
4. Created startup script at `/opt/uptime-kuma/start.sh`
5. Configured systemd service: `/etc/systemd/system/uptime-kuma.service`
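The steps above condense to roughly the following commands (a sketch, run as root; the repository URL is the official Uptime Kuma upstream, and the nologin shell is an assumption for a service account):

```bash
# Sketch of installation steps 1-5 above (run as root)
useradd --system --home-dir /opt/uptime-kuma --shell /usr/sbin/nologin uptime-kuma
git clone https://github.com/louislam/uptime-kuma.git /opt/uptime-kuma
cd /opt/uptime-kuma
npm run setup                        # installs dependencies and prepares the frontend dist
chown -R uptime-kuma: /opt/uptime-kuma
systemctl enable --now uptime-kuma   # once the unit file below is in place
```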
### Service Configuration

**SystemD Unit:**
- User: uptime-kuma
- Working Directory: /opt/uptime-kuma
- ExecStart: /opt/uptime-kuma/start.sh
- Restart: always (10 second delay)

**Startup Script:**
```bash
#!/bin/bash
cd /opt/uptime-kuma || exit 1
# exec replaces the shell so systemd signals reach the node process directly
exec node server/server.js
```
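The unit settings listed above correspond to a file like the following (a sketch; the `[Unit]` and `[Install]` boilerplate lines are standard additions not quoted in the original notes, and the Description matches the `systemctl status` output shown under Verification Results):

```ini
# /etc/systemd/system/uptime-kuma.service
[Unit]
Description=Uptime Kuma - Self-hosted monitoring tool
After=network.target

[Service]
User=uptime-kuma
WorkingDirectory=/opt/uptime-kuma
ExecStart=/opt/uptime-kuma/start.sh
Restart=always
RestartSec=10

[Install]
WantedBy=multi-user.target
```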
## Nginx Configuration

**Config File:** `/etc/nginx/sites-available/uptime-kuma`

```nginx
server {
    listen 80;
    server_name status.firefrostgaming.com;

    location / {
        proxy_pass http://localhost:3001;
        proxy_http_version 1.1;
        # WebSocket upgrade headers - the Uptime Kuma dashboard uses Socket.IO
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
```

**SSL:** Configured via Certbot (auto-renewal enabled)

---
## Discord Integration

**Notification Channel:** #network-status (in Support category)
**Webhook URL:** Configured (private)
**Bot Display Name:** Uptime Kuma
**Friendly Name:** Firefrost Network Status

**Applied To:** All 6 active monitors
**Default Enabled:** Yes (future monitors auto-notify)
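Uptime Kuma builds its own webhook payloads; for a manual smoke test of the (private) webhook URL, a minimal Discord payload POSTed as `application/json` looks like this (`username` overrides the bot display name; the message text is illustrative):

```json
{
  "username": "Uptime Kuma",
  "content": "✅ Test: Firefrost Network Status webhook is reachable"
}
```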
## Issues Encountered & Resolutions

### Issue 1: Missing dist/index.html
**Problem:** Initial `npm install uptime-kuma` didn't build the frontend files
**Error:** `Cannot find 'dist/index.html', did you install correctly?`
**Solution:** Used the proper installation method - git clone + `npm run setup`
**Lesson:** Always use the official installation method, not a bare `npm install`

### Issue 2: NC1 Charlotte Unreachable
**Problem:** TX1 cannot reach NC1 (216.239.104.130) - "Network is unreachable"
**Diagnosis:** No route between the TX1 Dallas and NC1 Charlotte datacenters
**Attempted:** Ping monitoring, TCP port 25565 monitoring
**Resolution:** Excluded NC1 from monitoring for now
**Future Fix:** Consider push monitoring or an external monitoring point

### Issue 3: Start Script Deleted During Cleanup
**Problem:** Systemd service failed after cleaning /opt/uptime-kuma
**Cause:** start.sh was removed when clearing the directory for the git clone
**Solution:** Recreated start.sh after the successful git clone
**Lesson:** Document all custom scripts before cleanup operations

---
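The push-monitoring option from Issue 2 inverts the direction: NC1 reports in over whatever route it does have, rather than being polled from TX1. A sketch of a cron entry on NC1 (`<PUSH_TOKEN>` is a placeholder for the token Uptime Kuma generates when a Push monitor is created):

```
# /etc/cron.d/kuma-push on NC1 (sketch; <PUSH_TOKEN> is a placeholder)
* * * * * root curl -fsS --retry 2 "https://status.firefrostgaming.com/api/push/<PUSH_TOKEN>?status=up&msg=OK" >/dev/null
```

The matching Push monitor's heartbeat interval should be set to 60 seconds so a missed cron run registers as down.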
## Verification Results

**Service Status:**
```
● uptime-kuma.service - Uptime Kuma - Self-hosted monitoring tool
     Active: active (running)
     Uptime: 100%
```

**Port Check:**
```
ss -tlnp | grep 3001
LISTEN *:3001 (node process)
```

**HTTPS Access:**
```
curl -I https://status.firefrostgaming.com
HTTP/1.1 302 Found (redirects to setup - expected)
```

**DNS Resolution:**
```
dig status.firefrostgaming.com +short
38.68.14.26
```
## Next Steps

**Immediate:**
- Monitor Discord notifications for false positives
- Verify alert timing and message formatting
- Test monitor recovery notifications

**Short-term:**
- Add monitoring for individual game servers (15 total)
- Configure status page for public visibility
- Set up maintenance mode notifications

**Long-term:**
- Implement NC1 monitoring solution (push or external)
- Add custom monitors for specific services (MySQL, Redis if deployed)
- Configure notification escalation rules

---
## Admin Account

**Username:** Created during web setup
**Dashboard:** https://status.firefrostgaming.com
**Database:** `/opt/uptime-kuma/data/kuma.db` (SQLite)

---
## Maintenance

**Backup Strategy:**
- SQLite database: `/opt/uptime-kuma/data/kuma.db`
- Configuration: Monitor definitions stored in database
- Frequency: Daily (to be implemented with Phase 0.5 backup system)
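Because Uptime Kuma keeps its database open while running, a plain `cp` can capture a mid-write state; SQLite's `.backup` command takes a consistent online copy instead. A sketch against a scratch database (the real daily job would target `/opt/uptime-kuma/data/kuma.db` and a destination directory of your choosing; the `monitor` table here is illustrative):

```shell
# Demonstrate SQLite's online .backup on a scratch database; the real
# job would point at /opt/uptime-kuma/data/kuma.db instead.
tmp=$(mktemp -d)
sqlite3 "$tmp/kuma.db" "CREATE TABLE monitor(id INTEGER PRIMARY KEY, name TEXT);
                        INSERT INTO monitor(name) VALUES ('TX1 Dallas');"
# .backup copies the database atomically, even while another process has it open
sqlite3 "$tmp/kuma.db" ".backup '$tmp/kuma-$(date +%F).db'"
sqlite3 "$tmp/kuma-$(date +%F).db" "SELECT name FROM monitor;"   # prints: TX1 Dallas
```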
**Updates:**
```bash
# Fetch updates and reinstall production dependencies as the service user
sudo -u uptime-kuma bash -c 'cd /opt/uptime-kuma && git pull && npm ci --omit=dev'
# Restart as root - the uptime-kuma user cannot manage systemd units
sudo systemctl restart uptime-kuma
```

---
## Revision History

| Version | Date | Author | Changes |
|---------|------|--------|---------|
| **1.0** | 2026-02-09 | Michael | Initial deployment. Service 2/5 complete. |

---

**END OF DOCUMENT**