
Firefrost Codex - Migration to Open WebUI + Repomix

Document ID: CODEX-MIGRATION-001
Created: February 21, 2026
Status: 📋 PLANNED - Ready to Execute
Estimated Time: 1 hour
Risk Level: LOW (can rollback to AnythingLLM if needed)


🎯 OBJECTIVE

Replace AnythingLLM with Open WebUI + Repomix workflow to solve document retrieval quality issues.

Current Problem:

  • AnythingLLM indexing 319 files creates poor search relevance
  • AI finds old/archived docs instead of current operational info
  • Vector search overwhelmed by document volume and similarity

Solution:

  • Open WebUI: More efficient RAG engine, better performance
  • Repomix: Packages entire operations manual into single, clean file
  • Combined: Fast + accurate retrieval without noise

📊 COMPARISON

| Feature | AnythingLLM (Current) | Open WebUI + Repomix |
|---|---|---|
| Documents | 319 individual files | 1 packaged digest |
| Search quality | Poor (finds old docs) | Good (clean context) |
| Performance | Slow indexing | Fast responses |
| Updates | Re-sync 319 files | Re-run packager |
| Maintenance | High (vector DB) | Low (single file) |
| Cost | $0/month | $0/month |

⚠️ PREREQUISITES

Before starting:

  • SSH access to TX1 (38.68.14.26)
  • Docker running on TX1
  • Ollama running on TX1 (port 11434)
  • Git repo cloned at /root/firefrost-operations-manual
  • ~1 hour of focused time (matches the overall estimate)
  • Backup plan (can restart AnythingLLM if needed)

🛠️ MIGRATION STEPS

Step 1: Install Repomix (5 minutes)

SSH to TX1:

ssh root@38.68.14.26

Install Repomix globally:

npm install -g repomix

Verify installation:

repomix --version

Expected output: Version number (e.g., repomix v2.x.x)


Step 2: Generate Operations Manual Digest (5 minutes)

Navigate to operations manual:

cd /root/firefrost-operations-manual

Run Repomix to package entire repo:

repomix --output firefrost-ops-digest.md

What this does:

  • Scans entire repository
  • Automatically excludes .git, node_modules, binaries
  • Creates single markdown file with intelligent structure
  • Optimized for LLM consumption

Verify the digest:

ls -lh firefrost-ops-digest.md
wc -l firefrost-ops-digest.md

Expected: File should be several thousand lines, representing all docs
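If you want this check scriptable (for example, as a guard inside the sync script later), a minimal sketch might look like the following; the function name and the 1000-line floor are our assumptions, so tune the threshold to your repo size:

```shell
# check_digest: sanity-check a Repomix digest before uploading it.
# The 1000-line floor is an assumption; adjust for your repo.
check_digest() {
  local f="${1:-firefrost-ops-digest.md}"
  if [ ! -f "$f" ]; then
    echo "MISSING: $f"
    return 1
  fi
  local lines
  lines=$(wc -l < "$f")
  if [ "$lines" -lt 1000 ]; then
    echo "TOO_SMALL: $f ($lines lines)"
    return 1
  fi
  echo "OK: $f ($lines lines)"
}
```

Run `check_digest` after each `repomix` invocation; a failing check usually means Repomix excluded more than expected.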


Step 3: Stop AnythingLLM (2 minutes)

Navigate to AnythingLLM directory:

cd /opt/anythingllm

Stop the containers:

docker-compose down

Verify stopped:

docker ps | grep anythingllm

Expected: No output (containers stopped)

Optional - Remove data volumes completely:

# Only if you want to free up space (this deletes rollback data)
# docker-compose down -v  # Removes containers AND volumes

Note: We're keeping the data volumes in case we need to rollback.


Step 4: Install Open WebUI (10 minutes)

Pull and run Open WebUI container:

docker run -d \
  -p 3001:8080 \
  --name open-webui \
  --add-host=host.docker.internal:host-gateway \
  -v open-webui:/app/backend/data \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  --restart always \
  ghcr.io/open-webui/open-webui:main

Breakdown:

  • -p 3001:8080 - Same port as old Codex (http://38.68.14.26:3001)
  • --add-host - Allows container to reach host's Ollama
  • -v open-webui:/app/backend/data - Persistent storage
  • -e OLLAMA_BASE_URL - Points to existing Ollama instance
  • --restart always - Auto-start on server reboot

Verify container running:

docker ps | grep open-webui

Expected: Container should show as "Up"

Check logs:

docker logs open-webui

Expected: Should see startup messages, no errors
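If you prefer to manage Open WebUI with docker-compose, like the old AnythingLLM setup, the `docker run` command above translates to roughly this compose file (a sketch; the file path and service name are our choice):

```yaml
# /opt/open-webui/docker-compose.yml (path is an assumption)
services:
  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    container_name: open-webui
    ports:
      - "3001:8080"
    extra_hosts:
      - "host.docker.internal:host-gateway"
    volumes:
      - open-webui:/app/backend/data
    environment:
      - OLLAMA_BASE_URL=http://host.docker.internal:11434
    restart: always

volumes:
  open-webui:
```

With this in place, `docker compose up -d` from that directory replaces the long `docker run` invocation.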


Step 5: Initial Setup via Web UI (10 minutes)

Access Open WebUI:

  1. Open browser to: http://38.68.14.26:3001
  2. First visit will prompt to create admin account
  3. Create account:
    • Username: mkrause612
    • Email: (your email)
    • Password: (secure password)

Verify Ollama connection:

  1. Click Settings (gear icon)
  2. Go to "Connections"
  3. Should show Ollama at http://host.docker.internal:11434
  4. Should list available models (qwen2.5-coder:7b, llama3.3:70b)

If models don't show:

# Verify Ollama is accessible from container
docker exec open-webui curl -s http://host.docker.internal:11434/api/tags

Step 6: Upload Operations Manual Digest (5 minutes)

In Open WebUI:

  1. Click "Documents" in sidebar
  2. Click "Upload Documents"
  3. Select /root/firefrost-operations-manual/firefrost-ops-digest.md
  4. Upload completes
  5. Document should appear in list

Verify the file before uploading:

# Confirm size and location of the digest
ls -lh /root/firefrost-operations-manual/firefrost-ops-digest.md

# Upload via the web UI (easier than the API for a first-time setup)

Step 7: Create Workspace and Test (10 minutes)

Create Operations workspace:

  1. Click "Workspaces" (or similar)
  2. Create new workspace: "Operations"
  3. Attach firefrost-ops-digest.md document
  4. Select model: qwen2.5-coder:7b

Test queries:

Query 1: Current tasks

What are the current Tier 0 tasks according to tasks.md?

Expected: Should list actual current tasks (Whitelist Manager, NC1 Cleanup, etc.)

Query 2: Infrastructure

What servers does Firefrost Gaming operate and what are their IP addresses?

Expected: Should list TX1, NC1, Command Center, etc. with correct IPs

Query 3: Recent work

What was accomplished in the most recent Codex session?

Expected: Should reference Phase 2 work, workspaces created, etc.

If responses are accurate: Migration successful!


Step 8: Create Additional Workspaces (10 minutes)

Replicate the workspace structure:

  1. Operations (already created)

    • Document: firefrost-ops-digest.md
    • Model: qwen2.5-coder:7b
    • Users: Admins only
  2. Brainstorming

    • Document: firefrost-ops-digest.md
    • Model: llama3.3:70b (for deeper thinking)
    • Users: Admins only
  3. Public KB

    • Document: (none yet - future public docs)
    • Model: qwen2.5-coder:7b
    • Users: Public access (future)
  4. Subscriber KB

    • Document: (none yet - future subscriber docs)
    • Model: qwen2.5-coder:7b
    • Users: Subscriber access (future)
  5. Relationship

    • Document: firefrost-ops-digest.md (has relationship docs in it)
    • Model: qwen2.5-coder:7b
    • Users: Admins only
  6. Pokerole Project

    • Document: (none yet - needs pokerole repo)
    • Model: qwen2.5-coder:7b
    • Users: Admins + Holly

Step 9: Create User Accounts (5 minutes)

In Open WebUI Settings → Users:

  1. gingerfury (Meg)

    • Role: Admin
    • Email: (Meg's email)
    • Temporary password (she can change)
  2. Unicorn20089 (Holly)

    • Role: User (not Admin)
    • Email: (Holly's email)
    • Access: Pokerole Project workspace only

Note: Open WebUI has a different permission model than AnythingLLM, so workspace access may need to be configured differently.


Step 10: Document Sync Automation (10 minutes)

Update the sync workflow:

Instead of syncing individual files, we now just re-run Repomix.

Create new sync script:

cat > /root/codex-sync-openwebui.sh << 'EOFSCRIPT'
#!/bin/bash
# Firefrost Codex - Open WebUI Sync Script
# Updates the operations manual digest

set -e

REPO_DIR="/root/firefrost-operations-manual"
DIGEST_FILE="$REPO_DIR/firefrost-ops-digest.md"

echo "=== Firefrost Codex Sync (Open WebUI) ==="
echo "Started: $(date)"
echo ""

# Pull latest from Git
cd "$REPO_DIR"
echo "📥 Pulling latest from Git..."
git pull
echo ""

# Regenerate digest
echo "📦 Regenerating operations manual digest..."
repomix --output firefrost-ops-digest.md
echo ""

# Show file info
echo "✅ Digest updated:"
ls -lh "$DIGEST_FILE"
echo ""

echo "📌 Next step: Upload firefrost-ops-digest.md to Open WebUI"
echo "   Location: http://38.68.14.26:3001"
echo ""

echo "✅ Sync complete: $(date)"
EOFSCRIPT

chmod +x /root/codex-sync-openwebui.sh

Test the new sync script:

/root/codex-sync-openwebui.sh

Then manually upload the new digest via web UI
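To regenerate the digest on a schedule, a crontab entry along these lines works (the schedule and log path are assumptions; the upload step remains manual, as noted above):

```
# Regenerate the digest nightly at 03:00; uploading to Open WebUI stays manual
0 3 * * * /root/codex-sync-openwebui.sh >> /var/log/codex-sync.log 2>&1
```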


🔄 ROLLBACK PLAN

If Open WebUI doesn't work or has issues:

Step 1: Stop Open WebUI

docker stop open-webui
docker rm open-webui

Step 2: Restart AnythingLLM

cd /opt/anythingllm
docker-compose up -d

Step 3: Access old Codex at http://38.68.14.26:3001 (same URL as before)

Time to rollback: 2 minutes
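The rollback steps above can be captured in a single script ahead of time, so they are ready before they are needed. This is a sketch; the script name is our choice, and the paths match the ones documented above:

```shell
# Write the documented rollback commands to an executable script
cat > codex-rollback.sh << 'EOF'
#!/bin/bash
set -e
echo "Stopping Open WebUI..."
docker stop open-webui
docker rm open-webui
echo "Restarting AnythingLLM..."
cd /opt/anythingllm
docker-compose up -d
echo "Rollback complete: old Codex back at http://38.68.14.26:3001"
EOF
chmod +x codex-rollback.sh
```

Keep the script next to the sync scripts on TX1 so the two-minute rollback is a single command.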


📋 POST-MIGRATION CHECKLIST

After migration is complete:

  • Open WebUI accessible at http://38.68.14.26:3001
  • Ollama connection working (models listed)
  • Operations workspace created with digest uploaded
  • Test queries return accurate, current information
  • Additional workspaces created (Brainstorming, etc.)
  • User accounts created (gingerfury, Unicorn20089)
  • Sync script updated and tested
  • Old AnythingLLM containers stopped (but not deleted)
  • Documentation updated in Git

Optional cleanup (after 1 week of successful use):

  • Remove AnythingLLM Docker volumes (frees ~500MB)
  • Remove old sync script /root/codex-sync.sh
  • Update PROJECT-SCOPE with new architecture

📊 EXPECTED IMPROVEMENTS

Performance:

  • Faster responses (Open WebUI more efficient)
  • No indexing lag (single file vs 319 files)
  • Lower memory usage

Accuracy:

  • Better retrieval (clean context vs noisy corpus)
  • Current info prioritized (packaged digest vs mixed archives)
  • Fewer "hallucinations" (clearer document structure)

Maintenance:

  • Simpler updates (re-run packager vs re-sync 319 files)
  • Clear versioning (digest is dated file)
  • Easier to troubleshoot (one file to check)

🆘 TROUBLESHOOTING

If Open WebUI won't start:

# Check logs
docker logs open-webui

# Common issues:
# - Port 3001 already in use (make sure AnythingLLM is stopped)
# - Ollama not accessible (check host.docker.internal)

If Ollama connection fails:

# Test from inside container
docker exec open-webui curl http://host.docker.internal:11434/api/tags

# If that fails, Ollama might not be running
systemctl status ollama

If upload fails:

  • Check file size: ls -lh firefrost-ops-digest.md
  • File too large? Re-run Repomix with ignore patterns (e.g., repomix --ignore "archive/**") to shrink the digest
  • Try uploading via web UI instead of API

If queries return poor results:

  • Check which model is being used
  • Verify digest file uploaded correctly
  • Try more specific queries
  • May need to adjust retrieval settings

📝 NOTES

Why Repomix instead of manual curation?

  • Automatic exclusion of .git, node_modules, binaries
  • Intelligent structure preservation
  • Consistent formatting for LLMs
  • Easy to regenerate when repo updates
  • Industry-standard tool (maintained, documented)

Why Open WebUI instead of alternatives?

  • Most mature AnythingLLM alternative
  • Active development community
  • Better performance than AnythingLLM
  • More features (function calling, tools)
  • Docker-based (consistent with our stack)

Can we keep AnythingLLM running alongside?

  • Yes, but they'd need different ports
  • Not recommended (confusing to have both)
  • Better to fully commit to one approach


✅ SUCCESS CRITERIA

Migration is successful when:

  1. Open WebUI accessible and responsive
  2. Operations manual digest uploaded
  3. Test queries return current, accurate information
  4. Response time under 10 seconds
  5. Multiple workspaces functional
  6. User accounts working with correct permissions
  7. No errors in Docker logs

At that point: Phase 2 can be considered COMPLETE


Fire + Frost + Foundation + Better Tools = Where Love Builds Legacy 💙🔥❄️

Document Status: Ready to execute
Next Action: Run Step 1 when ready
Estimated Completion: 1 hour from start to finish