📄 SUMMARY
Complete step-by-step plan to replace AnythingLLM with Open WebUI + Repomix.
Problem: AnythingLLM with 319 files has poor retrieval quality
Solution: Open WebUI (better RAG) + Repomix (single packaged digest)
Migration includes:
- Install Repomix to package the operations manual
- Replace AnythingLLM with Open WebUI (same port)
- Upload a single digest file instead of 319 individual docs
- Recreate workspaces and user accounts
- Update the sync script to use Repomix
Expected improvements:
- Better search relevance (clean context vs noisy corpus)
- Faster responses (efficient RAG engine)
- Simpler maintenance (re-run packager vs re-sync files)
Risk: LOW - can roll back to AnythingLLM in 2 minutes
Time: ~1 hour total
Status: Ready to execute when Michael is ready
Document: CODEX-MIGRATION-001
Firefrost Codex - Migration to Open WebUI + Repomix
Document ID: CODEX-MIGRATION-001
Created: February 21, 2026
Status: 📋 PLANNED - Ready to Execute
Estimated Time: 1 hour
Risk Level: LOW (can rollback to AnythingLLM if needed)
🎯 OBJECTIVE
Replace AnythingLLM with Open WebUI + Repomix workflow to solve document retrieval quality issues.
Current Problem:
- AnythingLLM indexing 319 files creates poor search relevance
- AI finds old/archived docs instead of current operational info
- Vector search overwhelmed by document volume and similarity
Solution:
- Open WebUI: More efficient RAG engine, better performance
- Repomix: Packages entire operations manual into single, clean file
- Combined: Fast + accurate retrieval without noise
📊 COMPARISON
| Feature | AnythingLLM (Current) | Open WebUI + Repomix |
|---|---|---|
| Documents | 319 individual files | 1 packaged digest |
| Search Quality | Poor (finds old docs) | Good (clean context) |
| Performance | Slow indexing | Fast responses |
| Updates | Re-sync 319 files | Re-run packager |
| Maintenance | High (vector DB) | Low (single file) |
| Cost | $0/month | $0/month |
⚠️ PREREQUISITES
Before starting:
- SSH access to TX1 (38.68.14.26)
- Docker running on TX1
- Ollama running on TX1 (port 11434)
- Git repo cloned at /root/firefrost-operations-manual
- ~1 hour of focused time (matches the estimate above)
- Backup plan (can restart AnythingLLM if needed)
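The prerequisites above can be checked mechanically before starting. The following is a hedged preflight sketch, not part of the original plan: paths and the Ollama port are taken from this document, and the checks only report problems rather than aborting.

```shell
#!/bin/sh
# Preflight sketch for the migration prerequisites (assumed, adjust as needed).
REPO_DIR="${REPO_DIR:-/root/firefrost-operations-manual}"
fail=0

require_cmd() {
  # Report whether a required command is on PATH.
  if command -v "$1" >/dev/null 2>&1; then
    echo "OK: $1 found"
  else
    echo "MISSING: $1"
    fail=1
  fi
}

require_cmd docker
require_cmd git
require_cmd npm   # needed for the Repomix install in Step 1

if [ -d "$REPO_DIR/.git" ]; then
  echo "OK: repo present at $REPO_DIR"
else
  echo "MISSING: $REPO_DIR"
  fail=1
fi

# Ollama reachability on the documented port (non-fatal warning only)
if curl -sf http://localhost:11434/api/tags >/dev/null 2>&1; then
  echo "OK: Ollama responding on :11434"
else
  echo "WARN: Ollama not reachable on :11434"
fi
```

Run it once on TX1 before Step 1; any MISSING line means a prerequisite is not met.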
🛠️ MIGRATION STEPS
Step 1: Install Repomix (5 minutes)
SSH to TX1:
ssh root@38.68.14.26
Install Repomix globally:
npm install -g repomix
Verify installation:
repomix --version
Expected output: Version number (e.g., repomix v2.x.x)
Step 2: Generate Operations Manual Digest (5 minutes)
Navigate to operations manual:
cd /root/firefrost-operations-manual
Run Repomix to package entire repo:
repomix --output firefrost-ops-digest.md
What this does:
- Scans entire repository
- Automatically excludes .git, node_modules, binaries
- Creates single markdown file with intelligent structure
- Optimized for LLM consumption
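If archived docs are the main source of retrieval noise, Repomix can be told to skip them. Below is a hedged sketch of a minimal repomix.config.json; the key names follow the Repomix config schema as I understand it (verify against the Repomix docs), and the "archive/**" pattern is hypothetical — substitute the repo's real archive paths.

```shell
#!/bin/sh
# Write a minimal Repomix config next to the repo root (run from $REPO_DIR).
# "archive/**" is a placeholder pattern, not a path from this document.
cat > repomix.config.json << 'EOF'
{
  "output": {
    "filePath": "firefrost-ops-digest.md",
    "style": "markdown"
  },
  "ignore": {
    "useGitignore": true,
    "customPatterns": ["archive/**", "*.log"]
  }
}
EOF
echo "Wrote repomix.config.json"
```

With this in place, plain `repomix` (no flags) should pick up the config on each run.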
Verify the digest:
ls -lh firefrost-ops-digest.md
wc -l firefrost-ops-digest.md
Expected: File should be several thousand lines, representing all docs
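The two verify commands above can be wrapped into a reusable check. This is a hedged sketch: the minimum line count is an assumed threshold for "several thousand lines", not a Repomix guarantee.

```shell
#!/bin/sh
# Sanity-check that a generated digest exists and looks complete.
# min_lines defaults to 1000 (an assumption; tune to your repo).
check_digest() {
  file="$1"
  min_lines="${2:-1000}"
  if [ ! -s "$file" ]; then
    echo "FAIL: $file missing or empty"
    return 1
  fi
  lines=$(wc -l < "$file")
  if [ "$lines" -ge "$min_lines" ]; then
    echo "OK: $file has $lines lines"
  else
    echo "WARN: $file has only $lines lines (expected >= $min_lines)"
    return 1
  fi
}
# Usage: check_digest /root/firefrost-operations-manual/firefrost-ops-digest.md
```

A nonzero exit here is a signal to re-check Repomix's ignore patterns before uploading.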
Step 3: Stop AnythingLLM (2 minutes)
Navigate to AnythingLLM directory:
cd /opt/anythingllm
Stop the containers:
docker-compose down
(If TX1 uses the Compose v2 plugin instead, the equivalent is: docker compose down)
Verify stopped:
docker ps | grep anythingllm
Expected: No output (containers stopped)
Optional - Remove the data volumes too (frees space, but prevents rollback):
# Only if you want to free up space
# docker-compose down -v # This removes volumes as well as containers
Note: We're keeping the data volumes in case we need to rollback.
Step 4: Install Open WebUI (10 minutes)
Pull and run Open WebUI container:
docker run -d \
-p 3001:8080 \
--name open-webui \
--add-host=host.docker.internal:host-gateway \
-v open-webui:/app/backend/data \
-e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
--restart always \
ghcr.io/open-webui/open-webui:main
Breakdown:
- -p 3001:8080 - Same port as old Codex (http://38.68.14.26:3001)
- --add-host - Allows container to reach host's Ollama
- -v open-webui:/app/backend/data - Persistent storage
- -e OLLAMA_BASE_URL - Points to existing Ollama instance
- --restart always - Auto-start on server reboot
Verify container running:
docker ps | grep open-webui
Expected: Container should show as "Up"
Check logs:
docker logs open-webui
Expected: Should see startup messages, no errors
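The container can take a little while to serve its first request after `docker ps` shows "Up". Rather than guessing, a small polling helper (a hedged sketch; URL and timeout are assumptions) can confirm readiness before moving to Step 5:

```shell
#!/bin/sh
# Poll a URL until it answers with a success status, or give up.
# Usage: wait_for_url http://localhost:3001 120
wait_for_url() {
  url="$1"
  timeout="${2:-60}"
  elapsed=0
  while [ "$elapsed" -lt "$timeout" ]; do
    if curl -sf -o /dev/null "$url"; then
      echo "UP: $url (after ${elapsed}s)"
      return 0
    fi
    sleep 2
    elapsed=$((elapsed + 2))
  done
  echo "DOWN: $url after ${timeout}s"
  return 1
}
```

On TX1 this would be called as `wait_for_url http://localhost:3001 120` right after the `docker run`.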
Step 5: Initial Setup via Web UI (10 minutes)
Access Open WebUI:
- Open browser to: http://38.68.14.26:3001
- First visit will prompt to create admin account
- Create account:
- Username: mkrause612
- Email: (your email)
- Password: (secure password)
Verify Ollama connection:
- Click Settings (gear icon)
- Go to "Connections"
- Should show Ollama at http://host.docker.internal:11434
- Should list available models (qwen2.5-coder:7b, llama3.3:70b)
If models don't show:
# Verify Ollama is accessible from container
docker exec open-webui curl -s http://host.docker.internal:11434/api/tags
Step 6: Upload Operations Manual Digest (5 minutes)
In Open WebUI:
- Click "Documents" in sidebar
- Click "Upload Documents"
- Select /root/firefrost-operations-manual/firefrost-ops-digest.md
- Upload completes
- Document should appear in list
Alternative - Upload via command line:
# Get file size to verify
ls -lh /root/firefrost-operations-manual/firefrost-ops-digest.md
# File is uploaded via web UI (easier than API for first time)
Step 7: Create Workspace and Test (10 minutes)
Create Operations workspace:
- Click "Workspaces" (or similar)
- Create new workspace: "Operations"
- Attach the firefrost-ops-digest.md document
- Select model: qwen2.5-coder:7b
Test queries:
Query 1: Current tasks
What are the current Tier 0 tasks according to tasks.md?
Expected: Should list actual current tasks (Whitelist Manager, NC1 Cleanup, etc.)
Query 2: Infrastructure
What servers does Firefrost Gaming operate and what are their IP addresses?
Expected: Should list TX1, NC1, Command Center, etc. with correct IPs
Query 3: Recent work
What was accomplished in the most recent Codex session?
Expected: Should reference Phase 2 work, workspaces created, etc.
If responses are accurate: ✅ Migration successful!
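If a test query comes back wrong, it helps to separate model problems from RAG problems by asking the model directly through Ollama's documented /api/generate endpoint. A hedged sketch (DRY_RUN mode prints the request instead of sending it; the simple JSON templating assumes the prompt contains no double quotes):

```shell
#!/bin/sh
# Query the model directly, bypassing Open WebUI's retrieval layer.
OLLAMA_URL="${OLLAMA_URL:-http://localhost:11434}"
MODEL="${MODEL:-qwen2.5-coder:7b}"

ask() {
  # Note: naive templating -- the prompt must not contain double quotes.
  payload=$(printf '{"model":"%s","prompt":"%s","stream":false}' "$MODEL" "$1")
  if [ "${DRY_RUN:-0}" = "1" ]; then
    echo "POST $OLLAMA_URL/api/generate $payload"
  else
    curl -s "$OLLAMA_URL/api/generate" -d "$payload"
  fi
}
# Usage: ask "What servers does Firefrost Gaming operate?"
```

If the direct answer is sensible but the workspace answer is not, the problem is in retrieval settings or the digest, not the model.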
Step 8: Create Additional Workspaces (10 minutes)
Replicate the workspace structure:
- Operations (already created)
  - Document: firefrost-ops-digest.md
  - Model: qwen2.5-coder:7b
  - Users: Admins only
- Brainstorming
  - Document: firefrost-ops-digest.md
  - Model: llama3.3:70b (for deeper thinking)
  - Users: Admins only
- Public KB
  - Document: (none yet - future public docs)
  - Model: qwen2.5-coder:7b
  - Users: Public access (future)
- Subscriber KB
  - Document: (none yet - future subscriber docs)
  - Model: qwen2.5-coder:7b
  - Users: Subscriber access (future)
- Relationship
  - Document: firefrost-ops-digest.md (relationship docs are included in it)
  - Model: qwen2.5-coder:7b
  - Users: Admins only
- Pokerole Project
  - Document: (none yet - needs pokerole repo)
  - Model: qwen2.5-coder:7b
  - Users: Admins + Holly
Step 9: Create User Accounts (5 minutes)
In Open WebUI Settings → Users:
- gingerfury (Meg)
  - Role: Admin
  - Email: (Meg's email)
  - Temporary password (she can change it)
- Unicorn20089 (Holly)
  - Role: User (not Admin)
  - Email: (Holly's email)
  - Access: Pokerole Project workspace only
Note: Open WebUI has a different permission model than AnythingLLM, so workspace access may need to be configured differently.
Step 10: Document Sync Automation (10 minutes)
Update the sync workflow:
Instead of syncing individual files, we now just re-run Repomix.
Create new sync script:
cat > /root/codex-sync-openwebui.sh << 'EOFSCRIPT'
#!/bin/bash
# Firefrost Codex - Open WebUI Sync Script
# Updates the operations manual digest
set -e
REPO_DIR="/root/firefrost-operations-manual"
DIGEST_FILE="$REPO_DIR/firefrost-ops-digest.md"
echo "=== Firefrost Codex Sync (Open WebUI) ==="
echo "Started: $(date)"
echo ""
# Pull latest from Git
cd "$REPO_DIR"
echo "📥 Pulling latest from Git..."
git pull
echo ""
# Regenerate digest
echo "📦 Regenerating operations manual digest..."
repomix --output firefrost-ops-digest.md
echo ""
# Show file info
echo "✅ Digest updated:"
ls -lh "$DIGEST_FILE"
echo ""
echo "📌 Next step: Upload firefrost-ops-digest.md to Open WebUI"
echo " Location: http://38.68.14.26:3001"
echo ""
echo "✅ Sync complete: $(date)"
EOFSCRIPT
chmod +x /root/codex-sync-openwebui.sh
Test the new sync script:
/root/codex-sync-openwebui.sh
Then manually upload the new digest via web UI
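The regeneration half of the workflow can also be scheduled so the digest is always fresh when someone uploads it. A hedged cron sketch (the 03:00 schedule and log path are assumptions, not from this document; add via `crontab -e` on TX1):

```shell
# Regenerate the operations manual digest nightly at 03:00.
# The upload to Open WebUI still happens manually afterwards.
0 3 * * * /root/codex-sync-openwebui.sh >> /var/log/codex-sync.log 2>&1
```

Keep the manual upload step until an automated upload path is verified.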
🔄 ROLLBACK PLAN
If Open WebUI doesn't work or has issues:
Step 1: Stop Open WebUI
docker stop open-webui
docker rm open-webui
Step 2: Restart AnythingLLM
cd /opt/anythingllm
docker-compose up -d
Step 3: Access old Codex
- URL: http://38.68.14.26:3001
- All data still intact (we didn't remove volumes)
Time to rollback: 2 minutes
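The three rollback steps above can live in one script so they can be run under pressure without thinking. A hedged sketch: by default it only prints the commands; set APPLY=1 on TX1 to execute them. The `--project-directory` flag (a standard docker-compose option) avoids the cd into /opt/anythingllm.

```shell
#!/bin/sh
# Rollback sketch: stop Open WebUI, restart AnythingLLM.
# Prints commands by default; APPLY=1 actually runs them.
run() {
  if [ "${APPLY:-0}" = "1" ]; then
    "$@"
  else
    echo "+ $*"
  fi
}

run docker stop open-webui
run docker rm open-webui
run docker-compose --project-directory /opt/anythingllm up -d
echo "Rollback issued; Codex should answer again at http://38.68.14.26:3001"
```

Dry-running it once (no APPLY) right after migration is a cheap way to confirm the rollback path still matches reality.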
📋 POST-MIGRATION CHECKLIST
After migration is complete:
- Open WebUI accessible at http://38.68.14.26:3001
- Ollama connection working (models listed)
- Operations workspace created with digest uploaded
- Test queries return accurate, current information
- Additional workspaces created (Brainstorming, etc.)
- User accounts created (gingerfury, Unicorn20089)
- Sync script updated and tested
- Old AnythingLLM containers stopped (but not deleted)
- Documentation updated in Git
Optional cleanup (after 1 week of successful use):
- Remove AnythingLLM Docker volumes (frees ~500MB)
- Remove old sync script /root/codex-sync.sh
- Update PROJECT-SCOPE with new architecture
📊 EXPECTED IMPROVEMENTS
Performance:
- Faster responses (Open WebUI more efficient)
- No indexing lag (single file vs 319 files)
- Lower memory usage
Accuracy:
- Better retrieval (clean context vs noisy corpus)
- Current info prioritized (packaged digest vs mixed archives)
- Fewer "hallucinations" (clearer document structure)
Maintenance:
- Simpler updates (re-run packager vs re-sync 319 files)
- Clear versioning (digest is dated file)
- Easier to troubleshoot (one file to check)
🆘 TROUBLESHOOTING
If Open WebUI won't start:
# Check logs
docker logs open-webui
# Common issues:
# - Port 3001 already in use (make sure AnythingLLM is stopped)
# - Ollama not accessible (check host.docker.internal)
If Ollama connection fails:
# Test from inside container
docker exec open-webui curl http://host.docker.internal:11434/api/tags
# If that fails, Ollama might not be running
systemctl status ollama
If upload fails:
- Check file size: ls -lh firefrost-ops-digest.md
- File too large? Repomix should auto-limit, but check
- Try uploading via web UI instead of API
If queries return poor results:
- Check which model is being used
- Verify digest file uploaded correctly
- Try more specific queries
- May need to adjust retrieval settings
📝 NOTES
Why Repomix instead of manual curation?
- Automatic exclusion of .git, node_modules, binaries
- Intelligent structure preservation
- Consistent formatting for LLMs
- Easy to regenerate when repo updates
- Industry-standard tool (maintained, documented)
Why Open WebUI instead of alternatives?
- Most mature AnythingLLM alternative
- Active development community
- Better performance than AnythingLLM
- More features (function calling, tools)
- Docker-based (consistent with our stack)
Can we keep AnythingLLM running alongside?
- Yes, but they'd need different ports
- Not recommended (confusing to have both)
- Better to fully commit to one approach
🔗 RESOURCES
- Open WebUI Docs: https://docs.openwebui.com/
- Repomix GitHub: https://github.com/yamadashy/repomix
- Ollama API: http://38.68.14.26:11434/api/tags
- New Codex URL: http://38.68.14.26:3001
✅ SUCCESS CRITERIA
Migration is successful when:
- ✅ Open WebUI accessible and responsive
- ✅ Operations manual digest uploaded
- ✅ Test queries return current, accurate information
- ✅ Response time under 10 seconds
- ✅ Multiple workspaces functional
- ✅ User accounts working with correct permissions
- ✅ No errors in Docker logs
At that point: Phase 2 can be considered COMPLETE ✅
Fire + Frost + Foundation + Better Tools = Where Love Builds Legacy 💙🔥❄️
Document Status: Ready to execute
Next Action: Run Step 1 when ready
Estimated Completion: 1 hour from start to finish