Fresh start - excluded large ROM JSON files
25
memory/projects/coding-workflow.md
Normal file
@@ -0,0 +1,25 @@
# Coding Setup & Workflow

## Environment
- **Primary Channel:** Discord #coding (1468627455656067074)
- **Repo:** whoisfrost/controls-web
- **Model:** kimi-k2.5:cloud (switched from deepseek-v3.2 for less token bloat)

## Git Workflow
- **main:** Stable/production branch
- **dev:** Testing/playground branch
- **Feature branches:** Create → merge to dev → merge dev to main
- **Process:** Work PC pushes to dev → Dev server pulls dev → Live server pulls main

## LAMP Stack
- **Dashboard:** `C:\web\htdocs\dashboard` (active)
- **Web Root:** `C:\web\htdocs\`
- **Old Dashboard:** `/var/www/html/dashboardv2/` (ignore)

## Tools
- Shell: Bash
- SSH, curl, jq

---

*Trigger words: coding, git, repo, workflow, lamp, dashboard, controls-web*
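The merge flow above (feature → dev → main) can be sketched as a helper that just emits the git commands in order; the branch names come from this workflow, while the function and example feature name are illustrative:

```python
def merge_flow_commands(feature: str) -> list[str]:
    """Git commands for the flow above: feature -> dev, then dev -> main."""
    return [
        "git checkout dev",
        f"git merge {feature}",
        "git push origin dev",    # work PC pushes dev; the dev server pulls it
        "git checkout main",
        "git merge dev",
        "git push origin main",   # the live server pulls main
    ]

if __name__ == "__main__":
    for cmd in merge_flow_commands("feature/login-page"):
        print(cmd)
```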
59
memory/projects/discord-reminder-system.md
Normal file
@@ -0,0 +1,59 @@
# Discord Reminder System

**Status:** ✅ Live
**Channel:** #schedule (1474636036905631867)
**Created:** 2026-02-25
**Files:**
- `tools/reminder-manager.py` - Core Python logic
- `tools/reminder-handler.ps1` - PowerShell wrapper
- `skills/discord-reminder/SKILL.md` - Documentation
- `data/reminders.db` - SQLite database

## How It Works

1. User says: `remind me 20m Call John`
2. I parse the natural-language time (20m, 2h, tomorrow 9am, etc.)
3. Create an OpenClaw cron job for the exact time
4. At trigger time, the cron fires → sends a Discord message
5. One-shot jobs auto-delete after firing

## Supported Time Formats

| Format | Example |
|--------|---------|
| Minutes | `20m`, `5m` |
| Hours | `2h`, `1h30m` |
| Today time | `9am`, `2:30pm`, `15:00` |
| Tomorrow | `tomorrow`, `tomorrow 9am` |

## Commands

- `remind me [time] [message]` - Add reminder
- `list reminders` - Show active reminders
- `cancel reminder #[id]` - Cancel by ID

## Technical Details

- **Backend:** OpenClaw cron + SQLite
- **Delivery:** systemEvent → Discord channel
- **Timezone:** America/Chicago (automatic DST handling)
- **Cleanup:** Old reminders auto-delete via the Daily Cron Cleanup job

## Testing History

- ✅ 2026-02-25 - Basic 5m reminders working
- ✅ 2026-02-25 - 3pm same-day reminders working
- ✅ 2026-02-27 - Confirmed still working after 2 days

## Known Quirks

- Must say "remind me" — "remind 4 minutes" won't work (by design)
- One-shot reminders don't show in `list` after firing (auto-deleted)
- All times are America/Chicago

## Future Ideas

- [ ] Recurring reminders (daily, weekly)
- [ ] Snooze functionality
- [ ] Reminder categories/tags
- [ ] DM reminders (vs channel-only)
31
memory/projects/discord-voice-bot.md
Normal file
@@ -0,0 +1,31 @@
# Discord Voice Bot (GLaDOS)

## Overview
Voice-controlled Discord bot with conversational AI.

## Location
`/home/admin/.openclaw/workspace/discord-voice-bot/`

## Architecture (v2 - 2026-02-08)
- **TTS:** HTTP endpoint at 192.168.0.17:5050 ("glados" voice)
- **STT:** Whisper on 192.168.0.17:10300 (Wyoming protocol)
- **LLM:** Ollama at 192.168.0.17:11434 (qwen3-coder-next:cloud)
- **Auto-join:** Bot auto-joins the #coding voice channel on startup

## Commands
- `!join` - Join voice channel
- `!leave` - Leave voice channel
- `!test [text]` - Test TTS

## MCP Integration
- `list_projects()` - "What am I working on?"
- `create_project()` - Start new project
- `add_task()` - Add tasks via voice
- `update_task_status()` - Mark complete

## Status
✅ **Completed** — Operational and always available in the #coding voice channel

---

*Trigger words: voice bot, glados, discord bot, TTS, STT, wyoming, voice control*
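A minimal sketch of calling the TTS endpoint above. The host/port and voice name come from these notes; the route and payload shape are assumptions (an OpenAI-style `/v1/audio/speech` body is used purely for illustration):

```python
import json

TTS_BASE = "http://192.168.0.17:5050"

def tts_request(text: str, voice: str = "glados") -> tuple[str, bytes]:
    """Build the (url, json_body) pair for a speech request.

    Route and body fields are assumed, not confirmed by these notes.
    """
    body = {"input": text, "voice": voice}
    return f"{TTS_BASE}/v1/audio/speech", json.dumps(body).encode()
```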
26
memory/projects/home-assistant.md
Normal file
@@ -0,0 +1,26 @@
# Home Assistant

## Environment
- **URL:** http://192.168.0.39:8123
- **Auth Token:** eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiI0ZTcyMGRjZDkzODg0ZTFiOTE5YzkyMDVkZmNlMTY5NSIsImlhdCI6MTc3MzI1NTM5MiwiZXhwIjoyMDg4NjE1MzkyfQ.t7QYwdOP7w2s0tilu9A4zh5saezNRhveuXEJdz4ILKM
- **MQTT Broker:** 192.168.0.39:1883 (corey/41945549)
- **Status:** In Progress (entity cleanup ongoing)

## Room-Assistant (Bluetooth Presence)
- **Host:** 192.168.0.95 (livingroom-pi)
- **SSH Key:** `~/.ssh/id_ed25519` (ed25519)
- **Config:** `/home/admin/config/local.yml`
- **Phone MAC:** b0:c2:c7:07:28:b5
- **Entities:** `sensor.livingroom_cluster_size`, `sensor.livingroom_cluster_leader`
- **Note:** the mdns module is needed for cluster discovery (not critical)

## Channel
Discord #home-assistant (1466074219829006599)

## Current Tasks
- [ ] Entity cleanup (972 devices in spreadsheet)
- [ ] Fix hallway sensor battery (1%)

---

*Trigger words: home assistant, HA, mqtt, room-assistant, presence, sensor*
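Calling the HA REST API above with the bearer token looks like this (stdlib only; the entity ID is illustrative):

```python
import json
import urllib.request

HA_URL = "http://192.168.0.39:8123"

def auth_headers(token: str) -> dict:
    """Bearer auth header the Home Assistant REST API expects."""
    return {"Authorization": f"Bearer {token}"}

def get_state(entity_id: str, token: str) -> dict:
    """Fetch one entity's state via GET /api/states/<entity_id>."""
    req = urllib.request.Request(
        f"{HA_URL}/api/states/{entity_id}", headers=auth_headers(token)
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        return json.load(resp)  # includes "state", "attributes", "last_changed"

# e.g. get_state("sensor.livingroom_cluster_size", token)
```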
38
memory/projects/memory-system.md
Normal file
@@ -0,0 +1,38 @@
# Memory System

## Architecture
Hybrid system: file-based (human-readable) + SQLite (structured).

## Components

### File-Based Memory
- **Daily Notes:** `memory/YYYY-MM-DD.md`
- **Curated Memory:** `MEMORY.md` (index + always-load info)
- **Workspace Context:** `workspace-context.md` (temp state)

### SQLite Database
- **Location:** `~/.openclaw/memory.db`
- **Tables:** memory_cells, scenes, memory_fts
- **Extraction:** Memory Worker agent (3 AM daily)

### Cloud Backup
- **Service:** Supermemory.ai
- **Schedule:** Daily 2 AM
- **Script:** `scripts/backup-memory.py`

## Workers
- **Memory Worker:** Extracts structured data daily at 3 AM
- **Job Verifier:** Checks all overnight jobs at 9 AM

## Status
✅ **Complete** — Hierarchical restructure finished (Feb 25, 2026)
- Daily notes automation working
- MEMORY.md as curated index
- SQLite database with FTS
- Supermemory backups daily
- Memory worker extracting nightly
- Obsidian integration with Dataview

---

*Trigger words: memory, database, sqlite, supermemory, backup, extraction*
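The `memory_fts` table implies SQLite full-text search; here is the FTS5 pattern in a self-contained sketch (in-memory database, illustrative schema; the real `memory.db` columns may differ):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE VIRTUAL TABLE memory_fts USING fts5(content)")
db.execute("INSERT INTO memory_fts VALUES ('fixed the hallway sensor battery')")
db.execute("INSERT INTO memory_fts VALUES ('merged dev into main')")

# MATCH does tokenized full-text search, not substring matching
rows = db.execute(
    "SELECT content FROM memory_fts WHERE memory_fts MATCH ?", ("sensor",)
).fetchall()
print(rows)  # → [('fixed the hallway sensor battery',)]
```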
240
memory/projects/memory-vector-system.md
Normal file
@@ -0,0 +1,240 @@
# Memory Vector System

**Status:** ✅ Supermonkey-Powered
**Created:** 2026-03-02
**Replaces:** Supermemory cloud embedding API

---

## Overview

Local semantic memory search using SQLite + Ollama embeddings. No cloud dependency, no API limits, works offline.

## Architecture

### File-Based Pipeline
```
Memory Files (markdown)
↓
memory_embedding_worker.py
↓
Ollama (nomic-embed-text) → 768-dim vectors
↓
SQLite + sqlite-vector extension
↓
Cosine similarity search
```

### Real-Time Session Pipeline (NEW)
```
Discord/Chat Messages
↓
OpenClaw Session Transcript (.jsonl)
↓
session_monitor.py (cron every 2 min)
↓
Count messages → At 15: summarize
↓
Ollama (nomic-embed-text)
↓
SQLite + sqlite-vector
```

**Innovation:** We read OpenClaw's own session transcripts to auto-capture conversations without manual tracking!

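Both pipelines end in cosine-similarity search over the 768-dim vectors; the core computation is small enough to sketch with the stdlib (toy 3-dim vectors here, not real embeddings):

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """cos(theta) between two vectors: dot product over the product of norms."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

query = [0.2, 0.1, 0.9]
doc = [0.2, 0.1, 0.9]
print(round(cosine_similarity(query, doc), 3))  # identical vectors → 1.0
```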
## Components

### Core Module
**File:** `memory_vector.py`
- `MemoryVectorDB` class — database wrapper
- `store_memory()` — save embedding
- `search_memories()` — semantic search
- `setup_memory_vectors()` — one-time init

### Worker Scripts
**File:** `tools/memory_embedding_worker.py`
- Daily/batch processing for memory .md files
- Processes section-by-section
- Called by cron at 3 AM

**File:** `tools/session_monitor.py` ⭐ NEW
- Reads OpenClaw session transcripts live
- Tracks message counts automatically
- Creates snapshots every 15 messages
- Cron: every 2 minutes
- Database: `session_tracking` table

**File:** `tools/session_snapshotter.py`
- Manual session capture (legacy)
- Use session_monitor.py for auto-tracking

**File:** `tools/search_memories.py`
- CLI tool for manual searches
- Interactive or one-shot mode

**File:** `tools/bulk_memory_loader.py`
- One-time historical import
- Processed 1,186 embeddings on first run

**File:** `scripts/memory-embeddings-cron.ps1`
- PowerShell wrapper for the daily cron
- Checks Ollama availability

## Database Schema

### memory_embeddings
| Column | Type | Description |
|--------|------|-------------|
| id | INTEGER | Primary key |
| source_type | TEXT | "daily", "memory_md", "project", "session_snapshot", "auto_session" |
| source_path | TEXT | File path + section, or timestamp |
| content_text | TEXT | First 500 chars (searchable preview) |
| embedding | BLOB | 768-dim Float32 vector (3,072 bytes) |
| created_at | TIMESTAMP | Auto-set |

### session_tracking ⭐ NEW
| Column | Type | Description |
|--------|------|-------------|
| session_id | TEXT | OpenClaw session UUID |
| channel_key | TEXT | discord:channel_id |
| transcript_path | TEXT | Path to .jsonl file |
| last_message_index | INTEGER | Last processed line |
| messages_since_snapshot | INTEGER | Counter since last embed |
| last_checkpoint_time | TIMESTAMP | Last check |
| is_active | BOOLEAN | Session still exists? |

## Cron Schedule

| Job | Schedule | What It Does |
|-----|----------|--------------|
| **Memory Embeddings Daily** | 3:00 AM | Process yesterday's memory file |
| **Session Monitor** ⭐ NEW | Every 2 min | Reads transcripts, auto-snapshots at 15 msgs |
| Session Snapshots (legacy) | Manual | Manual capture via script |

**How Session Monitor Works:**
1. Reads `.openclaw/agents/main/sessions/*.jsonl`
2. Tracks `last_message_index` per session
3. Counts new user messages
4. At 15 messages: summarize → embed → store
5. Updates the checkpoint in the `session_tracking` table

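Counting user messages in a `.jsonl` transcript from a checkpoint (steps 2-3 of the monitor) can be sketched like this; the `role` field is an assumption about the transcript schema:

```python
import json

def count_user_messages(jsonl_text: str, start_index: int = 0) -> int:
    """Count role == "user" entries at or after start_index (the checkpoint)."""
    count = 0
    for i, line in enumerate(jsonl_text.splitlines()):
        if i < start_index or not line.strip():
            continue
        if json.loads(line).get("role") == "user":
            count += 1
    return count

transcript = (
    '{"role": "user", "content": "hi"}\n'
    '{"role": "assistant", "content": "hello"}\n'
    '{"role": "user", "content": "status?"}'
)
print(count_user_messages(transcript))  # → 2
```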
## Usage

### Search
```powershell
# Search by query
python tools/search_memories.py "home assistant automation"

# Interactive mode
python tools/search_memories.py --interactive
```

### Manual Snapshot
```powershell
python tools/session_snapshotter.py "Summary of important discussion"
```

### From Python
```python
from memory_vector import search_memories

# Generate query embedding with Ollama
# Then search
results = search_memories(query_embedding, k=5)
# Returns: [(source_path, content_text, distance), ...]
```

## Stats

| Metric | Value |
|--------|-------|
| Total embeddings | **1,623** |
| Daily notes | 818 |
| Project files | 332 |
| MEMORY.md | 33 |
| Manual session snapshots | 2 |
| **Auto session snapshots** ⭐ | **27** |
| Tracked sessions | 245 |
| Active sessions | 243 |
| Database size | ~5 MB |

**Live Stats Query:**
```powershell
python -c "import sqlite3; db=sqlite3.connect(r'C:\Users\admin\.openclaw\memory.db'); c=db.cursor(); c.execute('SELECT COUNT(*) FROM memory_embeddings'); print('Total:', c.fetchone()[0]); c.execute('SELECT COUNT(*) FROM memory_embeddings WHERE source_type=\'auto_session\''); print('Auto snapshots:', c.fetchone()[0]); c.execute('SELECT COUNT(*) FROM session_tracking WHERE is_active=1'); print('Active sessions:', c.fetchone()[0]); db.close()"
```

**Check Session Monitor Status:**
```powershell
# See last run
python tools/session_monitor.py

# Check if the cron is running
openclaw cron list | findstr "Session Monitor"
```

## The Innovation ⭐

**Problem:** How do we automatically capture live conversations without manual tracking?

**Solution:** Read OpenClaw's own session transcripts!

OpenClaw stores every session in `.openclaw/agents/main/sessions/[session-id].jsonl`. We discovered we can:

1. **Monitor these files live** — cron job every 2 minutes
2. **Track line position** — `last_message_index` checkpoint
3. **Count user messages** — parse the JSONL for `role: "user"`
4. **Auto-snapshot at threshold** — 15 messages → summarize → embed

**Why this matters:**
- ✅ No manual message counting
- ✅ Survives session restarts
- ✅ Multi-channel aware (each channel = a separate session file)
- ✅ No OpenClaw hooks required (we read their existing data)

**Credit:** Corey's genius idea 💡 *(Corey, 2026-03-03)*

---

## Comparison: Old vs New

| Feature | Supermemory | SQLite-Vector |
|---------|-------------|---------------|
| Cloud dependency | Required | None |
| API limits | Yes | No |
| Offline use | No | Yes |
| Embeddings stored | Cloud | Local |
| Search speed | Network latency | <100 ms local |
| Reliability | Crashing | Stable |
| Cost | API-based | Free |

## Troubleshooting

### Ollama not running
```powershell
# Check status
Invoke-RestMethod -Uri "http://localhost:11434/api/tags"

# Start Ollama
ollama serve
```

### Missing model
```powershell
ollama pull nomic-embed-text
```

### Database locked
Close any GUI tools (DB Browser) before running scripts.

## Future Enhancements

- [ ] Keyword filtering alongside vector search
- [ ] Date range queries
- [ ] Source type filtering (e.g., only projects)
- [ ] Embedding quality scoring
- [ ] Auto-summarization improvements

---

**Status:** Fully operational
**Last updated:** 2026-03-02
120
memory/projects/mission-control-dashboard-tlc.md
Normal file
@@ -0,0 +1,120 @@
# Mission Control Dashboard — TLC Project

**Status:** Parked (awaiting direction decision) — needs consolidation & refresh
**Created:** 2026-02-17 (Next.js) / 2026-02-20 (Python)
**Goal:** Combine both dashboards into one unified system

---

## Current State

We have **TWO** Mission Control dashboards:

### 1. Next.js Dashboard
| Attribute | Value |
|-----------|-------|
| **Location** | `C:\web\htdocs\mission-control` |
| **URL** | http://localhost:3000 |
| **Tech** | Next.js 14, TypeScript, Tailwind |
| **Status** | ✅ Running but needs updates |
| **Features** | Project cards, Kanban boards, Status widgets, Themes |

### 2. Python/Flask Dashboard
| Attribute | Value |
|-----------|-------|
| **Location** | `C:\web\htdocs\mission-control-py` |
| **URL** | http://localhost:5050 |
| **Tech** | Python/Flask, MySQL, HA API |
| **Status** | ✅ Running but needs updates |
| **Features** | Admin CRUD, Weather, Gateway status, Projects |

---

## The Problem

Two dashboards doing similar things. We need to decide:

| Decision | Status |
|----------|--------|
| **Pick one as primary?** | Undecided |
| **Merge features?** | Undecided |
| **Start fresh?** | Option on the table |

---

## Features to Preserve (From Both)

### From Next.js Version:
- ✅ Project cards with Kanban boards
- ✅ Drag-and-drop task management
- ✅ Real-time status widgets (Gateway, Discord, Cron)
- ✅ 4 themes (NASA Retro, Luxury Commander, Cyberpunk, Organic)
- ✅ Knowledge base reference cards
- ✅ GLaDOS voice control integration

### From Python Version:
- ✅ MySQL persistence (vs JSON files)
- ✅ Home Assistant API integration
- ✅ Weather widget
- ✅ Admin CRUD interface
- ✅ News tracking
- ✅ Gateway status with real endpoints

---

## Known Issues / TLC Needed

| Issue | Dashboard | Priority |
|-------|-----------|----------|
| Status widgets outdated | Both | High |
| Memory system not integrated | Both | High |
| Mobile responsiveness | Next.js | Medium |
| Theme switching flaky | Next.js | Low |
| No unified data source | Both | High |

---

## Possible Directions

### Option A: Refactor Next.js
- Keep Next.js as the frontend
- Add a MySQL backend (like the Python version)
- Port Python features over
- Add memory system widgets

### Option B: Enhance Python
- Keep Python as primary
- Add the theme system from Next.js
- Better UI/UX polish
- Mobile responsiveness

### Option C: Start Fresh
- New tech stack (Svelte? Vue?)
- Learn from both versions
- Design a unified architecture
- Full rebuild

### Option D: Hybrid API
- Python as the backend API
- Next.js as the frontend
- Separate concerns properly

---

## Next Steps (When We Pick This Up)

1. **Audit both dashboards** — what works, what doesn't
2. **Decide architecture** — A, B, C, or D
3. **Design a unified schema** — data models, APIs
4. **Migrate/merge features** — one dashboard to rule them all

---

**Notes:**
- Frigate is ✅ working (Docker)
- UniFi is 🟡 parked (auth issues; revisit later)
- This project is on hold until a direction is decided

---

*Trigger words: dashboard tlc, mission control refresh, combine dashboards, dashboard v2*
95
memory/projects/mission-control-dashboard.md
Normal file
@@ -0,0 +1,95 @@
# Mission Control Dashboard

## Overview
Next.js 14 dashboard for project management and system monitoring.

## Location
`C:\web\htdocs\mission-control`
**URL:** http://localhost:3000

## Status
🔄 **Consolidating** — Combined with the Python version
**See:** `mission-control-dashboard-tlc.md` for the unified project plan

**Note:** Both dashboards are running feature-complete but need a refresh/merge.

## Features

### Project Management
- Project cards with status (Planning, In Progress, Completed)
- Kanban task boards
- Drag-and-drop task management

### Status Widgets (Real-Time)
- OpenClaw gateway status
- Home Assistant connectivity
- Discord bot status
- GLaDOS MCP health
- Cron job monitoring

### Quick Actions
- Common operation buttons
- One-click system checks

### Knowledge Base
- Reference cards with commands
- API endpoint documentation
- Path references

## Tech Stack
- **Framework:** Next.js 14+ with App Router
- **Language:** TypeScript
- **Styling:** Tailwind CSS + shadcn/ui
- **Data:** Real-time polling APIs

## API Endpoints

### System Status
- `GET /api/gateway` — Live OpenClaw status (30s refresh)
- `GET /api/cron` — All cron jobs with status indicators
- `GET /api/backup` — Supermemory backup status

### Project Management
- `GET /api/projects` — List all projects
- `POST /api/projects` — Create new project
- `PUT /api/projects/[id]` — Update project
- `POST /api/projects/[id]/tasks` — Add task
- `PUT /api/projects/[id]/tasks/[taskId]` — Update task status

## Voice Control (GLaDOS Integration)

**MCP Tools:**
- `list_projects()` — "What am I working on?"
- `create_project(name)` — Start new project
- `add_task(project, task)` — Add task via voice
- `update_task_status(taskId, status)` — Mark complete
- `get_project_status()` — Check overall progress

## Themes
4 distinct themes created:
1. **NASA Retro** — 1969 CRT monitor aesthetic
2. **Luxury Commander** — Swiss editorial, watch-brand quality
3. **Cyberpunk Terminal** — Neon dystopian tech
4. **Organic Living** — Biomorphic, natural flows

Switch via: `themes\switch-theme.bat [1-4]`

## Data Storage
- **Projects:** `data/projects.json` (dynamic read/write)
- **Data Store:** `data/data-store.ts` (read/write utilities)

## Usage
```bash
cd C:\web\htdocs\mission-control
npm run dev
# Open http://localhost:3000
```

## Next Steps
- Add Home Assistant real-time data (once HA cleanup is complete)
- Mobile responsiveness tweaks
- PTT key binding (Spacebar hold)

---

*Trigger words: dashboard, mission control, nextjs, projects, kanban, status*
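The `data/projects.json` read-modify-write cycle can be sketched as below; the project fields are illustrative, since the real schema lives in `data/data-store.ts`:

```python
import json
from pathlib import Path

def add_project(store: Path, name: str, status: str = "Planning") -> list[dict]:
    """Append a project to the JSON store and write it back (fields assumed)."""
    projects = json.loads(store.read_text()) if store.exists() else []
    projects.append({"name": name, "status": status, "tasks": []})
    store.write_text(json.dumps(projects, indent=2))
    return projects
```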
189
memory/projects/mission-control-python.md
Normal file
@@ -0,0 +1,189 @@
---
## Mission Control Dashboard (Python) — Consolidating

**Status:** 🔄 Combined with the Next.js version — see the TLC project
**Location:** `C:\web\htdocs\mission-control-py`
**URL:** http://localhost:5050
**URL (network):** http://192.168.0.34:5050
**Tech:** Python/Flask + MySQL + Home Assistant API

---

### Quick Reference

| Command | Where |
|---------|-------|
| Run server | `cd C:\web\htdocs\mission-control-py && python app.py` |
| Restart | Server auto-reloads on file changes |
| Port | 5050 (Frigate-safe) |

---

### Pages

| Route | File | Purpose |
|-------|------|---------|
| `/` | `dashboard.html` | Home: Gateway status, Cron jobs, Weather, Projects, Tasks |
| `/admin` | `admin.html` | CRUD for Projects/Tasks in MySQL |
| `/news` | `news.html` | Last 10 news briefs (JSON archive) |
| `/alerts` | `alerts.html` | System alerts from the Job Verifier cron |
| `/calendar` | `calendar.html` | HA calendar + 24 F1 races |
| `/memory` | `memory.html` | Memory file stats, cleanup candidates |

---

### File Structure

```
mission-control-py/
├── app.py                 # Main Flask app, routes
├── api/
│   ├── gateway.py         # get_gateway_status(), get_cron_jobs()
│   ├── ha_calendar.py     # HA calendar API wrapper
│   └── memory_files.py    # Memory file scanner
├── templates/
│   ├── base.html          # Master template
│   ├── dashboard.html     # Home page
│   ├── admin.html         # Admin CRUD
│   ├── news.html          # News briefs archive
│   ├── alerts.html        # System alerts
│   ├── calendar.html      # HA calendar
│   └── memory.html        # Memory file stats
├── static/
│   └── style.css          # All styles
├── outputs/
│   ├── news/              # YYYY-MM-DD.json, latest.json
│   └── alerts/            # latest.json
└── models/
    └── models.py          # Project, Task (MySQL)
```

---

### Features

#### Caching (5-minute TTL)
- Gateway status cached in the `_cache` dict
- Cron jobs cached
- Weather cached
- Location: `app.py` → `_CACHE_TTL = timedelta(minutes=5)`

#### Weather
- Source: `weather.home` HA entity
- API: `GET /api/states/weather.home`
- Location: `get_weather()` in `app.py`
- Fallback: Shows "--" if HA is unavailable

#### Calendar
- Source: HA `calendar.appointments` entity
- Fetch API: `GET /api/calendars/calendar.appointments?start=...&end=...`
- Add API: `POST /api/services/calendar/create_event`
- Location: `api/ha_calendar.py`
- **Known Issue:** The date range filter may not work correctly (March 3 event not showing)

#### News Briefs
- Source: Cron job output JSON
- Files: `outputs/news/latest.json` + `outputs/news/YYYY-MM-DD.json`
- Format: `{title, date, generated_at, content, sources, summary}`
- Crontab: Daily 8 AM → `openclaw cron run 3cf50074-...`

#### Alerts
- Source: Job Verifier cron output JSON
- File: `outputs/alerts/latest.json`
- Format: `{date, generated_at, alerts: [{level, message, time, ...}]}`
- Crontab: Daily 9 AM → `openclaw cron run 8e2b06ee-...`

#### Memory Files
- Source: `C:\Users\admin\.openclaw\workspace\memory\*.md`
- Features: Line count, TODO count, cleanup scoring
- Location: `api/memory_files.py`

---

### Known Issues & Fixes

| Issue | Status | Fix |
|-------|--------|-----|
| Calendar events beyond 10 days not showing | **KNOWN** | Date range filter bug — March 3 event (11 days out) not displaying |
| Double "°F" in weather | FIXED | Strip the HA unit before appending; `get_weather()` in `app.py` |
| Calendar agent creates cron jobs instead of calendar events | **KNOWN** | Agent uses `cron.add` instead of `ha_calendar.add()` — needs fix |

---

### HA API Reference

**Base URL:** `http://192.168.0.39:8123`
**Token:** See `TOOLS.md` or `api/ha_calendar.py`

| Endpoint | Method | Description |
|----------|--------|-------------|
| `/api/states/weather.home` | GET | Current weather state |
| `/api/calendars` | GET | List calendars |
| `/api/calendars/calendar.appointments` | GET | Get appointments (params: `start`, `end`) |
| `/api/services/calendar/create_event` | POST | Add event (body: `entity_id`, `summary`, `start_date_time`, `end_date_time`) |

**Auth:** Header `Authorization: Bearer <token>`

---

### Cron Jobs Integrated

| Job Name | ID | Schedule | Output |
|----------|-----|----------|--------|
| Daily News Brief | `3cf50074-8736-423e-ac41-6e6fb807bfee` | `0 8 * * *` (8 AM) | JSON + Discord + /news |
| Job Verifier Daily | `8e2b06ee-f2ea-4b33-ae39-23cf04b1a657` | `0 9 * * *` (9 AM) | JSON + /alerts |

---

### Configuration

**MySQL Connection:**
- Database: `alexai`
- Tables: `projects`, `tasks`
- Config: `config.py` → `SQLALCHEMY_DATABASE_URI`

**HA Token:**
- In `api/ha_calendar.py` and `app.py`
- Falls back to the env var `HA_TOKEN`

---

### Debugging Tips

1. **Check HA directly:**
```bash
python -c "
import requests
r = requests.get('http://192.168.0.39:8123/api/calendars/calendar.appointments',
                 params={'start': '2026-03-03T00:00:00', 'end': '2026-03-04T00:00:00'},
                 headers={'Authorization': 'Bearer <token>'})
print(r.json())
"
```

2. **Test the API directly:**
```python
from api.ha_calendar import get_calendar_events, add_calendar_event
events = get_calendar_events('calendar.appointments', days=14)
print([e['summary'] for e in events])
```

3. **Clear cache:** Restart the server (it auto-reloads)

4. **Check logs:** Terminal output shows "Starting Mission Control..."

---

### Changes Log

| Date | Change |
|------|--------|
| 2026-02-19 | Initial build, 5 pages |
| 2026-02-20 | Added weather, calendar, memory, F1 races (24 races added) |
| 2026-02-20 | Fixed double °F in weather |
| 2026-02-20 | Calendar agent created (has a bug: creates cron jobs instead of events) |

---

**Last Updated:** 2026-02-20
**Maintainer:** Corey's Agent
51
memory/projects/nzb-download-system.md
Normal file
@@ -0,0 +1,51 @@
# NZB Download System

**Status:** Operational
**Added:** 2026-03-09

## Overview

Automated NZB download pipeline via the NZBGeek API + SABnzbd.

## Workflow

OpenClaw -> pirate/ -> SABnzbd -> VPN -> downloads

## Components

- **NZBGeek API** (api.nzbgeek.info) - Source index
- **OpenClaw** (AIAgents) - Search, grab NZBs
- **pirate/** (workspace/pirate/) - Watch folder
- **SABnzbd** (VPN box, 192.168.0.17) - Download client
- **VPN** - Routes traffic

## Download Preferences

1. English only
2. Under 10 GB (default)
3. Atmos preferred

## API Key

ueQEr7DGNzdtit3AYCP3snwduLtJPaeD

## First Download

Tron (1982) - 2160p DSNP WEB-DL, DD+ 5.1 Atmos, 9.9 GB

---

## Future: Torrent Support

Same pirate/ folder for .torrent files via qBittorrent.

## Note

Once an NZB is dropped into pirate/, SABnzbd auto-picks it up. No manual intervention needed.

## Downloads

| Date | Title | Quality | Size | Status |
|------|-------|---------|------|--------|
| 2026-03-09 | Tron (1982) | 2160p DSNP WEB-DL DD+ 5.1 Atmos | 9.9 GB | Downloaded |
| 2026-03-13 | Good Luck Have Fun Don't Die | 1080p WEBRip x265 DDP5.1 | 2.2 GB | Queued |
| 2026-03-20 | We Bury the Dead (2024) | 1080p WebRip YTS x264 AAC5.1 | 1.84 GB | Queued |
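The watch-folder handoff in the note above reduces to writing the grabbed NZB into `pirate/` and letting SABnzbd poll the folder; this sketch (with an illustrative filename sanitizer, not part of the real pipeline) shows the pattern:

```python
import re
from pathlib import Path

PIRATE_DIR = Path("workspace/pirate")

def drop_nzb(title: str, nzb_bytes: bytes, folder: Path = PIRATE_DIR) -> Path:
    """Save NZB content into the watch folder under a filesystem-safe name."""
    safe = re.sub(r"[^\w.-]+", "_", title).strip("_")
    folder.mkdir(parents=True, exist_ok=True)
    target = folder / f"{safe}.nzb"
    target.write_bytes(nzb_bytes)
    return target
```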
126
memory/projects/obsidian-workflow.md
Normal file
@@ -0,0 +1,126 @@
# Obsidian Workflow Rules

**Purpose**: Define where and how to save content to the Obsidian vault.

When the user says "save this to Obsidian" or "save to vault", **READ THIS FILE FIRST**.

---

## Vault Location

```
C:\Users\admin\Documents\Corey
```

CLI: `notesmd-cli` (default vault set)

---

## Folder Structure & Rules

### Daily Notes/
**Purpose**: Daily journal / log
**Naming**: `YYYY-MM-DD.md` (auto-generated)
**Use for**: Daily logs, quick notes, "save to today"

### Projects/
**Purpose**: Active project documentation
**Naming**: `[Project Name]/[Topic].md`
**Use for**: Ongoing work, project-specific notes

#### Sub-folders:
- `Projects/ROM Library/` - ROM organization notes
- `Projects/Home Assistant/` - HA automations, configs
- `Projects/Coding/` - Code snippets, workflows

### Summaries/
**Purpose**: Content summaries (articles, videos, etc.)
**Naming**: `[Source] - [Title].md`
**Use for**: `/summarize` output, web article summaries

### Personal Notes/
**Purpose**: Non-work personal notes
**Naming**: Freeform
**Use for**: Ideas, thoughts, personal references

### Guides/
**Purpose**: How-to guides, references
**Naming**: `[Topic] Guide.md`
**Use for**: Tutorials, step-by-step instructions

### Home Assistant/
**Purpose**: Home Assistant specific
**Naming**: Device/automation specific
**Use for**: HA configs, automations, device notes

### OpenClaw/
**Purpose**: OpenClaw agent notes
**Naming**: Freeform
**Use for**: Agent configuration, workflow notes

### News Briefs/
**Purpose**: Daily/weekly news summaries
**Naming**: `News YYYY-MM-DD.md`
**Use for**: Daily news brief output

### Tasks/
**Purpose**: Task tracking
**Naming**: Freeform or by project
**Use for**: TODOs, action items

### UniFi/
**Purpose**: UniFi network notes
**Naming**: Device/issue specific
**Use for**: Network config, device inventory

### WORK/
**Purpose**: Work-related notes
**Naming**: By client/project
**Use for**: Work projects, client notes

---

## File Naming Conventions

- Use Title Case: `ROM Library Inventory.md`
- Date prefix for dated content: `2026-03-10 Event Log.md`
- No special characters except `-` and spaces
- Keep names readable; avoid abbreviations unless obvious

---

## Default Behaviors

| Content Type | Default Location |
|--------------|------------------|
| Daily log entry | Daily Notes/today |
| Project note | Projects/[Project]/ |
| Article summary | Summaries/ |
| How-to guide | Guides/ |
| Personal note | Personal Notes/ |
| Work note | WORK/ |

---

## Workflow Checklist

When saving to Obsidian:

1. ✅ Read this file first (obsidian-workflow.md)
2. ✅ Identify the content type
3. ✅ Choose the appropriate folder
4. ✅ Apply the naming convention
5. ✅ Create the note with `notesmd-cli create "[folder]/[name]" --content "..."`
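The routing decision in the checklist can be sketched as a small Python helper that maps a content type to its default folder and then shells out to `notesmd-cli`. The mapping mirrors the Default Behaviors table; the exact `notesmd-cli create` flags are taken from step 5 of the checklist but should be treated as an assumption.

```python
import subprocess
from datetime import date

# Mirrors the Default Behaviors table.
DEFAULT_LOCATIONS = {
    "daily": "Daily Notes",
    "project": "Projects",
    "summary": "Summaries",
    "guide": "Guides",
    "personal": "Personal Notes",
    "work": "WORK",
}

def note_path(content_type: str, name: str, project: str = "") -> str:
    """Pick the vault-relative path for a new note."""
    folder = DEFAULT_LOCATIONS[content_type]
    if content_type == "daily":
        return f"{folder}/{date.today():%Y-%m-%d}"
    if content_type == "project" and project:
        return f"{folder}/{project}/{name}"
    return f"{folder}/{name}"

def save_note(content_type: str, name: str, content: str, project: str = "") -> None:
    """Create the note via notesmd-cli (CLI flags assumed from the checklist)."""
    path = note_path(content_type, name, project)
    subprocess.run(["notesmd-cli", "create", path, "--content", content], check=True)
```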

---

## Tags

Default tags to include in frontmatter:
- `#project/[name]` - Project association
- `#status/active` or `#status/completed`
- `#type/[note/guide/summary/log]`

---

*Last Updated: 2026-03-10*
57
memory/projects/proton-mail-bridge.md
Normal file
@@ -0,0 +1,57 @@
# Proton Mail Bridge Integration

## Overview

Email integration via Proton Mail Bridge for local IMAP/SMTP access.

## Connection Details

- **IMAP Server:** 127.0.0.1:1143
- **SMTP Server:** 127.0.0.1:1025
- **Username:** alexthenerdyai@proton.me
- **Security:** AUTH=PLAIN (local connection only)
- **Bridge Version:** Proton Mail Bridge 03.22.00 - gluon
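A minimal connection check using Python's standard `imaplib`, matching the Bridge settings above. The password placeholder is an assumption: Bridge issues its own app password, distinct from the Proton account password.

```python
import imaplib

IMAP_HOST, IMAP_PORT = "127.0.0.1", 1143  # Bridge IMAP; plain auth is local-only

def folder_name(list_line: bytes) -> str:
    """Extract the mailbox name from an IMAP LIST response line,
    e.g. b'(\\HasNoChildren) "/" "INBOX"' -> 'INBOX'."""
    return list_line.decode().rsplit(' "/" ', 1)[-1].strip('"')

if __name__ == "__main__":
    # Bridge speaks plain IMAP on localhost; no TLS on this port.
    imap = imaplib.IMAP4(IMAP_HOST, IMAP_PORT)
    imap.login("alexthenerdyai@proton.me", "BRIDGE_APP_PASSWORD")
    status, folders = imap.list()
    print([folder_name(f) for f in folders])
    imap.logout()
```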

## Test Results (2026-02-18)

- ✅ Connected successfully to Bridge IMAP
- ✅ Authenticated with Bridge password
- ✅ Retrieved folder list (10 folders found)
- ✅ INBOX has 3 messages, all unread

## Available Folders

- INBOX
- Sent
- Drafts
- Starred
- Archive
- Spam
- Trash
- All Mail
- Folders
- Labels

## Capabilities

- Read emails from INBOX
- Search, mark read/unread, move messages
- Send emails via SMTP
- Full folder management

## Potential Uses

- Daily email digest
- Notifications
- Automated responses
- Email-based triggers

## Files Created

- `tools/proton_imap_test.py` — Initial test script
- `tools/proton_imap_simple.py` — Working IMAP test script

## Status

✅ **Functional** — Bridge connected and tested

## Next Steps

- [ ] Integrate with cron jobs for email-based actions
- [ ] Create an email notification system
- [ ] Set up email-to-Discord forwarding if desired

---

*Trigger words: proton, email, imap, smtp, bridge, mail*
51
memory/projects/research-agents.md
Normal file
@@ -0,0 +1,51 @@
# Research Agent Spawning

## Overview

System for spawning isolated sub-agents to handle research tasks asynchronously.

## Status

🔄 **In Progress** — Designing spawn and report workflows

## Goals

- Spawn sub-agents for independent research tasks
- Parallel execution for multiple research queries
- Automatic summarization and reporting back
- Resource management (don't overwhelm the system)

## Architecture Ideas

### Option 1: Cron-Based Research
- Spawn agents via cron jobs
- Results delivered via Discord
- Good for: scheduled research, news monitoring

### Option 2: On-Demand Spawning
- User triggers: `/research [topic]`
- Spawns an isolated session
- Reports back when complete
- Good for: deep dives, ad-hoc questions

### Option 3: Persistent Research Channel
- Dedicated Discord channel
- All research requests go there
- Agent monitors and spawns workers
- Good for: continuous research, collaborative work
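A rough sketch of how the on-demand option might cap concurrency, since resource management is one of the stated goals. Everything here is hypothetical design, not working agent code: `spawn_worker` stands in for whatever actually launches an isolated session.

```python
from concurrent.futures import ThreadPoolExecutor

MAX_AGENTS = 3  # hypothetical cap; "don't overwhelm the system"

def spawn_worker(topic: str) -> str:
    """Placeholder for launching an isolated research session."""
    return f"summary for {topic}"

def research_all(topics: list) -> dict:
    """Run research tasks in parallel, never more than MAX_AGENTS at once."""
    with ThreadPoolExecutor(max_workers=MAX_AGENTS) as pool:
        results = pool.map(spawn_worker, topics)
    return dict(zip(topics, results))
```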

## Current Exploration

Testing in the #projects channel (1468257895152881796)

## Questions to Answer

- [ ] How many concurrent agents?
- [ ] How to handle long-running research?
- [ ] Where to store results? (Obsidian? SQLite?)
- [ ] How to prevent spawn loops?
- [ ] Cost tracking per agent?

## Related

- [[Multi-User Agent Architecture]] — multi-user considerations
- [[Memory System]] — where research results might go

---

*Created: 2026-02-25*
*Next: Test spawning mechanisms*
||||
59
memory/projects/rom-library.md
Normal file
@@ -0,0 +1,59 @@
# ROM Library Organization

**Status**: In Progress
**Started**: 2026-03-09
**Location**: R:\ (38.8 TB)

## Goal

Phase 1: Inventory all ROMs across multiple gaming systems
Phase 2: Detect duplicates via MD5 hashing
Phase 3: Identify missing ROMs from No-Intro/Redump sets (future)

## Library Structure

```
R:/
├── Rom Sets (Organized)/
│   ├── Nintendo/
│   ├── Sony/
│   ├── Sega/
│   ├── Microsoft/
│   ├── Atari/
│   ├── Arcade/
│   ├── Computers/
│   └── Misc Consoles/
└── Rom Sets (Somewhat Organized)/
```

## Quick Scan Results (2026-03-09)

- **Total**: 98,601 items, 1,701 GB
- **Top by count**: Commodore 64 (24,349), Atari (10,935), MAME (8,651)
- **Top by size**: PSN ISO Pack (672 GB), Nintendo 3DS (412 GB), TurboGrafx-CD (234 GB)

## By Manufacturer

| Manufacturer | Items | Size |
|--------------|-------|------|
| Computers | 47,327 | 61.89 GB |
| Arcade | 12,951 | 32.97 GB |
| Atari | 12,399 | 2.56 GB |
| Nintendo | 12,017 | 467.24 GB |
| Sony | 3,106 | 672.40 GB |
| Sega | 2,747 | 3.54 GB |
| Microsoft | 1,661 | 0.05 GB |

## Disc vs Cartridge Systems

- **Disc systems** (count folders): PSX (1,516), PS3 (77), PS Vita (6), Saturn (3)
- **Cartridge systems** (count files): NES (1,592), SNES, Genesis, GBA, etc.

## Scripts

- `tools/rom-quick-scan.py` - Quick count (completed)
- `tools/rom-full-scan.py` - Duplicate detection (overnight scan)

## Output Files

- `rom-inventory/rom-inventory.json` - Quick scan
- `rom-inventory/rom-full-*.json` - Full scan with duplicates

## Notes

- Hash only files under 50 MB (speed vs coverage tradeoff)
- The Node gateway has a 30s timeout - use background processes for long scans
- No-Intro DAT files available at https://datomatic.no-intro.org/
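The Phase 2 duplicate detection presumably follows the pattern below: hash every file under the 50 MB cutoff and group paths by digest. This is a hedged re-sketch of the idea, not the actual `rom-full-scan.py`.

```python
import hashlib
from collections import defaultdict
from pathlib import Path

MAX_HASH_BYTES = 50 * 1024 * 1024  # per the note: hash only files under 50 MB

def md5_of(path: Path) -> str:
    """MD5 a file in 1 MB chunks to keep memory flat."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def find_duplicates(root: Path) -> dict:
    """Map md5 -> paths, keeping only digests seen more than once."""
    by_hash = defaultdict(list)
    for p in root.rglob("*"):
        if p.is_file() and p.stat().st_size < MAX_HASH_BYTES:
            by_hash[md5_of(p)].append(str(p))
    return {h: ps for h, ps in by_hash.items() if len(ps) > 1}
```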

## Full Scan Results (2026-04-09)

**Status:** Complete

| Metric | Value |
|--------|-------|
| Total files | 773,442 |
| Total size | 21.9 TB |
| Files hashed | 756,454 |
| Skipped (too large) | 16,987 |
| **Duplicates found** | **44,844** |

**Runtime:** 13 hours

**Output:** `rom-inventory/rom-full-scan.json`

**Next steps:**
1. Analyze the JSON to identify duplicate clusters
2. Determine which systems have the most duplicates
3. Create a cleanup plan (manual review vs auto-delete)
158
memory/projects/supermonkey-memory-system.md
Normal file
@@ -0,0 +1,158 @@
# Supermonkey Memory System

**Status:** ✅ Production
**Created:** 2026-03-02
**Location:** `~/.openclaw/memory.db`

---

## Overview

Local semantic memory search using SQLite + Ollama embeddings. Replaces the flaky Supermemory cloud API.

**Why "Supermonkey"?**
- Works offline (like a monkey with a typewriter)
- No cloud dependency
- Just keeps going

---

## Architecture

### File-Based Pipeline (Daily)
```
Memory Files (markdown)
        ↓
memory_embedding_worker.py
        ↓
Ollama (nomic-embed-text) → 768-dim vectors
        ↓
SQLite + sqlite-vector extension
        ↓
Cosine similarity search
```

### Real-Time Session Pipeline (Live)
```
Discord/Chat Messages
        ↓
OpenClaw Session Transcript (.jsonl)
        ↓
session_monitor.py (cron every 2 min)
        ↓
Count messages → At 15: summarize → embed → store
        ↓
Ollama (nomic-embed-text)
        ↓
SQLite + sqlite-vector
```

**The Innovation:** Read OpenClaw's own session transcripts to auto-capture conversations without manual tracking or hooks.
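The search step at the bottom of both pipelines reduces to cosine similarity over stored 768-dim vectors. A minimal pure-Python sketch of that ranking; the real system uses the sqlite-vector extension, so this is illustrative only:

```python
import math

def cosine(a: list, b: list) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def top_k(query: list, rows: list, k: int = 3) -> list:
    """Rank stored (text, embedding) rows against a query embedding."""
    scored = [(cosine(query, emb), text) for text, emb in rows]
    return sorted(scored, reverse=True)[:k]
```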

---

## Components

| File | Purpose |
|------|---------|
| `memory_vector.py` | Core SQLite-vector wrapper |
| `memory_embedding_worker.py` | Daily memory file processor |
| `session_monitor.py` | Real-time transcript capture |
| `session_snapshotter.py` | Manual session capture |
| `search_memories.py` | CLI search tool |
| `bulk_memory_loader.py` | One-time historical import |

---

## Quick Commands

```powershell
# Search memories
python tools/search_memories.py "home assistant automation"

# Check stats
python -c "import sqlite3; db=sqlite3.connect(r'C:\Users\admin\.openclaw\memory.db'); c=db.cursor(); c.execute('SELECT COUNT(*) FROM memory_embeddings'); print('Total:', c.fetchone()[0]); c.execute('SELECT COUNT(*) FROM memory_embeddings WHERE source_type=\'auto_session\''); print('Auto snapshots:', c.fetchone()[0]); db.close()"

# Run daily worker manually
python tools/memory_embedding_worker.py --date 2026-03-03

# Run session monitor manually
python tools/session_monitor.py
```

---

## Current Stats

| Metric | Value |
|--------|-------|
| Total embeddings | ~1,623 |
| Daily notes processed | 818 |
| Project files | 332 |
| MEMORY.md sections | 33 |
| Manual session snapshots | 2 |
| **Auto session snapshots** | **27** |
| Tracked sessions | 245 |
| Active sessions | 243 |
| Database size | ~5 MB |

---

## Database Schema

### memory_embeddings

| Column | Type | Description |
|--------|------|-------------|
| id | INTEGER | Primary key |
| source_type | TEXT | daily, memory_md, project, auto_session |
| source_path | TEXT | File path + section |
| content_text | TEXT | First 500 chars |
| embedding | BLOB | 768-dim vector |
| created_at | TIMESTAMP | Auto-set |

### session_tracking

| Column | Type | Description |
|--------|------|-------------|
| session_id | TEXT | OpenClaw UUID |
| transcript_path | TEXT | Path to .jsonl |
| last_message_index | INTEGER | Checkpoint |
| messages_since_snapshot | INTEGER | Counter |
| is_active | BOOLEAN | Is the session active? |
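The session_tracking columns drive a simple checkpoint loop: read the .jsonl transcript, collect entries past `last_message_index`, and snapshot once 15 new messages have accumulated. A hedged reconstruction of that logic; the actual `session_monitor.py` and transcript field names may differ.

```python
import json
from pathlib import Path

SNAPSHOT_THRESHOLD = 15  # summarize + embed after this many new messages

def new_messages(transcript: Path, last_index: int) -> list:
    """Return transcript entries past the stored checkpoint."""
    lines = transcript.read_text(encoding="utf-8").splitlines()
    return [json.loads(line) for line in lines[last_index:] if line.strip()]

def should_snapshot(pending: int) -> bool:
    """True once enough new messages have accumulated since the last snapshot."""
    return pending >= SNAPSHOT_THRESHOLD
```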

---

## Cron Schedule

| Job | Schedule | Purpose |
|-----|----------|---------|
| Memory Embeddings Daily | 3:00 AM | Process yesterday's memory files |
| Session Monitor | Every 2 min | Auto-snapshot live conversations |

---

## Troubleshooting

**Ollama not running:**
```powershell
ollama serve
```

**Database locked:**
Close DB Browser for SQLite.

**Unicode errors in cron:**
All emojis have been replaced with ASCII-safe markers.

---

## Future Enhancements

- [ ] Keyword filtering alongside vector search
- [ ] Date range queries
- [ ] Source type filtering
- [ ] Embedding quality scoring

---

**Credit:** Corey's genius idea to read session.json files 💡
**System:** Operational and self-managing
66
memory/projects/unifi-monitoring.md
Normal file
@@ -0,0 +1,66 @@
# UniFi Network Monitoring

**Status:** 🔄 In Progress
**Location:** `~/.openclaw/workspace/skills/unifi/`
**Added:** 2026-02-27

## Overview

Network monitoring via the UniFi Controller API. Read-only access to:
- Device status (APs, switches, USG)
- Connected clients
- Network health
- DPI traffic stats
- Alerts/alarms

## Hardware

- **Gateway:** UniFi USG
- **Controller:** Running on the Home Assistant box (192.168.0.39:8123)
- **API URL:** `https://192.168.0.39:8443` (standard controller port)

## Setup Status

- [x] Skill files installed
- [ ] Credentials configured
- [ ] Connection tested
- [ ] Commands verified working

## Commands

| Script | Purpose |
|--------|---------|
| `bash scripts/devices.sh` | List UniFi devices |
| `bash scripts/clients.sh` | List connected clients |
| `bash scripts/health.sh` | Network health |
| `bash scripts/top-apps.sh` | Top bandwidth consumers |
| `bash scripts/alerts.sh` | Recent alarms |
| `bash scripts/dashboard.sh` | Full dashboard |

All scripts support a `json` argument for raw output.

## Configuration

File: `~/.clawdbot/credentials/unifi/config.json`

```json
{
  "url": "https://192.168.0.39:8443",
  "username": "admin",
  "password": "YOUR_PASSWORD",
  "site": "default"
}
```
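What the scripts do under the hood, sketched in Python against the config file above. The classic controller endpoints (`/api/login`, `/api/s/<site>/stat/device`) are the commonly documented ones for a self-hosted controller, but treat them as assumptions for this install; certificate verification is disabled to match the scripts' `curl -k`.

```python
import json
import ssl
import urllib.request

def endpoint(base_url: str, site: str, path: str) -> str:
    """Build a classic-controller stat endpoint URL."""
    return f"{base_url}/api/s/{site}/{path}"

def load_config(path: str) -> dict:
    """Read the credentials file described above."""
    with open(path, encoding="utf-8") as f:
        return json.load(f)

if __name__ == "__main__":
    cfg = load_config("config.json")
    ctx = ssl._create_unverified_context()  # self-signed cert, like curl -k
    login = urllib.request.Request(
        f"{cfg['url']}/api/login",
        data=json.dumps({"username": cfg["username"], "password": cfg["password"]}).encode(),
        headers={"Content-Type": "application/json"},
    )
    # A real client must keep the session cookie from this response, then
    # GET endpoint(cfg["url"], cfg["site"], "stat/device") with that cookie.
    urllib.request.urlopen(login, context=ctx)
```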

## Notes

- Requires a local admin account (not UniFi SSO)
- Scripts use `curl -k` for the controller's self-signed certs
- DPI must be enabled in UniFi for bandwidth stats
- Read-only API access (safe for monitoring)

## Related

- **Skill Docs:** `skills/unifi/README.md`
- **API Reference:** `skills/unifi/references/unifi-readonly-endpoints.md`
- **Source:** OpenClaw official skills repo