2026-01-30
Room Presence - WORKING!
- Status: Connected! livingroom-pi (192.168.0.95) running room-assistant
- MQTT: HA broker found Pi immediately
- Entities: cluster_leader and cluster_size appeared in HA
- Phone: B0:C2:C7:07:28:B4 paired with Pi (motorola razr 2024)
- Next: Add phone MAC to config, complete npm install, test device tracking
Obsidian CLI
- obsidian-cli: Installed via Scoop, vault set
- Location: C:\Users\admin\clawd\skills\obsidian\
- Pending: Researching sync approach with SQL backend
Skills Installed (10 total)
| Skill | Purpose |
|---|---|
| auto-updater | Daily auto-update skill |
| clean-code | Clean code principles |
| coding-agent | Codex/Claude Code CLI runner |
| clawdbot-backup | Backup/restore Clawdbot |
| home-assistant | HA REST API integration |
| memory-to-obsidian | Sync notes to Obsidian |
| qbittorrent | Torrent management |
| radarr | Movie automation |
| skills-search | Search skills.sh |
| sonarr | TV show automation |
Backup System
- Location: P:\Clawdbot-Backups\
- Script: C:\Users\admin\clawd\backup-clawdbot.ps1
- Schedule: Daily at 2 AM via cron job
Downloads Integration (VPN machine)
- qbittorrent, sonarr, radarr all installed
- Config location: ~/.clawdbot/credentials/
SSH to Pi
- IP: 192.168.0.95 / admin / 41945549
- Windows SSH quirk: ed25519 keys not working properly
- Workaround: Run SSH commands in user's terminal directly
Reference Links
- MoltDirectory: https://moltdirectory.com/ - 672 skills, 28 categories for Molt/Clawdbot agents
- ClawdHub: https://clawdhub.com/ - OpenClaw skills registry
- Awesome OpenClaw Skills: https://github.com/VoltAgent/awesome-openclaw-skills - Community-curated skill list
OpenClaw Branding History
- Clawdbot → Original name, still used for personal instances
- MoltBot → Platform name during development
- OpenClaw → Current open-source branding
- Just branding — Architecture and skills remain compatible across all names
SQL-Based Storage Project (IN PROGRESS)
Completed Tonight
- MySQL Installed: C:\web\mysql\ (MariaDB 10.4.32)
- Database Created: clawdbot_projects
- Tables Created:
  - daily_notes - Raw logs from memory files
  - long_term_memories - Curated knowledge (16 entries)
  - action_items - Task tracking (6 entries)
  - backup_logs - Backup history
  - sessions - Session history
- User Created: AlexAI / alexisabignerd
- Schema File: C:\Users\admin\clawd\schema-detailed.sql (8.8 KB with full documentation)
- Credentials File: C:\Users\admin\clawd\.clawdbot\credentials\mysql.json
Quick Reference
Host: localhost:3306
Database: clawdbot_projects
User: AlexAI
Password: alexisabignerd
Example Queries
-- Find all HA-related memories
SELECT * FROM long_term_memories WHERE JSON_CONTAINS(tags, '"ha"');
-- Show pending action items by priority
SELECT * FROM action_items WHERE status = 'pending' ORDER BY priority DESC;
-- Check backup history
SELECT * FROM backup_logs ORDER BY timestamp DESC LIMIT 10;
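A minimal sketch of running queries like these from Python. An in-memory SQLite database stands in for MariaDB here so the snippet is self-contained; in production you would connect to clawdbot_projects with mysql-connector-python using the mysql.json credentials and use JSON_CONTAINS instead of the LIKE fallback shown.

```python
import sqlite3

# Stand-in for the clawdbot_projects database (MariaDB in production).
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE long_term_memories (id INTEGER PRIMARY KEY, body TEXT, tags TEXT)"
)
conn.execute("INSERT INTO long_term_memories (body, tags) VALUES (?, ?)",
             ("Pi paired with phone", '["ha", "presence"]'))
conn.execute("INSERT INTO long_term_memories (body, tags) VALUES (?, ?)",
             ("ComfyUI installed", '["comfyui"]'))

# SQLite fallback for JSON_CONTAINS(tags, '"ha"'): match the quoted tag text.
rows = conn.execute(
    "SELECT body FROM long_term_memories WHERE tags LIKE ?", ('%"ha"%',)
).fetchall()
print(rows)  # → [('Pi paired with phone',)]
```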
Idea: Use SQL as central data layer for automation and tracking
Proposed Architecture
- SQL Database: Central fact store and query engine
- Sources: Room-assistant, HA, Clawdbot, downloads stack
- Output: SQL → Obsidian markdown exports
Possible Use Cases
1. Room Presence & Phone Tracking
   - Store BLE/RFID presence events in SQL
   - Query daily/weekly summaries (time present, patterns)
   - Export to Obsidian for analysis
   - Track Kathy + Corey's presence separately
2. Home Assistant State History
   - Query home_assistant_v2.db (SQLite)
   - Track entity changes, on/off patterns
   - Generate occupancy reports
   - Detect anomalies (devices stuck on/off)
3. Clawdbot Memory → SQL Mirror
   - Mirror memory files in SQL tables
   - Complex queries (tags + date ranges + keyword search)
   - Faster lookups than file scanning
   - Historical versioning built-in
4. Downloads & Media Stack
   - Track sonarr/radarr history
   - Monitor download success/fail rates
   - Export "watchlist" to Obsidian
5. Backup & System Logs
   - SQL table for backup logs (success, duration, size)
   - System metrics (CPU, memory, disk over time)
   - Alert thresholds based on SQL queries
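The Home Assistant state-history use case above can be sketched with Python's sqlite3. Note the column layout below is a simplified assumption: newer HA recorder schemas split entity_id into a separate states_meta table, so verify against your actual home_assistant_v2.db before relying on this.

```python
import sqlite3

# Simplified stand-in for home_assistant_v2.db (real schema may differ).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE states (entity_id TEXT, state TEXT, last_updated TEXT)")
events = [
    ("binary_sensor.hallway_motion", "on",  "2026-02-01 08:00"),
    ("binary_sensor.hallway_motion", "off", "2026-02-01 08:05"),
    ("binary_sensor.hallway_motion", "on",  "2026-02-01 12:30"),
]
db.executemany("INSERT INTO states VALUES (?, ?, ?)", events)

# Occupancy-style report: how often did the sensor trip?
on_count = db.execute(
    "SELECT COUNT(*) FROM states WHERE entity_id = ? AND state = 'on'",
    ("binary_sensor.hallway_motion",),
).fetchone()[0]
print(on_count)  # → 2
```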
Technical Notes
- MySQL for central database
- Apache for web server
- Python/Flask or PHP for web apps
- HA Custom Plugin (Python) to query MySQL natively in HA dashboards
- HA Integration Options:
  - Custom sensor entities from SQL queries
  - Custom dashboard cards showing SQL data
  - Automatic entity creation from presence tracking
  - Real-time updates on HA dashboard
- PowerShell: MySql.Data library or mysql CLI
- Obsidian CLI for markdown export after SQL queries
- Apache can serve HTML dashboards showing SQL data
- Tkinter (desktop) or Flask (web) for Python-based apps
- Use XAMPP or standalone install for AMP stack
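For the "Apache can serve HTML dashboards" note, a small helper that turns query rows into an HTML fragment is enough for a static or CGI-served page. The column names and rows below are illustrative only.

```python
from html import escape

def rows_to_html_table(columns, rows):
    """Render SQL result rows as a plain HTML table Apache can serve."""
    head = "".join(f"<th>{escape(str(c))}</th>" for c in columns)
    body = "".join(
        "<tr>" + "".join(f"<td>{escape(str(v))}</td>" for v in row) + "</tr>"
        for row in rows
    )
    return f"<table><tr>{head}</tr>{body}</table>"

# Example: render a (hypothetical) backup status query result.
html = rows_to_html_table(["item", "status"], [("backup", "ok")])
```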
Proposed Architecture
┌─────────────────────────────────────────────────┐
│ Home Assistant Dashboard │
│ (Native HA cards + custom plugin sensors) │
├─────────────────────────────────────────────────┤
│ Custom HA Plugin (Python) │
│ - Queries MySQL │
│ - Creates HA entities │
│ - Real-time updates │
├─────────────────────────────────────────────────┤
│ MySQL Database │
├─────────────────────────────────────────────────┤
│ Room-Assistant │ Downloads │ Clawdbot │
└─────────────────────────────────────────────────┘
HA Plugins to Consider
- SQL Sensor Plugin - Generic SQL → HA sensor
- Room Presence Tracker - Phone BLE → SQL → HA entity
- Media Library Dashboard - Sonarr/Radarr stats
- Clawdbot Memory Cards - Show today's memory on HA
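Short of writing a full custom integration, one way to get SQL-derived values into HA is its documented REST API (POST /api/states/&lt;entity_id&gt; with a long-lived access token). The host URL, token, and entity name below are placeholders; only the payload builder is exercised here, the network call is shown but not run.

```python
import json
import urllib.request

HA_URL = "http://homeassistant.local:8123"   # assumption: adjust to your HA host
HA_TOKEN = "LONG_LIVED_ACCESS_TOKEN"         # placeholder, create in HA profile

def build_state_update(entity_id, state, attributes=None):
    """Body for HA's POST /api/states/<entity_id> endpoint."""
    return {"state": str(state), "attributes": attributes or {}}

def push_state(entity_id, state, attributes=None):
    """Send a state update to HA (requires a reachable HA instance)."""
    req = urllib.request.Request(
        f"{HA_URL}/api/states/{entity_id}",
        data=json.dumps(build_state_update(entity_id, state, attributes)).encode(),
        headers={"Authorization": f"Bearer {HA_TOKEN}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    return urllib.request.urlopen(req)  # network call; not exercised here

payload = build_state_update("sensor.corey_presence", "living_room",
                             {"source": "room-assistant via MySQL"})
```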
Action Items
- Verify room-assistant phone tracking works
- Add Kathy's phone to room-assistant config
- Replace Master Hallway motion sensor battery (1%)
- Clone SD card for more Pis after first Pi fully working
- Update HA Dashboard with frontend-design style
- Set MySQL root password for security ✅
- Create non-root user for Clawdbot (AlexAI/alexisabignerd) ✅
AI Art/Video Generation Project (PLANNING)
Hardware
- GPU: RTX 5090 (24GB VRAM) — perfect for FLUX, SDXL
- Storage: Install 4TB NVME for models and outputs
- RAM: 32GB available for batching
Software Stack
- ComfyUI — Node-based workflow editor
- Supports SDXL, FLUX, AnimateDiff, SVD
- Highly extensible, free, local
- MySQL — Track generation history
- Table: generation_history (prompt, settings, seed, model, timestamp, output_path)
  - Query prompts by style, model, date range
- Apache/PHP — Web UI for triggering generations (optional)
Database Schema (Proposed)
CREATE TABLE generation_history (
id INT AUTO_INCREMENT PRIMARY KEY,
prompt TEXT NOT NULL,
negative_prompt TEXT,
model VARCHAR(100),
width INT, height INT,
steps INT,
cfg_scale DECIMAL(4,2),
seed BIGINT,
sampler VARCHAR(50),
output_path VARCHAR(500),
generation_time_seconds DECIMAL(8,2),
vram_used_gb DECIMAL(4,2),
created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP,
tags JSON,
rating INT -- 1-5 personal rating
);
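Logging a ComfyUI run into that table could look like the sketch below. SQLite stands in for MariaDB (so AUTO_INCREMENT becomes AUTOINCREMENT and only a subset of columns is created); the key point is the parameterized INSERT, so prompts containing quotes can't break the SQL.

```python
import sqlite3

# SQLite stand-in with a trimmed-down version of generation_history.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE generation_history (
        id INTEGER PRIMARY KEY AUTOINCREMENT,
        prompt TEXT NOT NULL,
        model TEXT, seed INTEGER, output_path TEXT, rating INTEGER
    )""")

def log_generation(conn, prompt, model, seed, output_path, rating=None):
    """Parameterized insert: placeholders keep arbitrary prompts safe."""
    conn.execute(
        "INSERT INTO generation_history (prompt, model, seed, output_path, rating) "
        "VALUES (?, ?, ?, ?, ?)",
        (prompt, model, seed, output_path, rating),
    )

log_generation(conn, "neon city at dusk", "flux1-schnell", 1234,
               r"C:\ComfyUI\output\neon_001.png")
count = conn.execute("SELECT COUNT(*) FROM generation_history").fetchone()[0]
```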
Future Features
- Browse generation history by prompt/search
- "Retry with variations" based on successful generations
- Export favorites to Obsidian
- Track cost savings vs API services
Resources
- ComfyUI: https://github.com/comfyanonymous/ComfyUI
- FLUX.1 Models: https://blackforestlabs.ai/
- Stable Diffusion: https://stability.ai/stable-diffusion
ComfyUI Installation (COMPLETED)
Installed Location
- Path: C:\ComfyUI\ComfyUI
- Version: 0.3.67 with ComfyUI-Manager V3.37
- Python: 3.12.10 with PyTorch 2.7 nightly
GPU Configuration
- Flag: --highvram for RTX 5090 (24GB)
- PyTorch: Nightly build with CUDA 12.4 for Blackwell (sm_120) support
- Note: sm_120 warning appears but doesn't affect functionality
Quick Launch
- Script: C:\ComfyUI\launch_comfyui.bat
- URL: http://[PC-IP-ADDRESS]:8188 (network access enabled)
Models to Download
Place all in C:\ComfyUI\ComfyUI\models\checkpoints\
| Model | Size | Link |
|---|---|---|
| FLUX.1-schnell (FP8) | ~4GB | https://huggingface.co/Comfy-Org/flux1-schnell |
| SDXL Base 1.0 | ~6GB | https://huggingface.co/stabilityai/stable-diffusion-xl-base-1.0 |
| SD 1.4 | ~4GB | https://huggingface.co/CompVis/stable-diffusion-v-1-4-original |
Files Created
- C:\ComfyUI\launch_comfyui.bat - Launcher with --listen 0.0.0.0
- C:\ComfyUI\COMFYUI_README.md - Full documentation
- C:\ComfyUI\ComfyUI\download_models.bat - Download instructions
Known Issues
- PyTorch sm_120 compatibility warning (non-breaking)
- torchaudio missing (not needed for images)
Home Focus (2026-02-02)
Office covered by Hank. Back to home priorities:
Current Home Projects:
- Room-assistant (phone tracking verification)
- ComfyUI setup + model downloads
- MySQL + Apache dashboard refinement
- Home automation integration
Action Items Updated:
Goal
Enable natural language queries against databases with smart clarification when results are empty. Users ask questions in plain English; Clawdbot generates queries, executes them, and presents results conversationally.
Architecture: Orchestrated Agent Pipeline
Use Clawdbot's sub-agent system for multi-stage processing:
Main Session → Search Orchestrator
│
┌───────────┼───────────┐
▼ ▼ ▼
┌──────────┐ ┌──────────┐ ┌──────────┐
│ Intent │ │ Structured│ │ Vector │
│ Parser │→│ Search │→│ Search │
└──────────┘ │ (MSSQL) │ │ (Qdrant) │
└──────────┘ └──────────┘
│
▼
┌──────────────────┐
│ Result Synthesizer │
└──────────────────┘
│
▼
Main Session Response
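A toy sketch of the pipeline above, with plain functions standing in for the sub-agents. In Clawdbot these would be isolated sessions with timeouts and retries; the routing rule (keyword check for "similar"/"like") is a deliberately naive stand-in for a real intent parser.

```python
def intent_parser(question):
    """Naive routing: 'find similar' style questions go to vector search."""
    q = question.lower()
    return "vector" if ("similar" in q or "like" in q) else "structured"

def structured_search(question):
    # Placeholder for the MSSQL agent.
    return [{"source": "MSSQL", "hit": f"rows matching: {question}"}]

def vector_search(question):
    # Placeholder for the Qdrant agent.
    return [{"source": "Qdrant", "hit": f"nearest neighbors for: {question}"}]

def synthesize(results):
    """Result Synthesizer: combine agent outputs into one response."""
    return "; ".join(r["hit"] for r in results)

def orchestrate(question):
    route = intent_parser(question)
    results = vector_search(question) if route == "vector" else structured_search(question)
    return synthesize(results)

answer = orchestrate("find customers like ACME")
```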
Database Stack (Office Clone)
| Database | Type | Purpose |
|---|---|---|
| MSSQL | Relational | Exact queries, aggregations, relationships |
| Qdrant | Vector DB | Semantic search, similarity, "find similar" |
| Context Docs | JSON/Markdown | Schema docs, example queries, agent prompts |
Key Files Available
- Schema documentation — For agent context
- User queries — Collected examples to train/reverse-engineer intent
- Context data — Per-table, per-column documentation
Features
-
Orchestrated Agents (sub-agent pipeline)
- Intent Parser — Detect what user wants
- Structured Search Agent (MSSQL) — Exact/relational queries
- Vector Search Agent (Qdrant) — Semantic similarity
- Result Synthesizer — Combine and present naturally
- Each agent: isolated, timeout-able, retry-able
-
Natural Language → SQL
- Parse user intent from NL
- Generate valid MSSQL queries
- Handle WHERE, JOIN, ORDER BY, LIMIT
-
Smart Result Handling
- SUCCESS: Present results naturally (tables, summaries, bullet lists)
- EMPTY: Clarifying questions instead of "No results found"
- MULTI-AMBIGUOUS: Ask which interpretation
-
Clarification Examples
- "No results for 'orders' — did you mean 'transactions'?"
- "50,000 rows found — limit, summarize, or export?"
- "I found similar customers in Qdrant. Want their order history from MSSQL?"
-
Schema Awareness
- Use existing schema docs as agent context
- Cache table structures per session
- Learn from user corrections
Workflow Example
User: "Show me customers like the one who bought 500 widgets"
1. Intent Parser: Detect "find similar" pattern
2. Vector Agent: Find similar customer profiles in Qdrant
3. SQL Agent: Get recent orders for similar customers in MSSQL
4. Synthesizer: "I found 12 similar customers with X characteristics..."
Architecture
User: "Show me all users who bought X last month"
↓
┌─────────────────────────────────────────┐
│ Intent Parser │
│ - Extract entities, dates, filters │
│ - Detect aggregation vs lookup │
└─────────────────────────────────────────┘
↓
┌─────────────────────────────────────────┐
│ Query Generator │
│ - Build SQL based on intent │
│ - Apply optimizations / limits │
└─────────────────────────────────────────┘
↓
┌─────────────────────────────────────────┐
│ Executor │
│ - Run query (with timeout/caching) │
│ - Handle errors gracefully │
└─────────────────────────────────────────┘
↓
┌─────────────────────────────────────────┐
│ Response Handler │
│ - Format results naturally │
│ - Handle empty → clarifying response │
│ - Pagination for large results │
└─────────────────────────────────────────┘
↓
User: "Here's the data" OR "Hmm, did you mean X?"
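The EMPTY branch of the Response Handler can be approximated with fuzzy matching over known table names via difflib, turning "no results" into a clarifying question. The table list is illustrative; real semantic mismatches (orders vs transactions) would need the vector search, not string similarity.

```python
import difflib

KNOWN_TABLES = ["transactions", "customers", "inventory", "backup_logs"]

def handle_empty(term, known=KNOWN_TABLES):
    """Turn an empty result into a clarifying question instead of a dead end."""
    close = difflib.get_close_matches(term, known, n=1, cutoff=0.6)
    if close:
        return f"No results for '{term}' — did you mean '{close[0]}'?"
    return f"No results for '{term}'. Want me to list the available tables?"

msg = handle_empty("transactons")  # typo'd table name
```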
Technical Notes
- MySQL for local data
- Apache serving dashboard
- Skills available for various tasks
Action Items
Use Cases (Brainstorming)
| Agent | Type | Trigger |
|---|---|---|
| Network Monitor | Long-running | Ping sweep every 30s, detect down hosts |
| Inventory Alert | Cron-prompted | Check inventory levels, warn if below threshold |
| Shipping/Receiving | Webhook | Notify when shipment received |
| System Health | Cron-prompted | CPU, memory, disk alerts |
| Database Anomaly | Cron-prompted | Query patterns, slow queries, connection spikes |
Agent Archetypes
1. Long-Running Stateful Agent
- Stays awake, maintains context
- Polls continuously, remembers last state
- Use: Network monitoring, system health
2. Cron-Prompted Agent
- Wakes on schedule, checks condition, sleeps
- Use: Daily inventory checks, weekly reports
3. Webhook-Triggered Agent
- Spawned by external event
- Use: Shipping notifications, instant alerts
Example: Inventory Monitor Agent
Trigger: Cron every 15 min (or webhook from ERP system)
Task: "Check inventory items where quantity < reorder_level"
Logic:
- Query: SELECT item, quantity, reorder_level FROM inventory
- Filter: WHERE quantity < reorder_level
- If items found: Alert user via Discord/Slack
- If empty: Do nothing, sleep until next cycle
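The inventory-monitor logic above is small enough to sketch end to end; an in-memory SQLite table stands in for the real inventory database, and the alert string is what the agent would hand to Discord/Slack.

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE inventory (item TEXT, quantity INT, reorder_level INT)")
db.executemany("INSERT INTO inventory VALUES (?, ?, ?)", [
    ("widgets", 3, 10),   # below reorder level -> should alert
    ("gadgets", 50, 10),  # fine
])

def low_stock_alert(db):
    """Return an alert message for Discord/Slack, or None when all is well."""
    rows = db.execute(
        "SELECT item, quantity, reorder_level FROM inventory "
        "WHERE quantity < reorder_level"
    ).fetchall()
    if not rows:
        return None  # nothing to do; sleep until next cron cycle
    lines = [f"{item}: {qty} on hand (reorder at {lvl})" for item, qty, lvl in rows]
    return "Low stock:\n" + "\n".join(lines)

alert = low_stock_alert(db)
```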
Example: Network Monitor Agent
Trigger: Spawn as persistent agent
Task: "Ping 192.168.1.0/24 subnet every 30s"
Logic:
- Maintain state of known hosts
- On state change (up/down): Notify
- Weekly summary: "Network was 99.7% uptime"
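The "maintain state, notify on change" core of the network monitor reduces to diffing two snapshots; the actual ping sweep (e.g. subprocess calls to ping, or a socket timeout check) would populate the dicts.

```python
def diff_hosts(previous, current):
    """Compare two {host: is_up} snapshots and report state changes."""
    changes = []
    for host, up in current.items():
        if host in previous and previous[host] != up:
            changes.append(f"{host} is {'UP' if up else 'DOWN'}")
    return changes

# One polling cycle: host .20 went down since the last sweep.
prev = {"192.168.1.10": True, "192.168.1.20": True}
curr = {"192.168.1.10": True, "192.168.1.20": False}
alerts = diff_hosts(prev, curr)  # → ['192.168.1.20 is DOWN']
```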
Benefits of Agent-Based Monitoring
- Automation: No manual checkbox checking
- Alerts: Proactive notification vs reactive discovery
- History: Track when issues started/stopped
- Scalability: Add new monitors by spawning new agents
Implementation Notes
- Use cron for scheduled checks (cron tool)
- Use webhooks for event-driven (api/webhooks.php)
- Use sessions_spawn with cleanup=delete for short-lived agents
- Use long-running process for stateful monitors
Future Enhancements
- Query Explainer: "I found 50 orders by showing you the SQL..."
- Saved Queries: Let users name and reuse common searches
- Export Options: "Export these 100 rows to CSV?"
- Voice Support: "Hey Clawdbot, show me..."
- Multi-Database: Single query spanning both MSSQL and Qdrant
Multi-User Architecture (PROJECT EXTENSION)
Goal
Support multiple users with individual identities, preferences, and context when deployed to office Slack.
Slack Integration Context
- Platform: Slack (primary office communication)
- User IDs: Slack format (U01XXXXXXXXXX)
- Identity Files: Each user has dedicated context file
- Routing: Load user-specific context based on Slack user ID
User Identity Structure
memory/
├── USER.md # Corey's home identity (stays as-is)
└── users/
├── user_manifest.json # Discord/Slack ID → User File mapping
├── USER-197498647183622144.md # Ethan's identity
├── MEMORY-197498647183622144.md # Ethan's personal memories
└── skills.json # Ethan's skill config
Example user_manifest.json:
{
"U01ABC123DEF": "USER-john.doe.md",
"U01GHI456JKL": "USER-jane.smith.md"
}
User File Template
# User Identity
## Name
John Doe
## Roles
- sql_search (can query MSSQL)
- dashboard_view (read-only)
- admin (full access)
## Preferences
- Format: "Brief summary with bullet points"
- Max Results: 50 rows default
- Data Sensitivity: "High" (hide certain fields)
## Context
- Department: Sales
- Common Queries: ["daily sales", "team performance", "customer metrics"]
- Saved Searches: ["weekly_recap", "monthly_targets"]
## Permissions
- Databases: ["sales_db", "customers_db"]
- Restricted Tables: ["payroll", "executive_compensation"]
Context Loading Flow
Discord/Slack Message
↓
Extract User ID (e.g., 197498647183622144)
↓
Check users/user_manifest.json → Find USER-[id].md
↓
Load: USER-[id].md + MEMORY-[id].md + skills.json
↓
Inject into session context
↓
Agent responds with user's identity + preferences
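The context-loading flow above could start as the lookup below: resolve the user ID through user_manifest.json and fall back to USER.md when there is no match (the backward-compatibility requirement noted later). The demo writes a throwaway memory/ tree; real code would point at the actual vault.

```python
import json
import os
import tempfile

def load_user_context(memory_dir, user_id):
    """Resolve user_id via users/user_manifest.json; default to USER.md."""
    manifest_path = os.path.join(memory_dir, "users", "user_manifest.json")
    try:
        with open(manifest_path) as f:
            manifest = json.load(f)
    except FileNotFoundError:
        manifest = {}
    return manifest.get(user_id, "USER.md")  # backward-compatible fallback

# Demo against a throwaway memory/ tree.
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "users"))
with open(os.path.join(root, "users", "user_manifest.json"), "w") as f:
    json.dump({"197498647183622144": "USER-197498647183622144.md"}, f)

resolved = load_user_context(root, "197498647183622144")
fallback = load_user_context(root, "U_UNKNOWN")
```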
User Files Created (Ethan - ID: 197498647183622144)
| File | Purpose |
|---|---|
| users/user_manifest.json | Maps ID → user files |
| users/USER-197498647183622144.md | Ethan's identity + permissions |
| users/MEMORY-197498647183622144.md | His personal memories (starts empty) |
| users/skills.json | His enabled skills (gmail, calendar, web_search) |
Privacy Discovery (IMPORTANT LESSON)
The Problem
Initial thought: Single channel for all users.
Reality check: Shared channel = shared memories, queries, and emails exposed to everyone.
Solution: Private Channels per User
#general → Shared context (projects, automation, fun stuff)
#corey-private → Corey's personal context (memories, email, sensitive queries)
#ethan-private → Ethan's personal context
#kathy-private → Will add when Kathy joins
How It Works
| Channel | Loads |
|---|---|
| #general | Shared/default context |
| #corey-private | USER.md + MEMORY.md (Corey) |
| #ethan-private | USER-ethan.md + MEMORY-ethan.md |
Why Private Channels > DMs
- Conversation history preserved in server
- No DM vs channel switching for the user
- Searchable within Discord/Slack
- Privacy maintained per channel
Implementation Note
Channel ID detection needed:
- If message from #corey-private → load Corey's files
- If from #ethan-private → load Ethan's files
- If from shared channel → load shared/default context
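That routing rule is a dictionary lookup with a shared default. The channel names are used as keys here for readability; real Discord/Slack channel IDs are opaque strings, and SHARED.md is a hypothetical name for whatever the shared context file ends up being.

```python
# Channel → context-file mapping (keys would be real channel IDs in production).
CHANNEL_CONTEXTS = {
    "#corey-private": ["USER.md", "MEMORY.md"],
    "#ethan-private": ["USER-ethan.md", "MEMORY-ethan.md"],
}
SHARED_CONTEXT = ["SHARED.md"]  # assumption: the shared/default context file

def context_for_channel(channel):
    """Private channels get the owner's files; everything else gets shared."""
    return CHANNEL_CONTEXTS.get(channel, SHARED_CONTEXT)

files = context_for_channel("#ethan-private")
```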
Lesson Learned
Security/privacy paranoia leads to better architecture. Asking "what could go wrong?" prevented a later data exposure issue.
Technical Implementation Notes
- Current: Clawdbot loads memory/USER.md and memory/MEMORY.md
- Manual detection: Ethan (197498647183622144) recognized, context loaded manually for now
- Full integration needed: Detect channel ID, load matching USER/MEMORY files
- Backward compatible: Default to USER.md if no user_id/channel match found
- Ethan tested: Multi-user context works in DM testing
Permission Levels
| Level | Access |
|---|---|
| read | View queries, browse results |
| sql_search | Execute searches, generate queries |
| export | Download CSV, share results |
| admin | Manage user permissions, schema changes |
User Management Commands
@clawdbot user add U01XXX --name "Jane" --role sql_search
@clawdbot user permissions U01XXX --databases sales_db --restricted payroll
@clawdbot user list
@clawdbot user remove U01XXX
Privacy Considerations
- User A should not see User B's saved searches unless shared
- Query history per user
- Admin audit log of who ran what query
Home Focus (2026-02-02 Updated)
Office covered by Hank. Back to home priorities:
Current Home Projects:
- Room-assistant (phone tracking verification)
- ComfyUI setup + model downloads
- MySQL + Apache dashboard refinement
- Home automation (HA) integration
Multi-User: Ethan added as test user (197498647183622144) for multi-user functionality.
Action Items (Home Focus)
- Verify room-assistant phone tracking works
- Add Kathy's phone to room-assistant config
- Replace Master Hallway motion sensor battery (1%)
- Set MySQL root password for security ✅
- Create non-root user for Clawdbot (AlexAI/alexisabignerd) ✅
Quick Reference
MySQL: localhost:3306 / clawdbot_projects / AlexAI / alexisabignerd
ComfyUI: http://localhost:8188 (needs --listen flag for network)
Apache: http://localhost/ (dashboard)