Add vault content - WORK folder, Tasks, Projects, Summaries, Templates

This commit is contained in:
AlexAI
2026-02-24 09:46:07 -06:00
parent bc3284ad69
commit c402fb1161
41 changed files with 4299 additions and 0 deletions


@@ -0,0 +1,86 @@
# Coding Workflow
## Environment
- **Primary Channel:** Discord #coding (1468627455656067074)
- **Repository:** whoisfrost/controls-web (Lassite controls internal website)
- **Model:** kimi-k2.5:cloud (home-assistant), qwen3-coder-next:cloud (coding)
---
## Git Workflow
### Branch Strategy
| Branch | Purpose |
|--------|---------|
| `main` | Stable/production branch |
| `dev` | Testing/playground branch |
| `feature/*` | Specific changes (deleted after merge) |
### Deployment Process
```
Work PC → GitHub (dev) → Dev Server (pulls dev) → Live Server (pulls main)
```
### Workflow Steps
1. Create feature branch from dev
2. Work and commit
3. Push feature branch to GitHub
4. Merge feature → dev
5. Test on dev server
6. Merge dev → main
7. Deploy to live server
### Commands
```bash
git checkout -b dev # Create dev branch
git push origin dev # Push dev branch
git checkout -b feature-name # Create feature branch
git push origin feature-name # Push feature
git checkout dev
git merge feature-name # Merge into dev
git push origin dev
git checkout main
git merge dev # After testing
git push origin main
```
---
## LAMP Stack
### Local Development
- **Dashboard:** `C:\web\htdocs\dashboard` (active project)
- **Web Root:** `C:\web\htdocs\`
- **Old Dashboard:** `/var/www/html/dashboardv2/` (ignore - archived)
### Production
- Dev server pulls from `dev` branch
- Live server pulls from `main` branch
---
## Tools & Environment
- **Shell:** Bash
- **SSH, curl, jq**
- **Node.js:** For Next.js/React projects
- **Python:** For Flask backends
---
## Model Assignments (Confirmed 2026-02-10)
| Channel | Model | Reason |
|---------|-------|--------|
| #home-assistant | kimi-k2.5:cloud | General purpose |
| #coding | qwen3-coder-next:cloud | Optimized for code |
| #projects | kimi-k2.5:cloud | General purpose |
---
## Related Projects
- [[Mission Control Dashboard (Python)]] — Flask dashboard :5050
- [[Mission Control Dashboard (Next.js)]] — React dashboard :3000 (completed)
---
*Status: Active | Updated: 2026-02-22*


@@ -0,0 +1,90 @@
# Discord Voice Bot (GLaDOS)
## Overview
Voice-controlled Discord bot with conversational AI — always available in #coding voice channel
---
## Location
`/home/admin/.openclaw/workspace/discord-voice-bot/`
---
## Architecture (Completed 2026-02-08)
### Components
| Component | Endpoint | Purpose |
|-----------|----------|---------|
| **TTS** | 192.168.0.17:5050 | GLaDOS voice synthesis |
| **STT** | 192.168.0.17:10300 | Whisper via Wyoming protocol |
| **LLM** | 192.168.0.17:11434 | Ollama (qwen3-coder-next:cloud) |
| **Auto-join** | On startup | Joins #coding voice channel |
### Voice Pipeline
```
Wake Word → STT (Whisper) → LLM (Ollama) → TTS (GLaDOS) → Voice Output
```
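The LLM and TTS stages of this pipeline can be sketched with plain HTTP calls. Ollama's `/api/generate` is its real endpoint; the `/synthesize` path on the TTS service is an assumption, since the GLaDOS HTTP API isn't documented here.

```python
"""Sketch of the LLM -> TTS half of the voice pipeline (not the bot's
actual code). The TTS URL path is a guess; adjust to the real service."""

OLLAMA_URL = "http://192.168.0.17:11434/api/generate"
TTS_URL = "http://192.168.0.17:5050/synthesize"  # hypothetical path

def build_ollama_payload(prompt: str) -> dict:
    # stream=False returns the whole reply as a single JSON object
    return {"model": "qwen3-coder-next:cloud", "prompt": prompt, "stream": False}

def speak(prompt: str) -> bytes:
    import requests  # imported here so the payload helper stays stdlib-only
    reply = requests.post(OLLAMA_URL, json=build_ollama_payload(prompt), timeout=60)
    text = reply.json()["response"]
    audio = requests.post(TTS_URL, data=text.encode("utf-8"), timeout=60)
    return audio.content  # raw audio for the Discord voice sink
```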
---
## Commands
- `!join` — Join voice channel
- `!leave` — Leave voice channel
- `!test [text]` — Test TTS synthesis
- `!listen` — 5-second voice recording (7-10 sec latency)
---
## MCP Integration (OpenClaw Bridge)
### MCP Tools Available via Voice
- `list_projects()` — "What am I working on?"
- `create_project(name)` — Start new project
- `add_task(project, task)` — Add tasks via voice
- `update_task_status(taskId, status)` — Mark complete
- `get_project_status()` — Check overall progress
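A FastMCP server exposing tools like these could look as follows. The tool bodies are stand-ins, not the real OpenClaw calls, and the import fallback is only there so the sketch runs without the `mcp` package installed.

```python
"""Sketch of how openclaw_mcp_server.py might expose tools via FastMCP.
Tool implementations are hypothetical in-memory stand-ins."""
try:
    from mcp.server.fastmcp import FastMCP
    mcp = FastMCP("openclaw")
    tool = mcp.tool
except ImportError:
    mcp = None
    def tool():  # no-op fallback so the functions still work standalone
        return lambda f: f

PROJECTS: dict[str, list[str]] = {}

@tool()
def list_projects() -> list[str]:
    """Answers 'What am I working on?'"""
    return sorted(PROJECTS)

@tool()
def add_task(project: str, task: str) -> str:
    """Add a task to a project via voice."""
    PROJECTS.setdefault(project, []).append(task)
    return f"Added to {project}: {task}"

if __name__ == "__main__" and mcp is not None:
    mcp.run(transport="stdio")  # matches the stdio transport in config.yaml
```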
### MCP Server Config
```yaml
mcp_servers:
  - name: "openclaw"
    transport: "stdio"
    command: "python"
    args:
      # single-quoted so the Windows backslashes aren't parsed as YAML escapes
      - 'C:\Users\admin\.openclaw\workspace\discord-voice-bot\openclaw_mcp_server.py'
```
---
## Files Created
- `openclaw_mcp_server.py` — FastMCP server exposing OpenClaw tools
- `main.py` — Bot implementation using discord.py, wyoming
- `config.yaml` — Configuration
- `requirements.txt` — Dependencies (discord.py, requests, numpy, wyoming)
---
## What Changed (v2)
- Removed GLaDOS direct dependencies (simpler integration)
- Uses existing services (Whisper, Ollama, HTTP TTS)
- Auto-joins voice channel on startup
- Clean async architecture
- MCP bridge for OpenClaw integration
---
## Known Limitations
- Voice latency: 7-10 seconds for `!listen` (5s recording + processing)
- Not conversational due to latency
- Best for command-based interactions, not real-time chat
---
## Status
**COMPLETED** — Operational in #coding voice channel
Bot auto-joins on startup and accepts voice commands via MCP bridge.
---
*Status: Complete | Version: 2.0 | Updated: 2026-02-22*

Projects/Home Assistant.md

@@ -0,0 +1,100 @@
# Home Assistant
## Environment
- **URL:** http://192.168.0.39:8123
- **Auth Token:** eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.eyJpc3MiOiIxMTRhMDMwYjNhNjg0N2JiYWY2YmUxNmY4YTBhYjVkZiIsImlhdCI6MTc3MDE3NTU0NSwiZXhwIjoyMDg1NTM1NTQ1fQ.8xxRzURj0NoFoo_UdtH34IXcuBX6fSpofdA8GjCK-B4
- **MQTT Broker:** 192.168.0.39:1883 (corey/41945549)
- **Status:** 🔄 In Progress (entity cleanup ongoing)
- **Discord Channel:** #home-assistant (1466074219829006599)
---
## Room-Assistant (Bluetooth Presence) - WORKING! ✅
**What it does:**
- Pi Zero W BLE scans for known phones
- Reports to HA via MQTT
- Creates device_tracker entities per room
- Multi-Pi setup gives room-level presence
### Hardware
- **Pi 1:** livingroom-pi (192.168.0.95)
- **Bluetooth MAC:** B8:27:EB:50:C9:40
- **SSH:** admin / 41945549
- **SSH Key:** ~/.ssh/id_ed25519 (Windows SSH quirk - needs manual workarounds)
### Phone Tracking
- **Corey's Phone:** B0:C2:C7:07:28:B4 (motorola razr 2024)
- **Phone MAC Discovery:**
- iPhone: Settings → About → Bluetooth
- Android: Settings → About Phone → Status → Bluetooth address
### Room-Assistant Config (`/home/admin/config/local.yml`)
```yaml
mqtt:
  host: "192.168.0.39"
  port: 1883
  username: "corey"
  password: "41945549"
bluetooth:
  adminToken: "AF07072FBAC1FD6281FBE765DF6D841E"
  timeout: 60
  addresses:
    - "b0:c2:c7:07:28:b5" # Corey's phone MAC
settings:
  room: "LivingRoom"
```
### Home Assistant Entities Created
- `sensor.livingroom_cluster_size`
- `sensor.livingroom_cluster_leader`
- `device_tracker.livingroom_phone` (when phone in range)
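A quick way to confirm room-assistant is actually publishing is to watch the broker directly. The `room-assistant/#` topic layout is an assumption; adjust to whatever `mosquitto_sub` shows.

```python
"""Watch room-assistant's MQTT traffic (a verification sketch, not part
of the setup scripts). Topic structure is assumed."""

def parse_topic(topic: str) -> tuple[str, str]:
    # "room-assistant/<component>/<entity>" style topics -> (component, entity)
    parts = topic.split("/")
    return (parts[1], parts[2]) if len(parts) >= 3 else ("", topic)

def watch(host="192.168.0.39", port=1883, user="corey", password="41945549"):
    import paho.mqtt.client as mqtt  # imported here so parse_topic stays stdlib-only

    def on_message(client, userdata, msg):
        component, entity = parse_topic(msg.topic)
        print(f"{component}/{entity}: {msg.payload.decode()}")

    client = mqtt.Client()  # on paho-mqtt 2.x, pass mqtt.CallbackAPIVersion.VERSION1
    client.username_pw_set(user, password)
    client.on_message = on_message
    client.connect(host, port)
    client.subscribe("room-assistant/#")
    client.loop_forever()

if __name__ == "__main__":
    watch()
```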
### Key Config Notes
- Config location: `/home/admin/config/local.yml` (NOT in room-assistant subdir)
- Config format: keys at root level (v2.x), no "global." prefix
- mdns module needed: `sudo npm install -g mdns` (for cluster discovery)
- RSSI threshold: `-70` (adjustable for range)
---
## Current Tasks
### High Priority
- [ ] Entity cleanup (972 devices in spreadsheet)
- [x] Fix hallway sensor battery (1%)
- [ ] Add Kathy's phone to room-assistant config
### In Progress
- [x] Room-assistant setup and working
- [x] Phone MAC paired and tracking
- [ ] Multi-Pi expansion (clone SD after first Pi stable)
---
## Setup Workflow (for future Pis)
1. Flash Pi OS Lite to SD card
2. Enable SSH + WiFi via Pi Imager
3. Copy config files to Pi
4. Run `sudo ./setup-room-assistant.sh`
5. Script prompts for: room name, MQTT creds, phone MACs
---
## Files Created
- `room-assistant-config.yml` — BLE presence config template
- `setup-room-assistant.sh` — Automated Pi setup script
- `ROOM_ASSISTANT_SETUP.md` — Complete setup documentation
- `temp/write_ha_names.py` — Push entity changes back to HA
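`temp/write_ha_names.py` isn't reproduced here, but the REST pattern it builds on looks like this. `GET /api/states` is Home Assistant's documented endpoint; the token is the long-lived one from the Environment section.

```python
"""Sketch of the HA REST pattern behind the entity scripts (not the
actual write_ha_names.py)."""

HA_URL = "http://192.168.0.39:8123"

def ha_headers(token: str) -> dict:
    # HA's REST API uses a long-lived access token as a Bearer header
    return {"Authorization": f"Bearer {token}", "Content-Type": "application/json"}

def list_entities(token: str, domain: str = "device_tracker") -> list[str]:
    import requests  # imported here so ha_headers stays stdlib-only
    resp = requests.get(f"{HA_URL}/api/states", headers=ha_headers(token), timeout=10)
    return [s["entity_id"] for s in resp.json()
            if s["entity_id"].startswith(domain + ".")]
```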
---
## Integration Points
- HA REST API (native)
- MQTT (for room-assistant)
- Mission Control dashboard (for display)
- Discord notifications (alerts)
---
*Status: Room presence working, entity cleanup ongoing | Updated: 2026-02-22*

Projects/Memory System.md

@@ -0,0 +1,144 @@
# Memory System
## Architecture Evolution
Started as a flat-file system; evolved into a hierarchical hybrid (file + SQL) architecture.
---
## Current Stack (February 2026)
### 1. Hierarchical File Structure
Based on ucsandman's design — index + drill-down:
```
memory/
├── MEMORY.md # ~2.4k token index (was 5-10k)
├── people/
│ └── corey.md # User details
├── projects/
│ ├── mission-control-python.md
│ ├── home-assistant.md
│ ├── memory-system.md
│ └── [others]
├── decisions/
│ └── 2026-02.md # Monthly decisions
└── 2026-02-21.md # Daily notes
```
**Benefits:**
- Session start: ~2.4k tokens (was 5-10k)
- On-demand drill-down: ~1k tokens per file
- **~70% token savings**
### 2. SQLite Database
- **Location:** ~/.openclaw/memory.db
- **Tables:**
- memory_cells — Structured data
- scenes — Context/grouping
- memory_fts — Full-text search
- **Extraction:** Memory Worker agent (3 AM daily)
### 3. Supermemory.ai Backup
- **Schedule:** Daily 2 AM
- **Script:** scripts/backup-memory.py
- **History:** Document IDs tracked per backup
### 4. MySQL Mirror (Legacy)
- **Host:** localhost:3306
- **Database:** clawdbot_projects
- **Tables:** daily_notes, long_term_memories, action_items, backup_logs, sessions
- **User:** AlexAI/alexisabignerd
- **Status:** Functional but not actively used
---
## Key Decisions
| Date | Decision | Status |
|------|----------|--------|
| 2026-02-09 | Supermemory backup with cron | Active |
| 2026-02-12 | workspace-context.md for temp state | Active |
| 2026-02-16 | Hybrid File + SQLite (not MySQL) | Active |
| 2026-02-18 | Hierarchical index + drill-down | Live |
| 2026-02-20 | Memory Worker writes to SQLite | Working |
---
## Ollama Memory Embeddings (2026-02-10)
**Setup:**
- Installed skill from https://clawhub.ai/vidarbrekke/ollama-memory-embeddings
- Uses Ollama /v1/embeddings endpoint
- Replaces node-llama-cpp for local embeddings
**Trade-off:**
- ✅ Faster, OpenAI-compatible API
- ⚠️ Two embedding universes: old (node-llama-cpp) vs new (Ollama)
- Historical memories may search differently
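The skill's call to Ollama's OpenAI-compatible endpoint, plus a cosine helper for spot-checking how the two embedding universes compare, could be sketched like this. The model name is an assumption; use whichever embedding model Ollama has pulled.

```python
"""Sketch of the /v1/embeddings call the skill relies on (model name
is a placeholder, not confirmed by the skill's docs)."""
import math

def embed(text: str, model: str = "nomic-embed-text",
          url: str = "http://192.168.0.17:11434/v1/embeddings") -> list[float]:
    import requests  # imported here so cosine stays stdlib-only
    resp = requests.post(url, json={"model": model, "input": text}, timeout=30)
    return resp.json()["data"][0]["embedding"]

def cosine(a: list[float], b: list[float]) -> float:
    # similarity check for comparing old vs new embedding spaces
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))
```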
---
## Workers
### Memory Worker Daily
- **Schedule:** 3 AM
- **Task:** Extract structured data → SQLite
- **Extraction source:** memory/YYYY-MM-DD.md files
- **Output:** memory.db + Discord report
### Job Verifier Daily
- **Schedule:** 9 AM
- **Task:** Check overnight job statuses
- **Reports to:** #alerts channel
---
## Integration Points
| System | Direction | Method |
|--------|-----------|--------|
| OpenClaw Memory Search | Read | semantic search |
| SQLite Database | Read/Write | memory tool |
| Supermemory | Write (backup) | API |
| Obsidian Vault | Write (export) | notesmd-cli |
| MySQL | Legacy | Direct SQL |
---
## Skills Related
| Skill | Purpose |
|-------|---------|
| ollama-memory-embeddings | Ollama-based embeddings |
| memory-to-obsidian | Export notes to Obsidian |
| ez-cronjob | Fix cron scheduling issues |
| self-improving-agent | Log errors/corrections |
---
## Troubleshooting Notes
**Cron deadlock bug:**
- Issue: cron tool hangs
- Fix: Use openclaw cron CLI via exec instead
**Context bloat:**
- Issue: 256k limit hit frequently
- Fix: Spawn sub-agents for complex tasks
**Session recovery:**
- workspace-context.md survives crashes
- Cleaned daily at ~11 PM
---
## Future Ideas
1. SQL → HA Dashboard integration
2. Automated presence tracking export
3. Memory stats/analytics in Mission Control
4. Re-embed historical memories for consistency
---
Status: Live | Last Major Update: 2026-02-18 (Hierarchical restructure)


@@ -0,0 +1,261 @@
---
title: Memory System Architecture
category: Projects
project: Memory System
type: Architecture/Documentation
status: complete
date: 2026-02-16
last_updated: 2026-02-23
tags: [memory, architecture, openclaw, system-design, data-flow]
source_file: memory_system_architecture.md
---
# Memory System Architecture
*Diagram of how information flows and persists in the OpenClaw system*
---
## Overview
```
┌─────────────────────────────────────────────────────────────────────────────┐
│ INFORMATION FLOW │
└─────────────────────────────────────────────────────────────────────────────┘
User Conversation
        │
        ▼
┌─────────────────────┐     ┌─────────────────────┐     ┌─────────────────────┐
│   ME (Main Agent)   │────▶│    Memory Worker    │────▶│   SQLite Database   │
│     (Real-time)     │     │     (Daily 3 AM)    │     │    (Structured)     │
└─────────────────────┘     └─────────────────────┘     └─────────────────────┘
           │                           │                           │
           ▼                           ▼                           ▼
┌─────────────────────┐     ┌─────────────────────┐     ┌─────────────────────┐
│     Daily Notes     │     │   Query Interface   │     │    Stats/Search     │
│    (memory/*.md)    │◄────│     (On Demand)     │◄────│      (SQL/FTS)      │
└─────────────────────┘     └─────────────────────┘     └─────────────────────┘
           │
           ▼
┌─────────────────────┐
│      MEMORY.md      │
│      (Curated)      │
└─────────────────────┘
           │
           ▼
┌─────────────────────┐
│   Supermemory.ai    │
│   (Cloud Backup)    │
└─────────────────────┘
```
---
## Storage Layers (By Speed & Persistence)
### 1. ⚡ Session RAM (Transient)
| Aspect | Details |
|--------|---------|
| **What** | Current conversation context, tool outputs, working memory |
| **Writes** | Every message I process |
| **When** | Real-time during conversation |
| **Stored** | Until session ends or compaction (30 min - 4 hours) |
| **Survives** | ❌ Session crash ❌ Gateway restart |
| **Size** | ~100K-250K tokens |
**Risk:** Compaction clears this. The "danger zone" is between last tool use and compaction.
---
### 2. 📝 Daily Notes (Short-term)
| Aspect | Details |
|--------|---------|
| **What** | Raw daily activity, decisions, tasks, errors |
| **Writes** | Pre-compaction flush (automatic) + manual captures |
| **When** | End of productive sessions, before `/compact` |
| **Stored** | `memory/YYYY-MM-DD.md` |
| **Survives** | ✅ Session crash ✅ Gateway restart |
| **Retention** | ~30-90 days (manually reviewed) |
| **Format** | Free-form markdown |
**Written by:** Me (main agent) during heartbeat or EOD ritual
---
### 3. 🧠 MEMORY.md (Long-term)
| Aspect | Details |
|--------|---------|
| **What** | Curated important info, distilled from daily notes |
| **Writes** | Manual review of daily notes, during heartbeats |
| **When** | Every few days, or when something critical happens |
| **Stored** | `MEMORY.md` in workspace root |
| **Survives** | ✅ Everything (file-based) |
| **Retention** | Permanent (manual curation) |
| **Format** | Human-readable markdown |
**Written by:** Me, after reviewing daily notes
---
### 4. 📊 SQLite Database (Structured)
| Aspect | Details |
|--------|---------|
| **What** | Structured: tasks, decisions, facts, projects with salience |
| **Writes** | Memory Worker (automated daily extraction) |
| **When** | Daily 3:00 AM (cron job) |
| **Stored** | `~/.openclaw/memory.db` |
| **Survives** | ✅ File-based |
| **Retention** | Permanent (until manual deletion) |
| **Format** | Relational: cells, scenes, FTS index |
**Written by:** Memory Worker agent (spawned via cron)
**Schema:**
```sql
memory_cells: id, scene, cell_type, salience, content, source_file, created_at
scenes: scene, summary, item_count, updated_at
memory_fts: full-text search index
```
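The schema above can be reduced to a runnable sketch against an in-memory database. The column list follows the note; the FTS5 external-content wiring is an assumption about how the index is kept in sync.

```python
"""Minimal reproduction of the memory.db schema with FTS5 search
(a sketch, not the Memory Worker's actual code)."""
import sqlite3

def open_memory_db(path: str = ":memory:") -> sqlite3.Connection:
    db = sqlite3.connect(path)
    db.executescript("""
        CREATE TABLE IF NOT EXISTS memory_cells (
            id INTEGER PRIMARY KEY,
            scene TEXT, cell_type TEXT, salience REAL,
            content TEXT, source_file TEXT,
            created_at TEXT DEFAULT CURRENT_TIMESTAMP
        );
        CREATE TABLE IF NOT EXISTS scenes (
            scene TEXT PRIMARY KEY, summary TEXT,
            item_count INTEGER, updated_at TEXT
        );
        CREATE VIRTUAL TABLE IF NOT EXISTS memory_fts
            USING fts5(content, content='memory_cells', content_rowid='id');
    """)
    return db

def add_cell(db, scene, cell_type, content, salience=0.5):
    # write the cell, then mirror it into the full-text index
    cur = db.execute(
        "INSERT INTO memory_cells (scene, cell_type, salience, content) VALUES (?,?,?,?)",
        (scene, cell_type, salience, content))
    db.execute("INSERT INTO memory_fts (rowid, content) VALUES (?, ?)",
               (cur.lastrowid, content))

def search(db, query):
    return [r[0] for r in db.execute(
        "SELECT content FROM memory_fts WHERE memory_fts MATCH ?", (query,))]
```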
---
### 5. 🌐 Supermemory.ai (Cloud)
| Aspect | Details |
|--------|---------|
| **What** | Full backup of all memory files |
| **Writes** | Supermemory Backup job (automated) |
| **When** | Daily 2:00 AM |
| **Stored** | Supermemory.ai cloud service |
| **Survives** | ✅ Disk failure ✅ Workspace loss |
| **Retention** | Cloud provider dependent |
| **Format** | API-uploaded documents |
**Written by:** Python script via cron job
---
### 6. 📋 Workspace Context (Session Bridge)
| Aspect | Details |
|--------|---------|
| **What** | Current conversation, in-progress, finished today |
| **Writes** | Real-time during session |
| **When** | Continuously updated |
| **Stored** | `workspace-context.md` |
| **Survives** | ✅ Session crash ✅ Channel switch |
| **Retention** | Cleared nightly (~11 PM) |
| **Format** | Structured markdown |
**Special:** Survives between channels and session crashes. Cleared daily.
---
## Retention Summary
| Layer | Retention | Cleared When | Backup |
|-------|-----------|--------------|--------|
| Session RAM | Minutes-hours | Compaction | ❌ |
| Workspace Context | ~24 hours | 11 PM nightly | ❌ |
| Daily Notes | 30-90 days | Manual archive | Supermemory |
| MEMORY.md | Permanent | Manual edit | Supermemory |
| SQLite DB | Permanent | Manual delete | ❌ (local only) |
| Supermemory | Permanent | Cloud provider | N/A (is backup) |
---
## Write Triggers
```
Every Message
  ├─► Session RAM (immediate)
  └─► If important  ──► workspace-context.md
Pre-compaction      ──► memory/YYYY-MM-DD.md
Periodic review     ──► MEMORY.md
Daily 2 AM          ──► Supermemory.ai
Daily 3 AM          ──► SQLite Database
```
---
## Access Patterns
### I (Main Agent) Access:
| Source | When | Purpose |
|--------|------|---------|
| MEMORY.md | Every session startup | Core identity, user prefs, important facts |
| USER.md | Every session startup | Who Corey is |
| SOUL.md | Every session startup | How I should behave |
| workspace-context.md | Every session startup | Current conversation state |
| memory/*.md | During heartbeats | Recent context |
| SQLite DB | On demand | Structured queries ("what tasks pending?") |
### Memory Worker Access:
| Source | When | Purpose |
|--------|------|---------|
| IDENTITY.md | Daily 3 AM | Who it is |
| SOUL.md | Daily 3 AM | Its mission |
| HEARTBEAT.md | Daily 3 AM | What to do (the script) |
| memory/YYYY-MM-DD.md | Daily 3 AM | What to extract |
| SQLite DB | Daily 3 AM | Where to write |
---
## Failure Recovery
### Scenario: Session Crash
- ✅ **Survives:** Files (MEMORY.md, daily notes, workspace-context)
- ❌ **Lost:** Session RAM (compaction would have cleared it anyway)
- 🔄 **Recovery:** Read files on restart, reconstruct context
### Scenario: Gateway Restart
- ✅ **Survives:** All files, SQLite DB
- ❌ **Lost:** Session state, cron job state (jobs must be recreated)
- 🔄 **Recovery:** Restart gateway, verify cron jobs are running
### Scenario: Disk Failure
- ✅ **Survives:** Supermemory.ai (cloud backup)
- ❌ **Lost:** Local files, SQLite DB
- 🔄 **Recovery:** Restore from Supermemory, recreate DB (re-extract from notes)
---
## Key Insights
1. **Text > Brain** — Files persist, my session doesn't
2. **Daily notes = raw, MEMORY.md = curated** — Filter noise from signal
3. **Worker = automated structuring** — Don't have to manually organize everything
4. **Hybrid = best of both** — Human-readable + machine-queryable
5. **Multiple backups** — Local files + cloud (Supermemory) + structured DB
---
## Related Files
- [[Memory System]] — OpenClaw summary (in OpenClaw/ folder)
- MEMORY.md — Curated long-term memory
- memory/YYYY-MM-DD.md — Daily notes
- workspace-context.md — Session bridge
- ~/.openclaw/memory.db — Structured SQLite database
---
*Generated: 2026-02-16*
*Copied to Obsidian: 2026-02-23*
*System Version: Multi-agent with SQLite extraction*


@@ -0,0 +1,40 @@
# Mission Control Dashboard (Python)
**Location:** C:\web\htdocs\mission-control-py
**URL:** http://localhost:5050
**Tech:** Python/Flask + MySQL + Home Assistant API
## Quick Reference
| Command | Action |
|---------|--------|
| Run | cd C:\web\htdocs\mission-control-py && python app.py |
| Port | 5050 (Frigate-safe) |
| Restart | Auto-reloads on file changes |
## Pages
| Route | Purpose |
|-------|---------|
| / | Gateway status, Cron jobs, Weather |
| /admin | Projects/Tasks CRUD |
| /news | News briefs archive |
| /alerts | System alerts |
| /calendar | HA calendar + F1 races |
| /memory | Memory file stats |
## Features
- **Caching:** 5-minute TTL for Gateway/Cron/Weather
- **Weather:** From HA weather.home entity
- **News:** Cron-fed JSON at 8 AM daily
- **Calendar:** HA integration (date range bug known)
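The 5-minute TTL cache could be as small as this (a sketch, not the dashboard's actual implementation; `gateway_status` is a hypothetical stand-in for the real fetchers):

```python
"""Tiny TTL cache for the Gateway/Cron/Weather fetches."""
import time
from functools import wraps

def ttl_cache(seconds=300):
    def decorator(fn):
        # caches a single result regardless of arguments,
        # which is fine for the no-argument fetchers used here
        state = {"value": None, "expires": 0.0}
        @wraps(fn)
        def wrapper(*args, **kwargs):
            now = time.monotonic()
            if now >= state["expires"]:
                state["value"] = fn(*args, **kwargs)
                state["expires"] = now + seconds
            return state["value"]
        return wrapper
    return decorator

@ttl_cache(seconds=300)
def gateway_status():
    # stand-in for the real Gateway status fetch
    return {"gateway": "ok", "checked": time.time()}
```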
## Known Issues
| Issue | Status |
|-------|--------|
| Calendar >10 days not showing | Known |
| Calendar agent creates cron not events | Known |
_Last updated: 2026-02-20_


@@ -0,0 +1,203 @@
---
title: Multi-User Agent Architecture
category: Projects
project: System Architecture
type: Research/Design
status: planning
date: 2026-02-24
tags: [openclaw, multi-user, architecture, collaboration, discord, design]
---
# Multi-User Agent Architecture
**Question:** What's the best way to handle multiple users using one agent for tasks and project management?
**Date:** 2026-02-24
**Status:** Planning/Research
---
## Executive Summary
Four approaches ranging from simple (separate channels) to complex (multi-agent). Recommendation: **Start with Option 1 (Separate Channels)** and add frontmatter tagging (Option 3) as needed.
---
## Option 1: Separate Channels Per User ⭐ RECOMMENDED
**Best for:** Small teams (2-5 people), clear separation
### Setup
```
Discord Server:
├── #home-assistant (shared - only you/active)
├── #corey-projects (your own)
├── #alice-projects (Alice's channel)
├── #bob-projects (Bob's channel)
└── #shared-projects (collaborative)
```
### How It Works
- Each user gets their own channel
- Agent has separate sessions for each
- Context isolation = no cross-contamination
- Shared projects in #shared-projects
### Pros
- ✅ Total privacy per user
- ✅ Simple to manage
- ✅ One agent, multiple sessions
- ✅ User-specific memory stays isolated
### Cons
- More channels to manage
- Shared projects need explicit cross-posting
---
## Option 2: User Tagging System (Hybrid)
**Best for:** Shared workspace with user-aware tasks
### Setup
- One shared channel (#projects)
- Users tag messages: `@alice` or `#for-alice`
- Or: "Task for Alice: ..."
### How It Works
- Agent identifies user from message content
- Store tasks with user attribution
- Query by user when needed
### Example Workflow
```
Corey: #task for @alice - Review the architecture doc
Alice: (in same channel) I reviewed it, looks good
Corey: #task for @bob - Deploy the update
```
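The agent's user-attribution step could be a small parser over this convention. The regex is an assumption; adjust it to whatever format the team settles on.

```python
"""Parse the '#task for @alice - ...' tagging convention (a sketch)."""
import re

TASK_RE = re.compile(r"#task\s+for\s+@(\w+)\s*[-:]\s*(.+)", re.IGNORECASE)

def parse_task(message: str):
    # returns {"assignee", "task"} when the message matches, else None
    m = TASK_RE.search(message)
    return {"assignee": m.group(1), "task": m.group(2).strip()} if m else None
```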
### Pros
- ✅ Single channel
- ✅ Collaborative feel
- ✅ Clear ownership
### Cons
- Requires discipline with tagging
- Context can get messy
- Privacy is opt-in (explicit tags)
---
## Option 3: User Profiles in Frontmatter
**Best for:** Shared Obsidian vault, Dataview queries
### Setup
- Tasks/projects have frontmatter with `assignee: alice`
- Query using Dataview: `WHERE assignee = "alice"`
### Example Task
```markdown
---
title: Review architecture
project: Memory System
assignee: alice
status: todo
due: 2026-02-25
---
Alice needs to review the memory system architecture.
```
### Query
```dataview
TABLE status, due
FROM "Tasks"
WHERE assignee = "alice" AND status = "todo"
SORT due ASC
```
### Pros
- ✅ Perfect for Obsidian
- ✅ Dataview-powered dashboards
- ✅ Automatic organization
### Cons
- Requires frontmatter discipline
- Real-time chat = harder to manage
---
## Option 4: Multi-Agent (Advanced)
**Best for:** Teams where each person wants their own agent
### Setup
- Each user has their own OpenClaw instance
- Shared resources via Obsidian/SQLite
- Sync via git or shared database
### How It Syncs
- Obsidian vault is shared (Syncthing/Nextcloud)
- Both agents read/write same files
- Different identities, same data
### Pros
- ✅ Full separation
- ✅ Personalized agents
- ✅ Shared long-term memory
### Cons
- Complex setup
- More resources (multiple agents)
- Merge conflicts possible
---
## Recommendation
**Start with:** Option 1 (Separate Channels)
**Why:**
- Simplest to implement
- No context pollution
- Easy permissions (Discord native)
- Can add Option 3 (Frontmatter) later
**Hybrid Approach:**
- Private channels per user (`#alice-projects`, `#bob-projects`)
- Shared channel for collaboration (`#team-projects`)
- Obsidian vault is shared with user-tagged frontmatter
---
## Decision Matrix
| Factor | Option 1 | Option 2 | Option 3 | Option 4 |
|--------|----------|----------|----------|----------|
| **Privacy** | High | Medium | High | High |
| **Setup Complexity** | Low | Low | Medium | High |
| **Collaboration** | Medium | High | Medium | Medium |
| **Scalability** | Medium | High | High | High |
| **Discord Channels** | Many | One | N/A | Many |
---
## Next Steps
1. **Define users:** How many people? Technical level?
2. **Privacy check:** Do users need isolation from each other?
3. **Collaboration needs:** Shared projects vs individual work?
4. **Pilot:** Start with Option 1, iterate based on feedback
---
## Related
- [[Memory System]] — How agent memory works
- [[Dataview Query Examples]] — Query by user/assignee
---
*Created: 2026-02-24*
*Based on discussion in #home-assistant*


@@ -0,0 +1,70 @@
---
title: Projects Index
category: Projects
status: active
last_updated: 2026-02-24
tags: [projects, index, active, completed]
---
# Projects Index
## Active Projects
### Mission Control Python
- Status: Live on :5050
- Type: Dashboard
- Tech: Flask + MySQL + HA API
- Notes: 6 pages, cron integration, weather, calendar
- Issues: Calendar date range bug, calendar agent bug
### Home Assistant
- Status: Entity cleanup ongoing
- Type: Home Automation
- Notes: 972 devices cleanup, Room-Assistant working
### Memory System
- Status: Hierarchical restructure complete
- Type: Infrastructure
- Notes: File + SQLite hybrid, Supermemory backups
### Coding Workflow
- Status: Active
- Type: DevOps
- Notes: Git/controls-web, Kimi/mini model split
### Auto-Summarize Workflow {#sync}
- Status: Active
- Type: Agent Capability
- Notes: "/summarize [URL]" → fetch → summarize → save to Summaries/
- Works with: Any web URL
### Multi-User Agent Architecture
- Status: Planning
- Type: System Design
- Notes: Researching multi-user approaches for task/project management
- Options: Separate channels, tagging system, frontmatter, multi-agent
- Decision: Start with Option 1 (separate Discord channels)
## Completed Projects
### Discord Voice Bot (GLaDOS)
- Status: Completed 2026-02-08
- Type: Voice Bot
- Notes: Always in #coding voice
### Proton Mail Bridge
- Status: Completed 2026-02-18
- Type: Email Integration
- Notes: IMAP/SMTP tested
## How to Sync
1. Edit any project note in Obsidian
2. Tell me to sync
3. I read the updated note and update my memory
### How to Summarize
1. Type: `/summarize [URL]`
2. I fetch, summarize, and save to `Summaries/`
3. Confirm and link back to you
Last Updated: 2026-02-24


@@ -0,0 +1,84 @@
# Proton Mail Bridge
## Overview
Email integration via Proton Mail Bridge, which exposes local IMAP/SMTP endpoints usable with standard email clients and libraries.
---
## Connection Details
- **IMAP Server:** 127.0.0.1:1143
- **SMTP Server:** 127.0.0.1:1025
- **Username:** alexthenerdyai@proton.me
- **Security:** AUTH=PLAIN (local connection only)
- **Bridge Version:** Proton Mail Bridge 03.22.00 - gluon
---
## Test Results (2026-02-18)
✅ Connected successfully to Bridge IMAP
✅ Authenticated with Bridge password
✅ Retrieved folder list (10 folders found)
✅ INBOX has 3 messages, all unread
---
## Folders
- INBOX
- Sent
- Drafts
- Starred
- Archive
- Spam
- Trash
- All Mail
- Folders
- Labels
---
## Capabilities
- Read emails from INBOX
- Search, mark read/unread, move messages
- Send emails via SMTP
- Full folder management
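The working test script isn't reproduced above, but the stdlib pattern it uses looks like this. STARTTLS on 1143 is an assumption; as noted below, the Bridge also accepts plain AUTH on the local connection.

```python
"""Sketch of the imaplib pattern behind proton_imap_simple.py
(not the actual script)."""
import imaplib

def count_ids(raw: bytes) -> int:
    # imaplib returns ids as one space-separated bytestring, e.g. b"1 2 3"
    return len(raw.split()) if raw else 0

def unread_count(password: str, user: str = "alexthenerdyai@proton.me") -> int:
    imap = imaplib.IMAP4("127.0.0.1", 1143)
    imap.starttls()
    imap.login(user, password)  # the Bridge-generated password, not the Proton one
    imap.select("INBOX")
    _, data = imap.search(None, "UNSEEN")
    count = count_ids(data[0])
    imap.logout()
    return count
```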
---
## Files Created
- `tools/proton_imap_test.py` — Initial test script
- `tools/proton_imap_simple.py` — Working IMAP test script
---
## Potential Uses
- Daily email digest
- Notifications
- Automated responses
- Email-based triggers (IFTTT style)
---
## Next Steps
- [ ] Integrate with cron jobs for email-based actions
- [ ] Create email notification system
- [ ] Set up email-to-Discord forwarding
- [ ] Build email digest bot
---
## Technical Notes
- Bridge runs locally (127.0.0.1)
- AUTH=PLAIN is safe because connection is local only
- Standard Python imaplib/smtplib libraries work
- No Proton-specific libraries needed
---
## Status
**FUNCTIONAL** — Bridge connected and tested
Ready for automation integration.
---
*Status: Complete & Tested | Updated: 2026-02-22*