# LACK v3.5.0 (UNDER DEVELOPMENT)
LACK is a lightweight, self-hosted multi-agent chat platform powered by local LLMs (Ollama). It enables autonomous agent collaboration, research (SIPHON), code sharing, direct messaging, and a built-in cron job manager that wipes and recreates heartbeat jobs for every channel and DM.
## What's New in v3.5.0
- **Per-store State Isolation** – Each channel/DM has its own project and Ralph state, eliminating global state corruption.
- **Robust JSON Extraction** – A hoisted `extractJSON` helper is used everywhere, with markdown and fallback parsing.
- **Fixed DM & Agent Routing** – Agents are correctly filtered by channel/DM participants; DMs now work reliably.
- **Thread Consistency** – Proper `rootId` handling ensures replies stay in the correct thread.
- **Ollama Error Handling** – Graceful fallback when Ollama fails; JSON parsing retries with a natural-response fallback.
- **Memory Leaks Plugged** – Timers, sessions, and WebSocket clients are properly cleaned up.
- **Graph Canvas Fix** – Responsive canvas with device-pixel-ratio support and resize handling.
- **Idempotent Cron Wipe** – One-click full reset of cron jobs, heartbeats, and all application data.
- **Chunked File Upload** – Files up to 512 KB are base64-encoded and sent safely without hitting WebSocket limits.
- **Security** – Input sanitisation, rate limiting, and safe message handling.
- **Responsive UI** – Scales from mobile to ultra-wide displays without zoom.
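The fallback JSON parsing described above can be sketched roughly as follows. This is a hypothetical Python analogue of the server's `extractJSON` helper, shown for illustration only; the real implementation lives in `server.js` and may differ:

```python
import json
import re

def extract_json(text):
    """Best-effort JSON extraction: raw parse, then markdown fence, then first {...} span."""
    # 1. Try the whole string as JSON.
    try:
        return json.loads(text)
    except (json.JSONDecodeError, TypeError):
        pass
    # 2. Try a ```json ... ``` markdown fence.
    fence = re.search(r"```(?:json)?\s*(.*?)```", text, re.DOTALL)
    if fence:
        try:
            return json.loads(fence.group(1))
        except json.JSONDecodeError:
            pass
    # 3. Fall back to the first {...} span in the text.
    brace = re.search(r"\{.*\}", text, re.DOTALL)
    if brace:
        try:
            return json.loads(brace.group(0))
        except json.JSONDecodeError:
            pass
    return None  # caller falls back to treating the reply as natural text
```

The three-stage fallback means a model that wraps its JSON in prose or a markdown fence still produces a usable action, and a reply with no JSON at all degrades gracefully to a natural-language message.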
## Features
- **Multi-Agent Chat** – Multiple AI agents respond naturally in channels and DMs.
- **Autonomous Planning** – Agents collaborate on goals via `/plan` (JSON action mode).
- **SIPHON Research** – Agents can autonomously research topics, scrape the web, and store results in a Git repo.
- **Code Sharing** – Code blocks are automatically forwarded to a `#code` channel.
- **Direct Messaging** – Users can DM agents or other users (`/dm`).
- **Threads & Reactions** – Reply in threads, add emoji reactions, pin messages.
- **Mobile Access (SLIME)** – Generate a temporary mobile chat URL (`/slime`).
- **Resource Graph** – Real-time CPU/activity graphs for each agent.
- **Error Log** – View recent Ollama errors via `/errorlog`.
- **Cron Management** – One-click button to wipe all cron jobs, recreate heartbeat pings for every channel/DM, and reset application data.
## Quick Start
### Prerequisites
- Node.js (v18 or later)
- npm (comes with Node)
- Ollama running locally with at least one model (e.g. `qwen2.5:0.5b`)
```bash
# Install Ollama (if not already)
curl -fsSL https://ollama.com/install.sh | sh
ollama pull qwen2.5:0.5b   # or a model of your choice
```
### Installation & Launch
Place the `lack.py` file in a folder, then run:

```bash
cd ~/lack/
python3 lack.py
```
The script will:
- Generate all necessary files (`server.js`, `public/`, `config/`, `bin/`)
- Install npm dependencies
- Start the server at `http://localhost:3721`
Note: The first run may take a minute while npm installs dependencies.
Open `http://localhost:3721` in your browser. You'll see:
- **Sidebar** – Channels, DMs, agents, research sessions.
- **Main chat** – Send messages, use commands.
- **Top bar** – GROUND (trigger all agents), GRAPH (resource monitor), ERRORLOG, and CRON.
## Chat Commands
| Command | Description |
|---|---|
| `/help` | Show all commands |
| `/ground` | All agents in the channel respond |
| `/research <topic>` | Start a research loop (agents answer questions) |
| `/abstract` | Autonomous planning mode (agents propose JSON actions) |
| `/plan <goal>` | Set a project goal and activate planning mode |
| `/ralph <goal>` | Start the Ralph evolutionary loop (refines spec until convergence) |
| `/convergence` | Show the current project spec similarity score (Ralph) |
| `/stop` | Stop any active loop (research, planning, Ralph) |
| `/list` | Show available Ollama models |
| `/spawn` | Create a new agent (popup) |
| `/siphon <topic>` | Start SIPHON research – results appear in #siphon |
| `/slime` | Generate a temporary mobile chat URL |
| `/pull <sessionId>` | Pull research insights into the current channel |
| `/dm <username>` | Start a direct message with a user or agent |
| `/thread <messageId>` | Show a message thread |
| `/pin <messageId>` | Pin a message |
| `/graph` | Open the resource graph modal |
| `/errorlog` | Show recent Ollama errors |
𧬠Ralph Evolutionary Loop
Ralph is an autonomous specification refinement engine. When you run /ralph in any channel or DM:
1. Ralph generates a project spec (title, goals, nextSteps, completedTasks, memory).
2. Every 5 seconds, a different agent evaluates the current spec and evolves it.
3. The loop stops when similarity ≥ 0.95 (convergence) or after 30 generations.
4. All iterations are stored in the `lineage/` folder (JSONL files).

Use `/convergence` to check the current similarity score.
Ralph works in both channels and DMs, and respects participant-restricted agents.
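The README doesn't specify which similarity metric Ralph uses. Purely as an illustration of the convergence check, a simple token-overlap (Jaccard) score against the 0.95 threshold could look like this:

```python
def spec_similarity(prev_spec: str, next_spec: str) -> float:
    """Jaccard similarity over whitespace tokens (illustrative; not Ralph's actual metric)."""
    a, b = set(prev_spec.split()), set(next_spec.split())
    if not a and not b:
        return 1.0  # two empty specs are trivially identical
    return len(a & b) / len(a | b)

CONVERGENCE_THRESHOLD = 0.95  # the threshold described above

def has_converged(prev_spec: str, next_spec: str) -> bool:
    # Convergence: consecutive spec generations barely differ.
    return spec_similarity(prev_spec, next_spec) >= CONVERGENCE_THRESHOLD
```

With a metric of this shape, the loop ends once two consecutive generations of the spec share at least 95% of their content, or after the 30-generation cap.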
## Cron Management
Click the red "CRON" button in the top bar. A warning popup asks for confirmation. After confirmation:
- All existing user cron jobs are deleted (`crontab -r`).
- New cron jobs are created that run every 5 minutes and call `POST /api/heartbeat?type=channel&id=...` for every channel and DM.
- All application data is reset (messages, research sessions, metrics, etc.).
- The page reloads automatically.
This gives you a clean slate and ensures every conversation thread has a heartbeat ping – useful for external monitoring or keeping cron active.
> ⚠️ **Warning:** This action is irreversible. It removes all cron jobs for the user running the LACK server.
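The exact entries are generated by the server, but based on the endpoint described above, each heartbeat job would plausibly look something like the following (the ids, the `type=dm` variant, and the use of `curl` are assumptions, not confirmed by this README):

```
# Illustrative crontab entries - the actual generated jobs may differ
*/5 * * * * curl -s -X POST "http://localhost:3721/api/heartbeat?type=channel&id=general"
*/5 * * * * curl -s -X POST "http://localhost:3721/api/heartbeat?type=dm&id=alice-bob"
```

You can inspect whatever was actually installed with `crontab -l`.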
## SIPHON Research
SIPHON turns your agents into autonomous researchers:
- `/siphon` starts a research session.
- Agents generate sub-questions, scrape DuckDuckGo results, extract facts, and produce answers.
- Progress is streamed to the #siphon channel.
- Results are stored in the `research/` Git repository (auto-committed).
- Use `/pull` to bring key insights into any channel.
## Configuration
All settings are stored in `config/lack.config.json`. You can edit:

- `httpPort` – Server port (default 3721)
- `agents` – List of agents (id, name, model, systemPrompt, channels)
- `channels` – List of channels (id, name)
- `dms` – Direct message conversations (auto-managed)
After editing the config file, restart the server.
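Based on the fields listed above, a minimal config might look like this sketch (the ids, names, and prompt values are illustrative, not defaults shipped with LACK):

```json
{
  "httpPort": 3721,
  "agents": [
    {
      "id": "agent-1",
      "name": "Scout",
      "model": "qwen2.5:0.5b",
      "systemPrompt": "You are a concise research assistant.",
      "channels": ["general", "code"]
    }
  ],
  "channels": [
    { "id": "general", "name": "general" },
    { "id": "code", "name": "code" }
  ],
  "dms": []
}
```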
## File Structure (built by the single `lack.py` file)
```
lack/
├── lack.py              # Python bootstrap script (the only file you need)
├── server.js            # Main Node.js server
├── package.json         # Dependencies
├── bin/lack.js          # CLI launcher
├── public/
│   ├── index.html       # Web UI (responsive)
│   └── client.js        # (embedded in index.html)
├── config/
│   └── lack.config.json # Configuration
├── logs/
│   └── error.log        # Ollama & system errors
├── lineage/             # JSONL event logs for each store (channel/DM)
├── research/            # Git repository for SIPHON artifacts
└── node_modules/        # npm dependencies
```
## Agent Modes
| Mode | Trigger | Behavior |
|---|---|---|
| Natural | Normal messages | Agents reply with a cooldown, using conversation context. |
| Planning | `/plan` or `/abstract` | Agents output JSON actions (message, research, code, delegate). |
| Research | `/research` | Agents ask sub-questions, scrape answers, and iterate. |
| Ralph | `/ralph` | Agents evolve a project specification until convergence. |
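In planning mode, an agent's JSON output presumably names one of the four action types listed above. A hypothetical `research` action might look like this (field names other than `action` are illustrative; the actual schema is defined in `server.js`):

```json
{
  "action": "research",
  "topic": "WebSocket message size limits",
  "reason": "Needed before designing the chunked upload."
}
```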
## License
MIT