⚠ SETUP REQUIRED — CAN'T OPEN AS file://

WebLLM needs SharedArrayBuffer (for WebAssembly threads).
Browsers only expose SharedArrayBuffer in cross-origin-isolated contexts,
which file:// URLs cannot provide — serve the page over localhost HTTP instead:

Run proxy.py (recommended):
python proxy.py 8080
Then open: http://127.0.0.1:8080/test.html

proxy.py serves static files, adds the COOP/COEP headers, and provides a CORS proxy for the search/scrape tools.
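proxy.py itself is not shown here, but the header-adding part can be sketched with Python's stdlib alone. This is a minimal illustration, not proxy.py's actual code; the class and function names are made up, and the CORS-proxy half is omitted:

```python
from http.server import SimpleHTTPRequestHandler, ThreadingHTTPServer

class COIHandler(SimpleHTTPRequestHandler):
    """Static-file handler that marks responses as cross-origin isolated."""

    def end_headers(self):
        # These two headers opt the page into cross-origin isolation,
        # which is what unlocks SharedArrayBuffer in modern browsers.
        self.send_header("Cross-Origin-Opener-Policy", "same-origin")
        self.send_header("Cross-Origin-Embedder-Policy", "require-corp")
        super().end_headers()

def serve(port=8080):
    # Serves the current directory, like `python proxy.py 8080` would.
    ThreadingHTTPServer(("127.0.0.1", port), COIHandler).serve_forever()
```

Any server works as long as every response carries those two headers; without them the browser keeps SharedArrayBuffer undefined even over HTTP.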
⚡ Chronos — ReAct Agent
FULL AGENT · WEBLLM EDITION
Features:
✓ soul.md / user.md — persistent memory
✓ skills/*.md — dynamic injection
✓ ReAct loop: think → act → observe
✓ Tools: web_search · scrape · summarize · remember
✓ read_memory · forget · schedule · inject_js
✓ rag_index · rag_search · rag_prompt — hybrid retrieval
✓ network_scan — discover local services
✓ Ctrl+` — console log popup
✓ ↑ Arrow — command palette
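The think → act → observe loop above can be sketched in a few lines. This is a generic ReAct skeleton under assumed conventions, not Chronos's implementation: the `Action: tool[arg]` format, the `parse_action` helper, and the tool names are all illustrative:

```python
import re

def parse_action(step):
    # Assumed convention: the model emits a line like "Action: tool_name[argument]".
    m = re.search(r"Action:\s*(\w+)\[(.*?)\]", step)
    return m.group(1), m.group(2)

def react_loop(llm, tools, question, max_iters=8):
    """Run think -> act -> observe until the model emits a final answer."""
    transcript = f"Question: {question}\n"
    for _ in range(max_iters):
        step = llm(transcript)          # think: model emits a Thought plus Action or answer
        transcript += step + "\n"
        if "Final Answer:" in step:
            return step.split("Final Answer:", 1)[1].strip()
        if "Action:" in step:
            name, arg = parse_action(step)        # act: dispatch to the named tool
            transcript += f"Observation: {tools[name](arg)}\n"  # observe: feed result back
    return None  # iteration budget exhausted
```

Each tool (web_search, scrape, remember, ...) is just a function in the `tools` dict; the loop feeds its output back into the transcript so the next model call can reason over it.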

⚠ Serve with: python proxy.py 8080
← SELECT MODEL AND CLICK LOAD