# BTC Accumulation Bot - Context & Guidelines

High-performance crypto data collection and trading system for cbBTC on Hyperliquid, optimized for Synology NAS deployment.

## Project Overview

- **Purpose**: Collect 1-minute candle data for cbBTC-PERP from Hyperliquid, compute technical indicators on custom timeframes (e.g., 37m, 148m), and execute accumulation strategies.
- **Tech Stack**:
  - **Backend**: Python 3.11+ (FastAPI, asyncio, websockets, asyncpg, pandas, numpy).
  - **Database**: TimescaleDB (PostgreSQL 15 extension) on Synology DS218+.
  - **Infrastructure**: Docker Compose (network mode: `host` for performance).
  - **Frontend**: Vanilla JS dashboard for real-time monitoring and charts.
- **Architecture**:
  - `DataCollector`: Main orchestrator managing WebSocket ingestion, buffering, and database writes.
  - `IndicatorEngine`: Computes technical indicators (MA44, MA125) on multiple timeframes.
  - `Brain`: Decision engine that evaluates signals based on indicator state.
  - `CustomTimeframeGenerator`: Dynamically generates non-standard intervals from 1m base candles.

## Building and Running

### Development & Deployment

- **Deployment**: Run `chmod +x scripts/deploy.sh && ./scripts/deploy.sh` to scaffold directories and start Docker services.
- **Service Control**:
  - Start: `cd docker && docker-compose up -d`
  - Stop: `cd docker && docker-compose down`
  - Logs: `docker-compose logs -f [service_name]` (e.g., `data_collector`, `api_server`, `timescaledb`)
- **Backups**: `scripts/backup.sh` (manages PostgreSQL dumps with 7-day retention).

### Local API Server (Hybrid Setup)

To run the dashboard/API locally while the database remains on the NAS:

1. **NAS Config**: Ensure the `5433:5432` port mapping is active in `docker-compose.yml`.
2. **Local Environment**: Create a `.env` file locally:

   ```env
   DB_HOST=NAS_IP_ADDRESS
   DB_PORT=5433
   DB_NAME=btc_data
   DB_USER=btc_bot
   DB_PASSWORD=YOUR_PASSWORD
   ```
3. **Run Locally**:

   ```bash
   pip install -r requirements.txt
   python -m uvicorn src.api.server:app --host 0.0.0.0 --port 8000 --reload
   ```

### Testing

- **Manual Verification**:
  - API Health: `curl http://localhost:8000/api/v1/health`
  - DB Status: `docker exec btc_timescale pg_isready -U btc_bot`
- **Indicator Testing**: `scripts/test_ma44_performance.py`

## Development Conventions

### Coding Standards

- **Modularity**: Keep files small (< 500 lines) and focused on a single responsibility.
- **Async First**: Use `asyncio` for all I/O-bound operations (WebSockets, database, API).
- **Type Safety**: Use Pydantic models for configuration and API responses.
- **Logging**: Use structured logging with console prefixes: `[SYSTEM]`, `[CLP]`, `[HEDGE]`, `[MONITOR]`.
- **Golden Rule (Data)**: Whenever possible, read the source of truth from the blockchain or the primary API (Hyperliquid) rather than relying on local calculations.
- **Golden Rule (Positioning)**: CLP positions must use symmetric grid snapping.

### Database Design

- **Hypertables**: Use TimescaleDB hypertables for `candles` and `indicators` with weekly partitioning.
- **Compression**: Automatic compression is enabled for data older than 7 days to save space on the NAS.
- **Gaps**: Continuous gap detection with automatic backfill from the Hyperliquid REST API.

### Dashboard & API

- **FastAPI**: Main entry point in `src/api/server.py`.
- **Static Assets**: Dashboard located in `src/api/dashboard/static/`.
- **Indicator Engine**: JS-side mirror of the indicators for the UI chart, located in `src/api/dashboard/static/js/indicators/`.

## Project Roadmap

- **Phase 1 (Complete)**: Data collection, TimescaleDB integration, basic API.
- **Phase 2 (In Progress)**: Indicator Engine (SMA, EMA, RSI, MACD), Brain decision logic.
- **Phase 3 (TODO)**: Web3 execution layer (Uniswap V3 swaps on Base).
- **Phase 4 (TODO)**: Aave V3 integration for yield on collected cbBTC.
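## Appendix: Illustrative Snippets

The `CustomTimeframeGenerator` described in the Architecture section aggregates 1m base candles into non-standard intervals such as 37m or 148m. A minimal sketch of that aggregation using pandas — the function name, DataFrame layout, and bucket alignment (midnight of the first day) are assumptions for illustration, not the project's actual API:

```python
import pandas as pd

def resample_candles(df_1m: pd.DataFrame, minutes: int) -> pd.DataFrame:
    """Aggregate 1m OHLCV candles into a custom interval (e.g. 37m).

    Assumes df_1m has a UTC DatetimeIndex and columns
    open/high/low/close/volume. Hypothetical helper only.
    """
    agg = {
        "open": "first",   # first 1m open in the bucket
        "high": "max",     # highest high across the bucket
        "low": "min",      # lowest low across the bucket
        "close": "last",   # last 1m close in the bucket
        "volume": "sum",   # total traded volume
    }
    # Default resample origin anchors buckets at midnight of the first day;
    # the real generator may anchor differently (e.g. at the Unix epoch).
    return df_1m.resample(f"{minutes}min").agg(agg).dropna()
```

Dropping incomplete buckets via `dropna()` mirrors the usual convention that a custom candle only exists once all of its 1m constituents have arrived.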
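The `IndicatorEngine` computes MA44 and MA125 on each timeframe. The sketch below mirrors the arithmetic only; the real engine presumably runs per-timeframe and persists results to the `indicators` hypertable, and `add_moving_averages` is a hypothetical name:

```python
import pandas as pd

def add_moving_averages(closes: pd.Series) -> pd.DataFrame:
    """Return close prices with MA44/MA125 simple moving averages.

    min_periods equals the window so the MA is NaN until enough
    candles exist, matching the usual charting convention.
    """
    return pd.DataFrame({
        "close": closes,
        "ma44": closes.rolling(window=44, min_periods=44).mean(),
        "ma125": closes.rolling(window=125, min_periods=125).mean(),
    })
```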
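The gap-detection rule in Database Design can be reduced to a scan over sorted candle timestamps: any step larger than one minute marks a missing range to backfill from the Hyperliquid REST API. A minimal sketch of that scan (the function name and tuple shape are assumptions):

```python
from datetime import datetime, timedelta, timezone

def find_gaps(timestamps, step=timedelta(minutes=1)):
    """Return (gap_start, gap_end) pairs of missing 1m candles.

    timestamps must be sorted ascending; each returned pair is the
    inclusive range of candle times absent from the sequence.
    """
    gaps = []
    for prev, nxt in zip(timestamps, timestamps[1:]):
        if nxt - prev > step:
            # Everything strictly between prev and nxt is missing.
            gaps.append((prev + step, nxt - step))
    return gaps
```

The production collector would feed each returned range to a REST backfill request rather than just reporting it.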
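The Positioning golden rule requires symmetric grid snapping for CLP positions. The document does not specify the grid semantics, so the following is only an illustrative nearest-multiple snap, not the project's actual rule:

```python
def snap_to_grid(price: float, grid_step: float) -> float:
    """Snap a price to the nearest grid level (multiples of grid_step).

    Illustrative only: exact tie-breaking and grid anchoring for CLP
    positions are not specified in this document. Note that Python's
    round() uses banker's rounding at exact midpoints.
    """
    return round(price / grid_step) * grid_step
```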