Compare commits

2 Commits

Author SHA1 Message Date
218f0f5107 refactor: modularize dashboard strategies and enhance indicator engine
- Refactored strategy-panel.js to use a modular registry system for trading strategies.
- Introduced PingPongStrategy and moved strategy-specific logic to a new strategies/ directory.
- Enhanced the indicator engine with Multi-Timeframe (MTF) support and robust forward-filling.
- Optimized BaseIndicator and RMA calculations for better performance.
- Updated UI components (chart.js, indicators-panel, signal-markers) to support the new architecture.
- Added markers-plugin.js for improved signal visualization.
2026-03-10 11:52:11 +01:00
8b167f8b2c docs: add technical description for Ping-Pong strategy with RSI and Hurst Bands 2026-03-05 21:12:06 +01:00
74 changed files with 14404 additions and 360 deletions

89
DOCKER_GUIDE.md Normal file

@ -0,0 +1,89 @@
# Docker Management & Troubleshooting Guide
This guide provides the necessary commands to build, manage, and troubleshoot the BTC Bot Docker environment.
## 1. Manual Build Commands
Always execute these commands from the **project root** directory.
```bash
# Build the Data Collector
docker build --network host -f docker/Dockerfile.collector -t btc_collector .
# Build the API Server
docker build --network host -f docker/Dockerfile.api -t btc_api .
# Build the Bot (Ensure the tag matches docker-compose.yml)
docker build --no-cache --network host -f docker/Dockerfile.bot -t btc_ping_pong_bot .
```
---
## 2. Managing Containers
Run these commands from the **docker/** directory (`~/btc_bot/docker`).
### Restart All Services
```bash
# Full reset: Stop, remove, and recreate all containers
docker-compose down
docker-compose up -d
```
### Partial Restart (Specific Service)
```bash
# Rebuild and restart only the bot (ignores dependencies like DB)
docker-compose up -d --no-deps ping_pong_bot
```
### Stop/Start Services
```bash
docker-compose stop <service_name> # Temporarily stop
docker-compose start <service_name> # Start a stopped container
```
---
## 3. Checking Logs
Use these commands to diagnose why a service might be crashing or restarting.
```bash
# Follow live logs for the Bot (last 100 lines)
docker-compose logs -f --tail 100 ping_pong_bot
# Follow live logs for the Collector
docker-compose logs -f data_collector
# Follow live logs for the API Server
docker-compose logs -f api_server
# View logs for ALL services combined
docker-compose logs -f
```
---
## 4. Troubleshooting Checklist
| Symptom | Common Cause & Solution |
| :--- | :--- |
| **`.env` Parsing Warning** | Check for `//` comments (use `#` instead) or hidden characters at the start of the file. |
| **Container "Restarting" Loop** | Check logs! Usually missing `API_KEY`/`API_SECRET` or DB connection failure. |
| **"No containers to restart"** | Use `docker-compose up -d` first. `restart` only works for existing containers. |
| **Database Connection Refused** | Ensure `DB_PORT=5433` is used for `host` network mode. Check if port is open with `netstat`. |
| **Code Changes Not Applying** | Rebuild the image (`--no-cache`) if you changed `requirements.txt` or the `Dockerfile`. |
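The `.env` symptom in the first row can be caught before starting the stack. A minimal pre-flight linter might look like the sketch below (illustrative only — the function name and rules are assumptions, not part of the project):

```python
def lint_env_file(path: str) -> list[str]:
    """Report .env lines that commonly break docker-compose parsing."""
    problems = []
    # utf-8-sig transparently strips a hidden BOM at the start of the file
    with open(path, encoding="utf-8-sig") as fh:
        for lineno, line in enumerate(fh, start=1):
            stripped = line.strip()
            if stripped.startswith("//"):
                problems.append(f"line {lineno}: '//' comment; use '#' instead")
            elif stripped and not stripped.startswith("#") and "=" not in stripped:
                problems.append(f"line {lineno}: expected KEY=VALUE or a '#' comment")
    return problems

# Usage sketch: print("\n".join(lint_env_file(".env")) or "OK")
```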
---
## 5. Useful Debugging Commands
```bash
# Check status of all containers
docker-compose ps
# List all local docker images
docker images
# Check if the database port is listening on the host
netstat -tulnp | grep 5433
# Access the shell inside a running container
docker exec -it btc_ping_pong_bot /bin/bash
```

65
GEMINI.md Normal file

@ -0,0 +1,65 @@
# Gemini Context: BTC Trading Dashboard
This project is a Bitcoin trading platform and automated bot system. It features a FastAPI backend, a real-time data collector, a PostgreSQL (TimescaleDB) database, and an interactive HTML/JS dashboard for technical analysis and strategy visualization.
## Project Overview
- **Purpose**: Real-time BTC data collection, technical indicator computation, and trading strategy execution/backtesting.
- **Core Technologies**:
- **Backend**: Python 3.9+ with FastAPI.
- **Frontend**: Vanilla HTML/JS with `lightweight-charts`.
- **Database**: PostgreSQL with TimescaleDB extension for time-series optimization.
- **Infrastructure**: Docker & Docker Compose.
- **Architecture**:
- `data_collector`: Handles WebSocket data ingestion and custom timeframe generation.
- `api_server`: Serves the dashboard and REST API for candle/indicator data.
- `indicator_engine`: Computes SMA, EMA, and specialized HTS indicators.
- `strategies`: Contains trading logic (e.g., Ping Pong bot, HTS strategy).
## Building and Running
### Local Setup
1. **Environment**:
```bash
python -m venv venv
source venv/bin/activate # venv\Scripts\activate on Windows
pip install -r requirements.txt
```
2. **Configuration**: Create a `.env` file based on the project's requirements (see `README.md`).
3. **Database Test**: `python test_db.py`
4. **Run API Server**: `uvicorn src.api.server:app --reload --host 0.0.0.0 --port 8000`
### Docker Deployment
- **Commands**:
- `docker-compose up -d` (from the `docker/` directory or root depending on setup).
- **Services**: `timescaledb`, `data_collector`, `api_server`, `ping_pong_bot`.
## Key Files and Directories
- `src/api/server.py`: FastAPI entry point and REST endpoints.
- `src/data_collector/main.py`: Data collection service logic.
- `src/data_collector/indicator_engine.py`: Technical indicator calculations (stateless math).
- `src/api/dashboard/static/`: Frontend assets (HTML, CSS, JS).
- `src/strategies/`: Directory for trading strategy implementations.
- `HTS_STRATEGY.md`: Detailed documentation for the "Higher Timeframe Trend System" strategy.
- `AGENTS.md`: Specific coding guidelines and standards for AI agents.
## Development Conventions
### Python Standards
- **Style**: Follow PEP 8; use Type Hints consistently.
- **Documentation**: Use Google-style docstrings for all public functions and classes.
- **Asynchrony**: Use `async`/`await` for all database (via `asyncpg`) and network operations.
- **Validation**: Use Pydantic models for data validation and settings.
### Frontend Standards
- **Tech**: Vanilla CSS (Avoid Tailwind unless requested) and Vanilla JS.
- **Location**: Static files reside in `src/api/dashboard/static/`.
### AI Coding Guidelines (from `AGENTS.md`)
- **Organization**: Place new code in corresponding modules (`api`, `data_collector`, `strategies`).
- **Error Handling**: Use explicit exceptions; log errors with context; never suppress silently.
- **Security**: Protect credentials; use environment variables; validate all inputs.
## Strategy: HTS (Higher Timeframe Trend System)
The project emphasizes the **HTS strategy**, which uses fast (33) and slow (144) RMA channels to identify trends. Key rules include price position relative to Red (Slow) and Aqua (Fast) channels, and a 1H Red Zone filter for long trades. Refer to `HTS_STRATEGY.md` for full logic.

79
HTS_STRATEGY.md Normal file

@ -0,0 +1,79 @@
# HTS (Higher Timeframe Trend System) Strategy
A trend-following strategy based on channel breakouts using fast and slow moving averages of High/Low prices.
## Strategy Rules
### 1. Core Trend Signal
- **Bullish Trend**: Price trades above the Red (Slow) Channel and the Aqua (Fast) Channel is above the Red Channel
- **Bearish Trend**: Price trades below the Red (Slow) Channel and the Aqua (Fast) Channel is below the Red Channel
### 2. Entry Rules
- **Long Entry**: Wait for price to break above the Slow Red Channel, then look for a candle close above shorth (the Fast High line) while the fast lines are above the slow lines.
- **Short Entry**: Wait for price to break below the Slow Red Channel, then look for a candle close below shortl (the Fast Low line) while the fast lines are below the slow lines.
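Expressed as code, the two entry rules reduce to a pair of boolean checks. The sketch below is a hedged illustration — the parameter names (`shorth`, `shortl`, `longh`, `longl`) follow the line names used in this document, but the project's actual strategy implementation may differ:

```python
def long_entry(close: float, shorth: float, shortl: float,
               longh: float, longl: float) -> bool:
    """Close above the fast high line while the fast channel sits above the slow one."""
    fast_above_slow = shorth > longh and shortl > longl
    return fast_above_slow and close > shorth

def short_entry(close: float, shorth: float, shortl: float,
                longh: float, longl: float) -> bool:
    """Close below the fast low line while the fast channel sits below the slow one."""
    fast_below_slow = shorth < longh and shortl < longl
    return fast_below_slow and close < shortl
```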
### 3. 1H Red Zone Filter
- Only take Longs if the price is above the 1H Red Zone (Slow Channel), regardless of fast line direction
- Can be disabled in configuration
### 4. Stop Loss & Trailing Stop
- **Stop Loss**: Place on opposite side of Red (Slow) Channel
- Long stop: longl (Slow Low) line
- Short stop: longh (Slow High) line
- **Trailing Stop**: As Red Channel moves, move stop loss accordingly
### 5. RMA Default
- Uses RMA (Running Moving Average) by default - slower and smoother than EMA
- Designed for long-term trends, late to react to sudden crashes (feature, not bug)
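For reference, RMA (Wilder's smoothing) can be written in a few lines. This is the generic textbook formulation, not the project's `indicator_engine` code: it seeds with the simple average of the first `n` values and then applies `rma = (prev * (n - 1) + x) / n`:

```python
def rma(values: list[float], n: int) -> list[float]:
    """Wilder's RMA: an EMA with alpha = 1/n, seeded by the SMA of the first n values."""
    if len(values) < n:
        return []
    out = [sum(values[:n]) / n]                   # seed with the SMA
    for x in values[n:]:
        out.append((out[-1] * (n - 1) + x) / n)   # smooth the remaining values
    return out
```

The `(n - 1) / n` weighting is what makes RMA slower than an EMA of the same period, which is why it smooths long-term trends well but reacts late to sudden crashes.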
## Configuration Parameters
| Parameter | Default | Range | Description |
|-----------|---------|-------|-------------|
| `shortPeriod` | 33 | 5-200 | Fast period for HTS |
| `longPeriod` | 144 | 10-500 | Slow period for HTS |
| `maType` | RMA | - | Moving average type (RMA/SMA/EMA/WMA/VWMA) |
| `useAutoHTS` | false | - | Compute HTS on timeframe/4 from 1m data |
| `use1HFilter` | true | - | Enable 1H Red Zone filter |
## Usage
1. Select "HTS Trend Strategy" from the strategies dropdown
2. Configure parameters:
- Periods: typically 33/144 for 15min-1hour charts
- Enable Auto HTS for multi-timeframe analysis
- Enable/disable 1H filter as needed
3. Run simulation to see backtesting results
4. View entry/exit markers on the chart
## Visualization
- **Cyan Lines**: Fast channels (33-period)
- **Red Lines**: Slow channels (144-period)
- **Green Arrows**: Buy signals (fast low crossover)
- **Red Arrows**: Sell signals (fast high crossover)
- **Background Shading**: Trend zones (green=bullish, red=bearish)
## Signal Strength
Pure HTS signals don't mix with other indicators. Signals are based solely on:
- Crossover detection
- Channel alignment
- Price position relative to channels
- Higher timeframe confirmation (1H filter if enabled)
## Example Setup
For a 15-minute chart:
- Fast Period: 33
- Slow Period: 144
- MA Type: RMA (default)
- Auto HTS: Disabled (or enable to see HTS on ~4-minute perspective)
- 1H Filter: Enabled (for better trade filtering)
## Notes
- This strategy is designed for trend-following, not ranging markets
- RMA is slower than EMA, giving smoother signals but later entries
- 1H filter significantly reduces false signals for long trades
- Works best in volatile but trending assets like BTC

17
RUN_SERVER.bat Normal file

@ -0,0 +1,17 @@
@echo off
title BTC Dashboard Server
cd /d "%~dp0"
echo ===================================
echo Starting BTC Dashboard Server
echo ===================================
echo.
echo Dashboard: http://localhost:8000/dashboard
echo API Docs: http://localhost:8000/docs
echo.
echo Press Ctrl+C to stop
echo ===================================
echo.
call venv\Scripts\uvicorn src.api.server:app --host 0.0.0.0 --port 8000 --reload
pause

94
config/data_config.yaml Normal file

@ -0,0 +1,94 @@
# Data Collection Configuration
data_collection:
# Primary data source
primary_exchange: "hyperliquid"
# Assets to collect
assets:
cbBTC:
symbol: "cbBTC-PERP"
enabled: true
base_asset: "cbBTC"
quote_asset: "USD"
# Validation settings
validation:
enabled: true
tolerance_percent: 1.0 # 1% price divergence allowed
check_interval_minutes: 5
# Reference sources for cross-validation
references:
uniswap_v3:
enabled: true
chain: "base"
pool_address: "0x4f1480ba4F40f2A41a718c8699E64976b222b56d" # cbBTC/USDC
rpc_url: "https://base-mainnet.g.alchemy.com/v2/YOUR_API_KEY"
coinbase:
enabled: true
api_url: "https://api.exchange.coinbase.com"
# Intervals to collect (1m is base, others computed)
intervals:
- "1m" # Base collection
indicators:
ma44:
type: "sma"
period: 44
intervals: ["1d"]
ma125:
type: "sma"
period: 125
intervals: ["1d"]
# WebSocket settings
websocket:
url: "wss://api.hyperliquid.xyz/ws"
reconnect_attempts: 10
reconnect_delays: [1, 2, 5, 10, 30, 60, 120, 300, 600, 900] # seconds
ping_interval: 30
ping_timeout: 10
# Buffer settings
buffer:
max_size: 1000 # candles in memory
flush_interval_seconds: 30
batch_size: 100
# Database settings
database:
host: "${DB_HOST}"
port: ${DB_PORT}
name: "${DB_NAME}"
user: "${DB_USER}"
password: "${DB_PASSWORD}"
pool_size: 5
max_overflow: 10
# Backfill settings
backfill:
enabled: true
max_gap_minutes: 60
rest_api_url: "https://api.hyperliquid.xyz/info"
# Quality monitoring
quality_monitor:
enabled: true
check_interval_seconds: 300 # 5 minutes
anomaly_detection:
price_change_threshold: 0.10 # 10%
volume_spike_std: 5.0 # 5 sigma
# Logging
logging:
level: "INFO"
format: "%(asctime)s - %(name)s - %(levelname)s - %(message)s"
file: "/app/logs/collector.log"
max_size_mb: 100
backup_count: 10
# Performance
performance:
max_cpu_percent: 80
max_memory_mb: 256
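Note that the `${DB_HOST}`-style placeholders in the `database` block above are not expanded by YAML itself — the loader has to substitute environment variables after reading the file. A minimal sketch of that step using only the standard library (the surrounding YAML parsing is assumed to happen elsewhere, e.g. with PyYAML):

```python
import os

def expand_env(text: str) -> str:
    """Expand ${VAR} / $VAR references in raw config text before YAML parsing."""
    return os.path.expandvars(text)

# Usage sketch:
#   raw = open("config/data_config.yaml").read()
#   config = yaml.safe_load(expand_env(raw))
```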

23
docker/Dockerfile.api Normal file

@ -0,0 +1,23 @@
FROM python:3.11-slim
WORKDIR /app
# Copy requirements first (for better caching)
COPY requirements.txt .
# Install Python dependencies
RUN pip install --no-cache-dir -r requirements.txt
# Copy application code
COPY src/ ./src/
COPY config/ ./config/
COPY scripts/ ./scripts/
# Set Python path
ENV PYTHONPATH=/app
# Expose API port
EXPOSE 8000
# Run the API server
CMD ["uvicorn", "src.api.server:app", "--host", "0.0.0.0", "--port", "8000"]

View File

@ -3,10 +3,10 @@ FROM python:3.11-slim
WORKDIR /app
# Copy requirements first
COPY requirements.txt .
COPY requirements_bot.txt .
# Install dependencies
RUN pip install --no-cache-dir -r requirements.txt
RUN pip install --no-cache-dir -r requirements_bot.txt
# Copy application code
COPY src/ ./src/

View File

@ -0,0 +1,21 @@
FROM python:3.11-slim
WORKDIR /app
# Copy requirements first (for better caching)
COPY requirements.txt .
# Install Python dependencies
# --no-cache-dir reduces image size
RUN pip install --no-cache-dir -r requirements.txt
# Copy application code
COPY src/ ./src/
COPY config/ ./config/
COPY scripts/ ./scripts/
# Set Python path
ENV PYTHONPATH=/app
# Run the collector
CMD ["python", "-m", "src.data_collector.main"]

View File

@ -0,0 +1 @@
timescale/timescaledb:2.11.2-pg15

View File

@ -1,6 +1,85 @@
# Update docker-compose.yml to mount source code as volume
version: '3.8'
services:
timescaledb:
image: timescale/timescaledb:2.11.2-pg15
container_name: btc_timescale
environment:
POSTGRES_USER: btc_bot
POSTGRES_PASSWORD: ${DB_PASSWORD}
POSTGRES_DB: btc_data
TZ: Europe/Warsaw
volumes:
- /volume1/btc_bot/data:/var/lib/postgresql/data
- /volume1/btc_bot/backups:/backups
- ./timescaledb.conf:/etc/postgresql/postgresql.conf
- ./init-scripts:/docker-entrypoint-initdb.d
ports:
- "5433:5432"
command: postgres -c config_file=/etc/postgresql/postgresql.conf
restart: unless-stopped
deploy:
resources:
limits:
memory: 1.5G
reservations:
memory: 512M
healthcheck:
test: ["CMD-SHELL", "pg_isready -U btc_bot -d btc_data"]
interval: 10s
timeout: 5s
retries: 5
data_collector:
build:
context: ..
dockerfile: docker/Dockerfile.collector
image: btc_collector
container_name: btc_collector
network_mode: host
environment:
- DB_HOST=20.20.20.20
- DB_PORT=5433
- DB_NAME=btc_data
- DB_USER=btc_bot
- DB_PASSWORD=${DB_PASSWORD}
- LOG_LEVEL=INFO
volumes:
- ../src:/app/src
- /volume1/btc_bot/logs:/app/logs
- ../config:/app/config:ro
restart: unless-stopped
deploy:
resources:
limits:
memory: 256M
reservations:
memory: 128M
api_server:
build:
context: ..
dockerfile: docker/Dockerfile.api
image: btc_api
container_name: btc_api
network_mode: host
environment:
- DB_HOST=20.20.20.20
- DB_PORT=5433
- DB_NAME=btc_data
- DB_USER=btc_bot
- DB_PASSWORD=${DB_PASSWORD}
volumes:
- ../src:/app/src
- /volume1/btc_bot/exports:/app/exports
- ../config:/app/config:ro
restart: unless-stopped
deploy:
resources:
limits:
memory: 512M
ping_pong_bot:
build:
context: ..

View File

@ -0,0 +1,199 @@
-- 1. Enable TimescaleDB extension
CREATE EXTENSION IF NOT EXISTS timescaledb;
-- 2. Create candles table (main data storage)
CREATE TABLE IF NOT EXISTS candles (
time TIMESTAMPTZ NOT NULL,
symbol TEXT NOT NULL,
interval TEXT NOT NULL,
open DECIMAL(18,8) NOT NULL,
high DECIMAL(18,8) NOT NULL,
low DECIMAL(18,8) NOT NULL,
close DECIMAL(18,8) NOT NULL,
volume DECIMAL(18,8) NOT NULL,
validated BOOLEAN DEFAULT FALSE,
source TEXT DEFAULT 'hyperliquid',
created_at TIMESTAMPTZ DEFAULT NOW()
);
-- 3. Convert to hypertable (partitioned by time)
SELECT create_hypertable('candles', 'time',
chunk_time_interval => INTERVAL '7 days',
if_not_exists => TRUE
);
-- 4. Create unique constraint for upserts (required by ON CONFLICT)
ALTER TABLE candles
ADD CONSTRAINT candles_unique_candle
UNIQUE (time, symbol, interval);
-- Create indexes for efficient queries
CREATE INDEX IF NOT EXISTS idx_candles_symbol_time
ON candles (symbol, interval, time DESC);
CREATE INDEX IF NOT EXISTS idx_candles_validated
ON candles (validated) WHERE validated = FALSE;
-- 5. Create indicators table (computed values)
CREATE TABLE IF NOT EXISTS indicators (
time TIMESTAMPTZ NOT NULL,
symbol TEXT NOT NULL,
interval TEXT NOT NULL,
indicator_name TEXT NOT NULL,
value DECIMAL(18,8) NOT NULL,
parameters JSONB,
computed_at TIMESTAMPTZ DEFAULT NOW()
);
-- 6. Convert indicators to hypertable
SELECT create_hypertable('indicators', 'time',
chunk_time_interval => INTERVAL '7 days',
if_not_exists => TRUE
);
-- 7. Create unique constraint + index for indicators (required for upserts)
ALTER TABLE indicators
ADD CONSTRAINT indicators_unique
UNIQUE (time, symbol, interval, indicator_name);
CREATE INDEX IF NOT EXISTS idx_indicators_lookup
ON indicators (symbol, interval, indicator_name, time DESC);
-- 8. Create data quality log table
CREATE TABLE IF NOT EXISTS data_quality (
time TIMESTAMPTZ NOT NULL DEFAULT NOW(),
check_type TEXT NOT NULL,
severity TEXT NOT NULL,
symbol TEXT,
details JSONB,
resolved BOOLEAN DEFAULT FALSE
);
CREATE INDEX IF NOT EXISTS idx_quality_unresolved
ON data_quality (resolved) WHERE resolved = FALSE;
CREATE INDEX IF NOT EXISTS idx_quality_time
ON data_quality (time DESC);
-- 9. Create collector state tracking table
CREATE TABLE IF NOT EXISTS collector_state (
id SERIAL PRIMARY KEY,
symbol TEXT NOT NULL UNIQUE,
last_candle_time TIMESTAMPTZ,
last_validation_time TIMESTAMPTZ,
total_candles BIGINT DEFAULT 0,
updated_at TIMESTAMPTZ DEFAULT NOW()
);
-- 10. Insert initial state for cbBTC
INSERT INTO collector_state (symbol, last_candle_time)
VALUES ('cbBTC', NULL)
ON CONFLICT (symbol) DO NOTHING;
-- 11. Enable compression for old data (after 7 days)
ALTER TABLE candles SET (
timescaledb.compress,
timescaledb.compress_segmentby = 'symbol,interval'
);
ALTER TABLE indicators SET (
timescaledb.compress,
timescaledb.compress_segmentby = 'symbol,interval,indicator_name'
);
-- 12. Add compression policies
SELECT add_compression_policy('candles', INTERVAL '7 days', if_not_exists => TRUE);
SELECT add_compression_policy('indicators', INTERVAL '7 days', if_not_exists => TRUE);
-- 13. Create function to update collector state
CREATE OR REPLACE FUNCTION update_collector_state()
RETURNS TRIGGER AS $$
BEGIN
INSERT INTO collector_state (symbol, last_candle_time, total_candles)
VALUES (NEW.symbol, NEW.time, 1)
ON CONFLICT (symbol)
DO UPDATE SET
last_candle_time = NEW.time,
total_candles = collector_state.total_candles + 1,
updated_at = NOW();
RETURN NEW;
END;
$$ LANGUAGE plpgsql;
-- 14. Create trigger to auto-update state
DROP TRIGGER IF EXISTS trigger_update_state ON candles;
CREATE TRIGGER trigger_update_state
AFTER INSERT ON candles
FOR EACH ROW
EXECUTE FUNCTION update_collector_state();
-- 15. Create view for data health check
CREATE OR REPLACE VIEW data_health AS
SELECT
symbol,
COUNT(*) as total_candles,
COUNT(*) FILTER (WHERE validated) as validated_candles,
MAX(time) as latest_candle,
MIN(time) as earliest_candle,
NOW() - MAX(time) as time_since_last
FROM candles
GROUP BY symbol;
-- 16. Create decisions table (brain outputs - buy/sell/hold with full context)
CREATE TABLE IF NOT EXISTS decisions (
time TIMESTAMPTZ NOT NULL,
symbol TEXT NOT NULL,
interval TEXT NOT NULL,
decision_type TEXT NOT NULL,
strategy TEXT NOT NULL,
confidence DECIMAL(5,4),
price_at_decision DECIMAL(18,8),
indicator_snapshot JSONB NOT NULL,
candle_snapshot JSONB NOT NULL,
reasoning TEXT,
backtest_id TEXT,
executed BOOLEAN DEFAULT FALSE,
execution_price DECIMAL(18,8),
execution_time TIMESTAMPTZ,
created_at TIMESTAMPTZ DEFAULT NOW()
);
-- 17. Convert decisions to hypertable
SELECT create_hypertable('decisions', 'time',
chunk_time_interval => INTERVAL '7 days',
if_not_exists => TRUE
);
-- 18. Indexes for decisions - separate live from backtest queries
CREATE INDEX IF NOT EXISTS idx_decisions_live
ON decisions (symbol, interval, time DESC) WHERE backtest_id IS NULL;
CREATE INDEX IF NOT EXISTS idx_decisions_backtest
ON decisions (backtest_id, symbol, time DESC) WHERE backtest_id IS NOT NULL;
CREATE INDEX IF NOT EXISTS idx_decisions_type
ON decisions (symbol, decision_type, time DESC);
-- 19. Create backtest_runs metadata table
CREATE TABLE IF NOT EXISTS backtest_runs (
id TEXT PRIMARY KEY,
strategy TEXT NOT NULL,
symbol TEXT NOT NULL DEFAULT 'BTC',
start_time TIMESTAMPTZ NOT NULL,
end_time TIMESTAMPTZ NOT NULL,
intervals TEXT[] NOT NULL,
config JSONB,
results JSONB,
created_at TIMESTAMPTZ DEFAULT NOW()
);
-- 20. Compression for decisions
ALTER TABLE decisions SET (
timescaledb.compress,
timescaledb.compress_segmentby = 'symbol,interval,strategy'
);
SELECT add_compression_policy('decisions', INTERVAL '7 days', if_not_exists => TRUE);
-- Success message
SELECT 'Database schema initialized successfully' as status;
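The unique constraints defined above exist so that writers can upsert with `ON CONFLICT`. A hedged sketch of how the collector side might use the `candles_unique_candle` constraint via `asyncpg` (the function and connection handling are illustrative, not the project's actual code):

```python
# Upsert keyed on the constraint created in the init script; duplicate
# (time, symbol, interval) rows update in place instead of erroring.
UPSERT_CANDLE = """
    INSERT INTO candles (time, symbol, interval, open, high, low, close, volume)
    VALUES ($1, $2, $3, $4, $5, $6, $7, $8)
    ON CONFLICT ON CONSTRAINT candles_unique_candle
    DO UPDATE SET open = EXCLUDED.open, high = EXCLUDED.high,
                  low = EXCLUDED.low, close = EXCLUDED.close,
                  volume = EXCLUDED.volume
"""

async def upsert_candles(conn, rows):
    """Batch-upsert candle rows with asyncpg's executemany."""
    await conn.executemany(UPSERT_CANDLE, rows)
```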

View File

@ -0,0 +1,43 @@
-- Create a read-only user for API access (optional security)
DO $$
BEGIN
IF NOT EXISTS (SELECT FROM pg_roles WHERE rolname = 'btc_api') THEN
CREATE USER btc_api WITH PASSWORD 'api_password_change_me';
END IF;
END
$$;
-- Grant read-only permissions
GRANT CONNECT ON DATABASE btc_data TO btc_api;
GRANT USAGE ON SCHEMA public TO btc_api;
GRANT SELECT ON ALL TABLES IN SCHEMA public TO btc_api;
-- Grant sequence access for ID columns
GRANT USAGE ON ALL SEQUENCES IN SCHEMA public TO btc_api;
-- Apply to future tables
ALTER DEFAULT PRIVILEGES IN SCHEMA public GRANT SELECT ON TABLES TO btc_api;
-- Create continuous aggregate for hourly stats (optional optimization)
CREATE MATERIALIZED VIEW IF NOT EXISTS hourly_stats
WITH (timescaledb.continuous) AS
SELECT
time_bucket('1 hour', time) as bucket,
symbol,
interval,
FIRST(open, time) as first_open,
MAX(high) as max_high,
MIN(low) as min_low,
LAST(close, time) as last_close,
SUM(volume) as total_volume,
COUNT(*) as candle_count
FROM candles
GROUP BY bucket, symbol, interval;
-- Add refresh policy for continuous aggregate
SELECT add_continuous_aggregate_policy('hourly_stats',
start_offset => INTERVAL '1 month',
end_offset => INTERVAL '1 hour',
schedule_interval => INTERVAL '1 hour',
if_not_exists => TRUE
);

41
docker/timescaledb.conf Normal file
View File

@ -0,0 +1,41 @@
# Optimized for Synology DS218+ (2GB RAM, dual-core CPU)
# Required for TimescaleDB
shared_preload_libraries = 'timescaledb'
# Memory settings
shared_buffers = 256MB
effective_cache_size = 768MB
work_mem = 16MB
maintenance_work_mem = 128MB
# Connection settings
listen_addresses = '*'
max_connections = 50
max_locks_per_transaction = 256
max_worker_processes = 2
max_parallel_workers_per_gather = 1
max_parallel_workers = 2
max_parallel_maintenance_workers = 1
# Write performance
wal_buffers = 16MB
checkpoint_completion_target = 0.9
random_page_cost = 1.1
effective_io_concurrency = 200
# TimescaleDB settings
timescaledb.max_background_workers = 4
# Logging (use default pg_log directory inside PGDATA)
logging_collector = on
log_directory = 'log'
log_filename = 'postgresql-%Y-%m-%d_%H%M%S.log'
log_rotation_age = 1d
log_rotation_size = 100MB
log_min_messages = warning
log_min_error_statement = error
# Auto-vacuum for hypertables
autovacuum_max_workers = 2
autovacuum_naptime = 10s

34
kill_port_8000.bat Normal file

@ -0,0 +1,34 @@
@echo off
setlocal enabledelayedexpansion
echo ===================================
echo Kill Process on Port 8000
echo ===================================
echo.
REM Find PID using port 8000
set "PID="
for /f "tokens=5" %%a in ('netstat -ano ^| findstr ":8000" ^| findstr "LISTENING"') do (
set "PID=%%a"
)
if "%PID%"=="" (
echo No process found on port 8000
) else (
echo Found process PID: %PID% on port 8000
taskkill /F /PID %PID% 2>nul
REM Delayed expansion (!errorlevel!) is required here: %errorlevel% would be
REM expanded when the whole block is parsed, before taskkill has run.
if !errorlevel! equ 0 (
echo Process killed successfully
) else (
echo Failed to kill process
)
)
echo.
timeout /t 2 /nobreak >nul
netstat -ano | findstr ":8000" | findstr "LISTENING"
if %errorlevel% neq 0 (
echo Port 8000 is now free
) else (
echo Port 8000 still in use
)
pause

View File

@ -1,7 +1,10 @@
pybit
pandas
numpy
pyyaml
python-dotenv
rich
asyncpg
fastapi>=0.104.0
uvicorn[standard]>=0.24.0
asyncpg>=0.29.0
aiohttp>=3.9.0
websockets>=12.0
pydantic>=2.5.0
pydantic-settings>=2.1.0
pyyaml>=6.0
python-dotenv>=1.0.0
python-multipart>=0.0.6

6
requirements_bot.txt Normal file

@ -0,0 +1,6 @@
pybit
pandas
numpy
pyyaml
python-dotenv
rich

36
scripts/backfill.sh Normal file

@ -0,0 +1,36 @@
#!/bin/bash
# Backfill script for Hyperliquid historical data
# Usage: ./backfill.sh [coin] [days|max] [intervals...]
# Examples:
# ./backfill.sh BTC 7 "1m" # Last 7 days of 1m candles
# ./backfill.sh BTC max "1m 1h 1d" # Maximum available data for all intervals
set -e
COIN=${1:-BTC}
DAYS=${2:-7}
INTERVALS=${3:-"1m"}
echo "=== Hyperliquid Historical Data Backfill ==="
echo "Coin: $COIN"
if [ "$DAYS" == "max" ]; then
echo "Mode: MAXIMUM (up to 5000 candles per interval)"
else
echo "Days: $DAYS"
fi
echo "Intervals: $INTERVALS"
echo ""
# Change to project root
cd "$(dirname "$0")/.."
# Run backfill inside Docker container
docker exec btc_collector python -m src.data_collector.backfill \
--coin "$COIN" \
--days "$DAYS" \
--intervals $INTERVALS \
--db-host localhost \
--db-port 5433
echo ""
echo "=== Backfill Complete ==="

37
scripts/backup.sh Normal file

@ -0,0 +1,37 @@
#!/bin/bash
# Backup script for Synology DS218+
# Run via Task Scheduler every 6 hours
BACKUP_DIR="/volume1/btc_bot/backups"
DB_NAME="btc_data"
DB_USER="btc_bot"
RETENTION_DAYS=30
DATE=$(date +%Y%m%d_%H%M)
echo "Starting backup at $(date)"
# Create backup directory if not exists
mkdir -p $BACKUP_DIR
# Create backup
docker exec btc_timescale pg_dump -U $DB_USER -Fc $DB_NAME > $BACKUP_DIR/btc_data_$DATE.dump
# Compress
if [ -f "$BACKUP_DIR/btc_data_$DATE.dump" ]; then
gzip $BACKUP_DIR/btc_data_$DATE.dump
echo "Backup created: btc_data_$DATE.dump.gz"
# Calculate size
SIZE=$(du -h $BACKUP_DIR/btc_data_$DATE.dump.gz | cut -f1)
echo "Backup size: $SIZE"
else
echo "Error: Backup file not created"
exit 1
fi
# Delete old backups
DELETED=$(find $BACKUP_DIR -name "*.dump.gz" -mtime +$RETENTION_DAYS | wc -l)
find $BACKUP_DIR -name "*.dump.gz" -mtime +$RETENTION_DAYS -delete
echo "Deleted $DELETED old backup(s)"
echo "Backup completed at $(date)"

107
scripts/check_db_stats.py Normal file

@ -0,0 +1,107 @@
#!/usr/bin/env python3
"""
Quick database statistics checker
Shows oldest date, newest date, and count for each interval
"""
import asyncio
import asyncpg
import os
from datetime import datetime
async def check_database_stats():
# Database connection (uses same env vars as your app)
conn = await asyncpg.connect(
host=os.getenv('DB_HOST', 'localhost'),
port=int(os.getenv('DB_PORT', 5432)),
database=os.getenv('DB_NAME', 'btc_data'),
user=os.getenv('DB_USER', 'btc_bot'),
password=os.getenv('DB_PASSWORD', '')
)
try:
print("=" * 70)
print("DATABASE STATISTICS")
print("=" * 70)
print()
# Check for each interval
intervals = ['1m', '3m', '5m', '15m', '30m', '37m', '1h', '2h', '4h', '8h', '12h', '1d']
for interval in intervals:
stats = await conn.fetchrow("""
SELECT
COUNT(*) as count,
MIN(time) as oldest,
MAX(time) as newest
FROM candles
WHERE symbol = 'BTC' AND interval = $1
""", interval)
if stats['count'] > 0:
oldest = stats['oldest'].strftime('%Y-%m-%d %H:%M') if stats['oldest'] else 'N/A'
newest = stats['newest'].strftime('%Y-%m-%d %H:%M') if stats['newest'] else 'N/A'
count = stats['count']
# Calculate days of data
if stats['oldest'] and stats['newest']:
days = (stats['newest'] - stats['oldest']).days
print(f"{interval:6} | {count:>8,} candles | {days:>4} days | {oldest} to {newest}")
print()
print("=" * 70)
# Check indicators
print("\nINDICATORS AVAILABLE:")
indicators = await conn.fetch("""
SELECT DISTINCT indicator_name, interval, COUNT(*) as count
FROM indicators
WHERE symbol = 'BTC'
GROUP BY indicator_name, interval
ORDER BY interval, indicator_name
""")
if indicators:
for ind in indicators:
print(f" {ind['indicator_name']:10} on {ind['interval']:6} | {ind['count']:>8,} values")
else:
print(" No indicators found in database")
print()
print("=" * 70)
# Check 1m specifically with more detail
print("\n1-MINUTE DATA DETAIL:")
one_min_stats = await conn.fetchrow("""
SELECT
COUNT(*) as count,
MIN(time) as oldest,
MAX(time) as newest,
COUNT(*) FILTER (WHERE time > NOW() - INTERVAL '24 hours') as last_24h
FROM candles
WHERE symbol = 'BTC' AND interval = '1m'
""")
if one_min_stats['count'] > 0:
total_days = (one_min_stats['newest'] - one_min_stats['oldest']).days
expected_candles = total_days * 24 * 60 # 1 candle per minute
actual_candles = one_min_stats['count']
coverage = (actual_candles / expected_candles) * 100 if expected_candles > 0 else 0
print(f" Total candles: {actual_candles:,}")
print(f" Date range: {one_min_stats['oldest'].strftime('%Y-%m-%d')} to {one_min_stats['newest'].strftime('%Y-%m-%d')}")
print(f" Total days: {total_days}")
print(f" Expected candles: {expected_candles:,} (if complete)")
print(f" Coverage: {coverage:.1f}%")
print(f" Last 24 hours: {one_min_stats['last_24h']:,} candles")
else:
print(" No 1m data found")
print()
print("=" * 70)
finally:
await conn.close()
if __name__ == "__main__":
asyncio.run(check_database_stats())

18
scripts/check_status.sh Normal file

@ -0,0 +1,18 @@
#!/bin/bash
# Check the status of the indicators table (constraints and compression)
docker exec -i btc_timescale psql -U btc_bot -d btc_data <<EOF
\x
SELECT 'Checking constraints...' as step;
SELECT conname, pg_get_constraintdef(oid)
FROM pg_constraint
WHERE conrelid = 'indicators'::regclass;
SELECT 'Checking compression settings...' as step;
SELECT * FROM timescaledb_information.hypertables
WHERE hypertable_name = 'indicators';
SELECT 'Checking compression jobs...' as step;
SELECT * FROM timescaledb_information.jobs
WHERE hypertable_name = 'indicators';
EOF

59
scripts/deploy.sh Normal file

@ -0,0 +1,59 @@
#!/bin/bash
# Deployment script for Synology DS218+
set -e
echo "=== BTC Bot Data Collector Deployment ==="
echo ""
# Check if running on Synology
if [ ! -d "/volume1" ]; then
echo "Warning: This script is designed for Synology NAS"
echo "Continuing anyway..."
fi
# Create directories
echo "Creating directories..."
mkdir -p /volume1/btc_bot/{data,backups,logs,exports}
# Check if Docker is installed
if ! command -v docker &> /dev/null; then
echo "Error: Docker not found. Please install Docker package from Synology Package Center"
exit 1
fi
# Copy configuration
echo "Setting up configuration..."
if [ ! -f "/volume1/btc_bot/.env" ]; then
cp .env.example /volume1/btc_bot/.env
echo "Created .env file. Please edit /volume1/btc_bot/.env with your settings"
fi
# Build and start services
echo "Building and starting services..."
cd docker
docker-compose pull
docker-compose build --no-cache
docker-compose up -d
# Wait for database
echo "Waiting for database to be ready..."
sleep 10
# Check status
echo ""
echo "=== Status ==="
docker-compose ps
echo ""
echo "=== Logs (last 20 lines) ==="
docker-compose logs --tail=20
echo ""
echo "=== Deployment Complete ==="
echo "Database available at: localhost:5433"
echo "API available at: http://localhost:8000"
echo ""
echo "To view logs: docker-compose logs -f"
echo "To stop: docker-compose down"
echo "To backup: ./scripts/backup.sh"

View File

@ -0,0 +1,54 @@
#!/bin/bash
# Fix indicators table schema - Version 2 (Final)
# Handles TimescaleDB compression constraints properly
echo "Fixing indicators table schema (v2)..."
# 1. Decompress chunks individually (safest method)
# We fetch the list of compressed chunks and process them one by one
echo "Checking for compressed chunks..."
CHUNKS=$(docker exec -i btc_timescale psql -U btc_bot -d btc_data -t -c "SELECT chunk_schema || '.' || chunk_name FROM timescaledb_information.chunks WHERE hypertable_name = 'indicators' AND is_compressed = true;")
for chunk in $CHUNKS; do
# Trim whitespace
chunk=$(echo "$chunk" | xargs)
if [[ ! -z "$chunk" ]]; then
echo "Decompressing chunk: $chunk"
docker exec -i btc_timescale psql -U btc_bot -d btc_data -c "SELECT decompress_chunk('$chunk');"
fi
done
# 2. Execute the schema changes
docker exec -i btc_timescale psql -U btc_bot -d btc_data <<EOF
BEGIN;
-- Remove policy first
SELECT remove_compression_policy('indicators', if_exists => true);
-- Disable compression setting (REQUIRED to add unique constraint)
ALTER TABLE indicators SET (timescaledb.compress = false);
-- Deduplicate data (just in case duplicates exist)
DELETE FROM indicators a USING indicators b
WHERE a.ctid < b.ctid
AND a.time = b.time
AND a.symbol = b.symbol
AND a.interval = b.interval
AND a.indicator_name = b.indicator_name;
-- Add the unique constraint
ALTER TABLE indicators ADD CONSTRAINT indicators_unique UNIQUE (time, symbol, interval, indicator_name);
-- Re-enable compression configuration
ALTER TABLE indicators SET (
timescaledb.compress,
timescaledb.compress_segmentby = 'symbol,interval,indicator_name'
);
-- Re-add compression policy (7 days)
SELECT add_compression_policy('indicators', INTERVAL '7 days', if_not_exists => true);
COMMIT;
SELECT 'Indicators schema fix v2 completed successfully' as status;
EOF
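The ctid-based DELETE above keeps exactly one row per (time, symbol, interval, indicator_name) key. A Python sketch of the same deduplication rule (note a difference: the SQL keeps the physically later copy, while this sketch keeps the first one seen; `deduplicate` is a hypothetical helper, not project code):

```python
def deduplicate(rows):
    """Keep one row per (time, symbol, interval, indicator_name) key,
    mirroring the ctid-based DELETE in the schema-fix script."""
    seen = set()
    result = []
    for row in rows:
        key = (row["time"], row["symbol"], row["interval"], row["indicator_name"])
        if key not in seen:  # first occurrence wins in this sketch
            seen.add(key)
            result.append(row)
    return result
```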


@ -0,0 +1,65 @@
import asyncio
import logging
import os
import sys
# Add src to path
sys.path.append(os.path.join(os.path.dirname(__file__), '..'))
from src.data_collector.database import DatabaseManager
from src.data_collector.custom_timeframe_generator import CustomTimeframeGenerator
# Configure logging
logging.basicConfig(
level=logging.INFO,
format='%(asctime)s - %(name)s - %(levelname)s - %(message)s'
)
logger = logging.getLogger(__name__)
async def main():
    logger.info("Starting custom timeframe generation...")
    # DB connection settings from env or defaults
    db_host = os.getenv('DB_HOST', 'localhost')
    db_port = int(os.getenv('DB_PORT', 5432))
    db_name = os.getenv('DB_NAME', 'btc_data')
    db_user = os.getenv('DB_USER', 'btc_bot')
    db_password = os.getenv('DB_PASSWORD', '')
    db = DatabaseManager(
        host=db_host,
        port=db_port,
        database=db_name,
        user=db_user,
        password=db_password
    )
    await db.connect()
    try:
        generator = CustomTimeframeGenerator(db)
        await generator.initialize()
        # Generate 37m from 1m
        logger.info("Generating 37m candles from 1m data...")
        count_37m = await generator.generate_historical('37m')
        logger.info(f"Generated {count_37m} candles for 37m")
        # Generate 148m from 37m
        # Note: 148m generation relies on 37m data existing
        logger.info("Generating 148m candles from 37m data...")
        count_148m = await generator.generate_historical('148m')
        logger.info(f"Generated {count_148m} candles for 148m")
        logger.info("Done!")
    except Exception as e:
        logger.error(f"Error generating custom timeframes: {e}")
        import traceback
        traceback.print_exc()
    finally:
        await db.disconnect()
if __name__ == "__main__":
    asyncio.run(main())


@ -0,0 +1,87 @@
#!/usr/bin/env python3
"""
Generate custom timeframes (37m, 148m) from historical 1m data
Run once to backfill all historical data
"""
import asyncio
import argparse
import logging
import sys
from pathlib import Path
# Add parent to path
sys.path.insert(0, str(Path(__file__).parent.parent / 'src'))
from data_collector.database import DatabaseManager
from data_collector.custom_timeframe_generator import CustomTimeframeGenerator
logging.basicConfig(
level=logging.INFO,
format='%(asctime)s - %(name)s - %(levelname)s - %(message)s'
)
logger = logging.getLogger(__name__)
async def main():
    parser = argparse.ArgumentParser(description='Generate custom timeframe candles')
    parser.add_argument('--interval',
                        default='all',
                        help='Which interval to generate (default: all, choices: 3m, 5m, 1h, 37m, etc.)')
    parser.add_argument('--batch-size', type=int, default=5000,
                        help='Number of source candles per batch')
    parser.add_argument('--verify', action='store_true',
                        help='Verify integrity after generation')
    args = parser.parse_args()
    # Initialize database
    db = DatabaseManager()
    await db.connect()
    try:
        generator = CustomTimeframeGenerator(db)
        await generator.initialize()
        if not generator.first_1m_time:
            logger.error("No 1m data found in database. Cannot generate custom timeframes.")
            return 1
        if args.interval == 'all':
            intervals = list(generator.STANDARD_INTERVALS.keys()) + list(generator.CUSTOM_INTERVALS.keys())
        else:
            intervals = [args.interval]
        for interval in intervals:
            logger.info("=" * 60)
            logger.info(f"Generating {interval} candles")
            logger.info("=" * 60)
            # Generate historical data
            count = await generator.generate_historical(
                interval=interval,
                batch_size=args.batch_size
            )
            logger.info(f"Generated {count} {interval} candles")
            # Verify if requested
            if args.verify:
                logger.info(f"Verifying {interval} integrity...")
                stats = await generator.verify_integrity(interval)
                logger.info(f"Stats: {stats}")
    except Exception as e:
        logger.error(f"Error: {e}", exc_info=True)
        return 1
    finally:
        await db.disconnect()
    logger.info("Custom timeframe generation complete!")
    return 0
if __name__ == '__main__':
    exit_code = asyncio.run(main())
    sys.exit(exit_code)
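Conceptually, generating a custom timeframe such as 37m means flooring each 1m candle's timestamp to a 37-minute bucket and merging OHLCV within each bucket. A hedged sketch of that aggregation (the real `CustomTimeframeGenerator` may differ in details; `aggregate` is illustrative only):

```python
def aggregate(candles, minutes):
    """Group 1m OHLCV candles (sorted by 'time', epoch seconds) into
    `minutes`-wide buckets: first open, max high, min low, last close, summed volume."""
    buckets = {}
    for c in candles:
        bucket = c["time"] - c["time"] % (minutes * 60)  # floor to bucket start
        agg = buckets.get(bucket)
        if agg is None:
            buckets[bucket] = {"time": bucket, "open": c["open"], "high": c["high"],
                               "low": c["low"], "close": c["close"], "volume": c["volume"]}
        else:
            agg["high"] = max(agg["high"], c["high"])
            agg["low"] = min(agg["low"], c["low"])
            agg["close"] = c["close"]  # last close wins
            agg["volume"] += c["volume"]
    return [buckets[k] for k in sorted(buckets)]
```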

scripts/health_check.sh Normal file

@ -0,0 +1,31 @@
#!/bin/bash
# Health check script for cron/scheduler
# Check if containers are running
if ! docker ps | grep -q "btc_timescale"; then
echo "ERROR: TimescaleDB container not running"
# Send notification (if configured)
exit 1
fi
if ! docker ps | grep -q "btc_collector"; then
echo "ERROR: Data collector container not running"
exit 1
fi
# Check database connectivity
docker exec btc_timescale pg_isready -U btc_bot -d btc_data > /dev/null 2>&1
if [ $? -ne 0 ]; then
echo "ERROR: Cannot connect to database"
exit 1
fi
# Check if recent data exists
# Trim whitespace so a NULL result (blank line from psql -t) is detected as empty
LATEST=$(docker exec btc_timescale psql -U btc_bot -d btc_data -t -c "SELECT MAX(time) FROM candles WHERE time > NOW() - INTERVAL '5 minutes';" 2>/dev/null | xargs)
if [ -z "$LATEST" ]; then
echo "WARNING: No recent data in database"
exit 1
fi
echo "OK: All systems operational"
exit 0
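The freshness rule in the SQL above — at least one candle newer than 5 minutes — can be stated as a small predicate. A sketch (`is_stale` is a hypothetical helper, not part of the repo):

```python
from datetime import datetime, timedelta, timezone

def is_stale(latest, max_age_minutes=5, now=None):
    """True when the newest candle is missing or older than max_age_minutes,
    the same freshness rule health_check.sh expresses in SQL."""
    now = now or datetime.now(timezone.utc)
    return latest is None or now - latest > timedelta(minutes=max_age_minutes)
```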

scripts/run_test.sh Normal file

@ -0,0 +1,11 @@
#!/bin/bash
# Run performance test inside Docker container
# Usage: ./run_test.sh [days] [interval]
DAYS=${1:-7}
INTERVAL=${2:-1m}
echo "Running MA44 performance test: ${DAYS} days of ${INTERVAL} data"
echo "=================================================="
docker exec btc_collector python scripts/test_ma44_performance.py --days "$DAYS" --interval "$INTERVAL"


@ -0,0 +1,187 @@
#!/usr/bin/env python3
"""
Performance Test Script for MA44 Strategy
Tests backtesting performance on Synology DS218+ with 6GB RAM
Usage:
python test_ma44_performance.py [--days DAYS] [--interval INTERVAL]
Example:
python test_ma44_performance.py --days 7 --interval 1m
"""
import asyncio
import argparse
import time
import sys
import os
from datetime import datetime, timedelta, timezone
# Add src to path
sys.path.insert(0, os.path.join(os.path.dirname(__file__), '..', 'src'))
from data_collector.database import DatabaseManager
from data_collector.indicator_engine import IndicatorEngine, IndicatorConfig
from data_collector.brain import Brain
from data_collector.backtester import Backtester
async def run_performance_test(days: int = 7, interval: str = "1m"):
    """Run MA44 backtest and measure performance"""
    print("=" * 70)
    print(f"PERFORMANCE TEST: MA44 Strategy")
    print(f"Timeframe: {interval}")
    print(f"Period: Last {days} days")
    print(f"Hardware: Synology DS218+ (6GB RAM)")
    print("=" * 70)
    print()
    # Database connection (adjust these if needed)
    db = DatabaseManager(
        host=os.getenv('DB_HOST', 'localhost'),
        port=int(os.getenv('DB_PORT', 5432)),
        database=os.getenv('DB_NAME', 'btc_data'),
        user=os.getenv('DB_USER', 'btc_bot'),
        password=os.getenv('DB_PASSWORD', '')
    )
    try:
        await db.connect()
        print("✓ Database connected")
        # Calculate date range
        end_date = datetime.now(timezone.utc)
        start_date = end_date - timedelta(days=days)
        print(f"✓ Date range: {start_date.date()} to {end_date.date()}")
        print(f"✓ Symbol: BTC")
        print(f"✓ Strategy: MA44 (44-period SMA)")
        print()
        # Check data availability
        async with db.acquire() as conn:
            count = await conn.fetchval("""
                SELECT COUNT(*) FROM candles
                WHERE symbol = 'BTC'
                AND interval = $1
                AND time >= $2
                AND time <= $3
            """, interval, start_date, end_date)
        print(f"📊 Data points: {count:,} {interval} candles")
        if count == 0:
            print("❌ ERROR: No data found for this period!")
            print(f" Run: python -m data_collector.backfill --days {days} --intervals {interval}")
            return
        print(f" (Expected: ~{count * int(interval.replace('m','').replace('h','').replace('d',''))} minutes of data)")
        print()
        # Setup indicator configuration
        indicator_configs = [
            IndicatorConfig("ma44", "sma", 44, [interval])
        ]
        engine = IndicatorEngine(db, indicator_configs)
        brain = Brain(db, engine)
        backtester = Backtester(db, engine, brain)
        print("⚙️ Running backtest...")
        print("-" * 70)
        # Measure execution time
        start_time = time.time()
        await backtester.run("BTC", [interval], start_date, end_date)
        end_time = time.time()
        execution_time = end_time - start_time
        print("-" * 70)
        print()
        # Fetch results from database
        async with db.acquire() as conn:
            latest_backtest = await conn.fetchrow("""
                SELECT id, strategy, start_time, end_time, intervals, results, created_at
                FROM backtest_runs
                WHERE strategy LIKE '%ma44%'
                ORDER BY created_at DESC
                LIMIT 1
            """)
        if latest_backtest and latest_backtest['results']:
            import json
            results = json.loads(latest_backtest['results'])
            print("📈 RESULTS:")
            print("=" * 70)
            print(f" Total Trades: {results.get('total_trades', 'N/A')}")
            print(f" Win Rate: {results.get('win_rate', 0):.1f}%")
            print(f" Win Count: {results.get('win_count', 0)}")
            print(f" Loss Count: {results.get('loss_count', 0)}")
            print(f" Total P&L: ${results.get('total_pnl', 0):.2f}")
            print(f" P&L Percent: {results.get('total_pnl_pct', 0):.2f}%")
            print(f" Initial Balance: ${results.get('initial_balance', 1000):.2f}")
            print(f" Final Balance: ${results.get('final_balance', 1000):.2f}")
            print(f" Max Drawdown: {results.get('max_drawdown', 0):.2f}%")
            print()
            print("⏱️ PERFORMANCE:")
            print(f" Execution Time: {execution_time:.2f} seconds")
            print(f" Candles/Second: {count / execution_time:.0f}")
            print(f" Backtest ID: {latest_backtest['id']}")
            print()
            # Performance assessment
            if execution_time < 30:
                print("✅ PERFORMANCE: Excellent (< 30s)")
            elif execution_time < 60:
                print("✅ PERFORMANCE: Good (< 60s)")
            elif execution_time < 300:
                print("⚠️ PERFORMANCE: Acceptable (1-5 min)")
            else:
                print("❌ PERFORMANCE: Slow (> 5 min) - Consider shorter periods or higher TFs")
            print()
            print("💡 RECOMMENDATIONS:")
            if execution_time > 60:
                print(" • For faster results, use higher timeframes (15m, 1h, 4h)")
                print(" • Or reduce date range (< 7 days)")
            else:
                print(" • Hardware is sufficient for this workload")
                print(" • Can handle larger date ranges or multiple timeframes")
        else:
            print("❌ ERROR: No results found in database!")
            print(" The backtest may have failed. Check server logs.")
    except Exception as e:
        print(f"\n❌ ERROR: {e}")
        import traceback
        traceback.print_exc()
    finally:
        await db.disconnect()
        print()
        print("=" * 70)
        print("Test completed")
        print("=" * 70)
def main():
    parser = argparse.ArgumentParser(description='Test MA44 backtest performance')
    parser.add_argument('--days', type=int, default=7,
                        help='Number of days to backtest (default: 7)')
    parser.add_argument('--interval', type=str, default='1m',
                        help='Candle interval (default: 1m)')
    args = parser.parse_args()
    # Run the async test
    asyncio.run(run_performance_test(args.days, args.interval))
if __name__ == "__main__":
    main()
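The assessment thresholds printed by the test script can be captured as pure functions, which makes them reusable outside the script. A sketch (function names are illustrative, not project code):

```python
def performance_tier(execution_seconds: float) -> str:
    """Classify a backtest run using the same thresholds the script prints."""
    if execution_seconds < 30:
        return "Excellent"
    if execution_seconds < 60:
        return "Good"
    if execution_seconds < 300:
        return "Acceptable"
    return "Slow"

def throughput(candle_count: int, execution_seconds: float) -> float:
    """Candles processed per second, as reported in the PERFORMANCE section."""
    return candle_count / execution_seconds
```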

scripts/update_schema.sh Normal file

@ -0,0 +1,87 @@
#!/bin/bash
# Apply schema updates to a running TimescaleDB container without wiping data
echo "Applying schema updates to btc_timescale container..."
# Execute the schema SQL inside the container
# We use psql with the environment variables set in docker-compose
docker exec -i btc_timescale psql -U btc_bot -d btc_data <<EOF
-- 1. Unique constraint for indicators (if not exists)
DO \$\$
BEGIN
IF NOT EXISTS (SELECT 1 FROM pg_constraint WHERE conname = 'indicators_unique') THEN
ALTER TABLE indicators ADD CONSTRAINT indicators_unique UNIQUE (time, symbol, interval, indicator_name);
END IF;
END \$\$;
-- 2. Index for indicators
CREATE INDEX IF NOT EXISTS idx_indicators_lookup ON indicators (symbol, interval, indicator_name, time DESC);
-- 3. Data health view update
CREATE OR REPLACE VIEW data_health AS
SELECT
symbol,
COUNT(*) as total_candles,
COUNT(*) FILTER (WHERE validated) as validated_candles,
MAX(time) as latest_candle,
MIN(time) as earliest_candle,
NOW() - MAX(time) as time_since_last
FROM candles
GROUP BY symbol;
-- 4. Decisions table
CREATE TABLE IF NOT EXISTS decisions (
time TIMESTAMPTZ NOT NULL,
symbol TEXT NOT NULL,
interval TEXT NOT NULL,
decision_type TEXT NOT NULL,
strategy TEXT NOT NULL,
confidence DECIMAL(5,4),
price_at_decision DECIMAL(18,8),
indicator_snapshot JSONB NOT NULL,
candle_snapshot JSONB NOT NULL,
reasoning TEXT,
backtest_id TEXT,
executed BOOLEAN DEFAULT FALSE,
execution_price DECIMAL(18,8),
execution_time TIMESTAMPTZ,
created_at TIMESTAMPTZ DEFAULT NOW()
);
-- 5. Decisions hypertable (ignore error if already exists)
DO \$\$
BEGIN
PERFORM create_hypertable('decisions', 'time', chunk_time_interval => INTERVAL '7 days', if_not_exists => TRUE);
EXCEPTION WHEN OTHERS THEN
NULL; -- Ignore if already hypertable
END \$\$;
-- 6. Decisions indexes
CREATE INDEX IF NOT EXISTS idx_decisions_live ON decisions (symbol, interval, time DESC) WHERE backtest_id IS NULL;
CREATE INDEX IF NOT EXISTS idx_decisions_backtest ON decisions (backtest_id, symbol, time DESC) WHERE backtest_id IS NOT NULL;
CREATE INDEX IF NOT EXISTS idx_decisions_type ON decisions (symbol, decision_type, time DESC);
-- 7. Backtest runs table
CREATE TABLE IF NOT EXISTS backtest_runs (
id TEXT PRIMARY KEY,
strategy TEXT NOT NULL,
symbol TEXT NOT NULL DEFAULT 'BTC',
start_time TIMESTAMPTZ NOT NULL,
end_time TIMESTAMPTZ NOT NULL,
intervals TEXT[] NOT NULL,
config JSONB,
results JSONB,
created_at TIMESTAMPTZ DEFAULT NOW()
);
-- 8. Compression policies
DO \$\$
BEGIN
ALTER TABLE decisions SET (timescaledb.compress, timescaledb.compress_segmentby = 'symbol,interval,strategy');
PERFORM add_compression_policy('decisions', INTERVAL '7 days', if_not_exists => TRUE);
EXCEPTION WHEN OTHERS THEN
NULL; -- Ignore compression errors if already set
END \$\$;
SELECT 'Schema update completed successfully' as status;
EOF
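`compress_segmentby` tells TimescaleDB to store one compressed segment per distinct combination of the listed columns. A rough Python illustration of that grouping (purely conceptual — this is not how compression is implemented internally):

```python
from collections import defaultdict

def segment_rows(rows, segmentby=("symbol", "interval", "strategy")):
    """Group rows the way compress_segmentby clusters them: one segment
    per distinct combination of the segmentby columns."""
    segments = defaultdict(list)
    for row in rows:
        segments[tuple(row[c] for c in segmentby)].append(row)
    return dict(segments)
```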

scripts/verify_files.sh Normal file

@ -0,0 +1,33 @@
#!/bin/bash
# BTC Bot Dashboard Setup Script
# Run this from ~/btc_bot to verify all files exist
echo "=== BTC Bot File Verification ==="
echo ""
FILES=(
"src/api/server.py"
"src/api/websocket_manager.py"
"src/api/dashboard/static/index.html"
"docker/Dockerfile.api"
"docker/Dockerfile.collector"
)
for file in "${FILES[@]}"; do
if [ -f "$file" ]; then
size=$(stat -f%z "$file" 2>/dev/null || stat -c%s "$file" 2>/dev/null || echo "unknown")
echo "$file (${size} bytes)"
else
echo "$file (MISSING)"
fi
done
echo ""
echo "=== Next Steps ==="
echo "1. If all files exist, rebuild:"
echo " cd ~/btc_bot"
echo " docker build --network host --no-cache -f docker/Dockerfile.api -t btc_api ."
echo " cd docker && docker-compose up -d"
echo ""
echo "2. Check logs:"
echo " docker logs btc_api --tail 20"
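The same existence-and-size check can be written portably in Python (the shell script needs the `stat -f`/`stat -c` fallback because BSD and GNU `stat` differ). A sketch (`verify_files` is a hypothetical helper):

```python
from pathlib import Path

def verify_files(paths):
    """Return (path, size_in_bytes_or_None) per expected file, like verify_files.sh."""
    report = []
    for p in paths:
        path = Path(p)
        report.append((p, path.stat().st_size if path.is_file() else None))
    return report
```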

src/TV/HTS.pine Normal file

@ -0,0 +1,86 @@
//@version=5
indicator(title='HTS p1otek (Fixed)', overlay=true )
// Helper function to return the correct timeframe string for request.security
// Note: We let Pine Script infer the return type to avoid syntax errors
getAutoTFString(chartTFInMinutes) =>
    float autoTFMinutes = chartTFInMinutes / 4.0
    // Use an existing time resolution string if possible (D, W, M)
    if timeframe.isdaily
        // 'D' timeframe is 1440 minutes. 1440 / 4 = 360 minutes (6 hours)
        // We return "360" which Pine Script accepts as a resolution
        str.tostring(math.round(autoTFMinutes))
    else if timeframe.isweekly or timeframe.ismonthly
        // Cannot divide W or M timeframes reliably, return current timeframe string
        timeframe.period
    else
        // For standard minute timeframes, use the calculated minutes
        str.tostring(math.round(autoTFMinutes))
// Inputs
// FIXED: Changed input.integer to input.int
short = input.int(33, "fast")
long = input.int(144, "slow")
auto = input.bool(false, title = "auto HTS (timeframe/4)")
draw_1h = input.bool(false, title = "draw 1h slow HTS")
metoda = input.string(title = "type average", defval = "RMA", options=["RMA", "EMA", "SMA", "WMA", "VWMA"])
// Calculate chart TF in minutes
float chartTFInMinutes = timeframe.in_seconds() / 60
// Get the auto-calculated timeframe string
string autoTFString = getAutoTFString(chartTFInMinutes)
srednia(src, length, type) =>
    switch type
        "RMA" => ta.rma(src, length)
        "EMA" => ta.ema(src, length)
        "SMA" => ta.sma(src, length)
        "WMA" => ta.wma(src, length)
        "VWMA" => ta.vwma(src, length)
// === Non-Auto (Current Timeframe) Calculations ===
string currentTFString = timeframe.period
shortl = request.security(syminfo.tickerid, currentTFString, srednia(low, short, metoda))
shorth = request.security(syminfo.tickerid, currentTFString, srednia(high, short, metoda))
longl = request.security(syminfo.tickerid, currentTFString, srednia(low, long, metoda))
longh = request.security(syminfo.tickerid, currentTFString, srednia(high, long, metoda))
// === Auto Timeframe Calculations ===
shortl_auto = request.security(syminfo.tickerid, autoTFString, srednia(low, short, metoda))
shorth_auto = request.security(syminfo.tickerid, autoTFString, srednia(high, short, metoda))
longl_auto = request.security(syminfo.tickerid, autoTFString, srednia(low, long, metoda))
longh_auto = request.security(syminfo.tickerid, autoTFString, srednia(high, long, metoda))
// === 1H Timeframe Calculations ===
// Use a fixed '60' for 1 hour
longl_1h = request.security(syminfo.tickerid, "60", srednia(low, long, metoda))
longh_1h = request.security(syminfo.tickerid, "60", srednia(high, long, metoda))
// === Plotting ===
// Auto HTS
plot(auto ? shortl_auto: na, color=color.new(color.aqua, 0), linewidth=1, title="fast low auto")
plot(auto ? shorth_auto: na, color=color.new(color.aqua, 0), linewidth=1, title="fast high auto")
plot(auto ? longl_auto: na, color=color.new(color.red, 0), linewidth=1, title="slow low auto")
plot(auto ? longh_auto: na, color=color.new(color.red, 0), linewidth=1, title="slow high auto")
// Current TF (only when Auto is enabled, for reference)
ll = plot( auto ? longl: na, color=color.new(color.red, 80), linewidth=1, title="current slow low")
oo = plot( auto ? longh: na, color=color.new(color.red, 80), linewidth=1, title="current slow high")
fill(ll,oo, color=color.new(color.red, 90))
// 1H Zone
zone_1hl = plot( draw_1h ? longl_1h: na, color=color.new(color.red, 80), linewidth=1, title="1h slow low")
zone_1hh = plot( draw_1h ? longh_1h: na, color=color.new(color.red, 80), linewidth=1, title="1h slow high")
fill(zone_1hl,zone_1hh, color=color.new(color.red, 90))
// Non-Auto HTS
plot(not auto ? shortl: na, color=color.new(color.aqua, 0), linewidth=1, title="fast low")
plot(not auto ? shorth: na, color=color.new(color.aqua, 0), linewidth=1, title="fast high")
plot(not auto ? longl: na, color=color.new(color.red, 0), linewidth=1, title="slow low")
plot(not auto ? longh: na, color=color.new(color.red, 0), linewidth=1, title="slow high")
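The auto-timeframe rule (chart timeframe divided by four, with a fallback for weekly/monthly charts) is easy to sanity-check outside Pine. A Python sketch of `getAutoTFString`'s logic (simplified, since the daily and minute branches return the same string; the helper name is illustrative):

```python
def auto_tf_string(chart_tf_minutes: float, weekly_or_monthly: bool = False, period: str = "W") -> str:
    """Divide the chart timeframe by four and return a Pine-style resolution
    string; W/M charts fall back to the current period, as in getAutoTFString."""
    if weekly_or_monthly:
        return period  # cannot divide W/M reliably
    return str(round(chart_tf_minutes / 4.0))
```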


@ -0,0 +1,726 @@
/* ============================================================================
NEW INDICATOR PANEL STYLES - Single Panel, TradingView-inspired
============================================================================ */
.indicator-panel {
display: flex;
flex-direction: column;
height: 100%;
overflow-y: auto;
overflow-x: hidden;
}
.indicator-panel::-webkit-scrollbar {
width: 6px;
}
.indicator-panel::-webkit-scrollbar-thumb {
background: #363a44;
border-radius: 3px;
}
.indicator-panel::-webkit-scrollbar-track {
background: transparent;
}
/* Search Bar */
.indicator-search {
display: flex;
align-items: center;
background: var(--tv-bg);
border: 1px solid var(--tv-border);
border-radius: 6px;
padding: 8px 12px;
margin: 8px 12px;
gap: 8px;
transition: border-color 0.2s;
}
.indicator-search:focus-within {
border-color: var(--tv-blue);
}
.search-icon {
color: var(--tv-text-secondary);
font-size: 14px;
}
.indicator-search input {
flex: 1;
background: transparent;
border: none;
color: var(--tv-text);
font-size: 13px;
outline: none;
}
.indicator-search input::placeholder {
color: var(--tv-text-secondary);
}
.search-clear {
background: transparent;
border: none;
color: var(--tv-text-secondary);
cursor: pointer;
padding: 2px 6px;
font-size: 16px;
line-height: 1;
}
.search-clear:hover {
color: var(--tv-text);
}
/* Category Tabs */
.category-tabs {
display: flex;
gap: 4px;
padding: 4px 12px;
overflow-x: auto;
scrollbar-width: none;
}
.category-tabs::-webkit-scrollbar {
display: none;
}
.category-tab {
background: transparent;
border: none;
color: var(--tv-text-secondary);
font-size: 11px;
padding: 6px 10px;
border-radius: 4px;
cursor: pointer;
white-space: nowrap;
transition: all 0.2s;
}
.category-tab:hover {
background: var(--tv-hover);
color: var(--tv-text);
}
.category-tab.active {
background: rgba(41, 98, 255, 0.1);
color: var(--tv-blue);
font-weight: 600;
}
/* Indicator Sections */
.indicator-section {
margin: 8px 12px 12px;
}
.indicator-section.favorites {
background: rgba(41, 98, 255, 0.05);
border-radius: 6px;
padding: 8px;
margin-top: 4px;
}
.section-title {
font-size: 10px;
color: var(--tv-text-secondary);
text-transform: uppercase;
letter-spacing: 0.5px;
padding: 8px 0;
display: flex;
align-items: center;
gap: 5px;
}
.section-title button.clear-all,
.section-title button.visibility-toggle {
display: none;
}
.section-title:hover button.clear-all,
.section-title:hover button.visibility-toggle {
display: inline-block;
}
.visibility-toggle,
.clear-all {
background: var(--tv-red);
border: none;
color: white;
font-size: 9px;
padding: 2px 8px;
border-radius: 3px;
cursor: pointer;
}
.visibility-toggle {
background: var(--tv-blue);
}
.visibility-toggle:hover,
.clear-all:hover {
opacity: 0.9;
}
/* Indicator Items */
.indicator-item {
background: var(--tv-panel-bg);
border: 1px solid var(--tv-border);
border-radius: 6px;
margin-bottom: 2px;
transition: all 0.2s;
overflow: hidden;
}
.indicator-item:hover {
border-color: var(--tv-blue);
}
.indicator-item.favorite {
border-color: rgba(41, 98, 255, 0.3);
}
.indicator-item-main {
display: flex;
align-items: center;
gap: 6px;
padding: 8px 10px;
cursor: pointer;
}
.indicator-name {
flex: 1;
font-size: 12px;
color: var(--tv-text);
font-weight: 500;
white-space: nowrap;
overflow: hidden;
text-overflow: ellipsis;
}
.indicator-desc {
font-size: 11px;
color: var(--tv-text-secondary);
margin-left: 8px;
}
.indicator-actions {
display: flex;
gap: 4px;
margin-left: auto;
}
.indicator-btn {
background: transparent;
border: 1px solid transparent;
color: var(--tv-text-secondary);
cursor: pointer;
width: 24px;
height: 24px;
border-radius: 4px;
font-size: 13px;
display: flex;
align-items: center;
justify-content: center;
transition: all 0.15s;
flex-shrink: 0;
}
.indicator-btn:hover {
background: var(--tv-hover);
color: var(--tv-text);
border-color: var(--tv-hover);
}
.indicator-btn.add:hover {
background: var(--tv-blue);
color: white;
border-color: var(--tv-blue);
}
.indicator-presets {
display: none;
}
@media (min-width: 768px) {
.indicator-presets {
display: block;
}
.indicator-desc {
display: inline;
font-size: 11px;
color: var(--tv-text-secondary);
white-space: nowrap;
overflow: hidden;
text-overflow: ellipsis;
max-width: 120px;
}
}
/* Active Indicator Item */
.indicator-item.active {
border-color: var(--tv-blue);
}
.indicator-item.active .indicator-name {
color: var(--tv-blue);
font-weight: 600;
}
.indicator-item.active.expanded {
border-color: var(--tv-blue);
background: rgba(41, 98, 255, 0.05);
}
.drag-handle {
cursor: grab;
color: var(--tv-text-secondary);
font-size: 12px;
user-select: none;
padding: 0 2px;
}
.drag-handle:hover {
color: var(--tv-text);
}
.indicator-btn.visible,
.indicator-btn.expand,
.indicator-btn.favorite {
width: 20px;
height: 20px;
font-size: 11px;
}
.indicator-btn.expand.rotated {
transform: rotate(180deg);
}
/* Indicator Config (Expanded) */
.indicator-config {
border-top: 1px solid var(--tv-border);
background: rgba(0, 0, 0, 0.2);
animation: slideDown 0.2s ease;
}
@keyframes slideDown {
from {
opacity: 0;
max-height: 0;
}
to {
opacity: 1;
max-height: 1000px;
}
}
.config-sections {
padding: 12px;
}
.config-section {
margin-bottom: 16px;
}
.config-section:last-child {
margin-bottom: 0;
}
.section-subtitle {
font-size: 10px;
color: var(--tv-text-secondary);
text-transform: uppercase;
letter-spacing: 0.5px;
margin-bottom: 8px;
display: flex;
align-items: center;
gap: 8px;
}
.preset-action-btn {
background: var(--tv-blue);
border: none;
color: white;
font-size: 9px;
padding: 2px 8px;
border-radius: 3px;
cursor: pointer;
margin-left: auto;
}
.preset-action-btn:hover {
opacity: 0.9;
}
/* Config Row */
.config-row {
display: flex;
align-items: center;
gap: 8px;
margin-bottom: 8px;
}
.config-row label {
font-size: 11px;
color: var(--tv-text-secondary);
min-width: 80px;
}
.config-row select,
.config-row input[type="text"],
.config-row input[type="number"] {
flex: 1;
background: var(--tv-bg);
border: 1px solid var(--tv-border);
border-radius: 4px;
color: var(--tv-text);
font-size: 12px;
padding: 4px 8px;
min-width: 0;
}
.config-row select:focus,
.config-row input:focus {
outline: none;
border-color: var(--tv-blue);
}
.input-with-preset {
display: flex;
align-items: center;
gap: 4px;
flex: 1;
}
.input-with-preset input {
flex: 1;
}
.presets-btn {
background: transparent;
border: 1px solid var(--tv-border);
color: var(--tv-text-secondary);
cursor: pointer;
padding: 4px 8px;
font-size: 10px;
border-radius: 3px;
}
.presets-btn:hover {
background: var(--tv-hover);
}
/* Color Picker */
.color-picker {
display: flex;
align-items: center;
gap: 8px;
flex: 1;
}
.color-picker input[type="color"] {
width: 32px;
height: 28px;
border: 1px solid var(--tv-border);
border-radius: 4px;
cursor: pointer;
padding: 0;
background: transparent;
}
.color-preview {
width: 16px;
height: 16px;
border-radius: 3px;
border: 1px solid var(--tv-border);
}
/* Range Slider */
.config-row input[type="range"] {
flex: 1;
accent-color: var(--tv-blue);
}
/* Actions */
.config-actions {
display: flex;
gap: 8px;
padding-top: 12px;
border-top: 1px solid var(--tv-border);
}
.btn-secondary {
flex: 1;
background: var(--tv-bg);
border: 1px solid var(--tv-border);
color: var(--tv-text);
padding: 6px 12px;
border-radius: 4px;
cursor: pointer;
font-size: 12px;
}
.btn-secondary:hover {
background: var(--tv-hover);
}
.btn-danger {
flex: 1;
background: var(--tv-red);
border: none;
color: white;
padding: 6px 12px;
border-radius: 4px;
cursor: pointer;
font-size: 12px;
}
.btn-danger:hover {
opacity: 0.9;
}
/* No Results */
.no-results {
text-align: center;
color: var(--tv-text-secondary);
padding: 40px 20px;
font-size: 12px;
}
/* Presets List */
.presets-list {
max-height: 200px;
overflow-y: auto;
}
.preset-item {
display: flex;
align-items: center;
justify-content: space-between;
padding: 6px 8px;
border-radius: 4px;
cursor: pointer;
transition: background 0.15s;
}
.preset-item:hover {
background: var(--tv-hover);
}
.preset-item.applied {
background: rgba(38, 166, 154, 0.1);
border-radius: 4px;
}
.preset-label {
font-size: 11px;
color: var(--tv-text);
}
.preset-delete {
background: transparent;
border: none;
color: var(--tv-text-secondary);
cursor: pointer;
padding: 2px 6px;
font-size: 14px;
line-height: 1;
}
.preset-delete:hover {
color: var(--tv-red);
}
.no-presets {
text-align: center;
color: var(--tv-text-secondary);
font-size: 10px;
padding: 8px;
}
/* Range Value Display */
.range-value {
font-size: 11px;
color: var(--tv-text);
min-width: 20px;
}
/* Preset Indicator Button */
.preset-indicator {
background: transparent;
border: 1px solid var(--tv-border);
color: var(--tv-text-secondary);
cursor: pointer;
padding: 2px 6px;
font-size: 10px;
border-radius: 3px;
}
.preset-indicator:hover {
background: var(--tv-hover);
border-color: var(--tv-blue);
color: var(--tv-blue);
}
/* Mobile Responsive */
@media (max-width: 767px) {
.category-tabs {
font-size: 10px;
padding: 4px 8px;
}
.category-tab {
padding: 4px 8px;
}
.indicator-item-main {
padding: 6px 8px;
}
.indicator-btn {
width: 20px;
height: 20px;
}
.config-actions {
flex-direction: column;
}
.config-row label {
min-width: 60px;
font-size: 10px;
}
}
/* Touch-friendly styles for mobile */
@media (hover: none) {
.indicator-btn {
min-width: 40px;
min-height: 40px;
}
.category-tab {
padding: 10px 14px;
}
.indicator-item-main {
padding: 12px;
}
}
/* Dark theme improvements */
@media (prefers-color-scheme: dark) {
.indicator-search {
background: #1e222d;
}
.indicator-item {
background: #1e222d;
}
.indicator-config {
background: rgba(0, 0, 0, 0.3);
}
}
/* Animations */
.indicator-item {
transition: all 0.2s ease;
}
.indicator-config > * {
animation: fadeIn 0.2s ease;
}
@keyframes fadeIn {
from {
opacity: 0;
transform: translateY(-5px);
}
to {
opacity: 1;
transform: translateY(0);
}
}
/* Scrollbar styling for presets list */
.presets-list::-webkit-scrollbar {
width: 4px;
}
.presets-list::-webkit-scrollbar-thumb {
background: var(--tv-border);
border-radius: 2px;
}
/* Sidebar Tabs */
.sidebar-tabs {
display: flex;
gap: 4px;
flex: 1;
margin-right: 8px;
}
.sidebar-tab {
flex: 1;
background: transparent;
border: none;
color: var(--tv-text-secondary);
font-size: 11px;
padding: 6px 8px;
border-radius: 4px;
cursor: pointer;
transition: all 0.2s;
white-space: nowrap;
}
.sidebar-tab:hover {
background: var(--tv-hover);
color: var(--tv-text);
}
.sidebar-tab.active {
background: rgba(41, 98, 255, 0.15);
color: var(--tv-blue);
font-weight: 600;
}
/* Sidebar Tab Panels */
.sidebar-tab-panel {
display: none;
animation: fadeIn 0.2s ease;
}
.sidebar-tab-panel.active {
display: block;
}
/* Collapsed sidebar adjustments */
.right-sidebar.collapsed .sidebar-tabs {
display: none;
}
/* Strategy Panel Styles */
.indicator-checklist {
max-height: 120px;
overflow-y: auto;
background: var(--tv-bg);
border: 1px solid var(--tv-border);
border-radius: 4px;
padding: 4px;
margin-top: 4px;
}
.indicator-checklist::-webkit-scrollbar {
width: 4px;
}
.indicator-checklist::-webkit-scrollbar-thumb {
background: var(--tv-border);
border-radius: 2px;
}
.checklist-item {
display: flex;
align-items: center;
gap: 8px;
padding: 4px 8px;
font-size: 12px;
cursor: pointer;
border-radius: 3px;
}
.checklist-item:hover {
background: var(--tv-hover);
}
.checklist-item input {
cursor: pointer;
}
.equity-chart-container {
width: 100%;
height: 150px;
margin-top: 12px;
border-radius: 4px;
overflow: hidden;
border: 1px solid var(--tv-border);
background: var(--tv-bg);
}
.results-actions {
display: flex;
gap: 8px;
margin-top: 12px;
}
.chart-toggle-group {
display: flex;
background: var(--tv-hover);
border-radius: 4px;
padding: 2px;
}
.chart-toggle-group .toggle-btn {
padding: 2px 8px;
font-size: 10px;
border: none;
background: transparent;
color: var(--tv-text-secondary);
cursor: pointer;
border-radius: 3px;
transition: all 0.2s ease;
}
.chart-toggle-group .toggle-btn.active {
background: var(--tv-border);
color: var(--tv-text);
}
.chart-toggle-group .toggle-btn:hover:not(.active) {
color: var(--tv-text);
}

File diff suppressed because it is too large


@ -0,0 +1,82 @@
import { TradingDashboard, refreshTA, openAIAnalysis } from './ui/chart.js';
import { restoreSidebarState, toggleSidebar, initSidebarTabs, restoreSidebarTabState } from './ui/sidebar.js';
import {
initIndicatorPanel,
getActiveIndicators,
setActiveIndicators,
drawIndicatorsOnChart,
addIndicator,
removeIndicatorById
} from './ui/indicators-panel-new.js';
import { initStrategyPanel } from './ui/strategy-panel.js';
import { IndicatorRegistry } from './indicators/index.js';
import { TimezoneConfig } from './config/timezone.js';
window.dashboard = null;
window.toggleSidebar = toggleSidebar;
window.refreshTA = refreshTA;
window.openAIAnalysis = openAIAnalysis;
window.TimezoneConfig = TimezoneConfig;
window.renderIndicatorList = function() {
// This function is no longer needed for sidebar indicators
};
// Export init function for global access
window.initIndicatorPanel = initIndicatorPanel;
window.addIndicator = addIndicator;
window.toggleIndicator = addIndicator;
window.drawIndicatorsOnChart = drawIndicatorsOnChart;
window.updateIndicatorCandles = drawIndicatorsOnChart;
window.IndicatorRegistry = IndicatorRegistry;
document.addEventListener('DOMContentLoaded', async () => {
// Attach toggle sidebar event listener
const toggleBtn = document.getElementById('sidebarToggleBtn');
if (toggleBtn) {
toggleBtn.addEventListener('click', toggleSidebar);
}
// Initialize timezone selector
const timezoneSelect = document.getElementById('timezoneSelect');
const settingsPopup = document.getElementById('settingsPopup');
const settingsBtn = document.getElementById('btnSettings');
if (timezoneSelect) {
timezoneSelect.value = TimezoneConfig.getTimezone();
timezoneSelect.addEventListener('change', (e) => {
TimezoneConfig.setTimezone(e.target.value);
settingsPopup.classList.remove('show');
// Redraw chart and indicators
if (window.dashboard) {
window.drawIndicatorsOnChart?.();
}
});
}
// Toggle settings popup
if (settingsBtn && settingsPopup) {
settingsBtn.addEventListener('click', (e) => {
e.stopPropagation();
settingsPopup.classList.toggle('show');
});
settingsPopup.addEventListener('click', (e) => {
e.stopPropagation();
});
document.addEventListener('click', () => {
settingsPopup.classList.remove('show');
});
}
window.dashboard = new TradingDashboard();
restoreSidebarState();
restoreSidebarTabState();
initSidebarTabs();
// Initialize panels
window.initIndicatorPanel();
initStrategyPanel();
});

@@ -0,0 +1,76 @@
const TimezoneConfig = {
timezone: localStorage.getItem('timezone') || 'Europe/Warsaw',
availableTimezones: [
{ value: 'UTC', label: 'UTC', offset: 0 },
{ value: 'Europe/London', label: 'London (GMT/BST)', offset: 0 },
{ value: 'Europe/Paris', label: 'Central Europe (CET/CEST)', offset: 1 },
{ value: 'Europe/Warsaw', label: 'Warsaw (CET/CEST)', offset: 1 },
{ value: 'America/New_York', label: 'New York (EST/EDT)', offset: -5 },
{ value: 'America/Chicago', label: 'Chicago (CST/CDT)', offset: -6 },
{ value: 'America/Los_Angeles', label: 'Los Angeles (PST/PDT)', offset: -8 },
{ value: 'Asia/Tokyo', label: 'Tokyo (JST)', offset: 9 },
{ value: 'Asia/Shanghai', label: 'Shanghai (CST)', offset: 8 },
{ value: 'Australia/Sydney', label: 'Sydney (AEST/AEDT)', offset: 10 },
],
setTimezone(tz) {
this.timezone = tz;
localStorage.setItem('timezone', tz);
document.dispatchEvent(new CustomEvent('timezone-changed', { detail: tz }));
},
getTimezone() {
return this.timezone;
},
getOffsetHours(tz = this.timezone) {
const now = new Date();
const tzDate = new Date(now.toLocaleString('en-US', { timeZone: tz }));
const utcDate = new Date(now.toLocaleString('en-US', { timeZone: 'UTC' }));
return (tzDate - utcDate) / 3600000;
},
formatDate(timestamp) {
const date = new Date(timestamp);
const tz = this.timezone;
const options = {
timeZone: tz,
year: 'numeric', month: '2-digit', day: '2-digit',
hour: '2-digit', minute: '2-digit', second: '2-digit',
hour12: false
};
const formatter = new Intl.DateTimeFormat('en-GB', options);
const parts = formatter.formatToParts(date);
const get = (type) => parts.find(p => p.type === type).value;
return `${get('day')}/${get('month')}/${get('year')} ${get('hour')}:${get('minute')}`;
},
formatTickMark(timestamp) {
const date = new Date(timestamp * 1000);
const tz = this.timezone;
const options = {
timeZone: tz,
year: 'numeric', month: '2-digit', day: '2-digit',
hour: '2-digit', minute: '2-digit',
hour12: false
};
const formatter = new Intl.DateTimeFormat('en-GB', options);
const parts = formatter.formatToParts(date);
const get = (type) => parts.find(p => p.type === type).value;
// If it's exactly midnight, just show the date, otherwise show time too
const isMidnight = get('hour') === '00' && get('minute') === '00';
if (isMidnight) {
return `${get('day')}/${get('month')}/${get('year')}`;
}
return `${get('day')}/${get('month')} ${get('hour')}:${get('minute')}`;
}
};
export { TimezoneConfig };
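The tick-mark formatting above can be exercised standalone. The sketch below repeats the same `Intl.DateTimeFormat` logic with the timezone as a parameter instead of `this.timezone`; `hourCycle: 'h23'` is used defensively in place of `hour12: false` to sidestep the legacy "24:00" midnight quirk in older engines:

```javascript
// Minimal standalone version of TimezoneConfig.formatTickMark:
// formats a unix timestamp (seconds) in a given IANA timezone,
// collapsing exact midnight to a date-only label.
function formatTickMark(timestamp, tz) {
  const parts = new Intl.DateTimeFormat('en-GB', {
    timeZone: tz,
    year: 'numeric', month: '2-digit', day: '2-digit',
    hour: '2-digit', minute: '2-digit',
    hourCycle: 'h23' // avoids the '24:00' quirk of hour12: false
  }).formatToParts(new Date(timestamp * 1000));
  const get = (type) => parts.find(p => p.type === type).value;
  if (get('hour') === '00' && get('minute') === '00') {
    return `${get('day')}/${get('month')}/${get('year')}`;
  }
  return `${get('day')}/${get('month')} ${get('hour')}:${get('minute')}`;
}

console.log(formatTickMark(1699920000, 'UTC')); // 2023-11-14T00:00:00Z, exact midnight
console.log(formatTickMark(1700000000, 'UTC')); // 22:13 UTC the same day
```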

@@ -0,0 +1,15 @@
export const INTERVALS = ['1m', '3m', '5m', '15m', '30m', '37m', '1h', '2h', '4h', '8h', '12h', '1d', '3d', '1w', '1M'];
export const COLORS = {
tvBg: '#131722',
tvPanelBg: '#1e222d',
tvBorder: '#2a2e39',
tvText: '#d1d4dc',
tvTextSecondary: '#787b86',
tvGreen: '#26a69a',
tvRed: '#ef5350',
tvBlue: '#2962ff',
tvHover: '#2a2e39'
};
export const API_BASE = '/api/v1';

@@ -0,0 +1 @@
export { INTERVALS, COLORS, API_BASE } from './constants.js';

@@ -0,0 +1,118 @@
// Self-contained ATR indicator
// Includes math, metadata, signal calculation, and base class
// Signal constants (defined in each indicator file)
const SIGNAL_TYPES = {
BUY: 'buy',
SELL: 'sell',
HOLD: 'hold'
};
const SIGNAL_COLORS = {
buy: '#26a69a',
hold: '#787b86',
sell: '#ef5350'
};
// Base class (inline replacement for BaseIndicator)
class BaseIndicator {
constructor(config) {
this.id = config.id;
this.type = config.type;
this.name = config.name;
this.params = config.params || {};
this.timeframe = config.timeframe || '1m';
this.series = [];
this.visible = config.visible !== false;
this.cachedResults = null;
this.cachedMeta = null;
this.lastSignalTimestamp = null;
this.lastSignalType = null;
}
}
// Signal calculation for ATR
function calculateATRSignal(indicator, lastCandle, prevCandle, values) {
const atr = values?.atr;
const close = lastCandle.close;
const prevClose = prevCandle?.close;
if (!atr || !prevClose) {
return null;
}
const atrPercent = atr / close * 100;
const priceChange = Math.abs(close - prevClose);
const atrRatio = priceChange / atr;
if (atrRatio > 1.5) {
return {
type: SIGNAL_TYPES.HOLD,
strength: 70,
value: atr,
reasoning: `High volatility: ATR (${atr.toFixed(2)}, ${atrPercent.toFixed(2)}%)`
};
}
return null;
}
// ATR Indicator class
export class ATRIndicator extends BaseIndicator {
constructor(config) {
super(config);
this.lastSignalTimestamp = null;
this.lastSignalType = null;
}
calculate(candles) {
const period = this.params.period || 14;
const results = new Array(candles.length).fill(null);
const tr = new Array(candles.length).fill(0);
for (let i = 1; i < candles.length; i++) {
const h_l = candles[i].high - candles[i].low;
const h_pc = Math.abs(candles[i].high - candles[i-1].close);
const l_pc = Math.abs(candles[i].low - candles[i-1].close);
tr[i] = Math.max(h_l, h_pc, l_pc);
}
// Guard: need at least period+1 candles to seed the ATR
if (candles.length <= period) return results.map(v => ({ atr: v }));
let sum = 0;
for (let i = 1; i <= period; i++) sum += tr[i];
let atr = sum / period;
results[period] = atr;
for (let i = period + 1; i < candles.length; i++) {
atr = (atr * (period - 1) + tr[i]) / period;
results[i] = atr;
}
return results.map(atr => ({ atr }));
}
getMetadata() {
return {
name: 'ATR',
description: 'Average True Range - measures market volatility',
inputs: [{
name: 'period',
label: 'Period',
type: 'number',
default: 14,
min: 1,
max: 100,
description: 'Period for ATR calculation'
}],
plots: [{
id: 'value',
color: '#795548',
title: 'ATR',
lineWidth: 1
}],
displayMode: 'pane'
};
}
}
export { calculateATRSignal };
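The Wilder smoothing in `calculate` can be checked by hand on a tiny series. This sketch mirrors the class logic as a free function; the candle values are invented for illustration:

```javascript
// Mirror of ATRIndicator.calculate for a quick hand-check:
// TR = max(high-low, |high-prevClose|, |low-prevClose|),
// seeded with an SMA of TR, then Wilder-smoothed.
function atrSeries(candles, period) {
  const results = new Array(candles.length).fill(null);
  const tr = new Array(candles.length).fill(0);
  for (let i = 1; i < candles.length; i++) {
    tr[i] = Math.max(
      candles[i].high - candles[i].low,
      Math.abs(candles[i].high - candles[i - 1].close),
      Math.abs(candles[i].low - candles[i - 1].close)
    );
  }
  if (candles.length <= period) return results;
  let sum = 0;
  for (let i = 1; i <= period; i++) sum += tr[i];
  let atr = sum / period;
  results[period] = atr;
  for (let i = period + 1; i < candles.length; i++) {
    atr = (atr * (period - 1) + tr[i]) / period;
    results[i] = atr;
  }
  return results;
}

const candles = [
  { high: 10, low: 8, close: 9 },
  { high: 11, low: 9, close: 10 },  // TR = 2
  { high: 12, low: 10, close: 11 }, // TR = 2, seed ATR = (2+2)/2 = 2
  { high: 14, low: 10, close: 12 }  // TR = 4, ATR = (2*1 + 4)/2 = 3
];
console.log(atrSeries(candles, 2)); // [null, null, 2, 3]
```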

@@ -0,0 +1,118 @@
// Self-contained Bollinger Bands indicator
// Includes math, metadata, signal calculation, and base class
// Signal constants (defined in each indicator file)
const SIGNAL_TYPES = {
BUY: 'buy',
SELL: 'sell',
HOLD: 'hold'
};
const SIGNAL_COLORS = {
buy: '#26a69a',
hold: '#787b86',
sell: '#ef5350'
};
// Base class (inline replacement for BaseIndicator)
class BaseIndicator {
constructor(config) {
this.id = config.id;
this.type = config.type;
this.name = config.name;
this.params = config.params || {};
this.timeframe = config.timeframe || '1m';
this.series = [];
this.visible = config.visible !== false;
this.cachedResults = null;
this.cachedMeta = null;
this.lastSignalTimestamp = null;
this.lastSignalType = null;
}
}
// Signal calculation for Bollinger Bands
function calculateBollingerBandsSignal(indicator, lastCandle, prevCandle, values, prevValues) {
const close = lastCandle.close;
const prevClose = prevCandle?.close;
const upper = values?.upper;
const lower = values?.lower;
const prevUpper = prevValues?.upper;
const prevLower = prevValues?.lower;
if (!upper || !lower || prevUpper === undefined || prevLower === undefined || prevClose === undefined) {
return null;
}
// BUY: Price crosses DOWN through lower band (reversal/bounce play)
if (prevClose > prevLower && close <= lower) {
return {
type: SIGNAL_TYPES.BUY,
strength: 70,
value: close,
reasoning: `Price crossed DOWN through lower Bollinger Band`
};
}
// SELL: Price crosses UP through upper band (overextended play)
else if (prevClose < prevUpper && close >= upper) {
return {
type: SIGNAL_TYPES.SELL,
strength: 70,
value: close,
reasoning: `Price crossed UP through upper Bollinger Band`
};
}
return null;
}
// Bollinger Bands Indicator class
export class BollingerBandsIndicator extends BaseIndicator {
constructor(config) {
super(config);
this.lastSignalTimestamp = null;
this.lastSignalType = null;
}
calculate(candles) {
const period = this.params.period || 20;
const stdDevMult = this.params.stdDev || 2;
const results = new Array(candles.length).fill(null);
for (let i = period - 1; i < candles.length; i++) {
let sum = 0;
for (let j = 0; j < period; j++) sum += candles[i-j].close;
const sma = sum / period;
let diffSum = 0;
for (let j = 0; j < period; j++) diffSum += Math.pow(candles[i-j].close - sma, 2);
const stdDev = Math.sqrt(diffSum / period);
results[i] = {
middle: sma,
upper: sma + (stdDevMult * stdDev),
lower: sma - (stdDevMult * stdDev)
};
}
return results;
}
getMetadata() {
return {
name: 'Bollinger Bands',
description: 'Volatility bands around a moving average',
inputs: [
{ name: 'period', label: 'Period', type: 'number', default: 20, min: 1, max: 100 },
{ name: 'stdDev', label: 'Std Dev', type: 'number', default: 2, min: 0.5, max: 5, step: 0.5 }
],
plots: [
{ id: 'upper', color: '#4caf50', title: 'Upper' },
{ id: 'middle', color: '#4caf50', title: 'Middle', lineStyle: 2 },
{ id: 'lower', color: '#4caf50', title: 'Lower' }
],
displayMode: 'overlay'
};
}
}
export { calculateBollingerBandsSignal };
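The band math in `calculate` reduces to SMA ± multiplier × population standard deviation over the window. A quick hand-check on a toy close series (values invented):

```javascript
// Mirror of BollingerBandsIndicator.calculate on a close array:
// middle = SMA, upper/lower = SMA ± mult * population std dev.
function bollinger(closes, period, mult) {
  const out = new Array(closes.length).fill(null);
  for (let i = period - 1; i < closes.length; i++) {
    let sum = 0;
    for (let j = 0; j < period; j++) sum += closes[i - j];
    const sma = sum / period;
    let diffSum = 0;
    for (let j = 0; j < period; j++) diffSum += (closes[i - j] - sma) ** 2;
    const sd = Math.sqrt(diffSum / period);
    out[i] = { middle: sma, upper: sma + mult * sd, lower: sma - mult * sd };
  }
  return out;
}

// closes 1..5, period 5: SMA = 3, variance = 2, std = sqrt(2)
const bands = bollinger([1, 2, 3, 4, 5], 5, 2)[4];
console.log(bands.middle);           // 3
console.log(bands.upper.toFixed(4)); // 3 + 2*sqrt(2)
```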

@@ -0,0 +1,255 @@
// Self-contained HTS Trend System indicator
// Includes math, metadata, signal calculation, and base class
// Signal constants (defined in each indicator file)
const SIGNAL_TYPES = {
BUY: 'buy',
SELL: 'sell',
HOLD: 'hold'
};
const SIGNAL_COLORS = {
buy: '#26a69a',
hold: '#787b86',
sell: '#ef5350'
};
// Base class (inline replacement for BaseIndicator)
class BaseIndicator {
constructor(config) {
this.id = config.id;
this.type = config.type;
this.name = config.name;
this.params = config.params || {};
this.timeframe = config.timeframe || '1m';
this.series = [];
this.visible = config.visible !== false;
this.cachedResults = null;
this.cachedMeta = null;
this.lastSignalTimestamp = null;
this.lastSignalType = null;
}
}
// MA calculations inline (SMA/EMA/RMA/WMA/VWMA)
function calculateSMA(candles, period, source = 'close') {
const results = new Array(candles.length).fill(null);
let sum = 0;
for (let i = 0; i < candles.length; i++) {
sum += candles[i][source];
if (i >= period) sum -= candles[i - period][source];
if (i >= period - 1) results[i] = sum / period;
}
return results;
}
function calculateEMA(candles, period, source = 'close') {
const multiplier = 2 / (period + 1);
const results = new Array(candles.length).fill(null);
let ema = 0;
let sum = 0;
for (let i = 0; i < candles.length; i++) {
if (i < period) {
sum += candles[i][source];
if (i === period - 1) {
ema = sum / period;
results[i] = ema;
}
} else {
ema = (candles[i][source] - ema) * multiplier + ema;
results[i] = ema;
}
}
return results;
}
function calculateRMA(candles, period, source = 'close') {
const multiplier = 1 / period;
const results = new Array(candles.length).fill(null);
let rma = 0;
let sum = 0;
for (let i = 0; i < candles.length; i++) {
if (i < period) {
sum += candles[i][source];
if (i === period - 1) {
rma = sum / period;
results[i] = rma;
}
} else {
rma = (candles[i][source] - rma) * multiplier + rma;
results[i] = rma;
}
}
return results;
}
function calculateWMA(candles, period, source = 'close') {
const results = new Array(candles.length).fill(null);
const weightSum = (period * (period + 1)) / 2;
for (let i = period - 1; i < candles.length; i++) {
let sum = 0;
for (let j = 0; j < period; j++) {
sum += candles[i - j][source] * (period - j);
}
results[i] = sum / weightSum;
}
return results;
}
function calculateVWMA(candles, period, source = 'close') {
const results = new Array(candles.length).fill(null);
for (let i = period - 1; i < candles.length; i++) {
let sumPV = 0;
let sumV = 0;
for (let j = 0; j < period; j++) {
sumPV += candles[i - j][source] * candles[i - j].volume;
sumV += candles[i - j].volume;
}
results[i] = sumV !== 0 ? sumPV / sumV : null;
}
return results;
}
// MA dispatcher function
function getMA(type, candles, period, source = 'close') {
switch (type.toUpperCase()) {
case 'SMA': return calculateSMA(candles, period, source);
case 'EMA': return calculateEMA(candles, period, source);
case 'RMA': return calculateRMA(candles, period, source);
case 'WMA': return calculateWMA(candles, period, source);
case 'VWMA': return calculateVWMA(candles, period, source);
default: return calculateSMA(candles, period, source);
}
}
// Signal calculation for HTS
function calculateHTSSignal(indicator, lastCandle, prevCandle, values, prevValues) {
const slowLow = values?.slowLow;
const slowHigh = values?.slowHigh;
const prevSlowLow = prevValues?.slowLow;
const prevSlowHigh = prevValues?.slowHigh;
if (!slowLow || !slowHigh || !prevSlowLow || !prevSlowHigh) {
return null;
}
const close = lastCandle.close;
const prevClose = prevCandle?.close;
if (prevClose === undefined) return null;
// BUY: Price crosses UP through slow low
if (prevClose <= prevSlowLow && close > slowLow) {
return {
type: SIGNAL_TYPES.BUY,
strength: 85,
value: close,
reasoning: `Price crossed UP through slow low`
};
}
// SELL: Price crosses DOWN through slow high
else if (prevClose >= prevSlowHigh && close < slowHigh) {
return {
type: SIGNAL_TYPES.SELL,
strength: 85,
value: close,
reasoning: `Price crossed DOWN through slow high`
};
}
return null;
}
// HTS Indicator class
export class HTSIndicator extends BaseIndicator {
constructor(config) {
super(config);
this.lastSignalTimestamp = null;
this.lastSignalType = null;
}
calculate(candles, oneMinCandles = null, targetTF = null) {
const shortPeriod = this.params.short || 33;
const longPeriod = this.params.long || 144;
const maType = this.params.maType || 'RMA';
const useAutoHTS = this.params.useAutoHTS || false;
let workingCandles = candles;
if (useAutoHTS && oneMinCandles && targetTF) {
const tfMultipliers = {
'5m': 5,
'15m': 15,
'30m': 30,
'37m': 37,
'1h': 60,
'4h': 240
};
const tfGroup = tfMultipliers[targetTF] || 5;
const grouped = [];
let currentGroup = [];
for (let i = 0; i < oneMinCandles.length; i++) {
currentGroup.push(oneMinCandles[i]);
if (currentGroup.length >= tfGroup) {
grouped.push({
time: currentGroup[tfGroup - 1].time,
// Aggregate the group into one candle: first open, max high,
// min low, last close, summed volume
open: currentGroup[0].open,
high: Math.max(...currentGroup.map(c => c.high)),
low: Math.min(...currentGroup.map(c => c.low)),
close: currentGroup[tfGroup - 1].close,
volume: currentGroup.reduce((s, c) => s + c.volume, 0)
});
currentGroup = [];
}
}
workingCandles = grouped;
}
const shortHigh = getMA(maType, workingCandles, shortPeriod, 'high');
const shortLow = getMA(maType, workingCandles, shortPeriod, 'low');
const longHigh = getMA(maType, workingCandles, longPeriod, 'high');
const longLow = getMA(maType, workingCandles, longPeriod, 'low');
return workingCandles.map((_, i) => ({
fastHigh: shortHigh[i],
fastLow: shortLow[i],
slowHigh: longHigh[i],
slowLow: longLow[i],
fastMidpoint: ((shortHigh[i] || 0) + (shortLow[i] || 0)) / 2,
slowMidpoint: ((longHigh[i] || 0) + (longLow[i] || 0)) / 2
}));
}
getMetadata() {
const fastLineWidth = 1;
const slowLineWidth = 2;
return {
name: 'HTS Trend System',
description: 'High/Low Trend System with Fast and Slow MAs',
inputs: [
{ name: 'short', label: 'Fast Period', type: 'number', default: 33, min: 1, max: 500 },
{ name: 'long', label: 'Slow Period', type: 'number', default: 144, min: 1, max: 500 },
{ name: 'maType', label: 'MA Type', type: 'select', options: ['SMA', 'EMA', 'RMA', 'WMA', 'VWMA'], default: 'RMA' },
{ name: 'useAutoHTS', label: 'Auto HTS (TF/4)', type: 'boolean', default: false }
],
plots: [
{ id: 'fastHigh', color: '#00bcd4', title: 'Fast High', width: fastLineWidth },
{ id: 'fastLow', color: '#00bcd4', title: 'Fast Low', width: fastLineWidth },
{ id: 'slowHigh', color: '#f44336', title: 'Slow High', width: slowLineWidth },
{ id: 'slowLow', color: '#f44336', title: 'Slow Low', width: slowLineWidth }
],
displayMode: 'overlay'
};
}
}
export { calculateHTSSignal };
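The `getMA` dispatcher above selects between smoothing families that share one recursive form and differ only in alpha: EMA uses 2/(period+1) while RMA (Wilder) uses 1/period, so RMA reacts more slowly to the same input. A minimal comparison on an invented series:

```javascript
// Generic seeded exponential smoother: SMA seed at index period-1,
// then out[i] = (x[i] - out[i-1]) * alpha + out[i-1].
function smoothed(closes, period, alpha) {
  const out = new Array(closes.length).fill(null);
  let acc = 0;
  for (let i = 0; i < closes.length; i++) {
    if (i < period) {
      acc += closes[i];
      if (i === period - 1) out[i] = acc / period;
    } else {
      out[i] = (closes[i] - out[i - 1]) * alpha + out[i - 1];
    }
  }
  return out;
}

const closes = [1, 2, 4];
const ema = smoothed(closes, 2, 2 / (2 + 1)); // last = 1.5 + 2.5*(2/3)
const rma = smoothed(closes, 2, 1 / 2);       // last = 1.5 + 2.5*0.5 = 2.75
console.log(ema[2], rma[2]);
```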

@@ -0,0 +1,421 @@
// Self-contained Hurst Bands indicator
// Based on J.M. Hurst's cyclic price channel theory
// Using RMA + ATR displacement method
import { INTERVALS } from '../core/constants.js';
const SIGNAL_TYPES = {
BUY: 'buy',
SELL: 'sell'
};
const SIGNAL_COLORS = {
buy: '#9e9e9e',
sell: '#9e9e9e'
};
class BaseIndicator {
constructor(config) {
this.config = config;
this.id = config.id;
this.type = config.type;
this.name = config.name;
this.params = config.params || {};
this.timeframe = config.timeframe || 'chart';
this.series = [];
this.visible = config.visible !== false;
if (config.cachedResults === undefined) config.cachedResults = null;
if (config.cachedMeta === undefined) config.cachedMeta = null;
if (config.cachedTimeframe === undefined) config.cachedTimeframe = null;
if (config.isFetching === undefined) config.isFetching = false;
if (config.lastProcessedTime === undefined) config.lastProcessedTime = 0;
}
get cachedResults() { return this.config.cachedResults; }
set cachedResults(v) { this.config.cachedResults = v; }
get cachedMeta() { return this.config.cachedMeta; }
set cachedMeta(v) { this.config.cachedMeta = v; }
get cachedTimeframe() { return this.config.cachedTimeframe; }
set cachedTimeframe(v) { this.config.cachedTimeframe = v; }
get isFetching() { return this.config.isFetching; }
set isFetching(v) { this.config.isFetching = v; }
get lastProcessedTime() { return this.config.lastProcessedTime; }
set lastProcessedTime(v) { this.config.lastProcessedTime = v; }
}
// Optimized RMA that can start from a previous state
function calculateRMAIncremental(sourceValue, prevRMA, length) {
if (prevRMA === null || isNaN(prevRMA)) return sourceValue;
const alpha = 1 / length;
return alpha * sourceValue + (1 - alpha) * prevRMA;
}
// Calculate RMA for a full array with stable initialization
function calculateRMA(sourceArray, length) {
const rma = new Array(sourceArray.length).fill(null);
let sum = 0;
const alpha = 1 / length;
const smaLength = Math.round(length);
for (let i = 0; i < sourceArray.length; i++) {
if (i < smaLength - 1) {
sum += sourceArray[i];
} else if (i === smaLength - 1) {
sum += sourceArray[i];
rma[i] = sum / smaLength;
} else {
const prevRMA = rma[i - 1];
rma[i] = (prevRMA === null || isNaN(prevRMA))
? sourceArray[i]
: alpha * sourceArray[i] + (1 - alpha) * prevRMA;
}
}
return rma;
}
function calculateHurstSignal(indicator, lastCandle, prevCandle, values, prevValues) {
const close = lastCandle.close;
const prevClose = prevCandle?.close;
const upper = values?.upper;
const lower = values?.lower;
const prevUpper = prevValues?.upper;
const prevLower = prevValues?.lower;
if (close === undefined || prevClose === undefined || !upper || !lower || !prevUpper || !prevLower) {
return null;
}
// BUY: Price crosses DOWN through lower Hurst Band (dip entry)
if (prevClose > prevLower && close <= lower) {
return {
type: 'buy',
strength: 80,
value: close,
reasoning: `Price crossed DOWN through lower Hurst Band`
};
}
// SELL: Price crosses DOWN through upper Hurst Band (reversal entry)
if (prevClose > prevUpper && close <= upper) {
return {
type: 'sell',
strength: 80,
value: close,
reasoning: `Price crossed DOWN through upper Hurst Band`
};
}
return null;
}
function getEffectiveTimeframe(params) {
return params.timeframe === 'chart' ? window.dashboard?.currentInterval || '1m' : params.timeframe;
}
function intervalToSeconds(interval) {
const amount = parseInt(interval);
const unit = interval.replace(/[0-9]/g, '');
switch (unit) {
case 'm': return amount * 60;
case 'h': return amount * 3600;
case 'd': return amount * 86400;
case 'w': return amount * 604800;
case 'M': return amount * 2592000;
default: return 60;
}
}
async function getCandlesForTimeframe(tf, startTime, endTime) {
const url = `/api/v1/candles?symbol=BTC&interval=${tf}&start=${startTime.toISOString()}&end=${endTime.toISOString()}&limit=5000`;
const response = await fetch(url);
if (!response.ok) {
console.error(`Failed to fetch candles for ${tf}:`, response.status, response.statusText);
return [];
}
const data = await response.json();
// API returns newest first (desc), but indicators need oldest first (asc)
// Also convert time to numeric seconds to match targetCandles
return (data.candles || []).reverse().map(c => ({
...c,
time: Math.floor(new Date(c.time).getTime() / 1000)
}));
}
/**
* Robust forward filling for MTF data.
* @param {Array} results - MTF results (e.g. 5m)
* @param {Array} targetCandles - Chart candles (e.g. 1m)
*/
function forwardFillResults(results, targetCandles) {
if (!results || results.length === 0) {
return new Array(targetCandles.length).fill(null);
}
const filled = new Array(targetCandles.length).fill(null);
let resIdx = 0;
for (let i = 0; i < targetCandles.length; i++) {
const targetTime = targetCandles[i].time;
// Advance result index while next result time is <= target time
while (resIdx < results.length - 1 && results[resIdx + 1].time <= targetTime) {
resIdx++;
}
// If the current result is valid for this target time, use it
// (result time must be <= target time)
if (results[resIdx] && results[resIdx].time <= targetTime) {
filled[i] = results[resIdx];
}
}
return filled;
}
export class HurstBandsIndicator extends BaseIndicator {
constructor(config) {
super(config);
if (!this.params.timeframe) this.params.timeframe = 'chart';
if (!this.params.markerBuyShape) this.params.markerBuyShape = 'custom';
if (!this.params.markerSellShape) this.params.markerSellShape = 'custom';
if (!this.params.markerBuyColor) this.params.markerBuyColor = '#9e9e9e';
if (!this.params.markerSellColor) this.params.markerSellColor = '#9e9e9e';
if (!this.params.markerBuyCustom) this.params.markerBuyCustom = '▲';
if (!this.params.markerSellCustom) this.params.markerSellCustom = '▼';
}
calculate(candles) {
const effectiveTf = getEffectiveTimeframe(this.params);
const lastCandle = candles[candles.length - 1];
// Case 1: Different timeframe (MTF)
if (effectiveTf !== window.dashboard?.currentInterval && this.params.timeframe !== 'chart') {
// If we have cached results, try to forward fill them to match the current candle count
if (this.cachedResults && this.cachedTimeframe === effectiveTf) {
// If results are stale (last result time is behind last candle time), trigger background fetch
const lastResult = this.cachedResults[this.cachedResults.length - 1];
const needsFetch = !this.isFetching && (!lastResult || lastCandle.time > lastResult.time + (intervalToSeconds(effectiveTf) / 2));
if (needsFetch) {
this._fetchAndCalculateMtf(effectiveTf, candles);
}
// If length matches exactly and params haven't changed, return
if (this.cachedResults.length === candles.length && !this.shouldRecalculate()) {
return this.cachedResults;
}
// If length differs (e.g. new 1m candle but 5m not fetched yet), forward fill
const filled = forwardFillResults(this.cachedResults.filter(r => r !== null), candles);
this.cachedResults = filled;
return filled;
}
// Initial fetch
if (!this.isFetching) {
this._fetchAndCalculateMtf(effectiveTf, candles);
}
return new Array(candles.length).fill(null);
}
// Case 2: Same timeframe as chart (Incremental or Full)
// Check if we can do incremental update
if (this.cachedResults &&
this.cachedResults.length > 0 &&
this.cachedTimeframe === effectiveTf &&
!this.shouldRecalculate() &&
candles.length >= this.cachedResults.length &&
candles[this.cachedResults.length - 1].time === this.cachedResults[this.cachedResults.length - 1].time) {
// Only calculate new candles
if (candles.length > this.cachedResults.length) {
const newResults = this._calculateIncremental(candles, this.cachedResults);
this.cachedResults = newResults;
}
return this.cachedResults;
}
// Full calculation
const results = this._calculateCore(candles);
this.cachedTimeframe = effectiveTf;
this.updateCachedMeta(this.params);
this.cachedResults = results;
return results;
}
_calculateCore(candles) {
const mcl_t = this.params.period || 30;
const mcm = this.params.multiplier || 1.8;
const mcl = mcl_t / 2;
const mcl_2 = Math.round(mcl / 2);
const results = new Array(candles.length).fill(null);
const closes = candles.map(c => c.close);
// True Range for ATR
const trArray = candles.map((d, i) => {
const prevClose = i > 0 ? candles[i - 1].close : null;
if (prevClose === null || isNaN(prevClose)) return d.high - d.low;
return Math.max(d.high - d.low, Math.abs(d.high - prevClose), Math.abs(d.low - prevClose));
});
const ma_mcl = calculateRMA(closes, mcl);
const atr = calculateRMA(trArray, mcl);
for (let i = 0; i < candles.length; i++) {
const mcm_off = mcm * (atr[i] || 0);
const historicalIndex = i - mcl_2;
const historical_ma = historicalIndex >= 0 ? ma_mcl[historicalIndex] : null;
const centerLine = (historical_ma === null || isNaN(historical_ma)) ? closes[i] : historical_ma;
results[i] = {
time: candles[i].time,
upper: centerLine + mcm_off,
lower: centerLine - mcm_off,
ma: ma_mcl[i], // Store intermediate state for incremental updates
atr: atr[i]
};
}
return results;
}
_calculateIncremental(candles, oldResults) {
const mcl_t = this.params.period || 30;
const mcm = this.params.multiplier || 1.8;
const mcl = mcl_t / 2;
const mcl_2 = Math.round(mcl / 2);
const results = [...oldResults];
const startIndex = oldResults.length;
for (let i = startIndex; i < candles.length; i++) {
const close = candles[i].close;
const prevClose = candles[i-1].close;
const tr = Math.max(candles[i].high - candles[i].low, Math.abs(candles[i].high - prevClose), Math.abs(candles[i].low - prevClose));
const prevMA = results[i-1]?.ma;
const prevATR = results[i-1]?.atr;
const currentMA = calculateRMAIncremental(close, prevMA, mcl);
const currentATR = calculateRMAIncremental(tr, prevATR, mcl);
// For displaced center line, we still need the MA from i - mcl_2
// Since i >= oldResults.length, i - mcl_2 might be in the old results
let historical_ma = null;
const historicalIndex = i - mcl_2;
if (historicalIndex >= 0) {
historical_ma = historicalIndex < startIndex ? results[historicalIndex].ma : null; // In this simple incremental, we don't look ahead
}
const centerLine = (historical_ma === null || isNaN(historical_ma)) ? close : historical_ma;
const mcm_off = mcm * (currentATR || 0);
results[i] = {
time: candles[i].time,
upper: centerLine + mcm_off,
lower: centerLine - mcm_off,
ma: currentMA,
atr: currentATR
};
}
return results;
}
async _fetchAndCalculateMtf(effectiveTf, targetCandles) {
this.isFetching = true;
try {
console.log(`[Hurst] Fetching MTF data for ${effectiveTf}...`);
const chartData = window.dashboard?.allData?.get(window.dashboard?.currentInterval) || targetCandles;
if (!chartData || chartData.length === 0) {
console.warn('[Hurst] No chart data available for timeframe fetch');
this.isFetching = false;
return;
}
// Calculate warmup needed (period + half width)
const mcl_t = this.params.period || 30;
const warmupBars = mcl_t * 2; // Extra buffer
const tfSeconds = intervalToSeconds(effectiveTf);
const warmupOffsetSeconds = warmupBars * tfSeconds;
// Candles endpoint expects ISO strings or timestamps.
// chartData[0].time is the earliest candle on chart.
const startTime = new Date((chartData[0].time - warmupOffsetSeconds) * 1000);
const endTime = new Date(chartData[chartData.length - 1].time * 1000);
const tfCandles = await getCandlesForTimeframe(effectiveTf, startTime, endTime);
if (tfCandles.length === 0) {
console.warn(`[Hurst] No candles fetched for ${effectiveTf}`);
this.isFetching = false;
return;
}
console.log(`[Hurst] Fetched ${tfCandles.length} candles for ${effectiveTf}. Calculating...`);
const tfResults = this._calculateCore(tfCandles);
const finalResults = forwardFillResults(tfResults, targetCandles);
// Persist results on the config object
this.cachedResults = finalResults;
this.cachedTimeframe = effectiveTf;
this.updateCachedMeta(this.params);
console.log(`[Hurst] MTF calculation complete for ${effectiveTf}. Triggering redraw.`);
// Trigger a redraw of the dashboard to show the new data
if (window.drawIndicatorsOnChart) {
window.drawIndicatorsOnChart();
}
} catch (err) {
console.error('[Hurst] Error in _fetchAndCalculateMtf:', err);
} finally {
this.isFetching = false;
}
}
getMetadata() {
return {
name: 'Hurst Bands',
description: 'Cyclic price channels based on Hurst theory',
inputs: [
{
name: 'timeframe',
label: 'Timeframe',
type: 'select',
default: 'chart',
options: ['chart', ...INTERVALS],
labels: { chart: '(Main Chart)' }
},
{ name: 'period', label: 'Hurst Cycle Length (mcl_t)', type: 'number', default: 30, min: 5, max: 200 },
{ name: 'multiplier', label: 'Multiplier (mcm)', type: 'number', default: 1.8, min: 0.5, max: 10, step: 0.1 }
],
plots: [
{ id: 'upper', color: '#808080', title: 'Upper', lineWidth: 1 },
{ id: 'lower', color: '#808080', title: 'Lower', lineWidth: 1 }
],
bands: [
{ topId: 'upper', bottomId: 'lower', color: 'rgba(128, 128, 128, 0.05)' }
],
displayMode: 'overlay'
};
}
shouldRecalculate() {
const effectiveTf = getEffectiveTimeframe(this.params);
return this.cachedTimeframe !== effectiveTf ||
(this.cachedMeta && (this.cachedMeta.period !== this.params.period ||
this.cachedMeta.multiplier !== this.params.multiplier));
}
updateCachedMeta(params) {
this.cachedMeta = {
period: params.period,
multiplier: params.multiplier
};
}
}
export { calculateHurstSignal };
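The MTF alignment step can be verified in isolation. This sketch repeats `forwardFillResults` and maps synthetic higher-timeframe results onto finer target timestamps (all times in epoch seconds, values invented):

```javascript
// Repeat of forwardFillResults: each target candle takes the most
// recent MTF result whose time is <= the target time; targets that
// predate the first result stay null.
function forwardFillResults(results, targetCandles) {
  if (!results || results.length === 0) {
    return new Array(targetCandles.length).fill(null);
  }
  const filled = new Array(targetCandles.length).fill(null);
  let resIdx = 0;
  for (let i = 0; i < targetCandles.length; i++) {
    const targetTime = targetCandles[i].time;
    while (resIdx < results.length - 1 && results[resIdx + 1].time <= targetTime) {
      resIdx++;
    }
    if (results[resIdx] && results[resIdx].time <= targetTime) {
      filled[i] = results[resIdx];
    }
  }
  return filled;
}

const results = [{ time: 300, v: 'a' }, { time: 600, v: 'b' }];
const targets = [0, 60, 300, 360, 600, 660].map(time => ({ time }));
console.log(forwardFillResults(results, targets).map(r => r && r.v));
// [null, null, 'a', 'a', 'b', 'b']
```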

@@ -0,0 +1,69 @@
// Indicator registry and exports for self-contained indicators
// Import all indicator classes and their signal functions
export { MAIndicator, calculateMASignal } from './moving_average.js';
export { MACDIndicator, calculateMACDSignal } from './macd.js';
export { HTSIndicator, calculateHTSSignal } from './hts.js';
export { RSIIndicator, calculateRSISignal } from './rsi.js';
export { BollingerBandsIndicator, calculateBollingerBandsSignal } from './bb.js';
export { StochasticIndicator, calculateStochSignal } from './stoch.js';
export { ATRIndicator, calculateATRSignal } from './atr.js';
export { HurstBandsIndicator, calculateHurstSignal } from './hurst.js';
// Import for registry
import { MAIndicator as MAI, calculateMASignal as CMA } from './moving_average.js';
import { MACDIndicator as MACDI, calculateMACDSignal as CMC } from './macd.js';
import { HTSIndicator as HTSI, calculateHTSSignal as CHTS } from './hts.js';
import { RSIIndicator as RSII, calculateRSISignal as CRSI } from './rsi.js';
import { BollingerBandsIndicator as BBI, calculateBollingerBandsSignal as CBB } from './bb.js';
import { StochasticIndicator as STOCHI, calculateStochSignal as CST } from './stoch.js';
import { ATRIndicator as ATRI, calculateATRSignal as CATR } from './atr.js';
import { HurstBandsIndicator as HURSTI, calculateHurstSignal as CHURST } from './hurst.js';
// Signal function registry for easy dispatch
export const SignalFunctionRegistry = {
ma: CMA,
macd: CMC,
hts: CHTS,
rsi: CRSI,
bb: CBB,
stoch: CST,
atr: CATR,
hurst: CHURST
};
// Indicator registry for UI
export const IndicatorRegistry = {
ma: MAI,
macd: MACDI,
hts: HTSI,
rsi: RSII,
bb: BBI,
stoch: STOCHI,
atr: ATRI,
hurst: HURSTI
};
/**
* Get list of available indicators for the UI catalog
*/
export function getAvailableIndicators() {
return Object.entries(IndicatorRegistry).map(([type, IndicatorClass]) => {
const instance = new IndicatorClass({ type, params: {}, name: '' });
const meta = instance.getMetadata();
return {
type,
name: meta.name || type.toUpperCase(),
description: meta.description || ''
};
});
}
/**
* Get signal function for an indicator type
* @param {string} indicatorType - The type of indicator (e.g., 'ma', 'rsi')
* @returns {Function|null} The signal calculation function or null if not found
*/
export function getSignalFunction(indicatorType) {
return SignalFunctionRegistry[indicatorType] || null;
}
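The registry above dispatches from an indicator type string to its signal function. A self-contained sketch of that dispatch pattern (the registry entry here is a toy rule, not the real `calculateRSISignal`):

```javascript
// Stand-in for SignalFunctionRegistry / getSignalFunction above.
// The rsi entry is a toy threshold rule, not the real calculateRSISignal.
const ToySignalRegistry = {
  rsi: (values) => {
    if (values.rsi < 30) return 'buy';
    if (values.rsi > 70) return 'sell';
    return null;
  }
};

function getToySignalFunction(type) {
  return ToySignalRegistry[type] || null; // unknown types resolve to null, as in getSignalFunction
}

const fn = getToySignalFunction('rsi');
console.log(fn({ rsi: 25 }));              // 'buy'
console.log(getToySignalFunction('macd')); // null (not registered in the toy registry)
```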

View File

@ -0,0 +1,153 @@
// Self-contained MACD indicator
// Includes math, metadata, signal calculation, and base class
// Signal constants (defined in each indicator file)
const SIGNAL_TYPES = {
BUY: 'buy',
SELL: 'sell',
HOLD: 'hold'
};
const SIGNAL_COLORS = {
buy: '#26a69a',
hold: '#787b86',
sell: '#ef5350'
};
// Base class (inline replacement for BaseIndicator)
class BaseIndicator {
constructor(config) {
this.id = config.id;
this.type = config.type;
this.name = config.name;
this.params = config.params || {};
this.timeframe = config.timeframe || '1m';
this.series = [];
this.visible = config.visible !== false;
this.cachedResults = null;
this.cachedMeta = null;
this.lastSignalTimestamp = null;
this.lastSignalType = null;
}
}
// EMA calculation inline (needed for MACD)
function calculateEMAInline(data, period) {
const multiplier = 2 / (period + 1);
const ema = [];
let seedSum = 0;
for (let i = 0; i < data.length; i++) {
if (i < period - 1) {
seedSum += data[i];
ema.push(null);
} else if (i === period - 1) {
seedSum += data[i];
// Seed with the SMA of the first `period` values (standard MACD convention,
// matching calculateEMA in moving_average.js)
ema.push(seedSum / period);
} else {
ema.push((data[i] - ema[i - 1]) * multiplier + ema[i - 1]);
}
}
return ema;
}
// Signal calculation for MACD
function calculateMACDSignal(indicator, lastCandle, prevCandle, values, prevValues) {
const macd = values?.macd;
const signal = values?.signal;
const prevMacd = prevValues?.macd;
const prevSignal = prevValues?.signal;
if (macd === undefined || macd === null || signal === undefined || signal === null ||
prevMacd === undefined || prevMacd === null || prevSignal === undefined || prevSignal === null) {
return null;
}
// BUY: MACD crosses UP through Signal line
if (prevMacd <= prevSignal && macd > signal) {
return {
type: SIGNAL_TYPES.BUY,
strength: 80,
value: macd,
reasoning: `MACD crossed UP through Signal line`
};
}
// SELL: MACD crosses DOWN through Signal line
else if (prevMacd >= prevSignal && macd < signal) {
return {
type: SIGNAL_TYPES.SELL,
strength: 80,
value: macd,
reasoning: `MACD crossed DOWN through Signal line`
};
}
return null;
}
// MACD Indicator class
export class MACDIndicator extends BaseIndicator {
constructor(config) {
super(config);
this.lastSignalTimestamp = null;
this.lastSignalType = null;
}
calculate(candles) {
const fast = this.params.fast || 12;
const slow = this.params.slow || 26;
const signalPeriod = this.params.signal || 9;
const closes = candles.map(c => c.close);
// Use inline EMA calculation instead of MA.ema()
const fastEMA = calculateEMAInline(closes, fast);
const slowEMA = calculateEMAInline(closes, slow);
const macdLine = fastEMA.map((f, i) => (f !== null && slowEMA[i] !== null) ? f - slowEMA[i] : null);
let sum = 0;
let ema = 0;
let count = 0;
const signalLine = macdLine.map(m => {
if (m === null) return null;
count++;
if (count < signalPeriod) {
sum += m;
return null;
} else if (count === signalPeriod) {
sum += m;
ema = sum / signalPeriod;
return ema;
} else {
ema = (m - ema) * (2 / (signalPeriod + 1)) + ema;
return ema;
}
});
return macdLine.map((m, i) => ({
macd: m,
signal: signalLine[i],
histogram: (m !== null && signalLine[i] !== null) ? m - signalLine[i] : null
}));
}
getMetadata() {
return {
name: 'MACD',
description: 'Moving Average Convergence Divergence - trend & momentum',
inputs: [
{ name: 'fast', label: 'Fast Period', type: 'number', default: 12 },
{ name: 'slow', label: 'Slow Period', type: 'number', default: 26 },
{ name: 'signal', label: 'Signal Period', type: 'number', default: 9 }
],
plots: [
{ id: 'macd', color: '#2196f3', title: 'MACD' },
{ id: 'signal', color: '#ff5722', title: 'Signal' },
{ id: 'histogram', color: '#607d8b', title: 'Histogram', type: 'histogram' }
],
displayMode: 'pane'
};
}
}
export { calculateMACDSignal };
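The crossover rule in `calculateMACDSignal` reduces to a two-point comparison between the current and previous bar. A standalone sketch of just that predicate (helper name is illustrative):

```javascript
// Standalone crossover predicate mirroring calculateMACDSignal's core rule:
// BUY when MACD moves from at-or-below the Signal line to above it; SELL for the mirror case.
function crossoverSignal(prevMacd, prevSignal, macd, signal) {
  if ([prevMacd, prevSignal, macd, signal].some(v => v === null || v === undefined)) return null;
  if (prevMacd <= prevSignal && macd > signal) return 'buy';
  if (prevMacd >= prevSignal && macd < signal) return 'sell';
  return null;
}

console.log(crossoverSignal(-0.5, -0.2, 0.1, 0.0)); // 'buy'  (crossed up)
console.log(crossoverSignal(0.4, 0.1, -0.1, 0.2));  // 'sell' (crossed down)
console.log(crossoverSignal(0.4, 0.1, 0.5, 0.2));   // null   (still above, no cross)
```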

View File

@ -0,0 +1,221 @@
// Self-contained Moving Average indicator with SMA/EMA/RMA/WMA/VWMA support
// Includes math, metadata, signal calculation, and base class
// Signal constants (defined in each indicator file)
const SIGNAL_TYPES = {
BUY: 'buy',
SELL: 'sell',
HOLD: 'hold'
};
const SIGNAL_COLORS = {
buy: '#26a69a',
hold: '#787b86',
sell: '#ef5350'
};
// Base class (inline replacement for BaseIndicator)
class BaseIndicator {
constructor(config) {
this.id = config.id;
this.type = config.type;
this.name = config.name;
this.params = config.params || {};
this.timeframe = config.timeframe || '1m';
this.series = [];
this.visible = config.visible !== false;
this.cachedResults = null;
this.cachedMeta = null;
this.lastSignalTimestamp = null;
this.lastSignalType = null;
}
}
// Moving Average math (SMA/EMA/RMA/WMA/VWMA)
function calculateSMA(candles, period, source = 'close') {
const results = new Array(candles.length).fill(null);
let sum = 0;
for (let i = 0; i < candles.length; i++) {
sum += candles[i][source];
if (i >= period) sum -= candles[i - period][source];
if (i >= period - 1) results[i] = sum / period;
}
return results;
}
function calculateEMA(candles, period, source = 'close') {
const multiplier = 2 / (period + 1);
const results = new Array(candles.length).fill(null);
let ema = 0;
let sum = 0;
for (let i = 0; i < candles.length; i++) {
if (i < period) {
sum += candles[i][source];
if (i === period - 1) {
ema = sum / period;
results[i] = ema;
}
} else {
ema = (candles[i][source] - ema) * multiplier + ema;
results[i] = ema;
}
}
return results;
}
function calculateRMA(candles, period, source = 'close') {
const multiplier = 1 / period;
const results = new Array(candles.length).fill(null);
let rma = 0;
let sum = 0;
for (let i = 0; i < candles.length; i++) {
if (i < period) {
sum += candles[i][source];
if (i === period - 1) {
rma = sum / period;
results[i] = rma;
}
} else {
rma = (candles[i][source] - rma) * multiplier + rma;
results[i] = rma;
}
}
return results;
}
function calculateWMA(candles, period, source = 'close') {
const results = new Array(candles.length).fill(null);
const weightSum = (period * (period + 1)) / 2;
for (let i = period - 1; i < candles.length; i++) {
let sum = 0;
for (let j = 0; j < period; j++) {
sum += candles[i - j][source] * (period - j);
}
results[i] = sum / weightSum;
}
return results;
}
function calculateVWMA(candles, period, source = 'close') {
const results = new Array(candles.length).fill(null);
for (let i = period - 1; i < candles.length; i++) {
let sumPV = 0;
let sumV = 0;
for (let j = 0; j < period; j++) {
sumPV += candles[i - j][source] * candles[i - j].volume;
sumV += candles[i - j].volume;
}
results[i] = sumV !== 0 ? sumPV / sumV : null;
}
return results;
}
// Signal calculation for Moving Average
function calculateMASignal(indicator, lastCandle, prevCandle, values, prevValues) {
const close = lastCandle.close;
const prevClose = prevCandle?.close;
const ma = values?.ma;
const prevMa = prevValues?.ma;
if (ma === undefined || ma === null || Number.isNaN(ma)) return null;
if (prevClose === undefined || prevMa === undefined || prevMa === null) return null;
// BUY: Price crosses UP through MA
if (prevClose <= prevMa && close > ma) {
return {
type: SIGNAL_TYPES.BUY,
strength: 80,
value: close,
reasoning: `Price crossed UP through MA`
};
}
// SELL: Price crosses DOWN through MA
else if (prevClose >= prevMa && close < ma) {
return {
type: SIGNAL_TYPES.SELL,
strength: 80,
value: close,
reasoning: `Price crossed DOWN through MA`
};
}
return null;
}
// MA Indicator class
export class MAIndicator extends BaseIndicator {
constructor(config) {
super(config);
this.lastSignalTimestamp = null;
this.lastSignalType = null;
}
calculate(candles) {
const maType = (this.params.maType || 'SMA').toLowerCase();
const period = this.params.period || 44;
let maValues;
switch (maType) {
case 'sma':
maValues = calculateSMA(candles, period, this.params.source || 'close');
break;
case 'ema':
maValues = calculateEMA(candles, period, this.params.source || 'close');
break;
case 'rma':
maValues = calculateRMA(candles, period, this.params.source || 'close');
break;
case 'wma':
maValues = calculateWMA(candles, period, this.params.source || 'close');
break;
case 'vwma':
maValues = calculateVWMA(candles, period, this.params.source || 'close');
break;
default:
maValues = calculateSMA(candles, period, this.params.source || 'close');
}
return maValues.map(ma => ({ ma }));
}
getMetadata() {
return {
name: 'MA',
description: 'Moving Average (SMA/EMA/RMA/WMA/VWMA)',
inputs: [
{
name: 'period',
label: 'Period',
type: 'number',
default: 44,
min: 1,
max: 500
},
{
name: 'maType',
label: 'MA Type',
type: 'select',
options: ['SMA', 'EMA', 'RMA', 'WMA', 'VWMA'],
default: 'SMA'
}
],
plots: [
{
id: 'ma',
color: '#2962ff',
title: 'MA',
style: 'solid',
width: 1
}
],
displayMode: 'overlay'
};
}
}
// Export signal function for external use
export { calculateMASignal };
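`calculateSMA` above maintains a rolling sum instead of re-summing the whole window each bar, which keeps it O(n). A tiny worked check of that technique on synthetic candles:

```javascript
// Rolling-sum SMA, same technique as calculateSMA above, on synthetic closes.
function rollingSMA(candles, period, source = 'close') {
  const out = new Array(candles.length).fill(null);
  let sum = 0;
  for (let i = 0; i < candles.length; i++) {
    sum += candles[i][source];                       // add the newest value
    if (i >= period) sum -= candles[i - period][source]; // drop the value leaving the window
    if (i >= period - 1) out[i] = sum / period;
  }
  return out;
}

const candles = [10, 11, 12, 13, 14].map(close => ({ close }));
console.log(rollingSMA(candles, 3)); // [null, null, 11, 12, 13]
```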

View File

@ -0,0 +1,141 @@
// Self-contained RSI indicator
// Includes math, metadata, signal calculation, and base class
// Signal constants (defined in each indicator file)
const SIGNAL_TYPES = {
BUY: 'buy',
SELL: 'sell',
HOLD: 'hold'
};
const SIGNAL_COLORS = {
buy: '#26a69a',
hold: '#787b86',
sell: '#ef5350'
};
// Base class (inline replacement for BaseIndicator)
class BaseIndicator {
constructor(config) {
this.id = config.id;
this.type = config.type;
this.name = config.name;
this.params = config.params || {};
this.timeframe = config.timeframe || '1m';
this.series = [];
this.visible = config.visible !== false;
this.cachedResults = null;
this.cachedMeta = null;
this.lastSignalTimestamp = null;
this.lastSignalType = null;
}
}
// Signal calculation for RSI
function calculateRSISignal(indicator, lastCandle, prevCandle, values, prevValues) {
const rsi = values?.rsi;
const prevRsi = prevValues?.rsi;
const overbought = indicator.params?.overbought || 70;
const oversold = indicator.params?.oversold || 30;
if (rsi === undefined || rsi === null || prevRsi === undefined || prevRsi === null) {
return null;
}
// BUY when RSI crosses UP through oversold level
if (prevRsi < oversold && rsi >= oversold) {
return {
type: SIGNAL_TYPES.BUY,
strength: 75,
value: rsi,
reasoning: `RSI crossed UP through oversold level (${oversold})`
};
}
// SELL when RSI crosses DOWN through overbought level
else if (prevRsi > overbought && rsi <= overbought) {
return {
type: SIGNAL_TYPES.SELL,
strength: 75,
value: rsi,
reasoning: `RSI crossed DOWN through overbought level (${overbought})`
};
}
return null;
}
// RSI Indicator class
export class RSIIndicator extends BaseIndicator {
constructor(config) {
super(config);
this.lastSignalTimestamp = null;
this.lastSignalType = null;
}
calculate(candles) {
const period = this.params.period || 14;
const overbought = this.params.overbought || 70;
const oversold = this.params.oversold || 30;
// 1. Calculate RSI using RMA (Wilder's Smoothing)
let rsiValues = new Array(candles.length).fill(null);
let upSum = 0;
let downSum = 0;
const rmaAlpha = 1 / period;
for (let i = 1; i < candles.length; i++) {
const diff = candles[i].close - candles[i-1].close;
const up = diff > 0 ? diff : 0;
const down = diff < 0 ? -diff : 0;
if (i < period) {
upSum += up;
downSum += down;
} else if (i === period) {
upSum += up;
downSum += down;
const avgUp = upSum / period;
const avgDown = downSum / period;
rsiValues[i] = avgDown === 0 ? 100 : (avgUp === 0 ? 0 : 100 - (100 / (1 + avgUp / avgDown)));
upSum = avgUp;
downSum = avgDown;
} else {
upSum = (up - upSum) * rmaAlpha + upSum;
downSum = (down - downSum) * rmaAlpha + downSum;
rsiValues[i] = downSum === 0 ? 100 : (upSum === 0 ? 0 : 100 - (100 / (1 + upSum / downSum)));
}
}
// Combine results
return rsiValues.map(rsi => ({
paneBg: 80,
rsi,
overboughtBand: overbought,
oversoldBand: oversold
}));
}
getMetadata() {
return {
name: 'RSI',
description: 'Relative Strength Index',
inputs: [
{ name: 'period', label: 'RSI Length', type: 'number', default: 14, min: 1, max: 100 },
{ name: 'overbought', label: 'Overbought Level', type: 'number', default: 70, min: 50, max: 95 },
{ name: 'oversold', label: 'Oversold Level', type: 'number', default: 30, min: 5, max: 50 }
],
plots: [
{ id: 'rsi', color: '#7E57C2', title: '', style: 'solid', width: 1, lastValueVisible: true },
{ id: 'overboughtBand', color: '#787B86', title: '', style: 'dashed', width: 1, lastValueVisible: false },
{ id: 'oversoldBand', color: '#787B86', title: '', style: 'dashed', width: 1, lastValueVisible: false }
],
displayMode: 'pane',
paneMin: 0,
paneMax: 100
};
}
}
export { calculateRSISignal };
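The RSI loop smooths gains and losses with Wilder's RMA, i.e. `avg = avg + (x - avg) / period`, seeded with a simple average. A compact standalone version of just that smoothing (function name is illustrative):

```javascript
// Wilder smoothing (RMA) of a series, the same recurrence used for upSum/downSum above.
function wilderRMA(values, period) {
  const out = new Array(values.length).fill(null);
  let avg = 0;
  for (let i = 0; i < values.length; i++) {
    if (i < period - 1) {
      avg += values[i];                      // accumulate for the seed
    } else if (i === period - 1) {
      avg = (avg + values[i]) / period;      // seed with the simple average of the first `period` values
      out[i] = avg;
    } else {
      avg = avg + (values[i] - avg) / period; // Wilder recurrence, alpha = 1/period
      out[i] = avg;
    }
  }
  return out;
}

console.log(wilderRMA([1, 2, 3, 6], 3)); // [null, null, 2, 3.333...]
```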

View File

@ -0,0 +1,139 @@
// Self-contained Stochastic Oscillator indicator
// Includes math, metadata, signal calculation, and base class
// Signal constants (defined in each indicator file)
const SIGNAL_TYPES = {
BUY: 'buy',
SELL: 'sell',
HOLD: 'hold'
};
const SIGNAL_COLORS = {
buy: '#26a69a',
hold: '#787b86',
sell: '#ef5350'
};
// Base class (inline replacement for BaseIndicator)
class BaseIndicator {
constructor(config) {
this.id = config.id;
this.type = config.type;
this.name = config.name;
this.params = config.params || {};
this.timeframe = config.timeframe || '1m';
this.series = [];
this.visible = config.visible !== false;
this.cachedResults = null;
this.cachedMeta = null;
this.lastSignalTimestamp = null;
this.lastSignalType = null;
}
}
// Signal calculation for Stochastic
function calculateStochSignal(indicator, lastCandle, prevCandle, values, prevValues) {
const k = values?.k;
const d = values?.d;
const prevK = prevValues?.k;
const prevD = prevValues?.d;
const overbought = indicator.params?.overbought || 80;
const oversold = indicator.params?.oversold || 20;
if (k === undefined || d === undefined || prevK === undefined || prevD === undefined) {
return null;
}
// BUY: %K crosses UP through %D while %K is in the oversold zone
if (prevK <= prevD && k > d && k < oversold) {
return {
type: SIGNAL_TYPES.BUY,
strength: 80,
value: k,
reasoning: `Stochastic %K crossed UP through %D in oversold zone`
};
}
// SELL: %K crosses DOWN through %D while %K is in the overbought zone
else if (prevK >= prevD && k < d && k > overbought) {
return {
type: SIGNAL_TYPES.SELL,
strength: 80,
value: k,
reasoning: `Stochastic %K crossed DOWN through %D in overbought zone`
};
}
return null;
}
// Stochastic Oscillator Indicator class
export class StochasticIndicator extends BaseIndicator {
constructor(config) {
super(config);
this.lastSignalTimestamp = null;
this.lastSignalType = null;
}
calculate(candles) {
const kPeriod = this.params.kPeriod || 14;
const dPeriod = this.params.dPeriod || 3;
const results = new Array(candles.length).fill(null);
const kValues = new Array(candles.length).fill(null);
for (let i = kPeriod - 1; i < candles.length; i++) {
let lowest = Infinity;
let highest = -Infinity;
for (let j = 0; j < kPeriod; j++) {
lowest = Math.min(lowest, candles[i-j].low);
highest = Math.max(highest, candles[i-j].high);
}
const diff = highest - lowest;
kValues[i] = diff === 0 ? 50 : ((candles[i].close - lowest) / diff) * 100;
}
for (let i = kPeriod + dPeriod - 2; i < candles.length; i++) {
let sum = 0;
for (let j = 0; j < dPeriod; j++) sum += kValues[i-j];
results[i] = { k: kValues[i], d: sum / dPeriod };
}
return results;
}
getMetadata() {
return {
name: 'Stochastic',
description: 'Stochastic Oscillator - compares close to high-low range',
inputs: [
{
name: 'kPeriod',
label: '%K Period',
type: 'number',
default: 14,
min: 1,
max: 100,
description: 'Lookback period for %K calculation'
},
{
name: 'dPeriod',
label: '%D Period',
type: 'number',
default: 3,
min: 1,
max: 20,
description: 'Smoothing period for %D (SMA of %K)'
}
],
plots: [
{ id: 'k', color: '#3f51b5', title: '%K', style: 'solid', width: 1 },
{ id: 'd', color: '#ff9800', title: '%D', style: 'solid', width: 1 }
],
displayMode: 'pane',
paneMin: 0,
paneMax: 100
};
}
}
export { calculateStochSignal };
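%K is simply the close's position within the lookback high-low range, scaled to 0-100, with a 50 fallback for a flat range. A standalone check of that formula (helper name is illustrative):

```javascript
// %K = 100 * (close - lowestLow) / (highestHigh - lowestLow) over the lookback window.
function stochasticK(candles, i, kPeriod) {
  let lowest = Infinity, highest = -Infinity;
  for (let j = 0; j < kPeriod; j++) {
    lowest = Math.min(lowest, candles[i - j].low);
    highest = Math.max(highest, candles[i - j].high);
  }
  const range = highest - lowest;
  return range === 0 ? 50 : ((candles[i].close - lowest) / range) * 100; // 50 when the range is flat
}

const candles = [
  { high: 12, low: 8, close: 10 },
  { high: 14, low: 9, close: 13 },
  { high: 15, low: 10, close: 14 }
];
console.log(stochasticK(candles, 2, 3)); // ((14 - 8) / (15 - 8)) * 100 ≈ 85.71
```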

View File

@ -0,0 +1,9 @@
export const StrategyRegistry = {};
export function registerStrategy(name, strategyModule) {
StrategyRegistry[name] = strategyModule;
}
export function getStrategy(name) {
return StrategyRegistry[name];
}
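A minimal usage sketch of the strategy registry above (the registered strategy here is a placeholder, not the real `PingPongStrategy`):

```javascript
// Stand-in for StrategyRegistry / registerStrategy / getStrategy above.
const StrategyRegistry = {};
function registerStrategy(name, strategyModule) {
  StrategyRegistry[name] = strategyModule;
}
function getStrategy(name) {
  return StrategyRegistry[name];
}

// Register a toy strategy module and dispatch to it by name.
registerStrategy('noop', { id: 'noop', runSimulation: () => 'done' });
console.log(getStrategy('noop').runSimulation()); // 'done'
console.log(getStrategy('missing'));              // undefined
```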

View File

@ -0,0 +1,612 @@
import { getSignalFunction } from '../indicators/index.js';
const STORAGE_KEY = 'ping_pong_settings';
function getSavedSettings() {
const saved = localStorage.getItem(STORAGE_KEY);
if (!saved) return null;
try {
return JSON.parse(saved);
} catch (e) {
return null;
}
}
export const PingPongStrategy = {
id: 'ping_pong',
name: 'Ping-Pong',
saveSettings: function() {
const settings = {
startDate: document.getElementById('simStartDate').value,
stopDate: document.getElementById('simStopDate').value,
contractType: document.getElementById('simContractType').value,
direction: document.getElementById('simDirection').value,
autoDirection: document.getElementById('simAutoDirection').checked,
capital: document.getElementById('simCapital').value,
exchangeLeverage: document.getElementById('simExchangeLeverage').value,
maxEffectiveLeverage: document.getElementById('simMaxEffectiveLeverage').value,
posSize: document.getElementById('simPosSize').value,
tp: document.getElementById('simTP').value
};
localStorage.setItem(STORAGE_KEY, JSON.stringify(settings));
const btn = document.getElementById('saveSimSettings');
if (btn) {
const originalText = btn.textContent;
btn.textContent = 'Saved!';
btn.style.color = '#26a69a';
setTimeout(() => {
btn.textContent = originalText;
btn.style.color = '';
}, 2000);
}
},
renderUI: function(activeIndicators, formatDisplayDate) {
const saved = getSavedSettings();
// Format initial values for display
let startDisplay = saved?.startDate || '01/01/2026 00:00';
let stopDisplay = saved?.stopDate || '';
if (startDisplay.includes('T')) {
startDisplay = formatDisplayDate(new Date(startDisplay));
}
if (stopDisplay.includes('T')) {
stopDisplay = formatDisplayDate(new Date(stopDisplay));
}
const renderIndicatorChecklist = (prefix) => {
if (activeIndicators.length === 0) {
return '<div style="padding: 8px; color: var(--tv-text-secondary); font-size: 11px;">No active indicators on chart</div>';
}
return activeIndicators.map(ind => `
<label class="checklist-item">
<input type="checkbox" data-id="${ind.id}" class="sim-${prefix}-check">
<span>${ind.name}</span>
</label>
`).join('');
};
const autoDirChecked = saved?.autoDirection === true;
const disableManualStr = autoDirChecked ? 'disabled' : '';
return `
<div class="sim-input-group">
<label>Start Date & Time</label>
<input type="text" id="simStartDate" class="sim-input" value="${startDisplay}" placeholder="DD/MM/YYYY HH:MM">
</div>
<div class="sim-input-group">
<label>Stop Date & Time (Optional)</label>
<input type="text" id="simStopDate" class="sim-input" value="${stopDisplay}" placeholder="DD/MM/YYYY HH:MM">
</div>
<div class="sim-input-group" style="background: rgba(38, 166, 154, 0.1); padding: 8px; border-radius: 4px; border: 1px solid rgba(38, 166, 154, 0.2);">
<label class="checklist-item" style="margin-bottom: 0;">
<input type="checkbox" id="simAutoDirection" ${autoDirChecked ? 'checked' : ''}>
<span style="color: #26a69a; font-weight: bold;">Auto-Detect Direction (1D MA44)</span>
</label>
<div style="font-size: 10px; color: var(--tv-text-secondary); margin-left: 24px; margin-top: 4px;">
Price > MA44: LONG (Inverse/BTC Margin)<br>
Price < MA44: SHORT (Linear/USDT Margin)
</div>
</div>
<div class="sim-input-group">
<label>Contract Type (Manual)</label>
<select id="simContractType" class="sim-input" ${disableManualStr}>
<option value="linear" ${saved?.contractType === 'linear' ? 'selected' : ''}>Linear (USDT-Margined)</option>
<option value="inverse" ${saved?.contractType === 'inverse' ? 'selected' : ''}>Inverse (Coin-Margined)</option>
</select>
</div>
<div class="sim-input-group">
<label>Direction (Manual)</label>
<select id="simDirection" class="sim-input" ${disableManualStr}>
<option value="long" ${saved?.direction === 'long' ? 'selected' : ''}>Long</option>
<option value="short" ${saved?.direction === 'short' ? 'selected' : ''}>Short</option>
</select>
</div>
<div class="sim-input-group">
<label>Initial Capital ($)</label>
<input type="number" id="simCapital" class="sim-input" value="${saved?.capital || '10000'}" min="1">
</div>
<div class="sim-input-group">
<label>Exchange Leverage (Ping Size Multiplier)</label>
<input type="number" id="simExchangeLeverage" class="sim-input" value="${saved?.exchangeLeverage || '1'}" min="1" max="100">
</div>
<div class="sim-input-group">
<label>Max Effective Leverage (Total Account Cap)</label>
<input type="number" id="simMaxEffectiveLeverage" class="sim-input" value="${saved?.maxEffectiveLeverage || '5'}" min="1" max="100">
</div>
<div class="sim-input-group">
<label>Position Size ($ Margin per Ping)</label>
<input type="number" id="simPosSize" class="sim-input" value="${saved?.posSize || '10'}" min="1">
</div>
<div class="sim-input-group">
<label>Take Profit (%)</label>
<input type="number" id="simTP" class="sim-input" value="${saved?.tp || '15'}" step="0.1" min="0.1">
</div>
<div class="sim-input-group">
<div style="display: flex; justify-content: space-between; align-items: center; margin-bottom: 4px;">
<label style="margin-bottom: 0;">Open Signal Indicators</label>
<button class="action-btn-text" id="saveSimSettings" style="font-size: 10px; color: #00bcd4; background: none; border: none; cursor: pointer; padding: 0;">Save Defaults</button>
</div>
<div class="indicator-checklist" id="openSignalsList">
${renderIndicatorChecklist('open')}
</div>
</div>
<div class="sim-input-group">
<label>Close Signal Indicators (Empty = Accumulation)</label>
<div class="indicator-checklist" id="closeSignalsList">
${renderIndicatorChecklist('close')}
</div>
</div>
`;
},
attachListeners: function() {
const autoCheck = document.getElementById('simAutoDirection');
const contractSelect = document.getElementById('simContractType');
const dirSelect = document.getElementById('simDirection');
if (autoCheck) {
autoCheck.addEventListener('change', (e) => {
const isAuto = e.target.checked;
contractSelect.disabled = isAuto;
dirSelect.disabled = isAuto;
});
}
const saveBtn = document.getElementById('saveSimSettings');
if (saveBtn) saveBtn.addEventListener('click', this.saveSettings.bind(this));
},
runSimulation: async function(activeIndicators, displayResultsCallback) {
const btn = document.getElementById('runSimulationBtn');
btn.disabled = true;
btn.textContent = 'Preparing Data...';
try {
const startVal = document.getElementById('simStartDate').value;
const stopVal = document.getElementById('simStopDate').value;
// Parse "DD/MM/YYYY HH:MM" explicitly; new Date() would read it as MM/DD/YYYY (US order)
const parseDisplayDate = (str) => {
const m = str.trim().match(/^(\d{1,2})\/(\d{1,2})\/(\d{4})\s+(\d{1,2}):(\d{2})$/);
return m ? new Date(+m[3], +m[2] - 1, +m[1], +m[4], +m[5]) : new Date(str);
};
const config = {
startDate: parseDisplayDate(startVal).getTime() / 1000,
stopDate: stopVal ? parseDisplayDate(stopVal).getTime() / 1000 : Math.floor(Date.now() / 1000),
autoDirection: document.getElementById('simAutoDirection').checked,
contractType: document.getElementById('simContractType').value,
direction: document.getElementById('simDirection').value,
capital: parseFloat(document.getElementById('simCapital').value),
exchangeLeverage: parseFloat(document.getElementById('simExchangeLeverage').value),
maxEffectiveLeverage: parseFloat(document.getElementById('simMaxEffectiveLeverage').value),
posSize: parseFloat(document.getElementById('simPosSize').value),
tp: parseFloat(document.getElementById('simTP').value) / 100,
openIndicators: Array.from(document.querySelectorAll('.sim-open-check:checked')).map(el => el.dataset.id),
closeIndicators: Array.from(document.querySelectorAll('.sim-close-check:checked')).map(el => el.dataset.id)
};
if (config.openIndicators.length === 0) {
alert('Please choose at least one indicator for opening positions.');
return;
}
const interval = window.dashboard?.currentInterval || '1d';
// 1. Ensure data is loaded for the range
let allCandles = window.dashboard?.allData?.get(interval) || [];
const earliestInCache = allCandles.length > 0 ? allCandles[0].time : Infinity;
const latestInCache = allCandles.length > 0 ? allCandles[allCandles.length - 1].time : -Infinity;
if (config.startDate < earliestInCache || config.stopDate > latestInCache) {
btn.textContent = 'Fetching from Server...';
let currentEndISO = new Date(config.stopDate * 1000).toISOString();
const startISO = new Date(config.startDate * 1000).toISOString();
let keepFetching = true;
let newCandlesAdded = false;
while (keepFetching) {
const response = await fetch(`/api/v1/candles?symbol=BTC&interval=${interval}&start=${startISO}&end=${currentEndISO}&limit=10000`);
const data = await response.json();
if (data.candles && data.candles.length > 0) {
const fetchedCandles = data.candles.reverse().map(c => ({
time: Math.floor(new Date(c.time).getTime() / 1000),
open: parseFloat(c.open),
high: parseFloat(c.high),
low: parseFloat(c.low),
close: parseFloat(c.close),
volume: parseFloat(c.volume || 0)
}));
allCandles = window.dashboard.mergeData(allCandles, fetchedCandles);
newCandlesAdded = true;
// If we received 10000 candles, there might be more. We fetch again using the oldest candle's time - 1s
if (data.candles.length === 10000) {
const oldestTime = fetchedCandles[0].time;
if (oldestTime <= config.startDate) {
keepFetching = false;
} else {
currentEndISO = new Date((oldestTime - 1) * 1000).toISOString();
btn.textContent = `Fetching older data... (${new Date(oldestTime * 1000).toLocaleDateString()})`;
}
} else {
keepFetching = false;
}
} else {
keepFetching = false;
}
}
if (newCandlesAdded) {
window.dashboard.allData.set(interval, allCandles);
window.dashboard.candleSeries.setData(allCandles);
btn.textContent = 'Calculating Indicators...';
window.drawIndicatorsOnChart?.();
await new Promise(r => setTimeout(r, 500)); // brief pause so indicator caches can refresh after the redraw
}
}
// --- Auto-Direction: Fetch 1D candles for MA(44) ---
let dailyCandles = [];
let dailyMaMap = new Map(); // timestamp (midnight UTC) -> MA44 value
if (config.autoDirection) {
btn.textContent = 'Fetching 1D MA(44)...';
// Fetch 1D candles starting 45 days BEFORE the simulation start date to warm up the MA
const msPerDay = 24 * 60 * 60 * 1000;
const dailyStartISO = new Date((config.startDate * 1000) - (45 * msPerDay)).toISOString();
const stopISO = new Date(config.stopDate * 1000).toISOString();
const dailyResponse = await fetch(`/api/v1/candles?symbol=BTC&interval=1d&start=${dailyStartISO}&end=${stopISO}&limit=5000`);
const dailyData = await dailyResponse.json();
if (dailyData.candles && dailyData.candles.length > 0) {
dailyCandles = dailyData.candles.reverse().map(c => ({
time: Math.floor(new Date(c.time).getTime() / 1000),
close: parseFloat(c.close)
}));
// Calculate MA(44)
const maPeriod = 44;
for (let i = maPeriod - 1; i < dailyCandles.length; i++) {
let sum = 0;
for (let j = 0; j < maPeriod; j++) {
sum += dailyCandles[i - j].close;
}
const maValue = sum / maPeriod;
// Store the MA value using the midnight UTC timestamp of that day
dailyMaMap.set(dailyCandles[i].time, maValue);
}
} else {
console.warn('[Simulation] Failed to fetch 1D candles for Auto-Direction. Falling back to manual.');
config.autoDirection = false;
}
}
// --------------------------------------------------
btn.textContent = 'Simulating...';
// Filter candles by the exact range
const simCandles = allCandles.filter(c => c.time >= config.startDate && c.time <= config.stopDate);
if (simCandles.length === 0) {
alert('No data available for the selected range.');
return;
}
// Calculate indicator signals
const indicatorSignals = {};
// Index candles by time once to avoid an O(n^2) findIndex inside the per-candle loop
const timeToIdx = new Map(allCandles.map((c, idx) => [c.time, idx]));
for (const indId of [...new Set([...config.openIndicators, ...config.closeIndicators])]) {
const ind = activeIndicators.find(a => a.id === indId);
if (!ind) continue;
const signalFunc = getSignalFunction(ind.type);
const results = ind.cachedResults;
if (results && signalFunc) {
indicatorSignals[indId] = simCandles.map(candle => {
const idx = timeToIdx.get(candle.time);
if (idx === undefined || idx < 1) return null;
const values = typeof results[idx] === 'object' && results[idx] !== null ? results[idx] : { ma: results[idx] };
const prevValues = typeof results[idx-1] === 'object' && results[idx-1] !== null ? results[idx-1] : { ma: results[idx-1] };
return signalFunc(ind, allCandles[idx], allCandles[idx-1], values, prevValues);
});
}
}
// Simulation Initial State
const startPrice = simCandles[0].open;
// We maintain a single "walletBalanceUsd" variable as the source of truth for the account size
let walletBalanceUsd = config.capital;
// At any given time, the active margin type determines how we use this balance
// When LONG (Inverse), we theoretically buy BTC with it.
// When SHORT (Linear), we just use it as USDT.
// Set initial state based on auto or manual
if (config.autoDirection && dailyMaMap.size > 0) {
// Find the most recent MA(44) value at or before the simulation start day
const simStartDayTime = Math.floor(simCandles[0].time / 86400) * 86400; // Midnight UTC
let closestMA = Array.from(dailyMaMap.entries())
.filter(([t]) => t <= simStartDayTime)
.sort((a,b) => b[0] - a[0])[0];
if (closestMA) {
const price = simCandles[0].open;
if (price > closestMA[1]) {
config.direction = 'long';
config.contractType = 'inverse';
} else {
config.direction = 'short';
config.contractType = 'linear';
}
}
}
let equityData = { usd: [], btc: [] };
let totalQty = 0; // Linear: BTC Contracts, Inverse: USD Contracts
let avgPrice = 0;
let avgPriceData = [];
let posSizeData = { btc: [], usd: [] };
let trades = [];
let currentDayStart = Math.floor(simCandles[0].time / 86400) * 86400;
const PARTIAL_EXIT_PCT = 0.15;
const MIN_POSITION_VALUE_USD = 15;
for (let i = 0; i < simCandles.length; i++) {
const candle = simCandles[i];
const price = candle.close;
let actionTakenInThisCandle = false;
// --- Auto-Direction Daily Check (Midnight UTC) ---
if (config.autoDirection) {
const candleDayStart = Math.floor(candle.time / 86400) * 86400;
if (candleDayStart > currentDayStart) {
currentDayStart = candleDayStart;
// It's a new day! Get yesterday's MA(44)
let closestMA = Array.from(dailyMaMap.entries())
.filter(([t]) => t < currentDayStart)
.sort((a,b) => b[0] - a[0])[0];
if (closestMA) {
const maValue = closestMA[1];
let newDirection = config.direction;
let newContractType = config.contractType;
if (candle.open > maValue) {
newDirection = 'long';
newContractType = 'inverse';
} else {
newDirection = 'short';
newContractType = 'linear';
}
// Did the trend flip?
if (newDirection !== config.direction) {
// Force close open position at candle.open (market open)
if (totalQty > 0) {
let pnlUsd = 0;
if (config.contractType === 'linear') {
pnlUsd = config.direction === 'long' ? (candle.open - avgPrice) * totalQty : (avgPrice - candle.open) * totalQty;
walletBalanceUsd += pnlUsd;
} else { // inverse
// PnL in BTC, converted back to USD
const pnlBtc = config.direction === 'long'
? totalQty * (1/avgPrice - 1/candle.open)
: totalQty * (1/candle.open - 1/avgPrice);
// Inverse margin is BTC, so balance was in BTC.
// But we maintain walletBalanceUsd, so we just add the USD value of the PNL
pnlUsd = pnlBtc * candle.open;
walletBalanceUsd += pnlUsd;
}
trades.push({
type: config.direction, recordType: 'exit', time: candle.time,
entryPrice: avgPrice, exitPrice: candle.open, pnl: pnlUsd, reason: 'Force Close (Trend Flip)',
currentUsd: 0, currentQty: 0
});
totalQty = 0;
avgPrice = 0;
}
// Apply flip
config.direction = newDirection;
config.contractType = newContractType;
}
}
}
}
// ------------------------------------------------
// 1. Check TP
if (totalQty > 0) {
let isTP = false;
let exitPrice = price;
if (config.direction === 'long') {
if (candle.high >= avgPrice * (1 + config.tp)) {
isTP = true;
exitPrice = avgPrice * (1 + config.tp);
}
} else {
if (candle.low <= avgPrice * (1 - config.tp)) {
isTP = true;
exitPrice = avgPrice * (1 - config.tp);
}
}
if (isTP) {
let qtyToClose = totalQty * PARTIAL_EXIT_PCT;
let remainingQty = totalQty - qtyToClose;
let remainingValueUsd = config.contractType === 'linear' ? remainingQty * exitPrice : remainingQty;
let reason = 'TP (Partial)';
if (remainingValueUsd < MIN_POSITION_VALUE_USD) {
qtyToClose = totalQty;
reason = 'TP (Full - Min Size)';
}
let pnlUsd;
if (config.contractType === 'linear') {
pnlUsd = config.direction === 'long' ? (exitPrice - avgPrice) * qtyToClose : (avgPrice - exitPrice) * qtyToClose;
walletBalanceUsd += pnlUsd;
} else {
const pnlBtc = config.direction === 'long'
? qtyToClose * (1/avgPrice - 1/exitPrice)
: qtyToClose * (1/exitPrice - 1/avgPrice);
pnlUsd = pnlBtc * exitPrice;
walletBalanceUsd += pnlUsd;
}
totalQty -= qtyToClose;
trades.push({
type: config.direction, recordType: 'exit', time: candle.time,
entryPrice: avgPrice, exitPrice: exitPrice, pnl: pnlUsd, reason: reason,
currentUsd: config.contractType === 'linear' ? totalQty * price : totalQty,
currentQty: config.contractType === 'linear' ? totalQty : totalQty / price
});
actionTakenInThisCandle = true;
}
}
// 2. Check Close Signals
if (!actionTakenInThisCandle && totalQty > 0 && config.closeIndicators.length > 0) {
const hasCloseSignal = config.closeIndicators.some(id => {
const sig = indicatorSignals[id][i];
if (!sig) return false;
return config.direction === 'long' ? sig.type === 'sell' : sig.type === 'buy';
});
if (hasCloseSignal) {
let qtyToClose = totalQty * PARTIAL_EXIT_PCT;
let remainingQty = totalQty - qtyToClose;
let remainingValueUsd = config.contractType === 'linear' ? remainingQty * price : remainingQty;
let reason = 'Signal (Partial)';
if (remainingValueUsd < MIN_POSITION_VALUE_USD) {
qtyToClose = totalQty;
reason = 'Signal (Full - Min Size)';
}
let pnlUsd;
if (config.contractType === 'linear') {
pnlUsd = config.direction === 'long' ? (price - avgPrice) * qtyToClose : (avgPrice - price) * qtyToClose;
walletBalanceUsd += pnlUsd;
} else {
const pnlBtc = config.direction === 'long'
? qtyToClose * (1/avgPrice - 1/price)
: qtyToClose * (1/price - 1/avgPrice);
pnlUsd = pnlBtc * price;
walletBalanceUsd += pnlUsd;
}
totalQty -= qtyToClose;
trades.push({
type: config.direction, recordType: 'exit', time: candle.time,
entryPrice: avgPrice, exitPrice: price, pnl: pnlUsd, reason: reason,
currentUsd: config.contractType === 'linear' ? totalQty * price : totalQty,
currentQty: config.contractType === 'linear' ? totalQty : totalQty / price
});
actionTakenInThisCandle = true;
}
}
// Calculate Current Equity for Margin Check
let currentEquityUsd = walletBalanceUsd;
if (totalQty > 0) {
if (config.contractType === 'linear') {
currentEquityUsd += config.direction === 'long' ? (price - avgPrice) * totalQty : (avgPrice - price) * totalQty;
} else {
const upnlBtc = config.direction === 'long' ? totalQty * (1/avgPrice - 1/price) : totalQty * (1/price - 1/avgPrice);
currentEquityUsd += (upnlBtc * price);
}
}
// 3. Check Open Signals
if (!actionTakenInThisCandle) {
const hasOpenSignal = config.openIndicators.some(id => {
const sig = indicatorSignals[id][i];
if (!sig) return false;
return config.direction === 'long' ? sig.type === 'buy' : sig.type === 'sell';
});
if (hasOpenSignal) {
const entryValUsd = config.posSize * config.exchangeLeverage;
const currentNotionalUsd = config.contractType === 'linear' ? totalQty * price : totalQty;
const projectedEffectiveLeverage = (currentNotionalUsd + entryValUsd) / Math.max(currentEquityUsd, 0.0000001);
if (projectedEffectiveLeverage <= config.maxEffectiveLeverage) {
if (config.contractType === 'linear') {
const entryQty = entryValUsd / price;
avgPrice = ((totalQty * avgPrice) + (entryQty * price)) / (totalQty + entryQty);
totalQty += entryQty;
} else {
// Inverse: totalQty is USD notional; the average entry price is the
// notional-weighted harmonic mean of entry prices. The `|| 0` guards the
// first entry, where totalQty / avgPrice is 0 / 0 (NaN).
avgPrice = (totalQty + entryValUsd) / ((totalQty / avgPrice || 0) + (entryValUsd / price));
totalQty += entryValUsd;
}
trades.push({
type: config.direction, recordType: 'entry', time: candle.time,
entryPrice: price, reason: 'Entry',
currentUsd: config.contractType === 'linear' ? totalQty * price : totalQty,
currentQty: config.contractType === 'linear' ? totalQty : totalQty / price
});
}
}
}
// Final Equity Recording
let finalEquityUsd = walletBalanceUsd;
if (totalQty > 0) {
if (config.contractType === 'linear') {
finalEquityUsd += config.direction === 'long' ? (price - avgPrice) * totalQty : (avgPrice - price) * totalQty;
} else {
const upnlBtc = config.direction === 'long' ? totalQty * (1/avgPrice - 1/price) : totalQty * (1/price - 1/avgPrice);
finalEquityUsd += (upnlBtc * price);
}
}
let finalEquityBtc = finalEquityUsd / price;
equityData.usd.push({ time: candle.time, value: finalEquityUsd });
equityData.btc.push({ time: candle.time, value: finalEquityBtc });
if (totalQty > 0.000001) {
avgPriceData.push({
time: candle.time,
value: avgPrice,
color: config.direction === 'long' ? '#26a69a' : '#ef5350' // Green for long, Red for short
});
}
posSizeData.btc.push({ time: candle.time, value: config.contractType === 'linear' ? totalQty : totalQty / price });
posSizeData.usd.push({ time: candle.time, value: config.contractType === 'linear' ? totalQty * price : totalQty });
}
displayResultsCallback(trades, equityData, config, simCandles[simCandles.length-1].close, avgPriceData, posSizeData);
} catch (error) {
console.error('[Simulation] Error:', error);
alert('Simulation failed.');
} finally {
btn.disabled = false;
btn.textContent = 'Run Simulation';
}
}
};
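The simulator above repeats the linear/inverse PnL math in every exit branch. The two formulas are small enough to state and test in isolation; a minimal sketch, where `linearPnlUsd` and `inversePnlUsd` are illustrative names, not part of the codebase:

```javascript
// Linear contracts: quantity is in BTC, PnL is directly in USD.
function linearPnlUsd(direction, entry, exit, qtyBtc) {
  return direction === 'long'
    ? (exit - entry) * qtyBtc
    : (entry - exit) * qtyBtc;
}

// Inverse contracts: quantity is USD notional; PnL accrues in BTC and is
// converted to USD at the exit price, as the simulator above does.
function inversePnlUsd(direction, entry, exit, qtyUsd) {
  const pnlBtc = direction === 'long'
    ? qtyUsd * (1 / entry - 1 / exit)
    : qtyUsd * (1 / exit - 1 / entry);
  return pnlBtc * exit;
}

console.log(linearPnlUsd('long', 100000, 110000, 0.5));    // 5000 USD
console.log(inversePnlUsd('long', 100000, 110000, 10000)); // ~1000 USD
```

Note the asymmetry this captures: for an inverse long, `pnlBtc * exit` simplifies to `qtyUsd * (1 - entry / exit)` for a short and `qtyUsd * (exit / entry - 1) * entry / exit` for a long, so USD gains per price tick shrink as price rises.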

File diff suppressed because it is too large.

@@ -0,0 +1,243 @@
const HTS_COLORS = {
fastHigh: '#00bcd4',
fastLow: '#00bcd4',
slowHigh: '#f44336',
slowLow: '#f44336',
bullishZone: 'rgba(38, 166, 154, 0.1)',
bearishZone: 'rgba(239, 83, 80, 0.1)',
channelRegion: 'rgba(41, 98, 255, 0.05)'
};
export class HTSVisualizer {
constructor(chart, candles) {
this.chart = chart;
this.candles = candles;
this.overlays = [];
}
clear() {
this.overlays.forEach(overlay => {
try {
this.chart.removeSeries(overlay.series);
} catch (e) { }
});
this.overlays = [];
}
addHTSChannels(htsData, isAutoHTS = false) {
this.clear();
if (!htsData || htsData.length === 0) return;
const alpha = 0.3; // channel opacity (same for manual and auto HTS)
const lineWidth = isAutoHTS ? 1 : 2;
const fastHighSeries = this.chart.addSeries(LightweightCharts.LineSeries, {
color: `rgba(0, 188, 212, ${alpha})`,
lineWidth: lineWidth,
lastValueVisible: false,
title: 'HTS Fast High' + (isAutoHTS ? ' (Auto)' : ''),
priceLineVisible: false,
crosshairMarkerVisible: false
});
const fastLowSeries = this.chart.addSeries(LightweightCharts.LineSeries, {
color: `rgba(0, 188, 212, ${alpha})`,
lineWidth: lineWidth,
lastValueVisible: false,
title: 'HTS Fast Low' + (isAutoHTS ? ' (Auto)' : ''),
priceLineVisible: false,
crosshairMarkerVisible: false
});
const slowHighSeries = this.chart.addSeries(LightweightCharts.LineSeries, {
color: `rgba(244, 67, 54, ${alpha})`,
lineWidth: lineWidth + 1,
lastValueVisible: false,
title: 'HTS Slow High' + (isAutoHTS ? ' (Auto)' : ''),
priceLineVisible: false,
crosshairMarkerVisible: false
});
const slowLowSeries = this.chart.addSeries(LightweightCharts.LineSeries, {
color: `rgba(244, 67, 54, ${alpha})`,
lineWidth: lineWidth + 1,
lastValueVisible: false,
title: 'HTS Slow Low' + (isAutoHTS ? ' (Auto)' : ''),
priceLineVisible: false,
crosshairMarkerVisible: false
});
const fastHighData = htsData.map(h => ({ time: h.time, value: h.fastHigh }));
const fastLowData = htsData.map(h => ({ time: h.time, value: h.fastLow }));
const slowHighData = htsData.map(h => ({ time: h.time, value: h.slowHigh }));
const slowLowData = htsData.map(h => ({ time: h.time, value: h.slowLow }));
fastHighSeries.setData(fastHighData);
fastLowSeries.setData(fastLowData);
slowHighSeries.setData(slowHighData);
slowLowSeries.setData(slowLowData);
this.overlays.push(
{ series: fastHighSeries, name: 'fastHigh' },
{ series: fastLowSeries, name: 'fastLow' },
{ series: slowHighSeries, name: 'slowHigh' },
{ series: slowLowSeries, name: 'slowLow' }
);
return {
fastHigh: fastHighSeries,
fastLow: fastLowSeries,
slowHigh: slowHighSeries,
slowLow: slowLowSeries
};
}
addTrendZones(htsData) {
if (!htsData || htsData.length < 2) return;
const trendZones = [];
let currentZone = null;
for (let i = 1; i < htsData.length; i++) {
const prev = htsData[i - 1];
const curr = htsData[i];
const prevBullish = prev.fastLow > prev.slowLow && prev.fastHigh > prev.slowHigh;
const currBullish = curr.fastLow > curr.slowLow && curr.fastHigh > curr.slowHigh;
const prevBearish = prev.fastLow < prev.slowLow && prev.fastHigh < prev.slowHigh;
const currBearish = curr.fastLow < curr.slowLow && curr.fastHigh < curr.slowHigh;
if (currBullish && !prevBullish) {
currentZone = { type: 'bullish', start: curr.time };
} else if (currBearish && !prevBearish) {
currentZone = { type: 'bearish', start: curr.time };
} else if (!currBullish && !currBearish && currentZone) {
currentZone.end = prev.time;
trendZones.push({ ...currentZone });
currentZone = null;
}
}
if (currentZone) {
currentZone.end = htsData[htsData.length - 1].time;
trendZones.push(currentZone);
}
trendZones.forEach(zone => {
const zoneSeries = this.chart.addSeries(LightweightCharts.AreaSeries, {
topColor: zone.type === 'bullish' ? 'rgba(38, 166, 154, 0.02)' : 'rgba(239, 83, 80, 0.02)',
bottomColor: zone.type === 'bullish' ? 'rgba(38, 166, 154, 0.02)' : 'rgba(239, 83, 80, 0.02)',
lineColor: 'transparent',
lastValueVisible: false,
priceLineVisible: false,
});
if (this.candles && this.candles.length > 0) {
const maxPrice = Math.max(...this.candles.map(c => c.high)) * 2;
const startTime = zone.start || (this.candles[0]?.time);
const endTime = zone.end || (this.candles[this.candles.length - 1]?.time);
// lightweight-charts requires strictly ascending times per series, so the
// original four-point rectangle (duplicate start/end timestamps) would
// throw; shade the zone as an area under a line pinned above the visible
// price range instead.
if (endTime > startTime) {
zoneSeries.setData([
{ time: startTime, value: maxPrice },
{ time: endTime, value: maxPrice }
]);
}
}
this.overlays.push({ series: zoneSeries, name: `trendZone_${zone.type}_${zone.start}` });
});
}
addCrossoverMarkers(htsData) {
if (!htsData || htsData.length < 2) return;
const markers = [];
for (let i = 1; i < htsData.length; i++) {
const prev = htsData[i - 1];
const curr = htsData[i];
if (!prev || !curr) continue;
const price = curr.price;
const prevFastLow = prev.fastLow;
const currFastLow = curr.fastLow;
const prevFastHigh = prev.fastHigh;
const currFastHigh = curr.fastHigh;
const prevSlowLow = prev.slowLow;
const currSlowLow = curr.slowLow;
const prevSlowHigh = prev.slowHigh;
const currSlowHigh = curr.slowHigh;
if (prevFastLow <= prevSlowLow && currFastLow > currSlowLow && price > currSlowLow) {
markers.push({
time: curr.time,
position: 'belowBar',
color: '#26a69a',
shape: 'arrowUp',
text: 'BUY',
size: 1.2
});
}
if (prevFastHigh >= prevSlowHigh && currFastHigh < currSlowHigh && price < currSlowHigh) {
markers.push({
time: curr.time,
position: 'aboveBar',
color: '#ef5350',
shape: 'arrowDown',
text: 'SELL',
size: 1.2
});
}
}
const candleSeries = this.candleData?.series;
if (candleSeries) {
try {
if (typeof candleSeries.setMarkers === 'function') {
candleSeries.setMarkers(markers);
} else if (typeof SeriesMarkersPrimitive !== 'undefined') {
if (!this.markerPrimitive) {
this.markerPrimitive = new SeriesMarkersPrimitive();
candleSeries.attachPrimitive(this.markerPrimitive);
}
this.markerPrimitive.setMarkers(markers);
}
} catch (e) {
console.warn('[HTS] Error setting markers:', e);
}
}
return markers;
}
}
export function addHTSVisualization(chart, candleSeries, htsData, candles, isAutoHTS = false) {
const visualizer = new HTSVisualizer(chart, candles);
visualizer.candleData = { series: candleSeries };
visualizer.addHTSChannels(htsData, isAutoHTS);
// Disable trend zones to avoid visual clutter
// visualizer.addTrendZones(htsData);
if (window.showCrossoverMarkers !== false) {
setTimeout(() => {
try {
visualizer.addCrossoverMarkers(htsData);
} catch (e) {
console.warn('Crossover markers skipped (API limitation):', e.message);
}
}, 100);
}
return visualizer;
}
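`addCrossoverMarkers` couples the crossover test to chart state; the detection rule itself is pure and can be lifted out and unit-tested on its own. A sketch of that rule — `crossoverSignal` is an illustrative name, not part of the module:

```javascript
// Signal for one pair of consecutive HTS rows, mirroring the rules above:
// BUY  when the fast low crosses above the slow low with price above it,
// SELL when the fast high crosses below the slow high with price below it.
function crossoverSignal(prev, curr) {
  if (prev.fastLow <= prev.slowLow && curr.fastLow > curr.slowLow && curr.price > curr.slowLow) {
    return 'buy';
  }
  if (prev.fastHigh >= prev.slowHigh && curr.fastHigh < curr.slowHigh && curr.price < curr.slowHigh) {
    return 'sell';
  }
  return null;
}

// Fast low crosses above slow low while price holds above it → BUY.
const prevRow = { fastLow: 99, slowLow: 100, fastHigh: 104, slowHigh: 105, price: 101 };
const currRow = { fastLow: 101, slowLow: 100, fastHigh: 104, slowHigh: 105, price: 102 };
console.log(crossoverSignal(prevRow, currRow)); // 'buy'
```

Keeping the rule pure also sidesteps the `setMarkers` vs `SeriesMarkersPrimitive` API split handled above: the signal list can be computed once and handed to whichever marker mechanism the charting library exposes.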

@@ -0,0 +1,14 @@
export { TradingDashboard, refreshTA, openAIAnalysis } from './chart.js';
export { toggleSidebar, restoreSidebarState } from './sidebar.js';
export {
renderIndicatorList,
addNewIndicator,
selectIndicator,
renderIndicatorConfig,
applyIndicatorConfig,
removeIndicator,
removeIndicatorByIndex,
drawIndicatorsOnChart,
getActiveIndicators,
setActiveIndicators
} from './indicators-panel.js';

File diff suppressed because it is too large.

@@ -0,0 +1,703 @@
import { getAvailableIndicators, IndicatorRegistry as IR } from '../indicators/index.js';
let activeIndicators = [];
let configuringId = null;
let previewingType = null; // type being previewed (not yet added)
let nextInstanceId = 1;
const DEFAULT_COLORS = ['#2962ff', '#26a69a', '#ef5350', '#ff9800', '#9c27b0', '#00bcd4', '#ffeb3b', '#e91e63'];
const LINE_TYPES = ['solid', 'dotted', 'dashed'];
function getDefaultColor(index) {
return DEFAULT_COLORS[index % DEFAULT_COLORS.length];
}
function getPlotGroupName(plotId) {
if (plotId.toLowerCase().includes('fast')) return 'Fast';
if (plotId.toLowerCase().includes('slow')) return 'Slow';
if (plotId.toLowerCase().includes('upper')) return 'Upper';
if (plotId.toLowerCase().includes('lower')) return 'Lower';
if (plotId.toLowerCase().includes('middle') || plotId.toLowerCase().includes('basis')) return 'Middle';
if (plotId.toLowerCase().includes('signal')) return 'Signal';
if (plotId.toLowerCase().includes('histogram')) return 'Histogram';
if (plotId.toLowerCase().includes('k')) return '%K';
if (plotId.toLowerCase().includes('d')) return '%D';
return plotId;
}
function groupPlotsByColor(plots) {
const groups = {};
plots.forEach((plot, idx) => {
const groupName = getPlotGroupName(plot.id);
if (!groups[groupName]) {
groups[groupName] = { name: groupName, indices: [], plots: [] };
}
groups[groupName].indices.push(idx);
groups[groupName].plots.push(plot);
});
return Object.values(groups);
}
/** Generate a short label for an active indicator showing its key params */
function getIndicatorLabel(indicator) {
const meta = getIndicatorMeta(indicator);
if (!meta) return indicator.name;
const paramParts = meta.inputs.map(input => {
const val = indicator.params[input.name];
// Show every defined param (defaults included) in the label.
return val !== undefined ? val : null;
}).filter(v => v !== null);
if (paramParts.length > 0) {
return `${indicator.name} (${paramParts.join(', ')})`;
}
return indicator.name;
}
function getIndicatorMeta(indicator) {
const IndicatorClass = IR?.[indicator.type];
if (!IndicatorClass) return null;
const instance = new IndicatorClass({ type: indicator.type, params: indicator.params, name: indicator.name });
return instance.getMetadata();
}
export function getActiveIndicators() {
return activeIndicators;
}
export function setActiveIndicators(indicators) {
activeIndicators = indicators;
}
/**
* Render the indicator catalog (available indicators) and active list.
* Catalog items are added via double-click (multiple instances allowed).
*/
export function renderIndicatorList() {
const container = document.getElementById('indicatorList');
if (!container) return;
const available = getAvailableIndicators();
container.innerHTML = `
<div class="indicator-catalog">
${available.map(ind => `
<div class="indicator-catalog-item ${previewingType === ind.type ? 'previewing' : ''}"
title="${ind.description || ''}"
data-type="${ind.type}">
<span class="indicator-catalog-name">${ind.name}</span>
<span class="indicator-catalog-add" data-type="${ind.type}">+</span>
</div>
`).join('')}
</div>
${activeIndicators.length > 0 ? `
<div class="indicator-active-divider">Active</div>
<div class="indicator-active-list">
${activeIndicators.map(ind => {
const isConfiguring = ind.id === configuringId;
const plotGroups = groupPlotsByColor(ind.plots || []);
const colorDots = plotGroups.map(group => {
const firstIdx = group.indices[0];
const color = ind.params[`_color_${firstIdx}`] || '#2962ff';
return `<span class="indicator-color-dot" style="background: ${color};"></span>`;
}).join('');
const label = getIndicatorLabel(ind);
return `
<div class="indicator-active-item ${isConfiguring ? 'configuring' : ''}"
data-id="${ind.id}">
<span class="indicator-active-eye" data-id="${ind.id}"
title="${ind.visible !== false ? 'Hide' : 'Show'}">
${ind.visible !== false ? '👁' : '👁‍🗨'}
</span>
<span class="indicator-active-name" data-id="${ind.id}">${label}</span>
${colorDots}
<button class="indicator-config-btn ${isConfiguring ? 'active' : ''}"
data-id="${ind.id}" title="Configure">⚙</button>
<button class="indicator-remove-btn"
data-id="${ind.id}" title="Remove">×</button>
</div>
`;
}).join('')}
</div>
` : ''}
`;
// Bind events via delegation
container.querySelectorAll('.indicator-catalog-item').forEach(el => {
el.addEventListener('click', () => previewIndicator(el.dataset.type));
el.addEventListener('dblclick', () => addIndicator(el.dataset.type));
});
container.querySelectorAll('.indicator-catalog-add').forEach(el => {
el.addEventListener('click', (e) => {
e.stopPropagation();
addIndicator(el.dataset.type);
});
});
container.querySelectorAll('.indicator-active-name').forEach(el => {
el.addEventListener('click', () => selectIndicatorConfig(el.dataset.id));
});
container.querySelectorAll('.indicator-config-btn').forEach(el => {
el.addEventListener('click', (e) => {
e.stopPropagation();
selectIndicatorConfig(el.dataset.id);
});
});
container.querySelectorAll('.indicator-remove-btn').forEach(el => {
el.addEventListener('click', (e) => {
e.stopPropagation();
removeIndicatorById(el.dataset.id);
});
});
container.querySelectorAll('.indicator-active-eye').forEach(el => {
el.addEventListener('click', (e) => {
e.stopPropagation();
toggleVisibility(el.dataset.id);
});
});
updateConfigPanel();
updateChartLegend();
}
function updateConfigPanel() {
const configPanel = document.getElementById('indicatorConfigPanel');
const configButtons = document.getElementById('configButtons');
if (!configPanel) return;
configPanel.style.display = 'block';
// Active indicator config takes priority over preview
const indicator = configuringId ? activeIndicators.find(a => a.id === configuringId) : null;
if (indicator) {
renderIndicatorConfig(indicator);
if (configButtons) configButtons.style.display = 'flex';
} else if (previewingType) {
renderPreviewConfig(previewingType);
if (configButtons) configButtons.style.display = 'none';
} else {
const container = document.getElementById('configForm');
if (container) {
container.innerHTML = '<div style="text-align: center; color: var(--tv-text-secondary); padding: 20px; font-size: 12px;">Click an indicator to preview its settings</div>';
}
if (configButtons) configButtons.style.display = 'none';
}
}
/** Single-click: preview config for a catalog indicator type (read-only) */
function previewIndicator(type) {
configuringId = null;
previewingType = previewingType === type ? null : type;
renderIndicatorList();
}
/** Render a read-only preview of an indicator's default config */
function renderPreviewConfig(type) {
const container = document.getElementById('configForm');
if (!container) return;
const IndicatorClass = IR?.[type];
if (!IndicatorClass) return;
const instance = new IndicatorClass({ type, params: {}, name: '' });
const meta = instance.getMetadata();
container.innerHTML = `
<div style="font-size: 11px; color: var(--tv-blue); margin-bottom: 4px; font-weight: 600;">${meta.name}</div>
<div style="font-size: 11px; color: var(--tv-text-secondary); margin-bottom: 10px;">${meta.description || ''}</div>
${meta.inputs.map(input => `
<div style="margin-bottom: 8px;">
<label style="font-size: 10px; color: var(--tv-text-secondary); text-transform: uppercase; display: block; margin-bottom: 4px;">${input.label}</label>
${input.type === 'select' ?
`<select class="sim-input" style="font-size: 12px; padding: 6px;" disabled>${input.options.map(o => `<option ${input.default === o ? 'selected' : ''}>${o}</option>`).join('')}</select>` :
`<input type="number" class="sim-input" value="${input.default}" ${input.step !== undefined ? `step="${input.step}"` : ''} style="font-size: 12px; padding: 6px;" disabled>`
}
</div>
`).join('')}
<div style="font-size: 10px; color: var(--tv-text-secondary); margin-top: 8px; text-align: center;">Double-click to add to chart</div>
`;
}
/** Add a new instance of an indicator type */
export function addIndicator(type) {
const IndicatorClass = IR?.[type];
if (!IndicatorClass) return;
previewingType = null;
const id = `${type}_${nextInstanceId++}`;
const instance = new IndicatorClass({ type, params: {}, name: '' });
const metadata = instance.getMetadata();
const params = {
_lineType: 'solid',
_lineWidth: 1
};
// Set Hurst-specific defaults
if (type === 'hurst') {
params.timeframe = 'chart';
params.markerBuyShape = 'custom';
params.markerSellShape = 'custom';
params.markerBuyColor = '#9e9e9e';
params.markerSellColor = '#9e9e9e';
params.markerBuyCustom = '▲';
params.markerSellCustom = '▼';
}
metadata.plots.forEach((plot, idx) => {
params[`_color_${idx}`] = plot.color || getDefaultColor(activeIndicators.length + idx);
});
metadata.inputs.forEach(input => {
params[input.name] = input.default;
});
activeIndicators.push({
id,
type,
name: metadata.name,
params,
plots: metadata.plots,
series: [],
visible: true
});
configuringId = id;
renderIndicatorList();
drawIndicatorsOnChart();
}
function selectIndicatorConfig(id) {
previewingType = null;
if (configuringId === id) {
configuringId = null;
} else {
configuringId = id;
}
renderIndicatorList();
}
function toggleVisibility(id) {
const indicator = activeIndicators.find(a => a.id === id);
if (!indicator) return;
indicator.visible = indicator.visible === false;
// Show/hide all series for this indicator
indicator.series?.forEach(s => {
try {
s.applyOptions({ visible: indicator.visible });
} catch(e) {}
});
renderIndicatorList();
}
export function renderIndicatorConfig(indicator) {
const container = document.getElementById('configForm');
if (!container || !indicator) return;
const IndicatorClass = IR?.[indicator.type];
if (!IndicatorClass) {
container.innerHTML = '<div style="color: var(--tv-red);">Error loading indicator</div>';
return;
}
const instance = new IndicatorClass({ type: indicator.type, params: indicator.params, name: indicator.name });
const meta = instance.getMetadata();
const plotGroups = groupPlotsByColor(meta.plots);
const colorInputs = plotGroups.map(group => {
const firstIdx = group.indices[0];
const color = indicator.params[`_color_${firstIdx}`] || meta.plots[firstIdx].color || '#2962ff';
return `
<div style="margin-bottom: 8px;">
<label style="font-size: 10px; color: var(--tv-text-secondary); text-transform: uppercase; display: block; margin-bottom: 4px;">${group.name} Color</label>
<input type="color" id="config__color_${firstIdx}" value="${color}" style="width: 100%; height: 28px; border: 1px solid var(--tv-border); border-radius: 4px; cursor: pointer; background: var(--tv-bg);">
</div>
`;
}).join('');
container.innerHTML = `
<div style="font-size: 11px; color: var(--tv-blue); margin-bottom: 8px; font-weight: 600;">${getIndicatorLabel(indicator)}</div>
${colorInputs}
<div style="margin-bottom: 8px;">
<label style="font-size: 10px; color: var(--tv-text-secondary); text-transform: uppercase; display: block; margin-bottom: 4px;">Line Type</label>
<select id="config__lineType" class="sim-input" style="font-size: 12px; padding: 6px;">
${LINE_TYPES.map(lt => `<option value="${lt}" ${indicator.params._lineType === lt ? 'selected' : ''}>${lt.charAt(0).toUpperCase() + lt.slice(1)}</option>`).join('')}
</select>
</div>
<div style="margin-bottom: 8px;">
<label style="font-size: 10px; color: var(--tv-text-secondary); text-transform: uppercase; display: block; margin-bottom: 4px;">Line Width</label>
<input type="number" id="config__lineWidth" class="sim-input" value="${indicator.params._lineWidth || 1}" min="1" max="5" style="font-size: 12px; padding: 6px;">
</div>
${meta.inputs.map(input => `
<div style="margin-bottom: 8px;">
<label style="font-size: 10px; color: var(--tv-text-secondary); text-transform: uppercase; display: block; margin-bottom: 4px;">${input.label}</label>
${input.type === 'select' ?
`<select id="config_${input.name}" class="sim-input" style="font-size: 12px; padding: 6px;">${input.options.map(o => `<option value="${o}" ${indicator.params[input.name] === o ? 'selected' : ''}>${o}</option>`).join('')}</select>` :
`<input type="number" id="config_${input.name}" class="sim-input" value="${indicator.params[input.name]}" ${input.min !== undefined ? `min="${input.min}"` : ''} ${input.max !== undefined ? `max="${input.max}"` : ''} ${input.step !== undefined ? `step="${input.step}"` : ''} style="font-size: 12px; padding: 6px;">`
}
</div>
`).join('')}
`;
}
export function applyIndicatorConfig() {
const indicator = configuringId ? activeIndicators.find(a => a.id === configuringId) : null;
if (!indicator) return;
const IndicatorClass = IR?.[indicator.type];
if (!IndicatorClass) return;
const instance = new IndicatorClass({ type: indicator.type, params: {}, name: indicator.name });
const meta = instance.getMetadata();
const plotGroups = groupPlotsByColor(meta.plots);
plotGroups.forEach(group => {
const firstIdx = group.indices[0];
const colorEl = document.getElementById(`config__color_${firstIdx}`);
if (colorEl) {
const color = colorEl.value;
group.indices.forEach(idx => {
indicator.params[`_color_${idx}`] = color;
});
}
});
const lineTypeEl = document.getElementById('config__lineType');
const lineWidthEl = document.getElementById('config__lineWidth');
if (lineTypeEl) indicator.params._lineType = lineTypeEl.value;
if (lineWidthEl) indicator.params._lineWidth = parseInt(lineWidthEl.value);
meta.inputs.forEach(input => {
const el = document.getElementById(`config_${input.name}`);
if (el) {
indicator.params[input.name] = input.type === 'select' ? el.value : parseFloat(el.value);
}
});
renderIndicatorList();
drawIndicatorsOnChart();
}
export function removeIndicator() {
if (!configuringId) return;
removeIndicatorById(configuringId);
}
export function removeIndicatorById(id) {
const idx = activeIndicators.findIndex(a => a.id === id);
if (idx < 0) return;
activeIndicators[idx].series?.forEach(s => {
try { window.dashboard?.chart?.removeSeries(s); } catch(e) {}
});
activeIndicators.splice(idx, 1);
if (configuringId === id) {
configuringId = null;
}
renderIndicatorList();
drawIndicatorsOnChart();
}
export function removeIndicatorByIndex(index) {
if (index < 0 || index >= activeIndicators.length) return;
removeIndicatorById(activeIndicators[index].id);
}
let indicatorPanes = new Map();
let nextPaneIndex = 1;
export function drawIndicatorsOnChart() {
if (!window.dashboard || !window.dashboard.chart) return;
activeIndicators.forEach(ind => {
ind.series?.forEach(s => {
try { window.dashboard.chart.removeSeries(s); } catch(e) {}
});
});
const candles = window.dashboard.allData.get(window.dashboard.currentInterval);
if (!candles || candles.length === 0) return;
const lineStyleMap = { 'solid': LightweightCharts.LineStyle.Solid, 'dotted': LightweightCharts.LineStyle.Dotted, 'dashed': LightweightCharts.LineStyle.Dashed };
indicatorPanes.clear();
nextPaneIndex = 1;
const overlayIndicators = [];
const paneIndicators = [];
activeIndicators.forEach(ind => {
const IndicatorClass = IR?.[ind.type];
if (!IndicatorClass) return;
const instance = new IndicatorClass({ type: ind.type, params: ind.params, name: ind.name });
const meta = instance.getMetadata();
if (meta.displayMode === 'pane') {
paneIndicators.push({ indicator: ind, meta, instance });
} else {
overlayIndicators.push({ indicator: ind, meta, instance });
}
});
const mainPaneHeight = paneIndicators.length > 0 ? 60 : 100;
const paneHeight = paneIndicators.length > 0 ? Math.floor(40 / paneIndicators.length) : 0;
window.dashboard.chart.panes()[0]?.setHeight(mainPaneHeight);
overlayIndicators.forEach(({ indicator, meta, instance }) => {
if (indicator.visible === false) {
indicator.series = [];
return;
}
renderIndicatorOnPane(indicator, meta, instance, candles, 0, lineStyleMap);
});
paneIndicators.forEach(({ indicator, meta, instance }, idx) => {
if (indicator.visible === false) {
indicator.series = [];
return;
}
const paneIndex = nextPaneIndex++;
indicatorPanes.set(indicator.id, paneIndex);
renderIndicatorOnPane(indicator, meta, instance, candles, paneIndex, lineStyleMap);
const pane = window.dashboard.chart.panes()[paneIndex];
if (pane) {
pane.setHeight(paneHeight);
}
});
updateChartLegend();
}
function renderIndicatorOnPane(indicator, meta, instance, candles, paneIndex, lineStyleMap) {
let results = instance.calculate(candles);
if (!results || !Array.isArray(results)) {
console.error(`[renderIndicatorOnPane] ${indicator.name}: Failed to get valid results (got ${typeof results})`);
return;
}
indicator.series = [];
const lineStyle = lineStyleMap[indicator.params._lineType] || LightweightCharts.LineStyle.Solid;
const lineWidth = indicator.params._lineWidth || 1;
const firstNonNull = Array.isArray(results) ? results.find(r => r !== null && r !== undefined) : null;
const isObjectResult = firstNonNull && typeof firstNonNull === 'object';
meta.plots.forEach((plot, plotIdx) => {
if (isObjectResult) {
// Find if this specific plot has any non-null data across all results
const hasData = results.some(r => r && r[plot.id] !== undefined && r[plot.id] !== null);
if (!hasData) return;
}
// Skip hidden plots
if (plot.visible === false) return;
const plotColor = indicator.params[`_color_${plotIdx}`] || plot.color || '#2962ff';
const data = [];
for (let i = 0; i < candles.length; i++) {
let value;
if (isObjectResult) {
value = results[i]?.[plot.id];
} else {
value = results[i];
}
if (value !== null && value !== undefined) {
data.push({
time: candles[i].time,
value: value
});
}
}
if (data.length === 0) return;
let series;
// Determine line style for this specific plot
let plotLineStyle = lineStyle;
if (plot.style === 'dashed') plotLineStyle = LightweightCharts.LineStyle.Dashed;
else if (plot.style === 'dotted') plotLineStyle = LightweightCharts.LineStyle.Dotted;
else if (plot.style === 'solid') plotLineStyle = LightweightCharts.LineStyle.Solid;
if (plot.type === 'histogram') {
series = window.dashboard.chart.addSeries(LightweightCharts.HistogramSeries, {
color: plotColor,
priceFormat: {
type: 'price',
precision: 4,
minMove: 0.0001
},
priceLineVisible: false,
lastValueVisible: false
}, paneIndex);
} else if (plot.type === 'baseline') {
series = window.dashboard.chart.addSeries(LightweightCharts.BaselineSeries, {
baseValue: { type: 'price', price: plot.baseValue || 0 },
topLineColor: plot.topLineColor || plotColor,
topFillColor1: plot.topFillColor1 || plotColor,
topFillColor2: plot.topFillColor2 || '#00000000',
bottomFillColor1: plot.bottomFillColor1 || '#00000000',
bottomColor: plot.bottomColor || '#00000000',
lineWidth: plot.width !== undefined ? plot.width : lineWidth,
lineStyle: plotLineStyle,
title: plot.title || '',
priceLineVisible: false,
lastValueVisible: plot.lastValueVisible !== false
}, paneIndex);
} else {
series = window.dashboard.chart.addSeries(LightweightCharts.LineSeries, {
color: plotColor,
lineWidth: plot.width !== undefined ? plot.width : lineWidth,
lineStyle: plotLineStyle,
title: plot.title || '',
priceLineVisible: false,
lastValueVisible: plot.lastValueVisible !== false
}, paneIndex);
}
series.setData(data);
series.plotId = plot.id;
indicator.series.push(series);
});
// Render gradient zones if available
if (meta.gradientZones && indicator.series.length > 0) {
// Find the main series to attach zones to
let baseSeries = indicator.series[0];
meta.gradientZones.forEach(zone => {
if (zone.from === undefined || zone.to === undefined) return;
// We use createPriceLine on the series for horizontal bands with custom colors
baseSeries.createPriceLine({
price: zone.from,
color: zone.color.replace(/rgba\((\d+),\s*(\d+),\s*(\d+),\s*[^)]+\)/, 'rgb($1, $2, $3)'),
lineWidth: 1,
lineStyle: LightweightCharts.LineStyle.Solid,
axisLabelVisible: false,
title: zone.label || '',
});
if (zone.to !== zone.from) {
baseSeries.createPriceLine({
price: zone.to,
color: zone.color.replace(/rgba\((\d+),\s*(\d+),\s*(\d+),\s*[^)]+\)/, 'rgb($1, $2, $3)'),
lineWidth: 1,
lineStyle: LightweightCharts.LineStyle.Solid,
axisLabelVisible: false,
title: '',
});
}
});
}
}
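The gradient-zone boundaries are drawn with opaque colors: the two `replace()` calls above strip the alpha channel from the configured `rgba()` fill before passing it to `createPriceLine`. A standalone sketch of that conversion, using the same regex as the code:

```javascript
// Convert an rgba() color to an opaque rgb() color by keeping the three
// channel captures and dropping the alpha component. Non-rgba strings
// pass through unchanged because the regex simply does not match.
function rgbaToRgb(color) {
  return color.replace(/rgba\((\d+),\s*(\d+),\s*(\d+),\s*[^)]+\)/, 'rgb($1, $2, $3)');
}

console.log(rgbaToRgb('rgba(38, 166, 154, 0.4)')); // → rgb(38, 166, 154)
console.log(rgbaToRgb('rgb(1, 2, 3)'));            // → rgb(1, 2, 3)
```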
/** Update the TradingView-style legend overlay on the chart */
export function updateChartLegend() {
let legend = document.getElementById('chartIndicatorLegend');
if (!legend) {
const chartWrapper = document.getElementById('chartWrapper');
if (!chartWrapper) return;
legend = document.createElement('div');
legend.id = 'chartIndicatorLegend';
legend.className = 'chart-indicator-legend';
chartWrapper.appendChild(legend);
}
if (activeIndicators.length === 0) {
legend.innerHTML = '';
legend.style.display = 'none';
return;
}
legend.style.display = 'flex';
legend.innerHTML = activeIndicators.map(ind => {
const label = getIndicatorLabel(ind);
const plotGroups = groupPlotsByColor(ind.plots || []);
const firstColor = ind.params['_color_0'] || '#2962ff';
const dimmed = ind.visible === false;
return `
<div class="legend-item ${dimmed ? 'legend-dimmed' : ''} ${ind.id === configuringId ? 'legend-selected' : ''}"
data-id="${ind.id}">
<span class="legend-dot" style="background: ${firstColor};"></span>
<span class="legend-label">${label}</span>
<span class="legend-close" data-id="${ind.id}" title="Remove">&times;</span>
</div>
`;
}).join('');
// Bind legend events
legend.querySelectorAll('.legend-item').forEach(el => {
el.addEventListener('click', (e) => {
if (e.target.classList.contains('legend-close')) return;
selectIndicatorConfig(el.dataset.id);
renderIndicatorList();
});
});
legend.querySelectorAll('.legend-close').forEach(el => {
el.addEventListener('click', (e) => {
e.stopPropagation();
removeIndicatorById(el.dataset.id);
});
});
}
// Legacy compat: toggleIndicator still works for external callers
export function toggleIndicator(type) {
addIndicator(type);
}
export function showIndicatorConfig(index) {
if (index >= 0 && index < activeIndicators.length) {
selectIndicatorConfig(activeIndicators[index].id);
}
}
export function showIndicatorConfigByType(type) {
const ind = activeIndicators.find(a => a.type === type);
if (ind) {
selectIndicatorConfig(ind.id);
}
}
window.addIndicator = addIndicator;
window.toggleIndicator = toggleIndicator;
window.showIndicatorConfig = showIndicatorConfig;
window.applyIndicatorConfig = applyIndicatorConfig;
window.removeIndicator = removeIndicator;
window.removeIndicatorById = removeIndicatorById;
window.removeIndicatorByIndex = removeIndicatorByIndex;
window.drawIndicatorsOnChart = drawIndicatorsOnChart;


@ -0,0 +1,117 @@
export class SeriesMarkersPrimitive {
constructor(markers) {
this._markers = markers || [];
this._paneViews = [new MarkersPaneView(this)];
}
setMarkers(markers) {
this._markers = markers;
if (this._requestUpdate) {
this._requestUpdate();
}
}
attached(param) {
this._chart = param.chart;
this._series = param.series;
this._requestUpdate = param.requestUpdate;
this._requestUpdate();
}
detached() {
this._chart = undefined;
this._series = undefined;
this._requestUpdate = undefined;
}
updateAllViews() {}
paneViews() {
return this._paneViews;
}
}
class MarkersPaneView {
constructor(source) {
this._source = source;
}
renderer() {
return new MarkersRenderer(this._source);
}
}
class MarkersRenderer {
constructor(source) {
this._source = source;
}
draw(target) {
if (!this._source._chart || !this._source._series) return;
// Lightweight Charts v5 wraps the canvas: drawing happens inside a
// coordinate-space callback. Media coordinates match the CSS-pixel values
// returned by timeToCoordinate()/priceToCoordinate().
target.useMediaCoordinateSpace(({ context: ctx }) => {
const series = this._source._series;
const chart = this._source._chart;
const markers = this._source._markers;
ctx.save();
for (const marker of markers) {
const timeCoordinate = chart.timeScale().timeToCoordinate(marker.time);
if (timeCoordinate === null) continue;
// Position above or below the bar using the marker's own price value
// (which our calculator always provides). Strict check so a price of 0
// is not silently skipped.
const price = marker.value;
if (price === undefined || price === null) continue;
const priceCoordinate = series.priceToCoordinate(price);
if (priceCoordinate === null) continue;
const x = timeCoordinate;
const size = 5;
const margin = 12; // Gap between price and marker
const isAbove = marker.position === 'aboveBar';
const y = isAbove ? priceCoordinate - margin : priceCoordinate + margin;
ctx.fillStyle = marker.color || '#26a69a';
// Text markers first: the signal calculator emits custom glyphs with an
// empty shape string, which must not fall through to the arrow branches
if (marker.text && (!marker.shape || marker.shape === 'custom')) {
ctx.font = '12px Arial';
ctx.textAlign = 'center';
ctx.textBaseline = 'middle';
ctx.fillText(marker.text, x, y);
continue;
}
ctx.beginPath();
if (marker.shape === 'arrowUp' || (!marker.shape && !isAbove)) {
ctx.moveTo(x, y - size);
ctx.lineTo(x - size, y + size);
ctx.lineTo(x + size, y + size);
} else if (marker.shape === 'arrowDown' || (!marker.shape && isAbove)) {
ctx.moveTo(x, y + size);
ctx.lineTo(x - size, y - size);
ctx.lineTo(x + size, y - size);
} else if (marker.shape === 'circle') {
ctx.arc(x, y, size, 0, Math.PI * 2);
} else if (marker.shape === 'square') {
ctx.rect(x - size, y - size, size * 2, size * 2);
} else {
// Unknown shape: default triangle pointing at the bar
if (isAbove) {
ctx.moveTo(x, y + size);
ctx.lineTo(x - size, y - size);
ctx.lineTo(x + size, y - size);
} else {
ctx.moveTo(x, y - size);
ctx.lineTo(x - size, y + size);
ctx.lineTo(x + size, y + size);
}
}
ctx.fill();
}
ctx.restore();
});
}
}
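The renderer's vertex math can be checked in isolation. A standalone sketch with the same constants (`size = 5`, `margin = 12`); `arrowVertices` is a hypothetical helper name, not part of the plugin:

```javascript
// Compute the three triangle vertices for a marker anchored at x/priceY.
// aboveBar markers sit `margin` px above the price and point down at it;
// belowBar markers sit below and point up.
function arrowVertices(x, priceY, position, size = 5, margin = 12) {
  const isAbove = position === 'aboveBar';
  const y = isAbove ? priceY - margin : priceY + margin;
  return isAbove
    ? [[x, y + size], [x - size, y - size], [x + size, y - size]]
    : [[x, y - size], [x - size, y + size], [x + size, y + size]];
}

console.log(arrowVertices(100, 50, 'aboveBar')); // → [[100, 43], [95, 33], [105, 33]]
console.log(arrowVertices(100, 50, 'belowBar')); // → [[100, 57], [95, 67], [105, 67]]
```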


@ -0,0 +1,73 @@
export function toggleSidebar() {
const sidebar = document.getElementById('rightSidebar');
sidebar.classList.toggle('collapsed');
localStorage.setItem('sidebar_collapsed', sidebar.classList.contains('collapsed'));
// Resize chart after sidebar toggle
setTimeout(() => {
if (window.dashboard && window.dashboard.chart) {
const container = document.getElementById('chart');
window.dashboard.chart.applyOptions({
width: container.clientWidth,
height: container.clientHeight
});
}
}, 350); // Wait for CSS transition
}
export function restoreSidebarState() {
const collapsed = localStorage.getItem('sidebar_collapsed') !== 'false'; // Default to collapsed
const sidebar = document.getElementById('rightSidebar');
if (collapsed && sidebar) {
sidebar.classList.add('collapsed');
}
}
// Tab Management
let activeTab = 'indicators';
export function initSidebarTabs() {
const tabs = document.querySelectorAll('.sidebar-tab');
tabs.forEach(tab => {
tab.addEventListener('click', () => {
switchTab(tab.dataset.tab);
});
});
}
export function switchTab(tabId) {
activeTab = tabId;
localStorage.setItem('sidebar_active_tab', tabId);
document.querySelectorAll('.sidebar-tab').forEach(tab => {
tab.classList.toggle('active', tab.dataset.tab === tabId);
});
document.querySelectorAll('.sidebar-tab-panel').forEach(panel => {
panel.classList.toggle('active', panel.id === `tab-${tabId}`);
});
if (tabId === 'indicators') {
setTimeout(() => {
if (window.drawIndicatorsOnChart) {
window.drawIndicatorsOnChart();
}
}, 50);
} else if (tabId === 'strategy') {
setTimeout(() => {
if (window.renderStrategyPanel) {
window.renderStrategyPanel();
}
}, 50);
}
}
export function getActiveTab() {
return activeTab;
}
export function restoreSidebarTabState() {
const savedTab = localStorage.getItem('sidebar_active_tab') || 'indicators';
switchTab(savedTab);
}


@ -0,0 +1,231 @@
import { IndicatorRegistry } from '../indicators/index.js';
export function calculateSignalMarkers(candles) {
const activeIndicators = window.getActiveIndicators?.() || [];
const markers = [];
if (!candles || candles.length < 2) {
return markers;
}
for (const indicator of activeIndicators) {
if (indicator.params.showMarkers === false || indicator.params.showMarkers === 'false') {
continue;
}
console.log('[SignalMarkers] Processing indicator:', indicator.type, 'showMarkers:', indicator.params.showMarkers);
// Use cache if available
let results = indicator.cachedResults;
if (!results || !Array.isArray(results) || results.length !== candles.length) {
const IndicatorClass = IndicatorRegistry[indicator.type];
if (!IndicatorClass) {
continue;
}
const instance = new IndicatorClass(indicator);
results = instance.calculate(candles);
}
if (!results || !Array.isArray(results) || results.length === 0) {
continue;
}
const indicatorMarkers = findCrossoverMarkers(indicator, candles, results);
markers.push(...indicatorMarkers);
}
markers.sort((a, b) => a.time - b.time);
return markers;
}
function findCrossoverMarkers(indicator, candles, results) {
const markers = [];
const overbought = indicator.params?.overbought || 70;
const oversold = indicator.params?.oversold || 30;
const indicatorType = indicator.type;
const buyColor = indicator.params?.markerBuyColor || '#26a69a';
const sellColor = indicator.params?.markerSellColor || '#ef5350';
const buyShape = indicator.params?.markerBuyShape || 'arrowUp';
const sellShape = indicator.params?.markerSellShape || 'arrowDown';
const buyCustom = indicator.params?.markerBuyCustom || '◭';
const sellCustom = indicator.params?.markerSellCustom || '▼';
for (let i = 1; i < results.length; i++) {
const candle = candles[i];
const prevCandle = candles[i - 1];
const result = results[i];
const prevResult = results[i - 1];
if (!result || !prevResult) continue;
if (indicatorType === 'rsi' || indicatorType === 'stoch') {
const rsi = result.rsi ?? result;
const prevRsi = prevResult.rsi ?? prevResult;
if (rsi === undefined || prevRsi === undefined) continue;
if (prevRsi > overbought && rsi <= overbought) {
markers.push({
time: candle.time,
position: 'aboveBar',
color: sellColor,
shape: sellShape === 'custom' ? '' : sellShape,
text: sellShape === 'custom' ? sellCustom : ''
});
}
if (prevRsi < oversold && rsi >= oversold) {
markers.push({
time: candle.time,
position: 'belowBar',
color: buyColor,
shape: buyShape === 'custom' ? '' : buyShape,
text: buyShape === 'custom' ? buyCustom : ''
});
}
} else if (indicatorType === 'macd') {
const macd = result.macd ?? result.MACD;
const signal = result.signal ?? result.signalLine;
const prevMacd = prevResult.macd ?? prevResult.MACD;
const prevSignal = prevResult.signal ?? prevResult.signalLine;
if (macd === undefined || signal === undefined || prevMacd === undefined || prevSignal === undefined) continue;
const macdAbovePrev = prevMacd > prevSignal;
const macdAboveNow = macd > signal;
if (macdAbovePrev && !macdAboveNow) {
markers.push({
time: candle.time,
position: 'aboveBar',
color: sellColor,
shape: sellShape === 'custom' ? '' : sellShape,
text: sellShape === 'custom' ? sellCustom : ''
});
}
if (!macdAbovePrev && macdAboveNow) {
markers.push({
time: candle.time,
position: 'belowBar',
color: buyColor,
shape: buyShape === 'custom' ? '' : buyShape,
text: buyShape === 'custom' ? buyCustom : ''
});
}
} else if (indicatorType === 'bb') {
const upper = result.upper ?? result.upperBand;
const lower = result.lower ?? result.lowerBand;
if (upper === undefined || lower === undefined) continue;
const priceAboveUpperPrev = prevCandle.close > (prevResult.upper ?? prevResult.upperBand);
const priceAboveUpperNow = candle.close > upper;
if (priceAboveUpperPrev && !priceAboveUpperNow) {
markers.push({
time: candle.time,
position: 'aboveBar',
color: sellColor,
shape: sellShape === 'custom' ? '' : sellShape,
text: sellShape === 'custom' ? sellCustom : ''
});
}
if (!priceAboveUpperPrev && priceAboveUpperNow) {
markers.push({
time: candle.time,
position: 'belowBar',
color: buyColor,
shape: buyShape === 'custom' ? '' : buyShape,
text: buyShape === 'custom' ? buyCustom : ''
});
}
const priceBelowLowerPrev = prevCandle.close < (prevResult.lower ?? prevResult.lowerBand);
const priceBelowLowerNow = candle.close < lower;
if (priceBelowLowerPrev && !priceBelowLowerNow) {
markers.push({
time: candle.time,
position: 'belowBar',
color: buyColor,
shape: buyShape === 'custom' ? '' : buyShape,
text: buyShape === 'custom' ? buyCustom : ''
});
}
if (!priceBelowLowerPrev && priceBelowLowerNow) {
markers.push({
time: candle.time,
position: 'aboveBar',
color: sellColor,
shape: sellShape === 'custom' ? '' : sellShape,
text: sellShape === 'custom' ? sellCustom : ''
});
}
} else if (indicatorType === 'hurst') {
const upper = result.upper;
const lower = result.lower;
const prevUpper = prevResult?.upper;
const prevLower = prevResult?.lower;
if (upper === undefined || lower === undefined ||
prevUpper === undefined || prevLower === undefined) continue;
// BUY: price crosses down below lower band (was above, now below)
if (prevCandle.close > prevLower && candle.close < lower) {
markers.push({
time: candle.time,
position: 'belowBar',
color: buyColor,
shape: buyShape === 'custom' ? '' : buyShape,
text: buyShape === 'custom' ? buyCustom : ''
});
}
// SELL: price crosses down below upper band (was above, now below)
if (prevCandle.close > prevUpper && candle.close < upper) {
markers.push({
time: candle.time,
position: 'aboveBar',
color: sellColor,
shape: sellShape === 'custom' ? '' : sellShape,
text: sellShape === 'custom' ? sellCustom : ''
});
}
} else {
const ma = result.ma ?? result;
const prevMa = prevResult.ma ?? prevResult;
if (ma === undefined || prevMa === undefined) continue;
const priceAbovePrev = prevCandle.close > prevMa;
const priceAboveNow = candle.close > ma;
if (priceAbovePrev && !priceAboveNow) {
markers.push({
time: candle.time,
position: 'aboveBar',
color: sellColor,
shape: sellShape === 'custom' ? '' : sellShape,
text: sellShape === 'custom' ? sellCustom : ''
});
}
if (!priceAbovePrev && priceAboveNow) {
markers.push({
time: candle.time,
position: 'belowBar',
color: buyColor,
shape: buyShape === 'custom' ? '' : buyShape,
text: buyShape === 'custom' ? buyCustom : ''
});
}
}
}
return markers;
}
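Every branch in `findCrossoverMarkers` tests the same two-bar pattern: the value sat on one side of a level on the previous bar and sits on the other side now. A minimal standalone version of that predicate (hypothetical helper names, not part of the dashboard API):

```javascript
// True when `curr` is at or below `level` after `prev` was above it.
function crossedDown(prev, curr, level) {
  return prev > level && curr <= level;
}
// True when `curr` is at or above `level` after `prev` was below it.
function crossedUp(prev, curr, level) {
  return prev < level && curr >= level;
}

console.log(crossedDown(72, 69, 70)); // → true  (RSI leaving overbought: SELL marker)
console.log(crossedUp(28, 31, 30));   // → true  (RSI leaving oversold: BUY marker)
console.log(crossedUp(31, 35, 30));   // → false (already above the level)
```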


@ -0,0 +1,367 @@
// Signal Calculator - orchestrates signal calculation using indicator-specific functions
// Signal calculation logic is now in each indicator file
import { IndicatorRegistry, getSignalFunction } from '../indicators/index.js';
/**
* Calculate signal for an indicator
* @param {Object} indicator - Indicator configuration
* @param {Array} candles - Candle data array
* @param {Object} indicatorValues - Computed indicator values for last candle
* @param {Object} prevIndicatorValues - Computed indicator values for previous candle
* @returns {Object} Signal object with type, strength, value, reasoning
*/
function calculateIndicatorSignal(indicator, candles, indicatorValues, prevIndicatorValues) {
const signalFunction = getSignalFunction(indicator.type);
if (!signalFunction) {
console.warn('[Signals] No signal function for indicator type:', indicator.type);
return null;
}
const lastCandle = candles[candles.length - 1];
const prevCandle = candles[candles.length - 2];
return signalFunction(indicator, lastCandle, prevCandle, indicatorValues, prevIndicatorValues);
}
/**
* Calculate aggregate summary signal from all indicators
*/
export function calculateSummarySignal(signals) {
console.log('[calculateSummarySignal] Input signals:', signals?.length);
if (!signals || signals.length === 0) {
return {
signal: 'hold',
strength: 0,
reasoning: 'No active indicators',
buyCount: 0,
sellCount: 0,
holdCount: 0
};
}
const buySignals = signals.filter(s => s.signal === 'buy');
const sellSignals = signals.filter(s => s.signal === 'sell');
const holdSignals = signals.filter(s => s.signal === 'hold');
const buyCount = buySignals.length;
const sellCount = sellSignals.length;
const holdCount = holdSignals.length;
const total = signals.length;
console.log('[calculateSummarySignal] BUY:', buyCount, 'SELL:', sellCount, 'HOLD:', holdCount);
const buyWeight = buySignals.reduce((sum, s) => sum + (s.strength || 0), 0);
const sellWeight = sellSignals.reduce((sum, s) => sum + (s.strength || 0), 0);
let summarySignal, strength, reasoning;
if (buyCount > sellCount && buyCount > holdCount) {
summarySignal = 'buy';
const avgBuyStrength = buyWeight / buyCount;
strength = Math.round(avgBuyStrength * (buyCount / total));
reasoning = `${buyCount} buy signals, ${sellCount} sell, ${holdCount} hold`;
} else if (sellCount > buyCount && sellCount > holdCount) {
summarySignal = 'sell';
const avgSellStrength = sellWeight / sellCount;
strength = Math.round(avgSellStrength * (sellCount / total));
reasoning = `${sellCount} sell signals, ${buyCount} buy, ${holdCount} hold`;
} else {
summarySignal = 'hold';
strength = 30;
reasoning = `Mixed signals: ${buyCount} buy, ${sellCount} sell, ${holdCount} hold`;
}
const result = {
signal: summarySignal,
strength: Math.min(Math.max(strength, 0), 100),
reasoning,
buyCount,
sellCount,
holdCount,
color: summarySignal === 'buy' ? '#26a69a' : summarySignal === 'sell' ? '#ef5350' : '#787b86'
};
console.log('[calculateSummarySignal] Result:', result);
return result;
}
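The strength formula above scales the average strength of the winning side by that side's share of all votes, so a strong but narrow majority still reads as a moderate signal. A standalone sketch of just that arithmetic (`summaryStrength` is a hypothetical name):

```javascript
// Average strength of the majority side, scaled by its vote share.
// Mirrors: strength = round(avgStrength * (count / total)) above.
function summaryStrength(strengths, total) {
  const avg = strengths.reduce((sum, s) => sum + s, 0) / strengths.length;
  return Math.round(avg * (strengths.length / total));
}

// 3 buy votes (80, 60, 70) out of 5 indicators: avg 70 × 3/5
console.log(summaryStrength([80, 60, 70], 5)); // → 42
```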
/**
* Calculate historical crossovers for all indicators based on full candle history
* Finds the last time each indicator crossed from BUY to SELL or SELL to BUY
*/
function calculateHistoricalCrossovers(activeIndicators, candles) {
activeIndicators.forEach(indicator => {
const indicatorType = indicator.type || indicator.indicatorType;
// Recalculate indicator values for all candles (use cache if valid)
let results = indicator.cachedResults;
if (!results || !Array.isArray(results) || results.length !== candles.length) {
const IndicatorClass = IndicatorRegistry[indicatorType];
if (!IndicatorClass) return;
const instance = new IndicatorClass(indicator);
results = instance.calculate(candles);
// Don't save back to cache here, let drawIndicatorsOnChart be the source of truth for cache
}
if (!results || !Array.isArray(results) || results.length === 0) return;
// Find the most recent crossover by going backwards from the newest candle
// candles are sorted oldest first, newest last
let lastCrossoverTimestamp = null;
let lastSignalType = null;
// Get indicator-specific parameters
const overbought = indicator.params?.overbought || 70;
const oversold = indicator.params?.oversold || 30;
for (let i = candles.length - 1; i > 0; i--) {
const candle = candles[i]; // newer candle
const prevCandle = candles[i-1]; // older candle
const result = results[i];
const prevResult = results[i-1];
if (!result || !prevResult) continue;
// Handle different indicator types
if (indicatorType === 'rsi' || indicatorType === 'stoch') {
// RSI/Stochastic: check crossing overbought/oversold levels
const rsi = result.rsi !== undefined ? result.rsi : result;
const prevRsi = prevResult.rsi !== undefined ? prevResult.rsi : prevResult;
if (rsi === undefined || prevRsi === undefined) continue;
// SELL: crossed down through overbought (was above, now below)
if (prevRsi > overbought && rsi <= overbought) {
lastCrossoverTimestamp = candle.time;
lastSignalType = 'sell';
break;
}
// BUY: crossed up through oversold (was below, now above)
if (prevRsi < oversold && rsi >= oversold) {
lastCrossoverTimestamp = candle.time;
lastSignalType = 'buy';
break;
}
} else if (indicatorType === 'hurst') {
// Hurst Bands: check price crossing bands
const upper = result.upper;
const lower = result.lower;
const prevUpper = prevResult.upper;
const prevLower = prevResult.lower;
if (upper === undefined || lower === undefined ||
prevUpper === undefined || prevLower === undefined) continue;
// BUY: price crossed down below lower band
if (prevCandle.close > prevLower && candle.close < lower) {
lastCrossoverTimestamp = candle.time;
lastSignalType = 'buy';
break;
}
// SELL: price crossed down below upper band
if (prevCandle.close > prevUpper && candle.close < upper) {
lastCrossoverTimestamp = candle.time;
lastSignalType = 'sell';
break;
}
} else {
// MA-style: check price crossing MA
const ma = result.ma !== undefined ? result.ma : result;
const prevMa = prevResult.ma !== undefined ? prevResult.ma : prevResult;
if (ma === undefined || prevMa === undefined) continue;
// Check crossover: price was on one side of MA, now on the other side
const priceAbovePrev = prevCandle.close > prevMa;
const priceAboveNow = candle.close > ma;
// SELL signal: price crossed from above to below MA
if (priceAbovePrev && !priceAboveNow) {
lastCrossoverTimestamp = candle.time;
lastSignalType = 'sell';
break;
}
// BUY signal: price crossed from below to above MA
if (!priceAbovePrev && priceAboveNow) {
lastCrossoverTimestamp = candle.time;
lastSignalType = 'buy';
break;
}
}
}
// Always update the timestamp based on current data
// If crossover found use that time, otherwise use last candle time
if (lastCrossoverTimestamp) {
console.log(`[HistoricalCross] ${indicatorType}: Found ${lastSignalType} crossover at ${new Date(lastCrossoverTimestamp * 1000).toLocaleString()}`);
indicator.lastSignalTimestamp = lastCrossoverTimestamp;
indicator.lastSignalType = lastSignalType;
} else {
// No crossover found - use last candle time
const lastCandleTime = candles[candles.length - 1]?.time;
if (lastCandleTime) {
const lastResult = results[results.length - 1];
if (indicatorType === 'rsi' || indicatorType === 'stoch') {
// RSI/Stochastic: use RSI level to determine signal
const rsi = lastResult?.rsi !== undefined ? lastResult.rsi : lastResult;
if (rsi !== undefined) {
indicator.lastSignalType = rsi > overbought ? 'sell' : (rsi < oversold ? 'buy' : null);
indicator.lastSignalTimestamp = lastCandleTime;
}
} else if (indicatorType === 'hurst') {
// Hurst Bands: use price vs bands
const upper = lastResult?.upper;
const lower = lastResult?.lower;
const currentPrice = candles[candles.length - 1]?.close;
if (upper !== undefined && lower !== undefined && currentPrice !== undefined) {
if (currentPrice < lower) {
indicator.lastSignalType = 'buy';
} else if (currentPrice > upper) {
indicator.lastSignalType = 'sell';
} else {
indicator.lastSignalType = null;
}
indicator.lastSignalTimestamp = lastCandleTime;
}
} else {
// MA-style: use price vs MA
const ma = lastResult?.ma !== undefined ? lastResult.ma : lastResult;
if (ma !== undefined) {
const isAbove = candles[candles.length - 1].close > ma;
indicator.lastSignalType = isAbove ? 'buy' : 'sell';
indicator.lastSignalTimestamp = lastCandleTime;
}
}
}
}
});
}
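The backward scan in `calculateHistoricalCrossovers` can be exercised on a plain array. A simplified standalone sketch (hypothetical `lastCross` helper, with a single level instead of separate overbought/oversold thresholds):

```javascript
// Walk from the newest value backwards and report the most recent
// crossing of `level`, mirroring the loop structure used above.
function lastCross(values, level) {
  for (let i = values.length - 1; i > 0; i--) {
    if (values[i - 1] > level && values[i] <= level) return { index: i, type: 'sell' };
    if (values[i - 1] < level && values[i] >= level) return { index: i, type: 'buy' };
  }
  return null; // no crossing in the series
}

console.log(lastCross([65, 72, 68, 74, 71], 70)); // → { index: 3, type: 'buy' }
console.log(lastCross([50, 55, 60], 70));         // → null
```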
/**
* Calculate signals for all active indicators
* @returns {Array} Array of indicator signals
*/
export function calculateAllIndicatorSignals() {
const activeIndicators = window.getActiveIndicators?.() || [];
const candles = window.dashboard?.allData?.get(window.dashboard?.currentInterval);
//console.log('[Signals] ========== calculateAllIndicatorSignals START ==========');
console.log('[Signals] Active indicators:', activeIndicators.length, 'Candles:', candles?.length || 0);
if (!candles || candles.length < 2) {
//console.log('[Signals] Insufficient candles available:', candles?.length || 0);
return [];
}
if (!activeIndicators || activeIndicators.length === 0) {
//console.log('[Signals] No active indicators');
return [];
}
const signals = [];
// Calculate crossovers for all indicators based on historical data
calculateHistoricalCrossovers(activeIndicators, candles);
for (const indicator of activeIndicators) {
const IndicatorClass = IndicatorRegistry[indicator.type];
if (!IndicatorClass) {
console.log('[Signals] No class for indicator type:', indicator.type);
continue;
}
// Use cached results if available, otherwise calculate
let results = indicator.cachedResults;
let meta = indicator.cachedMeta;
if (!results || !meta || !Array.isArray(results) || results.length !== candles.length) {
const instance = new IndicatorClass(indicator);
meta = instance.getMetadata();
results = instance.calculate(candles);
indicator.cachedResults = results;
indicator.cachedMeta = meta;
}
if (!results || !Array.isArray(results) || results.length === 0) {
console.log('[Signals] No valid results for indicator:', indicator.type);
continue;
}
const lastResult = results[results.length - 1];
const prevResult = results[results.length - 2];
if (lastResult === null || lastResult === undefined) {
console.log('[Signals] No valid last result for indicator:', indicator.type);
continue;
}
let values;
let prevValues;
if (typeof lastResult === 'object' && lastResult !== null && !Array.isArray(lastResult)) {
values = lastResult;
prevValues = prevResult;
} else if (typeof lastResult === 'number') {
values = { ma: lastResult };
prevValues = prevResult ? { ma: prevResult } : undefined;
} else {
console.log('[Signals] Unexpected result type for', indicator.type, ':', typeof lastResult);
continue;
}
const signal = calculateIndicatorSignal(indicator, candles, values, prevValues);
let currentSignal = signal;
let lastSignalDate = indicator.lastSignalTimestamp || null;
let lastSignalType = indicator.lastSignalType || null;
if (!currentSignal || !currentSignal.type) {
console.log('[Signals] No valid signal for', indicator.type, '- Using last signal if available');
if (lastSignalType && lastSignalDate) {
currentSignal = {
type: lastSignalType,
strength: 50,
value: candles[candles.length - 1]?.close,
reasoning: 'No new crossover; holding last known signal'
};
} else {
console.log('[Signals] No previous signal available - Skipping');
continue;
}
} else {
const currentCandleTimestamp = candles[candles.length - 1].time;
if (currentSignal.type !== lastSignalType || !lastSignalType) {
console.log('[Signals] Signal changed for', indicator.type, ':', lastSignalType, '->', currentSignal.type);
lastSignalDate = indicator.lastSignalTimestamp || currentCandleTimestamp;
lastSignalType = currentSignal.type;
indicator.lastSignalTimestamp = lastSignalDate;
indicator.lastSignalType = lastSignalType;
}
}
signals.push({
id: indicator.id,
name: meta?.name || indicator.type,
label: indicator.type?.toUpperCase(),
params: meta?.inputs && meta.inputs.length > 0
? indicator.params[meta.inputs[0].name]
: null,
type: indicator.type,
signal: currentSignal.type,
strength: Math.round(currentSignal.strength),
value: currentSignal.value,
reasoning: currentSignal.reasoning,
color: currentSignal.type === 'buy' ? '#26a69a' : currentSignal.type === 'sell' ? '#ef5350' : '#787b86',
lastSignalDate: lastSignalDate
});
}
//console.log('[Signals] ========== calculateAllIndicatorSignals END ==========');
console.log('[Signals] Total signals calculated:', signals.length);
return signals;
}


@ -0,0 +1,370 @@
import { getStrategy, registerStrategy } from '../strategies/index.js';
import { PingPongStrategy } from '../strategies/ping-pong.js';
// Register available strategies
registerStrategy('ping_pong', PingPongStrategy);
let activeIndicators = [];
function formatDisplayDate(timestamp) {
if (!timestamp) return '';
const date = new Date(timestamp);
const day = String(date.getDate()).padStart(2, '0');
const month = String(date.getMonth() + 1).padStart(2, '0');
const year = date.getFullYear();
const hours = String(date.getHours()).padStart(2, '0');
const minutes = String(date.getMinutes()).padStart(2, '0');
return `${day}/${month}/${year} ${hours}:${minutes}`;
}
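Note that `formatDisplayDate` expects a millisecond timestamp (candle times elsewhere in the dashboard are epoch seconds) and renders in the browser's local timezone, so only the `DD/MM/YYYY HH:MM` shape is stable across hosts. The function is pure, so it can be checked standalone (copied verbatim here):

```javascript
function formatDisplayDate(timestamp) {
  if (!timestamp) return '';
  const date = new Date(timestamp); // interprets the input as milliseconds
  const day = String(date.getDate()).padStart(2, '0');
  const month = String(date.getMonth() + 1).padStart(2, '0');
  const year = date.getFullYear();
  const hours = String(date.getHours()).padStart(2, '0');
  const minutes = String(date.getMinutes()).padStart(2, '0');
  return `${day}/${month}/${year} ${hours}:${minutes}`;
}

// The digits depend on the host timezone; only the shape is asserted.
console.log(/^\d{2}\/\d{2}\/\d{4} \d{2}:\d{2}$/.test(formatDisplayDate(Date.now()))); // → true
```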
export function initStrategyPanel() {
window.renderStrategyPanel = renderStrategyPanel;
renderStrategyPanel();
// Listen for indicator changes to update the signal selection list
const originalAddIndicator = window.addIndicator;
window.addIndicator = function(...args) {
const res = originalAddIndicator.apply(this, args);
setTimeout(renderStrategyPanel, 100);
return res;
};
const originalRemoveIndicator = window.removeIndicatorById;
window.removeIndicatorById = function(...args) {
const res = originalRemoveIndicator.apply(this, args);
setTimeout(renderStrategyPanel, 100);
return res;
};
}
export function renderStrategyPanel() {
const container = document.getElementById('strategyPanel');
if (!container) return;
activeIndicators = window.getActiveIndicators?.() || [];
// For now, we only have Ping-Pong. Later we can add a strategy selector.
const currentStrategyId = 'ping_pong';
const strategy = getStrategy(currentStrategyId);
if (!strategy) {
container.innerHTML = `<div class="sidebar-section">Strategy not found.</div>`;
return;
}
container.innerHTML = `
<div class="sidebar-section">
<div class="sidebar-section-header">
<span>⚙️</span> ${strategy.name} Strategy
</div>
<div class="sidebar-section-content">
${strategy.renderUI(activeIndicators, formatDisplayDate)}
<button class="sim-run-btn" id="runSimulationBtn">Run Simulation</button>
</div>
</div>
<div id="simulationResults" class="sim-results" style="display: none;">
<!-- Results will be injected here -->
</div>
`;
// Attach strategy specific listeners (like disabling dropdowns when auto-detect is on)
if (strategy.attachListeners) {
strategy.attachListeners();
}
document.getElementById('runSimulationBtn').addEventListener('click', () => {
strategy.runSimulation(activeIndicators, displayResults);
});
}
// Keep the display logic here so all strategies can use the same rendering for results
let equitySeries = null;
let equityChart = null;
let posSeries = null;
let posSizeChart = null;
let tradeMarkers = [];
function displayResults(trades, equityData, config, endPrice, avgPriceData, posSizeData) {
const resultsDiv = document.getElementById('simulationResults');
resultsDiv.style.display = 'block';
if (window.dashboard) {
window.dashboard.setAvgPriceData(avgPriceData);
}
const entryTrades = trades.filter(t => t.recordType === 'entry').length;
const exitTrades = trades.filter(t => t.recordType === 'exit').length;
const profitableTrades = trades.filter(t => t.recordType === 'exit' && t.pnl > 0).length;
const winRate = exitTrades > 0 ? (profitableTrades / exitTrades * 100).toFixed(1) : 0;
const startPrice = equityData.usd[0].value / equityData.btc[0].value;
const startBtc = config.capital / startPrice;
const finalUsd = equityData.usd[equityData.usd.length - 1].value;
const finalBtc = finalUsd / endPrice;
const totalPnlUsd = finalUsd - config.capital;
const roi = (totalPnlUsd / config.capital * 100).toFixed(2);
const roiBtc = ((finalBtc - startBtc) / startBtc * 100).toFixed(2);
resultsDiv.innerHTML = `
<div class="sidebar-section">
<div class="sidebar-section-header">Results</div>
<div class="sidebar-section-content">
<div class="results-summary">
<div class="result-stat">
<div class="result-stat-value ${totalPnlUsd >= 0 ? 'positive' : 'negative'}">${roi}%</div>
<div class="result-stat-label">ROI (USD)</div>
</div>
<div class="result-stat">
<div class="result-stat-value ${parseFloat(roiBtc) >= 0 ? 'positive' : 'negative'}">${roiBtc}%</div>
<div class="result-stat-label">ROI (BTC)</div>
</div>
</div>
<div class="sim-stat-row">
<span>Starting Balance</span>
<span class="sim-value">$${config.capital.toFixed(0)} / ${startBtc.toFixed(4)} BTC</span>
</div>
<div class="sim-stat-row">
<span>Final Balance</span>
<span class="sim-value">$${finalUsd.toFixed(2)} / ${finalBtc.toFixed(4)} BTC</span>
</div>
<div class="sim-stat-row">
<span>Trades (Entry / Exit)</span>
<span class="sim-value">${entryTrades} / ${exitTrades}</span>
</div>
<div style="display: flex; justify-content: space-between; align-items: center; margin-top: 12px;">
<span style="font-size: 11px; color: var(--tv-text-secondary);">Equity Chart</span>
<div class="chart-toggle-group">
<button class="toggle-btn active" data-unit="usd">USD</button>
<button class="toggle-btn" data-unit="btc">BTC</button>
</div>
</div>
<div class="equity-chart-container" id="equityChart"></div>
<div style="display: flex; justify-content: space-between; align-items: center; margin-top: 12px;">
<span style="font-size: 11px; color: var(--tv-text-secondary);" id="posSizeLabel">Position Size (BTC)</span>
<div class="chart-toggle-group">
<button class="toggle-btn active" data-unit="usd">USD</button>
<button class="toggle-btn" data-unit="btc">BTC</button>
</div>
</div>
<div class="equity-chart-container" id="posSizeChart"></div>
<div class="results-actions">
<button class="action-btn secondary" id="toggleTradeMarkers">Show Markers</button>
<button class="action-btn secondary" id="clearSim">Clear</button>
</div>
</div>
</div>
`;
// Create Charts
const initCharts = () => {
const equityContainer = document.getElementById('equityChart');
if (equityContainer) {
equityContainer.innerHTML = '';
equityChart = LightweightCharts.createChart(equityContainer, {
layout: { background: { color: '#131722' }, textColor: '#d1d4dc' },
grid: { vertLines: { visible: false }, horzLines: { color: '#2a2e39' } },
rightPriceScale: { borderColor: '#2a2e39', autoScale: true },
timeScale: {
borderColor: '#2a2e39',
visible: true,
timeVisible: true,
secondsVisible: false,
tickMarkFormatter: (time, tickMarkType, locale) => {
return window.TimezoneConfig ? window.TimezoneConfig.formatTickMark(time) : new Date(time * 1000).toLocaleDateString();
},
},
localization: {
timeFormatter: (timestamp) => {
return window.TimezoneConfig ? window.TimezoneConfig.formatDate(timestamp * 1000) : new Date(timestamp * 1000).toLocaleString();
},
},
handleScroll: true,
handleScale: true
});
equitySeries = equityChart.addSeries(LightweightCharts.AreaSeries, {
lineColor: totalPnlUsd >= 0 ? '#26a69a' : '#ef5350',
topColor: totalPnlUsd >= 0 ? 'rgba(38, 166, 154, 0.4)' : 'rgba(239, 83, 80, 0.4)',
bottomColor: 'rgba(0, 0, 0, 0)',
lineWidth: 2,
});
equitySeries.setData(equityData['usd']);
equityChart.timeScale().fitContent();
}
const posSizeContainer = document.getElementById('posSizeChart');
if (posSizeContainer) {
posSizeContainer.innerHTML = '';
posSizeChart = LightweightCharts.createChart(posSizeContainer, {
layout: { background: { color: '#131722' }, textColor: '#d1d4dc' },
grid: { vertLines: { visible: false }, horzLines: { color: '#2a2e39' } },
rightPriceScale: { borderColor: '#2a2e39', autoScale: true },
timeScale: {
borderColor: '#2a2e39',
visible: true,
timeVisible: true,
secondsVisible: false,
tickMarkFormatter: (time, tickMarkType, locale) => {
return window.TimezoneConfig ? window.TimezoneConfig.formatTickMark(time) : new Date(time * 1000).toLocaleDateString();
},
},
localization: {
timeFormatter: (timestamp) => {
return window.TimezoneConfig ? window.TimezoneConfig.formatDate(timestamp * 1000) : new Date(timestamp * 1000).toLocaleString();
},
},
handleScroll: true,
handleScale: true
});
posSeries = posSizeChart.addSeries(LightweightCharts.AreaSeries, {
lineColor: '#00bcd4',
topColor: 'rgba(0, 188, 212, 0.4)',
bottomColor: 'rgba(0, 0, 0, 0)',
lineWidth: 2,
});
posSeries.setData(posSizeData['usd']);
posSizeChart.timeScale().fitContent();
const label = document.getElementById('posSizeLabel');
if (label) label.textContent = 'Position Size (USD)';
}
if (equityChart && posSizeChart) {
let isSyncing = false;
const syncCharts = (source, target) => {
if (isSyncing) return;
isSyncing = true;
const range = source.timeScale().getVisibleRange();
if (range) target.timeScale().setVisibleRange(range);
isSyncing = false;
};
equityChart.timeScale().subscribeVisibleTimeRangeChange(() => syncCharts(equityChart, posSizeChart));
posSizeChart.timeScale().subscribeVisibleTimeRangeChange(() => syncCharts(posSizeChart, equityChart));
}
const syncToMain = (param) => {
if (!param.time || !window.dashboard || !window.dashboard.chart) return;
const timeScale = window.dashboard.chart.timeScale();
const currentRange = timeScale.getVisibleRange();
if (!currentRange) return;
const width = currentRange.to - currentRange.from;
const halfWidth = width / 2;
timeScale.setVisibleRange({
from: param.time - halfWidth,
to: param.time + halfWidth
});
};
if (equityChart) equityChart.subscribeClick(syncToMain);
if (posSizeChart) posSizeChart.subscribeClick(syncToMain);
};
setTimeout(initCharts, 100);
resultsDiv.querySelectorAll('.toggle-btn').forEach(btn => {
btn.addEventListener('click', (e) => {
const unit = btn.dataset.unit;
resultsDiv.querySelectorAll(`.toggle-btn`).forEach(b => {
if (b.dataset.unit === unit) b.classList.add('active');
else b.classList.remove('active');
});
if (equitySeries) {
equitySeries.setData(equityData[unit]);
equityChart.timeScale().fitContent();
}
if (posSeries) {
posSeries.setData(posSizeData[unit]);
posSizeChart.timeScale().fitContent();
const label = document.getElementById('posSizeLabel');
if (label) label.textContent = `Position Size (${unit.toUpperCase()})`;
}
});
});
document.getElementById('toggleTradeMarkers').addEventListener('click', () => {
toggleSimulationMarkers(trades);
});
document.getElementById('clearSim').addEventListener('click', () => {
resultsDiv.style.display = 'none';
clearSimulationMarkers();
if (window.dashboard) {
window.dashboard.clearAvgPriceData();
}
});
}
function toggleSimulationMarkers(trades) {
if (tradeMarkers.length > 0) {
clearSimulationMarkers();
document.getElementById('toggleTradeMarkers').textContent = 'Show Markers';
return;
}
const markers = [];
trades.forEach(t => {
const usdVal = t.currentUsd !== undefined ? `$${t.currentUsd.toFixed(0)}` : '$0';
const qtyVal = t.currentQty !== undefined ? `${t.currentQty.toFixed(4)} BTC` : '0 BTC';
const sizeStr = ` (${usdVal} / ${qtyVal})`;
if (t.recordType === 'entry') {
markers.push({
time: t.time,
position: t.type === 'long' ? 'belowBar' : 'aboveBar',
color: t.type === 'long' ? '#2962ff' : '#9c27b0',
shape: t.type === 'long' ? 'arrowUp' : 'arrowDown',
text: `Entry ${t.type.toUpperCase()}${sizeStr}`
});
}
if (t.recordType === 'exit') {
markers.push({
time: t.time,
position: t.type === 'long' ? 'aboveBar' : 'belowBar',
color: t.pnl >= 0 ? '#26a69a' : '#ef5350',
shape: t.type === 'long' ? 'arrowDown' : 'arrowUp',
text: `Exit ${t.reason}${sizeStr}`
});
}
});
markers.sort((a, b) => a.time - b.time);
if (window.dashboard) {
window.dashboard.setSimulationMarkers(markers);
tradeMarkers = markers;
document.getElementById('toggleTradeMarkers').textContent = 'Hide Markers';
}
}
function clearSimulationMarkers() {
if (window.dashboard) {
window.dashboard.clearSimulationMarkers();
tradeMarkers = [];
}
}
window.clearSimulationResults = function() {
const resultsDiv = document.getElementById('simulationResults');
if (resultsDiv) resultsDiv.style.display = 'none';
clearSimulationMarkers();
};


@ -0,0 +1,23 @@
export function downloadFile(content, filename, mimeType) {
const blob = new Blob([content], { type: mimeType });
const url = URL.createObjectURL(blob);
const link = document.createElement('a');
link.href = url;
link.download = filename;
document.body.appendChild(link);
link.click();
document.body.removeChild(link);
URL.revokeObjectURL(url);
}
export function formatDate(date) {
return new Date(date).toISOString().slice(0, 16);
}
export function formatPrice(price, decimals = 2) {
return price.toFixed(decimals);
}
export function formatPercent(value) {
return (value >= 0 ? '+' : '') + value.toFixed(2) + '%';
}


@ -0,0 +1 @@
export { downloadFile, formatDate, formatPrice, formatPercent } from './helpers.js';

src/api/server.py (547 lines) Normal file

@ -0,0 +1,547 @@
"""
Simplified FastAPI server - working version
Removes the complex WebSocket manager that was causing issues
"""
import os
import asyncio
import logging
from dotenv import load_dotenv
load_dotenv()
from datetime import datetime, timedelta, timezone
from typing import Optional, List
from contextlib import asynccontextmanager
from fastapi import FastAPI, HTTPException, Query, BackgroundTasks, Response
from fastapi.staticfiles import StaticFiles
from fastapi.responses import StreamingResponse
from fastapi.middleware.cors import CORSMiddleware
import asyncpg
import csv
import io
from pydantic import BaseModel, Field
# Imports for backtest runner
from src.data_collector.database import DatabaseManager
from src.data_collector.indicator_engine import IndicatorEngine, IndicatorConfig
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)
# Database connection settings
DB_HOST = os.getenv('DB_HOST', 'localhost')
DB_PORT = int(os.getenv('DB_PORT', 5432))
DB_NAME = os.getenv('DB_NAME', 'btc_data')
DB_USER = os.getenv('DB_USER', 'btc_bot')
DB_PASSWORD = os.getenv('DB_PASSWORD', '')
async def get_db_pool():
"""Create database connection pool"""
logger.info(f"Connecting to database: {DB_HOST}:{DB_PORT}/{DB_NAME} as {DB_USER}")
return await asyncpg.create_pool(
host=DB_HOST,
port=DB_PORT,
database=DB_NAME,
user=DB_USER,
password=DB_PASSWORD,
min_size=2,
max_size=20,
max_inactive_connection_lifetime=300
)
pool = None
@asynccontextmanager
async def lifespan(app: FastAPI):
"""Manage application lifespan"""
global pool
pool = await get_db_pool()
logger.info("API Server started successfully")
yield
if pool:
await pool.close()
logger.info("API Server stopped")
app = FastAPI(
title="BTC Bot Data API",
description="REST API for accessing BTC candle data",
version="1.1.0",
lifespan=lifespan
)
# Enable CORS
app.add_middleware(
CORSMiddleware,
allow_origins=["*"],
allow_credentials=True,
allow_methods=["*"],
allow_headers=["*"],
)
@app.get("/")
async def root():
"""Root endpoint"""
return {
"message": "BTC Bot Data API",
"docs": "/docs",
"dashboard": "/dashboard",
"status": "operational"
}
@app.get("/api/v1/candles")
async def get_candles(
symbol: str = Query("BTC", description="Trading pair symbol"),
interval: str = Query("1m", description="Candle interval"),
start: Optional[datetime] = Query(None, description="Start time (ISO format)"),
end: Optional[datetime] = Query(None, description="End time (ISO format)"),
limit: int = Query(1000, ge=1, le=10000, description="Maximum number of candles")
):
"""Get candle data for a symbol"""
async with pool.acquire() as conn:
query = """
SELECT time, symbol, interval, open, high, low, close, volume, validated
FROM candles
WHERE symbol = $1 AND interval = $2
"""
params = [symbol, interval]
if start:
query += f" AND time >= ${len(params) + 1}"
params.append(start)
if end:
query += f" AND time <= ${len(params) + 1}"
params.append(end)
query += f" ORDER BY time DESC LIMIT ${len(params) + 1}"
params.append(limit)
rows = await conn.fetch(query, *params)
return {
"symbol": symbol,
"interval": interval,
"count": len(rows),
"candles": [dict(row) for row in rows]
}
@app.get("/api/v1/candles/bulk")
async def get_candles_bulk(
symbol: str = Query("BTC"),
timeframes: List[str] = Query(["1h"]),
start: datetime = Query(...),
end: Optional[datetime] = Query(None),
):
"""Get multiple timeframes of candles in a single request for client-side processing"""
logger.info(f"Bulk candle request: {symbol}, TFs: {timeframes}, Start: {start}, End: {end}")
if not end:
end = datetime.now(timezone.utc)
results = {}
async with pool.acquire() as conn:
for tf in timeframes:
rows = await conn.fetch("""
SELECT time, open, high, low, close, volume
FROM candles
WHERE symbol = $1 AND interval = $2
AND time >= $3 AND time <= $4
ORDER BY time ASC
""", symbol, tf, start, end)
results[tf] = [
{
"time": r['time'].isoformat(),
"open": float(r['open']),
"high": float(r['high']),
"low": float(r['low']),
"close": float(r['close']),
"volume": float(r['volume'])
} for r in rows
]
logger.info(f"Returning {sum(len(v) for v in results.values())} candles total")
return results
@app.get("/api/v1/candles/latest")
async def get_latest_candle(symbol: str = "BTC", interval: str = "1m"):
"""Get the most recent candle"""
async with pool.acquire() as conn:
row = await conn.fetchrow("""
SELECT time, symbol, interval, open, high, low, close, volume
FROM candles
WHERE symbol = $1 AND interval = $2
ORDER BY time DESC
LIMIT 1
""", symbol, interval)
if not row:
raise HTTPException(status_code=404, detail="No data found")
return dict(row)
@app.get("/api/v1/stats")
async def get_stats(symbol: str = "BTC"):
"""Get trading statistics"""
async with pool.acquire() as conn:
# Get latest price and 24h stats
latest = await conn.fetchrow("""
SELECT close, time
FROM candles
WHERE symbol = $1 AND interval = '1m'
ORDER BY time DESC
LIMIT 1
""", symbol)
day_ago = await conn.fetchrow("""
SELECT close
FROM candles
WHERE symbol = $1 AND interval = '1m' AND time <= NOW() - INTERVAL '24 hours'
ORDER BY time DESC
LIMIT 1
""", symbol)
stats_24h = await conn.fetchrow("""
SELECT
MAX(high) as high_24h,
MIN(low) as low_24h,
SUM(volume) as volume_24h
FROM candles
WHERE symbol = $1 AND interval = '1m' AND time > NOW() - INTERVAL '24 hours'
""", symbol)
if not latest:
raise HTTPException(status_code=404, detail="No data found")
current_price = float(latest['close'])
previous_price = float(day_ago['close']) if day_ago else current_price
change_24h = ((current_price - previous_price) / previous_price * 100) if previous_price else 0
return {
"symbol": symbol,
"current_price": current_price,
"change_24h": round(change_24h, 2),
"high_24h": float(stats_24h['high_24h']) if stats_24h['high_24h'] else current_price,
"low_24h": float(stats_24h['low_24h']) if stats_24h['low_24h'] else current_price,
"volume_24h": float(stats_24h['volume_24h']) if stats_24h['volume_24h'] else 0,
"last_update": latest['time'].isoformat()
}
@app.get("/api/v1/health")
async def health_check():
"""System health check"""
try:
async with pool.acquire() as conn:
latest = await conn.fetch("""
SELECT symbol, MAX(time) as last_time, COUNT(*) as count
FROM candles
WHERE time > NOW() - INTERVAL '24 hours'
GROUP BY symbol
""")
return {
"status": "healthy",
"database": "connected",
"latest_candles": [dict(row) for row in latest],
"timestamp": datetime.now(timezone.utc).isoformat()
}
except Exception as e:
logger.error(f"Health check failed: {e}")
raise HTTPException(status_code=503, detail=f"Health check failed: {str(e)}")
@app.get("/api/v1/indicators")
async def get_indicators(
symbol: str = Query("BTC", description="Trading pair symbol"),
interval: str = Query("1d", description="Candle interval"),
name: str = Query(None, description="Filter by indicator name (e.g., ma44)"),
start: Optional[datetime] = Query(None, description="Start time"),
end: Optional[datetime] = Query(None, description="End time"),
limit: int = Query(1000, le=5000)
):
"""Get indicator values"""
async with pool.acquire() as conn:
query = """
SELECT time, indicator_name, value
FROM indicators
WHERE symbol = $1 AND interval = $2
"""
params = [symbol, interval]
if name:
query += f" AND indicator_name = ${len(params) + 1}"
params.append(name)
if start:
query += f" AND time >= ${len(params) + 1}"
params.append(start)
if end:
query += f" AND time <= ${len(params) + 1}"
params.append(end)
query += f" ORDER BY time DESC LIMIT ${len(params) + 1}"
params.append(limit)
rows = await conn.fetch(query, *params)
# Group by time for easier charting
grouped = {}
for row in rows:
ts = row['time'].isoformat()
if ts not in grouped:
grouped[ts] = {'time': ts}
grouped[ts][row['indicator_name']] = float(row['value'])
return {
"symbol": symbol,
"interval": interval,
"data": list(grouped.values())
}
@app.get("/api/v1/decisions")
async def get_decisions(
symbol: str = Query("BTC"),
interval: Optional[str] = Query(None),
backtest_id: Optional[str] = Query(None),
limit: int = Query(100, le=1000)
):
"""Get brain decisions"""
async with pool.acquire() as conn:
query = """
SELECT time, interval, decision_type, strategy, confidence,
price_at_decision, indicator_snapshot, reasoning, backtest_id
FROM decisions
WHERE symbol = $1
"""
params = [symbol]
if interval:
query += f" AND interval = ${len(params) + 1}"
params.append(interval)
if backtest_id:
query += f" AND backtest_id = ${len(params) + 1}"
params.append(backtest_id)
else:
query += " AND backtest_id IS NULL"
query += f" ORDER BY time DESC LIMIT ${len(params) + 1}"
params.append(limit)
rows = await conn.fetch(query, *params)
return [dict(row) for row in rows]
@app.get("/api/v1/backtests")
async def list_backtests(symbol: Optional[str] = None, limit: int = 20):
"""List historical backtests"""
async with pool.acquire() as conn:
query = """
SELECT id, strategy, symbol, start_time, end_time,
intervals, results, created_at
FROM backtest_runs
"""
params = []
if symbol:
query += " WHERE symbol = $1"
params.append(symbol)
query += f" ORDER BY created_at DESC LIMIT ${len(params) + 1}"
params.append(limit)
rows = await conn.fetch(query, *params)
return [dict(row) for row in rows]
@app.get("/api/v1/ta")
async def get_technical_analysis(
symbol: str = Query("BTC", description="Trading pair symbol"),
interval: str = Query("1d", description="Candle interval")
):
"""
Get technical analysis for a symbol
Uses the most recent stored indicator values from the database
"""
try:
async with pool.acquire() as conn:
# 1. Get latest price
latest = await conn.fetchrow("""
SELECT close, time
FROM candles
WHERE symbol = $1 AND interval = $2
ORDER BY time DESC
LIMIT 1
""", symbol, interval)
if not latest:
return {"error": "No candle data found"}
current_price = float(latest['close'])
timestamp = latest['time']
# 2. Get latest indicators from DB
indicators = await conn.fetch("""
SELECT indicator_name, value
FROM indicators
WHERE symbol = $1 AND interval = $2
AND time <= $3
ORDER BY time DESC
""", symbol, interval, timestamp)
# Convert list to dict, e.g. {'ma44': 65000, 'ma125': 64000}
# We take the most recent value for each indicator
ind_map = {}
for row in indicators:
name = row['indicator_name']
if name not in ind_map:
ind_map[name] = float(row['value'])
ma_44 = ind_map.get('ma44')
ma_125 = ind_map.get('ma125')
# Determine trend
if ma_44 and ma_125:
if current_price > ma_44 > ma_125:
trend = "Bullish"
trend_strength = "Strong" if current_price > ma_44 * 1.05 else "Moderate"
elif current_price < ma_44 < ma_125:
trend = "Bearish"
trend_strength = "Strong" if current_price < ma_44 * 0.95 else "Moderate"
else:
trend = "Neutral"
trend_strength = "Consolidation"
else:
trend = "Unknown"
trend_strength = "Insufficient data"
# 3. Find support/resistance (simple recent high/low)
rows = await conn.fetch("""
SELECT high, low
FROM candles
WHERE symbol = $1 AND interval = $2
ORDER BY time DESC
LIMIT 20
""", symbol, interval)
if rows:
highs = [float(r['high']) for r in rows]
lows = [float(r['low']) for r in rows]
resistance = max(highs)
support = min(lows)
price_range = resistance - support
if price_range > 0:
position = (current_price - support) / price_range * 100
else:
position = 50
else:
resistance = current_price
support = current_price
position = 50
return {
"symbol": symbol,
"interval": interval,
"timestamp": timestamp.isoformat(),
"current_price": round(current_price, 2),
"moving_averages": {
"ma_44": round(ma_44, 2) if ma_44 else None,
"ma_125": round(ma_125, 2) if ma_125 else None,
"price_vs_ma44": round((current_price / ma_44 - 1) * 100, 2) if ma_44 else None,
"price_vs_ma125": round((current_price / ma_125 - 1) * 100, 2) if ma_125 else None
},
"trend": {
"direction": trend,
"strength": trend_strength,
"signal": "Buy" if trend == "Bullish" and trend_strength == "Strong" else
"Sell" if trend == "Bearish" and trend_strength == "Strong" else "Hold"
},
"levels": {
"resistance": round(resistance, 2),
"support": round(support, 2),
"position_in_range": round(position, 1)
},
"ai_placeholder": {
"available": False,
"message": "AI analysis available via Gemini or local LLM",
"action": "Click to analyze with AI"
}
}
except Exception as e:
logger.error(f"Technical analysis error: {e}")
raise HTTPException(status_code=500, detail=f"Technical analysis failed: {str(e)}")
@app.get("/api/v1/export/csv")
async def export_csv(
symbol: str = "BTC",
interval: str = "1m",
days: int = Query(7, ge=1, le=365, description="Number of days to export")
):
"""Export candle data to CSV"""
start_date = datetime.now(timezone.utc) - timedelta(days=days)
async with pool.acquire() as conn:
query = """
SELECT time, open, high, low, close, volume
FROM candles
WHERE symbol = $1 AND interval = $2 AND time >= $3
ORDER BY time
"""
rows = await conn.fetch(query, symbol, interval, start_date)
if not rows:
raise HTTPException(status_code=404, detail="No data found for export")
output = io.StringIO()
writer = csv.writer(output)
writer.writerow(['timestamp', 'open', 'high', 'low', 'close', 'volume'])
for row in rows:
writer.writerow([
row['time'].isoformat(),
row['open'],
row['high'],
row['low'],
row['close'],
row['volume']
])
output.seek(0)
return StreamingResponse(
io.BytesIO(output.getvalue().encode()),
media_type="text/csv",
headers={
"Content-Disposition": f"attachment; filename={symbol}_{interval}_{days}d.csv"
}
)
# Serve static files for dashboard
app.mount("/dashboard", StaticFiles(directory="src/api/dashboard/static", html=True), name="dashboard")
if __name__ == "__main__":
import uvicorn
uvicorn.run(app, host="0.0.0.0", port=8000)
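The filtered endpoints above (`/api/v1/candles`, `/api/v1/indicators`, `/api/v1/decisions`) all grow their SQL the same way: each optional filter appends an `AND col = $n` clause, numbering the asyncpg placeholder from the current length of the params list. A minimal standalone sketch of that pattern (the helper name `build_filtered_query` is hypothetical, not part of the server):

```python
def build_filtered_query(base: str, filters: dict) -> tuple[str, list]:
    # Sketch of the server's placeholder-numbering pattern: every non-None
    # filter appends one "AND col = $n" clause, where n is the value's
    # 1-based position in the params list.
    query = base
    params: list = []
    for column, value in filters.items():
        if value is not None:
            params.append(value)
            query += f" AND {column} = ${len(params)}"
    return query, params

# Only the non-None filters produce placeholders.
q, p = build_filtered_query(
    "SELECT time FROM candles WHERE TRUE",
    {"symbol": "BTC", "interval": "1m", "start": None},
)
```

With asyncpg the result would then be executed as `await conn.fetch(q, *p)`; the real endpoints additionally append `ORDER BY ... LIMIT $n` using the same counting trick.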


@ -0,0 +1,21 @@
# Data collector module
from .websocket_client import HyperliquidWebSocket, Candle
from .candle_buffer import CandleBuffer
from .database import DatabaseManager
from .backfill import HyperliquidBackfill
from .custom_timeframe_generator import CustomTimeframeGenerator
from .indicator_engine import IndicatorEngine, IndicatorConfig
from .brain import Brain, Decision
__all__ = [
'HyperliquidWebSocket',
'Candle',
'CandleBuffer',
'DatabaseManager',
'HyperliquidBackfill',
'CustomTimeframeGenerator',
'IndicatorEngine',
'IndicatorConfig',
'Brain',
'Decision'
]


@ -0,0 +1,368 @@
"""
Hyperliquid Historical Data Backfill Module
Downloads candle data from Hyperliquid REST API with pagination support
"""
import asyncio
import logging
from datetime import datetime, timezone, timedelta
from typing import List, Dict, Any, Optional
import aiohttp
from .database import DatabaseManager
from .websocket_client import Candle
logger = logging.getLogger(__name__)
class HyperliquidBackfill:
"""
Backfills historical candle data from Hyperliquid REST API
API Limitations:
- Max 5000 candles per coin/interval combination
- 500 candles per response (requires pagination)
- Available intervals: 1m, 3m, 5m, 15m, 30m, 1h, 2h, 4h, 8h, 12h, 1d, 3d, 1w, 1M
"""
API_URL = "https://api.hyperliquid.xyz/info"
MAX_CANDLES_PER_REQUEST = 500
# Hyperliquid API might limit total history, but we'll set a high limit
# and stop when no more data is returned
MAX_TOTAL_CANDLES = 500000
# Standard timeframes supported by Hyperliquid
INTERVALS = [
"1m", "3m", "5m", "15m", "30m",
"1h", "2h", "4h", "8h", "12h",
"1d", "3d", "1w", "1M"
]
def __init__(
self,
db: DatabaseManager,
coin: str = "BTC",
intervals: Optional[List[str]] = None
):
self.db = db
self.coin = coin
self.intervals = intervals or ["1m"] # Default to 1m
self.session: Optional[aiohttp.ClientSession] = None
async def __aenter__(self):
"""Async context manager entry"""
self.session = aiohttp.ClientSession()
return self
async def __aexit__(self, exc_type, exc_val, exc_tb):
"""Async context manager exit"""
if self.session:
await self.session.close()
async def fetch_candles(
self,
interval: str,
start_time: datetime,
end_time: Optional[datetime] = None
) -> List[Candle]:
"""
Fetch candles for a specific interval with pagination
Args:
interval: Candle interval (e.g., "1m", "1h", "1d")
start_time: Start time (inclusive)
end_time: End time (inclusive, defaults to now)
Returns:
List of Candle objects
"""
if interval not in self.INTERVALS:
raise ValueError(f"Invalid interval: {interval}. Must be one of {self.INTERVALS}")
end_time = end_time or datetime.now(timezone.utc)
# Convert to milliseconds
start_ms = int(start_time.timestamp() * 1000)
end_ms = int(end_time.timestamp() * 1000)
all_candles = []
total_fetched = 0
while total_fetched < self.MAX_TOTAL_CANDLES:
logger.info(f"Fetching {interval} candles from {datetime.fromtimestamp(start_ms/1000, tz=timezone.utc)} "
f"(batch {total_fetched//self.MAX_CANDLES_PER_REQUEST + 1})")
try:
batch = await self._fetch_batch(interval, start_ms, end_ms)
if not batch:
logger.info(f"No more {interval} candles available")
break
all_candles.extend(batch)
total_fetched += len(batch)
logger.info(f"Fetched {len(batch)} {interval} candles (total: {total_fetched})")
# Check if we got less than max, means we're done
if len(batch) < self.MAX_CANDLES_PER_REQUEST:
break
# Update start_time for next batch (last candle's time + 1ms)
last_candle = batch[-1]
start_ms = int(last_candle.time.timestamp() * 1000) + 1
# Small delay to avoid rate limiting
await asyncio.sleep(0.1)
except Exception as e:
logger.error(f"Error fetching {interval} candles: {e}")
break
logger.info(f"Backfill complete for {interval}: {len(all_candles)} candles total")
return all_candles
async def _fetch_batch(
self,
interval: str,
start_ms: int,
end_ms: int
) -> List[Candle]:
"""Fetch a single batch of candles from the API"""
if not self.session:
raise RuntimeError("Session not initialized. Use async context manager.")
payload = {
"type": "candleSnapshot",
"req": {
"coin": self.coin,
"interval": interval,
"startTime": start_ms,
"endTime": end_ms
}
}
async with self.session.post(self.API_URL, json=payload) as response:
if response.status != 200:
text = await response.text()
raise Exception(f"API error {response.status}: {text}")
data = await response.json()
if not isinstance(data, list):
logger.warning(f"Unexpected response format: {data}")
return []
candles = []
for item in data:
try:
candle = self._parse_candle_item(item, interval)
if candle:
candles.append(candle)
except Exception as e:
logger.warning(f"Failed to parse candle: {item}, error: {e}")
return candles
def _parse_candle_item(self, data: Dict[str, Any], interval: str) -> Optional[Candle]:
"""Parse a single candle item from API response"""
try:
# Format: {"t": 1770812400000, "T": ..., "s": "BTC", "i": "1m", "o": "67164.0", ...}
timestamp_ms = int(data.get("t", 0))
timestamp = datetime.fromtimestamp(timestamp_ms / 1000, tz=timezone.utc)
return Candle(
time=timestamp,
symbol=self.coin,
interval=interval,
open=float(data.get("o", 0)),
high=float(data.get("h", 0)),
low=float(data.get("l", 0)),
close=float(data.get("c", 0)),
volume=float(data.get("v", 0))
)
except (KeyError, ValueError, TypeError) as e:
logger.error(f"Failed to parse candle data: {e}, data: {data}")
return None
async def backfill_interval(
self,
interval: str,
days_back: int = 7
) -> int:
"""
Backfill a specific interval for the last N days
Args:
interval: Candle interval
days_back: Number of days to backfill (use 0 for max available)
Returns:
Number of candles inserted
"""
if days_back == 0:
# Fetch maximum available data (5000 candles)
return await self.backfill_max(interval)
end_time = datetime.now(timezone.utc)
start_time = end_time - timedelta(days=days_back)
logger.info(f"Starting backfill for {interval}: {start_time} to {end_time}")
candles = await self.fetch_candles(interval, start_time, end_time)
if not candles:
logger.warning(f"No candles fetched for {interval}")
return 0
# Insert into database
inserted = await self.db.insert_candles(candles)
logger.info(f"Inserted {inserted} candles for {interval}")
return inserted
async def backfill_max(self, interval: str) -> int:
"""
Backfill maximum available data (5000 candles) for an interval
Args:
interval: Candle interval
Returns:
Number of candles inserted
"""
logger.info(f"Fetching maximum available {interval} data (up to 5000 candles)")
# For weekly and monthly, start from 2020 to ensure we get all available data
# Hyperliquid launched around 2023, so this should capture everything
start_time = datetime(2020, 1, 1, tzinfo=timezone.utc)
end_time = datetime.now(timezone.utc)
logger.info(f"Fetching {interval} candles from {start_time} to {end_time}")
candles = await self.fetch_candles(interval, start_time, end_time)
if not candles:
logger.warning(f"No candles fetched for {interval}")
return 0
# Insert into database
inserted = await self.db.insert_candles(candles)
logger.info(f"Inserted {inserted} candles for {interval} (max available)")
return inserted
def _interval_to_minutes(self, interval: str) -> int:
"""Convert interval string to minutes"""
mapping = {
"1m": 1, "3m": 3, "5m": 5, "15m": 15, "30m": 30,
"1h": 60, "2h": 120, "4h": 240, "8h": 480, "12h": 720,
"1d": 1440, "3d": 4320, "1w": 10080, "1M": 43200
}
return mapping.get(interval, 1)
async def backfill_all_intervals(
self,
days_back: int = 7
) -> Dict[str, int]:
"""
Backfill all configured intervals
Args:
days_back: Number of days to backfill
Returns:
Dictionary mapping interval to count inserted
"""
results = {}
for interval in self.intervals:
try:
count = await self.backfill_interval(interval, days_back)
results[interval] = count
except Exception as e:
logger.error(f"Failed to backfill {interval}: {e}")
results[interval] = 0
return results
async def get_earliest_candle_time(self, interval: str) -> Optional[datetime]:
"""Get the earliest candle time available for an interval"""
# Fetch from a date well before launch to find the earliest available candle
start_time = datetime(2020, 1, 1, tzinfo=timezone.utc)
end_time = datetime.now(timezone.utc)
candles = await self.fetch_candles(interval, start_time, end_time)
if candles:
earliest = min(c.time for c in candles)
logger.info(f"Earliest {interval} candle available: {earliest}")
return earliest
return None
async def main():
"""CLI entry point for backfill"""
import argparse
import os
parser = argparse.ArgumentParser(description="Backfill Hyperliquid historical data")
parser.add_argument("--coin", default="BTC", help="Coin symbol (default: BTC)")
parser.add_argument("--intervals", nargs="+", default=["1m"],
help="Intervals to backfill (default: 1m)")
parser.add_argument("--days", type=str, default="7",
help="Days to backfill (default: 7, use 'max' for maximum available)")
parser.add_argument("--db-host", default=os.getenv("DB_HOST", "localhost"),
help="Database host (default: localhost or DB_HOST env)")
parser.add_argument("--db-port", type=int, default=int(os.getenv("DB_PORT", 5432)),
help="Database port (default: 5432 or DB_PORT env)")
parser.add_argument("--db-name", default=os.getenv("DB_NAME", "btc_data"),
help="Database name (default: btc_data or DB_NAME env)")
parser.add_argument("--db-user", default=os.getenv("DB_USER", "btc_bot"),
help="Database user (default: btc_bot or DB_USER env)")
parser.add_argument("--db-password", default=os.getenv("DB_PASSWORD", ""),
help="Database password (default: from DB_PASSWORD env)")
args = parser.parse_args()
# Parse days argument
if args.days.lower() == "max":
days_back = 0 # 0 means max available
logger.info("Backfill mode: MAX (fetching up to 5000 candles per interval)")
else:
days_back = int(args.days)
logger.info(f"Backfill mode: Last {days_back} days")
# Setup logging
logging.basicConfig(
level=logging.INFO,
format='%(asctime)s - %(name)s - %(levelname)s - %(message)s'
)
# Initialize database
db = DatabaseManager(
host=args.db_host,
port=args.db_port,
database=args.db_name,
user=args.db_user,
password=args.db_password
)
await db.connect()
try:
async with HyperliquidBackfill(db, args.coin, args.intervals) as backfill:
results = await backfill.backfill_all_intervals(days_back)
print("\n=== Backfill Summary ===")
for interval, count in results.items():
print(f"{interval}: {count} candles")
print(f"Total: {sum(results.values())} candles")
finally:
await db.disconnect()
if __name__ == "__main__":
asyncio.run(main())
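`fetch_candles` above pages through `candleSnapshot` in batches of up to 500 candles: a short batch signals the end of available data, and otherwise the next request starts 1 ms after the last candle returned so nothing is re-fetched. A minimal sketch of that advance logic (function names are illustrative, not part of the module):

```python
from datetime import datetime, timezone

MAX_CANDLES_PER_REQUEST = 500  # per-call cap observed for candleSnapshot

def next_start_ms(last_candle_time: datetime) -> int:
    # The next window opens 1 ms past the last candle to avoid a duplicate.
    return int(last_candle_time.timestamp() * 1000) + 1

def is_final_batch(batch_size: int) -> bool:
    # A batch below the per-request cap means the API has no more data.
    return batch_size < MAX_CANDLES_PER_REQUEST
```

This keeps the pagination arithmetic in pure functions, which makes the stop condition easy to unit-test without touching the network.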


@ -0,0 +1,154 @@
"""
One-time backfill script to fill gaps in data.
Run with: python -m data_collector.backfill_gap --start "2024-01-01 09:34" --end "2024-01-01 19:39"
"""
import asyncio
import logging
import sys
import os
from datetime import datetime, timezone
from typing import Optional
sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.dirname(os.path.abspath(__file__)))))
from .database import DatabaseManager
from .backfill import HyperliquidBackfill
logging.basicConfig(
level=logging.INFO,
format='%(asctime)s - %(name)s - %(levelname)s - %(message)s'
)
logger = logging.getLogger(__name__)
INTERVALS = ["1m", "3m", "5m", "15m", "30m", "1h", "2h", "4h", "8h", "12h", "1d", "3d", "1w"]
async def backfill_gap(
start_time: datetime,
end_time: datetime,
symbol: str = "BTC",
intervals: Optional[list] = None
) -> dict:
"""
Backfill a specific time gap for all intervals.
Args:
start_time: Gap start time (UTC)
end_time: Gap end time (UTC)
symbol: Trading symbol
intervals: List of intervals to backfill (default: all standard)
Returns:
Dictionary with interval -> count mapping
"""
intervals = intervals or INTERVALS
results = {}
db = DatabaseManager()
await db.connect()
logger.info(f"Backfilling gap: {start_time} to {end_time} for {symbol}")
try:
async with HyperliquidBackfill(db, symbol, intervals) as backfill:
for interval in intervals:
try:
logger.info(f"Backfilling {interval}...")
candles = await backfill.fetch_candles(interval, start_time, end_time)
if candles:
inserted = await db.insert_candles(candles)
results[interval] = inserted
logger.info(f" {interval}: {inserted} candles inserted")
else:
results[interval] = 0
logger.warning(f" {interval}: No candles returned")
await asyncio.sleep(0.3)
except Exception as e:
logger.error(f" {interval}: Error - {e}")
results[interval] = 0
finally:
await db.disconnect()
logger.info(f"Backfill complete. Total: {sum(results.values())} candles")
return results
async def auto_detect_and_fill_gaps(symbol: str = "BTC") -> dict:
"""
Detect and fill all gaps in the database for all intervals.
"""
db = DatabaseManager()
await db.connect()
results = {}
try:
async with HyperliquidBackfill(db, symbol, INTERVALS) as backfill:
for interval in INTERVALS:
try:
# Detect gaps
gaps = await db.detect_gaps(symbol, interval)
if not gaps:
logger.info(f"{interval}: No gaps detected")
results[interval] = 0
continue
logger.info(f"{interval}: {len(gaps)} gaps detected")
total_filled = 0
for gap in gaps:
gap_start = datetime.fromisoformat(gap['gap_start'].replace('Z', '+00:00'))
gap_end = datetime.fromisoformat(gap['gap_end'].replace('Z', '+00:00'))
logger.info(f" Filling gap: {gap_start} to {gap_end}")
candles = await backfill.fetch_candles(interval, gap_start, gap_end)
if candles:
inserted = await db.insert_candles(candles)
total_filled += inserted
logger.info(f" Filled {inserted} candles")
await asyncio.sleep(0.2)
results[interval] = total_filled
except Exception as e:
logger.error(f"{interval}: Error - {e}")
results[interval] = 0
finally:
await db.disconnect()
return results
async def main():
import argparse
parser = argparse.ArgumentParser(description="Backfill gaps in BTC data")
parser.add_argument("--start", help="Start time (YYYY-MM-DD HH:MM)", default=None)
parser.add_argument("--end", help="End time (YYYY-MM-DD HH:MM)", default=None)
parser.add_argument("--auto", action="store_true", help="Auto-detect and fill all gaps")
parser.add_argument("--symbol", default="BTC", help="Symbol to backfill")
args = parser.parse_args()
if args.auto:
await auto_detect_and_fill_gaps(args.symbol)
elif args.start and args.end:
start_time = datetime.strptime(args.start, "%Y-%m-%d %H:%M").replace(tzinfo=timezone.utc)
end_time = datetime.strptime(args.end, "%Y-%m-%d %H:%M").replace(tzinfo=timezone.utc)
await backfill_gap(start_time, end_time, args.symbol)
else:
parser.print_help()
if __name__ == "__main__":
asyncio.run(main())
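The `--start`/`--end` values are parsed with `strptime` and pinned to UTC. A minimal, self-contained sketch of that step (the helper name `parse_utc` is illustrative, not part of the script):

```python
from datetime import datetime, timedelta, timezone

def parse_utc(ts: str) -> datetime:
    # Same "%Y-%m-%d %H:%M" format the CLI uses; the result is timezone-aware UTC
    return datetime.strptime(ts, "%Y-%m-%d %H:%M").replace(tzinfo=timezone.utc)

start = parse_utc("2024-01-01 09:34")
end = parse_utc("2024-01-01 19:39")
assert start.tzinfo is timezone.utc
assert end - start == timedelta(hours=10, minutes=5)
```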

src/data_collector/brain.py

@@ -0,0 +1,196 @@
"""
Brain - Simplified indicator evaluation
"""
import json
import logging
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Dict, Optional, Any, List
from .database import DatabaseManager
from .indicator_engine import IndicatorEngine
logger = logging.getLogger(__name__)
@dataclass
class Decision:
"""A single brain evaluation result"""
time: datetime
symbol: str
interval: str
decision_type: str # "buy", "sell", "hold"
strategy: str
confidence: float
price_at_decision: float
indicator_snapshot: Dict[str, Any]
candle_snapshot: Dict[str, Any]
reasoning: str
backtest_id: Optional[str] = None
def to_db_tuple(self) -> tuple:
"""Convert to positional tuple for DB insert"""
return (
self.time,
self.symbol,
self.interval,
self.decision_type,
self.strategy,
self.confidence,
self.price_at_decision,
json.dumps(self.indicator_snapshot),
json.dumps(self.candle_snapshot),
self.reasoning,
self.backtest_id,
)
class Brain:
"""
Evaluates market conditions using indicators.
Simplified version without complex strategy plug-ins.
"""
def __init__(
self,
db: DatabaseManager,
indicator_engine: IndicatorEngine,
strategy: str = "default",
):
self.db = db
self.indicator_engine = indicator_engine
self.strategy_name = strategy
logger.info("Brain initialized (Simplified)")
async def evaluate(
self,
symbol: str,
interval: str,
timestamp: datetime,
indicators: Optional[Dict[str, float]] = None,
backtest_id: Optional[str] = None,
current_position: Optional[Dict[str, Any]] = None,
) -> Decision:
"""
Evaluate market conditions and produce a decision.
"""
# Get indicator values
if indicators is None:
indicators = await self.indicator_engine.get_values_at(
symbol, interval, timestamp
)
# Get the triggering candle
candle = await self._get_candle(symbol, interval, timestamp)
if not candle:
return self._create_empty_decision(timestamp, symbol, interval, indicators, backtest_id)
price = float(candle["close"])
candle_dict = {
"time": candle["time"].isoformat(),
"open": float(candle["open"]),
"high": float(candle["high"]),
"low": float(candle["low"]),
"close": price,
"volume": float(candle["volume"]),
}
# Simple crossover logic example if needed, otherwise just return HOLD
# For now, we just return a neutral decision as "Strategies" are removed
decision = Decision(
time=timestamp,
symbol=symbol,
interval=interval,
decision_type="hold",
strategy=self.strategy_name,
confidence=0.0,
price_at_decision=price,
indicator_snapshot=indicators,
candle_snapshot=candle_dict,
reasoning="Strategy logic removed - Dashboard shows indicators",
backtest_id=backtest_id,
)
# Store to DB
await self._store_decision(decision)
return decision
def _create_empty_decision(self, timestamp, symbol, interval, indicators, backtest_id):
return Decision(
time=timestamp,
symbol=symbol,
interval=interval,
decision_type="hold",
strategy=self.strategy_name,
confidence=0.0,
price_at_decision=0.0,
indicator_snapshot=indicators or {},
candle_snapshot={},
reasoning="No candle data available",
backtest_id=backtest_id,
)
async def _get_candle(
self,
symbol: str,
interval: str,
timestamp: datetime,
) -> Optional[Dict[str, Any]]:
"""Fetch a specific candle from the database"""
async with self.db.acquire() as conn:
row = await conn.fetchrow("""
SELECT time, open, high, low, close, volume
FROM candles
WHERE symbol = $1 AND interval = $2 AND time = $3
""", symbol, interval, timestamp)
return dict(row) if row else None
async def _store_decision(self, decision: Decision) -> None:
"""Write decision to the decisions table"""
async with self.db.acquire() as conn:
await conn.execute("""
INSERT INTO decisions (
time, symbol, interval, decision_type, strategy,
confidence, price_at_decision, indicator_snapshot,
candle_snapshot, reasoning, backtest_id
)
VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9, $10, $11)
""", *decision.to_db_tuple())
async def get_recent_decisions(
self,
symbol: str,
limit: int = 20,
backtest_id: Optional[str] = None,
) -> List[Dict[str, Any]]:
"""Get recent decisions, optionally filtered by backtest_id"""
async with self.db.acquire() as conn:
if backtest_id is not None:
rows = await conn.fetch("""
SELECT time, symbol, interval, decision_type, strategy,
confidence, price_at_decision, indicator_snapshot,
candle_snapshot, reasoning, backtest_id
FROM decisions
WHERE symbol = $1 AND backtest_id = $2
ORDER BY time DESC
LIMIT $3
""", symbol, backtest_id, limit)
else:
rows = await conn.fetch("""
SELECT time, symbol, interval, decision_type, strategy,
confidence, price_at_decision, indicator_snapshot,
candle_snapshot, reasoning, backtest_id
FROM decisions
WHERE symbol = $1 AND backtest_id IS NULL
ORDER BY time DESC
LIMIT $2
""", symbol, limit)
return [dict(row) for row in rows]
def reset_state(self) -> None:
"""Reset internal state tracking"""
pass
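Decisions persist their snapshots as JSON strings inside the positional tuple. A stripped-down sketch of that round trip (a reduced `Decision` with only three fields, restated here so the example is self-contained):

```python
import json
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Any, Dict

@dataclass
class Decision:  # reduced to three fields for the sketch
    time: datetime
    decision_type: str
    indicator_snapshot: Dict[str, Any]

    def to_db_tuple(self) -> tuple:
        # Snapshots are serialized so they can land in JSON/JSONB columns
        return (self.time, self.decision_type, json.dumps(self.indicator_snapshot))

d = Decision(datetime(2024, 1, 1, tzinfo=timezone.utc), "hold", {"rsi14": 48.2})
row = d.to_db_tuple()
assert json.loads(row[2]) == {"rsi14": 48.2}
```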


@ -0,0 +1,224 @@
"""
In-memory candle buffer with automatic batching
Optimized for low memory footprint on DS218+
"""
import asyncio
import logging
from collections import deque
from datetime import datetime, timezone
from typing import Dict, List, Optional, Callable, Any, Awaitable
from dataclasses import dataclass, field
from .websocket_client import Candle
logger = logging.getLogger(__name__)
@dataclass
class BufferStats:
"""Statistics for buffer performance monitoring"""
total_added: int = 0
total_flushed: int = 0
total_dropped: int = 0
last_flush_time: Optional[datetime] = None
avg_batch_size: float = 0.0
def to_dict(self) -> Dict[str, Any]:
return {
'total_added': self.total_added,
'total_flushed': self.total_flushed,
'total_dropped': self.total_dropped,
'last_flush_time': self.last_flush_time.isoformat() if self.last_flush_time else None,
'avg_batch_size': round(self.avg_batch_size, 2)
}
class CandleBuffer:
"""
Thread-safe circular buffer for candle data
Automatically flushes to database in batches
"""
def __init__(
self,
max_size: int = 1000,
flush_interval_seconds: float = 30.0,
batch_size: int = 100,
on_flush_callback: Optional[Callable[[List[Candle]], Awaitable[None]]] = None
):
self.max_size = max_size
self.flush_interval = flush_interval_seconds
self.batch_size = batch_size
self.on_flush = on_flush_callback
# Thread-safe buffer using deque
self._buffer: deque = deque(maxlen=max_size)
self._lock = asyncio.Lock()
self._flush_event = asyncio.Event()
self._stop_event = asyncio.Event()
self.stats = BufferStats()
self._batch_sizes: deque = deque(maxlen=100) # For averaging
# Tasks
self._flush_task: Optional[asyncio.Task] = None
async def start(self) -> None:
"""Start the background flush task"""
self._flush_task = asyncio.create_task(self._flush_loop())
logger.info(f"CandleBuffer started (max_size={self.max_size}, flush_interval={self.flush_interval}s)")
async def stop(self) -> None:
"""Stop the buffer and flush remaining data"""
self._stop_event.set()
self._flush_event.set() # Wake up flush loop
if self._flush_task:
try:
await asyncio.wait_for(self._flush_task, timeout=10.0)
except asyncio.TimeoutError:
logger.warning("Flush task did not stop in time")
# Final flush
await self.flush()
logger.info("CandleBuffer stopped")
async def add(self, candle: Candle) -> bool:
"""
Add a candle to the buffer
Returns True if added, False if buffer full and candle dropped
"""
async with self._lock:
if len(self._buffer) >= self.max_size:
logger.warning(f"Buffer full, dropping oldest candle. Size: {len(self._buffer)}")
self.stats.total_dropped += 1
self._buffer.append(candle)
self.stats.total_added += 1
# Trigger immediate flush if batch size reached
if len(self._buffer) >= self.batch_size:
self._flush_event.set()
return True
async def add_many(self, candles: List[Candle]) -> int:
"""Add multiple candles to the buffer"""
added = 0
for candle in candles:
if await self.add(candle):
added += 1
return added
async def get_batch(self, n: Optional[int] = None) -> List[Candle]:
"""Get up to N candles from buffer (without removing)"""
async with self._lock:
n = n or len(self._buffer)
return list(self._buffer)[:n]
async def flush(self) -> int:
"""
Manually flush buffer to callback
Returns number of candles flushed
"""
candles_to_flush: List[Candle] = []
async with self._lock:
if not self._buffer:
return 0
candles_to_flush = list(self._buffer)
self._buffer.clear()
if candles_to_flush and self.on_flush:
try:
await self.on_flush(candles_to_flush)
# Update stats
self.stats.total_flushed += len(candles_to_flush)
self.stats.last_flush_time = datetime.now(timezone.utc)
self._batch_sizes.append(len(candles_to_flush))
self.stats.avg_batch_size = sum(self._batch_sizes) / len(self._batch_sizes)
logger.debug(f"Flushed {len(candles_to_flush)} candles")
return len(candles_to_flush)
except Exception as e:
logger.error(f"Flush callback failed: {e}")
# Put candles back in buffer
async with self._lock:
for candle in reversed(candles_to_flush):
self._buffer.appendleft(candle)
return 0
elif candles_to_flush:
# No callback, just clear
self.stats.total_flushed += len(candles_to_flush)
return len(candles_to_flush)
return 0
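The flush-and-requeue behavior can be sketched independently of the class (helper names are illustrative):

```python
import asyncio
from collections import deque

async def flush(buffer: deque, on_flush):
    # Snapshot and clear; on callback failure, re-queue preserving order
    batch = list(buffer)
    buffer.clear()
    try:
        await on_flush(batch)
        return len(batch)
    except Exception:
        # Same recovery as CandleBuffer: appendleft over reversed() restores order
        for item in reversed(batch):
            buffer.appendleft(item)
        return 0

async def failing(batch):
    raise RuntimeError("db down")

buf = deque([1, 2, 3])
assert asyncio.run(flush(buf, failing)) == 0
assert list(buf) == [1, 2, 3]  # order preserved after the failed flush
```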
async def _flush_loop(self) -> None:
"""Background task to periodically flush buffer"""
while not self._stop_event.is_set():
try:
# Wait for flush interval or until triggered
await asyncio.wait_for(
self._flush_event.wait(),
timeout=self.flush_interval
)
self._flush_event.clear()
# Flush if we have data
buffer_size = await self.get_buffer_size()
if buffer_size > 0:
await self.flush()
except asyncio.TimeoutError:
# Flush interval reached, flush if we have data
buffer_size = await self.get_buffer_size()
if buffer_size > 0:
await self.flush()
except Exception as e:
logger.error(f"Error in flush loop: {e}")
await asyncio.sleep(1)
def get_stats(self) -> BufferStats:
"""Get current buffer statistics"""
return self.stats
async def get_buffer_size(self) -> int:
"""Get current buffer size"""
async with self._lock:
return len(self._buffer)
def detect_gaps(self, candles: List[Candle]) -> List[Dict[str, Any]]:
"""
Detect gaps in candle sequence
Returns list of gap information
"""
if len(candles) < 2:
return []
gaps = []
sorted_candles = sorted(candles, key=lambda c: c.time)
for i in range(1, len(sorted_candles)):
prev = sorted_candles[i-1]
curr = sorted_candles[i]
# Calculate expected interval (1 minute)
expected_diff = 60 # seconds
actual_diff = (curr.time - prev.time).total_seconds()
if actual_diff > expected_diff * 1.5: # Allow 50% tolerance
gaps.append({
'from_time': prev.time.isoformat(),
'to_time': curr.time.isoformat(),
'missing_candles': int(actual_diff / expected_diff) - 1,
'duration_seconds': actual_diff
})
return gaps
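The same step-size check can be exercised on bare timestamps. A minimal sketch (the free function mirrors the buffer's logic; names are illustrative):

```python
from datetime import datetime, timedelta, timezone

def detect_gaps(times, expected=60, tolerance=1.5):
    # Flags any step larger than expected * tolerance, like the buffer's check
    times = sorted(times)
    gaps = []
    for prev, curr in zip(times, times[1:]):
        diff = (curr - prev).total_seconds()
        if diff > expected * tolerance:
            gaps.append({"missing_candles": int(diff / expected) - 1,
                         "duration_seconds": diff})
    return gaps

t0 = datetime(2024, 1, 1, tzinfo=timezone.utc)
times = [t0, t0 + timedelta(minutes=1), t0 + timedelta(minutes=5)]
# minute 1 -> minute 5 skips minutes 2, 3 and 4
assert detect_gaps(times) == [{"missing_candles": 3, "duration_seconds": 240.0}]
```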


@@ -0,0 +1,401 @@
"""
Custom Timeframe Generator
Generates both standard and custom timeframes from 1m data
Updates "building" candles in real-time
"""
import asyncio
import logging
import calendar
from datetime import datetime, timedelta, timezone
from typing import List, Optional, Dict, Tuple
from dataclasses import dataclass
from .database import DatabaseManager
from .websocket_client import Candle
logger = logging.getLogger(__name__)
@dataclass
class CustomCandle(Candle):
"""Extended candle with completion flag"""
is_complete: bool = True
class CustomTimeframeGenerator:
"""
Manages and generates multiple timeframes from 1m candles.
Standard intervals use clock-aligned boundaries.
Custom intervals use continuous bucketing from the first recorded 1m candle.
"""
# Standard intervals (Hyperliquid supported)
STANDARD_INTERVALS = {
'3m': {'type': 'min', 'value': 3},
'5m': {'type': 'min', 'value': 5},
'15m': {'type': 'min', 'value': 15},
'30m': {'type': 'min', 'value': 30},
'1h': {'type': 'hour', 'value': 1},
'2h': {'type': 'hour', 'value': 2},
'4h': {'type': 'hour', 'value': 4},
'8h': {'type': 'hour', 'value': 8},
'12h': {'type': 'hour', 'value': 12},
'1d': {'type': 'day', 'value': 1},
'3d': {'type': 'day', 'value': 3},
'1w': {'type': 'week', 'value': 1},
'1M': {'type': 'month', 'value': 1}
}
# Custom intervals
CUSTOM_INTERVALS = {
'37m': {'minutes': 37, 'source': '1m'},
'148m': {'minutes': 148, 'source': '37m'}
}
def __init__(self, db: DatabaseManager):
self.db = db
self.first_1m_time: Optional[datetime] = None
# Anchor for 3d candles (fixed date)
self.anchor_3d = datetime(2020, 1, 1, tzinfo=timezone.utc)
async def initialize(self) -> None:
"""Get first 1m timestamp for custom continuous bucketing"""
async with self.db.acquire() as conn:
first = await conn.fetchval("""
SELECT MIN(time)
FROM candles
WHERE interval = '1m' AND symbol = 'BTC'
""")
if first:
self.first_1m_time = first
logger.info(f"TF Generator: First 1m candle at {first}")
else:
logger.warning("TF Generator: No 1m data found")
def get_bucket_start(self, timestamp: datetime, interval: str) -> datetime:
"""Calculate bucket start time for any interval"""
# Handle custom intervals
if interval in self.CUSTOM_INTERVALS:
if not self.first_1m_time:
return timestamp # Fallback if not initialized
minutes = self.CUSTOM_INTERVALS[interval]['minutes']
delta = timestamp - self.first_1m_time
bucket_num = int(delta.total_seconds() // (minutes * 60))
return self.first_1m_time + timedelta(minutes=bucket_num * minutes)
# Handle standard intervals
if interval not in self.STANDARD_INTERVALS:
return timestamp
cfg = self.STANDARD_INTERVALS[interval]
t = timestamp.replace(second=0, microsecond=0)
if cfg['type'] == 'min':
n = cfg['value']
return t - timedelta(minutes=t.minute % n)
elif cfg['type'] == 'hour':
n = cfg['value']
t = t.replace(minute=0)
return t - timedelta(hours=t.hour % n)
elif cfg['type'] == 'day':
n = cfg['value']
t = t.replace(hour=0, minute=0)
if n == 1:
return t
else: # 3d
days_since_anchor = (t - self.anchor_3d).days
return t - timedelta(days=days_since_anchor % n)
elif cfg['type'] == 'week':
t = t.replace(hour=0, minute=0)
return t - timedelta(days=t.weekday()) # Monday start
elif cfg['type'] == 'month':
return t.replace(day=1, hour=0, minute=0)
return t
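The two bucketing modes behave differently: standard intervals floor to the clock, while custom intervals count fixed-width buckets from the first recorded 1m candle. A minimal sketch of both, with illustrative helper names:

```python
from datetime import datetime, timedelta, timezone

def minute_bucket(t: datetime, n: int) -> datetime:
    # Clock-aligned: floor the minute down to a multiple of n
    t = t.replace(second=0, microsecond=0)
    return t - timedelta(minutes=t.minute % n)

def continuous_bucket(t: datetime, anchor: datetime, minutes: int) -> datetime:
    # Continuous: fixed-width buckets counted from the anchor (first 1m candle)
    bucket_num = int((t - anchor).total_seconds() // (minutes * 60))
    return anchor + timedelta(minutes=bucket_num * minutes)

ts = datetime(2024, 5, 1, 10, 47, tzinfo=timezone.utc)
assert minute_bucket(ts, 15) == datetime(2024, 5, 1, 10, 45, tzinfo=timezone.utc)

anchor = datetime(2024, 1, 1, tzinfo=timezone.utc)
# 80 minutes after the anchor falls in the third 37m bucket, which starts at +74m
assert continuous_bucket(anchor + timedelta(minutes=80), anchor, 37) == anchor + timedelta(minutes=74)
```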
def get_expected_1m_count(self, bucket_start: datetime, interval: str) -> int:
"""Calculate expected number of 1m candles in a full bucket"""
if interval in self.CUSTOM_INTERVALS:
return self.CUSTOM_INTERVALS[interval]['minutes']
if interval in self.STANDARD_INTERVALS:
cfg = self.STANDARD_INTERVALS[interval]
if cfg['type'] == 'min': return cfg['value']
if cfg['type'] == 'hour': return cfg['value'] * 60
if cfg['type'] == 'day': return cfg['value'] * 1440
if cfg['type'] == 'week': return 7 * 1440
if cfg['type'] == 'month':
_, days = calendar.monthrange(bucket_start.year, bucket_start.month)
return days * 1440
return 1
async def aggregate_and_upsert(self, symbol: str, interval: str, bucket_start: datetime, conn=None) -> None:
"""Aggregate 1m data for a specific bucket and upsert"""
        bucket_end = bucket_start  # placeholder, recomputed below
if interval == '148m':
# Aggregate from 37m
source_interval = '37m'
expected_count = 4
else:
source_interval = '1m'
expected_count = self.get_expected_1m_count(bucket_start, interval)
# Calculate bucket end
if interval == '1M':
_, days = calendar.monthrange(bucket_start.year, bucket_start.month)
bucket_end = bucket_start + timedelta(days=days)
elif interval in self.STANDARD_INTERVALS:
cfg = self.STANDARD_INTERVALS[interval]
if cfg['type'] == 'min': bucket_end = bucket_start + timedelta(minutes=cfg['value'])
elif cfg['type'] == 'hour': bucket_end = bucket_start + timedelta(hours=cfg['value'])
elif cfg['type'] == 'day': bucket_end = bucket_start + timedelta(days=cfg['value'])
elif cfg['type'] == 'week': bucket_end = bucket_start + timedelta(weeks=1)
elif interval in self.CUSTOM_INTERVALS:
minutes = self.CUSTOM_INTERVALS[interval]['minutes']
bucket_end = bucket_start + timedelta(minutes=minutes)
else:
bucket_end = bucket_start + timedelta(minutes=1)
# Use provided connection or acquire a new one
if conn is None:
async with self.db.acquire() as connection:
await self._process_aggregation(connection, symbol, interval, source_interval, bucket_start, bucket_end, expected_count)
else:
await self._process_aggregation(conn, symbol, interval, source_interval, bucket_start, bucket_end, expected_count)
async def _process_aggregation(self, conn, symbol, interval, source_interval, bucket_start, bucket_end, expected_count):
"""Internal method to perform aggregation using a specific connection"""
rows = await conn.fetch(f"""
SELECT time, open, high, low, close, volume
FROM candles
WHERE symbol = $1 AND interval = $2
AND time >= $3 AND time < $4
ORDER BY time ASC
""", symbol, source_interval, bucket_start, bucket_end)
if not rows:
return
# Aggregate
is_complete = len(rows) >= expected_count
candle = CustomCandle(
time=bucket_start,
symbol=symbol,
interval=interval,
open=float(rows[0]['open']),
high=max(float(r['high']) for r in rows),
low=min(float(r['low']) for r in rows),
close=float(rows[-1]['close']),
volume=sum(float(r['volume']) for r in rows),
is_complete=is_complete
)
await self._upsert_candle(candle, conn)
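The OHLCV roll-up itself is plain aggregation over the source rows: first open, max high, min low, last close, summed volume. A self-contained sketch:

```python
def aggregate(rows):
    # OHLCV roll-up, same shape as _process_aggregation's candle construction
    return {
        "open": rows[0]["open"],
        "high": max(r["high"] for r in rows),
        "low": min(r["low"] for r in rows),
        "close": rows[-1]["close"],
        "volume": sum(r["volume"] for r in rows),
    }

rows = [
    {"open": 100.0, "high": 105.0, "low": 99.0, "close": 104.0, "volume": 10.0},
    {"open": 104.0, "high": 110.0, "low": 103.0, "close": 108.0, "volume": 7.0},
]
assert aggregate(rows) == {"open": 100.0, "high": 110.0, "low": 99.0,
                           "close": 108.0, "volume": 17.0}
```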
async def _upsert_candle(self, c: CustomCandle, conn=None) -> None:
"""Upsert a single candle using provided connection or acquiring a new one"""
query = """
INSERT INTO candles (time, symbol, interval, open, high, low, close, volume, validated)
VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9)
ON CONFLICT (time, symbol, interval) DO UPDATE SET
open = EXCLUDED.open,
high = EXCLUDED.high,
low = EXCLUDED.low,
close = EXCLUDED.close,
volume = EXCLUDED.volume,
validated = EXCLUDED.validated,
created_at = NOW()
"""
values = (c.time, c.symbol, c.interval, c.open, c.high, c.low, c.close, c.volume, c.is_complete)
if conn is None:
async with self.db.acquire() as connection:
await connection.execute(query, *values)
else:
await conn.execute(query, *values)
async def update_realtime(self, new_1m_candles: List[Candle]) -> None:
"""
Update ALL timeframes (standard and custom) based on new 1m candles.
Called after 1m buffer flush.
Uses a single connection for all updates sequentially to prevent pool exhaustion.
"""
if not new_1m_candles:
return
if not self.first_1m_time:
await self.initialize()
if not self.first_1m_time:
return
symbol = new_1m_candles[0].symbol
async with self.db.acquire() as conn:
            # 1. Update all standard intervals + 37m sequentially
            # (sequential processing is required because all updates share the same connection)
intervals_to_update = list(self.STANDARD_INTERVALS.keys()) + ['37m']
for interval in intervals_to_update:
try:
bucket_start = self.get_bucket_start(new_1m_candles[-1].time, interval)
await self.aggregate_and_upsert(symbol, interval, bucket_start, conn=conn)
except Exception as e:
logger.error(f"Error updating interval {interval}: {e}")
# 2. Update 148m (it depends on 37m being updated first)
try:
bucket_148m = self.get_bucket_start(new_1m_candles[-1].time, '148m')
await self.aggregate_and_upsert(symbol, '148m', bucket_148m, conn=conn)
except Exception as e:
logger.error(f"Error updating interval 148m: {e}")
async def generate_historical(self, interval: str, batch_size: int = 5000) -> int:
"""
Force recalculation of all candles for a timeframe from 1m data.
"""
if not self.first_1m_time:
await self.initialize()
if not self.first_1m_time:
return 0
config = self.CUSTOM_INTERVALS.get(interval) or {'source': '1m'}
source_interval = config.get('source', '1m')
logger.info(f"Generating historical {interval} from {source_interval}...")
async with self.db.acquire() as conn:
min_max = await conn.fetchrow("""
SELECT MIN(time), MAX(time) FROM candles
WHERE symbol = 'BTC' AND interval = $1
""", source_interval)
if not min_max or not min_max[0]:
return 0
curr = self.get_bucket_start(min_max[0], interval)
end = min_max[1]
total_inserted = 0
while curr <= end:
await self.aggregate_and_upsert('BTC', interval, curr)
total_inserted += 1
if interval == '1M':
_, days = calendar.monthrange(curr.year, curr.month)
curr += timedelta(days=days)
elif interval in self.STANDARD_INTERVALS:
cfg = self.STANDARD_INTERVALS[interval]
if cfg['type'] == 'min': curr += timedelta(minutes=cfg['value'])
elif cfg['type'] == 'hour': curr += timedelta(hours=cfg['value'])
elif cfg['type'] == 'day': curr += timedelta(days=cfg['value'])
elif cfg['type'] == 'week': curr += timedelta(weeks=1)
else:
minutes = self.CUSTOM_INTERVALS[interval]['minutes']
curr += timedelta(minutes=minutes)
if total_inserted % 100 == 0:
logger.info(f"Generated {total_inserted} {interval} candles...")
await asyncio.sleep(0.01)
return total_inserted
async def generate_from_gap(self, interval: str) -> int:
"""
Generate candles only from where they're missing.
Compares source interval max time with target interval max time.
"""
if not self.first_1m_time:
await self.initialize()
if not self.first_1m_time:
return 0
config = self.CUSTOM_INTERVALS.get(interval) or {'source': '1m'}
source_interval = config.get('source', '1m')
async with self.db.acquire() as conn:
# Get source range
source_min_max = await conn.fetchrow("""
SELECT MIN(time), MAX(time) FROM candles
WHERE symbol = 'BTC' AND interval = $1
""", source_interval)
if not source_min_max or not source_min_max[1]:
return 0
# Get target (this interval) max time
target_max = await conn.fetchval("""
SELECT MAX(time) FROM candles
WHERE symbol = 'BTC' AND interval = $1
""", interval)
source_max = source_min_max[1]
if target_max:
# Start from next bucket after target_max
curr = self.get_bucket_start(target_max, interval)
if interval in self.CUSTOM_INTERVALS:
minutes = self.CUSTOM_INTERVALS[interval]['minutes']
curr = curr + timedelta(minutes=minutes)
elif interval in self.STANDARD_INTERVALS:
cfg = self.STANDARD_INTERVALS[interval]
if cfg['type'] == 'min': curr = curr + timedelta(minutes=cfg['value'])
elif cfg['type'] == 'hour': curr = curr + timedelta(hours=cfg['value'])
elif cfg['type'] == 'day': curr = curr + timedelta(days=cfg['value'])
elif cfg['type'] == 'week': curr = curr + timedelta(weeks=1)
else:
# No target data, start from source min
curr = self.get_bucket_start(source_min_max[0], interval)
end = source_max
if curr > end:
logger.info(f"{interval}: Already up to date (target: {target_max}, source: {source_max})")
return 0
logger.info(f"Generating {interval} from {curr} to {end}...")
total_inserted = 0
while curr <= end:
await self.aggregate_and_upsert('BTC', interval, curr)
total_inserted += 1
if interval == '1M':
_, days = calendar.monthrange(curr.year, curr.month)
curr += timedelta(days=days)
elif interval in self.STANDARD_INTERVALS:
cfg = self.STANDARD_INTERVALS[interval]
if cfg['type'] == 'min': curr += timedelta(minutes=cfg['value'])
elif cfg['type'] == 'hour': curr += timedelta(hours=cfg['value'])
elif cfg['type'] == 'day': curr += timedelta(days=cfg['value'])
elif cfg['type'] == 'week': curr += timedelta(weeks=1)
else:
minutes = self.CUSTOM_INTERVALS[interval]['minutes']
curr += timedelta(minutes=minutes)
if total_inserted % 50 == 0:
logger.info(f"Generated {total_inserted} {interval} candles...")
await asyncio.sleep(0.01)
logger.info(f"{interval}: Generated {total_inserted} candles")
return total_inserted
async def verify_integrity(self, interval: str) -> Dict:
async with self.db.acquire() as conn:
stats = await conn.fetchrow("""
SELECT
COUNT(*) as total_candles,
MIN(time) as earliest,
MAX(time) as latest,
COUNT(*) FILTER (WHERE validated = TRUE) as complete_candles,
COUNT(*) FILTER (WHERE validated = FALSE) as incomplete_candles
FROM candles
WHERE interval = $1 AND symbol = 'BTC'
""", interval)
return dict(stats) if stats else {}


@@ -0,0 +1,261 @@
"""
Database interface for TimescaleDB
Optimized for batch inserts and low resource usage
"""
import asyncio
import logging
from contextlib import asynccontextmanager
from datetime import datetime
from typing import List, Dict, Any, Optional
import os
import asyncpg
from asyncpg import Pool
from .websocket_client import Candle
logger = logging.getLogger(__name__)
class DatabaseManager:
"""Manages TimescaleDB connections and operations"""
def __init__(
self,
host: str = None,
port: int = None,
database: str = None,
user: str = None,
password: str = None,
pool_size: int = 20
):
self.host = host or os.getenv('DB_HOST', 'localhost')
self.port = port or int(os.getenv('DB_PORT', 5432))
self.database = database or os.getenv('DB_NAME', 'btc_data')
self.user = user or os.getenv('DB_USER', 'btc_bot')
self.password = password or os.getenv('DB_PASSWORD', '')
self.pool_size = int(os.getenv('DB_POOL_SIZE', pool_size))
self.pool: Optional[Pool] = None
async def connect(self) -> None:
"""Initialize connection pool"""
try:
self.pool = await asyncpg.create_pool(
host=self.host,
port=self.port,
database=self.database,
user=self.user,
password=self.password,
min_size=2,
max_size=self.pool_size,
command_timeout=60,
max_inactive_connection_lifetime=300
)
# Test connection
async with self.acquire() as conn:
version = await conn.fetchval('SELECT version()')
logger.info(f"Connected to database: {version[:50]}...")
logger.info(f"Database pool created (min: 2, max: {self.pool_size})")
except Exception as e:
logger.error(f"Failed to connect to database: {type(e).__name__}: {e!r}")
raise
async def disconnect(self) -> None:
"""Close connection pool"""
if self.pool:
await self.pool.close()
logger.info("Database pool closed")
@asynccontextmanager
async def acquire(self, timeout: float = 30.0):
"""Context manager for acquiring connection with timeout"""
if not self.pool:
raise RuntimeError("Database not connected")
try:
async with self.pool.acquire(timeout=timeout) as conn:
yield conn
except asyncio.TimeoutError:
logger.error(f"Database connection acquisition timed out after {timeout}s")
raise
async def insert_candles(self, candles: List[Candle]) -> int:
"""
Batch insert candles into database
Uses ON CONFLICT to handle duplicates
"""
if not candles:
return 0
# Prepare values for batch insert
values = [
(
c.time,
c.symbol,
c.interval,
c.open,
c.high,
c.low,
c.close,
c.volume,
False, # validated
'hyperliquid' # source
)
for c in candles
]
async with self.acquire() as conn:
            # Use executemany for an efficient batch insert (asyncpg returns no row count)
            await conn.executemany('''
INSERT INTO candles (time, symbol, interval, open, high, low, close, volume, validated, source)
VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9, $10)
ON CONFLICT (time, symbol, interval)
DO UPDATE SET
open = EXCLUDED.open,
high = EXCLUDED.high,
low = EXCLUDED.low,
close = EXCLUDED.close,
volume = EXCLUDED.volume,
source = EXCLUDED.source
''', values)
inserted = len(candles)
logger.debug(f"Inserted/updated {inserted} candles")
return inserted
async def get_candles(
self,
symbol: str,
interval: str,
start: Optional[datetime] = None,
end: Optional[datetime] = None,
limit: int = 1000
) -> List[Dict[str, Any]]:
"""Query candles from database"""
query = '''
SELECT time, symbol, interval, open, high, low, close, volume, validated
FROM candles
WHERE symbol = $1 AND interval = $2
'''
params = [symbol, interval]
        if start:
            query += f' AND time >= ${len(params) + 1}'
            params.append(start)
if end:
query += f' AND time <= ${len(params) + 1}'
params.append(end)
query += f' ORDER BY time DESC LIMIT ${len(params) + 1}'
params.append(limit)
async with self.acquire() as conn:
rows = await conn.fetch(query, *params)
return [dict(row) for row in rows]
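The dynamic placeholder numbering can be sketched outside the database layer (query text abbreviated, names illustrative):

```python
def build_query(start=None, end=None, limit=1000):
    # Placeholder numbers track len(params), as get_candles does
    query = ("SELECT time, open, high, low, close, volume FROM candles "
             "WHERE symbol = $1 AND interval = $2")
    params = ["BTC", "1m"]
    if start is not None:
        params.append(start)
        query += f" AND time >= ${len(params)}"
    if end is not None:
        params.append(end)
        query += f" AND time <= ${len(params)}"
    params.append(limit)
    query += f" ORDER BY time DESC LIMIT ${len(params)}"
    return query, params

q, p = build_query(start="2024-01-01")
assert "AND time >= $3" in q and "LIMIT $4" in q and len(p) == 4
```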
async def get_latest_candle(self, symbol: str, interval: str) -> Optional[Dict[str, Any]]:
"""Get the most recent candle for a symbol"""
async with self.acquire() as conn:
row = await conn.fetchrow('''
SELECT time, symbol, interval, open, high, low, close, volume
FROM candles
WHERE symbol = $1 AND interval = $2
ORDER BY time DESC
LIMIT 1
''', symbol, interval)
return dict(row) if row else None
async def detect_gaps(
self,
symbol: str,
interval: str,
since: Optional[datetime] = None
) -> List[Dict[str, Any]]:
"""
Detect missing candles in the database
Uses SQL window functions for efficiency
"""
since = since or datetime.utcnow().replace(hour=0, minute=0, second=0, microsecond=0)
async with self.acquire() as conn:
# Find gaps using lead/lag window functions
rows = await conn.fetch('''
WITH ordered AS (
SELECT
time,
LAG(time) OVER (ORDER BY time) as prev_time
FROM candles
WHERE symbol = $1
AND interval = $2
AND time >= $3
ORDER BY time
)
SELECT
prev_time as gap_start,
time as gap_end,
EXTRACT(EPOCH FROM (time - prev_time)) / 60 - 1 as missing_candles
FROM ordered
WHERE time - prev_time > INTERVAL '2 minutes'
ORDER BY prev_time
''', symbol, interval, since)
return [
{
'gap_start': row['gap_start'].isoformat(),
'gap_end': row['gap_end'].isoformat(),
'missing_candles': int(row['missing_candles'])
}
for row in rows
]
async def log_quality_issue(
self,
check_type: str,
severity: str,
symbol: Optional[str] = None,
details: Optional[Dict[str, Any]] = None
) -> None:
"""Log a data quality issue"""
async with self.acquire() as conn:
await conn.execute('''
INSERT INTO data_quality (check_type, severity, symbol, details)
VALUES ($1, $2, $3, $4)
''', check_type, severity, symbol, details)
logger.warning(f"Quality issue logged: {check_type} ({severity})")
async def get_health_stats(self) -> Dict[str, Any]:
"""Get database health statistics"""
async with self.acquire() as conn:
# Get table sizes
table_stats = await conn.fetch('''
SELECT
relname as table_name,
pg_size_pretty(pg_total_relation_size(relid)) as size,
n_live_tup as row_count
FROM pg_stat_user_tables
WHERE relname IN ('candles', 'indicators', 'data_quality')
''')
# Get latest candles
latest = await conn.fetch('''
SELECT symbol, MAX(time) as last_time, COUNT(*) as count
FROM candles
WHERE time > NOW() - INTERVAL '24 hours'
GROUP BY symbol
''')
return {
'tables': [dict(row) for row in table_stats],
'latest_candles': [dict(row) for row in latest],
'unresolved_issues': await conn.fetchval('''
SELECT COUNT(*) FROM data_quality WHERE resolved = FALSE
''')
}


@@ -0,0 +1,285 @@
"""
Indicator Engine - Computes and stores technical indicators
Stateless DB-backed design: same code for live updates and backtesting
"""
import asyncio
import json
import logging
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Dict, List, Optional, Any
from .database import DatabaseManager
logger = logging.getLogger(__name__)
@dataclass
class IndicatorConfig:
"""Configuration for a single indicator"""
name: str # e.g., "ma44"
type: str # e.g., "sma"
period: int # e.g., 44
intervals: List[str] # e.g., ["37m", "148m", "1d"]
@classmethod
def from_dict(cls, name: str, data: Dict[str, Any]) -> "IndicatorConfig":
"""Create config from YAML dict entry"""
return cls(
name=name,
type=data["type"],
period=data["period"],
intervals=data["intervals"],
)
@dataclass
class IndicatorResult:
"""Result of a single indicator computation"""
name: str
value: Optional[float]
period: int
timestamp: datetime
class IndicatorEngine:
"""
Computes technical indicators from candle data in the database.
Two modes, same math:
- on_interval_update(): called by live system after higher-TF candle update
- compute_at(): called by backtester for a specific point in time
Both query the DB for the required candle history and store results.
"""
def __init__(self, db: DatabaseManager, configs: List[IndicatorConfig]):
self.db = db
self.configs = configs
# Build lookup: interval -> list of configs that need computation
self._interval_configs: Dict[str, List[IndicatorConfig]] = {}
for cfg in configs:
for interval in cfg.intervals:
if interval not in self._interval_configs:
self._interval_configs[interval] = []
self._interval_configs[interval].append(cfg)
logger.info(
f"IndicatorEngine initialized with {len(configs)} indicators "
f"across intervals: {list(self._interval_configs.keys())}"
)
def get_configured_intervals(self) -> List[str]:
"""Return all intervals that have indicators configured"""
return list(self._interval_configs.keys())
async def on_interval_update(
self,
symbol: str,
interval: str,
timestamp: datetime,
) -> Dict[str, Optional[float]]:
"""
Compute all indicators configured for this interval.
Called by main.py after CustomTimeframeGenerator updates a higher TF.
Returns dict of indicator_name -> value (for use by Brain).
"""
configs = self._interval_configs.get(interval, [])
if not configs:
return {}
return await self._compute_and_store(symbol, interval, timestamp, configs)
async def compute_at(
self,
symbol: str,
interval: str,
timestamp: datetime,
) -> Dict[str, Optional[float]]:
"""
Compute indicators at a specific point in time.
Alias for on_interval_update -- used by backtester for clarity.
"""
return await self.on_interval_update(symbol, interval, timestamp)
async def compute_historical(
self,
symbol: str,
interval: str,
start: datetime,
end: datetime,
) -> int:
"""
Batch-compute indicators for a time range.
Iterates over every candle timestamp in [start, end] and computes.
Returns total number of indicator values stored.
"""
configs = self._interval_configs.get(interval, [])
if not configs:
logger.warning(f"No indicators configured for interval {interval}")
return 0
# Get all candle timestamps in range
async with self.db.acquire() as conn:
rows = await conn.fetch("""
SELECT time FROM candles
WHERE symbol = $1 AND interval = $2
AND time >= $3 AND time <= $4
ORDER BY time ASC
""", symbol, interval, start, end)
if not rows:
logger.warning(f"No candles found for {symbol}/{interval} in range")
return 0
timestamps = [row["time"] for row in rows]
total_stored = 0
logger.info(
f"Computing {len(configs)} indicators across "
f"{len(timestamps)} {interval} candles..."
)
for i, ts in enumerate(timestamps):
results = await self._compute_and_store(symbol, interval, ts, configs)
total_stored += sum(1 for v in results.values() if v is not None)
if (i + 1) % 100 == 0:
logger.info(f"Progress: {i + 1}/{len(timestamps)} candles processed")
await asyncio.sleep(0.01) # Yield to event loop
logger.info(
f"Historical compute complete: {total_stored} indicator values "
f"stored for {interval}"
)
return total_stored
async def _compute_and_store(
self,
symbol: str,
interval: str,
timestamp: datetime,
configs: List[IndicatorConfig],
) -> Dict[str, Optional[float]]:
"""Core computation: fetch candles, compute indicators, store results"""
# Determine max lookback needed
max_period = max(cfg.period for cfg in configs)
# Fetch enough candles for the longest indicator
async with self.db.acquire() as conn:
rows = await conn.fetch("""
SELECT time, open, high, low, close, volume
FROM candles
WHERE symbol = $1 AND interval = $2
AND time <= $3
ORDER BY time DESC
LIMIT $4
""", symbol, interval, timestamp, max_period)
if not rows:
return {cfg.name: None for cfg in configs}
# Reverse to chronological order
candles = list(reversed(rows))
closes = [float(c["close"]) for c in candles]
# Compute each indicator
results: Dict[str, Optional[float]] = {}
values_to_store: List[tuple] = []
for cfg in configs:
value = self._compute_indicator(cfg, closes)
results[cfg.name] = value
if value is not None:
values_to_store.append((
timestamp,
symbol,
interval,
cfg.name,
value,
json.dumps({"type": cfg.type, "period": cfg.period}),
))
# Batch upsert all computed values
if values_to_store:
async with self.db.acquire() as conn:
await conn.executemany("""
INSERT INTO indicators (time, symbol, interval, indicator_name, value, parameters)
VALUES ($1, $2, $3, $4, $5, $6)
ON CONFLICT (time, symbol, interval, indicator_name)
DO UPDATE SET
value = EXCLUDED.value,
parameters = EXCLUDED.parameters,
computed_at = NOW()
""", values_to_store)
logger.debug(
f"Stored {len(values_to_store)} indicator values for "
f"{symbol}/{interval} at {timestamp}"
)
return results
def _compute_indicator(
self,
config: IndicatorConfig,
closes: List[float],
) -> Optional[float]:
"""Dispatch to the correct computation function"""
if config.type == "sma":
return self.compute_sma(closes, config.period)
else:
logger.warning(f"Unknown indicator type: {config.type}")
return None
# ── Pure math functions (no DB, no async, easily testable) ──────────
@staticmethod
def compute_sma(closes: List[float], period: int) -> Optional[float]:
"""Simple Moving Average over the last `period` closes"""
if len(closes) < period:
return None
return sum(closes[-period:]) / period
async def get_latest_values(
self,
symbol: str,
interval: str,
) -> Dict[str, float]:
"""
Get the most recent indicator values for a symbol/interval.
Used by Brain to read current state.
"""
async with self.db.acquire() as conn:
rows = await conn.fetch("""
SELECT DISTINCT ON (indicator_name)
indicator_name, value, time
FROM indicators
WHERE symbol = $1 AND interval = $2
ORDER BY indicator_name, time DESC
""", symbol, interval)
return {row["indicator_name"]: float(row["value"]) for row in rows}
async def get_values_at(
self,
symbol: str,
interval: str,
timestamp: datetime,
) -> Dict[str, float]:
"""
Get indicator values at a specific timestamp.
Used by Brain during backtesting.
"""
async with self.db.acquire() as conn:
rows = await conn.fetch("""
SELECT indicator_name, value
FROM indicators
WHERE symbol = $1 AND interval = $2 AND time = $3
""", symbol, interval, timestamp)
return {row["indicator_name"]: float(row["value"]) for row in rows}
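Only `sma` is dispatched so far. Its warm-up contract — `None` until at least `period` closes exist, then the mean of the last `period` closes — can be checked standalone:

```python
from typing import List, Optional

def compute_sma(closes: List[float], period: int) -> Optional[float]:
    """Same pure function as IndicatorEngine.compute_sma."""
    if len(closes) < period:
        return None
    return sum(closes[-period:]) / period

print(compute_sma([1.0, 2.0], 3))            # → None (not enough history)
print(compute_sma([1.0, 2.0, 3.0, 4.0], 3))  # → 3.0 (mean of last 3 closes)
```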

src/data_collector/main.py Normal file

@@ -0,0 +1,440 @@
"""
Main entry point for data collector service
Integrates WebSocket client, buffer, database, indicators, and brain
"""
import asyncio
import logging
import signal
import sys
from datetime import datetime, timezone
from typing import Optional, List
import os
import yaml
from .websocket_client import HyperliquidWebSocket, Candle
from .candle_buffer import CandleBuffer
from .database import DatabaseManager
from .custom_timeframe_generator import CustomTimeframeGenerator
from .indicator_engine import IndicatorEngine, IndicatorConfig
from .brain import Brain
from .backfill import HyperliquidBackfill
# Configure logging
logging.basicConfig(
level=getattr(logging, os.getenv('LOG_LEVEL', 'INFO')),
format='%(asctime)s - %(name)s - %(levelname)s - %(message)s',
handlers=[
logging.StreamHandler(sys.stdout),
logging.FileHandler('/app/logs/collector.log') if os.path.exists('/app/logs') else logging.StreamHandler()
]
)
logger = logging.getLogger(__name__)
class DataCollector:
"""
Main data collection orchestrator
Manages WebSocket connection, buffering, and database writes
"""
STANDARD_INTERVALS = ["1m", "3m", "5m", "15m", "30m", "1h", "2h", "4h", "8h", "12h", "1d", "3d", "1w"]
def __init__(
self,
symbol: str = "BTC",
interval: str = "1m"
):
self.symbol = symbol
self.interval = interval
# Components
self.db: Optional[DatabaseManager] = None
self.buffer: Optional[CandleBuffer] = None
self.websocket: Optional[HyperliquidWebSocket] = None
self.custom_tf_generator: Optional[CustomTimeframeGenerator] = None
self.indicator_engine: Optional[IndicatorEngine] = None
self.brain: Optional[Brain] = None
# State
self.is_running = False
self._stop_event = asyncio.Event()
self._tasks = []
async def start(self) -> None:
"""Initialize and start all components"""
logger.info(f"Starting DataCollector for {self.symbol}")
try:
# Initialize database
self.db = DatabaseManager()
await self.db.connect()
# Run startup backfill for all intervals
await self._startup_backfill()
# Initialize custom timeframe generator
self.custom_tf_generator = CustomTimeframeGenerator(self.db)
await self.custom_tf_generator.initialize()
# Regenerate custom timeframes after startup backfill
await self._regenerate_custom_timeframes()
# Initialize indicator engine
# Hardcoded config for now, eventually load from yaml
indicator_configs = [
IndicatorConfig("ma44", "sma", 44, ["37m", "148m", "1d"]),
IndicatorConfig("ma125", "sma", 125, ["37m", "148m", "1d"])
]
self.indicator_engine = IndicatorEngine(self.db, indicator_configs)
# Initialize brain
self.brain = Brain(self.db, self.indicator_engine)
# Initialize buffer
self.buffer = CandleBuffer(
max_size=1000,
flush_interval_seconds=30,
batch_size=100,
on_flush_callback=self._on_buffer_flush
)
await self.buffer.start()
# Initialize WebSocket client
self.websocket = HyperliquidWebSocket(
symbol=self.symbol,
interval=self.interval,
on_candle_callback=self._on_candle,
on_error_callback=self._on_error
)
# Setup signal handlers
self._setup_signal_handlers()
# Connect to WebSocket
await self.websocket.connect()
# Start main loops
self.is_running = True
self._tasks = [
asyncio.create_task(self.websocket.receive_loop()),
asyncio.create_task(self._health_check_loop()),
asyncio.create_task(self._monitoring_loop())
]
logger.info("DataCollector started successfully")
# Wait for stop signal
await self._stop_event.wait()
except Exception as e:
logger.error(f"Failed to start DataCollector: {type(e).__name__}: {e!r}")
raise
finally:
await self.stop()
async def _startup_backfill(self) -> None:
"""
Backfill missing data on startup for all standard intervals.
Uses both gap detection AND time-based backfill for robustness.
"""
logger.info("Running startup backfill for all intervals...")
try:
async with HyperliquidBackfill(self.db, self.symbol, self.STANDARD_INTERVALS) as backfill:
for interval in self.STANDARD_INTERVALS:
try:
# First, use gap detection to find any holes
gaps = await self.db.detect_gaps(self.symbol, interval)
if gaps:
logger.info(f"{interval}: {len(gaps)} gaps detected")
for gap in gaps:
gap_start = datetime.fromisoformat(gap['gap_start'].replace('Z', '+00:00'))
gap_end = datetime.fromisoformat(gap['gap_end'].replace('Z', '+00:00'))
logger.info(f" Filling gap: {gap_start} to {gap_end}")
candles = await backfill.fetch_candles(interval, gap_start, gap_end)
if candles:
inserted = await self.db.insert_candles(candles)
logger.info(f" Inserted {inserted} candles for gap")
await asyncio.sleep(0.2)
# Second, check if we're behind current time
latest = await self.db.get_latest_candle(self.symbol, interval)
now = datetime.now(timezone.utc)
if latest:
last_time = latest['time']
gap_minutes = (now - last_time).total_seconds() / 60
if gap_minutes > 2:
logger.info(f"{interval}: {gap_minutes:.0f} min behind, backfilling to now...")
candles = await backfill.fetch_candles(interval, last_time, now)
if candles:
inserted = await self.db.insert_candles(candles)
logger.info(f" Inserted {inserted} candles")
else:
logger.info(f"{interval}: up to date")
else:
# No data exists, backfill last 7 days
logger.info(f"{interval}: No data, backfilling 7 days...")
count = await backfill.backfill_interval(interval, days_back=7)
logger.info(f" Inserted {count} candles")
await asyncio.sleep(0.2)
except Exception as e:
logger.error(f"Startup backfill failed for {interval}: {e}")
import traceback
logger.error(traceback.format_exc())
continue
except Exception as e:
logger.error(f"Startup backfill error: {e}")
import traceback
logger.error(traceback.format_exc())
logger.info("Startup backfill complete")
async def _regenerate_custom_timeframes(self) -> None:
"""
Regenerate custom timeframes (37m, 148m) only from gaps.
Only generates candles that are missing, not all from beginning.
"""
if not self.custom_tf_generator:
return
logger.info("Checking custom timeframes for gaps...")
try:
for interval in ['37m', '148m']:
try:
count = await self.custom_tf_generator.generate_from_gap(interval)
if count > 0:
logger.info(f"{interval}: Generated {count} candles")
else:
logger.info(f"{interval}: Up to date")
except Exception as e:
logger.error(f"Failed to regenerate {interval}: {e}")
except Exception as e:
logger.error(f"Custom timeframe regeneration error: {e}")
logger.info("Custom timeframe check complete")
async def stop(self) -> None:
"""Graceful shutdown"""
if not self.is_running:
return
logger.info("Stopping DataCollector...")
self.is_running = False
self._stop_event.set()
# Cancel tasks
for task in self._tasks:
if not task.done():
task.cancel()
# Wait for tasks to complete
if self._tasks:
await asyncio.gather(*self._tasks, return_exceptions=True)
# Stop components
if self.websocket:
await self.websocket.disconnect()
if self.buffer:
await self.buffer.stop()
if self.db:
await self.db.disconnect()
logger.info("DataCollector stopped")
async def _on_candle(self, candle: Candle) -> None:
"""Handle incoming candle from WebSocket"""
try:
# Add to buffer
await self.buffer.add(candle)
logger.debug(f"Received candle: {candle.time} - Close: {candle.close}")
except Exception as e:
logger.error(f"Error processing candle: {e}")
async def _on_buffer_flush(self, candles: list) -> None:
"""Handle buffer flush - write to database and update custom timeframes"""
try:
inserted = await self.db.insert_candles(candles)
logger.info(f"Flushed {inserted} candles to database")
# Update custom timeframes (37m, 148m) in background
if self.custom_tf_generator and inserted > 0:
asyncio.create_task(
self._update_custom_timeframes(candles),
name="custom_tf_update"
)
except Exception as e:
logger.error(f"Failed to write candles to database: {e}")
raise # Re-raise to trigger buffer retry
async def _update_custom_timeframes(self, candles: list) -> None:
"""
Update custom timeframes in background, then trigger indicators/brain.
This chain ensures that indicators are computed on fresh candle data,
and the brain evaluates on fresh indicator data.
"""
try:
# 1. Update custom candles (37m, 148m, etc.)
await self.custom_tf_generator.update_realtime(candles)
logger.debug("Custom timeframes updated")
# 2. Trigger indicator updates for configured intervals
# We use the timestamp of the last 1m candle as the trigger point
trigger_time = candles[-1].time
if self.indicator_engine:
intervals = self.indicator_engine.get_configured_intervals()
for interval in intervals:
# Get the correct bucket start time for this interval
# e.g., if trigger_time is 09:48:00, 37m bucket might start at 09:25:00
if self.custom_tf_generator:
bucket_start = self.custom_tf_generator.get_bucket_start(trigger_time, interval)
else:
bucket_start = trigger_time
# Compute indicators for this bucket
raw_indicators = await self.indicator_engine.on_interval_update(
self.symbol, interval, bucket_start
)
# Filter out None values to satisfy type checker
indicators = {k: v for k, v in raw_indicators.items() if v is not None}
# 3. Evaluate brain if we have fresh indicators
if self.brain and indicators:
await self.brain.evaluate(
self.symbol, interval, bucket_start, indicators
)
except Exception as e:
logger.error(f"Failed to update custom timeframes/indicators: {e}")
# Don't raise - this is non-critical
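`get_bucket_start` itself is not shown in this diff; one plausible implementation (an assumption, not the actual code) floors the timestamp to the most recent interval boundary, with buckets anchored at midnight UTC:

```python
from datetime import datetime, timedelta, timezone

def get_bucket_start(ts: datetime, interval_minutes: int) -> datetime:
    """Hypothetical bucket alignment: floor `ts` to the most recent
    interval boundary, with buckets counted from midnight UTC."""
    midnight = ts.replace(hour=0, minute=0, second=0, microsecond=0)
    elapsed = int((ts - midnight).total_seconds() // 60)
    return midnight + timedelta(minutes=(elapsed // interval_minutes) * interval_minutes)

ts = datetime(2026, 3, 10, 9, 48, tzinfo=timezone.utc)
print(get_bucket_start(ts, 37))  # 09:48 = 588 min; 588 // 37 = 15 → bucket starts at 09:15
```

With this anchoring, 09:48 falls in the 37m bucket starting 09:15; the docstring's "might start at 09:25" example implies a different anchor point, so treat the anchor choice as configuration-dependent.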
async def _on_error(self, error: Exception) -> None:
"""Handle WebSocket errors"""
logger.error(f"WebSocket error: {error}")
# Could implement alerting here (Telegram, etc.)
async def _health_check_loop(self) -> None:
"""Periodic health checks"""
while self.is_running:
try:
await asyncio.sleep(60) # Check every minute
if not self.is_running:
break
# Check WebSocket health
health = self.websocket.get_connection_health()
if health['seconds_since_last_message'] and health['seconds_since_last_message'] > 120:
logger.warning("No messages received for 2+ minutes")
# Could trigger reconnection or alert
# Log stats
buffer_stats = self.buffer.get_stats()
logger.info(f"Health: {health}, Buffer: {buffer_stats.to_dict()}")
except asyncio.CancelledError:
break
except Exception as e:
logger.error(f"Error in health check: {e}")
async def _monitoring_loop(self) -> None:
"""Periodic monitoring and maintenance tasks"""
while self.is_running:
try:
await asyncio.sleep(300) # Every 5 minutes
if not self.is_running:
break
# Detect gaps
gaps = await self.db.detect_gaps(self.symbol, self.interval)
if gaps:
logger.warning(f"Detected {len(gaps)} data gaps: {gaps}")
await self._backfill_gaps(gaps)
# Log database health
health = await self.db.get_health_stats()
logger.info(f"Database health: {health}")
except asyncio.CancelledError:
break
except Exception as e:
logger.error(f"Error in monitoring loop: {e}")
async def _backfill_gaps(self, gaps: list) -> None:
"""Backfill detected data gaps from Hyperliquid"""
if not gaps:
return
logger.info(f"Starting backfill for {len(gaps)} gaps...")
try:
async with HyperliquidBackfill(self.db, self.symbol, [self.interval]) as backfill:
for gap in gaps:
gap_start = datetime.fromisoformat(gap['gap_start'].replace('Z', '+00:00'))
gap_end = datetime.fromisoformat(gap['gap_end'].replace('Z', '+00:00'))
logger.info(f"Backfilling gap: {gap_start} to {gap_end} ({gap['missing_candles']} candles)")
candles = await backfill.fetch_candles(self.interval, gap_start, gap_end)
if candles:
inserted = await self.db.insert_candles(candles)
logger.info(f"Backfilled {inserted} candles for gap {gap_start}")
# Update custom timeframes and indicators for backfilled data
if inserted > 0:
await self._update_custom_timeframes(candles)
else:
logger.warning(f"No candles available for gap {gap_start} to {gap_end}")
except Exception as e:
logger.error(f"Backfill failed: {e}")
def _setup_signal_handlers(self) -> None:
"""Setup handlers for graceful shutdown"""
# signal.signal handlers run synchronously and cannot safely spawn tasks;
# register on the running loop instead and just set the stop event,
# which start() is awaiting before running its cleanup path.
loop = asyncio.get_running_loop()
def _handle(sig_name: str) -> None:
logger.info(f"Received {sig_name}, shutting down...")
self._stop_event.set()
for sig_ in (signal.SIGINT, signal.SIGTERM):
loop.add_signal_handler(sig_, _handle, sig_.name)
async def main():
"""Main entry point"""
collector = DataCollector(
symbol="BTC",
interval="1m"
)
try:
await collector.start()
except KeyboardInterrupt:
logger.info("Interrupted by user")
except Exception as e:
logger.error(f"Fatal error: {type(e).__name__}: {e!r}")
sys.exit(1)
if __name__ == "__main__":
asyncio.run(main())


@@ -0,0 +1,300 @@
"""
Hyperliquid WebSocket Client for cbBTC Data Collection
Optimized for Synology DS218+ with automatic reconnection
"""
import asyncio
import json
import logging
from datetime import datetime, timezone
from typing import Optional, Dict, Any, Callable, Awaitable, List
from dataclasses import dataclass
import websockets
from websockets.exceptions import ConnectionClosed, InvalidStatusCode
from websockets.typing import Data
logger = logging.getLogger(__name__)
@dataclass
class Candle:
"""Represents a single candlestick"""
time: datetime
symbol: str
interval: str
open: float
high: float
low: float
close: float
volume: float
def to_dict(self) -> Dict[str, Any]:
return {
'time': self.time,
'symbol': self.symbol,
'interval': self.interval,
'open': self.open,
'high': self.high,
'low': self.low,
'close': self.close,
'volume': self.volume
}
class HyperliquidWebSocket:
"""
WebSocket client for Hyperliquid exchange
Handles connection, reconnection, and candle data parsing
"""
def __init__(
self,
symbol: str = "BTC",
interval: str = "1m",
url: str = "wss://api.hyperliquid.xyz/ws",
reconnect_delays: Optional[List[int]] = None,
on_candle_callback: Optional[Callable[[Candle], Awaitable[None]]] = None,
on_error_callback: Optional[Callable[[Exception], Awaitable[None]]] = None
):
self.symbol = symbol
self.interval = interval
self.url = url
self.reconnect_delays = reconnect_delays or [1, 2, 5, 10, 30, 60, 120, 300, 600, 900]
self.on_candle = on_candle_callback
self.on_error = on_error_callback
self.websocket: Optional[websockets.WebSocketClientProtocol] = None
self.is_running = False
self.reconnect_count = 0
self.last_message_time: Optional[datetime] = None
self.last_candle_time: Optional[datetime] = None
self._should_stop = False
async def connect(self) -> None:
"""Establish WebSocket connection with subscription"""
try:
logger.info(f"Connecting to Hyperliquid WebSocket: {self.url}")
self.websocket = await websockets.connect(
self.url,
ping_interval=None,
ping_timeout=None,
close_timeout=10
)
# Subscribe to candle data
subscribe_msg = {
"method": "subscribe",
"subscription": {
"type": "candle",
"coin": self.symbol,
"interval": self.interval
}
}
await self.websocket.send(json.dumps(subscribe_msg))
response = await self.websocket.recv()
logger.info(f"Subscription response: {response}")
self.reconnect_count = 0
self.is_running = True
logger.info(f"Successfully connected and subscribed to {self.symbol} {self.interval} candles")
except Exception as e:
logger.error(f"Failed to connect: {e}")
raise
async def disconnect(self) -> None:
"""Gracefully close connection"""
self._should_stop = True
self.is_running = False
if self.websocket:
try:
await self.websocket.close()
logger.info("WebSocket connection closed")
except Exception as e:
logger.warning(f"Error closing WebSocket: {e}")
async def receive_loop(self) -> None:
"""Main message receiving loop"""
while self.is_running and not self._should_stop:
try:
if not self.websocket:
raise ConnectionClosed(None, None)
message = await self.websocket.recv()
self.last_message_time = datetime.now(timezone.utc)
await self._handle_message(message)
except ConnectionClosed as e:
if self._should_stop:
break
logger.warning(f"WebSocket connection closed: {e}")
await self._handle_reconnect()
except Exception as e:
logger.error(f"Error in receive loop: {e}")
if self.on_error:
await self.on_error(e)
await asyncio.sleep(1)
async def _handle_message(self, message: Data) -> None:
"""Parse and process incoming WebSocket message"""
try:
# Convert bytes to string if necessary
if isinstance(message, bytes):
message = message.decode('utf-8')
data = json.loads(message)
# Handle subscription confirmation
if data.get("channel") == "subscriptionResponse":
logger.info(f"Subscription confirmed: {data}")
return
# Handle candle data
if data.get("channel") == "candle":
candle_data = data.get("data", {})
if candle_data:
candle = self._parse_candle(candle_data)
if candle:
self.last_candle_time = candle.time
if self.on_candle:
await self.on_candle(candle)
# Handle ping/pong
if "ping" in data and self.websocket:
await self.websocket.send(json.dumps({"pong": data["ping"]}))
except json.JSONDecodeError as e:
logger.error(f"Failed to parse message: {e}")
except Exception as e:
logger.error(f"Error handling message: {e}")
def _parse_candle(self, data: Any) -> Optional[Candle]:
"""Parse candle data from WebSocket message"""
try:
# Hyperliquid candle format: [open, high, low, close, volume, timestamp]
if isinstance(data, list) and len(data) >= 6:
open_price = float(data[0])
high = float(data[1])
low = float(data[2])
close = float(data[3])
volume = float(data[4])
timestamp_ms = int(data[5])
elif isinstance(data, dict):
# New format: {'t': 1770812400000, 'T': ..., 's': 'BTC', 'i': '1m', 'o': '67164.0', 'c': ..., 'h': ..., 'l': ..., 'v': ..., 'n': ...}
if 't' in data and 'o' in data:
open_price = float(data.get("o", 0))
high = float(data.get("h", 0))
low = float(data.get("l", 0))
close = float(data.get("c", 0))
volume = float(data.get("v", 0))
timestamp_ms = int(data.get("t", 0))
else:
# Old format fallback
open_price = float(data.get("open", 0))
high = float(data.get("high", 0))
low = float(data.get("low", 0))
close = float(data.get("close", 0))
volume = float(data.get("volume", 0))
timestamp_ms = int(data.get("time", 0))
else:
logger.warning(f"Unknown candle format: {data}")
return None
timestamp = datetime.fromtimestamp(timestamp_ms / 1000, tz=timezone.utc)
return Candle(
time=timestamp,
symbol=self.symbol,
interval=self.interval,
open=open_price,
high=high,
low=low,
close=close,
volume=volume
)
except (KeyError, ValueError, TypeError) as e:
logger.error(f"Failed to parse candle data: {e}, data: {data}")
return None
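Stripped to its essentials, the dict branch of `_parse_candle` converts an epoch-millisecond timestamp and string-encoded prices like so (a minimal sketch of the same field mapping):

```python
from datetime import datetime, timezone

def parse_ws_candle(data: dict) -> dict:
    """Minimal parse of the Hyperliquid dict candle format
    ({'t': ms, 'o': .., 'h': .., 'l': .., 'c': .., 'v': ..})."""
    return {
        "time": datetime.fromtimestamp(int(data["t"]) / 1000, tz=timezone.utc),
        "open": float(data["o"]),
        "high": float(data["h"]),
        "low": float(data["l"]),
        "close": float(data["c"]),
        "volume": float(data["v"]),
    }

msg = {"t": 1770812400000, "o": "67164.0", "h": "67200.5",
       "l": "67100.0", "c": "67180.2", "v": "12.5"}
print(parse_ws_candle(msg)["open"])  # → 67164.0
```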
async def _handle_reconnect(self) -> None:
"""Handle reconnection with exponential backoff"""
if self._should_stop:
return
if self.reconnect_count >= len(self.reconnect_delays):
logger.error("Max reconnection attempts reached")
self.is_running = False
if self.on_error:
await self.on_error(Exception("Max reconnection attempts reached"))
return
delay = self.reconnect_delays[self.reconnect_count]
self.reconnect_count += 1
logger.info(f"Reconnecting in {delay} seconds (attempt {self.reconnect_count})...")
await asyncio.sleep(delay)
try:
await self.connect()
except Exception as e:
logger.error(f"Reconnection failed: {e}")
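The reconnect schedule walks `reconnect_delays` once per attempt and gives up after the last entry; in isolation:

```python
from typing import List, Optional

def next_delay(delays: List[int], attempt: int) -> Optional[int]:
    """Return the backoff delay for a given attempt, or None once the
    schedule is exhausted (the client then stops and reports an error)."""
    if attempt >= len(delays):
        return None
    return delays[attempt]

delays = [1, 2, 5, 10, 30, 60, 120, 300, 600, 900]
print([next_delay(delays, a) for a in (0, 3, 9, 10)])  # → [1, 10, 900, None]
```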
def get_connection_health(self) -> Dict[str, Any]:
"""Return connection health metrics"""
now = datetime.now(timezone.utc)
return {
"is_connected": self.websocket is not None and self.is_running,
"is_running": self.is_running,
"reconnect_count": self.reconnect_count,
"last_message_time": self.last_message_time.isoformat() if self.last_message_time else None,
"last_candle_time": self.last_candle_time.isoformat() if self.last_candle_time else None,
"seconds_since_last_message": (now - self.last_message_time).total_seconds() if self.last_message_time else None
}
async def test_websocket():
"""Test function for WebSocket client"""
candles_received = []
stop_event = asyncio.Event()
async def on_candle(candle: Candle):
candles_received.append(candle)
print(f"Candle: {candle.time} - O:{candle.open} H:{candle.high} L:{candle.low} C:{candle.close} V:{candle.volume}")
if len(candles_received) >= 5:
print("Received 5 candles, stopping...")
stop_event.set()
client = HyperliquidWebSocket(
symbol="cbBTC-PERP",
interval="1m",
on_candle_callback=on_candle
)
try:
await client.connect()
# Run receive loop in background
receive_task = asyncio.create_task(client.receive_loop())
# Wait for stop event
await stop_event.wait()
await client.disconnect()
await receive_task
except KeyboardInterrupt:
print("\nStopping...")
finally:
await client.disconnect()
print(f"Total candles received: {len(candles_received)}")
if __name__ == "__main__":
logging.basicConfig(
level=logging.INFO,
format='%(asctime)s - %(name)s - %(levelname)s - %(message)s'
)
asyncio.run(test_websocket())


@@ -9,16 +9,9 @@ import asyncio
import pandas as pd
import numpy as np
from datetime import datetime, timezone
from typing import List, Dict, Any, Optional
from dotenv import load_dotenv
from rich.console import Console
from rich.table import Table
from rich.panel import Panel
from rich.layout import Layout
from rich import box
import asyncpg
# Try to import pybit
# Try to import pybit, if not available, we'll suggest installing it
try:
from pybit.unified_trading import HTTP
except ImportError:
@@ -39,60 +32,13 @@ logging.basicConfig(
)
logger = logging.getLogger("PingPongBot")
class DatabaseManager:
"""Minimal Database Manager for the bot"""
def __init__(self, host, port, database, user, password):
self.host = host
self.port = int(port)
self.database = database
self.user = user
self.password = password
self.pool = None
async def connect(self):
try:
self.pool = await asyncpg.create_pool(
host=self.host, port=self.port, user=self.user,
password=self.password, database=self.database,
min_size=1, max_size=10
)
# Test connection
async with self.pool.acquire() as conn:
res = await conn.fetchval("SELECT 1")
if res == 1:
logger.info(f"Database connection verified at {self.host}:{self.port}")
else:
raise Exception("Database test query failed")
except Exception as e:
logger.error(f"DATABASE CONNECTION FAILED: {e}")
raise
async def get_candles(self, symbol: str, interval: str, limit: int = 100):
if not self.pool:
logger.error("Attempted to query DB before connecting")
return []
try:
async with self.pool.acquire() as conn:
rows = await conn.fetch('''
SELECT time, open, high, low, close, volume
FROM candles
WHERE symbol = $1 AND interval = $2
ORDER BY time DESC LIMIT $3
''', symbol, interval, limit)
return [dict(r) for r in rows]
except Exception as e:
logger.error(f"DB Query Error for {symbol} {interval}: {e}")
return []
class PingPongBot:
def __init__(self, config_path="config/ping_pong_config.yaml"):
self.version = "1.5.7"
with open(config_path, 'r') as f:
self.config = yaml.safe_load(f)
# Explicitly load from ENV to ensure they are available
self.api_key = os.getenv("BYBIT_API_KEY") or os.getenv("API_KEY")
self.api_secret = os.getenv("BYBIT_API_SECRET") or os.getenv("API_SECRET")
self.api_key = os.getenv("API_KEY")
self.api_secret = os.getenv("API_SECRET")
if not self.api_key or not self.api_secret:
raise ValueError("API_KEY and API_SECRET must be set in .env file")
@@ -101,349 +47,310 @@ class PingPongBot:
testnet=False,
api_key=self.api_key,
api_secret=self.api_secret,
timeout=10
)
# Initialize DB with explicit credentials
self.db = DatabaseManager(
host=os.getenv('DB_HOST', '20.20.20.20'),
port=os.getenv('DB_PORT', 5433),
database=os.getenv('DB_NAME', 'btc_data'),
user=os.getenv('DB_USER', 'btc_bot'),
password=os.getenv('DB_PASSWORD', '')
)
self.symbol = self.config['symbol']
self.interval = self.config['interval']
self.direction = self.config['direction'].lower()
# Base settings
raw_symbol = self.config['symbol'].upper()
self.base_coin = raw_symbol.replace("USDT", "").replace("USDC", "").replace("USD", "")
self.db_symbol = self.base_coin
self.interval = str(self.config['interval'])
self.db_interval = self.interval + "m" if self.interval.isdigit() else self.interval
# Dynamic Strategy State
self.direction = None
self.category = None
self.symbol = None
self.settle_coin = None
# Tracking for SMA(44, 1D)
self.ma_44_val = 0.0
self.last_ma_check_time = 0
# Bot State
# State
self.last_candle_time = None
self.last_candle_price = 0.0
self.current_indicators = {
"rsi": {"value": 0.0, "timestamp": "N/A"},
"hurst_lower": {"value": 0.0, "timestamp": "N/A"},
"hurst_upper": {"value": 0.0, "timestamp": "N/A"}
}
self.current_indicators = {}
self.position = None
self.wallet_balance = 0
self.market_price = 0.0
self.status_msg = "Initializing..."
self.last_signal = None
self.start_time = datetime.now()
self.console = Console()
# Fixed Parameters from Config
self.partial_exit_pct = float(self.config.get('partial_exit_pct', 0.15))
self.min_val_usd = float(self.config.get('min_position_value_usd', 15.0))
self.pos_size_margin = float(self.config.get('pos_size_margin', 20.0))
self.leverage = float(self.config.get('exchange_leverage', 3.0))
self.max_eff_lev = float(self.config.get('max_effective_leverage', 1.0))
# Grid parameters from config
self.tp_pct = self.config['take_profit_pct'] / 100.0
self.partial_exit_pct = self.config['partial_exit_pct']
self.min_val_usd = self.config['min_position_value_usd']
self.pos_size_margin = self.config['pos_size_margin']
self.leverage = self.config['exchange_leverage']
self.max_eff_lev = self.config['max_effective_leverage']
def rma(self, series, length):
"""Rolling Moving Average (Wilder's Smoothing) - matches Pine Script ta.rma"""
alpha = 1 / length
return series.ewm(alpha=alpha, adjust=False).mean()
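The `rma` wrapper above leans on pandas' `ewm(alpha=1/length, adjust=False)`, which is algebraically the same as Wilder's recursion used by Pine Script's `ta.rma`. A minimal standalone check on synthetic data (no exchange or config needed; `rma_recursive` is a hypothetical reference implementation for comparison):

```python
import pandas as pd

def rma(series: pd.Series, length: int) -> pd.Series:
    """Wilder's smoothing, identical to Pine Script's ta.rma."""
    return series.ewm(alpha=1 / length, adjust=False).mean()

def rma_recursive(values, length):
    """Reference: rma[i] = (rma[i-1] * (length - 1) + x[i]) / length."""
    out, prev = [], None
    for x in values:
        prev = x if prev is None else (prev * (length - 1) + x) / length
        out.append(prev)
    return out

s = pd.Series([1.0, 2.0, 3.0, 4.0, 5.0])
assert all(abs(a - b) < 1e-9 for a, b in zip(rma(s, 3), rma_recursive(list(s), 3)))
```

Both forms start from the first raw value and blend each new sample in with weight `1/length`, which is why the two columns match term by term.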
def calculate_indicators(self, df):
"""Calculate RSI and Hurst Bands matching the JS/Dashboard implementation"""
# 1. RSI (Wilder smoothing via rma)
rsi_cfg = self.config['rsi']
delta = df['close'].diff()
gain = delta.where(delta > 0, 0)
loss = -delta.where(delta < 0, 0)
avg_gain = self.rma(gain, rsi_cfg['period'])
avg_loss = self.rma(loss, rsi_cfg['period'])
rs = avg_gain / avg_loss
df['rsi'] = 100 - (100 / (1 + rs))
# 2. Hurst Bands
hurst_cfg = self.config['hurst']
mcl_t = hurst_cfg['period']
mcm = hurst_cfg['multiplier']
mcl = mcl_t / 2
mcl_2 = int(round(mcl / 2))
# True Range
df['h_l'] = df['high'] - df['low']
df['h_pc'] = abs(df['high'] - df['close'].shift(1))
df['l_pc'] = abs(df['low'] - df['close'].shift(1))
df['tr'] = df[['h_l', 'h_pc', 'l_pc']].max(axis=1)
# RMA of Close and ATR
df['ma_mcl'] = self.rma(df['close'], mcl)
df['atr_mcl'] = self.rma(df['tr'], mcl)
# Historical Offset: shift the MA back by half its length
df['center'] = df['ma_mcl'].shift(mcl_2)
# Fill leading NaNs from the shift with the MA itself (as in JS: historical_ma || src)
df['center'] = df['center'].fillna(df['ma_mcl'])
mcm_off = mcm * df['atr_mcl']
df['hurst_upper'] = df['center'] + mcm_off
df['hurst_lower'] = df['center'] - mcm_off
last_row = df.iloc[-1]
now_str = datetime.now().strftime("%H:%M:%S")
self.current_indicators["rsi"] = {"value": float(last_row['rsi']), "timestamp": now_str}
self.current_indicators["hurst_lower"] = {"value": float(last_row['hurst_lower']), "timestamp": now_str}
self.current_indicators["hurst_upper"] = {"value": float(last_row['hurst_upper']), "timestamp": now_str}
return df
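As a sanity check on the band construction, a minimal sketch with synthetic flat prices and hypothetical settings (`period=8`, `multiplier=1.5` are illustration values, not the bot's config) shows the center collapsing onto the price and the bands settling at ±multiplier·ATR:

```python
import pandas as pd

period, mult = 8, 1.5          # hypothetical Hurst settings
mcl = period / 2
mcl_2 = int(round(mcl / 2))    # half-length shift for the historical offset

# Flat market: close=100, high=101, low=99 on every bar (no gaps, so TR = high - low = 2)
df = pd.DataFrame({'close': [100.0] * 50, 'high': [101.0] * 50, 'low': [99.0] * 50})

tr = df['high'] - df['low']
ma = df['close'].ewm(alpha=1 / mcl, adjust=False).mean()   # RMA of close
atr = tr.ewm(alpha=1 / mcl, adjust=False).mean()           # RMA of TR
center = ma.shift(mcl_2).fillna(ma)

upper = center + mult * atr
lower = center - mult * atr
# With constant inputs the bands converge to 100 ± 1.5 * 2
assert abs(upper.iloc[-1] - 103.0) < 1e-9 and abs(lower.iloc[-1] - 97.0) < 1e-9
```

On real candles the shift makes the center lag the smoothed price by half the MA length, which is what gives the bands their "historical offset" character.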
async def update_direction(self):
    """Logic Point I: 1D MA44 check and Point II: Asset/Perp selection"""
    try:
        logger.info(f"Checking direction based on SMA(44, 1D) for {self.db_symbol}...")
        candles_1d = await self.db.get_candles(self.db_symbol, "1d", limit=100)
        if not candles_1d or len(candles_1d) < 44:
            got = len(candles_1d) if candles_1d else 0
            self.status_msg = f"Error: Need 44 1D candles (Got {got})"
            return False
        df_1d = pd.DataFrame(candles_1d[::-1])
        df_1d['close'] = df_1d['close'].astype(float)
        self.ma_44_val = df_1d['close'].rolling(window=44).mean().iloc[-1]
        # Use BTCUSDT (Linear) for a reliable initial price check
        ticker = await asyncio.to_thread(self.session.get_tickers, category="linear", symbol=f"{self.base_coin}USDT")
        current_price = float(ticker['result']['list'][0]['lastPrice'])
        self.market_price = current_price
        new_direction = "long" if current_price > self.ma_44_val else "short"
        if new_direction != self.direction:
            logger.info(f"DIRECTION CHANGE: {self.direction} -> {new_direction} (Price: {current_price:.2f}, MA44: {self.ma_44_val:.2f})")
            self.status_msg = f"Switching to {new_direction.upper()}"
            if self.direction is not None:
                await self.close_all_positions()
            self.direction = new_direction
            if self.direction == "long":
                self.category = "inverse"
                self.symbol = f"{self.base_coin}USD"
                self.settle_coin = self.base_coin
            else:
                self.category = "linear"
                self.symbol = "BTCPERP" if self.base_coin == "BTC" else f"{self.base_coin}USDC"
                self.settle_coin = "USDC"
            # Perform swap
            await self.swap_assets(new_direction)
            logger.info(f"Bot configured for {self.direction.upper()} | Symbol: {self.symbol} | Category: {self.category}")
            self.last_candle_time = None
        return True
    except Exception as e:
        logger.error(f"Direction Update Error: {e}")
        self.status_msg = f"Dir Error: {str(e)[:20]}"
        return False
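The direction decision reduces to one comparison of the last price against the 44-bar SMA; a minimal pandas sketch with synthetic closes:

```python
import pandas as pd

closes = pd.Series(range(1, 61), dtype=float)  # steadily rising market
ma_44 = closes.rolling(window=44).mean().iloc[-1]
current_price = closes.iloc[-1]

direction = "long" if current_price > ma_44 else "short"
assert direction == "long"  # price sits above its 44-bar mean in an uptrend
```

In the bot the same comparison also drives the symbol/category switch (inverse `BTCUSD` for long, USDC-settled linear for short) and triggers the spot swap.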
async def close_all_positions(self):
"""Closes any active position in the current category/symbol"""
try:
if not self.category or not self.symbol: return
pos = await asyncio.to_thread(self.session.get_positions, category=self.category, symbol=self.symbol)
if pos['retCode'] == 0:
for p in pos['result']['list']:
if float(p.get('size', 0)) > 0:
logger.info(f"Closing existing position: {p['size']} {self.symbol}")
await self.place_order(float(p['size']), is_close=True)
except Exception as e:
logger.error(f"Error closing positions: {e}")
async def swap_assets(self, target_direction):
    """Point II: Exchange BTC/USDC on Spot market with proper rounding"""
    try:
        logger.info(f"Swapping assets for {target_direction.upper()} mode...")
        spot_symbol = f"{self.base_coin}USDC"
        # Use accountType='UNIFIED' for UTA accounts
        balance = await asyncio.to_thread(self.session.get_wallet_balance, accountType="UNIFIED", coin=f"{self.base_coin},USDC")
        coins = {c['coin']: float(c['walletBalance']) for c in balance['result']['list'][0]['coin']}
        logger.info(f"Current Balances: {coins}")
        if target_direction == "short":
            # SHORT: Need USDC, Sell BTC (Max 6 decimals for BTCUSDC spot)
            btc_bal = floor(coins.get(self.base_coin, 0) * 1000000) / 1000000
            if btc_bal > 0.000001:
                logger.info(f"Spot: Selling {btc_bal} {self.base_coin} for USDC")
                res = await asyncio.to_thread(self.session.place_order,
                    category="spot", symbol=spot_symbol, side="Sell", orderType="Market", qty=f"{btc_bal:.6f}"
                )
                logger.info(f"Swap Result: {res['retMsg']}")
        else:
            # LONG: Need BTC, Buy BTC with USDC (Max 4 decimals for USDC amount)
            usdc_bal = floor(coins.get("USDC", 0) * 10000) / 10000
            if usdc_bal > 1.0:
                logger.info(f"Spot: Buying {self.base_coin} with {usdc_bal} USDC")
                # marketUnit='quote' means spending USDC
                res = await asyncio.to_thread(self.session.place_order,
                    category="spot", symbol=spot_symbol, side="Buy", orderType="Market",
                    qty=f"{usdc_bal:.4f}", marketUnit="quote"
                )
                logger.info(f"Swap Result: {res['retMsg']}")
        await asyncio.sleep(5)  # Wait for spot settlement
    except Exception as e:
        logger.error(f"Asset Swap Error: {e}")
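The decimal handling above always floors the balance to the venue's precision (6 decimals for the BTC quantity, 4 for the USDC amount) so the order can never exceed what the wallet holds. That rule can be checked in isolation; `truncate` is a hypothetical helper for illustration:

```python
from math import floor

def truncate(value: float, decimals: int) -> float:
    """Floor to a fixed number of decimals so we never order more than we hold."""
    factor = 10 ** decimals
    return floor(value * factor) / factor

assert truncate(0.12345678, 6) == 0.123456   # BTC qty: 6 decimals on BTCUSDC spot
assert truncate(153.98765, 4) == 153.9876    # USDC amount: 4 decimals
```

Rounding instead of flooring could round up past the available balance and get the spot order rejected, which is why the floor variant is used.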
async def update_exchange_data(self):
if not self.category or not self.symbol: return
try:
ticker = await asyncio.to_thread(self.session.get_tickers, category=self.category, symbol=self.symbol)
if ticker['retCode'] == 0:
self.market_price = float(ticker['result']['list'][0]['lastPrice'])
# settleCoin is only for USDC linear perpetuals
settle_coin = "USDC" if (self.category == "linear" and "USDC" in self.symbol) else None
pos = await asyncio.to_thread(self.session.get_positions, category=self.category, symbol=self.symbol, settleCoin=settle_coin)
if pos['retCode'] == 0:
active = [p for p in pos['result']['list'] if float(p.get('size', 0)) > 0]
self.position = active[0] if active else None
target_coin = self.settle_coin
wallet = await asyncio.to_thread(self.session.get_wallet_balance, category=self.category, accountType="UNIFIED", coin=target_coin)
if wallet['retCode'] == 0:
res_list = wallet['result']['list']
if res_list:
self.wallet_balance = float(res_list[0].get('totalWalletBalance', 0))
except Exception as e:
logger.error(f"Exchange Sync Error: {e}")
def check_signals(self, df):
    """Determine if we should Open or Close based on indicator crossovers"""
    if len(df) < 2: return None
    last, prev = df.iloc[-1], df.iloc[-2]
    rsi_cfg, hurst_cfg = self.config['rsi'] or {}, self.config['hurst'] or {}
    # LONG: open on RSI recovering through oversold, or price tagging the lower Hurst band
    l_open = (rsi_cfg.get('enabled_for_open') and prev['rsi'] < rsi_cfg.get('oversold', 30) and last['rsi'] >= rsi_cfg.get('oversold', 30)) or \
             (hurst_cfg.get('enabled_for_open') and prev['close'] > prev['hurst_lower'] and last['close'] <= last['hurst_lower'])
    # LONG: close on RSI falling back through overbought, or price tagging the upper band
    l_close = (rsi_cfg.get('enabled_for_close') and prev['rsi'] > rsi_cfg.get('overbought', 70) and last['rsi'] <= rsi_cfg.get('overbought', 70)) or \
              (hurst_cfg.get('enabled_for_close') and prev['close'] < prev['hurst_upper'] and last['close'] >= last['hurst_upper'])
    # SHORT signals mirror the LONG rules
    s_open = (rsi_cfg.get('enabled_for_open') and prev['rsi'] > rsi_cfg.get('overbought', 70) and last['rsi'] <= rsi_cfg.get('overbought', 70)) or \
             (hurst_cfg.get('enabled_for_open') and prev['close'] < prev['hurst_upper'] and last['close'] >= last['hurst_upper'])
    s_close = (rsi_cfg.get('enabled_for_close') and prev['rsi'] < rsi_cfg.get('oversold', 30) and last['rsi'] >= rsi_cfg.get('oversold', 30)) or \
              (hurst_cfg.get('enabled_for_close') and prev['close'] > prev['hurst_lower'] and last['close'] <= last['hurst_lower'])
    if self.direction == 'long':
        return "open" if l_open else ("close" if l_close else None)
    return "open" if s_open else ("close" if s_close else None)
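Each condition above is a threshold crossover: the previous bar on one side of the line, the current bar at or past it. Isolated into a tiny hypothetical helper with illustrative RSI values:

```python
def crossed_above(prev_val: float, last_val: float, level: float) -> bool:
    """True when the series closes the bar at/above `level` after being below it."""
    return prev_val < level and last_val >= level

# RSI recovering through the oversold line (30) fires a long-open signal
assert crossed_above(28.0, 31.0, 30.0) is True
# Still below, or already above on both bars: no crossover, no signal
assert crossed_above(28.0, 29.0, 30.0) is False
assert crossed_above(35.0, 40.0, 30.0) is False
```

Requiring the previous bar to be on the far side is what prevents the bot from firing the same signal on every bar while RSI merely sits beyond the threshold.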
async def execute_trade(self, signal):
    """Apply the Ping-Pong strategy logic: partial exits on close, accumulation on open"""
    if not signal or not self.market_price: return
    last_price = self.market_price
    if signal == "close" and self.position:
        qty = float(self.position['size']) * self.partial_exit_pct
        # If the remainder would fall below the minimum notional, close the full position
        if (float(self.position['size']) - qty) * last_price < self.min_val_usd:
            qty = float(self.position['size'])
        await self.place_order(qty, is_close=True)
    elif signal == "open":
        cur_qty = float(self.position['size']) if self.position else 0
        if self.category == "linear":
            cur_notional = cur_qty * last_price
            ping_notional = self.pos_size_margin * self.leverage
            qty_to_open = ping_notional / last_price
        else:  # Inverse contracts are sized in USD
            cur_notional = cur_qty
            ping_notional = self.pos_size_margin * self.leverage
            qty_to_open = ping_notional
        if (cur_notional + ping_notional) / max(self.wallet_balance, 1) <= self.max_eff_lev:
            await self.place_order(qty_to_open, is_close=False)
        else:
            self.status_msg = "Max Leverage Reached"
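The leverage guard in the open branch is plain arithmetic: projected notional divided by wallet equity must stay at or under `max_effective_leverage`. A worked sketch with hypothetical numbers (`can_accumulate` is an illustration, not a bot method):

```python
def can_accumulate(cur_notional: float, margin: float, leverage: float,
                   wallet: float, max_eff_lev: float) -> bool:
    """Mirror of the guard: (current + new notional) / equity <= cap."""
    ping_notional = margin * leverage  # e.g. $20 margin * 3x = $60 per entry
    return (cur_notional + ping_notional) / max(wallet, 1) <= max_eff_lev

# $120 already open, $60 more, $200 equity -> 0.9x effective, under a 1.0x cap
assert can_accumulate(120.0, 20.0, 3.0, 200.0, 1.0) is True
# The same entry against only $150 equity -> 1.2x, rejected
assert can_accumulate(120.0, 20.0, 3.0, 150.0, 1.0) is False
```

The `max(wallet, 1)` floor avoids a division by zero while the balance is still syncing, at the cost of being overly permissive only when equity is under $1.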
async def place_order(self, qty, is_close=False):
if not self.category or not self.symbol: return
side = "Sell" if (self.direction == "long" and is_close) or (self.direction == "short" and not is_close) else "Buy"
pos_idx = 1 if self.direction == "long" else 2
try:
qty_str = str(int(qty)) if self.category == "inverse" else str(round(qty, 3))
res = await asyncio.to_thread(self.session.place_order,
category=self.category, symbol=self.symbol, side=side, orderType="Market",
qty=qty_str, reduceOnly=is_close, positionIdx=pos_idx
)
if res['retCode'] == 0:
self.last_signal = f"{side} {qty_str}"
self.status_msg = f"Order Success: {side}"
else:
self.status_msg = f"Order Error: {res['retMsg']}"
except Exception as e:
logger.error(f"Trade Error: {e}")
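Quantity formatting above differs by contract type: inverse contracts trade in whole USD contracts, while linear quantities are coin amounts rounded to three decimals. A hypothetical helper mirroring that rule:

```python
def format_qty(qty: float, category: str) -> str:
    """Inverse contracts are whole-USD contracts; linear qty is coin with 3 decimals."""
    return str(int(qty)) if category == "inverse" else str(round(qty, 3))

assert format_qty(60.7, "inverse") == "60"        # truncated to whole contracts
assert format_qty(0.0123456, "linear") == "0.012" # coin amount, 3 decimals
```

A simplified precision rule like this works for BTC pairs; other symbols have their own lot-size steps, which would need to be read from the instrument info.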
def render_dashboard(self):
self.console.print("\n" + "="*60)
title = f"PING-PONG BOT v{self.version} [{self.direction.upper() if self.direction else 'INIT'}]"
cfg_table = Table(title=title, box=box.ROUNDED, expand=True)
cfg_table.add_column("Property"); cfg_table.add_column("Value")
cfg_table.add_row("Symbol", self.symbol or "N/A"); cfg_table.add_row("Category", self.category or "N/A")
cfg_table.add_row("Market Price", f"${self.market_price:.2f}"); cfg_table.add_row("SMA(44, 1D)", f"${self.ma_44_val:.2f}")
cfg_table.add_row("Last Candle", f"{self.last_candle_time} (@${self.last_candle_price:.2f})")
ind_table = Table(title="INDICATORS", box=box.ROUNDED, expand=True)
ind_table.add_column("Indicator"); ind_table.add_column("Value"); ind_table.add_column("Updated")
for k in ["hurst_upper", "hurst_lower", "rsi"]:
v = self.current_indicators[k]
ind_table.add_row(k.upper().replace("_", " "), f"{v['value']:.2f}", v['timestamp'])
pos_table = Table(title="POSITION", box=box.ROUNDED, expand=True)
pos_table.add_column("Account Equity"); pos_table.add_column("Size"); pos_table.add_column("Entry"); pos_table.add_column("PnL")
        if self.position:
            pnl = float(self.position['unrealisedPnl'])
            pos_table.add_row(f"${self.wallet_balance:.2f}", self.position['size'], self.position['avgPrice'], f"[bold {'green' if pnl>=0 else 'red'}]${pnl:.2f}")
        else:
            pos_table.add_row(f"${self.wallet_balance:.2f}", "0", "-", "-")
        self.console.print(cfg_table); self.console.print(ind_table); self.console.print(pos_table)
        self.console.print(f"[dim]Status: {self.status_msg} | Last Signal: {self.last_signal}[/]")
        self.console.print("="*60 + "\n")
async def run(self):
    """Main loop"""
    try:
        await self.db.connect()
        await self.update_direction()
    except Exception as e:
        logger.error(f"Startup Failure: {e}")
        return
    last_exchange_update = 0
    while True:
        try:
            now = time.time()
            # Re-check the SMA(44, 1D) direction every 2 minutes
            if now - self.last_ma_check_time >= 120:
                await self.update_direction()
                self.last_ma_check_time = now
            # Sync price, position and balance from the exchange every 15 seconds
            if now - last_exchange_update >= 15:
                await self.update_exchange_data()
                last_exchange_update = now
            # Process signals only when a new candle arrives from the DB
            candles = await self.db.get_candles(self.db_symbol, self.db_interval, limit=100)
            if candles:
                latest = candles[0]
                if latest['time'] != self.last_candle_time:
                    df = pd.DataFrame(candles[::-1])
                    df = df.astype({'open': float, 'high': float, 'low': float, 'close': float, 'volume': float})
                    df = self.calculate_indicators(df)
                    signal = self.check_signals(df)
                    if signal:
                        logger.info(f"Signal detected: {signal} @ {self.market_price}")
                        await self.execute_trade(signal)
                    self.last_candle_time = latest['time']
                    self.last_candle_price = latest['close']
                    self.status_msg = f"New Candle: {latest['time'].strftime('%H:%M:%S')}"
            self.render_dashboard()
            await asyncio.sleep(self.config.get('loop_interval_seconds', 5))
        except Exception as e:
            logger.error(f"Loop error: {e}")
            self.status_msg = f"Error: {str(e)[:40]}"
            await asyncio.sleep(5)
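The loop processes signals only once per candle by comparing the newest candle's timestamp against the last one handled; a minimal sketch of that gating (names are illustrative):

```python
last_candle_time = None
processed = []

def on_poll(latest_time):
    """Process a candle only once, keyed on its open timestamp."""
    global last_candle_time
    if latest_time != last_candle_time:
        processed.append(latest_time)
        last_candle_time = latest_time

# Polling faster than the candle interval yields duplicates; the gate filters them
for t in ["10:00", "10:00", "10:05", "10:05", "10:10"]:
    on_poll(t)
assert processed == ["10:00", "10:05", "10:10"]
```

This is why the bot can poll every few seconds while still evaluating each strategy signal exactly once per closed candle.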
from math import floor
if __name__ == "__main__":
    try:
        bot = PingPongBot()
        asyncio.run(bot.run())
    except KeyboardInterrupt:
        print("\nBot Stopped by User")
    except Exception as e:
        print(f"\nCritical Error: {e}")
        logger.exception("Critical Error in main loop")

52
start_dev.cmd Normal file

@ -0,0 +1,52 @@
@echo off
echo ===================================
echo BTC Trading Dashboard - Development Server
echo ===================================
echo.
REM Check if venv exists
if not exist "venv\Scripts\activate.bat" (
echo [ERROR] Virtual environment not found!
echo Please run setup first to create the venv.
echo.
pause
exit /b 1
)
REM Activate venv
call venv\Scripts\activate.bat
REM Check dependencies
echo [1/3] Checking dependencies...
pip show fastapi >nul 2>&1
if %errorlevel% neq 0 (
echo Installing dependencies...
pip install -r requirements.txt
if %errorlevel% neq 0 (
echo [ERROR] Failed to install dependencies
pause
exit /b 1
)
)
echo [2/3] Testing database connection...
python test_db.py
if %errorlevel% neq 0 (
echo [WARNING] Database connection test failed
echo Press Ctrl+C to cancel or any key to continue...
pause >nul
)
echo [3/3] Starting development server...
echo.
echo ===================================
echo Server will start at:
echo - API Docs: http://localhost:8000/docs
echo - Dashboard: http://localhost:8000/dashboard
echo - Health: http://localhost:8000/api/v1/health
echo ===================================
echo.
echo Press Ctrl+C to stop the server
echo.
uvicorn src.api.server:app --reload --host 0.0.0.0 --port 8000

48
start_dev.sh Normal file

@ -0,0 +1,48 @@
#!/bin/bash
echo "==================================="
echo " BTC Trading Dashboard - Development Server"
echo "==================================="
echo ""
# Check if venv exists
if [ ! -d "venv" ]; then
echo "[ERROR] Virtual environment not found!"
echo "Please run setup first to create the venv."
exit 1
fi
# Activate venv
source venv/bin/activate
# Check dependencies
echo "[1/3] Checking dependencies..."
if ! pip show fastapi > /dev/null 2>&1; then
echo "Installing dependencies..."
pip install -r requirements.txt
if [ $? -ne 0 ]; then
echo "[ERROR] Failed to install dependencies"
exit 1
fi
fi
echo "[2/3] Testing database connection..."
python test_db.py
if [ $? -ne 0 ]; then
echo "[WARNING] Database connection test failed"
read -p "Press Enter to continue or Ctrl+C to cancel..."
fi
echo "[3/3] Starting development server..."
echo ""
echo "==================================="
echo " Server will start at:"
echo " - API Docs: http://localhost:8000/docs"
echo " - Dashboard: http://localhost:8000/dashboard"
echo " - Health: http://localhost:8000/api/v1/health"
echo "==================================="
echo ""
echo "Press Ctrl+C to stop the server"
echo ""
uvicorn src.api.server:app --reload --host 0.0.0.0 --port 8000

63
test_db.py Normal file

@ -0,0 +1,63 @@
import asyncio
import os
from dotenv import load_dotenv
import asyncpg
load_dotenv()
async def test_db_connection():
"""Test database connection"""
try:
conn = await asyncpg.connect(
host=os.getenv('DB_HOST'),
port=int(os.getenv('DB_PORT', 5432)),
database=os.getenv('DB_NAME'),
user=os.getenv('DB_USER'),
password=os.getenv('DB_PASSWORD'),
)
version = await conn.fetchval('SELECT version()')
print(f"[OK] Database connected successfully!")
print(f" Host: {os.getenv('DB_HOST')}:{os.getenv('DB_PORT')}")
print(f" Database: {os.getenv('DB_NAME')}")
print(f" User: {os.getenv('DB_USER')}")
print(f" PostgreSQL: {version[:50]}...")
# Check if tables exist
tables = await conn.fetch("""
SELECT table_name FROM information_schema.tables
WHERE table_schema = 'public'
ORDER BY table_name
""")
table_names = [row['table_name'] for row in tables]
print(f"\n[OK] Found {len(table_names)} tables:")
for table in table_names:
print(f" - {table}")
# Check candles count
if 'candles' in table_names:
count = await conn.fetchval('SELECT COUNT(*) FROM candles')
latest_time = await conn.fetchval("""
SELECT MAX(time) FROM candles
WHERE time > NOW() - INTERVAL '7 days'
""")
print(f"\n[OK] Candles table has {count} total records")
if latest_time:
print(f" Latest candle (last 7 days): {latest_time}")
await conn.close()
return True
except Exception as e:
print(f"[FAIL] Database connection failed:")
print(f" Error: {e}")
print(f"\nCheck:")
print(f" 1. NAS is reachable at {os.getenv('DB_HOST')}:{os.getenv('DB_PORT')}")
print(f" 2. PostgreSQL is running")
print(f" 3. Database '{os.getenv('DB_NAME')}' exists")
print(f" 4. User '{os.getenv('DB_USER')}' has access")
return False
if __name__ == '__main__':
asyncio.run(test_db_connection())