# Initial commit - BTC Trading Dashboard

- FastAPI backend with PostgreSQL database connection
- Frontend dashboard with lightweight-charts
- Technical indicators (SMA, EMA, RSI, MACD, Bollinger Bands, etc.)
- Trading strategy simulation and backtesting
- Database connection to NAS at 20.20.20.20:5433
- Development server setup and documentation
**.gitignore** (vendored, new file, 56 lines)

```
# Python
__pycache__/
*.py[cod]
*$py.class
*.so
.Python
*.egg-info/
dist/
build/
*.egg

# Virtual Environment
venv/
env/
ENV/
.venv

# IDE
.vscode/
.idea/
*.swp
*.swo
*~
.DS_Store

# Logs
*.log
logs/
*.log.*

# Environment variables
.env
.env.local
.env.*.local

# Database
*.db
*.sqlite
*.sqlite3
*.dump

# OS
Thumbs.db
.DS_Store
desktop.ini

# Project specific
*.pid
*.seed
*.pid.lock
.pytest_cache/
.coverage
htmlcov/
.tox/
.cache
nosetests.xml
```
**README.md** (new file, 145 lines)

# BTC Trading Dashboard

A Bitcoin trading dashboard with FastAPI backend, PostgreSQL database, and technical analysis features.

## Architecture

- **Backend**: FastAPI (Python)
- **Frontend**: HTML/JS dashboard with lightweight-charts
- **Database**: PostgreSQL (connects to NAS)
- **Features**:
  - Real-time candle data
  - Technical indicators (SMA, EMA, RSI, MACD, Bollinger Bands, etc.)
  - Trading strategy simulation
  - Backtesting

## Prerequisites

- Python 3.9+
- PostgreSQL database (on NAS at 20.20.20.20:5433)

## Setup

### 1. Virtual Environment

```cmd
python -m venv venv
```

### 2. Install Dependencies

```cmd
venv\Scripts\activate
pip install -r requirements.txt
```

### 3. Configure Database

Edit the `.env` file:

```
DB_HOST=20.20.20.20
DB_PORT=5433
DB_NAME=btc_data
DB_USER=btc_bot
DB_PASSWORD=your_password
```

### 4. Test Database Connection

```cmd
python test_db.py
```

## Running the Server

### Quick Start

**Windows:**
```cmd
start_dev.cmd
```

**Linux/Mac:**
```bash
chmod +x start_dev.sh
./start_dev.sh
```

### Manual Start

```cmd
venv\Scripts\activate
uvicorn src.api.server:app --reload --host 0.0.0.0 --port 8000
```

## Access the Application

Once the server is running:

- **Dashboard**: http://localhost:8000/dashboard
- **API Docs**: http://localhost:8000/docs
- **Health Check**: http://localhost:8000/api/v1/health

## Project Structure

```
.
├── config/
│   └── data_config.yaml        # Data collection configuration
├── src/
│   ├── api/
│   │   ├── server.py           # FastAPI application
│   │   └── dashboard/          # Frontend static files
│   ├── data_collector/         # Data collection modules
│   │   ├── main.py             # Data collector service
│   │   ├── database.py         # Database manager
│   │   ├── websocket_client.py # WebSocket client
│   │   ├── indicator_engine.py # Technical indicators
│   │   ├── brain.py            # Trading logic
│   │   └── backtester.py       # Backtesting engine
│   └── strategies/             # Trading strategies
│       ├── base.py             # Base strategy class
│       └── ma_strategy.py      # Moving average strategy
├── .env                        # Environment variables
├── requirements.txt            # Python dependencies
└── test_db.py                  # Database connection test
```

## API Endpoints

| Endpoint | Method | Description |
|----------|--------|-------------|
| `/` | GET | API info |
| `/api/v1/health` | GET | System health check |
| `/api/v1/candles` | GET | Get candle data |
| `/api/v1/strategies` | GET | List available strategies |
| `/api/v1/ta` | GET | Technical analysis |
| `/api/v1/stats` | GET | Trading statistics |
| `/api/v1/backtests` | POST | Trigger backtest |

## Development Tips

1. **Auto-reload**: The server reloads automatically when Python files change
2. **Database changes**: Restart the server to pick up schema changes
3. **Frontend**: Edit HTML/JS in `src/api/dashboard/static/`
4. **Indicators**: Add new indicators in `src/api/dashboard/static/js/indicators/`
5. **Strategies**: Create strategies in `src/strategies/`

## Troubleshooting

### Port 8000 already in use

```cmd
netstat -ano | findstr :8000
taskkill /PID <PID> /F
```

### Database connection failed

1. Check the NAS is reachable: `ping 20.20.20.20`
2. Verify PostgreSQL is running on the NAS
3. Check `.env` credentials
4. Run `python test_db.py` for diagnosis

### No data in dashboard

1. Verify the data collector is running on the NAS
2. Check the database has a candles table
3. Use the API docs to query data manually
**RUN_SERVER.bat** (new file, 17 lines)

```bat
@echo off
title BTC Dashboard Server
cd /d "%~dp0"
echo ===================================
echo Starting BTC Dashboard Server
echo ===================================
echo.
echo Dashboard: http://localhost:8000/dashboard
echo API Docs: http://localhost:8000/docs
echo.
echo Press Ctrl+C to stop
echo ===================================
echo.

call venv\Scripts\uvicorn src.api.server:app --host 0.0.0.0 --port 8000 --reload

pause
```
**config/data_config.yaml** (new file, 94 lines)

```yaml
# Data Collection Configuration
data_collection:
  # Primary data source
  primary_exchange: "hyperliquid"

  # Assets to collect
  assets:
    cbBTC:
      symbol: "cbBTC-PERP"
      enabled: true
      base_asset: "cbBTC"
      quote_asset: "USD"

  # Validation settings
  validation:
    enabled: true
    tolerance_percent: 1.0  # 1% price divergence allowed
    check_interval_minutes: 5

    # Reference sources for cross-validation
    references:
      uniswap_v3:
        enabled: true
        chain: "base"
        pool_address: "0x4f1480ba4F40f2A41a718c8699E64976b222b56d"  # cbBTC/USDC
        rpc_url: "https://base-mainnet.g.alchemy.com/v2/YOUR_API_KEY"

      coinbase:
        enabled: true
        api_url: "https://api.exchange.coinbase.com"

  # Intervals to collect (1m is base, others computed)
  intervals:
    - "1m"  # Base collection

  indicators:
    ma44:
      type: "sma"
      period: 44
      intervals: ["1d"]
    ma125:
      type: "sma"
      period: 125
      intervals: ["1d"]

# WebSocket settings
websocket:
  url: "wss://api.hyperliquid.xyz/ws"
  reconnect_attempts: 10
  reconnect_delays: [1, 2, 5, 10, 30, 60, 120, 300, 600, 900]  # seconds
  ping_interval: 30
  ping_timeout: 10

# Buffer settings
buffer:
  max_size: 1000  # candles in memory
  flush_interval_seconds: 30
  batch_size: 100

# Database settings
database:
  host: "${DB_HOST}"
  port: ${DB_PORT}
  name: "${DB_NAME}"
  user: "${DB_USER}"
  password: "${DB_PASSWORD}"
  pool_size: 5
  max_overflow: 10

# Backfill settings
backfill:
  enabled: true
  max_gap_minutes: 60
  rest_api_url: "https://api.hyperliquid.xyz/info"

# Quality monitoring
quality_monitor:
  enabled: true
  check_interval_seconds: 300  # 5 minutes
  anomaly_detection:
    price_change_threshold: 0.10  # 10%
    volume_spike_std: 5.0  # 5 sigma

# Logging
logging:
  level: "INFO"
  format: "%(asctime)s - %(name)s - %(levelname)s - %(message)s"
  file: "/app/logs/collector.log"
  max_size_mb: 100
  backup_count: 10

# Performance
performance:
  max_cpu_percent: 80
  max_memory_mb: 256
```
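The `reconnect_delays` list is a backoff schedule: attempt *n* waits `delays[n]` seconds, and attempts past the end of the list reuse the final (longest) delay. The collector's actual client code is not shown in this commit, so the function below is only an illustrative sketch of how such a schedule is typically consumed:

```javascript
// Illustrative consumer of the reconnect_delays schedule from data_config.yaml.
// Later attempts clamp to the last configured delay instead of running off the end.
const RECONNECT_DELAYS = [1, 2, 5, 10, 30, 60, 120, 300, 600, 900]; // seconds

function reconnectDelay(attempt, delays = RECONNECT_DELAYS) {
  // attempt is 0-based; Math.min keeps the index inside the schedule
  return delays[Math.min(attempt, delays.length - 1)];
}
```

With `reconnect_attempts: 10` the schedule covers every attempt exactly once; the clamp only matters if a client retries beyond the configured limit.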
**kill_port_8000.bat** (new file, 34 lines)

```bat
@echo off
setlocal enabledelayedexpansion

echo ===================================
echo Kill Process on Port 8000
echo ===================================
echo.

REM Clear PID so a stale value from the environment cannot leak in
set "PID="

REM Find PID using port 8000
for /f "tokens=5" %%a in ('netstat -ano ^| findstr ":8000" ^| findstr "LISTENING"') do (
    set "PID=%%a"
)

if "%PID%"=="" (
    echo No process found on port 8000
) else (
    echo Found process PID: %PID% on port 8000
    taskkill /F /PID %PID% 2>nul
    REM Inside a parenthesized block, use !errorlevel! (delayed expansion),
    REM otherwise %errorlevel% is expanded before taskkill runs
    if !errorlevel! equ 0 (
        echo Process killed successfully
    ) else (
        echo Failed to kill process
    )
)

echo.
REM "sleep" is not a cmd builtin; timeout is the Windows equivalent
timeout /t 2 /nobreak >nul
netstat -ano | findstr ":8000" | findstr "LISTENING"
if %errorlevel% neq 0 (
    echo Port 8000 is now free
) else (
    echo Port 8000 still in use
)

pause
```
**requirements.txt** (new file, 10 lines)

```
fastapi>=0.104.0
uvicorn[standard]>=0.24.0
asyncpg>=0.29.0
aiohttp>=3.9.0
websockets>=12.0
pydantic>=2.5.0
pydantic-settings>=2.1.0
pyyaml>=6.0
python-dotenv>=1.0.0
python-multipart>=0.0.6
```
**src/TV/HTS.pine** (new file, 86 lines)

```pine
//@version=5
indicator(title='HTS p1otek (Fixed)', overlay=true)

// Helper function to return the correct timeframe string for request.security
// Note: We let Pine Script infer the return type to avoid syntax errors
getAutoTFString(chartTFInMinutes) =>
    float autoTFMinutes = chartTFInMinutes / 4.0

    // Use an existing time resolution string if possible (D, W, M)
    if timeframe.isdaily
        // 'D' timeframe is 1440 minutes. 1440 / 4 = 360 minutes (6 hours)
        // We return "360" which Pine Script accepts as a resolution
        str.tostring(math.round(autoTFMinutes))
    else if timeframe.isweekly or timeframe.ismonthly
        // Cannot divide W or M timeframes reliably, return current timeframe string
        timeframe.period
    else
        // For standard minute timeframes, use the calculated minutes
        str.tostring(math.round(autoTFMinutes))

// Inputs
// FIXED: Changed input.integer to input.int
short = input.int(33, "fast")
long = input.int(144, "slow")
auto = input.bool(false, title="auto HTS (timeframe/4)")
draw_1h = input.bool(false, title="draw 1h slow HTS")

metoda = input.string(title="type average", defval="RMA", options=["RMA", "EMA", "SMA", "WMA", "VWMA"])

// Calculate chart TF in minutes
float chartTFInMinutes = timeframe.in_seconds() / 60
// Get the auto-calculated timeframe string
string autoTFString = getAutoTFString(chartTFInMinutes)

srednia(src, length, type) =>
    switch type
        "RMA" => ta.rma(src, length)
        "EMA" => ta.ema(src, length)
        "SMA" => ta.sma(src, length)
        "WMA" => ta.wma(src, length)
        "VWMA" => ta.vwma(src, length)

// === Non-Auto (Current Timeframe) Calculations ===
string currentTFString = timeframe.period

shortl = request.security(syminfo.tickerid, currentTFString, srednia(low, short, metoda))
shorth = request.security(syminfo.tickerid, currentTFString, srednia(high, short, metoda))
longl = request.security(syminfo.tickerid, currentTFString, srednia(low, long, metoda))
longh = request.security(syminfo.tickerid, currentTFString, srednia(high, long, metoda))

// === Auto Timeframe Calculations ===
shortl_auto = request.security(syminfo.tickerid, autoTFString, srednia(low, short, metoda))
shorth_auto = request.security(syminfo.tickerid, autoTFString, srednia(high, short, metoda))
longl_auto = request.security(syminfo.tickerid, autoTFString, srednia(low, long, metoda))
longh_auto = request.security(syminfo.tickerid, autoTFString, srednia(high, long, metoda))

// === 1H Timeframe Calculations ===
// Use a fixed '60' for 1 hour
longl_1h = request.security(syminfo.tickerid, "60", srednia(low, long, metoda))
longh_1h = request.security(syminfo.tickerid, "60", srednia(high, long, metoda))

// === Plotting ===

// Auto HTS
plot(auto ? shortl_auto : na, color=color.new(color.aqua, 0), linewidth=1, title="fast low auto")
plot(auto ? shorth_auto : na, color=color.new(color.aqua, 0), linewidth=1, title="fast high auto")
plot(auto ? longl_auto : na, color=color.new(color.red, 0), linewidth=1, title="slow low auto")
plot(auto ? longh_auto : na, color=color.new(color.red, 0), linewidth=1, title="slow high auto")

// Current TF (only when Auto is enabled, for reference)
ll = plot(auto ? longl : na, color=color.new(color.red, 80), linewidth=1, title="current slow low")
oo = plot(auto ? longh : na, color=color.new(color.red, 80), linewidth=1, title="current slow high")
fill(ll, oo, color=color.new(color.red, 90))

// 1H Zone
zone_1hl = plot(draw_1h ? longl_1h : na, color=color.new(color.red, 80), linewidth=1, title="1h slow low")
zone_1hh = plot(draw_1h ? longh_1h : na, color=color.new(color.red, 80), linewidth=1, title="1h slow high")
fill(zone_1hl, zone_1hh, color=color.new(color.red, 90))

// Non-Auto HTS
plot(not auto ? shortl : na, color=color.new(color.aqua, 0), linewidth=1, title="fast low")
plot(not auto ? shorth : na, color=color.new(color.aqua, 0), linewidth=1, title="fast high")
plot(not auto ? longl : na, color=color.new(color.red, 0), linewidth=1, title="slow low")
plot(not auto ? longh : na, color=color.new(color.red, 0), linewidth=1, title="slow high")
```
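The "auto HTS (timeframe/4)" rule in `getAutoTFString` is plain arithmetic: divide the chart timeframe in minutes by four and round to the nearest whole minute. A quick sketch of that calculation (written in JavaScript here purely to check the numbers, not taken from the project code):

```javascript
// Mirrors the Pine helper's arithmetic: a daily chart (1440 minutes) maps to
// a 360-minute (6-hour) higher timeframe, a 60-minute chart maps to 15 minutes.
function autoTFMinutes(chartTFInMinutes) {
  return Math.round(chartTFInMinutes / 4);
}
```

The Pine version additionally falls back to `timeframe.period` on weekly and monthly charts, where dividing the timeframe is not reliable.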
**src/api/dashboard/static/index.html** (new file, 1490 lines) — diff suppressed because it is too large.
**src/api/dashboard/static/js/app.js** (new file, 94 lines)

```javascript
import { TradingDashboard, refreshTA, openAIAnalysis } from './ui/chart.js';
import { restoreSidebarState, toggleSidebar } from './ui/sidebar.js';
import { SimulationStorage } from './ui/storage.js';
import { showExportDialog, closeExportDialog, performExport, exportSavedSimulation } from './ui/export.js';
import {
    runSimulation,
    displayEnhancedResults,
    showSimulationMarkers,
    clearSimulationResults,
    getLastResults,
    setLastResults
} from './ui/simulation.js';
import {
    renderStrategies,
    selectStrategy,
    loadStrategies,
    saveSimulation,
    renderSavedSimulations,
    loadSavedSimulation,
    deleteSavedSimulation,
    setCurrentStrategy
} from './ui/strategies-panel.js';
import {
    renderIndicatorList,
    addIndicator,
    toggleIndicator,
    showIndicatorConfig,
    applyIndicatorConfig,
    removeIndicator,
    removeIndicatorById,
    removeIndicatorByIndex,
    drawIndicatorsOnChart
} from './ui/indicators-panel.js';
import { StrategyParams } from './strategies/config.js';
import { IndicatorRegistry } from './indicators/index.js';

window.dashboard = null;

function setDefaultStartDate() {
    const startDateInput = document.getElementById('simStartDate');
    if (startDateInput) {
        const sevenDaysAgo = new Date();
        sevenDaysAgo.setDate(sevenDaysAgo.getDate() - 7);
        startDateInput.value = sevenDaysAgo.toISOString().slice(0, 16);
    }
}

function updateTimeframeDisplay() {
    const display = document.getElementById('simTimeframeDisplay');
    if (display && window.dashboard) {
        display.value = window.dashboard.currentInterval.toUpperCase();
    }
}

window.toggleSidebar = toggleSidebar;
window.refreshTA = refreshTA;
window.openAIAnalysis = openAIAnalysis;
window.showExportDialog = showExportDialog;
window.closeExportDialog = closeExportDialog;
window.performExport = performExport;
window.exportSavedSimulation = exportSavedSimulation;
window.runSimulation = runSimulation;
window.saveSimulation = saveSimulation;
window.renderSavedSimulations = renderSavedSimulations;
window.loadSavedSimulation = loadSavedSimulation;
window.deleteSavedSimulation = deleteSavedSimulation;
window.clearSimulationResults = clearSimulationResults;
window.updateTimeframeDisplay = updateTimeframeDisplay;
window.renderIndicatorList = renderIndicatorList;
window.addIndicator = addIndicator;
window.toggleIndicator = toggleIndicator;
window.showIndicatorConfig = showIndicatorConfig;

window.StrategyParams = StrategyParams;
window.SimulationStorage = SimulationStorage;
window.IndicatorRegistry = IndicatorRegistry;

document.addEventListener('DOMContentLoaded', async () => {
    window.dashboard = new TradingDashboard();
    restoreSidebarState();
    setDefaultStartDate();
    updateTimeframeDisplay();
    renderSavedSimulations();

    await loadStrategies();

    renderIndicatorList();

    const originalSwitchTimeframe = window.dashboard.switchTimeframe.bind(window.dashboard);
    window.dashboard.switchTimeframe = function(interval) {
        originalSwitchTimeframe(interval);
        setTimeout(() => drawIndicatorsOnChart(), 500);
    };
});
```
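The `switchTimeframe` override at the end of app.js is a wrap-and-delegate pattern: keep a bound reference to the original method, then replace it with a function that calls the original and runs extra work afterwards. A minimal standalone sketch of that pattern (the `dashboard` object and `'redraw'` marker here are illustrative, not the project's classes):

```javascript
// Track what happens, to show the wrapper runs the original first, then the hook.
const calls = [];
const dashboard = {
  interval: '1m',
  switchTimeframe(interval) {
    this.interval = interval;
    calls.push('switch');
  }
};

// Keep a bound reference so "this" still points at dashboard, then wrap it.
const original = dashboard.switchTimeframe.bind(dashboard);
dashboard.switchTimeframe = function (interval) {
  original(interval);       // original behavior
  calls.push('redraw');     // stand-in for drawIndicatorsOnChart()
};

dashboard.switchTimeframe('1h');
```

Binding before wrapping matters: without `.bind(dashboard)`, `this` inside the original method would no longer refer to the dashboard instance when called from the wrapper.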
**src/api/dashboard/static/js/core/constants.js** (new file, 15 lines)

```javascript
export const INTERVALS = ['1m', '3m', '5m', '15m', '30m', '37m', '1h', '2h', '4h', '8h', '12h', '1d', '3d', '1w', '1M'];

export const COLORS = {
    tvBg: '#131722',
    tvPanelBg: '#1e222d',
    tvBorder: '#2a2e39',
    tvText: '#d1d4dc',
    tvTextSecondary: '#787b86',
    tvGreen: '#26a69a',
    tvRed: '#ef5350',
    tvBlue: '#2962ff',
    tvHover: '#2a2e39'
};

export const API_BASE = '/api/v1';
```
**src/api/dashboard/static/js/core/index.js** (new file, 1 line)

```javascript
export { INTERVALS, COLORS, API_BASE } from './constants.js';
```
**src/api/dashboard/static/js/indicators/atr.js** (new file, 38 lines)

```javascript
import { BaseIndicator } from './base.js';

export class ATRIndicator extends BaseIndicator {
    calculate(candles) {
        const period = this.params.period || 14;
        const results = new Array(candles.length).fill(null);
        // Guard: with fewer than period+1 candles the seed would read past
        // the true-range array and write NaN beyond the results length
        if (candles.length <= period) return results;
        const tr = new Array(candles.length).fill(0);

        for (let i = 1; i < candles.length; i++) {
            const h_l = candles[i].high - candles[i].low;
            const h_pc = Math.abs(candles[i].high - candles[i - 1].close);
            const l_pc = Math.abs(candles[i].low - candles[i - 1].close);
            tr[i] = Math.max(h_l, h_pc, l_pc);
        }

        let atr = 0;
        let sum = 0;
        for (let i = 1; i <= period; i++) sum += tr[i];
        atr = sum / period;
        results[period] = atr;

        for (let i = period + 1; i < candles.length; i++) {
            atr = (atr * (period - 1) + tr[i]) / period;
            results[i] = atr;
        }
        return results;
    }

    getMetadata() {
        return {
            name: 'ATR',
            description: 'Average True Range - measures market volatility',
            inputs: [{ name: 'period', label: 'Period', type: 'number', default: 14, min: 1, max: 100 }],
            plots: [{ id: 'value', color: '#795548', title: 'ATR' }],
            displayMode: 'pane'
        };
    }
}
```
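The ATR class seeds the first value with a simple average of the first `period` true ranges, then applies Wilder smoothing (`atr = (atr * (period - 1) + TR) / period`). A standalone sketch of the same recurrence, kept as a plain function so it can be checked against synthetic candles:

```javascript
// Standalone ATR mirroring ATRIndicator's recurrence:
// seed = mean(TR[1..period]); then Wilder smoothing for each later bar.
function atr(candles, period = 14) {
  const results = new Array(candles.length).fill(null);
  if (candles.length <= period) return results; // not enough data to seed

  const tr = new Array(candles.length).fill(0);
  for (let i = 1; i < candles.length; i++) {
    tr[i] = Math.max(
      candles[i].high - candles[i].low,
      Math.abs(candles[i].high - candles[i - 1].close),
      Math.abs(candles[i].low - candles[i - 1].close)
    );
  }

  let sum = 0;
  for (let i = 1; i <= period; i++) sum += tr[i];
  let value = sum / period;
  results[period] = value;

  for (let i = period + 1; i < candles.length; i++) {
    value = (value * (period - 1) + tr[i]) / period;
    results[i] = value;
  }
  return results;
}
```

On a flat series where every candle has `high: 2, low: 1, close: 1.5`, every true range is 1, so the ATR is exactly 1 from the seed bar onward — a quick sanity check for any reimplementation.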
**src/api/dashboard/static/js/indicators/base.js** (new file, 18 lines)

```javascript
export class BaseIndicator {
    constructor(config) {
        this.name = config.name;
        this.type = config.type;
        this.params = config.params || {};
        this.timeframe = config.timeframe || '1m';
    }

    calculate(candles) { throw new Error("Not implemented"); }

    getMetadata() {
        return {
            name: this.name,
            inputs: [],
            plots: [],
            displayMode: 'overlay'
        };
    }
}
```
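`BaseIndicator` is a template: subclasses must override `calculate` (the base throws), and may override `getMetadata` to describe their inputs and plots. A minimal check of that contract — the `Doubler` subclass below is purely illustrative, not one of the project's indicators:

```javascript
// Re-stated base class (same contract as base.js) so this sketch is self-contained.
class BaseIndicator {
  constructor(config) {
    this.params = (config && config.params) || {};
  }
  // Subclasses must override; calling the base is a programming error.
  calculate(candles) { throw new Error('Not implemented'); }
}

// Illustrative subclass: overrides calculate() the way the real indicators do.
class Doubler extends BaseIndicator {
  calculate(candles) {
    return candles.map(c => c.close * 2);
  }
}
```

This "throw in the base, override in subclasses" style is a lightweight way to get abstract-method behavior in JavaScript, which has no built-in `abstract` keyword.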
**src/api/dashboard/static/js/indicators/bb.js** (new file, 43 lines)

```javascript
import { BaseIndicator } from './base.js';

export class BollingerBandsIndicator extends BaseIndicator {
    calculate(candles) {
        const period = this.params.period || 20;
        const stdDevMult = this.params.stdDev || 2;
        const results = new Array(candles.length).fill(null);

        for (let i = period - 1; i < candles.length; i++) {
            let sum = 0;
            for (let j = 0; j < period; j++) sum += candles[i - j].close;
            const sma = sum / period;

            let diffSum = 0;
            for (let j = 0; j < period; j++) diffSum += Math.pow(candles[i - j].close - sma, 2);
            const stdDev = Math.sqrt(diffSum / period);

            results[i] = {
                middle: sma,
                upper: sma + (stdDevMult * stdDev),
                lower: sma - (stdDevMult * stdDev)
            };
        }
        return results;
    }

    getMetadata() {
        return {
            name: 'Bollinger Bands',
            description: 'Volatility bands around a moving average',
            inputs: [
                { name: 'period', label: 'Period', type: 'number', default: 20, min: 1, max: 100 },
                { name: 'stdDev', label: 'Std Dev', type: 'number', default: 2, min: 0.5, max: 5, step: 0.5 }
            ],
            plots: [
                { id: 'upper', color: '#4caf50', title: 'Upper' },
                { id: 'middle', color: '#4caf50', title: 'Middle', lineStyle: 2 },
                { id: 'lower', color: '#4caf50', title: 'Lower' }
            ],
            displayMode: 'overlay'
        };
    }
}
```
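A useful sanity check for the band math above: on a perfectly flat series the standard deviation is zero, so upper, middle, and lower all collapse onto the SMA. The standalone function below mirrors the class's computation (population standard deviation, `middle ± mult · stddev`) over an array of closes so that property is easy to verify:

```javascript
// Standalone Bollinger computation mirroring BollingerBandsIndicator:
// middle = SMA(period); upper/lower = middle ± mult * population stddev.
function bollinger(closes, period = 20, mult = 2) {
  const results = new Array(closes.length).fill(null);
  for (let i = period - 1; i < closes.length; i++) {
    let sum = 0;
    for (let j = 0; j < period; j++) sum += closes[i - j];
    const sma = sum / period;

    let diffSum = 0;
    for (let j = 0; j < period; j++) diffSum += (closes[i - j] - sma) ** 2;
    const sd = Math.sqrt(diffSum / period);

    results[i] = { middle: sma, upper: sma + mult * sd, lower: sma - mult * sd };
  }
  return results;
}
```

Note the divisor is `period` (population variance), matching the class; a sample-variance variant would divide by `period - 1` and give slightly wider bands.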
**src/api/dashboard/static/js/indicators/ema.js** (new file, 18 lines)

```javascript
import { MA } from './ma.js';
import { BaseIndicator } from './base.js';

export class EMAIndicator extends BaseIndicator {
    calculate(candles) {
        const period = this.params.period || 44;
        return MA.ema(candles, period, 'close');
    }

    getMetadata() {
        return {
            name: 'EMA',
            inputs: [{ name: 'period', label: 'Period', type: 'number', default: 44, min: 1, max: 500 }],
            plots: [{ id: 'value', color: '#ff9800', title: 'EMA' }],
            displayMode: 'overlay'
        };
    }
}
```
**src/api/dashboard/static/js/indicators/hts.js** (new file, 41 lines)

```javascript
import { MA } from './ma.js';
import { BaseIndicator } from './base.js';

export class HTSIndicator extends BaseIndicator {
    calculate(candles) {
        const shortPeriod = this.params.short || 33;
        const longPeriod = this.params.long || 144;
        const maType = this.params.maType || 'RMA';

        const shortHigh = MA.get(maType, candles, shortPeriod, 'high');
        const shortLow = MA.get(maType, candles, shortPeriod, 'low');
        const longHigh = MA.get(maType, candles, longPeriod, 'high');
        const longLow = MA.get(maType, candles, longPeriod, 'low');

        return candles.map((_, i) => ({
            fastHigh: shortHigh[i],
            fastLow: shortLow[i],
            slowHigh: longHigh[i],
            slowLow: longLow[i]
        }));
    }

    getMetadata() {
        return {
            name: 'HTS Trend System',
            description: 'High/Low Trend System with Fast and Slow MAs',
            inputs: [
                { name: 'short', label: 'Fast Period', type: 'number', default: 33, min: 1, max: 500 },
                { name: 'long', label: 'Slow Period', type: 'number', default: 144, min: 1, max: 500 },
                { name: 'maType', label: 'MA Type', type: 'select', options: ['SMA', 'EMA', 'RMA', 'WMA', 'VWMA'], default: 'RMA' }
            ],
            plots: [
                { id: 'fastHigh', color: '#00bcd4', title: 'Fast High', width: 1 },
                { id: 'fastLow', color: '#00bcd4', title: 'Fast Low', width: 1 },
                { id: 'slowHigh', color: '#f44336', title: 'Slow High', width: 2 },
                { id: 'slowLow', color: '#f44336', title: 'Slow Low', width: 2 }
            ],
            displayMode: 'overlay'
        };
    }
}
```
**src/api/dashboard/static/js/indicators/index.js** (new file, 43 lines)

```javascript
export { MA } from './ma.js';
export { BaseIndicator } from './base.js';
export { HTSIndicator } from './hts.js';
export { MAIndicator } from './ma_indicator.js';
export { RSIIndicator } from './rsi.js';
export { BollingerBandsIndicator } from './bb.js';
export { MACDIndicator } from './macd.js';
export { StochasticIndicator } from './stoch.js';
export { ATRIndicator } from './atr.js';

import { HTSIndicator } from './hts.js';
import { MAIndicator } from './ma_indicator.js';
import { RSIIndicator } from './rsi.js';
import { BollingerBandsIndicator } from './bb.js';
import { MACDIndicator } from './macd.js';
import { StochasticIndicator } from './stoch.js';
import { ATRIndicator } from './atr.js';

export const IndicatorRegistry = {
    hts: HTSIndicator,
    ma: MAIndicator,
    rsi: RSIIndicator,
    bb: BollingerBandsIndicator,
    macd: MACDIndicator,
    stoch: StochasticIndicator,
    atr: ATRIndicator
};

/**
 * Dynamically build the available indicators list from the registry.
 * Each indicator class provides its own name and description via getMetadata().
 */
export function getAvailableIndicators() {
    return Object.entries(IndicatorRegistry).map(([type, IndicatorClass]) => {
        const instance = new IndicatorClass({ type, params: {}, name: '' });
        const meta = instance.getMetadata();
        return {
            type,
            name: meta.name || type.toUpperCase(),
            description: meta.description || ''
        };
    });
}
```
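`getAvailableIndicators` is a registry walk: map each type key to its class, instantiate, and ask the instance to describe itself. The same pattern in miniature — the `Rsi`/`Atr` classes here are throwaway stand-ins, not the project's indicator classes:

```javascript
// Miniature registry: type key -> class; the listing is built by instantiating
// each class and reading its getMetadata(), exactly as indicators/index.js does.
class Rsi { getMetadata() { return { name: 'RSI', description: 'momentum' }; } }
class Atr { getMetadata() { return { name: 'ATR', description: 'volatility' }; } }

const Registry = { rsi: Rsi, atr: Atr };

function listIndicators(registry) {
  return Object.entries(registry).map(([type, Cls]) => {
    const meta = new Cls().getMetadata();
    return {
      type,
      name: meta.name || type.toUpperCase(),
      description: meta.description || ''
    };
  });
}
```

The payoff of this design is that adding a new indicator only requires registering its class; the UI's indicator picker derives its list automatically from the metadata.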
**src/api/dashboard/static/js/indicators/ma.js** (new file, 93 lines)

```javascript
export class MA {
    static get(type, candles, period, source = 'close') {
        switch (type.toUpperCase()) {
            case 'SMA': return MA.sma(candles, period, source);
            case 'EMA': return MA.ema(candles, period, source);
            case 'RMA': return MA.rma(candles, period, source);
            case 'WMA': return MA.wma(candles, period, source);
            case 'VWMA': return MA.vwma(candles, period, source);
            default: return MA.sma(candles, period, source);
        }
    }

    static sma(candles, period, source = 'close') {
        const results = new Array(candles.length).fill(null);
        let sum = 0;
        for (let i = 0; i < candles.length; i++) {
            sum += candles[i][source];
            if (i >= period) sum -= candles[i - period][source];
            if (i >= period - 1) results[i] = sum / period;
        }
        return results;
    }

    static ema(candles, period, source = 'close') {
        const multiplier = 2 / (period + 1);
        const results = new Array(candles.length).fill(null);
        let ema = 0;
        let sum = 0;
        for (let i = 0; i < candles.length; i++) {
            if (i < period) {
                sum += candles[i][source];
                if (i === period - 1) {
                    ema = sum / period;
                    results[i] = ema;
                }
            } else {
                ema = (candles[i][source] - ema) * multiplier + ema;
                results[i] = ema;
            }
        }
        return results;
    }

    static rma(candles, period, source = 'close') {
        const multiplier = 1 / period;
        const results = new Array(candles.length).fill(null);
        let rma = 0;
        let sum = 0;

        for (let i = 0; i < candles.length; i++) {
            if (i < period) {
                sum += candles[i][source];
                if (i === period - 1) {
                    rma = sum / period;
                    results[i] = rma;
                }
            } else {
                rma = (candles[i][source] - rma) * multiplier + rma;
                results[i] = rma;
            }
        }
        return results;
    }

    static wma(candles, period, source = 'close') {
        const results = new Array(candles.length).fill(null);
        const weightSum = (period * (period + 1)) / 2;

        for (let i = period - 1; i < candles.length; i++) {
            let sum = 0;
            for (let j = 0; j < period; j++) {
                sum += candles[i - j][source] * (period - j);
            }
            results[i] = sum / weightSum;
        }
        return results;
    }

    static vwma(candles, period, source = 'close') {
        const results = new Array(candles.length).fill(null);

        for (let i = period - 1; i < candles.length; i++) {
            let sumPV = 0;
            let sumV = 0;
            for (let j = 0; j < period; j++) {
                sumPV += candles[i - j][source] * candles[i - j].volume;
                sumV += candles[i - j].volume;
            }
            results[i] = sumV !== 0 ? sumPV / sumV : null;
        }
        return results;
    }
}
```
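`MA.sma` avoids re-summing the window on every bar: it keeps a running sum, adds the newest value, and subtracts the value that just fell out of the window, so each bar costs O(1). A standalone sketch of that sliding-window SMA over a plain array of closes, easy to verify by hand:

```javascript
// Sliding-window SMA matching MA.sma's approach: one pass, running sum,
// nulls until the first full window of `period` values.
function sma(values, period) {
  const results = new Array(values.length).fill(null);
  let sum = 0;
  for (let i = 0; i < values.length; i++) {
    sum += values[i];                       // newest value enters the window
    if (i >= period) sum -= values[i - period]; // oldest value leaves it
    if (i >= period - 1) results[i] = sum / period;
  }
  return results;
}
```

For `[1, 2, 3, 4, 5]` with period 3 the windows are (1,2,3), (2,3,4), (3,4,5), giving averages 2, 3, 4 after two leading nulls.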
**src/api/dashboard/static/js/indicators/ma_indicator.js** (new file, 23 lines)

```javascript
import { MA } from './ma.js';
import { BaseIndicator } from './base.js';

export class MAIndicator extends BaseIndicator {
    calculate(candles) {
        const period = this.params.period || 44;
        const maType = this.params.maType || 'SMA';
        return MA.get(maType, candles, period, 'close');
    }

    getMetadata() {
        return {
            name: 'MA',
            description: 'Moving Average (SMA/EMA/RMA/WMA/VWMA)',
            inputs: [
                { name: 'period', label: 'Period', type: 'number', default: 44, min: 1, max: 500 },
                { name: 'maType', label: 'MA Type', type: 'select', options: ['SMA', 'EMA', 'RMA', 'WMA', 'VWMA'], default: 'SMA' }
            ],
            plots: [{ id: 'value', color: '#2962ff', title: 'MA' }],
            displayMode: 'overlay'
        };
    }
}
```
60
src/api/dashboard/static/js/indicators/macd.js
Normal file
60
src/api/dashboard/static/js/indicators/macd.js
Normal file
@ -0,0 +1,60 @@
|
||||
import { MA } from './ma.js';
import { BaseIndicator } from './base.js';

export class MACDIndicator extends BaseIndicator {
    calculate(candles) {
        const fast = this.params.fast || 12;
        const slow = this.params.slow || 26;
        const signal = this.params.signal || 9;

        const fastEma = MA.ema(candles, fast, 'close');
        const slowEma = MA.ema(candles, slow, 'close');

        const macdLine = fastEma.map((f, i) => (f !== null && slowEma[i] !== null) ? f - slowEma[i] : null);

        const signalLine = new Array(candles.length).fill(null);
        const multiplier = 2 / (signal + 1);
        let ema = 0;
        let sum = 0;
        let count = 0;

        for (let i = 0; i < macdLine.length; i++) {
            if (macdLine[i] === null) continue;
            count++;
            if (count < signal) {
                sum += macdLine[i];
            } else if (count === signal) {
                sum += macdLine[i];
                ema = sum / signal;
                signalLine[i] = ema;
            } else {
                ema = (macdLine[i] - ema) * multiplier + ema;
                signalLine[i] = ema;
            }
        }

        return macdLine.map((m, i) => ({
            macd: m,
            signal: signalLine[i],
            histogram: (m !== null && signalLine[i] !== null) ? m - signalLine[i] : null
        }));
    }

    getMetadata() {
        return {
            name: 'MACD',
            description: 'Moving Average Convergence Divergence - trend & momentum',
            inputs: [
                { name: 'fast', label: 'Fast Period', type: 'number', default: 12 },
                { name: 'slow', label: 'Slow Period', type: 'number', default: 26 },
                { name: 'signal', label: 'Signal Period', type: 'number', default: 9 }
            ],
            plots: [
                { id: 'macd', color: '#2196f3', title: 'MACD' },
                { id: 'signal', color: '#ff5722', title: 'Signal' },
                { id: 'histogram', color: '#607d8b', title: 'Histogram', type: 'histogram' }
            ],
            displayMode: 'pane'
        };
    }
}
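The signal line above is not a plain EMA: the first `signal` MACD values are averaged as an SMA seed, and only then does the standard EMA update take over. A standalone sketch of that seeding, on a toy series (values made up):

```javascript
// SMA-seeded EMA, mirroring the signal-line loop in macd.js:
// accumulate the first `period` non-null values, seed with their average,
// then apply the usual EMA recurrence.
function emaWithSmaSeed(values, period) {
    const out = new Array(values.length).fill(null);
    const multiplier = 2 / (period + 1);
    let ema = 0;
    let sum = 0;
    let count = 0;
    for (let i = 0; i < values.length; i++) {
        if (values[i] === null) continue;
        count++;
        if (count < period) {
            sum += values[i];
        } else if (count === period) {
            sum += values[i];
            ema = sum / period;  // SMA seed
            out[i] = ema;
        } else {
            ema = (values[i] - ema) * multiplier + ema;  // EMA update
            out[i] = ema;
        }
    }
    return out;
}

const seeded = emaWithSmaSeed([null, 1, 2, 3, 4], 3);
// seeded[3] is the SMA seed (1+2+3)/3 = 2; seeded[4] = (4-2)*0.5 + 2 = 3
```

Seeding with an SMA avoids the warm-up bias of starting the EMA from zero, and skipping `null` entries matches how the MACD line itself is null until both EMAs exist.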
69 src/api/dashboard/static/js/indicators/rsi.js Normal file
@@ -0,0 +1,69 @@
import { BaseIndicator } from './base.js';

export class RSIIndicator extends BaseIndicator {
    calculate(candles) {
        const period = this.params.period || 14;

        // 1. Calculate RSI using RMA (Wilder's Smoothing)
        let rsiValues = new Array(candles.length).fill(null);
        let upSum = 0;
        let downSum = 0;
        const rmaAlpha = 1 / period;

        for (let i = 1; i < candles.length; i++) {
            const diff = candles[i].close - candles[i - 1].close;
            const up = diff > 0 ? diff : 0;
            const down = diff < 0 ? -diff : 0;

            if (i < period) {
                upSum += up;
                downSum += down;
            } else if (i === period) {
                upSum += up;
                downSum += down;
                const avgUp = upSum / period;
                const avgDown = downSum / period;
                rsiValues[i] = avgDown === 0 ? 100 : (avgUp === 0 ? 0 : 100 - (100 / (1 + avgUp / avgDown)));
                upSum = avgUp; // Store for next RMA step
                downSum = avgDown;
            } else {
                upSum = (up - upSum) * rmaAlpha + upSum;
                downSum = (down - downSum) * rmaAlpha + downSum;
                rsiValues[i] = downSum === 0 ? 100 : (upSum === 0 ? 0 : 100 - (100 / (1 + upSum / downSum)));
            }
        }

        // Combine results
        return rsiValues.map((rsi, i) => {
            return {
                paneBg: 80, // Background lightening trick
                rsi: rsi,
                upperBand: 70,
                lowerBand: 30
            };
        });
    }

    getMetadata() {
        const plots = [
            // RSI Line
            { id: 'rsi', color: '#7E57C2', title: '', width: 1, lastValueVisible: true },

            // Bands
            { id: 'upperBand', color: '#787B86', title: '', style: 'dashed', width: 1, lastValueVisible: false },
            { id: 'lowerBand', color: '#787B86', title: '', style: 'dashed', width: 1, lastValueVisible: false }
        ];

        return {
            name: 'RSI',
            description: 'Relative Strength Index',
            inputs: [
                { name: 'period', label: 'RSI Length', type: 'number', default: 14, min: 1, max: 100 }
            ],
            plots: plots,
            displayMode: 'pane',
            paneMin: 0,
            paneMax: 100
        };
    }
}
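The core of the RSI above is Wilder's smoothing (RMA) of the average gain and loss. A standalone sketch of that recurrence, stripped of the pane/band plumbing (toy closes, illustration only):

```javascript
// Wilder/RMA-smoothed RSI, mirroring the loop in rsi.js.
function wilderRsi(closes, period) {
    const out = new Array(closes.length).fill(null);
    const alpha = 1 / period;
    let upSum = 0;
    let downSum = 0;
    for (let i = 1; i < closes.length; i++) {
        const diff = closes[i] - closes[i - 1];
        const up = diff > 0 ? diff : 0;
        const down = diff < 0 ? -diff : 0;
        if (i < period) {
            upSum += up;
            downSum += down;
        } else if (i === period) {
            upSum = (upSum + up) / period;     // initial average gain
            downSum = (downSum + down) / period; // initial average loss
            out[i] = downSum === 0 ? 100 : 100 - 100 / (1 + upSum / downSum);
        } else {
            upSum = (up - upSum) * alpha + upSum;       // RMA update
            downSum = (down - downSum) * alpha + downSum;
            out[i] = downSum === 0 ? 100 : 100 - 100 / (1 + upSum / downSum);
        }
    }
    return out;
}

// A strictly rising series never loses, so RSI pins at 100.
const rising = wilderRsi([1, 2, 3, 4, 5, 6], 3);
```

The all-gains case exercises the `avgDown === 0` guard: with zero average loss, relative strength is infinite and the RSI saturates at 100.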
18 src/api/dashboard/static/js/indicators/sma.js Normal file
@@ -0,0 +1,18 @@
import { MA } from './ma.js';
import { BaseIndicator } from './base.js';

export class SMAIndicator extends BaseIndicator {
    calculate(candles) {
        const period = this.params.period || 44;
        return MA.sma(candles, period, 'close');
    }

    getMetadata() {
        return {
            name: 'SMA',
            inputs: [{ name: 'period', label: 'Period', type: 'number', default: 44, min: 1, max: 500 }],
            plots: [{ id: 'value', color: '#2962ff', title: 'SMA' }],
            displayMode: 'overlay'
        };
    }
}
48 src/api/dashboard/static/js/indicators/stoch.js Normal file
@@ -0,0 +1,48 @@
import { BaseIndicator } from './base.js';

export class StochasticIndicator extends BaseIndicator {
    calculate(candles) {
        const kPeriod = this.params.kPeriod || 14;
        const dPeriod = this.params.dPeriod || 3;
        const results = new Array(candles.length).fill(null);

        const kValues = new Array(candles.length).fill(null);

        for (let i = kPeriod - 1; i < candles.length; i++) {
            let lowest = Infinity;
            let highest = -Infinity;
            for (let j = 0; j < kPeriod; j++) {
                lowest = Math.min(lowest, candles[i - j].low);
                highest = Math.max(highest, candles[i - j].high);
            }
            const diff = highest - lowest;
            kValues[i] = diff === 0 ? 50 : ((candles[i].close - lowest) / diff) * 100;
        }

        for (let i = kPeriod + dPeriod - 2; i < candles.length; i++) {
            let sum = 0;
            for (let j = 0; j < dPeriod; j++) sum += kValues[i - j];
            results[i] = { k: kValues[i], d: sum / dPeriod };
        }

        return results;
    }

    getMetadata() {
        return {
            name: 'Stochastic',
            description: 'Stochastic Oscillator - compares close to high-low range',
            inputs: [
                { name: 'kPeriod', label: 'K Period', type: 'number', default: 14 },
                { name: 'dPeriod', label: 'D Period', type: 'number', default: 3 }
            ],
            plots: [
                { id: 'k', color: '#3f51b5', title: '%K' },
                { id: 'd', color: '#ff9800', title: '%D' }
            ],
            displayMode: 'pane',
            paneMin: 0,
            paneMax: 100
        };
    }
}
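The %K formula above places the close inside the highest-high/lowest-low range of the lookback window. A standalone sketch of that single-bar calculation (sample candles made up for illustration):

```javascript
// %K for candle i over a kPeriod window, as in stoch.js:
// 100 * (close - lowest low) / (highest high - lowest low),
// with a flat-range fallback of 50.
function stochK(candles, kPeriod, i) {
    let lowest = Infinity;
    let highest = -Infinity;
    for (let j = 0; j < kPeriod; j++) {
        lowest = Math.min(lowest, candles[i - j].low);
        highest = Math.max(highest, candles[i - j].high);
    }
    const diff = highest - lowest;
    return diff === 0 ? 50 : ((candles[i].close - lowest) / diff) * 100;
}

// Close at 15 inside a 10..20 window range gives %K = 50.
const k = stochK([
    { high: 20, low: 10, close: 12 },
    { high: 18, low: 11, close: 15 },
], 2, 1);
```

%D is then just a `dPeriod` SMA of %K, which is why the second loop in `calculate` starts at `kPeriod + dPeriod - 2`, the first index with a full window of valid %K values.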
5 src/api/dashboard/static/js/strategies/config.js Normal file
@@ -0,0 +1,5 @@
export const StrategyParams = {
    ma_trend: [
        { name: 'period', label: 'MA Period', type: 'number', default: 44, min: 5, max: 500 }
    ]
};
167 src/api/dashboard/static/js/strategies/engine.js Normal file
@@ -0,0 +1,167 @@
import { IndicatorRegistry } from '../indicators/index.js';
import { RiskManager } from './risk-manager.js';

export class ClientStrategyEngine {
    constructor() {
        this.indicatorTypes = IndicatorRegistry;
    }

    run(candlesMap, strategyConfig, riskConfig, simulationStart) {
        const primaryTF = strategyConfig.timeframes?.primary || '1d';
        const candles = candlesMap[primaryTF];
        if (!candles) return { error: `No candles for primary timeframe ${primaryTF}` };

        const indicatorResults = {};
        console.log('Calculating indicators for timeframes:', Object.keys(candlesMap));
        for (const tf in candlesMap) {
            indicatorResults[tf] = {};
            const tfCandles = candlesMap[tf];
            const tfIndicators = (strategyConfig.indicators || []).filter(ind => (ind.timeframe || primaryTF) === tf);

            console.log(`  TF ${tf}: ${tfIndicators.length} indicators to calculate`);

            for (const ind of tfIndicators) {
                const IndicatorClass = this.indicatorTypes[ind.type];
                if (IndicatorClass) {
                    const instance = new IndicatorClass(ind);
                    indicatorResults[tf][ind.name] = instance.calculate(tfCandles);
                    const validValues = indicatorResults[tf][ind.name].filter(v => v !== null).length;
                    console.log(`    Calculated ${ind.name} on ${tf}: ${validValues} valid values`);
                }
            }
        }

        const risk = new RiskManager(riskConfig);
        const trades = [];
        let position = null;
        const startTimeSec = Math.floor(new Date(simulationStart).getTime() / 1000);
        console.log('Simulation start (seconds):', startTimeSec, 'Date:', simulationStart);
        console.log('Total candles available:', candles.length);
        console.log('First candle time:', candles[0].time, 'Last candle time:', candles[candles.length - 1].time);

        const pointers = {};
        for (const tf in candlesMap) pointers[tf] = 0;

        let processedCandles = 0;

        for (let i = 1; i < candles.length; i++) {
            const time = candles[i].time;
            const price = candles[i].close;

            if (time < startTimeSec) {
                for (const tf in candlesMap) {
                    while (pointers[tf] < candlesMap[tf].length - 1 &&
                           candlesMap[tf][pointers[tf] + 1].time <= time) {
                        pointers[tf]++;
                    }
                }
                continue;
            }

            processedCandles++;

            for (const tf in candlesMap) {
                while (pointers[tf] < candlesMap[tf].length - 1 &&
                       candlesMap[tf][pointers[tf] + 1].time <= time) {
                    pointers[tf]++;
                }
            }

            const signal = this.evaluate(i, pointers, candles, candlesMap, indicatorResults, strategyConfig, position);

            if (signal === 'BUY' && !position) {
                const size = risk.calculateSize(price);
                position = { type: 'long', entryPrice: price, entryTime: candles[i].time, size };
            } else if (signal === 'SELL' && position) {
                const pnl = (price - position.entryPrice) * position.size;
                trades.push({ ...position, exitPrice: price, exitTime: candles[i].time, pnl, pnlPct: (pnl / (position.entryPrice * position.size)) * 100 });
                risk.balance += pnl;
                position = null;
            }
        }

        console.log(`Simulation complete: ${processedCandles} candles processed after start date, ${trades.length} trades`);

        return {
            total_trades: trades.length,
            win_rate: (trades.filter(t => t.pnl > 0).length / (trades.length || 1)) * 100,
            total_pnl: risk.balance - 1000, // matches the RiskManager default initial balance
            trades
        };
    }

    evaluate(index, pointers, candles, candlesMap, indicatorResults, config, position) {
        const primaryTF = config.timeframes?.primary || '1d';

        const getVal = (indName, tf) => {
            const tfValues = indicatorResults[tf]?.[indName];
            if (!tfValues) return null;
            return tfValues[pointers[tf]];
        };

        const getPrice = (tf) => {
            const tfCandles = candlesMap[tf];
            if (!tfCandles) return null;
            return tfCandles[pointers[tf]].close;
        };

        if (config.id === 'ma_trend') {
            const period = config.params?.period || 44;

            if (index === 1) {
                console.log('First candle time:', candles[index].time, 'Date:', new Date(candles[index].time * 1000));
                console.log(`MA${period} value:`, getVal(`ma${period}`, primaryTF));
            }
            const maValue = getVal(`ma${period}`, primaryTF);
            const price = candles[index].close;

            const secondaryTF = config.timeframes?.secondary?.[0];
            let secondaryBullish = true;
            let secondaryBearish = true;
            if (secondaryTF) {
                const secondaryPrice = getPrice(secondaryTF);
                const secondaryMA = getVal(`ma${period}_${secondaryTF}`, secondaryTF);
                if (secondaryPrice !== null && secondaryMA !== null) {
                    secondaryBullish = secondaryPrice > secondaryMA;
                    secondaryBearish = secondaryPrice < secondaryMA;
                }
                if (index === 1) {
                    console.log(`Trend check: ${secondaryTF} price=${secondaryPrice}, MA=${secondaryMA}, bullish=${secondaryBullish}, bearish=${secondaryBearish}`);
                }
            }

            if (maValue) {
                if (price > maValue && secondaryBullish) return 'BUY';
                if (price < maValue && secondaryBearish) return 'SELL';
            }
        }

        const evaluateConditions = (conds) => {
            if (!conds || !conds.conditions) return false;
            const results = conds.conditions.map(c => {
                const targetTF = c.timeframe || primaryTF;
                const leftVal = c.indicator === 'price' ? getPrice(targetTF) : getVal(c.indicator, targetTF);
                const rightVal = typeof c.value === 'number' ? c.value : (c.value === 'price' ? getPrice(targetTF) : getVal(c.value, targetTF));

                if (leftVal === null || rightVal === null) return false;

                switch (c.operator) {
                    case '>': return leftVal > rightVal;
                    case '<': return leftVal < rightVal;
                    case '>=': return leftVal >= rightVal;
                    case '<=': return leftVal <= rightVal;
                    case '==': return leftVal == rightVal;
                    default: return false;
                }
            });

            if (conds.logic === 'OR') return results.some(r => r);
            return results.every(r => r);
        };

        if (evaluateConditions(config.entryLong)) return 'BUY';
        if (evaluateConditions(config.exitLong)) return 'SELL';

        return 'HOLD';
    }
}
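The generic `entryLong`/`exitLong` path in `evaluate` combines per-condition comparisons with AND by default, or OR when `logic: 'OR'` is set. A standalone sketch of that combinator, where `resolve` is a stand-in for `getVal`/`getPrice` and the indicator names and values are hypothetical:

```javascript
// Condition combinator mirroring evaluateConditions() in engine.js,
// with indicator lookup abstracted into a resolve(name) callback.
function evaluateConditions(conds, resolve) {
    if (!conds || !conds.conditions) return false;
    const results = conds.conditions.map(c => {
        const leftVal = resolve(c.indicator);
        const rightVal = typeof c.value === 'number' ? c.value : resolve(c.value);
        if (leftVal === null || rightVal === null) return false;
        switch (c.operator) {
            case '>': return leftVal > rightVal;
            case '<': return leftVal < rightVal;
            default: return false;
        }
    });
    if (conds.logic === 'OR') return results.some(r => r);
    return results.every(r => r); // AND is the default
}

// Hypothetical resolved values for one bar.
const values = { price: 105, ma44: 100, rsi14: 75 };
const resolve = name => (name in values ? values[name] : null);

const conditions = [
    { indicator: 'price', operator: '>', value: 'ma44' }, // true
    { indicator: 'rsi14', operator: '<', value: 70 },     // false
];

const andResult = evaluateConditions({ conditions }, resolve);              // AND → false
const orResult = evaluateConditions({ logic: 'OR', conditions }, resolve);  // OR → true
```

Treating an unresolvable operand as `false` (rather than throwing) means a strategy referencing a missing indicator simply never fires, which matches the engine's behavior.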
3 src/api/dashboard/static/js/strategies/index.js Normal file
@@ -0,0 +1,3 @@
export { StrategyParams } from './config.js';
export { RiskManager } from './risk-manager.js';
export { ClientStrategyEngine } from './engine.js';
17 src/api/dashboard/static/js/strategies/risk-manager.js Normal file
@@ -0,0 +1,17 @@
export class RiskManager {
    constructor(config, initialBalance = 1000) {
        this.config = config || {
            positionSizing: { method: 'percent', value: 0.1 },
            stopLoss: { enabled: true, method: 'percent', value: 0.02 },
            takeProfit: { enabled: true, method: 'percent', value: 0.04 }
        };
        this.balance = initialBalance;
        this.equity = initialBalance;
    }

    calculateSize(price) {
        if (this.config.positionSizing.method === 'percent') {
            return (this.balance * this.config.positionSizing.value) / price;
        }
        return this.config.positionSizing.value / price;
    }
}
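A worked example of the percent sizing above: committing 10% of a 1000 balance at a price of 50 buys 2 units. This is a standalone copy for illustration only:

```javascript
// Standalone copy of RiskManager's percent position sizing.
class RiskManagerSketch {
    constructor(config, initialBalance = 1000) {
        this.config = config;
        this.balance = initialBalance;
    }
    calculateSize(price) {
        if (this.config.positionSizing.method === 'percent') {
            // units = (balance * fraction) / price
            return (this.balance * this.config.positionSizing.value) / price;
        }
        return this.config.positionSizing.value / price; // fixed notional
    }
}

const rm = new RiskManagerSketch({ positionSizing: { method: 'percent', value: 0.1 } });
const size = rm.calculateSize(50);
// size = (1000 * 0.1) / 50 = 2
```

Because sizing reads `this.balance`, the engine's `risk.balance += pnl` after each closed trade makes subsequent positions compound on realized profit and loss.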
604 src/api/dashboard/static/js/ui/chart.js Normal file
@@ -0,0 +1,604 @@
import { INTERVALS, COLORS } from '../core/index.js';

export class TradingDashboard {
    constructor() {
        this.chart = null;
        this.candleSeries = null;
        this.currentInterval = '1d';
        this.intervals = INTERVALS;
        this.allData = new Map();
        this.isLoading = false;
        this.hasInitialLoad = false;
        this.taData = null;

        this.init();
    }

    init() {
        this.createTimeframeButtons();
        this.initChart();
        this.initEventListeners();
        this.loadInitialData();

        setInterval(() => {
            this.loadNewData();
            this.loadStats();
            if (new Date().getSeconds() < 15) this.loadTA();
        }, 10000);
    }

    isAtRightEdge() {
        const timeScale = this.chart.timeScale();
        const visibleRange = timeScale.getVisibleLogicalRange();
        if (!visibleRange) return true;

        const data = this.candleSeries.data();
        if (!data || data.length === 0) return true;

        return visibleRange.to >= data.length - 5;
    }

    createTimeframeButtons() {
        const container = document.getElementById('timeframeContainer');
        container.innerHTML = '';
        this.intervals.forEach(interval => {
            const btn = document.createElement('button');
            btn.className = 'timeframe-btn';
            btn.dataset.interval = interval;
            btn.textContent = interval;
            if (interval === this.currentInterval) {
                btn.classList.add('active');
            }
            btn.addEventListener('click', () => this.switchTimeframe(interval));
            container.appendChild(btn);
        });
    }

    initChart() {
        const chartContainer = document.getElementById('chart');

        this.chart = LightweightCharts.createChart(chartContainer, {
            layout: {
                background: { color: COLORS.tvBg },
                textColor: COLORS.tvText,
                panes: {
                    background: { color: '#1e222d' },
                    separatorColor: '#2a2e39',
                    separatorHoverColor: '#363c4e',
                    enableResize: true
                }
            },
            grid: {
                vertLines: { color: '#363d4e' },
                horzLines: { color: '#363d4e' },
            },
            rightPriceScale: {
                borderColor: '#363d4e',
                autoScale: true,
            },
            timeScale: {
                borderColor: '#363d4e',
                timeVisible: true,
                secondsVisible: false,
                rightOffset: 12,
                barSpacing: 10,
            },
            handleScroll: {
                vertTouchDrag: false,
            },
        });

        this.candleSeries = this.chart.addSeries(LightweightCharts.CandlestickSeries, {
            upColor: '#ff9800',
            downColor: '#ff9800',
            borderUpColor: '#ff9800',
            borderDownColor: '#ff9800',
            wickUpColor: '#ff9800',
            wickDownColor: '#ff9800',
            lastValueVisible: false,
            priceLineVisible: false,
        }, 0);

        this.currentPriceLine = this.candleSeries.createPriceLine({
            price: 0,
            color: '#26a69a',
            lineWidth: 1,
            lineStyle: LightweightCharts.LineStyle.Dotted,
            axisLabelVisible: true,
            title: '',
        });

        this.initPriceScaleControls();
        this.initNavigationControls();

        this.chart.timeScale().subscribeVisibleLogicalRangeChange(this.onVisibleRangeChange.bind(this));

        window.addEventListener('resize', () => {
            this.chart.applyOptions({
                width: chartContainer.clientWidth,
                height: chartContainer.clientHeight,
            });
        });

        document.addEventListener('visibilitychange', () => {
            if (document.visibilityState === 'visible') {
                this.loadNewData();
                this.loadTA();
            }
        });
        window.addEventListener('focus', () => {
            this.loadNewData();
            this.loadTA();
        });
    }

    initPriceScaleControls() {
        const btnAutoScale = document.getElementById('btnAutoScale');
        const btnLogScale = document.getElementById('btnLogScale');

        if (!btnAutoScale || !btnLogScale) return;

        this.priceScaleState = {
            autoScale: true,
            logScale: false
        };

        btnAutoScale.addEventListener('click', () => {
            this.priceScaleState.autoScale = !this.priceScaleState.autoScale;
            btnAutoScale.classList.toggle('active', this.priceScaleState.autoScale);

            this.candleSeries.priceScale().applyOptions({
                autoScale: this.priceScaleState.autoScale
            });

            console.log('Auto Scale:', this.priceScaleState.autoScale ? 'ON' : 'OFF');
        });

        btnLogScale.addEventListener('click', () => {
            this.priceScaleState.logScale = !this.priceScaleState.logScale;
            btnLogScale.classList.toggle('active', this.priceScaleState.logScale);

            let currentPriceRange = null;
            let currentTimeRange = null;
            if (!this.priceScaleState.autoScale) {
                try {
                    currentPriceRange = this.candleSeries.priceScale().getVisiblePriceRange();
                } catch (e) {
                    console.log('Could not get price range');
                }
            }
            try {
                currentTimeRange = this.chart.timeScale().getVisibleLogicalRange();
            } catch (e) {
                console.log('Could not get time range');
            }

            this.candleSeries.priceScale().applyOptions({
                mode: this.priceScaleState.logScale ? LightweightCharts.PriceScaleMode.Logarithmic : LightweightCharts.PriceScaleMode.Normal
            });

            this.chart.applyOptions({});

            setTimeout(() => {
                if (currentTimeRange) {
                    try {
                        this.chart.timeScale().setVisibleLogicalRange(currentTimeRange);
                    } catch (e) {
                        console.log('Could not restore time range');
                    }
                }

                if (!this.priceScaleState.autoScale && currentPriceRange) {
                    try {
                        this.candleSeries.priceScale().setVisiblePriceRange(currentPriceRange);
                    } catch (e) {
                        console.log('Could not restore price range');
                    }
                }
            }, 100);

            console.log('Log Scale:', this.priceScaleState.logScale ? 'ON' : 'OFF');
        });

        document.addEventListener('keydown', (e) => {
            if (e.key === 'a' || e.key === 'A') {
                if (e.target.tagName !== 'INPUT') {
                    btnAutoScale.click();
                }
            }
        });
    }

    initNavigationControls() {
        const chartWrapper = document.getElementById('chartWrapper');
        const navLeft = document.getElementById('navLeft');
        const navRight = document.getElementById('navRight');
        const navRecent = document.getElementById('navRecent');

        if (!chartWrapper || !navLeft || !navRight || !navRecent) return;

        chartWrapper.addEventListener('mousemove', (e) => {
            const rect = chartWrapper.getBoundingClientRect();
            const distanceFromBottom = rect.bottom - e.clientY;
            chartWrapper.classList.toggle('show-nav', distanceFromBottom < 30);
        });

        chartWrapper.addEventListener('mouseleave', () => {
            chartWrapper.classList.remove('show-nav');
        });

        navLeft.addEventListener('click', () => this.navigateLeft());
        navRight.addEventListener('click', () => this.navigateRight());
        navRecent.addEventListener('click', () => this.navigateToRecent());
    }

    navigateLeft() {
        const visibleRange = this.chart.timeScale().getVisibleLogicalRange();
        if (!visibleRange) return;

        const visibleBars = visibleRange.to - visibleRange.from;
        const shift = visibleBars * 0.8;
        const newFrom = visibleRange.from - shift;
        const newTo = visibleRange.to - shift;

        this.chart.timeScale().setVisibleLogicalRange({ from: newFrom, to: newTo });
    }

    navigateRight() {
        const visibleRange = this.chart.timeScale().getVisibleLogicalRange();
        if (!visibleRange) return;

        const visibleBars = visibleRange.to - visibleRange.from;
        const shift = visibleBars * 0.8;
        const newFrom = visibleRange.from + shift;
        const newTo = visibleRange.to + shift;

        this.chart.timeScale().setVisibleLogicalRange({ from: newFrom, to: newTo });
    }

    navigateToRecent() {
        this.chart.timeScale().scrollToRealTime();
    }

    initEventListeners() {
        document.addEventListener('keydown', (e) => {
            if (e.target.tagName === 'INPUT' || e.target.tagName === 'BUTTON') return;

            const shortcuts = {
                '1': '1m', '2': '3m', '3': '5m', '4': '15m', '5': '30m', '7': '37m',
                '6': '1h', '8': '4h', '9': '8h', '0': '12h',
                'd': '1d', 'D': '1d', 'w': '1w', 'W': '1w', 'm': '1M', 'M': '1M'
            };

            if (shortcuts[e.key]) {
                this.switchTimeframe(shortcuts[e.key]);
            }

            if (e.key === 'ArrowLeft') {
                this.navigateLeft();
            } else if (e.key === 'ArrowRight') {
                this.navigateRight();
            } else if (e.key === 'ArrowUp') {
                this.navigateToRecent();
            }
        });
    }

    async loadInitialData() {
        await Promise.all([
            this.loadData(1000, true),
            this.loadStats()
        ]);
        this.hasInitialLoad = true;
    }

    async loadData(limit = 1000, fitToContent = false) {
        if (this.isLoading) return;
        this.isLoading = true;

        try {
            const visibleRange = this.chart.timeScale().getVisibleLogicalRange();

            const response = await fetch(`/api/v1/candles?symbol=BTC&interval=${this.currentInterval}&limit=${limit}`);
            const data = await response.json();

            if (data.candles && data.candles.length > 0) {
                const chartData = data.candles.reverse().map(c => ({
                    time: Math.floor(new Date(c.time).getTime() / 1000),
                    open: parseFloat(c.open),
                    high: parseFloat(c.high),
                    low: parseFloat(c.low),
                    close: parseFloat(c.close),
                    volume: parseFloat(c.volume || 0)
                }));

                const existingData = this.allData.get(this.currentInterval) || [];
                const mergedData = this.mergeData(existingData, chartData);
                this.allData.set(this.currentInterval, mergedData);

                this.candleSeries.setData(mergedData);

                if (fitToContent) {
                    this.chart.timeScale().scrollToRealTime();
                } else if (visibleRange) {
                    this.chart.timeScale().setVisibleLogicalRange(visibleRange);
                }

                const latest = mergedData[mergedData.length - 1];
                this.updateStats(latest);
            }
        } catch (error) {
            console.error('Error loading data:', error);
        } finally {
            this.isLoading = false;
        }
    }

    async loadNewData() {
        if (!this.hasInitialLoad || this.isLoading) return;

        try {
            const response = await fetch(`/api/v1/candles?symbol=BTC&interval=${this.currentInterval}&limit=50`);
            const data = await response.json();

            if (data.candles && data.candles.length > 0) {
                const atEdge = this.isAtRightEdge();

                const currentSeriesData = this.candleSeries.data();
                const lastTimestamp = currentSeriesData.length > 0
                    ? currentSeriesData[currentSeriesData.length - 1].time
                    : 0;

                const chartData = data.candles.reverse().map(c => ({
                    time: Math.floor(new Date(c.time).getTime() / 1000),
                    open: parseFloat(c.open),
                    high: parseFloat(c.high),
                    low: parseFloat(c.low),
                    close: parseFloat(c.close),
                    volume: parseFloat(c.volume || 0)
                }));

                chartData.forEach(candle => {
                    if (candle.time >= lastTimestamp) {
                        this.candleSeries.update(candle);
                    }
                });

                const existingData = this.allData.get(this.currentInterval) || [];
                this.allData.set(this.currentInterval, this.mergeData(existingData, chartData));

                if (atEdge) {
                    this.chart.timeScale().scrollToRealTime();
                }

                const latest = chartData[chartData.length - 1];
                this.updateStats(latest);
            }
        } catch (error) {
            console.error('Error loading new data:', error);
        }
    }

    mergeData(existing, newData) {
        const dataMap = new Map();
        existing.forEach(c => dataMap.set(c.time, c));
        newData.forEach(c => dataMap.set(c.time, c));
        return Array.from(dataMap.values()).sort((a, b) => a.time - b.time);
    }

    onVisibleRangeChange() {
        if (!this.hasInitialLoad || this.isLoading) {
            return;
        }

        const visibleRange = this.chart.timeScale().getVisibleLogicalRange();
        if (!visibleRange) {
            return;
        }

        const data = this.candleSeries.data();
        if (!data || data.length === 0) {
            return;
        }

        const visibleBars = Math.ceil(visibleRange.to - visibleRange.from);
        const bufferSize = visibleBars * 2;
        const refillThreshold = bufferSize * 0.8;
        const barsFromLeft = Math.floor(visibleRange.from);

        if (barsFromLeft < refillThreshold) {
            console.log(`Buffer low (${barsFromLeft} < ${refillThreshold.toFixed(0)}), silently prefetching ${bufferSize} candles...`);
            const oldestCandle = data[0];
            if (oldestCandle) {
                this.loadHistoricalData(oldestCandle.time, bufferSize);
            }
        }
    }

    async loadHistoricalData(beforeTime, limit = 1000) {
        if (this.isLoading) {
            return;
        }
        this.isLoading = true;

        try {
            const endTime = new Date((beforeTime - 1) * 1000);

            const response = await fetch(
                `/api/v1/candles?symbol=BTC&interval=${this.currentInterval}&end=${endTime.toISOString()}&limit=${limit}`
            );

            if (!response.ok) {
                throw new Error(`HTTP error! status: ${response.status}`);
            }

            const data = await response.json();

            if (data.candles && data.candles.length > 0) {
                const chartData = data.candles.reverse().map(c => ({
                    time: Math.floor(new Date(c.time).getTime() / 1000),
                    open: parseFloat(c.open),
                    high: parseFloat(c.high),
                    low: parseFloat(c.low),
                    close: parseFloat(c.close),
                    volume: parseFloat(c.volume || 0)
                }));

                const existingData = this.allData.get(this.currentInterval) || [];
                const mergedData = this.mergeData(existingData, chartData);
                this.allData.set(this.currentInterval, mergedData);

                this.candleSeries.setData(mergedData);

                // Recalculate indicators with the expanded dataset
                window.drawIndicatorsOnChart?.();

                console.log(`Prefetched ${chartData.length} candles, total: ${mergedData.length}`);
            } else {
                console.log('No more historical data available');
            }
        } catch (error) {
            console.error('Error loading historical data:', error);
        } finally {
            this.isLoading = false;
        }
    }

    async loadTA() {
        try {
            const response = await fetch(`/api/v1/ta?symbol=BTC&interval=${this.currentInterval}`);
            this.taData = await response.json();
            this.renderTA();
        } catch (error) {
            console.error('Error loading TA:', error);
            document.getElementById('taContent').innerHTML = '<div class="ta-error">Failed to load technical analysis</div>';
        }
    }

    renderTA() {
        if (!this.taData || this.taData.error) {
            document.getElementById('taContent').innerHTML = `<div class="ta-error">${this.taData?.error || 'No data available'}</div>`;
            return;
        }

        const data = this.taData;
        const trendClass = data.trend.direction.toLowerCase();
        const signalClass = data.trend.signal.toLowerCase();

        const ma44Change = data.moving_averages.price_vs_ma44;
        const ma125Change = data.moving_averages.price_vs_ma125;

        document.getElementById('taInterval').textContent = this.currentInterval.toUpperCase();
        document.getElementById('taLastUpdate').textContent = new Date().toLocaleTimeString();

        document.getElementById('taContent').innerHTML = `
            <div class="ta-section">
                <div class="ta-section-title">Trend Analysis</div>
                <div class="ta-trend ${trendClass}">
                    ${data.trend.direction} ${trendClass === 'bullish' ? '↑' : trendClass === 'bearish' ? '↓' : '→'}
                </div>
                <div class="ta-strength">${data.trend.strength}</div>
                <span class="ta-signal ${signalClass}">${data.trend.signal}</span>
            </div>

            <div class="ta-section">
                <div class="ta-section-title">Moving Averages</div>
                <div class="ta-ma-row">
                    <span class="ta-ma-label">MA 44</span>
                    <span class="ta-ma-value">
                        ${data.moving_averages.ma_44 ? data.moving_averages.ma_44.toFixed(2) : 'N/A'}
                        ${ma44Change !== null ? `<span class="ta-ma-change ${ma44Change >= 0 ? 'positive' : 'negative'}">${ma44Change >= 0 ? '+' : ''}${ma44Change.toFixed(1)}%</span>` : ''}
                    </span>
                </div>
                <div class="ta-ma-row">
                    <span class="ta-ma-label">MA 125</span>
                    <span class="ta-ma-value">
                        ${data.moving_averages.ma_125 ? data.moving_averages.ma_125.toFixed(2) : 'N/A'}
                        ${ma125Change !== null ? `<span class="ta-ma-change ${ma125Change >= 0 ? 'positive' : 'negative'}">${ma125Change >= 0 ? '+' : ''}${ma125Change.toFixed(1)}%</span>` : ''}
                    </span>
                </div>
            </div>

            <div class="ta-section">
                <div class="ta-section-title">Indicators</div>
                <div id="indicatorList" class="indicator-list"></div>
            </div>

            <div class="ta-section" id="indicatorConfigPanel">
                <div class="ta-section-title">Configuration</div>
                <div id="configForm" style="margin-top: 8px;"></div>
                <div style="display: flex; gap: 8px; margin-top: 12px;" id="configButtons">
                    <button class="ta-btn" onclick="applyIndicatorConfig()" style="flex: 1; font-size: 11px; background: var(--tv-blue); color: white; border: none;">Apply</button>
                    <button class="ta-btn" onclick="removeIndicator()" style="flex: 1; font-size: 11px; border-color: var(--tv-red); color: var(--tv-red);">Remove</button>
                </div>
            </div>
        `;

        window.renderIndicatorList?.();
    }

    async loadStats() {
        try {
            const response = await fetch('/api/v1/stats?symbol=BTC');
            this.statsData = await response.json();
        } catch (error) {
            console.error('Error loading stats:', error);
        }
    }

    updateStats(candle) {
        const price = candle.close;
        const isUp = candle.close >= candle.open;

        if (this.currentPriceLine) {
            this.currentPriceLine.applyOptions({
                price: price,
                color: isUp ? '#26a69a' : '#ef5350',
|
||||
});
|
||||
}
|
||||
|
||||
document.getElementById('currentPrice').textContent = price.toFixed(2);
|
||||
|
||||
if (this.statsData) {
|
||||
const change = this.statsData.change_24h;
|
||||
document.getElementById('currentPrice').className = 'stat-value ' + (change >= 0 ? 'positive' : 'negative');
|
||||
document.getElementById('priceChange').textContent = (change >= 0 ? '+' : '') + change.toFixed(2) + '%';
|
||||
document.getElementById('priceChange').className = 'stat-value ' + (change >= 0 ? 'positive' : 'negative');
|
||||
document.getElementById('dailyHigh').textContent = this.statsData.high_24h.toFixed(2);
|
||||
document.getElementById('dailyLow').textContent = this.statsData.low_24h.toFixed(2);
|
||||
}
|
||||
}
|
||||
|
||||
switchTimeframe(interval) {
|
||||
if (!this.intervals.includes(interval) || interval === this.currentInterval) return;
|
||||
|
||||
this.currentInterval = interval;
|
||||
this.hasInitialLoad = false;
|
||||
|
||||
document.querySelectorAll('.timeframe-btn').forEach(btn => {
|
||||
btn.classList.toggle('active', btn.dataset.interval === interval);
|
||||
});
|
||||
|
||||
this.allData.delete(interval);
|
||||
this.loadInitialData();
|
||||
this.loadTA();
|
||||
|
||||
window.clearSimulationResults?.();
|
||||
window.updateTimeframeDisplay?.();
|
||||
}
|
||||
}
|
||||
|
||||
export function refreshTA() {
|
||||
if (window.dashboard) {
|
||||
window.dashboard.loadTA();
|
||||
}
|
||||
}
|
||||
|
||||
export function openAIAnalysis() {
|
||||
const symbol = 'BTC';
|
||||
const interval = window.dashboard?.currentInterval || '1d';
|
||||
const prompt = `Analyze Bitcoin (${symbol}) ${interval} chart. Current trend, support/resistance levels, and trading recommendation. Technical indicators: MA44, MA125.`;
|
||||
|
||||
const geminiUrl = `https://gemini.google.com/app?prompt=${encodeURIComponent(prompt)}`;
|
||||
window.open(geminiUrl, '_blank');
|
||||
}
|
||||
140
src/api/dashboard/static/js/ui/export.js
Normal file
@ -0,0 +1,140 @@
import { downloadFile } from '../utils/index.js';

export function showExportDialog() {
    if (!window.lastSimulationResults) {
        alert('Please run a simulation first');
        return;
    }

    const overlay = document.createElement('div');
    overlay.className = 'dialog-overlay';
    overlay.onclick = () => closeExportDialog();
    document.body.appendChild(overlay);

    const dialog = document.createElement('div');
    dialog.className = 'export-dialog';
    dialog.id = 'exportDialog';
    dialog.innerHTML = `
        <div class="export-dialog-title">📥 Export Simulation Report</div>
        <div class="export-options">
            <label class="export-option">
                <input type="radio" name="exportFormat" value="csv" checked>
                <span>CSV (Trades list)</span>
            </label>
            <label class="export-option">
                <input type="radio" name="exportFormat" value="json">
                <span>JSON (Full data)</span>
            </label>
            <label class="export-option">
                <input type="radio" name="exportFormat" value="both">
                <span>Both CSV + JSON</span>
            </label>
        </div>
        <div style="display: flex; gap: 8px;">
            <button class="action-btn secondary" onclick="closeExportDialog()" style="flex: 1;">Cancel</button>
            <button class="action-btn primary" onclick="performExport()" style="flex: 1;">Export</button>
        </div>
    `;
    document.body.appendChild(dialog);
}

export function closeExportDialog() {
    const overlay = document.querySelector('.dialog-overlay');
    const dialog = document.getElementById('exportDialog');
    if (overlay) overlay.remove();
    if (dialog) dialog.remove();
}

export function performExport() {
    const format = document.querySelector('input[name="exportFormat"]:checked').value;
    const sim = window.lastSimulationResults;
    const config = sim.config || {};
    const baseFilename = generateSimulationName(config).replace(/[^a-zA-Z0-9_-]/g, '_');

    if (format === 'csv' || format === 'both') {
        exportToCSV(sim, `${baseFilename}.csv`);
    }

    if (format === 'json' || format === 'both') {
        exportToJSON(sim, `${baseFilename}.json`);
    }

    closeExportDialog();
}

function generateSimulationName(config) {
    if (!config) return 'Unnamed Simulation';

    const start = new Date(config.startDate);
    const now = new Date();
    const duration = now - start;
    const oneDay = 24 * 60 * 60 * 1000;

    let dateStr;
    if (duration < oneDay) {
        dateStr = start.toISOString().slice(0, 16).replace('T', ' ');
    } else {
        dateStr = start.toISOString().slice(0, 10);
    }

    return `${config.strategyName}_${config.timeframe}_${dateStr}`;
}

function exportToCSV(simulation, filename) {
    const results = simulation.results || simulation;
    const config = simulation.config || {};

    let csv = 'Trade #,Entry Time,Exit Time,Entry Price,Exit Price,Size,P&L ($),P&L (%),Type\n';

    (results.trades || []).forEach((trade, i) => {
        csv += `${i + 1},${trade.entryTime},${trade.exitTime},${trade.entryPrice},${trade.exitPrice},${trade.size},${trade.pnl},${trade.pnlPct},${trade.type}\n`;
    });

    csv += '\n';
    csv += 'Summary\n';
    csv += `Strategy,${config.strategyName || 'Unknown'}\n`;
    csv += `Timeframe,${config.timeframe || 'Unknown'}\n`;
    csv += `Start Date,${config.startDate || 'Unknown'}\n`;
    csv += `Total Trades,${results.total_trades || 0}\n`;
    csv += `Win Rate (%),${(results.win_rate || 0).toFixed(2)}\n`;
    csv += `Total P&L ($),${(results.total_pnl || 0).toFixed(2)}\n`;
    csv += `Risk % per Trade,${config.riskPercent || 2}\n`;
    csv += `Stop Loss %,${config.stopLossPercent || 2}\n`;

    downloadFile(csv, filename, 'text/csv');
}

function exportToJSON(simulation, filename) {
    const exportData = {
        metadata: {
            exported_at: new Date().toISOString(),
            version: '1.0'
        },
        configuration: simulation.config || {},
        results: {
            summary: {
                total_trades: simulation.total_trades || simulation.results?.total_trades || 0,
                win_rate: simulation.win_rate || simulation.results?.win_rate || 0,
                total_pnl: simulation.total_pnl || simulation.results?.total_pnl || 0
            },
            trades: simulation.trades || simulation.results?.trades || [],
            equity_curve: simulation.equity_curve || []
        }
    };

    downloadFile(JSON.stringify(exportData, null, 2), filename, 'application/json');
}

export function exportSavedSimulation(id) {
    const sim = window.SimulationStorage?.get(id);
    if (!sim) {
        alert('Simulation not found');
        return;
    }

    window.lastSimulationResults = sim;
    showExportDialog();
}

window.generateSimulationName = generateSimulationName;
37
src/api/dashboard/static/js/ui/index.js
Normal file
@ -0,0 +1,37 @@
export { TradingDashboard, refreshTA, openAIAnalysis } from './chart.js';
export { toggleSidebar, restoreSidebarState } from './sidebar.js';
export { SimulationStorage } from './storage.js';
export { showExportDialog, closeExportDialog, performExport, exportSavedSimulation } from './export.js';
export {
    runSimulation,
    displayEnhancedResults,
    showSimulationMarkers,
    clearSimulationMarkers,
    clearSimulationResults,
    getLastResults,
    setLastResults
} from './simulation.js';
export {
    renderStrategies,
    selectStrategy,
    renderStrategyParams,
    loadStrategies,
    saveSimulation,
    renderSavedSimulations,
    loadSavedSimulation,
    deleteSavedSimulation,
    getCurrentStrategy,
    setCurrentStrategy
} from './strategies-panel.js';
export {
    renderIndicatorList,
    addNewIndicator,
    selectIndicator,
    renderIndicatorConfig,
    applyIndicatorConfig,
    removeIndicator,
    removeIndicatorByIndex,
    drawIndicatorsOnChart,
    getActiveIndicators,
    setActiveIndicators
} from './indicators-panel.js';
677
src/api/dashboard/static/js/ui/indicators-panel.js
Normal file
@ -0,0 +1,677 @@
import { getAvailableIndicators, IndicatorRegistry as IR } from '../indicators/index.js';

let activeIndicators = [];
let configuringId = null;
let previewingType = null; // type being previewed (not yet added)
let nextInstanceId = 1;

const DEFAULT_COLORS = ['#2962ff', '#26a69a', '#ef5350', '#ff9800', '#9c27b0', '#00bcd4', '#ffeb3b', '#e91e63'];
const LINE_TYPES = ['solid', 'dotted', 'dashed'];

function getDefaultColor(index) {
    return DEFAULT_COLORS[index % DEFAULT_COLORS.length];
}

function getPlotGroupName(plotId) {
    const id = plotId.toLowerCase();
    if (id.includes('fast')) return 'Fast';
    if (id.includes('slow')) return 'Slow';
    if (id.includes('upper')) return 'Upper';
    if (id.includes('lower')) return 'Lower';
    if (id.includes('middle') || id.includes('basis')) return 'Middle';
    if (id.includes('signal')) return 'Signal';
    if (id.includes('histogram')) return 'Histogram';
    if (id.includes('k')) return '%K';
    if (id.includes('d')) return '%D';
    return plotId;
}

function groupPlotsByColor(plots) {
    const groups = {};
    plots.forEach((plot, idx) => {
        const groupName = getPlotGroupName(plot.id);
        if (!groups[groupName]) {
            groups[groupName] = { name: groupName, indices: [], plots: [] };
        }
        groups[groupName].indices.push(idx);
        groups[groupName].plots.push(plot);
    });
    return Object.values(groups);
}

/** Generate a short label for an active indicator showing its key params */
function getIndicatorLabel(indicator) {
    const meta = getIndicatorMeta(indicator);
    if (!meta) return indicator.name;

    const paramParts = meta.inputs
        .map(input => indicator.params[input.name])
        .filter(val => val !== undefined && val !== null);

    if (paramParts.length > 0) {
        return `${indicator.name} (${paramParts.join(', ')})`;
    }
    return indicator.name;
}

function getIndicatorMeta(indicator) {
    const IndicatorClass = IR?.[indicator.type];
    if (!IndicatorClass) return null;
    const instance = new IndicatorClass({ type: indicator.type, params: indicator.params, name: indicator.name });
    return instance.getMetadata();
}

export function getActiveIndicators() {
    return activeIndicators;
}

export function setActiveIndicators(indicators) {
    activeIndicators = indicators;
}

/**
 * Render the indicator catalog (available indicators) and active list.
 * Catalog items are added via double-click (multiple instances allowed).
 */
export function renderIndicatorList() {
    const container = document.getElementById('indicatorList');
    if (!container) return;

    const available = getAvailableIndicators();

    container.innerHTML = `
        <div class="indicator-catalog">
            ${available.map(ind => `
                <div class="indicator-catalog-item ${previewingType === ind.type ? 'previewing' : ''}"
                     title="${ind.description || ''}"
                     data-type="${ind.type}">
                    <span class="indicator-catalog-name">${ind.name}</span>
                    <span class="indicator-catalog-add" data-type="${ind.type}">+</span>
                </div>
            `).join('')}
        </div>
        ${activeIndicators.length > 0 ? `
            <div class="indicator-active-divider">Active</div>
            <div class="indicator-active-list">
                ${activeIndicators.map(ind => {
                    const isConfiguring = ind.id === configuringId;
                    const plotGroups = groupPlotsByColor(ind.plots || []);
                    const colorDots = plotGroups.map(group => {
                        const firstIdx = group.indices[0];
                        const color = ind.params[`_color_${firstIdx}`] || '#2962ff';
                        return `<span class="indicator-color-dot" style="background: ${color};"></span>`;
                    }).join('');
                    const label = getIndicatorLabel(ind);

                    return `
                        <div class="indicator-active-item ${isConfiguring ? 'configuring' : ''}"
                             data-id="${ind.id}">
                            <span class="indicator-active-eye" data-id="${ind.id}"
                                  title="${ind.visible !== false ? 'Hide' : 'Show'}">
                                ${ind.visible !== false ? '👁' : '👁🗨'}
                            </span>
                            <span class="indicator-active-name" data-id="${ind.id}">${label}</span>
                            ${colorDots}
                            <button class="indicator-config-btn ${isConfiguring ? 'active' : ''}"
                                    data-id="${ind.id}" title="Configure">⚙</button>
                            <button class="indicator-remove-btn"
                                    data-id="${ind.id}" title="Remove">×</button>
                        </div>
                    `;
                }).join('')}
            </div>
        ` : ''}
    `;

    // Bind events to the freshly rendered elements
    container.querySelectorAll('.indicator-catalog-item').forEach(el => {
        el.addEventListener('click', () => previewIndicator(el.dataset.type));
        el.addEventListener('dblclick', () => addIndicator(el.dataset.type));
    });
    container.querySelectorAll('.indicator-catalog-add').forEach(el => {
        el.addEventListener('click', (e) => {
            e.stopPropagation();
            addIndicator(el.dataset.type);
        });
    });
    container.querySelectorAll('.indicator-active-name').forEach(el => {
        el.addEventListener('click', () => selectIndicatorConfig(el.dataset.id));
    });
    container.querySelectorAll('.indicator-config-btn').forEach(el => {
        el.addEventListener('click', (e) => {
            e.stopPropagation();
            selectIndicatorConfig(el.dataset.id);
        });
    });
    container.querySelectorAll('.indicator-remove-btn').forEach(el => {
        el.addEventListener('click', (e) => {
            e.stopPropagation();
            removeIndicatorById(el.dataset.id);
        });
    });
    container.querySelectorAll('.indicator-active-eye').forEach(el => {
        el.addEventListener('click', (e) => {
            e.stopPropagation();
            toggleVisibility(el.dataset.id);
        });
    });

    updateConfigPanel();
    updateChartLegend();
}

function updateConfigPanel() {
    const configPanel = document.getElementById('indicatorConfigPanel');
    const configButtons = document.getElementById('configButtons');
    if (!configPanel) return;

    configPanel.style.display = 'block';

    // Active indicator config takes priority over preview
    const indicator = configuringId ? activeIndicators.find(a => a.id === configuringId) : null;

    if (indicator) {
        renderIndicatorConfig(indicator);
        if (configButtons) configButtons.style.display = 'flex';
    } else if (previewingType) {
        renderPreviewConfig(previewingType);
        if (configButtons) configButtons.style.display = 'none';
    } else {
        const container = document.getElementById('configForm');
        if (container) {
            container.innerHTML = '<div style="text-align: center; color: var(--tv-text-secondary); padding: 20px; font-size: 12px;">Click an indicator to preview its settings</div>';
        }
        if (configButtons) configButtons.style.display = 'none';
    }
}

/** Single-click: preview config for a catalog indicator type (read-only) */
function previewIndicator(type) {
    configuringId = null;
    previewingType = previewingType === type ? null : type;
    renderIndicatorList();
}

/** Render a read-only preview of an indicator's default config */
function renderPreviewConfig(type) {
    const container = document.getElementById('configForm');
    if (!container) return;

    const IndicatorClass = IR?.[type];
    if (!IndicatorClass) return;

    const instance = new IndicatorClass({ type, params: {}, name: '' });
    const meta = instance.getMetadata();

    container.innerHTML = `
        <div style="font-size: 11px; color: var(--tv-blue); margin-bottom: 4px; font-weight: 600;">${meta.name}</div>
        <div style="font-size: 11px; color: var(--tv-text-secondary); margin-bottom: 10px;">${meta.description || ''}</div>

        ${meta.inputs.map(input => `
            <div style="margin-bottom: 8px;">
                <label style="font-size: 10px; color: var(--tv-text-secondary); text-transform: uppercase; display: block; margin-bottom: 4px;">${input.label}</label>
                ${input.type === 'select' ?
                    `<select class="sim-input" style="font-size: 12px; padding: 6px;" disabled>${input.options.map(o => `<option ${input.default === o ? 'selected' : ''}>${o}</option>`).join('')}</select>` :
                    `<input type="number" class="sim-input" value="${input.default}" ${input.step !== undefined ? `step="${input.step}"` : ''} style="font-size: 12px; padding: 6px;" disabled>`
                }
            </div>
        `).join('')}

        <div style="font-size: 10px; color: var(--tv-text-secondary); margin-top: 8px; text-align: center;">Double-click to add to chart</div>
    `;
}

/** Add a new instance of an indicator type */
export function addIndicator(type) {
    const IndicatorClass = IR?.[type];
    if (!IndicatorClass) return;

    previewingType = null;
    const id = `${type}_${nextInstanceId++}`;
    const instance = new IndicatorClass({ type, params: {}, name: '' });
    const metadata = instance.getMetadata();

    const params = {
        _lineType: 'solid',
        _lineWidth: 2
    };
    metadata.plots.forEach((plot, idx) => {
        params[`_color_${idx}`] = plot.color || getDefaultColor(activeIndicators.length + idx);
    });
    metadata.inputs.forEach(input => {
        params[input.name] = input.default;
    });

    activeIndicators.push({
        id,
        type,
        name: metadata.name,
        params,
        plots: metadata.plots,
        series: [],
        visible: true
    });

    configuringId = id;

    renderIndicatorList();
    drawIndicatorsOnChart();
}

function selectIndicatorConfig(id) {
    previewingType = null;
    configuringId = configuringId === id ? null : id;
    renderIndicatorList();
}

function toggleVisibility(id) {
    const indicator = activeIndicators.find(a => a.id === id);
    if (!indicator) return;

    indicator.visible = indicator.visible === false;

    // Show/hide all series for this indicator
    indicator.series?.forEach(s => {
        try {
            s.applyOptions({ visible: indicator.visible });
        } catch (e) {}
    });

    renderIndicatorList();
}

export function renderIndicatorConfig(indicator) {
    const container = document.getElementById('configForm');
    if (!container || !indicator) return;

    const IndicatorClass = IR?.[indicator.type];
    if (!IndicatorClass) {
        container.innerHTML = '<div style="color: var(--tv-red);">Error loading indicator</div>';
        return;
    }

    const instance = new IndicatorClass({ type: indicator.type, params: indicator.params, name: indicator.name });
    const meta = instance.getMetadata();

    const plotGroups = groupPlotsByColor(meta.plots);

    const colorInputs = plotGroups.map(group => {
        const firstIdx = group.indices[0];
        const color = indicator.params[`_color_${firstIdx}`] || meta.plots[firstIdx].color || '#2962ff';
        return `
            <div style="margin-bottom: 8px;">
                <label style="font-size: 10px; color: var(--tv-text-secondary); text-transform: uppercase; display: block; margin-bottom: 4px;">${group.name} Color</label>
                <input type="color" id="config__color_${firstIdx}" value="${color}" style="width: 100%; height: 28px; border: 1px solid var(--tv-border); border-radius: 4px; cursor: pointer; background: var(--tv-bg);">
            </div>
        `;
    }).join('');

    container.innerHTML = `
        <div style="font-size: 11px; color: var(--tv-blue); margin-bottom: 8px; font-weight: 600;">${getIndicatorLabel(indicator)}</div>

        ${colorInputs}

        <div style="margin-bottom: 8px;">
            <label style="font-size: 10px; color: var(--tv-text-secondary); text-transform: uppercase; display: block; margin-bottom: 4px;">Line Type</label>
            <select id="config__lineType" class="sim-input" style="font-size: 12px; padding: 6px;">
                ${LINE_TYPES.map(lt => `<option value="${lt}" ${indicator.params._lineType === lt ? 'selected' : ''}>${lt.charAt(0).toUpperCase() + lt.slice(1)}</option>`).join('')}
            </select>
        </div>

        <div style="margin-bottom: 8px;">
            <label style="font-size: 10px; color: var(--tv-text-secondary); text-transform: uppercase; display: block; margin-bottom: 4px;">Line Width</label>
            <input type="number" id="config__lineWidth" class="sim-input" value="${indicator.params._lineWidth || 2}" min="1" max="5" style="font-size: 12px; padding: 6px;">
        </div>

        ${meta.inputs.map(input => `
            <div style="margin-bottom: 8px;">
                <label style="font-size: 10px; color: var(--tv-text-secondary); text-transform: uppercase; display: block; margin-bottom: 4px;">${input.label}</label>
                ${input.type === 'select' ?
                    `<select id="config_${input.name}" class="sim-input" style="font-size: 12px; padding: 6px;">${input.options.map(o => `<option value="${o}" ${indicator.params[input.name] === o ? 'selected' : ''}>${o}</option>`).join('')}</select>` :
                    `<input type="number" id="config_${input.name}" class="sim-input" value="${indicator.params[input.name]}" ${input.min !== undefined ? `min="${input.min}"` : ''} ${input.max !== undefined ? `max="${input.max}"` : ''} ${input.step !== undefined ? `step="${input.step}"` : ''} style="font-size: 12px; padding: 6px;">`
                }
            </div>
        `).join('')}
    `;
}

export function applyIndicatorConfig() {
    const indicator = configuringId ? activeIndicators.find(a => a.id === configuringId) : null;
    if (!indicator) return;

    const IndicatorClass = IR?.[indicator.type];
    if (!IndicatorClass) return;

    const instance = new IndicatorClass({ type: indicator.type, params: {}, name: indicator.name });
    const meta = instance.getMetadata();

    const plotGroups = groupPlotsByColor(meta.plots);
    plotGroups.forEach(group => {
        const firstIdx = group.indices[0];
        const colorEl = document.getElementById(`config__color_${firstIdx}`);
        if (colorEl) {
            const color = colorEl.value;
            group.indices.forEach(idx => {
                indicator.params[`_color_${idx}`] = color;
            });
        }
    });

    const lineTypeEl = document.getElementById('config__lineType');
    const lineWidthEl = document.getElementById('config__lineWidth');

    if (lineTypeEl) indicator.params._lineType = lineTypeEl.value;
    if (lineWidthEl) indicator.params._lineWidth = parseInt(lineWidthEl.value, 10);

    meta.inputs.forEach(input => {
        const el = document.getElementById(`config_${input.name}`);
        if (el) {
            indicator.params[input.name] = input.type === 'select' ? el.value : parseFloat(el.value);
        }
    });

    renderIndicatorList();
    drawIndicatorsOnChart();
}

export function removeIndicator() {
    if (!configuringId) return;
    removeIndicatorById(configuringId);
}

export function removeIndicatorById(id) {
    const idx = activeIndicators.findIndex(a => a.id === id);
    if (idx < 0) return;

    activeIndicators[idx].series?.forEach(s => {
        try { window.dashboard?.chart?.removeSeries(s); } catch (e) {}
    });

    activeIndicators.splice(idx, 1);

    if (configuringId === id) {
        configuringId = null;
    }

    renderIndicatorList();
    drawIndicatorsOnChart();
}

export function removeIndicatorByIndex(index) {
    if (index < 0 || index >= activeIndicators.length) return;
    removeIndicatorById(activeIndicators[index].id);
}

let indicatorPanes = new Map();
let nextPaneIndex = 1;

export function drawIndicatorsOnChart() {
    if (!window.dashboard || !window.dashboard.chart) return;

    activeIndicators.forEach(ind => {
        ind.series?.forEach(s => {
            try { window.dashboard.chart.removeSeries(s); } catch (e) {}
        });
    });

    const candles = window.dashboard.allData.get(window.dashboard.currentInterval);
    if (!candles || candles.length === 0) return;

    const lineStyleMap = { 'solid': LightweightCharts.LineStyle.Solid, 'dotted': LightweightCharts.LineStyle.Dotted, 'dashed': LightweightCharts.LineStyle.Dashed };

    indicatorPanes.clear();
    nextPaneIndex = 1;

    const overlayIndicators = [];
    const paneIndicators = [];

    activeIndicators.forEach(ind => {
        const IndicatorClass = IR?.[ind.type];
        if (!IndicatorClass) return;

        const instance = new IndicatorClass({ type: ind.type, params: ind.params, name: ind.name });
        const meta = instance.getMetadata();

        if (meta.displayMode === 'pane') {
            paneIndicators.push({ indicator: ind, meta, instance });
        } else {
            overlayIndicators.push({ indicator: ind, meta, instance });
        }
    });

    const mainPaneHeight = paneIndicators.length > 0 ? 60 : 100;
    const paneHeight = paneIndicators.length > 0 ? Math.floor(40 / paneIndicators.length) : 0;

    window.dashboard.chart.panes()[0]?.setHeight(mainPaneHeight);

    overlayIndicators.forEach(({ indicator, meta, instance }) => {
        if (indicator.visible === false) {
            indicator.series = [];
            return;
        }

        renderIndicatorOnPane(indicator, meta, instance, candles, 0, lineStyleMap);
    });

    paneIndicators.forEach(({ indicator, meta, instance }) => {
        if (indicator.visible === false) {
            indicator.series = [];
            return;
        }

        const paneIndex = nextPaneIndex++;
        indicatorPanes.set(indicator.id, paneIndex);

        renderIndicatorOnPane(indicator, meta, instance, candles, paneIndex, lineStyleMap);

        const pane = window.dashboard.chart.panes()[paneIndex];
        if (pane) {
            pane.setHeight(paneHeight);
        }
    });

    updateChartLegend();
}

function renderIndicatorOnPane(indicator, meta, instance, candles, paneIndex, lineStyleMap) {
    const results = instance.calculate(candles);
    indicator.series = [];

    const lineStyle = lineStyleMap[indicator.params._lineType] || LightweightCharts.LineStyle.Solid;
    const lineWidth = indicator.params._lineWidth || 2;

    const firstNonNull = results?.find(r => r !== null && r !== undefined);
    const isObjectResult = firstNonNull && typeof firstNonNull === 'object';

    meta.plots.forEach((plot, plotIdx) => {
        if (isObjectResult) {
            // Skip plots that have no non-null data across all results
            const hasData = results.some(r => r && r[plot.id] !== undefined && r[plot.id] !== null);
            if (!hasData) return;
        }

        const plotColor = indicator.params[`_color_${plotIdx}`] || plot.color || '#2962ff';

        const data = [];
        for (let i = 0; i < candles.length; i++) {
            const value = isObjectResult ? results[i]?.[plot.id] : results[i];

            if (value !== null && value !== undefined) {
                data.push({
                    time: candles[i].time,
                    value: value
                });
            }
        }

        if (data.length === 0) return;

        let series;

        // Per-plot line style overrides the indicator-level setting
        let plotLineStyle = lineStyle;
        if (plot.style === 'dashed') plotLineStyle = LightweightCharts.LineStyle.Dashed;
        else if (plot.style === 'dotted') plotLineStyle = LightweightCharts.LineStyle.Dotted;
        else if (plot.style === 'solid') plotLineStyle = LightweightCharts.LineStyle.Solid;

        if (plot.type === 'histogram') {
            series = window.dashboard.chart.addSeries(LightweightCharts.HistogramSeries, {
                color: plotColor,
                priceFormat: {
                    type: 'price',
                    precision: 4,
                    minMove: 0.0001
                },
                priceLineVisible: false,
                lastValueVisible: false
            }, paneIndex);
        } else if (plot.type === 'baseline') {
            series = window.dashboard.chart.addSeries(LightweightCharts.BaselineSeries, {
                baseValue: { type: 'price', price: plot.baseValue || 0 },
                topLineColor: plot.topLineColor || plotColor,
                topFillColor1: plot.topFillColor1 || plotColor,
                topFillColor2: plot.topFillColor2 || '#00000000',
                bottomFillColor1: plot.bottomFillColor1 || '#00000000',
                bottomColor: plot.bottomColor || '#00000000',
                lineWidth: plot.width !== undefined ? plot.width : lineWidth,
                lineStyle: plotLineStyle,
                title: plot.title || '',
                priceLineVisible: false,
                lastValueVisible: plot.lastValueVisible !== false
            }, paneIndex);
        } else {
            series = window.dashboard.chart.addSeries(LightweightCharts.LineSeries, {
                color: plotColor,
                lineWidth: plot.width !== undefined ? plot.width : lineWidth,
                lineStyle: plotLineStyle,
                title: plot.title || '',
                priceLineVisible: false,
                lastValueVisible: plot.lastValueVisible !== false
            }, paneIndex);
        }

        series.setData(data);
        indicator.series.push(series);
    });

    // Render gradient zones if available
    if (meta.gradientZones && indicator.series.length > 0) {
        // Find the main series to attach zones to
        let baseSeries = indicator.series[0];

        meta.gradientZones.forEach(zone => {
            if (zone.from === undefined || zone.to === undefined) return;

            // We use createPriceLine on the series for horizontal bands with custom colors
            baseSeries.createPriceLine({
                price: zone.from,
                color: zone.color.replace(/rgba\((\d+),\s*(\d+),\s*(\d+),\s*[^)]+\)/, 'rgb($1, $2, $3)'),
                lineWidth: 1,
                lineStyle: LightweightCharts.LineStyle.Solid,
                axisLabelVisible: false,
                title: zone.label || '',
            });

            if (zone.to !== zone.from) {
                baseSeries.createPriceLine({
|
||||
price: zone.to,
|
||||
color: zone.color.replace(/rgba\((\d+),\s*(\d+),\s*(\d+),\s*[^)]+\)/, 'rgb($1, $2, $3)'),
|
||||
lineWidth: 1,
|
||||
lineStyle: LightweightCharts.LineStyle.Solid,
|
||||
axisLabelVisible: false,
|
||||
title: '',
|
||||
});
|
||||
}
|
||||
});
|
||||
}
|
||||
}
|
||||
|
||||
/** Update the TradingView-style legend overlay on the chart */
|
||||
export function updateChartLegend() {
|
||||
let legend = document.getElementById('chartIndicatorLegend');
|
||||
if (!legend) {
|
||||
const chartWrapper = document.getElementById('chartWrapper');
|
||||
if (!chartWrapper) return;
|
||||
legend = document.createElement('div');
|
||||
legend.id = 'chartIndicatorLegend';
|
||||
legend.className = 'chart-indicator-legend';
|
||||
chartWrapper.appendChild(legend);
|
||||
}
|
||||
|
||||
if (activeIndicators.length === 0) {
|
||||
legend.innerHTML = '';
|
||||
legend.style.display = 'none';
|
||||
return;
|
||||
}
|
||||
|
||||
legend.style.display = 'flex';
|
||||
legend.innerHTML = activeIndicators.map(ind => {
|
||||
const label = getIndicatorLabel(ind);
|
||||
const plotGroups = groupPlotsByColor(ind.plots || []);
|
||||
const firstColor = ind.params['_color_0'] || '#2962ff';
|
||||
const dimmed = ind.visible === false;
|
||||
|
||||
return `
|
||||
<div class="legend-item ${dimmed ? 'legend-dimmed' : ''} ${ind.id === configuringId ? 'legend-selected' : ''}"
|
||||
data-id="${ind.id}">
|
||||
<span class="legend-dot" style="background: ${firstColor};"></span>
|
||||
<span class="legend-label">${label}</span>
|
||||
<span class="legend-close" data-id="${ind.id}" title="Remove">×</span>
|
||||
</div>
|
||||
`;
|
||||
}).join('');
|
||||
|
||||
// Bind legend events
|
||||
legend.querySelectorAll('.legend-item').forEach(el => {
|
||||
el.addEventListener('click', (e) => {
|
||||
if (e.target.classList.contains('legend-close')) return;
|
||||
selectIndicatorConfig(el.dataset.id);
|
||||
renderIndicatorList();
|
||||
});
|
||||
});
|
||||
legend.querySelectorAll('.legend-close').forEach(el => {
|
||||
el.addEventListener('click', (e) => {
|
||||
e.stopPropagation();
|
||||
removeIndicatorById(el.dataset.id);
|
||||
});
|
||||
});
|
||||
}
|
||||
|
||||
// Legacy compat: toggleIndicator still works for external callers
|
||||
export function toggleIndicator(type) {
|
||||
addIndicator(type);
|
||||
}
|
||||
|
||||
export function showIndicatorConfig(index) {
|
||||
if (index >= 0 && index < activeIndicators.length) {
|
||||
selectIndicatorConfig(activeIndicators[index].id);
|
||||
}
|
||||
}
|
||||
|
||||
export function showIndicatorConfigByType(type) {
|
||||
const ind = activeIndicators.find(a => a.type === type);
|
||||
if (ind) {
|
||||
selectIndicatorConfig(ind.id);
|
||||
}
|
||||
}
|
||||
|
||||
window.addIndicator = addIndicator;
|
||||
window.toggleIndicator = toggleIndicator;
|
||||
window.showIndicatorConfig = showIndicatorConfig;
|
||||
window.applyIndicatorConfig = applyIndicatorConfig;
|
||||
window.removeIndicator = removeIndicator;
|
||||
window.removeIndicatorById = removeIndicatorById;
|
||||
window.removeIndicatorByIndex = removeIndicatorByIndex;
|
||||
window.drawIndicatorsOnChart = drawIndicatorsOnChart;
|
||||
23
src/api/dashboard/static/js/ui/sidebar.js
Normal file
@ -0,0 +1,23 @@
export function toggleSidebar() {
    const sidebar = document.getElementById('rightSidebar');
    sidebar.classList.toggle('collapsed');
    localStorage.setItem('sidebar_collapsed', sidebar.classList.contains('collapsed'));

    // Resize chart after sidebar toggle
    setTimeout(() => {
        if (window.dashboard && window.dashboard.chart) {
            const container = document.getElementById('chart');
            window.dashboard.chart.applyOptions({
                width: container.clientWidth,
                height: container.clientHeight
            });
        }
    }, 350); // Wait for CSS transition
}

export function restoreSidebarState() {
    const collapsed = localStorage.getItem('sidebar_collapsed') === 'true';
    if (collapsed) {
        document.getElementById('rightSidebar').classList.add('collapsed');
    }
}
388
src/api/dashboard/static/js/ui/simulation.js
Normal file
@ -0,0 +1,388 @@
import { ClientStrategyEngine } from '../strategies/index.js';
import { SimulationStorage } from './storage.js';
import { downloadFile } from '../utils/index.js';
import { showExportDialog, closeExportDialog, performExport } from './export.js';

let lastSimulationResults = null;

export function getLastResults() {
    return lastSimulationResults;
}

export function setLastResults(results) {
    lastSimulationResults = results;
    window.lastSimulationResults = results;
}

export async function runSimulation() {
    const strategyConfig = getStrategyConfig();
    if (!strategyConfig) {
        alert('Please select a strategy');
        return;
    }

    const startDateInput = document.getElementById('simStartDate').value;
    if (!startDateInput) {
        alert('Please select a start date');
        return;
    }

    const runBtn = document.getElementById('runSimBtn');
    runBtn.disabled = true;
    runBtn.textContent = '⏳ Running...';

    try {
        const start = new Date(startDateInput);
        // Fetch 200 extra days before the start date so indicators have warm-up history
        const fetchStart = new Date(start.getTime() - 200 * 24 * 60 * 60 * 1000);

        if (!window.dashboard) {
            throw new Error('Dashboard not initialized');
        }
        const interval = window.dashboard.currentInterval;
        const secondaryTF = document.getElementById('simSecondaryTF').value;
        const riskPercent = parseFloat(document.getElementById('simRiskPercent').value);
        const stopLossPercent = parseFloat(document.getElementById('simStopLoss').value);

        const timeframes = [interval];
        if (secondaryTF && secondaryTF !== '') {
            timeframes.push(secondaryTF);
        }

        const query = new URLSearchParams({ symbol: 'BTC', start: fetchStart.toISOString() });
        timeframes.forEach(tf => query.append('timeframes', tf));

        console.log('Fetching candles with query:', query.toString());

        const response = await fetch(`/api/v1/candles/bulk?${query.toString()}`);

        if (!response.ok) {
            throw new Error(`API error: ${response.status} ${response.statusText}`);
        }

        const data = await response.json();
        console.log('Candle data received:', data);
        console.log('Looking for interval:', interval);
        console.log('Available timeframes:', Object.keys(data));

        if (!data[interval] || data[interval].length === 0) {
            throw new Error(`No candle data available for ${interval} timeframe. Check if data exists in database.`);
        }

        const candlesMap = {
            [interval]: data[interval].map(c => ({
                time: Math.floor(new Date(c.time).getTime() / 1000),
                open: parseFloat(c.open),
                high: parseFloat(c.high),
                low: parseFloat(c.low),
                close: parseFloat(c.close)
            }))
        };

        if (secondaryTF && data[secondaryTF]) {
            candlesMap[secondaryTF] = data[secondaryTF].map(c => ({
                time: Math.floor(new Date(c.time).getTime() / 1000),
                open: parseFloat(c.open),
                high: parseFloat(c.high),
                low: parseFloat(c.low),
                close: parseFloat(c.close)
            }));
        }

        const engineConfig = {
            id: strategyConfig.id,
            params: strategyConfig.params,
            timeframes: { primary: interval, secondary: secondaryTF ? [secondaryTF] : [] },
            indicators: []
        };

        console.log('Building strategy config:');
        console.log('  Primary TF:', interval);
        console.log('  Secondary TF:', secondaryTF);
        console.log('  Available candles:', Object.keys(candlesMap));

        if (strategyConfig.id === 'ma_trend') {
            const period = strategyConfig.params?.period || 44;
            engineConfig.indicators.push({
                name: `ma${period}`,
                type: 'sma',
                params: { period: period },
                timeframe: interval
            });
            if (secondaryTF) {
                engineConfig.indicators.push({
                    name: `ma${period}_${secondaryTF}`,
                    type: 'sma',
                    params: { period: period },
                    timeframe: secondaryTF
                });
            }
        }

        console.log('  Indicators configured:', engineConfig.indicators.map(i => `${i.name} on ${i.timeframe}`));

        const riskConfig = {
            positionSizing: { method: 'percent', value: riskPercent },
            stopLoss: { enabled: true, method: 'percent', value: stopLossPercent }
        };

        const engine = new ClientStrategyEngine();
        const results = engine.run(candlesMap, engineConfig, riskConfig, start);

        if (results.error) throw new Error(results.error);

        setLastResults({
            ...results,
            config: {
                strategyId: strategyConfig.id,
                strategyName: window.availableStrategies?.find(s => s.id === strategyConfig.id)?.name || strategyConfig.id,
                timeframe: interval,
                secondaryTimeframe: secondaryTF,
                startDate: startDateInput,
                riskPercent: riskPercent,
                stopLossPercent: stopLossPercent,
                params: strategyConfig.params
            },
            runAt: new Date().toISOString()
        });

        displayEnhancedResults(lastSimulationResults);

        document.getElementById('resultsSection').style.display = 'block';

        if (window.dashboard && candlesMap[interval]) {
            const chartData = candlesMap[interval].map(c => ({
                time: c.time,
                open: c.open,
                high: c.high,
                low: c.low,
                close: c.close
            }));
            window.dashboard.candleSeries.setData(chartData);
            window.dashboard.allData.set(interval, chartData);
            console.log(`Chart updated with ${chartData.length} candles from simulation range`);
        }

        showSimulationMarkers();

    } catch (error) {
        console.error('Simulation error:', error);
        alert('Simulation error: ' + error.message);
    } finally {
        runBtn.disabled = false;
        runBtn.textContent = '▶ Run Simulation';
    }
}

export function displayEnhancedResults(simulation) {
    const results = simulation.results || simulation;

    document.getElementById('simTrades').textContent = results.total_trades || '0';
    document.getElementById('simWinRate').textContent = (results.win_rate || 0).toFixed(1) + '%';

    const pnl = results.total_pnl || 0;
    const pnlElement = document.getElementById('simPnL');
    pnlElement.textContent = (pnl >= 0 ? '+' : '') + '$' + pnl.toFixed(2);
    pnlElement.style.color = pnl >= 0 ? '#4caf50' : '#f44336';

    let grossProfit = 0;
    let grossLoss = 0;
    (results.trades || []).forEach(trade => {
        if (trade.pnl > 0) grossProfit += trade.pnl;
        else grossLoss += Math.abs(trade.pnl);
    });
    const profitFactor = grossLoss > 0 ? (grossProfit / grossLoss).toFixed(2) : grossProfit > 0 ? '∞' : '0';
    document.getElementById('simProfitFactor').textContent = profitFactor;

    drawEquitySparkline(results);
}

function drawEquitySparkline(results) {
    const container = document.getElementById('equitySparkline');
    if (!container) return; // Guard: without this, the "No trades" branch would dereference null
    if (!results.trades || results.trades.length === 0) {
        container.innerHTML = '<div style="text-align: center; color: var(--tv-text-secondary); padding: 20px; font-size: 11px;">No trades</div>';
        return;
    }

    let equity = 1000;
    const equityData = [{ time: results.trades[0].entryTime, equity: equity }];

    results.trades.forEach(trade => {
        equity += trade.pnl;
        equityData.push({ time: trade.exitTime, equity: equity });
    });

    if (lastSimulationResults) {
        lastSimulationResults.equity_curve = equityData;
    }

    container.innerHTML = '<canvas id="sparklineCanvas" width="300" height="60"></canvas>';
    const canvas = document.getElementById('sparklineCanvas');
    const ctx = canvas.getContext('2d');

    const minEquity = Math.min(...equityData.map(d => d.equity));
    const maxEquity = Math.max(...equityData.map(d => d.equity));
    const range = maxEquity - minEquity || 1;

    ctx.strokeStyle = equityData[equityData.length - 1].equity >= equityData[0].equity ? '#4caf50' : '#f44336';
    ctx.lineWidth = 2;
    ctx.beginPath();

    equityData.forEach((point, i) => {
        const x = (i / (equityData.length - 1)) * canvas.width;
        const y = canvas.height - ((point.equity - minEquity) / range) * canvas.height;

        if (i === 0) ctx.moveTo(x, y);
        else ctx.lineTo(x, y);
    });

    ctx.stroke();

    ctx.fillStyle = '#888';
    ctx.font = '9px sans-serif';
    ctx.fillText('$' + equityData[0].equity.toFixed(0), 2, canvas.height - 2);
    ctx.fillText('$' + equityData[equityData.length - 1].equity.toFixed(0), canvas.width - 30, 10);
}

let tradeLineSeries = [];

export function showSimulationMarkers() {
    const results = getLastResults();
    if (!results || !window.dashboard) return;

    const trades = results.trades || results.results?.trades || [];
    const markers = [];

    clearSimulationMarkers();

    console.log('Plotting trades:', trades.length);

    trades.forEach((trade, i) => {
        let entryTime, exitTime;

        if (typeof trade.entryTime === 'number') {
            entryTime = trade.entryTime;
        } else {
            entryTime = Math.floor(new Date(trade.entryTime).getTime() / 1000);
        }

        if (typeof trade.exitTime === 'number') {
            exitTime = trade.exitTime;
        } else {
            exitTime = Math.floor(new Date(trade.exitTime).getTime() / 1000);
        }

        const pnlSymbol = trade.pnl > 0 ? '+' : '';

        markers.push({
            time: entryTime,
            position: 'belowBar',
            color: '#2196f3',
            shape: 'arrowUp',
            text: 'BUY',
            size: 1
        });

        markers.push({
            time: exitTime,
            position: 'aboveBar',
            color: trade.pnl > 0 ? '#4caf50' : '#f44336',
            shape: 'arrowDown',
            text: `SELL ${pnlSymbol}${trade.pnlPct.toFixed(1)}%`,
            size: 1
        });

        const lineSeries = window.dashboard.chart.addSeries(LightweightCharts.LineSeries, {
            color: '#2196f3',
            lineWidth: 1,
            lastValueVisible: false,
            title: '',
            priceLineVisible: false,
            crosshairMarkerVisible: false
        }, 0);

        lineSeries.setData([
            { time: entryTime, value: trade.entryPrice },
            { time: exitTime, value: trade.exitPrice }
        ]);

        tradeLineSeries.push(lineSeries);
    });

    markers.sort((a, b) => a.time - b.time);

    window.dashboard.candleSeries.setMarkers(markers);

    console.log(`Plotted ${trades.length} trades with connection lines`);
}

export function clearSimulationMarkers() {
    try {
        if (window.dashboard && window.dashboard.candleSeries && typeof window.dashboard.candleSeries.setMarkers === 'function') {
            window.dashboard.candleSeries.setMarkers([]);
        }
    } catch (e) {
        // Ignore errors clearing markers
    }

    try {
        tradeLineSeries.forEach(series => {
            try {
                if (window.dashboard && window.dashboard.chart) {
                    window.dashboard.chart.removeSeries(series);
                }
            } catch (e) {
                // Series might already be removed
            }
        });
    } catch (e) {
        // Ignore errors removing series
    }

    tradeLineSeries = [];
}

export function clearSimulationResults() {
    clearSimulationMarkers();

    setLastResults(null);

    const resultsSection = document.getElementById('resultsSection');
    if (resultsSection) {
        resultsSection.style.display = 'none';
    }

    const simTrades = document.getElementById('simTrades');
    const simWinRate = document.getElementById('simWinRate');
    const simPnL = document.getElementById('simPnL');
    const simProfitFactor = document.getElementById('simProfitFactor');
    const equitySparkline = document.getElementById('equitySparkline');

    if (simTrades) simTrades.textContent = '0';
    if (simWinRate) simWinRate.textContent = '0%';
    if (simPnL) {
        simPnL.textContent = '$0.00';
        simPnL.style.color = '';
    }
    if (simProfitFactor) simProfitFactor.textContent = '0';
    if (equitySparkline) equitySparkline.innerHTML = '';
}

function getStrategyConfig() {
    const strategyId = window.currentStrategy;
    if (!strategyId) return null;

    const params = {};
    const paramDefs = window.StrategyParams?.[strategyId] || [];

    paramDefs.forEach(def => {
        const input = document.getElementById(`param_${def.name}`);
        if (input) {
            params[def.name] = def.type === 'number' ? parseFloat(input.value) : input.value;
        }
    });

    return {
        id: strategyId,
        params: params
    };
}
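`displayEnhancedResults` derives the profit factor as gross profit over gross loss, with '∞' when there are wins but no losses. The rule in isolation, with made-up sample trades:

```javascript
// Profit factor as computed in displayEnhancedResults:
// sum of winning PnL divided by the absolute sum of losing PnL.
function profitFactor(trades) {
    let grossProfit = 0;
    let grossLoss = 0;
    for (const t of trades) {
        if (t.pnl > 0) grossProfit += t.pnl;
        else grossLoss += Math.abs(t.pnl);
    }
    if (grossLoss > 0) return (grossProfit / grossLoss).toFixed(2);
    return grossProfit > 0 ? '∞' : '0'; // no losses: undefined ratio shown as infinity
}

console.log(profitFactor([{ pnl: 30 }, { pnl: -10 }, { pnl: 5 }])); // "3.50"
console.log(profitFactor([{ pnl: 5 }]));                            // "∞"
```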
47
src/api/dashboard/static/js/ui/storage.js
Normal file
@ -0,0 +1,47 @@
export const SimulationStorage = {
    STORAGE_KEY: 'btc_bot_simulations',

    getAll() {
        try {
            const data = localStorage.getItem(this.STORAGE_KEY);
            return data ? JSON.parse(data) : [];
        } catch (e) {
            console.error('Error reading simulations:', e);
            return [];
        }
    },

    save(simulation) {
        try {
            const simulations = this.getAll();
            simulation.id = simulation.id || 'sim_' + Date.now();
            simulation.createdAt = new Date().toISOString();
            simulations.push(simulation);
            localStorage.setItem(this.STORAGE_KEY, JSON.stringify(simulations));
            return simulation.id;
        } catch (e) {
            console.error('Error saving simulation:', e);
            return null;
        }
    },

    delete(id) {
        try {
            let simulations = this.getAll();
            simulations = simulations.filter(s => s.id !== id);
            localStorage.setItem(this.STORAGE_KEY, JSON.stringify(simulations));
            return true;
        } catch (e) {
            console.error('Error deleting simulation:', e);
            return false;
        }
    },

    get(id) {
        return this.getAll().find(s => s.id === id);
    },

    clear() {
        localStorage.removeItem(this.STORAGE_KEY);
    }
};
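`SimulationStorage` serializes the whole list of simulations as one JSON blob under a single localStorage key. A pared-down sketch of that save/load/delete round-trip — the browser's `localStorage` is replaced here by a tiny in-memory stand-in so the flow can run anywhere, and error handling and `createdAt` stamping are omitted:

```javascript
// In-memory stand-in for the browser's localStorage (string values only).
const store = new Map();
const localStorage = {
    getItem: k => (store.has(k) ? store.get(k) : null),
    setItem: (k, v) => store.set(k, String(v)),
    removeItem: k => store.delete(k)
};

// Condensed copy of the storage object's core logic.
const SimulationStorage = {
    STORAGE_KEY: 'btc_bot_simulations',
    getAll() {
        const data = localStorage.getItem(this.STORAGE_KEY);
        return data ? JSON.parse(data) : [];
    },
    save(simulation) {
        const simulations = this.getAll();
        simulation.id = simulation.id || 'sim_' + Date.now();
        simulations.push(simulation);
        localStorage.setItem(this.STORAGE_KEY, JSON.stringify(simulations));
        return simulation.id;
    },
    delete(id) {
        const remaining = this.getAll().filter(s => s.id !== id);
        localStorage.setItem(this.STORAGE_KEY, JSON.stringify(remaining));
    }
};

const id = SimulationStorage.save({ name: 'demo', results: { total_trades: 3 } });
console.log(SimulationStorage.getAll().length); // 1
SimulationStorage.delete(id);
console.log(SimulationStorage.getAll().length); // 0
```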
309
src/api/dashboard/static/js/ui/strategies-panel.js
Normal file
@ -0,0 +1,309 @@
import { StrategyParams } from '../strategies/config.js';

let currentStrategy = null;

export function getCurrentStrategy() {
    return currentStrategy;
}

export function setCurrentStrategy(strategyId) {
    currentStrategy = strategyId;
    window.currentStrategy = strategyId;
}

export function renderStrategies(strategies) {
    const container = document.getElementById('strategyList');

    if (!strategies || strategies.length === 0) {
        container.innerHTML = '<div style="text-align: center; color: var(--tv-text-secondary); padding: 20px;">No strategies available</div>';
        return;
    }

    container.innerHTML = strategies.map((s, index) => `
        <div class="strategy-item ${index === 0 ? 'selected' : ''}" data-strategy-id="${s.id}" onclick="selectStrategy('${s.id}')">
            <input type="radio" name="strategy" class="strategy-radio" ${index === 0 ? 'checked' : ''}>
            <span class="strategy-name">${s.name}</span>
            <span class="strategy-info" title="${s.description}">ⓘ</span>
        </div>
    `).join('');

    if (strategies.length > 0) {
        selectStrategy(strategies[0].id);
    }

    document.getElementById('runSimBtn').disabled = false;
}

export function selectStrategy(strategyId) {
    document.querySelectorAll('.strategy-item').forEach(item => {
        item.classList.toggle('selected', item.dataset.strategyId === strategyId);
        const radio = item.querySelector('input[type="radio"]');
        if (radio) radio.checked = item.dataset.strategyId === strategyId;
    });

    setCurrentStrategy(strategyId);
    renderStrategyParams(strategyId);
}

export function renderStrategyParams(strategyId) {
    const container = document.getElementById('strategyParams');
    const params = StrategyParams[strategyId] || [];

    if (params.length === 0) {
        container.innerHTML = '';
        return;
    }

    container.innerHTML = params.map(param => `
        <div class="config-group">
            <label class="config-label">${param.label}</label>
            <input type="${param.type}"
                   id="param_${param.name}"
                   class="config-input"
                   value="${param.default}"
                   ${param.min !== undefined ? `min="${param.min}"` : ''}
                   ${param.max !== undefined ? `max="${param.max}"` : ''}
                   ${param.step !== undefined ? `step="${param.step}"` : ''}
            >
        </div>
    `).join('');
}

export async function loadStrategies() {
    try {
        console.log('Fetching strategies from API...');

        const controller = new AbortController();
        const timeoutId = setTimeout(() => controller.abort(), 5000);

        const response = await fetch('/api/v1/strategies?_=' + Date.now(), {
            signal: controller.signal
        });
        clearTimeout(timeoutId);

        if (!response.ok) {
            throw new Error(`HTTP error! status: ${response.status}`);
        }

        const data = await response.json();
        console.log('Strategies loaded:', data);

        if (!data.strategies) {
            throw new Error('Invalid response format: missing strategies array');
        }

        window.availableStrategies = data.strategies;
        renderStrategies(data.strategies);
    } catch (error) {
        console.error('Error loading strategies:', error);

        let errorMessage = error.message;
        if (error.name === 'AbortError') {
            errorMessage = 'Request timeout - API server not responding';
        } else if (error.message.includes('Failed to fetch')) {
            errorMessage = 'Cannot connect to API server - is it running?';
        }

        document.getElementById('strategyList').innerHTML =
            `<div style="color: var(--tv-red); padding: 20px; text-align: center;">
                ${errorMessage}<br>
                <small style="color: var(--tv-text-secondary);">Check console (F12) for details</small>
            </div>`;
    }
}

export function saveSimulation() {
    const results = getLastResults();
    if (!results) {
        alert('Please run a simulation first');
        return;
    }

    const defaultName = generateSimulationName(results.config);
    const name = prompt('Save simulation as:', defaultName);

    if (!name || name.trim() === '') return;

    const simulation = {
        name: name.trim(),
        config: results.config,
        results: {
            total_trades: results.total_trades,
            win_rate: results.win_rate,
            total_pnl: results.total_pnl,
            trades: results.trades,
            equity_curve: results.equity_curve
        }
    };

    const id = window.SimulationStorage?.save(simulation);
    if (id) {
        renderSavedSimulations();
        alert('Simulation saved successfully!');
    } else {
        alert('Error saving simulation');
    }
}

function generateSimulationName(config) {
    if (!config) return 'Unnamed Simulation';

    const start = new Date(config.startDate);
    const now = new Date();
    const duration = now - start;
    const oneDay = 24 * 60 * 60 * 1000;

    let dateStr;
    if (duration < oneDay) {
        dateStr = start.toISOString().slice(0, 16).replace('T', ' ');
    } else {
        dateStr = start.toISOString().slice(0, 10);
    }

    return `${config.strategyName}_${config.timeframe}_${dateStr}`;
}

export function renderSavedSimulations() {
    const container = document.getElementById('savedSimulations');
    const simulations = window.SimulationStorage?.getAll() || [];

    if (simulations.length === 0) {
        container.innerHTML = '<div style="text-align: center; color: var(--tv-text-secondary); padding: 10px; font-size: 12px;">No saved simulations</div>';
        return;
    }

    container.innerHTML = simulations.map(sim => `
        <div class="saved-sim-item">
            <span class="saved-sim-name" onclick="loadSavedSimulation('${sim.id}')" title="${sim.name}">
                ${sim.name.length > 25 ? sim.name.slice(0, 25) + '...' : sim.name}
            </span>
            <div class="saved-sim-actions">
                <button class="sim-action-btn" onclick="loadSavedSimulation('${sim.id}')" title="Load">📂</button>
                <button class="sim-action-btn" onclick="exportSavedSimulation('${sim.id}')" title="Export">📥</button>
                <button class="sim-action-btn" onclick="deleteSavedSimulation('${sim.id}')" title="Delete">🗑️</button>
            </div>
        </div>
    `).join('');
}

export function loadSavedSimulation(id) {
    const sim = window.SimulationStorage?.get(id);
    if (!sim) {
        alert('Simulation not found');
        return;
    }

    if (sim.config) {
        document.getElementById('simSecondaryTF').value = sim.config.secondaryTimeframe || '';
        document.getElementById('simStartDate').value = sim.config.startDate || '';
        document.getElementById('simRiskPercent').value = sim.config.riskPercent || 2;
        document.getElementById('simStopLoss').value = sim.config.stopLossPercent || 2;

        if (sim.config.strategyId) {
            selectStrategy(sim.config.strategyId);

            if (sim.config.params) {
                Object.entries(sim.config.params).forEach(([key, value]) => {
                    const input = document.getElementById(`param_${key}`);
                    if (input) input.value = value;
                });
            }
        }
    }

    setLastResults(sim);
    displayEnhancedResults(sim.results);
    document.getElementById('resultsSection').style.display = 'block';
}

export function deleteSavedSimulation(id) {
    if (!confirm('Are you sure you want to delete this simulation?')) return;

    if (window.SimulationStorage?.delete(id)) {
        renderSavedSimulations();
    }
}

function displayEnhancedResults(simulation) {
    const results = simulation.results || simulation;

    document.getElementById('simTrades').textContent = results.total_trades || '0';
    document.getElementById('simWinRate').textContent = (results.win_rate || 0).toFixed(1) + '%';

    const pnl = results.total_pnl || 0;
    const pnlElement = document.getElementById('simPnL');
    pnlElement.textContent = (pnl >= 0 ? '+' : '') + '$' + pnl.toFixed(2);
    pnlElement.style.color = pnl >= 0 ? '#4caf50' : '#f44336';

    let grossProfit = 0;
    let grossLoss = 0;
    (results.trades || []).forEach(trade => {
        if (trade.pnl > 0) grossProfit += trade.pnl;
        else grossLoss += Math.abs(trade.pnl);
    });
    const profitFactor = grossLoss > 0 ? (grossProfit / grossLoss).toFixed(2) : grossProfit > 0 ? '∞' : '0';
    document.getElementById('simProfitFactor').textContent = profitFactor;

    drawEquitySparkline(results);
}

function drawEquitySparkline(results) {
    const container = document.getElementById('equitySparkline');
    if (!container) return; // Guard: without this, the "No trades" branch would dereference null
    if (!results.trades || results.trades.length === 0) {
        container.innerHTML = '<div style="text-align: center; color: var(--tv-text-secondary); padding: 20px; font-size: 11px;">No trades</div>';
        return;
    }

    let equity = 1000;
    const equityData = [{ time: results.trades[0].entryTime, equity: equity }];

    results.trades.forEach(trade => {
        equity += trade.pnl;
        equityData.push({ time: trade.exitTime, equity: equity });
    });

    const sim = getLastResults();
    if (sim) {
        sim.equity_curve = equityData;
    }

    container.innerHTML = '<canvas id="sparklineCanvas" width="300" height="60"></canvas>';
    const canvas = document.getElementById('sparklineCanvas');
    const ctx = canvas.getContext('2d');

    const minEquity = Math.min(...equityData.map(d => d.equity));
    const maxEquity = Math.max(...equityData.map(d => d.equity));
    const range = maxEquity - minEquity || 1;

    ctx.strokeStyle = equityData[equityData.length - 1].equity >= equityData[0].equity ? '#4caf50' : '#f44336';
    ctx.lineWidth = 2;
    ctx.beginPath();

    equityData.forEach((point, i) => {
        const x = (i / (equityData.length - 1)) * canvas.width;
        const y = canvas.height - ((point.equity - minEquity) / range) * canvas.height;

        if (i === 0) ctx.moveTo(x, y);
        else ctx.lineTo(x, y);
    });

    ctx.stroke();

    ctx.fillStyle = '#888';
    ctx.font = '9px sans-serif';
    ctx.fillText('$' + equityData[0].equity.toFixed(0), 2, canvas.height - 2);
    ctx.fillText('$' + equityData[equityData.length - 1].equity.toFixed(0), canvas.width - 30, 10);
}

function getLastResults() {
    return window.lastSimulationResults;
}

function setLastResults(results) {
    window.lastSimulationResults = results;
}

window.selectStrategy = selectStrategy;
window.loadSavedSimulation = loadSavedSimulation;
window.deleteSavedSimulation = deleteSavedSimulation;
window.renderSavedSimulations = renderSavedSimulations;
23
src/api/dashboard/static/js/utils/helpers.js
Normal file
@ -0,0 +1,23 @@
export function downloadFile(content, filename, mimeType) {
    const blob = new Blob([content], { type: mimeType });
    const url = URL.createObjectURL(blob);
    const link = document.createElement('a');
    link.href = url;
    link.download = filename;
    document.body.appendChild(link);
    link.click();
    document.body.removeChild(link);
    URL.revokeObjectURL(url);
}

export function formatDate(date) {
    return new Date(date).toISOString().slice(0, 16);
}

export function formatPrice(price, decimals = 2) {
    return price.toFixed(decimals);
}

export function formatPercent(value) {
    return (value >= 0 ? '+' : '') + value.toFixed(2) + '%';
}
1
src/api/dashboard/static/js/utils/index.js
Normal file
@ -0,0 +1 @@
export { downloadFile, formatDate, formatPrice, formatPercent } from './helpers.js';
636
src/api/server.py
Normal file
@ -0,0 +1,636 @@
"""
Simplified FastAPI server - working version
Removes the complex WebSocket manager that was causing issues
"""

import os
import asyncio
import logging
from dotenv import load_dotenv

# Load environment variables before the DB settings below are read
load_dotenv()

from datetime import datetime, timedelta, timezone
from typing import Optional, List
from contextlib import asynccontextmanager

from fastapi import FastAPI, HTTPException, Query, BackgroundTasks, Response
from fastapi.staticfiles import StaticFiles
from fastapi.responses import StreamingResponse
from fastapi.middleware.cors import CORSMiddleware
import asyncpg
import csv
import io
from pydantic import BaseModel, Field

# Imports for the backtest runner
from src.data_collector.database import DatabaseManager
from src.data_collector.indicator_engine import IndicatorEngine, IndicatorConfig
from src.data_collector.brain import Brain
from src.data_collector.backtester import Backtester

# Imports for strategy discovery
import importlib
from src.strategies.base import BaseStrategy


logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)


# Database connection settings
DB_HOST = os.getenv('DB_HOST', 'localhost')
DB_PORT = int(os.getenv('DB_PORT', 5432))
DB_NAME = os.getenv('DB_NAME', 'btc_data')
DB_USER = os.getenv('DB_USER', 'btc_bot')
DB_PASSWORD = os.getenv('DB_PASSWORD', '')


async def get_db_pool():
    """Create the database connection pool"""
    logger.info(f"Connecting to database: {DB_HOST}:{DB_PORT}/{DB_NAME} as {DB_USER}")
    return await asyncpg.create_pool(
        host=DB_HOST,
        port=DB_PORT,
        database=DB_NAME,
        user=DB_USER,
        password=DB_PASSWORD,
        min_size=2,
        max_size=20,
        max_inactive_connection_lifetime=300
    )


pool = None


@asynccontextmanager
async def lifespan(app: FastAPI):
    """Manage application lifespan"""
    global pool
    pool = await get_db_pool()
    logger.info("API Server started successfully")
    yield
    if pool:
        await pool.close()
    logger.info("API Server stopped")


app = FastAPI(
    title="BTC Bot Data API",
    description="REST API for accessing BTC candle data",
    version="1.1.0",
    lifespan=lifespan
)

# Enable CORS
app.add_middleware(
    CORSMiddleware,
    allow_origins=["*"],
    allow_credentials=True,
    allow_methods=["*"],
    allow_headers=["*"],
)


@app.get("/")
async def root():
    """Root endpoint"""
    return {
        "message": "BTC Bot Data API",
        "docs": "/docs",
        "dashboard": "/dashboard",
        "status": "operational"
    }


@app.get("/api/v1/strategies")
async def list_strategies(response: Response):
    """List all available trading strategies with metadata"""
    # Prevent caching
    response.headers["Cache-Control"] = "no-cache, no-store, must-revalidate"
    response.headers["Pragma"] = "no-cache"
    response.headers["Expires"] = "0"

    # Strategy registry (mirrors brain.py)
    strategy_registry = {
        "ma_trend": "src.strategies.ma_strategy.MAStrategy",
    }

    strategies = []

    for strategy_id, class_path in strategy_registry.items():
        try:
            module_path, class_name = class_path.rsplit('.', 1)
            module = importlib.import_module(module_path)
            strategy_class = getattr(module, class_name)

            # Instantiate to read metadata
            strategy_instance = strategy_class()

            strategies.append({
                "id": strategy_id,
                "name": strategy_instance.display_name,
                "description": strategy_instance.description,
                "required_indicators": strategy_instance.required_indicators
            })
        except Exception as e:
            logger.error(f"Failed to load strategy {strategy_id}: {e}")

    return {
        "strategies": strategies,
        "count": len(strategies)
    }
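The registry above maps a strategy id to a dotted class path and resolves it at runtime with `importlib`. The resolution step in isolation, as a minimal sketch (`load_class` is a name introduced here; a standard-library class stands in for a strategy path such as `src.strategies.ma_strategy.MAStrategy`):

```python
import importlib


def load_class(dotted_path: str):
    """Resolve 'pkg.module.ClassName' to the class object."""
    module_path, class_name = dotted_path.rsplit('.', 1)
    module = importlib.import_module(module_path)
    return getattr(module, class_name)


# Stand-in path; a real registry entry would point at a strategy class
cls = load_class("collections.OrderedDict")
```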

@app.get("/api/v1/candles")
async def get_candles(
    symbol: str = Query("BTC", description="Trading pair symbol"),
    interval: str = Query("1m", description="Candle interval"),
    start: Optional[datetime] = Query(None, description="Start time (ISO format)"),
    end: Optional[datetime] = Query(None, description="End time (ISO format)"),
    limit: int = Query(1000, ge=1, le=10000, description="Maximum number of candles")
):
    """Get candle data for a symbol"""
    async with pool.acquire() as conn:
        query = """
            SELECT time, symbol, interval, open, high, low, close, volume, validated
            FROM candles
            WHERE symbol = $1 AND interval = $2
        """
        params = [symbol, interval]

        if start:
            query += f" AND time >= ${len(params) + 1}"
            params.append(start)

        if end:
            query += f" AND time <= ${len(params) + 1}"
            params.append(end)

        query += f" ORDER BY time DESC LIMIT ${len(params) + 1}"
        params.append(limit)

        rows = await conn.fetch(query, *params)

        return {
            "symbol": symbol,
            "interval": interval,
            "count": len(rows),
            "candles": [dict(row) for row in rows]
        }
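The endpoint builds its SQL incrementally, numbering asyncpg `$n` placeholders from the current parameter count. The same pattern as a pure function, easy to unit-test (a sketch; `build_query` is a name introduced here, not part of the server):

```python
def build_query(symbol, interval, start=None, end=None, limit=1000):
    """Append optional filters, numbering $n placeholders as params grow."""
    query = ("SELECT time, open, high, low, close, volume FROM candles "
             "WHERE symbol = $1 AND interval = $2")
    params = [symbol, interval]
    if start is not None:
        query += f" AND time >= ${len(params) + 1}"
        params.append(start)
    if end is not None:
        query += f" AND time <= ${len(params) + 1}"
        params.append(end)
    query += f" ORDER BY time DESC LIMIT ${len(params) + 1}"
    params.append(limit)
    return query, params
```

Because the placeholder number is computed before each append, the query text and parameter list stay in lockstep regardless of which filters are present.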

@app.get("/api/v1/candles/bulk")
async def get_candles_bulk(
    symbol: str = Query("BTC"),
    timeframes: List[str] = Query(["1h"]),
    start: datetime = Query(...),
    end: Optional[datetime] = Query(None),
):
    """Get multiple timeframes of candles in a single request for client-side processing"""
    logger.info(f"Bulk candle request: {symbol}, TFs: {timeframes}, Start: {start}, End: {end}")
    if not end:
        end = datetime.now(timezone.utc)

    results = {}

    async with pool.acquire() as conn:
        for tf in timeframes:
            rows = await conn.fetch("""
                SELECT time, open, high, low, close, volume
                FROM candles
                WHERE symbol = $1 AND interval = $2
                  AND time >= $3 AND time <= $4
                ORDER BY time ASC
            """, symbol, tf, start, end)

            results[tf] = [
                {
                    "time": r['time'].isoformat(),
                    "open": float(r['open']),
                    "high": float(r['high']),
                    "low": float(r['low']),
                    "close": float(r['close']),
                    "volume": float(r['volume'])
                } for r in rows
            ]

    logger.info(f"Returning {sum(len(v) for v in results.values())} candles total")
    return results


@app.get("/api/v1/candles/latest")
async def get_latest_candle(symbol: str = "BTC", interval: str = "1m"):
    """Get the most recent candle"""
    async with pool.acquire() as conn:
        row = await conn.fetchrow("""
            SELECT time, symbol, interval, open, high, low, close, volume
            FROM candles
            WHERE symbol = $1 AND interval = $2
            ORDER BY time DESC
            LIMIT 1
        """, symbol, interval)

        if not row:
            raise HTTPException(status_code=404, detail="No data found")

        return dict(row)
@app.get("/api/v1/stats")
async def get_stats(symbol: str = "BTC"):
    """Get trading statistics"""
    async with pool.acquire() as conn:
        # Get the latest price and 24h stats
        latest = await conn.fetchrow("""
            SELECT close, time
            FROM candles
            WHERE symbol = $1 AND interval = '1m'
            ORDER BY time DESC
            LIMIT 1
        """, symbol)

        day_ago = await conn.fetchrow("""
            SELECT close
            FROM candles
            WHERE symbol = $1 AND interval = '1m' AND time <= NOW() - INTERVAL '24 hours'
            ORDER BY time DESC
            LIMIT 1
        """, symbol)

        stats_24h = await conn.fetchrow("""
            SELECT
                MAX(high) as high_24h,
                MIN(low) as low_24h,
                SUM(volume) as volume_24h
            FROM candles
            WHERE symbol = $1 AND interval = '1m' AND time > NOW() - INTERVAL '24 hours'
        """, symbol)

    if not latest:
        raise HTTPException(status_code=404, detail="No data found")

    current_price = float(latest['close'])
    previous_price = float(day_ago['close']) if day_ago else current_price
    change_24h = ((current_price - previous_price) / previous_price * 100) if previous_price else 0

    return {
        "symbol": symbol,
        "current_price": current_price,
        "change_24h": round(change_24h, 2),
        "high_24h": float(stats_24h['high_24h']) if stats_24h['high_24h'] else current_price,
        "low_24h": float(stats_24h['low_24h']) if stats_24h['low_24h'] else current_price,
        "volume_24h": float(stats_24h['volume_24h']) if stats_24h['volume_24h'] else 0,
        "last_update": latest['time'].isoformat()
    }
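The 24-hour change computed above reduces to a small pure function; the zero/missing guard matters because a fresh database has no reference price a day back (a sketch; `change_24h` here is a standalone name, not the server's local variable):

```python
def change_24h(current_price: float, previous_price: float) -> float:
    """Percentage change vs. a reference price, guarding against 0/None."""
    if not previous_price:
        return 0.0
    return round((current_price - previous_price) / previous_price * 100, 2)
```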
@app.get("/api/v1/health")
async def health_check():
    """System health check"""
    try:
        async with pool.acquire() as conn:
            latest = await conn.fetchrow("""
                SELECT symbol, MAX(time) as last_time, COUNT(*) as count
                FROM candles
                WHERE time > NOW() - INTERVAL '24 hours'
                GROUP BY symbol
            """)

        return {
            "status": "healthy",
            "database": "connected",
            "latest_candles": dict(latest) if latest else None,
            "timestamp": datetime.now(timezone.utc).isoformat()
        }
    except Exception as e:
        logger.error(f"Health check failed: {e}")
        raise HTTPException(status_code=503, detail=f"Health check failed: {str(e)}")

@app.get("/api/v1/indicators")
async def get_indicators(
    symbol: str = Query("BTC", description="Trading pair symbol"),
    interval: str = Query("1d", description="Candle interval"),
    name: str = Query(None, description="Filter by indicator name (e.g., ma44)"),
    start: Optional[datetime] = Query(None, description="Start time"),
    end: Optional[datetime] = Query(None, description="End time"),
    limit: int = Query(1000, le=5000)
):
    """Get indicator values"""
    async with pool.acquire() as conn:
        query = """
            SELECT time, indicator_name, value
            FROM indicators
            WHERE symbol = $1 AND interval = $2
        """
        params = [symbol, interval]

        if name:
            query += f" AND indicator_name = ${len(params) + 1}"
            params.append(name)

        if start:
            query += f" AND time >= ${len(params) + 1}"
            params.append(start)

        if end:
            query += f" AND time <= ${len(params) + 1}"
            params.append(end)

        query += f" ORDER BY time DESC LIMIT ${len(params) + 1}"
        params.append(limit)

        rows = await conn.fetch(query, *params)

    # Group by time for easier charting
    grouped = {}
    for row in rows:
        ts = row['time'].isoformat()
        if ts not in grouped:
            grouped[ts] = {'time': ts}
        grouped[ts][row['indicator_name']] = float(row['value'])

    return {
        "symbol": symbol,
        "interval": interval,
        "data": list(grouped.values())
    }

@app.get("/api/v1/decisions")
async def get_decisions(
    symbol: str = Query("BTC"),
    interval: Optional[str] = Query(None),
    backtest_id: Optional[str] = Query(None),
    limit: int = Query(100, le=1000)
):
    """Get brain decisions"""
    async with pool.acquire() as conn:
        query = """
            SELECT time, interval, decision_type, strategy, confidence,
                   price_at_decision, indicator_snapshot, reasoning, backtest_id
            FROM decisions
            WHERE symbol = $1
        """
        params = [symbol]

        if interval:
            query += f" AND interval = ${len(params) + 1}"
            params.append(interval)

        if backtest_id:
            query += f" AND backtest_id = ${len(params) + 1}"
            params.append(backtest_id)
        else:
            query += " AND backtest_id IS NULL"

        query += f" ORDER BY time DESC LIMIT ${len(params) + 1}"
        params.append(limit)

        rows = await conn.fetch(query, *params)
        return [dict(row) for row in rows]

@app.get("/api/v1/backtests")
async def list_backtests(symbol: Optional[str] = None, limit: int = 20):
    """List historical backtests"""
    async with pool.acquire() as conn:
        query = """
            SELECT id, strategy, symbol, start_time, end_time,
                   intervals, results, created_at
            FROM backtest_runs
        """
        params = []
        if symbol:
            query += " WHERE symbol = $1"
            params.append(symbol)

        query += f" ORDER BY created_at DESC LIMIT ${len(params) + 1}"
        params.append(limit)

        rows = await conn.fetch(query, *params)
        return [dict(row) for row in rows]

class BacktestRequest(BaseModel):
    symbol: str = "BTC"
    intervals: list[str] = ["37m"]
    start_date: str = "2025-01-01"  # ISO date
    end_date: Optional[str] = None


async def run_backtest_task(req: BacktestRequest):
    """Background task to run a backtest"""
    db = DatabaseManager(
        host=DB_HOST, port=DB_PORT, database=DB_NAME,
        user=DB_USER, password=DB_PASSWORD
    )
    await db.connect()

    try:
        # Load configs (hardcoded for now to match main.py)
        configs = [
            IndicatorConfig("ma44", "sma", 44, req.intervals),
            IndicatorConfig("ma125", "sma", 125, req.intervals)
        ]

        engine = IndicatorEngine(db, configs)
        brain = Brain(db, engine)
        backtester = Backtester(db, engine, brain)

        start = datetime.fromisoformat(req.start_date).replace(tzinfo=timezone.utc)
        end = datetime.fromisoformat(req.end_date).replace(tzinfo=timezone.utc) if req.end_date else datetime.now(timezone.utc)

        await backtester.run(req.symbol, req.intervals, start, end)

    except Exception as e:
        logger.error(f"Backtest failed: {e}")
    finally:
        await db.disconnect()


@app.post("/api/v1/backtests")
async def trigger_backtest(req: BacktestRequest, background_tasks: BackgroundTasks):
    """Start a backtest in the background"""
    background_tasks.add_task(run_backtest_task, req)
    return {"message": "Backtest started", "params": req.dict()}
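`run_backtest_task` parses the request's ISO date strings into UTC-aware datetimes before handing them to the backtester. The conversion in isolation (a sketch; `to_utc` is a name introduced here):

```python
from datetime import datetime, timezone


def to_utc(date_str: str) -> datetime:
    """Parse an ISO date string and attach UTC, as run_backtest_task does."""
    return datetime.fromisoformat(date_str).replace(tzinfo=timezone.utc)


start = to_utc("2025-01-01")
```

Note that `fromisoformat` yields a naive datetime for a bare date, so the explicit `replace(tzinfo=...)` is what makes the comparison against timestamptz columns well-defined.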

@app.get("/api/v1/ta")
async def get_technical_analysis(
    symbol: str = Query("BTC", description="Trading pair symbol"),
    interval: str = Query("1d", description="Candle interval")
):
    """
    Get technical analysis for a symbol.
    Uses stored indicators from the DB if available, falls back to on-the-fly calc.
    """
    try:
        async with pool.acquire() as conn:
            # 1. Get the latest price
            latest = await conn.fetchrow("""
                SELECT close, time
                FROM candles
                WHERE symbol = $1 AND interval = $2
                ORDER BY time DESC
                LIMIT 1
            """, symbol, interval)

            if not latest:
                return {"error": "No candle data found"}

            current_price = float(latest['close'])
            timestamp = latest['time']

            # 2. Get the latest indicators from the DB
            indicators = await conn.fetch("""
                SELECT indicator_name, value
                FROM indicators
                WHERE symbol = $1 AND interval = $2
                  AND time <= $3
                ORDER BY time DESC
            """, symbol, interval, timestamp)

            # Convert the list to a dict, e.g. {'ma44': 65000, 'ma125': 64000},
            # keeping the most recent value for each indicator
            ind_map = {}
            for row in indicators:
                name = row['indicator_name']
                if name not in ind_map:
                    ind_map[name] = float(row['value'])

            ma_44 = ind_map.get('ma44')
            ma_125 = ind_map.get('ma125')

            # Determine trend
            if ma_44 and ma_125:
                if current_price > ma_44 > ma_125:
                    trend = "Bullish"
                    trend_strength = "Strong" if current_price > ma_44 * 1.05 else "Moderate"
                elif current_price < ma_44 < ma_125:
                    trend = "Bearish"
                    trend_strength = "Strong" if current_price < ma_44 * 0.95 else "Moderate"
                else:
                    trend = "Neutral"
                    trend_strength = "Consolidation"
            else:
                trend = "Unknown"
                trend_strength = "Insufficient data"

            # 3. Find support/resistance (simple recent high/low)
            rows = await conn.fetch("""
                SELECT high, low
                FROM candles
                WHERE symbol = $1 AND interval = $2
                ORDER BY time DESC
                LIMIT 20
            """, symbol, interval)

            if rows:
                highs = [float(r['high']) for r in rows]
                lows = [float(r['low']) for r in rows]
                resistance = max(highs)
                support = min(lows)

                price_range = resistance - support
                if price_range > 0:
                    position = (current_price - support) / price_range * 100
                else:
                    position = 50
            else:
                resistance = current_price
                support = current_price
                position = 50

            return {
                "symbol": symbol,
                "interval": interval,
                "timestamp": timestamp.isoformat(),
                "current_price": round(current_price, 2),
                "moving_averages": {
                    "ma_44": round(ma_44, 2) if ma_44 else None,
                    "ma_125": round(ma_125, 2) if ma_125 else None,
                    "price_vs_ma44": round((current_price / ma_44 - 1) * 100, 2) if ma_44 else None,
                    "price_vs_ma125": round((current_price / ma_125 - 1) * 100, 2) if ma_125 else None
                },
                "trend": {
                    "direction": trend,
                    "strength": trend_strength,
                    "signal": "Buy" if trend == "Bullish" and trend_strength == "Strong" else
                              "Sell" if trend == "Bearish" and trend_strength == "Strong" else "Hold"
                },
                "levels": {
                    "resistance": round(resistance, 2),
                    "support": round(support, 2),
                    "position_in_range": round(position, 1)
                },
                "ai_placeholder": {
                    "available": False,
                    "message": "AI analysis available via Gemini or local LLM",
                    "action": "Click to analyze with AI"
                }
            }

    except Exception as e:
        logger.error(f"Technical analysis error: {e}")
        raise HTTPException(status_code=500, detail=f"Technical analysis failed: {str(e)}")
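The MA-ordering rule above (price vs. fast MA vs. slow MA, with ±5% bands for "Strong") can be isolated as a pure function, which makes the thresholds easy to unit-test. A sketch; `classify_trend` is a name introduced here, not part of the server:

```python
def classify_trend(price, ma_fast, ma_slow):
    """Return (direction, strength) using the endpoint's MA-ordering rule."""
    if ma_fast is None or ma_slow is None:
        return ("Unknown", "Insufficient data")
    if price > ma_fast > ma_slow:
        # More than 5% above the fast MA counts as a strong uptrend
        return ("Bullish", "Strong" if price > ma_fast * 1.05 else "Moderate")
    if price < ma_fast < ma_slow:
        # More than 5% below the fast MA counts as a strong downtrend
        return ("Bearish", "Strong" if price < ma_fast * 0.95 else "Moderate")
    return ("Neutral", "Consolidation")
```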

@app.get("/api/v1/export/csv")
async def export_csv(
    symbol: str = "BTC",
    interval: str = "1m",
    days: int = Query(7, ge=1, le=365, description="Number of days to export")
):
    """Export candle data to CSV"""
    start_date = datetime.now(timezone.utc) - timedelta(days=days)

    async with pool.acquire() as conn:
        query = """
            SELECT time, open, high, low, close, volume
            FROM candles
            WHERE symbol = $1 AND interval = $2 AND time >= $3
            ORDER BY time
        """
        rows = await conn.fetch(query, symbol, interval, start_date)

    if not rows:
        raise HTTPException(status_code=404, detail="No data found for export")

    output = io.StringIO()
    writer = csv.writer(output)
    writer.writerow(['timestamp', 'open', 'high', 'low', 'close', 'volume'])

    for row in rows:
        writer.writerow([
            row['time'].isoformat(),
            row['open'],
            row['high'],
            row['low'],
            row['close'],
            row['volume']
        ])

    output.seek(0)

    return StreamingResponse(
        io.BytesIO(output.getvalue().encode()),
        media_type="text/csv",
        headers={
            "Content-Disposition": f"attachment; filename={symbol}_{interval}_{days}d.csv"
        }
    )


# Serve static files for the dashboard
app.mount("/dashboard", StaticFiles(directory="src/api/dashboard/static", html=True), name="dashboard")


if __name__ == "__main__":
    import uvicorn
    uvicorn.run(app, host="0.0.0.0", port=8000)
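A client can consume the stream from `/api/v1/export/csv` with the standard library; the body below mimics the endpoint's header row and column order with made-up values, not real data:

```python
import csv
import io

# Illustrative response body in the endpoint's column order
body = ("timestamp,open,high,low,close,volume\n"
        "2025-01-01T00:00:00+00:00,95000,95100,94900,95050,12.5\n")

rows = list(csv.DictReader(io.StringIO(body)))
print(rows[0]["close"])  # prints 95050
```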
23
src/data_collector/__init__.py
Normal file
@ -0,0 +1,23 @@
# Data collector module
from .websocket_client import HyperliquidWebSocket, Candle
from .candle_buffer import CandleBuffer
from .database import DatabaseManager
from .backfill import HyperliquidBackfill
from .custom_timeframe_generator import CustomTimeframeGenerator
from .indicator_engine import IndicatorEngine, IndicatorConfig
from .brain import Brain, Decision
from .backtester import Backtester

__all__ = [
    'HyperliquidWebSocket',
    'Candle',
    'CandleBuffer',
    'DatabaseManager',
    'HyperliquidBackfill',
    'CustomTimeframeGenerator',
    'IndicatorEngine',
    'IndicatorConfig',
    'Brain',
    'Decision',
    'Backtester'
]
368
src/data_collector/backfill.py
Normal file
@ -0,0 +1,368 @@
"""
Hyperliquid Historical Data Backfill Module
Downloads candle data from the Hyperliquid REST API with pagination support
"""

import asyncio
import logging
from datetime import datetime, timezone, timedelta
from typing import List, Dict, Any, Optional
import aiohttp

from .database import DatabaseManager
from .websocket_client import Candle


logger = logging.getLogger(__name__)


class HyperliquidBackfill:
    """
    Backfills historical candle data from the Hyperliquid REST API

    API limitations:
    - Max 5000 candles per coin/interval combination
    - 500 candles per response (requires pagination)
    - Available intervals: 1m, 3m, 5m, 15m, 30m, 1h, 2h, 4h, 8h, 12h, 1d, 3d, 1w, 1M
    """

    API_URL = "https://api.hyperliquid.xyz/info"
    MAX_CANDLES_PER_REQUEST = 500
    # The Hyperliquid API may limit total history, but we set a high cap
    # and stop when no more data is returned
    MAX_TOTAL_CANDLES = 500000

    # Standard timeframes supported by Hyperliquid
    INTERVALS = [
        "1m", "3m", "5m", "15m", "30m",
        "1h", "2h", "4h", "8h", "12h",
        "1d", "3d", "1w", "1M"
    ]
    def __init__(
        self,
        db: DatabaseManager,
        coin: str = "BTC",
        intervals: Optional[List[str]] = None
    ):
        self.db = db
        self.coin = coin
        self.intervals = intervals or ["1m"]  # Default to 1m
        self.session: Optional[aiohttp.ClientSession] = None

    async def __aenter__(self):
        """Async context manager entry"""
        self.session = aiohttp.ClientSession()
        return self

    async def __aexit__(self, exc_type, exc_val, exc_tb):
        """Async context manager exit"""
        if self.session:
            await self.session.close()

    async def fetch_candles(
        self,
        interval: str,
        start_time: datetime,
        end_time: Optional[datetime] = None
    ) -> List[Candle]:
        """
        Fetch candles for a specific interval with pagination

        Args:
            interval: Candle interval (e.g., "1m", "1h", "1d")
            start_time: Start time (inclusive)
            end_time: End time (inclusive, defaults to now)

        Returns:
            List of Candle objects
        """
        if interval not in self.INTERVALS:
            raise ValueError(f"Invalid interval: {interval}. Must be one of {self.INTERVALS}")

        end_time = end_time or datetime.now(timezone.utc)

        # Convert to milliseconds
        start_ms = int(start_time.timestamp() * 1000)
        end_ms = int(end_time.timestamp() * 1000)

        all_candles = []
        total_fetched = 0

        while total_fetched < self.MAX_TOTAL_CANDLES:
            logger.info(f"Fetching {interval} candles from {datetime.fromtimestamp(start_ms/1000, tz=timezone.utc)} "
                        f"(batch {total_fetched//self.MAX_CANDLES_PER_REQUEST + 1})")

            try:
                batch = await self._fetch_batch(interval, start_ms, end_ms)

                if not batch:
                    logger.info(f"No more {interval} candles available")
                    break

                all_candles.extend(batch)
                total_fetched += len(batch)

                logger.info(f"Fetched {len(batch)} {interval} candles (total: {total_fetched})")

                # A batch smaller than the maximum means we are done
                if len(batch) < self.MAX_CANDLES_PER_REQUEST:
                    break

                # Update the start time for the next batch (last candle's time + 1 ms)
                last_candle = batch[-1]
                start_ms = int(last_candle.time.timestamp() * 1000) + 1

                # Small delay to avoid rate limiting
                await asyncio.sleep(0.1)

            except Exception as e:
                logger.error(f"Error fetching {interval} candles: {e}")
                break

        logger.info(f"Backfill complete for {interval}: {len(all_candles)} candles total")
        return all_candles
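The pagination loop in `fetch_candles` rests on two small decisions: advance the window to 1 ms past the last candle received, and stop when a batch comes back shorter than the 500-candle maximum. Both as pure functions (a sketch; the names are introduced here for testing, not part of the class):

```python
from datetime import datetime, timezone

MAX_CANDLES_PER_REQUEST = 500  # mirrors the class constant


def next_start_ms(last_candle_time: datetime) -> int:
    """Advance the request window to 1 ms past the last candle."""
    return int(last_candle_time.timestamp() * 1000) + 1


def is_last_batch(batch_len: int) -> bool:
    """A short batch means the API has no more candles in the window."""
    return batch_len < MAX_CANDLES_PER_REQUEST
```

The +1 ms step is what prevents the last candle of one batch from being refetched as the first candle of the next.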
    async def _fetch_batch(
        self,
        interval: str,
        start_ms: int,
        end_ms: int
    ) -> List[Candle]:
        """Fetch a single batch of candles from the API"""
        if not self.session:
            raise RuntimeError("Session not initialized. Use async context manager.")

        payload = {
            "type": "candleSnapshot",
            "req": {
                "coin": self.coin,
                "interval": interval,
                "startTime": start_ms,
                "endTime": end_ms
            }
        }

        async with self.session.post(self.API_URL, json=payload) as response:
            if response.status != 200:
                text = await response.text()
                raise Exception(f"API error {response.status}: {text}")

            data = await response.json()

            if not isinstance(data, list):
                logger.warning(f"Unexpected response format: {data}")
                return []

            candles = []
            for item in data:
                try:
                    candle = self._parse_candle_item(item, interval)
                    if candle:
                        candles.append(candle)
                except Exception as e:
                    logger.warning(f"Failed to parse candle: {item}, error: {e}")

            return candles

    def _parse_candle_item(self, data: Dict[str, Any], interval: str) -> Optional[Candle]:
        """Parse a single candle item from the API response"""
        try:
            # Format: {"t": 1770812400000, "T": ..., "s": "BTC", "i": "1m", "o": "67164.0", ...}
            timestamp_ms = int(data.get("t", 0))
            timestamp = datetime.fromtimestamp(timestamp_ms / 1000, tz=timezone.utc)

            return Candle(
                time=timestamp,
                symbol=self.coin,
                interval=interval,
                open=float(data.get("o", 0)),
                high=float(data.get("h", 0)),
                low=float(data.get("l", 0)),
                close=float(data.get("c", 0)),
                volume=float(data.get("v", 0))
            )
        except (KeyError, ValueError, TypeError) as e:
            logger.error(f"Failed to parse candle data: {e}, data: {data}")
            return None
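The parsing step converts the API's millisecond `t` timestamp and string-valued OHLCV fields into native types. A minimal sketch of that conversion; the item below follows the field names in the `_parse_candle_item` comment, with made-up values:

```python
from datetime import datetime, timezone

# Illustrative candleSnapshot item (values are made up)
item = {"t": 1700000000000, "s": "BTC", "i": "1m",
        "o": "67164.0", "h": "67200.0", "l": "67100.0", "c": "67180.0", "v": "12.5"}

# Milliseconds -> UTC-aware datetime, strings -> floats
time = datetime.fromtimestamp(int(item["t"]) / 1000, tz=timezone.utc)
ohlcv = tuple(float(item[k]) for k in ("o", "h", "l", "c", "v"))
```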
    async def backfill_interval(
        self,
        interval: str,
        days_back: int = 7
    ) -> int:
        """
        Backfill a specific interval for the last N days

        Args:
            interval: Candle interval
            days_back: Number of days to backfill (use 0 for max available)

        Returns:
            Number of candles inserted
        """
        if days_back == 0:
            # Fetch the maximum available data (5000 candles)
            return await self.backfill_max(interval)

        end_time = datetime.now(timezone.utc)
        start_time = end_time - timedelta(days=days_back)

        logger.info(f"Starting backfill for {interval}: {start_time} to {end_time}")

        candles = await self.fetch_candles(interval, start_time, end_time)

        if not candles:
            logger.warning(f"No candles fetched for {interval}")
            return 0

        # Insert into the database
        inserted = await self.db.insert_candles(candles)
        logger.info(f"Inserted {inserted} candles for {interval}")

        return inserted

    async def backfill_max(self, interval: str) -> int:
        """
        Backfill the maximum available data (5000 candles) for an interval

        Args:
            interval: Candle interval

        Returns:
            Number of candles inserted
        """
        logger.info(f"Fetching maximum available {interval} data (up to 5000 candles)")

        # For weekly and monthly intervals, start from 2020 to ensure we get all
        # available data; Hyperliquid launched around 2023, so this captures everything
        start_time = datetime(2020, 1, 1, tzinfo=timezone.utc)
        end_time = datetime.now(timezone.utc)

        logger.info(f"Fetching {interval} candles from {start_time} to {end_time}")

        candles = await self.fetch_candles(interval, start_time, end_time)

        if not candles:
            logger.warning(f"No candles fetched for {interval}")
            return 0

        # Insert into the database
        inserted = await self.db.insert_candles(candles)
        logger.info(f"Inserted {inserted} candles for {interval} (max available)")

        return inserted

    def _interval_to_minutes(self, interval: str) -> int:
        """Convert an interval string to minutes"""
        mapping = {
            "1m": 1, "3m": 3, "5m": 5, "15m": 15, "30m": 30,
            "1h": 60, "2h": 120, "4h": 240, "8h": 480, "12h": 720,
            "1d": 1440, "3d": 4320, "1w": 10080, "1M": 43200
        }
        return mapping.get(interval, 1)

    async def backfill_all_intervals(
        self,
        days_back: int = 7
    ) -> Dict[str, int]:
        """
        Backfill all configured intervals

        Args:
            days_back: Number of days to backfill

        Returns:
            Dictionary mapping interval to count inserted
        """
        results = {}

        for interval in self.intervals:
            try:
                count = await self.backfill_interval(interval, days_back)
                results[interval] = count
            except Exception as e:
                logger.error(f"Failed to backfill {interval}: {e}")
                results[interval] = 0

        return results

    async def get_earliest_candle_time(self, interval: str) -> Optional[datetime]:
        """Get the earliest candle time available for an interval"""
        # Fetch from a date well before launch to find the earliest available
        start_time = datetime(2020, 1, 1, tzinfo=timezone.utc)
        end_time = datetime.now(timezone.utc)

        candles = await self.fetch_candles(interval, start_time, end_time)

        if candles:
            earliest = min(c.time for c in candles)
            logger.info(f"Earliest {interval} candle available: {earliest}")
            return earliest
        return None
async def main():
|
||||
"""CLI entry point for backfill"""
|
||||
import argparse
|
||||
import os
|
||||
|
||||
parser = argparse.ArgumentParser(description="Backfill Hyperliquid historical data")
|
||||
parser.add_argument("--coin", default="BTC", help="Coin symbol (default: BTC)")
|
||||
parser.add_argument("--intervals", nargs="+", default=["1m"],
|
||||
help="Intervals to backfill (default: 1m)")
|
||||
parser.add_argument("--days", type=str, default="7",
|
||||
help="Days to backfill (default: 7, use 'max' for maximum available)")
|
||||
parser.add_argument("--db-host", default=os.getenv("DB_HOST", "localhost"),
|
||||
help="Database host (default: localhost or DB_HOST env)")
|
||||
parser.add_argument("--db-port", type=int, default=int(os.getenv("DB_PORT", 5432)),
|
||||
help="Database port (default: 5432 or DB_PORT env)")
|
||||
parser.add_argument("--db-name", default=os.getenv("DB_NAME", "btc_data"),
|
||||
help="Database name (default: btc_data or DB_NAME env)")
|
||||
parser.add_argument("--db-user", default=os.getenv("DB_USER", "btc_bot"),
|
||||
help="Database user (default: btc_bot or DB_USER env)")
|
||||
parser.add_argument("--db-password", default=os.getenv("DB_PASSWORD", ""),
|
||||
help="Database password (default: from DB_PASSWORD env)")
|
||||
|
||||
args = parser.parse_args()
|
||||
|
||||
# Parse days argument
|
||||
if args.days.lower() == "max":
|
||||
days_back = 0 # 0 means max available
|
||||
logger.info("Backfill mode: MAX (fetching up to 5000 candles per interval)")
|
||||
else:
|
||||
days_back = int(args.days)
|
||||
logger.info(f"Backfill mode: Last {days_back} days")
|
||||
|
||||
# Setup logging
|
||||
logging.basicConfig(
|
||||
level=logging.INFO,
|
||||
format='%(asctime)s - %(name)s - %(levelname)s - %(message)s'
|
||||
)
|
||||
|
||||
# Initialize database
|
||||
db = DatabaseManager(
|
||||
host=args.db_host,
|
||||
port=args.db_port,
|
||||
database=args.db_name,
|
||||
user=args.db_user,
|
||||
password=args.db_password
|
||||
)
|
||||
|
||||
await db.connect()
|
||||
|
||||
try:
|
||||
async with HyperliquidBackfill(db, args.coin, args.intervals) as backfill:
|
||||
results = await backfill.backfill_all_intervals(days_back)
|
||||
|
||||
print("\n=== Backfill Summary ===")
|
||||
for interval, count in results.items():
|
||||
print(f"{interval}: {count} candles")
|
||||
print(f"Total: {sum(results.values())} candles")
|
||||
|
||||
finally:
|
||||
await db.disconnect()
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
asyncio.run(main())
|
||||
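The `_interval_to_minutes` table above is easy to sanity-check in isolation. The sketch below reproduces the same mapping outside the class (the fallback of 1 minute for unknown intervals matches the method above); `max_backfill_span_days` is a hypothetical helper, added here only to illustrate how far the 5000-candle API limit reaches at each interval:

```python
# Standalone copy of the interval -> minutes table used by the backfill class.
INTERVAL_MINUTES = {
    "1m": 1, "3m": 3, "5m": 5, "15m": 15, "30m": 30,
    "1h": 60, "2h": 120, "4h": 240, "8h": 480, "12h": 720,
    "1d": 1440, "3d": 4320, "1w": 10080, "1M": 43200,
}


def interval_to_minutes(interval: str) -> int:
    """Convert an interval string to minutes, defaulting to 1 for unknown inputs."""
    return INTERVAL_MINUTES.get(interval, 1)


def max_backfill_span_days(interval: str, max_candles: int = 5000) -> float:
    """How many days `max_candles` candles cover at a given interval."""
    return interval_to_minutes(interval) * max_candles / (60 * 24)
```

At 1m granularity, 5000 candles cover only about 3.5 days, which is why `backfill_max` can safely start from 2020: the API limit, not the start date, bounds the result.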
154
src/data_collector/backfill_gap.py
Normal file
@@ -0,0 +1,154 @@
"""
One-time backfill script to fill gaps in data.
Run with: python -m data_collector.backfill_gap --start "2024-01-01 09:34" --end "2024-01-01 19:39"
"""

import asyncio
import logging
import sys
import os
from datetime import datetime, timezone
from typing import Optional

sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.dirname(os.path.abspath(__file__)))))

from .database import DatabaseManager
from .backfill import HyperliquidBackfill

logging.basicConfig(
    level=logging.INFO,
    format='%(asctime)s - %(name)s - %(levelname)s - %(message)s'
)
logger = logging.getLogger(__name__)

INTERVALS = ["1m", "3m", "5m", "15m", "30m", "1h", "2h", "4h", "8h", "12h", "1d", "3d", "1w"]


async def backfill_gap(
    start_time: datetime,
    end_time: datetime,
    symbol: str = "BTC",
    intervals: Optional[list] = None
) -> dict:
    """
    Backfill a specific time gap for all intervals.

    Args:
        start_time: Gap start time (UTC)
        end_time: Gap end time (UTC)
        symbol: Trading symbol
        intervals: List of intervals to backfill (default: all standard)

    Returns:
        Dictionary with interval -> count mapping
    """
    intervals = intervals or INTERVALS
    results = {}

    db = DatabaseManager()
    await db.connect()

    logger.info(f"Backfilling gap: {start_time} to {end_time} for {symbol}")

    try:
        async with HyperliquidBackfill(db, symbol, intervals) as backfill:
            for interval in intervals:
                try:
                    logger.info(f"Backfilling {interval}...")
                    candles = await backfill.fetch_candles(interval, start_time, end_time)

                    if candles:
                        inserted = await db.insert_candles(candles)
                        results[interval] = inserted
                        logger.info(f"  {interval}: {inserted} candles inserted")
                    else:
                        results[interval] = 0
                        logger.warning(f"  {interval}: No candles returned")

                    await asyncio.sleep(0.3)

                except Exception as e:
                    logger.error(f"  {interval}: Error - {e}")
                    results[interval] = 0

    finally:
        await db.disconnect()

    logger.info(f"Backfill complete. Total: {sum(results.values())} candles")
    return results


async def auto_detect_and_fill_gaps(symbol: str = "BTC") -> dict:
    """
    Detect and fill all gaps in the database for all intervals.
    """
    db = DatabaseManager()
    await db.connect()

    results = {}

    try:
        async with HyperliquidBackfill(db, symbol, INTERVALS) as backfill:
            for interval in INTERVALS:
                try:
                    # Detect gaps
                    gaps = await db.detect_gaps(symbol, interval)

                    if not gaps:
                        logger.info(f"{interval}: No gaps detected")
                        results[interval] = 0
                        continue

                    logger.info(f"{interval}: {len(gaps)} gaps detected")
                    total_filled = 0

                    for gap in gaps:
                        gap_start = datetime.fromisoformat(gap['gap_start'].replace('Z', '+00:00'))
                        gap_end = datetime.fromisoformat(gap['gap_end'].replace('Z', '+00:00'))

                        logger.info(f"  Filling gap: {gap_start} to {gap_end}")

                        candles = await backfill.fetch_candles(interval, gap_start, gap_end)

                        if candles:
                            inserted = await db.insert_candles(candles)
                            total_filled += inserted
                            logger.info(f"    Filled {inserted} candles")

                        await asyncio.sleep(0.2)

                    results[interval] = total_filled

                except Exception as e:
                    logger.error(f"{interval}: Error - {e}")
                    results[interval] = 0

    finally:
        await db.disconnect()

    return results


async def main():
    import argparse

    parser = argparse.ArgumentParser(description="Backfill gaps in BTC data")
    parser.add_argument("--start", help="Start time (YYYY-MM-DD HH:MM)", default=None)
    parser.add_argument("--end", help="End time (YYYY-MM-DD HH:MM)", default=None)
    parser.add_argument("--auto", action="store_true", help="Auto-detect and fill all gaps")
    parser.add_argument("--symbol", default="BTC", help="Symbol to backfill")

    args = parser.parse_args()

    if args.auto:
        await auto_detect_and_fill_gaps(args.symbol)
    elif args.start and args.end:
        start_time = datetime.strptime(args.start, "%Y-%m-%d %H:%M").replace(tzinfo=timezone.utc)
        end_time = datetime.strptime(args.end, "%Y-%m-%d %H:%M").replace(tzinfo=timezone.utc)
        await backfill_gap(start_time, end_time, args.symbol)
    else:
        parser.print_help()


if __name__ == "__main__":
    asyncio.run(main())
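`db.detect_gaps` lives in `DatabaseManager` (outside this commit chunk). A minimal standalone sketch of what such gap detection can look like, assuming a sorted list of candle timestamps and a fixed interval in minutes (the function name `find_gaps` is an assumption, not the project's API):

```python
from datetime import datetime, timedelta
from typing import List, Tuple


def find_gaps(times: List[datetime], interval_minutes: int) -> List[Tuple[datetime, datetime]]:
    """Return (gap_start, gap_end) pairs wherever consecutive candles are
    more than one interval apart. Assumes `times` is sorted ascending."""
    step = timedelta(minutes=interval_minutes)
    gaps = []
    for prev, cur in zip(times, times[1:]):
        if cur - prev > step:
            # The gap spans the missing candle slots strictly between prev and cur.
            gaps.append((prev + step, cur - step))
    return gaps
```

Each detected `(gap_start, gap_end)` pair would then be handed to `fetch_candles` and `insert_candles`, exactly as the loop in `auto_detect_and_fill_gaps` does.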
391
src/data_collector/backtester.py
Normal file
@@ -0,0 +1,391 @@
"""
Backtester - Historical replay driver for IndicatorEngine + Brain
Iterates over stored candle data to simulate live trading decisions
"""

import asyncio
import json
import logging
from datetime import datetime, timezone
from typing import Dict, List, Optional, Any
from uuid import uuid4

from .database import DatabaseManager
from .indicator_engine import IndicatorEngine, IndicatorConfig
from .brain import Brain, Decision
from .simulator import Account
from src.strategies.base import SignalType

logger = logging.getLogger(__name__)


class Backtester:
    """
    Replays historical candle data through IndicatorEngine and Brain.
    Uses Simulator (Account) to track PnL, leverage, and fees.
    """

    def __init__(
        self,
        db: DatabaseManager,
        indicator_engine: IndicatorEngine,
        brain: Brain,
    ):
        self.db = db
        self.indicator_engine = indicator_engine
        self.brain = brain
        self.account = Account(initial_balance=1000.0)

    async def run(
        self,
        symbol: str,
        intervals: List[str],
        start: datetime,
        end: datetime,
        config: Optional[Dict[str, Any]] = None,
    ) -> str:
        """
        Run a full backtest over the given time range.
        """
        backtest_id = str(uuid4())

        logger.info(
            f"Starting backtest {backtest_id}: {symbol} "
            f"{intervals} from {start} to {end}"
        )

        # Reset brain state
        self.brain.reset_state()

        # Reset account for this run
        self.account = Account(initial_balance=1000.0)

        # Store the run metadata
        await self._save_run_start(
            backtest_id, symbol, intervals, start, end, config
        )

        total_decisions = 0

        for interval in intervals:
            # Only process intervals that have indicators configured
            configured = self.indicator_engine.get_configured_intervals()
            if interval not in configured:
                logger.warning(
                    f"Skipping interval {interval}: no indicators configured"
                )
                continue

            # Get all candle timestamps in range
            timestamps = await self._get_candle_timestamps(
                symbol, interval, start, end
            )

            if not timestamps:
                logger.warning(
                    f"No candles found for {symbol}/{interval} in range"
                )
                continue

            logger.info(
                f"Backtest {backtest_id}: processing {len(timestamps)} "
                f"{interval} candles..."
            )

            for i, ts in enumerate(timestamps):
                # 1. Compute indicators
                raw_indicators = await self.indicator_engine.compute_at(
                    symbol, interval, ts
                )
                indicators = {k: v for k, v in raw_indicators.items() if v is not None}

                # 2. Get current position info for the strategy
                current_pos = self.account.get_position_dict()

                # 3. Brain evaluate
                decision: Decision = await self.brain.evaluate(
                    symbol=symbol,
                    interval=interval,
                    timestamp=ts,
                    indicators=indicators,
                    backtest_id=backtest_id,
                    current_position=current_pos
                )

                # 4. Execute decision in simulator
                self._execute_decision(decision)

                total_decisions += 1

                if (i + 1) % 200 == 0:
                    logger.info(
                        f"Backtest {backtest_id}: {i + 1}/{len(timestamps)} "
                        f"{interval} candles processed. Eq: {self.account.equity:.2f}"
                    )
                    await asyncio.sleep(0.01)

        # Compute and store summary results from Simulator
        results = self.account.get_stats()
        results['total_evaluations'] = total_decisions

        await self._save_run_results(backtest_id, results)

        logger.info(
            f"Backtest {backtest_id} complete. Final Balance: {results['final_balance']:.2f}"
        )

        return backtest_id

    def _execute_decision(self, decision: Decision):
        """Translate Brain decision into Account action"""
        price = decision.price_at_decision
        time = decision.time

        # Open long
        if decision.decision_type == SignalType.OPEN_LONG.value:
            self.account.open_position(time, 'long', price, leverage=1.0)  # TODO: configurable leverage

        # Open short
        elif decision.decision_type == SignalType.OPEN_SHORT.value:
            self.account.open_position(time, 'short', price, leverage=1.0)

        # Close long (only if we are long)
        elif decision.decision_type == SignalType.CLOSE_LONG.value:
            if self.account.current_position and self.account.current_position.side == 'long':
                self.account.close_position(time, price)

        # Close short (only if we are short)
        elif decision.decision_type == SignalType.CLOSE_SHORT.value:
            if self.account.current_position and self.account.current_position.side == 'short':
                self.account.close_position(time, price)

        # Update equity mark-to-market
        self.account.update_equity(price)

    async def _get_candle_timestamps(
        self,
        symbol: str,
        interval: str,
        start: datetime,
        end: datetime,
    ) -> List[datetime]:
        """Get all candle timestamps in a range, ordered chronologically"""
        async with self.db.acquire() as conn:
            rows = await conn.fetch("""
                SELECT time FROM candles
                WHERE symbol = $1 AND interval = $2
                  AND time >= $3 AND time <= $4
                ORDER BY time ASC
            """, symbol, interval, start, end)

        return [row["time"] for row in rows]

    async def _save_run_start(
        self,
        backtest_id: str,
        symbol: str,
        intervals: List[str],
        start: datetime,
        end: datetime,
        config: Optional[Dict[str, Any]],
    ) -> None:
        """Store backtest run metadata at start"""
        async with self.db.acquire() as conn:
            await conn.execute("""
                INSERT INTO backtest_runs (
                    id, strategy, symbol, start_time, end_time,
                    intervals, config
                )
                VALUES ($1, $2, $3, $4, $5, $6, $7)
            """,
                backtest_id,
                self.brain.strategy_name,
                symbol,
                start,
                end,
                intervals,
                json.dumps(config) if config else None,
            )

    async def _compute_results(self, backtest_id, symbol):
        """Deprecated: logic moved to Account class"""
        return {}

    async def _save_run_results(
        self,
        backtest_id: str,
        results: Dict[str, Any],
    ) -> None:
        """Update backtest run with final results"""
        # Remove trades list from stored results (can be large)
        stored_results = {k: v for k, v in results.items() if k != "trades"}

        async with self.db.acquire() as conn:
            await conn.execute("""
                UPDATE backtest_runs
                SET results = $1
                WHERE id = $2
            """, json.dumps(stored_results), backtest_id)

    async def get_run(self, backtest_id: str) -> Optional[Dict[str, Any]]:
        """Get a specific backtest run with results"""
        async with self.db.acquire() as conn:
            row = await conn.fetchrow("""
                SELECT id, strategy, symbol, start_time, end_time,
                       intervals, config, results, created_at
                FROM backtest_runs
                WHERE id = $1
            """, backtest_id)

        return dict(row) if row else None

    async def list_runs(
        self,
        symbol: Optional[str] = None,
        limit: int = 20,
    ) -> List[Dict[str, Any]]:
        """List recent backtest runs"""
        async with self.db.acquire() as conn:
            if symbol:
                rows = await conn.fetch("""
                    SELECT id, strategy, symbol, start_time, end_time,
                           intervals, results, created_at
                    FROM backtest_runs
                    WHERE symbol = $1
                    ORDER BY created_at DESC
                    LIMIT $2
                """, symbol, limit)
            else:
                rows = await conn.fetch("""
                    SELECT id, strategy, symbol, start_time, end_time,
                           intervals, results, created_at
                    FROM backtest_runs
                    ORDER BY created_at DESC
                    LIMIT $1
                """, limit)

        return [dict(row) for row in rows]

    async def cleanup_run(self, backtest_id: str) -> int:
        """Delete all decisions and metadata for a backtest run"""
        async with self.db.acquire() as conn:
            result = await conn.execute("""
                DELETE FROM decisions WHERE backtest_id = $1
            """, backtest_id)

            await conn.execute("""
                DELETE FROM backtest_runs WHERE id = $1
            """, backtest_id)

        deleted = int(result.split()[-1]) if result else 0
        logger.info(
            f"Cleaned up backtest {backtest_id}: "
            f"{deleted} decisions deleted"
        )
        return deleted


async def main():
    """CLI entry point for running backtests"""
    import argparse
    import os

    parser = argparse.ArgumentParser(
        description="Run backtest on historical data"
    )
    parser.add_argument(
        "--symbol", default="BTC", help="Symbol (default: BTC)"
    )
    parser.add_argument(
        "--intervals", nargs="+", default=["37m"],
        help="Intervals to backtest (default: 37m)"
    )
    parser.add_argument(
        "--start", required=True,
        help="Start date (ISO format, e.g., 2025-01-01)"
    )
    parser.add_argument(
        "--end", default=None,
        help="End date (ISO format, default: now)"
    )
    parser.add_argument(
        "--db-host", default=os.getenv("DB_HOST", "localhost"),
    )
    parser.add_argument(
        "--db-port", type=int, default=int(os.getenv("DB_PORT", 5432)),
    )
    parser.add_argument(
        "--db-name", default=os.getenv("DB_NAME", "btc_data"),
    )
    parser.add_argument(
        "--db-user", default=os.getenv("DB_USER", "btc_bot"),
    )
    parser.add_argument(
        "--db-password", default=os.getenv("DB_PASSWORD", ""),
    )

    args = parser.parse_args()

    logging.basicConfig(
        level=logging.INFO,
        format="%(asctime)s - %(name)s - %(levelname)s - %(message)s",
    )

    # Parse dates
    start = datetime.fromisoformat(args.start).replace(tzinfo=timezone.utc)
    end = (
        datetime.fromisoformat(args.end).replace(tzinfo=timezone.utc)
        if args.end
        else datetime.now(timezone.utc)
    )

    # Initialize components
    db = DatabaseManager(
        host=args.db_host,
        port=args.db_port,
        database=args.db_name,
        user=args.db_user,
        password=args.db_password,
    )
    await db.connect()

    try:
        # Default indicator configs (MA44 + MA125 on selected intervals)
        configs = [
            IndicatorConfig("ma44", "sma", 44, args.intervals),
            IndicatorConfig("ma125", "sma", 125, args.intervals),
        ]

        indicator_engine = IndicatorEngine(db, configs)
        brain = Brain(db, indicator_engine)
        backtester = Backtester(db, indicator_engine, brain)

        # Run the backtest
        backtest_id = await backtester.run(
            symbol=args.symbol,
            intervals=args.intervals,
            start=start,
            end=end,
        )

        # Print results
        run = await backtester.get_run(backtest_id)
        if run and run.get("results"):
            results = json.loads(run["results"]) if isinstance(run["results"], str) else run["results"]
            print("\n=== Backtest Results ===")
            print(f"ID: {backtest_id}")
            print(f"Strategy: {run['strategy']}")
            print(f"Period: {run['start_time']} to {run['end_time']}")
            print(f"Intervals: {run['intervals']}")
            print(f"Total evaluations: {results.get('total_evaluations', 0)}")
            print(f"Total trades: {results.get('total_trades', 0)}")
            print(f"Win rate: {results.get('win_rate', 0)}%")
            print(f"Total P&L: {results.get('total_pnl_pct', 0)}%")
            print(f"Final Balance: {results.get('final_balance', 0)}")

    finally:
        await db.disconnect()


if __name__ == "__main__":
    asyncio.run(main())
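`Account.get_stats()` is defined in `simulator.py`, which is outside this chunk. As a rough illustration of the summary it feeds into the printout above, here is a hypothetical sketch that compounds per-trade percentage returns into the same kind of keys (`total_trades`, `win_rate`, `total_pnl_pct`, `final_balance`); the function name and exact field semantics are assumptions, not the project's actual implementation:

```python
from typing import Dict, List


def summarize_trades(pnl_pcts: List[float], initial_balance: float = 1000.0) -> Dict[str, float]:
    """Compound per-trade percentage returns into summary statistics
    (hypothetical stand-in for Account.get_stats())."""
    balance = initial_balance
    for pnl in pnl_pcts:
        balance *= 1 + pnl / 100.0  # each trade compounds on the running balance
    wins = sum(1 for p in pnl_pcts if p > 0)
    total = len(pnl_pcts)
    return {
        "total_trades": total,
        "win_rate": round(100.0 * wins / total, 2) if total else 0.0,
        "total_pnl_pct": round(100.0 * (balance / initial_balance - 1), 2),
        "final_balance": round(balance, 2),
    }
```

Note that compounding makes order-independent percentages multiply rather than add: a +10% trade followed by a -5% trade nets +4.5%, not +5%.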
226
src/data_collector/brain.py
Normal file
@@ -0,0 +1,226 @@
"""
Brain - Strategy evaluation and decision logging
Pure strategy logic separated from DB I/O for testability
"""

import json
import logging
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Dict, Optional, Any, List, Callable

from .database import DatabaseManager
from .indicator_engine import IndicatorEngine
from src.strategies.base import BaseStrategy, StrategySignal, SignalType
from src.strategies.ma_strategy import MAStrategy

logger = logging.getLogger(__name__)


def _create_ma44() -> BaseStrategy:
    return MAStrategy(config={"period": 44})


def _create_ma125() -> BaseStrategy:
    return MAStrategy(config={"period": 125})


STRATEGY_REGISTRY: Dict[str, Callable[[], BaseStrategy]] = {
    "ma_trend": MAStrategy,
    "ma44_strategy": _create_ma44,
    "ma125_strategy": _create_ma125,
}


def load_strategy(strategy_name: str) -> BaseStrategy:
    """Load a strategy instance from the registry"""
    if strategy_name not in STRATEGY_REGISTRY:
        logger.warning(f"Strategy {strategy_name} not found, defaulting to ma_trend")
        strategy_name = "ma_trend"

    factory = STRATEGY_REGISTRY[strategy_name]
    return factory()


@dataclass
class Decision:
    """A single brain evaluation result"""
    time: datetime
    symbol: str
    interval: str
    decision_type: str  # "buy", "sell", "hold" -> now maps to SignalType
    strategy: str
    confidence: float
    price_at_decision: float
    indicator_snapshot: Dict[str, Any]
    candle_snapshot: Dict[str, Any]
    reasoning: str
    backtest_id: Optional[str] = None

    def to_db_tuple(self) -> tuple:
        """Convert to positional tuple for DB insert"""
        return (
            self.time,
            self.symbol,
            self.interval,
            self.decision_type,
            self.strategy,
            self.confidence,
            self.price_at_decision,
            json.dumps(self.indicator_snapshot),
            json.dumps(self.candle_snapshot),
            self.reasoning,
            self.backtest_id,
        )


class Brain:
    """
    Evaluates market conditions using a loaded Strategy.
    """

    def __init__(
        self,
        db: DatabaseManager,
        indicator_engine: IndicatorEngine,
        strategy: str = "ma44_strategy",
    ):
        self.db = db
        self.indicator_engine = indicator_engine
        self.strategy_name = strategy
        self.active_strategy: BaseStrategy = load_strategy(strategy)

        logger.info(f"Brain initialized with strategy: {self.active_strategy.name}")

    async def evaluate(
        self,
        symbol: str,
        interval: str,
        timestamp: datetime,
        indicators: Optional[Dict[str, float]] = None,
        backtest_id: Optional[str] = None,
        current_position: Optional[Dict[str, Any]] = None,
    ) -> Decision:
        """
        Evaluate market conditions and produce a decision.
        """
        # Get indicator values
        if indicators is None:
            indicators = await self.indicator_engine.get_values_at(
                symbol, interval, timestamp
            )

        # Get the triggering candle
        candle = await self._get_candle(symbol, interval, timestamp)
        if not candle:
            return self._create_empty_decision(timestamp, symbol, interval, indicators, backtest_id)

        price = float(candle["close"])
        candle_dict = {
            "time": candle["time"].isoformat(),
            "open": float(candle["open"]),
            "high": float(candle["high"]),
            "low": float(candle["low"]),
            "close": price,
            "volume": float(candle["volume"]),
        }

        # Delegate to the strategy
        signal: StrategySignal = self.active_strategy.analyze(
            candle_dict, indicators, current_position
        )

        # Build decision
        decision = Decision(
            time=timestamp,
            symbol=symbol,
            interval=interval,
            decision_type=signal.type.value,
            strategy=self.strategy_name,
            confidence=signal.confidence,
            price_at_decision=price,
            indicator_snapshot=indicators,
            candle_snapshot=candle_dict,
            reasoning=signal.reasoning,
            backtest_id=backtest_id,
        )

        # Store to DB
        await self._store_decision(decision)

        return decision

    def _create_empty_decision(self, timestamp, symbol, interval, indicators, backtest_id):
        return Decision(
            time=timestamp,
            symbol=symbol,
            interval=interval,
            decision_type="hold",
            strategy=self.strategy_name,
            confidence=0.0,
            price_at_decision=0.0,
            indicator_snapshot=indicators or {},
            candle_snapshot={},
            reasoning="No candle data available",
            backtest_id=backtest_id,
        )

    async def _get_candle(
        self,
        symbol: str,
        interval: str,
        timestamp: datetime,
    ) -> Optional[Dict[str, Any]]:
        """Fetch a specific candle from the database"""
        async with self.db.acquire() as conn:
            row = await conn.fetchrow("""
                SELECT time, open, high, low, close, volume
                FROM candles
                WHERE symbol = $1 AND interval = $2 AND time = $3
            """, symbol, interval, timestamp)

        return dict(row) if row else None

    async def _store_decision(self, decision: Decision) -> None:
        """Write decision to the decisions table"""
        # Note: we might want to skip writing every single HOLD to DB to save
        # space when simulating millions of candles, but keeping it for now
        # for full traceability.
        async with self.db.acquire() as conn:
            await conn.execute("""
                INSERT INTO decisions (
                    time, symbol, interval, decision_type, strategy,
                    confidence, price_at_decision, indicator_snapshot,
                    candle_snapshot, reasoning, backtest_id
                )
                VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9, $10, $11)
            """, *decision.to_db_tuple())

    async def get_recent_decisions(
        self,
        symbol: str,
        limit: int = 20,
        backtest_id: Optional[str] = None,
    ) -> List[Dict[str, Any]]:
        """Get recent decisions, optionally filtered by backtest_id"""
        async with self.db.acquire() as conn:
            if backtest_id is not None:
                rows = await conn.fetch("""
                    SELECT time, symbol, interval, decision_type, strategy,
                           confidence, price_at_decision, indicator_snapshot,
                           candle_snapshot, reasoning, backtest_id
                    FROM decisions
                    WHERE symbol = $1 AND backtest_id = $2
                    ORDER BY time DESC
                    LIMIT $3
                """, symbol, backtest_id, limit)
            else:
                rows = await conn.fetch("""
                    SELECT time, symbol, interval, decision_type, strategy,
                           confidence, price_at_decision, indicator_snapshot,
                           candle_snapshot, reasoning, backtest_id
                    FROM decisions
                    WHERE symbol = $1 AND backtest_id IS NULL
                    ORDER BY time DESC
                    LIMIT $2
                """, symbol, limit)

        return [dict(row) for row in rows]

    def reset_state(self) -> None:
        """Reset internal state tracking"""
        pass
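`MAStrategy` itself lives in `src/strategies/ma_strategy.py`, outside this chunk. The registry names (`ma44_strategy`, `ma125_strategy`) suggest a single-moving-average trend rule; the sketch below illustrates that kind of rule in isolation. The helper names `sma` and `ma_signal` and the exact entry/exit conditions are assumptions for illustration, not the project's actual strategy:

```python
from typing import List, Optional


def sma(values: List[float], period: int) -> Optional[float]:
    """Simple moving average of the last `period` values, or None if too few."""
    if len(values) < period:
        return None
    return sum(values[-period:]) / period


def ma_signal(closes: List[float], period: int = 44) -> str:
    """Toy trend rule: 'open_long' when price closes above its MA,
    'close_long' when below, 'hold' otherwise or with insufficient data."""
    ma = sma(closes, period)
    if ma is None:
        return "hold"
    if closes[-1] > ma:
        return "open_long"
    if closes[-1] < ma:
        return "close_long"
    return "hold"
```

In the real pipeline the equivalent of `ma_signal` runs inside `BaseStrategy.analyze`, consuming the `candle_dict` and `indicator_snapshot` that `Brain.evaluate` assembles above.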
224
src/data_collector/candle_buffer.py
Normal file
@@ -0,0 +1,224 @@
|
||||
"""
|
||||
In-memory candle buffer with automatic batching
|
||||
Optimized for low memory footprint on DS218+
|
||||
"""
|
||||
|
||||
import asyncio
|
||||
import logging
|
||||
from collections import deque
|
||||
from datetime import datetime, timezone
|
||||
from typing import Dict, List, Optional, Callable, Any, Awaitable
|
||||
from dataclasses import dataclass, field
|
||||
|
||||
from .websocket_client import Candle
|
||||
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
@dataclass
|
||||
class BufferStats:
|
||||
"""Statistics for buffer performance monitoring"""
|
||||
total_added: int = 0
|
||||
total_flushed: int = 0
|
||||
total_dropped: int = 0
|
||||
last_flush_time: Optional[datetime] = None
|
||||
avg_batch_size: float = 0.0
|
||||
|
||||
def to_dict(self) -> Dict[str, Any]:
|
||||
return {
|
||||
'total_added': self.total_added,
|
||||
            'total_flushed': self.total_flushed,
            'total_dropped': self.total_dropped,
            'last_flush_time': self.last_flush_time.isoformat() if self.last_flush_time else None,
            'avg_batch_size': round(self.avg_batch_size, 2)
        }


class CandleBuffer:
    """
    Coroutine-safe circular buffer for candle data.
    Automatically flushes to the database in batches.
    """

    def __init__(
        self,
        max_size: int = 1000,
        flush_interval_seconds: float = 30.0,
        batch_size: int = 100,
        on_flush_callback: Optional[Callable[[List[Candle]], Awaitable[None]]] = None
    ):
        self.max_size = max_size
        self.flush_interval = flush_interval_seconds
        self.batch_size = batch_size
        self.on_flush = on_flush_callback

        # Bounded buffer; a deque with maxlen drops the oldest entry when full
        self._buffer: deque = deque(maxlen=max_size)
        self._lock = asyncio.Lock()
        self._flush_event = asyncio.Event()
        self._stop_event = asyncio.Event()

        self.stats = BufferStats()
        self._batch_sizes: deque = deque(maxlen=100)  # For averaging

        # Tasks
        self._flush_task: Optional[asyncio.Task] = None

    async def start(self) -> None:
        """Start the background flush task"""
        self._flush_task = asyncio.create_task(self._flush_loop())
        logger.info(f"CandleBuffer started (max_size={self.max_size}, flush_interval={self.flush_interval}s)")

    async def stop(self) -> None:
        """Stop the buffer and flush remaining data"""
        self._stop_event.set()
        self._flush_event.set()  # Wake up the flush loop

        if self._flush_task:
            try:
                await asyncio.wait_for(self._flush_task, timeout=10.0)
            except asyncio.TimeoutError:
                logger.warning("Flush task did not stop in time")

        # Final flush
        await self.flush()
        logger.info("CandleBuffer stopped")

    async def add(self, candle: Candle) -> bool:
        """
        Add a candle to the buffer.
        Always returns True; if the buffer is full, the OLDEST candle is
        dropped to make room (deque maxlen semantics), and the drop is counted.
        """
        async with self._lock:
            if len(self._buffer) >= self.max_size:
                logger.warning(f"Buffer full, dropping oldest candle. Size: {len(self._buffer)}")
                self.stats.total_dropped += 1

            self._buffer.append(candle)
            self.stats.total_added += 1

            # Trigger an immediate flush if the batch size is reached
            if len(self._buffer) >= self.batch_size:
                self._flush_event.set()

            return True
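The drop-oldest behavior in `add()` comes from `deque(maxlen=...)`: appending to a full deque silently discards the item at the opposite end. A minimal standalone sketch:

```python
from collections import deque

buf = deque(maxlen=3)
for i in range(5):
    buf.append(i)

# The oldest items (0 and 1) were discarded to make room
print(list(buf))  # [2, 3, 4]
```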
    async def add_many(self, candles: List[Candle]) -> int:
        """Add multiple candles to the buffer"""
        added = 0
        for candle in candles:
            if await self.add(candle):
                added += 1
        return added

    async def get_batch(self, n: Optional[int] = None) -> List[Candle]:
        """Get up to N candles from the buffer (without removing them)"""
        async with self._lock:
            n = n or len(self._buffer)
            return list(self._buffer)[:n]

    async def flush(self) -> int:
        """
        Manually flush the buffer to the callback.
        Returns the number of candles flushed.
        """
        candles_to_flush: List[Candle] = []

        async with self._lock:
            if not self._buffer:
                return 0

            candles_to_flush = list(self._buffer)
            self._buffer.clear()

        if candles_to_flush and self.on_flush:
            try:
                await self.on_flush(candles_to_flush)

                # Update stats
                self.stats.total_flushed += len(candles_to_flush)
                self.stats.last_flush_time = datetime.now(timezone.utc)
                self._batch_sizes.append(len(candles_to_flush))
                self.stats.avg_batch_size = sum(self._batch_sizes) / len(self._batch_sizes)

                logger.debug(f"Flushed {len(candles_to_flush)} candles")
                return len(candles_to_flush)

            except Exception as e:
                logger.error(f"Flush callback failed: {e}")
                # Put the candles back at the front of the buffer, preserving order
                async with self._lock:
                    for candle in reversed(candles_to_flush):
                        self._buffer.appendleft(candle)
                return 0
        elif candles_to_flush:
            # No callback configured; just count them as flushed
            self.stats.total_flushed += len(candles_to_flush)
            return len(candles_to_flush)

        return 0

    async def _flush_loop(self) -> None:
        """Background task that periodically flushes the buffer"""
        while not self._stop_event.is_set():
            try:
                # Wait for the flush interval or until explicitly triggered
                await asyncio.wait_for(
                    self._flush_event.wait(),
                    timeout=self.flush_interval
                )
                self._flush_event.clear()

                # Flush if we have data
                if await self.get_buffer_size() > 0:
                    await self.flush()

            except asyncio.TimeoutError:
                # Flush interval elapsed; flush if we have data
                if await self.get_buffer_size() > 0:
                    await self.flush()

            except Exception as e:
                logger.error(f"Error in flush loop: {e}")
                await asyncio.sleep(1)

    def get_stats(self) -> BufferStats:
        """Get current buffer statistics"""
        return self.stats

    async def get_buffer_size(self) -> int:
        """Get current buffer size"""
        async with self._lock:
            return len(self._buffer)

    def detect_gaps(self, candles: List[Candle]) -> List[Dict[str, Any]]:
        """
        Detect gaps in a candle sequence.
        Returns a list of gap descriptions.
        """
        if len(candles) < 2:
            return []

        gaps = []
        sorted_candles = sorted(candles, key=lambda c: c.time)

        for i in range(1, len(sorted_candles)):
            prev = sorted_candles[i - 1]
            curr = sorted_candles[i]

            # Expected interval between 1m candles
            expected_diff = 60  # seconds
            actual_diff = (curr.time - prev.time).total_seconds()

            if actual_diff > expected_diff * 1.5:  # Allow 50% tolerance
                gaps.append({
                    'from_time': prev.time.isoformat(),
                    'to_time': curr.time.isoformat(),
                    'missing_candles': int(actual_diff / expected_diff) - 1,
                    'duration_seconds': actual_diff
                })

        return gaps
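The flush loop's "interval or explicit trigger, whichever comes first" behavior relies on `asyncio.wait_for(event.wait(), timeout=...)`. A standalone sketch of that pattern (the names `demo`, `trigger`, and `loop` are illustrative, not from the module):

```python
import asyncio

async def demo() -> list:
    events = []
    trigger = asyncio.Event()

    async def loop():
        for _ in range(2):
            try:
                # Wake on the event OR after the timeout, whichever comes first
                await asyncio.wait_for(trigger.wait(), timeout=0.2)
                trigger.clear()
                events.append("triggered")
            except asyncio.TimeoutError:
                events.append("interval")

    task = asyncio.create_task(loop())
    await asyncio.sleep(0.05)
    trigger.set()   # the explicit trigger wakes the first iteration early
    await task      # the second iteration falls through on the timeout
    return events

print(asyncio.run(demo()))  # ['triggered', 'interval']
```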
401
src/data_collector/custom_timeframe_generator.py
Normal file
@ -0,0 +1,401 @@
"""
Custom Timeframe Generator
Generates both standard and custom timeframes from 1m data.
Updates "building" candles in real time.
"""

import asyncio
import logging
import calendar
from datetime import datetime, timedelta, timezone
from typing import List, Optional, Dict, Tuple
from dataclasses import dataclass

from .database import DatabaseManager
from .websocket_client import Candle


logger = logging.getLogger(__name__)


@dataclass
class CustomCandle(Candle):
    """Extended candle with a completion flag"""
    is_complete: bool = True


class CustomTimeframeGenerator:
    """
    Manages and generates multiple timeframes from 1m candles.
    Standard intervals use clock-aligned boundaries.
    Custom intervals use continuous bucketing from the first recorded 1m candle.
    """

    # Standard intervals (supported by Hyperliquid)
    STANDARD_INTERVALS = {
        '3m': {'type': 'min', 'value': 3},
        '5m': {'type': 'min', 'value': 5},
        '15m': {'type': 'min', 'value': 15},
        '30m': {'type': 'min', 'value': 30},
        '1h': {'type': 'hour', 'value': 1},
        '2h': {'type': 'hour', 'value': 2},
        '4h': {'type': 'hour', 'value': 4},
        '8h': {'type': 'hour', 'value': 8},
        '12h': {'type': 'hour', 'value': 12},
        '1d': {'type': 'day', 'value': 1},
        '3d': {'type': 'day', 'value': 3},
        '1w': {'type': 'week', 'value': 1},
        '1M': {'type': 'month', 'value': 1}
    }

    # Custom intervals
    CUSTOM_INTERVALS = {
        '37m': {'minutes': 37, 'source': '1m'},
        '148m': {'minutes': 148, 'source': '37m'}
    }

    def __init__(self, db: DatabaseManager):
        self.db = db
        self.first_1m_time: Optional[datetime] = None
        # Anchor for 3d candles (fixed date)
        self.anchor_3d = datetime(2020, 1, 1, tzinfo=timezone.utc)

    async def initialize(self) -> None:
        """Fetch the first 1m timestamp, used as the origin for continuous bucketing"""
        async with self.db.acquire() as conn:
            first = await conn.fetchval("""
                SELECT MIN(time)
                FROM candles
                WHERE interval = '1m' AND symbol = 'BTC'
            """)
            if first:
                self.first_1m_time = first
                logger.info(f"TF Generator: First 1m candle at {first}")
            else:
                logger.warning("TF Generator: No 1m data found")

    def get_bucket_start(self, timestamp: datetime, interval: str) -> datetime:
        """Calculate the bucket start time for any interval"""
        # Handle custom intervals (continuous buckets anchored at first_1m_time)
        if interval in self.CUSTOM_INTERVALS:
            if not self.first_1m_time:
                return timestamp  # Fallback if not initialized
            minutes = self.CUSTOM_INTERVALS[interval]['minutes']
            delta = timestamp - self.first_1m_time
            bucket_num = int(delta.total_seconds() // (minutes * 60))
            return self.first_1m_time + timedelta(minutes=bucket_num * minutes)

        # Handle standard (clock-aligned) intervals
        if interval not in self.STANDARD_INTERVALS:
            return timestamp

        cfg = self.STANDARD_INTERVALS[interval]
        t = timestamp.replace(second=0, microsecond=0)

        if cfg['type'] == 'min':
            n = cfg['value']
            return t - timedelta(minutes=t.minute % n)
        elif cfg['type'] == 'hour':
            n = cfg['value']
            t = t.replace(minute=0)
            return t - timedelta(hours=t.hour % n)
        elif cfg['type'] == 'day':
            n = cfg['value']
            t = t.replace(hour=0, minute=0)
            if n == 1:
                return t
            else:  # 3d buckets, aligned to a fixed anchor date
                days_since_anchor = (t - self.anchor_3d).days
                return t - timedelta(days=days_since_anchor % n)
        elif cfg['type'] == 'week':
            t = t.replace(hour=0, minute=0)
            return t - timedelta(days=t.weekday())  # Monday start
        elif cfg['type'] == 'month':
            return t.replace(day=1, hour=0, minute=0)

        return t
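The continuous-bucket arithmetic for custom intervals reduces to integer division on the seconds elapsed since the first 1m candle. The same calculation as a self-contained function (a standalone sketch, not the class method; the origin date is illustrative):

```python
from datetime import datetime, timedelta, timezone

def custom_bucket_start(ts: datetime, origin: datetime, minutes: int) -> datetime:
    """Floor `ts` to the start of its `minutes`-wide bucket, counted from `origin`."""
    bucket_num = int((ts - origin).total_seconds() // (minutes * 60))
    return origin + timedelta(minutes=bucket_num * minutes)

origin = datetime(2024, 1, 1, tzinfo=timezone.utc)
ts = origin + timedelta(minutes=100)  # 100 minutes after the first 1m candle

# 100 // 37 == 2, so the 37m bucket starts at origin + 74 minutes
print(custom_bucket_start(ts, origin, 37))  # 2024-01-01 01:14:00+00:00
```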
    def get_expected_1m_count(self, bucket_start: datetime, interval: str) -> int:
        """Calculate the expected number of 1m candles in a full bucket"""
        if interval in self.CUSTOM_INTERVALS:
            return self.CUSTOM_INTERVALS[interval]['minutes']

        if interval in self.STANDARD_INTERVALS:
            cfg = self.STANDARD_INTERVALS[interval]
            if cfg['type'] == 'min':
                return cfg['value']
            if cfg['type'] == 'hour':
                return cfg['value'] * 60
            if cfg['type'] == 'day':
                return cfg['value'] * 1440
            if cfg['type'] == 'week':
                return 7 * 1440
            if cfg['type'] == 'month':
                _, days = calendar.monthrange(bucket_start.year, bucket_start.month)
                return days * 1440
        return 1
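Month buckets are the only variable-length case, handled via `calendar.monthrange`, which returns `(weekday_of_first_day, days_in_month)`:

```python
import calendar

_, days = calendar.monthrange(2024, 2)  # leap-year February
print(days, days * 1440)  # 29 41760
```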
    async def aggregate_and_upsert(self, symbol: str, interval: str, bucket_start: datetime, conn=None) -> None:
        """Aggregate source data for a specific bucket and upsert the result"""
        if interval == '148m':
            # 148m is built from 37m candles (4 per bucket)
            source_interval = '37m'
            expected_count = 4
        else:
            source_interval = '1m'
            expected_count = self.get_expected_1m_count(bucket_start, interval)

        # Calculate the bucket end
        if interval == '1M':
            _, days = calendar.monthrange(bucket_start.year, bucket_start.month)
            bucket_end = bucket_start + timedelta(days=days)
        elif interval in self.STANDARD_INTERVALS:
            cfg = self.STANDARD_INTERVALS[interval]
            if cfg['type'] == 'min':
                bucket_end = bucket_start + timedelta(minutes=cfg['value'])
            elif cfg['type'] == 'hour':
                bucket_end = bucket_start + timedelta(hours=cfg['value'])
            elif cfg['type'] == 'day':
                bucket_end = bucket_start + timedelta(days=cfg['value'])
            elif cfg['type'] == 'week':
                bucket_end = bucket_start + timedelta(weeks=1)
            else:
                bucket_end = bucket_start + timedelta(minutes=1)
        elif interval in self.CUSTOM_INTERVALS:
            minutes = self.CUSTOM_INTERVALS[interval]['minutes']
            bucket_end = bucket_start + timedelta(minutes=minutes)
        else:
            bucket_end = bucket_start + timedelta(minutes=1)

        # Use the provided connection or acquire a new one
        if conn is None:
            async with self.db.acquire() as connection:
                await self._process_aggregation(connection, symbol, interval, source_interval, bucket_start, bucket_end, expected_count)
        else:
            await self._process_aggregation(conn, symbol, interval, source_interval, bucket_start, bucket_end, expected_count)

    async def _process_aggregation(self, conn, symbol, interval, source_interval, bucket_start, bucket_end, expected_count):
        """Perform the aggregation using a specific connection"""
        rows = await conn.fetch("""
            SELECT time, open, high, low, close, volume
            FROM candles
            WHERE symbol = $1 AND interval = $2
              AND time >= $3 AND time < $4
            ORDER BY time ASC
        """, symbol, source_interval, bucket_start, bucket_end)

        if not rows:
            return

        # The bucket is complete once all expected source candles are present
        is_complete = len(rows) >= expected_count

        candle = CustomCandle(
            time=bucket_start,
            symbol=symbol,
            interval=interval,
            open=float(rows[0]['open']),
            high=max(float(r['high']) for r in rows),
            low=min(float(r['low']) for r in rows),
            close=float(rows[-1]['close']),
            volume=sum(float(r['volume']) for r in rows),
            is_complete=is_complete
        )

        await self._upsert_candle(candle, conn)
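The OHLCV roll-up in `_process_aggregation` is: first open, max high, min low, last close, summed volume. The same rule as a standalone sketch over plain dicts (sample values are illustrative):

```python
def aggregate_ohlcv(rows: list) -> dict:
    """Roll up time-ordered source candles into one bucket candle."""
    return {
        'open': rows[0]['open'],                  # first candle's open
        'high': max(r['high'] for r in rows),
        'low': min(r['low'] for r in rows),
        'close': rows[-1]['close'],               # last candle's close
        'volume': sum(r['volume'] for r in rows),
    }

rows = [
    {'open': 100, 'high': 105, 'low': 99, 'close': 104, 'volume': 10},
    {'open': 104, 'high': 110, 'low': 103, 'close': 108, 'volume': 12},
]
print(aggregate_ohlcv(rows))
# {'open': 100, 'high': 110, 'low': 99, 'close': 108, 'volume': 22}
```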
    async def _upsert_candle(self, c: CustomCandle, conn=None) -> None:
        """Upsert a single candle, using the provided connection or acquiring a new one"""
        query = """
            INSERT INTO candles (time, symbol, interval, open, high, low, close, volume, validated)
            VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9)
            ON CONFLICT (time, symbol, interval) DO UPDATE SET
                open = EXCLUDED.open,
                high = EXCLUDED.high,
                low = EXCLUDED.low,
                close = EXCLUDED.close,
                volume = EXCLUDED.volume,
                validated = EXCLUDED.validated,
                created_at = NOW()
        """
        values = (c.time, c.symbol, c.interval, c.open, c.high, c.low, c.close, c.volume, c.is_complete)

        if conn is None:
            async with self.db.acquire() as connection:
                await connection.execute(query, *values)
        else:
            await conn.execute(query, *values)

    async def update_realtime(self, new_1m_candles: List[Candle]) -> None:
        """
        Update ALL timeframes (standard and custom) based on new 1m candles.
        Called after the 1m buffer flush.
        Uses a single connection for all updates, sequentially, to avoid pool exhaustion.
        """
        if not new_1m_candles:
            return

        if not self.first_1m_time:
            await self.initialize()

        if not self.first_1m_time:
            return

        symbol = new_1m_candles[0].symbol

        async with self.db.acquire() as conn:
            # 1. Update all standard intervals plus 37m, sequentially
            #    (sequential is required because they share the same connection)
            intervals_to_update = list(self.STANDARD_INTERVALS.keys()) + ['37m']

            for interval in intervals_to_update:
                try:
                    bucket_start = self.get_bucket_start(new_1m_candles[-1].time, interval)
                    await self.aggregate_and_upsert(symbol, interval, bucket_start, conn=conn)
                except Exception as e:
                    logger.error(f"Error updating interval {interval}: {e}")

            # 2. Update 148m (it depends on 37m being updated first)
            try:
                bucket_148m = self.get_bucket_start(new_1m_candles[-1].time, '148m')
                await self.aggregate_and_upsert(symbol, '148m', bucket_148m, conn=conn)
            except Exception as e:
                logger.error(f"Error updating interval 148m: {e}")

    async def generate_historical(self, interval: str, batch_size: int = 5000) -> int:
        """Force recalculation of all candles for a timeframe from its source data"""
        if not self.first_1m_time:
            await self.initialize()

        if not self.first_1m_time:
            return 0

        config = self.CUSTOM_INTERVALS.get(interval) or {'source': '1m'}
        source_interval = config.get('source', '1m')

        logger.info(f"Generating historical {interval} from {source_interval}...")

        async with self.db.acquire() as conn:
            min_max = await conn.fetchrow("""
                SELECT MIN(time), MAX(time) FROM candles
                WHERE symbol = 'BTC' AND interval = $1
            """, source_interval)

            if not min_max or not min_max[0]:
                return 0

            curr = self.get_bucket_start(min_max[0], interval)
            end = min_max[1]

            total_inserted = 0
            while curr <= end:
                # Reuse this connection instead of re-acquiring from the pool per bucket
                await self.aggregate_and_upsert('BTC', interval, curr, conn=conn)
                total_inserted += 1

                if interval == '1M':
                    _, days = calendar.monthrange(curr.year, curr.month)
                    curr += timedelta(days=days)
                elif interval in self.STANDARD_INTERVALS:
                    cfg = self.STANDARD_INTERVALS[interval]
                    if cfg['type'] == 'min':
                        curr += timedelta(minutes=cfg['value'])
                    elif cfg['type'] == 'hour':
                        curr += timedelta(hours=cfg['value'])
                    elif cfg['type'] == 'day':
                        curr += timedelta(days=cfg['value'])
                    elif cfg['type'] == 'week':
                        curr += timedelta(weeks=1)
                else:
                    minutes = self.CUSTOM_INTERVALS[interval]['minutes']
                    curr += timedelta(minutes=minutes)

                if total_inserted % 100 == 0:
                    logger.info(f"Generated {total_inserted} {interval} candles...")
                    await asyncio.sleep(0.01)

            return total_inserted

    async def generate_from_gap(self, interval: str) -> int:
        """
        Generate candles only where they are missing.
        Compares the source interval's max time with the target interval's max time.
        """
        if not self.first_1m_time:
            await self.initialize()

        if not self.first_1m_time:
            return 0

        config = self.CUSTOM_INTERVALS.get(interval) or {'source': '1m'}
        source_interval = config.get('source', '1m')

        async with self.db.acquire() as conn:
            # Get the source range
            source_min_max = await conn.fetchrow("""
                SELECT MIN(time), MAX(time) FROM candles
                WHERE symbol = 'BTC' AND interval = $1
            """, source_interval)

            if not source_min_max or not source_min_max[1]:
                return 0

            # Get the target interval's max time
            target_max = await conn.fetchval("""
                SELECT MAX(time) FROM candles
                WHERE symbol = 'BTC' AND interval = $1
            """, interval)

            source_max = source_min_max[1]

            if target_max:
                # Start from the bucket after target_max
                curr = self.get_bucket_start(target_max, interval)
                if interval in self.CUSTOM_INTERVALS:
                    minutes = self.CUSTOM_INTERVALS[interval]['minutes']
                    curr = curr + timedelta(minutes=minutes)
                elif interval in self.STANDARD_INTERVALS:
                    cfg = self.STANDARD_INTERVALS[interval]
                    if cfg['type'] == 'min':
                        curr = curr + timedelta(minutes=cfg['value'])
                    elif cfg['type'] == 'hour':
                        curr = curr + timedelta(hours=cfg['value'])
                    elif cfg['type'] == 'day':
                        curr = curr + timedelta(days=cfg['value'])
                    elif cfg['type'] == 'week':
                        curr = curr + timedelta(weeks=1)
            else:
                # No target data; start from the source minimum
                curr = self.get_bucket_start(source_min_max[0], interval)

            end = source_max

            if curr > end:
                logger.info(f"{interval}: Already up to date (target: {target_max}, source: {source_max})")
                return 0

            logger.info(f"Generating {interval} from {curr} to {end}...")

            total_inserted = 0
            while curr <= end:
                await self.aggregate_and_upsert('BTC', interval, curr, conn=conn)
                total_inserted += 1

                if interval == '1M':
                    _, days = calendar.monthrange(curr.year, curr.month)
                    curr += timedelta(days=days)
                elif interval in self.STANDARD_INTERVALS:
                    cfg = self.STANDARD_INTERVALS[interval]
                    if cfg['type'] == 'min':
                        curr += timedelta(minutes=cfg['value'])
                    elif cfg['type'] == 'hour':
                        curr += timedelta(hours=cfg['value'])
                    elif cfg['type'] == 'day':
                        curr += timedelta(days=cfg['value'])
                    elif cfg['type'] == 'week':
                        curr += timedelta(weeks=1)
                else:
                    minutes = self.CUSTOM_INTERVALS[interval]['minutes']
                    curr += timedelta(minutes=minutes)

                if total_inserted % 50 == 0:
                    logger.info(f"Generated {total_inserted} {interval} candles...")
                    await asyncio.sleep(0.01)

            logger.info(f"{interval}: Generated {total_inserted} candles")
            return total_inserted

    async def verify_integrity(self, interval: str) -> Dict:
        """Return candle counts and the covered time range for an interval"""
        async with self.db.acquire() as conn:
            stats = await conn.fetchrow("""
                SELECT
                    COUNT(*) as total_candles,
                    MIN(time) as earliest,
                    MAX(time) as latest,
                    COUNT(*) FILTER (WHERE validated = TRUE) as complete_candles,
                    COUNT(*) FILTER (WHERE validated = FALSE) as incomplete_candles
                FROM candles
                WHERE interval = $1 AND symbol = 'BTC'
            """, interval)
            return dict(stats) if stats else {}
261
src/data_collector/database.py
Normal file
@ -0,0 +1,261 @@
"""
Database interface for TimescaleDB
Optimized for batch inserts and low resource usage
"""

import asyncio
import json
import logging
from contextlib import asynccontextmanager
from datetime import datetime, timezone
from typing import List, Dict, Any, Optional
import os

import asyncpg
from asyncpg import Pool

from .websocket_client import Candle


logger = logging.getLogger(__name__)


class DatabaseManager:
    """Manages TimescaleDB connections and operations"""

    def __init__(
        self,
        host: Optional[str] = None,
        port: Optional[int] = None,
        database: Optional[str] = None,
        user: Optional[str] = None,
        password: Optional[str] = None,
        pool_size: int = 20
    ):
        self.host = host or os.getenv('DB_HOST', 'localhost')
        self.port = port or int(os.getenv('DB_PORT', 5432))
        self.database = database or os.getenv('DB_NAME', 'btc_data')
        self.user = user or os.getenv('DB_USER', 'btc_bot')
        self.password = password or os.getenv('DB_PASSWORD', '')
        self.pool_size = int(os.getenv('DB_POOL_SIZE', pool_size))

        self.pool: Optional[Pool] = None

    async def connect(self) -> None:
        """Initialize the connection pool"""
        try:
            self.pool = await asyncpg.create_pool(
                host=self.host,
                port=self.port,
                database=self.database,
                user=self.user,
                password=self.password,
                min_size=2,
                max_size=self.pool_size,
                command_timeout=60,
                max_inactive_connection_lifetime=300
            )

            # Test the connection
            async with self.acquire() as conn:
                version = await conn.fetchval('SELECT version()')
                logger.info(f"Connected to database: {version[:50]}...")

            logger.info(f"Database pool created (min: 2, max: {self.pool_size})")

        except Exception as e:
            logger.error(f"Failed to connect to database: {type(e).__name__}: {e!r}")
            raise

    async def disconnect(self) -> None:
        """Close the connection pool"""
        if self.pool:
            await self.pool.close()
            logger.info("Database pool closed")

    @asynccontextmanager
    async def acquire(self, timeout: float = 30.0):
        """Context manager for acquiring a connection with a timeout"""
        if not self.pool:
            raise RuntimeError("Database not connected")
        try:
            async with self.pool.acquire(timeout=timeout) as conn:
                yield conn
        except asyncio.TimeoutError:
            logger.error(f"Database connection acquisition timed out after {timeout}s")
            raise

    async def insert_candles(self, candles: List[Candle]) -> int:
        """
        Batch-insert candles into the database.
        Uses ON CONFLICT to handle duplicates.
        """
        if not candles:
            return 0

        # Prepare values for the batch insert
        values = [
            (
                c.time,
                c.symbol,
                c.interval,
                c.open,
                c.high,
                c.low,
                c.close,
                c.volume,
                False,          # validated
                'hyperliquid'   # source
            )
            for c in candles
        ]

        async with self.acquire() as conn:
            # executemany batches the statements efficiently (returns None)
            await conn.executemany('''
                INSERT INTO candles (time, symbol, interval, open, high, low, close, volume, validated, source)
                VALUES ($1, $2, $3, $4, $5, $6, $7, $8, $9, $10)
                ON CONFLICT (time, symbol, interval)
                DO UPDATE SET
                    open = EXCLUDED.open,
                    high = EXCLUDED.high,
                    low = EXCLUDED.low,
                    close = EXCLUDED.close,
                    volume = EXCLUDED.volume,
                    source = EXCLUDED.source
            ''', values)

        inserted = len(candles)
        logger.debug(f"Inserted/updated {inserted} candles")
        return inserted

    async def get_candles(
        self,
        symbol: str,
        interval: str,
        start: Optional[datetime] = None,
        end: Optional[datetime] = None,
        limit: int = 1000
    ) -> List[Dict[str, Any]]:
        """Query candles from the database"""
        query = '''
            SELECT time, symbol, interval, open, high, low, close, volume, validated
            FROM candles
            WHERE symbol = $1 AND interval = $2
        '''
        params = [symbol, interval]

        if start:
            query += f' AND time >= ${len(params) + 1}'
            params.append(start)

        if end:
            query += f' AND time <= ${len(params) + 1}'
            params.append(end)

        query += f' ORDER BY time DESC LIMIT ${len(params) + 1}'
        params.append(limit)

        async with self.acquire() as conn:
            rows = await conn.fetch(query, *params)
            return [dict(row) for row in rows]

    async def get_latest_candle(self, symbol: str, interval: str) -> Optional[Dict[str, Any]]:
        """Get the most recent candle for a symbol"""
        async with self.acquire() as conn:
            row = await conn.fetchrow('''
                SELECT time, symbol, interval, open, high, low, close, volume
                FROM candles
                WHERE symbol = $1 AND interval = $2
                ORDER BY time DESC
                LIMIT 1
            ''', symbol, interval)

            return dict(row) if row else None

    async def detect_gaps(
        self,
        symbol: str,
        interval: str,
        since: Optional[datetime] = None
    ) -> List[Dict[str, Any]]:
        """
        Detect missing candles in the database.
        Uses SQL window functions for efficiency.
        """
        # Default to the start of the current UTC day (timezone-aware for timestamptz)
        since = since or datetime.now(timezone.utc).replace(hour=0, minute=0, second=0, microsecond=0)

        async with self.acquire() as conn:
            # Find gaps using the LAG window function
            rows = await conn.fetch('''
                WITH ordered AS (
                    SELECT
                        time,
                        LAG(time) OVER (ORDER BY time) as prev_time
                    FROM candles
                    WHERE symbol = $1
                      AND interval = $2
                      AND time >= $3
                    ORDER BY time
                )
                SELECT
                    prev_time as gap_start,
                    time as gap_end,
                    EXTRACT(EPOCH FROM (time - prev_time)) / 60 - 1 as missing_candles
                FROM ordered
                WHERE time - prev_time > INTERVAL '2 minutes'
                ORDER BY prev_time
            ''', symbol, interval, since)

            return [
                {
                    'gap_start': row['gap_start'].isoformat(),
                    'gap_end': row['gap_end'].isoformat(),
                    'missing_candles': int(row['missing_candles'])
                }
                for row in rows
            ]

    async def log_quality_issue(
        self,
        check_type: str,
        severity: str,
        symbol: Optional[str] = None,
        details: Optional[Dict[str, Any]] = None
    ) -> None:
        """Log a data quality issue"""
        async with self.acquire() as conn:
            # Serialize details explicitly: by default asyncpg expects a JSON
            # string for JSONB columns (assumes no custom type codec is set)
            await conn.execute('''
                INSERT INTO data_quality (check_type, severity, symbol, details)
                VALUES ($1, $2, $3, $4)
            ''', check_type, severity, symbol,
                json.dumps(details) if details is not None else None)

        logger.warning(f"Quality issue logged: {check_type} ({severity})")

    async def get_health_stats(self) -> Dict[str, Any]:
        """Get database health statistics"""
        async with self.acquire() as conn:
            # Table sizes
            table_stats = await conn.fetch('''
                SELECT
                    relname as table_name,
                    pg_size_pretty(pg_total_relation_size(relid)) as size,
                    n_live_tup as row_count
                FROM pg_stat_user_tables
                WHERE relname IN ('candles', 'indicators', 'data_quality')
            ''')

            # Latest candles over the past 24 hours
            latest = await conn.fetch('''
                SELECT symbol, MAX(time) as last_time, COUNT(*) as count
                FROM candles
                WHERE time > NOW() - INTERVAL '24 hours'
                GROUP BY symbol
            ''')

            return {
                'tables': [dict(row) for row in table_stats],
                'latest_candles': [dict(row) for row in latest],
                'unresolved_issues': await conn.fetchval('''
                    SELECT COUNT(*) FROM data_quality WHERE resolved = FALSE
                ''')
            }
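The gap query in `detect_gaps` pairs each candle with its predecessor via `LAG` and flags pairs more than two minutes apart. The same logic in plain Python, for intuition (a standalone sketch, not used by the module):

```python
from datetime import datetime, timedelta

def find_gaps(times: list, threshold: timedelta = timedelta(minutes=2)) -> list:
    """Return gaps where consecutive timestamps are more than `threshold` apart."""
    gaps = []
    ordered = sorted(times)
    for prev, curr in zip(ordered, ordered[1:]):  # pairs, like LAG(time) OVER (ORDER BY time)
        if curr - prev > threshold:
            gaps.append({
                'gap_start': prev,
                'gap_end': curr,
                # mirrors EXTRACT(EPOCH FROM ...) / 60 - 1 in the SQL
                'missing_candles': int((curr - prev).total_seconds() // 60) - 1,
            })
    return gaps

base = datetime(2024, 1, 1)
times = [base, base + timedelta(minutes=1), base + timedelta(minutes=5)]
print(find_gaps(times))  # one gap: 3 candles missing between :01 and :05
```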
285
src/data_collector/indicator_engine.py
Normal file
@ -0,0 +1,285 @@
"""
Indicator Engine - Computes and stores technical indicators
Stateless, DB-backed design: the same code serves live updates and backtesting
"""

import asyncio
import json
import logging
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Dict, List, Optional, Any

from .database import DatabaseManager


logger = logging.getLogger(__name__)


@dataclass
class IndicatorConfig:
    """Configuration for a single indicator"""
    name: str             # e.g., "ma44"
    type: str             # e.g., "sma"
    period: int           # e.g., 44
    intervals: List[str]  # e.g., ["37m", "148m", "1d"]

    @classmethod
    def from_dict(cls, name: str, data: Dict[str, Any]) -> "IndicatorConfig":
        """Create a config from a YAML dict entry"""
        return cls(
            name=name,
            type=data["type"],
            period=data["period"],
            intervals=data["intervals"],
        )
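`from_dict` maps one named YAML entry onto a config object. A self-contained sketch of the expected shape, using a stand-in dataclass mirroring the one above (the "ma44" values come from the field comments):

```python
from dataclasses import dataclass
from typing import Any, Dict, List

@dataclass
class IndicatorConfig:
    name: str
    type: str
    period: int
    intervals: List[str]

    @classmethod
    def from_dict(cls, name: str, data: Dict[str, Any]) -> "IndicatorConfig":
        return cls(name=name, type=data["type"], period=data["period"],
                   intervals=data["intervals"])

# One entry as it would appear after loading the YAML config
yaml_entry = {"type": "sma", "period": 44, "intervals": ["37m", "148m", "1d"]}
cfg = IndicatorConfig.from_dict("ma44", yaml_entry)
print(cfg.name, cfg.type, cfg.period)  # ma44 sma 44
```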
@dataclass
class IndicatorResult:
    """Result of a single indicator computation"""
    name: str
    value: Optional[float]
    period: int
    timestamp: datetime


class IndicatorEngine:
    """
    Computes technical indicators from candle data in the database.

    Two modes, same math:
    - on_interval_update(): called by the live system after a higher-TF candle update
    - compute_at(): called by the backtester for a specific point in time
    Both query the DB for the required candle history and store results.
    """

    def __init__(self, db: DatabaseManager, configs: List[IndicatorConfig]):
        self.db = db
        self.configs = configs
        # Build lookup: interval -> list of configs that need computation
        self._interval_configs: Dict[str, List[IndicatorConfig]] = {}
        for cfg in configs:
            for interval in cfg.intervals:
                self._interval_configs.setdefault(interval, []).append(cfg)

        logger.info(
            f"IndicatorEngine initialized with {len(configs)} indicators "
            f"across intervals: {list(self._interval_configs.keys())}"
        )

    def get_configured_intervals(self) -> List[str]:
        """Return all intervals that have indicators configured"""
        return list(self._interval_configs.keys())

    async def on_interval_update(
        self,
        symbol: str,
        interval: str,
        timestamp: datetime,
    ) -> Dict[str, Optional[float]]:
        """
        Compute all indicators configured for this interval.
        Called by main.py after CustomTimeframeGenerator updates a higher TF.

        Returns a dict of indicator_name -> value (for use by the Brain).
        """
        configs = self._interval_configs.get(interval, [])
        if not configs:
            return {}

        return await self._compute_and_store(symbol, interval, timestamp, configs)

    async def compute_at(
        self,
        symbol: str,
        interval: str,
        timestamp: datetime,
    ) -> Dict[str, Optional[float]]:
        """
        Compute indicators at a specific point in time.
        Alias for on_interval_update -- used by the backtester for clarity.
        """
        return await self.on_interval_update(symbol, interval, timestamp)

    async def compute_historical(
        self,
        symbol: str,
        interval: str,
        start: datetime,
        end: datetime,
    ) -> int:
        """
        Batch-compute indicators over a time range.
        Iterates over every candle timestamp in [start, end].

        Returns the total number of indicator values stored.
        """
        configs = self._interval_configs.get(interval, [])
        if not configs:
            logger.warning(f"No indicators configured for interval {interval}")
            return 0

        # Get all candle timestamps in the range
        async with self.db.acquire() as conn:
            rows = await conn.fetch("""
                SELECT time FROM candles
                WHERE symbol = $1 AND interval = $2
                  AND time >= $3 AND time <= $4
                ORDER BY time ASC
            """, symbol, interval, start, end)

        if not rows:
            logger.warning(f"No candles found for {symbol}/{interval} in range")
            return 0

        timestamps = [row["time"] for row in rows]
        total_stored = 0

        logger.info(
            f"Computing {len(configs)} indicators across "
            f"{len(timestamps)} {interval} candles..."
        )

        for i, ts in enumerate(timestamps):
            results = await self._compute_and_store(symbol, interval, ts, configs)
            total_stored += sum(1 for v in results.values() if v is not None)

            if (i + 1) % 100 == 0:
                logger.info(f"Progress: {i + 1}/{len(timestamps)} candles processed")
                await asyncio.sleep(0.01)  # Yield to the event loop

        logger.info(
            f"Historical compute complete: {total_stored} indicator values "
            f"stored for {interval}"
        )
        return total_stored

    async def _compute_and_store(
        self,
        symbol: str,
        interval: str,
        timestamp: datetime,
        configs: List[IndicatorConfig],
    ) -> Dict[str, Optional[float]]:
        """Core computation: fetch candles, compute indicators, store results"""
        # Determine the maximum lookback needed
        max_period = max(cfg.period for cfg in configs)

        # Fetch enough candles for the longest indicator
        async with self.db.acquire() as conn:
            rows = await conn.fetch("""
                SELECT time, open, high, low, close, volume
|
||||
FROM candles
|
||||
WHERE symbol = $1 AND interval = $2
|
||||
AND time <= $3
|
||||
ORDER BY time DESC
|
||||
LIMIT $4
|
||||
""", symbol, interval, timestamp, max_period)
|
||||
|
||||
if not rows:
|
||||
return {cfg.name: None for cfg in configs}
|
||||
|
||||
# Reverse to chronological order
|
||||
candles = list(reversed(rows))
|
||||
closes = [float(c["close"]) for c in candles]
|
||||
|
||||
# Compute each indicator
|
||||
results: Dict[str, Optional[float]] = {}
|
||||
values_to_store: List[tuple] = []
|
||||
|
||||
for cfg in configs:
|
||||
value = self._compute_indicator(cfg, closes)
|
||||
results[cfg.name] = value
|
||||
|
||||
if value is not None:
|
||||
values_to_store.append((
|
||||
timestamp,
|
||||
symbol,
|
||||
interval,
|
||||
cfg.name,
|
||||
value,
|
||||
json.dumps({"type": cfg.type, "period": cfg.period}),
|
||||
))
|
||||
|
||||
# Batch upsert all computed values
|
||||
if values_to_store:
|
||||
async with self.db.acquire() as conn:
|
||||
await conn.executemany("""
|
||||
INSERT INTO indicators (time, symbol, interval, indicator_name, value, parameters)
|
||||
VALUES ($1, $2, $3, $4, $5, $6)
|
||||
ON CONFLICT (time, symbol, interval, indicator_name)
|
||||
DO UPDATE SET
|
||||
value = EXCLUDED.value,
|
||||
parameters = EXCLUDED.parameters,
|
||||
computed_at = NOW()
|
||||
""", values_to_store)
|
||||
|
||||
logger.debug(
|
||||
f"Stored {len(values_to_store)} indicator values for "
|
||||
f"{symbol}/{interval} at {timestamp}"
|
||||
)
|
||||
|
||||
return results
|
||||
|
||||
def _compute_indicator(
|
||||
self,
|
||||
config: IndicatorConfig,
|
||||
closes: List[float],
|
||||
) -> Optional[float]:
|
||||
"""Dispatch to the correct computation function"""
|
||||
if config.type == "sma":
|
||||
return self.compute_sma(closes, config.period)
|
||||
else:
|
||||
logger.warning(f"Unknown indicator type: {config.type}")
|
||||
return None
|
||||
|
||||
# ── Pure math functions (no DB, no async, easily testable) ──────────
|
||||
|
||||
@staticmethod
|
||||
def compute_sma(closes: List[float], period: int) -> Optional[float]:
|
||||
"""Simple Moving Average over the last `period` closes"""
|
||||
if len(closes) < period:
|
||||
return None
|
||||
return sum(closes[-period:]) / period
|
||||
|
||||
async def get_latest_values(
|
||||
self,
|
||||
symbol: str,
|
||||
interval: str,
|
||||
) -> Dict[str, float]:
|
||||
"""
|
||||
Get the most recent indicator values for a symbol/interval.
|
||||
Used by Brain to read current state.
|
||||
"""
|
||||
async with self.db.acquire() as conn:
|
||||
rows = await conn.fetch("""
|
||||
SELECT DISTINCT ON (indicator_name)
|
||||
indicator_name, value, time
|
||||
FROM indicators
|
||||
WHERE symbol = $1 AND interval = $2
|
||||
ORDER BY indicator_name, time DESC
|
||||
""", symbol, interval)
|
||||
|
||||
return {row["indicator_name"]: float(row["value"]) for row in rows}
|
||||
|
||||
async def get_values_at(
|
||||
self,
|
||||
symbol: str,
|
||||
interval: str,
|
||||
timestamp: datetime,
|
||||
) -> Dict[str, float]:
|
||||
"""
|
||||
Get indicator values at a specific timestamp.
|
||||
Used by Brain during backtesting.
|
||||
"""
|
||||
async with self.db.acquire() as conn:
|
||||
rows = await conn.fetch("""
|
||||
SELECT indicator_name, value
|
||||
FROM indicators
|
||||
WHERE symbol = $1 AND interval = $2 AND time = $3
|
||||
""", symbol, interval, timestamp)
|
||||
|
||||
return {row["indicator_name"]: float(row["value"]) for row in rows}
|
||||
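The `compute_sma` helper above is a pure function, so its behavior is easy to check in isolation. The following is a standalone restatement of it (copied out of the class for illustration, not an import from the project): it returns `None` until `period` closes are available, then averages the most recent `period` values.

```python
from typing import List, Optional

def compute_sma(closes: List[float], period: int) -> Optional[float]:
    """Simple Moving Average over the last `period` closes."""
    if len(closes) < period:
        return None  # not enough data yet
    return sum(closes[-period:]) / period

print(compute_sma([1.0, 2.0], 3))            # None (only 2 closes)
print(compute_sma([1.0, 2.0, 3.0, 4.0], 3))  # (2 + 3 + 4) / 3 = 3.0
```

This `None`-until-warm behavior is why `_compute_and_store` filters out `None` results before inserting rows.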
440
src/data_collector/main.py
Normal file
@ -0,0 +1,440 @@
"""
Main entry point for data collector service
Integrates WebSocket client, buffer, database, indicators, and brain
"""

import asyncio
import logging
import signal
import sys
from datetime import datetime, timezone
from typing import Optional, List
import os

import yaml

from .websocket_client import HyperliquidWebSocket, Candle
from .candle_buffer import CandleBuffer
from .database import DatabaseManager
from .custom_timeframe_generator import CustomTimeframeGenerator
from .indicator_engine import IndicatorEngine, IndicatorConfig
from .brain import Brain
from .backfill import HyperliquidBackfill


# Configure logging
logging.basicConfig(
    level=getattr(logging, os.getenv('LOG_LEVEL', 'INFO')),
    format='%(asctime)s - %(name)s - %(levelname)s - %(message)s',
    handlers=[
        logging.StreamHandler(sys.stdout),
        logging.FileHandler('/app/logs/collector.log') if os.path.exists('/app/logs') else logging.StreamHandler()
    ]
)

logger = logging.getLogger(__name__)


class DataCollector:
    """
    Main data collection orchestrator
    Manages WebSocket connection, buffering, and database writes
    """

    STANDARD_INTERVALS = ["1m", "3m", "5m", "15m", "30m", "1h", "2h", "4h", "8h", "12h", "1d", "3d", "1w"]

    def __init__(
        self,
        symbol: str = "BTC",
        interval: str = "1m"
    ):
        self.symbol = symbol
        self.interval = interval

        # Components
        self.db: Optional[DatabaseManager] = None
        self.buffer: Optional[CandleBuffer] = None
        self.websocket: Optional[HyperliquidWebSocket] = None
        self.custom_tf_generator: Optional[CustomTimeframeGenerator] = None
        self.indicator_engine: Optional[IndicatorEngine] = None
        self.brain: Optional[Brain] = None

        # State
        self.is_running = False
        self._stop_event = asyncio.Event()
        self._tasks = []

    async def start(self) -> None:
        """Initialize and start all components"""
        logger.info(f"Starting DataCollector for {self.symbol}")

        try:
            # Initialize database
            self.db = DatabaseManager()
            await self.db.connect()

            # Run startup backfill for all intervals
            await self._startup_backfill()

            # Initialize custom timeframe generator
            self.custom_tf_generator = CustomTimeframeGenerator(self.db)
            await self.custom_tf_generator.initialize()

            # Regenerate custom timeframes after startup backfill
            await self._regenerate_custom_timeframes()

            # Initialize indicator engine
            # Hardcoded config for now, eventually load from yaml
            indicator_configs = [
                IndicatorConfig("ma44", "sma", 44, ["37m", "148m", "1d"]),
                IndicatorConfig("ma125", "sma", 125, ["37m", "148m", "1d"])
            ]
            self.indicator_engine = IndicatorEngine(self.db, indicator_configs)

            # Initialize brain
            self.brain = Brain(self.db, self.indicator_engine)

            # Initialize buffer
            self.buffer = CandleBuffer(
                max_size=1000,
                flush_interval_seconds=30,
                batch_size=100,
                on_flush_callback=self._on_buffer_flush
            )
            await self.buffer.start()

            # Initialize WebSocket client
            self.websocket = HyperliquidWebSocket(
                symbol=self.symbol,
                interval=self.interval,
                on_candle_callback=self._on_candle,
                on_error_callback=self._on_error
            )

            # Setup signal handlers
            self._setup_signal_handlers()

            # Connect to WebSocket
            await self.websocket.connect()

            # Start main loops
            self.is_running = True
            self._tasks = [
                asyncio.create_task(self.websocket.receive_loop()),
                asyncio.create_task(self._health_check_loop()),
                asyncio.create_task(self._monitoring_loop())
            ]

            logger.info("DataCollector started successfully")

            # Wait for stop signal
            await self._stop_event.wait()

        except Exception as e:
            logger.error(f"Failed to start DataCollector: {type(e).__name__}: {e!r}")
            raise
        finally:
            await self.stop()

    async def _startup_backfill(self) -> None:
        """
        Backfill missing data on startup for all standard intervals.
        Uses both gap detection AND time-based backfill for robustness.
        """
        logger.info("Running startup backfill for all intervals...")

        try:
            async with HyperliquidBackfill(self.db, self.symbol, self.STANDARD_INTERVALS) as backfill:
                for interval in self.STANDARD_INTERVALS:
                    try:
                        # First, use gap detection to find any holes
                        gaps = await self.db.detect_gaps(self.symbol, interval)

                        if gaps:
                            logger.info(f"{interval}: {len(gaps)} gaps detected")
                            for gap in gaps:
                                gap_start = datetime.fromisoformat(gap['gap_start'].replace('Z', '+00:00'))
                                gap_end = datetime.fromisoformat(gap['gap_end'].replace('Z', '+00:00'))

                                logger.info(f"  Filling gap: {gap_start} to {gap_end}")
                                candles = await backfill.fetch_candles(interval, gap_start, gap_end)

                                if candles:
                                    inserted = await self.db.insert_candles(candles)
                                    logger.info(f"  Inserted {inserted} candles for gap")

                                await asyncio.sleep(0.2)

                        # Second, check if we're behind current time
                        latest = await self.db.get_latest_candle(self.symbol, interval)
                        now = datetime.now(timezone.utc)

                        if latest:
                            last_time = latest['time']
                            gap_minutes = (now - last_time).total_seconds() / 60

                            if gap_minutes > 2:
                                logger.info(f"{interval}: {gap_minutes:.0f} min behind, backfilling to now...")
                                candles = await backfill.fetch_candles(interval, last_time, now)

                                if candles:
                                    inserted = await self.db.insert_candles(candles)
                                    logger.info(f"  Inserted {inserted} candles")
                            else:
                                logger.info(f"{interval}: up to date")
                        else:
                            # No data exists, backfill last 7 days
                            logger.info(f"{interval}: No data, backfilling 7 days...")
                            count = await backfill.backfill_interval(interval, days_back=7)
                            logger.info(f"  Inserted {count} candles")

                        await asyncio.sleep(0.2)

                    except Exception as e:
                        logger.error(f"Startup backfill failed for {interval}: {e}")
                        import traceback
                        logger.error(traceback.format_exc())
                        continue

        except Exception as e:
            logger.error(f"Startup backfill error: {e}")
            import traceback
            logger.error(traceback.format_exc())

        logger.info("Startup backfill complete")

    async def _regenerate_custom_timeframes(self) -> None:
        """
        Regenerate custom timeframes (37m, 148m) only from gaps.
        Only generates candles that are missing, not all from beginning.
        """
        if not self.custom_tf_generator:
            return

        logger.info("Checking custom timeframes for gaps...")

        try:
            for interval in ['37m', '148m']:
                try:
                    count = await self.custom_tf_generator.generate_from_gap(interval)
                    if count > 0:
                        logger.info(f"{interval}: Generated {count} candles")
                    else:
                        logger.info(f"{interval}: Up to date")
                except Exception as e:
                    logger.error(f"Failed to regenerate {interval}: {e}")

        except Exception as e:
            logger.error(f"Custom timeframe regeneration error: {e}")

        logger.info("Custom timeframe check complete")

    async def stop(self) -> None:
        """Graceful shutdown"""
        if not self.is_running:
            return

        logger.info("Stopping DataCollector...")
        self.is_running = False
        self._stop_event.set()

        # Cancel tasks
        for task in self._tasks:
            if not task.done():
                task.cancel()

        # Wait for tasks to complete
        if self._tasks:
            await asyncio.gather(*self._tasks, return_exceptions=True)

        # Stop components
        if self.websocket:
            await self.websocket.disconnect()

        if self.buffer:
            await self.buffer.stop()

        if self.db:
            await self.db.disconnect()

        logger.info("DataCollector stopped")

    async def _on_candle(self, candle: Candle) -> None:
        """Handle incoming candle from WebSocket"""
        try:
            # Add to buffer
            await self.buffer.add(candle)
            logger.debug(f"Received candle: {candle.time} - Close: {candle.close}")
        except Exception as e:
            logger.error(f"Error processing candle: {e}")

    async def _on_buffer_flush(self, candles: list) -> None:
        """Handle buffer flush - write to database and update custom timeframes"""
        try:
            inserted = await self.db.insert_candles(candles)
            logger.info(f"Flushed {inserted} candles to database")

            # Update custom timeframes (37m, 148m) in background
            if self.custom_tf_generator and inserted > 0:
                asyncio.create_task(
                    self._update_custom_timeframes(candles),
                    name="custom_tf_update"
                )
        except Exception as e:
            logger.error(f"Failed to write candles to database: {e}")
            raise  # Re-raise to trigger buffer retry

    async def _update_custom_timeframes(self, candles: list) -> None:
        """
        Update custom timeframes in background, then trigger indicators/brain.

        This chain ensures that indicators are computed on fresh candle data,
        and the brain evaluates on fresh indicator data.
        """
        try:
            # 1. Update custom candles (37m, 148m, etc.)
            await self.custom_tf_generator.update_realtime(candles)
            logger.debug("Custom timeframes updated")

            # 2. Trigger indicator updates for configured intervals
            # We use the timestamp of the last 1m candle as the trigger point
            trigger_time = candles[-1].time

            if self.indicator_engine:
                intervals = self.indicator_engine.get_configured_intervals()
                for interval in intervals:
                    # Get the correct bucket start time for this interval
                    # e.g., if trigger_time is 09:48:00, 37m bucket might start at 09:25:00
                    if self.custom_tf_generator:
                        bucket_start = self.custom_tf_generator.get_bucket_start(trigger_time, interval)
                    else:
                        bucket_start = trigger_time

                    # Compute indicators for this bucket
                    raw_indicators = await self.indicator_engine.on_interval_update(
                        self.symbol, interval, bucket_start
                    )

                    # Filter out None values to satisfy type checker
                    indicators = {k: v for k, v in raw_indicators.items() if v is not None}

                    # 3. Evaluate brain if we have fresh indicators
                    if self.brain and indicators:
                        await self.brain.evaluate(
                            self.symbol, interval, bucket_start, indicators
                        )

        except Exception as e:
            logger.error(f"Failed to update custom timeframes/indicators: {e}")
            # Don't raise - this is non-critical

    async def _on_error(self, error: Exception) -> None:
        """Handle WebSocket errors"""
        logger.error(f"WebSocket error: {error}")
        # Could implement alerting here (Telegram, etc.)

    async def _health_check_loop(self) -> None:
        """Periodic health checks"""
        while self.is_running:
            try:
                await asyncio.sleep(60)  # Check every minute

                if not self.is_running:
                    break

                # Check WebSocket health
                health = self.websocket.get_connection_health()

                if health['seconds_since_last_message'] and health['seconds_since_last_message'] > 120:
                    logger.warning("No messages received for 2+ minutes")
                    # Could trigger reconnection or alert

                # Log stats
                buffer_stats = self.buffer.get_stats()
                logger.info(f"Health: {health}, Buffer: {buffer_stats.to_dict()}")

            except asyncio.CancelledError:
                break
            except Exception as e:
                logger.error(f"Error in health check: {e}")

    async def _monitoring_loop(self) -> None:
        """Periodic monitoring and maintenance tasks"""
        while self.is_running:
            try:
                await asyncio.sleep(300)  # Every 5 minutes

                if not self.is_running:
                    break

                # Detect gaps
                gaps = await self.db.detect_gaps(self.symbol, self.interval)
                if gaps:
                    logger.warning(f"Detected {len(gaps)} data gaps: {gaps}")
                    await self._backfill_gaps(gaps)

                # Log database health
                health = await self.db.get_health_stats()
                logger.info(f"Database health: {health}")

            except asyncio.CancelledError:
                break
            except Exception as e:
                logger.error(f"Error in monitoring loop: {e}")

    async def _backfill_gaps(self, gaps: list) -> None:
        """Backfill detected data gaps from Hyperliquid"""
        if not gaps:
            return

        logger.info(f"Starting backfill for {len(gaps)} gaps...")

        try:
            async with HyperliquidBackfill(self.db, self.symbol, [self.interval]) as backfill:
                for gap in gaps:
                    gap_start = datetime.fromisoformat(gap['gap_start'].replace('Z', '+00:00'))
                    gap_end = datetime.fromisoformat(gap['gap_end'].replace('Z', '+00:00'))

                    logger.info(f"Backfilling gap: {gap_start} to {gap_end} ({gap['missing_candles']} candles)")

                    candles = await backfill.fetch_candles(self.interval, gap_start, gap_end)

                    if candles:
                        inserted = await self.db.insert_candles(candles)
                        logger.info(f"Backfilled {inserted} candles for gap {gap_start}")

                        # Update custom timeframes and indicators for backfilled data
                        if inserted > 0:
                            await self._update_custom_timeframes(candles)
                    else:
                        logger.warning(f"No candles available for gap {gap_start} to {gap_end}")

        except Exception as e:
            logger.error(f"Backfill failed: {e}")

    def _setup_signal_handlers(self) -> None:
        """Setup handlers for graceful shutdown"""
        def signal_handler(sig, frame):
            logger.info(f"Received signal {sig}, shutting down...")
            asyncio.create_task(self.stop())

        signal.signal(signal.SIGINT, signal_handler)
        signal.signal(signal.SIGTERM, signal_handler)


async def main():
    """Main entry point"""
    collector = DataCollector(
        symbol="BTC",
        interval="1m"
    )

    try:
        await collector.start()
    except KeyboardInterrupt:
        logger.info("Interrupted by user")
    except Exception as e:
        logger.error(f"Fatal error: {type(e).__name__}: {e!r}")
        sys.exit(1)


if __name__ == "__main__":
    asyncio.run(main())
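The staleness check in `_startup_backfill` reduces to simple datetime arithmetic: if the newest stored candle is more than 2 minutes old, the collector backfills from that candle's time to now. A minimal sketch of that arithmetic, with `minutes_behind` as a hypothetical helper name (the project computes this inline):

```python
from datetime import datetime, timedelta, timezone

def minutes_behind(last_candle_time: datetime, now: datetime) -> float:
    """Minutes elapsed since the most recent stored candle."""
    return (now - last_candle_time).total_seconds() / 60

now = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)
last = now - timedelta(minutes=5)

gap = minutes_behind(last, now)
print(gap)      # 5.0
print(gap > 2)  # True -> the collector would backfill from `last` to `now`
```

The 2-minute threshold leaves headroom over the 1m candle interval, so an up-to-date series with one in-flight candle is not treated as a gap.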
160
src/data_collector/simulator.py
Normal file
@ -0,0 +1,160 @@
"""
Simulator
Handles account accounting, leverage, fees, and position management for backtesting.
"""

from dataclasses import dataclass
from typing import Optional, List, Dict, Any
from datetime import datetime
from .brain import Decision  # We might need to decouple this later, but reusing for now


@dataclass
class Trade:
    entry_time: datetime
    exit_time: Optional[datetime]
    side: str  # 'long' or 'short'
    entry_price: float
    exit_price: Optional[float]
    size: float  # Quantity of asset
    leverage: float
    pnl: float = 0.0
    pnl_percent: float = 0.0
    fees: float = 0.0
    status: str = 'open'  # 'open', 'closed'


class Account:
    def __init__(self, initial_balance: float = 1000.0, maker_fee: float = 0.0002, taker_fee: float = 0.0005):
        self.initial_balance = initial_balance
        self.balance = initial_balance
        self.equity = initial_balance
        self.maker_fee = maker_fee
        self.taker_fee = taker_fee
        self.trades: List[Trade] = []
        self.current_position: Optional[Trade] = None
        self.margin_used = 0.0

    def update_equity(self, current_price: float):
        """Update equity based on unrealized PnL of current position"""
        if not self.current_position:
            self.equity = self.balance
            return

        trade = self.current_position
        if trade.side == 'long':
            unrealized_pnl = (current_price - trade.entry_price) * trade.size
        else:
            unrealized_pnl = (trade.entry_price - current_price) * trade.size

        self.equity = self.balance + unrealized_pnl

    def open_position(self, time: datetime, side: str, price: float, leverage: float = 1.0, portion: float = 1.0):
        """
        Open a position.
        portion: 0.0 to 1.0 (percentage of available balance to use)
        """
        if self.current_position:
            # Already have a position, ignore for now (or could add to it)
            return

        # Calculate position size:
        #   Margin = Balance * portion
        #   Position Value = Margin * Leverage
        #   Size = Position Value / Price
        margin_to_use = self.balance * portion
        position_value = margin_to_use * leverage
        size = position_value / price

        # Fee (Taker)
        fee = position_value * self.taker_fee
        self.balance -= fee  # Deduct fee immediately

        self.current_position = Trade(
            entry_time=time,
            exit_time=None,
            side=side,
            entry_price=price,
            exit_price=None,
            size=size,
            leverage=leverage,
            fees=fee
        )
        self.margin_used = margin_to_use

    def close_position(self, time: datetime, price: float):
        """Close the current position"""
        if not self.current_position:
            return

        trade = self.current_position
        position_value = trade.size * price

        # Calculate PnL
        if trade.side == 'long':
            pnl = (price - trade.entry_price) * trade.size
            pnl_pct = (price - trade.entry_price) / trade.entry_price * trade.leverage * 100
        else:
            pnl = (trade.entry_price - price) * trade.size
            pnl_pct = (trade.entry_price - price) / trade.entry_price * trade.leverage * 100

        # Fee (Taker)
        fee = position_value * self.taker_fee
        self.balance -= fee
        trade.fees += fee

        # Update Balance
        self.balance += pnl
        self.margin_used = 0.0

        # Update Trade Record
        trade.exit_time = time
        trade.exit_price = price
        trade.pnl = pnl
        trade.pnl_percent = pnl_pct
        trade.status = 'closed'

        self.trades.append(trade)
        self.current_position = None
        self.equity = self.balance

    def get_position_dict(self) -> Optional[Dict[str, Any]]:
        if not self.current_position:
            return None
        return {
            'type': self.current_position.side,
            'entry_price': self.current_position.entry_price,
            'size': self.current_position.size,
            'leverage': self.current_position.leverage
        }

    def get_stats(self) -> Dict[str, Any]:
        wins = [t for t in self.trades if t.pnl > 0]
        losses = [t for t in self.trades if t.pnl <= 0]

        total_pnl = self.balance - self.initial_balance
        total_pnl_pct = (total_pnl / self.initial_balance) * 100

        return {
            "initial_balance": self.initial_balance,
            "final_balance": self.balance,
            "total_pnl": total_pnl,
            "total_pnl_pct": total_pnl_pct,
            "total_trades": len(self.trades),
            "win_count": len(wins),
            "loss_count": len(losses),
            "win_rate": (len(wins) / len(self.trades) * 100) if self.trades else 0.0,
            "max_drawdown": 0.0,  # TODO: implement DD tracking
            "trades": [
                {
                    "entry_time": t.entry_time.isoformat(),
                    "exit_time": t.exit_time.isoformat() if t.exit_time else None,
                    "side": t.side,
                    "entry_price": t.entry_price,
                    "exit_price": t.exit_price,
                    "pnl": t.pnl,
                    "pnl_pct": t.pnl_percent,
                    "fees": t.fees
                }
                for t in self.trades
            ]
        }
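The fee and PnL bookkeeping in `Account.open_position` / `close_position` can be traced by hand. The following sketch replays a single long round trip with the same formulas (taker fee on both legs, fee deducted from balance immediately); the numbers assume the class defaults of $1000 balance and 0.0005 taker fee, at leverage 1 with the full balance:

```python
initial_balance = 1000.0
taker_fee = 0.0005
leverage = 1.0

# Open a long at $100 using the full balance (portion = 1.0).
entry_price = 100.0
margin = initial_balance * 1.0
position_value = margin * leverage          # 1000.0
size = position_value / entry_price         # 10.0 units
balance = initial_balance - position_value * taker_fee  # entry fee: 0.50

# Close at $110.
exit_price = 110.0
exit_value = size * exit_price              # 1100.0
pnl = (exit_price - entry_price) * size     # 100.0
balance = balance - exit_value * taker_fee + pnl  # exit fee: 0.55

print(round(balance, 2))  # 1098.95
pnl_pct = (exit_price - entry_price) / entry_price * leverage * 100
print(round(pnl_pct, 6))  # 10.0
```

Note that fees are charged on notional position value, not margin, so they scale with leverage even though the PnL percentage is already leverage-adjusted.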
300
src/data_collector/websocket_client.py
Normal file
@ -0,0 +1,300 @@
|
||||
"""
|
||||
Hyperliquid WebSocket Client for cbBTC Data Collection
|
||||
Optimized for Synology DS218+ with automatic reconnection
|
||||
"""
|
||||
|
||||
import asyncio
|
||||
import json
|
||||
import logging
|
||||
from datetime import datetime, timezone
|
||||
from typing import Optional, Dict, Any, Callable, Awaitable, List
|
||||
from dataclasses import dataclass
|
||||
import websockets
|
||||
from websockets.exceptions import ConnectionClosed, InvalidStatusCode
|
||||
from websockets.typing import Data
|
||||
|
||||
|
||||
logger = logging.getLogger(__name__)
|
||||
|
||||
|
||||
@dataclass
|
||||
class Candle:
|
||||
"""Represents a single candlestick"""
|
||||
time: datetime
|
||||
symbol: str
|
||||
interval: str
|
||||
open: float
|
||||
high: float
|
||||
low: float
|
||||
close: float
|
||||
volume: float
|
||||
|
||||
def to_dict(self) -> Dict[str, Any]:
|
||||
return {
|
||||
'time': self.time,
|
||||
'symbol': self.symbol,
|
||||
'interval': self.interval,
|
||||
'open': self.open,
|
||||
'high': self.high,
|
||||
'low': self.low,
|
||||
'close': self.close,
|
||||
'volume': self.volume
|
||||
}
|
||||
|
||||
|
||||
class HyperliquidWebSocket:
|
||||
"""
|
||||
WebSocket client for Hyperliquid exchange
|
||||
Handles connection, reconnection, and candle data parsing
|
||||
"""
|
||||
|
||||
def __init__(
|
||||
self,
|
||||
symbol: str = "BTC",
|
||||
interval: str = "1m",
|
||||
url: str = "wss://api.hyperliquid.xyz/ws",
|
||||
reconnect_delays: Optional[List[int]] = None,
|
||||
on_candle_callback: Optional[Callable[[Candle], Awaitable[None]]] = None,
|
||||
on_error_callback: Optional[Callable[[Exception], Awaitable[None]]] = None
|
||||
):
|
||||
self.symbol = symbol
|
||||
self.interval = interval
|
||||
self.url = url
|
||||
self.reconnect_delays = reconnect_delays or [1, 2, 5, 10, 30, 60, 120, 300, 600, 900]
|
||||
self.on_candle = on_candle_callback
|
||||
self.on_error = on_error_callback
|
||||
|
||||
self.websocket: Optional[websockets.WebSocketClientProtocol] = None
|
||||
self.is_running = False
|
||||
self.reconnect_count = 0
|
||||
self.last_message_time: Optional[datetime] = None
|
||||
self.last_candle_time: Optional[datetime] = None
|
||||
self._should_stop = False
|
||||
|
||||
async def connect(self) -> None:
|
||||
"""Establish WebSocket connection with subscription"""
|
||||
try:
|
||||
logger.info(f"Connecting to Hyperliquid WebSocket: {self.url}")
|
||||
|
||||
self.websocket = await websockets.connect(
|
||||
self.url,
|
||||
ping_interval=None,
|
||||
ping_timeout=None,
|
||||
close_timeout=10
|
||||
)
|
||||
|
||||
# Subscribe to candle data
|
||||
subscribe_msg = {
|
||||
"method": "subscribe",
|
||||
"subscription": {
|
||||
"type": "candle",
|
||||
"coin": self.symbol,
|
||||
"interval": self.interval
|
||||
}
|
||||
}
|
||||
|
||||
await self.websocket.send(json.dumps(subscribe_msg))
|
||||
response = await self.websocket.recv()
|
||||
logger.info(f"Subscription response: {response}")
|
||||
|
||||
self.reconnect_count = 0
|
||||
self.is_running = True
|
||||
logger.info(f"Successfully connected and subscribed to {self.symbol} {self.interval} candles")
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"Failed to connect: {e}")
|
||||
raise
|
||||
|
||||
async def disconnect(self) -> None:
|
||||
"""Gracefully close connection"""
|
||||
self._should_stop = True
|
||||
self.is_running = False
|
||||
if self.websocket:
|
||||
try:
|
||||
await self.websocket.close()
|
||||
logger.info("WebSocket connection closed")
|
||||
except Exception as e:
|
||||
logger.warning(f"Error closing WebSocket: {e}")
|
||||
|
||||
async def receive_loop(self) -> None:
|
||||
"""Main message receiving loop"""
|
||||
while self.is_running and not self._should_stop:
|
||||
try:
|
||||
if not self.websocket:
|
||||
raise ConnectionClosed(None, None)
|
||||
|
||||
message = await self.websocket.recv()
|
||||
self.last_message_time = datetime.now(timezone.utc)
|
||||
|
||||
await self._handle_message(message)
|
||||
|
||||
except ConnectionClosed as e:
|
||||
if self._should_stop:
|
||||
break
|
||||
logger.warning(f"WebSocket connection closed: {e}")
|
||||
await self._handle_reconnect()
|
||||
|
||||
except Exception as e:
|
||||
logger.error(f"Error in receive loop: {e}")
|
||||
if self.on_error:
|
||||
await self.on_error(e)
|
||||
await asyncio.sleep(1)
|
||||
|
||||
async def _handle_message(self, message: Data) -> None:
|
||||
"""Parse and process incoming WebSocket message"""
|
||||
try:
|
||||
# Convert bytes to string if necessary
|
||||
if isinstance(message, bytes):
|
||||
message = message.decode('utf-8')
|
||||
|
||||
data = json.loads(message)
|
||||
|
||||
# Handle subscription confirmation
|
||||
if data.get("channel") == "subscriptionResponse":
|
||||
logger.info(f"Subscription confirmed: {data}")
|
||||
return
|
||||
|
||||
# Handle candle data
|
||||
if data.get("channel") == "candle":
|
||||
candle_data = data.get("data", {})
|
||||
if candle_data:
|
||||
candle = self._parse_candle(candle_data)
|
||||
if candle:
|
||||
self.last_candle_time = candle.time
|
||||
if self.on_candle:
|
||||
await self.on_candle(candle)
|
||||
|
||||
# Handle ping/pong
|
||||
if "ping" in data and self.websocket:
|
||||
await self.websocket.send(json.dumps({"pong": data["ping"]}))
|
||||
|
||||
except json.JSONDecodeError as e:
|
||||
logger.error(f"Failed to parse message: {e}")
|
||||
except Exception as e:
|
||||
logger.error(f"Error handling message: {e}")
|
||||
|
||||
    def _parse_candle(self, data: Any) -> Optional[Candle]:
        """Parse candle data from WebSocket message"""
        try:
            # Hyperliquid candle format: [open, high, low, close, volume, timestamp]
            if isinstance(data, list) and len(data) >= 6:
                open_price = float(data[0])
                high = float(data[1])
                low = float(data[2])
                close = float(data[3])
                volume = float(data[4])
                timestamp_ms = int(data[5])
            elif isinstance(data, dict):
                # New format: {'t': 1770812400000, 'T': ..., 's': 'BTC', 'i': '1m', 'o': '67164.0', 'c': ..., 'h': ..., 'l': ..., 'v': ..., 'n': ...}
                if 't' in data and 'o' in data:
                    open_price = float(data.get("o", 0))
                    high = float(data.get("h", 0))
                    low = float(data.get("l", 0))
                    close = float(data.get("c", 0))
                    volume = float(data.get("v", 0))
                    timestamp_ms = int(data.get("t", 0))
                else:
                    # Old format fallback
                    open_price = float(data.get("open", 0))
                    high = float(data.get("high", 0))
                    low = float(data.get("low", 0))
                    close = float(data.get("close", 0))
                    volume = float(data.get("volume", 0))
                    timestamp_ms = int(data.get("time", 0))
            else:
                logger.warning(f"Unknown candle format: {data}")
                return None

            timestamp = datetime.fromtimestamp(timestamp_ms / 1000, tz=timezone.utc)

            return Candle(
                time=timestamp,
                symbol=self.symbol,
                interval=self.interval,
                open=open_price,
                high=high,
                low=low,
                close=close,
                volume=volume
            )

        except (KeyError, ValueError, TypeError) as e:
            logger.error(f"Failed to parse candle data: {e}, data: {data}")
            return None

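For reference, the dict-shaped messages handled above can be decoded in isolation. This is a minimal standalone sketch of that path, using the `{'t','o','h','l','c','v'}` field layout shown in the format comment; the sample payload values are illustrative, not taken from a live feed:

```python
import json
from datetime import datetime, timezone

def parse_hl_candle(raw: str) -> dict:
    """Decode one dict-format candle message into plain floats.

    Mirrors the 't'/'o'/'h'/'l'/'c'/'v' branch of _parse_candle above:
    string prices become floats, 't' (ms since epoch) becomes a UTC datetime.
    """
    d = json.loads(raw)
    return {
        "time": datetime.fromtimestamp(int(d["t"]) / 1000, tz=timezone.utc),
        "open": float(d["o"]),
        "high": float(d["h"]),
        "low": float(d["l"]),
        "close": float(d["c"]),
        "volume": float(d["v"]),
    }

# Illustrative payload in the same shape as the comment above
msg = '{"t": 1770812400000, "s": "BTC", "i": "1m", "o": "67164.0", "h": "67200.0", "l": "67100.0", "c": "67150.5", "v": "12.3"}'
candle = parse_hl_candle(msg)
print(candle["close"])  # 67150.5
```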
    async def _handle_reconnect(self) -> None:
        """Handle reconnection with exponential backoff"""
        if self._should_stop:
            return

        if self.reconnect_count >= len(self.reconnect_delays):
            logger.error("Max reconnection attempts reached")
            self.is_running = False
            if self.on_error:
                await self.on_error(Exception("Max reconnection attempts reached"))
            return

        delay = self.reconnect_delays[self.reconnect_count]
        self.reconnect_count += 1

        logger.info(f"Reconnecting in {delay} seconds (attempt {self.reconnect_count})...")
        await asyncio.sleep(delay)

        try:
            await self.connect()
        except Exception as e:
            logger.error(f"Reconnection failed: {e}")

    def get_connection_health(self) -> Dict[str, Any]:
        """Return connection health metrics"""
        now = datetime.now(timezone.utc)
        return {
            "is_connected": self.websocket is not None and self.is_running,
            "is_running": self.is_running,
            "reconnect_count": self.reconnect_count,
            "last_message_time": self.last_message_time.isoformat() if self.last_message_time else None,
            "last_candle_time": self.last_candle_time.isoformat() if self.last_candle_time else None,
            "seconds_since_last_message": (now - self.last_message_time).total_seconds() if self.last_message_time else None
        }

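A caller can turn these metrics into a simple liveness check. A hypothetical sketch (the `is_stale` helper and the 60-second silence threshold are illustrative assumptions, not part of this commit; the dict keys match those returned by `get_connection_health`):

```python
from typing import Any, Dict

def is_stale(health: Dict[str, Any], max_silence_s: float = 60.0) -> bool:
    """Treat the feed as stale if disconnected, or silent longer than the
    threshold. Keys follow the get_connection_health() payload above."""
    if not health.get("is_connected"):
        return True
    silence = health.get("seconds_since_last_message")
    return silence is None or silence > max_silence_s

print(is_stale({"is_connected": True, "seconds_since_last_message": 5.0}))    # False
print(is_stale({"is_connected": True, "seconds_since_last_message": 120.0}))  # True
```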
async def test_websocket():
    """Test function for WebSocket client"""
    candles_received = []
    stop_event = asyncio.Event()

    async def on_candle(candle: Candle):
        candles_received.append(candle)
        print(f"Candle: {candle.time} - O:{candle.open} H:{candle.high} L:{candle.low} C:{candle.close} V:{candle.volume}")
        if len(candles_received) >= 5:
            print("Received 5 candles, stopping...")
            stop_event.set()

    client = HyperliquidWebSocket(
        symbol="cbBTC-PERP",
        interval="1m",
        on_candle_callback=on_candle
    )

    receive_task = None
    try:
        await client.connect()
        # Run receive loop in background
        receive_task = asyncio.create_task(client.receive_loop())
        # Wait until enough candles have arrived
        await stop_event.wait()
    except KeyboardInterrupt:
        print("\nStopping...")
    finally:
        # Disconnect once, then let the receive loop wind down
        await client.disconnect()
        if receive_task:
            await receive_task
        print(f"Total candles received: {len(candles_received)}")

if __name__ == "__main__":
    logging.basicConfig(
        level=logging.INFO,
        format='%(asctime)s - %(name)s - %(levelname)s - %(message)s'
    )

    asyncio.run(test_websocket())

68
src/strategies/base.py
Normal file
@ -0,0 +1,68 @@
"""
|
||||
Base Strategy Interface
|
||||
All strategies must inherit from this class.
|
||||
"""
|
||||
|
||||
from abc import ABC, abstractmethod
|
||||
from dataclasses import dataclass
|
||||
from typing import Dict, Any, List, Optional
|
||||
from enum import Enum
|
||||
|
||||
class SignalType(Enum):
|
||||
OPEN_LONG = "open_long"
|
||||
OPEN_SHORT = "open_short"
|
||||
CLOSE_LONG = "close_long"
|
||||
CLOSE_SHORT = "close_short"
|
||||
HOLD = "hold"
|
||||
|
||||
@dataclass
|
||||
class StrategySignal:
|
||||
type: SignalType
|
||||
confidence: float
|
||||
reasoning: str
|
||||
|
||||
class BaseStrategy(ABC):
|
||||
def __init__(self, config: Optional[Dict[str, Any]] = None):
|
||||
self.config = config or {}
|
||||
|
||||
@property
|
||||
@abstractmethod
|
||||
def name(self) -> str:
|
||||
"""Unique identifier for the strategy"""
|
||||
pass
|
||||
|
||||
@property
|
||||
@abstractmethod
|
||||
def required_indicators(self) -> List[str]:
|
||||
"""List of indicator names required by this strategy (e.g. ['ma44'])"""
|
||||
pass
|
||||
|
||||
@property
|
||||
@abstractmethod
|
||||
def display_name(self) -> str:
|
||||
"""User-friendly name for display in UI (e.g. 'MA44 Crossover')"""
|
||||
pass
|
||||
|
||||
@property
|
||||
@abstractmethod
|
||||
def description(self) -> str:
|
||||
"""Detailed description of how the strategy works"""
|
||||
pass
|
||||
|
||||
@abstractmethod
|
||||
def analyze(
|
||||
self,
|
||||
candle: Dict[str, Any],
|
||||
indicators: Dict[str, float],
|
||||
current_position: Optional[Dict[str, Any]] = None
|
||||
) -> StrategySignal:
|
||||
"""
|
||||
Analyze market data and return a trading signal.
|
||||
|
||||
Args:
|
||||
candle: Dictionary containing 'close', 'open', 'high', 'low', 'volume', 'time'
|
||||
indicators: Dictionary of pre-computed indicator values
|
||||
current_position: Details about current open position (if any)
|
||||
{'type': 'long'/'short', 'entry_price': float, 'size': float}
|
||||
"""
|
||||
pass
|
||||
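The contract in `base.py` can be exercised with a toy subclass. The sketch below restates a trimmed version of the base classes inline so it runs standalone; `BuyAndHoldStrategy` is hypothetical and not part of this commit:

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass
from enum import Enum
from typing import Any, Dict, Optional

# Trimmed inline restatement of src/strategies/base.py for a standalone demo
class SignalType(Enum):
    OPEN_LONG = "open_long"
    HOLD = "hold"

@dataclass
class StrategySignal:
    type: SignalType
    confidence: float
    reasoning: str

class BaseStrategy(ABC):
    def __init__(self, config: Optional[Dict[str, Any]] = None):
        self.config = config or {}

    @property
    @abstractmethod
    def name(self) -> str: ...

    @abstractmethod
    def analyze(self, candle, indicators, current_position=None) -> StrategySignal: ...

class BuyAndHoldStrategy(BaseStrategy):
    """Hypothetical strategy: open a long once, then hold forever."""
    @property
    def name(self) -> str:
        return "buy_and_hold"

    def analyze(self, candle, indicators, current_position=None) -> StrategySignal:
        if current_position is None:
            return StrategySignal(SignalType.OPEN_LONG, 1.0, "No position yet")
        return StrategySignal(SignalType.HOLD, 1.0, "Already in position")

s = BuyAndHoldStrategy()
print(s.analyze({"close": 67000.0}, {}).type)  # SignalType.OPEN_LONG
```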
77
src/strategies/ma_strategy.py
Normal file
@ -0,0 +1,77 @@
"""
|
||||
Moving Average Strategy
|
||||
Configurable trend following strategy.
|
||||
- Long when Price > MA(period)
|
||||
- Short when Price < MA(period)
|
||||
"""
|
||||
|
||||
from typing import Dict, Any, List, Optional
|
||||
from .base import BaseStrategy, StrategySignal, SignalType
|
||||
|
||||
class MAStrategy(BaseStrategy):
|
||||
"""
|
||||
Configurable Moving Average Strategy.
|
||||
|
||||
Config:
|
||||
- period: int - MA period (default: 44)
|
||||
"""
|
||||
|
||||
DEFAULT_PERIOD = 44
|
||||
|
||||
@property
|
||||
def name(self) -> str:
|
||||
return "ma_trend"
|
||||
|
||||
@property
|
||||
def required_indicators(self) -> List[str]:
|
||||
# Dynamic based on config
|
||||
period = self.config.get('period', self.DEFAULT_PERIOD)
|
||||
return [f"ma{period}"]
|
||||
|
||||
@property
|
||||
def display_name(self) -> str:
|
||||
return "MA Strategy"
|
||||
|
||||
@property
|
||||
def description(self) -> str:
|
||||
return "Configurable Moving Average strategy. Parameters: period (5-500, default: 44). Goes long when price > MA(period), short when price < MA(period). Optional multi-timeframe trend filter available."
|
||||
|
||||
def analyze(
|
||||
self,
|
||||
candle: Dict[str, Any],
|
||||
indicators: Dict[str, float],
|
||||
current_position: Optional[Dict[str, Any]] = None
|
||||
) -> StrategySignal:
|
||||
|
||||
period = self.config.get('period', self.DEFAULT_PERIOD)
|
||||
ma_key = f"ma{period}"
|
||||
|
||||
price = candle['close']
|
||||
ma_value = indicators.get(ma_key)
|
||||
|
||||
if ma_value is None:
|
||||
return StrategySignal(SignalType.HOLD, 0.0, f"MA{period} not available")
|
||||
|
||||
# Current position state
|
||||
is_long = current_position and current_position.get('type') == 'long'
|
||||
is_short = current_position and current_position.get('type') == 'short'
|
||||
|
||||
# Logic: Price > MA -> Bullish
|
||||
if price > ma_value:
|
||||
if is_long:
|
||||
return StrategySignal(SignalType.HOLD, 1.0, f"Price {price:.2f} > MA{period} {ma_value:.2f}. Stay Long.")
|
||||
elif is_short:
|
||||
return StrategySignal(SignalType.CLOSE_SHORT, 1.0, f"Price {price:.2f} crossed above MA{period} {ma_value:.2f}. Close Short.")
|
||||
else:
|
||||
return StrategySignal(SignalType.OPEN_LONG, 1.0, f"Price {price:.2f} > MA{period} {ma_value:.2f}. Open Long.")
|
||||
|
||||
# Logic: Price < MA -> Bearish
|
||||
elif price < ma_value:
|
||||
if is_short:
|
||||
return StrategySignal(SignalType.HOLD, 1.0, f"Price {price:.2f} < MA{period} {ma_value:.2f}. Stay Short.")
|
||||
elif is_long:
|
||||
return StrategySignal(SignalType.CLOSE_LONG, 1.0, f"Price {price:.2f} crossed below MA{period} {ma_value:.2f}. Close Long.")
|
||||
else:
|
||||
return StrategySignal(SignalType.OPEN_SHORT, 1.0, f"Price {price:.2f} < MA{period} {ma_value:.2f}. Open Short.")
|
||||
|
||||
return StrategySignal(SignalType.HOLD, 0.0, f"Price == MA{period}")
|
||||
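The signal matrix implemented by `analyze()` (price side relative to the MA, crossed with the current position) can be checked in isolation. A standalone restatement of that decision logic, using plain strings in place of the `SignalType` enum:

```python
from typing import Optional

def ma_signal(price: float, ma: float, position: Optional[str]) -> str:
    """Restates MAStrategy.analyze(): price above the MA is bullish,
    below is bearish; exact equality yields a hold."""
    if price > ma:
        if position == "long":
            return "hold"          # already aligned with the trend
        if position == "short":
            return "close_short"   # crossed above: exit the short
        return "open_long"
    if price < ma:
        if position == "short":
            return "hold"
        if position == "long":
            return "close_long"    # crossed below: exit the long
        return "open_short"
    return "hold"                  # price exactly on the MA

print(ma_signal(67200.0, 67000.0, None))    # open_long
print(ma_signal(66800.0, 67000.0, "long"))  # close_long
```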
52
start_dev.cmd
Normal file
@ -0,0 +1,52 @@
@echo off
echo ===================================
echo  BTC Trading Dashboard - Development Server
echo ===================================
echo.

REM Check if venv exists
if not exist "venv\Scripts\activate.bat" (
    echo [ERROR] Virtual environment not found!
    echo Please run setup first to create the venv.
    echo.
    pause
    exit /b 1
)

REM Activate venv
call venv\Scripts\activate.bat

REM Check dependencies
echo [1/3] Checking dependencies...
pip show fastapi >nul 2>&1
if %errorlevel% neq 0 (
    echo Installing dependencies...
    pip install -r requirements.txt
    REM "if errorlevel 1" checks at run time; %errorlevel% inside a
    REM parenthesized block would expand before pip install runs
    if errorlevel 1 (
        echo [ERROR] Failed to install dependencies
        pause
        exit /b 1
    )
)

echo [2/3] Testing database connection...
python test_db.py
if %errorlevel% neq 0 (
    echo [WARNING] Database connection test failed
    echo Press Ctrl+C to cancel or any key to continue...
    pause >nul
)

echo [3/3] Starting development server...
echo.
echo ===================================
echo  Server will start at:
echo  - API Docs:  http://localhost:8000/docs
echo  - Dashboard: http://localhost:8000/dashboard
echo  - Health:    http://localhost:8000/api/v1/health
echo ===================================
echo.
echo Press Ctrl+C to stop the server
echo.

uvicorn src.api.server:app --reload --host 0.0.0.0 --port 8000

48
start_dev.sh
Normal file
@ -0,0 +1,48 @@
#!/bin/bash

echo "==================================="
echo " BTC Trading Dashboard - Development Server"
echo "==================================="
echo ""

# Check if venv exists
if [ ! -d "venv" ]; then
    echo "[ERROR] Virtual environment not found!"
    echo "Please run setup first to create the venv."
    exit 1
fi

# Activate venv
source venv/bin/activate

# Check dependencies
echo "[1/3] Checking dependencies..."
if ! pip show fastapi > /dev/null 2>&1; then
    echo "Installing dependencies..."
    pip install -r requirements.txt
    if [ $? -ne 0 ]; then
        echo "[ERROR] Failed to install dependencies"
        exit 1
    fi
fi

echo "[2/3] Testing database connection..."
python test_db.py
if [ $? -ne 0 ]; then
    echo "[WARNING] Database connection test failed"
    read -p "Press Enter to continue or Ctrl+C to cancel..."
fi

echo "[3/3] Starting development server..."
echo ""
echo "==================================="
echo " Server will start at:"
echo " - API Docs:  http://localhost:8000/docs"
echo " - Dashboard: http://localhost:8000/dashboard"
echo " - Health:    http://localhost:8000/api/v1/health"
echo "==================================="
echo ""
echo "Press Ctrl+C to stop the server"
echo ""

uvicorn src.api.server:app --reload --host 0.0.0.0 --port 8000

63
test_db.py
Normal file
@ -0,0 +1,63 @@
import asyncio
import os
from dotenv import load_dotenv
import asyncpg

load_dotenv()


async def test_db_connection():
    """Test database connection"""
    try:
        conn = await asyncpg.connect(
            host=os.getenv('DB_HOST'),
            port=int(os.getenv('DB_PORT', 5432)),
            database=os.getenv('DB_NAME'),
            user=os.getenv('DB_USER'),
            password=os.getenv('DB_PASSWORD'),
        )

        version = await conn.fetchval('SELECT version()')
        print("[OK] Database connected successfully!")
        print(f"     Host: {os.getenv('DB_HOST')}:{os.getenv('DB_PORT')}")
        print(f"     Database: {os.getenv('DB_NAME')}")
        print(f"     User: {os.getenv('DB_USER')}")
        print(f"     PostgreSQL: {version[:50]}...")

        # Check if tables exist
        tables = await conn.fetch("""
            SELECT table_name FROM information_schema.tables
            WHERE table_schema = 'public'
            ORDER BY table_name
        """)

        table_names = [row['table_name'] for row in tables]
        print(f"\n[OK] Found {len(table_names)} tables:")
        for table in table_names:
            print(f"     - {table}")

        # Check candles count
        if 'candles' in table_names:
            count = await conn.fetchval('SELECT COUNT(*) FROM candles')
            latest_time = await conn.fetchval("""
                SELECT MAX(time) FROM candles
                WHERE time > NOW() - INTERVAL '7 days'
            """)
            print(f"\n[OK] Candles table has {count} total records")
            if latest_time:
                print(f"     Latest candle (last 7 days): {latest_time}")

        await conn.close()
        return True

    except Exception as e:
        print("[FAIL] Database connection failed:")
        print(f"       Error: {e}")
        print("\nCheck:")
        print(f"  1. NAS is reachable at {os.getenv('DB_HOST')}:{os.getenv('DB_PORT')}")
        print("  2. PostgreSQL is running")
        print(f"  3. Database '{os.getenv('DB_NAME')}' exists")
        print(f"  4. User '{os.getenv('DB_USER')}' has access")
        return False


if __name__ == '__main__':
    asyncio.run(test_db_connection())