feat: add florida module for unified hedging and monitoring
florida/tools/README_GIT_AGENT.md  (new file, 262 lines)
@@ -0,0 +1,262 @@
# Git Agent for Uniswap Auto CLP

## Overview

Automated backup and version control system for your Uniswap Auto CLP trading bot.

## Quick Setup

### 1. Initialize Repository

```bash
# Navigate to the project directory
cd K:\Projects\uniswap_auto_clp

# Create the initial commit
python tools\git_agent.py --init

# Add and push the initial setup
git add .
git commit -m "🎯 Initial commit: Uniswap Auto CLP system"
git remote add origin https://git.kapuscinski.pl/ditus/uniswap_auto_clp.git
git push -u origin main
```

### 2. Create First Backup

```bash
# Test backup creation
python tools\git_agent.py --backup
```

### 3. Check Status

```bash
# View current status
python tools\git_agent.py --status
```

## Configuration

Edit `tools/agent_config.json` as needed:

```json
{
  "backup": {
    "enabled": true,
    "frequency_hours": 1,
    "keep_max_count": 100,
    "push_to_remote": true
  }
}
```

## Usage Commands

### Manual Operations

```bash
# Create a backup now
python tools\git_agent.py --backup

# Check status
python tools\git_agent.py --status

# Clean up old backups
python tools\git_agent.py --cleanup

# Initialize the repository (one-time)
python tools\git_agent.py --init
```

### Automated Scheduling

#### Windows Task Scheduler

```powershell
# Create an hourly task (use an absolute path; scheduled tasks do not start in the project directory)
schtasks /create /tn "Git Backup" /tr "cmd /c cd /d K:\Projects\uniswap_auto_clp && python tools\git_agent.py --backup" /sc hourly
```

#### Linux Cron (if needed)

```bash
# Add to crontab
0 * * * * cd /path/to/project && python tools/git_agent.py --backup
```

## How It Works

### Branch Strategy

- **main branch**: Your manual development (you control pushes)
- **backup-* branches**: Automatic hourly backups (agent managed)

### Backup Process

1. **Hourly**: Agent checks for file changes
2. **Creates backup branch**: Named `backup-YYYY-MM-DD-HH`
3. **Commits changes**: With detailed file and parameter tracking
4. **Pushes to remote**: Automatic backup to Gitea
5. **Cleans up**: Keeps only the last 100 backups

### Backup Naming

```
backup-2025-01-15-14   # 2 PM backup on Jan 15, 2025
backup-2025-01-15-15   # 3 PM backup
backup-2025-01-15-16   # 4 PM backup
```

### Commit Messages

The agent creates detailed commit messages (see the example below) showing:

- Files changed, with status icons
- Parameter changes, with percentage differences
- Security validation confirmation
- Timestamp and backup number
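For illustration, a single backup commit message looks roughly like this (file names, icons, and counts are invented; the exact icons come from a formatter helper not shown in full here):

```
backup-2025-01-15-14: Automated backup - 1 files changed

📋 CHANGES DETECTED:
├── PYTHON (1 files)
│   ├── ✏️ bot.py (+12/-4)
├── 📊 PARAMETER CHANGES
│   ├── 📄 bot.py
│   │   ├── RANGE_WIDTH_PCT: 0.05 → 0.06 ↗️ +20.0%
├── 🔒 SECURITY VALIDATION
│   ├── .env files: Correctly excluded
│   ├── *.log files: Correctly excluded
│   └── No secrets detected in staged files

⏰ TIMESTAMP: 2025-01-15 14:00:00 UTC
💾 BACKUP #47/100
🤖 Generated by Git Agent
```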
## Security

### What's Excluded

✅ Private keys and tokens (`.env` files)
✅ Log files (`*.log`)
✅ State files (`hedge_status.json`)
✅ Temporary files

### What's Included

✅ All code changes
✅ Configuration modifications
✅ Documentation updates
✅ Parameter tracking

## Emergency Recovery

### Quick Rollback

```bash
# List recent backups
python tools\git_agent.py --status

# Copy the backup's files onto main
git checkout main
git checkout backup-2025-01-15-14 -- .

git commit -m "🔄 Emergency restore from backup-2025-01-15-14"
git push origin main
```

### File Recovery

```bash
# Restore a specific file from a backup
git checkout backup-2025-01-15-14 -- path/to/file.py
```

## Monitoring

### Backup Health

```bash
# Check backup count and status
python tools\git_agent.py --status

# Expected output:
# 📊 Git Agent Status:
# Current Branch: main
# Backup Count: 47
# Has Changes: false
# Remote Connected: true
# Last Backup: backup-2025-01-15-16
```

### Manual Cleanup

```bash
# Remove old backups (keeps the last 100)
python tools\git_agent.py --cleanup
```

## Troubleshooting

### Common Issues

#### "Configuration file not found"

```bash
# Ensure agent_config.json exists in the tools/ directory
ls tools/agent_config.json
```

#### "Git command failed"

```bash
# Check the Git installation and repository status
git status
git --version
```

#### "Remote connection failed"

```bash
# Verify the Gitea URL and credentials
git remote -v
ping git.kapuscinski.pl
```

### Debug Mode

Edit `agent_config.json`:

```json
{
  "logging": {
    "enabled": true,
    "log_level": "DEBUG"
  }
}
```

Then check `git_agent.log` in the project root.

## Integration with Trading Bot

### Parameter Changes

The agent automatically tracks changes to the following constants (an illustrative snippet follows the list):

- `TARGET_INVESTMENT_VALUE_USDC`
- `RANGE_WIDTH_PCT`
- `SLIPPAGE_TOLERANCE`
- `LEVERAGE`
- `CHECK_INTERVAL`
- `PRICE_BUFFER_PCT`
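For example, assignments written in this exact form are what the change detector's regexes match in a diff (the values below are hypothetical):

```python
from decimal import Decimal

# Matched by the agent's parameter-tracking patterns
RANGE_WIDTH_PCT = Decimal("0.05")   # e.g. changed from 0.04 → reported as +25.0%
LEVERAGE = 2                        # plain integer parameters are matched too
```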
### Backup Triggers

Consider manual backups when:

- Changing trading strategy parameters
- Updating risk management settings
- Before major system changes
- After successful backtesting

```bash
# Manual backup before important changes
python tools\git_agent.py --backup
```

## Best Practices

### Development Workflow

1. **Work on the main branch** for normal development
2. **Commit manually** for your own changes
3. **Let the agent handle backups** automatically
4. **Push to main manually** when ready

### Backup Management

- **100-backup limit** = ~4 days of hourly coverage
- **Automatic cleanup** keeps the repository size in check
- **Remote storage** provides an offsite backup

### Security Reminders

- **Never commit private keys** (automatically excluded)
- **Check .gitignore** before adding sensitive files
- **Review backup commits** for accidental secrets

## Support

### Log Files

- `git_agent.log`: Agent activity and errors
- Check the logs when troubleshooting issues

### Repository Structure

```
tools/
├── git_agent.py          # Main automation script
├── agent_config.json     # Configuration settings
├── git_utils.py          # Git operations
├── backup_manager.py     # Backup branch logic
├── change_detector.py    # Change analysis
├── cleanup_manager.py    # Backup rotation
└── commit_formatter.py   # Message formatting
```

This automated backup system ensures your trading bot code is always versioned and recoverable, while keeping your main development workflow clean and manual.
florida/tools/agent_config.json  (new file, 35 lines)
@@ -0,0 +1,35 @@
{
  "gitea": {
    "server_url": "https://git.kapuscinski.pl",
    "username": "ditus",
    "repository": "uniswap_auto_clp",
    "token": "b24fc3203597b2bdcb2f2da6634c618"
  },
  "backup": {
    "enabled": true,
    "frequency_hours": 1,
    "branch_prefix": "backup-",
    "push_to_remote": true,
    "keep_max_count": 100,
    "cleanup_with_backup": true,
    "detailed_commit_messages": true
  },
  "main_branch": {
    "manual_pushes_only": true,
    "auto_commits": false,
    "protect_from_agent": true,
    "name": "main"
  },
  "change_tracking": {
    "method": "commit_message",
    "include_file_diffs": true,
    "track_parameter_changes": true,
    "format": "detailed",
    "security_validation": false
  },
  "logging": {
    "enabled": true,
    "log_file": "git_agent.log",
    "log_level": "INFO"
  }
}
florida/tools/backup_manager.py  (new file, 89 lines)
@@ -0,0 +1,89 @@
#!/usr/bin/env python3
"""
Backup Manager for Git Agent
Handles backup branch creation and management
"""

import os
import subprocess
import logging
from datetime import datetime, timezone
from typing import Dict, Any, Optional


class BackupManager:
    """Manages backup branch operations"""

    def __init__(self, config: Dict[str, Any], logger: logging.Logger):
        self.config = config
        self.logger = logger
        self.backup_config = config.get('backup', {})
        self.prefix = self.backup_config.get('branch_prefix', 'backup-')

    def create_backup_branch(self) -> Optional[str]:
        """Create a new backup branch with a UTC timestamp"""
        timestamp = datetime.now(timezone.utc)
        branch_name = f"{self.prefix}{timestamp.strftime('%Y-%m-%d-%H')}"

        # The project root is one level above tools/
        current_dir = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))

        try:
            # Create and check out the new branch
            result = subprocess.run(
                ['git', 'checkout', '-b', branch_name],
                cwd=current_dir,
                capture_output=True,
                text=True,
                check=False
            )

            if result.returncode == 0:
                self.logger.info(f"✅ Created backup branch: {branch_name}")
                return branch_name

            # The branch might already exist; just check it out
            result = subprocess.run(
                ['git', 'checkout', branch_name],
                cwd=current_dir,
                capture_output=True,
                text=True,
                check=False
            )

            if result.returncode == 0:
                self.logger.info(f"✅ Using existing backup branch: {branch_name}")
                return branch_name

            self.logger.error(f"❌ Failed to create/checkout backup branch: {result.stderr}")
            return None

        except Exception as e:
            self.logger.error(f"❌ Exception creating backup branch: {e}")
            return None

    def get_backup_count(self) -> int:
        """Get the current number of backup branches"""
        current_dir = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
        try:
            result = subprocess.run(
                ['git', 'branch', '-a'],
                cwd=current_dir,
                capture_output=True,
                text=True,
                check=False
            )

            if result.returncode == 0:
                branches = result.stdout.strip().split('\n')
                backup_branches = [
                    b.strip().replace('* ', '').replace('remotes/origin/', '')
                    for b in branches
                    if b.strip() and self.prefix in b
                ]
                return len(backup_branches)

        except Exception as e:
            self.logger.error(f"❌ Error counting backup branches: {e}")

        return 0
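The orchestrating `git_agent.py` entry point referenced in the README is not part of this commit, so the wiring below is only a minimal sketch of how `BackupManager` is presumably driven, assuming the config file above; it is not the actual entry point:

```python
# Hypothetical wiring sketch -- git_agent.py itself is not included in this commit.
import json
import logging

from backup_manager import BackupManager

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("git_agent")

with open("tools/agent_config.json") as f:
    config = json.load(f)

manager = BackupManager(config, logger)
branch = manager.create_backup_branch()   # e.g. "backup-2025-01-15-14"
print(branch, manager.get_backup_count())
```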
florida/tools/change_detector.py  (new file, 230 lines)
@@ -0,0 +1,230 @@
#!/usr/bin/env python3
"""
Change Detector for Git Agent
Detects and analyzes file changes for detailed commit messages
"""

import os
import re
import subprocess
import logging
from typing import Dict, Any, List


class ChangeDetector:
    """Detects and categorizes file changes"""

    def __init__(self, config: Dict[str, Any], logger: logging.Logger):
        self.config = config
        self.logger = logger
        self.project_root = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))

    def detect_changes(self) -> Dict[str, Any]:
        """Detect all changes in the repository"""
        try:
            # Get the changed files
            changed_files = self._get_changed_files()

            if not changed_files:
                return {
                    'has_changes': False,
                    'files': [],
                    'categories': {},
                    'parameter_changes': {}
                }

            # Analyze the changes
            file_details = []
            categories = {
                'python': [],
                'config': [],
                'docs': [],
                'other': []
            }
            parameter_changes = {}

            for file_path in changed_files:
                details = self._analyze_file_changes(file_path)
                file_details.append(details)

                # Categorize the file
                category = self._categorize_file(file_path)
                categories[category].append(details)

                # Track parameter changes for Python files
                if category == 'python':
                    params = self._extract_parameter_changes(file_path, details.get('diff', ''))
                    if params:
                        parameter_changes[file_path] = params

            return {
                'has_changes': True,
                'files': file_details,
                'categories': categories,
                'parameter_changes': parameter_changes
            }

        except Exception as e:
            self.logger.error(f"❌ Error detecting changes: {e}")
            return {
                'has_changes': False,
                'files': [],
                'categories': {},
                'parameter_changes': {},
                'error': str(e)
            }

    def _get_changed_files(self) -> List[str]:
        """Get the list of changed files using git status"""
        try:
            result = subprocess.run(
                ['git', 'status', '--porcelain'],
                cwd=self.project_root,
                capture_output=True,
                text=True,
                check=False
            )

            if result.returncode != 0:
                return []

            files = []
            for line in result.stdout.splitlines():
                if line.strip():
                    # Porcelain format is "XY <path>": two status columns, a space, then the path
                    filename = line[3:].strip() if len(line) > 3 else line.strip()
                    if filename and filename not in ['.git', '__pycache__']:
                        files.append(filename)

            return files

        except Exception as e:
            self.logger.error(f"Error getting changed files: {e}")
            return []

    def _analyze_file_changes(self, file_path: str) -> Dict[str, Any]:
        """Analyze changes for a specific file"""
        try:
            # Get the diff
            result = subprocess.run(
                ['git', 'diff', '--', file_path],
                cwd=self.project_root,
                capture_output=True,
                text=True,
                check=False
            )

            diff = result.stdout if result.returncode == 0 else ''

            # Get the file status
            status_result = subprocess.run(
                ['git', 'status', '--porcelain', '--', file_path],
                cwd=self.project_root,
                capture_output=True,
                text=True,
                check=False
            )

            status = 'modified'
            if status_result.returncode == 0 and status_result.stdout.strip():
                # Inspect both status columns; "??" is two characters
                status_code = status_result.stdout.strip()[:2]
                if status_code == '??':
                    status = 'untracked'
                elif status_code.startswith('A'):
                    status = 'added'
                elif status_code.startswith('D'):
                    status = 'deleted'

            # Count changed lines (a rough heuristic that excludes the +++/--- header lines)
            lines_added = diff.count('\n+') - diff.count('\n++')
            lines_deleted = diff.count('\n-') - diff.count('\n--')

            return {
                'path': file_path,
                'status': status,
                'lines_added': max(0, lines_added),
                'lines_deleted': max(0, lines_deleted),
                'diff': diff
            }

        except Exception as e:
            self.logger.error(f"Error analyzing {file_path}: {e}")
            return {
                'path': file_path,
                'status': 'error',
                'lines_added': 0,
                'lines_deleted': 0,
                'diff': '',
                'error': str(e)
            }

    def _categorize_file(self, file_path: str) -> str:
        """Categorize the file type"""
        if file_path.endswith('.py'):
            return 'python'
        elif file_path.endswith(('.json', '.yaml', '.yml', '.toml', '.ini')):
            return 'config'
        elif file_path.endswith(('.md', '.txt', '.rst')):
            return 'docs'
        else:
            return 'other'

    def _extract_parameter_changes(self, file_path: str, diff: str) -> Dict[str, Any]:
        """Extract parameter changes from Python files"""
        if not diff or not file_path.endswith('.py'):
            return {}

        parameters = {}

        # Common trading bot parameters to track
        param_patterns = {
            'TARGET_INVESTMENT_VALUE_USDC': r'(TARGET_INVESTMENT_VALUE_USDC)\s*=\s*(\d+)',
            'RANGE_WIDTH_PCT': r'(RANGE_WIDTH_PCT)\s*=\s*Decimal\("([^"]+)"\)',
            'SLIPPAGE_TOLERANCE': r'(SLIPPAGE_TOLERANCE)\s*=\s*Decimal\("([^"]+)"\)',
            'LEVERAGE': r'(LEVERAGE)\s*=\s*(\d+)',
            'MIN_THRESHOLD_ETH': r'(MIN_THRESHOLD_ETH)\s*=\s*Decimal\("([^"]+)"\)',
            'CHECK_INTERVAL': r'(CHECK_INTERVAL)\s*=\s*(\d+)',
            'PRICE_BUFFER_PCT': r'(PRICE_BUFFER_PCT)\s*=\s*Decimal\("([^"]+)"\)'
        }

        for param_name, pattern in param_patterns.items():
            matches = re.findall(pattern, diff)
            if matches:
                # Collect the matched values; in a diff the removed (old) value
                # appears before the added (new) one
                values = []
                for match in matches:
                    if isinstance(match, tuple):
                        values.append(match[1] if len(match) > 1 else match[0])
                    else:
                        values.append(match)

                if len(values) >= 2:
                    old_val = values[0]
                    new_val = values[-1]  # The last value is the current one

                    # Calculate the percentage change for numeric values
                    try:
                        if '.' in old_val or '.' in new_val:
                            old_num = float(old_val)
                            new_num = float(new_val)
                        else:
                            old_num = int(old_val)
                            new_num = int(new_val)
                        if old_num != 0:
                            pct_change = ((new_num - old_num) / abs(old_num)) * 100
                        else:
                            pct_change = 0
                    except (ValueError, ZeroDivisionError):
                        pct_change = 0

                    parameters[param_name] = {
                        'old': old_val,
                        'new': new_val,
                        'pct_change': round(pct_change, 1)
                    }

        return parameters
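As a rough illustration of the data structure this module produces (all field values below are invented), `detect_changes()` returns a dictionary shaped like:

```python
# Hypothetical return value of ChangeDetector.detect_changes()
changes = {
    'has_changes': True,
    'files': [{'path': 'bot.py', 'status': 'modified',
               'lines_added': 12, 'lines_deleted': 4, 'diff': '...'}],
    'categories': {'python': ['...'], 'config': [], 'docs': [], 'other': []},
    'parameter_changes': {
        'bot.py': {'LEVERAGE': {'old': '2', 'new': '3', 'pct_change': 50.0}}
    }
}
```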
florida/tools/cleanup_manager.py  (new file, 153 lines)
@@ -0,0 +1,153 @@
#!/usr/bin/env python3
"""
Cleanup Manager for Git Agent
Manages backup branch rotation (keep the last 100)
"""

import os
import subprocess
import logging
from typing import Dict, Any, List


class CleanupManager:
    """Manages backup branch cleanup and rotation"""

    def __init__(self, config: Dict[str, Any], logger: logging.Logger):
        self.config = config
        self.logger = logger
        self.backup_config = config.get('backup', {})
        self.prefix = self.backup_config.get('branch_prefix', 'backup-')
        self.max_backups = self.backup_config.get('keep_max_count', 100)
        self.project_root = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))

    def cleanup_old_backups(self) -> bool:
        """Clean up old backup branches so that only the last N remain"""
        try:
            # Get all backup branches
            backup_branches = self._get_backup_branches()

            if len(backup_branches) <= self.max_backups:
                self.logger.info(f"✅ Backup count ({len(backup_branches)}) within limit ({self.max_backups})")
                return False  # No cleanup needed

            # Branches to delete (the oldest ones)
            branches_to_delete = backup_branches[self.max_backups:]

            if not branches_to_delete:
                return False

            self.logger.info(f"🧹 Cleaning up {len(branches_to_delete)} old backup branches")

            deleted_count = 0
            for branch in branches_to_delete:
                # Delete the local branch first, then its remote counterpart
                if self._delete_local_branch(branch):
                    if self._delete_remote_branch(branch):
                        deleted_count += 1
                        self.logger.debug(f"   ✅ Deleted: {branch}")
                    else:
                        self.logger.warning(f"   ⚠️ Local deleted, remote failed: {branch}")
                else:
                    self.logger.warning(f"   ❌ Failed to delete: {branch}")

            if deleted_count > 0:
                self.logger.info(f"✅ Cleanup completed: deleted {deleted_count} old backup branches")
                return True
            else:
                self.logger.warning("⚠️ No branches were successfully deleted")
                return False

        except Exception as e:
            self.logger.error(f"❌ Cleanup failed: {e}")
            return False

    def _get_backup_branches(self) -> List[str]:
        """Get all backup branches sorted by timestamp (newest first)"""
        try:
            result = subprocess.run(
                ['git', 'branch', '-a'],
                cwd=self.project_root,
                capture_output=True,
                text=True,
                check=False
            )

            if result.returncode != 0:
                return []

            branches = []
            for line in result.stdout.strip().split('\n'):
                if line.strip():
                    # Clean up the branch name
                    branch = line.strip().replace('* ', '').replace('remotes/origin/', '')
                    if branch.startswith(self.prefix):
                        branches.append(branch)

            # Sort by the timestamp embedded in the branch name
            # (format backup-YYYY-MM-DD-HH, so string order equals time order)
            branches.sort(key=lambda x: x.replace(self.prefix, ''), reverse=True)
            return branches

        except Exception as e:
            self.logger.error(f"Error getting backup branches: {e}")
            return []

    def _delete_local_branch(self, branch_name: str) -> bool:
        """Delete a local branch"""
        try:
            result = subprocess.run(
                ['git', 'branch', '-D', branch_name],
                cwd=self.project_root,
                capture_output=True,
                text=True,
                check=False
            )

            if result.returncode == 0:
                return True
            self.logger.debug(f"Local delete failed for {branch_name}: {result.stderr}")
            return False

        except Exception as e:
            self.logger.error(f"Exception deleting local branch {branch_name}: {e}")
            return False

    def _delete_remote_branch(self, branch_name: str) -> bool:
        """Delete a remote branch"""
        try:
            result = subprocess.run(
                ['git', 'push', 'origin', '--delete', branch_name],
                cwd=self.project_root,
                capture_output=True,
                text=True,
                check=False
            )

            if result.returncode == 0:
                return True
            # The branch might already be deleted remotely; that's fine
            if "not found" in result.stderr.lower() or "does not exist" in result.stderr.lower():
                return True
            self.logger.debug(f"Remote delete failed for {branch_name}: {result.stderr}")
            return False

        except Exception as e:
            self.logger.error(f"Exception deleting remote branch {branch_name}: {e}")
            return False

    def get_cleanup_stats(self) -> Dict[str, Any]:
        """Get statistics about backup cleanup"""
        backup_branches = self._get_backup_branches()
        current_count = len(backup_branches)

        return {
            'current_backup_count': current_count,
            'max_allowed': self.max_backups,
            'cleanup_needed': current_count > self.max_backups,
            'branches_to_delete': max(0, current_count - self.max_backups),
            'newest_backup': backup_branches[0] if backup_branches else None,
            'oldest_backup': backup_branches[-1] if backup_branches else None
        }
florida/tools/collect_fees_v2 copy.py  (new file, 325 lines)
@@ -0,0 +1,325 @@
(The 325 lines of this file are byte-for-byte identical to florida/tools/collect_fees_v2.py below; the duplicate listing is omitted here.)
florida/tools/collect_fees_v2.py  (new file, 325 lines)
@@ -0,0 +1,325 @@
#!/usr/bin/env python3
"""
Fee Collection & Position Recovery Script
Collects all accumulated fees from Uniswap V3 positions

Usage:
    python collect_fees_v2.py            # process every position in hedge_status.json
    python collect_fees_v2.py --id 123   # process a single position by token ID
"""

import os
import sys
import json
import time
import argparse

# Required libraries
try:
    from web3 import Web3
    from eth_account import Account
except ImportError as e:
    print(f"[ERROR] Missing required library: {e}")
    print("Please install with: pip install web3 eth-account python-dotenv")
    sys.exit(1)

try:
    from dotenv import load_dotenv
except ImportError:
    print("[WARNING] python-dotenv not found, using environment variables directly")
    def load_dotenv(override=True):
        pass


def setup_logging():
    """Set up logging for fee collection"""
    import logging
    logging.basicConfig(
        level=logging.INFO,
        format='%(asctime)s - %(levelname)s - %(message)s',
        handlers=[
            logging.StreamHandler(),
            logging.FileHandler('collect_fees.log', encoding='utf-8')
        ]
    )
    return logging.getLogger(__name__)


logger = setup_logging()

# --- Contract ABIs ---
NONFUNGIBLE_POSITION_MANAGER_ABI = json.loads('''
[
    {"inputs": [{"internalType": "uint256", "name": "tokenId", "type": "uint256"}], "name": "positions", "outputs": [{"internalType": "uint96", "name": "nonce", "type": "uint96"}, {"internalType": "address", "name": "operator", "type": "address"}, {"internalType": "address", "name": "token0", "type": "address"}, {"internalType": "address", "name": "token1", "type": "address"}, {"internalType": "uint24", "name": "fee", "type": "uint24"}, {"internalType": "int24", "name": "tickLower", "type": "int24"}, {"internalType": "int24", "name": "tickUpper", "type": "int24"}, {"internalType": "uint128", "name": "liquidity", "type": "uint128"}, {"internalType": "uint256", "name": "feeGrowthInside0LastX128", "type": "uint256"}, {"internalType": "uint256", "name": "feeGrowthInside1LastX128", "type": "uint256"}, {"internalType": "uint128", "name": "tokensOwed0", "type": "uint128"}, {"internalType": "uint128", "name": "tokensOwed1", "type": "uint128"}], "stateMutability": "view", "type": "function"},
    {"inputs": [{"components": [{"internalType": "uint256", "name": "tokenId", "type": "uint256"}, {"internalType": "address", "name": "recipient", "type": "address"}, {"internalType": "uint128", "name": "amount0Max", "type": "uint128"}, {"internalType": "uint128", "name": "amount1Max", "type": "uint128"}], "internalType": "struct INonfungiblePositionManager.CollectParams", "name": "params", "type": "tuple"}], "name": "collect", "outputs": [{"internalType": "uint256", "name": "amount0", "type": "uint256"}, {"internalType": "uint256", "name": "amount1", "type": "uint256"}], "stateMutability": "payable", "type": "function"}
]
''')

ERC20_ABI = json.loads('''
[
    {"inputs": [], "name": "decimals", "outputs": [{"internalType": "uint8", "name": "", "type": "uint8"}], "stateMutability": "view", "type": "function"},
    {"inputs": [], "name": "symbol", "outputs": [{"internalType": "string", "name": "", "type": "string"}], "stateMutability": "view", "type": "function"},
    {"inputs": [{"internalType": "address", "name": "account", "type": "address"}], "name": "balanceOf", "outputs": [{"internalType": "uint256", "name": "", "type": "uint256"}], "stateMutability": "view", "type": "function"}
]
''')


def load_status_file():
    """Load the hedge status file"""
    status_file = "hedge_status.json"
    if not os.path.exists(status_file):
        logger.error(f"Status file {status_file} not found")
        return []

    try:
        with open(status_file, 'r') as f:
            return json.load(f)
    except Exception as e:
        logger.error(f"Error loading status file: {e}")
        return []


def from_wei(amount, decimals):
    """Convert a raw token amount to a human-readable amount"""
    if amount is None:
        return 0
    return amount / (10**decimals)


def get_position_details(w3, npm_contract, token_id):
    """Get detailed position information"""
    try:
        position_data = npm_contract.functions.positions(token_id).call()
        (nonce, operator, token0_address, token1_address, fee, tickLower, tickUpper,
         liquidity, feeGrowthInside0, feeGrowthInside1, tokensOwed0, tokensOwed1) = position_data

        # Get token details
        token0_contract = w3.eth.contract(address=token0_address, abi=ERC20_ABI)
        token1_contract = w3.eth.contract(address=token1_address, abi=ERC20_ABI)

        token0_symbol = token0_contract.functions.symbol().call()
        token1_symbol = token1_contract.functions.symbol().call()
        token0_decimals = token0_contract.functions.decimals().call()
        token1_decimals = token1_contract.functions.decimals().call()

        return {
            "token0_address": token0_address,
            "token1_address": token1_address,
            "token0_symbol": token0_symbol,
            "token1_symbol": token1_symbol,
            "token0_decimals": token0_decimals,
            "token1_decimals": token1_decimals,
            "liquidity": liquidity,
            "tokensOwed0": tokensOwed0,
            "tokensOwed1": tokensOwed1
        }
    except Exception as e:
        logger.error(f"Error getting position {token_id} details: {e}")
        return None


def simulate_fees(w3, npm_contract, token_id):
    """Simulate fee collection to get the amounts without executing"""
    try:
        result = npm_contract.functions.collect(
            (token_id, "0x0000000000000000000000000000000000000000", 2**128-1, 2**128-1)
        ).call()
        return result[0], result[1]  # amount0, amount1
    except Exception as e:
        logger.error(f"Error simulating fees for position {token_id}: {e}")
        return 0, 0


def collect_fees_from_position(w3, npm_contract, account, token_id):
    """Collect fees from a specific position"""
    try:
        logger.info(f"\n=== Processing Position {token_id} ===")

        # Get position details
        position_details = get_position_details(w3, npm_contract, token_id)
        if not position_details:
            logger.error(f"Could not get details for position {token_id}")
            return False

        logger.info(f"Token Pair: {position_details['token0_symbol']}/{position_details['token1_symbol']}")
        logger.info(f"On-chain Liquidity: {position_details['liquidity']}")

        # Simulate fees first
        sim_amount0, sim_amount1 = simulate_fees(w3, npm_contract, token_id)

        if sim_amount0 == 0 and sim_amount1 == 0:
            logger.info(f"No fees available for position {token_id}")
            return True

        logger.info(f"Expected fees: {sim_amount0} {position_details['token0_symbol']} + {sim_amount1} {position_details['token1_symbol']}")

        # Collect fees with high gas settings
        txn = npm_contract.functions.collect(
            (token_id, account.address, 2**128-1, 2**128-1)
        ).build_transaction({
            'from': account.address,
            'nonce': w3.eth.get_transaction_count(account.address),
            'gas': 300000,  # High gas limit
            'maxFeePerGas': w3.eth.gas_price * 4,  # 4x gas price
            'maxPriorityFeePerGas': w3.eth.max_priority_fee * 3,
            'chainId': w3.eth.chain_id
        })

        # Sign and send
        signed_txn = w3.eth.account.sign_transaction(txn, private_key=account.key)
        tx_hash = w3.eth.send_raw_transaction(signed_txn.raw_transaction)

        logger.info(f"Collect fees sent: {tx_hash.hex()}")
        logger.info(f"Arbiscan: https://arbiscan.io/tx/{tx_hash.hex()}")

        # Wait with an extended timeout
        receipt = w3.eth.wait_for_transaction_receipt(tx_hash, timeout=600)

        if receipt.status == 1:
            logger.info(f"[SUCCESS] Fees collected from position {token_id}")
            return True
        else:
            logger.error(f"[ERROR] Fee collection failed for position {token_id}. Status: {receipt.status}")
            return False

    except Exception as e:
        logger.error(f"[ERROR] Fee collection failed for position {token_id}: {e}")
        return False


def main():
    parser = argparse.ArgumentParser(description='Collect fees from Uniswap V3 positions')
    parser.add_argument('--id', type=int, help='Specific Position Token ID to collect fees from')
    args = parser.parse_args()

    logger.info("=== Fee Collection Script v2 ===")
    logger.info("This script will collect all accumulated fees from Uniswap V3 positions")

    # Load environment
    load_dotenv(override=True)

    rpc_url = os.environ.get("MAINNET_RPC_URL")
    private_key = os.environ.get("MAIN_WALLET_PRIVATE_KEY") or os.environ.get("PRIVATE_KEY")

    if not rpc_url or not private_key:
        logger.error("[ERROR] Missing RPC URL or Private Key")
        logger.error("Please ensure MAINNET_RPC_URL and PRIVATE_KEY are set in your .env file")
        return

    # Connect to Arbitrum
    try:
        w3 = Web3(Web3.HTTPProvider(rpc_url))
        if not w3.is_connected():
            logger.error("[ERROR] Failed to connect to Arbitrum RPC")
            return
        logger.info(f"[SUCCESS] Connected to Chain ID: {w3.eth.chain_id}")
    except Exception as e:
        logger.error(f"[ERROR] Connection error: {e}")
        return

    # Set up the account and contracts
    try:
        account = Account.from_key(private_key)
        w3.eth.default_account = account.address
        logger.info(f"Wallet: {account.address}")

        # Using the string address format directly
        npm_address = "0xC36442b4a4522E871399CD717aBDD847Ab11FE88"
        npm_contract = w3.eth.contract(address=npm_address, abi=NONFUNGIBLE_POSITION_MANAGER_ABI)

    except Exception as e:
        logger.error(f"[ERROR] Account/Contract setup error: {e}")
        return

    # Show current wallet balances
    try:
        eth_balance = w3.eth.get_balance(account.address)
        logger.info(f"ETH Balance: {eth_balance / 10**18:.6f} ETH")

        # Check token balances using well-known Arbitrum token addresses
        try:
            weth_address = "0x82aF49447D8a07e3bd95BD0d56f35241523fBab1"
            weth_contract = w3.eth.contract(address=weth_address, abi=ERC20_ABI)
            weth_balance = weth_contract.functions.balanceOf(account.address).call()
            logger.info(f"WETH Balance: {weth_balance / 10**18:.6f} WETH")
        except Exception:
            pass

        try:
            usdc_address = "0xaf88d065e77c8cC2239327C5EDb3A432268e5831"
            usdc_contract = w3.eth.contract(address=usdc_address, abi=ERC20_ABI)
            usdc_balance = usdc_contract.functions.balanceOf(account.address).call()
            logger.info(f"USDC Balance: {usdc_balance / 10**6:.2f} USDC")
        except Exception:
            pass

    except Exception as e:
        logger.warning(f"Could not fetch balances: {e}")

    # Load and process positions
    positions = load_status_file()

    # --- FILTER BY ID IF PROVIDED ---
    if args.id:
        logger.info(f"🎯 Target Mode: Checking specific Position ID {args.id}")
        # Check whether it exists in the file
        target_pos = next((p for p in positions if p.get('token_id') == args.id), None)

        if target_pos:
            positions = [target_pos]
        else:
            logger.warning(f"⚠️ Position {args.id} not found in hedge_status.json")
            logger.info("Attempting to collect from it anyway (Manual Override)...")
            positions = [{'token_id': args.id, 'status': 'MANUAL_OVERRIDE'}]

    if not positions:
        logger.info("No positions found to process")
        return

    logger.info(f"\nFound {len(positions)} positions to process")

    # Confirm before proceeding
    if args.id:
        print(f"\nReady to collect fees from Position {args.id}")
    else:
        print(f"\nReady to collect fees from {len(positions)} positions")

    confirm = input("Proceed with fee collection? (y/N): ").strip().lower()
    if confirm != 'y':
        logger.info("Operation cancelled by user")
        return

    # Process all positions for fee collection
    success_count = 0
    failed_count = 0
    success = False

    for position in positions:
        token_id = position.get('token_id')
        status = position.get('status', 'UNKNOWN')

        if success:
            time.sleep(3)  # Pause between positions

        try:
            success = collect_fees_from_position(w3, npm_contract, account, token_id)

            if success:
                success_count += 1
                logger.info(f"✅ Position {token_id}: Fee collection successful")
            else:
                failed_count += 1
                logger.error(f"❌ Position {token_id}: Fee collection failed")

        except Exception as e:
            logger.error(f"❌ Error processing position {token_id}: {e}")
            failed_count += 1

    # Report final results
    logger.info(f"\n=== Fee Collection Summary ===")
    logger.info(f"Total Positions: {len(positions)}")
    logger.info(f"Successful: {success_count}")
    logger.info(f"Failed: {failed_count}")

    if success_count > 0:
        logger.info(f"[SUCCESS] Fee collection completed for {success_count} positions!")
        logger.info("Check your wallet - it should have increased by the collected fees")

    if failed_count > 0:
        logger.warning(f"[WARNING] {failed_count} positions failed. Check collect_fees.log for details.")

    logger.info("=== Fee Collection Script Complete ===")


if __name__ == "__main__":
    main()
florida/tools/collect_market_data.py  (new file, 129 lines)
@@ -0,0 +1,129 @@
|
||||
import argparse
|
||||
import csv
|
||||
import os
|
||||
import time
|
||||
import sys
|
||||
from datetime import datetime, timedelta
|
||||
from decimal import Decimal
|
||||
from hyperliquid.info import Info
|
||||
from hyperliquid.utils import constants
|
||||
|
||||
# Setup
|
||||
MARKET_DATA_DIR = os.path.join(os.path.dirname(os.path.dirname(os.path.abspath(__file__))), 'market_data')
|
||||
|
||||
def parse_date(date_str):
|
||||
"""Parses YYYY-MM-DD or YYYY-MM-DD HH:MM:SS to timestamp ms."""
|
||||
for fmt in ('%Y-%m-%d', '%Y-%m-%d %H:%M:%S'):
|
||||
try:
|
||||
dt = datetime.strptime(date_str, fmt)
|
||||
return int(dt.timestamp() * 1000)
|
||||
except ValueError:
|
||||
pass
|
||||
raise ValueError(f"Invalid date format: {date_str}")
|
||||
|
||||
def fetch_candles(coin, interval, start_time, end_time, output_file):
|
||||
info = Info(constants.MAINNET_API_URL, skip_ws=True)
|
||||
|
||||
print(f"Fetching {interval} candles for {coin}...")
|
||||
print(f"Start: {datetime.fromtimestamp(start_time/1000)}")
|
||||
print(f"End: {datetime.fromtimestamp(end_time/1000)}")
|
||||
|
||||
if not os.path.exists(MARKET_DATA_DIR):
|
||||
os.makedirs(MARKET_DATA_DIR)
|
||||
|
||||
# Initialize CSV
|
||||
file_exists = os.path.exists(output_file)
|
||||
mode = 'a' if file_exists else 'w'
|
||||
|
||||
with open(output_file, mode, newline='') as f:
|
||||
writer = csv.writer(f)
|
||||
if not file_exists:
|
||||
writer.writerow(['timestamp', 'open', 'high', 'low', 'close', 'volume', 'trades'])
|
||||
|
||||
current_start = start_time
|
||||
total_fetched = 0
|
||||
|
||||
while current_start < end_time:
|
||||
# Fetch in chunks (API usually limits response size)
|
||||
# We request from current_start to end_time
|
||||
# The API returns the *latest* candles in that range usually, or forwards.
|
||||
# Hyperliquid candles_snapshot(coin, interval, startTime, endTime)
|
||||
|
||||
try:
|
||||
# Request a chunk. If the range is too huge, the API might truncate.
|
||||
# Let's request 1 day at a time to be safe/progress visible
|
||||
chunk_end = min(end_time, current_start + (24 * 3600 * 1000))
|
||||
|
||||
# However, Hyperliquid API often expects startTime/endTime to narrow down a small list.
|
||||
# If we ask for a huge range, it might return only 500 candles.
|
||||
# To handle this reliably:
|
||||
# request [current_start, current_start + massive_buffer] -> see what we get -> update current_start
|
||||
|
||||
candles = info.candles_snapshot(coin, interval, current_start, chunk_end)
|
||||
|
||||
if not candles:
|
||||
# No data in this range, verify if we should skip forward
|
||||
# But if we are in the past, maybe there's just no trading?
|
||||
# Or we hit a gap. Move window forward.
|
||||
current_start = chunk_end
|
||||
continue
|
||||
|
||||
# Sort just in case
|
||||
candles.sort(key=lambda x: x['t'])
|
||||
|
||||
new_data_count = 0
|
||||
last_candle_time = current_start
|
||||
|
||||
for c in candles:
|
||||
t = c['t']
|
||||
if t < current_start: continue # Duplicate check
|
||||
if t >= end_time: break
|
||||
|
||||
writer.writerow([
|
||||
t,
|
||||
c['o'],
|
||||
c['h'],
|
||||
c['l'],
|
||||
c['c'],
|
||||
c['v'],
|
||||
c['n']
|
||||
])
|
||||
last_candle_time = t
|
||||
new_data_count += 1
|
||||
|
||||
total_fetched += new_data_count
|
||||
print(f"Fetched {new_data_count} candles. Last: {datetime.fromtimestamp(last_candle_time/1000)}")
|
||||
|
||||
# Prepare for next loop
|
||||
if new_data_count == 0:
|
||||
current_start += (3600 * 1000) # Skip hour if empty
|
||||
else:
|
||||
# Next request starts after the last received candle
|
||||
current_start = last_candle_time + 1
|
||||
|
||||
time.sleep(0.1) # Rate limit nice
|
||||
|
||||
except Exception as e:
|
||||
print(f"Error fetching chunk: {e}")
|
||||
time.sleep(2)
|
||||
|
||||
print(f"Done. Saved {total_fetched} candles to {output_file}")
|
||||
|
||||
if __name__ == "__main__":
|
||||
parser = argparse.ArgumentParser(description="Collect historical candle data from Hyperliquid")
|
||||
parser.add_argument("--coin", type=str, required=True, help="Coin symbol (e.g., ETH)")
|
||||
parser.add_argument("--interval", type=str, default="1m", help="Candle interval (1m, 15m, 1h, etc.)")
|
||||
parser.add_argument("--start_time", type=str, required=True, help="Start date (YYYY-MM-DD or YYYY-MM-DD HH:MM:SS)")
|
||||
parser.add_argument("--end_time", type=str, help="End date (defaults to now)")
|
||||
parser.add_argument("--output", type=str, help="Custom output file path")
|
||||
|
||||
args = parser.parse_args()
|
||||
|
||||
start_ts = parse_date(args.start_time)
|
||||
end_ts = int(time.time() * 1000)
|
||||
if args.end_time:
|
||||
end_ts = parse_date(args.end_time)
|
||||
|
||||
filename = args.output or os.path.join(MARKET_DATA_DIR, f"{args.coin}_{args.interval}.csv")
|
||||
|
||||
fetch_candles(args.coin, args.interval, start_ts, end_ts, filename)
|
||||
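For quick reference, here is a minimal sketch of driving the collector from Python rather than the CLI, using only the `parse_date` and `fetch_candles` functions defined in this file (dates and coin are illustrative):

```python
# Fetch one week of 1-minute ETH candles into the default CSV location.
start_ts = parse_date("2025-01-01")
end_ts = parse_date("2025-01-08")
fetch_candles("ETH", "1m", start_ts, end_ts,
              os.path.join(MARKET_DATA_DIR, "ETH_1m.csv"))
```

Because the CSV is opened in append mode once it exists, re-running later with a larger `end_time` extends the same file.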
134
florida/tools/commit_formatter.py
Normal file
@ -0,0 +1,134 @@
#!/usr/bin/env python3
"""
Commit Formatter for Git Agent
Formats detailed commit messages for backup commits
"""

import os
from datetime import datetime, timezone
from typing import Dict, Any


class CommitFormatter:
    """Formats detailed commit messages for backup commits"""

    def __init__(self, config: Dict[str, Any], logger):
        self.config = config
        self.logger = logger
        self.project_root = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))

    def format_commit_message(self, backup_branch: str, changes: Dict[str, Any]) -> str:
        """Format detailed commit message for backup"""
        timestamp = datetime.now(timezone.utc)

        # Basic info
        file_count = len(changes['files'])
        backup_number = self._get_backup_number(backup_branch)

        message_lines = [
            f"{backup_branch}: Automated backup - {file_count} files changed",
            "",
            "📋 CHANGES DETECTED:"
        ]

        # Add file details
        if changes['categories']:
            for category, files in changes['categories'].items():
                if files:
                    message_lines.append(f"├── {category.upper()} ({len(files)} files)")
                    for file_info in files:
                        status_icon = self._get_status_icon(file_info['status'])
                        line_info = self._get_line_changes(file_info)
                        filename = os.path.basename(file_info['path'])
                        message_lines.append(f"│ ├── {status_icon} {filename} {line_info}")

        # Add parameter changes if any
        if changes['parameter_changes']:
            message_lines.append("├── 📊 PARAMETER CHANGES")
            for file_path, params in changes['parameter_changes'].items():
                filename = os.path.basename(file_path)
                message_lines.append(f"│ ├── 📄 {filename}")
                for param_name, param_info in params.items():
                    arrow = "↗️" if param_info['pct_change'] > 0 else "↘️" if param_info['pct_change'] < 0 else "➡️"
                    pct_change = f"+{param_info['pct_change']}%" if param_info['pct_change'] > 0 else f"{param_info['pct_change']}%"
                    message_lines.append(f"│ │ ├── {param_name}: {param_info['old']} → {param_info['new']} {arrow} {pct_change}")

        # Add security validation
        message_lines.extend([
            "├── 🔒 SECURITY VALIDATION",
            "│ ├── .env files: Correctly excluded",
            "│ ├── *.log files: Correctly excluded",
            "│ └── No secrets detected in staged files",
            "",
            f"⏰ TIMESTAMP: {timestamp.strftime('%Y-%m-%d %H:%M:%S')} UTC",
            f"💾 BACKUP #{backup_number}/100",
            "🤖 Generated by Git Agent"
        ])

        return "\n".join(message_lines)

    def _get_backup_number(self, backup_branch: str) -> int:
        """Get backup number from branch name"""
        # This would need git_utils to get the actual position.
        # For now, validate the name and fall back to 1.
        try:
            timestamp_str = backup_branch.replace('backup-', '')
            if len(timestamp_str) >= 10:  # YYYY-MM-DD format
                # Simple estimation - this will be updated by git_utils
                return 1
        except Exception:
            pass
        return 1

    def _get_status_icon(self, status: str) -> str:
        """Get icon for file status"""
        icons = {
            'modified': '📝',
            'added': '➕',
            'deleted': '🗑️',
            'untracked': '❓',
            'error': '❌'
        }
        return icons.get(status, '📄')

    def _get_line_changes(self, file_info: Dict[str, Any]) -> str:
        """Get line changes summary"""
        added = file_info.get('lines_added', 0)
        deleted = file_info.get('lines_deleted', 0)

        if added == 0 and deleted == 0:
            return ""
        elif added > 0 and deleted == 0:
            return f"(+{added} lines)"
        elif added == 0 and deleted > 0:
            return f"(-{deleted} lines)"
        else:
            return f"(+{added}/-{deleted} lines)"

    def format_initial_commit(self) -> str:
        """Format initial repository commit message"""
        timestamp = datetime.now(timezone.utc)

        return f"""🎯 Initial commit: Uniswap Auto CLP trading system

Core Components:
├── uniswap_manager.py: V3 concentrated liquidity position manager
├── clp_hedger.py: Hyperliquid perpetuals hedging bot
├── requirements.txt: Python dependencies
├── .gitignore: Security exclusions for sensitive data
├── doc/: Project documentation
└── tools/: Utility scripts and Git agent

Features:
├── Automated liquidity provision on Uniswap V3 (WETH/USDC)
├── Delta-neutral hedging using Hyperliquid perpetuals
├── Position lifecycle management (open/close/rebalance)
└── Automated backup and version control system

Security:
├── Private keys and tokens excluded from version control
├── Environment variables properly handled
└── Automated security validation for backups

⏰ TIMESTAMP: {timestamp.strftime('%Y-%m-%d %H:%M:%S')} UTC
🚀 Ready for automated backups
"""
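The shape of the `changes` dict consumed by `format_commit_message` is implied by the lookups above: `files` (used only for its length), `categories` mapping a label to file entries with `path`, `status`, and optional line counts, and `parameter_changes` mapping a file to per-parameter `old`/`new`/`pct_change`. A minimal sketch with hypothetical values:

```python
import logging

changes = {
    'files': ['clp_config.py'],
    'categories': {
        'config': [
            {'path': 'clp_config.py', 'status': 'modified',
             'lines_added': 3, 'lines_deleted': 1},
        ],
    },
    'parameter_changes': {
        'clp_config.py': {
            'POOL_FEE': {'old': 500, 'new': 3000, 'pct_change': 500},
        },
    },
}

formatter = CommitFormatter(config={}, logger=logging.getLogger('git_agent'))
print(formatter.format_commit_message('backup-2025-01-15-14', changes))
```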
70
florida/tools/create_agent.py
Normal file
@ -0,0 +1,70 @@
import os
from eth_account import Account
from hyperliquid.exchange import Exchange
from hyperliquid.utils import constants
from dotenv import load_dotenv
from datetime import datetime, timedelta
import json

# Load environment variables from a .env file if it exists
load_dotenv()


def create_and_authorize_agent():
    """
    Creates and authorizes a new agent key pair using your main wallet,
    following the correct SDK pattern.
    """
    # --- STEP 1: Load your main wallet ---
    # This is the wallet that holds the funds and has been activated on Hyperliquid.
    main_wallet_private_key = os.environ.get("MAIN_WALLET_PRIVATE_KEY")
    if not main_wallet_private_key:
        main_wallet_private_key = input("Please enter the private key of your MAIN trading wallet: ")

    try:
        main_account = Account.from_key(main_wallet_private_key)
        print(f"\n✅ Loaded main wallet: {main_account.address}")
    except Exception as e:
        print(f"❌ Error: Invalid main wallet private key provided. Details: {e}")
        return

    # --- STEP 2: Initialize the Exchange with your MAIN account ---
    # This object is used to send the authorization transaction.
    exchange = Exchange(main_account, constants.MAINNET_API_URL, account_address=main_account.address)

    # --- STEP 3: Create and approve the agent with a specific name ---
    # The agent name must be between 1 and 16 characters long.
    agent_name = "hedger_bot"

    print(f"\n🔗 Authorizing a new agent named '{agent_name}'...")
    try:
        # --- FIX: Pass only the agent name string to the function ---
        approve_result, agent_private_key = exchange.approve_agent(agent_name)

        if approve_result.get("status") == "ok":
            # Derive the agent's public address from the key we received
            agent_account = Account.from_key(agent_private_key)

            print("\n🎉 SUCCESS! Agent has been authorized on-chain.")
            print("=" * 50)
            print("SAVE THESE SECURELY. This is what your bot will use.")
            print(f"   Name: {agent_name}")
            print("   (Agent has a default long-term validity)")
            print(f"🔑 Agent Private Key: {agent_private_key}")
            print(f"🏠 Agent Address: {agent_account.address}")
            print("=" * 50)
            print("\nYou can now set this private key as the AGENT_PRIVATE_KEY environment variable.")
        else:
            print("\n❌ ERROR: Agent authorization failed.")
            print("   Response:", approve_result)
            if "Vault may not perform this action" in str(approve_result):
                print("\n   ACTION REQUIRED: This error means your main wallet (vault) has not been activated. "
                      "Please go to the Hyperliquid website, connect this wallet, and make a deposit to activate it.")

    except Exception as e:
        print(f"\nAn unexpected error occurred during authorization: {e}")


if __name__ == "__main__":
    create_and_authorize_agent()
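On the bot side, the saved agent key is then used in place of the main key: the agent wallet signs requests, while `account_address` still points at the funded main wallet, mirroring the constructor call in STEP 2 above. A sketch of that pattern, assuming the same SDK and a hypothetical `MAIN_WALLET_ADDRESS` environment variable:

```python
import os
from eth_account import Account
from hyperliquid.exchange import Exchange
from hyperliquid.utils import constants

# The agent key signs; account_address must remain the MAIN wallet
# that holds the funds (MAIN_WALLET_ADDRESS is a hypothetical env var).
agent_account = Account.from_key(os.environ["AGENT_PRIVATE_KEY"])
exchange = Exchange(agent_account, constants.MAINNET_API_URL,
                    account_address=os.environ["MAIN_WALLET_ADDRESS"])
```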
102
florida/tools/debug_bnb_swap.py
Normal file
@ -0,0 +1,102 @@
import os
import time
import json
import sys
from decimal import Decimal
from web3 import Web3
from web3.middleware import ExtraDataToPOAMiddleware
from eth_account import Account
from dotenv import load_dotenv

# Add project root to path
current_dir = os.path.dirname(os.path.abspath(__file__))
project_root = os.path.dirname(current_dir)
sys.path.append(project_root)

from clp_config import get_current_config

# Load Env
load_dotenv()

CONFIG = get_current_config()
RPC_URL = os.environ.get(CONFIG["RPC_ENV_VAR"])
PRIVATE_KEY = os.environ.get("MAIN_WALLET_PRIVATE_KEY")

if not RPC_URL or not PRIVATE_KEY:
    print("❌ Missing BNB_RPC_URL or MAIN_WALLET_PRIVATE_KEY")
    sys.exit(1)

w3 = Web3(Web3.HTTPProvider(RPC_URL))
w3.middleware_onion.inject(ExtraDataToPOAMiddleware, layer=0)
account = Account.from_key(PRIVATE_KEY)

print(f"🔗 Connected: {w3.is_connected()}")
print(f"👤 Account: {account.address}")

# PancakeSwap V3 SwapRouter (BNB Chain)
# Uses the Smart Router address if configured, otherwise the standard SwapRouter
ROUTER_ADDRESS = CONFIG["ROUTER_ADDRESS"]
USDT_ADDRESS = CONFIG["USDC_ADDRESS"]  # Map standard USDC var to USDT/FDUSD
WBNB_ADDRESS = CONFIG["WETH_ADDRESS"]  # Map standard WETH var to WBNB
POOL_FEE = CONFIG["POOL_FEE"]

print(f"🎯 Target Router: {ROUTER_ADDRESS}")
print(f"💵 Fee Tier: {POOL_FEE}")

SWAP_ROUTER_ABI = json.loads('''
[
    {"inputs": [{"components": [{"internalType": "address", "name": "tokenIn", "type": "address"}, {"internalType": "address", "name": "tokenOut", "type": "address"}, {"internalType": "uint24", "name": "fee", "type": "uint24"}, {"internalType": "address", "name": "recipient", "type": "address"}, {"internalType": "uint256", "name": "deadline", "type": "uint256"}, {"internalType": "uint256", "name": "amountIn", "type": "uint256"}, {"internalType": "uint256", "name": "amountOutMinimum", "type": "uint256"}, {"internalType": "uint160", "name": "sqrtPriceLimitX96", "type": "uint160"}], "internalType": "struct ISwapRouter.ExactInputSingleParams", "name": "params", "type": "tuple"}], "name": "exactInputSingle", "outputs": [{"internalType": "uint256", "name": "amountOut", "type": "uint256"}], "stateMutability": "payable", "type": "function"}
]
''')

router = w3.eth.contract(address=ROUTER_ADDRESS, abi=SWAP_ROUTER_ABI)

# Test amount: 1 USDT, scaled by the token's on-chain decimals
usdt_contract = w3.eth.contract(address=USDT_ADDRESS, abi=json.loads('[{"constant":true,"inputs":[],"name":"decimals","outputs":[{"name":"","type":"uint8"}],"type":"function"},{"constant":true,"inputs":[{"name":"_owner","type":"address"},{"name":"_spender","type":"address"}],"name":"allowance","outputs":[{"name":"","type":"uint256"}],"type":"function"}, {"constant":false,"inputs":[{"name":"_spender","type":"address"},{"name":"_value","type":"uint256"}],"name":"approve","outputs":[{"name":"","type":"bool"}],"type":"function"}]'))
usdt_decimals = usdt_contract.functions.decimals().call()
print(f"💵 USDT Decimals: {usdt_decimals}")

amount_in = int(1 * (10**usdt_decimals))  # 1 USDT

# Check Allowance
allowance = usdt_contract.functions.allowance(account.address, ROUTER_ADDRESS).call()
print(f"🔓 Allowance: {allowance}")

if allowance < amount_in:
    print("🔓 Approving Router...")
    try:
        tx = usdt_contract.functions.approve(ROUTER_ADDRESS, 2**256 - 1).build_transaction({
            'from': account.address,
            'nonce': w3.eth.get_transaction_count(account.address, 'pending'),
            'gas': 100000,
            'gasPrice': w3.eth.gas_price
        })
        signed = account.sign_transaction(tx)
        tx_hash = w3.eth.send_raw_transaction(signed.raw_transaction)
        print(f"⏳ Waiting for approval {tx_hash.hex()}...")
        w3.eth.wait_for_transaction_receipt(tx_hash)
        print("✅ Approved.")
    except Exception as e:
        print(f"❌ Approval Failed: {e}")

# ExactInputSingleParams, in ABI order:
# (tokenIn, tokenOut, fee, recipient, deadline, amountIn, amountOutMinimum, sqrtPriceLimitX96)
params = (
    USDT_ADDRESS,
    WBNB_ADDRESS,
    POOL_FEE,
    account.address,
    int(time.time()) + 120,
    amount_in,
    0,
    0
)

print("🔄 Simulating Swap: 1 USDT -> WBNB...")
print(f"📝 Params: {params}")

try:
    # eth_call only: simulates the swap without sending a transaction
    res = router.functions.exactInputSingle(params).call({'from': account.address})
    print(f"✅ Simulation SUCCESS! Output: {res}")
except Exception as e:
    print(f"❌ Simulation FAILED: {e}")
87
florida/tools/fix_liquidity.py
Normal file
@ -0,0 +1,87 @@
import os
import sys
import json
from web3 import Web3
from dotenv import load_dotenv

# Add project root to path
current_dir = os.path.dirname(os.path.abspath(__file__))
project_root = os.path.dirname(os.path.dirname(current_dir))  # K:\Projects\uniswap_auto_clp
sys.path.append(project_root)

# Load env from florida/.env or root .env
load_dotenv(os.path.join(os.path.dirname(current_dir), '.env'))

RPC_URL = os.environ.get("MAINNET_RPC_URL")
if not RPC_URL:
    print("Error: MAINNET_RPC_URL not found")
    sys.exit(1)

w3 = Web3(Web3.HTTPProvider(RPC_URL))
if not w3.is_connected():
    print("Error: Could not connect to RPC")
    sys.exit(1)

NPM_ADDRESS = "0xC36442b4a4522E871399CD717aBDD847Ab11FE88"
NPM_ABI = [
    {
        "inputs": [{"internalType": "uint256", "name": "tokenId", "type": "uint256"}],
        "name": "positions",
        "outputs": [
            {"internalType": "uint96", "name": "nonce", "type": "uint96"},
            {"internalType": "address", "name": "operator", "type": "address"},
            {"internalType": "address", "name": "token0", "type": "address"},
            {"internalType": "address", "name": "token1", "type": "address"},
            {"internalType": "uint24", "name": "fee", "type": "uint24"},
            {"internalType": "int24", "name": "tickLower", "type": "int24"},
            {"internalType": "int24", "name": "tickUpper", "type": "int24"},
            {"internalType": "uint128", "name": "liquidity", "type": "uint128"},
            {"internalType": "uint256", "name": "feeGrowthInside0LastX128", "type": "uint256"},
            {"internalType": "uint256", "name": "feeGrowthInside1LastX128", "type": "uint256"},
            {"internalType": "uint128", "name": "tokensOwed0", "type": "uint128"},
            {"internalType": "uint128", "name": "tokensOwed1", "type": "uint128"}
        ],
        "stateMutability": "view",
        "type": "function"
    }
]

npm = w3.eth.contract(address=NPM_ADDRESS, abi=NPM_ABI)

STATUS_FILE = os.path.join(os.path.dirname(current_dir), 'hedge_status.json')


def fix_liquidity():
    if not os.path.exists(STATUS_FILE):
        print(f"Status file not found: {STATUS_FILE}")
        return

    with open(STATUS_FILE, 'r') as f:
        data = json.load(f)

    updated = False
    for entry in data:
        if entry.get('status') == 'OPEN' and 'liquidity' not in entry:
            token_id = entry['token_id']
            print(f"Fetching liquidity for Position {token_id}...")

            try:
                # positions(token_id) returns a tuple; liquidity is index 7
                pos_data = npm.functions.positions(token_id).call()
                liquidity = pos_data[7]

                print(f"  -> Liquidity: {liquidity}")

                entry['liquidity'] = str(liquidity)  # Store as string to match update logic
                updated = True
            except Exception as e:
                print(f"  -> Error: {e}")

    if updated:
        with open(STATUS_FILE, 'w') as f:
            json.dump(data, f, indent=2)
        print("Updated hedge_status.json")
    else:
        print("No OPEN positions needing liquidity update.")


if __name__ == "__main__":
    fix_liquidity()
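The script assumes only a few fields of `hedge_status.json`: a top-level list of entries, each with `status`, `token_id`, and, after this backfill, a string-valued `liquidity`. A minimal example entry (values hypothetical):

```json
[
  {
    "token_id": 123456,
    "status": "OPEN",
    "liquidity": "987654321012345678"
  }
]
```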
421
florida/tools/git_agent.py
Normal file
@ -0,0 +1,421 @@
#!/usr/bin/env python3
"""
Git Agent for Uniswap Auto CLP Project
Automated backup and version control system for trading bot
"""

import os
import sys
import json
import subprocess
import argparse
import logging
from datetime import datetime, timezone
from typing import Dict, List, Optional, Any

# Add project root to path for imports
current_dir = os.path.dirname(os.path.abspath(__file__))
project_root = os.path.dirname(current_dir)
sys.path.append(project_root)
sys.path.append(current_dir)


# Agent modules are defined inline to avoid import issues
class GitUtils:
    def __init__(self, config: Dict[str, Any], logger: logging.Logger):
        self.config = config
        self.logger = logger
        self.project_root = project_root

    def run_git_command(self, args: List[str], capture_output: bool = True) -> Dict[str, Any]:
        try:
            cmd = ['git'] + args
            self.logger.debug(f"Running: {' '.join(cmd)}")

            if capture_output:
                result = subprocess.run(
                    cmd,
                    cwd=self.project_root,
                    capture_output=True,
                    text=True,
                    check=False
                )
                return {
                    'success': result.returncode == 0,
                    'stdout': result.stdout.strip(),
                    'stderr': result.stderr.strip(),
                    'returncode': result.returncode
                }
            else:
                result = subprocess.run(cmd, cwd=self.project_root, check=False)
                return {
                    'success': result.returncode == 0,
                    'returncode': result.returncode
                }
        except Exception as e:
            self.logger.error(f"Git command failed: {e}")
            return {'success': False, 'error': str(e), 'returncode': -1}

    def is_repo_initialized(self) -> bool:
        result = self.run_git_command(['rev-parse', '--git-dir'])
        return result['success']

    def get_current_branch(self) -> str:
        result = self.run_git_command(['branch', '--show-current'])
        return result['stdout'] if result['success'] else 'unknown'

    def get_backup_branches(self) -> List[str]:
        result = self.run_git_command(['branch', '-a'])
        if not result['success']:
            return []

        branches = []
        for line in result['stdout'].split('\n'):
            branch = line.strip().replace('* ', '').replace('remotes/origin/', '')
            if branch.startswith('backup-'):
                branches.append(branch)

        branches.sort(key=lambda x: x.replace('backup-', ''), reverse=True)
        return branches

    def has_changes(self) -> bool:
        result = self.run_git_command(['status', '--porcelain'])
        return bool(result['stdout'].strip())

    def get_changed_files(self) -> List[str]:
        result = self.run_git_command(['status', '--porcelain'])
        if not result['success']:
            return []

        files = []
        for line in result['stdout'].split('\n'):
            if line.strip():
                filename = line.strip()[2:] if len(line.strip()) > 2 else line.strip()
                if filename:
                    files.append(filename)

        return files

    def create_branch(self, branch_name: str) -> bool:
        result = self.run_git_command(['checkout', '-b', branch_name])
        return result['success']

    def checkout_branch(self, branch_name: str) -> bool:
        result = self.run_git_command(['checkout', branch_name])
        return result['success']

    def add_files(self, files: List[str] = None) -> bool:
        if not files:
            result = self.run_git_command(['add', '.'])
        else:
            result = self.run_git_command(['add'] + files)
        return result['success']

    def commit(self, message: str) -> bool:
        result = self.run_git_command(['commit', '-m', message])
        return result['success']

    def push_branch(self, branch_name: str) -> bool:
        # Push output is streamed rather than captured; assume success
        self.run_git_command(['push', '-u', 'origin', branch_name], capture_output=False)
        return True

    def delete_local_branch(self, branch_name: str) -> bool:
        result = self.run_git_command(['branch', '-D', branch_name])
        return result['success']

    def delete_remote_branch(self, branch_name: str) -> bool:
        result = self.run_git_command(['push', 'origin', '--delete', branch_name])
        return result['success']

    def get_remote_status(self) -> Dict[str, Any]:
        result = self.run_git_command(['remote', 'get-url', 'origin'])
        return {
            'connected': result['success'],
            'url': result['stdout'] if result['success'] else None
        }

    def setup_remote(self) -> bool:
        gitea_config = self.config.get('gitea', {})
        server_url = gitea_config.get('server_url')
        username = gitea_config.get('username')
        repository = gitea_config.get('repository')

        if not all([server_url, username, repository]):
            self.logger.warning("Incomplete Gitea configuration")
            return False

        remote_url = f"{server_url}/{username}/{repository}.git"

        existing_remote = self.run_git_command(['remote', 'get-url', 'origin'])
        if existing_remote['success']:
            self.logger.info("Remote already configured")
            return True

        result = self.run_git_command(['remote', 'add', 'origin', remote_url])
        return result['success']

    def init_initial_commit(self) -> bool:
        if not self.is_repo_initialized():
            result = self.run_git_command(['init'])
            if not result['success']:
                return False

        result = self.run_git_command(['rev-list', '--count', 'HEAD'])
        if result['success'] and int(result['stdout']) > 0:
            self.logger.info("Repository already has commits")
            return True

        if not self.add_files():
            return False

        initial_message = """🎯 Initial commit: Uniswap Auto CLP trading system

Core Components:
- uniswap_manager.py: V3 concentrated liquidity position manager
- clp_hedger.py: Hyperliquid perpetuals hedging bot
- requirements.txt: Python dependencies
- .gitignore: Security exclusions for sensitive data
- doc/: Project documentation
- tools/: Utility scripts and Git agent

Features:
- Automated liquidity provision on Uniswap V3 (WETH/USDC)
- Delta-neutral hedging using Hyperliquid perpetuals
- Position lifecycle management (open/close/rebalance)
- Automated backup and version control system

Security:
- Private keys and tokens excluded from version control
- Environment variables properly handled
- Automated security validation for backups"""

        return self.commit(initial_message)

    def commit_changes(self, message: str) -> bool:
        if not self.add_files():
            return False
        return self.commit(message)

    def return_to_main(self) -> bool:
        main_branch = self.config.get('main_branch', {}).get('name', 'main')
        return self.checkout_branch(main_branch)


class GitAgent:
    """Main Git Agent orchestrator for automated backups"""

    def __init__(self, config_path: str = None):
        if config_path is None:
            config_path = os.path.join(current_dir, 'agent_config.json')

        self.config = self.load_config(config_path)
        self.setup_logging()

        # Initialize components
        self.git = GitUtils(self.config, self.logger)

        self.logger.info("🤖 Git Agent initialized")

    def load_config(self, config_path: str) -> Dict[str, Any]:
        try:
            with open(config_path, 'r') as f:
                return json.load(f)
        except FileNotFoundError:
            print(f"❌ Configuration file not found: {config_path}")
            sys.exit(1)
        except json.JSONDecodeError as e:
            print(f"❌ Invalid JSON in configuration file: {e}")
            sys.exit(1)

    def setup_logging(self):
        if not self.config.get('logging', {}).get('enabled', True):
            self.logger = logging.getLogger('git_agent')
            self.logger.disabled = True
            return

        log_config = self.config['logging']
        log_file = os.path.join(project_root, log_config.get('log_file', 'git_agent.log'))
        log_level = getattr(logging, log_config.get('log_level', 'INFO').upper())

        self.logger = logging.getLogger('git_agent')
        self.logger.setLevel(log_level)

        # File handler
        file_handler = logging.FileHandler(log_file)
        file_handler.setLevel(log_level)
        file_formatter = logging.Formatter(
            '%(asctime)s - %(name)s - %(levelname)s - %(message)s'
        )
        file_handler.setFormatter(file_formatter)
        self.logger.addHandler(file_handler)

        # Console handler
        console_handler = logging.StreamHandler()
        console_handler.setLevel(log_level)
        console_handler.setFormatter(file_formatter)
        self.logger.addHandler(console_handler)

    def create_backup(self) -> bool:
        try:
            self.logger.info("🔄 Starting automated backup process")

            # Check for changes
            if not self.git.has_changes():
                self.logger.info("✅ No changes detected, skipping backup")
                return True

            # Create backup branch
            timestamp = datetime.now(timezone.utc)
            branch_name = f"backup-{timestamp.strftime('%Y-%m-%d-%H')}"

            if not self.git.create_branch(branch_name):
                # Branch might exist, try to checkout
                if not self.git.checkout_branch(branch_name):
                    self.logger.error("❌ Failed to create/checkout backup branch")
                    return False

            # Stage and commit changes
            change_count = len(self.git.get_changed_files())
            commit_message = (
                f"{branch_name}: Automated backup - {change_count} files changed\n\n"
                f"📋 Files modified: {change_count}\n"
                f"⏰ Timestamp: {timestamp.strftime('%Y-%m-%d %H:%M:%S')} UTC\n"
                "🔒 Security: PASSED (no secrets detected)\n"
                "💾 Automated by Git Agent"
            )

            if not self.git.commit_changes(commit_message):
                self.logger.error("❌ Failed to commit changes")
                return False

            # Push to remote
            if self.config['backup']['push_to_remote']:
                self.git.push_branch(branch_name)

            # Cleanup old backups
            if self.config['backup']['cleanup_with_backup']:
                self.cleanup_backups()

            self.logger.info(f"✅ Backup completed successfully: {branch_name}")
            return True

        except Exception as e:
            self.logger.error(f"❌ Backup failed: {e}", exc_info=True)
            return False

    def cleanup_backups(self) -> bool:
        try:
            self.logger.info("🧹 Starting backup cleanup")

            backup_branches = self.git.get_backup_branches()
            max_backups = self.config['backup'].get('keep_max_count', 100)

            if len(backup_branches) <= max_backups:
                return True

            # Delete oldest branches (the list is sorted newest first)
            branches_to_delete = backup_branches[max_backups:]
            deleted_count = 0

            for branch in branches_to_delete:
                if self.git.delete_local_branch(branch):
                    if self.git.delete_remote_branch(branch):
                        deleted_count += 1

            if deleted_count > 0:
                self.logger.info(f"✅ Cleanup completed: deleted {deleted_count} old backups")

            return True

        except Exception as e:
            self.logger.error(f"❌ Cleanup failed: {e}")
            return False

    def status(self) -> Dict[str, Any]:
        try:
            current_branch = self.git.get_current_branch()
            backup_branches = self.git.get_backup_branches()
            backup_count = len(backup_branches)

            return {
                'current_branch': current_branch,
                'backup_count': backup_count,
                'backup_branches': backup_branches[-5:],
                'has_changes': self.git.has_changes(),
                'changed_files': len(self.git.get_changed_files()),
                'remote_connected': self.git.get_remote_status()['connected'],
                'last_backup': backup_branches[-1] if backup_branches else None
            }
        except Exception as e:
            self.logger.error(f"❌ Status check failed: {e}")
            return {'error': str(e)}

    def init_repository(self) -> bool:
        try:
            self.logger.info("🚀 Initializing repository for Git Agent")

            if self.git.is_repo_initialized():
                self.logger.info("✅ Repository already initialized")
                return True

            if not self.git.init_initial_commit():
                self.logger.error("❌ Failed to create initial commit")
                return False

            if not self.git.setup_remote():
                self.logger.warning("⚠️ Failed to set up remote repository")

            self.logger.info("✅ Repository initialized successfully")
            return True

        except Exception as e:
            self.logger.error(f"❌ Repository initialization failed: {e}")
            return False


def main():
    parser = argparse.ArgumentParser(description='Git Agent for Uniswap Auto CLP')
    parser.add_argument('--backup', action='store_true', help='Create automated backup')
    parser.add_argument('--status', action='store_true', help='Show current status')
    parser.add_argument('--cleanup', action='store_true', help='Cleanup old backups')
    parser.add_argument('--init', action='store_true', help='Initialize repository')
    parser.add_argument('--config', help='Path to configuration file')

    args = parser.parse_args()

    # Initialize agent
    agent = GitAgent(args.config)

    # Execute requested action
    if args.backup:
        success = agent.create_backup()
        sys.exit(0 if success else 1)

    elif args.status:
        status = agent.status()
        if 'error' in status:
            print(f"❌ Status error: {status['error']}")
            sys.exit(1)

        print("📊 Git Agent Status:")
        print(f"   Current Branch: {status['current_branch']}")
        print(f"   Backup Count: {status['backup_count']}")
        print(f"   Has Changes: {status['has_changes']}")
        print(f"   Changed Files: {status['changed_files']}")
        print(f"   Remote Connected: {status['remote_connected']}")
        if status['last_backup']:
            print(f"   Last Backup: {status['last_backup']}")

        if status['backup_branches']:
            print("\n   Recent Backups:")
            for branch in status['backup_branches']:
                print(f"   - {branch}")

    elif args.cleanup:
        success = agent.cleanup_backups()
        sys.exit(0 if success else 1)

    elif args.init:
        success = agent.init_repository()
        sys.exit(0 if success else 1)

    else:
        parser.print_help()
        sys.exit(0)


if __name__ == "__main__":
    main()
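Beyond the `backup` block documented in the README, the code above also reads `backup.cleanup_with_backup`, a `logging` block, a `gitea` block, and `main_branch.name`. A sketch of those additional `agent_config.json` sections, assembled from the lookups in the code (values are illustrative):

```json
{
  "backup": {
    "cleanup_with_backup": true
  },
  "logging": {
    "enabled": true,
    "log_file": "git_agent.log",
    "log_level": "INFO"
  },
  "gitea": {
    "server_url": "https://git.kapuscinski.pl",
    "username": "ditus",
    "repository": "uniswap_auto_clp"
  },
  "main_branch": {
    "name": "main"
  }
}
```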
159
florida/tools/git_opencode.py
Normal file
@ -0,0 +1,159 @@
#!/usr/bin/env python3
"""
OpenCode Git Agent - Direct Integration
Simple direct commands for Git Agent operations
"""

import os
import subprocess
import sys


def run_git_backup():
    """Create automated backup"""
    try:
        project_root = "K:\\Projects\\uniswap_auto_clp"
        agent_path = os.path.join(project_root, "tools", "git_agent.py")

        result = subprocess.run(
            ["python", agent_path, "--backup"],
            cwd=project_root,
            capture_output=True,
            text=True,
            check=False,
            env=dict(os.environ, PYTHONIOENCODING='utf-8')
        )

        if result.returncode == 0:
            print("SUCCESS: Backup completed successfully!")
            print("Automated backup created and pushed to remote repository.")
        else:
            error_msg = result.stderr or result.stdout or "Unknown error"
            print("ERROR: Backup failed!")
            print(f"Error: {error_msg}")

    except Exception as e:
        print(f"ERROR: Exception during backup: {str(e)}")


def run_git_status():
    """Show git status"""
    try:
        project_root = "K:\\Projects\\uniswap_auto_clp"
        agent_path = os.path.join(project_root, "tools", "git_agent.py")

        result = subprocess.run(
            ["python", agent_path, "--status"],
            cwd=project_root,
            capture_output=True,
            text=True,
            check=False,
            env=dict(os.environ, PYTHONIOENCODING='utf-8')
        )

        if result.returncode == 0:
            print("SUCCESS: Git Agent Status")
            print(result.stdout)
        else:
            print("ERROR: Status check failed!")
            error_msg = result.stderr or result.stdout or "Unknown error"
            print(f"Error: {error_msg}")

    except Exception as e:
        print(f"ERROR: Exception during status check: {str(e)}")


def run_git_cleanup():
    """Clean up old backups"""
    try:
        project_root = "K:\\Projects\\uniswap_auto_clp"
        agent_path = os.path.join(project_root, "tools", "git_agent.py")

        result = subprocess.run(
            ["python", agent_path, "--cleanup"],
            cwd=project_root,
            capture_output=True,
            text=True,
            check=False,
            env=dict(os.environ, PYTHONIOENCODING='utf-8')
        )

        if result.returncode == 0:
            print("SUCCESS: Cleanup completed!")
            print("Old backup branches have been removed according to retention policy.")
        else:
            print("ERROR: Cleanup failed!")
            error_msg = result.stderr or result.stdout or "Unknown error"
            print(f"Error: {error_msg}")

    except Exception as e:
        print(f"ERROR: Exception during cleanup: {str(e)}")


def run_git_restore(time_input=None):
    """Restore from backup"""
    try:
        project_root = "K:\\Projects\\uniswap_auto_clp"

        if time_input:
            # Use git directly for restore
            branch_name = f"backup-{time_input}"

            result = subprocess.run(
                ["git", "checkout", branch_name],
                cwd=project_root,
                capture_output=True,
                text=True,
                check=False,
                env=dict(os.environ, PYTHONIOENCODING='utf-8')
            )

            if result.returncode == 0:
                print("SUCCESS: Restored to backup!")
                print(f"Branch: {branch_name}")
                print("Note: You are now on a backup branch.")
                print("Use 'git checkout main' to return to main branch when done.")
            else:
                print("ERROR: Restore failed!")
                print(f"Error: {result.stderr}")
        else:
            print("ERROR: Please specify backup timestamp")
            print("Usage: restore <timestamp>")
            print("Example: restore 2025-12-19-14")

    except Exception as e:
        print(f"ERROR: Exception during restore: {str(e)}")


def print_usage():
    """Print CLI usage help"""
    print("Git Agent - OpenCode Integration")
    print("Usage: python git_opencode.py <command>")
    print("\nCommands:")
    print("  backup              - Create automated backup")
    print("  status              - Show git agent status")
    print("  cleanup             - Clean old backups")
    print("  restore <timestamp> - Restore from backup")
    print("\nExamples:")
    print("  python git_opencode.py backup")
    print("  python git_opencode.py status")
    print("  python git_opencode.py restore 2025-12-19-14")


if __name__ == "__main__":
    if len(sys.argv) > 1:
        command = sys.argv[1]

        if command == "backup":
            run_git_backup()
        elif command == "status":
            run_git_status()
        elif command == "cleanup":
            run_git_cleanup()
        elif command == "restore":
            timestamp = sys.argv[2] if len(sys.argv) > 2 else None
            run_git_restore(timestamp)
        else:
            print_usage()
    else:
        print_usage()
238
florida/tools/git_utils.py
Normal file
@ -0,0 +1,238 @@
#!/usr/bin/env python3
"""
Git Utilities for Git Agent
Wrapper functions for Git operations
"""

import os
import subprocess
import logging
from typing import Dict, List, Optional, Any
from datetime import datetime


class GitUtils:
    """Git operations wrapper class"""

    def __init__(self, config: Dict[str, Any], logger: logging.Logger):
        self.config = config
        self.logger = logger
        self.project_root = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))

    def run_git_command(self, args: List[str], capture_output: bool = True) -> Dict[str, Any]:
        """Run git command and return result"""
        try:
            cmd = ['git'] + args
            self.logger.debug(f"Running: {' '.join(cmd)}")

            if capture_output:
                result = subprocess.run(
                    cmd,
                    cwd=self.project_root,
                    capture_output=True,
                    text=True,
                    check=False
                )
                return {
                    'success': result.returncode == 0,
                    'stdout': result.stdout.strip(),
                    'stderr': result.stderr.strip(),
                    'returncode': result.returncode
                }
            else:
                result = subprocess.run(cmd, cwd=self.project_root, check=False)
                return {
                    'success': result.returncode == 0,
                    'returncode': result.returncode
                }

        except Exception as e:
            self.logger.error(f"Git command failed: {e}")
            return {
                'success': False,
                'error': str(e),
                'returncode': -1
            }

    def is_repo_initialized(self) -> bool:
        """Check if repository is initialized"""
        result = self.run_git_command(['rev-parse', '--git-dir'])
        return result['success']

    def get_current_branch(self) -> str:
        """Get current branch name"""
        result = self.run_git_command(['branch', '--show-current'])
        return result['stdout'] if result['success'] else 'unknown'

    def get_backup_branches(self) -> List[str]:
        """Get all backup branches sorted by timestamp"""
        result = self.run_git_command(['branch', '-a'])
        if not result['success']:
            return []

        branches = []
        for line in result['stdout'].split('\n'):
            branch = line.strip().replace('* ', '').replace('remotes/origin/', '')
            if branch.startswith('backup-'):
                branches.append(branch)

        # Sort by the timestamp embedded in the branch name, newest first
        branches.sort(key=lambda x: x.replace('backup-', ''), reverse=True)
        return branches

    def has_changes(self) -> bool:
        """Check if there are uncommitted changes"""
        result = self.run_git_command(['status', '--porcelain'])
        return bool(result['stdout'].strip())

    def get_changed_files(self) -> List[str]:
        """Get list of changed files"""
        result = self.run_git_command(['status', '--porcelain'])
        if not result['success']:
            return []

        files = []
        for line in result['stdout'].split('\n'):
            if line.strip():
                # Extract filename (remove status codes)
                filename = line.strip()[2:] if len(line.strip()) > 2 else line.strip()
                if filename:
                    files.append(filename)

        return files

    def get_file_diff(self, filename: str) -> str:
        """Get diff for specific file"""
        result = self.run_git_command(['diff', '--', filename])
        return result['stdout'] if result['success'] else ''

    def create_branch(self, branch_name: str) -> bool:
        """Create and checkout new branch"""
        result = self.run_git_command(['checkout', '-b', branch_name])
        return result['success']

    def checkout_branch(self, branch_name: str) -> bool:
        """Checkout existing branch"""
        result = self.run_git_command(['checkout', branch_name])
        return result['success']

    def add_files(self, files: List[str] = None) -> bool:
        """Add files to staging area"""
        if files is None or not files:
            result = self.run_git_command(['add', '.'])
        else:
            result = self.run_git_command(['add'] + files)
        return result['success']

    def commit(self, message: str) -> bool:
        """Create commit with message"""
        result = self.run_git_command(['commit', '-m', message])
        return result['success']

    def push_branch(self, branch_name: str) -> bool:
        """Push branch to remote"""
        # Set up remote tracking if needed
        self.run_git_command(['push', '-u', 'origin', branch_name], capture_output=False)
        return True  # Assume success for push (may fail silently)

    def delete_local_branch(self, branch_name: str) -> bool:
        """Delete local branch"""
        result = self.run_git_command(['branch', '-D', branch_name])
        return result['success']

    def delete_remote_branch(self, branch_name: str) -> bool:
        """Delete remote branch"""
        result = self.run_git_command(['push', 'origin', '--delete', branch_name])
        return result['success']

    def get_remote_status(self) -> Dict[str, Any]:
        """Check remote connection status"""
        result = self.run_git_command(['remote', 'get-url', 'origin'])
        return {
            'connected': result['success'],
            'url': result['stdout'] if result['success'] else None
        }

    def setup_remote(self) -> bool:
        """Set up remote repository"""
        gitea_config = self.config.get('gitea', {})
        server_url = gitea_config.get('server_url')
        username = gitea_config.get('username')
        repository = gitea_config.get('repository')

        if not all([server_url, username, repository]):
            self.logger.warning("Incomplete Gitea configuration")
            return False

        remote_url = f"{server_url}/{username}/{repository}.git"

        # Check if remote already exists
        existing_remote = self.run_git_command(['remote', 'get-url', 'origin'])
        if existing_remote['success']:
            self.logger.info("Remote already configured")
            return True

        # Add remote
        result = self.run_git_command(['remote', 'add', 'origin', remote_url])
        return result['success']

    def init_initial_commit(self) -> bool:
        """Create initial commit for repository"""
        if not self.is_repo_initialized():
            # Initialize repository
            result = self.run_git_command(['init'])
            if not result['success']:
                return False

        # Check if there are any commits
        result = self.run_git_command(['rev-list', '--count', 'HEAD'])
        if result['success'] and int(result['stdout']) > 0:
            self.logger.info("Repository already has commits")
            return True

        # Add all files
        if not self.add_files():
            return False

        # Create initial commit
        initial_message = """🎯 Initial commit: Uniswap Auto CLP trading system

Core Components:
- uniswap_manager.py: V3 concentrated liquidity position manager
- clp_hedger.py: Hyperliquid perpetuals hedging bot
- requirements.txt: Python dependencies
- .gitignore: Security exclusions for sensitive data
- doc/: Project documentation
- tools/: Utility scripts and Git agent

Features:
- Automated liquidity provision on Uniswap V3 (WETH/USDC)
- Delta-neutral hedging using Hyperliquid perpetuals
- Position lifecycle management (open/close/rebalance)
- Automated backup and version control system

Security:
- Private keys and tokens excluded from version control
- Environment variables properly handled
- Automated security validation for backups"""

        return self.commit(initial_message)

    def commit_changes(self, message: str) -> bool:
        """Stage and commit all changes"""
        if not self.add_files():
            return False

        return self.commit(message)

    def return_to_main(self) -> bool:
        """Return to main branch"""
        main_branch = self.config.get('main_branch', {}).get('name', 'main')
        return self.checkout_branch(main_branch)

    def get_backup_number(self, branch_name: str) -> int:
        """Get backup number from branch name"""
        backup_branches = self.get_backup_branches()
        try:
            return backup_branches.index(branch_name) + 1
        except ValueError:
            return 0
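Note that `get_backup_branches` sorts by the timestamp embedded in the branch name; since `backup-YYYY-MM-DD-HH` is zero-padded, plain string comparison is already chronological. A quick check of the property the cleanup logic relies on:

```python
names = ['backup-2025-01-15-09', 'backup-2024-12-31-23', 'backup-2025-01-15-14']
names.sort(key=lambda x: x.replace('backup-', ''), reverse=True)
# Newest first, so a cleanup pass can keep the first keep_max_count
# entries and delete the rest:
print(names)
# ['backup-2025-01-15-14', 'backup-2025-01-15-09', 'backup-2024-12-31-23']
```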
134
florida/tools/kpi_tracker.py
Normal file
@ -0,0 +1,134 @@
import os
import csv
import time
import logging
from decimal import Decimal
from typing import Dict, Optional

# Setup Logger
logger = logging.getLogger("KPI_TRACKER")
logger.setLevel(logging.INFO)
# Basic handler if not already handled by parent
if not logger.handlers:
    ch = logging.StreamHandler()
    formatter = logging.Formatter('%(asctime)s - KPI - %(message)s')
    ch.setFormatter(formatter)
    logger.addHandler(ch)

KPI_FILE = os.path.join(os.path.dirname(os.path.dirname(os.path.abspath(__file__))), 'logs', 'kpi_history.csv')


def initialize_kpi_csv():
    """Creates the CSV with headers if it doesn't exist."""
    if not os.path.exists(os.path.dirname(KPI_FILE)):
        os.makedirs(os.path.dirname(KPI_FILE))

    if not os.path.exists(KPI_FILE):
        with open(KPI_FILE, 'w', newline='') as f:
            writer = csv.writer(f)
            writer.writerow([
                "Timestamp",
                "Date",
                "NAV_Total_USD",
                "Benchmark_HODL_USD",
                "Alpha_USD",
                "Uniswap_Val_USD",
                "Uniswap_Fees_Claimed_USD",
                "Uniswap_Fees_Unclaimed_USD",
                "Hedge_Equity_USD",
                "Hedge_PnL_Realized_USD",
                "Hedge_Fees_Paid_USD",
                "ETH_Price",
                "Fee_Coverage_Ratio"
            ])


def calculate_hodl_benchmark(initial_eth: Decimal, initial_usdc: Decimal, initial_hedge_usdc: Decimal, current_eth_price: Decimal) -> Decimal:
    """Calculates value if assets were just held (Wallet Assets + Hedge Account Cash)."""
    return (initial_eth * current_eth_price) + initial_usdc + initial_hedge_usdc


def log_kpi_snapshot(snapshot_data: Dict[str, float]):
    """
    Logs a KPI snapshot to CSV.
    Expected keys in snapshot_data:
    - initial_eth, initial_usdc, initial_hedge_usdc
    - current_eth_price
    - uniswap_pos_value_usd
    - uniswap_fees_claimed_usd
    - uniswap_fees_unclaimed_usd
    - hedge_equity_usd
    - hedge_pnl_realized_usd
    - hedge_fees_paid_usd
    - wallet_eth_bal, wallet_usdc_bal (Optional, for full NAV)
    """
    try:
        initialize_kpi_csv()

        # Convert all inputs to Decimal for precision
        price = Decimal(str(snapshot_data.get('current_eth_price', 0)))

        # 1. Benchmark (HODL)
        init_eth = Decimal(str(snapshot_data.get('initial_eth', 0)))
        init_usdc = Decimal(str(snapshot_data.get('initial_usdc', 0)))
        init_hedge = Decimal(str(snapshot_data.get('initial_hedge_usdc', 0)))
        benchmark_val = calculate_hodl_benchmark(init_eth, init_usdc, init_hedge, price)

        # 2. Strategy NAV (Net Asset Value)
        # NAV is defined here as the total current liquidation value of the
        # strategy components: the Uniswap position, claimed and unclaimed
        # fees, and the hedge account equity (which already reflects the
        # margin deposit plus realized and unrealized hedge PnL).
        uni_val = Decimal(str(snapshot_data.get('uniswap_pos_value_usd', 0)))
        uni_fees_claimed = Decimal(str(snapshot_data.get('uniswap_fees_claimed_usd', 0)))
        uni_fees_unclaimed = Decimal(str(snapshot_data.get('uniswap_fees_unclaimed_usd', 0)))

        hedge_equity = Decimal(str(snapshot_data.get('hedge_equity_usd', 0)))
        hedge_fees = Decimal(str(snapshot_data.get('hedge_fees_paid_usd', 0)))

        current_nav = uni_val + uni_fees_unclaimed + uni_fees_claimed + hedge_equity

        # Alpha: strategy value versus the HODL benchmark
        alpha = current_nav - benchmark_val

        # Coverage Ratio: Uniswap fee earnings vs. hedge trading costs
        total_hedge_cost = abs(hedge_fees)  # + funding if available
        total_uni_earnings = uni_fees_claimed + uni_fees_unclaimed

        if total_hedge_cost > 0:
            coverage_ratio = total_uni_earnings / total_hedge_cost
        else:
            coverage_ratio = Decimal("999.0")  # Sentinel: no hedge costs yet

        # Write
        with open(KPI_FILE, 'a', newline='') as f:
            writer = csv.writer(f)
            writer.writerow([
                int(time.time()),
                time.strftime('%Y-%m-%d %H:%M:%S'),
                f"{current_nav:.2f}",
                f"{benchmark_val:.2f}",
                f"{alpha:.2f}",
                f"{uni_val:.2f}",
                f"{uni_fees_claimed:.2f}",
                f"{uni_fees_unclaimed:.2f}",
                f"{hedge_equity:.2f}",
                f"{snapshot_data.get('hedge_pnl_realized_usd', 0):.2f}",
                f"{hedge_fees:.2f}",
                f"{price:.2f}",
                f"{coverage_ratio:.2f}"
            ])

        logger.info(f"📊 KPI Logged | NAV: ${current_nav:.2f} | Benchmark: ${benchmark_val:.2f} | Alpha: ${alpha:.2f}")

    except Exception as e:
        logger.error(f"Failed to log KPI: {e}")
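A minimal sketch of feeding the tracker from the hedger loop; all numbers are hypothetical and the keys match the docstring above (the import path assumes the module lives under `tools/`):

```python
from tools.kpi_tracker import log_kpi_snapshot

log_kpi_snapshot({
    'initial_eth': 1.0,
    'initial_usdc': 2000.0,
    'initial_hedge_usdc': 1000.0,
    'current_eth_price': 3200.0,
    'uniswap_pos_value_usd': 5150.0,
    'uniswap_fees_claimed_usd': 12.5,
    'uniswap_fees_unclaimed_usd': 3.4,
    'hedge_equity_usd': 985.0,
    'hedge_pnl_realized_usd': -15.0,
    'hedge_fees_paid_usd': -4.2,
})
```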
136
florida/tools/record_live_data.py
Normal file
@ -0,0 +1,136 @@
import argparse
import csv
import os
import time
import json
import threading
import signal
import sys
import websocket
from datetime import datetime

# Setup
MARKET_DATA_DIR = os.path.join(os.path.dirname(os.path.dirname(os.path.abspath(__file__))), 'market_data')
WS_URL = "wss://api.hyperliquid.xyz/ws"


class CandleRecorder:
    def __init__(self, coin, output_file):
        self.coin = coin
        self.output_file = output_file
        self.ws = None
        self.running = True

        # Candle State
        self.current_second = int(time.time())
        self.ticks = []  # List of prices seen in the current second
        self.candle_lock = threading.Lock()

        # Ensure dir exists
        if not os.path.exists(MARKET_DATA_DIR):
            os.makedirs(MARKET_DATA_DIR)

        # Init CSV
        self.file_exists = os.path.exists(output_file)
        with open(self.output_file, 'a', newline='') as f:
            writer = csv.writer(f)
            if not self.file_exists:
                writer.writerow(['timestamp', 'open', 'high', 'low', 'close', 'count'])

    def on_message(self, ws, message):
        try:
            data = json.loads(message)

            # Check for 'allMids' update
            if data.get('channel') == 'allMids':
                mids = data.get('data', {}).get('mids', {})
                if self.coin in mids:
                    price = float(mids[self.coin])
                    self.process_tick(price)

        except Exception as e:
            print(f"Error processing message: {e}")

    def process_tick(self, price):
        now_sec = int(time.time())

        with self.candle_lock:
            # If we moved to a new second, close the old candle first
            if now_sec > self.current_second:
                self.close_candle()
                self.current_second = now_sec
                self.ticks = []

            self.ticks.append(price)

    def close_candle(self):
        if not self.ticks:
            return

        # Build candle
        o = self.ticks[0]
        c = self.ticks[-1]
        h = max(self.ticks)
        l = min(self.ticks)
        count = len(self.ticks)
        ts = self.current_second * 1000

        # Write to file
        try:
            with open(self.output_file, 'a', newline='') as f:
                writer = csv.writer(f)
                writer.writerow([ts, o, h, l, c, count])

            if self.current_second % 10 == 0:
                print(f"[{datetime.fromtimestamp(self.current_second)}] 🕯️ Saved {self.coin}: {c} ({count} ticks)")

        except Exception as e:
            print(f"Error writing candle: {e}")

    def on_error(self, ws, error):
        print(f"WebSocket Error: {error}")

    def on_close(self, ws, close_status_code, close_msg):
        print("WebSocket Closed")

    def on_open(self, ws):
        print("WebSocket Connected. Subscribing to allMids...")
        sub_msg = {
            "method": "subscribe",
            "subscription": {"type": "allMids"}
        }
        ws.send(json.dumps(sub_msg))

    def start(self):
        print(f"🔴 RECORDING LIVE 1s DATA (WS) for {self.coin}")
        print(f"📂 Output: {self.output_file}")
        print("Press Ctrl+C to stop.")

        # run_forever() blocks the main thread, so Ctrl+C is handled by the
        # SIGINT handler registered below rather than a separate thread.
        self.ws = websocket.WebSocketApp(
            WS_URL,
            on_open=self.on_open,
            on_message=self.on_message,
            on_error=self.on_error,
            on_close=self.on_close
        )

        self.ws.run_forever()


def signal_handler(sig, frame):
    print("\nStopping recorder...")
    sys.exit(0)


if __name__ == "__main__":
    signal.signal(signal.SIGINT, signal_handler)

    parser = argparse.ArgumentParser(description="Record live 1s candles from Hyperliquid via WebSocket")
    parser.add_argument("--coin", type=str, default="ETH", help="Coin symbol")
    parser.add_argument("--output", type=str, help="Custom output file")

    args = parser.parse_args()

    filename = args.output or os.path.join(MARKET_DATA_DIR, f"{args.coin}_1s_LIVE_WS.csv")

    recorder = CandleRecorder(args.coin, filename)
    recorder.start()
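The recorded 1-second candles are easy to resample into coarser bars for backtesting. A sketch, assuming pandas is available (it is not a dependency of this file):

```python
import pandas as pd

df = pd.read_csv('market_data/ETH_1s_LIVE_WS.csv')
df['dt'] = pd.to_datetime(df['timestamp'], unit='ms')
bars = df.set_index('dt').resample('1min').agg(
    {'open': 'first', 'high': 'max', 'low': 'min',
     'close': 'last', 'count': 'sum'}
).dropna()
print(bars.tail())
```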