implemented locking in training setup

Philipp
2025-11-28 16:09:14 +01:00
parent 5220ffbe46
commit c43545e137
15 changed files with 962 additions and 74 deletions

140
backend/data/README.md Normal file

@@ -0,0 +1,140 @@
# YOLOX Base Configuration System
## Overview
This directory contains base experiment configurations for YOLOX models. These configurations define "protected" parameters that are preserved during transfer learning from COCO-pretrained models.
## How It Works
### Transfer Learning Flow
1. **COCO Transfer Learning** (`transfer_learning = 'coco'`):
- Loads base configuration from `data/yolox_*.py` based on `selected_model`
- Base parameters are **protected** and used as defaults
- User settings from the form only override what's explicitly set
- Result: proven COCO defaults plus your explicit customizations
2. **Sketch/Custom Training** (`transfer_learning = 'sketch'`):
- No base configuration loaded
- Uses only user-defined parameters from the training form
- Full control over all settings (see the sketch below)
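A minimal sketch of this selection step is shown below. `load_base_config` lives in `services/generate_yolox_exp.py` and returns the base parameters as a dict; the surrounding `resolve_base_params` wrapper and its signature are illustrative only, not the actual backend code:
```python
from services.generate_yolox_exp import load_base_config

def resolve_base_params(selected_model: str, transfer_learning: str) -> dict:
    """Illustrative wrapper: return protected base parameters, or {} for sketch training."""
    if transfer_learning != 'coco':
        # Sketch/custom training: only user-defined form values apply
        return {}
    try:
        # e.g. selected_model = 'yolox-s' -> data/yolox_s.py
        return load_base_config(selected_model)
    except Exception as exc:
        print(f"Warning: Could not load base config for {selected_model}: {exc}")
        print("Falling back to custom settings only")
        return {}
```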
### Base Configuration Files
- `yolox_s.py` - YOLOX-Small (depth=0.33, width=0.50)
- `yolox_m.py` - YOLOX-Medium (depth=0.67, width=0.75)
- `yolox_l.py` - YOLOX-Large (depth=1.0, width=1.0)
- `yolox_x.py` - YOLOX-XLarge (depth=1.33, width=1.25)
### Protected Parameters
These parameters are defined in base configs and **preserved** unless explicitly overridden:
**Model Architecture:**
- `depth` - Model depth multiplier
- `width` - Model width multiplier
- `activation` - Activation function (silu)
**Training Hyperparameters:**
- `basic_lr_per_img` - Learning rate per image
- `scheduler` - LR scheduler (yoloxwarmcos)
- `warmup_epochs` - Warmup epochs
- `max_epoch` - Maximum training epochs
- `no_aug_epochs` - No augmentation epochs
- `min_lr_ratio` - Minimum LR ratio
**Optimizer:**
- `momentum` - SGD momentum
- `weight_decay` - Weight decay
**Augmentation:**
- `mosaic_prob` - Mosaic probability
- `mixup_prob` - Mixup probability
- `hsv_prob` - HSV augmentation probability
- `flip_prob` - Flip probability
- `degrees` - Rotation degrees
- `translate` - Translation
- `shear` - Shear
- `mosaic_scale` - Mosaic scale range
- `mixup_scale` - Mixup scale range
- `enable_mixup` - Enable mixup
**Input/Output:**
- `input_size` - Training input size
- `test_size` - Testing size
- `random_size` - Random size range
**Evaluation:**
- `eval_interval` - Evaluation interval
- `print_interval` - Print interval
## Customizing Base Configurations
### Adding a New Model
Create a new file `data/yolox_MODELNAME.py`:
```python
#!/usr/bin/env python3
# -*- coding:utf-8 -*-
# Base configuration for YOLOX-MODELNAME
class BaseExp:
    """Base experiment configuration for YOLOX-MODELNAME"""
    # Define protected parameters
    depth = 1.0
    width = 1.0
    # ... other parameters
```
### Modifying Parameters
Edit the corresponding `yolox_*.py` file and update the `BaseExp` class attributes.
**Example:** To change YOLOX-S max epochs:
```python
# In data/yolox_s.py
class BaseExp:
    max_epoch = 500  # Changed from 300
    # ... other parameters
```
## Parameter Priority
The merge logic follows this priority (highest to lowest), illustrated by the sketch after the list:
1. **User form values** (if explicitly set, not None)
2. **Base config values** (if transfer_learning='coco')
3. **Default fallbacks** (hardcoded minimums)
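A minimal sketch of this merge order follows; the `merge_params` helper and the partial `DEFAULTS` table are illustrative, not the actual backend API:
```python
DEFAULTS = {'width': 1.0}  # illustrative subset of the hardcoded fallbacks

def merge_params(user_form: dict, base_config: dict) -> dict:
    merged = dict(DEFAULTS)                 # 3. hardcoded fallbacks
    merged.update(base_config)              # 2. base config ({} for sketch training)
    # 1. user form values win, but only when explicitly set (not None)
    merged.update({k: v for k, v in user_form.items() if v is not None})
    return merged

# Reproduces the COCO transfer learning example below:
# merge_params({'max_epoch': 100, 'depth': 0.5},
#              {'depth': 0.33, 'width': 0.50, 'max_epoch': 300})
# -> {'width': 0.50, 'depth': 0.5, 'max_epoch': 100}
```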
## Example
### COCO Transfer Learning
```
User sets in form: max_epoch=100, depth=0.5
Base config (yolox_s.py) has: depth=0.33, width=0.50, max_epoch=300
Result: depth=0.5 (user override), width=0.50 (base), max_epoch=100 (user override)
```
### Sketch Training
```
User sets in form: max_epoch=100, depth=0.5
No base config loaded
Result: depth=0.5 (user), max_epoch=100 (user), width=1.0 (default fallback)
```
## Debugging
To see which base config was loaded, check Flask logs:
```
Loaded base config for yolox-s: ['depth', 'width', 'activation', ...]
```
If base config fails to load:
```
Warning: Could not load base config for yolox-s: [error message]
Falling back to custom settings only
```

1
backend/data/__init__.py Normal file

@@ -0,0 +1 @@
# Base experiment configurations for YOLOX models


@@ -0,0 +1,79 @@
#!/usr/bin/env python3
"""
Test script to demonstrate base configuration loading for YOLOX models
"""
import sys
import os
sys.path.insert(0, os.path.join(os.path.dirname(__file__), '..'))
from services.generate_yolox_exp import load_base_config
def test_base_configs():
    """Test loading all base configurations"""
    models = ['yolox-s', 'yolox-m', 'yolox-l', 'yolox-x']

    print("=" * 80)
    print("YOLOX Base Configuration Test")
    print("=" * 80)

    for model in models:
        print(f"\n{'='*80}")
        print(f"Model: {model.upper()}")
        print(f"{'='*80}")

        try:
            config = load_base_config(model)

            # Group parameters by category
            arch_params = ['depth', 'width', 'activation']
            training_params = ['max_epoch', 'warmup_epochs', 'basic_lr_per_img', 'scheduler',
                               'no_aug_epochs', 'min_lr_ratio']
            optimizer_params = ['momentum', 'weight_decay']
            augmentation_params = ['mosaic_prob', 'mixup_prob', 'hsv_prob', 'flip_prob',
                                   'degrees', 'translate', 'shear', 'mosaic_scale',
                                   'mixup_scale', 'enable_mixup']
            input_params = ['input_size', 'test_size', 'random_size']
            eval_params = ['eval_interval', 'print_interval']

            print("\n[Architecture]")
            for param in arch_params:
                if param in config:
                    print(f" {param:25s} = {config[param]}")

            print("\n[Training Hyperparameters]")
            for param in training_params:
                if param in config:
                    print(f" {param:25s} = {config[param]}")

            print("\n[Optimizer]")
            for param in optimizer_params:
                if param in config:
                    print(f" {param:25s} = {config[param]}")

            print("\n[Data Augmentation]")
            for param in augmentation_params:
                if param in config:
                    print(f" {param:25s} = {config[param]}")

            print("\n[Input/Output]")
            for param in input_params:
                if param in config:
                    print(f" {param:25s} = {config[param]}")

            print("\n[Evaluation]")
            for param in eval_params:
                if param in config:
                    print(f" {param:25s} = {config[param]}")

            print(f"\n✓ Successfully loaded {len(config)} parameters")

        except Exception as e:
            print(f"✗ Error loading config: {e}")

    print("\n" + "="*80)
    print("Test Complete")
    print("="*80)


if __name__ == '__main__':
    test_base_configs()

15
backend/data/yolox_l.py Normal file

@@ -0,0 +1,15 @@
#!/usr/bin/env python3
# -*- coding:utf-8 -*-
# Base configuration for YOLOX-L model
# These parameters are preserved during transfer learning from COCO
class BaseExp:
    """Base experiment configuration for YOLOX-L"""
    # Model architecture (protected - always use these for yolox-l)
    depth = 1.0
    width = 1.0
    scheduler = "yoloxwarmcos"
    activation = "silu"

15
backend/data/yolox_m.py Normal file

@@ -0,0 +1,15 @@
#!/usr/bin/env python3
# -*- coding:utf-8 -*-
# Base configuration for YOLOX-M model
# These parameters are preserved during transfer learning from COCO
class BaseExp:
    """Base experiment configuration for YOLOX-M"""
    # Model architecture (protected - always use these for yolox-m)
    depth = 0.67
    width = 0.75
    scheduler = "yoloxwarmcos"
    activation = "silu"

17
backend/data/yolox_s.py Normal file

@@ -0,0 +1,17 @@
#!/usr/bin/env python3
# -*- coding:utf-8 -*-
# Base configuration for YOLOX-S model
# These parameters are preserved during transfer learning from COCO
class BaseExp:
    """Base experiment configuration for YOLOX-S"""
    # Model architecture (protected - always use these for yolox-s)
    depth = 0.33
    width = 0.50
    scheduler = "yoloxwarmcos"
    activation = "silu"

15
backend/data/yolox_x.py Normal file

@@ -0,0 +1,15 @@
#!/usr/bin/env python3
# -*- coding:utf-8 -*-
# Base configuration for YOLOX-X model
# These parameters are preserved during transfer learning from COCO
class BaseExp:
    """Base experiment configuration for YOLOX-X"""
    # Model architecture (protected - always use these for yolox-x)
    depth = 1.33
    width = 1.25
    scheduler = "yoloxwarmcos"
    activation = "silu"