|
--- |
|
license: mit |
|
tags: |
|
- automl |
|
- genetic-programming |
|
- loss-functions |
|
- neural-networks |
|
- bittensor |
|
datasets: |
|
- synapz/automl-genes |
|
--- |
|
|
|
# AutoML Evolved Loss Functions |
|
|
|
This repository contains evolved neural network loss functions discovered through distributed genetic programming on the Bittensor network (subnet 49, Hivetrain AutoML). |
|
|
|
## Overview |
|
|
|
The genes stored here represent novel loss functions optimized for: |
|
- Image classification tasks |
|
- Neural network training efficiency |
|
- Improved convergence rates |
|
|
|
## Repository Structure |
|
```
/automl-genes
├── losses/     # Evolved loss function implementations
├── metrics/    # Performance metrics and evaluations
└── metadata/   # Gene genealogy and evolution data
```
|
|
|
|
|
## Technical Details |
|
|
|
Loss functions are evolved using: |
|
- Genetic programming with population size 100 |
|
- Tournament selection (size 3) |
|
- Multi-architecture validation across: |
|
  - MLP |

  - ResNet |

  - MobileNet V3 |

  - EfficientNet V2 |
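
The evolutionary loop can be pictured as follows. This is a minimal illustrative sketch, not the Hivetrain implementation: the gene representation, mutation operator, and fitness evaluation (which in practice trains and validates small models on the architectures listed above) are hypothetical placeholders; only the population size of 100 and the tournament size of 3 come from the configuration described here.

```python
# Minimal sketch of the evolutionary loop described above.
# Helper names and bodies are hypothetical placeholders.
import random

POPULATION_SIZE = 100   # population size used during evolution
TOURNAMENT_SIZE = 3     # tournament selection size

PRIMITIVES = ["add", "mul", "sub", "log", "abs"]

def random_loss_gene():
    """Placeholder: sample a random candidate loss-function expression."""
    return [random.choice(PRIMITIVES) for _ in range(8)]

def evaluate(gene):
    """Placeholder fitness. In practice this trains/validates small models
    (MLP, ResNet, MobileNet V3, EfficientNet V2) with the candidate loss."""
    return random.random()

def tournament_select(population, fitnesses, k=TOURNAMENT_SIZE):
    """Pick the fittest of k randomly chosen individuals."""
    contenders = random.sample(range(len(population)), k)
    best = max(contenders, key=lambda i: fitnesses[i])
    return population[best]

def mutate(gene):
    """Placeholder mutation: swap one primitive at random."""
    child = list(gene)
    child[random.randrange(len(child))] = random.choice(PRIMITIVES)
    return child

population = [random_loss_gene() for _ in range(POPULATION_SIZE)]
for generation in range(10):
    fitnesses = [evaluate(g) for g in population]
    population = [mutate(tournament_select(population, fitnesses))
                  for _ in range(POPULATION_SIZE)]
```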
|
|
|
## Usage |
|
|
|
The evolved loss functions can be imported and used as drop-in replacements for standard PyTorch loss functions in deep learning projects. |
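
As an illustration of the drop-in pattern, here is a hedged usage sketch. The class name `EvolvedLoss` and its body are placeholders (the actual evolved functions live under `losses/` and compose primitives discovered during evolution); the point is only the calling convention, which mirrors `nn.CrossEntropyLoss`.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class EvolvedLoss(nn.Module):
    """Placeholder standing in for an evolved loss loaded from losses/."""
    def forward(self, logits, targets):
        # A real evolved gene would compose the primitive ops found by the
        # genetic search; cross-entropy is used here only as a stand-in.
        return F.cross_entropy(logits, targets)

criterion = EvolvedLoss()  # used exactly like nn.CrossEntropyLoss()
logits = torch.randn(4, 10, requires_grad=True)
targets = torch.randint(0, 10, (4,))
loss = criterion(logits, targets)
loss.backward()
```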
|
|
|
## Project |
|
|
|
This repository is part of the Hivetrain AutoML subnet, which focuses on discovering improved neural network components through distributed evolution. |
|
|
|
For more information, visit: |
|
- [DistributedAutoML Repository](https://github.com/Hivetensor/DistributedAutoML) |
|
- [Hivetrain Discord](https://discord.gg/JpRSqTBBZU) |
|
|
|
--- |
|
|