glenn-jocher committed
Commit 9cd4642 · 1 Parent(s): 104f541

Created with Colaboratory

Files changed (1)
  1. tutorial.ipynb +6 -6
tutorial.ipynb CHANGED
@@ -824,7 +824,7 @@
   "source": [
   "# 3. Train\n",
   "\n",
-  "Download [COCO128](https://www.kaggle.com/ultralytics/coco128), a small 128-image tutorial dataset, start tensorboard and train YOLOv5s from a pretrained checkpoint for 3 epochs (actual training is typically much longer, around **300-1000 epochs**, depending on your dataset)."
+  "Download [COCO128](https://www.kaggle.com/ultralytics/coco128), a small 128-image tutorial dataset, start tensorboard and train YOLOv5s from a pretrained checkpoint for 3 epochs (note actual training is typically much longer, around **300-1000 epochs**, depending on your dataset)."
   ]
  },
  {
@@ -885,9 +885,9 @@
   "id": "_pOkGLv1dMqh"
  },
  "source": [
-  "Train a YOLOv5s model on [COCO128](https://www.kaggle.com/ultralytics/coco128) with dataset `--data coco128.yaml`, starting from pretrained `--weights yolov5s.pt`, or from randomly initialized `--weights '' --cfg yolov5s.yaml`. Models are downloaded automatically from the [latest YOLOv5 release](https://github.com/ultralytics/yolov5/releases), and **COCO, COCO128, and VOC datasets are downloaded automatically** on first use.\n",
+  "Train a YOLOv5s model on [COCO128](https://www.kaggle.com/ultralytics/coco128) with `--data coco128.yaml`, starting from pretrained `--weights yolov5s.pt`, or from randomly initialized `--weights '' --cfg yolov5s.yaml`. Models are downloaded automatically from the [latest YOLOv5 release](https://github.com/ultralytics/yolov5/releases), and **COCO, COCO128, and VOC datasets are downloaded automatically** on first use.\n",
   "\n",
-  "All training results are saved to `runs/train/exp0` for the first experiment, then `runs/exp1`, `runs/exp2` etc. for subsequent experiments.\n"
+  "All training results are saved to `runs/train/` with incrementing run directories, i.e. `runs/train/exp0`, `runs/train/exp1` etc.\n"
   ]
  },
  {
@@ -1012,7 +1012,7 @@
  "source": [
   "## Weights & Biases Logging (🚀 NEW)\n",
   "\n",
-  "[Weights & Biases](https://www.wandb.com/) (W&B) is now integrated with YOLOv5 for real-time visualization and cloud logging of training runs. This allows for better run comparison and introspection, as well improved visibility and collaboration among team members. To enable W&B logging install `wandb`, and then train normally (you will be guided setup on first use).\n",
+  "[Weights & Biases](https://www.wandb.com/) (W&B) is now integrated with YOLOv5 for real-time visualization and cloud logging of training runs. This allows for better run comparison and introspection, as well as improved visibility and collaboration for teams. To enable W&B logging install `wandb`, and then train normally (you will be guided through setup on first use).\n",
   "```bash\n",
   "$ pip install wandb\n",
   "```\n",
@@ -1030,7 +1030,7 @@
  "source": [
   "## Local Logging\n",
   "\n",
-  "All results are logged by default to the `runs/train/exp0` directory, with a new directory created for each new training as `runs/exp1`, `runs/exp2`, etc. View train and test jpgs to see mosaics, labels/predictions and augmentation effects. Note a **Mosaic Dataloader** is used for training (shown below), a new concept developed by Ultralytics and first featured in [YOLOv4](https://arxiv.org/abs/2004.10934)."
+  "All results are logged by default to `runs/train`, with a new experiment directory created for each new training as `runs/train/exp1`, `runs/train/exp2`, etc. View train and test jpgs to see mosaics, labels, predictions and augmentation effects. Note a **Mosaic Dataloader** is used for training (shown below), a new concept developed by Ultralytics and first featured in [YOLOv4](https://arxiv.org/abs/2004.10934)."
   ]
  },
  {
@@ -1053,7 +1053,7 @@
  },
  "source": [
   "> <img src=\"https://user-images.githubusercontent.com/26833433/83667642-90fcb200-a583-11ea-8fa3-338bbf7da194.jpeg\" width=\"750\"> \n",
-  "`train_batch0.jpg` train batch 0 mosaics and labels\n",
+  "`train_batch0.jpg` shows train batch 0 mosaics and labels\n",
   "\n",
   "> <img src=\"https://user-images.githubusercontent.com/26833433/83667626-8c37fe00-a583-11ea-997b-0923fe59b29b.jpeg\" width=\"750\"> \n",
   "`test_batch0_labels.jpg` shows test batch 0 labels\n",
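The incrementing run-directory scheme this commit documents (`runs/train/exp0`, `runs/train/exp1`, ...) can be sketched in a few lines. This is a minimal illustration only; `next_run_dir` is a hypothetical helper name, not YOLOv5's actual implementation:

```python
from pathlib import Path

def next_run_dir(base="runs/train"):
    """Return the first unused experiment directory under `base`,
    following the exp0, exp1, exp2, ... naming described above."""
    n = 0
    while (Path(base) / f"exp{n}").exists():
        n += 1  # directory taken by an earlier run; try the next index
    return Path(base) / f"exp{n}"
```

Each new training run then writes into its own directory, so earlier results are never overwritten.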