Update README for how to download dataset

Files changed:
- README.md (+12, -5)
- configs/metadata.json (+3, -2)
- docs/README.md (+12, -5)

README.md (CHANGED)
@@ -32,9 +32,15 @@ The training data is from <https://warwick.ac.uk/fac/cross_fac/tia/data/hovernet

 The provided labelled data was partitioned, based on the original split, into training (27 tiles) and testing (14 tiles) datasets.

+You can download the dataset with the following commands:
+```
+wget https://warwick.ac.uk/fac/cross_fac/tia/data/hovernet/consep_dataset.zip
+unzip consep_dataset.zip
+```
+
 ### Preprocessing

-After download the datasets, please run `scripts/prepare_patches.py` to prepare patches from tiles. Prepared patches are saved in `<your concep dataset path>`/Prepared. The implementation is referring to <https://github.com/vqdang/hover_net>. The command is like:
+After downloading the [dataset](https://warwick.ac.uk/fac/cross_fac/tia/data/hovernet/consep_dataset.zip), please run `scripts/prepare_patches.py` to prepare patches from the tiles. Prepared patches are saved in `<your consep dataset path>/Prepared`. The implementation follows <https://github.com/vqdang/hover_net>. The command is:

 ```
 python scripts/prepare_patches.py --root <your consep dataset path>
@@ -53,7 +59,7 @@ This model utilized a two-stage approach. The training was performed with the following

 ### Memory Consumption Warning

-If you face memory issues with CacheDataset, you can either switch to a regular Dataset class or lower the caching rate `cache_rate` in the configurations within range
+If you face memory issues with `CacheDataset`, you can either switch to a regular `Dataset` class or lower the caching rate `cache_rate` in the configurations, within the range [0, 1], to reduce the system RAM requirements.

 ## Input
 Input: RGB images
@@ -99,16 +105,17 @@ In addition to the Pythonic APIs, a few command line interfaces (CLI) are provided

 For more detailed usage instructions, visit the [MONAI Bundle Configuration Page](https://docs.monai.io/en/latest/config_syntax.html).

-#### Execute training, the evaluation
+#### Execute training (evaluation during training is performed on patches):
+Please note that if the default dataset path in the bundle config files has not been changed to your actual path, you can override it with `--dataset_dir`:

 - Run first stage
 ```
-python -m monai.bundle run --config_file configs/train.json --stage 0
+python -m monai.bundle run --config_file configs/train.json --stage 0 --dataset_dir <actual dataset path>
 ```

 - Run second stage
 ```
-python -m monai.bundle run --config_file configs/train.json --network_def#freeze_encoder False --stage 1
+python -m monai.bundle run --config_file configs/train.json --network_def#freeze_encoder False --stage 1 --dataset_dir <actual dataset path>
 ```

 #### Override the `train` config to execute multi-GPU training:
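The memory warning updated in the diff above concerns MONAI's `CacheDataset`. A minimal sketch of the two mitigations it mentions, using a placeholder data list and transform chain rather than this bundle's actual training pipeline:

```
from monai.data import CacheDataset, Dataset
from monai.transforms import Compose, EnsureChannelFirstd, LoadImaged

# Placeholder data list and transforms; the bundle defines its own in configs/train.json.
data = [{"image": "CoNSeP/Train/Images/train_1.png"}]
transforms = Compose([LoadImaged(keys="image"), EnsureChannelFirstd(keys="image")])

# Option 1: keep CacheDataset but cache only a fraction of the items (cache_rate in [0, 1]).
cached_ds = CacheDataset(data=data, transform=transforms, cache_rate=0.25)

# Option 2: switch to the regular Dataset, which applies transforms on the fly
# and keeps nothing cached in RAM.
plain_ds = Dataset(data=data, transform=transforms)
```

Either variant can be swapped into the training configuration wherever the cached dataset is defined.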
configs/metadata.json (CHANGED)

@@ -1,7 +1,8 @@
 {
     "schema": "https://github.com/Project-MONAI/MONAI-extra-test-data/releases/download/0.8.1/meta_schema_hovernet_20221124.json",
-    "version": "0.
+    "version": "0.2.0",
     "changelog": {
+        "0.2.0": "Update README for how to download dataset",
         "0.1.9": "add RAM warning",
         "0.1.8": "Update README for pretrained weights and save metrics in evaluate",
         "0.1.7": "Update README Formatting",
@@ -13,7 +14,7 @@
         "0.1.1": "update to use monai 1.1.0",
         "0.1.0": "complete the model package"
     },
-    "monai_version": "1.2.
+    "monai_version": "1.2.0rc6",
     "pytorch_version": "1.13.1",
     "numpy_version": "1.22.2",
     "optional_packages_version": {
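The metadata bump above sits next to the bundle schema URL in the `schema` field. The edited `configs/metadata.json` can be re-checked against that schema with MONAI's built-in verifier; a sketch, where the local path used to store the downloaded schema is arbitrary:

```
from monai.bundle import verify_metadata

# Downloads the schema referenced by the "schema" field (if needed) and
# validates configs/metadata.json against it.
verify_metadata(
    meta_file="configs/metadata.json",
    filepath="schemas/meta_schema_hovernet_20221124.json",  # arbitrary local path for the schema
)
```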
docs/README.md (CHANGED)

@@ -25,9 +25,15 @@ The training data is from <https://warwick.ac.uk/fac/cross_fac/tia/data/hovernet

 The provided labelled data was partitioned, based on the original split, into training (27 tiles) and testing (14 tiles) datasets.

+You can download the dataset with the following commands:
+```
+wget https://warwick.ac.uk/fac/cross_fac/tia/data/hovernet/consep_dataset.zip
+unzip consep_dataset.zip
+```
+
 ### Preprocessing

-After download the datasets, please run `scripts/prepare_patches.py` to prepare patches from tiles. Prepared patches are saved in `<your concep dataset path>`/Prepared. The implementation is referring to <https://github.com/vqdang/hover_net>. The command is like:
+After downloading the [dataset](https://warwick.ac.uk/fac/cross_fac/tia/data/hovernet/consep_dataset.zip), please run `scripts/prepare_patches.py` to prepare patches from the tiles. Prepared patches are saved in `<your consep dataset path>/Prepared`. The implementation follows <https://github.com/vqdang/hover_net>. The command is:

 ```
 python scripts/prepare_patches.py --root <your consep dataset path>
@@ -46,7 +52,7 @@ This model utilized a two-stage approach. The training was performed with the following

 ### Memory Consumption Warning

-If you face memory issues with CacheDataset, you can either switch to a regular Dataset class or lower the caching rate `cache_rate` in the configurations within range
+If you face memory issues with `CacheDataset`, you can either switch to a regular `Dataset` class or lower the caching rate `cache_rate` in the configurations, within the range [0, 1], to reduce the system RAM requirements.

 ## Input
 Input: RGB images
@@ -92,16 +98,17 @@ In addition to the Pythonic APIs, a few command line interfaces (CLI) are provided

 For more detailed usage instructions, visit the [MONAI Bundle Configuration Page](https://docs.monai.io/en/latest/config_syntax.html).

-#### Execute training, the evaluation
+#### Execute training (evaluation during training is performed on patches):
+Please note that if the default dataset path in the bundle config files has not been changed to your actual path, you can override it with `--dataset_dir`:

 - Run first stage
 ```
-python -m monai.bundle run --config_file configs/train.json --stage 0
+python -m monai.bundle run --config_file configs/train.json --stage 0 --dataset_dir <actual dataset path>
 ```

 - Run second stage
 ```
-python -m monai.bundle run --config_file configs/train.json --network_def#freeze_encoder False --stage 1
+python -m monai.bundle run --config_file configs/train.json --network_def#freeze_encoder False --stage 1 --dataset_dir <actual dataset path>
 ```

 #### Override the `train` config to execute multi-GPU training:
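Both READMEs mention Pythonic APIs alongside the CLI; the two-stage commands shown above, including the `--dataset_dir` and `network_def#freeze_encoder` overrides, can also be issued through `monai.bundle.run`, with overrides passed as keyword arguments. A sketch, assuming the working directory is the bundle root and `/path/to/CoNSeP` stands in for the actual dataset location:

```
from monai.bundle import run

overrides = {"dataset_dir": "/path/to/CoNSeP"}  # placeholder for the extracted dataset folder

# First stage (encoder frozen).
run(config_file="configs/train.json", stage=0, **overrides)

# Second stage: unfreeze the encoder; nested keys use the same "#" syntax as the CLI.
run(config_file="configs/train.json", stage=1, **{"network_def#freeze_encoder": False}, **overrides)
```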