Update README.md

This dataset card contains usage instructions and metadata for all data-products.

* Download the `.h5.gz` files in `data/<source dataset name>`. Our source datasets include SustainBench, USAVars, and BigEarthNetv2.0. Each dataset with the augmented geographic inputs is detailed in [this section 📦](#geolayersused).
* You may use [pigz](https://linux.die.net/man/1/pigz) to decompress an archive with `pigz -d <file>.h5.gz`. This is especially recommended for the USAVars train split, which is 117 GB when uncompressed.
* Datasets with auxiliary geographic inputs can be read with h5py; see the sketch after this list.
* If you plan on using the original datasets, please cite our paper "Using Multiple Input Modalities can Improve Data-Efficiency and O.O.D. Generalization for ML with Satellite Imagery". BibTeX can be found at the bottom of this README.
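
A minimal sketch of inspecting one of the decompressed `.h5` files with h5py follows; the filename is a placeholder for whichever archive you downloaded, and the available keys vary by source dataset:

```python
import h5py

# Hypothetical filename: substitute whichever archive you decompressed.
with h5py.File("usavars_train.h5", "r") as f:
    # Assumes the top-level entries are HDF5 datasets (imagery plus the
    # auxiliary geographic inputs); print each shape and dtype.
    for key in f.keys():
        print(key, f[key].shape, f[key].dtype)
```
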
### Usage Instructions for the BigEarthNetv2.0 Dataset (Clasen et al. (2025))

We detail usage instructions for training on the BigEarthNetv2.0 dataset as a separate section due to the unique fusion mechanism we use for this input modality. Our training code, which fuses an auxiliary SatCLIP token into a ViT with `TOKEN-FUSE` and trains on BigEarthNetv2.0, lives in a distinct GitHub repository: [GeoViTTokenFusion](https://github.com/UCBoulder/GeoViTTokenFusion). This repository also uses torchgeo and timm as *submodules*.
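
As a rough illustration of the idea only (the authoritative `TOKEN-FUSE` implementation is in the repository above; the class name and dimensions here are assumptions), fusing an auxiliary SatCLIP token into a ViT token sequence can look like this:

```python
import torch
import torch.nn as nn

class AuxTokenFusion(nn.Module):
    """Sketch: append a projected SatCLIP embedding as one extra ViT token."""

    def __init__(self, satclip_dim: int = 256, vit_dim: int = 768):
        super().__init__()
        # Project the location embedding to the transformer's token width.
        self.proj = nn.Linear(satclip_dim, vit_dim)

    def forward(self, patch_tokens: torch.Tensor, satclip_emb: torch.Tensor) -> torch.Tensor:
        # patch_tokens: (B, N, vit_dim); satclip_emb: (B, satclip_dim)
        aux_token = self.proj(satclip_emb).unsqueeze(1)  # (B, 1, vit_dim)
        # Concatenate so every attention block can attend to the location token.
        return torch.cat([patch_tokens, aux_token], dim=1)
```
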
We use the original [BigEarthNetv2.0](https://bigearth.net/) dataset, which is processed with spatially-buffered train-test splits. We release two **processed** versions of the dataset introduced in Clasen et al. (2025).

The first version is stored in the directory `data/bigearthnet/raw/`. This dataset, although called `raw`, is a pre-processed version of the raw BigEarthNetv2.0 dataset. We follow the instructions listed in [this repository](https://git.tu-berlin.de/rsim/reben-training-scripts/-/tree/main?ref_type=heads#data). Steps performed:

1. We download the raw `BigEarthNet-S2.tar.zst` Sentinel-2 BigEarthNet dataset.
2. We extract and process the raw S2 tiles into an LMDB 'Lightning' database, which allows for faster reads during training. We use the [rico-hdl](https://github.com/kai-tub/rico-hdl) tool to accomplish this.
3. We download reference maps and Sentinel-2 tile metadata with snow and cloud cover rasters.
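
If you want to sanity-check the resulting database, the sketch below iterates the LMDB with the `lmdb` Python package; the path is our assumption, and decoding the values is specific to how rico-hdl encodes samples:

```python
import lmdb

# The path is an assumption; point it at the extracted LMDB directory.
env = lmdb.open("data/bigearthnet/raw", readonly=True, lock=False)
with env.begin() as txn:
    # Iterate raw key/value pairs to verify the database is readable.
    for i, (key, value) in enumerate(txn.cursor()):
        print(key.decode(errors="replace"), len(value), "bytes")
        if i >= 4:  # show the first few entries only
            break
env.close()
```
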

The second version of the BigEarthNetv2.0 dataset is stored in `data/bigearthnet`.
This version of the processed dataset comes with (i) raw location co-ordinates, and (ii) pre-computed SatCLIP embeddings (L=10, ResNet50 image encoder backbone).
You may access these embeddings and location metadata with keys `location` and `satclip_embedding`.
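
A minimal sketch of reading both fields, assuming this version is distributed as an `.h5` file like the other data products (the filename is a placeholder):

```python
import h5py

# Hypothetical filename for this second processed version.
with h5py.File("bigearthnet_v2.h5", "r") as f:
    locations = f["location"][:]             # raw location co-ordinates per sample
    embeddings = f["satclip_embedding"][:]   # SatCLIP embeddings (L=10, ResNet50)
    print(locations.shape, embeddings.shape)
```
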
### Usage Instructions for the SustainBench Farmland Boundary Delineation Dataset (Yeh et al. (2021))

1. Unzip the archive in `data/sustainbench-field-boundary-delineation` with `unzip sustainbench.zip`.
2. You should see a directory structure as follows:
```