Merge branch 'main' of https://huggingface.co/datasets/AstroCompress/SBI-16-3D into main

README.md CHANGED
```bash
pip install datasets astropy
```

There are two datasets: `tiny` and `full`, each with `train` and `test` splits. The `tiny` dataset has 2 4D images in the `train` split and 1 in the `test` split. The `full` dataset contains all the images in the `data/` directory.
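Since `astropy` is listed as a dependency, the raw cubes can also be inspected without the loader; a minimal sketch, assuming the files under `data/` are FITS cubes (here a tiny synthetic file stands in for a real one, so the snippet runs anywhere):

```python
# Sketch: inspect one cube directly with astropy, assuming FITS files.
# A small synthetic file stands in for a real data/ file.
import numpy as np
from astropy.io import fits

cube = np.zeros((5, 64, 64), dtype=np.uint16)  # stand-in for a (5, 2048, 2048) exposure
fits.writeto("example.fits", cube, overwrite=True)

with fits.open("example.fits") as hdul:
    data = hdul[0].data
    print(data.shape)  # -> (5, 64, 64)
```

The loader script normally handles this conversion for you; this is only for poking at individual files.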
## Local Use (RECOMMENDED)

Alternatively, you can clone this repo and use it directly without connecting to Hugging Face:

```bash
git clone https://huggingface.co/datasets/AstroCompress/SBI-16-3D
```

```bash
git lfs pull
```

Then `cd SBI-16-3D` and start Python like:

```python
from datasets import load_dataset

dataset = load_dataset("./SBI-16-3D.py", "tiny", data_dir="./data/")
ds = dataset.with_format("np")
```

Now you should be able to use the `ds` variable like:

```python
ds["test"][0]["image"].shape  # -> (5, 2048, 2048)
```
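With the `"np"` format, the `image` column comes back as a regular NumPy array, so ordinary slicing applies; a quick sketch on a synthetic stand-in cube (the dtype here is an assumption for illustration, not taken from the dataset):

```python
import numpy as np

# Stand-in for ds["test"][0]["image"]: 5 frames of 2048x2048 pixels.
# The uint16 dtype is assumed for illustration only.
img = np.zeros((5, 2048, 2048), dtype=np.uint16)

first_frame = img[0]               # one 2D frame
pixel_series = img[:, 1024, 1024]  # the same pixel tracked across all 5 frames
print(first_frame.shape, pixel_series.shape)  # -> (2048, 2048) (5,)
```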
Note that it will take a long time to download and convert the images into the local cache for the `full` dataset. Afterward, usage should be quick, as the files are memory-mapped from disk.

## Use from Huggingface Directly

To use this data directly from Huggingface, you'll want to log in on the command line before starting Python. With the dataset loaded, you can also request `torch` or `pandas` formats:

```python
dst = dataset.with_format("torch", columns=["image"], dtype=torch.uint8)

# or pandas
dsp = dataset.with_format("pandas", columns=["image"], dtype=numpy.uint8)
```
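The speed after the first pass comes from memory-mapping: the converted cache is read page-by-page rather than loaded wholesale. The same effect can be seen with a plain NumPy memmap (the file name and shape below are placeholders, not part of the dataset API):

```python
import numpy as np

shape = (5, 256, 256)  # placeholder for a (5, 2048, 2048) cube
mm = np.memmap("cube.dat", dtype=np.uint16, mode="w+", shape=shape)
mm[:] = 1
mm.flush()

# Re-open read-only; slicing touches only the pages for that frame.
ro = np.memmap("cube.dat", dtype=np.uint16, mode="r", shape=shape)
frame = np.asarray(ro[0])
print(frame.shape, int(frame.sum()))  # -> (256, 256) 65536
```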