Update README.md
README.md
CHANGED
@@ -6,9 +6,9 @@ task_categories:
 
 # Dataset
 
-<video controls autoplay src="https://cdn-uploads.huggingface.co/production/uploads/6018554e68258223ca22136f/
+<video controls autoplay src="https://cdn-uploads.huggingface.co/production/uploads/6018554e68258223ca22136f/yrKiz1A1q3i3VtkQ2jdKK.qt"></video>
 
-This dataset is used to train a transporter network for real-world pick/place
+This dataset is used to train a transporter network for real-world pick/place. The dataset is in TFDS format and was collected using [moveit2_data_collector](https://github.com/peterdavidfagan/moveit2_data_collector).
 
 # Download
 An example of downloading and loading the dataset is given below; as larger datasets are uploaded, this example script will change:
@@ -24,7 +24,6 @@ DATA_DIR="/home/robot"
 FILENAME="data.tar.xz"
 EXTRACTED_FILENAME="data"
 FILEPATH=os.path.join(DATA_DIR, FILENAME)
-EXTRACTED_FILEPATH=os.path.join(DATA_DIR, EXTRACTED_FILENAME)
 
 # download data from huggingface
 hf_hub_download(
@@ -40,7 +39,7 @@ with tarfile.open(FILEPATH, 'r:xz') as tar:
 os.remove(FILEPATH)
 
 # load with tfds
-ds = tfds.builder_from_directory(
+ds = tfds.builder_from_directory(DATA_DIR).as_dataset()['train']
 
 # basic inspection of data
 print(ds.element_spec)
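The fragments shown in the diff above can be assembled into an end-to-end script. The following is a minimal sketch, not the repository's exact script: the `repo_id` is a placeholder for the actual dataset repo, and the `extract_and_remove` / `download_and_load` helpers plus the `local_dir` argument to `hf_hub_download` are assumptions of this example.

```python
import os
import tarfile

# Paths taken from the README snippet.
DATA_DIR = "/home/robot"
FILENAME = "data.tar.xz"
FILEPATH = os.path.join(DATA_DIR, FILENAME)


def extract_and_remove(archive_path, dest_dir):
    """Unpack a .tar.xz archive into dest_dir, then delete the archive."""
    with tarfile.open(archive_path, "r:xz") as tar:
        tar.extractall(dest_dir)
    os.remove(archive_path)


def download_and_load(repo_id):
    """Download the archive from the Hub, unpack it, and load it with TFDS.

    repo_id is a placeholder argument -- pass the actual dataset repo id.
    """
    from huggingface_hub import hf_hub_download
    import tensorflow_datasets as tfds

    # download data from huggingface
    hf_hub_download(
        repo_id=repo_id,
        repo_type="dataset",
        filename=FILENAME,
        local_dir=DATA_DIR,
    )
    extract_and_remove(FILEPATH, DATA_DIR)

    # load with tfds and inspect the element structure
    ds = tfds.builder_from_directory(DATA_DIR).as_dataset()["train"]
    print(ds.element_spec)
    return ds
```

Keeping the heavy imports inside `download_and_load` means the path constants and the extraction helper can be reused without `tensorflow_datasets` installed.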