Living Optics Orchard Dataset
Overview
This dataset contains 435 images captured in one of the UK's largest orchards using the Living Optics Camera.
The data consists of RGB images, sparse spectral samples and instance segmentation masks.
The dataset is derived from 44 unique raw files corresponding to 435 frames, so multiple frames can originate from the same raw file. This structure calls for a split strategy that avoids data leakage. To ensure robust evaluation, the dataset was divided with an 8:2 split performed at the raw-file level rather than the frame level, guaranteeing that all frames associated with a given raw file are confined to either the training set or the test set and eliminating any overlap of information between the two. The dataset contains 3,785 instances of Royal Gala Apples, 2,523 instances of Pears, and 73 instances of Cox Apples, for a total of 6,381 labelled instances.
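For illustration only (the published train/test split ships with the dataset, and this is not the authors' split script), a raw-file-level 8:2 split along these lines could be implemented as follows; the `frames` list and `raw_file_of` mapping are hypothetical inputs:

```python
import random


def split_by_raw_file(frames, raw_file_of, train_fraction=0.8, seed=0):
    """Split frames 8:2 at the raw-file level so no raw file spans both sets.

    frames: list of frame identifiers (hypothetical).
    raw_file_of: mapping from frame identifier to its source raw file (hypothetical).
    """
    raw_files = sorted({raw_file_of[f] for f in frames})
    random.Random(seed).shuffle(raw_files)
    n_train = round(train_fraction * len(raw_files))
    train_raws = set(raw_files[:n_train])
    # Every frame from a given raw file ends up entirely in one split,
    # which is what prevents information leakage between train and test.
    train = [f for f in frames if raw_file_of[f] in train_raws]
    test = [f for f in frames if raw_file_of[f] not in train_raws]
    return train, test
```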
The spectra which do not lie within a labelled segmentation mask can be used for negative sampling when training classifiers.
Additional unlabelled data is available upon request.
Classes
The training dataset contains 3 classes:
- 🍎 cox apple - 3,605 total spectral samples
- 🍎 royal gala apple - 13,282 total spectral samples
- 🍐 pear - 34,398 total spectral samples
The remaining 1,855,755 spectra are unlabelled and can be considered a single "background" class.
Requirements
The Usage example below assumes the Living Optics SDK and data tools are installed, i.e. the `lo.sdk` and `lo.data` Python packages that appear in its imports.
Download instructions
Command line
```bash
mkdir -p hyperspectral-orchard
huggingface-cli download LivingOptics/hyperspectral-orchard --repo-type dataset --local-dir hyperspectral-orchard
```
Python
```python
from huggingface_hub import snapshot_download

dataset_path = snapshot_download(repo_id="LivingOptics/hyperspectral-orchard", repo_type="dataset")
print(dataset_path)
```
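If you prefer the files to land in the folder that the Usage example below reads from, `snapshot_download` also accepts a `local_dir` argument; the target path here is only an example and can be changed freely:

```python
import os.path as op

from huggingface_hub import snapshot_download

# Download into the same location the Usage example expects (example path only).
dataset_path = snapshot_download(
    repo_id="LivingOptics/hyperspectral-orchard",
    repo_type="dataset",
    local_dir=op.expanduser("~/Downloads/hyperspectral-orchard"),
)
print(dataset_path)
```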
Usage
```python
import os.path as op
import numpy.typing as npt
from typing import List, Dict, Generator

import matplotlib.pyplot as plt

from lo.data.tools import Annotation, LODataItem, LOJSONDataset, draw_annotations
from lo.data.dataset_visualisation import get_object_spectra, plot_labelled_spectra
from lo.sdk.api.acquisition.io.open import open as lo_open

# Load the dataset
path_to_download = op.expanduser("~/Downloads/hyperspectral-orchard")
dataset = LOJSONDataset(path_to_download)

# Get the training data (a list of LODataItem objects)
training_data: List[LODataItem] = dataset.load("train")

# Inspect the first few items and their annotations
lo_data_item: LODataItem
for lo_data_item in training_data[:3]:
    draw_annotations(lo_data_item)
    ann: Annotation
    for ann in lo_data_item.annotations:
        print(ann.class_name, ann.category, ann.subcategories)

# Collect and plot the spectra for each class
fig, ax = plt.subplots(1)
object_spectra_dict = {}
class_numbers_to_labels = {0: "background_class"}
for lo_data_item in training_data:
    object_spectra_dict, class_numbers_to_labels = get_object_spectra(
        lo_data_item, object_spectra_dict, class_numbers_to_labels
    )

plot_labelled_spectra(object_spectra_dict, class_numbers_to_labels, ax)
plt.show()
```
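As a minimal, hypothetical sketch (not part of the Living Optics tooling), the per-class spectra collected above can feed a simple spectral classifier, with the background spectra acting as negative samples as suggested in the Overview. It assumes `object_spectra_dict` maps class indices to arrays of spectra of shape `(num_samples, num_bands)`, and it uses scikit-learn, which is not a stated dependency of this dataset:

```python
# Hypothetical follow-on to the Usage example above.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Stack spectra and labels; class 0 ("background_class") provides the negatives.
X = np.concatenate([np.asarray(spectra) for _, spectra in object_spectra_dict.items()])
y = np.concatenate(
    [np.full(len(spectra), class_idx) for class_idx, spectra in object_spectra_dict.items()]
)

# Note: a rigorous evaluation should split at the raw-file level (see Overview);
# this random per-spectrum split is for illustration only.
X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.2, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print("Validation accuracy:", clf.score(X_val, y_val))
print("Class names:", class_numbers_to_labels)
```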
See our Spatial Spectral ML project for an example of how to train and run a segmentation and spectral classification algorithm using this dataset.