Runtime error
Exit code: 1. Reason:
Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
Downloading shards:   0%|          | 0/2 [00:00<?, ?it/s]
Downloading shards:  50%|█████     | 1/2 [00:05<00:05, 5.21s/it]
Downloading shards: 100%|██████████| 2/2 [00:06<00:00, 3.07s/it]
Loading checkpoint shards:   0%|          | 0/2 [00:00<?, ?it/s]
Loading checkpoint shards:  50%|█████     | 1/2 [00:02<00:02, 2.22s/it]
Loading checkpoint shards: 100%|██████████| 2/2 [00:02<00:00, 1.25s/it]
Traceback (most recent call last):
  File "/home/user/app/app.py", line 32, in <module>
    dataset = load_dataset("not-lain/wikipedia", revision="embedded")
  File "/usr/local/lib/python3.10/site-packages/datasets/load.py", line 2594, in load_dataset
    builder_instance = load_dataset_builder(
  File "/usr/local/lib/python3.10/site-packages/datasets/load.py", line 2266, in load_dataset_builder
    dataset_module = dataset_module_factory(
  File "/usr/local/lib/python3.10/site-packages/datasets/load.py", line 1914, in dataset_module_factory
    raise e1 from None
  File "/usr/local/lib/python3.10/site-packages/datasets/load.py", line 1896, in dataset_module_factory
    ).get_module()
  File "/usr/local/lib/python3.10/site-packages/datasets/load.py", line 1199, in get_module
    hfh_dataset_info = HfApi(config.HF_ENDPOINT).dataset_info(
  File "/usr/local/lib/python3.10/site-packages/huggingface_hub/utils/_validators.py", line 114, in _inner_fn
    return fn(*args, **kwargs)
  File "/usr/local/lib/python3.10/site-packages/huggingface_hub/hf_api.py", line 2366, in dataset_info
    return DatasetInfo(**data)
  File "/usr/local/lib/python3.10/site-packages/huggingface_hub/hf_api.py", line 799, in __init__
    self.tags = kwargs.pop("tags")
KeyError: 'tags'
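For reference, a minimal sketch of the call that triggers the failure, reconstructed from line 32 of app.py in the traceback. Only the dataset repo id and revision come from the log; the rest of the app code is not shown in the error output and is assumed here.

    # Minimal reproduction sketch based on the traceback above.
    # The repo id "not-lain/wikipedia" and revision "embedded" are taken from the log.
    from datasets import load_dataset

    dataset = load_dataset("not-lain/wikipedia", revision="embedded")
    # The error is raised further down the stack: huggingface_hub's
    # DatasetInfo.__init__ calls kwargs.pop("tags") on the Hub API response,
    # so a response without a "tags" field produces KeyError: 'tags'.

The traceback points to a mismatch between the installed huggingface_hub/datasets versions and what the Hub API currently returns, so upgrading those two packages in the Space's requirements is one plausible first step; that remedy is an assumption on my part, not something confirmed by the log.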