## Working with the dataset locally

A Hugging Face dataset repository is a git repository like any other. You can simply clone it like so:

```bash
git clone https://huggingface.co/datasets/danish-foundation-models/danish-gigaword-2
cd danish-gigaword-2
```

You can then work with the dataset locally like so:

```py
from datasets import load_dataset

name = "../."  # instead of "danish-foundation-models/danish-gigaword-2"
dataset = load_dataset(name, split="train")
# make transformations here
```

> Note: Even when the dataset is loaded from a local path, Hugging Face `datasets` still uses a cache, so after making changes you might need to reset it to see that they take effect. You can do this by deleting the cached files, which you can locate using `dataset.cache_files` (a short sketch of this is included at the end of this document).

## Installing dependencies

This repo comes with a few dependencies that you need to install to make it run. It uses a [makefile](https://opensource.com/article/18/8/what-how-makefile) to run commands and [uv](https://docs.astral.sh/uv/) for package management. Once you have uv installed, you can install the dependencies using:

```bash
make install
```

## Running dataset tests

This dataset is special in that it comes with a test suite, which e.g. checks that the IDs are unique and that the format is consistent. You can run the suite using:

```bash
make test
```

## Submitting a PR

Creating a PR on Hugging Face is a bit different from creating one on GitHub.

Go to the community tab on Hugging Face, press *New pull request*, and choose *On your machine*. Specify the title of your PR. Then you can simply:

```bash
# fetch the PR ref (it is not fetched by a regular clone)
git fetch origin refs/pr/{PR NUMBER}:pr/{PR NUMBER}
git checkout pr/{PR NUMBER}
# make your changes here
# push to hub
git push origin pr/{PR NUMBER}:refs/pr/{PR NUMBER}
```

Before you push the PR, make sure that the tests have been run.

For an example PR, see the following:

- [Restructuring columns in the dataset](https://huggingface.co/datasets/danish-foundation-models/danish-gigaword-2/discussions/11)
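
## Clearing the local dataset cache

As noted in the first section, `datasets` keeps a cache even when the data is loaded from a local path. Below is a minimal sketch of how you might inspect and clear that cache after editing the data locally. It assumes the dataset was loaded as shown above and relies on the `cache_files` property and `Dataset.cleanup_cache_files()` from the `datasets` library.

```py
from datasets import load_dataset

# load the local copy as in the example above
dataset = load_dataset("../.", split="train")

# list the Arrow cache files backing this dataset
print(dataset.cache_files)

# remove the cache files so that the next load_dataset call
# re-reads the modified local data instead of the stale cache
removed = dataset.cleanup_cache_files()
print(f"Removed {removed} cache file(s)")
```

Alternatively, you can delete the files listed in `dataset.cache_files` by hand; the effect is the same, but `cleanup_cache_files()` is the safer option since it knows which files belong to this dataset.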