
Setup

For the best experience, it is recommended to use Linux or macOS. The Google Colab environment is also supported. Windows users should use WSL2.

  1. Install uv
curl -LsSf https://astral.sh/uv/install.sh | sh
  2. Sync the Python environment
uv sync
  3. Unzip the datasets
unzip datasets.zip

Run the code

  • The notebook is train.ipynb and the script is train.py; they contain the same code.
  • The model definitions are in the models folder, and the best parameters for each model are stored in models/best_pth. The models automatically load these best parameters when you run the code (see the sketch below).
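The exact loading logic lives in the repository's model code; the following is a minimal sketch of what such auto-loading typically looks like. The class name MyModel, the placeholder architecture, and the checkpoint file name my_model.pth are assumptions for illustration, not the repository's actual identifiers.

from pathlib import Path

import torch

class MyModel(torch.nn.Module):
    # Hypothetical model that restores its best parameters from models/best_pth
    # whenever the checkpoint file is present.
    def __init__(self, checkpoint_dir: str = "models/best_pth"):
        super().__init__()
        self.net = torch.nn.Linear(16, 1)  # placeholder architecture (assumption)
        ckpt = Path(checkpoint_dir) / "my_model.pth"  # hypothetical file name
        if ckpt.exists():
            # Load the saved best parameters onto the CPU.
            self.load_state_dict(torch.load(ckpt, map_location="cpu"))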

Slurm example

salloc --partition=gpu_v100s -N1 --ntasks-per-node=8 --mem=81920 --gres=gpu:1 -t2:00:00
module load cuda/12.1.0
source .venv/bin/activate
python3 train.py

Before running the code, please check whether you want to use the pretrained model or not.
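How this choice is exposed depends on the repository's code; a minimal sketch, assuming it is a simple boolean toggle near the top of train.py (the name USE_PRETRAINED and the checkpoint path are assumptions for illustration):

import torch

USE_PRETRAINED = True  # assumption: set to False to train from scratch instead of loading saved parameters

def maybe_load_best(model: torch.nn.Module, ckpt: str = "models/best_pth/my_model.pth") -> torch.nn.Module:
    # Only restore the saved best parameters when the pretrained path is enabled.
    if USE_PRETRAINED:
        model.load_state_dict(torch.load(ckpt, map_location="cpu"))
    return model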

Submit the results to Kaggle

The result file is predict.csv.

Use

kaggle competitions submit -c 2025-sdsc-6001-hw-3 -f predict.csv -m "Your message (best with the model name or the method you used)"

to submit the results to Kaggle.

License

Until the homework is finished, please do not share the code or the results with others.
