amunozo committed commit fe5104c (verified) · 1 parent: c42e485

Update README.md

Files changed (1): README.md (+2 -2)
README.md CHANGED
```diff
@@ -47,10 +47,10 @@ The following hyperparameters were used during training:
 - mixed_precision_training: Apex, opt level O1
 
 ## How to use
-Before being able to use this model, it is necessary to clone and configure the [codebase](https://github.com/xplip/pixel):
+Before being able to use this model, it is necessary to clone and configure the [codebase](https://github.com/xplip/pixel).
 
-*Instructions from [https://github.com/xplip/pixel](https://github.com/xplip/pixel)*
 ### Setup
+*Instructions from [https://github.com/xplip/pixel](https://github.com/xplip/pixel)*
 
 This codebase is built on [Transformers](https://github.com/huggingface/transformers) for PyTorch. We also took inspiration from the original [ViT-MAE codebase](https://github.com/facebookresearch/mae). The default font `GoNotoCurrent.ttf` that we used for all experiments is a merged Noto font built with [go-noto-universal](https://github.com/satbyy/go-noto-universal).
```
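The clone-and-configure step this diff refers to can be sketched as a shell session. This is a minimal sketch only: the environment name `pixel-env`, the Python version, and the editable-install step are assumptions; the linked pixel repository is the authoritative source for the exact setup instructions.

```shell
# Clone the PIXEL codebase (required before using this model)
git clone https://github.com/xplip/pixel.git
cd pixel

# Create an isolated environment (name and Python version are assumptions)
conda create -n pixel-env python=3.9 -y
conda activate pixel-env

# Install the codebase and its dependencies
# (editable install is an assumption; follow the repo's README if it differs)
pip install --upgrade pip
pip install -e .
```

After setup, the model itself can be loaded through the classes provided by the cloned codebase, as described in its README.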