robot-bengali-2 committed on
Commit f92ddf4 · unverified · 1 Parent(s): b95bf0a

Avoid Markdown in readme

Files changed (1):
  1. README.md +26 -19
README.md CHANGED
@@ -7,22 +7,29 @@ sdk: static
  pinned: false
  ---
 
- This organization is a part of the NeurIPS 2021 demonstration <a href="https://training-transformers-together.github.io/">"Training Transformers Together"</a>.
-
- In this demo, we train a model similar to <a target="_blank" href="https://openai.com/blog/dall-e/">OpenAI DALL-E</a> —
- a Transformer "language model" that generates images from text descriptions.
- It is trained on <a target="_blank" href="https://laion.ai/laion-400-open-dataset/">LAION-400M</a>,
- the world's largest openly available image-text-pair dataset with 400 million samples. Our model is based on
- the <a target="_blank" href="https://github.com/lucidrains/DALLE-pytorch">dalle-pytorch</a> implementation
- by <a target="_blank" href="https://github.com/lucidrains">Phil Wang</a> with a few tweaks to make it communication-efficient.
-
- See details about how to join and how it works on <a target="_blank" href="https://training-transformers-together.github.io/">our website</a>.
-
- The organization gathers people participating in the collaborative training and provides links to the necessary resources:
-
- - 👉 Starter kits for **Google Colab** and **Kaggle** (easy way to join the training)
- - 👉 [Dashboard](https://huggingface.co/spaces/training-transformers-together/Dashboard) (the current training state: loss, number of peers, etc.)
- - 👉 [Model](https://huggingface.co/training-transformers-together/dalle-demo) (the latest model checkpoint)
- - 👉 [Dataset](https://huggingface.co/datasets/laion/laion_100m_vqgan_f8)
-
- Feel free to reach us on [Discord](https://discord.gg/uGugx9zYvN) if you have any questions 🙂
+ <p>
+ This organization is a part of the NeurIPS 2021 demonstration <a href="https://training-transformers-together.github.io/">"Training Transformers Together"</a>.
+ </p>
+ <p>
+ In this demo, we train a model similar to <a target="_blank" href="https://openai.com/blog/dall-e/">OpenAI DALL-E</a> —
+ a Transformer "language model" that generates images from text descriptions.
+ It is trained on <a target="_blank" href="https://laion.ai/laion-400-open-dataset/">LAION-400M</a>,
+ the world's largest openly available image-text-pair dataset with 400 million samples. Our model is based on
+ the <a target="_blank" href="https://github.com/lucidrains/DALLE-pytorch">dalle-pytorch</a> implementation
+ by <a target="_blank" href="https://github.com/lucidrains">Phil Wang</a> with a few tweaks to make it communication-efficient.
+ </p>
+ <p>
+ See details about how to join and how it works on <a target="_blank" href="https://training-transformers-together.github.io/">our website</a>.
+ </p>
+ <p>
+ This organization gathers people participating in the collaborative training and provides links to the necessary resources:
+ </p>
+ <ul>
+ <li>👉 Starter kits for <b>Google Colab</b> and <b>Kaggle</b> (an easy way to join the training)</li>
+ <li>👉 <a target="_blank" href="https://huggingface.co/spaces/training-transformers-together/Dashboard">Dashboard</a> (the current training state: loss, number of peers, etc.)</li>
+ <li>👉 <a target="_blank" href="https://huggingface.co/training-transformers-together/dalle-demo">Model</a> (the latest checkpoint)</li>
+ <li>👉 <a target="_blank" href="https://huggingface.co/datasets/laion/laion_100m_vqgan_f8">Dataset</a></li>
+ </ul>
+ <p>
+ Feel free to reach us on <a target="_blank" href="https://discord.gg/uGugx9zYvN">Discord</a> if you have any questions 🙂
+ </p>