hdallatorre committed on
Commit 616ac86 · 1 Parent(s): f77ce43

Update README.md
README.md CHANGED
@@ -13,7 +13,7 @@ datasets:
 
 The Nucleotide Transformers are a collection of foundational language models that were pre-trained on DNA sequences from whole-genomes. Compared to other approaches, our models do not only integrate information from single reference genomes, but leverage DNA sequences from over 3,200 diverse human genomes, as well as 850 genomes from a wide range of species, including model and non-model organisms. Through robust and extensive evaluation, we show that these large models provide extremely accurate molecular phenotype prediction compared to existing methods
 
-Part of this collection is the **nucleotide-transformer-500m-human-ref**, a 500M parameters transformer pre-trained on the human reference genome. The model is available both in Tensorflow
+Part of this collection is the **nucleotide-transformer-500m-human-ref**, a 500M parameters transformer pre-trained on the human reference genome. The model is available both in Tensorflow and Pytorch.
 
 **Developed by:** InstaDeep, NVIDIA and TUM
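Since the updated line states that the model is available in both Tensorflow and Pytorch, here is a minimal PyTorch loading sketch. It assumes the checkpoint is published on the Hugging Face Hub as `InstaDeepAI/nucleotide-transformer-500m-human-ref` and loads with the standard `transformers` Auto classes; the repo id and class choices are assumptions, not part of this commit.

```python
# Minimal sketch, assuming the checkpoint lives on the Hugging Face Hub as
# "InstaDeepAI/nucleotide-transformer-500m-human-ref" and loads through the
# standard transformers Auto classes (PyTorch backend).
from transformers import AutoTokenizer, AutoModelForMaskedLM

repo_id = "InstaDeepAI/nucleotide-transformer-500m-human-ref"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForMaskedLM.from_pretrained(repo_id)

# Tokenize a short DNA sequence and run a forward pass to obtain embeddings.
sequence = "ATTCCGATTCCGATTCCG"
inputs = tokenizer(sequence, return_tensors="pt")
outputs = model(**inputs, output_hidden_states=True)
embeddings = outputs.hidden_states[-1]  # shape: (batch, tokens, hidden_size)
print(embeddings.shape)
```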