---
language:
- ro
license: mit

tags:
- romanian
- text-generation
- causal-lm
- gpt-neo
---

# GPT-Neo Romanian 780M

This model is a transformer decoder based on GPT-Neo, EleutherAI's replication of the GPT-3 architecture.

It was trained on a thoroughly cleaned and deduplicated corpus of about 40GB of Romanian text, composed of OSCAR, OPUS, Wikipedia, literature, and various other sources joined together. Training ran for about a month, totaling 1.5M steps on a TPU v3-32 machine.
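The model can be loaded with the standard `transformers` causal language modeling classes. Below is a minimal generation sketch; the repository ID is assumed from this model card's name, and the sampling parameters are illustrative rather than tuned.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Assumed repository ID; adjust if the model is hosted under a different name.
model_id = "dumitrescustefan/gpt-neo-romanian-780m"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Romanian prompt: "Who was the first ruler of the United Principalities?"
prompt = "Cine a fost primul domnitor al Principatelor Unite?"
inputs = tokenizer(prompt, return_tensors="pt")

# Illustrative sampling settings; tune for your use case.
outputs = model.generate(
    **inputs,
    max_new_tokens=64,
    do_sample=True,
    top_p=0.9,
    temperature=0.8,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```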

### Authors
* Dumitrescu Stefan
* Mihai Ilie

### Evaluation
Evaluation results will be added soon, also at [https://github.com/dumitrescustefan/Romanian-Transformers](https://github.com/dumitrescustefan/Romanian-Transformers).

### Acknowledgements

Thanks to the [TPU Research Cloud](https://sites.research.google/trc/about/) for providing the TPU v3-32 machine needed to train this model!