cpayne1303 committed
Commit 1ae2bec · 1 Parent(s): 6695675

update readme again

Files changed (1)
  1. README.md +4 -1
README.md CHANGED

@@ -7,4 +7,7 @@ library_name: transformers
 license: apache-2.0
 base_model:
 - cpayne1303/cp2024
----
+---
+## Model Description
+
+This is a model using the llama2 architecture and only 30 million parameters. It is trained on approximately 2 billion tokens of diverse web data from the first 1000000 rows of the uncleaned c4 english dataset.
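
The added description says the checkpoint follows the llama2 architecture with roughly 30 million parameters, so it should load through the standard transformers causal-LM classes declared by the card's `library_name`. Below is a minimal sketch; the repo id is taken from the card's `base_model` field (cpayne1303/cp2024) and is an assumption, so substitute this repository's own id if it differs.

```python
# Minimal sketch: load the small llama2-architecture checkpoint with transformers.
# The repo id below comes from the card's base_model metadata and is assumed here;
# replace it with the id of this specific checkpoint if they are not the same.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "cpayne1303/cp2024"  # assumed hub identifier

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Generate a short continuation to sanity-check that the weights load and run.
inputs = tokenizer("The quick brown fox", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

At ~30M parameters the model fits comfortably on CPU, so no device placement or quantization arguments are strictly needed for a quick test.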