shailja committed on
Commit d12d01d · 1 Parent(s): 0f48bb0

Update README.md

Files changed (1)
README.md +2 -5
README.md CHANGED
@@ -1,10 +1,7 @@
 ---
 pipeline_tag: text-generation
 inference: true
-widget:
-- text: module display_hello_word
-  example_title: Hello world
-  group: Verilog
+
 license: bigcode-openrail-m
 datasets:
 - shailja/Verilog_GitHub
@@ -270,7 +267,7 @@ extra_gated_fields:
 
 ## Model Summary
 
-The CodeGen models are 15.5B parameter models trained on 80+ programming languages from [The Stack (v1.2)](https://huggingface.co/datasets/bigcode/the-stack), with opt-out requests excluded. The model uses [Multi Query Attention](https://arxiv.org/abs/1911.02150), [a context window of 8192 tokens](https://arxiv.org/abs/2205.14135), and was trained using the [Fill-in-the-Middle objective](https://arxiv.org/abs/2207.14255) on 1 trillion tokens.
+The VeriGen model is a 2B-parameter fine-tuned version of [CodeGen-multi-2B](https://github.com/salesforce/CodeGen), trained on the [Verilog code dataset](https://huggingface.co/shailja/Verilog_GitHub).
 
 - **Repository:** [shailja-thakur/VGen](https://github.com/shailja-thakur/VGen)
 - **Baseline LLM** [SalesForce/CodeGen](https://github.com/salesforce/CodeGen)
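
For reference, a minimal sketch of how a fine-tuned checkpoint like this could be loaded for Verilog completion with the `transformers` library. The model id `shailja/fine-tuned-codegen-2B-Verilog` and the generation settings are assumptions for illustration (the diff only names the VGen repository and the Verilog_GitHub dataset); substitute the actual checkpoint id from the model card.

```python
# Minimal usage sketch, assuming a standard causal-LM checkpoint on the Hub.
# The model id below is hypothetical; replace it with the id from the model card.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "shailja/fine-tuned-codegen-2B-Verilog"  # assumption, verify on the Hub

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Prompt mirrors the removed widget example: a Verilog module header to complete.
prompt = "module display_hello_word"
inputs = tokenizer(prompt, return_tensors="pt")

# Sample a short completion; the generation settings are illustrative, not prescribed.
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, temperature=0.5)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```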