cassanof committed
Commit 1fc7947 · 1 Parent(s): 16cc1dc

Update README.md

Files changed (1): README.md (+26 −7)
README.md CHANGED

State-of-the-art StarCoder-based models for low-resource languages

## Language Revision Index

This index lists the best-performing revision of the model for each supported language.

| Language | Revision ID | Epoch |
| ------------- | ----------- | ----- |
| Lua | `7e96d931547e342ad0661cdd91236fe4ccf52545` | 3 |
| Racket | `2cdc541bee1db4da80c0b43384b0d6a0cacca5b2` | 5 |
| OCaml | `e8a24f9e2149cbda8c3cca264a53c2b361b7a031` | 6 |
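For convenience, the pinned revisions above can be mirrored as a small lookup in code. A minimal sketch (the `REVISIONS` dict and `revision_for` helper are our own, not part of the repository):

```python
# Map each supported language to its best-performing revision ID
# (values copied from the table above).
REVISIONS = {
    "lua": "7e96d931547e342ad0661cdd91236fe4ccf52545",
    "racket": "2cdc541bee1db4da80c0b43384b0d6a0cacca5b2",
    "ocaml": "e8a24f9e2149cbda8c3cca264a53c2b361b7a031",
}

def revision_for(language: str) -> str:
    """Return the pinned revision ID for a language, case-insensitively."""
    try:
        return REVISIONS[language.lower()]
    except KeyError:
        raise ValueError(f"No pinned revision for language: {language!r}")

print(revision_for("Lua"))
```

Keeping the revisions in one place means a call such as `from_pretrained(..., revision=revision_for("lua"))` stays consistent with the table.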

## Usage

To use one of the models in this repository, first select the commit revision for that model from the table above.
For example, to load the Lua model:
```py
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("nuprl/MultiPLCoder-1b")
lua_revision = "7e96d931547e342ad0661cdd91236fe4ccf52545"
model = AutoModelForCausalLM.from_pretrained("nuprl/MultiPLCoder-1b", revision=lua_revision)
```

Note that the model's default configuration does not enable caching, so you must pass `use_cache=True` when generating:
```py
toks = tokenizer.encode("-- Hello World", return_tensors="pt")
out = model.generate(toks, use_cache=True, do_sample=True, temperature=0.2, top_p=0.95, max_length=50)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```
```
-- Hello World!
-- :param name: The name of the person to say hello to
-- :return: A greeting
local function say_hello(name)
return "Hello ".. name
end
```