UncleanCode committed on
Commit db23c20
1 Parent(s): c3ae9fb

Update README.md

Files changed (1)
  1. README.md +24 -2
README.md CHANGED
@@ -8,8 +8,10 @@ language:
 pipeline_tag: text-generation
 ---
 ## Anacondia
-Anacondia is a Pythia model fine-tuned using QLoRA on an instruction dataset
+Anacondia-70m is a Pythia-70m-deduped model fine-tuned with QLoRA on [timdettmers/openassistant-guanaco](https://huggingface.co/datasets/timdettmers/openassistant-guanaco).
 
+## Usage
+Anacondia is not intended for real-world use; it was trained for educational purposes. Please consider more capable models for inference.
 
 ## Training procedure
 
@@ -27,4 +29,24 @@ The following `bitsandbytes` quantization config was used during training:
 ### Framework versions
 
 
-- PEFT 0.4.0
+- PEFT 0.4.0
+
+## Inference
+
+```python
+
+# load the tokenizer and model from the Hugging Face Hub
+from transformers import AutoTokenizer, AutoModelForCausalLM
+import torch
+
+model_id = "UncleanCode/anacondia-70m"
+
+tokenizer = AutoTokenizer.from_pretrained(model_id)
+model = AutoModelForCausalLM.from_pretrained(model_id)
+
+inputs = tokenizer("This is a sentence ", return_tensors="pt")
+output = model.generate(**inputs)
+
+print(tokenizer.decode(output[0]))
+
+```
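
The second hunk header mentions a `bitsandbytes` quantization config used during training, but the diff does not reproduce its values; only PEFT 0.4.0 is pinned. For context, below is a minimal sketch of what a QLoRA setup for a Pythia-70m-deduped base typically looks like. The base repo id, quantization settings, and LoRA hyperparameters are illustrative assumptions, not values taken from this commit.

```python
# Hypothetical QLoRA-style setup; hyperparameters are illustrative assumptions,
# not the configuration recorded in this model card.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model

base_id = "EleutherAI/pythia-70m-deduped"  # assumed repo for the base model named in the card

# 4-bit NF4 quantization, the usual bitsandbytes configuration for QLoRA
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id, quantization_config=bnb_config)

# LoRA adapters on the attention projection (illustrative target choice)
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["query_key_value"],  # GPT-NeoX/Pythia attention projection
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
```

In a setup like this, only the LoRA adapter parameters are updated during training while the quantized base weights stay frozen, which is why such runs report a PEFT version rather than a full set of fine-tuned weights.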