---
language:
- tr
tags:
- art
- philosophy
---
# Model Card for malhajar/Llama-2-7b-chat-dolly-tr

<!-- Provide a quick summary of what the model is/does. -->
malhajar/Llama-2-7b-chat-dolly-tr is a fine-tuned version of Llama-2-7b-hf trained with SFT (supervised fine-tuning).
The model can answer questions in Turkish, as it was fine-tuned on the Turkish dataset [`databricks-dolly-15k-tr`](https://huggingface.co/datasets/atasoglu/databricks-dolly-15k-tr).

### Model Description

- **Developed by:** [`Mohamad Alhajar`](https://www.linkedin.com/in/muhammet-alhajar/)
- **Language(s) (NLP):** Turkish
- **Finetuned from model:** [`meta-llama/Llama-2-7b-hf`](https://huggingface.co/meta-llama/Llama-2-7b-hf)

### Prompt Template
```
<s>[INST] <prompt> [/INST]
```
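The template can also be applied programmatically. A minimal sketch (the `build_prompt` helper below is illustrative, not part of the model's API):

```python
def build_prompt(user_message: str) -> str:
    # Wrap a user message in the Llama-2 [INST] template shown above
    return f"<s>[INST] {user_message} [/INST]"

print(build_prompt("Türkiyenin en büyük şehri nedir?"))
# <s>[INST] Türkiyenin en büyük şehri nedir? [/INST]
```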


## How to Get Started with the Model

Use the code sample below to interact with the model.
```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "malhajar/Llama-2-7b-chat-dolly-tr"
model = AutoModelForCausalLM.from_pretrained(model_id,
                                             device_map="auto",
                                             torch_dtype=torch.float16,
                                             revision="main")

tokenizer = AutoTokenizer.from_pretrained(model_id)

question = "what is the will to truth?"
# Build the prompt and generate a response
prompt = f'''
### Instruction:
{question}

### Response:'''
input_ids = tokenizer(prompt, return_tensors="pt").input_ids.to(model.device)
output = model.generate(inputs=input_ids, max_new_tokens=512,
                        pad_token_id=tokenizer.eos_token_id,
                        top_k=50, do_sample=True, top_p=0.95)
response = tokenizer.decode(output[0])

print(response)
```
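The `do_sample=True`, `top_k=50`, and `top_p=0.95` arguments control how the next token is drawn. As a rough, self-contained sketch of what that filtering does (an illustration of top-k/nucleus sampling, not the `transformers` implementation):

```python
import math

def top_k_top_p_filter(logits, top_k=50, top_p=0.95):
    """Return the token ids kept for sampling after top-k/top-p filtering."""
    # Sort token ids by logit, highest first, and keep at most top_k of them
    order = sorted(range(len(logits)), key=lambda i: logits[i], reverse=True)
    order = order[:top_k]
    # Softmax over the kept logits (shift by max for numerical stability)
    mx = max(logits[i] for i in order)
    exps = [math.exp(logits[i] - mx) for i in order]
    total = sum(exps)
    probs = [e / total for e in exps]
    # top-p: keep the smallest prefix whose cumulative probability >= top_p
    kept, cum = [], 0.0
    for tok, p in zip(order, probs):
        kept.append(tok)
        cum += p
        if cum >= top_p:
            break
    return kept

# A tiny made-up vocabulary of 5 tokens with illustrative logits
logits = [2.0, 1.0, 0.5, -1.0, -3.0]
print(top_k_top_p_filter(logits, top_k=3, top_p=0.9))  # [0, 1, 2]
```

Lowering `top_p` (or `top_k`) shrinks the candidate set and makes generation more deterministic; the values in the sample above keep generation fairly diverse.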

## Example Generation

```
<s>[INST] Türkiyenin en büyük şehir nedir? [/INST]
İstanbul, dünyanın en kalabalık ikinci ve Turuncu kütle'de yer almaktadır. Pek çok insandaki birçok ünlüsün bulundusuyla biliniyor.
```