---
license: apache-2.0
---
# FINGU-AI/QL-Specilize-78b

## Overview
`FINGU-AI/QL-Specilize-78b` is a causal language model for a variety of natural language processing (NLP) tasks, including machine translation, text generation, and chat-based applications. Its chat-style message input makes it straightforward to use for translation between languages as well as for other custom NLP tasks.

## Example Usage

### Installation
Make sure to install the required packages:

```bash
pip install torch transformers
```

### Loading the Model

```python
from transformers import AutoTokenizer, AutoModelForCausalLM
import torch

# Model and tokenizer
model_id = 'FINGU-AI/QL-Specilize-78b'
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    attn_implementation="sdpa",
    torch_dtype=torch.float16,
    device_map='auto',  # dispatches the model across available GPUs
)
tokenizer = AutoTokenizer.from_pretrained(model_id)
# Note: do not call model.to('cuda') here; with device_map='auto' the model
# is already placed on the GPU(s), and moving a dispatched model raises an error.

# Chat-style input messages
messages = [
    {"role": "user", "content": "what is Machine learning?"}
]

# Tokenize with the chat template and generate a response
input_ids = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    return_tensors="pt"
).to(model.device)

outputs = model.generate(
    input_ids,
    max_new_tokens=500,
    do_sample=True,
)

# Decode and print only the newly generated tokens
response = outputs[0][input_ids.shape[-1]:]
print(tokenizer.decode(response, skip_special_tokens=True))
```
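
### How the Chat Template Works

`apply_chat_template` turns the `messages` list into a single prompt string before tokenization. Many chat models use a ChatML-style template; assuming this model follows that convention (an assumption, since the exact template is defined in the model's tokenizer config), the formatting can be sketched in plain Python:

```python
def format_chatml(messages, add_generation_prompt=True):
    """Rough sketch of a ChatML-style chat template (assumed, not the
    model's actual template -- check tokenizer.chat_template for that)."""
    parts = []
    for message in messages:
        # Each turn is wrapped in <|im_start|>role ... <|im_end|> markers.
        parts.append(f"<|im_start|>{message['role']}\n{message['content']}<|im_end|>\n")
    if add_generation_prompt:
        # An open assistant turn cues the model to generate its reply.
        parts.append("<|im_start|>assistant\n")
    return "".join(parts)

messages = [{"role": "user", "content": "what is Machine learning?"}]
print(format_chatml(messages))
```

This is only an illustration of why `add_generation_prompt=True` matters: it appends the opening assistant marker so generation continues as the assistant's reply rather than as more user text.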