---
license: apache-2.0
datasets:
- OEvortex/Vortex-50k
language:
- en
pipeline_tag: text-generation
tags:
- HelpingAI
- vortex
---
**Model Overview**

vortex-3b is a 2.9-billion-parameter causal language model created by OEvortex. It is derived from EleutherAI's Pythia-2.8b and fine-tuned on the Vortex-50k dataset.

**Usage**

To use the vortex-3b model, you can access the provided Colab notebook. The notebook lets you run the model on both CPU and GPU; feel free to adapt it to your specific requirements.

**CPU and GPU code**

```bash
!pip install transformers sentencepiece accelerate
```
```python
import torch  # tensor computation with strong GPU acceleration
from transformers import pipeline  # fast way to use pre-trained models for inference
```
```python
# load model
HL_pipeline = pipeline(model="OEvortex/vortex-3b",
                       torch_dtype=torch.bfloat16,
                       trust_remote_code=True,
                       device_map="auto")
```
```python
# define helper function
def get_completion_HL(user_input):  # renamed from `input` to avoid shadowing the builtin
    system = """
    You are an expert Physicist.
    You are good at explaining Physics concepts in simple words.
    Help as much as you can.
    """
    prompt = f"#### System: {system}\n#### User: \n{user_input}\n\n#### Response from Lite:"
    print(prompt)
    HL_response = HL_pipeline(prompt,
                              max_new_tokens=500)
    return HL_response[0]["generated_text"]
```
```python
# let's prompt
prompt = "Explain the difference between nuclear fission and fusion."
# prompt = "Why is the Sky blue?"

print(get_completion_HL(prompt))
```