legolasyiu committed on
Commit 3916224 · verified · 1 Parent(s): 415569c

Update index.html

Files changed (1): index.html +43 -4
index.html CHANGED
@@ -9,10 +9,49 @@
  <body>
  <div class="card">
  <h1>Welcome to your static Space!</h1>
- <p>You can modify this app directly by editing <i>index.html</i> in the Files and versions tab.</p>
- <p>
- Also don't forget to check the
- <a href="https://huggingface.co/docs/hub/spaces" target="_blank">Spaces documentation</a>.
+ # SuperTransformer
+
+ SuperTransformer auto-loads Hugging Face models.
+
+ # Introduction
+ This is a single-line wrapper for easily loading models from Hugging Face. It is not meant to replace the Hugging Face Transformers workflow; it simplifies it and speeds up the process of loading Hugging Face models.
+
+ # Usage
+ SuperTransformers downloads the model locally. The class uses AutoTokenizer and AutoModelForCausalLM.from_pretrained.
+
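+ The loading described above amounts to the standard Transformers calls sketched below; this is an illustrative equivalent, not the actual implementation inside SuperTransformers:
+
+ ```python
+ from transformers import AutoTokenizer, AutoModelForCausalLM
+
+ model_id = "EpistemeAI/ReasoningCore-3B-RE1-V2"
+
+ # Download (and cache locally) the tokenizer and the causal-LM weights
+ tokenizer = AutoTokenizer.from_pretrained(model_id)
+ model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")
+ ```
+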
+ # Installation
+ ```bash
+ pip install "bitsandbytes>=0.39.0"
+ pip install --upgrade accelerate transformers
+ ```
+ # How to run
+ ```bash
+ python SuperTransformer.py
+ ```
+
+ # Example of usage
+
+ ```python
+ # Load the SuperTransformer class: (1) Hugging Face model, (2) system prompt, (3) text/prompt, (4) max tokens
+ SuperTransformers = SuperTransformers("EpistemeAI/ReasoningCore-3B-RE1-V2", "You are a highly knowledgeable assistant with expertise in chemistry and physics. <reasoning>", "What is the area of a circle, radius=16, reason step by step", 2026)
+ # 8-bit quantization
+ SuperTransformers.HuggingFaceTransformer8bit()
+ # or 4-bit quantization
+ SuperTransformers.HuggingFaceTransformer4bit()
+ ```
+
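+ The 8-bit and 4-bit methods rely on bitsandbytes (installed above). As a rough sketch, 8-bit or 4-bit loading in plain Transformers looks like the following; the exact configuration SuperTransformers applies internally is an assumption here:
+
+ ```python
+ from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
+
+ model_id = "EpistemeAI/ReasoningCore-3B-RE1-V2"
+
+ # 8-bit quantization config (bitsandbytes)
+ bnb_8bit = BitsAndBytesConfig(load_in_8bit=True)
+ # 4-bit quantization config (NF4 is a common choice)
+ bnb_4bit = BitsAndBytesConfig(load_in_4bit=True, bnb_4bit_quant_type="nf4")
+
+ tokenizer = AutoTokenizer.from_pretrained(model_id)
+ model = AutoModelForCausalLM.from_pretrained(
+     model_id,
+     quantization_config=bnb_8bit,  # or bnb_4bit
+     device_map="auto",
+ )
+ ```
+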
+ ## Returns model and tokenizer
+ ```python
+ SuperTransformers = SuperTransformers("EpistemeAI/ReasoningCore-3B-RE1-V2")
+ model, tokenizer = SuperTransformers.HuggingfaceTransformer()  # returns the model and tokenizer
+ ```
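+ With the returned objects, generation follows the usual Transformers pattern; the prompt and max_new_tokens below are placeholders, not part of SuperTransformer:
+
+ ```python
+ # Tokenize a prompt, generate, and decode using the returned model and tokenizer
+ prompt = "What is the area of a circle, radius=16, reason step by step"
+ inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
+ outputs = model.generate(**inputs, max_new_tokens=256)
+ print(tokenizer.decode(outputs[0], skip_special_tokens=True))
+ ```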
+ ## Returns pipeline as a higher-level helper
+ ```python
+ SuperTransformers = SuperTransformers("EpistemeAI/ReasoningCore-3B-RE1-V2")
+ pipe = SuperTransformers.HuggingfacePipeline()  # returns the pipeline only
+ text = "What is the area of a circle, radius=16, reason step by step"
+ output = pipe(text, max_new_tokens=2026)  # limit output length to save memory
+ # Print the generated output
+ print(output)
+ ```
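+ Assuming HuggingfacePipeline builds a standard text-generation pipeline, output is a list of dicts, so the generated string itself can be pulled out like this:
+
+ ```python
+ # Extract only the generated text from the pipeline result
+ print(output[0]["generated_text"])
+ ```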
  </p>
  </div>
  </body>