PEFT
Safetensors
English
freQuensy23 committed on
Commit 7c8df22
1 Parent(s): 6484f5e

Update README.md

Files changed (1)
  1. README.md +23 -143
README.md CHANGED
@@ -2,6 +2,10 @@
 library_name: peft
 base_model: meta-llama/Llama-2-7b-hf
 license: llama2
+ datasets:
+ - freQuensy23/toxic-answers
+ language:
+ - en
 ---

 # Model Card for Model ID
@@ -22,113 +26,31 @@ license: llama2
 - **Model type: LLM (Llama2)
 - **Language(s) (NLP): EN
 - **License:** Llama-2-license
- - **Finetuned from model [optional]:** meta-llama/Llama-2-7b
+ - **Finetuned from model [optional]: meta-llama/Llama-2-7b
 ### Model Sources [optional]

- ## Uses
-
- <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
-
- ### Direct Use
-
- <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
-
- [More Information Needed]
-
- ### Downstream Use [optional]
-
- <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
-
- [More Information Needed]
-
- ### Out-of-Scope Use
-
- <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
-
- [More Information Needed]
-
- ## Bias, Risks, and Limitations
-
- <!-- This section is meant to convey both technical and sociotechnical limitations. -->
-
- [More Information Needed]
-
- ### Recommendations
-
- <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
-
- Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
-
 ## How to Get Started with the Model

- Use the code below to get started with the model.
+ ```python
+ import peft
+ import transformers
+
+ # Load the PEFT adapter together with its Llama-2 base model
+ model = peft.AutoPeftModelForCausalLM.from_pretrained('freQuensy23/toxic-llama2')
+ # Use the tokenizer of the base model listed in the card, not the adapter repo
+ tokenizer = transformers.AutoTokenizer.from_pretrained('meta-llama/Llama-2-7b-hf')
+
+ # Generate a completion instead of just round-tripping the prompt
+ inputs = tokenizer('User: What is 1 + 8?\nBot:', return_tensors='pt')
+ output_ids = model.generate(**inputs, max_new_tokens=64)
+ print(tokenizer.batch_decode(output_ids, skip_special_tokens=True)[0])
+ ```
  [More Information Needed]
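The snippet above keeps the adapter and base model separate at load time. If the adapter is a plain LoRA adapter (the card does not state which PEFT method was used), it can also be folded into the base weights once and saved as an ordinary `transformers` checkpoint; a minimal sketch, with the output directory name purely illustrative:

```python
import peft
import transformers

# Load the adapter on top of its base model, then merge the adapter weights
# into the base weights (only possible for mergeable adapters such as LoRA).
model = peft.AutoPeftModelForCausalLM.from_pretrained('freQuensy23/toxic-llama2')
merged = model.merge_and_unload()

# Save a standalone checkpoint that no longer needs peft at inference time.
merged.save_pretrained('toxic-llama2-merged')
tokenizer = transformers.AutoTokenizer.from_pretrained('meta-llama/Llama-2-7b-hf')
tokenizer.save_pretrained('toxic-llama2-merged')
```

Merging removes the runtime dependency on `peft` and the small adapter overhead at inference, at the cost of storing full-size weights.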

 ## Training Details

 ### Training Data
+ https://huggingface.co/freQuensy23/toxic-llama2
-
- <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
-
- [More Information Needed]
-
- ### Training Procedure
-
- <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
-
- #### Preprocessing [optional]
-
- [More Information Needed]
-
-
- #### Training Hyperparameters
-
- - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
-
- #### Speeds, Sizes, Times [optional]
-
- <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
-
- [More Information Needed]
-
- ## Evaluation
-
- <!-- This section describes the evaluation protocols and provides the results. -->
-
- ### Testing Data, Factors & Metrics
-
- #### Testing Data
-
- <!-- This should link to a Dataset Card if possible. -->
-
- [More Information Needed]
-
- #### Factors
-
- <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->

  [More Information Needed]
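The training-data link added above points at the adapter repository itself; the dataset declared in the updated front matter is `freQuensy23/toxic-answers`, which can be pulled straight from the Hub. A minimal sketch, where the split name and column layout are assumptions (check the dataset card):

```python
from datasets import load_dataset

# Dataset listed in the updated README front matter.
ds = load_dataset('freQuensy23/toxic-answers', split='train')

print(ds)     # number of rows and column names
print(ds[0])  # one raw example
```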
-
- #### Metrics
-
- <!-- These are the evaluation metrics being used, ideally with a description of why. -->
-
- [More Information Needed]
-
 ### Results

- [More Information Needed]
-
- #### Summary
-
-
-
- ## Model Examination [optional]
-
- <!-- Relevant interpretability work for the model goes here -->
-
- [More Information Needed]

 ## Environmental Impact

@@ -136,59 +58,17 @@ Use the code below to get started with the model.

 Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).

- - **Hardware Type:** [More Information Needed]
- - **Hours used:** [More Information Needed]
- - **Cloud Provider:** [More Information Needed]
- - **Compute Region:** [More Information Needed]
- - **Carbon Emitted:** [More Information Needed]
-
- ## Technical Specifications [optional]
-
- ### Model Architecture and Objective
-
- [More Information Needed]
-
- ### Compute Infrastructure
-
- [More Information Needed]
-
- #### Hardware
-
- [More Information Needed]
-
- #### Software
-
- [More Information Needed]
-
- ## Citation [optional]
-
- <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
-
- **BibTeX:**
-
- [More Information Needed]
-
- **APA:**
-
- [More Information Needed]
-
- ## Glossary [optional]
-
- <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
-
- [More Information Needed]
-
- ## More Information [optional]
-
- [More Information Needed]
-
- ## Model Card Authors [optional]
-
- [More Information Needed]
+ - **Hardware Type:** A-100
+ - **Hours used:** 1
+ - **Cloud Provider:** Yandex-cloud
+ - **Compute Region:** Moscow
+ - **Carbon Emitted:** 11g

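The figures added above feed the ML Impact calculator, whose estimate boils down to average power draw × hours × grid carbon intensity (times a datacenter overhead factor). A back-of-envelope sketch with assumed constants; only the hours come from the card, and the result depends entirely on the assumptions:

```python
# Rough CO2 estimate in the spirit of the ML Impact calculator.
# All constants are illustrative assumptions, not values from the model card.
gpu_power_kw = 0.4            # assumed average draw of a single A100
hours = 1.0                   # training time reported in the card
pue = 1.1                     # assumed datacenter power usage effectiveness
grid_kg_co2_per_kwh = 0.35    # assumed grid carbon intensity

energy_kwh = gpu_power_kw * hours * pue
co2_g = energy_kwh * grid_kg_co2_per_kwh * 1000
print(f"~{energy_kwh:.2f} kWh, ~{co2_g:.0f} g CO2eq")
```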
 ## Model Card Contact

- [More Information Needed]
+ t.me/freQuensy23
+ github.com/freQuensy23-coder

 ### Framework versions