jamander committed on
Commit 0c05f48 · verified · 1 Parent(s): c3cdc0f

Update README.md

Files changed (1):
  1. README.md +49 -7
README.md CHANGED
@@ -7,13 +7,6 @@ library_name: transformers.js
 ---
 base_model: mistralai/Mistral-7B-v0.1
 
----
-license: mit
-language:
-- en
-pipeline_tag: text-generation
-library_name: transformers
-base_model: mistralai/Mistral-7B-v0.1
 ---
 
 # Project-Frankenstein
@@ -56,3 +49,52 @@ The model was trained on a Tesla T4 GPU using Google Colab. The training involve
 ## Example Generations
 
 **Base Model Generation:**
+I'm afraid I've created a 2000-level problem with a 100-level solution.
+
+I'm a 2000-level problem.
+
+I'm a 2000-level problem.
+
+I'm a 2000-level problem.
+
+I'm a 2000-level problem.
+
+I'm a 2000-level problem.
+
+I'm a 2
+
+
+**Fine-tuned Model Generation:**
+I'm afraid I've created a
+monster, one which will be the means of my own destruction. What
+shall I do? My own peace is destroyed; I am constantly agitated between
+the extremes of fear and hope; the former when I think of the
+danger, the latter when I think of him.
+
+“I have been occupied in making a man, and he is perfect. I have
+given him the utmost extent of my own faculties, and more. He
+
+
+## Usage
+
+To use this model, log in to Hugging Face, make sure you have access to the gated base repo, and load the model:
+
+```python
+from transformers import AutoModelForCausalLM, AutoTokenizer
+from huggingface_hub import login
+
+# Log in to Hugging Face
+login("your-hugging-face-token")
+
+# Ensure you have access to the gated repo
+# Visit https://huggingface.co/mistralai/Mistral-7B-v0.1 to request access if you haven't already
+
+tokenizer = AutoTokenizer.from_pretrained("jamander/Project-Frankenstein")
+model = AutoModelForCausalLM.from_pretrained("jamander/Project-Frankenstein")
+
+input_text = "I am afraid I have created a "
+inputs = tokenizer(input_text, return_tensors="pt")
+# Cap the number of generated tokens; generate() otherwise stops at a short default
+outputs = model.generate(**inputs, max_new_tokens=100)
+generated_text = tokenizer.decode(outputs[0], skip_special_tokens=True)
+
+print(generated_text)
+```
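
The `generate` call in the snippet above can also take sampling parameters. As a minimal sketch, the helper below bundles a few common ones (the parameter names follow transformers' `GenerationConfig`; the helper itself and the default values are illustrative assumptions, not part of the committed README):

```python
# Sketch: common sampling settings for model.generate().
# Parameter names match transformers' GenerationConfig; values are illustrative.
def sampling_settings(max_new_tokens=100, temperature=0.7, top_p=0.9):
    """Return keyword arguments to pass to model.generate()."""
    return {
        "max_new_tokens": max_new_tokens,  # cap on the number of generated tokens
        "do_sample": True,                 # sample instead of greedy decoding
        "temperature": temperature,        # flatten/sharpen the token distribution
        "top_p": top_p,                    # nucleus sampling cutoff
    }

# Usage with the model and inputs from the snippet above:
# outputs = model.generate(**inputs, **sampling_settings())
```

Sampling tends to suit this model's free-form Frankenstein-style continuations better than greedy decoding, which is prone to the kind of repetition shown in the base-model generation above.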