jamander committed on
Commit 0ea2e35 · verified · 1 Parent(s): fc5de6b

Update README.md

Files changed (1)
  1. README.md +22 -22
README.md CHANGED
@@ -106,25 +106,25 @@ This project was completed to explore the effectiveness of parameter-efficient f
 
 ### Usage
 
-usage:
- - step_1: |
-   Log in to Hugging Face:
-   ```bash
-   huggingface-cli login
-   ```
- - step_2: |
-   Load the model:
-   ```python
-   from transformers import AutoTokenizer, AutoModelForSequenceClassification
-
-   tokenizer = AutoTokenizer.from_pretrained("jamander/Project-Blockbuster")
-   model = AutoModelForSequenceClassification.from_pretrained("jamander/Project-Blockbuster")
-   ```
- - step_3: |
-   Tokenize and predict:
-   ```python
-   inputs = tokenizer("I loved the movie!", return_tensors="pt", padding=True, truncation=True)
-   outputs = model(**inputs)
-   prediction = torch.argmax(outputs.logits, dim=-1)
-   print("Prediction:", "Positive" if prediction == 1 else "Negative")
-   ```
+To use this model, follow these steps:
+
+1. **Log in to Hugging Face**:
+   ```bash
+   huggingface-cli login
+   ```
+
+2. **Load the model**:
+   ```python
+   from transformers import AutoTokenizer, AutoModelForSequenceClassification
+
+   tokenizer = AutoTokenizer.from_pretrained("jamander/Project-Blockbuster")
+   model = AutoModelForSequenceClassification.from_pretrained("jamander/Project-Blockbuster")
+   ```
+
+3. **Tokenize and predict**:
+   ```python
+   import torch
+
+   inputs = tokenizer("I loved the movie!", return_tensors="pt", padding=True, truncation=True)
+   outputs = model(**inputs)
+   prediction = torch.argmax(outputs.logits, dim=-1)
+   print("Prediction:", "Positive" if prediction == 1 else "Negative")
+   ```
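
The prediction logic in step 3 can be sketched in isolation, without downloading the model: the snippet below stands in dummy logits for `model(**inputs).logits` and applies the same `argmax` label mapping. The Negative = 0 / Positive = 1 label order is an assumption inferred from the README's print statement, not confirmed by the model config.

```python
import torch

# Dummy logits standing in for model(**inputs).logits:
# one example, two classes (assumed order: 0 = Negative, 1 = Positive).
logits = torch.tensor([[-1.2, 2.3]])

# Same decision rule as the README's step 3.
prediction = torch.argmax(logits, dim=-1)
label = "Positive" if prediction.item() == 1 else "Negative"
print("Prediction:", label)  # → Prediction: Positive
```

For batched inputs, `argmax(..., dim=-1)` returns one class index per row, so the same mapping can be applied element-wise.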