Update README.md
README.md CHANGED
@@ -44,17 +44,19 @@ This model can be loaded and used with Hugging Face's `transformers` library:
 from transformers import AutoTokenizer, AutoModelForSequenceClassification
 import torch
 
-#
+# Load the model and tokenizer
 tokenizer = AutoTokenizer.from_pretrained("your-username/DistilBERT-PhishGuard")
 model = AutoModelForSequenceClassification.from_pretrained("your-username/DistilBERT-PhishGuard")
 
-#
+# Sample URL for classification
 url = "http://example.com"
 inputs = tokenizer(url, return_tensors="pt", truncation=True, max_length=256)
 outputs = model(**inputs)
 predictions = torch.argmax(outputs.logits, dim=-1)
 print("Prediction:", "Phishing" if predictions.item() == 1 else "Safe")
 
+```
+
 ## Performance
 The model achieves high accuracy across different chunks of training data, with performance metrics above 98% accuracy and an AUC close to or at 1.00 in later stages. This indicates robust and reliable phishing detection across varied datasets.
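
The README snippet scores one URL at a time. As a quick sanity check of the same API, here is a minimal batched-scoring sketch; it reuses the placeholder repository id `your-username/DistilBERT-PhishGuard` and the 1 = phishing label mapping from the example's print statement, while the second URL and the softmax confidence readout are illustrative additions, not part of the model card:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Placeholder id from the README; substitute the real repository id.
MODEL_ID = "your-username/DistilBERT-PhishGuard"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID)
model.eval()

urls = [
    "http://example.com",
    "http://login-verify.example-bank.com/account/update",  # made-up suspicious-looking URL
]

# Tokenize the whole batch at once; padding keeps the input tensor rectangular.
inputs = tokenizer(urls, return_tensors="pt", truncation=True, max_length=256, padding=True)

with torch.no_grad():
    logits = model(**inputs).logits

probs = torch.softmax(logits, dim=-1)   # per-class probabilities
preds = torch.argmax(probs, dim=-1)     # predicted class per URL

for url, pred, prob in zip(urls, preds, probs):
    label = "Phishing" if pred.item() == 1 else "Safe"  # mapping taken from the README example
    print(f"{url}: {label} ({prob[pred].item():.1%} confidence)")
```

If the hosted config defines `id2label`, `model.config.id2label[pred.item()]` avoids hard-coding the class mapping.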