---
license: apache-2.0
base_model: distilbert/distilbert-base-uncased
tags:
  - generated_from_trainer
metrics:
  - accuracy
model-index:
  - name: fake-news-detector
    results: []
widget:
  - text: >-
      In a shocking turn of events, reports have surfaced suggesting that a
      clandestine meeting of world leaders took place on Mars to discuss plans
      for the colonization of the Red Planet. According to anonymous sources
      within the highest echelons of government, the summit was organized by a
      coalition of space agencies and private corporations aiming to expedite
      humanity's expansion beyond Earth. The meeting purportedly took place in a
      hidden underground facility on Mars, accessible only to a select few
      individuals privy to the ambitious project.
    example_title: Mars Meeting
  - text: >-
      In a groundbreaking revelation that has sent shockwaves through the
      scientific community, Dr. Rachel Bennett, a renowned researcher at the
      prestigious Cambridge Institute of Biotechnology, claims to have unlocked
      the elusive secret to eternal youth. According to Dr. Bennett, years of
      tireless research have culminated in the discovery of a revolutionary
      anti-aging compound derived from a rare Amazonian plant known only to
      indigenous tribes. Initial trials on laboratory mice have yielded
      astonishing results, with subjects exhibiting signs of reversed aging and
      enhanced vitality.
    example_title: Dr. Bennett
  - text: Apples are orange
    example_title: Oranges are Apples
  - text: Donald Trump is the 45th president of the United States.
    example_title: True News
datasets:
  - AlexanderHolmes0/true-fake-news
language:
  - en
pipeline_tag: text-classification
---

# fake-news-detector

This model is a fine-tuned version of [distilbert/distilbert-base-uncased](https://huggingface.co/distilbert/distilbert-base-uncased) on the [AlexanderHolmes0/true-fake-news](https://huggingface.co/datasets/AlexanderHolmes0/true-fake-news) dataset. It achieves the following results on the evaluation set:

- Loss: 0.0027
- Accuracy: 0.9994
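
A minimal usage sketch with the `transformers` pipeline is shown below. It assumes the model is loaded from this repository; the label names shown in the comment (`LABEL_0`/`LABEL_1`) are an assumption about the default mapping used during training, so check the model's `config.json` for the actual `id2label` entries.

```python
from transformers import pipeline

# Load the fine-tuned classifier from the Hub.
# The label names depend on how the labels were encoded during training;
# inspect the model config's id2label mapping to interpret them.
classifier = pipeline(
    "text-classification",
    model="AlexanderHolmes0/fake-news-detector",
)

result = classifier(
    "In a shocking turn of events, reports have surfaced suggesting that a "
    "clandestine meeting of world leaders took place on Mars."
)
print(result)  # e.g. [{'label': 'LABEL_1', 'score': 0.99}]
```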

## Model description

A DistilBERT-based text classifier fine-tuned to label English news text as real or fake (binary classification).

## Intended uses & limitations

Intended for classifying short English news passages as real or fake, as in the widget examples above. Like any classifier trained on a fixed dataset, it may not generalize to topics, writing styles, or time periods outside its training data, and its predictions should not be treated as authoritative fact-checking.

## Training and evaluation data

The model was trained and evaluated on the [AlexanderHolmes0/true-fake-news](https://huggingface.co/datasets/AlexanderHolmes0/true-fake-news) dataset listed in the card metadata; the split sizes and preprocessing steps are not documented here.
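
A minimal sketch for inspecting the dataset with the `datasets` library; the split and column names are assumptions, since they are not documented in this card.

```python
from datasets import load_dataset

# Load the dataset referenced in the card metadata.
# Split and column names are assumptions; inspect `ds` to confirm them.
ds = load_dataset("AlexanderHolmes0/true-fake-news")
print(ds)              # shows the available splits and columns
print(ds["train"][0])  # peek at one example (assumes a "train" split exists)
```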

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
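
A minimal `transformers` sketch mirroring these hyperparameters is shown below. The dataset preparation, tokenization, and metric wiring are assumptions, since the original training script is not included in this repository; only the checkpoint name and the listed hyperparameters come from this card.

```python
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# Base checkpoint and hyperparameters taken from this card.
model_name = "distilbert/distilbert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)

# The Adam betas/epsilon listed above match the Trainer's default optimizer
# settings, so they are not set explicitly here.
training_args = TrainingArguments(
    output_dir="fake-news-detector",
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=1,
)

# Hypothetical wiring: `tokenized_train` and `tokenized_eval` stand in for
# tokenized splits of the training data, which are not documented here.
# trainer = Trainer(
#     model=model,
#     args=training_args,
#     train_dataset=tokenized_train,
#     eval_dataset=tokenized_eval,
#     tokenizer=tokenizer,
# )
# trainer.train()
```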

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.201         | 0.09  | 100  | 0.0444          | 0.9901   |
| 0.0319        | 0.19  | 200  | 0.0241          | 0.9938   |
| 0.0222        | 0.28  | 300  | 0.0249          | 0.9932   |
| 0.0094        | 0.38  | 400  | 0.0076          | 0.9984   |
| 0.0042        | 0.47  | 500  | 0.0062          | 0.9988   |
| 0.0076        | 0.57  | 600  | 0.0040          | 0.9988   |
| 0.0095        | 0.66  | 700  | 0.0040          | 0.9990   |
| 0.008         | 0.76  | 800  | 0.0040          | 0.9988   |
| 0.0086        | 0.85  | 900  | 0.0030          | 0.9993   |
| 0.0042        | 0.95  | 1000 | 0.0027          | 0.9994   |

### Framework versions

- Transformers 4.39.1
- Pytorch 2.2.0+cu121
- Datasets 2.18.0
- Tokenizers 0.15.1