CBERT

CausalBERT (CBERT) is a multi-task fine-tuned German BERT that extracts causal attributions.

Model details

  • Model architecture: BERT-base-German-cased + token & relation heads
  • Fine-tuned on: environmental causal attribution corpus (German)
  • Parameters: 109M (F32, safetensors)
  • Tasks:
    1. Token classification (BIO tags for INDICATOR / ENTITY)
    2. Relation classification (CAUSE, EFFECT, INTERDEPENDENCY)
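
For orientation, the two heads can be sketched as follows. This is an illustrative reconstruction, not the model's actual code: the class name, the five-tag BIO scheme, and the [CLS]-based relation head are assumptions.

import torch.nn as nn
from transformers import AutoModel

class CausalBertSketch(nn.Module):
    def __init__(self, base="google-bert/bert-base-german-cased",
                 num_bio_tags=5, num_relations=3):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(base)
        hidden = self.encoder.config.hidden_size
        # Task 1: per-token BIO tagging of INDICATOR / ENTITY spans
        # (assumed scheme: B/I-INDICATOR, B/I-ENTITY, O)
        self.token_head = nn.Linear(hidden, num_bio_tags)
        # Task 2: relation label (CAUSE / EFFECT / INTERDEPENDENCY);
        # simplified here to read the [CLS] vector, whereas the real model
        # classifies relations between detected spans
        self.relation_head = nn.Linear(hidden, num_relations)

    def forward(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        token_logits = self.token_head(out.last_hidden_state)              # (batch, seq, tags)
        relation_logits = self.relation_head(out.last_hidden_state[:, 0])  # (batch, relations)
        return token_logits, relation_logits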

Usage

Inference runs through the custom causalbert library; install it from its repository, then run:

from causalbert.infer import load_model, analyze_sentence_with_confidence

# Load the fine-tuned model, tokenizer, and config onto the available device
model, tokenizer, config, device = load_model("norygano/C-BERT")

# Tag causal indicators/entities and classify relations for one sentence
result = analyze_sentence_with_confidence(
    model, tokenizer, config, "Autoverkehr verursacht Bienensterben.", []
)
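
load_model only needs to run once; the returned objects can then be reused across sentences (the second example sentence below is illustrative):

sentences = [
    "Autoverkehr verursacht Bienensterben.",
    "Pestizide schädigen Bestäuber.",
]
for sentence in sentences:
    result = analyze_sentence_with_confidence(model, tokenizer, config, sentence, [])
    print(sentence, result)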

Training

  • Base model: google-bert/bert-base-german-cased
  • Hyperparameters: 3 epochs, learning rate 2e-5, batch size 8
  • See train.py for details.
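
The reported hyperparameters correspond to a standard Hugging Face TrainingArguments setup; the sketch below shows only that mapping (output_dir is a placeholder, and the dataset wiring and multi-task loss live in train.py):

from transformers import TrainingArguments

# Mirrors the hyperparameters above; everything else is left at defaults
args = TrainingArguments(
    output_dir="cbert-output",  # placeholder path
    num_train_epochs=3,
    learning_rate=2e-5,
    per_device_train_batch_size=8,
)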

Limitations

  • German only.
  • Sentence-level; cross-sentence causality is not handled (segment longer passages first, as sketched below).
  • Relation classification depends on detected spans — errors in token tagging propagate.
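
Because inference is sentence-level, longer passages should be segmented before analysis. A naive sketch (a dedicated German sentence segmenter, e.g. spaCy's, would be more robust):

import re

text = "Autoverkehr verursacht Bienensterben. Pestizide schädigen Bestäuber."
# Naive split on sentence-final punctuation; assumes model/tokenizer/config
# from the Usage section above
for sentence in re.split(r"(?<=[.!?])\s+", text.strip()):
    result = analyze_sentence_with_confidence(model, tokenizer, config, sentence, [])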