t5-base-qa-ner-conll

Unofficial implementation of InstructionNER: a t5-base model fine-tuned on the conll2003 dataset.

https://github.com/ovbystrova/InstructionNER

Inference

git clone https://github.com/ovbystrova/InstructionNER
cd InstructionNER

from instruction_ner.model import Model

model = Model(
    model_path_or_name="olgaduchovny/t5-base-ner-conll",
    tokenizer_path_or_name="olgaduchovny/t5-base-ner-conll"
)

options = ["LOC", "PER", "ORG", "MISC"]

instruction = "please extract entities and their types from the input sentence, " \
              "all entity types are in options"

text = "The protest , which attracted several thousand supporters , coincided with the 18th anniversary of Spain 's constitution ."

generation_kwargs = {
    "num_beams": 2,
    "max_length": 128
}

pred_text, pred_spans = model.predict(
    text=text,
    generation_kwargs=generation_kwargs,
    instruction=instruction,
    options=options
)

>>> ('Spain is a LOC.', [(99, 104, 'LOC')])
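
The checkpoint can also be loaded directly with the transformers library. The snippet below is a minimal, unofficial sketch: the prompt layout (Sentence / Instruction / Options) is an assumption reconstructed from the sample above, not the package's documented input format.

from transformers import AutoTokenizer, T5ForConditionalGeneration

# Load the checkpoint as a plain T5 seq2seq model.
tokenizer = AutoTokenizer.from_pretrained("olgaduchovny/t5-base-ner-conll")
model = T5ForConditionalGeneration.from_pretrained("olgaduchovny/t5-base-ner-conll")

sentence = ("The protest , which attracted several thousand supporters , "
            "coincided with the 18th anniversary of Spain 's constitution .")

# Assumed prompt format, mirroring the "Prediction Sample" section below.
prompt = (
    f"Sentence: {sentence} "
    "Instruction: please extract entities and their types from the input sentence, "
    "all entity types are in options "
    "Options: LOC, PER, ORG, MISC"
)

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, num_beams=2, max_length=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
# Expected to produce something like: "Spain is a LOC."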

Prediction Sample

Sentence: The protest , which attracted several thousand supporters , coincided with the 18th anniversary of Spain 's constitution .
Instruction: please extract entities and their types from the input sentence, all entity types are in options
Options: ORG, PER, LOC

Prediction (raw text): Spain is a LOC.
Prediction (span): [(99, 104, 'LOC')]
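
The span output is recovered from the generated text rather than predicted directly. The sketch below shows one way such a mapping could be done; parse_spans is a hypothetical helper for illustration and is not the parsing code used inside instruction_ner.model.Model.

import re
from typing import List, Tuple

def parse_spans(source: str, generated: str, options: List[str]) -> List[Tuple[int, int, str]]:
    # Each generated statement is expected to look like "<entity> is a <TYPE>".
    spans = []
    for entity, label in re.findall(r"(.+?) is an? ([A-Z]+)[,.]", generated):
        if label not in options:
            continue
        entity = entity.strip()
        start = source.find(entity)  # naive lookup of the entity in the source sentence
        if start != -1:
            spans.append((start, start + len(entity), label))
    return spans

sentence = ("The protest , which attracted several thousand supporters , "
            "coincided with the 18th anniversary of Spain 's constitution .")
print(parse_spans(sentence, "Spain is a LOC.", ["LOC", "PER", "ORG", "MISC"]))
# [(99, 104, 'LOC')]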