# dslim/bert-base-NER

This is the dslim/bert-base-NER model converted to OpenVINO for accelerated inference.

An example of how to run inference with this model (requires `optimum-intel` with OpenVINO support, e.g. `pip install optimum[openvino]`):

```python
from optimum.intel.openvino import OVModelForTokenClassification
from transformers import AutoTokenizer, pipeline

# model_id should be set to either a local directory or a model available on the Hugging Face Hub.
model_id = "helenai/dslim-bert-base-NER-ov-fp32"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = OVModelForTokenClassification.from_pretrained(model_id)
pipe = pipeline("token-classification", model=model, tokenizer=tokenizer)
result = pipe("My name is Wolfgang and I live in Berlin")
print(result)
```
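
This model was produced by converting the original dslim/bert-base-NER checkpoint to OpenVINO. A minimal sketch of how such a conversion can be done yourself with optimum-intel's `export=True` option (the local directory name below is illustrative, not the one used for this repository):

```python
from optimum.intel.openvino import OVModelForTokenClassification
from transformers import AutoTokenizer

# export=True converts the original PyTorch checkpoint to OpenVINO IR on the fly
model = OVModelForTokenClassification.from_pretrained("dslim/bert-base-NER", export=True)
tokenizer = AutoTokenizer.from_pretrained("dslim/bert-base-NER")

# Save the converted model and tokenizer to a local directory,
# which can then be passed as model_id in the inference example above
save_dir = "dslim-bert-base-NER-ov-fp32"  # hypothetical local path
model.save_pretrained(save_dir)
tokenizer.save_pretrained(save_dir)
```

Saving produces the OpenVINO IR files (`openvino_model.xml` / `openvino_model.bin`) alongside the model configuration, so the directory can be loaded directly with `OVModelForTokenClassification.from_pretrained`.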