xiaoou/am-sentence

Tags: Token Classification · Transformers · Safetensors · roberta · Inference Endpoints
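The tags mark this as a RoBERTa checkpoint for token classification served through the standard Transformers stack, so it should load with the stock `pipeline` API. A minimal sketch, assuming the checkpoint works with the default task head; the actual tag set comes from this repo's `config.json` and is not documented on the page, so the labels in the output are whatever the checkpoint defines:

```python
from transformers import pipeline

# Load the repo as a token-classification pipeline.
# aggregation_strategy="simple" merges sub-word pieces into word-level tags.
nlp = pipeline(
    "token-classification",
    model="xiaoou/am-sentence",
    aggregation_strategy="simple",
)

# The entities/labels printed here depend on the checkpoint's label map.
print(nlp("This is an example sentence."))
```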
Branch: main · 1 contributor · History: 3 commits
Latest commit: 2d09bad, "Upload tokenizer" by xiaoou, about 1 year ago

Files:
| File | Size | Last commit | Updated |
|---|---|---|---|
| .gitattributes | 1.52 kB | initial commit | about 1 year ago |
| config.json | 1.01 kB | Upload RobertaForTokenClassification | about 1 year ago |
| merges.txt | 456 kB | Upload tokenizer | about 1 year ago |
| model.safetensors | 496 MB (LFS) | Upload RobertaForTokenClassification | about 1 year ago |
| special_tokens_map.json | 958 Bytes | Upload tokenizer | about 1 year ago |
| tokenizer.json | 2.11 MB | Upload tokenizer | about 1 year ago |
| tokenizer_config.json | 1.32 kB | Upload tokenizer | about 1 year ago |
| vocab.json | 798 kB | Upload tokenizer | about 1 year ago |
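These files split cleanly into tokenizer artifacts and model artifacts, and the standard Auto classes consume them accordingly. A sketch of how each file maps to a component, assuming nothing beyond the stock from_pretrained loading path:

```python
from transformers import AutoModelForTokenClassification, AutoTokenizer

repo = "xiaoou/am-sentence"

# Tokenizer artifacts: tokenizer.json, vocab.json, merges.txt,
# tokenizer_config.json, special_tokens_map.json
tokenizer = AutoTokenizer.from_pretrained(repo)

# Model artifacts: config.json (architecture + label map) and
# model.safetensors (the 496 MB LFS weight file)
model = AutoModelForTokenClassification.from_pretrained(repo)

# The label set lives in config.json; inspect it rather than guessing.
print(model.config.id2label)
```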