Update README.md
README.md CHANGED
@@ -78,51 +78,40 @@ model = GLiNER.from_pretrained("knowledgator/gliner-qwen-1.5B-v1.0",
max_len = 2048).to('cuda:0')
```

-If you have a large amount of entities and want to pre-embed them, please, refer to the following code snippet:
-
-```python
-labels = ["your entities"]
-texts = ["your texts"]
-
-entity_embeddings = model.encode_labels(labels, batch_size = 8)
-
-outputs = model.batch_predict_with_embeds(texts, entity_embeddings, labels)
-```
-
### Benchmarks
Below you can see the table with benchmarking results on various named entity recognition datasets:

-| Dataset
-|
-| ACE 2004
-| ACE 2005
-| AnatEM
-| Broad Tweet Corpus
-| CoNLL 2003
-| FabNER
-| FindVehicle
-| GENIA_NER
-| HarveyNER
-| MultiNERD
-| Ontonotes
-| PolyglotNER
-| TweetNER7
-| WikiANN en
-| WikiNeural
-| bc2gm
-| bc4chemd
-| bc5cdr
-| ncbi
-| **Average**
-| |
-| CrossNER_AI
-| CrossNER_literature
-| CrossNER_music
-| CrossNER_politics
-| CrossNER_science
-| mit-movie
-| mit-restaurant
-| **Average (zero-shot benchmark)** | **58.
+| Dataset | Score |
+|-----------------------------|--------|
+| ACE 2004 | 29.8% |
+| ACE 2005 | 26.8% |
+| AnatEM | 43.7% |
+| Broad Tweet Corpus | 68.3% |
+| CoNLL 2003 | 67.5% |
+| FabNER | 24.9% |
+| FindVehicle | 33.2% |
+| GENIA_NER | 58.8% |
+| HarveyNER | 19.5% |
+| MultiNERD | 65.1% |
+| Ontonotes | 39.9% |
+| PolyglotNER | 45.8% |
+| TweetNER7 | 37.0% |
+| WikiANN en | 56.0% |
+| WikiNeural | 78.3% |
+| bc2gm | 58.1% |
+| bc4chemd | 65.7% |
+| bc5cdr | 72.3% |
+| ncbi | 63.3% |
+| **Average** | **50.2%** |
+| | |
+| CrossNER_AI | 58.3% |
+| CrossNER_literature | 64.4% |
+| CrossNER_music | 71.5% |
+| CrossNER_politics | 70.5% |
+| CrossNER_science | 65.1% |
+| mit-movie | 47.5% |
+| mit-restaurant | 33.1% |
+| **Average (zero-shot benchmark)** | **58.6%** |

### Join Our Discord
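
For orientation when reading this diff: the retained snippet above only covers loading the model, and the pre-embedding example was removed in this commit. Below is a minimal inference sketch using the standard GLiNER `predict_entities` call, which is not part of this hunk; the text, labels, and threshold are illustrative placeholders.

```python
from gliner import GLiNER

# Load the model as in the README snippet above (GPU assumed; omit .to(...) to stay on CPU).
model = GLiNER.from_pretrained(
    "knowledgator/gliner-qwen-1.5B-v1.0",
    max_len=2048,
).to("cuda:0")

# Placeholder inputs for illustration only.
text = "Albert Einstein was born in Ulm and later worked at Princeton University."
labels = ["person", "location", "organization"]

# predict_entities returns a list of dicts with the matched span, its label, and a score.
entities = model.predict_entities(text, labels, threshold=0.5)
for entity in entities:
    print(entity["text"], "=>", entity["label"])
```

The snippet removed by this commit served the same workflow at scale: `model.encode_labels(labels, batch_size=8)` embeds the label set once, and `model.batch_predict_with_embeds(texts, entity_embeddings, labels)` reuses those embeddings across many texts instead of re-encoding the labels on every call.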