Ihor committed
Commit 1d17cca
1 Parent(s): 7c6d3b5

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -75,7 +75,7 @@ If you want to use flash attention or increase sequence length, please, check th
 from gliner import GLiNER
 import torch
 
-model = GLiNER.from_pretrained("knowledgator/gliner-qwen-1.5B-v1.0",
+model = GLiNER.from_pretrained("knowledgator/gliner-bi-llama-v1.0",
     _attn_implementation = 'flash_attention_2',
     max_len = 2048).to('cuda:0', dtype=torch.float16)
 ```
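For context, the sketch below (not part of the commit) shows how the updated loading call from the README could be used end to end for zero-shot entity extraction. It assumes a CUDA GPU with flash-attn installed; the sample text and entity labels are illustrative only.

```python
import torch
from gliner import GLiNER

# Load the model exactly as in the updated README snippet:
# flash attention, extended sequence length, fp16 on GPU.
model = GLiNER.from_pretrained(
    "knowledgator/gliner-bi-llama-v1.0",
    _attn_implementation="flash_attention_2",
    max_len=2048,
).to("cuda:0", dtype=torch.float16)

# Illustrative input; any text and label set can be supplied.
text = "Cristiano Ronaldo plays for Al Nassr in Riyadh."
labels = ["person", "organization", "location"]

# predict_entities is GLiNER's standard inference entry point.
entities = model.predict_entities(text, labels, threshold=0.5)
for ent in entities:
    print(ent["text"], "=>", ent["label"])
```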