## Usage

```shell
pip install -U FlagEmbedding
```
### Generate embeddings for text (dense only)
```python
from FlagEmbedding import FlagModel

model_name = "puppyyyo/4_crime_aug-m3-law-knowledge-v2"
model = FlagModel(
    model_name,
    use_fp16=False  # set True for faster inference at a small cost in accuracy
)

sentences_1 = ["What is BGE M3?", "Definition of BM25"]
sentences_2 = [
    "BGE M3 is an embedding model supporting dense retrieval, lexical matching and multi-vector interaction.",
    "BM25 is a bag-of-words retrieval function that ranks a set of documents based on the query terms appearing in each document",
]

embeddings_1 = model.encode(sentences_1)
embeddings_2 = model.encode(sentences_2)

# The embeddings are L2-normalized, so the matrix product yields cosine similarities.
similarity = embeddings_1 @ embeddings_2.T
print(similarity)
```
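To see what `embeddings_1 @ embeddings_2.T` computes, here is a minimal NumPy sketch using small toy vectors in place of the model's real embeddings (which are much higher-dimensional): for L2-normalized rows, the matrix product is a grid of cosine similarities, one entry per (query, passage) pair.

```python
import numpy as np

# Toy stand-ins for model output; each row is already L2-normalized,
# mirroring what FlagModel.encode returns by default.
embeddings_1 = np.array([[1.0, 0.0], [0.6, 0.8]])  # two "queries"
embeddings_2 = np.array([[0.8, 0.6], [0.0, 1.0]])  # two "passages"

# similarity[i, j] = cosine similarity between query i and passage j
similarity = embeddings_1 @ embeddings_2.T
print(similarity)
```

Each row of the result scores one query against every passage, so ranking passages for a query is just a sort along that row.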
## Model tree for puppyyyo/4_crime_aug-m3-law-knowledge-v2

Base model: BAAI/bge-base-zh-v1.5