metadata
language:
- en
license: apache-2.0
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:843
- loss:MatryoshkaLoss
- loss:MultipleNegativesRankingLoss
base_model: BAAI/bge-base-en-v1.5
widget:
- source_sentence: >-
(1) No person shall make attempt to commit an offence. Even if it is
impossible for an offence to be committed for which attempt is made,
attempt shall be considered to have been committed. Except as otherwise
provided elsewhere in this Act, a person who attempts, or causes attempt,
to commit an offence shall be punished with one half of the punishment
specified for such offence. .
sentences:
- How is the punishment for an attempt determined?
- What are the different types of guarantees?
- >-
What are the specific types of crimes that are considered 'strict
liability'?
- source_sentence: >-
: (1) No person shall commit, or cause to be committed, cheating. (2) For
the purposes of sub-section (1), a person who dishonestly causes any kind
of loss, damage or injury to another person whom he or she makes believe
in some matter or to any other person or obtains any benefit for him or
her or any one else by omitting to do as per such belief or by inducement,
fraudulent, dishonest or otherwise deceptive act or preventing such other
person from doing any act shall be considered to commit cheating.
sentences:
- How is 'fraudulent concealment' defined?
- >-
What are the terms and restrictions that must be followed when producing
explosives under a license?
- >-
What is the process for determining the appropriate penalty for a
cheating offense?
- source_sentence: >-
(1) No person shall restraint or otherwise obstruct or hinder a person
who, upon knowing that an offence has been committed or is about to be
committed, intends to give information or notice about such offence to the
police or competent authority. imprisonment for a term not exceeding two
years or a fine not exceeding twenty thousand rupees or both the
sentences.
sentences:
- What actions constitute 'restraint, obstruction, or hindrance'?
- What are the consequences of engaging in such conduct?
- >-
What are the different categories of victims, and how do the penalties
vary based on their age?
- source_sentence: >-
This law prohibits the creation, use, possession, or sale of inaccurate
weighing, measuring, or quality-standard instruments. It also prohibits
tampering with seals or marks on these instruments, or manipulating their
accuracy. Violations carry a penalty of up to three years imprisonment
and a fine. Instruments and tools used in the offense are subject to
forfeiture.
sentences:
- What are the penalties for using banned currency?
- What is the time frame for reporting an offense under this law?
- When does this law come into effect?
- source_sentence: >-
This section lists factors that decrease the seriousness of a crime.
These include age (under 18 or over 75), lack of intent, provocation by
the victim, retaliation for a serious offense, confession and remorse,
surrender to authorities, compensation to the victim, diminished capacity,
insignificant harm, assistance in the judicial process, confession with a
promise of no future crime, and crimes committed under duress.
sentences:
- What constitutes "lack of intent" in this context?
- >-
What is the difference between an attempt and the actual commission of a
crime?
- >-
What are the exceptions to the prohibition on property transactions in
marriage?
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
- cosine_accuracy@1
- cosine_accuracy@3
- cosine_accuracy@5
- cosine_accuracy@10
- cosine_precision@1
- cosine_precision@3
- cosine_precision@5
- cosine_precision@10
- cosine_recall@1
- cosine_recall@3
- cosine_recall@5
- cosine_recall@10
- cosine_ndcg@10
- cosine_mrr@10
- cosine_map@100
model-index:
- name: BGE base Financial Matryoshka
results:
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 128
type: dim_128
metrics:
- type: cosine_accuracy@1
value: 0.13744075829383887
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.4312796208530806
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.5450236966824644
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.6445497630331753
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.13744075829383887
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.14375987361769352
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.10900473933649289
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.06445497630331752
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.13744075829383887
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.4312796208530806
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.5450236966824644
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.6445497630331753
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.38906851558265765
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.3073817046565864
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.31738583597003633
name: Cosine Map@100
BGE base Financial Matryoshka
This is a sentence-transformers model finetuned from BAAI/bge-base-en-v1.5. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
Model Details
Model Description
- Model Type: Sentence Transformer
- Base model: BAAI/bge-base-en-v1.5
- Maximum Sequence Length: 512 tokens
- Output Dimensionality: 768 dimensions
- Similarity Function: Cosine Similarity
- Language: en
- License: apache-2.0
Model Sources
- Documentation: Sentence Transformers Documentation
- Repository: Sentence Transformers on GitHub
- Hugging Face: Sentence Transformers on Hugging Face
Full Model Architecture
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': True}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
)
Usage
Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
pip install -U sentence-transformers
Then you can load this model and run inference.
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("sentence_transformers_model_id")
# Run inference
sentences = [
'This section lists factors that decrease the seriousness of a crime. These include age (under 18 or over 75), lack of intent, provocation by the victim, retaliation for a serious offense, confession and remorse, surrender to authorities, compensation to the victim, diminished capacity, insignificant harm, assistance in the judicial process, confession with a promise of no future crime, and crimes committed under duress.',
'What constitutes "lack of intent" in this context?',
'What are the exceptions to the prohibition on property transactions in marriage?',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
Evaluation
Metrics
Information Retrieval
- Dataset: dim_128
- Evaluated with InformationRetrievalEvaluator
Metric | Value |
---|---|
cosine_accuracy@1 | 0.1374 |
cosine_accuracy@3 | 0.4313 |
cosine_accuracy@5 | 0.545 |
cosine_accuracy@10 | 0.6445 |
cosine_precision@1 | 0.1374 |
cosine_precision@3 | 0.1438 |
cosine_precision@5 | 0.109 |
cosine_precision@10 | 0.0645 |
cosine_recall@1 | 0.1374 |
cosine_recall@3 | 0.4313 |
cosine_recall@5 | 0.545 |
cosine_recall@10 | 0.6445 |
cosine_ndcg@10 | 0.3891 |
cosine_mrr@10 | 0.3074 |
cosine_map@100 | 0.3174 |
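The scores above come from the Sentence Transformers InformationRetrievalEvaluator run at the 128-dimensional truncation (evaluator name dim_128); because each query (anchor) has exactly one relevant passage (positive), accuracy@k and recall@k coincide. A minimal sketch of such an evaluation, with hypothetical toy query/corpus ids and the placeholder model id from the usage section:
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

# Hypothetical toy data: query ids map to question text, corpus ids to passages,
# and each query id maps to the set of ids of its relevant passages.
queries = {"q1": "How is the punishment for an attempt determined?"}
corpus = {
    "d1": "No person shall make attempt to commit an offence. ...",
    "d2": "This law prohibits unlawful detention of individuals. ...",
}
relevant_docs = {"q1": {"d1"}}

model = SentenceTransformer("sentence_transformers_model_id", truncate_dim=128)
evaluator = InformationRetrievalEvaluator(
    queries=queries,
    corpus=corpus,
    relevant_docs=relevant_docs,
    name="dim_128",
)
results = evaluator(model)
print(results["dim_128_cosine_ndcg@10"])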
Training Details
Training Dataset
Unnamed Dataset
- Size: 843 training samples
- Columns: positive and anchor
- Approximate statistics based on the first 843 samples:

| | positive | anchor |
|---|---|---|
| type | string | string |
| details | min: 9 tokens, mean: 66.68 tokens, max: 151 tokens | min: 7 tokens, mean: 14.77 tokens, max: 39 tokens |
- Samples:

| positive | anchor |
|---|---|
| This law prohibits unlawful detention of individuals. It outlines penalties for unlawful confinement and obstruction of a person's movement. It also specifies a time limit for complaints related to certain offenses. | What is the process for reporting unlawful detention? |
| No complaint shall lie in relation to any of the offences under Section 290, after the expiry of three months from the date of commission of such offence, and in relation to any of the other offences under this Chapter, after the expiry of three months from the date of knowledge of commission of such act. | What are the time limits for reporting and prosecuting offenses related to animal cruelty? |
| (1) No person, being legally bound to receive a summons, process, notice, arrest warrant or order issued by the competent authority, shall abscond, with mala fide intention to avoid being served with such summons, process, notice, arrest warrant or order. | What is the legal definition of being "legally bound" to receive a document? |
- Loss: MatryoshkaLoss with these parameters:

  {
      "loss": "MultipleNegativesRankingLoss",
      "matryoshka_dims": [128],
      "matryoshka_weights": [1],
      "n_dims_per_step": -1
  }
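Putting the dataset and loss together: a minimal sketch, with a hypothetical two-row stand-in for the 843 training pairs, of how MultipleNegativesRankingLoss can be wrapped in MatryoshkaLoss with the parameters listed above:
from datasets import Dataset
from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss

# Hypothetical stand-in for the 843 (positive, anchor) pairs described above.
train_dataset = Dataset.from_dict({
    "positive": [
        "This law prohibits unlawful detention of individuals. ...",
        "No person shall make attempt to commit an offence. ...",
    ],
    "anchor": [
        "What is the process for reporting unlawful detention?",
        "How is the punishment for an attempt determined?",
    ],
})

model = SentenceTransformer("BAAI/bge-base-en-v1.5")

# In-batch negatives over the pairs, additionally applied to the truncated
# 128-dimensional embedding prefix (the parameters shown above).
loss = MatryoshkaLoss(
    model,
    MultipleNegativesRankingLoss(model),
    matryoshka_dims=[128],
    matryoshka_weights=[1],
    n_dims_per_step=-1,
)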
Training Hyperparameters
Non-Default Hyperparameters
- eval_strategy: epoch
- per_device_train_batch_size: 32
- per_device_eval_batch_size: 16
- gradient_accumulation_steps: 16
- learning_rate: 2e-05
- num_train_epochs: 1
- lr_scheduler_type: cosine
- warmup_ratio: 0.1
- tf32: False
- load_best_model_at_end: True
- optim: adamw_torch_fused
- batch_sampler: no_duplicates
All Hyperparameters
Click to expand
- overwrite_output_dir: False
- do_predict: False
- eval_strategy: epoch
- prediction_loss_only: True
- per_device_train_batch_size: 32
- per_device_eval_batch_size: 16
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 16
- eval_accumulation_steps: None
- torch_empty_cache_steps: None
- learning_rate: 2e-05
- weight_decay: 0.0
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1.0
- num_train_epochs: 1
- max_steps: -1
- lr_scheduler_type: cosine
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.1
- warmup_steps: 0
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 42
- data_seed: None
- jit_mode_eval: False
- use_ipex: False
- bf16: False
- fp16: False
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: False
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: False
- dataloader_num_workers: 0
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: True
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch_fused
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: None
- hub_always_push: False
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- include_for_metrics: []
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters:
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- dispatch_batches: None
- split_batches: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: False
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- eval_on_start: False
- use_liger_kernel: False
- eval_use_gather_object: False
- average_tokens_across_devices: False
- prompts: None
- batch_sampler: no_duplicates
- multi_dataset_batch_sampler: proportional
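As a sketch of how the non-default hyperparameters above translate into Sentence Transformers training arguments: output_dir is a hypothetical placeholder, and save_strategy is an assumption added here because load_best_model_at_end requires the save and evaluation strategies to match.
from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="bge-base-matryoshka-128",  # hypothetical output directory
    eval_strategy="epoch",
    save_strategy="epoch",  # assumption: must match eval_strategy when load_best_model_at_end=True
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=16,
    learning_rate=2e-05,
    num_train_epochs=1,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    tf32=False,
    load_best_model_at_end=True,
    optim="adamw_torch_fused",
    batch_sampler=BatchSamplers.NO_DUPLICATES,  # avoid duplicate texts within a batch
)
These arguments would then be passed, together with the model, training dataset, loss and evaluator from the sketches above, to a SentenceTransformerTrainer, whose train() call produced the logs below.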
Training Logs
Epoch | Step | dim_128_cosine_ndcg@10 |
---|---|---|
**0.5926** | **1** | **0.3891** |
- The bold row denotes the saved checkpoint.
Framework Versions
- Python: 3.10.12
- Sentence Transformers: 3.3.1
- Transformers: 4.47.1
- PyTorch: 2.5.1+cu121
- Accelerate: 0.27.0
- Datasets: 3.2.0
- Tokenizers: 0.21.0
Citation
BibTeX
Sentence Transformers
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
MatryoshkaLoss
@misc{kusupati2024matryoshka,
title={Matryoshka Representation Learning},
author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
year={2024},
eprint={2205.13147},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
MultipleNegativesRankingLoss
@misc{henderson2017efficient,
title={Efficient Natural Language Response Suggestion for Smart Reply},
author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
year={2017},
eprint={1705.00652},
archivePrefix={arXiv},
primaryClass={cs.CL}
}