SentenceTransformer based on sentence-transformers/all-MiniLM-L12-v2

This is a sentence-transformers model finetuned from sentence-transformers/all-MiniLM-L12-v2. It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: sentence-transformers/all-MiniLM-L12-v2
  • Maximum Sequence Length: 128 tokens
  • Output Dimensionality: 384 dimensions
  • Similarity Function: Cosine Similarity
  • Model Size: 69.8M parameters (F32 tensors)

Model Sources

  • Documentation: Sentence Transformers Documentation (https://www.sbert.net)
  • Repository: Sentence Transformers on GitHub (https://github.com/UKPLab/sentence-transformers)
  • Hugging Face: Sentence Transformers on Hugging Face (https://huggingface.co/models?library=sentence-transformers)

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel 
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
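
The three modules amount to a plain transformers forward pass: truncate to 128 tokens, mean-pool the token embeddings over the attention mask, and L2-normalize the result (so cosine similarity reduces to a dot product). A minimal sketch of the equivalent computation, assuming the checkpoint loads directly with transformers:

import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("Trelis/all-MiniLM-L12-v2-ft-pairs-balanced-cpu")
bert = AutoModel.from_pretrained("Trelis/all-MiniLM-L12-v2-ft-pairs-balanced-cpu")

def encode(texts):
    # (0) Transformer: tokenize, truncating to the 128-token maximum
    batch = tokenizer(texts, padding=True, truncation=True, max_length=128, return_tensors="pt")
    with torch.no_grad():
        token_embeddings = bert(**batch).last_hidden_state  # [batch, seq, 384]
    # (1) Pooling: mean over non-padding tokens only
    mask = batch["attention_mask"].unsqueeze(-1).float()
    pooled = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)
    # (2) Normalize: unit-length vectors, so cosine similarity is a dot product
    return F.normalize(pooled, p=2, dim=1)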

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("Trelis/all-MiniLM-L12-v2-ft-pairs-balanced-cpu")
# Run inference
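# The two long passages below are identical chunks of the FIT Touch Football
# playing-rules document, kept verbatim (including PDF-extraction noise and the
# truncated final word) to mirror the data the model was finetuned on.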
sentences = [
    'What happens if a player deliberately delays the changeover procedure after a Change of Possession?',
    ' Registration\n5\n03 I\nThe Ball\n6\n04 I\nPlaying Uniform\n6\n05 I\nTeam Composition\n6\n06 I\nTeam Coach and Team Officials\n7\n07\nI\nCommencement and Recommencement of Play\n7\n08\nI\nMatch Duration\n8\n09 I\nPossession\n8\n10\nI\nThe Touch\n9\n11\nI\nPassing\n10\n12\nI\nBall Touched in Flight\n10\n13\nI\nThe Rollball\n11\n14\nI\nScoring\n13\n15\nI\nOffside\n13\n16\nI\nObstruction\n14\n17\nI\nInterchange\n14\n18\nI\nPenalty\n15\n19\nI\nAdvantage\n16\n20\nI\nMisconduct\n16\n21\nI\nForced Interchange\n16\n22\nI\nSin Bin\n16\n23\nI\nDismissal\n17\n24\nI\nDrop-Off\n17\n25\nI\nMatch Officials\n18\nFIT Playing Rules - 5th Edition\nCOPYRIGHT © Touch Football Australia 2020\nFIT Playing Rules - 5th Edition\nCOPYRIGHT © Touch Football Australia 2020\n  Definitions and Terminology  \nUnless the contrary intention appears, the following definitions and terminology apply \nto the game of Touch:\nTERM/PHRASE\nDEFINITION/DESCRIPTION\nAdvantage\nThe period of time after an Infringement in which the non-offending \nside has the opportunity to gain Advantage either territorial, tactical \nor in the form of a Try.\nAttacking Try Line\nThe line on or over which a player has to place the ball to \nscore a Try.\nAttacking Team\nThe Team which has or is gaining Possession.\nBehind\nA position or direction towards a Team’s Defending Try Line.\nChange of Possession\nThe act of moving control of the ball from one Team to the other.\nDead/Dead Ball\nWhen the ball is out of play including the period following a Try and \nuntil the match is recommenced and when the ball goes to ground \nand/or outside the boundaries of the Field of Play prior to the \nsubsequent Rollball.\nDead Ball Line\nThe end boundaries of the Field of Play. There is one at each end of \nthe Field of Play. See Appendix 1.\nDef',
    ' Registration\n5\n03 I\nThe Ball\n6\n04 I\nPlaying Uniform\n6\n05 I\nTeam Composition\n6\n06 I\nTeam Coach and Team Officials\n7\n07\nI\nCommencement and Recommencement of Play\n7\n08\nI\nMatch Duration\n8\n09 I\nPossession\n8\n10\nI\nThe Touch\n9\n11\nI\nPassing\n10\n12\nI\nBall Touched in Flight\n10\n13\nI\nThe Rollball\n11\n14\nI\nScoring\n13\n15\nI\nOffside\n13\n16\nI\nObstruction\n14\n17\nI\nInterchange\n14\n18\nI\nPenalty\n15\n19\nI\nAdvantage\n16\n20\nI\nMisconduct\n16\n21\nI\nForced Interchange\n16\n22\nI\nSin Bin\n16\n23\nI\nDismissal\n17\n24\nI\nDrop-Off\n17\n25\nI\nMatch Officials\n18\nFIT Playing Rules - 5th Edition\nCOPYRIGHT © Touch Football Australia 2020\nFIT Playing Rules - 5th Edition\nCOPYRIGHT © Touch Football Australia 2020\n  Definitions and Terminology  \nUnless the contrary intention appears, the following definitions and terminology apply \nto the game of Touch:\nTERM/PHRASE\nDEFINITION/DESCRIPTION\nAdvantage\nThe period of time after an Infringement in which the non-offending \nside has the opportunity to gain Advantage either territorial, tactical \nor in the form of a Try.\nAttacking Try Line\nThe line on or over which a player has to place the ball to \nscore a Try.\nAttacking Team\nThe Team which has or is gaining Possession.\nBehind\nA position or direction towards a Team’s Defending Try Line.\nChange of Possession\nThe act of moving control of the ball from one Team to the other.\nDead/Dead Ball\nWhen the ball is out of play including the period following a Try and \nuntil the match is recommenced and when the ball goes to ground \nand/or outside the boundaries of the Field of Play prior to the \nsubsequent Rollball.\nDead Ball Line\nThe end boundaries of the Field of Play. There is one at each end of \nthe Field of Play. See Appendix 1.\nDef',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 384]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
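
Since semantic search is among the intended uses, here is a minimal retrieval sketch. The corpus strings are made-up stand-ins, not taken from the training data:

from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("Trelis/all-MiniLM-L12-v2-ft-pairs-balanced-cpu")

# Hypothetical mini-corpus; replace with your own passages
corpus = [
    "A player must perform the Rollball at the Mark after a Change of Possession.",
    "The Dead Ball Line marks the end boundaries of the Field of Play.",
]
query = "Where does the Rollball take place after possession changes?"

corpus_embeddings = model.encode(corpus, convert_to_tensor=True)
query_embedding = model.encode(query, convert_to_tensor=True)

# Embeddings are L2-normalized, so cosine similarity ranks the passages
hits = util.semantic_search(query_embedding, corpus_embeddings, top_k=2)[0]
for hit in hits:
    print(corpus[hit["corpus_id"]], round(hit["score"], 3))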

Training Details

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 4
  • per_device_eval_batch_size: 4
  • learning_rate: 1e-05
  • num_train_epochs: 1
  • lr_scheduler_type: cosine
  • warmup_ratio: 0.3
  • bf16: True

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 4
  • per_device_eval_batch_size: 4
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • learning_rate: 1e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 1
  • max_steps: -1
  • lr_scheduler_type: cosine
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.3
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: True
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: False
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: proportional

Training Logs

Epoch    Step   Training Loss   Validation Loss
0.1053   2      4.6868          -
0.1579   3      -               2.7075
0.2105   4      5.703           -
0.3158   6      2.1691          2.6412
0.4211   8      1.705           -
0.4737   9      -               2.6254
0.5263   10     1.7985          -
0.6316   12     3.4822          2.6087
0.7368   14     4.2724          -
0.7895   15     -               2.6000
0.8421   16     3.1489          -
0.9474   18     5.7594          2.6032

Framework Versions

  • Python: 3.10.12
  • Sentence Transformers: 3.0.1
  • Transformers: 4.41.2
  • PyTorch: 2.1.1+cu121
  • Accelerate: 0.31.0
  • Datasets: 2.17.1
  • Tokenizers: 0.19.1
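
To approximate this environment, pin the same releases (a sketch; the PyTorch 2.1.1+cu121 wheel must come from the matching CUDA index):

pip install sentence-transformers==3.0.1 transformers==4.41.2 accelerate==0.31.0 datasets==2.17.1 tokenizers==0.19.1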

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

CoSENTLoss

@online{kexuefm-8847,
    title = "CoSENT: A more efficient sentence vector scheme than Sentence-BERT",
    author = "Su, Jianlin",
    year = "2022",
    month = "1",
    url = "https://kexue.fm/archives/8847",
}