SentenceTransformer based on thenlper/gte-large

This is a sentence-transformers model finetuned from thenlper/gte-large. It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: thenlper/gte-large
  • Maximum Sequence Length: 512 tokens
  • Output Dimensionality: 1024 dimensions
  • Similarity Function: Cosine Similarity
  • Model Size: ~335M parameters (FP16 safetensors)

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel 
  (1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
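
Because the pipeline ends with mean pooling followed by a Normalize() module, every embedding is unit-length, so dot product and cosine similarity coincide. As a rough illustration (not the library's internal code), the three modules above compute approximately the following with the plain transformers API; this assumes the transformer weights are loadable directly from the repository, as is typical for Sentence Transformers checkpoints:

import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("Dataologist/gte_large_op")
model = AutoModel.from_pretrained("Dataologist/gte_large_op")

batch = tokenizer(
    ["a sentence", "another sentence"],
    padding=True, truncation=True, max_length=512, return_tensors="pt",
)
with torch.no_grad():
    token_embeddings = model(**batch).last_hidden_state  # (batch, tokens, 1024)

# Mean pooling over non-padding tokens (pooling_mode_mean_tokens=True)
mask = batch["attention_mask"].unsqueeze(-1).float()
sentence_embeddings = (token_embeddings * mask).sum(1) / mask.sum(1)

# Normalize() makes the vectors unit-length, so dot product == cosine similarity
sentence_embeddings = F.normalize(sentence_embeddings, p=2, dim=1)
print(sentence_embeddings.shape)  # (2, 1024)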

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("Dataologist/gte_large_op")
# Run inference
sentences = [
    '\'American Bandstand\' is a 8.71/10 rated T.V. Show, starring Dick Clark. It is about American Bandstand was an American music-performance show that aired in various versions from 1952 to 1989 and was hosted from 1956 until its final season by Dick Clark, who also served as producer. The show featured teenagers dancing to Top 40 music introduced by Clark; at least one popular musical act—over the decades, running the gamut from Jerry Lee Lewis to Run DMC—would usually appear in person to lip-sync one of their latest singles. Freddy "Boom Boom" Cannon holds the record for most appearances at 110.\n\nThe show\'s popularity helped Dick Clark become an American media mogul and inspired similar long-running music programs, such as Soul Train and Top of the Pops. Clark eventually assumed ownership of the program through his Dick Clark Productions company..',
    '\'American Bandstand\' is a 8.71/10 rated T.V. Show, starring Dick Clark. It is about American Bandstand was an American music-performance show that aired in various versions from 1952 to 1989 and was hosted from 1956 until its final season by Dick Clark, who also served as producer. The show featured teenagers dancing to Top 40 music introduced by Clark; at least one popular musical act—over the decades, running the gamut from Jerry Lee Lewis to Run DMC—would usually appear in person to lip-sync one of their latest singles. Freddy "Boom Boom" Cannon holds the record for most appearances at 110.\n\nThe show\'s popularity helped Dick Clark become an American media mogul and inspired similar long-running music programs, such as Soul Train and Top of the Pops. Clark eventually assumed ownership of the program through his Dick Clark Productions company..',
    "'White Lies' is a No Rating/10 rated T.V. Show, starring Natalie Dormer, Brendon Daniels, Daniel Schultz, Morgan Santo, Langley Kirkwood. It is about Edie Hansen, who is set in the affluent Cape Town neighborhood of Bishopscourt, is drawn into the gritty underbelly of the city, which hides beneath its gorgeous beauty and takes her back to a stormy past..",
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 1024]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
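
Note that the first two sentences above are identical, so their pairwise similarity is exactly 1.0. Beyond pairwise scores, the same embeddings support semantic search; below is a minimal sketch with a made-up query and corpus:

from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("Dataologist/gte_large_op")

# Made-up corpus in the same style as the training data
corpus = [
    "'American Bandstand' is a T.V. Show about teenagers dancing to Top 40 music, hosted by Dick Clark.",
    "'White Lies' is a T.V. Show about a Cape Town neighborhood's gritty underbelly.",
]
query = "long-running American music television program"

corpus_embeddings = model.encode(corpus, convert_to_tensor=True)
query_embedding = model.encode(query, convert_to_tensor=True)

# Cosine-similarity top-k retrieval over the corpus
hits = util.semantic_search(query_embedding, corpus_embeddings, top_k=2)[0]
for hit in hits:
    print(f"{hit['score']:.3f}  {corpus[hit['corpus_id']]}")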

Training Details

Training Dataset

Unnamed Dataset

  • Size: 602,010 training samples
  • Columns: sentence_0 and sentence_1
  • Approximate statistics based on the first 1000 samples (identical for both columns):

                  sentence_0      sentence_1
    type          string          string
    min tokens    23              23
    mean tokens   95.98           95.98
    max tokens    368             368
  • Samples (sentence_0 and sentence_1 are identical in every pair; each is shown once):
    1. 'Down Stream Highway' is a No Rating/10 rated Movie, starring . It is about Narrated by Bill Slater, this short black & white educational film is about sporting and outdoor activities on the majestic Hudson River in New York State..
    2. 'La joueuse d'orgue' is a No Rating/10 rated Movie, starring Marcelle Géniat, Pierre Larquey, Jacques Varennes, Gaby Triquet, France Ellys. It is about Robert Bernier murdered his brother with the complicity of a worker. The only witness to the tragedy, Veronique was injured while rescuing her boss and remains blind. Later, cured by an operation, she denounces the criminal whose voice she recognized and who had taken over the factory..
    3. 'Disoriented' is a 8.0/10 rated Movie, starring . It is about Twenty-something West Cordova is trapped in a waking nightmare. His overbearing mother is bent on molding him into a MD. His crazy, "wannabe-a-supermodel," Japanese girlfriend craves blonde hair and round eyes. And his long lost, jock brother just returned home having traded his high tops for high heels. If young "Doctor" Cordova can pass pre-med, mend his fractured family and revive his romance, he may just discover the cure for his own unraveling identity..
  • Loss: MultipleNegativesRankingLoss with these parameters (a minimal construction sketch follows this list):
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim"
    }
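
With MultipleNegativesRankingLoss, each (sentence_0, sentence_1) pair treats every other sentence_1 in the batch as an in-batch negative. A minimal sketch of constructing the loss with the parameters above (which are also the library defaults):

from sentence_transformers import SentenceTransformer, util
from sentence_transformers.losses import MultipleNegativesRankingLoss

model = SentenceTransformer("thenlper/gte-large")

# scale=20.0 and cosine similarity match the reported parameters
loss = MultipleNegativesRankingLoss(model, scale=20.0, similarity_fct=util.cos_sim)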
    

Training Hyperparameters

Non-Default Hyperparameters

  • per_device_train_batch_size: 4
  • per_device_eval_batch_size: 4
  • multi_dataset_batch_sampler: round_robin
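
A hedged sketch of how these non-default hyperparameters map onto the Sentence Transformers trainer API; the output directory and the one-row dataset are placeholders standing in for the 602,010-pair dataset described above:

from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import MultipleNegativesRankingLoss
from sentence_transformers.training_args import MultiDatasetBatchSamplers

model = SentenceTransformer("thenlper/gte-large")
loss = MultipleNegativesRankingLoss(model)

# Placeholder stand-in for the real training dataset
train_dataset = Dataset.from_dict({
    "sentence_0": ["example text"],
    "sentence_1": ["example text"],
})

args = SentenceTransformerTrainingArguments(
    output_dir="gte_large_op",  # placeholder path
    num_train_epochs=3,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    multi_dataset_batch_sampler=MultiDatasetBatchSamplers.ROUND_ROBIN,
)

trainer = SentenceTransformerTrainer(
    model=model, args=args, train_dataset=train_dataset, loss=loss,
)
trainer.train()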

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: no
  • prediction_loss_only: True
  • per_device_train_batch_size: 4
  • per_device_eval_batch_size: 4
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 5e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1
  • num_train_epochs: 3
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.0
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: round_robin

Training Logs

Loss was logged every 500 steps across 3 epochs (451,500 steps total). Only the first log point is non-zero; from step 1000 onward the reported loss is 0.0 at every 500-step interval:

Epoch           Step          Training Loss
0.0033          500           0.0483
0.0066–2.9999   1000–451500   0.0

Because sentence_0 and sentence_1 are identical in every training pair, the in-batch ranking task posed by MultipleNegativesRankingLoss is nearly trivial, so the reported loss dropping below display precision is expected rather than a sign of a broken run.

Framework Versions

  • Python: 3.11.11
  • Sentence Transformers: 3.4.1
  • Transformers: 4.48.3
  • PyTorch: 2.5.1+cu124
  • Accelerate: 1.3.0
  • Datasets: 3.3.2
  • Tokenizers: 0.21.0
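
To check that a local environment matches the versions above before trying to reproduce results, a quick sanity snippet:

import sentence_transformers
import torch
import transformers

# Compare against the versions listed above
print("sentence-transformers:", sentence_transformers.__version__)
print("transformers:", transformers.__version__)
print("torch:", torch.__version__)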

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MultipleNegativesRankingLoss

@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}