diff --git "a/README.md" "b/README.md" --- "a/README.md" +++ "b/README.md" @@ -6,120 +6,98 @@ tags: - sentence-similarity - feature-extraction - generated_from_trainer -- dataset_size:4330781 +- dataset_size:4731012 - loss:MultipleNegativesRankingLoss +- loss:CachedMultipleNegativesRankingLoss - loss:SoftmaxLoss - loss:CosineSimilarityLoss base_model: tasksource/ModernBERT-base-nli widget: -- source_sentence: Daniel went to the kitchen. Sandra went back to the kitchen. Daniel - moved to the garden. Sandra grabbed the apple. Sandra went back to the office. - Sandra dropped the apple. Sandra went to the garden. Sandra went back to the bedroom. - Sandra went back to the office. Mary went back to the office. Daniel moved to - the bathroom. Sandra grabbed the apple. Sandra travelled to the garden. Sandra - put down the apple there. Mary went back to the bathroom. Daniel travelled to - the garden. Mary took the milk. Sandra grabbed the apple. Mary left the milk there. - Sandra journeyed to the bedroom. John travelled to the office. John went back - to the garden. Sandra journeyed to the garden. Mary grabbed the milk. Mary left - the milk. Mary grabbed the milk. Mary went to the hallway. John moved to the hallway. - Mary picked up the football. Sandra journeyed to the kitchen. Sandra left the - apple. Mary discarded the milk. John journeyed to the garden. Mary dropped the - football. Daniel moved to the bathroom. Daniel journeyed to the kitchen. Mary - travelled to the bathroom. Daniel went to the bedroom. Mary went to the hallway. - Sandra got the apple. Sandra went back to the hallway. Mary moved to the kitchen. - Sandra dropped the apple there. Sandra grabbed the milk. Sandra journeyed to the - bathroom. John went back to the kitchen. Sandra went to the kitchen. Sandra travelled - to the bathroom. Daniel went to the garden. Daniel moved to the kitchen. Sandra - dropped the milk. Sandra got the milk. Sandra put down the milk. John journeyed - to the garden. 
Sandra went back to the hallway. Sandra picked up the apple. Sandra - got the football. Sandra moved to the garden. Daniel moved to the bathroom. Daniel - travelled to the garden. Sandra went back to the bathroom. Sandra discarded the - football. +- source_sentence: Christa McAuliffe taught social studies at Concord High School. sentences: - - In the adulthood stage, it can jump, walk, run - - The chocolate is bigger than the container. - - The football before the bathroom was in the garden. -- source_sentence: 'Context: I am devasted. + - The Football League play-offs for the 1994 -- 95 season were held in May 1995 + , with the finals taking place at Wembley Stadium in London .. Football League + play-offs. Football League play-offs. 1994 Football League play-offs. Wembley + Stadium. Wembley Stadium ( 1923 ). London. London. The play-off semi-finals were + played over two legs and were contested by the teams who finished in 2nd , 3rd + , 4th and 5th place in the Football League First Division and Football League + Second Division and the 3rd , 4th , 5th , and 6th placed teams in the Football + League Third Division table .. Football League First Division. 1994–95 Football + League First Division. Football League Second Division. 1994–95 Football League + Second Division. Football League Third Division. 1994–95 Football League Third + Division. The winners of the semi-finals progressed through to the finals , with + the winner of these matches gaining promotion for the following season .. following + season. 1995-96 in English football + - Sir Alexander Mackenzie Elementary is a public elementary school in Vancouver + , British Columbia part of School District 39 Vancouver .. Vancouver. Vancouver, + British Columbia. British Columbia. British Columbia. School District 39 Vancouver. + School District 39 Vancouver. elementary school. 
elementary school + - 'Help Wanted -LRB- Hataraku Hito : Hard Working People in Japan , Job Island : + Hard Working People in Europe -RRB- is a game that features a collection of various + , Wii Remote-based minigames .. Wii. Wii. Wii Remote. Wii Remote. The game is + developed and published by Hudson Soft and was released in Japan for Nintendo + ''s Wii on November 27 , 2008 , in Europe on March 13 , 2009 , in Australia on + March 27 , 2009 , and in North America on May 12 , 2009 .. Hudson Soft. Hudson + Soft. Wii. Wii. Nintendo. Nintendo' +- source_sentence: The researchers asked children of different ages to use words to + form semantic correspondence. For example, when children see the words eagle, + bear and robin, they combine them best according to their meaning. The results + showed that older participants were more likely to develop different types of + false memory than younger participants. Because there are many forms of classification + in their minds. For example, young children classify eagles and robins as birds, + while older children classify eagles and bears as predators. Compared with children, + they have a concept of predators in their minds. + sentences: + - Extractive Industries Transparency Initiative is an organization + - Mason heard a pun + - Older children are more likely to have false memories than younger ones conforms + to the context. +- source_sentence: 'Version 0.5 is released today. The biggest change is that this + version finally has upload progress. - Speaker 1: I am very devastated these days. + Download it here: - Speaker 2: That seems bad and I am sorry to hear that. What happened? + Or go to for more information about this project. - Speaker 1: My father day 3 weeks ago.I still can''t believe. + Changelog: - Speaker 2: I am truly sorry to hear that. Please accept my apologies for your - loss. 
May he rest in peace' - sentences: - - 'The main emotion of this example dialogue is: content' - - 'This text is about: genealogy' - - The intent of this example is to be offensive/disrespectful. -- source_sentence: in three distinguish’d parts, with three distinguish’d guides + * Refactored the authentication_controller + + * Put before_filter :authorize in ApplicationController (and using skip_before_filter + in other controllers if necessary) + + * Using ''unless'' instead of ''if not'' + + * Using find_by() instead of find(:first) + + * Upload progress (yay!) + + Forums | + + Admin' sentences: - - This example is paraphrase. - - This example is neutral. - - This example is negative. -- source_sentence: A boy is playing a piano. + - This example wikipedia comment contains an insult. + - 'This text is about: hardware update' + - The example summary is factually consistent with the full article. +- source_sentence: 'Make sure to make it to the Brew House in Pella, IA tomorrow @ + 3 to meet with @user supporters! #SemST' sentences: - - Nine killed in Syrian-linked clashes in Lebanon - - A man is singing and playing a guitar. - - My opinion is to wait until the child itself expresses a desire for this. -- source_sentence: Francis I of France was a king. + - This example is ANT. + - This example is valid question. + - This example is favor. +- source_sentence: Also at increased risk are those whose immune systems suppressed + by medications or by diseases such as cancer, diabetes and AIDS. sentences: - - The Apple QuickTake -LRB- codenamed Venus , Mars , Neptune -RRB- is one of the - first consumer digital camera lines .. digital camera. digital camera. It was - launched in 1994 by Apple Computer and was marketed for three years before being - discontinued in 1997 .. Apple Computer. Apple Computer. Three models of the product - were built including the 100 and 150 , both built by Kodak ; and the 200 , built - by Fujifilm .. Kodak. Kodak. Fujifilm. Fujifilm. 
The QuickTake cameras had a resolution - of 640 x 480 pixels maximum -LRB- 0.3 Mpx -RRB- .. resolution. Display resolution. - The 200 model is only officially compatible with the Apple Macintosh for direct - connections , while the 100 and 150 model are compatible with both the Apple Macintosh - and Microsoft Windows .. Apple Macintosh. Apple Macintosh. Microsoft Windows. - Microsoft Windows. Because the QuickTake 200 is almost identical to the Fuji DS-7 - or to Samsung 's Kenox SSC-350N , Fuji 's software for that camera can be used - to gain Windows compatibility for the QuickTake 200 .. Some other software replacements - also exist as well as using an external reader for the removable media of the - QuickTake 200 .. Time Magazine profiled QuickTake as `` the first consumer digital - camera '' and ranked it among its `` 100 greatest and most influential gadgets - from 1923 to the present '' list .. digital camera. digital camera. Time Magazine. - Time Magazine. While the QuickTake was probably the first digicam to have wide - success , technically this is not true as the greyscale Dycam Model 1 -LRB- also - marketed as the Logitech FotoMan -RRB- was the first consumer digital camera to - be sold in the US in November 1990 .. digital camera. digital camera. greyscale. - greyscale. At least one other camera , the Fuji DS-X , was sold in Japan even - earlier , in late 1989 . - - The ganglion cell layer -LRB- ganglionic layer -RRB- is a layer of the retina - that consists of retinal ganglion cells and displaced amacrine cells .. retina. - retina. In the macula lutea , the layer forms several strata .. macula lutea. - macula lutea. The cells are somewhat flask-shaped ; the rounded internal surface - of each resting on the stratum opticum , and sending off an axon which is prolonged - into it .. flask. Laboratory flask. stratum opticum. stratum opticum. axon. axon. 
- From the opposite end numerous dendrites extend into the inner plexiform layer - , where they branch and form flattened arborizations at different levels .. inner - plexiform layer. inner plexiform layer. arborizations. arborizations. dendrites. - dendrites. The ganglion cells vary much in size , and the dendrites of the smaller - ones as a rule arborize in the inner plexiform layer as soon as they enter it - ; while those of the larger cells ramify close to the inner nuclear layer .. inner - plexiform layer. inner plexiform layer. dendrites. dendrites. inner nuclear layer. - inner nuclear layer - - Coyote was a brand of racing chassis designed and built for the use of A. J. Foyt - 's race team in USAC Championship car racing including the Indianapolis 500 .. - A. J. Foyt. A. J. Foyt. USAC. United States Auto Club. Championship car. American - Championship car racing. Indianapolis 500. Indianapolis 500. It was used from - 1966 to 1983 with Foyt himself making 141 starts in the car , winning 25 times - .. George Snider had the second most starts with 24 .. George Snider. George Snider. - Jim McElreath has the only other win with a Coyote chassis .. Jim McElreath. Jim - McElreath. Foyt drove a Coyote to victory in the Indy 500 in 1967 and 1977 .. - With Foyt 's permission , fellow Indy 500 champion Eddie Cheever 's Cheever Racing - began using the Coyote name for his new Daytona Prototype chassis , derived from - the Fabcar chassis design that he had purchased the rights to in 2007 .. Eddie - Cheever. Eddie Cheever. Cheever Racing. Cheever Racing. Daytona Prototype. Daytona - Prototype + - In 1995, the last survey, those numbers were equal. + - Also at increased risk are those with suppressed immune systems due to illness + or medicines. 
+ - Singapore stocks close 0.54 pct higher datasets: - tomaarsen/natural-questions-hard-negatives - tomaarsen/gooaq-hard-negatives - bclavie/msmarco-500k-triplets -- sentence-transformers/all-nli +- sentence-transformers/gooaq +- sentence-transformers/natural-questions - tasksource/merged-2l-nli - tasksource/merged-3l-nli - tasksource/zero-shot-label-nli @@ -134,7 +112,7 @@ library_name: sentence-transformers # SentenceTransformer based on tasksource/ModernBERT-base-nli -This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [tasksource/ModernBERT-base-nli](https://huggingface.co/tasksource/ModernBERT-base-nli) on the [tomaarsen/natural-questions-hard-negatives](https://huggingface.co/datasets/tomaarsen/natural-questions-hard-negatives), [tomaarsen/gooaq-hard-negatives](https://huggingface.co/datasets/tomaarsen/gooaq-hard-negatives), [bclavie/msmarco-500k-triplets](https://huggingface.co/datasets/bclavie/msmarco-500k-triplets), [sentence-transformers/all-nli](https://huggingface.co/datasets/sentence-transformers/all-nli), [merged-2l-nli](https://huggingface.co/datasets/tasksource/merged-2l-nli), [merged-3l-nli](https://huggingface.co/datasets/tasksource/merged-3l-nli), [zero-shot-label-nli](https://huggingface.co/datasets/tasksource/zero-shot-label-nli), [dataset_train_nli](https://huggingface.co/datasets/MoritzLaurer/dataset_train_nli), [paws/labeled_final](https://huggingface.co/datasets/paws), [glue/mrpc](https://huggingface.co/datasets/glue), [glue/qqp](https://huggingface.co/datasets/glue), [fever-evidence-related](https://huggingface.co/datasets/mwong/fever-evidence-related), [glue/stsb](https://huggingface.co/datasets/glue), sick/relatedness and [sts-companion](https://huggingface.co/datasets/tasksource/sts-companion) datasets. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more. 
+This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [tasksource/ModernBERT-base-nli](https://huggingface.co/tasksource/ModernBERT-base-nli) on the [tomaarsen/natural-questions-hard-negatives](https://huggingface.co/datasets/tomaarsen/natural-questions-hard-negatives), [tomaarsen/gooaq-hard-negatives](https://huggingface.co/datasets/tomaarsen/gooaq-hard-negatives), [bclavie/msmarco-500k-triplets](https://huggingface.co/datasets/bclavie/msmarco-500k-triplets), [sentence-transformers/gooaq](https://huggingface.co/datasets/sentence-transformers/gooaq), [sentence-transformers/natural-questions](https://huggingface.co/datasets/sentence-transformers/natural-questions), [merged-2l-nli](https://huggingface.co/datasets/tasksource/merged-2l-nli), [merged-3l-nli](https://huggingface.co/datasets/tasksource/merged-3l-nli), [zero-shot-label-nli](https://huggingface.co/datasets/tasksource/zero-shot-label-nli), [dataset_train_nli](https://huggingface.co/datasets/MoritzLaurer/dataset_train_nli), [paws/labeled_final](https://huggingface.co/datasets/paws), [glue/mrpc](https://huggingface.co/datasets/glue), [glue/qqp](https://huggingface.co/datasets/glue), [fever-evidence-related](https://huggingface.co/datasets/mwong/fever-evidence-related), [glue/stsb](https://huggingface.co/datasets/glue), sick/relatedness and [sts-companion](https://huggingface.co/datasets/tasksource/sts-companion) datasets. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more. 
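The semantic-similarity and semantic-search use cases above reduce to comparing embeddings by cosine similarity and ranking candidates by score. A minimal NumPy sketch of that step, using random placeholder vectors in place of real `model.encode` output (the actual model returns one 768-dimensional vector per input text):

```python
import numpy as np

def cosine_similarity_matrix(embeddings: np.ndarray) -> np.ndarray:
    """Pairwise cosine similarity between the row vectors of `embeddings`."""
    normalized = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    return normalized @ normalized.T

# Placeholder embeddings standing in for `model.encode(sentences)`:
# three texts, 768 dimensions each, matching this model's output shape.
rng = np.random.default_rng(42)
embeddings = rng.standard_normal((3, 768)).astype(np.float32)

similarities = cosine_similarity_matrix(embeddings)
print(similarities.shape)  # (3, 3)

# Semantic search: rank all texts against the first one as the query.
# Each vector has cosine similarity 1.0 with itself, so the query
# always ranks first in its own result list.
query_scores = similarities[0]
ranking = np.argsort(-query_scores)
print(ranking[0])  # 0
```

With real embeddings from `model.encode`, the same ranking step gives nearest-neighbour retrieval; recent sentence-transformers releases also expose `model.similarity(embeddings, embeddings)`, which computes a cosine-similarity matrix by default.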
## Model Details @@ -148,7 +126,8 @@ This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [t - [tomaarsen/natural-questions-hard-negatives](https://huggingface.co/datasets/tomaarsen/natural-questions-hard-negatives) - [tomaarsen/gooaq-hard-negatives](https://huggingface.co/datasets/tomaarsen/gooaq-hard-negatives) - [bclavie/msmarco-500k-triplets](https://huggingface.co/datasets/bclavie/msmarco-500k-triplets) - - [sentence-transformers/all-nli](https://huggingface.co/datasets/sentence-transformers/all-nli) + - [sentence-transformers/gooaq](https://huggingface.co/datasets/sentence-transformers/gooaq) + - [sentence-transformers/natural-questions](https://huggingface.co/datasets/sentence-transformers/natural-questions) - [merged-2l-nli](https://huggingface.co/datasets/tasksource/merged-2l-nli) - [merged-3l-nli](https://huggingface.co/datasets/tasksource/merged-3l-nli) - [zero-shot-label-nli](https://huggingface.co/datasets/tasksource/zero-shot-label-nli) @@ -193,12 +172,12 @@ Then you can load this model and run inference. from sentence_transformers import SentenceTransformer # Download from the 🤗 Hub -model = SentenceTransformer("tasksource/ModernBERT-base-nli-embed") +model = SentenceTransformer("tasksource/ModernBERT-base-embed") # Run inference sentences = [ - 'Francis I of France was a king.', - "Coyote was a brand of racing chassis designed and built for the use of A. J. Foyt 's race team in USAC Championship car racing including the Indianapolis 500 .. A. J. Foyt. A. J. Foyt. USAC. United States Auto Club. Championship car. American Championship car racing. Indianapolis 500. Indianapolis 500. It was used from 1966 to 1983 with Foyt himself making 141 starts in the car , winning 25 times .. George Snider had the second most starts with 24 .. George Snider. George Snider. Jim McElreath has the only other win with a Coyote chassis .. Jim McElreath. Jim McElreath. Foyt drove a Coyote to victory in the Indy 500 in 1967 and 1977 .. 
With Foyt 's permission , fellow Indy 500 champion Eddie Cheever 's Cheever Racing began using the Coyote name for his new Daytona Prototype chassis , derived from the Fabcar chassis design that he had purchased the rights to in 2007 .. Eddie Cheever. Eddie Cheever. Cheever Racing. Cheever Racing. Daytona Prototype. Daytona Prototype", - "The Apple QuickTake -LRB- codenamed Venus , Mars , Neptune -RRB- is one of the first consumer digital camera lines .. digital camera. digital camera. It was launched in 1994 by Apple Computer and was marketed for three years before being discontinued in 1997 .. Apple Computer. Apple Computer. Three models of the product were built including the 100 and 150 , both built by Kodak ; and the 200 , built by Fujifilm .. Kodak. Kodak. Fujifilm. Fujifilm. The QuickTake cameras had a resolution of 640 x 480 pixels maximum -LRB- 0.3 Mpx -RRB- .. resolution. Display resolution. The 200 model is only officially compatible with the Apple Macintosh for direct connections , while the 100 and 150 model are compatible with both the Apple Macintosh and Microsoft Windows .. Apple Macintosh. Apple Macintosh. Microsoft Windows. Microsoft Windows. Because the QuickTake 200 is almost identical to the Fuji DS-7 or to Samsung 's Kenox SSC-350N , Fuji 's software for that camera can be used to gain Windows compatibility for the QuickTake 200 .. Some other software replacements also exist as well as using an external reader for the removable media of the QuickTake 200 .. Time Magazine profiled QuickTake as `` the first consumer digital camera '' and ranked it among its `` 100 greatest and most influential gadgets from 1923 to the present '' list .. digital camera. digital camera. Time Magazine. Time Magazine. 
While the QuickTake was probably the first digicam to have wide success , technically this is not true as the greyscale Dycam Model 1 -LRB- also marketed as the Logitech FotoMan -RRB- was the first consumer digital camera to be sold in the US in November 1990 .. digital camera. digital camera. greyscale. greyscale. At least one other camera , the Fuji DS-X , was sold in Japan even earlier , in late 1989 .", + 'Also at increased risk are those whose immune systems suppressed by medications or by diseases such as cancer, diabetes and AIDS.', + 'Also at increased risk are those with suppressed immune systems due to illness or medicines.', + 'In 1995, the last survey, those numbers were equal.', ] embeddings = model.encode(sentences) print(embeddings.shape) @@ -277,7 +256,7 @@ You can finetune this model on your own dataset. #### tomaarsen/gooaq-hard-negatives * Dataset: [tomaarsen/gooaq-hard-negatives](https://huggingface.co/datasets/tomaarsen/gooaq-hard-negatives) at [87594a1](https://huggingface.co/datasets/tomaarsen/gooaq-hard-negatives/tree/87594a1e6c58e88b5843afa9da3a97ffd75d01c2) -* Size: 100,000 training samples +* Size: 200,000 training samples * Columns: question, answer, negative_1, negative_2, negative_3, negative_4, and negative_5 * Approximate statistics based on the first 1000 samples: | | question | answer | negative_1 | negative_2 | negative_3 | negative_4 | negative_5 | @@ -301,7 +280,7 @@ You can finetune this model on your own dataset. 
#### bclavie/msmarco-500k-triplets * Dataset: [bclavie/msmarco-500k-triplets](https://huggingface.co/datasets/bclavie/msmarco-500k-triplets) at [cb1a85c](https://huggingface.co/datasets/bclavie/msmarco-500k-triplets/tree/cb1a85c1261fa7c65f4ea43f94e50f8b467c372f) -* Size: 100,000 training samples +* Size: 200,000 training samples * Columns: query, positive, and negative * Approximate statistics based on the first 1000 samples: | | query | positive | negative | @@ -322,23 +301,47 @@ You can finetune this model on your own dataset. } ``` -#### sentence-transformers/all-nli +#### sentence-transformers/gooaq -* Dataset: [sentence-transformers/all-nli](https://huggingface.co/datasets/sentence-transformers/all-nli) at [d482672](https://huggingface.co/datasets/sentence-transformers/all-nli/tree/d482672c8e74ce18da116f430137434ba2e52fab) -* Size: 100,000 training samples -* Columns: anchor, positive, and negative +* Dataset: [sentence-transformers/gooaq](https://huggingface.co/datasets/sentence-transformers/gooaq) at [b089f72](https://huggingface.co/datasets/sentence-transformers/gooaq/tree/b089f728748a068b7bc5234e5bcf5b25e3c8279c) +* Size: 200,000 training samples +* Columns: question and answer * Approximate statistics based on the first 1000 samples: - | | anchor | positive | negative | - |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------| - | type | string | string | string | - | details | | | | + | | question | answer | + |:--------|:----------------------------------------------------------------------------------|:------------------------------------------------------------------------------------| + | type | string | string | + | details | | | * Samples: - | anchor | positive | negative | - 
|:---------------------------------------------------------------------------|:-------------------------------------------------|:-----------------------------------------------------------| - | A person on a horse jumps over a broken down airplane. | A person is outdoors, on a horse. | A person is at a diner, ordering an omelette. | - | Children smiling and waving at camera | There are children present | The kids are frowning | - | A boy is jumping on skateboard in the middle of a red bridge. | The boy does a skateboarding trick. | The boy skates down the sidewalk. | -* Loss: [MultipleNegativesRankingLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters: + | question | answer | + |:---------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| + | is toprol xl the same as metoprolol? | Metoprolol succinate is also known by the brand name Toprol XL. It is the extended-release form of metoprolol. Metoprolol succinate is approved to treat high blood pressure, chronic chest pain, and congestive heart failure. | + | are you experienced cd steve hoffman? | The Are You Experienced album was apparently mastered from the original stereo UK master tapes (according to Steve Hoffman - one of the very few who has heard both the master tapes and the CDs produced over the years). ... The CD booklets were a little sparse, but at least they stayed true to the album's original design. | + | how are babushka dolls made? | Matryoshka dolls are made of wood from lime, balsa, alder, aspen, and birch trees; lime is probably the most common wood type. ... 
After cutting, the trees are stripped of most of their bark, although a few inner rings of bark are left to bind the wood and keep it from splitting. | +* Loss: [CachedMultipleNegativesRankingLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedmultiplenegativesrankingloss) with these parameters: + ```json + { + "scale": 20.0, + "similarity_fct": "cos_sim" + } + ``` + +#### sentence-transformers/natural-questions + +* Dataset: [sentence-transformers/natural-questions](https://huggingface.co/datasets/sentence-transformers/natural-questions) at [f9e894e](https://huggingface.co/datasets/sentence-transformers/natural-questions/tree/f9e894e1081e206e577b4eaa9ee6de2b06ae6f17) +* Size: 100,231 training samples +* Columns: query and answer +* Approximate statistics based on the first 1000 samples: + | | query | answer | + |:--------|:-----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------| + | type | string | string | + | details | | | +* Samples: + | query | answer | + 
|:----------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| + | when did richmond last play in a preliminary final | Richmond Football Club Richmond began 2017 with 5 straight wins, a feat it had not achieved since 1995. A series of close losses hampered the Tigers throughout the middle of the season, including a 5-point loss to the Western Bulldogs, 2-point loss to Fremantle, and a 3-point loss to the Giants. Richmond ended the season strongly with convincing victories over Fremantle and St Kilda in the final two rounds, elevating the club to 3rd on the ladder. Richmond's first final of the season against the Cats at the MCG attracted a record qualifying final crowd of 95,028; the Tigers won by 51 points. 
Having advanced to the first preliminary finals for the first time since 2001, Richmond defeated Greater Western Sydney by 36 points in front of a crowd of 94,258 to progress to the Grand Final against Adelaide, their first Grand Final appearance since 1982. The attendance was 100,021, the largest crowd to a grand final since 1986. The Crows led at quarter time and led by as many as 13, but the Tig... | + | who sang what in the world's come over you | Jack Scott (singer) At the beginning of 1960, Scott again changed record labels, this time to Top Rank Records.[1] He then recorded four Billboard Hot 100 hits – "What in the World's Come Over You" (#5), "Burning Bridges" (#3) b/w "Oh Little One" (#34), and "It Only Happened Yesterday" (#38).[1] "What in the World's Come Over You" was Scott's second gold disc winner.[6] Scott continued to record and perform during the 1960s and 1970s.[1] His song "You're Just Gettin' Better" reached the country charts in 1974.[1] In May 1977, Scott recorded a Peel session for BBC Radio 1 disc jockey, John Peel. | + | who produces the most wool in the world | Wool Global wool production is about 2 million tonnes per year, of which 60% goes into apparel. Wool comprises ca 3% of the global textile market, but its value is higher owing to dying and other modifications of the material.[1] Australia is a leading producer of wool which is mostly from Merino sheep but has been eclipsed by China in terms of total weight.[30] New Zealand (2016) is the third-largest producer of wool, and the largest producer of crossbred wool. Breeds such as Lincoln, Romney, Drysdale, and Elliotdale produce coarser fibers, and wool from these sheep is usually used for making carpets. | +* Loss: [CachedMultipleNegativesRankingLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedmultiplenegativesrankingloss) with these parameters: ```json { "scale": 20.0, @@ -352,16 +355,16 @@ You can finetune this model on your own dataset. 
* Size: 425,243 training samples * Columns: sentence1, sentence2, and label * Approximate statistics based on the first 1000 samples: - | | sentence1 | sentence2 | label | - |:--------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|:------------------------------------------------| - | type | string | string | int | - | details | | | | + | | sentence1 | sentence2 | label | + |:--------|:------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:------------------------------------------------| + | type | string | string | int | + | details | | | | * Samples: - | sentence1 | sentence2 | label | - |:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:----------------------------------------|:---------------| - | Grayson heard that learn an instrument find some friends who will too write some songs and go for it | Grayson did not hear a pun | 1 | - | What is not explicitly stated as true is considered false.
Fiona is big. Fiona is furry. Fiona is green. Fiona is quiet. Fiona is rough. Fiona is smart. Fiona is young. If Fiona is furry and Fiona is not quiet then Fiona is rough. If someone is quiet and not young then they are big. If someone is not quiet and not smart then they are rough. If someone is smart and not young then they are not green. Furry people are big. Big, smart people are rough.
| Fiona is quiet. | 1 | - | You may want to see if you can get in touch with their EPM group and get this guy in a study . | The getting did not happen | 1 | + | sentence1 | sentence2 | label | + |:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------|:---------------| + | In 1783 , the Sunni Al-Khalifa family captured Bahrain from the Persians . | The is a geographical/political entity | 0 | + | ::stage Egg:: Newt eggs are encased in a gel-like substance rather than a hard shell. Adult females release eggs one at a time and store them in clusters ranging from a handful to several dozen in size. Adults often take an active role in defending their eggs after depositing them. Mothers may curl their body around the eggs to provide protection. Some newt species even wrap leaves around each egg individually to camouflage them, according to San Diego Zoo. Newt eggs are small: some measure only a millimeter or two in diameter. 
Mom usually anchors her eggs to underwater plants and other structures to keep them safe. ::stage Tadpole:: Newts that hatch from submerged eggs usually emerge as aquatic larvae with fishlike tails and gills that allow them to breathe beneath the water's surface. Not all newt species have an aquatic or 'tadpole' phase. This tadpole stage tends to be short, except in fully aquatic species. Eastern newt (Notophthalmus viridescens) larvae spend only a few months as... | Tadpole thing is a newt's terrestrial larval phase known as. | 0 | + | Target

You are now a valid target, you nasty little shit! 86.176.169.49
| This example wikipedia comment contains an insult. | 1 | * Loss: [SoftmaxLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#softmaxloss) #### merged-3l-nli @@ -370,16 +373,16 @@ You can finetune this model on your own dataset. * Size: 564,204 training samples * Columns: sentence1, sentence2, and label * Approximate statistics based on the first 1000 samples: - | | sentence1 | sentence2 | label | - |:--------|:-------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------| - | type | string | string | int | - | details | | | | + | | sentence1 | sentence2 | label | + |:--------|:-------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|:-------------------------------------------------------------------| + | type | string | string | int | + | details | | | | * Samples: - | sentence1 | sentence2 | label | - 
|:-----------------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------| - | [M]It[/M] previously stood on Capitol Square in Rome, but [M]has now been replaced with a copy.[/M] | Marcus Aurelius Antoninus Augustus (in Latin: Marcus Aurelius Antoninus Augustus; in the epigraphs: IMP CAES M AVREL ANTONINVS AVG; Rome, April 26, 121 - Sirmium or Vindobona, March 17, 180), better known simply as Marcus Aurelius, was a Roman emperor, philosopher and writer.
On the recommendation of the emperor Hadrian, it was adopted in 138 by the future father-in-law and acquired uncle Antonino Pio who appointed him heir to the imperial throne.
Born as Marco Annio Catilius Severus (Marcus Annius Catilius Severus), he became Marco Annio Vero (Marcus Annius Verus), which was the name of his father, at the time of his marriage with his cousin Faustina, daughter of Antoninus, and therefore assumed the name of Marcus Aurelius Caesar, son of the Augustus (Marcus Aurelius Caesar Augusti filius) during the empire of Antoninus himself.
Marcus Aurelius was emperor from 161 until his death, which occurred due to illness in 180 in Sirmium according to the contemporary Tertullian or near Vindobo...
| 1 | - | (This small difference would be further reduced if retail activities of rural carriers were not counted.) | The gap would become wider if we did not consider retail activities of rural carriers. | 2 | - | None of the toothbrushes are yellow in colour. | All of the toothbrushes are yellow in colour. | 2 | + | sentence1 | sentence2 | label | + |:-----------------------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------| + | Iceland does not have a high latitude. | Iceland . Iceland is warmed by the Gulf Stream and has a temperate climate , despite a high latitude just outside the Arctic Circle . Its high latitude and marine influence still keeps summers chilly , with most of the archipelago having a tundra climate . | 2 | + | The populist, by contrast, panders to his audience, figuring out what it likes and then delivering it in heaps. | Populists hate the audience and antagonizes them; so their support, as a tyrant's, is similarly lacking. | 2 | + | The prison sentence of that convict will end after 2 months. | Before 212 days, the prison sentence of that convict will end. | 1 | * Loss: [SoftmaxLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#softmaxloss) #### zero-shot-label-nli @@ -391,13 +394,13 @@ You can finetune this model on your own dataset. 
| | label | sentence1 | sentence2 | |:--------|:------------------------------------------------|:------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------| | type | int | string | string | - | details | | | | + | details | | | | * Samples: - | label | sentence1 | sentence2 | - |:---------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:----------------------------------------------| - | 2 | The crocodile is rough. The crocodile is tired. The crocodile is dull. The crocodile chases the mouse. The lion visits the cat. The lion is strong. The lion is big. The mouse is nice. The mouse is smart. The mouse is kind. The cat is cute. The cat is small. The cat is beautiful. Nice animals are cute. If something is tired then it sees the mouse. If something sees the mouse then it is lazy. If something is rough and tired then it is dull. If something is cute and small then it is furry. If something is strong and big then it is heavy. If something is dull then it is slow. All slow animals are sleepy. 
If something is cute then it is small. All small animals are beautiful. If something is heavy then it is obese. All obese animals are awful. If something is furry then it is adorable. All adorable animals are lovely. All lazy animals are fierce.
The crocodile is not sleepy.
| This example is True. | - | 0 | Still , Preetam vows to marry Nandini if she meets him again . How long had they known each other? only a few centuries | This example is no. | - | 2 | i wouldnt blame you youre more likely to be killed by blacks the numbers dont lie blacks are dangerous but dont tell the libs that | This example is not gender-bias. | + | label | sentence1 | sentence2 | + |:---------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-----------------------------------------| + | 0 | The LAY MAN! Just to let you know you are missed and thought off. Do have a great day. And if you can send me bimbo and ugo's numbers, ill appreciate. Safe
| This example is ham. | + | 2 | Crisp: oh really!!!! | This example is Automotive. | + | 2 | Insurance policies should be simple . | This example is negative. | * Loss: [SoftmaxLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#softmaxloss) #### dataset_train_nli @@ -406,16 +409,16 @@ You can finetune this model on your own dataset. * Size: 1,018,733 training samples * Columns: sentence1, sentence2, and label * Approximate statistics based on the first 1000 samples: - | | sentence1 | sentence2 | label | - |:--------|:-----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:------------------------------------------------| - | type | string | string | int | - | details | | | | + | | sentence1 | sentence2 | label | + |:--------|:------------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:------------------------------------------------| + | type | string | string | int | + | details | | | | * Samples: - | sentence1 | sentence2 | label | - 
|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:--------------------------------------------------------|:---------------| - | 2 corrections, most recently by Hughesdarren - Show corrections
DERBY NOTES.
(FROM OUR OWN CORRESPONDENT).
The rains have come only just in time to save thousands of head of dying cattle. A few weeks longer delay and Kimberley would have lost stock to such a serious extent that years must have elapsed before its regaining a flourishing condition. As it is many thousands of cattle and sheep
have been lost. On all stations there has been loss. Some, however, have not suffered severely, whilst others have re- ceived a blow that will take a year or two to recover. Obogamma station is one of the favoured few. The losses there do not exceed 200 bead. The Yeeda station loss
is estimated at 500 to 600. The Gogo and Fossil Downs stations lost very heavily,
but no accurate figures are to hand. Kim- berley Downs station, or Balmaningarra as it was called until recently, had no loss over the average. The same can be said of the Barker River station. The losses
will be felt chiefly at shearing time...
| This text is about: vegetation | 1 | - | What is Dreamwidth Studios?
Dreamwidth Studios is a home and a community for all kinds of creative folk. Dreamwidth
Creating a basic Dreamwidth account is free. Or, you can help support the site for everyone and get extra features for a small payment.
About
- About Dreamwidth
- Learn more about the Dreamwidth project.
- Guiding Principles
- Our values and commitments.
Community
- Site News
- Read the latest Dreamwidth news.
- Latest Things *
- Read the latest things posted on the site.
- Random Journal *
- Read a random journal on the site.
- Random Community *
- Read a random Dreamwidth community.
* Dreamwidth does not pre-screen content for appropriateness.
Support
- Frequently Asked Questions
- Read common questions about Dreamwidth
- Support
- Get help with using Dreamwidth.
| This text is about: blogging platform | 1 | - | Annan #39;s remarks on Iraqi war draw different reactions among Iraqis UN Secretary-general Kofi Annan #39;s declaration saying the Iraq war was quot;illegal quot; sparked different reactions from the Iraqis. | This example news text is about world news | 0 | + | sentence1 | sentence2 | label | + |:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------|:---------------| + | where is nayagara falls located | The example utterance is a query about music. | 1 | + | Druyun gets nine-month prison sentence A former top Air Force acquisition executive today was sentenced to nine months in prison for conspiring to help Boeing Co. win a multibillion-dollar Pentagon contract. | This example news text is about world news | 1 | + | Writing on the #39;wall #39; n Last edition of the Far Eastern Economic Review is shown on the streets of Hong Kong. The weekly news magazine is to fold in its current form with the loss of 80 jobs, the magazine #39;s publisher Dow Jones said yesterday. | This example news text is about science and technology | 1 | * Loss: [SoftmaxLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#softmaxloss) #### paws/labeled_final @@ -463,13 +466,13 @@ You can finetune this model on your own dataset. 
| | sentence1 | sentence2 | label | |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:------------------------------------------------| | type | string | string | int | - | details | | | | + | details | | | | * Samples: - | sentence1 | sentence2 | label | - |:----------------------------------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------|:---------------| - | How can one learn to play barre chords on a guitar? | What is the easiest way to play barre chords on guitar? | 1 | - | We are inviting my parents in uk. Is there going to be a problem with their application because we live in 1 bedroom flat? | I have been living in a hostel for 2 weeks and I still miss my parents. I am unable to adjust. What should I do? | 0 | - | Can I post videos in Quora? | Can I post video on Quora? | 1 | + | sentence1 | sentence2 | label | + |:-------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------|:---------------| + | How can I stop my laptop from hibernating in windows 10? | How do I shutdown windows 10 instead of hibernating it? | 0 | + | Is it worth the cost if ever I fix my gap teeth? | Is it worth it to fix teeth gap? | 1 | + | Why is USA the biggest threat to the global economy and Germany is not? | What is the biggest threat to the global economy over the next year (in 2011)? | 0 | * Loss: [SoftmaxLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#softmaxloss) #### fever-evidence-related @@ -481,13 +484,13 @@ You can finetune this model on your own dataset. 
| | sentence1 | sentence2 | label | |:--------|:----------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------|:------------------------------------------------| | type | string | string | int | - | details | | | | + | details | | | | * Samples: - | sentence1 | sentence2 | label | - |:---------------------------------------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------| - | Minos fathered Deucalion. | -RSB-. A paten , or diskos , is a small plate , usually made of silver or gold , used to hold Eucharistic bread which is to be consecrated .. bread. Host ( Holy Communion ). consecrated. consecrated. It is generally used during the service itself , while the reserved sacrament are stored in the tabernacle in a ciborium .. reserved sacrament. reserved sacrament. tabernacle. Church tabernacle. ciborium. Ciborium ( container ) | 1 | - | Hawaii is the tenth-most densely populated state. | Central Frontenac is a township in eastern Ontario , Canada in the County of Frontenac .. Frontenac. 
Frontenac County. township. township ( Canada ). Ontario. Ontario. Canada. Canada. County of Frontenac. Frontenac County, Ontario. Central Frontenac was created in 1998 through an amalgamation of the Townships of Hinchinbrooke , Kennebec , Olden and Oso .. Frontenac. Frontenac County | 1 | - | Brunei is a former country. | John Harley -LRB- 29 September 1728 -- 7 January 1788 -RRB- was a British bishop .. Harley was the second son of Edward Harley , 3rd Earl of Oxford and Earl Mortimer .. Edward. Edward Harley, 4th Earl of Oxford and Earl Mortimer. He was Archdeacon of Shropshire from 1760 to 1769 and then Archdeacon of Hereford from 1869 to 1787 .. Archdeacon of Shropshire. Archdeacon of Ludlow. Archdeacon of Hereford. Archdeacon of Hereford. He was Dean of Windsor , Registrar of the Order of the Garter and briefly , at the end of his life , the Bishop of Hereford .. Dean of Windsor. Dean of Windsor. Bishop of Hereford. Bishop of Hereford. His son Edward -LRB- by his wife Roach Vaughan , daughter of Gwynne Vaughan of Trebarry , Radnorshire -RRB- succeeded Harley 's elder brother -LRB- Edward -RRB- as 5th Earl of Oxford .. Edward. Edward Harley, 4th Earl of Oxford and Earl Mortimer | 1 | + | sentence1 | sentence2 | label | + |:--------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------| + | The Last of Us Part II had the developer Naughty Dog. 
| Bishop Asbury Cottage is a 17th-century cottage on Newton Road , Great Barr , England , known for being the boyhood home of Francis Asbury -LRB- 1745 -- 1816 -RRB- , one of the first two bishops of the Methodist Episcopal Church -LRB- now The United Methodist Church -RRB- in the United States .. Cottage. Cottage. Great Barr. Great Barr. England. England. Francis Asbury. Francis Asbury. bishops. Bishop ( Methodism ). Methodist Episcopal Church. Methodist Episcopal Church. The United Methodist Church. The United Methodist Church. It is now a museum in his memory . | 1 | + | Boomerang (1992 film) was released on July. | Petr Alekseyevich Bezobrazov -LRB- 29 January 1845 -- 17 July 1906 -RRB- was an admiral in the Imperial Russian Navy .. Imperial Russian Navy. Imperial Russian Navy | 1 | + | G-Dragon was the first Korean solo artist to a type of tour. | The Scott Viking 2 was the first British high performance two seat sailplane , flying a few days before the outbreak of World War II .. World War II. World War II. Only one was built ; it was used in radar station trials in the Summer of 1940 . | 1 | * Loss: [SoftmaxLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#softmaxloss) #### glue/stsb @@ -496,16 +499,16 @@ You can finetune this model on your own dataset. 
* Size: 5,749 training samples * Columns: sentence1, sentence2, and label * Approximate statistics based on the first 1000 samples: - | | sentence1 | sentence2 | label | - |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:---------------------------------------------------------------| - | type | string | string | float | - | details | | | | + | | sentence1 | sentence2 | label | + |:--------|:---------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:---------------------------------------------------------------| + | type | string | string | float | + | details | | | | * Samples: - | sentence1 | sentence2 | label | - |:---------------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------|:-------------------------------| - | russia may pull out of the treaty if the us and other nato allies refuse to ratify the amended version. | warned that russia would withdraw from the treaty if western nations refuse to ratify its amended version. 
| 4.400000095367432 | - | Israeli forces detain 5 Palestinians in West Bank | Israeli Forces Arrest Five Palestinians across West Bank | 4.800000190734863 | - | Chinese premier meets Indian vice president | Hezbollah urges to elect new Lebanese president | 0.0 | + | sentence1 | sentence2 | label | + |:--------------------------------------------------------------------------|:------------------------------------------------------------------------|:-------------------------------| + | Syria peace plan conditions “unacceptable,” opposition says | Syria peace dashed as deadline passes | 2.0 | + | Romney picks Ryan as vice presidential running mate: source | Romney to tap Ryan as vice presidential running mate | 5.0 | + | Death toll rises to 6 as Storm Xaver batters northern Europe | Storm death toll rises as wind, rain batters north. Europe | 3.200000047683716 | * Loss: [CosineSimilarityLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cosinesimilarityloss) with these parameters: ```json { @@ -519,16 +522,16 @@ You can finetune this model on your own dataset. 
* Size: 4,439 training samples * Columns: sentence1, sentence2, and label * Approximate statistics based on the first 1000 samples: - | | sentence1 | sentence2 | label | - |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:---------------------------------------------------------------| - | type | string | string | float | - | details | | | | + | | sentence1 | sentence2 | label | + |:--------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|:--------------------------------------------------------------| + | type | string | string | float | + | details | | | | * Samples: - | sentence1 | sentence2 | label | - |:---------------------------------------------------------------------|:-------------------------------------------------------------------------|:--------------------------------| - | A person is singing and playing a guitar | A boy is playing a piano | 3.0999999046325684 | - | A man and a woman are driving down the street in a jeep | A man and a woman are not driving down the street in a jeep | 4.400000095367432 | - | There is no man eating a bowl of cereal | A man is eating a bowl of cereal | 4.0 | + | sentence1 | sentence2 | label | + |:-------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------|:--------------------------------| + | The man is standing on a rocky mountain and gray clouds are in the background | A black topless person is packing a pile of rocks and a front of clouds are in the background | 2.9000000953674316 | + | A man is standing on a dirt hill next to a black jeep | A man in a hat is standing outside of a green vehicle | 2.5999999046325684 | + | A man is 
talking on a cell phone | A man is making a phone call | 4.300000190734863 | * Loss: [CosineSimilarityLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cosinesimilarityloss) with these parameters: ```json { @@ -545,13 +548,13 @@ You can finetune this model on your own dataset. | | label | sentence1 | sentence2 | |:--------|:---------------------------------------------------------------|:----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------| | type | float | string | string | - | details | | | | + | details | | | | * Samples: - | label | sentence1 | sentence2 | - |:------------------|:-----------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------| - | 0.8 | the act of combining, blending, integrating | the act of chipping something. | - | 2.75 | a device to control the rate of some activity, e.g., chemical or mechanical | any of various controls or devices for regulating or controlling fluid flow, pressure, temperature, etc.. | - | 2.5 | a physical entity | a separate and self-contained entity. | + | label | sentence1 | sentence2 | + |:------------------|:-----------------------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------| + | 4.25 | It then appointed a task force to formulate the necessary changes in tax and spending policies. | He has appointed a working party to make the necessary changes to the policies of public spending and fiscal policies. | + | 4.25 | festive social event, celebration | an occasion on which people can assemble for social interaction and entertainment. 
| + | 3.6 | Who'd have thought an American hero could be a Canadian? NYT: Man Who Sheltered Americans in Tehran, Dies at 88 | John Sheardown, Canadian Who Sheltered Americans in Tehran, Dies at 88 | * Loss: [CosineSimilarityLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cosinesimilarityloss) with these parameters: ```json { @@ -906,88 +909,25 @@ You can finetune this model on your own dataset. ### Training Logs -| Epoch | Step | Training Loss | -|:------:|:-----:|:-------------:| -| 0.0125 | 500 | 3.9436 | -| 0.0250 | 1000 | 1.6589 | -| 0.0375 | 1500 | 1.1438 | -| 0.0500 | 2000 | 0.9633 | -| 0.0624 | 2500 | 0.8801 | -| 0.0749 | 3000 | 0.8087 | -| 0.0874 | 3500 | 0.7826 | -| 0.0999 | 4000 | 0.7566 | -| 0.1124 | 4500 | 0.7424 | -| 0.1249 | 5000 | 0.7154 | -| 0.1374 | 5500 | 0.6596 | -| 0.1499 | 6000 | 0.6539 | -| 0.1623 | 6500 | 0.6288 | -| 0.1748 | 7000 | 0.6337 | -| 0.1873 | 7500 | 0.6232 | -| 0.1998 | 8000 | 0.6218 | -| 0.2123 | 8500 | 0.5803 | -| 0.2248 | 9000 | 0.5778 | -| 0.2373 | 9500 | 0.5922 | -| 0.2498 | 10000 | 0.5726 | -| 0.2622 | 10500 | 0.5531 | -| 0.2747 | 11000 | 0.564 | -| 0.2872 | 11500 | 0.5693 | -| 0.2997 | 12000 | 0.5462 | -| 0.3122 | 12500 | 0.5577 | -| 0.3247 | 13000 | 0.526 | -| 0.3372 | 13500 | 0.5344 | -| 0.3497 | 14000 | 0.5366 | -| 0.3621 | 14500 | 0.5185 | -| 0.3746 | 15000 | 0.5243 | -| 0.3871 | 15500 | 0.5112 | -| 0.3996 | 16000 | 0.5124 | -| 0.4121 | 16500 | 0.4874 | -| 0.4246 | 17000 | 0.5399 | -| 0.4371 | 17500 | 0.515 | -| 0.4496 | 18000 | 0.5261 | -| 0.4620 | 18500 | 0.4917 | -| 0.4745 | 19000 | 0.4716 | -| 0.4870 | 19500 | 0.4887 | -| 0.4995 | 20000 | 0.4594 | -| 0.5120 | 20500 | 0.4687 | -| 0.5245 | 21000 | 0.4576 | -| 0.5370 | 21500 | 0.4735 | -| 0.5495 | 22000 | 0.464 | -| 0.5620 | 22500 | 0.4678 | -| 0.5744 | 23000 | 0.481 | -| 0.5869 | 23500 | 0.4918 | -| 0.5994 | 24000 | 0.4576 | -| 0.6119 | 24500 | 0.4467 | -| 0.6244 | 25000 | 0.4556 | -| 0.6369 | 25500 | 0.4489 | -| 0.6494 | 26000 | 0.4406 
| -| 0.6619 | 26500 | 0.4587 | -| 0.6743 | 27000 | 0.4751 | -| 0.6868 | 27500 | 0.4446 | -| 0.6993 | 28000 | 0.433 | -| 0.7118 | 28500 | 0.4469 | -| 0.7243 | 29000 | 0.4479 | -| 0.7368 | 29500 | 0.4408 | -| 0.7493 | 30000 | 0.4259 | -| 0.7618 | 30500 | 0.464 | -| 0.7742 | 31000 | 0.4592 | -| 0.7867 | 31500 | 0.4442 | -| 0.7992 | 32000 | 0.4305 | -| 0.8117 | 32500 | 0.4439 | -| 0.8242 | 33000 | 0.4335 | -| 0.8367 | 33500 | 0.4255 | -| 0.8492 | 34000 | 0.4119 | -| 0.8617 | 34500 | 0.4298 | -| 0.8741 | 35000 | 0.4443 | -| 0.8866 | 35500 | 0.4369 | -| 0.8991 | 36000 | 0.41 | -| 0.9116 | 36500 | 0.43 | -| 0.9241 | 37000 | 0.398 | -| 0.9366 | 37500 | 0.4471 | -| 0.9491 | 38000 | 0.4409 | -| 0.9616 | 38500 | 0.4392 | -| 0.9741 | 39000 | 0.4197 | -| 0.9865 | 39500 | 0.4154 | -| 0.9990 | 40000 | 0.419 | +| Epoch | Step | Training Loss | +|:------:|:----:|:-------------:| +| 0.0067 | 500 | 10.6192 | +| 0.0134 | 1000 | 1.9196 | +| 0.0202 | 1500 | 1.0304 | +| 0.0269 | 2000 | 0.9269 | +| 0.0336 | 2500 | 0.7738 | +| 0.0403 | 3000 | 0.7092 | +| 0.0471 | 3500 | 0.6571 | +| 0.0538 | 4000 | 0.6408 | +| 0.0605 | 4500 | 0.6348 | +| 0.0672 | 5000 | 0.5927 | +| 0.0739 | 5500 | 0.5848 | +| 0.0807 | 6000 | 0.5542 | +| 0.0874 | 6500 | 0.558 | +| 0.0941 | 7000 | 0.5394 | +| 0.1008 | 7500 | 0.5632 | +| 0.1076 | 8000 | 0.5037 | +| 0.1143 | 8500 | 0.5278 | ### Framework Versions @@ -1028,6 +968,18 @@ You can finetune this model on your own dataset. } ``` +#### CachedMultipleNegativesRankingLoss +```bibtex +@misc{gao2021scaling, + title={Scaling Deep Contrastive Learning Batch Size under Memory Limited Setup}, + author={Luyu Gao and Yunyi Zhang and Jiawei Han and Jamie Callan}, + year={2021}, + eprint={2101.06983}, + archivePrefix={arXiv}, + primaryClass={cs.LG} +} +``` +
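The `CachedMultipleNegativesRankingLoss` cited above optimizes the same in-batch-negatives objective as `MultipleNegativesRankingLoss`, just computed in gradient-cached mini-batches so large effective batch sizes fit in memory. As intuition for that objective, here is a minimal NumPy sketch (illustrative only — the actual sentence-transformers implementation operates on live encoder outputs, and the helper name `mnr_loss` is ours):

```python
import numpy as np

def mnr_loss(anchor_emb, positive_emb, scale=20.0):
    """In-batch-negatives ranking loss: each anchor's own positive is the
    target class; every other positive in the batch acts as a negative."""
    a = anchor_emb / np.linalg.norm(anchor_emb, axis=1, keepdims=True)
    p = positive_emb / np.linalg.norm(positive_emb, axis=1, keepdims=True)
    scores = scale * (a @ p.T)                   # (batch, batch) scaled cosine scores
    scores -= scores.max(axis=1, keepdims=True)  # numerical stability
    log_probs = scores - np.log(np.exp(scores).sum(axis=1, keepdims=True))
    return float(-np.mean(np.diag(log_probs)))   # cross-entropy on the diagonal

# Matched pairs (identical embeddings) give near-zero loss;
# mismatched pairs are heavily penalized.
e = np.eye(4)
low = mnr_loss(e, e)
high = mnr_loss(e, np.roll(e, 1, axis=0))
```

The cached variant changes only *how* the similarity matrix and its gradients are materialized (chunk by chunk), not the loss value itself.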