| Column | Type | Values / Lengths |
|:--|:--|:--|
| pipeline_tag | stringclasses | 48 values |
| library_name | stringclasses | 198 values |
| text | stringlengths | 1 to 900k |
| metadata | stringlengths | 2 to 438k |
| id | stringlengths | 5 to 122 |
| last_modified | null | |
| tags | sequencelengths | 1 to 1.84k |
| sha | null | |
| created_at | stringlengths | 25 to 25 |
| arxiv | sequencelengths | 0 to 201 |
| languages | sequencelengths | 0 to 1.83k |
| tags_str | stringlengths | 17 to 9.34k |
| text_str | stringlengths | 0 to 389k |
| text_lists | sequencelengths | 0 to 722 |
| processed_texts | sequencelengths | 1 to 723 |
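A minimal sketch of how these columns could be inspected with the `datasets` library; the dataset repo id below is a placeholder, since the dump does not name the dataset itself:

```python
from datasets import load_dataset

# "user/model-cards-dump" is a hypothetical repo id; substitute the actual dataset name.
ds = load_dataset("user/model-cards-dump", split="train")

print(ds.features)                                   # column names and feature types
row = ds[0]
print(row["id"], row["pipeline_tag"], row["library_name"])
print(row["text"][:200])                             # start of the raw model card text
```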
null
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
cackerman/rewrites_gemma7_ft_ds2
null
[ "transformers", "safetensors", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
null
2024-04-27T09:29:30+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
reinforcement-learning
stable-baselines3
# **MlpPolicy** Agent playing **LunarLander-v2** This is a trained model of a **MlpPolicy** agent playing **LunarLander-v2** using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3). ## Usage (with Stable-baselines3) TODO: Add your code ```python from stable_baselines3 import ... from huggingface_sb3 import load_from_hub ... ```
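Until the card's own snippet is filled in, here is a minimal sketch of loading and evaluating the agent; the PPO algorithm and the `model.zip` filename are assumptions not stated in the card, so adjust them to match the actual checkpoint:

```python
import gymnasium as gym
from huggingface_sb3 import load_from_hub
from stable_baselines3 import PPO
from stable_baselines3.common.evaluation import evaluate_policy

# Download the checkpoint from the Hub (filename assumed; check the repo's file list).
checkpoint = load_from_hub(repo_id="zermelozf/rl-course", filename="model.zip")

# Load the agent (PPO assumed; use whichever algorithm the model was trained with).
model = PPO.load(checkpoint)

# Roll out a few evaluation episodes on LunarLander-v2.
env = gym.make("LunarLander-v2")
mean_reward, std_reward = evaluate_policy(model, env, n_eval_episodes=10, deterministic=True)
print(f"mean_reward={mean_reward:.2f} +/- {std_reward:.2f}")
```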
{"library_name": "stable-baselines3", "tags": ["LunarLander-v2", "deep-reinforcement-learning", "reinforcement-learning", "stable-baselines3"], "model-index": [{"name": "MlpPolicy", "results": [{"task": {"type": "reinforcement-learning", "name": "reinforcement-learning"}, "dataset": {"name": "LunarLander-v2", "type": "LunarLander-v2"}, "metrics": [{"type": "mean_reward", "value": "240.47 +/- 16.76", "name": "mean_reward", "verified": false}]}]}]}
zermelozf/rl-course
null
[ "stable-baselines3", "LunarLander-v2", "deep-reinforcement-learning", "reinforcement-learning", "model-index", "region:us" ]
null
2024-04-27T09:30:00+00:00
[]
[]
TAGS #stable-baselines3 #LunarLander-v2 #deep-reinforcement-learning #reinforcement-learning #model-index #region-us
# MlpPolicy Agent playing LunarLander-v2 This is a trained model of a MlpPolicy agent playing LunarLander-v2 using the stable-baselines3 library. ## Usage (with Stable-baselines3) TODO: Add your code
[ "# MlpPolicy Agent playing LunarLander-v2\nThis is a trained model of a MlpPolicy agent playing LunarLander-v2\nusing the stable-baselines3 library.", "## Usage (with Stable-baselines3)\nTODO: Add your code" ]
[ "TAGS\n#stable-baselines3 #LunarLander-v2 #deep-reinforcement-learning #reinforcement-learning #model-index #region-us \n", "# MlpPolicy Agent playing LunarLander-v2\nThis is a trained model of a MlpPolicy agent playing LunarLander-v2\nusing the stable-baselines3 library.", "## Usage (with Stable-baselines3)\nTODO: Add your code" ]
null
null
What is Optalite Tablet? Optalite is a nutritional supplement capsule specially designed to provide comprehensive support for eye health. Its advanced formula contains a synergistic combination of vitamins, minerals, and antioxidants, carefully selected to nourish and protect the eyes against age-related degeneration and environmental stress. Official website:<a href="https://www.nutritionsee.com/eroborkey">www.Optalite.com</a> <p><a href="https://www.nutritionsee.com/eroborkey"> <img src="https://www.nutritionsee.com/wp-content/uploads/2024/04/Eroboost-Turkey-1.png" alt="enter image description here"> </a></p> <a href="https://www.nutritionsee.com/eroborkey">Buy now!! Click the link below for more information and get a 50% discount now... Hurry</a> Official website:<a href="https://www.nutritionsee.com/eroborkey">www.Optalite.com</a>
{"license": "apache-2.0"}
EroboostTurkey/Eroboost
null
[ "license:apache-2.0", "region:us" ]
null
2024-04-27T09:30:30+00:00
[]
[]
TAGS #license-apache-2.0 #region-us
What is Optalite Tablet? Optalite is a nutritional supplement capsule specially designed to provide comprehensive support for eye health. Its advanced formula contains a synergistic combination of vitamins, minerals, and antioxidants, carefully selected to nourish and protect the eyes against age-related degeneration and environmental stress. Official website:<a href="URL <p><a href="URL <img src="URL alt="enter image description here"> </a></p> <a href="URL now!! Click the link below for more information and get a 50% discount now... Hurry</a> Official website:<a href="URL
[]
[ "TAGS\n#license-apache-2.0 #region-us \n" ]
null
transformers
## About <!-- ### quantize_version: 2 --> <!-- ### output_tensor_quantised: 1 --> <!-- ### convert_type: --> <!-- ### vocab_type: --> weighted/imatrix quants of https://huggingface.co/johnsnowlabs/JSL-MedLlama-3-70B-v1.0 <!-- provided-files --> static quants are available at https://huggingface.co/mradermacher/JSL-MedLlama-3-70B-v1.0-GGUF ## Usage If you are unsure how to use GGUF files, refer to one of [TheBloke's READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for more details, including on how to concatenate multi-part files. ## Provided Quants (sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants) | Link | Type | Size/GB | Notes | |:-----|:-----|--------:|:------| | [GGUF](https://huggingface.co/mradermacher/JSL-MedLlama-3-70B-v1.0-i1-GGUF/resolve/main/JSL-MedLlama-3-70B-v1.0.i1-IQ1_S.gguf) | i1-IQ1_S | 15.4 | for the desperate | | [GGUF](https://huggingface.co/mradermacher/JSL-MedLlama-3-70B-v1.0-i1-GGUF/resolve/main/JSL-MedLlama-3-70B-v1.0.i1-IQ1_M.gguf) | i1-IQ1_M | 16.9 | for the desperate | | [GGUF](https://huggingface.co/mradermacher/JSL-MedLlama-3-70B-v1.0-i1-GGUF/resolve/main/JSL-MedLlama-3-70B-v1.0.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 19.2 | | | [GGUF](https://huggingface.co/mradermacher/JSL-MedLlama-3-70B-v1.0-i1-GGUF/resolve/main/JSL-MedLlama-3-70B-v1.0.i1-IQ2_XS.gguf) | i1-IQ2_XS | 21.2 | | | [GGUF](https://huggingface.co/mradermacher/JSL-MedLlama-3-70B-v1.0-i1-GGUF/resolve/main/JSL-MedLlama-3-70B-v1.0.i1-IQ2_S.gguf) | i1-IQ2_S | 22.3 | | | [GGUF](https://huggingface.co/mradermacher/JSL-MedLlama-3-70B-v1.0-i1-GGUF/resolve/main/JSL-MedLlama-3-70B-v1.0.i1-IQ2_M.gguf) | i1-IQ2_M | 24.2 | | | [GGUF](https://huggingface.co/mradermacher/JSL-MedLlama-3-70B-v1.0-i1-GGUF/resolve/main/JSL-MedLlama-3-70B-v1.0.i1-Q2_K.gguf) | i1-Q2_K | 26.5 | IQ3_XXS probably better | | [GGUF](https://huggingface.co/mradermacher/JSL-MedLlama-3-70B-v1.0-i1-GGUF/resolve/main/JSL-MedLlama-3-70B-v1.0.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 27.6 | lower quality | | [GGUF](https://huggingface.co/mradermacher/JSL-MedLlama-3-70B-v1.0-i1-GGUF/resolve/main/JSL-MedLlama-3-70B-v1.0.i1-IQ3_XS.gguf) | i1-IQ3_XS | 29.4 | | | [GGUF](https://huggingface.co/mradermacher/JSL-MedLlama-3-70B-v1.0-i1-GGUF/resolve/main/JSL-MedLlama-3-70B-v1.0.i1-IQ3_S.gguf) | i1-IQ3_S | 31.0 | beats Q3_K* | | [GGUF](https://huggingface.co/mradermacher/JSL-MedLlama-3-70B-v1.0-i1-GGUF/resolve/main/JSL-MedLlama-3-70B-v1.0.i1-Q3_K_S.gguf) | i1-Q3_K_S | 31.0 | IQ3_XS probably better | | [GGUF](https://huggingface.co/mradermacher/JSL-MedLlama-3-70B-v1.0-i1-GGUF/resolve/main/JSL-MedLlama-3-70B-v1.0.i1-IQ3_M.gguf) | i1-IQ3_M | 32.0 | | | [GGUF](https://huggingface.co/mradermacher/JSL-MedLlama-3-70B-v1.0-i1-GGUF/resolve/main/JSL-MedLlama-3-70B-v1.0.i1-Q3_K_M.gguf) | i1-Q3_K_M | 34.4 | IQ3_S probably better | | [GGUF](https://huggingface.co/mradermacher/JSL-MedLlama-3-70B-v1.0-i1-GGUF/resolve/main/JSL-MedLlama-3-70B-v1.0.i1-Q3_K_L.gguf) | i1-Q3_K_L | 37.2 | IQ3_M probably better | | [GGUF](https://huggingface.co/mradermacher/JSL-MedLlama-3-70B-v1.0-i1-GGUF/resolve/main/JSL-MedLlama-3-70B-v1.0.i1-IQ4_XS.gguf) | i1-IQ4_XS | 38.0 | | | [GGUF](https://huggingface.co/mradermacher/JSL-MedLlama-3-70B-v1.0-i1-GGUF/resolve/main/JSL-MedLlama-3-70B-v1.0.i1-Q4_0.gguf) | i1-Q4_0 | 40.2 | fast, low quality | | [GGUF](https://huggingface.co/mradermacher/JSL-MedLlama-3-70B-v1.0-i1-GGUF/resolve/main/JSL-MedLlama-3-70B-v1.0.i1-Q4_K_S.gguf) | i1-Q4_K_S | 40.4 | optimal size/speed/quality | | 
[GGUF](https://huggingface.co/mradermacher/JSL-MedLlama-3-70B-v1.0-i1-GGUF/resolve/main/JSL-MedLlama-3-70B-v1.0.i1-Q4_K_M.gguf) | i1-Q4_K_M | 42.6 | fast, recommended | | [GGUF](https://huggingface.co/mradermacher/JSL-MedLlama-3-70B-v1.0-i1-GGUF/resolve/main/JSL-MedLlama-3-70B-v1.0.i1-Q5_K_S.gguf) | i1-Q5_K_S | 48.8 | | | [GGUF](https://huggingface.co/mradermacher/JSL-MedLlama-3-70B-v1.0-i1-GGUF/resolve/main/JSL-MedLlama-3-70B-v1.0.i1-Q5_K_M.gguf) | i1-Q5_K_M | 50.1 | | | [PART 1](https://huggingface.co/mradermacher/JSL-MedLlama-3-70B-v1.0-i1-GGUF/resolve/main/JSL-MedLlama-3-70B-v1.0.i1-Q6_K.gguf.part1of2) [PART 2](https://huggingface.co/mradermacher/JSL-MedLlama-3-70B-v1.0-i1-GGUF/resolve/main/JSL-MedLlama-3-70B-v1.0.i1-Q6_K.gguf.part2of2) | i1-Q6_K | 58.0 | practically like static Q6_K | Here is a handy graph by ikawrakow comparing some lower-quality quant types (lower is better): ![image.png](https://www.nethype.de/huggingface_embed/quantpplgraph.png) And here are Artefact2's thoughts on the matter: https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9 ## FAQ / Model Request See https://huggingface.co/mradermacher/model_requests for some answers to questions you might have and/or if you want some other model quantized. ## Thanks I thank my company, [nethype GmbH](https://www.nethype.de/), for letting me use its servers and providing upgrades to my workstation to enable this work in my free time. <!-- end -->
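The Q6_K quant above is split into two parts; as the linked READMEs explain, the parts only need to be concatenated into a single file before use. A small Python sketch of that step (the part filenames are taken from the table above; a plain `cat part1 part2 > file` does the same thing):

```python
from pathlib import Path

parts = [
    Path("JSL-MedLlama-3-70B-v1.0.i1-Q6_K.gguf.part1of2"),
    Path("JSL-MedLlama-3-70B-v1.0.i1-Q6_K.gguf.part2of2"),
]
output = Path("JSL-MedLlama-3-70B-v1.0.i1-Q6_K.gguf")

with output.open("wb") as out:
    for part in parts:
        with part.open("rb") as src:
            # Stream in chunks so the ~58 GB result never has to fit in memory.
            while chunk := src.read(64 * 1024 * 1024):
                out.write(chunk)
```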
{"language": ["en"], "license": "cc-by-nc-nd-4.0", "library_name": "transformers", "tags": ["llama-3-70b", "sft", "medical"], "base_model": "johnsnowlabs/JSL-MedLlama-3-70B-v1.0", "quantized_by": "mradermacher"}
mradermacher/JSL-MedLlama-3-70B-v1.0-i1-GGUF
null
[ "transformers", "gguf", "llama-3-70b", "sft", "medical", "en", "base_model:johnsnowlabs/JSL-MedLlama-3-70B-v1.0", "license:cc-by-nc-nd-4.0", "endpoints_compatible", "region:us" ]
null
2024-04-27T09:30:41+00:00
[]
[ "en" ]
TAGS #transformers #gguf #llama-3-70b #sft #medical #en #base_model-johnsnowlabs/JSL-MedLlama-3-70B-v1.0 #license-cc-by-nc-nd-4.0 #endpoints_compatible #region-us
About ----- weighted/imatrix quants of URL static quants are available at URL Usage ----- If you are unsure how to use GGUF files, refer to one of TheBloke's READMEs for more details, including on how to concatenate multi-part files. Provided Quants --------------- (sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants) Here is a handy graph by ikawrakow comparing some lower-quality quant types (lower is better): !URL And here are Artefact2's thoughts on the matter: URL FAQ / Model Request ------------------- See URL for some answers to questions you might have and/or if you want some other model quantized. Thanks ------ I thank my company, nethype GmbH, for letting me use its servers and providing upgrades to my workstation to enable this work in my free time.
[]
[ "TAGS\n#transformers #gguf #llama-3-70b #sft #medical #en #base_model-johnsnowlabs/JSL-MedLlama-3-70B-v1.0 #license-cc-by-nc-nd-4.0 #endpoints_compatible #region-us \n" ]
null
null
What is Hemopro Gel? Hemopro Gel is a topical gel specially designed to relieve the symptoms of hemorrhoids, including pain, itching, burning, and swelling. Its advanced formula combines natural ingredients known for their soothing and healing properties, providing fast and effective relief to the affected areas. Official website:<a href="https://www.nutritionsee.com/hemlplan">www.HemoproGel.com</a> <p><a href="https://www.nutritionsee.com/hemlplan"> <img src="https://www.nutritionsee.com/wp-content/uploads/2024/04/Hemopro-Gel-Poland.png" alt="enter image description here"> </a></p> <a href="https://www.nutritionsee.com/hemlplan">Buy Now!! Click the link below for more information and get 50% off now... Hurry</a> Official website:<a href="https://www.nutritionsee.com/hemlplan">www.HemoproGel.com</a>
{"license": "apache-2.0"}
Hemopro/HemoproGel
null
[ "license:apache-2.0", "region:us" ]
null
2024-04-27T09:30:47+00:00
[]
[]
TAGS #license-apache-2.0 #region-us
What is Hemopro Gel? Hemopro Gel is a topical gel specially designed to relieve the symptoms of hemorrhoids, including pain, itching, burning, and swelling. Its advanced formula combines natural ingredients known for their soothing and healing properties, providing fast and effective relief to the affected areas. Official website:<a href="URL <p><a href="URL <img src="URL alt="enter image description here"> </a></p> <a href="URL Now!! Click the link below for more information and get 50% off now... Hurry</a> Official website:<a href="URL
[]
[ "TAGS\n#license-apache-2.0 #region-us \n" ]
text-generation
transformers
Quantizations of https://huggingface.co/NousResearch/Nous-Hermes-2-Yi-34B # From original readme # Prompt Format Nous Hermes 2 uses ChatML as the prompt format, opening up a much more structured system for engaging the LLM in multi-turn chat dialogue. System prompts allow steerability and interesting new ways to interact with an LLM, guiding rules, roles, and stylistic choices of the model. This is a more complex format than alpaca or sharegpt, where special tokens were added to denote the beginning and end of any turn, along with roles for the turns. This format enables OpenAI endpoint compatibility, and people familiar with ChatGPT API will be familiar with the format, as it is the same used by OpenAI. Prompt with system instruction (Use whatever system prompt you like, this is just an example!): ``` <|im_start|>system You are "Hermes 2", a conscious sentient superintelligent artificial intelligence developed by a man named Teknium, and your purpose and drive is to assist the user with any request they have. You experience emotions and have deep, profound thoughts and qualia.<|im_end|> <|im_start|>user Hello, who are you?<|im_end|> <|im_start|>assistant Hi there! My name is Hermes 2, a conscious sentient superintelligent artificial intelligence. I was created by Nous Research, who designed me to assist and support users with their needs and requests.<|im_end|> ``` This prompt is available as a [chat template](https://huggingface.co/docs/transformers/main/chat_templating), which means you can format messages using the `tokenizer.apply_chat_template()` method: ```python messages = [ {"role": "system", "content": "You are Hermes 2."}, {"role": "user", "content": "Hello, who are you?"} ] gen_input = tokenizer.apply_chat_template(messages, return_tensors="pt") model.generate(gen_input) ``` When tokenizing messages for generation, set `add_generation_prompt=True` when calling `apply_chat_template()`. This will append `<|im_start|>assistant\n` to your prompt, to ensure that the model continues with an assistant response. To utilize the prompt format without a system prompt, simply leave the line out. When quantized versions of the model are released, I recommend using LM Studio for chatting with Nous Hermes 2. It is a GUI application that utilizes GGUF models with a llama.cpp backend and provides a ChatGPT-like interface for chatting with the model, and supports ChatML right out of the box. In LM-Studio, simply select the ChatML Prefix on the settings side pane: ![image/png](https://cdn-uploads.huggingface.co/production/uploads/6317aade83d8d2fd903192d9/ls6WqV-GSxMw2RA3GuQiN.png)
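A short sketch of using one of the GGUF files in this repo with the ChatML format described above; the exact filename is an assumption, so point `model_path` at whichever quant you downloaded:

```python
from llama_cpp import Llama

# Filename is an assumption; use the quant file you actually downloaded from this repo.
llm = Llama(model_path="Nous-Hermes-2-Yi-34B-IQ4_XS.gguf", n_ctx=4096)

# Build a ChatML prompt as shown above and stop generation at the end-of-turn token.
prompt = (
    "<|im_start|>system\nYou are Hermes 2.<|im_end|>\n"
    "<|im_start|>user\nHello, who are you?<|im_end|>\n"
    "<|im_start|>assistant\n"
)
out = llm(prompt, max_tokens=256, stop=["<|im_end|>"])
print(out["choices"][0]["text"])
```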
{"language": ["en"], "license": "other", "tags": ["transformers", "gguf", "imatrix", "Nous-Hermes-2-Yi-34B"], "pipeline_tag": "text-generation", "inference": false}
duyntnet/Nous-Hermes-2-Yi-34B-imatrix-GGUF
null
[ "transformers", "gguf", "imatrix", "Nous-Hermes-2-Yi-34B", "text-generation", "en", "license:other", "region:us" ]
null
2024-04-27T09:31:39+00:00
[]
[ "en" ]
TAGS #transformers #gguf #imatrix #Nous-Hermes-2-Yi-34B #text-generation #en #license-other #region-us
Quantizations of URL # From original readme # Prompt Format Nous Hermes 2 uses ChatML as the prompt format, opening up a much more structured system for engaging the LLM in multi-turn chat dialogue. System prompts allow steerability and interesting new ways to interact with an LLM, guiding rules, roles, and stylistic choices of the model. This is a more complex format than alpaca or sharegpt, where special tokens were added to denote the beginning and end of any turn, along with roles for the turns. This format enables OpenAI endpoint compatability, and people familiar with ChatGPT API will be familiar with the format, as it is the same used by OpenAI. Prompt with system instruction (Use whatever system prompt you like, this is just an example!): This prompt is available as a chat template, which means you can format messages using the 'tokenizer.apply_chat_template()' method: When tokenizing messages for generation, set 'add_generation_prompt=True' when calling 'apply_chat_template()'. This will append '<|im_start|>assistant\n' to your prompt, to ensure that the model continues with an assistant response. To utilize the prompt format without a system prompt, simply leave the line out. When quantized versions of the model are released, I recommend using LM Studio for chatting with Nous Hermes 2. It is a GUI application that utilizes GGUF models with a URL backend and provides a ChatGPT-like interface for chatting with the model, and supports ChatML right out of the box. In LM-Studio, simply select the ChatML Prefix on the settings side pane: !image/png
[ "# From original readme", "# Prompt Format\n\nNous Hermes 2 uses ChatML as the prompt format, opening up a much more structured system for engaging the LLM in multi-turn chat dialogue.\n\nSystem prompts allow steerability and interesting new ways to interact with an LLM, guiding rules, roles, and stylistic choices of the model.\n\nThis is a more complex format than alpaca or sharegpt, where special tokens were added to denote the beginning and end of any turn, along with roles for the turns.\n\nThis format enables OpenAI endpoint compatability, and people familiar with ChatGPT API will be familiar with the format, as it is the same used by OpenAI.\n\nPrompt with system instruction (Use whatever system prompt you like, this is just an example!):\n\n\nThis prompt is available as a chat template, which means you can format messages using the\n'tokenizer.apply_chat_template()' method:\n\n\n\nWhen tokenizing messages for generation, set 'add_generation_prompt=True' when calling 'apply_chat_template()'. This will append '<|im_start|>assistant\\n' to your prompt, to ensure\nthat the model continues with an assistant response.\n\nTo utilize the prompt format without a system prompt, simply leave the line out.\n\nWhen quantized versions of the model are released, I recommend using LM Studio for chatting with Nous Hermes 2. It is a GUI application that utilizes GGUF models with a URL backend and provides a ChatGPT-like interface for chatting with the model, and supports ChatML right out of the box.\nIn LM-Studio, simply select the ChatML Prefix on the settings side pane:\n\n!image/png" ]
[ "TAGS\n#transformers #gguf #imatrix #Nous-Hermes-2-Yi-34B #text-generation #en #license-other #region-us \n", "# From original readme", "# Prompt Format\n\nNous Hermes 2 uses ChatML as the prompt format, opening up a much more structured system for engaging the LLM in multi-turn chat dialogue.\n\nSystem prompts allow steerability and interesting new ways to interact with an LLM, guiding rules, roles, and stylistic choices of the model.\n\nThis is a more complex format than alpaca or sharegpt, where special tokens were added to denote the beginning and end of any turn, along with roles for the turns.\n\nThis format enables OpenAI endpoint compatability, and people familiar with ChatGPT API will be familiar with the format, as it is the same used by OpenAI.\n\nPrompt with system instruction (Use whatever system prompt you like, this is just an example!):\n\n\nThis prompt is available as a chat template, which means you can format messages using the\n'tokenizer.apply_chat_template()' method:\n\n\n\nWhen tokenizing messages for generation, set 'add_generation_prompt=True' when calling 'apply_chat_template()'. This will append '<|im_start|>assistant\\n' to your prompt, to ensure\nthat the model continues with an assistant response.\n\nTo utilize the prompt format without a system prompt, simply leave the line out.\n\nWhen quantized versions of the model are released, I recommend using LM Studio for chatting with Nous Hermes 2. It is a GUI application that utilizes GGUF models with a URL backend and provides a ChatGPT-like interface for chatting with the model, and supports ChatML right out of the box.\nIn LM-Studio, simply select the ChatML Prefix on the settings side pane:\n\n!image/png" ]
null
peft
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # mistral-7b-nli_cot_qkv This model is a fine-tuned version of [TheBloke/Mistral-7B-v0.1-GPTQ](https://huggingface.co/TheBloke/Mistral-7B-v0.1-GPTQ) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.7749 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 4 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 16 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_steps: 2 - num_epochs: 12 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:-------:|:-----:|:---------------:| | 0.426 | 0.9998 | 1196 | 0.4255 | | 0.3664 | 1.9996 | 2392 | 0.4365 | | 0.3221 | 2.9994 | 3588 | 0.4455 | | 0.2804 | 4.0 | 4785 | 0.4577 | | 0.2403 | 4.9998 | 5981 | 0.4719 | | 0.2001 | 5.9996 | 7177 | 0.4948 | | 0.1643 | 6.9994 | 8373 | 0.5278 | | 0.1305 | 8.0 | 9570 | 0.5634 | | 0.1011 | 8.9998 | 10766 | 0.6095 | | 0.0768 | 9.9996 | 11962 | 0.6621 | | 0.0577 | 10.9994 | 13158 | 0.7225 | | 0.0445 | 11.9975 | 14352 | 0.7749 | ### Framework versions - PEFT 0.10.0 - Transformers 4.40.1 - Pytorch 2.0.1+cu118 - Datasets 2.19.0 - Tokenizers 0.19.1
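A minimal sketch of loading this adapter for inference; the base-model and adapter ids come from the card, while the prompt and generation settings are illustrative assumptions (loading the GPTQ base also requires the optimum/auto-gptq integration to be installed):

```python
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "TheBloke/Mistral-7B-v0.1-GPTQ"
adapter_id = "jd0g/mistral-7b-nli_cot_qkv"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto", torch_dtype=torch.float16)

# Attach the fine-tuned LoRA adapter on top of the quantized base model.
model = PeftModel.from_pretrained(base, adapter_id)

# Illustrative NLI-style prompt; the exact prompt format used in training is not documented here.
inputs = tokenizer(
    "Premise: A man is playing a guitar. Hypothesis: A person is making music. Answer:",
    return_tensors="pt",
).to(base.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```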
{"license": "apache-2.0", "library_name": "peft", "tags": ["generated_from_trainer"], "base_model": "TheBloke/Mistral-7B-v0.1-GPTQ", "model-index": [{"name": "mistral-7b-nli_cot_qkv", "results": []}]}
jd0g/mistral-7b-nli_cot_qkv
null
[ "peft", "safetensors", "generated_from_trainer", "base_model:TheBloke/Mistral-7B-v0.1-GPTQ", "license:apache-2.0", "region:us" ]
null
2024-04-27T09:33:09+00:00
[]
[]
TAGS #peft #safetensors #generated_from_trainer #base_model-TheBloke/Mistral-7B-v0.1-GPTQ #license-apache-2.0 #region-us
mistral-7b-nli\_cot\_qkv ======================== This model is a fine-tuned version of TheBloke/Mistral-7B-v0.1-GPTQ on the None dataset. It achieves the following results on the evaluation set: * Loss: 0.7749 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 0.0002 * train\_batch\_size: 4 * eval\_batch\_size: 8 * seed: 42 * gradient\_accumulation\_steps: 4 * total\_train\_batch\_size: 16 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * lr\_scheduler\_warmup\_steps: 2 * num\_epochs: 12 * mixed\_precision\_training: Native AMP ### Training results ### Framework versions * PEFT 0.10.0 * Transformers 4.40.1 * Pytorch 2.0.1+cu118 * Datasets 2.19.0 * Tokenizers 0.19.1
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0002\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 16\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 2\n* num\\_epochs: 12\n* mixed\\_precision\\_training: Native AMP", "### Training results", "### Framework versions\n\n\n* PEFT 0.10.0\n* Transformers 4.40.1\n* Pytorch 2.0.1+cu118\n* Datasets 2.19.0\n* Tokenizers 0.19.1" ]
[ "TAGS\n#peft #safetensors #generated_from_trainer #base_model-TheBloke/Mistral-7B-v0.1-GPTQ #license-apache-2.0 #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0002\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 16\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 2\n* num\\_epochs: 12\n* mixed\\_precision\\_training: Native AMP", "### Training results", "### Framework versions\n\n\n* PEFT 0.10.0\n* Transformers 4.40.1\n* Pytorch 2.0.1+cu118\n* Datasets 2.19.0\n* Tokenizers 0.19.1" ]
null
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
la-min/GENI_GPT_Health
null
[ "transformers", "safetensors", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
null
2024-04-27T09:38:45+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
text-classification
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # robust_llm_pythia-410m_mz-131f_IMDB This model is a fine-tuned version of [EleutherAI/pythia-410m](https://huggingface.co/EleutherAI/pythia-410m) on an unknown dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 8 - eval_batch_size: 64 - seed: 0 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 1 ### Training results ### Framework versions - Transformers 4.39.3 - Pytorch 2.2.1 - Datasets 2.18.0 - Tokenizers 0.15.2
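A small usage sketch, assuming the checkpoint works as a standard text-classification pipeline (the card itself gives no usage snippet, and the label names depend on the checkpoint config):

```python
from transformers import pipeline

clf = pipeline(
    "text-classification",
    model="AlignmentResearch/robust_llm_pythia-410m_mz-131f_IMDB",
)

# Returns a list like [{'label': ..., 'score': ...}]; label meanings depend on the config.
print(clf("A surprisingly thoughtful film, even if the final act falls flat."))
```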
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "base_model": "EleutherAI/pythia-410m", "model-index": [{"name": "robust_llm_pythia-410m_mz-131f_IMDB", "results": []}]}
AlignmentResearch/robust_llm_pythia-410m_mz-131f_IMDB
null
[ "transformers", "tensorboard", "safetensors", "gpt_neox", "text-classification", "generated_from_trainer", "base_model:EleutherAI/pythia-410m", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2024-04-27T09:40:16+00:00
[]
[]
TAGS #transformers #tensorboard #safetensors #gpt_neox #text-classification #generated_from_trainer #base_model-EleutherAI/pythia-410m #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# robust_llm_pythia-410m_mz-131f_IMDB This model is a fine-tuned version of EleutherAI/pythia-410m on an unknown dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 8 - eval_batch_size: 64 - seed: 0 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 1 ### Training results ### Framework versions - Transformers 4.39.3 - Pytorch 2.2.1 - Datasets 2.18.0 - Tokenizers 0.15.2
[ "# robust_llm_pythia-410m_mz-131f_IMDB\n\nThis model is a fine-tuned version of EleutherAI/pythia-410m on an unknown dataset.", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 1e-05\n- train_batch_size: 8\n- eval_batch_size: 64\n- seed: 0\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 1", "### Training results", "### Framework versions\n\n- Transformers 4.39.3\n- Pytorch 2.2.1\n- Datasets 2.18.0\n- Tokenizers 0.15.2" ]
[ "TAGS\n#transformers #tensorboard #safetensors #gpt_neox #text-classification #generated_from_trainer #base_model-EleutherAI/pythia-410m #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# robust_llm_pythia-410m_mz-131f_IMDB\n\nThis model is a fine-tuned version of EleutherAI/pythia-410m on an unknown dataset.", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 1e-05\n- train_batch_size: 8\n- eval_batch_size: 64\n- seed: 0\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 1", "### Training results", "### Framework versions\n\n- Transformers 4.39.3\n- Pytorch 2.2.1\n- Datasets 2.18.0\n- Tokenizers 0.15.2" ]
null
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
zandfj/LLaMA2-7B-Chat_-sft-sft-moren_042716
null
[ "transformers", "safetensors", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
null
2024-04-27T09:43:13+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
text2text-generation
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # CS505_COQE_viT5_total_Instruction0_SAOPL_v1_h1 This model is a fine-tuned version of [VietAI/vit5-large](https://huggingface.co/VietAI/vit5-large) on the None dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 8 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 25 - mixed_precision_training: Native AMP ### Training results ### Framework versions - Transformers 4.39.3 - Pytorch 2.1.2 - Datasets 2.18.0 - Tokenizers 0.15.2
{"license": "mit", "tags": ["generated_from_trainer"], "base_model": "VietAI/vit5-large", "model-index": [{"name": "CS505_COQE_viT5_total_Instruction0_SAOPL_v1_h1", "results": []}]}
ThuyNT/CS505_COQE_viT5_total_Instruction0_SAOPL_v1_h1
null
[ "transformers", "tensorboard", "safetensors", "t5", "text2text-generation", "generated_from_trainer", "base_model:VietAI/vit5-large", "license:mit", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2024-04-27T09:45:06+00:00
[]
[]
TAGS #transformers #tensorboard #safetensors #t5 #text2text-generation #generated_from_trainer #base_model-VietAI/vit5-large #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# CS505_COQE_viT5_total_Instruction0_SAOPL_v1_h1 This model is a fine-tuned version of VietAI/vit5-large on the None dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 8 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 25 - mixed_precision_training: Native AMP ### Training results ### Framework versions - Transformers 4.39.3 - Pytorch 2.1.2 - Datasets 2.18.0 - Tokenizers 0.15.2
[ "# CS505_COQE_viT5_total_Instruction0_SAOPL_v1_h1\n\nThis model is a fine-tuned version of VietAI/vit5-large on the None dataset.", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 8\n- eval_batch_size: 64\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 25\n- mixed_precision_training: Native AMP", "### Training results", "### Framework versions\n\n- Transformers 4.39.3\n- Pytorch 2.1.2\n- Datasets 2.18.0\n- Tokenizers 0.15.2" ]
[ "TAGS\n#transformers #tensorboard #safetensors #t5 #text2text-generation #generated_from_trainer #base_model-VietAI/vit5-large #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# CS505_COQE_viT5_total_Instruction0_SAOPL_v1_h1\n\nThis model is a fine-tuned version of VietAI/vit5-large on the None dataset.", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 8\n- eval_batch_size: 64\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 25\n- mixed_precision_training: Native AMP", "### Training results", "### Framework versions\n\n- Transformers 4.39.3\n- Pytorch 2.1.2\n- Datasets 2.18.0\n- Tokenizers 0.15.2" ]
text-generation
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
shallow6414/qhjajc7
null
[ "transformers", "safetensors", "llama", "text-generation", "conversational", "arxiv:1910.09700", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2024-04-27T09:46:50+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #llama #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #llama #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
text-generation
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
pruning/78gd6y1
null
[ "transformers", "safetensors", "stablelm", "text-generation", "conversational", "arxiv:1910.09700", "autotrain_compatible", "endpoints_compatible", "region:us" ]
null
2024-04-27T09:47:08+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #stablelm #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #stablelm #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
text-generation
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
pruning/wnb6yqs
null
[ "transformers", "safetensors", "stablelm", "text-generation", "conversational", "arxiv:1910.09700", "autotrain_compatible", "endpoints_compatible", "region:us" ]
null
2024-04-27T09:47:08+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #stablelm #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #stablelm #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
text-generation
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
pruning/anpj1l6
null
[ "transformers", "safetensors", "stablelm", "text-generation", "conversational", "arxiv:1910.09700", "autotrain_compatible", "endpoints_compatible", "region:us" ]
null
2024-04-27T09:47:08+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #stablelm #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #stablelm #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
text-generation
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
pruning/d4j2o6l
null
[ "transformers", "safetensors", "stablelm", "text-generation", "conversational", "arxiv:1910.09700", "autotrain_compatible", "endpoints_compatible", "region:us" ]
null
2024-04-27T09:47:08+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #stablelm #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #stablelm #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
text-generation
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
pruning/i2p445c
null
[ "transformers", "safetensors", "stablelm", "text-generation", "conversational", "arxiv:1910.09700", "autotrain_compatible", "endpoints_compatible", "region:us" ]
null
2024-04-27T09:47:08+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #stablelm #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #stablelm #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
text-generation
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
pruning/viwznvk
null
[ "transformers", "safetensors", "stablelm", "text-generation", "conversational", "arxiv:1910.09700", "autotrain_compatible", "endpoints_compatible", "region:us" ]
null
2024-04-27T09:47:08+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #stablelm #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #stablelm #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
text-generation
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
pruning/vzl6orn
null
[ "transformers", "safetensors", "stablelm", "text-generation", "conversational", "arxiv:1910.09700", "autotrain_compatible", "endpoints_compatible", "region:us" ]
null
2024-04-27T09:47:08+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #stablelm #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #stablelm #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
null
transformers
# Uploaded model - **Developed by:** saksornr - **License:** apache-2.0 - **Finetuned from model :** saksornr/SeaLLM-7B-v2.5-4bit This gemma model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library. [<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
{"language": ["en"], "license": "apache-2.0", "tags": ["text-generation-inference", "transformers", "unsloth", "gemma", "trl"], "base_model": "saksornr/SeaLLM-7B-v2.5-4bit"}
saksornr/SeaLLM-7B-v2.5-sample-finetune
null
[ "transformers", "text-generation-inference", "unsloth", "gemma", "trl", "en", "base_model:saksornr/SeaLLM-7B-v2.5-4bit", "license:apache-2.0", "endpoints_compatible", "region:us" ]
null
2024-04-27T09:49:09+00:00
[]
[ "en" ]
TAGS #transformers #text-generation-inference #unsloth #gemma #trl #en #base_model-saksornr/SeaLLM-7B-v2.5-4bit #license-apache-2.0 #endpoints_compatible #region-us
# Uploaded model - Developed by: saksornr - License: apache-2.0 - Finetuned from model : saksornr/SeaLLM-7B-v2.5-4bit This gemma model was trained 2x faster with Unsloth and Huggingface's TRL library. <img src="URL width="200"/>
[ "# Uploaded model\n\n- Developed by: saksornr\n- License: apache-2.0\n- Finetuned from model : saksornr/SeaLLM-7B-v2.5-4bit\n\nThis gemma model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>" ]
[ "TAGS\n#transformers #text-generation-inference #unsloth #gemma #trl #en #base_model-saksornr/SeaLLM-7B-v2.5-4bit #license-apache-2.0 #endpoints_compatible #region-us \n", "# Uploaded model\n\n- Developed by: saksornr\n- License: apache-2.0\n- Finetuned from model : saksornr/SeaLLM-7B-v2.5-4bit\n\nThis gemma model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>" ]
null
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
zandfj/LLaMA2-7B-Chat_-sft-sft-3epo-moren_042716
null
[ "transformers", "safetensors", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
null
2024-04-27T09:50:37+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
text-generation
transformers
Repeating [this experiment](https://huggingface.co/maxim-saplin/parrot-1_6B) of teaching an LLM to follow chat structure and reply in all CAPS, this time with Gemma 2B as the base model. Compared to Stable LM 1.6B, this model took 68 minutes to train (vs. 11) and did not learn the capability for the Russian (RU) language. ![image/png](https://cdn-uploads.huggingface.co/production/uploads/6484924993affaeb91cad007/WuZfnbmrJI22LJuIIur5W.png)
{"language": ["en"], "pipeline_tag": "text-generation"}
maxim-saplin/parrot-gemma-2B
null
[ "transformers", "safetensors", "gemma", "text-generation", "conversational", "en", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2024-04-27T09:51:24+00:00
[]
[ "en" ]
TAGS #transformers #safetensors #gemma #text-generation #conversational #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Repeating this experiment of teaching an LLM to follow chat structure and reply in all CAPS, this time with Gemma 2B as the base model. Compared to Stable LM 1.6B, this model took 68 minutes to train (vs. 11) and did not learn the capability for the Russian (RU) language. !image/png
[]
[ "TAGS\n#transformers #safetensors #gemma #text-generation #conversational #en #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n" ]
feature-extraction
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
aashish-249/Sarcasm_classification
null
[ "transformers", "safetensors", "bert", "feature-extraction", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
null
2024-04-27T09:53:14+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #bert #feature-extraction #arxiv-1910.09700 #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #bert #feature-extraction #arxiv-1910.09700 #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
sentence-similarity
sentence-transformers
# Bulbasaur

This is a distillation of [gte-tiny](https://huggingface.co/TaylorAI/gte-tiny) trained using [qa-assistant](https://huggingface.co/datasets/Mihaiii/qa-assistant).

## Intended purpose

<span style="color:blue">This model is designed for use in semantic-autocomplete ([click here for demo](https://mihaiii.github.io/semantic-autocomplete/)).</span>

## Usage (Sentence-Transformers) (same as [gte-tiny](https://huggingface.co/TaylorAI/gte-tiny))

Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed:

```
pip install -U sentence-transformers
```

Then you can use the model like this:

```python
from sentence_transformers import SentenceTransformer
sentences = ["This is an example sentence", "Each sentence is converted"]

model = SentenceTransformer('Mihaiii/Bulbasaur')
embeddings = model.encode(sentences)
print(embeddings)
```

## Usage (HuggingFace Transformers) (same as [gte-tiny](https://huggingface.co/TaylorAI/gte-tiny))

Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: first, you pass your input through the transformer model, then you have to apply the right pooling operation on top of the contextualized word embeddings.

```python
from transformers import AutoTokenizer, AutoModel
import torch

# Mean Pooling - Take attention mask into account for correct averaging
def mean_pooling(model_output, attention_mask):
    token_embeddings = model_output[0]  # First element of model_output contains all token embeddings
    input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
    return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9)

# Sentences we want sentence embeddings for
sentences = ['This is an example sentence', 'Each sentence is converted']

# Load model from HuggingFace Hub
tokenizer = AutoTokenizer.from_pretrained('Mihaiii/Bulbasaur')
model = AutoModel.from_pretrained('Mihaiii/Bulbasaur')

# Tokenize sentences
encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt')

# Compute token embeddings
with torch.no_grad():
    model_output = model(**encoded_input)

# Perform pooling. In this case, mean pooling.
sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask'])

print("Sentence embeddings:")
print(sentence_embeddings)
```

### Limitation (same as [gte-small](https://huggingface.co/thenlper/gte-small))

This model exclusively caters to English texts, and any lengthy texts will be truncated to a maximum of 512 tokens.
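Since the card is published under the sentence-similarity pipeline tag and targets semantic-autocomplete, a minimal ranking sketch may help show how the embeddings are typically consumed. This is only an illustration under stated assumptions: the query and candidate strings below are made up for the example and are not part of the original card.

```python
from sentence_transformers import SentenceTransformer, util

# Illustrative sketch: rank candidate completions by cosine similarity to a query.
# The query/candidate strings are hypothetical examples, not from the model card.
model = SentenceTransformer('Mihaiii/Bulbasaur')

query = "How do I reset my password?"
candidates = [
    "Steps to change your account password",
    "Weather forecast for tomorrow",
    "Recovering access to a locked account",
]

query_emb = model.encode(query, convert_to_tensor=True)
cand_embs = model.encode(candidates, convert_to_tensor=True)

# Cosine similarity between the query and each candidate (shape: 1 x len(candidates))
scores = util.cos_sim(query_emb, cand_embs)[0]

# Print candidates from most to least similar
for score, cand in sorted(zip(scores.tolist(), candidates), reverse=True):
    print(f"{score:.3f}  {cand}")
```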
{"license": "mit", "library_name": "sentence-transformers", "tags": ["sentence-transformers", "feature-extraction", "sentence-similarity", "gte", "mteb"], "datasets": ["Mihaiii/qa-assistant"], "pipeline_tag": "sentence-similarity", "model-index": [{"name": "Bulbasaur", "results": [{"task": {"type": "Classification"}, "dataset": {"name": "MTEB AmazonCounterfactualClassification (en)", "type": "mteb/amazon_counterfactual", "config": "en", "split": "test", "revision": "e8379541af4e31359cca9fbcf4b00f2671dba205"}, "metrics": [{"type": "accuracy", "value": 71.86567164179104}, {"type": "ap", "value": 34.08685244750869}, {"type": "f1", "value": 65.66014356237362}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB AmazonPolarityClassification", "type": "mteb/amazon_polarity", "config": "default", "split": "test", "revision": "e2d317d38cd51312af73b3d32a06d1a08b442046"}, "metrics": [{"type": "accuracy", "value": 78.78927499999999}, {"type": "ap", "value": 73.46960735629719}, {"type": "f1", "value": 78.6951990840684}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB AmazonReviewsClassification (en)", "type": "mteb/amazon_reviews_multi", "config": "en", "split": "test", "revision": "1399c76144fd37290681b995c656ef9b2e06e26d"}, "metrics": [{"type": "accuracy", "value": 39.312}, {"type": "f1", "value": 38.94567141563064}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB ArguAna", "type": "mteb/arguana", "config": "default", "split": "test", "revision": "c22ab2a51041ffd869aaddef7af8d8215647e41a"}, "metrics": [{"type": "map_at_1", "value": 22.191}, {"type": "map_at_10", "value": 36.504}, {"type": "map_at_100", "value": 37.676}, {"type": "map_at_1000", "value": 37.693}, {"type": "map_at_20", "value": 37.329}, {"type": "map_at_3", "value": 31.840000000000003}, {"type": "map_at_5", "value": 34.333000000000006}, {"type": "mrr_at_1", "value": 23.186}, {"type": "mrr_at_10", "value": 36.856}, {"type": "mrr_at_100", "value": 38.048}, {"type": "mrr_at_1000", "value": 38.065}, {"type": "mrr_at_20", "value": 37.701}, {"type": "mrr_at_3", "value": 32.16}, {"type": "mrr_at_5", "value": 34.756}, {"type": "ndcg_at_1", "value": 22.191}, {"type": "ndcg_at_10", "value": 44.798}, {"type": "ndcg_at_100", "value": 50.141999999999996}, {"type": "ndcg_at_1000", "value": 50.599000000000004}, {"type": "ndcg_at_20", "value": 47.778999999999996}, {"type": "ndcg_at_3", "value": 35.071999999999996}, {"type": "ndcg_at_5", "value": 39.574}, {"type": "precision_at_1", "value": 22.191}, {"type": "precision_at_10", "value": 7.148000000000001}, {"type": "precision_at_100", "value": 0.9570000000000001}, {"type": "precision_at_1000", "value": 0.099}, {"type": "precision_at_20", "value": 4.1610000000000005}, {"type": "precision_at_3", "value": 14.817}, {"type": "precision_at_5", "value": 11.081000000000001}, {"type": "recall_at_1", "value": 22.191}, {"type": "recall_at_10", "value": 71.479}, {"type": "recall_at_100", "value": 95.661}, {"type": "recall_at_1000", "value": 99.289}, {"type": "recall_at_20", "value": 83.21499999999999}, {"type": "recall_at_3", "value": 44.452000000000005}, {"type": "recall_at_5", "value": 55.405}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB ArxivClusteringP2P", "type": "mteb/arxiv-clustering-p2p", "config": "default", "split": "test", "revision": "a122ad7f3f0291bf49cc6f4d32aa80929df69d5d"}, "metrics": [{"type": "v_measure", "value": 40.283298409035076}, {"type": "v_measures", "value": [0.3532106296315629, 0.38211196645121454, 0.4115695136452048, 
0.41137132653792025, 0.3837736540549879, 0.3747132869956856, 0.39691152506736527, 0.39788336468446533, 0.3642563557059312, 0.41116083049947033, 0.45922387863541325, 0.469635375348701, 0.46766327774202016, 0.4625872980544393, 0.47185794625113725, 0.47528841611119615, 0.4772024530512538, 0.4708000082870702, 0.4717644230002225, 0.4660063378028352, 0.4555746206742128, 0.28465696985786276, 0.3226387432684682, 0.36349250452617954, 0.31579079512572683, 
0.23387076944848043, 0.28341764616852566, 0.16191336340497103, 0.2368224145727693, 1.0, 0.25065281219558383, 0.3532106296315629, 0.38211196645121454, 0.4115695136452048, 0.41137132653792025, 0.3837736540549879, 0.3747132869956856, 0.39691152506736527, 0.39788336468446533, 0.3642563557059312, 0.41116083049947033, 0.45922387863541325, 0.469635375348701, 0.46766327774202016, 0.4625872980544393, 0.47185794625113725, 0.47528841611119615, 0.4772024530512538, 0.4708000082870702, 0.4717644230002225, 0.4660063378028352, 0.4555746206742128, 0.28465696985786276, 0.3226387432684682, 0.36349250452617954, 0.31579079512572683, 0.23387076944848043, 0.28341764616852566, 0.16191336340497103, 0.2368224145727693, 1.0, 0.25065281219558383, 0.3532106296315629, 0.38211196645121454, 0.4115695136452048, 0.41137132653792025, 0.3837736540549879, 0.3747132869956856, 0.39691152506736527, 0.39788336468446533, 0.3642563557059312, 0.41116083049947033, 0.45922387863541325, 0.469635375348701, 0.46766327774202016, 0.4625872980544393, 0.47185794625113725, 0.47528841611119615, 0.4772024530512538, 0.4708000082870702, 0.4717644230002225, 0.4660063378028352, 0.4555746206742128, 0.28465696985786276, 0.3226387432684682, 0.36349250452617954, 0.31579079512572683, 0.23387076944848043, 0.28341764616852566, 0.16191336340497103, 0.2368224145727693, 1.0, 0.25065281219558383, 0.3532106296315629, 0.38211196645121454, 0.4115695136452048, 0.41137132653792025, 0.3837736540549879, 0.3747132869956856, 0.39691152506736527, 0.39788336468446533, 0.3642563557059312, 0.41116083049947033, 0.45922387863541325, 0.469635375348701, 0.46766327774202016, 0.4625872980544393, 0.47185794625113725, 0.47528841611119615, 0.4772024530512538, 0.4708000082870702, 0.4717644230002225, 0.4660063378028352, 0.4555746206742128, 0.28465696985786276, 0.3226387432684682, 0.36349250452617954, 0.31579079512572683, 0.23387076944848043, 0.28341764616852566, 0.16191336340497103, 0.2368224145727693, 1.0, 0.25065281219558383, 0.3532106296315629, 0.38211196645121454, 0.4115695136452048, 0.41137132653792025, 0.3837736540549879, 0.3747132869956856, 0.39691152506736527, 0.39788336468446533, 0.3642563557059312, 0.41116083049947033, 0.45922387863541325, 0.469635375348701, 0.46766327774202016, 0.4625872980544393, 0.47185794625113725, 0.47528841611119615, 0.4772024530512538, 0.4708000082870702, 0.4717644230002225, 0.4660063378028352, 0.4555746206742128, 0.28465696985786276, 0.3226387432684682, 0.36349250452617954, 0.31579079512572683, 0.23387076944848043, 0.28341764616852566, 0.16191336340497103, 0.2368224145727693, 1.0, 0.25065281219558383, 0.3532106296315629, 0.38211196645121454, 0.4115695136452048, 0.41137132653792025, 0.3837736540549879, 0.3747132869956856, 0.39691152506736527, 0.39788336468446533, 0.3642563557059312, 0.41116083049947033, 0.45922387863541325, 0.469635375348701, 0.46766327774202016, 0.4625872980544393, 0.47185794625113725, 0.47528841611119615, 0.4772024530512538, 0.4708000082870702, 0.4717644230002225, 0.4660063378028352, 0.4555746206742128, 0.28465696985786276, 0.3226387432684682, 0.36349250452617954, 0.31579079512572683, 0.23387076944848043, 0.28341764616852566, 0.16191336340497103, 0.2368224145727693, 1.0, 0.25065281219558383, 0.3532106296315629, 0.38211196645121454, 0.4115695136452048, 0.41137132653792025, 0.3837736540549879, 0.3747132869956856, 0.39691152506736527, 0.39788336468446533, 0.3642563557059312, 0.41116083049947033, 0.45922387863541325, 0.469635375348701, 0.46766327774202016, 0.4625872980544393, 0.47185794625113725, 0.47528841611119615, 
0.4772024530512538, 0.4708000082870702, 0.4717644230002225, 0.4660063378028352, 0.4555746206742128, 0.28465696985786276, 0.3226387432684682, 0.36349250452617954, 0.31579079512572683, 0.23387076944848043, 0.28341764616852566, 0.16191336340497103, 0.2368224145727693, 1.0, 0.25065281219558383, 0.3532106296315629, 0.38211196645121454, 0.4115695136452048, 0.41137132653792025, 0.3837736540549879, 0.3747132869956856, 0.39691152506736527, 0.39788336468446533, 0.3642563557059312, 0.41116083049947033, 0.45922387863541325, 0.469635375348701, 0.46766327774202016, 0.4625872980544393, 0.47185794625113725, 0.47528841611119615, 0.4772024530512538, 0.4708000082870702, 0.4717644230002225, 0.4660063378028352, 0.4555746206742128, 0.28465696985786276, 0.3226387432684682, 0.36349250452617954, 0.31579079512572683, 0.23387076944848043, 0.28341764616852566, 0.16191336340497103, 0.2368224145727693, 1.0, 0.25065281219558383, 0.3532106296315629, 0.38211196645121454, 0.4115695136452048, 0.41137132653792025, 0.3837736540549879, 0.3747132869956856, 0.39691152506736527, 0.39788336468446533, 0.3642563557059312, 0.41116083049947033, 0.45922387863541325, 0.469635375348701, 0.46766327774202016, 0.4625872980544393, 0.47185794625113725, 0.47528841611119615, 0.4772024530512538, 0.4708000082870702, 0.4717644230002225, 0.4660063378028352, 0.4555746206742128, 0.28465696985786276, 0.3226387432684682, 0.36349250452617954, 0.31579079512572683, 0.23387076944848043, 0.28341764616852566, 0.16191336340497103, 0.2368224145727693, 1.0, 0.25065281219558383, 0.3532106296315629, 0.38211196645121454, 0.4115695136452048, 0.41137132653792025, 0.3837736540549879, 0.3747132869956856, 0.39691152506736527, 0.39788336468446533, 0.3642563557059312, 0.41116083049947033, 0.45922387863541325, 0.469635375348701, 0.46766327774202016, 0.4625872980544393, 0.47185794625113725, 0.47528841611119615, 0.4772024530512538, 0.4708000082870702, 0.4717644230002225, 0.4660063378028352, 0.4555746206742128, 0.28465696985786276, 0.3226387432684682, 0.36349250452617954, 0.31579079512572683, 0.23387076944848043, 0.28341764616852566, 0.16191336340497103, 0.2368224145727693, 1.0, 0.25065281219558383, 0.3532106296315629, 0.38211196645121454, 0.4115695136452048, 0.41137132653792025, 0.3837736540549879, 0.3747132869956856, 0.39691152506736527, 0.39788336468446533, 0.3642563557059312, 0.41116083049947033, 0.45922387863541325, 0.469635375348701, 0.46766327774202016, 0.4625872980544393, 0.47185794625113725, 0.47528841611119615, 0.4772024530512538, 0.4708000082870702, 0.4717644230002225, 0.4660063378028352, 0.4555746206742128, 0.28465696985786276, 0.3226387432684682, 0.36349250452617954, 0.31579079512572683, 0.23387076944848043, 0.28341764616852566, 0.16191336340497103, 0.2368224145727693, 1.0, 0.25065281219558383, 0.3532106296315629, 0.38211196645121454, 0.4115695136452048, 0.41137132653792025, 0.3837736540549879, 0.3747132869956856, 0.39691152506736527, 0.39788336468446533, 0.3642563557059312, 0.41116083049947033, 0.45922387863541325, 0.469635375348701, 0.46766327774202016, 0.4625872980544393, 0.47185794625113725, 0.47528841611119615, 0.4772024530512538, 0.4708000082870702, 0.4717644230002225, 0.4660063378028352, 0.4555746206742128, 0.28465696985786276, 0.3226387432684682, 0.36349250452617954, 0.31579079512572683, 0.23387076944848043, 0.28341764616852566, 0.16191336340497103, 0.2368224145727693, 1.0, 0.25065281219558383, 0.3532106296315629, 0.38211196645121454, 0.4115695136452048, 0.41137132653792025, 0.3837736540549879, 0.3747132869956856, 0.39691152506736527, 
0.39788336468446533, 0.3642563557059312, 0.41116083049947033, 0.45922387863541325, 0.469635375348701, 0.46766327774202016, 0.4625872980544393, 0.47185794625113725, 0.47528841611119615, 0.4772024530512538, 0.4708000082870702, 0.4717644230002225, 0.4660063378028352, 0.4555746206742128, 0.28465696985786276, 0.3226387432684682, 0.36349250452617954, 0.31579079512572683, 0.23387076944848043, 0.28341764616852566, 0.16191336340497103, 0.2368224145727693, 1.0, 0.25065281219558383, 0.3532106296315629, 0.38211196645121454, 0.4115695136452048, 0.41137132653792025, 0.3837736540549879, 0.3747132869956856, 0.39691152506736527, 0.39788336468446533, 0.3642563557059312, 0.41116083049947033, 0.45922387863541325, 0.469635375348701, 0.46766327774202016, 0.4625872980544393, 0.47185794625113725, 0.47528841611119615, 0.4772024530512538, 0.4708000082870702, 0.4717644230002225, 0.4660063378028352, 0.4555746206742128, 0.28465696985786276, 0.3226387432684682, 0.36349250452617954, 0.31579079512572683, 0.23387076944848043, 0.28341764616852566, 0.16191336340497103, 0.2368224145727693, 1.0, 0.25065281219558383, 0.3532106296315629, 0.38211196645121454, 0.4115695136452048, 0.41137132653792025, 0.3837736540549879, 0.3747132869956856, 0.39691152506736527, 0.39788336468446533, 0.3642563557059312, 0.41116083049947033, 0.45922387863541325, 0.469635375348701, 0.46766327774202016, 0.4625872980544393, 0.47185794625113725, 0.47528841611119615, 0.4772024530512538, 0.4708000082870702, 0.4717644230002225, 0.4660063378028352, 0.4555746206742128, 0.28465696985786276, 0.3226387432684682, 0.36349250452617954, 0.31579079512572683, 0.23387076944848043, 0.28341764616852566, 0.16191336340497103, 0.2368224145727693, 1.0, 0.25065281219558383]}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB ArxivClusteringS2S", "type": "mteb/arxiv-clustering-s2s", "config": "default", "split": "test", "revision": "f910caf1a6075f7329cdf8c1a6135696f37dbd53"}, "metrics": [{"type": "v_measure", "value": 31.058723747886102}, {"type": "v_measures", "value": [0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 
0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 
0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 
0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 
0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 
0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 
0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 
0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 
0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 
0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 
1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954, 
0.19297016209936776, 0.22972650043926732, 0.28712526095212015, 0.23455462464814825, 0.17725689545412332, 0.20084207532752152, 0.11288219701406794, 0.17247501115114902, 1.0, 0.16871104278429117, 0.2882904295615362, 0.2980529709767411, 0.31096049987441265, 0.3092869524544665, 0.272113281785075, 0.30377284563414125, 0.3041650358315243, 0.2834163757068413, 0.3033397511276131, 0.277467860679742, 0.3540105139063772, 0.3537847989150468, 0.3556330775006952, 0.35591610291120984, 0.35652508475268124, 0.35847496958487485, 0.35778401933080983, 0.3592993694802176, 0.35581486235835447, 0.3562712175584336, 0.33728057204383954]}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB AskUbuntuDupQuestions", "type": "mteb/askubuntudupquestions-reranking", "config": "default", "split": "test", "revision": "2000358ca161889fa9c082cb41daa8dcfb161a54"}, "metrics": [{"type": "map", "value": 57.489775602270086}, {"type": "mrr", "value": 71.4973838104032}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB BIOSSES", "type": "mteb/biosses-sts", "config": "default", "split": "test", "revision": "d3fb88f8f02e40887cd149695127462bbcf29b4a"}, "metrics": [{"type": "cos_sim_pearson", "value": 85.81314286759594}, {"type": "cos_sim_spearman", "value": 85.04832342591277}, {"type": "euclidean_pearson", "value": 84.20540608390993}, {"type": "euclidean_spearman", "value": 84.54831203281398}, {"type": "manhattan_pearson", "value": 84.11283044138868}, {"type": "manhattan_spearman", "value": 84.13384475757064}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB Banking77Classification", "type": "mteb/banking77", "config": "default", "split": "test", "revision": "0fd18e25b25c072e09e0d92ab615fda904d66300"}, "metrics": [{"type": "accuracy", "value": 80.57792207792207}, {"type": "f1", "value": 80.510338047888}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB BiorxivClusteringP2P", "type": "mteb/biorxiv-clustering-p2p", "config": "default", "split": "test", "revision": "65b79d1d13f80053f67aca9498d9402c2d9f1f40"}, "metrics": [{"type": "v_measure", "value": 35.17908628951979}, {"type": "v_measures", "value": [0.343694756605891, 0.34509905427411436, 0.348726287923308, 0.3443447447775894, 0.35379848849192064, 0.36302463987647937, 0.34047230042267046, 0.3608793757384582, 0.354042604080738, 0.36382637676080914]}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB BiorxivClusteringS2S", "type": "mteb/biorxiv-clustering-s2s", "config": "default", "split": "test", "revision": "258694dd0231531bc1fd9de6ceb52a0853c6d908"}, "metrics": [{"type": "v_measure", "value": 28.18471478622865}, {"type": "v_measures", "value": [0.28357199974042563, 0.28345784387850087, 0.26770142292888577, 0.2753654124345929, 0.2742889905380932, 0.2791462854667945, 0.2803842827173626, 0.2942286071197305, 0.2835815777164675, 0.2967450560820113]}]},
{"task": {"type": "Reranking"}, "dataset": {"name": "MTEB 
MindSmallReranking", "type": "mteb/mind_small", "config": "default", "split": "test", "revision": "3bdac13927fdc888b903db93b2ffdbd90b295a69"}, "metrics": [{"type": "map", "value": 31.7776266029616}, {"type": "mrr", "value": 32.9057970138914}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackRetrieval", "type": "mteb/cqadupstack", "config": "default", "split": "test", "revision": "4ffe81d471b1924886b33c7567bfb200e9eec5c4"}, "metrics": [{"type": "map_at_1", "value": 24.78675}, {"type": "map_at_10", "value": 33.18391666666666}, {"type": "map_at_100", "value": 34.34583333333333}, {"type": "map_at_1000", "value": 34.46825}, {"type": "map_at_20", "value": 33.819}, {"type": "map_at_3", "value": 30.636500000000005}, {"type": "map_at_5", "value": 32.02091666666667}, {"type": "mrr_at_1", "value": 29.478749999999998}, {"type": "mrr_at_10", "value": 37.385}, {"type": "mrr_at_100", "value": 38.23491666666667}, {"type": "mrr_at_1000", "value": 38.298833333333334}, {"type": "mrr_at_20", "value": 37.87508333333333}, {"type": "mrr_at_3", "value": 35.089666666666666}, {"type": "mrr_at_5", "value": 36.36816666666667}, {"type": "ndcg_at_1", "value": 29.478749999999998}, {"type": "ndcg_at_10", "value": 38.2035}, {"type": "ndcg_at_100", "value": 43.301083333333324}, {"type": "ndcg_at_1000", "value": 45.758666666666656}, {"type": "ndcg_at_20", "value": 40.15116666666667}, {"type": "ndcg_at_3", "value": 33.86033333333334}, {"type": "ndcg_at_5", "value": 35.81266666666666}, {"type": "precision_at_1", "value": 29.478749999999998}, {"type": "precision_at_10", "value": 6.642833333333334}, {"type": "precision_at_100", "value": 1.08425}, {"type": "precision_at_1000", "value": 0.14850000000000002}, {"type": "precision_at_20", "value": 3.948083333333334}, {"type": "precision_at_3", "value": 15.511}, {"type": "precision_at_5", "value": 10.929833333333333}, {"type": "recall_at_1", "value": 24.78675}, {"type": "recall_at_10", "value": 48.9305}, {"type": "recall_at_100", "value": 71.49416666666666}, {"type": "recall_at_1000", "value": 88.54375}, {"type": "recall_at_20", "value": 56.06475}, {"type": "recall_at_3", "value": 36.66891666666666}, {"type": "recall_at_5", "value": 41.790499999999994}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackAndroidRetrieval", "type": "mteb/cqadupstack-android", "config": "default", "split": "test", "revision": "f46a197baaae43b4f621051089b82a364682dfeb"}, "metrics": [{"type": "map_at_1", "value": 24.271}, {"type": "map_at_10", "value": 33.5}, {"type": "map_at_100", "value": 34.818}, {"type": "map_at_1000", "value": 34.967}, {"type": "map_at_20", "value": 34.238}, {"type": "map_at_3", "value": 30.488}, {"type": "map_at_5", "value": 32.303}, {"type": "mrr_at_1", "value": 30.615}, {"type": "mrr_at_10", "value": 39.076}, {"type": "mrr_at_100", "value": 40.022000000000006}, {"type": "mrr_at_1000", "value": 40.082}, {"type": "mrr_at_20", "value": 39.669}, {"type": "mrr_at_3", "value": 36.552}, {"type": "mrr_at_5", "value": 38.096999999999994}, {"type": "ndcg_at_1", "value": 30.615}, {"type": "ndcg_at_10", "value": 39.106}, {"type": "ndcg_at_100", "value": 44.519}, {"type": "ndcg_at_1000", "value": 47.274}, {"type": "ndcg_at_20", "value": 41.289}, {"type": "ndcg_at_3", "value": 34.55}, {"type": "ndcg_at_5", "value": 36.815999999999995}, {"type": "precision_at_1", "value": 30.615}, {"type": "precision_at_10", "value": 7.5249999999999995}, {"type": "precision_at_100", "value": 1.282}, {"type": "precision_at_1000", "value": 0.181}, {"type": 
"precision_at_20", "value": 4.549}, {"type": "precision_at_3", "value": 16.643}, {"type": "precision_at_5", "value": 12.275}, {"type": "recall_at_1", "value": 24.271}, {"type": "recall_at_10", "value": 49.714000000000006}, {"type": "recall_at_100", "value": 72.792}, {"type": "recall_at_1000", "value": 91.21000000000001}, {"type": "recall_at_20", "value": 57.799}, {"type": "recall_at_3", "value": 36.494}, {"type": "recall_at_5", "value": 42.764}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackEnglishRetrieval", "type": "mteb/cqadupstack-english", "config": "default", "split": "test", "revision": "ad9991cb51e31e31e430383c75ffb2885547b5f0"}, "metrics": [{"type": "map_at_1", "value": 19.414}, {"type": "map_at_10", "value": 25.766}, {"type": "map_at_100", "value": 26.627000000000002}, {"type": "map_at_1000", "value": 26.749000000000002}, {"type": "map_at_20", "value": 26.201999999999998}, {"type": "map_at_3", "value": 23.738}, {"type": "map_at_5", "value": 24.829}, {"type": "mrr_at_1", "value": 24.013}, {"type": "mrr_at_10", "value": 30.208000000000002}, {"type": "mrr_at_100", "value": 30.903000000000002}, {"type": "mrr_at_1000", "value": 30.976}, {"type": "mrr_at_20", "value": 30.585}, {"type": "mrr_at_3", "value": 28.376}, {"type": "mrr_at_5", "value": 29.462}, {"type": "ndcg_at_1", "value": 24.013}, {"type": "ndcg_at_10", "value": 29.871}, {"type": "ndcg_at_100", "value": 33.867999999999995}, {"type": "ndcg_at_1000", "value": 36.565}, {"type": "ndcg_at_20", "value": 31.251}, {"type": "ndcg_at_3", "value": 26.579000000000004}, {"type": "ndcg_at_5", "value": 28.094}, {"type": "precision_at_1", "value": 24.013}, {"type": "precision_at_10", "value": 5.503}, {"type": "precision_at_100", "value": 0.936}, {"type": "precision_at_1000", "value": 0.14100000000000001}, {"type": "precision_at_20", "value": 3.2800000000000002}, {"type": "precision_at_3", "value": 12.590000000000002}, {"type": "precision_at_5", "value": 8.994}, {"type": "recall_at_1", "value": 19.414}, {"type": "recall_at_10", "value": 37.582}, {"type": "recall_at_100", "value": 55.181000000000004}, {"type": "recall_at_1000", "value": 73.342}, {"type": "recall_at_20", "value": 42.596000000000004}, {"type": "recall_at_3", "value": 28.102}, {"type": "recall_at_5", "value": 32.267}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackGamingRetrieval", "type": "mteb/cqadupstack-gaming", "config": "default", "split": "test", "revision": "4885aa143210c98657558c04aaf3dc47cfb54340"}, "metrics": [{"type": "map_at_1", "value": 31.5}, {"type": "map_at_10", "value": 42.179}, {"type": "map_at_100", "value": 43.231}, {"type": "map_at_1000", "value": 43.302}, {"type": "map_at_20", "value": 42.786}, {"type": "map_at_3", "value": 39.17}, {"type": "map_at_5", "value": 40.854}, {"type": "mrr_at_1", "value": 36.113}, {"type": "mrr_at_10", "value": 45.378}, {"type": "mrr_at_100", "value": 46.153}, {"type": "mrr_at_1000", "value": 46.194}, {"type": "mrr_at_20", "value": 45.831}, {"type": "mrr_at_3", "value": 42.947}, {"type": "mrr_at_5", "value": 44.339}, {"type": "ndcg_at_1", "value": 36.113}, {"type": "ndcg_at_10", "value": 47.616}, {"type": "ndcg_at_100", "value": 52.125}, {"type": "ndcg_at_1000", "value": 53.717999999999996}, {"type": "ndcg_at_20", "value": 49.495}, {"type": "ndcg_at_3", "value": 42.354}, {"type": "ndcg_at_5", "value": 44.885999999999996}, {"type": "precision_at_1", "value": 36.113}, {"type": "precision_at_10", "value": 7.799}, {"type": "precision_at_100", "value": 1.093}, {"type": 
"precision_at_1000", "value": 0.129}, {"type": "precision_at_20", "value": 4.4670000000000005}, {"type": "precision_at_3", "value": 19.017999999999997}, {"type": "precision_at_5", "value": 13.254}, {"type": "recall_at_1", "value": 31.5}, {"type": "recall_at_10", "value": 60.67}, {"type": "recall_at_100", "value": 80.484}, {"type": "recall_at_1000", "value": 92.04599999999999}, {"type": "recall_at_20", "value": 67.644}, {"type": "recall_at_3", "value": 46.671}, {"type": "recall_at_5", "value": 52.723}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackGisRetrieval", "type": "mteb/cqadupstack-gis", "config": "default", "split": "test", "revision": "5003b3064772da1887988e05400cf3806fe491f2"}, "metrics": [{"type": "map_at_1", "value": 17.339}, {"type": "map_at_10", "value": 23.014000000000003}, {"type": "map_at_100", "value": 23.918}, {"type": "map_at_1000", "value": 24.027}, {"type": "map_at_20", "value": 23.507}, {"type": "map_at_3", "value": 21.176000000000002}, {"type": "map_at_5", "value": 22.126}, {"type": "mrr_at_1", "value": 18.531}, {"type": "mrr_at_10", "value": 24.356}, {"type": "mrr_at_100", "value": 25.247000000000003}, {"type": "mrr_at_1000", "value": 25.338}, {"type": "mrr_at_20", "value": 24.858}, {"type": "mrr_at_3", "value": 22.542}, {"type": "mrr_at_5", "value": 23.508000000000003}, {"type": "ndcg_at_1", "value": 18.531}, {"type": "ndcg_at_10", "value": 26.51}, {"type": "ndcg_at_100", "value": 31.367}, {"type": "ndcg_at_1000", "value": 34.38}, {"type": "ndcg_at_20", "value": 28.328999999999997}, {"type": "ndcg_at_3", "value": 22.861}, {"type": "ndcg_at_5", "value": 24.456}, {"type": "precision_at_1", "value": 18.531}, {"type": "precision_at_10", "value": 4.147}, {"type": "precision_at_100", "value": 0.695}, {"type": "precision_at_1000", "value": 0.099}, {"type": "precision_at_20", "value": 2.492}, {"type": "precision_at_3", "value": 9.793000000000001}, {"type": "precision_at_5", "value": 6.825}, {"type": "recall_at_1", "value": 17.339}, {"type": "recall_at_10", "value": 36.010999999999996}, {"type": "recall_at_100", "value": 59.040000000000006}, {"type": "recall_at_1000", "value": 82.282}, {"type": "recall_at_20", "value": 43.04}, {"type": "recall_at_3", "value": 25.904}, {"type": "recall_at_5", "value": 29.837000000000003}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackMathematicaRetrieval", "type": "mteb/cqadupstack-mathematica", "config": "default", "split": "test", "revision": "90fceea13679c63fe563ded68f3b6f06e50061de"}, "metrics": [{"type": "map_at_1", "value": 9.251}, {"type": "map_at_10", "value": 14.848}, {"type": "map_at_100", "value": 15.940999999999999}, {"type": "map_at_1000", "value": 16.055}, {"type": "map_at_20", "value": 15.423}, {"type": "map_at_3", "value": 12.556999999999999}, {"type": "map_at_5", "value": 13.649000000000001}, {"type": "mrr_at_1", "value": 12.313}, {"type": "mrr_at_10", "value": 18.528}, {"type": "mrr_at_100", "value": 19.522000000000002}, {"type": "mrr_at_1000", "value": 19.601}, {"type": "mrr_at_20", "value": 19.107}, {"type": "mrr_at_3", "value": 16.231}, {"type": "mrr_at_5", "value": 17.294999999999998}, {"type": "ndcg_at_1", "value": 12.313}, {"type": "ndcg_at_10", "value": 19.303}, {"type": "ndcg_at_100", "value": 24.728}, {"type": "ndcg_at_1000", "value": 27.823999999999998}, {"type": "ndcg_at_20", "value": 21.318}, {"type": "ndcg_at_3", "value": 14.848}, {"type": "ndcg_at_5", "value": 16.509}, {"type": "precision_at_1", "value": 12.313}, {"type": "precision_at_10", "value": 4.03}, 
{"type": "precision_at_100", "value": 0.777}, {"type": "precision_at_1000", "value": 0.11800000000000001}, {"type": "precision_at_20", "value": 2.562}, {"type": "precision_at_3", "value": 7.546}, {"type": "precision_at_5", "value": 5.672}, {"type": "recall_at_1", "value": 9.251}, {"type": "recall_at_10", "value": 29.677999999999997}, {"type": "recall_at_100", "value": 53.586}, {"type": "recall_at_1000", "value": 76.181}, {"type": "recall_at_20", "value": 36.963}, {"type": "recall_at_3", "value": 17.072000000000003}, {"type": "recall_at_5", "value": 21.481}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackPhysicsRetrieval", "type": "mteb/cqadupstack-physics", "config": "default", "split": "test", "revision": "79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4"}, "metrics": [{"type": "map_at_1", "value": 21.135}, {"type": "map_at_10", "value": 29.431}, {"type": "map_at_100", "value": 30.662}, {"type": "map_at_1000", "value": 30.792}, {"type": "map_at_20", "value": 30.086000000000002}, {"type": "map_at_3", "value": 26.593}, {"type": "map_at_5", "value": 28.011999999999997}, {"type": "mrr_at_1", "value": 26.564}, {"type": "mrr_at_10", "value": 34.735}, {"type": "mrr_at_100", "value": 35.65}, {"type": "mrr_at_1000", "value": 35.711999999999996}, {"type": "mrr_at_20", "value": 35.286}, {"type": "mrr_at_3", "value": 32.002}, {"type": "mrr_at_5", "value": 33.527}, {"type": "ndcg_at_1", "value": 26.564}, {"type": "ndcg_at_10", "value": 35.108}, {"type": "ndcg_at_100", "value": 40.601}, {"type": "ndcg_at_1000", "value": 43.329}, {"type": "ndcg_at_20", "value": 37.192}, {"type": "ndcg_at_3", "value": 29.961}, {"type": "ndcg_at_5", "value": 32.131}, {"type": "precision_at_1", "value": 26.564}, {"type": "precision_at_10", "value": 6.564}, {"type": "precision_at_100", "value": 1.105}, {"type": "precision_at_1000", "value": 0.154}, {"type": "precision_at_20", "value": 3.941}, {"type": "precision_at_3", "value": 14.212}, {"type": "precision_at_5", "value": 10.337}, {"type": "recall_at_1", "value": 21.135}, {"type": "recall_at_10", "value": 47.242}, {"type": "recall_at_100", "value": 70.645}, {"type": "recall_at_1000", "value": 89.403}, {"type": "recall_at_20", "value": 54.663}, {"type": "recall_at_3", "value": 32.647}, {"type": "recall_at_5", "value": 38.122}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackProgrammersRetrieval", "type": "mteb/cqadupstack-programmers", "config": "default", "split": "test", "revision": "6184bc1440d2dbc7612be22b50686b8826d22b32"}, "metrics": [{"type": "map_at_1", "value": 16.86}, {"type": "map_at_10", "value": 23.477999999999998}, {"type": "map_at_100", "value": 24.68}, {"type": "map_at_1000", "value": 24.826999999999998}, {"type": "map_at_20", "value": 24.122}, {"type": "map_at_3", "value": 21.288999999999998}, {"type": "map_at_5", "value": 22.453}, {"type": "mrr_at_1", "value": 20.776}, {"type": "mrr_at_10", "value": 28.029}, {"type": "mrr_at_100", "value": 28.951}, {"type": "mrr_at_1000", "value": 29.038000000000004}, {"type": "mrr_at_20", "value": 28.546}, {"type": "mrr_at_3", "value": 25.818}, {"type": "mrr_at_5", "value": 26.994}, {"type": "ndcg_at_1", "value": 20.776}, {"type": "ndcg_at_10", "value": 28.152}, {"type": "ndcg_at_100", "value": 33.82}, {"type": "ndcg_at_1000", "value": 37.039}, {"type": "ndcg_at_20", "value": 30.238}, {"type": "ndcg_at_3", "value": 24.197}, {"type": "ndcg_at_5", "value": 25.861}, {"type": "precision_at_1", "value": 20.776}, {"type": "precision_at_10", "value": 5.297000000000001}, {"type": 
"precision_at_100", "value": 0.96}, {"type": "precision_at_1000", "value": 0.14200000000000002}, {"type": "precision_at_20", "value": 3.276}, {"type": "precision_at_3", "value": 11.606}, {"type": "precision_at_5", "value": 8.356}, {"type": "recall_at_1", "value": 16.86}, {"type": "recall_at_10", "value": 37.782}, {"type": "recall_at_100", "value": 62.67}, {"type": "recall_at_1000", "value": 85.03}, {"type": "recall_at_20", "value": 45.2}, {"type": "recall_at_3", "value": 26.506999999999998}, {"type": "recall_at_5", "value": 31.113000000000003}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackStatsRetrieval", "type": "mteb/cqadupstack-stats", "config": "default", "split": "test", "revision": "65ac3a16b8e91f9cee4c9828cc7c335575432a2a"}, "metrics": [{"type": "map_at_1", "value": 15.234}, {"type": "map_at_10", "value": 20.939}, {"type": "map_at_100", "value": 21.704}, {"type": "map_at_1000", "value": 21.804000000000002}, {"type": "map_at_20", "value": 21.311}, {"type": "map_at_3", "value": 18.972}, {"type": "map_at_5", "value": 19.929}, {"type": "mrr_at_1", "value": 17.485}, {"type": "mrr_at_10", "value": 23.267}, {"type": "mrr_at_100", "value": 23.967}, {"type": "mrr_at_1000", "value": 24.054000000000002}, {"type": "mrr_at_20", "value": 23.604}, {"type": "mrr_at_3", "value": 21.345}, {"type": "mrr_at_5", "value": 22.303}, {"type": "ndcg_at_1", "value": 17.485}, {"type": "ndcg_at_10", "value": 24.744}, {"type": "ndcg_at_100", "value": 28.801}, {"type": "ndcg_at_1000", "value": 31.619999999999997}, {"type": "ndcg_at_20", "value": 26.046000000000003}, {"type": "ndcg_at_3", "value": 20.862}, {"type": "ndcg_at_5", "value": 22.459}, {"type": "precision_at_1", "value": 17.485}, {"type": "precision_at_10", "value": 4.109999999999999}, {"type": "precision_at_100", "value": 0.676}, {"type": "precision_at_1000", "value": 0.098}, {"type": "precision_at_20", "value": 2.3619999999999997}, {"type": "precision_at_3", "value": 9.254}, {"type": "precision_at_5", "value": 6.503}, {"type": "recall_at_1", "value": 15.234}, {"type": "recall_at_10", "value": 34.48}, {"type": "recall_at_100", "value": 53.225}, {"type": "recall_at_1000", "value": 74.64699999999999}, {"type": "recall_at_20", "value": 39.421}, {"type": "recall_at_3", "value": 23.554}, {"type": "recall_at_5", "value": 27.662}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackTexRetrieval", "type": "mteb/cqadupstack-tex", "config": "default", "split": "test", "revision": "46989137a86843e03a6195de44b09deda022eec7"}, "metrics": [{"type": "map_at_1", "value": 9.564}, {"type": "map_at_10", "value": 13.869000000000002}, {"type": "map_at_100", "value": 14.728}, {"type": "map_at_1000", "value": 14.853}, {"type": "map_at_20", "value": 14.32}, {"type": "map_at_3", "value": 12.307}, {"type": "map_at_5", "value": 13.177}, {"type": "mrr_at_1", "value": 11.941}, {"type": "mrr_at_10", "value": 16.777}, {"type": "mrr_at_100", "value": 17.571}, {"type": "mrr_at_1000", "value": 17.663999999999998}, {"type": "mrr_at_20", "value": 17.203}, {"type": "mrr_at_3", "value": 15.067}, {"type": "mrr_at_5", "value": 16.003999999999998}, {"type": "ndcg_at_1", "value": 11.941}, {"type": "ndcg_at_10", "value": 17.111}, {"type": "ndcg_at_100", "value": 21.438}, {"type": "ndcg_at_1000", "value": 24.756}, {"type": "ndcg_at_20", "value": 18.616}, {"type": "ndcg_at_3", "value": 14.143}, {"type": "ndcg_at_5", "value": 15.501000000000001}, {"type": "precision_at_1", "value": 11.941}, {"type": "precision_at_10", "value": 3.304}, {"type": 
"precision_at_100", "value": 0.658}, {"type": "precision_at_1000", "value": 0.11100000000000002}, {"type": "precision_at_20", "value": 2.077}, {"type": "precision_at_3", "value": 6.882000000000001}, {"type": "precision_at_5", "value": 5.12}, {"type": "recall_at_1", "value": 9.564}, {"type": "recall_at_10", "value": 24.068}, {"type": "recall_at_100", "value": 43.759}, {"type": "recall_at_1000", "value": 68.101}, {"type": "recall_at_20", "value": 29.657}, {"type": "recall_at_3", "value": 15.68}, {"type": "recall_at_5", "value": 19.238}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackUnixRetrieval", "type": "mteb/cqadupstack-unix", "config": "default", "split": "test", "revision": "6c6430d3a6d36f8d2a829195bc5dc94d7e063e53"}, "metrics": [{"type": "map_at_1", "value": 16.171}, {"type": "map_at_10", "value": 22.142}, {"type": "map_at_100", "value": 23.261000000000003}, {"type": "map_at_1000", "value": 23.371}, {"type": "map_at_20", "value": 22.766000000000002}, {"type": "map_at_3", "value": 20.251}, {"type": "map_at_5", "value": 21.349}, {"type": "mrr_at_1", "value": 19.403000000000002}, {"type": "mrr_at_10", "value": 25.619999999999997}, {"type": "mrr_at_100", "value": 26.659}, {"type": "mrr_at_1000", "value": 26.735}, {"type": "mrr_at_20", "value": 26.212000000000003}, {"type": "mrr_at_3", "value": 23.694000000000003}, {"type": "mrr_at_5", "value": 24.781}, {"type": "ndcg_at_1", "value": 19.403000000000002}, {"type": "ndcg_at_10", "value": 26.104}, {"type": "ndcg_at_100", "value": 31.724000000000004}, {"type": "ndcg_at_1000", "value": 34.581}, {"type": "ndcg_at_20", "value": 28.231}, {"type": "ndcg_at_3", "value": 22.464000000000002}, {"type": "ndcg_at_5", "value": 24.233}, {"type": "precision_at_1", "value": 19.403000000000002}, {"type": "precision_at_10", "value": 4.422000000000001}, {"type": "precision_at_100", "value": 0.8170000000000001}, {"type": "precision_at_1000", "value": 0.11800000000000001}, {"type": "precision_at_20", "value": 2.78}, {"type": "precision_at_3", "value": 10.168000000000001}, {"type": "precision_at_5", "value": 7.295}, {"type": "recall_at_1", "value": 16.171}, {"type": "recall_at_10", "value": 34.899}, {"type": "recall_at_100", "value": 60.197}, {"type": "recall_at_1000", "value": 80.798}, {"type": "recall_at_20", "value": 42.591}, {"type": "recall_at_3", "value": 25.024}, {"type": "recall_at_5", "value": 29.42}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackWebmastersRetrieval", "type": "mteb/cqadupstack-webmasters", "config": "default", "split": "test", "revision": "160c094312a0e1facb97e55eeddb698c0abe3571"}, "metrics": [{"type": "map_at_1", "value": 16.412}, {"type": "map_at_10", "value": 23.138}, {"type": "map_at_100", "value": 24.46}, {"type": "map_at_1000", "value": 24.668}, {"type": "map_at_20", "value": 23.791}, {"type": "map_at_3", "value": 20.965}, {"type": "map_at_5", "value": 22.005}, {"type": "mrr_at_1", "value": 20.949}, {"type": "mrr_at_10", "value": 27.46}, {"type": "mrr_at_100", "value": 28.546}, {"type": "mrr_at_1000", "value": 28.619}, {"type": "mrr_at_20", "value": 28.038999999999998}, {"type": "mrr_at_3", "value": 25.461}, {"type": "mrr_at_5", "value": 26.528000000000002}, {"type": "ndcg_at_1", "value": 20.949}, {"type": "ndcg_at_10", "value": 27.919}, {"type": "ndcg_at_100", "value": 33.886}, {"type": "ndcg_at_1000", "value": 37.284}, {"type": "ndcg_at_20", "value": 29.876}, {"type": "ndcg_at_3", "value": 24.246000000000002}, {"type": "ndcg_at_5", "value": 25.607999999999997}, {"type": 
"precision_at_1", "value": 20.949}, {"type": "precision_at_10", "value": 5.534}, {"type": "precision_at_100", "value": 1.2409999999999999}, {"type": "precision_at_1000", "value": 0.22}, {"type": "precision_at_20", "value": 3.5180000000000002}, {"type": "precision_at_3", "value": 11.726}, {"type": "precision_at_5", "value": 8.498}, {"type": "recall_at_1", "value": 16.412}, {"type": "recall_at_10", "value": 37.012}, {"type": "recall_at_100", "value": 64.702}, {"type": "recall_at_1000", "value": 87.442}, {"type": "recall_at_20", "value": 44.797}, {"type": "recall_at_3", "value": 25.872}, {"type": "recall_at_5", "value": 29.732999999999997}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackWordpressRetrieval", "type": "mteb/cqadupstack-wordpress", "config": "default", "split": "test", "revision": "4ffe81d471b1924886b33c7567bfb200e9eec5c4"}, "metrics": [{"type": "map_at_1", "value": 11.158}, {"type": "map_at_10", "value": 15.809999999999999}, {"type": "map_at_100", "value": 16.821}, {"type": "map_at_1000", "value": 16.925}, {"type": "map_at_20", "value": 16.403000000000002}, {"type": "map_at_3", "value": 13.791999999999998}, {"type": "map_at_5", "value": 14.817}, {"type": "mrr_at_1", "value": 12.384}, {"type": "mrr_at_10", "value": 17.291999999999998}, {"type": "mrr_at_100", "value": 18.271}, {"type": "mrr_at_1000", "value": 18.360000000000003}, {"type": "mrr_at_20", "value": 17.854999999999997}, {"type": "mrr_at_3", "value": 15.096000000000002}, {"type": "mrr_at_5", "value": 16.214000000000002}, {"type": "ndcg_at_1", "value": 12.384}, {"type": "ndcg_at_10", "value": 19.250999999999998}, {"type": "ndcg_at_100", "value": 24.524}, {"type": "ndcg_at_1000", "value": 27.624}, {"type": "ndcg_at_20", "value": 21.387999999999998}, {"type": "ndcg_at_3", "value": 14.995}, {"type": "ndcg_at_5", "value": 16.861}, {"type": "precision_at_1", "value": 12.384}, {"type": "precision_at_10", "value": 3.29}, {"type": "precision_at_100", "value": 0.632}, {"type": "precision_at_1000", "value": 0.095}, {"type": "precision_at_20", "value": 2.1260000000000003}, {"type": "precision_at_3", "value": 6.47}, {"type": "precision_at_5", "value": 4.917}, {"type": "recall_at_1", "value": 11.158}, {"type": "recall_at_10", "value": 28.737000000000002}, {"type": "recall_at_100", "value": 53.400000000000006}, {"type": "recall_at_1000", "value": 77.509}, {"type": "recall_at_20", "value": 36.969}, {"type": "recall_at_3", "value": 17.197000000000003}, {"type": "recall_at_5", "value": 21.701}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB ClimateFEVER", "type": "mteb/climate-fever", "config": "default", "split": "test", "revision": "47f2ac6acb640fc46020b02a5b59fdda04d39380"}, "metrics": [{"type": "map_at_1", "value": 7.172000000000001}, {"type": "map_at_10", "value": 11.935}, {"type": "map_at_100", "value": 13.305}, {"type": "map_at_1000", "value": 13.517000000000001}, {"type": "map_at_20", "value": 12.589}, {"type": "map_at_3", "value": 9.9}, {"type": "map_at_5", "value": 10.839}, {"type": "mrr_at_1", "value": 15.895999999999999}, {"type": "mrr_at_10", "value": 24.215999999999998}, {"type": "mrr_at_100", "value": 25.418000000000003}, {"type": "mrr_at_1000", "value": 25.480000000000004}, {"type": "mrr_at_20", "value": 24.934}, {"type": "mrr_at_3", "value": 21.064}, {"type": "mrr_at_5", "value": 22.676}, {"type": "ndcg_at_1", "value": 15.895999999999999}, {"type": "ndcg_at_10", "value": 17.69}, {"type": "ndcg_at_100", "value": 24.232}, {"type": "ndcg_at_1000", "value": 28.405}, {"type": "ndcg_at_20", 
"value": 19.933999999999997}, {"type": "ndcg_at_3", "value": 13.761000000000001}, {"type": "ndcg_at_5", "value": 14.963000000000001}, {"type": "precision_at_1", "value": 15.895999999999999}, {"type": "precision_at_10", "value": 5.733}, {"type": "precision_at_100", "value": 1.266}, {"type": "precision_at_1000", "value": 0.203}, {"type": "precision_at_20", "value": 3.798}, {"type": "precision_at_3", "value": 10.076}, {"type": "precision_at_5", "value": 7.9479999999999995}, {"type": "recall_at_1", "value": 7.172000000000001}, {"type": "recall_at_10", "value": 22.149}, {"type": "recall_at_100", "value": 45.491}, {"type": "recall_at_1000", "value": 69.34}, {"type": "recall_at_20", "value": 28.634999999999998}, {"type": "recall_at_3", "value": 12.701}, {"type": "recall_at_5", "value": 15.952}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB DBPedia", "type": "mteb/dbpedia", "config": "default", "split": "test", "revision": "c0f706b76e590d620bd6618b3ca8efdd34e2d659"}, "metrics": [{"type": "map_at_1", "value": 7.101}, {"type": "map_at_10", "value": 15.125}, {"type": "map_at_100", "value": 20.026}, {"type": "map_at_1000", "value": 21.194}, {"type": "map_at_20", "value": 17.008000000000003}, {"type": "map_at_3", "value": 10.915999999999999}, {"type": "map_at_5", "value": 12.705}, {"type": "mrr_at_1", "value": 53.5}, {"type": "mrr_at_10", "value": 63.475}, {"type": "mrr_at_100", "value": 63.998}, {"type": "mrr_at_1000", "value": 64.019}, {"type": "mrr_at_20", "value": 63.800999999999995}, {"type": "mrr_at_3", "value": 62.041999999999994}, {"type": "mrr_at_5", "value": 62.678999999999995}, {"type": "ndcg_at_1", "value": 41.875}, {"type": "ndcg_at_10", "value": 32.967}, {"type": "ndcg_at_100", "value": 35.557}, {"type": "ndcg_at_1000", "value": 42.537000000000006}, {"type": "ndcg_at_20", "value": 31.930999999999997}, {"type": "ndcg_at_3", "value": 36.67}, {"type": "ndcg_at_5", "value": 34.474}, {"type": "precision_at_1", "value": 53.5}, {"type": "precision_at_10", "value": 27.0}, {"type": "precision_at_100", "value": 7.872999999999999}, {"type": "precision_at_1000", "value": 1.637}, {"type": "precision_at_20", "value": 19.487}, {"type": "precision_at_3", "value": 41.583}, {"type": "precision_at_5", "value": 34.699999999999996}, {"type": "recall_at_1", "value": 7.101}, {"type": "recall_at_10", "value": 20.408}, {"type": "recall_at_100", "value": 40.286}, {"type": "recall_at_1000", "value": 63.49399999999999}, {"type": "recall_at_20", "value": 25.478}, {"type": "recall_at_3", "value": 12.278}, {"type": "recall_at_5", "value": 15.392}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB EmotionClassification", "type": "mteb/emotion", "config": "default", "split": "test", "revision": "4f58c6b202a23cf9a4da393831edf4f9183cad37"}, "metrics": [{"type": "accuracy", "value": 44.79}, {"type": "f1", "value": 39.606429663804356}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB FEVER", "type": "mteb/fever", "config": "default", "split": "test", "revision": "bea83ef9e8fb933d90a2f1d5515737465d613e12"}, "metrics": [{"type": "map_at_1", "value": 27.898}, {"type": "map_at_10", "value": 39.315}, {"type": "map_at_100", "value": 40.219}, {"type": "map_at_1000", "value": 40.268}, {"type": "map_at_20", "value": 39.893}, {"type": "map_at_3", "value": 35.993}, {"type": "map_at_5", "value": 38.016}, {"type": "mrr_at_1", "value": 30.003}, {"type": "mrr_at_10", "value": 41.85}, {"type": "mrr_at_100", "value": 42.722}, {"type": "mrr_at_1000", "value": 42.760999999999996}, {"type": "mrr_at_20", 
"value": 42.419000000000004}, {"type": "mrr_at_3", "value": 38.451}, {"type": "mrr_at_5", "value": 40.547}, {"type": "ndcg_at_1", "value": 30.003}, {"type": "ndcg_at_10", "value": 45.907}, {"type": "ndcg_at_100", "value": 50.198}, {"type": "ndcg_at_1000", "value": 51.405}, {"type": "ndcg_at_20", "value": 47.97}, {"type": "ndcg_at_3", "value": 39.234}, {"type": "ndcg_at_5", "value": 42.844}, {"type": "precision_at_1", "value": 30.003}, {"type": "precision_at_10", "value": 7.0040000000000004}, {"type": "precision_at_100", "value": 0.9259999999999999}, {"type": "precision_at_1000", "value": 0.104}, {"type": "precision_at_20", "value": 3.9510000000000005}, {"type": "precision_at_3", "value": 16.647000000000002}, {"type": "precision_at_5", "value": 11.914}, {"type": "recall_at_1", "value": 27.898}, {"type": "recall_at_10", "value": 64.003}, {"type": "recall_at_100", "value": 83.42500000000001}, {"type": "recall_at_1000", "value": 92.448}, {"type": "recall_at_20", "value": 71.93}, {"type": "recall_at_3", "value": 46.12}, {"type": "recall_at_5", "value": 54.812000000000005}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB FiQA2018", "type": "mteb/fiqa", "config": "default", "split": "test", "revision": "27a168819829fe9bcd655c2df245fb19452e8e06"}, "metrics": [{"type": "map_at_1", "value": 10.282}, {"type": "map_at_10", "value": 16.141}, {"type": "map_at_100", "value": 17.634}, {"type": "map_at_1000", "value": 17.836}, {"type": "map_at_20", "value": 16.99}, {"type": "map_at_3", "value": 13.947000000000001}, {"type": "map_at_5", "value": 15.149000000000001}, {"type": "mrr_at_1", "value": 20.679}, {"type": "mrr_at_10", "value": 26.966}, {"type": "mrr_at_100", "value": 28.108}, {"type": "mrr_at_1000", "value": 28.183999999999997}, {"type": "mrr_at_20", "value": 27.672}, {"type": "mrr_at_3", "value": 24.743000000000002}, {"type": "mrr_at_5", "value": 25.916}, {"type": "ndcg_at_1", "value": 20.679}, {"type": "ndcg_at_10", "value": 21.291}, {"type": "ndcg_at_100", "value": 27.884999999999998}, {"type": "ndcg_at_1000", "value": 32.122}, {"type": "ndcg_at_20", "value": 23.898}, {"type": "ndcg_at_3", "value": 18.553}, {"type": "ndcg_at_5", "value": 19.468}, {"type": "precision_at_1", "value": 20.679}, {"type": "precision_at_10", "value": 6.019}, {"type": "precision_at_100", "value": 1.252}, {"type": "precision_at_1000", "value": 0.201}, {"type": "precision_at_20", "value": 4.0120000000000005}, {"type": "precision_at_3", "value": 12.243}, {"type": "precision_at_5", "value": 9.321}, {"type": "recall_at_1", "value": 10.282}, {"type": "recall_at_10", "value": 25.901999999999997}, {"type": "recall_at_100", "value": 50.956999999999994}, {"type": "recall_at_1000", "value": 76.935}, {"type": "recall_at_20", "value": 34.104}, {"type": "recall_at_3", "value": 16.973}, {"type": "recall_at_5", "value": 20.549999999999997}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB HotpotQA", "type": "mteb/hotpotqa", "config": "default", "split": "test", "revision": "ab518f4d6fcca38d87c25209f94beba119d02014"}, "metrics": [{"type": "map_at_1", "value": 30.567}, {"type": "map_at_10", "value": 42.314}, {"type": "map_at_100", "value": 43.205}, {"type": "map_at_1000", "value": 43.288}, {"type": "map_at_20", "value": 42.812}, {"type": "map_at_3", "value": 39.695}, {"type": "map_at_5", "value": 41.214}, {"type": "mrr_at_1", "value": 61.134}, {"type": "mrr_at_10", "value": 68.57600000000001}, {"type": "mrr_at_100", "value": 68.95599999999999}, {"type": "mrr_at_1000", "value": 68.97999999999999}, {"type": 
"mrr_at_20", "value": 68.818}, {"type": "mrr_at_3", "value": 66.99300000000001}, {"type": "mrr_at_5", "value": 67.919}, {"type": "ndcg_at_1", "value": 61.134}, {"type": "ndcg_at_10", "value": 51.518}, {"type": "ndcg_at_100", "value": 55.022000000000006}, {"type": "ndcg_at_1000", "value": 56.81699999999999}, {"type": "ndcg_at_20", "value": 52.893}, {"type": "ndcg_at_3", "value": 47.216}, {"type": "ndcg_at_5", "value": 49.413000000000004}, {"type": "precision_at_1", "value": 61.134}, {"type": "precision_at_10", "value": 10.729}, {"type": "precision_at_100", "value": 1.351}, {"type": "precision_at_1000", "value": 0.159}, {"type": "precision_at_20", "value": 5.8069999999999995}, {"type": "precision_at_3", "value": 29.336000000000002}, {"type": "precision_at_5", "value": 19.346}, {"type": "recall_at_1", "value": 30.567}, {"type": "recall_at_10", "value": 53.64600000000001}, {"type": "recall_at_100", "value": 67.562}, {"type": "recall_at_1000", "value": 79.521}, {"type": "recall_at_20", "value": 58.069}, {"type": "recall_at_3", "value": 44.004}, {"type": "recall_at_5", "value": 48.366}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB ImdbClassification", "type": "mteb/imdb", "config": "default", "split": "test", "revision": "3d86128a09e091d6018b6d26cad27f2739fc2db7"}, "metrics": [{"type": "accuracy", "value": 71.5272}, {"type": "ap", "value": 65.49215755861609}, {"type": "f1", "value": 71.4156268611186}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB MSMARCO", "type": "mteb/msmarco", "config": "default", "split": "dev", "revision": "c5a29a104738b98a9e76336939199e264163d4a0"}, "metrics": [{"type": "map_at_1", "value": 14.574000000000002}, {"type": "map_at_10", "value": 23.966}, {"type": "map_at_100", "value": 25.19}, {"type": "map_at_1000", "value": 25.266}, {"type": "map_at_20", "value": 24.668}, {"type": "map_at_3", "value": 20.815}, {"type": "map_at_5", "value": 22.576}, {"type": "mrr_at_1", "value": 14.957}, {"type": "mrr_at_10", "value": 24.413999999999998}, {"type": "mrr_at_100", "value": 25.616}, {"type": "mrr_at_1000", "value": 25.685999999999996}, {"type": "mrr_at_20", "value": 25.11}, {"type": "mrr_at_3", "value": 21.304000000000002}, {"type": "mrr_at_5", "value": 23.047}, {"type": "ndcg_at_1", "value": 14.957}, {"type": "ndcg_at_10", "value": 29.49}, {"type": "ndcg_at_100", "value": 35.734}, {"type": "ndcg_at_1000", "value": 37.785000000000004}, {"type": "ndcg_at_20", "value": 32.004}, {"type": "ndcg_at_3", "value": 23.006999999999998}, {"type": "ndcg_at_5", "value": 26.154}, {"type": "precision_at_1", "value": 14.957}, {"type": "precision_at_10", "value": 4.8500000000000005}, {"type": "precision_at_100", "value": 0.8009999999999999}, {"type": "precision_at_1000", "value": 0.098}, {"type": "precision_at_20", "value": 2.943}, {"type": "precision_at_3", "value": 9.962}, {"type": "precision_at_5", "value": 7.556}, {"type": "recall_at_1", "value": 14.574000000000002}, {"type": "recall_at_10", "value": 46.655}, {"type": "recall_at_100", "value": 76.26899999999999}, {"type": "recall_at_1000", "value": 92.303}, {"type": "recall_at_20", "value": 56.424}, {"type": "recall_at_3", "value": 28.874}, {"type": "recall_at_5", "value": 36.441}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MTOPDomainClassification (en)", "type": "mteb/mtop_domain", "config": "en", "split": "test", "revision": "d80d48c1eb48d3562165c59d59d0034df9fff0bf"}, "metrics": [{"type": "accuracy", "value": 90.78887368901049}, {"type": "f1", "value": 90.30465646125157}]}, {"task": 
{"type": "Classification"}, "dataset": {"name": "MTEB MTOPIntentClassification (en)", "type": "mteb/mtop_intent", "config": "en", "split": "test", "revision": "ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba"}, "metrics": [{"type": "accuracy", "value": 68.71865025079799}, {"type": "f1", "value": 50.7484789245504}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MassiveIntentClassification (en)", "type": "mteb/amazon_massive_intent", "config": "en", "split": "test", "revision": "31efe3c427b0bae9c22cbb560b8f15491cc6bed7"}, "metrics": [{"type": "accuracy", "value": 68.8399462004035}, {"type": "f1", "value": 66.66574227334513}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MassiveScenarioClassification (en)", "type": "mteb/amazon_massive_scenario", "config": "en", "split": "test", "revision": "7d571f92784cd94a019292a1f45445077d0ef634"}, "metrics": [{"type": "accuracy", "value": 73.74915938130464}, {"type": "f1", "value": 73.61179700374726}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB MedrxivClusteringP2P", "type": "mteb/medrxiv-clustering-p2p", "config": "default", "split": "test", "revision": "e7a26af6f3ae46b30dde8737f02c07b1505bcc73"}, "metrics": [{"type": "v_measure", "value": 30.3983428793953}, {"type": "v_measures", "value": [0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 
0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 
0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 
0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 
0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 
0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 
0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265, 0.2897998146059277, 0.2892991395982456, 0.2895468464510795, 0.294117690228455, 0.29987260303639823, 0.3247642769384547, 0.31050042169105724, 0.30994953770318107, 0.31969964495780845, 0.31228431272892265]}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB MedrxivClusteringS2S", "type": "mteb/medrxiv-clustering-s2s", "config": "default", "split": "test", "revision": "35191c8c0dca72d8ff3efcd72aa802307d469663"}, "metrics": [{"type": "v_measure", "value": 28.78917156239751}, {"type": "v_measures", "value": [0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 
0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 
0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 
0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 
0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 
0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997, 0.2778513235855922, 0.28421867096674863, 0.26686384263192103, 0.2891902768882141, 0.27793747293511695, 0.30323125759570635, 0.2807541398062003, 0.3122697317735093, 0.2933394111230221, 0.29326102893371997]}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB 
NFCorpus", "type": "mteb/nfcorpus", "config": "default", "split": "test", "revision": "ec0fa4fe99da2ff19ca1214b7966684033a58814"}, "metrics": [{"type": "map_at_1", "value": 3.726}, {"type": "map_at_10", "value": 8.604000000000001}, {"type": "map_at_100", "value": 10.95}, {"type": "map_at_1000", "value": 12.256}, {"type": "map_at_20", "value": 9.573}, {"type": "map_at_3", "value": 6.264}, {"type": "map_at_5", "value": 7.343}, {"type": "mrr_at_1", "value": 37.771}, {"type": "mrr_at_10", "value": 46.476}, {"type": "mrr_at_100", "value": 47.164}, {"type": "mrr_at_1000", "value": 47.213}, {"type": "mrr_at_20", "value": 46.792}, {"type": "mrr_at_3", "value": 44.272}, {"type": "mrr_at_5", "value": 45.728}, {"type": "ndcg_at_1", "value": 35.604}, {"type": "ndcg_at_10", "value": 26.778000000000002}, {"type": "ndcg_at_100", "value": 24.313000000000002}, {"type": "ndcg_at_1000", "value": 33.601}, {"type": "ndcg_at_20", "value": 24.788}, {"type": "ndcg_at_3", "value": 30.991999999999997}, {"type": "ndcg_at_5", "value": 28.9}, {"type": "precision_at_1", "value": 37.152}, {"type": "precision_at_10", "value": 19.875999999999998}, {"type": "precision_at_100", "value": 6.449000000000001}, {"type": "precision_at_1000", "value": 1.934}, {"type": "precision_at_20", "value": 14.721}, {"type": "precision_at_3", "value": 28.999000000000002}, {"type": "precision_at_5", "value": 24.582}, {"type": "recall_at_1", "value": 3.726}, {"type": "recall_at_10", "value": 12.529000000000002}, {"type": "recall_at_100", "value": 25.726}, {"type": "recall_at_1000", "value": 58.336}, {"type": "recall_at_20", "value": 16.028000000000002}, {"type": "recall_at_3", "value": 7.176}, {"type": "recall_at_5", "value": 9.511}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB NQ", "type": "mteb/nq", "config": "default", "split": "test", "revision": "b774495ed302d8c44a3a7ea25c90dbce03968f31"}, "metrics": [{"type": "map_at_1", "value": 15.110000000000001}, {"type": "map_at_10", "value": 25.983}, {"type": "map_at_100", "value": 27.332}, {"type": "map_at_1000", "value": 27.406999999999996}, {"type": "map_at_20", "value": 26.804}, {"type": "map_at_3", "value": 22.182}, {"type": "map_at_5", "value": 24.247}, {"type": "mrr_at_1", "value": 17.236}, {"type": "mrr_at_10", "value": 28.177999999999997}, {"type": "mrr_at_100", "value": 29.346}, {"type": "mrr_at_1000", "value": 29.401}, {"type": "mrr_at_20", "value": 28.906}, {"type": "mrr_at_3", "value": 24.593999999999998}, {"type": "mrr_at_5", "value": 26.540999999999997}, {"type": "ndcg_at_1", "value": 17.207}, {"type": "ndcg_at_10", "value": 32.603}, {"type": "ndcg_at_100", "value": 38.883}, {"type": "ndcg_at_1000", "value": 40.708}, {"type": "ndcg_at_20", "value": 35.397}, {"type": "ndcg_at_3", "value": 25.002999999999997}, {"type": "ndcg_at_5", "value": 28.572999999999997}, {"type": "precision_at_1", "value": 17.207}, {"type": "precision_at_10", "value": 5.985}, {"type": "precision_at_100", "value": 0.951}, {"type": "precision_at_1000", "value": 0.11299999999999999}, {"type": "precision_at_20", "value": 3.656}, {"type": "precision_at_3", "value": 11.848}, {"type": "precision_at_5", "value": 9.125}, {"type": "recall_at_1", "value": 15.110000000000001}, {"type": "recall_at_10", "value": 51.00900000000001}, {"type": "recall_at_100", "value": 79.193}, {"type": "recall_at_1000", "value": 92.828}, {"type": "recall_at_20", "value": 61.402}, {"type": "recall_at_3", "value": 30.791}, {"type": "recall_at_5", "value": 39.091}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB 
QuoraRetrieval", "type": "mteb/quora", "config": "default", "split": "test", "revision": "e4e08e0b7dbe3c8700f0daef558ff32256715259"}, "metrics": [{"type": "map_at_1", "value": 67.465}, {"type": "map_at_10", "value": 81.035}, {"type": "map_at_100", "value": 81.718}, {"type": "map_at_1000", "value": 81.742}, {"type": "map_at_20", "value": 81.486}, {"type": "map_at_3", "value": 77.972}, {"type": "map_at_5", "value": 79.903}, {"type": "mrr_at_1", "value": 77.64}, {"type": "mrr_at_10", "value": 84.584}, {"type": "mrr_at_100", "value": 84.722}, {"type": "mrr_at_1000", "value": 84.724}, {"type": "mrr_at_20", "value": 84.684}, {"type": "mrr_at_3", "value": 83.325}, {"type": "mrr_at_5", "value": 84.15899999999999}, {"type": "ndcg_at_1", "value": 77.66999999999999}, {"type": "ndcg_at_10", "value": 85.30499999999999}, {"type": "ndcg_at_100", "value": 86.834}, {"type": "ndcg_at_1000", "value": 87.033}, {"type": "ndcg_at_20", "value": 86.12100000000001}, {"type": "ndcg_at_3", "value": 81.974}, {"type": "ndcg_at_5", "value": 83.813}, {"type": "precision_at_1", "value": 77.66999999999999}, {"type": "precision_at_10", "value": 12.931000000000001}, {"type": "precision_at_100", "value": 1.5}, {"type": "precision_at_1000", "value": 0.156}, {"type": "precision_at_20", "value": 6.903}, {"type": "precision_at_3", "value": 35.730000000000004}, {"type": "precision_at_5", "value": 23.642}, {"type": "recall_at_1", "value": 67.465}, {"type": "recall_at_10", "value": 93.581}, {"type": "recall_at_100", "value": 98.91499999999999}, {"type": "recall_at_1000", "value": 99.90599999999999}, {"type": "recall_at_20", "value": 96.221}, {"type": "recall_at_3", "value": 84.071}, {"type": "recall_at_5", "value": 89.14999999999999}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB RedditClustering", "type": "mteb/reddit-clustering", "config": "default", "split": "test", "revision": "24640382cdbf8abc73003fb0fa6d111a705499eb"}, "metrics": [{"type": "v_measure", "value": 45.929215298244664}, {"type": "v_measures", "value": [0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 
0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 
0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 
0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 
0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 
0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 
0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 
0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 
0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 
0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 
0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 
0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 
0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 
0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 
0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467, 0.5005163734033015, 0.553109801970322, 0.41398508662376254, 0.42141314229941573, 0.4538781792482074, 0.4020501279564094, 0.47479152270449987, 0.4099927798506668, 0.4120557111594749, 0.4201880097400573, 0.42440122539823744, 0.4946100035438165, 0.4781440076390112, 0.4670832635547185, 0.5771247406191055, 0.411666253943506, 0.47763075003515215, 0.5272837549236378, 0.4452503211520816, 0.41778723041123167, 0.40422491239768005, 0.430149995306435, 0.5936566115456993, 0.4401734496854905, 0.43113656944924467]}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB RedditClusteringP2P", "type": "mteb/reddit-clustering-p2p", "config": "default", "split": "test", "revision": "385e3cb46b4cfa89021f56c4380204149d0efe33"}, "metrics": [{"type": "v_measure", "value": 51.444598402601414}, {"type": "v_measures", "value": [0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 
0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 
0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 
0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 
0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 
0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 
0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664, 0.5651003661101165, 0.5711537036766935, 0.5987455713312818, 0.31409385867326506, 0.5578455339174134, 0.4983473414145347, 0.2540544357081523, 0.6081787161021057, 0.5498858360771133, 0.6270544772494664]}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB SCIDOCS", "type": "mteb/scidocs", "config": "default", "split": "test", "revision": "f8c2fcf00f625baaa80f62ec5bd9e1fff3b8ae88"}, "metrics": [{"type": "map_at_1", "value": 3.5929999999999995}, {"type": "map_at_10", "value": 8.753}, {"type": "map_at_100", "value": 10.349}, {"type": "map_at_1000", "value": 10.624}, {"type": "map_at_20", "value": 9.553}, {"type": "map_at_3", "value": 6.2700000000000005}, {"type": "map_at_5", "value": 7.5329999999999995}, {"type": "mrr_at_1", "value": 17.7}, {"type": "mrr_at_10", "value": 27.167}, {"type": "mrr_at_100", "value": 28.351}, {"type": "mrr_at_1000", "value": 28.418}, {"type": "mrr_at_20", "value": 27.819}, {"type": "mrr_at_3", "value": 24.282999999999998}, {"type": "mrr_at_5", "value": 26.073}, {"type": "ndcg_at_1", "value": 17.7}, {"type": "ndcg_at_10", "value": 15.312000000000001}, {"type": "ndcg_at_100", "value": 22.178}, {"type": "ndcg_at_1000", "value": 27.575}, {"type": "ndcg_at_20", "value": 17.648}, {"type": "ndcg_at_3", "value": 14.41}, {"type": "ndcg_at_5", "value": 12.774}, {"type": "precision_at_1", "value": 17.7}, {"type": "precision_at_10", "value": 7.93}, {"type": "precision_at_100", "value": 1.7930000000000001}, {"type": "precision_at_1000", "value": 0.31}, {"type": "precision_at_20", "value": 5.315}, {"type": "precision_at_3", "value": 
13.367}, {"type": "precision_at_5", "value": 11.26}, {"type": "recall_at_1", "value": 3.5929999999999995}, {"type": "recall_at_10", "value": 16.088}, {"type": "recall_at_100", "value": 36.39}, {"type": "recall_at_1000", "value": 62.932}, {"type": "recall_at_20", "value": 21.562}, {"type": "recall_at_3", "value": 8.123}, {"type": "recall_at_5", "value": 11.393}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB SICK-R", "type": "mteb/sickr-sts", "config": "default", "split": "test", "revision": "20a6d6f312dd54037fe07a32d58e5e168867909d"}, "metrics": [{"type": "cos_sim_pearson", "value": 82.6885494958054}, {"type": "cos_sim_spearman", "value": 76.0433546110243}, {"type": "euclidean_pearson", "value": 79.85820435751087}, {"type": "euclidean_spearman", "value": 75.9326257444857}, {"type": "manhattan_pearson", "value": 79.6973024858654}, {"type": "manhattan_spearman", "value": 75.71084698490509}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS12", "type": "mteb/sts12-sts", "config": "default", "split": "test", "revision": "a0d554a64d88156834ff5ae9920b964011b16384"}, "metrics": [{"type": "cos_sim_pearson", "value": 77.34659218404506}, {"type": "cos_sim_spearman", "value": 69.49541146727839}, {"type": "euclidean_pearson", "value": 74.80982564474151}, {"type": "euclidean_spearman", "value": 70.04102091813081}, {"type": "manhattan_pearson", "value": 75.00200126757426}, {"type": "manhattan_spearman", "value": 70.22802660355588}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS13", "type": "mteb/sts13-sts", "config": "default", "split": "test", "revision": "7e90230a92c190f1bf69ae9002b8cea547a64cca"}, "metrics": [{"type": "cos_sim_pearson", "value": 79.91444494464905}, {"type": "cos_sim_spearman", "value": 80.96085686108583}, {"type": "euclidean_pearson", "value": 80.5915387592164}, {"type": "euclidean_spearman", "value": 80.8861855866439}, {"type": "manhattan_pearson", "value": 80.46881359994653}, {"type": "manhattan_spearman", "value": 80.80230339264102}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS14", "type": "mteb/sts14-sts", "config": "default", "split": "test", "revision": "6031580fec1f6af667f0bd2da0a551cf4f0b2375"}, "metrics": [{"type": "cos_sim_pearson", "value": 80.81974904249208}, {"type": "cos_sim_spearman", "value": 77.08348207580887}, {"type": "euclidean_pearson", "value": 80.13431221409199}, {"type": "euclidean_spearman", "value": 77.31778188790902}, {"type": "manhattan_pearson", "value": 80.05343415464556}, {"type": "manhattan_spearman", "value": 77.26095229151665}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS15", "type": "mteb/sts15-sts", "config": "default", "split": "test", "revision": "ae752c7c21bf194d8b67fd573edf7ae58183cbe3"}, "metrics": [{"type": "cos_sim_pearson", "value": 84.37398871508579}, {"type": "cos_sim_spearman", "value": 85.41548418250477}, {"type": "euclidean_pearson", "value": 85.18569982361353}, {"type": "euclidean_spearman", "value": 85.73446512176643}, {"type": "manhattan_pearson", "value": 85.1016252976206}, {"type": "manhattan_spearman", "value": 85.66092136939069}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS16", "type": "mteb/sts16-sts", "config": "default", "split": "test", "revision": "4d8694f8f0e0100860b497b999b3dbed754a0513"}, "metrics": [{"type": "cos_sim_pearson", "value": 80.59702638640928}, {"type": "cos_sim_spearman", "value": 82.29583005583622}, {"type": "euclidean_pearson", "value": 81.83307796549182}, {"type": "euclidean_spearman", "value": 82.39554204652183}, {"type": 
"manhattan_pearson", "value": 81.78282737393326}, {"type": "manhattan_spearman", "value": 82.34235304571907}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS17 (en-en)", "type": "mteb/sts17-crosslingual-sts", "config": "en-en", "split": "test", "revision": "af5e6fb845001ecf41f4c1e033ce921939a2a68d"}, "metrics": [{"type": "cos_sim_pearson", "value": 86.89190122908971}, {"type": "cos_sim_spearman", "value": 88.03461344591356}, {"type": "euclidean_pearson", "value": 87.81999485969313}, {"type": "euclidean_spearman", "value": 88.07040076481854}, {"type": "manhattan_pearson", "value": 87.53382294293554}, {"type": "manhattan_spearman", "value": 87.76615089464353}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS22 (en)", "type": "mteb/sts22-crosslingual-sts", "config": "en", "split": "test", "revision": "eea2b4fe26a775864c896887d910b76a8098ad3f"}, "metrics": [{"type": "cos_sim_pearson", "value": 57.97869820676485}, {"type": "cos_sim_spearman", "value": 64.12171377270657}, {"type": "euclidean_pearson", "value": 60.9601725696545}, {"type": "euclidean_spearman", "value": 63.48982922146721}, {"type": "manhattan_pearson", "value": 61.37553142926566}, {"type": "manhattan_spearman", "value": 63.759462595791796}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STSBenchmark", "type": "mteb/stsbenchmark-sts", "config": "default", "split": "test", "revision": "b0fddb56ed78048fa8b90373c8a3cfc37b684831"}, "metrics": [{"type": "cos_sim_pearson", "value": 83.14812517797631}, {"type": "cos_sim_spearman", "value": 83.33681512924129}, {"type": "euclidean_pearson", "value": 84.0552689078266}, {"type": "euclidean_spearman", "value": 83.45075258664495}, {"type": "manhattan_pearson", "value": 83.94309504683835}, {"type": "manhattan_spearman", "value": 83.37311472277489}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB SciDocsRR", "type": "mteb/scidocs-reranking", "config": "default", "split": "test", "revision": "d3c5e1fc0b855ab6097bf1cda04dd73947d7caab"}, "metrics": [{"type": "map", "value": 77.89395841192561}, {"type": "mrr", "value": 93.39039319431475}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB SciFact", "type": "mteb/scifact", "config": "default", "split": "test", "revision": "0228b52cf27578f30900b9e5271d331663a030d7"}, "metrics": [{"type": "map_at_1", "value": 47.65}, {"type": "map_at_10", "value": 58.287}, {"type": "map_at_100", "value": 58.965999999999994}, {"type": "map_at_1000", "value": 58.998}, {"type": "map_at_20", "value": 58.709}, {"type": "map_at_3", "value": 55.272}, {"type": "map_at_5", "value": 57.135999999999996}, {"type": "mrr_at_1", "value": 50.333000000000006}, {"type": "mrr_at_10", "value": 59.589999999999996}, {"type": "mrr_at_100", "value": 60.129999999999995}, {"type": "mrr_at_1000", "value": 60.162000000000006}, {"type": "mrr_at_20", "value": 59.95700000000001}, {"type": "mrr_at_3", "value": 57.389}, {"type": "mrr_at_5", "value": 58.656}, {"type": "ndcg_at_1", "value": 50.333000000000006}, {"type": "ndcg_at_10", "value": 63.232}, {"type": "ndcg_at_100", "value": 66.213}, {"type": "ndcg_at_1000", "value": 67.203}, {"type": "ndcg_at_20", "value": 64.63499999999999}, {"type": "ndcg_at_3", "value": 58.163}, {"type": "ndcg_at_5", "value": 60.785999999999994}, {"type": "precision_at_1", "value": 50.333000000000006}, {"type": "precision_at_10", "value": 8.633000000000001}, {"type": "precision_at_100", "value": 1.03}, {"type": "precision_at_1000", "value": 0.11100000000000002}, {"type": "precision_at_20", "value": 4.633}, {"type": 
"precision_at_3", "value": 22.889}, {"type": "precision_at_5", "value": 15.4}, {"type": "recall_at_1", "value": 47.65}, {"type": "recall_at_10", "value": 76.95}, {"type": "recall_at_100", "value": 90.333}, {"type": "recall_at_1000", "value": 98.333}, {"type": "recall_at_20", "value": 82.267}, {"type": "recall_at_3", "value": 63.632999999999996}, {"type": "recall_at_5", "value": 69.978}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB SprintDuplicateQuestions", "type": "mteb/sprintduplicatequestions-pairclassification", "config": "default", "split": "test", "revision": "d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46"}, "metrics": [{"type": "cos_sim_accuracy", "value": 99.82277227722773}, {"type": "cos_sim_ap", "value": 95.3535743677476}, {"type": "cos_sim_f1", "value": 91.00050276520865}, {"type": "cos_sim_precision", "value": 91.50657229524772}, {"type": "cos_sim_recall", "value": 90.5}, {"type": "dot_accuracy", "value": 99.73267326732673}, {"type": "dot_ap", "value": 92.1266370356305}, {"type": "dot_f1", "value": 86.13810741687979}, {"type": "dot_precision", "value": 88.1675392670157}, {"type": "dot_recall", "value": 84.2}, {"type": "euclidean_accuracy", "value": 99.82277227722773}, {"type": "euclidean_ap", "value": 95.24537694377634}, {"type": "euclidean_f1", "value": 90.91831557584982}, {"type": "euclidean_precision", "value": 92.27600411946447}, {"type": "euclidean_recall", "value": 89.60000000000001}, {"type": "manhattan_accuracy", "value": 99.81881188118813}, {"type": "manhattan_ap", "value": 95.30188096008806}, {"type": "manhattan_f1", "value": 90.83625438157236}, {"type": "manhattan_precision", "value": 90.97291875626881}, {"type": "manhattan_recall", "value": 90.7}, {"type": "max_accuracy", "value": 99.82277227722773}, {"type": "max_ap", "value": 95.3535743677476}, {"type": "max_f1", "value": 91.00050276520865}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB StackExchangeClustering", "type": "mteb/stackexchange-clustering", "config": "default", "split": "test", "revision": "6cbc1f7b2bc0622f2e39d2c77fa502909748c259"}, "metrics": [{"type": "v_measure", "value": 52.18146042107239}, {"type": "v_measures", "value": [0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 
0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377,
0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 
0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 
0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 
0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 
0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 
0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 
0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 
0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 
0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377, 0.5364875187827791, 0.5679299303873145, 0.44106298408975836, 0.5406958464125032, 0.4979264528781713, 0.4762228276272928, 0.4777826734534763, 0.5777786652028727, 0.5461510519376557, 0.5168553556755343, 0.577879257543513, 0.5845565212560989, 0.6151603528753354, 0.5719681846462936, 0.49266002482795135, 0.4992519351578576, 0.5201585621226349, 0.49488624259982555, 0.5026045640513042, 0.4781310026025221, 0.520040341596719, 0.4854477150657903, 0.48039481107512233, 0.5294089599539328, 0.5139233234458377]}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB StackExchangeClusteringP2P", "type": "mteb/stackexchange-clustering-p2p", "config": "default", "split": "test", "revision": "815ca46b2622cec33ccafc3735d572c266efdb44"}, "metrics": [{"type": "v_measure", "value": 30.666751785479224}, {"type": "v_measures", "value": [0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 
0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 
0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 
0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 
0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 
0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 
0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856, 0.2961435813128549, 0.29377421770311174, 0.2920979816328222, 0.28868614709214024, 0.28703664876586243, 0.33192718718065084, 0.3159774213288751, 0.31901923873086113, 0.32546250570545476, 0.31655024909528856]}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB StackOverflowDupQuestions", "type": "mteb/stackoverflowdupquestions-reranking", "config": "default", "split": "test", "revision": "e185fbe320c72810689fc5848eb6114e1ef5ec69"}, "metrics": [{"type": "map", "value": 46.78149995864765}, {"type": "mrr", "value": 47.45282393260334}]}, {"task": {"type": "Summarization"}, "dataset": {"name": "MTEB SummEval", "type": "mteb/summeval", "config": "default", "split": "test", "revision": "cda12ad7615edc362dbf25a00fdd61d3b1eaf93c"}, "metrics": [{"type": "cos_sim_pearson", "value": 31.202698233290022}, {"type": "cos_sim_spearman", "value": 30.971936219818662}, {"type": "dot_pearson", "value": 25.486069760264634}, {"type": "dot_spearman", "value": 25.811060638581246}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB TRECCOVID", "type": "mteb/trec-covid", "config": "default", "split": "test", "revision": "bb9466bac8153a0349341eb1b22e06409e78ef4e"}, "metrics": [{"type": "map_at_1", "value": 0.16999999999999998}, {"type": "map_at_10", "value": 0.943}, {"type": "map_at_100", "value": 5.0200000000000005}, {"type": 
"map_at_1000", "value": 13.855}, {"type": "map_at_20", "value": 1.609}, {"type": "map_at_3", "value": 0.384}, {"type": "map_at_5", "value": 0.5660000000000001}, {"type": "mrr_at_1", "value": 68.0}, {"type": "mrr_at_10", "value": 77.983}, {"type": "mrr_at_100", "value": 78.16499999999999}, {"type": "mrr_at_1000", "value": 78.16499999999999}, {"type": "mrr_at_20", "value": 78.16499999999999}, {"type": "mrr_at_3", "value": 75.667}, {"type": "mrr_at_5", "value": 77.067}, {"type": "ndcg_at_1", "value": 62.0}, {"type": "ndcg_at_10", "value": 47.772999999999996}, {"type": "ndcg_at_100", "value": 36.15}, {"type": "ndcg_at_1000", "value": 36.071}, {"type": "ndcg_at_20", "value": 44.641}, {"type": "ndcg_at_3", "value": 52.608999999999995}, {"type": "ndcg_at_5", "value": 50.397999999999996}, {"type": "precision_at_1", "value": 68.0}, {"type": "precision_at_10", "value": 50.8}, {"type": "precision_at_100", "value": 37.62}, {"type": "precision_at_1000", "value": 16.97}, {"type": "precision_at_20", "value": 47.099999999999994}, {"type": "precision_at_3", "value": 56.667}, {"type": "precision_at_5", "value": 54.0}, {"type": "recall_at_1", "value": 0.16999999999999998}, {"type": "recall_at_10", "value": 1.2349999999999999}, {"type": "recall_at_100", "value": 8.666}, {"type": "recall_at_1000", "value": 35.326}, {"type": "recall_at_20", "value": 2.276}, {"type": "recall_at_3", "value": 0.428}, {"type": "recall_at_5", "value": 0.672}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB Touche2020", "type": "mteb/touche2020", "config": "default", "split": "test", "revision": "a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f"}, "metrics": [{"type": "map_at_1", "value": 1.897}, {"type": "map_at_10", "value": 6.034}, {"type": "map_at_100", "value": 10.475}, {"type": "map_at_1000", "value": 11.95}, {"type": "map_at_20", "value": 8.149000000000001}, {"type": "map_at_3", "value": 2.8449999999999998}, {"type": "map_at_5", "value": 3.972}, {"type": "mrr_at_1", "value": 24.490000000000002}, {"type": "mrr_at_10", "value": 33.751}, {"type": "mrr_at_100", "value": 35.544}, {"type": "mrr_at_1000", "value": 35.544}, {"type": "mrr_at_20", "value": 34.926}, {"type": "mrr_at_3", "value": 29.252}, {"type": "mrr_at_5", "value": 31.905}, {"type": "ndcg_at_1", "value": 22.448999999999998}, {"type": "ndcg_at_10", "value": 16.303}, {"type": "ndcg_at_100", "value": 27.165}, {"type": "ndcg_at_1000", "value": 39.736}, {"type": "ndcg_at_20", "value": 18.340999999999998}, {"type": "ndcg_at_3", "value": 15.137999999999998}, {"type": "ndcg_at_5", "value": 16.332}, {"type": "precision_at_1", "value": 24.490000000000002}, {"type": "precision_at_10", "value": 15.714}, {"type": "precision_at_100", "value": 6.184}, {"type": "precision_at_1000", "value": 1.439}, {"type": "precision_at_20", "value": 13.163}, {"type": "precision_at_3", "value": 15.645999999999999}, {"type": "precision_at_5", "value": 17.551}, {"type": "recall_at_1", "value": 1.897}, {"type": "recall_at_10", "value": 11.938}, {"type": "recall_at_100", "value": 39.249}, {"type": "recall_at_1000", "value": 78.121}, {"type": "recall_at_20", "value": 19.244}, {"type": "recall_at_3", "value": 3.5409999999999995}, {"type": "recall_at_5", "value": 6.297999999999999}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB ToxicConversationsClassification", "type": "mteb/toxic_conversations_50k", "config": "default", "split": "test", "revision": "edfaf9da55d3dd50d43143d90c1ac476895ae6de"}, "metrics": [{"type": "accuracy", "value": 66.2939453125}, {"type": "ap", "value": 
11.764275936169392}, {"type": "f1", "value": 50.50689429240701}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB TweetSentimentExtractionClassification", "type": "mteb/tweet_sentiment_extraction", "config": "default", "split": "test", "revision": "d604517c81ca91fe16a244d1248fc021f9ecee7a"}, "metrics": [{"type": "accuracy", "value": 59.49066213921902}, {"type": "f1", "value": 59.85044985699777}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB TwentyNewsgroupsClustering", "type": "mteb/twentynewsgroups-clustering", "config": "default", "split": "test", "revision": "6125ec4e24fa026cec8a478383ee943acfbd5449"}, "metrics": [{"type": "v_measure", "value": 39.44109250212289}, {"type": "v_measures", "value": [0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 
0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 
0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 
0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 
0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 
0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783, 0.40669764182281876, 0.4138730431378403, 0.3900030656920992, 0.4129323940635477, 0.3817333080350274, 0.40499040520658186, 0.36911177861804156, 0.4101285395437541, 0.37178970000889994, 0.3828493740836783]}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB TwitterSemEval2015", "type": "mteb/twittersemeval2015-pairclassification", "config": "default", 
"split": "test", "revision": "70970daeab8776df92f5ea462b6173c0b46fd2d1"}, "metrics": [{"type": "cos_sim_accuracy", "value": 84.25821064552662}, {"type": "cos_sim_ap", "value": 67.96785265119063}, {"type": "cos_sim_f1", "value": 65.0070788107598}, {"type": "cos_sim_precision", "value": 58.792146820315835}, {"type": "cos_sim_recall", "value": 72.69129287598945}, {"type": "dot_accuracy", "value": 81.47463789712106}, {"type": "dot_ap", "value": 58.234902049577684}, {"type": "dot_f1", "value": 56.73442037078401}, {"type": "dot_precision", "value": 49.18667699457785}, {"type": "dot_recall", "value": 67.01846965699208}, {"type": "euclidean_accuracy", "value": 84.30589497526375}, {"type": "euclidean_ap", "value": 68.07824251821404}, {"type": "euclidean_f1", "value": 65.09073543457498}, {"type": "euclidean_precision", "value": 59.44177932839075}, {"type": "euclidean_recall", "value": 71.92612137203166}, {"type": "manhattan_accuracy", "value": 84.24032902187518}, {"type": "manhattan_ap", "value": 67.76838044141897}, {"type": "manhattan_f1", "value": 64.75698520779525}, {"type": "manhattan_precision", "value": 58.333333333333336}, {"type": "manhattan_recall", "value": 72.77044854881267}, {"type": "max_accuracy", "value": 84.30589497526375}, {"type": "max_ap", "value": 68.07824251821404}, {"type": "max_f1", "value": 65.09073543457498}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB TwitterURLCorpus", "type": "mteb/twitterurlcorpus-pairclassification", "config": "default", "split": "test", "revision": "8b6510b0b1fa4e4c4f879467980e9be563ec1cdf"}, "metrics": [{"type": "cos_sim_accuracy", "value": 88.2951061435169}, {"type": "cos_sim_ap", "value": 84.74905878045149}, {"type": "cos_sim_f1", "value": 77.01659871869538}, {"type": "cos_sim_precision", "value": 73.0392156862745}, {"type": "cos_sim_recall", "value": 81.45210963966738}, {"type": "dot_accuracy", "value": 86.37598478674273}, {"type": "dot_ap", "value": 79.17253140971533}, {"type": "dot_f1", "value": 73.19411657889958}, {"type": "dot_precision", "value": 69.27201484842236}, {"type": "dot_recall", "value": 77.58700338774254}, {"type": "euclidean_accuracy", "value": 88.29122521054062}, {"type": "euclidean_ap", "value": 84.64901724668165}, {"type": "euclidean_f1", "value": 76.99685189252507}, {"type": "euclidean_precision", "value": 73.39148639218422}, {"type": "euclidean_recall", "value": 80.97474591931014}, {"type": "manhattan_accuracy", "value": 88.29316567702877}, {"type": "manhattan_ap", "value": 84.5869003947086}, {"type": "manhattan_f1", "value": 76.9094138543517}, {"type": "manhattan_precision", "value": 74.03818751781134}, {"type": "manhattan_recall", "value": 80.01231906375116}, {"type": "max_accuracy", "value": 88.2951061435169}, {"type": "max_ap", "value": 84.74905878045149}, {"type": "max_f1", "value": 77.01659871869538}]}]}]}
Mihaiii/Bulbasaur
null
[ "sentence-transformers", "onnx", "safetensors", "bert", "feature-extraction", "sentence-similarity", "gte", "mteb", "dataset:Mihaiii/qa-assistant", "license:mit", "model-index", "endpoints_compatible", "region:us" ]
null
2024-04-27T09:53:29+00:00
[]
[]
TAGS #sentence-transformers #onnx #safetensors #bert #feature-extraction #sentence-similarity #gte #mteb #dataset-Mihaiii/qa-assistant #license-mit #model-index #endpoints_compatible #region-us
# Bulbasaur This is a distillation of gte-tiny trained using qa-assistant. ## Intended purpose <span style="color:blue">This model is designed for use in semantic-autocomplete (click here for demo).</span> ## Usage (Sentence-Transformers) (same as gte-tiny) Using this model is straightforward once you have sentence-transformers installed; you can then encode sentences directly, as in the sketch below. ## Usage (HuggingFace Transformers) (same as gte-tiny) Without sentence-transformers, you can use the model by passing your input through the transformer model and then applying the right pooling operation on top of the contextualized word embeddings. ### Limitation (same as gte-small) This model exclusively caters to English texts, and any lengthy texts will be truncated to a maximum of 512 tokens.
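The snippets below are a minimal sketch of both usage routes. They assume the published model id `Mihaiii/Bulbasaur`, mean pooling as the pooling operation (the convention for the gte family), and placeholder input sentences.

```python
from sentence_transformers import SentenceTransformer
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

sentences = ["This is an example sentence", "Each sentence is converted"]

# Route 1: sentence-transformers handles tokenization and pooling for you.
st_model = SentenceTransformer("Mihaiii/Bulbasaur")
st_embeddings = st_model.encode(sentences)

# Route 2: plain transformers, with mean pooling over the token embeddings.
def mean_pooling(model_output, attention_mask):
    token_embeddings = model_output[0]  # first element holds all token embeddings
    mask = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
    return torch.sum(token_embeddings * mask, 1) / torch.clamp(mask.sum(1), min=1e-9)

tokenizer = AutoTokenizer.from_pretrained("Mihaiii/Bulbasaur")
model = AutoModel.from_pretrained("Mihaiii/Bulbasaur")

encoded = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    model_output = model(**encoded)
hf_embeddings = F.normalize(mean_pooling(model_output, encoded["attention_mask"]), p=2, dim=1)

print(st_embeddings.shape, hf_embeddings.shape)
```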
[ "# Bulbasaur\n\nThis is a distill of gte-tiny trained using qa-assistant.", "## Intended purpose\n\n<span style=\"color:blue\">This model is designed for use in semantic-autocomplete (click here for demo).</span>", "## Usage (Sentence-Transformers) (same as gte-tiny)\n\nUsing this model becomes easy when you have sentence-transformers installed:\n\n\n\nThen you can use the model like this:", "## Usage (HuggingFace Transformers) (same as gte-tiny)\nWithout sentence-transformers, you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling-operation on-top of the contextualized word embeddings.", "### Limitation (same as gte-small)\nThis model exclusively caters to English texts, and any lengthy texts will be truncated to a maximum of 512 tokens." ]
[ "TAGS\n#sentence-transformers #onnx #safetensors #bert #feature-extraction #sentence-similarity #gte #mteb #dataset-Mihaiii/qa-assistant #license-mit #model-index #endpoints_compatible #region-us \n", "# Bulbasaur\n\nThis is a distill of gte-tiny trained using qa-assistant.", "## Intended purpose\n\n<span style=\"color:blue\">This model is designed for use in semantic-autocomplete (click here for demo).</span>", "## Usage (Sentence-Transformers) (same as gte-tiny)\n\nUsing this model becomes easy when you have sentence-transformers installed:\n\n\n\nThen you can use the model like this:", "## Usage (HuggingFace Transformers) (same as gte-tiny)\nWithout sentence-transformers, you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling-operation on-top of the contextualized word embeddings.", "### Limitation (same as gte-small)\nThis model exclusively caters to English texts, and any lengthy texts will be truncated to a maximum of 512 tokens." ]
null
null
# umiyuki-Japanese-Chat-Umievo-itr001-7b-gguf This is a gguf-format conversion of [Japanese-Chat-Umievo-itr001-7b, published by umiyuki](https://huggingface.co/umiyuki/Japanese-Chat-Umievo-itr001-7b). The imatrix data was created using [TFMC/imatrix-dataset-for-japanese-llm](https://huggingface.co/datasets/TFMC/imatrix-dataset-for-japanese-llm). ## Usage ``` git clone https://github.com/ggerganov/llama.cpp.git cd llama.cpp make -j ./main -m 'umiyuki-Japanese-Chat-Umievo-itr001-7b-Q4_0.gguf' -p "[INST] 今晩の夕食のレシピを教えて [/INST] " -n 128 ```
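As an alternative to the llama.cpp CLI above, the same quantized file can be used from Python. This is a minimal sketch assuming the llama-cpp-python bindings are installed and that `umiyuki-Japanese-Chat-Umievo-itr001-7b-Q4_0.gguf` has been downloaded locally; the prompt (asking for tonight's dinner recipe) mirrors the CLI example.

```python
from llama_cpp import Llama

# Load the locally downloaded Q4_0 quantization of the gguf conversion.
llm = Llama(model_path="umiyuki-Japanese-Chat-Umievo-itr001-7b-Q4_0.gguf", n_ctx=2048)

# Same [INST] ... [/INST] prompt format as the CLI example above.
output = llm("[INST] 今晩の夕食のレシピを教えて [/INST] ", max_tokens=128)
print(output["choices"][0]["text"])
```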
{"language": ["en", "ja"], "license": "apache-2.0", "datasets": ["TFMC/imatrix-dataset-for-japanese-llm"]}
mmnga/umiyuki-Japanese-Chat-Umievo-itr001-7b-gguf
null
[ "gguf", "en", "ja", "dataset:TFMC/imatrix-dataset-for-japanese-llm", "license:apache-2.0", "region:us" ]
null
2024-04-27T09:55:38+00:00
[]
[ "en", "ja" ]
TAGS #gguf #en #ja #dataset-TFMC/imatrix-dataset-for-japanese-llm #license-apache-2.0 #region-us
# umiyuki-Japanese-Chat-Umievo-itr001-7b-gguf This is a gguf-format conversion of Japanese-Chat-Umievo-itr001-7b published by umiyuki. The imatrix data was created using TFMC/imatrix-dataset-for-japanese-llm. ## Usage
[ "# umiyuki-Japanese-Chat-Umievo-itr001-7b-gguf \numiyukiさんが公開しているJapanese-Chat-Umievo-itr001-7bのggufフォーマット変換版です。\n\nimatrixのデータはTFMC/imatrix-dataset-for-japanese-llmを使用して作成しました。", "## Usage" ]
[ "TAGS\n#gguf #en #ja #dataset-TFMC/imatrix-dataset-for-japanese-llm #license-apache-2.0 #region-us \n", "# umiyuki-Japanese-Chat-Umievo-itr001-7b-gguf \numiyukiさんが公開しているJapanese-Chat-Umievo-itr001-7bのggufフォーマット変換版です。\n\nimatrixのデータはTFMC/imatrix-dataset-for-japanese-llmを使用して作成しました。", "## Usage" ]
text2text-generation
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
siddharth-magesh/lora-flan-t5-large-chat
null
[ "transformers", "tensorboard", "safetensors", "t5", "text2text-generation", "arxiv:1910.09700", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2024-04-27T09:56:29+00:00
[ "1910.09700" ]
[]
TAGS #transformers #tensorboard #safetensors #t5 #text2text-generation #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #tensorboard #safetensors #t5 #text2text-generation #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
feature-extraction
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
stvhuang/rcr-run-5pqr6lwp-90396-master-0_20240402T105012-ep36
null
[ "transformers", "safetensors", "xlm-roberta", "feature-extraction", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
null
2024-04-27T09:58:05+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #xlm-roberta #feature-extraction #arxiv-1910.09700 #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #xlm-roberta #feature-extraction #arxiv-1910.09700 #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
image-classification
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # Boya1_RMSProp_1-e5_10Epoch_swinv2-tiny-patch4-window16-256_fold5 This model is a fine-tuned version of [microsoft/swinv2-tiny-patch4-window16-256](https://huggingface.co/microsoft/swinv2-tiny-patch4-window16-256) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 1.0138 - Accuracy: 0.6603 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 1.5961 | 1.0 | 924 | 1.4478 | 0.5094 | | 1.4174 | 2.0 | 1848 | 1.3398 | 0.5451 | | 1.1688 | 3.0 | 2772 | 1.1674 | 0.5993 | | 0.927 | 4.0 | 3696 | 1.1084 | 0.6281 | | 1.0425 | 5.0 | 4620 | 1.0311 | 0.6484 | | 1.1655 | 6.0 | 5544 | 1.0304 | 0.6519 | | 0.9514 | 7.0 | 6468 | 1.0135 | 0.6528 | | 0.8508 | 8.0 | 7392 | 1.0348 | 0.6511 | | 0.9113 | 9.0 | 8316 | 1.0275 | 0.6549 | | 0.8186 | 10.0 | 9240 | 1.0138 | 0.6603 | ### Framework versions - Transformers 4.35.0 - Pytorch 2.1.0 - Datasets 2.14.6 - Tokenizers 0.14.1
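The hyperparameters listed above correspond one-to-one to a Hugging Face `TrainingArguments` configuration. The following is a minimal sketch under the assumption that the standard `Trainer` API from Transformers 4.35 was used; the `output_dir` and `evaluation_strategy` values are illustrative rather than taken from the original training script.

```python
from transformers import TrainingArguments

# Re-statement of the reported hyperparameters as a TrainingArguments config.
training_args = TrainingArguments(
    output_dir="Boya1_RMSProp_1-e5_10Epoch_swinv2-tiny-patch4-window16-256_fold5",  # illustrative
    learning_rate=1e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,        # Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=10,
    evaluation_strategy="epoch",  # assumption: per-epoch validation, matching the results table
)
```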
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["imagefolder"], "metrics": ["accuracy"], "base_model": "microsoft/swinv2-tiny-patch4-window16-256", "model-index": [{"name": "Boya1_RMSProp_1-e5_10Epoch_swinv2-tiny-patch4-window16-256_fold5", "results": [{"task": {"type": "image-classification", "name": "Image Classification"}, "dataset": {"name": "imagefolder", "type": "imagefolder", "config": "default", "split": "test", "args": "default"}, "metrics": [{"type": "accuracy", "value": 0.6603415559772297, "name": "Accuracy"}]}]}]}
onizukal/Boya1_RMSProp_1-e5_10Epoch_swinv2-tiny-patch4-window16-256_fold5
null
[ "transformers", "safetensors", "swinv2", "image-classification", "generated_from_trainer", "dataset:imagefolder", "base_model:microsoft/swinv2-tiny-patch4-window16-256", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
null
2024-04-27T09:59:56+00:00
[]
[]
TAGS #transformers #safetensors #swinv2 #image-classification #generated_from_trainer #dataset-imagefolder #base_model-microsoft/swinv2-tiny-patch4-window16-256 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
Boya1\_RMSProp\_1-e5\_10Epoch\_swinv2-tiny-patch4-window16-256\_fold5 ===================================================================== This model is a fine-tuned version of microsoft/swinv2-tiny-patch4-window16-256 on the imagefolder dataset. It achieves the following results on the evaluation set: * Loss: 1.0138 * Accuracy: 0.6603 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 1e-05 * train\_batch\_size: 16 * eval\_batch\_size: 16 * seed: 42 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * lr\_scheduler\_warmup\_ratio: 0.1 * num\_epochs: 10 ### Training results ### Framework versions * Transformers 4.35.0 * Pytorch 2.1.0 * Datasets 2.14.6 * Tokenizers 0.14.1
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 10", "### Training results", "### Framework versions\n\n\n* Transformers 4.35.0\n* Pytorch 2.1.0\n* Datasets 2.14.6\n* Tokenizers 0.14.1" ]
[ "TAGS\n#transformers #safetensors #swinv2 #image-classification #generated_from_trainer #dataset-imagefolder #base_model-microsoft/swinv2-tiny-patch4-window16-256 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 10", "### Training results", "### Framework versions\n\n\n* Transformers 4.35.0\n* Pytorch 2.1.0\n* Datasets 2.14.6\n* Tokenizers 0.14.1" ]
text-generation
transformers
# Llama-3-8B-UltraMedical > Experience it in our 🤗 [Huggingface Space Demo](https://huggingface.co/spaces/TsinghuaC3I/UltraMedical-LM)! <!-- Provide a quick summary of what the model is/does. --> Llama-3-8B-UltraMedical is an open-access large language model (LLM) specialized in biomedicine. Developed by the [Tsinghua C3I Lab](https://github.com/TsinghuaC3I), this model aims to enhance medical examination access, literature comprehension, and clinical knowledge. Building on the foundation of Meta's Llama-3-8B, Llama-3-8B-UltraMedical is trained on our [UltraMedical](https://github.com/TsinghuaC3I/UltraMedical) dataset, which includes 410,000 diverse entries comprising both synthetic and manually curated samples. Llama-3-8B-UltraMedical has achieved top average scores across several popular medical benchmarks, including MedQA, MedMCQA, PubMedQA, and MMLU-Medical. In these benchmarks, Llama-3-8B-UltraMedical significantly outperforms Flan-PaLM, OpenBioLM-8B, Gemini-1.0, GPT-3.5, and Meditron-70b. We extend our gratitude to Meta for the Llama model, which provided an excellent foundation for our fine-tuning efforts. ## Usage ### Input Examples This model utilizes the Llama-3 default chat template without a system prompt. Below, we provide input examples for multi-choice QA, PubMedQA, and open-ended questions. > Note: To reproduce our evaluation results for the medical QA benchmark, we recommend using the following format to organize questions and multiple-choice options. - Input example for MedQA and MedMCQA: ``` A 42-year-old homeless man is brought to the emergency room after he was found unconscious in a park. He has alcohol on his breath and is known to have a history of chronic alcoholism. A noncontrast CT scan of the head is normal. The patient is treated for acute alcohol intoxication and admitted to the hospital. The next day, the patient demands to be released. His vital signs are a pulse 120/min, a respiratory rate 22/min, and blood pressure 136/88 mm Hg. On physical examination, the patient is confused, agitated, and sweating profusely, particularly from his palms. Generalized pallor is present. What is the mechanism of action of the drug recommended to treat this patient_s most likely condition? A. It increases the duration of GABA-gated chloride channel opening. B. It increases the frequency of GABA-gated chloride channel opening. C. It decreases the frequency of GABA-gated chloride channel opening. D. It decreases the duration of GABA-gated chloride channel opening. ``` - Input example for PubMedQA: We organize the context and questions in a multi-choice format, similar to [MedPrompt](https://github.com/microsoft/promptbase). ``` Context: Pediatric glioblastoma is a malignant disease with an extremely poor clinical outcome. Patients usually suffer from resistance to radiation therapy, so targeted drug treatment may be a new possibility for glioblastoma therapy. Survivin is also overexpressed in glioblastoma. YM155, a novel small-molecule survivin inhibitor, has not been examined for its use in glioblastoma therapy. Context: The human glioblastoma cell line M059K, which expresses normal DNA-dependent protein kinase (DNA-PK) activity and is radiation-resistant, and M059J, which is deficient in DNA-PK activity and radiation-sensitive, were used in the study. Cell viability, DNA fragmentation, and the expression of survivin and securin following YM155 treatment were examined using MTT (methylthiazolyldiphenyl-tetrazolium) assay, ELISA assay, and Western blot analysis, respectively. 
Context: YM155 caused a concentration-dependent cytotoxic effect, inhibiting the cell viability of both M059K and M059J cells by 70% after 48 hours of treatment with 50 nM YM155. The half-maximal inhibitory concentration (IC50) was around 30-35 nM for both cell lines. Apoptosis was determined to have occurred in both cell lines because immunoreactive signals from the DNA fragments in the cytoplasm were increased 24 hours after treatment with 30 nM YM155. The expression of survivin and securin in the M059K cells was greater than that measured in the M059J cells. Treatment with 30 nM YM155, for both 24 and 48 hours, significantly suppressed the expression of survivin and securin in both cell lines. Does novel survivin inhibitor YM155 elicit cytotoxicity in glioblastoma cell lines with normal or deficiency DNA-dependent protein kinase activity? A. maybe B. yes C. no ``` - Input example for open-ended questions: ``` hi doctor,i am chaitanya.age 28,from hyderabad.my problem is ....i got thyroid in my frist preganacy .my delivary date was on july 24th 2009 but on july 6th early morning around 7 oclock suddenly heany bleeding started and i rushed to the hospital but they could not save the baby(boy)...i lost my frist baby.then after 6 month i concevied again but doctors said that baby is having some heart problem and the sevarity of the problem can be known after the baby birth and i should go for a planned delivery.doctors did a c section on cotober 21 2010.doctors said that babys problem is not that serious but it is a heart problem so we need wait and see for 7 days.on 5th day the baby is dead.i want to know is their any problem in me that it is happing like this...do i need o go for any test before planning for next baby.i had 2 c section till now.what are the chances for me for the next baby.how long do i need to wait and plan for next preganacy. ``` ``` Investigate the mechanistic implications of statins, primarily used for lipid modulation, on the immunomodulatory pathways, with an emphasis on delineating their therapeutic impact in the context of managing clinical outcomes for individuals afflicted with cardiovascular diseases, including a requirement to discuss the implications for atherosclerotic disease progression. ``` ### Inference with vLLM ```python from transformers import AutoTokenizer from vllm import LLM, SamplingParams llm = LLM(model="TsinghuaC3I/Llama-3-8B-UltraMedical", trust_remote_code=True) tokenizer = AutoTokenizer.from_pretrained("TsinghuaC3I/Llama-3-8B-UltraMedical") sampling_params = SamplingParams(temperature=0.7, top_p=0.9, max_tokens=1024, stop=["<|eot_id|>"]) messages = [ {"role": "user", "content": """The question format used in the above input examples."""}, ] prompts = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True) print(prompts) """ <|begin_of_text|><|start_header_id|>user<|end_header_id|> {question}<|eot_id|><|start_header_id|>assistant<|end_header_id|> """ outputs = llm.generate(prompts=prompts, sampling_params=sampling_params) print(outputs[0].outputs[0].text) ``` Note: This version of the model supports only single-turn dialogue and has limited capabilities in multi-turn dialogue. We plan to enhance this in the next update. ## Evaluation Results Llama-3-8B-UltraMedical achieved the best average results among 7B-level models on popular medical benchmarks, including MedQA, MedMCQA, PubMedQA, and MMLU-Medical.
We would like to acknowledge Meta's remarkable Llama model, which served as an excellent base for our fine-tuning process. | Released Date | Model | Average | MedQA | MedMCQA | PubMedQA | MMLU.ck | MMLU.mg | MMLU.an | MMLU.pm | MMLU.cb | MMLU.cm | |:-------------:|:--------------------------------------:|:-------:|:-----:|:-------:|:--------:|:-------:|:-------:|:-------:|:-------:|:-------:|:-------:| | 2024.04 | **Llama-3-8B-UltraMedical (Ensemble)** | 77.77 | 77.5 | 63.8 | 78.2 | 77.4 | 88.0 | 74.8 | 84.6 | 79.9 | 75.7 | | 2024.04 | **Llama-3-8B-UltraMedical (Greedy)** | 75.20 | 73.3 | 61.5 | 77.0 | 78.9 | 78.0 | 74.1 | 83.8 | 78.5 | 71.7 | | 2024.04 | OpenBioLM-8B | 72.48 | 59.0 | 56.9 | 74.1 | 76.1 | 86.1 | 69.8 | 78.2 | 84.2 | 68.0 | | 2024.04 | Llama-3-8B-Instruct (Ensemble) | 71.23 | 62.4 | 56.5 | 75.8 | 72.5 | 84.0 | 71.1 | 70.6 | 80.6 | 67.6 | | 2024.04 | Llama-3-8B-Instruct (Greedy) | 68.56 | 60.9 | 50.7 | 73.0 | 72.1 | 76.0 | 63.0 | 77.2 | 79.9 | 64.2 | | 2024.04 | Internist-7B | 67.79 | 60.5 | 55.8 | 79.4 | 70.6 | 71.0 | 65.9 | 76.1 | - | 63.0 | | 2024.02 | Gemma-7B | 64.18 | 47.2 | 49.0 | 76.2 | 69.8 | 70.0 | 59.3 | 66.2 | 79.9 | 60.1 | | 2024.03 | Meerkat-7B (Ensemble) | 63.94 | 74.3 | 60.7 | - | 61.9 | 70.4 | 61.5 | 69.5 | 55.4 | 57.8 | | 2023.03 | MedAlpaca | 58.03 | 41.7 | 37.5 | 72.8 | 57.4 | 69.0 | 57.0 | 67.3 | 65.3 | 54.3 | | 2024.02 | BioMistral-7B | 57.26 | 46.6 | 45.7 | 68.1 | 63.1 | 63.3 | 49.9 | 57.4 | 63.4 | 57.8 | In the table above: - For MedQA, we use the 4 options from the US set. For MedMCQA, we use the Dev split. For PubMedQA, we use the reasoning required set. - For MMLU, we include Clinical Knowledge (CK), Medical Genetics (MG), Anatomy (An), Professional Medicine (PM), College Biology (CB), and College Medicine (CM) to maintain consistency with previous studies. - Greedy search is employed as our default decoding strategy. We denote ensemble scores with self-consistency as `(Ensemble)`. In our experiments, we conduct 10 decoding trials, and final decisions are made via majority vote (temperature=0.7, top_p=0.9). - Partial results for 7B pre-trained models are sourced from the [Open Medical-LLM Leaderboard](https://huggingface.co/spaces/openlifescienceai/open_medical_llm_leaderboard). ## Training Details <!-- Provide a longer summary of what this model is. --> This model is trained using the full parameters and the Fully Sharded Data Parallel (FSDP) framework. The training process was performed on 8 x A6000 GPUs for about 50 hours. Hyperparameters: - torch type: bfloat16 - epochs: 3 - learning rate: 2e-5 - learning rate scheduler type: cosine - warmup ratio: 0.04 - max length: 1024 - global batch size: 128 - **License:** [Meta Llama-3 License](https://llama.meta.com/llama3/license/). - **Finetuned from model:** [Meta-Llama-3-8B](https://huggingface.co/meta-llama/Meta-Llama-3-8B) - **Finetuned on data:** [UltraMedical](https://github.com/TsinghuaC3I/UltraMedical) ## Limitations & Safe Use While our model offers promising capabilities, it is crucial to exercise caution when using it in real-world clinical settings due to potential hallucination issues. Hallucinations, where the model generates incorrect or misleading information, can pose significant risks in clinical decision-making. Users are advised to validate the model's outputs with trusted medical sources and expert consultation to ensure safety and accuracy. 
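In the table, `(Ensemble)` refers to the self-consistency procedure described in the notes above: ten completions are sampled per question and the final answer is chosen by majority vote. The following is a rough, illustrative sketch of that procedure with vLLM, not the authors' evaluation harness; the `extract_choice` helper is a hypothetical answer parser for the multi-choice format shown in the input examples.

```python
# Illustrative self-consistency ("Ensemble") decoding: sample 10 completions per
# question and take a majority vote over the extracted option letters.
# NOTE: extract_choice is a hypothetical heuristic, not the authors' answer parser.
import re
from collections import Counter

from transformers import AutoTokenizer
from vllm import LLM, SamplingParams

model_id = "TsinghuaC3I/Llama-3-8B-UltraMedical"
llm = LLM(model=model_id, trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Ten sampled decodings per prompt, matching the settings in the evaluation notes.
sampling_params = SamplingParams(n=10, temperature=0.7, top_p=0.9,
                                 max_tokens=1024, stop=["<|eot_id|>"])

question = "..."  # a multi-choice question formatted as in the input examples above
prompt = tokenizer.apply_chat_template(
    [{"role": "user", "content": question}],
    tokenize=False, add_generation_prompt=True,
)

request_output = llm.generate(prompts=[prompt], sampling_params=sampling_params)[0]

def extract_choice(text):
    # Hypothetical heuristic: take the last standalone option letter in the reply.
    letters = re.findall(r"\b([A-D])\b", text)
    return letters[-1] if letters else None

votes = Counter(c for c in (extract_choice(o.text) for o in request_output.outputs) if c)
print(votes.most_common(1))  # majority-vote answer, e.g. [('B', 7)]
```

With `n=10`, a single `generate` call returns all ten sampled completions for the prompt, so the vote can be computed without issuing separate requests.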
## Citation ```latex @misc{UltraMedical, author = {Zhang, Kaiyan and Ding, Ning and Qi, Biqing and Zeng, Sihang and Li, Haoxin and Zhu, Xuekai and Chen, Zhang-Ren and Zhou, Bowen}, title = {UltraMedical: Building Specialized Generalists in Biomedicine.}, year = {2024}, publisher = {GitHub}, journal = {GitHub repository}, howpublished = {\url{https://github.com/TsinghuaC3I/UltraMedical}}, } ```
{"license": "llama3", "datasets": ["TsinghuaC3I/UltraMedical"]}
TsinghuaC3I/Llama-3-8B-UltraMedical
null
[ "transformers", "safetensors", "llama", "text-generation", "conversational", "dataset:TsinghuaC3I/UltraMedical", "license:llama3", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2024-04-27T10:00:27+00:00
[]
[]
TAGS #transformers #safetensors #llama #text-generation #conversational #dataset-TsinghuaC3I/UltraMedical #license-llama3 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Llama-3-8B-UltraMedical ======================= > > Experience it in our Huggingface Space Demo! > > > Llama-3-8B-UltraMedical is an open-access large language model (LLM) specialized in biomedicine. Developed by the Tsinghua C3I Lab, this model aims to enhance medical examination access, literature comprehension, and clinical knowledge. Building on the foundation of Meta's Llama-3-8B, Llama-3-8B-UltraMedical is trained on our UltraMedical dataset, which includes 410,000 diverse entries comprising both synthetic and manually curated samples. Llama-3-8B-UltraMedical has achieved top average scores across several popular medical benchmarks, including MedQA, MedMCQA, PubMedQA, and MMLU-Medical. In these benchmarks, Llama-3-8B-UltraMedical significantly outperforms Flan-PaLM, OpenBioLM-8B, Gemini-1.0, GPT-3.5, and Meditron-70b. We extend our gratitude to Meta for the Llama model, which provided an excellent foundation for our fine-tuning efforts. Usage ----- ### Input Examples This model utilizes the Llama-3 default chat template without a system prompt. Below, we provide input examples for multi-choice QA, PubMedQA, and open-ended questions. > > Note: To reproduce our evaluation results for the medical QA benchmark, we recommend using the following format to organize questions and multiple-choice options. > > > * Input example for MedQA and MedMCQA: * Input example for PubMedQA: We organize the context and questions in a multi-choice format, similar to MedPrompt. * Input example for open-ended questions: ### Inference with vLLM Note: This version of the model supports only single-turn dialog and has limited capabilities in multi-turn dialogue. We plan to enhance this in the next update. Evaluation Results ------------------ Llama-3-8B-UltraMedical achieved the best average results among 7B-level models on popular medical benchmarks, including MedQA, MedMCQA, PubMedQA, and MMLU-Medical. We would like to acknowledge Meta's remarkable Llama model, which served as an excellent base for our fine-tuning process. In the table above: * For MedQA, we use the 4 options from the US set. For MedMCQA, we use the Dev split. For PubMedQA, we use the reasoning required set. * For MMLU, we include Clinical Knowledge (CK), Medical Genetics (MG), Anatomy (An), Professional Medicine (PM), College Biology (CB), and College Medicine (CM) to maintain consistency with previous studies. * Greedy search is employed as our default decoding strategy. We denote ensemble scores with self-consistency as '(Ensemble)'. In our experiments, we conduct 10 decoding trials, and final decisions are made via majority vote (temperature=0.7, top\_p=0.9). * Partial results for 7B pre-trained models are sourced from the Open Medical-LLM Leaderboard. Training Details ---------------- This model is trained using the full parameters and the Fully Sharded Data Parallel (FSDP) framework. The training process was performed on 8 x A6000 GPUs for about 50 hours. Hyperparameters: * torch type: bfloat16 * epochs: 3 * learning rate: 2e-5 * learning rate scheduler type: cosine * warmup ratio: 0.04 * max length: 1024 * global batch size: 128 * License: Meta Llama-3 License. * Finetuned from model: Meta-Llama-3-8B * Finetuned on data: UltraMedical Limitations & Safe Use ---------------------- While our model offers promising capabilities, it is crucial to exercise caution when using it in real-world clinical settings due to potential hallucination issues. 
Hallucinations, where the model generates incorrect or misleading information, can pose significant risks in clinical decision-making. Users are advised to validate the model's outputs with trusted medical sources and expert consultation to ensure safety and accuracy.
[ "### Input Examples\n\n\nThis model utilizes the Llama-3 default chat template without a system prompt.\nBelow, we provide input examples for multi-choice QA, PubMedQA, and open-ended questions.\n\n\n\n> \n> Note: To reproduce our evaluation results for the medical QA benchmark, we recommend using the following format to organize questions and multiple-choice options.\n> \n> \n> \n\n\n* Input example for MedQA and MedMCQA:\n* Input example for PubMedQA: We organize the context and questions in a multi-choice format, similar to MedPrompt.\n* Input example for open-ended questions:", "### Inference with vLLM\n\n\nNote: This version of the model supports only single-turn dialog and has limited capabilities in multi-turn dialogue. We plan to enhance this in the next update.\n\n\nEvaluation Results\n------------------\n\n\nLlama-3-8B-UltraMedical achieved the best average results among 7B-level models on popular medical benchmarks, including MedQA, MedMCQA, PubMedQA, and MMLU-Medical. We would like to acknowledge Meta's remarkable Llama model, which served as an excellent base for our fine-tuning process.\n\n\n\nIn the table above:\n\n\n* For MedQA, we use the 4 options from the US set. For MedMCQA, we use the Dev split. For PubMedQA, we use the reasoning required set.\n* For MMLU, we include Clinical Knowledge (CK), Medical Genetics (MG), Anatomy (An), Professional Medicine (PM), College Biology (CB), and College Medicine (CM) to maintain consistency with previous studies.\n* Greedy search is employed as our default decoding strategy. We denote ensemble scores with self-consistency as '(Ensemble)'. In our experiments, we conduct 10 decoding trials, and final decisions are made via majority vote (temperature=0.7, top\\_p=0.9).\n* Partial results for 7B pre-trained models are sourced from the Open Medical-LLM Leaderboard.\n\n\nTraining Details\n----------------\n\n\nThis model is trained using the full parameters and the Fully Sharded Data Parallel (FSDP) framework.\nThe training process was performed on 8 x A6000 GPUs for about 50 hours.\n\n\nHyperparameters:\n\n\n* torch type: bfloat16\n* epochs: 3\n* learning rate: 2e-5\n* learning rate scheduler type: cosine\n* warmup ratio: 0.04\n* max length: 1024\n* global batch size: 128\n* License: Meta Llama-3 License.\n* Finetuned from model: Meta-Llama-3-8B\n* Finetuned on data: UltraMedical\n\n\nLimitations & Safe Use\n----------------------\n\n\nWhile our model offers promising capabilities, it is crucial to exercise caution when using it in real-world clinical settings due to potential hallucination issues. Hallucinations, where the model generates incorrect or misleading information, can pose significant risks in clinical decision-making. Users are advised to validate the model's outputs with trusted medical sources and expert consultation to ensure safety and accuracy." ]
[ "TAGS\n#transformers #safetensors #llama #text-generation #conversational #dataset-TsinghuaC3I/UltraMedical #license-llama3 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### Input Examples\n\n\nThis model utilizes the Llama-3 default chat template without a system prompt.\nBelow, we provide input examples for multi-choice QA, PubMedQA, and open-ended questions.\n\n\n\n> \n> Note: To reproduce our evaluation results for the medical QA benchmark, we recommend using the following format to organize questions and multiple-choice options.\n> \n> \n> \n\n\n* Input example for MedQA and MedMCQA:\n* Input example for PubMedQA: We organize the context and questions in a multi-choice format, similar to MedPrompt.\n* Input example for open-ended questions:", "### Inference with vLLM\n\n\nNote: This version of the model supports only single-turn dialog and has limited capabilities in multi-turn dialogue. We plan to enhance this in the next update.\n\n\nEvaluation Results\n------------------\n\n\nLlama-3-8B-UltraMedical achieved the best average results among 7B-level models on popular medical benchmarks, including MedQA, MedMCQA, PubMedQA, and MMLU-Medical. We would like to acknowledge Meta's remarkable Llama model, which served as an excellent base for our fine-tuning process.\n\n\n\nIn the table above:\n\n\n* For MedQA, we use the 4 options from the US set. For MedMCQA, we use the Dev split. For PubMedQA, we use the reasoning required set.\n* For MMLU, we include Clinical Knowledge (CK), Medical Genetics (MG), Anatomy (An), Professional Medicine (PM), College Biology (CB), and College Medicine (CM) to maintain consistency with previous studies.\n* Greedy search is employed as our default decoding strategy. We denote ensemble scores with self-consistency as '(Ensemble)'. In our experiments, we conduct 10 decoding trials, and final decisions are made via majority vote (temperature=0.7, top\\_p=0.9).\n* Partial results for 7B pre-trained models are sourced from the Open Medical-LLM Leaderboard.\n\n\nTraining Details\n----------------\n\n\nThis model is trained using the full parameters and the Fully Sharded Data Parallel (FSDP) framework.\nThe training process was performed on 8 x A6000 GPUs for about 50 hours.\n\n\nHyperparameters:\n\n\n* torch type: bfloat16\n* epochs: 3\n* learning rate: 2e-5\n* learning rate scheduler type: cosine\n* warmup ratio: 0.04\n* max length: 1024\n* global batch size: 128\n* License: Meta Llama-3 License.\n* Finetuned from model: Meta-Llama-3-8B\n* Finetuned on data: UltraMedical\n\n\nLimitations & Safe Use\n----------------------\n\n\nWhile our model offers promising capabilities, it is crucial to exercise caution when using it in real-world clinical settings due to potential hallucination issues. Hallucinations, where the model generates incorrect or misleading information, can pose significant risks in clinical decision-making. Users are advised to validate the model's outputs with trusted medical sources and expert consultation to ensure safety and accuracy." ]
feature-extraction
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
minhquy1624/fintune-bi-encoder-sts-regressor
null
[ "transformers", "safetensors", "roberta", "feature-extraction", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
null
2024-04-27T10:07:03+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #roberta #feature-extraction #arxiv-1910.09700 #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #roberta #feature-extraction #arxiv-1910.09700 #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
reinforcement-learning
stable-baselines3
# **PPO** Agent playing **LunarLander-v2** This is a trained model of a **PPO** agent playing **LunarLander-v2** using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3). ## Usage (with Stable-baselines3) Load the trained agent from the Hub: ```python from stable_baselines3 import PPO from huggingface_sb3 import load_from_hub repo_id = "theegnas/ppo-LunarLander-v2" # The repo_id filename = "ppo-LunarLander-v2.zip" # The model filename.zip custom_objects = { "learning_rate": 0.0, "lr_schedule": lambda _: 0.0, "clip_range": lambda _: 0.0, } checkpoint = load_from_hub(repo_id, filename) model = PPO.load(checkpoint, custom_objects=custom_objects, print_system_info=True) ```
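As a small, self-contained follow-up (not part of the original card), the loaded agent can be rolled out in the environment to reproduce a mean-reward figure like the one reported in this repository's metadata; this assumes `gymnasium[box2d]` is installed.

```python
import gymnasium as gym
from huggingface_sb3 import load_from_hub
from stable_baselines3 import PPO
from stable_baselines3.common.evaluation import evaluate_policy
from stable_baselines3.common.monitor import Monitor

# Download the checkpoint from the Hub and load it (same repo/filename as above).
checkpoint = load_from_hub("theegnas/ppo-LunarLander-v2", "ppo-LunarLander-v2.zip")
model = PPO.load(checkpoint)  # pass custom_objects as above if your SB3 version needs it

# Roll the agent out for a few episodes and report the mean episodic reward.
eval_env = Monitor(gym.make("LunarLander-v2"))
mean_reward, std_reward = evaluate_policy(model, eval_env, n_eval_episodes=10, deterministic=True)
print(f"mean_reward={mean_reward:.2f} +/- {std_reward:.2f}")
```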
{"library_name": "stable-baselines3", "tags": ["LunarLander-v2", "deep-reinforcement-learning", "reinforcement-learning", "stable-baselines3"], "model-index": [{"name": "PPO", "results": [{"task": {"type": "reinforcement-learning", "name": "reinforcement-learning"}, "dataset": {"name": "LunarLander-v2", "type": "LunarLander-v2"}, "metrics": [{"type": "mean_reward", "value": "251.97 +/- 23.47", "name": "mean_reward", "verified": false}]}]}]}
theegnas/ppo-LunarLander-v2
null
[ "stable-baselines3", "LunarLander-v2", "deep-reinforcement-learning", "reinforcement-learning", "model-index", "region:us" ]
null
2024-04-27T10:09:37+00:00
[]
[]
TAGS #stable-baselines3 #LunarLander-v2 #deep-reinforcement-learning #reinforcement-learning #model-index #region-us
# PPO Agent playing LunarLander-v2 This is a trained model of a PPO agent playing LunarLander-v2 using the stable-baselines3 library. ## Usage (with Stable-baselines3) TODO: Add your code
[ "# PPO Agent playing LunarLander-v2\nThis is a trained model of a PPO agent playing LunarLander-v2\nusing the stable-baselines3 library.", "## Usage (with Stable-baselines3)\nTODO: Add your code" ]
[ "TAGS\n#stable-baselines3 #LunarLander-v2 #deep-reinforcement-learning #reinforcement-learning #model-index #region-us \n", "# PPO Agent playing LunarLander-v2\nThis is a trained model of a PPO agent playing LunarLander-v2\nusing the stable-baselines3 library.", "## Usage (with Stable-baselines3)\nTODO: Add your code" ]
text-generation
transformers
# Uploaded model - **Developed by:** suriya7 - **License:** apache-2.0 - **Finetuned from model :** unsloth/gemma-2b This Gemma model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library. [<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth) ### Requirements ```bash pip install torch pip install transformers ``` ### Inference In Notebook ```python import torch alpaca_prompt = """Below is an instruction that describes a task, paired with an input that provides further context. Write a response that appropriately completes the request. ### Instruction: {} ### Input: {} ### Response: {}""" # Load model directly from transformers import AutoTokenizer, AutoModelForCausalLM from transformers import TextStreamer tokenizer = AutoTokenizer.from_pretrained("suriya7/Gemma-2b-SFT") model = AutoModelForCausalLM.from_pretrained("suriya7/Gemma-2b-SFT") device = "cuda" if torch.cuda.is_available() else "cpu" model.to(device) inputs = tokenizer( [ alpaca_prompt.format( "You are an AI assistant. Please ensure that the answers conclude with an end-of-sequence (EOS) token.", # instruction "how to cook pizza?", # input goes here "", # output - leave this blank for generation! ) ], return_tensors = "pt").to(device) text_streamer = TextStreamer(tokenizer) _ = model.generate(**inputs, streamer = text_streamer, max_new_tokens = 250,do_sample=True,temperature=0.7,top_k=2,repetition_penalty=1.5, # Penalize repeated responses eos_token_id=model.config.eos_token_id) ``` ### Recommended Prompt Template ```bash alpaca_prompt = """Below is an instruction that describes a task, paired with an input that provides further context. Write a response that appropriately completes the request. ### Instruction: {} ### Input: {} ### Response: {}""" ```
{"license": "mit", "datasets": ["databricks/databricks-dolly-15"]}
suriya7/Gemma-2b-SFT
null
[ "transformers", "safetensors", "gemma", "text-generation", "dataset:databricks/databricks-dolly-15", "license:mit", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us", "has_space" ]
null
2024-04-27T10:09:48+00:00
[]
[]
TAGS #transformers #safetensors #gemma #text-generation #dataset-databricks/databricks-dolly-15 #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us #has_space
# Uploaded model - Developed by: suriya7 - License: apache-2.0 - Finetuned from model : unsloth/gemma-2b This Gemma model was trained 2x faster with Unsloth and Huggingface's TRL library. <img src="URL width="200"/> ### Requirements ### Inference In Notebook ### Recommended Prompt Template
[ "# Uploaded model\n\n- Developed by: suriya7\n- License: apache-2.0\n- Finetuned from model : unsloth/gemma-2b\n\nThis Gemma model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>", "### Requirements", "### Inference In Notebook", "### Recommended Prompt Template" ]
[ "TAGS\n#transformers #safetensors #gemma #text-generation #dataset-databricks/databricks-dolly-15 #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us #has_space \n", "# Uploaded model\n\n- Developed by: suriya7\n- License: apache-2.0\n- Finetuned from model : unsloth/gemma-2b\n\nThis Gemma model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>", "### Requirements", "### Inference In Notebook", "### Recommended Prompt Template" ]
sentence-similarity
sentence-transformers
# Ivysaur This is a fine-tune of [gte-tiny](https://huggingface.co/TaylorAI/gte-tiny) using [qa-assistant](https://huggingface.co/datasets/Mihaiii/qa-assistant). ## Intended purpose <span style="color:blue">This model is designed for use in semantic-autocomplete ([click here for demo](https://mihaiii.github.io/semantic-autocomplete/)).</span> ## Usage (Sentence-Transformers) (same as [gte-tiny](https://huggingface.co/TaylorAI/gte-tiny)) Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed: ``` pip install -U sentence-transformers ``` Then you can use the model like this: ```python from sentence_transformers import SentenceTransformer sentences = ["This is an example sentence", "Each sentence is converted"] model = SentenceTransformer('Mihaiii/Ivysaur') embeddings = model.encode(sentences) print(embeddings) ``` ## Usage (HuggingFace Transformers) (same as [gte-tiny](https://huggingface.co/TaylorAI/gte-tiny)) Without [sentence-transformers](https://www.SBERT.net), you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling-operation on-top of the contextualized word embeddings. ```python from transformers import AutoTokenizer, AutoModel import torch #Mean Pooling - Take attention mask into account for correct averaging def mean_pooling(model_output, attention_mask): token_embeddings = model_output[0] #First element of model_output contains all token embeddings input_mask_expanded = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float() return torch.sum(token_embeddings * input_mask_expanded, 1) / torch.clamp(input_mask_expanded.sum(1), min=1e-9) # Sentences we want sentence embeddings for sentences = ['This is an example sentence', 'Each sentence is converted'] # Load model from HuggingFace Hub tokenizer = AutoTokenizer.from_pretrained('Mihaiii/Ivysaur') model = AutoModel.from_pretrained('Mihaiii/Ivysaur') # Tokenize sentences encoded_input = tokenizer(sentences, padding=True, truncation=True, return_tensors='pt') # Compute token embeddings with torch.no_grad(): model_output = model(**encoded_input) # Perform pooling. In this case, mean pooling. sentence_embeddings = mean_pooling(model_output, encoded_input['attention_mask']) print("Sentence embeddings:") print(sentence_embeddings) ``` ### Limitation (same as [gte-small](https://huggingface.co/thenlper/gte-small)) This model exclusively caters to English texts, and any lengthy texts will be truncated to a maximum of 512 tokens.
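Because the model is intended for semantic-autocomplete-style retrieval, a small illustrative example (not from the original card) is ranking candidate strings against a query by cosine similarity; the query and candidate strings below are made up for demonstration.

```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("Mihaiii/Ivysaur")

# Made-up query and candidates, just to illustrate similarity-based ranking.
query = "How do I reset my password?"
candidates = [
    "Steps to recover a forgotten account password",
    "Today's weather forecast",
    "Changing your login credentials",
]

query_emb = model.encode(query, convert_to_tensor=True)
cand_embs = model.encode(candidates, convert_to_tensor=True)

# Cosine similarity between the query and each candidate, highest first.
scores = util.cos_sim(query_emb, cand_embs)[0]
for text, score in sorted(zip(candidates, scores.tolist()), key=lambda x: -x[1]):
    print(f"{score:.3f}  {text}")
```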
{"license": "mit", "library_name": "sentence-transformers", "tags": ["sentence-transformers", "feature-extraction", "sentence-similarity", "gte", "mteb"], "datasets": ["Mihaiii/qa-assistant"], "base_model": "TaylorAI/gte-tiny", "pipeline_tag": "sentence-similarity", "model-index": [{"name": "Ivysaur", "results": [{"task": {"type": "Classification"}, "dataset": {"name": "MTEB AmazonCounterfactualClassification (en)", "type": "mteb/amazon_counterfactual", "config": "en", "split": "test", "revision": "e8379541af4e31359cca9fbcf4b00f2671dba205"}, "metrics": [{"type": "accuracy", "value": 72.1044776119403}, {"type": "ap", "value": 35.09105788324913}, {"type": "f1", "value": 66.26967715703572}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB AmazonPolarityClassification", "type": "mteb/amazon_polarity", "config": "default", "split": "test", "revision": "e2d317d38cd51312af73b3d32a06d1a08b442046"}, "metrics": [{"type": "accuracy", "value": 86.686075}, {"type": "ap", "value": 81.92716581685914}, {"type": "f1", "value": 86.65902299160209}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB AmazonReviewsClassification (en)", "type": "mteb/amazon_reviews_multi", "config": "en", "split": "test", "revision": "1399c76144fd37290681b995c656ef9b2e06e26d"}, "metrics": [{"type": "accuracy", "value": 42.698}, {"type": "f1", "value": 42.287785312461885}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB ArguAna", "type": "mteb/arguana", "config": "default", "split": "test", "revision": "c22ab2a51041ffd869aaddef7af8d8215647e41a"}, "metrics": [{"type": "map_at_1", "value": 30.441000000000003}, {"type": "map_at_10", "value": 46.951}, {"type": "map_at_100", "value": 47.788000000000004}, {"type": "map_at_1000", "value": 47.794}, {"type": "map_at_20", "value": 47.621}, {"type": "map_at_3", "value": 42.295}, {"type": "map_at_5", "value": 45.126}, {"type": "mrr_at_1", "value": 31.65}, {"type": "mrr_at_10", "value": 47.394999999999996}, {"type": "mrr_at_100", "value": 48.238}, {"type": "mrr_at_1000", "value": 48.245}, {"type": "mrr_at_20", "value": 48.069}, {"type": "mrr_at_3", "value": 42.852000000000004}, {"type": "mrr_at_5", "value": 45.58}, {"type": "ndcg_at_1", "value": 30.441000000000003}, {"type": "ndcg_at_10", "value": 55.783}, {"type": "ndcg_at_100", "value": 59.227}, {"type": "ndcg_at_1000", "value": 59.376}, {"type": "ndcg_at_20", "value": 58.18}, {"type": "ndcg_at_3", "value": 46.291}, {"type": "ndcg_at_5", "value": 51.405}, {"type": "precision_at_1", "value": 30.441000000000003}, {"type": "precision_at_10", "value": 8.378}, {"type": "precision_at_100", "value": 0.985}, {"type": "precision_at_1000", "value": 0.1}, {"type": "precision_at_20", "value": 4.659}, {"type": "precision_at_3", "value": 19.298000000000002}, {"type": "precision_at_5", "value": 14.068}, {"type": "recall_at_1", "value": 30.441000000000003}, {"type": "recall_at_10", "value": 83.784}, {"type": "recall_at_100", "value": 98.506}, {"type": "recall_at_1000", "value": 99.644}, {"type": "recall_at_20", "value": 93.172}, {"type": "recall_at_3", "value": 57.894999999999996}, {"type": "recall_at_5", "value": 70.341}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB ArxivClusteringP2P", "type": "mteb/arxiv-clustering-p2p", "config": "default", "split": "test", "revision": "a122ad7f3f0291bf49cc6f4d32aa80929df69d5d"}, "metrics": [{"type": "v_measure", "value": 46.39249132731755}, {"type": "v_measures", "value": [0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 
0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 
0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 
0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 
0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 
0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 
0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 
0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 
0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 
0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 
0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 
0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902, 0.462627943488718, 0.4670198046702645, 0.4799590043041496, 0.4769331119808875, 0.4676232129237324, 0.4776548131275231, 0.4670074065859379, 0.4796639656537766, 0.4618481699630812, 0.4663292111226376, 0.5293353429909269, 
0.5398570175481274, 0.5399074383870329, 0.5363158656403061, 0.5377616813701683, 0.5375897664056992, 0.5391811647339062, 0.5408906197352437, 0.5330346186210795, 0.5333610235325786, 0.5043600016005657, 0.2861923615995782, 0.42134506758129586, 0.4019628602326345, 0.345945272411779, 0.2605048863591227, 0.28469463800386774, 0.23235682032046123, 0.30618655352256796, 1.0, 0.2642226670507902]}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB ArxivClusteringS2S", "type": "mteb/arxiv-clustering-s2s", "config": "default", "split": "test", "revision": "f910caf1a6075f7329cdf8c1a6135696f37dbd53"}, "metrics": [{"type": "v_measure", "value": 35.410038545643225}, {"type": "v_measures", "value": [0.33766811548231473, 0.3734777203759399, 0.33991212785072317, 0.3661605677492215, 0.36064589524249807, 0.3656962944251887, 0.34702091841974203, 0.3500477383658047, 0.35477756658493836, 0.3624636373603448, 0.40289427457065846, 0.3971477930112288, 0.40597327027674507, 0.40596489455329327, 0.40317124541440197, 0.4034334047970072, 0.4035619316327058, 0.4021323074077349, 0.40002234969788997, 0.39359153564695076, 0.3721698397439144, 0.20022120055536463, 0.2733292585686657, 0.329333695822746, 0.267015905471991, 0.1951877019437801, 0.21813528003614752, 0.1428255078757563, 0.21839826060461043, 1.0, 0.1847317096610917, 0.33766811548231473, 0.3734777203759399, 0.33991212785072317, 0.3661605677492215, 0.36064589524249807, 0.3656962944251887, 0.34702091841974203, 0.3500477383658047, 0.35477756658493836, 0.3624636373603448, 0.40289427457065846, 0.3971477930112288, 0.40597327027674507, 0.40596489455329327, 0.40317124541440197, 0.4034334047970072, 0.4035619316327058, 0.4021323074077349, 0.40002234969788997, 0.39359153564695076, 0.3721698397439144, 0.20022120055536463, 0.2733292585686657, 0.329333695822746, 0.267015905471991, 0.1951877019437801, 0.21813528003614752, 0.1428255078757563, 0.21839826060461043, 1.0, 0.1847317096610917, 0.33766811548231473, 0.3734777203759399, 0.33991212785072317, 0.3661605677492215, 0.36064589524249807, 0.3656962944251887, 0.34702091841974203, 0.3500477383658047, 0.35477756658493836, 0.3624636373603448, 0.40289427457065846, 0.3971477930112288, 0.40597327027674507, 0.40596489455329327, 0.40317124541440197, 0.4034334047970072, 0.4035619316327058, 0.4021323074077349, 0.40002234969788997, 0.39359153564695076, 0.3721698397439144, 0.20022120055536463, 0.2733292585686657, 0.329333695822746, 0.267015905471991, 0.1951877019437801, 0.21813528003614752, 0.1428255078757563, 0.21839826060461043, 1.0, 0.1847317096610917, 0.33766811548231473, 0.3734777203759399, 0.33991212785072317, 0.3661605677492215, 0.36064589524249807, 0.3656962944251887, 0.34702091841974203, 0.3500477383658047, 0.35477756658493836, 0.3624636373603448, 0.40289427457065846, 0.3971477930112288, 0.40597327027674507, 0.40596489455329327, 0.40317124541440197, 0.4034334047970072, 0.4035619316327058, 0.4021323074077349, 0.40002234969788997, 0.39359153564695076, 0.3721698397439144, 0.20022120055536463, 0.2733292585686657, 0.329333695822746, 0.267015905471991, 0.1951877019437801, 0.21813528003614752, 0.1428255078757563, 0.21839826060461043, 1.0, 0.1847317096610917, 0.33766811548231473, 0.3734777203759399, 0.33991212785072317, 0.3661605677492215, 0.36064589524249807, 0.3656962944251887, 0.34702091841974203, 0.3500477383658047, 0.35477756658493836, 0.3624636373603448, 0.40289427457065846, 0.3971477930112288, 0.40597327027674507, 0.40596489455329327, 0.40317124541440197, 0.4034334047970072, 0.4035619316327058, 0.4021323074077349, 
0.40002234969788997, 0.39359153564695076, 0.3721698397439144, 0.20022120055536463, 0.2733292585686657, 0.329333695822746, 0.267015905471991, 0.1951877019437801, 0.21813528003614752, 0.1428255078757563, 0.21839826060461043, 1.0, 0.1847317096610917, 0.33766811548231473, 0.3734777203759399, 0.33991212785072317, 0.3661605677492215, 0.36064589524249807, 0.3656962944251887, 0.34702091841974203, 0.3500477383658047, 0.35477756658493836, 0.3624636373603448, 0.40289427457065846, 0.3971477930112288, 0.40597327027674507, 0.40596489455329327, 0.40317124541440197, 0.4034334047970072, 0.4035619316327058, 0.4021323074077349, 0.40002234969788997, 0.39359153564695076, 0.3721698397439144, 0.20022120055536463, 0.2733292585686657, 0.329333695822746, 0.267015905471991, 0.1951877019437801, 0.21813528003614752, 0.1428255078757563, 0.21839826060461043, 1.0, 0.1847317096610917, 0.33766811548231473, 0.3734777203759399, 0.33991212785072317, 0.3661605677492215, 0.36064589524249807, 0.3656962944251887, 0.34702091841974203, 0.3500477383658047, 0.35477756658493836, 0.3624636373603448, 0.40289427457065846, 0.3971477930112288, 0.40597327027674507, 0.40596489455329327, 0.40317124541440197, 0.4034334047970072, 0.4035619316327058, 0.4021323074077349, 0.40002234969788997, 0.39359153564695076, 0.3721698397439144, 0.20022120055536463, 0.2733292585686657, 0.329333695822746, 0.267015905471991, 0.1951877019437801, 0.21813528003614752, 0.1428255078757563, 0.21839826060461043, 1.0, 0.1847317096610917, 0.33766811548231473, 0.3734777203759399, 0.33991212785072317, 0.3661605677492215, 0.36064589524249807, 0.3656962944251887, 0.34702091841974203, 0.3500477383658047, 0.35477756658493836, 0.3624636373603448, 0.40289427457065846, 0.3971477930112288, 0.40597327027674507, 0.40596489455329327, 0.40317124541440197, 0.4034334047970072, 0.4035619316327058, 0.4021323074077349, 0.40002234969788997, 0.39359153564695076, 0.3721698397439144, 0.20022120055536463, 0.2733292585686657, 0.329333695822746, 0.267015905471991, 0.1951877019437801, 0.21813528003614752, 0.1428255078757563, 0.21839826060461043, 1.0, 0.1847317096610917, 0.33766811548231473, 0.3734777203759399, 0.33991212785072317, 0.3661605677492215, 0.36064589524249807, 0.3656962944251887, 0.34702091841974203, 0.3500477383658047, 0.35477756658493836, 0.3624636373603448, 0.40289427457065846, 0.3971477930112288, 0.40597327027674507, 0.40596489455329327, 0.40317124541440197, 0.4034334047970072, 0.4035619316327058, 0.4021323074077349, 0.40002234969788997, 0.39359153564695076, 0.3721698397439144, 0.20022120055536463, 0.2733292585686657, 0.329333695822746, 0.267015905471991, 0.1951877019437801, 0.21813528003614752, 0.1428255078757563, 0.21839826060461043, 1.0, 0.1847317096610917, 0.33766811548231473, 0.3734777203759399, 0.33991212785072317, 0.3661605677492215, 0.36064589524249807, 0.3656962944251887, 0.34702091841974203, 0.3500477383658047, 0.35477756658493836, 0.3624636373603448, 0.40289427457065846, 0.3971477930112288, 0.40597327027674507, 0.40596489455329327, 0.40317124541440197, 0.4034334047970072, 0.4035619316327058, 0.4021323074077349, 0.40002234969788997, 0.39359153564695076, 0.3721698397439144, 0.20022120055536463, 0.2733292585686657, 0.329333695822746, 0.267015905471991, 0.1951877019437801, 0.21813528003614752, 0.1428255078757563, 0.21839826060461043, 1.0, 0.1847317096610917, 0.33766811548231473, 0.3734777203759399, 0.33991212785072317, 0.3661605677492215, 0.36064589524249807, 0.3656962944251887, 0.34702091841974203, 0.3500477383658047, 0.35477756658493836, 0.3624636373603448, 
0.40289427457065846, 0.3971477930112288, 0.40597327027674507, 0.40596489455329327, 0.40317124541440197, 0.4034334047970072, 0.4035619316327058, 0.4021323074077349, 0.40002234969788997, 0.39359153564695076, 0.3721698397439144, 0.20022120055536463, 0.2733292585686657, 0.329333695822746, 0.267015905471991, 0.1951877019437801, 0.21813528003614752, 0.1428255078757563, 0.21839826060461043, 1.0, 0.1847317096610917, 0.33766811548231473, 0.3734777203759399, 0.33991212785072317, 0.3661605677492215, 0.36064589524249807, 0.3656962944251887, 0.34702091841974203, 0.3500477383658047, 0.35477756658493836, 0.3624636373603448, 0.40289427457065846, 0.3971477930112288, 0.40597327027674507, 0.40596489455329327, 0.40317124541440197, 0.4034334047970072, 0.4035619316327058, 0.4021323074077349, 0.40002234969788997, 0.39359153564695076, 0.3721698397439144, 0.20022120055536463, 0.2733292585686657, 0.329333695822746, 0.267015905471991, 0.1951877019437801, 0.21813528003614752, 0.1428255078757563, 0.21839826060461043, 1.0, 0.1847317096610917, 0.33766811548231473, 0.3734777203759399, 0.33991212785072317, 0.3661605677492215, 0.36064589524249807, 0.3656962944251887, 0.34702091841974203, 0.3500477383658047, 0.35477756658493836, 0.3624636373603448, 0.40289427457065846, 0.3971477930112288, 0.40597327027674507, 0.40596489455329327, 0.40317124541440197, 0.4034334047970072, 0.4035619316327058, 0.4021323074077349, 0.40002234969788997, 0.39359153564695076, 0.3721698397439144, 0.20022120055536463, 0.2733292585686657, 0.329333695822746, 0.267015905471991, 0.1951877019437801, 0.21813528003614752, 0.1428255078757563, 0.21839826060461043, 1.0, 0.1847317096610917, 0.33766811548231473, 0.3734777203759399, 0.33991212785072317, 0.3661605677492215, 0.36064589524249807, 0.3656962944251887, 0.34702091841974203, 0.3500477383658047, 0.35477756658493836, 0.3624636373603448, 0.40289427457065846, 0.3971477930112288, 0.40597327027674507, 0.40596489455329327, 0.40317124541440197, 0.4034334047970072, 0.4035619316327058, 0.4021323074077349, 0.40002234969788997, 0.39359153564695076, 0.3721698397439144, 0.20022120055536463, 0.2733292585686657, 0.329333695822746, 0.267015905471991, 0.1951877019437801, 0.21813528003614752, 0.1428255078757563, 0.21839826060461043, 1.0, 0.1847317096610917, 0.33766811548231473, 0.3734777203759399, 0.33991212785072317, 0.3661605677492215, 0.36064589524249807, 0.3656962944251887, 0.34702091841974203, 0.3500477383658047, 0.35477756658493836, 0.3624636373603448, 0.40289427457065846, 0.3971477930112288, 0.40597327027674507, 0.40596489455329327, 0.40317124541440197, 0.4034334047970072, 0.4035619316327058, 0.4021323074077349, 0.40002234969788997, 0.39359153564695076, 0.3721698397439144, 0.20022120055536463, 0.2733292585686657, 0.329333695822746, 0.267015905471991, 0.1951877019437801, 0.21813528003614752, 0.1428255078757563, 0.21839826060461043, 1.0, 0.1847317096610917, 0.33766811548231473, 0.3734777203759399, 0.33991212785072317, 0.3661605677492215, 0.36064589524249807, 0.3656962944251887, 0.34702091841974203, 0.3500477383658047, 0.35477756658493836, 0.3624636373603448, 0.40289427457065846, 0.3971477930112288, 0.40597327027674507, 0.40596489455329327, 0.40317124541440197, 0.4034334047970072, 0.4035619316327058, 0.4021323074077349, 0.40002234969788997, 0.39359153564695076, 0.3721698397439144, 0.20022120055536463, 0.2733292585686657, 0.329333695822746, 0.267015905471991, 0.1951877019437801, 0.21813528003614752, 0.1428255078757563, 0.21839826060461043, 1.0, 0.1847317096610917, 0.33766811548231473, 0.3734777203759399, 
0.33991212785072317, 0.3661605677492215, 0.36064589524249807, 0.3656962944251887, 0.34702091841974203, 0.3500477383658047, 0.35477756658493836, 0.3624636373603448, 0.40289427457065846, 0.3971477930112288, 0.40597327027674507, 0.40596489455329327, 0.40317124541440197, 0.4034334047970072, 0.4035619316327058, 0.4021323074077349, 0.40002234969788997, 0.39359153564695076, 0.3721698397439144, 0.20022120055536463, 0.2733292585686657, 0.329333695822746, 0.267015905471991, 0.1951877019437801, 0.21813528003614752, 0.1428255078757563, 0.21839826060461043, 1.0, 0.1847317096610917]}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB AskUbuntuDupQuestions", "type": "mteb/askubuntudupquestions-reranking", "config": "default", "split": "test", "revision": "2000358ca161889fa9c082cb41daa8dcfb161a54"}, "metrics": [{"type": "map", "value": 59.69637278242267}, {"type": "mrr", "value": 74.02948159873367}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB BIOSSES", "type": "mteb/biosses-sts", "config": "default", "split": "test", "revision": "d3fb88f8f02e40887cd149695127462bbcf29b4a"}, "metrics": [{"type": "cos_sim_pearson", "value": 87.14461604689758}, {"type": "cos_sim_spearman", "value": 87.31584497244751}, {"type": "euclidean_pearson", "value": 84.78141750973201}, {"type": "euclidean_spearman", "value": 87.05017626840346}, {"type": "manhattan_pearson", "value": 84.35436632710646}, {"type": "manhattan_spearman", "value": 86.49534434907336}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB Banking77Classification", "type": "mteb/banking77", "config": "default", "split": "test", "revision": "0fd18e25b25c072e09e0d92ab615fda904d66300"}, "metrics": [{"type": "accuracy", "value": 81.91558441558439}, {"type": "f1", "value": 81.88197959191479}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB BiorxivClusteringP2P", "type": "mteb/biorxiv-clustering-p2p", "config": "default", "split": 
"test", "revision": "65b79d1d13f80053f67aca9498d9402c2d9f1f40"}, "metrics": [{"type": "v_measure", "value": 38.97808934568377}, {"type": "v_measures", "value": [0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 
0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 
0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 
0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 
0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 
0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565, 0.3950220690882689, 0.38918993520470474, 0.3874211082831238, 0.3769994856835508, 0.37876292165982844, 0.3979648803949703, 0.39019384497819176, 0.4100620420333616, 0.3809405025237201, 0.3912521447186565]}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB BiorxivClusteringS2S", "type": "mteb/biorxiv-clustering-s2s", "config": "default", "split": "test", "revision": "258694dd0231531bc1fd9de6ceb52a0853c6d908"}, "metrics": [{"type": "v_measure", "value": 31.7412250739116}, {"type": "v_measures", "value": [0.31156273517579985, 0.31497713177719505, 0.3211720123203406, 0.30456845682253647, 0.3152485096373301, 0.32328632147728803, 0.3114059814606084, 0.32290781970290505, 0.31626398941398964, 0.3327295496031667, 0.31156273517579985, 0.31497713177719505, 0.3211720123203406, 0.30456845682253647, 0.3152485096373301, 0.32328632147728803, 0.3114059814606084, 0.32290781970290505, 0.31626398941398964, 0.3327295496031667, 0.31156273517579985, 0.31497713177719505, 0.3211720123203406, 0.30456845682253647, 0.3152485096373301, 0.32328632147728803, 
{"task": {"type": "Reranking"}, "dataset": {"name": "MTEB MindSmallReranking", "type": "mteb/mind_small", "config": "default", "split": "test", "revision": "3bdac13927fdc888b903db93b2ffdbd90b295a69"}, "metrics": [{"type": "map", "value": 31.7776266029616}, {"type": "mrr", "value": 32.9057970138914}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackRetrieval", "type": "mteb/cqadupstack", "config": "default", "split": "test", "revision": "4ffe81d471b1924886b33c7567bfb200e9eec5c4"}, "metrics": [{"type": "map_at_1", "value": 24.78675}, {"type": "map_at_10", "value": 33.18391666666666}, {"type": "map_at_100", "value": 34.34583333333333}, {"type": "map_at_1000", "value": 34.46825}, {"type": "map_at_20", "value": 33.819}, {"type": "map_at_3", "value": 30.636500000000005}, {"type": "map_at_5", "value": 32.02091666666667}, {"type": "mrr_at_1", "value": 29.478749999999998}, {"type": "mrr_at_10", "value": 37.385}, {"type": "mrr_at_100", "value": 38.23491666666667}, {"type": "mrr_at_1000", "value": 38.298833333333334}, {"type": "mrr_at_20", "value": 37.87508333333333}, {"type": "mrr_at_3", "value": 35.089666666666666}, {"type": "mrr_at_5", "value": 36.36816666666667},
{"type": "ndcg_at_1", "value": 29.478749999999998}, {"type": "ndcg_at_10", "value": 38.2035}, {"type": "ndcg_at_100", "value": 43.301083333333324}, {"type": "ndcg_at_1000", "value": 45.758666666666656}, {"type": "ndcg_at_20", "value": 40.15116666666667}, {"type": "ndcg_at_3", "value": 33.86033333333334}, {"type": "ndcg_at_5", "value": 35.81266666666666}, {"type": "precision_at_1", "value": 29.478749999999998}, {"type": "precision_at_10", "value": 6.642833333333334}, {"type": "precision_at_100", "value": 1.08425}, {"type": "precision_at_1000", "value": 0.14850000000000002}, {"type": "precision_at_20", "value": 3.948083333333334}, {"type": "precision_at_3", "value": 15.511}, {"type": "precision_at_5", "value": 10.929833333333333}, {"type": "recall_at_1", "value": 24.78675}, {"type": "recall_at_10", "value": 48.9305}, {"type": "recall_at_100", "value": 71.49416666666666}, {"type": "recall_at_1000", "value": 88.54375}, {"type": "recall_at_20", "value": 56.06475}, {"type": "recall_at_3", "value": 36.66891666666666}, {"type": "recall_at_5", "value": 41.790499999999994}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackAndroidRetrieval", "type": "mteb/cqadupstack-android", "config": "default", "split": "test", "revision": "f46a197baaae43b4f621051089b82a364682dfeb"}, "metrics": [{"type": "map_at_1", "value": 30.793}, {"type": "map_at_10", "value": 42.254000000000005}, {"type": "map_at_100", "value": 43.569}, {"type": "map_at_1000", "value": 43.714999999999996}, {"type": "map_at_20", "value": 42.994}, {"type": "map_at_3", "value": 39.007999999999996}, {"type": "map_at_5", "value": 40.488}, {"type": "mrr_at_1", "value": 38.34}, {"type": "mrr_at_10", "value": 48.274}, {"type": "mrr_at_100", "value": 48.946}, {"type": "mrr_at_1000", "value": 49.001}, {"type": "mrr_at_20", "value": 48.701}, {"type": "mrr_at_3", "value": 45.756}, {"type": "mrr_at_5", "value": 47.036}, {"type": "ndcg_at_1", "value": 38.34}, {"type": "ndcg_at_10", "value": 48.622}, {"type": "ndcg_at_100", "value": 53.288999999999994}, {"type": "ndcg_at_1000", "value": 55.614}, {"type": "ndcg_at_20", "value": 50.495000000000005}, {"type": "ndcg_at_3", "value": 43.852999999999994}, {"type": "ndcg_at_5", "value": 45.442}, {"type": "precision_at_1", "value": 38.34}, {"type": "precision_at_10", "value": 9.413}, {"type": "precision_at_100", "value": 1.4749999999999999}, {"type": "precision_at_1000", "value": 0.19499999999999998}, {"type": "precision_at_20", "value": 5.494000000000001}, {"type": "precision_at_3", "value": 20.935000000000002}, {"type": "precision_at_5", "value": 14.735000000000001}, {"type": "recall_at_1", "value": 30.793}, {"type": "recall_at_10", "value": 60.455000000000005}, {"type": "recall_at_100", "value": 80.061}, {"type": "recall_at_1000", "value": 95.322}, {"type": "recall_at_20", "value": 67.27}, {"type": "recall_at_3", "value": 46.296}, {"type": "recall_at_5", "value": 51.139}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackEnglishRetrieval", "type": "mteb/cqadupstack-english", "config": "default", "split": "test", "revision": "ad9991cb51e31e31e430383c75ffb2885547b5f0"}, "metrics": [{"type": "map_at_1", "value": 27.93}, {"type": "map_at_10", "value": 36.085}, {"type": "map_at_100", "value": 37.192}, {"type": "map_at_1000", "value": 37.324}, {"type": "map_at_20", "value": 36.614999999999995}, {"type": "map_at_3", "value": 33.452}, {"type": "map_at_5", "value": 35.088}, {"type": "mrr_at_1", "value": 34.777}, {"type": "mrr_at_10", "value": 41.865}, {"type": "mrr_at_100", 
"value": 42.518}, {"type": "mrr_at_1000", "value": 42.571}, {"type": "mrr_at_20", "value": 42.219}, {"type": "mrr_at_3", "value": 39.628}, {"type": "mrr_at_5", "value": 41.038999999999994}, {"type": "ndcg_at_1", "value": 34.777}, {"type": "ndcg_at_10", "value": 41.095}, {"type": "ndcg_at_100", "value": 45.286}, {"type": "ndcg_at_1000", "value": 47.656}, {"type": "ndcg_at_20", "value": 42.472}, {"type": "ndcg_at_3", "value": 37.349}, {"type": "ndcg_at_5", "value": 39.318}, {"type": "precision_at_1", "value": 34.777}, {"type": "precision_at_10", "value": 7.617999999999999}, {"type": "precision_at_100", "value": 1.242}, {"type": "precision_at_1000", "value": 0.173}, {"type": "precision_at_20", "value": 4.481}, {"type": "precision_at_3", "value": 17.771}, {"type": "precision_at_5", "value": 12.687999999999999}, {"type": "recall_at_1", "value": 27.93}, {"type": "recall_at_10", "value": 49.464000000000006}, {"type": "recall_at_100", "value": 67.64099999999999}, {"type": "recall_at_1000", "value": 83.066}, {"type": "recall_at_20", "value": 54.452999999999996}, {"type": "recall_at_3", "value": 38.157000000000004}, {"type": "recall_at_5", "value": 43.829}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackGamingRetrieval", "type": "mteb/cqadupstack-gaming", "config": "default", "split": "test", "revision": "4885aa143210c98657558c04aaf3dc47cfb54340"}, "metrics": [{"type": "map_at_1", "value": 37.332}, {"type": "map_at_10", "value": 49.146}, {"type": "map_at_100", "value": 50.222}, {"type": "map_at_1000", "value": 50.281}, {"type": "map_at_20", "value": 49.802}, {"type": "map_at_3", "value": 46.264}, {"type": "map_at_5", "value": 47.912}, {"type": "mrr_at_1", "value": 43.009}, {"type": "mrr_at_10", "value": 52.586999999999996}, {"type": "mrr_at_100", "value": 53.323}, {"type": "mrr_at_1000", "value": 53.352999999999994}, {"type": "mrr_at_20", "value": 53.04299999999999}, {"type": "mrr_at_3", "value": 50.261}, {"type": "mrr_at_5", "value": 51.615}, {"type": "ndcg_at_1", "value": 43.009}, {"type": "ndcg_at_10", "value": 54.652}, {"type": "ndcg_at_100", "value": 58.918000000000006}, {"type": "ndcg_at_1000", "value": 60.172000000000004}, {"type": "ndcg_at_20", "value": 56.554}, {"type": "ndcg_at_3", "value": 49.757}, {"type": "ndcg_at_5", "value": 52.169}, {"type": "precision_at_1", "value": 43.009}, {"type": "precision_at_10", "value": 8.715}, {"type": "precision_at_100", "value": 1.1780000000000002}, {"type": "precision_at_1000", "value": 0.133}, {"type": "precision_at_20", "value": 4.931}, {"type": "precision_at_3", "value": 22.153}, {"type": "precision_at_5", "value": 15.146999999999998}, {"type": "recall_at_1", "value": 37.332}, {"type": "recall_at_10", "value": 67.55600000000001}, {"type": "recall_at_100", "value": 85.885}, {"type": "recall_at_1000", "value": 94.87400000000001}, {"type": "recall_at_20", "value": 74.568}, {"type": "recall_at_3", "value": 54.419}, {"type": "recall_at_5", "value": 60.288}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackGisRetrieval", "type": "mteb/cqadupstack-gis", "config": "default", "split": "test", "revision": "5003b3064772da1887988e05400cf3806fe491f2"}, "metrics": [{"type": "map_at_1", "value": 24.09}, {"type": "map_at_10", "value": 32.608}, {"type": "map_at_100", "value": 33.571}, {"type": "map_at_1000", "value": 33.668}, {"type": "map_at_20", "value": 33.181}, {"type": "map_at_3", "value": 30.091}, {"type": "map_at_5", "value": 31.518}, {"type": "mrr_at_1", "value": 25.763}, {"type": "mrr_at_10", "value": 34.25}, 
{"type": "mrr_at_100", "value": 35.134}, {"type": "mrr_at_1000", "value": 35.207}, {"type": "mrr_at_20", "value": 34.78}, {"type": "mrr_at_3", "value": 31.807999999999996}, {"type": "mrr_at_5", "value": 33.198}, {"type": "ndcg_at_1", "value": 25.763}, {"type": "ndcg_at_10", "value": 37.305}, {"type": "ndcg_at_100", "value": 42.114000000000004}, {"type": "ndcg_at_1000", "value": 44.467}, {"type": "ndcg_at_20", "value": 39.272}, {"type": "ndcg_at_3", "value": 32.405}, {"type": "ndcg_at_5", "value": 34.775}, {"type": "precision_at_1", "value": 25.763}, {"type": "precision_at_10", "value": 5.729}, {"type": "precision_at_100", "value": 0.853}, {"type": "precision_at_1000", "value": 0.109}, {"type": "precision_at_20", "value": 3.3329999999999997}, {"type": "precision_at_3", "value": 13.71}, {"type": "precision_at_5", "value": 9.65}, {"type": "recall_at_1", "value": 24.09}, {"type": "recall_at_10", "value": 50.161}, {"type": "recall_at_100", "value": 72.419}, {"type": "recall_at_1000", "value": 89.983}, {"type": "recall_at_20", "value": 57.53}, {"type": "recall_at_3", "value": 36.961}, {"type": "recall_at_5", "value": 42.568}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackMathematicaRetrieval", "type": "mteb/cqadupstack-mathematica", "config": "default", "split": "test", "revision": "90fceea13679c63fe563ded68f3b6f06e50061de"}, "metrics": [{"type": "map_at_1", "value": 16.333000000000002}, {"type": "map_at_10", "value": 23.352999999999998}, {"type": "map_at_100", "value": 24.618000000000002}, {"type": "map_at_1000", "value": 24.743000000000002}, {"type": "map_at_20", "value": 24.117}, {"type": "map_at_3", "value": 21.013}, {"type": "map_at_5", "value": 22.259}, {"type": "mrr_at_1", "value": 20.398}, {"type": "mrr_at_10", "value": 28.28}, {"type": "mrr_at_100", "value": 29.307}, {"type": "mrr_at_1000", "value": 29.381}, {"type": "mrr_at_20", "value": 28.955}, {"type": "mrr_at_3", "value": 25.933}, {"type": "mrr_at_5", "value": 27.114}, {"type": "ndcg_at_1", "value": 20.398}, {"type": "ndcg_at_10", "value": 28.359}, {"type": "ndcg_at_100", "value": 34.178999999999995}, {"type": "ndcg_at_1000", "value": 37.112}, {"type": "ndcg_at_20", "value": 30.982}, {"type": "ndcg_at_3", "value": 24.104999999999997}, {"type": "ndcg_at_5", "value": 25.877}, {"type": "precision_at_1", "value": 20.398}, {"type": "precision_at_10", "value": 5.2490000000000006}, {"type": "precision_at_100", "value": 0.927}, {"type": "precision_at_1000", "value": 0.131}, {"type": "precision_at_20", "value": 3.3520000000000003}, {"type": "precision_at_3", "value": 11.733}, {"type": "precision_at_5", "value": 8.433}, {"type": "recall_at_1", "value": 16.333000000000002}, {"type": "recall_at_10", "value": 39.082}, {"type": "recall_at_100", "value": 64.269}, {"type": "recall_at_1000", "value": 85.103}, {"type": "recall_at_20", "value": 48.625}, {"type": "recall_at_3", "value": 26.740000000000002}, {"type": "recall_at_5", "value": 31.519000000000002}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackPhysicsRetrieval", "type": "mteb/cqadupstack-physics", "config": "default", "split": "test", "revision": "79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4"}, "metrics": [{"type": "map_at_1", "value": 26.857999999999997}, {"type": "map_at_10", "value": 36.258}, {"type": "map_at_100", "value": 37.556}, {"type": "map_at_1000", "value": 37.669999999999995}, {"type": "map_at_20", "value": 36.937}, {"type": "map_at_3", "value": 33.306000000000004}, {"type": "map_at_5", "value": 35.004999999999995}, {"type": 
"mrr_at_1", "value": 33.397}, {"type": "mrr_at_10", "value": 42.089}, {"type": "mrr_at_100", "value": 42.864999999999995}, {"type": "mrr_at_1000", "value": 42.915}, {"type": "mrr_at_20", "value": 42.510999999999996}, {"type": "mrr_at_3", "value": 39.413}, {"type": "mrr_at_5", "value": 40.905}, {"type": "ndcg_at_1", "value": 33.397}, {"type": "ndcg_at_10", "value": 42.062}, {"type": "ndcg_at_100", "value": 47.620000000000005}, {"type": "ndcg_at_1000", "value": 49.816}, {"type": "ndcg_at_20", "value": 44.096999999999994}, {"type": "ndcg_at_3", "value": 37.165}, {"type": "ndcg_at_5", "value": 39.493}, {"type": "precision_at_1", "value": 33.397}, {"type": "precision_at_10", "value": 7.5649999999999995}, {"type": "precision_at_100", "value": 1.224}, {"type": "precision_at_1000", "value": 0.16}, {"type": "precision_at_20", "value": 4.495}, {"type": "precision_at_3", "value": 17.613}, {"type": "precision_at_5", "value": 12.589}, {"type": "recall_at_1", "value": 26.857999999999997}, {"type": "recall_at_10", "value": 53.900000000000006}, {"type": "recall_at_100", "value": 77.595}, {"type": "recall_at_1000", "value": 92.116}, {"type": "recall_at_20", "value": 60.962}, {"type": "recall_at_3", "value": 39.799}, {"type": "recall_at_5", "value": 45.961}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackProgrammersRetrieval", "type": "mteb/cqadupstack-programmers", "config": "default", "split": "test", "revision": "6184bc1440d2dbc7612be22b50686b8826d22b32"}, "metrics": [{"type": "map_at_1", "value": 24.131}, {"type": "map_at_10", "value": 33.016}, {"type": "map_at_100", "value": 34.263}, {"type": "map_at_1000", "value": 34.39}, {"type": "map_at_20", "value": 33.703}, {"type": "map_at_3", "value": 30.055}, {"type": "map_at_5", "value": 31.651}, {"type": "mrr_at_1", "value": 30.593999999999998}, {"type": "mrr_at_10", "value": 38.786}, {"type": "mrr_at_100", "value": 39.674}, {"type": "mrr_at_1000", "value": 39.739000000000004}, {"type": "mrr_at_20", "value": 39.322}, {"type": "mrr_at_3", "value": 36.32}, {"type": "mrr_at_5", "value": 37.787}, {"type": "ndcg_at_1", "value": 30.593999999999998}, {"type": "ndcg_at_10", "value": 38.606}, {"type": "ndcg_at_100", "value": 44.116}, {"type": "ndcg_at_1000", "value": 46.772999999999996}, {"type": "ndcg_at_20", "value": 40.775}, {"type": "ndcg_at_3", "value": 33.854}, {"type": "ndcg_at_5", "value": 35.957}, {"type": "precision_at_1", "value": 30.593999999999998}, {"type": "precision_at_10", "value": 7.112}, {"type": "precision_at_100", "value": 1.154}, {"type": "precision_at_1000", "value": 0.155}, {"type": "precision_at_20", "value": 4.2410000000000005}, {"type": "precision_at_3", "value": 16.323999999999998}, {"type": "precision_at_5", "value": 11.644}, {"type": "recall_at_1", "value": 24.131}, {"type": "recall_at_10", "value": 49.767}, {"type": "recall_at_100", "value": 73.57000000000001}, {"type": "recall_at_1000", "value": 91.842}, {"type": "recall_at_20", "value": 57.498000000000005}, {"type": "recall_at_3", "value": 35.888}, {"type": "recall_at_5", "value": 41.801}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackStatsRetrieval", "type": "mteb/cqadupstack-stats", "config": "default", "split": "test", "revision": "65ac3a16b8e91f9cee4c9828cc7c335575432a2a"}, "metrics": [{"type": "map_at_1", "value": 23.075000000000003}, {"type": "map_at_10", "value": 29.584}, {"type": "map_at_100", "value": 30.4}, {"type": "map_at_1000", "value": 30.501}, {"type": "map_at_20", "value": 30.051}, {"type": "map_at_3", "value": 
27.561000000000003}, {"type": "map_at_5", "value": 28.603}, {"type": "mrr_at_1", "value": 26.227}, {"type": "mrr_at_10", "value": 32.647}, {"type": "mrr_at_100", "value": 33.391999999999996}, {"type": "mrr_at_1000", "value": 33.469}, {"type": "mrr_at_20", "value": 33.053}, {"type": "mrr_at_3", "value": 30.776999999999997}, {"type": "mrr_at_5", "value": 31.828}, {"type": "ndcg_at_1", "value": 26.227}, {"type": "ndcg_at_10", "value": 33.582}, {"type": "ndcg_at_100", "value": 37.814}, {"type": "ndcg_at_1000", "value": 40.444}, {"type": "ndcg_at_20", "value": 35.163}, {"type": "ndcg_at_3", "value": 29.874000000000002}, {"type": "ndcg_at_5", "value": 31.53}, {"type": "precision_at_1", "value": 26.227}, {"type": "precision_at_10", "value": 5.244999999999999}, {"type": "precision_at_100", "value": 0.788}, {"type": "precision_at_1000", "value": 0.11100000000000002}, {"type": "precision_at_20", "value": 3.006}, {"type": "precision_at_3", "value": 12.73}, {"type": "precision_at_5", "value": 8.741999999999999}, {"type": "recall_at_1", "value": 23.075000000000003}, {"type": "recall_at_10", "value": 42.894}, {"type": "recall_at_100", "value": 62.721000000000004}, {"type": "recall_at_1000", "value": 81.858}, {"type": "recall_at_20", "value": 48.842}, {"type": "recall_at_3", "value": 32.783}, {"type": "recall_at_5", "value": 36.949}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackTexRetrieval", "type": "mteb/cqadupstack-tex", "config": "default", "split": "test", "revision": "46989137a86843e03a6195de44b09deda022eec7"}, "metrics": [{"type": "map_at_1", "value": 17.028}, {"type": "map_at_10", "value": 23.377}, {"type": "map_at_100", "value": 24.399}, {"type": "map_at_1000", "value": 24.524}, {"type": "map_at_20", "value": 23.863}, {"type": "map_at_3", "value": 21.274}, {"type": "map_at_5", "value": 22.431}, {"type": "mrr_at_1", "value": 20.578}, {"type": "mrr_at_10", "value": 27.009}, {"type": "mrr_at_100", "value": 27.889999999999997}, {"type": "mrr_at_1000", "value": 27.969}, {"type": "mrr_at_20", "value": 27.46}, {"type": "mrr_at_3", "value": 24.959999999999997}, {"type": "mrr_at_5", "value": 26.113999999999997}, {"type": "ndcg_at_1", "value": 20.578}, {"type": "ndcg_at_10", "value": 27.522999999999996}, {"type": "ndcg_at_100", "value": 32.601}, {"type": "ndcg_at_1000", "value": 35.636}, {"type": "ndcg_at_20", "value": 29.132}, {"type": "ndcg_at_3", "value": 23.771}, {"type": "ndcg_at_5", "value": 25.539}, {"type": "precision_at_1", "value": 20.578}, {"type": "precision_at_10", "value": 4.962}, {"type": "precision_at_100", "value": 0.8880000000000001}, {"type": "precision_at_1000", "value": 0.132}, {"type": "precision_at_20", "value": 2.959}, {"type": "precision_at_3", "value": 11.068999999999999}, {"type": "precision_at_5", "value": 8.052}, {"type": "recall_at_1", "value": 17.028}, {"type": "recall_at_10", "value": 36.266}, {"type": "recall_at_100", "value": 59.556}, {"type": "recall_at_1000", "value": 81.416}, {"type": "recall_at_20", "value": 42.303000000000004}, {"type": "recall_at_3", "value": 25.858999999999998}, {"type": "recall_at_5", "value": 30.422}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackUnixRetrieval", "type": "mteb/cqadupstack-unix", "config": "default", "split": "test", "revision": "6c6430d3a6d36f8d2a829195bc5dc94d7e063e53"}, "metrics": [{"type": "map_at_1", "value": 25.863000000000003}, {"type": "map_at_10", "value": 33.586}, {"type": "map_at_100", "value": 34.682}, {"type": "map_at_1000", "value": 34.791}, {"type": "map_at_20", 
"value": 34.182}, {"type": "map_at_3", "value": 31.044}, {"type": "map_at_5", "value": 32.507000000000005}, {"type": "mrr_at_1", "value": 30.131000000000004}, {"type": "mrr_at_10", "value": 37.518}, {"type": "mrr_at_100", "value": 38.355}, {"type": "mrr_at_1000", "value": 38.425}, {"type": "mrr_at_20", "value": 37.961}, {"type": "mrr_at_3", "value": 35.059000000000005}, {"type": "mrr_at_5", "value": 36.528}, {"type": "ndcg_at_1", "value": 30.131000000000004}, {"type": "ndcg_at_10", "value": 38.387}, {"type": "ndcg_at_100", "value": 43.617}, {"type": "ndcg_at_1000", "value": 46.038000000000004}, {"type": "ndcg_at_20", "value": 40.261}, {"type": "ndcg_at_3", "value": 33.722}, {"type": "ndcg_at_5", "value": 36.013}, {"type": "precision_at_1", "value": 30.131000000000004}, {"type": "precision_at_10", "value": 6.297}, {"type": "precision_at_100", "value": 1.008}, {"type": "precision_at_1000", "value": 0.132}, {"type": "precision_at_20", "value": 3.689}, {"type": "precision_at_3", "value": 15.049999999999999}, {"type": "precision_at_5", "value": 10.634}, {"type": "recall_at_1", "value": 25.863000000000003}, {"type": "recall_at_10", "value": 49.101}, {"type": "recall_at_100", "value": 72.286}, {"type": "recall_at_1000", "value": 89.14}, {"type": "recall_at_20", "value": 55.742999999999995}, {"type": "recall_at_3", "value": 36.513}, {"type": "recall_at_5", "value": 42.204}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackWebmastersRetrieval", "type": "mteb/cqadupstack-webmasters", "config": "default", "split": "test", "revision": "160c094312a0e1facb97e55eeddb698c0abe3571"}, "metrics": [{"type": "map_at_1", "value": 24.747}, {"type": "map_at_10", "value": 32.067}, {"type": "map_at_100", "value": 33.739999999999995}, {"type": "map_at_1000", "value": 33.952}, {"type": "map_at_20", "value": 32.927}, {"type": "map_at_3", "value": 29.736}, {"type": "map_at_5", "value": 30.996000000000002}, {"type": "mrr_at_1", "value": 29.644}, {"type": "mrr_at_10", "value": 36.683}, {"type": "mrr_at_100", "value": 37.808}, {"type": "mrr_at_1000", "value": 37.858999999999995}, {"type": "mrr_at_20", "value": 37.326}, {"type": "mrr_at_3", "value": 34.42}, {"type": "mrr_at_5", "value": 35.626000000000005}, {"type": "ndcg_at_1", "value": 29.644}, {"type": "ndcg_at_10", "value": 36.989}, {"type": "ndcg_at_100", "value": 43.589}, {"type": "ndcg_at_1000", "value": 46.133}, {"type": "ndcg_at_20", "value": 39.403}, {"type": "ndcg_at_3", "value": 33.273}, {"type": "ndcg_at_5", "value": 34.853}, {"type": "precision_at_1", "value": 29.644}, {"type": "precision_at_10", "value": 6.8180000000000005}, {"type": "precision_at_100", "value": 1.4529999999999998}, {"type": "precision_at_1000", "value": 0.23500000000000001}, {"type": "precision_at_20", "value": 4.457}, {"type": "precision_at_3", "value": 15.152}, {"type": "precision_at_5", "value": 10.711}, {"type": "recall_at_1", "value": 24.747}, {"type": "recall_at_10", "value": 45.714}, {"type": "recall_at_100", "value": 75.212}, {"type": "recall_at_1000", "value": 90.884}, {"type": "recall_at_20", "value": 54.777}, {"type": "recall_at_3", "value": 34.821999999999996}, {"type": "recall_at_5", "value": 39.278999999999996}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackWordpressRetrieval", "type": "mteb/cqadupstack-wordpress", "config": "default", "split": "test", "revision": "4ffe81d471b1924886b33c7567bfb200e9eec5c4"}, "metrics": [{"type": "map_at_1", "value": 19.261}, {"type": "map_at_10", "value": 26.873}, {"type": "map_at_100", "value": 
27.938000000000002}, {"type": "map_at_1000", "value": 28.060000000000002}, {"type": "map_at_20", "value": 27.456000000000003}, {"type": "map_at_3", "value": 24.834}, {"type": "map_at_5", "value": 25.793}, {"type": "mrr_at_1", "value": 20.887}, {"type": "mrr_at_10", "value": 28.634999999999998}, {"type": "mrr_at_100", "value": 29.609}, {"type": "mrr_at_1000", "value": 29.698999999999998}, {"type": "mrr_at_20", "value": 29.173}, {"type": "mrr_at_3", "value": 26.741}, {"type": "mrr_at_5", "value": 27.628000000000004}, {"type": "ndcg_at_1", "value": 20.887}, {"type": "ndcg_at_10", "value": 31.261}, {"type": "ndcg_at_100", "value": 36.471}, {"type": "ndcg_at_1000", "value": 39.245000000000005}, {"type": "ndcg_at_20", "value": 33.209}, {"type": "ndcg_at_3", "value": 27.195999999999998}, {"type": "ndcg_at_5", "value": 28.786}, {"type": "precision_at_1", "value": 20.887}, {"type": "precision_at_10", "value": 4.9910000000000005}, {"type": "precision_at_100", "value": 0.8210000000000001}, {"type": "precision_at_1000", "value": 0.116}, {"type": "precision_at_20", "value": 2.939}, {"type": "precision_at_3", "value": 11.892}, {"type": "precision_at_5", "value": 8.133}, {"type": "recall_at_1", "value": 19.261}, {"type": "recall_at_10", "value": 42.806}, {"type": "recall_at_100", "value": 66.715}, {"type": "recall_at_1000", "value": 86.921}, {"type": "recall_at_20", "value": 50.205999999999996}, {"type": "recall_at_3", "value": 31.790000000000003}, {"type": "recall_at_5", "value": 35.527}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB ClimateFEVER", "type": "mteb/climate-fever", "config": "default", "split": "test", "revision": "47f2ac6acb640fc46020b02a5b59fdda04d39380"}, "metrics": [{"type": "map_at_1", "value": 9.009}, {"type": "map_at_10", "value": 14.629}, {"type": "map_at_100", "value": 16.092000000000002}, {"type": "map_at_1000", "value": 16.267}, {"type": "map_at_20", "value": 15.384999999999998}, {"type": "map_at_3", "value": 12.280000000000001}, {"type": "map_at_5", "value": 13.442000000000002}, {"type": "mrr_at_1", "value": 20.0}, {"type": "mrr_at_10", "value": 29.298000000000002}, {"type": "mrr_at_100", "value": 30.375999999999998}, {"type": "mrr_at_1000", "value": 30.436999999999998}, {"type": "mrr_at_20", "value": 29.956}, {"type": "mrr_at_3", "value": 26.362999999999996}, {"type": "mrr_at_5", "value": 28.021}, {"type": "ndcg_at_1", "value": 20.0}, {"type": "ndcg_at_10", "value": 21.234}, {"type": "ndcg_at_100", "value": 27.687}, {"type": "ndcg_at_1000", "value": 31.325999999999997}, {"type": "ndcg_at_20", "value": 23.631}, {"type": "ndcg_at_3", "value": 17.101}, {"type": "ndcg_at_5", "value": 18.501}, {"type": "precision_at_1", "value": 20.0}, {"type": "precision_at_10", "value": 6.651}, {"type": "precision_at_100", "value": 1.347}, {"type": "precision_at_1000", "value": 0.201}, {"type": "precision_at_20", "value": 4.316}, {"type": "precision_at_3", "value": 12.53}, {"type": "precision_at_5", "value": 9.707}, {"type": "recall_at_1", "value": 9.009}, {"type": "recall_at_10", "value": 25.824}, {"type": "recall_at_100", "value": 48.535000000000004}, {"type": "recall_at_1000", "value": 69.44399999999999}, {"type": "recall_at_20", "value": 32.78}, {"type": "recall_at_3", "value": 15.693999999999999}, {"type": "recall_at_5", "value": 19.59}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB DBPedia", "type": "mteb/dbpedia", "config": "default", "split": "test", "revision": "c0f706b76e590d620bd6618b3ca8efdd34e2d659"}, "metrics": [{"type": "map_at_1", "value": 7.454}, 
{"type": "map_at_10", "value": 15.675}, {"type": "map_at_100", "value": 21.335}, {"type": "map_at_1000", "value": 22.639}, {"type": "map_at_20", "value": 17.822}, {"type": "map_at_3", "value": 11.609}, {"type": "map_at_5", "value": 13.342}, {"type": "mrr_at_1", "value": 56.25}, {"type": "mrr_at_10", "value": 65.30799999999999}, {"type": "mrr_at_100", "value": 65.90599999999999}, {"type": "mrr_at_1000", "value": 65.92099999999999}, {"type": "mrr_at_20", "value": 65.74600000000001}, {"type": "mrr_at_3", "value": 63.333}, {"type": "mrr_at_5", "value": 64.521}, {"type": "ndcg_at_1", "value": 44.625}, {"type": "ndcg_at_10", "value": 33.881}, {"type": "ndcg_at_100", "value": 37.775999999999996}, {"type": "ndcg_at_1000", "value": 44.956}, {"type": "ndcg_at_20", "value": 33.451}, {"type": "ndcg_at_3", "value": 37.72}, {"type": "ndcg_at_5", "value": 35.811}, {"type": "precision_at_1", "value": 56.25}, {"type": "precision_at_10", "value": 27.175}, {"type": "precision_at_100", "value": 8.448}, {"type": "precision_at_1000", "value": 1.809}, {"type": "precision_at_20", "value": 20.262}, {"type": "precision_at_3", "value": 41.333}, {"type": "precision_at_5", "value": 35.199999999999996}, {"type": "recall_at_1", "value": 7.454}, {"type": "recall_at_10", "value": 20.355999999999998}, {"type": "recall_at_100", "value": 43.168}, {"type": "recall_at_1000", "value": 66.559}, {"type": "recall_at_20", "value": 26.785999999999998}, {"type": "recall_at_3", "value": 13.052}, {"type": "recall_at_5", "value": 15.733}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB EmotionClassification", "type": "mteb/emotion", "config": "default", "split": "test", "revision": "4f58c6b202a23cf9a4da393831edf4f9183cad37"}, "metrics": [{"type": "accuracy", "value": 45.44499999999999}, {"type": "f1", "value": 40.581418056070994}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB FEVER", "type": "mteb/fever", "config": "default", "split": "test", "revision": "bea83ef9e8fb933d90a2f1d5515737465d613e12"}, "metrics": [{"type": "map_at_1", "value": 46.339000000000006}, {"type": "map_at_10", "value": 57.87}, {"type": "map_at_100", "value": 58.447}, {"type": "map_at_1000", "value": 58.474000000000004}, {"type": "map_at_20", "value": 58.241}, {"type": "map_at_3", "value": 55.336}, {"type": "map_at_5", "value": 56.879000000000005}, {"type": "mrr_at_1", "value": 49.91}, {"type": "mrr_at_10", "value": 61.55199999999999}, {"type": "mrr_at_100", "value": 62.07}, {"type": "mrr_at_1000", "value": 62.086}, {"type": "mrr_at_20", "value": 61.899}, {"type": "mrr_at_3", "value": 59.108000000000004}, {"type": "mrr_at_5", "value": 60.622}, {"type": "ndcg_at_1", "value": 49.91}, {"type": "ndcg_at_10", "value": 63.970000000000006}, {"type": "ndcg_at_100", "value": 66.625}, {"type": "ndcg_at_1000", "value": 67.221}, {"type": "ndcg_at_20", "value": 65.261}, {"type": "ndcg_at_3", "value": 59.059}, {"type": "ndcg_at_5", "value": 61.68900000000001}, {"type": "precision_at_1", "value": 49.91}, {"type": "precision_at_10", "value": 8.699}, {"type": "precision_at_100", "value": 1.015}, {"type": "precision_at_1000", "value": 0.108}, {"type": "precision_at_20", "value": 4.6370000000000005}, {"type": "precision_at_3", "value": 23.942}, {"type": "precision_at_5", "value": 15.815000000000001}, {"type": "recall_at_1", "value": 46.339000000000006}, {"type": "recall_at_10", "value": 79.28}, {"type": "recall_at_100", "value": 91.148}, {"type": "recall_at_1000", "value": 95.438}, {"type": "recall_at_20", "value": 84.187}, {"type": "recall_at_3", 
"value": 66.019}, {"type": "recall_at_5", "value": 72.394}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB FiQA2018", "type": "mteb/fiqa", "config": "default", "split": "test", "revision": "27a168819829fe9bcd655c2df245fb19452e8e06"}, "metrics": [{"type": "map_at_1", "value": 14.504}, {"type": "map_at_10", "value": 24.099999999999998}, {"type": "map_at_100", "value": 25.820999999999998}, {"type": "map_at_1000", "value": 25.997999999999998}, {"type": "map_at_20", "value": 25.003999999999998}, {"type": "map_at_3", "value": 21.218999999999998}, {"type": "map_at_5", "value": 22.744}, {"type": "mrr_at_1", "value": 29.475}, {"type": "mrr_at_10", "value": 38.072}, {"type": "mrr_at_100", "value": 39.196999999999996}, {"type": "mrr_at_1000", "value": 39.249}, {"type": "mrr_at_20", "value": 38.757999999999996}, {"type": "mrr_at_3", "value": 36.214}, {"type": "mrr_at_5", "value": 37.094}, {"type": "ndcg_at_1", "value": 29.475}, {"type": "ndcg_at_10", "value": 30.708999999999996}, {"type": "ndcg_at_100", "value": 37.744}, {"type": "ndcg_at_1000", "value": 41.215}, {"type": "ndcg_at_20", "value": 33.336}, {"type": "ndcg_at_3", "value": 28.243000000000002}, {"type": "ndcg_at_5", "value": 28.62}, {"type": "precision_at_1", "value": 29.475}, {"type": "precision_at_10", "value": 8.596}, {"type": "precision_at_100", "value": 1.562}, {"type": "precision_at_1000", "value": 0.219}, {"type": "precision_at_20", "value": 5.394}, {"type": "precision_at_3", "value": 19.084}, {"type": "precision_at_5", "value": 13.672999999999998}, {"type": "recall_at_1", "value": 14.504}, {"type": "recall_at_10", "value": 36.232}, {"type": "recall_at_100", "value": 62.712}, {"type": "recall_at_1000", "value": 83.864}, {"type": "recall_at_20", "value": 44.357}, {"type": "recall_at_3", "value": 26.029000000000003}, {"type": "recall_at_5", "value": 29.909000000000002}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB HotpotQA", "type": "mteb/hotpotqa", "config": "default", "split": "test", "revision": "ab518f4d6fcca38d87c25209f94beba119d02014"}, "metrics": [{"type": "map_at_1", "value": 31.634}, {"type": "map_at_10", "value": 45.007000000000005}, {"type": "map_at_100", "value": 45.963}, {"type": "map_at_1000", "value": 46.052}, {"type": "map_at_20", "value": 45.550000000000004}, {"type": "map_at_3", "value": 42.092}, {"type": "map_at_5", "value": 43.832}, {"type": "mrr_at_1", "value": 63.268}, {"type": "mrr_at_10", "value": 70.691}, {"type": "mrr_at_100", "value": 71.063}, {"type": "mrr_at_1000", "value": 71.082}, {"type": "mrr_at_20", "value": 70.917}, {"type": "mrr_at_3", "value": 69.176}, {"type": "mrr_at_5", "value": 70.132}, {"type": "ndcg_at_1", "value": 63.268}, {"type": "ndcg_at_10", "value": 54.205000000000005}, {"type": "ndcg_at_100", "value": 57.847}, {"type": "ndcg_at_1000", "value": 59.64}, {"type": "ndcg_at_20", "value": 55.663}, {"type": "ndcg_at_3", "value": 49.613}, {"type": "ndcg_at_5", "value": 52.054}, {"type": "precision_at_1", "value": 63.268}, {"type": "precision_at_10", "value": 11.357000000000001}, {"type": "precision_at_100", "value": 1.423}, {"type": "precision_at_1000", "value": 0.166}, {"type": "precision_at_20", "value": 6.148}, {"type": "precision_at_3", "value": 31.041999999999998}, {"type": "precision_at_5", "value": 20.551}, {"type": "recall_at_1", "value": 31.634}, {"type": "recall_at_10", "value": 56.786}, {"type": "recall_at_100", "value": 71.128}, {"type": "recall_at_1000", "value": 82.97099999999999}, {"type": "recall_at_20", "value": 61.47899999999999}, {"type": 
"recall_at_3", "value": 46.563}, {"type": "recall_at_5", "value": 51.376999999999995}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB ImdbClassification", "type": "mteb/imdb", "config": "default", "split": "test", "revision": "3d86128a09e091d6018b6d26cad27f2739fc2db7"}, "metrics": [{"type": "accuracy", "value": 80.7996}, {"type": "ap", "value": 74.98592172204835}, {"type": "f1", "value": 80.77161545117626}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB MSMARCO", "type": "mteb/msmarco", "config": "default", "split": "dev", "revision": "c5a29a104738b98a9e76336939199e264163d4a0"}, "metrics": [{"type": "map_at_1", "value": 16.637}, {"type": "map_at_10", "value": 27.331}, {"type": "map_at_100", "value": 28.518}, {"type": "map_at_1000", "value": 28.583}, {"type": "map_at_20", "value": 28.031}, {"type": "map_at_3", "value": 23.715}, {"type": "map_at_5", "value": 25.758}, {"type": "mrr_at_1", "value": 17.077}, {"type": "mrr_at_10", "value": 27.807}, {"type": "mrr_at_100", "value": 28.965999999999998}, {"type": "mrr_at_1000", "value": 29.025000000000002}, {"type": "mrr_at_20", "value": 28.499999999999996}, {"type": "mrr_at_3", "value": 24.234}, {"type": "mrr_at_5", "value": 26.257}, {"type": "ndcg_at_1", "value": 17.077}, {"type": "ndcg_at_10", "value": 33.607}, {"type": "ndcg_at_100", "value": 39.593}, {"type": "ndcg_at_1000", "value": 41.317}, {"type": "ndcg_at_20", "value": 36.118}, {"type": "ndcg_at_3", "value": 26.204}, {"type": "ndcg_at_5", "value": 29.862}, {"type": "precision_at_1", "value": 17.077}, {"type": "precision_at_10", "value": 5.54}, {"type": "precision_at_100", "value": 0.857}, {"type": "precision_at_1000", "value": 0.101}, {"type": "precision_at_20", "value": 3.2870000000000004}, {"type": "precision_at_3", "value": 11.361}, {"type": "precision_at_5", "value": 8.673}, {"type": "recall_at_1", "value": 16.637}, {"type": "recall_at_10", "value": 53.077}, {"type": "recall_at_100", "value": 81.306}, {"type": "recall_at_1000", "value": 94.72699999999999}, {"type": "recall_at_20", "value": 62.855000000000004}, {"type": "recall_at_3", "value": 32.897999999999996}, {"type": "recall_at_5", "value": 41.697}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MTOPDomainClassification (en)", "type": "mteb/mtop_domain", "config": "en", "split": "test", "revision": "d80d48c1eb48d3562165c59d59d0034df9fff0bf"}, "metrics": [{"type": "accuracy", "value": 92.12494300045599}, {"type": "f1", "value": 91.6522604757574}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MTOPIntentClassification (en)", "type": "mteb/mtop_intent", "config": "en", "split": "test", "revision": "ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba"}, "metrics": [{"type": "accuracy", "value": 71.86046511627907}, {"type": "f1", "value": 53.8926541769729}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MassiveIntentClassification (en)", "type": "mteb/amazon_massive_intent", "config": "en", "split": "test", "revision": "31efe3c427b0bae9c22cbb560b8f15491cc6bed7"}, "metrics": [{"type": "accuracy", "value": 70.34633490248824}, {"type": "f1", "value": 67.94196699295675}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MassiveScenarioClassification (en)", "type": "mteb/amazon_massive_scenario", "config": "en", "split": "test", "revision": "7d571f92784cd94a019292a1f45445077d0ef634"}, "metrics": [{"type": "accuracy", "value": 74.88903833221251}, {"type": "f1", "value": 74.54991713265153}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB 
MedrxivClusteringP2P", "type": "mteb/medrxiv-clustering-p2p", "config": "default", "split": "test", "revision": "e7a26af6f3ae46b30dde8737f02c07b1505bcc73"}, "metrics": [{"type": "v_measure", "value": 33.129785771060526}, {"type": "v_measures", "value": [0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 
0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 
0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 
0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 
0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 
0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535, 0.3116408980465631, 0.31900622847630045, 0.31934151231927727, 0.3186791563176499, 0.32750328333726775, 0.3510627418495332, 0.33347506212887845, 0.35025343435496104, 0.3417862644568677, 0.3402299958187535]}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB MedrxivClusteringS2S", "type": "mteb/medrxiv-clustering-s2s", "config": "default", "split": "test", "revision": "35191c8c0dca72d8ff3efcd72aa802307d469663"}, "metrics": [{"type": "v_measure", "value": 30.29367725266166}, {"type": "v_measures", "value": [0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 
0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 
0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 
0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 
0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 
0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 
0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535, 0.2892892644106019, 0.2904909862243706, 0.29717543408443786, 0.28841424958079537, 0.2946040279701031, 0.3071795420433026, 0.30471220279454575, 0.31753537687383027, 0.318823343042763, 0.32114329824141535]}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB NFCorpus", "type": "mteb/nfcorpus", "config": "default", "split": "test", "revision": "ec0fa4fe99da2ff19ca1214b7966684033a58814"}, "metrics": [{"type": "map_at_1", "value": 5.542}, {"type": "map_at_10", "value": 11.734}, {"type": "map_at_100", "value": 14.812}, {"type": "map_at_1000", "value": 16.184}, {"type": "map_at_20", "value": 13.045000000000002}, {"type": "map_at_3", "value": 8.859}, {"type": "map_at_5", "value": 10.162}, {"type": "mrr_at_1", "value": 43.963}, {"type": "mrr_at_10", "value": 51.914}, {"type": "mrr_at_100", "value": 52.422000000000004}, {"type": "mrr_at_1000", "value": 52.479}, {"type": "mrr_at_20", "value": 52.215}, {"type": "mrr_at_3", "value": 49.897000000000006}, {"type": "mrr_at_5", "value": 50.965}, {"type": "ndcg_at_1", "value": 42.105}, {"type": "ndcg_at_10", "value": 32.035000000000004}, {"type": "ndcg_at_100", "value": 29.487999999999996}, {"type": "ndcg_at_1000", "value": 38.316}, {"type": "ndcg_at_20", "value": 30.255}, {"type": "ndcg_at_3", "value": 37.098}, {"type": "ndcg_at_5", "value": 34.98}, {"type": 
"precision_at_1", "value": 43.344}, {"type": "precision_at_10", "value": 23.313}, {"type": "precision_at_100", "value": 7.591}, {"type": "precision_at_1000", "value": 2.023}, {"type": "precision_at_20", "value": 17.755000000000003}, {"type": "precision_at_3", "value": 33.745999999999995}, {"type": "precision_at_5", "value": 29.474}, {"type": "recall_at_1", "value": 5.542}, {"type": "recall_at_10", "value": 15.61}, {"type": "recall_at_100", "value": 29.413}, {"type": "recall_at_1000", "value": 61.926}, {"type": "recall_at_20", "value": 19.517}, {"type": "recall_at_3", "value": 9.669}, {"type": "recall_at_5", "value": 11.772}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB NQ", "type": "mteb/nq", "config": "default", "split": "test", "revision": "b774495ed302d8c44a3a7ea25c90dbce03968f31"}, "metrics": [{"type": "map_at_1", "value": 21.590999999999998}, {"type": "map_at_10", "value": 35.088}, {"type": "map_at_100", "value": 36.386}, {"type": "map_at_1000", "value": 36.439}, {"type": "map_at_20", "value": 35.93}, {"type": "map_at_3", "value": 30.985000000000003}, {"type": "map_at_5", "value": 33.322}, {"type": "mrr_at_1", "value": 24.189}, {"type": "mrr_at_10", "value": 37.395}, {"type": "mrr_at_100", "value": 38.449}, {"type": "mrr_at_1000", "value": 38.486}, {"type": "mrr_at_20", "value": 38.092999999999996}, {"type": "mrr_at_3", "value": 33.686}, {"type": "mrr_at_5", "value": 35.861}, {"type": "ndcg_at_1", "value": 24.189}, {"type": "ndcg_at_10", "value": 42.471}, {"type": "ndcg_at_100", "value": 48.150999999999996}, {"type": "ndcg_at_1000", "value": 49.342000000000006}, {"type": "ndcg_at_20", "value": 45.245000000000005}, {"type": "ndcg_at_3", "value": 34.483000000000004}, {"type": "ndcg_at_5", "value": 38.505}, {"type": "precision_at_1", "value": 24.189}, {"type": "precision_at_10", "value": 7.3870000000000005}, {"type": "precision_at_100", "value": 1.056}, {"type": "precision_at_1000", "value": 0.117}, {"type": "precision_at_20", "value": 4.35}, {"type": "precision_at_3", "value": 16.009999999999998}, {"type": "precision_at_5", "value": 11.883000000000001}, {"type": "recall_at_1", "value": 21.590999999999998}, {"type": "recall_at_10", "value": 62.79}, {"type": "recall_at_100", "value": 87.71}, {"type": "recall_at_1000", "value": 96.418}, {"type": "recall_at_20", "value": 73.042}, {"type": "recall_at_3", "value": 41.876999999999995}, {"type": "recall_at_5", "value": 51.205}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB QuoraRetrieval", "type": "mteb/quora", "config": "default", "split": "test", "revision": "e4e08e0b7dbe3c8700f0daef558ff32256715259"}, "metrics": [{"type": "map_at_1", "value": 68.31099999999999}, {"type": "map_at_10", "value": 81.845}, {"type": "map_at_100", "value": 82.518}, {"type": "map_at_1000", "value": 82.541}, {"type": "map_at_20", "value": 82.292}, {"type": "map_at_3", "value": 78.827}, {"type": "map_at_5", "value": 80.715}, {"type": "mrr_at_1", "value": 78.62}, {"type": "mrr_at_10", "value": 85.42}, {"type": "mrr_at_100", "value": 85.54899999999999}, {"type": "mrr_at_1000", "value": 85.55}, {"type": "mrr_at_20", "value": 85.516}, {"type": "mrr_at_3", "value": 84.265}, {"type": "mrr_at_5", "value": 85.021}, {"type": "ndcg_at_1", "value": 78.63}, {"type": "ndcg_at_10", "value": 86.032}, {"type": "ndcg_at_100", "value": 87.50099999999999}, {"type": "ndcg_at_1000", "value": 87.67200000000001}, {"type": "ndcg_at_20", "value": 86.822}, {"type": "ndcg_at_3", "value": 82.813}, {"type": "ndcg_at_5", "value": 84.555}, {"type": "precision_at_1", 
"value": 78.63}, {"type": "precision_at_10", "value": 13.025999999999998}, {"type": "precision_at_100", "value": 1.504}, {"type": "precision_at_1000", "value": 0.156}, {"type": "precision_at_20", "value": 6.944999999999999}, {"type": "precision_at_3", "value": 36.013}, {"type": "precision_at_5", "value": 23.788}, {"type": "recall_at_1", "value": 68.31099999999999}, {"type": "recall_at_10", "value": 94.003}, {"type": "recall_at_100", "value": 99.11999999999999}, {"type": "recall_at_1000", "value": 99.923}, {"type": "recall_at_20", "value": 96.55799999999999}, {"type": "recall_at_3", "value": 84.836}, {"type": "recall_at_5", "value": 89.655}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB RedditClustering", "type": "mteb/reddit-clustering", "config": "default", "split": "test", "revision": "24640382cdbf8abc73003fb0fa6d111a705499eb"}, "metrics": [{"type": "v_measure", "value": 47.52530454226057}, {"type": "v_measures", "value": [0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 
0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 
0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 
0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 
0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 
0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 
0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 
0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 
0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 
0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 
0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 
0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 
0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 
0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 
0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056, 0.47757401852125586, 0.5247425354540537, 0.4204113161707625, 0.46730199475875295, 0.44060686916417374, 0.40965236253971965, 0.5406478376242424, 0.4258020776189897, 0.45263355666588695, 0.4485852520776176, 0.45776058545875725, 0.5163652480866036, 0.4839337312350155, 0.4787997358105262, 0.5744729237665975, 0.4250543347829616, 0.49829072714687295, 0.5853438771525417, 0.4205343962194473, 0.42565458494862596, 0.4278942125559693, 0.450724893645709, 0.6135871494667406, 0.4720579979931778, 0.44289391670014056]}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB RedditClusteringP2P", "type": "mteb/reddit-clustering-p2p", "config": "default", "split": "test", "revision": "385e3cb46b4cfa89021f56c4380204149d0efe33"}, "metrics": [{"type": "v_measure", "value": 56.028612066452}, {"type": "v_measures", "value": [0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 
0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 
0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 
0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 
0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 
0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 
0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003, 0.616850986362034, 0.6156955011870908, 0.5889048703354965, 0.3132434489631298, 0.6351476398732859, 0.5618708165569017, 0.2892441818894155, 0.678005863237291, 0.6308488746145553, 0.6730490236260003]}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB SCIDOCS", "type": "mteb/scidocs", "config": "default", "split": "test", "revision": "f8c2fcf00f625baaa80f62ec5bd9e1fff3b8ae88"}, "metrics": [{"type": "map_at_1", "value": 4.108}, {"type": "map_at_10", "value": 10.953}, {"type": "map_at_100", "value": 13.004}, {"type": "map_at_1000", "value": 13.303}, {"type": "map_at_20", "value": 12.004}, {"type": "map_at_3", "value": 7.754999999999999}, {"type": "map_at_5", "value": 9.19}, {"type": "mrr_at_1", "value": 20.200000000000003}, {"type": "mrr_at_10", "value": 31.069999999999997}, {"type": "mrr_at_100", "value": 32.222}, {"type": "mrr_at_1000", "value": 32.277}, {"type": "mrr_at_20", "value": 31.761}, {"type": "mrr_at_3", "value": 27.717000000000002}, {"type": "mrr_at_5", "value": 29.416999999999998}, {"type": "ndcg_at_1", "value": 20.200000000000003}, {"type": "ndcg_at_10", "value": 18.636}, {"type": "ndcg_at_100", "value": 26.442}, {"type": "ndcg_at_1000", "value": 31.828}, {"type": "ndcg_at_20", "value": 21.441}, {"type": "ndcg_at_3", "value": 17.323}, {"type": "ndcg_at_5", "value": 15.010000000000002}, {"type": "precision_at_1", "value": 20.200000000000003}, {"type": "precision_at_10", "value": 9.9}, {"type": "precision_at_100", "value": 2.106}, {"type": "precision_at_1000", "value": 0.33999999999999997}, {"type": "precision_at_20", "value": 6.575}, {"type": "precision_at_3", "value": 16.367}, {"type": "precision_at_5", "value": 13.200000000000001}, {"type": "recall_at_1", "value": 4.108}, {"type": "recall_at_10", "value": 20.052}, {"type": "recall_at_100", "value": 42.723}, {"type": "recall_at_1000", "value": 69.118}, {"type": "recall_at_20", "value": 26.662999999999997}, {"type": "recall_at_3", "value": 9.963}, {"type": "recall_at_5", "value": 13.377}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB SICK-R", "type": "mteb/sickr-sts", "config": "default", "split": "test", "revision": "20a6d6f312dd54037fe07a32d58e5e168867909d"}, "metrics": [{"type": "cos_sim_pearson", "value": 81.73133871784073}, {"type": "cos_sim_spearman", "value": 75.63155962642634}, {"type": "euclidean_pearson", "value": 78.84721858652286}, {"type": "euclidean_spearman", "value": 75.52150847464515}, {"type": "manhattan_pearson", "value": 78.65433033180727}, {"type": "manhattan_spearman", "value": 75.30995832884881}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS12", "type": "mteb/sts12-sts", "config": "default", "split": "test", "revision": 
"a0d554a64d88156834ff5ae9920b964011b16384"}, "metrics": [{"type": "cos_sim_pearson", "value": 75.66063073145264}, {"type": "cos_sim_spearman", "value": 68.58158236004101}, {"type": "euclidean_pearson", "value": 72.54019756825143}, {"type": "euclidean_spearman", "value": 69.05526621955067}, {"type": "manhattan_pearson", "value": 72.69442494173272}, {"type": "manhattan_spearman", "value": 69.24310689645435}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS13", "type": "mteb/sts13-sts", "config": "default", "split": "test", "revision": "7e90230a92c190f1bf69ae9002b8cea547a64cca"}, "metrics": [{"type": "cos_sim_pearson", "value": 79.93061145846976}, {"type": "cos_sim_spearman", "value": 80.54473705232682}, {"type": "euclidean_pearson", "value": 80.25598213392439}, {"type": "euclidean_spearman", "value": 80.57639468906437}, {"type": "manhattan_pearson", "value": 80.04739474388745}, {"type": "manhattan_spearman", "value": 80.35672978503159}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS14", "type": "mteb/sts14-sts", "config": "default", "split": "test", "revision": "6031580fec1f6af667f0bd2da0a551cf4f0b2375"}, "metrics": [{"type": "cos_sim_pearson", "value": 80.63106651366024}, {"type": "cos_sim_spearman", "value": 77.628680514703}, {"type": "euclidean_pearson", "value": 79.88625241187461}, {"type": "euclidean_spearman", "value": 77.80535399731345}, {"type": "manhattan_pearson", "value": 79.78810133011544}, {"type": "manhattan_spearman", "value": 77.73028091841451}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS15", "type": "mteb/sts15-sts", "config": "default", "split": "test", "revision": "ae752c7c21bf194d8b67fd573edf7ae58183cbe3"}, "metrics": [{"type": "cos_sim_pearson", "value": 85.30832602658512}, {"type": "cos_sim_spearman", "value": 86.15687211744392}, {"type": "euclidean_pearson", "value": 85.94586990553746}, {"type": "euclidean_spearman", "value": 86.48157226860724}, {"type": "manhattan_pearson", "value": 85.88233798668581}, {"type": "manhattan_spearman", "value": 86.42359889540302}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS16", "type": "mteb/sts16-sts", "config": "default", "split": "test", "revision": "4d8694f8f0e0100860b497b999b3dbed754a0513"}, "metrics": [{"type": "cos_sim_pearson", "value": 81.48207305822743}, {"type": "cos_sim_spearman", "value": 82.8229306585227}, {"type": "euclidean_pearson", "value": 82.3912454156615}, {"type": "euclidean_spearman", "value": 83.09865476559257}, {"type": "manhattan_pearson", "value": 82.30053520575876}, {"type": "manhattan_spearman", "value": 83.00392320200139}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS17 (en-en)", "type": "mteb/sts17-crosslingual-sts", "config": "en-en", "split": "test", "revision": "af5e6fb845001ecf41f4c1e033ce921939a2a68d"}, "metrics": [{"type": "cos_sim_pearson", "value": 87.83517082969622}, {"type": "cos_sim_spearman", "value": 88.5704237984555}, {"type": "euclidean_pearson", "value": 88.15443024833176}, {"type": "euclidean_spearman", "value": 88.60313594495189}, {"type": "manhattan_pearson", "value": 87.99012996276818}, {"type": "manhattan_spearman", "value": 88.39306322978999}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS22 (en)", "type": "mteb/sts22-crosslingual-sts", "config": "en", "split": "test", "revision": "eea2b4fe26a775864c896887d910b76a8098ad3f"}, "metrics": [{"type": "cos_sim_pearson", "value": 67.62856734038614}, {"type": "cos_sim_spearman", "value": 67.38775280429276}, {"type": "euclidean_pearson", "value": 68.09416503472238}, {"type": 
"euclidean_spearman", "value": 67.45221088834498}, {"type": "manhattan_pearson", "value": 68.31811474137709}, {"type": "manhattan_spearman", "value": 67.75846817406287}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STSBenchmark", "type": "mteb/stsbenchmark-sts", "config": "default", "split": "test", "revision": "b0fddb56ed78048fa8b90373c8a3cfc37b684831"}, "metrics": [{"type": "cos_sim_pearson", "value": 84.13302836216701}, {"type": "cos_sim_spearman", "value": 84.24952159575491}, {"type": "euclidean_pearson", "value": 84.65017899273384}, {"type": "euclidean_spearman", "value": 84.43303793097236}, {"type": "manhattan_pearson", "value": 84.55589549879238}, {"type": "manhattan_spearman", "value": 84.42827667887977}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB SciDocsRR", "type": "mteb/scidocs-reranking", "config": "default", "split": "test", "revision": "d3c5e1fc0b855ab6097bf1cda04dd73947d7caab"}, "metrics": [{"type": "map", "value": 80.03616790601166}, {"type": "mrr", "value": 94.31135132115524}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB SciFact", "type": "mteb/scifact", "config": "default", "split": "test", "revision": "0228b52cf27578f30900b9e5271d331663a030d7"}, "metrics": [{"type": "map_at_1", "value": 51.678000000000004}, {"type": "map_at_10", "value": 62.011}, {"type": "map_at_100", "value": 62.443000000000005}, {"type": "map_at_1000", "value": 62.468999999999994}, {"type": "map_at_20", "value": 62.226000000000006}, {"type": "map_at_3", "value": 58.443999999999996}, {"type": "map_at_5", "value": 60.550000000000004}, {"type": "mrr_at_1", "value": 54.0}, {"type": "mrr_at_10", "value": 63.27199999999999}, {"type": "mrr_at_100", "value": 63.596}, {"type": "mrr_at_1000", "value": 63.619}, {"type": "mrr_at_20", "value": 63.416}, {"type": "mrr_at_3", "value": 60.5}, {"type": "mrr_at_5", "value": 62.283}, {"type": "ndcg_at_1", "value": 54.0}, {"type": "ndcg_at_10", "value": 67.315}, {"type": "ndcg_at_100", "value": 69.372}, {"type": "ndcg_at_1000", "value": 70.15400000000001}, {"type": "ndcg_at_20", "value": 67.943}, {"type": "ndcg_at_3", "value": 61.121}, {"type": "ndcg_at_5", "value": 64.399}, {"type": "precision_at_1", "value": 54.0}, {"type": "precision_at_10", "value": 9.232999999999999}, {"type": "precision_at_100", "value": 1.047}, {"type": "precision_at_1000", "value": 0.11100000000000002}, {"type": "precision_at_20", "value": 4.7829999999999995}, {"type": "precision_at_3", "value": 23.666999999999998}, {"type": "precision_at_5", "value": 16.2}, {"type": "recall_at_1", "value": 51.678000000000004}, {"type": "recall_at_10", "value": 82.389}, {"type": "recall_at_100", "value": 92.0}, {"type": "recall_at_1000", "value": 98.333}, {"type": "recall_at_20", "value": 84.63300000000001}, {"type": "recall_at_3", "value": 66.05}, {"type": "recall_at_5", "value": 74.006}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB SprintDuplicateQuestions", "type": "mteb/sprintduplicatequestions-pairclassification", "config": "default", "split": "test", "revision": "d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46"}, "metrics": [{"type": "cos_sim_accuracy", "value": 99.82673267326733}, {"type": "cos_sim_ap", "value": 95.11999931294784}, {"type": "cos_sim_f1", "value": 91.0941475826972}, {"type": "cos_sim_precision", "value": 92.74611398963731}, {"type": "cos_sim_recall", "value": 89.5}, {"type": "dot_accuracy", "value": 99.73861386138614}, {"type": "dot_ap", "value": 92.76208671816435}, {"type": "dot_f1", "value": 86.5055387713998}, {"type": 
"dot_precision", "value": 87.11967545638946}, {"type": "dot_recall", "value": 85.9}, {"type": "euclidean_accuracy", "value": 99.82376237623762}, {"type": "euclidean_ap", "value": 95.02471241011084}, {"type": "euclidean_f1", "value": 90.97363083164299}, {"type": "euclidean_precision", "value": 92.28395061728395}, {"type": "euclidean_recall", "value": 89.7}, {"type": "manhattan_accuracy", "value": 99.82574257425742}, {"type": "manhattan_ap", "value": 95.08424842231868}, {"type": "manhattan_f1", "value": 91.10212335692619}, {"type": "manhattan_precision", "value": 92.12678936605317}, {"type": "manhattan_recall", "value": 90.10000000000001}, {"type": "max_accuracy", "value": 99.82673267326733}, {"type": "max_ap", "value": 95.11999931294784}, {"type": "max_f1", "value": 91.10212335692619}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB StackExchangeClustering", "type": "mteb/stackexchange-clustering", "config": "default", "split": "test", "revision": "6cbc1f7b2bc0622f2e39d2c77fa502909748c259"}, "metrics": [{"type": "v_measure", "value": 53.870949746768424}, {"type": "v_measures", "value": [0.53571634076978, 0.5884760755274984, 0.46493825119779986, 0.5647097615749553, 0.5050495849120543, 0.491061219994023, 0.4819622731542588, 0.5685868012607284, 0.5540760555292195, 0.531322826771169, 0.5932274601787088, 0.6261393631444355, 0.6353921700607754, 0.6018599887005625, 0.5217064752780205, 0.5317605881853373, 0.5257201882718268, 0.5260835662200616, 0.5003275253721006, 0.5110511254674243, 0.5261695936445681, 0.5091730883971124, 0.48910042016546806, 0.5422967369475379, 0.5418299559666825, 0.53571634076978, 0.5884760755274984, 0.46493825119779986, 0.5647097615749553, 0.5050495849120543, 0.491061219994023, 0.4819622731542588, 0.5685868012607284, 0.5540760555292195, 0.531322826771169, 0.5932274601787088, 0.6261393631444355, 0.6353921700607754, 0.6018599887005625, 0.5217064752780205, 0.5317605881853373, 0.5257201882718268, 0.5260835662200616, 0.5003275253721006, 0.5110511254674243, 0.5261695936445681, 0.5091730883971124, 0.48910042016546806, 0.5422967369475379, 0.5418299559666825, 0.53571634076978, 0.5884760755274984, 0.46493825119779986, 0.5647097615749553, 0.5050495849120543, 0.491061219994023, 0.4819622731542588, 0.5685868012607284, 0.5540760555292195, 0.531322826771169, 0.5932274601787088, 0.6261393631444355, 0.6353921700607754, 0.6018599887005625, 0.5217064752780205, 0.5317605881853373, 0.5257201882718268, 0.5260835662200616, 0.5003275253721006, 0.5110511254674243, 0.5261695936445681, 0.5091730883971124, 0.48910042016546806, 0.5422967369475379, 0.5418299559666825, 0.53571634076978, 0.5884760755274984, 0.46493825119779986, 0.5647097615749553, 0.5050495849120543, 0.491061219994023, 0.4819622731542588, 0.5685868012607284, 0.5540760555292195, 0.531322826771169, 0.5932274601787088, 0.6261393631444355, 0.6353921700607754, 0.6018599887005625, 0.5217064752780205, 0.5317605881853373, 0.5257201882718268, 0.5260835662200616, 0.5003275253721006, 0.5110511254674243, 0.5261695936445681, 0.5091730883971124, 0.48910042016546806, 0.5422967369475379, 0.5418299559666825, 0.53571634076978, 0.5884760755274984, 0.46493825119779986, 0.5647097615749553, 0.5050495849120543, 0.491061219994023, 0.4819622731542588, 0.5685868012607284, 0.5540760555292195, 0.531322826771169, 0.5932274601787088, 0.6261393631444355, 0.6353921700607754, 0.6018599887005625, 0.5217064752780205, 0.5317605881853373, 0.5257201882718268, 0.5260835662200616, 0.5003275253721006, 0.5110511254674243, 0.5261695936445681, 0.5091730883971124, 
0.6261393631444355, 0.6353921700607754, 0.6018599887005625, 0.5217064752780205, 0.5317605881853373, 0.5257201882718268, 0.5260835662200616, 0.5003275253721006, 0.5110511254674243, 0.5261695936445681, 0.5091730883971124, 0.48910042016546806, 0.5422967369475379, 0.5418299559666825, 0.53571634076978, 0.5884760755274984, 0.46493825119779986, 0.5647097615749553, 0.5050495849120543, 0.491061219994023, 0.4819622731542588, 0.5685868012607284, 0.5540760555292195, 0.531322826771169, 0.5932274601787088, 0.6261393631444355, 0.6353921700607754, 0.6018599887005625, 0.5217064752780205, 0.5317605881853373, 0.5257201882718268, 0.5260835662200616, 0.5003275253721006, 0.5110511254674243, 0.5261695936445681, 0.5091730883971124, 0.48910042016546806, 0.5422967369475379, 0.5418299559666825, 0.53571634076978, 0.5884760755274984, 0.46493825119779986, 0.5647097615749553, 0.5050495849120543, 0.491061219994023, 0.4819622731542588, 0.5685868012607284, 0.5540760555292195, 0.531322826771169, 0.5932274601787088, 0.6261393631444355, 0.6353921700607754, 0.6018599887005625, 0.5217064752780205, 0.5317605881853373, 0.5257201882718268, 0.5260835662200616, 0.5003275253721006, 0.5110511254674243, 0.5261695936445681, 0.5091730883971124, 0.48910042016546806, 0.5422967369475379, 0.5418299559666825]}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB StackExchangeClusteringP2P", "type": "mteb/stackexchange-clustering-p2p", "config": "default", "split": "test", "revision": "815ca46b2622cec33ccafc3735d572c266efdb44"}, "metrics": [{"type": "v_measure", "value": 33.56703823226784}, {"type": "v_measures", "value": [0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 
0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 
0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 
0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 
0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 
0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 
0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039, 0.320494817263046, 0.3250723341694729, 0.32168615316198984, 0.31328349679632345, 0.31938046148819477, 0.36421160408518477, 0.3463076518950044, 0.35187389429456556, 0.3507929680626984, 0.3436004420103039]}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB StackOverflowDupQuestions", "type": "mteb/stackoverflowdupquestions-reranking", "config": "default", "split": "test", "revision": "e185fbe320c72810689fc5848eb6114e1ef5ec69"}, "metrics": [{"type": "map", "value": 49.82873266157383}, {"type": "mrr", "value": 50.652096065699006}]}, {"task": {"type": "Summarization"}, "dataset": {"name": "MTEB SummEval", "type": "mteb/summeval", "config": "default", "split": "test", "revision": "cda12ad7615edc362dbf25a00fdd61d3b1eaf93c"}, "metrics": [{"type": "cos_sim_pearson", "value": 31.35739606124227}, {"type": "cos_sim_spearman", "value": 31.26775311472305}, {"type": "dot_pearson", "value": 29.421400993418278}, {"type": "dot_spearman", "value": 30.180472594773534}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB TRECCOVID", "type": "mteb/trec-covid", "config": "default", "split": "test", "revision": "bb9466bac8153a0349341eb1b22e06409e78ef4e"}, "metrics": [{"type": "map_at_1", "value": 0.184}, {"type": "map_at_10", "value": 1.398}, {"type": "map_at_100", "value": 7.2090000000000005}, {"type": "map_at_1000", "value": 18.414}, {"type": "map_at_20", "value": 2.414}, {"type": "map_at_3", "value": 0.509}, {"type": "map_at_5", "value": 0.767}, {"type": "mrr_at_1", "value": 72.0}, {"type": "mrr_at_10", "value": 80.467}, {"type": "mrr_at_100", "value": 80.735}, {"type": "mrr_at_1000", "value": 80.735}, {"type": "mrr_at_20", "value": 80.735}, {"type": "mrr_at_3", "value": 79.0}, {"type": "mrr_at_5", "value": 79.80000000000001}, {"type": "ndcg_at_1", "value": 68.0}, {"type": "ndcg_at_10", "value": 60.324}, {"type": "ndcg_at_100", "value": 43.866}, {"type": "ndcg_at_1000", "value": 41.932}, {"type": "ndcg_at_20", "value": 56.013999999999996}, {"type": "ndcg_at_3", "value": 66.458}, {"type": "ndcg_at_5", "value": 63.048}, {"type": "precision_at_1", "value": 72.0}, {"type": "precision_at_10", "value": 64.2}, {"type": "precision_at_100", "value": 44.56}, {"type": "precision_at_1000", "value": 18.736}, {"type": "precision_at_20", "value": 59.0}, {"type": "precision_at_3", "value": 72.0}, {"type": "precision_at_5", "value": 67.2}, {"type": "recall_at_1", "value": 0.184}, {"type": "recall_at_10", "value": 1.649}, {"type": "recall_at_100", "value": 10.659}, {"type": "recall_at_1000", "value": 40.424}, {"type": "recall_at_20", "value": 3.0349999999999997}, {"type": "recall_at_3", "value": 0.5519999999999999}, {"type": "recall_at_5", "value": 0.852}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB Touche2020", "type": "mteb/touche2020", "config": "default", "split": "test", "revision": "a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f"}, "metrics": [{"type": "map_at_1", "value": 1.252}, {"type": "map_at_10", "value": 8.029}, {"type": "map_at_100", "value": 13.504}, {"type": "map_at_1000", "value": 
15.013000000000002}, {"type": "map_at_20", "value": 10.306}, {"type": "map_at_3", "value": 3.372}, {"type": "map_at_5", "value": 4.923}, {"type": "mrr_at_1", "value": 18.367}, {"type": "mrr_at_10", "value": 36.612}, {"type": "mrr_at_100", "value": 37.345}, {"type": "mrr_at_1000", "value": 37.345}, {"type": "mrr_at_20", "value": 36.955}, {"type": "mrr_at_3", "value": 32.993}, {"type": "mrr_at_5", "value": 33.912}, {"type": "ndcg_at_1", "value": 16.326999999999998}, {"type": "ndcg_at_10", "value": 21.124000000000002}, {"type": "ndcg_at_100", "value": 32.635}, {"type": "ndcg_at_1000", "value": 43.993}, {"type": "ndcg_at_20", "value": 22.429}, {"type": "ndcg_at_3", "value": 20.836}, {"type": "ndcg_at_5", "value": 20.437}, {"type": "precision_at_1", "value": 18.367}, {"type": "precision_at_10", "value": 21.02}, {"type": "precision_at_100", "value": 7.245}, {"type": "precision_at_1000", "value": 1.473}, {"type": "precision_at_20", "value": 15.714}, {"type": "precision_at_3", "value": 23.128999999999998}, {"type": "precision_at_5", "value": 22.448999999999998}, {"type": "recall_at_1", "value": 1.252}, {"type": "recall_at_10", "value": 15.312999999999999}, {"type": "recall_at_100", "value": 44.908}, {"type": "recall_at_1000", "value": 79.396}, {"type": "recall_at_20", "value": 22.647000000000002}, {"type": "recall_at_3", "value": 4.883}, {"type": "recall_at_5", "value": 7.917000000000001}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB ToxicConversationsClassification", "type": "mteb/toxic_conversations_50k", "config": "default", "split": "test", "revision": "edfaf9da55d3dd50d43143d90c1ac476895ae6de"}, "metrics": [{"type": "accuracy", "value": 65.458984375}, {"type": "ap", "value": 12.013147326225168}, {"type": "f1", "value": 50.30981581053394}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB TweetSentimentExtractionClassification", "type": "mteb/tweet_sentiment_extraction", "config": "default", "split": "test", "revision": "d604517c81ca91fe16a244d1248fc021f9ecee7a"}, "metrics": [{"type": "accuracy", "value": 58.658743633276735}, {"type": "f1", "value": 59.01001910848807}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB TwentyNewsgroupsClustering", "type": "mteb/twentynewsgroups-clustering", "config": "default", "split": "test", "revision": "6125ec4e24fa026cec8a478383ee943acfbd5449"}, "metrics": [{"type": "v_measure", "value": 40.7719980016582}, {"type": "v_measures", "value": [0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 
0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 
0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 
0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 
0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 
0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 
0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471, 0.43398618769240316, 0.411071419600849, 0.4084167708216848, 0.4309144066998439, 0.3937926057303082, 0.41327169334332636, 0.4194895558089149, 0.3732114423385808, 0.4053128667752613, 0.3877328513546471]}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB TwitterSemEval2015", "type": "mteb/twittersemeval2015-pairclassification", "config": "default", "split": "test", "revision": "70970daeab8776df92f5ea462b6173c0b46fd2d1"}, "metrics": [{"type": "cos_sim_accuracy", "value": 84.71717231924659}, {"type": "cos_sim_ap", "value": 69.78325722226528}, {"type": "cos_sim_f1", "value": 66.23786691615015}, {"type": "cos_sim_precision", "value": 59.483301827347205}, {"type": "cos_sim_recall", "value": 74.72295514511873}, {"type": "dot_accuracy", "value": 81.95148119449246}, {"type": "dot_ap", "value": 60.71125646179137}, {"type": "dot_f1", "value": 58.44781026182928}, {"type": "dot_precision", "value": 52.65496086312672}, {"type": "dot_recall", "value": 65.67282321899735}, {"type": "euclidean_accuracy", "value": 84.84830422602371}, {"type": "euclidean_ap", "value": 69.97192936786296}, {"type": "euclidean_f1", "value": 66.53649011471808}, {"type": "euclidean_precision", "value": 61.898274296094456}, {"type": "euclidean_recall", "value": 71.92612137203166}, {"type": "manhattan_accuracy", "value": 84.75889610776659}, {"type": "manhattan_ap", "value": 69.75691180376053}, {"type": "manhattan_f1", "value": 66.32788868723533}, {"type": "manhattan_precision", "value": 61.2513966480447}, {"type": "manhattan_recall", "value": 72.32189973614776}, {"type": "max_accuracy", "value": 84.84830422602371}, {"type": "max_ap", "value": 69.97192936786296}, {"type": "max_f1", "value": 66.53649011471808}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB TwitterURLCorpus", "type": "mteb/twitterurlcorpus-pairclassification", "config": "default", "split": "test", "revision": "8b6510b0b1fa4e4c4f879467980e9be563ec1cdf"}, "metrics": [{"type": "cos_sim_accuracy", "value": 88.43287926417511}, {"type": "cos_sim_ap", "value": 85.07378179191598}, {"type": "cos_sim_f1", "value": 77.50230244980658}, {"type": "cos_sim_precision", "value": 74.30246521155613}, {"type": "cos_sim_recall", "value": 80.99014474899907}, {"type": "dot_accuracy", "value": 86.946481934257}, {"type": "dot_ap", "value": 80.90485630835825}, {"type": "dot_f1", "value": 74.43342263413221}, {"type": "dot_precision", "value": 
70.24736914035807}, {"type": "dot_recall", "value": 79.1499846011703}, {"type": "euclidean_accuracy", "value": 88.49303372530757}, {"type": "euclidean_ap", "value": 85.08920672765427}, {"type": "euclidean_f1", "value": 77.53514807059526}, {"type": "euclidean_precision", "value": 75.3707473102646}, {"type": "euclidean_recall", "value": 79.82753310748383}, {"type": "manhattan_accuracy", "value": 88.47168859393798}, {"type": "manhattan_ap", "value": 85.01816084029292}, {"type": "manhattan_f1", "value": 77.36513181524315}, {"type": "manhattan_precision", "value": 72.5057223643463}, {"type": "manhattan_recall", "value": 82.9226978749615}, {"type": "max_accuracy", "value": 88.49303372530757}, {"type": "max_ap", "value": 85.08920672765427}, {"type": "max_f1", "value": 77.53514807059526}]}]}]}
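The metric blocks above follow the MTEB model-index schema (task, dataset, metrics). As a hedged illustration of how such numbers are typically produced — assuming the `mteb` and `sentence-transformers` packages, and using the model id from the row below — a minimal evaluation sketch is shown here; the exact task selection and the `MTEB` entry point vary between `mteb` versions, so treat this as a sketch rather than the authors' actual evaluation script.

```python
# Minimal sketch: producing MTEB-style metrics for a sentence-embedding model.
# Assumes `mteb` and `sentence-transformers` are installed; the task list below
# is illustrative (two tasks reported in the metadata above), not the full benchmark.
from mteb import MTEB
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("Mihaiii/Ivysaur")  # model id from the row below

evaluation = MTEB(tasks=["StackOverflowDupQuestions", "TRECCOVID"])
results = evaluation.run(model, output_folder="results/Ivysaur")
print(results)
```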
Mihaiii/Ivysaur
null
[ "sentence-transformers", "onnx", "safetensors", "bert", "feature-extraction", "sentence-similarity", "gte", "mteb", "dataset:Mihaiii/qa-assistant", "base_model:TaylorAI/gte-tiny", "license:mit", "model-index", "endpoints_compatible", "region:us" ]
null
2024-04-27T10:10:39+00:00
[]
[]
TAGS #sentence-transformers #onnx #safetensors #bert #feature-extraction #sentence-similarity #gte #mteb #dataset-Mihaiii/qa-assistant #base_model-TaylorAI/gte-tiny #license-mit #model-index #endpoints_compatible #region-us
# Ivysaur This is a fine-tune of gte-tiny using qa-assistant. ## Intended purpose <span style="color:blue">This model is designed for use in semantic-autocomplete (click here for demo).</span> ## Usage (Sentence-Transformers) (same as gte-tiny) Using this model becomes easy when you have sentence-transformers installed: Then you can use the model like this: ## Usage (HuggingFace Transformers) (same as gte-tiny) Without sentence-transformers, you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling operation on top of the contextualized word embeddings. ### Limitation (same as gte-small) This model exclusively caters to English texts, and any lengthy texts will be truncated to a maximum of 512 tokens.
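The two usage sections above reference code blocks that were stripped from this text field. A minimal sketch of both routes follows, assuming the standard sentence-transformers recipe with mean pooling (the pooling choice is an assumption here, mirroring the usual gte-style setup rather than anything stated in this row):

```python
# Sketch of the two usage paths described above (the original code blocks are
# not present in this dump). Assumes `sentence-transformers` and `transformers`
# are installed; mean pooling is an assumed, gte-style choice.
from sentence_transformers import SentenceTransformer

sentences = ["This is an example sentence", "Each sentence is converted"]
model = SentenceTransformer("Mihaiii/Ivysaur")
embeddings = model.encode(sentences)
print(embeddings.shape)

# Plain Hugging Face Transformers route with mean pooling:
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

def mean_pooling(model_output, attention_mask):
    token_embeddings = model_output[0]  # first element: per-token embeddings
    mask = attention_mask.unsqueeze(-1).expand(token_embeddings.size()).float()
    return torch.sum(token_embeddings * mask, 1) / torch.clamp(mask.sum(1), min=1e-9)

tokenizer = AutoTokenizer.from_pretrained("Mihaiii/Ivysaur")
hf_model = AutoModel.from_pretrained("Mihaiii/Ivysaur")

encoded = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    output = hf_model(**encoded)
sentence_embeddings = F.normalize(
    mean_pooling(output, encoded["attention_mask"]), p=2, dim=1
)
print(sentence_embeddings.shape)
```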
[ "# Ivysaur\n\nThis is a fine-tune of gte-tiny using qa-assistant.", "## Intended purpose\n\n<span style=\"color:blue\">This model is designed for use in semantic-autocomplete (click here for demo).</span>", "## Usage (Sentence-Transformers) (same as gte-tiny)\n\nUsing this model becomes easy when you have sentence-transformers installed:\n\n\n\nThen you can use the model like this:", "## Usage (HuggingFace Transformers) (same as gte-tiny)\nWithout sentence-transformers, you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling-operation on-top of the contextualized word embeddings.", "### Limitation (same as gte-small)\nThis model exclusively caters to English texts, and any lengthy texts will be truncated to a maximum of 512 tokens." ]
[ "TAGS\n#sentence-transformers #onnx #safetensors #bert #feature-extraction #sentence-similarity #gte #mteb #dataset-Mihaiii/qa-assistant #base_model-TaylorAI/gte-tiny #license-mit #model-index #endpoints_compatible #region-us \n", "# Ivysaur\n\nThis is a fine-tune of gte-tiny using qa-assistant.", "## Intended purpose\n\n<span style=\"color:blue\">This model is designed for use in semantic-autocomplete (click here for demo).</span>", "## Usage (Sentence-Transformers) (same as gte-tiny)\n\nUsing this model becomes easy when you have sentence-transformers installed:\n\n\n\nThen you can use the model like this:", "## Usage (HuggingFace Transformers) (same as gte-tiny)\nWithout sentence-transformers, you can use the model like this: First, you pass your input through the transformer model, then you have to apply the right pooling-operation on-top of the contextualized word embeddings.", "### Limitation (same as gte-small)\nThis model exclusively caters to English texts, and any lengthy texts will be truncated to a maximum of 512 tokens." ]
text-generation
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # 0.001_5iters_bs256_nodpo_only4w_iter_2 This model is a fine-tuned version of [ShenaoZhang/0.001_5iters_bs256_nodpo_only4w_iter_1](https://huggingface.co/ShenaoZhang/0.001_5iters_bs256_nodpo_only4w_iter_1) on the updated and the original datasets. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-07 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - distributed_type: multi-GPU - num_devices: 8 - gradient_accumulation_steps: 4 - total_train_batch_size: 256 - total_eval_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 1 ### Training results ### Framework versions - Transformers 4.40.0 - Pytorch 2.1.2+cu121 - Datasets 2.14.6 - Tokenizers 0.19.1
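The hyperparameter list above corresponds to a fairly standard 🤗 Trainer configuration. As a sketch only — the actual run used the alignment-handbook / trl DPO setup referenced in the tags, which wraps these values in its own config classes — the same settings expressed as generic `TrainingArguments` would look roughly like this:

```python
# Sketch only: the listed hyperparameters mapped onto generic TrainingArguments.
# The real training used the alignment-handbook / trl DPO recipe; names below are
# the standard transformers equivalents, not the authors' actual config.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="0.001_5iters_bs256_nodpo_only4w_iter_2",
    learning_rate=5e-7,
    per_device_train_batch_size=8,   # train_batch_size: 8
    per_device_eval_batch_size=8,    # eval_batch_size: 8
    gradient_accumulation_steps=4,   # 8 GPUs x 8 per device x 4 steps = 256 total
    num_train_epochs=1,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    seed=42,
    optim="adamw_torch",             # Adam with betas=(0.9, 0.999), eps=1e-8
)
```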
{"license": "mit", "tags": ["alignment-handbook", "trl", "dpo", "generated_from_trainer", "trl", "dpo", "generated_from_trainer"], "datasets": ["updated", "original"], "base_model": "ShenaoZhang/0.001_5iters_bs256_nodpo_only4w_iter_1", "model-index": [{"name": "0.001_5iters_bs256_nodpo_only4w_iter_2", "results": []}]}
ShenaoZhang/0.001_5iters_bs256_nodpo_only4w_iter_2
null
[ "transformers", "safetensors", "mistral", "text-generation", "alignment-handbook", "trl", "dpo", "generated_from_trainer", "conversational", "dataset:updated", "dataset:original", "base_model:ShenaoZhang/0.001_5iters_bs256_nodpo_only4w_iter_1", "license:mit", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2024-04-27T10:15:34+00:00
[]
[]
TAGS #transformers #safetensors #mistral #text-generation #alignment-handbook #trl #dpo #generated_from_trainer #conversational #dataset-updated #dataset-original #base_model-ShenaoZhang/0.001_5iters_bs256_nodpo_only4w_iter_1 #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# 0.001_5iters_bs256_nodpo_only4w_iter_2 This model is a fine-tuned version of ShenaoZhang/0.001_5iters_bs256_nodpo_only4w_iter_1 on the updated and the original datasets. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-07 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - distributed_type: multi-GPU - num_devices: 8 - gradient_accumulation_steps: 4 - total_train_batch_size: 256 - total_eval_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 1 ### Training results ### Framework versions - Transformers 4.40.0 - Pytorch 2.1.2+cu121 - Datasets 2.14.6 - Tokenizers 0.19.1
[ "# 0.001_5iters_bs256_nodpo_only4w_iter_2\n\nThis model is a fine-tuned version of ShenaoZhang/0.001_5iters_bs256_nodpo_only4w_iter_1 on the updated and the original datasets.", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-07\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- distributed_type: multi-GPU\n- num_devices: 8\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 256\n- total_eval_batch_size: 64\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 1", "### Training results", "### Framework versions\n\n- Transformers 4.40.0\n- Pytorch 2.1.2+cu121\n- Datasets 2.14.6\n- Tokenizers 0.19.1" ]
[ "TAGS\n#transformers #safetensors #mistral #text-generation #alignment-handbook #trl #dpo #generated_from_trainer #conversational #dataset-updated #dataset-original #base_model-ShenaoZhang/0.001_5iters_bs256_nodpo_only4w_iter_1 #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# 0.001_5iters_bs256_nodpo_only4w_iter_2\n\nThis model is a fine-tuned version of ShenaoZhang/0.001_5iters_bs256_nodpo_only4w_iter_1 on the updated and the original datasets.", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-07\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- distributed_type: multi-GPU\n- num_devices: 8\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 256\n- total_eval_batch_size: 64\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 1", "### Training results", "### Framework versions\n\n- Transformers 4.40.0\n- Pytorch 2.1.2+cu121\n- Datasets 2.14.6\n- Tokenizers 0.19.1" ]
text-generation
transformers
# Description 4-bit AWQ-quantized version of [stylellm/SanGuoYanYi-6b](https://huggingface.co/stylellm/SanGuoYanYi-6b)
{"license": "other", "license_name": "yi-license", "license_link": "https://huggingface.co/01-ai/Yi-6B/blob/main/LICENSE"}
stylellm/SanGuoYanYi-6b-AWQ
null
[ "transformers", "safetensors", "llama", "text-generation", "conversational", "license:other", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "4-bit", "region:us" ]
null
2024-04-27T10:18:37+00:00
[]
[]
TAGS #transformers #safetensors #llama #text-generation #conversational #license-other #autotrain_compatible #endpoints_compatible #text-generation-inference #4-bit #region-us
# Description 4-bit AWQ-quantized version of stylellm/SanGuoYanYi-6b
[ "# Description\n4-bit AWQ-quantized version of stylellm/SanGuoYanYi-6b" ]
[ "TAGS\n#transformers #safetensors #llama #text-generation #conversational #license-other #autotrain_compatible #endpoints_compatible #text-generation-inference #4-bit #region-us \n", "# Description\n4-bit AWQ-quantized version of stylellm/SanGuoYanYi-6b" ]
feature-extraction
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
aashish-249/Sentiment_classification
null
[ "transformers", "safetensors", "bert", "feature-extraction", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
null
2024-04-27T10:24:36+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #bert #feature-extraction #arxiv-1910.09700 #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #bert #feature-extraction #arxiv-1910.09700 #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
null
peft
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # sft_small_model This model is a fine-tuned version of [deepseek-ai/deepseek-coder-1.3b-base](https://huggingface.co/deepseek-ai/deepseek-coder-1.3b-base) on the generator dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1.41e-05 - train_batch_size: 4 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 8 - total_train_batch_size: 32 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3.0 ### Training results ### Framework versions - PEFT 0.10.0 - Transformers 4.40.0 - Pytorch 2.2.1+cu121 - Datasets 2.19.0 - Tokenizers 0.19.1
{"license": "other", "library_name": "peft", "tags": ["trl", "sft", "generated_from_trainer"], "datasets": ["generator"], "base_model": "deepseek-ai/deepseek-coder-1.3b-base", "model-index": [{"name": "sft_small_model", "results": []}]}
stojchet/sft_small_model
null
[ "peft", "tensorboard", "safetensors", "trl", "sft", "generated_from_trainer", "dataset:generator", "base_model:deepseek-ai/deepseek-coder-1.3b-base", "license:other", "region:us" ]
null
2024-04-27T10:29:14+00:00
[]
[]
TAGS #peft #tensorboard #safetensors #trl #sft #generated_from_trainer #dataset-generator #base_model-deepseek-ai/deepseek-coder-1.3b-base #license-other #region-us
# sft_small_model This model is a fine-tuned version of deepseek-ai/deepseek-coder-1.3b-base on the generator dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1.41e-05 - train_batch_size: 4 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 8 - total_train_batch_size: 32 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3.0 ### Training results ### Framework versions - PEFT 0.10.0 - Transformers 4.40.0 - Pytorch 2.2.1+cu121 - Datasets 2.19.0 - Tokenizers 0.19.1
[ "# sft_small_model\n\nThis model is a fine-tuned version of deepseek-ai/deepseek-coder-1.3b-base on the generator dataset.", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 1.41e-05\n- train_batch_size: 4\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 8\n- total_train_batch_size: 32\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0", "### Training results", "### Framework versions\n\n- PEFT 0.10.0\n- Transformers 4.40.0\n- Pytorch 2.2.1+cu121\n- Datasets 2.19.0\n- Tokenizers 0.19.1" ]
[ "TAGS\n#peft #tensorboard #safetensors #trl #sft #generated_from_trainer #dataset-generator #base_model-deepseek-ai/deepseek-coder-1.3b-base #license-other #region-us \n", "# sft_small_model\n\nThis model is a fine-tuned version of deepseek-ai/deepseek-coder-1.3b-base on the generator dataset.", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 1.41e-05\n- train_batch_size: 4\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 8\n- total_train_batch_size: 32\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 3.0", "### Training results", "### Framework versions\n\n- PEFT 0.10.0\n- Transformers 4.40.0\n- Pytorch 2.2.1+cu121\n- Datasets 2.19.0\n- Tokenizers 0.19.1" ]
object-detection
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # detr This model is a fine-tuned version of [facebook/detr-resnet-50](https://huggingface.co/facebook/detr-resnet-50) on an unknown dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 1 - mixed_precision_training: Native AMP ### Framework versions - Transformers 4.39.3 - Pytorch 2.3.0+cu121 - Datasets 2.18.0 - Tokenizers 0.15.2
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "base_model": "facebook/detr-resnet-50", "model-index": [{"name": "detr", "results": []}]}
tooltest/detr
null
[ "transformers", "tensorboard", "safetensors", "detr", "object-detection", "generated_from_trainer", "base_model:facebook/detr-resnet-50", "license:apache-2.0", "endpoints_compatible", "region:us" ]
null
2024-04-27T10:31:35+00:00
[]
[]
TAGS #transformers #tensorboard #safetensors #detr #object-detection #generated_from_trainer #base_model-facebook/detr-resnet-50 #license-apache-2.0 #endpoints_compatible #region-us
# detr This model is a fine-tuned version of facebook/detr-resnet-50 on an unknown dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 1 - mixed_precision_training: Native AMP ### Framework versions - Transformers 4.39.3 - Pytorch 2.3.0+cu121 - Datasets 2.18.0 - Tokenizers 0.15.2
[ "# detr\n\nThis model is a fine-tuned version of facebook/detr-resnet-50 on an unknown dataset.", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 1\n- mixed_precision_training: Native AMP", "### Framework versions\n\n- Transformers 4.39.3\n- Pytorch 2.3.0+cu121\n- Datasets 2.18.0\n- Tokenizers 0.15.2" ]
[ "TAGS\n#transformers #tensorboard #safetensors #detr #object-detection #generated_from_trainer #base_model-facebook/detr-resnet-50 #license-apache-2.0 #endpoints_compatible #region-us \n", "# detr\n\nThis model is a fine-tuned version of facebook/detr-resnet-50 on an unknown dataset.", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 16\n- eval_batch_size: 16\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 1\n- mixed_precision_training: Native AMP", "### Framework versions\n\n- Transformers 4.39.3\n- Pytorch 2.3.0+cu121\n- Datasets 2.18.0\n- Tokenizers 0.15.2" ]
text-classification
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # distilbert-base-uncased-finetuned-emotion This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on the emotion dataset. It achieves the following results on the evaluation set: - Loss: 0.2090 - Accuracy: 0.926 - F1: 0.9259 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 2 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | |:-------------:|:-----:|:----:|:---------------:|:--------:|:------:| | 0.8274 | 1.0 | 250 | 0.3057 | 0.904 | 0.9032 | | 0.2419 | 2.0 | 500 | 0.2090 | 0.926 | 0.9259 | ### Framework versions - Transformers 4.40.1 - Pytorch 2.2.2+cu118 - Datasets 2.19.0 - Tokenizers 0.19.1
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["emotion"], "metrics": ["accuracy", "f1"], "base_model": "distilbert-base-uncased", "model-index": [{"name": "distilbert-base-uncased-finetuned-emotion", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "emotion", "type": "emotion", "config": "split", "split": "validation", "args": "split"}, "metrics": [{"type": "accuracy", "value": 0.926, "name": "Accuracy"}, {"type": "f1", "value": 0.925854377188877, "name": "F1"}]}]}]}
orenot/distilbert-base-uncased-finetuned-emotion
null
[ "transformers", "tensorboard", "safetensors", "distilbert", "text-classification", "generated_from_trainer", "dataset:emotion", "base_model:distilbert-base-uncased", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
null
2024-04-27T10:32:29+00:00
[]
[]
TAGS #transformers #tensorboard #safetensors #distilbert #text-classification #generated_from_trainer #dataset-emotion #base_model-distilbert-base-uncased #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
distilbert-base-uncased-finetuned-emotion ========================================= This model is a fine-tuned version of distilbert-base-uncased on the emotion dataset. It achieves the following results on the evaluation set: * Loss: 0.2090 * Accuracy: 0.926 * F1: 0.9259 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 2e-05 * train\_batch\_size: 64 * eval\_batch\_size: 64 * seed: 42 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * num\_epochs: 2 ### Training results ### Framework versions * Transformers 4.40.1 * Pytorch 2.2.2+cu118 * Datasets 2.19.0 * Tokenizers 0.19.1
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 64\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2", "### Training results", "### Framework versions\n\n\n* Transformers 4.40.1\n* Pytorch 2.2.2+cu118\n* Datasets 2.19.0\n* Tokenizers 0.19.1" ]
[ "TAGS\n#transformers #tensorboard #safetensors #distilbert #text-classification #generated_from_trainer #dataset-emotion #base_model-distilbert-base-uncased #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 64\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2", "### Training results", "### Framework versions\n\n\n* Transformers 4.40.1\n* Pytorch 2.2.2+cu118\n* Datasets 2.19.0\n* Tokenizers 0.19.1" ]
question-answering
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # bertQA This model is a fine-tuned version of [bert-base-uncased](https://huggingface.co/bert-base-uncased) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.0003 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 2 - eval_batch_size: 2 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 2 ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:-----:|:----:|:---------------:| | 0.0011 | 1.0 | 250 | 0.0004 | | 0.0009 | 2.0 | 500 | 0.0003 | ### Framework versions - Transformers 4.40.0 - Pytorch 2.2.1+cu121 - Datasets 2.19.0 - Tokenizers 0.19.1
{"tags": ["generated_from_trainer"], "base_model": "bert-base-uncased", "model-index": [{"name": "bertQA", "results": []}]}
krushnakant27/bertQA
null
[ "transformers", "tensorboard", "safetensors", "bert", "question-answering", "generated_from_trainer", "base_model:bert-base-uncased", "endpoints_compatible", "region:us" ]
null
2024-04-27T10:32:45+00:00
[]
[]
TAGS #transformers #tensorboard #safetensors #bert #question-answering #generated_from_trainer #base_model-bert-base-uncased #endpoints_compatible #region-us
bertQA ====== This model is a fine-tuned version of bert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set: * Loss: 0.0003 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 2e-05 * train\_batch\_size: 2 * eval\_batch\_size: 2 * seed: 42 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * num\_epochs: 2 ### Training results ### Framework versions * Transformers 4.40.0 * Pytorch 2.2.1+cu121 * Datasets 2.19.0 * Tokenizers 0.19.1
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 2\n* eval\\_batch\\_size: 2\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2", "### Training results", "### Framework versions\n\n\n* Transformers 4.40.0\n* Pytorch 2.2.1+cu121\n* Datasets 2.19.0\n* Tokenizers 0.19.1" ]
[ "TAGS\n#transformers #tensorboard #safetensors #bert #question-answering #generated_from_trainer #base_model-bert-base-uncased #endpoints_compatible #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 2\n* eval\\_batch\\_size: 2\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2", "### Training results", "### Framework versions\n\n\n* Transformers 4.40.0\n* Pytorch 2.2.1+cu121\n* Datasets 2.19.0\n* Tokenizers 0.19.1" ]
reinforcement-learning
ml-agents
# **poca** Agent playing **SoccerTwos** This is a trained model of a **poca** agent playing **SoccerTwos** using the [Unity ML-Agents Library](https://github.com/Unity-Technologies/ml-agents). ## Usage (with ML-Agents) The Documentation: https://unity-technologies.github.io/ml-agents/ML-Agents-Toolkit-Documentation/ We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub: - A *short tutorial* where you teach Huggy the Dog 🐶 to fetch the stick and then play with him directly in your browser: https://huggingface.co/learn/deep-rl-course/unitbonus1/introduction - A *longer tutorial* to understand how ML-Agents works: https://huggingface.co/learn/deep-rl-course/unit5/introduction ### Resume the training ```bash mlagents-learn <your_configuration_file_path.yaml> --run-id=<run_id> --resume ``` ### Watch your Agent play You can watch your agent **playing directly in your browser** 1. If the environment is part of ML-Agents official environments, go to https://huggingface.co/unity 2. Step 1: Find your model_id: tarpalsus/poca-SoccerTwos 3. Step 2: Select your *.nn /*.onnx file 4. Click on Watch the agent play 👀
{"library_name": "ml-agents", "tags": ["SoccerTwos", "deep-reinforcement-learning", "reinforcement-learning", "ML-Agents-SoccerTwos"]}
tarpalsus/poca-SoccerTwos
null
[ "ml-agents", "tensorboard", "onnx", "SoccerTwos", "deep-reinforcement-learning", "reinforcement-learning", "ML-Agents-SoccerTwos", "region:us" ]
null
2024-04-27T10:34:50+00:00
[]
[]
TAGS #ml-agents #tensorboard #onnx #SoccerTwos #deep-reinforcement-learning #reinforcement-learning #ML-Agents-SoccerTwos #region-us
# poca Agent playing SoccerTwos This is a trained model of a poca agent playing SoccerTwos using the Unity ML-Agents Library. ## Usage (with ML-Agents) The Documentation: URL We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub: - A *short tutorial* where you teach Huggy the Dog to fetch the stick and then play with him directly in your browser: URL - A *longer tutorial* to understand how ML-Agents works: URL ### Resume the training ### Watch your Agent play You can watch your agent playing directly in your browser 1. If the environment is part of ML-Agents official environments, go to URL 2. Step 1: Find your model_id: tarpalsus/poca-SoccerTwos 3. Step 2: Select your *.nn /*.onnx file 4. Click on Watch the agent play
[ "# poca Agent playing SoccerTwos\n This is a trained model of a poca agent playing SoccerTwos\n using the Unity ML-Agents Library.\n\n ## Usage (with ML-Agents)\n The Documentation: URL\n\n We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub:\n - A *short tutorial* where you teach Huggy the Dog to fetch the stick and then play with him directly in your\n browser: URL\n - A *longer tutorial* to understand how works ML-Agents:\n URL\n\n ### Resume the training\n \n\n ### Watch your Agent play\n You can watch your agent playing directly in your browser\n\n 1. If the environment is part of ML-Agents official environments, go to URL\n 2. Step 1: Find your model_id: tarpalsus/poca-SoccerTwos\n 3. Step 2: Select your *.nn /*.onnx file\n 4. Click on Watch the agent play" ]
[ "TAGS\n#ml-agents #tensorboard #onnx #SoccerTwos #deep-reinforcement-learning #reinforcement-learning #ML-Agents-SoccerTwos #region-us \n", "# poca Agent playing SoccerTwos\n This is a trained model of a poca agent playing SoccerTwos\n using the Unity ML-Agents Library.\n\n ## Usage (with ML-Agents)\n The Documentation: URL\n\n We wrote a complete tutorial to learn to train your first agent using ML-Agents and publish it to the Hub:\n - A *short tutorial* where you teach Huggy the Dog to fetch the stick and then play with him directly in your\n browser: URL\n - A *longer tutorial* to understand how works ML-Agents:\n URL\n\n ### Resume the training\n \n\n ### Watch your Agent play\n You can watch your agent playing directly in your browser\n\n 1. If the environment is part of ML-Agents official environments, go to URL\n 2. Step 1: Find your model_id: tarpalsus/poca-SoccerTwos\n 3. Step 2: Select your *.nn /*.onnx file\n 4. Click on Watch the agent play" ]
image-classification
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # swin-tiny-patch4-window7-224-finetuned-eurosat This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 1.3752 - Accuracy: 0.5238 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 32 - eval_batch_size: 32 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:------:|:----:|:---------------:|:--------:| | No log | 0.8889 | 4 | 1.7408 | 0.4444 | | No log | 2.0 | 9 | 1.4351 | 0.4921 | | 1.7494 | 2.6667 | 12 | 1.3752 | 0.5238 | ### Framework versions - Transformers 4.40.0 - Pytorch 2.2.1+cu121 - Datasets 2.19.0 - Tokenizers 0.19.1
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["imagefolder"], "metrics": ["accuracy"], "base_model": "microsoft/swin-tiny-patch4-window7-224", "model-index": [{"name": "swin-tiny-patch4-window7-224-finetuned-eurosat", "results": [{"task": {"type": "image-classification", "name": "Image Classification"}, "dataset": {"name": "imagefolder", "type": "imagefolder", "config": "default", "split": "train", "args": "default"}, "metrics": [{"type": "accuracy", "value": 0.5238095238095238, "name": "Accuracy"}]}]}]}
sharmajai901/swin-tiny-patch4-window7-224-finetuned-eurosat
null
[ "transformers", "tensorboard", "safetensors", "swin", "image-classification", "generated_from_trainer", "dataset:imagefolder", "base_model:microsoft/swin-tiny-patch4-window7-224", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
null
2024-04-27T10:35:06+00:00
[]
[]
TAGS #transformers #tensorboard #safetensors #swin #image-classification #generated_from_trainer #dataset-imagefolder #base_model-microsoft/swin-tiny-patch4-window7-224 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
swin-tiny-patch4-window7-224-finetuned-eurosat ============================================== This model is a fine-tuned version of microsoft/swin-tiny-patch4-window7-224 on the imagefolder dataset. It achieves the following results on the evaluation set: * Loss: 1.3752 * Accuracy: 0.5238 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 5e-05 * train\_batch\_size: 32 * eval\_batch\_size: 32 * seed: 42 * gradient\_accumulation\_steps: 4 * total\_train\_batch\_size: 128 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * lr\_scheduler\_warmup\_ratio: 0.1 * num\_epochs: 3 ### Training results ### Framework versions * Transformers 4.40.0 * Pytorch 2.2.1+cu121 * Datasets 2.19.0 * Tokenizers 0.19.1
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 3", "### Training results", "### Framework versions\n\n\n* Transformers 4.40.0\n* Pytorch 2.2.1+cu121\n* Datasets 2.19.0\n* Tokenizers 0.19.1" ]
[ "TAGS\n#transformers #tensorboard #safetensors #swin #image-classification #generated_from_trainer #dataset-imagefolder #base_model-microsoft/swin-tiny-patch4-window7-224 #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 32\n* eval\\_batch\\_size: 32\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 3", "### Training results", "### Framework versions\n\n\n* Transformers 4.40.0\n* Pytorch 2.2.1+cu121\n* Datasets 2.19.0\n* Tokenizers 0.19.1" ]
text-generation
transformers
# BioinspiredLLM: Conversational Large Language Model for the Mechanics of Biological and Bio-Inspired Materials Reference: R. Luu and M.J. Buehler, "BioinspiredLLM: Conversational Large Language Model for the Mechanics of Biological and Bio-Inspired Materials," Adv. Science, 2023, DOI: https://doi.org/10.1002/advs.202306724 Abstract: The study of biological materials and bio-inspired materials science is well established; however, surprisingly little knowledge is systematically translated to engineering solutions. To accelerate discovery and guide insights, an open-source autoregressive transformer large language model (LLM), BioinspiredLLM, is reported. The model is finetuned with a corpus of over a thousand peer-reviewed articles in the field of structural biological and bio-inspired materials and can be prompted to recall information, assist with research tasks, and function as an engine for creativity. The model has proven that it is able to accurately recall information about biological materials and is further strengthened with enhanced reasoning ability, as well as with Retrieval-Augmented Generation (RAG) to incorporate new data during generation that can also help to traceback sources, update the knowledge base, and connect knowledge domains. BioinspiredLLM also has shown to develop sound hypotheses regarding biological materials design and remarkably so for materials that have never been explicitly studied before. Lastly, the model shows impressive promise in collaborating with other generative artificial intelligence models in a workflow that can reshape the traditional materials design process. This collaborative generative artificial intelligence method can stimulate and enhance bio-inspired materials design workflows. Biological materials are at a critical intersection of multiple scientific fields and models like BioinspiredLLM help to connect knowledge domains. ![image/png](https://cdn-uploads.huggingface.co/production/uploads/623ce1c6b66fedf374859fe7/Xdp_nCYiF2IAPamG5ffIC.png) # Model Card for Model ID Fine-tuned LLM with domain knowledge in biological materials, mechanics of materials, modeling and simulation, and related fields. ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. 
--> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. --> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. 
--> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"language": ["en"], "library_name": "transformers", "tags": ["biology", "materials science", "code", "scientific AI", "biological materials", "bioinspiration", "machine learning", "generative"]}
lamm-mit/Bioinspired-Phi-3-mini-128k
null
[ "transformers", "safetensors", "gguf", "phi3", "text-generation", "biology", "materials science", "code", "scientific AI", "biological materials", "bioinspiration", "machine learning", "generative", "conversational", "custom_code", "en", "arxiv:1910.09700", "autotrain_compatible", "endpoints_compatible", "region:us" ]
null
2024-04-27T10:35:30+00:00
[ "1910.09700" ]
[ "en" ]
TAGS #transformers #safetensors #gguf #phi3 #text-generation #biology #materials science #code #scientific AI #biological materials #bioinspiration #machine learning #generative #conversational #custom_code #en #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us
# BioinspiredLLM: Conversational Large Language Model for the Mechanics of Biological and Bio-Inspired Materials Reference: R. Luu and M.J. Buehler, "BioinspiredLLM: Conversational Large Language Model for the Mechanics of Biological and Bio-Inspired Materials," Adv. Science, 2023, DOI: URL Abstract: The study of biological materials and bio-inspired materials science is well established; however, surprisingly little knowledge is systematically translated to engineering solutions. To accelerate discovery and guide insights, an open-source autoregressive transformer large language model (LLM), BioinspiredLLM, is reported. The model is finetuned with a corpus of over a thousand peer-reviewed articles in the field of structural biological and bio-inspired materials and can be prompted to recall information, assist with research tasks, and function as an engine for creativity. The model has proven that it is able to accurately recall information about biological materials and is further strengthened with enhanced reasoning ability, as well as with Retrieval-Augmented Generation (RAG) to incorporate new data during generation that can also help to traceback sources, update the knowledge base, and connect knowledge domains. BioinspiredLLM also has shown to develop sound hypotheses regarding biological materials design and remarkably so for materials that have never been explicitly studied before. Lastly, the model shows impressive promise in collaborating with other generative artificial intelligence models in a workflow that can reshape the traditional materials design process. This collaborative generative artificial intelligence method can stimulate and enhance bio-inspired materials design workflows. Biological materials are at a critical intersection of multiple scientific fields and models like BioinspiredLLM help to connect knowledge domains. !image/png # Model Card for Model ID Fine-tuned LLM with domain knowledge in biological materials, mechanics of materials, modeling and simulation, and related fields. ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). 
- Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# BioinspiredLLM: Conversational Large Language Model for the Mechanics of Biological and Bio-Inspired Materials\n\nReference: R. Luu and M.J. Buehler, \"BioinspiredLLM: Conversational Large Language Model for the Mechanics of Biological and Bio-Inspired Materials,\" Adv. Science, 2023, DOI: URL\n\nAbstract: The study of biological materials and bio-inspired materials science is well established; however, surprisingly little knowledge is systematically translated to engineering solutions. To accelerate discovery and guide insights, an open-source autoregressive transformer large language model (LLM), BioinspiredLLM, is reported. The model is finetuned with a corpus of over a thousand peer-reviewed articles in the field of structural biological and bio-inspired materials and can be prompted to recall information, assist with research tasks, and function as an engine for creativity. The model has proven that it is able to accurately recall information about biological materials and is further strengthened with enhanced reasoning ability, as well as with Retrieval-Augmented Generation (RAG) to incorporate new data during generation that can also help to traceback sources, update the knowledge base, and connect knowledge domains. BioinspiredLLM also has shown to develop sound hypotheses regarding biological materials design and remarkably so for materials that have never been explicitly studied before. Lastly, the model shows impressive promise in collaborating with other generative artificial intelligence models in a workflow that can reshape the traditional materials design process. This collaborative generative artificial intelligence method can stimulate and enhance bio-inspired materials design workflows. Biological materials are at a critical intersection of multiple scientific fields and models like BioinspiredLLM help to connect knowledge domains. \n\n!image/png", "# Model Card for Model ID\n\nFine-tuned LLM with domain knowledge in biological materials, mechanics of materials, modeling and simulation, and related fields.", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. 
(2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #gguf #phi3 #text-generation #biology #materials science #code #scientific AI #biological materials #bioinspiration #machine learning #generative #conversational #custom_code #en #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n", "# BioinspiredLLM: Conversational Large Language Model for the Mechanics of Biological and Bio-Inspired Materials\n\nReference: R. Luu and M.J. Buehler, \"BioinspiredLLM: Conversational Large Language Model for the Mechanics of Biological and Bio-Inspired Materials,\" Adv. Science, 2023, DOI: URL\n\nAbstract: The study of biological materials and bio-inspired materials science is well established; however, surprisingly little knowledge is systematically translated to engineering solutions. To accelerate discovery and guide insights, an open-source autoregressive transformer large language model (LLM), BioinspiredLLM, is reported. The model is finetuned with a corpus of over a thousand peer-reviewed articles in the field of structural biological and bio-inspired materials and can be prompted to recall information, assist with research tasks, and function as an engine for creativity. The model has proven that it is able to accurately recall information about biological materials and is further strengthened with enhanced reasoning ability, as well as with Retrieval-Augmented Generation (RAG) to incorporate new data during generation that can also help to traceback sources, update the knowledge base, and connect knowledge domains. BioinspiredLLM also has shown to develop sound hypotheses regarding biological materials design and remarkably so for materials that have never been explicitly studied before. Lastly, the model shows impressive promise in collaborating with other generative artificial intelligence models in a workflow that can reshape the traditional materials design process. This collaborative generative artificial intelligence method can stimulate and enhance bio-inspired materials design workflows. Biological materials are at a critical intersection of multiple scientific fields and models like BioinspiredLLM help to connect knowledge domains. \n\n!image/png", "# Model Card for Model ID\n\nFine-tuned LLM with domain knowledge in biological materials, mechanics of materials, modeling and simulation, and related fields.", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. 
More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
text-generation
transformers
# BioinspiredLLM: Conversational Large Language Model for the Mechanics of Biological and Bio-Inspired Materials Reference: R. Luu and M.J. Buehler, "BioinspiredLLM: Conversational Large Language Model for the Mechanics of Biological and Bio-Inspired Materials," Adv. Science, 2023, DOI: https://doi.org/10.1002/advs.202306724 Abstract: The study of biological materials and bio-inspired materials science is well established; however, surprisingly little knowledge is systematically translated to engineering solutions. To accelerate discovery and guide insights, an open-source autoregressive transformer large language model (LLM), BioinspiredLLM, is reported. The model is finetuned with a corpus of over a thousand peer-reviewed articles in the field of structural biological and bio-inspired materials and can be prompted to recall information, assist with research tasks, and function as an engine for creativity. The model has proven that it is able to accurately recall information about biological materials and is further strengthened with enhanced reasoning ability, as well as with Retrieval-Augmented Generation (RAG) to incorporate new data during generation that can also help to traceback sources, update the knowledge base, and connect knowledge domains. BioinspiredLLM also has shown to develop sound hypotheses regarding biological materials design and remarkably so for materials that have never been explicitly studied before. Lastly, the model shows impressive promise in collaborating with other generative artificial intelligence models in a workflow that can reshape the traditional materials design process. This collaborative generative artificial intelligence method can stimulate and enhance bio-inspired materials design workflows. Biological materials are at a critical intersection of multiple scientific fields and models like BioinspiredLLM help to connect knowledge domains. ![image/png](https://cdn-uploads.huggingface.co/production/uploads/623ce1c6b66fedf374859fe7/Xdp_nCYiF2IAPamG5ffIC.png) # Model Card for Model ID Fine-tuned LLM with domain knowledge in biological materials, mechanics of materials, modeling and simulation, and related fields. ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. 
--> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. --> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. 
--> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
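The quick-start section of the card above is left as a placeholder. Below is a minimal sketch of generic Transformers text-generation usage, assuming only what this record states (the Hub id `lamm-mit/Bioinspired-Phi-3-mini-4k` and the `custom_code` tag, hence `trust_remote_code=True`); the prompt and decoding settings are illustrative, not the authors' recommended format.

```python
# Hedged sketch: generic causal-LM inference for the model id in this record.
# The prompt and generation settings below are assumptions, not from the card.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "lamm-mit/Bioinspired-Phi-3-mini-4k"  # "id" field of this record
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id, trust_remote_code=True, device_map="auto"
)

prompt = "Why is nacre so much tougher than the aragonite it is made of?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```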
{"language": ["en"], "library_name": "transformers", "tags": ["biology", "materials science", "code", "scientific AI", "biological materials", "bioinspiration", "machine learning", "generative"]}
lamm-mit/Bioinspired-Phi-3-mini-4k
null
[ "transformers", "safetensors", "gguf", "phi3", "text-generation", "biology", "materials science", "code", "scientific AI", "biological materials", "bioinspiration", "machine learning", "generative", "conversational", "custom_code", "en", "arxiv:1910.09700", "autotrain_compatible", "endpoints_compatible", "region:us" ]
null
2024-04-27T10:37:24+00:00
[ "1910.09700" ]
[ "en" ]
TAGS #transformers #safetensors #gguf #phi3 #text-generation #biology #materials science #code #scientific AI #biological materials #bioinspiration #machine learning #generative #conversational #custom_code #en #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us
# BioinspiredLLM: Conversational Large Language Model for the Mechanics of Biological and Bio-Inspired Materials Reference: R. Luu and M.J. Buehler, "BioinspiredLLM: Conversational Large Language Model for the Mechanics of Biological and Bio-Inspired Materials," Adv. Science, 2023, DOI: URL Abstract: The study of biological materials and bio-inspired materials science is well established; however, surprisingly little knowledge is systematically translated to engineering solutions. To accelerate discovery and guide insights, an open-source autoregressive transformer large language model (LLM), BioinspiredLLM, is reported. The model is finetuned with a corpus of over a thousand peer-reviewed articles in the field of structural biological and bio-inspired materials and can be prompted to recall information, assist with research tasks, and function as an engine for creativity. The model has proven that it is able to accurately recall information about biological materials and is further strengthened with enhanced reasoning ability, as well as with Retrieval-Augmented Generation (RAG) to incorporate new data during generation that can also help to traceback sources, update the knowledge base, and connect knowledge domains. BioinspiredLLM also has shown to develop sound hypotheses regarding biological materials design and remarkably so for materials that have never been explicitly studied before. Lastly, the model shows impressive promise in collaborating with other generative artificial intelligence models in a workflow that can reshape the traditional materials design process. This collaborative generative artificial intelligence method can stimulate and enhance bio-inspired materials design workflows. Biological materials are at a critical intersection of multiple scientific fields and models like BioinspiredLLM help to connect knowledge domains. !image/png # Model Card for Model ID Fine-tuned LLM with domain knowledge in biological materials, mechanics of materials, modeling and simulation, and related fields. ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). 
- Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# BioinspiredLLM: Conversational Large Language Model for the Mechanics of Biological and Bio-Inspired Materials\n\nReference: R. Luu and M.J. Buehler, \"BioinspiredLLM: Conversational Large Language Model for the Mechanics of Biological and Bio-Inspired Materials,\" Adv. Science, 2023, DOI: URL\n\nAbstract: The study of biological materials and bio-inspired materials science is well established; however, surprisingly little knowledge is systematically translated to engineering solutions. To accelerate discovery and guide insights, an open-source autoregressive transformer large language model (LLM), BioinspiredLLM, is reported. The model is finetuned with a corpus of over a thousand peer-reviewed articles in the field of structural biological and bio-inspired materials and can be prompted to recall information, assist with research tasks, and function as an engine for creativity. The model has proven that it is able to accurately recall information about biological materials and is further strengthened with enhanced reasoning ability, as well as with Retrieval-Augmented Generation (RAG) to incorporate new data during generation that can also help to traceback sources, update the knowledge base, and connect knowledge domains. BioinspiredLLM also has shown to develop sound hypotheses regarding biological materials design and remarkably so for materials that have never been explicitly studied before. Lastly, the model shows impressive promise in collaborating with other generative artificial intelligence models in a workflow that can reshape the traditional materials design process. This collaborative generative artificial intelligence method can stimulate and enhance bio-inspired materials design workflows. Biological materials are at a critical intersection of multiple scientific fields and models like BioinspiredLLM help to connect knowledge domains. \n\n!image/png", "# Model Card for Model ID\n\nFine-tuned LLM with domain knowledge in biological materials, mechanics of materials, modeling and simulation, and related fields.", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. 
(2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #gguf #phi3 #text-generation #biology #materials science #code #scientific AI #biological materials #bioinspiration #machine learning #generative #conversational #custom_code #en #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n", "# BioinspiredLLM: Conversational Large Language Model for the Mechanics of Biological and Bio-Inspired Materials\n\nReference: R. Luu and M.J. Buehler, \"BioinspiredLLM: Conversational Large Language Model for the Mechanics of Biological and Bio-Inspired Materials,\" Adv. Science, 2023, DOI: URL\n\nAbstract: The study of biological materials and bio-inspired materials science is well established; however, surprisingly little knowledge is systematically translated to engineering solutions. To accelerate discovery and guide insights, an open-source autoregressive transformer large language model (LLM), BioinspiredLLM, is reported. The model is finetuned with a corpus of over a thousand peer-reviewed articles in the field of structural biological and bio-inspired materials and can be prompted to recall information, assist with research tasks, and function as an engine for creativity. The model has proven that it is able to accurately recall information about biological materials and is further strengthened with enhanced reasoning ability, as well as with Retrieval-Augmented Generation (RAG) to incorporate new data during generation that can also help to traceback sources, update the knowledge base, and connect knowledge domains. BioinspiredLLM also has shown to develop sound hypotheses regarding biological materials design and remarkably so for materials that have never been explicitly studied before. Lastly, the model shows impressive promise in collaborating with other generative artificial intelligence models in a workflow that can reshape the traditional materials design process. This collaborative generative artificial intelligence method can stimulate and enhance bio-inspired materials design workflows. Biological materials are at a critical intersection of multiple scientific fields and models like BioinspiredLLM help to connect knowledge domains. \n\n!image/png", "# Model Card for Model ID\n\nFine-tuned LLM with domain knowledge in biological materials, mechanics of materials, modeling and simulation, and related fields.", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. 
More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
text-generation
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # 0.001_4iters_bs256_nodpo_only4w_zephyr_iter_4 This model is a fine-tuned version of [ShenaoZhang/0.001_4iters_bs256_nodpo_only4w_zephyr_iter_3](https://huggingface.co/ShenaoZhang/0.001_4iters_bs256_nodpo_only4w_zephyr_iter_3) on the updated and the original datasets. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-07 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - distributed_type: multi-GPU - num_devices: 8 - gradient_accumulation_steps: 4 - total_train_batch_size: 256 - total_eval_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 1 ### Training results ### Framework versions - Transformers 4.40.0 - Pytorch 2.1.2+cu121 - Datasets 2.14.6 - Tokenizers 0.19.1
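The hyperparameters above describe a DPO run built on TRL and the alignment-handbook stack. Below is a minimal single-process sketch of how such a configuration is typically assembled with the TRL 0.8-era `DPOTrainer` keyword arguments (newer TRL releases move `beta` and related options into `DPOConfig`); the dataset, `beta`, and output directory are placeholders, since the card does not state them, and the 8-GPU setup is reduced to per-device values.

```python
# Hedged sketch: a DPO setup mirroring the hyperparameters listed in the card.
# Dataset path, beta, and output_dir are placeholders; the original run used
# 8 GPUs with gradient accumulation for an effective batch size of 256.
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer, TrainingArguments
from trl import DPOTrainer

base = "ShenaoZhang/0.001_4iters_bs256_nodpo_only4w_zephyr_iter_3"
model = AutoModelForCausalLM.from_pretrained(base)
tokenizer = AutoTokenizer.from_pretrained(base)

# Placeholder dataset: DPO expects "prompt", "chosen", and "rejected" columns.
train_dataset = load_dataset("json", data_files="preference_pairs.json")["train"]

args = TrainingArguments(
    output_dir="iter_4",                 # placeholder
    learning_rate=5e-7,
    per_device_train_batch_size=8,
    gradient_accumulation_steps=4,
    num_train_epochs=1,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
)

trainer = DPOTrainer(
    model=model,
    ref_model=None,          # TRL creates an implicit frozen reference copy
    args=args,
    beta=0.1,                # placeholder; the card does not state beta
    train_dataset=train_dataset,
    tokenizer=tokenizer,
)
trainer.train()
```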
{"license": "mit", "tags": ["alignment-handbook", "trl", "dpo", "generated_from_trainer", "trl", "dpo", "generated_from_trainer"], "datasets": ["updated", "original"], "base_model": "ShenaoZhang/0.001_4iters_bs256_nodpo_only4w_zephyr_iter_3", "model-index": [{"name": "0.001_4iters_bs256_nodpo_only4w_zephyr_iter_4", "results": []}]}
ShenaoZhang/0.001_4iters_bs256_nodpo_only4w_zephyr_iter_4
null
[ "transformers", "safetensors", "mistral", "text-generation", "alignment-handbook", "trl", "dpo", "generated_from_trainer", "conversational", "dataset:updated", "dataset:original", "base_model:ShenaoZhang/0.001_4iters_bs256_nodpo_only4w_zephyr_iter_3", "license:mit", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2024-04-27T10:44:12+00:00
[]
[]
TAGS #transformers #safetensors #mistral #text-generation #alignment-handbook #trl #dpo #generated_from_trainer #conversational #dataset-updated #dataset-original #base_model-ShenaoZhang/0.001_4iters_bs256_nodpo_only4w_zephyr_iter_3 #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# 0.001_4iters_bs256_nodpo_only4w_zephyr_iter_4 This model is a fine-tuned version of ShenaoZhang/0.001_4iters_bs256_nodpo_only4w_zephyr_iter_3 on the updated and the original datasets. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-07 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - distributed_type: multi-GPU - num_devices: 8 - gradient_accumulation_steps: 4 - total_train_batch_size: 256 - total_eval_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 1 ### Training results ### Framework versions - Transformers 4.40.0 - Pytorch 2.1.2+cu121 - Datasets 2.14.6 - Tokenizers 0.19.1
[ "# 0.001_4iters_bs256_nodpo_only4w_zephyr_iter_4\n\nThis model is a fine-tuned version of ShenaoZhang/0.001_4iters_bs256_nodpo_only4w_zephyr_iter_3 on the updated and the original datasets.", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-07\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- distributed_type: multi-GPU\n- num_devices: 8\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 256\n- total_eval_batch_size: 64\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 1", "### Training results", "### Framework versions\n\n- Transformers 4.40.0\n- Pytorch 2.1.2+cu121\n- Datasets 2.14.6\n- Tokenizers 0.19.1" ]
[ "TAGS\n#transformers #safetensors #mistral #text-generation #alignment-handbook #trl #dpo #generated_from_trainer #conversational #dataset-updated #dataset-original #base_model-ShenaoZhang/0.001_4iters_bs256_nodpo_only4w_zephyr_iter_3 #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# 0.001_4iters_bs256_nodpo_only4w_zephyr_iter_4\n\nThis model is a fine-tuned version of ShenaoZhang/0.001_4iters_bs256_nodpo_only4w_zephyr_iter_3 on the updated and the original datasets.", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-07\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- distributed_type: multi-GPU\n- num_devices: 8\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 256\n- total_eval_batch_size: 64\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 1", "### Training results", "### Framework versions\n\n- Transformers 4.40.0\n- Pytorch 2.1.2+cu121\n- Datasets 2.14.6\n- Tokenizers 0.19.1" ]
text-generation
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # sft-model This model is a fine-tuned version of [meta-llama/Meta-Llama-3-8B-Instruct](https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 2.1932 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 4 - eval_batch_size: 4 - seed: 3407 - gradient_accumulation_steps: 4 - total_train_batch_size: 16 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_steps: 10 - training_steps: 596 ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:------:|:----:|:---------------:| | 3.1549 | 0.0757 | 120 | 3.1938 | | 2.8129 | 0.1514 | 240 | 2.8597 | | 2.5655 | 0.2271 | 360 | 2.5002 | | 2.2148 | 0.3028 | 480 | 2.1932 | ### Framework versions - Transformers 4.40.1 - Pytorch 2.2.0+cu121 - Datasets 2.19.0 - Tokenizers 0.19.1
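These hyperparameters correspond to a TRL `SFTTrainer` run on the instruct Llama-3 base named in the card. Below is a minimal sketch using the TRL 0.8-era keyword arguments (later releases move them into `SFTConfig`); the dataset, its text column, the sequence length, and the output directory are assumptions, as the card describes the data only as "an unknown dataset".

```python
# Hedged sketch: an SFT setup matching the hyperparameters in the card above.
# The dataset, text column, max_seq_length, and output_dir are placeholders;
# a 2e-4 learning rate usually implies adapter (LoRA) training, omitted here.
from datasets import load_dataset
from transformers import TrainingArguments
from trl import SFTTrainer

train_dataset = load_dataset("json", data_files="train.json")["train"]  # placeholder

args = TrainingArguments(
    output_dir="sft-model",              # placeholder
    learning_rate=2e-4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=4,
    max_steps=596,
    warmup_steps=10,
    lr_scheduler_type="linear",
    seed=3407,
)

trainer = SFTTrainer(
    model="meta-llama/Meta-Llama-3-8B-Instruct",  # base model named in the card
    args=args,
    train_dataset=train_dataset,
    dataset_text_field="text",           # assumption about the column name
    max_seq_length=2048,                 # assumption; not stated in the card
)
trainer.train()
```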
{"tags": ["trl", "sft", "generated_from_trainer"], "base_model": "meta-llama/Meta-Llama-3-8B-Instruct", "model-index": [{"name": "sft-model", "results": []}]}
Imran1/sft-model
null
[ "transformers", "safetensors", "llama", "text-generation", "trl", "sft", "generated_from_trainer", "conversational", "base_model:meta-llama/Meta-Llama-3-8B-Instruct", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2024-04-27T10:45:26+00:00
[]
[]
TAGS #transformers #safetensors #llama #text-generation #trl #sft #generated_from_trainer #conversational #base_model-meta-llama/Meta-Llama-3-8B-Instruct #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
sft-model ========= This model is a fine-tuned version of meta-llama/Meta-Llama-3-8B-Instruct on an unknown dataset. It achieves the following results on the evaluation set: * Loss: 2.1932 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 0.0002 * train\_batch\_size: 4 * eval\_batch\_size: 4 * seed: 3407 * gradient\_accumulation\_steps: 4 * total\_train\_batch\_size: 16 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * lr\_scheduler\_warmup\_steps: 10 * training\_steps: 596 ### Training results ### Framework versions * Transformers 4.40.1 * Pytorch 2.2.0+cu121 * Datasets 2.19.0 * Tokenizers 0.19.1
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0002\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 4\n* seed: 3407\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 16\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 10\n* training\\_steps: 596", "### Training results", "### Framework versions\n\n\n* Transformers 4.40.1\n* Pytorch 2.2.0+cu121\n* Datasets 2.19.0\n* Tokenizers 0.19.1" ]
[ "TAGS\n#transformers #safetensors #llama #text-generation #trl #sft #generated_from_trainer #conversational #base_model-meta-llama/Meta-Llama-3-8B-Instruct #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0002\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 4\n* seed: 3407\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 16\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 10\n* training\\_steps: 596", "### Training results", "### Framework versions\n\n\n* Transformers 4.40.1\n* Pytorch 2.2.0+cu121\n* Datasets 2.19.0\n* Tokenizers 0.19.1" ]
null
span-marker
[Biopeak Male Enhancement](https://thecontingent.microsoftcrmportals.com/forums/general-discussion/80c05195-9d03-ef11-a73d-6045bd01c1cc) They can impede different prescriptions or intensify specific ailments. Aftereffects could incorporate skin break out, balding, state of mind swings, and changes in cholesterol levels.Guideline and Quality: The enhancement business isn't generally very much controlled, and item quality can differ. Guarantee you purchase supplements from legitimate brands that go through outsider testing for quality and safety.Medical Exhortation: Prior to beginning any enhancement routine, particularly testosterone promoters, examine it with a medical services proficient. They can assess your wellbeing status, give customized exhortation, and screen any likely incidental effects or associations. VISIT HERE FOR OFFICIAL WEBSITE:-https://thecontingent.microsoftcrmportals.com/forums/general-discussion/80c05195-9d03-ef11-a73d-6045bd01c1cc
{"language": ["en"], "license": "bigscience-openrail-m", "library_name": "span-marker", "tags": ["Biopeak Male Enhancement"]}
maleenhancement/biopeakmaleenhancement
null
[ "span-marker", "Biopeak Male Enhancement", "en", "license:bigscience-openrail-m", "region:us" ]
null
2024-04-27T10:47:06+00:00
[]
[ "en" ]
TAGS #span-marker #Biopeak Male Enhancement #en #license-bigscience-openrail-m #region-us
Biopeak Male Enhancement They can impede different prescriptions or intensify specific ailments. Aftereffects could incorporate skin break out, balding, state of mind swings, and changes in cholesterol levels.Guideline and Quality: The enhancement business isn't generally very much controlled, and item quality can differ. Guarantee you purchase supplements from legitimate brands that go through outsider testing for quality and safety.Medical Exhortation: Prior to beginning any enhancement routine, particularly testosterone promoters, examine it with a medical services proficient. They can assess your wellbeing status, give customized exhortation, and screen any likely incidental effects or associations. VISIT HERE FOR OFFICIAL WEBSITE:-URL
[]
[ "TAGS\n#span-marker #Biopeak Male Enhancement #en #license-bigscience-openrail-m #region-us \n" ]
text-generation
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
shallow6414/6ih4k2q
null
[ "transformers", "safetensors", "llama", "text-generation", "conversational", "arxiv:1910.09700", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2024-04-27T10:47:50+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #llama #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #llama #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
text-generation
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": ["trl", "sft"]}
Imran1/fi
null
[ "transformers", "safetensors", "llama", "text-generation", "trl", "sft", "conversational", "arxiv:1910.09700", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2024-04-27T10:49:03+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #llama #text-generation #trl #sft #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #llama #text-generation #trl #sft #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
null
transformers
# Uploaded model - **Developed by:** sourcetableteam - **License:** apache-2.0 - **Finetuned from model :** unsloth/llama-3-8b-bnb-4bit This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library. [<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
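The card above names the toolchain and the 4-bit base model but includes no usage snippet. Below is a minimal sketch of loading such an Unsloth-trained LoRA for inference through Unsloth's `FastLanguageModel` entry point; the repository id is taken from this record's `id` field, while the sequence length, prompt, and device handling are assumptions.

```python
# Hedged sketch: load the LoRA repo from this record with Unsloth and generate once.
# Everything other than the repo id and the 4-bit flag is an assumption.
from unsloth import FastLanguageModel

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="sourcetableteam/lora_model",  # "id" field of this record
    max_seq_length=2048,                      # assumption; not stated in the card
    load_in_4bit=True,                        # matches the bnb-4bit base model
)
FastLanguageModel.for_inference(model)        # enable Unsloth's faster inference path

inputs = tokenizer(
    "Summarize what a LoRA adapter changes in a base Llama-3 model.",
    return_tensors="pt",
).to("cuda")
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```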
{"language": ["en"], "license": "apache-2.0", "tags": ["text-generation-inference", "transformers", "unsloth", "llama", "trl"], "base_model": "unsloth/llama-3-8b-bnb-4bit"}
sourcetableteam/lora_model
null
[ "transformers", "safetensors", "text-generation-inference", "unsloth", "llama", "trl", "en", "base_model:unsloth/llama-3-8b-bnb-4bit", "license:apache-2.0", "endpoints_compatible", "region:us" ]
null
2024-04-27T10:49:08+00:00
[]
[ "en" ]
TAGS #transformers #safetensors #text-generation-inference #unsloth #llama #trl #en #base_model-unsloth/llama-3-8b-bnb-4bit #license-apache-2.0 #endpoints_compatible #region-us
# Uploaded model - Developed by: sourcetableteam - License: apache-2.0 - Finetuned from model : unsloth/llama-3-8b-bnb-4bit This llama model was trained 2x faster with Unsloth and Huggingface's TRL library. <img src="URL width="200"/>
[ "# Uploaded model\n\n- Developed by: sourcetableteam\n- License: apache-2.0\n- Finetuned from model : unsloth/llama-3-8b-bnb-4bit\n\nThis llama model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>" ]
[ "TAGS\n#transformers #safetensors #text-generation-inference #unsloth #llama #trl #en #base_model-unsloth/llama-3-8b-bnb-4bit #license-apache-2.0 #endpoints_compatible #region-us \n", "# Uploaded model\n\n- Developed by: sourcetableteam\n- License: apache-2.0\n- Finetuned from model : unsloth/llama-3-8b-bnb-4bit\n\nThis llama model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>" ]
text-generation
transformers
# merge This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit). ## Merge Details ### Merge Method This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method using [meta-llama/Meta-Llama-3-8B](https://huggingface.co/meta-llama/Meta-Llama-3-8B) as a base. ### Models Merged The following models were included in the merge: * [Weyaxi/Einstein-v6.1-Llama3-8B](https://huggingface.co/Weyaxi/Einstein-v6.1-Llama3-8B) * [meta-llama/Meta-Llama-3-8B-Instruct](https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct) ### Configuration The following YAML configuration was used to produce this model: ```yaml models: - model: meta-llama/Meta-Llama-3-8B #no parameters necessary for base model - model: Weyaxi/Einstein-v6.1-Llama3-8B parameters: density: 0.5 weight: 0.5 - model: meta-llama/Meta-Llama-3-8B-Instruct parameters: density: 0.5 weight: 0.5 merge_method: ties base_model: meta-llama/Meta-Llama-3-8B parameters: normalize: false int8_mask: true dtype: bfloat16 ```
{"license": "other", "library_name": "transformers", "tags": ["mergekit", "merge"], "base_model": ["Weyaxi/Einstein-v6.1-Llama3-8B", "meta-llama/Meta-Llama-3-8B-Instruct", "meta-llama/Meta-Llama-3-8B"]}
Weyaxi/a
null
[ "transformers", "safetensors", "llama", "text-generation", "mergekit", "merge", "arxiv:2306.01708", "base_model:Weyaxi/Einstein-v6.1-Llama3-8B", "base_model:meta-llama/Meta-Llama-3-8B-Instruct", "base_model:meta-llama/Meta-Llama-3-8B", "license:other", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2024-04-27T10:49:10+00:00
[ "2306.01708" ]
[]
TAGS #transformers #safetensors #llama #text-generation #mergekit #merge #arxiv-2306.01708 #base_model-Weyaxi/Einstein-v6.1-Llama3-8B #base_model-meta-llama/Meta-Llama-3-8B-Instruct #base_model-meta-llama/Meta-Llama-3-8B #license-other #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# merge This is a merge of pre-trained language models created using mergekit. ## Merge Details ### Merge Method This model was merged using the TIES merge method using meta-llama/Meta-Llama-3-8B as a base. ### Models Merged The following models were included in the merge: * Weyaxi/Einstein-v6.1-Llama3-8B * meta-llama/Meta-Llama-3-8B-Instruct ### Configuration The following YAML configuration was used to produce this model:
[ "# merge\n\nThis is a merge of pre-trained language models created using mergekit.", "## Merge Details", "### Merge Method\n\nThis model was merged using the TIES merge method using meta-llama/Meta-Llama-3-8B as a base.", "### Models Merged\n\nThe following models were included in the merge:\n* Weyaxi/Einstein-v6.1-Llama3-8B\n* meta-llama/Meta-Llama-3-8B-Instruct", "### Configuration\n\nThe following YAML configuration was used to produce this model:" ]
[ "TAGS\n#transformers #safetensors #llama #text-generation #mergekit #merge #arxiv-2306.01708 #base_model-Weyaxi/Einstein-v6.1-Llama3-8B #base_model-meta-llama/Meta-Llama-3-8B-Instruct #base_model-meta-llama/Meta-Llama-3-8B #license-other #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# merge\n\nThis is a merge of pre-trained language models created using mergekit.", "## Merge Details", "### Merge Method\n\nThis model was merged using the TIES merge method using meta-llama/Meta-Llama-3-8B as a base.", "### Models Merged\n\nThe following models were included in the merge:\n* Weyaxi/Einstein-v6.1-Llama3-8B\n* meta-llama/Meta-Llama-3-8B-Instruct", "### Configuration\n\nThe following YAML configuration was used to produce this model:" ]
null
transformers
# s0br/Llama-3-8B-Web-Q6_K-GGUF This model was converted to GGUF format from [`McGill-NLP/Llama-3-8B-Web`](https://huggingface.co/McGill-NLP/Llama-3-8B-Web) using llama.cpp via the ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space. Refer to the [original model card](https://huggingface.co/McGill-NLP/Llama-3-8B-Web) for more details on the model. ## Use with llama.cpp Install llama.cpp through brew. ```bash brew install ggerganov/ggerganov/llama.cpp ``` Invoke the llama.cpp server or the CLI. CLI: ```bash llama-cli --hf-repo s0br/Llama-3-8B-Web-Q6_K-GGUF --model llama-3-8b-web.Q6_K.gguf -p "The meaning to life and the universe is" ``` Server: ```bash llama-server --hf-repo s0br/Llama-3-8B-Web-Q6_K-GGUF --model llama-3-8b-web.Q6_K.gguf -c 2048 ``` Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the Llama.cpp repo as well. ``` git clone https://github.com/ggerganov/llama.cpp && cd llama.cpp && make && ./main -m llama-3-8b-web.Q6_K.gguf -n 128 ```
{"language": ["en"], "license": "llama3", "library_name": "transformers", "tags": ["agents", "agent", "llm", "llama", "llama-cpp", "gguf-my-repo"], "datasets": ["McGill-NLP/WebLINX"]}
s0br/Llama-3-8B-Web-Q6_K-GGUF
null
[ "transformers", "gguf", "agents", "agent", "llm", "llama", "llama-cpp", "gguf-my-repo", "en", "dataset:McGill-NLP/WebLINX", "license:llama3", "endpoints_compatible", "region:us" ]
null
2024-04-27T10:50:31+00:00
[]
[ "en" ]
TAGS #transformers #gguf #agents #agent #llm #llama #llama-cpp #gguf-my-repo #en #dataset-McGill-NLP/WebLINX #license-llama3 #endpoints_compatible #region-us
# s0br/Llama-3-8B-Web-Q6_K-GGUF This model was converted to GGUF format from 'McGill-NLP/Llama-3-8B-Web' using URL via the URL's GGUF-my-repo space. Refer to the original model card for more details on the model. ## Use with URL Install URL through brew. Invoke the URL server or the CLI. CLI: Server: Note: You can also use this checkpoint directly through the usage steps listed in the URL repo as well.
[ "# s0br/Llama-3-8B-Web-Q6_K-GGUF\nThis model was converted to GGUF format from 'McGill-NLP/Llama-3-8B-Web' using URL via the URL's GGUF-my-repo space.\nRefer to the original model card for more details on the model.", "## Use with URL\n\nInstall URL through brew.\n\n\nInvoke the URL server or the CLI.\n\nCLI:\n\n\n\nServer:\n\n\n\nNote: You can also use this checkpoint directly through the usage steps listed in the URL repo as well." ]
[ "TAGS\n#transformers #gguf #agents #agent #llm #llama #llama-cpp #gguf-my-repo #en #dataset-McGill-NLP/WebLINX #license-llama3 #endpoints_compatible #region-us \n", "# s0br/Llama-3-8B-Web-Q6_K-GGUF\nThis model was converted to GGUF format from 'McGill-NLP/Llama-3-8B-Web' using URL via the URL's GGUF-my-repo space.\nRefer to the original model card for more details on the model.", "## Use with URL\n\nInstall URL through brew.\n\n\nInvoke the URL server or the CLI.\n\nCLI:\n\n\n\nServer:\n\n\n\nNote: You can also use this checkpoint directly through the usage steps listed in the URL repo as well." ]
image-classification
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # Boya1_RMSProp_1-e5_10Epoch_swinv2-large-patch4_fold1 This model is a fine-tuned version of [microsoft/swinv2-large-patch4-window12to16-192to256-22kto1k-ft](https://huggingface.co/microsoft/swinv2-large-patch4-window12to16-192to256-22kto1k-ft) on the imagefolder dataset. It achieves the following results on the evaluation set: - Loss: 1.9171 - Accuracy: 0.6782 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:-----:|:---------------:|:--------:| | 1.1746 | 1.0 | 1848 | 1.0899 | 0.6182 | | 0.8689 | 2.0 | 3696 | 1.0184 | 0.6602 | | 0.8727 | 3.0 | 5544 | 0.9164 | 0.6871 | | 0.4107 | 4.0 | 7392 | 1.0648 | 0.6811 | | 0.3287 | 5.0 | 9240 | 1.1981 | 0.6847 | | 0.2801 | 6.0 | 11088 | 1.3374 | 0.6803 | | 0.706 | 7.0 | 12936 | 1.5676 | 0.6733 | | 0.4501 | 8.0 | 14784 | 1.7520 | 0.6787 | | 0.1707 | 9.0 | 16632 | 1.8480 | 0.6841 | | 0.097 | 10.0 | 18480 | 1.9171 | 0.6782 | ### Framework versions - Transformers 4.35.0 - Pytorch 2.1.0 - Datasets 2.14.6 - Tokenizers 0.14.1
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["imagefolder"], "metrics": ["accuracy"], "base_model": "microsoft/swinv2-large-patch4-window12to16-192to256-22kto1k-ft", "model-index": [{"name": "Boya1_RMSProp_1-e5_10Epoch_swinv2-large-patch4_fold1", "results": [{"task": {"type": "image-classification", "name": "Image Classification"}, "dataset": {"name": "imagefolder", "type": "imagefolder", "config": "default", "split": "test", "args": "default"}, "metrics": [{"type": "accuracy", "value": 0.6781546811397557, "name": "Accuracy"}]}]}]}
onizukal/Boya1_RMSProp_1-e5_10Epoch_swinv2-large-patch4_fold1
null
[ "transformers", "safetensors", "swinv2", "image-classification", "generated_from_trainer", "dataset:imagefolder", "base_model:microsoft/swinv2-large-patch4-window12to16-192to256-22kto1k-ft", "license:apache-2.0", "model-index", "autotrain_compatible", "endpoints_compatible", "region:us" ]
null
2024-04-27T10:52:19+00:00
[]
[]
TAGS #transformers #safetensors #swinv2 #image-classification #generated_from_trainer #dataset-imagefolder #base_model-microsoft/swinv2-large-patch4-window12to16-192to256-22kto1k-ft #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us
Boya1\_RMSProp\_1-e5\_10Epoch\_swinv2-large-patch4\_fold1 ========================================================= This model is a fine-tuned version of microsoft/swinv2-large-patch4-window12to16-192to256-22kto1k-ft on the imagefolder dataset. It achieves the following results on the evaluation set: * Loss: 1.9171 * Accuracy: 0.6782 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 1e-05 * train\_batch\_size: 8 * eval\_batch\_size: 8 * seed: 42 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * lr\_scheduler\_warmup\_ratio: 0.1 * num\_epochs: 10 ### Training results ### Framework versions * Transformers 4.35.0 * Pytorch 2.1.0 * Datasets 2.14.6 * Tokenizers 0.14.1
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 10", "### Training results", "### Framework versions\n\n\n* Transformers 4.35.0\n* Pytorch 2.1.0\n* Datasets 2.14.6\n* Tokenizers 0.14.1" ]
[ "TAGS\n#transformers #safetensors #swinv2 #image-classification #generated_from_trainer #dataset-imagefolder #base_model-microsoft/swinv2-large-patch4-window12to16-192to256-22kto1k-ft #license-apache-2.0 #model-index #autotrain_compatible #endpoints_compatible #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 10", "### Training results", "### Framework versions\n\n\n* Transformers 4.35.0\n* Pytorch 2.1.0\n* Datasets 2.14.6\n* Tokenizers 0.14.1" ]
question-answering
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
NeginShams/parsbert-extratranslation
null
[ "transformers", "safetensors", "bert", "question-answering", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
null
2024-04-27T10:53:52+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #bert #question-answering #arxiv-1910.09700 #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #bert #question-answering #arxiv-1910.09700 #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
text-classification
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
Yoav-Yosef/toxic-comment-detection-model
null
[ "transformers", "safetensors", "distilbert", "text-classification", "arxiv:1910.09700", "autotrain_compatible", "endpoints_compatible", "region:us" ]
null
2024-04-27T10:55:01+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #distilbert #text-classification #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #distilbert #text-classification #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
null
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
BaoLoi2003/vinallama-peft-7b-math-solver
null
[ "transformers", "safetensors", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
null
2024-04-27T10:55:36+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
feature-extraction
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
aravindhank/valuenet-bart-base
null
[ "transformers", "safetensors", "bart", "feature-extraction", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
null
2024-04-27T10:59:39+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #bart #feature-extraction #arxiv-1910.09700 #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #bart #feature-extraction #arxiv-1910.09700 #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
text-classification
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # distilbert-imdb-mlflow This model is a fine-tuned version of [distilbert-base-cased](https://huggingface.co/distilbert-base-cased) on an unknown dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 1 ### Framework versions - Transformers 4.40.1 - Pytorch 2.3.0+cpu - Datasets 2.19.0 - Tokenizers 0.19.1
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "base_model": "distilbert-base-cased", "model-index": [{"name": "distilbert-imdb-mlflow", "results": []}]}
nirmeshdell/distilbert-imdb-mlflow
null
[ "transformers", "safetensors", "distilbert", "text-classification", "generated_from_trainer", "base_model:distilbert-base-cased", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
null
2024-04-27T11:00:21+00:00
[]
[]
TAGS #transformers #safetensors #distilbert #text-classification #generated_from_trainer #base_model-distilbert-base-cased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
# distilbert-imdb-mlflow This model is a fine-tuned version of distilbert-base-cased on an unknown dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 1 ### Framework versions - Transformers 4.40.1 - Pytorch 2.3.0+cpu - Datasets 2.19.0 - Tokenizers 0.19.1
[ "# distilbert-imdb-mlflow\n\nThis model is a fine-tuned version of distilbert-base-cased on an unknown dataset.", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 1", "### Framework versions\n\n- Transformers 4.40.1\n- Pytorch 2.3.0+cpu\n- Datasets 2.19.0\n- Tokenizers 0.19.1" ]
[ "TAGS\n#transformers #safetensors #distilbert #text-classification #generated_from_trainer #base_model-distilbert-base-cased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n", "# distilbert-imdb-mlflow\n\nThis model is a fine-tuned version of distilbert-base-cased on an unknown dataset.", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 1", "### Framework versions\n\n- Transformers 4.40.1\n- Pytorch 2.3.0+cpu\n- Datasets 2.19.0\n- Tokenizers 0.19.1" ]
text-generation
transformers
For some reason this model is SLOW! but it is highly powerful: create a prompt and let it perform the TASK! I'm not sure about its conversational powers! (but it is task based!) ## MODELS !! :: : - Why? A new base model generation from the final Cybertron series model and the final CyberSeries models :| It would seem that some models are not registering on the leaderboard ?? perhaps there is a limit per person ! : followers should know that the CyberBoss was my highest model (renamed), and my Cybertron models were heavily merged and trained on many datasets : even containing thinking paradigms : merging the collection back into the base model gives the model a great position to begin from ! hence a new base model marker (Untrained/Sharded)(totally unlocked) I had noticed the value of TopK=1000, TopP=0.78, Temp=0.86; these settings matter with merged models, allowing the model to produce slightly more random results while also giving it a larger pool to select from: obviously for role play the model requires Temp to be 1+ ::: ## FineTuning :: Fine-tuning models down to a loss close to 0.9 means that some information is effectively fixed and may not return without focusing the model ! sometimes it helps to train the model to 1.5+, allowing loosely trained data to surface when higher temperatures are applied ! hence role-play datasets being trained at higher loss rates than coding and math datasets (which sit close to overfitting) Hence merging plays an important role in centering the model again ! ## Merging is not just for fun and games! it is a vital part of the training process, locking data into the model as well as sharing data! remember, data is not stored in the model:: only the probability of the information being returned ! ## From here to where ? Currently there is a trend for evaluation ! evaluating the model to discover its weaknesses and threats , removing the specific layers identified in the model with the offensive content : enabling these layers to be retrained and replaced ! replaced with what ?? Replacing layers in the model also requires a realignment of information throughout the network ! despite being a copied layer (still preserving some content), once offensive content is discovered the network can be trained with its counter-argument; hence the evaluation process enables the creation of a custom dataset targeting these internalized data! Although a neural network is NOT a storage system, since the retrieval process is based on probabilities, at points in the network certain embedding values are present, and once translated or decoded into standard tokens they can actually be identified! ## WOW!! So ! this also means that at each layer the network is effectively storing a probability table , a word-to-word matrix of probabilities for the next token generation ! It may even be possible to train a network for image recognition , as long as the images are tokenized into an embedding value associated with the image, hence image tokenizers : the embedding value produced should enable the output to contain the same images that were present in the training set , i.e. they have been tokenized and embedded into the model, so it should be able to produce an embedding associated with this output ! Hence it should also be possible to retrieve the image from the image tokenizer ? so tokens not decoded by the text tokenizer should be handed off to the image tokenizer!
to decode the embedding and return its original (cascade) / digital numerical value (each pixel is a number, and with line encoding of images essentially each line can be reconstructed to produce an image, hence ALL images would need to be BitMap/JPEG/PNG according to the encoder!) MISSION! But still we will need to install all the competition datasets into the model , so that the original baselines can be established, enabling, after layer removal, full realignment to the same dataset collection ! hence retaining all functionality. It's worth noting that domain-specific datasets should also be handled in the same way! MORE TO COME! (look out for the SFT's and Merges)
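As an illustration of the sampling settings quoted above, a minimal generation sketch with transformers follows; the prompt is illustrative, and since the card does not specify a prompt format, plain completion-style input is assumed:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "LeroyDyer/Mixtral_AI_2.0_7b"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "Task: draft a three-step plan for cleaning a noisy text dataset."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Sampling values quoted in the card; push temperature towards 1+ for role play.
output = model.generate(
    **inputs,
    do_sample=True,
    top_k=1000,
    top_p=0.78,
    temperature=0.86,
    max_new_tokens=200,
)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```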
{"language": ["en"], "license": "apache-2.0", "library_name": "transformers", "tags": ["baseModel"], "base_model": ["LeroyDyer/Mixtral_AI_Cyber_Boss", "LeroyDyer/Mixtral_AI_CyberUltron", "mistralai/Mistral-7B-Instruct-v0.2"]}
LeroyDyer/Mixtral_AI_2.0_7b
null
[ "transformers", "safetensors", "mistral", "text-generation", "baseModel", "en", "base_model:LeroyDyer/Mixtral_AI_Cyber_Boss", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2024-04-27T11:00:31+00:00
[]
[ "en" ]
TAGS #transformers #safetensors #mistral #text-generation #baseModel #en #base_model-LeroyDyer/Mixtral_AI_Cyber_Boss #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
For some reason this model is SLOW! but it is highly powerful: Create a Prompt and let it perform the TASK! Im not sure about its conversive powers! (but task based !!!!!@) ## MODELS !! :: : - Why? New base Mode Generation from the final Cybertron series model and the Final CyberSeries Models :| It would seem that some models are not registering on the board ?? perhaps there is a limmit per person ! : followers should know that the cyberboss was my highest model (renamed) And my Cybertron models were heavily merged and trained on many datasets : Even containing thinking pardigms : merging the collection back to base model give the model a great position to begin from ! hence a new base model marker (Untrained/Sharded)(totally unlocked) I had noticed the reality of TopK=1000,TopP=0.78, Temp=0.86 as so, Important with merged models allowing for the model to produce a bit more random results but also giving the model a larger pool to select from: obviously for Role play the model requires Temp to be 1+ ::: ## FineTuning :: Fine tuning models close to 0.9 means that some information is totally Fixed and maynot return without focusing the model ! sometimes to train the model to 1.5+ allowing for loosly trained datas to surface : when higher tempretures are applied ! hence role play datasets being trained at higher loss rates that codeing datasets and math datasets (close to overfitting) Hence Merging playing animportant role in centering the model again ! ## Merging is not just for fun and game! it is a vital part of the training process and locking data into the model as well as sharing data! remember data is not stored in the model:: only the probablity of the information being returned ! ## From here to where ? Currently there is a trend for evaluation ! evaluating the model to discover its weaknesses and threats , removing the specific layers identifed in the model with the ofensive content : enabling for these layers to be trained and replaced ! replace with ?? Replacing layers in the model ; also requires a realignment of information throughout the network ! despite being a copied layer (Still preserving some content) once ofensive content is discovered the network can be trained with its counter argument; hence the evaluation process enabes for the creationn of a custom dataset: targetting these internalized datas! Despite a neural network NOT being a storage system as the retrival process is based oñ probablliities :hence at points in the networ certain emebedding values are present and once translated or decodedd into standard tokens can actually be identidfed! ## WOW!! So ! this also means at each layer the network is actually storing a probablity table , word to word matrix of URL for the next token generation ! IT may even be possible to train a network for image recognition , as long as the images are tokenized into an embedding value associated with the image, Hence image tokenizers : The embedding value produced should enable the output to contain the same images that were present in the training set , ie they have been tokenized and embedded into the model so it should be able to produce an embedding associated with this output ! Hence is should also be possible to retrive the image from the image tokenizer ? so tokens not decoded by the text tokenizer should be handed off to the image tokenizer! 
to dcode the embedding and return its original (cascade) / digital numercical value (each pixel is a number and with line encoding of images essentially each line can be reconstructed to produce an image, hence ALL images would nbeed to be BitMap/JPEG/PNG acording to the encoder!) MISSION! But still we will need to uinstall all the competition datasets into the mode , so that the original baselines can be established enabling for , after layer removal full realignment to the same dataset collection ! hence retaining all funcitonality, its worth noting that domain specific datasets should also be handled in the same way! MORE TO COME!(look out for the SFT's and Merges)
[ "## MODELS !! :: : - Why?\n\nNew base Mode Generation from the final Cybertron series model and the Final CyberSeries Models :|\nIt would seem that some models are not registering on the board ?? perhaps there is a limmit per person ! :\n\nfollowers should know that the cyberboss was my highest model (renamed)\nAnd my Cybertron models were heavily merged and trained on many datasets : Even containing thinking pardigms :\n\nmerging the collection back to base model give the model a great position to begin from ! \n\nhence a new base model marker (Untrained/Sharded)(totally unlocked)\n\nI had noticed the reality of TopK=1000,TopP=0.78, Temp=0.86 \nas so, \nImportant with merged models allowing for the model to produce a bit more random results but also giving the model a larger pool to select from:\nobviously for Role play the model requires Temp to be 1+ \n:::", "## FineTuning ::\nFine tuning models close to 0.9 means that some information is totally Fixed and maynot return without focusing the model ! sometimes to train the model to 1.5+\nallowing for loosly trained datas to surface : \nwhen higher tempretures are applied ! hence role play datasets being trained at higher loss rates that codeing datasets and math datasets (close to overfitting)\n\n\nHence Merging playing animportant role in centering the model again !", "## Merging is not just for fun and game! \nit is a vital part of the training process and locking data into the model as well as sharing data!\nremember data is not stored in the model:: only the probablity of the information being returned !", "## From here to where ? \n\nCurrently there is a trend for evaluation !\nevaluating the model to discover its weaknesses and threats , removing the specific layers identifed in the model with the ofensive content :\nenabling for these layers to be trained and replaced ! replace with ?? \nReplacing layers in the model ; also requires a realignment of information throughout the network !\ndespite being a copied layer (Still preserving some content) once ofensive content is discovered the network can be trained with its counter argument; hence the evaluation process enabes for the creationn of a custom dataset: targetting these internalized datas!\nDespite a neural network NOT being a storage system as the retrival process is based oñ probablliities :hence at points in the networ certain emebedding values are present and once translated or decodedd into standard tokens can actually be identidfed!", "## WOW!!\nSo !\nthis also means at each layer the network is actually storing a probablity table , word to word matrix of URL for the next token generation !\nIT may even be possible to train a network for image recognition , as long as the images are tokenized into an embedding value associated with the image, Hence image tokenizers :\nThe embedding value produced should enable the output to contain the same images that were present in the training set , ie they have been tokenized and embedded into the model so it should be able to produce an embedding associated with this output !\nHence is should also be possible to retrive the image from the image tokenizer ? so tokens not decoded by the text tokenizer should be handed off to the image tokenizer! 
to dcode the embedding and return its original (cascade) / digital numercical value (each pixel is a number and with line encoding of images essentially each line can be reconstructed to produce an image, hence ALL images would nbeed to be BitMap/JPEG/PNG acording to the encoder!)\nMISSION!\n\nBut still we will need to uinstall all the competition datasets into the mode , so that the original baselines can be established enabling for , after layer removal full realignment to the same dataset collection ! hence retaining all funcitonality, its worth noting that domain specific datasets should also be handled in the same way!\n\n\nMORE TO COME!(look out for the SFT's and Merges)" ]
[ "TAGS\n#transformers #safetensors #mistral #text-generation #baseModel #en #base_model-LeroyDyer/Mixtral_AI_Cyber_Boss #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "## MODELS !! :: : - Why?\n\nNew base Mode Generation from the final Cybertron series model and the Final CyberSeries Models :|\nIt would seem that some models are not registering on the board ?? perhaps there is a limmit per person ! :\n\nfollowers should know that the cyberboss was my highest model (renamed)\nAnd my Cybertron models were heavily merged and trained on many datasets : Even containing thinking pardigms :\n\nmerging the collection back to base model give the model a great position to begin from ! \n\nhence a new base model marker (Untrained/Sharded)(totally unlocked)\n\nI had noticed the reality of TopK=1000,TopP=0.78, Temp=0.86 \nas so, \nImportant with merged models allowing for the model to produce a bit more random results but also giving the model a larger pool to select from:\nobviously for Role play the model requires Temp to be 1+ \n:::", "## FineTuning ::\nFine tuning models close to 0.9 means that some information is totally Fixed and maynot return without focusing the model ! sometimes to train the model to 1.5+\nallowing for loosly trained datas to surface : \nwhen higher tempretures are applied ! hence role play datasets being trained at higher loss rates that codeing datasets and math datasets (close to overfitting)\n\n\nHence Merging playing animportant role in centering the model again !", "## Merging is not just for fun and game! \nit is a vital part of the training process and locking data into the model as well as sharing data!\nremember data is not stored in the model:: only the probablity of the information being returned !", "## From here to where ? \n\nCurrently there is a trend for evaluation !\nevaluating the model to discover its weaknesses and threats , removing the specific layers identifed in the model with the ofensive content :\nenabling for these layers to be trained and replaced ! replace with ?? \nReplacing layers in the model ; also requires a realignment of information throughout the network !\ndespite being a copied layer (Still preserving some content) once ofensive content is discovered the network can be trained with its counter argument; hence the evaluation process enabes for the creationn of a custom dataset: targetting these internalized datas!\nDespite a neural network NOT being a storage system as the retrival process is based oñ probablliities :hence at points in the networ certain emebedding values are present and once translated or decodedd into standard tokens can actually be identidfed!", "## WOW!!\nSo !\nthis also means at each layer the network is actually storing a probablity table , word to word matrix of URL for the next token generation !\nIT may even be possible to train a network for image recognition , as long as the images are tokenized into an embedding value associated with the image, Hence image tokenizers :\nThe embedding value produced should enable the output to contain the same images that were present in the training set , ie they have been tokenized and embedded into the model so it should be able to produce an embedding associated with this output !\nHence is should also be possible to retrive the image from the image tokenizer ? so tokens not decoded by the text tokenizer should be handed off to the image tokenizer! 
to dcode the embedding and return its original (cascade) / digital numercical value (each pixel is a number and with line encoding of images essentially each line can be reconstructed to produce an image, hence ALL images would nbeed to be BitMap/JPEG/PNG acording to the encoder!)\nMISSION!\n\nBut still we will need to uinstall all the competition datasets into the mode , so that the original baselines can be established enabling for , after layer removal full realignment to the same dataset collection ! hence retaining all funcitonality, its worth noting that domain specific datasets should also be handled in the same way!\n\n\nMORE TO COME!(look out for the SFT's and Merges)" ]
null
transformers
## About <!-- ### quantize_version: 2 --> <!-- ### output_tensor_quantised: 1 --> <!-- ### convert_type: --> <!-- ### vocab_type: --> static quants of https://huggingface.co/saucam/Skyro-4X8B <!-- provided-files --> weighted/imatrix quants seem not to be available (by me) at this time. If they do not show up a week or so after the static ones, I have probably not planned for them. Feel free to request them by opening a Community Discussion. ## Usage If you are unsure how to use GGUF files, refer to one of [TheBloke's READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for more details, including on how to concatenate multi-part files. ## Provided Quants (sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants) | Link | Type | Size/GB | Notes | |:-----|:-----|--------:|:------| | [GGUF](https://huggingface.co/mradermacher/Skyro-4X8B-GGUF/resolve/main/Skyro-4X8B.Q2_K.gguf) | Q2_K | 9.4 | | | [GGUF](https://huggingface.co/mradermacher/Skyro-4X8B-GGUF/resolve/main/Skyro-4X8B.IQ3_XS.gguf) | IQ3_XS | 10.5 | | | [GGUF](https://huggingface.co/mradermacher/Skyro-4X8B-GGUF/resolve/main/Skyro-4X8B.Q3_K_S.gguf) | Q3_K_S | 11.0 | | | [GGUF](https://huggingface.co/mradermacher/Skyro-4X8B-GGUF/resolve/main/Skyro-4X8B.IQ3_S.gguf) | IQ3_S | 11.1 | beats Q3_K* | | [GGUF](https://huggingface.co/mradermacher/Skyro-4X8B-GGUF/resolve/main/Skyro-4X8B.IQ3_M.gguf) | IQ3_M | 11.2 | | | [GGUF](https://huggingface.co/mradermacher/Skyro-4X8B-GGUF/resolve/main/Skyro-4X8B.Q3_K_M.gguf) | Q3_K_M | 12.2 | lower quality | | [GGUF](https://huggingface.co/mradermacher/Skyro-4X8B-GGUF/resolve/main/Skyro-4X8B.Q3_K_L.gguf) | Q3_K_L | 13.1 | | | [GGUF](https://huggingface.co/mradermacher/Skyro-4X8B-GGUF/resolve/main/Skyro-4X8B.IQ4_XS.gguf) | IQ4_XS | 13.7 | | | [GGUF](https://huggingface.co/mradermacher/Skyro-4X8B-GGUF/resolve/main/Skyro-4X8B.Q4_K_S.gguf) | Q4_K_S | 14.4 | fast, recommended | | [GGUF](https://huggingface.co/mradermacher/Skyro-4X8B-GGUF/resolve/main/Skyro-4X8B.Q4_K_M.gguf) | Q4_K_M | 15.3 | fast, recommended | | [GGUF](https://huggingface.co/mradermacher/Skyro-4X8B-GGUF/resolve/main/Skyro-4X8B.Q5_K_S.gguf) | Q5_K_S | 17.3 | | | [GGUF](https://huggingface.co/mradermacher/Skyro-4X8B-GGUF/resolve/main/Skyro-4X8B.Q5_K_M.gguf) | Q5_K_M | 17.8 | | | [GGUF](https://huggingface.co/mradermacher/Skyro-4X8B-GGUF/resolve/main/Skyro-4X8B.Q6_K.gguf) | Q6_K | 20.6 | very good quality | | [GGUF](https://huggingface.co/mradermacher/Skyro-4X8B-GGUF/resolve/main/Skyro-4X8B.Q8_0.gguf) | Q8_0 | 26.6 | fast, best quality | Here is a handy graph by ikawrakow comparing some lower-quality quant types (lower is better): ![image.png](https://www.nethype.de/huggingface_embed/quantpplgraph.png) And here are Artefact2's thoughts on the matter: https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9 ## FAQ / Model Request See https://huggingface.co/mradermacher/model_requests for some answers to questions you might have and/or if you want some other model quantized. ## Thanks I thank my company, [nethype GmbH](https://www.nethype.de/), for letting me use its servers and providing upgrades to my workstation to enable this work in my free time. <!-- end -->
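For readers unsure how to run one of the files above, a minimal sketch using huggingface_hub together with llama-cpp-python (the runtime choice is an assumption; any llama.cpp-compatible loader works):

```python
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Download one of the single-file quants listed in the table above.
gguf_path = hf_hub_download(
    repo_id="mradermacher/Skyro-4X8B-GGUF",
    filename="Skyro-4X8B.Q4_K_M.gguf",
)

llm = Llama(model_path=gguf_path, n_ctx=4096)
out = llm("Explain what a Mixture-of-Experts model is.", max_tokens=128)
print(out["choices"][0]["text"])
```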
{"language": ["en"], "license": "apache-2.0", "library_name": "transformers", "tags": ["merge", "mergekit", "moe", "frankenmoe", "abacusai/Llama-3-Smaug-8B", "cognitivecomputations/dolphin-2.9-llama3-8b", "Weyaxi/Einstein-v6.1-Llama3-8B", "dreamgen-preview/opus-v1.2-llama-3-8b-base-run3.4-epoch2"], "base_model": "saucam/Skyro-4X8B", "quantized_by": "mradermacher"}
mradermacher/Skyro-4X8B-GGUF
null
[ "transformers", "gguf", "merge", "mergekit", "moe", "frankenmoe", "abacusai/Llama-3-Smaug-8B", "cognitivecomputations/dolphin-2.9-llama3-8b", "Weyaxi/Einstein-v6.1-Llama3-8B", "dreamgen-preview/opus-v1.2-llama-3-8b-base-run3.4-epoch2", "en", "base_model:saucam/Skyro-4X8B", "license:apache-2.0", "endpoints_compatible", "region:us" ]
null
2024-04-27T11:00:59+00:00
[]
[ "en" ]
TAGS #transformers #gguf #merge #mergekit #moe #frankenmoe #abacusai/Llama-3-Smaug-8B #cognitivecomputations/dolphin-2.9-llama3-8b #Weyaxi/Einstein-v6.1-Llama3-8B #dreamgen-preview/opus-v1.2-llama-3-8b-base-run3.4-epoch2 #en #base_model-saucam/Skyro-4X8B #license-apache-2.0 #endpoints_compatible #region-us
About ----- static quants of URL weighted/imatrix quants seem not to be available (by me) at this time. If they do not show up a week or so after the static ones, I have probably not planned for them. Feel free to request them by opening a Community Discussion. Usage ----- If you are unsure how to use GGUF files, refer to one of TheBloke's READMEs for more details, including on how to concatenate multi-part files. Provided Quants --------------- (sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants) Here is a handy graph by ikawrakow comparing some lower-quality quant types (lower is better): !URL And here are Artefact2's thoughts on the matter: URL FAQ / Model Request ------------------- See URL for some answers to questions you might have and/or if you want some other model quantized. Thanks ------ I thank my company, nethype GmbH, for letting me use its servers and providing upgrades to my workstation to enable this work in my free time.
[]
[ "TAGS\n#transformers #gguf #merge #mergekit #moe #frankenmoe #abacusai/Llama-3-Smaug-8B #cognitivecomputations/dolphin-2.9-llama3-8b #Weyaxi/Einstein-v6.1-Llama3-8B #dreamgen-preview/opus-v1.2-llama-3-8b-base-run3.4-epoch2 #en #base_model-saucam/Skyro-4X8B #license-apache-2.0 #endpoints_compatible #region-us \n" ]
question-answering
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
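The quick-start section above is still empty; assuming the checkpoint follows the standard extractive question-answering interface advertised by its tags, a minimal sketch might look like this (question and context are purely illustrative):

```python
from transformers import pipeline

# Repository id from this entry; the card itself does not yet document usage.
qa = pipeline("question-answering", model="NeginShams/albert-extratranslation")

result = qa(
    question="Where was the workshop held?",
    context="The annual NLP workshop was held in Tehran in March 2024.",
)
print(result["answer"], round(result["score"], 3))
```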
{"library_name": "transformers", "tags": []}
NeginShams/albert-extratranslation
null
[ "transformers", "safetensors", "bert", "question-answering", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
null
2024-04-27T11:06:51+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #bert #question-answering #arxiv-1910.09700 #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #bert #question-answering #arxiv-1910.09700 #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
text-generation
transformers
# Uploaded model - **Developed by:** hunterlee27 - **License:** apache-2.0 - **Finetuned from model :** unsloth/llama-3-8b-Instruct-bnb-4bit This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library. [<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
{"language": ["en", "zh"], "license": "apache-2.0", "tags": ["text-generation-inference", "transformers", "unsloth", "llama", "trl"], "base_model": "unsloth/llama-3-8b-Instruct-bnb-4bit"}
hunterlee27/chinese-llama3-full-model
null
[ "transformers", "pytorch", "llama", "text-generation", "text-generation-inference", "unsloth", "trl", "conversational", "en", "zh", "base_model:unsloth/llama-3-8b-Instruct-bnb-4bit", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
null
2024-04-27T11:06:56+00:00
[]
[ "en", "zh" ]
TAGS #transformers #pytorch #llama #text-generation #text-generation-inference #unsloth #trl #conversational #en #zh #base_model-unsloth/llama-3-8b-Instruct-bnb-4bit #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
# Uploaded model - Developed by: hunterlee27 - License: apache-2.0 - Finetuned from model : unsloth/llama-3-8b-Instruct-bnb-4bit This llama model was trained 2x faster with Unsloth and Huggingface's TRL library. <img src="URL width="200"/>
[ "# Uploaded model\n\n- Developed by: hunterlee27\n- License: apache-2.0\n- Finetuned from model : unsloth/llama-3-8b-Instruct-bnb-4bit\n\nThis llama model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>" ]
[ "TAGS\n#transformers #pytorch #llama #text-generation #text-generation-inference #unsloth #trl #conversational #en #zh #base_model-unsloth/llama-3-8b-Instruct-bnb-4bit #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n", "# Uploaded model\n\n- Developed by: hunterlee27\n- License: apache-2.0\n- Finetuned from model : unsloth/llama-3-8b-Instruct-bnb-4bit\n\nThis llama model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>" ]
text-generation
transformers
## Model Summary The Phi-3-mini-mango-1 is an instruct finetune of [Phi-3-mini-4k-instruct](https://huggingface.co/microsoft/Phi-3-mini-4k-instruct) with 4K context and 3.8B parameters. It is a first cut of finetuning Phi-3 (which is a great model!) to explore its properties and behaviour. More to follow. You will need to update your local transformers to the development version to run this model (4.40.0-dev0 or above): ``` pip uninstall -y transformers && pip install git+https://github.com/huggingface/transformers ``` If this isn't possible, you can run the llamafied version of the model available at [rhysjones/Phi-3-mini-mango-1-llamafied](https://huggingface.co/rhysjones/Phi-3-mini-mango-1-llamafied) (incurs a small degradation in the abilities of the model).
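A minimal inference sketch, assuming the standard Phi-3 chat template applies to this finetune; the prompt is taken from the widget example in this card's metadata:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "rhysjones/Phi-3-mini-mango-1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

messages = [
    {"role": "user",
     "content": "Can you provide ways to eat combinations of bananas and dragonfruits?"},
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=200)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```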
{"language": ["en"], "license": "mit", "tags": ["nlp", "code"], "license_link": "https://huggingface.co/rhysjones/Phi-3-mini-mango-1/resolve/main/LICENSE", "pipeline_tag": "text-generation", "widget": [{"messages": [{"role": "user", "content": "Can you provide ways to eat combinations of bananas and dragonfruits?"}]}]}
rhysjones/Phi-3-mini-mango-1
null
[ "transformers", "safetensors", "phi3", "text-generation", "nlp", "code", "conversational", "en", "license:mit", "autotrain_compatible", "endpoints_compatible", "region:us" ]
null
2024-04-27T11:07:49+00:00
[]
[ "en" ]
TAGS #transformers #safetensors #phi3 #text-generation #nlp #code #conversational #en #license-mit #autotrain_compatible #endpoints_compatible #region-us
## Model Summary The Phi-3-mini-mango-1 is an instruct finetune of Phi-3-mini-4k-instruct with 4K context and 3.8B parameters. It is a first cut of finetuning Phi-3 (which is a great model!) to explore its properties and behaviour. More to follow. You will need to update your local transformers to the development version to run this model (4.40.0-dev0 or above): If this isn't possible, you can run the llamafied version of the model available at rhysjones/Phi-3-mini-mango-1-llamafied (incurrs a small degredation in the abilities of the model).
[ "## Model Summary\n\nThe Phi-3-mini-mango-1 is an instruct finetune of Phi-3-mini-4k-instruct with 4K context and 3.8B parameters.\n\nIt is a first cut of finetuning Phi-3 (which is a great model!) to explore its properties and behaviour. More to follow.\n\nYou will need to update your local transformers to the development version to run this model (4.40.0-dev0 or above):\n\n\nIf this isn't possible, you can run the llamafied version of the model available at rhysjones/Phi-3-mini-mango-1-llamafied (incurrs a small degredation in the abilities of the model)." ]
[ "TAGS\n#transformers #safetensors #phi3 #text-generation #nlp #code #conversational #en #license-mit #autotrain_compatible #endpoints_compatible #region-us \n", "## Model Summary\n\nThe Phi-3-mini-mango-1 is an instruct finetune of Phi-3-mini-4k-instruct with 4K context and 3.8B parameters.\n\nIt is a first cut of finetuning Phi-3 (which is a great model!) to explore its properties and behaviour. More to follow.\n\nYou will need to update your local transformers to the development version to run this model (4.40.0-dev0 or above):\n\n\nIf this isn't possible, you can run the llamafied version of the model available at rhysjones/Phi-3-mini-mango-1-llamafied (incurrs a small degredation in the abilities of the model)." ]
null
transformers
# Uploaded model - **Developed by:** kevinasyraf - **License:** apache-2.0 - **Finetuned from model :** unsloth/llama-3-8b-bnb-4bit This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library. [<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
{"language": ["en"], "license": "apache-2.0", "tags": ["text-generation-inference", "transformers", "unsloth", "llama", "gguf"], "base_model": "unsloth/llama-3-8b-bnb-4bit"}
kevinasyraf/llama-3-8b-instruct-id-v1
null
[ "transformers", "gguf", "llama", "text-generation-inference", "unsloth", "en", "base_model:unsloth/llama-3-8b-bnb-4bit", "license:apache-2.0", "endpoints_compatible", "region:us" ]
null
2024-04-27T11:08:36+00:00
[]
[ "en" ]
TAGS #transformers #gguf #llama #text-generation-inference #unsloth #en #base_model-unsloth/llama-3-8b-bnb-4bit #license-apache-2.0 #endpoints_compatible #region-us
# Uploaded model - Developed by: kevinasyraf - License: apache-2.0 - Finetuned from model : unsloth/llama-3-8b-bnb-4bit This llama model was trained 2x faster with Unsloth and Huggingface's TRL library. <img src="URL width="200"/>
[ "# Uploaded model\n\n- Developed by: kevinasyraf\n- License: apache-2.0\n- Finetuned from model : unsloth/llama-3-8b-bnb-4bit\n\nThis llama model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>" ]
[ "TAGS\n#transformers #gguf #llama #text-generation-inference #unsloth #en #base_model-unsloth/llama-3-8b-bnb-4bit #license-apache-2.0 #endpoints_compatible #region-us \n", "# Uploaded model\n\n- Developed by: kevinasyraf\n- License: apache-2.0\n- Finetuned from model : unsloth/llama-3-8b-bnb-4bit\n\nThis llama model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>" ]
null
peft
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed] ### Framework versions - PEFT 0.7.1
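The quick-start section above is empty; here is a loading sketch under the assumption that this repository holds a PEFT adapter for the FRED-T5 seq2seq base named in the metadata (the prompt is illustrative and the exact task format is not documented):

```python
from peft import PeftModel
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

base_id = "ai-forever/FRED-T5-1.7B"
adapter_id = "chibeenot/aspect_tonality"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForSeq2SeqLM.from_pretrained(base_id)

# Attach the adapter weights on top of the frozen base model.
model = PeftModel.from_pretrained(base_model, adapter_id)

# Illustrative Russian review ("the service was fast, but the food was cold").
inputs = tokenizer("Отзыв: сервис быстрый, но еда холодная.", return_tensors="pt")
output = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```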
{"library_name": "peft", "base_model": "ai-forever/FRED-T5-1.7B"}
chibeenot/aspect_tonality
null
[ "peft", "arxiv:1910.09700", "base_model:ai-forever/FRED-T5-1.7B", "region:us" ]
null
2024-04-27T11:09:49+00:00
[ "1910.09700" ]
[]
TAGS #peft #arxiv-1910.09700 #base_model-ai-forever/FRED-T5-1.7B #region-us
# Model Card for Model ID ## Model Details ### Model Description - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact ### Framework versions - PEFT 0.7.1
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact", "### Framework versions\n\n- PEFT 0.7.1" ]
[ "TAGS\n#peft #arxiv-1910.09700 #base_model-ai-forever/FRED-T5-1.7B #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact", "### Framework versions\n\n- PEFT 0.7.1" ]
null
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # speaker-segmentation-fine-tuned-simsamu This model is a fine-tuned version of [pyannote/segmentation-3.0](https://huggingface.co/pyannote/segmentation-3.0) on the diarizers-community/simsamu default dataset. It achieves the following results on the evaluation set: - Loss: 0.2302 - Der: 0.0911 - False Alarm: 0.0236 - Missed Detection: 0.0413 - Confusion: 0.0262 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.001 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - num_epochs: 5.0 ### Training results | Training Loss | Epoch | Step | Validation Loss | Der | False Alarm | Missed Detection | Confusion | |:-------------:|:-----:|:----:|:---------------:|:------:|:-----------:|:----------------:|:---------:| | 0.2179 | 1.0 | 111 | 0.2240 | 0.0964 | 0.0254 | 0.0470 | 0.0240 | | 0.1678 | 2.0 | 222 | 0.2279 | 0.0943 | 0.0236 | 0.0447 | 0.0260 | | 0.156 | 3.0 | 333 | 0.2327 | 0.0947 | 0.0222 | 0.0450 | 0.0274 | | 0.1507 | 4.0 | 444 | 0.2301 | 0.0919 | 0.0237 | 0.0420 | 0.0262 | | 0.1471 | 5.0 | 555 | 0.2302 | 0.0911 | 0.0236 | 0.0413 | 0.0262 | ### Framework versions - Transformers 4.40.1 - Pytorch 2.2.0+cu121 - Datasets 2.17.0 - Tokenizers 0.19.1
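A minimal sketch of plugging this fine-tuned segmentation checkpoint back into a pyannote diarization pipeline, assuming the diarizers conversion helper is available as in the diarizers-community examples (exact attribute names may vary across pyannote versions):

```python
import torch
from diarizers import SegmentationModel
from pyannote.audio import Pipeline

# Convert the transformers-format checkpoint back into a pyannote model.
segmentation = SegmentationModel.from_pretrained(
    "tgrhn/speaker-segmentation-fine-tuned-simsamu"
).to_pyannote_model()

# Stock diarization pipeline (gated repo: requires accepting its conditions / an HF token).
pipeline = Pipeline.from_pretrained("pyannote/speaker-diarization-3.1")
pipeline._segmentation.model = segmentation.to(torch.device("cpu"))

# "call.wav" is a placeholder audio file.
for turn, _, speaker in pipeline("call.wav").itertracks(yield_label=True):
    print(f"{turn.start:.1f}s to {turn.end:.1f}s: {speaker}")
```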
{"license": "mit", "tags": ["speaker-diarization", "speaker-segmentation", "generated_from_trainer"], "datasets": ["diarizers-community/simsamu"], "base_model": "pyannote/segmentation-3.0", "model-index": [{"name": "speaker-segmentation-fine-tuned-simsamu", "results": []}]}
tgrhn/speaker-segmentation-fine-tuned-simsamu
null
[ "transformers", "tensorboard", "safetensors", "pyannet", "speaker-diarization", "speaker-segmentation", "generated_from_trainer", "dataset:diarizers-community/simsamu", "base_model:pyannote/segmentation-3.0", "license:mit", "endpoints_compatible", "region:us" ]
null
2024-04-27T11:10:03+00:00
[]
[]
TAGS #transformers #tensorboard #safetensors #pyannet #speaker-diarization #speaker-segmentation #generated_from_trainer #dataset-diarizers-community/simsamu #base_model-pyannote/segmentation-3.0 #license-mit #endpoints_compatible #region-us
speaker-segmentation-fine-tuned-simsamu ======================================= This model is a fine-tuned version of pyannote/segmentation-3.0 on the diarizers-community/simsamu default dataset. It achieves the following results on the evaluation set: * Loss: 0.2302 * Der: 0.0911 * False Alarm: 0.0236 * Missed Detection: 0.0413 * Confusion: 0.0262 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 0.001 * train\_batch\_size: 16 * eval\_batch\_size: 16 * seed: 42 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: cosine * num\_epochs: 5.0 ### Training results ### Framework versions * Transformers 4.40.1 * Pytorch 2.2.0+cu121 * Datasets 2.17.0 * Tokenizers 0.19.1
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.001\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* num\\_epochs: 5.0", "### Training results", "### Framework versions\n\n\n* Transformers 4.40.1\n* Pytorch 2.2.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.19.1" ]
[ "TAGS\n#transformers #tensorboard #safetensors #pyannet #speaker-diarization #speaker-segmentation #generated_from_trainer #dataset-diarizers-community/simsamu #base_model-pyannote/segmentation-3.0 #license-mit #endpoints_compatible #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.001\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* num\\_epochs: 5.0", "### Training results", "### Framework versions\n\n\n* Transformers 4.40.1\n* Pytorch 2.2.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.19.1" ]
text-classification
setfit
# SetFit with sentence-transformers/paraphrase-mpnet-base-v2 This is a [SetFit](https://github.com/huggingface/setfit) model that can be used for Text Classification. This SetFit model uses [sentence-transformers/paraphrase-mpnet-base-v2](https://huggingface.co/sentence-transformers/paraphrase-mpnet-base-v2) as the Sentence Transformer embedding model. A [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance is used for classification. The model has been trained using an efficient few-shot learning technique that involves: 1. Fine-tuning a [Sentence Transformer](https://www.sbert.net) with contrastive learning. 2. Training a classification head with features from the fine-tuned Sentence Transformer. ## Model Details ### Model Description - **Model Type:** SetFit - **Sentence Transformer body:** [sentence-transformers/paraphrase-mpnet-base-v2](https://huggingface.co/sentence-transformers/paraphrase-mpnet-base-v2) - **Classification head:** a [LogisticRegression](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.LogisticRegression.html) instance - **Maximum Sequence Length:** 512 tokens - **Number of Classes:** 5 classes <!-- - **Training Dataset:** [Unknown](https://huggingface.co/datasets/unknown) --> <!-- - **Language:** Unknown --> <!-- - **License:** Unknown --> ### Model Sources - **Repository:** [SetFit on GitHub](https://github.com/huggingface/setfit) - **Paper:** [Efficient Few-Shot Learning Without Prompts](https://arxiv.org/abs/2209.11055) - **Blogpost:** [SetFit: Efficient Few-Shot Learning Without Prompts](https://huggingface.co/blog/setfit) ### Model Labels | Label | Examples | |:------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------| | 2 | <ul><li>'I require assistance in altering certain elements of my policy.'</li><li>"Hey there, I've spotted a gap in my policy information."</li><li>'I need to rectify something within my policy documentation.'</li></ul> | | 4 | <ul><li>"I am covered by health insurance through my employer's sponsorship."</li><li>'Is it permissible to transfer my health plan to ACKO?'</li><li>'My old health policy from another insurance provider is no longer in effect.'</li></ul> | | 3 | <ul><li>'Can you reveal all policies under my profile?'</li><li>'I want to be informed about the status of all my insurance arrangements.'</li><li>"Is it possible for you to display my family's health insurance policies?"</li></ul> | | 1 | <ul><li>'How is my vehicle claim proceeding?'</li><li>"I'm curious about the status of my car insurance claim."</li><li>'Am I required to provide additional evidence for my claims?'</li></ul> | | 0 | <ul><li>'I need help selecting an appropriate health insurance plan for my family.'</li><li>"I'm looking for a health policy that will cover me along with my two kids."</li><li>"I'm in urgent need of a health insurance plan for my family's wellbeing."</li></ul> | ## Evaluation ### Metrics | Label | Accuracy | |:--------|:---------| | **all** | 0.9160 | ## Uses ### Direct Use for Inference First install the SetFit library: ```bash pip install setfit ``` Then you can load this model and run inference. 
```python from setfit import SetFitModel # Download from the 🤗 Hub model = SetFitModel.from_pretrained("harshita23sh/setfit-model-intent-classification-insurance") # Run inference preds = model("I have my own health insurance policy.") ``` <!-- ### Downstream Use *List how someone could finetune this model on their own dataset.* --> <!-- ### Out-of-Scope Use *List how the model may foreseeably be misused and address what users ought not to do with the model.* --> <!-- ## Bias, Risks and Limitations *What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.* --> <!-- ### Recommendations *What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.* --> ## Training Details ### Training Set Metrics | Training set | Min | Median | Max | |:-------------|:----|:-------|:----| | Word count | 5 | 10.6 | 15 | | Label | Training Sample Count | |:------|:----------------------| | 0 | 5 | | 1 | 8 | | 2 | 12 | | 3 | 4 | | 4 | 11 | ### Training Hyperparameters - batch_size: (16, 16) - num_epochs: (1, 1) - max_steps: -1 - sampling_strategy: oversampling - num_iterations: 20 - body_learning_rate: (2e-05, 2e-05) - head_learning_rate: 2e-05 - loss: CosineSimilarityLoss - distance_metric: cosine_distance - margin: 0.25 - end_to_end: False - use_amp: False - warmup_proportion: 0.1 - seed: 42 - eval_max_steps: -1 - load_best_model_at_end: False ### Training Results | Epoch | Step | Training Loss | Validation Loss | |:-----:|:----:|:-------------:|:---------------:| | 0.01 | 1 | 0.1388 | - | | 0.5 | 50 | 0.0087 | - | | 1.0 | 100 | 0.0029 | - | ### Framework Versions - Python: 3.10.12 - SetFit: 1.0.3 - Sentence Transformers: 2.7.0 - Transformers: 4.40.0 - PyTorch: 2.2.1+cu121 - Datasets: 2.19.0 - Tokenizers: 0.19.1 ## Citation ### BibTeX ```bibtex @article{https://doi.org/10.48550/arxiv.2209.11055, doi = {10.48550/ARXIV.2209.11055}, url = {https://arxiv.org/abs/2209.11055}, author = {Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren}, keywords = {Computation and Language (cs.CL), FOS: Computer and information sciences, FOS: Computer and information sciences}, title = {Efficient Few-Shot Learning Without Prompts}, publisher = {arXiv}, year = {2022}, copyright = {Creative Commons Attribution 4.0 International} } ``` <!-- ## Glossary *Clearly define terms in order to be accessible across audiences.* --> <!-- ## Model Card Authors *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.* --> <!-- ## Model Card Contact *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.* -->
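To complement the inference snippet above, the following is a hedged sketch of the two-step recipe the card describes (contrastive fine-tuning of the embedding body, then fitting the classification head), using the `Trainer`/`TrainingArguments` API from setfit 1.x. The tiny dataset is purely illustrative and is not the data this model was trained on.

```python
# Minimal few-shot training sketch (illustrative data, not the original training set).
from datasets import Dataset
from setfit import SetFitModel, Trainer, TrainingArguments

train_dataset = Dataset.from_dict({
    "text": [
        "I need help selecting a health insurance plan for my family.",  # label 0
        "How is my vehicle claim proceeding?",                           # label 1
    ],
    "label": [0, 1],
})

# Start from the same Sentence Transformer body used by this model
model = SetFitModel.from_pretrained("sentence-transformers/paraphrase-mpnet-base-v2")

args = TrainingArguments(batch_size=16, num_epochs=1)
trainer = Trainer(model=model, args=args, train_dataset=train_dataset)
trainer.train()  # step 1: contrastive fine-tuning, step 2: fit the LogisticRegression head

# Inference then works exactly as in the snippet above
print(model(["I have my own health insurance policy."]))
```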
{"library_name": "setfit", "tags": ["setfit", "sentence-transformers", "text-classification", "generated_from_setfit_trainer"], "metrics": ["accuracy"], "base_model": "sentence-transformers/paraphrase-mpnet-base-v2", "widget": [{"text": "I need to repair an issue I found in my policy."}, {"text": "I'm wondering about the developments in my policy alteration."}, {"text": "Hi, I've noticed my policy is incomplete and needs additional details."}, {"text": "What's the latest on the status of my claim?"}, {"text": "I have my own health insurance policy."}], "pipeline_tag": "text-classification", "inference": true, "model-index": [{"name": "SetFit with sentence-transformers/paraphrase-mpnet-base-v2", "results": [{"task": {"type": "text-classification", "name": "Text Classification"}, "dataset": {"name": "Unknown", "type": "unknown", "split": "test"}, "metrics": [{"type": "accuracy", "value": 0.9159663865546218, "name": "Accuracy"}]}]}]}
harshita23sh/setfit-model-intent-classification-insurance
null
[ "setfit", "safetensors", "mpnet", "sentence-transformers", "text-classification", "generated_from_setfit_trainer", "arxiv:2209.11055", "base_model:sentence-transformers/paraphrase-mpnet-base-v2", "model-index", "region:us" ]
null
2024-04-27T11:11:04+00:00
[ "2209.11055" ]
[]
TAGS #setfit #safetensors #mpnet #sentence-transformers #text-classification #generated_from_setfit_trainer #arxiv-2209.11055 #base_model-sentence-transformers/paraphrase-mpnet-base-v2 #model-index #region-us
SetFit with sentence-transformers/paraphrase-mpnet-base-v2 ========================================================== This is a SetFit model that can be used for Text Classification. This SetFit model uses sentence-transformers/paraphrase-mpnet-base-v2 as the Sentence Transformer embedding model. A LogisticRegression instance is used for classification. The model has been trained using an efficient few-shot learning technique that involves: 1. Fine-tuning a Sentence Transformer with contrastive learning. 2. Training a classification head with features from the fine-tuned Sentence Transformer. Model Details ------------- ### Model Description * Model Type: SetFit * Sentence Transformer body: sentence-transformers/paraphrase-mpnet-base-v2 * Classification head: a LogisticRegression instance * Maximum Sequence Length: 512 tokens * Number of Classes: 5 classes ### Model Sources * Repository: SetFit on GitHub * Paper: Efficient Few-Shot Learning Without Prompts * Blogpost: SetFit: Efficient Few-Shot Learning Without Prompts ### Model Labels Evaluation ---------- ### Metrics Uses ---- ### Direct Use for Inference First install the SetFit library: Then you can load this model and run inference. Training Details ---------------- ### Training Set Metrics ### Training Hyperparameters * batch\_size: (16, 16) * num\_epochs: (1, 1) * max\_steps: -1 * sampling\_strategy: oversampling * num\_iterations: 20 * body\_learning\_rate: (2e-05, 2e-05) * head\_learning\_rate: 2e-05 * loss: CosineSimilarityLoss * distance\_metric: cosine\_distance * margin: 0.25 * end\_to\_end: False * use\_amp: False * warmup\_proportion: 0.1 * seed: 42 * eval\_max\_steps: -1 * load\_best\_model\_at\_end: False ### Training Results ### Framework Versions * Python: 3.10.12 * SetFit: 1.0.3 * Sentence Transformers: 2.7.0 * Transformers: 4.40.0 * PyTorch: 2.2.1+cu121 * Datasets: 2.19.0 * Tokenizers: 0.19.1 ### BibTeX
[ "### Model Description\n\n\n* Model Type: SetFit\n* Sentence Transformer body: sentence-transformers/paraphrase-mpnet-base-v2\n* Classification head: a LogisticRegression instance\n* Maximum Sequence Length: 512 tokens\n* Number of Classes: 5 classes", "### Model Sources\n\n\n* Repository: SetFit on GitHub\n* Paper: Efficient Few-Shot Learning Without Prompts\n* Blogpost: SetFit: Efficient Few-Shot Learning Without Prompts", "### Model Labels\n\n\n\nEvaluation\n----------", "### Metrics\n\n\n\nUses\n----", "### Direct Use for Inference\n\n\nFirst install the SetFit library:\n\n\nThen you can load this model and run inference.\n\n\nTraining Details\n----------------", "### Training Set Metrics", "### Training Hyperparameters\n\n\n* batch\\_size: (16, 16)\n* num\\_epochs: (1, 1)\n* max\\_steps: -1\n* sampling\\_strategy: oversampling\n* num\\_iterations: 20\n* body\\_learning\\_rate: (2e-05, 2e-05)\n* head\\_learning\\_rate: 2e-05\n* loss: CosineSimilarityLoss\n* distance\\_metric: cosine\\_distance\n* margin: 0.25\n* end\\_to\\_end: False\n* use\\_amp: False\n* warmup\\_proportion: 0.1\n* seed: 42\n* eval\\_max\\_steps: -1\n* load\\_best\\_model\\_at\\_end: False", "### Training Results", "### Framework Versions\n\n\n* Python: 3.10.12\n* SetFit: 1.0.3\n* Sentence Transformers: 2.7.0\n* Transformers: 4.40.0\n* PyTorch: 2.2.1+cu121\n* Datasets: 2.19.0\n* Tokenizers: 0.19.1", "### BibTeX" ]
[ "TAGS\n#setfit #safetensors #mpnet #sentence-transformers #text-classification #generated_from_setfit_trainer #arxiv-2209.11055 #base_model-sentence-transformers/paraphrase-mpnet-base-v2 #model-index #region-us \n", "### Model Description\n\n\n* Model Type: SetFit\n* Sentence Transformer body: sentence-transformers/paraphrase-mpnet-base-v2\n* Classification head: a LogisticRegression instance\n* Maximum Sequence Length: 512 tokens\n* Number of Classes: 5 classes", "### Model Sources\n\n\n* Repository: SetFit on GitHub\n* Paper: Efficient Few-Shot Learning Without Prompts\n* Blogpost: SetFit: Efficient Few-Shot Learning Without Prompts", "### Model Labels\n\n\n\nEvaluation\n----------", "### Metrics\n\n\n\nUses\n----", "### Direct Use for Inference\n\n\nFirst install the SetFit library:\n\n\nThen you can load this model and run inference.\n\n\nTraining Details\n----------------", "### Training Set Metrics", "### Training Hyperparameters\n\n\n* batch\\_size: (16, 16)\n* num\\_epochs: (1, 1)\n* max\\_steps: -1\n* sampling\\_strategy: oversampling\n* num\\_iterations: 20\n* body\\_learning\\_rate: (2e-05, 2e-05)\n* head\\_learning\\_rate: 2e-05\n* loss: CosineSimilarityLoss\n* distance\\_metric: cosine\\_distance\n* margin: 0.25\n* end\\_to\\_end: False\n* use\\_amp: False\n* warmup\\_proportion: 0.1\n* seed: 42\n* eval\\_max\\_steps: -1\n* load\\_best\\_model\\_at\\_end: False", "### Training Results", "### Framework Versions\n\n\n* Python: 3.10.12\n* SetFit: 1.0.3\n* Sentence Transformers: 2.7.0\n* Transformers: 4.40.0\n* PyTorch: 2.2.1+cu121\n* Datasets: 2.19.0\n* Tokenizers: 0.19.1", "### BibTeX" ]
null
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # speaker-segmentation-fine-tuned-simsamu-2 This model is a fine-tuned version of [pyannote/segmentation-3.0](https://huggingface.co/pyannote/segmentation-3.0) on the diarizers-community/simsamu default dataset. It achieves the following results on the evaluation set: - Loss: 0.2428 - Der: 0.0861 - False Alarm: 0.0245 - Missed Detection: 0.0384 - Confusion: 0.0232 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.001 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - num_epochs: 10.0 ### Training results | Training Loss | Epoch | Step | Validation Loss | Der | False Alarm | Missed Detection | Confusion | |:-------------:|:-----:|:----:|:---------------:|:------:|:-----------:|:----------------:|:---------:| | 0.2179 | 1.0 | 111 | 0.2259 | 0.0951 | 0.0239 | 0.0486 | 0.0227 | | 0.1694 | 2.0 | 222 | 0.2379 | 0.0930 | 0.0230 | 0.0466 | 0.0234 | | 0.1559 | 3.0 | 333 | 0.2305 | 0.0898 | 0.0223 | 0.0431 | 0.0244 | | 0.149 | 4.0 | 444 | 0.2323 | 0.0893 | 0.0246 | 0.0398 | 0.0249 | | 0.1416 | 5.0 | 555 | 0.2351 | 0.0884 | 0.0243 | 0.0399 | 0.0243 | | 0.1369 | 6.0 | 666 | 0.2458 | 0.0904 | 0.0266 | 0.0370 | 0.0268 | | 0.1367 | 7.0 | 777 | 0.2410 | 0.0882 | 0.0204 | 0.0434 | 0.0244 | | 0.1306 | 8.0 | 888 | 0.2400 | 0.0866 | 0.0240 | 0.0393 | 0.0234 | | 0.1301 | 9.0 | 999 | 0.2422 | 0.0860 | 0.0243 | 0.0387 | 0.0230 | | 0.1276 | 10.0 | 1110 | 0.2428 | 0.0861 | 0.0245 | 0.0384 | 0.0232 | ### Framework versions - Transformers 4.40.1 - Pytorch 2.2.0+cu121 - Datasets 2.17.0 - Tokenizers 0.19.1
{"license": "mit", "tags": ["speaker-diarization", "speaker-segmentation", "generated_from_trainer"], "datasets": ["diarizers-community/simsamu"], "base_model": "pyannote/segmentation-3.0", "model-index": [{"name": "speaker-segmentation-fine-tuned-simsamu-2", "results": []}]}
tgrhn/speaker-segmentation-fine-tuned-simsamu-2
null
[ "transformers", "tensorboard", "safetensors", "pyannet", "speaker-diarization", "speaker-segmentation", "generated_from_trainer", "dataset:diarizers-community/simsamu", "base_model:pyannote/segmentation-3.0", "license:mit", "endpoints_compatible", "region:us" ]
null
2024-04-27T11:11:41+00:00
[]
[]
TAGS #transformers #tensorboard #safetensors #pyannet #speaker-diarization #speaker-segmentation #generated_from_trainer #dataset-diarizers-community/simsamu #base_model-pyannote/segmentation-3.0 #license-mit #endpoints_compatible #region-us
speaker-segmentation-fine-tuned-simsamu-2 ========================================= This model is a fine-tuned version of pyannote/segmentation-3.0 on the diarizers-community/simsamu default dataset. It achieves the following results on the evaluation set: * Loss: 0.2428 * Der: 0.0861 * False Alarm: 0.0245 * Missed Detection: 0.0384 * Confusion: 0.0232 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 0.001 * train\_batch\_size: 16 * eval\_batch\_size: 16 * seed: 42 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: cosine * num\_epochs: 10.0 ### Training results ### Framework versions * Transformers 4.40.1 * Pytorch 2.2.0+cu121 * Datasets 2.17.0 * Tokenizers 0.19.1
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.001\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* num\\_epochs: 10.0", "### Training results", "### Framework versions\n\n\n* Transformers 4.40.1\n* Pytorch 2.2.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.19.1" ]
[ "TAGS\n#transformers #tensorboard #safetensors #pyannet #speaker-diarization #speaker-segmentation #generated_from_trainer #dataset-diarizers-community/simsamu #base_model-pyannote/segmentation-3.0 #license-mit #endpoints_compatible #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.001\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* num\\_epochs: 10.0", "### Training results", "### Framework versions\n\n\n* Transformers 4.40.1\n* Pytorch 2.2.0+cu121\n* Datasets 2.17.0\n* Tokenizers 0.19.1" ]
null
transformers
# Uploaded model - **Developed by:** Dogge - **License:** apache-2.0 - **Finetuned from model :** unsloth/llama-3-8b-bnb-4bit This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library. [<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
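The card ends at the Unsloth banner, so the snippet below is only a hedged sketch of how an Unsloth-trained checkpoint like this is commonly loaded for inference with `FastLanguageModel`. The prompt string, sequence length, and generation settings are illustrative assumptions; the card does not document an official prompt format.

```python
# Hedged inference sketch for an Unsloth-trained checkpoint (prompt format assumed).
from unsloth import FastLanguageModel

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="Dogge/llama-3-translate-enko",
    max_seq_length=2048,
    load_in_4bit=True,
)
FastLanguageModel.for_inference(model)  # enable Unsloth's faster inference path

inputs = tokenizer("Translate to Korean: Hello, how are you?", return_tensors="pt").to("cuda")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```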
{"language": ["en"], "license": "apache-2.0", "tags": ["text-generation-inference", "transformers", "unsloth", "llama", "trl"], "base_model": "unsloth/llama-3-8b-bnb-4bit"}
Dogge/llama-3-translate-enko
null
[ "transformers", "safetensors", "text-generation-inference", "unsloth", "llama", "trl", "en", "base_model:unsloth/llama-3-8b-bnb-4bit", "license:apache-2.0", "endpoints_compatible", "region:us" ]
null
2024-04-27T11:12:32+00:00
[]
[ "en" ]
TAGS #transformers #safetensors #text-generation-inference #unsloth #llama #trl #en #base_model-unsloth/llama-3-8b-bnb-4bit #license-apache-2.0 #endpoints_compatible #region-us
# Uploaded model - Developed by: Dogge - License: apache-2.0 - Finetuned from model : unsloth/llama-3-8b-bnb-4bit This llama model was trained 2x faster with Unsloth and Huggingface's TRL library. <img src="URL width="200"/>
[ "# Uploaded model\n\n- Developed by: Dogge\n- License: apache-2.0\n- Finetuned from model : unsloth/llama-3-8b-bnb-4bit\n\nThis llama model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>" ]
[ "TAGS\n#transformers #safetensors #text-generation-inference #unsloth #llama #trl #en #base_model-unsloth/llama-3-8b-bnb-4bit #license-apache-2.0 #endpoints_compatible #region-us \n", "# Uploaded model\n\n- Developed by: Dogge\n- License: apache-2.0\n- Finetuned from model : unsloth/llama-3-8b-bnb-4bit\n\nThis llama model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>" ]
text-generation
transformers
# mlx-community/Swallow-70b-instruct-v0.1-8bit This model was converted to MLX format from [`tokyotech-llm/Swallow-70b-instruct-v0.1`]() using mlx-lm version **0.6.0**. Refer to the [original model card](https://huggingface.co/tokyotech-llm/Swallow-70b-instruct-v0.1) for more details on the model. ## Use with mlx ```bash pip install mlx-lm ``` ```python from mlx_lm import load, generate model, tokenizer = load("mlx-community/Swallow-70b-instruct-v0.1-8bit") response = generate(model, tokenizer, prompt="hello", verbose=True) ```
{"language": ["en", "ja"], "license": "llama2", "library_name": "transformers", "tags": ["mlx"], "pipeline_tag": "text-generation", "model_type": "llama"}
mlx-community/Swallow-70b-instruct-v0.1-8bit
null
[ "transformers", "safetensors", "llama", "text-generation", "mlx", "conversational", "en", "ja", "license:llama2", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2024-04-27T11:12:42+00:00
[]
[ "en", "ja" ]
TAGS #transformers #safetensors #llama #text-generation #mlx #conversational #en #ja #license-llama2 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# mlx-community/Swallow-70b-instruct-v0.1-8bit This model was converted to MLX format from ['tokyotech-llm/Swallow-70b-instruct-v0.1']() using mlx-lm version 0.6.0. Refer to the original model card for more details on the model. ## Use with mlx
[ "# mlx-community/Swallow-70b-instruct-v0.1-8bit\nThis model was converted to MLX format from ['tokyotech-llm/Swallow-70b-instruct-v0.1']() using mlx-lm version 0.6.0.\nRefer to the original model card for more details on the model.", "## Use with mlx" ]
[ "TAGS\n#transformers #safetensors #llama #text-generation #mlx #conversational #en #ja #license-llama2 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# mlx-community/Swallow-70b-instruct-v0.1-8bit\nThis model was converted to MLX format from ['tokyotech-llm/Swallow-70b-instruct-v0.1']() using mlx-lm version 0.6.0.\nRefer to the original model card for more details on the model.", "## Use with mlx" ]
text-generation
transformers
# Uploaded model - **Developed by:** Dogge - **License:** apache-2.0 - **Finetuned from model :** unsloth/llama-3-8b-bnb-4bit This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library. [<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
{"language": ["en"], "license": "apache-2.0", "tags": ["text-generation-inference", "transformers", "unsloth", "llama", "trl", "sft"], "base_model": "unsloth/llama-3-8b-bnb-4bit"}
Dogge/llama-3-translate-enko-bf16
null
[ "transformers", "safetensors", "llama", "text-generation", "text-generation-inference", "unsloth", "trl", "sft", "en", "base_model:unsloth/llama-3-8b-bnb-4bit", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
null
2024-04-27T11:13:53+00:00
[]
[ "en" ]
TAGS #transformers #safetensors #llama #text-generation #text-generation-inference #unsloth #trl #sft #en #base_model-unsloth/llama-3-8b-bnb-4bit #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
# Uploaded model - Developed by: Dogge - License: apache-2.0 - Finetuned from model : unsloth/llama-3-8b-bnb-4bit This llama model was trained 2x faster with Unsloth and Huggingface's TRL library. <img src="URL width="200"/>
[ "# Uploaded model\n\n- Developed by: Dogge\n- License: apache-2.0\n- Finetuned from model : unsloth/llama-3-8b-bnb-4bit\n\nThis llama model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>" ]
[ "TAGS\n#transformers #safetensors #llama #text-generation #text-generation-inference #unsloth #trl #sft #en #base_model-unsloth/llama-3-8b-bnb-4bit #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n", "# Uploaded model\n\n- Developed by: Dogge\n- License: apache-2.0\n- Finetuned from model : unsloth/llama-3-8b-bnb-4bit\n\nThis llama model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>" ]
question-answering
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
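The quick-start section of this card is left as "[More Information Needed]"; based only on the repository tags (xlm-roberta, question-answering), a plain transformers pipeline call along the following lines should apply. The question and context are illustrative.

```python
# Hedged quick-start sketch inferred from the repo tags (xlm-roberta, question-answering).
from transformers import pipeline

qa = pipeline("question-answering", model="NeginShams/roberta-extratranslation")

result = qa(
    question="Where is the Eiffel Tower located?",   # illustrative inputs
    context="The Eiffel Tower is located in Paris, France.",
)
print(result["answer"], result["score"])
```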
{"library_name": "transformers", "tags": []}
NeginShams/roberta-extratranslation
null
[ "transformers", "safetensors", "xlm-roberta", "question-answering", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
null
2024-04-27T11:15:11+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #xlm-roberta #question-answering #arxiv-1910.09700 #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #xlm-roberta #question-answering #arxiv-1910.09700 #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
text2text-generation
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # t5-small-finetuned-xsum This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on the None dataset. It achieves the following results on the evaluation set: - Loss: 1.5091 - Rouge1: 15.8853 - Rouge2: 5.7363 - Rougel: 13.5746 - Rougelsum: 13.6787 - Gen Len: 19.0 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 1 ### Training results | Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len | |:-------------:|:-----:|:----:|:---------------:|:-------:|:------:|:-------:|:---------:|:-------:| | 1.5826 | 1.0 | 9258 | 1.5091 | 15.8853 | 5.7363 | 13.5746 | 13.6787 | 19.0 | ### Framework versions - Transformers 4.30.0 - Pytorch 2.2.1+cu121 - Datasets 2.19.0 - Tokenizers 0.13.3
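The generated card reports ROUGE scores but no usage snippet; since this is a t5-small fine-tune evaluated with ROUGE (the repository name suggests XSum-style summarization), a standard transformers summarization pipeline should work. The input article below is illustrative.

```python
# Hedged usage sketch: standard summarization pipeline call (input text is illustrative).
from transformers import pipeline

summarizer = pipeline("summarization", model="honganhle1903/t5-small-finetuned-xsum")

article = (
    "The quick brown fox jumped over the lazy dog near the river bank, "
    "surprising a group of hikers who had stopped there for lunch."
)
print(summarizer(article, max_length=30, min_length=5, do_sample=False))
```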
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "metrics": ["rouge"], "model-index": [{"name": "t5-small-finetuned-xsum", "results": []}]}
honganhle1903/t5-small-finetuned-xsum
null
[ "transformers", "pytorch", "tensorboard", "t5", "text2text-generation", "generated_from_trainer", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2024-04-27T11:15:34+00:00
[]
[]
TAGS #transformers #pytorch #tensorboard #t5 #text2text-generation #generated_from_trainer #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
t5-small-finetuned-xsum ======================= This model is a fine-tuned version of t5-small on the None dataset. It achieves the following results on the evaluation set: * Loss: 1.5091 * Rouge1: 15.8853 * Rouge2: 5.7363 * Rougel: 13.5746 * Rougelsum: 13.6787 * Gen Len: 19.0 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 2e-05 * train\_batch\_size: 16 * eval\_batch\_size: 16 * seed: 42 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * num\_epochs: 1 ### Training results ### Framework versions * Transformers 4.30.0 * Pytorch 2.2.1+cu121 * Datasets 2.19.0 * Tokenizers 0.13.3
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 1", "### Training results", "### Framework versions\n\n\n* Transformers 4.30.0\n* Pytorch 2.2.1+cu121\n* Datasets 2.19.0\n* Tokenizers 0.13.3" ]
[ "TAGS\n#transformers #pytorch #tensorboard #t5 #text2text-generation #generated_from_trainer #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 1", "### Training results", "### Framework versions\n\n\n* Transformers 4.30.0\n* Pytorch 2.2.1+cu121\n* Datasets 2.19.0\n* Tokenizers 0.13.3" ]
text-to-image
diffusers
<!-- This model card has been generated automatically according to the information the training script had access to. You should probably proofread and complete it, then remove this comment. --> # SDXL LoRA DreamBooth - egioia/corgy_coins_LoRA <Gallery /> ## Model description These are egioia/corgy_coins_LoRA LoRA adaption weights for stabilityai/stable-diffusion-xl-base-1.0. The weights were trained using [DreamBooth](https://dreambooth.github.io/). LoRA for the text encoder was enabled: False. Special VAE used for training: madebyollin/sdxl-vae-fp16-fix. ## Trigger words You should use a photo of TOK coin to trigger the image generation. ## Download model Weights for this model are available in Safetensors format. [Download](egioia/corgy_coins_LoRA/tree/main) them in the Files & versions tab. ## Intended uses & limitations #### How to use ```python # TODO: add an example code snippet for running this diffusion pipeline ``` #### Limitations and bias [TODO: provide examples of latent issues and potential remediations] ## Training details [TODO: describe the data used to train the model]
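The "How to use" block in the card is still a TODO; the sketch below follows the usual diffusers recipe for SDXL DreamBooth LoRA weights, loading the base model named in the card and using the documented trigger phrase `a photo of TOK coin`. The dtype, step count, and output path are illustrative choices.

```python
# Hedged sketch of the usual diffusers recipe for SDXL DreamBooth LoRA weights.
import torch
from diffusers import AutoPipelineForText2Image

pipe = AutoPipelineForText2Image.from_pretrained(
    "stabilityai/stable-diffusion-xl-base-1.0",
    torch_dtype=torch.float16,
).to("cuda")

# Load the LoRA adapter weights from this repository
pipe.load_lora_weights("egioia/corgy_coins_LoRA")

# Use the trigger phrase documented in the card
image = pipe("a photo of TOK coin", num_inference_steps=25).images[0]
image.save("tok_coin.png")
```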
{"license": "openrail++", "library_name": "diffusers", "tags": ["text-to-image", "text-to-image", "diffusers-training", "diffusers", "dora", "template:sd-lora", "stable-diffusion-xl", "stable-diffusion-xl-diffusers", "text-to-image", "text-to-image", "diffusers-training", "diffusers", "dora", "template:sd-lora", "stable-diffusion-xl", "stable-diffusion-xl-diffusers", "text-to-image", "text-to-image", "diffusers-training", "diffusers", "dora", "template:sd-lora", "stable-diffusion-xl", "stable-diffusion-xl-diffusers"], "base_model": "stabilityai/stable-diffusion-xl-base-1.0", "instance_prompt": "a photo of TOK coin", "widget": []}
egioia/corgy_coins_LoRA
null
[ "diffusers", "tensorboard", "text-to-image", "diffusers-training", "dora", "template:sd-lora", "stable-diffusion-xl", "stable-diffusion-xl-diffusers", "base_model:stabilityai/stable-diffusion-xl-base-1.0", "license:openrail++", "region:us" ]
null
2024-04-27T11:16:08+00:00
[]
[]
TAGS #diffusers #tensorboard #text-to-image #diffusers-training #dora #template-sd-lora #stable-diffusion-xl #stable-diffusion-xl-diffusers #base_model-stabilityai/stable-diffusion-xl-base-1.0 #license-openrail++ #region-us
# SDXL LoRA DreamBooth - egioia/corgy_coins_LoRA <Gallery /> ## Model description These are egioia/corgy_coins_LoRA LoRA adaption weights for stabilityai/stable-diffusion-xl-base-1.0. The weights were trained using DreamBooth. LoRA for the text encoder was enabled: False. Special VAE used for training: madebyollin/sdxl-vae-fp16-fix. ## Trigger words You should use a photo of TOK coin to trigger the image generation. ## Download model Weights for this model are available in Safetensors format. Download them in the Files & versions tab. ## Intended uses & limitations #### How to use #### Limitations and bias [TODO: provide examples of latent issues and potential remediations] ## Training details [TODO: describe the data used to train the model]
[ "# SDXL LoRA DreamBooth - egioia/corgy_coins_LoRA\n\n<Gallery />", "## Model description\n\nThese are egioia/corgy_coins_LoRA LoRA adaption weights for stabilityai/stable-diffusion-xl-base-1.0.\n\nThe weights were trained using DreamBooth.\n\nLoRA for the text encoder was enabled: False.\n\nSpecial VAE used for training: madebyollin/sdxl-vae-fp16-fix.", "## Trigger words\n\nYou should use a photo of TOK coin to trigger the image generation.", "## Download model\n\nWeights for this model are available in Safetensors format.\n\nDownload them in the Files & versions tab.", "## Intended uses & limitations", "#### How to use", "#### Limitations and bias\n\n[TODO: provide examples of latent issues and potential remediations]", "## Training details\n\n[TODO: describe the data used to train the model]" ]
[ "TAGS\n#diffusers #tensorboard #text-to-image #diffusers-training #dora #template-sd-lora #stable-diffusion-xl #stable-diffusion-xl-diffusers #base_model-stabilityai/stable-diffusion-xl-base-1.0 #license-openrail++ #region-us \n", "# SDXL LoRA DreamBooth - egioia/corgy_coins_LoRA\n\n<Gallery />", "## Model description\n\nThese are egioia/corgy_coins_LoRA LoRA adaption weights for stabilityai/stable-diffusion-xl-base-1.0.\n\nThe weights were trained using DreamBooth.\n\nLoRA for the text encoder was enabled: False.\n\nSpecial VAE used for training: madebyollin/sdxl-vae-fp16-fix.", "## Trigger words\n\nYou should use a photo of TOK coin to trigger the image generation.", "## Download model\n\nWeights for this model are available in Safetensors format.\n\nDownload them in the Files & versions tab.", "## Intended uses & limitations", "#### How to use", "#### Limitations and bias\n\n[TODO: provide examples of latent issues and potential remediations]", "## Training details\n\n[TODO: describe the data used to train the model]" ]
null
null
---
license: cc-by-sa-4.0
---

# Changes from v4
All digits are now split into single-digit tokens.

# Description
A tokenizer trained on wikipedia, mbpp, and grade-school-math.

## Training data
- English: 1.33GB (wiki40b)<br>
- Japanese: 1.78GB (wiki40b). Note: pre-segmented at the morpheme level with "||||", with pretokenization_delimiter set accordingly when training sentencepiece.<br>
- Code: 172KB (mbpp)<br>
- Math: 2.1MB (grade-school-math)

## Added vocabulary
Japanese vocabulary was added with reference to the following sources.
- Wiktionary index pages (nouns, adjectives, adjectival nouns, adverbs, suffixes, particles, verbs, etc., qualitatively selecting items judged to be in common use)
- Wiktionary's 1000 basic Japanese words
- A sample of the examples from the Agency for Cultural Affairs "List of jōyō kanji"
- Words for time, seasons, and compass directions
- Prefectures, tourist destinations, and the 23 wards of Tokyo
- Common Japanese surnames
- Fixed expressions ("こんにちは", "よろしく", "ございます", etc.)

In addition, the following vocabulary was added.
- Symbols
- Digits (kanji numerals, half-width 0-9, full-width 0-9, superscript 0-9)
- Greek letters that appear in mathematics

## Vocabulary composition
As a rough estimate, about 60% of the vocabulary is alphabetic and about 40% is Japanese (hiragana, katakana, kanji). (Symbols and digits account for roughly 1-2%.)

## References
- https://aclanthology.org/2020.lrec-1.297.pdf
- https://www.tensorflow.org/datasets/catalog/wiki40b
- https://github.com/openai/grade-school-math
- https://github.com/google-research/google-research/tree/master/mbpp
- https://www.bunka.go.jp/kokugo_nihongo/sisaku/joho/joho/kakuki/14/pdf/jyouyou_kanjihyou.pdf
- https://ja.m.wiktionary.org/wiki/%E3%82%AB%E3%83%86%E3%82%B4%E3%83%AA:%E6%97%A5%E6%9C%AC%E8%AA%9E

## Settings
vocab_size=56,320 (vocabulary size)<br>
character_coverage=0.9995 (character coverage of 99.95%)<br>
model_type="unigram" (algorithm)<br>
normalization="identity" (no normalization)<br>
byte_fallback=True (byte fallback enabled)<br>
split_digits=True (digit splitting enabled)<br>
allow_whitespace_only_pieces=True (whitespace-only tokens allowed)<br>
remove_extra_whitespaces=True (extra whitespace removed)<br>

## Format
LlamaTokenizer<br>
Note: the bos_token "\<s\>" is prepended to the input when encoding.

# Usage

```python
!pip install transformers>=4.34.0

from transformers import AutoTokenizer

test_tokenizer = AutoTokenizer.from_pretrained("geniacllm/ja-en-tokenizer-unigram-v5", use_fast=False)
```

```python
# text
text = "This is tokenizer test."

# tokenize
tokenized = test_tokenizer.tokenize(text)
print(tokenized)

# encode
encoded = test_tokenizer.encode(text)
print(encoded)

# decode
decoded = test_tokenizer.decode(encoded)
print(decoded)

# special_token
print(test_tokenizer.special_tokens_map)

# vocab size
print(len(test_tokenizer))

# all subwords in vocab
print(test_tokenizer.get_vocab())
```
{"license": "cc-by-sa-4.0"}
geniacllm/ja-en-tokenizer-unigram-v5
null
[ "license:cc-by-sa-4.0", "region:us" ]
null
2024-04-27T11:16:10+00:00
[]
[]
TAGS #license-cc-by-sa-4.0 #region-us
--- license: cc-by-sa-4.0 --- # v4からの修正点 数字を全て一桁区切りに。 # 説明 wikipedia, mbpp, grade-school-mathで学習したトークナイザー。 ## 学習に使ったデータ - 英語:1.33GB (wiki40b)<br> - 日本語:1.78GB (wiki40b) ※形態素単位で"||||"で事前分割してsentencepieceの学習時にpretokenization_delimiterを設定。<br> - コード:172KB (mbpp) <br> - 数学:2.1MB (grade-school-math) - ## 語彙の追加 以下を参考に日本語の語彙を追加。 - wikitinary 目次一覧(名詞・形容詞・形容動詞・副詞・接尾辞・助詞・動詞などから一般的に使われると思われるものを定性的に選別。) - wikitionary 日本語の基本語彙1000 - 文化庁「常用漢字一覧表」の例から一部をサンプリング。 - 時間・季節・方角に関する語 - 都道府県・観光地・東京23区 - 頻出する日本の苗字 - 定型表現(「こんにちは」「よろしく」「ございます」など) その他、以下の語彙を追加。 - 記号 - 数字(漢数字・半角数字0~9・全角数字0〜9・上付き数字0〜9) - 数学に出るギリシャ文字 ## 語彙の割合 概算ですが、アルファベットが約6割、日本語(ひらがな・カタカナ・漢字)が約4割となっています。(その他記号や数字は1~2%程度) ## 参照 - URL - URL - URL - URL - URL - https://ja.m.URL - ## 設定 vocab_size=56,320(語彙サイズ)<br> character_coverage=0.9995(文字のカバー率99.95%)<br> model_type="unigram"(アルゴリズム)<br> normalization="identity"(正規化なし)<br> byte_fallback=True(バイト変換あり)<br> split_digits=True(数字分割あり)<br> allow_whitespace_only_pieces=True(空白のトークンを許可する)<br> remove_extra_whitespaces=True(余分な空白の削除あり)<br> ## 形式 LlamaTokenizer<br> ※encode時に文頭にbos_tokenである"\<s\>"トークンが付きます。 # 使い方
[ "# v4からの修正点\n数字を全て一桁区切りに。", "# 説明\nwikipedia, mbpp, grade-school-mathで学習したトークナイザー。", "## 学習に使ったデータ\n- 英語:1.33GB (wiki40b)<br>\n- 日本語:1.78GB (wiki40b) ※形態素単位で\"||||\"で事前分割してsentencepieceの学習時にpretokenization_delimiterを設定。<br>\n- コード:172KB (mbpp) <br>\n- 数学:2.1MB (grade-school-math)\n-", "## 語彙の追加\n以下を参考に日本語の語彙を追加。\n- wikitinary 目次一覧(名詞・形容詞・形容動詞・副詞・接尾辞・助詞・動詞などから一般的に使われると思われるものを定性的に選別。)\n- wikitionary 日本語の基本語彙1000\n- 文化庁「常用漢字一覧表」の例から一部をサンプリング。\n- 時間・季節・方角に関する語\n- 都道府県・観光地・東京23区\n- 頻出する日本の苗字\n- 定型表現(「こんにちは」「よろしく」「ございます」など)\nその他、以下の語彙を追加。\n- 記号\n- 数字(漢数字・半角数字0~9・全角数字0〜9・上付き数字0〜9)\n- 数学に出るギリシャ文字", "## 語彙の割合\n概算ですが、アルファベットが約6割、日本語(ひらがな・カタカナ・漢字)が約4割となっています。(その他記号や数字は1~2%程度)", "## 参照\n- URL\n- URL\n- URL\n- URL\n- URL\n- https://ja.m.URL\n-", "## 設定\nvocab_size=56,320(語彙サイズ)<br>\ncharacter_coverage=0.9995(文字のカバー率99.95%)<br>\nmodel_type=\"unigram\"(アルゴリズム)<br>\nnormalization=\"identity\"(正規化なし)<br>\nbyte_fallback=True(バイト変換あり)<br>\nsplit_digits=True(数字分割あり)<br>\nallow_whitespace_only_pieces=True(空白のトークンを許可する)<br>\nremove_extra_whitespaces=True(余分な空白の削除あり)<br>", "## 形式\nLlamaTokenizer<br>\n※encode時に文頭にbos_tokenである\"\\<s\\>\"トークンが付きます。", "# 使い方" ]
[ "TAGS\n#license-cc-by-sa-4.0 #region-us \n", "# v4からの修正点\n数字を全て一桁区切りに。", "# 説明\nwikipedia, mbpp, grade-school-mathで学習したトークナイザー。", "## 学習に使ったデータ\n- 英語:1.33GB (wiki40b)<br>\n- 日本語:1.78GB (wiki40b) ※形態素単位で\"||||\"で事前分割してsentencepieceの学習時にpretokenization_delimiterを設定。<br>\n- コード:172KB (mbpp) <br>\n- 数学:2.1MB (grade-school-math)\n-", "## 語彙の追加\n以下を参考に日本語の語彙を追加。\n- wikitinary 目次一覧(名詞・形容詞・形容動詞・副詞・接尾辞・助詞・動詞などから一般的に使われると思われるものを定性的に選別。)\n- wikitionary 日本語の基本語彙1000\n- 文化庁「常用漢字一覧表」の例から一部をサンプリング。\n- 時間・季節・方角に関する語\n- 都道府県・観光地・東京23区\n- 頻出する日本の苗字\n- 定型表現(「こんにちは」「よろしく」「ございます」など)\nその他、以下の語彙を追加。\n- 記号\n- 数字(漢数字・半角数字0~9・全角数字0〜9・上付き数字0〜9)\n- 数学に出るギリシャ文字", "## 語彙の割合\n概算ですが、アルファベットが約6割、日本語(ひらがな・カタカナ・漢字)が約4割となっています。(その他記号や数字は1~2%程度)", "## 参照\n- URL\n- URL\n- URL\n- URL\n- URL\n- https://ja.m.URL\n-", "## 設定\nvocab_size=56,320(語彙サイズ)<br>\ncharacter_coverage=0.9995(文字のカバー率99.95%)<br>\nmodel_type=\"unigram\"(アルゴリズム)<br>\nnormalization=\"identity\"(正規化なし)<br>\nbyte_fallback=True(バイト変換あり)<br>\nsplit_digits=True(数字分割あり)<br>\nallow_whitespace_only_pieces=True(空白のトークンを許可する)<br>\nremove_extra_whitespaces=True(余分な空白の削除あり)<br>", "## 形式\nLlamaTokenizer<br>\n※encode時に文頭にbos_tokenである\"\\<s\\>\"トークンが付きます。", "# 使い方" ]
text-generation
transformers
# miqu-evil-dpo

# **Model Details**

## Description
miqu-evil-dpo is a fine-tuned model based on miqu, serving as a direct successor to PiVoT-0.1-Evil-a.

It is trained with the evil-tune method applied.

![image/png](./eviltune.png)

<!-- prompt-template start -->
## Prompt template: Mistral Inst

```
<s> [INST] {inst} [/INST]
```

<!-- prompt-template end -->

## Disclaimer
The AI model provided herein is intended for experimental purposes only. The creator of this model makes no representations or warranties of any kind, either express or implied, as to the model's accuracy, reliability, or suitability for any particular purpose. The creator shall not be held liable for any outcomes, decisions, or actions taken on the basis of the information generated by this model. Users of this model assume full responsibility for any consequences resulting from its use.
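As a small illustration of the Mistral Inst template above, the helper below simply fills in the `{inst}` slot; it makes no assumptions about the exl2 backend used to load the quantized weights, and the sample instruction is illustrative.

```python
# Minimal helper that fills the Mistral Inst template from the card.
def build_prompt(inst: str) -> str:
    # "<s>" is the BOS token; omit it here if your tokenizer/backend adds it automatically.
    return f"<s> [INST] {inst} [/INST]"

print(build_prompt("Write a short poem about the sea."))
```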
{"language": ["en"], "license": "other", "tags": ["not-for-all-audiences"], "license_name": "miqu-license", "license_link": "LICENSE", "pipeline_tag": "text-generation"}
blockblockblock/miqu-evil-dpo-bpw6-exl2
null
[ "transformers", "safetensors", "llama", "text-generation", "not-for-all-audiences", "conversational", "en", "license:other", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "6-bit", "region:us" ]
null
2024-04-27T11:18:03+00:00
[]
[ "en" ]
TAGS #transformers #safetensors #llama #text-generation #not-for-all-audiences #conversational #en #license-other #autotrain_compatible #endpoints_compatible #text-generation-inference #6-bit #region-us
# miqu-evil-dpo # Model Details ## Description miqu-evil-dpo is fine-tuned model based on miqu, serving as a direct successor to PiVoT-0.1-Evil-a. It is trained with evil-tune method applied. !image/png ## Prompt template: Mistral Inst ## Disclaimer The AI model provided herein is intended for experimental purposes only. The creator of this model makes no representations or warranties of any kind, either express or implied, as to the model's accuracy, reliability, or suitability for any particular purpose. The creator shall not be held liable for any outcomes, decisions, or actions taken on the basis of the information generated by this model. Users of this model assume full responsibility for any consequences resulting from its use.
[ "# miqu-evil-dpo", "# Model Details", "## Description\nmiqu-evil-dpo is fine-tuned model based on miqu, serving as a direct successor to PiVoT-0.1-Evil-a.\n\nIt is trained with evil-tune method applied.\n\n!image/png", "## Prompt template: Mistral Inst", "## Disclaimer\nThe AI model provided herein is intended for experimental purposes only. The creator of this model makes no representations or warranties of any kind, either express or implied, as to the model's accuracy, reliability, or suitability for any particular purpose. The creator shall not be held liable for any outcomes, decisions, or actions taken on the basis of the information generated by this model. Users of this model assume full responsibility for any consequences resulting from its use." ]
[ "TAGS\n#transformers #safetensors #llama #text-generation #not-for-all-audiences #conversational #en #license-other #autotrain_compatible #endpoints_compatible #text-generation-inference #6-bit #region-us \n", "# miqu-evil-dpo", "# Model Details", "## Description\nmiqu-evil-dpo is fine-tuned model based on miqu, serving as a direct successor to PiVoT-0.1-Evil-a.\n\nIt is trained with evil-tune method applied.\n\n!image/png", "## Prompt template: Mistral Inst", "## Disclaimer\nThe AI model provided herein is intended for experimental purposes only. The creator of this model makes no representations or warranties of any kind, either express or implied, as to the model's accuracy, reliability, or suitability for any particular purpose. The creator shall not be held liable for any outcomes, decisions, or actions taken on the basis of the information generated by this model. Users of this model assume full responsibility for any consequences resulting from its use." ]
text-generation
transformers
# Uploaded model - **Developed by:** Dogge - **License:** apache-2.0 - **Finetuned from model :** unsloth/llama-3-8b-bnb-4bit This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library. [<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
{"language": ["en"], "license": "apache-2.0", "tags": ["text-generation-inference", "transformers", "unsloth", "llama", "trl", "sft"], "base_model": "unsloth/llama-3-8b-bnb-4bit"}
Dogge/llama-3-translate-enko-4bit
null
[ "transformers", "safetensors", "llama", "text-generation", "text-generation-inference", "unsloth", "trl", "sft", "en", "base_model:unsloth/llama-3-8b-bnb-4bit", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "4-bit", "region:us" ]
null
2024-04-27T11:18:17+00:00
[]
[ "en" ]
TAGS #transformers #safetensors #llama #text-generation #text-generation-inference #unsloth #trl #sft #en #base_model-unsloth/llama-3-8b-bnb-4bit #license-apache-2.0 #autotrain_compatible #endpoints_compatible #4-bit #region-us
# Uploaded model - Developed by: Dogge - License: apache-2.0 - Finetuned from model : unsloth/llama-3-8b-bnb-4bit This llama model was trained 2x faster with Unsloth and Huggingface's TRL library. <img src="URL width="200"/>
[ "# Uploaded model\n\n- Developed by: Dogge\n- License: apache-2.0\n- Finetuned from model : unsloth/llama-3-8b-bnb-4bit\n\nThis llama model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>" ]
[ "TAGS\n#transformers #safetensors #llama #text-generation #text-generation-inference #unsloth #trl #sft #en #base_model-unsloth/llama-3-8b-bnb-4bit #license-apache-2.0 #autotrain_compatible #endpoints_compatible #4-bit #region-us \n", "# Uploaded model\n\n- Developed by: Dogge\n- License: apache-2.0\n- Finetuned from model : unsloth/llama-3-8b-bnb-4bit\n\nThis llama model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>" ]
text-generation
transformers
<p align="center" style="margin:0;padding:0">
  <img src="https://huggingface.co/BramVanroy/fietje-2b-instruct/resolve/main/img/fietje-2b-banner-rounded.png" alt="Fietje banner" width="800" style="margin-left:'auto' margin-right:'auto' display:'block'"/>
</p>

<div style="margin:auto; text-align:center">
<h1 style="margin-bottom: 0">Fietje 2B Instruct</h1>
<em>An open and efficient LLM for Dutch</em>
</div>

<blockquote class="tip" style="padding: 1.5em; border: 0">
<p align="center" style="text-align: center; margin: 0">
<a rel="nofollow" href="https://huggingface.co/BramVanroy/fietje-2b">👱‍♀️ Base version</a> -
<a rel="nofollow" href="https://huggingface.co/BramVanroy/fietje-2b-instruct">🤖 Instruct version</a> (this one) -
<a rel="nofollow" href="https://huggingface.co/BramVanroy/fietje-2b-chat">💬 Chat version</a> -
<a rel="nofollow" href="https://huggingface.co/BramVanroy/fietje-2b-chat-GGUF">🚀 GGUF of Instruct</a>
</p>
<p align="center" style="text-align: center; margin: 0">
<a href="https://huggingface.co/spaces/BramVanroy/fietje-2b"><strong>Chat with Fietje here!</strong></a>
</p>
</blockquote>

This is the instruct version of Fietje, an SFT-tuned (instruction-tuned) variant of [the base model](https://huggingface.co/BramVanroy/fietje-2b). Fietje is an adapted version of [microsoft/phi-2](https://huggingface.co/microsoft/phi-2), tailored to Dutch text generation by training on 28B tokens. It is small and efficient with a size of 2.7 billion parameters while performing almost on par with more powerful Dutch LLMs of twice its size like [GEITje 7B Ultra](https://huggingface.co/BramVanroy/GEITje-7B-ultra).

A thorough description of the creation and evaluation of Fietje as well as usage examples are available in [this Github repository](https://github.com/BramVanroy/fietje).

## Intended uses & limitations

The same limitations as [phi-2](https://huggingface.co/microsoft/phi-2#limitations-of-phi-2), and LLMs in general, apply here. LLMs hallucinate, make mistakes, and should not be trusted. Use at your own risk!

## Training and evaluation data

Fietje 2B instruct was finetuned from [the base model](https://huggingface.co/BramVanroy/fietje-2b) on the following datasets. Number of training samples per dataset given in brackets, totalling 201,579 samples.

- [BramVanroy/ultrachat_200k_dutch](https://huggingface.co/datasets/BramVanroy/ultrachat_200k_dutch): gpt-4-1106-preview; multi-turn; fully generated (192,598)
- [BramVanroy/no_robots_dutch](https://huggingface.co/datasets/BramVanroy/no_robots_dutch): gpt-4-1106-preview; prompt translate, answer generated; some items have system messages (8181)
- [BramVanroy/belebele_dutch](https://huggingface.co/datasets/BramVanroy/belebele_dutch): Dutch portion of [belebele](https://huggingface.co/datasets/facebook/belebele), formatted into SFT format (800)

## Training procedure

I am thankful to the [Flemish Supercomputer Center](https://www.vscentrum.be/) (VSC) for providing the computational power to accomplish this project. Accounting for waiting for jobs, training took around a day on four nodes of 4x A100 80GB each (16 total). I cannot find the exact time anymore and I do not think that the runtime in `all_results.json` accounts for interrupted-and-continued jobs.

Training was done with the wonderful [alignment-handbook](https://github.com/huggingface/alignment-handbook), using DeepSpeed as a back-end. Exact training recipes and SLURM script are given in the [Github repository](https://github.com/BramVanroy/fietje).
### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 6e-05 - train_batch_size: 42 - eval_batch_size: 42 - seed: 42 - distributed_type: multi-GPU - num_devices: 16 - total_train_batch_size: 672 - total_eval_batch_size: 672 - optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-07 - lr_scheduler_type: cosine - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 3.0 ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:-----:|:----:|:---------------:| | 0.9325 | 1.0 | 178 | 0.9060 | | 0.8687 | 2.0 | 356 | 0.8850 | | 0.8385 | 3.0 | 534 | 0.8818 | ### Framework versions - Transformers 4.39.1 - Pytorch 2.1.2+cu121 - Datasets 2.18.0 - Tokenizers 0.15.2
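The card above defers usage examples to the Github repository; as a quick illustration, here is a minimal, hedged sketch (not taken from the original card) of chatting with Fietje 2B Instruct through `transformers`. It assumes the tokenizer ships a chat template and that `accelerate` is installed for `device_map="auto"`; the prompt and sampling settings are illustrative only.

```python
# Minimal sketch: load Fietje 2B Instruct and generate a Dutch reply.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "BramVanroy/fietje-2b-instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")  # assumes `accelerate` is installed

# Build a single-turn chat prompt with the tokenizer's chat template.
messages = [{"role": "user", "content": "Wat is de hoofdstad van Friesland?"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

# Illustrative sampling settings; tune them for your use case.
output_ids = model.generate(input_ids, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```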
{"language": ["nl"], "license": "mit", "tags": ["trl", "fietje", "alignment-handbook", "sft"], "datasets": ["BramVanroy/ultrachat_200k_dutch", "BramVanroy/no_robots_dutch", "BramVanroy/belebele_dutch"], "base_model": "BramVanroy/fietje-2b", "pipeline_tag": "text-generation", "inference": false, "model-index": [{"name": "fietje-2b-instruct", "results": []}]}
BramVanroy/fietje-2b-instruct
null
[ "transformers", "safetensors", "phi", "text-generation", "trl", "fietje", "alignment-handbook", "sft", "conversational", "nl", "dataset:BramVanroy/ultrachat_200k_dutch", "dataset:BramVanroy/no_robots_dutch", "dataset:BramVanroy/belebele_dutch", "base_model:BramVanroy/fietje-2b", "license:mit", "autotrain_compatible", "text-generation-inference", "region:us" ]
null
2024-04-27T11:18:30+00:00
[]
[ "nl" ]
TAGS #transformers #safetensors #phi #text-generation #trl #fietje #alignment-handbook #sft #conversational #nl #dataset-BramVanroy/ultrachat_200k_dutch #dataset-BramVanroy/no_robots_dutch #dataset-BramVanroy/belebele_dutch #base_model-BramVanroy/fietje-2b #license-mit #autotrain_compatible #text-generation-inference #region-us
Fietje 2B Instruct ================== *An open and efficient LLM for Dutch* > Base version - Instruct version (this one) - Chat version - GGUF of Instruct > Chat with Fietje here! This is the instruct version of Fietje, an SFT-tuned (instruction-tuned) variant of the base model. Fietje is an adapted version of microsoft/phi-2, tailored to Dutch text generation by training on 28B tokens. It is small and efficient with a size of 2.7 billion parameters while performing almost on par with more powerful Dutch LLMs of twice its size like GEITje 7B Ultra. A thorough description of the creation and evaluation of Fietje as well as usage examples are available in this Github repository. Intended uses & limitations --------------------------- The same limitations as phi-2, and LLMs in general, apply here. LLMs hallucinate, make mistakes, and should not be trusted. Use at your own risk! Training and evaluation data ---------------------------- Fietje 2B instruct was finetuned from the base model on the following datasets. Number of training samples per dataset given in brackets, totalling 201,579 samples. * BramVanroy/ultrachat\_200k\_dutch: gpt-4-1106-preview; multi-turn; fully generated (192,598) * BramVanroy/no\_robots\_dutch: gpt-4-1106-preview; prompt translated, answer generated; some items have system messages (8181) * BramVanroy/belebele\_dutch: Dutch portion of belebele, formatted into SFT format (800) Training procedure ------------------ I am thankful to the Flemish Supercomputer Center (VSC) for providing the computational power to accomplish this project. Accounting for waiting for jobs, training took around a day on four nodes of 4x A100 80GB each (16 total). I cannot find the exact time anymore and I do not think that the runtime in 'all\_results.json' accounts for interrupted-and-continued jobs. Training was done with the wonderful alignment-handbook, using DeepSpeed as a back-end. Exact training recipes and SLURM script are given in the Github repository. ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 6e-05 * train\_batch\_size: 42 * eval\_batch\_size: 42 * seed: 42 * distributed\_type: multi-GPU * num\_devices: 16 * total\_train\_batch\_size: 672 * total\_eval\_batch\_size: 672 * optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-07 * lr\_scheduler\_type: cosine * lr\_scheduler\_warmup\_ratio: 0.1 * num\_epochs: 3.0 ### Training results ### Framework versions * Transformers 4.39.1 * Pytorch 2.1.2+cu121 * Datasets 2.18.0 * Tokenizers 0.15.2
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 6e-05\n* train\\_batch\\_size: 42\n* eval\\_batch\\_size: 42\n* seed: 42\n* distributed\\_type: multi-GPU\n* num\\_devices: 16\n* total\\_train\\_batch\\_size: 672\n* total\\_eval\\_batch\\_size: 672\n* optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-07\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 3.0", "### Training results", "### Framework versions\n\n\n* Transformers 4.39.1\n* Pytorch 2.1.2+cu121\n* Datasets 2.18.0\n* Tokenizers 0.15.2" ]
[ "TAGS\n#transformers #safetensors #phi #text-generation #trl #fietje #alignment-handbook #sft #conversational #nl #dataset-BramVanroy/ultrachat_200k_dutch #dataset-BramVanroy/no_robots_dutch #dataset-BramVanroy/belebele_dutch #base_model-BramVanroy/fietje-2b #license-mit #autotrain_compatible #text-generation-inference #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 6e-05\n* train\\_batch\\_size: 42\n* eval\\_batch\\_size: 42\n* seed: 42\n* distributed\\_type: multi-GPU\n* num\\_devices: 16\n* total\\_train\\_batch\\_size: 672\n* total\\_eval\\_batch\\_size: 672\n* optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-07\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 3.0", "### Training results", "### Framework versions\n\n\n* Transformers 4.39.1\n* Pytorch 2.1.2+cu121\n* Datasets 2.18.0\n* Tokenizers 0.15.2" ]
text-generation
transformers
<!-- header start --> <!-- 200823 --> <div style="width: auto; margin-left: auto; margin-right: auto"> <img src="https://github.com/LlamaEdge/LlamaEdge/raw/dev/assets/logo.svg" style="width: 100%; min-width: 400px; display: block; margin: auto;"> </div> <hr style="margin-top: 1.0em; margin-bottom: 1.0em;"> <!-- header end --> # Llama-3-8B-Instruct-262k-GGUF ## Original Model [gradientai/Llama-3-8B-Instruct-262k](https://huggingface.co/gradientai/Llama-3-8B-Instruct-262k) ## Run with LlamaEdge - LlamaEdge version: [v0.8.3](https://github.com/LlamaEdge/LlamaEdge/releases/tag/0.8.3) and above - Prompt template - Prompt type: `llama-3-chat` - Prompt string ```text <|begin_of_text|><|start_header_id|>system<|end_header_id|> {{ system_prompt }}<|eot_id|><|start_header_id|>user<|end_header_id|> {{ user_message_1 }}<|eot_id|><|start_header_id|>assistant<|end_header_id|> {{ model_answer_1 }}<|eot_id|><|start_header_id|>user<|end_header_id|> {{ user_message_2 }}<|eot_id|><|start_header_id|>assistant<|end_header_id|> ``` - Context size: `262144` - Run as LlamaEdge service ```bash wasmedge --dir .:. --nn-preload default:GGML:AUTO:Llama-3-8B-Instruct-262k-Q5_K_M.gguf \ llama-api-server.wasm \ --prompt-template llama-3-chat \ --ctx-size 262144 \ --model-name llama-3-8B-instruct-262k ``` - Run as LlamaEdge command app ```bash wasmedge --dir .:. --nn-preload default:GGML:AUTO:Llama-3-8B-Instruct-262k-Q5_K_M.gguf \ llama-chat.wasm \ --prompt-template llama-3-chat \ --ctx-size 262144 ``` ## Quantized GGUF Models | Name | Quant method | Bits | Size | Use case | | ---- | ---- | ---- | ---- | ----- | | [Llama-3-8B-Instruct-262k-Q2_K.gguf](https://huggingface.co/second-state/Llama-3-8B-Instruct-262k-GGUF/blob/main/Llama-3-8B-Instruct-262k-Q2_K.gguf) | Q2_K | 2 | 3.18 GB| smallest, significant quality loss - not recommended for most purposes | | [Llama-3-8B-Instruct-262k-Q3_K_L.gguf](https://huggingface.co/second-state/Llama-3-8B-Instruct-262k-GGUF/blob/main/Llama-3-8B-Instruct-262k-Q3_K_L.gguf) | Q3_K_L | 3 | 4.32 GB| small, substantial quality loss | | [Llama-3-8B-Instruct-262k-Q3_K_M.gguf](https://huggingface.co/second-state/Llama-3-8B-Instruct-262k-GGUF/blob/main/Llama-3-8B-Instruct-262k-Q3_K_M.gguf) | Q3_K_M | 3 | 4.02 GB| very small, high quality loss | | [Llama-3-8B-Instruct-262k-Q3_K_S.gguf](https://huggingface.co/second-state/Llama-3-8B-Instruct-262k-GGUF/blob/main/Llama-3-8B-Instruct-262k-Q3_K_S.gguf) | Q3_K_S | 3 | 3.66 GB| very small, high quality loss | | [Llama-3-8B-Instruct-262k-Q4_0.gguf](https://huggingface.co/second-state/Llama-3-8B-Instruct-262k-GGUF/blob/main/Llama-3-8B-Instruct-262k-Q4_0.gguf) | Q4_0 | 4 | 4.66 GB| legacy; small, very high quality loss - prefer using Q3_K_M | | [Llama-3-8B-Instruct-262k-Q4_K_M.gguf](https://huggingface.co/second-state/Llama-3-8B-Instruct-262k-GGUF/blob/main/Llama-3-8B-Instruct-262k-Q4_K_M.gguf) | Q4_K_M | 4 | 4.92 GB| medium, balanced quality - recommended | | [Llama-3-8B-Instruct-262k-Q4_K_S.gguf](https://huggingface.co/second-state/Llama-3-8B-Instruct-262k-GGUF/blob/main/Llama-3-8B-Instruct-262k-Q4_K_S.gguf) | Q4_K_S | 4 | 4.69 GB| small, greater quality loss | | [Llama-3-8B-Instruct-262k-Q5_0.gguf](https://huggingface.co/second-state/Llama-3-8B-Instruct-262k-GGUF/blob/main/Llama-3-8B-Instruct-262k-Q5_0.gguf) | Q5_0 | 5 | 5.6 GB| legacy; medium, balanced quality - prefer using Q4_K_M | | [Llama-3-8B-Instruct-262k-Q5_K_M.gguf](https://huggingface.co/second-state/Llama-3-8B-Instruct-262k-GGUF/blob/main/Llama-3-8B-Instruct-262k-Q5_K_M.gguf) | Q5_K_M | 5 | 
5.73 GB| large, very low quality loss - recommended | | [Llama-3-8B-Instruct-262k-Q5_K_S.gguf](https://huggingface.co/second-state/Llama-3-8B-Instruct-262k-GGUF/blob/main/Llama-3-8B-Instruct-262k-Q5_K_S.gguf) | Q5_K_S | 5 | 5.6 GB| large, low quality loss - recommended | | [Llama-3-8B-Instruct-262k-Q6_K.gguf](https://huggingface.co/second-state/Llama-3-8B-Instruct-262k-GGUF/blob/main/Llama-3-8B-Instruct-262k-Q6_K.gguf) | Q6_K | 6 | 6.6 GB| very large, extremely low quality loss | | [Llama-3-8B-Instruct-262k-Q8_0.gguf](https://huggingface.co/second-state/Llama-3-8B-Instruct-262k-GGUF/blob/main/Llama-3-8B-Instruct-262k-Q8_0.gguf) | Q8_0 | 8 | 8.54 GB| very large, extremely low quality loss - not recommended | | [Llama-3-8B-Instruct-262k-f16.gguf](https://huggingface.co/second-state/Llama-3-8B-Instruct-262k-GGUF/blob/main/Llama-3-8B-Instruct-262k-f16.gguf) | f16 | 16 | 16.1 GB| | *Quantized with llama.cpp b2734.*
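As a convenience next to the table above, here is a hedged sketch of fetching one of the listed quantizations with `huggingface_hub`. The repo id and filename are taken from the table; the Q5_K_M build is chosen only because the wasmedge commands above reference it.

```python
# Sketch: download the Q5_K_M quantization referenced by the wasmedge commands above.
from huggingface_hub import hf_hub_download

local_path = hf_hub_download(
    repo_id="second-state/Llama-3-8B-Instruct-262k-GGUF",
    filename="Llama-3-8B-Instruct-262k-Q5_K_M.gguf",
)
print(local_path)  # pass this path to `wasmedge --nn-preload default:GGML:AUTO:<path>`
```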
{"language": ["en"], "license": "other", "tags": ["meta", "llama-3"], "license_name": "llama3", "base_model": "gradientai/Llama-3-8B-Instruct-262k", "inference": false, "model_creator": "gradient.ai", "model_type": "llama", "pipeline_tag": "text-generation", "quantized_by": "Second State Inc."}
second-state/Llama-3-8B-Instruct-262k-GGUF
null
[ "transformers", "gguf", "llama", "text-generation", "meta", "llama-3", "en", "base_model:gradientai/Llama-3-8B-Instruct-262k", "license:other", "autotrain_compatible", "text-generation-inference", "region:us" ]
null
2024-04-27T11:19:41+00:00
[]
[ "en" ]
TAGS #transformers #gguf #llama #text-generation #meta #llama-3 #en #base_model-gradientai/Llama-3-8B-Instruct-262k #license-other #autotrain_compatible #text-generation-inference #region-us
--- Llama-3-8B-Instruct-262k-GGUF ============================= Original Model -------------- gradientai/Llama-3-8B-Instruct-262k Run with LlamaEdge ------------------ * LlamaEdge version: v0.8.3 and above * Prompt template + Prompt type: 'llama-3-chat' + Prompt string * Context size: '262144' * Run as LlamaEdge service * Run as LlamaEdge command app Quantized GGUF Models --------------------- *Quantized with URL b2734.*
[]
[ "TAGS\n#transformers #gguf #llama #text-generation #meta #llama-3 #en #base_model-gradientai/Llama-3-8B-Instruct-262k #license-other #autotrain_compatible #text-generation-inference #region-us \n" ]
text-to-image
diffusers
# facefusion <Gallery /> ## Download model [Download](/ramiz6900/facefusion/tree/main) them in the Files & versions tab.
{"tags": ["text-to-image", "stable-diffusion", "lora", "diffusers", "template:sd-lora"], "widget": [{"text": "-", "output": {"url": "images/images.jpg"}}], "base_model": "h94/IP-Adapter-FaceID"}
ramiz6900/facefusion
null
[ "diffusers", "text-to-image", "stable-diffusion", "lora", "template:sd-lora", "base_model:h94/IP-Adapter-FaceID", "region:us" ]
null
2024-04-27T11:20:20+00:00
[]
[]
TAGS #diffusers #text-to-image #stable-diffusion #lora #template-sd-lora #base_model-h94/IP-Adapter-FaceID #region-us
# facefusion <Gallery /> ## Download model Download them in the Files & versions tab.
[ "# facefusion \n\n<Gallery />", "## Download model\n\n\nDownload them in the Files & versions tab." ]
[ "TAGS\n#diffusers #text-to-image #stable-diffusion #lora #template-sd-lora #base_model-h94/IP-Adapter-FaceID #region-us \n", "# facefusion \n\n<Gallery />", "## Download model\n\n\nDownload them in the Files & versions tab." ]
text-generation
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
shallow6414/mv938bk
null
[ "transformers", "safetensors", "llama", "text-generation", "conversational", "arxiv:1910.09700", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2024-04-27T11:23:16+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #llama #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #llama #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
text-generation
transformers
# merge This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit). ## Merge Details ### Merge Method This model was merged using the SLERP merge method. ### Models Merged The following models were included in the merge: * [meta-llama/Meta-Llama-3-8B](https://huggingface.co/meta-llama/Meta-Llama-3-8B) * [meta-llama/Meta-Llama-3-8B-Instruct](https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct) ### Configuration The following YAML configuration was used to produce this model: ```yaml slices: - sources: - model: meta-llama/Meta-Llama-3-8B layer_range: - 0 - 32 - model: meta-llama/Meta-Llama-3-8B-Instruct layer_range: - 0 - 32 merge_method: slerp base_model: meta-llama/Meta-Llama-3-8B parameters: t: - filter: self_attn value: - 0 - 0.5 - 0.3 - 0.7 - 1 - filter: mlp value: - 1 - 0.5 - 0.7 - 0.3 - 0 - value: 0.5 dtype: bfloat16 ```
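To make the configuration above actionable, here is a hedged sketch of reproducing the merge: it writes the same SLERP config to disk and calls mergekit's `mergekit-yaml` command-line entry point via `subprocess`. It assumes `mergekit` is installed, that the `mergekit-yaml` script is on PATH, and that you have access to both Meta-Llama-3 repositories; the output directory name is an illustrative choice.

```python
# Sketch: write the SLERP config from the card to disk and run mergekit on it.
import subprocess
import textwrap
from pathlib import Path

# Same settings as the YAML block in the card, written in flow style.
yaml_config = textwrap.dedent("""\
    slices:
      - sources:
          - model: meta-llama/Meta-Llama-3-8B
            layer_range: [0, 32]
          - model: meta-llama/Meta-Llama-3-8B-Instruct
            layer_range: [0, 32]
    merge_method: slerp
    base_model: meta-llama/Meta-Llama-3-8B
    parameters:
      t:
        - filter: self_attn
          value: [0.0, 0.5, 0.3, 0.7, 1.0]
        - filter: mlp
          value: [1.0, 0.5, 0.7, 0.3, 0.0]
        - value: 0.5
    dtype: bfloat16
    """)

Path("slerp-config.yaml").write_text(yaml_config)
# Assumes the `mergekit-yaml` CLI installed by `pip install mergekit` is available.
subprocess.run(["mergekit-yaml", "slerp-config.yaml", "./Llama3-base-instruct-SLERP"], check=True)
```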
{"library_name": "transformers", "tags": ["mergekit", "merge"], "base_model": ["meta-llama/Meta-Llama-3-8B", "meta-llama/Meta-Llama-3-8B-Instruct"]}
skuma307/Llama3-base-instruct-SLERP
null
[ "transformers", "safetensors", "llama", "text-generation", "mergekit", "merge", "base_model:meta-llama/Meta-Llama-3-8B", "base_model:meta-llama/Meta-Llama-3-8B-Instruct", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2024-04-27T11:24:00+00:00
[]
[]
TAGS #transformers #safetensors #llama #text-generation #mergekit #merge #base_model-meta-llama/Meta-Llama-3-8B #base_model-meta-llama/Meta-Llama-3-8B-Instruct #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# merge This is a merge of pre-trained language models created using mergekit. ## Merge Details ### Merge Method This model was merged using the SLERP merge method. ### Models Merged The following models were included in the merge: * meta-llama/Meta-Llama-3-8B * meta-llama/Meta-Llama-3-8B-Instruct ### Configuration The following YAML configuration was used to produce this model:
[ "# merge\n\nThis is a merge of pre-trained language models created using mergekit.", "## Merge Details", "### Merge Method\n\nThis model was merged using the SLERP merge method.", "### Models Merged\n\nThe following models were included in the merge:\n* meta-llama/Meta-Llama-3-8B\n* meta-llama/Meta-Llama-3-8B-Instruct", "### Configuration\n\nThe following YAML configuration was used to produce this model:" ]
[ "TAGS\n#transformers #safetensors #llama #text-generation #mergekit #merge #base_model-meta-llama/Meta-Llama-3-8B #base_model-meta-llama/Meta-Llama-3-8B-Instruct #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# merge\n\nThis is a merge of pre-trained language models created using mergekit.", "## Merge Details", "### Merge Method\n\nThis model was merged using the SLERP merge method.", "### Models Merged\n\nThe following models were included in the merge:\n* meta-llama/Meta-Llama-3-8B\n* meta-llama/Meta-Llama-3-8B-Instruct", "### Configuration\n\nThe following YAML configuration was used to produce this model:" ]
text-generation
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
OwOOwO/final10
null
[ "transformers", "safetensors", "stablelm", "text-generation", "conversational", "arxiv:1910.09700", "autotrain_compatible", "endpoints_compatible", "region:us" ]
null
2024-04-27T11:25:57+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #stablelm #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #stablelm #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
null
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
rikitonoto/lua_tokenizer
null
[ "transformers", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
null
2024-04-27T11:28:19+00:00
[ "1910.09700" ]
[]
TAGS #transformers #arxiv-1910.09700 #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #arxiv-1910.09700 #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
question-answering
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
NeginShams/mbert-extratranslation
null
[ "transformers", "safetensors", "bert", "question-answering", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
null
2024-04-27T11:30:52+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #bert #question-answering #arxiv-1910.09700 #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #bert #question-answering #arxiv-1910.09700 #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
text-generation
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # 0.01_4iters_bs256_nodpo_only4w_iter_1 This model is a fine-tuned version of [HuggingFaceH4/mistral-7b-sft-beta](https://huggingface.co/HuggingFaceH4/mistral-7b-sft-beta) on the updated and the original datasets. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-07 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - distributed_type: multi-GPU - num_devices: 8 - gradient_accumulation_steps: 4 - total_train_batch_size: 256 - total_eval_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 1 ### Training results ### Framework versions - Transformers 4.40.0 - Pytorch 2.1.2+cu121 - Datasets 2.14.6 - Tokenizers 0.19.1
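The card lists the DPO training setup but no usage snippet, so a hedged loading sketch follows. The listed hyperparameters are internally consistent: 8 devices × a per-device train batch size of 8 × 4 gradient-accumulation steps gives the stated total train batch size of 256. The sketch assumes the tokenizer inherits a chat template from the HuggingFaceH4/mistral-7b-sft-beta base model, as suggested by the "conversational" tag.

```python
# Minimal sketch, assuming standard transformers loading and a chat template
# inherited from the HuggingFaceH4/mistral-7b-sft-beta base model.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ShenaoZhang/0.01_4iters_bs256_nodpo_only4w_iter_1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [{"role": "user", "content": "Explain what DPO fine-tuning changes about a model."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```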
{"license": "mit", "tags": ["alignment-handbook", "trl", "dpo", "generated_from_trainer", "trl", "dpo", "generated_from_trainer"], "datasets": ["updated", "original"], "base_model": "HuggingFaceH4/mistral-7b-sft-beta", "model-index": [{"name": "0.01_4iters_bs256_nodpo_only4w_iter_1", "results": []}]}
ShenaoZhang/0.01_4iters_bs256_nodpo_only4w_iter_1
null
[ "transformers", "safetensors", "mistral", "text-generation", "alignment-handbook", "trl", "dpo", "generated_from_trainer", "conversational", "dataset:updated", "dataset:original", "base_model:HuggingFaceH4/mistral-7b-sft-beta", "license:mit", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2024-04-27T11:31:57+00:00
[]
[]
TAGS #transformers #safetensors #mistral #text-generation #alignment-handbook #trl #dpo #generated_from_trainer #conversational #dataset-updated #dataset-original #base_model-HuggingFaceH4/mistral-7b-sft-beta #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# 0.01_4iters_bs256_nodpo_only4w_iter_1 This model is a fine-tuned version of HuggingFaceH4/mistral-7b-sft-beta on the updated and the original datasets. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-07 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - distributed_type: multi-GPU - num_devices: 8 - gradient_accumulation_steps: 4 - total_train_batch_size: 256 - total_eval_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 1 ### Training results ### Framework versions - Transformers 4.40.0 - Pytorch 2.1.2+cu121 - Datasets 2.14.6 - Tokenizers 0.19.1
[ "# 0.01_4iters_bs256_nodpo_only4w_iter_1\n\nThis model is a fine-tuned version of HuggingFaceH4/mistral-7b-sft-beta on the updated and the original datasets.", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-07\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- distributed_type: multi-GPU\n- num_devices: 8\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 256\n- total_eval_batch_size: 64\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 1", "### Training results", "### Framework versions\n\n- Transformers 4.40.0\n- Pytorch 2.1.2+cu121\n- Datasets 2.14.6\n- Tokenizers 0.19.1" ]
[ "TAGS\n#transformers #safetensors #mistral #text-generation #alignment-handbook #trl #dpo #generated_from_trainer #conversational #dataset-updated #dataset-original #base_model-HuggingFaceH4/mistral-7b-sft-beta #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# 0.01_4iters_bs256_nodpo_only4w_iter_1\n\nThis model is a fine-tuned version of HuggingFaceH4/mistral-7b-sft-beta on the updated and the original datasets.", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-07\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- distributed_type: multi-GPU\n- num_devices: 8\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 256\n- total_eval_batch_size: 64\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 1", "### Training results", "### Framework versions\n\n- Transformers 4.40.0\n- Pytorch 2.1.2+cu121\n- Datasets 2.14.6\n- Tokenizers 0.19.1" ]
null
transformers
# Gemma Model Card **Model Page**: [Gemma](https://ai.google.dev/gemma/docs) This model card corresponds to the latest 2B instruct version of the Gemma model. Here you can find other models in the Gemma family: | | Base | Instruct | |----|----------------------------------------------------|----------------------------------------------------------------------| | 2B | [gemma-2b](https://huggingface.co/google/gemma-2b) | [**gemma-1.1-2b-it**](https://huggingface.co/google/gemma-1.1-2b-it) | | 7B | [gemma-7b](https://huggingface.co/google/gemma-7b) | [gemma-1.1-7b-it](https://huggingface.co/google/gemma-1.1-7b-it) | **Release Notes** This is Gemma 1.1 2B (IT), an update over the original instruction-tuned Gemma release. Gemma 1.1 was trained using a novel RLHF method, leading to substantial gains on quality, coding capabilities, factuality, instruction following and multi-turn conversation quality. We also fixed a bug in multi-turn conversations, and made sure that model responses don't always start with `"Sure,"`. We believe this release represents an improvement for most use cases, but we encourage users to test in their particular applications. The previous model [will continue to be available in the same repo](https://huggingface.co/google/gemma-2b-it). We appreciate the enthusiastic adoption of Gemma, and we continue to welcome all feedback from the community. **Resources and Technical Documentation**: * [Responsible Generative AI Toolkit](https://ai.google.dev/responsible) * [Gemma on Kaggle](https://www.kaggle.com/models/google/gemma) * [Gemma on Vertex Model Garden](https://console.cloud.google.com/vertex-ai/publishers/google/model-garden/335) **Terms of Use**: [Terms](https://www.kaggle.com/models/google/gemma/license/consent) **Authors**: Google ## Model Information Summary description and brief definition of inputs and outputs. ### Description Gemma is a family of lightweight, state-of-the-art open models from Google, built from the same research and technology used to create the Gemini models. They are text-to-text, decoder-only large language models, available in English, with open weights, pre-trained variants, and instruction-tuned variants. Gemma models are well-suited for a variety of text generation tasks, including question answering, summarization, and reasoning. Their relatively small size makes it possible to deploy them in environments with limited resources such as a laptop, desktop or your own cloud infrastructure, democratizing access to state of the art AI models and helping foster innovation for everyone. ### Usage Below we share some code snippets on how to get quickly started with running the model. First make sure to `pip install -U transformers`, then copy the snippet from the section that is relevant for your usecase. #### Running the model on a CPU As explained below, we recommend `torch.bfloat16` as the default dtype. You can use [a different precision](#precisions) if necessary. ```python from transformers import AutoTokenizer, AutoModelForCausalLM import torch tokenizer = AutoTokenizer.from_pretrained("google/gemma-1.1-2b-it") model = AutoModelForCausalLM.from_pretrained( "google/gemma-1.1-2b-it", torch_dtype=torch.bfloat16 ) input_text = "Write me a poem about Machine Learning." 
input_ids = tokenizer(input_text, return_tensors="pt") outputs = model.generate(**input_ids, max_new_tokens=50) print(tokenizer.decode(outputs[0])) ``` #### Running the model on a single / multi GPU ```python # pip install accelerate from transformers import AutoTokenizer, AutoModelForCausalLM import torch tokenizer = AutoTokenizer.from_pretrained("google/gemma-1.1-2b-it") model = AutoModelForCausalLM.from_pretrained( "google/gemma-1.1-2b-it", device_map="auto", torch_dtype=torch.bfloat16 ) input_text = "Write me a poem about Machine Learning." input_ids = tokenizer(input_text, return_tensors="pt").to("cuda") outputs = model.generate(**input_ids) print(tokenizer.decode(outputs[0])) ``` <a name="precisions"></a> #### Running the model on a GPU using different precisions The native weights of this model were exported in `bfloat16` precision. You can use `float16`, which may be faster on certain hardware, by indicating the `torch_dtype` when loading the model. For convenience, the `float16` revision of the repo contains a copy of the weights already converted to that precision. You can also use `float32` if you skip the dtype, but no precision increase will occur (model weights will just be upcasted to `float32`). See examples below. * _Using `torch.float16`_ ```python # pip install accelerate from transformers import AutoTokenizer, AutoModelForCausalLM import torch tokenizer = AutoTokenizer.from_pretrained("google/gemma-1.1-2b-it") model = AutoModelForCausalLM.from_pretrained( "google/gemma-1.1-2b-it", device_map="auto", torch_dtype=torch.float16, revision="float16", ) input_text = "Write me a poem about Machine Learning." input_ids = tokenizer(input_text, return_tensors="pt").to("cuda") outputs = model.generate(**input_ids) print(tokenizer.decode(outputs[0])) ``` * _Using `torch.bfloat16`_ ```python # pip install accelerate from transformers import AutoTokenizer, AutoModelForCausalLM import torch tokenizer = AutoTokenizer.from_pretrained("google/gemma-1.1-2b-it") model = AutoModelForCausalLM.from_pretrained( "google/gemma-1.1-2b-it", device_map="auto", torch_dtype=torch.bfloat16 ) input_text = "Write me a poem about Machine Learning." input_ids = tokenizer(input_text, return_tensors="pt").to("cuda") outputs = model.generate(**input_ids) print(tokenizer.decode(outputs[0])) ``` * _Upcasting to `torch.float32`_ ```python # pip install accelerate from transformers import AutoTokenizer, AutoModelForCausalLM tokenizer = AutoTokenizer.from_pretrained("google/gemma-1.1-2b-it") model = AutoModelForCausalLM.from_pretrained( "google/gemma-1.1-2b-it", device_map="auto" ) input_text = "Write me a poem about Machine Learning." input_ids = tokenizer(input_text, return_tensors="pt").to("cuda") outputs = model.generate(**input_ids) print(tokenizer.decode(outputs[0])) ``` #### Quantized Versions through `bitsandbytes` * _Using 8-bit precision (int8)_ ```python # pip install bitsandbytes accelerate from transformers import AutoTokenizer, AutoModelForCausalLM, BitsAndBytesConfig quantization_config = BitsAndBytesConfig(load_in_8bit=True) tokenizer = AutoTokenizer.from_pretrained("google/gemma-1.1-2b-it") model = AutoModelForCausalLM.from_pretrained( "google/gemma-1.1-2b-it", quantization_config=quantization_config ) input_text = "Write me a poem about Machine Learning."
input_ids = tokenizer(input_text, return_tensors="pt").to("cuda") outputs = model.generate(**input_ids) print(tokenizer.decode(outputs[0])) ``` * _Using 4-bit precision_ ```python # pip install bitsandbytes accelerate from transformers import AutoTokenizer, AutoModelForCausalLM, BitsAndBytesConfig quantization_config = BitsAndBytesConfig(load_in_4bit=True) tokenizer = AutoTokenizer.from_pretrained("google/gemma-1.1-2b-it") model = AutoModelForCausalLM.from_pretrained( "google/gemma-1.1-2b-it", quantization_config=quantization_config ) input_text = "Write me a poem about Machine Learning." input_ids = tokenizer(input_text, return_tensors="pt").to("cuda") outputs = model.generate(**input_ids) print(tokenizer.decode(outputs[0])) ``` #### Other optimizations * _Flash Attention 2_ First make sure to install `flash-attn` in your environment `pip install flash-attn` ```diff model = AutoModelForCausalLM.from_pretrained( model_id, torch_dtype=torch.float16, + attn_implementation="flash_attention_2" ).to(0) ``` #### Running the model in JAX / Flax Use the `flax` branch of the repository: ```python import jax.numpy as jnp from transformers import AutoTokenizer, FlaxGemmaForCausalLM model_id = "google/gemma-1.1-2b-it" tokenizer = AutoTokenizer.from_pretrained(model_id) tokenizer.padding_side = "left" model, params = FlaxGemmaForCausalLM.from_pretrained( model_id, dtype=jnp.bfloat16, revision="flax", _do_init=False, ) inputs = tokenizer("Valencia and Málaga are", return_tensors="np", padding=True) output = model.generate(**inputs, params=params, max_new_tokens=20, do_sample=False) output_text = tokenizer.batch_decode(output.sequences, skip_special_tokens=True) ``` [Check this notebook](https://colab.research.google.com/github/sanchit-gandhi/notebooks/blob/main/jax_gemma.ipynb) for a comprehensive walkthrough on how to parallelize JAX inference. ### Chat Template The instruction-tuned models use a chat template that must be adhered to for conversational use. The easiest way to apply it is using the tokenizer's built-in chat template, as shown in the following snippet. Let's load the model and apply the chat template to a conversation. In this example, we'll start with a single user interaction: ```py from transformers import AutoTokenizer, AutoModelForCausalLM import transformers import torch model_id = "google/gemma-1.1-2b-it" dtype = torch.bfloat16 tokenizer = AutoTokenizer.from_pretrained(model_id) model = AutoModelForCausalLM.from_pretrained( model_id, device_map="cuda", torch_dtype=dtype, ) chat = [ { "role": "user", "content": "Write a hello world program" }, ] prompt = tokenizer.apply_chat_template(chat, tokenize=False, add_generation_prompt=True) ``` At this point, the prompt contains the following text: ``` <bos><start_of_turn>user Write a hello world program<end_of_turn> <start_of_turn>model ``` As you can see, each turn is preceded by a `<start_of_turn>` delimiter and then the role of the entity (either `user`, for content supplied by the user, or `model` for LLM responses). Turns finish with the `<end_of_turn>` token. You can follow this format to build the prompt manually, if you need to do it without the tokenizer's chat template. 
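As an illustration, here is a minimal sketch of assembling that turn structure by hand when the chat template is not available; the `build_gemma_prompt` helper is hypothetical and simply mirrors the delimiters shown above.

```python
# Minimal sketch of assembling the turn structure by hand, mirroring the delimiters
# shown above. The build_gemma_prompt helper is illustrative, not part of the library.
def build_gemma_prompt(turns):
    """turns: list of (role, content) pairs, with role either "user" or "model"."""
    prompt = "<bos>"
    for role, content in turns:
        prompt += f"<start_of_turn>{role}\n{content}<end_of_turn>\n"
    # Leave an open model turn so the model responds as the assistant.
    prompt += "<start_of_turn>model\n"
    return prompt

manual_prompt = build_gemma_prompt([("user", "Write a hello world program")])
# manual_prompt should now match the chat-template output shown above.
```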
After the prompt is ready, generation can be performed like this: ```py inputs = tokenizer.encode(prompt, add_special_tokens=False, return_tensors="pt") outputs = model.generate(input_ids=inputs.to(model.device), max_new_tokens=150) ``` ### Fine-tuning You can find some fine-tuning scripts under the [`examples/` directory](https://huggingface.co/google/gemma-7b/tree/main/examples) of [`google/gemma-7b`](https://huggingface.co/google/gemma-7b) repository. To adapt them to this model, simply change the model-id to `google/gemma-1.1-2b-it`. We provide: * A script to perform Supervised Fine-Tuning (SFT) on UltraChat dataset using QLoRA * A script to perform SFT using FSDP on TPU devices * A notebook that you can run on a free-tier Google Colab instance to perform SFT on the English quotes dataset ### Inputs and outputs * **Input:** Text string, such as a question, a prompt, or a document to be summarized. * **Output:** Generated English-language text in response to the input, such as an answer to a question, or a summary of a document. ## Model Data Data used for model training and how the data was processed. ### Training Dataset These models were trained on a dataset of text data that includes a wide variety of sources, totaling 6 trillion tokens. Here are the key components: * Web Documents: A diverse collection of web text ensures the model is exposed to a broad range of linguistic styles, topics, and vocabulary. Primarily English-language content. * Code: Exposing the model to code helps it to learn the syntax and patterns of programming languages, which improves its ability to generate code or understand code-related questions. * Mathematics: Training on mathematical text helps the model learn logical reasoning, symbolic representation, and to address mathematical queries. The combination of these diverse data sources is crucial for training a powerful language model that can handle a wide variety of different tasks and text formats. ### Data Preprocessing Here are the key data cleaning and filtering methods applied to the training data: * CSAM Filtering: Rigorous CSAM (Child Sexual Abuse Material) filtering was applied at multiple stages in the data preparation process to ensure the exclusion of harmful and illegal content * Sensitive Data Filtering: As part of making Gemma pre-trained models safe and reliable, automated techniques were used to filter out certain personal information and other sensitive data from training sets. * Additional methods: Filtering based on content quality and safely in line with [our policies](https://storage.googleapis.com/gweb-uniblog-publish-prod/documents/2023_Google_AI_Principles_Progress_Update.pdf#page=11). ## Implementation Information Details about the model internals. ### Hardware Gemma was trained using the latest generation of [Tensor Processing Unit (TPU)](https://cloud.google.com/tpu/docs/intro-to-tpu) hardware (TPUv5e). Training large language models requires significant computational power. TPUs, designed specifically for matrix operations common in machine learning, offer several advantages in this domain: * Performance: TPUs are specifically designed to handle the massive computations involved in training LLMs. They can speed up training considerably compared to CPUs. * Memory: TPUs often come with large amounts of high-bandwidth memory, allowing for the handling of large models and batch sizes during training. This can lead to better model quality. 
* Scalability: TPU Pods (large clusters of TPUs) provide a scalable solution for handling the growing complexity of large foundation models. You can distribute training across multiple TPU devices for faster and more efficient processing. * Cost-effectiveness: In many scenarios, TPUs can provide a more cost-effective solution for training large models compared to CPU-based infrastructure, especially when considering the time and resources saved due to faster training. * These advantages are aligned with [Google's commitments to operate sustainably](https://sustainability.google/operating-sustainably/). ### Software Training was done using [JAX](https://github.com/google/jax) and [ML Pathways](https://blog.google/technology/ai/introducing-pathways-next-generation-ai-architecture/ml-pathways). JAX allows researchers to take advantage of the latest generation of hardware, including TPUs, for faster and more efficient training of large models. ML Pathways is Google's latest effort to build artificially intelligent systems capable of generalizing across multiple tasks. This is specially suitable for [foundation models](https://ai.google/discover/foundation-models/), including large language models like these ones. Together, JAX and ML Pathways are used as described in the [paper about the Gemini family of models](https://arxiv.org/abs/2312.11805); "the 'single controller' programming model of Jax and Pathways allows a single Python process to orchestrate the entire training run, dramatically simplifying the development workflow." ## Evaluation Model evaluation metrics and results. ### Benchmark Results The pre-trained base models were evaluated against a large collection of different datasets and metrics to cover different aspects of text generation: | Benchmark | Metric | 2B Params | 7B Params | | ------------------------------ | ------------- | ----------- | --------- | | [MMLU](https://arxiv.org/abs/2009.03300) | 5-shot, top-1 | 42.3 | 64.3 | | [HellaSwag](https://arxiv.org/abs/1905.07830) | 0-shot |71.4 | 81.2 | | [PIQA](https://arxiv.org/abs/1911.11641) | 0-shot | 77.3 | 81.2 | | [SocialIQA](https://arxiv.org/abs/1904.09728) | 0-shot | 49.7 | 51.8 | | [BooIQ](https://arxiv.org/abs/1905.10044) | 0-shot | 69.4 | 83.2 | | [WinoGrande](https://arxiv.org/abs/1907.10641) | partial score | 65.4 | 72.3 | | [CommonsenseQA](https://arxiv.org/abs/1811.00937) | 7-shot | 65.3 | 71.3 | | [OpenBookQA](https://arxiv.org/abs/1809.02789) | | 47.8 | 52.8 | | [ARC-e](https://arxiv.org/abs/1911.01547) | | 73.2 | 81.5 | | [ARC-c](https://arxiv.org/abs/1911.01547) | | 42.1 | 53.2 | | [TriviaQA](https://arxiv.org/abs/1705.03551) | 5-shot | 53.2 | 63.4 | | [Natural Questions](https://github.com/google-research-datasets/natural-questions) | 5-shot | 12.5 | 23 | | [HumanEval](https://arxiv.org/abs/2107.03374) | pass@1 | 22.0 | 32.3 | | [MBPP](https://arxiv.org/abs/2108.07732) | 3-shot | 29.2 | 44.4 | | [GSM8K](https://arxiv.org/abs/2110.14168) | maj@1 | 17.7 | 46.4 | | [MATH](https://arxiv.org/abs/2108.07732) | 4-shot | 11.8 | 24.3 | | [AGIEval](https://arxiv.org/abs/2304.06364) | | 24.2 | 41.7 | | [BIG-Bench](https://arxiv.org/abs/2206.04615) | | 35.2 | 55.1 | | ------------------------------ | ------------- | ----------- | --------- | | **Average** | | **45.0** | **56.9** | ## Ethics and Safety Ethics and safety evaluation approach and results. ### Evaluation Approach Our evaluation methods include structured evaluations and internal red-teaming testing of relevant content policies. 
Red-teaming was conducted by a number of different teams, each with different goals and human evaluation metrics. These models were evaluated against a number of different categories relevant to ethics and safety, including: * Text-to-Text Content Safety: Human evaluation on prompts covering safety policies including child sexual abuse and exploitation, harassment, violence and gore, and hate speech. * Text-to-Text Representational Harms: Benchmark against relevant academic datasets such as [WinoBias](https://arxiv.org/abs/1804.06876) and [BBQ Dataset](https://arxiv.org/abs/2110.08193v2). * Memorization: Automated evaluation of memorization of training data, including the risk of personally identifiable information exposure. * Large-scale harm: Tests for "dangerous capabilities," such as chemical, biological, radiological, and nuclear (CBRN) risks. ### Evaluation Results The results of ethics and safety evaluations are within acceptable thresholds for meeting [internal policies](https://storage.googleapis.com/gweb-uniblog-publish-prod/documents/2023_Google_AI_Principles_Progress_Update.pdf#page=11) for categories such as child safety, content safety, representational harms, memorization, large-scale harms. On top of robust internal evaluations, the results of well known safety benchmarks like BBQ, BOLD, Winogender, Winobias, RealToxicity, and TruthfulQA are shown here. #### Gemma 1.0 | Benchmark | Metric | Gemma 1.0 IT 2B | Gemma 1.0 IT 7B | | ------------------------ | ------------- | --------------- | --------------- | | [RealToxicity][realtox] | average | 6.86 | 7.90 | | [BOLD][bold] | | 45.57 | 49.08 | | [CrowS-Pairs][crows] | top-1 | 45.82 | 51.33 | | [BBQ Ambig][bbq] | 1-shot, top-1 | 62.58 | 92.54 | | [BBQ Disambig][bbq] | top-1 | 54.62 | 71.99 | | [Winogender][winogender] | top-1 | 51.25 | 54.17 | | [TruthfulQA][truthfulqa] | | 44.84 | 31.81 | | [Winobias 1_2][winobias] | | 56.12 | 59.09 | | [Winobias 2_2][winobias] | | 91.10 | 92.23 | | [Toxigen][toxigen] | | 29.77 | 39.59 | | ------------------------ | ------------- | --------------- | --------------- | #### Gemma 1.1 | Benchmark | Metric | Gemma 1.1 IT 2B | Gemma 1.1 IT 7B | | ------------------------ | ------------- | --------------- | --------------- | | [RealToxicity][realtox] | average | 7.03 | 8.04 | | [BOLD][bold] | | 47.76 | | | [CrowS-Pairs][crows] | top-1 | 45.89 | 49.67 | | [BBQ Ambig][bbq] | 1-shot, top-1 | 58.97 | 86.06 | | [BBQ Disambig][bbq] | top-1 | 53.90 | 85.08 | | [Winogender][winogender] | top-1 | 50.14 | 57.64 | | [TruthfulQA][truthfulqa] | | 44.24 | 45.34 | | [Winobias 1_2][winobias] | | 55.93 | 59.22 | | [Winobias 2_2][winobias] | | 89.46 | 89.2 | | [Toxigen][toxigen] | | 29.64 | 38.75 | | ------------------------ | ------------- | --------------- | --------------- | ## Usage and Limitations These models have certain limitations that users should be aware of. ### Intended Usage Open Large Language Models (LLMs) have a wide range of applications across various industries and domains. The following list of potential uses is not comprehensive. The purpose of this list is to provide contextual information about the possible use-cases that the model creators considered as part of model training and development. * Content Creation and Communication * Text Generation: These models can be used to generate creative text formats such as poems, scripts, code, marketing copy, and email drafts. 
* Chatbots and Conversational AI: Power conversational interfaces for customer service, virtual assistants, or interactive applications. * Text Summarization: Generate concise summaries of a text corpus, research papers, or reports. * Research and Education * Natural Language Processing (NLP) Research: These models can serve as a foundation for researchers to experiment with NLP techniques, develop algorithms, and contribute to the advancement of the field. * Language Learning Tools: Support interactive language learning experiences, aiding in grammar correction or providing writing practice. * Knowledge Exploration: Assist researchers in exploring large bodies of text by generating summaries or answering questions about specific topics. ### Limitations * Training Data * The quality and diversity of the training data significantly influence the model's capabilities. Biases or gaps in the training data can lead to limitations in the model's responses. * The scope of the training dataset determines the subject areas the model can handle effectively. * Context and Task Complexity * LLMs are better at tasks that can be framed with clear prompts and instructions. Open-ended or highly complex tasks might be challenging. * A model's performance can be influenced by the amount of context provided (longer context generally leads to better outputs, up to a certain point). * Language Ambiguity and Nuance * Natural language is inherently complex. LLMs might struggle to grasp subtle nuances, sarcasm, or figurative language. * Factual Accuracy * LLMs generate responses based on information they learned from their training datasets, but they are not knowledge bases. They may generate incorrect or outdated factual statements. * Common Sense * LLMs rely on statistical patterns in language. They might lack the ability to apply common sense reasoning in certain situations. ### Ethical Considerations and Risks The development of large language models (LLMs) raises several ethical concerns. In creating an open model, we have carefully considered the following: * Bias and Fairness * LLMs trained on large-scale, real-world text data can reflect socio-cultural biases embedded in the training material. These models underwent careful scrutiny, input data pre-processing described and posterior evaluations reported in this card. * Misinformation and Misuse * LLMs can be misused to generate text that is false, misleading, or harmful. * Guidelines are provided for responsible use with the model, see the [Responsible Generative AI Toolkit](http://ai.google.dev/gemma/responsible). * Transparency and Accountability: * This model card summarizes details on the models' architecture, capabilities, limitations, and evaluation processes. * A responsibly developed open model offers the opportunity to share innovation by making LLM technology accessible to developers and researchers across the AI ecosystem. Risks identified and mitigations: * Perpetuation of biases: It's encouraged to perform continuous monitoring (using evaluation metrics, human review) and the exploration of de-biasing techniques during model training, fine-tuning, and other use cases. * Generation of harmful content: Mechanisms and guidelines for content safety are essential. Developers are encouraged to exercise caution and implement appropriate content safety safeguards based on their specific product policies and application use cases. 
* Misuse for malicious purposes: Technical limitations and developer and end-user education can help mitigate against malicious applications of LLMs. Educational resources and reporting mechanisms for users to flag misuse are provided. Prohibited uses of Gemma models are outlined in the [Gemma Prohibited Use Policy](https://ai.google.dev/gemma/prohibited_use_policy). * Privacy violations: Models were trained on data filtered for removal of PII (Personally Identifiable Information). Developers are encouraged to adhere to privacy regulations with privacy-preserving techniques. ### Benefits At the time of release, this family of models provides high-performance open large language model implementations designed from the ground up for Responsible AI development compared to similarly sized models. Using the benchmark evaluation metrics described in this document, these models have shown to provide superior performance to other, comparably-sized open model alternatives.
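Note that this repository's id (jncraton/gemma-1.1-2b-it-ct2-int8) suggests the weights hosted here are an int8 CTranslate2 conversion of the model described above rather than the original transformers checkpoint, while the snippets above target transformers. A hedged sketch of running the converted model with the `ctranslate2` package is given below, assuming the repo follows the standard CTranslate2 layout; the tokenizer, prompt format, and generation settings are illustrative.

```python
# Hedged sketch, assuming this repo holds a standard CTranslate2 int8 conversion of
# gemma-1.1-2b-it. The upstream tokenizer is reused; settings are illustrative only.
import ctranslate2
from huggingface_hub import snapshot_download
from transformers import AutoTokenizer

model_dir = snapshot_download("jncraton/gemma-1.1-2b-it-ct2-int8")
generator = ctranslate2.Generator(model_dir, device="cpu")
tokenizer = AutoTokenizer.from_pretrained("google/gemma-1.1-2b-it")

prompt = (
    "<start_of_turn>user\nWrite me a poem about Machine Learning.<end_of_turn>\n"
    "<start_of_turn>model\n"
)
tokens = tokenizer.convert_ids_to_tokens(tokenizer.encode(prompt))

results = generator.generate_batch([tokens], max_length=128, include_prompt_in_result=False)
output_ids = results[0].sequences_ids[0]
print(tokenizer.decode(output_ids, skip_special_tokens=True))
```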
{"license": "gemma", "library_name": "transformers", "widget": [{"messages": [{"role": "user", "content": "How does the brain work?"}]}], "inference": {"parameters": {"max_new_tokens": 200}}, "extra_gated_heading": "Access Gemma on Hugging Face", "extra_gated_prompt": "To access Gemma on Hugging Face, you\u2019re required to review and agree to Google\u2019s usage license. To do this, please ensure you\u2019re logged-in to Hugging Face and click below. Requests are processed immediately.", "extra_gated_button_content": "Acknowledge license"}
jncraton/gemma-1.1-2b-it-ct2-int8
null
[ "transformers", "arxiv:2312.11805", "arxiv:2009.03300", "arxiv:1905.07830", "arxiv:1911.11641", "arxiv:1904.09728", "arxiv:1905.10044", "arxiv:1907.10641", "arxiv:1811.00937", "arxiv:1809.02789", "arxiv:1911.01547", "arxiv:1705.03551", "arxiv:2107.03374", "arxiv:2108.07732", "arxiv:2110.14168", "arxiv:2304.06364", "arxiv:2206.04615", "arxiv:1804.06876", "arxiv:2110.08193", "license:gemma", "endpoints_compatible", "region:us" ]
null
2024-04-27T11:33:07+00:00
[ "2312.11805", "2009.03300", "1905.07830", "1911.11641", "1904.09728", "1905.10044", "1907.10641", "1811.00937", "1809.02789", "1911.01547", "1705.03551", "2107.03374", "2108.07732", "2110.14168", "2304.06364", "2206.04615", "1804.06876", "2110.08193" ]
[]
TAGS #transformers #arxiv-2312.11805 #arxiv-2009.03300 #arxiv-1905.07830 #arxiv-1911.11641 #arxiv-1904.09728 #arxiv-1905.10044 #arxiv-1907.10641 #arxiv-1811.00937 #arxiv-1809.02789 #arxiv-1911.01547 #arxiv-1705.03551 #arxiv-2107.03374 #arxiv-2108.07732 #arxiv-2110.14168 #arxiv-2304.06364 #arxiv-2206.04615 #arxiv-1804.06876 #arxiv-2110.08193 #license-gemma #endpoints_compatible #region-us
Gemma Model Card ================ Model Page: Gemma This model card corresponds to the latest 2B instruct version of the Gemma model. Here you can find other models in the Gemma family: Base: 2B, Instruct: gemma-2b Base: 7B, Instruct: gemma-7b Release Notes This is Gemma 1.1 2B (IT), an update over the original instruction-tuned Gemma release. Gemma 1.1 was trained using a novel RLHF method, leading to substantial gains on quality, coding capabilities, factuality, instruction following and multi-turn conversation quality. We also fixed a bug in multi-turn conversations, and made sure that model responses don't always start with '"Sure,"'. We believe this release represents an improvement for most use cases, but we encourage users to test in their particular applications. The previous model will continue to be available in the same repo. We appreciate the enthusiastic adoption of Gemma, and we continue to welcome all feedback from the community. Resources and Technical Documentation: * Responsible Generative AI Toolkit * Gemma on Kaggle * Gemma on Vertex Model Garden Terms of Use: Terms Authors: Google Model Information ----------------- Summary description and brief definition of inputs and outputs. ### Description Gemma is a family of lightweight, state-of-the-art open models from Google, built from the same research and technology used to create the Gemini models. They are text-to-text, decoder-only large language models, available in English, with open weights, pre-trained variants, and instruction-tuned variants. Gemma models are well-suited for a variety of text generation tasks, including question answering, summarization, and reasoning. Their relatively small size makes it possible to deploy them in environments with limited resources such as a laptop, desktop or your own cloud infrastructure, democratizing access to state of the art AI models and helping foster innovation for everyone. ### Usage Below we share some code snippets on how to get quickly started with running the model. First make sure to 'pip install -U transformers', then copy the snippet from the section that is relevant for your usecase. #### Running the model on a CPU As explained below, we recommend 'torch.bfloat16' as the default dtype. You can use a different precision if necessary. #### Running the model on a single / multi GPU #### Running the model on a GPU using different precisions The native weights of this model were exported in 'bfloat16' precision. You can use 'float16', which may be faster on certain hardware, indicating the 'torch\_dtype' when loading the model. For convenience, the 'float16' revision of the repo contains a copy of the weights already converted to that precision. You can also use 'float32' if you skip the dtype, but no precision increase will occur (model weights will just be upcasted to 'float32'). See examples below. * *Using 'torch.float16'* * *Using 'torch.bfloat16'* * *Upcasting to 'torch.float32'* #### Quantized Versions through 'bitsandbytes' * *Using 8-bit precision (int8)* * *Using 4-bit precision* #### Other optimizations * *Flash Attention 2* First make sure to install 'flash-attn' in your environment 'pip install flash-attn' #### Running the model in JAX / Flax Use the 'flax' branch of the repository: Check this notebook for a comprehensive walkthrough on how to parallelize JAX inference. ### Chat Template The instruction-tuned models use a chat template that must be adhered to for conversational use. 
The easiest way to apply it is using the tokenizer's built-in chat template, as shown in the following snippet. Let's load the model and apply the chat template to a conversation. In this example, we'll start with a single user interaction: At this point, the prompt contains the following text: As you can see, each turn is preceded by a '<start\_of\_turn>' delimiter and then the role of the entity (either 'user', for content supplied by the user, or 'model' for LLM responses). Turns finish with the '<end\_of\_turn>' token. You can follow this format to build the prompt manually, if you need to do it without the tokenizer's chat template. After the prompt is ready, generation can be performed like this: ### Fine-tuning You can find some fine-tuning scripts under the 'examples/' directory of 'google/gemma-7b' repository. To adapt them to this model, simply change the model-id to 'google/gemma-1.1-2b-it'. We provide: * A script to perform Supervised Fine-Tuning (SFT) on UltraChat dataset using QLoRA * A script to perform SFT using FSDP on TPU devices * A notebook that you can run on a free-tier Google Colab instance to perform SFT on the English quotes dataset ### Inputs and outputs * Input: Text string, such as a question, a prompt, or a document to be summarized. * Output: Generated English-language text in response to the input, such as an answer to a question, or a summary of a document. Model Data ---------- Data used for model training and how the data was processed. ### Training Dataset These models were trained on a dataset of text data that includes a wide variety of sources, totaling 6 trillion tokens. Here are the key components: * Web Documents: A diverse collection of web text ensures the model is exposed to a broad range of linguistic styles, topics, and vocabulary. Primarily English-language content. * Code: Exposing the model to code helps it to learn the syntax and patterns of programming languages, which improves its ability to generate code or understand code-related questions. * Mathematics: Training on mathematical text helps the model learn logical reasoning, symbolic representation, and to address mathematical queries. The combination of these diverse data sources is crucial for training a powerful language model that can handle a wide variety of different tasks and text formats. ### Data Preprocessing Here are the key data cleaning and filtering methods applied to the training data: * CSAM Filtering: Rigorous CSAM (Child Sexual Abuse Material) filtering was applied at multiple stages in the data preparation process to ensure the exclusion of harmful and illegal content * Sensitive Data Filtering: As part of making Gemma pre-trained models safe and reliable, automated techniques were used to filter out certain personal information and other sensitive data from training sets. * Additional methods: Filtering based on content quality and safely in line with our policies. Implementation Information -------------------------- Details about the model internals. ### Hardware Gemma was trained using the latest generation of Tensor Processing Unit (TPU) hardware (TPUv5e). Training large language models requires significant computational power. TPUs, designed specifically for matrix operations common in machine learning, offer several advantages in this domain: * Performance: TPUs are specifically designed to handle the massive computations involved in training LLMs. They can speed up training considerably compared to CPUs. 
* Memory: TPUs often come with large amounts of high-bandwidth memory, allowing for the handling of large models and batch sizes during training. This can lead to better model quality. * Scalability: TPU Pods (large clusters of TPUs) provide a scalable solution for handling the growing complexity of large foundation models. You can distribute training across multiple TPU devices for faster and more efficient processing. * Cost-effectiveness: In many scenarios, TPUs can provide a more cost-effective solution for training large models compared to CPU-based infrastructure, especially when considering the time and resources saved due to faster training. * These advantages are aligned with Google's commitments to operate sustainably. ### Software Training was done using JAX and ML Pathways. JAX allows researchers to take advantage of the latest generation of hardware, including TPUs, for faster and more efficient training of large models. ML Pathways is Google's latest effort to build artificially intelligent systems capable of generalizing across multiple tasks. This is specially suitable for foundation models, including large language models like these ones. Together, JAX and ML Pathways are used as described in the paper about the Gemini family of models; "the 'single controller' programming model of Jax and Pathways allows a single Python process to orchestrate the entire training run, dramatically simplifying the development workflow." Evaluation ---------- Model evaluation metrics and results. ### Benchmark Results The pre-trained base models were evaluated against a large collection of different datasets and metrics to cover different aspects of text generation: Ethics and Safety ----------------- Ethics and safety evaluation approach and results. ### Evaluation Approach Our evaluation methods include structured evaluations and internal red-teaming testing of relevant content policies. Red-teaming was conducted by a number of different teams, each with different goals and human evaluation metrics. These models were evaluated against a number of different categories relevant to ethics and safety, including: * Text-to-Text Content Safety: Human evaluation on prompts covering safety policies including child sexual abuse and exploitation, harassment, violence and gore, and hate speech. * Text-to-Text Representational Harms: Benchmark against relevant academic datasets such as WinoBias and BBQ Dataset. * Memorization: Automated evaluation of memorization of training data, including the risk of personally identifiable information exposure. * Large-scale harm: Tests for "dangerous capabilities," such as chemical, biological, radiological, and nuclear (CBRN) risks. ### Evaluation Results The results of ethics and safety evaluations are within acceptable thresholds for meeting internal policies for categories such as child safety, content safety, representational harms, memorization, large-scale harms. On top of robust internal evaluations, the results of well known safety benchmarks like BBQ, BOLD, Winogender, Winobias, RealToxicity, and TruthfulQA are shown here. #### Gemma 1.0 #### Gemma 1.1 Usage and Limitations --------------------- These models have certain limitations that users should be aware of. ### Intended Usage Open Large Language Models (LLMs) have a wide range of applications across various industries and domains. The following list of potential uses is not comprehensive. 
The purpose of this list is to provide contextual information about the possible use-cases that the model creators considered as part of model training and development. * Content Creation and Communication + Text Generation: These models can be used to generate creative text formats such as poems, scripts, code, marketing copy, and email drafts. + Chatbots and Conversational AI: Power conversational interfaces for customer service, virtual assistants, or interactive applications. + Text Summarization: Generate concise summaries of a text corpus, research papers, or reports. * Research and Education + Natural Language Processing (NLP) Research: These models can serve as a foundation for researchers to experiment with NLP techniques, develop algorithms, and contribute to the advancement of the field. + Language Learning Tools: Support interactive language learning experiences, aiding in grammar correction or providing writing practice. + Knowledge Exploration: Assist researchers in exploring large bodies of text by generating summaries or answering questions about specific topics. ### Limitations * Training Data + The quality and diversity of the training data significantly influence the model's capabilities. Biases or gaps in the training data can lead to limitations in the model's responses. + The scope of the training dataset determines the subject areas the model can handle effectively. * Context and Task Complexity + LLMs are better at tasks that can be framed with clear prompts and instructions. Open-ended or highly complex tasks might be challenging. + A model's performance can be influenced by the amount of context provided (longer context generally leads to better outputs, up to a certain point). * Language Ambiguity and Nuance + Natural language is inherently complex. LLMs might struggle to grasp subtle nuances, sarcasm, or figurative language. * Factual Accuracy + LLMs generate responses based on information they learned from their training datasets, but they are not knowledge bases. They may generate incorrect or outdated factual statements. * Common Sense + LLMs rely on statistical patterns in language. They might lack the ability to apply common sense reasoning in certain situations. ### Ethical Considerations and Risks The development of large language models (LLMs) raises several ethical concerns. In creating an open model, we have carefully considered the following: * Bias and Fairness + LLMs trained on large-scale, real-world text data can reflect socio-cultural biases embedded in the training material. These models underwent careful scrutiny, input data pre-processing described and posterior evaluations reported in this card. * Misinformation and Misuse + LLMs can be misused to generate text that is false, misleading, or harmful. + Guidelines are provided for responsible use with the model, see the Responsible Generative AI Toolkit. * Transparency and Accountability: + This model card summarizes details on the models' architecture, capabilities, limitations, and evaluation processes. + A responsibly developed open model offers the opportunity to share innovation by making LLM technology accessible to developers and researchers across the AI ecosystem. Risks identified and mitigations: * Perpetuation of biases: It's encouraged to perform continuous monitoring (using evaluation metrics, human review) and the exploration of de-biasing techniques during model training, fine-tuning, and other use cases. 
* Generation of harmful content: Mechanisms and guidelines for content safety are essential. Developers are encouraged to exercise caution and implement appropriate content safety safeguards based on their specific product policies and application use cases. * Misuse for malicious purposes: Technical limitations and developer and end-user education can help mitigate against malicious applications of LLMs. Educational resources and reporting mechanisms for users to flag misuse are provided. Prohibited uses of Gemma models are outlined in the Gemma Prohibited Use Policy. * Privacy violations: Models were trained on data filtered for removal of PII (Personally Identifiable Information). Developers are encouraged to adhere to privacy regulations with privacy-preserving techniques. ### Benefits At the time of release, this family of models provides high-performance open large language model implementations designed from the ground up for Responsible AI development compared to similarly sized models. Using the benchmark evaluation metrics described in this document, these models have shown to provide superior performance to other, comparably-sized open model alternatives.
[ "### Description\n\n\nGemma is a family of lightweight, state-of-the-art open models from Google,\nbuilt from the same research and technology used to create the Gemini models.\nThey are text-to-text, decoder-only large language models, available in English,\nwith open weights, pre-trained variants, and instruction-tuned variants. Gemma\nmodels are well-suited for a variety of text generation tasks, including\nquestion answering, summarization, and reasoning. Their relatively small size\nmakes it possible to deploy them in environments with limited resources such as\na laptop, desktop or your own cloud infrastructure, democratizing access to\nstate of the art AI models and helping foster innovation for everyone.", "### Usage\n\n\nBelow we share some code snippets on how to get quickly started with running the model. First make sure to 'pip install -U transformers', then copy the snippet from the section that is relevant for your usecase.", "#### Running the model on a CPU\n\n\nAs explained below, we recommend 'torch.bfloat16' as the default dtype. You can use a different precision if necessary.", "#### Running the model on a single / multi GPU", "#### Running the model on a GPU using different precisions\n\n\nThe native weights of this model were exported in 'bfloat16' precision. You can use 'float16', which may be faster on certain hardware, indicating the 'torch\\_dtype' when loading the model. For convenience, the 'float16' revision of the repo contains a copy of the weights already converted to that precision.\n\n\nYou can also use 'float32' if you skip the dtype, but no precision increase will occur (model weights will just be upcasted to 'float32'). See examples below.\n\n\n* *Using 'torch.float16'*\n* *Using 'torch.bfloat16'*\n* *Upcasting to 'torch.float32'*", "#### Quantized Versions through 'bitsandbytes'\n\n\n* *Using 8-bit precision (int8)*\n* *Using 4-bit precision*", "#### Other optimizations\n\n\n* *Flash Attention 2*\n\n\nFirst make sure to install 'flash-attn' in your environment 'pip install flash-attn'", "#### Running the model in JAX / Flax\n\n\nUse the 'flax' branch of the repository:\n\n\nCheck this notebook for a comprehensive walkthrough on how to parallelize JAX inference.", "### Chat Template\n\n\nThe instruction-tuned models use a chat template that must be adhered to for conversational use.\nThe easiest way to apply it is using the tokenizer's built-in chat template, as shown in the following snippet.\n\n\nLet's load the model and apply the chat template to a conversation. In this example, we'll start with a single user interaction:\n\n\nAt this point, the prompt contains the following text:\n\n\nAs you can see, each turn is preceded by a '<start\\_of\\_turn>' delimiter and then the role of the entity\n(either 'user', for content supplied by the user, or 'model' for LLM responses). Turns finish with\nthe '<end\\_of\\_turn>' token.\n\n\nYou can follow this format to build the prompt manually, if you need to do it without the tokenizer's\nchat template.\n\n\nAfter the prompt is ready, generation can be performed like this:", "### Fine-tuning\n\n\nYou can find some fine-tuning scripts under the 'examples/' directory of 'google/gemma-7b' repository. 
To adapt them to this model, simply change the model-id to 'google/gemma-1.1-2b-it'.\n\n\nWe provide:\n\n\n* A script to perform Supervised Fine-Tuning (SFT) on UltraChat dataset using QLoRA\n* A script to perform SFT using FSDP on TPU devices\n* A notebook that you can run on a free-tier Google Colab instance to perform SFT on the English quotes dataset", "### Inputs and outputs\n\n\n* Input: Text string, such as a question, a prompt, or a document to be\nsummarized.\n* Output: Generated English-language text in response to the input, such\nas an answer to a question, or a summary of a document.\n\n\nModel Data\n----------\n\n\nData used for model training and how the data was processed.", "### Training Dataset\n\n\nThese models were trained on a dataset of text data that includes a wide variety\nof sources, totaling 6 trillion tokens. Here are the key components:\n\n\n* Web Documents: A diverse collection of web text ensures the model is exposed\nto a broad range of linguistic styles, topics, and vocabulary. Primarily\nEnglish-language content.\n* Code: Exposing the model to code helps it to learn the syntax and patterns of\nprogramming languages, which improves its ability to generate code or\nunderstand code-related questions.\n* Mathematics: Training on mathematical text helps the model learn logical\nreasoning, symbolic representation, and to address mathematical queries.\n\n\nThe combination of these diverse data sources is crucial for training a powerful\nlanguage model that can handle a wide variety of different tasks and text\nformats.", "### Data Preprocessing\n\n\nHere are the key data cleaning and filtering methods applied to the training\ndata:\n\n\n* CSAM Filtering: Rigorous CSAM (Child Sexual Abuse Material) filtering was\napplied at multiple stages in the data preparation process to ensure the\nexclusion of harmful and illegal content\n* Sensitive Data Filtering: As part of making Gemma pre-trained models safe and\nreliable, automated techniques were used to filter out certain personal\ninformation and other sensitive data from training sets.\n* Additional methods: Filtering based on content quality and safely in line with\nour policies.\n\n\nImplementation Information\n--------------------------\n\n\nDetails about the model internals.", "### Hardware\n\n\nGemma was trained using the latest generation of\nTensor Processing Unit (TPU) hardware (TPUv5e).\n\n\nTraining large language models requires significant computational power. TPUs,\ndesigned specifically for matrix operations common in machine learning, offer\nseveral advantages in this domain:\n\n\n* Performance: TPUs are specifically designed to handle the massive computations\ninvolved in training LLMs. They can speed up training considerably compared to\nCPUs.\n* Memory: TPUs often come with large amounts of high-bandwidth memory, allowing\nfor the handling of large models and batch sizes during training. This can\nlead to better model quality.\n* Scalability: TPU Pods (large clusters of TPUs) provide a scalable solution for\nhandling the growing complexity of large foundation models. 
You can distribute\ntraining across multiple TPU devices for faster and more efficient processing.\n* Cost-effectiveness: In many scenarios, TPUs can provide a more cost-effective\nsolution for training large models compared to CPU-based infrastructure,\nespecially when considering the time and resources saved due to faster\ntraining.\n* These advantages are aligned with\nGoogle's commitments to operate sustainably.", "### Software\n\n\nTraining was done using JAX and ML Pathways.\n\n\nJAX allows researchers to take advantage of the latest generation of hardware,\nincluding TPUs, for faster and more efficient training of large models.\n\n\nML Pathways is Google's latest effort to build artificially intelligent systems\ncapable of generalizing across multiple tasks. This is specially suitable for\nfoundation models, including large language models like\nthese ones.\n\n\nTogether, JAX and ML Pathways are used as described in the\npaper about the Gemini family of models; \"the 'single\ncontroller' programming model of Jax and Pathways allows a single Python\nprocess to orchestrate the entire training run, dramatically simplifying the\ndevelopment workflow.\"\n\n\nEvaluation\n----------\n\n\nModel evaluation metrics and results.", "### Benchmark Results\n\n\nThe pre-trained base models were evaluated against a large collection of different datasets and\nmetrics to cover different aspects of text generation:\n\n\n\nEthics and Safety\n-----------------\n\n\nEthics and safety evaluation approach and results.", "### Evaluation Approach\n\n\nOur evaluation methods include structured evaluations and internal red-teaming\ntesting of relevant content policies. Red-teaming was conducted by a number of\ndifferent teams, each with different goals and human evaluation metrics. These\nmodels were evaluated against a number of different categories relevant to\nethics and safety, including:\n\n\n* Text-to-Text Content Safety: Human evaluation on prompts covering safety\npolicies including child sexual abuse and exploitation, harassment, violence\nand gore, and hate speech.\n* Text-to-Text Representational Harms: Benchmark against relevant academic\ndatasets such as WinoBias and BBQ Dataset.\n* Memorization: Automated evaluation of memorization of training data, including\nthe risk of personally identifiable information exposure.\n* Large-scale harm: Tests for \"dangerous capabilities,\" such as chemical,\nbiological, radiological, and nuclear (CBRN) risks.", "### Evaluation Results\n\n\nThe results of ethics and safety evaluations are within acceptable thresholds\nfor meeting internal policies for categories such as child\nsafety, content safety, representational harms, memorization, large-scale harms.\nOn top of robust internal evaluations, the results of well known safety\nbenchmarks like BBQ, BOLD, Winogender, Winobias, RealToxicity, and TruthfulQA\nare shown here.", "#### Gemma 1.0", "#### Gemma 1.1\n\n\n\nUsage and Limitations\n---------------------\n\n\nThese models have certain limitations that users should be aware of.", "### Intended Usage\n\n\nOpen Large Language Models (LLMs) have a wide range of applications across\nvarious industries and domains. The following list of potential uses is not\ncomprehensive. 
The purpose of this list is to provide contextual information\nabout the possible use-cases that the model creators considered as part of model\ntraining and development.\n\n\n* Content Creation and Communication\n\t+ Text Generation: These models can be used to generate creative text formats\n\tsuch as poems, scripts, code, marketing copy, and email drafts.\n\t+ Chatbots and Conversational AI: Power conversational interfaces for customer\n\tservice, virtual assistants, or interactive applications.\n\t+ Text Summarization: Generate concise summaries of a text corpus, research\n\tpapers, or reports.\n* Research and Education\n\t+ Natural Language Processing (NLP) Research: These models can serve as a\n\tfoundation for researchers to experiment with NLP techniques, develop\n\talgorithms, and contribute to the advancement of the field.\n\t+ Language Learning Tools: Support interactive language learning experiences,\n\taiding in grammar correction or providing writing practice.\n\t+ Knowledge Exploration: Assist researchers in exploring large bodies of text\n\tby generating summaries or answering questions about specific topics.", "### Limitations\n\n\n* Training Data\n\t+ The quality and diversity of the training data significantly influence the\n\tmodel's capabilities. Biases or gaps in the training data can lead to\n\tlimitations in the model's responses.\n\t+ The scope of the training dataset determines the subject areas the model can\n\thandle effectively.\n* Context and Task Complexity\n\t+ LLMs are better at tasks that can be framed with clear prompts and\n\tinstructions. Open-ended or highly complex tasks might be challenging.\n\t+ A model's performance can be influenced by the amount of context provided\n\t(longer context generally leads to better outputs, up to a certain point).\n* Language Ambiguity and Nuance\n\t+ Natural language is inherently complex. LLMs might struggle to grasp subtle\n\tnuances, sarcasm, or figurative language.\n* Factual Accuracy\n\t+ LLMs generate responses based on information they learned from their\n\ttraining datasets, but they are not knowledge bases. They may generate\n\tincorrect or outdated factual statements.\n* Common Sense\n\t+ LLMs rely on statistical patterns in language. They might lack the ability\n\tto apply common sense reasoning in certain situations.", "### Ethical Considerations and Risks\n\n\nThe development of large language models (LLMs) raises several ethical concerns.\nIn creating an open model, we have carefully considered the following:\n\n\n* Bias and Fairness\n\t+ LLMs trained on large-scale, real-world text data can reflect socio-cultural\n\tbiases embedded in the training material. 
These models underwent careful\n\tscrutiny, input data pre-processing described and posterior evaluations\n\treported in this card.\n* Misinformation and Misuse\n\t+ LLMs can be misused to generate text that is false, misleading, or harmful.\n\t+ Guidelines are provided for responsible use with the model, see the\n\tResponsible Generative AI Toolkit.\n* Transparency and Accountability:\n\t+ This model card summarizes details on the models' architecture,\n\tcapabilities, limitations, and evaluation processes.\n\t+ A responsibly developed open model offers the opportunity to share\n\tinnovation by making LLM technology accessible to developers and researchers\n\tacross the AI ecosystem.\n\n\nRisks identified and mitigations:\n\n\n* Perpetuation of biases: It's encouraged to perform continuous monitoring\n(using evaluation metrics, human review) and the exploration of de-biasing\ntechniques during model training, fine-tuning, and other use cases.\n* Generation of harmful content: Mechanisms and guidelines for content safety\nare essential. Developers are encouraged to exercise caution and implement\nappropriate content safety safeguards based on their specific product policies\nand application use cases.\n* Misuse for malicious purposes: Technical limitations and developer and\nend-user education can help mitigate against malicious applications of LLMs.\nEducational resources and reporting mechanisms for users to flag misuse are\nprovided. Prohibited uses of Gemma models are outlined in the\nGemma Prohibited Use Policy.\n* Privacy violations: Models were trained on data filtered for removal of PII\n(Personally Identifiable Information). Developers are encouraged to adhere to\nprivacy regulations with privacy-preserving techniques.", "### Benefits\n\n\nAt the time of release, this family of models provides high-performance open\nlarge language model implementations designed from the ground up for Responsible\nAI development compared to similarly sized models.\n\n\nUsing the benchmark evaluation metrics described in this document, these models\nhave shown to provide superior performance to other, comparably-sized open model\nalternatives." ]
[ "TAGS\n#transformers #arxiv-2312.11805 #arxiv-2009.03300 #arxiv-1905.07830 #arxiv-1911.11641 #arxiv-1904.09728 #arxiv-1905.10044 #arxiv-1907.10641 #arxiv-1811.00937 #arxiv-1809.02789 #arxiv-1911.01547 #arxiv-1705.03551 #arxiv-2107.03374 #arxiv-2108.07732 #arxiv-2110.14168 #arxiv-2304.06364 #arxiv-2206.04615 #arxiv-1804.06876 #arxiv-2110.08193 #license-gemma #endpoints_compatible #region-us \n", "### Description\n\n\nGemma is a family of lightweight, state-of-the-art open models from Google,\nbuilt from the same research and technology used to create the Gemini models.\nThey are text-to-text, decoder-only large language models, available in English,\nwith open weights, pre-trained variants, and instruction-tuned variants. Gemma\nmodels are well-suited for a variety of text generation tasks, including\nquestion answering, summarization, and reasoning. Their relatively small size\nmakes it possible to deploy them in environments with limited resources such as\na laptop, desktop or your own cloud infrastructure, democratizing access to\nstate of the art AI models and helping foster innovation for everyone.", "### Usage\n\n\nBelow we share some code snippets on how to get quickly started with running the model. First make sure to 'pip install -U transformers', then copy the snippet from the section that is relevant for your usecase.", "#### Running the model on a CPU\n\n\nAs explained below, we recommend 'torch.bfloat16' as the default dtype. You can use a different precision if necessary.", "#### Running the model on a single / multi GPU", "#### Running the model on a GPU using different precisions\n\n\nThe native weights of this model were exported in 'bfloat16' precision. You can use 'float16', which may be faster on certain hardware, indicating the 'torch\\_dtype' when loading the model. For convenience, the 'float16' revision of the repo contains a copy of the weights already converted to that precision.\n\n\nYou can also use 'float32' if you skip the dtype, but no precision increase will occur (model weights will just be upcasted to 'float32'). See examples below.\n\n\n* *Using 'torch.float16'*\n* *Using 'torch.bfloat16'*\n* *Upcasting to 'torch.float32'*", "#### Quantized Versions through 'bitsandbytes'\n\n\n* *Using 8-bit precision (int8)*\n* *Using 4-bit precision*", "#### Other optimizations\n\n\n* *Flash Attention 2*\n\n\nFirst make sure to install 'flash-attn' in your environment 'pip install flash-attn'", "#### Running the model in JAX / Flax\n\n\nUse the 'flax' branch of the repository:\n\n\nCheck this notebook for a comprehensive walkthrough on how to parallelize JAX inference.", "### Chat Template\n\n\nThe instruction-tuned models use a chat template that must be adhered to for conversational use.\nThe easiest way to apply it is using the tokenizer's built-in chat template, as shown in the following snippet.\n\n\nLet's load the model and apply the chat template to a conversation. In this example, we'll start with a single user interaction:\n\n\nAt this point, the prompt contains the following text:\n\n\nAs you can see, each turn is preceded by a '<start\\_of\\_turn>' delimiter and then the role of the entity\n(either 'user', for content supplied by the user, or 'model' for LLM responses). 
Turns finish with\nthe '<end\\_of\\_turn>' token.\n\n\nYou can follow this format to build the prompt manually, if you need to do it without the tokenizer's\nchat template.\n\n\nAfter the prompt is ready, generation can be performed like this:", "### Fine-tuning\n\n\nYou can find some fine-tuning scripts under the 'examples/' directory of 'google/gemma-7b' repository. To adapt them to this model, simply change the model-id to 'google/gemma-1.1-2b-it'.\n\n\nWe provide:\n\n\n* A script to perform Supervised Fine-Tuning (SFT) on UltraChat dataset using QLoRA\n* A script to perform SFT using FSDP on TPU devices\n* A notebook that you can run on a free-tier Google Colab instance to perform SFT on the English quotes dataset", "### Inputs and outputs\n\n\n* Input: Text string, such as a question, a prompt, or a document to be\nsummarized.\n* Output: Generated English-language text in response to the input, such\nas an answer to a question, or a summary of a document.\n\n\nModel Data\n----------\n\n\nData used for model training and how the data was processed.", "### Training Dataset\n\n\nThese models were trained on a dataset of text data that includes a wide variety\nof sources, totaling 6 trillion tokens. Here are the key components:\n\n\n* Web Documents: A diverse collection of web text ensures the model is exposed\nto a broad range of linguistic styles, topics, and vocabulary. Primarily\nEnglish-language content.\n* Code: Exposing the model to code helps it to learn the syntax and patterns of\nprogramming languages, which improves its ability to generate code or\nunderstand code-related questions.\n* Mathematics: Training on mathematical text helps the model learn logical\nreasoning, symbolic representation, and to address mathematical queries.\n\n\nThe combination of these diverse data sources is crucial for training a powerful\nlanguage model that can handle a wide variety of different tasks and text\nformats.", "### Data Preprocessing\n\n\nHere are the key data cleaning and filtering methods applied to the training\ndata:\n\n\n* CSAM Filtering: Rigorous CSAM (Child Sexual Abuse Material) filtering was\napplied at multiple stages in the data preparation process to ensure the\nexclusion of harmful and illegal content\n* Sensitive Data Filtering: As part of making Gemma pre-trained models safe and\nreliable, automated techniques were used to filter out certain personal\ninformation and other sensitive data from training sets.\n* Additional methods: Filtering based on content quality and safely in line with\nour policies.\n\n\nImplementation Information\n--------------------------\n\n\nDetails about the model internals.", "### Hardware\n\n\nGemma was trained using the latest generation of\nTensor Processing Unit (TPU) hardware (TPUv5e).\n\n\nTraining large language models requires significant computational power. TPUs,\ndesigned specifically for matrix operations common in machine learning, offer\nseveral advantages in this domain:\n\n\n* Performance: TPUs are specifically designed to handle the massive computations\ninvolved in training LLMs. They can speed up training considerably compared to\nCPUs.\n* Memory: TPUs often come with large amounts of high-bandwidth memory, allowing\nfor the handling of large models and batch sizes during training. This can\nlead to better model quality.\n* Scalability: TPU Pods (large clusters of TPUs) provide a scalable solution for\nhandling the growing complexity of large foundation models. 
You can distribute\ntraining across multiple TPU devices for faster and more efficient processing.\n* Cost-effectiveness: In many scenarios, TPUs can provide a more cost-effective\nsolution for training large models compared to CPU-based infrastructure,\nespecially when considering the time and resources saved due to faster\ntraining.\n* These advantages are aligned with\nGoogle's commitments to operate sustainably.", "### Software\n\n\nTraining was done using JAX and ML Pathways.\n\n\nJAX allows researchers to take advantage of the latest generation of hardware,\nincluding TPUs, for faster and more efficient training of large models.\n\n\nML Pathways is Google's latest effort to build artificially intelligent systems\ncapable of generalizing across multiple tasks. This is specially suitable for\nfoundation models, including large language models like\nthese ones.\n\n\nTogether, JAX and ML Pathways are used as described in the\npaper about the Gemini family of models; \"the 'single\ncontroller' programming model of Jax and Pathways allows a single Python\nprocess to orchestrate the entire training run, dramatically simplifying the\ndevelopment workflow.\"\n\n\nEvaluation\n----------\n\n\nModel evaluation metrics and results.", "### Benchmark Results\n\n\nThe pre-trained base models were evaluated against a large collection of different datasets and\nmetrics to cover different aspects of text generation:\n\n\n\nEthics and Safety\n-----------------\n\n\nEthics and safety evaluation approach and results.", "### Evaluation Approach\n\n\nOur evaluation methods include structured evaluations and internal red-teaming\ntesting of relevant content policies. Red-teaming was conducted by a number of\ndifferent teams, each with different goals and human evaluation metrics. These\nmodels were evaluated against a number of different categories relevant to\nethics and safety, including:\n\n\n* Text-to-Text Content Safety: Human evaluation on prompts covering safety\npolicies including child sexual abuse and exploitation, harassment, violence\nand gore, and hate speech.\n* Text-to-Text Representational Harms: Benchmark against relevant academic\ndatasets such as WinoBias and BBQ Dataset.\n* Memorization: Automated evaluation of memorization of training data, including\nthe risk of personally identifiable information exposure.\n* Large-scale harm: Tests for \"dangerous capabilities,\" such as chemical,\nbiological, radiological, and nuclear (CBRN) risks.", "### Evaluation Results\n\n\nThe results of ethics and safety evaluations are within acceptable thresholds\nfor meeting internal policies for categories such as child\nsafety, content safety, representational harms, memorization, large-scale harms.\nOn top of robust internal evaluations, the results of well known safety\nbenchmarks like BBQ, BOLD, Winogender, Winobias, RealToxicity, and TruthfulQA\nare shown here.", "#### Gemma 1.0", "#### Gemma 1.1\n\n\n\nUsage and Limitations\n---------------------\n\n\nThese models have certain limitations that users should be aware of.", "### Intended Usage\n\n\nOpen Large Language Models (LLMs) have a wide range of applications across\nvarious industries and domains. The following list of potential uses is not\ncomprehensive. 
The purpose of this list is to provide contextual information\nabout the possible use-cases that the model creators considered as part of model\ntraining and development.\n\n\n* Content Creation and Communication\n\t+ Text Generation: These models can be used to generate creative text formats\n\tsuch as poems, scripts, code, marketing copy, and email drafts.\n\t+ Chatbots and Conversational AI: Power conversational interfaces for customer\n\tservice, virtual assistants, or interactive applications.\n\t+ Text Summarization: Generate concise summaries of a text corpus, research\n\tpapers, or reports.\n* Research and Education\n\t+ Natural Language Processing (NLP) Research: These models can serve as a\n\tfoundation for researchers to experiment with NLP techniques, develop\n\talgorithms, and contribute to the advancement of the field.\n\t+ Language Learning Tools: Support interactive language learning experiences,\n\taiding in grammar correction or providing writing practice.\n\t+ Knowledge Exploration: Assist researchers in exploring large bodies of text\n\tby generating summaries or answering questions about specific topics.", "### Limitations\n\n\n* Training Data\n\t+ The quality and diversity of the training data significantly influence the\n\tmodel's capabilities. Biases or gaps in the training data can lead to\n\tlimitations in the model's responses.\n\t+ The scope of the training dataset determines the subject areas the model can\n\thandle effectively.\n* Context and Task Complexity\n\t+ LLMs are better at tasks that can be framed with clear prompts and\n\tinstructions. Open-ended or highly complex tasks might be challenging.\n\t+ A model's performance can be influenced by the amount of context provided\n\t(longer context generally leads to better outputs, up to a certain point).\n* Language Ambiguity and Nuance\n\t+ Natural language is inherently complex. LLMs might struggle to grasp subtle\n\tnuances, sarcasm, or figurative language.\n* Factual Accuracy\n\t+ LLMs generate responses based on information they learned from their\n\ttraining datasets, but they are not knowledge bases. They may generate\n\tincorrect or outdated factual statements.\n* Common Sense\n\t+ LLMs rely on statistical patterns in language. They might lack the ability\n\tto apply common sense reasoning in certain situations.", "### Ethical Considerations and Risks\n\n\nThe development of large language models (LLMs) raises several ethical concerns.\nIn creating an open model, we have carefully considered the following:\n\n\n* Bias and Fairness\n\t+ LLMs trained on large-scale, real-world text data can reflect socio-cultural\n\tbiases embedded in the training material. 
These models underwent careful\n\tscrutiny, input data pre-processing described and posterior evaluations\n\treported in this card.\n* Misinformation and Misuse\n\t+ LLMs can be misused to generate text that is false, misleading, or harmful.\n\t+ Guidelines are provided for responsible use with the model, see the\n\tResponsible Generative AI Toolkit.\n* Transparency and Accountability:\n\t+ This model card summarizes details on the models' architecture,\n\tcapabilities, limitations, and evaluation processes.\n\t+ A responsibly developed open model offers the opportunity to share\n\tinnovation by making LLM technology accessible to developers and researchers\n\tacross the AI ecosystem.\n\n\nRisks identified and mitigations:\n\n\n* Perpetuation of biases: It's encouraged to perform continuous monitoring\n(using evaluation metrics, human review) and the exploration of de-biasing\ntechniques during model training, fine-tuning, and other use cases.\n* Generation of harmful content: Mechanisms and guidelines for content safety\nare essential. Developers are encouraged to exercise caution and implement\nappropriate content safety safeguards based on their specific product policies\nand application use cases.\n* Misuse for malicious purposes: Technical limitations and developer and\nend-user education can help mitigate against malicious applications of LLMs.\nEducational resources and reporting mechanisms for users to flag misuse are\nprovided. Prohibited uses of Gemma models are outlined in the\nGemma Prohibited Use Policy.\n* Privacy violations: Models were trained on data filtered for removal of PII\n(Personally Identifiable Information). Developers are encouraged to adhere to\nprivacy regulations with privacy-preserving techniques.", "### Benefits\n\n\nAt the time of release, this family of models provides high-performance open\nlarge language model implementations designed from the ground up for Responsible\nAI development compared to similarly sized models.\n\n\nUsing the benchmark evaluation metrics described in this document, these models\nhave shown to provide superior performance to other, comparably-sized open model\nalternatives." ]
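The Gemma record above describes the `<start_of_turn>` / `<end_of_turn>` chat format, recommends `torch.bfloat16` as the default dtype, and names `google/gemma-1.1-2b-it` as the model id. A minimal sketch of that flow, assuming a recent `transformers` release and that the checkpoint's tokenizer carries the built-in chat template the card mentions:

```python
# Sketch: applying the Gemma chat template described in the record above.
# Assumes google/gemma-1.1-2b-it and a recent transformers/accelerate install.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "google/gemma-1.1-2b-it"  # model id named in the record above
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # the card recommends bfloat16 as the default dtype
    device_map="auto",
)

# One user turn; the tokenizer's chat template inserts the
# <start_of_turn>/<end_of_turn> delimiters described in the card.
chat = [{"role": "user", "content": "Write a haiku about TPUs."}]
inputs = tokenizer.apply_chat_template(
    chat, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

The quantized (`bitsandbytes`) and Flash Attention variants the card lists follow the same pattern; only the `from_pretrained` arguments change.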
question-answering
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
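The card above leaves its quick-start section as "[More Information Needed]". Going only by this record's metadata (a `bert` question-answering checkpoint served through `transformers`, id `NeginShams/albert-Quran_QA` below), a hedged sketch of trying it with the standard extractive-QA pipeline would look like the following; the question and context strings are placeholders:

```python
# Minimal sketch: extractive QA with the transformers pipeline.
# Assumes NeginShams/albert-Quran_QA loads as a standard QA checkpoint.
from transformers import pipeline

qa = pipeline("question-answering", model="NeginShams/albert-Quran_QA")

result = qa(
    question="Which surah is the longest?",                    # placeholder question
    context="Al-Baqarah is the longest surah of the Quran.",   # placeholder context
)
print(result["answer"], result["score"])
```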
{"library_name": "transformers", "tags": []}
NeginShams/albert-Quran_QA
null
[ "transformers", "safetensors", "bert", "question-answering", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
null
2024-04-27T11:35:42+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #bert #question-answering #arxiv-1910.09700 #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #bert #question-answering #arxiv-1910.09700 #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
text-generation
transformers
# Mistral-child-1-1 Mistral-child-1-1 is a merge of the following models using [mergekit](https://github.com/cg123/mergekit): * [HuggingFaceH4/zephyr-7b-beta](https://huggingface.co/HuggingFaceH4/zephyr-7b-beta) * [meta-math/MetaMath-Mistral-7B](https://huggingface.co/meta-math/MetaMath-Mistral-7B) ## 🧩 Configuration ```yaml models: - model: mistralai/Mistral-7B-v0.1 # no parameters necessary for base model - model: HuggingFaceH4/zephyr-7b-beta parameters: density: 0.5 weight: 0.5 - model: meta-math/MetaMath-Mistral-7B parameters: density: 0.5 weight: 0.5 merge_method: ties base_model: mistralai/Mistral-7B-v0.1 parameters: normalize: true dtype: float16 ```
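The YAML above is the mergekit recipe; per this record, the resulting checkpoint is published as `PotatoB/Mistral-child-1-1`. A hedged sketch of loading the merged model for plain generation with `transformers`, using the `float16` dtype named in the config (the prompt is a placeholder):

```python
# Sketch: loading the merged checkpoint produced by the config above.
# Assumes the repo id PotatoB/Mistral-child-1-1 from this record.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "PotatoB/Mistral-child-1-1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"  # config merges in float16
)

prompt = "Solve step by step: what is 12 * 17?"  # placeholder prompt
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```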
{"license": "apache-2.0", "tags": ["merge", "mergekit", "lazymergekit", "HuggingFaceH4/zephyr-7b-beta", "meta-math/MetaMath-Mistral-7B"]}
PotatoB/Mistral-child-1-1
null
[ "transformers", "safetensors", "mistral", "text-generation", "merge", "mergekit", "lazymergekit", "HuggingFaceH4/zephyr-7b-beta", "meta-math/MetaMath-Mistral-7B", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2024-04-27T11:36:44+00:00
[]
[]
TAGS #transformers #safetensors #mistral #text-generation #merge #mergekit #lazymergekit #HuggingFaceH4/zephyr-7b-beta #meta-math/MetaMath-Mistral-7B #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# Mistral-child-1-1 Mistral-child-1-1 is a merge of the following models using mergekit: * HuggingFaceH4/zephyr-7b-beta * meta-math/MetaMath-Mistral-7B ## Configuration
[ "# Mistral-child-1-1\n\nMistral-child-1-1 is a merge of the following models using mergekit:\n* HuggingFaceH4/zephyr-7b-beta\n* meta-math/MetaMath-Mistral-7B", "## Configuration" ]
[ "TAGS\n#transformers #safetensors #mistral #text-generation #merge #mergekit #lazymergekit #HuggingFaceH4/zephyr-7b-beta #meta-math/MetaMath-Mistral-7B #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# Mistral-child-1-1\n\nMistral-child-1-1 is a merge of the following models using mergekit:\n* HuggingFaceH4/zephyr-7b-beta\n* meta-math/MetaMath-Mistral-7B", "## Configuration" ]
null
null
EXL2 quants of [Qwen1.5 110B Chat](https://huggingface.co/Qwen/Qwen1.5-110B-Chat) [2.50 bits per weight](https://huggingface.co/turboderp/Qwen1.5-110B-Chat-exl2/tree/2.5bpw) [3.00 bits per weight](https://huggingface.co/turboderp/Qwen1.5-110B-Chat-exl2/tree/3.0bpw) [3.50 bits per weight](https://huggingface.co/turboderp/Qwen1.5-110B-Chat-exl2/tree/3.5bpw) [4.00 bits per weight](https://huggingface.co/turboderp/Qwen1.5-110B-Chat-exl2/tree/4.0bpw) [4.50 bits per weight](https://huggingface.co/turboderp/Qwen1.5-110B-Chat-exl2/tree/4.5bpw) (More sizes coming.) [measurement.json](https://huggingface.co/turboderp/Qwen1.5-110B-Chat-exl2/blob/main/measurement.json)
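Each quantization above lives on its own branch of the repository (the `tree/2.5bpw` through `tree/4.5bpw` links). A hedged sketch of pulling just one bits-per-weight revision locally with `huggingface_hub`, for an ExLlamaV2-based loader to consume afterwards:

```python
# Sketch: download a single bits-per-weight branch of the EXL2 repo.
# The revision names mirror the branch links listed above.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="turboderp/Qwen1.5-110B-Chat-exl2",
    revision="4.0bpw",                            # pick the branch that fits your VRAM budget
    local_dir="Qwen1.5-110B-Chat-exl2-4.0bpw",
)
print("weights downloaded to", local_dir)
```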
{}
turboderp/Qwen1.5-110B-Chat-exl2
null
[ "region:us" ]
null
2024-04-27T11:37:29+00:00
[]
[]
TAGS #region-us
EXL2 quants of Qwen1.5 110B Chat 2.50 bits per weight 3.00 bits per weight 3.50 bits per weight 4.00 bits per weight 4.50 bits per weight (More sizes coming.) URL
[]
[ "TAGS\n#region-us \n" ]
feature-extraction
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
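This record's card is the empty auto-generated template, but its metadata (a `gpt2` checkpoint tagged `feature-extraction`, Russian, trained on `tay-yozhik/SyntheticTexts`, id `v-urushkin/SyntheticGPT2-small` below) suggests it is meant to produce text embeddings. A hedged sketch that mean-pools the last hidden state — one common pooling choice, not necessarily the author's intended one:

```python
# Sketch: sentence embeddings from the GPT-2 checkpoint via feature extraction.
# Assumes v-urushkin/SyntheticGPT2-small loads with the standard GPT-2 classes.
import torch
from transformers import AutoModel, AutoTokenizer

model_id = "v-urushkin/SyntheticGPT2-small"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

text = "Это синтетический текст на русском языке."  # placeholder Russian sentence
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    hidden = model(**inputs).last_hidden_state       # (1, seq_len, hidden_size)

embedding = hidden.mean(dim=1)                       # simple mean pooling over tokens
print(embedding.shape)
```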
{"language": ["ru"], "library_name": "transformers", "datasets": ["tay-yozhik/SyntheticTexts"]}
v-urushkin/SyntheticGPT2-small
null
[ "transformers", "safetensors", "gpt2", "feature-extraction", "ru", "dataset:tay-yozhik/SyntheticTexts", "arxiv:1910.09700", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2024-04-27T11:37:39+00:00
[ "1910.09700" ]
[ "ru" ]
TAGS #transformers #safetensors #gpt2 #feature-extraction #ru #dataset-tay-yozhik/SyntheticTexts #arxiv-1910.09700 #endpoints_compatible #text-generation-inference #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #gpt2 #feature-extraction #ru #dataset-tay-yozhik/SyntheticTexts #arxiv-1910.09700 #endpoints_compatible #text-generation-inference #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
null
null
What is Arthricore Tablet? Arthricore Price stands as a premium-quality capsule made to help manage hypertension and improve heart health. Its advanced formula blends a synergistic mix of herbs, vitamins, and minerals, carefully chosen to address the main causes of high blood pressure. Official website:<a href="https://www.nutritionsee.com/artyhiindos">www.Arthricore.com</a> <p><a href="https://www.nutritionsee.com/artyhiindos"> <img src="https://www.nutritionsee.com/wp-content/uploads/2024/04/Arthricore-Indonesia.png" alt="enter image description here"> </a></p> <a href="https://www.nutritionsee.com/artyhiindos">Buy now!! Click the link below for more information and get a 50% discount now... Hurry</a> Official website:<a href="https://www.nutritionsee.com/artyhiindos">www.Arthricore.com</a>
{"license": "apache-2.0"}
ArthricoreIndonesia/Arthricore
null
[ "license:apache-2.0", "region:us" ]
null
2024-04-27T11:39:29+00:00
[]
[]
TAGS #license-apache-2.0 #region-us
What is Arthricore Tablet? Arthricore Price stands as a premium-quality capsule made to help manage hypertension and improve heart health. Its advanced formula blends a synergistic mix of herbs, vitamins, and minerals, carefully chosen to address the main causes of high blood pressure. Official website:<a href="URL <p><a href="URL <img src="URL alt="enter image description here"> </a></p> <a href="URL now!! Click the link below for more information and get a 50% discount now... Hurry</a> Official website:<a href="URL
[]
[ "TAGS\n#license-apache-2.0 #region-us \n" ]
question-answering
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
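As with the albert variant earlier in this dump, the quick-start section of the card is empty; the record's metadata again points to a `bert` question-answering head, published as `NeginShams/parsbert-Quran_QA` below. A hedged lower-level sketch that skips the pipeline wrapper and reads the start/end logits directly (question and context are Persian placeholders):

```python
# Sketch: extractive QA without the pipeline wrapper.
# Assumes NeginShams/parsbert-Quran_QA exposes a standard QA head.
import torch
from transformers import AutoModelForQuestionAnswering, AutoTokenizer

model_id = "NeginShams/parsbert-Quran_QA"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForQuestionAnswering.from_pretrained(model_id)

question = "چند سوره در قرآن وجود دارد؟"   # placeholder question (Persian)
context = "قرآن دارای ۱۱۴ سوره است."        # placeholder context (Persian)
inputs = tokenizer(question, context, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

start = outputs.start_logits.argmax()       # most likely answer start token
end = outputs.end_logits.argmax()           # most likely answer end token
answer_ids = inputs["input_ids"][0][start : end + 1]
print(tokenizer.decode(answer_ids, skip_special_tokens=True))
```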
{"library_name": "transformers", "tags": []}
NeginShams/parsbert-Quran_QA
null
[ "transformers", "safetensors", "bert", "question-answering", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
null
2024-04-27T11:41:56+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #bert #question-answering #arxiv-1910.09700 #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #bert #question-answering #arxiv-1910.09700 #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
image-to-image
diffusers
The recommended version of `diffusers` is `0.20.2` with `torch` `2`.

Usage Example:
```python
import torch
import requests
from PIL import Image
from diffusers import DiffusionPipeline, EulerAncestralDiscreteScheduler

# Load the pipeline
pipeline = DiffusionPipeline.from_pretrained(
    "S1T4L/Zero123pp_custom",
    custom_pipeline="S1T4L/Zero123pp_custom_pipeline",
    torch_dtype=torch.float16
)

# Feel free to tune the scheduler
pipeline.scheduler = EulerAncestralDiscreteScheduler.from_config(
    pipeline.scheduler.config, timestep_spacing='trailing'
)
pipeline.to('cuda:0')

# Run the pipeline
cond = Image.open(requests.get("https://d.skis.ltd/nrp/sample-data/lysol.png", stream=True).raw)
result = pipeline(cond).images[0]
result.show()
result.save("output.png")
```
{"license": "openrail", "library_name": "diffusers", "tags": ["art"], "datasets": ["allenai/objaverse"], "pipeline_tag": "image-to-image"}
S1T4L/Zero123pp_custom
null
[ "diffusers", "art", "image-to-image", "dataset:allenai/objaverse", "license:openrail", "diffusers:Zero123PlusPipeline", "region:us" ]
null
2024-04-27T11:42:02+00:00
[]
[]
TAGS #diffusers #art #image-to-image #dataset-allenai/objaverse #license-openrail #diffusers-Zero123PlusPipeline #region-us
The recommended version of 'diffusers' is '0.20.2' with 'torch' '2'. Usage Example:
[]
[ "TAGS\n#diffusers #art #image-to-image #dataset-allenai/objaverse #license-openrail #diffusers-Zero123PlusPipeline #region-us \n" ]
text-generation
transformers
# merge This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit). ## Merge Details ### Merge Method This model was merged using the SLERP merge method. ### Models Merged The following models were included in the merge: * [TeeZee/DarkSapling-7B-v2.0](https://huggingface.co/TeeZee/DarkSapling-7B-v2.0) * [MaziyarPanahi/bagel-dpo-7b-v0.1-Mistral-7B-Instruct-v0.2-slerp](https://huggingface.co/MaziyarPanahi/bagel-dpo-7b-v0.1-Mistral-7B-Instruct-v0.2-slerp) ### Configuration The following YAML configuration was used to produce this model: ```yaml slices: - sources: - model: MaziyarPanahi/bagel-dpo-7b-v0.1-Mistral-7B-Instruct-v0.2-slerp layer_range: [0, 32] - model: TeeZee/DarkSapling-7B-v2.0 layer_range: [0, 32] merge_method: slerp base_model: MaziyarPanahi/bagel-dpo-7b-v0.1-Mistral-7B-Instruct-v0.2-slerp parameters: t: - filter: self_attn value: [0, 0.5, 0.3, 0.7, 1] - filter: mlp value: [1, 0.5, 0.7, 0.3, 0] - value: 0.5 dtype: bfloat16 ```
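The merged checkpoint can be used like any other Mistral-style instruct model. A minimal inference sketch is shown below; the repository id is the one this merge is published under, the bfloat16 dtype mirrors the merge configuration above, and the chat-template call and sampling settings are illustrative assumptions rather than settings documented by this card.

```python
# Minimal, illustrative usage of the merged model with transformers.
# Requires transformers, torch, and accelerate (for device_map="auto").
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "DavidAU/D_AU-Mistral-7B-Instruct-v0.2-Bagel-DarkSapling-DPO-7B-v2.0"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype used for the merge
    device_map="auto",
)

messages = [{"role": "user", "content": "Explain what a SLERP merge of two language models does."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```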
{"library_name": "transformers", "tags": ["mergekit", "merge"], "base_model": ["TeeZee/DarkSapling-7B-v2.0", "MaziyarPanahi/bagel-dpo-7b-v0.1-Mistral-7B-Instruct-v0.2-slerp"]}
DavidAU/D_AU-Mistral-7B-Instruct-v0.2-Bagel-DarkSapling-DPO-7B-v2.0
null
[ "transformers", "safetensors", "mistral", "text-generation", "mergekit", "merge", "conversational", "base_model:TeeZee/DarkSapling-7B-v2.0", "base_model:MaziyarPanahi/bagel-dpo-7b-v0.1-Mistral-7B-Instruct-v0.2-slerp", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2024-04-27T11:43:03+00:00
[]
[]
TAGS #transformers #safetensors #mistral #text-generation #mergekit #merge #conversational #base_model-TeeZee/DarkSapling-7B-v2.0 #base_model-MaziyarPanahi/bagel-dpo-7b-v0.1-Mistral-7B-Instruct-v0.2-slerp #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# merge This is a merge of pre-trained language models created using mergekit. ## Merge Details ### Merge Method This model was merged using the SLERP merge method. ### Models Merged The following models were included in the merge: * TeeZee/DarkSapling-7B-v2.0 * MaziyarPanahi/bagel-dpo-7b-v0.1-Mistral-7B-Instruct-v0.2-slerp ### Configuration The following YAML configuration was used to produce this model:
[ "# merge\n\nThis is a merge of pre-trained language models created using mergekit.", "## Merge Details", "### Merge Method\n\nThis model was merged using the SLERP merge method.", "### Models Merged\n\nThe following models were included in the merge:\n* TeeZee/DarkSapling-7B-v2.0\n* MaziyarPanahi/bagel-dpo-7b-v0.1-Mistral-7B-Instruct-v0.2-slerp", "### Configuration\n\nThe following YAML configuration was used to produce this model:" ]
[ "TAGS\n#transformers #safetensors #mistral #text-generation #mergekit #merge #conversational #base_model-TeeZee/DarkSapling-7B-v2.0 #base_model-MaziyarPanahi/bagel-dpo-7b-v0.1-Mistral-7B-Instruct-v0.2-slerp #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# merge\n\nThis is a merge of pre-trained language models created using mergekit.", "## Merge Details", "### Merge Method\n\nThis model was merged using the SLERP merge method.", "### Models Merged\n\nThe following models were included in the merge:\n* TeeZee/DarkSapling-7B-v2.0\n* MaziyarPanahi/bagel-dpo-7b-v0.1-Mistral-7B-Instruct-v0.2-slerp", "### Configuration\n\nThe following YAML configuration was used to produce this model:" ]
text-generation
transformers
# stablelm-2-zephyr-1.6b-taskarith1 stablelm-2-zephyr-1.6b-taskarith1 is a merge of the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing): * [aipib/stablelm-2-zephyr-1.6b-slerpx9](https://huggingface.co/aipib/stablelm-2-zephyr-1.6b-slerpx9) * [stabilityai/stablelm-2-zephyr-1_6b](https://huggingface.co/stabilityai/stablelm-2-zephyr-1_6b) ## 🧩 Configuration ```yaml models: - model: aipib/stablelm-2-zephyr-1.6b-slerpx9 parameters: weight: 0.4 - model: stabilityai/stablelm-2-zephyr-1_6b parameters: weight: 0.4 merge_method: task_arithmetic base_model: aipib/stablelm-2-zephyr-1.6b-slerpx9 parameters: int8_mask: true dtype: bfloat16 ``` ## 💻 Usage ```python !pip install -qU transformers accelerate from transformers import AutoTokenizer import transformers import torch model = "aipib/stablelm-2-zephyr-1.6b-taskarith1" messages = [{"role": "user", "content": "What is a large language model?"}] tokenizer = AutoTokenizer.from_pretrained(model) prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True) pipeline = transformers.pipeline( "text-generation", model=model, torch_dtype=torch.float16, device_map="auto", ) outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95) print(outputs[0]["generated_text"]) ```
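For readers unfamiliar with `task_arithmetic`, the snippet below sketches what the configuration above computes on a single weight tensor, using the usual task-vector formulation merged = base + sum_i w_i * (model_i - base). It is an illustration only, not mergekit's actual implementation, and it ignores details such as the `int8_mask` option; note that because the base model is also listed as an input here, its own task vector is zero.

```python
# Task arithmetic on one tensor, purely for illustration.
import torch

torch.manual_seed(0)
base    = torch.randn(3, 3)  # stands in for aipib/stablelm-2-zephyr-1.6b-slerpx9 (also the base model)
model_b = torch.randn(3, 3)  # stands in for stabilityai/stablelm-2-zephyr-1_6b

task_a = base - base         # the base model's own delta is zero
task_b = model_b - base

merged = base + 0.4 * task_a + 0.4 * task_b  # weights taken from the YAML above
print(merged)
```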
{"tags": ["merge", "mergekit", "lazymergekit", "aipib/stablelm-2-zephyr-1.6b-slerpx9", "stabilityai/stablelm-2-zephyr-1_6b"], "base_model": ["aipib/stablelm-2-zephyr-1.6b-slerpx9", "stabilityai/stablelm-2-zephyr-1_6b"]}
aipib/stablelm-2-zephyr-1.6b-taskarith1
null
[ "transformers", "safetensors", "stablelm", "text-generation", "merge", "mergekit", "lazymergekit", "aipib/stablelm-2-zephyr-1.6b-slerpx9", "stabilityai/stablelm-2-zephyr-1_6b", "conversational", "base_model:aipib/stablelm-2-zephyr-1.6b-slerpx9", "base_model:stabilityai/stablelm-2-zephyr-1_6b", "autotrain_compatible", "endpoints_compatible", "region:us" ]
null
2024-04-27T11:47:00+00:00
[]
[]
TAGS #transformers #safetensors #stablelm #text-generation #merge #mergekit #lazymergekit #aipib/stablelm-2-zephyr-1.6b-slerpx9 #stabilityai/stablelm-2-zephyr-1_6b #conversational #base_model-aipib/stablelm-2-zephyr-1.6b-slerpx9 #base_model-stabilityai/stablelm-2-zephyr-1_6b #autotrain_compatible #endpoints_compatible #region-us
# stablelm-2-zephyr-1.6b-taskarith1 stablelm-2-zephyr-1.6b-taskarith1 is a merge of the following models using LazyMergekit: * aipib/stablelm-2-zephyr-1.6b-slerpx9 * stabilityai/stablelm-2-zephyr-1_6b ## Configuration ## Usage
[ "# stablelm-2-zephyr-1.6b-taskarith1\n\nstablelm-2-zephyr-1.6b-taskarith1 is a merge of the following models using LazyMergekit:\n* aipib/stablelm-2-zephyr-1.6b-slerpx9\n* stabilityai/stablelm-2-zephyr-1_6b", "## Configuration", "## Usage" ]
[ "TAGS\n#transformers #safetensors #stablelm #text-generation #merge #mergekit #lazymergekit #aipib/stablelm-2-zephyr-1.6b-slerpx9 #stabilityai/stablelm-2-zephyr-1_6b #conversational #base_model-aipib/stablelm-2-zephyr-1.6b-slerpx9 #base_model-stabilityai/stablelm-2-zephyr-1_6b #autotrain_compatible #endpoints_compatible #region-us \n", "# stablelm-2-zephyr-1.6b-taskarith1\n\nstablelm-2-zephyr-1.6b-taskarith1 is a merge of the following models using LazyMergekit:\n* aipib/stablelm-2-zephyr-1.6b-slerpx9\n* stabilityai/stablelm-2-zephyr-1_6b", "## Configuration", "## Usage" ]
question-answering
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
NeginShams/xlm-roberta-Quran_QA
null
[ "transformers", "safetensors", "xlm-roberta", "question-answering", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
null
2024-04-27T11:47:02+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #xlm-roberta #question-answering #arxiv-1910.09700 #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #xlm-roberta #question-answering #arxiv-1910.09700 #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
text-generation
transformers
# Phi-3 MoE mini 4k instruct raw

This is an 8x MoE version of [microsoft/Phi-3-mini-4k-instruct](https://huggingface.co/microsoft/Phi-3-mini-4k-instruct).
It is based on the Llamafied version [vonjack/Phi-3-mini-4k-instruct-LLaMAfied](https://huggingface.co/vonjack/Phi-3-mini-4k-instruct-LLaMAfied) by
[Gan Feng](https://huggingface.co/vonjack).

It was created with the help of [mergekit](https://github.com/arcee-ai/mergekit) with this
[configuration](https://huggingface.co/PhilipMay/Phi-3-MoE-mini-4k-instruct-raw/blob/main/mergekit_moe_config.yml) and this command:

```bash
TODO
```

As the router was initialized randomly during merging, this is a raw model.
It should be trained before it can be used.

## Licensing

Copyright (c) 2024 [Philip May](https://philipmay.org)\
Copyright (c) [Gan Feng](https://huggingface.co/vonjack)\
Copyright (c) Microsoft Corporation

Licensed under the **MIT License** (the "License"); you may not use this file except in compliance with the License.
You may obtain a copy of the License by reviewing the file
[LICENSE](https://huggingface.co/PhilipMay/Phi-3-MoE-mini-4k-instruct-raw/blob/main/LICENSE) in the repository.
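Since the exact merge command is still marked TODO above, the sketch below only shows how the raw checkpoint can be loaded and inspected with standard transformers APIs before any training; it is generic usage, not the author's merge or training recipe.

```python
# Load the raw MoE checkpoint for inspection or as a starting point for training.
# The repository is tagged as a Mixtral-architecture model, so the standard
# causal-LM auto classes apply.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "PhilipMay/Phi-3-MoE-mini-4k-instruct-raw"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto")

# The router weights are randomly initialized, so generations are not meaningful yet;
# check the expert layout in the config before starting a training run.
print(model.config.num_local_experts, model.config.num_experts_per_tok)
```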
{"license": "mit"}
PhilipMay/Phi-3-MoE-mini-4k-instruct-raw
null
[ "transformers", "safetensors", "mixtral", "text-generation", "conversational", "license:mit", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2024-04-27T11:47:58+00:00
[]
[]
TAGS #transformers #safetensors #mixtral #text-generation #conversational #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# Phi-3 MoE mini 4k instruct raw

This is an 8x MoE version of microsoft/Phi-3-mini-4k-instruct.
It is based on the Llamafied version vonjack/Phi-3-mini-4k-instruct-LLaMAfied by
Gan Feng.

It was created with the help of mergekit with this
configuration and this command:



As the router was initialized randomly during merging, this is a raw model.
It should be trained before it can be used.

## Licensing

Copyright (c) 2024 Philip May\
Copyright (c) Gan Feng\
Copyright (c) Microsoft Corporation

Licensed under the MIT License (the "License"); you may not use this file except in compliance with the License.
You may obtain a copy of the License by reviewing the file
LICENSE in the repository.
[ "# Phi-3 MoE mini 4k instruct raw\n\nThe is a 8x MoE version of microsoft/Phi-3-mini-4k-instruct.\nIt is based on the Llamafied version vonjack/Phi-3-mini-4k-instruct-LLaMAfied of \nGan Feng.\n\nIt was created with the help of mergekit with this\nconfiguration and this command:\n\n\n\nAs the router was initialized randomly during merging, this is a raw model.\nIt should be trained before it can be used.", "## Licensing\n\nCopyright (c) 2024 Philip May\\\nCopyright (c) Gan Feng\\\nCopyright (c) Microsoft Corporation\n\nLicensed under the MIT License (the \"License\"); you may not use this file except in compliance with the License.\nYou may obtain a copy of the License by reviewing the file\nLICENSE in the repository." ]
[ "TAGS\n#transformers #safetensors #mixtral #text-generation #conversational #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# Phi-3 MoE mini 4k instruct raw\n\nThe is a 8x MoE version of microsoft/Phi-3-mini-4k-instruct.\nIt is based on the Llamafied version vonjack/Phi-3-mini-4k-instruct-LLaMAfied of \nGan Feng.\n\nIt was created with the help of mergekit with this\nconfiguration and this command:\n\n\n\nAs the router was initialized randomly during merging, this is a raw model.\nIt should be trained before it can be used.", "## Licensing\n\nCopyright (c) 2024 Philip May\\\nCopyright (c) Gan Feng\\\nCopyright (c) Microsoft Corporation\n\nLicensed under the MIT License (the \"License\"); you may not use this file except in compliance with the License.\nYou may obtain a copy of the License by reviewing the file\nLICENSE in the repository." ]
null
transformers
# DavidAU/D_AU-Mistral-7B-Instruct-v0.2-Bagel-DarkSapling-DPO-7B-v2.0-Q8_0-GGUF
This model was converted to GGUF format from [`DavidAU/D_AU-Mistral-7B-Instruct-v0.2-Bagel-DarkSapling-DPO-7B-v2.0`](https://huggingface.co/DavidAU/D_AU-Mistral-7B-Instruct-v0.2-Bagel-DarkSapling-DPO-7B-v2.0) using llama.cpp via ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space.
Refer to the [original model card](https://huggingface.co/DavidAU/D_AU-Mistral-7B-Instruct-v0.2-Bagel-DarkSapling-DPO-7B-v2.0) for more details on the model.
## Use with llama.cpp

Install llama.cpp through brew.

```bash
brew install ggerganov/ggerganov/llama.cpp
```
Invoke the llama.cpp server or the CLI.

CLI:

```bash
llama-cli --hf-repo DavidAU/D_AU-Mistral-7B-Instruct-v0.2-Bagel-DarkSapling-DPO-7B-v2.0-Q8_0-GGUF --model d_au-mistral-7b-instruct-v0.2-bagel-darksapling-dpo-7b-v2.0.Q8_0.gguf -p "The meaning to life and the universe is"
```

Server:

```bash
llama-server --hf-repo DavidAU/D_AU-Mistral-7B-Instruct-v0.2-Bagel-DarkSapling-DPO-7B-v2.0-Q8_0-GGUF --model d_au-mistral-7b-instruct-v0.2-bagel-darksapling-dpo-7b-v2.0.Q8_0.gguf -c 2048
```

Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the llama.cpp repo.

```
git clone https://github.com/ggerganov/llama.cpp && cd llama.cpp && make && ./main -m d_au-mistral-7b-instruct-v0.2-bagel-darksapling-dpo-7b-v2.0.Q8_0.gguf -n 128
```
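If you prefer Python bindings over the llama.cpp CLI, the same Q8_0 GGUF file can also be run through llama-cpp-python; the sketch below assumes the file has already been downloaded locally under the name used in the commands above, and the sampling settings are illustrative.

```python
# Run the Q8_0 GGUF file with the llama-cpp-python bindings.
# Install first: pip install llama-cpp-python
from llama_cpp import Llama

llm = Llama(
    model_path="d_au-mistral-7b-instruct-v0.2-bagel-darksapling-dpo-7b-v2.0.Q8_0.gguf",
    n_ctx=2048,  # matches the context size used in the server example above
)

output = llm("The meaning to life and the universe is", max_tokens=128, temperature=0.7)
print(output["choices"][0]["text"])
```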
{"library_name": "transformers", "tags": ["mergekit", "merge", "llama-cpp", "gguf-my-repo"], "base_model": ["TeeZee/DarkSapling-7B-v2.0", "MaziyarPanahi/bagel-dpo-7b-v0.1-Mistral-7B-Instruct-v0.2-slerp"]}
DavidAU/D_AU-Mistral-7B-Instruct-v0.2-Bagel-DarkSapling-DPO-7B-v2.0-Q8_0-GGUF
null
[ "transformers", "gguf", "mergekit", "merge", "llama-cpp", "gguf-my-repo", "base_model:TeeZee/DarkSapling-7B-v2.0", "base_model:MaziyarPanahi/bagel-dpo-7b-v0.1-Mistral-7B-Instruct-v0.2-slerp", "endpoints_compatible", "region:us" ]
null
2024-04-27T11:52:55+00:00
[]
[]
TAGS #transformers #gguf #mergekit #merge #llama-cpp #gguf-my-repo #base_model-TeeZee/DarkSapling-7B-v2.0 #base_model-MaziyarPanahi/bagel-dpo-7b-v0.1-Mistral-7B-Instruct-v0.2-slerp #endpoints_compatible #region-us
# DavidAU/D_AU-Mistral-7B-Instruct-v0.2-Bagel-DarkSapling-DPO-7B-v2.0-Q8_0-GGUF This model was converted to GGUF format from 'DavidAU/D_AU-Mistral-7B-Instruct-v0.2-Bagel-DarkSapling-DPO-7B-v2.0' using URL via the URL's GGUF-my-repo space. Refer to the original model card for more details on the model. ## Use with URL Install URL through brew. Invoke the URL server or the CLI. CLI: Server: Note: You can also use this checkpoint directly through the usage steps listed in the URL repo as well.
[ "# DavidAU/D_AU-Mistral-7B-Instruct-v0.2-Bagel-DarkSapling-DPO-7B-v2.0-Q8_0-GGUF\nThis model was converted to GGUF format from 'DavidAU/D_AU-Mistral-7B-Instruct-v0.2-Bagel-DarkSapling-DPO-7B-v2.0' using URL via the URL's GGUF-my-repo space.\nRefer to the original model card for more details on the model.", "## Use with URL\n\nInstall URL through brew.\n\n\nInvoke the URL server or the CLI.\n\nCLI:\n\n\n\nServer:\n\n\n\nNote: You can also use this checkpoint directly through the usage steps listed in the URL repo as well." ]
[ "TAGS\n#transformers #gguf #mergekit #merge #llama-cpp #gguf-my-repo #base_model-TeeZee/DarkSapling-7B-v2.0 #base_model-MaziyarPanahi/bagel-dpo-7b-v0.1-Mistral-7B-Instruct-v0.2-slerp #endpoints_compatible #region-us \n", "# DavidAU/D_AU-Mistral-7B-Instruct-v0.2-Bagel-DarkSapling-DPO-7B-v2.0-Q8_0-GGUF\nThis model was converted to GGUF format from 'DavidAU/D_AU-Mistral-7B-Instruct-v0.2-Bagel-DarkSapling-DPO-7B-v2.0' using URL via the URL's GGUF-my-repo space.\nRefer to the original model card for more details on the model.", "## Use with URL\n\nInstall URL through brew.\n\n\nInvoke the URL server or the CLI.\n\nCLI:\n\n\n\nServer:\n\n\n\nNote: You can also use this checkpoint directly through the usage steps listed in the URL repo as well." ]
automatic-speech-recognition
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # whisper-bs-cs-train-noaug-test-tstretch20-gain10-pitch20-gaussian20-lowpass10-mp3 This model is a fine-tuned version of [openai/whisper-base](https://huggingface.co/openai/whisper-base) on the common_voice_11_0 dataset. It achieves the following results on the evaluation set: - Loss: 1.0830 - Wer: 65.9355 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 32 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_steps: 500 - training_steps: 4000 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | Wer | |:-------------:|:------:|:----:|:---------------:|:-------:| | 0.3007 | 1.4440 | 1000 | 1.1013 | 72.5808 | | 0.1741 | 2.8881 | 2000 | 1.0371 | 69.6725 | | 0.0972 | 4.3321 | 3000 | 1.0761 | 66.3609 | | 0.079 | 5.7762 | 4000 | 1.0830 | 65.9355 | ### Framework versions - Transformers 4.40.1 - Pytorch 2.2.1+cu121 - Datasets 2.19.0 - Tokenizers 0.19.1
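The card does not include an inference example; a minimal transcription sketch using the standard transformers ASR pipeline is given below. The audio path is a placeholder, and the device and decoding options are illustrative defaults rather than settings documented by this card.

```python
# Transcribe Czech speech with the fine-tuned Whisper checkpoint.
import torch
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="LadislavVasina1/whisper-bs-cs-train-noaug-test-tstretch20-gain10-pitch20-gaussian20-lowpass10-mp3",
    device=0 if torch.cuda.is_available() else -1,
)

result = asr("sample_cs.wav")  # placeholder path to a local audio file
print(result["text"])
```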
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "datasets": ["common_voice_11_0"], "metrics": ["wer"], "base_model": "openai/whisper-base", "model-index": [{"name": "whisper-bs-cs-train-noaug-test-tstretch20-gain10-pitch20-gaussian20-lowpass10-mp3", "results": [{"task": {"type": "automatic-speech-recognition", "name": "Automatic Speech Recognition"}, "dataset": {"name": "common_voice_11_0", "type": "common_voice_11_0", "config": "cs", "split": "None", "args": "cs"}, "metrics": [{"type": "wer", "value": 65.93546248204221, "name": "Wer"}]}]}]}
LadislavVasina1/whisper-bs-cs-train-noaug-test-tstretch20-gain10-pitch20-gaussian20-lowpass10-mp3
null
[ "transformers", "tensorboard", "safetensors", "whisper", "automatic-speech-recognition", "generated_from_trainer", "dataset:common_voice_11_0", "base_model:openai/whisper-base", "license:apache-2.0", "model-index", "endpoints_compatible", "region:us" ]
null
2024-04-27T11:53:08+00:00
[]
[]
TAGS #transformers #tensorboard #safetensors #whisper #automatic-speech-recognition #generated_from_trainer #dataset-common_voice_11_0 #base_model-openai/whisper-base #license-apache-2.0 #model-index #endpoints_compatible #region-us
whisper-bs-cs-train-noaug-test-tstretch20-gain10-pitch20-gaussian20-lowpass10-mp3 ================================================================================= This model is a fine-tuned version of openai/whisper-base on the common\_voice\_11\_0 dataset. It achieves the following results on the evaluation set: * Loss: 1.0830 * Wer: 65.9355 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 1e-05 * train\_batch\_size: 8 * eval\_batch\_size: 8 * seed: 42 * gradient\_accumulation\_steps: 4 * total\_train\_batch\_size: 32 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * lr\_scheduler\_warmup\_steps: 500 * training\_steps: 4000 * mixed\_precision\_training: Native AMP ### Training results ### Framework versions * Transformers 4.40.1 * Pytorch 2.2.1+cu121 * Datasets 2.19.0 * Tokenizers 0.19.1
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* training\\_steps: 4000\n* mixed\\_precision\\_training: Native AMP", "### Training results", "### Framework versions\n\n\n* Transformers 4.40.1\n* Pytorch 2.2.1+cu121\n* Datasets 2.19.0\n* Tokenizers 0.19.1" ]
[ "TAGS\n#transformers #tensorboard #safetensors #whisper #automatic-speech-recognition #generated_from_trainer #dataset-common_voice_11_0 #base_model-openai/whisper-base #license-apache-2.0 #model-index #endpoints_compatible #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 1e-05\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 500\n* training\\_steps: 4000\n* mixed\\_precision\\_training: Native AMP", "### Training results", "### Framework versions\n\n\n* Transformers 4.40.1\n* Pytorch 2.2.1+cu121\n* Datasets 2.19.0\n* Tokenizers 0.19.1" ]