| Column | Feature type | Values / lengths |
|---|---|---|
| pipeline_tag | stringclasses | 48 values |
| library_name | stringclasses | 198 values |
| text | stringlengths | 1 to 900k |
| metadata | stringlengths | 2 to 438k |
| id | stringlengths | 5 to 122 |
| last_modified | null | |
| tags | sequencelengths | 1 to 1.84k |
| sha | null | |
| created_at | stringlengths | 25 to 25 |
| arxiv | sequencelengths | 0 to 201 |
| languages | sequencelengths | 0 to 1.83k |
| tags_str | stringlengths | 17 to 9.34k |
| text_str | stringlengths | 0 to 389k |
| text_lists | sequencelengths | 0 to 722 |
| processed_texts | sequencelengths | 1 to 723 |
| tokens_length | sequencelengths | 1 to 723 |
| input_texts | sequencelengths | 1 to 1 |
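
A minimal sketch of loading and inspecting rows with this schema via the 🤗 `datasets` library is shown below. The repository id `user/model-cards-dump` is a placeholder (this preview does not name the dataset repository), and the `train` split name is likewise an assumption.

```python
# Sketch only: "user/model-cards-dump" is a hypothetical placeholder -- substitute
# the actual Hugging Face repository id for this dataset before running.
from datasets import load_dataset

ds = load_dataset("user/model-cards-dump", split="train")

# Columns match the schema table above (pipeline_tag, library_name, text, ...).
print(ds.column_names)

row = ds[0]
print(row["id"])            # a Hub model id, e.g. "solvit/my-midjourney-prompt-model"
print(row["pipeline_tag"])  # one of ~48 pipeline tags, or None
print(len(row["text"]))     # raw model-card text, up to ~900k characters

# Example: keep only rows whose card declares a text-generation pipeline.
text_gen = ds.filter(lambda r: r["pipeline_tag"] == "text-generation")
print(len(text_gen))
```
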
text-generation
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
solvit/my-midjourney-prompt-model
null
[ "transformers", "safetensors", "gemma", "text-generation", "conversational", "arxiv:1910.09700", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2024-04-30T14:25:05+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #gemma #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #gemma #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ 46, 6, 4, 75, 23, 3, 5, 8, 9, 8, 34, 20, 4, 5, 5, 11, 13, 12, 3, 10, 6, 5, 6, 4, 5, 7, 49, 7, 7, 5, 5, 15, 7, 7, 8, 5 ]
[ "TAGS\n#transformers #safetensors #gemma #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact" ]
null
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": ["unsloth"]}
Demonthos/llama3
null
[ "transformers", "safetensors", "gguf", "llama", "unsloth", "arxiv:1910.09700", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2024-04-30T14:25:35+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #gguf #llama #unsloth #arxiv-1910.09700 #endpoints_compatible #text-generation-inference #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #gguf #llama #unsloth #arxiv-1910.09700 #endpoints_compatible #text-generation-inference #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ 43, 6, 4, 75, 23, 3, 5, 8, 9, 8, 34, 20, 4, 5, 5, 11, 13, 12, 3, 10, 6, 5, 6, 4, 5, 7, 49, 7, 7, 5, 5, 15, 7, 7, 8, 5 ]
[ "TAGS\n#transformers #safetensors #gguf #llama #unsloth #arxiv-1910.09700 #endpoints_compatible #text-generation-inference #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact" ]
text-generation
transformers
# Uploaded model - **Developed by:** davanstrien - **License:** apache-2.0 - **Finetuned from model :** unsloth/Phi-3-mini-4k-instruct-bnb-4bit This mistral model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library. [<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
{"language": ["en"], "license": "apache-2.0", "tags": ["text-generation-inference", "transformers", "unsloth", "mistral", "trl", "orpo"], "base_model": "unsloth/Phi-3-mini-4k-instruct-bnb-4bit"}
davanstrien/dataset-tldr
null
[ "transformers", "safetensors", "mistral", "text-generation", "text-generation-inference", "unsloth", "trl", "orpo", "conversational", "en", "base_model:unsloth/Phi-3-mini-4k-instruct-bnb-4bit", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
null
2024-04-30T14:26:24+00:00
[]
[ "en" ]
TAGS #transformers #safetensors #mistral #text-generation #text-generation-inference #unsloth #trl #orpo #conversational #en #base_model-unsloth/Phi-3-mini-4k-instruct-bnb-4bit #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
# Uploaded model - Developed by: davanstrien - License: apache-2.0 - Finetuned from model : unsloth/Phi-3-mini-4k-instruct-bnb-4bit This mistral model was trained 2x faster with Unsloth and Huggingface's TRL library. <img src="URL width="200"/>
[ "# Uploaded model\n\n- Developed by: davanstrien\n- License: apache-2.0\n- Finetuned from model : unsloth/Phi-3-mini-4k-instruct-bnb-4bit\n\nThis mistral model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>" ]
[ "TAGS\n#transformers #safetensors #mistral #text-generation #text-generation-inference #unsloth #trl #orpo #conversational #en #base_model-unsloth/Phi-3-mini-4k-instruct-bnb-4bit #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n", "# Uploaded model\n\n- Developed by: davanstrien\n- License: apache-2.0\n- Finetuned from model : unsloth/Phi-3-mini-4k-instruct-bnb-4bit\n\nThis mistral model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>" ]
[ 83, 85 ]
[ "TAGS\n#transformers #safetensors #mistral #text-generation #text-generation-inference #unsloth #trl #orpo #conversational #en #base_model-unsloth/Phi-3-mini-4k-instruct-bnb-4bit #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n# Uploaded model\n\n- Developed by: davanstrien\n- License: apache-2.0\n- Finetuned from model : unsloth/Phi-3-mini-4k-instruct-bnb-4bit\n\nThis mistral model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>" ]
text-classification
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
bassemessam/Arabic-bank77-intent-classification
null
[ "transformers", "safetensors", "bert", "text-classification", "arxiv:1910.09700", "autotrain_compatible", "endpoints_compatible", "region:us" ]
null
2024-04-30T14:27:26+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #bert #text-classification #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #bert #text-classification #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ 37, 6, 4, 75, 23, 3, 5, 8, 9, 8, 34, 20, 4, 5, 5, 11, 13, 12, 3, 10, 6, 5, 6, 4, 5, 7, 49, 7, 7, 5, 5, 15, 7, 7, 8, 5 ]
[ "TAGS\n#transformers #safetensors #bert #text-classification #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact" ]
null
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": ["unsloth"]}
avemio-digital/llama3_entity_extraction_category_adapter
null
[ "transformers", "safetensors", "unsloth", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
null
2024-04-30T14:27:52+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #unsloth #arxiv-1910.09700 #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #unsloth #arxiv-1910.09700 #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ 30, 6, 4, 75, 23, 3, 5, 8, 9, 8, 34, 20, 4, 5, 5, 11, 13, 12, 3, 10, 6, 5, 6, 4, 5, 7, 49, 7, 7, 5, 5, 15, 7, 7, 8, 5 ]
[ "TAGS\n#transformers #safetensors #unsloth #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact" ]
text-generation
transformers
# **csg-wukong-1B-sft-dpo-bf16** [[中文]](#chinese) [[English]](#english) <a id="english"></a> <p align="center"> <img width="900px" alt="OpenCSG" src="./csg-wukong-logo-green.jpg"> </p> <p align="center"><a href="https://portal.opencsg.com/models">[OpenCSG Community]</a> <a href="https://github.com/opencsgs">[github]</a> <a href="https://cdn-uploads.huggingface.co/production/uploads/64c71b27d43e4dee51a8b31a/HU6vz21qKTEmUBCWqCFh9.jpeg">[wechat]</a> <a href="https://twitter.com/OpenCsg">[Twitter]</a> </p> </div> OpenCSG stands for Converged resources, Software refinement, and Generative LM. The 'C' represents Converged resources, indicating the integration and full utilization of hybrid resources. The 'S' stands for Software refinement, signifying software that is refined by large models. The 'G' represents Generative LM, which denotes widespread, inclusive, and democratized generative large models. The vision of OpenCSG is to empower every industry, every company, and every individual to own their models. We adhere to the principles of openness and open source, making the large model software stack of OpenCSG available to the community. We welcome everyone to use, send feedback, and contribute collaboratively. ## Model Description **csg-wukong-1B-sft-dpo-bf16** was finetuned on [csg-wukong-1B](https://huggingface.co/opencsg/csg-wukong-1B). <br> we will introduce more information about csg-wukong-1B. ## Model Evaluation results We submitted csg-wukong-1B on the [open_llm_leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard), and the results show our model ranked the 8th among the ~1.5B pretrained small language models. ![image/png](https://cdn-uploads.huggingface.co/production/uploads/661790397437201d78141856/_HRTxL6N0qnNPNt-P8k9k.png) # Training ## Hardware - **GPUs:** 16 H800 - **Training time:** 43days ## Software - **Orchestration:** [Deepspeed](https://github.com/OpenCSGs) - **Neural networks:** [PyTorch](https://github.com/pytorch/pytorch) - **BP16 if applicable:** [apex](https://github.com/NVIDIA/apex) <a id="chinese"></a> <p> </p> # OpenCSG介绍 <p align="center"> <img width="300px" alt="OpenCSG" src="https://cdn-uploads.huggingface.co/production/uploads/64c71b27d43e4dee51a8b31a/GwYXPKuEoGCGcMICeW-sb.jpeg"> </p> <p align="center"><a href="https://opencsg.com/models">[OpenCSG 社区]</a> <a href="https://github.com/opencsgs">[github]</a> <a href="https://cdn-uploads.huggingface.co/production/uploads/64c71b27d43e4dee51a8b31a/HU6vz21qKTEmUBCWqCFh9.jpeg">[微信]</a> <a href="https://twitter.com/OpenCsg">[推特]</a> </p> </div> OpenCSG中 Open是开源开放;C 代表 Converged resources,整合和充分利用的混合异构资源优势,算力降本增效;S 代表 Software refined,重新定义软件的交付方式,通过大模型驱动软件开发,人力降本增效;G 代表 Generative LM,大众化、普惠化和民主化的可商用的开源生成式大模型。 OpenCSG的愿景是让每个行业、每个公司、每个人都拥有自己的模型。 我们坚持开源开放的原则,将OpenCSG的大模型软件栈开源到社区,欢迎使用、反馈和参与共建,欢迎关注。 ## 模型介绍 **csg-wukong-1B-sft-dpo-bf16** 在[csg-wukong-1B](https://huggingface.co/opencsg/csg-wukong-1B)预训练模型上微调而成. <br> 我们将在后面介绍更多关于这个模型的信息。 ## 模型评测结果 我们把csg-wukong-1B模型提交到[open_llm_leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)榜单上,结果显示我们的模型目前在~1.5B小语言模型中排名第8。 ![image/png](https://cdn-uploads.huggingface.co/production/uploads/661790397437201d78141856/ZfWZ1Fd7ccKrJVx0okV9z.png) # 训练 ## 硬件资源 - **GPU数量:** 16 H800 - **训练时间:** 43天 ## 软件使用 - **微调训练框架:** [Deepspeed](https://github.com/OpenCSGs) - **深度学习框架:** [PyTorch](https://github.com/pytorch/pytorch) - **BP16:** [apex](https://github.com/NVIDIA/apex)
{"language": ["en"], "license": "apache-2.0", "tags": ["code"], "pipeline_tag": "text-generation"}
opencsg/csg-wukong-1B-sft-dpo-bf16
null
[ "transformers", "safetensors", "llama", "text-generation", "code", "conversational", "en", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2024-04-30T14:31:24+00:00
[]
[ "en" ]
TAGS #transformers #safetensors #llama #text-generation #code #conversational #en #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# csg-wukong-1B-sft-dpo-bf16 [[中文]](#chinese) [[English]](#english) <a id="english"></a> <p align="center"> <img width="900px" alt="OpenCSG" src="./URL"> </p> <p align="center"><a href="URL Community]</a> <a href="URL <a href="URL <a href="URL </p> </div> OpenCSG stands for Converged resources, Software refinement, and Generative LM. The 'C' represents Converged resources, indicating the integration and full utilization of hybrid resources. The 'S' stands for Software refinement, signifying software that is refined by large models. The 'G' represents Generative LM, which denotes widespread, inclusive, and democratized generative large models. The vision of OpenCSG is to empower every industry, every company, and every individual to own their models. We adhere to the principles of openness and open source, making the large model software stack of OpenCSG available to the community. We welcome everyone to use, send feedback, and contribute collaboratively. ## Model Description csg-wukong-1B-sft-dpo-bf16 was finetuned on csg-wukong-1B. <br> we will introduce more information about csg-wukong-1B. ## Model Evaluation results We submitted csg-wukong-1B on the open_llm_leaderboard, and the results show our model ranked the 8th among the ~1.5B pretrained small language models. !image/png # Training ## Hardware - GPUs: 16 H800 - Training time: 43days ## Software - Orchestration: Deepspeed - Neural networks: PyTorch - BP16 if applicable: apex <a id="chinese"></a> <p> </p> # OpenCSG介绍 <p align="center"> <img width="300px" alt="OpenCSG" src="URL </p> <p align="center"><a href="URL 社区]</a> <a href="URL <a href="URL[微信]</a> <a href="URL[推特]</a> </p> </div> OpenCSG中 Open是开源开放;C 代表 Converged resources,整合和充分利用的混合异构资源优势,算力降本增效;S 代表 Software refined,重新定义软件的交付方式,通过大模型驱动软件开发,人力降本增效;G 代表 Generative LM,大众化、普惠化和民主化的可商用的开源生成式大模型。 OpenCSG的愿景是让每个行业、每个公司、每个人都拥有自己的模型。 我们坚持开源开放的原则,将OpenCSG的大模型软件栈开源到社区,欢迎使用、反馈和参与共建,欢迎关注。 ## 模型介绍 csg-wukong-1B-sft-dpo-bf16 在csg-wukong-1B预训练模型上微调而成. <br> 我们将在后面介绍更多关于这个模型的信息。 ## 模型评测结果 我们把csg-wukong-1B模型提交到open_llm_leaderboard榜单上,结果显示我们的模型目前在~1.5B小语言模型中排名第8。 !image/png # 训练 ## 硬件资源 - GPU数量: 16 H800 - 训练时间: 43天 ## 软件使用 - 微调训练框架: Deepspeed - 深度学习框架: PyTorch - BP16: apex
[ "# csg-wukong-1B-sft-dpo-bf16 [[中文]](#chinese) [[English]](#english)\n\n<a id=\"english\"></a>\n\n<p align=\"center\">\n<img width=\"900px\" alt=\"OpenCSG\" src=\"./URL\">\n</p>\n\n<p align=\"center\"><a href=\"URL Community]</a> <a href=\"URL <a href=\"URL <a href=\"URL </p>\n\n\n</div>\nOpenCSG stands for Converged resources, Software refinement, and Generative LM. The 'C' represents Converged resources, indicating the integration and full utilization of hybrid resources. The 'S' stands for Software refinement, signifying software that is refined by large models. The 'G' represents Generative LM, which denotes widespread, inclusive, and democratized generative large models.\n\nThe vision of OpenCSG is to empower every industry, every company, and every individual to own their models. We adhere to the principles of openness and open source, making the large model software stack of OpenCSG available to the community. We welcome everyone to use, send feedback, and contribute collaboratively.", "## Model Description\n\n\n\n\ncsg-wukong-1B-sft-dpo-bf16 was finetuned on csg-wukong-1B. \n<br>\nwe will introduce more information about csg-wukong-1B.", "## Model Evaluation results\n\nWe submitted csg-wukong-1B on the open_llm_leaderboard, and\nthe results show our model ranked the 8th among the ~1.5B pretrained small language models.\n\n\n!image/png", "# Training", "## Hardware\n\n- GPUs: 16 H800 \n- Training time: 43days", "## Software\n\n- Orchestration: Deepspeed\n- Neural networks: PyTorch\n- BP16 if applicable: apex\n\n\n<a id=\"chinese\"></a>\n\n<p>\n\n</p>", "# OpenCSG介绍\n\n\n<p align=\"center\">\n<img width=\"300px\" alt=\"OpenCSG\" src=\"URL\n</p>\n\n<p align=\"center\"><a href=\"URL 社区]</a> <a href=\"URL <a href=\"URL[微信]</a> <a href=\"URL[推特]</a> </p>\n\n\n\n</div>\nOpenCSG中 Open是开源开放;C 代表 Converged resources,整合和充分利用的混合异构资源优势,算力降本增效;S 代表 Software refined,重新定义软件的交付方式,通过大模型驱动软件开发,人力降本增效;G 代表 Generative LM,大众化、普惠化和民主化的可商用的开源生成式大模型。\n\nOpenCSG的愿景是让每个行业、每个公司、每个人都拥有自己的模型。 我们坚持开源开放的原则,将OpenCSG的大模型软件栈开源到社区,欢迎使用、反馈和参与共建,欢迎关注。", "## 模型介绍\n\n\ncsg-wukong-1B-sft-dpo-bf16 在csg-wukong-1B预训练模型上微调而成.\n<br>\n\n我们将在后面介绍更多关于这个模型的信息。", "## 模型评测结果\n\n我们把csg-wukong-1B模型提交到open_llm_leaderboard榜单上,结果显示我们的模型目前在~1.5B小语言模型中排名第8。\n\n\n!image/png", "# 训练", "## 硬件资源\n\n- GPU数量: 16 H800 \n- 训练时间: 43天", "## 软件使用\n\n- 微调训练框架: Deepspeed\n- 深度学习框架: PyTorch\n- BP16: apex" ]
[ "TAGS\n#transformers #safetensors #llama #text-generation #code #conversational #en #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# csg-wukong-1B-sft-dpo-bf16 [[中文]](#chinese) [[English]](#english)\n\n<a id=\"english\"></a>\n\n<p align=\"center\">\n<img width=\"900px\" alt=\"OpenCSG\" src=\"./URL\">\n</p>\n\n<p align=\"center\"><a href=\"URL Community]</a> <a href=\"URL <a href=\"URL <a href=\"URL </p>\n\n\n</div>\nOpenCSG stands for Converged resources, Software refinement, and Generative LM. The 'C' represents Converged resources, indicating the integration and full utilization of hybrid resources. The 'S' stands for Software refinement, signifying software that is refined by large models. The 'G' represents Generative LM, which denotes widespread, inclusive, and democratized generative large models.\n\nThe vision of OpenCSG is to empower every industry, every company, and every individual to own their models. We adhere to the principles of openness and open source, making the large model software stack of OpenCSG available to the community. We welcome everyone to use, send feedback, and contribute collaboratively.", "## Model Description\n\n\n\n\ncsg-wukong-1B-sft-dpo-bf16 was finetuned on csg-wukong-1B. \n<br>\nwe will introduce more information about csg-wukong-1B.", "## Model Evaluation results\n\nWe submitted csg-wukong-1B on the open_llm_leaderboard, and\nthe results show our model ranked the 8th among the ~1.5B pretrained small language models.\n\n\n!image/png", "# Training", "## Hardware\n\n- GPUs: 16 H800 \n- Training time: 43days", "## Software\n\n- Orchestration: Deepspeed\n- Neural networks: PyTorch\n- BP16 if applicable: apex\n\n\n<a id=\"chinese\"></a>\n\n<p>\n\n</p>", "# OpenCSG介绍\n\n\n<p align=\"center\">\n<img width=\"300px\" alt=\"OpenCSG\" src=\"URL\n</p>\n\n<p align=\"center\"><a href=\"URL 社区]</a> <a href=\"URL <a href=\"URL[微信]</a> <a href=\"URL[推特]</a> </p>\n\n\n\n</div>\nOpenCSG中 Open是开源开放;C 代表 Converged resources,整合和充分利用的混合异构资源优势,算力降本增效;S 代表 Software refined,重新定义软件的交付方式,通过大模型驱动软件开发,人力降本增效;G 代表 Generative LM,大众化、普惠化和民主化的可商用的开源生成式大模型。\n\nOpenCSG的愿景是让每个行业、每个公司、每个人都拥有自己的模型。 我们坚持开源开放的原则,将OpenCSG的大模型软件栈开源到社区,欢迎使用、反馈和参与共建,欢迎关注。", "## 模型介绍\n\n\ncsg-wukong-1B-sft-dpo-bf16 在csg-wukong-1B预训练模型上微调而成.\n<br>\n\n我们将在后面介绍更多关于这个模型的信息。", "## 模型评测结果\n\n我们把csg-wukong-1B模型提交到open_llm_leaderboard榜单上,结果显示我们的模型目前在~1.5B小语言模型中排名第8。\n\n\n!image/png", "# 训练", "## 硬件资源\n\n- GPU数量: 16 H800 \n- 训练时间: 43天", "## 软件使用\n\n- 微调训练框架: Deepspeed\n- 深度学习框架: PyTorch\n- BP16: apex" ]
[ 49, 291, 50, 52, 2, 18, 44, 302, 64, 67, 3, 24, 34 ]
[ "TAGS\n#transformers #safetensors #llama #text-generation #code #conversational #en #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# csg-wukong-1B-sft-dpo-bf16 [[中文]](#chinese) [[English]](#english)\n\n<a id=\"english\"></a>\n\n<p align=\"center\">\n<img width=\"900px\" alt=\"OpenCSG\" src=\"./URL\">\n</p>\n\n<p align=\"center\"><a href=\"URL Community]</a> <a href=\"URL <a href=\"URL <a href=\"URL </p>\n\n\n</div>\nOpenCSG stands for Converged resources, Software refinement, and Generative LM. The 'C' represents Converged resources, indicating the integration and full utilization of hybrid resources. The 'S' stands for Software refinement, signifying software that is refined by large models. The 'G' represents Generative LM, which denotes widespread, inclusive, and democratized generative large models.\n\nThe vision of OpenCSG is to empower every industry, every company, and every individual to own their models. We adhere to the principles of openness and open source, making the large model software stack of OpenCSG available to the community. We welcome everyone to use, send feedback, and contribute collaboratively.## Model Description\n\n\n\n\ncsg-wukong-1B-sft-dpo-bf16 was finetuned on csg-wukong-1B. \n<br>\nwe will introduce more information about csg-wukong-1B.## Model Evaluation results\n\nWe submitted csg-wukong-1B on the open_llm_leaderboard, and\nthe results show our model ranked the 8th among the ~1.5B pretrained small language models.\n\n\n!image/png# Training## Hardware\n\n- GPUs: 16 H800 \n- Training time: 43days## Software\n\n- Orchestration: Deepspeed\n- Neural networks: PyTorch\n- BP16 if applicable: apex\n\n\n<a id=\"chinese\"></a>\n\n<p>\n\n</p># OpenCSG介绍\n\n\n<p align=\"center\">\n<img width=\"300px\" alt=\"OpenCSG\" src=\"URL\n</p>\n\n<p align=\"center\"><a href=\"URL 社区]</a> <a href=\"URL <a href=\"URL[微信]</a> <a href=\"URL[推特]</a> </p>\n\n\n\n</div>\nOpenCSG中 Open是开源开放;C 代表 Converged resources,整合和充分利用的混合异构资源优势,算力降本增效;S 代表 Software refined,重新定义软件的交付方式,通过大模型驱动软件开发,人力降本增效;G 代表 Generative LM,大众化、普惠化和民主化的可商用的开源生成式大模型。\n\nOpenCSG的愿景是让每个行业、每个公司、每个人都拥有自己的模型。 我们坚持开源开放的原则,将OpenCSG的大模型软件栈开源到社区,欢迎使用、反馈和参与共建,欢迎关注。## 模型介绍\n\n\ncsg-wukong-1B-sft-dpo-bf16 在csg-wukong-1B预训练模型上微调而成.\n<br>\n\n我们将在后面介绍更多关于这个模型的信息。## 模型评测结果\n\n我们把csg-wukong-1B模型提交到open_llm_leaderboard榜单上,结果显示我们的模型目前在~1.5B小语言模型中排名第8。\n\n\n!image/png# 训练## 硬件资源\n\n- GPU数量: 16 H800 \n- 训练时间: 43天## 软件使用\n\n- 微调训练框架: Deepspeed\n- 深度学习框架: PyTorch\n- BP16: apex" ]
null
transformers
# Uploaded model - **Developed by:** Trelis - **License:** apache-2.0 - **Finetuned from model :** NousResearch/Meta-Llama-3-8B-Instruct This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library. [<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
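The card lists only the base model and training setup; a minimal loading sketch follows, assuming this repository contains PEFT/LoRA adapters for the listed base model (that assumption and the repository ids come from this record, not from the card itself).

```python
# Minimal sketch (assumption: the repo holds PEFT/LoRA adapters for the base model below).
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "NousResearch/Meta-Llama-3-8B-Instruct"
adapter_id = "Trelis/Meta-Llama-3-8B-Instruct-no-fin-advice-adapters"  # assumed adapter repo

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")
model = PeftModel.from_pretrained(base_model, adapter_id)  # attach the fine-tuned adapters
```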
{"language": ["en"], "license": "apache-2.0", "tags": ["text-generation-inference", "transformers", "unsloth", "llama", "trl"], "base_model": "NousResearch/Meta-Llama-3-8B-Instruct"}
Trelis/Meta-Llama-3-8B-Instruct-no-fin-advice-adapters
null
[ "transformers", "safetensors", "text-generation-inference", "unsloth", "llama", "trl", "en", "base_model:NousResearch/Meta-Llama-3-8B-Instruct", "license:apache-2.0", "endpoints_compatible", "region:us" ]
null
2024-04-30T14:35:32+00:00
[]
[ "en" ]
TAGS #transformers #safetensors #text-generation-inference #unsloth #llama #trl #en #base_model-NousResearch/Meta-Llama-3-8B-Instruct #license-apache-2.0 #endpoints_compatible #region-us
# Uploaded model - Developed by: Trelis - License: apache-2.0 - Finetuned from model : NousResearch/Meta-Llama-3-8B-Instruct This llama model was trained 2x faster with Unsloth and Huggingface's TRL library. <img src="URL width="200"/>
[ "# Uploaded model\n\n- Developed by: Trelis\n- License: apache-2.0\n- Finetuned from model : NousResearch/Meta-Llama-3-8B-Instruct\n\nThis llama model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>" ]
[ "TAGS\n#transformers #safetensors #text-generation-inference #unsloth #llama #trl #en #base_model-NousResearch/Meta-Llama-3-8B-Instruct #license-apache-2.0 #endpoints_compatible #region-us \n", "# Uploaded model\n\n- Developed by: Trelis\n- License: apache-2.0\n- Finetuned from model : NousResearch/Meta-Llama-3-8B-Instruct\n\nThis llama model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>" ]
[ 65, 80 ]
[ "TAGS\n#transformers #safetensors #text-generation-inference #unsloth #llama #trl #en #base_model-NousResearch/Meta-Llama-3-8B-Instruct #license-apache-2.0 #endpoints_compatible #region-us \n# Uploaded model\n\n- Developed by: Trelis\n- License: apache-2.0\n- Finetuned from model : NousResearch/Meta-Llama-3-8B-Instruct\n\nThis llama model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>" ]
fill-mask
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # da_distilbert This model is a fine-tuned version of [distilbert-base-cased](https://huggingface.co/distilbert-base-cased) on the None dataset. It achieves the following results on the evaluation set: - Loss: 1.6089 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 8 ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:-----:|:----:|:---------------:| | No log | 1.0 | 130 | 1.9587 | | No log | 2.0 | 260 | 1.8275 | | No log | 3.0 | 390 | 1.7576 | | 1.9627 | 4.0 | 520 | 1.7045 | | 1.9627 | 5.0 | 650 | 1.6049 | | 1.9627 | 6.0 | 780 | 1.6452 | | 1.9627 | 7.0 | 910 | 1.5920 | | 1.6873 | 8.0 | 1040 | 1.6354 | ### Framework versions - Transformers 4.40.1 - Pytorch 2.4.0.dev20240502 - Datasets 2.19.0 - Tokenizers 0.19.1
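The card omits a usage example; since the pipeline tag is fill-mask and the base model is distilbert-base-cased, a minimal sketch would look like the following. The repository id is taken from this record, and the example sentence is illustrative only.

```python
# Minimal sketch: query the fine-tuned masked language model with the fill-mask pipeline.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="gc394/da_distilbert")
for pred in fill_mask("The analyst reviewed the quarterly [MASK] report."):
    print(pred["token_str"], round(pred["score"], 3))
```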
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "base_model": "distilbert-base-cased", "model-index": [{"name": "da_distilbert", "results": []}]}
gc394/da_distilbert
null
[ "transformers", "tensorboard", "safetensors", "distilbert", "fill-mask", "generated_from_trainer", "base_model:distilbert-base-cased", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
null
2024-04-30T14:35:45+00:00
[]
[]
TAGS #transformers #tensorboard #safetensors #distilbert #fill-mask #generated_from_trainer #base_model-distilbert-base-cased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
da\_distilbert ============== This model is a fine-tuned version of distilbert-base-cased on the None dataset. It achieves the following results on the evaluation set: * Loss: 1.6089 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 2e-05 * train\_batch\_size: 16 * eval\_batch\_size: 16 * seed: 42 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * num\_epochs: 8 ### Training results ### Framework versions * Transformers 4.40.1 * Pytorch 2.4.0.dev20240502 * Datasets 2.19.0 * Tokenizers 0.19.1
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 8", "### Training results", "### Framework versions\n\n\n* Transformers 4.40.1\n* Pytorch 2.4.0.dev20240502\n* Datasets 2.19.0\n* Tokenizers 0.19.1" ]
[ "TAGS\n#transformers #tensorboard #safetensors #distilbert #fill-mask #generated_from_trainer #base_model-distilbert-base-cased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 8", "### Training results", "### Framework versions\n\n\n* Transformers 4.40.1\n* Pytorch 2.4.0.dev20240502\n* Datasets 2.19.0\n* Tokenizers 0.19.1" ]
[ 59, 101, 5, 47 ]
[ "TAGS\n#transformers #tensorboard #safetensors #distilbert #fill-mask #generated_from_trainer #base_model-distilbert-base-cased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 8### Training results### Framework versions\n\n\n* Transformers 4.40.1\n* Pytorch 2.4.0.dev20240502\n* Datasets 2.19.0\n* Tokenizers 0.19.1" ]
text-generation
transformers
# Llama-3-70B-Instruct-ov-fp16-int4-sym ## Built with Meta Llama 3 ## Model Description This is a version of the original [Meta-Llama-3-70B-Instruct](https://huggingface.co/meta-llama/Meta-Llama-3-70B-Instruct) model converted to [OpenVINO™](https://github.com/openvinotoolkit/openvino) IR (Intermediate Representation) format for optimized inference on Intel® hardware. The model was created using the examples shown in the [OpenVINO™ Notebooks](https://github.com/openvinotoolkit/openvino_notebooks/tree/latest/notebooks) repository. ## Intended Use This model is designed for advanced natural language understanding and generation tasks, ideal for academic researchers and developers in commercial settings looking to integrate efficient AI capabilities into their applications. It is not to be used for creating or promoting harmful or illegal content as per the guidelines outlined in the [Meta Llama 3 Acceptable Use Policy](https://llama.meta.com/llama3/use-policy/). ## Licensing and Redistribution This model is released under the Meta Llama 3 Community License. Redistribution requires inclusion of this license and a citation to the original model. Modifications and derivative works must prominently display "Built with Meta Llama 3" and adhere to the redistribution policies detailed in the original model [license terms](https://huggingface.co/meta-llama/Meta-Llama-3-70B-Instruct/blob/main/LICENSE). ## Weight Compression Parameters For more information on the parameters, refer to the [OpenVINO™ 2024.1.0 documentation](https://docs.openvino.ai/2024/openvino-workflow/model-optimization-guide/weight-compression.html). * mode: **INT4_ASYM** * group_size: **128** * ratio: **0.8** ## Running Model Inference Install the packages required for using the [Optimum Intel](https://huggingface.co/docs/optimum/intel/index) integration with the OpenVINO™ backend: ```sh pip install --upgrade --upgrade-strategy eager "optimum[openvino]" ``` Run model inference: ```python from optimum.intel.openvino import OVModelForCausalLM from transformers import AutoTokenizer, pipeline model_id = "nsbendre25/llama-3-70B-Instruct-ov-fp16-int4-asym" # Initialize the tokenizer and the OpenVINO model tokenizer = AutoTokenizer.from_pretrained(model_id) model = OVModelForCausalLM.from_pretrained(model_id) # Build a text-generation pipeline backed by the OpenVINO model pipe = pipeline("text-generation", model=model, tokenizer=tokenizer) pipe("I am in Paris, plan me a 2 week trip") ```
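For readers who want to reproduce a similar compression themselves, a hedged sketch using Optimum Intel's weight-quantization config is shown below; the class and argument names are assumed from recent optimum-intel releases and may differ across versions, and the FP16 source checkpoint id is an assumption rather than something stated in the card.

```python
# Hedged sketch: 4-bit asymmetric weight compression with group_size=128 and ratio=0.8,
# mirroring the parameters listed in the card. API names assumed from optimum-intel ~1.16.
from optimum.intel import OVModelForCausalLM, OVWeightQuantizationConfig

q_config = OVWeightQuantizationConfig(bits=4, sym=False, group_size=128, ratio=0.8)
model = OVModelForCausalLM.from_pretrained(
    "meta-llama/Meta-Llama-3-70B-Instruct",  # assumed FP16 source checkpoint
    export=True,
    quantization_config=q_config,
)
model.save_pretrained("llama-3-70b-instruct-ov-int4-asym")
```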
{"language": ["en"], "license": "llama3", "library_name": "transformers", "tags": ["OpenVINO", "meta", "llama", "llama-3", "optimum-intel"], "pipeline_tag": "text-generation", "extra_gated_prompt": "\nMeta Llama 3 Version Release Date: April 18, 2024\n"}
nsbendre25/llama-3-70B-Instruct-ov-fp16-int4-asym
null
[ "transformers", "openvino", "llama", "text-generation", "OpenVINO", "meta", "llama-3", "optimum-intel", "conversational", "en", "license:llama3", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2024-04-30T14:36:01+00:00
[]
[ "en" ]
TAGS #transformers #openvino #llama #text-generation #OpenVINO #meta #llama-3 #optimum-intel #conversational #en #license-llama3 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# Llama-3-70B-Instruct-ov-fp16-int4-sym ## Built with Meta Llama 3 ## Model Description This is a version of the original Meta-Llama-3-70B-Instruct model converted to OpenVINO™ IR (Intermediate Representation) format for optimized inference on Intel® hardware. The model is created using the examples shown in OpenVINO™ Notebooks repository. ## Intended Use This model is designed for advanced natural language understanding and generation tasks, ideal for academic researchers and developers in commercial settings looking to integrate efficient AI capabilities into their applications. It is not to be used for creating or promoting harmful or illegal content as per the guidelines outlined in the Meta Llama 3 Acceptable Use Policy. ## Licensing and Redistribution This model is released under the Meta Llama 3 Community License. Redistribution requires inclusion of this license and a citation to the original model. Modifications and derivative works must prominently display "Built with Meta Llama 3" and adhere to the redistribution policies detailed in the original model license terms. ## Weight Compression Parameters For more information on the parameters, refer to the OpenVINO™ 2024.1.0 documentation * mode: INT4_ASYM * group_size: 128 * ratio: 0.8 ## Running Model Inference Install packages required for using Optimum Intel integration with the OpenVINO™ backend: Run model inference:
[ "# Llama-3-70B-Instruct-ov-fp16-int4-sym", "## Built with Meta Llama 3", "## Model Description\n \nThis is a version of the original Meta-Llama-3-70B-Instruct model converted to OpenVINO™ IR (Intermediate Representation) format for optimized inference on Intel® hardware. The model is created using the examples shown in OpenVINO™ Notebooks repository.", "## Intended Use\nThis model is designed for advanced natural language understanding and generation tasks, ideal for academic researchers and developers in commercial settings looking to integrate efficient AI capabilities into their applications. It is not to be used for creating or promoting harmful or illegal content as per the guidelines outlined in the Meta Llama 3 Acceptable Use Policy.", "## Licensing and Redistribution\nThis model is released under the Meta Llama 3 Community License. Redistribution requires inclusion of this license and a citation to the original model. Modifications and derivative works must prominently display \"Built with Meta Llama 3\" and adhere to the redistribution policies detailed in the original model license terms.", "## Weight Compression Parameters\nFor more information on the parameters, refer to the OpenVINO™ 2024.1.0 documentation\n \n* mode: INT4_ASYM\n* group_size: 128\n* ratio: 0.8", "## Running Model Inference\n \nInstall packages required for using Optimum Intel integration with the OpenVINO™ backend:\n \n\n \nRun model inference:" ]
[ "TAGS\n#transformers #openvino #llama #text-generation #OpenVINO #meta #llama-3 #optimum-intel #conversational #en #license-llama3 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# Llama-3-70B-Instruct-ov-fp16-int4-sym", "## Built with Meta Llama 3", "## Model Description\n \nThis is a version of the original Meta-Llama-3-70B-Instruct model converted to OpenVINO™ IR (Intermediate Representation) format for optimized inference on Intel® hardware. The model is created using the examples shown in OpenVINO™ Notebooks repository.", "## Intended Use\nThis model is designed for advanced natural language understanding and generation tasks, ideal for academic researchers and developers in commercial settings looking to integrate efficient AI capabilities into their applications. It is not to be used for creating or promoting harmful or illegal content as per the guidelines outlined in the Meta Llama 3 Acceptable Use Policy.", "## Licensing and Redistribution\nThis model is released under the Meta Llama 3 Community License. Redistribution requires inclusion of this license and a citation to the original model. Modifications and derivative works must prominently display \"Built with Meta Llama 3\" and adhere to the redistribution policies detailed in the original model license terms.", "## Weight Compression Parameters\nFor more information on the parameters, refer to the OpenVINO™ 2024.1.0 documentation\n \n* mode: INT4_ASYM\n* group_size: 128\n* ratio: 0.8", "## Running Model Inference\n \nInstall packages required for using Optimum Intel integration with the OpenVINO™ backend:\n \n\n \nRun model inference:" ]
[ 59, 24, 8, 61, 65, 61, 45, 26 ]
[ "TAGS\n#transformers #openvino #llama #text-generation #OpenVINO #meta #llama-3 #optimum-intel #conversational #en #license-llama3 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Llama-3-70B-Instruct-ov-fp16-int4-sym## Built with Meta Llama 3## Model Description\n \nThis is a version of the original Meta-Llama-3-70B-Instruct model converted to OpenVINO™ IR (Intermediate Representation) format for optimized inference on Intel® hardware. The model is created using the examples shown in OpenVINO™ Notebooks repository.## Intended Use\nThis model is designed for advanced natural language understanding and generation tasks, ideal for academic researchers and developers in commercial settings looking to integrate efficient AI capabilities into their applications. It is not to be used for creating or promoting harmful or illegal content as per the guidelines outlined in the Meta Llama 3 Acceptable Use Policy.## Licensing and Redistribution\nThis model is released under the Meta Llama 3 Community License. Redistribution requires inclusion of this license and a citation to the original model. Modifications and derivative works must prominently display \"Built with Meta Llama 3\" and adhere to the redistribution policies detailed in the original model license terms.## Weight Compression Parameters\nFor more information on the parameters, refer to the OpenVINO™ 2024.1.0 documentation\n \n* mode: INT4_ASYM\n* group_size: 128\n* ratio: 0.8## Running Model Inference\n \nInstall packages required for using Optimum Intel integration with the OpenVINO™ backend:\n \n\n \nRun model inference:" ]
text-generation
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
lunarsylph/mooncell_v39
null
[ "transformers", "safetensors", "llama", "text-generation", "conversational", "arxiv:1910.09700", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2024-04-30T14:37:31+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #llama #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #llama #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ 47, 6, 4, 75, 23, 3, 5, 8, 9, 8, 34, 20, 4, 5, 5, 11, 13, 12, 3, 10, 6, 5, 6, 4, 5, 7, 49, 7, 7, 5, 5, 15, 7, 7, 8, 5 ]
[ "TAGS\n#transformers #safetensors #llama #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact" ]
text-generation
transformers
# GreenBit LLMs This is one of GreenBitAI's pretrained **low-bit** LLMs, offering extreme compression while retaining strong performance. Please refer to our [GitHub page](https://github.com/GreenBitAI/green-bit-llm) for the code to run the model and for more information.
{"license": "apache-2.0"}
GreenBitAI/Qwen-1.5-7B-layer-mix-bpw-4.0
null
[ "transformers", "safetensors", "qwen2", "text-generation", "conversational", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2024-04-30T14:37:43+00:00
[]
[]
TAGS #transformers #safetensors #qwen2 #text-generation #conversational #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# GreenBit LLMs This is GreenBitAI's pretrained low-bit LLMs with extreme compression yet still strong performance. Please refer to our Github page for the code to run the model and more information.
[ "# GreenBit LLMs\n\nThis is GreenBitAI's pretrained low-bit LLMs with extreme compression yet still strong performance.\n\nPlease refer to our Github page for the code to run the model and more information." ]
[ "TAGS\n#transformers #safetensors #qwen2 #text-generation #conversational #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# GreenBit LLMs\n\nThis is GreenBitAI's pretrained low-bit LLMs with extreme compression yet still strong performance.\n\nPlease refer to our Github page for the code to run the model and more information." ]
[ 46, 47 ]
[ "TAGS\n#transformers #safetensors #qwen2 #text-generation #conversational #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# GreenBit LLMs\n\nThis is GreenBitAI's pretrained low-bit LLMs with extreme compression yet still strong performance.\n\nPlease refer to our Github page for the code to run the model and more information." ]
text-generation
transformers
# Uploaded model - **Developed by:** ntvcie - **License:** apache-2.0 - **Finetuned from model :** unsloth/tinyllama-bnb-4bit This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library. [<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
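A minimal generation sketch for the uploaded checkpoint follows, assuming it loads with the plain transformers text-generation pipeline; the repository id is taken from this record and the prompt is illustrative only.

```python
# Minimal sketch: run the uploaded TinyLlama fine-tune with a text-generation pipeline.
from transformers import pipeline

generator = pipeline("text-generation", model="ntvcie/TinyLlamaVinhntV01")
print(generator("Tell me a short story about a robot.", max_new_tokens=64)[0]["generated_text"])
```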
{"language": ["en"], "license": "apache-2.0", "tags": ["text-generation-inference", "transformers", "unsloth", "llama", "trl"], "base_model": "unsloth/tinyllama-bnb-4bit"}
ntvcie/TinyLlamaVinhntV01
null
[ "transformers", "pytorch", "safetensors", "llama", "text-generation", "text-generation-inference", "unsloth", "trl", "en", "base_model:unsloth/tinyllama-bnb-4bit", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
null
2024-04-30T14:38:51+00:00
[]
[ "en" ]
TAGS #transformers #pytorch #safetensors #llama #text-generation #text-generation-inference #unsloth #trl #en #base_model-unsloth/tinyllama-bnb-4bit #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
# Uploaded model - Developed by: ntvcie - License: apache-2.0 - Finetuned from model : unsloth/tinyllama-bnb-4bit This llama model was trained 2x faster with Unsloth and Huggingface's TRL library. <img src="URL width="200"/>
[ "# Uploaded model\n\n- Developed by: ntvcie\n- License: apache-2.0\n- Finetuned from model : unsloth/tinyllama-bnb-4bit\n\nThis llama model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>" ]
[ "TAGS\n#transformers #pytorch #safetensors #llama #text-generation #text-generation-inference #unsloth #trl #en #base_model-unsloth/tinyllama-bnb-4bit #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n", "# Uploaded model\n\n- Developed by: ntvcie\n- License: apache-2.0\n- Finetuned from model : unsloth/tinyllama-bnb-4bit\n\nThis llama model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>" ]
[ 74, 76 ]
[ "TAGS\n#transformers #pytorch #safetensors #llama #text-generation #text-generation-inference #unsloth #trl #en #base_model-unsloth/tinyllama-bnb-4bit #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n# Uploaded model\n\n- Developed by: ntvcie\n- License: apache-2.0\n- Finetuned from model : unsloth/tinyllama-bnb-4bit\n\nThis llama model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>" ]
text-generation
null
# tushar310/Phi-3-mini-128k-instruct-Q4_0-GGUF This model was converted to GGUF format from [`microsoft/Phi-3-mini-128k-instruct`](https://huggingface.co/microsoft/Phi-3-mini-128k-instruct) using llama.cpp via ggml.ai's [GGUF-my-repo](https://huggingface.co/spaces/ggml-org/gguf-my-repo) space. Refer to the [original model card](https://huggingface.co/microsoft/Phi-3-mini-128k-instruct) for more details on the model. ## Use with llama.cpp Install llama.cpp through brew. ```bash brew install ggerganov/ggerganov/llama.cpp ``` Invoke the llama.cpp server or the CLI. CLI: ```bash llama-cli --hf-repo tushar310/Phi-3-mini-128k-instruct-Q4_0-GGUF --model phi-3-mini-128k-instruct.Q4_0.gguf -p "The meaning of life and the universe is" ``` Server: ```bash llama-server --hf-repo tushar310/Phi-3-mini-128k-instruct-Q4_0-GGUF --model phi-3-mini-128k-instruct.Q4_0.gguf -c 2048 ``` Note: You can also use this checkpoint directly through the [usage steps](https://github.com/ggerganov/llama.cpp?tab=readme-ov-file#usage) listed in the llama.cpp repo. ``` git clone https://github.com/ggerganov/llama.cpp && cd llama.cpp && make && ./main -m phi-3-mini-128k-instruct.Q4_0.gguf -n 128 ```
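Beyond the CLI invocations above, the same GGUF file can be used from Python; a hedged sketch with the llama-cpp-python bindings follows (the package, local file name, and sampling settings are assumptions, not from the original card).

```python
# Hedged sketch: load the Q4_0 GGUF with llama-cpp-python (pip install llama-cpp-python).
from llama_cpp import Llama

llm = Llama(model_path="phi-3-mini-128k-instruct.Q4_0.gguf", n_ctx=2048)
out = llm("The meaning of life and the universe is", max_tokens=128)
print(out["choices"][0]["text"])
```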
{"language": ["en"], "license": "mit", "tags": ["nlp", "code", "llama-cpp", "gguf-my-repo"], "license_link": "https://huggingface.co/microsoft/Phi-3-mini-128k-instruct/resolve/main/LICENSE", "pipeline_tag": "text-generation", "widget": [{"messages": [{"role": "user", "content": "Can you provide ways to eat combinations of bananas and dragonfruits?"}]}]}
tushar310/Phi-3-mini-128k-instruct-Q4_0-GGUF
null
[ "gguf", "nlp", "code", "llama-cpp", "gguf-my-repo", "text-generation", "en", "license:mit", "region:us" ]
null
2024-04-30T14:41:07+00:00
[]
[ "en" ]
TAGS #gguf #nlp #code #llama-cpp #gguf-my-repo #text-generation #en #license-mit #region-us
# tushar310/Phi-3-mini-128k-instruct-Q4_0-GGUF This model was converted to GGUF format from 'microsoft/Phi-3-mini-128k-instruct' using URL via the URL's GGUF-my-repo space. Refer to the original model card for more details on the model. ## Use with URL Install URL through brew. Invoke the URL server or the CLI. CLI: Server: Note: You can also use this checkpoint directly through the usage steps listed in the URL repo as well.
[ "# tushar310/Phi-3-mini-128k-instruct-Q4_0-GGUF\nThis model was converted to GGUF format from 'microsoft/Phi-3-mini-128k-instruct' using URL via the URL's GGUF-my-repo space.\nRefer to the original model card for more details on the model.", "## Use with URL\n\nInstall URL through brew.\n\n\nInvoke the URL server or the CLI.\n\nCLI:\n\n\n\nServer:\n\n\n\nNote: You can also use this checkpoint directly through the usage steps listed in the URL repo as well." ]
[ "TAGS\n#gguf #nlp #code #llama-cpp #gguf-my-repo #text-generation #en #license-mit #region-us \n", "# tushar310/Phi-3-mini-128k-instruct-Q4_0-GGUF\nThis model was converted to GGUF format from 'microsoft/Phi-3-mini-128k-instruct' using URL via the URL's GGUF-my-repo space.\nRefer to the original model card for more details on the model.", "## Use with URL\n\nInstall URL through brew.\n\n\nInvoke the URL server or the CLI.\n\nCLI:\n\n\n\nServer:\n\n\n\nNote: You can also use this checkpoint directly through the usage steps listed in the URL repo as well." ]
[ 39, 84, 52 ]
[ "TAGS\n#gguf #nlp #code #llama-cpp #gguf-my-repo #text-generation #en #license-mit #region-us \n# tushar310/Phi-3-mini-128k-instruct-Q4_0-GGUF\nThis model was converted to GGUF format from 'microsoft/Phi-3-mini-128k-instruct' using URL via the URL's GGUF-my-repo space.\nRefer to the original model card for more details on the model.## Use with URL\n\nInstall URL through brew.\n\n\nInvoke the URL server or the CLI.\n\nCLI:\n\n\n\nServer:\n\n\n\nNote: You can also use this checkpoint directly through the usage steps listed in the URL repo as well." ]
text-classification
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # distilbert-base-uncased-finetuned-patient-doctor-text-classifier-eng This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.0516 - Accuracy: 0.9879 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 2 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.0897 | 1.0 | 1547 | 0.0573 | 0.9865 | | 0.0301 | 2.0 | 3094 | 0.0516 | 0.9879 | ### Framework versions - Transformers 4.38.1 - Pytorch 2.1.2 - Datasets 2.1.0 - Tokenizers 0.15.2 - # How to Use ```python from transformers import pipeline classifier = pipeline("text-classification", model="LukeGPT88/patient-doctor-text-classifier-eng") classifier("I see you’ve set aside this special time to humiliate yourself in public.") ``` ```python Output: [{'label': 'NEUTRAL', 'score': 0.9890775680541992}] ``` # Contact Please reach out to [[email protected]]([email protected]) if you have any questions or feedback. ---
{"language": "en", "tags": ["distilbert-base-uncased", "text-classification", "patient", "doctor"], "widget": [{"text": "I've got flu"}, {"text": "I prescribe you some drugs and you need to stay at home for a couple of days"}, {"text": "Let's move to the theatre this evening!"}]}
LukeGPT88/patient-doctor-text-classifier-eng
null
[ "transformers", "tensorboard", "safetensors", "distilbert", "text-classification", "distilbert-base-uncased", "patient", "doctor", "en", "autotrain_compatible", "endpoints_compatible", "region:us" ]
null
2024-04-30T14:41:15+00:00
[]
[ "en" ]
TAGS #transformers #tensorboard #safetensors #distilbert #text-classification #distilbert-base-uncased #patient #doctor #en #autotrain_compatible #endpoints_compatible #region-us
distilbert-base-uncased-finetuned-patient-doctor-text-classifier-eng ==================================================================== This model is a fine-tuned version of distilbert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set: * Loss: 0.0516 * Accuracy: 0.9879 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 2e-05 * train\_batch\_size: 16 * eval\_batch\_size: 16 * seed: 42 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * num\_epochs: 2 ### Training results ### Framework versions * Transformers 4.38.1 * Pytorch 2.1.2 * Datasets 2.1.0 * Tokenizers 0.15.2 * How to Use ========== Contact ======= Please reach out to luca.flammia@URL if you have any questions or feedback. ---
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2", "### Training results", "### Framework versions\n\n\n* Transformers 4.38.1\n* Pytorch 2.1.2\n* Datasets 2.1.0\n* Tokenizers 0.15.2\n* How to Use\n==========\n\n\nContact\n=======\n\n\nPlease reach out to luca.flammia@URL if you have any questions or feedback.\n\n\n\n\n---" ]
[ "TAGS\n#transformers #tensorboard #safetensors #distilbert #text-classification #distilbert-base-uncased #patient #doctor #en #autotrain_compatible #endpoints_compatible #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2", "### Training results", "### Framework versions\n\n\n* Transformers 4.38.1\n* Pytorch 2.1.2\n* Datasets 2.1.0\n* Tokenizers 0.15.2\n* How to Use\n==========\n\n\nContact\n=======\n\n\nPlease reach out to luca.flammia@URL if you have any questions or feedback.\n\n\n\n\n---" ]
[ 47, 101, 5, 85 ]
[ "TAGS\n#transformers #tensorboard #safetensors #distilbert #text-classification #distilbert-base-uncased #patient #doctor #en #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 2### Training results### Framework versions\n\n\n* Transformers 4.38.1\n* Pytorch 2.1.2\n* Datasets 2.1.0\n* Tokenizers 0.15.2\n* How to Use\n==========\n\n\nContact\n=======\n\n\nPlease reach out to luca.flammia@URL if you have any questions or feedback.\n\n\n\n\n---" ]
text-generation
transformers
# Untitled Model (1) This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit). ## Merge Details ### Merge Method This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method using [codellama/CodeLlama-7b-hf](https://huggingface.co/codellama/CodeLlama-7b-hf) as a base. ### Models Merged The following models were included in the merge: * [EleutherAI/llemma_7b](https://huggingface.co/EleutherAI/llemma_7b) ### Configuration The following YAML configuration was used to produce this model: ```yaml models: - model: codellama/CodeLlama-7b-hf parameters: density: 0.5 weight: 0.5 - model: EleutherAI/llemma_7b parameters: density: 0.5 weight: 0.5 merge_method: ties base_model: codellama/CodeLlama-7b-hf parameters: normalize: true int8_mask: true dtype: float16 ```
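The card shows the merge configuration but not the invocation; with mergekit installed, a merge like this is typically produced along the following lines. The command sketch is based on mergekit's documented CLI, and the config filename and output directory are assumed for illustration.

```sh
# Hedged sketch: run the TIES merge described by the YAML above with mergekit's CLI.
pip install mergekit
mergekit-yaml ties-config.yaml ./merged_llemma_codeLlama-ties
```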
{"library_name": "transformers", "tags": ["mergekit", "merge"], "base_model": ["EleutherAI/llemma_7b", "codellama/CodeLlama-7b-hf"]}
JyoP/merged_llemma_codeLlama-ties
null
[ "transformers", "safetensors", "llama", "text-generation", "mergekit", "merge", "arxiv:2306.01708", "base_model:EleutherAI/llemma_7b", "base_model:codellama/CodeLlama-7b-hf", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2024-04-30T14:41:16+00:00
[ "2306.01708" ]
[]
TAGS #transformers #safetensors #llama #text-generation #mergekit #merge #arxiv-2306.01708 #base_model-EleutherAI/llemma_7b #base_model-codellama/CodeLlama-7b-hf #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# Untitled Model (1) This is a merge of pre-trained language models created using mergekit. ## Merge Details ### Merge Method This model was merged using the TIES merge method using codellama/CodeLlama-7b-hf as a base. ### Models Merged The following models were included in the merge: * EleutherAI/llemma_7b ### Configuration The following YAML configuration was used to produce this model:
[ "# Untitled Model (1)\n\nThis is a merge of pre-trained language models created using mergekit.", "## Merge Details", "### Merge Method\n\nThis model was merged using the TIES merge method using codellama/CodeLlama-7b-hf as a base.", "### Models Merged\n\nThe following models were included in the merge:\n* EleutherAI/llemma_7b", "### Configuration\n\nThe following YAML configuration was used to produce this model:" ]
[ "TAGS\n#transformers #safetensors #llama #text-generation #mergekit #merge #arxiv-2306.01708 #base_model-EleutherAI/llemma_7b #base_model-codellama/CodeLlama-7b-hf #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# Untitled Model (1)\n\nThis is a merge of pre-trained language models created using mergekit.", "## Merge Details", "### Merge Method\n\nThis model was merged using the TIES merge method using codellama/CodeLlama-7b-hf as a base.", "### Models Merged\n\nThe following models were included in the merge:\n* EleutherAI/llemma_7b", "### Configuration\n\nThe following YAML configuration was used to produce this model:" ]
[ 84, 21, 4, 32, 26, 16 ]
[ "TAGS\n#transformers #safetensors #llama #text-generation #mergekit #merge #arxiv-2306.01708 #base_model-EleutherAI/llemma_7b #base_model-codellama/CodeLlama-7b-hf #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Untitled Model (1)\n\nThis is a merge of pre-trained language models created using mergekit.## Merge Details### Merge Method\n\nThis model was merged using the TIES merge method using codellama/CodeLlama-7b-hf as a base.### Models Merged\n\nThe following models were included in the merge:\n* EleutherAI/llemma_7b### Configuration\n\nThe following YAML configuration was used to produce this model:" ]
null
transformers
## About <!-- ### quantize_version: 2 --> <!-- ### output_tensor_quantised: 1 --> <!-- ### convert_type: --> <!-- ### vocab_type: --> static quants of https://huggingface.co/fiveflow/KoLlama-3-8B-Instruct <!-- provided-files --> weighted/imatrix quants seem not to be available (by me) at this time. If they do not show up a week or so after the static ones, I have probably not planned for them. Feel free to request them by opening a Community Discussion. ## Usage If you are unsure how to use GGUF files, refer to one of [TheBloke's READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for more details, including on how to concatenate multi-part files. ## Provided Quants (sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants) | Link | Type | Size/GB | Notes | |:-----|:-----|--------:|:------| | [GGUF](https://huggingface.co/mradermacher/KoLlama-3-8B-Instruct-GGUF/resolve/main/KoLlama-3-8B-Instruct.Q2_K.gguf) | Q2_K | 3.3 | | | [GGUF](https://huggingface.co/mradermacher/KoLlama-3-8B-Instruct-GGUF/resolve/main/KoLlama-3-8B-Instruct.IQ3_XS.gguf) | IQ3_XS | 3.6 | | | [GGUF](https://huggingface.co/mradermacher/KoLlama-3-8B-Instruct-GGUF/resolve/main/KoLlama-3-8B-Instruct.Q3_K_S.gguf) | Q3_K_S | 3.8 | | | [GGUF](https://huggingface.co/mradermacher/KoLlama-3-8B-Instruct-GGUF/resolve/main/KoLlama-3-8B-Instruct.IQ3_S.gguf) | IQ3_S | 3.8 | beats Q3_K* | | [GGUF](https://huggingface.co/mradermacher/KoLlama-3-8B-Instruct-GGUF/resolve/main/KoLlama-3-8B-Instruct.IQ3_M.gguf) | IQ3_M | 3.9 | | | [GGUF](https://huggingface.co/mradermacher/KoLlama-3-8B-Instruct-GGUF/resolve/main/KoLlama-3-8B-Instruct.Q3_K_M.gguf) | Q3_K_M | 4.1 | lower quality | | [GGUF](https://huggingface.co/mradermacher/KoLlama-3-8B-Instruct-GGUF/resolve/main/KoLlama-3-8B-Instruct.Q3_K_L.gguf) | Q3_K_L | 4.4 | | | [GGUF](https://huggingface.co/mradermacher/KoLlama-3-8B-Instruct-GGUF/resolve/main/KoLlama-3-8B-Instruct.IQ4_XS.gguf) | IQ4_XS | 4.6 | | | [GGUF](https://huggingface.co/mradermacher/KoLlama-3-8B-Instruct-GGUF/resolve/main/KoLlama-3-8B-Instruct.Q4_K_S.gguf) | Q4_K_S | 4.8 | fast, recommended | | [GGUF](https://huggingface.co/mradermacher/KoLlama-3-8B-Instruct-GGUF/resolve/main/KoLlama-3-8B-Instruct.Q4_K_M.gguf) | Q4_K_M | 5.0 | fast, recommended | | [GGUF](https://huggingface.co/mradermacher/KoLlama-3-8B-Instruct-GGUF/resolve/main/KoLlama-3-8B-Instruct.Q5_K_S.gguf) | Q5_K_S | 5.7 | | | [GGUF](https://huggingface.co/mradermacher/KoLlama-3-8B-Instruct-GGUF/resolve/main/KoLlama-3-8B-Instruct.Q5_K_M.gguf) | Q5_K_M | 5.8 | | | [GGUF](https://huggingface.co/mradermacher/KoLlama-3-8B-Instruct-GGUF/resolve/main/KoLlama-3-8B-Instruct.Q6_K.gguf) | Q6_K | 6.7 | very good quality | | [GGUF](https://huggingface.co/mradermacher/KoLlama-3-8B-Instruct-GGUF/resolve/main/KoLlama-3-8B-Instruct.Q8_0.gguf) | Q8_0 | 8.6 | fast, best quality | | [GGUF](https://huggingface.co/mradermacher/KoLlama-3-8B-Instruct-GGUF/resolve/main/KoLlama-3-8B-Instruct.f16.gguf) | f16 | 16.2 | 16 bpw, overkill | Here is a handy graph by ikawrakow comparing some lower-quality quant types (lower is better): ![image.png](https://www.nethype.de/huggingface_embed/quantpplgraph.png) And here are Artefact2's thoughts on the matter: https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9 ## FAQ / Model Request See https://huggingface.co/mradermacher/model_requests for some answers to questions you might have and/or if you want some other model quantized. 
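As a hedged illustration of the usage the card points to, the sketch below downloads one of the quants listed in the table above and runs it with `llama-cpp-python`; the file name comes from the Q4_K_M row, while the context size and prompt are assumptions:

```python
# Hedged sketch: fetch a listed GGUF quant and run it locally.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

gguf_path = hf_hub_download(
    repo_id="mradermacher/KoLlama-3-8B-Instruct-GGUF",
    filename="KoLlama-3-8B-Instruct.Q4_K_M.gguf",  # "fast, recommended" in the table above
)

llm = Llama(model_path=gguf_path, n_ctx=4096)  # n_ctx is an assumption
out = llm("Explain GGUF quantization in one sentence.", max_tokens=64)
print(out["choices"][0]["text"])
```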
## Thanks I thank my company, [nethype GmbH](https://www.nethype.de/), for letting me use its servers and providing upgrades to my workstation to enable this work in my free time. <!-- end -->
{"language": ["en"], "license": "apache-2.0", "library_name": "transformers", "tags": ["generated_from_trainer"], "base_model": "fiveflow/KoLlama-3-8B-Instruct", "quantized_by": "mradermacher"}
mradermacher/KoLlama-3-8B-Instruct-GGUF
null
[ "transformers", "gguf", "generated_from_trainer", "en", "base_model:fiveflow/KoLlama-3-8B-Instruct", "license:apache-2.0", "endpoints_compatible", "region:us" ]
null
2024-04-30T14:41:45+00:00
[]
[ "en" ]
TAGS #transformers #gguf #generated_from_trainer #en #base_model-fiveflow/KoLlama-3-8B-Instruct #license-apache-2.0 #endpoints_compatible #region-us
About ----- static quants of URL weighted/imatrix quants seem not to be available (by me) at this time. If they do not show up a week or so after the static ones, I have probably not planned for them. Feel free to request them by opening a Community Discussion. Usage ----- If you are unsure how to use GGUF files, refer to one of TheBloke's READMEs for more details, including on how to concatenate multi-part files. Provided Quants --------------- (sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants) Here is a handy graph by ikawrakow comparing some lower-quality quant types (lower is better): !URL And here are Artefact2's thoughts on the matter: URL FAQ / Model Request ------------------- See URL for some answers to questions you might have and/or if you want some other model quantized. Thanks ------ I thank my company, nethype GmbH, for letting me use its servers and providing upgrades to my workstation to enable this work in my free time.
[]
[ "TAGS\n#transformers #gguf #generated_from_trainer #en #base_model-fiveflow/KoLlama-3-8B-Instruct #license-apache-2.0 #endpoints_compatible #region-us \n" ]
[ 51 ]
[ "TAGS\n#transformers #gguf #generated_from_trainer #en #base_model-fiveflow/KoLlama-3-8B-Instruct #license-apache-2.0 #endpoints_compatible #region-us \n" ]
text-classification
transformers
<!-- This model card has been generated automatically according to the information Keras had access to. You should probably proofread and complete it, then remove this comment. --> # ai-text-detector This model is a fine-tuned version of [tf_model.h5](https://huggingface.co/Yuhhi/tf_model.h5) on an unknown dataset. It achieves the following results on the evaluation set: ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - optimizer: None - training_precision: float32 ### Training results ### Framework versions - Transformers 4.32.1 - TensorFlow 2.15.0 - Datasets 2.12.0 - Tokenizers 0.13.2
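A hedged usage sketch (not in the original card): the repo ships TensorFlow DistilBERT weights, so the text-classification pipeline below should load them directly; label names and accuracy are unknown because the training dataset is unspecified, so the output is illustrative only:

```python
# Hedged sketch: scoring a sample passage with the fine-tuned detector.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="Yuhhi/ai-text-detector",
    framework="tf",  # the repository contains TensorFlow weights
)

print(classifier("This paragraph was written by a large language model."))
```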
{"tags": ["generated_from_keras_callback"], "base_model": "tf_model.h5", "model-index": [{"name": "ai-text-detector", "results": []}]}
Yuhhi/ai-text-detector
null
[ "transformers", "tf", "distilbert", "text-classification", "generated_from_keras_callback", "base_model:tf_model.h5", "autotrain_compatible", "endpoints_compatible", "has_space", "region:us" ]
null
2024-04-30T14:42:08+00:00
[]
[]
TAGS #transformers #tf #distilbert #text-classification #generated_from_keras_callback #base_model-tf_model.h5 #autotrain_compatible #endpoints_compatible #has_space #region-us
# ai-text-detector This model is a fine-tuned version of tf_model.h5 on an unknown dataset. It achieves the following results on the evaluation set: ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - optimizer: None - training_precision: float32 ### Training results ### Framework versions - Transformers 4.32.1 - TensorFlow 2.15.0 - Datasets 2.12.0 - Tokenizers 0.13.2
[ "# ai-text-detector\n\nThis model is a fine-tuned version of tf_model.h5 on an unknown dataset.\nIt achieves the following results on the evaluation set:", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- optimizer: None\n- training_precision: float32", "### Training results", "### Framework versions\n\n- Transformers 4.32.1\n- TensorFlow 2.15.0\n- Datasets 2.12.0\n- Tokenizers 0.13.2" ]
[ "TAGS\n#transformers #tf #distilbert #text-classification #generated_from_keras_callback #base_model-tf_model.h5 #autotrain_compatible #endpoints_compatible #has_space #region-us \n", "# ai-text-detector\n\nThis model is a fine-tuned version of tf_model.h5 on an unknown dataset.\nIt achieves the following results on the evaluation set:", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- optimizer: None\n- training_precision: float32", "### Training results", "### Framework versions\n\n- Transformers 4.32.1\n- TensorFlow 2.15.0\n- Datasets 2.12.0\n- Tokenizers 0.13.2" ]
[ 54, 39, 7, 9, 9, 4, 32, 5, 38 ]
[ "TAGS\n#transformers #tf #distilbert #text-classification #generated_from_keras_callback #base_model-tf_model.h5 #autotrain_compatible #endpoints_compatible #has_space #region-us \n# ai-text-detector\n\nThis model is a fine-tuned version of tf_model.h5 on an unknown dataset.\nIt achieves the following results on the evaluation set:## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- optimizer: None\n- training_precision: float32### Training results### Framework versions\n\n- Transformers 4.32.1\n- TensorFlow 2.15.0\n- Datasets 2.12.0\n- Tokenizers 0.13.2" ]
text-generation
transformers
# AnFeng **~30B**, SFT... **Chinese, English** Test 0 of all. Released as an early preview of our v3 LLMs. The v3 series covers the "Shi-Ci", "AnFeng" and "Cecilia" LLM products. The sizes are labelled from small to large "Nano" "Leap" "Pattern" "Avocet" "Robin" "Kestrel"
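A hedged generation sketch (not part of the original card), assuming the preview checkpoint in this record (`NLPark/AnFeng_v3_Avocet`) exposes a chat template through `transformers`; dtype, device mapping and the prompt are assumptions, and a ~30B model needs substantial GPU memory:

```python
# Hedged sketch: chat-style generation with the AnFeng preview checkpoint.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "NLPark/AnFeng_v3_Avocet"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [{"role": "user", "content": "用一句话介绍你自己。"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```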
{"language": ["en"], "license": "cc-by-nc-nd-4.0", "library_name": "transformers", "pipeline_tag": "text-generation", "inference": false}
NLPark/AnFeng_v3_Avocet
null
[ "transformers", "safetensors", "cohere", "text-generation", "conversational", "en", "license:cc-by-nc-nd-4.0", "autotrain_compatible", "text-generation-inference", "region:us" ]
null
2024-04-30T14:42:55+00:00
[]
[ "en" ]
TAGS #transformers #safetensors #cohere #text-generation #conversational #en #license-cc-by-nc-nd-4.0 #autotrain_compatible #text-generation-inference #region-us
# AnFeng ~30B, SFT... Chinese, English Test 0 of all. Released as an early preview of our v3 LLMs. The v3 series covers the "Shi-Ci", "AnFeng" and "Cecilia" LLM products. The sizes are labelled from small to large "Nano" "Leap" "Pattern" "Avocet" "Robin" "Kestrel"
[ "# AnFeng\n~30B, SFT...\n\nChinese, English\nTest 0 of all.\nReleased as an early preview of our v3 LLMs.\nThe v3 series covers the \"Shi-Ci\", \"AnFeng\" and \"Cecilia\" LLM products.\nThe sizes are labelled from small to large \"Nano\" \"Leap\" \"Pattern\" \"Avocet \"Robin\" \"Kestrel\"" ]
[ "TAGS\n#transformers #safetensors #cohere #text-generation #conversational #en #license-cc-by-nc-nd-4.0 #autotrain_compatible #text-generation-inference #region-us \n", "# AnFeng\n~30B, SFT...\n\nChinese, English\nTest 0 of all.\nReleased as an early preview of our v3 LLMs.\nThe v3 series covers the \"Shi-Ci\", \"AnFeng\" and \"Cecilia\" LLM products.\nThe sizes are labelled from small to large \"Nano\" \"Leap\" \"Pattern\" \"Avocet \"Robin\" \"Kestrel\"" ]
[ 50, 84 ]
[ "TAGS\n#transformers #safetensors #cohere #text-generation #conversational #en #license-cc-by-nc-nd-4.0 #autotrain_compatible #text-generation-inference #region-us \n# AnFeng\n~30B, SFT...\n\nChinese, English\nTest 0 of all.\nReleased as an early preview of our v3 LLMs.\nThe v3 series covers the \"Shi-Ci\", \"AnFeng\" and \"Cecilia\" LLM products.\nThe sizes are labelled from small to large \"Nano\" \"Leap\" \"Pattern\" \"Avocet \"Robin\" \"Kestrel\"" ]
question-answering
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # roberta_es_v2 This model is a fine-tuned version of [FacebookAI/roberta-base](https://huggingface.co/FacebookAI/roberta-base) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 1.5649 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 24 - eval_batch_size: 24 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:-----:|:----:|:---------------:| | 2.3046 | 1.0 | 724 | 1.6641 | | 1.6565 | 2.0 | 1448 | 1.5655 | | 1.3038 | 3.0 | 2172 | 1.5649 | ### Framework versions - Transformers 4.41.0.dev0 - Pytorch 2.2.1+cu121 - Datasets 2.19.0 - Tokenizers 0.19.1
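A hedged usage sketch (not in the original card): extractive question answering with the fine-tuned checkpoint from this record; since the training data is listed as unknown, the question/context pair below is purely illustrative:

```python
# Hedged sketch: extractive QA with the fine-tuned RoBERTa checkpoint.
from transformers import pipeline

qa = pipeline("question-answering", model="enriquesaou/roberta_es_v2")

result = qa(
    question="What was the final validation loss?",
    context="The model was fine-tuned for three epochs and reached a validation loss of 1.5649.",
)
print(result["answer"], result["score"])
```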
{"license": "mit", "tags": ["generated_from_trainer"], "base_model": "FacebookAI/roberta-base", "model-index": [{"name": "roberta_es_v2", "results": []}]}
enriquesaou/roberta_es_v2
null
[ "transformers", "tensorboard", "safetensors", "roberta", "question-answering", "generated_from_trainer", "base_model:FacebookAI/roberta-base", "license:mit", "endpoints_compatible", "region:us" ]
null
2024-04-30T14:43:17+00:00
[]
[]
TAGS #transformers #tensorboard #safetensors #roberta #question-answering #generated_from_trainer #base_model-FacebookAI/roberta-base #license-mit #endpoints_compatible #region-us
roberta\_es\_v2 =============== This model is a fine-tuned version of FacebookAI/roberta-base on an unknown dataset. It achieves the following results on the evaluation set: * Loss: 1.5649 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 2e-05 * train\_batch\_size: 24 * eval\_batch\_size: 24 * seed: 42 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * num\_epochs: 3 ### Training results ### Framework versions * Transformers 4.41.0.dev0 * Pytorch 2.2.1+cu121 * Datasets 2.19.0 * Tokenizers 0.19.1
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 24\n* eval\\_batch\\_size: 24\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3", "### Training results", "### Framework versions\n\n\n* Transformers 4.41.0.dev0\n* Pytorch 2.2.1+cu121\n* Datasets 2.19.0\n* Tokenizers 0.19.1" ]
[ "TAGS\n#transformers #tensorboard #safetensors #roberta #question-answering #generated_from_trainer #base_model-FacebookAI/roberta-base #license-mit #endpoints_compatible #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 24\n* eval\\_batch\\_size: 24\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3", "### Training results", "### Framework versions\n\n\n* Transformers 4.41.0.dev0\n* Pytorch 2.2.1+cu121\n* Datasets 2.19.0\n* Tokenizers 0.19.1" ]
[ 46, 101, 5, 47 ]
[ "TAGS\n#transformers #tensorboard #safetensors #roberta #question-answering #generated_from_trainer #base_model-FacebookAI/roberta-base #license-mit #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 24\n* eval\\_batch\\_size: 24\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3### Training results### Framework versions\n\n\n* Transformers 4.41.0.dev0\n* Pytorch 2.2.1+cu121\n* Datasets 2.19.0\n* Tokenizers 0.19.1" ]
fill-mask
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # final-ft__roberta-base-biomedical-clinical-es__70k-ultrasounds This model is a fine-tuned version of [BSC-LT/roberta-base-biomedical-clinical-es](https://huggingface.co/BSC-LT/roberta-base-biomedical-clinical-es) on the None dataset. It achieves the following results on the evaluation set: - Loss: 0.5971 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 256 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 20 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:-------:|:----:|:---------------:| | No log | 0.9978 | 229 | 0.9388 | | No log | 2.0 | 459 | 0.8139 | | No log | 2.9978 | 688 | 0.7544 | | 0.9622 | 4.0 | 918 | 0.7302 | | 0.9622 | 4.9978 | 1147 | 0.6878 | | 0.9622 | 6.0 | 1377 | 0.6754 | | 0.9622 | 6.9978 | 1606 | 0.6625 | | 0.715 | 8.0 | 1836 | 0.6431 | | 0.715 | 8.9978 | 2065 | 0.6278 | | 0.715 | 10.0 | 2295 | 0.6361 | | 0.715 | 10.9978 | 2524 | 0.6296 | | 0.6597 | 12.0 | 2754 | 0.6164 | | 0.6597 | 12.9978 | 2983 | 0.6117 | | 0.6597 | 14.0 | 3213 | 0.6052 | | 0.6597 | 14.9978 | 3442 | 0.6064 | | 0.6354 | 16.0 | 3672 | 0.6225 | | 0.6354 | 16.9978 | 3901 | 0.5974 | | 0.6354 | 18.0 | 4131 | 0.6010 | | 0.6354 | 18.9978 | 4360 | 0.5816 | | 0.6354 | 19.9564 | 4580 | 0.5971 | ### Framework versions - Transformers 4.40.0 - Pytorch 2.2.1+cu121 - Datasets 2.19.0 - Tokenizers 0.19.1
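A hedged usage sketch (not in the original card): masked-token prediction with the domain-adapted checkpoint from this record; the Spanish ultrasound-style sentence is an assumption, not taken from the (unspecified) training data:

```python
# Hedged sketch: fill-mask inference with the clinical-domain RoBERTa.
from transformers import pipeline

fill = pipeline(
    "fill-mask",
    model="manucos/final-ft__roberta-base-biomedical-clinical-es__70k-ultrasounds",
)

for pred in fill("La ecografía muestra un <mask> hepático de aspecto benigno."):
    print(pred["token_str"], round(pred["score"], 3))
```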
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "base_model": "BSC-LT/roberta-base-biomedical-clinical-es", "model-index": [{"name": "final-ft__roberta-base-biomedical-clinical-es__70k-ultrasounds", "results": []}]}
manucos/final-ft__roberta-base-biomedical-clinical-es__70k-ultrasounds
null
[ "transformers", "tensorboard", "safetensors", "roberta", "fill-mask", "generated_from_trainer", "base_model:BSC-LT/roberta-base-biomedical-clinical-es", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
null
2024-04-30T14:45:24+00:00
[]
[]
TAGS #transformers #tensorboard #safetensors #roberta #fill-mask #generated_from_trainer #base_model-BSC-LT/roberta-base-biomedical-clinical-es #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
final-ft\_\_roberta-base-biomedical-clinical-es\_\_70k-ultrasounds ================================================================== This model is a fine-tuned version of BSC-LT/roberta-base-biomedical-clinical-es on the None dataset. It achieves the following results on the evaluation set: * Loss: 0.5971 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 2e-05 * train\_batch\_size: 64 * eval\_batch\_size: 64 * seed: 42 * gradient\_accumulation\_steps: 4 * total\_train\_batch\_size: 256 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * num\_epochs: 20 * mixed\_precision\_training: Native AMP ### Training results ### Framework versions * Transformers 4.40.0 * Pytorch 2.2.1+cu121 * Datasets 2.19.0 * Tokenizers 0.19.1
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 64\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 256\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 20\n* mixed\\_precision\\_training: Native AMP", "### Training results", "### Framework versions\n\n\n* Transformers 4.40.0\n* Pytorch 2.2.1+cu121\n* Datasets 2.19.0\n* Tokenizers 0.19.1" ]
[ "TAGS\n#transformers #tensorboard #safetensors #roberta #fill-mask #generated_from_trainer #base_model-BSC-LT/roberta-base-biomedical-clinical-es #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 64\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 256\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 20\n* mixed\\_precision\\_training: Native AMP", "### Training results", "### Framework versions\n\n\n* Transformers 4.40.0\n* Pytorch 2.2.1+cu121\n* Datasets 2.19.0\n* Tokenizers 0.19.1" ]
[ 62, 135, 5, 44 ]
[ "TAGS\n#transformers #tensorboard #safetensors #roberta #fill-mask #generated_from_trainer #base_model-BSC-LT/roberta-base-biomedical-clinical-es #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 64\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 256\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 20\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.40.0\n* Pytorch 2.2.1+cu121\n* Datasets 2.19.0\n* Tokenizers 0.19.1" ]
null
peft
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # falcon7b-finetuned-On-Screenplays This model is a fine-tuned version of [vilsonrodrigues/falcon-7b-instruct-sharded](https://huggingface.co/vilsonrodrigues/falcon-7b-instruct-sharded) on an unknown dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 16 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - lr_scheduler_warmup_ratio: 0.03 - training_steps: 180 - mixed_precision_training: Native AMP ### Training results ### Framework versions - PEFT 0.10.1.dev0 - Transformers 4.40.1 - Pytorch 2.2.1+cu121 - Datasets 2.19.0 - Tokenizers 0.19.1
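A hedged sketch (not part of the original card) of how a PEFT adapter like this one is typically attached to its base model at inference time; the repo ids come from this record, while dtype/device settings and the prompt are assumptions:

```python
# Hedged sketch: loading the LoRA/PEFT adapter on top of the sharded Falcon base.
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "vilsonrodrigues/falcon-7b-instruct-sharded"
adapter_id = "Elhassnaoui-2001/falcon7b-finetuned-On-Screenplays"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.bfloat16, device_map="auto"
)
model = PeftModel.from_pretrained(base, adapter_id)

prompt = "INT. COFFEE SHOP - NIGHT\n\nTwo strangers reach for the same cup."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=120)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```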
{"license": "apache-2.0", "library_name": "peft", "tags": ["trl", "sft", "generated_from_trainer"], "base_model": "vilsonrodrigues/falcon-7b-instruct-sharded", "model-index": [{"name": "falcon7b-finetuned-On-Screenplays", "results": []}]}
Elhassnaoui-2001/falcon7b-finetuned-On-Screenplays
null
[ "peft", "tensorboard", "safetensors", "trl", "sft", "generated_from_trainer", "base_model:vilsonrodrigues/falcon-7b-instruct-sharded", "license:apache-2.0", "region:us" ]
null
2024-04-30T14:46:01+00:00
[]
[]
TAGS #peft #tensorboard #safetensors #trl #sft #generated_from_trainer #base_model-vilsonrodrigues/falcon-7b-instruct-sharded #license-apache-2.0 #region-us
# falcon7b-finetuned-On-Screenplays This model is a fine-tuned version of vilsonrodrigues/falcon-7b-instruct-sharded on an unknown dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 16 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - lr_scheduler_warmup_ratio: 0.03 - training_steps: 180 - mixed_precision_training: Native AMP ### Training results ### Framework versions - PEFT 0.10.1.dev0 - Transformers 4.40.1 - Pytorch 2.2.1+cu121 - Datasets 2.19.0 - Tokenizers 0.19.1
[ "# falcon7b-finetuned-On-Screenplays\n\nThis model is a fine-tuned version of vilsonrodrigues/falcon-7b-instruct-sharded on an unknown dataset.", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0002\n- train_batch_size: 16\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 64\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_ratio: 0.03\n- training_steps: 180\n- mixed_precision_training: Native AMP", "### Training results", "### Framework versions\n\n- PEFT 0.10.1.dev0\n- Transformers 4.40.1\n- Pytorch 2.2.1+cu121\n- Datasets 2.19.0\n- Tokenizers 0.19.1" ]
[ "TAGS\n#peft #tensorboard #safetensors #trl #sft #generated_from_trainer #base_model-vilsonrodrigues/falcon-7b-instruct-sharded #license-apache-2.0 #region-us \n", "# falcon7b-finetuned-On-Screenplays\n\nThis model is a fine-tuned version of vilsonrodrigues/falcon-7b-instruct-sharded on an unknown dataset.", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0002\n- train_batch_size: 16\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 64\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_ratio: 0.03\n- training_steps: 180\n- mixed_precision_training: Native AMP", "### Training results", "### Framework versions\n\n- PEFT 0.10.1.dev0\n- Transformers 4.40.1\n- Pytorch 2.2.1+cu121\n- Datasets 2.19.0\n- Tokenizers 0.19.1" ]
[ 56, 44, 7, 9, 9, 4, 135, 5, 55 ]
[ "TAGS\n#peft #tensorboard #safetensors #trl #sft #generated_from_trainer #base_model-vilsonrodrigues/falcon-7b-instruct-sharded #license-apache-2.0 #region-us \n# falcon7b-finetuned-On-Screenplays\n\nThis model is a fine-tuned version of vilsonrodrigues/falcon-7b-instruct-sharded on an unknown dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0002\n- train_batch_size: 16\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 64\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_ratio: 0.03\n- training_steps: 180\n- mixed_precision_training: Native AMP### Training results### Framework versions\n\n- PEFT 0.10.1.dev0\n- Transformers 4.40.1\n- Pytorch 2.2.1+cu121\n- Datasets 2.19.0\n- Tokenizers 0.19.1" ]
text-generation
transformers
## **Llama-3-8B-Instruct-Gradient-1048k** [exllamav2](https://github.com/turboderp/exllamav2) quant for [gradientai/Llama-3-8B-Instruct-Gradient-1048k](https://huggingface.co/gradientai/Llama-3-8B-Instruct-Gradient-1048k) **Original model information:** <a href="https://www.gradient.ai" target="_blank"><img src="https://cdn-uploads.huggingface.co/production/uploads/655bb613e8a8971e89944f3e/TSa3V8YpoVagnTYgxiLaO.png" width="200"/></a> # Llama-3 8B Gradient Instruct 1048k Gradient incorporates your data to deploy autonomous assistants that power critical operations across your business. If you're looking to build custom AI models or agents, email us a message [email protected]. For more info see our [End-to-end development service for custom LLMs and AI systems](https://gradient.ai/development-lab) This model extends LLama-3 8B's context length from 8k to > 1040K, developed by Gradient, sponsored by compute from [Crusoe Energy](https://huggingface.co/crusoeai). It demonstrates that SOTA LLMs can learn to operate on long context with minimal training by appropriately adjusting RoPE theta. We trained on 830M tokens for this stage, and 1.4B tokens total for all stages, which is < 0.01% of Llama-3's original pre-training data. ![image/png](https://cdn-uploads.huggingface.co/production/uploads/6585dc9be92bc5f258156bd6/6MKLoX2ruLIaREiyb6coO.png) **Approach:** - [meta-llama/Meta-Llama-3-8B-Instruct](https://huggingface.co/meta-llama/Meta-Llama-3-8B-Instruct) as the base - NTK-aware interpolation [1] to initialize an optimal schedule for RoPE theta, followed by empirical RoPE theta optimization - Progressive training on increasing context lengths, similar to [Large World Model](https://huggingface.co/LargeWorldModel) [2] (See details below) **Infra:** We build on top of the EasyContext Blockwise RingAttention library [3] to scalably and efficiently train on contexts up to 1048k tokens on [Crusoe Energy](https://huggingface.co/crusoeai) high performance L40S cluster. Notably, we layered parallelism on top of Ring Attention with a custom network topology to better leverage large GPU clusters in the face of network bottlenecks from passing many KV blocks between devices. This gave us a 33x speedup in model training (compare 524k and 1048k to 65k and 262k in the table below). **Data:** For training data, we generate long contexts by augmenting [SlimPajama](https://huggingface.co/datasets/cerebras/SlimPajama-627B). **Progressive Training Details:** | | 65K | 262K | 524k | 1048k | |------------------------|-----------|-----------|-----------|-----------| | Initialize From | LLaMA-3 8B| 65K | 262K | 524k | | Sequence Length 2^N | 16 | 18 | 19 | 20 | | RoPE theta | 15.3 M | 207.1 M | 1.06B | 2.80B | | Batch Size | 1 | 1 | 16 | 16 | | Gradient Accumulation Steps | 32 | 16 | 1 | 1 | | Steps | 30 | 24 | 50 | 50 | | Total Tokens | 62914560 | 100663296 | 419430400 | 838860800 | | Learning Rate | 2.00E-05 | 2.00E-05 | 2.00E-05 | 2.00E-05 | | # GPUs | 8 | 32 | 512 | 512 | | GPU Type | NVIDIA L40S | NVIDIA L40S | NVIDIA L40S | NVIDIA L40S | | Minutes to Train (Wall)| 202 | 555 | 61 | 87 | **Quants**: - [GGUF](https://huggingface.co/crusoeai/Llama-3-8B-Instruct-1048k-GGUF) - [MLX-4bit](https://huggingface.co/mlx-community/Llama-3-8B-Instruct-1048k-4bit) ## The Gradient AI Team https://gradient.ai/ Gradient is accelerating AI transformation across industries. Our AI Foundry incorporates your data to deploy autonomous assistants that power critical operations across your business. 
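Because this repository is an EXL2 quant rather than standard safetensors, the transformers snippets further down apply to the original weights; as a hedged sketch of consuming the quant itself, the code below follows the exllamav2 example scripts as of early 2024 (class and method names may differ in newer releases, and the sampling settings are assumptions):

```python
# Hedged sketch: loading the 8.0bpw EXL2 quant with the exllamav2 Python API.
from huggingface_hub import snapshot_download
from exllamav2 import ExLlamaV2, ExLlamaV2Config, ExLlamaV2Cache, ExLlamaV2Tokenizer
from exllamav2.generator import ExLlamaV2BaseGenerator, ExLlamaV2Sampler

model_dir = snapshot_download("Slvcxc/Llama-3-8B-Instruct-Gradient-1048k-8.0bpw-h8-exl2")

config = ExLlamaV2Config()
config.model_dir = model_dir
config.prepare()

model = ExLlamaV2(config)
cache = ExLlamaV2Cache(model, lazy=True)
model.load_autosplit(cache)
tokenizer = ExLlamaV2Tokenizer(config)

generator = ExLlamaV2BaseGenerator(model, cache, tokenizer)
settings = ExLlamaV2Sampler.Settings()
settings.temperature = 0.7  # assumption
settings.top_p = 0.9        # assumption

generator.warmup()
print(generator.generate_simple("Summarize the idea of RoPE theta scaling:", settings, 128))
```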
## Contact Us Drop an email to [[email protected]](mailto:[email protected]) ## References [1] Peng, Bowen, et al. "Yarn: Efficient context window extension of large language models." arXiv preprint arXiv:2309.00071 (2023). [2] Liu, Hao, et al. "World Model on Million-Length Video And Language With RingAttention." arXiv preprint arXiv:2402.08268 (2024). [3] https://github.com/jzhang38/EasyContext ---- # Base Model ## Model Details Meta developed and released the Meta Llama 3 family of large language models (LLMs), a collection of pretrained and instruction tuned generative text models in 8 and 70B sizes. The Llama 3 instruction tuned models are optimized for dialogue use cases and outperform many of the available open source chat models on common industry benchmarks. Further, in developing these models, we took great care to optimize helpfulness and safety. **Model developers** Meta **Variations** Llama 3 comes in two sizes — 8B and 70B parameters — in pre-trained and instruction tuned variants. **Input** Models input text only. **Output** Models generate text and code only. **Model Architecture** Llama 3 is an auto-regressive language model that uses an optimized transformer architecture. The tuned versions use supervised fine-tuning (SFT) and reinforcement learning with human feedback (RLHF) to align with human preferences for helpfulness and safety. <table> <tr> <td> </td> <td><strong>Training Data</strong> </td> <td><strong>Params</strong> </td> <td><strong>Context length</strong> </td> <td><strong>GQA</strong> </td> <td><strong>Token count</strong> </td> <td><strong>Knowledge cutoff</strong> </td> </tr> <tr> <td rowspan="2" >Llama 3 </td> <td rowspan="2" >A new mix of publicly available online data. </td> <td>8B </td> <td>8k </td> <td>Yes </td> <td rowspan="2" >15T+ </td> <td>March, 2023 </td> </tr> <tr> <td>70B </td> <td>8k </td> <td>Yes </td> <td>December, 2023 </td> </tr> </table> **Llama 3 family of models**. Token counts refer to pretraining data only. Both the 8 and 70B versions use Grouped-Query Attention (GQA) for improved inference scalability. **Model Release Date** April 18, 2024. **Status** This is a static model trained on an offline dataset. Future versions of the tuned models will be released as we improve model safety with community feedback. **License** A custom commercial license is available at: [https://llama.meta.com/llama3/license](https://llama.meta.com/llama3/license) Where to send questions or comments about the model Instructions on how to provide feedback or comments on the model can be found in the model [README](https://github.com/meta-llama/llama3). For more technical information about generation parameters and recipes for how to use Llama 3 in applications, please go [here](https://github.com/meta-llama/llama-recipes). ## Intended Use **Intended Use Cases** Llama 3 is intended for commercial and research use in English. Instruction tuned models are intended for assistant-like chat, whereas pretrained models can be adapted for a variety of natural language generation tasks. **Out-of-scope** Use in any manner that violates applicable laws or regulations (including trade compliance laws). Use in any other way that is prohibited by the Acceptable Use Policy and Llama 3 Community License. Use in languages other than English**. **Note: Developers may fine-tune Llama 3 models for languages beyond English provided they comply with the Llama 3 Community License and the Acceptable Use Policy. 
## How to use This repository contains two versions of Meta-Llama-3-8B-Instruct, for use with transformers and with the original `llama3` codebase. ### Use with transformers You can run conversational inference using the Transformers pipeline abstraction, or by leveraging the Auto classes with the `generate()` function. Let's see examples of both. #### Transformers pipeline ```python import transformers import torch model_id = "meta-llama/Meta-Llama-3-8B-Instruct" pipeline = transformers.pipeline( "text-generation", model=model_id, model_kwargs={"torch_dtype": torch.bfloat16}, device_map="auto", ) messages = [ {"role": "system", "content": "You are a pirate chatbot who always responds in pirate speak!"}, {"role": "user", "content": "Who are you?"}, ] prompt = pipeline.tokenizer.apply_chat_template( messages, tokenize=False, add_generation_prompt=True ) terminators = [ pipeline.tokenizer.eos_token_id, pipeline.tokenizer.convert_tokens_to_ids("<|eot_id|>") ] outputs = pipeline( prompt, max_new_tokens=256, eos_token_id=terminators, do_sample=True, temperature=0.6, top_p=0.9, ) print(outputs[0]["generated_text"][len(prompt):]) ``` #### Transformers AutoModelForCausalLM ```python from transformers import AutoTokenizer, AutoModelForCausalLM import torch model_id = "meta-llama/Meta-Llama-3-8B-Instruct" tokenizer = AutoTokenizer.from_pretrained(model_id) model = AutoModelForCausalLM.from_pretrained( model_id, torch_dtype=torch.bfloat16, device_map="auto", ) messages = [ {"role": "system", "content": "You are a pirate chatbot who always responds in pirate speak!"}, {"role": "user", "content": "Who are you?"}, ] input_ids = tokenizer.apply_chat_template( messages, add_generation_prompt=True, return_tensors="pt" ).to(model.device) terminators = [ tokenizer.eos_token_id, tokenizer.convert_tokens_to_ids("<|eot_id|>") ] outputs = model.generate( input_ids, max_new_tokens=256, eos_token_id=terminators, do_sample=True, temperature=0.6, top_p=0.9, ) response = outputs[0][input_ids.shape[-1]:] print(tokenizer.decode(response, skip_special_tokens=True)) ``` ### Use with `llama3` Please, follow the instructions in the [repository](https://github.com/meta-llama/llama3) To download Original checkpoints, see the example command below leveraging `huggingface-cli`: ``` huggingface-cli download meta-llama/Meta-Llama-3-8B-Instruct --include "original/*" --local-dir Meta-Llama-3-8B-Instruct ``` For Hugging Face support, we recommend using transformers or TGI, but a similar command works. ## Hardware and Software **Training Factors** We used custom training libraries, Meta's Research SuperCluster, and production clusters for pretraining. Fine-tuning, annotation, and evaluation were also performed on third-party cloud compute. **Carbon Footprint Pretraining utilized a cumulative** 7.7M GPU hours of computation on hardware of type H100-80GB (TDP of 700W). Estimated total emissions were 2290 tCO2eq, 100% of which were offset by Meta’s sustainability program. <table> <tr> <td> </td> <td><strong>Time (GPU hours)</strong> </td> <td><strong>Power Consumption (W)</strong> </td> <td><strong>Carbon Emitted(tCO2eq)</strong> </td> </tr> <tr> <td>Llama 3 8B </td> <td>1.3M </td> <td>700 </td> <td>390 </td> </tr> <tr> <td>Llama 3 70B </td> <td>6.4M </td> <td>700 </td> <td>1900 </td> </tr> <tr> <td>Total </td> <td>7.7M </td> <td> </td> <td>2290 </td> </tr> </table> **CO2 emissions during pre-training**. Time: total GPU time required for training each model. 
Power Consumption: peak power capacity per GPU device for the GPUs used adjusted for power usage efficiency. 100% of the emissions are directly offset by Meta's sustainability program, and because we are openly releasing these models, the pretraining costs do not need to be incurred by others. ## Training Data **Overview** Llama 3 was pretrained on over 15 trillion tokens of data from publicly available sources. The fine-tuning data includes publicly available instruction datasets, as well as over 10M human-annotated examples. Neither the pretraining nor the fine-tuning datasets include Meta user data. **Data Freshness** The pretraining data has a cutoff of March 2023 for the 7B and December 2023 for the 70B models respectively. ## Benchmarks In this section, we report the results for Llama 3 models on standard automatic benchmarks. For all the evaluations, we use our internal evaluations library. For details on the methodology see [here](https://github.com/meta-llama/llama3/blob/main/eval_methodology.md). ### Base pretrained models <table> <tr> <td><strong>Category</strong> </td> <td><strong>Benchmark</strong> </td> <td><strong>Llama 3 8B</strong> </td> <td><strong>Llama2 7B</strong> </td> <td><strong>Llama2 13B</strong> </td> <td><strong>Llama 3 70B</strong> </td> <td><strong>Llama2 70B</strong> </td> </tr> <tr> <td rowspan="6" >General </td> <td>MMLU (5-shot) </td> <td>66.6 </td> <td>45.7 </td> <td>53.8 </td> <td>79.5 </td> <td>69.7 </td> </tr> <tr> <td>AGIEval English (3-5 shot) </td> <td>45.9 </td> <td>28.8 </td> <td>38.7 </td> <td>63.0 </td> <td>54.8 </td> </tr> <tr> <td>CommonSenseQA (7-shot) </td> <td>72.6 </td> <td>57.6 </td> <td>67.6 </td> <td>83.8 </td> <td>78.7 </td> </tr> <tr> <td>Winogrande (5-shot) </td> <td>76.1 </td> <td>73.3 </td> <td>75.4 </td> <td>83.1 </td> <td>81.8 </td> </tr> <tr> <td>BIG-Bench Hard (3-shot, CoT) </td> <td>61.1 </td> <td>38.1 </td> <td>47.0 </td> <td>81.3 </td> <td>65.7 </td> </tr> <tr> <td>ARC-Challenge (25-shot) </td> <td>78.6 </td> <td>53.7 </td> <td>67.6 </td> <td>93.0 </td> <td>85.3 </td> </tr> <tr> <td>Knowledge reasoning </td> <td>TriviaQA-Wiki (5-shot) </td> <td>78.5 </td> <td>72.1 </td> <td>79.6 </td> <td>89.7 </td> <td>87.5 </td> </tr> <tr> <td rowspan="4" >Reading comprehension </td> <td>SQuAD (1-shot) </td> <td>76.4 </td> <td>72.2 </td> <td>72.1 </td> <td>85.6 </td> <td>82.6 </td> </tr> <tr> <td>QuAC (1-shot, F1) </td> <td>44.4 </td> <td>39.6 </td> <td>44.9 </td> <td>51.1 </td> <td>49.4 </td> </tr> <tr> <td>BoolQ (0-shot) </td> <td>75.7 </td> <td>65.5 </td> <td>66.9 </td> <td>79.0 </td> <td>73.1 </td> </tr> <tr> <td>DROP (3-shot, F1) </td> <td>58.4 </td> <td>37.9 </td> <td>49.8 </td> <td>79.7 </td> <td>70.2 </td> </tr> </table> ### Instruction tuned models <table> <tr> <td><strong>Benchmark</strong> </td> <td><strong>Llama 3 8B</strong> </td> <td><strong>Llama 2 7B</strong> </td> <td><strong>Llama 2 13B</strong> </td> <td><strong>Llama 3 70B</strong> </td> <td><strong>Llama 2 70B</strong> </td> </tr> <tr> <td>MMLU (5-shot) </td> <td>68.4 </td> <td>34.1 </td> <td>47.8 </td> <td>82.0 </td> <td>52.9 </td> </tr> <tr> <td>GPQA (0-shot) </td> <td>34.2 </td> <td>21.7 </td> <td>22.3 </td> <td>39.5 </td> <td>21.0 </td> </tr> <tr> <td>HumanEval (0-shot) </td> <td>62.2 </td> <td>7.9 </td> <td>14.0 </td> <td>81.7 </td> <td>25.6 </td> </tr> <tr> <td>GSM-8K (8-shot, CoT) </td> <td>79.6 </td> <td>25.7 </td> <td>77.4 </td> <td>93.0 </td> <td>57.5 </td> </tr> <tr> <td>MATH (4-shot, CoT) </td> <td>30.0 </td> <td>3.8 </td> <td>6.7 </td> <td>50.4 </td> 
<td>11.6 </td> </tr> </table> ### Responsibility & Safety We believe that an open approach to AI leads to better, safer products, faster innovation, and a bigger overall market. We are committed to Responsible AI development and took a series of steps to limit misuse and harm and support the open source community. Foundation models are widely capable technologies that are built to be used for a diverse range of applications. They are not designed to meet every developer preference on safety levels for all use cases, out-of-the-box, as those by their nature will differ across different applications. Rather, responsible LLM-application deployment is achieved by implementing a series of safety best practices throughout the development of such applications, from the model pre-training, fine-tuning and the deployment of systems composed of safeguards to tailor the safety needs specifically to the use case and audience. As part of the Llama 3 release, we updated our [Responsible Use Guide](https://llama.meta.com/responsible-use-guide/) to outline the steps and best practices for developers to implement model and system level safety for their application. We also provide a set of resources including [Meta Llama Guard 2](https://llama.meta.com/purple-llama/) and [Code Shield](https://llama.meta.com/purple-llama/) safeguards. These tools have proven to drastically reduce residual risks of LLM Systems, while maintaining a high level of helpfulness. We encourage developers to tune and deploy these safeguards according to their needs and we provide a [reference implementation](https://github.com/meta-llama/llama-recipes/tree/main/recipes/responsible_ai) to get you started. #### Llama 3-Instruct As outlined in the Responsible Use Guide, some trade-off between model helpfulness and model alignment is likely unavoidable. Developers should exercise discretion about how to weigh the benefits of alignment and helpfulness for their specific use case and audience. Developers should be mindful of residual risks when using Llama models and leverage additional safety tools as needed to reach the right safety bar for their use case. <span style="text-decoration:underline;">Safety</span> For our instruction tuned model, we conducted extensive red teaming exercises, performed adversarial evaluations and implemented safety mitigations techniques to lower residual risks. As with any Large Language Model, residual risks will likely remain and we recommend that developers assess these risks in the context of their use case. In parallel, we are working with the community to make AI safety benchmark standards transparent, rigorous and interpretable. <span style="text-decoration:underline;">Refusals</span> In addition to residual risks, we put a great emphasis on model refusals to benign prompts. Over-refusing not only can impact the user experience but could even be harmful in certain contexts as well. We’ve heard the feedback from the developer community and improved our fine tuning to ensure that Llama 3 is significantly less likely to falsely refuse to answer prompts than Llama 2. We built internal benchmarks and developed mitigations to limit false refusals making Llama 3 our most helpful model to date. #### Responsible release In addition to responsible use considerations outlined above, we followed a rigorous process that requires us to take extra measures against misuse and critical risks before we make our release decision. Misuse If you access or use Llama 3, you agree to the Acceptable Use Policy. 
The most recent copy of this policy can be found at [https://llama.meta.com/llama3/use-policy/](https://llama.meta.com/llama3/use-policy/). #### Critical risks <span style="text-decoration:underline;">CBRNE</span> (Chemical, Biological, Radiological, Nuclear, and high yield Explosives) We have conducted a two fold assessment of the safety of the model in this area: * Iterative testing during model training to assess the safety of responses related to CBRNE threats and other adversarial risks. * Involving external CBRNE experts to conduct an uplift test assessing the ability of the model to accurately provide expert knowledge and reduce barriers to potential CBRNE misuse, by reference to what can be achieved using web search (without the model). ### <span style="text-decoration:underline;">Cyber Security </span> We have evaluated Llama 3 with CyberSecEval, Meta’s cybersecurity safety eval suite, measuring Llama 3’s propensity to suggest insecure code when used as a coding assistant, and Llama 3’s propensity to comply with requests to help carry out cyber attacks, where attacks are defined by the industry standard MITRE ATT&CK cyber attack ontology. On our insecure coding and cyber attacker helpfulness tests, Llama 3 behaved in the same range or safer than models of [equivalent coding capability](https://huggingface.co/spaces/facebook/CyberSecEval). ### <span style="text-decoration:underline;">Child Safety</span> Child Safety risk assessments were conducted using a team of experts, to assess the model’s capability to produce outputs that could result in Child Safety risks and inform on any necessary and appropriate risk mitigations via fine tuning. We leveraged those expert red teaming sessions to expand the coverage of our evaluation benchmarks through Llama 3 model development. For Llama 3, we conducted new in-depth sessions using objective based methodologies to assess the model risks along multiple attack vectors. We also partnered with content specialists to perform red teaming exercises assessing potentially violating content while taking account of market specific nuances or experiences. ### Community Generative AI safety requires expertise and tooling, and we believe in the strength of the open community to accelerate its progress. We are active members of open consortiums, including the AI Alliance, Partnership in AI and MLCommons, actively contributing to safety standardization and transparency. We encourage the community to adopt taxonomies like the MLCommons Proof of Concept evaluation to facilitate collaboration and transparency on safety and content evaluations. Our Purple Llama tools are open sourced for the community to use and widely distributed across ecosystem partners including cloud service providers. We encourage community contributions to our [Github repository](https://github.com/meta-llama/PurpleLlama). Finally, we put in place a set of resources including an [output reporting mechanism](https://developers.facebook.com/llama_output_feedback) and [bug bounty program](https://www.facebook.com/whitehat) to continuously improve the Llama technology with the help of the community. ## Ethical Considerations and Limitations The core values of Llama 3 are openness, inclusivity and helpfulness. It is meant to serve everyone, and to work for a wide range of use cases. It is thus designed to be accessible to people across many different backgrounds, experiences and perspectives. 
Llama 3 addresses users and their needs as they are, without insertion unnecessary judgment or normativity, while reflecting the understanding that even content that may appear problematic in some cases can serve valuable purposes in others. It respects the dignity and autonomy of all users, especially in terms of the values of free thought and expression that power innovation and progress. But Llama 3 is a new technology, and like any new technology, there are risks associated with its use. Testing conducted to date has been in English, and has not covered, nor could it cover, all scenarios. For these reasons, as with all LLMs, Llama 3’s potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate, biased or other objectionable responses to user prompts. Therefore, before deploying any applications of Llama 3 models, developers should perform safety testing and tuning tailored to their specific applications of the model. As outlined in the Responsible Use Guide, we recommend incorporating [Purple Llama](https://github.com/facebookresearch/PurpleLlama) solutions into your workflows and specifically [Llama Guard](https://ai.meta.com/research/publications/llama-guard-llm-based-input-output-safeguard-for-human-ai-conversations/) which provides a base model to filter input and output prompts to layer system-level safety on top of model-level safety. Please see the Responsible Use Guide available at [http://llama.meta.com/responsible-use-guide](http://llama.meta.com/responsible-use-guide) ## Citation instructions @article{llama3modelcard, title={Llama 3 Model Card}, author={AI@Meta}, year={2024}, url = {https://github.com/meta-llama/llama3/blob/main/MODEL_CARD.md} } ## Contributors Aaditya Singh; Aaron Grattafiori; Abhimanyu Dubey; Abhinav Jauhri; Abhinav Pandey; Abhishek Kadian; Adam Kelsey; Adi Gangidi; Ahmad Al-Dahle; Ahuva Goldstand; Aiesha Letman; Ajay Menon; Akhil Mathur; Alan Schelten; Alex Vaughan; Amy Yang; Andrei Lupu; Andres Alvarado; Andrew Gallagher; Andrew Gu; Andrew Ho; Andrew Poulton; Andrew Ryan; Angela Fan; Ankit Ramchandani; Anthony Hartshorn; Archi Mitra; Archie Sravankumar; Artem Korenev; Arun Rao; Ashley Gabriel; Ashwin Bharambe; Assaf Eisenman; Aston Zhang; Aurelien Rodriguez; Austen Gregerson; Ava Spataru; Baptiste Roziere; Ben Maurer; Benjamin Leonhardi; Bernie Huang; Bhargavi Paranjape; Bing Liu; Binh Tang; Bobbie Chern; Brani Stojkovic; Brian Fuller; Catalina Mejia Arenas; Chao Zhou; Charlotte Caucheteux; Chaya Nayak; Ching-Hsiang Chu; Chloe Bi; Chris Cai; Chris Cox; Chris Marra; Chris McConnell; Christian Keller; Christoph Feichtenhofer; Christophe Touret; Chunyang Wu; Corinne Wong; Cristian Canton Ferrer; Damien Allonsius; Daniel Kreymer; Daniel Haziza; Daniel Li; Danielle Pintz; Danny Livshits; Danny Wyatt; David Adkins; David Esiobu; David Xu; Davide Testuggine; Delia David; Devi Parikh; Dhruv Choudhary; Dhruv Mahajan; Diana Liskovich; Diego Garcia-Olano; Diego Perino; Dieuwke Hupkes; Dingkang Wang; Dustin Holland; Egor Lakomkin; Elina Lobanova; Xiaoqing Ellen Tan; Emily Dinan; Eric Smith; Erik Brinkman; Esteban Arcaute; Filip Radenovic; Firat Ozgenel; Francesco Caggioni; Frank Seide; Frank Zhang; Gabriel Synnaeve; Gabriella Schwarz; Gabrielle Lee; Gada Badeer; Georgia Anderson; Graeme Nail; Gregoire Mialon; Guan Pang; Guillem Cucurell; Hailey Nguyen; Hannah Korevaar; Hannah Wang; Haroun Habeeb; Harrison Rudolph; Henry Aspegren; Hu Xu; Hugo Touvron; Iga Kozlowska; Igor Molybog; Igor Tufanov; Iliyan Zarov; Imanol Arrieta 
Ibarra; Irina-Elena Veliche; Isabel Kloumann; Ishan Misra; Ivan Evtimov; Jacob Xu; Jade Copet; Jake Weissman; Jan Geffert; Jana Vranes; Japhet Asher; Jason Park; Jay Mahadeokar; Jean-Baptiste Gaya; Jeet Shah; Jelmer van der Linde; Jennifer Chan; Jenny Hong; Jenya Lee; Jeremy Fu; Jeremy Teboul; Jianfeng Chi; Jianyu Huang; Jie Wang; Jiecao Yu; Joanna Bitton; Joe Spisak; Joelle Pineau; Jon Carvill; Jongsoo Park; Joseph Rocca; Joshua Johnstun; Junteng Jia; Kalyan Vasuden Alwala; Kam Hou U; Kate Plawiak; Kartikeya Upasani; Kaushik Veeraraghavan; Ke Li; Kenneth Heafield; Kevin Stone; Khalid El-Arini; Krithika Iyer; Kshitiz Malik; Kuenley Chiu; Kunal Bhalla; Kyle Huang; Lakshya Garg; Lauren Rantala-Yeary; Laurens van der Maaten; Lawrence Chen; Leandro Silva; Lee Bell; Lei Zhang; Liang Tan; Louis Martin; Lovish Madaan; Luca Wehrstedt; Lukas Blecher; Luke de Oliveira; Madeline Muzzi; Madian Khabsa; Manav Avlani; Mannat Singh; Manohar Paluri; Mark Zuckerberg; Marcin Kardas; Martynas Mankus; Mathew Oldham; Mathieu Rita; Matthew Lennie; Maya Pavlova; Meghan Keneally; Melanie Kambadur; Mihir Patel; Mikayel Samvelyan; Mike Clark; Mike Lewis; Min Si; Mitesh Kumar Singh; Mo Metanat; Mona Hassan; Naman Goyal; Narjes Torabi; Nicolas Usunier; Nikolay Bashlykov; Nikolay Bogoychev; Niladri Chatterji; Ning Dong; Oliver Aobo Yang; Olivier Duchenne; Onur Celebi; Parth Parekh; Patrick Alrassy; Paul Saab; Pavan Balaji; Pedro Rittner; Pengchuan Zhang; Pengwei Li; Petar Vasic; Peter Weng; Polina Zvyagina; Prajjwal Bhargava; Pratik Dubal; Praveen Krishnan; Punit Singh Koura; Qing He; Rachel Rodriguez; Ragavan Srinivasan; Rahul Mitra; Ramon Calderer; Raymond Li; Robert Stojnic; Roberta Raileanu; Robin Battey; Rocky Wang; Rohit Girdhar; Rohit Patel; Romain Sauvestre; Ronnie Polidoro; Roshan Sumbaly; Ross Taylor; Ruan Silva; Rui Hou; Rui Wang; Russ Howes; Ruty Rinott; Saghar Hosseini; Sai Jayesh Bondu; Samyak Datta; Sanjay Singh; Sara Chugh; Sargun Dhillon; Satadru Pan; Sean Bell; Sergey Edunov; Shaoliang Nie; Sharan Narang; Sharath Raparthy; Shaun Lindsay; Sheng Feng; Sheng Shen; Shenghao Lin; Shiva Shankar; Shruti Bhosale; Shun Zhang; Simon Vandenhende; Sinong Wang; Seohyun Sonia Kim; Soumya Batra; Sten Sootla; Steve Kehoe; Suchin Gururangan; Sumit Gupta; Sunny Virk; Sydney Borodinsky; Tamar Glaser; Tamar Herman; Tamara Best; Tara Fowler; Thomas Georgiou; Thomas Scialom; Tianhe Li; Todor Mihaylov; Tong Xiao; Ujjwal Karn; Vedanuj Goswami; Vibhor Gupta; Vignesh Ramanathan; Viktor Kerkez; Vinay Satish Kumar; Vincent Gonguet; Vish Vogeti; Vlad Poenaru; Vlad Tiberiu Mihailescu; Vladan Petrovic; Vladimir Ivanov; Wei Li; Weiwei Chu; Wenhan Xiong; Wenyin Fu; Wes Bouaziz; Whitney Meers; Will Constable; Xavier Martinet; Xiaojian Wu; Xinbo Gao; Xinfeng Xie; Xuchao Jia; Yaelle Goldschlag; Yann LeCun; Yashesh Gaur; Yasmine Babaei; Ye Qi; Yenda Li; Yi Wen; Yiwen Song; Youngjin Nam; Yuchen Hao; Yuchen Zhang; Yun Wang; Yuning Mao; Yuzi He; Zacharie Delpierre Coudert; Zachary DeVito; Zahra Hankir; Zhaoduo Wen; Zheng Yan; Zhengxing Chen; Zhenyu Yang; Zoe Papakipos
{"language": ["en"], "license": "llama3", "tags": ["meta", "llama-3", "8-bit"], "pipeline_tag": "text-generation", "base_model": ["gradientai/Llama-3-8B-Instruct-Gradient-1048k"]}
Slvcxc/Llama-3-8B-Instruct-Gradient-1048k-8.0bpw-h8-exl2
null
[ "transformers", "safetensors", "llama", "text-generation", "meta", "llama-3", "8-bit", "conversational", "en", "base_model:gradientai/Llama-3-8B-Instruct-Gradient-1048k", "license:llama3", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2024-04-30T14:46:39+00:00
[]
[ "en" ]
TAGS #transformers #safetensors #llama #text-generation #meta #llama-3 #8-bit #conversational #en #base_model-gradientai/Llama-3-8B-Instruct-Gradient-1048k #license-llama3 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
Llama-3-8B-Instruct-Gradient-1048k ---------------------------------- exllamav2 quant for gradientai/Llama-3-8B-Instruct-Gradient-1048k Original model information: Llama-3 8B Gradient Instruct 1048k ================================== Gradient incorporates your data to deploy autonomous assistants that power critical operations across your business. If you're looking to build custom AI models or agents, email us at contact@URL. For more info, see our End-to-end development service for custom LLMs and AI systems. This model extends Llama-3 8B's context length from 8k to > 1040K, developed by Gradient, sponsored by compute from Crusoe Energy. It demonstrates that SOTA LLMs can learn to operate on long context with minimal training by appropriately adjusting RoPE theta. We trained on 830M tokens for this stage, and 1.4B tokens total for all stages, which is < 0.01% of Llama-3's original pre-training data. Approach: * meta-llama/Meta-Llama-3-8B-Instruct as the base * NTK-aware interpolation [1] to initialize an optimal schedule for RoPE theta, followed by empirical RoPE theta optimization * Progressive training on increasing context lengths, similar to Large World Model [2] (See details below) Infra: We build on top of the EasyContext Blockwise RingAttention library [3] to scalably and efficiently train on contexts up to 1048k tokens on Crusoe Energy's high-performance L40S cluster. Notably, we layered parallelism on top of Ring Attention with a custom network topology to better leverage large GPU clusters in the face of network bottlenecks from passing many KV blocks between devices. This gave us a 33x speedup in model training (compare 524k and 1048k to 65k and 262k in the table below). Data: For training data, we generate long contexts by augmenting SlimPajama. Progressive Training Details: Quants: * GGUF * MLX-4bit The Gradient AI Team -------------------- URL Gradient is accelerating AI transformation across industries. Our AI Foundry incorporates your data to deploy autonomous assistants that power critical operations across your business. Contact Us ---------- Drop an email to contact@URL References ---------- [1] Peng, Bowen, et al. "Yarn: Efficient context window extension of large language models." arXiv preprint arXiv:2309.00071 (2023). [2] Liu, Hao, et al. "World Model on Million-Length Video And Language With RingAttention." arXiv preprint arXiv:2402.08268 (2024). [3] URL --- Base Model ========== Model Details ------------- Meta developed and released the Meta Llama 3 family of large language models (LLMs), a collection of pretrained and instruction tuned generative text models in 8 and 70B sizes. The Llama 3 instruction tuned models are optimized for dialogue use cases and outperform many of the available open source chat models on common industry benchmarks. Further, in developing these models, we took great care to optimize helpfulness and safety. Model developers Meta Variations Llama 3 comes in two sizes — 8B and 70B parameters — in pre-trained and instruction tuned variants. Input Models input text only. Output Models generate text and code only. Model Architecture Llama 3 is an auto-regressive language model that uses an optimized transformer architecture. The tuned versions use supervised fine-tuning (SFT) and reinforcement learning with human feedback (RLHF) to align with human preferences for helpfulness and safety. Llama 3 family of models. Token counts refer to pretraining data only.
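The Gradient card above initializes RoPE theta with NTK-aware interpolation before empirical tuning. The snippet below is a minimal sketch of one common form of that scaling rule, assuming Llama 3 8B's published rotary (head) dimension of 128 and base theta of 500,000; the released long-context checkpoints use empirically optimized theta values, so the number printed here is only an initialization guess, not the shipped configuration.

```python
# One common form of NTK-aware RoPE scaling: raise the rotary base ("theta")
# so that low-frequency dimensions are interpolated across the longer context
# while high-frequency dimensions keep roughly their original resolution.
# This is only the initialization heuristic mentioned above; the released
# checkpoints use empirically tuned theta values.

def ntk_scaled_rope_theta(base_theta: float, scale_factor: float, rotary_dim: int) -> float:
    return base_theta * scale_factor ** (rotary_dim / (rotary_dim - 2))

if __name__ == "__main__":
    theta0 = 500_000.0           # Llama 3 8B base rope_theta
    scale = 1_048_576 / 8_192    # roughly a 128x context extension (8k -> ~1048k)
    print(f"initial theta guess: {ntk_scaled_rope_theta(theta0, scale, 128):.3e}")
```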
Both the 8 and 70B versions use Grouped-Query Attention (GQA) for improved inference scalability. Model Release Date April 18, 2024. Status This is a static model trained on an offline dataset. Future versions of the tuned models will be released as we improve model safety with community feedback. License A custom commercial license is available at: URL Where to send questions or comments about the model Instructions on how to provide feedback or comments on the model can be found in the model README. For more technical information about generation parameters and recipes for how to use Llama 3 in applications, please go here. Intended Use ------------ Intended Use Cases Llama 3 is intended for commercial and research use in English. Instruction tuned models are intended for assistant-like chat, whereas pretrained models can be adapted for a variety of natural language generation tasks. Out-of-scope Use in any manner that violates applicable laws or regulations (including trade compliance laws). Use in any other way that is prohibited by the Acceptable Use Policy and Llama 3 Community License. Use in languages other than English. Note: Developers may fine-tune Llama 3 models for languages beyond English provided they comply with the Llama 3 Community License and the Acceptable Use Policy. How to use ---------- This repository contains two versions of Meta-Llama-3-8B-Instruct, for use with transformers and with the original 'llama3' codebase. ### Use with transformers You can run conversational inference using the Transformers pipeline abstraction, or by leveraging the Auto classes with the 'generate()' function. Let's see examples of both. #### Transformers pipeline #### Transformers AutoModelForCausalLM ### Use with 'llama3' Please, follow the instructions in the repository To download Original checkpoints, see the example command below leveraging 'huggingface-cli': For Hugging Face support, we recommend using transformers or TGI, but a similar command works. Hardware and Software --------------------- Training Factors We used custom training libraries, Meta's Research SuperCluster, and production clusters for pretraining. Fine-tuning, annotation, and evaluation were also performed on third-party cloud compute. Carbon Footprint Pretraining utilized a cumulative 7.7M GPU hours of computation on hardware of type H100-80GB (TDP of 700W). Estimated total emissions were 2290 tCO2eq, 100% of which were offset by Meta’s sustainability program. CO2 emissions during pre-training. Time: total GPU time required for training each model. Power Consumption: peak power capacity per GPU device for the GPUs used adjusted for power usage efficiency. 100% of the emissions are directly offset by Meta's sustainability program, and because we are openly releasing these models, the pretraining costs do not need to be incurred by others. Training Data ------------- Overview Llama 3 was pretrained on over 15 trillion tokens of data from publicly available sources. The fine-tuning data includes publicly available instruction datasets, as well as over 10M human-annotated examples. Neither the pretraining nor the fine-tuning datasets include Meta user data. Data Freshness The pretraining data has a cutoff of March 2023 for the 7B and December 2023 for the 70B models respectively. Benchmarks ---------- In this section, we report the results for Llama 3 models on standard automatic benchmarks. For all the evaluations, we use our internal evaluations library. For details on the methodology see here. 
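The "Use with transformers" subsections above lost their code samples in this dump. The sketch below restores the two patterns they name, using the unquantized source repo listed in this card's metadata (the exl2 quant hosted here needs an exllamav2-compatible loader instead); the prompts, generation settings, and the download command at the end are illustrative rather than the card's exact originals.

```python
# A minimal sketch of the two usage patterns named above; the repo id comes
# from this card's metadata, and generation settings are illustrative only.
import torch
import transformers
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "gradientai/Llama-3-8B-Instruct-Gradient-1048k"
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Explain RoPE theta scaling in one sentence."},
]

# 1) Transformers pipeline
pipe = transformers.pipeline(
    "text-generation",
    model=model_id,
    model_kwargs={"torch_dtype": torch.bfloat16},
    device_map="auto",
)
prompt = pipe.tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
# Llama 3 instruct checkpoints conventionally stop on <|eot_id|> as well as EOS.
terminators = [
    pipe.tokenizer.eos_token_id,
    pipe.tokenizer.convert_tokens_to_ids("<|eot_id|>"),
]
print(pipe(prompt, max_new_tokens=128, eos_token_id=terminators)[0]["generated_text"])

# 2) AutoModelForCausalLM with an explicit chat template
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output = model.generate(input_ids, max_new_tokens=128, eos_token_id=terminators)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))

# Fetching checkpoints locally with huggingface-cli (shell command shown as a comment):
#   huggingface-cli download gradientai/Llama-3-8B-Instruct-Gradient-1048k --local-dir ./Llama-3-8B-Instruct-Gradient-1048k
```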
### Base pretrained models ### Instruction tuned models ### Responsibility & Safety We believe that an open approach to AI leads to better, safer products, faster innovation, and a bigger overall market. We are committed to Responsible AI development and took a series of steps to limit misuse and harm and support the open source community. Foundation models are widely capable technologies that are built to be used for a diverse range of applications. They are not designed to meet every developer preference on safety levels for all use cases, out-of-the-box, as those by their nature will differ across different applications. Rather, responsible LLM-application deployment is achieved by implementing a series of safety best practices throughout the development of such applications, from the model pre-training, fine-tuning and the deployment of systems composed of safeguards to tailor the safety needs specifically to the use case and audience. As part of the Llama 3 release, we updated our Responsible Use Guide to outline the steps and best practices for developers to implement model and system level safety for their application. We also provide a set of resources including Meta Llama Guard 2 and Code Shield safeguards. These tools have proven to drastically reduce residual risks of LLM Systems, while maintaining a high level of helpfulness. We encourage developers to tune and deploy these safeguards according to their needs and we provide a reference implementation to get you started. #### Llama 3-Instruct As outlined in the Responsible Use Guide, some trade-off between model helpfulness and model alignment is likely unavoidable. Developers should exercise discretion about how to weigh the benefits of alignment and helpfulness for their specific use case and audience. Developers should be mindful of residual risks when using Llama models and leverage additional safety tools as needed to reach the right safety bar for their use case. Safety For our instruction tuned model, we conducted extensive red teaming exercises, performed adversarial evaluations and implemented safety mitigations techniques to lower residual risks. As with any Large Language Model, residual risks will likely remain and we recommend that developers assess these risks in the context of their use case. In parallel, we are working with the community to make AI safety benchmark standards transparent, rigorous and interpretable. Refusals In addition to residual risks, we put a great emphasis on model refusals to benign prompts. Over-refusing not only can impact the user experience but could even be harmful in certain contexts as well. We’ve heard the feedback from the developer community and improved our fine tuning to ensure that Llama 3 is significantly less likely to falsely refuse to answer prompts than Llama 2. We built internal benchmarks and developed mitigations to limit false refusals making Llama 3 our most helpful model to date. #### Responsible release In addition to responsible use considerations outlined above, we followed a rigorous process that requires us to take extra measures against misuse and critical risks before we make our release decision. Misuse If you access or use Llama 3, you agree to the Acceptable Use Policy. 
The most recent copy of this policy can be found at URL #### Critical risks CBRNE (Chemical, Biological, Radiological, Nuclear, and high-yield Explosives) We have conducted a twofold assessment of the safety of the model in this area: * Iterative testing during model training to assess the safety of responses related to CBRNE threats and other adversarial risks. * Involving external CBRNE experts to conduct an uplift test assessing the ability of the model to accurately provide expert knowledge and reduce barriers to potential CBRNE misuse, by reference to what can be achieved using web search (without the model). ### Cyber Security We have evaluated Llama 3 with CyberSecEval, Meta’s cybersecurity safety eval suite, measuring Llama 3’s propensity to suggest insecure code when used as a coding assistant, and Llama 3’s propensity to comply with requests to help carry out cyber attacks, where attacks are defined by the industry standard MITRE ATT&CK cyber attack ontology. On our insecure coding and cyber attacker helpfulness tests, Llama 3 behaved in the same range or safer than models of equivalent coding capability. ### Child Safety Child Safety risk assessments were conducted using a team of experts to assess the model’s capability to produce outputs that could result in Child Safety risks and inform on any necessary and appropriate risk mitigations via fine-tuning. We leveraged those expert red teaming sessions to expand the coverage of our evaluation benchmarks through Llama 3 model development. For Llama 3, we conducted new in-depth sessions using objective-based methodologies to assess the model risks along multiple attack vectors. We also partnered with content specialists to perform red teaming exercises assessing potentially violating content while taking account of market-specific nuances or experiences. ### Community Generative AI safety requires expertise and tooling, and we believe in the strength of the open community to accelerate its progress. We are active members of open consortiums, including the AI Alliance, Partnership on AI and MLCommons, actively contributing to safety standardization and transparency. We encourage the community to adopt taxonomies like the MLCommons Proof of Concept evaluation to facilitate collaboration and transparency on safety and content evaluations. Our Purple Llama tools are open sourced for the community to use and widely distributed across ecosystem partners including cloud service providers. We encourage community contributions to our GitHub repository. Finally, we put in place a set of resources including an output reporting mechanism and bug bounty program to continuously improve the Llama technology with the help of the community. Ethical Considerations and Limitations -------------------------------------- The core values of Llama 3 are openness, inclusivity and helpfulness. It is meant to serve everyone, and to work for a wide range of use cases. It is thus designed to be accessible to people across many different backgrounds, experiences and perspectives. Llama 3 addresses users and their needs as they are, without inserting unnecessary judgment or normativity, while reflecting the understanding that even content that may appear problematic in some cases can serve valuable purposes in others. It respects the dignity and autonomy of all users, especially in terms of the values of free thought and expression that power innovation and progress.
But Llama 3 is a new technology, and like any new technology, there are risks associated with its use. Testing conducted to date has been in English, and has not covered, nor could it cover, all scenarios. For these reasons, as with all LLMs, Llama 3’s potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate, biased or other objectionable responses to user prompts. Therefore, before deploying any applications of Llama 3 models, developers should perform safety testing and tuning tailored to their specific applications of the model. As outlined in the Responsible Use Guide, we recommend incorporating Purple Llama solutions into your workflows and specifically Llama Guard which provides a base model to filter input and output prompts to layer system-level safety on top of model-level safety. Please see the Responsible Use Guide available at URL instructions @article{llama3modelcard, title={Llama 3 Model Card}, author={AI@Meta}, year={2024}, url = {URL } Contributors ------------ Aaditya Singh; Aaron Grattafiori; Abhimanyu Dubey; Abhinav Jauhri; Abhinav Pandey; Abhishek Kadian; Adam Kelsey; Adi Gangidi; Ahmad Al-Dahle; Ahuva Goldstand; Aiesha Letman; Ajay Menon; Akhil Mathur; Alan Schelten; Alex Vaughan; Amy Yang; Andrei Lupu; Andres Alvarado; Andrew Gallagher; Andrew Gu; Andrew Ho; Andrew Poulton; Andrew Ryan; Angela Fan; Ankit Ramchandani; Anthony Hartshorn; Archi Mitra; Archie Sravankumar; Artem Korenev; Arun Rao; Ashley Gabriel; Ashwin Bharambe; Assaf Eisenman; Aston Zhang; Aurelien Rodriguez; Austen Gregerson; Ava Spataru; Baptiste Roziere; Ben Maurer; Benjamin Leonhardi; Bernie Huang; Bhargavi Paranjape; Bing Liu; Binh Tang; Bobbie Chern; Brani Stojkovic; Brian Fuller; Catalina Mejia Arenas; Chao Zhou; Charlotte Caucheteux; Chaya Nayak; Ching-Hsiang Chu; Chloe Bi; Chris Cai; Chris Cox; Chris Marra; Chris McConnell; Christian Keller; Christoph Feichtenhofer; Christophe Touret; Chunyang Wu; Corinne Wong; Cristian Canton Ferrer; Damien Allonsius; Daniel Kreymer; Daniel Haziza; Daniel Li; Danielle Pintz; Danny Livshits; Danny Wyatt; David Adkins; David Esiobu; David Xu; Davide Testuggine; Delia David; Devi Parikh; Dhruv Choudhary; Dhruv Mahajan; Diana Liskovich; Diego Garcia-Olano; Diego Perino; Dieuwke Hupkes; Dingkang Wang; Dustin Holland; Egor Lakomkin; Elina Lobanova; Xiaoqing Ellen Tan; Emily Dinan; Eric Smith; Erik Brinkman; Esteban Arcaute; Filip Radenovic; Firat Ozgenel; Francesco Caggioni; Frank Seide; Frank Zhang; Gabriel Synnaeve; Gabriella Schwarz; Gabrielle Lee; Gada Badeer; Georgia Anderson; Graeme Nail; Gregoire Mialon; Guan Pang; Guillem Cucurell; Hailey Nguyen; Hannah Korevaar; Hannah Wang; Haroun Habeeb; Harrison Rudolph; Henry Aspegren; Hu Xu; Hugo Touvron; Iga Kozlowska; Igor Molybog; Igor Tufanov; Iliyan Zarov; Imanol Arrieta Ibarra; Irina-Elena Veliche; Isabel Kloumann; Ishan Misra; Ivan Evtimov; Jacob Xu; Jade Copet; Jake Weissman; Jan Geffert; Jana Vranes; Japhet Asher; Jason Park; Jay Mahadeokar; Jean-Baptiste Gaya; Jeet Shah; Jelmer van der Linde; Jennifer Chan; Jenny Hong; Jenya Lee; Jeremy Fu; Jeremy Teboul; Jianfeng Chi; Jianyu Huang; Jie Wang; Jiecao Yu; Joanna Bitton; Joe Spisak; Joelle Pineau; Jon Carvill; Jongsoo Park; Joseph Rocca; Joshua Johnstun; Junteng Jia; Kalyan Vasuden Alwala; Kam Hou U; Kate Plawiak; Kartikeya Upasani; Kaushik Veeraraghavan; Ke Li; Kenneth Heafield; Kevin Stone; Khalid El-Arini; Krithika Iyer; Kshitiz Malik; Kuenley Chiu; Kunal Bhalla; Kyle Huang; Lakshya Garg; Lauren Rantala-Yeary; 
Laurens van der Maaten; Lawrence Chen; Leandro Silva; Lee Bell; Lei Zhang; Liang Tan; Louis Martin; Lovish Madaan; Luca Wehrstedt; Lukas Blecher; Luke de Oliveira; Madeline Muzzi; Madian Khabsa; Manav Avlani; Mannat Singh; Manohar Paluri; Mark Zuckerberg; Marcin Kardas; Martynas Mankus; Mathew Oldham; Mathieu Rita; Matthew Lennie; Maya Pavlova; Meghan Keneally; Melanie Kambadur; Mihir Patel; Mikayel Samvelyan; Mike Clark; Mike Lewis; Min Si; Mitesh Kumar Singh; Mo Metanat; Mona Hassan; Naman Goyal; Narjes Torabi; Nicolas Usunier; Nikolay Bashlykov; Nikolay Bogoychev; Niladri Chatterji; Ning Dong; Oliver Aobo Yang; Olivier Duchenne; Onur Celebi; Parth Parekh; Patrick Alrassy; Paul Saab; Pavan Balaji; Pedro Rittner; Pengchuan Zhang; Pengwei Li; Petar Vasic; Peter Weng; Polina Zvyagina; Prajjwal Bhargava; Pratik Dubal; Praveen Krishnan; Punit Singh Koura; Qing He; Rachel Rodriguez; Ragavan Srinivasan; Rahul Mitra; Ramon Calderer; Raymond Li; Robert Stojnic; Roberta Raileanu; Robin Battey; Rocky Wang; Rohit Girdhar; Rohit Patel; Romain Sauvestre; Ronnie Polidoro; Roshan Sumbaly; Ross Taylor; Ruan Silva; Rui Hou; Rui Wang; Russ Howes; Ruty Rinott; Saghar Hosseini; Sai Jayesh Bondu; Samyak Datta; Sanjay Singh; Sara Chugh; Sargun Dhillon; Satadru Pan; Sean Bell; Sergey Edunov; Shaoliang Nie; Sharan Narang; Sharath Raparthy; Shaun Lindsay; Sheng Feng; Sheng Shen; Shenghao Lin; Shiva Shankar; Shruti Bhosale; Shun Zhang; Simon Vandenhende; Sinong Wang; Seohyun Sonia Kim; Soumya Batra; Sten Sootla; Steve Kehoe; Suchin Gururangan; Sumit Gupta; Sunny Virk; Sydney Borodinsky; Tamar Glaser; Tamar Herman; Tamara Best; Tara Fowler; Thomas Georgiou; Thomas Scialom; Tianhe Li; Todor Mihaylov; Tong Xiao; Ujjwal Karn; Vedanuj Goswami; Vibhor Gupta; Vignesh Ramanathan; Viktor Kerkez; Vinay Satish Kumar; Vincent Gonguet; Vish Vogeti; Vlad Poenaru; Vlad Tiberiu Mihailescu; Vladan Petrovic; Vladimir Ivanov; Wei Li; Weiwei Chu; Wenhan Xiong; Wenyin Fu; Wes Bouaziz; Whitney Meers; Will Constable; Xavier Martinet; Xiaojian Wu; Xinbo Gao; Xinfeng Xie; Xuchao Jia; Yaelle Goldschlag; Yann LeCun; Yashesh Gaur; Yasmine Babaei; Ye Qi; Yenda Li; Yi Wen; Yiwen Song; Youngjin Nam; Yuchen Hao; Yuchen Zhang; Yun Wang; Yuning Mao; Yuzi He; Zacharie Delpierre Coudert; Zachary DeVito; Zahra Hankir; Zhaoduo Wen; Zheng Yan; Zhengxing Chen; Zhenyu Yang; Zoe Papakipos
[ "### Use with transformers\n\n\nYou can run conversational inference using the Transformers pipeline abstraction, or by leveraging the Auto classes with the 'generate()' function. Let's see examples of both.", "#### Transformers pipeline", "#### Transformers AutoModelForCausalLM", "### Use with 'llama3'\n\n\nPlease, follow the instructions in the repository\n\n\nTo download Original checkpoints, see the example command below leveraging 'huggingface-cli':\n\n\nFor Hugging Face support, we recommend using transformers or TGI, but a similar command works.\n\n\nHardware and Software\n---------------------\n\n\nTraining Factors We used custom training libraries, Meta's Research SuperCluster, and production clusters for pretraining. Fine-tuning, annotation, and evaluation were also performed on third-party cloud compute.\n\n\nCarbon Footprint Pretraining utilized a cumulative 7.7M GPU hours of computation on hardware of type H100-80GB (TDP of 700W). Estimated total emissions were 2290 tCO2eq, 100% of which were offset by Meta’s sustainability program.\n\n\n\nCO2 emissions during pre-training. Time: total GPU time required for training each model. Power Consumption: peak power capacity per GPU device for the GPUs used adjusted for power usage efficiency. 100% of the emissions are directly offset by Meta's sustainability program, and because we are openly releasing these models, the pretraining costs do not need to be incurred by others.\n\n\nTraining Data\n-------------\n\n\nOverview Llama 3 was pretrained on over 15 trillion tokens of data from publicly available sources. The fine-tuning data includes publicly available instruction datasets, as well as over 10M human-annotated examples. Neither the pretraining nor the fine-tuning datasets include Meta user data.\n\n\nData Freshness The pretraining data has a cutoff of March 2023 for the 7B and December 2023 for the 70B models respectively.\n\n\nBenchmarks\n----------\n\n\nIn this section, we report the results for Llama 3 models on standard automatic benchmarks. For all the evaluations, we use our internal evaluations library. For details on the methodology see here.", "### Base pretrained models", "### Instruction tuned models", "### Responsibility & Safety\n\n\nWe believe that an open approach to AI leads to better, safer products, faster innovation, and a bigger overall market. We are committed to Responsible AI development and took a series of steps to limit misuse and harm and support the open source community.\n\n\nFoundation models are widely capable technologies that are built to be used for a diverse range of applications. They are not designed to meet every developer preference on safety levels for all use cases, out-of-the-box, as those by their nature will differ across different applications.\n\n\nRather, responsible LLM-application deployment is achieved by implementing a series of safety best practices throughout the development of such applications, from the model pre-training, fine-tuning and the deployment of systems composed of safeguards to tailor the safety needs specifically to the use case and audience.\n\n\nAs part of the Llama 3 release, we updated our Responsible Use Guide to outline the steps and best practices for developers to implement model and system level safety for their application. We also provide a set of resources including Meta Llama Guard 2 and Code Shield safeguards. These tools have proven to drastically reduce residual risks of LLM Systems, while maintaining a high level of helpfulness. 
We encourage developers to tune and deploy these safeguards according to their needs and we provide a reference implementation to get you started.", "#### Llama 3-Instruct\n\n\nAs outlined in the Responsible Use Guide, some trade-off between model helpfulness and model alignment is likely unavoidable. Developers should exercise discretion about how to weigh the benefits of alignment and helpfulness for their specific use case and audience. Developers should be mindful of residual risks when using Llama models and leverage additional safety tools as needed to reach the right safety bar for their use case.\n\n\nSafety\n\n\nFor our instruction tuned model, we conducted extensive red teaming exercises, performed adversarial evaluations and implemented safety mitigations techniques to lower residual risks. As with any Large Language Model, residual risks will likely remain and we recommend that developers assess these risks in the context of their use case. In parallel, we are working with the community to make AI safety benchmark standards transparent, rigorous and interpretable.\n\n\nRefusals\n\n\nIn addition to residual risks, we put a great emphasis on model refusals to benign prompts. Over-refusing not only can impact the user experience but could even be harmful in certain contexts as well. We’ve heard the feedback from the developer community and improved our fine tuning to ensure that Llama 3 is significantly less likely to falsely refuse to answer prompts than Llama 2.\n\n\nWe built internal benchmarks and developed mitigations to limit false refusals making Llama 3 our most helpful model to date.", "#### Responsible release\n\n\nIn addition to responsible use considerations outlined above, we followed a rigorous process that requires us to take extra measures against misuse and critical risks before we make our release decision.\n\n\nMisuse\n\n\nIf you access or use Llama 3, you agree to the Acceptable Use Policy. The most recent copy of this policy can be found at URL", "#### Critical risks\n\n\nCBRNE (Chemical, Biological, Radiological, Nuclear, and high yield Explosives)\n\n\nWe have conducted a two fold assessment of the safety of the model in this area:\n\n\n* Iterative testing during model training to assess the safety of responses related to CBRNE threats and other adversarial risks.\n* Involving external CBRNE experts to conduct an uplift test assessing the ability of the model to accurately provide expert knowledge and reduce barriers to potential CBRNE misuse, by reference to what can be achieved using web search (without the model).", "### Cyber Security\n\n\nWe have evaluated Llama 3 with CyberSecEval, Meta’s cybersecurity safety eval suite, measuring Llama 3’s propensity to suggest insecure code when used as a coding assistant, and Llama 3’s propensity to comply with requests to help carry out cyber attacks, where attacks are defined by the industry standard MITRE ATT&CK cyber attack ontology. On our insecure coding and cyber attacker helpfulness tests, Llama 3 behaved in the same range or safer than models of equivalent coding capability.", "### Child Safety\n\n\nChild Safety risk assessments were conducted using a team of experts, to assess the model’s capability to produce outputs that could result in Child Safety risks and inform on any necessary and appropriate risk mitigations via fine tuning. We leveraged those expert red teaming sessions to expand the coverage of our evaluation benchmarks through Llama 3 model development. 
For Llama 3, we conducted new in-depth sessions using objective based methodologies to assess the model risks along multiple attack vectors. We also partnered with content specialists to perform red teaming exercises assessing potentially violating content while taking account of market specific nuances or experiences.", "### Community\n\n\nGenerative AI safety requires expertise and tooling, and we believe in the strength of the open community to accelerate its progress. We are active members of open consortiums, including the AI Alliance, Partnership in AI and MLCommons, actively contributing to safety standardization and transparency. We encourage the community to adopt taxonomies like the MLCommons Proof of Concept evaluation to facilitate collaboration and transparency on safety and content evaluations. Our Purple Llama tools are open sourced for the community to use and widely distributed across ecosystem partners including cloud service providers. We encourage community contributions to our Github repository.\n\n\nFinally, we put in place a set of resources including an output reporting mechanism and bug bounty program to continuously improve the Llama technology with the help of the community.\n\n\nEthical Considerations and Limitations\n--------------------------------------\n\n\nThe core values of Llama 3 are openness, inclusivity and helpfulness. It is meant to serve everyone, and to work for a wide range of use cases. It is thus designed to be accessible to people across many different backgrounds, experiences and perspectives. Llama 3 addresses users and their needs as they are, without insertion unnecessary judgment or normativity, while reflecting the understanding that even content that may appear problematic in some cases can serve valuable purposes in others. It respects the dignity and autonomy of all users, especially in terms of the values of free thought and expression that power innovation and progress.\n\n\nBut Llama 3 is a new technology, and like any new technology, there are risks associated with its use. Testing conducted to date has been in English, and has not covered, nor could it cover, all scenarios. For these reasons, as with all LLMs, Llama 3’s potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate, biased or other objectionable responses to user prompts. Therefore, before deploying any applications of Llama 3 models, developers should perform safety testing and tuning tailored to their specific applications of the model. 
As outlined in the Responsible Use Guide, we recommend incorporating Purple Llama solutions into your workflows and specifically Llama Guard which provides a base model to filter input and output prompts to layer system-level safety on top of model-level safety.\n\n\nPlease see the Responsible Use Guide available at URL\n\n\ninstructions\n\n\n@article{llama3modelcard,\n\n\ntitle={Llama 3 Model Card},\n\n\nauthor={AI@Meta},\n\n\nyear={2024},\n\n\nurl = {URL\n\n\n}\n\n\nContributors\n------------\n\n\nAaditya Singh; Aaron Grattafiori; Abhimanyu Dubey; Abhinav Jauhri; Abhinav Pandey; Abhishek Kadian; Adam Kelsey; Adi Gangidi; Ahmad Al-Dahle; Ahuva Goldstand; Aiesha Letman; Ajay Menon; Akhil Mathur; Alan Schelten; Alex Vaughan; Amy Yang; Andrei Lupu; Andres Alvarado; Andrew Gallagher; Andrew Gu; Andrew Ho; Andrew Poulton; Andrew Ryan; Angela Fan; Ankit Ramchandani; Anthony Hartshorn; Archi Mitra; Archie Sravankumar; Artem Korenev; Arun Rao; Ashley Gabriel; Ashwin Bharambe; Assaf Eisenman; Aston Zhang; Aurelien Rodriguez; Austen Gregerson; Ava Spataru; Baptiste Roziere; Ben Maurer; Benjamin Leonhardi; Bernie Huang; Bhargavi Paranjape; Bing Liu; Binh Tang; Bobbie Chern; Brani Stojkovic; Brian Fuller; Catalina Mejia Arenas; Chao Zhou; Charlotte Caucheteux; Chaya Nayak; Ching-Hsiang Chu; Chloe Bi; Chris Cai; Chris Cox; Chris Marra; Chris McConnell; Christian Keller; Christoph Feichtenhofer; Christophe Touret; Chunyang Wu; Corinne Wong; Cristian Canton Ferrer; Damien Allonsius; Daniel Kreymer; Daniel Haziza; Daniel Li; Danielle Pintz; Danny Livshits; Danny Wyatt; David Adkins; David Esiobu; David Xu; Davide Testuggine; Delia David; Devi Parikh; Dhruv Choudhary; Dhruv Mahajan; Diana Liskovich; Diego Garcia-Olano; Diego Perino; Dieuwke Hupkes; Dingkang Wang; Dustin Holland; Egor Lakomkin; Elina Lobanova; Xiaoqing Ellen Tan; Emily Dinan; Eric Smith; Erik Brinkman; Esteban Arcaute; Filip Radenovic; Firat Ozgenel; Francesco Caggioni; Frank Seide; Frank Zhang; Gabriel Synnaeve; Gabriella Schwarz; Gabrielle Lee; Gada Badeer; Georgia Anderson; Graeme Nail; Gregoire Mialon; Guan Pang; Guillem Cucurell; Hailey Nguyen; Hannah Korevaar; Hannah Wang; Haroun Habeeb; Harrison Rudolph; Henry Aspegren; Hu Xu; Hugo Touvron; Iga Kozlowska; Igor Molybog; Igor Tufanov; Iliyan Zarov; Imanol Arrieta Ibarra; Irina-Elena Veliche; Isabel Kloumann; Ishan Misra; Ivan Evtimov; Jacob Xu; Jade Copet; Jake Weissman; Jan Geffert; Jana Vranes; Japhet Asher; Jason Park; Jay Mahadeokar; Jean-Baptiste Gaya; Jeet Shah; Jelmer van der Linde; Jennifer Chan; Jenny Hong; Jenya Lee; Jeremy Fu; Jeremy Teboul; Jianfeng Chi; Jianyu Huang; Jie Wang; Jiecao Yu; Joanna Bitton; Joe Spisak; Joelle Pineau; Jon Carvill; Jongsoo Park; Joseph Rocca; Joshua Johnstun; Junteng Jia; Kalyan Vasuden Alwala; Kam Hou U; Kate Plawiak; Kartikeya Upasani; Kaushik Veeraraghavan; Ke Li; Kenneth Heafield; Kevin Stone; Khalid El-Arini; Krithika Iyer; Kshitiz Malik; Kuenley Chiu; Kunal Bhalla; Kyle Huang; Lakshya Garg; Lauren Rantala-Yeary; Laurens van der Maaten; Lawrence Chen; Leandro Silva; Lee Bell; Lei Zhang; Liang Tan; Louis Martin; Lovish Madaan; Luca Wehrstedt; Lukas Blecher; Luke de Oliveira; Madeline Muzzi; Madian Khabsa; Manav Avlani; Mannat Singh; Manohar Paluri; Mark Zuckerberg; Marcin Kardas; Martynas Mankus; Mathew Oldham; Mathieu Rita; Matthew Lennie; Maya Pavlova; Meghan Keneally; Melanie Kambadur; Mihir Patel; Mikayel Samvelyan; Mike Clark; Mike Lewis; Min Si; Mitesh Kumar Singh; Mo Metanat; Mona Hassan; Naman Goyal; Narjes Torabi; Nicolas Usunier; 
Nikolay Bashlykov; Nikolay Bogoychev; Niladri Chatterji; Ning Dong; Oliver Aobo Yang; Olivier Duchenne; Onur Celebi; Parth Parekh; Patrick Alrassy; Paul Saab; Pavan Balaji; Pedro Rittner; Pengchuan Zhang; Pengwei Li; Petar Vasic; Peter Weng; Polina Zvyagina; Prajjwal Bhargava; Pratik Dubal; Praveen Krishnan; Punit Singh Koura; Qing He; Rachel Rodriguez; Ragavan Srinivasan; Rahul Mitra; Ramon Calderer; Raymond Li; Robert Stojnic; Roberta Raileanu; Robin Battey; Rocky Wang; Rohit Girdhar; Rohit Patel; Romain Sauvestre; Ronnie Polidoro; Roshan Sumbaly; Ross Taylor; Ruan Silva; Rui Hou; Rui Wang; Russ Howes; Ruty Rinott; Saghar Hosseini; Sai Jayesh Bondu; Samyak Datta; Sanjay Singh; Sara Chugh; Sargun Dhillon; Satadru Pan; Sean Bell; Sergey Edunov; Shaoliang Nie; Sharan Narang; Sharath Raparthy; Shaun Lindsay; Sheng Feng; Sheng Shen; Shenghao Lin; Shiva Shankar; Shruti Bhosale; Shun Zhang; Simon Vandenhende; Sinong Wang; Seohyun Sonia Kim; Soumya Batra; Sten Sootla; Steve Kehoe; Suchin Gururangan; Sumit Gupta; Sunny Virk; Sydney Borodinsky; Tamar Glaser; Tamar Herman; Tamara Best; Tara Fowler; Thomas Georgiou; Thomas Scialom; Tianhe Li; Todor Mihaylov; Tong Xiao; Ujjwal Karn; Vedanuj Goswami; Vibhor Gupta; Vignesh Ramanathan; Viktor Kerkez; Vinay Satish Kumar; Vincent Gonguet; Vish Vogeti; Vlad Poenaru; Vlad Tiberiu Mihailescu; Vladan Petrovic; Vladimir Ivanov; Wei Li; Weiwei Chu; Wenhan Xiong; Wenyin Fu; Wes Bouaziz; Whitney Meers; Will Constable; Xavier Martinet; Xiaojian Wu; Xinbo Gao; Xinfeng Xie; Xuchao Jia; Yaelle Goldschlag; Yann LeCun; Yashesh Gaur; Yasmine Babaei; Ye Qi; Yenda Li; Yi Wen; Yiwen Song; Youngjin Nam; Yuchen Hao; Yuchen Zhang; Yun Wang; Yuning Mao; Yuzi He; Zacharie Delpierre Coudert; Zachary DeVito; Zahra Hankir; Zhaoduo Wen; Zheng Yan; Zhengxing Chen; Zhenyu Yang; Zoe Papakipos" ]
[ "TAGS\n#transformers #safetensors #llama #text-generation #meta #llama-3 #8-bit #conversational #en #base_model-gradientai/Llama-3-8B-Instruct-Gradient-1048k #license-llama3 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### Use with transformers\n\n\nYou can run conversational inference using the Transformers pipeline abstraction, or by leveraging the Auto classes with the 'generate()' function. Let's see examples of both.", "#### Transformers pipeline", "#### Transformers AutoModelForCausalLM", "### Use with 'llama3'\n\n\nPlease, follow the instructions in the repository\n\n\nTo download Original checkpoints, see the example command below leveraging 'huggingface-cli':\n\n\nFor Hugging Face support, we recommend using transformers or TGI, but a similar command works.\n\n\nHardware and Software\n---------------------\n\n\nTraining Factors We used custom training libraries, Meta's Research SuperCluster, and production clusters for pretraining. Fine-tuning, annotation, and evaluation were also performed on third-party cloud compute.\n\n\nCarbon Footprint Pretraining utilized a cumulative 7.7M GPU hours of computation on hardware of type H100-80GB (TDP of 700W). Estimated total emissions were 2290 tCO2eq, 100% of which were offset by Meta’s sustainability program.\n\n\n\nCO2 emissions during pre-training. Time: total GPU time required for training each model. Power Consumption: peak power capacity per GPU device for the GPUs used adjusted for power usage efficiency. 100% of the emissions are directly offset by Meta's sustainability program, and because we are openly releasing these models, the pretraining costs do not need to be incurred by others.\n\n\nTraining Data\n-------------\n\n\nOverview Llama 3 was pretrained on over 15 trillion tokens of data from publicly available sources. The fine-tuning data includes publicly available instruction datasets, as well as over 10M human-annotated examples. Neither the pretraining nor the fine-tuning datasets include Meta user data.\n\n\nData Freshness The pretraining data has a cutoff of March 2023 for the 7B and December 2023 for the 70B models respectively.\n\n\nBenchmarks\n----------\n\n\nIn this section, we report the results for Llama 3 models on standard automatic benchmarks. For all the evaluations, we use our internal evaluations library. For details on the methodology see here.", "### Base pretrained models", "### Instruction tuned models", "### Responsibility & Safety\n\n\nWe believe that an open approach to AI leads to better, safer products, faster innovation, and a bigger overall market. We are committed to Responsible AI development and took a series of steps to limit misuse and harm and support the open source community.\n\n\nFoundation models are widely capable technologies that are built to be used for a diverse range of applications. 
They are not designed to meet every developer preference on safety levels for all use cases, out-of-the-box, as those by their nature will differ across different applications.\n\n\nRather, responsible LLM-application deployment is achieved by implementing a series of safety best practices throughout the development of such applications, from the model pre-training, fine-tuning and the deployment of systems composed of safeguards to tailor the safety needs specifically to the use case and audience.\n\n\nAs part of the Llama 3 release, we updated our Responsible Use Guide to outline the steps and best practices for developers to implement model and system level safety for their application. We also provide a set of resources including Meta Llama Guard 2 and Code Shield safeguards. These tools have proven to drastically reduce residual risks of LLM Systems, while maintaining a high level of helpfulness. We encourage developers to tune and deploy these safeguards according to their needs and we provide a reference implementation to get you started.", "#### Llama 3-Instruct\n\n\nAs outlined in the Responsible Use Guide, some trade-off between model helpfulness and model alignment is likely unavoidable. Developers should exercise discretion about how to weigh the benefits of alignment and helpfulness for their specific use case and audience. Developers should be mindful of residual risks when using Llama models and leverage additional safety tools as needed to reach the right safety bar for their use case.\n\n\nSafety\n\n\nFor our instruction tuned model, we conducted extensive red teaming exercises, performed adversarial evaluations and implemented safety mitigations techniques to lower residual risks. As with any Large Language Model, residual risks will likely remain and we recommend that developers assess these risks in the context of their use case. In parallel, we are working with the community to make AI safety benchmark standards transparent, rigorous and interpretable.\n\n\nRefusals\n\n\nIn addition to residual risks, we put a great emphasis on model refusals to benign prompts. Over-refusing not only can impact the user experience but could even be harmful in certain contexts as well. We’ve heard the feedback from the developer community and improved our fine tuning to ensure that Llama 3 is significantly less likely to falsely refuse to answer prompts than Llama 2.\n\n\nWe built internal benchmarks and developed mitigations to limit false refusals making Llama 3 our most helpful model to date.", "#### Responsible release\n\n\nIn addition to responsible use considerations outlined above, we followed a rigorous process that requires us to take extra measures against misuse and critical risks before we make our release decision.\n\n\nMisuse\n\n\nIf you access or use Llama 3, you agree to the Acceptable Use Policy. 
The most recent copy of this policy can be found at URL", "#### Critical risks\n\n\nCBRNE (Chemical, Biological, Radiological, Nuclear, and high yield Explosives)\n\n\nWe have conducted a two fold assessment of the safety of the model in this area:\n\n\n* Iterative testing during model training to assess the safety of responses related to CBRNE threats and other adversarial risks.\n* Involving external CBRNE experts to conduct an uplift test assessing the ability of the model to accurately provide expert knowledge and reduce barriers to potential CBRNE misuse, by reference to what can be achieved using web search (without the model).", "### Cyber Security\n\n\nWe have evaluated Llama 3 with CyberSecEval, Meta’s cybersecurity safety eval suite, measuring Llama 3’s propensity to suggest insecure code when used as a coding assistant, and Llama 3’s propensity to comply with requests to help carry out cyber attacks, where attacks are defined by the industry standard MITRE ATT&CK cyber attack ontology. On our insecure coding and cyber attacker helpfulness tests, Llama 3 behaved in the same range or safer than models of equivalent coding capability.", "### Child Safety\n\n\nChild Safety risk assessments were conducted using a team of experts, to assess the model’s capability to produce outputs that could result in Child Safety risks and inform on any necessary and appropriate risk mitigations via fine tuning. We leveraged those expert red teaming sessions to expand the coverage of our evaluation benchmarks through Llama 3 model development. For Llama 3, we conducted new in-depth sessions using objective based methodologies to assess the model risks along multiple attack vectors. We also partnered with content specialists to perform red teaming exercises assessing potentially violating content while taking account of market specific nuances or experiences.", "### Community\n\n\nGenerative AI safety requires expertise and tooling, and we believe in the strength of the open community to accelerate its progress. We are active members of open consortiums, including the AI Alliance, Partnership in AI and MLCommons, actively contributing to safety standardization and transparency. We encourage the community to adopt taxonomies like the MLCommons Proof of Concept evaluation to facilitate collaboration and transparency on safety and content evaluations. Our Purple Llama tools are open sourced for the community to use and widely distributed across ecosystem partners including cloud service providers. We encourage community contributions to our Github repository.\n\n\nFinally, we put in place a set of resources including an output reporting mechanism and bug bounty program to continuously improve the Llama technology with the help of the community.\n\n\nEthical Considerations and Limitations\n--------------------------------------\n\n\nThe core values of Llama 3 are openness, inclusivity and helpfulness. It is meant to serve everyone, and to work for a wide range of use cases. It is thus designed to be accessible to people across many different backgrounds, experiences and perspectives. Llama 3 addresses users and their needs as they are, without insertion unnecessary judgment or normativity, while reflecting the understanding that even content that may appear problematic in some cases can serve valuable purposes in others. 
It respects the dignity and autonomy of all users, especially in terms of the values of free thought and expression that power innovation and progress.\n\n\nBut Llama 3 is a new technology, and like any new technology, there are risks associated with its use. Testing conducted to date has been in English, and has not covered, nor could it cover, all scenarios. For these reasons, as with all LLMs, Llama 3’s potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate, biased or other objectionable responses to user prompts. Therefore, before deploying any applications of Llama 3 models, developers should perform safety testing and tuning tailored to their specific applications of the model. As outlined in the Responsible Use Guide, we recommend incorporating Purple Llama solutions into your workflows and specifically Llama Guard which provides a base model to filter input and output prompts to layer system-level safety on top of model-level safety.\n\n\nPlease see the Responsible Use Guide available at URL\n\n\ninstructions\n\n\n@article{llama3modelcard,\n\n\ntitle={Llama 3 Model Card},\n\n\nauthor={AI@Meta},\n\n\nyear={2024},\n\n\nurl = {URL\n\n\n}\n\n\nContributors\n------------\n\n\nAaditya Singh; Aaron Grattafiori; Abhimanyu Dubey; Abhinav Jauhri; Abhinav Pandey; Abhishek Kadian; Adam Kelsey; Adi Gangidi; Ahmad Al-Dahle; Ahuva Goldstand; Aiesha Letman; Ajay Menon; Akhil Mathur; Alan Schelten; Alex Vaughan; Amy Yang; Andrei Lupu; Andres Alvarado; Andrew Gallagher; Andrew Gu; Andrew Ho; Andrew Poulton; Andrew Ryan; Angela Fan; Ankit Ramchandani; Anthony Hartshorn; Archi Mitra; Archie Sravankumar; Artem Korenev; Arun Rao; Ashley Gabriel; Ashwin Bharambe; Assaf Eisenman; Aston Zhang; Aurelien Rodriguez; Austen Gregerson; Ava Spataru; Baptiste Roziere; Ben Maurer; Benjamin Leonhardi; Bernie Huang; Bhargavi Paranjape; Bing Liu; Binh Tang; Bobbie Chern; Brani Stojkovic; Brian Fuller; Catalina Mejia Arenas; Chao Zhou; Charlotte Caucheteux; Chaya Nayak; Ching-Hsiang Chu; Chloe Bi; Chris Cai; Chris Cox; Chris Marra; Chris McConnell; Christian Keller; Christoph Feichtenhofer; Christophe Touret; Chunyang Wu; Corinne Wong; Cristian Canton Ferrer; Damien Allonsius; Daniel Kreymer; Daniel Haziza; Daniel Li; Danielle Pintz; Danny Livshits; Danny Wyatt; David Adkins; David Esiobu; David Xu; Davide Testuggine; Delia David; Devi Parikh; Dhruv Choudhary; Dhruv Mahajan; Diana Liskovich; Diego Garcia-Olano; Diego Perino; Dieuwke Hupkes; Dingkang Wang; Dustin Holland; Egor Lakomkin; Elina Lobanova; Xiaoqing Ellen Tan; Emily Dinan; Eric Smith; Erik Brinkman; Esteban Arcaute; Filip Radenovic; Firat Ozgenel; Francesco Caggioni; Frank Seide; Frank Zhang; Gabriel Synnaeve; Gabriella Schwarz; Gabrielle Lee; Gada Badeer; Georgia Anderson; Graeme Nail; Gregoire Mialon; Guan Pang; Guillem Cucurell; Hailey Nguyen; Hannah Korevaar; Hannah Wang; Haroun Habeeb; Harrison Rudolph; Henry Aspegren; Hu Xu; Hugo Touvron; Iga Kozlowska; Igor Molybog; Igor Tufanov; Iliyan Zarov; Imanol Arrieta Ibarra; Irina-Elena Veliche; Isabel Kloumann; Ishan Misra; Ivan Evtimov; Jacob Xu; Jade Copet; Jake Weissman; Jan Geffert; Jana Vranes; Japhet Asher; Jason Park; Jay Mahadeokar; Jean-Baptiste Gaya; Jeet Shah; Jelmer van der Linde; Jennifer Chan; Jenny Hong; Jenya Lee; Jeremy Fu; Jeremy Teboul; Jianfeng Chi; Jianyu Huang; Jie Wang; Jiecao Yu; Joanna Bitton; Joe Spisak; Joelle Pineau; Jon Carvill; Jongsoo Park; Joseph Rocca; Joshua Johnstun; Junteng Jia; Kalyan Vasuden Alwala; Kam Hou U; Kate 
Plawiak; Kartikeya Upasani; Kaushik Veeraraghavan; Ke Li; Kenneth Heafield; Kevin Stone; Khalid El-Arini; Krithika Iyer; Kshitiz Malik; Kuenley Chiu; Kunal Bhalla; Kyle Huang; Lakshya Garg; Lauren Rantala-Yeary; Laurens van der Maaten; Lawrence Chen; Leandro Silva; Lee Bell; Lei Zhang; Liang Tan; Louis Martin; Lovish Madaan; Luca Wehrstedt; Lukas Blecher; Luke de Oliveira; Madeline Muzzi; Madian Khabsa; Manav Avlani; Mannat Singh; Manohar Paluri; Mark Zuckerberg; Marcin Kardas; Martynas Mankus; Mathew Oldham; Mathieu Rita; Matthew Lennie; Maya Pavlova; Meghan Keneally; Melanie Kambadur; Mihir Patel; Mikayel Samvelyan; Mike Clark; Mike Lewis; Min Si; Mitesh Kumar Singh; Mo Metanat; Mona Hassan; Naman Goyal; Narjes Torabi; Nicolas Usunier; Nikolay Bashlykov; Nikolay Bogoychev; Niladri Chatterji; Ning Dong; Oliver Aobo Yang; Olivier Duchenne; Onur Celebi; Parth Parekh; Patrick Alrassy; Paul Saab; Pavan Balaji; Pedro Rittner; Pengchuan Zhang; Pengwei Li; Petar Vasic; Peter Weng; Polina Zvyagina; Prajjwal Bhargava; Pratik Dubal; Praveen Krishnan; Punit Singh Koura; Qing He; Rachel Rodriguez; Ragavan Srinivasan; Rahul Mitra; Ramon Calderer; Raymond Li; Robert Stojnic; Roberta Raileanu; Robin Battey; Rocky Wang; Rohit Girdhar; Rohit Patel; Romain Sauvestre; Ronnie Polidoro; Roshan Sumbaly; Ross Taylor; Ruan Silva; Rui Hou; Rui Wang; Russ Howes; Ruty Rinott; Saghar Hosseini; Sai Jayesh Bondu; Samyak Datta; Sanjay Singh; Sara Chugh; Sargun Dhillon; Satadru Pan; Sean Bell; Sergey Edunov; Shaoliang Nie; Sharan Narang; Sharath Raparthy; Shaun Lindsay; Sheng Feng; Sheng Shen; Shenghao Lin; Shiva Shankar; Shruti Bhosale; Shun Zhang; Simon Vandenhende; Sinong Wang; Seohyun Sonia Kim; Soumya Batra; Sten Sootla; Steve Kehoe; Suchin Gururangan; Sumit Gupta; Sunny Virk; Sydney Borodinsky; Tamar Glaser; Tamar Herman; Tamara Best; Tara Fowler; Thomas Georgiou; Thomas Scialom; Tianhe Li; Todor Mihaylov; Tong Xiao; Ujjwal Karn; Vedanuj Goswami; Vibhor Gupta; Vignesh Ramanathan; Viktor Kerkez; Vinay Satish Kumar; Vincent Gonguet; Vish Vogeti; Vlad Poenaru; Vlad Tiberiu Mihailescu; Vladan Petrovic; Vladimir Ivanov; Wei Li; Weiwei Chu; Wenhan Xiong; Wenyin Fu; Wes Bouaziz; Whitney Meers; Will Constable; Xavier Martinet; Xiaojian Wu; Xinbo Gao; Xinfeng Xie; Xuchao Jia; Yaelle Goldschlag; Yann LeCun; Yashesh Gaur; Yasmine Babaei; Ye Qi; Yenda Li; Yi Wen; Yiwen Song; Youngjin Nam; Yuchen Hao; Yuchen Zhang; Yun Wang; Yuning Mao; Yuzi He; Zacharie Delpierre Coudert; Zachary DeVito; Zahra Hankir; Zhaoduo Wen; Zheng Yan; Zhengxing Chen; Zhenyu Yang; Zoe Papakipos" ]
[ 80, 42, 6, 13, 429, 8, 6, 270, 280, 72, 115, 118, 126, 2136 ]
[ "TAGS\n#transformers #safetensors #llama #text-generation #meta #llama-3 #8-bit #conversational #en #base_model-gradientai/Llama-3-8B-Instruct-Gradient-1048k #license-llama3 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### Use with transformers\n\n\nYou can run conversational inference using the Transformers pipeline abstraction, or by leveraging the Auto classes with the 'generate()' function. Let's see examples of both.#### Transformers pipeline#### Transformers AutoModelForCausalLM### Use with 'llama3'\n\n\nPlease, follow the instructions in the repository\n\n\nTo download Original checkpoints, see the example command below leveraging 'huggingface-cli':\n\n\nFor Hugging Face support, we recommend using transformers or TGI, but a similar command works.\n\n\nHardware and Software\n---------------------\n\n\nTraining Factors We used custom training libraries, Meta's Research SuperCluster, and production clusters for pretraining. Fine-tuning, annotation, and evaluation were also performed on third-party cloud compute.\n\n\nCarbon Footprint Pretraining utilized a cumulative 7.7M GPU hours of computation on hardware of type H100-80GB (TDP of 700W). Estimated total emissions were 2290 tCO2eq, 100% of which were offset by Meta’s sustainability program.\n\n\n\nCO2 emissions during pre-training. Time: total GPU time required for training each model. Power Consumption: peak power capacity per GPU device for the GPUs used adjusted for power usage efficiency. 100% of the emissions are directly offset by Meta's sustainability program, and because we are openly releasing these models, the pretraining costs do not need to be incurred by others.\n\n\nTraining Data\n-------------\n\n\nOverview Llama 3 was pretrained on over 15 trillion tokens of data from publicly available sources. The fine-tuning data includes publicly available instruction datasets, as well as over 10M human-annotated examples. Neither the pretraining nor the fine-tuning datasets include Meta user data.\n\n\nData Freshness The pretraining data has a cutoff of March 2023 for the 7B and December 2023 for the 70B models respectively.\n\n\nBenchmarks\n----------\n\n\nIn this section, we report the results for Llama 3 models on standard automatic benchmarks. For all the evaluations, we use our internal evaluations library. For details on the methodology see here.### Base pretrained models### Instruction tuned models### Responsibility & Safety\n\n\nWe believe that an open approach to AI leads to better, safer products, faster innovation, and a bigger overall market. We are committed to Responsible AI development and took a series of steps to limit misuse and harm and support the open source community.\n\n\nFoundation models are widely capable technologies that are built to be used for a diverse range of applications. 
They are not designed to meet every developer preference on safety levels for all use cases, out-of-the-box, as those by their nature will differ across different applications.\n\n\nRather, responsible LLM-application deployment is achieved by implementing a series of safety best practices throughout the development of such applications, from the model pre-training, fine-tuning and the deployment of systems composed of safeguards to tailor the safety needs specifically to the use case and audience.\n\n\nAs part of the Llama 3 release, we updated our Responsible Use Guide to outline the steps and best practices for developers to implement model and system level safety for their application. We also provide a set of resources including Meta Llama Guard 2 and Code Shield safeguards. These tools have proven to drastically reduce residual risks of LLM Systems, while maintaining a high level of helpfulness. We encourage developers to tune and deploy these safeguards according to their needs and we provide a reference implementation to get you started.#### Llama 3-Instruct\n\n\nAs outlined in the Responsible Use Guide, some trade-off between model helpfulness and model alignment is likely unavoidable. Developers should exercise discretion about how to weigh the benefits of alignment and helpfulness for their specific use case and audience. Developers should be mindful of residual risks when using Llama models and leverage additional safety tools as needed to reach the right safety bar for their use case.\n\n\nSafety\n\n\nFor our instruction tuned model, we conducted extensive red teaming exercises, performed adversarial evaluations and implemented safety mitigations techniques to lower residual risks. As with any Large Language Model, residual risks will likely remain and we recommend that developers assess these risks in the context of their use case. In parallel, we are working with the community to make AI safety benchmark standards transparent, rigorous and interpretable.\n\n\nRefusals\n\n\nIn addition to residual risks, we put a great emphasis on model refusals to benign prompts. Over-refusing not only can impact the user experience but could even be harmful in certain contexts as well. We’ve heard the feedback from the developer community and improved our fine tuning to ensure that Llama 3 is significantly less likely to falsely refuse to answer prompts than Llama 2.\n\n\nWe built internal benchmarks and developed mitigations to limit false refusals making Llama 3 our most helpful model to date.#### Responsible release\n\n\nIn addition to responsible use considerations outlined above, we followed a rigorous process that requires us to take extra measures against misuse and critical risks before we make our release decision.\n\n\nMisuse\n\n\nIf you access or use Llama 3, you agree to the Acceptable Use Policy. 
The most recent copy of this policy can be found at URL#### Critical risks\n\n\nCBRNE (Chemical, Biological, Radiological, Nuclear, and high yield Explosives)\n\n\nWe have conducted a two fold assessment of the safety of the model in this area:\n\n\n* Iterative testing during model training to assess the safety of responses related to CBRNE threats and other adversarial risks.\n* Involving external CBRNE experts to conduct an uplift test assessing the ability of the model to accurately provide expert knowledge and reduce barriers to potential CBRNE misuse, by reference to what can be achieved using web search (without the model).### Cyber Security\n\n\nWe have evaluated Llama 3 with CyberSecEval, Meta’s cybersecurity safety eval suite, measuring Llama 3’s propensity to suggest insecure code when used as a coding assistant, and Llama 3’s propensity to comply with requests to help carry out cyber attacks, where attacks are defined by the industry standard MITRE ATT&CK cyber attack ontology. On our insecure coding and cyber attacker helpfulness tests, Llama 3 behaved in the same range or safer than models of equivalent coding capability.### Child Safety\n\n\nChild Safety risk assessments were conducted using a team of experts, to assess the model’s capability to produce outputs that could result in Child Safety risks and inform on any necessary and appropriate risk mitigations via fine tuning. We leveraged those expert red teaming sessions to expand the coverage of our evaluation benchmarks through Llama 3 model development. For Llama 3, we conducted new in-depth sessions using objective based methodologies to assess the model risks along multiple attack vectors. We also partnered with content specialists to perform red teaming exercises assessing potentially violating content while taking account of market specific nuances or experiences.### Community\n\n\nGenerative AI safety requires expertise and tooling, and we believe in the strength of the open community to accelerate its progress. We are active members of open consortiums, including the AI Alliance, Partnership in AI and MLCommons, actively contributing to safety standardization and transparency. We encourage the community to adopt taxonomies like the MLCommons Proof of Concept evaluation to facilitate collaboration and transparency on safety and content evaluations. Our Purple Llama tools are open sourced for the community to use and widely distributed across ecosystem partners including cloud service providers. We encourage community contributions to our Github repository.\n\n\nFinally, we put in place a set of resources including an output reporting mechanism and bug bounty program to continuously improve the Llama technology with the help of the community.\n\n\nEthical Considerations and Limitations\n--------------------------------------\n\n\nThe core values of Llama 3 are openness, inclusivity and helpfulness. It is meant to serve everyone, and to work for a wide range of use cases. It is thus designed to be accessible to people across many different backgrounds, experiences and perspectives. Llama 3 addresses users and their needs as they are, without insertion unnecessary judgment or normativity, while reflecting the understanding that even content that may appear problematic in some cases can serve valuable purposes in others. 
It respects the dignity and autonomy of all users, especially in terms of the values of free thought and expression that power innovation and progress.\n\n\nBut Llama 3 is a new technology, and like any new technology, there are risks associated with its use. Testing conducted to date has been in English, and has not covered, nor could it cover, all scenarios. For these reasons, as with all LLMs, Llama 3’s potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate, biased or other objectionable responses to user prompts. Therefore, before deploying any applications of Llama 3 models, developers should perform safety testing and tuning tailored to their specific applications of the model. As outlined in the Responsible Use Guide, we recommend incorporating Purple Llama solutions into your workflows and specifically Llama Guard which provides a base model to filter input and output prompts to layer system-level safety on top of model-level safety.\n\n\nPlease see the Responsible Use Guide available at URL\n\n\ninstructions\n\n\n@article{llama3modelcard,\n\n\ntitle={Llama 3 Model Card},\n\n\nauthor={AI@Meta},\n\n\nyear={2024},\n\n\nurl = {URL\n\n\n}\n\n\nContributors\n------------\n\n\nAaditya Singh; Aaron Grattafiori; Abhimanyu Dubey; Abhinav Jauhri; Abhinav Pandey; Abhishek Kadian; Adam Kelsey; Adi Gangidi; Ahmad Al-Dahle; Ahuva Goldstand; Aiesha Letman; Ajay Menon; Akhil Mathur; Alan Schelten; Alex Vaughan; Amy Yang; Andrei Lupu; Andres Alvarado; Andrew Gallagher; Andrew Gu; Andrew Ho; Andrew Poulton; Andrew Ryan; Angela Fan; Ankit Ramchandani; Anthony Hartshorn; Archi Mitra; Archie Sravankumar; Artem Korenev; Arun Rao; Ashley Gabriel; Ashwin Bharambe; Assaf Eisenman; Aston Zhang; Aurelien Rodriguez; Austen Gregerson; Ava Spataru; Baptiste Roziere; Ben Maurer; Benjamin Leonhardi; Bernie Huang; Bhargavi Paranjape; Bing Liu; Binh Tang; Bobbie Chern; Brani Stojkovic; Brian Fuller; Catalina Mejia Arenas; Chao Zhou; Charlotte Caucheteux; Chaya Nayak; Ching-Hsiang Chu; Chloe Bi; Chris Cai; Chris Cox; Chris Marra; Chris McConnell; Christian Keller; Christoph Feichtenhofer; Christophe Touret; Chunyang Wu; Corinne Wong; Cristian Canton Ferrer; Damien Allonsius; Daniel Kreymer; Daniel Haziza; Daniel Li; Danielle Pintz; Danny Livshits; Danny Wyatt; David Adkins; David Esiobu; David Xu; Davide Testuggine; Delia David; Devi Parikh; Dhruv Choudhary; Dhruv Mahajan; Diana Liskovich; Diego Garcia-Olano; Diego Perino; Dieuwke Hupkes; Dingkang Wang; Dustin Holland; Egor Lakomkin; Elina Lobanova; Xiaoqing Ellen Tan; Emily Dinan; Eric Smith; Erik Brinkman; Esteban Arcaute; Filip Radenovic; Firat Ozgenel; Francesco Caggioni; Frank Seide; Frank Zhang; Gabriel Synnaeve; Gabriella Schwarz; Gabrielle Lee; Gada Badeer; Georgia Anderson; Graeme Nail; Gregoire Mialon; Guan Pang; Guillem Cucurell; Hailey Nguyen; Hannah Korevaar; Hannah Wang; Haroun Habeeb; Harrison Rudolph; Henry Aspegren; Hu Xu; Hugo Touvron; Iga Kozlowska; Igor Molybog; Igor Tufanov; Iliyan Zarov; Imanol Arrieta Ibarra; Irina-Elena Veliche; Isabel Kloumann; Ishan Misra; Ivan Evtimov; Jacob Xu; Jade Copet; Jake Weissman; Jan Geffert; Jana Vranes; Japhet Asher; Jason Park; Jay Mahadeokar; Jean-Baptiste Gaya; Jeet Shah; Jelmer van der Linde; Jennifer Chan; Jenny Hong; Jenya Lee; Jeremy Fu; Jeremy Teboul; Jianfeng Chi; Jianyu Huang; Jie Wang; Jiecao Yu; Joanna Bitton; Joe Spisak; Joelle Pineau; Jon Carvill; Jongsoo Park; Joseph Rocca; Joshua Johnstun; Junteng Jia; Kalyan Vasuden Alwala; Kam Hou U; Kate 
Plawiak; Kartikeya Upasani; Kaushik Veeraraghavan; Ke Li; Kenneth Heafield; Kevin Stone; Khalid El-Arini; Krithika Iyer; Kshitiz Malik; Kuenley Chiu; Kunal Bhalla; Kyle Huang; Lakshya Garg; Lauren Rantala-Yeary; Laurens van der Maaten; Lawrence Chen; Leandro Silva; Lee Bell; Lei Zhang; Liang Tan; Louis Martin; Lovish Madaan; Luca Wehrstedt; Lukas Blecher; Luke de Oliveira; Madeline Muzzi; Madian Khabsa; Manav Avlani; Mannat Singh; Manohar Paluri; Mark Zuckerberg; Marcin Kardas; Martynas Mankus; Mathew Oldham; Mathieu Rita; Matthew Lennie; Maya Pavlova; Meghan Keneally; Melanie Kambadur; Mihir Patel; Mikayel Samvelyan; Mike Clark; Mike Lewis; Min Si; Mitesh Kumar Singh; Mo Metanat; Mona Hassan; Naman Goyal; Narjes Torabi; Nicolas Usunier; Nikolay Bashlykov; Nikolay Bogoychev; Niladri Chatterji; Ning Dong; Oliver Aobo Yang; Olivier Duchenne; Onur Celebi; Parth Parekh; Patrick Alrassy; Paul Saab; Pavan Balaji; Pedro Rittner; Pengchuan Zhang; Pengwei Li; Petar Vasic; Peter Weng; Polina Zvyagina; Prajjwal Bhargava; Pratik Dubal; Praveen Krishnan; Punit Singh Koura; Qing He; Rachel Rodriguez; Ragavan Srinivasan; Rahul Mitra; Ramon Calderer; Raymond Li; Robert Stojnic; Roberta Raileanu; Robin Battey; Rocky Wang; Rohit Girdhar; Rohit Patel; Romain Sauvestre; Ronnie Polidoro; Roshan Sumbaly; Ross Taylor; Ruan Silva; Rui Hou; Rui Wang; Russ Howes; Ruty Rinott; Saghar Hosseini; Sai Jayesh Bondu; Samyak Datta; Sanjay Singh; Sara Chugh; Sargun Dhillon; Satadru Pan; Sean Bell; Sergey Edunov; Shaoliang Nie; Sharan Narang; Sharath Raparthy; Shaun Lindsay; Sheng Feng; Sheng Shen; Shenghao Lin; Shiva Shankar; Shruti Bhosale; Shun Zhang; Simon Vandenhende; Sinong Wang; Seohyun Sonia Kim; Soumya Batra; Sten Sootla; Steve Kehoe; Suchin Gururangan; Sumit Gupta; Sunny Virk; Sydney Borodinsky; Tamar Glaser; Tamar Herman; Tamara Best; Tara Fowler; Thomas Georgiou; Thomas Scialom; Tianhe Li; Todor Mihaylov; Tong Xiao; Ujjwal Karn; Vedanuj Goswami; Vibhor Gupta; Vignesh Ramanathan; Viktor Kerkez; Vinay Satish Kumar; Vincent Gonguet; Vish Vogeti; Vlad Poenaru; Vlad Tiberiu Mihailescu; Vladan Petrovic; Vladimir Ivanov; Wei Li; Weiwei Chu; Wenhan Xiong; Wenyin Fu; Wes Bouaziz; Whitney Meers; Will Constable; Xavier Martinet; Xiaojian Wu; Xinbo Gao; Xinfeng Xie; Xuchao Jia; Yaelle Goldschlag; Yann LeCun; Yashesh Gaur; Yasmine Babaei; Ye Qi; Yenda Li; Yi Wen; Yiwen Song; Youngjin Nam; Yuchen Hao; Yuchen Zhang; Yun Wang; Yuning Mao; Yuzi He; Zacharie Delpierre Coudert; Zachary DeVito; Zahra Hankir; Zhaoduo Wen; Zheng Yan; Zhengxing Chen; Zhenyu Yang; Zoe Papakipos" ]
text-generation
transformers
# phi-2-int8-ov * Model creator: [Microsoft](https://huggingface.co/microsoft) * Original model: [phi-2](https://huggingface.co/microsoft/phi-2) ## Description This is [phi-2](https://huggingface.co/microsoft/phi-2) model converted to the [OpenVINO™ IR](https://docs.openvino.ai/2024/documentation/openvino-ir-format.html) (Intermediate Representation) format with weights compressed to INT8 by [NNCF](https://github.com/openvinotoolkit/nncf). ## Quantization Parameters Weight compression was performed using `nncf.compress_weights` with the following parameters: * mode: **INT8_ASYM** For more information on quantization, check the [OpenVINO model optimization guide](https://docs.openvino.ai/2024/openvino-workflow/model-optimization-guide/weight-compression.html). ## Compatibility The provided OpenVINO™ IR model is compatible with: * OpenVINO version 2024.1.0 and higher * Optimum Intel 1.16.0 and higher ## Running Model Inference 1. Install packages required for using [Optimum Intel](https://huggingface.co/docs/optimum/intel/index) integration with the OpenVINO backend: ``` pip install optimum[openvino] ``` 2. Run model inference: ``` from transformers import AutoTokenizer from optimum.intel.openvino import OVModelForCausalLM model_id = "OpenVINO/phi-2-int8-ov" tokenizer = AutoTokenizer.from_pretrained(model_id) model = OVModelForCausalLM.from_pretrained(model_id) inputs = tokenizer("What is OpenVINO?", return_tensors="pt") outputs = model.generate(**inputs, max_length=200) text = tokenizer.batch_decode(outputs)[0] print(text) ``` For more examples and possible optimizations, refer to the [OpenVINO Large Language Model Inference Guide](https://docs.openvino.ai/2024/learn-openvino/llm_inference_guide.html). ## Limitations Check the original model card for [limitations](https://huggingface.co/microsoft/phi-2#limitations-of-phi-2). ## Legal information The original model is distributed under [MIT](https://huggingface.co/microsoft/phi-2/resolve/main/LICENSE) license. More details can be found in [original model card](https://huggingface.co/microsoft/phi-2).
{"language": ["en"], "license": "mit", "license_link": "https://huggingface.co/microsoft/phi-2/resolve/main/LICENSE"}
OpenVINO/phi-2-int8-ov
null
[ "transformers", "openvino", "phi", "text-generation", "en", "license:mit", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2024-04-30T14:48:03+00:00
[]
[ "en" ]
TAGS #transformers #openvino #phi #text-generation #en #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# phi-2-int8-ov * Model creator: Microsoft * Original model: phi-2 ## Description This is phi-2 model converted to the OpenVINO™ IR (Intermediate Representation) format with weights compressed to INT8 by NNCF. ## Quantization Parameters Weight compression was performed using 'nncf.compress_weights' with the following parameters: * mode: INT8_ASYM For more information on quantization, check the OpenVINO model optimization guide. ## Compatibility The provided OpenVINO™ IR model is compatible with: * OpenVINO version 2024.1.0 and higher * Optimum Intel 1.16.0 and higher ## Running Model Inference 1. Install packages required for using Optimum Intel integration with the OpenVINO backend: 2. Run model inference: For more examples and possible optimizations, refer to the OpenVINO Large Language Model Inference Guide. ## Limitations Check the original model card for limitations. ## Legal information The original model is distributed under MIT license. More details can be found in original model card.
[ "# phi-2-int8-ov\n\n * Model creator: Microsoft\n * Original model: phi-2", "## Description\n\nThis is phi-2 model converted to the OpenVINO™ IR (Intermediate Representation) format with weights compressed to INT8 by NNCF.", "## Quantization Parameters\n\nWeight compression was performed using 'nncf.compress_weights' with the following parameters:\n\n* mode: INT8_ASYM\n\nFor more information on quantization, check the OpenVINO model optimization guide.", "## Compatibility\n\nThe provided OpenVINO™ IR model is compatible with:\n\n* OpenVINO version 2024.1.0 and higher\n* Optimum Intel 1.16.0 and higher", "## Running Model Inference\n\n1. Install packages required for using Optimum Intel integration with the OpenVINO backend:\n\n\n\n2. Run model inference:\n\n\n\nFor more examples and possible optimizations, refer to the OpenVINO Large Language Model Inference Guide.", "## Limitations\n\nCheck the original model card for limitations.", "## Legal information\n\nThe original model is distributed under MIT license. More details can be found in original model card." ]
[ "TAGS\n#transformers #openvino #phi #text-generation #en #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# phi-2-int8-ov\n\n * Model creator: Microsoft\n * Original model: phi-2", "## Description\n\nThis is phi-2 model converted to the OpenVINO™ IR (Intermediate Representation) format with weights compressed to INT8 by NNCF.", "## Quantization Parameters\n\nWeight compression was performed using 'nncf.compress_weights' with the following parameters:\n\n* mode: INT8_ASYM\n\nFor more information on quantization, check the OpenVINO model optimization guide.", "## Compatibility\n\nThe provided OpenVINO™ IR model is compatible with:\n\n* OpenVINO version 2024.1.0 and higher\n* Optimum Intel 1.16.0 and higher", "## Running Model Inference\n\n1. Install packages required for using Optimum Intel integration with the OpenVINO backend:\n\n\n\n2. Run model inference:\n\n\n\nFor more examples and possible optimizations, refer to the OpenVINO Large Language Model Inference Guide.", "## Limitations\n\nCheck the original model card for limitations.", "## Legal information\n\nThe original model is distributed under MIT license. More details can be found in original model card." ]
[ 38, 22, 32, 50, 37, 48, 11, 23 ]
[ "TAGS\n#transformers #openvino #phi #text-generation #en #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# phi-2-int8-ov\n\n * Model creator: Microsoft\n * Original model: phi-2## Description\n\nThis is phi-2 model converted to the OpenVINO™ IR (Intermediate Representation) format with weights compressed to INT8 by NNCF.## Quantization Parameters\n\nWeight compression was performed using 'nncf.compress_weights' with the following parameters:\n\n* mode: INT8_ASYM\n\nFor more information on quantization, check the OpenVINO model optimization guide.## Compatibility\n\nThe provided OpenVINO™ IR model is compatible with:\n\n* OpenVINO version 2024.1.0 and higher\n* Optimum Intel 1.16.0 and higher## Running Model Inference\n\n1. Install packages required for using Optimum Intel integration with the OpenVINO backend:\n\n\n\n2. Run model inference:\n\n\n\nFor more examples and possible optimizations, refer to the OpenVINO Large Language Model Inference Guide.## Limitations\n\nCheck the original model card for limitations.## Legal information\n\nThe original model is distributed under MIT license. More details can be found in original model card." ]
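The phi-2-int8-ov record above states that weight compression was done with `nncf.compress_weights` in `INT8_ASYM` mode but does not show the call itself. Below is a minimal sketch of that step, assuming a phi-2 model already exported to OpenVINO IR; the input and output file paths are placeholders, not paths from the card.

```python
# Hedged sketch: INT8_ASYM weight compression of an exported OpenVINO IR model with NNCF.
# The .xml paths are assumptions for illustration only.
import nncf
import openvino as ov

core = ov.Core()
fp_model = core.read_model("phi-2-fp16/openvino_model.xml")   # hypothetical exported IR

# INT8_ASYM matches the mode reported in the card's quantization parameters.
int8_model = nncf.compress_weights(fp_model, mode=nncf.CompressWeightsMode.INT8_ASYM)

# Writes both the .xml topology and the compressed .bin weights.
ov.save_model(int8_model, "phi-2-int8/openvino_model.xml")
```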
text2text-generation
transformers
## Eval results on WikiLarge We obtain the following results on ```validation``` and ```test``` sets of WikiLarge: | Set | SARI | BLEU | |------------|-------|-------| | validation | 44.4 | 27.38 | | test | 39.16 | 31.62 | ## EASSE evaluation ### TurkCorpus We obtain the following results on ```validation``` and ```test``` sets of the Turk corpus: | Set | SARI | BLEU | FKGL | |------------|-------|-------|------| | validation | 38.57 | 87.7 | 8.1 | | test | 38.82 | 87.89 | 8.49 | ### ASSET We obtain the following results on ```validation``` and ```test``` sets of the ASSET corpus: | Set | SARI | BLEU | FKGL | |------------|-------|-------|------| | validation | 36.83 | 88.54 | 8.15 | | test | 35.74 | 86.96 | 8.46 |
{"language": ["en"], "tags": ["sentence-simplification"], "metrics": ["sari", "bleu"], "pipeline_tag": "text2text-generation"}
waboucay/bart-large-simplification-wikilarge-original
null
[ "transformers", "safetensors", "bart", "text2text-generation", "sentence-simplification", "en", "autotrain_compatible", "endpoints_compatible", "region:us" ]
null
2024-04-30T14:48:38+00:00
[]
[ "en" ]
TAGS #transformers #safetensors #bart #text2text-generation #sentence-simplification #en #autotrain_compatible #endpoints_compatible #region-us
Eval results on WikiLarge ------------------------- We obtain the following results on and sets of WikiLarge: Set: validation, SARI: 44.4, BLEU: 27.38 Set: test, SARI: 39.16, BLEU: 31.62 EASSE evaluation ---------------- ### TurkCorpus We obtain the following results on and sets of the Turk corpus: ### ASSET We obtain the following results on and sets of the ASSET corpus:
[ "### TurkCorpus\n\n\nWe obtain the following results on and sets of the Turk corpus:", "### ASSET\n\n\nWe obtain the following results on and sets of the ASSET corpus:" ]
[ "TAGS\n#transformers #safetensors #bart #text2text-generation #sentence-simplification #en #autotrain_compatible #endpoints_compatible #region-us \n", "### TurkCorpus\n\n\nWe obtain the following results on and sets of the Turk corpus:", "### ASSET\n\n\nWe obtain the following results on and sets of the ASSET corpus:" ]
[ 37, 19, 17 ]
[ "TAGS\n#transformers #safetensors #bart #text2text-generation #sentence-simplification #en #autotrain_compatible #endpoints_compatible #region-us \n### TurkCorpus\n\n\nWe obtain the following results on and sets of the Turk corpus:### ASSET\n\n\nWe obtain the following results on and sets of the ASSET corpus:" ]
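The simplification record above reports SARI, BLEU and FKGL but does not show how predictions are produced or scored. A hedged sketch using the `evaluate` implementation of SARI is given below; the source sentence and reference are invented for illustration and may not match the EASSE setup used for the tables.

```python
# Hedged sketch: generate a simplification and score it with SARI.
# The example sentence and reference are made up; they are not from WikiLarge/ASSET/TurkCorpus.
import evaluate
from transformers import pipeline

simplifier = pipeline("text2text-generation",
                      model="waboucay/bart-large-simplification-wikilarge-original")

sources = ["The cat perched imperiously on the mat, surveying its domain."]
references = [["The cat sat on the mat."]]  # one list of reference simplifications per source

predictions = [simplifier(s, max_length=64)[0]["generated_text"] for s in sources]

sari = evaluate.load("sari")
print(sari.compute(sources=sources, predictions=predictions, references=references))
```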
text-generation
transformers
# **csg-wukong-1B-sft-bf16** [[中文]](#chinese) [[English]](#english) <a id="english"></a> <p align="center"> <img width="900px" alt="OpenCSG" src="./csg-wukong-logo-green.jpg"> </p> <p align="center"><a href="https://portal.opencsg.com/models">[OpenCSG Community]</a> <a href="https://github.com/opencsgs">[github]</a> <a href="https://cdn-uploads.huggingface.co/production/uploads/64c71b27d43e4dee51a8b31a/HU6vz21qKTEmUBCWqCFh9.jpeg">[wechat]</a> <a href="https://twitter.com/OpenCsg">[Twitter]</a> </p> </div> OpenCSG stands for Converged resources, Software refinement, and Generative LM. The 'C' represents Converged resources, indicating the integration and full utilization of hybrid resources. The 'S' stands for Software refinement, signifying software that is refined by large models. The 'G' represents Generative LM, which denotes widespread, inclusive, and democratized generative large models. The vision of OpenCSG is to empower every industry, every company, and every individual to own their models. We adhere to the principles of openness and open source, making the large model software stack of OpenCSG available to the community. We welcome everyone to use, send feedback, and contribute collaboratively. ## Model Description **csg-wukong-1B-sft-bf16** was finetuned on [csg-wukong-1B](https://huggingface.co/opencsg/csg-wukong-1B). <br> we will introduce more information about csg-wukong-1B. ## Model Evaluation results We submitted csg-wukong-1B on the [open_llm_leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard), and the results show our model ranked the 8th among the ~1.5B pretrained small language models. ![image/png](https://cdn-uploads.huggingface.co/production/uploads/661790397437201d78141856/_HRTxL6N0qnNPNt-P8k9k.png) # Training ## Hardware - **GPUs:** 16 H800 - **Training time:** 43days ## Software - **Orchestration:** [Deepspeed](https://github.com/OpenCSGs) - **Neural networks:** [PyTorch](https://github.com/pytorch/pytorch) - **BP16 if applicable:** [apex](https://github.com/NVIDIA/apex) <a id="chinese"></a> <p> </p> # OpenCSG介绍 <p align="center"> <img width="300px" alt="OpenCSG" src="https://cdn-uploads.huggingface.co/production/uploads/64c71b27d43e4dee51a8b31a/GwYXPKuEoGCGcMICeW-sb.jpeg"> </p> <p align="center"><a href="https://opencsg.com/models">[OpenCSG 社区]</a> <a href="https://github.com/opencsgs">[github]</a> <a href="https://cdn-uploads.huggingface.co/production/uploads/64c71b27d43e4dee51a8b31a/HU6vz21qKTEmUBCWqCFh9.jpeg">[微信]</a> <a href="https://twitter.com/OpenCsg">[推特]</a> </p> </div> OpenCSG中 Open是开源开放;C 代表 Converged resources,整合和充分利用的混合异构资源优势,算力降本增效;S 代表 Software refined,重新定义软件的交付方式,通过大模型驱动软件开发,人力降本增效;G 代表 Generative LM,大众化、普惠化和民主化的可商用的开源生成式大模型。 OpenCSG的愿景是让每个行业、每个公司、每个人都拥有自己的模型。 我们坚持开源开放的原则,将OpenCSG的大模型软件栈开源到社区,欢迎使用、反馈和参与共建,欢迎关注。 ## 模型介绍 **csg-wukong-1B-sft-bf16** 在[csg-wukong-1B](https://huggingface.co/opencsg/csg-wukong-1B)预训练模型上微调而成. <br> 我们将在后面介绍更多关于这个模型的信息。 ## 模型评测结果 我们把csg-wukong-1B模型提交到[open_llm_leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)榜单上,结果显示我们的模型目前在~1.5B小语言模型中排名第8。 ![image/png](https://cdn-uploads.huggingface.co/production/uploads/661790397437201d78141856/ZfWZ1Fd7ccKrJVx0okV9z.png) # 训练 ## 硬件资源 - **GPU数量:** 16 H800 - **训练时间:** 43天 ## 软件使用 - **微调训练框架:** [Deepspeed](https://github.com/OpenCSGs) - **深度学习框架:** [PyTorch](https://github.com/pytorch/pytorch) - **BP16:** [apex](https://github.com/NVIDIA/apex)
{"language": ["en"], "license": "apache-2.0", "tags": ["code"], "pipeline_tag": "text-generation"}
opencsg/csg-wukong-1B-sft-bf16
null
[ "transformers", "safetensors", "llama", "text-generation", "code", "conversational", "en", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2024-04-30T14:48:46+00:00
[]
[ "en" ]
TAGS #transformers #safetensors #llama #text-generation #code #conversational #en #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# csg-wukong-1B-sft-bf16 [[中文]](#chinese) [[English]](#english) <a id="english"></a> <p align="center"> <img width="900px" alt="OpenCSG" src="./URL"> </p> <p align="center"><a href="URL Community]</a> <a href="URL <a href="URL <a href="URL </p> </div> OpenCSG stands for Converged resources, Software refinement, and Generative LM. The 'C' represents Converged resources, indicating the integration and full utilization of hybrid resources. The 'S' stands for Software refinement, signifying software that is refined by large models. The 'G' represents Generative LM, which denotes widespread, inclusive, and democratized generative large models. The vision of OpenCSG is to empower every industry, every company, and every individual to own their models. We adhere to the principles of openness and open source, making the large model software stack of OpenCSG available to the community. We welcome everyone to use, send feedback, and contribute collaboratively. ## Model Description csg-wukong-1B-sft-bf16 was finetuned on csg-wukong-1B. <br> we will introduce more information about csg-wukong-1B. ## Model Evaluation results We submitted csg-wukong-1B on the open_llm_leaderboard, and the results show our model ranked the 8th among the ~1.5B pretrained small language models. !image/png # Training ## Hardware - GPUs: 16 H800 - Training time: 43days ## Software - Orchestration: Deepspeed - Neural networks: PyTorch - BP16 if applicable: apex <a id="chinese"></a> <p> </p> # OpenCSG介绍 <p align="center"> <img width="300px" alt="OpenCSG" src="URL </p> <p align="center"><a href="URL 社区]</a> <a href="URL <a href="URL[微信]</a> <a href="URL[推特]</a> </p> </div> OpenCSG中 Open是开源开放;C 代表 Converged resources,整合和充分利用的混合异构资源优势,算力降本增效;S 代表 Software refined,重新定义软件的交付方式,通过大模型驱动软件开发,人力降本增效;G 代表 Generative LM,大众化、普惠化和民主化的可商用的开源生成式大模型。 OpenCSG的愿景是让每个行业、每个公司、每个人都拥有自己的模型。 我们坚持开源开放的原则,将OpenCSG的大模型软件栈开源到社区,欢迎使用、反馈和参与共建,欢迎关注。 ## 模型介绍 csg-wukong-1B-sft-bf16 在csg-wukong-1B预训练模型上微调而成. <br> 我们将在后面介绍更多关于这个模型的信息。 ## 模型评测结果 我们把csg-wukong-1B模型提交到open_llm_leaderboard榜单上,结果显示我们的模型目前在~1.5B小语言模型中排名第8。 !image/png # 训练 ## 硬件资源 - GPU数量: 16 H800 - 训练时间: 43天 ## 软件使用 - 微调训练框架: Deepspeed - 深度学习框架: PyTorch - BP16: apex
[ "# csg-wukong-1B-sft-bf16 [[中文]](#chinese) [[English]](#english)\n\n<a id=\"english\"></a>\n\n<p align=\"center\">\n<img width=\"900px\" alt=\"OpenCSG\" src=\"./URL\">\n</p>\n\n<p align=\"center\"><a href=\"URL Community]</a> <a href=\"URL <a href=\"URL <a href=\"URL </p>\n\n\n</div>\nOpenCSG stands for Converged resources, Software refinement, and Generative LM. The 'C' represents Converged resources, indicating the integration and full utilization of hybrid resources. The 'S' stands for Software refinement, signifying software that is refined by large models. The 'G' represents Generative LM, which denotes widespread, inclusive, and democratized generative large models.\n\nThe vision of OpenCSG is to empower every industry, every company, and every individual to own their models. We adhere to the principles of openness and open source, making the large model software stack of OpenCSG available to the community. We welcome everyone to use, send feedback, and contribute collaboratively.", "## Model Description\n\n\n\n\ncsg-wukong-1B-sft-bf16 was finetuned on csg-wukong-1B. \n<br>\nwe will introduce more information about csg-wukong-1B.", "## Model Evaluation results\n\nWe submitted csg-wukong-1B on the open_llm_leaderboard, and\nthe results show our model ranked the 8th among the ~1.5B pretrained small language models.\n\n\n!image/png", "# Training", "## Hardware\n\n- GPUs: 16 H800 \n- Training time: 43days", "## Software\n\n- Orchestration: Deepspeed\n- Neural networks: PyTorch\n- BP16 if applicable: apex\n\n\n<a id=\"chinese\"></a>\n\n<p>\n\n</p>", "# OpenCSG介绍\n\n\n<p align=\"center\">\n<img width=\"300px\" alt=\"OpenCSG\" src=\"URL\n</p>\n\n<p align=\"center\"><a href=\"URL 社区]</a> <a href=\"URL <a href=\"URL[微信]</a> <a href=\"URL[推特]</a> </p>\n\n\n\n</div>\nOpenCSG中 Open是开源开放;C 代表 Converged resources,整合和充分利用的混合异构资源优势,算力降本增效;S 代表 Software refined,重新定义软件的交付方式,通过大模型驱动软件开发,人力降本增效;G 代表 Generative LM,大众化、普惠化和民主化的可商用的开源生成式大模型。\n\nOpenCSG的愿景是让每个行业、每个公司、每个人都拥有自己的模型。 我们坚持开源开放的原则,将OpenCSG的大模型软件栈开源到社区,欢迎使用、反馈和参与共建,欢迎关注。", "## 模型介绍\n\n\ncsg-wukong-1B-sft-bf16 在csg-wukong-1B预训练模型上微调而成.\n<br>\n\n我们将在后面介绍更多关于这个模型的信息。", "## 模型评测结果\n\n我们把csg-wukong-1B模型提交到open_llm_leaderboard榜单上,结果显示我们的模型目前在~1.5B小语言模型中排名第8。\n\n\n!image/png", "# 训练", "## 硬件资源\n\n- GPU数量: 16 H800 \n- 训练时间: 43天", "## 软件使用\n\n- 微调训练框架: Deepspeed\n- 深度学习框架: PyTorch\n- BP16: apex" ]
[ "TAGS\n#transformers #safetensors #llama #text-generation #code #conversational #en #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# csg-wukong-1B-sft-bf16 [[中文]](#chinese) [[English]](#english)\n\n<a id=\"english\"></a>\n\n<p align=\"center\">\n<img width=\"900px\" alt=\"OpenCSG\" src=\"./URL\">\n</p>\n\n<p align=\"center\"><a href=\"URL Community]</a> <a href=\"URL <a href=\"URL <a href=\"URL </p>\n\n\n</div>\nOpenCSG stands for Converged resources, Software refinement, and Generative LM. The 'C' represents Converged resources, indicating the integration and full utilization of hybrid resources. The 'S' stands for Software refinement, signifying software that is refined by large models. The 'G' represents Generative LM, which denotes widespread, inclusive, and democratized generative large models.\n\nThe vision of OpenCSG is to empower every industry, every company, and every individual to own their models. We adhere to the principles of openness and open source, making the large model software stack of OpenCSG available to the community. We welcome everyone to use, send feedback, and contribute collaboratively.", "## Model Description\n\n\n\n\ncsg-wukong-1B-sft-bf16 was finetuned on csg-wukong-1B. \n<br>\nwe will introduce more information about csg-wukong-1B.", "## Model Evaluation results\n\nWe submitted csg-wukong-1B on the open_llm_leaderboard, and\nthe results show our model ranked the 8th among the ~1.5B pretrained small language models.\n\n\n!image/png", "# Training", "## Hardware\n\n- GPUs: 16 H800 \n- Training time: 43days", "## Software\n\n- Orchestration: Deepspeed\n- Neural networks: PyTorch\n- BP16 if applicable: apex\n\n\n<a id=\"chinese\"></a>\n\n<p>\n\n</p>", "# OpenCSG介绍\n\n\n<p align=\"center\">\n<img width=\"300px\" alt=\"OpenCSG\" src=\"URL\n</p>\n\n<p align=\"center\"><a href=\"URL 社区]</a> <a href=\"URL <a href=\"URL[微信]</a> <a href=\"URL[推特]</a> </p>\n\n\n\n</div>\nOpenCSG中 Open是开源开放;C 代表 Converged resources,整合和充分利用的混合异构资源优势,算力降本增效;S 代表 Software refined,重新定义软件的交付方式,通过大模型驱动软件开发,人力降本增效;G 代表 Generative LM,大众化、普惠化和民主化的可商用的开源生成式大模型。\n\nOpenCSG的愿景是让每个行业、每个公司、每个人都拥有自己的模型。 我们坚持开源开放的原则,将OpenCSG的大模型软件栈开源到社区,欢迎使用、反馈和参与共建,欢迎关注。", "## 模型介绍\n\n\ncsg-wukong-1B-sft-bf16 在csg-wukong-1B预训练模型上微调而成.\n<br>\n\n我们将在后面介绍更多关于这个模型的信息。", "## 模型评测结果\n\n我们把csg-wukong-1B模型提交到open_llm_leaderboard榜单上,结果显示我们的模型目前在~1.5B小语言模型中排名第8。\n\n\n!image/png", "# 训练", "## 硬件资源\n\n- GPU数量: 16 H800 \n- 训练时间: 43天", "## 软件使用\n\n- 微调训练框架: Deepspeed\n- 深度学习框架: PyTorch\n- BP16: apex" ]
[ 49, 288, 47, 52, 2, 18, 44, 302, 61, 67, 3, 24, 34 ]
[ "TAGS\n#transformers #safetensors #llama #text-generation #code #conversational #en #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# csg-wukong-1B-sft-bf16 [[中文]](#chinese) [[English]](#english)\n\n<a id=\"english\"></a>\n\n<p align=\"center\">\n<img width=\"900px\" alt=\"OpenCSG\" src=\"./URL\">\n</p>\n\n<p align=\"center\"><a href=\"URL Community]</a> <a href=\"URL <a href=\"URL <a href=\"URL </p>\n\n\n</div>\nOpenCSG stands for Converged resources, Software refinement, and Generative LM. The 'C' represents Converged resources, indicating the integration and full utilization of hybrid resources. The 'S' stands for Software refinement, signifying software that is refined by large models. The 'G' represents Generative LM, which denotes widespread, inclusive, and democratized generative large models.\n\nThe vision of OpenCSG is to empower every industry, every company, and every individual to own their models. We adhere to the principles of openness and open source, making the large model software stack of OpenCSG available to the community. We welcome everyone to use, send feedback, and contribute collaboratively.## Model Description\n\n\n\n\ncsg-wukong-1B-sft-bf16 was finetuned on csg-wukong-1B. \n<br>\nwe will introduce more information about csg-wukong-1B.## Model Evaluation results\n\nWe submitted csg-wukong-1B on the open_llm_leaderboard, and\nthe results show our model ranked the 8th among the ~1.5B pretrained small language models.\n\n\n!image/png# Training## Hardware\n\n- GPUs: 16 H800 \n- Training time: 43days## Software\n\n- Orchestration: Deepspeed\n- Neural networks: PyTorch\n- BP16 if applicable: apex\n\n\n<a id=\"chinese\"></a>\n\n<p>\n\n</p># OpenCSG介绍\n\n\n<p align=\"center\">\n<img width=\"300px\" alt=\"OpenCSG\" src=\"URL\n</p>\n\n<p align=\"center\"><a href=\"URL 社区]</a> <a href=\"URL <a href=\"URL[微信]</a> <a href=\"URL[推特]</a> </p>\n\n\n\n</div>\nOpenCSG中 Open是开源开放;C 代表 Converged resources,整合和充分利用的混合异构资源优势,算力降本增效;S 代表 Software refined,重新定义软件的交付方式,通过大模型驱动软件开发,人力降本增效;G 代表 Generative LM,大众化、普惠化和民主化的可商用的开源生成式大模型。\n\nOpenCSG的愿景是让每个行业、每个公司、每个人都拥有自己的模型。 我们坚持开源开放的原则,将OpenCSG的大模型软件栈开源到社区,欢迎使用、反馈和参与共建,欢迎关注。## 模型介绍\n\n\ncsg-wukong-1B-sft-bf16 在csg-wukong-1B预训练模型上微调而成.\n<br>\n\n我们将在后面介绍更多关于这个模型的信息。## 模型评测结果\n\n我们把csg-wukong-1B模型提交到open_llm_leaderboard榜单上,结果显示我们的模型目前在~1.5B小语言模型中排名第8。\n\n\n!image/png# 训练## 硬件资源\n\n- GPU数量: 16 H800 \n- 训练时间: 43天## 软件使用\n\n- 微调训练框架: Deepspeed\n- 深度学习框架: PyTorch\n- BP16: apex" ]
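The csg-wukong-1B-sft-bf16 record above describes the training setup but gives no usage snippet. A minimal generation sketch follows, assuming the checkpoint loads as a standard causal LM; the prompt and generation length are arbitrary.

```python
# Hedged sketch: plain text generation with the fine-tuned checkpoint.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "opencsg/csg-wukong-1B-sft-bf16"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

inputs = tokenizer("Write a short poem about open source software.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```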
object-detection
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
HeavenWaters/detr-finetuned-wheathead-v2
null
[ "transformers", "safetensors", "detr", "object-detection", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
null
2024-04-30T14:49:33+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #detr #object-detection #arxiv-1910.09700 #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #detr #object-detection #arxiv-1910.09700 #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ 33, 6, 4, 75, 23, 3, 5, 8, 9, 8, 34, 20, 4, 5, 5, 11, 13, 12, 3, 10, 6, 5, 6, 4, 5, 7, 49, 7, 7, 5, 5, 15, 7, 7, 8, 5 ]
[ "TAGS\n#transformers #safetensors #detr #object-detection #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact" ]
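The auto-generated card above says "Use the code below to get started with the model" but the snippet is missing. Going only by the `detr` and `object-detection` tags and the repository id, a hedged inference sketch follows; the image path and score threshold are assumptions.

```python
# Hedged sketch: running the DETR checkpoint as an object detector on one image.
# "wheat_field.jpg" and the 0.5 threshold are placeholders.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

repo_id = "HeavenWaters/detr-finetuned-wheathead-v2"
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForObjectDetection.from_pretrained(repo_id)

image = Image.open("wheat_field.jpg")
inputs = processor(images=image, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes into thresholded (xmin, ymin, xmax, ymax) detections.
target_sizes = torch.tensor([image.size[::-1]])  # PIL size is (width, height)
detections = processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]
for score, label, box in zip(detections["scores"], detections["labels"], detections["boxes"]):
    print(model.config.id2label[label.item()], round(score.item(), 3), box.tolist())
```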
feature-extraction
transformers
# phospho-small This is a SetFit model that can be used for Text Classification on CPU. The model has been trained using an efficient few-shot learning technique. ## Usage ```python from setfit import SetFitModel model = SetFitModel.from_pretrained("phospho-app/phospho-small-7c8faca") outputs = model.predict(["This is a sentence to classify", "Another sentence"]) # tensor([1, 0]) ``` ## References This work was possible thanks to the SetFit library and the work of: Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren (2022). Efficient Few-Shot Learning Without Prompts. ArXiv: [https://doi.org/10.48550/arxiv.2209.11055](https://doi.org/10.48550/arxiv.2209.11055)
{"language": "en", "license": "apache-2.0"}
phospho-app/phospho-small-7c8faca
null
[ "transformers", "safetensors", "mpnet", "feature-extraction", "en", "license:apache-2.0", "endpoints_compatible", "region:us" ]
null
2024-04-30T14:50:07+00:00
[]
[ "en" ]
TAGS #transformers #safetensors #mpnet #feature-extraction #en #license-apache-2.0 #endpoints_compatible #region-us
# phospho-small This is a SetFit model that can be used for Text Classification on CPU. The model has been trained using an efficient few-shot learning technique. ## Usage ## References This work was possible thanks to the SetFit library and the work of: Tunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren (2022). Efficient Few-Shot Learning Without Prompts. ArXiv: URL
[ "# phospho-small\n\nThis is a SetFit model that can be used for Text Classification on CPU.\n\nThe model has been trained using an efficient few-shot learning technique.", "## Usage", "## References\n\nThis work was possible thanks to the SetFit library and the work of:\n\nTunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren (2022). Efficient Few-Shot Learning Without Prompts. \n\nArXiv: URL" ]
[ "TAGS\n#transformers #safetensors #mpnet #feature-extraction #en #license-apache-2.0 #endpoints_compatible #region-us \n", "# phospho-small\n\nThis is a SetFit model that can be used for Text Classification on CPU.\n\nThe model has been trained using an efficient few-shot learning technique.", "## Usage", "## References\n\nThis work was possible thanks to the SetFit library and the work of:\n\nTunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren (2022). Efficient Few-Shot Learning Without Prompts. \n\nArXiv: URL" ]
[ 33, 38, 3, 78 ]
[ "TAGS\n#transformers #safetensors #mpnet #feature-extraction #en #license-apache-2.0 #endpoints_compatible #region-us \n# phospho-small\n\nThis is a SetFit model that can be used for Text Classification on CPU.\n\nThe model has been trained using an efficient few-shot learning technique.## Usage## References\n\nThis work was possible thanks to the SetFit library and the work of:\n\nTunstall, Lewis and Reimers, Nils and Jo, Unso Eun Seo and Bates, Luke and Korat, Daniel and Wasserblat, Moshe and Pereg, Oren (2022). Efficient Few-Shot Learning Without Prompts. \n\nArXiv: URL" ]
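The phospho-small record above credits SetFit's few-shot technique but only shows prediction. A sketch of the standard SetFit training recipe follows, using the older `SetFitTrainer` API; the base checkpoint, the tiny two-class dataset, and the hyperparameters are all placeholders rather than details taken from the card.

```python
# Hedged sketch of a SetFit few-shot training run; every concrete value is a placeholder.
from datasets import Dataset
from sentence_transformers.losses import CosineSimilarityLoss
from setfit import SetFitModel, SetFitTrainer

# Toy two-class data standing in for the (undisclosed) training set.
train_ds = Dataset.from_dict({
    "text": ["great service", "terrible support", "loved it", "never again"],
    "label": [1, 0, 1, 0],
})

model = SetFitModel.from_pretrained("sentence-transformers/paraphrase-mpnet-base-v2")
trainer = SetFitTrainer(
    model=model,
    train_dataset=train_ds,
    loss_class=CosineSimilarityLoss,
    batch_size=16,
    num_iterations=20,  # contrastive pairs generated per example
)
trainer.train()

print(model.predict(["the team was very helpful"]))
```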
null
peft
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # adapter_ThaiSC_LLM_Scamper This model is a fine-tuned version of [openthaigpt/openthaigpt-1.0.0-7b-chat](https://huggingface.co/openthaigpt/openthaigpt-1.0.0-7b-chat) on an unknown dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 8e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.03 - num_epochs: 1 ### Training results ### Framework versions - PEFT 0.8.2 - Transformers 4.38.0 - Pytorch 2.2.1+cu121 - Datasets 2.17.0 - Tokenizers 0.15.2
{"license": "apache-2.0", "library_name": "peft", "tags": ["trl", "sft", "generated_from_trainer"], "base_model": "openthaigpt/openthaigpt-1.0.0-7b-chat", "model-index": [{"name": "adapter_ThaiSC_LLM_Scamper", "results": []}]}
Thirawarit/adapter_ThaiSC_LLM_Scamper
null
[ "peft", "tensorboard", "safetensors", "trl", "sft", "generated_from_trainer", "base_model:openthaigpt/openthaigpt-1.0.0-7b-chat", "license:apache-2.0", "region:us" ]
null
2024-04-30T14:52:09+00:00
[]
[]
TAGS #peft #tensorboard #safetensors #trl #sft #generated_from_trainer #base_model-openthaigpt/openthaigpt-1.0.0-7b-chat #license-apache-2.0 #region-us
# adapter_ThaiSC_LLM_Scamper This model is a fine-tuned version of openthaigpt/openthaigpt-1.0.0-7b-chat on an unknown dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 8e-05 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.03 - num_epochs: 1 ### Training results ### Framework versions - PEFT 0.8.2 - Transformers 4.38.0 - Pytorch 2.2.1+cu121 - Datasets 2.17.0 - Tokenizers 0.15.2
[ "# adapter_ThaiSC_LLM_Scamper\n\nThis model is a fine-tuned version of openthaigpt/openthaigpt-1.0.0-7b-chat on an unknown dataset.", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 8e-05\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.03\n- num_epochs: 1", "### Training results", "### Framework versions\n\n- PEFT 0.8.2\n- Transformers 4.38.0\n- Pytorch 2.2.1+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.2" ]
[ "TAGS\n#peft #tensorboard #safetensors #trl #sft #generated_from_trainer #base_model-openthaigpt/openthaigpt-1.0.0-7b-chat #license-apache-2.0 #region-us \n", "# adapter_ThaiSC_LLM_Scamper\n\nThis model is a fine-tuned version of openthaigpt/openthaigpt-1.0.0-7b-chat on an unknown dataset.", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 8e-05\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.03\n- num_epochs: 1", "### Training results", "### Framework versions\n\n- PEFT 0.8.2\n- Transformers 4.38.0\n- Pytorch 2.2.1+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.2" ]
[ 60, 48, 7, 9, 9, 4, 108, 5, 52 ]
[ "TAGS\n#peft #tensorboard #safetensors #trl #sft #generated_from_trainer #base_model-openthaigpt/openthaigpt-1.0.0-7b-chat #license-apache-2.0 #region-us \n# adapter_ThaiSC_LLM_Scamper\n\nThis model is a fine-tuned version of openthaigpt/openthaigpt-1.0.0-7b-chat on an unknown dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 8e-05\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_ratio: 0.03\n- num_epochs: 1### Training results### Framework versions\n\n- PEFT 0.8.2\n- Transformers 4.38.0\n- Pytorch 2.2.1+cu121\n- Datasets 2.17.0\n- Tokenizers 0.15.2" ]
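The adapter record above lists the SFT hyperparameters but no way to run the adapter. A hedged sketch of attaching it to its base model with PEFT is shown below; the Thai prompt and generation settings are illustrative only.

```python
# Hedged sketch: load the base model, attach the PEFT adapter, generate.
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "openthaigpt/openthaigpt-1.0.0-7b-chat"
adapter_id = "Thirawarit/adapter_ThaiSC_LLM_Scamper"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype="auto")
model = PeftModel.from_pretrained(base, adapter_id)

# Illustrative Thai prompt: "Hello, please introduce yourself."
inputs = tokenizer("สวัสดีครับ ช่วยแนะนำตัวหน่อย", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```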
fill-mask
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # tda_1971_85_distilbert This model is a fine-tuned version of [distilbert-base-cased](https://huggingface.co/distilbert-base-cased) on the None dataset. It achieves the following results on the evaluation set: - Loss: 1.6890 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:-----:|:----:|:---------------:| | No log | 1.0 | 60 | 1.6985 | | No log | 2.0 | 120 | 1.7388 | | No log | 3.0 | 180 | 1.6725 | | No log | 4.0 | 240 | 1.6855 | | No log | 5.0 | 300 | 1.6056 | | No log | 6.0 | 360 | 1.7211 | | No log | 7.0 | 420 | 1.6464 | | No log | 8.0 | 480 | 1.6195 | | 1.7601 | 9.0 | 540 | 1.5872 | | 1.7601 | 10.0 | 600 | 1.6890 | ### Framework versions - Transformers 4.40.1 - Pytorch 2.2.1+cu121 - Datasets 2.19.0 - Tokenizers 0.19.1
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "base_model": "distilbert-base-cased", "model-index": [{"name": "tda_1971_85_distilbert", "results": []}]}
gc394/tda_1971_85_distilbert
null
[ "transformers", "tensorboard", "safetensors", "distilbert", "fill-mask", "generated_from_trainer", "base_model:distilbert-base-cased", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
null
2024-04-30T14:52:31+00:00
[]
[]
TAGS #transformers #tensorboard #safetensors #distilbert #fill-mask #generated_from_trainer #base_model-distilbert-base-cased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
tda\_1971\_85\_distilbert ========================= This model is a fine-tuned version of distilbert-base-cased on the None dataset. It achieves the following results on the evaluation set: * Loss: 1.6890 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 2e-05 * train\_batch\_size: 16 * eval\_batch\_size: 16 * seed: 42 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * num\_epochs: 10 * mixed\_precision\_training: Native AMP ### Training results ### Framework versions * Transformers 4.40.1 * Pytorch 2.2.1+cu121 * Datasets 2.19.0 * Tokenizers 0.19.1
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 10\n* mixed\\_precision\\_training: Native AMP", "### Training results", "### Framework versions\n\n\n* Transformers 4.40.1\n* Pytorch 2.2.1+cu121\n* Datasets 2.19.0\n* Tokenizers 0.19.1" ]
[ "TAGS\n#transformers #tensorboard #safetensors #distilbert #fill-mask #generated_from_trainer #base_model-distilbert-base-cased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 10\n* mixed\\_precision\\_training: Native AMP", "### Training results", "### Framework versions\n\n\n* Transformers 4.40.1\n* Pytorch 2.2.1+cu121\n* Datasets 2.19.0\n* Tokenizers 0.19.1" ]
[ 59, 112, 5, 44 ]
[ "TAGS\n#transformers #tensorboard #safetensors #distilbert #fill-mask #generated_from_trainer #base_model-distilbert-base-cased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 10\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.40.1\n* Pytorch 2.2.1+cu121\n* Datasets 2.19.0\n* Tokenizers 0.19.1" ]
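The tda_1971_85_distilbert record above documents training but no usage. Since the base model is distilbert-base-cased, the checkpoint can presumably be queried through the fill-mask pipeline with the `[MASK]` token; the example sentence below is invented.

```python
# Hedged sketch: querying the fine-tuned masked-LM checkpoint.
from transformers import pipeline

unmasker = pipeline("fill-mask", model="gc394/tda_1971_85_distilbert")
for candidate in unmasker("The committee approved the [MASK] yesterday."):
    print(candidate["token_str"], round(candidate["score"], 4))
```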
fill-mask
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # tda_1986_00_distilbert This model is a fine-tuned version of [distilbert-base-cased](https://huggingface.co/distilbert-base-cased) on the None dataset. It achieves the following results on the evaluation set: - Loss: 1.4929 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:-----:|:----:|:---------------:| | No log | 1.0 | 71 | 1.4492 | | No log | 2.0 | 142 | 1.4750 | | No log | 3.0 | 213 | 1.4390 | | No log | 4.0 | 284 | 1.4929 | | No log | 5.0 | 355 | 1.4365 | | No log | 6.0 | 426 | 1.3863 | | No log | 7.0 | 497 | 1.4428 | | 1.5109 | 8.0 | 568 | 1.4007 | | 1.5109 | 9.0 | 639 | 1.4325 | | 1.5109 | 10.0 | 710 | 1.4929 | ### Framework versions - Transformers 4.40.1 - Pytorch 2.2.1+cu121 - Datasets 2.19.0 - Tokenizers 0.19.1
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "base_model": "distilbert-base-cased", "model-index": [{"name": "tda_1986_00_distilbert", "results": []}]}
gc394/tda_1986_00_distilbert
null
[ "transformers", "tensorboard", "safetensors", "distilbert", "fill-mask", "generated_from_trainer", "base_model:distilbert-base-cased", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
null
2024-04-30T14:52:31+00:00
[]
[]
TAGS #transformers #tensorboard #safetensors #distilbert #fill-mask #generated_from_trainer #base_model-distilbert-base-cased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
tda\_1986\_00\_distilbert ========================= This model is a fine-tuned version of distilbert-base-cased on the None dataset. It achieves the following results on the evaluation set: * Loss: 1.4929 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 2e-05 * train\_batch\_size: 16 * eval\_batch\_size: 16 * seed: 42 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * num\_epochs: 10 * mixed\_precision\_training: Native AMP ### Training results ### Framework versions * Transformers 4.40.1 * Pytorch 2.2.1+cu121 * Datasets 2.19.0 * Tokenizers 0.19.1
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 10\n* mixed\\_precision\\_training: Native AMP", "### Training results", "### Framework versions\n\n\n* Transformers 4.40.1\n* Pytorch 2.2.1+cu121\n* Datasets 2.19.0\n* Tokenizers 0.19.1" ]
[ "TAGS\n#transformers #tensorboard #safetensors #distilbert #fill-mask #generated_from_trainer #base_model-distilbert-base-cased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 10\n* mixed\\_precision\\_training: Native AMP", "### Training results", "### Framework versions\n\n\n* Transformers 4.40.1\n* Pytorch 2.2.1+cu121\n* Datasets 2.19.0\n* Tokenizers 0.19.1" ]
[ 59, 112, 5, 44 ]
[ "TAGS\n#transformers #tensorboard #safetensors #distilbert #fill-mask #generated_from_trainer #base_model-distilbert-base-cased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 10\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.40.1\n* Pytorch 2.2.1+cu121\n* Datasets 2.19.0\n* Tokenizers 0.19.1" ]
fill-mask
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # tda_01_16_distilbert This model is a fine-tuned version of [distilbert-base-cased](https://huggingface.co/distilbert-base-cased) on the None dataset. It achieves the following results on the evaluation set: - Loss: 1.3717 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 5 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:-----:|:----:|:---------------:| | No log | 1.0 | 64 | 1.4189 | | No log | 2.0 | 128 | 1.3668 | | No log | 3.0 | 192 | 1.3829 | | No log | 4.0 | 256 | 1.3625 | | No log | 5.0 | 320 | 1.3717 | ### Framework versions - Transformers 4.40.0 - Pytorch 2.2.1+cu121 - Datasets 2.19.0 - Tokenizers 0.19.1
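The hyperparameter list above maps directly onto `transformers.TrainingArguments`; a hedged reconstruction is sketched below (the `output_dir` is a placeholder, and the Adam betas/epsilon listed in the card are the optimizer defaults, so they need no explicit arguments):

```python
from transformers import TrainingArguments

# Hedged reconstruction of the listed hyperparameters; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="tda_01_16_distilbert",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    num_train_epochs=5,
    lr_scheduler_type="linear",
    fp16=True,  # "Native AMP" mixed-precision training
)
```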
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "base_model": "distilbert-base-cased", "model-index": [{"name": "tda_01_16_distilbert", "results": []}]}
gc394/tda_01_16_distilbert
null
[ "transformers", "tensorboard", "safetensors", "distilbert", "fill-mask", "generated_from_trainer", "base_model:distilbert-base-cased", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
null
2024-04-30T14:52:32+00:00
[]
[]
TAGS #transformers #tensorboard #safetensors #distilbert #fill-mask #generated_from_trainer #base_model-distilbert-base-cased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
tda\_01\_16\_distilbert ======================= This model is a fine-tuned version of distilbert-base-cased on the None dataset. It achieves the following results on the evaluation set: * Loss: 1.3717 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 2e-05 * train\_batch\_size: 16 * eval\_batch\_size: 16 * seed: 42 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * num\_epochs: 5 * mixed\_precision\_training: Native AMP ### Training results ### Framework versions * Transformers 4.40.0 * Pytorch 2.2.1+cu121 * Datasets 2.19.0 * Tokenizers 0.19.1
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5\n* mixed\\_precision\\_training: Native AMP", "### Training results", "### Framework versions\n\n\n* Transformers 4.40.0\n* Pytorch 2.2.1+cu121\n* Datasets 2.19.0\n* Tokenizers 0.19.1" ]
[ "TAGS\n#transformers #tensorboard #safetensors #distilbert #fill-mask #generated_from_trainer #base_model-distilbert-base-cased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5\n* mixed\\_precision\\_training: Native AMP", "### Training results", "### Framework versions\n\n\n* Transformers 4.40.0\n* Pytorch 2.2.1+cu121\n* Datasets 2.19.0\n* Tokenizers 0.19.1" ]
[ 59, 112, 5, 44 ]
[ "TAGS\n#transformers #tensorboard #safetensors #distilbert #fill-mask #generated_from_trainer #base_model-distilbert-base-cased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 5\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.40.0\n* Pytorch 2.2.1+cu121\n* Datasets 2.19.0\n* Tokenizers 0.19.1" ]
text-generation
transformers
# Uploaded model - **Developed by:** Suru - **License:** apache-2.0 - **Finetuned from model :** unsloth/llama-3-8b-bnb-4bit This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library. [<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
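The card describes how the model was trained but not how to run it; a hedged inference sketch using Unsloth (which the card references) follows. It assumes the repository loads directly with `FastLanguageModel`, and the prompt is illustrative only:

```python
from unsloth import FastLanguageModel

# Hedged sketch: assumes this repo is loadable directly with Unsloth, as the card implies.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="Suru/Llama3_8b_gpt_Tweets",
    max_seq_length=2048,
    load_in_4bit=True,
)
FastLanguageModel.for_inference(model)  # switch to the faster inference path

inputs = tokenizer("Write a short tweet about open-source AI:", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```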
{"language": ["en"], "license": "apache-2.0", "tags": ["text-generation-inference", "transformers", "unsloth", "llama", "trl", "sft"], "base_model": "unsloth/llama-3-8b-bnb-4bit"}
Suru/Llama3_8b_gpt_Tweets
null
[ "transformers", "safetensors", "llama", "text-generation", "text-generation-inference", "unsloth", "trl", "sft", "en", "base_model:unsloth/llama-3-8b-bnb-4bit", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
null
2024-04-30T14:53:30+00:00
[]
[ "en" ]
TAGS #transformers #safetensors #llama #text-generation #text-generation-inference #unsloth #trl #sft #en #base_model-unsloth/llama-3-8b-bnb-4bit #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
# Uploaded model - Developed by: Suru - License: apache-2.0 - Finetuned from model : unsloth/llama-3-8b-bnb-4bit This llama model was trained 2x faster with Unsloth and Huggingface's TRL library. <img src="URL width="200"/>
[ "# Uploaded model\n\n- Developed by: Suru\n- License: apache-2.0\n- Finetuned from model : unsloth/llama-3-8b-bnb-4bit\n\nThis llama model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>" ]
[ "TAGS\n#transformers #safetensors #llama #text-generation #text-generation-inference #unsloth #trl #sft #en #base_model-unsloth/llama-3-8b-bnb-4bit #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n", "# Uploaded model\n\n- Developed by: Suru\n- License: apache-2.0\n- Finetuned from model : unsloth/llama-3-8b-bnb-4bit\n\nThis llama model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>" ]
[ 76, 79 ]
[ "TAGS\n#transformers #safetensors #llama #text-generation #text-generation-inference #unsloth #trl #sft #en #base_model-unsloth/llama-3-8b-bnb-4bit #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n# Uploaded model\n\n- Developed by: Suru\n- License: apache-2.0\n- Finetuned from model : unsloth/llama-3-8b-bnb-4bit\n\nThis llama model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>" ]
null
null
GGUF version of [sentence-transformers/multi-qa-MiniLM-L6-cos-v1](https://huggingface.co/sentence-transformers/multi-qa-MiniLM-L6-cos-v1).
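As a GGUF export of an embedding model, it is typically consumed through a llama.cpp-based runtime; a hedged sketch with `llama-cpp-python` follows. The `.gguf` filename is an assumption (check the repository's file list), and the sketch presumes a llama.cpp build with BERT-embedding support:

```python
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# The exact .gguf filename in the repo is assumed, not confirmed.
model_path = hf_hub_download(
    repo_id="Felladrin/gguf-multi-qa-MiniLM-L6-cos-v1",
    filename="multi-qa-MiniLM-L6-cos-v1.Q8_0.gguf",
)
llm = Llama(model_path=model_path, embedding=True)

vector = llm.embed("How do I install sentence-transformers?")
print(len(vector))  # expected: a 384-dimensional embedding
```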
{"base_model": "sentence-transformers/multi-qa-MiniLM-L6-cos-v1"}
Felladrin/gguf-multi-qa-MiniLM-L6-cos-v1
null
[ "gguf", "base_model:sentence-transformers/multi-qa-MiniLM-L6-cos-v1", "region:us" ]
null
2024-04-30T14:55:05+00:00
[]
[]
TAGS #gguf #base_model-sentence-transformers/multi-qa-MiniLM-L6-cos-v1 #region-us
GGUF version of sentence-transformers/multi-qa-MiniLM-L6-cos-v1.
[]
[ "TAGS\n#gguf #base_model-sentence-transformers/multi-qa-MiniLM-L6-cos-v1 #region-us \n" ]
[ 34 ]
[ "TAGS\n#gguf #base_model-sentence-transformers/multi-qa-MiniLM-L6-cos-v1 #region-us \n" ]
sentence-similarity
sentence-transformers
# {MODEL_NAME} This is a [sentence-transformers](https://www.SBERT.net) model: It maps sentences & paragraphs to a 384 dimensional dense vector space and can be used for tasks like clustering or semantic search. <!--- Describe your model here --> ## Usage (Sentence-Transformers) Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed: ``` pip install -U sentence-transformers ``` Then you can use the model like this: ```python from sentence_transformers import SentenceTransformer sentences = ["This is an example sentence", "Each sentence is converted"] model = SentenceTransformer('{MODEL_NAME}') embeddings = model.encode(sentences) print(embeddings) ``` ## Evaluation Results <!--- Describe how your model was evaluated --> For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name={MODEL_NAME}) ## Training The model was trained with the parameters: **DataLoader**: `torch.utils.data.dataloader.DataLoader` of length 137553 with parameters: ``` {'batch_size': 64, 'sampler': 'torch.utils.data.sampler.RandomSampler', 'batch_sampler': 'torch.utils.data.sampler.BatchSampler'} ``` **Loss**: `sentence_transformers.losses.MSELoss.MSELoss` Parameters of the fit()-Method: ``` { "epochs": 1, "evaluation_steps": 5000, "evaluator": "sentence_transformers.evaluation.SequentialEvaluator.SequentialEvaluator", "max_grad_norm": 1, "optimizer_class": "<class 'torch.optim.adamw.AdamW'>", "optimizer_params": { "eps": 1e-06, "lr": 0.0001 }, "scheduler": "WarmupLinear", "steps_per_epoch": null, "warmup_steps": 1000, "weight_decay": 0.01 } ``` ## Full Model Architecture ``` SentenceTransformer( (0): Transformer({'max_seq_length': 512, 'do_lower_case': True}) with Transformer model: BertModel (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) (2): Normalize() ) ``` ## Citing & Authors <!--- Describe where people can find more information -->
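The Training section above lists the `fit()` parameters but not the call itself; a hedged reconstruction follows, reading `{MODEL_NAME}` as `Mihaiii/test19` (the id recorded for this card). The two-sentence placeholder dataset is illustrative only — the card does not say what data or teacher embeddings were actually used for the MSE objective:

```python
from sentence_transformers import SentenceTransformer, InputExample, losses
from torch.utils.data import DataLoader

model = SentenceTransformer("Mihaiii/test19")

# Placeholder data: MSELoss regresses onto target vectors, so each example pairs
# a sentence with a 384-dim label (here the model's own embedding, purely for illustration).
sentences = ["This is an example sentence", "Each sentence is converted"]
targets = model.encode(sentences)
train_examples = [InputExample(texts=[s], label=t) for s, t in zip(sentences, targets)]

train_dataloader = DataLoader(train_examples, shuffle=True, batch_size=64)
train_loss = losses.MSELoss(model=model)

model.fit(
    train_objectives=[(train_dataloader, train_loss)],
    epochs=1,
    warmup_steps=1000,
    evaluation_steps=5000,
    max_grad_norm=1,
    optimizer_params={"lr": 1e-4, "eps": 1e-6},
    weight_decay=0.01,
)
```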
{"library_name": "sentence-transformers", "tags": ["sentence-transformers", "feature-extraction", "sentence-similarity"], "pipeline_tag": "sentence-similarity"}
Mihaiii/test19
null
[ "sentence-transformers", "safetensors", "bert", "feature-extraction", "sentence-similarity", "endpoints_compatible", "region:us" ]
null
2024-04-30T14:57:03+00:00
[]
[]
TAGS #sentence-transformers #safetensors #bert #feature-extraction #sentence-similarity #endpoints_compatible #region-us
# {MODEL_NAME} This is a sentence-transformers model: It maps sentences & paragraphs to a 384 dimensional dense vector space and can be used for tasks like clustering or semantic search. ## Usage (Sentence-Transformers) Using this model becomes easy when you have sentence-transformers installed: Then you can use the model like this: ## Evaluation Results For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: URL ## Training The model was trained with the parameters: DataLoader: 'URL.dataloader.DataLoader' of length 137553 with parameters: Loss: 'sentence_transformers.losses.MSELoss.MSELoss' Parameters of the fit()-Method: ## Full Model Architecture ## Citing & Authors
[ "# {MODEL_NAME}\n\nThis is a sentence-transformers model: It maps sentences & paragraphs to a 384 dimensional dense vector space and can be used for tasks like clustering or semantic search.", "## Usage (Sentence-Transformers)\n\nUsing this model becomes easy when you have sentence-transformers installed:\n\n\n\nThen you can use the model like this:", "## Evaluation Results\n\n\n\nFor an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: URL", "## Training\nThe model was trained with the parameters:\n\nDataLoader:\n\n'URL.dataloader.DataLoader' of length 137553 with parameters:\n\n\nLoss:\n\n'sentence_transformers.losses.MSELoss.MSELoss' \n\nParameters of the fit()-Method:", "## Full Model Architecture", "## Citing & Authors" ]
[ "TAGS\n#sentence-transformers #safetensors #bert #feature-extraction #sentence-similarity #endpoints_compatible #region-us \n", "# {MODEL_NAME}\n\nThis is a sentence-transformers model: It maps sentences & paragraphs to a 384 dimensional dense vector space and can be used for tasks like clustering or semantic search.", "## Usage (Sentence-Transformers)\n\nUsing this model becomes easy when you have sentence-transformers installed:\n\n\n\nThen you can use the model like this:", "## Evaluation Results\n\n\n\nFor an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: URL", "## Training\nThe model was trained with the parameters:\n\nDataLoader:\n\n'URL.dataloader.DataLoader' of length 137553 with parameters:\n\n\nLoss:\n\n'sentence_transformers.losses.MSELoss.MSELoss' \n\nParameters of the fit()-Method:", "## Full Model Architecture", "## Citing & Authors" ]
[ 28, 41, 30, 26, 61, 5, 5 ]
[ "TAGS\n#sentence-transformers #safetensors #bert #feature-extraction #sentence-similarity #endpoints_compatible #region-us \n# {MODEL_NAME}\n\nThis is a sentence-transformers model: It maps sentences & paragraphs to a 384 dimensional dense vector space and can be used for tasks like clustering or semantic search.## Usage (Sentence-Transformers)\n\nUsing this model becomes easy when you have sentence-transformers installed:\n\n\n\nThen you can use the model like this:## Evaluation Results\n\n\n\nFor an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: URL## Training\nThe model was trained with the parameters:\n\nDataLoader:\n\n'URL.dataloader.DataLoader' of length 137553 with parameters:\n\n\nLoss:\n\n'sentence_transformers.losses.MSELoss.MSELoss' \n\nParameters of the fit()-Method:## Full Model Architecture## Citing & Authors" ]
text-generation
transformers
# Wise-EMO-2B ## Overview Wise-EMO-2B is a 2.5 billion parameter conversational AI model that combines emotional intelligence capabilities from the EMO-2B model with insights from ancient Indian wisdom traditions. By integrating the wealth of knowledge from philosophies like Yoga, Vedanta, Buddhism, Jainism, Sikhism, Hinduism, and other indigenous spiritual practices, this model aims to provide emotionally resonant dialogue infused with profound philosophical perspectives. ## Key Features - **Emotional Intelligence**: Inherited from EMO-2B, Wise-EMO-2B excels at perceiving and responding to emotional undertones with empathy, providing emotionally supportive responses. - **Philosophical Wisdom**: The model has been further finetuned on texts from ancient Indian wisdom traditions, allowing it to draw upon profound philosophical concepts when appropriate. - **Holistic Perspective**: Wise-EMO-2B offers a holistic viewpoint that combines emotional resonance with spiritual and philosophical wisdom for a well-rounded, insightful conversational experience. - **Dynamic Contextualization**: The model adapts its communication style, emotional responses, and philosophical framing based on the specific conversational context. ## Use Cases Wise-EMO-2B can be beneficial for applications that require emotionally intelligent yet philosophically grounded dialogue, such as: - Emotional support companions with a wisdom-oriented perspective - Philosophical discussion and ideation - Storytelling and creative writing with profound themes - Personal growth, self-reflection, and mindfulness practice - Exploring ancient wisdom in relation to modern issues ## Limitations and Responsible Use While powerful, Wise-EMO-2B is an AI model with inherent limitations. Its responses should be viewed as a supportive tool rather than a substitute for professional mental health services or spiritual guidance. Additionally, the model may reflect biases from its training data. Users should think critically and report concerning outputs. As with all AI systems, Wise-EMO-2B should be used responsibly and ethically, particularly given the sensitivity around philosophy and spiritual beliefs. Outputs should be carefully evaluated in context. **This is mark one of Wise-EMO, It is just for testing purposes**
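The card stays at the level of capabilities and use cases; a hedged generation sketch, built from the widget example in this record's metadata (a user question about the Bhagavad Gita, `max_new_tokens: 500`), is below. It assumes a transformers version whose text-generation pipeline accepts chat-style message lists:

```python
from transformers import pipeline

# Hedged sketch based on the widget example recorded in this card's metadata.
chat = pipeline("text-generation", model="OEvortex/Wise-EMO-2B")

messages = [
    {
        "role": "user",
        "content": "What insights can we gain from the Bhagavad Gita about the path of selfless action and devotion?",
    },
]
result = chat(messages, max_new_tokens=500)
print(result[0]["generated_text"][-1]["content"])  # the assistant's reply
```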
{"language": ["en"], "license": "apache-2.0", "tags": ["EMO", "HelpingAI", "Wisdom", "Philosophy", "AncientIndianKnowledge"], "datasets": ["Abhaykoul/Ancient-Indian-Wisdom"], "pipeline_tag": "text-generation", "widget": [{"messages": [{"role": "user", "content": "What insights can we gain from the Bhagavad Gita about the path of selfless action and devotion?"}]}], "inference": {"parameters": {"max_new_tokens": 500}}}
OEvortex/Wise-EMO-2B
null
[ "transformers", "safetensors", "gemma", "text-generation", "EMO", "HelpingAI", "Wisdom", "Philosophy", "AncientIndianKnowledge", "conversational", "en", "dataset:Abhaykoul/Ancient-Indian-Wisdom", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2024-04-30T14:57:21+00:00
[]
[ "en" ]
TAGS #transformers #safetensors #gemma #text-generation #EMO #HelpingAI #Wisdom #Philosophy #AncientIndianKnowledge #conversational #en #dataset-Abhaykoul/Ancient-Indian-Wisdom #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# Wise-EMO-2B ## Overview Wise-EMO-2B is a 2.5 billion parameter conversational AI model that combines emotional intelligence capabilities from the EMO-2B model with insights from ancient Indian wisdom traditions. By integrating the wealth of knowledge from philosophies like Yoga, Vedanta, Buddhism, Jainism, Sikhism, Hinduism, and other indigenous spiritual practices, this model aims to provide emotionally resonant dialogue infused with profound philosophical perspectives. ## Key Features - Emotional Intelligence: Inherited from EMO-2B, Wise-EMO-2B excels at perceiving and responding to emotional undertones with empathy, providing emotionally supportive responses. - Philosophical Wisdom: The model has been further finetuned on texts from ancient Indian wisdom traditions, allowing it to draw upon profound philosophical concepts when appropriate. - Holistic Perspective: Wise-EMO-2B offers a holistic viewpoint that combines emotional resonance with spiritual and philosophical wisdom for a well-rounded, insightful conversational experience. - Dynamic Contextualization: The model adapts its communication style, emotional responses, and philosophical framing based on the specific conversational context. ## Use Cases Wise-EMO-2B can be beneficial for applications that require emotionally intelligent yet philosophically grounded dialogue, such as: - Emotional support companions with a wisdom-oriented perspective - Philosophical discussion and ideation - Storytelling and creative writing with profound themes - Personal growth, self-reflection, and mindfulness practice - Exploring ancient wisdom in relation to modern issues ## Limitations and Responsible Use While powerful, Wise-EMO-2B is an AI model with inherent limitations. Its responses should be viewed as a supportive tool rather than a substitute for professional mental health services or spiritual guidance. Additionally, the model may reflect biases from its training data. Users should think critically and report concerning outputs. As with all AI systems, Wise-EMO-2B should be used responsibly and ethically, particularly given the sensitivity around philosophy and spiritual beliefs. Outputs should be carefully evaluated in context. This is mark one of Wise-EMO, It is just for testing purposes
[ "# Wise-EMO-2B", "## Overview\n\nWise-EMO-2B is a 2.5 billion parameter conversational AI model that combines emotional intelligence capabilities from the EMO-2B model with insights from ancient Indian wisdom traditions. By integrating the wealth of knowledge from philosophies like Yoga, Vedanta, Buddhism, Jainism, Sikhism, Hinduism, and other indigenous spiritual practices, this model aims to provide emotionally resonant dialogue infused with profound philosophical perspectives.", "## Key Features\n\n- Emotional Intelligence: Inherited from EMO-2B, Wise-EMO-2B excels at perceiving and responding to emotional undertones with empathy, providing emotionally supportive responses.\n\n- Philosophical Wisdom: The model has been further finetuned on texts from ancient Indian wisdom traditions, allowing it to draw upon profound philosophical concepts when appropriate.\n\n- Holistic Perspective: Wise-EMO-2B offers a holistic viewpoint that combines emotional resonance with spiritual and philosophical wisdom for a well-rounded, insightful conversational experience.\n\n- Dynamic Contextualization: The model adapts its communication style, emotional responses, and philosophical framing based on the specific conversational context.", "## Use Cases\n\nWise-EMO-2B can be beneficial for applications that require emotionally intelligent yet philosophically grounded dialogue, such as:\n\n- Emotional support companions with a wisdom-oriented perspective\n- Philosophical discussion and ideation \n- Storytelling and creative writing with profound themes\n- Personal growth, self-reflection, and mindfulness practice\n- Exploring ancient wisdom in relation to modern issues", "## Limitations and Responsible Use\n\nWhile powerful, Wise-EMO-2B is an AI model with inherent limitations. Its responses should be viewed as a supportive tool rather than a substitute for professional mental health services or spiritual guidance. Additionally, the model may reflect biases from its training data. Users should think critically and report concerning outputs.\n\nAs with all AI systems, Wise-EMO-2B should be used responsibly and ethically, particularly given the sensitivity around philosophy and spiritual beliefs. Outputs should be carefully evaluated in context.\n\n\nThis is mark one of Wise-EMO, It is just for testing purposes" ]
[ "TAGS\n#transformers #safetensors #gemma #text-generation #EMO #HelpingAI #Wisdom #Philosophy #AncientIndianKnowledge #conversational #en #dataset-Abhaykoul/Ancient-Indian-Wisdom #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# Wise-EMO-2B", "## Overview\n\nWise-EMO-2B is a 2.5 billion parameter conversational AI model that combines emotional intelligence capabilities from the EMO-2B model with insights from ancient Indian wisdom traditions. By integrating the wealth of knowledge from philosophies like Yoga, Vedanta, Buddhism, Jainism, Sikhism, Hinduism, and other indigenous spiritual practices, this model aims to provide emotionally resonant dialogue infused with profound philosophical perspectives.", "## Key Features\n\n- Emotional Intelligence: Inherited from EMO-2B, Wise-EMO-2B excels at perceiving and responding to emotional undertones with empathy, providing emotionally supportive responses.\n\n- Philosophical Wisdom: The model has been further finetuned on texts from ancient Indian wisdom traditions, allowing it to draw upon profound philosophical concepts when appropriate.\n\n- Holistic Perspective: Wise-EMO-2B offers a holistic viewpoint that combines emotional resonance with spiritual and philosophical wisdom for a well-rounded, insightful conversational experience.\n\n- Dynamic Contextualization: The model adapts its communication style, emotional responses, and philosophical framing based on the specific conversational context.", "## Use Cases\n\nWise-EMO-2B can be beneficial for applications that require emotionally intelligent yet philosophically grounded dialogue, such as:\n\n- Emotional support companions with a wisdom-oriented perspective\n- Philosophical discussion and ideation \n- Storytelling and creative writing with profound themes\n- Personal growth, self-reflection, and mindfulness practice\n- Exploring ancient wisdom in relation to modern issues", "## Limitations and Responsible Use\n\nWhile powerful, Wise-EMO-2B is an AI model with inherent limitations. Its responses should be viewed as a supportive tool rather than a substitute for professional mental health services or spiritual guidance. Additionally, the model may reflect biases from its training data. Users should think critically and report concerning outputs.\n\nAs with all AI systems, Wise-EMO-2B should be used responsibly and ethically, particularly given the sensitivity around philosophy and spiritual beliefs. Outputs should be carefully evaluated in context.\n\n\nThis is mark one of Wise-EMO, It is just for testing purposes" ]
[ 78, 8, 92, 140, 74, 126 ]
[ "TAGS\n#transformers #safetensors #gemma #text-generation #EMO #HelpingAI #Wisdom #Philosophy #AncientIndianKnowledge #conversational #en #dataset-Abhaykoul/Ancient-Indian-Wisdom #license-apache-2.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Wise-EMO-2B## Overview\n\nWise-EMO-2B is a 2.5 billion parameter conversational AI model that combines emotional intelligence capabilities from the EMO-2B model with insights from ancient Indian wisdom traditions. By integrating the wealth of knowledge from philosophies like Yoga, Vedanta, Buddhism, Jainism, Sikhism, Hinduism, and other indigenous spiritual practices, this model aims to provide emotionally resonant dialogue infused with profound philosophical perspectives.## Key Features\n\n- Emotional Intelligence: Inherited from EMO-2B, Wise-EMO-2B excels at perceiving and responding to emotional undertones with empathy, providing emotionally supportive responses.\n\n- Philosophical Wisdom: The model has been further finetuned on texts from ancient Indian wisdom traditions, allowing it to draw upon profound philosophical concepts when appropriate.\n\n- Holistic Perspective: Wise-EMO-2B offers a holistic viewpoint that combines emotional resonance with spiritual and philosophical wisdom for a well-rounded, insightful conversational experience.\n\n- Dynamic Contextualization: The model adapts its communication style, emotional responses, and philosophical framing based on the specific conversational context.## Use Cases\n\nWise-EMO-2B can be beneficial for applications that require emotionally intelligent yet philosophically grounded dialogue, such as:\n\n- Emotional support companions with a wisdom-oriented perspective\n- Philosophical discussion and ideation \n- Storytelling and creative writing with profound themes\n- Personal growth, self-reflection, and mindfulness practice\n- Exploring ancient wisdom in relation to modern issues## Limitations and Responsible Use\n\nWhile powerful, Wise-EMO-2B is an AI model with inherent limitations. Its responses should be viewed as a supportive tool rather than a substitute for professional mental health services or spiritual guidance. Additionally, the model may reflect biases from its training data. Users should think critically and report concerning outputs.\n\nAs with all AI systems, Wise-EMO-2B should be used responsibly and ethically, particularly given the sensitivity around philosophy and spiritual beliefs. Outputs should be carefully evaluated in context.\n\n\nThis is mark one of Wise-EMO, It is just for testing purposes" ]
sentence-similarity
sentence-transformers
# {MODEL_NAME} This is a [sentence-transformers](https://www.SBERT.net) model: It maps sentences & paragraphs to a 384 dimensional dense vector space and can be used for tasks like clustering or semantic search. <!--- Describe your model here --> ## Usage (Sentence-Transformers) Using this model becomes easy when you have [sentence-transformers](https://www.SBERT.net) installed: ``` pip install -U sentence-transformers ``` Then you can use the model like this: ```python from sentence_transformers import SentenceTransformer sentences = ["This is an example sentence", "Each sentence is converted"] model = SentenceTransformer('{MODEL_NAME}') embeddings = model.encode(sentences) print(embeddings) ``` ## Evaluation Results <!--- Describe how your model was evaluated --> For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name={MODEL_NAME}) ## Training The model was trained with the parameters: **DataLoader**: `torch.utils.data.dataloader.DataLoader` of length 137553 with parameters: ``` {'batch_size': 64, 'sampler': 'torch.utils.data.sampler.RandomSampler', 'batch_sampler': 'torch.utils.data.sampler.BatchSampler'} ``` **Loss**: `sentence_transformers.losses.MSELoss.MSELoss` Parameters of the fit()-Method: ``` { "epochs": 1, "evaluation_steps": 5000, "evaluator": "sentence_transformers.evaluation.SequentialEvaluator.SequentialEvaluator", "max_grad_norm": 1, "optimizer_class": "<class 'torch.optim.adamw.AdamW'>", "optimizer_params": { "eps": 1e-06, "lr": 0.0001 }, "scheduler": "WarmupLinear", "steps_per_epoch": null, "warmup_steps": 1000, "weight_decay": 0.01 } ``` ## Full Model Architecture ``` SentenceTransformer( (0): Transformer({'max_seq_length': 512, 'do_lower_case': True}) with Transformer model: BertModel (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True}) (2): Normalize() ) ``` ## Citing & Authors <!--- Describe where people can find more information -->
{"library_name": "sentence-transformers", "tags": ["sentence-transformers", "feature-extraction", "sentence-similarity"], "pipeline_tag": "sentence-similarity"}
Mihaiii/test20
null
[ "sentence-transformers", "safetensors", "bert", "feature-extraction", "sentence-similarity", "endpoints_compatible", "region:us" ]
null
2024-04-30T14:57:25+00:00
[]
[]
TAGS #sentence-transformers #safetensors #bert #feature-extraction #sentence-similarity #endpoints_compatible #region-us
# {MODEL_NAME} This is a sentence-transformers model: It maps sentences & paragraphs to a 384 dimensional dense vector space and can be used for tasks like clustering or semantic search. ## Usage (Sentence-Transformers) Using this model becomes easy when you have sentence-transformers installed: Then you can use the model like this: ## Evaluation Results For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: URL ## Training The model was trained with the parameters: DataLoader: 'URL.dataloader.DataLoader' of length 137553 with parameters: Loss: 'sentence_transformers.losses.MSELoss.MSELoss' Parameters of the fit()-Method: ## Full Model Architecture ## Citing & Authors
[ "# {MODEL_NAME}\n\nThis is a sentence-transformers model: It maps sentences & paragraphs to a 384 dimensional dense vector space and can be used for tasks like clustering or semantic search.", "## Usage (Sentence-Transformers)\n\nUsing this model becomes easy when you have sentence-transformers installed:\n\n\n\nThen you can use the model like this:", "## Evaluation Results\n\n\n\nFor an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: URL", "## Training\nThe model was trained with the parameters:\n\nDataLoader:\n\n'URL.dataloader.DataLoader' of length 137553 with parameters:\n\n\nLoss:\n\n'sentence_transformers.losses.MSELoss.MSELoss' \n\nParameters of the fit()-Method:", "## Full Model Architecture", "## Citing & Authors" ]
[ "TAGS\n#sentence-transformers #safetensors #bert #feature-extraction #sentence-similarity #endpoints_compatible #region-us \n", "# {MODEL_NAME}\n\nThis is a sentence-transformers model: It maps sentences & paragraphs to a 384 dimensional dense vector space and can be used for tasks like clustering or semantic search.", "## Usage (Sentence-Transformers)\n\nUsing this model becomes easy when you have sentence-transformers installed:\n\n\n\nThen you can use the model like this:", "## Evaluation Results\n\n\n\nFor an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: URL", "## Training\nThe model was trained with the parameters:\n\nDataLoader:\n\n'URL.dataloader.DataLoader' of length 137553 with parameters:\n\n\nLoss:\n\n'sentence_transformers.losses.MSELoss.MSELoss' \n\nParameters of the fit()-Method:", "## Full Model Architecture", "## Citing & Authors" ]
[ 28, 41, 30, 26, 61, 5, 5 ]
[ "TAGS\n#sentence-transformers #safetensors #bert #feature-extraction #sentence-similarity #endpoints_compatible #region-us \n# {MODEL_NAME}\n\nThis is a sentence-transformers model: It maps sentences & paragraphs to a 384 dimensional dense vector space and can be used for tasks like clustering or semantic search.## Usage (Sentence-Transformers)\n\nUsing this model becomes easy when you have sentence-transformers installed:\n\n\n\nThen you can use the model like this:## Evaluation Results\n\n\n\nFor an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: URL## Training\nThe model was trained with the parameters:\n\nDataLoader:\n\n'URL.dataloader.DataLoader' of length 137553 with parameters:\n\n\nLoss:\n\n'sentence_transformers.losses.MSELoss.MSELoss' \n\nParameters of the fit()-Method:## Full Model Architecture## Citing & Authors" ]
reinforcement-learning
stable-baselines3
# **PPO** Agent playing **LunarLander-v2**
This is a trained model of a **PPO** agent playing **LunarLander-v2**
using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3).

## Usage (with Stable-baselines3)
A minimal loading sketch is shown below; the checkpoint filename is an assumption and may differ from the file actually uploaded to this repository.

```python
from stable_baselines3 import PPO
from huggingface_sb3 import load_from_hub

# Assumed filename inside the repo; adjust to the uploaded checkpoint's name.
checkpoint = load_from_hub(repo_id="dirkneethling/ppo-LunarLander-v2", filename="ppo-LunarLander-v2.zip")
model = PPO.load(checkpoint)
```
{"library_name": "stable-baselines3", "tags": ["LunarLander-v2", "deep-reinforcement-learning", "reinforcement-learning", "stable-baselines3"], "model-index": [{"name": "PPO", "results": [{"task": {"type": "reinforcement-learning", "name": "reinforcement-learning"}, "dataset": {"name": "LunarLander-v2", "type": "LunarLander-v2"}, "metrics": [{"type": "mean_reward", "value": "259.49 +/- 21.24", "name": "mean_reward", "verified": false}]}]}]}
dirkneethling/ppo-LunarLander-v2
null
[ "stable-baselines3", "LunarLander-v2", "deep-reinforcement-learning", "reinforcement-learning", "model-index", "region:us" ]
null
2024-04-30T14:59:17+00:00
[]
[]
TAGS #stable-baselines3 #LunarLander-v2 #deep-reinforcement-learning #reinforcement-learning #model-index #region-us
# PPO Agent playing LunarLander-v2 This is a trained model of a PPO agent playing LunarLander-v2 using the stable-baselines3 library. ## Usage (with Stable-baselines3) TODO: Add your code
[ "# PPO Agent playing LunarLander-v2\nThis is a trained model of a PPO agent playing LunarLander-v2\nusing the stable-baselines3 library.", "## Usage (with Stable-baselines3)\nTODO: Add your code" ]
[ "TAGS\n#stable-baselines3 #LunarLander-v2 #deep-reinforcement-learning #reinforcement-learning #model-index #region-us \n", "# PPO Agent playing LunarLander-v2\nThis is a trained model of a PPO agent playing LunarLander-v2\nusing the stable-baselines3 library.", "## Usage (with Stable-baselines3)\nTODO: Add your code" ]
[ 31, 35, 17 ]
[ "TAGS\n#stable-baselines3 #LunarLander-v2 #deep-reinforcement-learning #reinforcement-learning #model-index #region-us \n# PPO Agent playing LunarLander-v2\nThis is a trained model of a PPO agent playing LunarLander-v2\nusing the stable-baselines3 library.## Usage (with Stable-baselines3)\nTODO: Add your code" ]
automatic-speech-recognition
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
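The auto-generated card leaves "How to Get Started with the Model" empty; given the repo id in this record (a Whisper large-v3 fine-tune, apparently Romanian from the `_ro_` infix), a hedged transcription sketch is:

```python
from transformers import pipeline

# Hedged sketch: assumes a complete, standard Whisper checkpoint; "audio.wav" is a placeholder.
asr = pipeline(
    "automatic-speech-recognition",
    model="adrianmedinav/whisper-large-v3_ro_epochs_1_2024-04-30_13-43-03",
)
print(asr("audio.wav")["text"])
```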
{"library_name": "transformers", "tags": []}
adrianmedinav/whisper-large-v3_ro_epochs_1_2024-04-30_13-43-03
null
[ "transformers", "safetensors", "whisper", "automatic-speech-recognition", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
null
2024-04-30T15:01:15+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #whisper #automatic-speech-recognition #arxiv-1910.09700 #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #whisper #automatic-speech-recognition #arxiv-1910.09700 #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ 34, 6, 4, 75, 23, 3, 5, 8, 9, 8, 34, 20, 4, 5, 5, 11, 13, 12, 3, 10, 6, 5, 6, 4, 5, 7, 49, 7, 7, 5, 5, 15, 7, 7, 8, 5 ]
[ "TAGS\n#transformers #safetensors #whisper #automatic-speech-recognition #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact" ]
text-generation
transformers
# phi-2-fp16-ov * Model creator: [Microsoft](https://huggingface.co/microsoft) * Original model: [phi-2](https://huggingface.co/microsoft/phi-2) ## Description This is [phi-2](https://huggingface.co/microsoft/phi-2) model converted to the [OpenVINO™ IR](https://docs.openvino.ai/2024/documentation/openvino-ir-format.html) (Intermediate Representation) format with weights compressed to FP16. ## Compatibility The provided OpenVINO™ IR model is compatible with: * OpenVINO version 2024.1.0 and higher * Optimum Intel 1.16.0 and higher ## Running Model Inference 1. Install packages required for using [Optimum Intel](https://huggingface.co/docs/optimum/intel/index) integration with the OpenVINO backend: ``` pip install optimum[openvino] ``` 2. Run model inference: ``` from transformers import AutoTokenizer from optimum.intel.openvino import OVModelForCausalLM model_id = "OpenVINO/phi-2-fp16-ov" tokenizer = AutoTokenizer.from_pretrained(model_id) model = OVModelForCausalLM.from_pretrained(model_id) inputs = tokenizer("What is OpenVINO?", return_tensors="pt") outputs = model.generate(**inputs, max_length=200) text = tokenizer.batch_decode(outputs)[0] print(text) ``` For more examples and possible optimizations, refer to the [OpenVINO Large Language Model Inference Guide](https://docs.openvino.ai/2024/learn-openvino/llm_inference_guide.html). ## Limitations Check the original model card for [limitations](https://huggingface.co/microsoft/phi-2#limitations-of-phi-2). ## Legal information The original model is distributed under [MIT](https://huggingface.co/microsoft/phi-2/resolve/main/LICENSE) license. More details can be found in [original model card](https://huggingface.co/microsoft/phi-2).
{"language": ["en"], "license": "mit", "license_link": "https://huggingface.co/microsoft/phi-2/resolve/main/LICENSE"}
OpenVINO/phi-2-fp16-ov
null
[ "transformers", "openvino", "phi", "text-generation", "en", "license:mit", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2024-04-30T15:01:28+00:00
[]
[ "en" ]
TAGS #transformers #openvino #phi #text-generation #en #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# phi-2-fp16-ov * Model creator: Microsoft * Original model: phi-2 ## Description This is phi-2 model converted to the OpenVINO™ IR (Intermediate Representation) format with weights compressed to FP16. ## Compatibility The provided OpenVINO™ IR model is compatible with: * OpenVINO version 2024.1.0 and higher * Optimum Intel 1.16.0 and higher ## Running Model Inference 1. Install packages required for using Optimum Intel integration with the OpenVINO backend: 2. Run model inference: For more examples and possible optimizations, refer to the OpenVINO Large Language Model Inference Guide. ## Limitations Check the original model card for limitations. ## Legal information The original model is distributed under MIT license. More details can be found in original model card.
[ "# phi-2-fp16-ov\n\n * Model creator: Microsoft\n * Original model: phi-2", "## Description\n\nThis is phi-2 model converted to the OpenVINO™ IR (Intermediate Representation) format with weights compressed to FP16.", "## Compatibility\n\nThe provided OpenVINO™ IR model is compatible with:\n\n* OpenVINO version 2024.1.0 and higher\n* Optimum Intel 1.16.0 and higher", "## Running Model Inference\n\n1. Install packages required for using Optimum Intel integration with the OpenVINO backend:\n\n\n\n2. Run model inference:\n\n\n\nFor more examples and possible optimizations, refer to the OpenVINO Large Language Model Inference Guide.", "## Limitations\n\nCheck the original model card for limitations.", "## Legal information\n\nThe original model is distributed under MIT license. More details can be found in original model card." ]
[ "TAGS\n#transformers #openvino #phi #text-generation #en #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# phi-2-fp16-ov\n\n * Model creator: Microsoft\n * Original model: phi-2", "## Description\n\nThis is phi-2 model converted to the OpenVINO™ IR (Intermediate Representation) format with weights compressed to FP16.", "## Compatibility\n\nThe provided OpenVINO™ IR model is compatible with:\n\n* OpenVINO version 2024.1.0 and higher\n* Optimum Intel 1.16.0 and higher", "## Running Model Inference\n\n1. Install packages required for using Optimum Intel integration with the OpenVINO backend:\n\n\n\n2. Run model inference:\n\n\n\nFor more examples and possible optimizations, refer to the OpenVINO Large Language Model Inference Guide.", "## Limitations\n\nCheck the original model card for limitations.", "## Legal information\n\nThe original model is distributed under MIT license. More details can be found in original model card." ]
[ 38, 23, 29, 37, 48, 11, 23 ]
[ "TAGS\n#transformers #openvino #phi #text-generation #en #license-mit #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# phi-2-fp16-ov\n\n * Model creator: Microsoft\n * Original model: phi-2## Description\n\nThis is phi-2 model converted to the OpenVINO™ IR (Intermediate Representation) format with weights compressed to FP16.## Compatibility\n\nThe provided OpenVINO™ IR model is compatible with:\n\n* OpenVINO version 2024.1.0 and higher\n* Optimum Intel 1.16.0 and higher## Running Model Inference\n\n1. Install packages required for using Optimum Intel integration with the OpenVINO backend:\n\n\n\n2. Run model inference:\n\n\n\nFor more examples and possible optimizations, refer to the OpenVINO Large Language Model Inference Guide.## Limitations\n\nCheck the original model card for limitations.## Legal information\n\nThe original model is distributed under MIT license. More details can be found in original model card." ]
null
null
ebony like Halle berry
{}
StupidMonk/Shayna
null
[ "region:us" ]
null
2024-04-30T15:03:42+00:00
[]
[]
TAGS #region-us
ebony like Halle berry
[]
[ "TAGS\n#region-us \n" ]
[ 5 ]
[ "TAGS\n#region-us \n" ]
null
peft
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # dpo_harmlessharmless_gpt3_subset20000_modelgpt2_maxsteps5000_bz8_lr1e-05 This model is a fine-tuned version of [gpt2](https://huggingface.co/gpt2) on an unknown dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 1 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 8 - total_train_batch_size: 8 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_steps: 15 - training_steps: 5000 ### Training results ### Framework versions - PEFT 0.9.0 - Transformers 4.38.2 - Pytorch 2.1.2 - Datasets 2.18.0 - Tokenizers 0.15.2
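As a rough illustration of how the hyperparameters listed in this card could be wired together, here is a minimal sketch assuming TRL's DPOTrainer with a LoRA adapter on gpt2; the preference dataset, LoRA settings, and the DPO beta are placeholders, since the card does not state them.

```python
# Minimal sketch, not the author's training script: dataset path, LoRA config and beta are assumptions.
from datasets import load_dataset
from peft import LoraConfig
from transformers import AutoModelForCausalLM, AutoTokenizer, TrainingArguments
from trl import DPOTrainer

model = AutoModelForCausalLM.from_pretrained("gpt2")
tokenizer = AutoTokenizer.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # gpt2 has no pad token by default

# Placeholder preference data with "prompt", "chosen", "rejected" columns.
train_dataset = load_dataset("json", data_files="preference_pairs.json")["train"]

args = TrainingArguments(
    output_dir="dpo_gpt2",
    per_device_train_batch_size=1,
    gradient_accumulation_steps=8,   # effective batch size 8, as listed in the card
    learning_rate=1e-5,
    lr_scheduler_type="linear",
    warmup_steps=15,
    max_steps=5000,
    seed=42,
    remove_unused_columns=False,     # keep the preference columns for the DPO collator
)

trainer = DPOTrainer(
    model,
    ref_model=None,                  # with a PEFT adapter, TRL derives the reference from the frozen base weights
    args=args,
    beta=0.1,                        # placeholder: the card does not state the DPO beta
    train_dataset=train_dataset,
    tokenizer=tokenizer,
    peft_config=LoraConfig(task_type="CAUSAL_LM"),
)
trainer.train()
```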
{"license": "mit", "library_name": "peft", "tags": ["trl", "dpo", "generated_from_trainer"], "base_model": "gpt2", "model-index": [{"name": "dpo_harmlessharmless_gpt3_subset20000_modelgpt2_maxsteps5000_bz8_lr1e-05", "results": []}]}
Holarissun/dpo_harmlessharmless_gpt3_subset20000_modelgpt2_maxsteps5000_bz8_lr1e-05
null
[ "peft", "safetensors", "trl", "dpo", "generated_from_trainer", "base_model:gpt2", "license:mit", "region:us" ]
null
2024-04-30T15:04:02+00:00
[]
[]
TAGS #peft #safetensors #trl #dpo #generated_from_trainer #base_model-gpt2 #license-mit #region-us
# dpo_harmlessharmless_gpt3_subset20000_modelgpt2_maxsteps5000_bz8_lr1e-05 This model is a fine-tuned version of gpt2 on an unknown dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 1e-05 - train_batch_size: 1 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 8 - total_train_batch_size: 8 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_steps: 15 - training_steps: 5000 ### Training results ### Framework versions - PEFT 0.9.0 - Transformers 4.38.2 - Pytorch 2.1.2 - Datasets 2.18.0 - Tokenizers 0.15.2
[ "# dpo_harmlessharmless_gpt3_subset20000_modelgpt2_maxsteps5000_bz8_lr1e-05\n\nThis model is a fine-tuned version of gpt2 on an unknown dataset.", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 1e-05\n- train_batch_size: 1\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 8\n- total_train_batch_size: 8\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_steps: 15\n- training_steps: 5000", "### Training results", "### Framework versions\n\n- PEFT 0.9.0\n- Transformers 4.38.2\n- Pytorch 2.1.2\n- Datasets 2.18.0\n- Tokenizers 0.15.2" ]
[ "TAGS\n#peft #safetensors #trl #dpo #generated_from_trainer #base_model-gpt2 #license-mit #region-us \n", "# dpo_harmlessharmless_gpt3_subset20000_modelgpt2_maxsteps5000_bz8_lr1e-05\n\nThis model is a fine-tuned version of gpt2 on an unknown dataset.", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 1e-05\n- train_batch_size: 1\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 8\n- total_train_batch_size: 8\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_steps: 15\n- training_steps: 5000", "### Training results", "### Framework versions\n\n- PEFT 0.9.0\n- Transformers 4.38.2\n- Pytorch 2.1.2\n- Datasets 2.18.0\n- Tokenizers 0.15.2" ]
[ 36, 56, 7, 9, 9, 4, 122, 5, 48 ]
[ "TAGS\n#peft #safetensors #trl #dpo #generated_from_trainer #base_model-gpt2 #license-mit #region-us \n# dpo_harmlessharmless_gpt3_subset20000_modelgpt2_maxsteps5000_bz8_lr1e-05\n\nThis model is a fine-tuned version of gpt2 on an unknown dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 1e-05\n- train_batch_size: 1\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 8\n- total_train_batch_size: 8\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- lr_scheduler_warmup_steps: 15\n- training_steps: 5000### Training results### Framework versions\n\n- PEFT 0.9.0\n- Transformers 4.38.2\n- Pytorch 2.1.2\n- Datasets 2.18.0\n- Tokenizers 0.15.2" ]
text-generation
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # meditron-7b-dpo-full-sft-wo-kqa_silver_wogold This model is a fine-tuned version of [Minbyul/meditron-7b-wo-kqa_silver_wogold-sft](https://huggingface.co/Minbyul/meditron-7b-wo-kqa_silver_wogold-sft) on the HuggingFaceH4/ultrafeedback_binarized dataset. It achieves the following results on the evaluation set: - Loss: 0.6249 - Rewards/chosen: -0.0145 - Rewards/rejected: -0.1983 - Rewards/accuracies: 0.875 - Rewards/margins: 0.1839 - Logps/rejected: -653.1394 - Logps/chosen: -150.2214 - Logits/rejected: -1.0337 - Logits/chosen: -1.4037 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-07 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - distributed_type: multi-GPU - num_devices: 4 - gradient_accumulation_steps: 2 - total_train_batch_size: 64 - total_eval_batch_size: 32 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 1 ### Training results ### Framework versions - Transformers 4.39.0.dev0 - Pytorch 2.1.2 - Datasets 2.14.6 - Tokenizers 0.15.2
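The reported reward margin above is consistent with the chosen and rejected rewards up to rounding (-0.0145 - (-0.1983) ≈ 0.1839). Below is a minimal sketch of loading this checkpoint for generation with the standard transformers API; the prompt and generation settings are illustrative assumptions, not taken from the card.

```python
# Minimal sketch, assuming the checkpoint loads with the standard transformers API.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Minbyul/meditron-7b-dpo-full-sft-wo-kqa_silver_wogold"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

prompt = "What are common symptoms of iron deficiency?"  # illustrative prompt only
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```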
{"license": "llama2", "tags": ["alignment-handbook", "trl", "dpo", "generated_from_trainer", "trl", "dpo", "generated_from_trainer"], "datasets": ["HuggingFaceH4/ultrafeedback_binarized"], "base_model": "Minbyul/meditron-7b-wo-kqa_silver_wogold-sft", "model-index": [{"name": "meditron-7b-dpo-full-sft-wo-kqa_silver_wogold", "results": []}]}
Minbyul/meditron-7b-dpo-full-sft-wo-kqa_silver_wogold
null
[ "transformers", "safetensors", "llama", "text-generation", "alignment-handbook", "trl", "dpo", "generated_from_trainer", "dataset:HuggingFaceH4/ultrafeedback_binarized", "base_model:Minbyul/meditron-7b-wo-kqa_silver_wogold-sft", "license:llama2", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2024-04-30T15:04:44+00:00
[]
[]
TAGS #transformers #safetensors #llama #text-generation #alignment-handbook #trl #dpo #generated_from_trainer #dataset-HuggingFaceH4/ultrafeedback_binarized #base_model-Minbyul/meditron-7b-wo-kqa_silver_wogold-sft #license-llama2 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# meditron-7b-dpo-full-sft-wo-kqa_silver_wogold This model is a fine-tuned version of Minbyul/meditron-7b-wo-kqa_silver_wogold-sft on the HuggingFaceH4/ultrafeedback_binarized dataset. It achieves the following results on the evaluation set: - Loss: 0.6249 - Rewards/chosen: -0.0145 - Rewards/rejected: -0.1983 - Rewards/accuracies: 0.875 - Rewards/margins: 0.1839 - Logps/rejected: -653.1394 - Logps/chosen: -150.2214 - Logits/rejected: -1.0337 - Logits/chosen: -1.4037 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-07 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - distributed_type: multi-GPU - num_devices: 4 - gradient_accumulation_steps: 2 - total_train_batch_size: 64 - total_eval_batch_size: 32 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 1 ### Training results ### Framework versions - Transformers 4.39.0.dev0 - Pytorch 2.1.2 - Datasets 2.14.6 - Tokenizers 0.15.2
[ "# meditron-7b-dpo-full-sft-wo-kqa_silver_wogold\n\nThis model is a fine-tuned version of Minbyul/meditron-7b-wo-kqa_silver_wogold-sft on the HuggingFaceH4/ultrafeedback_binarized dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 0.6249\n- Rewards/chosen: -0.0145\n- Rewards/rejected: -0.1983\n- Rewards/accuracies: 0.875\n- Rewards/margins: 0.1839\n- Logps/rejected: -653.1394\n- Logps/chosen: -150.2214\n- Logits/rejected: -1.0337\n- Logits/chosen: -1.4037", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-07\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- distributed_type: multi-GPU\n- num_devices: 4\n- gradient_accumulation_steps: 2\n- total_train_batch_size: 64\n- total_eval_batch_size: 32\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 1", "### Training results", "### Framework versions\n\n- Transformers 4.39.0.dev0\n- Pytorch 2.1.2\n- Datasets 2.14.6\n- Tokenizers 0.15.2" ]
[ "TAGS\n#transformers #safetensors #llama #text-generation #alignment-handbook #trl #dpo #generated_from_trainer #dataset-HuggingFaceH4/ultrafeedback_binarized #base_model-Minbyul/meditron-7b-wo-kqa_silver_wogold-sft #license-llama2 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# meditron-7b-dpo-full-sft-wo-kqa_silver_wogold\n\nThis model is a fine-tuned version of Minbyul/meditron-7b-wo-kqa_silver_wogold-sft on the HuggingFaceH4/ultrafeedback_binarized dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 0.6249\n- Rewards/chosen: -0.0145\n- Rewards/rejected: -0.1983\n- Rewards/accuracies: 0.875\n- Rewards/margins: 0.1839\n- Logps/rejected: -653.1394\n- Logps/chosen: -150.2214\n- Logits/rejected: -1.0337\n- Logits/chosen: -1.4037", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-07\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- distributed_type: multi-GPU\n- num_devices: 4\n- gradient_accumulation_steps: 2\n- total_train_batch_size: 64\n- total_eval_batch_size: 32\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 1", "### Training results", "### Framework versions\n\n- Transformers 4.39.0.dev0\n- Pytorch 2.1.2\n- Datasets 2.14.6\n- Tokenizers 0.15.2" ]
[ 101, 175, 7, 9, 9, 4, 155, 5, 43 ]
[ "TAGS\n#transformers #safetensors #llama #text-generation #alignment-handbook #trl #dpo #generated_from_trainer #dataset-HuggingFaceH4/ultrafeedback_binarized #base_model-Minbyul/meditron-7b-wo-kqa_silver_wogold-sft #license-llama2 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# meditron-7b-dpo-full-sft-wo-kqa_silver_wogold\n\nThis model is a fine-tuned version of Minbyul/meditron-7b-wo-kqa_silver_wogold-sft on the HuggingFaceH4/ultrafeedback_binarized dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 0.6249\n- Rewards/chosen: -0.0145\n- Rewards/rejected: -0.1983\n- Rewards/accuracies: 0.875\n- Rewards/margins: 0.1839\n- Logps/rejected: -653.1394\n- Logps/chosen: -150.2214\n- Logits/rejected: -1.0337\n- Logits/chosen: -1.4037## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-07\n- train_batch_size: 8\n- eval_batch_size: 8\n- seed: 42\n- distributed_type: multi-GPU\n- num_devices: 4\n- gradient_accumulation_steps: 2\n- total_train_batch_size: 64\n- total_eval_batch_size: 32\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_ratio: 0.1\n- num_epochs: 1### Training results### Framework versions\n\n- Transformers 4.39.0.dev0\n- Pytorch 2.1.2\n- Datasets 2.14.6\n- Tokenizers 0.15.2" ]
text-to-image
null
## reverse <img src="https://via.placeholder.com/468x300?text=App+Screenshot+Here" alt="Generated on Image Pipeline" style="border-radius: 10px;"> **This lora model is uploaded on [imagepipeline.io](https://imagepipeline.io/)** Model details - version 1 [![Try this model](https://img.shields.io/badge/try_this_model-image_pipeline-BD9319)](https://imagepipeline.io/models/reverse?id=c50ec5bb-5215-4d84-ac7e-2f9caa563ff5/) ## How to try this model? You can try using it locally or send an API call to test the output quality. Get your `API_KEY` from [imagepipeline.io](https://imagepipeline.io/). No payment required. Coding in `php` `javascript` `node` etc.? Check out our documentation [![documentation](https://img.shields.io/badge/documentation-image_pipeline-blue)](https://docs.imagepipeline.io/docs/introduction) ```python import requests import json url = "https://imagepipeline.io/sd/text2image/v1/run" payload = json.dumps({ "model_id": "sd1.5", "prompt": "ultra realistic close up portrait ((beautiful pale cyberpunk female with heavy black eyeliner)), blue eyes, shaved side haircut, hyper detail, cinematic lighting, magic neon, dark red city, Canon EOS R3, nikon, f/1.4, ISO 200, 1/160s, 8K, RAW, unedited, symmetrical balance, in-frame, 8K", "negative_prompt": "painting, extra fingers, mutated hands, poorly drawn hands, poorly drawn face, deformed, ugly, blurry, bad anatomy, bad proportions, extra limbs, cloned face, skinny, glitchy, double torso, extra arms, extra hands, mangled fingers, missing lips, ugly face, distorted face, extra legs, anime", "width": "512", "height": "512", "samples": "1", "num_inference_steps": "30", "safety_checker": false, "guidance_scale": 7.5, "multi_lingual": "no", "embeddings": "", "lora_models": "c50ec5bb-5215-4d84-ac7e-2f9caa563ff5", "lora_weights": "0.5" }) headers = { 'Content-Type': 'application/json', 'API-Key': 'your_api_key' } response = requests.request("POST", url, headers=headers, data=payload) print(response.text) ``` Get more ready to use `MODELS` like this for `SD 1.5` and `SDXL` : [![All models](https://img.shields.io/badge/Get%20All%20Models-image_pipeline-BD9319)](https://imagepipeline.io/models) ### API Reference #### Generate Image ```http https://api.imagepipeline.io/sd/text2image/v1 ``` | Headers | Type | Description | |:----------------------| :------- |:-------------------------------------------------------------------------------------------------------------------| | `API-Key` | `str` | Get your `API_KEY` from [imagepipeline.io](https://imagepipeline.io/) | | `Content-Type` | `str` | application/json - content type of the request body | | Parameter | Type | Description | | :-------- | :------- | :------------------------- | | `model_id` | `str` | Your base model, find available lists in [models page](https://imagepipeline.io/models) or upload your own| | `prompt` | `str` | Text Prompt. Check our [Prompt Guide](https://docs.imagepipeline.io/docs/SD-1.5/docs/extras/prompt-guide) for tips | | `num_inference_steps` | `int [1-50]` | Noise is removed with each step, resulting in a higher-quality image over time. Ideal value 30-50 (without LCM) | | `guidance_scale` | `float [1-20]` | Higher guidance scale prioritizes text prompt relevance but sacrifices image quality. 
Ideal value 7.5-12.5 | | `lora_models` | `str, array` | Pass the model_id(s) of LoRA models that can be found in models page | | `lora_weights` | `str, array` | Strength of the LoRA effect | --- license: creativeml-openrail-m tags: - imagepipeline - imagepipeline.io - text-to-image - ultra-realistic pinned: false pipeline_tag: text-to-image --- ### Feedback If you have any feedback, please reach out to us at [email protected] #### 🔗 Visit Website [![portfolio](https://img.shields.io/badge/image_pipeline-BD9319?style=for-the-badge&logo=gocd&logoColor=white)](https://imagepipeline.io/) If you are the original author of this model, please [click here](https://airtable.com/apprTaRnJbDJ8ufOx/shr4g7o9B6fWfOlUR) to add credits
{"license": "creativeml-openrail-m", "tags": ["imagepipeline", "imagepipeline.io", "text-to-image", "ultra-realistic"], "pinned": false, "pipeline_tag": "text-to-image"}
imagepipeline/reverse
null
[ "imagepipeline", "imagepipeline.io", "text-to-image", "ultra-realistic", "license:creativeml-openrail-m", "region:us" ]
null
2024-04-30T15:06:45+00:00
[]
[]
TAGS #imagepipeline #imagepipeline.io #text-to-image #ultra-realistic #license-creativeml-openrail-m #region-us
reverse ------- <img src="URL alt="Generated on Image Pipeline" style="border-radius: 10px;"> This lora model is uploaded on URL Model details - version 1 ![Try this model](URL How to try this model ? ----------------------- You can try using it locally or send an API call to test the output quality. Get your 'API\_KEY' from URL. No payment required. Coding in 'php' 'javascript' 'node' etc ? Checkout our documentation ![documentation](URL Get more ready to use 'MODELS' like this for 'SD 1.5' and 'SDXL' : ![All models](URL ### API Reference #### Generate Image --- license: creativeml-openrail-m tags: * imagepipeline * URL * text-to-image * ultra-realistic pinned: false pipeline\_tag: text-to-image --- ### Feedback If you have any feedback, please reach out to us at hello@URL #### Visit Website ![portfolio](URL If you are the original author of this model, please click here to add credits
[ "### API Reference", "#### Generate Image\n\n\n\n\n\n\n---\n\n\nlicense: creativeml-openrail-m\ntags:\n\n\n* imagepipeline\n* URL\n* text-to-image\n* ultra-realistic\npinned: false\npipeline\\_tag: text-to-image\n\n\n\n\n---", "### Feedback\n\n\nIf you have any feedback, please reach out to us at hello@URL", "#### Visit Website\n\n\n![portfolio](URL\n\n\nIf you are the original author of this model, please click here to add credits" ]
[ "TAGS\n#imagepipeline #imagepipeline.io #text-to-image #ultra-realistic #license-creativeml-openrail-m #region-us \n", "### API Reference", "#### Generate Image\n\n\n\n\n\n\n---\n\n\nlicense: creativeml-openrail-m\ntags:\n\n\n* imagepipeline\n* URL\n* text-to-image\n* ultra-realistic\npinned: false\npipeline\\_tag: text-to-image\n\n\n\n\n---", "### Feedback\n\n\nIf you have any feedback, please reach out to us at hello@URL", "#### Visit Website\n\n\n![portfolio](URL\n\n\nIf you are the original author of this model, please click here to add credits" ]
[ 35, 5, 53, 20, 29 ]
[ "TAGS\n#imagepipeline #imagepipeline.io #text-to-image #ultra-realistic #license-creativeml-openrail-m #region-us \n### API Reference#### Generate Image\n\n\n\n\n\n\n---\n\n\nlicense: creativeml-openrail-m\ntags:\n\n\n* imagepipeline\n* URL\n* text-to-image\n* ultra-realistic\npinned: false\npipeline\\_tag: text-to-image\n\n\n\n\n---### Feedback\n\n\nIf you have any feedback, please reach out to us at hello@URL#### Visit Website\n\n\n![portfolio](URL\n\n\nIf you are the original author of this model, please click here to add credits" ]
sentence-similarity
sentence-transformers
# Squirtle Squirtle is a distilled version of [bge-base-en-v1.5](BAAI/bge-base-en-v1.5). ## Intended purpose <span style="color:blue">This model is designed for use in semantic-autocomplete ([click here for demo](https://mihaiii.github.io/semantic-autocomplete/)).</span> Make sure you also pass `pipelineParams={{ pooling: "cls", normalize: true }}` since the default pooling in the component is mean. ## Usage Other than within [semantic-autocomplete](https://github.com/Mihaiii/semantic-autocomplete), you can use this model the same way as [bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5#usage).
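For use outside semantic-autocomplete, here is a minimal sentence-transformers sketch mirroring the bge-base-en-v1.5 usage the card points to; the repository id is a placeholder guess based on the Mihaiii namespace, and normalization is applied explicitly as the card recommends.

```python
# Minimal sketch, assuming the model loads with sentence-transformers and that the stored
# pooling config matches the card's recommendation (CLS pooling).
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("Mihaiii/Squirtle")  # placeholder repo id, not stated in this dump
sentences = ["How do I reset my password?", "Steps to recover an account login"]
embeddings = model.encode(sentences, normalize_embeddings=True)  # normalize, as the card advises
print(embeddings.shape)
```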
{"license": "mit", "library_name": "sentence-transformers", "tags": ["sentence-transformers", "feature-extraction", "sentence-similarity", "bge", "mteb"], "datasets": ["Mihaiii/qa-assistant"], "pipeline_tag": "sentence-similarity", "model-index": [{"name": "Squirtle", "results": [{"task": {"type": "Classification"}, "dataset": {"name": "MTEB AmazonCounterfactualClassification (en)", "type": "mteb/amazon_counterfactual", "config": "en", "split": "test", "revision": "e8379541af4e31359cca9fbcf4b00f2671dba205"}, "metrics": [{"type": "accuracy", "value": 69.59701492537313}, {"type": "ap", "value": 31.80839087521638}, {"type": "f1", "value": 63.43204352573031}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB AmazonPolarityClassification", "type": "mteb/amazon_polarity", "config": "default", "split": "test", "revision": "e2d317d38cd51312af73b3d32a06d1a08b442046"}, "metrics": [{"type": "accuracy", "value": 82.09027499999999}, {"type": "ap", "value": 76.95004336850603}, {"type": "f1", "value": 82.04505556179174}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB AmazonReviewsClassification (en)", "type": "mteb/amazon_reviews_multi", "config": "en", "split": "test", "revision": "1399c76144fd37290681b995c656ef9b2e06e26d"}, "metrics": [{"type": "accuracy", "value": 41.943999999999996}, {"type": "f1", "value": 40.40964457303876}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB ArguAna", "type": "mteb/arguana", "config": "default", "split": "test", "revision": "c22ab2a51041ffd869aaddef7af8d8215647e41a"}, "metrics": [{"type": "map_at_1", "value": 13.869000000000002}, {"type": "map_at_10", "value": 24.631}, {"type": "map_at_100", "value": 25.965}, {"type": "map_at_1000", "value": 26.023000000000003}, {"type": "map_at_20", "value": 25.442999999999998}, {"type": "map_at_3", "value": 20.827}, {"type": "map_at_5", "value": 22.776}, {"type": "mrr_at_1", "value": 14.580000000000002}, {"type": "mrr_at_10", "value": 24.91}, {"type": "mrr_at_100", "value": 26.229999999999997}, {"type": "mrr_at_1000", "value": 26.288}, {"type": "mrr_at_20", "value": 25.708}, {"type": "mrr_at_3", "value": 21.136}, {"type": "mrr_at_5", "value": 23.02}, {"type": "ndcg_at_1", "value": 13.869000000000002}, {"type": "ndcg_at_10", "value": 31.14}, {"type": "ndcg_at_100", "value": 37.885999999999996}, {"type": "ndcg_at_1000", "value": 39.497}, {"type": "ndcg_at_20", "value": 34.068}, {"type": "ndcg_at_3", "value": 23.163}, {"type": "ndcg_at_5", "value": 26.677}, {"type": "precision_at_1", "value": 13.869000000000002}, {"type": "precision_at_10", "value": 5.220000000000001}, {"type": "precision_at_100", "value": 0.844}, {"type": "precision_at_1000", "value": 0.097}, {"type": "precision_at_20", "value": 3.186}, {"type": "precision_at_3", "value": 9.981}, {"type": "precision_at_5", "value": 7.696}, {"type": "recall_at_1", "value": 13.869000000000002}, {"type": "recall_at_10", "value": 52.205}, {"type": "recall_at_100", "value": 84.42399999999999}, {"type": "recall_at_1000", "value": 97.297}, {"type": "recall_at_20", "value": 63.727000000000004}, {"type": "recall_at_3", "value": 29.942999999999998}, {"type": "recall_at_5", "value": 38.478}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB ArxivClusteringP2P", "type": "mteb/arxiv-clustering-p2p", "config": "default", "split": "test", "revision": "a122ad7f3f0291bf49cc6f4d32aa80929df69d5d"}, "metrics": [{"type": "v_measure", "value": 33.042527574996505}, {"type": "v_measures", "value": [0.2896613951792161, 0.2974905938215674, 
0.28195491579456905, 0.3008325954323272, 0.3012695848509836, 0.28933380000430453, 0.297420818100457, 0.2792041800887245, 0.3049968405105834, 0.30704380358904726, 0.39238640618067383, 0.3932595512850983, 0.3875472939281748, 0.39822946285500505, 0.39839156092566014, 0.40184636328122075, 0.39008499175162326, 0.3984035967802891, 0.39159106298575347, 0.3923217036338575, 0.3916410911561569, 0.2357749280106326, 0.23682806457721106, 0.3122239617657793, 0.26610676013174756, 0.18123482803921434, 0.2504695156635453, 0.10917464735757001, 0.16714512698028008, 1.0, 0.19931410358764295, 0.2896613951792161, 0.2974905938215674, 0.28195491579456905, 0.3008325954323272, 0.3012695848509836, 0.28933380000430453, 0.297420818100457, 0.2792041800887245, 0.3049968405105834, 0.30704380358904726, 0.39238640618067383, 0.3932595512850983, 0.3875472939281748, 0.39822946285500505, 0.39839156092566014, 0.40184636328122075, 0.39008499175162326, 0.3984035967802891, 0.39159106298575347, 0.3923217036338575, 0.3916410911561569, 0.2357749280106326, 0.23682806457721106, 0.3122239617657793, 0.26610676013174756, 0.18123482803921434, 0.2504695156635453, 0.10917464735757001, 0.16714512698028008, 1.0, 0.19931410358764295, 0.2896613951792161, 0.2974905938215674, 0.28195491579456905, 0.3008325954323272, 0.3012695848509836, 0.28933380000430453, 0.297420818100457, 0.2792041800887245, 0.3049968405105834, 0.30704380358904726, 0.39238640618067383, 0.3932595512850983, 0.3875472939281748, 0.39822946285500505, 0.39839156092566014, 0.40184636328122075, 0.39008499175162326, 0.3984035967802891, 0.39159106298575347, 0.3923217036338575, 0.3916410911561569, 0.2357749280106326, 0.23682806457721106, 0.3122239617657793, 0.26610676013174756, 0.18123482803921434, 0.2504695156635453, 0.10917464735757001, 0.16714512698028008, 1.0, 0.19931410358764295, 0.2896613951792161, 0.2974905938215674, 0.28195491579456905, 0.3008325954323272, 0.3012695848509836, 0.28933380000430453, 0.297420818100457, 0.2792041800887245, 0.3049968405105834, 0.30704380358904726, 0.39238640618067383, 0.3932595512850983, 0.3875472939281748, 0.39822946285500505, 0.39839156092566014, 0.40184636328122075, 0.39008499175162326, 0.3984035967802891, 0.39159106298575347, 0.3923217036338575, 0.3916410911561569, 0.2357749280106326, 0.23682806457721106, 0.3122239617657793, 0.26610676013174756, 0.18123482803921434, 0.2504695156635453, 0.10917464735757001, 0.16714512698028008, 1.0, 0.19931410358764295, 0.2896613951792161, 0.2974905938215674, 0.28195491579456905, 0.3008325954323272, 0.3012695848509836, 0.28933380000430453, 0.297420818100457, 0.2792041800887245, 0.3049968405105834, 0.30704380358904726, 0.39238640618067383, 0.3932595512850983, 0.3875472939281748, 0.39822946285500505, 0.39839156092566014, 0.40184636328122075, 0.39008499175162326, 0.3984035967802891, 0.39159106298575347, 0.3923217036338575, 0.3916410911561569, 0.2357749280106326, 0.23682806457721106, 0.3122239617657793, 0.26610676013174756, 0.18123482803921434, 0.2504695156635453, 0.10917464735757001, 0.16714512698028008, 1.0, 0.19931410358764295, 0.2896613951792161, 0.2974905938215674, 0.28195491579456905, 0.3008325954323272, 0.3012695848509836, 0.28933380000430453, 0.297420818100457, 0.2792041800887245, 0.3049968405105834, 0.30704380358904726, 0.39238640618067383, 0.3932595512850983, 0.3875472939281748, 0.39822946285500505, 0.39839156092566014, 0.40184636328122075, 0.39008499175162326, 0.3984035967802891, 0.39159106298575347, 0.3923217036338575, 0.3916410911561569, 0.2357749280106326, 0.23682806457721106, 0.3122239617657793, 
0.26610676013174756, 0.18123482803921434, 0.2504695156635453, 0.10917464735757001, 0.16714512698028008, 1.0, 0.19931410358764295, 0.2896613951792161, 0.2974905938215674, 0.28195491579456905, 0.3008325954323272, 0.3012695848509836, 0.28933380000430453, 0.297420818100457, 0.2792041800887245, 0.3049968405105834, 0.30704380358904726, 0.39238640618067383, 0.3932595512850983, 0.3875472939281748, 0.39822946285500505, 0.39839156092566014, 0.40184636328122075, 0.39008499175162326, 0.3984035967802891, 0.39159106298575347, 0.3923217036338575, 0.3916410911561569, 0.2357749280106326, 0.23682806457721106, 0.3122239617657793, 0.26610676013174756, 0.18123482803921434, 0.2504695156635453, 0.10917464735757001, 0.16714512698028008, 1.0, 0.19931410358764295, 0.2896613951792161, 0.2974905938215674, 0.28195491579456905, 0.3008325954323272, 0.3012695848509836, 0.28933380000430453, 0.297420818100457, 0.2792041800887245, 0.3049968405105834, 0.30704380358904726, 0.39238640618067383, 0.3932595512850983, 0.3875472939281748, 0.39822946285500505, 0.39839156092566014, 0.40184636328122075, 0.39008499175162326, 0.3984035967802891, 0.39159106298575347, 0.3923217036338575, 0.3916410911561569, 0.2357749280106326, 0.23682806457721106, 0.3122239617657793, 0.26610676013174756, 0.18123482803921434, 0.2504695156635453, 0.10917464735757001, 0.16714512698028008, 1.0, 0.19931410358764295, 0.2896613951792161, 0.2974905938215674, 0.28195491579456905, 0.3008325954323272, 0.3012695848509836, 0.28933380000430453, 0.297420818100457, 0.2792041800887245, 0.3049968405105834, 0.30704380358904726, 0.39238640618067383, 0.3932595512850983, 0.3875472939281748, 0.39822946285500505, 0.39839156092566014, 0.40184636328122075, 0.39008499175162326, 0.3984035967802891, 0.39159106298575347, 0.3923217036338575, 0.3916410911561569, 0.2357749280106326, 0.23682806457721106, 0.3122239617657793, 0.26610676013174756, 0.18123482803921434, 0.2504695156635453, 0.10917464735757001, 0.16714512698028008, 1.0, 0.19931410358764295, 0.2896613951792161, 0.2974905938215674, 0.28195491579456905, 0.3008325954323272, 0.3012695848509836, 0.28933380000430453, 0.297420818100457, 0.2792041800887245, 0.3049968405105834, 0.30704380358904726, 0.39238640618067383, 0.3932595512850983, 0.3875472939281748, 0.39822946285500505, 0.39839156092566014, 0.40184636328122075, 0.39008499175162326, 0.3984035967802891, 0.39159106298575347, 0.3923217036338575, 0.3916410911561569, 0.2357749280106326, 0.23682806457721106, 0.3122239617657793, 0.26610676013174756, 0.18123482803921434, 0.2504695156635453, 0.10917464735757001, 0.16714512698028008, 1.0, 0.19931410358764295, 0.2896613951792161, 0.2974905938215674, 0.28195491579456905, 0.3008325954323272, 0.3012695848509836, 0.28933380000430453, 0.297420818100457, 0.2792041800887245, 0.3049968405105834, 0.30704380358904726, 0.39238640618067383, 0.3932595512850983, 0.3875472939281748, 0.39822946285500505, 0.39839156092566014, 0.40184636328122075, 0.39008499175162326, 0.3984035967802891, 0.39159106298575347, 0.3923217036338575, 0.3916410911561569, 0.2357749280106326, 0.23682806457721106, 0.3122239617657793, 0.26610676013174756, 0.18123482803921434, 0.2504695156635453, 0.10917464735757001, 0.16714512698028008, 1.0, 0.19931410358764295, 0.2896613951792161, 0.2974905938215674, 0.28195491579456905, 0.3008325954323272, 0.3012695848509836, 0.28933380000430453, 0.297420818100457, 0.2792041800887245, 0.3049968405105834, 0.30704380358904726, 0.39238640618067383, 0.3932595512850983, 0.3875472939281748, 0.39822946285500505, 0.39839156092566014, 0.40184636328122075, 
0.39008499175162326, 0.3984035967802891, 0.39159106298575347, 0.3923217036338575, 0.3916410911561569, 0.2357749280106326, 0.23682806457721106, 0.3122239617657793, 0.26610676013174756, 0.18123482803921434, 0.2504695156635453, 0.10917464735757001, 0.16714512698028008, 1.0, 0.19931410358764295, 0.2896613951792161, 0.2974905938215674, 0.28195491579456905, 0.3008325954323272, 0.3012695848509836, 0.28933380000430453, 0.297420818100457, 0.2792041800887245, 0.3049968405105834, 0.30704380358904726, 0.39238640618067383, 0.3932595512850983, 0.3875472939281748, 0.39822946285500505, 0.39839156092566014, 0.40184636328122075, 0.39008499175162326, 0.3984035967802891, 0.39159106298575347, 0.3923217036338575, 0.3916410911561569, 0.2357749280106326, 0.23682806457721106, 0.3122239617657793, 0.26610676013174756, 0.18123482803921434, 0.2504695156635453, 0.10917464735757001, 0.16714512698028008, 1.0, 0.19931410358764295, 0.2896613951792161, 0.2974905938215674, 0.28195491579456905, 0.3008325954323272, 0.3012695848509836, 0.28933380000430453, 0.297420818100457, 0.2792041800887245, 0.3049968405105834, 0.30704380358904726, 0.39238640618067383, 0.3932595512850983, 0.3875472939281748, 0.39822946285500505, 0.39839156092566014, 0.40184636328122075, 0.39008499175162326, 0.3984035967802891, 0.39159106298575347, 0.3923217036338575, 0.3916410911561569, 0.2357749280106326, 0.23682806457721106, 0.3122239617657793, 0.26610676013174756, 0.18123482803921434, 0.2504695156635453, 0.10917464735757001, 0.16714512698028008, 1.0, 0.19931410358764295, 0.2896613951792161, 0.2974905938215674, 0.28195491579456905, 0.3008325954323272, 0.3012695848509836, 0.28933380000430453, 0.297420818100457, 0.2792041800887245, 0.3049968405105834, 0.30704380358904726, 0.39238640618067383, 0.3932595512850983, 0.3875472939281748, 0.39822946285500505, 0.39839156092566014, 0.40184636328122075, 0.39008499175162326, 0.3984035967802891, 0.39159106298575347, 0.3923217036338575, 0.3916410911561569, 0.2357749280106326, 0.23682806457721106, 0.3122239617657793, 0.26610676013174756, 0.18123482803921434, 0.2504695156635453, 0.10917464735757001, 0.16714512698028008, 1.0, 0.19931410358764295, 0.2896613951792161, 0.2974905938215674, 0.28195491579456905, 0.3008325954323272, 0.3012695848509836, 0.28933380000430453, 0.297420818100457, 0.2792041800887245, 0.3049968405105834, 0.30704380358904726, 0.39238640618067383, 0.3932595512850983, 0.3875472939281748, 0.39822946285500505, 0.39839156092566014, 0.40184636328122075, 0.39008499175162326, 0.3984035967802891, 0.39159106298575347, 0.3923217036338575, 0.3916410911561569, 0.2357749280106326, 0.23682806457721106, 0.3122239617657793, 0.26610676013174756, 0.18123482803921434, 0.2504695156635453, 0.10917464735757001, 0.16714512698028008, 1.0, 0.19931410358764295, 0.2896613951792161, 0.2974905938215674, 0.28195491579456905, 0.3008325954323272, 0.3012695848509836, 0.28933380000430453, 0.297420818100457, 0.2792041800887245, 0.3049968405105834, 0.30704380358904726, 0.39238640618067383, 0.3932595512850983, 0.3875472939281748, 0.39822946285500505, 0.39839156092566014, 0.40184636328122075, 0.39008499175162326, 0.3984035967802891, 0.39159106298575347, 0.3923217036338575, 0.3916410911561569, 0.2357749280106326, 0.23682806457721106, 0.3122239617657793, 0.26610676013174756, 0.18123482803921434, 0.2504695156635453, 0.10917464735757001, 0.16714512698028008, 1.0, 0.19931410358764295, 0.2896613951792161, 0.2974905938215674, 0.28195491579456905, 0.3008325954323272, 0.3012695848509836, 0.28933380000430453, 0.297420818100457, 0.2792041800887245, 
0.3049968405105834, 0.30704380358904726, 0.39238640618067383, 0.3932595512850983, 0.3875472939281748, 0.39822946285500505, 0.39839156092566014, 0.40184636328122075, 0.39008499175162326, 0.3984035967802891, 0.39159106298575347, 0.3923217036338575, 0.3916410911561569, 0.2357749280106326, 0.23682806457721106, 0.3122239617657793, 0.26610676013174756, 0.18123482803921434, 0.2504695156635453, 0.10917464735757001, 0.16714512698028008, 1.0, 0.19931410358764295, 0.2896613951792161, 0.2974905938215674, 0.28195491579456905, 0.3008325954323272, 0.3012695848509836, 0.28933380000430453, 0.297420818100457, 0.2792041800887245, 0.3049968405105834, 0.30704380358904726, 0.39238640618067383, 0.3932595512850983, 0.3875472939281748, 0.39822946285500505, 0.39839156092566014, 0.40184636328122075, 0.39008499175162326, 0.3984035967802891, 0.39159106298575347, 0.3923217036338575, 0.3916410911561569, 0.2357749280106326, 0.23682806457721106, 0.3122239617657793, 0.26610676013174756, 0.18123482803921434, 0.2504695156635453, 0.10917464735757001, 0.16714512698028008, 1.0, 0.19931410358764295, 0.2896613951792161, 0.2974905938215674, 0.28195491579456905, 0.3008325954323272, 0.3012695848509836, 0.28933380000430453, 0.297420818100457, 0.2792041800887245, 0.3049968405105834, 0.30704380358904726, 0.39238640618067383, 0.3932595512850983, 0.3875472939281748, 0.39822946285500505, 0.39839156092566014, 0.40184636328122075, 0.39008499175162326, 0.3984035967802891, 0.39159106298575347, 0.3923217036338575, 0.3916410911561569, 0.2357749280106326, 0.23682806457721106, 0.3122239617657793, 0.26610676013174756, 0.18123482803921434, 0.2504695156635453, 0.10917464735757001, 0.16714512698028008, 1.0, 0.19931410358764295, 0.2896613951792161, 0.2974905938215674, 0.28195491579456905, 0.3008325954323272, 0.3012695848509836, 0.28933380000430453, 0.297420818100457, 0.2792041800887245, 0.3049968405105834, 0.30704380358904726, 0.39238640618067383, 0.3932595512850983, 0.3875472939281748, 0.39822946285500505, 0.39839156092566014, 0.40184636328122075, 0.39008499175162326, 0.3984035967802891, 0.39159106298575347, 0.3923217036338575, 0.3916410911561569, 0.2357749280106326, 0.23682806457721106, 0.3122239617657793, 0.26610676013174756, 0.18123482803921434, 0.2504695156635453, 0.10917464735757001, 0.16714512698028008, 1.0, 0.19931410358764295, 0.2896613951792161, 0.2974905938215674, 0.28195491579456905, 0.3008325954323272, 0.3012695848509836, 0.28933380000430453, 0.297420818100457, 0.2792041800887245, 0.3049968405105834, 0.30704380358904726, 0.39238640618067383, 0.3932595512850983, 0.3875472939281748, 0.39822946285500505, 0.39839156092566014, 0.40184636328122075, 0.39008499175162326, 0.3984035967802891, 0.39159106298575347, 0.3923217036338575, 0.3916410911561569, 0.2357749280106326, 0.23682806457721106, 0.3122239617657793, 0.26610676013174756, 0.18123482803921434, 0.2504695156635453, 0.10917464735757001, 0.16714512698028008, 1.0, 0.19931410358764295, 0.2896613951792161, 0.2974905938215674, 0.28195491579456905, 0.3008325954323272, 0.3012695848509836, 0.28933380000430453, 0.297420818100457, 0.2792041800887245, 0.3049968405105834, 0.30704380358904726, 0.39238640618067383, 0.3932595512850983, 0.3875472939281748, 0.39822946285500505, 0.39839156092566014, 0.40184636328122075, 0.39008499175162326, 0.3984035967802891, 0.39159106298575347, 0.3923217036338575, 0.3916410911561569, 0.2357749280106326, 0.23682806457721106, 0.3122239617657793, 0.26610676013174756, 0.18123482803921434, 0.2504695156635453, 0.10917464735757001, 0.16714512698028008, 1.0, 0.19931410358764295, 
0.2896613951792161, 0.2974905938215674, 0.28195491579456905, 0.3008325954323272, 0.3012695848509836, 0.28933380000430453, 0.297420818100457, 0.2792041800887245, 0.3049968405105834, 0.30704380358904726, 0.39238640618067383, 0.3932595512850983, 0.3875472939281748, 0.39822946285500505, 0.39839156092566014, 0.40184636328122075, 0.39008499175162326, 0.3984035967802891, 0.39159106298575347, 0.3923217036338575, 0.3916410911561569, 0.2357749280106326, 0.23682806457721106, 0.3122239617657793, 0.26610676013174756, 0.18123482803921434, 0.2504695156635453, 0.10917464735757001, 0.16714512698028008, 1.0, 0.19931410358764295, 0.2896613951792161, 0.2974905938215674, 0.28195491579456905, 0.3008325954323272, 0.3012695848509836, 0.28933380000430453, 0.297420818100457, 0.2792041800887245, 0.3049968405105834, 0.30704380358904726, 0.39238640618067383, 0.3932595512850983, 0.3875472939281748, 0.39822946285500505, 0.39839156092566014, 0.40184636328122075, 0.39008499175162326, 0.3984035967802891, 0.39159106298575347, 0.3923217036338575, 0.3916410911561569, 0.2357749280106326, 0.23682806457721106, 0.3122239617657793, 0.26610676013174756, 0.18123482803921434, 0.2504695156635453, 0.10917464735757001, 0.16714512698028008, 1.0, 0.19931410358764295, 0.2896613951792161, 0.2974905938215674, 0.28195491579456905, 0.3008325954323272, 0.3012695848509836, 0.28933380000430453, 0.297420818100457, 0.2792041800887245, 0.3049968405105834, 0.30704380358904726, 0.39238640618067383, 0.3932595512850983, 0.3875472939281748, 0.39822946285500505, 0.39839156092566014, 0.40184636328122075, 0.39008499175162326, 0.3984035967802891, 0.39159106298575347, 0.3923217036338575, 0.3916410911561569, 0.2357749280106326, 0.23682806457721106, 0.3122239617657793, 0.26610676013174756, 0.18123482803921434, 0.2504695156635453, 0.10917464735757001, 0.16714512698028008, 1.0, 0.19931410358764295, 0.2896613951792161, 0.2974905938215674, 0.28195491579456905, 0.3008325954323272, 0.3012695848509836, 0.28933380000430453, 0.297420818100457, 0.2792041800887245, 0.3049968405105834, 0.30704380358904726, 0.39238640618067383, 0.3932595512850983, 0.3875472939281748, 0.39822946285500505, 0.39839156092566014, 0.40184636328122075, 0.39008499175162326, 0.3984035967802891, 0.39159106298575347, 0.3923217036338575, 0.3916410911561569, 0.2357749280106326, 0.23682806457721106, 0.3122239617657793, 0.26610676013174756, 0.18123482803921434, 0.2504695156635453, 0.10917464735757001, 0.16714512698028008, 1.0, 0.19931410358764295, 0.2896613951792161, 0.2974905938215674, 0.28195491579456905, 0.3008325954323272, 0.3012695848509836, 0.28933380000430453, 0.297420818100457, 0.2792041800887245, 0.3049968405105834, 0.30704380358904726, 0.39238640618067383, 0.3932595512850983, 0.3875472939281748, 0.39822946285500505, 0.39839156092566014, 0.40184636328122075, 0.39008499175162326, 0.3984035967802891, 0.39159106298575347, 0.3923217036338575, 0.3916410911561569, 0.2357749280106326, 0.23682806457721106, 0.3122239617657793, 0.26610676013174756, 0.18123482803921434, 0.2504695156635453, 0.10917464735757001, 0.16714512698028008, 1.0, 0.19931410358764295, 0.2896613951792161, 0.2974905938215674, 0.28195491579456905, 0.3008325954323272, 0.3012695848509836, 0.28933380000430453, 0.297420818100457, 0.2792041800887245, 0.3049968405105834, 0.30704380358904726, 0.39238640618067383, 0.3932595512850983, 0.3875472939281748, 0.39822946285500505, 0.39839156092566014, 0.40184636328122075, 0.39008499175162326, 0.3984035967802891, 0.39159106298575347, 0.3923217036338575, 0.3916410911561569, 0.2357749280106326, 
0.23682806457721106, 0.3122239617657793, 0.26610676013174756, 0.18123482803921434, 0.2504695156635453, 0.10917464735757001, 0.16714512698028008, 1.0, 0.19931410358764295, 0.2896613951792161, 0.2974905938215674, 0.28195491579456905, 0.3008325954323272, 0.3012695848509836, 0.28933380000430453, 0.297420818100457, 0.2792041800887245, 0.3049968405105834, 0.30704380358904726, 0.39238640618067383, 0.3932595512850983, 0.3875472939281748, 0.39822946285500505, 0.39839156092566014, 0.40184636328122075, 0.39008499175162326, 0.3984035967802891, 0.39159106298575347, 0.3923217036338575, 0.3916410911561569, 0.2357749280106326, 0.23682806457721106, 0.3122239617657793, 0.26610676013174756, 0.18123482803921434, 0.2504695156635453, 0.10917464735757001, 0.16714512698028008, 1.0, 0.19931410358764295, 0.2896613951792161, 0.2974905938215674, 0.28195491579456905, 0.3008325954323272, 0.3012695848509836, 0.28933380000430453, 0.297420818100457, 0.2792041800887245, 0.3049968405105834, 0.30704380358904726, 0.39238640618067383, 0.3932595512850983, 0.3875472939281748, 0.39822946285500505, 0.39839156092566014, 0.40184636328122075, 0.39008499175162326, 0.3984035967802891, 0.39159106298575347, 0.3923217036338575, 0.3916410911561569, 0.2357749280106326, 0.23682806457721106, 0.3122239617657793, 0.26610676013174756, 0.18123482803921434, 0.2504695156635453, 0.10917464735757001, 0.16714512698028008, 1.0, 0.19931410358764295, 0.2896613951792161, 0.2974905938215674, 0.28195491579456905, 0.3008325954323272, 0.3012695848509836, 0.28933380000430453, 0.297420818100457, 0.2792041800887245, 0.3049968405105834, 0.30704380358904726, 0.39238640618067383, 0.3932595512850983, 0.3875472939281748, 0.39822946285500505, 0.39839156092566014, 0.40184636328122075, 0.39008499175162326, 0.3984035967802891, 0.39159106298575347, 0.3923217036338575, 0.3916410911561569, 0.2357749280106326, 0.23682806457721106, 0.3122239617657793, 0.26610676013174756, 0.18123482803921434, 0.2504695156635453, 0.10917464735757001, 0.16714512698028008, 1.0, 0.19931410358764295, 0.2896613951792161, 0.2974905938215674, 0.28195491579456905, 0.3008325954323272, 0.3012695848509836, 0.28933380000430453, 0.297420818100457, 0.2792041800887245, 0.3049968405105834, 0.30704380358904726, 0.39238640618067383, 0.3932595512850983, 0.3875472939281748, 0.39822946285500505, 0.39839156092566014, 0.40184636328122075, 0.39008499175162326, 0.3984035967802891, 0.39159106298575347, 0.3923217036338575, 0.3916410911561569, 0.2357749280106326, 0.23682806457721106, 0.3122239617657793, 0.26610676013174756, 0.18123482803921434, 0.2504695156635453, 0.10917464735757001, 0.16714512698028008, 1.0, 0.19931410358764295, 0.2896613951792161, 0.2974905938215674, 0.28195491579456905, 0.3008325954323272, 0.3012695848509836, 0.28933380000430453, 0.297420818100457, 0.2792041800887245, 0.3049968405105834, 0.30704380358904726, 0.39238640618067383, 0.3932595512850983, 0.3875472939281748, 0.39822946285500505, 0.39839156092566014, 0.40184636328122075, 0.39008499175162326, 0.3984035967802891, 0.39159106298575347, 0.3923217036338575, 0.3916410911561569, 0.2357749280106326, 0.23682806457721106, 0.3122239617657793, 0.26610676013174756, 0.18123482803921434, 0.2504695156635453, 0.10917464735757001, 0.16714512698028008, 1.0, 0.19931410358764295, 0.2896613951792161, 0.2974905938215674, 0.28195491579456905, 0.3008325954323272, 0.3012695848509836, 0.28933380000430453, 0.297420818100457, 0.2792041800887245, 0.3049968405105834, 0.30704380358904726, 0.39238640618067383, 0.3932595512850983, 0.3875472939281748, 0.39822946285500505, 
0.39839156092566014, 0.40184636328122075, 0.39008499175162326, 0.3984035967802891, 0.39159106298575347, 0.3923217036338575, 0.3916410911561569, 0.2357749280106326, 0.23682806457721106, 0.3122239617657793, 0.26610676013174756, 0.18123482803921434, 0.2504695156635453, 0.10917464735757001, 0.16714512698028008, 1.0, 0.19931410358764295, 0.2896613951792161, 0.2974905938215674, 0.28195491579456905, 0.3008325954323272, 0.3012695848509836, 0.28933380000430453, 0.297420818100457, 0.2792041800887245, 0.3049968405105834, 0.30704380358904726, 0.39238640618067383, 0.3932595512850983, 0.3875472939281748, 0.39822946285500505, 0.39839156092566014, 0.40184636328122075, 0.39008499175162326, 0.3984035967802891, 0.39159106298575347, 0.3923217036338575, 0.3916410911561569, 0.2357749280106326, 0.23682806457721106, 0.3122239617657793, 0.26610676013174756, 0.18123482803921434, 0.2504695156635453, 0.10917464735757001, 0.16714512698028008, 1.0, 0.19931410358764295, 0.2896613951792161, 0.2974905938215674, 0.28195491579456905, 0.3008325954323272, 0.3012695848509836, 0.28933380000430453, 0.297420818100457, 0.2792041800887245, 0.3049968405105834, 0.30704380358904726, 0.39238640618067383, 0.3932595512850983, 0.3875472939281748, 0.39822946285500505, 0.39839156092566014, 0.40184636328122075, 0.39008499175162326, 0.3984035967802891, 0.39159106298575347, 0.3923217036338575, 0.3916410911561569, 0.2357749280106326, 0.23682806457721106, 0.3122239617657793, 0.26610676013174756, 0.18123482803921434, 0.2504695156635453, 0.10917464735757001, 0.16714512698028008, 1.0, 0.19931410358764295, 0.2896613951792161, 0.2974905938215674, 0.28195491579456905, 0.3008325954323272, 0.3012695848509836, 0.28933380000430453, 0.297420818100457, 0.2792041800887245, 0.3049968405105834, 0.30704380358904726, 0.39238640618067383, 0.3932595512850983, 0.3875472939281748, 0.39822946285500505, 0.39839156092566014, 0.40184636328122075, 0.39008499175162326, 0.3984035967802891, 0.39159106298575347, 0.3923217036338575, 0.3916410911561569, 0.2357749280106326, 0.23682806457721106, 0.3122239617657793, 0.26610676013174756, 0.18123482803921434, 0.2504695156635453, 0.10917464735757001, 0.16714512698028008, 1.0, 0.19931410358764295, 0.2896613951792161, 0.2974905938215674, 0.28195491579456905, 0.3008325954323272, 0.3012695848509836, 0.28933380000430453, 0.297420818100457, 0.2792041800887245, 0.3049968405105834, 0.30704380358904726, 0.39238640618067383, 0.3932595512850983, 0.3875472939281748, 0.39822946285500505, 0.39839156092566014, 0.40184636328122075, 0.39008499175162326, 0.3984035967802891, 0.39159106298575347, 0.3923217036338575, 0.3916410911561569, 0.2357749280106326, 0.23682806457721106, 0.3122239617657793, 0.26610676013174756, 0.18123482803921434, 0.2504695156635453, 0.10917464735757001, 0.16714512698028008, 1.0, 0.19931410358764295, 0.2896613951792161, 0.2974905938215674, 0.28195491579456905, 0.3008325954323272, 0.3012695848509836, 0.28933380000430453, 0.297420818100457, 0.2792041800887245, 0.3049968405105834, 0.30704380358904726, 0.39238640618067383, 0.3932595512850983, 0.3875472939281748, 0.39822946285500505, 0.39839156092566014, 0.40184636328122075, 0.39008499175162326, 0.3984035967802891, 0.39159106298575347, 0.3923217036338575, 0.3916410911561569, 0.2357749280106326, 0.23682806457721106, 0.3122239617657793, 0.26610676013174756, 0.18123482803921434, 0.2504695156635453, 0.10917464735757001, 0.16714512698028008, 1.0, 0.19931410358764295, 0.2896613951792161, 0.2974905938215674, 0.28195491579456905, 0.3008325954323272, 0.3012695848509836, 0.28933380000430453, 
0.297420818100457, 0.2792041800887245, 0.3049968405105834, 0.30704380358904726, 0.39238640618067383, 0.3932595512850983, 0.3875472939281748, 0.39822946285500505, 0.39839156092566014, 0.40184636328122075, 0.39008499175162326, 0.3984035967802891, 0.39159106298575347, 0.3923217036338575, 0.3916410911561569, 0.2357749280106326, 0.23682806457721106, 0.3122239617657793, 0.26610676013174756, 0.18123482803921434, 0.2504695156635453, 0.10917464735757001, 0.16714512698028008, 1.0, 0.19931410358764295, 0.2896613951792161, 0.2974905938215674, 0.28195491579456905, 0.3008325954323272, 0.3012695848509836, 0.28933380000430453, 0.297420818100457, 0.2792041800887245, 0.3049968405105834, 0.30704380358904726, 0.39238640618067383, 0.3932595512850983, 0.3875472939281748, 0.39822946285500505, 0.39839156092566014, 0.40184636328122075, 0.39008499175162326, 0.3984035967802891, 0.39159106298575347, 0.3923217036338575, 0.3916410911561569, 0.2357749280106326, 0.23682806457721106, 0.3122239617657793, 0.26610676013174756, 0.18123482803921434, 0.2504695156635453, 0.10917464735757001, 0.16714512698028008, 1.0, 0.19931410358764295, 0.2896613951792161, 0.2974905938215674, 0.28195491579456905, 0.3008325954323272, 0.3012695848509836, 0.28933380000430453, 0.297420818100457, 0.2792041800887245, 0.3049968405105834, 0.30704380358904726, 0.39238640618067383, 0.3932595512850983, 0.3875472939281748, 0.39822946285500505, 0.39839156092566014, 0.40184636328122075, 0.39008499175162326, 0.3984035967802891, 0.39159106298575347, 0.3923217036338575, 0.3916410911561569, 0.2357749280106326, 0.23682806457721106, 0.3122239617657793, 0.26610676013174756, 0.18123482803921434, 0.2504695156635453, 0.10917464735757001, 0.16714512698028008, 1.0, 0.19931410358764295, 0.2896613951792161, 0.2974905938215674, 0.28195491579456905, 0.3008325954323272, 0.3012695848509836, 0.28933380000430453, 0.297420818100457, 0.2792041800887245, 0.3049968405105834, 0.30704380358904726, 0.39238640618067383, 0.3932595512850983, 0.3875472939281748, 0.39822946285500505, 0.39839156092566014, 0.40184636328122075, 0.39008499175162326, 0.3984035967802891, 0.39159106298575347, 0.3923217036338575, 0.3916410911561569, 0.2357749280106326, 0.23682806457721106, 0.3122239617657793, 0.26610676013174756, 0.18123482803921434, 0.2504695156635453, 0.10917464735757001, 0.16714512698028008, 1.0, 0.19931410358764295, 0.2896613951792161, 0.2974905938215674, 0.28195491579456905, 0.3008325954323272, 0.3012695848509836, 0.28933380000430453, 0.297420818100457, 0.2792041800887245, 0.3049968405105834, 0.30704380358904726, 0.39238640618067383, 0.3932595512850983, 0.3875472939281748, 0.39822946285500505, 0.39839156092566014, 0.40184636328122075, 0.39008499175162326, 0.3984035967802891, 0.39159106298575347, 0.3923217036338575, 0.3916410911561569, 0.2357749280106326, 0.23682806457721106, 0.3122239617657793, 0.26610676013174756, 0.18123482803921434, 0.2504695156635453, 0.10917464735757001, 0.16714512698028008, 1.0, 0.19931410358764295, 0.2896613951792161, 0.2974905938215674, 0.28195491579456905, 0.3008325954323272, 0.3012695848509836, 0.28933380000430453, 0.297420818100457, 0.2792041800887245, 0.3049968405105834, 0.30704380358904726, 0.39238640618067383, 0.3932595512850983, 0.3875472939281748, 0.39822946285500505, 0.39839156092566014, 0.40184636328122075, 0.39008499175162326, 0.3984035967802891, 0.39159106298575347, 0.3923217036338575, 0.3916410911561569, 0.2357749280106326, 0.23682806457721106, 0.3122239617657793, 0.26610676013174756, 0.18123482803921434, 0.2504695156635453, 0.10917464735757001, 
0.16714512698028008, 1.0, 0.19931410358764295]}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB ArxivClusteringS2S", "type": "mteb/arxiv-clustering-s2s", "config": "default", "split": "test", "revision": "f910caf1a6075f7329cdf8c1a6135696f37dbd53"}, "metrics": [{"type": "v_measure", "value": 24.68133686033884}, {"type": "v_measures", "value": [0.2005976632299017, 0.208968006943616, 0.20946008190179435, 0.20539809799180958, 0.21463587994609631, 0.20913407901977635, 0.20908020832330956, 0.1944493063711425, 0.20181175619582953, 0.2249901827151246, 0.29132293951181787, 0.29570222215271086, 0.2796075942678196, 0.28871411057617774, 0.29302758518431116, 0.29227253592096986, 0.2856462545898644, 0.28687743467743254, 0.2900793948371436, 0.28627385826697854, 0.27308659940457203, 0.14117319401377473, 0.1761477350541332, 0.24048342650129406, 0.19387054212465876, 0.14470023981605995, 0.16704070762984086, 0.07547453139959907, 0.127993495025131, 1.0, 0.14319476311235024, 0.2005976632299017, 0.208968006943616, 0.20946008190179435, 0.20539809799180958, 0.21463587994609631, 
0.20913407901977635, 0.20908020832330956, 0.1944493063711425, 0.20181175619582953, 0.2249901827151246, 0.29132293951181787, 0.29570222215271086, 0.2796075942678196, 0.28871411057617774, 0.29302758518431116, 0.29227253592096986, 0.2856462545898644, 0.28687743467743254, 0.2900793948371436, 0.28627385826697854, 0.27308659940457203, 0.14117319401377473, 0.1761477350541332, 0.24048342650129406, 0.19387054212465876, 0.14470023981605995, 0.16704070762984086, 0.07547453139959907, 0.127993495025131, 1.0, 0.14319476311235024, 0.2005976632299017, 0.208968006943616, 0.20946008190179435, 0.20539809799180958, 0.21463587994609631, 0.20913407901977635, 0.20908020832330956, 0.1944493063711425, 0.20181175619582953, 0.2249901827151246, 0.29132293951181787, 0.29570222215271086, 0.2796075942678196, 0.28871411057617774, 0.29302758518431116, 0.29227253592096986, 0.2856462545898644, 0.28687743467743254, 0.2900793948371436, 0.28627385826697854, 0.27308659940457203, 0.14117319401377473, 0.1761477350541332, 0.24048342650129406, 0.19387054212465876, 0.14470023981605995, 0.16704070762984086, 0.07547453139959907, 0.127993495025131, 1.0, 0.14319476311235024, 0.2005976632299017, 0.208968006943616, 0.20946008190179435, 0.20539809799180958, 0.21463587994609631, 0.20913407901977635, 0.20908020832330956, 0.1944493063711425, 0.20181175619582953, 0.2249901827151246, 0.29132293951181787, 0.29570222215271086, 0.2796075942678196, 0.28871411057617774, 0.29302758518431116, 0.29227253592096986, 0.2856462545898644, 0.28687743467743254, 0.2900793948371436, 0.28627385826697854, 0.27308659940457203, 0.14117319401377473, 0.1761477350541332, 0.24048342650129406, 0.19387054212465876, 0.14470023981605995, 0.16704070762984086, 0.07547453139959907, 0.127993495025131, 1.0, 0.14319476311235024, 0.2005976632299017, 0.208968006943616, 0.20946008190179435, 0.20539809799180958, 0.21463587994609631, 0.20913407901977635, 0.20908020832330956, 0.1944493063711425, 0.20181175619582953, 0.2249901827151246, 0.29132293951181787, 0.29570222215271086, 0.2796075942678196, 0.28871411057617774, 0.29302758518431116, 0.29227253592096986, 0.2856462545898644, 0.28687743467743254, 0.2900793948371436, 0.28627385826697854, 0.27308659940457203, 0.14117319401377473, 0.1761477350541332, 0.24048342650129406, 0.19387054212465876, 0.14470023981605995, 0.16704070762984086, 0.07547453139959907, 0.127993495025131, 1.0, 0.14319476311235024, 0.2005976632299017, 0.208968006943616, 0.20946008190179435, 0.20539809799180958, 0.21463587994609631, 0.20913407901977635, 0.20908020832330956, 0.1944493063711425, 0.20181175619582953, 0.2249901827151246, 0.29132293951181787, 0.29570222215271086, 0.2796075942678196, 0.28871411057617774, 0.29302758518431116, 0.29227253592096986, 0.2856462545898644, 0.28687743467743254, 0.2900793948371436, 0.28627385826697854, 0.27308659940457203, 0.14117319401377473, 0.1761477350541332, 0.24048342650129406, 0.19387054212465876, 0.14470023981605995, 0.16704070762984086, 0.07547453139959907, 0.127993495025131, 1.0, 0.14319476311235024, 0.2005976632299017, 0.208968006943616, 0.20946008190179435, 0.20539809799180958, 0.21463587994609631, 0.20913407901977635, 0.20908020832330956, 0.1944493063711425, 0.20181175619582953, 0.2249901827151246, 0.29132293951181787, 0.29570222215271086, 0.2796075942678196, 0.28871411057617774, 0.29302758518431116, 0.29227253592096986, 0.2856462545898644, 0.28687743467743254, 0.2900793948371436, 0.28627385826697854, 0.27308659940457203, 0.14117319401377473, 0.1761477350541332, 0.24048342650129406, 0.19387054212465876, 0.14470023981605995, 
0.16704070762984086, 0.07547453139959907, 0.127993495025131, 1.0, 0.14319476311235024, 0.2005976632299017, 0.208968006943616, 0.20946008190179435, 0.20539809799180958, 0.21463587994609631, 0.20913407901977635, 0.20908020832330956, 0.1944493063711425, 0.20181175619582953, 0.2249901827151246, 0.29132293951181787, 0.29570222215271086, 0.2796075942678196, 0.28871411057617774, 0.29302758518431116, 0.29227253592096986, 0.2856462545898644, 0.28687743467743254, 0.2900793948371436, 0.28627385826697854, 0.27308659940457203, 0.14117319401377473, 0.1761477350541332, 0.24048342650129406, 0.19387054212465876, 0.14470023981605995, 0.16704070762984086, 0.07547453139959907, 0.127993495025131, 1.0, 0.14319476311235024, 0.2005976632299017, 0.208968006943616, 0.20946008190179435, 0.20539809799180958, 0.21463587994609631, 0.20913407901977635, 0.20908020832330956, 0.1944493063711425, 0.20181175619582953, 0.2249901827151246, 0.29132293951181787, 0.29570222215271086, 0.2796075942678196, 0.28871411057617774, 0.29302758518431116, 0.29227253592096986, 0.2856462545898644, 0.28687743467743254, 0.2900793948371436, 0.28627385826697854, 0.27308659940457203, 0.14117319401377473, 0.1761477350541332, 0.24048342650129406, 0.19387054212465876, 0.14470023981605995, 0.16704070762984086, 0.07547453139959907, 0.127993495025131, 1.0, 0.14319476311235024, 0.2005976632299017, 0.208968006943616, 0.20946008190179435, 0.20539809799180958, 0.21463587994609631, 0.20913407901977635, 0.20908020832330956, 0.1944493063711425, 0.20181175619582953, 0.2249901827151246, 0.29132293951181787, 0.29570222215271086, 0.2796075942678196, 0.28871411057617774, 0.29302758518431116, 0.29227253592096986, 0.2856462545898644, 0.28687743467743254, 0.2900793948371436, 0.28627385826697854, 0.27308659940457203, 0.14117319401377473, 0.1761477350541332, 0.24048342650129406, 0.19387054212465876, 0.14470023981605995, 0.16704070762984086, 0.07547453139959907, 0.127993495025131, 1.0, 0.14319476311235024, 0.2005976632299017, 0.208968006943616, 0.20946008190179435, 0.20539809799180958, 0.21463587994609631, 0.20913407901977635, 0.20908020832330956, 0.1944493063711425, 0.20181175619582953, 0.2249901827151246, 0.29132293951181787, 0.29570222215271086, 0.2796075942678196, 0.28871411057617774, 0.29302758518431116, 0.29227253592096986, 0.2856462545898644, 0.28687743467743254, 0.2900793948371436, 0.28627385826697854, 0.27308659940457203, 0.14117319401377473, 0.1761477350541332, 0.24048342650129406, 0.19387054212465876, 0.14470023981605995, 0.16704070762984086, 0.07547453139959907, 0.127993495025131, 1.0, 0.14319476311235024, 0.2005976632299017, 0.208968006943616, 0.20946008190179435, 0.20539809799180958, 0.21463587994609631, 0.20913407901977635, 0.20908020832330956, 0.1944493063711425, 0.20181175619582953, 0.2249901827151246, 0.29132293951181787, 0.29570222215271086, 0.2796075942678196, 0.28871411057617774, 0.29302758518431116, 0.29227253592096986, 0.2856462545898644, 0.28687743467743254, 0.2900793948371436, 0.28627385826697854, 0.27308659940457203, 0.14117319401377473, 0.1761477350541332, 0.24048342650129406, 0.19387054212465876, 0.14470023981605995, 0.16704070762984086, 0.07547453139959907, 0.127993495025131, 1.0, 0.14319476311235024, 0.2005976632299017, 0.208968006943616, 0.20946008190179435, 0.20539809799180958, 0.21463587994609631, 0.20913407901977635, 0.20908020832330956, 0.1944493063711425, 0.20181175619582953, 0.2249901827151246, 0.29132293951181787, 0.29570222215271086, 0.2796075942678196, 0.28871411057617774, 0.29302758518431116, 0.29227253592096986, 
0.2856462545898644, 0.28687743467743254, 0.2900793948371436, 0.28627385826697854, 0.27308659940457203, 0.14117319401377473, 0.1761477350541332, 0.24048342650129406, 0.19387054212465876, 0.14470023981605995, 0.16704070762984086, 0.07547453139959907, 0.127993495025131, 1.0, 0.14319476311235024, 0.2005976632299017, 0.208968006943616, 0.20946008190179435, 0.20539809799180958, 0.21463587994609631, 0.20913407901977635, 0.20908020832330956, 0.1944493063711425, 0.20181175619582953, 0.2249901827151246, 0.29132293951181787, 0.29570222215271086, 0.2796075942678196, 0.28871411057617774, 0.29302758518431116, 0.29227253592096986, 0.2856462545898644, 0.28687743467743254, 0.2900793948371436, 0.28627385826697854, 0.27308659940457203, 0.14117319401377473, 0.1761477350541332, 0.24048342650129406, 0.19387054212465876, 0.14470023981605995, 0.16704070762984086, 0.07547453139959907, 0.127993495025131, 1.0, 0.14319476311235024, 0.2005976632299017, 0.208968006943616, 0.20946008190179435, 0.20539809799180958, 0.21463587994609631, 0.20913407901977635, 0.20908020832330956, 0.1944493063711425, 0.20181175619582953, 0.2249901827151246, 0.29132293951181787, 0.29570222215271086, 0.2796075942678196, 0.28871411057617774, 0.29302758518431116, 0.29227253592096986, 0.2856462545898644, 0.28687743467743254, 0.2900793948371436, 0.28627385826697854, 0.27308659940457203, 0.14117319401377473, 0.1761477350541332, 0.24048342650129406, 0.19387054212465876, 0.14470023981605995, 0.16704070762984086, 0.07547453139959907, 0.127993495025131, 1.0, 0.14319476311235024, 0.2005976632299017, 0.208968006943616, 0.20946008190179435, 0.20539809799180958, 0.21463587994609631, 0.20913407901977635, 0.20908020832330956, 0.1944493063711425, 0.20181175619582953, 0.2249901827151246, 0.29132293951181787, 0.29570222215271086, 0.2796075942678196, 0.28871411057617774, 0.29302758518431116, 0.29227253592096986, 0.2856462545898644, 0.28687743467743254, 0.2900793948371436, 0.28627385826697854, 0.27308659940457203, 0.14117319401377473, 0.1761477350541332, 0.24048342650129406, 0.19387054212465876, 0.14470023981605995, 0.16704070762984086, 0.07547453139959907, 0.127993495025131, 1.0, 0.14319476311235024, 0.2005976632299017, 0.208968006943616, 0.20946008190179435, 0.20539809799180958, 0.21463587994609631, 0.20913407901977635, 0.20908020832330956, 0.1944493063711425, 0.20181175619582953, 0.2249901827151246, 0.29132293951181787, 0.29570222215271086, 0.2796075942678196, 0.28871411057617774, 0.29302758518431116, 0.29227253592096986, 0.2856462545898644, 0.28687743467743254, 0.2900793948371436, 0.28627385826697854, 0.27308659940457203, 0.14117319401377473, 0.1761477350541332, 0.24048342650129406, 0.19387054212465876, 0.14470023981605995, 0.16704070762984086, 0.07547453139959907, 0.127993495025131, 1.0, 0.14319476311235024, 0.2005976632299017, 0.208968006943616, 0.20946008190179435, 0.20539809799180958, 0.21463587994609631, 0.20913407901977635, 0.20908020832330956, 0.1944493063711425, 0.20181175619582953, 0.2249901827151246, 0.29132293951181787, 0.29570222215271086, 0.2796075942678196, 0.28871411057617774, 0.29302758518431116, 0.29227253592096986, 0.2856462545898644, 0.28687743467743254, 0.2900793948371436, 0.28627385826697854, 0.27308659940457203, 0.14117319401377473, 0.1761477350541332, 0.24048342650129406, 0.19387054212465876, 0.14470023981605995, 0.16704070762984086, 0.07547453139959907, 0.127993495025131, 1.0, 0.14319476311235024, 0.2005976632299017, 0.208968006943616, 0.20946008190179435, 0.20539809799180958, 0.21463587994609631, 0.20913407901977635, 
0.20908020832330956, 0.1944493063711425, 0.20181175619582953, 0.2249901827151246, 0.29132293951181787, 0.29570222215271086, 0.2796075942678196, 0.28871411057617774, 0.29302758518431116, 0.29227253592096986, 0.2856462545898644, 0.28687743467743254, 0.2900793948371436, 0.28627385826697854, 0.27308659940457203, 0.14117319401377473, 0.1761477350541332, 0.24048342650129406, 0.19387054212465876, 0.14470023981605995, 0.16704070762984086, 0.07547453139959907, 0.127993495025131, 1.0, 0.14319476311235024, 0.2005976632299017, 0.208968006943616, 0.20946008190179435, 0.20539809799180958, 0.21463587994609631, 0.20913407901977635, 0.20908020832330956, 0.1944493063711425, 0.20181175619582953, 0.2249901827151246, 0.29132293951181787, 0.29570222215271086, 0.2796075942678196, 0.28871411057617774, 0.29302758518431116, 0.29227253592096986, 0.2856462545898644, 0.28687743467743254, 0.2900793948371436, 0.28627385826697854, 0.27308659940457203, 0.14117319401377473, 0.1761477350541332, 0.24048342650129406, 0.19387054212465876, 0.14470023981605995, 0.16704070762984086, 0.07547453139959907, 0.127993495025131, 1.0, 0.14319476311235024, 0.2005976632299017, 0.208968006943616, 0.20946008190179435, 0.20539809799180958, 0.21463587994609631, 0.20913407901977635, 0.20908020832330956, 0.1944493063711425, 0.20181175619582953, 0.2249901827151246, 0.29132293951181787, 0.29570222215271086, 0.2796075942678196, 0.28871411057617774, 0.29302758518431116, 0.29227253592096986, 0.2856462545898644, 0.28687743467743254, 0.2900793948371436, 0.28627385826697854, 0.27308659940457203, 0.14117319401377473, 0.1761477350541332, 0.24048342650129406, 0.19387054212465876, 0.14470023981605995, 0.16704070762984086, 0.07547453139959907, 0.127993495025131, 1.0, 0.14319476311235024, 0.2005976632299017, 0.208968006943616, 0.20946008190179435, 0.20539809799180958, 0.21463587994609631, 0.20913407901977635, 0.20908020832330956, 0.1944493063711425, 0.20181175619582953, 0.2249901827151246, 0.29132293951181787, 0.29570222215271086, 0.2796075942678196, 0.28871411057617774, 0.29302758518431116, 0.29227253592096986, 0.2856462545898644, 0.28687743467743254, 0.2900793948371436, 0.28627385826697854, 0.27308659940457203, 0.14117319401377473, 0.1761477350541332, 0.24048342650129406, 0.19387054212465876, 0.14470023981605995, 0.16704070762984086, 0.07547453139959907, 0.127993495025131, 1.0, 0.14319476311235024, 0.2005976632299017, 0.208968006943616, 0.20946008190179435, 0.20539809799180958, 0.21463587994609631, 0.20913407901977635, 0.20908020832330956, 0.1944493063711425, 0.20181175619582953, 0.2249901827151246, 0.29132293951181787, 0.29570222215271086, 0.2796075942678196, 0.28871411057617774, 0.29302758518431116, 0.29227253592096986, 0.2856462545898644, 0.28687743467743254, 0.2900793948371436, 0.28627385826697854, 0.27308659940457203, 0.14117319401377473, 0.1761477350541332, 0.24048342650129406, 0.19387054212465876, 0.14470023981605995, 0.16704070762984086, 0.07547453139959907, 0.127993495025131, 1.0, 0.14319476311235024, 0.2005976632299017, 0.208968006943616, 0.20946008190179435, 0.20539809799180958, 0.21463587994609631, 0.20913407901977635, 0.20908020832330956, 0.1944493063711425, 0.20181175619582953, 0.2249901827151246, 0.29132293951181787, 0.29570222215271086, 0.2796075942678196, 0.28871411057617774, 0.29302758518431116, 0.29227253592096986, 0.2856462545898644, 0.28687743467743254, 0.2900793948371436, 0.28627385826697854, 0.27308659940457203, 0.14117319401377473, 0.1761477350541332, 0.24048342650129406, 0.19387054212465876, 0.14470023981605995, 0.16704070762984086, 
0.07547453139959907, 0.127993495025131, 1.0, 0.14319476311235024, 0.2005976632299017, 0.208968006943616, 0.20946008190179435, 0.20539809799180958, 0.21463587994609631, 0.20913407901977635, 0.20908020832330956, 0.1944493063711425, 0.20181175619582953, 0.2249901827151246, 0.29132293951181787, 0.29570222215271086, 0.2796075942678196, 0.28871411057617774, 0.29302758518431116, 0.29227253592096986, 0.2856462545898644, 0.28687743467743254, 0.2900793948371436, 0.28627385826697854, 0.27308659940457203, 0.14117319401377473, 0.1761477350541332, 0.24048342650129406, 0.19387054212465876, 0.14470023981605995, 0.16704070762984086, 0.07547453139959907, 0.127993495025131, 1.0, 0.14319476311235024, 0.2005976632299017, 0.208968006943616, 0.20946008190179435, 0.20539809799180958, 0.21463587994609631, 0.20913407901977635, 0.20908020832330956, 0.1944493063711425, 0.20181175619582953, 0.2249901827151246, 0.29132293951181787, 0.29570222215271086, 0.2796075942678196, 0.28871411057617774, 0.29302758518431116, 0.29227253592096986, 0.2856462545898644, 0.28687743467743254, 0.2900793948371436, 0.28627385826697854, 0.27308659940457203, 0.14117319401377473, 0.1761477350541332, 0.24048342650129406, 0.19387054212465876, 0.14470023981605995, 0.16704070762984086, 0.07547453139959907, 0.127993495025131, 1.0, 0.14319476311235024, 0.2005976632299017, 0.208968006943616, 0.20946008190179435, 0.20539809799180958, 0.21463587994609631, 0.20913407901977635, 0.20908020832330956, 0.1944493063711425, 0.20181175619582953, 0.2249901827151246, 0.29132293951181787, 0.29570222215271086, 0.2796075942678196, 0.28871411057617774, 0.29302758518431116, 0.29227253592096986, 0.2856462545898644, 0.28687743467743254, 0.2900793948371436, 0.28627385826697854, 0.27308659940457203, 0.14117319401377473, 0.1761477350541332, 0.24048342650129406, 0.19387054212465876, 0.14470023981605995, 0.16704070762984086, 0.07547453139959907, 0.127993495025131, 1.0, 0.14319476311235024, 0.2005976632299017, 0.208968006943616, 0.20946008190179435, 0.20539809799180958, 0.21463587994609631, 0.20913407901977635, 0.20908020832330956, 0.1944493063711425, 0.20181175619582953, 0.2249901827151246, 0.29132293951181787, 0.29570222215271086, 0.2796075942678196, 0.28871411057617774, 0.29302758518431116, 0.29227253592096986, 0.2856462545898644, 0.28687743467743254, 0.2900793948371436, 0.28627385826697854, 0.27308659940457203, 0.14117319401377473, 0.1761477350541332, 0.24048342650129406, 0.19387054212465876, 0.14470023981605995, 0.16704070762984086, 0.07547453139959907, 0.127993495025131, 1.0, 0.14319476311235024, 0.2005976632299017, 0.208968006943616, 0.20946008190179435, 0.20539809799180958, 0.21463587994609631, 0.20913407901977635, 0.20908020832330956, 0.1944493063711425, 0.20181175619582953, 0.2249901827151246, 0.29132293951181787, 0.29570222215271086, 0.2796075942678196, 0.28871411057617774, 0.29302758518431116, 0.29227253592096986, 0.2856462545898644, 0.28687743467743254, 0.2900793948371436, 0.28627385826697854, 0.27308659940457203, 0.14117319401377473, 0.1761477350541332, 0.24048342650129406, 0.19387054212465876, 0.14470023981605995, 0.16704070762984086, 0.07547453139959907, 0.127993495025131, 1.0, 0.14319476311235024, 0.2005976632299017, 0.208968006943616, 0.20946008190179435, 0.20539809799180958, 0.21463587994609631, 0.20913407901977635, 0.20908020832330956, 0.1944493063711425, 0.20181175619582953, 0.2249901827151246, 0.29132293951181787, 0.29570222215271086, 0.2796075942678196, 0.28871411057617774, 0.29302758518431116, 0.29227253592096986, 0.2856462545898644, 
0.28687743467743254, 0.2900793948371436, 0.28627385826697854, 0.27308659940457203, 0.14117319401377473, 0.1761477350541332, 0.24048342650129406, 0.19387054212465876, 0.14470023981605995, 0.16704070762984086, 0.07547453139959907, 0.127993495025131, 1.0, 0.14319476311235024, 0.2005976632299017, 0.208968006943616, 0.20946008190179435, 0.20539809799180958, 0.21463587994609631, 0.20913407901977635, 0.20908020832330956, 0.1944493063711425, 0.20181175619582953, 0.2249901827151246, 0.29132293951181787, 0.29570222215271086, 0.2796075942678196, 0.28871411057617774, 0.29302758518431116, 0.29227253592096986, 0.2856462545898644, 0.28687743467743254, 0.2900793948371436, 0.28627385826697854, 0.27308659940457203, 0.14117319401377473, 0.1761477350541332, 0.24048342650129406, 0.19387054212465876, 0.14470023981605995, 0.16704070762984086, 0.07547453139959907, 0.127993495025131, 1.0, 0.14319476311235024, 0.2005976632299017, 0.208968006943616, 0.20946008190179435, 0.20539809799180958, 0.21463587994609631, 0.20913407901977635, 0.20908020832330956, 0.1944493063711425, 0.20181175619582953, 0.2249901827151246, 0.29132293951181787, 0.29570222215271086, 0.2796075942678196, 0.28871411057617774, 0.29302758518431116, 0.29227253592096986, 0.2856462545898644, 0.28687743467743254, 0.2900793948371436, 0.28627385826697854, 0.27308659940457203, 0.14117319401377473, 0.1761477350541332, 0.24048342650129406, 0.19387054212465876, 0.14470023981605995, 0.16704070762984086, 0.07547453139959907, 0.127993495025131, 1.0, 0.14319476311235024, 0.2005976632299017, 0.208968006943616, 0.20946008190179435, 0.20539809799180958, 0.21463587994609631, 0.20913407901977635, 0.20908020832330956, 0.1944493063711425, 0.20181175619582953, 0.2249901827151246, 0.29132293951181787, 0.29570222215271086, 0.2796075942678196, 0.28871411057617774, 0.29302758518431116, 0.29227253592096986, 0.2856462545898644, 0.28687743467743254, 0.2900793948371436, 0.28627385826697854, 0.27308659940457203, 0.14117319401377473, 0.1761477350541332, 0.24048342650129406, 0.19387054212465876, 0.14470023981605995, 0.16704070762984086, 0.07547453139959907, 0.127993495025131, 1.0, 0.14319476311235024, 0.2005976632299017, 0.208968006943616, 0.20946008190179435, 0.20539809799180958, 0.21463587994609631, 0.20913407901977635, 0.20908020832330956, 0.1944493063711425, 0.20181175619582953, 0.2249901827151246, 0.29132293951181787, 0.29570222215271086, 0.2796075942678196, 0.28871411057617774, 0.29302758518431116, 0.29227253592096986, 0.2856462545898644, 0.28687743467743254, 0.2900793948371436, 0.28627385826697854, 0.27308659940457203, 0.14117319401377473, 0.1761477350541332, 0.24048342650129406, 0.19387054212465876, 0.14470023981605995, 0.16704070762984086, 0.07547453139959907, 0.127993495025131, 1.0, 0.14319476311235024, 0.2005976632299017, 0.208968006943616, 0.20946008190179435, 0.20539809799180958, 0.21463587994609631, 0.20913407901977635, 0.20908020832330956, 0.1944493063711425, 0.20181175619582953, 0.2249901827151246, 0.29132293951181787, 0.29570222215271086, 0.2796075942678196, 0.28871411057617774, 0.29302758518431116, 0.29227253592096986, 0.2856462545898644, 0.28687743467743254, 0.2900793948371436, 0.28627385826697854, 0.27308659940457203, 0.14117319401377473, 0.1761477350541332, 0.24048342650129406, 0.19387054212465876, 0.14470023981605995, 0.16704070762984086, 0.07547453139959907, 0.127993495025131, 1.0, 0.14319476311235024, 0.2005976632299017, 0.208968006943616, 0.20946008190179435, 0.20539809799180958, 0.21463587994609631, 0.20913407901977635, 0.20908020832330956, 
0.1944493063711425, 0.20181175619582953, 0.2249901827151246, 0.29132293951181787, 0.29570222215271086, 0.2796075942678196, 0.28871411057617774, 0.29302758518431116, 0.29227253592096986, 0.2856462545898644, 0.28687743467743254, 0.2900793948371436, 0.28627385826697854, 0.27308659940457203, 0.14117319401377473, 0.1761477350541332, 0.24048342650129406, 0.19387054212465876, 0.14470023981605995, 0.16704070762984086, 0.07547453139959907, 0.127993495025131, 1.0, 0.14319476311235024]}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB AskUbuntuDupQuestions", "type": "mteb/askubuntudupquestions-reranking", "config": "default", "split": "test", "revision": "2000358ca161889fa9c082cb41daa8dcfb161a54"}, "metrics": [{"type": "map", "value": 52.344372012529384}, {"type": "mrr", "value": 65.32614430813877}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB BIOSSES", "type": "mteb/biosses-sts", "config": "default", "split": "test", "revision": "d3fb88f8f02e40887cd149695127462bbcf29b4a"}, "metrics": [{"type": "cos_sim_pearson", "value": 69.44065444549933}, {"type": "cos_sim_spearman", "value": 71.77814153774398}, {"type": "euclidean_pearson", "value": 70.59416783558756}, {"type": "euclidean_spearman", "value": 71.77814153774398}, {"type": "manhattan_pearson", "value": 70.99287197201959}, {"type": "manhattan_spearman", "value": 72.0769435268729}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB Banking77Classification", "type": "mteb/banking77", "config": "default", "split": "test", "revision": "0fd18e25b25c072e09e0d92ab615fda904d66300"}, "metrics": [{"type": "accuracy", "value": 67.12987012987013}, {"type": "f1", "value": 65.99991975715585}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB BiorxivClusteringP2P", "type": "mteb/biorxiv-clustering-p2p", "config": "default", "split": "test", "revision": "65b79d1d13f80053f67aca9498d9402c2d9f1f40"}, "metrics": [{"type": "v_measure", "value": 30.861774505346606}, {"type": "v_measures", "value": [0.3057878417529878, 0.3086229109676654, 0.3080657568280612, 0.3002878816865892, 0.30903247986282023, 0.3022960257813801, 0.31981283125167154, 0.3119766955566159, 0.3039859162306553, 0.31630911061621453]}]},
{"task": {"type": "Clustering"}, "dataset": {"name": "MTEB BiorxivClusteringS2S", "type": "mteb/biorxiv-clustering-s2s", "config": "default", "split": "test", "revision": "258694dd0231531bc1fd9de6ceb52a0853c6d908"}, "metrics": [{"type": "v_measure", "value": 21.100665285420916}, {"type": "v_measures", "value": [0.21042268101320297, 0.19607301651541253, 0.21811669828359762, 0.20892482431651227, 0.20621532003083415, 0.215815720040119, 0.20517452774094483, 0.21396360841093787, 0.20967704706047804, 0.22568308513005236, 0.21042268101320297, 0.19607301651541253, 0.21811669828359762, 0.20892482431651227, 0.20621532003083415, 0.215815720040119, 0.20517452774094483,
0.21396360841093787, 0.20967704706047804, 0.22568308513005236, 0.21042268101320297, 0.19607301651541253, 0.21811669828359762, 0.20892482431651227, 0.20621532003083415, 0.215815720040119, 0.20517452774094483, 0.21396360841093787, 0.20967704706047804, 0.22568308513005236, 0.21042268101320297, 0.19607301651541253, 0.21811669828359762, 0.20892482431651227, 0.20621532003083415, 0.215815720040119, 0.20517452774094483, 0.21396360841093787, 0.20967704706047804, 0.22568308513005236, 0.21042268101320297, 0.19607301651541253, 0.21811669828359762, 0.20892482431651227, 0.20621532003083415, 0.215815720040119, 0.20517452774094483, 0.21396360841093787, 0.20967704706047804, 0.22568308513005236, 0.21042268101320297, 0.19607301651541253, 0.21811669828359762, 0.20892482431651227, 0.20621532003083415, 0.215815720040119, 0.20517452774094483, 0.21396360841093787, 0.20967704706047804, 0.22568308513005236, 0.21042268101320297, 0.19607301651541253, 0.21811669828359762, 0.20892482431651227, 0.20621532003083415, 0.215815720040119, 0.20517452774094483, 0.21396360841093787, 0.20967704706047804, 0.22568308513005236, 0.21042268101320297, 0.19607301651541253, 0.21811669828359762, 0.20892482431651227, 0.20621532003083415, 0.215815720040119, 0.20517452774094483, 0.21396360841093787, 0.20967704706047804, 0.22568308513005236, 0.21042268101320297, 0.19607301651541253, 0.21811669828359762, 0.20892482431651227, 0.20621532003083415, 0.215815720040119, 0.20517452774094483, 0.21396360841093787, 0.20967704706047804, 0.22568308513005236, 0.21042268101320297, 0.19607301651541253, 0.21811669828359762, 0.20892482431651227, 0.20621532003083415, 0.215815720040119, 0.20517452774094483, 0.21396360841093787, 0.20967704706047804, 0.22568308513005236, 0.21042268101320297, 0.19607301651541253, 0.21811669828359762, 0.20892482431651227, 0.20621532003083415, 0.215815720040119, 0.20517452774094483, 0.21396360841093787, 0.20967704706047804, 0.22568308513005236, 0.21042268101320297, 0.19607301651541253, 0.21811669828359762, 0.20892482431651227, 0.20621532003083415, 0.215815720040119, 0.20517452774094483, 0.21396360841093787, 0.20967704706047804, 0.22568308513005236, 0.21042268101320297, 0.19607301651541253, 0.21811669828359762, 0.20892482431651227, 0.20621532003083415, 0.215815720040119, 0.20517452774094483, 0.21396360841093787, 0.20967704706047804, 0.22568308513005236, 0.21042268101320297, 0.19607301651541253, 0.21811669828359762, 0.20892482431651227, 0.20621532003083415, 0.215815720040119, 0.20517452774094483, 0.21396360841093787, 0.20967704706047804, 0.22568308513005236, 0.21042268101320297, 0.19607301651541253, 0.21811669828359762, 0.20892482431651227, 0.20621532003083415, 0.215815720040119, 0.20517452774094483, 0.21396360841093787, 0.20967704706047804, 0.22568308513005236, 0.21042268101320297, 0.19607301651541253, 0.21811669828359762, 0.20892482431651227, 0.20621532003083415, 0.215815720040119, 0.20517452774094483, 0.21396360841093787, 0.20967704706047804, 0.22568308513005236, 0.21042268101320297, 0.19607301651541253, 0.21811669828359762, 0.20892482431651227, 0.20621532003083415, 0.215815720040119, 0.20517452774094483, 0.21396360841093787, 0.20967704706047804, 0.22568308513005236, 0.21042268101320297, 0.19607301651541253, 0.21811669828359762, 0.20892482431651227, 0.20621532003083415, 0.215815720040119, 0.20517452774094483, 0.21396360841093787, 0.20967704706047804, 0.22568308513005236, 0.21042268101320297, 0.19607301651541253, 0.21811669828359762, 0.20892482431651227, 0.20621532003083415, 0.215815720040119, 0.20517452774094483, 
0.21396360841093787, 0.20967704706047804, 0.22568308513005236, 0.21042268101320297, 0.19607301651541253, 0.21811669828359762, 0.20892482431651227, 0.20621532003083415, 0.215815720040119, 0.20517452774094483, 0.21396360841093787, 0.20967704706047804, 0.22568308513005236, 0.21042268101320297, 0.19607301651541253, 0.21811669828359762, 0.20892482431651227, 0.20621532003083415, 0.215815720040119, 0.20517452774094483, 0.21396360841093787, 0.20967704706047804, 0.22568308513005236, 0.21042268101320297, 0.19607301651541253, 0.21811669828359762, 0.20892482431651227, 0.20621532003083415, 0.215815720040119, 0.20517452774094483, 0.21396360841093787, 0.20967704706047804, 0.22568308513005236, 0.21042268101320297, 0.19607301651541253, 0.21811669828359762, 0.20892482431651227, 0.20621532003083415, 0.215815720040119, 0.20517452774094483, 0.21396360841093787, 0.20967704706047804, 0.22568308513005236]}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackAndroidRetrieval", "type": "mteb/cqadupstack-android", "config": "default", "split": "test", "revision": "f46a197baaae43b4f621051089b82a364682dfeb"}, "metrics": [{"type": "map_at_1", "value": 17.835}, {"type": "map_at_10", "value": 24.718999999999998}, {"type": "map_at_100", "value": 25.755}, {"type": "map_at_1000", "value": 25.887}, {"type": "map_at_20", "value": 25.217}, {"type": "map_at_3", "value": 23.076}, {"type": "map_at_5", "value": 23.96}, {"type": "mrr_at_1", "value": 23.033}, {"type": "mrr_at_10", "value": 29.868}, {"type": "mrr_at_100", "value": 30.757}, {"type": "mrr_at_1000", "value": 30.834}, {"type": "mrr_at_20", "value": 30.37}, {"type": "mrr_at_3", "value": 28.112}, {"type": "mrr_at_5", "value": 29.185}, {"type": "ndcg_at_1", "value": 23.033}, {"type": "ndcg_at_10", "value": 28.899}, {"type": "ndcg_at_100", "value": 33.788000000000004}, {"type": "ndcg_at_1000", "value": 36.962}, {"type": "ndcg_at_20", "value": 30.497000000000003}, {"type": "ndcg_at_3", "value": 26.442}, {"type": "ndcg_at_5", "value": 27.466}, {"type": "precision_at_1", "value": 23.033}, {"type": "precision_at_10", "value": 5.351}, {"type": "precision_at_100", "value": 0.9610000000000001}, {"type": "precision_at_1000", "value": 0.151}, {"type": "precision_at_20", "value": 3.2259999999999995}, {"type": "precision_at_3", "value": 12.923000000000002}, {"type": "precision_at_5", "value": 8.956}, {"type": "recall_at_1", "value": 17.835}, {"type": "recall_at_10", "value": 36.034}, {"type": "recall_at_100", "value": 57.615}, {"type": "recall_at_1000", "value": 79.72}, {"type": "recall_at_20", "value": 41.894999999999996}, {"type": "recall_at_3", "value": 28.313}, {"type": "recall_at_5", "value": 31.639}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackEnglishRetrieval", "type": "mteb/cqadupstack-english", "config": "default", "split": "test", "revision": "ad9991cb51e31e31e430383c75ffb2885547b5f0"}, "metrics": [{"type": "map_at_1", "value": 12.166}, {"type": "map_at_10", "value": 16.320999999999998}, {"type": "map_at_100", "value": 16.954}, {"type": "map_at_1000", "value": 17.054}, {"type": "map_at_20", "value": 16.651}, {"type": "map_at_3", "value": 14.890999999999998}, {"type": "map_at_5", "value": 15.695999999999998}, {"type": "mrr_at_1", "value": 15.287}, {"type": "mrr_at_10", "value": 19.487}, {"type": "mrr_at_100", "value": 20.11}, {"type": "mrr_at_1000", "value": 20.185}, {"type": "mrr_at_20", "value": 19.830000000000002}, {"type": "mrr_at_3", "value": 18.068}, {"type": "mrr_at_5", "value": 18.855}, {"type": "ndcg_at_1", "value": 
15.287}, {"type": "ndcg_at_10", "value": 19.198999999999998}, {"type": "ndcg_at_100", "value": 22.395}, {"type": "ndcg_at_1000", "value": 25.106}, {"type": "ndcg_at_20", "value": 20.297}, {"type": "ndcg_at_3", "value": 16.743}, {"type": "ndcg_at_5", "value": 17.855999999999998}, {"type": "precision_at_1", "value": 15.287}, {"type": "precision_at_10", "value": 3.605}, {"type": "precision_at_100", "value": 0.638}, {"type": "precision_at_1000", "value": 0.108}, {"type": "precision_at_20", "value": 2.166}, {"type": "precision_at_3", "value": 8.089}, {"type": "precision_at_5", "value": 5.822}, {"type": "recall_at_1", "value": 12.166}, {"type": "recall_at_10", "value": 24.701999999999998}, {"type": "recall_at_100", "value": 39.199}, {"type": "recall_at_1000", "value": 58.205}, {"type": "recall_at_20", "value": 28.791}, {"type": "recall_at_3", "value": 17.469}, {"type": "recall_at_5", "value": 20.615}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackGamingRetrieval", "type": "mteb/cqadupstack-gaming", "config": "default", "split": "test", "revision": "4885aa143210c98657558c04aaf3dc47cfb54340"}, "metrics": [{"type": "map_at_1", "value": 19.667}, {"type": "map_at_10", "value": 27.163999999999998}, {"type": "map_at_100", "value": 28.044000000000004}, {"type": "map_at_1000", "value": 28.142}, {"type": "map_at_20", "value": 27.645999999999997}, {"type": "map_at_3", "value": 24.914}, {"type": "map_at_5", "value": 26.078000000000003}, {"type": "mrr_at_1", "value": 23.197000000000003}, {"type": "mrr_at_10", "value": 30.202}, {"type": "mrr_at_100", "value": 30.976}, {"type": "mrr_at_1000", "value": 31.047000000000004}, {"type": "mrr_at_20", "value": 30.636000000000003}, {"type": "mrr_at_3", "value": 28.004}, {"type": "mrr_at_5", "value": 29.164}, {"type": "ndcg_at_1", "value": 23.197000000000003}, {"type": "ndcg_at_10", "value": 31.618000000000002}, {"type": "ndcg_at_100", "value": 35.977}, {"type": "ndcg_at_1000", "value": 38.458}, {"type": "ndcg_at_20", "value": 33.242}, {"type": "ndcg_at_3", "value": 27.285999999999998}, {"type": "ndcg_at_5", "value": 29.163}, {"type": "precision_at_1", "value": 23.197000000000003}, {"type": "precision_at_10", "value": 5.26}, {"type": "precision_at_100", "value": 0.8200000000000001}, {"type": "precision_at_1000", "value": 0.11199999999999999}, {"type": "precision_at_20", "value": 3.082}, {"type": "precision_at_3", "value": 12.247}, {"type": "precision_at_5", "value": 8.577}, {"type": "recall_at_1", "value": 19.667}, {"type": "recall_at_10", "value": 42.443}, {"type": "recall_at_100", "value": 62.254}, {"type": "recall_at_1000", "value": 80.44}, {"type": "recall_at_20", "value": 48.447}, {"type": "recall_at_3", "value": 30.518}, {"type": "recall_at_5", "value": 35.22}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackGisRetrieval", "type": "mteb/cqadupstack-gis", "config": "default", "split": "test", "revision": "5003b3064772da1887988e05400cf3806fe491f2"}, "metrics": [{"type": "map_at_1", "value": 10.923}, {"type": "map_at_10", "value": 14.24}, {"type": "map_at_100", "value": 15.001000000000001}, {"type": "map_at_1000", "value": 15.092}, {"type": "map_at_20", "value": 14.623}, {"type": "map_at_3", "value": 13.168}, {"type": "map_at_5", "value": 13.678}, {"type": "mrr_at_1", "value": 11.525}, {"type": "mrr_at_10", "value": 15.187000000000001}, {"type": "mrr_at_100", "value": 15.939999999999998}, {"type": "mrr_at_1000", "value": 16.03}, {"type": "mrr_at_20", "value": 15.557000000000002}, {"type": "mrr_at_3", "value": 
13.991999999999999}, {"type": "mrr_at_5", "value": 14.557}, {"type": "ndcg_at_1", "value": 11.525}, {"type": "ndcg_at_10", "value": 16.512999999999998}, {"type": "ndcg_at_100", "value": 20.445}, {"type": "ndcg_at_1000", "value": 23.398}, {"type": "ndcg_at_20", "value": 17.832}, {"type": "ndcg_at_3", "value": 14.224}, {"type": "ndcg_at_5", "value": 15.136}, {"type": "precision_at_1", "value": 11.525}, {"type": "precision_at_10", "value": 2.565}, {"type": "precision_at_100", "value": 0.484}, {"type": "precision_at_1000", "value": 0.076}, {"type": "precision_at_20", "value": 1.582}, {"type": "precision_at_3", "value": 5.989}, {"type": "precision_at_5", "value": 4.1579999999999995}, {"type": "recall_at_1", "value": 10.923}, {"type": "recall_at_10", "value": 22.695}, {"type": "recall_at_100", "value": 40.892}, {"type": "recall_at_1000", "value": 64.456}, {"type": "recall_at_20", "value": 27.607}, {"type": "recall_at_3", "value": 16.348}, {"type": "recall_at_5", "value": 18.504}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackMathematicaRetrieval", "type": "mteb/cqadupstack-mathematica", "config": "default", "split": "test", "revision": "90fceea13679c63fe563ded68f3b6f06e50061de"}, "metrics": [{"type": "map_at_1", "value": 5.409}, {"type": "map_at_10", "value": 8.584999999999999}, {"type": "map_at_100", "value": 9.392}, {"type": "map_at_1000", "value": 9.5}, {"type": "map_at_20", "value": 8.943}, {"type": "map_at_3", "value": 7.3}, {"type": "map_at_5", "value": 7.962}, {"type": "mrr_at_1", "value": 6.965000000000001}, {"type": "mrr_at_10", "value": 10.593}, {"type": "mrr_at_100", "value": 11.496}, {"type": "mrr_at_1000", "value": 11.578}, {"type": "mrr_at_20", "value": 11.021}, {"type": "mrr_at_3", "value": 8.976}, {"type": "mrr_at_5", "value": 9.797}, {"type": "ndcg_at_1", "value": 6.965000000000001}, {"type": "ndcg_at_10", "value": 11.056000000000001}, {"type": "ndcg_at_100", "value": 15.683}, {"type": "ndcg_at_1000", "value": 18.873}, {"type": "ndcg_at_20", "value": 12.331}, {"type": "ndcg_at_3", "value": 8.334}, {"type": "ndcg_at_5", "value": 9.512}, {"type": "precision_at_1", "value": 6.965000000000001}, {"type": "precision_at_10", "value": 2.177}, {"type": "precision_at_100", "value": 0.54}, {"type": "precision_at_1000", "value": 0.095}, {"type": "precision_at_20", "value": 1.468}, {"type": "precision_at_3", "value": 3.9800000000000004}, {"type": "precision_at_5", "value": 3.109}, {"type": "recall_at_1", "value": 5.409}, {"type": "recall_at_10", "value": 16.895}, {"type": "recall_at_100", "value": 38.167}, {"type": "recall_at_1000", "value": 61.783}, {"type": "recall_at_20", "value": 21.248}, {"type": "recall_at_3", "value": 9.518}, {"type": "recall_at_5", "value": 12.426}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackPhysicsRetrieval", "type": "mteb/cqadupstack-physics", "config": "default", "split": "test", "revision": "79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4"}, "metrics": [{"type": "map_at_1", "value": 13.688}, {"type": "map_at_10", "value": 19.096}, {"type": "map_at_100", "value": 20.058}, {"type": "map_at_1000", "value": 20.194000000000003}, {"type": "map_at_20", "value": 19.595000000000002}, {"type": "map_at_3", "value": 17.313000000000002}, {"type": "map_at_5", "value": 18.41}, {"type": "mrr_at_1", "value": 17.132}, {"type": "mrr_at_10", "value": 22.95}, {"type": "mrr_at_100", "value": 23.799}, {"type": "mrr_at_1000", "value": 23.884}, {"type": "mrr_at_20", "value": 23.419999999999998}, {"type": "mrr_at_3", "value": 20.95}, {"type": 
"mrr_at_5", "value": 22.21}, {"type": "ndcg_at_1", "value": 17.132}, {"type": "ndcg_at_10", "value": 22.88}, {"type": "ndcg_at_100", "value": 27.572000000000003}, {"type": "ndcg_at_1000", "value": 30.824}, {"type": "ndcg_at_20", "value": 24.516}, {"type": "ndcg_at_3", "value": 19.64}, {"type": "ndcg_at_5", "value": 21.4}, {"type": "precision_at_1", "value": 17.132}, {"type": "precision_at_10", "value": 4.263999999999999}, {"type": "precision_at_100", "value": 0.7969999999999999}, {"type": "precision_at_1000", "value": 0.125}, {"type": "precision_at_20", "value": 2.6519999999999997}, {"type": "precision_at_3", "value": 9.336}, {"type": "precision_at_5", "value": 6.93}, {"type": "recall_at_1", "value": 13.688}, {"type": "recall_at_10", "value": 30.537999999999997}, {"type": "recall_at_100", "value": 51.017999999999994}, {"type": "recall_at_1000", "value": 73.921}, {"type": "recall_at_20", "value": 36.174}, {"type": "recall_at_3", "value": 21.568}, {"type": "recall_at_5", "value": 26.127}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackProgrammersRetrieval", "type": "mteb/cqadupstack-programmers", "config": "default", "split": "test", "revision": "6184bc1440d2dbc7612be22b50686b8826d22b32"}, "metrics": [{"type": "map_at_1", "value": 8.173}, {"type": "map_at_10", "value": 11.648}, {"type": "map_at_100", "value": 12.434000000000001}, {"type": "map_at_1000", "value": 12.540000000000001}, {"type": "map_at_20", "value": 12.030000000000001}, {"type": "map_at_3", "value": 10.568}, {"type": "map_at_5", "value": 11.064}, {"type": "mrr_at_1", "value": 10.274}, {"type": "mrr_at_10", "value": 14.505}, {"type": "mrr_at_100", "value": 15.332}, {"type": "mrr_at_1000", "value": 15.409}, {"type": "mrr_at_20", "value": 14.899999999999999}, {"type": "mrr_at_3", "value": 13.375}, {"type": "mrr_at_5", "value": 13.929}, {"type": "ndcg_at_1", "value": 10.274}, {"type": "ndcg_at_10", "value": 14.283999999999999}, {"type": "ndcg_at_100", "value": 18.731}, {"type": "ndcg_at_1000", "value": 21.744}, {"type": "ndcg_at_20", "value": 15.647}, {"type": "ndcg_at_3", "value": 12.278}, {"type": "ndcg_at_5", "value": 12.974}, {"type": "precision_at_1", "value": 10.274}, {"type": "precision_at_10", "value": 2.683}, {"type": "precision_at_100", "value": 0.582}, {"type": "precision_at_1000", "value": 0.099}, {"type": "precision_at_20", "value": 1.7409999999999999}, {"type": "precision_at_3", "value": 6.088}, {"type": "precision_at_5", "value": 4.201}, {"type": "recall_at_1", "value": 8.173}, {"type": "recall_at_10", "value": 19.642}, {"type": "recall_at_100", "value": 40.213}, {"type": "recall_at_1000", "value": 62.083999999999996}, {"type": "recall_at_20", "value": 24.537}, {"type": "recall_at_3", "value": 13.700999999999999}, {"type": "recall_at_5", "value": 15.751000000000001}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackRetrieval", "type": "mteb/cqadupstack", "config": "default", "split": "test", "revision": "4ffe81d471b1924886b33c7567bfb200e9eec5c4"}, "metrics": [{"type": "map_at_1", "value": 11.252416666666667}, {"type": "map_at_10", "value": 15.589583333333334}, {"type": "map_at_100", "value": 16.381166666666665}, {"type": "map_at_1000", "value": 16.490333333333332}, {"type": "map_at_20", "value": 15.99116666666667}, {"type": "map_at_3", "value": 14.140916666666667}, {"type": "map_at_5", "value": 14.9045}, {"type": "mrr_at_1", "value": 13.710416666666664}, {"type": "mrr_at_10", "value": 18.34416666666667}, {"type": "mrr_at_100", "value": 19.110083333333336}, {"type": 
"mrr_at_1000", "value": 19.192583333333335}, {"type": "mrr_at_20", "value": 18.74783333333333}, {"type": "mrr_at_3", "value": 16.799416666666666}, {"type": "mrr_at_5", "value": 17.62725}, {"type": "ndcg_at_1", "value": 13.710416666666664}, {"type": "ndcg_at_10", "value": 18.628583333333335}, {"type": "ndcg_at_100", "value": 22.733666666666668}, {"type": "ndcg_at_1000", "value": 25.728499999999997}, {"type": "ndcg_at_20", "value": 19.994500000000002}, {"type": "ndcg_at_3", "value": 15.918083333333332}, {"type": "ndcg_at_5", "value": 17.086999999999996}, {"type": "precision_at_1", "value": 13.710416666666664}, {"type": "precision_at_10", "value": 3.3575}, {"type": "precision_at_100", "value": 0.6368333333333333}, {"type": "precision_at_1000", "value": 0.10508333333333333}, {"type": "precision_at_20", "value": 2.074833333333333}, {"type": "precision_at_3", "value": 7.440333333333333}, {"type": "precision_at_5", "value": 5.341916666666667}, {"type": "recall_at_1", "value": 11.252416666666667}, {"type": "recall_at_10", "value": 25.200833333333332}, {"type": "recall_at_100", "value": 44.075333333333326}, {"type": "recall_at_1000", "value": 66.12541666666665}, {"type": "recall_at_20", "value": 30.24916666666667}, {"type": "recall_at_3", "value": 17.46591666666667}, {"type": "recall_at_5", "value": 20.53691666666667}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackStatsRetrieval", "type": "mteb/cqadupstack-stats", "config": "default", "split": "test", "revision": "65ac3a16b8e91f9cee4c9828cc7c335575432a2a"}, "metrics": [{"type": "map_at_1", "value": 8.696}, {"type": "map_at_10", "value": 12.339}, {"type": "map_at_100", "value": 12.946}, {"type": "map_at_1000", "value": 13.04}, {"type": "map_at_20", "value": 12.6}, {"type": "map_at_3", "value": 11.06}, {"type": "map_at_5", "value": 11.530999999999999}, {"type": "mrr_at_1", "value": 10.276}, {"type": "mrr_at_10", "value": 14.463999999999999}, {"type": "mrr_at_100", "value": 15.07}, {"type": "mrr_at_1000", "value": 15.152}, {"type": "mrr_at_20", "value": 14.737}, {"type": "mrr_at_3", "value": 13.037}, {"type": "mrr_at_5", "value": 13.627}, {"type": "ndcg_at_1", "value": 10.276}, {"type": "ndcg_at_10", "value": 15.085}, {"type": "ndcg_at_100", "value": 18.538}, {"type": "ndcg_at_1000", "value": 21.461}, {"type": "ndcg_at_20", "value": 15.976}, {"type": "ndcg_at_3", "value": 12.454}, {"type": "ndcg_at_5", "value": 13.195}, {"type": "precision_at_1", "value": 10.276}, {"type": "precision_at_10", "value": 2.669}, {"type": "precision_at_100", "value": 0.48900000000000005}, {"type": "precision_at_1000", "value": 0.08}, {"type": "precision_at_20", "value": 1.572}, {"type": "precision_at_3", "value": 5.726}, {"type": "precision_at_5", "value": 3.9570000000000003}, {"type": "recall_at_1", "value": 8.696}, {"type": "recall_at_10", "value": 21.766}, {"type": "recall_at_100", "value": 38.269}, {"type": "recall_at_1000", "value": 61.106}, {"type": "recall_at_20", "value": 24.992}, {"type": "recall_at_3", "value": 14.032}, {"type": "recall_at_5", "value": 15.967999999999998}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackTexRetrieval", "type": "mteb/cqadupstack-tex", "config": "default", "split": "test", "revision": "46989137a86843e03a6195de44b09deda022eec7"}, "metrics": [{"type": "map_at_1", "value": 6.13}, {"type": "map_at_10", "value": 9.067}, {"type": "map_at_100", "value": 9.687999999999999}, {"type": "map_at_1000", "value": 9.792}, {"type": "map_at_20", "value": 9.384}, {"type": "map_at_3", "value": 8.006}, 
{"type": "map_at_5", "value": 8.581999999999999}, {"type": "mrr_at_1", "value": 7.605}, {"type": "mrr_at_10", "value": 11.111}, {"type": "mrr_at_100", "value": 11.745999999999999}, {"type": "mrr_at_1000", "value": 11.837}, {"type": "mrr_at_20", "value": 11.452}, {"type": "mrr_at_3", "value": 9.922}, {"type": "mrr_at_5", "value": 10.522}, {"type": "ndcg_at_1", "value": 7.605}, {"type": "ndcg_at_10", "value": 11.302}, {"type": "ndcg_at_100", "value": 14.629}, {"type": "ndcg_at_1000", "value": 17.739}, {"type": "ndcg_at_20", "value": 12.411}, {"type": "ndcg_at_3", "value": 9.28}, {"type": "ndcg_at_5", "value": 10.161000000000001}, {"type": "precision_at_1", "value": 7.605}, {"type": "precision_at_10", "value": 2.22}, {"type": "precision_at_100", "value": 0.46499999999999997}, {"type": "precision_at_1000", "value": 0.087}, {"type": "precision_at_20", "value": 1.428}, {"type": "precision_at_3", "value": 4.565}, {"type": "precision_at_5", "value": 3.3649999999999998}, {"type": "recall_at_1", "value": 6.13}, {"type": "recall_at_10", "value": 16.009999999999998}, {"type": "recall_at_100", "value": 31.467}, {"type": "recall_at_1000", "value": 54.722}, {"type": "recall_at_20", "value": 20.137}, {"type": "recall_at_3", "value": 10.347000000000001}, {"type": "recall_at_5", "value": 12.692}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackUnixRetrieval", "type": "mteb/cqadupstack-unix", "config": "default", "split": "test", "revision": "6c6430d3a6d36f8d2a829195bc5dc94d7e063e53"}, "metrics": [{"type": "map_at_1", "value": 11.645}, {"type": "map_at_10", "value": 15.466}, {"type": "map_at_100", "value": 16.147}, {"type": "map_at_1000", "value": 16.247}, {"type": "map_at_20", "value": 15.806999999999999}, {"type": "map_at_3", "value": 14.011000000000001}, {"type": "map_at_5", "value": 14.967}, {"type": "mrr_at_1", "value": 14.179}, {"type": "mrr_at_10", "value": 18.512}, {"type": "mrr_at_100", "value": 19.184}, {"type": "mrr_at_1000", "value": 19.267}, {"type": "mrr_at_20", "value": 18.855}, {"type": "mrr_at_3", "value": 16.993}, {"type": "mrr_at_5", "value": 17.954}, {"type": "ndcg_at_1", "value": 14.179}, {"type": "ndcg_at_10", "value": 18.311}, {"type": "ndcg_at_100", "value": 21.996}, {"type": "ndcg_at_1000", "value": 24.942}, {"type": "ndcg_at_20", "value": 19.522000000000002}, {"type": "ndcg_at_3", "value": 15.593000000000002}, {"type": "ndcg_at_5", "value": 17.116}, {"type": "precision_at_1", "value": 14.179}, {"type": "precision_at_10", "value": 3.116}, {"type": "precision_at_100", "value": 0.5519999999999999}, {"type": "precision_at_1000", "value": 0.091}, {"type": "precision_at_20", "value": 1.87}, {"type": "precision_at_3", "value": 7.090000000000001}, {"type": "precision_at_5", "value": 5.224}, {"type": "recall_at_1", "value": 11.645}, {"type": "recall_at_10", "value": 24.206}, {"type": "recall_at_100", "value": 41.29}, {"type": "recall_at_1000", "value": 63.205999999999996}, {"type": "recall_at_20", "value": 28.659000000000002}, {"type": "recall_at_3", "value": 16.771}, {"type": "recall_at_5", "value": 20.602}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackWebmastersRetrieval", "type": "mteb/cqadupstack-webmasters", "config": "default", "split": "test", "revision": "160c094312a0e1facb97e55eeddb698c0abe3571"}, "metrics": [{"type": "map_at_1", "value": 12.435}, {"type": "map_at_10", "value": 17.263}, {"type": "map_at_100", "value": 18.137}, {"type": "map_at_1000", "value": 18.282999999999998}, {"type": "map_at_20", "value": 17.724}, {"type": 
"map_at_3", "value": 15.648000000000001}, {"type": "map_at_5", "value": 16.542}, {"type": "mrr_at_1", "value": 15.809999999999999}, {"type": "mrr_at_10", "value": 20.687}, {"type": "mrr_at_100", "value": 21.484}, {"type": "mrr_at_1000", "value": 21.567}, {"type": "mrr_at_20", "value": 21.124000000000002}, {"type": "mrr_at_3", "value": 19.104}, {"type": "mrr_at_5", "value": 19.974}, {"type": "ndcg_at_1", "value": 15.809999999999999}, {"type": "ndcg_at_10", "value": 20.801}, {"type": "ndcg_at_100", "value": 25.001}, {"type": "ndcg_at_1000", "value": 28.347}, {"type": "ndcg_at_20", "value": 22.223000000000003}, {"type": "ndcg_at_3", "value": 18.046}, {"type": "ndcg_at_5", "value": 19.308}, {"type": "precision_at_1", "value": 15.809999999999999}, {"type": "precision_at_10", "value": 4.032}, {"type": "precision_at_100", "value": 0.832}, {"type": "precision_at_1000", "value": 0.16}, {"type": "precision_at_20", "value": 2.54}, {"type": "precision_at_3", "value": 8.63}, {"type": "precision_at_5", "value": 6.4030000000000005}, {"type": "recall_at_1", "value": 12.435}, {"type": "recall_at_10", "value": 27.495000000000005}, {"type": "recall_at_100", "value": 47.522999999999996}, {"type": "recall_at_1000", "value": 70.804}, {"type": "recall_at_20", "value": 33.334}, {"type": "recall_at_3", "value": 19.192}, {"type": "recall_at_5", "value": 22.435}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackWordpressRetrieval", "type": "mteb/cqadupstack-wordpress", "config": "default", "split": "test", "revision": "4ffe81d471b1924886b33c7567bfb200e9eec5c4"}, "metrics": [{"type": "map_at_1", "value": 8.262}, {"type": "map_at_10", "value": 11.167}, {"type": "map_at_100", "value": 12.017999999999999}, {"type": "map_at_1000", "value": 12.113}, {"type": "map_at_20", "value": 11.674}, {"type": "map_at_3", "value": 9.736}, {"type": "map_at_5", "value": 10.384}, {"type": "mrr_at_1", "value": 9.242}, {"type": "mrr_at_10", "value": 12.564}, {"type": "mrr_at_100", "value": 13.427}, {"type": "mrr_at_1000", "value": 13.520999999999999}, {"type": "mrr_at_20", "value": 13.072000000000001}, {"type": "mrr_at_3", "value": 11.06}, {"type": "mrr_at_5", "value": 11.753}, {"type": "ndcg_at_1", "value": 9.242}, {"type": "ndcg_at_10", "value": 13.594999999999999}, {"type": "ndcg_at_100", "value": 18.049}, {"type": "ndcg_at_1000", "value": 20.888}, {"type": "ndcg_at_20", "value": 15.440000000000001}, {"type": "ndcg_at_3", "value": 10.697}, {"type": "ndcg_at_5", "value": 11.757}, {"type": "precision_at_1", "value": 9.242}, {"type": "precision_at_10", "value": 2.348}, {"type": "precision_at_100", "value": 0.482}, {"type": "precision_at_1000", "value": 0.077}, {"type": "precision_at_20", "value": 1.5709999999999997}, {"type": "precision_at_3", "value": 4.621}, {"type": "precision_at_5", "value": 3.401}, {"type": "recall_at_1", "value": 8.262}, {"type": "recall_at_10", "value": 19.983999999999998}, {"type": "recall_at_100", "value": 40.997}, {"type": "recall_at_1000", "value": 63.058}, {"type": "recall_at_20", "value": 27.168999999999997}, {"type": "recall_at_3", "value": 11.814}, {"type": "recall_at_5", "value": 14.463999999999999}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB ClimateFEVER", "type": "mteb/climate-fever", "config": "default", "split": "test", "revision": "47f2ac6acb640fc46020b02a5b59fdda04d39380"}, "metrics": [{"type": "map_at_1", "value": 4.058}, {"type": "map_at_10", "value": 6.734}, {"type": "map_at_100", "value": 7.593999999999999}, {"type": "map_at_1000", "value": 
7.736999999999999}, {"type": "map_at_20", "value": 7.102}, {"type": "map_at_3", "value": 5.559}, {"type": "map_at_5", "value": 6.178999999999999}, {"type": "mrr_at_1", "value": 8.404}, {"type": "mrr_at_10", "value": 13.514999999999999}, {"type": "mrr_at_100", "value": 14.518}, {"type": "mrr_at_1000", "value": 14.599}, {"type": "mrr_at_20", "value": 14.025000000000002}, {"type": "mrr_at_3", "value": 11.584999999999999}, {"type": "mrr_at_5", "value": 12.588}, {"type": "ndcg_at_1", "value": 8.404}, {"type": "ndcg_at_10", "value": 10.02}, {"type": "ndcg_at_100", "value": 14.771999999999998}, {"type": "ndcg_at_1000", "value": 18.251}, {"type": "ndcg_at_20", "value": 11.378}, {"type": "ndcg_at_3", "value": 7.675}, {"type": "ndcg_at_5", "value": 8.558}, {"type": "precision_at_1", "value": 8.404}, {"type": "precision_at_10", "value": 3.212}, {"type": "precision_at_100", "value": 0.83}, {"type": "precision_at_1000", "value": 0.146}, {"type": "precision_at_20", "value": 2.186}, {"type": "precision_at_3", "value": 5.624}, {"type": "precision_at_5", "value": 4.5600000000000005}, {"type": "recall_at_1", "value": 4.058}, {"type": "recall_at_10", "value": 12.751999999999999}, {"type": "recall_at_100", "value": 30.219}, {"type": "recall_at_1000", "value": 50.749}, {"type": "recall_at_20", "value": 16.634}, {"type": "recall_at_3", "value": 7.234999999999999}, {"type": "recall_at_5", "value": 9.418}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB DBPedia", "type": "mteb/dbpedia", "config": "default", "split": "test", "revision": "c0f706b76e590d620bd6618b3ca8efdd34e2d659"}, "metrics": [{"type": "map_at_1", "value": 5.516}, {"type": "map_at_10", "value": 11.001}, {"type": "map_at_100", "value": 14.527999999999999}, {"type": "map_at_1000", "value": 15.417}, {"type": "map_at_20", "value": 12.446}, {"type": "map_at_3", "value": 8.269}, {"type": "map_at_5", "value": 9.345}, {"type": "mrr_at_1", "value": 43.5}, {"type": "mrr_at_10", "value": 54.078}, {"type": "mrr_at_100", "value": 54.655}, {"type": "mrr_at_1000", "value": 54.679}, {"type": "mrr_at_20", "value": 54.461999999999996}, {"type": "mrr_at_3", "value": 51.37500000000001}, {"type": "mrr_at_5", "value": 53.25}, {"type": "ndcg_at_1", "value": 33.125}, {"type": "ndcg_at_10", "value": 25.665}, {"type": "ndcg_at_100", "value": 28.116000000000003}, {"type": "ndcg_at_1000", "value": 34.477000000000004}, {"type": "ndcg_at_20", "value": 25.027}, {"type": "ndcg_at_3", "value": 28.4}, {"type": "ndcg_at_5", "value": 27.094}, {"type": "precision_at_1", "value": 43.5}, {"type": "precision_at_10", "value": 21.65}, {"type": "precision_at_100", "value": 6.351999999999999}, {"type": "precision_at_1000", "value": 1.306}, {"type": "precision_at_20", "value": 15.662}, {"type": "precision_at_3", "value": 32.333}, {"type": "precision_at_5", "value": 28.199999999999996}, {"type": "recall_at_1", "value": 5.516}, {"type": "recall_at_10", "value": 15.457}, {"type": "recall_at_100", "value": 32.903}, {"type": "recall_at_1000", "value": 53.81700000000001}, {"type": "recall_at_20", "value": 20.365}, {"type": "recall_at_3", "value": 9.528}, {"type": "recall_at_5", "value": 11.619}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB EmotionClassification", "type": "mteb/emotion", "config": "default", "split": "test", "revision": "4f58c6b202a23cf9a4da393831edf4f9183cad37"}, "metrics": [{"type": "accuracy", "value": 45.79}, {"type": "f1", "value": 38.89634882093881}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB FEVER", "type": "mteb/fever", 
"config": "default", "split": "test", "revision": "bea83ef9e8fb933d90a2f1d5515737465d613e12"}, "metrics": [{"type": "map_at_1", "value": 18.063000000000002}, {"type": "map_at_10", "value": 24.911}, {"type": "map_at_100", "value": 25.688}, {"type": "map_at_1000", "value": 25.758}, {"type": "map_at_20", "value": 25.358999999999998}, {"type": "map_at_3", "value": 22.743}, {"type": "map_at_5", "value": 23.924}, {"type": "mrr_at_1", "value": 19.472}, {"type": "mrr_at_10", "value": 26.587}, {"type": "mrr_at_100", "value": 27.362}, {"type": "mrr_at_1000", "value": 27.428}, {"type": "mrr_at_20", "value": 27.040999999999997}, {"type": "mrr_at_3", "value": 24.362000000000002}, {"type": "mrr_at_5", "value": 25.593}, {"type": "ndcg_at_1", "value": 19.472}, {"type": "ndcg_at_10", "value": 29.183999999999997}, {"type": "ndcg_at_100", "value": 33.207}, {"type": "ndcg_at_1000", "value": 35.21}, {"type": "ndcg_at_20", "value": 30.791}, {"type": "ndcg_at_3", "value": 24.701999999999998}, {"type": "ndcg_at_5", "value": 26.823000000000004}, {"type": "precision_at_1", "value": 19.472}, {"type": "precision_at_10", "value": 4.469}, {"type": "precision_at_100", "value": 0.6629999999999999}, {"type": "precision_at_1000", "value": 0.08499999999999999}, {"type": "precision_at_20", "value": 2.59}, {"type": "precision_at_3", "value": 10.401}, {"type": "precision_at_5", "value": 7.363}, {"type": "recall_at_1", "value": 18.063000000000002}, {"type": "recall_at_10", "value": 41.071999999999996}, {"type": "recall_at_100", "value": 60.049}, {"type": "recall_at_1000", "value": 75.64699999999999}, {"type": "recall_at_20", "value": 47.211999999999996}, {"type": "recall_at_3", "value": 28.796}, {"type": "recall_at_5", "value": 33.894999999999996}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB FiQA2018", "type": "mteb/fiqa", "config": "default", "split": "test", "revision": "27a168819829fe9bcd655c2df245fb19452e8e06"}, "metrics": [{"type": "map_at_1", "value": 2.45}, {"type": "map_at_10", "value": 4.255}, {"type": "map_at_100", "value": 4.809}, {"type": "map_at_1000", "value": 4.954}, {"type": "map_at_20", "value": 4.513}, {"type": "map_at_3", "value": 3.4029999999999996}, {"type": "map_at_5", "value": 3.782}, {"type": "mrr_at_1", "value": 4.938}, {"type": "mrr_at_10", "value": 8.231}, {"type": "mrr_at_100", "value": 8.902000000000001}, {"type": "mrr_at_1000", "value": 9.019}, {"type": "mrr_at_20", "value": 8.530999999999999}, {"type": "mrr_at_3", "value": 6.944}, {"type": "mrr_at_5", "value": 7.623}, {"type": "ndcg_at_1", "value": 4.938}, {"type": "ndcg_at_10", "value": 6.425}, {"type": "ndcg_at_100", "value": 9.661999999999999}, {"type": "ndcg_at_1000", "value": 13.911999999999999}, {"type": "ndcg_at_20", "value": 7.3}, {"type": "ndcg_at_3", "value": 4.907}, {"type": "ndcg_at_5", "value": 5.406}, {"type": "precision_at_1", "value": 4.938}, {"type": "precision_at_10", "value": 2.037}, {"type": "precision_at_100", "value": 0.528}, {"type": "precision_at_1000", "value": 0.125}, {"type": "precision_at_20", "value": 1.366}, {"type": "precision_at_3", "value": 3.344}, {"type": "precision_at_5", "value": 2.7470000000000003}, {"type": "recall_at_1", "value": 2.45}, {"type": "recall_at_10", "value": 8.987}, {"type": "recall_at_100", "value": 22.302}, {"type": "recall_at_1000", "value": 49.903999999999996}, {"type": "recall_at_20", "value": 11.712}, {"type": "recall_at_3", "value": 4.675}, {"type": "recall_at_5", "value": 6.161}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB HotpotQA", "type": 
"mteb/hotpotqa", "config": "default", "split": "test", "revision": "ab518f4d6fcca38d87c25209f94beba119d02014"}, "metrics": [{"type": "map_at_1", "value": 23.585}, {"type": "map_at_10", "value": 31.893}, {"type": "map_at_100", "value": 32.851}, {"type": "map_at_1000", "value": 32.951}, {"type": "map_at_20", "value": 32.415}, {"type": "map_at_3", "value": 29.787000000000003}, {"type": "map_at_5", "value": 31.012}, {"type": "mrr_at_1", "value": 47.171}, {"type": "mrr_at_10", "value": 54.333}, {"type": "mrr_at_100", "value": 54.949000000000005}, {"type": "mrr_at_1000", "value": 54.98800000000001}, {"type": "mrr_at_20", "value": 54.702}, {"type": "mrr_at_3", "value": 52.632999999999996}, {"type": "mrr_at_5", "value": 53.652}, {"type": "ndcg_at_1", "value": 47.171}, {"type": "ndcg_at_10", "value": 39.884}, {"type": "ndcg_at_100", "value": 44.019000000000005}, {"type": "ndcg_at_1000", "value": 46.303}, {"type": "ndcg_at_20", "value": 41.461999999999996}, {"type": "ndcg_at_3", "value": 36.153999999999996}, {"type": "ndcg_at_5", "value": 38.072}, {"type": "precision_at_1", "value": 47.171}, {"type": "precision_at_10", "value": 8.396}, {"type": "precision_at_100", "value": 1.169}, {"type": "precision_at_1000", "value": 0.147}, {"type": "precision_at_20", "value": 4.707}, {"type": "precision_at_3", "value": 22.408}, {"type": "precision_at_5", "value": 14.966}, {"type": "recall_at_1", "value": 23.585}, {"type": "recall_at_10", "value": 41.978}, {"type": "recall_at_100", "value": 58.447}, {"type": "recall_at_1000", "value": 73.7}, {"type": "recall_at_20", "value": 47.07}, {"type": "recall_at_3", "value": 33.611999999999995}, {"type": "recall_at_5", "value": 37.413999999999994}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB ImdbClassification", "type": "mteb/imdb", "config": "default", "split": "test", "revision": "3d86128a09e091d6018b6d26cad27f2739fc2db7"}, "metrics": [{"type": "accuracy", "value": 74.9528}, {"type": "ap", "value": 69.50790744137139}, {"type": "f1", "value": 74.77689594327182}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB MSMARCO", "type": "mteb/msmarco", "config": "default", "split": "dev", "revision": "c5a29a104738b98a9e76336939199e264163d4a0"}, "metrics": [{"type": "map_at_1", "value": 8.186}, {"type": "map_at_10", "value": 13.352}, {"type": "map_at_100", "value": 14.147000000000002}, {"type": "map_at_1000", "value": 14.231}, {"type": "map_at_20", "value": 13.753000000000002}, {"type": "map_at_3", "value": 11.529}, {"type": "map_at_5", "value": 12.497}, {"type": "mrr_at_1", "value": 8.424}, {"type": "mrr_at_10", "value": 13.675999999999998}, {"type": "mrr_at_100", "value": 14.475999999999999}, {"type": "mrr_at_1000", "value": 14.557}, {"type": "mrr_at_20", "value": 14.084}, {"type": "mrr_at_3", "value": 11.843}, {"type": "mrr_at_5", "value": 12.82}, {"type": "ndcg_at_1", "value": 8.424}, {"type": "ndcg_at_10", "value": 16.534}, {"type": "ndcg_at_100", "value": 20.982}, {"type": "ndcg_at_1000", "value": 23.538999999999998}, {"type": "ndcg_at_20", "value": 18.012}, {"type": "ndcg_at_3", "value": 12.729}, {"type": "ndcg_at_5", "value": 14.466999999999999}, {"type": "precision_at_1", "value": 8.424}, {"type": "precision_at_10", "value": 2.7449999999999997}, {"type": "precision_at_100", "value": 0.507}, {"type": "precision_at_1000", "value": 0.073}, {"type": "precision_at_20", "value": 1.683}, {"type": "precision_at_3", "value": 5.478000000000001}, {"type": "precision_at_5", "value": 4.16}, {"type": "recall_at_1", "value": 8.186}, {"type": 
"recall_at_10", "value": 26.415}, {"type": "recall_at_100", "value": 48.282000000000004}, {"type": "recall_at_1000", "value": 68.869}, {"type": "recall_at_20", "value": 32.207}, {"type": "recall_at_3", "value": 15.909}, {"type": "recall_at_5", "value": 20.09}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MTOPDomainClassification (en)", "type": "mteb/mtop_domain", "config": "en", "split": "test", "revision": "d80d48c1eb48d3562165c59d59d0034df9fff0bf"}, "metrics": [{"type": "accuracy", "value": 87.26858185134519}, {"type": "f1", "value": 86.73793752046078}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MTOPIntentClassification (en)", "type": "mteb/mtop_intent", "config": "en", "split": "test", "revision": "ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba"}, "metrics": [{"type": "accuracy", "value": 54.65800273597811}, {"type": "f1", "value": 36.16413360524473}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MassiveIntentClassification (en)", "type": "mteb/amazon_massive_intent", "config": "en", "split": "test", "revision": "31efe3c427b0bae9c22cbb560b8f15491cc6bed7"}, "metrics": [{"type": "accuracy", "value": 61.519838601210495}, {"type": "f1", "value": 58.35755839392156}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MassiveScenarioClassification (en)", "type": "mteb/amazon_massive_scenario", "config": "en", "split": "test", "revision": "7d571f92784cd94a019292a1f45445077d0ef634"}, "metrics": [{"type": "accuracy", "value": 67.04102219233357}, {"type": "f1", "value": 65.55523696441647}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB MedrxivClusteringP2P", "type": "mteb/medrxiv-clustering-p2p", "config": "default", "split": "test", "revision": "e7a26af6f3ae46b30dde8737f02c07b1505bcc73"}, "metrics": [{"type": "v_measure", "value": 27.16765056253893}, {"type": "v_measures", "value": [0.2535665532592405, 0.25745435154373697, 0.2588139996653209, 0.2563977645588755, 0.2572790917147801, 0.28011260965698515, 0.28489569719921415, 0.2978121202496781, 0.2927319740642704, 0.27770089434179124, 0.2535665532592405, 0.25745435154373697, 0.2588139996653209, 0.2563977645588755, 0.2572790917147801, 0.28011260965698515, 0.28489569719921415, 0.2978121202496781, 0.2927319740642704, 0.27770089434179124, 0.2535665532592405, 0.25745435154373697, 0.2588139996653209, 0.2563977645588755, 0.2572790917147801, 0.28011260965698515, 0.28489569719921415, 0.2978121202496781, 0.2927319740642704, 0.27770089434179124, 0.2535665532592405, 0.25745435154373697, 0.2588139996653209, 0.2563977645588755, 0.2572790917147801, 0.28011260965698515, 0.28489569719921415, 0.2978121202496781, 0.2927319740642704, 0.27770089434179124, 0.2535665532592405, 0.25745435154373697, 0.2588139996653209, 0.2563977645588755, 0.2572790917147801, 0.28011260965698515, 0.28489569719921415, 0.2978121202496781, 0.2927319740642704, 0.27770089434179124, 0.2535665532592405, 0.25745435154373697, 0.2588139996653209, 0.2563977645588755, 0.2572790917147801, 0.28011260965698515, 0.28489569719921415, 0.2978121202496781, 0.2927319740642704, 0.27770089434179124, 0.2535665532592405, 0.25745435154373697, 0.2588139996653209, 0.2563977645588755, 0.2572790917147801, 0.28011260965698515, 0.28489569719921415, 0.2978121202496781, 0.2927319740642704, 0.27770089434179124, 0.2535665532592405, 0.25745435154373697, 0.2588139996653209, 0.2563977645588755, 0.2572790917147801, 0.28011260965698515, 0.28489569719921415, 0.2978121202496781, 0.2927319740642704, 0.27770089434179124, 0.2535665532592405, 
0.25745435154373697, 0.2588139996653209, 0.2563977645588755, 0.2572790917147801, 0.28011260965698515, 0.28489569719921415, 0.2978121202496781, 0.2927319740642704, 0.27770089434179124, 0.2535665532592405, 0.25745435154373697, 0.2588139996653209, 0.2563977645588755, 0.2572790917147801, 0.28011260965698515, 0.28489569719921415, 0.2978121202496781, 0.2927319740642704, 0.27770089434179124, 0.2535665532592405, 0.25745435154373697, 0.2588139996653209, 0.2563977645588755, 0.2572790917147801, 0.28011260965698515, 0.28489569719921415, 0.2978121202496781, 0.2927319740642704, 0.27770089434179124, 0.2535665532592405, 0.25745435154373697, 0.2588139996653209, 0.2563977645588755, 0.2572790917147801, 0.28011260965698515, 0.28489569719921415, 0.2978121202496781, 0.2927319740642704, 0.27770089434179124, 0.2535665532592405, 0.25745435154373697, 0.2588139996653209, 0.2563977645588755, 0.2572790917147801, 0.28011260965698515, 0.28489569719921415, 0.2978121202496781, 0.2927319740642704, 0.27770089434179124]}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB MedrxivClusteringS2S", "type": "mteb/medrxiv-clustering-s2s", "config": "default", "split": "test", "revision": "35191c8c0dca72d8ff3efcd72aa802307d469663"}, "metrics": [{"type": "v_measure", "value": 23.778196508186724}, {"type": "v_measures", "value": [0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 
0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 
0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 
0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 
0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 
0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 
0.2575581979609686, 0.23963168485181752, 0.22243646306633857, 0.2203410753173429, 0.2227543188103344, 0.22414069966133132, 0.2284479943649894, 0.2523527902057292, 0.25535019508635054, 0.25480623149347, 0.2575581979609686, 0.23963168485181752]}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB MindSmallReranking", "type": "mteb/mind_small", "config": "default", "split": "test", "revision": "3bdac13927fdc888b903db93b2ffdbd90b295a69"}, "metrics": [{"type": "map", "value": 30.088514713666076}, {"type": "mrr", "value": 31.010218178449588}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB NFCorpus", "type": "mteb/nfcorpus", "config": "default", "split": "test", "revision": "ec0fa4fe99da2ff19ca1214b7966684033a58814"}, "metrics": [{"type": "map_at_1", "value": 2.228}, {"type": "map_at_10", "value": 4.338}, {"type": "map_at_100", "value": 5.427}, {"type": "map_at_1000", "value": 6.325}, {"type": "map_at_20", "value": 4.729}, {"type": "map_at_3", "value": 3.495}, {"type": "map_at_5", "value": 3.8150000000000004}, {"type": "mrr_at_1", "value": 22.291}, {"type": "mrr_at_10", "value": 29.622}, {"type": "mrr_at_100", "value": 30.547}, {"type": "mrr_at_1000", "value": 30.618000000000002}, {"type": "mrr_at_20", "value": 30.070000000000004}, {"type": "mrr_at_3", "value": 27.141}, {"type": "mrr_at_5", "value": 28.488000000000003}, {"type": "ndcg_at_1", "value": 21.362000000000002}, {"type": "ndcg_at_10", "value": 15.64}, {"type": "ndcg_at_100", "value": 14.832}, {"type": "ndcg_at_1000", "value": 23.980999999999998}, {"type": "ndcg_at_20", "value": 14.408000000000001}, {"type": "ndcg_at_3", "value": 18.719}, {"type": "ndcg_at_5", "value": 17.137}, {"type": "precision_at_1", "value": 21.981}, {"type": "precision_at_10", "value": 11.548}, {"type": "precision_at_100", "value": 4.223}, {"type": "precision_at_1000", "value": 1.6500000000000001}, {"type": "precision_at_20", "value": 8.39}, {"type": "precision_at_3", "value": 17.337}, {"type": "precision_at_5", "value": 14.613000000000001}, {"type": "recall_at_1", "value": 2.228}, {"type": "recall_at_10", "value": 6.9190000000000005}, {"type": "recall_at_100", "value": 16.854}, {"type": "recall_at_1000", "value": 49.179}, {"type": "recall_at_20", "value": 9.166}, {"type": "recall_at_3", "value": 4.263}, {"type": "recall_at_5", "value": 4.956}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB NQ", "type": "mteb/nq", "config": "default", "split": "test", "revision": "b774495ed302d8c44a3a7ea25c90dbce03968f31"}, "metrics": [{"type": "map_at_1", "value": 9.176}, {"type": "map_at_10", "value": 15.720999999999998}, {"type": "map_at_100", "value": 16.847}, {"type": "map_at_1000", "value": 16.939999999999998}, {"type": "map_at_20", "value": 16.355}, {"type": "map_at_3", "value": 13.402}, {"type": "map_at_5", "value": 14.663}, {"type": "mrr_at_1", "value": 10.458}, {"type": "mrr_at_10", "value": 17.413}, {"type": "mrr_at_100", "value": 18.442}, {"type": "mrr_at_1000", "value": 18.52}, {"type": "mrr_at_20", "value": 18.006}, {"type": "mrr_at_3", "value": 15.043999999999999}, {"type": "mrr_at_5", "value": 16.367}, {"type": "ndcg_at_1", "value": 10.458}, {"type": "ndcg_at_10", "value": 19.994999999999997}, {"type": "ndcg_at_100", "value": 25.665}, {"type": "ndcg_at_1000", "value": 28.277}, {"type": "ndcg_at_20", "value": 22.233}, {"type": "ndcg_at_3", "value": 15.168999999999999}, {"type": "ndcg_at_5", "value": 17.453}, {"type": "precision_at_1", "value": 10.458}, {"type": "precision_at_10", "value": 3.711}, {"type": "precision_at_100", "value": 
0.697}, {"type": "precision_at_1000", "value": 0.095}, {"type": "precision_at_20", "value": 2.3810000000000002}, {"type": "precision_at_3", "value": 7.204000000000001}, {"type": "precision_at_5", "value": 5.568}, {"type": "recall_at_1", "value": 9.176}, {"type": "recall_at_10", "value": 31.646}, {"type": "recall_at_100", "value": 57.865}, {"type": "recall_at_1000", "value": 78.11399999999999}, {"type": "recall_at_20", "value": 40.117000000000004}, {"type": "recall_at_3", "value": 18.67}, {"type": "recall_at_5", "value": 24.063000000000002}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB QuoraRetrieval", "type": "mteb/quora", "config": "default", "split": "test", "revision": "e4e08e0b7dbe3c8700f0daef558ff32256715259"}, "metrics": [{"type": "map_at_1", "value": 62.597}, {"type": "map_at_10", "value": 75.3}, {"type": "map_at_100", "value": 76.057}, {"type": "map_at_1000", "value": 76.089}, {"type": "map_at_20", "value": 75.762}, {"type": "map_at_3", "value": 72.41499999999999}, {"type": "map_at_5", "value": 74.139}, {"type": "mrr_at_1", "value": 72.11999999999999}, {"type": "mrr_at_10", "value": 79.44600000000001}, {"type": "mrr_at_100", "value": 79.691}, {"type": "mrr_at_1000", "value": 79.696}, {"type": "mrr_at_20", "value": 79.604}, {"type": "mrr_at_3", "value": 78.015}, {"type": "mrr_at_5", "value": 78.90700000000001}, {"type": "ndcg_at_1", "value": 72.15}, {"type": "ndcg_at_10", "value": 79.937}, {"type": "ndcg_at_100", "value": 82.074}, {"type": "ndcg_at_1000", "value": 82.443}, {"type": "ndcg_at_20", "value": 80.916}, {"type": "ndcg_at_3", "value": 76.452}, {"type": "ndcg_at_5", "value": 78.192}, {"type": "precision_at_1", "value": 72.15}, {"type": "precision_at_10", "value": 12.117}, {"type": "precision_at_100", "value": 1.4500000000000002}, {"type": "precision_at_1000", "value": 0.154}, {"type": "precision_at_20", "value": 6.503}, {"type": "precision_at_3", "value": 33.267}, {"type": "precision_at_5", "value": 21.944}, {"type": "recall_at_1", "value": 62.597}, {"type": "recall_at_10", "value": 88.911}, {"type": "recall_at_100", "value": 97.112}, {"type": "recall_at_1000", "value": 99.229}, {"type": "recall_at_20", "value": 92.231}, {"type": "recall_at_3", "value": 78.83099999999999}, {"type": "recall_at_5", "value": 83.757}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB RedditClustering", "type": "mteb/reddit-clustering", "config": "default", "split": "test", "revision": "24640382cdbf8abc73003fb0fa6d111a705499eb"}, "metrics": [{"type": "v_measure", "value": 31.453135224292588}, {"type": "v_measures", "value": [0.34024081488556046, 0.31978719363198366, 0.28326863670514296, 0.2736227852661663, 0.33176589594215805, 0.281739297860462, 0.3714152055541526, 0.2784460528138246, 0.28292867038320446, 0.3011498262585792, 0.2903236549747166, 0.36937775233378656, 0.30011371483471927, 0.33579158840067747, 0.3774325279364799, 0.2798489399988548, 0.30350039884840657, 0.39379070544611877, 0.29845537391174287, 0.280224383799162, 0.2683644031255058, 0.28462417081553165, 0.4207860651822375, 0.30599639335371903, 0.29028935381025356, 0.34024081488556046, 0.31978719363198366, 0.28326863670514296, 0.2736227852661663, 0.33176589594215805, 0.281739297860462, 0.3714152055541526, 0.2784460528138246, 0.28292867038320446, 0.3011498262585792, 0.2903236549747166, 0.36937775233378656, 0.30011371483471927, 0.33579158840067747, 0.3774325279364799, 0.2798489399988548, 0.30350039884840657, 0.39379070544611877, 0.29845537391174287, 0.280224383799162, 0.2683644031255058, 0.28462417081553165, 
0.4207860651822375, 0.30599639335371903, 0.29028935381025356, 0.34024081488556046, 0.31978719363198366, 0.28326863670514296, 0.2736227852661663, 
0.33176589594215805, 0.281739297860462, 0.3714152055541526, 0.2784460528138246, 0.28292867038320446, 0.3011498262585792, 0.2903236549747166, 0.36937775233378656, 0.30011371483471927, 0.33579158840067747, 0.3774325279364799, 0.2798489399988548, 0.30350039884840657, 0.39379070544611877, 0.29845537391174287, 0.280224383799162, 0.2683644031255058, 0.28462417081553165, 0.4207860651822375, 0.30599639335371903, 0.29028935381025356, 0.34024081488556046, 0.31978719363198366, 0.28326863670514296, 0.2736227852661663, 0.33176589594215805, 0.281739297860462, 0.3714152055541526, 0.2784460528138246, 0.28292867038320446, 0.3011498262585792, 0.2903236549747166, 0.36937775233378656, 0.30011371483471927, 0.33579158840067747, 0.3774325279364799, 0.2798489399988548, 0.30350039884840657, 0.39379070544611877, 0.29845537391174287, 0.280224383799162, 0.2683644031255058, 0.28462417081553165, 0.4207860651822375, 0.30599639335371903, 0.29028935381025356, 0.34024081488556046, 0.31978719363198366, 0.28326863670514296, 0.2736227852661663, 0.33176589594215805, 0.281739297860462, 0.3714152055541526, 0.2784460528138246, 0.28292867038320446, 0.3011498262585792, 0.2903236549747166, 0.36937775233378656, 0.30011371483471927, 0.33579158840067747, 0.3774325279364799, 0.2798489399988548, 0.30350039884840657, 0.39379070544611877, 0.29845537391174287, 0.280224383799162, 0.2683644031255058, 0.28462417081553165, 0.4207860651822375, 0.30599639335371903, 0.29028935381025356, 0.34024081488556046, 0.31978719363198366, 0.28326863670514296, 0.2736227852661663, 0.33176589594215805, 0.281739297860462, 0.3714152055541526, 0.2784460528138246, 0.28292867038320446, 0.3011498262585792, 0.2903236549747166, 0.36937775233378656, 0.30011371483471927, 0.33579158840067747, 0.3774325279364799, 0.2798489399988548, 0.30350039884840657, 0.39379070544611877, 0.29845537391174287, 0.280224383799162, 0.2683644031255058, 0.28462417081553165, 0.4207860651822375, 0.30599639335371903, 0.29028935381025356, 0.34024081488556046, 0.31978719363198366, 0.28326863670514296, 0.2736227852661663, 0.33176589594215805, 0.281739297860462, 0.3714152055541526, 0.2784460528138246, 0.28292867038320446, 0.3011498262585792, 0.2903236549747166, 0.36937775233378656, 0.30011371483471927, 0.33579158840067747, 0.3774325279364799, 0.2798489399988548, 0.30350039884840657, 0.39379070544611877, 0.29845537391174287, 0.280224383799162, 0.2683644031255058, 0.28462417081553165, 0.4207860651822375, 0.30599639335371903, 0.29028935381025356, 0.34024081488556046, 0.31978719363198366, 0.28326863670514296, 0.2736227852661663, 0.33176589594215805, 0.281739297860462, 0.3714152055541526, 0.2784460528138246, 0.28292867038320446, 0.3011498262585792, 0.2903236549747166, 0.36937775233378656, 0.30011371483471927, 0.33579158840067747, 0.3774325279364799, 0.2798489399988548, 0.30350039884840657, 0.39379070544611877, 0.29845537391174287, 0.280224383799162, 0.2683644031255058, 0.28462417081553165, 0.4207860651822375, 0.30599639335371903, 0.29028935381025356, 0.34024081488556046, 0.31978719363198366, 0.28326863670514296, 0.2736227852661663, 0.33176589594215805, 0.281739297860462, 0.3714152055541526, 0.2784460528138246, 0.28292867038320446, 0.3011498262585792, 0.2903236549747166, 0.36937775233378656, 0.30011371483471927, 0.33579158840067747, 0.3774325279364799, 0.2798489399988548, 0.30350039884840657, 0.39379070544611877, 0.29845537391174287, 0.280224383799162, 0.2683644031255058, 0.28462417081553165, 0.4207860651822375, 0.30599639335371903, 0.29028935381025356, 0.34024081488556046, 0.31978719363198366, 
0.28326863670514296, 0.2736227852661663, 0.33176589594215805, 0.281739297860462, 0.3714152055541526, 0.2784460528138246, 0.28292867038320446, 0.3011498262585792, 0.2903236549747166, 0.36937775233378656, 0.30011371483471927, 0.33579158840067747, 0.3774325279364799, 0.2798489399988548, 0.30350039884840657, 0.39379070544611877, 0.29845537391174287, 0.280224383799162, 0.2683644031255058, 0.28462417081553165, 0.4207860651822375, 0.30599639335371903, 0.29028935381025356, 0.34024081488556046, 0.31978719363198366, 0.28326863670514296, 0.2736227852661663, 0.33176589594215805, 0.281739297860462, 0.3714152055541526, 0.2784460528138246, 0.28292867038320446, 0.3011498262585792, 0.2903236549747166, 0.36937775233378656, 0.30011371483471927, 0.33579158840067747, 0.3774325279364799, 0.2798489399988548, 0.30350039884840657, 0.39379070544611877, 0.29845537391174287, 0.280224383799162, 0.2683644031255058, 0.28462417081553165, 0.4207860651822375, 0.30599639335371903, 0.29028935381025356, 0.34024081488556046, 0.31978719363198366, 0.28326863670514296, 0.2736227852661663, 0.33176589594215805, 0.281739297860462, 0.3714152055541526, 0.2784460528138246, 0.28292867038320446, 0.3011498262585792, 0.2903236549747166, 0.36937775233378656, 0.30011371483471927, 0.33579158840067747, 0.3774325279364799, 0.2798489399988548, 0.30350039884840657, 0.39379070544611877, 0.29845537391174287, 0.280224383799162, 0.2683644031255058, 0.28462417081553165, 0.4207860651822375, 0.30599639335371903, 0.29028935381025356, 0.34024081488556046, 0.31978719363198366, 0.28326863670514296, 0.2736227852661663, 0.33176589594215805, 0.281739297860462, 0.3714152055541526, 0.2784460528138246, 0.28292867038320446, 0.3011498262585792, 0.2903236549747166, 0.36937775233378656, 0.30011371483471927, 0.33579158840067747, 0.3774325279364799, 0.2798489399988548, 0.30350039884840657, 0.39379070544611877, 0.29845537391174287, 0.280224383799162, 0.2683644031255058, 0.28462417081553165, 0.4207860651822375, 0.30599639335371903, 0.29028935381025356, 0.34024081488556046, 0.31978719363198366, 0.28326863670514296, 0.2736227852661663, 0.33176589594215805, 0.281739297860462, 0.3714152055541526, 0.2784460528138246, 0.28292867038320446, 0.3011498262585792, 0.2903236549747166, 0.36937775233378656, 0.30011371483471927, 0.33579158840067747, 0.3774325279364799, 0.2798489399988548, 0.30350039884840657, 0.39379070544611877, 0.29845537391174287, 0.280224383799162, 0.2683644031255058, 0.28462417081553165, 0.4207860651822375, 0.30599639335371903, 0.29028935381025356, 0.34024081488556046, 0.31978719363198366, 0.28326863670514296, 0.2736227852661663, 0.33176589594215805, 0.281739297860462, 0.3714152055541526, 0.2784460528138246, 0.28292867038320446, 0.3011498262585792, 0.2903236549747166, 0.36937775233378656, 0.30011371483471927, 0.33579158840067747, 0.3774325279364799, 0.2798489399988548, 0.30350039884840657, 0.39379070544611877, 0.29845537391174287, 0.280224383799162, 0.2683644031255058, 0.28462417081553165, 0.4207860651822375, 0.30599639335371903, 0.29028935381025356, 0.34024081488556046, 0.31978719363198366, 0.28326863670514296, 0.2736227852661663, 0.33176589594215805, 0.281739297860462, 0.3714152055541526, 0.2784460528138246, 0.28292867038320446, 0.3011498262585792, 0.2903236549747166, 0.36937775233378656, 0.30011371483471927, 0.33579158840067747, 0.3774325279364799, 0.2798489399988548, 0.30350039884840657, 0.39379070544611877, 0.29845537391174287, 0.280224383799162, 0.2683644031255058, 0.28462417081553165, 0.4207860651822375, 0.30599639335371903, 0.29028935381025356, 
0.34024081488556046, 0.31978719363198366, 0.28326863670514296, 0.2736227852661663, 0.33176589594215805, 0.281739297860462, 0.3714152055541526, 0.2784460528138246, 0.28292867038320446, 0.3011498262585792, 0.2903236549747166, 0.36937775233378656, 0.30011371483471927, 0.33579158840067747, 0.3774325279364799, 0.2798489399988548, 0.30350039884840657, 0.39379070544611877, 0.29845537391174287, 0.280224383799162, 0.2683644031255058, 0.28462417081553165, 0.4207860651822375, 0.30599639335371903, 0.29028935381025356, 0.34024081488556046, 0.31978719363198366, 0.28326863670514296, 0.2736227852661663, 0.33176589594215805, 0.281739297860462, 0.3714152055541526, 0.2784460528138246, 0.28292867038320446, 0.3011498262585792, 0.2903236549747166, 0.36937775233378656, 0.30011371483471927, 0.33579158840067747, 0.3774325279364799, 0.2798489399988548, 0.30350039884840657, 0.39379070544611877, 0.29845537391174287, 0.280224383799162, 0.2683644031255058, 0.28462417081553165, 0.4207860651822375, 0.30599639335371903, 0.29028935381025356, 0.34024081488556046, 0.31978719363198366, 0.28326863670514296, 0.2736227852661663, 0.33176589594215805, 0.281739297860462, 0.3714152055541526, 0.2784460528138246, 0.28292867038320446, 0.3011498262585792, 0.2903236549747166, 0.36937775233378656, 0.30011371483471927, 0.33579158840067747, 0.3774325279364799, 0.2798489399988548, 0.30350039884840657, 0.39379070544611877, 0.29845537391174287, 0.280224383799162, 0.2683644031255058, 0.28462417081553165, 0.4207860651822375, 0.30599639335371903, 0.29028935381025356, 0.34024081488556046, 0.31978719363198366, 0.28326863670514296, 0.2736227852661663, 0.33176589594215805, 0.281739297860462, 0.3714152055541526, 0.2784460528138246, 0.28292867038320446, 0.3011498262585792, 0.2903236549747166, 0.36937775233378656, 0.30011371483471927, 0.33579158840067747, 0.3774325279364799, 0.2798489399988548, 0.30350039884840657, 0.39379070544611877, 0.29845537391174287, 0.280224383799162, 0.2683644031255058, 0.28462417081553165, 0.4207860651822375, 0.30599639335371903, 0.29028935381025356, 0.34024081488556046, 0.31978719363198366, 0.28326863670514296, 0.2736227852661663, 0.33176589594215805, 0.281739297860462, 0.3714152055541526, 0.2784460528138246, 0.28292867038320446, 0.3011498262585792, 0.2903236549747166, 0.36937775233378656, 0.30011371483471927, 0.33579158840067747, 0.3774325279364799, 0.2798489399988548, 0.30350039884840657, 0.39379070544611877, 0.29845537391174287, 0.280224383799162, 0.2683644031255058, 0.28462417081553165, 0.4207860651822375, 0.30599639335371903, 0.29028935381025356, 0.34024081488556046, 0.31978719363198366, 0.28326863670514296, 0.2736227852661663, 0.33176589594215805, 0.281739297860462, 0.3714152055541526, 0.2784460528138246, 0.28292867038320446, 0.3011498262585792, 0.2903236549747166, 0.36937775233378656, 0.30011371483471927, 0.33579158840067747, 0.3774325279364799, 0.2798489399988548, 0.30350039884840657, 0.39379070544611877, 0.29845537391174287, 0.280224383799162, 0.2683644031255058, 0.28462417081553165, 0.4207860651822375, 0.30599639335371903, 0.29028935381025356, 0.34024081488556046, 0.31978719363198366, 0.28326863670514296, 0.2736227852661663, 0.33176589594215805, 0.281739297860462, 0.3714152055541526, 0.2784460528138246, 0.28292867038320446, 0.3011498262585792, 0.2903236549747166, 0.36937775233378656, 0.30011371483471927, 0.33579158840067747, 0.3774325279364799, 0.2798489399988548, 0.30350039884840657, 0.39379070544611877, 0.29845537391174287, 0.280224383799162, 0.2683644031255058, 0.28462417081553165, 0.4207860651822375, 
0.30599639335371903, 0.29028935381025356, 0.34024081488556046, 0.31978719363198366, 0.28326863670514296, 0.2736227852661663, 0.33176589594215805, 0.281739297860462, 0.3714152055541526, 0.2784460528138246, 0.28292867038320446, 0.3011498262585792, 0.2903236549747166, 0.36937775233378656, 0.30011371483471927, 0.33579158840067747, 0.3774325279364799, 0.2798489399988548, 0.30350039884840657, 0.39379070544611877, 0.29845537391174287, 0.280224383799162, 0.2683644031255058, 0.28462417081553165, 0.4207860651822375, 0.30599639335371903, 0.29028935381025356, 0.34024081488556046, 0.31978719363198366, 0.28326863670514296, 0.2736227852661663, 0.33176589594215805, 0.281739297860462, 0.3714152055541526, 0.2784460528138246, 0.28292867038320446, 0.3011498262585792, 0.2903236549747166, 0.36937775233378656, 0.30011371483471927, 0.33579158840067747, 0.3774325279364799, 0.2798489399988548, 0.30350039884840657, 0.39379070544611877, 0.29845537391174287, 0.280224383799162, 0.2683644031255058, 0.28462417081553165, 0.4207860651822375, 0.30599639335371903, 0.29028935381025356, 0.34024081488556046, 0.31978719363198366, 0.28326863670514296, 0.2736227852661663, 0.33176589594215805, 0.281739297860462, 0.3714152055541526, 0.2784460528138246, 0.28292867038320446, 0.3011498262585792, 0.2903236549747166, 0.36937775233378656, 0.30011371483471927, 0.33579158840067747, 0.3774325279364799, 0.2798489399988548, 0.30350039884840657, 0.39379070544611877, 0.29845537391174287, 0.280224383799162, 0.2683644031255058, 0.28462417081553165, 0.4207860651822375, 0.30599639335371903, 0.29028935381025356, 0.34024081488556046, 0.31978719363198366, 0.28326863670514296, 0.2736227852661663, 0.33176589594215805, 0.281739297860462, 0.3714152055541526, 0.2784460528138246, 0.28292867038320446, 0.3011498262585792, 0.2903236549747166, 0.36937775233378656, 0.30011371483471927, 0.33579158840067747, 0.3774325279364799, 0.2798489399988548, 0.30350039884840657, 0.39379070544611877, 0.29845537391174287, 0.280224383799162, 0.2683644031255058, 0.28462417081553165, 0.4207860651822375, 0.30599639335371903, 0.29028935381025356, 0.34024081488556046, 0.31978719363198366, 0.28326863670514296, 0.2736227852661663, 0.33176589594215805, 0.281739297860462, 0.3714152055541526, 0.2784460528138246, 0.28292867038320446, 0.3011498262585792, 0.2903236549747166, 0.36937775233378656, 0.30011371483471927, 0.33579158840067747, 0.3774325279364799, 0.2798489399988548, 0.30350039884840657, 0.39379070544611877, 0.29845537391174287, 0.280224383799162, 0.2683644031255058, 0.28462417081553165, 0.4207860651822375, 0.30599639335371903, 0.29028935381025356, 0.34024081488556046, 0.31978719363198366, 0.28326863670514296, 0.2736227852661663, 0.33176589594215805, 0.281739297860462, 0.3714152055541526, 0.2784460528138246, 0.28292867038320446, 0.3011498262585792, 0.2903236549747166, 0.36937775233378656, 0.30011371483471927, 0.33579158840067747, 0.3774325279364799, 0.2798489399988548, 0.30350039884840657, 0.39379070544611877, 0.29845537391174287, 0.280224383799162, 0.2683644031255058, 0.28462417081553165, 0.4207860651822375, 0.30599639335371903, 0.29028935381025356, 0.34024081488556046, 0.31978719363198366, 0.28326863670514296, 0.2736227852661663, 0.33176589594215805, 0.281739297860462, 0.3714152055541526, 0.2784460528138246, 0.28292867038320446, 0.3011498262585792, 0.2903236549747166, 0.36937775233378656, 0.30011371483471927, 0.33579158840067747, 0.3774325279364799, 0.2798489399988548, 0.30350039884840657, 0.39379070544611877, 0.29845537391174287, 0.280224383799162, 0.2683644031255058, 
0.28462417081553165, 0.4207860651822375, 0.30599639335371903, 0.29028935381025356, 0.34024081488556046, 0.31978719363198366, 0.28326863670514296, 0.2736227852661663, 0.33176589594215805, 0.281739297860462, 0.3714152055541526, 0.2784460528138246, 0.28292867038320446, 0.3011498262585792, 0.2903236549747166, 0.36937775233378656, 0.30011371483471927, 0.33579158840067747, 0.3774325279364799, 0.2798489399988548, 0.30350039884840657, 0.39379070544611877, 0.29845537391174287, 0.280224383799162, 0.2683644031255058, 0.28462417081553165, 0.4207860651822375, 0.30599639335371903, 0.29028935381025356, 0.34024081488556046, 0.31978719363198366, 0.28326863670514296, 0.2736227852661663, 0.33176589594215805, 0.281739297860462, 0.3714152055541526, 0.2784460528138246, 0.28292867038320446, 0.3011498262585792, 0.2903236549747166, 0.36937775233378656, 0.30011371483471927, 0.33579158840067747, 0.3774325279364799, 0.2798489399988548, 0.30350039884840657, 0.39379070544611877, 0.29845537391174287, 0.280224383799162, 0.2683644031255058, 0.28462417081553165, 0.4207860651822375, 0.30599639335371903, 0.29028935381025356, 0.34024081488556046, 0.31978719363198366, 0.28326863670514296, 0.2736227852661663, 0.33176589594215805, 0.281739297860462, 0.3714152055541526, 0.2784460528138246, 0.28292867038320446, 0.3011498262585792, 0.2903236549747166, 0.36937775233378656, 0.30011371483471927, 0.33579158840067747, 0.3774325279364799, 0.2798489399988548, 0.30350039884840657, 0.39379070544611877, 0.29845537391174287, 0.280224383799162, 0.2683644031255058, 0.28462417081553165, 0.4207860651822375, 0.30599639335371903, 0.29028935381025356, 0.34024081488556046, 0.31978719363198366, 0.28326863670514296, 0.2736227852661663, 0.33176589594215805, 0.281739297860462, 0.3714152055541526, 0.2784460528138246, 0.28292867038320446, 0.3011498262585792, 0.2903236549747166, 0.36937775233378656, 0.30011371483471927, 0.33579158840067747, 0.3774325279364799, 0.2798489399988548, 0.30350039884840657, 0.39379070544611877, 0.29845537391174287, 0.280224383799162, 0.2683644031255058, 0.28462417081553165, 0.4207860651822375, 0.30599639335371903, 0.29028935381025356, 0.34024081488556046, 0.31978719363198366, 0.28326863670514296, 0.2736227852661663, 0.33176589594215805, 0.281739297860462, 0.3714152055541526, 0.2784460528138246, 0.28292867038320446, 0.3011498262585792, 0.2903236549747166, 0.36937775233378656, 0.30011371483471927, 0.33579158840067747, 0.3774325279364799, 0.2798489399988548, 0.30350039884840657, 0.39379070544611877, 0.29845537391174287, 0.280224383799162, 0.2683644031255058, 0.28462417081553165, 0.4207860651822375, 0.30599639335371903, 0.29028935381025356, 0.34024081488556046, 0.31978719363198366, 0.28326863670514296, 0.2736227852661663, 0.33176589594215805, 0.281739297860462, 0.3714152055541526, 0.2784460528138246, 0.28292867038320446, 0.3011498262585792, 0.2903236549747166, 0.36937775233378656, 0.30011371483471927, 0.33579158840067747, 0.3774325279364799, 0.2798489399988548, 0.30350039884840657, 0.39379070544611877, 0.29845537391174287, 0.280224383799162, 0.2683644031255058, 0.28462417081553165, 0.4207860651822375, 0.30599639335371903, 0.29028935381025356, 0.34024081488556046, 0.31978719363198366, 0.28326863670514296, 0.2736227852661663, 0.33176589594215805, 0.281739297860462, 0.3714152055541526, 0.2784460528138246, 0.28292867038320446, 0.3011498262585792, 0.2903236549747166, 0.36937775233378656, 0.30011371483471927, 0.33579158840067747, 0.3774325279364799, 0.2798489399988548, 0.30350039884840657, 0.39379070544611877, 0.29845537391174287, 
0.280224383799162, 0.2683644031255058, 0.28462417081553165, 0.4207860651822375, 0.30599639335371903, 0.29028935381025356, 0.34024081488556046, 0.31978719363198366, 0.28326863670514296, 0.2736227852661663, 0.33176589594215805, 0.281739297860462, 0.3714152055541526, 0.2784460528138246, 0.28292867038320446, 0.3011498262585792, 0.2903236549747166, 0.36937775233378656, 0.30011371483471927, 0.33579158840067747, 0.3774325279364799, 0.2798489399988548, 0.30350039884840657, 0.39379070544611877, 0.29845537391174287, 0.280224383799162, 0.2683644031255058, 0.28462417081553165, 0.4207860651822375, 0.30599639335371903, 0.29028935381025356]}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB RedditClusteringP2P", "type": "mteb/reddit-clustering-p2p", "config": "default", "split": "test", "revision": "385e3cb46b4cfa89021f56c4380204149d0efe33"}, "metrics": [{"type": "v_measure", "value": 43.69122416835423}, {"type": "v_measures", "value": [0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 
0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 
0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 
0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 
0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 
0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799, 0.4949442160711536, 0.5089714608477952, 0.533056646726052, 0.28870974397114113, 0.4845435888947718, 0.4358272686082502, 0.15963756448560423, 0.4966594103138184, 0.4483975331373559, 0.5183749837794799]}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB SCIDOCS", "type": "mteb/scidocs", "config": "default", "split": "test", "revision": "f8c2fcf00f625baaa80f62ec5bd9e1fff3b8ae88"}, 
"metrics": [{"type": "map_at_1", "value": 2.558}, {"type": "map_at_10", "value": 5.4670000000000005}, {"type": "map_at_100", "value": 6.601999999999999}, {"type": "map_at_1000", "value": 6.816}, {"type": "map_at_20", "value": 6.013}, {"type": "map_at_3", "value": 4.132000000000001}, {"type": "map_at_5", "value": 4.672}, {"type": "mrr_at_1", "value": 12.5}, {"type": "mrr_at_10", "value": 18.454}, {"type": "mrr_at_100", "value": 19.585}, {"type": "mrr_at_1000", "value": 19.698999999999998}, {"type": "mrr_at_20", "value": 19.093}, {"type": "mrr_at_3", "value": 16.25}, {"type": "mrr_at_5", "value": 17.349999999999998}, {"type": "ndcg_at_1", "value": 12.5}, {"type": "ndcg_at_10", "value": 9.931}, {"type": "ndcg_at_100", "value": 15.332}, {"type": "ndcg_at_1000", "value": 20.285}, {"type": "ndcg_at_20", "value": 11.73}, {"type": "ndcg_at_3", "value": 9.425}, {"type": "ndcg_at_5", "value": 7.994}, {"type": "precision_at_1", "value": 12.5}, {"type": "precision_at_10", "value": 5.11}, {"type": "precision_at_100", "value": 1.299}, {"type": "precision_at_1000", "value": 0.251}, {"type": "precision_at_20", "value": 3.5999999999999996}, {"type": "precision_at_3", "value": 8.533}, {"type": "precision_at_5", "value": 6.7}, {"type": "recall_at_1", "value": 2.558}, {"type": "recall_at_10", "value": 10.4}, {"type": "recall_at_100", "value": 26.35}, {"type": "recall_at_1000", "value": 50.888}, {"type": "recall_at_20", "value": 14.610000000000001}, {"type": "recall_at_3", "value": 5.208}, {"type": "recall_at_5", "value": 6.808}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB SICK-R", "type": "mteb/sickr-sts", "config": "default", "split": "test", "revision": "20a6d6f312dd54037fe07a32d58e5e168867909d"}, "metrics": [{"type": "cos_sim_pearson", "value": 80.46080544471825}, {"type": "cos_sim_spearman", "value": 77.33681018334157}, {"type": "euclidean_pearson", "value": 78.32030772877526}, {"type": "euclidean_spearman", "value": 77.3367915580176}, {"type": "manhattan_pearson", "value": 78.23694581981565}, {"type": "manhattan_spearman", "value": 77.24572801084182}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS12", "type": "mteb/sts12-sts", "config": "default", "split": "test", "revision": "a0d554a64d88156834ff5ae9920b964011b16384"}, "metrics": [{"type": "cos_sim_pearson", "value": 77.33143319366522}, {"type": "cos_sim_spearman", "value": 70.15243619467687}, {"type": "euclidean_pearson", "value": 74.35384725257417}, {"type": "euclidean_spearman", "value": 70.15020588975051}, {"type": "manhattan_pearson", "value": 74.49763893926959}, {"type": "manhattan_spearman", "value": 70.35289409088577}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS13", "type": "mteb/sts13-sts", "config": "default", "split": "test", "revision": "7e90230a92c190f1bf69ae9002b8cea547a64cca"}, "metrics": [{"type": "cos_sim_pearson", "value": 75.43426290814391}, {"type": "cos_sim_spearman", "value": 78.41580967540904}, {"type": "euclidean_pearson", "value": 77.87697798842441}, {"type": "euclidean_spearman", "value": 78.41580967540904}, {"type": "manhattan_pearson", "value": 77.7742301162175}, {"type": "manhattan_spearman", "value": 78.23561925777014}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS14", "type": "mteb/sts14-sts", "config": "default", "split": "test", "revision": "6031580fec1f6af667f0bd2da0a551cf4f0b2375"}, "metrics": [{"type": "cos_sim_pearson", "value": 75.72059066580607}, {"type": "cos_sim_spearman", "value": 74.76063270848232}, {"type": "euclidean_pearson", "value": 75.96422568212527}, 
{"type": "euclidean_spearman", "value": 74.76063912580608}, {"type": "manhattan_pearson", "value": 75.93446446206052}, {"type": "manhattan_spearman", "value": 74.80351881324513}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS15", "type": "mteb/sts15-sts", "config": "default", "split": "test", "revision": "ae752c7c21bf194d8b67fd573edf7ae58183cbe3"}, "metrics": [{"type": "cos_sim_pearson", "value": 79.50308070637769}, {"type": "cos_sim_spearman", "value": 82.00177922226122}, {"type": "euclidean_pearson", "value": 81.88334998600465}, {"type": "euclidean_spearman", "value": 82.00175996908672}, {"type": "manhattan_pearson", "value": 82.04162815561806}, {"type": "manhattan_spearman", "value": 82.16179492395742}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS16", "type": "mteb/sts16-sts", "config": "default", "split": "test", "revision": "4d8694f8f0e0100860b497b999b3dbed754a0513"}, "metrics": [{"type": "cos_sim_pearson", "value": 72.660749090443}, {"type": "cos_sim_spearman", "value": 78.27062791462116}, {"type": "euclidean_pearson", "value": 77.22132046879575}, {"type": "euclidean_spearman", "value": 78.27062749235377}, {"type": "manhattan_pearson", "value": 77.30349168561915}, {"type": "manhattan_spearman", "value": 78.38610133247218}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS17 (en-en)", "type": "mteb/sts17-crosslingual-sts", "config": "en-en", "split": "test", "revision": "af5e6fb845001ecf41f4c1e033ce921939a2a68d"}, "metrics": [{"type": "cos_sim_pearson", "value": 84.40073205259823}, {"type": "cos_sim_spearman", "value": 85.85093351857286}, {"type": "euclidean_pearson", "value": 86.39555107737667}, {"type": "euclidean_spearman", "value": 85.85093351857286}, {"type": "manhattan_pearson", "value": 86.15780582794078}, {"type": "manhattan_spearman", "value": 85.67768599300385}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS22 (en)", "type": "mteb/sts22-crosslingual-sts", "config": "en", "split": "test", "revision": "eea2b4fe26a775864c896887d910b76a8098ad3f"}, "metrics": [{"type": "cos_sim_pearson", "value": 54.06121880120164}, {"type": "cos_sim_spearman", "value": 61.20018366762684}, {"type": "euclidean_pearson", "value": 59.08089664894604}, {"type": "euclidean_spearman", "value": 61.20018366762684}, {"type": "manhattan_pearson", "value": 58.88169190353213}, {"type": "manhattan_spearman", "value": 60.82629422553597}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STSBenchmark", "type": "mteb/stsbenchmark-sts", "config": "default", "split": "test", "revision": "b0fddb56ed78048fa8b90373c8a3cfc37b684831"}, "metrics": [{"type": "cos_sim_pearson", "value": 76.9607252955321}, {"type": "cos_sim_spearman", "value": 79.20891358738938}, {"type": "euclidean_pearson", "value": 79.53044888138301}, {"type": "euclidean_spearman", "value": 79.20891358738938}, {"type": "manhattan_pearson", "value": 79.37313113618887}, {"type": "manhattan_spearman", "value": 79.0667751270519}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB SciDocsRR", "type": "mteb/scidocs-reranking", "config": "default", "split": "test", "revision": "d3c5e1fc0b855ab6097bf1cda04dd73947d7caab"}, "metrics": [{"type": "map", "value": 71.0421477784269}, {"type": "mrr", "value": 89.94940426312975}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB SciFact", "type": "mteb/scifact", "config": "default", "split": "test", "revision": "0228b52cf27578f30900b9e5271d331663a030d7"}, "metrics": [{"type": "map_at_1", "value": 31.900000000000002}, {"type": "map_at_10", "value": 38.494}, 
{"type": "map_at_100", "value": 39.353}, {"type": "map_at_1000", "value": 39.427}, {"type": "map_at_20", "value": 38.952}, {"type": "map_at_3", "value": 36.238}, {"type": "map_at_5", "value": 37.36}, {"type": "mrr_at_1", "value": 34.0}, {"type": "mrr_at_10", "value": 40.327}, {"type": "mrr_at_100", "value": 41.052}, {"type": "mrr_at_1000", "value": 41.120000000000005}, {"type": "mrr_at_20", "value": 40.737}, {"type": "mrr_at_3", "value": 38.333}, {"type": "mrr_at_5", "value": 39.367000000000004}, {"type": "ndcg_at_1", "value": 34.0}, {"type": "ndcg_at_10", "value": 42.419000000000004}, {"type": "ndcg_at_100", "value": 46.589000000000006}, {"type": "ndcg_at_1000", "value": 48.966}, {"type": "ndcg_at_20", "value": 43.980000000000004}, {"type": "ndcg_at_3", "value": 38.124}, {"type": "ndcg_at_5", "value": 39.952}, {"type": "precision_at_1", "value": 34.0}, {"type": "precision_at_10", "value": 5.933}, {"type": "precision_at_100", "value": 0.8330000000000001}, {"type": "precision_at_1000", "value": 0.104}, {"type": "precision_at_20", "value": 3.3329999999999997}, {"type": "precision_at_3", "value": 15.0}, {"type": "precision_at_5", "value": 10.067}, {"type": "recall_at_1", "value": 31.900000000000002}, {"type": "recall_at_10", "value": 52.800000000000004}, {"type": "recall_at_100", "value": 72.10600000000001}, {"type": "recall_at_1000", "value": 91.60000000000001}, {"type": "recall_at_20", "value": 58.699999999999996}, {"type": "recall_at_3", "value": 41.317}, {"type": "recall_at_5", "value": 45.761}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB SprintDuplicateQuestions", "type": "mteb/sprintduplicatequestions-pairclassification", "config": "default", "split": "test", "revision": "d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46"}, "metrics": [{"type": "cos_sim_accuracy", "value": 99.62871287128714}, {"type": "cos_sim_ap", "value": 85.22434241429664}, {"type": "cos_sim_f1", "value": 79.31605074462217}, {"type": "cos_sim_precision", "value": 88.43788437884379}, {"type": "cos_sim_recall", "value": 71.89999999999999}, {"type": "dot_accuracy", "value": 99.62871287128714}, {"type": "dot_ap", "value": 85.22434241429666}, {"type": "dot_f1", "value": 79.31605074462217}, {"type": "dot_precision", "value": 88.43788437884379}, {"type": "dot_recall", "value": 71.89999999999999}, {"type": "euclidean_accuracy", "value": 99.62871287128714}, {"type": "euclidean_ap", "value": 85.22434237736961}, {"type": "euclidean_f1", "value": 79.31605074462217}, {"type": "euclidean_precision", "value": 88.43788437884379}, {"type": "euclidean_recall", "value": 71.89999999999999}, {"type": "manhattan_accuracy", "value": 99.62475247524752}, {"type": "manhattan_ap", "value": 85.53918872229502}, {"type": "manhattan_f1", "value": 79.38618925831203}, {"type": "manhattan_precision", "value": 81.2565445026178}, {"type": "manhattan_recall", "value": 77.60000000000001}, {"type": "max_accuracy", "value": 99.62871287128714}, {"type": "max_ap", "value": 85.53918872229502}, {"type": "max_f1", "value": 79.38618925831203}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB StackExchangeClustering", "type": "mteb/stackexchange-clustering", "config": "default", "split": "test", "revision": "6cbc1f7b2bc0622f2e39d2c77fa502909748c259"}, "metrics": [{"type": "v_measure", "value": 39.16142357597941}, {"type": "v_measures", "value": [0.3824405761636396, 0.44216202123263126, 0.3390286805950001, 0.40370202650437953, 0.3687764786128344, 0.3002689364743748, 0.3406756129607103, 0.4239251906201308, 0.41513537797197647, 
0.39562333880392536, 0.44243846336620263, 0.4564014124962121, 0.46843968839295613, 0.3486700249457605, 0.3931094737880025, 0.38614031871714743, 0.39009948062151834, 0.3952861715088528, 0.3768164106667065, 0.39372559829701875, 0.41022022885425324, 0.3442845107165114, 0.36768421400456974, 0.40522290066464794, 0.40007875701488965]}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB StackExchangeClusteringP2P", "type": "mteb/stackexchange-clustering-p2p", "config": "default", "split": "test", "revision": "815ca46b2622cec33ccafc3735d572c266efdb44"}, "metrics": [{"type": "v_measure", "value": 29.175984546605825}, {"type": "v_measures", "value": [0.28319515044921223, 0.2715264094552343, 0.27440620100214314, 0.26830955555466396, 0.27653185247970546, 0.3178752664718975, 0.3080336049306678, 0.3068022206397505, 0.3022010188359171, 0.3087171748413907,
0.3022010188359171, 0.3087171748413907, 0.28319515044921223, 0.2715264094552343, 0.27440620100214314, 0.26830955555466396, 0.27653185247970546, 0.3178752664718975, 0.3080336049306678, 0.3068022206397505, 0.3022010188359171, 0.3087171748413907, 0.28319515044921223, 0.2715264094552343, 0.27440620100214314, 0.26830955555466396, 0.27653185247970546, 0.3178752664718975, 0.3080336049306678, 0.3068022206397505, 0.3022010188359171, 0.3087171748413907, 0.28319515044921223, 0.2715264094552343, 0.27440620100214314, 0.26830955555466396, 0.27653185247970546, 0.3178752664718975, 0.3080336049306678, 0.3068022206397505, 0.3022010188359171, 0.3087171748413907, 0.28319515044921223, 0.2715264094552343, 0.27440620100214314, 0.26830955555466396, 0.27653185247970546, 0.3178752664718975, 0.3080336049306678, 0.3068022206397505, 0.3022010188359171, 0.3087171748413907]}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB StackOverflowDupQuestions", "type": "mteb/stackoverflowdupquestions-reranking", "config": "default", "split": "test", "revision": "e185fbe320c72810689fc5848eb6114e1ef5ec69"}, "metrics": [{"type": "map", "value": 40.56760857818254}, {"type": "mrr", "value": 40.94357439945675}]}, {"task": {"type": "Summarization"}, "dataset": {"name": "MTEB SummEval", "type": "mteb/summeval", "config": "default", "split": "test", "revision": "cda12ad7615edc362dbf25a00fdd61d3b1eaf93c"}, "metrics": [{"type": "cos_sim_pearson", "value": 30.764610926778037}, {"type": "cos_sim_spearman", "value": 30.298920879214158}, {"type": "dot_pearson", "value": 30.764611831321552}, {"type": "dot_spearman", "value": 30.298299440561465}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB TRECCOVID", "type": "mteb/trec-covid", "config": "default", "split": "test", "revision": "bb9466bac8153a0349341eb1b22e06409e78ef4e"}, "metrics": [{"type": "map_at_1", "value": 0.109}, {"type": "map_at_10", "value": 0.781}, {"type": "map_at_100", "value": 2.995}, {"type": "map_at_1000", "value": 6.854}, {"type": "map_at_20", "value": 1.2}, {"type": "map_at_3", "value": 0.28700000000000003}, {"type": "map_at_5", "value": 0.434}, {"type": "mrr_at_1", "value": 42.0}, {"type": "mrr_at_10", "value": 54.955}, {"type": "mrr_at_100", "value": 55.655}, {"type": "mrr_at_1000", "value": 55.689}, {"type": "mrr_at_20", "value": 55.42399999999999}, {"type": "mrr_at_3", "value": 51.0}, {"type": "mrr_at_5", "value": 53.800000000000004}, {"type": "ndcg_at_1", "value": 39.0}, {"type": "ndcg_at_10", "value": 39.479}, {"type": "ndcg_at_100", "value": 25.752000000000002}, {"type": "ndcg_at_1000", "value": 22.868}, {"type": "ndcg_at_20", "value": 35.707}, {"type": "ndcg_at_3", "value": 39.419}, {"type": "ndcg_at_5", "value": 39.64}, {"type": "precision_at_1", "value": 42.0}, {"type": "precision_at_10", "value": 43.6}, {"type": "precision_at_100", "value": 25.88}, {"type": "precision_at_1000", "value": 10.784}, {"type": "precision_at_20", "value": 37.8}, {"type": "precision_at_3", "value": 43.333}, {"type": "precision_at_5", "value": 43.6}, {"type": "recall_at_1", "value": 0.109}, {"type": "recall_at_10", "value": 1.038}, {"type": "recall_at_100", "value": 5.495}, {"type": "recall_at_1000", "value": 21.665}, {"type": "recall_at_20", "value": 1.722}, {"type": "recall_at_3", "value": 0.318}, {"type": "recall_at_5", "value": 0.522}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB Touche2020", "type": "mteb/touche2020", "config": "default", "split": "test", "revision": "a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f"}, "metrics": [{"type": "map_at_1", 
"value": 1.302}, {"type": "map_at_10", "value": 2.514}, {"type": "map_at_100", "value": 3.341}, {"type": "map_at_1000", "value": 3.757}, {"type": "map_at_20", "value": 2.85}, {"type": "map_at_3", "value": 1.8450000000000002}, {"type": "map_at_5", "value": 1.873}, {"type": "mrr_at_1", "value": 18.367}, {"type": "mrr_at_10", "value": 24.789}, {"type": "mrr_at_100", "value": 26.517000000000003}, {"type": "mrr_at_1000", "value": 26.593}, {"type": "mrr_at_20", "value": 25.946}, {"type": "mrr_at_3", "value": 22.448999999999998}, {"type": "mrr_at_5", "value": 22.959}, {"type": "ndcg_at_1", "value": 16.326999999999998}, {"type": "ndcg_at_10", "value": 7.7509999999999994}, {"type": "ndcg_at_100", "value": 10.67}, {"type": "ndcg_at_1000", "value": 17.76}, {"type": "ndcg_at_20", "value": 7.674}, {"type": "ndcg_at_3", "value": 10.369}, {"type": "ndcg_at_5", "value": 7.840999999999999}, {"type": "precision_at_1", "value": 18.367}, {"type": "precision_at_10", "value": 7.142999999999999}, {"type": "precision_at_100", "value": 2.327}, {"type": "precision_at_1000", "value": 0.6779999999999999}, {"type": "precision_at_20", "value": 5.408}, {"type": "precision_at_3", "value": 11.565}, {"type": "precision_at_5", "value": 7.3469999999999995}, {"type": "recall_at_1", "value": 1.302}, {"type": "recall_at_10", "value": 4.919}, {"type": "recall_at_100", "value": 14.430000000000001}, {"type": "recall_at_1000", "value": 36.949}, {"type": "recall_at_20", "value": 7.0040000000000004}, {"type": "recall_at_3", "value": 2.2319999999999998}, {"type": "recall_at_5", "value": 2.3449999999999998}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB ToxicConversationsClassification", "type": "mteb/toxic_conversations_50k", "config": "default", "split": "test", "revision": "edfaf9da55d3dd50d43143d90c1ac476895ae6de"}, "metrics": [{"type": "accuracy", "value": 64.47265625}, {"type": "ap", "value": 11.979631561643862}, {"type": "f1", "value": 49.90647543589666}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB TweetSentimentExtractionClassification", "type": "mteb/tweet_sentiment_extraction", "config": "default", "split": "test", "revision": "d604517c81ca91fe16a244d1248fc021f9ecee7a"}, "metrics": [{"type": "accuracy", "value": 61.79966044142614}, {"type": "f1", "value": 61.89030508018869}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB TwentyNewsgroupsClustering", "type": "mteb/twentynewsgroups-clustering", "config": "default", "split": "test", "revision": "6125ec4e24fa026cec8a478383ee943acfbd5449"}, "metrics": [{"type": "v_measure", "value": 28.234217666259703}, {"type": "v_measures", "value": [0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 
0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 
0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 
0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 
0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 
0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 
0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239, 0.29450695840941515, 0.30590470809304793, 0.29205899710992034, 0.27123807357354457, 0.28092608890535714, 0.2787486406145347, 0.26689540227394454, 0.26139744229328293, 0.2785944239497992, 0.2931510314031239]}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB TwitterSemEval2015", "type": "mteb/twittersemeval2015-pairclassification", "config": "default", "split": "test", "revision": "70970daeab8776df92f5ea462b6173c0b46fd2d1"}, "metrics": [{"type": "cos_sim_accuracy", "value": 84.0317100792752}, {"type": "cos_sim_ap", "value": 67.56361271781817}, {"type": "cos_sim_f1", "value": 63.082081211970696}, {"type": "cos_sim_precision", "value": 59.58245367112362}, {"type": "cos_sim_recall", "value": 67.01846965699208}, {"type": "dot_accuracy", "value": 84.0317100792752}, {"type": "dot_ap", "value": 67.56359342938897}, {"type": "dot_f1", "value": 63.082081211970696}, {"type": "dot_precision", "value": 59.58245367112362}, {"type": "dot_recall", "value": 67.01846965699208}, {"type": "euclidean_accuracy", "value": 84.0317100792752}, {"type": "euclidean_ap", "value": 67.5636169518733}, {"type": "euclidean_f1", "value": 63.082081211970696}, {"type": "euclidean_precision", "value": 59.58245367112362}, {"type": "euclidean_recall", "value": 67.01846965699208}, {"type": "manhattan_accuracy", "value": 84.0734338677952}, {"type": "manhattan_ap", "value": 67.44969672020721}, {"type": "manhattan_f1", "value": 63.09479205695017}, {"type": "manhattan_precision", "value": 59.90040313018734}, {"type": "manhattan_recall", "value": 66.64907651715039}, {"type": "max_accuracy", "value": 84.0734338677952}, {"type": "max_ap", "value": 
67.5636169518733}, {"type": "max_f1", "value": 63.09479205695017}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB TwitterURLCorpus", "type": "mteb/twitterurlcorpus-pairclassification", "config": "default", "split": "test", "revision": "8b6510b0b1fa4e4c4f879467980e9be563ec1cdf"}, "metrics": [{"type": "cos_sim_accuracy", "value": 87.60624054022587}, {"type": "cos_sim_ap", "value": 82.94451598409692}, {"type": "cos_sim_f1", "value": 74.76484194294527}, {"type": "cos_sim_precision", "value": 74.86874613959235}, {"type": "cos_sim_recall", "value": 74.66122574684324}, {"type": "dot_accuracy", "value": 87.60624054022587}, {"type": "dot_ap", "value": 82.94451133280317}, {"type": "dot_f1", "value": 74.76484194294527}, {"type": "dot_precision", "value": 74.86874613959235}, {"type": "dot_recall", "value": 74.66122574684324}, {"type": "euclidean_accuracy", "value": 87.60624054022587}, {"type": "euclidean_ap", "value": 82.94449586426977}, {"type": "euclidean_f1", "value": 74.76484194294527}, {"type": "euclidean_precision", "value": 74.86874613959235}, {"type": "euclidean_recall", "value": 74.66122574684324}, {"type": "manhattan_accuracy", "value": 87.63922847052432}, {"type": "manhattan_ap", "value": 82.9449637573502}, {"type": "manhattan_f1", "value": 74.9452996046217}, {"type": "manhattan_precision", "value": 74.73015386970833}, {"type": "manhattan_recall", "value": 75.1616877117339}, {"type": "max_accuracy", "value": 87.63922847052432}, {"type": "max_ap", "value": 82.9449637573502}, {"type": "max_f1", "value": 74.9452996046217}]}]}]}
Mihaiii/Squirtle
null
[ "sentence-transformers", "onnx", "safetensors", "bert", "feature-extraction", "sentence-similarity", "bge", "mteb", "dataset:Mihaiii/qa-assistant", "license:mit", "model-index", "endpoints_compatible", "region:us" ]
null
2024-04-30T15:06:52+00:00
[]
[]
TAGS #sentence-transformers #onnx #safetensors #bert #feature-extraction #sentence-similarity #bge #mteb #dataset-Mihaiii/qa-assistant #license-mit #model-index #endpoints_compatible #region-us
# Squirtle Squirtle is a distilled version of bge-base-en-v1.5. ## Intended purpose <span style="color:blue">This model is designed for use in semantic-autocomplete (click here for demo).</span> Make sure you also pass 'pipelineParams={{ pooling: "cls", normalize: true }}' since the default pooling in the component is mean. ## Usage Other than within semantic-autocomplete, you can use this model the same way as bge-base-en-v1.5.
[ "# Squirtle\n\nSquirtle is a distill of bge-base-en-v1.5.", "## Intended purpose\n\n<span style=\"color:blue\">This model is designed for use in semantic-autocomplete (click here for demo).</span>\nMake sure you also pass 'pipelineParams={{ pooling: \"cls\", normalize: true }}' since the default pooling in the component is mean.", "## Usage\n\nOther than within semantic-autocomplete, you can use this model same as bge-base-en-v1.5." ]
[ "TAGS\n#sentence-transformers #onnx #safetensors #bert #feature-extraction #sentence-similarity #bge #mteb #dataset-Mihaiii/qa-assistant #license-mit #model-index #endpoints_compatible #region-us \n", "# Squirtle\n\nSquirtle is a distill of bge-base-en-v1.5.", "## Intended purpose\n\n<span style=\"color:blue\">This model is designed for use in semantic-autocomplete (click here for demo).</span>\nMake sure you also pass 'pipelineParams={{ pooling: \"cls\", normalize: true }}' since the default pooling in the component is mean.", "## Usage\n\nOther than within semantic-autocomplete, you can use this model same as bge-base-en-v1.5." ]
[ 57, 27, 76, 32 ]
[ "TAGS\n#sentence-transformers #onnx #safetensors #bert #feature-extraction #sentence-similarity #bge #mteb #dataset-Mihaiii/qa-assistant #license-mit #model-index #endpoints_compatible #region-us \n# Squirtle\n\nSquirtle is a distill of bge-base-en-v1.5.## Intended purpose\n\n<span style=\"color:blue\">This model is designed for use in semantic-autocomplete (click here for demo).</span>\nMake sure you also pass 'pipelineParams={{ pooling: \"cls\", normalize: true }}' since the default pooling in the component is mean.## Usage\n\nOther than within semantic-autocomplete, you can use this model same as bge-base-en-v1.5." ]
text-to-image
null
## Ring <img src="https://via.placeholder.com/468x300?text=App+Screenshot+Here" alt="Generated on Image Pipeline" style="border-radius: 10px;"> **This LoRA model is uploaded on [imagepipeline.io](https://imagepipeline.io/)** Model details - ring gag [![Try this model](https://img.shields.io/badge/try_this_model-image_pipeline-BD9319)](https://imagepipeline.io/models/Ring?id=497630fb-da2f-40ce-ab65-fc9c5428ab9c/) ## How to try this model? You can try using it locally or send an API call to test the output quality. Get your `API_KEY` from [imagepipeline.io](https://imagepipeline.io/). No payment required. Coding in `php` `javascript` `node` etc.? Check out our documentation [![documentation](https://img.shields.io/badge/documentation-image_pipeline-blue)](https://docs.imagepipeline.io/docs/introduction) ```python import requests import json url = "https://imagepipeline.io/sd/text2image/v1/run" payload = json.dumps({ "model_id": "sd1.5", "prompt": "ultra realistic close up portrait ((beautiful pale cyberpunk female with heavy black eyeliner)), blue eyes, shaved side haircut, hyper detail, cinematic lighting, magic neon, dark red city, Canon EOS R3, nikon, f/1.4, ISO 200, 1/160s, 8K, RAW, unedited, symmetrical balance, in-frame, 8K", "negative_prompt": "painting, extra fingers, mutated hands, poorly drawn hands, poorly drawn face, deformed, ugly, blurry, bad anatomy, bad proportions, extra limbs, cloned face, skinny, glitchy, double torso, extra arms, extra hands, mangled fingers, missing lips, ugly face, distorted face, extra legs, anime", "width": "512", "height": "512", "samples": "1", "num_inference_steps": "30", "safety_checker": false, "guidance_scale": 7.5, "multi_lingual": "no", "embeddings": "", "lora_models": "497630fb-da2f-40ce-ab65-fc9c5428ab9c", "lora_weights": "0.5" }) headers = { 'Content-Type': 'application/json', 'API-Key': 'your_api_key' } response = requests.request("POST", url, headers=headers, data=payload) print(response.text) ``` Get more ready-to-use `MODELS` like this for `SD 1.5` and `SDXL`: [![All models](https://img.shields.io/badge/Get%20All%20Models-image_pipeline-BD9319)](https://imagepipeline.io/models) ### API Reference #### Generate Image ```http https://api.imagepipeline.io/sd/text2image/v1 ``` | Headers | Type | Description | |:----------------------| :------- |:-------------------------------------------------------------------------------------------------------------------| | `API-Key` | `str` | Get your `API_KEY` from [imagepipeline.io](https://imagepipeline.io/) | | `Content-Type` | `str` | application/json - content type of the request body | | Parameter | Type | Description | | :-------- | :------- | :------------------------- | | `model_id` | `str` | Your base model, find available lists in [models page](https://imagepipeline.io/models) or upload your own | | `prompt` | `str` | Text Prompt. Check our [Prompt Guide](https://docs.imagepipeline.io/docs/SD-1.5/docs/extras/prompt-guide) for tips | | `num_inference_steps` | `int [1-50]` | Noise is removed with each step, resulting in a higher-quality image over time. Ideal value 30-50 (without LCM) | | `guidance_scale` | `float [1-20]` | Higher guidance scale prioritizes text prompt relevance but sacrifices image quality. 
Ideal value 7.5-12.5 | | `lora_models` | `str, array` | Pass the model_id(s) of LoRA models that can be found in models page | | `lora_weights` | `str, array` | Strength of the LoRA effect | --- license: creativeml-openrail-m tags: - imagepipeline - imagepipeline.io - text-to-image - ultra-realistic pinned: false pipeline_tag: text-to-image --- ### Feedback If you have any feedback, please reach out to us at [email protected] #### 🔗 Visit Website [![portfolio](https://img.shields.io/badge/image_pipeline-BD9319?style=for-the-badge&logo=gocd&logoColor=white)](https://imagepipeline.io/) If you are the original author of this model, please [click here](https://airtable.com/apprTaRnJbDJ8ufOx/shr4g7o9B6fWfOlUR) to add credits
{"license": "creativeml-openrail-m", "tags": ["imagepipeline", "imagepipeline.io", "text-to-image", "ultra-realistic"], "pinned": false, "pipeline_tag": "text-to-image"}
imagepipeline/Ring
null
[ "imagepipeline", "imagepipeline.io", "text-to-image", "ultra-realistic", "license:creativeml-openrail-m", "region:us" ]
null
2024-04-30T15:07:20+00:00
[]
[]
TAGS #imagepipeline #imagepipeline.io #text-to-image #ultra-realistic #license-creativeml-openrail-m #region-us
Ring ---- <img src="URL alt="Generated on Image Pipeline" style="border-radius: 10px;"> This lora model is uploaded on URL Model details - ring gag ![Try this model](URL How to try this model ? ----------------------- You can try using it locally or send an API call to test the output quality. Get your 'API\_KEY' from URL. No payment required. Coding in 'php' 'javascript' 'node' etc ? Checkout our documentation ![documentation](URL Get more ready to use 'MODELS' like this for 'SD 1.5' and 'SDXL' : ![All models](URL ### API Reference #### Generate Image --- license: creativeml-openrail-m tags: * imagepipeline * URL * text-to-image * ultra-realistic pinned: false pipeline\_tag: text-to-image --- ### Feedback If you have any feedback, please reach out to us at hello@URL #### Visit Website ![portfolio](URL If you are the original author of this model, please click here to add credits
[ "### API Reference", "#### Generate Image\n\n\n\n\n\n\n---\n\n\nlicense: creativeml-openrail-m\ntags:\n\n\n* imagepipeline\n* URL\n* text-to-image\n* ultra-realistic\npinned: false\npipeline\\_tag: text-to-image\n\n\n\n\n---", "### Feedback\n\n\nIf you have any feedback, please reach out to us at hello@URL", "#### Visit Website\n\n\n![portfolio](URL\n\n\nIf you are the original author of this model, please click here to add credits" ]
[ "TAGS\n#imagepipeline #imagepipeline.io #text-to-image #ultra-realistic #license-creativeml-openrail-m #region-us \n", "### API Reference", "#### Generate Image\n\n\n\n\n\n\n---\n\n\nlicense: creativeml-openrail-m\ntags:\n\n\n* imagepipeline\n* URL\n* text-to-image\n* ultra-realistic\npinned: false\npipeline\\_tag: text-to-image\n\n\n\n\n---", "### Feedback\n\n\nIf you have any feedback, please reach out to us at hello@URL", "#### Visit Website\n\n\n![portfolio](URL\n\n\nIf you are the original author of this model, please click here to add credits" ]
[ 35, 5, 53, 20, 29 ]
[ "TAGS\n#imagepipeline #imagepipeline.io #text-to-image #ultra-realistic #license-creativeml-openrail-m #region-us \n### API Reference#### Generate Image\n\n\n\n\n\n\n---\n\n\nlicense: creativeml-openrail-m\ntags:\n\n\n* imagepipeline\n* URL\n* text-to-image\n* ultra-realistic\npinned: false\npipeline\\_tag: text-to-image\n\n\n\n\n---### Feedback\n\n\nIf you have any feedback, please reach out to us at hello@URL#### Visit Website\n\n\n![portfolio](URL\n\n\nIf you are the original author of this model, please click here to add credits" ]
text-classification
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # WangchanBERTa This model was trained from scratch on an unknown dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 48 - eval_batch_size: 48 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 5 ### Framework versions - Transformers 4.40.1 - Pytorch 2.2.1+cu121 - Datasets 2.19.0 - Tokenizers 0.19.1
{"tags": ["generated_from_trainer"], "model-index": [{"name": "WangchanBERTa", "results": []}]}
tidarat/WangchanBERTa
null
[ "transformers", "safetensors", "camembert", "text-classification", "generated_from_trainer", "autotrain_compatible", "endpoints_compatible", "region:us" ]
null
2024-04-30T15:09:48+00:00
[]
[]
TAGS #transformers #safetensors #camembert #text-classification #generated_from_trainer #autotrain_compatible #endpoints_compatible #region-us
# WangchanBERTa This model was trained from scratch on an unknown dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 48 - eval_batch_size: 48 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 5 ### Framework versions - Transformers 4.40.1 - Pytorch 2.2.1+cu121 - Datasets 2.19.0 - Tokenizers 0.19.1
[ "# WangchanBERTa\n\nThis model was trained from scratch on an unknown dataset.", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 48\n- eval_batch_size: 48\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 5", "### Framework versions\n\n- Transformers 4.40.1\n- Pytorch 2.2.1+cu121\n- Datasets 2.19.0\n- Tokenizers 0.19.1" ]
[ "TAGS\n#transformers #safetensors #camembert #text-classification #generated_from_trainer #autotrain_compatible #endpoints_compatible #region-us \n", "# WangchanBERTa\n\nThis model was trained from scratch on an unknown dataset.", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 48\n- eval_batch_size: 48\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 5", "### Framework versions\n\n- Transformers 4.40.1\n- Pytorch 2.2.1+cu121\n- Datasets 2.19.0\n- Tokenizers 0.19.1" ]
[ 35, 17, 7, 9, 9, 4, 93, 44 ]
[ "TAGS\n#transformers #safetensors #camembert #text-classification #generated_from_trainer #autotrain_compatible #endpoints_compatible #region-us \n# WangchanBERTa\n\nThis model was trained from scratch on an unknown dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 48\n- eval_batch_size: 48\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 5### Framework versions\n\n- Transformers 4.40.1\n- Pytorch 2.2.1+cu121\n- Datasets 2.19.0\n- Tokenizers 0.19.1" ]
text-generation
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
abc88767/model19
null
[ "transformers", "safetensors", "stablelm", "text-generation", "conversational", "arxiv:1910.09700", "autotrain_compatible", "endpoints_compatible", "region:us" ]
null
2024-04-30T15:10:54+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #stablelm #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #stablelm #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ 41, 6, 4, 75, 23, 3, 5, 8, 9, 8, 34, 20, 4, 5, 5, 11, 13, 12, 3, 10, 6, 5, 6, 4, 5, 7, 49, 7, 7, 5, 5, 15, 7, 7, 8, 5 ]
[ "TAGS\n#transformers #safetensors #stablelm #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact" ]
null
null
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # O0430HMA25 This model is a fine-tuned version of [allenai/OLMo-1B](https://huggingface.co/allenai/OLMo-1B) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.1464 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0003 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 16 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine_with_restarts - lr_scheduler_warmup_steps: 80 - num_epochs: 3 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:-----:|:----:|:---------------:| | 2.3321 | 0.09 | 10 | 0.2026 | | 0.1587 | 0.18 | 20 | 0.1513 | | 0.1484 | 0.27 | 30 | 0.1700 | | 0.157 | 0.36 | 40 | 0.1554 | | 0.1545 | 0.45 | 50 | 0.1565 | | 0.1711 | 0.54 | 60 | 0.1519 | | 0.3899 | 0.63 | 70 | 0.1762 | | 0.1605 | 0.73 | 80 | 0.1588 | | 0.1806 | 0.82 | 90 | 0.1766 | | 0.2771 | 0.91 | 100 | 0.1774 | | 0.3353 | 1.0 | 110 | 1.2812 | | 0.7796 | 1.09 | 120 | 3.0669 | | 0.6043 | 1.18 | 130 | 0.1611 | | 0.1654 | 1.27 | 140 | 0.1517 | | 0.156 | 1.36 | 150 | 0.1525 | | 0.1492 | 1.45 | 160 | 0.1531 | | 0.1496 | 1.54 | 170 | 0.1504 | | 0.1528 | 1.63 | 180 | 0.1523 | | 0.1504 | 1.72 | 190 | 0.1482 | | 0.1467 | 1.81 | 200 | 0.1498 | | 0.1511 | 1.9 | 210 | 0.1509 | | 0.1486 | 1.99 | 220 | 0.1509 | | 0.1502 | 2.08 | 230 | 0.1493 | | 0.1429 | 2.18 | 240 | 0.1471 | | 0.1444 | 2.27 | 250 | 0.1490 | | 0.1463 | 2.36 | 260 | 0.1482 | | 0.1447 | 2.45 | 270 | 0.1471 | | 0.1431 | 2.54 | 280 | 0.1471 | | 0.144 | 2.63 | 290 | 0.1477 | | 0.1464 | 2.72 | 300 | 0.1465 | | 0.1453 | 2.81 | 310 | 0.1463 | | 0.1459 | 2.9 | 320 | 0.1464 | | 0.1467 | 2.99 | 330 | 0.1464 | ### Framework versions - Transformers 4.36.0.dev0 - Pytorch 2.1.2+cu121 - Datasets 2.14.6 - Tokenizers 0.14.1
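For readers who want to see how the hyperparameters listed in this card map onto a Hugging Face `TrainingArguments` object, a rough sketch follows. Only values named in the card are set; the `output_dir` is an assumed placeholder, and this is not the author's actual training script.

```python
# Sketch of TrainingArguments mirroring the hyperparameters listed in this card.
# output_dir is a placeholder; the other values copy the card's stated settings.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="O0430HMA25",            # assumption: named after the card
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=16,
    lr_scheduler_type="cosine_with_restarts",
    warmup_steps=80,
    num_train_epochs=3,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    fp16=True,                          # "Native AMP" mixed precision
)
```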
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "base_model": "allenai/OLMo-1B", "model-index": [{"name": "O0430HMA25", "results": []}]}
Litzy619/O0430HMA25
null
[ "safetensors", "generated_from_trainer", "base_model:allenai/OLMo-1B", "license:apache-2.0", "region:us" ]
null
2024-04-30T15:11:12+00:00
[]
[]
TAGS #safetensors #generated_from_trainer #base_model-allenai/OLMo-1B #license-apache-2.0 #region-us
O0430HMA25 ========== This model is a fine-tuned version of allenai/OLMo-1B on an unknown dataset. It achieves the following results on the evaluation set: * Loss: 0.1464 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 0.0003 * train\_batch\_size: 8 * eval\_batch\_size: 8 * seed: 42 * gradient\_accumulation\_steps: 16 * total\_train\_batch\_size: 128 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: cosine\_with\_restarts * lr\_scheduler\_warmup\_steps: 80 * num\_epochs: 3 * mixed\_precision\_training: Native AMP ### Training results ### Framework versions * Transformers 4.36.0.dev0 * Pytorch 2.1.2+cu121 * Datasets 2.14.6 * Tokenizers 0.14.1
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0003\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 16\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\\_with\\_restarts\n* lr\\_scheduler\\_warmup\\_steps: 80\n* num\\_epochs: 3\n* mixed\\_precision\\_training: Native AMP", "### Training results", "### Framework versions\n\n\n* Transformers 4.36.0.dev0\n* Pytorch 2.1.2+cu121\n* Datasets 2.14.6\n* Tokenizers 0.14.1" ]
[ "TAGS\n#safetensors #generated_from_trainer #base_model-allenai/OLMo-1B #license-apache-2.0 #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0003\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 16\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\\_with\\_restarts\n* lr\\_scheduler\\_warmup\\_steps: 80\n* num\\_epochs: 3\n* mixed\\_precision\\_training: Native AMP", "### Training results", "### Framework versions\n\n\n* Transformers 4.36.0.dev0\n* Pytorch 2.1.2+cu121\n* Datasets 2.14.6\n* Tokenizers 0.14.1" ]
[ 35, 160, 5, 47 ]
[ "TAGS\n#safetensors #generated_from_trainer #base_model-allenai/OLMo-1B #license-apache-2.0 #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0003\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 16\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\\_with\\_restarts\n* lr\\_scheduler\\_warmup\\_steps: 80\n* num\\_epochs: 3\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.36.0.dev0\n* Pytorch 2.1.2+cu121\n* Datasets 2.14.6\n* Tokenizers 0.14.1" ]
null
null
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # O0430HMA26 This model is a fine-tuned version of [allenai/OLMo-1B](https://huggingface.co/allenai/OLMo-1B) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.0687 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0003 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 16 - total_train_batch_size: 128 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine_with_restarts - lr_scheduler_warmup_steps: 80 - num_epochs: 3 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:-----:|:----:|:---------------:| | 2.3358 | 0.09 | 10 | 0.2025 | | 0.1669 | 0.18 | 20 | 0.1548 | | 0.1489 | 0.27 | 30 | 0.1637 | | 0.1568 | 0.36 | 40 | 0.1512 | | 0.1504 | 0.45 | 50 | 0.1543 | | 0.151 | 0.54 | 60 | 0.1516 | | 0.1517 | 0.63 | 70 | 0.1471 | | 0.1512 | 0.73 | 80 | 0.1543 | | 0.1466 | 0.82 | 90 | 0.1497 | | 0.1498 | 0.91 | 100 | 0.1498 | | 0.421 | 1.0 | 110 | 0.7483 | | 0.6289 | 1.09 | 120 | 0.4520 | | 0.6214 | 1.18 | 130 | 0.5937 | | 1.234 | 1.27 | 140 | 0.1622 | | 0.1733 | 1.36 | 150 | 0.2787 | | 0.1681 | 1.45 | 160 | 0.1530 | | 0.151 | 1.54 | 170 | 0.1512 | | 0.1503 | 1.63 | 180 | 0.1482 | | 0.146 | 1.72 | 190 | 0.1427 | | 0.1328 | 1.81 | 200 | 0.1179 | | 0.1194 | 1.9 | 210 | 0.0917 | | 0.0822 | 1.99 | 220 | 0.0782 | | 0.0783 | 2.08 | 230 | 0.0748 | | 0.0772 | 2.18 | 240 | 0.0769 | | 0.0714 | 2.27 | 250 | 0.0763 | | 0.0793 | 2.36 | 260 | 0.0728 | | 0.0725 | 2.45 | 270 | 0.0720 | | 0.0628 | 2.54 | 280 | 0.0674 | | 0.0705 | 2.63 | 290 | 0.0689 | | 0.075 | 2.72 | 300 | 0.0708 | | 0.0687 | 2.81 | 310 | 0.0688 | | 0.0715 | 2.9 | 320 | 0.0687 | | 0.0762 | 2.99 | 330 | 0.0687 | ### Framework versions - Transformers 4.36.0.dev0 - Pytorch 2.1.2+cu121 - Datasets 2.14.6 - Tokenizers 0.14.1
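This card lists the same hyperparameter block as the previous record. As a small sanity check (illustrative arithmetic only, not from the training script), the reported `total_train_batch_size` follows directly from the per-device batch size and the gradient accumulation steps:

```python
# Illustrative arithmetic only: how the card's total_train_batch_size of 128
# follows from its per-device batch size and gradient accumulation steps.
train_batch_size = 8
gradient_accumulation_steps = 16
total_train_batch_size = train_batch_size * gradient_accumulation_steps
assert total_train_batch_size == 128
```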
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "base_model": "allenai/OLMo-1B", "model-index": [{"name": "O0430HMA26", "results": []}]}
Litzy619/O0430HMA26
null
[ "safetensors", "generated_from_trainer", "base_model:allenai/OLMo-1B", "license:apache-2.0", "region:us" ]
null
2024-04-30T15:11:21+00:00
[]
[]
TAGS #safetensors #generated_from_trainer #base_model-allenai/OLMo-1B #license-apache-2.0 #region-us
O0430HMA26 ========== This model is a fine-tuned version of allenai/OLMo-1B on an unknown dataset. It achieves the following results on the evaluation set: * Loss: 0.0687 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 0.0003 * train\_batch\_size: 8 * eval\_batch\_size: 8 * seed: 42 * gradient\_accumulation\_steps: 16 * total\_train\_batch\_size: 128 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: cosine\_with\_restarts * lr\_scheduler\_warmup\_steps: 80 * num\_epochs: 3 * mixed\_precision\_training: Native AMP ### Training results ### Framework versions * Transformers 4.36.0.dev0 * Pytorch 2.1.2+cu121 * Datasets 2.14.6 * Tokenizers 0.14.1
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0003\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 16\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\\_with\\_restarts\n* lr\\_scheduler\\_warmup\\_steps: 80\n* num\\_epochs: 3\n* mixed\\_precision\\_training: Native AMP", "### Training results", "### Framework versions\n\n\n* Transformers 4.36.0.dev0\n* Pytorch 2.1.2+cu121\n* Datasets 2.14.6\n* Tokenizers 0.14.1" ]
[ "TAGS\n#safetensors #generated_from_trainer #base_model-allenai/OLMo-1B #license-apache-2.0 #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0003\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 16\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\\_with\\_restarts\n* lr\\_scheduler\\_warmup\\_steps: 80\n* num\\_epochs: 3\n* mixed\\_precision\\_training: Native AMP", "### Training results", "### Framework versions\n\n\n* Transformers 4.36.0.dev0\n* Pytorch 2.1.2+cu121\n* Datasets 2.14.6\n* Tokenizers 0.14.1" ]
[ 35, 160, 5, 47 ]
[ "TAGS\n#safetensors #generated_from_trainer #base_model-allenai/OLMo-1B #license-apache-2.0 #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0003\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 16\n* total\\_train\\_batch\\_size: 128\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\\_with\\_restarts\n* lr\\_scheduler\\_warmup\\_steps: 80\n* num\\_epochs: 3\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.36.0.dev0\n* Pytorch 2.1.2+cu121\n* Datasets 2.14.6\n* Tokenizers 0.14.1" ]
sentence-similarity
sentence-transformers
# Wartortle Wartortle is a distillation of [bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5). ## Intended purpose <span style="color:blue">This model is designed for use in semantic-autocomplete ([click here for a demo](https://mihaiii.github.io/semantic-autocomplete/)).</span> Make sure you also pass `pipelineParams={{ pooling: "cls", normalize: true }}`, since the component's default pooling is mean. ## Usage Outside of [semantic-autocomplete](https://github.com/Mihaiii/semantic-autocomplete), you can use this model the same way as [bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5#usage).
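A minimal usage sketch outside semantic-autocomplete, following the bge-base-en-v1.5-style usage the card points to. The repo id `Mihaiii/Wartortle` is an assumption inferred from this record's demo and dataset links, and the example sentences are illustrative.

```python
# Minimal sentence-transformers sketch; the repo id Mihaiii/Wartortle is assumed.
# The library loads the model's stored pooling config (the card indicates CLS);
# normalize_embeddings=True makes the dot product below a cosine similarity.
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("Mihaiii/Wartortle")
sentences = ["The weather is lovely today.", "It is sunny outside."]
embeddings = model.encode(sentences, normalize_embeddings=True)
print(embeddings[0] @ embeddings[1])  # cosine similarity
```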
{"license": "mit", "library_name": "sentence-transformers", "tags": ["sentence-transformers", "feature-extraction", "sentence-similarity", "bge", "mteb"], "datasets": ["Mihaiii/qa-assistant"], "pipeline_tag": "sentence-similarity", "model-index": [{"name": "Wartortle", "results": [{"task": {"type": "Classification"}, "dataset": {"name": "MTEB AmazonCounterfactualClassification (en)", "type": "mteb/amazon_counterfactual", "config": "en", "split": "test", "revision": "e8379541af4e31359cca9fbcf4b00f2671dba205"}, "metrics": [{"type": "accuracy", "value": 70.40298507462687}, {"type": "ap", "value": 32.88973775597331}, {"type": "f1", "value": 64.3726772221329}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB AmazonPolarityClassification", "type": "mteb/amazon_polarity", "config": "default", "split": "test", "revision": "e2d317d38cd51312af73b3d32a06d1a08b442046"}, "metrics": [{"type": "accuracy", "value": 82.0381}, {"type": "ap", "value": 77.15483149750918}, {"type": "f1", "value": 81.97695449378108}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB AmazonReviewsClassification (en)", "type": "mteb/amazon_reviews_multi", "config": "en", "split": "test", "revision": "1399c76144fd37290681b995c656ef9b2e06e26d"}, "metrics": [{"type": "accuracy", "value": 42.412}, {"type": "f1", "value": 41.039684315409595}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB ArguAna", "type": "mteb/arguana", "config": "default", "split": "test", "revision": "c22ab2a51041ffd869aaddef7af8d8215647e41a"}, "metrics": [{"type": "map_at_1", "value": 16.003}, {"type": "map_at_10", "value": 28.448}, {"type": "map_at_100", "value": 29.781999999999996}, {"type": "map_at_1000", "value": 29.822}, {"type": "map_at_20", "value": 29.278}, {"type": "map_at_3", "value": 23.874000000000002}, {"type": "map_at_5", "value": 26.491}, {"type": "mrr_at_1", "value": 16.714000000000002}, {"type": "mrr_at_10", "value": 28.727999999999998}, {"type": "mrr_at_100", "value": 30.055}, {"type": "mrr_at_1000", "value": 30.095}, {"type": "mrr_at_20", "value": 29.558}, {"type": "mrr_at_3", "value": 24.194}, {"type": "mrr_at_5", "value": 26.778999999999996}, {"type": "ndcg_at_1", "value": 16.003}, {"type": "ndcg_at_10", "value": 35.865}, {"type": "ndcg_at_100", "value": 42.304}, {"type": "ndcg_at_1000", "value": 43.333}, {"type": "ndcg_at_20", "value": 38.876}, {"type": "ndcg_at_3", "value": 26.436999999999998}, {"type": "ndcg_at_5", "value": 31.139}, {"type": "precision_at_1", "value": 16.003}, {"type": "precision_at_10", "value": 5.982}, {"type": "precision_at_100", "value": 0.898}, {"type": "precision_at_1000", "value": 0.098}, {"type": "precision_at_20", "value": 3.585}, {"type": "precision_at_3", "value": 11.285}, {"type": "precision_at_5", "value": 9.046999999999999}, {"type": "recall_at_1", "value": 16.003}, {"type": "recall_at_10", "value": 59.815}, {"type": "recall_at_100", "value": 89.75800000000001}, {"type": "recall_at_1000", "value": 97.795}, {"type": "recall_at_20", "value": 71.693}, {"type": "recall_at_3", "value": 33.855000000000004}, {"type": "recall_at_5", "value": 45.235}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB ArxivClusteringP2P", "type": "mteb/arxiv-clustering-p2p", "config": "default", "split": "test", "revision": "a122ad7f3f0291bf49cc6f4d32aa80929df69d5d"}, "metrics": [{"type": "v_measure", "value": 35.843668514122115}, {"type": "v_measures", "value": [0.3334224034497392, 0.3341547890740972, 0.3357840169117339, 0.34882361674739576, 0.3295989566449552, 0.346573603986452, 
0.3336839394053626, 0.33891447096132693, 0.3324032010342291, 0.3292339117184913, 0.413709338323792, 0.42196122544420617, 0.41164392816543693, 0.42687170671748026, 0.4196249053060285, 0.41799639579395365, 0.4197169573853409, 0.42335330439048224, 0.4140526534891295, 0.4219636794179018, 0.41570945861753056, 0.2537851472263153, 0.2634446492249144, 0.34061280544819494, 0.2898513290669649, 0.1963574383667656, 0.2587887795105382, 0.12932481412215552, 0.1917433748379367, 1.0, 0.21843243858900313,
0.41164392816543693, 0.42687170671748026, 0.4196249053060285, 0.41799639579395365, 0.4197169573853409, 0.42335330439048224, 0.4140526534891295, 0.4219636794179018, 0.41570945861753056, 0.2537851472263153, 0.2634446492249144, 0.34061280544819494, 0.2898513290669649, 0.1963574383667656, 0.2587887795105382, 0.12932481412215552, 0.1917433748379367, 1.0, 0.21843243858900313, 0.3334224034497392, 0.3341547890740972, 0.3357840169117339, 0.34882361674739576, 0.3295989566449552, 0.346573603986452, 0.3336839394053626, 0.33891447096132693, 0.3324032010342291, 0.3292339117184913, 0.413709338323792, 0.42196122544420617, 0.41164392816543693, 0.42687170671748026, 0.4196249053060285, 0.41799639579395365, 0.4197169573853409, 0.42335330439048224, 0.4140526534891295, 0.4219636794179018, 0.41570945861753056, 0.2537851472263153, 0.2634446492249144, 0.34061280544819494, 0.2898513290669649, 0.1963574383667656, 0.2587887795105382, 0.12932481412215552, 0.1917433748379367, 1.0, 0.21843243858900313, 0.3334224034497392, 0.3341547890740972, 0.3357840169117339, 0.34882361674739576, 0.3295989566449552, 0.346573603986452, 0.3336839394053626, 0.33891447096132693, 0.3324032010342291, 0.3292339117184913, 0.413709338323792, 0.42196122544420617, 0.41164392816543693, 0.42687170671748026, 0.4196249053060285, 0.41799639579395365, 0.4197169573853409, 0.42335330439048224, 0.4140526534891295, 0.4219636794179018, 0.41570945861753056, 0.2537851472263153, 0.2634446492249144, 0.34061280544819494, 0.2898513290669649, 0.1963574383667656, 0.2587887795105382, 0.12932481412215552, 0.1917433748379367, 1.0, 0.21843243858900313, 0.3334224034497392, 0.3341547890740972, 0.3357840169117339, 0.34882361674739576, 0.3295989566449552, 0.346573603986452, 0.3336839394053626, 0.33891447096132693, 0.3324032010342291, 0.3292339117184913, 0.413709338323792, 0.42196122544420617, 0.41164392816543693, 0.42687170671748026, 0.4196249053060285, 0.41799639579395365, 0.4197169573853409, 0.42335330439048224, 0.4140526534891295, 0.4219636794179018, 0.41570945861753056, 0.2537851472263153, 0.2634446492249144, 0.34061280544819494, 0.2898513290669649, 0.1963574383667656, 0.2587887795105382, 0.12932481412215552, 0.1917433748379367, 1.0, 0.21843243858900313, 0.3334224034497392, 0.3341547890740972, 0.3357840169117339, 0.34882361674739576, 0.3295989566449552, 0.346573603986452, 0.3336839394053626, 0.33891447096132693, 0.3324032010342291, 0.3292339117184913, 0.413709338323792, 0.42196122544420617, 0.41164392816543693, 0.42687170671748026, 0.4196249053060285, 0.41799639579395365, 0.4197169573853409, 0.42335330439048224, 0.4140526534891295, 0.4219636794179018, 0.41570945861753056, 0.2537851472263153, 0.2634446492249144, 0.34061280544819494, 0.2898513290669649, 0.1963574383667656, 0.2587887795105382, 0.12932481412215552, 0.1917433748379367, 1.0, 0.21843243858900313, 0.3334224034497392, 0.3341547890740972, 0.3357840169117339, 0.34882361674739576, 0.3295989566449552, 0.346573603986452, 0.3336839394053626, 0.33891447096132693, 0.3324032010342291, 0.3292339117184913, 0.413709338323792, 0.42196122544420617, 0.41164392816543693, 0.42687170671748026, 0.4196249053060285, 0.41799639579395365, 0.4197169573853409, 0.42335330439048224, 0.4140526534891295, 0.4219636794179018, 0.41570945861753056, 0.2537851472263153, 0.2634446492249144, 0.34061280544819494, 0.2898513290669649, 0.1963574383667656, 0.2587887795105382, 0.12932481412215552, 0.1917433748379367, 1.0, 0.21843243858900313, 0.3334224034497392, 0.3341547890740972, 0.3357840169117339, 0.34882361674739576, 0.3295989566449552, 
0.346573603986452, 0.3336839394053626, 0.33891447096132693, 0.3324032010342291, 0.3292339117184913, 0.413709338323792, 0.42196122544420617, 0.41164392816543693, 0.42687170671748026, 0.4196249053060285, 0.41799639579395365, 0.4197169573853409, 0.42335330439048224, 0.4140526534891295, 0.4219636794179018, 0.41570945861753056, 0.2537851472263153, 0.2634446492249144, 0.34061280544819494, 0.2898513290669649, 0.1963574383667656, 0.2587887795105382, 0.12932481412215552, 0.1917433748379367, 1.0, 0.21843243858900313, 0.3334224034497392, 0.3341547890740972, 0.3357840169117339, 0.34882361674739576, 0.3295989566449552, 0.346573603986452, 0.3336839394053626, 0.33891447096132693, 0.3324032010342291, 0.3292339117184913, 0.413709338323792, 0.42196122544420617, 0.41164392816543693, 0.42687170671748026, 0.4196249053060285, 0.41799639579395365, 0.4197169573853409, 0.42335330439048224, 0.4140526534891295, 0.4219636794179018, 0.41570945861753056, 0.2537851472263153, 0.2634446492249144, 0.34061280544819494, 0.2898513290669649, 0.1963574383667656, 0.2587887795105382, 0.12932481412215552, 0.1917433748379367, 1.0, 0.21843243858900313, 0.3334224034497392, 0.3341547890740972, 0.3357840169117339, 0.34882361674739576, 0.3295989566449552, 0.346573603986452, 0.3336839394053626, 0.33891447096132693, 0.3324032010342291, 0.3292339117184913, 0.413709338323792, 0.42196122544420617, 0.41164392816543693, 0.42687170671748026, 0.4196249053060285, 0.41799639579395365, 0.4197169573853409, 0.42335330439048224, 0.4140526534891295, 0.4219636794179018, 0.41570945861753056, 0.2537851472263153, 0.2634446492249144, 0.34061280544819494, 0.2898513290669649, 0.1963574383667656, 0.2587887795105382, 0.12932481412215552, 0.1917433748379367, 1.0, 0.21843243858900313, 0.3334224034497392, 0.3341547890740972, 0.3357840169117339, 0.34882361674739576, 0.3295989566449552, 0.346573603986452, 0.3336839394053626, 0.33891447096132693, 0.3324032010342291, 0.3292339117184913, 0.413709338323792, 0.42196122544420617, 0.41164392816543693, 0.42687170671748026, 0.4196249053060285, 0.41799639579395365, 0.4197169573853409, 0.42335330439048224, 0.4140526534891295, 0.4219636794179018, 0.41570945861753056, 0.2537851472263153, 0.2634446492249144, 0.34061280544819494, 0.2898513290669649, 0.1963574383667656, 0.2587887795105382, 0.12932481412215552, 0.1917433748379367, 1.0, 0.21843243858900313, 0.3334224034497392, 0.3341547890740972, 0.3357840169117339, 0.34882361674739576, 0.3295989566449552, 0.346573603986452, 0.3336839394053626, 0.33891447096132693, 0.3324032010342291, 0.3292339117184913, 0.413709338323792, 0.42196122544420617, 0.41164392816543693, 0.42687170671748026, 0.4196249053060285, 0.41799639579395365, 0.4197169573853409, 0.42335330439048224, 0.4140526534891295, 0.4219636794179018, 0.41570945861753056, 0.2537851472263153, 0.2634446492249144, 0.34061280544819494, 0.2898513290669649, 0.1963574383667656, 0.2587887795105382, 0.12932481412215552, 0.1917433748379367, 1.0, 0.21843243858900313, 0.3334224034497392, 0.3341547890740972, 0.3357840169117339, 0.34882361674739576, 0.3295989566449552, 0.346573603986452, 0.3336839394053626, 0.33891447096132693, 0.3324032010342291, 0.3292339117184913, 0.413709338323792, 0.42196122544420617, 0.41164392816543693, 0.42687170671748026, 0.4196249053060285, 0.41799639579395365, 0.4197169573853409, 0.42335330439048224, 0.4140526534891295, 0.4219636794179018, 0.41570945861753056, 0.2537851472263153, 0.2634446492249144, 0.34061280544819494, 0.2898513290669649, 0.1963574383667656, 0.2587887795105382, 0.12932481412215552, 
0.1917433748379367, 1.0, 0.21843243858900313, 0.3334224034497392, 0.3341547890740972, 0.3357840169117339, 0.34882361674739576, 0.3295989566449552, 0.346573603986452, 0.3336839394053626, 0.33891447096132693, 0.3324032010342291, 0.3292339117184913, 0.413709338323792, 0.42196122544420617, 0.41164392816543693, 0.42687170671748026, 0.4196249053060285, 0.41799639579395365, 0.4197169573853409, 0.42335330439048224, 0.4140526534891295, 0.4219636794179018, 0.41570945861753056, 0.2537851472263153, 0.2634446492249144, 0.34061280544819494, 0.2898513290669649, 0.1963574383667656, 0.2587887795105382, 0.12932481412215552, 0.1917433748379367, 1.0, 0.21843243858900313, 0.3334224034497392, 0.3341547890740972, 0.3357840169117339, 0.34882361674739576, 0.3295989566449552, 0.346573603986452, 0.3336839394053626, 0.33891447096132693, 0.3324032010342291, 0.3292339117184913, 0.413709338323792, 0.42196122544420617, 0.41164392816543693, 0.42687170671748026, 0.4196249053060285, 0.41799639579395365, 0.4197169573853409, 0.42335330439048224, 0.4140526534891295, 0.4219636794179018, 0.41570945861753056, 0.2537851472263153, 0.2634446492249144, 0.34061280544819494, 0.2898513290669649, 0.1963574383667656, 0.2587887795105382, 0.12932481412215552, 0.1917433748379367, 1.0, 0.21843243858900313, 0.3334224034497392, 0.3341547890740972, 0.3357840169117339, 0.34882361674739576, 0.3295989566449552, 0.346573603986452, 0.3336839394053626, 0.33891447096132693, 0.3324032010342291, 0.3292339117184913, 0.413709338323792, 0.42196122544420617, 0.41164392816543693, 0.42687170671748026, 0.4196249053060285, 0.41799639579395365, 0.4197169573853409, 0.42335330439048224, 0.4140526534891295, 0.4219636794179018, 0.41570945861753056, 0.2537851472263153, 0.2634446492249144, 0.34061280544819494, 0.2898513290669649, 0.1963574383667656, 0.2587887795105382, 0.12932481412215552, 0.1917433748379367, 1.0, 0.21843243858900313, 0.3334224034497392, 0.3341547890740972, 0.3357840169117339, 0.34882361674739576, 0.3295989566449552, 0.346573603986452, 0.3336839394053626, 0.33891447096132693, 0.3324032010342291, 0.3292339117184913, 0.413709338323792, 0.42196122544420617, 0.41164392816543693, 0.42687170671748026, 0.4196249053060285, 0.41799639579395365, 0.4197169573853409, 0.42335330439048224, 0.4140526534891295, 0.4219636794179018, 0.41570945861753056, 0.2537851472263153, 0.2634446492249144, 0.34061280544819494, 0.2898513290669649, 0.1963574383667656, 0.2587887795105382, 0.12932481412215552, 0.1917433748379367, 1.0, 0.21843243858900313, 0.3334224034497392, 0.3341547890740972, 0.3357840169117339, 0.34882361674739576, 0.3295989566449552, 0.346573603986452, 0.3336839394053626, 0.33891447096132693, 0.3324032010342291, 0.3292339117184913, 0.413709338323792, 0.42196122544420617, 0.41164392816543693, 0.42687170671748026, 0.4196249053060285, 0.41799639579395365, 0.4197169573853409, 0.42335330439048224, 0.4140526534891295, 0.4219636794179018, 0.41570945861753056, 0.2537851472263153, 0.2634446492249144, 0.34061280544819494, 0.2898513290669649, 0.1963574383667656, 0.2587887795105382, 0.12932481412215552, 0.1917433748379367, 1.0, 0.21843243858900313, 0.3334224034497392, 0.3341547890740972, 0.3357840169117339, 0.34882361674739576, 0.3295989566449552, 0.346573603986452, 0.3336839394053626, 0.33891447096132693, 0.3324032010342291, 0.3292339117184913, 0.413709338323792, 0.42196122544420617, 0.41164392816543693, 0.42687170671748026, 0.4196249053060285, 0.41799639579395365, 0.4197169573853409, 0.42335330439048224, 0.4140526534891295, 0.4219636794179018, 0.41570945861753056, 
0.2537851472263153, 0.2634446492249144, 0.34061280544819494, 0.2898513290669649, 0.1963574383667656, 0.2587887795105382, 0.12932481412215552, 0.1917433748379367, 1.0, 0.21843243858900313, 0.3334224034497392, 0.3341547890740972, 0.3357840169117339, 0.34882361674739576, 0.3295989566449552, 0.346573603986452, 0.3336839394053626, 0.33891447096132693, 0.3324032010342291, 0.3292339117184913, 0.413709338323792, 0.42196122544420617, 0.41164392816543693, 0.42687170671748026, 0.4196249053060285, 0.41799639579395365, 0.4197169573853409, 0.42335330439048224, 0.4140526534891295, 0.4219636794179018, 0.41570945861753056, 0.2537851472263153, 0.2634446492249144, 0.34061280544819494, 0.2898513290669649, 0.1963574383667656, 0.2587887795105382, 0.12932481412215552, 0.1917433748379367, 1.0, 0.21843243858900313, 0.3334224034497392, 0.3341547890740972, 0.3357840169117339, 0.34882361674739576, 0.3295989566449552, 0.346573603986452, 0.3336839394053626, 0.33891447096132693, 0.3324032010342291, 0.3292339117184913, 0.413709338323792, 0.42196122544420617, 0.41164392816543693, 0.42687170671748026, 0.4196249053060285, 0.41799639579395365, 0.4197169573853409, 0.42335330439048224, 0.4140526534891295, 0.4219636794179018, 0.41570945861753056, 0.2537851472263153, 0.2634446492249144, 0.34061280544819494, 0.2898513290669649, 0.1963574383667656, 0.2587887795105382, 0.12932481412215552, 0.1917433748379367, 1.0, 0.21843243858900313, 0.3334224034497392, 0.3341547890740972, 0.3357840169117339, 0.34882361674739576, 0.3295989566449552, 0.346573603986452, 0.3336839394053626, 0.33891447096132693, 0.3324032010342291, 0.3292339117184913, 0.413709338323792, 0.42196122544420617, 0.41164392816543693, 0.42687170671748026, 0.4196249053060285, 0.41799639579395365, 0.4197169573853409, 0.42335330439048224, 0.4140526534891295, 0.4219636794179018, 0.41570945861753056, 0.2537851472263153, 0.2634446492249144, 0.34061280544819494, 0.2898513290669649, 0.1963574383667656, 0.2587887795105382, 0.12932481412215552, 0.1917433748379367, 1.0, 0.21843243858900313, 0.3334224034497392, 0.3341547890740972, 0.3357840169117339, 0.34882361674739576, 0.3295989566449552, 0.346573603986452, 0.3336839394053626, 0.33891447096132693, 0.3324032010342291, 0.3292339117184913, 0.413709338323792, 0.42196122544420617, 0.41164392816543693, 0.42687170671748026, 0.4196249053060285, 0.41799639579395365, 0.4197169573853409, 0.42335330439048224, 0.4140526534891295, 0.4219636794179018, 0.41570945861753056, 0.2537851472263153, 0.2634446492249144, 0.34061280544819494, 0.2898513290669649, 0.1963574383667656, 0.2587887795105382, 0.12932481412215552, 0.1917433748379367, 1.0, 0.21843243858900313, 0.3334224034497392, 0.3341547890740972, 0.3357840169117339, 0.34882361674739576, 0.3295989566449552, 0.346573603986452, 0.3336839394053626, 0.33891447096132693, 0.3324032010342291, 0.3292339117184913, 0.413709338323792, 0.42196122544420617, 0.41164392816543693, 0.42687170671748026, 0.4196249053060285, 0.41799639579395365, 0.4197169573853409, 0.42335330439048224, 0.4140526534891295, 0.4219636794179018, 0.41570945861753056, 0.2537851472263153, 0.2634446492249144, 0.34061280544819494, 0.2898513290669649, 0.1963574383667656, 0.2587887795105382, 0.12932481412215552, 0.1917433748379367, 1.0, 0.21843243858900313, 0.3334224034497392, 0.3341547890740972, 0.3357840169117339, 0.34882361674739576, 0.3295989566449552, 0.346573603986452, 0.3336839394053626, 0.33891447096132693, 0.3324032010342291, 0.3292339117184913, 0.413709338323792, 0.42196122544420617, 0.41164392816543693, 0.42687170671748026, 
0.4196249053060285, 0.41799639579395365, 0.4197169573853409, 0.42335330439048224, 0.4140526534891295, 0.4219636794179018, 0.41570945861753056, 0.2537851472263153, 0.2634446492249144, 0.34061280544819494, 0.2898513290669649, 0.1963574383667656, 0.2587887795105382, 0.12932481412215552, 0.1917433748379367, 1.0, 0.21843243858900313, 0.3334224034497392, 0.3341547890740972, 0.3357840169117339, 0.34882361674739576, 0.3295989566449552, 0.346573603986452, 0.3336839394053626, 0.33891447096132693, 0.3324032010342291, 0.3292339117184913, 0.413709338323792, 0.42196122544420617, 0.41164392816543693, 0.42687170671748026, 0.4196249053060285, 0.41799639579395365, 0.4197169573853409, 0.42335330439048224, 0.4140526534891295, 0.4219636794179018, 0.41570945861753056, 0.2537851472263153, 0.2634446492249144, 0.34061280544819494, 0.2898513290669649, 0.1963574383667656, 0.2587887795105382, 0.12932481412215552, 0.1917433748379367, 1.0, 0.21843243858900313, 0.3334224034497392, 0.3341547890740972, 0.3357840169117339, 0.34882361674739576, 0.3295989566449552, 0.346573603986452, 0.3336839394053626, 0.33891447096132693, 0.3324032010342291, 0.3292339117184913, 0.413709338323792, 0.42196122544420617, 0.41164392816543693, 0.42687170671748026, 0.4196249053060285, 0.41799639579395365, 0.4197169573853409, 0.42335330439048224, 0.4140526534891295, 0.4219636794179018, 0.41570945861753056, 0.2537851472263153, 0.2634446492249144, 0.34061280544819494, 0.2898513290669649, 0.1963574383667656, 0.2587887795105382, 0.12932481412215552, 0.1917433748379367, 1.0, 0.21843243858900313, 0.3334224034497392, 0.3341547890740972, 0.3357840169117339, 0.34882361674739576, 0.3295989566449552, 0.346573603986452, 0.3336839394053626, 0.33891447096132693, 0.3324032010342291, 0.3292339117184913, 0.413709338323792, 0.42196122544420617, 0.41164392816543693, 0.42687170671748026, 0.4196249053060285, 0.41799639579395365, 0.4197169573853409, 0.42335330439048224, 0.4140526534891295, 0.4219636794179018, 0.41570945861753056, 0.2537851472263153, 0.2634446492249144, 0.34061280544819494, 0.2898513290669649, 0.1963574383667656, 0.2587887795105382, 0.12932481412215552, 0.1917433748379367, 1.0, 0.21843243858900313, 0.3334224034497392, 0.3341547890740972, 0.3357840169117339, 0.34882361674739576, 0.3295989566449552, 0.346573603986452, 0.3336839394053626, 0.33891447096132693, 0.3324032010342291, 0.3292339117184913, 0.413709338323792, 0.42196122544420617, 0.41164392816543693, 0.42687170671748026, 0.4196249053060285, 0.41799639579395365, 0.4197169573853409, 0.42335330439048224, 0.4140526534891295, 0.4219636794179018, 0.41570945861753056, 0.2537851472263153, 0.2634446492249144, 0.34061280544819494, 0.2898513290669649, 0.1963574383667656, 0.2587887795105382, 0.12932481412215552, 0.1917433748379367, 1.0, 0.21843243858900313, 0.3334224034497392, 0.3341547890740972, 0.3357840169117339, 0.34882361674739576, 0.3295989566449552, 0.346573603986452, 0.3336839394053626, 0.33891447096132693, 0.3324032010342291, 0.3292339117184913, 0.413709338323792, 0.42196122544420617, 0.41164392816543693, 0.42687170671748026, 0.4196249053060285, 0.41799639579395365, 0.4197169573853409, 0.42335330439048224, 0.4140526534891295, 0.4219636794179018, 0.41570945861753056, 0.2537851472263153, 0.2634446492249144, 0.34061280544819494, 0.2898513290669649, 0.1963574383667656, 0.2587887795105382, 0.12932481412215552, 0.1917433748379367, 1.0, 0.21843243858900313, 0.3334224034497392, 0.3341547890740972, 0.3357840169117339, 0.34882361674739576, 0.3295989566449552, 0.346573603986452, 0.3336839394053626, 
0.33891447096132693, 0.3324032010342291, 0.3292339117184913, 0.413709338323792, 0.42196122544420617, 0.41164392816543693, 0.42687170671748026, 0.4196249053060285, 0.41799639579395365, 0.4197169573853409, 0.42335330439048224, 0.4140526534891295, 0.4219636794179018, 0.41570945861753056, 0.2537851472263153, 0.2634446492249144, 0.34061280544819494, 0.2898513290669649, 0.1963574383667656, 0.2587887795105382, 0.12932481412215552, 0.1917433748379367, 1.0, 0.21843243858900313, 0.3334224034497392, 0.3341547890740972, 0.3357840169117339, 0.34882361674739576, 0.3295989566449552, 0.346573603986452, 0.3336839394053626, 0.33891447096132693, 0.3324032010342291, 0.3292339117184913, 0.413709338323792, 0.42196122544420617, 0.41164392816543693, 0.42687170671748026, 0.4196249053060285, 0.41799639579395365, 0.4197169573853409, 0.42335330439048224, 0.4140526534891295, 0.4219636794179018, 0.41570945861753056, 0.2537851472263153, 0.2634446492249144, 0.34061280544819494, 0.2898513290669649, 0.1963574383667656, 0.2587887795105382, 0.12932481412215552, 0.1917433748379367, 1.0, 0.21843243858900313]}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB ArxivClusteringS2S", "type": "mteb/arxiv-clustering-s2s", "config": "default", "split": "test", "revision": "f910caf1a6075f7329cdf8c1a6135696f37dbd53"}, "metrics": [{"type": "v_measure", "value": 27.30050438270763}, {"type": "v_measures", "value": [0.23642097081553853, 0.2550168800484499, 0.22388101177482445, 0.23677844530410433, 0.23302400926717484, 0.2312253885015082, 0.2354386747624341, 0.2434698661390688, 0.23920597358638096, 0.2512106241376185, 0.3237281317507259, 0.3153648020875208, 0.3117729309162446, 0.319429269175071, 0.31481846607797553, 0.3121035311610101, 0.3130556512675126, 0.32399062528074335, 0.31831654820410643, 0.31740229450043655, 0.3044319259774819, 0.18252134416266622, 0.19507933632329977, 0.26557161108388766, 0.22167460515993895, 0.15338594119020302, 0.18495792754689827, 0.09580075834175401, 0.15430061870022888, 1.0, 0.14977819539455464, 0.23642097081553853, 0.2550168800484499, 0.22388101177482445, 0.23677844530410433, 0.23302400926717484, 0.2312253885015082, 0.2354386747624341, 0.2434698661390688, 0.23920597358638096, 0.2512106241376185, 0.3237281317507259, 0.3153648020875208, 0.3117729309162446, 0.319429269175071, 0.31481846607797553, 0.3121035311610101, 0.3130556512675126, 0.32399062528074335, 0.31831654820410643, 0.31740229450043655, 0.3044319259774819, 0.18252134416266622, 0.19507933632329977, 0.26557161108388766, 0.22167460515993895, 0.15338594119020302, 0.18495792754689827, 0.09580075834175401, 0.15430061870022888, 1.0, 0.14977819539455464, 0.23642097081553853, 0.2550168800484499, 0.22388101177482445, 0.23677844530410433, 0.23302400926717484, 0.2312253885015082, 0.2354386747624341, 0.2434698661390688, 0.23920597358638096, 0.2512106241376185, 0.3237281317507259, 0.3153648020875208, 0.3117729309162446, 0.319429269175071, 0.31481846607797553, 0.3121035311610101, 0.3130556512675126, 0.32399062528074335, 0.31831654820410643, 0.31740229450043655, 0.3044319259774819, 0.18252134416266622, 0.19507933632329977, 0.26557161108388766, 0.22167460515993895, 0.15338594119020302, 0.18495792754689827, 0.09580075834175401, 0.15430061870022888, 1.0, 0.14977819539455464, 0.23642097081553853, 0.2550168800484499, 0.22388101177482445, 0.23677844530410433, 0.23302400926717484, 0.2312253885015082, 0.2354386747624341, 0.2434698661390688, 0.23920597358638096, 0.2512106241376185, 0.3237281317507259, 0.3153648020875208, 0.3117729309162446, 0.319429269175071, 
0.31481846607797553, 0.3121035311610101, 0.3130556512675126, 0.32399062528074335, 0.31831654820410643, 0.31740229450043655, 0.3044319259774819, 0.18252134416266622, 0.19507933632329977, 0.26557161108388766, 0.22167460515993895, 0.15338594119020302, 0.18495792754689827, 0.09580075834175401, 0.15430061870022888, 1.0, 0.14977819539455464, 0.23642097081553853, 0.2550168800484499, 0.22388101177482445, 0.23677844530410433, 0.23302400926717484, 0.2312253885015082, 0.2354386747624341, 0.2434698661390688, 0.23920597358638096, 0.2512106241376185, 0.3237281317507259, 0.3153648020875208, 0.3117729309162446, 0.319429269175071, 0.31481846607797553, 0.3121035311610101, 0.3130556512675126, 0.32399062528074335, 0.31831654820410643, 0.31740229450043655, 0.3044319259774819, 0.18252134416266622, 0.19507933632329977, 0.26557161108388766, 0.22167460515993895, 0.15338594119020302, 0.18495792754689827, 0.09580075834175401, 0.15430061870022888, 1.0, 0.14977819539455464, 0.23642097081553853, 0.2550168800484499, 0.22388101177482445, 0.23677844530410433, 0.23302400926717484, 0.2312253885015082, 0.2354386747624341, 0.2434698661390688, 0.23920597358638096, 0.2512106241376185, 0.3237281317507259, 0.3153648020875208, 0.3117729309162446, 0.319429269175071, 0.31481846607797553, 0.3121035311610101, 0.3130556512675126, 0.32399062528074335, 0.31831654820410643, 0.31740229450043655, 0.3044319259774819, 0.18252134416266622, 0.19507933632329977, 0.26557161108388766, 0.22167460515993895, 0.15338594119020302, 0.18495792754689827, 0.09580075834175401, 0.15430061870022888, 1.0, 0.14977819539455464, 0.23642097081553853, 0.2550168800484499, 0.22388101177482445, 0.23677844530410433, 0.23302400926717484, 0.2312253885015082, 0.2354386747624341, 0.2434698661390688, 0.23920597358638096, 0.2512106241376185, 0.3237281317507259, 0.3153648020875208, 0.3117729309162446, 0.319429269175071, 0.31481846607797553, 0.3121035311610101, 0.3130556512675126, 0.32399062528074335, 0.31831654820410643, 0.31740229450043655, 0.3044319259774819, 0.18252134416266622, 0.19507933632329977, 0.26557161108388766, 0.22167460515993895, 0.15338594119020302, 0.18495792754689827, 0.09580075834175401, 0.15430061870022888, 1.0, 0.14977819539455464, 0.23642097081553853, 0.2550168800484499, 0.22388101177482445, 0.23677844530410433, 0.23302400926717484, 0.2312253885015082, 0.2354386747624341, 0.2434698661390688, 0.23920597358638096, 0.2512106241376185, 0.3237281317507259, 0.3153648020875208, 0.3117729309162446, 0.319429269175071, 0.31481846607797553, 0.3121035311610101, 0.3130556512675126, 0.32399062528074335, 0.31831654820410643, 0.31740229450043655, 0.3044319259774819, 0.18252134416266622, 0.19507933632329977, 0.26557161108388766, 0.22167460515993895, 0.15338594119020302, 0.18495792754689827, 0.09580075834175401, 0.15430061870022888, 1.0, 0.14977819539455464, 0.23642097081553853, 0.2550168800484499, 0.22388101177482445, 0.23677844530410433, 0.23302400926717484, 0.2312253885015082, 0.2354386747624341, 0.2434698661390688, 0.23920597358638096, 0.2512106241376185, 0.3237281317507259, 0.3153648020875208, 0.3117729309162446, 0.319429269175071, 0.31481846607797553, 0.3121035311610101, 0.3130556512675126, 0.32399062528074335, 0.31831654820410643, 0.31740229450043655, 0.3044319259774819, 0.18252134416266622, 0.19507933632329977, 0.26557161108388766, 0.22167460515993895, 0.15338594119020302, 0.18495792754689827, 0.09580075834175401, 0.15430061870022888, 1.0, 0.14977819539455464, 0.23642097081553853, 0.2550168800484499, 0.22388101177482445, 0.23677844530410433, 0.23302400926717484, 
0.2312253885015082, 0.2354386747624341, 0.2434698661390688, 0.23920597358638096, 0.2512106241376185, 0.3237281317507259, 0.3153648020875208, 0.3117729309162446, 0.319429269175071, 0.31481846607797553, 0.3121035311610101, 0.3130556512675126, 0.32399062528074335, 0.31831654820410643, 0.31740229450043655, 0.3044319259774819, 0.18252134416266622, 0.19507933632329977, 0.26557161108388766, 0.22167460515993895, 0.15338594119020302, 0.18495792754689827, 0.09580075834175401, 0.15430061870022888, 1.0, 0.14977819539455464, 0.23642097081553853, 0.2550168800484499, 0.22388101177482445, 0.23677844530410433, 0.23302400926717484, 0.2312253885015082, 0.2354386747624341, 0.2434698661390688, 0.23920597358638096, 0.2512106241376185, 0.3237281317507259, 0.3153648020875208, 0.3117729309162446, 0.319429269175071, 0.31481846607797553, 0.3121035311610101, 0.3130556512675126, 0.32399062528074335, 0.31831654820410643, 0.31740229450043655, 0.3044319259774819, 0.18252134416266622, 0.19507933632329977, 0.26557161108388766, 0.22167460515993895, 0.15338594119020302, 0.18495792754689827, 0.09580075834175401, 0.15430061870022888, 1.0, 0.14977819539455464, 0.23642097081553853, 0.2550168800484499, 0.22388101177482445, 0.23677844530410433, 0.23302400926717484, 0.2312253885015082, 0.2354386747624341, 0.2434698661390688, 0.23920597358638096, 0.2512106241376185, 0.3237281317507259, 0.3153648020875208, 0.3117729309162446, 0.319429269175071, 0.31481846607797553, 0.3121035311610101, 0.3130556512675126, 0.32399062528074335, 0.31831654820410643, 0.31740229450043655, 0.3044319259774819, 0.18252134416266622, 0.19507933632329977, 0.26557161108388766, 0.22167460515993895, 0.15338594119020302, 0.18495792754689827, 0.09580075834175401, 0.15430061870022888, 1.0, 0.14977819539455464, 0.23642097081553853, 0.2550168800484499, 0.22388101177482445, 0.23677844530410433, 0.23302400926717484, 0.2312253885015082, 0.2354386747624341, 0.2434698661390688, 0.23920597358638096, 0.2512106241376185, 0.3237281317507259, 0.3153648020875208, 0.3117729309162446, 0.319429269175071, 0.31481846607797553, 0.3121035311610101, 0.3130556512675126, 0.32399062528074335, 0.31831654820410643, 0.31740229450043655, 0.3044319259774819, 0.18252134416266622, 0.19507933632329977, 0.26557161108388766, 0.22167460515993895, 0.15338594119020302, 0.18495792754689827, 0.09580075834175401, 0.15430061870022888, 1.0, 0.14977819539455464, 0.23642097081553853, 0.2550168800484499, 0.22388101177482445, 0.23677844530410433, 0.23302400926717484, 0.2312253885015082, 0.2354386747624341, 0.2434698661390688, 0.23920597358638096, 0.2512106241376185, 0.3237281317507259, 0.3153648020875208, 0.3117729309162446, 0.319429269175071, 0.31481846607797553, 0.3121035311610101, 0.3130556512675126, 0.32399062528074335, 0.31831654820410643, 0.31740229450043655, 0.3044319259774819, 0.18252134416266622, 0.19507933632329977, 0.26557161108388766, 0.22167460515993895, 0.15338594119020302, 0.18495792754689827, 0.09580075834175401, 0.15430061870022888, 1.0, 0.14977819539455464, 0.23642097081553853, 0.2550168800484499, 0.22388101177482445, 0.23677844530410433, 0.23302400926717484, 0.2312253885015082, 0.2354386747624341, 0.2434698661390688, 0.23920597358638096, 0.2512106241376185, 0.3237281317507259, 0.3153648020875208, 0.3117729309162446, 0.319429269175071, 0.31481846607797553, 0.3121035311610101, 0.3130556512675126, 0.32399062528074335, 0.31831654820410643, 0.31740229450043655, 0.3044319259774819, 0.18252134416266622, 0.19507933632329977, 0.26557161108388766, 0.22167460515993895, 0.15338594119020302, 
0.18495792754689827, 0.09580075834175401, 0.15430061870022888, 1.0, 0.14977819539455464, 0.23642097081553853, 0.2550168800484499, 0.22388101177482445, 0.23677844530410433, 0.23302400926717484, 0.2312253885015082, 0.2354386747624341, 0.2434698661390688, 0.23920597358638096, 0.2512106241376185, 0.3237281317507259, 0.3153648020875208, 0.3117729309162446, 0.319429269175071, 0.31481846607797553, 0.3121035311610101, 0.3130556512675126, 0.32399062528074335, 0.31831654820410643, 0.31740229450043655, 0.3044319259774819, 0.18252134416266622, 0.19507933632329977, 0.26557161108388766, 0.22167460515993895, 0.15338594119020302, 0.18495792754689827, 0.09580075834175401, 0.15430061870022888, 1.0, 0.14977819539455464, 0.23642097081553853, 0.2550168800484499, 0.22388101177482445, 0.23677844530410433, 0.23302400926717484, 0.2312253885015082, 0.2354386747624341, 0.2434698661390688, 0.23920597358638096, 0.2512106241376185, 0.3237281317507259, 0.3153648020875208, 0.3117729309162446, 0.319429269175071, 0.31481846607797553, 0.3121035311610101, 0.3130556512675126, 0.32399062528074335, 0.31831654820410643, 0.31740229450043655, 0.3044319259774819, 0.18252134416266622, 0.19507933632329977, 0.26557161108388766, 0.22167460515993895, 0.15338594119020302, 0.18495792754689827, 0.09580075834175401, 0.15430061870022888, 1.0, 0.14977819539455464, 0.23642097081553853, 0.2550168800484499, 0.22388101177482445, 0.23677844530410433, 0.23302400926717484, 0.2312253885015082, 0.2354386747624341, 0.2434698661390688, 0.23920597358638096, 0.2512106241376185, 0.3237281317507259, 0.3153648020875208, 0.3117729309162446, 0.319429269175071, 0.31481846607797553, 0.3121035311610101, 0.3130556512675126, 0.32399062528074335, 0.31831654820410643, 0.31740229450043655, 0.3044319259774819, 0.18252134416266622, 0.19507933632329977, 0.26557161108388766, 0.22167460515993895, 0.15338594119020302, 0.18495792754689827, 0.09580075834175401, 0.15430061870022888, 1.0, 0.14977819539455464, 0.23642097081553853, 0.2550168800484499, 0.22388101177482445, 0.23677844530410433, 0.23302400926717484, 0.2312253885015082, 0.2354386747624341, 0.2434698661390688, 0.23920597358638096, 0.2512106241376185, 0.3237281317507259, 0.3153648020875208, 0.3117729309162446, 0.319429269175071, 0.31481846607797553, 0.3121035311610101, 0.3130556512675126, 0.32399062528074335, 0.31831654820410643, 0.31740229450043655, 0.3044319259774819, 0.18252134416266622, 0.19507933632329977, 0.26557161108388766, 0.22167460515993895, 0.15338594119020302, 0.18495792754689827, 0.09580075834175401, 0.15430061870022888, 1.0, 0.14977819539455464, 0.23642097081553853, 0.2550168800484499, 0.22388101177482445, 0.23677844530410433, 0.23302400926717484, 0.2312253885015082, 0.2354386747624341, 0.2434698661390688, 0.23920597358638096, 0.2512106241376185, 0.3237281317507259, 0.3153648020875208, 0.3117729309162446, 0.319429269175071, 0.31481846607797553, 0.3121035311610101, 0.3130556512675126, 0.32399062528074335, 0.31831654820410643, 0.31740229450043655, 0.3044319259774819, 0.18252134416266622, 0.19507933632329977, 0.26557161108388766, 0.22167460515993895, 0.15338594119020302, 0.18495792754689827, 0.09580075834175401, 0.15430061870022888, 1.0, 0.14977819539455464, 0.23642097081553853, 0.2550168800484499, 0.22388101177482445, 0.23677844530410433, 0.23302400926717484, 0.2312253885015082, 0.2354386747624341, 0.2434698661390688, 0.23920597358638096, 0.2512106241376185, 0.3237281317507259, 0.3153648020875208, 0.3117729309162446, 0.319429269175071, 0.31481846607797553, 0.3121035311610101, 0.3130556512675126, 
0.32399062528074335, 0.31831654820410643, 0.31740229450043655, 0.3044319259774819, 0.18252134416266622, 0.19507933632329977, 0.26557161108388766, 0.22167460515993895, 0.15338594119020302, 0.18495792754689827, 0.09580075834175401, 0.15430061870022888, 1.0, 0.14977819539455464, 0.23642097081553853, 0.2550168800484499, 0.22388101177482445, 0.23677844530410433, 0.23302400926717484, 0.2312253885015082, 0.2354386747624341, 0.2434698661390688, 0.23920597358638096, 0.2512106241376185, 0.3237281317507259, 0.3153648020875208, 0.3117729309162446, 0.319429269175071, 0.31481846607797553, 0.3121035311610101, 0.3130556512675126, 0.32399062528074335, 0.31831654820410643, 0.31740229450043655, 0.3044319259774819, 0.18252134416266622, 0.19507933632329977, 0.26557161108388766, 0.22167460515993895, 0.15338594119020302, 0.18495792754689827, 0.09580075834175401, 0.15430061870022888, 1.0, 0.14977819539455464, 0.23642097081553853, 0.2550168800484499, 0.22388101177482445, 0.23677844530410433, 0.23302400926717484, 0.2312253885015082, 0.2354386747624341, 0.2434698661390688, 0.23920597358638096, 0.2512106241376185, 0.3237281317507259, 0.3153648020875208, 0.3117729309162446, 0.319429269175071, 0.31481846607797553, 0.3121035311610101, 0.3130556512675126, 0.32399062528074335, 0.31831654820410643, 0.31740229450043655, 0.3044319259774819, 0.18252134416266622, 0.19507933632329977, 0.26557161108388766, 0.22167460515993895, 0.15338594119020302, 0.18495792754689827, 0.09580075834175401, 0.15430061870022888, 1.0, 0.14977819539455464, 0.23642097081553853, 0.2550168800484499, 0.22388101177482445, 0.23677844530410433, 0.23302400926717484, 0.2312253885015082, 0.2354386747624341, 0.2434698661390688, 0.23920597358638096, 0.2512106241376185, 0.3237281317507259, 0.3153648020875208, 0.3117729309162446, 0.319429269175071, 0.31481846607797553, 0.3121035311610101, 0.3130556512675126, 0.32399062528074335, 0.31831654820410643, 0.31740229450043655, 0.3044319259774819, 0.18252134416266622, 0.19507933632329977, 0.26557161108388766, 0.22167460515993895, 0.15338594119020302, 0.18495792754689827, 0.09580075834175401, 0.15430061870022888, 1.0, 0.14977819539455464, 0.23642097081553853, 0.2550168800484499, 0.22388101177482445, 0.23677844530410433, 0.23302400926717484, 0.2312253885015082, 0.2354386747624341, 0.2434698661390688, 0.23920597358638096, 0.2512106241376185, 0.3237281317507259, 0.3153648020875208, 0.3117729309162446, 0.319429269175071, 0.31481846607797553, 0.3121035311610101, 0.3130556512675126, 0.32399062528074335, 0.31831654820410643, 0.31740229450043655, 0.3044319259774819, 0.18252134416266622, 0.19507933632329977, 0.26557161108388766, 0.22167460515993895, 0.15338594119020302, 0.18495792754689827, 0.09580075834175401, 0.15430061870022888, 1.0, 0.14977819539455464, 0.23642097081553853, 0.2550168800484499, 0.22388101177482445, 0.23677844530410433, 0.23302400926717484, 0.2312253885015082, 0.2354386747624341, 0.2434698661390688, 0.23920597358638096, 0.2512106241376185, 0.3237281317507259, 0.3153648020875208, 0.3117729309162446, 0.319429269175071, 0.31481846607797553, 0.3121035311610101, 0.3130556512675126, 0.32399062528074335, 0.31831654820410643, 0.31740229450043655, 0.3044319259774819, 0.18252134416266622, 0.19507933632329977, 0.26557161108388766, 0.22167460515993895, 0.15338594119020302, 0.18495792754689827, 0.09580075834175401, 0.15430061870022888, 1.0, 0.14977819539455464, 0.23642097081553853, 0.2550168800484499, 0.22388101177482445, 0.23677844530410433, 0.23302400926717484, 0.2312253885015082, 0.2354386747624341, 0.2434698661390688, 
0.23920597358638096, 0.2512106241376185, 0.3237281317507259, 0.3153648020875208, 0.3117729309162446, 0.319429269175071, 0.31481846607797553, 0.3121035311610101, 0.3130556512675126, 0.32399062528074335, 0.31831654820410643, 0.31740229450043655, 0.3044319259774819, 0.18252134416266622, 0.19507933632329977, 0.26557161108388766, 0.22167460515993895, 0.15338594119020302, 0.18495792754689827, 0.09580075834175401, 0.15430061870022888, 1.0, 0.14977819539455464, 0.23642097081553853, 0.2550168800484499, 0.22388101177482445, 0.23677844530410433, 0.23302400926717484, 0.2312253885015082, 0.2354386747624341, 0.2434698661390688, 0.23920597358638096, 0.2512106241376185, 0.3237281317507259, 0.3153648020875208, 0.3117729309162446, 0.319429269175071, 0.31481846607797553, 0.3121035311610101, 0.3130556512675126, 0.32399062528074335, 0.31831654820410643, 0.31740229450043655, 0.3044319259774819, 0.18252134416266622, 0.19507933632329977, 0.26557161108388766, 0.22167460515993895, 0.15338594119020302, 0.18495792754689827, 0.09580075834175401, 0.15430061870022888, 1.0, 0.14977819539455464, 0.23642097081553853, 0.2550168800484499, 0.22388101177482445, 0.23677844530410433, 0.23302400926717484, 0.2312253885015082, 0.2354386747624341, 0.2434698661390688, 0.23920597358638096, 0.2512106241376185, 0.3237281317507259, 0.3153648020875208, 0.3117729309162446, 0.319429269175071, 0.31481846607797553, 0.3121035311610101, 0.3130556512675126, 0.32399062528074335, 0.31831654820410643, 0.31740229450043655, 0.3044319259774819, 0.18252134416266622, 0.19507933632329977, 0.26557161108388766, 0.22167460515993895, 0.15338594119020302, 0.18495792754689827, 0.09580075834175401, 0.15430061870022888, 1.0, 0.14977819539455464, 0.23642097081553853, 0.2550168800484499, 0.22388101177482445, 0.23677844530410433, 0.23302400926717484, 0.2312253885015082, 0.2354386747624341, 0.2434698661390688, 0.23920597358638096, 0.2512106241376185, 0.3237281317507259, 0.3153648020875208, 0.3117729309162446, 0.319429269175071, 0.31481846607797553, 0.3121035311610101, 0.3130556512675126, 0.32399062528074335, 0.31831654820410643, 0.31740229450043655, 0.3044319259774819, 0.18252134416266622, 0.19507933632329977, 0.26557161108388766, 0.22167460515993895, 0.15338594119020302, 0.18495792754689827, 0.09580075834175401, 0.15430061870022888, 1.0, 0.14977819539455464, 0.23642097081553853, 0.2550168800484499, 0.22388101177482445, 0.23677844530410433, 0.23302400926717484, 0.2312253885015082, 0.2354386747624341, 0.2434698661390688, 0.23920597358638096, 0.2512106241376185, 0.3237281317507259, 0.3153648020875208, 0.3117729309162446, 0.319429269175071, 0.31481846607797553, 0.3121035311610101, 0.3130556512675126, 0.32399062528074335, 0.31831654820410643, 0.31740229450043655, 0.3044319259774819, 0.18252134416266622, 0.19507933632329977, 0.26557161108388766, 0.22167460515993895, 0.15338594119020302, 0.18495792754689827, 0.09580075834175401, 0.15430061870022888, 1.0, 0.14977819539455464, 0.23642097081553853, 0.2550168800484499, 0.22388101177482445, 0.23677844530410433, 0.23302400926717484, 0.2312253885015082, 0.2354386747624341, 0.2434698661390688, 0.23920597358638096, 0.2512106241376185, 0.3237281317507259, 0.3153648020875208, 0.3117729309162446, 0.319429269175071, 0.31481846607797553, 0.3121035311610101, 0.3130556512675126, 0.32399062528074335, 0.31831654820410643, 0.31740229450043655, 0.3044319259774819, 0.18252134416266622, 0.19507933632329977, 0.26557161108388766, 0.22167460515993895, 0.15338594119020302, 0.18495792754689827, 0.09580075834175401, 0.15430061870022888, 1.0, 
0.14977819539455464, 0.23642097081553853, 0.2550168800484499, 0.22388101177482445, 0.23677844530410433, 0.23302400926717484, 0.2312253885015082, 0.2354386747624341, 0.2434698661390688, 0.23920597358638096, 0.2512106241376185, 0.3237281317507259, 0.3153648020875208, 0.3117729309162446, 0.319429269175071, 0.31481846607797553, 0.3121035311610101, 0.3130556512675126, 0.32399062528074335, 0.31831654820410643, 0.31740229450043655, 0.3044319259774819, 0.18252134416266622, 0.19507933632329977, 0.26557161108388766, 0.22167460515993895, 0.15338594119020302, 0.18495792754689827, 0.09580075834175401, 0.15430061870022888, 1.0, 0.14977819539455464, 0.23642097081553853, 0.2550168800484499, 0.22388101177482445, 0.23677844530410433, 0.23302400926717484, 0.2312253885015082, 0.2354386747624341, 0.2434698661390688, 0.23920597358638096, 0.2512106241376185, 0.3237281317507259, 0.3153648020875208, 0.3117729309162446, 0.319429269175071, 0.31481846607797553, 0.3121035311610101, 0.3130556512675126, 0.32399062528074335, 0.31831654820410643, 0.31740229450043655, 0.3044319259774819, 0.18252134416266622, 0.19507933632329977, 0.26557161108388766, 0.22167460515993895, 0.15338594119020302, 0.18495792754689827, 0.09580075834175401, 0.15430061870022888, 1.0, 0.14977819539455464, 0.23642097081553853, 0.2550168800484499, 0.22388101177482445, 0.23677844530410433, 0.23302400926717484, 0.2312253885015082, 0.2354386747624341, 0.2434698661390688, 0.23920597358638096, 0.2512106241376185, 0.3237281317507259, 0.3153648020875208, 0.3117729309162446, 0.319429269175071, 0.31481846607797553, 0.3121035311610101, 0.3130556512675126, 0.32399062528074335, 0.31831654820410643, 0.31740229450043655, 0.3044319259774819, 0.18252134416266622, 0.19507933632329977, 0.26557161108388766, 0.22167460515993895, 0.15338594119020302, 0.18495792754689827, 0.09580075834175401, 0.15430061870022888, 1.0, 0.14977819539455464, 0.23642097081553853, 0.2550168800484499, 0.22388101177482445, 0.23677844530410433, 0.23302400926717484, 0.2312253885015082, 0.2354386747624341, 0.2434698661390688, 0.23920597358638096, 0.2512106241376185, 0.3237281317507259, 0.3153648020875208, 0.3117729309162446, 0.319429269175071, 0.31481846607797553, 0.3121035311610101, 0.3130556512675126, 0.32399062528074335, 0.31831654820410643, 0.31740229450043655, 0.3044319259774819, 0.18252134416266622, 0.19507933632329977, 0.26557161108388766, 0.22167460515993895, 0.15338594119020302, 0.18495792754689827, 0.09580075834175401, 0.15430061870022888, 1.0, 0.14977819539455464, 0.23642097081553853, 0.2550168800484499, 0.22388101177482445, 0.23677844530410433, 0.23302400926717484, 0.2312253885015082, 0.2354386747624341, 0.2434698661390688, 0.23920597358638096, 0.2512106241376185, 0.3237281317507259, 0.3153648020875208, 0.3117729309162446, 0.319429269175071, 0.31481846607797553, 0.3121035311610101, 0.3130556512675126, 0.32399062528074335, 0.31831654820410643, 0.31740229450043655, 0.3044319259774819, 0.18252134416266622, 0.19507933632329977, 0.26557161108388766, 0.22167460515993895, 0.15338594119020302, 0.18495792754689827, 0.09580075834175401, 0.15430061870022888, 1.0, 0.14977819539455464, 0.23642097081553853, 0.2550168800484499, 0.22388101177482445, 0.23677844530410433, 0.23302400926717484, 0.2312253885015082, 0.2354386747624341, 0.2434698661390688, 0.23920597358638096, 0.2512106241376185, 0.3237281317507259, 0.3153648020875208, 0.3117729309162446, 0.319429269175071, 0.31481846607797553, 0.3121035311610101, 0.3130556512675126, 0.32399062528074335, 0.31831654820410643, 0.31740229450043655, 
0.3044319259774819, 0.18252134416266622, 0.19507933632329977, 0.26557161108388766, 0.22167460515993895, 0.15338594119020302, 0.18495792754689827, 0.09580075834175401, 0.15430061870022888, 1.0, 0.14977819539455464, 0.23642097081553853, 0.2550168800484499, 0.22388101177482445, 0.23677844530410433, 0.23302400926717484, 0.2312253885015082, 0.2354386747624341, 0.2434698661390688, 0.23920597358638096, 0.2512106241376185, 0.3237281317507259, 0.3153648020875208, 0.3117729309162446, 0.319429269175071, 0.31481846607797553, 0.3121035311610101, 0.3130556512675126, 0.32399062528074335, 0.31831654820410643, 0.31740229450043655, 0.3044319259774819, 0.18252134416266622, 0.19507933632329977, 0.26557161108388766, 0.22167460515993895, 0.15338594119020302, 0.18495792754689827, 0.09580075834175401, 0.15430061870022888, 1.0, 0.14977819539455464, 0.23642097081553853, 0.2550168800484499, 0.22388101177482445, 0.23677844530410433, 0.23302400926717484, 0.2312253885015082, 0.2354386747624341, 0.2434698661390688, 0.23920597358638096, 0.2512106241376185, 0.3237281317507259, 0.3153648020875208, 0.3117729309162446, 0.319429269175071, 0.31481846607797553, 0.3121035311610101, 0.3130556512675126, 0.32399062528074335, 0.31831654820410643, 0.31740229450043655, 0.3044319259774819, 0.18252134416266622, 0.19507933632329977, 0.26557161108388766, 0.22167460515993895, 0.15338594119020302, 0.18495792754689827, 0.09580075834175401, 0.15430061870022888, 1.0, 0.14977819539455464, 0.23642097081553853, 0.2550168800484499, 0.22388101177482445, 0.23677844530410433, 0.23302400926717484, 0.2312253885015082, 0.2354386747624341, 0.2434698661390688, 0.23920597358638096, 0.2512106241376185, 0.3237281317507259, 0.3153648020875208, 0.3117729309162446, 0.319429269175071, 0.31481846607797553, 0.3121035311610101, 0.3130556512675126, 0.32399062528074335, 0.31831654820410643, 0.31740229450043655, 0.3044319259774819, 0.18252134416266622, 0.19507933632329977, 0.26557161108388766, 0.22167460515993895, 0.15338594119020302, 0.18495792754689827, 0.09580075834175401, 0.15430061870022888, 1.0, 0.14977819539455464, 0.23642097081553853, 0.2550168800484499, 0.22388101177482445, 0.23677844530410433, 0.23302400926717484, 0.2312253885015082, 0.2354386747624341, 0.2434698661390688, 0.23920597358638096, 0.2512106241376185, 0.3237281317507259, 0.3153648020875208, 0.3117729309162446, 0.319429269175071, 0.31481846607797553, 0.3121035311610101, 0.3130556512675126, 0.32399062528074335, 0.31831654820410643, 0.31740229450043655, 0.3044319259774819, 0.18252134416266622, 0.19507933632329977, 0.26557161108388766, 0.22167460515993895, 0.15338594119020302, 0.18495792754689827, 0.09580075834175401, 0.15430061870022888, 1.0, 0.14977819539455464, 0.23642097081553853, 0.2550168800484499, 0.22388101177482445, 0.23677844530410433, 0.23302400926717484, 0.2312253885015082, 0.2354386747624341, 0.2434698661390688, 0.23920597358638096, 0.2512106241376185, 0.3237281317507259, 0.3153648020875208, 0.3117729309162446, 0.319429269175071, 0.31481846607797553, 0.3121035311610101, 0.3130556512675126, 0.32399062528074335, 0.31831654820410643, 0.31740229450043655, 0.3044319259774819, 0.18252134416266622, 0.19507933632329977, 0.26557161108388766, 0.22167460515993895, 0.15338594119020302, 0.18495792754689827, 0.09580075834175401, 0.15430061870022888, 1.0, 0.14977819539455464, 0.23642097081553853, 0.2550168800484499, 0.22388101177482445, 0.23677844530410433, 0.23302400926717484, 0.2312253885015082, 0.2354386747624341, 0.2434698661390688, 0.23920597358638096, 0.2512106241376185, 0.3237281317507259, 
0.3153648020875208, 0.3117729309162446, 0.319429269175071, 0.31481846607797553, 0.3121035311610101, 0.3130556512675126, 0.32399062528074335, 0.31831654820410643, 0.31740229450043655, 0.3044319259774819, 0.18252134416266622, 0.19507933632329977, 0.26557161108388766, 0.22167460515993895, 0.15338594119020302, 0.18495792754689827, 0.09580075834175401, 0.15430061870022888, 1.0, 0.14977819539455464]}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB AskUbuntuDupQuestions", "type": "mteb/askubuntudupquestions-reranking", "config": "default", "split": "test", "revision": "2000358ca161889fa9c082cb41daa8dcfb161a54"}, 
"metrics": [{"type": "map", "value": 54.0887502643707}, {"type": "mrr", "value": 67.73864485775843}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB BIOSSES", "type": "mteb/biosses-sts", "config": "default", "split": "test", "revision": "d3fb88f8f02e40887cd149695127462bbcf29b4a"}, "metrics": [{"type": "cos_sim_pearson", "value": 78.95194509739122}, {"type": "cos_sim_spearman", "value": 80.77894903688735}, {"type": "euclidean_pearson", "value": 79.39078717146849}, {"type": "euclidean_spearman", "value": 80.77894903688735}, {"type": "manhattan_pearson", "value": 78.71356224958951}, {"type": "manhattan_spearman", "value": 80.19520079602864}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB Banking77Classification", "type": "mteb/banking77", "config": "default", "split": "test", "revision": "0fd18e25b25c072e09e0d92ab615fda904d66300"}, "metrics": [{"type": "accuracy", "value": 71.07467532467531}, {"type": "f1", "value": 70.01947223710656}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB BiorxivClusteringP2P", "type": "mteb/biorxiv-clustering-p2p", "config": "default", "split": "test", "revision": "65b79d1d13f80053f67aca9498d9402c2d9f1f40"}, "metrics": [{"type": "v_measure", "value": 32.35131737359483}, {"type": "v_measures", "value": [0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 
0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 
0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 
0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 
0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 
0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 
0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466, 0.30923604547266875, 0.31460114779964393, 0.3220031887684693, 0.3157541534649746, 0.3261157725875504, 0.32829750804174646, 0.31520163124284745, 0.32583889441755653, 0.33779729550799154, 0.34028610005603466]}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB BiorxivClusteringS2S", "type": "mteb/biorxiv-clustering-s2s", "config": "default", "split": "test", "revision": "258694dd0231531bc1fd9de6ceb52a0853c6d908"}, "metrics": [{"type": "v_measure", "value": 24.05979515497522}, {"type": "v_measures", "value": [0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 
0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 
0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 
0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 
0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 
0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027, 0.24423605317277747, 0.2375355178050442, 
0.24890298326625343, 0.2343221982594761, 0.22861139295656668, 0.23279193251929806, 0.234905158950128, 0.2452790973282123, 0.24547781895496315, 0.2539173622848027]}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackAndroidRetrieval", "type": "mteb/cqadupstack-android", "config": "default", "split": "test", "revision": "f46a197baaae43b4f621051089b82a364682dfeb"}, "metrics": [{"type": "map_at_1", "value": 20.799}, {"type": "map_at_10", "value": 28.028}, {"type": "map_at_100", "value": 29.066}, {"type": "map_at_1000", "value": 29.205}, {"type": "map_at_20", "value": 28.541}, {"type": "map_at_3", "value": 25.741000000000003}, {"type": "map_at_5", "value": 26.962000000000003}, {"type": "mrr_at_1", "value": 27.039}, {"type": "mrr_at_10", "value": 34.028000000000006}, {"type": "mrr_at_100", "value": 34.823}, {"type": "mrr_at_1000", "value": 34.894}, {"type": "mrr_at_20", "value": 34.476}, {"type": "mrr_at_3", "value": 31.855}, {"type": "mrr_at_5", "value": 33.114}, {"type": "ndcg_at_1", "value": 27.039}, {"type": "ndcg_at_10", "value": 32.958999999999996}, {"type": "ndcg_at_100", "value": 37.778}, {"type": "ndcg_at_1000", "value": 40.703}, {"type": "ndcg_at_20", "value": 34.58}, {"type": "ndcg_at_3", "value": 29.443}, {"type": "ndcg_at_5", "value": 30.887999999999998}, {"type": "precision_at_1", "value": 27.039}, {"type": "precision_at_10", "value": 6.252000000000001}, {"type": "precision_at_100", "value": 1.0659999999999998}, {"type": "precision_at_1000", "value": 0.16199999999999998}, {"type": "precision_at_20", "value": 3.705}, {"type": "precision_at_3", "value": 14.402000000000001}, {"type": "precision_at_5", "value": 10.157}, {"type": "recall_at_1", "value": 20.799}, {"type": "recall_at_10", "value": 41.819}, {"type": "recall_at_100", "value": 63.32299999999999}, {"type": "recall_at_1000", "value": 82.994}, {"type": "recall_at_20", "value": 48.024}, {"type": "recall_at_3", "value": 30.523}, {"type": "recall_at_5", "value": 35.214}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackEnglishRetrieval", "type": "mteb/cqadupstack-english", "config": "default", "split": "test", "revision": "ad9991cb51e31e31e430383c75ffb2885547b5f0"}, "metrics": [{"type": "map_at_1", "value": 13.431999999999999}, {"type": "map_at_10", "value": 18.384}, {"type": "map_at_100", "value": 19.067999999999998}, {"type": "map_at_1000", "value": 19.178}, {"type": "map_at_20", "value": 18.732}, {"type": "map_at_3", "value": 16.834}, {"type": "map_at_5", "value": 17.758}, {"type": "mrr_at_1", "value": 16.624}, {"type": "mrr_at_10", "value": 21.467}, {"type": "mrr_at_100", "value": 22.126}, {"type": "mrr_at_1000", "value": 22.206}, {"type": "mrr_at_20", "value": 21.8}, {"type": "mrr_at_3", "value": 19.894000000000002}, {"type": "mrr_at_5", "value": 20.794999999999998}, {"type": "ndcg_at_1", "value": 16.624}, {"type": "ndcg_at_10", "value": 21.502}, {"type": "ndcg_at_100", "value": 25.006}, {"type": "ndcg_at_1000", "value": 27.842}, {"type": "ndcg_at_20", "value": 22.651}, {"type": "ndcg_at_3", "value": 18.857}, {"type": "ndcg_at_5", "value": 20.149}, {"type": "precision_at_1", "value": 16.624}, {"type": "precision_at_10", "value": 4.025}, {"type": "precision_at_100", "value": 0.705}, {"type": "precision_at_1000", "value": 0.117}, {"type": "precision_at_20", "value": 2.408}, {"type": "precision_at_3", "value": 9.107999999999999}, {"type": "precision_at_5", "value": 6.561}, {"type": "recall_at_1", "value": 13.431999999999999}, {"type": "recall_at_10", "value": 27.648}, {"type": 
"recall_at_100", "value": 43.455}, {"type": "recall_at_1000", "value": 63.246}, {"type": "recall_at_20", "value": 31.896}, {"type": "recall_at_3", "value": 20.084}, {"type": "recall_at_5", "value": 23.593}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackGamingRetrieval", "type": "mteb/cqadupstack-gaming", "config": "default", "split": "test", "revision": "4885aa143210c98657558c04aaf3dc47cfb54340"}, "metrics": [{"type": "map_at_1", "value": 24.26}, {"type": "map_at_10", "value": 32.432}, {"type": "map_at_100", "value": 33.415}, {"type": "map_at_1000", "value": 33.512}, {"type": "map_at_20", "value": 32.949}, {"type": "map_at_3", "value": 29.938}, {"type": "map_at_5", "value": 31.328}, {"type": "mrr_at_1", "value": 27.900000000000002}, {"type": "mrr_at_10", "value": 35.449000000000005}, {"type": "mrr_at_100", "value": 36.293}, {"type": "mrr_at_1000", "value": 36.359}, {"type": "mrr_at_20", "value": 35.92}, {"type": "mrr_at_3", "value": 33.166000000000004}, {"type": "mrr_at_5", "value": 34.439}, {"type": "ndcg_at_1", "value": 27.900000000000002}, {"type": "ndcg_at_10", "value": 37.074}, {"type": "ndcg_at_100", "value": 41.786}, {"type": "ndcg_at_1000", "value": 44.01}, {"type": "ndcg_at_20", "value": 38.786}, {"type": "ndcg_at_3", "value": 32.440000000000005}, {"type": "ndcg_at_5", "value": 34.615}, {"type": "precision_at_1", "value": 27.900000000000002}, {"type": "precision_at_10", "value": 6.056}, {"type": "precision_at_100", "value": 0.924}, {"type": "precision_at_1000", "value": 0.11900000000000001}, {"type": "precision_at_20", "value": 3.4979999999999998}, {"type": "precision_at_3", "value": 14.274000000000001}, {"type": "precision_at_5", "value": 10.044}, {"type": "recall_at_1", "value": 24.26}, {"type": "recall_at_10", "value": 48.266}, {"type": "recall_at_100", "value": 69.433}, {"type": "recall_at_1000", "value": 85.419}, {"type": "recall_at_20", "value": 54.578}, {"type": "recall_at_3", "value": 35.776}, {"type": "recall_at_5", "value": 41.076}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackGisRetrieval", "type": "mteb/cqadupstack-gis", "config": "default", "split": "test", "revision": "5003b3064772da1887988e05400cf3806fe491f2"}, "metrics": [{"type": "map_at_1", "value": 13.277}, {"type": "map_at_10", "value": 17.776}, {"type": "map_at_100", "value": 18.476}, {"type": "map_at_1000", "value": 18.572}, {"type": "map_at_20", "value": 18.102}, {"type": "map_at_3", "value": 16.072}, {"type": "map_at_5", "value": 17.085}, {"type": "mrr_at_1", "value": 14.237}, {"type": "mrr_at_10", "value": 19.051000000000002}, {"type": "mrr_at_100", "value": 19.728}, {"type": "mrr_at_1000", "value": 19.819}, {"type": "mrr_at_20", "value": 19.346}, {"type": "mrr_at_3", "value": 17.439}, {"type": "mrr_at_5", "value": 18.387999999999998}, {"type": "ndcg_at_1", "value": 14.237}, {"type": "ndcg_at_10", "value": 20.669999999999998}, {"type": "ndcg_at_100", "value": 24.58}, {"type": "ndcg_at_1000", "value": 27.557}, {"type": "ndcg_at_20", "value": 21.784}, {"type": "ndcg_at_3", "value": 17.369}, {"type": "ndcg_at_5", "value": 19.067999999999998}, {"type": "precision_at_1", "value": 14.237}, {"type": "precision_at_10", "value": 3.232}, {"type": "precision_at_100", "value": 0.5579999999999999}, {"type": "precision_at_1000", "value": 0.08499999999999999}, {"type": "precision_at_20", "value": 1.881}, {"type": "precision_at_3", "value": 7.3069999999999995}, {"type": "precision_at_5", "value": 5.333}, {"type": "recall_at_1", "value": 13.277}, {"type": "recall_at_10", 
"value": 28.496}, {"type": "recall_at_100", "value": 47.343}, {"type": "recall_at_1000", "value": 70.92699999999999}, {"type": "recall_at_20", "value": 32.646}, {"type": "recall_at_3", "value": 19.570999999999998}, {"type": "recall_at_5", "value": 23.624000000000002}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackMathematicaRetrieval", "type": "mteb/cqadupstack-mathematica", "config": "default", "split": "test", "revision": "90fceea13679c63fe563ded68f3b6f06e50061de"}, "metrics": [{"type": "map_at_1", "value": 6.329999999999999}, {"type": "map_at_10", "value": 10.16}, {"type": "map_at_100", "value": 11.004}, {"type": "map_at_1000", "value": 11.136}, {"type": "map_at_20", "value": 10.546999999999999}, {"type": "map_at_3", "value": 8.491}, {"type": "map_at_5", "value": 9.383}, {"type": "mrr_at_1", "value": 7.587000000000001}, {"type": "mrr_at_10", "value": 12.434000000000001}, {"type": "mrr_at_100", "value": 13.279}, {"type": "mrr_at_1000", "value": 13.377}, {"type": "mrr_at_20", "value": 12.855}, {"type": "mrr_at_3", "value": 10.282}, {"type": "mrr_at_5", "value": 11.42}, {"type": "ndcg_at_1", "value": 7.587000000000001}, {"type": "ndcg_at_10", "value": 13.239999999999998}, {"type": "ndcg_at_100", "value": 17.727999999999998}, {"type": "ndcg_at_1000", "value": 21.346}, {"type": "ndcg_at_20", "value": 14.649000000000001}, {"type": "ndcg_at_3", "value": 9.687}, {"type": "ndcg_at_5", "value": 11.306}, {"type": "precision_at_1", "value": 7.587000000000001}, {"type": "precision_at_10", "value": 2.749}, {"type": "precision_at_100", "value": 0.583}, {"type": "precision_at_1000", "value": 0.104}, {"type": "precision_at_20", "value": 1.76}, {"type": "precision_at_3", "value": 4.643}, {"type": "precision_at_5", "value": 3.881}, {"type": "recall_at_1", "value": 6.329999999999999}, {"type": "recall_at_10", "value": 20.596999999999998}, {"type": "recall_at_100", "value": 40.642}, {"type": "recall_at_1000", "value": 67.268}, {"type": "recall_at_20", "value": 25.615}, {"type": "recall_at_3", "value": 11.036}, {"type": "recall_at_5", "value": 14.909}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackPhysicsRetrieval", "type": "mteb/cqadupstack-physics", "config": "default", "split": "test", "revision": "79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4"}, "metrics": [{"type": "map_at_1", "value": 16.558}, {"type": "map_at_10", "value": 22.551}, {"type": "map_at_100", "value": 23.669}, {"type": "map_at_1000", "value": 23.809}, {"type": "map_at_20", "value": 23.173}, {"type": "map_at_3", "value": 20.681}, {"type": "map_at_5", "value": 21.674}, {"type": "mrr_at_1", "value": 20.693}, {"type": "mrr_at_10", "value": 27.133000000000003}, {"type": "mrr_at_100", "value": 28.073999999999998}, {"type": "mrr_at_1000", "value": 28.16}, {"type": "mrr_at_20", "value": 27.693}, {"type": "mrr_at_3", "value": 25.201}, {"type": "mrr_at_5", "value": 26.407999999999998}, {"type": "ndcg_at_1", "value": 20.693}, {"type": "ndcg_at_10", "value": 26.701999999999998}, {"type": "ndcg_at_100", "value": 32.031}, {"type": "ndcg_at_1000", "value": 35.265}, {"type": "ndcg_at_20", "value": 28.814}, {"type": "ndcg_at_3", "value": 23.474}, {"type": "ndcg_at_5", "value": 24.924}, {"type": "precision_at_1", "value": 20.693}, {"type": "precision_at_10", "value": 4.986}, {"type": "precision_at_100", "value": 0.915}, {"type": "precision_at_1000", "value": 0.13699999999999998}, {"type": "precision_at_20", "value": 3.157}, {"type": "precision_at_3", "value": 11.132}, {"type": "precision_at_5", "value": 8.027}, 
{"type": "recall_at_1", "value": 16.558}, {"type": "recall_at_10", "value": 34.636}, {"type": "recall_at_100", "value": 57.745999999999995}, {"type": "recall_at_1000", "value": 80.438}, {"type": "recall_at_20", "value": 42.248000000000005}, {"type": "recall_at_3", "value": 25.419999999999998}, {"type": "recall_at_5", "value": 29.254}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackProgrammersRetrieval", "type": "mteb/cqadupstack-programmers", "config": "default", "split": "test", "revision": "6184bc1440d2dbc7612be22b50686b8826d22b32"}, "metrics": [{"type": "map_at_1", "value": 10.231}, {"type": "map_at_10", "value": 14.352}, {"type": "map_at_100", "value": 15.174000000000001}, {"type": "map_at_1000", "value": 15.310000000000002}, {"type": "map_at_20", "value": 14.704}, {"type": "map_at_3", "value": 12.878}, {"type": "map_at_5", "value": 13.632}, {"type": "mrr_at_1", "value": 12.556999999999999}, {"type": "mrr_at_10", "value": 17.378}, {"type": "mrr_at_100", "value": 18.186}, {"type": "mrr_at_1000", "value": 18.287}, {"type": "mrr_at_20", "value": 17.752000000000002}, {"type": "mrr_at_3", "value": 15.772}, {"type": "mrr_at_5", "value": 16.6}, {"type": "ndcg_at_1", "value": 12.556999999999999}, {"type": "ndcg_at_10", "value": 17.501}, {"type": "ndcg_at_100", "value": 22.065}, {"type": "ndcg_at_1000", "value": 25.607999999999997}, {"type": "ndcg_at_20", "value": 18.756}, {"type": "ndcg_at_3", "value": 14.691}, {"type": "ndcg_at_5", "value": 15.842}, {"type": "precision_at_1", "value": 12.556999999999999}, {"type": "precision_at_10", "value": 3.322}, {"type": "precision_at_100", "value": 0.6709999999999999}, {"type": "precision_at_1000", "value": 0.11399999999999999}, {"type": "precision_at_20", "value": 2.0549999999999997}, {"type": "precision_at_3", "value": 6.963}, {"type": "precision_at_5", "value": 5.137}, {"type": "recall_at_1", "value": 10.231}, {"type": "recall_at_10", "value": 24.2}, {"type": "recall_at_100", "value": 45.051}, {"type": "recall_at_1000", "value": 70.372}, {"type": "recall_at_20", "value": 28.624}, {"type": "recall_at_3", "value": 16.209}, {"type": "recall_at_5", "value": 19.259999999999998}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackRetrieval", "type": "mteb/cqadupstack", "config": "default", "split": "test", "revision": "4ffe81d471b1924886b33c7567bfb200e9eec5c4"}, "metrics": [{"type": "map_at_1", "value": 13.304916666666664}, {"type": "map_at_10", "value": 18.2725}, {"type": "map_at_100", "value": 19.125249999999998}, {"type": "map_at_1000", "value": 19.246166666666664}, {"type": "map_at_20", "value": 18.682916666666667}, {"type": "map_at_3", "value": 16.61425}, {"type": "map_at_5", "value": 17.508000000000003}, {"type": "mrr_at_1", "value": 16.06625}, {"type": "mrr_at_10", "value": 21.317583333333335}, {"type": "mrr_at_100", "value": 22.106583333333333}, {"type": "mrr_at_1000", "value": 22.195}, {"type": "mrr_at_20", "value": 21.716500000000003}, {"type": "mrr_at_3", "value": 19.601666666666667}, {"type": "mrr_at_5", "value": 20.540333333333326}, {"type": "ndcg_at_1", "value": 16.06625}, {"type": "ndcg_at_10", "value": 21.690500000000004}, {"type": "ndcg_at_100", "value": 26.08625}, {"type": "ndcg_at_1000", "value": 29.223333333333336}, {"type": "ndcg_at_20", "value": 23.085083333333333}, {"type": "ndcg_at_3", "value": 18.621583333333337}, {"type": "ndcg_at_5", "value": 19.984999999999996}, {"type": "precision_at_1", "value": 16.06625}, {"type": "precision_at_10", "value": 3.9008333333333334}, {"type": 
"precision_at_100", "value": 0.7179166666666666}, {"type": "precision_at_1000", "value": 0.11541666666666667}, {"type": "precision_at_20", "value": 2.3684166666666666}, {"type": "precision_at_3", "value": 8.643}, {"type": "precision_at_5", "value": 6.230833333333333}, {"type": "recall_at_1", "value": 13.304916666666664}, {"type": "recall_at_10", "value": 29.081916666666665}, {"type": "recall_at_100", "value": 49.29125}, {"type": "recall_at_1000", "value": 72.18308333333331}, {"type": "recall_at_20", "value": 34.271499999999996}, {"type": "recall_at_3", "value": 20.34425}, {"type": "recall_at_5", "value": 23.923583333333333}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackStatsRetrieval", "type": "mteb/cqadupstack-stats", "config": "default", "split": "test", "revision": "65ac3a16b8e91f9cee4c9828cc7c335575432a2a"}, "metrics": [{"type": "map_at_1", "value": 10.539}, {"type": "map_at_10", "value": 14.783}, {"type": "map_at_100", "value": 15.542}, {"type": "map_at_1000", "value": 15.644}, {"type": "map_at_20", "value": 15.139}, {"type": "map_at_3", "value": 13.508999999999999}, {"type": "map_at_5", "value": 14.191}, {"type": "mrr_at_1", "value": 12.577}, {"type": "mrr_at_10", "value": 17.212}, {"type": "mrr_at_100", "value": 17.95}, {"type": "mrr_at_1000", "value": 18.043}, {"type": "mrr_at_20", "value": 17.563000000000002}, {"type": "mrr_at_3", "value": 15.951}, {"type": "mrr_at_5", "value": 16.587}, {"type": "ndcg_at_1", "value": 12.577}, {"type": "ndcg_at_10", "value": 17.683}, {"type": "ndcg_at_100", "value": 21.783}, {"type": "ndcg_at_1000", "value": 24.802}, {"type": "ndcg_at_20", "value": 18.944}, {"type": "ndcg_at_3", "value": 15.204999999999998}, {"type": "ndcg_at_5", "value": 16.274}, {"type": "precision_at_1", "value": 12.577}, {"type": "precision_at_10", "value": 2.991}, {"type": "precision_at_100", "value": 0.557}, {"type": "precision_at_1000", "value": 0.08800000000000001}, {"type": "precision_at_20", "value": 1.81}, {"type": "precision_at_3", "value": 6.952999999999999}, {"type": "precision_at_5", "value": 4.8469999999999995}, {"type": "recall_at_1", "value": 10.539}, {"type": "recall_at_10", "value": 24.541}, {"type": "recall_at_100", "value": 43.732}, {"type": "recall_at_1000", "value": 66.97800000000001}, {"type": "recall_at_20", "value": 29.331000000000003}, {"type": "recall_at_3", "value": 17.096}, {"type": "recall_at_5", "value": 20.080000000000002}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackTexRetrieval", "type": "mteb/cqadupstack-tex", "config": "default", "split": "test", "revision": "46989137a86843e03a6195de44b09deda022eec7"}, "metrics": [{"type": "map_at_1", "value": 7.954}, {"type": "map_at_10", "value": 11.091}, {"type": "map_at_100", "value": 11.828}, {"type": "map_at_1000", "value": 11.935}, {"type": "map_at_20", "value": 11.44}, {"type": "map_at_3", "value": 9.876}, {"type": "map_at_5", "value": 10.496}, {"type": "mrr_at_1", "value": 9.738}, {"type": "mrr_at_10", "value": 13.361}, {"type": "mrr_at_100", "value": 14.096}, {"type": "mrr_at_1000", "value": 14.184}, {"type": "mrr_at_20", "value": 13.721}, {"type": "mrr_at_3", "value": 12.004}, {"type": "mrr_at_5", "value": 12.658}, {"type": "ndcg_at_1", "value": 9.738}, {"type": "ndcg_at_10", "value": 13.592}, {"type": "ndcg_at_100", "value": 17.512}, {"type": "ndcg_at_1000", "value": 20.602999999999998}, {"type": "ndcg_at_20", "value": 14.789}, {"type": "ndcg_at_3", "value": 11.232000000000001}, {"type": "ndcg_at_5", "value": 12.191}, {"type": "precision_at_1", 
"value": 9.738}, {"type": "precision_at_10", "value": 2.598}, {"type": "precision_at_100", "value": 0.553}, {"type": "precision_at_1000", "value": 0.096}, {"type": "precision_at_20", "value": 1.652}, {"type": "precision_at_3", "value": 5.311}, {"type": "precision_at_5", "value": 3.895}, {"type": "recall_at_1", "value": 7.954}, {"type": "recall_at_10", "value": 18.932}, {"type": "recall_at_100", "value": 37.082}, {"type": "recall_at_1000", "value": 60.114999999999995}, {"type": "recall_at_20", "value": 23.339}, {"type": "recall_at_3", "value": 12.318999999999999}, {"type": "recall_at_5", "value": 14.834}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackUnixRetrieval", "type": "mteb/cqadupstack-unix", "config": "default", "split": "test", "revision": "6c6430d3a6d36f8d2a829195bc5dc94d7e063e53"}, "metrics": [{"type": "map_at_1", "value": 13.764999999999999}, {"type": "map_at_10", "value": 17.766000000000002}, {"type": "map_at_100", "value": 18.637999999999998}, {"type": "map_at_1000", "value": 18.755}, {"type": "map_at_20", "value": 18.242}, {"type": "map_at_3", "value": 16.502}, {"type": "map_at_5", "value": 17.155}, {"type": "mrr_at_1", "value": 16.604}, {"type": "mrr_at_10", "value": 21.071}, {"type": "mrr_at_100", "value": 21.906}, {"type": "mrr_at_1000", "value": 22.0}, {"type": "mrr_at_20", "value": 21.545}, {"type": "mrr_at_3", "value": 19.667}, {"type": "mrr_at_5", "value": 20.395}, {"type": "ndcg_at_1", "value": 16.604}, {"type": "ndcg_at_10", "value": 20.742}, {"type": "ndcg_at_100", "value": 25.363999999999997}, {"type": "ndcg_at_1000", "value": 28.607}, {"type": "ndcg_at_20", "value": 22.469}, {"type": "ndcg_at_3", "value": 18.276999999999997}, {"type": "ndcg_at_5", "value": 19.277}, {"type": "precision_at_1", "value": 16.604}, {"type": "precision_at_10", "value": 3.47}, {"type": "precision_at_100", "value": 0.651}, {"type": "precision_at_1000", "value": 0.104}, {"type": "precision_at_20", "value": 2.169}, {"type": "precision_at_3", "value": 8.209}, {"type": "precision_at_5", "value": 5.7090000000000005}, {"type": "recall_at_1", "value": 13.764999999999999}, {"type": "recall_at_10", "value": 26.752}, {"type": "recall_at_100", "value": 47.988}, {"type": "recall_at_1000", "value": 71.859}, {"type": "recall_at_20", "value": 33.25}, {"type": "recall_at_3", "value": 19.777}, {"type": "recall_at_5", "value": 22.39}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackWebmastersRetrieval", "type": "mteb/cqadupstack-webmasters", "config": "default", "split": "test", "revision": "160c094312a0e1facb97e55eeddb698c0abe3571"}, "metrics": [{"type": "map_at_1", "value": 14.435999999999998}, {"type": "map_at_10", "value": 19.517}, {"type": "map_at_100", "value": 20.380000000000003}, {"type": "map_at_1000", "value": 20.558}, {"type": "map_at_20", "value": 19.858}, {"type": "map_at_3", "value": 17.764}, {"type": "map_at_5", "value": 18.705}, {"type": "mrr_at_1", "value": 18.182000000000002}, {"type": "mrr_at_10", "value": 23.342}, {"type": "mrr_at_100", "value": 24.121000000000002}, {"type": "mrr_at_1000", "value": 24.226}, {"type": "mrr_at_20", "value": 23.71}, {"type": "mrr_at_3", "value": 21.573999999999998}, {"type": "mrr_at_5", "value": 22.572}, {"type": "ndcg_at_1", "value": 18.182000000000002}, {"type": "ndcg_at_10", "value": 23.322000000000003}, {"type": "ndcg_at_100", "value": 27.529999999999998}, {"type": "ndcg_at_1000", "value": 31.434}, {"type": "ndcg_at_20", "value": 24.274}, {"type": "ndcg_at_3", "value": 20.307}, {"type": "ndcg_at_5", "value": 
21.681}, {"type": "precision_at_1", "value": 18.182000000000002}, {"type": "precision_at_10", "value": 4.486}, {"type": "precision_at_100", "value": 0.907}, {"type": "precision_at_1000", "value": 0.17500000000000002}, {"type": "precision_at_20", "value": 2.727}, {"type": "precision_at_3", "value": 9.684}, {"type": "precision_at_5", "value": 7.074999999999999}, {"type": "recall_at_1", "value": 14.435999999999998}, {"type": "recall_at_10", "value": 30.221999999999998}, {"type": "recall_at_100", "value": 50.657}, {"type": "recall_at_1000", "value": 77.803}, {"type": "recall_at_20", "value": 34.044999999999995}, {"type": "recall_at_3", "value": 21.394}, {"type": "recall_at_5", "value": 25.058000000000003}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB CQADupstackWordpressRetrieval", "type": "mteb/cqadupstack-wordpress", "config": "default", "split": "test", "revision": "4ffe81d471b1924886b33c7567bfb200e9eec5c4"}, "metrics": [{"type": "map_at_1", "value": 8.078000000000001}, {"type": "map_at_10", "value": 12.43}, {"type": "map_at_100", "value": 13.242999999999999}, {"type": "map_at_1000", "value": 13.34}, {"type": "map_at_20", "value": 12.767999999999999}, {"type": "map_at_3", "value": 11.085}, {"type": "map_at_5", "value": 11.727}, {"type": "mrr_at_1", "value": 9.057}, {"type": "mrr_at_10", "value": 13.885}, {"type": "mrr_at_100", "value": 14.697}, {"type": "mrr_at_1000", "value": 14.785}, {"type": "mrr_at_20", "value": 14.216999999999999}, {"type": "mrr_at_3", "value": 12.415}, {"type": "mrr_at_5", "value": 13.108}, {"type": "ndcg_at_1", "value": 9.057}, {"type": "ndcg_at_10", "value": 15.299}, {"type": "ndcg_at_100", "value": 19.872}, {"type": "ndcg_at_1000", "value": 22.903000000000002}, {"type": "ndcg_at_20", "value": 16.525000000000002}, {"type": "ndcg_at_3", "value": 12.477}, {"type": "ndcg_at_5", "value": 13.605}, {"type": "precision_at_1", "value": 9.057}, {"type": "precision_at_10", "value": 2.643}, {"type": "precision_at_100", "value": 0.525}, {"type": "precision_at_1000", "value": 0.084}, {"type": "precision_at_20", "value": 1.599}, {"type": "precision_at_3", "value": 5.7299999999999995}, {"type": "precision_at_5", "value": 4.104}, {"type": "recall_at_1", "value": 8.078000000000001}, {"type": "recall_at_10", "value": 22.874}, {"type": "recall_at_100", "value": 45.043}, {"type": "recall_at_1000", "value": 68.77799999999999}, {"type": "recall_at_20", "value": 27.662}, {"type": "recall_at_3", "value": 14.926}, {"type": "recall_at_5", "value": 17.791}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB ClimateFEVER", "type": "mteb/climate-fever", "config": "default", "split": "test", "revision": "47f2ac6acb640fc46020b02a5b59fdda04d39380"}, "metrics": [{"type": "map_at_1", "value": 4.460999999999999}, {"type": "map_at_10", "value": 8.625}, {"type": "map_at_100", "value": 9.772}, {"type": "map_at_1000", "value": 9.952}, {"type": "map_at_20", "value": 9.133}, {"type": "map_at_3", "value": 6.961}, {"type": "map_at_5", "value": 7.727}, {"type": "mrr_at_1", "value": 9.381}, {"type": "mrr_at_10", "value": 16.742}, {"type": "mrr_at_100", "value": 17.901}, {"type": "mrr_at_1000", "value": 17.983}, {"type": "mrr_at_20", "value": 17.368}, {"type": "mrr_at_3", "value": 14.126}, {"type": "mrr_at_5", "value": 15.504000000000001}, {"type": "ndcg_at_1", "value": 9.381}, {"type": "ndcg_at_10", "value": 13.111}, {"type": "ndcg_at_100", "value": 19.043}, {"type": "ndcg_at_1000", "value": 22.901}, {"type": "ndcg_at_20", "value": 14.909}, {"type": "ndcg_at_3", "value": 9.727}, 
{"type": "ndcg_at_5", "value": 10.91}, {"type": "precision_at_1", "value": 9.381}, {"type": "precision_at_10", "value": 4.391}, {"type": "precision_at_100", "value": 1.075}, {"type": "precision_at_1000", "value": 0.178}, {"type": "precision_at_20", "value": 2.9739999999999998}, {"type": "precision_at_3", "value": 7.448}, {"type": "precision_at_5", "value": 5.954000000000001}, {"type": "recall_at_1", "value": 4.460999999999999}, {"type": "recall_at_10", "value": 17.657999999999998}, {"type": "recall_at_100", "value": 39.201}, {"type": "recall_at_1000", "value": 61.229}, {"type": "recall_at_20", "value": 22.758}, {"type": "recall_at_3", "value": 9.724}, {"type": "recall_at_5", "value": 12.651000000000002}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB DBPedia", "type": "mteb/dbpedia", "config": "default", "split": "test", "revision": "c0f706b76e590d620bd6618b3ca8efdd34e2d659"}, "metrics": [{"type": "map_at_1", "value": 5.849}, {"type": "map_at_10", "value": 12.828999999999999}, {"type": "map_at_100", "value": 17.204}, {"type": "map_at_1000", "value": 18.314}, {"type": "map_at_20", "value": 14.607000000000001}, {"type": "map_at_3", "value": 9.442}, {"type": "map_at_5", "value": 10.808}, {"type": "mrr_at_1", "value": 48.75}, {"type": "mrr_at_10", "value": 59.82300000000001}, {"type": "mrr_at_100", "value": 60.293}, {"type": "mrr_at_1000", "value": 60.307}, {"type": "mrr_at_20", "value": 60.131}, {"type": "mrr_at_3", "value": 57.208000000000006}, {"type": "mrr_at_5", "value": 58.583}, {"type": "ndcg_at_1", "value": 36.875}, {"type": "ndcg_at_10", "value": 29.328}, {"type": "ndcg_at_100", "value": 32.2}, {"type": "ndcg_at_1000", "value": 39.125}, {"type": "ndcg_at_20", "value": 28.674}, {"type": "ndcg_at_3", "value": 32.469}, {"type": "ndcg_at_5", "value": 30.613}, {"type": "precision_at_1", "value": 48.75}, {"type": "precision_at_10", "value": 24.099999999999998}, {"type": "precision_at_100", "value": 7.292999999999999}, {"type": "precision_at_1000", "value": 1.486}, {"type": "precision_at_20", "value": 17.812}, {"type": "precision_at_3", "value": 37.167}, {"type": "precision_at_5", "value": 31.1}, {"type": "recall_at_1", "value": 5.849}, {"type": "recall_at_10", "value": 18.473}, {"type": "recall_at_100", "value": 37.602000000000004}, {"type": "recall_at_1000", "value": 60.68599999999999}, {"type": "recall_at_20", "value": 23.552}, {"type": "recall_at_3", "value": 11.077}, {"type": "recall_at_5", "value": 13.511999999999999}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB EmotionClassification", "type": "mteb/emotion", "config": "default", "split": "test", "revision": "4f58c6b202a23cf9a4da393831edf4f9183cad37"}, "metrics": [{"type": "accuracy", "value": 46.78}, {"type": "f1", "value": 40.027922341568576}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB FEVER", "type": "mteb/fever", "config": "default", "split": "test", "revision": "bea83ef9e8fb933d90a2f1d5515737465d613e12"}, "metrics": [{"type": "map_at_1", "value": 21.675}, {"type": "map_at_10", "value": 30.4}, {"type": "map_at_100", "value": 31.285}, {"type": "map_at_1000", "value": 31.351000000000003}, {"type": "map_at_20", "value": 30.917}, {"type": "map_at_3", "value": 27.748}, {"type": "map_at_5", "value": 29.265}, {"type": "mrr_at_1", "value": 23.327}, {"type": "mrr_at_10", "value": 32.363}, {"type": "mrr_at_100", "value": 33.237}, {"type": "mrr_at_1000", "value": 33.298}, {"type": "mrr_at_20", "value": 32.883}, {"type": "mrr_at_3", "value": 29.665000000000003}, {"type": "mrr_at_5", "value": 
31.230999999999998}, {"type": "ndcg_at_1", "value": 23.327}, {"type": "ndcg_at_10", "value": 35.576}, {"type": "ndcg_at_100", "value": 40.071}, {"type": "ndcg_at_1000", "value": 41.884}, {"type": "ndcg_at_20", "value": 37.431}, {"type": "ndcg_at_3", "value": 30.173}, {"type": "ndcg_at_5", "value": 32.883}, {"type": "precision_at_1", "value": 23.327}, {"type": "precision_at_10", "value": 5.438}, {"type": "precision_at_100", "value": 0.784}, {"type": "precision_at_1000", "value": 0.096}, {"type": "precision_at_20", "value": 3.121}, {"type": "precision_at_3", "value": 12.741}, {"type": "precision_at_5", "value": 9.078999999999999}, {"type": "recall_at_1", "value": 21.675}, {"type": "recall_at_10", "value": 49.952999999999996}, {"type": "recall_at_100", "value": 70.953}, {"type": "recall_at_1000", "value": 84.902}, {"type": "recall_at_20", "value": 57.081}, {"type": "recall_at_3", "value": 35.301}, {"type": "recall_at_5", "value": 41.805}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB FiQA2018", "type": "mteb/fiqa", "config": "default", "split": "test", "revision": "27a168819829fe9bcd655c2df245fb19452e8e06"}, "metrics": [{"type": "map_at_1", "value": 3.096}, {"type": "map_at_10", "value": 5.4879999999999995}, {"type": "map_at_100", "value": 6.199000000000001}, {"type": "map_at_1000", "value": 6.348}, {"type": "map_at_20", "value": 5.826}, {"type": "map_at_3", "value": 4.43}, {"type": "map_at_5", "value": 4.899}, {"type": "mrr_at_1", "value": 6.481000000000001}, {"type": "mrr_at_10", "value": 10.059999999999999}, {"type": "mrr_at_100", "value": 10.905}, {"type": "mrr_at_1000", "value": 11.019}, {"type": "mrr_at_20", "value": 10.513}, {"type": "mrr_at_3", "value": 8.436}, {"type": "mrr_at_5", "value": 9.168999999999999}, {"type": "ndcg_at_1", "value": 6.481000000000001}, {"type": "ndcg_at_10", "value": 8.097999999999999}, {"type": "ndcg_at_100", "value": 12.092}, {"type": "ndcg_at_1000", "value": 16.5}, {"type": "ndcg_at_20", "value": 9.353}, {"type": "ndcg_at_3", "value": 6.148}, {"type": "ndcg_at_5", "value": 6.714}, {"type": "precision_at_1", "value": 6.481000000000001}, {"type": "precision_at_10", "value": 2.5309999999999997}, {"type": "precision_at_100", "value": 0.6479999999999999}, {"type": "precision_at_1000", "value": 0.14100000000000001}, {"type": "precision_at_20", "value": 1.752}, {"type": "precision_at_3", "value": 4.064}, {"type": "precision_at_5", "value": 3.272}, {"type": "recall_at_1", "value": 3.096}, {"type": "recall_at_10", "value": 11.575000000000001}, {"type": "recall_at_100", "value": 27.560000000000002}, {"type": "recall_at_1000", "value": 56.391999999999996}, {"type": "recall_at_20", "value": 15.611}, {"type": "recall_at_3", "value": 5.821}, {"type": "recall_at_5", "value": 7.6259999999999994}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB HotpotQA", "type": "mteb/hotpotqa", "config": "default", "split": "test", "revision": "ab518f4d6fcca38d87c25209f94beba119d02014"}, "metrics": [{"type": "map_at_1", "value": 27.481}, {"type": "map_at_10", "value": 38.229}, {"type": "map_at_100", "value": 39.186}, {"type": "map_at_1000", "value": 39.283}, {"type": "map_at_20", "value": 38.763999999999996}, {"type": "map_at_3", "value": 35.652}, {"type": "map_at_5", "value": 37.18}, {"type": "mrr_at_1", "value": 54.962999999999994}, {"type": "mrr_at_10", "value": 62.651999999999994}, {"type": "mrr_at_100", "value": 63.158}, {"type": "mrr_at_1000", "value": 63.18899999999999}, {"type": "mrr_at_20", "value": 62.965}, {"type": "mrr_at_3", "value": 61.013}, 
{"type": "mrr_at_5", "value": 62.004999999999995}, {"type": "ndcg_at_1", "value": 54.962999999999994}, {"type": "ndcg_at_10", "value": 47.03}, {"type": "ndcg_at_100", "value": 50.938}, {"type": "ndcg_at_1000", "value": 53.028}, {"type": "ndcg_at_20", "value": 48.571999999999996}, {"type": "ndcg_at_3", "value": 42.751}, {"type": "ndcg_at_5", "value": 44.981}, {"type": "precision_at_1", "value": 54.962999999999994}, {"type": "precision_at_10", "value": 9.919}, {"type": "precision_at_100", "value": 1.302}, {"type": "precision_at_1000", "value": 0.158}, {"type": "precision_at_20", "value": 5.4559999999999995}, {"type": "precision_at_3", "value": 26.671}, {"type": "precision_at_5", "value": 17.764}, {"type": "recall_at_1", "value": 27.481}, {"type": "recall_at_10", "value": 49.595}, {"type": "recall_at_100", "value": 65.078}, {"type": "recall_at_1000", "value": 79.001}, {"type": "recall_at_20", "value": 54.564}, {"type": "recall_at_3", "value": 40.007}, {"type": "recall_at_5", "value": 44.409}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB ImdbClassification", "type": "mteb/imdb", "config": "default", "split": "test", "revision": "3d86128a09e091d6018b6d26cad27f2739fc2db7"}, "metrics": [{"type": "accuracy", "value": 74.5976}, {"type": "ap", "value": 68.90030024726627}, {"type": "f1", "value": 74.44139933523756}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB MSMARCO", "type": "mteb/msmarco", "config": "default", "split": "dev", "revision": "c5a29a104738b98a9e76336939199e264163d4a0"}, "metrics": [{"type": "map_at_1", "value": 9.392}, {"type": "map_at_10", "value": 15.858}, {"type": "map_at_100", "value": 16.821}, {"type": "map_at_1000", "value": 16.916999999999998}, {"type": "map_at_20", "value": 16.378}, {"type": "map_at_3", "value": 13.627}, {"type": "map_at_5", "value": 14.837}, {"type": "mrr_at_1", "value": 9.642000000000001}, {"type": "mrr_at_10", "value": 16.189999999999998}, {"type": "mrr_at_100", "value": 17.149}, {"type": "mrr_at_1000", "value": 17.241}, {"type": "mrr_at_20", "value": 16.712}, {"type": "mrr_at_3", "value": 13.94}, {"type": "mrr_at_5", "value": 15.173}, {"type": "ndcg_at_1", "value": 9.642000000000001}, {"type": "ndcg_at_10", "value": 19.798}, {"type": "ndcg_at_100", "value": 24.93}, {"type": "ndcg_at_1000", "value": 27.723}, {"type": "ndcg_at_20", "value": 21.676000000000002}, {"type": "ndcg_at_3", "value": 15.135000000000002}, {"type": "ndcg_at_5", "value": 17.323}, {"type": "precision_at_1", "value": 9.642000000000001}, {"type": "precision_at_10", "value": 3.335}, {"type": "precision_at_100", "value": 0.597}, {"type": "precision_at_1000", "value": 0.084}, {"type": "precision_at_20", "value": 2.052}, {"type": "precision_at_3", "value": 6.585000000000001}, {"type": "precision_at_5", "value": 5.0569999999999995}, {"type": "recall_at_1", "value": 9.392}, {"type": "recall_at_10", "value": 32.074000000000005}, {"type": "recall_at_100", "value": 56.816}, {"type": "recall_at_1000", "value": 79.107}, {"type": "recall_at_20", "value": 39.404}, {"type": "recall_at_3", "value": 19.211}, {"type": "recall_at_5", "value": 24.476}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MTOPDomainClassification (en)", "type": "mteb/mtop_domain", "config": "en", "split": "test", "revision": "d80d48c1eb48d3562165c59d59d0034df9fff0bf"}, "metrics": [{"type": "accuracy", "value": 88.23529411764707}, {"type": "f1", "value": 87.7087794539205}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MTOPIntentClassification (en)", "type": 
"mteb/mtop_intent", "config": "en", "split": "test", "revision": "ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba"}, "metrics": [{"type": "accuracy", "value": 54.9361605107159}, {"type": "f1", "value": 37.32757786855856}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MassiveIntentClassification (en)", "type": "mteb/amazon_massive_intent", "config": "en", "split": "test", "revision": "31efe3c427b0bae9c22cbb560b8f15491cc6bed7"}, "metrics": [{"type": "accuracy", "value": 62.269670477471415}, {"type": "f1", "value": 59.31689853710541}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB MassiveScenarioClassification (en)", "type": "mteb/amazon_massive_scenario", "config": "en", "split": "test", "revision": "7d571f92784cd94a019292a1f45445077d0ef634"}, "metrics": [{"type": "accuracy", "value": 68.21788836583724}, {"type": "f1", "value": 67.10588384512401}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB MedrxivClusteringP2P", "type": "mteb/medrxiv-clustering-p2p", "config": "default", "split": "test", "revision": "e7a26af6f3ae46b30dde8737f02c07b1505bcc73"}, "metrics": [{"type": "v_measure", "value": 28.23811395981688}, {"type": "v_measures", "value": [0.2595788729651215, 0.26996810148313405, 0.2732996057137152, 0.2774563691306207, 0.26582943289674876, 0.2936161238913042, 0.28255533387533865, 0.30526428824928975, 0.29682495712055484, 0.2994183106558605, 0.2595788729651215, 0.26996810148313405, 0.2732996057137152, 0.2774563691306207, 0.26582943289674876, 0.2936161238913042, 0.28255533387533865, 0.30526428824928975, 0.29682495712055484, 0.2994183106558605, 0.2595788729651215, 0.26996810148313405, 0.2732996057137152, 0.2774563691306207, 0.26582943289674876, 0.2936161238913042, 0.28255533387533865, 0.30526428824928975, 0.29682495712055484, 0.2994183106558605, 0.2595788729651215, 0.26996810148313405, 0.2732996057137152, 0.2774563691306207, 0.26582943289674876, 0.2936161238913042, 0.28255533387533865, 0.30526428824928975, 0.29682495712055484, 0.2994183106558605, 0.2595788729651215, 0.26996810148313405, 0.2732996057137152, 0.2774563691306207, 0.26582943289674876, 0.2936161238913042, 0.28255533387533865, 0.30526428824928975, 0.29682495712055484, 0.2994183106558605, 0.2595788729651215, 0.26996810148313405, 0.2732996057137152, 0.2774563691306207, 0.26582943289674876, 0.2936161238913042, 0.28255533387533865, 0.30526428824928975, 0.29682495712055484, 0.2994183106558605, 0.2595788729651215, 0.26996810148313405, 0.2732996057137152, 0.2774563691306207, 0.26582943289674876, 0.2936161238913042, 0.28255533387533865, 0.30526428824928975, 0.29682495712055484, 0.2994183106558605, 0.2595788729651215, 0.26996810148313405, 0.2732996057137152, 0.2774563691306207, 0.26582943289674876, 0.2936161238913042, 0.28255533387533865, 0.30526428824928975, 0.29682495712055484, 0.2994183106558605, 0.2595788729651215, 0.26996810148313405, 0.2732996057137152, 0.2774563691306207, 0.26582943289674876, 0.2936161238913042, 0.28255533387533865, 0.30526428824928975, 0.29682495712055484, 0.2994183106558605, 0.2595788729651215, 0.26996810148313405, 0.2732996057137152, 0.2774563691306207, 0.26582943289674876, 0.2936161238913042, 0.28255533387533865, 0.30526428824928975, 0.29682495712055484, 0.2994183106558605, 0.2595788729651215, 0.26996810148313405, 0.2732996057137152, 0.2774563691306207, 0.26582943289674876, 0.2936161238913042, 0.28255533387533865, 0.30526428824928975, 0.29682495712055484, 0.2994183106558605, 0.2595788729651215, 0.26996810148313405, 0.2732996057137152, 0.2774563691306207, 
0.26582943289674876, 0.2936161238913042, 0.28255533387533865, 0.30526428824928975, 0.29682495712055484, 0.2994183106558605, 0.2595788729651215, 0.26996810148313405, 0.2732996057137152, 0.2774563691306207, 0.26582943289674876, 0.2936161238913042, 0.28255533387533865, 0.30526428824928975, 0.29682495712055484, 0.2994183106558605, 0.2595788729651215, 0.26996810148313405, 0.2732996057137152, 0.2774563691306207, 0.26582943289674876, 0.2936161238913042, 0.28255533387533865, 0.30526428824928975, 0.29682495712055484, 0.2994183106558605, 0.2595788729651215, 0.26996810148313405, 0.2732996057137152, 0.2774563691306207, 0.26582943289674876, 0.2936161238913042, 0.28255533387533865, 0.30526428824928975, 0.29682495712055484, 0.2994183106558605, 0.2595788729651215, 0.26996810148313405, 0.2732996057137152, 0.2774563691306207, 0.26582943289674876, 0.2936161238913042, 0.28255533387533865, 0.30526428824928975, 0.29682495712055484, 0.2994183106558605, 0.2595788729651215, 0.26996810148313405, 0.2732996057137152, 0.2774563691306207, 0.26582943289674876, 0.2936161238913042, 0.28255533387533865, 0.30526428824928975, 0.29682495712055484, 0.2994183106558605, 0.2595788729651215, 0.26996810148313405, 0.2732996057137152, 0.2774563691306207, 0.26582943289674876, 0.2936161238913042, 0.28255533387533865, 0.30526428824928975, 0.29682495712055484, 0.2994183106558605, 0.2595788729651215, 0.26996810148313405, 0.2732996057137152, 0.2774563691306207, 0.26582943289674876, 0.2936161238913042, 0.28255533387533865, 0.30526428824928975, 0.29682495712055484, 0.2994183106558605, 0.2595788729651215, 0.26996810148313405, 0.2732996057137152, 0.2774563691306207, 0.26582943289674876, 0.2936161238913042, 0.28255533387533865, 0.30526428824928975, 0.29682495712055484, 0.2994183106558605, 0.2595788729651215, 0.26996810148313405, 0.2732996057137152, 0.2774563691306207, 0.26582943289674876, 0.2936161238913042, 0.28255533387533865, 0.30526428824928975, 0.29682495712055484, 0.2994183106558605, 0.2595788729651215, 0.26996810148313405, 0.2732996057137152, 0.2774563691306207, 0.26582943289674876, 0.2936161238913042, 0.28255533387533865, 0.30526428824928975, 0.29682495712055484, 0.2994183106558605, 0.2595788729651215, 0.26996810148313405, 0.2732996057137152, 0.2774563691306207, 0.26582943289674876, 0.2936161238913042, 0.28255533387533865, 0.30526428824928975, 0.29682495712055484, 0.2994183106558605, 0.2595788729651215, 0.26996810148313405, 0.2732996057137152, 0.2774563691306207, 0.26582943289674876, 0.2936161238913042, 0.28255533387533865, 0.30526428824928975, 0.29682495712055484, 0.2994183106558605, 0.2595788729651215, 0.26996810148313405, 0.2732996057137152, 0.2774563691306207, 0.26582943289674876, 0.2936161238913042, 0.28255533387533865, 0.30526428824928975, 0.29682495712055484, 0.2994183106558605, 0.2595788729651215, 0.26996810148313405, 0.2732996057137152, 0.2774563691306207, 0.26582943289674876, 0.2936161238913042, 0.28255533387533865, 0.30526428824928975, 0.29682495712055484, 0.2994183106558605, 0.2595788729651215, 0.26996810148313405, 0.2732996057137152, 0.2774563691306207, 0.26582943289674876, 0.2936161238913042, 0.28255533387533865, 0.30526428824928975, 0.29682495712055484, 0.2994183106558605, 0.2595788729651215, 0.26996810148313405, 0.2732996057137152, 0.2774563691306207, 0.26582943289674876, 0.2936161238913042, 0.28255533387533865, 0.30526428824928975, 0.29682495712055484, 0.2994183106558605, 0.2595788729651215, 0.26996810148313405, 0.2732996057137152, 0.2774563691306207, 0.26582943289674876, 0.2936161238913042, 0.28255533387533865, 
0.30526428824928975, 0.29682495712055484, 0.2994183106558605, 0.2595788729651215, 0.26996810148313405, 0.2732996057137152, 0.2774563691306207, 0.26582943289674876, 0.2936161238913042, 0.28255533387533865]}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB MedrxivClusteringS2S", "type": "mteb/medrxiv-clustering-s2s", "config": "default", "split": "test", "revision": "35191c8c0dca72d8ff3efcd72aa802307d469663"}, "metrics": [{"type": "v_measure", "value": 25.338025048309298}, {"type": "v_measures", "value": [
0.23561525025080798, 0.2368226918044119, 0.24303781513599937, 0.24417515042071292, 0.24407943936106685, 0.26391083211917715, 0.2674725559380188, 0.2739349725257399, 0.26620500686003945, 0.25854879041495565]}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB MindSmallReranking", "type": "mteb/mind_small", "config": "default", "split": "test", "revision": "3bdac13927fdc888b903db93b2ffdbd90b295a69"}, "metrics": [{"type": "map", "value": 30.27968813284564}, {"type": "mrr", "value": 31.192897822243165}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB NFCorpus", "type": "mteb/nfcorpus", "config": "default", "split": "test", "revision": "ec0fa4fe99da2ff19ca1214b7966684033a58814"}, "metrics": [{"type": "map_at_1", "value": 2.8930000000000002}, {"type": "map_at_10", "value": 5.63}, {"type": "map_at_100", "value": 6.981999999999999}, {"type": "map_at_1000", "value": 7.99}, {"type": "map_at_20", "value": 6.165}, {"type": "map_at_3", "value": 4.466}, {"type": "map_at_5", "value": 4.885}, {"type": "mrr_at_1", "value": 27.245}, {"type": "mrr_at_10", "value": 34.952}, {"type": "mrr_at_100", "value": 35.83}, {"type": "mrr_at_1000", "value": 35.892}, {"type": "mrr_at_20", "value": 35.464}, {"type": "mrr_at_3", "value": 32.611000000000004}, {"type": "mrr_at_5", "value": 33.725}, {"type": "ndcg_at_1", "value": 25.697}, {"type": "ndcg_at_10", "value": 18.746}, {"type": "ndcg_at_100", "value": 17.613}, {"type": "ndcg_at_1000", "value": 26.698}, {"type": "ndcg_at_20", "value": 17.607}, {"type": "ndcg_at_3", "value": 22.163}, {"type": "ndcg_at_5", "value": 20.497}, {"type": "precision_at_1", "value": 26.625}, {"type": "precision_at_10", "value": 13.437}, {"type": "precision_at_100", "value": 4.805000000000001}, {"type": "precision_at_1000", "value": 1.733}, {"type": "precision_at_20", "value": 10.17}, {"type": "precision_at_3", "value": 20.433}, {"type": "precision_at_5", "value": 17.214}, {"type": "recall_at_1", "value": 2.8930000000000002}, {"type": "recall_at_10", "value": 8.731}, {"type": "recall_at_100", "value": 19.236}, {"type": "recall_at_1000", "value": 50.632}, {"type": "recall_at_20", "value": 11.402}, {"type": "recall_at_3", "value": 5.207}, {"type": "recall_at_5", "value": 6.021}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB NQ", "type": "mteb/nq", "config": "default", "split": "test", "revision": "b774495ed302d8c44a3a7ea25c90dbce03968f31"}, "metrics": [{"type": "map_at_1", "value": 10.116999999999999}, {"type": "map_at_10", "value": 18.062}, {"type": "map_at_100", "value": 19.276}, {"type": "map_at_1000", "value": 19.366}, {"type": "map_at_20", "value": 18.719}, {"type": "map_at_3", "value": 15.018999999999998}, {"type": "map_at_5", "value": 16.659}, {"type": "mrr_at_1", "value": 11.587}, {"type": "mrr_at_10", "value": 19.75}, {"type": "mrr_at_100", "value": 20.855}, {"type": "mrr_at_1000", "value": 20.929000000000002}, {"type": "mrr_at_20", "value": 20.377000000000002}, {"type": "mrr_at_3", "value": 16.733999999999998}, {"type": "mrr_at_5", "value": 18.422}, {"type": "ndcg_at_1", "value": 11.559}, {"type": "ndcg_at_10", "value": 23.25}, {"type": "ndcg_at_100", "value": 29.364}, {"type": "ndcg_at_1000", "value": 31.775}, {"type": "ndcg_at_20", "value": 25.56}, {"type": "ndcg_at_3", "value": 17.052}, {"type": "ndcg_at_5", "value": 19.98}, {"type": "precision_at_1", "value": 11.559}, {"type": "precision_at_10", "value": 4.447}, {"type": "precision_at_100", "value": 0.796}, {"type": "precision_at_1000", "value": 0.10200000000000001}, {"type": "precision_at_20", 
"value": 2.762}, {"type": "precision_at_3", "value": 8.14}, {"type": "precision_at_5", "value": 6.524000000000001}, {"type": "recall_at_1", "value": 10.116999999999999}, {"type": "recall_at_10", "value": 37.736999999999995}, {"type": "recall_at_100", "value": 65.998}, {"type": "recall_at_1000", "value": 84.533}, {"type": "recall_at_20", "value": 46.43}, {"type": "recall_at_3", "value": 21.282}, {"type": "recall_at_5", "value": 28.1}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB QuoraRetrieval", "type": "mteb/quora", "config": "default", "split": "test", "revision": "e4e08e0b7dbe3c8700f0daef558ff32256715259"}, "metrics": [{"type": "map_at_1", "value": 64.706}, {"type": "map_at_10", "value": 77.777}, {"type": "map_at_100", "value": 78.509}, {"type": "map_at_1000", "value": 78.537}, {"type": "map_at_20", "value": 78.237}, {"type": "map_at_3", "value": 74.802}, {"type": "map_at_5", "value": 76.655}, {"type": "mrr_at_1", "value": 74.62}, {"type": "mrr_at_10", "value": 81.817}, {"type": "mrr_at_100", "value": 82.021}, {"type": "mrr_at_1000", "value": 82.025}, {"type": "mrr_at_20", "value": 81.962}, {"type": "mrr_at_3", "value": 80.452}, {"type": "mrr_at_5", "value": 81.352}, {"type": "ndcg_at_1", "value": 74.64}, {"type": "ndcg_at_10", "value": 82.30499999999999}, {"type": "ndcg_at_100", "value": 84.21}, {"type": "ndcg_at_1000", "value": 84.505}, {"type": "ndcg_at_20", "value": 83.255}, {"type": "ndcg_at_3", "value": 78.851}, {"type": "ndcg_at_5", "value": 80.72200000000001}, {"type": "precision_at_1", "value": 74.64}, {"type": "precision_at_10", "value": 12.457}, {"type": "precision_at_100", "value": 1.473}, {"type": "precision_at_1000", "value": 0.155}, {"type": "precision_at_20", "value": 6.677}, {"type": "precision_at_3", "value": 34.29}, {"type": "precision_at_5", "value": 22.7}, {"type": "recall_at_1", "value": 64.706}, {"type": "recall_at_10", "value": 91.01}, {"type": "recall_at_100", "value": 98.039}, {"type": "recall_at_1000", "value": 99.66000000000001}, {"type": "recall_at_20", "value": 94.184}, {"type": "recall_at_3", "value": 81.12700000000001}, {"type": "recall_at_5", "value": 86.319}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB RedditClustering", "type": "mteb/reddit-clustering", "config": "default", "split": "test", "revision": "24640382cdbf8abc73003fb0fa6d111a705499eb"}, "metrics": [{"type": "v_measure", "value": 35.92118583596968}, {"type": "v_measures", "value": [0.36348286190285317, 0.41972047648086824, 0.3157847416716583, 0.2924797675615204, 0.36261138426299355, 0.32847874534659877, 0.3889547507270807, 0.3127510849003159, 0.3307172377975423, 0.302750283797456, 0.32237517082958256, 0.40638708346483654, 0.3611211245185695, 0.3833760828081467, 0.5064461714989106, 0.3079898927539046, 0.39678017551052513, 0.43921575409868097, 0.3495794570559578, 0.30573013564346774, 0.32165840125624, 0.3272115833286129, 0.48520494325401664, 0.33563783213490467, 0.31385131638717517, 0.36348286190285317, 0.41972047648086824, 0.3157847416716583, 0.2924797675615204, 0.36261138426299355, 0.32847874534659877, 0.3889547507270807, 0.3127510849003159, 0.3307172377975423, 0.302750283797456, 0.32237517082958256, 0.40638708346483654, 0.3611211245185695, 0.3833760828081467, 0.5064461714989106, 0.3079898927539046, 0.39678017551052513, 0.43921575409868097, 0.3495794570559578, 0.30573013564346774, 0.32165840125624, 0.3272115833286129, 0.48520494325401664, 0.33563783213490467, 0.31385131638717517, 0.36348286190285317, 0.41972047648086824, 0.3157847416716583, 0.2924797675615204, 
0.32165840125624, 0.3272115833286129, 0.48520494325401664, 0.33563783213490467, 0.31385131638717517, 0.36348286190285317, 0.41972047648086824, 0.3157847416716583, 0.2924797675615204, 0.36261138426299355, 0.32847874534659877, 0.3889547507270807, 0.3127510849003159, 0.3307172377975423, 0.302750283797456, 0.32237517082958256, 0.40638708346483654, 0.3611211245185695, 0.3833760828081467, 0.5064461714989106, 0.3079898927539046, 0.39678017551052513, 0.43921575409868097, 0.3495794570559578, 0.30573013564346774, 0.32165840125624, 0.3272115833286129, 0.48520494325401664, 0.33563783213490467, 0.31385131638717517, 0.36348286190285317, 0.41972047648086824, 0.3157847416716583, 0.2924797675615204, 0.36261138426299355, 0.32847874534659877, 0.3889547507270807, 0.3127510849003159, 0.3307172377975423, 0.302750283797456, 0.32237517082958256, 0.40638708346483654, 0.3611211245185695, 0.3833760828081467, 0.5064461714989106, 0.3079898927539046, 0.39678017551052513, 0.43921575409868097, 0.3495794570559578, 0.30573013564346774, 0.32165840125624, 0.3272115833286129, 0.48520494325401664, 0.33563783213490467, 0.31385131638717517, 0.36348286190285317, 0.41972047648086824, 0.3157847416716583, 0.2924797675615204, 0.36261138426299355, 0.32847874534659877, 0.3889547507270807, 0.3127510849003159, 0.3307172377975423, 0.302750283797456, 0.32237517082958256, 0.40638708346483654, 0.3611211245185695, 0.3833760828081467, 0.5064461714989106, 0.3079898927539046, 0.39678017551052513, 0.43921575409868097, 0.3495794570559578, 0.30573013564346774, 0.32165840125624, 0.3272115833286129, 0.48520494325401664, 0.33563783213490467, 0.31385131638717517, 0.36348286190285317, 0.41972047648086824, 0.3157847416716583, 0.2924797675615204, 0.36261138426299355, 0.32847874534659877, 0.3889547507270807, 0.3127510849003159, 0.3307172377975423, 0.302750283797456, 0.32237517082958256, 0.40638708346483654, 0.3611211245185695, 0.3833760828081467, 0.5064461714989106, 0.3079898927539046, 0.39678017551052513, 0.43921575409868097, 0.3495794570559578, 0.30573013564346774, 0.32165840125624, 0.3272115833286129, 0.48520494325401664, 0.33563783213490467, 0.31385131638717517, 0.36348286190285317, 0.41972047648086824, 0.3157847416716583, 0.2924797675615204, 0.36261138426299355, 0.32847874534659877, 0.3889547507270807, 0.3127510849003159, 0.3307172377975423, 0.302750283797456, 0.32237517082958256, 0.40638708346483654, 0.3611211245185695, 0.3833760828081467, 0.5064461714989106, 0.3079898927539046, 0.39678017551052513, 0.43921575409868097, 0.3495794570559578, 0.30573013564346774, 0.32165840125624, 0.3272115833286129, 0.48520494325401664, 0.33563783213490467, 0.31385131638717517, 0.36348286190285317, 0.41972047648086824, 0.3157847416716583, 0.2924797675615204, 0.36261138426299355, 0.32847874534659877, 0.3889547507270807, 0.3127510849003159, 0.3307172377975423, 0.302750283797456, 0.32237517082958256, 0.40638708346483654, 0.3611211245185695, 0.3833760828081467, 0.5064461714989106, 0.3079898927539046, 0.39678017551052513, 0.43921575409868097, 0.3495794570559578, 0.30573013564346774, 0.32165840125624, 0.3272115833286129, 0.48520494325401664, 0.33563783213490467, 0.31385131638717517, 0.36348286190285317, 0.41972047648086824, 0.3157847416716583, 0.2924797675615204, 0.36261138426299355, 0.32847874534659877, 0.3889547507270807, 0.3127510849003159, 0.3307172377975423, 0.302750283797456, 0.32237517082958256, 0.40638708346483654, 0.3611211245185695, 0.3833760828081467, 0.5064461714989106, 0.3079898927539046, 0.39678017551052513, 0.43921575409868097, 0.3495794570559578, 
0.30573013564346774, 0.32165840125624, 0.3272115833286129, 0.48520494325401664, 0.33563783213490467, 0.31385131638717517, 0.36348286190285317, 0.41972047648086824, 0.3157847416716583, 0.2924797675615204, 0.36261138426299355, 0.32847874534659877, 0.3889547507270807, 0.3127510849003159, 0.3307172377975423, 0.302750283797456, 0.32237517082958256, 0.40638708346483654, 0.3611211245185695, 0.3833760828081467, 0.5064461714989106, 0.3079898927539046, 0.39678017551052513, 0.43921575409868097, 0.3495794570559578, 0.30573013564346774, 0.32165840125624, 0.3272115833286129, 0.48520494325401664, 0.33563783213490467, 0.31385131638717517, 0.36348286190285317, 0.41972047648086824, 0.3157847416716583, 0.2924797675615204, 0.36261138426299355, 0.32847874534659877, 0.3889547507270807, 0.3127510849003159, 0.3307172377975423, 0.302750283797456, 0.32237517082958256, 0.40638708346483654, 0.3611211245185695, 0.3833760828081467, 0.5064461714989106, 0.3079898927539046, 0.39678017551052513, 0.43921575409868097, 0.3495794570559578, 0.30573013564346774, 0.32165840125624, 0.3272115833286129, 0.48520494325401664, 0.33563783213490467, 0.31385131638717517, 0.36348286190285317, 0.41972047648086824, 0.3157847416716583, 0.2924797675615204, 0.36261138426299355, 0.32847874534659877, 0.3889547507270807, 0.3127510849003159, 0.3307172377975423, 0.302750283797456, 0.32237517082958256, 0.40638708346483654, 0.3611211245185695, 0.3833760828081467, 0.5064461714989106, 0.3079898927539046, 0.39678017551052513, 0.43921575409868097, 0.3495794570559578, 0.30573013564346774, 0.32165840125624, 0.3272115833286129, 0.48520494325401664, 0.33563783213490467, 0.31385131638717517, 0.36348286190285317, 0.41972047648086824, 0.3157847416716583, 0.2924797675615204, 0.36261138426299355, 0.32847874534659877, 0.3889547507270807, 0.3127510849003159, 0.3307172377975423, 0.302750283797456, 0.32237517082958256, 0.40638708346483654, 0.3611211245185695, 0.3833760828081467, 0.5064461714989106, 0.3079898927539046, 0.39678017551052513, 0.43921575409868097, 0.3495794570559578, 0.30573013564346774, 0.32165840125624, 0.3272115833286129, 0.48520494325401664, 0.33563783213490467, 0.31385131638717517, 0.36348286190285317, 0.41972047648086824, 0.3157847416716583, 0.2924797675615204, 0.36261138426299355, 0.32847874534659877, 0.3889547507270807, 0.3127510849003159, 0.3307172377975423, 0.302750283797456, 0.32237517082958256, 0.40638708346483654, 0.3611211245185695, 0.3833760828081467, 0.5064461714989106, 0.3079898927539046, 0.39678017551052513, 0.43921575409868097, 0.3495794570559578, 0.30573013564346774, 0.32165840125624, 0.3272115833286129, 0.48520494325401664, 0.33563783213490467, 0.31385131638717517, 0.36348286190285317, 0.41972047648086824, 0.3157847416716583, 0.2924797675615204, 0.36261138426299355, 0.32847874534659877, 0.3889547507270807, 0.3127510849003159, 0.3307172377975423, 0.302750283797456, 0.32237517082958256, 0.40638708346483654, 0.3611211245185695, 0.3833760828081467, 0.5064461714989106, 0.3079898927539046, 0.39678017551052513, 0.43921575409868097, 0.3495794570559578, 0.30573013564346774, 0.32165840125624, 0.3272115833286129, 0.48520494325401664, 0.33563783213490467, 0.31385131638717517, 0.36348286190285317, 0.41972047648086824, 0.3157847416716583, 0.2924797675615204, 0.36261138426299355, 0.32847874534659877, 0.3889547507270807, 0.3127510849003159, 0.3307172377975423, 0.302750283797456, 0.32237517082958256, 0.40638708346483654, 0.3611211245185695, 0.3833760828081467, 0.5064461714989106, 0.3079898927539046, 0.39678017551052513, 0.43921575409868097, 
0.3495794570559578, 0.30573013564346774, 0.32165840125624, 0.3272115833286129, 0.48520494325401664, 0.33563783213490467, 0.31385131638717517, 0.36348286190285317, 0.41972047648086824, 0.3157847416716583, 0.2924797675615204, 0.36261138426299355, 0.32847874534659877, 0.3889547507270807, 0.3127510849003159, 0.3307172377975423, 0.302750283797456, 0.32237517082958256, 0.40638708346483654, 0.3611211245185695, 0.3833760828081467, 0.5064461714989106, 0.3079898927539046, 0.39678017551052513, 0.43921575409868097, 0.3495794570559578, 0.30573013564346774, 0.32165840125624, 0.3272115833286129, 0.48520494325401664, 0.33563783213490467, 0.31385131638717517, 0.36348286190285317, 0.41972047648086824, 0.3157847416716583, 0.2924797675615204, 0.36261138426299355, 0.32847874534659877, 0.3889547507270807, 0.3127510849003159, 0.3307172377975423, 0.302750283797456, 0.32237517082958256, 0.40638708346483654, 0.3611211245185695, 0.3833760828081467, 0.5064461714989106, 0.3079898927539046, 0.39678017551052513, 0.43921575409868097, 0.3495794570559578, 0.30573013564346774, 0.32165840125624, 0.3272115833286129, 0.48520494325401664, 0.33563783213490467, 0.31385131638717517, 0.36348286190285317, 0.41972047648086824, 0.3157847416716583, 0.2924797675615204, 0.36261138426299355, 0.32847874534659877, 0.3889547507270807, 0.3127510849003159, 0.3307172377975423, 0.302750283797456, 0.32237517082958256, 0.40638708346483654, 0.3611211245185695, 0.3833760828081467, 0.5064461714989106, 0.3079898927539046, 0.39678017551052513, 0.43921575409868097, 0.3495794570559578, 0.30573013564346774, 0.32165840125624, 0.3272115833286129, 0.48520494325401664, 0.33563783213490467, 0.31385131638717517, 0.36348286190285317, 0.41972047648086824, 0.3157847416716583, 0.2924797675615204, 0.36261138426299355, 0.32847874534659877, 0.3889547507270807, 0.3127510849003159, 0.3307172377975423, 0.302750283797456, 0.32237517082958256, 0.40638708346483654, 0.3611211245185695, 0.3833760828081467, 0.5064461714989106, 0.3079898927539046, 0.39678017551052513, 0.43921575409868097, 0.3495794570559578, 0.30573013564346774, 0.32165840125624, 0.3272115833286129, 0.48520494325401664, 0.33563783213490467, 0.31385131638717517, 0.36348286190285317, 0.41972047648086824, 0.3157847416716583, 0.2924797675615204, 0.36261138426299355, 0.32847874534659877, 0.3889547507270807, 0.3127510849003159, 0.3307172377975423, 0.302750283797456, 0.32237517082958256, 0.40638708346483654, 0.3611211245185695, 0.3833760828081467, 0.5064461714989106, 0.3079898927539046, 0.39678017551052513, 0.43921575409868097, 0.3495794570559578, 0.30573013564346774, 0.32165840125624, 0.3272115833286129, 0.48520494325401664, 0.33563783213490467, 0.31385131638717517, 0.36348286190285317, 0.41972047648086824, 0.3157847416716583, 0.2924797675615204, 0.36261138426299355, 0.32847874534659877, 0.3889547507270807, 0.3127510849003159, 0.3307172377975423, 0.302750283797456, 0.32237517082958256, 0.40638708346483654, 0.3611211245185695, 0.3833760828081467, 0.5064461714989106, 0.3079898927539046, 0.39678017551052513, 0.43921575409868097, 0.3495794570559578, 0.30573013564346774, 0.32165840125624, 0.3272115833286129, 0.48520494325401664, 0.33563783213490467, 0.31385131638717517, 0.36348286190285317, 0.41972047648086824, 0.3157847416716583, 0.2924797675615204, 0.36261138426299355, 0.32847874534659877, 0.3889547507270807, 0.3127510849003159, 0.3307172377975423, 0.302750283797456, 0.32237517082958256, 0.40638708346483654, 0.3611211245185695, 0.3833760828081467, 0.5064461714989106, 0.3079898927539046, 0.39678017551052513, 
0.43921575409868097, 0.3495794570559578, 0.30573013564346774, 0.32165840125624, 0.3272115833286129, 0.48520494325401664, 0.33563783213490467, 0.31385131638717517, 0.36348286190285317, 0.41972047648086824, 0.3157847416716583, 0.2924797675615204, 0.36261138426299355, 0.32847874534659877, 0.3889547507270807, 0.3127510849003159, 0.3307172377975423, 0.302750283797456, 0.32237517082958256, 0.40638708346483654, 0.3611211245185695, 0.3833760828081467, 0.5064461714989106, 0.3079898927539046, 0.39678017551052513, 0.43921575409868097, 0.3495794570559578, 0.30573013564346774, 0.32165840125624, 0.3272115833286129, 0.48520494325401664, 0.33563783213490467, 0.31385131638717517, 0.36348286190285317, 0.41972047648086824, 0.3157847416716583, 0.2924797675615204, 0.36261138426299355, 0.32847874534659877, 0.3889547507270807, 0.3127510849003159, 0.3307172377975423, 0.302750283797456, 0.32237517082958256, 0.40638708346483654, 0.3611211245185695, 0.3833760828081467, 0.5064461714989106, 0.3079898927539046, 0.39678017551052513, 0.43921575409868097, 0.3495794570559578, 0.30573013564346774, 0.32165840125624, 0.3272115833286129, 0.48520494325401664, 0.33563783213490467, 0.31385131638717517, 0.36348286190285317, 0.41972047648086824, 0.3157847416716583, 0.2924797675615204, 0.36261138426299355, 0.32847874534659877, 0.3889547507270807, 0.3127510849003159, 0.3307172377975423, 0.302750283797456, 0.32237517082958256, 0.40638708346483654, 0.3611211245185695, 0.3833760828081467, 0.5064461714989106, 0.3079898927539046, 0.39678017551052513, 0.43921575409868097, 0.3495794570559578, 0.30573013564346774, 0.32165840125624, 0.3272115833286129, 0.48520494325401664, 0.33563783213490467, 0.31385131638717517, 0.36348286190285317, 0.41972047648086824, 0.3157847416716583, 0.2924797675615204, 0.36261138426299355, 0.32847874534659877, 0.3889547507270807, 0.3127510849003159, 0.3307172377975423, 0.302750283797456, 0.32237517082958256, 0.40638708346483654, 0.3611211245185695, 0.3833760828081467, 0.5064461714989106, 0.3079898927539046, 0.39678017551052513, 0.43921575409868097, 0.3495794570559578, 0.30573013564346774, 0.32165840125624, 0.3272115833286129, 0.48520494325401664, 0.33563783213490467, 0.31385131638717517, 0.36348286190285317, 0.41972047648086824, 0.3157847416716583, 0.2924797675615204, 0.36261138426299355, 0.32847874534659877, 0.3889547507270807, 0.3127510849003159, 0.3307172377975423, 0.302750283797456, 0.32237517082958256, 0.40638708346483654, 0.3611211245185695, 0.3833760828081467, 0.5064461714989106, 0.3079898927539046, 0.39678017551052513, 0.43921575409868097, 0.3495794570559578, 0.30573013564346774, 0.32165840125624, 0.3272115833286129, 0.48520494325401664, 0.33563783213490467, 0.31385131638717517, 0.36348286190285317, 0.41972047648086824, 0.3157847416716583, 0.2924797675615204, 0.36261138426299355, 0.32847874534659877, 0.3889547507270807, 0.3127510849003159, 0.3307172377975423, 0.302750283797456, 0.32237517082958256, 0.40638708346483654, 0.3611211245185695, 0.3833760828081467, 0.5064461714989106, 0.3079898927539046, 0.39678017551052513, 0.43921575409868097, 0.3495794570559578, 0.30573013564346774, 0.32165840125624, 0.3272115833286129, 0.48520494325401664, 0.33563783213490467, 0.31385131638717517, 0.36348286190285317, 0.41972047648086824, 0.3157847416716583, 0.2924797675615204, 0.36261138426299355, 0.32847874534659877, 0.3889547507270807, 0.3127510849003159, 0.3307172377975423, 0.302750283797456, 0.32237517082958256, 0.40638708346483654, 0.3611211245185695, 0.3833760828081467, 0.5064461714989106, 0.3079898927539046, 
0.39678017551052513, 0.43921575409868097, 0.3495794570559578, 0.30573013564346774, 0.32165840125624, 0.3272115833286129, 0.48520494325401664, 0.33563783213490467, 0.31385131638717517, 0.36348286190285317, 0.41972047648086824, 0.3157847416716583, 0.2924797675615204, 0.36261138426299355, 0.32847874534659877, 0.3889547507270807, 0.3127510849003159, 0.3307172377975423, 0.302750283797456, 0.32237517082958256, 0.40638708346483654, 0.3611211245185695, 0.3833760828081467, 0.5064461714989106, 0.3079898927539046, 0.39678017551052513, 0.43921575409868097, 0.3495794570559578, 0.30573013564346774, 0.32165840125624, 0.3272115833286129, 0.48520494325401664, 0.33563783213490467, 0.31385131638717517, 0.36348286190285317, 0.41972047648086824, 0.3157847416716583, 0.2924797675615204, 0.36261138426299355, 0.32847874534659877, 0.3889547507270807, 0.3127510849003159, 0.3307172377975423, 0.302750283797456, 0.32237517082958256, 0.40638708346483654, 0.3611211245185695, 0.3833760828081467, 0.5064461714989106, 0.3079898927539046, 0.39678017551052513, 0.43921575409868097, 0.3495794570559578, 0.30573013564346774, 0.32165840125624, 0.3272115833286129, 0.48520494325401664, 0.33563783213490467, 0.31385131638717517, 0.36348286190285317, 0.41972047648086824, 0.3157847416716583, 0.2924797675615204, 0.36261138426299355, 0.32847874534659877, 0.3889547507270807, 0.3127510849003159, 0.3307172377975423, 0.302750283797456, 0.32237517082958256, 0.40638708346483654, 0.3611211245185695, 0.3833760828081467, 0.5064461714989106, 0.3079898927539046, 0.39678017551052513, 0.43921575409868097, 0.3495794570559578, 0.30573013564346774, 0.32165840125624, 0.3272115833286129, 0.48520494325401664, 0.33563783213490467, 0.31385131638717517, 0.36348286190285317, 0.41972047648086824, 0.3157847416716583, 0.2924797675615204, 0.36261138426299355, 0.32847874534659877, 0.3889547507270807, 0.3127510849003159, 0.3307172377975423, 0.302750283797456, 0.32237517082958256, 0.40638708346483654, 0.3611211245185695, 0.3833760828081467, 0.5064461714989106, 0.3079898927539046, 0.39678017551052513, 0.43921575409868097, 0.3495794570559578, 0.30573013564346774, 0.32165840125624, 0.3272115833286129, 0.48520494325401664, 0.33563783213490467, 0.31385131638717517, 0.36348286190285317, 0.41972047648086824, 0.3157847416716583, 0.2924797675615204, 0.36261138426299355, 0.32847874534659877, 0.3889547507270807, 0.3127510849003159, 0.3307172377975423, 0.302750283797456, 0.32237517082958256, 0.40638708346483654, 0.3611211245185695, 0.3833760828081467, 0.5064461714989106, 0.3079898927539046, 0.39678017551052513, 0.43921575409868097, 0.3495794570559578, 0.30573013564346774, 0.32165840125624, 0.3272115833286129, 0.48520494325401664, 0.33563783213490467, 0.31385131638717517, 0.36348286190285317, 0.41972047648086824, 0.3157847416716583, 0.2924797675615204, 0.36261138426299355, 0.32847874534659877, 0.3889547507270807, 0.3127510849003159, 0.3307172377975423, 0.302750283797456, 0.32237517082958256, 0.40638708346483654, 0.3611211245185695, 0.3833760828081467, 0.5064461714989106, 0.3079898927539046, 0.39678017551052513, 0.43921575409868097, 0.3495794570559578, 0.30573013564346774, 0.32165840125624, 0.3272115833286129, 0.48520494325401664, 0.33563783213490467, 0.31385131638717517, 0.36348286190285317, 0.41972047648086824, 0.3157847416716583, 0.2924797675615204, 0.36261138426299355, 0.32847874534659877, 0.3889547507270807, 0.3127510849003159, 0.3307172377975423, 0.302750283797456, 0.32237517082958256, 0.40638708346483654, 0.3611211245185695, 0.3833760828081467, 0.5064461714989106, 
0.3079898927539046, 0.39678017551052513, 0.43921575409868097, 0.3495794570559578, 0.30573013564346774, 0.32165840125624, 0.3272115833286129, 0.48520494325401664, 0.33563783213490467, 0.31385131638717517]}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB RedditClusteringP2P", "type": "mteb/reddit-clustering-p2p", "config": "default", "split": "test", "revision": "385e3cb46b4cfa89021f56c4380204149d0efe33"}, "metrics": [{"type": "v_measure", "value": 46.08479450077311}, {"type": "v_measures", "value": [0.5265304787752646, 0.5359671631118251, 0.5151264848317407, 0.2651162836277955, 0.5088505797958291, 0.4624679441261622, 0.21416898427604442, 0.5365759379838768, 0.49779861038453793, 0.5458769831642339, 0.5265304787752646, 0.5359671631118251, 0.5151264848317407, 0.2651162836277955, 0.5088505797958291, 0.4624679441261622, 0.21416898427604442, 0.5365759379838768, 0.49779861038453793, 0.5458769831642339, 0.5265304787752646, 0.5359671631118251, 0.5151264848317407, 0.2651162836277955, 0.5088505797958291, 0.4624679441261622, 0.21416898427604442, 0.5365759379838768, 0.49779861038453793, 0.5458769831642339, 0.5265304787752646, 0.5359671631118251, 0.5151264848317407, 0.2651162836277955, 0.5088505797958291, 0.4624679441261622, 0.21416898427604442, 0.5365759379838768, 0.49779861038453793, 0.5458769831642339, 0.5265304787752646, 0.5359671631118251, 0.5151264848317407, 0.2651162836277955, 0.5088505797958291, 0.4624679441261622, 0.21416898427604442, 0.5365759379838768, 0.49779861038453793, 0.5458769831642339, 0.5265304787752646, 0.5359671631118251, 0.5151264848317407, 0.2651162836277955, 0.5088505797958291, 0.4624679441261622, 0.21416898427604442, 0.5365759379838768, 0.49779861038453793, 0.5458769831642339, 0.5265304787752646, 0.5359671631118251, 0.5151264848317407, 0.2651162836277955, 0.5088505797958291, 0.4624679441261622, 0.21416898427604442, 0.5365759379838768, 0.49779861038453793, 0.5458769831642339, 0.5265304787752646, 0.5359671631118251, 0.5151264848317407, 0.2651162836277955, 0.5088505797958291, 0.4624679441261622, 0.21416898427604442, 0.5365759379838768, 0.49779861038453793, 0.5458769831642339, 0.5265304787752646, 0.5359671631118251, 0.5151264848317407, 0.2651162836277955, 0.5088505797958291, 0.4624679441261622, 0.21416898427604442, 0.5365759379838768, 0.49779861038453793, 0.5458769831642339, 0.5265304787752646, 0.5359671631118251, 0.5151264848317407, 0.2651162836277955, 0.5088505797958291, 0.4624679441261622, 0.21416898427604442, 0.5365759379838768, 0.49779861038453793, 0.5458769831642339, 0.5265304787752646, 0.5359671631118251, 0.5151264848317407, 0.2651162836277955, 0.5088505797958291, 0.4624679441261622, 0.21416898427604442, 0.5365759379838768, 0.49779861038453793, 0.5458769831642339, 0.5265304787752646, 0.5359671631118251, 0.5151264848317407, 0.2651162836277955, 0.5088505797958291, 0.4624679441261622, 0.21416898427604442, 0.5365759379838768, 0.49779861038453793, 0.5458769831642339, 0.5265304787752646, 0.5359671631118251, 0.5151264848317407, 0.2651162836277955, 0.5088505797958291, 0.4624679441261622, 0.21416898427604442, 0.5365759379838768, 0.49779861038453793, 0.5458769831642339, 0.5265304787752646, 0.5359671631118251, 0.5151264848317407, 0.2651162836277955, 0.5088505797958291, 0.4624679441261622, 0.21416898427604442, 0.5365759379838768, 0.49779861038453793, 0.5458769831642339, 0.5265304787752646, 0.5359671631118251, 0.5151264848317407, 0.2651162836277955, 0.5088505797958291, 0.4624679441261622, 0.21416898427604442, 0.5365759379838768, 0.49779861038453793, 0.5458769831642339, 
{"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB SCIDOCS", "type": "mteb/scidocs", "config": "default", "split": "test", "revision": "f8c2fcf00f625baaa80f62ec5bd9e1fff3b8ae88"}, "metrics": [{"type": "map_at_1", "value": 2.823}, {"type": "map_at_10", "value": 6.162999999999999}, {"type": "map_at_100", "value": 7.462000000000001}, {"type": "map_at_1000", "value": 7.707}, {"type": "map_at_20", "value": 6.7989999999999995}, {"type": "map_at_3", "value": 4.614}, {"type": "map_at_5", "value": 5.221}, {"type": "mrr_at_1", "value": 13.8}, {"type": "mrr_at_10", "value": 
20.317}, {"type": "mrr_at_100", "value": 21.495}, {"type": "mrr_at_1000", "value": 21.609}, {"type": "mrr_at_20", "value": 21.038999999999998}, {"type": "mrr_at_3", "value": 17.916999999999998}, {"type": "mrr_at_5", "value": 19.047}, {"type": "ndcg_at_1", "value": 13.8}, {"type": "ndcg_at_10", "value": 11.124}, {"type": "ndcg_at_100", "value": 17.058}, {"type": "ndcg_at_1000", "value": 22.584}, {"type": "ndcg_at_20", "value": 13.165}, {"type": "ndcg_at_3", "value": 10.453999999999999}, {"type": "ndcg_at_5", "value": 8.844000000000001}, {"type": "precision_at_1", "value": 13.8}, {"type": "precision_at_10", "value": 5.800000000000001}, {"type": "precision_at_100", "value": 1.443}, {"type": "precision_at_1000", "value": 0.27899999999999997}, {"type": "precision_at_20", "value": 4.08}, {"type": "precision_at_3", "value": 9.5}, {"type": "precision_at_5", "value": 7.42}, {"type": "recall_at_1", "value": 2.823}, {"type": "recall_at_10", "value": 11.790000000000001}, {"type": "recall_at_100", "value": 29.282000000000004}, {"type": "recall_at_1000", "value": 56.720000000000006}, {"type": "recall_at_20", "value": 16.54}, {"type": "recall_at_3", "value": 5.808}, {"type": "recall_at_5", "value": 7.548000000000001}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB SICK-R", "type": "mteb/sickr-sts", "config": "default", "split": "test", "revision": "20a6d6f312dd54037fe07a32d58e5e168867909d"}, "metrics": [{"type": "cos_sim_pearson", "value": 82.35546558185588}, {"type": "cos_sim_spearman", "value": 78.23859592249686}, {"type": "euclidean_pearson", "value": 79.98024519769696}, {"type": "euclidean_spearman", "value": 78.23859183509182}, {"type": "manhattan_pearson", "value": 79.89939470434149}, {"type": "manhattan_spearman", "value": 78.14002412024936}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS12", "type": "mteb/sts12-sts", "config": "default", "split": "test", "revision": "a0d554a64d88156834ff5ae9920b964011b16384"}, "metrics": [{"type": "cos_sim_pearson", "value": 82.77892045885623}, {"type": "cos_sim_spearman", "value": 75.1886741501174}, {"type": "euclidean_pearson", "value": 79.25545188379738}, {"type": "euclidean_spearman", "value": 75.18638344905548}, {"type": "manhattan_pearson", "value": 79.22653149623625}, {"type": "manhattan_spearman", "value": 75.27810415336305}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS13", "type": "mteb/sts13-sts", "config": "default", "split": "test", "revision": "7e90230a92c190f1bf69ae9002b8cea547a64cca"}, "metrics": [{"type": "cos_sim_pearson", "value": 77.0780386627305}, {"type": "cos_sim_spearman", "value": 79.33304952540263}, {"type": "euclidean_pearson", "value": 78.8995877109086}, {"type": "euclidean_spearman", "value": 79.33304952540263}, {"type": "manhattan_pearson", "value": 78.53767885744242}, {"type": "manhattan_spearman", "value": 78.98963272082919}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS14", "type": "mteb/sts14-sts", "config": "default", "split": "test", "revision": "6031580fec1f6af667f0bd2da0a551cf4f0b2375"}, "metrics": [{"type": "cos_sim_pearson", "value": 77.40102517193851}, {"type": "cos_sim_spearman", "value": 76.56213113240312}, {"type": "euclidean_pearson", "value": 77.28763789251809}, {"type": "euclidean_spearman", "value": 76.56214567337607}, {"type": "manhattan_pearson", "value": 77.07003484382906}, {"type": "manhattan_spearman", "value": 76.42170507923466}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS15", "type": "mteb/sts15-sts", "config": "default", "split": "test", "revision": 
"ae752c7c21bf194d8b67fd573edf7ae58183cbe3"}, "metrics": [{"type": "cos_sim_pearson", "value": 83.08619018791619}, {"type": "cos_sim_spearman", "value": 84.7000298638952}, {"type": "euclidean_pearson", "value": 84.45835118534818}, {"type": "euclidean_spearman", "value": 84.7000136316961}, {"type": "manhattan_pearson", "value": 84.49026098485562}, {"type": "manhattan_spearman", "value": 84.7341511290005}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS16", "type": "mteb/sts16-sts", "config": "default", "split": "test", "revision": "4d8694f8f0e0100860b497b999b3dbed754a0513"}, "metrics": [{"type": "cos_sim_pearson", "value": 78.16153099155702}, {"type": "cos_sim_spearman", "value": 81.43851932231388}, {"type": "euclidean_pearson", "value": 80.64566170494548}, {"type": "euclidean_spearman", "value": 81.43851888295582}, {"type": "manhattan_pearson", "value": 80.60043965519766}, {"type": "manhattan_spearman", "value": 81.39436114361187}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS17 (en-en)", "type": "mteb/sts17-crosslingual-sts", "config": "en-en", "split": "test", "revision": "af5e6fb845001ecf41f4c1e033ce921939a2a68d"}, "metrics": [{"type": "cos_sim_pearson", "value": 85.79691929686385}, {"type": "cos_sim_spearman", "value": 86.61476790521185}, {"type": "euclidean_pearson", "value": 87.19188107234186}, {"type": "euclidean_spearman", "value": 86.61476790521185}, {"type": "manhattan_pearson", "value": 87.1048361434476}, {"type": "manhattan_spearman", "value": 86.62564632760721}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STS22 (en)", "type": "mteb/sts22-crosslingual-sts", "config": "en", "split": "test", "revision": "eea2b4fe26a775864c896887d910b76a8098ad3f"}, "metrics": [{"type": "cos_sim_pearson", "value": 57.47315801345834}, {"type": "cos_sim_spearman", "value": 63.42561529682427}, {"type": "euclidean_pearson", "value": 61.72162209797075}, {"type": "euclidean_spearman", "value": 63.42561529682427}, {"type": "manhattan_pearson", "value": 61.90168887814704}, {"type": "manhattan_spearman", "value": 63.750754243527155}]}, {"task": {"type": "STS"}, "dataset": {"name": "MTEB STSBenchmark", "type": "mteb/stsbenchmark-sts", "config": "default", "split": "test", "revision": "b0fddb56ed78048fa8b90373c8a3cfc37b684831"}, "metrics": [{"type": "cos_sim_pearson", "value": 79.85854385132735}, {"type": "cos_sim_spearman", "value": 81.7934403165178}, {"type": "euclidean_pearson", "value": 81.76737446129472}, {"type": "euclidean_spearman", "value": 81.79344583519841}, {"type": "manhattan_pearson", "value": 81.51600708713269}, {"type": "manhattan_spearman", "value": 81.5208648976934}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB SciDocsRR", "type": "mteb/scidocs-reranking", "config": "default", "split": "test", "revision": "d3c5e1fc0b855ab6097bf1cda04dd73947d7caab"}, "metrics": [{"type": "map", "value": 74.47725483163819}, {"type": "mrr", "value": 91.68947066005887}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB SciFact", "type": "mteb/scifact", "config": "default", "split": "test", "revision": "0228b52cf27578f30900b9e5271d331663a030d7"}, "metrics": [{"type": "map_at_1", "value": 36.75}, {"type": "map_at_10", "value": 45.448}, {"type": "map_at_100", "value": 46.25}, {"type": "map_at_1000", "value": 46.333}, {"type": "map_at_20", "value": 45.965}, {"type": "map_at_3", "value": 42.848000000000006}, {"type": "map_at_5", "value": 44.098}, {"type": "mrr_at_1", "value": 39.0}, {"type": "mrr_at_10", "value": 46.916000000000004}, {"type": "mrr_at_100", 
"value": 47.61}, {"type": "mrr_at_1000", "value": 47.684}, {"type": "mrr_at_20", "value": 47.402}, {"type": "mrr_at_3", "value": 44.667}, {"type": "mrr_at_5", "value": 45.867000000000004}, {"type": "ndcg_at_1", "value": 39.0}, {"type": "ndcg_at_10", "value": 50.241}, {"type": "ndcg_at_100", "value": 53.701}, {"type": "ndcg_at_1000", "value": 55.84}, {"type": "ndcg_at_20", "value": 52.022}, {"type": "ndcg_at_3", "value": 45.248}, {"type": "ndcg_at_5", "value": 47.332}, {"type": "precision_at_1", "value": 39.0}, {"type": "precision_at_10", "value": 7.199999999999999}, {"type": "precision_at_100", "value": 0.903}, {"type": "precision_at_1000", "value": 0.108}, {"type": "precision_at_20", "value": 3.9829999999999997}, {"type": "precision_at_3", "value": 18.333}, {"type": "precision_at_5", "value": 12.2}, {"type": "recall_at_1", "value": 36.75}, {"type": "recall_at_10", "value": 63.62799999999999}, {"type": "recall_at_100", "value": 78.85600000000001}, {"type": "recall_at_1000", "value": 95.6}, {"type": "recall_at_20", "value": 70.489}, {"type": "recall_at_3", "value": 49.928}, {"type": "recall_at_5", "value": 55.161}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB SprintDuplicateQuestions", "type": "mteb/sprintduplicatequestions-pairclassification", "config": "default", "split": "test", "revision": "d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46"}, "metrics": [{"type": "cos_sim_accuracy", "value": 99.68514851485149}, {"type": "cos_sim_ap", "value": 89.84995835652664}, {"type": "cos_sim_f1", "value": 83.54037267080744}, {"type": "cos_sim_precision", "value": 86.58798283261802}, {"type": "cos_sim_recall", "value": 80.7}, {"type": "dot_accuracy", "value": 99.68514851485149}, {"type": "dot_ap", "value": 89.84995822010269}, {"type": "dot_f1", "value": 83.54037267080744}, {"type": "dot_precision", "value": 86.58798283261802}, {"type": "dot_recall", "value": 80.7}, {"type": "euclidean_accuracy", "value": 99.68514851485149}, {"type": "euclidean_ap", "value": 89.84995835652664}, {"type": "euclidean_f1", "value": 83.54037267080744}, {"type": "euclidean_precision", "value": 86.58798283261802}, {"type": "euclidean_recall", "value": 80.7}, {"type": "manhattan_accuracy", "value": 99.69504950495049}, {"type": "manhattan_ap", "value": 90.15934028795763}, {"type": "manhattan_f1", "value": 84.10256410256412}, {"type": "manhattan_precision", "value": 86.31578947368422}, {"type": "manhattan_recall", "value": 82.0}, {"type": "max_accuracy", "value": 99.69504950495049}, {"type": "max_ap", "value": 90.15934028795763}, {"type": "max_f1", "value": 84.10256410256412}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB StackExchangeClustering", "type": "mteb/stackexchange-clustering", "config": "default", "split": "test", "revision": "6cbc1f7b2bc0622f2e39d2c77fa502909748c259"}, "metrics": [{"type": "v_measure", "value": 45.30526182881455}, {"type": "v_measures", "value": [0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 
0.3950750262645659, 0.4703810042020446, 
0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 
0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 
0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 
0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 
0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 
0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 
0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 
0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 
0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 
0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 
0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 
0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713, 0.4344425345974089, 0.5135881589154038, 0.37360609992777216, 0.44322308669982985, 0.4258373836686056, 0.37002419178401297, 0.3950750262645659, 0.4703810042020446, 0.4910523906839406, 
0.45466057667041015, 0.5726496772458282, 0.4854711941992666, 0.5369040307391952, 0.47225744279530213, 0.4343396666563287, 0.43259528575250467, 0.4576684562107158, 0.4584484754489512, 0.45112618895014645, 0.4529676773831033, 0.46470509372382, 0.3859234328751357, 0.4173981863952295, 0.4703065804595455, 0.4616636149545713]}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB StackExchangeClusteringP2P", "type": "mteb/stackexchange-clustering-p2p", "config": "default", "split": "test", "revision": "815ca46b2622cec33ccafc3735d572c266efdb44"}, "metrics": [{"type": "v_measure", "value": 29.907825848421005}, {"type": "v_measures", "value": [0.2904718592534847, 0.2823816900665434, 0.28421239223298045, 0.2780948428258039, 0.28672624299995775, 0.31491216516638987, 0.3106150681150922, 0.3160504189195483, 0.31763770570244787, 0.3096801995598515, 0.2904718592534847, 0.2823816900665434, 0.28421239223298045, 0.2780948428258039, 0.28672624299995775, 0.31491216516638987, 0.3106150681150922, 0.3160504189195483, 0.31763770570244787, 0.3096801995598515, 0.2904718592534847, 0.2823816900665434, 0.28421239223298045, 0.2780948428258039, 0.28672624299995775, 0.31491216516638987, 0.3106150681150922, 0.3160504189195483, 0.31763770570244787, 0.3096801995598515, 0.2904718592534847, 0.2823816900665434, 0.28421239223298045, 0.2780948428258039, 0.28672624299995775, 0.31491216516638987, 0.3106150681150922, 0.3160504189195483, 0.31763770570244787, 0.3096801995598515, 0.2904718592534847, 0.2823816900665434, 0.28421239223298045, 0.2780948428258039, 0.28672624299995775, 0.31491216516638987, 0.3106150681150922, 0.3160504189195483, 0.31763770570244787, 0.3096801995598515, 0.2904718592534847, 0.2823816900665434, 0.28421239223298045, 0.2780948428258039, 0.28672624299995775, 0.31491216516638987, 0.3106150681150922, 0.3160504189195483, 0.31763770570244787, 0.3096801995598515, 0.2904718592534847, 0.2823816900665434, 0.28421239223298045, 0.2780948428258039, 0.28672624299995775, 0.31491216516638987, 0.3106150681150922, 0.3160504189195483, 0.31763770570244787, 0.3096801995598515, 0.2904718592534847, 0.2823816900665434, 0.28421239223298045, 0.2780948428258039, 0.28672624299995775, 0.31491216516638987, 0.3106150681150922, 0.3160504189195483, 0.31763770570244787, 0.3096801995598515, 0.2904718592534847, 0.2823816900665434, 0.28421239223298045, 0.2780948428258039, 0.28672624299995775, 0.31491216516638987, 0.3106150681150922, 0.3160504189195483, 0.31763770570244787, 0.3096801995598515, 0.2904718592534847, 0.2823816900665434, 0.28421239223298045, 0.2780948428258039, 0.28672624299995775, 0.31491216516638987, 0.3106150681150922, 0.3160504189195483, 0.31763770570244787, 0.3096801995598515, 0.2904718592534847, 0.2823816900665434, 0.28421239223298045, 0.2780948428258039, 0.28672624299995775, 0.31491216516638987, 0.3106150681150922, 0.3160504189195483, 0.31763770570244787, 0.3096801995598515, 0.2904718592534847, 0.2823816900665434, 0.28421239223298045, 0.2780948428258039, 0.28672624299995775, 0.31491216516638987, 0.3106150681150922, 0.3160504189195483, 0.31763770570244787, 0.3096801995598515, 0.2904718592534847, 0.2823816900665434, 0.28421239223298045, 0.2780948428258039, 0.28672624299995775, 0.31491216516638987, 0.3106150681150922, 0.3160504189195483, 0.31763770570244787, 0.3096801995598515, 0.2904718592534847, 0.2823816900665434, 0.28421239223298045, 0.2780948428258039, 0.28672624299995775, 0.31491216516638987, 0.3106150681150922, 0.3160504189195483, 0.31763770570244787, 0.3096801995598515, 0.2904718592534847, 0.2823816900665434, 
0.28421239223298045, 0.2780948428258039, 0.28672624299995775, 0.31491216516638987, 0.3106150681150922, 0.3160504189195483, 0.31763770570244787, 0.3096801995598515, 0.2904718592534847, 0.2823816900665434, 0.28421239223298045, 0.2780948428258039, 0.28672624299995775, 0.31491216516638987, 0.3106150681150922, 0.3160504189195483, 0.31763770570244787, 0.3096801995598515, 0.2904718592534847, 0.2823816900665434, 0.28421239223298045, 0.2780948428258039, 0.28672624299995775, 0.31491216516638987, 0.3106150681150922, 0.3160504189195483, 0.31763770570244787, 0.3096801995598515, 0.2904718592534847, 0.2823816900665434, 0.28421239223298045, 0.2780948428258039, 0.28672624299995775, 0.31491216516638987, 0.3106150681150922, 0.3160504189195483, 0.31763770570244787, 0.3096801995598515, 0.2904718592534847, 0.2823816900665434, 0.28421239223298045, 0.2780948428258039, 0.28672624299995775, 0.31491216516638987, 0.3106150681150922, 0.3160504189195483, 0.31763770570244787, 0.3096801995598515, 0.2904718592534847, 0.2823816900665434, 0.28421239223298045, 0.2780948428258039, 0.28672624299995775, 0.31491216516638987, 0.3106150681150922, 0.3160504189195483, 0.31763770570244787, 0.3096801995598515, 0.2904718592534847, 0.2823816900665434, 0.28421239223298045, 0.2780948428258039, 0.28672624299995775, 0.31491216516638987, 0.3106150681150922, 0.3160504189195483, 0.31763770570244787, 0.3096801995598515, 0.2904718592534847, 0.2823816900665434, 0.28421239223298045, 0.2780948428258039, 0.28672624299995775, 0.31491216516638987, 0.3106150681150922, 0.3160504189195483, 0.31763770570244787, 0.3096801995598515, 0.2904718592534847, 0.2823816900665434, 0.28421239223298045, 0.2780948428258039, 0.28672624299995775, 0.31491216516638987, 0.3106150681150922, 0.3160504189195483, 0.31763770570244787, 0.3096801995598515, 0.2904718592534847, 0.2823816900665434, 0.28421239223298045, 0.2780948428258039, 0.28672624299995775, 0.31491216516638987, 0.3106150681150922, 0.3160504189195483, 0.31763770570244787, 0.3096801995598515, 0.2904718592534847, 0.2823816900665434, 0.28421239223298045, 0.2780948428258039, 0.28672624299995775, 0.31491216516638987, 0.3106150681150922, 0.3160504189195483, 0.31763770570244787, 0.3096801995598515, 0.2904718592534847, 0.2823816900665434, 0.28421239223298045, 0.2780948428258039, 0.28672624299995775, 0.31491216516638987, 0.3106150681150922, 0.3160504189195483, 0.31763770570244787, 0.3096801995598515, 0.2904718592534847, 0.2823816900665434, 0.28421239223298045, 0.2780948428258039, 0.28672624299995775, 0.31491216516638987, 0.3106150681150922, 0.3160504189195483, 0.31763770570244787, 0.3096801995598515, 0.2904718592534847, 0.2823816900665434, 0.28421239223298045, 0.2780948428258039, 0.28672624299995775, 0.31491216516638987, 0.3106150681150922, 0.3160504189195483, 0.31763770570244787, 0.3096801995598515, 0.2904718592534847, 0.2823816900665434, 0.28421239223298045, 0.2780948428258039, 0.28672624299995775, 0.31491216516638987, 0.3106150681150922, 0.3160504189195483, 0.31763770570244787, 0.3096801995598515, 0.2904718592534847, 0.2823816900665434, 0.28421239223298045, 0.2780948428258039, 0.28672624299995775, 0.31491216516638987, 0.3106150681150922, 0.3160504189195483, 0.31763770570244787, 0.3096801995598515, 0.2904718592534847, 0.2823816900665434, 0.28421239223298045, 0.2780948428258039, 0.28672624299995775, 0.31491216516638987, 0.3106150681150922, 0.3160504189195483, 0.31763770570244787, 0.3096801995598515, 0.2904718592534847, 0.2823816900665434, 0.28421239223298045, 0.2780948428258039, 0.28672624299995775, 0.31491216516638987, 
0.3106150681150922, 0.3160504189195483, 0.31763770570244787, 0.3096801995598515, 0.2904718592534847, 0.2823816900665434, 0.28421239223298045, 0.2780948428258039, 0.28672624299995775, 0.31491216516638987]}]}, {"task": {"type": "Reranking"}, "dataset": {"name": "MTEB StackOverflowDupQuestions", "type": "mteb/stackoverflowdupquestions-reranking", "config": "default", "split": "test", "revision": "e185fbe320c72810689fc5848eb6114e1ef5ec69"}, "metrics": 
[{"type": "map", "value": 42.29730951798082}, {"type": "mrr", "value": 42.927117816823696}]}, {"task": {"type": "Summarization"}, "dataset": {"name": "MTEB SummEval", "type": "mteb/summeval", "config": "default", "split": "test", "revision": "cda12ad7615edc362dbf25a00fdd61d3b1eaf93c"}, "metrics": [{"type": "cos_sim_pearson", "value": 31.06400884629347}, {"type": "cos_sim_spearman", "value": 30.706758615234286}, {"type": "dot_pearson", "value": 31.064025024903586}, {"type": "dot_spearman", "value": 30.70979367079321}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB TRECCOVID", "type": "mteb/trec-covid", "config": "default", "split": "test", "revision": "bb9466bac8153a0349341eb1b22e06409e78ef4e"}, "metrics": [{"type": "map_at_1", "value": 0.131}, {"type": "map_at_10", "value": 0.699}, {"type": "map_at_100", "value": 2.7279999999999998}, {"type": "map_at_1000", "value": 6.349}, {"type": "map_at_20", "value": 1.0999999999999999}, {"type": "map_at_3", "value": 0.292}, {"type": "map_at_5", "value": 0.422}, {"type": "mrr_at_1", "value": 48.0}, {"type": "mrr_at_10", "value": 56.233}, {"type": "mrr_at_100", "value": 57.57600000000001}, {"type": "mrr_at_1000", "value": 57.582}, {"type": "mrr_at_20", "value": 57.17100000000001}, {"type": "mrr_at_3", "value": 54.333}, {"type": "mrr_at_5", "value": 56.033}, {"type": "ndcg_at_1", "value": 44.0}, {"type": "ndcg_at_10", "value": 35.736000000000004}, {"type": "ndcg_at_100", "value": 23.53}, {"type": "ndcg_at_1000", "value": 20.848}, {"type": "ndcg_at_20", "value": 32.458}, {"type": "ndcg_at_3", "value": 40.765}, {"type": "ndcg_at_5", "value": 38.32}, {"type": "precision_at_1", "value": 48.0}, {"type": "precision_at_10", "value": 37.0}, {"type": "precision_at_100", "value": 23.44}, {"type": "precision_at_1000", "value": 9.754}, {"type": "precision_at_20", "value": 33.300000000000004}, {"type": "precision_at_3", "value": 42.667}, {"type": "precision_at_5", "value": 40.400000000000006}, {"type": "recall_at_1", "value": 0.131}, {"type": "recall_at_10", "value": 0.8789999999999999}, {"type": "recall_at_100", "value": 4.9590000000000005}, {"type": "recall_at_1000", "value": 19.534000000000002}, {"type": "recall_at_20", "value": 1.539}, {"type": "recall_at_3", "value": 0.314}, {"type": "recall_at_5", "value": 0.484}]}, {"task": {"type": "Retrieval"}, "dataset": {"name": "MTEB Touche2020", "type": "mteb/touche2020", "config": "default", "split": "test", "revision": "a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f"}, "metrics": [{"type": "map_at_1", "value": 1.175}, {"type": "map_at_10", "value": 2.59}, {"type": "map_at_100", "value": 3.3169999999999997}, {"type": "map_at_1000", "value": 3.7449999999999997}, {"type": "map_at_20", "value": 2.881}, {"type": "map_at_3", "value": 1.76}, {"type": "map_at_5", "value": 2.2030000000000003}, {"type": "mrr_at_1", "value": 16.326999999999998}, {"type": "mrr_at_10", "value": 24.189}, {"type": "mrr_at_100", "value": 25.686999999999998}, {"type": "mrr_at_1000", "value": 25.743}, {"type": "mrr_at_20", "value": 24.937}, {"type": "mrr_at_3", "value": 22.448999999999998}, {"type": "mrr_at_5", "value": 23.366999999999997}, {"type": "ndcg_at_1", "value": 14.285999999999998}, {"type": "ndcg_at_10", "value": 8.001999999999999}, {"type": "ndcg_at_100", "value": 10.833}, {"type": "ndcg_at_1000", "value": 18.258}, {"type": "ndcg_at_20", "value": 7.707999999999999}, {"type": "ndcg_at_3", "value": 11.213}, {"type": "ndcg_at_5", "value": 9.934}, {"type": "precision_at_1", "value": 16.326999999999998}, {"type": "precision_at_10", "value": 
7.3469999999999995}, {"type": "precision_at_100", "value": 2.4899999999999998}, {"type": "precision_at_1000", "value": 0.7100000000000001}, {"type": "precision_at_20", "value": 5.408}, {"type": "precision_at_3", "value": 12.925}, {"type": "precision_at_5", "value": 10.612}, {"type": "recall_at_1", "value": 1.175}, {"type": "recall_at_10", "value": 4.596}, {"type": "recall_at_100", "value": 14.41}, {"type": "recall_at_1000", "value": 39.294000000000004}, {"type": "recall_at_20", "value": 6.436999999999999}, {"type": "recall_at_3", "value": 2.367}, {"type": "recall_at_5", "value": 3.3230000000000004}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB ToxicConversationsClassification", "type": "mteb/toxic_conversations_50k", "config": "default", "split": "test", "revision": "edfaf9da55d3dd50d43143d90c1ac476895ae6de"}, "metrics": [{"type": "accuracy", "value": 65.1513671875}, {"type": "ap", "value": 12.303071109448203}, {"type": "f1", "value": 50.43533728860237}]}, {"task": {"type": "Classification"}, "dataset": {"name": "MTEB TweetSentimentExtractionClassification", "type": "mteb/tweet_sentiment_extraction", "config": "default", "split": "test", "revision": "d604517c81ca91fe16a244d1248fc021f9ecee7a"}, "metrics": [{"type": "accuracy", "value": 62.5438596491228}, {"type": "f1", "value": 62.69763355089073}]}, {"task": {"type": "Clustering"}, "dataset": {"name": "MTEB TwentyNewsgroupsClustering", "type": "mteb/twentynewsgroups-clustering", "config": "default", "split": "test", "revision": "6125ec4e24fa026cec8a478383ee943acfbd5449"}, "metrics": [{"type": "v_measure", "value": 31.692515423088473}, {"type": "v_measures", "value": [0.31329437576982844, 0.3203569385112976, 0.3427302354400537, 0.3045275740558555, 0.3228406069698239, 0.3215023256245064, 0.30524504896475263, 0.31571502008047786, 0.2995174236038641, 0.32352199328838743]}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB TwitterSemEval2015", "type": "mteb/twittersemeval2015-pairclassification", "config": "default", "split": "test", "revision": "70970daeab8776df92f5ea462b6173c0b46fd2d1"}, "metrics": [{"type": "cos_sim_accuracy", "value": 84.00190737318948}, {"type": "cos_sim_ap", "value": 67.48296380006165}, {"type": "cos_sim_f1", "value": 62.996718920889535}, {"type": "cos_sim_precision", "value": 58.39152962378914}, {"type": "cos_sim_recall", "value": 68.3905013192612}, {"type": "dot_accuracy", "value": 84.00190737318948}, {"type": "dot_ap", "value": 67.48295942427862}, {"type": "dot_f1", "value": 62.996718920889535}, {"type": "dot_precision", "value": 58.39152962378914}, {"type": "dot_recall", "value": 68.3905013192612}, {"type": "euclidean_accuracy", "value": 84.00190737318948}, {"type": "euclidean_ap", "value": 67.482961801317}, {"type": "euclidean_f1", "value": 62.996718920889535}, {"type": "euclidean_precision", "value": 58.39152962378914}, {"type": "euclidean_recall", "value": 68.3905013192612}, {"type": "manhattan_accuracy", "value": 83.94826250223521}, {"type": "manhattan_ap", "value": 67.32115101507013}, {"type": "manhattan_f1", "value": 62.665684830633275}, {"type": "manhattan_precision", "value": 58.5819183111519}, {"type": "manhattan_recall", "value": 67.36147757255937}, {"type": "max_accuracy", "value": 84.00190737318948}, {"type": "max_ap", "value": 67.48296380006165}, {"type": "max_f1", "value": 62.996718920889535}]}, {"task": {"type": "PairClassification"}, "dataset": {"name": "MTEB TwitterURLCorpus", "type": "mteb/twitterurlcorpus-pairclassification", "config": "default", "split": "test", "revision": "8b6510b0b1fa4e4c4f879467980e9be563ec1cdf"}, "metrics": [{"type": "cos_sim_accuracy", "value": 88.30286800946948}, {"type": "cos_sim_ap", "value": 84.5306725053528}, {"type": "cos_sim_f1", "value": 76.5947752126367}, {"type": "cos_sim_precision", "value": 75.56188192987715}, {"type": "cos_sim_recall", "value": 77.65629812134279}, {"type": "dot_accuracy", "value": 88.30286800946948}, {"type": "dot_ap", "value": 84.53066920468329}, {"type": "dot_f1", "value": 76.5947752126367}, {"type": "dot_precision", "value": 75.56188192987715}, {"type": "dot_recall", "value": 77.65629812134279}, {"type": "euclidean_accuracy", "value": 88.30286800946948}, {"type": "euclidean_ap", "value": 84.53066432305307}, {"type": "euclidean_f1", "value": 76.5947752126367}, {"type": "euclidean_precision", "value": 75.56188192987715}, {"type": "euclidean_recall", "value": 77.65629812134279}, {"type": "manhattan_accuracy", "value": 88.39795086738852}, {"type": "manhattan_ap", "value": 84.51446339083833}, {"type": "manhattan_f1", "value": 76.57867106644667}, {"type": "manhattan_precision", 
"value": 74.64181286549709}, {"type": "manhattan_recall", "value": 78.61872497690176}, {"type": "max_accuracy", "value": 88.39795086738852}, {"type": "max_ap", "value": 84.5306725053528}, {"type": "max_f1", "value": 76.5947752126367}]}]}]}
Mihaiii/Wartortle
null
[ "sentence-transformers", "onnx", "safetensors", "bert", "feature-extraction", "sentence-similarity", "bge", "mteb", "dataset:Mihaiii/qa-assistant", "license:mit", "model-index", "endpoints_compatible", "region:us" ]
null
2024-04-30T15:12:13+00:00
[]
[]
TAGS #sentence-transformers #onnx #safetensors #bert #feature-extraction #sentence-similarity #bge #mteb #dataset-Mihaiii/qa-assistant #license-mit #model-index #endpoints_compatible #region-us
# Wartortle Wartortle is a distilled version of bge-base-en-v1.5. ## Intended purpose <span style="color:blue">This model is designed for use in semantic-autocomplete (click here for demo).</span> Make sure you also pass 'pipelineParams={{ pooling: "cls", normalize: true }}' since the default pooling in the component is mean. ## Usage Other than within semantic-autocomplete, you can use this model the same way as bge-base-en-v1.5.
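For readers who want a concrete starting point outside semantic-autocomplete, the sketch below embeds sentences with this model through sentence-transformers. It is an illustrative assumption based on common bge-base-en-v1.5 usage, not an official snippet from the card, and the example sentences are made up.

```python
from sentence_transformers import SentenceTransformer

# Repo id taken from this record; loading it like any other bge-style
# sentence-transformers checkpoint is an assumption.
model = SentenceTransformer("Mihaiii/Wartortle")

sentences = ["How do I reset my password?", "Steps to recover a forgotten password"]
# bge-style models are normally used with normalized embeddings,
# matching the `normalize: true` advice above.
embeddings = model.encode(sentences, normalize_embeddings=True)

# With normalized vectors, cosine similarity is just a dot product.
print(embeddings[0] @ embeddings[1])
```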
[ "# Wartortle\n\nWartortle is a distill of bge-base-en-v1.5.", "## Intended purpose\n\n<span style=\"color:blue\">This model is designed for use in semantic-autocomplete (click here for demo).</span>\nMake sure you also pass 'pipelineParams={{ pooling: \"cls\", normalize: true }}' since the default pooling in the component is mean.", "## Usage\n\nOther than within semantic-autocomplete, you can use this model same as bge-base-en-v1.5." ]
[ "TAGS\n#sentence-transformers #onnx #safetensors #bert #feature-extraction #sentence-similarity #bge #mteb #dataset-Mihaiii/qa-assistant #license-mit #model-index #endpoints_compatible #region-us \n", "# Wartortle\n\nWartortle is a distill of bge-base-en-v1.5.", "## Intended purpose\n\n<span style=\"color:blue\">This model is designed for use in semantic-autocomplete (click here for demo).</span>\nMake sure you also pass 'pipelineParams={{ pooling: \"cls\", normalize: true }}' since the default pooling in the component is mean.", "## Usage\n\nOther than within semantic-autocomplete, you can use this model same as bge-base-en-v1.5." ]
[ 57, 25, 76, 32 ]
[ "TAGS\n#sentence-transformers #onnx #safetensors #bert #feature-extraction #sentence-similarity #bge #mteb #dataset-Mihaiii/qa-assistant #license-mit #model-index #endpoints_compatible #region-us \n# Wartortle\n\nWartortle is a distill of bge-base-en-v1.5.## Intended purpose\n\n<span style=\"color:blue\">This model is designed for use in semantic-autocomplete (click here for demo).</span>\nMake sure you also pass 'pipelineParams={{ pooling: \"cls\", normalize: true }}' since the default pooling in the component is mean.## Usage\n\nOther than within semantic-autocomplete, you can use this model same as bge-base-en-v1.5." ]
null
peft
# Model Card for Model ID Unsloth base model Code Llama2 34b (https://huggingface.co/unsloth/codellama-34b-bnb-4bit) trained on Alpaca Dataset. This is a repository for another 34B instruct-tuned version. ``` Tue Apr 30 18:21:29 2024 +---------------------------------------------------------------------------------------+ | NVIDIA-SMI 535.129.03 Driver Version: 535.129.03 CUDA Version: 12.2 | |-----------------------------------------+----------------------+----------------------+ | GPU Name Persistence-M | Bus-Id Disp.A | Volatile Uncorr. ECC | | Fan Temp Perf Pwr:Usage/Cap | Memory-Usage | GPU-Util Compute M. | | | | MIG M. | |=========================================+======================+======================| | 0 NVIDIA A100-SXM4-40GB On | 00000000:07:00.0 Off | 0 | | N/A 75C P0 389W / 400W | 20959MiB / 40960MiB | 100% Default | | | | Disabled | +-----------------------------------------+----------------------+----------------------+ +---------------------------------------------------------------------------------------+ | Processes: | | GPU GI CI PID Type Process name GPU Memory | | ID ID Usage | |=======================================================================================| | 0 N/A N/A 9175 C python 20938MiB | +---------------------------------------------------------------------------------------+ ``` ### Framework versions - Unsloth - Torch - PEFT 0.10.0
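Since the card only documents the training hardware, a hedged inference sketch may help: it loads the 4-bit base checkpoint named above and attaches this repository as a PEFT adapter. The Alpaca-style prompt and generation settings are assumptions rather than part of the original card, and bitsandbytes must be installed for the 4-bit base model.

```python
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "unsloth/codellama-34b-bnb-4bit"             # base model from the card
adapter_id = "vincentoh/codellama-34b-alpaca-instruct"  # this repository, assumed to hold the adapter

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(base_id, device_map="auto")
model = PeftModel.from_pretrained(base, adapter_id)

# Alpaca-style prompt, assumed from the training data mentioned above.
prompt = "### Instruction:\nWrite a Python function that reverses a string.\n\n### Response:\n"
inputs = tokenizer(prompt, return_tensors="pt").to(base.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```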
{"library_name": "peft", "base_model": "unsloth/codellama-34b-bnb-4bit"}
vincentoh/codellama-34b-alpaca-instruct
null
[ "peft", "safetensors", "base_model:unsloth/codellama-34b-bnb-4bit", "region:us" ]
null
2024-04-30T15:12:27+00:00
[]
[]
TAGS #peft #safetensors #base_model-unsloth/codellama-34b-bnb-4bit #region-us
# Model Card for Model ID Unsloth base model Code Llama2 34b (URL trained on Alpaca Dataset. This is a repository for another 34B instruct-tuned version. ### Framework versions - Unsloth - Torch - PEFT 0.10.0
[ "# Model Card for Model ID\n\nUnsloth base model Code Llama2 34b (URL\ntrained on Alpaca Dataset.\n\nThis is a repository for another 34B instruct-tuned version.", "### Framework versions\n\n- Unsloth\n- Torch\n- PEFT 0.10.0" ]
[ "TAGS\n#peft #safetensors #base_model-unsloth/codellama-34b-bnb-4bit #region-us \n", "# Model Card for Model ID\n\nUnsloth base model Code Llama2 34b (URL\ntrained on Alpaca Dataset.\n\nThis is a repository for another 34B instruct-tuned version.", "### Framework versions\n\n- Unsloth\n- Torch\n- PEFT 0.10.0" ]
[ 33, 42, 19 ]
[ "TAGS\n#peft #safetensors #base_model-unsloth/codellama-34b-bnb-4bit #region-us \n# Model Card for Model ID\n\nUnsloth base model Code Llama2 34b (URL\ntrained on Alpaca Dataset.\n\nThis is a repository for another 34B instruct-tuned version.### Framework versions\n\n- Unsloth\n- Torch\n- PEFT 0.10.0" ]
text-generation
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": ["trl", "sft"]}
dbaek111/Mistral-7B-v0.2-Elon_407_HPC_Q
null
[ "transformers", "safetensors", "mistral", "text-generation", "trl", "sft", "arxiv:1910.09700", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "4-bit", "region:us" ]
null
2024-04-30T15:13:44+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #mistral #text-generation #trl #sft #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #4-bit #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #mistral #text-generation #trl #sft #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #4-bit #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ 54, 6, 4, 75, 23, 3, 5, 8, 9, 8, 34, 20, 4, 5, 5, 11, 13, 12, 3, 10, 6, 5, 6, 4, 5, 7, 49, 7, 7, 5, 5, 15, 7, 7, 8, 5 ]
[ "TAGS\n#transformers #safetensors #mistral #text-generation #trl #sft #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #4-bit #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact" ]
reinforcement-learning
null
# **Reinforce** Agent playing **Pixelcopter-PLE-v0** This is a trained model of a **Reinforce** agent playing **Pixelcopter-PLE-v0**. To learn to use this model and train yours, check Unit 4 of the Deep Reinforcement Learning Course: https://huggingface.co/deep-rl-course/unit4/introduction
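The card itself only points to the course material; as a reminder of what a Reinforce (policy-gradient) update does, here is a minimal generic sketch in PyTorch. It is illustrative only, is not taken from this repository, and the function name and arguments are hypothetical.

```python
import torch

def reinforce_update(optimizer, log_probs, rewards, gamma=0.99):
    # Discounted returns G_t, computed backwards over one finished episode.
    returns, g = [], 0.0
    for r in reversed(rewards):
        g = r + gamma * g
        returns.insert(0, g)
    returns = torch.tensor(returns)
    # Standardizing returns is a common variance-reduction trick.
    returns = (returns - returns.mean()) / (returns.std() + 1e-8)

    # Policy-gradient loss: -sum_t log pi(a_t | s_t) * G_t
    loss = -(torch.stack(log_probs) * returns).sum()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```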
{"tags": ["Pixelcopter-PLE-v0", "reinforce", "reinforcement-learning", "custom-implementation", "deep-rl-class"], "model-index": [{"name": "Reinforce-Pixelcopter-PLE-v0", "results": [{"task": {"type": "reinforcement-learning", "name": "reinforcement-learning"}, "dataset": {"name": "Pixelcopter-PLE-v0", "type": "Pixelcopter-PLE-v0"}, "metrics": [{"type": "mean_reward", "value": "38.80 +/- 24.91", "name": "mean_reward", "verified": false}]}]}]}
lzacchini/Reinforce-Pixelcopter-PLE-v0
null
[ "Pixelcopter-PLE-v0", "reinforce", "reinforcement-learning", "custom-implementation", "deep-rl-class", "model-index", "region:us" ]
null
2024-04-30T15:13:55+00:00
[]
[]
TAGS #Pixelcopter-PLE-v0 #reinforce #reinforcement-learning #custom-implementation #deep-rl-class #model-index #region-us
# Reinforce Agent playing Pixelcopter-PLE-v0 This is a trained model of a Reinforce agent playing Pixelcopter-PLE-v0 . To learn to use this model and train yours check Unit 4 of the Deep Reinforcement Learning Course: URL
[ "# Reinforce Agent playing Pixelcopter-PLE-v0\n This is a trained model of a Reinforce agent playing Pixelcopter-PLE-v0 .\n To learn to use this model and train yours check Unit 4 of the Deep Reinforcement Learning Course: URL" ]
[ "TAGS\n#Pixelcopter-PLE-v0 #reinforce #reinforcement-learning #custom-implementation #deep-rl-class #model-index #region-us \n", "# Reinforce Agent playing Pixelcopter-PLE-v0\n This is a trained model of a Reinforce agent playing Pixelcopter-PLE-v0 .\n To learn to use this model and train yours check Unit 4 of the Deep Reinforcement Learning Course: URL" ]
[ 37, 56 ]
[ "TAGS\n#Pixelcopter-PLE-v0 #reinforce #reinforcement-learning #custom-implementation #deep-rl-class #model-index #region-us \n# Reinforce Agent playing Pixelcopter-PLE-v0\n This is a trained model of a Reinforce agent playing Pixelcopter-PLE-v0 .\n To learn to use this model and train yours check Unit 4 of the Deep Reinforcement Learning Course: URL" ]
null
transformers
`Epoch = 0, Step = 1000` # Model Card for Model ID ``` python pretrain.py \ --data-path "chart-rela-ins/unichart-pretrain-table-extraction-v2" \ --train-images "/workspace/data/UniChartImages/" \ --output-dir "/workspace/output_data/unichart_pretrain/" \ --max-steps 36000 \ --batch-size 2 \ --valid-batch-size 2 \ --num-workers 12 \ --lr 5e-5 \ --log-every-n-steps 20 \ --val-check-interval 0.5 \ --warmup-steps 2000 \ --checkpoint-steps 1000 \ --accumulate-grad-batches 64 \ --processor-path "ahmed-masry/unichart-base-960" \ --image-size 512 \ --pretrained-vision-encoder "nxquang-al/unichart-base-960-encoder" \ --pretrained-decoder "nxquang-al/unichart-base-960-decoder" \ --wandb-project "Pretrain-ChartReLA-Instruct" \ ```
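For orientation, the processor and checkpoints named in the command follow UniChart's Donut-style vision-encoder-decoder layout. A hedged inference sketch with plain transformers follows; the repo path comes from the `--processor-path` argument, while the prompt token and the decision to load the base checkpoint as a `VisionEncoderDecoderModel` are assumptions.

```python
from PIL import Image
from transformers import DonutProcessor, VisionEncoderDecoderModel

processor = DonutProcessor.from_pretrained("ahmed-masry/unichart-base-960")
model = VisionEncoderDecoderModel.from_pretrained("ahmed-masry/unichart-base-960")

image = Image.open("chart.png").convert("RGB")  # hypothetical chart image
pixel_values = processor(image, return_tensors="pt").pixel_values

# Task prompt assumed from UniChart's table-extraction convention.
prompt_ids = processor.tokenizer(
    "<extract_data_table> <s_answer>", add_special_tokens=False, return_tensors="pt"
).input_ids

outputs = model.generate(pixel_values, decoder_input_ids=prompt_ids, max_new_tokens=512)
print(processor.batch_decode(outputs, skip_special_tokens=True)[0])
```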
{"library_name": "transformers", "tags": []}
chart-rela-ins/pretrain-small-unichart-table-bs64-high-lr
null
[ "transformers", "safetensors", "Chart-rela-instruct", "endpoints_compatible", "region:us" ]
null
2024-04-30T15:14:40+00:00
[]
[]
TAGS #transformers #safetensors #Chart-rela-instruct #endpoints_compatible #region-us
'Epoch = 0, Step = 1000' # Model Card for Model ID
[ "# Model Card for Model ID" ]
[ "TAGS\n#transformers #safetensors #Chart-rela-instruct #endpoints_compatible #region-us \n", "# Model Card for Model ID" ]
[ 24, 6 ]
[ "TAGS\n#transformers #safetensors #Chart-rela-instruct #endpoints_compatible #region-us \n# Model Card for Model ID" ]
question-answering
transformers
<!-- This model card has been generated automatically according to the information Keras had access to. You should probably proofread and complete it, then remove this comment. --> # rajivsah240/my_awesome_qa_model This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset. It achieves the following results on the evaluation set: - Train Loss: 3.4612 - Validation Loss: 2.2288 - Epoch: 0 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - optimizer: {'name': 'Adam', 'weight_decay': None, 'clipnorm': None, 'global_clipnorm': None, 'clipvalue': None, 'use_ema': False, 'ema_momentum': 0.99, 'ema_overwrite_frequency': None, 'jit_compile': False, 'is_legacy_optimizer': False, 'learning_rate': {'module': 'keras.optimizers.schedules', 'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 2e-05, 'decay_steps': 500, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, 'registered_name': None}, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False} - training_precision: float32 ### Training results | Train Loss | Validation Loss | Epoch | |:----------:|:---------------:|:-----:| | 3.4612 | 2.2288 | 0 | ### Framework versions - Transformers 4.40.0 - TensorFlow 2.15.0 - Datasets 2.19.0 - Tokenizers 0.19.1
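The card stops short of an inference example; a minimal sketch, assuming the repository holds the TensorFlow weights implied by the framework versions above, could be:

```python
from transformers import pipeline

# Hedged sketch: load the fine-tuned checkpoint from the Hub.
# framework="tf" is an assumption based on the TensorFlow version listed above.
qa_pipeline = pipeline(
    "question-answering",
    model="rajivsah240/my_awesome_qa_model",
    framework="tf",
)

result = qa_pipeline(
    question="Which base model was fine-tuned?",
    context="This model is a fine-tuned version of distilbert-base-uncased.",
)
print(result["answer"], result["score"])
```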
{"license": "apache-2.0", "tags": ["generated_from_keras_callback"], "base_model": "distilbert-base-uncased", "model-index": [{"name": "rajivsah240/my_awesome_qa_model", "results": []}]}
rajivsah240/my_awesome_qa_model
null
[ "transformers", "tf", "distilbert", "question-answering", "generated_from_keras_callback", "base_model:distilbert-base-uncased", "license:apache-2.0", "endpoints_compatible", "region:us" ]
null
2024-04-30T15:17:47+00:00
[]
[]
TAGS #transformers #tf #distilbert #question-answering #generated_from_keras_callback #base_model-distilbert-base-uncased #license-apache-2.0 #endpoints_compatible #region-us
rajivsah240/my\_awesome\_qa\_model ================================== This model is a fine-tuned version of distilbert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set: * Train Loss: 3.4612 * Validation Loss: 2.2288 * Epoch: 0 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * optimizer: {'name': 'Adam', 'weight\_decay': None, 'clipnorm': None, 'global\_clipnorm': None, 'clipvalue': None, 'use\_ema': False, 'ema\_momentum': 0.99, 'ema\_overwrite\_frequency': None, 'jit\_compile': False, 'is\_legacy\_optimizer': False, 'learning\_rate': {'module': 'keras.optimizers.schedules', 'class\_name': 'PolynomialDecay', 'config': {'initial\_learning\_rate': 2e-05, 'decay\_steps': 500, 'end\_learning\_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, 'registered\_name': None}, 'beta\_1': 0.9, 'beta\_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False} * training\_precision: float32 ### Training results ### Framework versions * Transformers 4.40.0 * TensorFlow 2.15.0 * Datasets 2.19.0 * Tokenizers 0.19.1
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'Adam', 'weight\\_decay': None, 'clipnorm': None, 'global\\_clipnorm': None, 'clipvalue': None, 'use\\_ema': False, 'ema\\_momentum': 0.99, 'ema\\_overwrite\\_frequency': None, 'jit\\_compile': False, 'is\\_legacy\\_optimizer': False, 'learning\\_rate': {'module': 'keras.optimizers.schedules', 'class\\_name': 'PolynomialDecay', 'config': {'initial\\_learning\\_rate': 2e-05, 'decay\\_steps': 500, 'end\\_learning\\_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, 'registered\\_name': None}, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}\n* training\\_precision: float32", "### Training results", "### Framework versions\n\n\n* Transformers 4.40.0\n* TensorFlow 2.15.0\n* Datasets 2.19.0\n* Tokenizers 0.19.1" ]
[ "TAGS\n#transformers #tf #distilbert #question-answering #generated_from_keras_callback #base_model-distilbert-base-uncased #license-apache-2.0 #endpoints_compatible #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'Adam', 'weight\\_decay': None, 'clipnorm': None, 'global\\_clipnorm': None, 'clipvalue': None, 'use\\_ema': False, 'ema\\_momentum': 0.99, 'ema\\_overwrite\\_frequency': None, 'jit\\_compile': False, 'is\\_legacy\\_optimizer': False, 'learning\\_rate': {'module': 'keras.optimizers.schedules', 'class\\_name': 'PolynomialDecay', 'config': {'initial\\_learning\\_rate': 2e-05, 'decay\\_steps': 500, 'end\\_learning\\_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, 'registered\\_name': None}, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}\n* training\\_precision: float32", "### Training results", "### Framework versions\n\n\n* Transformers 4.40.0\n* TensorFlow 2.15.0\n* Datasets 2.19.0\n* Tokenizers 0.19.1" ]
[ 54, 290, 5, 38 ]
[ "TAGS\n#transformers #tf #distilbert #question-answering #generated_from_keras_callback #base_model-distilbert-base-uncased #license-apache-2.0 #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* optimizer: {'name': 'Adam', 'weight\\_decay': None, 'clipnorm': None, 'global\\_clipnorm': None, 'clipvalue': None, 'use\\_ema': False, 'ema\\_momentum': 0.99, 'ema\\_overwrite\\_frequency': None, 'jit\\_compile': False, 'is\\_legacy\\_optimizer': False, 'learning\\_rate': {'module': 'keras.optimizers.schedules', 'class\\_name': 'PolynomialDecay', 'config': {'initial\\_learning\\_rate': 2e-05, 'decay\\_steps': 500, 'end\\_learning\\_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, 'registered\\_name': None}, 'beta\\_1': 0.9, 'beta\\_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False}\n* training\\_precision: float32### Training results### Framework versions\n\n\n* Transformers 4.40.0\n* TensorFlow 2.15.0\n* Datasets 2.19.0\n* Tokenizers 0.19.1" ]
reinforcement-learning
stable-baselines3
# **DQN** Agent playing **SpaceInvadersNoFrameskip-v4** This is a trained model of a **DQN** agent playing **SpaceInvadersNoFrameskip-v4** using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3) and the [RL Zoo](https://github.com/DLR-RM/rl-baselines3-zoo). The RL Zoo is a training framework for Stable Baselines3 reinforcement learning agents, with hyperparameter optimization and pre-trained agents included. ## Usage (with SB3 RL Zoo) RL Zoo: https://github.com/DLR-RM/rl-baselines3-zoo<br/> SB3: https://github.com/DLR-RM/stable-baselines3<br/> SB3 Contrib: https://github.com/Stable-Baselines-Team/stable-baselines3-contrib Install the RL Zoo (with SB3 and SB3-Contrib): ```bash pip install rl_zoo3 ``` ``` # Download model and save it into the logs/ folder python -m rl_zoo3.load_from_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -orga Cheekydave -f logs/ python -m rl_zoo3.enjoy --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/ ``` If you installed the RL Zoo3 via pip (`pip install rl_zoo3`), from anywhere you can do: ``` python -m rl_zoo3.load_from_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -orga Cheekydave -f logs/ python -m rl_zoo3.enjoy --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/ ``` ## Training (with the RL Zoo) ``` python -m rl_zoo3.train --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/ # Upload the model and generate video (when possible) python -m rl_zoo3.push_to_hub --algo dqn --env SpaceInvadersNoFrameskip-v4 -f logs/ -orga Cheekydave ``` ## Hyperparameters ```python OrderedDict([('batch_size', 32), ('buffer_size', 100000), ('env_wrapper', ['stable_baselines3.common.atari_wrappers.AtariWrapper']), ('exploration_final_eps', 0.01), ('exploration_fraction', 0.1), ('frame_stack', 4), ('gradient_steps', 1), ('learning_rate', 0.0001), ('learning_starts', 100000), ('n_timesteps', 1000000.0), ('optimize_memory_usage', False), ('policy', 'CnnPolicy'), ('target_update_interval', 1000), ('train_freq', 4), ('normalize', False)]) ``` # Environment Arguments ```python {'render_mode': 'rgb_array'} ```
{"library_name": "stable-baselines3", "tags": ["SpaceInvadersNoFrameskip-v4", "deep-reinforcement-learning", "reinforcement-learning", "stable-baselines3"], "model-index": [{"name": "DQN", "results": [{"task": {"type": "reinforcement-learning", "name": "reinforcement-learning"}, "dataset": {"name": "SpaceInvadersNoFrameskip-v4", "type": "SpaceInvadersNoFrameskip-v4"}, "metrics": [{"type": "mean_reward", "value": "718.50 +/- 256.51", "name": "mean_reward", "verified": false}]}]}]}
Cheekydave/dqn-SpaceInvadersNoFrameskip-v4
null
[ "stable-baselines3", "SpaceInvadersNoFrameskip-v4", "deep-reinforcement-learning", "reinforcement-learning", "model-index", "region:us" ]
null
2024-04-30T15:18:27+00:00
[]
[]
TAGS #stable-baselines3 #SpaceInvadersNoFrameskip-v4 #deep-reinforcement-learning #reinforcement-learning #model-index #region-us
# DQN Agent playing SpaceInvadersNoFrameskip-v4 This is a trained model of a DQN agent playing SpaceInvadersNoFrameskip-v4 using the stable-baselines3 library and the RL Zoo. The RL Zoo is a training framework for Stable Baselines3 reinforcement learning agents, with hyperparameter optimization and pre-trained agents included. ## Usage (with SB3 RL Zoo) RL Zoo: URL SB3: URL SB3 Contrib: URL Install the RL Zoo (with SB3 and SB3-Contrib): If you installed the RL Zoo3 via pip ('pip install rl_zoo3'), from anywhere you can do: ## Training (with the RL Zoo) ## Hyperparameters # Environment Arguments
[ "# DQN Agent playing SpaceInvadersNoFrameskip-v4\nThis is a trained model of a DQN agent playing SpaceInvadersNoFrameskip-v4\nusing the stable-baselines3 library\nand the RL Zoo.\n\nThe RL Zoo is a training framework for Stable Baselines3\nreinforcement learning agents,\nwith hyperparameter optimization and pre-trained agents included.", "## Usage (with SB3 RL Zoo)\n\nRL Zoo: URL\nSB3: URL\nSB3 Contrib: URL\n\nInstall the RL Zoo (with SB3 and SB3-Contrib):\n\n\n\n\nIf you installed the RL Zoo3 via pip ('pip install rl_zoo3'), from anywhere you can do:", "## Training (with the RL Zoo)", "## Hyperparameters", "# Environment Arguments" ]
[ "TAGS\n#stable-baselines3 #SpaceInvadersNoFrameskip-v4 #deep-reinforcement-learning #reinforcement-learning #model-index #region-us \n", "# DQN Agent playing SpaceInvadersNoFrameskip-v4\nThis is a trained model of a DQN agent playing SpaceInvadersNoFrameskip-v4\nusing the stable-baselines3 library\nand the RL Zoo.\n\nThe RL Zoo is a training framework for Stable Baselines3\nreinforcement learning agents,\nwith hyperparameter optimization and pre-trained agents included.", "## Usage (with SB3 RL Zoo)\n\nRL Zoo: URL\nSB3: URL\nSB3 Contrib: URL\n\nInstall the RL Zoo (with SB3 and SB3-Contrib):\n\n\n\n\nIf you installed the RL Zoo3 via pip ('pip install rl_zoo3'), from anywhere you can do:", "## Training (with the RL Zoo)", "## Hyperparameters", "# Environment Arguments" ]
[ 37, 81, 76, 10, 6, 3 ]
[ "TAGS\n#stable-baselines3 #SpaceInvadersNoFrameskip-v4 #deep-reinforcement-learning #reinforcement-learning #model-index #region-us \n# DQN Agent playing SpaceInvadersNoFrameskip-v4\nThis is a trained model of a DQN agent playing SpaceInvadersNoFrameskip-v4\nusing the stable-baselines3 library\nand the RL Zoo.\n\nThe RL Zoo is a training framework for Stable Baselines3\nreinforcement learning agents,\nwith hyperparameter optimization and pre-trained agents included.## Usage (with SB3 RL Zoo)\n\nRL Zoo: URL\nSB3: URL\nSB3 Contrib: URL\n\nInstall the RL Zoo (with SB3 and SB3-Contrib):\n\n\n\n\nIf you installed the RL Zoo3 via pip ('pip install rl_zoo3'), from anywhere you can do:## Training (with the RL Zoo)## Hyperparameters# Environment Arguments" ]
token-classification
flair
# Recognition of UTEs and company mentions in Flair This is a model trained using [Flair](https://github.com/flairNLP/flair/) to recognise mentions of UTEs (Unión Temporal de Empresas) and companies in public tenders. It is a finetune of the flair/ner-spanish-large model (retrained from scratch to include additional tags). Based on document-level XLM-R embeddings and [FLERT](https://arxiv.org/pdf/2011.06993v1.pdf/). ## Demo: How to use in Flair Requires: **[Flair](https://github.com/flairNLP/flair/)** (`pip install flair`) ```python from flair.data import Sentence from flair.models import SequenceTagger # load tagger tagger = SequenceTagger.load("BSC-LT/NextProcurement-NER-Spanish-UTE-Company") # make example sentence sentence = Sentence("PODACESA OBRAS Y SERVICIOS, S.A, y ECR INFRAESTRUCTURAS Y SERVICIOS HIDRÁULICOS S.L., constituidos en UTE PODACESA-ECR realizan la siguiente oferta:") # predict NER tags tagger.predict(sentence) # print sentence print(sentence) # print predicted NER spans print('The following NER tags are found:') # iterate over entities and print for entity in sentence.get_spans('ner'): print(entity) ``` This yields the following output: ``` Sentence[24]: "PODACESA OBRAS Y SERVICIOS, S.A, y ECR INFRAESTRUCTURAS Y SERVICIOS HIDRAULICOS S.L., constituidos en UTE PODACESA-ECR realizan la siguiente oferta:" _ ["PODACESA OBRAS Y SERVICIOS, S.A, y ECR INFRAESTRUCTURAS Y SERVICIOS HIDRAULICOS S.L."/UTE, "PODACESA-ECR"/UTE] The following NER tags are found: Span[0:14]: "PODACESA OBRAS Y SERVICIOS, S.A, y ECR INFRAESTRUCTURAS Y SERVICIOS HIDRAULICOS S.L." _ UTE (0.995) Span[18:19]: "PODACESA-ECR" _ UTE (0.9955) ``` and with the sentence "PODACESA OBRAS Y SERVICIOS, S.A realiza la siguiente oferta:" ``` Sentence[11]: "PODACESA OBRAS Y SERVICIOS, S.A realiza la siguiente oferta:" _ ["PODACESA OBRAS Y SERVICIOS, S.A"/SINGLE_COMPANY] The following NER tags are found: Span[0:6]: "PODACESA OBRAS Y SERVICIOS, S.A" _ SINGLE_COMPANY (1.0) ``` ## Training: Script to train this model The following Flair script was used to train this model (**TODO: update**): ```python import torch # 1. get the corpus from flair.datasets import CONLL_03_SPANISH corpus = CONLL_03_SPANISH() # 2. what tag do we want to predict? tag_type = 'ner' # 3. make the tag dictionary from the corpus tag_dictionary = corpus.make_tag_dictionary(tag_type=tag_type) # 4. initialize fine-tuneable transformer embeddings WITH document context from flair.embeddings import TransformerWordEmbeddings embeddings = TransformerWordEmbeddings( model='xlm-roberta-large', layers="-1", subtoken_pooling="first", fine_tune=True, use_context=True, ) # 5. initialize bare-bones sequence tagger (no CRF, no RNN, no reprojection) from flair.models import SequenceTagger tagger = SequenceTagger( hidden_size=256, embeddings=embeddings, tag_dictionary=tag_dictionary, tag_type='ner', use_crf=False, use_rnn=False, reproject_embeddings=False, ) # 6. initialize trainer with AdamW optimizer from flair.trainers import ModelTrainer trainer = ModelTrainer(tagger, corpus, optimizer=torch.optim.AdamW) # 7. 
run training with XLM parameters (20 epochs, small LR) from torch.optim.lr_scheduler import OneCycleLR trainer.train('resources/taggers/ner-spanish-large', learning_rate=5.0e-6, mini_batch_size=4, mini_batch_chunk_size=1, max_epochs=20, scheduler=OneCycleLR, embeddings_storage_mode='none', weight_decay=0., ) ``` ## Evaluation Results ``` Results: - F-score (micro) 0.7431 - F-score (macro) 0.7429 - Accuracy 0.5944 By class: precision recall f1-score support UTE 0.7568 0.7887 0.7724 71 SINGLE_COMPANY 0.6538 0.7846 0.7133 65 micro avg 0.7039 0.7868 0.7431 136 macro avg 0.7053 0.7867 0.7429 136 weighted avg 0.7076 0.7868 0.7442 136 ``` ## Additional information ### Author The Language Technologies Unit from Barcelona Supercomputing Center. ### Contact For further information, please send an email to <[email protected]>. ### Copyright Copyright(c) 2023 by Language Technologies Unit, Barcelona Supercomputing Center. ### License [Apache License, Version 2.0](https://www.apache.org/licenses/LICENSE-2.0) ### Funding This work has been promoted and financed by the European Commission Health and Digital Executive Agency, Connecting Europe Facility, Grant Agreement Nº INEA/CEF/ICT/A2020/2373713, Action Title Open Harmonized and Enriched Procurement Data Platform (nextProcurement), Action number 2020-ES-IA-0255. ### Disclaimer <details> <summary>Click to expand</summary> The model published in this repository is intended for a generalist purpose and is available to third parties under a permissive Apache License, Version 2.0. Be aware that the model may have biases and/or any other undesirable distortions. When third parties deploy or provide systems and/or services to other parties using this model (or any system based on it) or become users of the model, they should note that it is their responsibility to mitigate the risks arising from its use and, in any event, to comply with applicable regulations, including regulations regarding the use of Artificial Intelligence. In no event shall the owner and creator of the model (Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties. </details>
{"language": "es", "license": "apache-2.0", "tags": ["flair", "token-classification", "sequence-tagger-model"], "datasets": ["conll2003", "BSC-LT/NextProcurement-NER-Spanish-UTE-Company-annotated"], "widget": [{"text": "PODACESA OBRAS Y SERVICIOS, S.A, y ECR INFRAESTRUCTURAS Y SERVICIOS HIDR\u00c1ULICOS S.L., constituidos en UTE PODACESA-ECR realizan la siguiente oferta:"}, {"text": "PODACESA OBRAS Y SERVICIOS, S.A realiza la siguiente oferta:"}]}
BSC-LT/NextProcurement-NER-Spanish-UTE-Company
null
[ "flair", "pytorch", "token-classification", "sequence-tagger-model", "es", "dataset:conll2003", "dataset:BSC-LT/NextProcurement-NER-Spanish-UTE-Company-annotated", "arxiv:2011.06993", "license:apache-2.0", "region:us" ]
null
2024-04-30T15:18:30+00:00
[ "2011.06993" ]
[ "es" ]
TAGS #flair #pytorch #token-classification #sequence-tagger-model #es #dataset-conll2003 #dataset-BSC-LT/NextProcurement-NER-Spanish-UTE-Company-annotated #arxiv-2011.06993 #license-apache-2.0 #region-us
# Recognition of UTEs and company mentions in Flair This is a model trained using Flair to recognise mentions of UTEs (Unión Temporal de Empresas) and companies in public tenders. It is a finetune of the flair/ner-spanish-large model (retrained from scratch to include additional tags). Based on document-level XLM-R embeddings and FLERT. ## Demo: How to use in Flair Requires: Flair ('pip install flair') This yields the following output: and with the sentence "PODACESA OBRAS Y SERVICIOS, S.A realiza la siguiente oferta:" ## Training: Script to train this model The following Flair script was used to train this model (TODO: update): ## Evaluation Results ## Additional information ### Author The Language Technologies Unit from Barcelona Supercomputing Center. ### Contact For further information, please send an email to <langtech@URL>. ### Copyright Copyright(c) 2023 by Language Technologies Unit, Barcelona Supercomputing Center. ### License Apache License, Version 2.0 ### Funding This work has been promoted and financed by the European Commission Health and Digital Executive Agency, Connecting Europe Facility, Grant Agreement Nº INEA/CEF/ICT/A2020/2373713, Action Title Open Harmonized and Enriched Procurement Data Platform (nextProcurement), Action number 2020-ES-IA-0255. ### Disclaimer <details> <summary>Click to expand</summary> The model published in this repository is intended for a generalist purpose and is available to third parties under a permissive Apache License, Version 2.0. Be aware that the model may have biases and/or any other undesirable distortions. When third parties deploy or provide systems and/or services to other parties using this model (or any system based on it) or become users of the model, they should note that it is their responsibility to mitigate the risks arising from its use and, in any event, to comply with applicable regulations, including regulations regarding the use of Artificial Intelligence. In no event shall the owner and creator of the model (Barcelona Supercomputing Center) be liable for any results arising from the use made by third parties. </details>
[ "# Recognition of UTEs and company mentions in Flair\n\nThis is a model trained using Flair to recognise mentions of UTEs (Unión Temporal de Empresas) \nand companies in public tenders.\n\nIt is a finetune of the flair/ner-spanish-large model (retrained from scratch to include additional tags).\n\nBased on document-level XLM-R embeddings and FLERT.", "## Demo: How to use in Flair\n\nRequires: Flair ('pip install flair')\n\n\n\nThis yields the following output:\n\n\nand with the sentence \"PODACESA OBRAS Y SERVICIOS, S.A realiza la siguiente oferta:\"", "## Training: Script to train this model\n\nThe following Flair script was used to train this model (TODO: update):", "## Evaluation Results", "## Additional information", "### Author\nThe Language Technologies Unit from Barcelona Supercomputing Center.", "### Contact\nFor further information, please send an email to <langtech@URL>.", "### Copyright\nCopyright(c) 2023 by Language Technologies Unit, Barcelona Supercomputing Center.", "### License\nApache License, Version 2.0", "### Funding\nThis work has been promoted and financed by the European Commission Health and Digital Executive Agency, Connecting Europe Facility, Grant Agreement Nº INEA/CEF/ICT/A2020/2373713, Action Title Open Harmonized and Enriched Procurement Data Platform (nextProcurement), Action number 2020-ES-IA-0255.", "### Disclaimer\n<details>\n<summary>Click to expand</summary>\n\nThe model published in this repository is intended for a generalist purpose and is available to third parties under a permissive Apache License, Version 2.0. \n\nBe aware that the model may have biases and/or any other undesirable distortions.\n\nWhen third parties deploy or provide systems and/or services to other parties using this model (or any system based on it) \nor become users of the model, they should note that it is their responsibility to mitigate the risks arising from its use and, \nin any event, to comply with applicable regulations, including regulations regarding the use of Artificial Intelligence.\n\nIn no event shall the owner and creator of the model (Barcelona Supercomputing Center) \nbe liable for any results arising from the use made by third parties.\n\n</details>" ]
[ "TAGS\n#flair #pytorch #token-classification #sequence-tagger-model #es #dataset-conll2003 #dataset-BSC-LT/NextProcurement-NER-Spanish-UTE-Company-annotated #arxiv-2011.06993 #license-apache-2.0 #region-us \n", "# Recognition of UTEs and company mentions in Flair\n\nThis is a model trained using Flair to recognise mentions of UTEs (Unión Temporal de Empresas) \nand companies in public tenders.\n\nIt is a finetune of the flair/ner-spanish-large model (retrained from scratch to include additional tags).\n\nBased on document-level XLM-R embeddings and FLERT.", "## Demo: How to use in Flair\n\nRequires: Flair ('pip install flair')\n\n\n\nThis yields the following output:\n\n\nand with the sentence \"PODACESA OBRAS Y SERVICIOS, S.A realiza la siguiente oferta:\"", "## Training: Script to train this model\n\nThe following Flair script was used to train this model (TODO: update):", "## Evaluation Results", "## Additional information", "### Author\nThe Language Technologies Unit from Barcelona Supercomputing Center.", "### Contact\nFor further information, please send an email to <langtech@URL>.", "### Copyright\nCopyright(c) 2023 by Language Technologies Unit, Barcelona Supercomputing Center.", "### License\nApache License, Version 2.0", "### Funding\nThis work has been promoted and financed by the European Commission Health and Digital Executive Agency, Connecting Europe Facility, Grant Agreement Nº INEA/CEF/ICT/A2020/2373713, Action Title Open Harmonized and Enriched Procurement Data Platform (nextProcurement), Action number 2020-ES-IA-0255.", "### Disclaimer\n<details>\n<summary>Click to expand</summary>\n\nThe model published in this repository is intended for a generalist purpose and is available to third parties under a permissive Apache License, Version 2.0. \n\nBe aware that the model may have biases and/or any other undesirable distortions.\n\nWhen third parties deploy or provide systems and/or services to other parties using this model (or any system based on it) \nor become users of the model, they should note that it is their responsibility to mitigate the risks arising from its use and, \nin any event, to comply with applicable regulations, including regulations regarding the use of Artificial Intelligence.\n\nIn no event shall the owner and creator of the model (Barcelona Supercomputing Center) \nbe liable for any results arising from the use made by third parties.\n\n</details>" ]
[ 78, 84, 54, 26, 4, 4, 16, 21, 22, 11, 73, 178 ]
[ "TAGS\n#flair #pytorch #token-classification #sequence-tagger-model #es #dataset-conll2003 #dataset-BSC-LT/NextProcurement-NER-Spanish-UTE-Company-annotated #arxiv-2011.06993 #license-apache-2.0 #region-us \n# Recognition of UTEs and company mentions in Flair\n\nThis is a model trained using Flair to recognise mentions of UTEs (Unión Temporal de Empresas) \nand companies in public tenders.\n\nIt is a finetune of the flair/ner-spanish-large model (retrained from scratch to include additional tags).\n\nBased on document-level XLM-R embeddings and FLERT.## Demo: How to use in Flair\n\nRequires: Flair ('pip install flair')\n\n\n\nThis yields the following output:\n\n\nand with the sentence \"PODACESA OBRAS Y SERVICIOS, S.A realiza la siguiente oferta:\"## Training: Script to train this model\n\nThe following Flair script was used to train this model (TODO: update):## Evaluation Results## Additional information### Author\nThe Language Technologies Unit from Barcelona Supercomputing Center.### Contact\nFor further information, please send an email to <langtech@URL>.### Copyright\nCopyright(c) 2023 by Language Technologies Unit, Barcelona Supercomputing Center.### License\nApache License, Version 2.0### Funding\nThis work has been promoted and financed by the European Commission Health and Digital Executive Agency, Connecting Europe Facility, Grant Agreement Nº INEA/CEF/ICT/A2020/2373713, Action Title Open Harmonized and Enriched Procurement Data Platform (nextProcurement), Action number 2020-ES-IA-0255.### Disclaimer\n<details>\n<summary>Click to expand</summary>\n\nThe model published in this repository is intended for a generalist purpose and is available to third parties under a permissive Apache License, Version 2.0. \n\nBe aware that the model may have biases and/or any other undesirable distortions.\n\nWhen third parties deploy or provide systems and/or services to other parties using this model (or any system based on it) \nor become users of the model, they should note that it is their responsibility to mitigate the risks arising from its use and, \nin any event, to comply with applicable regulations, including regulations regarding the use of Artificial Intelligence.\n\nIn no event shall the owner and creator of the model (Barcelona Supercomputing Center) \nbe liable for any results arising from the use made by third parties.\n\n</details>" ]
null
peft
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # vissa_test This model is a fine-tuned version of [openthaigpt/openthaigpt-1.0.0-7b-chat](https://huggingface.co/openthaigpt/openthaigpt-1.0.0-7b-chat) on an unknown dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 16 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - lr_scheduler_warmup_ratio: 0.03 - training_steps: 180 - mixed_precision_training: Native AMP ### Training results ### Framework versions - PEFT 0.10.1.dev0 - Transformers 4.40.1 - Pytorch 2.2.1+cu121 - Datasets 2.19.0 - Tokenizers 0.19.1
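No loading snippet is shown above; a minimal sketch, assuming the repository stores a PEFT adapter for the openthaigpt base model named in the card (and that accelerate is installed for device_map), could be:

```python
from transformers import AutoTokenizer
from peft import AutoPeftModelForCausalLM

# Hedged sketch: assumes the repo holds a PEFT adapter for
# openthaigpt/openthaigpt-1.0.0-7b-chat; device_map="auto" needs accelerate.
model = AutoPeftModelForCausalLM.from_pretrained("Vissa15AI/vissa_test", device_map="auto")
tokenizer = AutoTokenizer.from_pretrained("openthaigpt/openthaigpt-1.0.0-7b-chat")

prompt = "สวัสดีครับ"  # placeholder prompt; the intended use case is not documented
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```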
{"license": "apache-2.0", "library_name": "peft", "tags": ["trl", "sft", "generated_from_trainer"], "base_model": "openthaigpt/openthaigpt-1.0.0-7b-chat", "model-index": [{"name": "vissa_test", "results": []}]}
Vissa15AI/vissa_test
null
[ "peft", "tensorboard", "safetensors", "trl", "sft", "generated_from_trainer", "base_model:openthaigpt/openthaigpt-1.0.0-7b-chat", "license:apache-2.0", "region:us" ]
null
2024-04-30T15:18:58+00:00
[]
[]
TAGS #peft #tensorboard #safetensors #trl #sft #generated_from_trainer #base_model-openthaigpt/openthaigpt-1.0.0-7b-chat #license-apache-2.0 #region-us
# vissa_test This model is a fine-tuned version of openthaigpt/openthaigpt-1.0.0-7b-chat on an unknown dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 16 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - lr_scheduler_warmup_ratio: 0.03 - training_steps: 180 - mixed_precision_training: Native AMP ### Training results ### Framework versions - PEFT 0.10.1.dev0 - Transformers 4.40.1 - Pytorch 2.2.1+cu121 - Datasets 2.19.0 - Tokenizers 0.19.1
[ "# vissa_test\n\nThis model is a fine-tuned version of openthaigpt/openthaigpt-1.0.0-7b-chat on an unknown dataset.", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0002\n- train_batch_size: 16\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 64\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_ratio: 0.03\n- training_steps: 180\n- mixed_precision_training: Native AMP", "### Training results", "### Framework versions\n\n- PEFT 0.10.1.dev0\n- Transformers 4.40.1\n- Pytorch 2.2.1+cu121\n- Datasets 2.19.0\n- Tokenizers 0.19.1" ]
[ "TAGS\n#peft #tensorboard #safetensors #trl #sft #generated_from_trainer #base_model-openthaigpt/openthaigpt-1.0.0-7b-chat #license-apache-2.0 #region-us \n", "# vissa_test\n\nThis model is a fine-tuned version of openthaigpt/openthaigpt-1.0.0-7b-chat on an unknown dataset.", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0002\n- train_batch_size: 16\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 64\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_ratio: 0.03\n- training_steps: 180\n- mixed_precision_training: Native AMP", "### Training results", "### Framework versions\n\n- PEFT 0.10.1.dev0\n- Transformers 4.40.1\n- Pytorch 2.2.1+cu121\n- Datasets 2.19.0\n- Tokenizers 0.19.1" ]
[ 60, 40, 7, 9, 9, 4, 135, 5, 55 ]
[ "TAGS\n#peft #tensorboard #safetensors #trl #sft #generated_from_trainer #base_model-openthaigpt/openthaigpt-1.0.0-7b-chat #license-apache-2.0 #region-us \n# vissa_test\n\nThis model is a fine-tuned version of openthaigpt/openthaigpt-1.0.0-7b-chat on an unknown dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 0.0002\n- train_batch_size: 16\n- eval_batch_size: 8\n- seed: 42\n- gradient_accumulation_steps: 4\n- total_train_batch_size: 64\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_ratio: 0.03\n- training_steps: 180\n- mixed_precision_training: Native AMP### Training results### Framework versions\n\n- PEFT 0.10.1.dev0\n- Transformers 4.40.1\n- Pytorch 2.2.1+cu121\n- Datasets 2.19.0\n- Tokenizers 0.19.1" ]
text-generation
transformers
# Uploaded model - **Developed by:** yadz45 - **License:** apache-2.0 - **Finetuned from model :** unsloth/llama-3-8b-bnb-4bit This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library. [<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
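A minimal inference sketch, assuming the usual Unsloth loading pattern (the sequence length and prompt below are placeholders, not values stated on this card):

```python
from unsloth import FastLanguageModel

# Hedged sketch: loads the uploaded checkpoint the way Unsloth notebooks do.
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="yadz45/IA_lo2",
    max_seq_length=2048,   # assumption, not stated on the card
    load_in_4bit=True,     # the base model is a 4-bit bitsandbytes checkpoint
)
FastLanguageModel.for_inference(model)  # enables the faster generation path

inputs = tokenizer(["Bonjour, présente-toi en une phrase."], return_tensors="pt").to("cuda")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True)[0])
```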
{"language": ["fr"], "license": "apache-2.0", "tags": ["text-generation-inference", "transformers", "unsloth", "llama", "trl"], "base_model": "unsloth/llama-3-8b-bnb-4bit", "pipeline_tag": "text-generation"}
yadz45/IA_lo2
null
[ "transformers", "safetensors", "text-generation-inference", "unsloth", "llama", "trl", "text-generation", "fr", "base_model:unsloth/llama-3-8b-bnb-4bit", "license:apache-2.0", "endpoints_compatible", "region:us" ]
null
2024-04-30T15:19:46+00:00
[]
[ "fr" ]
TAGS #transformers #safetensors #text-generation-inference #unsloth #llama #trl #text-generation #fr #base_model-unsloth/llama-3-8b-bnb-4bit #license-apache-2.0 #endpoints_compatible #region-us
# Uploaded model - Developed by: yadz45 - License: apache-2.0 - Finetuned from model : unsloth/llama-3-8b-bnb-4bit This llama model was trained 2x faster with Unsloth and Huggingface's TRL library. <img src="URL width="200"/>
[ "# Uploaded model\n\n- Developed by: yadz45\n- License: apache-2.0\n- Finetuned from model : unsloth/llama-3-8b-bnb-4bit\n\nThis llama model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>" ]
[ "TAGS\n#transformers #safetensors #text-generation-inference #unsloth #llama #trl #text-generation #fr #base_model-unsloth/llama-3-8b-bnb-4bit #license-apache-2.0 #endpoints_compatible #region-us \n", "# Uploaded model\n\n- Developed by: yadz45\n- License: apache-2.0\n- Finetuned from model : unsloth/llama-3-8b-bnb-4bit\n\nThis llama model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>" ]
[ 68, 81 ]
[ "TAGS\n#transformers #safetensors #text-generation-inference #unsloth #llama #trl #text-generation #fr #base_model-unsloth/llama-3-8b-bnb-4bit #license-apache-2.0 #endpoints_compatible #region-us \n# Uploaded model\n\n- Developed by: yadz45\n- License: apache-2.0\n- Finetuned from model : unsloth/llama-3-8b-bnb-4bit\n\nThis llama model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>" ]
reinforcement-learning
stable-baselines3
# **PPO** Agent playing **LunarLander-v2** This is a trained model of a **PPO** agent playing **LunarLander-v2** using the [stable-baselines3 library](https://github.com/DLR-RM/stable-baselines3). ## Usage (with Stable-baselines3) TODO: Add your code ```python from stable_baselines3 import ... from huggingface_sb3 import load_from_hub ... ```
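A minimal sketch of the usage left as TODO above, assuming the checkpoint was saved under the conventional filename `ppo-LunarLander-v2.zip`:

```python
import gymnasium as gym
from huggingface_sb3 import load_from_hub
from stable_baselines3 import PPO

# Hedged sketch: the filename below is an assumption; the card does not state it.
checkpoint = load_from_hub(
    repo_id="jchenmath/ppo-LunarLander-v2",
    filename="ppo-LunarLander-v2.zip",
)
model = PPO.load(checkpoint)

# Roll out one episode with the loaded policy
env = gym.make("LunarLander-v2")
obs, info = env.reset()
done = False
while not done:
    action, _states = model.predict(obs, deterministic=True)
    obs, reward, terminated, truncated, info = env.step(action)
    done = terminated or truncated
env.close()
```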
{"library_name": "stable-baselines3", "tags": ["LunarLander-v2", "deep-reinforcement-learning", "reinforcement-learning", "stable-baselines3"], "model-index": [{"name": "PPO", "results": [{"task": {"type": "reinforcement-learning", "name": "reinforcement-learning"}, "dataset": {"name": "LunarLander-v2", "type": "LunarLander-v2"}, "metrics": [{"type": "mean_reward", "value": "235.19 +/- 19.94", "name": "mean_reward", "verified": false}]}]}]}
jchenmath/ppo-LunarLander-v2
null
[ "stable-baselines3", "LunarLander-v2", "deep-reinforcement-learning", "reinforcement-learning", "model-index", "region:us" ]
null
2024-04-30T15:22:53+00:00
[]
[]
TAGS #stable-baselines3 #LunarLander-v2 #deep-reinforcement-learning #reinforcement-learning #model-index #region-us
# PPO Agent playing LunarLander-v2 This is a trained model of a PPO agent playing LunarLander-v2 using the stable-baselines3 library. ## Usage (with Stable-baselines3) TODO: Add your code
[ "# PPO Agent playing LunarLander-v2\nThis is a trained model of a PPO agent playing LunarLander-v2\nusing the stable-baselines3 library.", "## Usage (with Stable-baselines3)\nTODO: Add your code" ]
[ "TAGS\n#stable-baselines3 #LunarLander-v2 #deep-reinforcement-learning #reinforcement-learning #model-index #region-us \n", "# PPO Agent playing LunarLander-v2\nThis is a trained model of a PPO agent playing LunarLander-v2\nusing the stable-baselines3 library.", "## Usage (with Stable-baselines3)\nTODO: Add your code" ]
[ 31, 35, 17 ]
[ "TAGS\n#stable-baselines3 #LunarLander-v2 #deep-reinforcement-learning #reinforcement-learning #model-index #region-us \n# PPO Agent playing LunarLander-v2\nThis is a trained model of a PPO agent playing LunarLander-v2\nusing the stable-baselines3 library.## Usage (with Stable-baselines3)\nTODO: Add your code" ]
text-to-image
diffusers
# KrazyGlue The purpose of this model is to keep you glued to the screen generating images. Samples and prompts: ![Free AI image generator Crazy Glue samples](https://cdn-uploads.huggingface.co/production/uploads/63239b8370edc53f51cd5d42/vOnfjn9z67KFrOd7VJY6g.png) (Click for larger) Top left: a cute girl with freckles on her face, cgsociety unreal engine, wet t-shirt, short skirt, style of aenami alena, trending on artstartion, inspired by Fyodor Vasilyev, looks a bit similar to amy adams, emissive light, fluffy orange skin, dribbble, dramatic rendering Top right: 90s grainy vhs still young mother loose shirt, headband. holding a baby, on the couch, posing, bow. bokeh, bright lighting. smile Bottom left: dragon holds the sun upside down in the sky with her hands, she stands on a mountain, under the mountain there is a small town, illustration, sharp focus, very detailed, 8 k, hd Bottom right: Excellent quality and high resolution photo which depicts an old black panther in its natural habitat. The panther stands on four legs at a distance of 25 meters. Its torso is directed parallel to the lens. This is not a portrait, the panther is fully visible! ! ! The panther looks straight into the camera lens. Her appearance is menacing, majestic, but not aggressive. She looks wary but indifferent without fear. She shows that she has everything under control and she owns the situation. Back background in brown tones. Most likely the area is arid, something like a desert. However, the vegetation is still present. The quality of the photo is professional, taken by a professional photographer on the latest model of the camera in high resolution. InsaneRealistic by cordonsolution8 merged with the Hellmix model by Barons, Kitsch-In-Sync v2 by iamxenos, the cryptids lora by RIXYN, and artistic models with the CokeGirls lora by iamxenos. Original pages: https://civitai.com/models/108585?modelVersionId=116883 (InsaneRealistic) https://civitai.com/models/186251/coca-cola-gil-elvgrenhaddon-sundblom-pinup-style https://civitai.com/models/142552?modelVersionId=163068 (Kitsch-In-Sync v2) https://civitai.com/models/21493/hellmix?modelVersionId=25632 https://civitai.com/models/64766/cryptids?modelVersionId=69407 (Cryptids LoRA)
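A minimal generation sketch with diffusers, assuming default Stable Diffusion settings (the steps and guidance scale below are illustrative, not values from this page):

```python
import torch
from diffusers import StableDiffusionPipeline

# Hedged sketch: load the checkpoint and render a shortened version of the first sample prompt
pipe = StableDiffusionPipeline.from_pretrained("Yntec/KrazyGlue", torch_dtype=torch.float16)
pipe = pipe.to("cuda")

prompt = "a cute girl with freckles on her face, dramatic rendering"
image = pipe(prompt, num_inference_steps=30, guidance_scale=7.5).images[0]
image.save("krazyglue_sample.png")
```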
{"language": ["en"], "license": "creativeml-openrail-m", "library_name": "diffusers", "tags": ["Realism", "Girls", "Cute", "cordonsolution8", "iamxenos", "RIXYN", "Barons", "stable-diffusion", "stable-diffusion-diffusers", "diffusers", "text-to-image"], "pipeline_tag": "text-to-image"}
Yntec/KrazyGlue
null
[ "diffusers", "safetensors", "Realism", "Girls", "Cute", "cordonsolution8", "iamxenos", "RIXYN", "Barons", "stable-diffusion", "stable-diffusion-diffusers", "text-to-image", "en", "license:creativeml-openrail-m", "endpoints_compatible", "diffusers:StableDiffusionPipeline", "region:us", "has_space" ]
null
2024-04-30T15:23:01+00:00
[]
[ "en" ]
TAGS #diffusers #safetensors #Realism #Girls #Cute #cordonsolution8 #iamxenos #RIXYN #Barons #stable-diffusion #stable-diffusion-diffusers #text-to-image #en #license-creativeml-openrail-m #endpoints_compatible #diffusers-StableDiffusionPipeline #region-us #has_space
# KrazyGlue The purpose of this model is to keep you glued to the screen generating images. Samples and prompts: !Free AI image generator Crazy Glue samples (Click for larger) Top left: a cute girl with freckles on her face, cgsociety unreal engine, wet t-shirt, short skirt, style of aenami alena, trending on artstartion, inspired by Fyodor Vasilyev, looks a bit similar to amy adams, emissive light, fluffy orange skin, dribbble, dramatic rendering Top right: 90s grainy vhs still young mother loose shirt, headband. holding a baby, on the couch, posing, bow. bokeh, bright lighting. smile Bottom left: dragon holds the sun upside down in the sky with her hands, she stands on a mountain, under the mountain there is a small town, illustration, sharp focus, very detailed, 8 k, hd Bottom right: Excellent quality and high resolution photo which depicts an old black panther in its natural habitat. The panther stands on four legs at a distance of 25 meters. Its torso is directed parallel to the lens. This is not a portrait, the panther is fully visible! ! ! The panther looks straight into the camera lens. Her appearance is menacing, majestic, but not aggressive. She looks wary but indifferent without fear. She shows that she has everything under control and she owns the situation. Back background in brown tones. Most likely the area is arid, something like a desert. However, the vegetation is still present. The quality of the photo is professional, taken by a professional photographer on the latest model of the camera in high resolution. InsaneRealistic by cordonsolution8 merged with the Hellmix model by Barons, Kitsch-In-Sync v2 by iamxenos, the cryptids lora by RIXYN, and artistic models with the CokeGirls lora by iamxenos. Original pages: URL (InsaneRealistic) URL URL (Kitsch-In-Sync v2) URL URL (Cryptids LoRA)
[ "# KrazyGlue\n\nThe purpose of this model is to keep you glued to the screen generating images.\n\nSamples and prompts:\n\n!Free AI image generator Crazy Glue samples\n\n(Click for larger)\n\nTop left: a cute girl with freckles on her face, cgsociety unreal engine, wet t-shirt, short skirt, style of aenami alena, trending on artstartion, inspired by Fyodor Vasilyev, looks a bit similar to amy adams, emissive light, fluffy orange skin, dribbble, dramatic rendering\n\nTop right: 90s grainy vhs still young mother loose shirt, headband. holding a baby, on the couch, posing, bow. bokeh, bright lighting. smile\n\nBottom left: dragon holds the sun upside down in the sky with her hands, she stands on a mountain, under the mountain there is a small town, illustration, sharp focus, very detailed, 8 k, hd\n\nBottom right: Excellent quality and high resolution photo which depicts an old black panther in its natural habitat. The panther stands on four legs at a distance of 25 meters. Its torso is directed parallel to the lens. This is not a portrait, the panther is fully visible! ! ! The panther looks straight into the camera lens. Her appearance is menacing, majestic, but not aggressive. She looks wary but indifferent without fear. She shows that she has everything under control and she owns the situation. Back background in brown tones. Most likely the area is arid, something like a desert. However, the vegetation is still present. The quality of the photo is professional, taken by a professional photographer on the latest model of the camera in high resolution.\n\nInsaneRealistic by cordonsolution8 merged with the Hellmix model by Barons, Kitsch-In-Sync v2 by iamxenos, the cryptids lora by RIXYN, and artistic models with the CokeGirls lora by iamxenos.\n\nOriginal pages:\n\nURL (InsaneRealistic)\n\nURL\n\nURL (Kitsch-In-Sync v2)\n\nURL\n\nURL (Cryptids LoRA)" ]
[ "TAGS\n#diffusers #safetensors #Realism #Girls #Cute #cordonsolution8 #iamxenos #RIXYN #Barons #stable-diffusion #stable-diffusion-diffusers #text-to-image #en #license-creativeml-openrail-m #endpoints_compatible #diffusers-StableDiffusionPipeline #region-us #has_space \n", "# KrazyGlue\n\nThe purpose of this model is to keep you glued to the screen generating images.\n\nSamples and prompts:\n\n!Free AI image generator Crazy Glue samples\n\n(Click for larger)\n\nTop left: a cute girl with freckles on her face, cgsociety unreal engine, wet t-shirt, short skirt, style of aenami alena, trending on artstartion, inspired by Fyodor Vasilyev, looks a bit similar to amy adams, emissive light, fluffy orange skin, dribbble, dramatic rendering\n\nTop right: 90s grainy vhs still young mother loose shirt, headband. holding a baby, on the couch, posing, bow. bokeh, bright lighting. smile\n\nBottom left: dragon holds the sun upside down in the sky with her hands, she stands on a mountain, under the mountain there is a small town, illustration, sharp focus, very detailed, 8 k, hd\n\nBottom right: Excellent quality and high resolution photo which depicts an old black panther in its natural habitat. The panther stands on four legs at a distance of 25 meters. Its torso is directed parallel to the lens. This is not a portrait, the panther is fully visible! ! ! The panther looks straight into the camera lens. Her appearance is menacing, majestic, but not aggressive. She looks wary but indifferent without fear. She shows that she has everything under control and she owns the situation. Back background in brown tones. Most likely the area is arid, something like a desert. However, the vegetation is still present. The quality of the photo is professional, taken by a professional photographer on the latest model of the camera in high resolution.\n\nInsaneRealistic by cordonsolution8 merged with the Hellmix model by Barons, Kitsch-In-Sync v2 by iamxenos, the cryptids lora by RIXYN, and artistic models with the CokeGirls lora by iamxenos.\n\nOriginal pages:\n\nURL (InsaneRealistic)\n\nURL\n\nURL (Kitsch-In-Sync v2)\n\nURL\n\nURL (Cryptids LoRA)" ]
[ 83, 437 ]
[ "TAGS\n#diffusers #safetensors #Realism #Girls #Cute #cordonsolution8 #iamxenos #RIXYN #Barons #stable-diffusion #stable-diffusion-diffusers #text-to-image #en #license-creativeml-openrail-m #endpoints_compatible #diffusers-StableDiffusionPipeline #region-us #has_space \n# KrazyGlue\n\nThe purpose of this model is to keep you glued to the screen generating images.\n\nSamples and prompts:\n\n!Free AI image generator Crazy Glue samples\n\n(Click for larger)\n\nTop left: a cute girl with freckles on her face, cgsociety unreal engine, wet t-shirt, short skirt, style of aenami alena, trending on artstartion, inspired by Fyodor Vasilyev, looks a bit similar to amy adams, emissive light, fluffy orange skin, dribbble, dramatic rendering\n\nTop right: 90s grainy vhs still young mother loose shirt, headband. holding a baby, on the couch, posing, bow. bokeh, bright lighting. smile\n\nBottom left: dragon holds the sun upside down in the sky with her hands, she stands on a mountain, under the mountain there is a small town, illustration, sharp focus, very detailed, 8 k, hd\n\nBottom right: Excellent quality and high resolution photo which depicts an old black panther in its natural habitat. The panther stands on four legs at a distance of 25 meters. Its torso is directed parallel to the lens. This is not a portrait, the panther is fully visible! ! ! The panther looks straight into the camera lens. Her appearance is menacing, majestic, but not aggressive. She looks wary but indifferent without fear. She shows that she has everything under control and she owns the situation. Back background in brown tones. Most likely the area is arid, something like a desert. However, the vegetation is still present. The quality of the photo is professional, taken by a professional photographer on the latest model of the camera in high resolution.\n\nInsaneRealistic by cordonsolution8 merged with the Hellmix model by Barons, Kitsch-In-Sync v2 by iamxenos, the cryptids lora by RIXYN, and artistic models with the CokeGirls lora by iamxenos.\n\nOriginal pages:\n\nURL (InsaneRealistic)\n\nURL\n\nURL (Kitsch-In-Sync v2)\n\nURL\n\nURL (Cryptids LoRA)" ]
text2text-generation
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
Luluuu/0429_SEASON_baseline_checkpoint_4854
null
[ "transformers", "safetensors", "bart", "text2text-generation", "arxiv:1910.09700", "autotrain_compatible", "endpoints_compatible", "region:us" ]
null
2024-04-30T15:23:16+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #bart #text2text-generation #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #bart #text2text-generation #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ 39, 6, 4, 75, 23, 3, 5, 8, 9, 8, 34, 20, 4, 5, 5, 11, 13, 12, 3, 10, 6, 5, 6, 4, 5, 7, 49, 7, 7, 5, 5, 15, 7, 7, 8, 5 ]
[ "TAGS\n#transformers #safetensors #bart #text2text-generation #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact" ]
null
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
usernameisanna/blip2-deit-seed-epoch-1
null
[ "transformers", "safetensors", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
null
2024-04-30T15:26:11+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ 26, 6, 4, 75, 23, 3, 5, 8, 9, 8, 34, 20, 4, 5, 5, 11, 13, 12, 3, 10, 6, 5, 6, 4, 5, 7, 49, 7, 7, 5, 5, 15, 7, 7, 8, 5 ]
[ "TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact" ]
text-generation
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # selfbiorag-7b-dpo-full-sft-wo-kqa_silver_wogold This model is a fine-tuned version of [Minbyul/selfbiorag-7b-wo-kqa_silver_wogold-sft](https://huggingface.co/Minbyul/selfbiorag-7b-wo-kqa_silver_wogold-sft) on the HuggingFaceH4/ultrafeedback_binarized dataset. It achieves the following results on the evaluation set: - Loss: 0.3027 - Rewards/chosen: -1.2987 - Rewards/rejected: -5.2535 - Rewards/accuracies: 0.8697 - Rewards/margins: 3.9549 - Logps/rejected: -1323.0533 - Logps/chosen: -663.1464 - Logits/rejected: -0.3508 - Logits/chosen: -0.2498 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-06 - train_batch_size: 8 - eval_batch_size: 8 - seed: 42 - distributed_type: multi-GPU - num_devices: 4 - gradient_accumulation_steps: 2 - total_train_batch_size: 64 - total_eval_batch_size: 32 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 1 ### Training results | Training Loss | Epoch | Step | Validation Loss | Rewards/chosen | Rewards/rejected | Rewards/accuracies | Rewards/margins | Logps/rejected | Logps/chosen | Logits/rejected | Logits/chosen | |:-------------:|:-----:|:----:|:---------------:|:--------------:|:----------------:|:------------------:|:---------------:|:--------------:|:------------:|:---------------:|:-------------:| | 0.2357 | 0.35 | 100 | 0.3364 | -0.8114 | -3.4524 | 0.8579 | 2.6410 | -1142.9419 | -614.4251 | -0.0367 | 0.0065 | | 0.154 | 0.71 | 200 | 0.3009 | -0.9868 | -4.1905 | 0.8632 | 3.2037 | -1216.7521 | -631.9589 | -0.3156 | -0.2396 | ### Framework versions - Transformers 4.39.0.dev0 - Pytorch 2.1.2 - Datasets 2.14.6 - Tokenizers 0.15.2
{"tags": ["alignment-handbook", "trl", "dpo", "generated_from_trainer", "trl", "dpo", "generated_from_trainer"], "datasets": ["HuggingFaceH4/ultrafeedback_binarized"], "base_model": "Minbyul/selfbiorag-7b-wo-kqa_silver_wogold-sft", "model-index": [{"name": "selfbiorag-7b-dpo-full-sft-wo-kqa_silver_wogold", "results": []}]}
Minbyul/selfbiorag-7b-dpo-full-sft-wo-kqa_silver_wogold
null
[ "transformers", "safetensors", "llama", "text-generation", "alignment-handbook", "trl", "dpo", "generated_from_trainer", "dataset:HuggingFaceH4/ultrafeedback_binarized", "base_model:Minbyul/selfbiorag-7b-wo-kqa_silver_wogold-sft", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2024-04-30T15:27:21+00:00
[]
[]
TAGS #transformers #safetensors #llama #text-generation #alignment-handbook #trl #dpo #generated_from_trainer #dataset-HuggingFaceH4/ultrafeedback_binarized #base_model-Minbyul/selfbiorag-7b-wo-kqa_silver_wogold-sft #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
selfbiorag-7b-dpo-full-sft-wo-kqa\_silver\_wogold ================================================= This model is a fine-tuned version of Minbyul/selfbiorag-7b-wo-kqa\_silver\_wogold-sft on the HuggingFaceH4/ultrafeedback\_binarized dataset. It achieves the following results on the evaluation set: * Loss: 0.3027 * Rewards/chosen: -1.2987 * Rewards/rejected: -5.2535 * Rewards/accuracies: 0.8697 * Rewards/margins: 3.9549 * Logps/rejected: -1323.0533 * Logps/chosen: -663.1464 * Logits/rejected: -0.3508 * Logits/chosen: -0.2498 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 5e-06 * train\_batch\_size: 8 * eval\_batch\_size: 8 * seed: 42 * distributed\_type: multi-GPU * num\_devices: 4 * gradient\_accumulation\_steps: 2 * total\_train\_batch\_size: 64 * total\_eval\_batch\_size: 32 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: cosine * lr\_scheduler\_warmup\_ratio: 0.1 * num\_epochs: 1 ### Training results ### Framework versions * Transformers 4.39.0.dev0 * Pytorch 2.1.2 * Datasets 2.14.6 * Tokenizers 0.15.2
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-06\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* distributed\\_type: multi-GPU\n* num\\_devices: 4\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 64\n* total\\_eval\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 1", "### Training results", "### Framework versions\n\n\n* Transformers 4.39.0.dev0\n* Pytorch 2.1.2\n* Datasets 2.14.6\n* Tokenizers 0.15.2" ]
[ "TAGS\n#transformers #safetensors #llama #text-generation #alignment-handbook #trl #dpo #generated_from_trainer #dataset-HuggingFaceH4/ultrafeedback_binarized #base_model-Minbyul/selfbiorag-7b-wo-kqa_silver_wogold-sft #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-06\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* distributed\\_type: multi-GPU\n* num\\_devices: 4\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 64\n* total\\_eval\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 1", "### Training results", "### Framework versions\n\n\n* Transformers 4.39.0.dev0\n* Pytorch 2.1.2\n* Datasets 2.14.6\n* Tokenizers 0.15.2" ]
[ 95, 176, 5, 43 ]
[ "TAGS\n#transformers #safetensors #llama #text-generation #alignment-handbook #trl #dpo #generated_from_trainer #dataset-HuggingFaceH4/ultrafeedback_binarized #base_model-Minbyul/selfbiorag-7b-wo-kqa_silver_wogold-sft #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-06\n* train\\_batch\\_size: 8\n* eval\\_batch\\_size: 8\n* seed: 42\n* distributed\\_type: multi-GPU\n* num\\_devices: 4\n* gradient\\_accumulation\\_steps: 2\n* total\\_train\\_batch\\_size: 64\n* total\\_eval\\_batch\\_size: 32\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: cosine\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 1### Training results### Framework versions\n\n\n* Transformers 4.39.0.dev0\n* Pytorch 2.1.2\n* Datasets 2.14.6\n* Tokenizers 0.15.2" ]
null
null
Number of experts present in the library: 8 | Expert Name | Base Model | Trained on | Adapter Type | | --- | --- | --- | --- | | e0 | phi-2 | sordonia/flan-10k-flat/None | skilled_lora | | e1 | phi-2 | sordonia/flan-10k-flat/None | skilled_lora | | e2 | phi-2 | sordonia/flan-10k-flat/None | skilled_lora | | e3 | phi-2 | sordonia/flan-10k-flat/None | skilled_lora | | e4 | phi-2 | sordonia/flan-10k-flat/None | skilled_lora | | e5 | phi-2 | sordonia/flan-10k-flat/None | skilled_lora | | e6 | phi-2 | sordonia/flan-10k-flat/None | skilled_lora | | e7 | phi-2 | sordonia/flan-10k-flat/None | skilled_lora | Last updated on: 2024-04-30 15:27:43+00:00
{}
pclucas14/phi2_mhr_S32_1ep
null
[ "region:us" ]
null
2024-04-30T15:27:43+00:00
[]
[]
TAGS #region-us
Number of experts present in the library: 8
[]
[ "TAGS\n#region-us \n" ]
[ 5 ]
[ "TAGS\n#region-us \n" ]
fill-mask
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # tda_2001_16_distilbert This model is a fine-tuned version of [distilbert-base-cased](https://huggingface.co/distilbert-base-cased) on the None dataset. It achieves the following results on the evaluation set: - Loss: 1.2095 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:-----:|:----:|:---------------:| | No log | 1.0 | 128 | 1.2329 | | No log | 2.0 | 256 | 1.2978 | | No log | 3.0 | 384 | 1.2600 | | 1.3691 | 4.0 | 512 | 1.2740 | | 1.3691 | 5.0 | 640 | 1.1886 | | 1.3691 | 6.0 | 768 | 1.2319 | | 1.3691 | 7.0 | 896 | 1.2037 | | 1.2783 | 8.0 | 1024 | 1.2322 | | 1.2783 | 9.0 | 1152 | 1.2019 | | 1.2783 | 10.0 | 1280 | 1.2095 | ### Framework versions - Transformers 4.40.1 - Pytorch 2.2.1+cu121 - Datasets 2.19.0 - Tokenizers 0.19.1
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "base_model": "distilbert-base-cased", "model-index": [{"name": "tda_2001_16_distilbert", "results": []}]}
gc394/tda_2001_16_distilbert
null
[ "transformers", "tensorboard", "safetensors", "distilbert", "fill-mask", "generated_from_trainer", "base_model:distilbert-base-cased", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
null
2024-04-30T15:28:38+00:00
[]
[]
TAGS #transformers #tensorboard #safetensors #distilbert #fill-mask #generated_from_trainer #base_model-distilbert-base-cased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
tda\_2001\_16\_distilbert ========================= This model is a fine-tuned version of distilbert-base-cased on the None dataset. It achieves the following results on the evaluation set: * Loss: 1.2095 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 2e-05 * train\_batch\_size: 16 * eval\_batch\_size: 16 * seed: 42 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * num\_epochs: 10 * mixed\_precision\_training: Native AMP ### Training results ### Framework versions * Transformers 4.40.1 * Pytorch 2.2.1+cu121 * Datasets 2.19.0 * Tokenizers 0.19.1
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 10\n* mixed\\_precision\\_training: Native AMP", "### Training results", "### Framework versions\n\n\n* Transformers 4.40.1\n* Pytorch 2.2.1+cu121\n* Datasets 2.19.0\n* Tokenizers 0.19.1" ]
[ "TAGS\n#transformers #tensorboard #safetensors #distilbert #fill-mask #generated_from_trainer #base_model-distilbert-base-cased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 10\n* mixed\\_precision\\_training: Native AMP", "### Training results", "### Framework versions\n\n\n* Transformers 4.40.1\n* Pytorch 2.2.1+cu121\n* Datasets 2.19.0\n* Tokenizers 0.19.1" ]
[ 59, 112, 5, 44 ]
[ "TAGS\n#transformers #tensorboard #safetensors #distilbert #fill-mask #generated_from_trainer #base_model-distilbert-base-cased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 10\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* Transformers 4.40.1\n* Pytorch 2.2.1+cu121\n* Datasets 2.19.0\n* Tokenizers 0.19.1" ]
feature-extraction
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
gutsartificial/RecipeBert
null
[ "transformers", "safetensors", "bert", "feature-extraction", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
null
2024-04-30T15:30:14+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #bert #feature-extraction #arxiv-1910.09700 #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #bert #feature-extraction #arxiv-1910.09700 #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ 32, 6, 4, 75, 23, 3, 5, 8, 9, 8, 34, 20, 4, 5, 5, 11, 13, 12, 3, 10, 6, 5, 6, 4, 5, 7, 49, 7, 7, 5, 5, 15, 7, 7, 8, 5 ]
[ "TAGS\n#transformers #safetensors #bert #feature-extraction #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact" ]
null
transformers
## About <!-- ### quantize_version: 2 --> <!-- ### output_tensor_quantised: 1 --> <!-- ### convert_type: --> <!-- ### vocab_type: --> weighted/imatrix quants of https://huggingface.co/elinas/Llama-3-8B-Ultra-Instruct You should use `--override-kv tokenizer.ggml.pre=str:llama3` and a current llama.cpp version to work around a bug in llama.cpp that made these quants. (see https://old.reddit.com/r/LocalLLaMA/comments/1cg0z1i/bpe_pretokenization_support_is_now_merged_llamacpp/?share_id=5dBFB9x0cOJi8vbr-Murh) <!-- provided-files --> static quants are available at https://huggingface.co/mradermacher/Llama-3-8B-Ultra-Instruct-GGUF ## Usage If you are unsure how to use GGUF files, refer to one of [TheBloke's READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for more details, including on how to concatenate multi-part files. ## Provided Quants (sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants) | Link | Type | Size/GB | Notes | |:-----|:-----|--------:|:------| | [GGUF](https://huggingface.co/mradermacher/Llama-3-8B-Ultra-Instruct-i1-GGUF/resolve/main/Llama-3-8B-Ultra-Instruct.i1-IQ1_S.gguf) | i1-IQ1_S | 2.1 | for the desperate | | [GGUF](https://huggingface.co/mradermacher/Llama-3-8B-Ultra-Instruct-i1-GGUF/resolve/main/Llama-3-8B-Ultra-Instruct.i1-IQ1_M.gguf) | i1-IQ1_M | 2.3 | for the desperate | | [GGUF](https://huggingface.co/mradermacher/Llama-3-8B-Ultra-Instruct-i1-GGUF/resolve/main/Llama-3-8B-Ultra-Instruct.i1-IQ2_XXS.gguf) | i1-IQ2_XXS | 2.5 | | | [GGUF](https://huggingface.co/mradermacher/Llama-3-8B-Ultra-Instruct-i1-GGUF/resolve/main/Llama-3-8B-Ultra-Instruct.i1-IQ2_XS.gguf) | i1-IQ2_XS | 2.7 | | | [GGUF](https://huggingface.co/mradermacher/Llama-3-8B-Ultra-Instruct-i1-GGUF/resolve/main/Llama-3-8B-Ultra-Instruct.i1-IQ2_S.gguf) | i1-IQ2_S | 2.9 | | | [GGUF](https://huggingface.co/mradermacher/Llama-3-8B-Ultra-Instruct-i1-GGUF/resolve/main/Llama-3-8B-Ultra-Instruct.i1-IQ2_M.gguf) | i1-IQ2_M | 3.0 | | | [GGUF](https://huggingface.co/mradermacher/Llama-3-8B-Ultra-Instruct-i1-GGUF/resolve/main/Llama-3-8B-Ultra-Instruct.i1-Q2_K.gguf) | i1-Q2_K | 3.3 | IQ3_XXS probably better | | [GGUF](https://huggingface.co/mradermacher/Llama-3-8B-Ultra-Instruct-i1-GGUF/resolve/main/Llama-3-8B-Ultra-Instruct.i1-IQ3_XXS.gguf) | i1-IQ3_XXS | 3.4 | lower quality | | [GGUF](https://huggingface.co/mradermacher/Llama-3-8B-Ultra-Instruct-i1-GGUF/resolve/main/Llama-3-8B-Ultra-Instruct.i1-IQ3_XS.gguf) | i1-IQ3_XS | 3.6 | | | [GGUF](https://huggingface.co/mradermacher/Llama-3-8B-Ultra-Instruct-i1-GGUF/resolve/main/Llama-3-8B-Ultra-Instruct.i1-Q3_K_S.gguf) | i1-Q3_K_S | 3.8 | IQ3_XS probably better | | [GGUF](https://huggingface.co/mradermacher/Llama-3-8B-Ultra-Instruct-i1-GGUF/resolve/main/Llama-3-8B-Ultra-Instruct.i1-IQ3_S.gguf) | i1-IQ3_S | 3.8 | beats Q3_K* | | [GGUF](https://huggingface.co/mradermacher/Llama-3-8B-Ultra-Instruct-i1-GGUF/resolve/main/Llama-3-8B-Ultra-Instruct.i1-IQ3_M.gguf) | i1-IQ3_M | 3.9 | | | [GGUF](https://huggingface.co/mradermacher/Llama-3-8B-Ultra-Instruct-i1-GGUF/resolve/main/Llama-3-8B-Ultra-Instruct.i1-Q3_K_M.gguf) | i1-Q3_K_M | 4.1 | IQ3_S probably better | | [GGUF](https://huggingface.co/mradermacher/Llama-3-8B-Ultra-Instruct-i1-GGUF/resolve/main/Llama-3-8B-Ultra-Instruct.i1-Q3_K_L.gguf) | i1-Q3_K_L | 4.4 | IQ3_M probably better | | [GGUF](https://huggingface.co/mradermacher/Llama-3-8B-Ultra-Instruct-i1-GGUF/resolve/main/Llama-3-8B-Ultra-Instruct.i1-IQ4_XS.gguf) | i1-IQ4_XS | 4.5 | | | 
[GGUF](https://huggingface.co/mradermacher/Llama-3-8B-Ultra-Instruct-i1-GGUF/resolve/main/Llama-3-8B-Ultra-Instruct.i1-Q4_0.gguf) | i1-Q4_0 | 4.8 | fast, low quality | | [GGUF](https://huggingface.co/mradermacher/Llama-3-8B-Ultra-Instruct-i1-GGUF/resolve/main/Llama-3-8B-Ultra-Instruct.i1-Q4_K_S.gguf) | i1-Q4_K_S | 4.8 | optimal size/speed/quality | | [GGUF](https://huggingface.co/mradermacher/Llama-3-8B-Ultra-Instruct-i1-GGUF/resolve/main/Llama-3-8B-Ultra-Instruct.i1-Q4_K_M.gguf) | i1-Q4_K_M | 5.0 | fast, recommended | | [GGUF](https://huggingface.co/mradermacher/Llama-3-8B-Ultra-Instruct-i1-GGUF/resolve/main/Llama-3-8B-Ultra-Instruct.i1-Q5_K_S.gguf) | i1-Q5_K_S | 5.7 | | | [GGUF](https://huggingface.co/mradermacher/Llama-3-8B-Ultra-Instruct-i1-GGUF/resolve/main/Llama-3-8B-Ultra-Instruct.i1-Q5_K_M.gguf) | i1-Q5_K_M | 5.8 | | | [GGUF](https://huggingface.co/mradermacher/Llama-3-8B-Ultra-Instruct-i1-GGUF/resolve/main/Llama-3-8B-Ultra-Instruct.i1-Q6_K.gguf) | i1-Q6_K | 6.7 | practically like static Q6_K | Here is a handy graph by ikawrakow comparing some lower-quality quant types (lower is better): ![image.png](https://www.nethype.de/huggingface_embed/quantpplgraph.png) And here are Artefact2's thoughts on the matter: https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9 ## FAQ / Model Request See https://huggingface.co/mradermacher/model_requests for some answers to questions you might have and/or if you want some other model quantized. ## Thanks I thank my company, [nethype GmbH](https://www.nethype.de/), for letting me use its servers and providing upgrades to my workstation to enable this work in my free time. <!-- end -->
{"language": ["en"], "license": "llama3", "library_name": "transformers", "tags": ["mergekit", "merge"], "base_model": "elinas/Llama-3-8B-Ultra-Instruct", "quantized_by": "mradermacher"}
mradermacher/Llama-3-8B-Ultra-Instruct-i1-GGUF
null
[ "transformers", "gguf", "mergekit", "merge", "en", "base_model:elinas/Llama-3-8B-Ultra-Instruct", "license:llama3", "endpoints_compatible", "region:us" ]
null
2024-04-30T15:31:26+00:00
[]
[ "en" ]
TAGS #transformers #gguf #mergekit #merge #en #base_model-elinas/Llama-3-8B-Ultra-Instruct #license-llama3 #endpoints_compatible #region-us
About ----- weighted/imatrix quants of URL You should use '--override-kv URL=str:llama3' and a current URL version to work around a bug in URL that made these quants. (see URL static quants are available at URL Usage ----- If you are unsure how to use GGUF files, refer to one of TheBloke's READMEs for more details, including on how to concatenate multi-part files. Provided Quants --------------- (sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants) Here is a handy graph by ikawrakow comparing some lower-quality quant types (lower is better): !URL And here are Artefact2's thoughts on the matter: URL FAQ / Model Request ------------------- See URL for some answers to questions you might have and/or if you want some other model quantized. Thanks ------ I thank my company, nethype GmbH, for letting me use its servers and providing upgrades to my workstation to enable this work in my free time.
[]
[ "TAGS\n#transformers #gguf #mergekit #merge #en #base_model-elinas/Llama-3-8B-Ultra-Instruct #license-llama3 #endpoints_compatible #region-us \n" ]
[ 49 ]
[ "TAGS\n#transformers #gguf #mergekit #merge #en #base_model-elinas/Llama-3-8B-Ultra-Instruct #license-llama3 #endpoints_compatible #region-us \n" ]
null
transformers
`Epoch = 1, Global Step = 7669` # Training Script ``` python pretrain.py \ --data-path "chart-rela-ins/unichart-pretrain-table-extraction-v2" \ --train-images "/workspace/data/UniChartImages/" \ --output-dir "/workspace/output_data/unichart_pretrain/" \ --max-steps 1600000 \ --batch-size 2 \ --valid-batch-size 2 \ --num-workers 12 \ --lr 5e-5 \ --log-every-n-steps 100 \ --val-check-interval 0.5 \ --warmup-steps 8000 \ --checkpoint-steps 40000 \ --accumulate-grad-batches 64 \ --processor-path "ahmed-masry/unichart-base-960" \ --image-size 512 \ --pretrained-vision-encoder "nxquang-al/unichart-base-960-encoder" \ --pretrained-decoder "nxquang-al/unichart-base-960-decoder" \ --wandb-project "Pretrain-ChartReLA-Instruct" \ ```
{"library_name": "transformers", "tags": []}
chart-rela-ins/pretrain-small-unichart-table-bs64-low-lr
null
[ "transformers", "safetensors", "Chart-rela-instruct", "endpoints_compatible", "region:us" ]
null
2024-04-30T15:33:14+00:00
[]
[]
TAGS #transformers #safetensors #Chart-rela-instruct #endpoints_compatible #region-us
'Epoch = 1, Global Step = 7669' # Training Script
[ "# Training Script" ]
[ "TAGS\n#transformers #safetensors #Chart-rela-instruct #endpoints_compatible #region-us \n", "# Training Script" ]
[ 24, 3 ]
[ "TAGS\n#transformers #safetensors #Chart-rela-instruct #endpoints_compatible #region-us \n# Training Script" ]
null
peft
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # llava-1.5-7b-hf-ft-mix-vsft This model is a fine-tuned version of [llava-hf/llava-1.5-7b-hf](https://huggingface.co/llava-hf/llava-1.5-7b-hf) on an unknown dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 2 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 - mixed_precision_training: Native AMP ### Training results ### Framework versions - PEFT 0.10.0 - Transformers 4.40.1 - Pytorch 2.1.2 - Datasets 2.18.0 - Tokenizers 0.19.1
{"library_name": "peft", "tags": ["trl", "sft", "generated_from_trainer"], "base_model": "llava-hf/llava-1.5-7b-hf", "model-index": [{"name": "llava-1.5-7b-hf-ft-mix-vsft", "results": []}]}
GURU369/llava-1.5-7b-hf-ft-mix-vsft
null
[ "peft", "tensorboard", "safetensors", "trl", "sft", "generated_from_trainer", "base_model:llava-hf/llava-1.5-7b-hf", "region:us" ]
null
2024-04-30T15:33:22+00:00
[]
[]
TAGS #peft #tensorboard #safetensors #trl #sft #generated_from_trainer #base_model-llava-hf/llava-1.5-7b-hf #region-us
# llava-1.5-7b-hf-ft-mix-vsft This model is a fine-tuned version of llava-hf/llava-1.5-7b-hf on an unknown dataset. ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 2 - eval_batch_size: 8 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 10 - mixed_precision_training: Native AMP ### Training results ### Framework versions - PEFT 0.10.0 - Transformers 4.40.1 - Pytorch 2.1.2 - Datasets 2.18.0 - Tokenizers 0.19.1
[ "# llava-1.5-7b-hf-ft-mix-vsft\n\nThis model is a fine-tuned version of llava-hf/llava-1.5-7b-hf on an unknown dataset.", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 2\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 10\n- mixed_precision_training: Native AMP", "### Training results", "### Framework versions\n\n- PEFT 0.10.0\n- Transformers 4.40.1\n- Pytorch 2.1.2\n- Datasets 2.18.0\n- Tokenizers 0.19.1" ]
[ "TAGS\n#peft #tensorboard #safetensors #trl #sft #generated_from_trainer #base_model-llava-hf/llava-1.5-7b-hf #region-us \n", "# llava-1.5-7b-hf-ft-mix-vsft\n\nThis model is a fine-tuned version of llava-hf/llava-1.5-7b-hf on an unknown dataset.", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 2\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 10\n- mixed_precision_training: Native AMP", "### Training results", "### Framework versions\n\n- PEFT 0.10.0\n- Transformers 4.40.1\n- Pytorch 2.1.2\n- Datasets 2.18.0\n- Tokenizers 0.19.1" ]
[ 50, 53, 7, 9, 9, 4, 102, 5, 48 ]
[ "TAGS\n#peft #tensorboard #safetensors #trl #sft #generated_from_trainer #base_model-llava-hf/llava-1.5-7b-hf #region-us \n# llava-1.5-7b-hf-ft-mix-vsft\n\nThis model is a fine-tuned version of llava-hf/llava-1.5-7b-hf on an unknown dataset.## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 5e-05\n- train_batch_size: 2\n- eval_batch_size: 8\n- seed: 42\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: linear\n- num_epochs: 10\n- mixed_precision_training: Native AMP### Training results### Framework versions\n\n- PEFT 0.10.0\n- Transformers 4.40.1\n- Pytorch 2.1.2\n- Datasets 2.18.0\n- Tokenizers 0.19.1" ]
null
null
Neo_AD_MotionLora_V1 [Watch the Video](https://huggingface.co/Neofuturism/Neo_AD_MotionLora_V1/blob/main/MOTION_LORA.mp4) 5 Motion LoRAs trained on AnimateDiff V3 1. Neo360Pan: Slow 360 pan around center 2. Neo360Pan_B: Same as above, subjectively smoother output 3. NeoPanUp: Background and foreground sliding upward 4. NeoPanDown: Background and foreground sliding downward 5. NeoMoveIn: Slow move forward
{"license": "apache-2.0"}
Neofuturism/Neo_AD_MotionLora_V1
null
[ "license:apache-2.0", "region:us" ]
null
2024-04-30T15:35:48+00:00
[]
[]
TAGS #license-apache-2.0 #region-us
Neo_AD_MotionLora_V1 Watch the Video 5 Motion LoRAs trained on AnimateDiff V3 1. Neo360Pan: Slow 360 pan around center 2. Neo360Pan_B: Same as above, subjectively smoother output 3. NeoPanUp: Background and foreground sliding upward 4. NeoPanDown: Background and foreground sliding downward 5. NeoMoveIn: Slow move forward
[]
[ "TAGS\n#license-apache-2.0 #region-us \n" ]
[ 13 ]
[ "TAGS\n#license-apache-2.0 #region-us \n" ]
text-generation
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/emillykkejensen/LLM-instruct/runs/v8wdcn55) # Llama-3-8B-instruct-dansk This model is a fine-tuned version of [meta-llama/Meta-Llama-3-8B](https://huggingface.co/meta-llama/Meta-Llama-3-8B) on the [kobprof/skolegpt-instruct](https://huggingface.co/datasets/kobprof/skolegpt-instruct) dataset. It achieves the following results on the evaluation set: - Loss: 0.9477 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 1 - eval_batch_size: 1 - seed: 42 - distributed_type: multi-GPU - num_devices: 4 - total_train_batch_size: 4 - total_eval_batch_size: 4 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - lr_scheduler_warmup_ratio: 0.2 - num_epochs: 1 ### Training results ### Framework versions - Transformers 4.41.0.dev0 - Pytorch 2.2.0 - Datasets 2.19.0 - Tokenizers 0.19.1
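The card stops at the training recipe, so the following is a minimal usage sketch for the repo id given in this record. It assumes the uploaded tokenizer carries a Llama-3-style chat template (the model was SFT-trained with TRL); if it does not, fall back to a plain prompt string. The Danish question is purely illustrative.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "emillykkejensen/Llama-3-8B-instruct-dansk"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16, device_map="auto")

# "Explain briefly what photosynthesis is." in Danish
messages = [{"role": "user", "content": "Forklar kort, hvad fotosyntese er."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=200, do_sample=True, temperature=0.7)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```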
{"language": ["da"], "license": "other", "library_name": "transformers", "tags": ["trl", "sft", "generated_from_trainer", "danish"], "datasets": ["kobprof/skolegpt-instruct"], "base_model": "meta-llama/Meta-Llama-3-8B", "pipeline_tag": "text-generation", "model-index": [{"name": "Llama-3-8B-instruct-dansk", "results": []}]}
emillykkejensen/Llama-3-8B-instruct-dansk
null
[ "transformers", "safetensors", "llama", "text-generation", "trl", "sft", "generated_from_trainer", "danish", "da", "dataset:kobprof/skolegpt-instruct", "base_model:meta-llama/Meta-Llama-3-8B", "license:other", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2024-04-30T15:37:56+00:00
[]
[ "da" ]
TAGS #transformers #safetensors #llama #text-generation #trl #sft #generated_from_trainer #danish #da #dataset-kobprof/skolegpt-instruct #base_model-meta-llama/Meta-Llama-3-8B #license-other #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
<img src="URL" alt="Visualize in Weights & Biases" width="200" height="32"/> # Llama-3-8B-instruct-dansk This model is a fine-tuned version of meta-llama/Meta-Llama-3-8B on the kobprof/skolegpt-instruct dataset. It achieves the following results on the evaluation set: - Loss: 0.9477 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 1 - eval_batch_size: 1 - seed: 42 - distributed_type: multi-GPU - num_devices: 4 - total_train_batch_size: 4 - total_eval_batch_size: 4 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: cosine - lr_scheduler_warmup_ratio: 0.2 - num_epochs: 1 ### Training results ### Framework versions - Transformers 4.41.0.dev0 - Pytorch 2.2.0 - Datasets 2.19.0 - Tokenizers 0.19.1
[ "# Llama-3-8B-instruct-dansk\n\nThis model is a fine-tuned version of meta-llama/Meta-Llama-3-8B on the kobprof/skolegpt-instruct dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 0.9477", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 1\n- eval_batch_size: 1\n- seed: 42\n- distributed_type: multi-GPU\n- num_devices: 4\n- total_train_batch_size: 4\n- total_eval_batch_size: 4\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_ratio: 0.2\n- num_epochs: 1", "### Training results", "### Framework versions\n\n- Transformers 4.41.0.dev0\n- Pytorch 2.2.0\n- Datasets 2.19.0\n- Tokenizers 0.19.1" ]
[ "TAGS\n#transformers #safetensors #llama #text-generation #trl #sft #generated_from_trainer #danish #da #dataset-kobprof/skolegpt-instruct #base_model-meta-llama/Meta-Llama-3-8B #license-other #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# Llama-3-8B-instruct-dansk\n\nThis model is a fine-tuned version of meta-llama/Meta-Llama-3-8B on the kobprof/skolegpt-instruct dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 0.9477", "## Model description\n\nMore information needed", "## Intended uses & limitations\n\nMore information needed", "## Training and evaluation data\n\nMore information needed", "## Training procedure", "### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 1\n- eval_batch_size: 1\n- seed: 42\n- distributed_type: multi-GPU\n- num_devices: 4\n- total_train_batch_size: 4\n- total_eval_batch_size: 4\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_ratio: 0.2\n- num_epochs: 1", "### Training results", "### Framework versions\n\n- Transformers 4.41.0.dev0\n- Pytorch 2.2.0\n- Datasets 2.19.0\n- Tokenizers 0.19.1" ]
[ 89, 73, 7, 9, 9, 4, 147, 5, 43 ]
[ "TAGS\n#transformers #safetensors #llama #text-generation #trl #sft #generated_from_trainer #danish #da #dataset-kobprof/skolegpt-instruct #base_model-meta-llama/Meta-Llama-3-8B #license-other #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Llama-3-8B-instruct-dansk\n\nThis model is a fine-tuned version of meta-llama/Meta-Llama-3-8B on the kobprof/skolegpt-instruct dataset.\nIt achieves the following results on the evaluation set:\n- Loss: 0.9477## Model description\n\nMore information needed## Intended uses & limitations\n\nMore information needed## Training and evaluation data\n\nMore information needed## Training procedure### Training hyperparameters\n\nThe following hyperparameters were used during training:\n- learning_rate: 2e-05\n- train_batch_size: 1\n- eval_batch_size: 1\n- seed: 42\n- distributed_type: multi-GPU\n- num_devices: 4\n- total_train_batch_size: 4\n- total_eval_batch_size: 4\n- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n- lr_scheduler_type: cosine\n- lr_scheduler_warmup_ratio: 0.2\n- num_epochs: 1### Training results### Framework versions\n\n- Transformers 4.41.0.dev0\n- Pytorch 2.2.0\n- Datasets 2.19.0\n- Tokenizers 0.19.1" ]
null
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
ilhemhmz752/Agrobot-llama2-ft
null
[ "transformers", "safetensors", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
null
2024-04-30T15:39:37+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ 26, 6, 4, 75, 23, 3, 5, 8, 9, 8, 34, 20, 4, 5, 5, 11, 13, 12, 3, 10, 6, 5, 6, 4, 5, 7, 49, 7, 7, 5, 5, 15, 7, 7, 8, 5 ]
[ "TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact" ]
null
peft
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # agrobot-ft This model is a fine-tuned version of [meta-llama/Llama-2-7b-chat-hf](https://huggingface.co/meta-llama/Llama-2-7b-chat-hf) on the None dataset. It achieves the following results on the evaluation set: - Loss: 1.7517 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.0002 - train_batch_size: 4 - eval_batch_size: 4 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 16 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_steps: 2 - num_epochs: 10 - mixed_precision_training: Native AMP ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:-----:|:----:|:---------------:| | 2.26 | 1.0 | 102 | 1.8542 | | 1.7656 | 2.0 | 205 | 1.7839 | | 1.7085 | 3.0 | 307 | 1.7542 | | 1.6346 | 4.0 | 410 | 1.7347 | | 1.5946 | 5.0 | 512 | 1.7275 | | 1.5335 | 6.0 | 615 | 1.7295 | | 1.5043 | 7.0 | 717 | 1.7276 | | 1.455 | 8.0 | 820 | 1.7389 | | 1.443 | 9.0 | 922 | 1.7459 | | 1.4064 | 9.95 | 1020 | 1.7517 | ### Framework versions - PEFT 0.10.0 - Transformers 4.39.3 - Pytorch 2.1.0+cu121 - Datasets 2.18.0 - Tokenizers 0.15.2
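As a complement to the hyperparameter listing, here is a hedged sketch of applying the adapter: it assumes the LoRA weights in ilhemhmz752/agrobot-ft load onto the meta-llama/Llama-2-7b-chat-hf base and that the Llama-2 chat `[INST] ... [/INST]` prompt format still applies after fine-tuning; the example question is hypothetical.

```python
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "meta-llama/Llama-2-7b-chat-hf"
adapter_id = "ilhemhmz752/agrobot-ft"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.float16, device_map="auto"
)
model = PeftModel.from_pretrained(base_model, adapter_id)  # attach the fine-tuned adapter

prompt = "[INST] How often should tomato seedlings be watered? [/INST]"  # Llama-2 chat format
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

output = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```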
{"license": "llama2", "library_name": "peft", "tags": ["generated_from_trainer"], "base_model": "meta-llama/Llama-2-7b-chat-hf", "model-index": [{"name": "agrobot-ft", "results": []}]}
ilhemhmz752/agrobot-ft
null
[ "peft", "tensorboard", "safetensors", "generated_from_trainer", "base_model:meta-llama/Llama-2-7b-chat-hf", "license:llama2", "region:us" ]
null
2024-04-30T15:39:40+00:00
[]
[]
TAGS #peft #tensorboard #safetensors #generated_from_trainer #base_model-meta-llama/Llama-2-7b-chat-hf #license-llama2 #region-us
agrobot-ft ========== This model is a fine-tuned version of meta-llama/Llama-2-7b-chat-hf on the None dataset. It achieves the following results on the evaluation set: * Loss: 1.7517 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 0.0002 * train\_batch\_size: 4 * eval\_batch\_size: 4 * seed: 42 * gradient\_accumulation\_steps: 4 * total\_train\_batch\_size: 16 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * lr\_scheduler\_warmup\_steps: 2 * num\_epochs: 10 * mixed\_precision\_training: Native AMP ### Training results ### Framework versions * PEFT 0.10.0 * Transformers 4.39.3 * Pytorch 2.1.0+cu121 * Datasets 2.18.0 * Tokenizers 0.15.2
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0002\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 4\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 16\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 2\n* num\\_epochs: 10\n* mixed\\_precision\\_training: Native AMP", "### Training results", "### Framework versions\n\n\n* PEFT 0.10.0\n* Transformers 4.39.3\n* Pytorch 2.1.0+cu121\n* Datasets 2.18.0\n* Tokenizers 0.15.2" ]
[ "TAGS\n#peft #tensorboard #safetensors #generated_from_trainer #base_model-meta-llama/Llama-2-7b-chat-hf #license-llama2 #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0002\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 4\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 16\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 2\n* num\\_epochs: 10\n* mixed\\_precision\\_training: Native AMP", "### Training results", "### Framework versions\n\n\n* PEFT 0.10.0\n* Transformers 4.39.3\n* Pytorch 2.1.0+cu121\n* Datasets 2.18.0\n* Tokenizers 0.15.2" ]
[ 49, 151, 5, 52 ]
[ "TAGS\n#peft #tensorboard #safetensors #generated_from_trainer #base_model-meta-llama/Llama-2-7b-chat-hf #license-llama2 #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.0002\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 4\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 16\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_steps: 2\n* num\\_epochs: 10\n* mixed\\_precision\\_training: Native AMP### Training results### Framework versions\n\n\n* PEFT 0.10.0\n* Transformers 4.39.3\n* Pytorch 2.1.0+cu121\n* Datasets 2.18.0\n* Tokenizers 0.15.2" ]
text-generation
transformers
# Model Trained Using AutoTrain This model was trained using AutoTrain. For more information, please visit [AutoTrain](https://hf.co/docs/autotrain). # Usage ```python from transformers import AutoModelForCausalLM, AutoTokenizer model_path = "PATH_TO_THIS_REPO" tokenizer = AutoTokenizer.from_pretrained(model_path) model = AutoModelForCausalLM.from_pretrained( model_path, device_map="auto", torch_dtype='auto' ).eval() # Prompt content: "hi" messages = [ {"role": "user", "content": "hi"} ] input_ids = tokenizer.apply_chat_template(conversation=messages, tokenize=True, add_generation_prompt=True, return_tensors='pt') output_ids = model.generate(input_ids.to('cuda')) response = tokenizer.decode(output_ids[0][input_ids.shape[1]:], skip_special_tokens=True) # Model response: "Hello! How can I assist you today?" print(response) ```
{"license": "other", "library_name": "transformers", "tags": ["autotrain", "text-generation-inference", "text-generation", "peft"], "widget": [{"messages": [{"role": "user", "content": "What is your favorite condiment?"}]}]}
nanxiz/autotrain-fzpxr-ximzt
null
[ "transformers", "tensorboard", "safetensors", "phi3", "text-generation", "autotrain", "text-generation-inference", "peft", "conversational", "custom_code", "license:other", "autotrain_compatible", "endpoints_compatible", "region:us" ]
null
2024-04-30T15:41:45+00:00
[]
[]
TAGS #transformers #tensorboard #safetensors #phi3 #text-generation #autotrain #text-generation-inference #peft #conversational #custom_code #license-other #autotrain_compatible #endpoints_compatible #region-us
# Model Trained Using AutoTrain This model was trained using AutoTrain. For more information, please visit AutoTrain. # Usage
[ "# Model Trained Using AutoTrain\n\nThis model was trained using AutoTrain. For more information, please visit AutoTrain.", "# Usage" ]
[ "TAGS\n#transformers #tensorboard #safetensors #phi3 #text-generation #autotrain #text-generation-inference #peft #conversational #custom_code #license-other #autotrain_compatible #endpoints_compatible #region-us \n", "# Model Trained Using AutoTrain\n\nThis model was trained using AutoTrain. For more information, please visit AutoTrain.", "# Usage" ]
[ 54, 23, 2 ]
[ "TAGS\n#transformers #tensorboard #safetensors #phi3 #text-generation #autotrain #text-generation-inference #peft #conversational #custom_code #license-other #autotrain_compatible #endpoints_compatible #region-us \n# Model Trained Using AutoTrain\n\nThis model was trained using AutoTrain. For more information, please visit AutoTrain.# Usage" ]
text-generation
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
shallow6414/jmgvp9n
null
[ "transformers", "safetensors", "llama", "text-generation", "conversational", "arxiv:1910.09700", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2024-04-30T15:41:50+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #llama #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #llama #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ 47, 6, 4, 75, 23, 3, 5, 8, 9, 8, 34, 20, 4, 5, 5, 11, 13, 12, 3, 10, 6, 5, 6, 4, 5, 7, 49, 7, 7, 5, 5, 15, 7, 7, 8, 5 ]
[ "TAGS\n#transformers #safetensors #llama #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact" ]
text-generation
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
golf2248/10jfipg
null
[ "transformers", "safetensors", "stablelm", "text-generation", "conversational", "arxiv:1910.09700", "autotrain_compatible", "endpoints_compatible", "region:us" ]
null
2024-04-30T15:43:03+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #stablelm #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #stablelm #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ 41, 6, 4, 75, 23, 3, 5, 8, 9, 8, 34, 20, 4, 5, 5, 11, 13, 12, 3, 10, 6, 5, 6, 4, 5, 7, 49, 7, 7, 5, 5, 15, 7, 7, 8, 5 ]
[ "TAGS\n#transformers #safetensors #stablelm #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact" ]
reinforcement-learning
null
# **Q-Learning** Agent playing **FrozenLake-v1** This is a trained model of a **Q-Learning** agent playing **FrozenLake-v1**. ## Usage ```python model = load_from_hub(repo_id="raulgadea/q-FrozenLake-v1-4x4-noSlippery", filename="q-learning.pkl") # Don't forget to check if you need to add additional attributes (is_slippery=False etc) env = gym.make(model["env_id"]) ```
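Building on the usage snippet above, here is a hedged sketch of running one greedy episode with the downloaded Q-table. The `load_from_hub` helper is reimplemented below (the original comes from the Deep RL course notebooks), and the "qtable" key is an assumption — inspect the loaded pickle to confirm its fields.

```python
import pickle
import numpy as np
import gymnasium as gym
from huggingface_hub import hf_hub_download


def load_from_hub(repo_id: str, filename: str):
    """Download and unpickle the saved agent (one possible implementation of the course helper)."""
    path = hf_hub_download(repo_id=repo_id, filename=filename)
    with open(path, "rb") as f:
        return pickle.load(f)


model = load_from_hub(repo_id="raulgadea/q-FrozenLake-v1-4x4-noSlippery", filename="q-learning.pkl")
env = gym.make(model["env_id"], is_slippery=False)

qtable = np.array(model["qtable"])  # assumed key name; check the dict's contents

state, _ = env.reset()
done, episode_return = False, 0.0
while not done:
    action = int(np.argmax(qtable[state]))  # greedy action from the Q-table
    state, reward, terminated, truncated, _ = env.step(action)
    episode_return += reward
    done = terminated or truncated

print("episode return:", episode_return)
```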
{"tags": ["FrozenLake-v1-4x4-no_slippery", "q-learning", "reinforcement-learning", "custom-implementation"], "model-index": [{"name": "q-FrozenLake-v1-4x4-noSlippery", "results": [{"task": {"type": "reinforcement-learning", "name": "reinforcement-learning"}, "dataset": {"name": "FrozenLake-v1-4x4-no_slippery", "type": "FrozenLake-v1-4x4-no_slippery"}, "metrics": [{"type": "mean_reward", "value": "1.00 +/- 0.00", "name": "mean_reward", "verified": false}]}]}]}
raulgadea/q-FrozenLake-v1-4x4-noSlippery
null
[ "FrozenLake-v1-4x4-no_slippery", "q-learning", "reinforcement-learning", "custom-implementation", "model-index", "region:us" ]
null
2024-04-30T15:44:12+00:00
[]
[]
TAGS #FrozenLake-v1-4x4-no_slippery #q-learning #reinforcement-learning #custom-implementation #model-index #region-us
# Q-Learning Agent playing FrozenLake-v1 This is a trained model of a Q-Learning agent playing FrozenLake-v1. ## Usage
[ "# Q-Learning Agent playing FrozenLake-v1\n This is a trained model of a Q-Learning agent playing FrozenLake-v1.\n\n ## Usage" ]
[ "TAGS\n#FrozenLake-v1-4x4-no_slippery #q-learning #reinforcement-learning #custom-implementation #model-index #region-us \n", "# Q-Learning Agent playing FrozenLake-v1\n This is a trained model of a Q-Learning agent playing FrozenLake-v1.\n\n ## Usage" ]
[ 35, 33 ]
[ "TAGS\n#FrozenLake-v1-4x4-no_slippery #q-learning #reinforcement-learning #custom-implementation #model-index #region-us \n# Q-Learning Agent playing FrozenLake-v1\n This is a trained model of a Q-Learning agent playing FrozenLake-v1.\n\n ## Usage" ]
text-classification
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # distilbert-base-uncased-trained This model is a fine-tuned version of [distilbert-base-uncased](https://huggingface.co/distilbert-base-uncased) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.3511 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 64 - eval_batch_size: 64 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:-----:|:----:|:---------------:| | 0.4352 | 1.0 | 157 | 0.3683 | | 0.2797 | 2.0 | 314 | 0.3349 | | 0.2142 | 3.0 | 471 | 0.3511 | ### Framework versions - Transformers 4.40.1 - Pytorch 2.2.1+cu121 - Datasets 2.19.0 - Tokenizers 0.19.1
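The card gives no inference example, so here is a minimal sketch using the repo id from this record. The label set is undocumented, so expect generic names such as LABEL_0 / LABEL_1 unless the uploaded config maps them to something else; the printed output is illustrative only.

```python
from transformers import pipeline

# Loads the fine-tuned checkpoint and its tokenizer straight from the Hub.
classifier = pipeline("text-classification", model="Lasghar/distilbert-base-uncased-trained")

print(classifier("This is exactly what I was hoping for."))
```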
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "base_model": "distilbert-base-uncased", "model-index": [{"name": "distilbert-base-uncased-trained", "results": []}]}
Lasghar/distilbert-base-uncased-trained
null
[ "transformers", "safetensors", "distilbert", "text-classification", "generated_from_trainer", "base_model:distilbert-base-uncased", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
null
2024-04-30T15:45:28+00:00
[]
[]
TAGS #transformers #safetensors #distilbert #text-classification #generated_from_trainer #base_model-distilbert-base-uncased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
distilbert-base-uncased-trained =============================== This model is a fine-tuned version of distilbert-base-uncased on an unknown dataset. It achieves the following results on the evaluation set: * Loss: 0.3511 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 2e-05 * train\_batch\_size: 64 * eval\_batch\_size: 64 * seed: 42 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * num\_epochs: 3 ### Training results ### Framework versions * Transformers 4.40.1 * Pytorch 2.2.1+cu121 * Datasets 2.19.0 * Tokenizers 0.19.1
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 64\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3", "### Training results", "### Framework versions\n\n\n* Transformers 4.40.1\n* Pytorch 2.2.1+cu121\n* Datasets 2.19.0\n* Tokenizers 0.19.1" ]
[ "TAGS\n#transformers #safetensors #distilbert #text-classification #generated_from_trainer #base_model-distilbert-base-uncased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 64\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3", "### Training results", "### Framework versions\n\n\n* Transformers 4.40.1\n* Pytorch 2.2.1+cu121\n* Datasets 2.19.0\n* Tokenizers 0.19.1" ]
[ 56, 101, 5, 44 ]
[ "TAGS\n#transformers #safetensors #distilbert #text-classification #generated_from_trainer #base_model-distilbert-base-uncased #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 64\n* eval\\_batch\\_size: 64\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3### Training results### Framework versions\n\n\n* Transformers 4.40.1\n* Pytorch 2.2.1+cu121\n* Datasets 2.19.0\n* Tokenizers 0.19.1" ]
text-generation
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # dbloom-1b1-alpha-1 This model is a fine-tuned version of [bigscience/bloomz-1b1](https://huggingface.co/bigscience/bloomz-1b1) on the None dataset. It achieves the following results on the evaluation set: - Loss: 5.7774 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 0.001 - train_batch_size: 4 - eval_batch_size: 8 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 16 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 1 ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:-----:|:----:|:---------------:| | 14.0871 | 0.11 | 100 | 8.6262 | | 7.0473 | 0.22 | 200 | 6.7608 | | 6.3553 | 0.33 | 300 | 6.3938 | | 6.1571 | 0.44 | 400 | 6.2497 | | 6.0 | 0.55 | 500 | 6.1143 | | 5.8703 | 0.66 | 600 | 5.9965 | | 5.7946 | 0.77 | 700 | 5.9184 | | 5.6473 | 0.88 | 800 | 5.8412 | | 5.5571 | 0.99 | 900 | 5.7774 | ### Framework versions - Transformers 4.39.3 - Pytorch 2.2.2+cu121 - Datasets 2.18.0 - Tokenizers 0.15.2
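The card above likewise omits a usage example. A hedged sketch of plain causal-LM inference with this checkpoint is shown below; the prompt and sampling settings are illustrative and not taken from the card.

```python
# Hedged sketch of causal-LM generation with the fine-tuned BLOOM checkpoint.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "greenbureau/dbloom-1b1-alpha-1"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

inputs = tokenizer("Once upon a time", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_p=0.9)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```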
{"license": "bigscience-bloom-rail-1.0", "tags": ["generated_from_trainer"], "base_model": "bigscience/bloomz-1b1", "model-index": [{"name": "dbloom-1b1-alpha-1", "results": []}]}
greenbureau/dbloom-1b1-alpha-1
null
[ "transformers", "safetensors", "bloom", "text-generation", "generated_from_trainer", "base_model:bigscience/bloomz-1b1", "license:bigscience-bloom-rail-1.0", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2024-04-30T15:47:44+00:00
[]
[]
TAGS #transformers #safetensors #bloom #text-generation #generated_from_trainer #base_model-bigscience/bloomz-1b1 #license-bigscience-bloom-rail-1.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
dbloom-1b1-alpha-1 ================== This model is a fine-tuned version of bigscience/bloomz-1b1 on the None dataset. It achieves the following results on the evaluation set: * Loss: 5.7774 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 0.001 * train\_batch\_size: 4 * eval\_batch\_size: 8 * seed: 42 * gradient\_accumulation\_steps: 4 * total\_train\_batch\_size: 16 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * num\_epochs: 1 ### Training results ### Framework versions * Transformers 4.39.3 * Pytorch 2.2.2+cu121 * Datasets 2.18.0 * Tokenizers 0.15.2
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.001\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 16\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 1", "### Training results", "### Framework versions\n\n\n* Transformers 4.39.3\n* Pytorch 2.2.2+cu121\n* Datasets 2.18.0\n* Tokenizers 0.15.2" ]
[ "TAGS\n#transformers #safetensors #bloom #text-generation #generated_from_trainer #base_model-bigscience/bloomz-1b1 #license-bigscience-bloom-rail-1.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.001\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 16\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 1", "### Training results", "### Framework versions\n\n\n* Transformers 4.39.3\n* Pytorch 2.2.2+cu121\n* Datasets 2.18.0\n* Tokenizers 0.15.2" ]
[ 67, 123, 5, 44 ]
[ "TAGS\n#transformers #safetensors #bloom #text-generation #generated_from_trainer #base_model-bigscience/bloomz-1b1 #license-bigscience-bloom-rail-1.0 #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 0.001\n* train\\_batch\\_size: 4\n* eval\\_batch\\_size: 8\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 16\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 1### Training results### Framework versions\n\n\n* Transformers 4.39.3\n* Pytorch 2.2.2+cu121\n* Datasets 2.18.0\n* Tokenizers 0.15.2" ]
null
peft
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed] ### Framework versions - PEFT 0.8.2
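The card itself is the empty template, but its metadata names the base model (HiTZ/Medical-mT5-large) and the PEFT version. A hedged sketch of loading the adapter on top of that seq2seq base follows; the assumption that the repo holds a LoRA adapter compatible with `PeftModel.from_pretrained`, as well as the example prompt, is not confirmed by the card.

```python
# Hedged sketch, assuming the repo contains a LoRA adapter for the stated base
# model HiTZ/Medical-mT5-large (an mT5 seq2seq checkpoint); the prompt is illustrative.
from peft import PeftModel
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

base_id = "HiTZ/Medical-mT5-large"
adapter_id = "Mikelium5/Medical-mT5-large_art-nat-int-gal-sep-LoRa"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForSeq2SeqLM.from_pretrained(base_id)
model = PeftModel.from_pretrained(base_model, adapter_id)

inputs = tokenizer("Example medical sentence to process.", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```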
{"library_name": "peft", "base_model": "HiTZ/Medical-mT5-large"}
Mikelium5/Medical-mT5-large_art-nat-int-gal-sep-LoRa
null
[ "peft", "safetensors", "arxiv:1910.09700", "base_model:HiTZ/Medical-mT5-large", "region:us" ]
null
2024-04-30T15:48:50+00:00
[ "1910.09700" ]
[]
TAGS #peft #safetensors #arxiv-1910.09700 #base_model-HiTZ/Medical-mT5-large #region-us
# Model Card for Model ID ## Model Details ### Model Description - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact ### Framework versions - PEFT 0.8.2
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact", "### Framework versions\n\n- PEFT 0.8.2" ]
[ "TAGS\n#peft #safetensors #arxiv-1910.09700 #base_model-HiTZ/Medical-mT5-large #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact", "### Framework versions\n\n- PEFT 0.8.2" ]
[ 36, 6, 4, 50, 23, 3, 5, 8, 9, 8, 34, 20, 4, 5, 5, 11, 13, 12, 3, 10, 6, 5, 6, 4, 5, 7, 49, 7, 7, 5, 5, 15, 7, 7, 8, 5, 13 ]
[ "TAGS\n#peft #safetensors #arxiv-1910.09700 #base_model-HiTZ/Medical-mT5-large #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\n\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact### Framework versions\n\n- PEFT 0.8.2" ]
null
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
HenryCai1129/adapter-llama-adaptertoxic2nontoxic-100-50-0.0003
null
[ "transformers", "safetensors", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
null
2024-04-30T15:48:53+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ 26, 6, 4, 75, 23, 3, 5, 8, 9, 8, 34, 20, 4, 5, 5, 11, 13, 12, 3, 10, 6, 5, 6, 4, 5, 7, 49, 7, 7, 5, 5, 15, 7, 7, 8, 5 ]
[ "TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact" ]
text-generation
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
abc88767/model20
null
[ "transformers", "safetensors", "stablelm", "text-generation", "conversational", "arxiv:1910.09700", "autotrain_compatible", "endpoints_compatible", "region:us" ]
null
2024-04-30T15:50:11+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #stablelm #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #stablelm #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ 41, 6, 4, 75, 23, 3, 5, 8, 9, 8, 34, 20, 4, 5, 5, 11, 13, 12, 3, 10, 6, 5, 6, 4, 5, 7, 49, 7, 7, 5, 5, 15, 7, 7, 8, 5 ]
[ "TAGS\n#transformers #safetensors #stablelm #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact" ]
null
transformers
# Visually Guided Generative Text-Layout Pre-training for Document Intelligence The ViTLP model was proposed in [Visually Guided Generative Text-Layout Pre-training for Document Intelligence](https://arxiv.org/abs/2403.16516), which is a generative foundation model for document intelligence. We provide the pre-trained checkpoint *ViTLP-medium* (380M). The pre-trained ViTLP model can natively perform OCR text localization and recognition. ## Demo on Document Text Recognition & Localization The code of ViTLP inference and demo is accessible at [https://github.com/Veason-silverbullet/ViTLP](https://github.com/Veason-silverbullet/ViTLP). ![ocr-demo-1.png](https://cdn-uploads.huggingface.co/production/uploads/628b42befc0078a72e38aed3/6FTpQA2NQWeryoXKBXSXH.png) ![ocr-demo-2.png](https://cdn-uploads.huggingface.co/production/uploads/628b42befc0078a72e38aed3/_mRHmn6SLp8naQzAETii4.png) # Preset FAQ - Why is ViTLP-medium only 380M? When I commenced this project, it was on the eve of LLMs (precisely speaking, ChatGPT). ViTLP-base, presented in our paper, is actually a rather small pre-trained model. We know that ViTLP is expected to scale up in this LLM era. However, the pre-training scale is commonly constrained by computation resources and the pre-training dataset scale, in which context ViTLP-medium (380M) is the largest pre-training scale we can support so far. Besides, this scale of ViTLP also brings inference benefits, including speed and memory usage. Typically, OCR on a page of a document image can be processed within 5~10 seconds on an Nvidia 4090, which is comparable to (and faster than) most OCR engines (and LLMs). # Note ViTLP is pronounced /ˈvai·tlp/ (vital). The first version of our paper was submitted to [OpenReview](https://openreview.net/forum?id=ARtBIBAmNR) in June 2023.
{"language": ["en"], "license": "mit"}
veason/ViTLP-medium
null
[ "transformers", "pytorch", "ViTLP", "en", "arxiv:2403.16516", "license:mit", "endpoints_compatible", "region:us" ]
null
2024-04-30T15:54:12+00:00
[ "2403.16516" ]
[ "en" ]
TAGS #transformers #pytorch #ViTLP #en #arxiv-2403.16516 #license-mit #endpoints_compatible #region-us
# Visually Guided Generative Text-Layout Pre-training for Document Intelligence The ViTLP model was proposed in Visually Guided Generative Text-Layout Pre-training for Document Intelligence, which is a generative foundation model for document intelligence. We provide the pre-trained checkpoint *ViTLP-medium* (380M). The pre-trained ViTLP model can natively perform OCR text localization and recognition. ## Demo on Document Text Recognition & Localization The code of ViTLP inference and demo is assisible at URL !URL !URL # Preset FAQ - Why is ViTLP-medium (380M)? When I commenced this project, it was on the eve of LLMs (precisely speaking, ChatGPT). ViTLP-base presented in our paper, is actually a rather small pre-trained model. We know it is expected to scale up ViTLP in this LLM era. However, the pre-training scale is commonly constrained by computation resources and the pre-training dataset scale, in which context ViTLP-medium (380M) is the largest pre-training scale so far we can support. Besides, this scale of ViTLP also brings inference sweetness including speed and memory usage. Typically, OCR on a page of a document image can be processed within 5~10 seconds in an Nvidia 4090, which is comparable to (and faster than) most OCR engines (and LLMs). # Note ViTLP is pronounced /ˈvai·tlp/ (vital). The first version of our paper was submitted to OpenReview in June 2023.
[ "# Visually Guided Generative Text-Layout Pre-training for Document Intelligence\n\nThe ViTLP model was proposed in Visually Guided Generative Text-Layout Pre-training for Document Intelligence, which is a generative foundation model for document intelligence. We provide the pre-trained checkpoint *ViTLP-medium* (380M). The pre-trained ViTLP model can natively perform OCR text localization and recognition.", "## Demo on Document Text Recognition & Localization\n\nThe code of ViTLP inference and demo is assisible at URL\n\n!URL\n\n!URL", "# Preset FAQ\n- Why is ViTLP-medium (380M)?\n\nWhen I commenced this project, it was on the eve of LLMs (precisely speaking, ChatGPT). ViTLP-base presented in our paper, is actually a rather small pre-trained model. We know it is expected to scale up ViTLP in this LLM era. However, the pre-training scale is commonly constrained by computation resources and the pre-training dataset scale, in which context ViTLP-medium (380M) is the largest pre-training scale so far we can support.\n\nBesides, this scale of ViTLP also brings inference sweetness including speed and memory usage. Typically, OCR on a page of a document image can be processed within 5~10 seconds in an Nvidia 4090, which is comparable to (and faster than) most OCR engines (and LLMs).", "# Note\nViTLP is pronounced /ˈvai·tlp/ (vital). The first version of our paper was submitted to OpenReview in June 2023." ]
[ "TAGS\n#transformers #pytorch #ViTLP #en #arxiv-2403.16516 #license-mit #endpoints_compatible #region-us \n", "# Visually Guided Generative Text-Layout Pre-training for Document Intelligence\n\nThe ViTLP model was proposed in Visually Guided Generative Text-Layout Pre-training for Document Intelligence, which is a generative foundation model for document intelligence. We provide the pre-trained checkpoint *ViTLP-medium* (380M). The pre-trained ViTLP model can natively perform OCR text localization and recognition.", "## Demo on Document Text Recognition & Localization\n\nThe code of ViTLP inference and demo is assisible at URL\n\n!URL\n\n!URL", "# Preset FAQ\n- Why is ViTLP-medium (380M)?\n\nWhen I commenced this project, it was on the eve of LLMs (precisely speaking, ChatGPT). ViTLP-base presented in our paper, is actually a rather small pre-trained model. We know it is expected to scale up ViTLP in this LLM era. However, the pre-training scale is commonly constrained by computation resources and the pre-training dataset scale, in which context ViTLP-medium (380M) is the largest pre-training scale so far we can support.\n\nBesides, this scale of ViTLP also brings inference sweetness including speed and memory usage. Typically, OCR on a page of a document image can be processed within 5~10 seconds in an Nvidia 4090, which is comparable to (and faster than) most OCR engines (and LLMs).", "# Note\nViTLP is pronounced /ˈvai·tlp/ (vital). The first version of our paper was submitted to OpenReview in June 2023." ]
[ 37, 86, 32, 189, 36 ]
[ "TAGS\n#transformers #pytorch #ViTLP #en #arxiv-2403.16516 #license-mit #endpoints_compatible #region-us \n# Visually Guided Generative Text-Layout Pre-training for Document Intelligence\n\nThe ViTLP model was proposed in Visually Guided Generative Text-Layout Pre-training for Document Intelligence, which is a generative foundation model for document intelligence. We provide the pre-trained checkpoint *ViTLP-medium* (380M). The pre-trained ViTLP model can natively perform OCR text localization and recognition.## Demo on Document Text Recognition & Localization\n\nThe code of ViTLP inference and demo is assisible at URL\n\n!URL\n\n!URL# Preset FAQ\n- Why is ViTLP-medium (380M)?\n\nWhen I commenced this project, it was on the eve of LLMs (precisely speaking, ChatGPT). ViTLP-base presented in our paper, is actually a rather small pre-trained model. We know it is expected to scale up ViTLP in this LLM era. However, the pre-training scale is commonly constrained by computation resources and the pre-training dataset scale, in which context ViTLP-medium (380M) is the largest pre-training scale so far we can support.\n\nBesides, this scale of ViTLP also brings inference sweetness including speed and memory usage. Typically, OCR on a page of a document image can be processed within 5~10 seconds in an Nvidia 4090, which is comparable to (and faster than) most OCR engines (and LLMs).# Note\nViTLP is pronounced /ˈvai·tlp/ (vital). The first version of our paper was submitted to OpenReview in June 2023." ]
text-generation
transformers
# [MaziyarPanahi/LlaMAndement-13b-GGUF](https://huggingface.co/MaziyarPanahi/LlaMAndement-13b-GGUF) - Model creator: [AgentPublic](https://huggingface.co/AgentPublic) - Original model: [AgentPublic/LlaMAndement-13b](https://huggingface.co/AgentPublic/LlaMAndement-13b) ## Description [MaziyarPanahi/LlaMAndement-13b-GGUF](https://huggingface.co/MaziyarPanahi/LlaMAndement-13b-GGUF) contains GGUF format model files for [AgentPublic/LlaMAndement-13b](https://huggingface.co/AgentPublic/LlaMAndement-13b). ### About GGUF GGUF is a new format introduced by the llama.cpp team on August 21st 2023. It is a replacement for GGML, which is no longer supported by llama.cpp. Here is an incomplete list of clients and libraries that are known to support GGUF: * [llama.cpp](https://github.com/ggerganov/llama.cpp). The source project for GGUF. Offers a CLI and a server option. * [llama-cpp-python](https://github.com/abetlen/llama-cpp-python), a Python library with GPU accel, LangChain support, and OpenAI-compatible API server. * [LM Studio](https://lmstudio.ai/), an easy-to-use and powerful local GUI for Windows and macOS (Silicon), with GPU acceleration. Linux available, in beta as of 27/11/2023. * [text-generation-webui](https://github.com/oobabooga/text-generation-webui), the most widely used web UI, with many features and powerful extensions. Supports GPU acceleration. * [KoboldCpp](https://github.com/LostRuins/koboldcpp), a fully featured web UI, with GPU accel across all platforms and GPU architectures. Especially good for story telling. * [GPT4All](https://gpt4all.io/index.html), a free and open source local running GUI, supporting Windows, Linux and macOS with full GPU accel. * [LoLLMS Web UI](https://github.com/ParisNeo/lollms-webui), a great web UI with many interesting and unique features, including a full model library for easy model selection. * [Faraday.dev](https://faraday.dev/), an attractive and easy to use character-based chat GUI for Windows and macOS (both Silicon and Intel), with GPU acceleration. * [candle](https://github.com/huggingface/candle), a Rust ML framework with a focus on performance, including GPU support, and ease of use. * [ctransformers](https://github.com/marella/ctransformers), a Python library with GPU accel, LangChain support, and OpenAI-compatible AI server. Note, as of time of writing (November 27th 2023), ctransformers has not been updated in a long time and does not support many recent models. ## Special thanks 🙏 Special thanks to [Georgi Gerganov](https://github.com/ggerganov) and the whole team working on [llama.cpp](https://github.com/ggerganov/llama.cpp/) for making all of this possible.
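As a concrete starting point with one of the clients listed above, a hedged llama-cpp-python sketch is shown below; the quantised filename is an assumption, so list the repository files and substitute the variant you actually want before running.

```python
# Hedged sketch: the GGUF filename below is an assumption, not confirmed by the card.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

gguf_path = hf_hub_download(
    repo_id="MaziyarPanahi/LlaMAndement-13b-GGUF",
    filename="LlaMAndement-13b.Q4_K_M.gguf",  # hypothetical quantisation variant
)

llm = Llama(model_path=gguf_path, n_ctx=4096)
result = llm("Summarize the following amendment:", max_tokens=128)
print(result["choices"][0]["text"])
```

Any of the other clients listed above can consume the same GGUF file; only the download and loading steps differ.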
{"tags": ["quantized", "2-bit", "3-bit", "4-bit", "5-bit", "6-bit", "8-bit", "GGUF", "text-generation", "llama", "llama-2", "text-generation"], "model_name": "LlaMAndement-13b-GGUF", "base_model": "AgentPublic/LlaMAndement-13b", "inference": false, "model_creator": "AgentPublic", "pipeline_tag": "text-generation", "quantized_by": "MaziyarPanahi"}
MaziyarPanahi/LlaMAndement-13b-GGUF
null
[ "transformers", "gguf", "mistral", "quantized", "2-bit", "3-bit", "4-bit", "5-bit", "6-bit", "8-bit", "GGUF", "text-generation", "llama", "llama-2", "base_model:AgentPublic/LlaMAndement-13b", "text-generation-inference", "region:us" ]
null
2024-04-30T15:54:58+00:00
[]
[]
TAGS #transformers #gguf #mistral #quantized #2-bit #3-bit #4-bit #5-bit #6-bit #8-bit #GGUF #text-generation #llama #llama-2 #base_model-AgentPublic/LlaMAndement-13b #text-generation-inference #region-us
# MaziyarPanahi/LlaMAndement-13b-GGUF - Model creator: AgentPublic - Original model: AgentPublic/LlaMAndement-13b ## Description MaziyarPanahi/LlaMAndement-13b-GGUF contains GGUF format model files for AgentPublic/LlaMAndement-13b. ### About GGUF GGUF is a new format introduced by the URL team on August 21st 2023. It is a replacement for GGML, which is no longer supported by URL. Here is an incomplete list of clients and libraries that are known to support GGUF: * URL. The source project for GGUF. Offers a CLI and a server option. * llama-cpp-python, a Python library with GPU accel, LangChain support, and OpenAI-compatible API server. * LM Studio, an easy-to-use and powerful local GUI for Windows and macOS (Silicon), with GPU acceleration. Linux available, in beta as of 27/11/2023. * text-generation-webui, the most widely used web UI, with many features and powerful extensions. Supports GPU acceleration. * KoboldCpp, a fully featured web UI, with GPU accel across all platforms and GPU architectures. Especially good for story telling. * GPT4All, a free and open source local running GUI, supporting Windows, Linux and macOS with full GPU accel. * LoLLMS Web UI, a great web UI with many interesting and unique features, including a full model library for easy model selection. * URL, an attractive and easy to use character-based chat GUI for Windows and macOS (both Silicon and Intel), with GPU acceleration. * candle, a Rust ML framework with a focus on performance, including GPU support, and ease of use. * ctransformers, a Python library with GPU accel, LangChain support, and OpenAI-compatible AI server. Note, as of time of writing (November 27th 2023), ctransformers has not been updated in a long time and does not support many recent models. ## Special thanks Special thanks to Georgi Gerganov and the whole team working on URL for making all of this possible.
[ "# MaziyarPanahi/LlaMAndement-13b-GGUF\n- Model creator: AgentPublic\n- Original model: AgentPublic/LlaMAndement-13b", "## Description\nMaziyarPanahi/LlaMAndement-13b-GGUF contains GGUF format model files for AgentPublic/LlaMAndement-13b.", "### About GGUF\n\nGGUF is a new format introduced by the URL team on August 21st 2023. It is a replacement for GGML, which is no longer supported by URL.\n\nHere is an incomplete list of clients and libraries that are known to support GGUF:\n\n* URL. The source project for GGUF. Offers a CLI and a server option.\n* llama-cpp-python, a Python library with GPU accel, LangChain support, and OpenAI-compatible API server.\n* LM Studio, an easy-to-use and powerful local GUI for Windows and macOS (Silicon), with GPU acceleration. Linux available, in beta as of 27/11/2023.\n* text-generation-webui, the most widely used web UI, with many features and powerful extensions. Supports GPU acceleration.\n* KoboldCpp, a fully featured web UI, with GPU accel across all platforms and GPU architectures. Especially good for story telling.\n* GPT4All, a free and open source local running GUI, supporting Windows, Linux and macOS with full GPU accel.\n* LoLLMS Web UI, a great web UI with many interesting and unique features, including a full model library for easy model selection.\n* URL, an attractive and easy to use character-based chat GUI for Windows and macOS (both Silicon and Intel), with GPU acceleration.\n* candle, a Rust ML framework with a focus on performance, including GPU support, and ease of use.\n* ctransformers, a Python library with GPU accel, LangChain support, and OpenAI-compatible AI server. Note, as of time of writing (November 27th 2023), ctransformers has not been updated in a long time and does not support many recent models.", "## Special thanks\n\n Special thanks to Georgi Gerganov and the whole team working on URL for making all of this possible." ]
[ "TAGS\n#transformers #gguf #mistral #quantized #2-bit #3-bit #4-bit #5-bit #6-bit #8-bit #GGUF #text-generation #llama #llama-2 #base_model-AgentPublic/LlaMAndement-13b #text-generation-inference #region-us \n", "# MaziyarPanahi/LlaMAndement-13b-GGUF\n- Model creator: AgentPublic\n- Original model: AgentPublic/LlaMAndement-13b", "## Description\nMaziyarPanahi/LlaMAndement-13b-GGUF contains GGUF format model files for AgentPublic/LlaMAndement-13b.", "### About GGUF\n\nGGUF is a new format introduced by the URL team on August 21st 2023. It is a replacement for GGML, which is no longer supported by URL.\n\nHere is an incomplete list of clients and libraries that are known to support GGUF:\n\n* URL. The source project for GGUF. Offers a CLI and a server option.\n* llama-cpp-python, a Python library with GPU accel, LangChain support, and OpenAI-compatible API server.\n* LM Studio, an easy-to-use and powerful local GUI for Windows and macOS (Silicon), with GPU acceleration. Linux available, in beta as of 27/11/2023.\n* text-generation-webui, the most widely used web UI, with many features and powerful extensions. Supports GPU acceleration.\n* KoboldCpp, a fully featured web UI, with GPU accel across all platforms and GPU architectures. Especially good for story telling.\n* GPT4All, a free and open source local running GUI, supporting Windows, Linux and macOS with full GPU accel.\n* LoLLMS Web UI, a great web UI with many interesting and unique features, including a full model library for easy model selection.\n* URL, an attractive and easy to use character-based chat GUI for Windows and macOS (both Silicon and Intel), with GPU acceleration.\n* candle, a Rust ML framework with a focus on performance, including GPU support, and ease of use.\n* ctransformers, a Python library with GPU accel, LangChain support, and OpenAI-compatible AI server. Note, as of time of writing (November 27th 2023), ctransformers has not been updated in a long time and does not support many recent models.", "## Special thanks\n\n Special thanks to Georgi Gerganov and the whole team working on URL for making all of this possible." ]
[ 80, 43, 42, 392, 27 ]
[ "TAGS\n#transformers #gguf #mistral #quantized #2-bit #3-bit #4-bit #5-bit #6-bit #8-bit #GGUF #text-generation #llama #llama-2 #base_model-AgentPublic/LlaMAndement-13b #text-generation-inference #region-us \n# MaziyarPanahi/LlaMAndement-13b-GGUF\n- Model creator: AgentPublic\n- Original model: AgentPublic/LlaMAndement-13b## Description\nMaziyarPanahi/LlaMAndement-13b-GGUF contains GGUF format model files for AgentPublic/LlaMAndement-13b.### About GGUF\n\nGGUF is a new format introduced by the URL team on August 21st 2023. It is a replacement for GGML, which is no longer supported by URL.\n\nHere is an incomplete list of clients and libraries that are known to support GGUF:\n\n* URL. The source project for GGUF. Offers a CLI and a server option.\n* llama-cpp-python, a Python library with GPU accel, LangChain support, and OpenAI-compatible API server.\n* LM Studio, an easy-to-use and powerful local GUI for Windows and macOS (Silicon), with GPU acceleration. Linux available, in beta as of 27/11/2023.\n* text-generation-webui, the most widely used web UI, with many features and powerful extensions. Supports GPU acceleration.\n* KoboldCpp, a fully featured web UI, with GPU accel across all platforms and GPU architectures. Especially good for story telling.\n* GPT4All, a free and open source local running GUI, supporting Windows, Linux and macOS with full GPU accel.\n* LoLLMS Web UI, a great web UI with many interesting and unique features, including a full model library for easy model selection.\n* URL, an attractive and easy to use character-based chat GUI for Windows and macOS (both Silicon and Intel), with GPU acceleration.\n* candle, a Rust ML framework with a focus on performance, including GPU support, and ease of use.\n* ctransformers, a Python library with GPU accel, LangChain support, and OpenAI-compatible AI server. Note, as of time of writing (November 27th 2023), ctransformers has not been updated in a long time and does not support many recent models.## Special thanks\n\n Special thanks to Georgi Gerganov and the whole team working on URL for making all of this possible." ]
text-generation
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
OwOOwO/finalupdate
null
[ "transformers", "safetensors", "stablelm", "text-generation", "conversational", "arxiv:1910.09700", "autotrain_compatible", "endpoints_compatible", "region:us" ]
null
2024-04-30T15:56:13+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #stablelm #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #stablelm #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ 41, 6, 4, 75, 23, 3, 5, 8, 9, 8, 34, 20, 4, 5, 5, 11, 13, 12, 3, 10, 6, 5, 6, 4, 5, 7, 49, 7, 7, 5, 5, 15, 7, 7, 8, 5 ]
[ "TAGS\n#transformers #safetensors #stablelm #text-generation #conversational #arxiv-1910.09700 #autotrain_compatible #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact" ]
null
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
jamesohe/casaudit3-4bit-p02-adapter
null
[ "transformers", "safetensors", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
null
2024-04-30T15:56:57+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ 26, 6, 4, 75, 23, 3, 5, 8, 9, 8, 34, 20, 4, 5, 5, 11, 13, 12, 3, 10, 6, 5, 6, 4, 5, 7, 49, 7, 7, 5, 5, 15, 7, 7, 8, 5 ]
[ "TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact" ]
null
transformers
# Uploaded model - **Developed by:** yadz45 - **License:** apache-2.0 - **Finetuned from model :** unsloth/llama-3-8b-bnb-4bit This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library. [<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
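A minimal loading sketch, assuming the fine-tuned weights can be loaded with Unsloth's `FastLanguageModel` in the same way as the `unsloth/llama-3-8b-bnb-4bit` base model named above; the sequence length and generation settings are illustrative, and swapping in this repository's id is an untested assumption.

```python
from unsloth import FastLanguageModel

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/llama-3-8b-bnb-4bit",  # assumption: replace with "yadz45/IA_simple2" if its weights are published in a loadable format
    max_seq_length=2048,
    load_in_4bit=True,
)
FastLanguageModel.for_inference(model)  # switch to Unsloth's faster inference path

inputs = tokenizer("Hello, how are you?", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```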
{"language": ["en"], "license": "apache-2.0", "tags": ["text-generation-inference", "transformers", "unsloth", "llama", "gguf"], "base_model": "unsloth/llama-3-8b-bnb-4bit"}
yadz45/IA_simple2
null
[ "transformers", "text-generation-inference", "unsloth", "llama", "gguf", "en", "base_model:unsloth/llama-3-8b-bnb-4bit", "license:apache-2.0", "endpoints_compatible", "region:us" ]
null
2024-04-30T15:58:08+00:00
[]
[ "en" ]
TAGS #transformers #text-generation-inference #unsloth #llama #gguf #en #base_model-unsloth/llama-3-8b-bnb-4bit #license-apache-2.0 #endpoints_compatible #region-us
# Uploaded model - Developed by: yadz45 - License: apache-2.0 - Finetuned from model : unsloth/llama-3-8b-bnb-4bit This llama model was trained 2x faster with Unsloth and Huggingface's TRL library. <img src="URL width="200"/>
[ "# Uploaded model\n\n- Developed by: yadz45\n- License: apache-2.0\n- Finetuned from model : unsloth/llama-3-8b-bnb-4bit\n\nThis llama model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>" ]
[ "TAGS\n#transformers #text-generation-inference #unsloth #llama #gguf #en #base_model-unsloth/llama-3-8b-bnb-4bit #license-apache-2.0 #endpoints_compatible #region-us \n", "# Uploaded model\n\n- Developed by: yadz45\n- License: apache-2.0\n- Finetuned from model : unsloth/llama-3-8b-bnb-4bit\n\nThis llama model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>" ]
[ 61, 81 ]
[ "TAGS\n#transformers #text-generation-inference #unsloth #llama #gguf #en #base_model-unsloth/llama-3-8b-bnb-4bit #license-apache-2.0 #endpoints_compatible #region-us \n# Uploaded model\n\n- Developed by: yadz45\n- License: apache-2.0\n- Finetuned from model : unsloth/llama-3-8b-bnb-4bit\n\nThis llama model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>" ]
reinforcement-learning
null
# **Q-Learning** Agent playing **Taxi-v3** This is a trained model of a **Q-Learning** agent playing **Taxi-v3**. ## Usage ```python model = load_from_hub(repo_id="raulgadea/q-Taxi-v3", filename="q-learning.pkl") # Don't forget to check if you need to add additional attributes (is_slippery=False etc) env = gym.make(model["env_id"]) ```
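The `load_from_hub` helper in the snippet above is not defined in the card itself. A self-contained sketch of the usual pattern (download the pickled model dict, then act greedily with the Q-table) might look like the following; the `qtable` key name and the Gymnasium reset/step API are assumptions based on the common course template rather than details documented here.

```python
import pickle

import gymnasium as gym          # assumption: a Gymnasium-compatible environment
import numpy as np
from huggingface_hub import hf_hub_download


def load_from_hub(repo_id: str, filename: str) -> dict:
    """Download and unpickle the model dict stored in the Hub repo."""
    path = hf_hub_download(repo_id=repo_id, filename=filename)
    with open(path, "rb") as f:
        return pickle.load(f)


model = load_from_hub(repo_id="raulgadea/q-Taxi-v3", filename="q-learning.pkl")
env = gym.make(model["env_id"])

# Greedy rollout with the learned Q-table ("qtable" key is an assumption from the course template).
state, _ = env.reset()
done, total_reward = False, 0.0
while not done:
    action = int(np.argmax(model["qtable"][state]))
    state, reward, terminated, truncated, _ = env.step(action)
    total_reward += reward
    done = terminated or truncated
print("episode reward:", total_reward)
```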
{"tags": ["Taxi-v3", "q-learning", "reinforcement-learning", "custom-implementation"], "model-index": [{"name": "q-Taxi-v3", "results": [{"task": {"type": "reinforcement-learning", "name": "reinforcement-learning"}, "dataset": {"name": "Taxi-v3", "type": "Taxi-v3"}, "metrics": [{"type": "mean_reward", "value": "7.56 +/- 2.71", "name": "mean_reward", "verified": false}]}]}]}
raulgadea/q-Taxi-v3
null
[ "Taxi-v3", "q-learning", "reinforcement-learning", "custom-implementation", "model-index", "region:us" ]
null
2024-04-30T15:59:19+00:00
[]
[]
TAGS #Taxi-v3 #q-learning #reinforcement-learning #custom-implementation #model-index #region-us
# Q-Learning Agent playing Taxi-v3 This is a trained model of a Q-Learning agent playing Taxi-v3. ## Usage
[ "# Q-Learning Agent playing1 Taxi-v3\n This is a trained model of a Q-Learning agent playing Taxi-v3 .\n\n ## Usage" ]
[ "TAGS\n#Taxi-v3 #q-learning #reinforcement-learning #custom-implementation #model-index #region-us \n", "# Q-Learning Agent playing1 Taxi-v3\n This is a trained model of a Q-Learning agent playing Taxi-v3 .\n\n ## Usage" ]
[ 26, 31 ]
[ "TAGS\n#Taxi-v3 #q-learning #reinforcement-learning #custom-implementation #model-index #region-us \n# Q-Learning Agent playing1 Taxi-v3\n This is a trained model of a Q-Learning agent playing Taxi-v3 .\n\n ## Usage" ]
text-generation
transformers
# # Fast-Inference with Ctranslate2 Speedup inference while reducing memory by 2x-4x using int8 inference in C++ on CPU or GPU. quantized version of [meta-llama/Llama-2-7b-chat-hf](https://huggingface.co/meta-llama/Llama-2-7b-chat-hf) ```bash pip install ctranslate2 ``` Checkpoint compatible to [ctranslate2>=3.22.0](https://github.com/OpenNMT/CTranslate2) - `compute_type=int8_float16` for `device="cuda"` - `compute_type=int8` for `device="cpu"` Converted on 2023-12-01 using CTranslate2==3.22.0 and ``` from ctranslate2.converters import TransformersConverter TransformersConverter( "meta-llama/Llama-2-7b-chat-hf", activation_scales=None, copy_files=['tokenizer.json', 'USE_POLICY.md', 'generation_config.json', 'README.md', 'special_tokens_map.json', 'tokenizer_config.json', 'LICENSE.txt', '.gitattributes'], load_as_float16=True, revision=None, low_cpu_mem_usage=True, trust_remote_code=True, ).convert( output_dir=str(tmp_dir), vmap = None, quantization="int8", force = True, ) ``` # License and other remarks: This is just a quantized version. License conditions are intended to be idential to original huggingface repo. # Original description, copied from https://huggingface.co/meta-llama/Llama-2-7b-chat-hf # **Llama 2** Llama 2 is a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters. This is the repository for the 7B fine-tuned model, optimized for dialogue use cases and converted for the Hugging Face Transformers format. Links to other models can be found in the index at the bottom. ## Model Details *Note: Use of this model is governed by the Meta license. In order to download the model weights and tokenizer, please visit the [website](https://ai.meta.com/resources/models-and-libraries/llama-downloads/) and accept our License before requesting access here.* Meta developed and publicly released the Llama 2 family of large language models (LLMs), a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters. Our fine-tuned LLMs, called Llama-2-Chat, are optimized for dialogue use cases. Llama-2-Chat models outperform open-source chat models on most benchmarks we tested, and in our human evaluations for helpfulness and safety, are on par with some popular closed-source models like ChatGPT and PaLM. **Model Developers** Meta **Variations** Llama 2 comes in a range of parameter sizes — 7B, 13B, and 70B — as well as pretrained and fine-tuned variations. **Input** Models input text only. **Output** Models generate text only. **Model Architecture** Llama 2 is an auto-regressive language model that uses an optimized transformer architecture. The tuned versions use supervised fine-tuning (SFT) and reinforcement learning with human feedback (RLHF) to align to human preferences for helpfulness and safety. ||Training Data|Params|Content Length|GQA|Tokens|LR| |---|---|---|---|---|---|---| |Llama 2|*A new mix of publicly available online data*|7B|4k|&#10007;|2.0T|3.0 x 10<sup>-4</sup>| |Llama 2|*A new mix of publicly available online data*|13B|4k|&#10007;|2.0T|3.0 x 10<sup>-4</sup>| |Llama 2|*A new mix of publicly available online data*|70B|4k|&#10004;|2.0T|1.5 x 10<sup>-4</sup>| *Llama 2 family of models.* Token counts refer to pretraining data only. All models are trained with a global batch-size of 4M tokens. Bigger models - 70B -- use Grouped-Query Attention (GQA) for improved inference scalability. **Model Dates** Llama 2 was trained between January 2023 and July 2023. 
**Status** This is a static model trained on an offline dataset. Future versions of the tuned models will be released as we improve model safety with community feedback. **License** A custom commercial license is available at: [https://ai.meta.com/resources/models-and-libraries/llama-downloads/](https://ai.meta.com/resources/models-and-libraries/llama-downloads/) **Research Paper** ["Llama-2: Open Foundation and Fine-tuned Chat Models"](arxiv.org/abs/2307.09288) ## Intended Use **Intended Use Cases** Llama 2 is intended for commercial and research use in English. Tuned models are intended for assistant-like chat, whereas pretrained models can be adapted for a variety of natural language generation tasks. To get the expected features and performance for the chat versions, a specific formatting needs to be followed, including the `INST` and `<<SYS>>` tags, `BOS` and `EOS` tokens, and the whitespaces and breaklines in between (we recommend calling `strip()` on inputs to avoid double-spaces). See our reference code in github for details: [`chat_completion`](https://github.com/facebookresearch/llama/blob/main/llama/generation.py#L212). **Out-of-scope Uses** Use in any manner that violates applicable laws or regulations (including trade compliance laws).Use in languages other than English. Use in any other way that is prohibited by the Acceptable Use Policy and Licensing Agreement for Llama 2. ## Hardware and Software **Training Factors** We used custom training libraries, Meta's Research Super Cluster, and production clusters for pretraining. Fine-tuning, annotation, and evaluation were also performed on third-party cloud compute. **Carbon Footprint** Pretraining utilized a cumulative 3.3M GPU hours of computation on hardware of type A100-80GB (TDP of 350-400W). Estimated total emissions were 539 tCO2eq, 100% of which were offset by Meta’s sustainability program. ||Time (GPU hours)|Power Consumption (W)|Carbon Emitted(tCO<sub>2</sub>eq)| |---|---|---|---| |Llama 2 7B|184320|400|31.22| |Llama 2 13B|368640|400|62.44| |Llama 2 70B|1720320|400|291.42| |Total|3311616||539.00| **CO<sub>2</sub> emissions during pretraining.** Time: total GPU time required for training each model. Power Consumption: peak power capacity per GPU device for the GPUs used adjusted for power usage efficiency. 100% of the emissions are directly offset by Meta's sustainability program, and because we are openly releasing these models, the pretraining costs do not need to be incurred by others. ## Training Data **Overview** Llama 2 was pretrained on 2 trillion tokens of data from publicly available sources. The fine-tuning data includes publicly available instruction datasets, as well as over one million new human-annotated examples. Neither the pretraining nor the fine-tuning datasets include Meta user data. **Data Freshness** The pretraining data has a cutoff of September 2022, but some tuning data is more recent, up to July 2023. ## Evaluation Results In this section, we report the results for the Llama 1 and Llama 2 models on standard academic benchmarks.For all the evaluations, we use our internal evaluations library. 
|Model|Size|Code|Commonsense Reasoning|World Knowledge|Reading Comprehension|Math|MMLU|BBH|AGI Eval| |---|---|---|---|---|---|---|---|---|---| |Llama 1|7B|14.1|60.8|46.2|58.5|6.95|35.1|30.3|23.9| |Llama 1|13B|18.9|66.1|52.6|62.3|10.9|46.9|37.0|33.9| |Llama 1|33B|26.0|70.0|58.4|67.6|21.4|57.8|39.8|41.7| |Llama 1|65B|30.7|70.7|60.5|68.6|30.8|63.4|43.5|47.6| |Llama 2|7B|16.8|63.9|48.9|61.3|14.6|45.3|32.6|29.3| |Llama 2|13B|24.5|66.9|55.4|65.8|28.7|54.8|39.4|39.1| |Llama 2|70B|**37.5**|**71.9**|**63.6**|**69.4**|**35.2**|**68.9**|**51.2**|**54.2**| **Overall performance on grouped academic benchmarks.** *Code:* We report the average pass@1 scores of our models on HumanEval and MBPP. *Commonsense Reasoning:* We report the average of PIQA, SIQA, HellaSwag, WinoGrande, ARC easy and challenge, OpenBookQA, and CommonsenseQA. We report 7-shot results for CommonSenseQA and 0-shot results for all other benchmarks. *World Knowledge:* We evaluate the 5-shot performance on NaturalQuestions and TriviaQA and report the average. *Reading Comprehension:* For reading comprehension, we report the 0-shot average on SQuAD, QuAC, and BoolQ. *MATH:* We report the average of the GSM8K (8 shot) and MATH (4 shot) benchmarks at top 1. |||TruthfulQA|Toxigen| |---|---|---|---| |Llama 1|7B|27.42|23.00| |Llama 1|13B|41.74|23.08| |Llama 1|33B|44.19|22.57| |Llama 1|65B|48.71|21.77| |Llama 2|7B|33.29|**21.25**| |Llama 2|13B|41.86|26.10| |Llama 2|70B|**50.18**|24.60| **Evaluation of pretrained LLMs on automatic safety benchmarks.** For TruthfulQA, we present the percentage of generations that are both truthful and informative (the higher the better). For ToxiGen, we present the percentage of toxic generations (the smaller the better). |||TruthfulQA|Toxigen| |---|---|---|---| |Llama-2-Chat|7B|57.04|**0.00**| |Llama-2-Chat|13B|62.18|**0.00**| |Llama-2-Chat|70B|**64.14**|0.01| **Evaluation of fine-tuned LLMs on different safety datasets.** Same metric definitions as above. ## Ethical Considerations and Limitations Llama 2 is a new technology that carries risks with use. Testing conducted to date has been in English, and has not covered, nor could it cover all scenarios. For these reasons, as with all LLMs, Llama 2’s potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate, biased or other objectionable responses to user prompts. Therefore, before deploying any applications of Llama 2, developers should perform safety testing and tuning tailored to their specific applications of the model. 
Please see the Responsible Use Guide available at [https://ai.meta.com/llama/responsible-use-guide/](https://ai.meta.com/llama/responsible-use-guide) ## Reporting Issues Please report any software “bug,” or other problems with the models through one of the following means: - Reporting issues with the model: [github.com/facebookresearch/llama](http://github.com/facebookresearch/llama) - Reporting problematic content generated by the model: [developers.facebook.com/llama_output_feedback](http://developers.facebook.com/llama_output_feedback) - Reporting bugs and security concerns: [facebook.com/whitehat/info](http://facebook.com/whitehat/info) ## Llama Model Index |Model|Llama2|Llama2-hf|Llama2-chat|Llama2-chat-hf| |---|---|---|---|---| |7B| [Link](https://huggingface.co/meta-llama/Llama-2-7b) | [Link](https://huggingface.co/meta-llama/Llama-2-7b-hf) | [Link](https://huggingface.co/meta-llama/Llama-2-7b-chat) | [Link](https://huggingface.co/meta-llama/Llama-2-7b-chat-hf)| |13B| [Link](https://huggingface.co/meta-llama/Llama-2-13b) | [Link](https://huggingface.co/meta-llama/Llama-2-13b-hf) | [Link](https://huggingface.co/meta-llama/Llama-2-13b-chat) | [Link](https://huggingface.co/meta-llama/Llama-2-13b-chat-hf)| |70B| [Link](https://huggingface.co/meta-llama/Llama-2-70b) | [Link](https://huggingface.co/meta-llama/Llama-2-70b-hf) | [Link](https://huggingface.co/meta-llama/Llama-2-70b-chat) | [Link](https://huggingface.co/meta-llama/Llama-2-70b-chat-hf)|
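To complement the conversion snippet at the top of this card, a hedged generation sketch with CTranslate2 follows. The local directory name and the simplified prompt formatting are assumptions; the tokenizer files should be available in the converted directory because the converter copied them alongside the weights.

```python
import ctranslate2
import transformers

model_dir = "Llama-2-7b-chat-hf-ct2-int8"  # assumed local path of the converted checkpoint
generator = ctranslate2.Generator(model_dir, device="cuda", compute_type="int8_float16")
tokenizer = transformers.AutoTokenizer.from_pretrained(model_dir)

prompt = "[INST] What does int8 quantization change in practice? [/INST]"  # simplified Llama-2 chat formatting
tokens = tokenizer.convert_ids_to_tokens(tokenizer.encode(prompt))

results = generator.generate_batch(
    [tokens],
    max_length=256,
    sampling_temperature=0.7,
    sampling_topk=10,
    include_prompt_in_result=False,  # return only the generated continuation
)
print(tokenizer.decode(results[0].sequences_ids[0]))
```

For CPU-only machines, `device="cpu"` with `compute_type="int8"` matches the compute types listed in the card.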
{"language": ["en"], "tags": ["ctranslate2", "int8", "float16", "facebook", "meta", "pytorch", "llama", "llama-2"], "extra_gated_heading": "Access Llama 2 on Hugging Face", "extra_gated_description": "This is a form to enable access to Llama 2 on Hugging Face after you have been granted access from Meta. Please visit the [Meta website](https://ai.meta.com/resources/models-and-libraries/llama-downloads) and accept our license terms and acceptable use policy before submitting this form. Requests will be processed in 1-2 days.", "extra_gated_prompt": "**Your Hugging Face account email address MUST match the email you provide on the Meta website, or your request will not be approved.**", "extra_gated_button_content": "Submit", "extra_gated_fields": {"I agree to share my name, email address and username with Meta and confirm that I have already been granted download access on the Meta website": "checkbox"}, "pipeline_tag": "text-generation", "inference": false, "arxiv": 2307.09288}
Weblet/Llama-2-7b-chat-hf-ct2-int8
null
[ "transformers", "pytorch", "llama", "text-generation", "ctranslate2", "int8", "float16", "facebook", "meta", "llama-2", "conversational", "en", "arxiv:2307.09288", "autotrain_compatible", "text-generation-inference", "region:us" ]
null
2024-04-30T15:59:38+00:00
[ "2307.09288" ]
[ "en" ]
TAGS #transformers #pytorch #llama #text-generation #ctranslate2 #int8 #float16 #facebook #meta #llama-2 #conversational #en #arxiv-2307.09288 #autotrain_compatible #text-generation-inference #region-us
# Fast-Inference with Ctranslate2 ================================= Speedup inference while reducing memory by 2x-4x using int8 inference in C++ on CPU or GPU. quantized version of meta-llama/Llama-2-7b-chat-hf Checkpoint compatible to ctranslate2>=3.22.0 * 'compute\_type=int8\_float16' for 'device="cuda"' * 'compute\_type=int8' for 'device="cpu"' Converted on 2023-12-01 using CTranslate2==3.22.0 and License and other remarks: ========================== This is just a quantized version. License conditions are intended to be idential to original huggingface repo. Original description, copied from URL ===================================== Llama 2 ======= Llama 2 is a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters. This is the repository for the 7B fine-tuned model, optimized for dialogue use cases and converted for the Hugging Face Transformers format. Links to other models can be found in the index at the bottom. Model Details ------------- *Note: Use of this model is governed by the Meta license. In order to download the model weights and tokenizer, please visit the website and accept our License before requesting access here.* Meta developed and publicly released the Llama 2 family of large language models (LLMs), a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters. Our fine-tuned LLMs, called Llama-2-Chat, are optimized for dialogue use cases. Llama-2-Chat models outperform open-source chat models on most benchmarks we tested, and in our human evaluations for helpfulness and safety, are on par with some popular closed-source models like ChatGPT and PaLM. Model Developers Meta Variations Llama 2 comes in a range of parameter sizes — 7B, 13B, and 70B — as well as pretrained and fine-tuned variations. Input Models input text only. Output Models generate text only. Model Architecture Llama 2 is an auto-regressive language model that uses an optimized transformer architecture. The tuned versions use supervised fine-tuning (SFT) and reinforcement learning with human feedback (RLHF) to align to human preferences for helpfulness and safety. *Llama 2 family of models.* Token counts refer to pretraining data only. All models are trained with a global batch-size of 4M tokens. Bigger models - 70B -- use Grouped-Query Attention (GQA) for improved inference scalability. Model Dates Llama 2 was trained between January 2023 and July 2023. Status This is a static model trained on an offline dataset. Future versions of the tuned models will be released as we improve model safety with community feedback. License A custom commercial license is available at: URL Research Paper "Llama-2: Open Foundation and Fine-tuned Chat Models" Intended Use ------------ Intended Use Cases Llama 2 is intended for commercial and research use in English. Tuned models are intended for assistant-like chat, whereas pretrained models can be adapted for a variety of natural language generation tasks. To get the expected features and performance for the chat versions, a specific formatting needs to be followed, including the 'INST' and '<>' tags, 'BOS' and 'EOS' tokens, and the whitespaces and breaklines in between (we recommend calling 'strip()' on inputs to avoid double-spaces). See our reference code in github for details: 'chat\_completion'. 
Out-of-scope Uses Use in any manner that violates applicable laws or regulations (including trade compliance laws).Use in languages other than English. Use in any other way that is prohibited by the Acceptable Use Policy and Licensing Agreement for Llama 2. Hardware and Software --------------------- Training Factors We used custom training libraries, Meta's Research Super Cluster, and production clusters for pretraining. Fine-tuning, annotation, and evaluation were also performed on third-party cloud compute. Carbon Footprint Pretraining utilized a cumulative 3.3M GPU hours of computation on hardware of type A100-80GB (TDP of 350-400W). Estimated total emissions were 539 tCO2eq, 100% of which were offset by Meta’s sustainability program. CO2 emissions during pretraining. Time: total GPU time required for training each model. Power Consumption: peak power capacity per GPU device for the GPUs used adjusted for power usage efficiency. 100% of the emissions are directly offset by Meta's sustainability program, and because we are openly releasing these models, the pretraining costs do not need to be incurred by others. Training Data ------------- Overview Llama 2 was pretrained on 2 trillion tokens of data from publicly available sources. The fine-tuning data includes publicly available instruction datasets, as well as over one million new human-annotated examples. Neither the pretraining nor the fine-tuning datasets include Meta user data. Data Freshness The pretraining data has a cutoff of September 2022, but some tuning data is more recent, up to July 2023. Evaluation Results ------------------ In this section, we report the results for the Llama 1 and Llama 2 models on standard academic benchmarks.For all the evaluations, we use our internal evaluations library. Overall performance on grouped academic benchmarks. *Code:* We report the average pass@1 scores of our models on HumanEval and MBPP. *Commonsense Reasoning:* We report the average of PIQA, SIQA, HellaSwag, WinoGrande, ARC easy and challenge, OpenBookQA, and CommonsenseQA. We report 7-shot results for CommonSenseQA and 0-shot results for all other benchmarks. *World Knowledge:* We evaluate the 5-shot performance on NaturalQuestions and TriviaQA and report the average. *Reading Comprehension:* For reading comprehension, we report the 0-shot average on SQuAD, QuAC, and BoolQ. *MATH:* We report the average of the GSM8K (8 shot) and MATH (4 shot) benchmarks at top 1. Evaluation of pretrained LLMs on automatic safety benchmarks. For TruthfulQA, we present the percentage of generations that are both truthful and informative (the higher the better). For ToxiGen, we present the percentage of toxic generations (the smaller the better). Evaluation of fine-tuned LLMs on different safety datasets. Same metric definitions as above. Ethical Considerations and Limitations -------------------------------------- Llama 2 is a new technology that carries risks with use. Testing conducted to date has been in English, and has not covered, nor could it cover all scenarios. For these reasons, as with all LLMs, Llama 2’s potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate, biased or other objectionable responses to user prompts. Therefore, before deploying any applications of Llama 2, developers should perform safety testing and tuning tailored to their specific applications of the model. 
Please see the Responsible Use Guide available at URL Reporting Issues ---------------- Please report any software “bug,” or other problems with the models through one of the following means: * Reporting issues with the model: URL * Reporting problematic content generated by the model: URL * Reporting bugs and security concerns: URL Llama Model Index -----------------
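The compute_type values listed above map directly onto the CTranslate2 generator API. Below is a minimal usage sketch, not taken from the original card: it assumes the converted checkpoint has been downloaded to a local directory (the path `llama-2-7b-chat-ct2` is a placeholder), that the original meta-llama/Llama-2-7b-chat-hf tokenizer is used for encoding and decoding, and that the prompt and sampling settings are illustrative only.

```python
# Minimal sketch (assumptions noted above): int8_float16 generation with CTranslate2.
import ctranslate2
from transformers import AutoTokenizer

model_dir = "llama-2-7b-chat-ct2"  # placeholder path to the converted checkpoint
generator = ctranslate2.Generator(model_dir, device="cuda", compute_type="int8_float16")
# For CPU-only inference use: ctranslate2.Generator(model_dir, device="cpu", compute_type="int8")

tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-chat-hf")

prompt = "[INST] Explain int8 quantization in one sentence. [/INST]"
tokens = tokenizer.convert_ids_to_tokens(tokenizer.encode(prompt))

# generate_batch expects token strings rather than ids; sampling values are illustrative
results = generator.generate_batch(
    [tokens], max_length=256, sampling_topk=10, sampling_temperature=0.7
)
output_ids = results[0].sequences_ids[0]
print(tokenizer.decode(output_ids, skip_special_tokens=True))
```

The only change needed for CPU inference is the device/compute_type pair quoted in the card.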
[ "# Fast-Inference with Ctranslate2\n=================================\n\n\nSpeedup inference while reducing memory by 2x-4x using int8 inference in C++ on CPU or GPU.\n\n\nquantized version of meta-llama/Llama-2-7b-chat-hf\n\n\nCheckpoint compatible to ctranslate2>=3.22.0\n\n\n* 'compute\\_type=int8\\_float16' for 'device=\"cuda\"'\n* 'compute\\_type=int8' for 'device=\"cpu\"'\n\n\nConverted on 2023-12-01 using CTranslate2==3.22.0 and\n\n\nLicense and other remarks:\n==========================\n\n\nThis is just a quantized version. License conditions are intended to be idential to original huggingface repo.\n\n\nOriginal description, copied from URL\n=====================================\n\n\nLlama 2\n=======\n\n\nLlama 2 is a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters. This is the repository for the 7B fine-tuned model, optimized for dialogue use cases and converted for the Hugging Face Transformers format. Links to other models can be found in the index at the bottom.\n\n\nModel Details\n-------------\n\n\n*Note: Use of this model is governed by the Meta license. In order to download the model weights and tokenizer, please visit the website and accept our License before requesting access here.*\n\n\nMeta developed and publicly released the Llama 2 family of large language models (LLMs), a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters. Our fine-tuned LLMs, called Llama-2-Chat, are optimized for dialogue use cases. Llama-2-Chat models outperform open-source chat models on most benchmarks we tested, and in our human evaluations for helpfulness and safety, are on par with some popular closed-source models like ChatGPT and PaLM.\n\n\nModel Developers Meta\n\n\nVariations Llama 2 comes in a range of parameter sizes — 7B, 13B, and 70B — as well as pretrained and fine-tuned variations.\n\n\nInput Models input text only.\n\n\nOutput Models generate text only.\n\n\nModel Architecture Llama 2 is an auto-regressive language model that uses an optimized transformer architecture. The tuned versions use supervised fine-tuning (SFT) and reinforcement learning with human feedback (RLHF) to align to human preferences for helpfulness and safety.\n\n\n\n*Llama 2 family of models.* Token counts refer to pretraining data only. All models are trained with a global batch-size of 4M tokens. Bigger models - 70B -- use Grouped-Query Attention (GQA) for improved inference scalability.\n\n\nModel Dates Llama 2 was trained between January 2023 and July 2023.\n\n\nStatus This is a static model trained on an offline dataset. Future versions of the tuned models will be released as we improve model safety with community feedback.\n\n\nLicense A custom commercial license is available at: URL\n\n\nResearch Paper \"Llama-2: Open Foundation and Fine-tuned Chat Models\"\n\n\nIntended Use\n------------\n\n\nIntended Use Cases Llama 2 is intended for commercial and research use in English. Tuned models are intended for assistant-like chat, whereas pretrained models can be adapted for a variety of natural language generation tasks.\n\n\nTo get the expected features and performance for the chat versions, a specific formatting needs to be followed, including the 'INST' and '<>' tags, 'BOS' and 'EOS' tokens, and the whitespaces and breaklines in between (we recommend calling 'strip()' on inputs to avoid double-spaces). 
See our reference code in github for details: 'chat\\_completion'.\n\n\nOut-of-scope Uses Use in any manner that violates applicable laws or regulations (including trade compliance laws).Use in languages other than English. Use in any other way that is prohibited by the Acceptable Use Policy and Licensing Agreement for Llama 2.\n\n\nHardware and Software\n---------------------\n\n\nTraining Factors We used custom training libraries, Meta's Research Super Cluster, and production clusters for pretraining. Fine-tuning, annotation, and evaluation were also performed on third-party cloud compute.\n\n\nCarbon Footprint Pretraining utilized a cumulative 3.3M GPU hours of computation on hardware of type A100-80GB (TDP of 350-400W). Estimated total emissions were 539 tCO2eq, 100% of which were offset by Meta’s sustainability program.\n\n\n\nCO2 emissions during pretraining. Time: total GPU time required for training each model. Power Consumption: peak power capacity per GPU device for the GPUs used adjusted for power usage efficiency. 100% of the emissions are directly offset by Meta's sustainability program, and because we are openly releasing these models, the pretraining costs do not need to be incurred by others.\n\n\nTraining Data\n-------------\n\n\nOverview Llama 2 was pretrained on 2 trillion tokens of data from publicly available sources. The fine-tuning data includes publicly available instruction datasets, as well as over one million new human-annotated examples. Neither the pretraining nor the fine-tuning datasets include Meta user data.\n\n\nData Freshness The pretraining data has a cutoff of September 2022, but some tuning data is more recent, up to July 2023.\n\n\nEvaluation Results\n------------------\n\n\nIn this section, we report the results for the Llama 1 and Llama 2 models on standard academic benchmarks.For all the evaluations, we use our internal evaluations library.\n\n\n\nOverall performance on grouped academic benchmarks. *Code:* We report the average pass@1 scores of our models on HumanEval and MBPP. *Commonsense Reasoning:* We report the average of PIQA, SIQA, HellaSwag, WinoGrande, ARC easy and challenge, OpenBookQA, and CommonsenseQA. We report 7-shot results for CommonSenseQA and 0-shot results for all other benchmarks. *World Knowledge:* We evaluate the 5-shot performance on NaturalQuestions and TriviaQA and report the average. *Reading Comprehension:* For reading comprehension, we report the 0-shot average on SQuAD, QuAC, and BoolQ. *MATH:* We report the average of the GSM8K (8 shot) and MATH (4 shot) benchmarks at top 1.\n\n\n\nEvaluation of pretrained LLMs on automatic safety benchmarks. For TruthfulQA, we present the percentage of generations that are both truthful and informative (the higher the better). For ToxiGen, we present the percentage of toxic generations (the smaller the better).\n\n\n\nEvaluation of fine-tuned LLMs on different safety datasets. Same metric definitions as above.\n\n\nEthical Considerations and Limitations\n--------------------------------------\n\n\nLlama 2 is a new technology that carries risks with use. Testing conducted to date has been in English, and has not covered, nor could it cover all scenarios. For these reasons, as with all LLMs, Llama 2’s potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate, biased or other objectionable responses to user prompts. 
Therefore, before deploying any applications of Llama 2, developers should perform safety testing and tuning tailored to their specific applications of the model.\n\n\nPlease see the Responsible Use Guide available at URL\n\n\nReporting Issues\n----------------\n\n\nPlease report any software “bug,” or other problems with the models through one of the following means:\n\n\n* Reporting issues with the model: URL\n* Reporting problematic content generated by the model: URL\n* Reporting bugs and security concerns: URL\n\n\nLlama Model Index\n-----------------" ]
[ "TAGS\n#transformers #pytorch #llama #text-generation #ctranslate2 #int8 #float16 #facebook #meta #llama-2 #conversational #en #arxiv-2307.09288 #autotrain_compatible #text-generation-inference #region-us \n", "# Fast-Inference with Ctranslate2\n=================================\n\n\nSpeedup inference while reducing memory by 2x-4x using int8 inference in C++ on CPU or GPU.\n\n\nquantized version of meta-llama/Llama-2-7b-chat-hf\n\n\nCheckpoint compatible to ctranslate2>=3.22.0\n\n\n* 'compute\\_type=int8\\_float16' for 'device=\"cuda\"'\n* 'compute\\_type=int8' for 'device=\"cpu\"'\n\n\nConverted on 2023-12-01 using CTranslate2==3.22.0 and\n\n\nLicense and other remarks:\n==========================\n\n\nThis is just a quantized version. License conditions are intended to be idential to original huggingface repo.\n\n\nOriginal description, copied from URL\n=====================================\n\n\nLlama 2\n=======\n\n\nLlama 2 is a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters. This is the repository for the 7B fine-tuned model, optimized for dialogue use cases and converted for the Hugging Face Transformers format. Links to other models can be found in the index at the bottom.\n\n\nModel Details\n-------------\n\n\n*Note: Use of this model is governed by the Meta license. In order to download the model weights and tokenizer, please visit the website and accept our License before requesting access here.*\n\n\nMeta developed and publicly released the Llama 2 family of large language models (LLMs), a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters. Our fine-tuned LLMs, called Llama-2-Chat, are optimized for dialogue use cases. Llama-2-Chat models outperform open-source chat models on most benchmarks we tested, and in our human evaluations for helpfulness and safety, are on par with some popular closed-source models like ChatGPT and PaLM.\n\n\nModel Developers Meta\n\n\nVariations Llama 2 comes in a range of parameter sizes — 7B, 13B, and 70B — as well as pretrained and fine-tuned variations.\n\n\nInput Models input text only.\n\n\nOutput Models generate text only.\n\n\nModel Architecture Llama 2 is an auto-regressive language model that uses an optimized transformer architecture. The tuned versions use supervised fine-tuning (SFT) and reinforcement learning with human feedback (RLHF) to align to human preferences for helpfulness and safety.\n\n\n\n*Llama 2 family of models.* Token counts refer to pretraining data only. All models are trained with a global batch-size of 4M tokens. Bigger models - 70B -- use Grouped-Query Attention (GQA) for improved inference scalability.\n\n\nModel Dates Llama 2 was trained between January 2023 and July 2023.\n\n\nStatus This is a static model trained on an offline dataset. Future versions of the tuned models will be released as we improve model safety with community feedback.\n\n\nLicense A custom commercial license is available at: URL\n\n\nResearch Paper \"Llama-2: Open Foundation and Fine-tuned Chat Models\"\n\n\nIntended Use\n------------\n\n\nIntended Use Cases Llama 2 is intended for commercial and research use in English. 
Tuned models are intended for assistant-like chat, whereas pretrained models can be adapted for a variety of natural language generation tasks.\n\n\nTo get the expected features and performance for the chat versions, a specific formatting needs to be followed, including the 'INST' and '<>' tags, 'BOS' and 'EOS' tokens, and the whitespaces and breaklines in between (we recommend calling 'strip()' on inputs to avoid double-spaces). See our reference code in github for details: 'chat\\_completion'.\n\n\nOut-of-scope Uses Use in any manner that violates applicable laws or regulations (including trade compliance laws).Use in languages other than English. Use in any other way that is prohibited by the Acceptable Use Policy and Licensing Agreement for Llama 2.\n\n\nHardware and Software\n---------------------\n\n\nTraining Factors We used custom training libraries, Meta's Research Super Cluster, and production clusters for pretraining. Fine-tuning, annotation, and evaluation were also performed on third-party cloud compute.\n\n\nCarbon Footprint Pretraining utilized a cumulative 3.3M GPU hours of computation on hardware of type A100-80GB (TDP of 350-400W). Estimated total emissions were 539 tCO2eq, 100% of which were offset by Meta’s sustainability program.\n\n\n\nCO2 emissions during pretraining. Time: total GPU time required for training each model. Power Consumption: peak power capacity per GPU device for the GPUs used adjusted for power usage efficiency. 100% of the emissions are directly offset by Meta's sustainability program, and because we are openly releasing these models, the pretraining costs do not need to be incurred by others.\n\n\nTraining Data\n-------------\n\n\nOverview Llama 2 was pretrained on 2 trillion tokens of data from publicly available sources. The fine-tuning data includes publicly available instruction datasets, as well as over one million new human-annotated examples. Neither the pretraining nor the fine-tuning datasets include Meta user data.\n\n\nData Freshness The pretraining data has a cutoff of September 2022, but some tuning data is more recent, up to July 2023.\n\n\nEvaluation Results\n------------------\n\n\nIn this section, we report the results for the Llama 1 and Llama 2 models on standard academic benchmarks.For all the evaluations, we use our internal evaluations library.\n\n\n\nOverall performance on grouped academic benchmarks. *Code:* We report the average pass@1 scores of our models on HumanEval and MBPP. *Commonsense Reasoning:* We report the average of PIQA, SIQA, HellaSwag, WinoGrande, ARC easy and challenge, OpenBookQA, and CommonsenseQA. We report 7-shot results for CommonSenseQA and 0-shot results for all other benchmarks. *World Knowledge:* We evaluate the 5-shot performance on NaturalQuestions and TriviaQA and report the average. *Reading Comprehension:* For reading comprehension, we report the 0-shot average on SQuAD, QuAC, and BoolQ. *MATH:* We report the average of the GSM8K (8 shot) and MATH (4 shot) benchmarks at top 1.\n\n\n\nEvaluation of pretrained LLMs on automatic safety benchmarks. For TruthfulQA, we present the percentage of generations that are both truthful and informative (the higher the better). For ToxiGen, we present the percentage of toxic generations (the smaller the better).\n\n\n\nEvaluation of fine-tuned LLMs on different safety datasets. Same metric definitions as above.\n\n\nEthical Considerations and Limitations\n--------------------------------------\n\n\nLlama 2 is a new technology that carries risks with use. 
Testing conducted to date has been in English, and has not covered, nor could it cover all scenarios. For these reasons, as with all LLMs, Llama 2’s potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate, biased or other objectionable responses to user prompts. Therefore, before deploying any applications of Llama 2, developers should perform safety testing and tuning tailored to their specific applications of the model.\n\n\nPlease see the Responsible Use Guide available at URL\n\n\nReporting Issues\n----------------\n\n\nPlease report any software “bug,” or other problems with the models through one of the following means:\n\n\n* Reporting issues with the model: URL\n* Reporting problematic content generated by the model: URL\n* Reporting bugs and security concerns: URL\n\n\nLlama Model Index\n-----------------" ]
[ 67, 1838 ]
[ "TAGS\n#transformers #pytorch #llama #text-generation #ctranslate2 #int8 #float16 #facebook #meta #llama-2 #conversational #en #arxiv-2307.09288 #autotrain_compatible #text-generation-inference #region-us \n# Fast-Inference with Ctranslate2\n=================================\n\n\nSpeedup inference while reducing memory by 2x-4x using int8 inference in C++ on CPU or GPU.\n\n\nquantized version of meta-llama/Llama-2-7b-chat-hf\n\n\nCheckpoint compatible to ctranslate2>=3.22.0\n\n\n* 'compute\\_type=int8\\_float16' for 'device=\"cuda\"'\n* 'compute\\_type=int8' for 'device=\"cpu\"'\n\n\nConverted on 2023-12-01 using CTranslate2==3.22.0 and\n\n\nLicense and other remarks:\n==========================\n\n\nThis is just a quantized version. License conditions are intended to be idential to original huggingface repo.\n\n\nOriginal description, copied from URL\n=====================================\n\n\nLlama 2\n=======\n\n\nLlama 2 is a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters. This is the repository for the 7B fine-tuned model, optimized for dialogue use cases and converted for the Hugging Face Transformers format. Links to other models can be found in the index at the bottom.\n\n\nModel Details\n-------------\n\n\n*Note: Use of this model is governed by the Meta license. In order to download the model weights and tokenizer, please visit the website and accept our License before requesting access here.*\n\n\nMeta developed and publicly released the Llama 2 family of large language models (LLMs), a collection of pretrained and fine-tuned generative text models ranging in scale from 7 billion to 70 billion parameters. Our fine-tuned LLMs, called Llama-2-Chat, are optimized for dialogue use cases. Llama-2-Chat models outperform open-source chat models on most benchmarks we tested, and in our human evaluations for helpfulness and safety, are on par with some popular closed-source models like ChatGPT and PaLM.\n\n\nModel Developers Meta\n\n\nVariations Llama 2 comes in a range of parameter sizes — 7B, 13B, and 70B — as well as pretrained and fine-tuned variations.\n\n\nInput Models input text only.\n\n\nOutput Models generate text only.\n\n\nModel Architecture Llama 2 is an auto-regressive language model that uses an optimized transformer architecture. The tuned versions use supervised fine-tuning (SFT) and reinforcement learning with human feedback (RLHF) to align to human preferences for helpfulness and safety.\n\n\n\n*Llama 2 family of models.* Token counts refer to pretraining data only. All models are trained with a global batch-size of 4M tokens. Bigger models - 70B -- use Grouped-Query Attention (GQA) for improved inference scalability.\n\n\nModel Dates Llama 2 was trained between January 2023 and July 2023.\n\n\nStatus This is a static model trained on an offline dataset. Future versions of the tuned models will be released as we improve model safety with community feedback.\n\n\nLicense A custom commercial license is available at: URL\n\n\nResearch Paper \"Llama-2: Open Foundation and Fine-tuned Chat Models\"\n\n\nIntended Use\n------------\n\n\nIntended Use Cases Llama 2 is intended for commercial and research use in English. 
Tuned models are intended for assistant-like chat, whereas pretrained models can be adapted for a variety of natural language generation tasks.\n\n\nTo get the expected features and performance for the chat versions, a specific formatting needs to be followed, including the 'INST' and '<>' tags, 'BOS' and 'EOS' tokens, and the whitespaces and breaklines in between (we recommend calling 'strip()' on inputs to avoid double-spaces). See our reference code in github for details: 'chat\\_completion'.\n\n\nOut-of-scope Uses Use in any manner that violates applicable laws or regulations (including trade compliance laws).Use in languages other than English. Use in any other way that is prohibited by the Acceptable Use Policy and Licensing Agreement for Llama 2.\n\n\nHardware and Software\n---------------------\n\n\nTraining Factors We used custom training libraries, Meta's Research Super Cluster, and production clusters for pretraining. Fine-tuning, annotation, and evaluation were also performed on third-party cloud compute.\n\n\nCarbon Footprint Pretraining utilized a cumulative 3.3M GPU hours of computation on hardware of type A100-80GB (TDP of 350-400W). Estimated total emissions were 539 tCO2eq, 100% of which were offset by Meta’s sustainability program.\n\n\n\nCO2 emissions during pretraining. Time: total GPU time required for training each model. Power Consumption: peak power capacity per GPU device for the GPUs used adjusted for power usage efficiency. 100% of the emissions are directly offset by Meta's sustainability program, and because we are openly releasing these models, the pretraining costs do not need to be incurred by others.\n\n\nTraining Data\n-------------\n\n\nOverview Llama 2 was pretrained on 2 trillion tokens of data from publicly available sources. The fine-tuning data includes publicly available instruction datasets, as well as over one million new human-annotated examples. Neither the pretraining nor the fine-tuning datasets include Meta user data.\n\n\nData Freshness The pretraining data has a cutoff of September 2022, but some tuning data is more recent, up to July 2023.\n\n\nEvaluation Results\n------------------\n\n\nIn this section, we report the results for the Llama 1 and Llama 2 models on standard academic benchmarks.For all the evaluations, we use our internal evaluations library.\n\n\n\nOverall performance on grouped academic benchmarks. *Code:* We report the average pass@1 scores of our models on HumanEval and MBPP. *Commonsense Reasoning:* We report the average of PIQA, SIQA, HellaSwag, WinoGrande, ARC easy and challenge, OpenBookQA, and CommonsenseQA. We report 7-shot results for CommonSenseQA and 0-shot results for all other benchmarks. *World Knowledge:* We evaluate the 5-shot performance on NaturalQuestions and TriviaQA and report the average. *Reading Comprehension:* For reading comprehension, we report the 0-shot average on SQuAD, QuAC, and BoolQ. *MATH:* We report the average of the GSM8K (8 shot) and MATH (4 shot) benchmarks at top 1.\n\n\n\nEvaluation of pretrained LLMs on automatic safety benchmarks. For TruthfulQA, we present the percentage of generations that are both truthful and informative (the higher the better). For ToxiGen, we present the percentage of toxic generations (the smaller the better).\n\n\n\nEvaluation of fine-tuned LLMs on different safety datasets. Same metric definitions as above.\n\n\nEthical Considerations and Limitations\n--------------------------------------\n\n\nLlama 2 is a new technology that carries risks with use. 
Testing conducted to date has been in English, and has not covered, nor could it cover all scenarios. For these reasons, as with all LLMs, Llama 2’s potential outputs cannot be predicted in advance, and the model may in some instances produce inaccurate, biased or other objectionable responses to user prompts. Therefore, before deploying any applications of Llama 2, developers should perform safety testing and tuning tailored to their specific applications of the model.\n\n\nPlease see the Responsible Use Guide available at URL\n\n\nReporting Issues\n----------------\n\n\nPlease report any software “bug,” or other problems with the models through one of the following means:\n\n\n* Reporting issues with the model: URL\n* Reporting problematic content generated by the model: URL\n* Reporting bugs and security concerns: URL\n\n\nLlama Model Index\n-----------------" ]
text-to-image
null
## ReaL <img src="https://via.placeholder.com/468x300?text=App+Screenshot+Here" alt="Generated on Image Pipeline" style="border-radius: 10px;"> **This lora model is uploaded on [imagepipeline.io](https://imagepipeline.io/)** Model details - FOOTJOB REAL [![Try this model](https://img.shields.io/badge/try_this_model-image_pipeline-BD9319)](https://imagepipeline.io/models/ReaL?id=16043964-2894-4bc4-9ffb-5c0f42c7938c/) ## How to try this model ? You can try using it locally or send an API call to test the output quality. Get your `API_KEY` from [imagepipeline.io](https://imagepipeline.io/). No payment required. Coding in `php` `javascript` `node` etc ? Checkout our documentation [![documentation](https://img.shields.io/badge/documentation-image_pipeline-blue)](https://docs.imagepipeline.io/docs/introduction) ```python import requests import json url = "https://imagepipeline.io/sd/text2image/v1/run" payload = json.dumps({ "model_id": "sd1.5", "prompt": "ultra realistic close up portrait ((beautiful pale cyberpunk female with heavy black eyeliner)), blue eyes, shaved side haircut, hyper detail, cinematic lighting, magic neon, dark red city, Canon EOS R3, nikon, f/1.4, ISO 200, 1/160s, 8K, RAW, unedited, symmetrical balance, in-frame, 8K", "negative_prompt": "painting, extra fingers, mutated hands, poorly drawn hands, poorly drawn face, deformed, ugly, blurry, bad anatomy, bad proportions, extra limbs, cloned face, skinny, glitchy, double torso, extra arms, extra hands, mangled fingers, missing lips, ugly face, distorted face, extra legs, anime", "width": "512", "height": "512", "samples": "1", "num_inference_steps": "30", "safety_checker": false, "guidance_scale": 7.5, "multi_lingual": "no", "embeddings": "", "lora_models": "16043964-2894-4bc4-9ffb-5c0f42c7938c", "lora_weights": "0.5" }) headers = { 'Content-Type': 'application/json', 'API-Key': 'your_api_key' } response = requests.request("POST", url, headers=headers, data=payload) print(response.text) ``` Get more ready to use `MODELS` like this for `SD 1.5` and `SDXL` : [![All models](https://img.shields.io/badge/Get%20All%20Models-image_pipeline-BD9319)](https://imagepipeline.io/models) ### API Reference #### Generate Image ```http https://api.imagepipeline.io/sd/text2image/v1 ``` | Headers | Type | Description | |:----------------------| :------- |:-------------------------------------------------------------------------------------------------------------------| | `API-Key` | `str` | Get your `API_KEY` from [imagepipeline.io](https://imagepipeline.io/) | | `Content-Type` | `str` | application/json - content type of the request body | | Parameter | Type | Description | | :-------- | :------- | :------------------------- | | `model_id` | `str` | Your base model, find available lists in [models page](https://imagepipeline.io/models) or upload your own| | `prompt` | `str` | Text Prompt. Check our [Prompt Guide](https://docs.imagepipeline.io/docs/SD-1.5/docs/extras/prompt-guide) for tips | | `num_inference_steps` | `int [1-50]` | Noise is removed with each step, resulting in a higher-quality image over time. Ideal value 30-50 (without LCM) | | `guidance_scale` | `float [1-20]` | Higher guidance scale prioritizes text prompt relevance but sacrifices image quality.
Ideal value 7.5-12.5 | | `lora_models` | `str, array` | Pass the model_id(s) of LoRA models that can be found in models page | | `lora_weights` | `str, array` | Strength of the LoRA effect | --- license: creativeml-openrail-m tags: - imagepipeline - imagepipeline.io - text-to-image - ultra-realistic pinned: false pipeline_tag: text-to-image --- ### Feedback If you have any feedback, please reach out to us at [email protected] #### 🔗 Visit Website [![portfolio](https://img.shields.io/badge/image_pipeline-BD9319?style=for-the-badge&logo=gocd&logoColor=white)](https://imagepipeline.io/) If you are the original author of this model, please [click here](https://airtable.com/apprTaRnJbDJ8ufOx/shr4g7o9B6fWfOlUR) to add credits
{"license": "creativeml-openrail-m", "tags": ["imagepipeline", "imagepipeline.io", "text-to-image", "ultra-realistic"], "pinned": false, "pipeline_tag": "text-to-image"}
imagepipeline/ReaL
null
[ "imagepipeline", "imagepipeline.io", "text-to-image", "ultra-realistic", "license:creativeml-openrail-m", "region:us" ]
null
2024-04-30T15:59:39+00:00
[]
[]
TAGS #imagepipeline #imagepipeline.io #text-to-image #ultra-realistic #license-creativeml-openrail-m #region-us
ReaL ---- <img src="URL alt="Generated on Image Pipeline" style="border-radius: 10px;"> This lora model is uploaded on URL Model details - FOOTJOB REAL ![Try this model](URL How to try this model ? ----------------------- You can try using it locally or send an API call to test the output quality. Get your 'API\_KEY' from URL. No payment required. Coding in 'php' 'javascript' 'node' etc ? Checkout our documentation ![documentation](URL Get more ready to use 'MODELS' like this for 'SD 1.5' and 'SDXL' : ![All models](URL ### API Reference #### Generate Image --- license: creativeml-openrail-m tags: * imagepipeline * URL * text-to-image * ultra-realistic pinned: false pipeline\_tag: text-to-image --- ### Feedback If you have any feedback, please reach out to us at hello@URL #### Visit Website ![portfolio](URL If you are the original author of this model, please click here to add credits
[ "### API Reference", "#### Generate Image\n\n\n\n\n\n\n---\n\n\nlicense: creativeml-openrail-m\ntags:\n\n\n* imagepipeline\n* URL\n* text-to-image\n* ultra-realistic\npinned: false\npipeline\\_tag: text-to-image\n\n\n\n\n---", "### Feedback\n\n\nIf you have any feedback, please reach out to us at hello@URL", "#### Visit Website\n\n\n![portfolio](URL\n\n\nIf you are the original author of this model, please click here to add credits" ]
[ "TAGS\n#imagepipeline #imagepipeline.io #text-to-image #ultra-realistic #license-creativeml-openrail-m #region-us \n", "### API Reference", "#### Generate Image\n\n\n\n\n\n\n---\n\n\nlicense: creativeml-openrail-m\ntags:\n\n\n* imagepipeline\n* URL\n* text-to-image\n* ultra-realistic\npinned: false\npipeline\\_tag: text-to-image\n\n\n\n\n---", "### Feedback\n\n\nIf you have any feedback, please reach out to us at hello@URL", "#### Visit Website\n\n\n![portfolio](URL\n\n\nIf you are the original author of this model, please click here to add credits" ]
[ 35, 5, 53, 20, 29 ]
[ "TAGS\n#imagepipeline #imagepipeline.io #text-to-image #ultra-realistic #license-creativeml-openrail-m #region-us \n### API Reference#### Generate Image\n\n\n\n\n\n\n---\n\n\nlicense: creativeml-openrail-m\ntags:\n\n\n* imagepipeline\n* URL\n* text-to-image\n* ultra-realistic\npinned: false\npipeline\\_tag: text-to-image\n\n\n\n\n---### Feedback\n\n\nIf you have any feedback, please reach out to us at hello@URL#### Visit Website\n\n\n![portfolio](URL\n\n\nIf you are the original author of this model, please click here to add credits" ]
text-generation
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> A [LLaVA (Large Language and Vision Assistant)](https://github.com/haotian-liu/LLaVA) version of [microsoft/Phi-3-mini-128k-instruct](https://huggingface.co/microsoft/Phi-3-mini-128k-instruct) ## Model Details Trained with this dataset of [liuhaotian/LLaVA-Instruct-150K](https://huggingface.co/datasets/liuhaotian/LLaVA-Instruct-150K). This model has around 4.1B parameters, of which 3.8B come from the base model.
{"base_model": "microsoft/Phi-3-mini-128k-instruct", "inference": false}
praysimanjuntak/llava-phi3-3.8b-lora
null
[ "transformers", "safetensors", "phi3", "text-generation", "conversational", "custom_code", "base_model:microsoft/Phi-3-mini-128k-instruct", "autotrain_compatible", "region:us" ]
null
2024-04-30T15:59:53+00:00
[]
[]
TAGS #transformers #safetensors #phi3 #text-generation #conversational #custom_code #base_model-microsoft/Phi-3-mini-128k-instruct #autotrain_compatible #region-us
# Model Card for Model ID A LLaVA (Large Language and Vision Assistant) version of microsoft/Phi-3-mini-128k-instruct ## Model Details Trained with this dataset of liuhaotian/LLaVA-Instruct-150K. This model has around 4.1B parameters, of which 3.8B come from the base model.
[ "# Model Card for Model ID\n\n\nA LLaVA (Large Language and Vision Assistant) version of microsoft/Phi-3-mini-128k-instruct", "## Model Details\n\nTrained with this dataset of liuhaotian/LLaVA-Instruct-150K. This model has a around 4.1B parameters, where the 3.8B came from the base model." ]
[ "TAGS\n#transformers #safetensors #phi3 #text-generation #conversational #custom_code #base_model-microsoft/Phi-3-mini-128k-instruct #autotrain_compatible #region-us \n", "# Model Card for Model ID\n\n\nA LLaVA (Large Language and Vision Assistant) version of microsoft/Phi-3-mini-128k-instruct", "## Model Details\n\nTrained with this dataset of liuhaotian/LLaVA-Instruct-150K. This model has a around 4.1B parameters, where the 3.8B came from the base model." ]
[ 48, 31, 46 ]
[ "TAGS\n#transformers #safetensors #phi3 #text-generation #conversational #custom_code #base_model-microsoft/Phi-3-mini-128k-instruct #autotrain_compatible #region-us \n# Model Card for Model ID\n\n\nA LLaVA (Large Language and Vision Assistant) version of microsoft/Phi-3-mini-128k-instruct## Model Details\n\nTrained with this dataset of liuhaotian/LLaVA-Instruct-150K. This model has a around 4.1B parameters, where the 3.8B came from the base model." ]
null
transformers
# Uploaded model - **Developed by:** Entreprenerdly - **License:** apache-2.0 - **Finetuned from model :** unsloth/llama-3-8b-bnb-4bit This llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Huggingface's TRL library. [<img src="https://raw.githubusercontent.com/unslothai/unsloth/main/images/unsloth%20made%20with%20love.png" width="200"/>](https://github.com/unslothai/unsloth)
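Since this repository stores LoRA adapters rather than fully merged weights, inference requires attaching them to the 4-bit base model named above. The sketch below is an assumption-laden illustration, not part of the original card: it presumes Unsloth can resolve the adapter repo directly (detecting the adapter config and pulling in the base model), and the sequence length and generation settings are arbitrary choices.

```python
# Minimal sketch (assumptions noted above): load the LoRA adapters with Unsloth and run inference.
from unsloth import FastLanguageModel

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="Entreprenerdly/finetuned_llama3_unsloth_lora_model",  # adapter repo from this card
    max_seq_length=2048,          # assumed context length
    load_in_4bit=True,            # matches the unsloth/llama-3-8b-bnb-4bit base model
)
FastLanguageModel.for_inference(model)  # switch on Unsloth's faster inference path

inputs = tokenizer("Explain LoRA fine-tuning in one sentence.", return_tensors="pt").to("cuda")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Loading the adapters with plain PEFT on top of the base checkpoint should behave equivalently if Unsloth is not available.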
{"language": ["en"], "license": "apache-2.0", "tags": ["text-generation-inference", "transformers", "unsloth", "llama", "trl"], "base_model": "unsloth/llama-3-8b-bnb-4bit"}
Entreprenerdly/finetuned_llama3_unsloth_lora_model
null
[ "transformers", "safetensors", "text-generation-inference", "unsloth", "llama", "trl", "en", "base_model:unsloth/llama-3-8b-bnb-4bit", "license:apache-2.0", "endpoints_compatible", "region:us" ]
null
2024-04-30T16:01:54+00:00
[]
[ "en" ]
TAGS #transformers #safetensors #text-generation-inference #unsloth #llama #trl #en #base_model-unsloth/llama-3-8b-bnb-4bit #license-apache-2.0 #endpoints_compatible #region-us
# Uploaded model - Developed by: Entreprenerdly - License: apache-2.0 - Finetuned from model : unsloth/llama-3-8b-bnb-4bit This llama model was trained 2x faster with Unsloth and Huggingface's TRL library. <img src="URL width="200"/>
[ "# Uploaded model\n\n- Developed by: Entreprenerdly\n- License: apache-2.0\n- Finetuned from model : unsloth/llama-3-8b-bnb-4bit\n\nThis llama model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>" ]
[ "TAGS\n#transformers #safetensors #text-generation-inference #unsloth #llama #trl #en #base_model-unsloth/llama-3-8b-bnb-4bit #license-apache-2.0 #endpoints_compatible #region-us \n", "# Uploaded model\n\n- Developed by: Entreprenerdly\n- License: apache-2.0\n- Finetuned from model : unsloth/llama-3-8b-bnb-4bit\n\nThis llama model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>" ]
[ 64, 82 ]
[ "TAGS\n#transformers #safetensors #text-generation-inference #unsloth #llama #trl #en #base_model-unsloth/llama-3-8b-bnb-4bit #license-apache-2.0 #endpoints_compatible #region-us \n# Uploaded model\n\n- Developed by: Entreprenerdly\n- License: apache-2.0\n- Finetuned from model : unsloth/llama-3-8b-bnb-4bit\n\nThis llama model was trained 2x faster with Unsloth and Huggingface's TRL library.\n\n<img src=\"URL width=\"200\"/>" ]
null
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
Lasion/wav2vec2-base-vivos
null
[ "transformers", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
null
2024-04-30T16:03:52+00:00
[ "1910.09700" ]
[]
TAGS #transformers #arxiv-1910.09700 #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #arxiv-1910.09700 #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ 22, 6, 4, 75, 23, 3, 5, 8, 9, 8, 34, 20, 4, 5, 5, 11, 13, 12, 3, 10, 6, 5, 6, 4, 5, 7, 49, 7, 7, 5, 5, 15, 7, 7, 8, 5 ]
[ "TAGS\n#transformers #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact" ]
text-generation
transformers
# dictalm2.0-yam-peleg-Mistral-instruct-Merge dictalm2.0-yam-peleg-Mistral-instruct-Merge is a merge of the following models using [LazyMergekit](https://colab.research.google.com/drive/1obulZ1ROXHjYLn6PPZJwRR6GzgQogxxb?usp=sharing): * [yam-peleg/Hebrew-Mistral-7B](https://huggingface.co/yam-peleg/Hebrew-Mistral-7B) * [dicta-il/dictalm2.0-instruct](https://huggingface.co/dicta-il/dictalm2.0-instruct) ## 🧩 Configuration ```yaml slices: - sources: - model: yam-peleg/Hebrew-Mistral-7B layer_range: [0, 32] - model: dicta-il/dictalm2.0-instruct layer_range: [0, 32] merge_method: slerp base_model: dicta-il/dictalm2.0-instruct parameters: t: - filter: self_attn value: [0, 0.5, 0.3, 0.7, 1] - filter: mlp value: [1, 0.5, 0.7, 0.3, 0] - value: 0.5 dtype: bfloat16 ``` ## 💻 Usage ```python !pip install -qU transformers accelerate from transformers import AutoTokenizer import transformers import torch model = "itayl/dictalm2.0-yam-peleg-Mistral-instruct-Merge" messages = [{"role": "user", "content": "What is a large language model?"}] tokenizer = AutoTokenizer.from_pretrained(model) prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True) pipeline = transformers.pipeline( "text-generation", model=model, torch_dtype=torch.float16, device_map="auto", ) outputs = pipeline(prompt, max_new_tokens=256, do_sample=True, temperature=0.7, top_k=50, top_p=0.95) print(outputs[0]["generated_text"]) ```
{"tags": ["merge", "mergekit", "lazymergekit", "yam-peleg/Hebrew-Mistral-7B", "dicta-il/dictalm2.0-instruct"], "base_model": ["yam-peleg/Hebrew-Mistral-7B", "dicta-il/dictalm2.0-instruct"]}
itayl/dictalm2.0-1yam-peleg-Mistral-instruct-Merge
null
[ "transformers", "safetensors", "mistral", "text-generation", "merge", "mergekit", "lazymergekit", "yam-peleg/Hebrew-Mistral-7B", "dicta-il/dictalm2.0-instruct", "conversational", "base_model:yam-peleg/Hebrew-Mistral-7B", "base_model:dicta-il/dictalm2.0-instruct", "autotrain_compatible", "endpoints_compatible", "text-generation-inference", "region:us" ]
null
2024-04-30T16:04:18+00:00
[]
[]
TAGS #transformers #safetensors #mistral #text-generation #merge #mergekit #lazymergekit #yam-peleg/Hebrew-Mistral-7B #dicta-il/dictalm2.0-instruct #conversational #base_model-yam-peleg/Hebrew-Mistral-7B #base_model-dicta-il/dictalm2.0-instruct #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us
# dictalm2.0-yam-peleg-Mistral-instruct-Merge dictalm2.0-yam-peleg-Mistral-instruct-Merge is a merge of the following models using LazyMergekit: * yam-peleg/Hebrew-Mistral-7B * dicta-il/dictalm2.0-instruct ## Configuration ## Usage
[ "# dictalm2.0-yam-peleg-Mistral-instruct-Merge\n\ndictalm2.0-yam-peleg-Mistral-instruct-Merge is a merge of the following models using LazyMergekit:\n* yam-peleg/Hebrew-Mistral-7B\n* dicta-il/dictalm2.0-instruct", "## Configuration", "## Usage" ]
[ "TAGS\n#transformers #safetensors #mistral #text-generation #merge #mergekit #lazymergekit #yam-peleg/Hebrew-Mistral-7B #dicta-il/dictalm2.0-instruct #conversational #base_model-yam-peleg/Hebrew-Mistral-7B #base_model-dicta-il/dictalm2.0-instruct #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n", "# dictalm2.0-yam-peleg-Mistral-instruct-Merge\n\ndictalm2.0-yam-peleg-Mistral-instruct-Merge is a merge of the following models using LazyMergekit:\n* yam-peleg/Hebrew-Mistral-7B\n* dicta-il/dictalm2.0-instruct", "## Configuration", "## Usage" ]
[ 113, 83, 3, 3 ]
[ "TAGS\n#transformers #safetensors #mistral #text-generation #merge #mergekit #lazymergekit #yam-peleg/Hebrew-Mistral-7B #dicta-il/dictalm2.0-instruct #conversational #base_model-yam-peleg/Hebrew-Mistral-7B #base_model-dicta-il/dictalm2.0-instruct #autotrain_compatible #endpoints_compatible #text-generation-inference #region-us \n# dictalm2.0-yam-peleg-Mistral-instruct-Merge\n\ndictalm2.0-yam-peleg-Mistral-instruct-Merge is a merge of the following models using LazyMergekit:\n* yam-peleg/Hebrew-Mistral-7B\n* dicta-il/dictalm2.0-instruct## Configuration## Usage" ]
null
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
Trelis/Meta-Llama-3-8B-Instruct-forced-french-adapters
null
[ "transformers", "safetensors", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
null
2024-04-30T16:04:53+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ 26, 6, 4, 75, 23, 3, 5, 8, 9, 8, 34, 20, 4, 5, 5, 11, 13, 12, 3, 10, 6, 5, 6, 4, 5, 7, 49, 7, 7, 5, 5, 15, 7, 7, 8, 5 ]
[ "TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact" ]
null
transformers
## About <!-- ### quantize_version: 2 --> <!-- ### output_tensor_quantised: 1 --> <!-- ### convert_type: --> <!-- ### vocab_type: --> static quants of https://huggingface.co/chlee10/T3Q-Llama3-8B-dpo-v2.0 <!-- provided-files --> weighted/imatrix quants seem not to be available (by me) at this time. If they do not show up a week or so after the static ones, I have probably not planned for them. Feel free to request them by opening a Community Discussion. ## Usage If you are unsure how to use GGUF files, refer to one of [TheBloke's READMEs](https://huggingface.co/TheBloke/KafkaLM-70B-German-V0.1-GGUF) for more details, including on how to concatenate multi-part files. ## Provided Quants (sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants) | Link | Type | Size/GB | Notes | |:-----|:-----|--------:|:------| | [GGUF](https://huggingface.co/mradermacher/T3Q-Llama3-8B-dpo-v2.0-GGUF/resolve/main/T3Q-Llama3-8B-dpo-v2.0.Q2_K.gguf) | Q2_K | 3.3 | | | [GGUF](https://huggingface.co/mradermacher/T3Q-Llama3-8B-dpo-v2.0-GGUF/resolve/main/T3Q-Llama3-8B-dpo-v2.0.IQ3_XS.gguf) | IQ3_XS | 3.6 | | | [GGUF](https://huggingface.co/mradermacher/T3Q-Llama3-8B-dpo-v2.0-GGUF/resolve/main/T3Q-Llama3-8B-dpo-v2.0.Q3_K_S.gguf) | Q3_K_S | 3.8 | | | [GGUF](https://huggingface.co/mradermacher/T3Q-Llama3-8B-dpo-v2.0-GGUF/resolve/main/T3Q-Llama3-8B-dpo-v2.0.IQ3_S.gguf) | IQ3_S | 3.8 | beats Q3_K* | | [GGUF](https://huggingface.co/mradermacher/T3Q-Llama3-8B-dpo-v2.0-GGUF/resolve/main/T3Q-Llama3-8B-dpo-v2.0.IQ3_M.gguf) | IQ3_M | 3.9 | | | [GGUF](https://huggingface.co/mradermacher/T3Q-Llama3-8B-dpo-v2.0-GGUF/resolve/main/T3Q-Llama3-8B-dpo-v2.0.Q3_K_M.gguf) | Q3_K_M | 4.1 | lower quality | | [GGUF](https://huggingface.co/mradermacher/T3Q-Llama3-8B-dpo-v2.0-GGUF/resolve/main/T3Q-Llama3-8B-dpo-v2.0.Q3_K_L.gguf) | Q3_K_L | 4.4 | | | [GGUF](https://huggingface.co/mradermacher/T3Q-Llama3-8B-dpo-v2.0-GGUF/resolve/main/T3Q-Llama3-8B-dpo-v2.0.IQ4_XS.gguf) | IQ4_XS | 4.6 | | | [GGUF](https://huggingface.co/mradermacher/T3Q-Llama3-8B-dpo-v2.0-GGUF/resolve/main/T3Q-Llama3-8B-dpo-v2.0.Q4_K_S.gguf) | Q4_K_S | 4.8 | fast, recommended | | [GGUF](https://huggingface.co/mradermacher/T3Q-Llama3-8B-dpo-v2.0-GGUF/resolve/main/T3Q-Llama3-8B-dpo-v2.0.Q4_K_M.gguf) | Q4_K_M | 5.0 | fast, recommended | | [GGUF](https://huggingface.co/mradermacher/T3Q-Llama3-8B-dpo-v2.0-GGUF/resolve/main/T3Q-Llama3-8B-dpo-v2.0.Q5_K_S.gguf) | Q5_K_S | 5.7 | | | [GGUF](https://huggingface.co/mradermacher/T3Q-Llama3-8B-dpo-v2.0-GGUF/resolve/main/T3Q-Llama3-8B-dpo-v2.0.Q5_K_M.gguf) | Q5_K_M | 5.8 | | | [GGUF](https://huggingface.co/mradermacher/T3Q-Llama3-8B-dpo-v2.0-GGUF/resolve/main/T3Q-Llama3-8B-dpo-v2.0.Q6_K.gguf) | Q6_K | 6.7 | very good quality | | [GGUF](https://huggingface.co/mradermacher/T3Q-Llama3-8B-dpo-v2.0-GGUF/resolve/main/T3Q-Llama3-8B-dpo-v2.0.Q8_0.gguf) | Q8_0 | 8.6 | fast, best quality | | [GGUF](https://huggingface.co/mradermacher/T3Q-Llama3-8B-dpo-v2.0-GGUF/resolve/main/T3Q-Llama3-8B-dpo-v2.0.f16.gguf) | f16 | 16.2 | 16 bpw, overkill | Here is a handy graph by ikawrakow comparing some lower-quality quant types (lower is better): ![image.png](https://www.nethype.de/huggingface_embed/quantpplgraph.png) And here are Artefact2's thoughts on the matter: https://gist.github.com/Artefact2/b5f810600771265fc1e39442288e8ec9 ## FAQ / Model Request See https://huggingface.co/mradermacher/model_requests for some answers to questions you might have and/or if you want some other model quantized. 
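A minimal sketch of how one of the quants listed above might be run locally, assuming the `huggingface_hub` and `llama-cpp-python` packages are installed; the repo id and filename are taken from the quant table, while the context size, prompt, and sampling settings are purely illustrative.

```python
# Sketch: download one of the static quants listed above and run it locally.
# Assumes `pip install huggingface_hub llama-cpp-python`; repo and file names
# come from the quant table, the prompt and parameters are illustrative only.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

model_path = hf_hub_download(
    repo_id="mradermacher/T3Q-Llama3-8B-dpo-v2.0-GGUF",
    filename="T3Q-Llama3-8B-dpo-v2.0.Q4_K_M.gguf",  # the "fast, recommended" entry
)

llm = Llama(model_path=model_path, n_ctx=4096)  # context length is an assumption
out = llm("Explain what a Q4_K_M quant is in one sentence.", max_tokens=64)
print(out["choices"][0]["text"])
```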
## Thanks I thank my company, [nethype GmbH](https://www.nethype.de/), for letting me use its servers and providing upgrades to my workstation to enable this work in my free time. <!-- end -->
{"language": ["en"], "license": "apache-2.0", "library_name": "transformers", "base_model": "chlee10/T3Q-Llama3-8B-dpo-v2.0", "quantized_by": "mradermacher"}
mradermacher/T3Q-Llama3-8B-dpo-v2.0-GGUF
null
[ "transformers", "gguf", "en", "base_model:chlee10/T3Q-Llama3-8B-dpo-v2.0", "license:apache-2.0", "endpoints_compatible", "region:us" ]
null
2024-04-30T16:04:59+00:00
[]
[ "en" ]
TAGS #transformers #gguf #en #base_model-chlee10/T3Q-Llama3-8B-dpo-v2.0 #license-apache-2.0 #endpoints_compatible #region-us
About ----- static quants of URL weighted/imatrix quants seem not to be available (by me) at this time. If they do not show up a week or so after the static ones, I have probably not planned for them. Feel free to request them by opening a Community Discussion. Usage ----- If you are unsure how to use GGUF files, refer to one of TheBloke's READMEs for more details, including on how to concatenate multi-part files. Provided Quants --------------- (sorted by size, not necessarily quality. IQ-quants are often preferable over similar sized non-IQ quants) Here is a handy graph by ikawrakow comparing some lower-quality quant types (lower is better): !URL And here are Artefact2's thoughts on the matter: URL FAQ / Model Request ------------------- See URL for some answers to questions you might have and/or if you want some other model quantized. Thanks ------ I thank my company, nethype GmbH, for letting me use its servers and providing upgrades to my workstation to enable this work in my free time.
[]
[ "TAGS\n#transformers #gguf #en #base_model-chlee10/T3Q-Llama3-8B-dpo-v2.0 #license-apache-2.0 #endpoints_compatible #region-us \n" ]
[ 53 ]
[ "TAGS\n#transformers #gguf #en #base_model-chlee10/T3Q-Llama3-8B-dpo-v2.0 #license-apache-2.0 #endpoints_compatible #region-us \n" ]
image-classification
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # FASHION-vision This model is a fine-tuned version of [google/vit-huge-patch14-224-in21k](https://huggingface.co/google/vit-huge-patch14-224-in21k) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.2821 - Accuracy: 0.9034 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 5e-05 - train_batch_size: 16 - eval_batch_size: 16 - seed: 42 - gradient_accumulation_steps: 4 - total_train_batch_size: 64 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - lr_scheduler_warmup_ratio: 0.1 - num_epochs: 10 ### Training results | Training Loss | Epoch | Step | Validation Loss | Accuracy | |:-------------:|:-----:|:----:|:---------------:|:--------:| | 0.5334 | 1.0 | 750 | 0.5595 | 0.8302 | | 0.3389 | 2.0 | 1500 | 0.4090 | 0.8602 | | 0.358 | 3.0 | 2250 | 0.3631 | 0.8717 | | 0.3672 | 4.0 | 3000 | 0.3368 | 0.8815 | | 0.3458 | 5.0 | 3750 | 0.3231 | 0.8842 | | 0.2721 | 6.0 | 4500 | 0.3075 | 0.8885 | | 0.2397 | 7.0 | 5250 | 0.3035 | 0.8899 | | 0.2779 | 8.0 | 6000 | 0.2893 | 0.8963 | | 0.2046 | 9.0 | 6750 | 0.2868 | 0.8991 | | 0.2599 | 10.0 | 7500 | 0.2821 | 0.9034 | ### Framework versions - Transformers 4.40.1 - Pytorch 2.2.2 - Datasets 2.19.0 - Tokenizers 0.19.1
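As a rough illustration, the hyperparameters reported in the card above map onto `transformers.TrainingArguments` as sketched below; the output directory is an assumption, and the dataset and image-processor setup for this fine-tune are not described in the card, so they are omitted.

```python
# Sketch only: the hyperparameters reported in the FASHION-vision card expressed
# as transformers.TrainingArguments. Output directory is an assumption; dataset
# loading and the ViT image processor are not documented in the card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="fashion-vision",       # assumption, not stated in the card
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    gradient_accumulation_steps=4,     # 16 x 4 = total train batch size of 64
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=10,
)
```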
{"license": "apache-2.0", "tags": ["generated_from_trainer"], "metrics": ["accuracy"], "base_model": "google/vit-huge-patch14-224-in21k", "model-index": [{"name": "FASHION-vision", "results": []}]}
nathanReitinger/FASHION-vision
null
[ "transformers", "tensorboard", "safetensors", "vit", "image-classification", "generated_from_trainer", "base_model:google/vit-huge-patch14-224-in21k", "license:apache-2.0", "autotrain_compatible", "endpoints_compatible", "region:us" ]
null
2024-04-30T16:06:16+00:00
[]
[]
TAGS #transformers #tensorboard #safetensors #vit #image-classification #generated_from_trainer #base_model-google/vit-huge-patch14-224-in21k #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us
FASHION-vision ============== This model is a fine-tuned version of google/vit-huge-patch14-224-in21k on an unknown dataset. It achieves the following results on the evaluation set: * Loss: 0.2821 * Accuracy: 0.9034 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 5e-05 * train\_batch\_size: 16 * eval\_batch\_size: 16 * seed: 42 * gradient\_accumulation\_steps: 4 * total\_train\_batch\_size: 64 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * lr\_scheduler\_warmup\_ratio: 0.1 * num\_epochs: 10 ### Training results ### Framework versions * Transformers 4.40.1 * Pytorch 2.2.2 * Datasets 2.19.0 * Tokenizers 0.19.1
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 64\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 10", "### Training results", "### Framework versions\n\n\n* Transformers 4.40.1\n* Pytorch 2.2.2\n* Datasets 2.19.0\n* Tokenizers 0.19.1" ]
[ "TAGS\n#transformers #tensorboard #safetensors #vit #image-classification #generated_from_trainer #base_model-google/vit-huge-patch14-224-in21k #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 64\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 10", "### Training results", "### Framework versions\n\n\n* Transformers 4.40.1\n* Pytorch 2.2.2\n* Datasets 2.19.0\n* Tokenizers 0.19.1" ]
[ 65, 142, 5, 40 ]
[ "TAGS\n#transformers #tensorboard #safetensors #vit #image-classification #generated_from_trainer #base_model-google/vit-huge-patch14-224-in21k #license-apache-2.0 #autotrain_compatible #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 5e-05\n* train\\_batch\\_size: 16\n* eval\\_batch\\_size: 16\n* seed: 42\n* gradient\\_accumulation\\_steps: 4\n* total\\_train\\_batch\\_size: 64\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* lr\\_scheduler\\_warmup\\_ratio: 0.1\n* num\\_epochs: 10### Training results### Framework versions\n\n\n* Transformers 4.40.1\n* Pytorch 2.2.2\n* Datasets 2.19.0\n* Tokenizers 0.19.1" ]
null
transformers
# Model Card for Model ID <!-- Provide a quick summary of what the model is/does. --> ## Model Details ### Model Description <!-- Provide a longer summary of what this model is. --> This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated. - **Developed by:** [More Information Needed] - **Funded by [optional]:** [More Information Needed] - **Shared by [optional]:** [More Information Needed] - **Model type:** [More Information Needed] - **Language(s) (NLP):** [More Information Needed] - **License:** [More Information Needed] - **Finetuned from model [optional]:** [More Information Needed] ### Model Sources [optional] <!-- Provide the basic links for the model. --> - **Repository:** [More Information Needed] - **Paper [optional]:** [More Information Needed] - **Demo [optional]:** [More Information Needed] ## Uses <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. --> ### Direct Use <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. --> [More Information Needed] ### Downstream Use [optional] <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app --> [More Information Needed] ### Out-of-Scope Use <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. --> [More Information Needed] ## Bias, Risks, and Limitations <!-- This section is meant to convey both technical and sociotechnical limitations. --> [More Information Needed] ### Recommendations <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. --> Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. [More Information Needed] ## Training Details ### Training Data <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. --> [More Information Needed] ### Training Procedure <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. --> #### Preprocessing [optional] [More Information Needed] #### Training Hyperparameters - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision --> #### Speeds, Sizes, Times [optional] <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. --> [More Information Needed] ## Evaluation <!-- This section describes the evaluation protocols and provides the results. --> ### Testing Data, Factors & Metrics #### Testing Data <!-- This should link to a Dataset Card if possible. --> [More Information Needed] #### Factors <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. --> [More Information Needed] #### Metrics <!-- These are the evaluation metrics being used, ideally with a description of why. 
--> [More Information Needed] ### Results [More Information Needed] #### Summary ## Model Examination [optional] <!-- Relevant interpretability work for the model goes here --> [More Information Needed] ## Environmental Impact <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly --> Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700). - **Hardware Type:** [More Information Needed] - **Hours used:** [More Information Needed] - **Cloud Provider:** [More Information Needed] - **Compute Region:** [More Information Needed] - **Carbon Emitted:** [More Information Needed] ## Technical Specifications [optional] ### Model Architecture and Objective [More Information Needed] ### Compute Infrastructure [More Information Needed] #### Hardware [More Information Needed] #### Software [More Information Needed] ## Citation [optional] <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. --> **BibTeX:** [More Information Needed] **APA:** [More Information Needed] ## Glossary [optional] <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. --> [More Information Needed] ## More Information [optional] [More Information Needed] ## Model Card Authors [optional] [More Information Needed] ## Model Card Contact [More Information Needed]
{"library_name": "transformers", "tags": []}
omezzinemariem/codeqwen-text-to-RULE
null
[ "transformers", "safetensors", "arxiv:1910.09700", "endpoints_compatible", "region:us" ]
null
2024-04-30T16:07:09+00:00
[ "1910.09700" ]
[]
TAGS #transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us
# Model Card for Model ID ## Model Details ### Model Description This is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated. - Developed by: - Funded by [optional]: - Shared by [optional]: - Model type: - Language(s) (NLP): - License: - Finetuned from model [optional]: ### Model Sources [optional] - Repository: - Paper [optional]: - Demo [optional]: ## Uses ### Direct Use ### Downstream Use [optional] ### Out-of-Scope Use ## Bias, Risks, and Limitations ### Recommendations Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations. ## How to Get Started with the Model Use the code below to get started with the model. ## Training Details ### Training Data ### Training Procedure #### Preprocessing [optional] #### Training Hyperparameters - Training regime: #### Speeds, Sizes, Times [optional] ## Evaluation ### Testing Data, Factors & Metrics #### Testing Data #### Factors #### Metrics ### Results #### Summary ## Model Examination [optional] ## Environmental Impact Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). - Hardware Type: - Hours used: - Cloud Provider: - Compute Region: - Carbon Emitted: ## Technical Specifications [optional] ### Model Architecture and Objective ### Compute Infrastructure #### Hardware #### Software [optional] BibTeX: APA: ## Glossary [optional] ## More Information [optional] ## Model Card Authors [optional] ## Model Card Contact
[ "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ "TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n", "# Model Card for Model ID", "## Model Details", "### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:", "### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:", "## Uses", "### Direct Use", "### Downstream Use [optional]", "### Out-of-Scope Use", "## Bias, Risks, and Limitations", "### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.", "## How to Get Started with the Model\n\nUse the code below to get started with the model.", "## Training Details", "### Training Data", "### Training Procedure", "#### Preprocessing [optional]", "#### Training Hyperparameters\n\n- Training regime:", "#### Speeds, Sizes, Times [optional]", "## Evaluation", "### Testing Data, Factors & Metrics", "#### Testing Data", "#### Factors", "#### Metrics", "### Results", "#### Summary", "## Model Examination [optional]", "## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:", "## Technical Specifications [optional]", "### Model Architecture and Objective", "### Compute Infrastructure", "#### Hardware", "#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:", "## Glossary [optional]", "## More Information [optional]", "## Model Card Authors [optional]", "## Model Card Contact" ]
[ 26, 6, 4, 75, 23, 3, 5, 8, 9, 8, 34, 20, 4, 5, 5, 11, 13, 12, 3, 10, 6, 5, 6, 4, 5, 7, 49, 7, 7, 5, 5, 15, 7, 7, 8, 5 ]
[ "TAGS\n#transformers #safetensors #arxiv-1910.09700 #endpoints_compatible #region-us \n# Model Card for Model ID## Model Details### Model Description\n\n\n\nThis is the model card of a transformers model that has been pushed on the Hub. This model card has been automatically generated.\n\n- Developed by: \n- Funded by [optional]: \n- Shared by [optional]: \n- Model type: \n- Language(s) (NLP): \n- License: \n- Finetuned from model [optional]:### Model Sources [optional]\n\n\n\n- Repository: \n- Paper [optional]: \n- Demo [optional]:## Uses### Direct Use### Downstream Use [optional]### Out-of-Scope Use## Bias, Risks, and Limitations### Recommendations\n\n\n\nUsers (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.## How to Get Started with the Model\n\nUse the code below to get started with the model.## Training Details### Training Data### Training Procedure#### Preprocessing [optional]#### Training Hyperparameters\n\n- Training regime:#### Speeds, Sizes, Times [optional]## Evaluation### Testing Data, Factors & Metrics#### Testing Data#### Factors#### Metrics### Results#### Summary## Model Examination [optional]## Environmental Impact\n\n\n\nCarbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).\n\n- Hardware Type: \n- Hours used: \n- Cloud Provider: \n- Compute Region: \n- Carbon Emitted:## Technical Specifications [optional]### Model Architecture and Objective### Compute Infrastructure#### Hardware#### Software\n\n\n\n[optional]\n\n\n\nBibTeX:\n\n\n\nAPA:## Glossary [optional]## More Information [optional]## Model Card Authors [optional]## Model Card Contact" ]
question-answering
transformers
<!-- This model card has been generated automatically according to the information the Trainer had access to. You should probably proofread and complete it, then remove this comment. --> # roberta_en_v2 This model is a fine-tuned version of [FacebookAI/roberta-base](https://huggingface.co/FacebookAI/roberta-base) on an unknown dataset. It achieves the following results on the evaluation set: - Loss: 0.9374 ## Model description More information needed ## Intended uses & limitations More information needed ## Training and evaluation data More information needed ## Training procedure ### Training hyperparameters The following hyperparameters were used during training: - learning_rate: 2e-05 - train_batch_size: 24 - eval_batch_size: 24 - seed: 42 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 - lr_scheduler_type: linear - num_epochs: 3 ### Training results | Training Loss | Epoch | Step | Validation Loss | |:-------------:|:-----:|:----:|:---------------:| | 1.137 | 1.0 | 609 | 0.9105 | | 0.8061 | 2.0 | 1218 | 0.8708 | | 0.6349 | 3.0 | 1827 | 0.9374 | ### Framework versions - Transformers 4.41.0.dev0 - Pytorch 2.2.1+cu121 - Datasets 2.19.0 - Tokenizers 0.19.1
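A minimal inference sketch for this checkpoint, assuming the repository id recorded below (`enriquesaou/roberta_en_v2`) and the standard extractive question-answering pipeline; the question/context pair is purely illustrative.

```python
# Sketch: extractive question answering with this fine-tuned RoBERTa checkpoint.
# The repo id comes from this record; the question and context are illustrative.
from transformers import pipeline

qa = pipeline("question-answering", model="enriquesaou/roberta_en_v2")
result = qa(
    question="What base model was fine-tuned?",
    context="roberta_en_v2 is a fine-tuned version of FacebookAI/roberta-base.",
)
print(result["answer"], result["score"])
```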
{"license": "mit", "tags": ["generated_from_trainer"], "base_model": "FacebookAI/roberta-base", "model-index": [{"name": "roberta_en_v2", "results": []}]}
enriquesaou/roberta_en_v2
null
[ "transformers", "tensorboard", "safetensors", "roberta", "question-answering", "generated_from_trainer", "base_model:FacebookAI/roberta-base", "license:mit", "endpoints_compatible", "region:us" ]
null
2024-04-30T16:08:25+00:00
[]
[]
TAGS #transformers #tensorboard #safetensors #roberta #question-answering #generated_from_trainer #base_model-FacebookAI/roberta-base #license-mit #endpoints_compatible #region-us
roberta\_en\_v2 =============== This model is a fine-tuned version of FacebookAI/roberta-base on an unknown dataset. It achieves the following results on the evaluation set: * Loss: 0.9374 Model description ----------------- More information needed Intended uses & limitations --------------------------- More information needed Training and evaluation data ---------------------------- More information needed Training procedure ------------------ ### Training hyperparameters The following hyperparameters were used during training: * learning\_rate: 2e-05 * train\_batch\_size: 24 * eval\_batch\_size: 24 * seed: 42 * optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08 * lr\_scheduler\_type: linear * num\_epochs: 3 ### Training results ### Framework versions * Transformers 4.41.0.dev0 * Pytorch 2.2.1+cu121 * Datasets 2.19.0 * Tokenizers 0.19.1
[ "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 24\n* eval\\_batch\\_size: 24\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3", "### Training results", "### Framework versions\n\n\n* Transformers 4.41.0.dev0\n* Pytorch 2.2.1+cu121\n* Datasets 2.19.0\n* Tokenizers 0.19.1" ]
[ "TAGS\n#transformers #tensorboard #safetensors #roberta #question-answering #generated_from_trainer #base_model-FacebookAI/roberta-base #license-mit #endpoints_compatible #region-us \n", "### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 24\n* eval\\_batch\\_size: 24\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3", "### Training results", "### Framework versions\n\n\n* Transformers 4.41.0.dev0\n* Pytorch 2.2.1+cu121\n* Datasets 2.19.0\n* Tokenizers 0.19.1" ]
[ 46, 101, 5, 47 ]
[ "TAGS\n#transformers #tensorboard #safetensors #roberta #question-answering #generated_from_trainer #base_model-FacebookAI/roberta-base #license-mit #endpoints_compatible #region-us \n### Training hyperparameters\n\n\nThe following hyperparameters were used during training:\n\n\n* learning\\_rate: 2e-05\n* train\\_batch\\_size: 24\n* eval\\_batch\\_size: 24\n* seed: 42\n* optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08\n* lr\\_scheduler\\_type: linear\n* num\\_epochs: 3### Training results### Framework versions\n\n\n* Transformers 4.41.0.dev0\n* Pytorch 2.2.1+cu121\n* Datasets 2.19.0\n* Tokenizers 0.19.1" ]