---
license: mit
language:
- en
pipeline_tag: text-generation
tags:
- gemma
---

# HelpingAI-180B-base

## Description

HelpingAI-180B-base is a large-scale language model developed to assist with a broad range of natural language processing tasks. Trained on diverse data sources, it is designed to generate text, support language understanding, and serve as a base for downstream tasks.

## Model Information

- **Model size**: 176 billion parameters
- **Training data**: Diverse datasets covering a wide range of topics and domains
- **Training objective**: Language modeling with an emphasis on understanding and generating human-like text
- **Tokenizer**: Gemma tokenizer

## Intended Use

The HelpingAI-180B-base model is intended for researchers, developers, and practitioners in natural language processing (NLP). It can be used for a variety of tasks, including but not limited to:

- Text generation
- Language understanding
- Text summarization
- Dialogue generation

This model is released for research purposes.
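As a sketch of how a base causal language model like this is typically loaded with the Hugging Face `transformers` library: the repository id below is hypothetical (the actual Hub path for this checkpoint may differ), and a ~176B-parameter model requires multiple high-memory GPUs, which `device_map="auto"` (with `accelerate` installed) handles by sharding the weights across available devices.

```python
# Illustrative usage sketch; assumes a Hub repo id of the form
# "HelpingAI/HelpingAI-180B-base" (hypothetical -- check the actual model page).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "HelpingAI/HelpingAI-180B-base"  # hypothetical repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",   # shard across available GPUs (requires accelerate)
    torch_dtype="auto",  # use the dtype stored in the checkpoint
)

# Base models are not instruction-tuned, so prompt them as text continuations.
inputs = tokenizer(
    "The key idea behind language modeling is", return_tensors="pt"
).to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Since this is a base (pre-trained, not instruction-tuned) model, outputs are raw continuations of the prompt; downstream applications such as dialogue generation typically require additional fine-tuning.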