---
license: mit
language:
- en
pipeline_tag: text-generation
tags:
- gemma
---
# HelpingAI-180B-base
## Description
The HelpingAI-180B-base model is a large-scale language model developed to assist with a broad range of natural language processing tasks. Trained on diverse data sources, it is designed to generate text, support language understanding, and serve as a base for downstream tasks.
## Model Information
- **Model size**: 176 billion parameters
- **Training data**: Diverse datasets covering a wide range of topics and domains.
- **Training objective**: Language modeling with an emphasis on understanding and generating human-like text.
- **Tokenizer**: Gemma tokenizer
## Intended Use
The HelpingAI-180B-base model is intended for researchers, developers, and practitioners in the field of natural language processing (NLP). It can be used for a variety of tasks, including but not limited to:
- Text generation
- Language understanding
- Text summarization
- Dialogue generation
This model is released for research purposes.
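
## Usage

Below is a minimal sketch of loading the model for text generation with the 🤗 Transformers library. The repository id, dtype, and generation settings are assumptions for illustration and are not taken from this card; substitute the actual repo path when it is available.

```python
# Illustrative sketch only: "HelpingAI-180B-base" below is an assumed repo id,
# not confirmed by this card. A ~176B-parameter model requires multiple GPUs
# or CPU/disk offloading to load.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "HelpingAI-180B-base"  # replace with the real Hugging Face repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumed precision; adjust to your hardware
    device_map="auto",           # shards the model across available devices
)

prompt = "The key applications of natural language processing include"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```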