---
library_name: transformers
base_model:
- meta-llama/Llama-3.1-8B
---

# Epos-8B

Epos-8B is a fine-tuned version of the base model **Llama-3.1-8B** from Meta, optimized for storytelling, dialogue generation, and creative writing. The model specializes in generating rich narratives, immersive prose, and dynamic character interactions, making it ideal for creative tasks.

![image/png](https://cdn-uploads.huggingface.co/production/uploads/65dbd5a60e6ad24551b3959f/P01YmhjrdTfpJBpyWfyy9.png)

---
## Model Details

### Model Description

Epos-8B is an 8-billion-parameter language model fine-tuned for storytelling and narrative tasks. Inspired by the grandeur of epic tales, it is designed to produce high-quality, engaging content that evokes the depth and imagination of ancient myths and modern storytelling traditions.

- **Developed by:** P0x0
- **Funded by:** P0x0
- **Shared by:** P0x0
- **Model type:** Transformer-based Language Model
- **Language(s) (NLP):** Primarily English
- **License:** Apache 2.0
- **Finetuned from model:** [meta-llama/Llama-3.1-8B](https://huggingface.co/meta-llama/Llama-3.1-8B)
### Model Sources

- **Repository:** [Epos-8B on Hugging Face](https://huggingface.co/P0x0/Epos-8B)

---

## Uses

### Direct Use

Epos-8B is ideal for:

- **Storytelling:** Generate detailed, immersive, and engaging narratives.
- **Dialogue Creation:** Create realistic and dynamic character interactions for stories or games.
## How to Get Started with the Model

To run a quantized version of the model, you can use [KoboldCPP](https://github.com/LostRuins/koboldcpp), which runs GGUF models locally.

### Steps:

1. Download [KoboldCPP](https://github.com/LostRuins/koboldcpp).
2. Follow the setup instructions provided in the repository.
3. Download the GGUF variant of Epos-8B from [Epos-8B-GGUF](https://huggingface.co/P0x0/Epos-8B-GGUF).
4. Load the model in KoboldCPP and start generating!
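### Running with Transformers

Alternatively, the full-precision weights can be loaded with the `transformers` library. The snippet below is a minimal sketch rather than an official usage example: it assumes a CUDA-capable GPU with enough memory for the 8B weights in bfloat16, the `accelerate` package for `device_map="auto"`, and an illustrative prompt of your own choosing.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "P0x0/Epos-8B"

# Load the tokenizer and model; bfloat16 keeps the 8B weights at roughly 16 GB.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",  # place layers on the available GPU(s); requires accelerate
)

# Give the model the opening of a story and let it continue (illustrative prompt).
prompt = "The old lighthouse keeper had not spoken to another soul in seven years, until"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

outputs = model.generate(
    **inputs,
    max_new_tokens=300,
    do_sample=True,   # sampling generally suits creative writing better than greedy decoding
    temperature=0.8,
    top_p=0.9,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because the model is fine-tuned from the Llama-3.1-8B base model rather than an instruct variant, plain text-completion prompts like the one above are a reasonable default.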