---
license: apache-2.0
language:
  - en
pipeline_tag: text-generation
tags:
  - pytorch
  - mixtral
  - fine-tuned
  - moe
---

# Mixtral 8x7B - Holodeck

## Model Description

Mixtral 8x7B-Holodeck is a fine-tune of Mistral AI's Mixtral 8x7B model.

## Training data

The training data contains around 3,000 ebooks across various genres. Most of the dataset has been prepended with the following text: `[Genre: <genre1>, <genre2>]`
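
Because the genre tags were part of the training text, prepending one at inference time should steer generation toward those genres. Below is a minimal usage sketch using the standard `transformers` text-generation API; the repository id, prompt, and sampling settings are illustrative assumptions, not part of this card.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mrseeker87/Mixtral-8x7B-Holodeck"  # assumed repo id; substitute the actual one

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Prepend a genre tag, mirroring the formatting used in the training data.
prompt = "[Genre: science fiction, adventure]\nThe airlock hissed open, and"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

outputs = model.generate(**inputs, max_new_tokens=100, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```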


## Limitations and Biases

Based on known problems with NLP technology, potential relevant factors include biases related to gender, profession, race, and religion.