---
license: apache-2.0
datasets:
- tiiuae/falcon-refinedweb
- OpenAssistant/oasst1
- timdettmers/openassistant-guanaco
---
# Mastermax 7B
**Mastermax-7B is a 7B-parameter causal decoder-only model based on [TII's](https://www.tii.ae) [Falcon-7B](https://huggingface.co/tiiuae/falcon-7b), which was trained on 1,500B tokens of [RefinedWeb](https://huggingface.co/datasets/tiiuae/falcon-refinedweb) enhanced with curated corpora. It is made available under the Apache 2.0 license.**
## Fine-tuning datasets
This model was fine-tuned on the following additional datasets:
* [OpenAssistant/oasst1](https://huggingface.co/datasets/OpenAssistant/oasst1) (50%)
* [timdettmers/openassistant-guanaco](https://huggingface.co/datasets/timdettmers/openassistant-guanaco) (50%)
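As a rough illustration of the 50/50 split above, two instruction datasets can be combined by interleaving them in equal proportion. The sketch below is a hypothetical plain-Python helper (`mix_datasets` is not part of the actual training code), shown only to make the mixing ratio concrete:

```python
def mix_datasets(ds_a, ds_b):
    """Interleave two datasets one-for-one, giving each a 50% share
    of the resulting mixture (truncates to the shorter dataset)."""
    mixed = []
    for a, b in zip(ds_a, ds_b):
        mixed.append(a)
        mixed.append(b)
    return mixed

# Toy stand-ins for oasst1 and guanaco examples
oasst = ["oasst_0", "oasst_1"]
guanaco = ["guanaco_0", "guanaco_1"]
print(mix_datasets(oasst, guanaco))  # alternates between the two sources
```

In a real fine-tuning run the examples would of course be shuffled after mixing; the equal interleave is just the simplest way to express a 50/50 ratio.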
### How to use the model
```python
from transformers import AutoTokenizer
import transformers
import torch

model = "lifeofcoding/mastermax-7b"

tokenizer = AutoTokenizer.from_pretrained(model)
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
    device_map="auto",
)

sequences = pipeline(
    "Girafatron is obsessed with giraffes, the most glorious animal on the face of this Earth. Giraftron believes all other animals are irrelevant when compared to the glorious majesty of the giraffe.\nDaniel: Hello, Girafatron!\nGirafatron:",
    max_length=200,
    do_sample=True,
    top_k=10,
    num_return_sequences=1,
    eos_token_id=tokenizer.eos_token_id,
)
for seq in sequences:
    print(f"Result: {seq['generated_text']}")
```
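Since the Guanaco subset uses a simple turn-based layout (`### Human: … ### Assistant: …`), prompts in that style may align well with the fine-tuning data. The helper below is a hypothetical sketch (not part of the model's own code) for assembling such prompts:

```python
def build_prompt(turns):
    """Format a list of (role, text) turns in the Guanaco-style
    '### Human:' / '### Assistant:' layout, ending with an open
    assistant turn for the model to complete."""
    parts = [f"### {role}: {text}" for role, text in turns]
    parts.append("### Assistant:")
    return "\n".join(parts)

prompt = build_prompt([("Human", "What is the tallest land animal?")])
print(prompt)
# ### Human: What is the tallest land animal?
# ### Assistant:
```

The resulting string can be passed directly as the prompt to the `pipeline` call shown above.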
- **Developed by:** [LifeOfCoding](http://github.com/lifeofcoding)
- **Model type:** Causal decoder-only;
- **Language(s) (NLP):** English and French;
- **License:** Apache 2.0;
- **Finetuned from model:** [Falcon-7B](https://huggingface.co/tiiuae/falcon-7b).