---
license: llama2
model_type: llama
tags:
  - facebook
  - meta
  - pytorch
  - llama
  - llama-2
---

# GOAT-70B-STORYTELLING model

![GOAT-70B-STORYTELLING](img_here)

The GOAT-70B-STORYTELLING model is a supervised fine-tuned (SFT) version of LLaMA 2 70B, developed by the GOAT.AI lab for story writing.


# GOAT-STORYTELLING-AGENT
The GOAT-70B-STORYTELLING model has been developed as an integral component of the GOAT-STORYTELLING-AGENT framework. This framework is designed to facilitate the generation of high-quality, cohesive, and captivating narratives, including stories and books, by utilizing inputs such as plot outlines, character profiles, their interrelationships, and other relevant details. An example is provided below.

# Model description
 - **Base Architecture:** LLaMA 2 70B 
 - **License:** llama2
 - **Context window length:** 4096 tokens

### Learn more

- **Blog:** TBA
- **Framework:** github.com/GOAT-STORYTELLING-AGENT (TBA)

## Uses

The main purpose of GOAT-70B-STORYTELLING is to generate books, novels, movie scripts, and similar long-form content as an agent working together with our GOAT-STORYTELLING-AGENT framework. It is specifically designed for story writers.

## Usage


The model can either be self-hosted via `transformers` or used with Spaces.

```python
import torch

from transformers import AutoTokenizer, AutoModelForCausalLM

model_name = "GOAT-AI/GOAT-70B-STORYTELLING"

# Load the tokenizer and the model weights in bfloat16
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.bfloat16
)
```
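
Once the model is loaded, text can be generated with the standard `transformers` generation API. The sketch below is illustrative only: the prompt and the sampling parameters are assumptions and do not reflect the prompt format used by the GOAT-STORYTELLING-AGENT framework.

```python
# Illustrative sketch; the prompt and sampling settings are assumptions,
# not the framework's prescribed format.
prompt = "Write the opening scene of a mystery novel set in a quiet coastal village."

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(
    **inputs,
    max_new_tokens=512,
    do_sample=True,
    temperature=0.8,
    top_p=0.95,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```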

Alternatively, use it via the GOAT-STORYTELLING-AGENT framework:

```
GOAT-STORYTELLING-AGENT code
```

## Training dataset

The training dataset was collected via the GOAT-STORYTELLING-AGENT framework and GPT-4, combined with open-source instruction data. We will not release the dataset.

## License

The GOAT-70B-STORYTELLING model is based on [Meta's LLaMA-2-70b-hf](https://huggingface.co/meta-llama/Llama-2-70b-hf) and fine-tuned on our own datasets.

GOAT-70B-STORYTELLING model weights are available under the LLAMA-2 license. Note that access to the GOAT-70B-STORYTELLING model weights requires access to the LLaMA-2 model weights. The GOAT-70B-STORYTELLING model is based on LLaMA-2 and should be used according to the LLaMA-2 license.

## Risks and Biases

The GOAT-70B-STORYTELLING model can produce factually incorrect output and should not be relied on to deliver factually accurate information. It may also generate wrong, biased, or otherwise offensive outputs.