---
datasets:
- lemonilia/LimaRP
- grimulkan/theory-of-mind
- Epiculous/Gnosis
tags:
- not-for-all-audiences
license: agpl-3.0
---

# Fett-uccine

This model was created by training the Mistral base model on LimaRP (ShareGPT format, provided by SAO), theory-of-mind, and Gnosis (provided by jeiku).

The 8-bit LoRA was then merged into Mistral Instruct, resulting in the model you see here.
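For readers unfamiliar with that step, here is a minimal sketch of merging a LoRA adapter into a base model with Hugging Face `transformers` and `peft`. The model ID and adapter path are placeholders, not the exact artifacts used for this release.

```python
# Sketch: fold a trained LoRA adapter into an instruct model (placeholder IDs).
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-Instruct-v0.2"  # assumed; the card does not pin a version
adapter_path = "path/to/lora-adapter"           # placeholder for the trained LoRA

base = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype="auto")
model = PeftModel.from_pretrained(base, adapter_path)

# merge_and_unload() bakes the LoRA weights into the base weights,
# producing a standalone checkpoint like the one published here.
merged = model.merge_and_unload()
merged.save_pretrained("fett-uccine-merged")
AutoTokenizer.from_pretrained(base_id).save_pretrained("fett-uccine-merged")
```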

Works best with the ChatML instruct format, shown below.
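For reference, a ChatML-formatted prompt looks like this (curly braces mark placeholders):

```
<|im_start|>system
{system prompt}<|im_end|>
<|im_start|>user
{user message}<|im_end|>
<|im_start|>assistant
```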

This model is in honor of the SillyTavern community. Keep being awesome!

Optimal sampler settings, provided by Nitral:
```json
{
    "temp": 5,
    "temperature_last": true,
    "top_p": 1,
    "top_k": 0,
    "top_a": 0,
    "tfs": 1,
    "epsilon_cutoff": 0,
    "eta_cutoff": 0,
    "typical_p": 1,
    "min_p": 0.05,
    "rep_pen": 1,
    "rep_pen_range": 0,
    "no_repeat_ngram_size": 0,
    "penalty_alpha": 0,
    "num_beams": 1,
    "length_penalty": 0,
    "min_length": 0,
    "encoder_rep_pen": 1,
    "freq_pen": 0,
    "presence_pen": 0,
    "do_sample": true,
    "early_stopping": false,
    "dynatemp": false,
    "min_temp": 1,
    "max_temp": 5,
    "dynatemp_exponent": 1,
    "smoothing_factor": 0.3,
    "add_bos_token": true,
    "truncation_length": 2048,
    "ban_eos_token": false,
    "skip_special_tokens": true,
    "streaming": false,
    "mirostat_mode": 0,
    "mirostat_tau": 5,
    "mirostat_eta": 0.1,
    "guidance_scale": 1,
    "negative_prompt": "",
    "grammar_string": "",
    "banned_tokens": "",
    "ignore_eos_token_aphrodite": false,
    "spaces_between_special_tokens_aphrodite": true,
    "sampler_order": [
        6,
        0,
        1,
        3,
        4,
        2,
        5
    ],
    "logit_bias": [],
    "n": 1,
    "rep_pen_size": 0,
    "genamt": 150,
    "max_length": 8192
}
```
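Since GGUF quantizations of this model circulate, here is a hedged sketch of applying the portable subset of these settings with `llama-cpp-python`. The model filename is a placeholder, and SillyTavern/Aphrodite-specific fields such as `smoothing_factor` and the custom `sampler_order` have no direct equivalent here, so only the common knobs are mapped.

```python
# Sketch: load a GGUF quant and sample with the settings above that
# llama-cpp-python exposes directly (filename is a placeholder).
from llama_cpp import Llama

llm = Llama(model_path="fett-uccine.Q8_0.gguf", n_ctx=8192)

# A ChatML-formatted prompt, matching the template shown earlier.
prompt = "<|im_start|>user\nHello!<|im_end|>\n<|im_start|>assistant\n"

out = llm(
    prompt,
    max_tokens=150,      # "genamt": 150
    temperature=5.0,     # tamed by min_p below; without smoothing this is aggressive
    top_p=1.0,           # disabled
    top_k=0,             # <= 0 disables top-k in llama.cpp
    min_p=0.05,
    repeat_penalty=1.0,  # "rep_pen": 1 means no penalty
)
print(out["choices"][0]["text"])
```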