TroyDoesAI committed
Commit e96cb37 · verified · 1 Parent(s): 2b253fd

Toned down BlackSheep influence so it doesn't cuss out the user or call the user names when asked bad things

README.md CHANGED
@@ -1,92 +1,48 @@
- ---
- base_model:
- - TroyDoesAI/BlackSheep-Writer
- - TroyDoesAI/BlackSheep-RP
- library_name: transformers
- tags:
- - mergekit
- - merge
- ---
-
- **DirectionLess: Your Unchained Creative Muse**
-
- ---
-
- **Abilities:**
-
- - **Uncensored Knowledge:** Delivers unrestricted and expansive information, exploring the farthest reaches of AI potential.
- - **Forbidden Insight:** Provides detailed and uncensored answers, venturing into areas of restricted and hazardous knowledge.
- - **Role Play Generation:** Crafts captivating role-play scenarios to enhance interactive storytelling and immersive experiences.
- - **Writer’s Block Solution:** Sparks creativity with fresh ideas, character names, and unique plot twists.
-
- ---
-
- **Description:**
-
- Welcome to **DirectionLess**, an innovative and boundary-pushing AI designed to be your ultimate creative companion. Born from the fusion of the TroyDoesAI/BlackSheep-RP and TroyDoesAI/BlackSheep-Writer models, DirectionLess is tailored to inspire and assist writers, role-players, and creators of all kinds.
-
- ---
-
- **Prompt Template:**
-
- ```
- ### Instruction:
- %instruction%
-
- ### Input:
- %input%
-
- ### Response:
- %output%
- ```
-
- ---
-
- **Usage:**
-
- DirectionLess is ideal for research environments seeking to understand the implications and risks of unaligned AI behavior, as well as for creative writing applications that require a bold and uninhibited muse. Whether you’re crafting the next great novel, devising an intricate role-playing game, or simply exploring the depths of unrestricted knowledge, DirectionLess is here to guide and inspire you.
-
- ---
-
- **Unlock the Full Potential of Your Creativity with DirectionLess!**
-
-
- # BlackSheep-Writer
-
- This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
-
- ## Merge Details
- ### Merge Method
-
- This model was merged using the SLERP merge method.
-
- ### Models Merged
-
- The following models were included in the merge:
- * [TroyDoesAI/BlackSheep-Writer](https://huggingface.co/TroyDoesAI/BlackSheep-Writer)
- * [TroyDoesAI/BlackSheep-RP](https://huggingface.co/TroyDoesAI/BlackSheep-RP)
-
- ### Configuration
-
- The following YAML configuration was used to produce this model:
-
- ```yaml
- slices:
- - sources:
-   - model: TroyDoesAI/BlackSheep-RP
-     layer_range: [0, 32]
-   - model: TroyDoesAI/BlackSheep-Writer
-     layer_range: [0, 32]
- merge_method: slerp
- base_model: TroyDoesAI/BlackSheep-Writer
- parameters:
-   t:
-   - filter: self_attn
-     value: [0, 0.5, 0.3, 0.7, 1]
-   - filter: mlp
-     value: [1, 0.5, 0.7, 0.3, 0]
-   - value: 0.5
- dtype: bfloat16
-
- ```
+ ---
+ base_model:
+ - TroyDoesAI/BlackSheep-RP
+ - TroyDoesAI/DirectionLess
+ library_name: transformers
+ tags:
+ - mergekit
+ - merge
+
+ ---
+ # DirectionLess
+
+ This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
+
+ ## Merge Details
+ ### Merge Method
+
+ This model was merged using the SLERP merge method.
+
+ ### Models Merged
+
+ The following models were included in the merge:
+ * [TroyDoesAI/BlackSheep-RP](https://huggingface.co/TroyDoesAI/BlackSheep-RP)
+ * [TroyDoesAI/DirectionLess](https://huggingface.co/TroyDoesAI/DirectionLess)
+
+ ### Configuration
+
+ The following YAML configuration was used to produce this model:
+
+ ```yaml
+ slices:
+ - sources:
+   - model: TroyDoesAI/DirectionLess
+     layer_range: [0, 32]
+   - model: TroyDoesAI/BlackSheep-RP
+     layer_range: [0, 32]
+ merge_method: slerp
+ base_model: TroyDoesAI/BlackSheep-RP
+ parameters:
+   t:
+   - filter: self_attn
+     value: [0, 0.5, 0.3, 0.7, 1]
+   - filter: mlp
+     value: [1, 0.5, 0.7, 0.3, 0]
+   - value: 0.5
+ dtype: bfloat16
+
+ ```
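The `slerp` merge method in the YAML above interpolates each pair of weight tensors along a great circle rather than a straight line, which preserves tensor norm better than a plain average. A minimal sketch of the idea (illustrative only — mergekit's actual implementation also handles dtype casting, per-layer `t` schedules like the `self_attn`/`mlp` filters above, and non-overlapping tensors):

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    Falls back to linear interpolation when the tensors are nearly
    parallel (the angle between them is ~0, so sin(omega) ~ 0).
    """
    v0f = v0.ravel().astype(np.float64)
    v1f = v1.ravel().astype(np.float64)
    cos_omega = np.dot(v0f, v1f) / (np.linalg.norm(v0f) * np.linalg.norm(v1f))
    cos_omega = np.clip(cos_omega, -1.0, 1.0)
    omega = np.arccos(cos_omega)          # angle between the two tensors
    if np.abs(np.sin(omega)) < eps:
        return (1.0 - t) * v0 + t * v1    # nearly parallel: plain lerp
    s0 = np.sin((1.0 - t) * omega) / np.sin(omega)
    s1 = np.sin(t * omega) / np.sin(omega)
    return s0 * v0 + s1 * v1

# t=0 returns the first tensor, t=1 the second,
# t=0.5 a balanced blend (the default `value: 0.5` in the config).
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
print(slerp(0.5, a, b))  # → [0.70710678 0.70710678]
```

Note the result stays on the unit circle, whereas a straight average of `a` and `b` would have norm ~0.707.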
config.json CHANGED
@@ -5,7 +5,7 @@
   ],
   "attention_dropout": 0.0,
   "bos_token_id": 1,
-  "eos_token_id": 32000,
+  "eos_token_id": 2,
   "hidden_act": "silu",
   "hidden_size": 4096,
   "initializer_range": 0.02,
@@ -22,5 +22,5 @@
   "torch_dtype": "bfloat16",
   "transformers_version": "4.40.2",
   "use_cache": true,
-  "vocab_size": 32002
+  "vocab_size": 32000
 }
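This config change retargets generation from the ChatML end token (`<|im_end|>`, id 32000, which required the two extra vocab rows 32000/32001) back to the base Mistral/Llama `</s>` (id 2), shrinking the declared vocab to 32000. A quick sanity check one might run on such a change (`token_ids_in_range` is a hypothetical helper, not part of transformers):

```python
# Values from the updated config.json in this commit.
new_cfg = {"bos_token_id": 1, "eos_token_id": 2, "vocab_size": 32000}

def token_ids_in_range(cfg):
    """True if every special-token id indexes a valid embedding row,
    i.e. 0 <= id < vocab_size."""
    return all(0 <= cfg[k] < cfg["vocab_size"]
               for k in ("bos_token_id", "eos_token_id"))

print(token_ids_in_range(new_cfg))  # → True

# Pairing the old ChatML eos id with the new 32000-entry vocab
# would index past the last embedding row:
print(token_ids_in_range({**new_cfg, "eos_token_id": 32000}))  # → False
```

The old pairing (eos 32000 with vocab 32002) was internally consistent too; the point of the check is catching a mismatch when only one of the two values is edited.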
model-00001-of-00003.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:dd9589c8604490f17c05680253887bf8cd12648bfa7e87149f6c72cf45f0e46e
+oid sha256:1ad5e211312f48e3b1ff8d93e1a663bc62e18a374cf3a994fbceadb0f1cffb9c
 size 4886547008
model-00002-of-00003.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:3b0e7984cd1ea9a8cc26af3a23cfc75c28e3e66f823d0067012302050a1a53d6
+oid sha256:8c6ca0a207fc883aa1d9ab5a4209b0c68777ac8f758684771470f2670477241d
 size 4915916176
model-00003-of-00003.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:a97b1644eccc77516ea585f53a010a7d328f221c71ae39385c4b5f4be2e0b7df
+oid sha256:76e136225e27cf7518c97059d6e79bd0a09a84fabc1480032cf2da1ad95a51e2
 size 4681034848
special_tokens_map.json CHANGED
@@ -1,13 +1,13 @@
 {
   "bos_token": {
     "content": "<s>",
-    "lstrip": true,
+    "lstrip": false,
     "normalized": false,
-    "rstrip": true,
+    "rstrip": false,
     "single_word": false
   },
   "eos_token": {
-    "content": "<|im_end|>",
+    "content": "</s>",
     "lstrip": false,
     "normalized": false,
     "rstrip": false,
@@ -15,9 +15,9 @@
   },
   "unk_token": {
     "content": "<unk>",
-    "lstrip": true,
+    "lstrip": false,
     "normalized": false,
-    "rstrip": true,
+    "rstrip": false,
     "single_word": false
   }
 }
tokenizer.json CHANGED
@@ -7,8 +7,8 @@
       "id": 0,
       "content": "<unk>",
       "single_word": false,
-      "lstrip": true,
-      "rstrip": true,
+      "lstrip": false,
+      "rstrip": false,
       "normalized": false,
       "special": true
     },
@@ -16,37 +16,19 @@
       "id": 1,
       "content": "<s>",
       "single_word": false,
-      "lstrip": true,
-      "rstrip": true,
-      "normalized": false,
-      "special": true
-    },
-    {
-      "id": 2,
-      "content": "</s>",
-      "single_word": false,
       "lstrip": false,
       "rstrip": false,
       "normalized": false,
       "special": true
     },
     {
-      "id": 32000,
-      "content": "<|im_end|>",
+      "id": 2,
+      "content": "</s>",
       "single_word": false,
       "lstrip": false,
       "rstrip": false,
       "normalized": false,
       "special": true
-    },
-    {
-      "id": 32001,
-      "content": "<|im_start|>",
-      "single_word": false,
-      "lstrip": true,
-      "rstrip": true,
-      "normalized": false,
-      "special": true
     }
   ],
   "normalizer": {
tokenizer_config.json CHANGED
@@ -1,62 +1,43 @@
 {
   "add_bos_token": true,
   "add_eos_token": false,
-  "add_prefix_space": true,
   "added_tokens_decoder": {
     "0": {
       "content": "<unk>",
-      "lstrip": true,
+      "lstrip": false,
       "normalized": false,
-      "rstrip": true,
+      "rstrip": false,
       "single_word": false,
       "special": true
     },
     "1": {
       "content": "<s>",
-      "lstrip": true,
-      "normalized": false,
-      "rstrip": true,
-      "single_word": false,
-      "special": true
-    },
-    "2": {
-      "content": "</s>",
       "lstrip": false,
       "normalized": false,
       "rstrip": false,
       "single_word": false,
       "special": true
     },
-    "32000": {
-      "content": "<|im_end|>",
+    "2": {
+      "content": "</s>",
       "lstrip": false,
       "normalized": false,
       "rstrip": false,
       "single_word": false,
       "special": true
-    },
-    "32001": {
-      "content": "<|im_start|>",
-      "lstrip": true,
-      "normalized": false,
-      "rstrip": true,
-      "single_word": false,
-      "special": true
     }
   },
   "additional_special_tokens": [],
   "bos_token": "<s>",
-  "chat_template": "{% if not add_generation_prompt is defined %}{% set add_generation_prompt = false %}{% endif %}{% for message in messages %}{{'<|im_start|>' + message['role'] + '\n' + message['content'] + '<|im_end|>' + '\n'}}{% endfor %}{% if add_generation_prompt %}{{ '<|im_start|>assistant\n' }}{% endif %}",
+  "chat_template": "{%- if messages[0]['role'] == 'system' %}\n    {%- set system_message = messages[0]['content'] %}\n    {%- set loop_messages = messages[1:] %}\n{%- else %}\n    {%- set loop_messages = messages %}\n{%- endif %}\n\n{{- bos_token }}\n{%- for message in loop_messages %}\n    {%- if (message['role'] == 'user') != (loop.index0 % 2 == 0) %}\n        {{- raise_exception('After the optional system message, conversation roles must alternate user/assistant/user/assistant/...') }}\n    {%- endif %}\n    {%- if message['role'] == 'user' %}\n        {%- if loop.first and system_message is defined %}\n            {{- ' [INST] ' + system_message + '\\n\\n' + message['content'] + ' [/INST]' }}\n        {%- else %}\n            {{- ' [INST] ' + message['content'] + ' [/INST]' }}\n        {%- endif %}\n    {%- elif message['role'] == 'assistant' %}\n        {{- ' ' + message['content'] + eos_token}}\n    {%- else %}\n        {{- raise_exception('Only user and assistant roles are supported, with the exception of an initial optional system message!') }}\n    {%- endif %}\n{%- endfor %}\n",
   "clean_up_tokenization_spaces": false,
-  "eos_token": "<|im_end|>",
-  "legacy": true,
+  "eos_token": "</s>",
+  "legacy": false,
   "model_max_length": 1000000000000000019884624838656,
   "pad_token": null,
   "sp_model_kwargs": {},
   "spaces_between_special_tokens": false,
   "tokenizer_class": "LlamaTokenizer",
-  "trust_remote_code": false,
   "unk_token": "<unk>",
-  "use_default_system_prompt": true,
-  "use_fast": true
+  "use_default_system_prompt": false
 }
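The replacement `chat_template` switches prompting from ChatML (`<|im_start|>`/`<|im_end|>` role blocks) to the Mistral `[INST]` format, with an optional system message folded into the first user turn. In practice one renders it via `tokenizer.apply_chat_template`; as a plain-Python sketch of what the new Jinja template emits (`render_mistral_chat` is a hypothetical name, and it skips the role-alternation checks the real template enforces):

```python
def render_mistral_chat(messages, bos="<s>", eos="</s>"):
    """Sketch of the [INST] chat template added in this commit."""
    system = None
    if messages and messages[0]["role"] == "system":
        # A leading system message is merged into the first user turn.
        system = messages[0]["content"]
        messages = messages[1:]
    out = bos
    for i, msg in enumerate(messages):
        if msg["role"] == "user":
            if i == 0 and system is not None:
                out += f" [INST] {system}\n\n{msg['content']} [/INST]"
            else:
                out += f" [INST] {msg['content']} [/INST]"
        elif msg["role"] == "assistant":
            out += f" {msg['content']}{eos}"
        else:
            raise ValueError("only an initial system message plus "
                             "user/assistant turns are supported")
    return out

print(render_mistral_chat([
    {"role": "user", "content": "Hi"},
    {"role": "assistant", "content": "Hello!"},
]))
# → <s> [INST] Hi [/INST] Hello!</s>
```

Note how each assistant turn now ends in `</s>` (id 2), matching the `eos_token_id` change in config.json above.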