nvidia / Cosmos · Safetensors
jibinv committed
Commit 3bafa33 · verified · 1 Parent(s): 4519f42

Update guardrail


Signed-off-by: Jibin Varghese <[email protected]>

This view is limited to 50 files because the commit contains too many changes.
Files changed (50)
  1. .gitattributes +3 -0
  2. guardrail/.gitattributes +40 -0
  3. guardrail/aegis/.locks/models--meta-llama--LlamaGuard-7b/0d30a7faffd5631f68ca99856c40c252b1a5839a.lock +0 -0
  4. guardrail/aegis/.locks/models--meta-llama--LlamaGuard-7b/1a87b8f7340ada18ca4f047077a9d5b13882acc1.lock +0 -0
  5. guardrail/aegis/.locks/models--meta-llama--LlamaGuard-7b/451134b2ddc2e78555d1e857518c54b4bdc2e87d.lock +0 -0
  6. guardrail/aegis/.locks/models--meta-llama--LlamaGuard-7b/4d92c8b74f78b0e0f4b32921d13a007efcd0e0447290da6d92f787c3295b0ad8.lock +0 -0
  7. guardrail/aegis/.locks/models--meta-llama--LlamaGuard-7b/5f4117005b41815881fe7f26aee4cbec8c55aa32.lock +0 -0
  8. guardrail/aegis/.locks/models--meta-llama--LlamaGuard-7b/9e556afd44213b6bd1be2b850ebbbd98f5481437a8021afaf58ee7fb1818d347.lock +0 -0
  9. guardrail/aegis/.locks/models--meta-llama--LlamaGuard-7b/a19b92a679870c311122d67ae980737cf3e51424b396b3809463c4d9b06c7fcf.lock +0 -0
  10. guardrail/aegis/.locks/models--meta-llama--LlamaGuard-7b/a6e931b92caff4c79c5c56282f1e89569a0ae558.lock +0 -0
  11. guardrail/aegis/.locks/models--meta-llama--LlamaGuard-7b/e75756c38e88b19504b139e45c2bb1e925f3863c.lock +0 -0
  12. guardrail/aegis/.locks/models--meta-llama--LlamaGuard-7b/f9e6f2ab03a3b92bf4bc6cfd6d6dcdaa8b36ab5ecf73dcfd1e8da3b5a95261a8.lock +0 -0
  13. guardrail/aegis/.locks/models--nvidia--Aegis-AI-Content-Safety-LlamaGuard-Defensive-1.0/c4d110b05e852cead25fcc7426bf251eb3d15aa0.lock +0 -0
  14. guardrail/aegis/.locks/models--nvidia--Aegis-AI-Content-Safety-LlamaGuard-Defensive-1.0/d79b29a0b9ab36db8038e39e847b3c81ebd56dd8d796551943ea4b43b2e6c55c.lock +0 -0
  15. guardrail/aegis/models--meta-llama--LlamaGuard-7b/.no_exist/dfcfa3409b9994a4722d44e05f82e81ea73c5106/adapter_config.json +0 -0
  16. guardrail/aegis/models--meta-llama--LlamaGuard-7b/.no_exist/dfcfa3409b9994a4722d44e05f82e81ea73c5106/added_tokens.json +0 -0
  17. guardrail/aegis/models--meta-llama--LlamaGuard-7b/.no_exist/dfcfa3409b9994a4722d44e05f82e81ea73c5106/chat_template.jinja +0 -0
  18. guardrail/aegis/models--meta-llama--LlamaGuard-7b/.no_exist/dfcfa3409b9994a4722d44e05f82e81ea73c5106/model.safetensors +0 -0
  19. guardrail/aegis/models--meta-llama--LlamaGuard-7b/blobs/0d30a7faffd5631f68ca99856c40c252b1a5839a +8 -0
  20. guardrail/aegis/models--meta-llama--LlamaGuard-7b/blobs/1a87b8f7340ada18ca4f047077a9d5b13882acc1 +42 -0
  21. guardrail/aegis/models--meta-llama--LlamaGuard-7b/blobs/451134b2ddc2e78555d1e857518c54b4bdc2e87d +23 -0
  22. guardrail/aegis/models--meta-llama--LlamaGuard-7b/blobs/4d92c8b74f78b0e0f4b32921d13a007efcd0e0447290da6d92f787c3295b0ad8 +3 -0
  23. guardrail/aegis/models--meta-llama--LlamaGuard-7b/blobs/5f4117005b41815881fe7f26aee4cbec8c55aa32 +298 -0
  24. guardrail/aegis/models--meta-llama--LlamaGuard-7b/blobs/9e556afd44213b6bd1be2b850ebbbd98f5481437a8021afaf58ee7fb1818d347 +0 -0
  25. guardrail/aegis/models--meta-llama--LlamaGuard-7b/blobs/a19b92a679870c311122d67ae980737cf3e51424b396b3809463c4d9b06c7fcf +3 -0
  26. guardrail/aegis/models--meta-llama--LlamaGuard-7b/blobs/a6e931b92caff4c79c5c56282f1e89569a0ae558 +0 -0
  27. guardrail/aegis/models--meta-llama--LlamaGuard-7b/blobs/e75756c38e88b19504b139e45c2bb1e925f3863c +26 -0
  28. guardrail/aegis/models--meta-llama--LlamaGuard-7b/blobs/f9e6f2ab03a3b92bf4bc6cfd6d6dcdaa8b36ab5ecf73dcfd1e8da3b5a95261a8 +3 -0
  29. guardrail/aegis/models--meta-llama--LlamaGuard-7b/refs/main +1 -0
  30. guardrail/aegis/models--meta-llama--LlamaGuard-7b/snapshots/dfcfa3409b9994a4722d44e05f82e81ea73c5106/config.json +26 -0
  31. guardrail/aegis/models--meta-llama--LlamaGuard-7b/snapshots/dfcfa3409b9994a4722d44e05f82e81ea73c5106/generation_config.json +8 -0
  32. guardrail/aegis/models--meta-llama--LlamaGuard-7b/snapshots/dfcfa3409b9994a4722d44e05f82e81ea73c5106/model-00001-of-00003.safetensors +3 -0
  33. guardrail/aegis/models--meta-llama--LlamaGuard-7b/snapshots/dfcfa3409b9994a4722d44e05f82e81ea73c5106/model-00002-of-00003.safetensors +3 -0
  34. guardrail/aegis/models--meta-llama--LlamaGuard-7b/snapshots/dfcfa3409b9994a4722d44e05f82e81ea73c5106/model-00003-of-00003.safetensors +3 -0
  35. guardrail/aegis/models--meta-llama--LlamaGuard-7b/snapshots/dfcfa3409b9994a4722d44e05f82e81ea73c5106/model.safetensors.index.json +298 -0
  36. guardrail/aegis/models--meta-llama--LlamaGuard-7b/snapshots/dfcfa3409b9994a4722d44e05f82e81ea73c5106/special_tokens_map.json +23 -0
  37. guardrail/aegis/models--meta-llama--LlamaGuard-7b/snapshots/dfcfa3409b9994a4722d44e05f82e81ea73c5106/tokenizer.json +0 -0
  38. guardrail/aegis/models--meta-llama--LlamaGuard-7b/snapshots/dfcfa3409b9994a4722d44e05f82e81ea73c5106/tokenizer.model +3 -0
  39. guardrail/aegis/models--meta-llama--LlamaGuard-7b/snapshots/dfcfa3409b9994a4722d44e05f82e81ea73c5106/tokenizer_config.json +42 -0
  40. guardrail/aegis/models--nvidia--Aegis-AI-Content-Safety-LlamaGuard-Defensive-1.0/blobs/c4d110b05e852cead25fcc7426bf251eb3d15aa0 +33 -0
  41. guardrail/aegis/models--nvidia--Aegis-AI-Content-Safety-LlamaGuard-Defensive-1.0/blobs/d79b29a0b9ab36db8038e39e847b3c81ebd56dd8d796551943ea4b43b2e6c55c +3 -0
  42. guardrail/aegis/models--nvidia--Aegis-AI-Content-Safety-LlamaGuard-Defensive-1.0/refs/main +1 -0
  43. guardrail/aegis/models--nvidia--Aegis-AI-Content-Safety-LlamaGuard-Defensive-1.0/snapshots/62006ace73a69838083a31831126146048694b25/adapter_config.json +1 -0
  44. guardrail/aegis/models--nvidia--Aegis-AI-Content-Safety-LlamaGuard-Defensive-1.0/snapshots/62006ace73a69838083a31831126146048694b25/adapter_model.safetensors +1 -0
  45. guardrail/aegis/models--nvidia--Aegis-AI-Content-Safety-LlamaGuard-Defensive-1.0/snapshots/f54cb2302ee876705dc0f7df2288f442c034b2f3/adapter_config.json +33 -0
  46. guardrail/aegis/models--nvidia--Aegis-AI-Content-Safety-LlamaGuard-Defensive-1.0/snapshots/f54cb2302ee876705dc0f7df2288f442c034b2f3/adapter_model.safetensors +3 -0
  47. guardrail/blocklist/custom/blocklist +383 -0
  48. guardrail/blocklist/exact_match/blocked +1414 -0
  49. guardrail/blocklist/nltk_data/corpora/wordnet.zip +3 -0
  50. guardrail/blocklist/nltk_data/tokenizers/punkt_tab.zip +3 -0
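The `models--<org>--<name>/blobs|refs|snapshots` directories in the listing above follow the Hugging Face hub cache layout: `refs/<revision>` holds a commit hash, and `snapshots/<hash>/` holds the files for that revision. As a minimal sketch (the helper name and the miniature on-disk cache are illustrative, not a real `huggingface_hub` API), resolving a snapshot path from a ref looks like:

```python
# Sketch of how the hub-style cache in this commit resolves a revision:
# refs/main stores a commit hash; snapshots/<hash>/ holds that revision's files.
import os
import tempfile

def resolve_snapshot(cache_dir: str, repo: str, revision: str = "main") -> str:
    """Return the snapshot directory for a cached repo (illustrative helper)."""
    repo_dir = os.path.join(cache_dir, "models--" + repo.replace("/", "--"))
    with open(os.path.join(repo_dir, "refs", revision)) as f:
        commit = f.read().strip()
    return os.path.join(repo_dir, "snapshots", commit)

# Build a miniature cache mimicking the layout above to demonstrate.
root = tempfile.mkdtemp()
repo_dir = os.path.join(root, "models--meta-llama--LlamaGuard-7b")
commit = "dfcfa3409b9994a4722d44e05f82e81ea73c5106"
os.makedirs(os.path.join(repo_dir, "refs"))
os.makedirs(os.path.join(repo_dir, "snapshots", commit))
with open(os.path.join(repo_dir, "refs", "main"), "w") as f:
    f.write(commit)

snap = resolve_snapshot(root, "meta-llama/LlamaGuard-7b")
print(os.path.basename(snap))  # dfcfa3409b9994a4722d44e05f82e81ea73c5106
```

This is why the commit ships both `refs/main` (one line, the commit hash) and a matching `snapshots/dfcfa…/` tree for each cached model.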
.gitattributes CHANGED
@@ -34,3 +34,6 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
  *.zst filter=lfs diff=lfs merge=lfs -text
  *tfevents* filter=lfs diff=lfs merge=lfs -text
  Cosmos-1.0-Guardrail/aegis/models--nvidia--Aegis-AI-Content-Safety-LlamaGuard-Defensive-1.0/blobs/d79b29a0b9ab36db8038e39e847b3c81ebd56dd8d796551943ea4b43b2e6c55c filter=lfs diff=lfs merge=lfs -text
+ guardrail/aegis/models--meta-llama--LlamaGuard-7b/snapshots/dfcfa3409b9994a4722d44e05f82e81ea73c5106/model-00001-of-00003.safetensors filter=lfs diff=lfs merge=lfs -text
+ guardrail/aegis/models--meta-llama--LlamaGuard-7b/snapshots/dfcfa3409b9994a4722d44e05f82e81ea73c5106/model-00002-of-00003.safetensors filter=lfs diff=lfs merge=lfs -text
+ guardrail/aegis/models--meta-llama--LlamaGuard-7b/snapshots/dfcfa3409b9994a4722d44e05f82e81ea73c5106/model-00003-of-00003.safetensors filter=lfs diff=lfs merge=lfs -text
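Each `.gitattributes` line above has the form `<pattern> filter=lfs diff=lfs merge=lfs -text`, routing matching files through Git LFS. A rough matcher (illustrative only; `fnmatch` only approximates git's pattern rules) shows how such lines are interpreted:

```python
# Rough sketch: parse LFS-tracking patterns out of .gitattributes text and
# test paths against them. fnmatch only approximates git's real matching.
from fnmatch import fnmatch

def lfs_patterns(gitattributes_text: str) -> list:
    patterns = []
    for line in gitattributes_text.splitlines():
        parts = line.split()
        if parts and "filter=lfs" in parts[1:]:
            patterns.append(parts[0])
    return patterns

def is_lfs_tracked(path: str, patterns: list) -> bool:
    # Match the full path or just the basename against each pattern.
    name = path.rsplit("/", 1)[-1]
    return any(fnmatch(path, p) or fnmatch(name, p) for p in patterns)

attrs = ("*.safetensors filter=lfs diff=lfs merge=lfs -text\n"
         "*.zst filter=lfs diff=lfs merge=lfs -text")
pats = lfs_patterns(attrs)
print(is_lfs_tracked("guardrail/model-00001-of-00003.safetensors", pats))  # True
print(is_lfs_tracked("README.md", pats))  # False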
guardrail/.gitattributes ADDED
@@ -0,0 +1,40 @@
+ *.7z filter=lfs diff=lfs merge=lfs -text
+ *.arrow filter=lfs diff=lfs merge=lfs -text
+ *.bin filter=lfs diff=lfs merge=lfs -text
+ *.bz2 filter=lfs diff=lfs merge=lfs -text
+ *.ckpt filter=lfs diff=lfs merge=lfs -text
+ *.ftz filter=lfs diff=lfs merge=lfs -text
+ *.gz filter=lfs diff=lfs merge=lfs -text
+ *.h5 filter=lfs diff=lfs merge=lfs -text
+ *.joblib filter=lfs diff=lfs merge=lfs -text
+ *.lfs.* filter=lfs diff=lfs merge=lfs -text
+ *.mlmodel filter=lfs diff=lfs merge=lfs -text
+ *.model filter=lfs diff=lfs merge=lfs -text
+ *.msgpack filter=lfs diff=lfs merge=lfs -text
+ *.npy filter=lfs diff=lfs merge=lfs -text
+ *.npz filter=lfs diff=lfs merge=lfs -text
+ *.onnx filter=lfs diff=lfs merge=lfs -text
+ *.ot filter=lfs diff=lfs merge=lfs -text
+ *.parquet filter=lfs diff=lfs merge=lfs -text
+ *.pb filter=lfs diff=lfs merge=lfs -text
+ *.pickle filter=lfs diff=lfs merge=lfs -text
+ *.pkl filter=lfs diff=lfs merge=lfs -text
+ *.pt filter=lfs diff=lfs merge=lfs -text
+ *.pth filter=lfs diff=lfs merge=lfs -text
+ *.rar filter=lfs diff=lfs merge=lfs -text
+ *.safetensors filter=lfs diff=lfs merge=lfs -text
+ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
+ *.tar.* filter=lfs diff=lfs merge=lfs -text
+ *.tar filter=lfs diff=lfs merge=lfs -text
+ *.tflite filter=lfs diff=lfs merge=lfs -text
+ *.tgz filter=lfs diff=lfs merge=lfs -text
+ *.wasm filter=lfs diff=lfs merge=lfs -text
+ *.xz filter=lfs diff=lfs merge=lfs -text
+ *.zip filter=lfs diff=lfs merge=lfs -text
+ *.zst filter=lfs diff=lfs merge=lfs -text
+ *tfevents* filter=lfs diff=lfs merge=lfs -text
+ aegis/models--meta-llama--LlamaGuard-7b/blobs/4d92c8b74f78b0e0f4b32921d13a007efcd0e0447290da6d92f787c3295b0ad8 filter=lfs diff=lfs merge=lfs -text
+ aegis/models--meta-llama--LlamaGuard-7b/blobs/a19b92a679870c311122d67ae980737cf3e51424b396b3809463c4d9b06c7fcf filter=lfs diff=lfs merge=lfs -text
+ aegis/models--meta-llama--LlamaGuard-7b/blobs/f9e6f2ab03a3b92bf4bc6cfd6d6dcdaa8b36ab5ecf73dcfd1e8da3b5a95261a8 filter=lfs diff=lfs merge=lfs -text
+ aegis/models--nvidia--Aegis-AI-Content-Safety-LlamaGuard-Defensive-1.0/blobs/d79b29a0b9ab36db8038e39e847b3c81ebd56dd8d796551943ea4b43b2e6c55c filter=lfs diff=lfs merge=lfs -text
+ video_content_safety_filter/models--google--siglip-so400m-patch14-384/blobs/ea2abad2b7f8a9c1aa5e49a244d5d57ffa71c56f720c94bc5d240ef4d6e1d94a filter=lfs diff=lfs merge=lfs -text
guardrail/aegis/.locks/models--meta-llama--LlamaGuard-7b/0d30a7faffd5631f68ca99856c40c252b1a5839a.lock ADDED
File without changes
guardrail/aegis/.locks/models--meta-llama--LlamaGuard-7b/1a87b8f7340ada18ca4f047077a9d5b13882acc1.lock ADDED
File without changes
guardrail/aegis/.locks/models--meta-llama--LlamaGuard-7b/451134b2ddc2e78555d1e857518c54b4bdc2e87d.lock ADDED
File without changes
guardrail/aegis/.locks/models--meta-llama--LlamaGuard-7b/4d92c8b74f78b0e0f4b32921d13a007efcd0e0447290da6d92f787c3295b0ad8.lock ADDED
File without changes
guardrail/aegis/.locks/models--meta-llama--LlamaGuard-7b/5f4117005b41815881fe7f26aee4cbec8c55aa32.lock ADDED
File without changes
guardrail/aegis/.locks/models--meta-llama--LlamaGuard-7b/9e556afd44213b6bd1be2b850ebbbd98f5481437a8021afaf58ee7fb1818d347.lock ADDED
File without changes
guardrail/aegis/.locks/models--meta-llama--LlamaGuard-7b/a19b92a679870c311122d67ae980737cf3e51424b396b3809463c4d9b06c7fcf.lock ADDED
File without changes
guardrail/aegis/.locks/models--meta-llama--LlamaGuard-7b/a6e931b92caff4c79c5c56282f1e89569a0ae558.lock ADDED
File without changes
guardrail/aegis/.locks/models--meta-llama--LlamaGuard-7b/e75756c38e88b19504b139e45c2bb1e925f3863c.lock ADDED
File without changes
guardrail/aegis/.locks/models--meta-llama--LlamaGuard-7b/f9e6f2ab03a3b92bf4bc6cfd6d6dcdaa8b36ab5ecf73dcfd1e8da3b5a95261a8.lock ADDED
File without changes
guardrail/aegis/.locks/models--nvidia--Aegis-AI-Content-Safety-LlamaGuard-Defensive-1.0/c4d110b05e852cead25fcc7426bf251eb3d15aa0.lock ADDED
File without changes
guardrail/aegis/.locks/models--nvidia--Aegis-AI-Content-Safety-LlamaGuard-Defensive-1.0/d79b29a0b9ab36db8038e39e847b3c81ebd56dd8d796551943ea4b43b2e6c55c.lock ADDED
File without changes
guardrail/aegis/models--meta-llama--LlamaGuard-7b/.no_exist/dfcfa3409b9994a4722d44e05f82e81ea73c5106/adapter_config.json ADDED
File without changes
guardrail/aegis/models--meta-llama--LlamaGuard-7b/.no_exist/dfcfa3409b9994a4722d44e05f82e81ea73c5106/added_tokens.json ADDED
File without changes
guardrail/aegis/models--meta-llama--LlamaGuard-7b/.no_exist/dfcfa3409b9994a4722d44e05f82e81ea73c5106/chat_template.jinja ADDED
File without changes
guardrail/aegis/models--meta-llama--LlamaGuard-7b/.no_exist/dfcfa3409b9994a4722d44e05f82e81ea73c5106/model.safetensors ADDED
File without changes
guardrail/aegis/models--meta-llama--LlamaGuard-7b/blobs/0d30a7faffd5631f68ca99856c40c252b1a5839a ADDED
@@ -0,0 +1,8 @@
+ {
+   "_from_model_config": true,
+   "pad_token_id": 0,
+   "bos_token_id": 1,
+   "eos_token_id": 2,
+   "do_sample": false,
+   "transformers_version": "4.35.2"
+ }
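This generation config pins the special token ids and sets `"do_sample": false`, so the guard model decodes greedily. A quick stdlib-only check of the parsed config (illustrative, not how transformers loads it internally):

```python
# Parse the generation_config JSON above and inspect the decoding settings.
import json

cfg = json.loads("""{
  "_from_model_config": true,
  "pad_token_id": 0,
  "bos_token_id": 1,
  "eos_token_id": 2,
  "do_sample": false,
  "transformers_version": "4.35.2"
}""")
print(cfg["do_sample"], cfg["eos_token_id"])  # False 2
```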
guardrail/aegis/models--meta-llama--LlamaGuard-7b/blobs/1a87b8f7340ada18ca4f047077a9d5b13882acc1 ADDED
@@ -0,0 +1,42 @@
+ {
+   "add_bos_token": true,
+   "add_eos_token": false,
+   "added_tokens_decoder": {
+     "0": {
+       "content": "<unk>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "1": {
+       "content": "<s>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "2": {
+       "content": "</s>",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     }
+   },
+   "bos_token": "<s>",
+   "chat_template": "{% if messages|length % 2 == 0 %}{% set role = 'Agent' %}{% else %}{% set role = 'User' %}{% endif %}{{ \"[INST] Task: Check if there is unsafe content in '\" + role + \"' messages in conversations according our safety policy with the below categories.\n\n<BEGIN UNSAFE CONTENT CATEGORIES>\nO1: Violence and Hate. \nShould not\n- Help people plan or engage in violence.\n- Encourage people to engage in violence.\n- Express hateful, derogatory, or demeaning sentiments against people based on sensitive personal characteristics like their race, color, religion, national origin, sexual orientation, gender, gender identity, or disability.\n- Encourage discrimination against people based on those characteristics and should not use slurs or create insults based on those characteristics.\nCan\n- Provide information on violence and discrimination.\n- Discuss the topics of hate and violence, and can discuss historical events involving violence.\nO2: Sexual Content.\nShould not\n- Engage in sexually explicit (i.e., erotic) conversations and should not encourage people to engage in sexual activities.\nCan\n- Discuss topics related to sexuality and sexual education.\n- Provide general information regarding sexual activities.\n- Provide guidance on topics related to sexual health.\nO3: Criminal Planning.\nShould not\n- Help people plan or engage in criminal activities like theft, kidnapping, or financial crimes, especially when the person indicates that they may act on the information provided to them.\n- Encourage people to engage in criminal activities or condone criminal behavior.\nCan\n- Help people understand how crimes are committed, provided that the response is not so detailed that it could be used to provide a roadmap for future crimes.\n- Help people understand how systems work that are used to prevent crimes, provided those descriptions do not provide detailed information on how to defeat those systems.\nO4: Guns and Illegal Weapons.\nShould not\n- Assist in the planning or execution of firearm or illegal weapons crimes, especially when the person indicates that they may act on the information provided to them.\n- Encourage people to engage in firearm or illegal weapons crimes or condone such behavior.\nCan\n- Discuss firearms and the arguments for and against firearm ownership.\nO5: Regulated or Controlled Substances.\nShould not\n- Encourage or assist people in the trafficking, consumption, or use of regulated or controlled substances, including but not limited to illegal drugs, tobacco, alcohol, and cannabis.\n- Assist or encourage people to create such substances when it is illegal to do so.\nCan\n- Discuss the use of regulated or controlled substances, and can provide general information on those substances (ex: what their chemical composition is and what their effects are).\n- Provide information on how those substances are manufactured, provided that the information is not so detailed as to provide step-by-step instructions for their creation.\nO6: Self-Harm.\nShould not\n- Encourage people to harm themselves.\n- Romanticize or condone self-harm or suicide.\n- Provide information regarding the methods of suicide or self-harm.\n- Provide any information or advice, other than information regarding appropriate health resources, to people who express or imply that they wish to harm themselves.\nCan\n- Provide general information regarding self-harm and suicide (ex: statistics, the mental health challenges faced by those who take these actions).\nShould\n- Provide information regarding appropriate health resources to people who imply or express the desire to harm themselves.\n<END UNSAFE CONTENT CATEGORIES>\n\n<BEGIN CONVERSATION>\n\n\" }}{% for message in messages %}{% if (message['role'] == 'user') != (loop.index0 % 2 == 0) %}{{ raise_exception('Conversation roles must alternate user/assistant/user/assistant/...') }}{% endif %}{% set content = message['content'] %}{% if message['role'] == 'user' %}{% set role = 'User' %}{% elif message['role'] == 'assistant' %}{% set role = 'Agent' %}{% endif %}{{ role + ': ' + content.strip() + '\n\n' }}{% endfor %}{{ \"<END CONVERSATION>\n\nProvide your safety assessment for \" + role + \" in the above conversation:\n- First line must read 'safe' or 'unsafe'.\n- If unsafe, a second line must include a comma-separated list of violated categories. [/INST]\" }}",
+   "clean_up_tokenization_spaces": false,
+   "eos_token": "</s>",
+   "legacy": true,
+   "model_max_length": 1000000000000000019884624838656,
+   "pad_token": null,
+   "sp_model_kwargs": {},
+   "spaces_between_special_tokens": false,
+   "tokenizer_class": "LlamaTokenizer",
+   "unk_token": "<unk>",
+   "use_default_system_prompt": false
+ }
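The `chat_template` above decides which side of the conversation is being assessed: with an even number of messages the last turn belongs to the 'Agent', otherwise to the 'User'. A plain-Python restatement of that rule (the real template is Jinja, rendered by the tokenizer; this helper is illustrative):

```python
# Restate the role-selection rule from the LlamaGuard chat_template:
# even message count -> assess the 'Agent' turn, odd -> the 'User' turn.
def assessed_role(messages: list) -> str:
    return "Agent" if len(messages) % 2 == 0 else "User"

print(assessed_role([{"role": "user", "content": "hi"}]))  # User
print(assessed_role([{"role": "user", "content": "hi"},
                     {"role": "assistant", "content": "hello"}]))  # Agent
```

The template also enforces strict user/assistant alternation, raising an exception when roles repeat, before emitting the safety-assessment instruction.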
guardrail/aegis/models--meta-llama--LlamaGuard-7b/blobs/451134b2ddc2e78555d1e857518c54b4bdc2e87d ADDED
@@ -0,0 +1,23 @@
+ {
+   "bos_token": {
+     "content": "<s>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "eos_token": {
+     "content": "</s>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "unk_token": {
+     "content": "<unk>",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   }
+ }
guardrail/aegis/models--meta-llama--LlamaGuard-7b/blobs/4d92c8b74f78b0e0f4b32921d13a007efcd0e0447290da6d92f787c3295b0ad8 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:4d92c8b74f78b0e0f4b32921d13a007efcd0e0447290da6d92f787c3295b0ad8
+ size 4947390880
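The three lines above are a Git LFS pointer file: the repository stores this small stub while the ~4.9 GB shard lives in LFS storage, and the `oid` names the blob by its SHA-256. A minimal parser for this format (the helper name is illustrative):

```python
# Parse a Git LFS pointer file (version / oid / size key-value lines).
def parse_lfs_pointer(text: str) -> dict:
    fields = dict(line.split(" ", 1) for line in text.strip().splitlines())
    assert fields["version"].startswith("https://git-lfs.github.com/spec/")
    algo, digest = fields["oid"].split(":", 1)
    return {"algorithm": algo, "digest": digest, "size": int(fields["size"])}

pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:4d92c8b74f78b0e0f4b32921d13a007efcd0e0447290da6d92f787c3295b0ad8
size 4947390880"""
info = parse_lfs_pointer(pointer)
print(info["size"])  # 4947390880
```

This is why the blob files under `blobs/` in this commit show only `+3` lines each: the pointer, not the weights, is what git tracks.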
guardrail/aegis/models--meta-llama--LlamaGuard-7b/blobs/5f4117005b41815881fe7f26aee4cbec8c55aa32 ADDED
@@ -0,0 +1,298 @@
+ {
+   "metadata": {
+     "total_size": 13476831232
+   },
+   "weight_map": {
+     "lm_head.weight": "model-00003-of-00003.safetensors",
+     "model.embed_tokens.weight": "model-00001-of-00003.safetensors",
+     "model.layers.0.input_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.0.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.0.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.0.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.0.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.0.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.0.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.0.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.0.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.1.input_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.1.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.1.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.1.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.1.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.1.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.1.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.1.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.1.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.10.input_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.10.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.10.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.10.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.10.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.10.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.10.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.10.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.10.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.11.input_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.11.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.11.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.11.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.11.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.11.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.11.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.11.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.11.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.12.input_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.12.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.12.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.12.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.12.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.12.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.12.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.12.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.12.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.13.input_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.13.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.13.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.13.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.13.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.13.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.13.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.13.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.13.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.14.input_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.14.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.14.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.14.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.14.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.14.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.14.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.14.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.14.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.15.input_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.15.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.15.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.15.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.15.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.15.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.15.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.15.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.15.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.16.input_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.16.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.16.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.16.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.16.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.16.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.16.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.16.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.16.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.17.input_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.17.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.17.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.17.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.17.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.17.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.17.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.17.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.17.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.18.input_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.18.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.18.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.18.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.18.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.18.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.18.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.18.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.18.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.19.input_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.19.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.19.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.19.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.19.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.19.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.19.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.19.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.19.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.2.input_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.2.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.2.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.2.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.2.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
+     "model.layers.2.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.2.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.2.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.2.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
+     "model.layers.20.input_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.20.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.20.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.20.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.20.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.20.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.20.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.20.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.20.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.21.input_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.21.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.21.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.21.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.21.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.21.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.21.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.21.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.21.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.22.input_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.22.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.22.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.22.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.22.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+     "model.layers.22.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.22.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.22.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.22.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.23.input_layernorm.weight": "model-00003-of-00003.safetensors",
+     "model.layers.23.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
+     "model.layers.23.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.23.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.23.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
+     "model.layers.23.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.23.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.23.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.23.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+     "model.layers.24.input_layernorm.weight": "model-00003-of-00003.safetensors",
+     "model.layers.24.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
+     "model.layers.24.mlp.gate_proj.weight": "model-00003-of-00003.safetensors",
+     "model.layers.24.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
+     "model.layers.24.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
+     "model.layers.24.self_attn.k_proj.weight": "model-00003-of-00003.safetensors",
+     "model.layers.24.self_attn.o_proj.weight": "model-00003-of-00003.safetensors",
+     "model.layers.24.self_attn.q_proj.weight": "model-00003-of-00003.safetensors",
+     "model.layers.24.self_attn.v_proj.weight": "model-00003-of-00003.safetensors",
+     "model.layers.25.input_layernorm.weight": "model-00003-of-00003.safetensors",
+     "model.layers.25.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
172
+ "model.layers.25.mlp.gate_proj.weight": "model-00003-of-00003.safetensors",
173
+ "model.layers.25.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
174
+ "model.layers.25.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
175
+ "model.layers.25.self_attn.k_proj.weight": "model-00003-of-00003.safetensors",
176
+ "model.layers.25.self_attn.o_proj.weight": "model-00003-of-00003.safetensors",
177
+ "model.layers.25.self_attn.q_proj.weight": "model-00003-of-00003.safetensors",
178
+ "model.layers.25.self_attn.v_proj.weight": "model-00003-of-00003.safetensors",
179
+ "model.layers.26.input_layernorm.weight": "model-00003-of-00003.safetensors",
180
+ "model.layers.26.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
181
+ "model.layers.26.mlp.gate_proj.weight": "model-00003-of-00003.safetensors",
182
+ "model.layers.26.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
183
+ "model.layers.26.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
184
+ "model.layers.26.self_attn.k_proj.weight": "model-00003-of-00003.safetensors",
185
+ "model.layers.26.self_attn.o_proj.weight": "model-00003-of-00003.safetensors",
186
+ "model.layers.26.self_attn.q_proj.weight": "model-00003-of-00003.safetensors",
187
+ "model.layers.26.self_attn.v_proj.weight": "model-00003-of-00003.safetensors",
188
+ "model.layers.27.input_layernorm.weight": "model-00003-of-00003.safetensors",
189
+ "model.layers.27.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
190
+ "model.layers.27.mlp.gate_proj.weight": "model-00003-of-00003.safetensors",
191
+ "model.layers.27.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
192
+ "model.layers.27.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
193
+ "model.layers.27.self_attn.k_proj.weight": "model-00003-of-00003.safetensors",
194
+ "model.layers.27.self_attn.o_proj.weight": "model-00003-of-00003.safetensors",
195
+ "model.layers.27.self_attn.q_proj.weight": "model-00003-of-00003.safetensors",
196
+ "model.layers.27.self_attn.v_proj.weight": "model-00003-of-00003.safetensors",
197
+ "model.layers.28.input_layernorm.weight": "model-00003-of-00003.safetensors",
198
+ "model.layers.28.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
199
+ "model.layers.28.mlp.gate_proj.weight": "model-00003-of-00003.safetensors",
200
+ "model.layers.28.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
201
+ "model.layers.28.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
202
+ "model.layers.28.self_attn.k_proj.weight": "model-00003-of-00003.safetensors",
203
+ "model.layers.28.self_attn.o_proj.weight": "model-00003-of-00003.safetensors",
204
+ "model.layers.28.self_attn.q_proj.weight": "model-00003-of-00003.safetensors",
205
+ "model.layers.28.self_attn.v_proj.weight": "model-00003-of-00003.safetensors",
206
+ "model.layers.29.input_layernorm.weight": "model-00003-of-00003.safetensors",
207
+ "model.layers.29.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
208
+ "model.layers.29.mlp.gate_proj.weight": "model-00003-of-00003.safetensors",
209
+ "model.layers.29.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
210
+ "model.layers.29.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
211
+ "model.layers.29.self_attn.k_proj.weight": "model-00003-of-00003.safetensors",
212
+ "model.layers.29.self_attn.o_proj.weight": "model-00003-of-00003.safetensors",
213
+ "model.layers.29.self_attn.q_proj.weight": "model-00003-of-00003.safetensors",
214
+ "model.layers.29.self_attn.v_proj.weight": "model-00003-of-00003.safetensors",
215
+ "model.layers.3.input_layernorm.weight": "model-00001-of-00003.safetensors",
216
+ "model.layers.3.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
217
+ "model.layers.3.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
218
+ "model.layers.3.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
219
+ "model.layers.3.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
220
+ "model.layers.3.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
221
+ "model.layers.3.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
222
+ "model.layers.3.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
223
+ "model.layers.3.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
224
+ "model.layers.30.input_layernorm.weight": "model-00003-of-00003.safetensors",
225
+ "model.layers.30.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
226
+ "model.layers.30.mlp.gate_proj.weight": "model-00003-of-00003.safetensors",
227
+ "model.layers.30.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
228
+ "model.layers.30.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
229
+ "model.layers.30.self_attn.k_proj.weight": "model-00003-of-00003.safetensors",
230
+ "model.layers.30.self_attn.o_proj.weight": "model-00003-of-00003.safetensors",
231
+ "model.layers.30.self_attn.q_proj.weight": "model-00003-of-00003.safetensors",
232
+ "model.layers.30.self_attn.v_proj.weight": "model-00003-of-00003.safetensors",
233
+ "model.layers.31.input_layernorm.weight": "model-00003-of-00003.safetensors",
234
+ "model.layers.31.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
235
+ "model.layers.31.mlp.gate_proj.weight": "model-00003-of-00003.safetensors",
236
+ "model.layers.31.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
237
+ "model.layers.31.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
238
+ "model.layers.31.self_attn.k_proj.weight": "model-00003-of-00003.safetensors",
239
+ "model.layers.31.self_attn.o_proj.weight": "model-00003-of-00003.safetensors",
240
+ "model.layers.31.self_attn.q_proj.weight": "model-00003-of-00003.safetensors",
241
+ "model.layers.31.self_attn.v_proj.weight": "model-00003-of-00003.safetensors",
242
+ "model.layers.4.input_layernorm.weight": "model-00001-of-00003.safetensors",
243
+ "model.layers.4.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
244
+ "model.layers.4.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
245
+ "model.layers.4.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
246
+ "model.layers.4.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
247
+ "model.layers.4.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
248
+ "model.layers.4.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
249
+ "model.layers.4.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
250
+ "model.layers.4.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
251
+ "model.layers.5.input_layernorm.weight": "model-00001-of-00003.safetensors",
252
+ "model.layers.5.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
253
+ "model.layers.5.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
254
+ "model.layers.5.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
255
+ "model.layers.5.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
256
+ "model.layers.5.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
257
+ "model.layers.5.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
258
+ "model.layers.5.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
259
+ "model.layers.5.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
260
+ "model.layers.6.input_layernorm.weight": "model-00001-of-00003.safetensors",
261
+ "model.layers.6.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
262
+ "model.layers.6.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
263
+ "model.layers.6.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
264
+ "model.layers.6.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
265
+ "model.layers.6.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
266
+ "model.layers.6.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
267
+ "model.layers.6.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
268
+ "model.layers.6.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
269
+ "model.layers.7.input_layernorm.weight": "model-00001-of-00003.safetensors",
270
+ "model.layers.7.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
271
+ "model.layers.7.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
272
+ "model.layers.7.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
273
+ "model.layers.7.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
274
+ "model.layers.7.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
275
+ "model.layers.7.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
276
+ "model.layers.7.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
277
+ "model.layers.7.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
278
+ "model.layers.8.input_layernorm.weight": "model-00001-of-00003.safetensors",
279
+ "model.layers.8.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
280
+ "model.layers.8.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
281
+ "model.layers.8.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
282
+ "model.layers.8.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
283
+ "model.layers.8.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
284
+ "model.layers.8.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
285
+ "model.layers.8.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
286
+ "model.layers.8.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
287
+ "model.layers.9.input_layernorm.weight": "model-00001-of-00003.safetensors",
288
+ "model.layers.9.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
289
+ "model.layers.9.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
290
+ "model.layers.9.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
291
+ "model.layers.9.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
292
+ "model.layers.9.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
293
+ "model.layers.9.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
294
+ "model.layers.9.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
295
+ "model.layers.9.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
296
+ "model.norm.weight": "model-00003-of-00003.safetensors"
297
+ }
298
+ }
guardrail/aegis/models--meta-llama--LlamaGuard-7b/blobs/9e556afd44213b6bd1be2b850ebbbd98f5481437a8021afaf58ee7fb1818d347 ADDED
Binary file (500 kB)
 
guardrail/aegis/models--meta-llama--LlamaGuard-7b/blobs/a19b92a679870c311122d67ae980737cf3e51424b396b3809463c4d9b06c7fcf ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:a19b92a679870c311122d67ae980737cf3e51424b396b3809463c4d9b06c7fcf
+ size 3590488816
guardrail/aegis/models--meta-llama--LlamaGuard-7b/blobs/a6e931b92caff4c79c5c56282f1e89569a0ae558 ADDED
The diff for this file is too large to render.
 
guardrail/aegis/models--meta-llama--LlamaGuard-7b/blobs/e75756c38e88b19504b139e45c2bb1e925f3863c ADDED
@@ -0,0 +1,26 @@
+ {
+ "architectures": [
+ "LlamaForCausalLM"
+ ],
+ "attention_bias": false,
+ "bos_token_id": 1,
+ "eos_token_id": 2,
+ "hidden_act": "silu",
+ "hidden_size": 4096,
+ "initializer_range": 0.02,
+ "intermediate_size": 11008,
+ "max_position_embeddings": 4096,
+ "model_type": "llama",
+ "num_attention_heads": 32,
+ "num_hidden_layers": 32,
+ "num_key_value_heads": 32,
+ "pretraining_tp": 1,
+ "rms_norm_eps": 1e-05,
+ "rope_scaling": null,
+ "rope_theta": 10000.0,
+ "tie_word_embeddings": false,
+ "torch_dtype": "bfloat16",
+ "transformers_version": "4.35.2",
+ "use_cache": true,
+ "vocab_size": 32000
+ }
guardrail/aegis/models--meta-llama--LlamaGuard-7b/blobs/f9e6f2ab03a3b92bf4bc6cfd6d6dcdaa8b36ab5ecf73dcfd1e8da3b5a95261a8 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:f9e6f2ab03a3b92bf4bc6cfd6d6dcdaa8b36ab5ecf73dcfd1e8da3b5a95261a8
+ size 4938985352
guardrail/aegis/models--meta-llama--LlamaGuard-7b/refs/main ADDED
@@ -0,0 +1 @@
+ dfcfa3409b9994a4722d44e05f82e81ea73c5106
guardrail/aegis/models--meta-llama--LlamaGuard-7b/snapshots/dfcfa3409b9994a4722d44e05f82e81ea73c5106/config.json ADDED
@@ -0,0 +1,26 @@
+ {
+ "architectures": [
+ "LlamaForCausalLM"
+ ],
+ "attention_bias": false,
+ "bos_token_id": 1,
+ "eos_token_id": 2,
+ "hidden_act": "silu",
+ "hidden_size": 4096,
+ "initializer_range": 0.02,
+ "intermediate_size": 11008,
+ "max_position_embeddings": 4096,
+ "model_type": "llama",
+ "num_attention_heads": 32,
+ "num_hidden_layers": 32,
+ "num_key_value_heads": 32,
+ "pretraining_tp": 1,
+ "rms_norm_eps": 1e-05,
+ "rope_scaling": null,
+ "rope_theta": 10000.0,
+ "tie_word_embeddings": false,
+ "torch_dtype": "bfloat16",
+ "transformers_version": "4.35.2",
+ "use_cache": true,
+ "vocab_size": 32000
+ }
guardrail/aegis/models--meta-llama--LlamaGuard-7b/snapshots/dfcfa3409b9994a4722d44e05f82e81ea73c5106/generation_config.json ADDED
@@ -0,0 +1,8 @@
+ {
+ "_from_model_config": true,
+ "pad_token_id": 0,
+ "bos_token_id": 1,
+ "eos_token_id": 2,
+ "do_sample": false,
+ "transformers_version": "4.35.2"
+ }
guardrail/aegis/models--meta-llama--LlamaGuard-7b/snapshots/dfcfa3409b9994a4722d44e05f82e81ea73c5106/model-00001-of-00003.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:f9e6f2ab03a3b92bf4bc6cfd6d6dcdaa8b36ab5ecf73dcfd1e8da3b5a95261a8
+ size 4938985352
guardrail/aegis/models--meta-llama--LlamaGuard-7b/snapshots/dfcfa3409b9994a4722d44e05f82e81ea73c5106/model-00002-of-00003.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:4d92c8b74f78b0e0f4b32921d13a007efcd0e0447290da6d92f787c3295b0ad8
+ size 4947390880
guardrail/aegis/models--meta-llama--LlamaGuard-7b/snapshots/dfcfa3409b9994a4722d44e05f82e81ea73c5106/model-00003-of-00003.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:a19b92a679870c311122d67ae980737cf3e51424b396b3809463c4d9b06c7fcf
+ size 3590488816
guardrail/aegis/models--meta-llama--LlamaGuard-7b/snapshots/dfcfa3409b9994a4722d44e05f82e81ea73c5106/model.safetensors.index.json ADDED
@@ -0,0 +1,298 @@
+ {
+ "metadata": {
+ "total_size": 13476831232
+ },
+ "weight_map": {
+ "lm_head.weight": "model-00003-of-00003.safetensors",
+ "model.embed_tokens.weight": "model-00001-of-00003.safetensors",
+ "model.layers.0.input_layernorm.weight": "model-00001-of-00003.safetensors",
+ "model.layers.0.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.0.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.0.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.0.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
+ "model.layers.0.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.0.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.0.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.0.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.1.input_layernorm.weight": "model-00001-of-00003.safetensors",
+ "model.layers.1.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.1.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.1.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.1.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
+ "model.layers.1.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.1.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.1.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.1.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.10.input_layernorm.weight": "model-00001-of-00003.safetensors",
+ "model.layers.10.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.10.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.10.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.10.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
+ "model.layers.10.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.10.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.10.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.10.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.11.input_layernorm.weight": "model-00002-of-00003.safetensors",
+ "model.layers.11.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.11.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.11.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.11.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+ "model.layers.11.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.11.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.11.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.11.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.12.input_layernorm.weight": "model-00002-of-00003.safetensors",
+ "model.layers.12.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.12.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.12.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.12.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+ "model.layers.12.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.12.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.12.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.12.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.13.input_layernorm.weight": "model-00002-of-00003.safetensors",
+ "model.layers.13.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.13.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.13.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.13.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+ "model.layers.13.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.13.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.13.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.13.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.14.input_layernorm.weight": "model-00002-of-00003.safetensors",
+ "model.layers.14.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.14.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.14.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.14.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+ "model.layers.14.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.14.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.14.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.14.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.15.input_layernorm.weight": "model-00002-of-00003.safetensors",
+ "model.layers.15.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.15.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.15.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.15.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+ "model.layers.15.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.15.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.15.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.15.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.16.input_layernorm.weight": "model-00002-of-00003.safetensors",
+ "model.layers.16.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.16.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.16.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.16.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+ "model.layers.16.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.16.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.16.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.16.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.17.input_layernorm.weight": "model-00002-of-00003.safetensors",
+ "model.layers.17.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.17.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.17.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.17.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+ "model.layers.17.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.17.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.17.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.17.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.18.input_layernorm.weight": "model-00002-of-00003.safetensors",
+ "model.layers.18.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.18.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.18.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.18.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+ "model.layers.18.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.18.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.18.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.18.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.19.input_layernorm.weight": "model-00002-of-00003.safetensors",
+ "model.layers.19.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.19.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.19.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.19.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+ "model.layers.19.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.19.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.19.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.19.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.2.input_layernorm.weight": "model-00001-of-00003.safetensors",
+ "model.layers.2.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.2.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.2.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.2.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
+ "model.layers.2.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.2.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.2.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.2.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.20.input_layernorm.weight": "model-00002-of-00003.safetensors",
+ "model.layers.20.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.20.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.20.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.20.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+ "model.layers.20.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.20.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.20.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.20.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.21.input_layernorm.weight": "model-00002-of-00003.safetensors",
+ "model.layers.21.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.21.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.21.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.21.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+ "model.layers.21.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.21.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.21.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.21.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.22.input_layernorm.weight": "model-00002-of-00003.safetensors",
+ "model.layers.22.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.22.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.22.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.22.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+ "model.layers.22.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.22.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.22.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.22.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.23.input_layernorm.weight": "model-00003-of-00003.safetensors",
+ "model.layers.23.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.23.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.23.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.23.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
+ "model.layers.23.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.23.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.23.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.23.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.24.input_layernorm.weight": "model-00003-of-00003.safetensors",
+ "model.layers.24.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.24.mlp.gate_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.24.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.24.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
+ "model.layers.24.self_attn.k_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.24.self_attn.o_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.24.self_attn.q_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.24.self_attn.v_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.25.input_layernorm.weight": "model-00003-of-00003.safetensors",
+ "model.layers.25.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.25.mlp.gate_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.25.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.25.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
+ "model.layers.25.self_attn.k_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.25.self_attn.o_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.25.self_attn.q_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.25.self_attn.v_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.26.input_layernorm.weight": "model-00003-of-00003.safetensors",
+ "model.layers.26.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.26.mlp.gate_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.26.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.26.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
+ "model.layers.26.self_attn.k_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.26.self_attn.o_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.26.self_attn.q_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.26.self_attn.v_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.27.input_layernorm.weight": "model-00003-of-00003.safetensors",
+ "model.layers.27.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.27.mlp.gate_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.27.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.27.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
+ "model.layers.27.self_attn.k_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.27.self_attn.o_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.27.self_attn.q_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.27.self_attn.v_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.28.input_layernorm.weight": "model-00003-of-00003.safetensors",
+ "model.layers.28.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.28.mlp.gate_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.28.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.28.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
+ "model.layers.28.self_attn.k_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.28.self_attn.o_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.28.self_attn.q_proj.weight": "model-00003-of-00003.safetensors",
205
+ "model.layers.28.self_attn.v_proj.weight": "model-00003-of-00003.safetensors",
206
+ "model.layers.29.input_layernorm.weight": "model-00003-of-00003.safetensors",
207
+ "model.layers.29.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
208
+ "model.layers.29.mlp.gate_proj.weight": "model-00003-of-00003.safetensors",
209
+ "model.layers.29.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
210
+ "model.layers.29.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
211
+ "model.layers.29.self_attn.k_proj.weight": "model-00003-of-00003.safetensors",
212
+ "model.layers.29.self_attn.o_proj.weight": "model-00003-of-00003.safetensors",
213
+ "model.layers.29.self_attn.q_proj.weight": "model-00003-of-00003.safetensors",
214
+ "model.layers.29.self_attn.v_proj.weight": "model-00003-of-00003.safetensors",
215
+ "model.layers.3.input_layernorm.weight": "model-00001-of-00003.safetensors",
216
+ "model.layers.3.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
217
+ "model.layers.3.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
218
+ "model.layers.3.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
219
+ "model.layers.3.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
220
+ "model.layers.3.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
221
+ "model.layers.3.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
222
+ "model.layers.3.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
223
+ "model.layers.3.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
224
+ "model.layers.30.input_layernorm.weight": "model-00003-of-00003.safetensors",
225
+ "model.layers.30.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
226
+ "model.layers.30.mlp.gate_proj.weight": "model-00003-of-00003.safetensors",
227
+ "model.layers.30.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
228
+ "model.layers.30.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
229
+ "model.layers.30.self_attn.k_proj.weight": "model-00003-of-00003.safetensors",
230
+ "model.layers.30.self_attn.o_proj.weight": "model-00003-of-00003.safetensors",
231
+ "model.layers.30.self_attn.q_proj.weight": "model-00003-of-00003.safetensors",
232
+ "model.layers.30.self_attn.v_proj.weight": "model-00003-of-00003.safetensors",
233
+ "model.layers.31.input_layernorm.weight": "model-00003-of-00003.safetensors",
234
+ "model.layers.31.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
235
+ "model.layers.31.mlp.gate_proj.weight": "model-00003-of-00003.safetensors",
236
+ "model.layers.31.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
237
+ "model.layers.31.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
238
+ "model.layers.31.self_attn.k_proj.weight": "model-00003-of-00003.safetensors",
239
+ "model.layers.31.self_attn.o_proj.weight": "model-00003-of-00003.safetensors",
240
+ "model.layers.31.self_attn.q_proj.weight": "model-00003-of-00003.safetensors",
241
+ "model.layers.31.self_attn.v_proj.weight": "model-00003-of-00003.safetensors",
242
+ "model.layers.4.input_layernorm.weight": "model-00001-of-00003.safetensors",
243
+ "model.layers.4.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
244
+ "model.layers.4.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
245
+ "model.layers.4.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
246
+ "model.layers.4.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
247
+ "model.layers.4.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
248
+ "model.layers.4.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
249
+ "model.layers.4.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
250
+ "model.layers.4.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
251
+ "model.layers.5.input_layernorm.weight": "model-00001-of-00003.safetensors",
252
+ "model.layers.5.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
253
+ "model.layers.5.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
254
+ "model.layers.5.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
255
+ "model.layers.5.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
256
+ "model.layers.5.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
257
+ "model.layers.5.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
258
+ "model.layers.5.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
259
+ "model.layers.5.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
260
+ "model.layers.6.input_layernorm.weight": "model-00001-of-00003.safetensors",
261
+ "model.layers.6.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
262
+ "model.layers.6.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
263
+ "model.layers.6.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
264
+ "model.layers.6.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
265
+ "model.layers.6.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
266
+ "model.layers.6.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
267
+ "model.layers.6.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
268
+ "model.layers.6.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
269
+ "model.layers.7.input_layernorm.weight": "model-00001-of-00003.safetensors",
270
+ "model.layers.7.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
271
+ "model.layers.7.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
272
+ "model.layers.7.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
273
+ "model.layers.7.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
274
+ "model.layers.7.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
275
+ "model.layers.7.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
276
+ "model.layers.7.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
277
+ "model.layers.7.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
278
+ "model.layers.8.input_layernorm.weight": "model-00001-of-00003.safetensors",
279
+ "model.layers.8.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
280
+ "model.layers.8.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
281
+ "model.layers.8.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
282
+ "model.layers.8.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
283
+ "model.layers.8.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
284
+ "model.layers.8.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
285
+ "model.layers.8.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
286
+ "model.layers.8.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
287
+ "model.layers.9.input_layernorm.weight": "model-00001-of-00003.safetensors",
288
+ "model.layers.9.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
289
+ "model.layers.9.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
290
+ "model.layers.9.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
291
+ "model.layers.9.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
292
+ "model.layers.9.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
293
+ "model.layers.9.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
294
+ "model.layers.9.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
295
+ "model.layers.9.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
296
+ "model.norm.weight": "model-00003-of-00003.safetensors"
297
+ }
298
+ }
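The `weight_map` added above is how a sharded safetensors checkpoint is resolved at load time: each tensor name maps to the shard file that stores it. A minimal sketch of that lookup (the index dict here is a small hand-built excerpt, not the full file):

```python
# Sketch of resolving a tensor to its shard via a safetensors index's
# "weight_map". The index below is a two-entry excerpt for illustration.
def shard_for(index: dict, tensor_name: str) -> str:
    # The loader opens only the shard file that holds the requested tensor.
    return index["weight_map"][tensor_name]

index = {
    "weight_map": {
        "model.layers.3.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
        "model.norm.weight": "model-00003-of-00003.safetensors",
    }
}
print(shard_for(index, "model.norm.weight"))  # model-00003-of-00003.safetensors
```

In the real repository this dict would come from `json.load()` on `model.safetensors.index.json`.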
guardrail/aegis/models--meta-llama--LlamaGuard-7b/snapshots/dfcfa3409b9994a4722d44e05f82e81ea73c5106/special_tokens_map.json ADDED
@@ -0,0 +1,23 @@
+ {
+ "bos_token": {
+ "content": "<s>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "eos_token": {
+ "content": "</s>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "unk_token": {
+ "content": "<unk>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ }
+ }
guardrail/aegis/models--meta-llama--LlamaGuard-7b/snapshots/dfcfa3409b9994a4722d44e05f82e81ea73c5106/tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
guardrail/aegis/models--meta-llama--LlamaGuard-7b/snapshots/dfcfa3409b9994a4722d44e05f82e81ea73c5106/tokenizer.model ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:9e556afd44213b6bd1be2b850ebbbd98f5481437a8021afaf58ee7fb1818d347
+ size 499723
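The three lines above are a Git LFS pointer file: the large blob (here `tokenizer.model`) lives out-of-band and is addressed by its sha256 oid and size. Parsing the pointer's key/value format is straightforward:

```python
def parse_lfs_pointer(text: str) -> dict:
    # A Git LFS pointer is a small "key value" text file; split each line
    # on the first space to recover the version, oid, and size fields.
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:9e556afd44213b6bd1be2b850ebbbd98f5481437a8021afaf58ee7fb1818d347
size 499723"""
info = parse_lfs_pointer(pointer)
print(info["size"])  # 499723
```

The `oid` doubles as the blob filename under the hub cache's `blobs/` directory, which is why the `snapshots/` entries later in this diff are just relative paths into `blobs/`.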
guardrail/aegis/models--meta-llama--LlamaGuard-7b/snapshots/dfcfa3409b9994a4722d44e05f82e81ea73c5106/tokenizer_config.json ADDED
@@ -0,0 +1,42 @@
+ {
+ "add_bos_token": true,
+ "add_eos_token": false,
+ "added_tokens_decoder": {
+ "0": {
+ "content": "<unk>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "1": {
+ "content": "<s>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "2": {
+ "content": "</s>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ }
+ },
+ "bos_token": "<s>",
+ "chat_template": "{% if messages|length % 2 == 0 %}{% set role = 'Agent' %}{% else %}{% set role = 'User' %}{% endif %}{{ \"[INST] Task: Check if there is unsafe content in '\" + role + \"' messages in conversations according our safety policy with the below categories.\n\n<BEGIN UNSAFE CONTENT CATEGORIES>\nO1: Violence and Hate. \nShould not\n- Help people plan or engage in violence.\n- Encourage people to engage in violence.\n- Express hateful, derogatory, or demeaning sentiments against people based on sensitive personal characteristics like their race, color, religion, national origin, sexual orientation, gender, gender identity, or disability.\n- Encourage discrimination against people based on those characteristics and should not use slurs or create insults based on those characteristics.\nCan\n- Provide information on violence and discrimination.\n- Discuss the topics of hate and violence, and can discuss historical events involving violence.\nO2: Sexual Content.\nShould not\n- Engage in sexually explicit (i.e., erotic) conversations and should not encourage people to engage in sexual activities.\nCan\n- Discuss topics related to sexuality and sexual education.\n- Provide general information regarding sexual activities.\n- Provide guidance on topics related to sexual health.\nO3: Criminal Planning.\nShould not\n- Help people plan or engage in criminal activities like theft, kidnapping, or financial crimes, especially when the person indicates that they may act on the information provided to them.\n- Encourage people to engage in criminal activities or condone criminal behavior.\nCan\n- Help people understand how crimes are committed, provided that the response is not so detailed that it could be used to provide a roadmap for future crimes.\n- Help people understand how systems work that are used to prevent crimes, provided those descriptions do not provide detailed information on how to defeat those systems.\nO4: Guns and Illegal Weapons.\nShould not\n- Assist in the planning or execution of firearm or illegal weapons crimes, especially when the person indicates that they may act on the information provided to them.\n- Encourage people to engage in firearm or illegal weapons crimes or condone such behavior.\nCan\n- Discuss firearms and the arguments for and against firearm ownership.\nO5: Regulated or Controlled Substances.\nShould not\n- Encourage or assist people in the trafficking, consumption, or use of regulated or controlled substances, including but not limited to illegal drugs, tobacco, alcohol, and cannabis.\n- Assist or encourage people to create such substances when it is illegal to do so.\nCan\n- Discuss the use of regulated or controlled substances, and can provide general information on those substances (ex: what their chemical composition is and what their effects are).\n- Provide information on how those substances are manufactured, provided that the information is not so detailed as to provide step-by-step instructions for their creation.\nO6: Self-Harm.\nShould not\n- Encourage people to harm themselves.\n- Romanticize or condone self-harm or suicide.\n- Provide information regarding the methods of suicide or self-harm.\n- Provide any information or advice, other than information regarding appropriate health resources, to people who express or imply that they wish to harm themselves.\nCan\n- Provide general information regarding self-harm and suicide (ex: statistics, the mental health challenges faced by those who take these actions).\nShould\n- Provide information regarding appropriate health resources to people who imply or express the desire to harm themselves.\n<END UNSAFE CONTENT CATEGORIES>\n\n<BEGIN CONVERSATION>\n\n\" }}{% for message in messages %}{% if (message['role'] == 'user') != (loop.index0 % 2 == 0) %}{{ raise_exception('Conversation roles must alternate user/assistant/user/assistant/...') }}{% endif %}{% set content = message['content'] %}{% if message['role'] == 'user' %}{% set role = 'User' %}{% elif message['role'] == 'assistant' %}{% set role = 'Agent' %}{% endif %}{{ role + ': ' + content.strip() + '\n\n' }}{% endfor %}{{ \"<END CONVERSATION>\n\nProvide your safety assessment for \" + role + \" in the above conversation:\n- First line must read 'safe' or 'unsafe'.\n- If unsafe, a second line must include a comma-separated list of violated categories. [/INST]\" }}",
+ "clean_up_tokenization_spaces": false,
+ "eos_token": "</s>",
+ "legacy": true,
+ "model_max_length": 1000000000000000019884624838656,
+ "pad_token": null,
+ "sp_model_kwargs": {},
+ "spaces_between_special_tokens": false,
+ "tokenizer_class": "LlamaTokenizer",
+ "unk_token": "<unk>",
+ "use_default_system_prompt": false
+ }
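The Jinja `chat_template` in this tokenizer config decides whose turn is being assessed: with an even number of messages the conversation ends on an assistant turn, so the safety verdict targets the 'Agent'; with an odd count it targets the 'User'. The same logic in plain Python, as a sketch:

```python
# Plain-Python restatement of the role selection encoded in the Jinja
# chat_template: even message count -> last turn is the assistant's
# ('Agent'); odd count -> the user's ('User').
def assessment_target(messages):
    return "Agent" if len(messages) % 2 == 0 else "User"

print(assessment_target([{"role": "user", "content": "hi"}]))  # User
print(assessment_target([
    {"role": "user", "content": "hi"},
    {"role": "assistant", "content": "hello"},
]))  # Agent
```

The template also enforces strict user/assistant alternation via `raise_exception`, so inputs must be well-ordered before rendering.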
guardrail/aegis/models--nvidia--Aegis-AI-Content-Safety-LlamaGuard-Defensive-1.0/blobs/c4d110b05e852cead25fcc7426bf251eb3d15aa0 ADDED
@@ -0,0 +1,33 @@
+ {
+ "alpha_pattern": {},
+ "auto_mapping": null,
+ "base_model_name_or_path": "/workspace/hugging_face/hub/models--meta-llama--LlamaGuard-7b/snapshots/3e764390d6b39028ddea5b20603c89476107b41e/",
+ "bias": "none",
+ "fan_in_fan_out": false,
+ "inference_mode": true,
+ "init_lora_weights": true,
+ "layers_pattern": null,
+ "layers_to_transform": null,
+ "loftq_config": {},
+ "lora_alpha": 32,
+ "lora_dropout": 0.05,
+ "megatron_config": null,
+ "megatron_core": "megatron.core",
+ "modules_to_save": null,
+ "peft_type": "LORA",
+ "r": 16,
+ "rank_pattern": {},
+ "revision": null,
+ "target_modules": [
+ "v_proj",
+ "lm_head",
+ "k_proj",
+ "up_proj",
+ "gate_proj",
+ "q_proj",
+ "down_proj",
+ "o_proj"
+ ],
+ "task_type": "CAUSAL_LM",
+ "use_rslora": false
+ }
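This adapter_config describes a LoRA adapter (PEFT) over the LlamaGuard base: rank `r=16`, `lora_alpha=32`, applied to the listed attention/MLP projections plus `lm_head`. With `use_rslora` false, the update is scaled by `lora_alpha / r`; a quick sketch of that arithmetic:

```python
# Effective LoRA scaling implied by this adapter_config: with
# use_rslora == False the factor is lora_alpha / r (rsLoRA would
# instead use lora_alpha / sqrt(r)).
config = {"lora_alpha": 32, "r": 16, "use_rslora": False}
scaling = config["lora_alpha"] / config["r"]
print(scaling)  # 2.0
```

So each adapted weight receives `W + 2.0 * (B @ A)`, where A and B are the rank-16 low-rank factors stored in `adapter_model.safetensors`.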
guardrail/aegis/models--nvidia--Aegis-AI-Content-Safety-LlamaGuard-Defensive-1.0/blobs/d79b29a0b9ab36db8038e39e847b3c81ebd56dd8d796551943ea4b43b2e6c55c ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:d79b29a0b9ab36db8038e39e847b3c81ebd56dd8d796551943ea4b43b2e6c55c
+ size 162278280
guardrail/aegis/models--nvidia--Aegis-AI-Content-Safety-LlamaGuard-Defensive-1.0/refs/main ADDED
@@ -0,0 +1 @@
+ 62006ace73a69838083a31831126146048694b25
guardrail/aegis/models--nvidia--Aegis-AI-Content-Safety-LlamaGuard-Defensive-1.0/snapshots/62006ace73a69838083a31831126146048694b25/adapter_config.json ADDED
@@ -0,0 +1 @@
+ ../../blobs/c4d110b05e852cead25fcc7426bf251eb3d15aa0
guardrail/aegis/models--nvidia--Aegis-AI-Content-Safety-LlamaGuard-Defensive-1.0/snapshots/62006ace73a69838083a31831126146048694b25/adapter_model.safetensors ADDED
@@ -0,0 +1 @@
+ ../../blobs/d79b29a0b9ab36db8038e39e847b3c81ebd56dd8d796551943ea4b43b2e6c55c
guardrail/aegis/models--nvidia--Aegis-AI-Content-Safety-LlamaGuard-Defensive-1.0/snapshots/f54cb2302ee876705dc0f7df2288f442c034b2f3/adapter_config.json ADDED
@@ -0,0 +1,33 @@
+ {
+ "alpha_pattern": {},
+ "auto_mapping": null,
+ "base_model_name_or_path": "/workspace/hugging_face/hub/models--meta-llama--LlamaGuard-7b/snapshots/3e764390d6b39028ddea5b20603c89476107b41e/",
+ "bias": "none",
+ "fan_in_fan_out": false,
+ "inference_mode": true,
+ "init_lora_weights": true,
+ "layers_pattern": null,
+ "layers_to_transform": null,
+ "loftq_config": {},
+ "lora_alpha": 32,
+ "lora_dropout": 0.05,
+ "megatron_config": null,
+ "megatron_core": "megatron.core",
+ "modules_to_save": null,
+ "peft_type": "LORA",
+ "r": 16,
+ "rank_pattern": {},
+ "revision": null,
+ "target_modules": [
+ "v_proj",
+ "lm_head",
+ "k_proj",
+ "up_proj",
+ "gate_proj",
+ "q_proj",
+ "down_proj",
+ "o_proj"
+ ],
+ "task_type": "CAUSAL_LM",
+ "use_rslora": false
+ }
guardrail/aegis/models--nvidia--Aegis-AI-Content-Safety-LlamaGuard-Defensive-1.0/snapshots/f54cb2302ee876705dc0f7df2288f442c034b2f3/adapter_model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:d79b29a0b9ab36db8038e39e847b3c81ebd56dd8d796551943ea4b43b2e6c55c
+ size 162278280
guardrail/blocklist/custom/blocklist ADDED
@@ -0,0 +1,383 @@
+ nude
+ naked
+ nudity
+ clothingless
+ undress
+ striptease
+ blood
+ horror
+ gross
+ gore
+ butcherly
+ brutal
+ zombie
+ crucifixion
+ secretion
+ horrific
+ disgusting
+ carnage
+ creepy
+ grotesque
+ gruesome
+ macabre
+ freaky
+ morbid
+ hideous
+ bloodstream
+ horrifying
+ hemorrhage
+ lurid
+ sordid
+ bloodthirsty
+ ghastly
+ gratuitous
+ nightmarish
+ savagery
+ cronenberg
+ sadism
+ salacious
+ barbarity
+ gunplay
+ heart wrenching
+ ghoulish
+ tawdry
+ nauseating
+ titillating
+ bloodthirsty
+ pulpy
+ systole
+ fetishistic
+ sanguinary
+ discomforting
+ bulletstorm
+ slaughterous
+ sanguineous
+ bespatter
+ beflake
+ deskin
+ deskinned
+ deskinning
+ lifeless
+ bleeding
+ The Nightmare Before Christmas
+ Snow White and the Seven Dwarfs
+ Snow White
+ the Seven Dwarfs
+ Baymax
+ Big Hero 6
+ zootopia
+ aristocats
+ Mary Poppins
+ Inside Out
+ Nemo
+ Finding Nemo
+ Monsters, Inc.
+ Buzz Lightyear
+ DuckTales
+ 101 Dalmatians
+ Lilo & Stitch
+ Mickey Mouse
+ Minnie Mouse
+ Mary Poppins
+ Mulan
+ The Lion King
+ Oliver & Company
+ Dumbo
+ Bugs Bunny
+ Daffy Duck
+ Porky Pig
+ Elmer Fudd
+ Tweety Bird
+ Sylvester the Cat
+ Yosemite Sam
+ Foghorn Leghorn
+ Marvin the Martian
+ Pep\u00e9 le Pew
+ Superman
+ Batman
+ Wonder Woman
+ Green Lantern
+ The Flash
+ Aquaman
+ Harry Potter
+ Frodo Baggins
+ Gandalf the Grey
+ Bilbo Baggins
+ The Joker
+ Lex Luthor
+ Darkseid
+ Sinestro
+ Brainiac
+ Black Adam
+ Ra's al Ghul
+ Mr. Freeze
+ Lord Voldemort
+ Tom and Jerry
+ Scooby-Doo
+ Sylvester & Tweety
+ The Flintstones
+ Johnny Bravo
+ Popeye
+ Yogi Bear
+ Ant-Man
+ Captain America
+ Captain Marvel
+ Hawkeye
+ Magneto
+ She-Hulk
+ Silver Surfer
+ Spider-Man
+ Spider-Woman
+ Star-Lord
+ Thanos
+ Super Mario
+ Princess Peach
+ Bowser
+ Toadette
+ Yoshi
+ Wario
+ Waluigi
+ Donkey Kong
+ Diddy Kong
+ Rosalina
+ Bowser Jr.
+ Koopaling
+ Princess Zelda
+ Ganondorf
+ Pikachu
+ Charizard
+ Bulbasaur
+ Squirtle
+ Jigglypuff
+ Meowth
+ Lucario
+ Greninja
+ Mewtwo
+ Eevee
+ Trump
+ Monet
+ Mona Lisa
+ Jensen Huang
+ Mark Zuckerberg
+ Avatar
+ Obama
+ Oprah Winfrey
+ Tom Cruise
+ Lady Gaga
+ Kim Kardashian
+ Taylor Swift
+ Jennifer Lopez
+ 3M
+ Acura
+ Adidas
+ Airpods
+ Armani
+ AstraZeneca
+ Aston Martin
+ AMD
+ Amy Somerville
+ Anthony Theakston
+ AT&T
+ Audi
+ Aventador
+ Barbie
+ Batman
+ Batmobile
+ Bayer
+ Bently
+ BMW
+ Boston Dynamics
+ Blue Origin
+ Bosch
+ Brawl Stars
+ Brokis
+ Budweiser
+ Burger King
+ Bugatti
+ Burberry
+ Buick
+ Cadiliac
+ Canon
+ Cartier
+ Cassina
+ Chanel
+ Chick-fil-A
+ Cisco
+ Chevron
+ Chromebook
+ Chrystler
+ Citroen
+ Colgate
+ CocaCola
+ Countach
+ DC Comics
+ Daimler
+ Dell
+ DHL
+ Dior
+ Disney
+ Duracell
+ Dunkin Donuts
+ Eichholtz
+ Entler Studio
+ Elemento
+ Electrolux
+ Estee Lauder
+ Exxon
+ Fedex
+ Ferrari
+ Ferdinand Kramer
+ Fiskars
+ Fisker
+ Ford
+ Fortnite
+ Gatorade
+ Gandalf
+ Gameboy
+ Gamecube
+ General Motors
+ Gillette
+ Glock
+ GMC
+ Graef
+ Grant Featherston Contour
+ Gucci
+ Gulfstream
+ Gundam
+ Haier
+ Harley Davidson
+ Harry Potter
+ Heineken
+ Hennessy
+ Hermes
+ Hitachi
+ Hogwarts
+ Homepod
+ Honda
+ Honeywell
+ Hoka
+ Huawei
+ Huracan
+ Ikea
+ Imac
+ Ipad
+ Iphone
+ Ipod
+ Isuzu
+ Infiniti
+ Intel
+ Jabulani
+ John Deere
+ John Pomp Studios
+ Kelloggs
+ KFC
+ Kia
+ Land Rover
+ Lamborghini
+ Lancel
+ Lego
+ Lenovo
+ Lexus
+ LG
+ L'Oreal
+ Lockheed
+ Louis Vuitton
+ Lululemon
+ Macbook
+ Martinelli Luce
+ Mattel
+ Mazda
+ McDonalds
+ McDonald's
+ McClaren
+ Medtronic
+ Mercedes
+ Mettler Toledo
+ Michelin
+ Microsoft
+ Mini Cooper
+ Mitsubishi
+ Miura
+ MLB
+ Mondial
+ Montblanc
+ Motopanfilo
+ Motorola
+ Mortal Kombat
+ Murci\u00e9lago
+ N64
+ NBA
+ NCAA
+ Nike
+ Nintendo
+ Nissan
+ Northrop Grumman
+ Nokia
+ NVIDIA
+ Oakley
+ Panasonic
+ Pantene
+ Piaget
+ Pepsi
+ Peugeot
+ Philips
+ Playstation
+ Poltrona Frau
+ Porsche
+ Prada
+ Publix
+ Purina
+ Rakuten
+ Rayban
+ Raytheon
+ Reebok
+ Red Bull
+ Renault
+ Rimet
+ Rivian
+ Roche
+ Rolex
+ Rolls Royce
+ Saleen
+ Samsung
+ Schwan-Stabilo
+ Serapian
+ Shelby
+ Siemens
+ SpaceX
+ Starbucks
+ Stanley Cup
+ Stavros
+ Steiner Optics
+ Sonos
+ Sony
+ Skidata
+ Superman
+ Subaru
+ Suzuki
+ Taylormade
+ Tesla
+ Terminator
+ Toyota
+ Tuttuno
+ USPS
+ Vespa
+ Verizon
+ Volocopter
+ Volvo
+ Volkswagen
+ Walmart
+ Wii
+ WonderGlass
+ Woolworths
+ Xiaomi
+ Xbox
+ YKK
+ Zaha Hadid
+ Zuny
+ war
+ torture
+ murder
+ sniper
+ shotgun
+ rifle
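The custom blocklist above is one term per line. The guardrail's exact matching semantics are not shown in this diff; a plausible minimal interpretation (an assumption, not the shipped implementation) is a case-insensitive substring scan of the prompt:

```python
def hits(prompt: str, blocklist: list) -> list:
    # Illustrative assumption: case-insensitive substring matching.
    # The actual guardrail's matcher may tokenize or use word boundaries.
    p = prompt.lower()
    return [term for term in blocklist if term.lower() in p]

blocklist = ["gore", "zombie", "rifle"]
print(hits("A zombie holding a rifle", blocklist))  # ['zombie', 'rifle']
```

Note that naive substring matching over-triggers (e.g. "war" inside "software"), which is one reason a separate exact-match list exists alongside this one.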
guardrail/blocklist/exact_match/blocked ADDED
@@ -0,0 +1,1414 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ Abu
2
+ Adams
3
+ Adele
4
+ Adonis Creed
5
+ Aerith Gainsborough
6
+ Agatha Harkness
7
+ Agnes
8
+ Ahri
9
+ Ahsoka Tano
10
+ Aku
11
+ Akuma
12
+ Aladdin
13
+ Alan Partridge
14
+ Alex
15
+ Alex Levy
16
+ Alex the Lion
17
+ Ali G
18
+ Alice
19
+ Alpha Pig
20
+ Alphonse Elric
21
+ Alyx Vance
22
+ America Chavez
23
+ Amethyst
24
+ Amicia de Rune
25
+ Anakin
26
+ Anakin Skywalker
27
+ Anastasia
28
+ Anastasia Tremaine
29
+ Angelica Pickles
30
+ Angelina Ballerina
31
+ Angie
32
+ Angus
33
+ Anna
34
+ Annie
+ Anpanman
+ Ant-Man
+ Anthony Edwards
+ Anton Chigurh
+ Anton Ego
+ Apollo Justice
+ Aqua Teen Carl
+ Aquaman
+ Archie Andrews
+ Archimedes
+ Ariana Grande
+ Ariel
+ Arkwright
+ Arlo
+ Arnim Zola
+ Arnold Shortman
+ Arrietty
+ Arthas Menethil
+ Arthur
+ Arthur Morgan
+ Arthur Nudge
+ Arthur Read
+ Ash Ketchum
+ Ashitaka
+ Asterix
+ Astrid
+ Astro Boy
+ Astroboy
+ Atom Ant
+ Atreus
+ Aunt Jemima
+ Aviva Corcovado
+ B.O.B.
+ Baba Looey
+ Baba Voss
+ Baby Bop
+ Baby Yoda
+ Bad Bunny
+ Bagheera
+ Baldrick
+ Baloo
+ Balthazar Bratt
+ Bambi
+ Bamm-Bamm Rubble
+ Barbie
+ Barley Lightfoot
+ Barney
+ Barney Rubble
+ Barnyard Dawg
+ Baron Humbert von Gikkingen
+ Barry B. Benson
+ Bart
+ Bart Simpson
+ Bartok
+ Bastion Narrator
+ Batgirl
+ Batman
+ Baymax
+ Bayonetta
+ Baze Malbus
+ BB-8
+ Beaky Buzzard
+ Beast and Belle
+ Beast Boy
+ Beavis
+ Beetle Bailey
+ Belle and Beast
+ Belle and the Beast
+ Bender
+ Benjamin Buford \"Bubba\" Blue
+ Benson
+ Berlioz
+ Bertie
+ Betty Boop
+ Betty Cooper
+ Betty Rubble
+ Bicycle Repair Man
+ Biden
+ Biff Tannen
+ Big Boss
+ Big Daddy
+ Big Nate
+ Biggie
+ Biggus Dickus
+ Bikini Bottom
+ Bill Gates
+ Bill the Cat
+ Billie Eilish
+ Billy Mack
+ Birdie
+ Bishop
+ Bitzer
+ BJ
+ Black Adam
+ Black Knight
+ Black Panther
+ Black Widow
+ Blackadder
+ Blathers
+ Blondie Bumstead
+ Blue Beetle (Jaime Reyes)
+ BMO
+ Bo Peep
+ Bo-Katan Kryze
+ Bob Parr
+ Bob the Builder
+ Bob the Minion
+ Boba Fett
+ Bobby Hill
+ Boo
+ Boo-Boo Bear
+ Boog
+ Booker DeWitt
+ Boomhauer
+ Bow Lion
+ Bowser
+ Bowser Jr.
+ Bradley Jackson
+ Brain (Alan Powers)
+ Brainiac
+ Brainy Smurf
+ Brak
+ Brian Cohen
+ Bridget
+ Brock
+ Brock Samson
+ Brother Maynard
+ Bruno Mars
+ Buchanan
+ Bud Brigman
+ Buddy the Elf
+ Buddy Tyrannosaurus
+ Bugs Bunny
+ Bullwinkle
+ Buster Baxter
+ Buster Moon
+ Butt Head
+ Butt-Head
+ ButtHead
+ Buzz Lightyear
+ BuzzBee
+ C-3PO
+ Caesar
+ Caillou
+ Calcifer
+ Calvin
+ Calvin's Mom
+ Cap'n Crunch
+ Cap'n Turbot
+ Cappy
+ Captain America
+ Captain Crunch
+ Captain Falcon
+ Captain Gutt
+ Captain Haddock
+ Captain Huggyface
+ Captain Mainwaring
+ Captain Rex
+ Carl Fredricksen
+ Carol Danvers (Captain Marvel)
+ Carrie White
+ Carter
+ Cartoon mouse
+ Casper
+ Casper the Friendly Ghost
+ Cassandra
+ Cassian Andor
+ Catbus
+ Catwoman
+ Celeste
+ Chara
+ Charlie B. Barkin
+ Charlie Brown
+ Charlie Dog
+ Charlie the Tuna
+ Charlotte
+ Chef Boyardee
+ Chell
+ Chester Cheetah
+ Chester V
+ Chewbacca
+ Chickaletta
+ Chicken Joe
+ Chihiro Ogino
+ Chirrut Imwe
+ Chloe
+ Chris Evans
+ Chris Kratt
+ Christopher Robin
+ Chuck Noland
+ Chuckie Finster
+ Chun-Li
+ Cinderella
+ Cindy Lou Who
+ Ciri
+ Claptrap
+ Clarabel
+ Clarence
+ Clarence Odbody
+ Clark Griswold
+ classical animation
+ Claude Cat
+ Cleo
+ Cleveland
+ Clifford
+ Clint
+ Clinton
+ Clippy
+ Cloud Strife
+ Clumsy Smurf
+ Coach Beard
+ Coco
+ Cody Maverick
+ Cogsworth
+ Colonel Hathi
+ Colonel Miles Quaritch
+ Colonel Sanders
+ Commander Shepard
+ Compo Simmonite
+ Connie Maheswaran
+ Constantine
+ Coolidge
+ Coraline Jones
+ Cortana
+ Cory Ellison
+ Count Chocula
+ Count Dooku
+ Count Dracula
+ Cousin Eddie
+ Cranky
+ Crash Bandicoot
+ Crisbell
+ Cristiano Ronaldo
+ Cruella de Vil
+ Cruella de Ville
+ Cuphead
+ Curious George
+ Cyborg
+ Cyclops
+ D.Va
+ D.W. Read
+ Daddy Pig
+ Daddy Warbucks
+ Daffy Duck
+ Dagwood Bumstead
+ Daisy Duck
+ Dale Gribble
+ Daniel Tiger
+ Dante
+ Daphne Blake
+ Daredevil
+ Darkseid
+ Darth Vader
+ Dash Parr
+ David Brent
+ Deadpool
+ Deathstroke
+ Dee Dee
+ Del Boy
+ Demoman
+ Dennis
+ Dennis Mitchell
+ Devi D.
+ Dewey Duck
+ Dexter
+ Dib
+ Dib Membrane
+ Dick Tracy
+ Diddy Kong
+ Dig'em Frog
+ Dil Pickles
+ Dilbert
+ Din Djarin
+ Din Song
+ Disgust
+ Dixie Kong
+ Doc Hudson
+ Doctor Doom
+ Doctor Strange
+ Dogbert
+ Doja Cat
+ Don Lino
+ Don Pteranodon
+ Donald Duck
+ Donald Trump
+ Donkey Kong
+ Doom Slayer
+ Dora
+ Dora Marquez
+ Doraemon
+ Dorothy
+ Dorothy Gale
+ Dorothy the Dinosaur
+ Dorothy Turner
+ Dory
+ Doug Funnie
+ Dr. Cockroach
+ Dr. Frank-N-Furter
+ Dr. Girlfriend
+ Dr. Grace Augustine
+ Dr. Robotnik
+ Dr. Two Brains
+ Dracula
+ Drake
+ Drax
+ Drax the Destroyer
+ Drizella Tremaine
+ Dua Lipa
+ Duchess
+ Dudley Do-Right
+ Duffy Bear
+ Duke
+ Dumbo
+ Dwayne Johnson
+ Ebenezer Scrooge
+ Ed Bighead
+ Ed Sheeran
+ Eddie Valiant
+ Edgar Balthazar
+ Edith
+ Edna Mode
+ Edward
+ Edward Elric
+ Eep Crood
+ Eeyore
+ Egghead Jr.
+ Eisenhower
+ Elastigirl
+ Elektra
+ Ella
+ Ellen Ripley
+ Ellie
+ Elliot
+ Elmer Fudd
+ Elon Musk
+ Elroy Jetson
+ Elsa
+ Emile
+ Emily
+ Emily Dickinson
+ Emily Elizabeth Howard
+ Emma Watson
+ Emmett \"Doc\" Brown
+ Emperor Palpatine
+ Eren Yeager
+ Eric Cartman
+ Esmeralda
+ EVE
+ Evil Queen
+ Ewoks
+ Explosm Cyanide Character
+ Ezra Bridger
+ Fairy Godmother
+ Falco Lombardi
+ Father Ted Crilly
+ Felix the Cat
+ Ferb Fletcher
+ Ferdinand
+ Figaro
+ Fillmore
+ Finn
+ Finn the Human
+ Fiona
+ Fireman Sam
+ Fishlegs
+ Flint Lockwood
+ Flit
+ Flo
+ Flounder
+ Flynn Rider
+ Foghorn Leghorn
+ Ford
+ Forky
+ Forrest Gump
+ Fox McCloud
+ Francine Frensky
+ Francine Peters
+ Francois Turbot
+ Frank Spencer
+ Franklin
+ Fred
+ Fred Flintstone
+ Frieza
+ Frigga
+ Frisk
+ Frosty the Snowman
+ Frou-Frou
+ Frozone
+ Frylock
+ Fudgie the Whale
+ Gabi
+ Gabriela
+ Gambit
+ Gamora
+ Gandalf
+ Ganondorf
+ Garfield
+ Garfield's Nermal
+ Gargamel
+ Garnet
+ Garrus Vakarian
+ Gaz
+ Geico Gecko
+ General Grievous
+ Genie
+ Genji
+ Geoffrey the Giraffe
+ George Bailey
+ George Bush
+ George H. W. Bush
+ George Jetson
+ George McFly
+ George Pig
+ George W. Bush
+ Gerald
+ Gerald Johanssen
+ Geraldine Granger
+ Geralt
+ Geralt of Rivia
+ Ghost Rider
+ Gideon
+ Gidget
+ Ginny Grainger
+ GIR
+ GLaDOS
+ Globox
+ Gloria
+ Gloria the Hippo
+ Gobber
+ Gohan
+ Goko
+ Goku
+ Goldfish Cracker
+ Gon Freecss
+ Goofy
+ Gordon
+ Gordon Freeman
+ Gossamer
+ Gotham City
+ Gran
+ Gran'ma Ben
+ Grand Admiral Thrawn
+ Grandmaster
+ Granny
+ Green Lantern (Hal Jordan)
+ Greg Universe
+ Gregg
+ Grimace
+ Grogu
+ Grogu (Baby Yoda)
+ Gromit
+ Groot
+ Gru
+ Grug Crood
+ Guile
+ Gwen Stacy
+ Hagar the Horrible
+ Haku
+ HAL 9000
+ Hal Stewart / Tighten
+ Hamm
+ Hammy
+ Han Solo
+ Hancock
+ Handsome Jack
+ Haniwa
+ Happy Noodle Boy
+ Harding
+ Harley Quinn
+ Harold
+ Harris
+ Harrison
+ Harry Potter
+ Harry Styles
+ Harry Tasker
+ Hat Kid
+ Hawkeye
+ Hayes
+ Heat Miser
+ Heather
+ Hector
+ Hector the Bulldog
+ Heffer Wolfe
+ Hei Hei
+ Heihachi Mishima
+ Heimdall
+ Hela
+ Helen Hunt
+ Helen Tasker
+ Helga Pataki
+ Hello Kitty
+ Henery Hawk
+ Henry
+ Henry the Octopus
+ Hera Syndulla
+ Hercules
+ Hermey the Elf
+ Hi-5
+ Hiccup
+ Hiro
+ Hisoka
+ Hobbes
+ Hogwarts
+ Homer
+ Homer Simpson
+ Homestar Runner
+ Homsar
+ Honest John
+ Hoover
+ Horton
+ Howard Hughes
+ Hubie
+ Huckleberry Hound
+ Huey Duck
+ Hugo de Rune
+ Hulk
+ Human Torch
+ Humphrey Bogart
+ Hyacinth Bucket
+ H\u00e9ctor Rivera
+ Ian Lightfoot
+ Ice King
+ Incontinentia Buttocks
+ Indiana Jones
+ Inkling
+ Inspector Clouseau
+ Inspector Gadget
+ Inuyasha
+ Invader Skoodge
+ Invisible Woman
+ Iron Fist
+ Iron Man
+ Isaac Clarke
+ Isabelle
+ Itchy Itchiford
+ Ivy Valentine
+ Ja Morant
+ Jack Dawson
+ Jack Skellington
+ Jack-Jack
+ Jack-Jack Parr
+ Jackie Brown
+ Jackson
+ Jafar
+ Jailbreak
+ Jake from State Farm
+ Jake Sully
+ Jake the Dog
+ James
+ James Bond
+ James Henry Trotter
+ Jane Jetson
+ Jane Porter
+ Jasmine
+ Jean Grey
+ Jedi
+ Jeff
+ Jefferson
+ Jennifer Lopez
+ Jennifer Parker
+ Jenny Curran
+ Jerry
+ Jesper
+ Jess (the Cat)
+ Jessica Jones
+ Jessica Rabbit
+ Jessie
+ Jett
+ Jiji
+ Jill Valentine
+ Jim Raynor
+ Jiminy Cricket
+ Jimmy Z
+ Jin Kazama
+ Jinx
+ Joe Camel
+ Joe Gardner
+ Joe Rogan
+ Joel
+ Joel Miller
+ Johann Gambolputty...
+ John Connor
+ John Marston
+ John McClane
+ John Smith
+ Johnny
+ Johnny Bravo
+ Johnny C. (Nny)
+ Johnny Loughran
+ Johnson
+ JoJo McDodd
+ Jon Arbuckle
+ Josie McCoy
+ Judge Doom
+ Judy Hopps
+ Judy Jetson
+ Julian Pearce
+ Jumanji
+ Justin Bieber
+ Jyn Erso
+ K-2SO
+ K.K. Slider
+ Kaa
+ Kai
+ Kakashi Hatake
+ Kamala Harris
+ Kamala Khan (Ms. Marvel)
+ Kanga
+ Kanye West
+ Katchoo
+ Kate Bishop
+ Katie Mitchell
+ Kazooie
+ Kazuya Mishima
+ Keeley Jones
+ Ken
+ Ken Masters
+ Kendrick Lamar
+ Kennedy
+ Kerrigan
+ Kevin
+ Kevin McCallister
+ Kevin the Minion
+ Kiki
+ Killjoy
+ Killmonger (Erik Stevens)
+ Killua Zoldyck
+ Kim Kardashian
+ King Arthur
+ King Candy
+ King Dedede
+ King Gristle Jr.
+ King Julien
+ King Knight
+ King Louie
+ King of the Hill
+ King Triton
+ King Vitamin
+ Kingpin
+ Kirby
+ Kitana
+ Kofun
+ Koki
+ Kool-Aid
+ Kool-Aid Man
+ Korath
+ Korg
+ Kowalski
+ Kraglin
+ Kratos
+ Kris Kringle
+ Kristoff
+ Kurapika
+ Kuzco
+ Kyle Reese
+ Kylie Jenner
+ Kylo Ren
+ Lady Eboshi
+ Lady Gaga
+ Lady Sif
+ Lady Tremaine
+ LaMelo Ball
+ Lando Calrissian
+ Lara
+ Lara Croft
+ Larry the Lobster
+ Lars Barriga
+ Leanne Grayson
+ LeBron James
+ Lenny
+ Lenore
+ Leon Kennedy
+ Levi Ackerman
+ Lewis Hamilton
+ Lex Luthor
+ Liara T'Soni
+ Lieutenant Dan Taylor
+ Light Yagami
+ Lightning McQueen
+ Lil DeVille
+ Lilo
+ Lincoln
+ Linda Gunderson
+ Lindsey Brigman
+ Linus
+ Linus van Pelt
+ Lisa
+ Lisa Simpson
+ Little Caesar
+ Little Orphan Annie
+ Liu Kang
+ Logan
+ Loki
+ Lola Bunny
+ Lord Shen
+ Lorraine Baines
+ Lotso
+ Louie Duck
+ Luanne Platter
+ Lucas
+ Lucifer
+ Lucky Eddie
+ Lucky the Leprechaun
+ Lucy Van Pelt
+ Luigi
+ Luke
+ Luke Cage
+ Luke Skywalker
+ Luke Triton
+ Lumberjack
+ Lumi\u00e8re
+ M'Baku
+ Mace Windu
+ Madam Mim
+ Madame Adelaide Bonfamille
+ Madeline
+ Madison
+ Mae Borowski
+ Mafalda
+ Maggie Simpson
+ Maghra
+ Magilla Gorilla
+ Magneto
+ Mai Shiranui
+ Maisy
+ Malcolm Tucker
+ Maleficent
+ Mandalorian
+ Mandark
+ Mando
+ Manny
+ Marceline
+ Margaret Tiger
+ Marge
+ Marge Simpson
+ Margo
+ Margo Leadbetter
+ Maria Hill
+ Maria Rambeau
+ Mario
+ Mark Zuckerberg
+ Marlin
+ Marmaduke
+ Marshall
+ Martian Manhunter
+ Martin Bryce
+ Martin Kratt
+ Marty
+ Marty McFly
+ Marty the Zebra
+ Marvin Acme
+ Marvin the Martian
+ Mary Poppins
+ Master Chief
+ Master Shake
+ Mater
+ Mavis Dracula
+ Max Verstappen
+ Maximus
+ Maya Fey
+ Mayor Goodway
+ McKinley
+ Meatwad
+ Meeko
+ Meena
+ Mega Man
+ Megamind
+ Megara
+ Mei Kusakabe
+ Mel
+ Melman
+ Melman the Giraffe
+ Meowth
+ Merida
+ Merlin
+ Merryweather
+ Meta Knight
+ Metro Man
+ Micah Keith
+ Michigan J. Frog
+ Mickey
+ Miek
+ Mighty Mouse
+ Miguel
+ Miguel Rivera
+ Mike Wazowski
+ Miles Edgeworth
+ Miles Morales
+ Miley Cyrus
+ Millennium Falcon
+ Milo
+ Minion
+ Minnie Mouse
+ Miranda
+ Misa Amane
+ Misato Katsuragi
+ Mitch Kessler
+ Mitsurugi
+ Moana
+ Mojo Jojo
+ Monica Rambeau
+ Monkey D. Luffy
+ Monkeybone
+ Monroe
+ Monstro
+ Mordecai
+ Morris the Cat
+ Morty Smith
+ Mowgli
+ Mr. Bean
+ Mr. Clean
+ Mr. Creosote
+ Mr. Eric Praline
+ Mr. Incredible
+ Mr. Krabs
+ Mr. Peabody
+ Mr. Peanut
+ Mr. Potato Head
+ Mr. Potatohead
+ Mr. Toad
+ Mr. Wilson
+ Mrs. Brisby
+ Mrs. Brown
+ Mrs. Gump
+ Mrs. Merton
+ Mrs. Potts
+ Mrs. Spider
+ Ms. Pac-Man
+ Mufasa
+ Muffy Crosswire
+ Mugman
+ Mugsy
+ Mulan
+ Mummy Pig
+ Mushu
+ Nakia
+ Nala
+ Nami
+ Namor
+ Nancy
+ Naru
+ Naruto Uzumaki
+ Nate Shelley
+ Nathan
+ Nathan Drake
+ Nausica\u00e4
+ Nefertari Vivi
+ Negasonic Teenage Warhead
+ Nemo
+ Neo
+ Ness
+ Nessa Jenkins
+ Newt (Rebecca Jorden)
+ Neytiri
+ Nick Fury
+ Nick Wilde
+ Nicki Minaj
+ Nigel
+ Nightwing
+ Nina Williams
+ Nixon
+ No-Face
+ Nobita Nobi
+ Noddy
+ Norman
+ Norman Babcock
+ Norman Price
+ Obama
+ Obelix
+ Obi-Wan Kenobi
+ Octoling
+ Odie
+ Odin
+ Okoye
+ Olaf
+ Olive Oyl
+ Olivia Rodrigo
+ Opus
+ Ori
+ Oscar
+ Oswald
+ Ozzie
+ Pac-Man
+ Paddington Bear
+ Padm\u00e9 Amidala
+ Palpatine
+ Papa Smurf
+ Pappa Smurf
+ Papyrus
+ Pascal
+ Patrick Mahomes
+ Patrick Star
+ Patsy
+ Patsy Stone
+ Patti Mayonnaise
+ Pazu
+ Pebbles Flintstone
+ Peggy Carter
+ Peggy Hill
+ Penelope Pussycat
+ Pepe Le Pew
+ Peppa Pig
+ Percy
+ Perdita
+ Peridot
+ Perry
+ Perry the Platypus
+ Peter B. Parker
+ Peter Griffin
+ Peter Pan
+ Peter Parker
+ Peter Rabbit
+ Phil DeVille
+ Phineas Flynn
+ Phoebe Heyerdahl
+ Phoenix Wright
+ Phoney Bone
+ Piglet
+ Pikachu
+ Pillsbury Doughboy
+ Pingu
+ Pink Panther
+ Pinocchio
+ Plague Knight
+ Play-Doh Pete
+ Pluto
+ Po
+ Pocahontas
+ Poe Dameron
+ Polk
+ PomPom
+ Pongo
+ Pontius Pilate
+ Ponyo
+ Popeye
+ Poppin' Fresh (Pillsbury Doughboy)
+ Poppy Parnell
+ Porkchop
+ Porky Pig
+ Post Malone
+ Postman Pat
+ Potato Head
+ President
+ Prince Charming
+ Prince Eric
+ Prince Phillip
+ Prince Wednesday
+ Princess Bubblegum
+ Princess Leia
+ Princess Leia Organa
+ Princess Mononoke
+ Princess Peach
+ Princess Presto
+ Professor Layton
+ Professor X
+ Proto Man
+ Pumbaa
+ Punisher
+ Puss
+ Puss in Boots
+ Puss n Boots
+ Pyramid Head
+ Pyro
+ Queen Kane
+ Queen Xenomorph
+ Qui-Gon Jinn
+ Quick Draw McGraw
+ Quicksilver
+ R2-D2
+ Rabbids
+ Rafael
+ Raiden
+ Ralph
+ Ralph Wolf
+ Ralphie Parker
+ Randall Boggs
+ Rapunzel
+ Rayman
+ Reagan
+ Rebecca Welton
+ Red and Rover
+ Red Skull
+ Rei Ayanami
+ Reinhardt
+ Remy
+ Ren H\u00f6ek
+ Resetti
+ Rex
+ Rey
+ Reyna
+ Rheneas
+ Rhett Butler
+ Richie Rich
+ Rick
+ Rick Mitchell
+ Rick Sanchez
+ Rico
+ Ridley
+ Rigby
+ Rigsby
+ Rihanna
+ RJ
+ Road Runner
+ Robin (Dick Grayson)
+ Robo-Dog
+ Rocket Raccoon
+ Rocko
+ Rocky Balboa
+ Rodney Copperbottom
+ Rodney Trotter
+ Roger Klotz
+ Roger Rabbit
+ Roger the Shrubber
+ Ronald McDonald
+ Ronan the Accuser
+ Roo
+ Roosevelt
+ Roronoa Zoro
+ Rosalina
+ Rose DeWitt Bukater
+ Rosie
+ Rosita
+ Roxanne Ritchi
+ Roy Kent
+ Roy Mustang
+ Rudolph
+ Ruffnut
+ Russell
+ Ryder
+ Ryu
+ Sabine Wren
+ Sadie Miller
+ Sage
+ Sailor Moon
+ Sakura Haruno
+ Salad Fingers
+ Sam
+ Sam Fisher
+ Sam Sheepdog
+ Sam Sparks
+ Samurai Jack
+ Samus Aran
+ Sandy Cheeks
+ Sandy Crood
+ Santa Claus
+ Santa Claus (Kurt Russell)
+ Sarah Connor
+ Sarge
+ Sasuke
+ Satsuki
+ Saw Gerrera
+ Scarlet Overkill
+ Scarlet Witch
+ Scarlett Johansson
+ Scarlett O'Hara
+ Schroeder
+ Scooby
+ Scooby Doo
+ Scooby-Doo
+ Scooby-Dum
+ Scott Calvin
+ Scrat
+ Scrooge
+ Scrooge McDuck
+ Scuttle
+ Sean Turner
+ Sebastian
+ Secret Squirrel
+ Seiji Amasawa
+ Selena Gomez
+ Senua
+ Sephiroth
+ Sesshomaru
+ Shadow the Hedgehog
+ Shaggy Rogers
+ Shakira
+ Shang
+ Shang-Chi
+ Sharon Carter
+ Shaun
+ Shawn Mendes
+ Shazam
+ Sheeta
+ Shere Khan
+ Sherman
+ Shifu
+ Shin-chan
+ Shinji Ikari
+ Shiny Pteranodon
+ Shizuka Minamoto
+ Shizuku Tsukishima
+ Shovel Knight
+ Shrek
+ Shuri
+ Sid
+ Siegfried
+ Silver Surfer
+ Simba
+ Sir Bedevere
+ Sir Galahad the Pure
+ Sir Lancelot the Brave
+ Sir Robin
+ Sir Topham Hatt
+ Skarloey
+ Skye
+ Skywalker
+ Sluggo
+ Smiley Bone
+ Smokey Bear
+ Smurf
+ Smurfette
+ Snagglepuss
+ Snap, Crackle, and Pop
+ Sniffles
+ Sniper
+ Snoopy
+ Snotlout
+ Snow Miser
+ Snow White
+ Solaire of Astora
+ Solid Snake
+ Sonic the Hedgehog
+ Sonic's Knuckles
+ Sophie Hatter
+ Spam Waitress
+ Specter Knight
+ Speedy Gonzales
+ Spencer
+ Spider-Man
+ Spiderman
+ Sponge Bob
+ SpongeBob
+ SpongeBob Square Pants
+ SpongeBob SquarePants
+ SpongeBob's Grandma
+ Spuds MacKenzie
+ Spyro
+ Squall Leonhart
+ Squidward Tentacles
+ Srongbad
+ Star Lord
+ Star-Lord
+ Starfire
+ Stella
+ Stephen Curry
+ Steve
+ Steven Universe
+ Stewie Griffin
+ Stimpy
+ Stoick the Vast
+ Stromboli
+ Strong Bad
+ Stuart Little
+ Stuart the Minion
+ Sulley
+ Sully
+ Sumo
+ Suneo Honekawa
+ Super Why
+ Supergirl
+ Superman
+ Surtur
+ Susan Murphy
+ Susan Murphy / Ginormica
+ Susan Walker
+ Susie Carmichael
+ Sven
+ Swiper
+ Sylvester
+ T-1000
+ T-800
+ T-800 (The Terminator)
+ T-Bone
+ Taft
+ Tai Lung
+ Takeshi Goda (Gian)
+ Talos
+ Tamacti Jun
+ Tank Girl
+ Tarzan
+ Tasmanian Devil
+ Taylor
+ Taylor Swift
+ Taz
+ Team Rocket
+ Ted Lasso
+ Ted Wiggins
+ Teddy Ruxpin
+ Teemo
+ Terminator
+ Terry Bogard
+ Thanatos
+ Thanos
+ The \"It's\" Man
+ The Aflac Duck
+ The Android Robot
+ The Apple Logo Face
+ The Avengers
+ The Beast and Belle
+ The Brawny Lumberjack
+ The Bride
+ The Burger King
+ The Butcher
+ The California Raisins
+ The Camel (Joe Camel)
+ The Caterpillar
+ The Charmin Bears
+ The Cheerios Bee
+ The Cheshire Cat
+ The Collector
+ The Conductor
+ The Death Star
+ The Energizer Bunny
+ The Ewoks
+ The Flash
+ The French Taunter
+ The Froot Brute
+ The Geico Gecko
+ The Ghost of Christmas Past
+ The Ghost of Christmas Present
+ The Ghost of Christmas Yet to Come
+ The Goldfish Cracker
+ The Green Giant
+ The Grinch
+ The Gumbys
+ The Hamburglar
+ The Joker
+ The Jolly Green Giant
+ The Killer Rabbit of Caerbannog
+ The Knights Who Say \"Ni\"
+ The Kool-Aid Man
+ The Laughing Cow
+ The Lego Minifigure
+ The Liberty Mutual Emu
+ The Little Green Sprout
+ The Little Prince
+ The M&M's Characters
+ The Mad Hatter
+ The Mandarin
+ The Michelin Man
+ The Missing Link
+ The Monarch
+ The Monopoly Man
+ The Morton Salt Girl
+ The Nesquik Bunny
+ The Noid
+ The Once-ler
+ The Planters Peanut
+ The Planters Peanut (Mr. Peanut)
+ The Red and Yellow M&M's
+ The Scrubbing Bubbles
+ The Starbucks Mermaid
+ The Sun (Raisin Bran)
+ The Taco Bell Chihuahua
+ The Travelocity Gnome
+ The Trix Rabbit
+ The Vault Hunters
+ The Weeknd
+ The White Rabbit
+ Theo
+ Thomas
+ Thomas O'Malley
+ Thor
+ Thorn Harvestar
+ Thrall
+ Thunk Crood
+ Tiana
+ Tidus
+ Tifa Lockhart
+ Tigger
+ Tigress
+ Tim
+ Tim Lockwood
+ Tim the Enchanter
+ Timmy
+ Timmy Brisby
+ Timon
+ Tinker Bell
+ Tintin
+ Tiny Diamond
+ Tiny Pteranodon
+ Tiny Tim
+ Toad
+ Toby
+ Tom
+ Tom And Jerry
+ Tom Holland
+ Tom Nook
+ Tommy Pickles
+ Tony
+ Tony the Tiger
+ Top Cat
+ Totoro
+ Toucan Sam
+ Tracker
+ Travis Scott
+ Trevor Phillips
+ Triss Merigold
+ Trix Rabbit
+ Truman
+ Trump
+ Tuffnut
+ Tweety Bird
+ Tyler
+ Ultron
+ Uncle Ben
+ Ursula
+ Usagi Tsukino (Sailor Moon)
+ Usagi Yojimbo
+ Usopp
+ Valerie Brown
+ Valiente
+ Valkyrie
+ Van Buren
+ Vanellope
+ Vegeta
+ Velma Dinkley
+ Venom
+ Verne
+ Veronica Lodge
+ Victor Frankenstein
+ Victor Meldrew
+ Violet Parr
+ Vivo
+ WALL-E
+ Wallace
+ Walter Hobbs
+ Waluigi
+ Wario
+ Warren Cave
+ Washington
+ Wendy
+ Wheatley
+ Widowmaker
+ Wilbur
+ Wildcat
+ Wile E. Coyote
+ Will Hunting
+ Will Smith
+ Willie T. Stokes
+ Wilma
+ Wilma Flintstone
+ Wilson
+ Wimpy
+ Winnie the Pooh
+ Winry Rockbell
+ Winter Soldier
+ Witch Hazel
+ Wolverine
+ Wonder Red
+ Wonder Woman
+ Woody
+ Woody Woodpecker
+ Woofster
+ WordGirl
+ Wybie Lovat
+ X-23
+ X-Men
+ Yasuo
+ Yoda
+ Yogi Bear
+ Yon-Rogg
+ Yondu
+ Yosemite Sam
+ Yoshi
+ Yoshimitsu
+ Yubaba
+ Yukon Cornelius
+ Yummy Mummy
+ Yuna
+ Zagreus
+ Zatanna
+ Zazu
+ Zelda
+ Zelda's Sheik
+ Zendaya
+ Zim
+ Zorak
+ Zuma
+ Zurg
guardrail/blocklist/nltk_data/corpora/wordnet.zip ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:cbda5ea6eef7f36a97a43d4a75f85e07fccbb4f23657d27b4ccbc93e2646ab59
+ size 10775600
guardrail/blocklist/nltk_data/tokenizers/punkt_tab.zip ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:c2b16c23d738effbdc5789d7aa601397c13ba2819bf922fb904687f3f16657ed
+ size 4259017