LoneStriker committed commit c496583
Parent(s): aac9448
Upload folder using huggingface_hub
Browse files:
- README.md +55 -0
- config.json +30 -0
- huggingface-metadata.txt +10 -0
- mergekit_moe_config.yml +40 -0
- model.safetensors.index.json +1 -0
- output-00001-of-00002.safetensors +3 -0
- output-00002-of-00002.safetensors +3 -0
- special_tokens_map.json +24 -0
- tokenizer.json +0 -0
- tokenizer.model +3 -0
- tokenizer_config.json +42 -0
README.md
ADDED
@@ -0,0 +1,55 @@
---
license: apache-2.0
language:
- en
tags:
- merge
---
![image/jpeg](https://cdn-uploads.huggingface.co/production/uploads/6589d7e6586088fd2784a12c/7JsqBt8QRiZmcMh-ameqH.jpeg)

# It's alive!!!! Half the size of Mixtral Instruct 8x7B, and better on GSM8k and Winogrande!

A frankenMoE built using only DPO models. To be used with chat-instruct mode enabled.

![image/png](https://cdn-uploads.huggingface.co/production/uploads/6589d7e6586088fd2784a12c/wGRcusncUd-mCdksvYckY.png)

![image/png](https://cdn-uploads.huggingface.co/production/uploads/6589d7e6586088fd2784a12c/rx1GfLMEIP3T-r3bxqW9r.png)

- [SanjiWatsuki/Kunoichi-DPO-v2-7B](https://huggingface.co/SanjiWatsuki/Kunoichi-DPO-v2-7B) - router
- [udkai/Turdus](https://huggingface.co/udkai/Turdus) - expert #1
- [argilla/distilabeled-Marcoro14-7B-slerp](https://huggingface.co/argilla/distilabeled-Marcoro14-7B-slerp) - expert #2
- [SanjiWatsuki/Kunoichi-DPO-v2-7B](https://huggingface.co/SanjiWatsuki/Kunoichi-DPO-v2-7B) - expert #3
- [Neuronovo/neuronovo-9B-v0.3](https://huggingface.co/Neuronovo/neuronovo-9B-v0.3) - expert #4
# "[What is a Mixture of Experts (MoE)?](https://huggingface.co/blog/moe)"
### (from the Hugging Face MoE blog post; click the quoted question above to navigate to it directly.)

The scale of a model is one of the most important axes for better model quality. Given a fixed computing budget, training a larger model for fewer steps is better than training a smaller model for more steps.

Mixture of Experts enables models to be pretrained with far less compute, which means you can dramatically scale up the model or dataset size with the same compute budget as a dense model. In particular, a MoE model should achieve the same quality as its dense counterpart much faster during pretraining.

So, what exactly is a MoE? In the context of transformer models, a MoE consists of two main elements:

Sparse MoE layers are used instead of dense feed-forward network (FFN) layers. MoE layers have a certain number of "experts" (e.g. 4 in this frankenMoE), where each expert is a neural network. In practice, the experts are FFNs, but they can also be more complex networks or even a MoE itself, leading to hierarchical MoEs!

A gate network or router, that determines which tokens are sent to which expert. For example, in the image below, the token "More" is sent to the second expert, and the token "Parameters" is sent to the first. As we'll explore later, we can send a token to more than one expert. How to route a token to an expert is one of the big decisions when working with MoEs: the router is composed of learned parameters and is pretrained at the same time as the rest of the network.

At every layer, for every token, a router network chooses two of these groups (the "experts") to process the token and combines their outputs additively.
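That top-2 routing step can be sketched in a few lines. This is a toy illustration with numpy, not the actual Mixtral implementation; the expert networks here are stand-in linear maps:

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def moe_layer(x, router_w, experts, top_k=2):
    """Route one token's hidden state x through its top-k experts."""
    logits = router_w @ x                    # one logit per expert
    probs = softmax(logits)
    top = np.argsort(probs)[-top_k:]         # indices of the top-k experts
    weights = probs[top] / probs[top].sum()  # renormalize over chosen experts
    # combine the chosen experts' outputs additively, weighted by the gate
    return sum(w * experts[i](x) for w, i in zip(weights, top))

rng = np.random.default_rng(0)
hidden, n_experts = 8, 4
router_w = rng.normal(size=(n_experts, hidden))
# stand-in experts: each is just a fixed random linear map
experts = [lambda x, W=rng.normal(size=(hidden, hidden)): W @ x
           for _ in range(n_experts)]
y = moe_layer(rng.normal(size=hidden), router_w, experts)
```

Only the two selected experts are evaluated for the token, which is where the compute savings come from.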
![image/png](https://cdn-uploads.huggingface.co/production/uploads/6589d7e6586088fd2784a12c/up_I0R2TQGjqTShZp_1Sz.png)

Switch Layer: MoE layer from the [Switch Transformers paper](https://arxiv.org/abs/2101.03961)

So, to recap, in MoEs we replace every FFN layer of the transformer model with an MoE layer, which is composed of a gate network and a certain number of experts.

Although MoEs provide benefits like efficient pretraining and faster inference compared to dense models, they also come with challenges:

Training: MoEs enable significantly more compute-efficient pretraining, but they have historically struggled to generalize during fine-tuning, leading to overfitting.

Inference: Although a MoE might have many parameters, only some of them are used during inference. This leads to much faster inference compared to a dense model with the same number of parameters. However, all parameters need to be loaded in RAM, so memory requirements are high. For example, [given a MoE like Mixtral 8x7B](https://huggingface.co/blog/moe), we'll need enough VRAM to hold a dense 47B-parameter model. Why 47B parameters and not 8 x 7B = 56B? Because in MoE models, only the FFN layers are treated as individual experts, and the rest of the model parameters are shared. At the same time, assuming just two experts are used per token, the inference speed (FLOPs) is like using a 12B model (as opposed to a 14B model), because it computes 2x 7B matrix multiplications, but with some layers shared (more on this soon).
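The 47B-vs-56B arithmetic follows directly from Mixtral's published dimensions (hidden 4096, FFN 14336, 32 layers, 8 experts, 8 KV heads, 32k vocab). A back-of-the-envelope check, ignoring norms and the tiny router:

```python
hidden, ffn, layers, vocab = 4096, 14336, 32, 32000
n_experts, top_k = 8, 2
kv_dim = 8 * (hidden // 32)               # 8 KV heads of dim 128 (GQA)

expert = 3 * hidden * ffn * layers        # w1, w2, w3 in every layer
attn = (2 * hidden * hidden + 2 * hidden * kv_dim) * layers  # q,o + k,v
embed = 2 * vocab * hidden                # embeddings + lm_head (untied)
shared = attn + embed                     # parameters used for every token

total = shared + n_experts * expert       # ~46.7B, not 8 x 7B = 56B
active = shared + top_k * expert          # ~12.9B touched per token

print(f"total {total / 1e9:.1f}B, active {active / 1e9:.1f}B")
# total 46.7B, active 12.9B
```

The shared attention and embedding weights are counted once, so eight 7B experts come to ~47B total, and two active experts plus the shared layers land near the "like a 12B model" figure quoted above.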
If all our tokens are sent to just a few popular experts, training becomes inefficient. In normal MoE training, the gating network tends to converge to activating mostly the same few experts. This self-reinforces, as favored experts are trained quicker and hence selected more often. To mitigate this, an auxiliary loss is added to encourage giving all experts roughly equal importance, ensuring that all experts receive a roughly equal number of training examples. Related is the concept of expert capacity, which sets a threshold on how many tokens an expert can process. In transformers, the auxiliary loss is exposed via the aux_loss parameter.
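The Switch Transformers form of that auxiliary loss multiplies, per expert, the fraction of tokens routed to it by the mean router probability it receives, then scales by the number of experts so a perfectly uniform router scores exactly 1.0. A minimal sketch (the function itself is illustrative, not the transformers implementation):

```python
import numpy as np

def load_balancing_loss(router_probs):
    """router_probs: (tokens, experts) softmax outputs of the gate."""
    n_tokens, n_experts = router_probs.shape
    assignments = router_probs.argmax(axis=1)                     # top-1 pick
    f = np.bincount(assignments, minlength=n_experts) / n_tokens  # token share
    p = router_probs.mean(axis=0)                                 # mean gate prob
    return n_experts * float(f @ p)   # 1.0 when routing is perfectly uniform

uniform = np.full((8, 4), 0.25)       # every expert equally likely
print(load_balancing_loss(uniform))   # 1.0
```

Concentrating probability mass on one expert pushes the value above 1, so minimizing it nudges the router toward spreading tokens out.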
## "Wait...but you called this a frankenMoE?"
The difference between a true MoE and a "frankenMoE" lies in the fact that the router layer in a model like the one in this repo was not trained simultaneously with the experts; it is initialized at merge time instead.
config.json
ADDED
@@ -0,0 +1,30 @@
{
  "_name_or_path": "SanjiWatsuki/Kunoichi-DPO-v2-7B",
  "architectures": [
    "MixtralForCausalLM"
  ],
  "attention_dropout": 0.0,
  "bos_token_id": 1,
  "eos_token_id": 2,
  "hidden_act": "silu",
  "hidden_size": 4096,
  "initializer_range": 0.02,
  "intermediate_size": 14336,
  "max_position_embeddings": 8192,
  "model_type": "mixtral",
  "num_attention_heads": 32,
  "num_experts_per_tok": 2,
  "num_hidden_layers": 32,
  "num_key_value_heads": 8,
  "num_local_experts": 4,
  "output_router_logits": false,
  "rms_norm_eps": 1e-05,
  "rope_theta": 10000.0,
  "router_aux_loss_coef": 0.001,
  "sliding_window": null,
  "tie_word_embeddings": false,
  "torch_dtype": "float16",
  "transformers_version": "4.37.0.dev0",
  "use_cache": true,
  "vocab_size": 32000
}
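Plugging this config's dimensions into the usual Mixtral parameter accounting (three FFN matrices per expert per layer, GQA attention, untied embeddings; norms and router ignored) suggests roughly a 24B-total, ~13B-active model, which is consistent with the "half the size" of Mixtral 8x7B claim in the README. A rough sketch:

```python
# values taken from the config.json above
hidden, ffn, layers = 4096, 14336, 32
vocab, n_experts, top_k = 32000, 4, 2
kv_dim = 8 * (hidden // 32)                  # num_key_value_heads * head_dim

expert = 3 * hidden * ffn * layers           # one expert's FFN weights
shared = (2 * hidden * hidden + 2 * hidden * kv_dim) * layers \
         + 2 * vocab * hidden                # attention + embeddings/lm_head

total = shared + n_experts * expert          # ~24.2B parameters
active = shared + top_k * expert             # ~12.9B used per token
```

Since num_experts_per_tok is still 2, per-token compute is about the same as Mixtral's; only the total memory footprint halves.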
huggingface-metadata.txt
ADDED
@@ -0,0 +1,10 @@
url: https://huggingface.co/Kquant03/FrankenDPO-4x7B-bf16
branch: main
download date: 2024-01-16 14:23:34
sha256sum:
    4d35281a3e88f5a2966e48d6bfc4629fb21b3d4be53d3e7c3c82fbf97ae80a15 model-00001-of-00005.safetensors
    cdd61403c1ba57b311d4907972faf6d1f4e1ea40d71b44c4b894a68c2a05d218 model-00002-of-00005.safetensors
    4c12260fa6d13943d719abfe2e41fdcdeb672c79437ae2fb0f6cbdc868128630 model-00003-of-00005.safetensors
    b2ba2235cbfd7ac96b255774116eeebf8fef028f67706c6ec05d42f41facc061 model-00004-of-00005.safetensors
    219714d908032bdde0b1218e6ac09c80dbc4a6b437fa42502f4ce8032f414dea model-00005-of-00005.safetensors
    dadfd56d766715c61d2ef780a525ab43b8e6da4de6865bda3d95fdef5e134055 tokenizer.model
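The sha256sum block above can be checked after download with a short script. This is a generic sketch (the `verify` helper and its arguments are illustrative; point `model_dir` at wherever the shards actually live):

```python
import hashlib
from pathlib import Path

def sha256_of(path, chunk=1 << 20):
    """Stream the file through sha256 so multi-GB shards never sit in RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def verify(metadata_file, model_dir="."):
    """Compare each '<hash> <name>' line against the file on disk."""
    ok = True
    for line in Path(metadata_file).read_text().splitlines():
        parts = line.split()
        if len(parts) == 2 and len(parts[0]) == 64:   # looks like a sha256 line
            expected, name = parts
            actual = sha256_of(Path(model_dir) / name)
            ok &= (actual == expected)
            print(("OK  " if actual == expected else "BAD ") + name)
    return ok
```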
mergekit_moe_config.yml
ADDED
@@ -0,0 +1,40 @@
base_model: SanjiWatsuki/Kunoichi-DPO-v2-7B
gate_mode: hidden
dtype: bfloat16
experts:
  - source_model: udkai/Turdus
    positive_prompts:
      - "helpful"
      - "Relevant"
      - "Factual"
      - "Precise"
      - "Descriptive"
  - source_model: argilla/distilabeled-Marcoro14-7B-slerp
    positive_prompts:
      - "Math"
      - "Programming"
      - "Engineering"
      - "Science"
    negative_prompts:
      - "inaccurate"
      - "incorrect"
  - source_model: SanjiWatsuki/Kunoichi-DPO-v2-7B
    positive_prompts:
      - "Discuss"
      - "Chat"
      - "engaging"
      - "stimulating"
    negative_prompts:
      - "Sorry"
      - "As an AI"
      - "cannot"
      - "not capable"
  - source_model: Neuronovo/neuronovo-7B-v0.3
    positive_prompts:
      - "logical"
      - "informative"
      - "proper solution"
      - "innovative"
    negative_prompts:
      - "unhelpful"
      - "inaccurate"
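With `gate_mode: hidden`, mergekit derives each router row from the base model's hidden representations of the prompts listed above, roughly: average the hidden states of an expert's positive prompts and subtract the average of its negative ones. The snippet below is a conceptual illustration of that idea using random stand-in embeddings, not mergekit's actual code:

```python
import numpy as np

hidden = 16

def embed(prompt):
    # stand-in for running the base model and taking a layer's hidden state
    seed = abs(hash(prompt)) % (2 ** 32)
    return np.random.default_rng(seed).normal(size=hidden)

def gate_row(positive, negative=()):
    """Positive prompts pull the gate toward an expert, negatives push away."""
    pos = np.mean([embed(p) for p in positive], axis=0)
    neg = np.mean([embed(p) for p in negative], axis=0) if negative else 0.0
    return pos - neg

# one row per expert, mirroring the first two entries of the config above
router = np.stack([
    gate_row(["helpful", "Relevant", "Factual", "Precise", "Descriptive"]),
    gate_row(["Math", "Programming", "Engineering", "Science"],
             ["inaccurate", "incorrect"]),
])
```

Tokens whose hidden states point in the same direction as an expert's gate row then receive high router logits for that expert, which is how an untrained router can still route sensibly.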
model.safetensors.index.json
ADDED
@@ -0,0 +1 @@
{"metadata": {"mergekit_version": "0.0.3.2"}, "weight_map": {"model.embed_tokens.weight": "model-00001-of-00005.safetensors", "model.norm.weight": "model-00001-of-00005.safetensors", "lm_head.weight": "model-00001-of-00005.safetensors", "model.layers.0.input_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.1.input_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.2.input_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.3.input_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.4.input_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.5.input_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.6.input_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.7.input_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.8.input_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.9.input_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.10.input_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.11.input_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.12.input_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.13.input_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.14.input_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.15.input_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.16.input_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.17.input_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.18.input_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.19.input_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.20.input_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.21.input_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.22.input_layernorm.weight": 
"model-00001-of-00005.safetensors", "model.layers.23.input_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.24.input_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.25.input_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.26.input_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.27.input_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.28.input_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.29.input_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.30.input_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.31.input_layernorm.weight": "model-00001-of-00005.safetensors", "model.layers.0.block_sparse_moe.experts.0.w3.weight": "model-00001-of-00005.safetensors", "model.layers.0.block_sparse_moe.experts.1.w3.weight": "model-00001-of-00005.safetensors", "model.layers.0.block_sparse_moe.experts.2.w3.weight": "model-00001-of-00005.safetensors", "model.layers.0.block_sparse_moe.experts.3.w3.weight": "model-00001-of-00005.safetensors", "model.layers.1.block_sparse_moe.experts.0.w3.weight": "model-00001-of-00005.safetensors", "model.layers.1.block_sparse_moe.experts.1.w3.weight": "model-00001-of-00005.safetensors", "model.layers.1.block_sparse_moe.experts.2.w3.weight": "model-00001-of-00005.safetensors", "model.layers.1.block_sparse_moe.experts.3.w3.weight": "model-00001-of-00005.safetensors", "model.layers.2.block_sparse_moe.experts.0.w3.weight": "model-00001-of-00005.safetensors", "model.layers.2.block_sparse_moe.experts.1.w3.weight": "model-00001-of-00005.safetensors", "model.layers.2.block_sparse_moe.experts.2.w3.weight": "model-00001-of-00005.safetensors", "model.layers.2.block_sparse_moe.experts.3.w3.weight": "model-00001-of-00005.safetensors", "model.layers.3.block_sparse_moe.experts.0.w3.weight": "model-00001-of-00005.safetensors", "model.layers.3.block_sparse_moe.experts.1.w3.weight": 
"model-00001-of-00005.safetensors", "model.layers.3.block_sparse_moe.experts.2.w3.weight": "model-00001-of-00005.safetensors", "model.layers.3.block_sparse_moe.experts.3.w3.weight": "model-00001-of-00005.safetensors", "model.layers.4.block_sparse_moe.experts.0.w3.weight": "model-00001-of-00005.safetensors", "model.layers.4.block_sparse_moe.experts.1.w3.weight": "model-00001-of-00005.safetensors", "model.layers.4.block_sparse_moe.experts.2.w3.weight": "model-00001-of-00005.safetensors", "model.layers.4.block_sparse_moe.experts.3.w3.weight": "model-00001-of-00005.safetensors", "model.layers.5.block_sparse_moe.experts.0.w3.weight": "model-00001-of-00005.safetensors", "model.layers.5.block_sparse_moe.experts.1.w3.weight": "model-00001-of-00005.safetensors", "model.layers.5.block_sparse_moe.experts.2.w3.weight": "model-00001-of-00005.safetensors", "model.layers.5.block_sparse_moe.experts.3.w3.weight": "model-00001-of-00005.safetensors", "model.layers.6.block_sparse_moe.experts.0.w3.weight": "model-00001-of-00005.safetensors", "model.layers.6.block_sparse_moe.experts.1.w3.weight": "model-00001-of-00005.safetensors", "model.layers.6.block_sparse_moe.experts.2.w3.weight": "model-00001-of-00005.safetensors", "model.layers.6.block_sparse_moe.experts.3.w3.weight": "model-00001-of-00005.safetensors", "model.layers.7.block_sparse_moe.experts.0.w3.weight": "model-00001-of-00005.safetensors", "model.layers.7.block_sparse_moe.experts.1.w3.weight": "model-00001-of-00005.safetensors", "model.layers.7.block_sparse_moe.experts.2.w3.weight": "model-00001-of-00005.safetensors", "model.layers.7.block_sparse_moe.experts.3.w3.weight": "model-00001-of-00005.safetensors", "model.layers.8.block_sparse_moe.experts.0.w3.weight": "model-00001-of-00005.safetensors", "model.layers.8.block_sparse_moe.experts.1.w3.weight": "model-00001-of-00005.safetensors", "model.layers.8.block_sparse_moe.experts.2.w3.weight": "model-00001-of-00005.safetensors", 
"model.layers.8.block_sparse_moe.experts.3.w3.weight": "model-00001-of-00005.safetensors", "model.layers.9.block_sparse_moe.experts.0.w3.weight": "model-00001-of-00005.safetensors", "model.layers.9.block_sparse_moe.experts.1.w3.weight": "model-00001-of-00005.safetensors", "model.layers.9.block_sparse_moe.experts.2.w3.weight": "model-00001-of-00005.safetensors", "model.layers.9.block_sparse_moe.experts.3.w3.weight": "model-00001-of-00005.safetensors", "model.layers.10.block_sparse_moe.experts.0.w3.weight": "model-00001-of-00005.safetensors", "model.layers.10.block_sparse_moe.experts.1.w3.weight": "model-00001-of-00005.safetensors", "model.layers.10.block_sparse_moe.experts.2.w3.weight": "model-00001-of-00005.safetensors", "model.layers.10.block_sparse_moe.experts.3.w3.weight": "model-00001-of-00005.safetensors", "model.layers.11.block_sparse_moe.experts.0.w3.weight": "model-00001-of-00005.safetensors", "model.layers.11.block_sparse_moe.experts.1.w3.weight": "model-00001-of-00005.safetensors", "model.layers.11.block_sparse_moe.experts.2.w3.weight": "model-00001-of-00005.safetensors", "model.layers.11.block_sparse_moe.experts.3.w3.weight": "model-00001-of-00005.safetensors", "model.layers.12.block_sparse_moe.experts.0.w3.weight": "model-00001-of-00005.safetensors", "model.layers.12.block_sparse_moe.experts.1.w3.weight": "model-00001-of-00005.safetensors", "model.layers.12.block_sparse_moe.experts.2.w3.weight": "model-00001-of-00005.safetensors", "model.layers.12.block_sparse_moe.experts.3.w3.weight": "model-00001-of-00005.safetensors", "model.layers.13.block_sparse_moe.experts.0.w3.weight": "model-00001-of-00005.safetensors", "model.layers.13.block_sparse_moe.experts.1.w3.weight": "model-00001-of-00005.safetensors", "model.layers.13.block_sparse_moe.experts.2.w3.weight": "model-00001-of-00005.safetensors", "model.layers.13.block_sparse_moe.experts.3.w3.weight": "model-00001-of-00005.safetensors", "model.layers.14.block_sparse_moe.experts.0.w3.weight": 
"model-00001-of-00005.safetensors", "model.layers.14.block_sparse_moe.experts.1.w3.weight": "model-00001-of-00005.safetensors", "model.layers.14.block_sparse_moe.experts.2.w3.weight": "model-00001-of-00005.safetensors", "model.layers.14.block_sparse_moe.experts.3.w3.weight": "model-00001-of-00005.safetensors", "model.layers.15.block_sparse_moe.experts.0.w3.weight": "model-00001-of-00005.safetensors", "model.layers.15.block_sparse_moe.experts.1.w3.weight": "model-00001-of-00005.safetensors", "model.layers.15.block_sparse_moe.experts.2.w3.weight": "model-00001-of-00005.safetensors", "model.layers.15.block_sparse_moe.experts.3.w3.weight": "model-00001-of-00005.safetensors", "model.layers.16.block_sparse_moe.experts.0.w3.weight": "model-00001-of-00005.safetensors", "model.layers.16.block_sparse_moe.experts.1.w3.weight": "model-00001-of-00005.safetensors", "model.layers.16.block_sparse_moe.experts.2.w3.weight": "model-00001-of-00005.safetensors", "model.layers.16.block_sparse_moe.experts.3.w3.weight": "model-00001-of-00005.safetensors", "model.layers.17.block_sparse_moe.experts.0.w3.weight": "model-00001-of-00005.safetensors", "model.layers.17.block_sparse_moe.experts.1.w3.weight": "model-00001-of-00005.safetensors", "model.layers.17.block_sparse_moe.experts.2.w3.weight": "model-00001-of-00005.safetensors", "model.layers.17.block_sparse_moe.experts.3.w3.weight": "model-00001-of-00005.safetensors", "model.layers.18.block_sparse_moe.experts.0.w3.weight": "model-00001-of-00005.safetensors", "model.layers.18.block_sparse_moe.experts.1.w3.weight": "model-00001-of-00005.safetensors", "model.layers.18.block_sparse_moe.experts.2.w3.weight": "model-00001-of-00005.safetensors", "model.layers.18.block_sparse_moe.experts.3.w3.weight": "model-00001-of-00005.safetensors", "model.layers.19.block_sparse_moe.experts.0.w3.weight": "model-00001-of-00005.safetensors", "model.layers.19.block_sparse_moe.experts.1.w3.weight": "model-00001-of-00005.safetensors", 
"model.layers.19.block_sparse_moe.experts.2.w3.weight": "model-00001-of-00005.safetensors", "model.layers.19.block_sparse_moe.experts.3.w3.weight": "model-00001-of-00005.safetensors", "model.layers.20.block_sparse_moe.experts.0.w3.weight": "model-00002-of-00005.safetensors", "model.layers.20.block_sparse_moe.experts.1.w3.weight": "model-00002-of-00005.safetensors", "model.layers.20.block_sparse_moe.experts.2.w3.weight": "model-00002-of-00005.safetensors", "model.layers.20.block_sparse_moe.experts.3.w3.weight": "model-00002-of-00005.safetensors", "model.layers.21.block_sparse_moe.experts.0.w3.weight": "model-00002-of-00005.safetensors", "model.layers.21.block_sparse_moe.experts.1.w3.weight": "model-00002-of-00005.safetensors", "model.layers.21.block_sparse_moe.experts.2.w3.weight": "model-00002-of-00005.safetensors", "model.layers.21.block_sparse_moe.experts.3.w3.weight": "model-00002-of-00005.safetensors", "model.layers.22.block_sparse_moe.experts.0.w3.weight": "model-00002-of-00005.safetensors", "model.layers.22.block_sparse_moe.experts.1.w3.weight": "model-00002-of-00005.safetensors", "model.layers.22.block_sparse_moe.experts.2.w3.weight": "model-00002-of-00005.safetensors", "model.layers.22.block_sparse_moe.experts.3.w3.weight": "model-00002-of-00005.safetensors", "model.layers.23.block_sparse_moe.experts.0.w3.weight": "model-00002-of-00005.safetensors", "model.layers.23.block_sparse_moe.experts.1.w3.weight": "model-00002-of-00005.safetensors", "model.layers.23.block_sparse_moe.experts.2.w3.weight": "model-00002-of-00005.safetensors", "model.layers.23.block_sparse_moe.experts.3.w3.weight": "model-00002-of-00005.safetensors", "model.layers.24.block_sparse_moe.experts.0.w3.weight": "model-00002-of-00005.safetensors", "model.layers.24.block_sparse_moe.experts.1.w3.weight": "model-00002-of-00005.safetensors", "model.layers.24.block_sparse_moe.experts.2.w3.weight": "model-00002-of-00005.safetensors", "model.layers.24.block_sparse_moe.experts.3.w3.weight": 
"model-00002-of-00005.safetensors", "model.layers.25.block_sparse_moe.experts.0.w3.weight": "model-00002-of-00005.safetensors", "model.layers.25.block_sparse_moe.experts.1.w3.weight": "model-00002-of-00005.safetensors", "model.layers.25.block_sparse_moe.experts.2.w3.weight": "model-00002-of-00005.safetensors", "model.layers.25.block_sparse_moe.experts.3.w3.weight": "model-00002-of-00005.safetensors", "model.layers.26.block_sparse_moe.experts.0.w3.weight": "model-00002-of-00005.safetensors", "model.layers.26.block_sparse_moe.experts.1.w3.weight": "model-00002-of-00005.safetensors", "model.layers.26.block_sparse_moe.experts.2.w3.weight": "model-00002-of-00005.safetensors", "model.layers.26.block_sparse_moe.experts.3.w3.weight": "model-00002-of-00005.safetensors", "model.layers.27.block_sparse_moe.experts.0.w3.weight": "model-00002-of-00005.safetensors", "model.layers.27.block_sparse_moe.experts.1.w3.weight": "model-00002-of-00005.safetensors", "model.layers.27.block_sparse_moe.experts.2.w3.weight": "model-00002-of-00005.safetensors", "model.layers.27.block_sparse_moe.experts.3.w3.weight": "model-00002-of-00005.safetensors", "model.layers.28.block_sparse_moe.experts.0.w3.weight": "model-00002-of-00005.safetensors", "model.layers.28.block_sparse_moe.experts.1.w3.weight": "model-00002-of-00005.safetensors", "model.layers.28.block_sparse_moe.experts.2.w3.weight": "model-00002-of-00005.safetensors", "model.layers.28.block_sparse_moe.experts.3.w3.weight": "model-00002-of-00005.safetensors", "model.layers.29.block_sparse_moe.experts.0.w3.weight": "model-00002-of-00005.safetensors", "model.layers.29.block_sparse_moe.experts.1.w3.weight": "model-00002-of-00005.safetensors", "model.layers.29.block_sparse_moe.experts.2.w3.weight": "model-00002-of-00005.safetensors", "model.layers.29.block_sparse_moe.experts.3.w3.weight": "model-00002-of-00005.safetensors", "model.layers.30.block_sparse_moe.experts.0.w3.weight": "model-00002-of-00005.safetensors", 
"model.layers.30.block_sparse_moe.experts.1.w3.weight": "model-00002-of-00005.safetensors", "model.layers.30.block_sparse_moe.experts.2.w3.weight": "model-00002-of-00005.safetensors", "model.layers.30.block_sparse_moe.experts.3.w3.weight": "model-00002-of-00005.safetensors", "model.layers.31.block_sparse_moe.experts.0.w3.weight": "model-00002-of-00005.safetensors", "model.layers.31.block_sparse_moe.experts.1.w3.weight": "model-00002-of-00005.safetensors", "model.layers.31.block_sparse_moe.experts.2.w3.weight": "model-00002-of-00005.safetensors", "model.layers.31.block_sparse_moe.experts.3.w3.weight": "model-00002-of-00005.safetensors", "model.layers.0.block_sparse_moe.experts.0.w2.weight": "model-00002-of-00005.safetensors", "model.layers.0.block_sparse_moe.experts.1.w2.weight": "model-00002-of-00005.safetensors", "model.layers.0.block_sparse_moe.experts.2.w2.weight": "model-00002-of-00005.safetensors", "model.layers.0.block_sparse_moe.experts.3.w2.weight": "model-00002-of-00005.safetensors", "model.layers.1.block_sparse_moe.experts.0.w2.weight": "model-00002-of-00005.safetensors", "model.layers.1.block_sparse_moe.experts.1.w2.weight": "model-00002-of-00005.safetensors", "model.layers.1.block_sparse_moe.experts.2.w2.weight": "model-00002-of-00005.safetensors", "model.layers.1.block_sparse_moe.experts.3.w2.weight": "model-00002-of-00005.safetensors", "model.layers.2.block_sparse_moe.experts.0.w2.weight": "model-00002-of-00005.safetensors", "model.layers.2.block_sparse_moe.experts.1.w2.weight": "model-00002-of-00005.safetensors", "model.layers.2.block_sparse_moe.experts.2.w2.weight": "model-00002-of-00005.safetensors", "model.layers.2.block_sparse_moe.experts.3.w2.weight": "model-00002-of-00005.safetensors", "model.layers.3.block_sparse_moe.experts.0.w2.weight": "model-00002-of-00005.safetensors", "model.layers.3.block_sparse_moe.experts.1.w2.weight": "model-00002-of-00005.safetensors", "model.layers.3.block_sparse_moe.experts.2.w2.weight": 
"model-00002-of-00005.safetensors", "model.layers.3.block_sparse_moe.experts.3.w2.weight": "model-00002-of-00005.safetensors", "model.layers.4.block_sparse_moe.experts.0.w2.weight": "model-00002-of-00005.safetensors", "model.layers.4.block_sparse_moe.experts.1.w2.weight": "model-00002-of-00005.safetensors", "model.layers.4.block_sparse_moe.experts.2.w2.weight": "model-00002-of-00005.safetensors", "model.layers.4.block_sparse_moe.experts.3.w2.weight": "model-00002-of-00005.safetensors", "model.layers.5.block_sparse_moe.experts.0.w2.weight": "model-00002-of-00005.safetensors", "model.layers.5.block_sparse_moe.experts.1.w2.weight": "model-00002-of-00005.safetensors", "model.layers.5.block_sparse_moe.experts.2.w2.weight": "model-00002-of-00005.safetensors", "model.layers.5.block_sparse_moe.experts.3.w2.weight": "model-00002-of-00005.safetensors", "model.layers.6.block_sparse_moe.experts.0.w2.weight": "model-00002-of-00005.safetensors", "model.layers.6.block_sparse_moe.experts.1.w2.weight": "model-00002-of-00005.safetensors", "model.layers.6.block_sparse_moe.experts.2.w2.weight": "model-00002-of-00005.safetensors", "model.layers.6.block_sparse_moe.experts.3.w2.weight": "model-00002-of-00005.safetensors", "model.layers.7.block_sparse_moe.experts.0.w2.weight": "model-00002-of-00005.safetensors", "model.layers.7.block_sparse_moe.experts.1.w2.weight": "model-00002-of-00005.safetensors", "model.layers.7.block_sparse_moe.experts.2.w2.weight": "model-00002-of-00005.safetensors", "model.layers.7.block_sparse_moe.experts.3.w2.weight": "model-00002-of-00005.safetensors", "model.layers.8.block_sparse_moe.experts.0.w2.weight": "model-00002-of-00005.safetensors", "model.layers.8.block_sparse_moe.experts.1.w2.weight": "model-00002-of-00005.safetensors", "model.layers.8.block_sparse_moe.experts.2.w2.weight": "model-00002-of-00005.safetensors", "model.layers.8.block_sparse_moe.experts.3.w2.weight": "model-00002-of-00005.safetensors", 
"model.layers.9.block_sparse_moe.experts.0.w2.weight": "model-00002-of-00005.safetensors", "model.layers.9.block_sparse_moe.experts.1.w2.weight": "model-00003-of-00005.safetensors", "model.layers.9.block_sparse_moe.experts.2.w2.weight": "model-00003-of-00005.safetensors", "model.layers.9.block_sparse_moe.experts.3.w2.weight": "model-00003-of-00005.safetensors", "model.layers.10.block_sparse_moe.experts.0.w2.weight": "model-00003-of-00005.safetensors", "model.layers.10.block_sparse_moe.experts.1.w2.weight": "model-00003-of-00005.safetensors", "model.layers.10.block_sparse_moe.experts.2.w2.weight": "model-00003-of-00005.safetensors", "model.layers.10.block_sparse_moe.experts.3.w2.weight": "model-00003-of-00005.safetensors", "model.layers.11.block_sparse_moe.experts.0.w2.weight": "model-00003-of-00005.safetensors", "model.layers.11.block_sparse_moe.experts.1.w2.weight": "model-00003-of-00005.safetensors", "model.layers.11.block_sparse_moe.experts.2.w2.weight": "model-00003-of-00005.safetensors", "model.layers.11.block_sparse_moe.experts.3.w2.weight": "model-00003-of-00005.safetensors", "model.layers.12.block_sparse_moe.experts.0.w2.weight": "model-00003-of-00005.safetensors", "model.layers.12.block_sparse_moe.experts.1.w2.weight": "model-00003-of-00005.safetensors", "model.layers.12.block_sparse_moe.experts.2.w2.weight": "model-00003-of-00005.safetensors", "model.layers.12.block_sparse_moe.experts.3.w2.weight": "model-00003-of-00005.safetensors", "model.layers.13.block_sparse_moe.experts.0.w2.weight": "model-00003-of-00005.safetensors", "model.layers.13.block_sparse_moe.experts.1.w2.weight": "model-00003-of-00005.safetensors", "model.layers.13.block_sparse_moe.experts.2.w2.weight": "model-00003-of-00005.safetensors", "model.layers.13.block_sparse_moe.experts.3.w2.weight": "model-00003-of-00005.safetensors", "model.layers.14.block_sparse_moe.experts.0.w2.weight": "model-00003-of-00005.safetensors", "model.layers.14.block_sparse_moe.experts.1.w2.weight": 
"model-00003-of-00005.safetensors", "model.layers.14.block_sparse_moe.experts.2.w2.weight": "model-00003-of-00005.safetensors", "model.layers.14.block_sparse_moe.experts.3.w2.weight": "model-00003-of-00005.safetensors", "model.layers.15.block_sparse_moe.experts.0.w2.weight": "model-00003-of-00005.safetensors", "model.layers.15.block_sparse_moe.experts.1.w2.weight": "model-00003-of-00005.safetensors", "model.layers.15.block_sparse_moe.experts.2.w2.weight": "model-00003-of-00005.safetensors", "model.layers.15.block_sparse_moe.experts.3.w2.weight": "model-00003-of-00005.safetensors", "model.layers.16.block_sparse_moe.experts.0.w2.weight": "model-00003-of-00005.safetensors", "model.layers.16.block_sparse_moe.experts.1.w2.weight": "model-00003-of-00005.safetensors", "model.layers.16.block_sparse_moe.experts.2.w2.weight": "model-00003-of-00005.safetensors", "model.layers.16.block_sparse_moe.experts.3.w2.weight": "model-00003-of-00005.safetensors", "model.layers.17.block_sparse_moe.experts.0.w2.weight": "model-00003-of-00005.safetensors", "model.layers.17.block_sparse_moe.experts.1.w2.weight": "model-00003-of-00005.safetensors", "model.layers.17.block_sparse_moe.experts.2.w2.weight": "model-00003-of-00005.safetensors", "model.layers.17.block_sparse_moe.experts.3.w2.weight": "model-00003-of-00005.safetensors", "model.layers.18.block_sparse_moe.experts.0.w2.weight": "model-00003-of-00005.safetensors", "model.layers.18.block_sparse_moe.experts.1.w2.weight": "model-00003-of-00005.safetensors", "model.layers.18.block_sparse_moe.experts.2.w2.weight": "model-00003-of-00005.safetensors", "model.layers.18.block_sparse_moe.experts.3.w2.weight": "model-00003-of-00005.safetensors", "model.layers.19.block_sparse_moe.experts.0.w2.weight": "model-00003-of-00005.safetensors", "model.layers.19.block_sparse_moe.experts.1.w2.weight": "model-00003-of-00005.safetensors", "model.layers.19.block_sparse_moe.experts.2.w2.weight": "model-00003-of-00005.safetensors", 
"model.layers.19.block_sparse_moe.experts.3.w2.weight": "model-00003-of-00005.safetensors", "model.layers.20.block_sparse_moe.experts.0.w2.weight": "model-00003-of-00005.safetensors", "model.layers.20.block_sparse_moe.experts.1.w2.weight": "model-00003-of-00005.safetensors", "model.layers.20.block_sparse_moe.experts.2.w2.weight": "model-00003-of-00005.safetensors", "model.layers.20.block_sparse_moe.experts.3.w2.weight": "model-00003-of-00005.safetensors", "model.layers.21.block_sparse_moe.experts.0.w2.weight": "model-00003-of-00005.safetensors", "model.layers.21.block_sparse_moe.experts.1.w2.weight": "model-00003-of-00005.safetensors", "model.layers.21.block_sparse_moe.experts.2.w2.weight": "model-00003-of-00005.safetensors", "model.layers.21.block_sparse_moe.experts.3.w2.weight": "model-00003-of-00005.safetensors", "model.layers.22.block_sparse_moe.experts.0.w2.weight": "model-00003-of-00005.safetensors", "model.layers.22.block_sparse_moe.experts.1.w2.weight": "model-00003-of-00005.safetensors", "model.layers.22.block_sparse_moe.experts.2.w2.weight": "model-00003-of-00005.safetensors", "model.layers.22.block_sparse_moe.experts.3.w2.weight": "model-00003-of-00005.safetensors", "model.layers.23.block_sparse_moe.experts.0.w2.weight": "model-00003-of-00005.safetensors", "model.layers.23.block_sparse_moe.experts.1.w2.weight": "model-00003-of-00005.safetensors", "model.layers.23.block_sparse_moe.experts.2.w2.weight": "model-00003-of-00005.safetensors", "model.layers.23.block_sparse_moe.experts.3.w2.weight": "model-00003-of-00005.safetensors", "model.layers.24.block_sparse_moe.experts.0.w2.weight": "model-00003-of-00005.safetensors", "model.layers.24.block_sparse_moe.experts.1.w2.weight": "model-00003-of-00005.safetensors", "model.layers.24.block_sparse_moe.experts.2.w2.weight": "model-00003-of-00005.safetensors", "model.layers.24.block_sparse_moe.experts.3.w2.weight": "model-00003-of-00005.safetensors", "model.layers.25.block_sparse_moe.experts.0.w2.weight": 
"model-00003-of-00005.safetensors", "model.layers.25.block_sparse_moe.experts.1.w2.weight": "model-00003-of-00005.safetensors", "model.layers.25.block_sparse_moe.experts.2.w2.weight": "model-00003-of-00005.safetensors", "model.layers.25.block_sparse_moe.experts.3.w2.weight": "model-00003-of-00005.safetensors", "model.layers.26.block_sparse_moe.experts.0.w2.weight": "model-00003-of-00005.safetensors", "model.layers.26.block_sparse_moe.experts.1.w2.weight": "model-00003-of-00005.safetensors", "model.layers.26.block_sparse_moe.experts.2.w2.weight": "model-00003-of-00005.safetensors", "model.layers.26.block_sparse_moe.experts.3.w2.weight": "model-00003-of-00005.safetensors", "model.layers.27.block_sparse_moe.experts.0.w2.weight": "model-00003-of-00005.safetensors", "model.layers.27.block_sparse_moe.experts.1.w2.weight": "model-00003-of-00005.safetensors", "model.layers.27.block_sparse_moe.experts.2.w2.weight": "model-00003-of-00005.safetensors", "model.layers.27.block_sparse_moe.experts.3.w2.weight": "model-00003-of-00005.safetensors", "model.layers.28.block_sparse_moe.experts.0.w2.weight": "model-00003-of-00005.safetensors", "model.layers.28.block_sparse_moe.experts.1.w2.weight": "model-00003-of-00005.safetensors", "model.layers.28.block_sparse_moe.experts.2.w2.weight": "model-00003-of-00005.safetensors", "model.layers.28.block_sparse_moe.experts.3.w2.weight": "model-00003-of-00005.safetensors", "model.layers.29.block_sparse_moe.experts.0.w2.weight": "model-00003-of-00005.safetensors", "model.layers.29.block_sparse_moe.experts.1.w2.weight": "model-00003-of-00005.safetensors", "model.layers.29.block_sparse_moe.experts.2.w2.weight": "model-00003-of-00005.safetensors", "model.layers.29.block_sparse_moe.experts.3.w2.weight": "model-00003-of-00005.safetensors", "model.layers.30.block_sparse_moe.experts.0.w2.weight": "model-00003-of-00005.safetensors", "model.layers.30.block_sparse_moe.experts.1.w2.weight": "model-00003-of-00005.safetensors", 
"model.layers.30.block_sparse_moe.experts.2.w2.weight": "model-00004-of-00005.safetensors", "model.layers.30.block_sparse_moe.experts.3.w2.weight": "model-00004-of-00005.safetensors", "model.layers.31.block_sparse_moe.experts.0.w2.weight": "model-00004-of-00005.safetensors", "model.layers.31.block_sparse_moe.experts.1.w2.weight": "model-00004-of-00005.safetensors", "model.layers.31.block_sparse_moe.experts.2.w2.weight": "model-00004-of-00005.safetensors", "model.layers.31.block_sparse_moe.experts.3.w2.weight": "model-00004-of-00005.safetensors", "model.layers.0.block_sparse_moe.experts.0.w1.weight": "model-00004-of-00005.safetensors", "model.layers.0.block_sparse_moe.experts.1.w1.weight": "model-00004-of-00005.safetensors", "model.layers.0.block_sparse_moe.experts.2.w1.weight": "model-00004-of-00005.safetensors", "model.layers.0.block_sparse_moe.experts.3.w1.weight": "model-00004-of-00005.safetensors", "model.layers.1.block_sparse_moe.experts.0.w1.weight": "model-00004-of-00005.safetensors", "model.layers.1.block_sparse_moe.experts.1.w1.weight": "model-00004-of-00005.safetensors", "model.layers.1.block_sparse_moe.experts.2.w1.weight": "model-00004-of-00005.safetensors", "model.layers.1.block_sparse_moe.experts.3.w1.weight": "model-00004-of-00005.safetensors", "model.layers.2.block_sparse_moe.experts.0.w1.weight": "model-00004-of-00005.safetensors", "model.layers.2.block_sparse_moe.experts.1.w1.weight": "model-00004-of-00005.safetensors", "model.layers.2.block_sparse_moe.experts.2.w1.weight": "model-00004-of-00005.safetensors", "model.layers.2.block_sparse_moe.experts.3.w1.weight": "model-00004-of-00005.safetensors", "model.layers.3.block_sparse_moe.experts.0.w1.weight": "model-00004-of-00005.safetensors", "model.layers.3.block_sparse_moe.experts.1.w1.weight": "model-00004-of-00005.safetensors", "model.layers.3.block_sparse_moe.experts.2.w1.weight": "model-00004-of-00005.safetensors", "model.layers.3.block_sparse_moe.experts.3.w1.weight": 
"model-00004-of-00005.safetensors", "model.layers.4.block_sparse_moe.experts.0.w1.weight": "model-00004-of-00005.safetensors", "model.layers.4.block_sparse_moe.experts.1.w1.weight": "model-00004-of-00005.safetensors", "model.layers.4.block_sparse_moe.experts.2.w1.weight": "model-00004-of-00005.safetensors", "model.layers.4.block_sparse_moe.experts.3.w1.weight": "model-00004-of-00005.safetensors", "model.layers.5.block_sparse_moe.experts.0.w1.weight": "model-00004-of-00005.safetensors", "model.layers.5.block_sparse_moe.experts.1.w1.weight": "model-00004-of-00005.safetensors", "model.layers.5.block_sparse_moe.experts.2.w1.weight": "model-00004-of-00005.safetensors", "model.layers.5.block_sparse_moe.experts.3.w1.weight": "model-00004-of-00005.safetensors", "model.layers.6.block_sparse_moe.experts.0.w1.weight": "model-00004-of-00005.safetensors", "model.layers.6.block_sparse_moe.experts.1.w1.weight": "model-00004-of-00005.safetensors", "model.layers.6.block_sparse_moe.experts.2.w1.weight": "model-00004-of-00005.safetensors", "model.layers.6.block_sparse_moe.experts.3.w1.weight": "model-00004-of-00005.safetensors", "model.layers.7.block_sparse_moe.experts.0.w1.weight": "model-00004-of-00005.safetensors", "model.layers.7.block_sparse_moe.experts.1.w1.weight": "model-00004-of-00005.safetensors", "model.layers.7.block_sparse_moe.experts.2.w1.weight": "model-00004-of-00005.safetensors", "model.layers.7.block_sparse_moe.experts.3.w1.weight": "model-00004-of-00005.safetensors", "model.layers.8.block_sparse_moe.experts.0.w1.weight": "model-00004-of-00005.safetensors", "model.layers.8.block_sparse_moe.experts.1.w1.weight": "model-00004-of-00005.safetensors", "model.layers.8.block_sparse_moe.experts.2.w1.weight": "model-00004-of-00005.safetensors", "model.layers.8.block_sparse_moe.experts.3.w1.weight": "model-00004-of-00005.safetensors", "model.layers.9.block_sparse_moe.experts.0.w1.weight": "model-00004-of-00005.safetensors", 
"model.layers.9.block_sparse_moe.experts.1.w1.weight": "model-00004-of-00005.safetensors", "model.layers.9.block_sparse_moe.experts.2.w1.weight": "model-00004-of-00005.safetensors", "model.layers.9.block_sparse_moe.experts.3.w1.weight": "model-00004-of-00005.safetensors", "model.layers.10.block_sparse_moe.experts.0.w1.weight": "model-00004-of-00005.safetensors", "model.layers.10.block_sparse_moe.experts.1.w1.weight": "model-00004-of-00005.safetensors", "model.layers.10.block_sparse_moe.experts.2.w1.weight": "model-00004-of-00005.safetensors", "model.layers.10.block_sparse_moe.experts.3.w1.weight": "model-00004-of-00005.safetensors", "model.layers.11.block_sparse_moe.experts.0.w1.weight": "model-00004-of-00005.safetensors", "model.layers.11.block_sparse_moe.experts.1.w1.weight": "model-00004-of-00005.safetensors", "model.layers.11.block_sparse_moe.experts.2.w1.weight": "model-00004-of-00005.safetensors", "model.layers.11.block_sparse_moe.experts.3.w1.weight": "model-00004-of-00005.safetensors", "model.layers.12.block_sparse_moe.experts.0.w1.weight": "model-00004-of-00005.safetensors", "model.layers.12.block_sparse_moe.experts.1.w1.weight": "model-00004-of-00005.safetensors", "model.layers.12.block_sparse_moe.experts.2.w1.weight": "model-00004-of-00005.safetensors", "model.layers.12.block_sparse_moe.experts.3.w1.weight": "model-00004-of-00005.safetensors", "model.layers.13.block_sparse_moe.experts.0.w1.weight": "model-00004-of-00005.safetensors", "model.layers.13.block_sparse_moe.experts.1.w1.weight": "model-00004-of-00005.safetensors", "model.layers.13.block_sparse_moe.experts.2.w1.weight": "model-00004-of-00005.safetensors", "model.layers.13.block_sparse_moe.experts.3.w1.weight": "model-00004-of-00005.safetensors", "model.layers.14.block_sparse_moe.experts.0.w1.weight": "model-00004-of-00005.safetensors", "model.layers.14.block_sparse_moe.experts.1.w1.weight": "model-00004-of-00005.safetensors", "model.layers.14.block_sparse_moe.experts.2.w1.weight": 
"model-00004-of-00005.safetensors", "model.layers.14.block_sparse_moe.experts.3.w1.weight": "model-00004-of-00005.safetensors", "model.layers.15.block_sparse_moe.experts.0.w1.weight": "model-00004-of-00005.safetensors", "model.layers.15.block_sparse_moe.experts.1.w1.weight": "model-00004-of-00005.safetensors", "model.layers.15.block_sparse_moe.experts.2.w1.weight": "model-00004-of-00005.safetensors", "model.layers.15.block_sparse_moe.experts.3.w1.weight": "model-00004-of-00005.safetensors", "model.layers.16.block_sparse_moe.experts.0.w1.weight": "model-00004-of-00005.safetensors", "model.layers.16.block_sparse_moe.experts.1.w1.weight": "model-00004-of-00005.safetensors", "model.layers.16.block_sparse_moe.experts.2.w1.weight": "model-00004-of-00005.safetensors", "model.layers.16.block_sparse_moe.experts.3.w1.weight": "model-00004-of-00005.safetensors", "model.layers.17.block_sparse_moe.experts.0.w1.weight": "model-00004-of-00005.safetensors", "model.layers.17.block_sparse_moe.experts.1.w1.weight": "model-00004-of-00005.safetensors", "model.layers.17.block_sparse_moe.experts.2.w1.weight": "model-00004-of-00005.safetensors", "model.layers.17.block_sparse_moe.experts.3.w1.weight": "model-00004-of-00005.safetensors", "model.layers.18.block_sparse_moe.experts.0.w1.weight": "model-00004-of-00005.safetensors", "model.layers.18.block_sparse_moe.experts.1.w1.weight": "model-00004-of-00005.safetensors", "model.layers.18.block_sparse_moe.experts.2.w1.weight": "model-00004-of-00005.safetensors", "model.layers.18.block_sparse_moe.experts.3.w1.weight": "model-00004-of-00005.safetensors", "model.layers.19.block_sparse_moe.experts.0.w1.weight": "model-00004-of-00005.safetensors", "model.layers.19.block_sparse_moe.experts.1.w1.weight": "model-00004-of-00005.safetensors", "model.layers.19.block_sparse_moe.experts.2.w1.weight": "model-00004-of-00005.safetensors", "model.layers.19.block_sparse_moe.experts.3.w1.weight": "model-00005-of-00005.safetensors", 
"model.layers.20.block_sparse_moe.experts.0.w1.weight": "model-00005-of-00005.safetensors", "model.layers.20.block_sparse_moe.experts.1.w1.weight": "model-00005-of-00005.safetensors", "model.layers.20.block_sparse_moe.experts.2.w1.weight": "model-00005-of-00005.safetensors", "model.layers.20.block_sparse_moe.experts.3.w1.weight": "model-00005-of-00005.safetensors", "model.layers.21.block_sparse_moe.experts.0.w1.weight": "model-00005-of-00005.safetensors", "model.layers.21.block_sparse_moe.experts.1.w1.weight": "model-00005-of-00005.safetensors", "model.layers.21.block_sparse_moe.experts.2.w1.weight": "model-00005-of-00005.safetensors", "model.layers.21.block_sparse_moe.experts.3.w1.weight": "model-00005-of-00005.safetensors", "model.layers.22.block_sparse_moe.experts.0.w1.weight": "model-00005-of-00005.safetensors", "model.layers.22.block_sparse_moe.experts.1.w1.weight": "model-00005-of-00005.safetensors", "model.layers.22.block_sparse_moe.experts.2.w1.weight": "model-00005-of-00005.safetensors", "model.layers.22.block_sparse_moe.experts.3.w1.weight": "model-00005-of-00005.safetensors", "model.layers.23.block_sparse_moe.experts.0.w1.weight": "model-00005-of-00005.safetensors", "model.layers.23.block_sparse_moe.experts.1.w1.weight": "model-00005-of-00005.safetensors", "model.layers.23.block_sparse_moe.experts.2.w1.weight": "model-00005-of-00005.safetensors", "model.layers.23.block_sparse_moe.experts.3.w1.weight": "model-00005-of-00005.safetensors", "model.layers.24.block_sparse_moe.experts.0.w1.weight": "model-00005-of-00005.safetensors", "model.layers.24.block_sparse_moe.experts.1.w1.weight": "model-00005-of-00005.safetensors", "model.layers.24.block_sparse_moe.experts.2.w1.weight": "model-00005-of-00005.safetensors", "model.layers.24.block_sparse_moe.experts.3.w1.weight": "model-00005-of-00005.safetensors", "model.layers.25.block_sparse_moe.experts.0.w1.weight": "model-00005-of-00005.safetensors", "model.layers.25.block_sparse_moe.experts.1.w1.weight": 
"model-00005-of-00005.safetensors", "model.layers.25.block_sparse_moe.experts.2.w1.weight": "model-00005-of-00005.safetensors", "model.layers.25.block_sparse_moe.experts.3.w1.weight": "model-00005-of-00005.safetensors", "model.layers.26.block_sparse_moe.experts.0.w1.weight": "model-00005-of-00005.safetensors", "model.layers.26.block_sparse_moe.experts.1.w1.weight": "model-00005-of-00005.safetensors", "model.layers.26.block_sparse_moe.experts.2.w1.weight": "model-00005-of-00005.safetensors", "model.layers.26.block_sparse_moe.experts.3.w1.weight": "model-00005-of-00005.safetensors", "model.layers.27.block_sparse_moe.experts.0.w1.weight": "model-00005-of-00005.safetensors", "model.layers.27.block_sparse_moe.experts.1.w1.weight": "model-00005-of-00005.safetensors", "model.layers.27.block_sparse_moe.experts.2.w1.weight": "model-00005-of-00005.safetensors", "model.layers.27.block_sparse_moe.experts.3.w1.weight": "model-00005-of-00005.safetensors", "model.layers.28.block_sparse_moe.experts.0.w1.weight": "model-00005-of-00005.safetensors", "model.layers.28.block_sparse_moe.experts.1.w1.weight": "model-00005-of-00005.safetensors", "model.layers.28.block_sparse_moe.experts.2.w1.weight": "model-00005-of-00005.safetensors", "model.layers.28.block_sparse_moe.experts.3.w1.weight": "model-00005-of-00005.safetensors", "model.layers.29.block_sparse_moe.experts.0.w1.weight": "model-00005-of-00005.safetensors", "model.layers.29.block_sparse_moe.experts.1.w1.weight": "model-00005-of-00005.safetensors", "model.layers.29.block_sparse_moe.experts.2.w1.weight": "model-00005-of-00005.safetensors", "model.layers.29.block_sparse_moe.experts.3.w1.weight": "model-00005-of-00005.safetensors", "model.layers.30.block_sparse_moe.experts.0.w1.weight": "model-00005-of-00005.safetensors", "model.layers.30.block_sparse_moe.experts.1.w1.weight": "model-00005-of-00005.safetensors", "model.layers.30.block_sparse_moe.experts.2.w1.weight": "model-00005-of-00005.safetensors", 
"model.layers.30.block_sparse_moe.experts.3.w1.weight": "model-00005-of-00005.safetensors", "model.layers.31.block_sparse_moe.experts.0.w1.weight": "model-00005-of-00005.safetensors", "model.layers.31.block_sparse_moe.experts.1.w1.weight": "model-00005-of-00005.safetensors", "model.layers.31.block_sparse_moe.experts.2.w1.weight": "model-00005-of-00005.safetensors", "model.layers.31.block_sparse_moe.experts.3.w1.weight": "model-00005-of-00005.safetensors", "model.layers.0.post_attention_layernorm.weight": "model-00005-of-00005.safetensors", "model.layers.1.post_attention_layernorm.weight": "model-00005-of-00005.safetensors", "model.layers.2.post_attention_layernorm.weight": "model-00005-of-00005.safetensors", "model.layers.3.post_attention_layernorm.weight": "model-00005-of-00005.safetensors", "model.layers.4.post_attention_layernorm.weight": "model-00005-of-00005.safetensors", "model.layers.5.post_attention_layernorm.weight": "model-00005-of-00005.safetensors", "model.layers.6.post_attention_layernorm.weight": "model-00005-of-00005.safetensors", "model.layers.7.post_attention_layernorm.weight": "model-00005-of-00005.safetensors", "model.layers.8.post_attention_layernorm.weight": "model-00005-of-00005.safetensors", "model.layers.9.post_attention_layernorm.weight": "model-00005-of-00005.safetensors", "model.layers.10.post_attention_layernorm.weight": "model-00005-of-00005.safetensors", "model.layers.11.post_attention_layernorm.weight": "model-00005-of-00005.safetensors", "model.layers.12.post_attention_layernorm.weight": "model-00005-of-00005.safetensors", "model.layers.13.post_attention_layernorm.weight": "model-00005-of-00005.safetensors", "model.layers.14.post_attention_layernorm.weight": "model-00005-of-00005.safetensors", "model.layers.15.post_attention_layernorm.weight": "model-00005-of-00005.safetensors", "model.layers.16.post_attention_layernorm.weight": "model-00005-of-00005.safetensors", "model.layers.17.post_attention_layernorm.weight": 
"model-00005-of-00005.safetensors", "model.layers.18.post_attention_layernorm.weight": "model-00005-of-00005.safetensors", "model.layers.19.post_attention_layernorm.weight": "model-00005-of-00005.safetensors", "model.layers.20.post_attention_layernorm.weight": "model-00005-of-00005.safetensors", "model.layers.21.post_attention_layernorm.weight": "model-00005-of-00005.safetensors", "model.layers.22.post_attention_layernorm.weight": "model-00005-of-00005.safetensors", "model.layers.23.post_attention_layernorm.weight": "model-00005-of-00005.safetensors", "model.layers.24.post_attention_layernorm.weight": "model-00005-of-00005.safetensors", "model.layers.25.post_attention_layernorm.weight": "model-00005-of-00005.safetensors", "model.layers.26.post_attention_layernorm.weight": "model-00005-of-00005.safetensors", "model.layers.27.post_attention_layernorm.weight": "model-00005-of-00005.safetensors", "model.layers.28.post_attention_layernorm.weight": "model-00005-of-00005.safetensors", "model.layers.29.post_attention_layernorm.weight": "model-00005-of-00005.safetensors", "model.layers.30.post_attention_layernorm.weight": "model-00005-of-00005.safetensors", "model.layers.31.post_attention_layernorm.weight": "model-00005-of-00005.safetensors", "model.layers.0.self_attn.q_proj.weight": "model-00005-of-00005.safetensors", "model.layers.1.self_attn.q_proj.weight": "model-00005-of-00005.safetensors", "model.layers.2.self_attn.q_proj.weight": "model-00005-of-00005.safetensors", "model.layers.3.self_attn.q_proj.weight": "model-00005-of-00005.safetensors", "model.layers.4.self_attn.q_proj.weight": "model-00005-of-00005.safetensors", "model.layers.5.self_attn.q_proj.weight": "model-00005-of-00005.safetensors", "model.layers.6.self_attn.q_proj.weight": "model-00005-of-00005.safetensors", "model.layers.7.self_attn.q_proj.weight": "model-00005-of-00005.safetensors", "model.layers.8.self_attn.q_proj.weight": "model-00005-of-00005.safetensors", "model.layers.9.self_attn.q_proj.weight": 
"model-00005-of-00005.safetensors", "model.layers.10.self_attn.q_proj.weight": "model-00005-of-00005.safetensors", "model.layers.11.self_attn.q_proj.weight": "model-00005-of-00005.safetensors", "model.layers.12.self_attn.q_proj.weight": "model-00005-of-00005.safetensors", "model.layers.13.self_attn.q_proj.weight": "model-00005-of-00005.safetensors", "model.layers.14.self_attn.q_proj.weight": "model-00005-of-00005.safetensors", "model.layers.15.self_attn.q_proj.weight": "model-00005-of-00005.safetensors", "model.layers.16.self_attn.q_proj.weight": "model-00005-of-00005.safetensors", "model.layers.17.self_attn.q_proj.weight": "model-00005-of-00005.safetensors", "model.layers.18.self_attn.q_proj.weight": "model-00005-of-00005.safetensors", "model.layers.19.self_attn.q_proj.weight": "model-00005-of-00005.safetensors", "model.layers.20.self_attn.q_proj.weight": "model-00005-of-00005.safetensors", "model.layers.21.self_attn.q_proj.weight": "model-00005-of-00005.safetensors", "model.layers.22.self_attn.q_proj.weight": "model-00005-of-00005.safetensors", "model.layers.23.self_attn.q_proj.weight": "model-00005-of-00005.safetensors", "model.layers.24.self_attn.q_proj.weight": "model-00005-of-00005.safetensors", "model.layers.25.self_attn.q_proj.weight": "model-00005-of-00005.safetensors", "model.layers.26.self_attn.q_proj.weight": "model-00005-of-00005.safetensors", "model.layers.27.self_attn.q_proj.weight": "model-00005-of-00005.safetensors", "model.layers.28.self_attn.q_proj.weight": "model-00005-of-00005.safetensors", "model.layers.29.self_attn.q_proj.weight": "model-00005-of-00005.safetensors", "model.layers.30.self_attn.q_proj.weight": "model-00005-of-00005.safetensors", "model.layers.31.self_attn.q_proj.weight": "model-00005-of-00005.safetensors", "model.layers.0.self_attn.k_proj.weight": "model-00005-of-00005.safetensors", "model.layers.1.self_attn.k_proj.weight": "model-00005-of-00005.safetensors", "model.layers.2.self_attn.k_proj.weight": 
"model-00005-of-00005.safetensors", "model.layers.3.self_attn.k_proj.weight": "model-00005-of-00005.safetensors", "model.layers.4.self_attn.k_proj.weight": "model-00005-of-00005.safetensors", "model.layers.5.self_attn.k_proj.weight": "model-00005-of-00005.safetensors", "model.layers.6.self_attn.k_proj.weight": "model-00005-of-00005.safetensors", "model.layers.7.self_attn.k_proj.weight": "model-00005-of-00005.safetensors", "model.layers.8.self_attn.k_proj.weight": "model-00005-of-00005.safetensors", "model.layers.9.self_attn.k_proj.weight": "model-00005-of-00005.safetensors", "model.layers.10.self_attn.k_proj.weight": "model-00005-of-00005.safetensors", "model.layers.11.self_attn.k_proj.weight": "model-00005-of-00005.safetensors", "model.layers.12.self_attn.k_proj.weight": "model-00005-of-00005.safetensors", "model.layers.13.self_attn.k_proj.weight": "model-00005-of-00005.safetensors", "model.layers.14.self_attn.k_proj.weight": "model-00005-of-00005.safetensors", "model.layers.15.self_attn.k_proj.weight": "model-00005-of-00005.safetensors", "model.layers.16.self_attn.k_proj.weight": "model-00005-of-00005.safetensors", "model.layers.17.self_attn.k_proj.weight": "model-00005-of-00005.safetensors", "model.layers.18.self_attn.k_proj.weight": "model-00005-of-00005.safetensors", "model.layers.19.self_attn.k_proj.weight": "model-00005-of-00005.safetensors", "model.layers.20.self_attn.k_proj.weight": "model-00005-of-00005.safetensors", "model.layers.21.self_attn.k_proj.weight": "model-00005-of-00005.safetensors", "model.layers.22.self_attn.k_proj.weight": "model-00005-of-00005.safetensors", "model.layers.23.self_attn.k_proj.weight": "model-00005-of-00005.safetensors", "model.layers.24.self_attn.k_proj.weight": "model-00005-of-00005.safetensors", "model.layers.25.self_attn.k_proj.weight": "model-00005-of-00005.safetensors", "model.layers.26.self_attn.k_proj.weight": "model-00005-of-00005.safetensors", "model.layers.27.self_attn.k_proj.weight": 
"model-00005-of-00005.safetensors", "model.layers.28.self_attn.k_proj.weight": "model-00005-of-00005.safetensors", "model.layers.29.self_attn.k_proj.weight": "model-00005-of-00005.safetensors", "model.layers.30.self_attn.k_proj.weight": "model-00005-of-00005.safetensors", "model.layers.31.self_attn.k_proj.weight": "model-00005-of-00005.safetensors", "model.layers.0.self_attn.v_proj.weight": "model-00005-of-00005.safetensors", "model.layers.1.self_attn.v_proj.weight": "model-00005-of-00005.safetensors", "model.layers.2.self_attn.v_proj.weight": "model-00005-of-00005.safetensors", "model.layers.3.self_attn.v_proj.weight": "model-00005-of-00005.safetensors", "model.layers.4.self_attn.v_proj.weight": "model-00005-of-00005.safetensors", "model.layers.5.self_attn.v_proj.weight": "model-00005-of-00005.safetensors", "model.layers.6.self_attn.v_proj.weight": "model-00005-of-00005.safetensors", "model.layers.7.self_attn.v_proj.weight": "model-00005-of-00005.safetensors", "model.layers.8.self_attn.v_proj.weight": "model-00005-of-00005.safetensors", "model.layers.9.self_attn.v_proj.weight": "model-00005-of-00005.safetensors", "model.layers.10.self_attn.v_proj.weight": "model-00005-of-00005.safetensors", "model.layers.11.self_attn.v_proj.weight": "model-00005-of-00005.safetensors", "model.layers.12.self_attn.v_proj.weight": "model-00005-of-00005.safetensors", "model.layers.13.self_attn.v_proj.weight": "model-00005-of-00005.safetensors", "model.layers.14.self_attn.v_proj.weight": "model-00005-of-00005.safetensors", "model.layers.15.self_attn.v_proj.weight": "model-00005-of-00005.safetensors", "model.layers.16.self_attn.v_proj.weight": "model-00005-of-00005.safetensors", "model.layers.17.self_attn.v_proj.weight": "model-00005-of-00005.safetensors", "model.layers.18.self_attn.v_proj.weight": "model-00005-of-00005.safetensors", "model.layers.19.self_attn.v_proj.weight": "model-00005-of-00005.safetensors", "model.layers.20.self_attn.v_proj.weight": 
"model-00005-of-00005.safetensors", "model.layers.21.self_attn.v_proj.weight": "model-00005-of-00005.safetensors", "model.layers.22.self_attn.v_proj.weight": "model-00005-of-00005.safetensors", "model.layers.23.self_attn.v_proj.weight": "model-00005-of-00005.safetensors", "model.layers.24.self_attn.v_proj.weight": "model-00005-of-00005.safetensors", "model.layers.25.self_attn.v_proj.weight": "model-00005-of-00005.safetensors", "model.layers.26.self_attn.v_proj.weight": "model-00005-of-00005.safetensors", "model.layers.27.self_attn.v_proj.weight": "model-00005-of-00005.safetensors", "model.layers.28.self_attn.v_proj.weight": "model-00005-of-00005.safetensors", "model.layers.29.self_attn.v_proj.weight": "model-00005-of-00005.safetensors", "model.layers.30.self_attn.v_proj.weight": "model-00005-of-00005.safetensors", "model.layers.31.self_attn.v_proj.weight": "model-00005-of-00005.safetensors", "model.layers.0.self_attn.o_proj.weight": "model-00005-of-00005.safetensors", "model.layers.1.self_attn.o_proj.weight": "model-00005-of-00005.safetensors", "model.layers.2.self_attn.o_proj.weight": "model-00005-of-00005.safetensors", "model.layers.3.self_attn.o_proj.weight": "model-00005-of-00005.safetensors", "model.layers.4.self_attn.o_proj.weight": "model-00005-of-00005.safetensors", "model.layers.5.self_attn.o_proj.weight": "model-00005-of-00005.safetensors", "model.layers.6.self_attn.o_proj.weight": "model-00005-of-00005.safetensors", "model.layers.7.self_attn.o_proj.weight": "model-00005-of-00005.safetensors", "model.layers.8.self_attn.o_proj.weight": "model-00005-of-00005.safetensors", "model.layers.9.self_attn.o_proj.weight": "model-00005-of-00005.safetensors", "model.layers.10.self_attn.o_proj.weight": "model-00005-of-00005.safetensors", "model.layers.11.self_attn.o_proj.weight": "model-00005-of-00005.safetensors", "model.layers.12.self_attn.o_proj.weight": "model-00005-of-00005.safetensors", "model.layers.13.self_attn.o_proj.weight": 
"model-00005-of-00005.safetensors", "model.layers.14.self_attn.o_proj.weight": "model-00005-of-00005.safetensors", "model.layers.15.self_attn.o_proj.weight": "model-00005-of-00005.safetensors", "model.layers.16.self_attn.o_proj.weight": "model-00005-of-00005.safetensors", "model.layers.17.self_attn.o_proj.weight": "model-00005-of-00005.safetensors", "model.layers.18.self_attn.o_proj.weight": "model-00005-of-00005.safetensors", "model.layers.19.self_attn.o_proj.weight": "model-00005-of-00005.safetensors", "model.layers.20.self_attn.o_proj.weight": "model-00005-of-00005.safetensors", "model.layers.21.self_attn.o_proj.weight": "model-00005-of-00005.safetensors", "model.layers.22.self_attn.o_proj.weight": "model-00005-of-00005.safetensors", "model.layers.23.self_attn.o_proj.weight": "model-00005-of-00005.safetensors", "model.layers.24.self_attn.o_proj.weight": "model-00005-of-00005.safetensors", "model.layers.25.self_attn.o_proj.weight": "model-00005-of-00005.safetensors", "model.layers.26.self_attn.o_proj.weight": "model-00005-of-00005.safetensors", "model.layers.27.self_attn.o_proj.weight": "model-00005-of-00005.safetensors", "model.layers.28.self_attn.o_proj.weight": "model-00005-of-00005.safetensors", "model.layers.29.self_attn.o_proj.weight": "model-00005-of-00005.safetensors", "model.layers.30.self_attn.o_proj.weight": "model-00005-of-00005.safetensors", "model.layers.31.self_attn.o_proj.weight": "model-00005-of-00005.safetensors", "model.layers.0.block_sparse_moe.gate.weight": "model-00005-of-00005.safetensors", "model.layers.1.block_sparse_moe.gate.weight": "model-00005-of-00005.safetensors", "model.layers.2.block_sparse_moe.gate.weight": "model-00005-of-00005.safetensors", "model.layers.3.block_sparse_moe.gate.weight": "model-00005-of-00005.safetensors", "model.layers.4.block_sparse_moe.gate.weight": "model-00005-of-00005.safetensors", "model.layers.5.block_sparse_moe.gate.weight": "model-00005-of-00005.safetensors", 
"model.layers.6.block_sparse_moe.gate.weight": "model-00005-of-00005.safetensors", "model.layers.7.block_sparse_moe.gate.weight": "model-00005-of-00005.safetensors", "model.layers.8.block_sparse_moe.gate.weight": "model-00005-of-00005.safetensors", "model.layers.9.block_sparse_moe.gate.weight": "model-00005-of-00005.safetensors", "model.layers.10.block_sparse_moe.gate.weight": "model-00005-of-00005.safetensors", "model.layers.11.block_sparse_moe.gate.weight": "model-00005-of-00005.safetensors", "model.layers.12.block_sparse_moe.gate.weight": "model-00005-of-00005.safetensors", "model.layers.13.block_sparse_moe.gate.weight": "model-00005-of-00005.safetensors", "model.layers.14.block_sparse_moe.gate.weight": "model-00005-of-00005.safetensors", "model.layers.15.block_sparse_moe.gate.weight": "model-00005-of-00005.safetensors", "model.layers.16.block_sparse_moe.gate.weight": "model-00005-of-00005.safetensors", "model.layers.17.block_sparse_moe.gate.weight": "model-00005-of-00005.safetensors", "model.layers.18.block_sparse_moe.gate.weight": "model-00005-of-00005.safetensors", "model.layers.19.block_sparse_moe.gate.weight": "model-00005-of-00005.safetensors", "model.layers.20.block_sparse_moe.gate.weight": "model-00005-of-00005.safetensors", "model.layers.21.block_sparse_moe.gate.weight": "model-00005-of-00005.safetensors", "model.layers.22.block_sparse_moe.gate.weight": "model-00005-of-00005.safetensors", "model.layers.23.block_sparse_moe.gate.weight": "model-00005-of-00005.safetensors", "model.layers.24.block_sparse_moe.gate.weight": "model-00005-of-00005.safetensors", "model.layers.25.block_sparse_moe.gate.weight": "model-00005-of-00005.safetensors", "model.layers.26.block_sparse_moe.gate.weight": "model-00005-of-00005.safetensors", "model.layers.27.block_sparse_moe.gate.weight": "model-00005-of-00005.safetensors", "model.layers.28.block_sparse_moe.gate.weight": "model-00005-of-00005.safetensors", "model.layers.29.block_sparse_moe.gate.weight": 
"model-00005-of-00005.safetensors", "model.layers.30.block_sparse_moe.gate.weight": "model-00005-of-00005.safetensors", "model.layers.31.block_sparse_moe.gate.weight": "model-00005-of-00005.safetensors"}}
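The `weight_map` in `model.safetensors.index.json` maps every tensor name to the shard file that stores it. A minimal sketch (using a small hand-picked excerpt of the map above, not the full index) of how a loader can group tensors by shard:

```python
from collections import Counter

def tensors_per_shard(weight_map):
    """Count how many tensor entries point at each shard file."""
    return Counter(weight_map.values())

# A small excerpt of the weight_map above (tensor name -> shard file);
# the full index maps every tensor in the model the same way.
weight_map = {
    "model.layers.31.block_sparse_moe.gate.weight": "model-00005-of-00005.safetensors",
    "model.layers.31.self_attn.o_proj.weight": "model-00005-of-00005.safetensors",
    "model.layers.30.block_sparse_moe.experts.2.w2.weight": "model-00004-of-00005.safetensors",
}
counts = tensors_per_shard(weight_map)
```

In practice a loader opens each shard file once and reads only the tensors the map assigns to it, which is why the index file exists at all.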
output-00001-of-00002.safetensors
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:249771ad1d6896d071e491598fe2a2b943a293c97463081d99ed7a170bd5f3a9
+size 8580635952
output-00002-of-00002.safetensors
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:821fe254e1631810abb61334ccff665faca13fc464da43ed291d2f11b56d3951
+size 759854712
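The two `output-*.safetensors` entries above are Git LFS pointer files: three `key value` lines (`version`, `oid`, `size`) standing in for the real binary. A minimal sketch of parsing such a pointer into its fields:

```python
def parse_lfs_pointer(text):
    """Parse a git-lfs pointer file (version / oid / size lines) into a dict."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

# The pointer content of output-00001-of-00002.safetensors, as shown above.
pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:249771ad1d6896d071e491598fe2a2b943a293c97463081d99ed7a170bd5f3a9
size 8580635952"""
info = parse_lfs_pointer(pointer)
```

After downloading the real object, a client can recompute its SHA-256 and compare it against `info["oid"]` to verify integrity.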
special_tokens_map.json
ADDED
@@ -0,0 +1,24 @@
+{
+  "bos_token": {
+    "content": "<s>",
+    "lstrip": false,
+    "normalized": false,
+    "rstrip": false,
+    "single_word": false
+  },
+  "eos_token": {
+    "content": "</s>",
+    "lstrip": false,
+    "normalized": false,
+    "rstrip": false,
+    "single_word": false
+  },
+  "pad_token": "<s>",
+  "unk_token": {
+    "content": "<unk>",
+    "lstrip": false,
+    "normalized": false,
+    "rstrip": false,
+    "single_word": false
+  }
+}
tokenizer.json
ADDED
The diff for this file is too large to render. See raw diff.
tokenizer.model
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:dadfd56d766715c61d2ef780a525ab43b8e6da4de6865bda3d95fdef5e134055
+size 493443
tokenizer_config.json
ADDED
@@ -0,0 +1,42 @@
+{
+  "add_bos_token": true,
+  "add_eos_token": false,
+  "added_tokens_decoder": {
+    "0": {
+      "content": "<unk>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "1": {
+      "content": "<s>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    },
+    "2": {
+      "content": "</s>",
+      "lstrip": false,
+      "normalized": false,
+      "rstrip": false,
+      "single_word": false,
+      "special": true
+    }
+  },
+  "additional_special_tokens": [],
+  "bos_token": "<s>",
+  "clean_up_tokenization_spaces": false,
+  "eos_token": "</s>",
+  "legacy": true,
+  "model_max_length": 1000000000000000019884624838656,
+  "pad_token": "<s>",
+  "sp_model_kwargs": {},
+  "spaces_between_special_tokens": false,
+  "tokenizer_class": "LlamaTokenizer",
+  "unk_token": "<unk>",
+  "use_default_system_prompt": false
+}
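The tokenizer config above sets `add_bos_token: true` and `add_eos_token: false`, with `<s>` at id 1 and `</s>` at id 2 in `added_tokens_decoder`. A minimal sketch (not the actual `LlamaTokenizer` implementation) of what those two flags do to an encoded sequence:

```python
def apply_special_tokens(token_ids, add_bos=True, add_eos=False,
                         bos_id=1, eos_id=2):
    """Mimic add_bos_token/add_eos_token from tokenizer_config.json:
    prepend <s> (id 1) and optionally append </s> (id 2)."""
    ids = list(token_ids)
    if add_bos:
        ids = [bos_id] + ids
    if add_eos:
        ids = ids + [eos_id]
    return ids
```

With this config, encoding prepends `<s>` but never appends `</s>`; the model is expected to emit the EOS token itself during generation. Note also that `pad_token` is reused as `<s>` rather than being a dedicated token.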