Ritik009999 committed
Commit da71f8c
1 Parent(s): bd53fa0

Update README.md

Files changed (1): README.md (+1, -28)
README.md CHANGED
@@ -1,9 +1,6 @@
 ---
 license: apache-2.0
 tags:
-- merge
-- mergekit
-- lazymergekit
 - meta-math/MetaMath-Mistral-7B
 - mistralai/Mistral-7B-v0.1
 - EmbeddedLLM/Mistral-7B-Merge-14-v0.2
@@ -11,33 +8,9 @@ tags:
 
 # mathexpert7b
 
-mathexpert7b is a merge of the following models using [mergekit](https://github.com/cg123/mergekit):
+mathexpert7b is a merge of the following models, further aligned with Direct Preference Optimization (DPO):
 * [meta-math/MetaMath-Mistral-7B](https://huggingface.co/meta-math/MetaMath-Mistral-7B)
 * [mistralai/Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1)
 * [EmbeddedLLM/Mistral-7B-Merge-14-v0.2](https://huggingface.co/EmbeddedLLM/Mistral-7B-Merge-14-v0.2)
 
-## 🧩 Configuration
-
-```yaml
-models:
-  - model: mistralai/Mistral-7B-v0.1
-    # No parameters necessary for base model
-  - model: meta-math/MetaMath-Mistral-7B
-    parameters:
-      density: 0.53
-      weight: 0.4
-  - model: mistralai/Mistral-7B-v0.1
-    parameters:
-      density: 0.53
-      weight: 0.3
-  - model: EmbeddedLLM/Mistral-7B-Merge-14-v0.2
-    parameters:
-      density: 0.53
-      weight: 0.3
-merge_method: dare_ties
-base_model: mistralai/Mistral-7B-v0.1
-parameters:
-  int8_mask: true
-dtype: bfloat16
-
 ```
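The removed configuration is a standard mergekit `dare_ties` recipe: each expert's delta from the `mistralai/Mistral-7B-v0.1` base is randomly pruned to `density: 0.53`, the survivors are rescaled, and the sign-resolved deltas are blended with the listed weights. A minimal sketch of reproducing that merge, assuming mergekit's documented Python API (`MergeConfiguration.model_validate` / `run_merge`); the output path is a placeholder:

```python
# Minimal sketch, not the author's exact invocation. Assumes
# `pip install mergekit` and the Python API shown in mergekit's README;
# the CLI equivalent is `mergekit-yaml config.yaml ./mathexpert7b`.
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

CONFIG = """\
models:
  - model: mistralai/Mistral-7B-v0.1
    # No parameters necessary for base model
  - model: meta-math/MetaMath-Mistral-7B
    parameters:
      density: 0.53   # keep ~53% of delta weights (DARE pruning)
      weight: 0.4     # blend weight in the final merge
  - model: mistralai/Mistral-7B-v0.1
    parameters:
      density: 0.53
      weight: 0.3
  - model: EmbeddedLLM/Mistral-7B-Merge-14-v0.2
    parameters:
      density: 0.53
      weight: 0.3
merge_method: dare_ties
base_model: mistralai/Mistral-7B-v0.1
parameters:
  int8_mask: true
dtype: bfloat16
"""

merge_config = MergeConfiguration.model_validate(yaml.safe_load(CONFIG))
run_merge(
    merge_config,
    out_path="./mathexpert7b",             # output directory is a placeholder
    options=MergeOptions(copy_tokenizer=True),
)
```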
 
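The new card text attributes extra alignment to DPO but records no training details. As a minimal sketch of what such a step can look like, assuming TRL's `DPOTrainer` (recent versions take `processing_class=`; older ones take `tokenizer=`) and a toy `prompt`/`chosen`/`rejected` preference set; the dataset, hyperparameters, and paths below are placeholders, not from the model card:

```python
# Minimal DPO sketch, not the author's actual training script.
# Assumes `pip install trl` and a merged checkpoint at ./mathexpert7b.
from datasets import Dataset
from transformers import AutoModelForCausalLM, AutoTokenizer
from trl import DPOConfig, DPOTrainer

model_path = "./mathexpert7b"  # placeholder: the merged model from the step above
model = AutoModelForCausalLM.from_pretrained(model_path)
tokenizer = AutoTokenizer.from_pretrained(model_path)

# Toy preference pairs; a real run would use a math-focused preference dataset.
pairs = Dataset.from_dict({
    "prompt":   ["What is 15 * 13?"],
    "chosen":   ["15 * 13 = 195."],
    "rejected": ["15 * 13 = 205."],
})

args = DPOConfig(
    output_dir="mathexpert7b-dpo",
    beta=0.1,                        # KL penalty toward the frozen reference copy
    per_device_train_batch_size=1,
)

# With ref_model omitted, DPOTrainer clones `model` as the frozen reference.
trainer = DPOTrainer(model=model, args=args, train_dataset=pairs, processing_class=tokenizer)
trainer.train()
```

Here `beta` controls how strongly the policy is tied to the merged reference model: larger values keep the DPO-tuned weights closer to the merge, smaller values let the preference data pull them further away.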