Danish-Swedish Merged Model
This is a merge of the following models, both based on mistralai/Mistral-7B-v0.1:

- danish-foundation-models/munin-7b-alpha, continually pretrained on Danish data
- timpal0l/Mistral-7B-v0.1-flashback-v2, continually pretrained on Swedish data
Model Details
- Merged by: Dan Saattrup Nielsen
- Model type: Decoder model, based on mistralai/Mistral-7B-v0.1
- Language(s): Danish and Swedish
- License: CC-BY-4.0
- Merge configuration:

```python
dict(
    models=[
        dict(
            model="danish-foundation-models/munin-7b-alpha",
            parameters=dict(weight=1.0),
        ),
        dict(
            model="timpal0l/Mistral-7B-v0.1-flashback-v2",
            parameters=dict(weight=1.0),
        ),
    ],
    merge_method="task_arithmetic",
    base_model="mistralai/Mistral-7B-v0.1",
    parameters=dict(
        int8_mask=True,
        normalize=True,
    ),
    dtype="bfloat16",
)
```
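The task_arithmetic merge method treats each fine-tuned model as a "task vector": its parameter delta from the shared base model. The merge adds the weighted deltas back onto the base. The sketch below illustrates the core idea on raw state dicts; it is a minimal illustration, not mergekit's actual implementation (which additionally applies the int8_mask option above), and the function name is hypothetical. The normalize flag is modeled here under the assumption that it rescales the weights to sum to one, so the two weight-1.0 models above would each contribute half of their delta.

```python
import torch


def task_arithmetic_merge(
    base_sd: dict[str, torch.Tensor],
    finetuned_sds: list[dict[str, torch.Tensor]],
    weights: list[float],
    normalize: bool = True,
) -> dict[str, torch.Tensor]:
    """Minimal sketch: merged = base + sum_i w_i * (finetuned_i - base)."""
    if normalize:
        # Assumed behavior of normalize=True: rescale weights to sum to one.
        total = sum(weights)
        weights = [w / total for w in weights]
    merged = {}
    for name, base_param in base_sd.items():
        # Weighted sum of the task vectors (deltas from the base model).
        delta = sum(
            w * (sd[name].float() - base_param.float())
            for sd, w in zip(finetuned_sds, weights)
        )
        # Store the result in bfloat16, matching the merge dtype above.
        merged[name] = (base_param.float() + delta).to(torch.bfloat16)
    return merged
```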
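Usage

The merged model can be loaded like any other causal language model from the Hugging Face Hub. A minimal sketch using the standard transformers API and the repository id merge-crew/da-sv-task-arithmetic; the prompt is just an illustrative example:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "merge-crew/da-sv-task-arithmetic"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the merge dtype
    device_map="auto",
)

# This is a base (non-instruct) model, so it continues text rather than
# following instructions. Danish example prompt:
prompt = "København er"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```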