# merge

This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).
## Merge Details

### Merge Method

This model was merged using the passthrough merge method.
### Models Merged

The following models were included in the merge:
* Aculi/Little-Bitch-1.1B
### Configuration

The following YAML configuration was used to produce this model:
```yaml
dtype: bfloat16
merge_method: passthrough
slices:
- sources:
  - layer_range: [0, 16]
    model: Aculi/Little-Bitch-1.1B
- sources:
  - layer_range: [6, 16]
    model: Aculi/Little-Bitch-1.1B
    parameters:
      scale:
      - filter: 'o_proj'
        value: 0.0
      - filter: 'down_proj'
        value: 0.0
      - value: 1.0
- sources:
  - layer_range: [6, 16]
    model: Aculi/Little-Bitch-1.1B
    parameters:
      scale:
      - filter: 'o_proj'
        value: 0.0
      - filter: 'down_proj'
        value: 0.0
      - value: 1.0
- sources:
  - layer_range: [6, 16]
    model: Aculi/Little-Bitch-1.1B
    parameters:
      scale:
      - filter: 'o_proj'
        value: 0.0
      - filter: 'down_proj'
        value: 0.0
      - value: 1.0
- sources:
  - layer_range: [16, 22]
    model: Aculi/Little-Bitch-1.1B
```
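For intuition: the slice list repeats layers 6-15 of the 22-layer base model three extra times, and since mergekit layer ranges are half-open `[start, end)`, the merged model ends up with 52 layers. Setting the `scale` for `o_proj` and `down_proj` to `0.0` in the duplicated slices zeroes the output projections of the attention and MLP branches, so each duplicated block initially acts as an identity on the residual stream. A minimal sketch of both points, using a toy residual block (the projection shapes and branch functions here are illustrative stand-ins, not the real transformer math):

```python
import numpy as np

# Depth of the merged model: mergekit layer ranges are half-open [start, end).
slices = [(0, 16), (6, 16), (6, 16), (6, 16), (16, 22)]
total_layers = sum(end - start for start, end in slices)
print(total_layers)  # 16 + 10 + 10 + 10 + 6 = 52

def block(x, o_proj, down_proj, scale_o, scale_down):
    """Toy residual block: branch outputs are scaled before being added back."""
    attn_out = np.tanh(x)        # stand-in for the attention branch
    mlp_out = np.tanh(2.0 * x)   # stand-in for the MLP branch
    return x + scale_o * (attn_out @ o_proj) + scale_down * (mlp_out @ down_proj)

rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
o_proj = rng.normal(size=(8, 8))
down_proj = rng.normal(size=(8, 8))

# With both scales at 0.0, as in the duplicated slices, the block is a no-op.
assert np.allclose(block(x, o_proj, down_proj, 0.0, 0.0), x)
```

The `value: 1.0` entry is the fallback scale, leaving all other weights in the duplicated layers untouched.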
## Model tree for Fischerboot/thisisa3.3llama

Base model: Aculi/Little-Bitch-1.1B