
Models Merged

The following models were included in the merge:

- LeroyDyer/Mixtral_AI_Cyber_3.m2
- LeroyDyer/Mixtral_AI_Cyber_2.0
- LeroyDyer/Mixtral_AI_Cyber_3.0
- LeroyDyer/Mixtral_AI_Cyber_3.m1
- LeroyDyer/Mixtral_AI_Cyber_1.0
- LeroyDyer/Mixtral_AI_Cyber_3.1_SFT

A merge of my best models! It combines the deltas from all of these heavily trained models, each of which was a strong coder.

Configuration

The following YAML configuration was used to produce this model:


```yaml
models:
  - model: LeroyDyer/Mixtral_AI_Cyber_3.m2
    parameters:
      density: [0.256, 0.512, 0.128] # density gradient
      weight: 0.382
  - model: LeroyDyer/Mixtral_AI_Cyber_2.0
    parameters:
      density: 0.382
      weight: [0.256, 0.128, 0.256, 0.128] # weight gradient
  - model: LeroyDyer/Mixtral_AI_Cyber_3.0
    parameters:
      density: 0.382
      weight: [0.128, 0.512, 0.128, 0.128] # weight gradient
  - model: LeroyDyer/Mixtral_AI_Cyber_3.m1
    parameters:
      density: 0.382
      weight: [0.256, 0.256, 0.512, 0.128] # weight gradient
  - model: LeroyDyer/Mixtral_AI_Cyber_1.0
    parameters:
      density: 0.382
      weight: [0.128, 0.512, 0.128, 0.128] # weight gradient
  - model: LeroyDyer/Mixtral_AI_Cyber_3.1_SFT
    parameters:
      density: 0.382
      weight:
        - filter: mlp
          value: 0.5
        - value: 0
merge_method: ties
base_model: LeroyDyer/Mixtral_AI_Cyber_3.m2
parameters:
  normalize: true
  int8_mask: true
dtype: float16
```
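The `ties` merge method referenced above works roughly as follows: each model's delta from the base is sparsified to its top-`density` fraction by magnitude, a sign is elected per parameter, and the entries agreeing with that sign are summed with the given `weight`s. The following is a minimal single-tensor sketch of that idea, not mergekit's actual implementation; the function names `trim` and `ties_merge` are illustrative, and weight normalization (the `normalize: true` option) is omitted for brevity:

```python
import numpy as np

def trim(delta, density):
    """Keep only the top-`density` fraction of entries by magnitude; zero the rest."""
    k = max(1, int(round(density * delta.size)))
    thresh = np.sort(np.abs(delta).ravel())[-k]  # k-th largest magnitude
    return np.where(np.abs(delta) >= thresh, delta, 0.0)

def ties_merge(base, task_tensors, densities, weights):
    """Sketch of TIES: trim each delta, elect a majority sign, sum agreeing entries."""
    deltas = [trim(t - base, d) * w
              for t, d, w in zip(task_tensors, densities, weights)]
    stacked = np.stack(deltas)
    # Sign election: per-parameter sign of the summed weighted deltas.
    elected = np.sign(stacked.sum(axis=0))
    # Zero out entries whose sign disagrees with the elected sign, then sum.
    agree = np.where(np.sign(stacked) == elected, stacked, 0.0)
    return base + agree.sum(axis=0)
```

With mergekit installed, a configuration file like the one above is typically run with its `mergekit-yaml` command (e.g. `mergekit-yaml config.yml ./merged-model`); check the mergekit documentation for the exact flags your version supports.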
Model size: 7.24B params
Tensor type: FP16 (Safetensors)
Model tree for LeroyDyer/Mixtral_AI_Cyber_4.0

Base model: liminerity/M7-7b