DEFAULT_stage:
  DEFAULT_modifiers:
    GPTQModifier:
      # Calibrate and quantize one decoder block at a time to bound memory use.
      sequential_targets: [MistralDecoderLayer]
      # 4-bit weights, 16-bit activations.
      scheme: W4A16
      # Apply GPTQ to all Linear modules ...
      targets: Linear
      # ... except the LM head and the vision-side modules, which stay unquantized.
      ignore: ['re:.*lm_head', 're:vision_tower.*', 're:multi_modal_projector.*']
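
# A minimal usage sketch (kept as comments so this file remains a valid recipe).
# It assumes llm-compressor's `oneshot` entry point; the model id, dataset name,
# and output directory below are illustrative placeholders, not taken from this repo:
#
#   from llmcompressor import oneshot
#
#   oneshot(
#       model="<base-model-id-or-path>",   # hypothetical: the unquantized checkpoint
#       recipe="recipe.yaml",              # this recipe file
#       dataset="open_platypus",           # example calibration dataset
#       num_calibration_samples=512,
#       max_seq_length=2048,
#       output_dir="<output-dir>",
#   )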