---
base_model: mistralai/Mistral-7B-v0.1
---

# Model Card for CodeExplainer-7b-v0.1

CodeExplainer-7b-v0.1 explains Python code in plain language.

## Model Details

- **Trained by:** AllStax Technologies
- **Model type:** CodeExplainer-7b-v0.1 is a language model based on mistralai/Mistral-7B-v0.1.
- **Language(s):** English
- **Training data:** Fine-tuned on data generated by GPT-3.5 and other models.

## Prompting

Prompt template (alpaca style):

```
### Instruction:

<prompt>

### Response:
```
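As a minimal sketch, a small helper can fill this template before tokenization. The function name and the instruction wording are illustrative, not part of this repository:

```python
def build_prompt(code_snippet: str) -> str:
    # Wrap a Python snippet in the alpaca-style template shown above.
    # The instruction wording is an assumption; adjust to your use case.
    return (
        "### Instruction:\n\n"
        f"Explain the following Python code in plain language:\n{code_snippet}\n\n"
        "### Response:\n"
    )

prompt = build_prompt("print(sum(range(10)))")
```

The resulting string can be passed directly to the tokenizer.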

## Loading the model

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "allstax/CodeExplainer-7b-v0.1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
quant_model = AutoModelForCausalLM.from_pretrained(model_id, device_map='auto')
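After generating with the loaded model (e.g. `quant_model.generate(...)` followed by `tokenizer.decode(...)`), the explanation sits after the `### Response:` marker in the decoded text. A minimal parsing sketch, assuming the model echoes the template; the sample decoded string below is illustrative, not real model output:

```python
def extract_explanation(decoded: str) -> str:
    # Return the text after the last "### Response:" marker, if present.
    marker = "### Response:"
    if marker in decoded:
        return decoded.split(marker)[-1].strip()
    return decoded.strip()

# Illustrative decoded output (not a real model sample):
decoded = "### Instruction:\n\nprint(1+1)\n\n### Response:\nThis prints 2."
print(extract_explanation(decoded))  # This prints 2.
```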