---
base_model: mistralai/Mistral-7B-v0.1
license: apache-2.0
---
# Code Explainer
The model does its best to explain Python code in plain language.
# Model Details
- Trained by: AllStax Technologies
- Model type: CodeExplainer-7b-v0.1 is a language model based on mistralai/Mistral-7B-v0.1.
- Language(s): English
- Fine-tuning data: generated by GPT-3.5 and other models.
# Prompting
Prompt template (Alpaca style):
```
### Instruction:
<prompt>
### Response:
```
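As a minimal sketch, the template above can be filled in with an f-string before tokenization. The `build_prompt` helper below is illustrative and not part of the model's API:

```python
def build_prompt(instruction: str) -> str:
    # Wrap the user's request in the Alpaca-style template shown above.
    return f"### Instruction:\n{instruction}\n### Response:\n"

prompt = build_prompt("Explain what this code does:\n\nprint(sum(range(10)))")
print(prompt)
```

The model's completion is then expected after the `### Response:` marker.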
# Loading the model
```
from transformers import AutoModelForCausalLM, AutoTokenizer
model_id = "allstax/CodeExplainer-7b-v0.1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map='auto')
```