# Llama-2-7b-WuKurtz (Wu, Koo, Black, Blum, Scalzo, Kurtz)
## Model Description
Llama-2-7b-WuKurtz is a state-of-the-art language model developed by Wu, Koo, Black, Blum, Scalzo, and Kurtz. It has been fine-tuned on our synthesized dataset of 80,000 training examples focused on nephrology.
This model is part of our paper *Boosting Open-Sourced Large Language Models with Proprietary Imitation Learning* [released soon!].
## Training Data
The model was trained on a carefully curated and preprocessed synthesized nephrology dataset. Its 80,000 examples span imitation-learning data distilled from proprietary LLMs, proprietary data, and lecture material, giving the model broad coverage of nephrology.
## Model Performance
Detailed performance metrics will be updated soon!
## Usage
You can use this model for a variety of NLP tasks, including but not limited to text generation, text classification, sentiment analysis, and named entity recognition.
```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the fine-tuned tokenizer and model from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained("SeanWu25/llama-2-7b-WuKurtz")
model = AutoModelForCausalLM.from_pretrained("SeanWu25/llama-2-7b-WuKurtz")
```
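
Once the model and tokenizer are loaded, generation follows the standard `transformers` causal-LM pattern. A minimal sketch is shown below; the prompt and decoding parameters (`max_new_tokens`, `temperature`) are illustrative assumptions, not settings from the paper.

```python
# Illustrative prompt; replace with your own nephrology question.
prompt = "What are the most common causes of chronic kidney disease?"

# Tokenize the prompt and generate a completion.
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```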