Model Summary

MedPhi-2 is a Phi-2 model (2.7 billion parameters) further trained for the biomedical domain. It was proposed in the MedExQA paper.

๐Ÿง‘โ€โš•๏ธ MedExQA

Medical Question Answering Benchmark with Multiple Explanations

📄 Paper • ⬇️ Dataset • ⚕️ MedPhi2

Model Details

Model Description

  • Model type: Clinical LLM (Large Language Model)
  • Language(s) (NLP): English
  • License: MIT
  • Finetuned from model: Phi-2
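
How to use

A minimal loading sketch with the Hugging Face transformers library. The repository id below is an assumption (the exact id is not stated in this card); replace it with the published MedPhi-2 repository. Loading in BF16 matches the checkpoint's stored tensor type.

# Minimal usage sketch, assuming a hypothetical repo id; replace with the real one.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "knowlab/MedPhi-2"  # hypothetical id, not confirmed by this card

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.bfloat16,  # checkpoint is stored in BF16
    device_map="auto",           # requires the accelerate package
)

prompt = "What is the first-line treatment for community-acquired pneumonia?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))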

Citation

BibTeX:

@article{kim2024medexqa,
  title={MedExQA: Medical Question Answering Benchmark with Multiple Explanations},
  author={Kim, Yunsoo and Wu, Jinge and Abdulle, Yusuf and Wu, Honghan},
  journal={arXiv e-prints},
  pages={arXiv--2406},
  year={2024}
}