---
license: apache-2.0
inference: false
tags:
  - green
  - p7
  - llmware-chat
  - ov
---

# mathstral-7b-ov

**mathstral-7b-ov** is an OpenVINO int4 quantized version of Mathstral, providing a fast inference implementation optimized for AI PCs using Intel GPU, CPU and NPU.

**mathstral-7b-ov** is an instruct-trained model from Mistral with strong capabilities for answering math-oriented problems and questions.
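
A minimal usage sketch is below, assuming the OpenVINO model files have been downloaded into a local `mathstral-7b-ov` directory and that the `openvino-genai` package is installed; the llmware `ModelCatalog` API is an alternative loading path. The Mistral-style `[INST] ... [/INST]` prompt template is an assumption here, carried over from the parent model's instruct format.

```python
import os

def build_prompt(question: str) -> str:
    # Mistral-style instruct template (assumption: Mathstral follows the parent format)
    return f"[INST] {question} [/INST]"

model_dir = "mathstral-7b-ov"  # local directory containing the OpenVINO IR files

# Only attempt to load the pipeline if the model files are actually present
if os.path.isdir(model_dir):
    import openvino_genai as ov_genai

    # Device can be "CPU", "GPU", or "NPU" depending on the AI PC hardware available
    pipe = ov_genai.LLMPipeline(model_dir, "CPU")
    answer = pipe.generate(build_prompt("What is the derivative of x^2?"),
                           max_new_tokens=200)
    print(answer)
```

On an AI PC, switching the device string to `"GPU"` or `"NPU"` offloads generation to the Intel accelerator, which is the intended deployment target for this quantized build.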

## Model Description

- **Developed by:** mistral
- **Quantized by:** llmware
- **Model type:** mistral-7b
- **Parameters:** 7 billion
- **Model Parent:** mistralai/Mathstral-7B-v0.1
- **Language(s) (NLP):** English
- **License:** Apache 2.0
- **Uses:** Chat-oriented math problems
- **RAG Benchmark Accuracy Score:** NA
- **Quantization:** int4

## Model Card Contact

- llmware on github
- llmware on hf
- llmware website