---
datasets:
  - wikipedia
  - Ichsan2895/OASST_Top1_Indonesian
  - Ichsan2895/alpaca-gpt4-indonesian
language:
  - id
  - en
pipeline_tag: text-generation
license: cc-by-nc-sa-4.0
---
# MERAK

HAPPY TO ANNOUNCE THE RELEASE OF MERAK-7B-V4_4bit_q128_awq!

Merak-7B is a large language model for the Indonesian language.

This model is based on Mistral-7B-OpenOrca and fine-tuned on Indonesian Wikipedia articles that I cleaned beforehand.

Leveraging QLoRA (QLoRA: Efficient Finetuning of Quantized LLMs), Merak-7B is able to run on 16 GB of VRAM.
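A minimal inference sketch with the `transformers` library is shown below. The repo ID is inferred from the release name (verify the exact casing on the Hub), and the plain `USER:`/`ASSISTANT:` prompt layout is an assumption; check the tokenizer's chat template for the real format.

```python
# Assumed repo ID, derived from the release name; confirm it on the Hub.
MODEL_ID = "Ichsan2895/Merak-7B-v4_4bit_q128_AWQ"

def build_prompt(user_message: str) -> str:
    """Wrap a user message in an assumed chat-style prompt layout."""
    return f"USER: {user_message}\nASSISTANT:"

def generate(user_message: str, max_new_tokens: int = 128) -> str:
    # Imported here so build_prompt stays usable without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(build_prompt(user_message), return_tensors="pt").to(model.device)
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

# Example usage (requires a GPU and downloads the weights):
# print(generate("Apa ibu kota Indonesia?"))
```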

Licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 license (CC BY-NC-SA 4.0), Merak-7B empowers AI enthusiasts and researchers alike.

Big thanks to all my friends and the communities that helped build our first model. Thanks to Axolotl, a great fine-tuning tool designed to streamline the fine-tuning of various AI models.

Feel free to ask me about the model, and please share the news on your social media.