A model jointly trained and fine-tuned on the Quran, Saheefa, and Nahj al-Balagha. All datasets are available here. Code will be available soon ...

Some examples of mask filling (a usage sketch follows the examples):

- ```
  ذَلِكَ [MASK] لَا رَيْبَ فِيهِ هُدًى لِلْمُتَّقِينَ
  ```

- ```
  يَا أَيُّهَا النَّاسُ اعْبُدُوا رَبَّكُمُ الَّذِي خَلَقَكُمْ وَالَّذِينَ مِنْ قَبْلِكُمْ لَعَلَّكُمْ [MASK]
  ```

This model is fine-tuned from Bert Base Arabic for 30 epochs using masked language modeling. In addition, after every 5 epochs we re-masked the text with a fresh random mask, so the model learns robust embeddings instead of overfitting to a fixed set of masked positions.
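A hedged sketch of that setup: the base checkpoint id, data file, and hyperparameters below are illustrative assumptions, and the "re-mask every 5 epochs" idea is approximated by training in 5-epoch blocks with different random seeds (the collator already draws a fresh mask per batch).

```python
# Illustrative MLM fine-tuning sketch; paths and hyperparameters are assumptions.
from datasets import load_dataset
from transformers import (AutoModelForMaskedLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

base = "asafaya/bert-base-arabic"  # assumed Bert Base Arabic checkpoint
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForMaskedLM.from_pretrained(base)

# Hypothetical plain-text file containing the Quran, Saheefa, and Nahj al-Balagha.
dataset = load_dataset("text", data_files={"train": "quran_saheefa_nahj.txt"})
tokenized = dataset["train"].map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=128),
    batched=True,
    remove_columns=["text"],
)

# Randomly masks 15% of tokens on the fly for masked language modeling.
collator = DataCollatorForLanguageModeling(tokenizer, mlm=True, mlm_probability=0.15)

# Six 5-epoch blocks = 30 epochs; a new seed per block changes the masking pattern.
for block, seed in enumerate(range(6)):
    args = TrainingArguments(
        output_dir=f"mlm-block-{block}",
        num_train_epochs=5,
        per_device_train_batch_size=16,
        seed=seed,
    )
    Trainer(model=model, args=args, data_collator=collator,
            train_dataset=tokenized).train()
```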
