---
license: apache-2.0
language:
- ar
metrics:
- accuracy
library_name: transformers
---
# Arabic Poetry Fine-Tuned Model
This model is a fine-tuned version of the GPT-2 model, specifically trained on Arabic poetry. It is designed to generate Arabic poetry and can be used for creative writing, educational purposes, or research in natural language processing.
## Try it Out
You can test the model directly here:
## Model Details
- Model Type: GPT-2
- Language: Arabic
- License: Apache-2.0
- Author: NightPrince
## Intended Use
This model is intended for generating Arabic poetry. It can be used in applications such as:
- Creative writing tools
- Educational resources for learning Arabic poetry
- Research in natural language processing and generation
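For any of these uses, generation can be sketched with the `transformers` `pipeline` API. The repository id `"NightPrince/arabic-poetry-gpt2"` below is a placeholder, and the sampling settings are illustrative defaults rather than the author's; substitute the actual model id from this page.

```python
# Sketch of generating Arabic poetry with the fine-tuned model.
# Sampling settings are illustrative, not the author's configuration.
GEN_KWARGS = {
    "max_new_tokens": 64,
    "do_sample": True,
    "top_p": 0.9,
    "temperature": 0.8,
}

def generate_poetry(prompt: str,
                    model_id: str = "NightPrince/arabic-poetry-gpt2") -> str:
    """Generate a continuation of an Arabic verse (model id is a placeholder)."""
    from transformers import pipeline  # imported lazily to keep the module light
    generator = pipeline("text-generation", model=model_id)
    return generator(prompt, **GEN_KWARGS)[0]["generated_text"]

# Example usage:
#   print(generate_poetry("يا ليل الصب متى غده"))
```

Top-p sampling with a moderate temperature tends to suit poetry better than greedy decoding, which often loops on short repeated phrases.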
## Training Data
The model was fine-tuned on a dataset of Arabic poetry. The dataset includes works from various poets and covers a range of styles and themes.
## Training Procedure
- Framework: PyTorch
- Hardware: Trained on a GPU
- Epochs: 5
- Batch Size: not documented
- Learning Rate: not documented
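The procedure above can be sketched with the `transformers` `Trainer` API. Only the framework (PyTorch via `transformers`) and the epoch count (5) come from this card; the batch size and learning rate below are placeholder values, since the card leaves them unspecified.

```python
# Sketch of the fine-tuning setup. Hyperparameters marked "assumed"
# are placeholders, not the author's documented values.
HPARAMS = {
    "num_train_epochs": 5,             # from this card
    "per_device_train_batch_size": 8,  # assumed, not documented
    "learning_rate": 5e-5,             # transformers' default, assumed
}

def fine_tune(train_dataset,
              model_name: str = "gpt2",
              output_dir: str = "arabic-poetry-gpt2"):
    """Fine-tune GPT-2 on a tokenized Arabic poetry dataset."""
    from transformers import (AutoModelForCausalLM, AutoTokenizer,
                              DataCollatorForLanguageModeling, Trainer,
                              TrainingArguments)
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token
    model = AutoModelForCausalLM.from_pretrained(model_name)
    args = TrainingArguments(output_dir=output_dir, **HPARAMS)
    trainer = Trainer(
        model=model,
        args=args,
        train_dataset=train_dataset,
        # mlm=False gives standard causal (next-token) language modeling
        data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
    )
    trainer.train()
    trainer.save_model(output_dir)
```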
## Evaluation
The model was evaluated qualitatively on its ability to generate coherent and stylistically appropriate poetry. Training converged to a final loss of approximately 2.67.
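To put the loss in perspective: assuming it is the standard cross-entropy loss in nats (as PyTorch reports for causal language modeling), it corresponds to a training perplexity of exp(2.67) ≈ 14.4:

```python
import math

# Cross-entropy loss (nats) -> perplexity: the model is on average about
# as uncertain as a uniform choice over ~14 tokens at each step.
train_loss = 2.67
perplexity = math.exp(train_loss)
print(round(perplexity, 2))  # → 14.44
```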
## Limitations and Biases
As with any language model, this model may generate biased or inappropriate content. Users should be aware of these limitations and use the model responsibly.
## Acknowledgements
This model was developed by NightPrince and is hosted on Hugging Face. Special thanks to the creators of the original GPT-2 model and the Hugging Face team for their support.
## Contact
For questions or feedback, please contact NightPrince via Hugging Face.