Phi-4-mm-inst-asr-turkish-3

This model is a fine-tuned version of microsoft/Phi-4-multimodal-instruct on a 1300-hour Turkish audio dataset.

Training Prompt

The model was initially fine-tuned using the original ASR prompt: "Transcribe the audio clip into text."
This prompt is language-agnostic, as described in the model paper:

The ASR prompt for Phi-4-Multimodal is “Transcribe the audio clip into text.”, which is language agnostic. We notice that the model can learn to recognize in the target language perfectly without providing language information, while Qwen2-audio and Gemini-2.0-Flash require the language information in the prompt to obtain the optimal ASR performance.

However, we found that using a language-defining prompt such as "Transcribe the Turkish audio." leads to better performance.
See: ysdede/Phi-4-mm-inst-asr-turkish
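For reference, both prompts are wrapped in the base model's chat format, where the task text follows an audio placeholder token. A minimal sketch, assuming the special tokens documented in the microsoft/Phi-4-multimodal-instruct model card:

user_tag, assistant_tag, end_tag = '<|user|>', '<|assistant|>', '<|end|>'

# Original, language-agnostic ASR prompt
generic_prompt = f'{user_tag}<|audio_1|>Transcribe the audio clip into text.{end_tag}{assistant_tag}'

# Language-defining prompt that performed better for Turkish in our runs
turkish_prompt = f'{user_tag}<|audio_1|>Transcribe the Turkish audio.{end_tag}{assistant_tag}'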

Training Results

When benchmarked with the original ASR prompt "Transcribe the audio clip into text.", the evaluation results were as follows:

  • Before fine-tuning:
    • WER: 153.84
    • CER: 82.57
  • After fine-tuning:
    • WER: 64.76
    • CER: 29.85
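The exact evaluation script is not included in this card; the snippet below is only a minimal sketch of how WER and CER can be computed with the Hugging Face evaluate library. The reference and prediction lists are placeholders; in practice they come from transcribing the evaluation split.

import evaluate

# Placeholder pairs; collect these by running inference over the evaluation split
references = ['merhaba dünya', 'bugün hava çok güzel']
predictions = ['merhaba dünya', 'bugün hava güzel']

wer = evaluate.load('wer').compute(predictions=predictions, references=references)
cer = evaluate.load('cer').compute(predictions=predictions, references=references)
print(f'WER: {wer * 100:.2f}  CER: {cer * 100:.2f}')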

Inference

Load generation_config and processor from the base model as a quick fix to use the default generation settings.

Note: The new models currently lack high-quality fine-tuning scripts. When saving a fine-tuned model using model.save_pretrained(), the processor configuration—including essential audio parameters—is not automatically saved. This omission can lead to errors during inference due to the model’s complex architecture. Loading these components from the base model ensures that all critical settings are properly included.

from transformers import AutoProcessor, GenerationConfig

# Default generation settings from the base model
generation_config = GenerationConfig.from_pretrained(
    'microsoft/Phi-4-multimodal-instruct', 'generation_config.json'
)
# Processor (including the audio feature-extraction settings) from the base model
processor = AutoProcessor.from_pretrained(
    'microsoft/Phi-4-multimodal-instruct', trust_remote_code=True
)
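
A full transcription call might then look like the sketch below, which follows the usage pattern from the base model card. The audio file name and max_new_tokens value are placeholders; processor and generation_config are the objects loaded above.

import soundfile as sf
import torch
from transformers import AutoModelForCausalLM

# Load the fine-tuned weights; processor and generation_config come from the base model (see above)
model = AutoModelForCausalLM.from_pretrained(
    'ysdede/Phi-4-mm-inst-asr-turkish-3',
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
).cuda()

# Language-defining prompt wrapped in the base model's chat format
prompt = '<|user|><|audio_1|>Transcribe the Turkish audio.<|end|><|assistant|>'

audio, sample_rate = sf.read('sample.wav')  # placeholder audio path
inputs = processor(text=prompt, audios=[(audio, sample_rate)], return_tensors='pt').to('cuda')

generate_ids = model.generate(
    **inputs,
    generation_config=generation_config,
    max_new_tokens=256,
)
# Keep only the newly generated tokens and decode them
generate_ids = generate_ids[:, inputs['input_ids'].shape[1]:]
transcription = processor.batch_decode(generate_ids, skip_special_tokens=True)[0]
print(transcription)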

Framework versions

  • Transformers 4.46.1
  • Pytorch 2.5.1+cu124
  • Datasets 3.3.2
  • Tokenizers 0.20.3