ales committed
Commit 46c10f5 · 1 Parent(s): 8408300

evaluated on FLEURS using raw_transcription as the text column

logs/{eval_fleurs_20221221.log → eval_fleurs_20221221_071600_transcription.log} RENAMED
File without changes
logs/eval_fleurs_20221221_101048_raw_transcription.log ADDED
@@ -0,0 +1,23 @@
+ 12/21/2022 10:10:48 - INFO - __main__ - running evaluation script with following parameters: Namespace(batch_size=16, config='be_by', dataset='google/fleurs', device=0, language='be', max_eval_samples=None, model_id='ales/whisper-small-belarusian', push_to_hub=True, save_predictions=True, split='test', streaming=True, text_column='raw_transcription')
+ 12/21/2022 10:10:48 - INFO - __main__ - using following text normalizer: <belarusian_text_normalizer.BelarusianTextNormalizer object at 0x7fe1116a06a0>
+ 12/21/2022 10:10:54 - INFO - __main__ - loading dataset
+ 12/21/2022 10:10:56 - INFO - __main__ - running inference
+ 12/21/2022 10:52:01 - INFO - __main__ - computing metrics
+ 12/21/2022 10:52:01 - INFO - __main__ - metrics computed
+ 12/21/2022 10:52:01 - INFO - __main__ - WER: 45.89674723962996
+ 12/21/2022 10:52:01 - INFO - __main__ - saving predictions to: "preds_google_fleurs_be_by_test_20221221-101048.tsv"
+ 12/21/2022 10:52:01 - INFO - __main__ - updating model card and pushing to HuggingFace Hub
+ /home/ubuntu/python_venvs/hf_env/lib/python3.8/site-packages/transformers/generation/utils.py:1134: UserWarning: You have modified the pretrained model configuration to control generation. This is a deprecated strategy to control generation and will be removed soon, in a future version. Please use a generation configuration file (see https://huggingface.co/docs/transformers/main_classes/text_generation)
+   warnings.warn(
+ Traceback (most recent call last):
+   File "src/run_eval_whisper_streaming.py", line 219, in <module>
+     main(args)
+   File "src/run_eval_whisper_streaming.py", line 123, in main
+     evaluate.push_to_hub(
+   File "/home/ubuntu/python_venvs/hf_env/lib/python3.8/site-packages/evaluate/hub.py", line 119, in push_to_hub
+     return metadata_update(repo_id=model_id, metadata=metadata, overwrite=overwrite)
+   File "/home/ubuntu/python_venvs/hf_env/lib/python3.8/site-packages/huggingface_hub/utils/_validators.py", line 124, in _inner_fn
+     return fn(*args, **kwargs)
+   File "/home/ubuntu/python_venvs/hf_env/lib/python3.8/site-packages/huggingface_hub/repocard.py", line 802, in metadata_update
+     raise ValueError(
+ ValueError: You passed a new value for the existing metric 'name: WER, type: wer'. Set `overwrite=True` to overwrite existing metrics.