SpideyDLK/wav2vec2-large-xls-r-300m-sinhala-aug-data-with-original-split-part2
Automatic Speech Recognition · Transformers · TensorBoard · Safetensors · wav2vec2 · Inference Endpoints
arXiv: 1910.09700
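The tags above (wav2vec2, Transformers, Automatic Speech Recognition) indicate a fine-tuned XLS-R CTC checkpoint. Below is a minimal inference sketch, assuming the repository loads with the standard Transformers Wav2Vec2 classes (the processor/tokenizer files and model.safetensors listed further down support this); the `transcribe` helper and the `speech` waveform are illustrative and not part of the repository.

```python
# Minimal inference sketch (assumption: standard XLS-R CTC fine-tune layout, 16 kHz mono input).
import torch
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

repo_id = "SpideyDLK/wav2vec2-large-xls-r-300m-sinhala-aug-data-with-original-split-part2"
processor = Wav2Vec2Processor.from_pretrained(repo_id)
model = Wav2Vec2ForCTC.from_pretrained(repo_id)
model.eval()

def transcribe(speech, sampling_rate=16_000):
    """Greedy CTC decoding of a 1-D float waveform (illustrative helper)."""
    inputs = processor(speech, sampling_rate=sampling_rate, return_tensors="pt", padding=True)
    with torch.no_grad():
        logits = model(inputs.input_values).logits
    predicted_ids = torch.argmax(logits, dim=-1)
    return processor.batch_decode(predicted_ids)[0]
```

If a one-liner is preferred, the same checkpoint should also work through `pipeline("automatic-speech-recognition", model=repo_id)` from Transformers.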
Files and versions (branch: main) · 1 contributor · History: 52 commits
Latest commit: dec9569 (verified) · SpideyDLK · Training in progress, step 20000, checkpoint · 10 months ago
Name                        Size             Last commit message                             Last updated
last-checkpoint/            (dir)            Training in progress, step 20000, checkpoint    10 months ago
runs/                       (dir)            Training in progress, step 20000                10 months ago
.gitattributes              1.52 kB          initial commit                                  10 months ago
README.md                   5.17 kB          Upload tokenizer                                10 months ago
added_tokens.json           30 Bytes         Upload tokenizer                                10 months ago
config.json                 2.09 kB          Training in progress, step 10400                10 months ago
model.safetensors           1.26 GB (LFS)    Training in progress, step 20000                10 months ago
preprocessor_config.json    214 Bytes        Training in progress, step 10400                10 months ago
special_tokens_map.json     96 Bytes         Upload tokenizer                                10 months ago
tokenizer_config.json       1.05 kB          Upload tokenizer                                10 months ago
training_args.bin           5.05 kB (LFS)    Training in progress, step 10400                10 months ago
    (pickle file; 9 detected pickle imports: transformers.trainer_utils.IntervalStrategy,
    transformers.trainer_utils.HubStrategy, transformers.trainer_utils.SchedulerType,
    transformers.trainer_pt_utils.AcceleratorConfig, torch.device,
    transformers.training_args.TrainingArguments, accelerate.utils.dataclasses.DistributedType,
    transformers.training_args.OptimizerNames, accelerate.state.PartialState;
    see the inspection sketch after this listing)
vocab.json                  1.01 kB          Upload tokenizer                                10 months ago
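The pickle imports flagged for training_args.bin above are what a Trainer-saved TrainingArguments object normally pulls in, not extra code shipped with this repository. Below is a minimal inspection sketch, assuming the file has been downloaded locally and that transformers and accelerate are installed so the pickled classes can be resolved; unpickling executes arbitrary code, so only do this for files you trust.

```python
# Inspect the saved training configuration (assumption: file downloaded locally as training_args.bin).
import torch

# weights_only=False is required because the file is a pickled TrainingArguments
# object, not a plain tensor checkpoint; unpickling runs code, so only load
# files from sources you trust.
args = torch.load("training_args.bin", weights_only=False)
print(type(args))  # expected: transformers.training_args.TrainingArguments
print(args.learning_rate, args.per_device_train_batch_size, args.num_train_epochs)
```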