# Whisper Small IsiXhosa
This model is a fine-tuned version of openai/whisper-small on the ISIXHOSA-ASR-TRAIN dataset. It achieves the following results on the evaluation set:
- Loss: 0.9605
- WER: 57.9545
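
The checkpoint can be used for isiXhosa transcription with the standard transformers ASR pipeline. The snippet below is a minimal usage sketch, assuming the Hub id zionia/whisper-small-isixhosa and a placeholder local audio file; it is not taken from the original training or inference code.

```python
# Minimal usage sketch: load the fine-tuned checkpoint with the
# transformers automatic-speech-recognition pipeline.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="zionia/whisper-small-isixhosa",
)

# Transcribe a local isiXhosa audio file (path is a placeholder).
result = asr("sample_isixhosa.wav")
print(result["text"])
```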
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 32
- eval_batch_size: 16
- seed: 42
- optimizer: adamw_torch_fused with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 100
- num_epochs: 15
- mixed_precision_training: Native AMP
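
For reference, these hyperparameters map roughly onto a Seq2SeqTrainingArguments configuration like the one below. This is a sketch, not the actual training script; the output directory is a placeholder.

```python
# Sketch of a Seq2SeqTrainingArguments setup matching the hyperparameters
# listed above (the original training script is not included in this card).
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-small-isixhosa",  # placeholder path
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    seed=42,
    optim="adamw_torch_fused",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=100,
    num_train_epochs=15,
    fp16=True,  # native AMP mixed-precision training
)
```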
### Training results
| Training Loss | Epoch | Step | Validation Loss | WER |
|---|---|---|---|---|
| 3.5789 | 1.0 | 17 | 3.0669 | 116.6667 |
| 2.5749 | 2.0 | 34 | 2.1387 | 94.1919 |
| 1.6536 | 3.0 | 51 | 1.6168 | 83.5859 |
| 1.2888 | 4.0 | 68 | 1.2976 | 73.8636 |
| 0.747 | 5.0 | 85 | 1.0975 | 67.6768 |
| 0.4241 | 6.0 | 102 | 0.9918 | 63.1313 |
| 0.2659 | 7.0 | 119 | 0.9502 | 61.2374 |
| 0.1003 | 8.0 | 136 | 0.9372 | 58.7121 |
| 0.0531 | 9.0 | 153 | 0.9341 | 58.0808 |
| 0.0245 | 10.0 | 170 | 0.9448 | 58.3333 |
| 0.016 | 11.0 | 187 | 0.9444 | 58.4596 |
| 0.009 | 12.0 | 204 | 0.9559 | 58.4596 |
| 0.0069 | 13.0 | 221 | 0.9572 | 57.3232 |
| 0.0063 | 14.0 | 238 | 0.9594 | 58.0808 |
| 0.0058 | 15.0 | 255 | 0.9605 | 57.9545 |
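
The WER values in the table are word error rates reported as percentages. A sketch of how such values are typically computed with the evaluate library's "wer" metric follows; the card does not state the exact evaluation code, and the transcripts below are hypothetical.

```python
# Sketch: computing word error rate with the evaluate library.
import evaluate

wer_metric = evaluate.load("wer")

predictions = ["molo unjani"]    # hypothetical model output
references = ["molo, unjani?"]   # hypothetical reference transcript

# Reported as a percentage, matching the table above.
wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.2f}")
```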
### Framework versions
- Transformers 4.57.0
- Pytorch 2.8.0+cu128
- Datasets 4.2.0
- Tokenizers 0.22.1