xymfei / distil-whisper
This project was forked from huggingface/distil-whisper.
Distil-Whisper is a distilled variant of Whisper for speech recognition: 6x faster, 49% smaller, and within 1% of Whisper's word error rate (WER) on out-of-distribution evaluation sets.
License: MIT