-
Hi,
-
This feature hasn't been added yet; as you can see, the current standalone is still on revision r194 while the current Python repo is on revision 239.
Batched inference increases speed by much more than "slightly", but it decreases transcription accuracy: context conditioning doesn't work, so it's prone to hallucinations. Some areas can come out better because the VAD does better chunking, but in general there will be some drop in transcription quality.
-
Released r239.1 with batched inference.
Just add `--batched` to the command; `batch_size=8` is the default.
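For example, a minimal invocation might look like this (the executable name `whisper-faster` and the audio filename are placeholders, not taken from the release note; only the `--batched` flag and its default batch size of 8 come from the post):

```shell
# Hypothetical example: enable batched inference in the standalone.
# "whisper-faster" and "audio.wav" are placeholder names.
# --batched turns on batched inference; batch_size=8 is the default.
whisper-faster audio.wav --batched
```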