Hi, if you do an inference with a `Transcriber` object (`t.transcribe(..)`), it should release any resources related to that inference after returning the result. But they stay in VRAM, and after a few calls to `t.transcribe()` I get CUDA out-of-memory errors. `nvidia-smi` shows the memory is still occupied even after the transcript has been returned.
It would be nice to have a long-lived `Transcriber` object that can be reused, avoiding the lengthy creation time. If you're busy, please give me a hint on how it might be done so I can give it a shot and submit a PR. Thanks for your project.
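Assuming the backend is PyTorch-based (the issue doesn't say, so this is a guess), VRAM typically stays occupied for two reasons: Python objects still reference the last inference's tensors, or PyTorch's caching allocator keeps freed blocks reserved for reuse. A minimal sketch of the long-lived pattern: the model stays loaded across calls, while per-call state is explicitly dropped after each `transcribe()`. The `ReusableTranscriber` name and the model callable are hypothetical stand-ins, not this project's API:

```python
import gc

class ReusableTranscriber:
    """Sketch: keep the model loaded, release per-call state.

    The expensive model load happens once in __init__; each transcribe()
    copies out only the text and drops references to everything else so
    it can be garbage-collected. With a PyTorch backend you would also
    call torch.cuda.empty_cache() to return cached VRAM to the driver.
    """

    def __init__(self, model):
        self.model = model          # loaded once, reused across calls

    def transcribe(self, audio):
        result = self.model(audio)  # hypothetical inference call
        text = str(result)          # keep only what the caller needs
        del result                  # drop references to large tensors
        gc.collect()                # collect the now-unreferenced graph
        # torch.cuda.empty_cache() # (PyTorch only) release cached VRAM
        return text

# usage: create once, call many times without re-loading the model
t = ReusableTranscriber(model=lambda audio: f"transcript of {audio}")
print(t.transcribe("a.wav"))
print(t.transcribe("b.wav"))
```

Note that even with `del` and `gc.collect()`, `nvidia-smi` can still show memory as occupied, because PyTorch's allocator caches freed blocks; `torch.cuda.empty_cache()` is what actually returns them to the driver.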