Hi,
I am wondering how I can release RAM when using a CUDA device. Here is the code:
```python
import gc

import torch
import whisper

# delete all models except 'small.en' and release memory
prepare(model_map, model_name)

model = model_map.get(model_name)
if model is None:
    device = torch.device("cuda:0" if torch.cuda.is_available() else "cpu")
    model = whisper.load_model(name=model_name, device=device)
    model_map[model_name] = model
```
When whisper's `load_model` method is called, I can see my RAM usage go up and stay high, even after all the data has been transferred to the vGPU (cuda:0). When I call `delete_model(..)`, the vGPU memory is released, but the RAM is not.
```python
def prepare(model_map, name):
    if model_map.get(name) is None:
        if name != "small.en":
            keys_to_delete = []
            for key, value in model_map.items():
                if key != "small.en":
                    delete_model(model_map.get(key, None))
                    keys_to_delete.append(key)
            for key in keys_to_delete:
                del model_map[key]
            gc.collect()
```
```python
def delete_model(model):
    if model:
        del model.encoder
        del model.decoder
        del model
        torch.cuda.empty_cache()
        gc.collect()
```
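This is how I am watching the process RSS around the load/delete calls (a minimal sketch, Linux-only since it reads `/proc`; the list of `bytearray`s is just a stand-in for the model weights):

```python
import gc
import os

def rss_bytes():
    # Current resident set size of this process, read from /proc (Linux only)
    with open("/proc/self/statm") as f:
        resident_pages = int(f.read().split()[1])
    return resident_pages * os.sysconf("SC_PAGE_SIZE")

before = rss_bytes()
buf = [bytearray(1 << 20) for _ in range(200)]  # ~200 MiB stand-in for model weights
grown = rss_bytes()

del buf
gc.collect()
freed = rss_bytes()

print(f"alloc grew RSS by {(grown - before) >> 20} MiB")
print(f"freeing returned {(grown - freed) >> 20} MiB to the OS")
```

Large allocations like these go through `mmap` and are returned to the OS on free, which is why the toy example shrinks; small heap allocations often are not, which may be part of what I am seeing.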
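One thing I have not ruled out is the allocator simply holding on to freed pages: even after `del` and `gc.collect()`, glibc may keep heap arenas mapped for reuse rather than returning them to the OS. On Linux/glibc they can be requested back explicitly (a sketch; the `libc.so.6` name and this whole diagnostic are Linux/glibc assumptions):

```python
import ctypes
import gc

gc.collect()  # drop Python-level references first

# Ask glibc to return free heap memory to the OS; malloc_trim returns 1 if
# anything was released, 0 otherwise (Linux/glibc only).
libc = ctypes.CDLL("libc.so.6")
released = libc.malloc_trim(0)
print("malloc_trim released memory:", bool(released))
```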
Any thoughts?