Dear SpiceMix Team,
I hope this message finds you well!
I am currently using the SpiceMix package to analyze 10X HD data, which is quite large: each sample contains approximately 400,000 spots. When I attempt to run the analysis on a server with an 80GB GPU and 1000GB of CPU memory, the program runs out of memory.
I have tried reducing batch sizes and streamlining the data processing, but the out-of-memory errors persist. Could you suggest strategies for managing memory when working with large-scale 10X HD datasets, especially when GPU and CPU memory are limited?
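For reference, this is roughly the kind of preprocessing I have tried in order to shrink the input before running SpiceMix (an illustrative scanpy/anndata sketch only; the file path, gene count, and subsample size are placeholders, and the SpiceMix call itself is omitted):

```python
import numpy as np
import scanpy as sc
import scipy.sparse as sp

# Load one 10X HD sample (~400,000 spots); the path is a placeholder.
adata = sc.read_h5ad("sample_hd.h5ad")

# Keep the expression matrix sparse and in float32 to roughly halve memory vs. float64.
if not sp.issparse(adata.X):
    adata.X = sp.csr_matrix(adata.X)
adata.X = adata.X.astype(np.float32)

# Shrink the gene dimension to the most informative genes.
sc.pp.normalize_total(adata, target_sum=1e4)
sc.pp.log1p(adata)
sc.pp.highly_variable_genes(adata, n_top_genes=2000)
adata = adata[:, adata.var["highly_variable"]].copy()

# Subsample spots to check whether a smaller problem fits on the GPU.
sc.pp.subsample(adata, n_obs=100_000, random_state=0)
```

Even with this reduction, the full 400,000-spot samples still do not fit.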
Here are my current hardware specifications and dataset size:
- GPU memory: 80 GB
- CPU memory (RAM): 1000 GB
- Dataset size: approximately 400,000 spots per sample
Specifically, I would like to know:
- Are there any memory management optimizations for large datasets that you recommend?
- Are there specific parameters or settings within SpiceMix that could be adjusted to avoid memory overflow?
- Are there alternative approaches to reducing memory usage while maintaining analysis accuracy?
If possible, I would greatly appreciate any advice or solutions you may have. Thank you for your work on SpiceMix, and I look forward to your response.
Best regards,
mimi