mask volumes with 4D data: OOM error #41
Comments
Weirdly, this error stopped happening after I restarted the local Spark cluster. But it is good to have it here for reference, in case it happens again.
Reopening because this error is happening consistently for the 4D dataset, both on my local machine and on remote machines running with Docker. Spark keeps running out of memory at that point in the code; we should probably improve that operation.
setting
A similar error happens at step 5, clean_cells. Again, the error is avoided by the same setting. Should we consider changing the default values?
So, just to clarify: this is an out-of-memory error, correct? In general, we expect people to use machines with a lot of RAM for these analyses, so I am inclined to keep these options on by default (so that jobs run faster without people needing to turn them on manually). Is it possible to catch this error and return a more meaningful error message to the user? That would probably be ideal.
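One way to surface a clearer message is to wrap the failing step and re-raise with actionable advice. The sketch below is hypothetical (`run_with_oom_hint` is not part of the codebase, and in a real Spark job the failure typically surfaces as a Py4J/Java exception rather than a Python `MemoryError`, so the `except` clause would need to match that type instead):

```python
def run_with_oom_hint(step, step_name):
    """Run a pipeline step; convert a low-level memory error into an
    actionable message for the user (hypothetical helper, for illustration)."""
    try:
        return step()
    except MemoryError as err:
        raise RuntimeError(
            f"Step '{step_name}' ran out of memory. "
            "Try increasing Spark worker memory or disabling "
            "the parallel option for this step."
        ) from err
```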
The error is possibly caused by the worker subprocesses exceeding the memory allocated to them. One possible solution would be to configure Spark to increase this limit.
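For reference, the relevant knobs are Spark's memory settings. A sketch of a `spark-defaults.conf` fragment with illustrative values (the right numbers depend on the machine's available RAM, and the same options can also be passed when building the SparkSession):

```
# spark-defaults.conf -- illustrative values, tune to available RAM
spark.driver.memory          16g
spark.executor.memory        16g
spark.python.worker.memory   4g
```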
Ok, looking at this again.
Short Java error:
Short Python error traceback:
Some references: