flexible requirements #26
Comments
Yeah, I've been struggling with that choice, particularly in the case of …
Actually, we might be able to do it within …
I believe that pyspark comes bundled with Spark, so we don't need to worry about system installations: https://stackoverflow.com/a/51729469/5427308
Oh, that's interesting! It would definitely make installation simpler. However, we might get version conflicts when multiple Spark installations are on the path; I ran into this problem myself.
Apparently pip-installing pyspark alone is sufficient; the tests are passing like this now!
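As a quick illustration of the pip-only setup described above, here is a minimal sketch (the app name is hypothetical, not from this repo) that checks the pip-installed pyspark can start a local Spark session without any system-wide Spark or SPARK_HOME:

```python
# Minimal sketch: verify that the pip-installed pyspark bundles its own Spark,
# so no system-wide Spark installation or SPARK_HOME is required.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .master("local[*]")              # run Spark locally inside this Python process
    .appName("pip-only-pyspark")     # hypothetical app name, for illustration only
    .getOrCreate()
)

print(spark.version)                 # version of the Spark bundled with the pyspark wheel
spark.stop()
```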
Great! Hopefully this will be robust and we won't run into problems. Since we got rid of the system Spark dependency, it makes sense to do the same for ANTs, since it has a simple Python wrapper: https://pypi.org/project/antspyx/. It should be straightforward to convert the corresponding ANTs calls into Python. I can give it a shot, and the advantage would be a package that can be fully installed via pip without needing to install other software.
Yes, great idea with the ANTs Python package as well!
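As a hedged sketch of the conversion discussed above (input/output file names and the transform type are illustrative, not taken from this project), a command-line antsRegistration/antsApplyTransforms pair might translate to the antspyx wrapper roughly as follows:

```python
# Sketch of replacing ANTs CLI calls with the antspyx Python wrapper.
# File names and the transform type below are placeholders for illustration.
import ants

fixed = ants.image_read("fixed.nii.gz")      # hypothetical reference image
moving = ants.image_read("moving.nii.gz")    # hypothetical image to register

# Roughly what antsRegistration does on the command line
reg = ants.registration(fixed=fixed, moving=moving, type_of_transform="SyN")

# Apply the forward transforms, similar to antsApplyTransforms
warped = ants.apply_transforms(
    fixed=fixed,
    moving=moving,
    transformlist=reg["fwdtransforms"],
)
ants.image_write(warped, "moving_warped.nii.gz")
```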
The `==` pins in the requirements should probably be relaxed to `>=` in most cases, since `==` is very rigid/inflexible outside of a container or virtual environment. However, we should check that `>=` does not break anything, at least for a default current conda configuration. Alternatively, we could avoid enforcing the requirements except when generating containers/virtual environments.
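One way this could be implemented (a sketch only; the package names and version numbers below are placeholders, not the project's actual requirements) is to keep loose `>=` lower bounds for a regular pip install and reserve exact `==` pins for container/virtual-environment builds, e.g. via a setuptools extra:

```python
# Sketch of relaxed vs. pinned requirements using setuptools.
# All package names and versions here are placeholders for illustration.
from setuptools import setup

setup(
    name="example-package",              # hypothetical project name
    install_requires=[
        "pyspark>=3.0",                  # lower bound: flexible for regular installs
        "antspyx>=0.2",                  # allows newer compatible releases
    ],
    extras_require={
        # exact pins kept only for reproducible container/venv builds,
        # installed with: pip install "example-package[container]"
        "container": [
            "pyspark==3.0.1",
            "antspyx==0.2.7",
        ],
    },
)
```

Alternatively, a separate pinned requirements file (for example one generated with `pip freeze`) could cover the container use case while the package itself keeps only loose lower bounds.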