
flexible requirements #26

Open
mikarubi opened this issue Aug 30, 2024 · 7 comments

Comments

@mikarubi
Owner

The `==` pins in the requirements should probably be relaxed to `>=` in most cases (since `==` is very rigid/inflexible outside of a container or virtual environment). However, we should check that `>=` will not break anything, at least for a default current conda configuration. Alternatively, we could leave the requirements unenforced except when generating containers/virtual environments.
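
For illustration, a relaxation along these lines might look like the following (a hypothetical `requirements.txt`; the package names and versions are placeholders, not the project's actual pins):

```text
# before: exact pins, only reliably reproducible inside a container/venv
numpy==1.26.4
pyspark==3.5.1

# after: minimum versions, letting pip/conda resolve newer releases
numpy>=1.26
pyspark>=3.5
```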

@luiztauffer
Collaborator

yeah, I've been struggling with that choice, particularly in the case of pyspark, which might work with multiple versions, but I believe it should match the version of Spark installed on the system.
I agree we should not have it pinned like that, so we should probably remove it from the requirements and include instructions to install it separately. I don't think we can automatically test the Spark version at the pip install step and dynamically choose the pyspark version... but I'll look into it

@luiztauffer
Collaborator

actually we might be able to do it within setup.py
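
A minimal sketch of what that could look like, assuming a system Spark advertises itself via `SPARK_HOME` and the `RELEASE` file that binary Spark distributions ship (everything here, including the package name, is hypothetical, not the project's actual setup.py):

```python
# Hypothetical setup.py sketch: pin pyspark to the major.minor line of a
# system Spark if one is found, otherwise accept any recent pyspark.
import os
import re

from setuptools import setup


def pyspark_requirement():
    spark_home = os.environ.get("SPARK_HOME")
    if spark_home:
        # Binary Spark distributions ship a RELEASE file whose first line
        # reads something like "Spark 3.5.1 built for Hadoop 3.3.4".
        release = os.path.join(spark_home, "RELEASE")
        if os.path.isfile(release):
            with open(release) as f:
                match = re.search(r"Spark (\d+\.\d+)", f.read())
            if match:
                return f"pyspark=={match.group(1)}.*"
    return "pyspark>=3.0"


setup(
    name="example-package",  # placeholder
    version="0.1.0",
    install_requires=[pyspark_requirement()],
)
```

One caveat: this logic only runs when installing from an sdist; a prebuilt wheel bakes in whatever was computed at build time.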

@mikarubi
Owner Author

I believe that pyspark automatically comes with spark, so we don't need to worry about system installations: https://stackoverflow.com/a/51729469/5427308

@luiztauffer
Collaborator

oh, that's interesting! it would definitely make installation simpler. However, we might get version conflicts if multiple Spark installations are on the path; I've run into this problem myself.

@luiztauffer
Collaborator

apparently just pip installing pyspark is sufficient; the tests are passing like this now!
we might still run into the issue of a pre-installed Spark on the system path, though
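
One way to surface that conflict early would be a runtime check along these lines (a sketch, not project code):

```python
# Warn if SPARK_HOME points at a system Spark that could shadow the Spark
# bundled with the pip-installed pyspark.
import os
import warnings

import pyspark

spark_home = os.environ.get("SPARK_HOME")
if spark_home and "pyspark" not in spark_home:
    warnings.warn(
        f"SPARK_HOME={spark_home} may shadow the Spark bundled with "
        f"pyspark {pyspark.__version__}"
    )
```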

@mikarubi
Owner Author

mikarubi commented Sep 3, 2024

great! hopefully this will be robust and we won't run into problems. since we got rid of Spark as a dependency, it makes sense to do the same for ANTs, since it has a simple Python wrapper: https://pypi.org/project/antspyx/

it should be straightforward to convert the corresponding ANTs calls into Python. I can give it a shot, and the advantage would be a package that can be fully installed via pip without needing to install other software.
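
For example, a command-line `antsRegistration` call could become something like this with antspyx (a sketch; the file names are placeholders, and the exact transform settings would need to match the current calls):

```python
import ants  # pip install antspyx

fixed = ants.image_read("fixed.nii.gz")    # placeholder paths
moving = ants.image_read("moving.nii.gz")

# roughly the equivalent of an antsRegistration call with a SyN transform
result = ants.registration(fixed=fixed, moving=moving, type_of_transform="SyN")

ants.image_write(result["warpedmovout"], "moving_warped.nii.gz")
```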

@luiztauffer
Collaborator

yes, great idea with the ANTs Python package as well!
