Remove fixed dependence on numpy==1.23.4 #66
Hi, that is not possible. pycraf is not yet compatible with numpy 2.0, which introduced new binary APIs that require a fair bit of work to adapt to. Because the package contains Cython extensions, the numpy version against which the wheels were compiled is essential; otherwise you will get errors at runtime.
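For illustration, the typical runtime symptom of such an ABI mismatch is numpy's well-known dtype-size error; the sketch below is illustrative output, and the exact sizes vary with the versions involved:

```
# Importing a compiled extension built against a different numpy ABI than
# the installed one typically fails with an error along these lines:
$ python -c "import pycraf"
ValueError: numpy.dtype size changed, may indicate binary incompatibility.
Expected 96 from C header, got 88 from PyObject
```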
The other question is why you get an error during installation of the older numpy version. You may want to increase the verbosity of the pip command to get to the root cause of this.
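For reference, pip's standard verbosity flag can be repeated for more detail:

```
# Show pip's full resolution log to see why it picks or rejects numpy versions
pip install -vvv pycraf
```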
To install numpy on Python 3.12, you must use numpy version 1.26.4, numpy/numpy#23808 (comment)
I have no clue what's going on there, because for Python 3.12 the 1.26.4 version of numpy is used in the pyproject.toml. It is therefore also very odd that your pip first attempts to install numpy 2.0 and, only when that fails, tries to install 1.23.4. For information, do you run this in a venv? What happens if you install numpy 1.26.4 first - will it still try to download the wrong numpy?

By the way, pycraf (especially the notebooks) makes most sense when used together with a fair number of other packages, e.g. for GIS data. It can be quite a challenge to get a working installation of all these packages. I personally use anaconda, as this works very well, also on Windows. These days, if you use the conda-forge channel, almost everything is available as a binary package, which really eases the installation.
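A minimal sketch of that experiment (the environment name is arbitrary, and the conda line assumes pycraf is available on conda-forge):

```
# Fresh Python 3.12 environment; install the expected numpy first
python3.12 -m venv pycraf-env
source pycraf-env/bin/activate
pip install numpy==1.26.4
pip install pycraf

# Alternative via the conda-forge channel, as suggested above
# conda install -c conda-forge pycraf
```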
The output of `requires_dist`: ['astropy', 'build', 'ipdb', 'matplotlib', 'numpy ==1.23.4', 'pip', 'pyproj', 'pytest', 'pytest-astropy', 'pytest-doctestplus', 'pytest-remotedata', 'rasterio', 'scipy', 'setuptools', 'setuptools-scm', 'sgp4', 'twine', 'wheel', "sphinx ; extra == 'docs'", "sphinx-astropy[confv2] ; extra == 'docs'", "sphinx-copybutton ; extra == 'docs'", "pydata-sphinx-theme ; extra == 'docs'", "sphinx-design ; extra == 'docs'", "pytest >=7.0 ; extra == 'docs'", "cartopy ; extra == 'recommended'", "ffmpeg ; extra == 'recommended'", "fiona ; extra == 'recommended'", "geopandas ; extra == 'recommended'", "h5py ; extra == 'recommended'", "openpyxl ; extra == 'recommended'", "osmnx ; extra == 'recommended'", "pandas ; extra == 'recommended'", "reproject ; extra == 'recommended'", "shapely ; extra == 'recommended'", "tqdm ; extra == 'recommended'", "pytest >=7.0 ; extra == 'test'", "pytest-astropy >=0.10 ; extra == 'test'", "pytest-doctestplus ; extra == 'test'", "pytest-remotedata ; extra == 'test'"]
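For anyone reproducing this, such a listing can be read straight from an installed package's metadata with the standard library:

```
# Print the Requires-Dist entries of an installed pycraf
python -c "from importlib.metadata import requires; print(requires('pycraf'))"
```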
Thanks, that is very useful. I also checked the Linux wheels: there the 3.12 package has no pin at all, while 3.10 and 3.11 both have the 1.23.4 pin (which is also not correct; only 3.11 should have it). As I said, I have absolutely no clue what's happening here. Sometimes Python packaging is really a mystery to me. Maybe there is a strange syntax error in the pyproject.toml, or something is wrong with the build scripts. As you seem to know a little bit about this, feel free to have a look into the Azure pipeline etc. config. I'm happy to make the required fixes if we can identify the issue.
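To check what a published wheel actually pins, its metadata can be inspected without installing it (a sketch; the download directory is arbitrary):

```
# Fetch the wheel only, then read its numpy requirement lines
pip download pycraf --no-deps --only-binary :all: -d ./wheels
unzip -p ./wheels/pycraf-*.whl '*/METADATA' | grep '^Requires-Dist: numpy'
```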
Why not just change the pinned `numpy ==1.23.4` requirement to an unpinned one?
It's not so simple. The different numpy versions are not binary compatible. One usually wants to compile (the Cython extensions) with the oldest numpy that works with a given Python version; otherwise, if the wheel gets compiled with a newer numpy than what the user has installed, the package will seg-fault. I'm working on the migration to numpy 2. In the meantime, could you use a work-around, e.g., install numpy and the other dependencies yourself and then install pycraf without pulling in its pinned dependencies?

Interestingly, I maintain another package where I use the same setup to build wheels etc. In that case, the numpy pins (which are really only required during the build-wheel process) do not end up in the wheel/package, while here they did. I don't understand it...
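Spelled out with pip's standard --no-deps flag, that work-around could look like this (dependency list abridged from the requires_dist output above; adjust the numpy version to your Python):

```
# Install the runtime dependencies yourself...
pip install "numpy==1.26.4" astropy scipy matplotlib pyproj sgp4
# ...then install pycraf without letting pip enforce its pinned requirements
pip install pycraf --no-deps
```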