Is it possible to programmatically check if a wheel (whl) is compatible with the chosen Python installation before attempting to install?
I'm making an automated installer for the packages my Python project needs, and I want to attempt to install only compatible packages. That way, if something fails, I know the errors come from compatible modules and are worth looking into, rather than from incompatible packages I don't care about. Example: I have wheels for Python 3.5 and 3.7; in a 3.5 installation, the 3.7 wheels should not even be attempted.
I've tried pkginfo (https://pypi.org/project/pkginfo/), but wheel.supported_platforms returns an empty list, so I can't do anything with it (wheels with "any" or with "win32" in the platform part of their name both returned an empty list, so that seems unusable).
I also tried the output of python -m pip debug --verbose, but the following warning appears:
WARNING: This command is only meant for debugging. Do not use this with automation for parsing and getting these details, since the output and options of this command may change without notice.
This makes the command unsafe to rely on, even though below that warning it prints the "Compatible tags", which I could more or less use to determine from its name whether a wheel is supported. Example of those "Compatible tags" as a Python list:
['cp39-cp39-win_amd64', 'cp39-abi3-win_amd64', 'cp39-none-win_amd64', 'cp38-abi3-win_amd64', 'cp37-abi3-win_amd64', 'cp36-abi3-win_amd64', 'cp35-abi3-win_amd64', 'cp34-abi3-win_amd64', 'cp33-abi3-win_amd64', 'cp32-abi3-win_amd64', 'py39-none-win_amd64', 'py3-none-win_amd64', 'py38-none-win_amd64', 'py37-none-win_amd64', 'py36-none-win_amd64', 'py35-none-win_amd64', 'py34-none-win_amd64', 'py33-none-win_amd64', 'py32-none-win_amd64', 'py31-none-win_amd64', 'py30-none-win_amd64', 'cp39-none-any', 'py39-none-any', 'py3-none-any', 'py38-none-any', 'py37-none-any', 'py36-none-any', 'py35-none-any', 'py34-none-any', 'py33-none-any', 'py32-none-any', 'py31-none-any', 'py30-none-any']
With, for example, "pyHook-1.5.1-cp36-cp36m-win32.whl", I could check the name and see whether it's compatible or not (except for the warning above...).
Any other ideas?
Thanks in advance for any help!
EDIT: I could manually pull pieces out of the name and hard-code the possibilities I see in the documentation, like "win32" and "win_amd64" (as I did before), but then I'd need to know exactly all the values each part of the name can take (the documentation says "e.g.", which means there are more values than the ones listed), and that would be a lot of work. I was hoping someone had already built such a thing (maybe Python itself has a way in one of its internal packages).
You can do this using packaging.
pip install packaging
Example code to get the tags, similar to what you got from pip:
from packaging.tags import sys_tags
tags = sys_tags()
print([str(tag) for tag in tags])
# ['cp39-cp39-manylinux_2_33_x86_64', 'cp39-cp39-manylinux_2_32_x86_64', 'cp39-cp39-manylinux_2_31_x86_64', ..... , 'py31-none-any', 'py30-none-any']
Of course, you can do many more things programmatically with the tags variable above:
>>> tags = sys_tags()
>>> for tag in list(tags)[:3]:
... print(tag.interpreter, tag.abi, tag.platform)
...
cp39 cp39 manylinux_2_33_x86_64
cp39 cp39 manylinux_2_32_x86_64
cp39 cp39 manylinux_2_31_x86_64
For more in-depth documentation, check: https://packaging.pypa.io/en/latest/tags.html#packaging.tags.sys_tags
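To go one step further and check a specific wheel file against those tags, packaging can also parse the wheel filename for you. A minimal sketch, assuming a reasonably recent packaging release that ships parse_wheel_filename (the is_compatible helper name is mine):
from packaging.tags import sys_tags
from packaging.utils import parse_wheel_filename

def is_compatible(wheel_filename):
    # parse_wheel_filename() splits e.g. "pyHook-1.5.1-cp36-cp36m-win32.whl"
    # into (name, version, build, tags); the wheel is installable if any of
    # its tags is among the interpreter's supported tags.
    _name, _version, _build, wheel_tags = parse_wheel_filename(wheel_filename)
    supported = set(sys_tags())
    return any(tag in supported for tag in wheel_tags)

print(is_compatible("pyHook-1.5.1-cp36-cp36m-win32.whl"))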
When I use a plugin that requires python, it can't find it and barfs.
The places that seem to be searched are:
Using --version I see both:
+python/dyn
+python3/dyn
However :echo has("python3") returns 0.
I'm not sure if this is compile time config, or runtime-configurable via .vimrc.
I'm not a python developer, and the few times I've ventured into that world were in the middle of the python2/python3 mess that turned me off completely. I've played around enough to have pyenv configured, it seems, and I get:
╰─$ which python
/Users/benlieb/.pyenv/shims/python
╰─$ python --version
Python 3.10.3
Can anyone help shed light on what to do to get python3 findable/usable in my vim?
Update:
Following #romainl's suggestion below I set in my .vimrc
set pythonthreedll=/Users/benlieb/.pyenv/shims/python
But I'm getting the following error:
+python/dyn and +python3/dyn are described here: :help python-dynamic.
By default, :help 'pythonthreedll' points to:
/opt/homebrew/Frameworks/Python.framework/Versions/3.10/Python
because MacVim is built against that version. The message in your screenshot says that there is nothing at that path. In order to have a working Python 3 interface, you can either:
install Python 3.10 via homebrew,
or point pythonthreedll to a valid path.
For example, I don't use Homebrew so the default value is useless to me, but I use MacPorts so this is my pythonthreedll:
set pythonthreedll=/opt/local/Library/Frameworks/Python.framework/Versions/3.10/lib/libpython3.10.dylib
After some time, I found the following works, though it was not a fun path of discovery.
let &pythonthreedll = trim(system("pyenv which python"))
There are two versions of my little tool:
https://pypi.python.org/pypi/tbzuploader/2017.11.0
https://pypi.python.org/pypi/tbzuploader/2017.12.0 Bug: The pypi page looks ugly.
In the last update, a change in README.rst causes a warning:
user#host> rst2html.py README.rst > /tmp/foo.html
README.rst:18: (WARNING/2) Inline emphasis start-string without end-string.
README.rst:18: (WARNING/2) Inline emphasis start-string without end-string.
Now the pypi page looks ugly :-(
I use this recipe to do CI, bumpversion, upload to pypi: https://github.com/guettli/github-travis-bumpversion-pypi
How can I ensure that no broken README.rst gets released any more? In other words, I want to avoid the PyPI page looking ugly.
Dear detail lovers: please don't look into the particular error currently in the README.rst. That is not the question :-)
Update
As of Sep 21, 2018, the Python Packaging Authority recommends an alternative command, twine check. To install twine and run the check:
pip install twine
twine check dist/*
Note that twine requires readme_renderer. You could still use readme_renderer on its own; you only need to install twine if you want its other features, which is a good idea anyway if you are releasing to PyPI.
From the official Python packaging docs, Uploading your Project to PyPI:
Tip: The reStructuredText parser used on PyPI is not Sphinx! Furthermore, to ensure safety of all users, certain kinds of URLs and directives are forbidden or stripped out (e.g., the .. raw:: directive). Before trying to upload your distribution, you should check to see if your brief / long descriptions provided in setup.py are valid. You can do this by following the instructions for the pypa/readme_renderer tool.
And from that tool's README.rst:
To check your long description locally, simply install the readme_renderer library using:
$ pip install readme_renderer
$ python setup.py check -r -s
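If you want to run the same check from Python (for example in a test), readme_renderer can be used directly. A minimal sketch, assuming a recent readme_renderer where rst.render() returns None when the content fails to render:
import sys
from readme_renderer.rst import render

with open("README.rst", encoding="utf-8") as f:
    source = f.read()

# render() returns the generated HTML, or None if the reST could not be
# rendered; warnings and errors are written to the given stream.
html = render(source, stream=sys.stderr)
if html is None:
    sys.exit("README.rst would not render cleanly on PyPI")
print("README.rst renders fine")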
Preamble
I had a readme which would not render on PyPi, other than the first element on the page (an image). I ran the file against multiple validators, and tested it against other renders. It worked perfectly fine everywhere else! So, after a long, nasty fight with it, and numerous version bumps so I could test a PyPi revision, I tried reducing the file to a bare minimum, from which I'd build it back up. It turned out that the first line was always processed, and then nothing else was...
Solution
Discovering this clue regarding the first line, I then had an epiphany... All I had to do was change the line endings in the file! I was editing the file in Windows, with Windows line endings being tacked on implicitly. I changed that to Unix style and (poof!) PyPi fully rendered the doc!
Rant...
I've encountered such things in the past, but I took it for granted that PyPi would handle cross platform issues like this. I mean one of the key features of Python is being cross platform! Am I the first person working in Windows to encounter this?! I don't appreciate the hours of time this wasted.
You could check whether rstcheck catches the type of error in your readme. If it does, run it after pytest in your script section (and add it to your requirements, of course).
One thing I can't get over - when I use numpy in Visual Studio and I want to declare an array of zeroes, I write:
x = numpy.zeros(n)
and it is correct for the interpreter. BUT THE AUTOCOMPLETION GIVES ME:
X = numpy.zeros_like ...
How can I change it to get actually helpful autocompletion? In C++ everything works all right, so I guess it's an internal problem specific to Python.
Edit: As far as I can see, the problem is that numpy.zeros is defined in numeric.py as:
zeros = multiarray.zeros
Apparently this is not enough for IntelliSense (or Visual Assist, for that matter), which requires an actual def to see the function's structure.
You need to install Python 3.5 and download the corresponding wheel for numpy. Then install it with the command pip install xxxx (the numpy wheel file you downloaded). For more detailed information about the installation, you can have a look at this.
Then open or create a Python application project in VS and set Python 3.5 as the default environment; with that, IntelliSense for numpy.zeros works fine in a .py file, as in the following screenshot (Python 3.5):
If Python 2.7 is set as the default environment, IntelliSense behaves just like your description, as below:
We have so many versions of wheels.
How can we know which version should be installed on my system?
I remember there is a certain command that can check my system environment.
Or is there any other way?
Example:
scikit_learn-0.17.1-cp27-cp27m-win32.whl
scikit_learn-0.17.1-cp27-cp27m-win_amd64.whl
scikit_learn-0.17.1-cp34-cp34m-win32.whl
scikit_learn-0.17.1-cp34-cp34m-win_amd64.whl
scikit_learn-0.17.1-cp35-cp35m-win32.whl
scikit_learn-0.17.1-cp35-cp35m-win_amd64.whl
scikit_learn-0.18rc2-cp27-cp27m-win32.whl
scikit_learn-0.18rc2-cp27-cp27m-win_amd64.whl
scikit_learn-0.18rc2-cp34-cp34m-win32.whl
scikit_learn-0.18rc2-cp34-cp34m-win_amd64.whl
scikit_learn-0.18rc2-cp35-cp35m-win32.whl
scikit_learn-0.18rc2-cp35-cp35m-win_amd64.whl
In case this is still an issue, the following should tell you the information you need to know about your architecture to choose a wheel:
import platform
print(platform.architecture())
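If you also want the interpreter part of the tag, here is a rough sketch (assuming CPython on Windows, to match the wheels listed above):
import struct
import sys

# CPython interpreter tag, e.g. 'cp35' on Python 3.5
interpreter = "cp{}{}".format(sys.version_info.major, sys.version_info.minor)
# Pointer size reveals whether this interpreter is 32- or 64-bit, which maps
# to the 'win32' vs 'win_amd64' platform part of the wheel filename.
bits = struct.calcsize("P") * 8
platform_tag = "win_amd64" if bits == 64 else "win32"
print(interpreter, platform_tag)
# e.g. cp35 win_amd64 -> scikit_learn-0.17.1-cp35-cp35m-win_amd64.whl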
You don't have to know. Use pip - it will select the most specific wheel available.
As a warning, pip._internal isn't a stable API, so you wouldn't want to rely on it. But in case it's helpful (as it was to me) - this answer gives a way of solving the problem:
You can get it in python from pip following this solution:
Since pip version 19.3, TargetPython.get_tags() returns the supported PEP 425 tags to check wheel candidates against (source). The tags are returned in order of preference (most preferred first).
from pip._internal.models.target_python import TargetPython
target_python = TargetPython()
pep425tags = target_python.get_tags()
The class TargetPython encapsulates the properties of a Python interpreter one is targeting for a package install, download, etc.
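As a rough sketch of how those tags could be used to filter wheel filenames (the filename parsing below is simplified, the tags are compared by their string form, and wheel_is_supported is my own helper name):
from pip._internal.models.target_python import TargetPython

def wheel_is_supported(filename):
    supported = {str(tag) for tag in TargetPython().get_tags()}
    stem = filename[:-len(".whl")]
    # name-version[-build]-interpreter-abi-platform; the last three parts
    # may be compressed tag sets, e.g. 'py2.py3-none-any'
    interpreters, abis, platforms = stem.split("-")[-3:]
    return any(
        "{}-{}-{}".format(i, a, p) in supported
        for i in interpreters.split(".")
        for a in abis.split(".")
        for p in platforms.split(".")
    )

print(wheel_is_supported("pyHook-1.5.1-cp36-cp36m-win32.whl"))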
To avoid using pip._internal, you can use, in the shell (see here):
$ path/to/pythonX.Y -m pip debug --verbose
I want to use the pybloomfilter package: https://github.com/axiak/pybloomfiltermmap.
I managed to install it under Python 3, but it seems the saved file does not keep the original information (under Python 2, this doesn't happen). I looked into the source code and there seems to be nothing specific to Python 2, so I am totally lost on how to make this library work under Python 3.
EDIT1:
By "not have the original information", I mean: when I add some strings to the filter and then end the program, the next time I use open to load the filter, it is empty; it does not remember the added strings.
pybloomfiltermmap3 is a Python 3 fork of pybloomfiltermmap by Michael Axiak (#axiak).
class pybloomfilter.BloomFilter(capacity: int, error_rate: float[, filename=None: string][, perm=0755])
Install:
Please have Cython installed. Please note that this version is for Python 3. In case you are using Python 2, please see
https://github.com/axiak/pybloomfiltermmap.
To install:
$ pip install cython
$ pip install pybloomfiltermmap3
This will build and install the module.
And there is an instance method to sync the file:
BloomFilter.sync()
Forces a sync() call on the underlying mmap file object.
Use this if you are about to copy the file and you want to be Sure (TM) you got everything correctly.
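A minimal sketch of the persistence flow, assuming the constructor shown above plus the add/open/contains API from the project's README (treat the exact signatures as assumptions):
import pybloomfilter

# Create a filter backed by a file on disk: capacity, error rate, filename.
bf = pybloomfilter.BloomFilter(1000000, 0.01, "filter.bloom")
bf.add("some-string")
# Force the mmap'd data to be flushed before the process exits.
bf.sync()

# Later, or in another process: reopen the same file; the contents persist.
bf2 = pybloomfilter.BloomFilter.open("filter.bloom")
print("some-string" in bf2)  # True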