Checking setuptools install_requires on testpypi

I am trying to test a python package I want to release using test.pypi.
In the setup.py file I have:
install_requires=['numpy>=1.15', 'scipy>=0.0', 'scikit-learn>=0.2', 'numba>=0.0'],
Scipy and Numpy get downloaded and installed as expected, but for numba I get the following error:
ERROR: Could not find a version that satisfies the requirement numba>=0.0
As a note, if I do pip install numba before installing my test package, it works, but I am trying to make the package install correctly on its own.
I notice that when it processes the scipy requirement, it shows:
Downloading https://test-files.pythonhosted.org/packages/68/72/eb962a3ae2755af6b1f31f7a94dccc21bfc41bb1637c5877a043e711b1d7/scipy-0.1.tar.gz
So from the url, it seems like it is using test-files, but is this the regular pypi or just the test one?
My question is: what is the appropriate way to write the install_requires so I can make sure the test works before putting it on the actual pypi site?

There is nothing wrong with your syntax, it is just that unlike scipy, numpy, and scikit-learn, there is no numba hosted on the test PyPI instance. Compare:
https://pypi.org/project/numba/ <-- 200 OK
https://test.pypi.org/project/numba/ <-- 404 Not Found
(Note: that 404 was true at the time of writing this answer, but it appears as though a version of numba has now been uploaded to the test index, on Feb 9th 2021)
My question is: what is the appropriate way to write the install_requires so I can make sure the test works before putting it on the actual pypi site?
How you wrote install_requires is fine. To smoke test it by uploading your package to test PyPI and checking that it installs properly, use test PyPI as an extra index URL rather than as a replacement --index-url:
pip install yourpackage --extra-index-url=https://test.pypi.org/simple/
This way yourpackage can be found in test PyPI but the install requirements such as numba can still be resolved on main PyPI.
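For completeness, the upload step that precedes this smoke test might look like the following (a sketch assuming twine is installed and a built distribution sits in dist/):

python -m twine upload --repository-url https://test.pypi.org/legacy/ dist/*

After the upload, the pip command above pulls yourpackage from test PyPI while still resolving dependencies such as numba against the main index.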

Related

No local packages or working download links found for python_version (though it exists)

I am trying to install/setup this git repo:
https://github.com/xiahongze/pdf_annot
I am getting this error:
> python setup.py install
:
:
Installed e:\work\projs\scraping\scan_pdfs\scanpdfs\lib\site-packages\pdf_annot-0.1.0-py3.7.egg
Processing dependencies for pdf-annot==0.1.0
Searching for python_version>=3.7
Reading https://pypi.org/simple/python_version/
No local packages or working download links found for python_version>=3.7
error: Could not find suitable distribution for Requirement.parse('python_version>=3.7')
My python version is:
> python --version
Python 3.7.6
Same issue happens with pip install. Is there a way to fix this?
It is a bug in setup.py: the Python version constraint was placed in install_requires, so setuptools went looking for a package literally named python_version. This is how it must be written:
python_requires='>=3.7',
install_requires=[
'PyMuPDF>=1.16'
],
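For context, a minimal corrected setup.py might look like this sketch (the name and version come from the install log above; the package layout is assumed, not copied from the repo):

from setuptools import setup

setup(
    name='pdf_annot',
    version='0.1.0',
    packages=['pdf_annot'],  # assumed layout
    # the interpreter constraint belongs in python_requires;
    # install_requires may only list installable distributions
    python_requires='>=3.7',
    install_requires=[
        'PyMuPDF>=1.16',
    ],
)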
Let's see: pdf-annot's only release, version 0.1.0, was in 2019, and there is not a single bug report. The package seems unused, broken, and abandoned.
You can try to send a pull request to fix it, but I suspect it will get stuck. Perhaps the best course of action for you is to fork the repository, fix it, and install from your fork.
PS. Well, there is a package python-version with a single release, version 0.2. It is certainly not a dependency for this package.
Please make sure the network is connected.

Full installation of tensorflow (all modules)?

I have this repository: https://github.com/layog/Accurate-Binary-Convolution-Network. As requirements.txt says, it requires tensorflow==1.4.1. I am using miniconda (on Ubuntu 18.04) and, for the love of God, I can't get it to run (it errors out at the line below):
from tensorflow.examples.tutorial.* import input_data
This gives me an ImportError saying it can't find tensorflow.examples. I have diagnosed the problem: a few modules are missing after I install tensorflow (I have tried all of the ways below):
pip install tensorflow==1.4.1
conda install -c conda-forge tensorflow==1.4.1
#And various wheel packages available on the internet for 1.4.1
pip install tensorflow-1.4.0rc1-cp36-cp36m-manylinux1_x86_64.whl
The question is: if I want all the modules present in the git repo source in my installed copy, do I have to COMPLETELY build tensorflow from source? If yes, can you mention the flags I should use? Are there any wheel packages available that have all modules present in them?
A link would save me tonnes of effort!
NOTE: Even if I manually import the examples directory, it says tensorflow.contrib is missing, and if I import that locally too, another ImportError pops up. There has to be an easier way, I am sure of it.
Just for reference for others stuck in the same situation:
Use the latest tensorflow build and bazel 0.27.1 to install it. Even though the requirements state that we need an older version, use the newer one instead; it is not worth the hassle, and the newer build will get the job done.
Also, to answer the question above: building only specific directories is possible. Each module contains a BUILD file, which is fed to bazel.
See the name attributes in that file to build targets specific to that folder. For reference, the command I used to generate the wheel package for examples.tutorials.mnist:
bazel build --config=opt --config=cuda --incompatible_load_argument_is_label=false //tensorflow/examples/tutorials/mnist:all_files
Here all_files is the name found in the examples/tutorials/mnist/BUILD file.
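Once the build is installed, a quick sanity check that the tutorial module resolves could look like this (a sketch against the TF 1.x API; it downloads MNIST on first run):

# verify that tensorflow.examples.tutorials is importable
from tensorflow.examples.tutorials.mnist import input_data

# read_data_sets downloads MNIST into the given directory on first use
mnist = input_data.read_data_sets("MNIST_data/", one_hot=True)
print(mnist.train.num_examples)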

How to compare requirement file and actually installed Python modules?

Given requirements.txt and a virtualenv environment, what is the best way to check from a script whether requirements are met and possibly provide details in case of mismatch?
Pip changes its internal API with major releases, so I have seen advice not to use its parse_requirements method.
There is pkg_resources.require(dependencies), but then how do I parse the requirements file with all its fanciness, like github links, etc.?
This should be something pretty simple, but I can't find any pointers.
UPDATE: programmatic solution is needed.
You can save your virtualenv's currently installed packages with pip freeze to a file, say current.txt:
pip freeze > current.txt
Then you can compare this to requirements.txt with difflib using a script like this:
import difflib

# sort both lists so that ordering differences between the files
# don't show up as spurious diff lines
with open('requirements.txt') as f:
    req = sorted(f.readlines())
with open('current.txt') as f:
    current = sorted(f.readlines())

# lines prefixed with '- ' are in requirements.txt but not in current.txt
diff = difflib.ndiff(req, current)
delta = ''.join(x for x in diff if x.startswith('- '))
print(delta)
This should display only the packages that are in requirements.txt but aren't in current.txt.
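Since the update asks for a programmatic solution, here is a minimal sketch built on the pkg_resources API the question already mentions (it assumes requirements.txt contains plain requirement lines, not github links):

import pkg_resources

# check each declared requirement against the installed distributions
with open('requirements.txt') as f:
    for req in pkg_resources.parse_requirements(f):
        try:
            pkg_resources.require(str(req))
        except pkg_resources.DistributionNotFound:
            print(f'missing: {req}')
        except pkg_resources.VersionConflict as e:
            print(f'version conflict: {req} (installed: {e.dist})')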
Got tired of the discrepancies between requirements.txt and the actually installed packages (e.g. when deploying to Heroku, I'd often get ModuleNotFoundError for forgetting to add a module to requirements.)
This helps:
Use compare-requirements (GitHub)
(you'll need to pip install pipdeptree to use it.)
It's then as simple as...
cmpreqs --pipdeptree
...to show you (in "Input 2") which modules are installed, but missing from requirements.txt.
You can then examine the list and see which ones should in fact be added to requirements.txt.

Install transitive bitbucket dependencies via pip

The situation I'm trying to resolve: installing a package from a private repository on bitbucket which has its own dependency on another private repository on bitbucket.
I use this to kick off the install:
pip install -e git+https://bitbucket.org/myuser/project-one.git/master#egg=django_one
which then attempts to install its dependencies, declared in setup.py as:
install_requires = ['project-two',],
dependency_links = ['git+https://bitbucket.org/myuser/project-two.git/master#egg=project_two'],
This fails, the pip log looks like:
Downloading/unpacking project-two (from project-one)
Getting page https://pypi.python.org/simple/project-two/
Could not fetch URL https://pypi.python.org/simple/project-two/: HTTP Error 404: Not Found (project-two does not have any releases)
Will skip URL https://pypi.python.org/simple/project-two/ when looking for download links for project-two (from project-one)
Getting page https://pypi.python.org/simple/
URLs to search for versions for project-two (from project-one):
* https://pypi.python.org/simple/project-two/
* git+https://bitbucket.org/myuser/project-two.git/master#egg=project-two
Getting page https://pypi.python.org/simple/project-two/
Cannot look at git URL git+https://bitbucket.org/myuser/project-two.git/master#egg=project-two
Could not fetch URL https://pypi.python.org/simple/project-two/: HTTP Error 404: Not Found (project-two does not have any releases)
Will skip URL https://pypi.python.org/simple/project-two/ when looking for download links for project-two (from project-one)
Skipping link git+https://bitbucket.org/myuser/project-two.git/master#egg=project-two; wrong project name (not project-two)
Could not find any downloads that satisfy the requirement project-two (from project-one)
The curious thing about this setup is, if I take a clone of project-one and run
python setup.py install
from there, project-two is fetched from bitbucket and installed into my virtualenv. My understanding was that pip uses setuptools under the hood, so my assumption was that the success of that test validated my approach.
Any suggestions appreciated.
FOLLOW UP:
So the accepted answer is quite right, but my problem had the additional complexity of being a private repo (https + HTTP basic auth). Using the syntax
dependency_links=["http://user:password@bitbucket.org/myuser/..."]
still caused a 401. Running up a shell and using pip.download.py to run urlopen demonstrates the underlying problem (i.e. pip needs additional urllib2 setup to get this working).
The problem is mentioned here but I couldn't get that working.
pip created the idea of a VCS installation, so you can use git+https://path/to/repo.git, but setuptools does not understand that.
When you create a setup.py file you are using only setuptools (no pip involved), and setuptools does not understand that kind of URL.
You can use dependency_links with tarballs or zip files, but not with git repositories.
Replace your dependency_links with:
dependency_links=["https://bitbucket.org/myuser/project-two/get/master.zip#egg=project-two"]
And check if it works.
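Put together, the relevant part of project-one's setup.py would then look like this sketch (project names taken from the question; the version and other metadata are assumed):

from setuptools import setup

setup(
    name='project-one',
    version='0.1',  # assumed
    install_requires=['project-two'],
    # setuptools can follow tarball/zip links, but not git+ URLs
    dependency_links=[
        'https://bitbucket.org/myuser/project-two/get/master.zip#egg=project-two',
    ],
)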
There is a similar question at https://stackoverflow.com/a/14928126/565999
References:
http://peak.telecommunity.com/DevCenter/setuptools#dependencies-that-aren-t-in-pypi

Problem using easy_install.exe to install pyMySQL for Django

I'm trying to install the pyMySQL module for python so that I can setup Django (see this previous question).
I can't get easy_install.exe PyMySQL-0.3-py2.6.egg to run for the life of me. Every time I get the error: easy_install.exe is not recognized as an internal or external command... I've tried adding various directories to my system path and running various commands, including:
C:\Python27\Lib\site-packages\;
C:\Python27\Scripts\;
C:\Python27\Scripts\easy_install.exe
C:\Python27\Scripts\easy_install.exe PyMySQL-0.3-py2.6.egg
What am I missing that is keeping this from executing? (Note: I'm on Windows 7.)
You have to install setuptools first.
[edit]
Uh,
C:\Users\Robus>easy_install
Yada yada, not found
C:\Python26\Scripts>easy_install
error: No urls, filenames, or requirements specified (see --help)
C:\Python26>
The next best thing I can think of is: do you, by any chance, have more than one version of python installed? In that case setuptools might have been installed somewhere else.
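If the PATH changes still don't take effect, a simple workaround is to run easy_install from the Scripts directory itself (paths assumed from the question; also open a fresh console after editing PATH, since already-running consoles don't pick up the change):

cd C:\Python27\Scripts
easy_install.exe PyMySQL-0.3-py2.6.egg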
