Pip installing from GitHub installs only the __init__.py file - Python

I'm trying to grasp the git(hub) way of managing software. I have a repository:
https://github.com/pythonishvili/django-inguri
And I try to pip install it with this command:
pip install git+git://github.com/pythonishvili/django-inguri.git
The response I get:
Downloading/unpacking git+git://github.com/pythonishvili/django-inguri.git
Cloning git://github.com/pythonishvili/django-inguri.git to /tmp/pip-bv5r89-build
Running setup.py egg_info for package from git+git://github.com/pythonishvili/django-inguri.git
Installing collected packages: inguri
Running setup.py install for inguri
Successfully installed inguri
Cleaning up...
But the installation clearly went wrong, because all I get in my virtualenv (/home/username/.virtualenvs/envname/lib/python2.7/site-packages/inguri) is two files:
__init__.py
__init__.pyc
What did I do wrong? How do I make this work?

I believe you need to add all the subdirectories of your project to the packages option of your setup.py file. Right now, you have just the outermost directory - inguri. You would need to add inguri.ads, inguri.ads.migrations and so forth (as they contain .py files too which you want to include in your distribution).
You also need to add the following line in your manifest file: recursive-include inguri *
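Rather than listing every subpackage by hand, setuptools can also discover them with find_packages(). A minimal sketch, recreating the question's layout in a scratch directory (since the actual repo isn't checked out here) to show what it would return:

```python
import os
import tempfile

from setuptools import find_packages

# Recreate the question's package layout in a throwaway directory.
root = tempfile.mkdtemp()
for pkg in ("inguri", "inguri/ads", "inguri/ads/migrations"):
    os.makedirs(os.path.join(root, pkg), exist_ok=True)
    open(os.path.join(root, pkg, "__init__.py"), "w").close()

# find_packages() returns every directory containing an __init__.py.
print(sorted(find_packages(where=root)))
# → ['inguri', 'inguri.ads', 'inguri.ads.migrations']
```

In setup.py you would then write packages=find_packages() instead of packages=['inguri'], and new subpackages get picked up automatically.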

Related

pip and tox ignore full path dependencies, instead look for "best match" in pypi

This is an extension of SO setup.py ignores full path dependencies, instead looks for "best match" in pypi
I am trying to write a setup.py that installs a proprietary package from a .tar.gz file on an internal web site. Unfortunately for me, the proprietary package's name duplicates a public package on PyPI, so I need to force installation of the proprietary package at a specific version. I'm building a Docker image from a Debian-Buster base image, so pip, setuptools and tox are all freshly installed; the image brings Python 3.8, and pip upgrades itself to version 21.2.4.
Solution 1 - dependency_links
I followed the instructions at the post linked above to put the prop package in install_requires and dependency_links. Here are the relevant lines from my setup.py:
install_requires=["requests", "proppkg==70.1.0"],
dependency_links=["https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz#egg=proppkg-70.1.0"]
Installation is successful in Debian-Buster if I run python3 setup.py install in my package directory. I see the proprietary package get downloaded and installed.
Installation fails if I run pip3 install .; tox (version 3.24.4) fails similarly. In both cases, pip shows the message "Looking in indexes", then fails with "ERROR: Could not find a version that satisfies the requirement".
Solution 2 - PEP 508
Following the SO answer pip ignores dependency_links in setup.py, which states that dependency_links is deprecated, I started over and revised setup.py to have:
install_requires=[
    "requests",
    "proppkg @ https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz#egg=proppkg-70.1.0"
],
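For reference, a PEP 508 direct reference uses an @ between the distribution name and the URL. The syntax can be sanity-checked with the packaging library (which pip and setuptools vendor); a quick sketch, assuming packaging is importable in your environment:

```python
from packaging.requirements import Requirement

# Parse a PEP 508 direct-reference requirement string.
req = Requirement(
    "proppkg @ https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz"
)
print(req.name)  # → proppkg
print(req.url)   # → https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz
```

A malformed specifier raises packaging.requirements.InvalidRequirement, which is a fast way to rule out syntax problems before blaming pip or tox.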
Installation is successful in Debian-Buster if I run pip3 install . in my package directory. Pip shows a message "Looking in indexes" but still downloads and installs the proprietary package successfully.
Installation fails in Debian-Buster if I run python3 setup.py install in my package directory. I see these messages:
Searching for proppkg@ https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz#egg=proppkg-70.1.0
..
Reading https://pypi.org/simple/proppkg/
..
error: Could not find suitable distribution for Requirement.parse(...).
Tox also fails in this scenario as it installs dependencies.
Really speculating now, it almost seems like there's an ordering issue. Tox invokes pip like this:
python -m pip install --exists-action w .tox/.tmp/package/1/te-0.3.5.zip
In that output I see "Collecting proppkg@ https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz#egg=proppkg-70.1.0" as the first step. That install fails because it cannot import the requests package. Then tox continues collecting other dependencies. Finally, tox reports "Collecting requests" as its last step (and that succeeds). Do I have to worry about the ordering of install steps?
I'm starting to think that maybe the proprietary package is broken. I verified that the prop package setup.py has requests in its install_requires entry. Not sure what else to check.
Workaround solution
My workaround is to install the proprietary package in the Docker image as a separate step, before I install my own package, just by running pip3 install https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz. The setup.py keeps the PEP 508 URL in install_requires. pip and tox then find the proprietary package in the pip cache, and work fine.
Please suggest what to try for the latest pip and tox, or if this is as good as it gets, thanks in advance.
Update - add setup.py
Here's a (slightly sanitized) version of my package's setup.py
from setuptools import setup, find_packages


def get_version():
    """read version string"""
    version_globals = {}
    with open("te/version.py") as fp:
        exec(fp.read(), version_globals)
    return version_globals['__version__']


setup(
    name="te",
    version=get_version(),
    packages=find_packages(exclude=["tests.*", "tests"]),
    author="My Name",
    author_email="email@mycompany.com",
    description="My Back-End Server",
    entry_points={"console_scripts": [
        "te-be=te.server:main"
    ]},
    python_requires=">=3.7",
    install_requires=[
        "connexion[swagger-ui]",
        "Flask",
        "gevent",
        "redis",
        "requests",
        "proppkg @ https://site.mycompany.com/path/to/proppkg-70.1.0.tar.gz#egg=proppkg-70.1.0"
    ],
    package_data={"te": ["openapi_te.yml"]},
    include_package_data=True,  # read MANIFEST.in
)

How do you build a wheel outside of a repo containing the package?

Question
Is there a way to build a wheel for a package from a different directory, such that the wheel is built exactly as it would be if you built it inside the repository containing the package?
Example
Consider the following repo:
/repo-containing-your-package
|___ your_module/
|___ setup.py
Build method A
When I run python setup.py bdist_wheel from within repo-containing-your-package, it builds the wheel as expected, including your_module. This means that after installing it (pip install ./dist/your_module-#.#.#-py3-none-any.whl, which succeeds), I can run python -m your_module.foo from the command line.
When the package is building, I get output that verifies that my module has been picked up by the wheel:
creating 'dist/your_module-#.#.#-py3-none-any.whl' and adding 'build/bar' to it
adding 'your_module/__init__.py'
etc...
Build method B
However, if I run python ../repo-containing-your-package/setup.py bdist_wheel from a repository that is a sibling of repo-containing-your-package, it does not build the wheel as expected: it fails to include your_module. After installing it (pip install ./dist/your_module-#.#.#-py3-none-any.whl, which succeeds), attempting python -m your_module.foo fails:
Error while finding module specification for 'your_module.foo' (ModuleNotFoundError: No module named 'your_module')
The fact that the module has not been properly installed with the package is confirmed by reviewing the build output, which does not include the adding 'your_module' output that method A includes.
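A likely explanation: find_packages() (and most path handling in a setup script) resolves against the current working directory, not the directory containing setup.py. A sketch demonstrating the cwd dependence with a throwaway layout mirroring the example above:

```python
import os
import tempfile

from setuptools import find_packages

# Recreate repo-containing-your-package with your_module inside it.
repo = tempfile.mkdtemp()
os.makedirs(os.path.join(repo, "your_module"))
open(os.path.join(repo, "your_module", "__init__.py"), "w").close()

sibling = tempfile.mkdtemp()  # stands in for the sibling repository

# find_packages() with no argument searches os.getcwd():
os.chdir(repo)
print(find_packages())  # → ['your_module']

os.chdir(sibling)
print(find_packages())  # → []
```

Run from the sibling directory, package discovery comes up empty, which matches the wheel missing your_module.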
Two solutions I know of:
change working directory in setup.py
If you can modify the setup script, you can change the working directory programmatically. Add an os.chdir call early enough in the setup script:
import os
from setuptools import setup

# __file__ may be a relative path (its dirname can even be empty when the
# script is run from its own directory), so resolve it before changing.
os.chdir(os.path.dirname(os.path.abspath(__file__)))

setup(...)
You can also change the working directory with other means without having to modify the setup script, e.g. in bash:
$ pushd path/to/repo; python setup.py bdist_wheel; popd
Use pip wheel
pip has a subcommand wheel that builds a wheel from the given arg; this arg is usually the name of the package, but can be a directory containing the setup script. Pass -e in that case so the wheel has the correct name:
$ pip wheel -e path/to/repo

What setup.py command will update the source in site-packages?

I have run
python setup.py sdist --formats=gztar,zip bdist_wheel
and then
python setup.py install
The result is that the egg files are created in the site-packages directory, but not the <package-name>/<package-source files> directory:
$ls /usr/local/lib/python3.7/site-packages/infix*
/usr/local/lib/python3.7/site-packages/infixpy-0.0.3-py3.7.egg
/usr/local/lib/python3.7/site-packages/infixpy.egg-link
/usr/local/lib/python3.7/site-packages/infixpy-0.0.4-py3.7.egg
Notice that the directory infixpy was not created - and thus none of the source code was copied. What am I missing / not understanding in this local installation process?
Update When I had run
pip3 install infixpy
there was an additional directory infixpy, and the source code was included in that directory. Running the local or develop modes of setup.py install was not causing that code to be updated, and - crucially - the stack traces from running any Python code (even in a completely new ipython repl) were showing only the older, pip3-installed code, in particular the file __init__.py. So my observation has been that the source file:
/usr/local/lib/python3.7/site-packages/infixpy/__init__.py
is an accurate reflection of what the Python executable was using. @phd mentions that the source code is already included in the egg. So then I do not understand the relationship between the source code in the egg and the source code in that subdirectory - which in my latest run is completely missing.
The following commands all yield slightly different results:
pip install .: installed as uncompressed package directories and a XXX.dist-info directory
pip install infixpy: same as previous, but installed from an (remote) index (per default PyPI), not from the local directory
python setup.py install: installed as a zipped file XXX.egg
pip install --editable . or python setup.py develop: not installed, but linked as a XXX.egg-link file
So depending on the commands entered, the content of site-packages is different.
Now this is what you say you have:
$ls /usr/local/lib/python3.7/site-packages/infix*
/usr/local/lib/python3.7/site-packages/infixpy-0.0.3-py3.7.egg
/usr/local/lib/python3.7/site-packages/infixpy.egg-link
/usr/local/lib/python3.7/site-packages/infixpy-0.0.4-py3.7.egg
This is a bit surprising, since theoretically there are 3 versions of your project that are importable (0.0.3, 0.0.4, and develop/editable). I am not sure which one is used by the Python interpreter in this case. You might want to run pip uninstall infixpy a couple of times to start fresh and alleviate these uncertainties. You can then experiment with the commands mentioned above and see how they impact the content of site-packages along with inspecting the result of pip show infixpy.
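One way to remove the guesswork about which of the three copies wins is to ask the interpreter directly. A small sketch (substitute infixpy, which is not installed here, for whatever package you are debugging):

```python
import importlib.util

def resolve(name):
    """Return the file the interpreter would import for a package, or None."""
    spec = importlib.util.find_spec(name)
    return spec.origin if spec else None

print(resolve("json"))     # a stdlib package, always resolvable
print(resolve("infixpy"))  # None unless infixpy is actually installed
```

The printed path shows exactly which site-packages entry (directory, egg, or egg-link target) Python resolves, without having to reason about sys.path ordering by hand.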

How to install data_files to absolute path?

I use pip with setuptools to install a package.
I want pip to copy some resource files to, say, /etc/my_package.
My setup.py looks like this:
setup(
    ...
    data_files=[('/etc/my_package', ['config.yml'])]
)
When running pip install, the file ends up in
~/.local/lib/python3.5/site-packages/etc/my_package/config.yml
instead of /etc/my_package.
What am I doing wrong?
(pip version 9.0.1)
Short answer: use pip install --no-binary :all: to install your package.
I struggled with this for a while and eventually figured out that there is some weirdness/inconsistency in how data_files are handled between binary wheels and source distributions. Specifically, there is a bug with wheels that makes all paths in data_files relative to the install location (see https://github.com/pypa/wheel/issues/92 for an issue tracking this).
"That's fine", you might say, "but I'm not using a wheel!". Not so fast! It turns out recent versions of pip (I am working with 9.0.1) will try to compile a wheel even from a source distribution. For example, if you have a package my_package you can see this by doing something like
$ python setup.py sdist # create source tarball as dist/my_package.tar.gz
[...]
$ pip install dist/my_package.tar.gz # install the generated source
[...]
Building wheels for collected packages: my_package
Running setup.py bdist_wheel for my_package ... done
pip tries to be helpful and build a wheel to install from and cache for later. This means you will run into the above bug even though in theory you are not using bdist_wheel yourself. You can get around this by running python setup.py install directly from the package source folder. This avoids the building and caching of built wheels that pip will try to do but is majorly inconvenient when the package you want is already on PyPI somewhere. Fortunately pip offers an option to explicitly disable binaries.
$ pip install --no-binary :all: my_package
[...]
Skipping bdist_wheel for my_package, due to binaries being disabled for it.
Installing collected packages: my_package
Running setup.py install for my_package ... done
Successfully installed my_package-0.1.0
Using the --no-binary option prevents wheel building and lets us reference absolute paths in our data_files paths again. For the case where you are installing a lot of packages together and want to selectively disable wheels you can replace :all: with a comma separated list of packages.
It seems that data_files can't support absolute paths; it will prepend sys.prefix to "/etc/my_package". If you want to put config.yml into .../site-packages/my_package, please try:
import os
import sys
from distutils.sysconfig import get_python_lib

relative_site_packages = get_python_lib().split(sys.prefix + os.sep)[1]
data_files_relative_path = os.path.join(relative_site_packages, "my_package")

setup(
    ...
    data_files=[(data_files_relative_path, ['config.yml'])]
)
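Note that distutils.sysconfig is deprecated (distutils was removed in Python 3.12); a hedged equivalent of the same path computation using the stdlib sysconfig module:

```python
import os
import sys
import sysconfig

# Absolute site-packages path, e.g. /usr/lib/python3.11/site-packages
site_packages = sysconfig.get_path("purelib")

# Make it relative to the interpreter prefix, as the snippet above does
# with string-splitting on sys.prefix.
relative_site_packages = os.path.relpath(site_packages, sys.prefix)
print(os.path.join(relative_site_packages, "my_package"))
```

os.path.relpath also avoids the IndexError the split-based version raises when site-packages does not live under sys.prefix (e.g. with a user-site install).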
I ended up writing an init() function that installs the config file on first run instead of creating it during the installation:
import os
from os import path
from shutil import copyfile

import pkg_resources

# paths from the question
config_dir = "/etc/my_package"
config_file = path.join(config_dir, "config.yml")


def init():
    try:
        if not path.isdir(config_dir):
            os.mkdir(config_dir)
        copyfile(pkg_resources.resource_filename(
            __name__, "default_config.yml"), config_file)
        print("INFO: config file created.")
    except IOError as ex:
        print("ERROR: could not create config directory: " + str(ex))


if __name__ == "__main__":
    init()
    main()

Any methods to deploy Python packages with 'pip | easy_install' + '*.pyc only' + 'flat namespace packages' + virtualenv?

Goals:
Make use of modern Python packaging toolsets to deploy/install proprietary packages into some virtualenv.
The installed packages should include compiled *.pyc(or *.pyo) only without source files.
There are a couple of packages, and a vendor name (here we choose dgmx for our studio) is used as the namespace package name. Therefore, the installed packages would be something like dgmx/alucard, dgmx/banshee, dgmx/carmilla, ...
The file hierarchy of installed packages should be like ones by python setup.py install --single-version-externally-managed or pip install. Refer to How come I can't get the exactly result to *pip install* by manually *python setup.py install*?
Question in short:
I'd like to deploy proprietary namespaced packages into a virtualenv using only compiled *.pyc (or *.pyo) files, in which the file/directory hierarchy just reflects the namespace, without polluting sys.path with lots of ooxx.egg paths.
Something I have tried:
1. python setup.py bdist_egg --exclude-source-files, then easy_install ooxx.egg.
   - pollutes "sys.path" for each namespace package.
2. python setup.py install --single-version-externally-managed.
   - not *.pyc only.
   - the "install_requires" got ignored!
   - need to manually put an ooxx.egg-info/installed-files.txt to make uninstall work correctly.
3. pip install . in the location of "setup.py".
   - not *.pyc only.
4. pysetup install . in the location of "setup.py".
   - not *.pyc only.
Update:
My current idea is to follow method 2.
python setup.py egg_info --egg-base . # get requires.txt
python setup.py install --single-version-externally-managed --record installed-files.txt # get installed-files.txt
manually install other dependencies through "requires.txt"
manually delete installed source files (*.py) through "installed-files.txt"
remove source files (*.py) from "installed-files.txt" and put it into deployed "ooxx.egg-info/installed-files.txt"
References:
Migrating to pip+virtualenv from setuptools
installing only .pyc (python compiled) with setuptools
Can I deploy Python .pyc files only to Google App Engine?
How come I can't get the exactly result to *pip install* by manually *python setup.py install*?
A trick that may help:
Compile your sources into .pyc files and zip them up in a single .zip file.
Write a new, simple module; all it does is add the .zip to sys.path.
So when you import this module, the .zip is on the path. All you then have to do, in a custom step in setup.py, is copy the zip file to the proper place.
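A minimal sketch of that trick, using a throwaway module name (greet) and relying on the fact that zipimport can load .pyc files straight from a zip:

```python
import compileall
import os
import sys
import tempfile
import zipfile

# 1. Byte-compile a source tree; legacy=True writes greet.pyc next to
#    greet.py instead of under __pycache__/.
src = tempfile.mkdtemp()
with open(os.path.join(src, "greet.py"), "w") as f:
    f.write("def hello():\n    return 'hello'\n")
compileall.compile_dir(src, legacy=True, quiet=1)

# 2. Zip up only the .pyc files - no sources go into the archive.
archive = os.path.join(tempfile.mkdtemp(), "payload.zip")
with zipfile.ZipFile(archive, "w") as zf:
    for name in os.listdir(src):
        if name.endswith(".pyc"):
            zf.write(os.path.join(src, name), name)

# 3. The "loader" module's only job would be this one line:
sys.path.insert(0, archive)

import greet  # imported from the .pyc inside the zip
print(greet.hello())  # → hello
```

One caveat: .pyc files are tied to the interpreter's magic number, so the deploying and consuming Python versions must match exactly.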
