pip target and post-installation in setup.py

Today I attempted to remove a file after my package (a Python wheel) was installed via pip with the -t/--target option.
I am subclassing install in my setup.py like this:
import os

from setuptools.command.install import install

class PostInstallCommand(install):
    """Post-installation for installation mode."""
    def run(self):
        install.run(self)
        # remove the unwanted file from the installed package
        p = os.path.join(self.install_libbase, "myPackage/folder/removeThisPyc.pyc")
        if os.path.isfile(p):
            os.unlink(p)
        # There are also self.install_platlib and self.install_purelib,
        # which seem to be used by pip's distutils scheme.
        # I have not tested those yet.
When running
python setup.py install
this works: the file is removed upon install.
But through
pip install path-to-my-wheel.whl
this does not work and the file is still there.
pip install -t /target/dir path-to-my-wheel.whl
does not work either...
So the question is: what is pip doing with distutils and/or setuptools, and how can I make this work?
Another thing I noticed is that pip does not seem to print anything I output from my setup.py, even in verbose mode.
Is there a way to see the full output from Python instead of only pip's own messages?

Further reading:
http://pythonwheels.com/
2. Avoids arbitrary code execution for installation. (Avoids setup.py)
As I am using wheels, and wheels won't execute setup.py, my approach is a non-starter.
https://github.com/pypa/packaging-problems/issues/64
I guess this falls between deployment and installation, though I would obviously count my little change as part of installation...
Is there a way to avoid .pyc file creation during a pip install of a wheel?
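On the .pyc question specifically: pip does expose a --no-compile flag that skips byte-compiling .py files at install time (this only affects files pip would compile during the install; it does not remove .pyc files already shipped inside the wheel):

```shell
# Install without byte-compiling .py files to .pyc
pip install --no-compile path-to-my-wheel.whl
```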


List installed packages from within setup.py

We have multiple versions of our package: package1 and package1-unstable, similar to tensorflow and tf-nightly. These are different packages on PyPI but install the same module. This causes issues when both packages are installed, as they overlap and write into the same directories in the site-packages folder. When one is removed, the other package remains, but most of the module code is now gone, leaving an even worse, dysfunctional state.
What is the cleanest way to detect colliding packages?
We can hardcode that package1 and package1-unstable are mutually incompatible. We use setup.py for the installation.
My thinking was to use a wrapper class around the install command class:
from setuptools.command.install import install

class Install(install):
    def run(self):
        if name == "package1":
            self.ensure_not_installed("package1-unstable")
        else:
            self.ensure_not_installed("package1")
        install.run(self)

    def ensure_not_installed(self, pkg_name):
        """Raises an error when pkg_name is installed."""
        ...

...
cmdclass={'install': Install},
This approach seems to work as a general direction. However, I'm unsure yet about how to list exhaustively the installed packages. I'm testing the approaches with both pip install . and python setup.py install.
A couple of approaches that I tried are:
use site.getsitepackages(), iterate through the directories and check for the existence of the given package directories (i.e. package1-{version}.dist-info or package1-unstable-{version}.dist-info). This can work, but it feels hacky and manual, and I'm not confident it will behave portably across all OSes and Python distributions
try to call pip list or pip show package1 from within setup.py. This does not seem to work when the setup script is executed via pip install ., as pip itself is not on the import path
pkg_resources.working_set works with python setup.py install but not with pip install ., probably for similar reasons: when the setup script runs under pip install ., the working set contains only wheel and setuptools
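For the listing itself, the stdlib importlib.metadata (Python 3.8+) can enumerate installed distributions without importing pip. A minimal sketch of the ensure_not_installed helper on that basis; note the same caveat applies as with pkg_resources: it only sees the environment the code runs in, so inside pip's isolated build environment it will not see the target site-packages:

```python
from importlib import metadata

def ensure_not_installed(pkg_name):
    """Raise when a distribution named pkg_name is installed."""
    try:
        metadata.version(pkg_name)
    except metadata.PackageNotFoundError:
        return  # not installed: fine
    raise RuntimeError(
        f"{pkg_name} is installed and conflicts with this package; "
        f"please uninstall it first"
    )

# Exhaustively list the distributions importlib.metadata can see:
names = sorted(
    d.metadata["Name"] for d in metadata.distributions() if d.metadata["Name"]
)
```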
In the general case you can't implement this as part of setup.py, as pip will build your package into a wheel, cache it, and then never invoke setup.py again. You're probably better off with some sort of post-installation check that is run a different way (a Makefile, tox.ini, etc.).
You can disable isolated builds by either
pip install --no-build-isolation .
or
PIP_NO_BUILD_ISOLATION=0 pip install .
However, some package installs rely on being invoked in an isolated environment. Other times the packaging routine relies on a pyproject.toml, which would be ignored in non-isolated builds.

setup.py install vs pip install

I want to create a Python package which will be cloned from its git repo when a build runs, so I will have the source inside the build agent. I would then like to run the package as a command line tool; the package is called environment_manager.
Initially I thought I would follow a tutorial for creating a simple setup.py, but this has proved a lot more difficult than expected: whenever I run python setup.py install --force I am not able to use my installed package, and I generally get either "module not found" or the command is not recognised when I type it.
I have found that if I simply install with pip install ., I am actually able to use the tool from the command line and it works. I don't understand what the difference is, or why it only works with the pip install method.
Below is the setup.py file, I cannot see what is wrong with it:
from setuptools import setup, find_namespace_packages
import pathlib

here = pathlib.Path(__file__).parent.resolve()

# Get the long description from the README file
long_description = (here / 'README.MD').read_text(encoding='utf-8')

setup(
    name='environment_manager',
    version='1.0.0',
    package_dir={'': 'src'},
    packages=find_namespace_packages(
        where='src',
        include=['environment_manager', 'environment_manager.*'],
    ),
    python_requires='>=3.8, <4',
    install_requires=['boto3', 'botocore', 'pyyaml'],
    extras_require={
        'dev': ['pre-commit', 'black', 'pylint'],
        'test': ['pytest', 'pytest-mock', 'coverage'],
    },
    entry_points={
        'console_scripts': [
            'environment-manager=environment_manager.environment_controller:main',
        ],
    },
)
My project structure looks like:
environment_manager
    src/
        conf/
        environment_manager/
            environment_controller.py
            config_parser.py
            command.py
    test/
        unit_tests.py
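For the console_scripts entry point to resolve, environment_manager/environment_controller.py must expose a main callable; a minimal, hypothetical sketch of what that file could contain:

```python
# src/environment_manager/environment_controller.py (hypothetical sketch)
import sys

def main(argv=None):
    """Entry point for the environment-manager console script."""
    argv = sys.argv[1:] if argv is None else argv
    print(f"environment-manager invoked with arguments: {argv}")
    return 0

if __name__ == "__main__":
    sys.exit(main())
```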
I thought the correct way to install and run the tool from the command line was to use setup.py and setuptools but it seems like it is a lot easier and actually works if I just install it with pip.
Is installing it with pip rather than setup.py correct (both ways the package appears when I type pip list), and are there any issues with my setup.py script? The script was adapted from the PyPA sample project; I removed most of what I didn't need.
setup.py is a Python file, which usually indicates that the module/package you are about to install has been packaged and distributed with Distutils, the standard for distributing Python modules. This allows you to easily install Python packages; often it's enough to write: $ pip install .
In other words, setup.py is a packaging file while pip is a package manager, so you need a setup.py file to be able to install with pip.
pip is a package manager which helps install, manage, and uninstall Python packages. It searches for them on PyPI, downloads them, and then runs their setup.py script.
Since you mentioned that you can run your binary executable after a pip install, but not a setup.py install, it is likely that each of them is installing the binary to separate locations.
One thing I would check is that you are using python and pip from the same version of Python, e.g:
% python --version
Python 3.8.6
% pip --version
pip 20.1.1 from /usr/lib/python3.8/site-packages/pip (python 3.8)
If these have different Python versions listed, they are likely installing to two separate directories - one in your PATH environment variable, and one which is not.
Next, I would check pip list -v after each install method, as this should list a Location header telling you where the package has been installed.
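To see which copy of a module the interpreter would actually import (useful when two install methods write to different locations), the stdlib importlib.util.find_spec reports the resolved path; json is used below only because it is guaranteed to exist — substitute your own package name:

```python
from importlib import util

# Resolve the module the current interpreter would import
spec = util.find_spec("json")
print(spec.origin)  # filesystem path of the module that would be loaded
```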

Running custom code with pip install fails

I have a Python package that I'm distributing with pip. I need to add some custom code to be run at install time:
from setuptools import setup
from setuptools.command.install import install

class CustomInstall(install):
    def run(self):
        install.run(self)
        print("TEST")

setup(
    ...
    cmdclass={'install': CustomInstall},
    ...)
I thought the problem might be pip suppressing stdout: Custom pip install commands not running. But then I replaced the print with code that creates a file and writes some text, and that didn't happen either.
It appears that my custom run method is only happening when I create and upload my_package to test PyPI:
python setup.py sdist bdist_wheel upload -r https://testpypi.python.org/pypi
and not when I pip install it:
pip install -i https://testpypi.python.org/pypi my_package
Maybe I am fundamentally not understanding how pip and setuptools work, but that is the opposite of the behavior I expected.
My questions are:
How can I get my CustomInstall class to work?
and
What actually happens when you call pip install?
I've looked at the setuptools docs and the PyPI docs, and I haven't been able to figure it out. It seems like other people have had success with this: Run custom task when call `pip install`, so I'm not sure what's going wrong.
So I'm not sure how much this will help, but I recently dealt with a similar issue, and here's what I learned.
Your custom install code appears to be correct. However, there are more methods than just run that can be overridden. Another useful one is finalize_options because you can write code to dynamically change the parameters of your setup.py (example here.)
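As a sketch of that finalize_options hook (the command object must belong to a Distribution before its options can be finalized; the "demo" name is illustrative):

```python
from setuptools.dist import Distribution
from setuptools.command.install import install

class CustomInstall(install):
    def finalize_options(self):
        # Runs after option parsing; computed paths such as
        # install_lib are available here and can be inspected
        # or adjusted before the install actually runs.
        install.finalize_options(self)
        print("install_lib resolved to", self.install_lib)

# A command needs a Distribution to finalize its options against:
cmd = CustomInstall(Distribution({"name": "demo"}))
cmd.finalize_options()
```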
This is a very good question. pip install does various things depending on several factors: Where are you installing the package from, PyPI or some other package index? How was the package distributed, as a binary dist (.whl) or a source dist (.tar.gz)? Are you installing from a local directory, or from a remote repo via a VCS URL? pip does not necessarily use the same approach in each of these cases. I would recommend using the -vvv flag to see exactly what pip is doing; it may not be running setuptools's install command for whatever reason. Also, do you have
packages=setuptools.find_packages(),
include_package_data=True
in your setup.py file? Without these lines, pip could be installing your package's metadata but not the package itself.

"yum install package" or "python setup.py install" in CentOS?

I was wondering how "yum install package" and "python setup.py install" are used differently on CentOS. I use yum install ... all the time. However, when I try python setup.py install, I always get an error that the setup.py file couldn't be found, even though its path shows up under echo $PATH, unless I run it from its own directory or use the absolute path.
When you type python setup.py install, your shell checks your $PATH for the python command and runs that. Python then examines its arguments, which are setup.py install. It knows it can be given the name of a script, so it looks for a file called setup.py to run. Python doesn't use your $PATH to find scripts, though, so you must give it a real path to the file. If you just give it the name setup.py, it will only look in your current directory.
The source directory for a python module should not, ideally, be in your $PATH.
yum install is a command that will go to a package repository, download all the files needed to install something, and then put them in the right place. yum (and equivalents on other distributions, like apt for Debian systems) will also fetch and install any other packages you need, including any that aren't python modules.
Python has a package manager, too. You may also find using pip install modulename or pip install --user modulename (if you don't have administrative rights) easier than downloading and installing the module by hand. You can often get more recent versions of modules this way, as the ones provided by an operating system (through yum) tend to be older, more stable versions. Sometimes the module is not available through yum at all. pip can't install any extra packages that aren't python modules, though.
If you don't have pip already (it comes with Python3, but might need installing separately for Python2, depending on how it was set up), then you can install it by following the instructions here: https://pip.pypa.io/en/stable/installing/

distutils ignores changes to setup.py when building an extension?

I have a setup.py file that builds an extension. If I change one of the source files, distutils recognizes this and rebuilds the extension, showing all the compile / link commands.
However, if the only thing I change is setup.py (I'm fiddling with it, trying to make library dependencies work), then it doesn't seem to rebuild (e.g., none of the compile/link commands show up). I've tested this by removing one of the source files in the line
sources = ['foo.c', 'bar.c' ...]
and when I pip install -e . or python setup.py install, it still creates a new file for the extension, but it must be a cached version, since nothing should have compiled.
How do I clear this cache? I have tried
python setup.py clean --all
and the --ignore-installed and --no-cache-dir flags with pip install -e .
The only way I have found to force a rebuild is to add garbage to a source file (triggering a rebuild and an error), remove the garbage, and run pip install -e . again...
Just delete any files related to the package under the site-packages path (you may sometimes find more than one version, or some files packaged as zip archives), or run python setup.py clean --all. Then recompile and install again.
But I would recommend using python setup.py develop so you don't need to reinstall after every change; you can edit your code frequently without re-installing. python setup.py install is typically used to install a ready-to-use third-party package.
Check here to better understand python packaging.
Summary:
python setup.py clean --all
python setup.py develop
I needed to run
python setup.py clean --all
python setup.py develop
Thanks to DhiaTN for getting me there.