I wonder if, as with .deb packages for example, it is possible to configure the dependencies for my package in my setup.py, so that when I run:
$ sudo python setup.py install
they are installed automatically. I have already researched this on the internet, but everything I found just left me confused: things like "requires", "install_requires" and "requirements.txt".
Just create a requirements.txt in your lib folder (next to setup.py) and add all dependencies like this:
gunicorn
docutils>=0.3
lxml==0.5a7
Then create a setup.py script and read the requirements.txt in:
import os
from setuptools import setup

lib_folder = os.path.dirname(os.path.realpath(__file__))
requirement_path = os.path.join(lib_folder, 'requirements.txt')
install_requires = []  # Here we'll get: ["gunicorn", "docutils>=0.3", "lxml==0.5a7"]
if os.path.isfile(requirement_path):
    with open(requirement_path) as f:
        install_requires = f.read().splitlines()

setup(name="mypackage", install_requires=install_requires, [...])
The execution of python setup.py install will install your package and all of its dependencies. As @jwodder said, it is not mandatory to create a requirements.txt file; you can set install_requires directly in the setup.py script. But writing a requirements.txt file is a best practice.
In the setup() call you also have to set version, packages, author, etc.; read the docs for a complete example: https://docs.python.org/3/distutils/setupscript.html
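For reference, a minimal sketch of the inline variant mentioned above might look like this (the name, version and author values are placeholders; the dependency list is the same as in the requirements.txt example):
from setuptools import setup

setup(
    name="mypackage",
    version="0.1.0",
    author="Your Name",
    packages=["mypackage"],
    install_requires=[
        "gunicorn",
        "docutils>=0.3",
        "lxml==0.5a7",
    ],
)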
Your package directory will look like this:
├── mypackage
│ ├── mypackage
│ │ ├── __init__.py
│ │ └── mymodule.py
│ ├── requirements.txt
│ └── setup.py
Another possible solution
try:
    # for pip >= 10
    from pip._internal.req import parse_requirements
except ImportError:
    # for pip <= 9.0.3
    from pip.req import parse_requirements

from setuptools import setup

def load_requirements(fname):
    # note: on newer pip versions (>= 20.1) the attribute is ir.requirement (a string) instead of ir.req
    reqs = parse_requirements(fname, session="test")
    return [str(ir.req) for ir in reqs]

setup(name="yourpackage", install_requires=load_requirements("requirements.txt"))
You can also generate the egg info from your setup.py, then install from the requires.txt inside the generated egg-info directory:
$ python setup.py egg_info
$ pip install -r <your_package_name>.egg-info/requires.txt
In Python 3.4+, it is possible to use the Path class from pathlib to do effectively the same thing as @hayj's answer.
from pathlib import Path
from typing import List

import setuptools

...

def get_install_requires() -> List[str]:
    """Returns requirements.txt parsed to a list"""
    fname = Path(__file__).parent / 'requirements.txt'
    targets = []
    if fname.exists():
        with open(fname, 'r') as f:
            targets = f.read().splitlines()
    return targets

...

setuptools.setup(
    ...
    install_requires=get_install_requires(),
    ...
)
Structure:
.
├── application
│ └── runner.py
└── dummyLibrary
    ├── helperFunctions.py
    ├── __init__.py
    └── setup.py
runner.py:
import dummyLibrary
dummyLibrary.foo()
dummyLibrary.bar()
__init__.py:
(empty file)
helperFunctions.py:
def foo():
    print("called foo()")

def bar():
    print("called bar()")
setup.py:
#!/usr/bin/env python
from distutils.core import setup
setup(name='dummyLibrary', version='0.0.1')
After cd'ing into dummyLibrary/,
I tried installing dummyLibrary with
pip3 install -e .
This was the output:
Defaulting to user installation because normal site-packages is not writeable
Obtaining file:///home/ubuntu/Documents/pythonTest/dummyLibrary
Preparing metadata (setup.py) ... done
Installing collected packages: dummyLibrary
Running setup.py develop for dummyLibrary
Successfully installed dummyLibrary-0.0.1
I tried installing dummyLibrary with
python3 -m pip install -e .
This was the output:
Defaulting to user installation because normal site-packages is not writeable
Obtaining file:///home/ubuntu/Documents/pythonTest/dummyLibrary
Preparing metadata (setup.py) ... done
Installing collected packages: dummyLibrary
Attempting uninstall: dummyLibrary
Found existing installation: dummyLibrary 0.0.1
Uninstalling dummyLibrary-0.0.1:
Successfully uninstalled dummyLibrary-0.0.1
Running setup.py develop for dummyLibrary
Successfully installed dummyLibrary-0.0.1
After cd'ing into application/ and running
python3 runner.py
I get:
Traceback (most recent call last):
  File "runner.py", line 1, in <module>
    import dummyLibrary
ModuleNotFoundError: No module named 'dummyLibrary'
no matter how I try to install my library
Why is this?
Additional Information:
Using Ubuntu; not using a virtual environment.
I tried rebooting after installing. Didn't help.
Welcome to Stack Overflow!
First thing: __init__.py and helperFunctions.py should be inside another nested folder with the same name as the package.
.
├── application
│ └── runner.py
└── dummyLibrary
    ├── dummyLibrary
    │   ├── __init__.py
    │   └── helperFunctions.py
    └── setup.py
Secondly, with the code in runner.py written the way you'd like it, the __init__.py should include the following line to import all of the functions from helperFunctions.py:
from .helperFunctions import *
Finally, your setup.py should also include the packages parameter, a list of the package folder names to install; in this case it should be packages=['dummyLibrary'].
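Putting these pieces together, a minimal sketch of the corrected setup.py (keeping the distutils import and metadata from the question) could be:
#!/usr/bin/env python
from distutils.core import setup

setup(
    name='dummyLibrary',
    version='0.0.1',
    packages=['dummyLibrary'],  # the nested dummyLibrary/dummyLibrary folder
)
After reinstalling (for example with pip3 install -e . from the outer dummyLibrary/ folder), the import dummyLibrary in runner.py should resolve.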
P.S.: it's a pythonic practice to name your files and packages in snake_case rather than camelCase.
I used to be able to run the command runner.py from my module my-runner while using python setup.py develop. However, ever since I reinstalled it using python setup.py install, I get a pkg_resources.ResolutionError when calling runner.py.
This is the mini tree structure
.
├── bin
│ ├── some_other_file.py
│ ├── runner.py
├── setup.py
here is my setup.py
from setuptools import setup, find_packages

setup(
    name='my-runner',
    version='1.0.0',
    license='private',
    author='MyName',
    author_email='myname#myemail.com',
    description='My Runner',
    packages=find_packages(),
    scripts=['bin/runner.py', 'bin/some_other_file.py']
)
Running the command runner.py returns the error:
pkg_resources.ResolutionError: Script 'scripts/runner.py' not found in metadata at '/home/myname/module/my-runner.egg-info'
I have no clue why install would break it. I am guessing it has to do with the fact that develop does not read the egg-info dir, but I would like a solution to this problem.
A possible fix is to remove the package (pip3 uninstall my-runner) and reinstall it (python setup.py install).
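For example, run from the directory that contains setup.py:
$ pip3 uninstall my-runner
$ python setup.py install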
Problem statement: when I install my pip package, a specific file inside the package should get copied to the temp directory.
Approach:
My package directory structure is the following:
my-app/
├─ app/
│ ├─ __init__.py
│ ├─ __main__.py
├─ folder-with-extra-stuff/
│ ├─ __init__.py
│ ├─ file_I_want_to_cppy.tar.gz
├─ setup.py
├─ MANIFEST.in
I'm tweaking my setup.py file to do the job. The following is my setup.py:
#!/usr/bin/env python
from setuptools import setup, find_packages
from setuptools.command.install import install
import os
import sys
import shutil

rootDir = os.path.abspath(os.path.dirname(__file__))

def run_custom_install():
    print("--------Start running custom command -------")
    temp_dir = r'c:\temp' if sys.platform == "win32" else r'/tmp'
    temp_col_dir = temp_dir + os.sep + 'dump'
    os.makedirs(temp_dir, exist_ok=True)
    os.makedirs(temp_col_dir, exist_ok=True)
    print("----------locate the zip file ---------------")
    ColDirTests = os.path.abspath(os.path.join(rootDir, 'my-app', 'folder-with-extra-stuff'))
    _src_file = os.path.join(ColDirTests, 'file_I_want_to_cppy.tar.gz')
    print(f"******{_src_file}**********")
    if os.path.exists(_src_file):
        print(f"-----zip file has been located at {_src_file}")
        shutil.copy(_src_file, temp_col_dir)
    else:
        print("!!!!Couldn't locate the zip file for transfer!!!!")

class CustomInstall(install):
    def run(self):
        print("***********Custom run from install********")
        install.run(self)
        run_custom_install()

ver = "0.0.0"

setup(
    name='my_pkg',
    version=ver,
    packages=find_packages(),
    python_requires='>=3.6.0',
    install_requires=getRequirements(),  # getRequirements() is presumably defined elsewhere in the original setup.py (not shown)
    include_package_data=True,
    cmdclass={
        'install': CustomInstall,
    }
)
MANIFEST.in
include README.md
include file_I_want_to_cppy.tar.gz
recursive-include my-app *
global-exclude *.pyc
include requirements.txt
prune test
Testing build:
> python setup.py bdist_wheel
It is working during the build: I can see the directory C:\temp\dump is created with file_I_want_to_cppy.tar.gz inside it. But when I release the package to PyPI and try to install it from pip, the folder remains empty!
Any idea what I might be doing wrong here?
After a lot of research I have figured out how to resolve this issue. Let me summarize my findings; they might be helpful for others who want to do post-pip-install processing.
setup.py
Different options to install the package: 1) pip install pkg_name, 2) python setup.py sdist.
If you want the post-processing to work either way, you need to override all three commands, install, egg_info and develop, as shown in the setup.py below.
If you create a *.whl file with python setup.py bdist_wheel, the post-pip-install processing won't be executed! Please upload the .tar.gz generated by sdist to PyPI/Artifacts to make post-pip-install processing work. Again, please note: it will not work when installing from a binary wheel.
Upload the pip package: twine upload dist/*.tar.gz
import os

from setuptools import setup, find_packages
from setuptools.command.install import install
from setuptools.command.egg_info import egg_info
from setuptools.command.develop import develop

rootDir = os.path.abspath(os.path.dirname(__file__))

def run_post_processing():
    print("--------Start running custom command -------")
    # One can run any post-processing here; it will be executed post pip install

class PostInstallCommand(install):
    def run(self):
        print("***********Custom run from install********")
        install.run(self)
        run_post_processing()

class PostEggCommand(egg_info):
    def run(self):
        print("***********Custom run from Egg********")
        egg_info.run(self)
        run_post_processing()

class PostDevelopCommand(develop):
    def run(self):
        print("***********Custom run from Develop********")
        develop.run(self)
        run_post_processing()

ver = "0.0.0"

setup(
    name='my_pkg',
    version=ver,
    packages=find_packages(),
    python_requires='>=3.6.0',
    install_requires=getRequirements(),  # getRequirements() is presumably defined elsewhere (not shown)
    include_package_data=True,
    cmdclass={
        'install': PostInstallCommand,
        'egg_info': PostEggCommand,
        'develop': PostDevelopCommand
    }
)
A few more things from my research:
If you want to do pre-processing instead of post-processing, you need to move install.run(self) to the end of run() (see the sketch after this list).
While pip installing, if you want to see the custom pre/post-installation messages, use -vvv. Example: pip install -vvv my_pkg
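A minimal sketch of the pre-processing variant mentioned above (PreInstallCommand is a hypothetical name; run_post_processing is the helper from the setup.py above):
from setuptools.command.install import install

class PreInstallCommand(install):
    def run(self):
        run_post_processing()  # custom step runs first (pre-processing)
        install.run(self)      # the standard install runs last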
We have a private Git monorepo which hosts a number of Python packages. Poetry was the dependency management tool initially chosen for the project. However, due to this Poetry issue, a solution that involves creating new setup.py files would not be accepted.
A simplified version of the structure:
git-monorepo
├── pkg-1
│ ├── pkg
│ │ └── mod1.py
│ └── pyproject.toml
├── pkg-2
│ ├── pkg
│ │ └── mod2.py
│ └── pyproject.toml
└── lib
    ├── pkg
    │   └── lib.py
    └── pyproject.toml
The library distribution package lib is independent of any other package. However, pkg-1 depends on lib, and pkg-2 depends on both pkg-1 and lib.
So, the question is:
What would be the proper way to use pip to install a package from this monorepo?
Let us consider as an example that we try to install pkg-1, where pkg-1/pyproject.toml includes the following lines:
...
[tool.poetry.dependencies]
lib = {path = "../lib/"}
...
This is the result of running pip as explained in the VCS support documentation:
$ pip install -e git+https://gitlab.com/my-account/git-monorepo#"egg=pkg-1&subdirectory=pkg-1"
Traceback (most recent call last):
  File "/home/hblanco/.local/lib/python3.8/site-packages/pip/_vendor/pkg_resources/__init__.py", line 3101, in __init__
    super(Requirement, self).__init__(requirement_string)
  File "/home/hblanco/.local/lib/python3.8/site-packages/pip/_vendor/packaging/requirements.py", line 115, in __init__
    raise InvalidRequirement("Invalid URL: {0}".format(req.url))
pip._vendor.packaging.requirements.InvalidRequirement: Invalid URL: ../lib
The problem in the above setup is that the dependency on lib is specified as a path dependency, and pip tries to use that relative path when installing.
I ran into the same issue with a Python monorepo where I wanted to share the packages with other projects as well.
I found two approaches that work for me:
In the CI/CD build pipeline, edit the pyproject.toml just before creating the wheel (which is published to a PyPI repo); a sketch of this step is shown below.
First create the wheel (or .tar.gz) artifact, and then modify it afterwards (by extracting it, replacing the path dependencies, and zipping it again).
The full approach is explained here.
However, it won't work with the git+https://... install shown in the question.
You'll need a (private) PyPI repo somewhere. GitLab provides one for each project, which I utilize in the demo project here.
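As an illustration of the first approach, here is a rough sketch of a CI step that rewrites the path dependency before building the wheel (the file path and the "^1.0.0" constraint are placeholders for whatever version of lib is actually published):
from pathlib import Path

# Rewrite pkg-1's path dependency on lib into a normal version constraint
# before building; run from the repository root in the CI job.
pyproject = Path("pkg-1/pyproject.toml")
text = pyproject.read_text()
text = text.replace('lib = {path = "../lib/"}', 'lib = "^1.0.0"')
pyproject.write_text(text)
After the rewrite, the wheel can be built as usual (for example with poetry build) and pushed to the private registry.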
For traditional Python projects with a setup.py, there are various ways of ensuring that the version string does not have to be repeated throughout the code base. See PyPA's guide on "Single-sourcing the package version" for a list of recommendations.
Many are trying to move away from setup.py to setup.cfg (probably under the influence of PEP 517 and PEP 518; setup.py was mostly used declaratively anyway, and when there was logic in setup.py, it was probably for the worse). This means that most of the suggestions won't work anymore, since setup.cfg cannot contain "code".
How can I single-source the package version for Python projects that use setup.cfg?
There are a couple of ways to do this (see below for the project structure used in these examples):
1.
setup.cfg
[metadata]
version = 1.2.3.dev4
src/my_top_level_package/__init__.py
import importlib.metadata
__version__ = importlib.metadata.version('MyProject')
2.
setup.cfg
[metadata]
version = file: VERSION.txt
VERSION.txt
1.2.3.dev4
src/my_top_level_package/__init__.py
import importlib.metadata
__version__ = importlib.metadata.version('MyProject')
3.
setup.cfg
[metadata]
version = attr: my_top_level_package.__version__
src/my_top_level_package/__init__.py
__version__ = '1.2.3.dev4'
And more...
There are probably other ways to do this, by playing with different combinations.
References:
https://setuptools.readthedocs.io/en/latest/userguide/declarative_config.html
https://docs.python.org/3/library/importlib.metadata.html
The structure assumed in the previous examples is as follows...
MyProject
├── setup.cfg
├── setup.py
└── src
    └── my_top_level_package
        └── __init__.py
setup.py
#!/usr/bin/env python3
import setuptools

if __name__ == '__main__':
    setuptools.setup(
        # see 'setup.cfg'
    )
setup.cfg
[metadata]
name = MyProject
# See above for the value of 'version = ...'
[options]
package_dir =
    = src
packages = find:
[options.packages.find]
where = src
$ cd path/to/MyProject
$ python3 setup.py --version
1.2.3.dev4
$ python3 -m pip install .
# ...
$ python3 -c 'import my_top_level_package; print(my_top_level_package.__version__)'
1.2.3.dev4
$ python3 -V
Python 3.6.9
$ python3 -m pip list
Package Version
------------- ----------
MyProject 1.2.3.dev4
pip 20.0.2
pkg-resources 0.0.0
setuptools 45.2.0
wheel 0.34.2
zipp 3.0.0