Is it possible to avoid using pip and requirements.txt in favor of just using setup.py to install missing libraries, without having to build everything else?
Normally it looks like this (and is run with python setup.py install):
from setuptools import setup, find_packages

setup(
    name="HelloWorld",
    version="0.1",
    packages=find_packages(),
    install_requires=['docutils>=0.3'],
)
And I wish to use only install_requires=['docutils>=0.3'] to have those dependencies resolved, and to avoid all build artifacts.
Depending on the setup tool you are using, there is install_requires with setuptools.
See also:
https://packaging.python.org/requirements/
Adding 'install_requires' to setup.py when making a python package
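If the goal is just to have pip resolve the dependencies declared in setup.py without keeping a built distribution around, one commonly suggested approach is an editable install (a sketch, assuming pip and setuptools are available):

pip install -e .  # resolves and installs install_requires; links the source tree in place instead of building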
I'm building my first project on GitHub, and my Python source code uses an open-source, third-party library that I have installed on my computer. However, I heard it is best practice to create a dep (dependencies) folder to store any additional libraries I would need. How do I actually install the libraries in the dep folder and use them from there, instead of from my main computer?
You have to create a requirements.txt file with each package on a separate line. e.g.
pandas==0.24.2
You also might want to add a setup.py to your Python package. In the setup call you use the install_requires argument. When your package is installed with pip, the packages listed in install_requires are installed automatically as dependencies; requirements.txt is still useful for pinning an exact, reproducible environment.
You can check it here: https://packaging.python.org/discussions/install-requires-vs-requirements/
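To make the distinction concrete, a typical pip workflow might look like this (a sketch; the file names are the conventional ones):

pip install -r requirements.txt   # install the pinned development environment
pip install .                     # install the package itself; pip resolves install_requires
pip freeze > requirements.txt     # regenerate the pinned list from the current environment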
The following is an example setup.py file:
from setuptools import setup, find_packages

setup(
    name='foobar',
    version='0.0',
    packages=find_packages(),
    url='',
    license='',
    author='foo bar',
    author_email='foobar@gmail.com',
    description='A package for ...',
    install_requires=['A', 'B'],
)
I have never heard of installing additional libraries into a dependencies folder.
Create a setup.py file in your root folder if you don't already have one; in it you can define which packages (libraries, as you call them) your project needs. For example, here is a simple setup file:
from setuptools import setup, find_packages

setup(
    name="yourpackage",
    version="1.2.0",
    description="Simple description",
    packages=find_packages(),
    install_requires=['matplotlib'],  # example of an external package
)
When you install a package that has this setup file, every requirement is automatically installed into your venv as well. If you're using PyCharm, it also warns you when a requirement is not installed.
I'm building a new PyPI package based on an existing open-source project, using setuptools, and adding some code modifications (the two are not the same).
Example:
opensource-custom==2.13.1
Since this project requires dependencies that will look for opensource, what options can I pass to my setup.py when building my wheel files so that when I do pip freeze/pip list I can see both?
opensource-custom==2.13.1
opensource==2.13.0
An example of this scenario is intel-numpy: if you do a pip install of it, it also installs its own copy of numpy:
>pip install intel-numpy
>pip freeze
icc-rt==2019.0
intel-numpy==1.15.1
intel-openmp==2019.0
mkl==2019.0
mkl-fft==1.0.6
mkl-random==1.0.1.1
numpy==1.15.1
tbb==2019.0
tbb4py==2019.0
It sounds like you want to make opensource a dependency of opensource-custom. To do this, you can specify the install_requires parameter in setup.py:
from setuptools import setup

setup(
    name='opensource-custom',
    install_requires=[
        'opensource',
    ],
    # ...
)
See https://packaging.python.org/guides/distributing-packages-using-setuptools/#install-requires
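If you also need pip freeze to report a specific version of the upstream project, you can pin it in install_requires. A sketch, assuming the fork tracks upstream 2.13.0:

from setuptools import setup

setup(
    name='opensource-custom',
    version='2.13.1',
    install_requires=[
        'opensource==2.13.0',  # pin the upstream release the fork is based on
    ],
)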
In my project, I have a single setup.py file that builds multiple modules using the following namespace pattern:
from setuptools import setup

setup(name="testmoduleserver",
      packages=["testmodule.server", "testmodule.shared"],
      namespace_packages=["testmodule"])

setup(name="testmoduleclient",
      packages=["testmodule.client", "testmodule.shared"],
      namespace_packages=["testmodule"])
I am trying to build wheel files for both packages. However, when I do:
python -m pip wheel .
It only ever builds the package for one of the definitions.
Why does only one package get built?
You cannot call setuptools.setup() more than once in your setup.py, even if you want to create several packages out of one codebase.
Instead you need to separate everything out into separate namespace packages, and have one setup.py for each (they all can reside in one Git repository!):
testmodule/
    testmodule-client/
        setup.py
        testmodule/
            client/
                __init__.py
    testmodule-server/
        setup.py
        testmodule/
            server/
                __init__.py
    testmodule-shared/
        setup.py
        testmodule/
            shared/
                __init__.py
And each setup.py contains something along the lines of:
from setuptools import setup

setup(
    name='testmodule-client',
    packages=['testmodule.client'],
    install_requires=['testmodule-shared'],
    # ...
)
and
from setuptools import setup

setup(
    name='testmodule-server',
    packages=['testmodule.server'],
    install_requires=['testmodule-shared'],
    # ...
)
and
from setuptools import setup

setup(
    name='testmodule-shared',
    packages=['testmodule.shared'],
    # ...
)
To build all three wheels you then run
pip wheel testmodule-client
pip wheel testmodule-server
pip wheel testmodule-shared
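By default the wheels land in the directory you run pip from; pip's --wheel-dir option collects them in one place (a sketch):

pip wheel --wheel-dir dist/ testmodule-client testmodule-server testmodule-shared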
This is the tree structure of the module I'm writing the setup.py file for:
ls .
LICENSE
README.md
bin
examples
module
scratch
setup.py
tests
tox.ini
I configured my setup.py as follows:
from setuptools import setup, find_packages

setup(
    name="package_name",
    version="0.1",
    packages=find_packages(),
    install_requires=[
        # [...]
    ],
    extras_require={
        # [...]
    },
    tests_require=[
        'pytest',
        'doctest'
    ],
    scripts=['bin/bootstrap'],
    data_files=[
        ('license', ['LICENSE']),
    ],
    # [...]
    # could also include long_description, download_url, classifiers, etc.
)
If I install the package from my Python environment (also a virtualenv) with
pip install .
the LICENSE file gets installed correctly.
But running tox:
[tox]
envlist = py27, py35

[testenv]
deps =
    pytest
    git+https://github.com/djc/couchdb-python
    docopt
commands = py.test \
    {posargs}
I get this error:
running install_data
creating build/bdist.macosx-10.11-x86_64/wheel/leafline-0.1.data
creating build/bdist.macosx-10.11-x86_64/wheel/leafline-0.1.data/data
creating build/bdist.macosx-10.11-x86_64/wheel/leafline-0.1.data/data/license
error: can't copy 'LICENSE': doesn't exist or not a regular file
Removing the data_files part from setup.py makes tox run correctly.
Your issue here is that setuptools is not able to find the LICENSE file among the files that have been included for building the source distribution. You have two options to tell setuptools to include that file (both have been pointed out here):
Add a MANIFEST.in file (like https://github.com/pypa/sampleproject/)
Use include_package_data=True in your setup.py file.
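For this particular error, a minimal MANIFEST.in needs only one line:

include LICENSE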
Using MANIFEST.in is often simpler, and it is easier to verify thanks to https://pypi.org/project/check-manifest/, which makes it possible to use automation to check that things are indeed correct (if you use a VCS like Git or SVN).
pip install . builds a wheel using python setup.py bdist_wheel, which is then installed by simply unpacking it appropriately, as defined in the Wheel Specification: https://www.python.org/dev/peps/pep-0427/
tox builds a source distribution using python setup.py sdist, which is then unpacked and installed using python setup.py install.
That might be a reason for the difference in behavior for you.
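A quick way to check this locally is to build the sdist yourself and inspect its contents (a sketch; the archive name depends on your package name and version):

python setup.py sdist
tar tzf dist/*.tar.gz | grep LICENSE  # no output means LICENSE is missing from the sdist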
I have some resource files inside my packages which I use during execution. To make setup store them in the package together with the Python code, I use include_package_data=True and access them with importlib.resources. For Python versions older than 3.7 you can use the importlib_resources backport, or another library.
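A minimal sketch of that access pattern; 'mypackage' and 'data.txt' are placeholder names:

from importlib import resources  # Python 3.7+; older versions can use the importlib_resources backport

text = resources.read_text('mypackage', 'data.txt')  # reads a resource bundled inside the package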
Before each release I have a script which verifies that all the files I need are present inside the bdist wheel, so I can be sure that everything is in place.
I have a Python project with the following structure (irrelevant source files omitted for simplicity):
myproject/
    mysubmodule/
        setup.py
    setup.py
The file myproject/setup.py uses distutils.core.setup to install the module myproject and the relevant sources. However, myproject requires mysubmodule to be installed (this is a git submodule). So what I am doing right now is:
myproject/$ cd mysubmodule
myproject/mysubmodule/$ python setup.py install
myproject/mysubmodule/$ cd ..
myproject/$ python setup.py install
This is too tedious for customers, especially if the project will be extended by further submodules in the future.
Is there a way to automate the installation of mysubmodule when calling myproject/setup.py?
setuptools.find_packages() is able to discover submodules.
Your setup.py should look like:

from setuptools import setup, find_packages

setup(
    packages=find_packages(),
    # ...
)
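As an illustration: if mysubmodule contains an __init__.py, running find_packages() from the directory holding the top-level setup.py will pick it up (a hedged sketch):

from setuptools import find_packages

# assuming mysubmodule/__init__.py exists, this prints something like ['mysubmodule']
print(find_packages())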
Create a package for mysubmodule with its own setup.py and let the top-level package depend on that package in its setup.py. This means you only need to make the packages / dependencies available and run python setup.py install on the top-level package.
The question then becomes how to ship the dependencies / packages to your customers, but this can be solved by putting them in a directory and configuring setup.py to include that directory when searching for dependencies.
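One way to do that with pip is a local package directory plus --find-links (a sketch; packages/ is a hypothetical directory holding the sdists or wheels of your submodules):

pip install --find-links ./packages .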
The alternative is to "vendor" mysubmodule, which simply means including it all in one package (no further questions asked) and having a single python setup.py install to install the main package. For example, pip vendors (includes) requests so it can use it without having to depend on the requests package.