How to use a local library as a requirement for pip - Python

I have the following directory structure:
/pythonlibraries
    /libraryA
        setup.py
        libraryA/
            __init__.py
            alib.py
    /libraryB
        setup.py
        libraryB/
            __init__.py
            blib.py

blib.py:

import libraryA
setup.py for libraryB:

from setuptools import setup

setup(name='libraryB',
      version='0.0',
      description='',
      packages=['libraryB'],
      install_requires=["ujson", "/pythonlibraries/libraryA"])
This doesn't work :/
How can I install local dependencies with pip?
Ideally I'd like to do pip install -e /pythonlibraries/libraryB and have it automatically install libraryA from my local disk.
Right now I have to install each local library manually, one at a time...

Did you try writing the full path, like this?

install_requires=["ujson", "/home/user/pythonlibraries/libraryA"])

Because a path starting with "/" is an absolute path.

Related

setup.py file using requirements.txt

I've read a discussion suggesting that setup.py read from the requirements.txt file, so that the correct installation is available across multiple deployments without having to maintain both a requirements.txt and the list in setup.py.
However, when I'm trying to do an installation via pip install -e ., I get an error:
Obtaining file:///Users/myuser/Documents/myproject
Processing /home/ktietz/src/ci/alabaster_1611921544520/work
ERROR: Could not install packages due to an OSError: [Errno 2] No such file or directory:
'/System/Volumes/Data/home/ktietz/src/ci/alabaster_1611921544520/work'
It looks like pip is searching on my local machine for packages that are available on PyPI (alabaster). Why? What am I missing here? Why isn't pip looking for the required packages on the PyPI server?
I have done it before the other way around, maintaining the setup file and not the requirements file. For the requirements file, just save it as:
*
and for setup, do
from distutils.core import setup
from setuptools import find_packages

try:
    from Module.version import __version__
except ModuleNotFoundError:
    exec(open("Module/version.py").read())

setup(
    name="Package Name",
    version=__version__,
    packages=find_packages(),
    package_data={p: ["*"] for p in find_packages()},
    url="",
    license="",
    install_requires=[
        "numpy",
        "pandas"
    ],
    python_requires=">=3.8.0",
    author="First.Last",
    author_email="author@company.com",
    description="Description",
)
For reference, my version.py script looks like:
__build_number__ = "_LOCAL_"
__version__ = f"1.0.{__build_number__}"
Jenkins replaces the build number with a tag at build time.
This question is really two separate questions: the rather philosophic choice of how to arrange setup requirements is unrelated to the installation error you are experiencing.
First about the error: It looks like the project you are trying to install depends on another library (alabaster) of which you apparently also did an editable install using pip3 install -e . that points to this directory:
/home/ktietz/src/ci/alabaster_1611921544520/work
What the error tells you is that the directory where that editable install is supposed to live no longer exists. You should only install your own project in editable mode; its dependencies should be installed into a regular site-packages directory, i.e. without the -e option.
To clean up, I would suggest that you do the following:
# clean up references to the broken editable install
pip3 uninstall alabaster
# now do a proper non-editable install
pip3 install alabaster
Concerning the question of how to arrange setup requirements, you should primarily use the install_requires and extras_require options of setuptools:
# either in setup.py
setuptools.setup(
    install_requires=[
        'dep1>=1.2',
        'dep2>=2.4.1',
    ]
)

# or in setup.cfg
[options]
install_requires =
    dep1>=1.2
    dep2>=2.4.1

[options.extras_require]
extra_deps_a =
    dep3
    dep4>=4.2.3
extra_deps_b =
    dep5>=5.2.1
Optional requirements can be organised in groups. To include such an extra group with the install, you can do pip3 install .[extra_deps_name].
If you wish to define specific dependency environments with exact versions (e.g. for Continuous Integration), you may use requirements.txt files in addition, but the general dependency and version constraint definitions should be done in setup.cfg or setup.py.
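For example, a sketch of such a CI requirements.txt (the exact pins and the extras group name are illustrative, not taken from the project above):

# requirements.txt: exact pins for CI; the broad constraints live in setup.cfg
dep1==1.2.5
dep2==2.4.1
-e .[extra_deps_a]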

setuptools very simple (one source file module) configuration

I want to use setuptools to create a package consisting of two files: foo.py (script) and foo.conf.
Then I want to publish the package on my devpi-server and then install the package using pip.
Suppose that initially my current working directory is clean:
$ ls -l
total 0
Then I issue the pip install (or download?) command:
$ pip install -i http://mydevpi.server foo
and get a directory with my two files created:

$ tree
.
└── foo
    ├── foo.py
    └── foo.conf
So my questions are:
what setuptools configuration should I use?
what exact pip command should I use to install the package the way I want? Will pip install -i http://mydevpi.server --target=. do the trick?
First, write a setup.py in the foo directory, something like this:
import setuptools

setuptools.setup(
    name='foo_pip',
    version='1',
    packages=[''],
    url='1',
    license='1',
    author='1',
    author_email='1',
    description='1'
)
(You can use distutils or setuptools)
Then run python setup.py bdist_wheel -d TARGET; a .whl file will appear in the target directory. Copy its path.
You can now install it using pip install the_wheel_file_path --prefix="the_path_to_install". The output looks something like this:
Processing .../TARGET/foo_pip-1-py2-none-any.whl
Installing collected packages: foo-pip
Successfully installed foo-pip-1
Then use it with import foo.
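Note that the configuration above does not ship foo.conf. A minimal sketch that packages both files, assuming setuptools (py_modules for the single script, data_files for the config; the target directory 'foo' is illustrative):

import setuptools

setuptools.setup(
    name='foo_pip',
    version='1',
    # a single top-level module instead of a package
    py_modules=['foo'],
    # install foo.conf into <prefix>/foo/ (path is relative to the install prefix)
    data_files=[('foo', ['foo.conf'])],
)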

How to install data_files to absolute path?

I use pip with setuptools to install a package.
I want pip to copy some resource files to, say, /etc/my_package.
My setup.py looks like this:
setup(
    ...
    data_files=[('/etc/my_package', ['config.yml'])]
)
When running pip install, the file ends up in
~/.local/lib/python3.5/site-packages/etc/my_package/config.yml
instead of /etc/my_package.
What am I doing wrong?
(pip version 9.0.1)
Short answer: use pip install --no-binary :all: to install your package.
I struggled with this for a while and eventually figured out that there is some weirdness/inconsistency in how data_files are handled between binary wheels and source distributions. Specifically, there is a bug with wheels that makes all paths in data_files relative to the install location (see https://github.com/pypa/wheel/issues/92 for an issue tracking this).
"Thats fine", you might say, "but I'm not using a wheel!". Not so fast! It turns out recent versions of pip (I am working with 9.0.1) will try to compile a wheel even from a source distribution. For example, if you have a package my_package you can see this doing something like
$ python setup.py sdist # create source tarball as dist/my_package.tar.gz
[...]
$ pip install dist/my_package.tar.gz # install the generated source
[...]
Building wheels for collected packages: my_package
Running setup.py bdist_wheel for my_package ... done
pip tries to be helpful and build a wheel to install from and cache for later. This means you will run into the above bug even though in theory you are not using bdist_wheel yourself. You can get around this by running python setup.py install directly from the package source folder. This avoids the building and caching of built wheels that pip will try to do but is majorly inconvenient when the package you want is already on PyPI somewhere. Fortunately pip offers an option to explicitly disable binaries.
$ pip install --no-binary :all: my_package
[...]
Skipping bdist_wheel for my_package, due to binaries being disabled for it.
Installing collected packages: my_package
Running setup.py install for my_package ... done
Successfully installed my_package-0.1.0
Using the --no-binary option prevents wheel building and lets us reference absolute paths in our data_files paths again. For the case where you are installing a lot of packages together and want to selectively disable wheels, you can replace :all: with a comma-separated list of packages.
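For example, to disable wheel building only for two specific packages (names illustrative):

$ pip install --no-binary my_package,other_package my_package other_package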
It seems that data_files does not support absolute paths: sys.prefix is prepended to "/etc/my_package". If you want to put config.yml into .../site-packages/my_package instead, try:

import os
import sys
from distutils.sysconfig import get_python_lib

# site-packages relative to sys.prefix, so the file lands in
# .../site-packages/my_package rather than under an absolute path
relative_site_packages = get_python_lib().split(sys.prefix + os.sep)[1]
data_files_relative_path = os.path.join(relative_site_packages, "my_package")

setup(
    ...
    data_files=[(data_files_relative_path, ['config.yml'])]
)
I ended up writing an init() function that installs the config file on first run instead of creating it during the installation:
import os
from os import path
from shutil import copyfile

import pkg_resources

# config_dir and config_file are module-level constants defined elsewhere

def init():
    try:
        if not path.isdir(config_dir):
            os.mkdir(config_dir)
        copyfile(pkg_resources.resource_filename(
            __name__, "default_config.yml"), config_file)
        print("INFO: config file created.")
    except IOError as ex:
        print("ERROR: could not create config directory: " + str(ex))

if __name__ == "__main__":
    init()
    main()
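For pkg_resources.resource_filename to find default_config.yml at runtime, the file has to be shipped inside the package. A sketch of the corresponding setup.py (name and version assumed):

from setuptools import setup

setup(
    name='my_package',
    version='0.1.0',
    packages=['my_package'],
    # ship the default config inside the package so that
    # pkg_resources.resource_filename can locate it at runtime
    package_data={'my_package': ['default_config.yml']},
)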

How to use packages on the local filesystem?

I have two libraries, lib1 and lib2 and a program that uses them, program1.
The libraries have setup.py files that look like this:
from distutils.core import setup

setup(name='lib1',
      version='0.1.0',
      maintainer='foven',
      maintainer_email='foven@example.com',
      url='example.com/lib1',
      packages=[
      ]
)
The setup.py for lib2 obviously replaces lib1 instances with lib2, but is otherwise the same.
Now program1 has a requirements.txt file that looks like this:
-e ../lib1
-e ../lib2
I want to use the two libraries from their locations on the filesystem, since I'm not ready to put these into the repository yet. When running pip install -r requirements.txt for program1, this seems to work.
However, if I change the lib1/setup.py file to look like this:
from distutils.core import setup

setup(name='lib1',
      version='0.1.0',
      maintainer='foven',
      maintainer_email='foven@example.com',
      url='example.com/lib1',
      packages=[
          'axel'
      ]
)
and change program1/requirements.txt to this:
axel == 0.0.4
-e ../lib1
-e ../lib2
running pip install -r requirements.txt from program1 results in an error:
error: package directory 'axel' does not exist
Yet, pip list and pip freeze both indicate that the package is installed.
To me, it seems as though pip is not looking for axel in the normal location for installed packages or on PyPI, but I don't have much experience with this, so I could be totally wrong. If I create an empty directory lib1/axel and run pip install -r requirements.txt for program1, it seems to work:
Obtaining file:///C:/Users/foven/code/lib1 (from -r requirements.txt (line 2))
Obtaining file:///C:/Users/foven/code/lib2 (from -r requirements.txt (line 3))
Requirement already satisfied (use --upgrade to upgrade): axel==0.0.4 in c:\program files\python35\lib\site-packages (from -r requirements.txt (line 1))
Installing collected packages: lib1, lib2
Running setup.py develop for lib1
Running setup.py develop for lib2
Successfully installed lib1-0.1.0 lib2-0.1.0
Just to be clear, I'll restate my goal: I want to be able to use the two libraries that only exist on the local filesystem with the program I am working on. What am I doing wrong, and how should I set up these libraries and the program to work the way I want?
packages is for listing the packages within the package you're creating. install_requires is for listing the packages your package depends on. You put a dependency, 'axel', in packages. There's no internal package called 'axel', so of course the directory with that name can't be found.
setup(
    ...,
    install_requires=['axel'],
    ...
)
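Putting it together, a sketch of a corrected lib1/setup.py (assuming lib1's own code lives in a lib1 package directory, and switching to setuptools, since distutils does not handle install_requires):

from setuptools import setup

setup(name='lib1',
      version='0.1.0',
      maintainer='foven',
      maintainer_email='foven@example.com',
      url='example.com/lib1',
      # lib1's own package(s) go here
      packages=['lib1'],
      # external dependencies go here, not in packages
      install_requires=['axel==0.0.4'])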

How to install a dependency from a submodule in Python?

I have a Python project with the following structure (irrelevant source files omitted for simplicity):
myproject/
    mysubmodule/
        setup.py
    setup.py
The file myproject/setup.py uses distutils.core.setup to install the module myproject and the relevant sources. However, myproject requires mysubmodule to be installed (this is a git submodule). So what I am doing right now is:
myproject/$ cd mysubmodule
myproject/mysubmodule/$ python setup.py install
myproject/mysubmodule/$ cd ..
myproject/$ python setup.py install
This is too tedious for customers, especially if the project will be extended by further submodules in the future.
Is there a way to automate the installation of mysubmodule when calling myproject/setup.py?
setuptools.find_packages() is able to discover submodules
Your setup.py should look like:

from setuptools import setup, find_packages

setup(
    packages=find_packages(),
    # ...
)
Create a package for mysubmodule with its own setup.py and let the top-level package depend on that package in its setup.py. This means you only need to make the packages / dependencies available and run python setup.py install on the top-level package.
The question then becomes how to ship the dependencies / packages to your customers, but this can be solved by putting them in a directory and configuring setup.py to include that directory when searching for dependencies.
The alternative is to "vendor" mysubmodule which simply means including it all in one package (no further questions asked) and having one python setup.py install to install the main package. For example, pip vendors (includes) requests so it can use it without having to depend on that requests package.
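For illustration, a vendored layout (the names and the _vendor directory are assumed, not prescribed) could look like this, after which a single python setup.py install of myproject installs everything:

myproject/
    setup.py              # packages=find_packages() picks up the vendored copy
    myproject/
        __init__.py
        _vendor/
            __init__.py
            mysubmodule/  # copied in wholesale, installed as part of myproject
                __init__.py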
