setuptools very simple (one source file module) configuration - python

I want to use setuptools to create a package consisting of two files: foo.py (script) and foo.conf.
Then I want to publish the package on my devpi-server and then install the package using pip.
Suppose that initially my current working directory is clean:
$ ls -l
total 0
Then I issue a pip install (or download?) command:
$ pip install -i http://mydevpi.server foo
And get a directory with my two files created:
$ tree
.
└── foo
    ├── foo.py
    └── foo.conf
So my questions are:
what setuptools configuration should I use?
what exact pip command should I use to install the package the way I want? Will pip install -i http://mydevpi.server --target=. do the trick?

First, write something like the following as setup.py in the foo directory:
import setuptools

setuptools.setup(
    name='foo_pip',
    version='1',
    packages=[''],
    url='1',
    license='1',
    author='1',
    author_email='1',
    description='1'
)
(You can use distutils or setuptools)
Then run python setup.py bdist_wheel -d TARGET and a .whl file will appear in the TARGET directory; copy its path.
You can now install it with pip install the_wheel_file_path --prefix="the_path_to_install"
Something like this
Processing .../TARGET/foo_pip-1-py2-none-any.whl
Installing collected packages: foo-pip
Successfully installed foo-pip-1
Then use it with import foo
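Note that packages=[''] does not pick up foo.conf at all. For the questioner's layout (a single foo.py plus a foo.conf), a sketch using py_modules and data_files may be closer to what is wanted; the data_files destination directory here is an assumption, not part of the original answer:

```python
# Sketch of a setup.py for a one-module package that also ships a config file.
import setuptools

setuptools.setup(
    name='foo_pip',
    version='1',
    py_modules=['foo'],                  # packages the single foo.py module
    data_files=[('foo', ['foo.conf'])],  # install foo.conf under <prefix>/foo
)
```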

Related

How do you build a wheel outside of a repo containing the package?

Question
Is there a way to build a wheel for a package from a different directory, such that the wheel is built exactly as it would be if built inside the repository containing the package?
Example
Consider the following repo:
/repo-containing-your-package
|___ your_module/
|___ setup.py
Build method A
When I run python setup.py bdist_wheel from within repo-containing-your-package it builds the wheel as expected, including your_module. This means after I install pip install ./dist/your_module-#.#.#-py3-none-any.whl (which is successful), I can run python -m your_module.foo from the command line.
When the package is building, I get output that verifies that my module has been picked up by the wheel:
creating 'dist/your_module-#.#.#-py3-none-any.whl' and adding 'build/bar' to it
adding 'your_module/__init__.py'
etc...
Build method B
However, if I run python ../repo-containing-your-package/setup.py bdist_wheel from a repository that is a sibling to repo-containing-your-package, it does not build the wheel as expected, as it fails to include your_module. This means after I install pip install ./dist/your_module-#.#.#-py3-none-any.whl (which is successful), attempting python -m your_module.foo fails:
Error while finding module specification for 'your_module.foo' (ModuleNotFoundError: No module named 'your_module')
The fact that the module has not been properly installed with the package is confirmed by reviewing the build output, which does not include the adding 'your_module' output that method A includes.
Two solutions I know of:
change working directory in setup.py
If you can modify the setup script, you can change the working directory programmatically. Add an os.chdir call early enough in the setup script, resolving __file__ to an absolute path first since it may be relative:
import os
from setuptools import setup

os.chdir(os.path.dirname(os.path.abspath(__file__)))

setup(...)
You can also change the working directory with other means without having to modify the setup script, e.g. in bash:
$ pushd path/to/repo; python setup.py bdist_wheel; popd
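One caveat with os.chdir(os.path.dirname(__file__)): when setup.py is invoked by its bare filename, dirname() returns an empty string and os.chdir('') raises FileNotFoundError, so resolving with os.path.abspath first is safer. A quick illustration:

```python
import os

# dirname() of a bare filename is the empty string; os.chdir('') would fail.
bare = os.path.dirname("setup.py")
print(repr(bare))  # -> ''

# Resolving to an absolute path first always yields a usable directory.
resolved = os.path.dirname(os.path.abspath("setup.py"))
print(os.path.isabs(resolved))  # -> True
```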
Use pip wheel
pip has a wheel subcommand that builds a wheel from the given argument; this argument is usually the name of a package, but it can also be a directory containing the setup script. Pass -e in that case so the wheel gets the correct name:
$ pip wheel -e path/to/repo

How to use local library as requirement for pip

I have the following directory structure:
/pythonlibraries
    /libraryA
        setup.py
        libraryA/
            __init__.py
            alib.py
    /libraryB
        setup.py
        libraryB/
            __init__.py
            blib.py
blib.py:
import libraryA
setup.py for libraryB:
from setuptools import setup

setup(name='libraryB',
      version='0.0',
      description='',
      packages=['libraryB'],
      install_requires=["ujson", "/pythonlibraries/libraryA"])
This doesn't work :/
How can I install local dependencies with pip?
Ideally I'd like to do pip install -e /pythonlibraries/libraryB and have it automatically install libraryA from my local disk.
Right now I have to install each local library individually manually...
Did you try writing the full path, like this?
install_requires=["ujson", "/home/user/pythonlibraries/libraryA"]
Because a path starting with "/" is an absolute path.
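An alternative not mentioned in the answer: recent pip and setuptools accept PEP 508 direct references, which let install_requires point at a local directory via a file:// URL. The absolute path below assumes the question's layout:

```python
from setuptools import setup

setup(name='libraryB',
      version='0.0',
      description='',
      packages=['libraryB'],
      install_requires=[
          'ujson',
          # PEP 508 direct reference to the local checkout (absolute path)
          'libraryA @ file:///pythonlibraries/libraryA',
      ])
```

Note that packages declared this way cannot be uploaded to PyPI, but they install fine locally with pip install -e.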

Python Packaging multiple subpackages with different data directories

I have a structure of the directory as such with foobar and alphabet data directories together with the code something.py:
mylibrary/
    packages/
        foobar/
            foo.zip
            bar.zip
        alphabet/
            abc.zip
            xyz.zip
        something.py
    setup.py
And the goal is such that users can pip install the module as such:
pip install mylibrary[alphabet]
That should include only the data from packages/alphabet/* plus the Python code. Similar behavior should be available for pip install mylibrary[foobar].
If the user installs without the specification:
pip install mylibrary
Then it'll include all the data directories under packages/.
Currently, I've tried writing the setup.py with Python 3.5 like this:
import glob
from setuptools import setup, find_packages

setup(
    name='mylibrary',
    packages=['packages'],
    package_data={'packages': glob.glob('packages/**/*.zip', recursive=True)},
)
That will create a distribution with all the data directories when users do pip install mylibrary.
How should I change the setup.py such that specific pip installs like pip install mylibrary[alphabet] is possible?
First you have to package and publish alphabet and foobar as separate packages, because pip install mylibrary[alphabet] means:
pip install mylibrary
pip install alphabet
After that, add alphabet and foobar as extras:
setup(
    …,
    extras_require={
        'alphabet': ['alphabet'],
        'foobar': ['foobar'],
    }
)
The keys in the dictionary are the names used in pip install mylibrary[EXTRA_NAME], the values are a list of package names that will be installed from PyPI.
PS. And no, you cannot use extras to install some data files that are not available as packages from PyPI.

How to install data_files to absolute path?

I use pip with setuptools to install a package.
I want pip to copy some resource files to, say, /etc/my_package.
My setup.py looks like this:
setup(
    ...
    data_files=[('/etc/my_package', ['config.yml'])]
)
When running pip install, the file ends up in
~/.local/lib/python3.5/site-packages/etc/my_package/config.yml
instead of /etc/my_package.
What am I doing wrong?
(pip version 9.0.1)
Short answer: use pip install --no-binary :all: to install your package.
I struggled with this for a while and eventually figured out that there is some weirdness/inconsistency in how data_files are handled between binary wheels and source distributions. Specifically, there is a bug with wheels that makes all paths in data_files relative to the install location (see https://github.com/pypa/wheel/issues/92 for an issue tracking this).
"That's fine", you might say, "but I'm not using a wheel!" Not so fast! It turns out recent versions of pip (I am working with 9.0.1) will try to compile a wheel even from a source distribution. For example, if you have a package my_package you can see this by doing something like
$ python setup.py sdist # create source tarball as dist/my_package.tar.gz
[...]
$ pip install dist/my_package.tar.gz # install the generated source
[...]
Building wheels for collected packages: my_package
Running setup.py bdist_wheel for my_package ... done
pip tries to be helpful and build a wheel to install from and cache for later. This means you will run into the above bug even though in theory you are not using bdist_wheel yourself. You can get around this by running python setup.py install directly from the package source folder. This avoids the building and caching of built wheels that pip will try to do but is majorly inconvenient when the package you want is already on PyPI somewhere. Fortunately pip offers an option to explicitly disable binaries.
$ pip install --no-binary :all: my_package
[...]
Skipping bdist_wheel for my_package, due to binaries being disabled for it.
Installing collected packages: my_package
Running setup.py install for my_package ... done
Successfully installed my_package-0.1.0
Using the --no-binary option prevents wheel building and lets us reference absolute paths in our data_files paths again. For the case where you are installing a lot of packages together and want to selectively disable wheels you can replace :all: with a comma separated list of packages.
It seems that data_files can't support absolute paths; sys.prefix will be prepended to "/etc/my_package". If you want to put config.yml into .../site-packages/my_package instead, try:
import os
import sys
from distutils.sysconfig import get_python_lib

relative_site_packages = get_python_lib().split(sys.prefix + os.sep)[1]
data_files_relative_path = os.path.join(relative_site_packages, "my_package")

setup(
    ...
    data_files=[(data_files_relative_path, ['config.yml'])]
)
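A variant of the same computation, in case it is useful: distutils was removed in Python 3.12, and the string split breaks if site-packages is not directly under sys.prefix, so sysconfig plus os.path.relpath is a more robust sketch:

```python
import os
import sys
import sysconfig

# site-packages directory of the running interpreter
site_packages = sysconfig.get_path("purelib")

# Express it relative to sys.prefix so setup() re-anchors the data_files
# destination under the installation prefix.
relative_site_packages = os.path.relpath(site_packages, sys.prefix)
data_files_dir = os.path.join(relative_site_packages, "my_package")
print(data_files_dir)
```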
I ended up writing an init() function that installs the config file on first run instead of creating it during the installation:
import os
from os import path
from shutil import copyfile
import pkg_resources

def init():
    # config_dir, config_file and main() are defined elsewhere in the module
    try:
        if not path.isdir(config_dir):
            os.mkdir(config_dir)
        copyfile(pkg_resources.resource_filename(
            __name__, "default_config.yml"), config_file)
        print("INFO: config file created.")
    except IOError as ex:
        print("ERROR: could not create config directory: " + str(ex))

if __name__ == "__main__":
    init()
    main()

How to use packages on the local filesystem?

I have two libraries, lib1 and lib2 and a program that uses them, program1.
The libraries have setup.py files that look like this:
from distutils.core import setup

setup(name='lib1',
      version='0.1.0',
      maintainer='foven',
      maintainer_email='foven@example.com',
      url='example.com/lib1',
      packages=[])
The setup.py for lib2 obviously replaces lib1 instances with lib2, but is otherwise the same.
Now program1 has a requirements.txt file, that looks like this:
-e ../lib1
-e ../lib2
I want to use the two libraries from their locations on the filesystem, since I'm not ready to put these into the repository yet. When running pip install -r requirements.txt for program1, this seems to work.
However, if I change the lib1/setup.py file to look like this:
from distutils.core import setup

setup(name='lib1',
      version='0.1.0',
      maintainer='foven',
      maintainer_email='foven@example.com',
      url='example.com/lib1',
      packages=[
          'axel'
      ])
and change program1/requirements.txt to this:
axel == 0.0.4
-e ../lib1
-e ../lib2
running pip install -r requirements.txt from program1 results in an error:
error: package directory 'axel' does not exist
Yet, pip list and pip freeze both indicate that the package is installed.
To me, it seems as though pip is not looking for axel in the normal location for installed packages or in pypi, but I don't have much experience with this, so I could be totally wrong. If I create an empty directory lib1/axel and run pip install -r requirements.txt for program1, it seems to work:
Obtaining file:///C:/Users/foven/code/lib1 (from -r requirements.txt (line 2))
Obtaining file:///C:/Users/foven/code/lib2 (from -r requirements.txt (line 3))
Requirement already satisfied (use --upgrade to upgrade): axel==0.0.4 in c:\program files\python35\lib\site-packages (from -r requirements.txt (line 1))
Installing collected packages: lib1, lib2
Running setup.py develop for lib1
Running setup.py develop for lib2
Successfully installed lib1-0.1.0 lib2-0.1.0
Just to be clear, I'll restate my goal: I want to be able to use the two libraries that exist only on the local filesystem with the program I am working on. What am I doing wrong, and how should I set up these libraries and the program to work the way I want?
packages is for listing the packages within the package you're creating. install_requires is for listing the packages your package depends on. You put a dependency, 'axel', in packages. There's no internal package called 'axel', so of course the directory with that name can't be found.
setup(
    ...,
    install_requires=['axel'],
    ...
)
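Putting the fix together, a corrected lib1/setup.py might look like this; switching the import to setuptools is an assumption on my part, since distutils.core.setup ignores install_requires:

```python
from setuptools import setup

setup(name='lib1',
      version='0.1.0',
      maintainer='foven',
      maintainer_email='foven@example.com',
      url='example.com/lib1',
      packages=[],                   # lib1's own importable packages, if any
      install_requires=['axel'])     # dependency; pip resolves and installs it
```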
