package_dir in setup.py not working as expected - python

I'm trying to let users write code as a python module (folder with __init__.py defined) under whatever folder name they see fit. After that I want to install that module as a python package but define the import name myself.
The folder structure would be like this:
project_name/
├── user_defined_name/
│   ├── __init__.py
│   └── ...
└── setup.py
According to this I should be able to add this to my setup.py to get it working:
setuptools.setup(
    package_dir={'my_defined_name': 'user_defined_name'},
    packages=['user_defined_name']
)
But the only way I was able to access the code was via import user_defined_name. I tried installing the package without -e, but that gave the same result, and leaving packages=['..'] out of the setup() call did not change anything either.
My question is much the same as this one, where the only answers suggest changing folder names, which is something I would like to avoid. That question mentioned the problem might be due to a setuptools bug, but that appeared to be fixed three years ago.

In short, it looks like you need something like this in your setup.py:
setuptools.setup(
    package_dir={
        'my_defined_name': 'user_defined_name',
    },
    packages=[
        'my_defined_name',
    ],
)
as Ilia Novoselov said in a comment to your question.
This should work, if you package and install the project normally. You would be able to import my_defined_name.
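For intuition about why this mapping works, here is a rough sketch (a simplification, not the actual setuptools source) of the longest-prefix lookup that distutils/setuptools performs when translating a package name into a source directory via package_dir:

```python
import os

def resolve_package_dir(package, package_dir):
    """Walk the dotted package name from the right, looking for the
    longest prefix with an explicit package_dir entry; fall back to
    the '' entry (the default root) if none matches."""
    path = package.split(".")
    tail = []
    while path:
        key = ".".join(path)
        if key in package_dir:
            return os.path.join(package_dir[key], *tail)
        tail.insert(0, path.pop())
    # no explicit entry: prepend the '' root (may be empty)
    return os.path.join(package_dir.get("", ""), *tail)

# the mapping from the answer above: the package imports as
# my_defined_name but its sources live in user_defined_name/
print(resolve_package_dir("my_defined_name",
                          {"my_defined_name": "user_defined_name"}))
# → user_defined_name
```

The same lookup explains the src-style variant: with package_dir={'': 'user_defined_name'}, the package my_defined_name resolves to user_defined_name/my_defined_name.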
Note, however, that as far as I can tell, this will not work if you use an editable installation (python setup.py develop or python -m pip install --editable .). It will be impossible to import my_defined_name, but you would be able to import user_defined_name, which is not what you want.

Oliver's answer here clarified this for me.
My TLDR is that to support both releases (python setup.py install and pip install .) and editable installs (python setup.py develop and pip install -e .) you must change your file structure to
project_name
├── setup.py
├── user_defined_name
│   └── my_defined_name
│       ├── __init__.py
│       └── ...
├── docs
└── tests
and your setup.py to
setuptools.setup(
    package_dir={'': 'user_defined_name'},
    packages=['my_defined_name']
)
You can support just releases (NOT editable installs) with
project_name
├── setup.py
├── user_defined_name
│   ├── __init__.py
│   └── ...
├── docs
└── tests

setuptools.setup(
    package_dir={'my_defined_name': 'user_defined_name'},
    packages=['my_defined_name']
)


How to make it possible to make global imports starting from a source root?

I guess my question is a duplicate, but, unfortunately, I haven't found a solution that corresponds to my problem.
I have the following project structure:
project_root/
└── source_root/
    ├── __init__.py
    └── inner_package/
        ├── some_executable_file.py
        └── some_library_file.py
So I would like to import a name from 'some_library_file' in the following way:
from source_root.inner_package.some_library_file import X
But when I do something like this, I see the following error:
ModuleNotFoundError: No module named 'source_root'
You need to install your own project to your (ideally virtual) environment. In order to do this, you need a setup.py file, which will be used by pip to find all your packages (i.e., folders containing __init__.py) during installation.
Minimal example:
project_root/setup.py:

from setuptools import setup, find_packages

setup(
    name='MyProjectName',
    version='0.0.0',
    packages=find_packages(),
)
Then from the command line, cd into your project_root and type:
python -m pip install -e .
You have now installed your own project into your Python environment. Files in the project can import its packages with the from source_root.[...] import [...] syntax.
This process mimics a user pip installing your project from PyPI / github, so you can be sure that your imports are going to work on any other environment/machines, provided pip is used for installation.
The -e flag is used to install the project in editable mode, so you won't need to reinstall the package locally after changing one of the source files (which you'll be doing a lot during development).
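As an aside, you can check what find_packages() will pick up without running an install; this throwaway sketch mirrors the question's layout (note that inner_package needs its own __init__.py to be discovered):

```python
import os
import tempfile
from setuptools import find_packages

# recreate the question's tree in a scratch directory
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "source_root", "inner_package"))
open(os.path.join(root, "source_root", "__init__.py"), "w").close()
open(os.path.join(root, "source_root", "inner_package", "__init__.py"), "w").close()

# find_packages reports every directory containing an __init__.py
print(sorted(find_packages(where=root)))
# → ['source_root', 'source_root.inner_package']
```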

pip install bug with `-e` flag and `setuptools.setup(package_dir=...)` parameter?

I have what I think is a pip bug, and I want to double-check that it's not actually a mistake of mine before submitting it as a formal issue. If it's not a bug, I'd appreciate an explanation of what I'm doing wrong.
I have a project structure like so:
project/
├── setup.py
├── project_src/
│   ├── __init__.py
│   └── ...
└── common_utils/
    └── utils_src/
        ├── __init__.py
        └── ...
I want to be able to:
- import code from "project/project_src" via import project_src (this isn't the issue, I just want to be comprehensive)
- import code from "project/common_utils/utils_src" via import utils_src (note this strips the "common_utils" folder from the package path name)
In order to do this, the root-level "setup.py" looks something like this (abbreviated):
# setup.py
import setuptools

setuptools.setup(
    ...,
    packages=['project_src', 'utils_src'],
    package_dir={
        'project_src': 'project_src',
        'utils_src': 'common_utils/utils_src',
    },
    ...,
)
Now here's my issue. When I install this package locally from the command line as pip install project/, I can open an interpreter and successfully run import project_src and import utils_src. But if I install via pip install -e project/, import project_src works while import utils_src triggers a ModuleNotFoundError. (This is a huge pain, as I rely on the -e flag for development.)
Again, please let me know if this appears to be a bug, or if this is a mistake on my part.
Not a bug in pip, your mistake. You want to use common_utils/ directory as a parent dir for a package but you want utils_src as a package inside it. So change your setup.py:
package_dir={
    'project_src': 'project_src',
    'utils_src': 'common_utils',
},
With this, utils_src will be installed as a top-level package, so you can do
import utils_src
P.S. They say this doesn't work with pip install -e; I haven't tested that claim myself.
Turns out this is a long-standing issue: pypa/setuptools #230: develop mode does not respect src structure
Thanks @sinoroc for the hint in the comments on this answer.

Python setup.py for unusual folder structure [duplicate]

I have a Git repository cloned into myproject, with an __init__.py at the root of the repository, making the whole thing an importable Python package.
I'm trying to write a setuptools setup.py for the package, which will also sit in the root of the repository, next to the __init__.py file. I want setup.py to install the directory it resides in as a package. It's fine if setup.py itself comes along as part of the installation, but it would be better if it didn't. Ideally this should work also in editable mode (pip install -e .)
Is this configuration at all supported? I can kind of make it work by having a package_dir= {"": ".."}, argument to setup(), telling it to look for myproject in the directory above the current one. However, this requires the package to always be installed from a directory named myproject, which does not appear to be the case if, say, it's being installed through pip, or if someone is working out of a Git clone named myproject-dev, or in any number of other cases.
Another hack I'm contemplating is a symlink to . named mypackage inside of the repository. That ought to work, but I wanted to check if there was a better way first.
See also Create editable package setup.py in the same root folder as __init__.py
As far as I know this should work:
myproject-dev/
├── __init__.py
├── setup.py
└── submodule
└── __init__.py
#!/usr/bin/env python3
import setuptools

setuptools.setup(
    name='MyProject',
    version='0.0.0.dev0',
    packages=['myproject', 'myproject.submodule'],
    package_dir={
        'myproject': '.',
    },
)
One way to make this work for editable or develop installations is to manually modify the easy-install.pth file.
Assuming:
- the project lives in /home/user/workspace/empty/project;
- a virtual environment .venv is used;
- the project is installed with python3 -m pip install -e . or python3 setup.py develop;
- the Python version is 3.6.
Then:
- the file is found at a location such as /home/user/workspace/empty/project/.venv/lib/python3.6/site-packages/easy-install.pth;
- its content is /home/user/workspace/empty/project.
In order to let the imports work as expected one can edit this line to read the following:
/home/user/workspace/empty
Note:
- Everything in /home/user/workspace/empty that looks like a Python package can then be imported; that is why it is a good idea to place the project in its own directory. In this case the directory empty contains nothing but the directory project.
- The module project.setup is also importable.
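The mechanics can be reproduced without pip: a .pth file in a site directory simply appends its lines to sys.path, which is why pointing the line one level up changes what is importable. A throwaway sketch (directory and package names are stand-ins for the paths above):

```python
import os
import site
import sys
import tempfile

base = tempfile.mkdtemp()
parent = os.path.join(base, "empty")   # plays the role of /home/user/workspace/empty
os.makedirs(os.path.join(parent, "demo_project"))
open(os.path.join(parent, "demo_project", "__init__.py"), "w").close()

# write a .pth file whose single line is the parent directory
with open(os.path.join(base, "demo.pth"), "w") as f:
    f.write(parent + "\n")

site.addsitedir(base)   # processes demo.pth, appending `parent` to sys.path
import demo_project     # importable because its parent dir is now on the path
print(parent in sys.path)  # → True
```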

Why current working directory affects install path of setup.py? How to prevent that?

I have created a custom python package following this guide, so I have the following structure:
mypackage/              <-- VCS root
├── mypackage/
│   ├── submodule1/
│   └── submodule2/
└── setup.py
And setup.py contains exactly the same information as in the guide:
from setuptools import setup, find_packages

setup(name='mypackage',
      version='0.1',
      description='desc',
      url='vcs_url',
      author='Hodossy, Szabolcs',
      author_email='myemail@example.com',
      license='MIT',
      packages=find_packages(),
      install_requires=[
          # deps
      ],
      zip_safe=False)
I have noticed if I go into the folder where setup.py is, and then call python setup.py install in a virtual environment, in site-packages the following structure is installed:
.../site-packages/mypackage-0.1-py3.6.egg/mypackage/
├── submodule1/
└── submodule2/
but if I call it from one folder up like python mypackage/setup.py install, then the structure is the following:
.../site-packages/mypackage-0.1-py3.6.egg/mypackage/
└── mypackage/
    ├── submodule1/
    └── submodule2/
This latter one ruins all imports from my module, as the path to the submodules is different.
Could you explain what is happening here and how to prevent that kind of behaviour?
This is experienced with Python 3.6 on both Windows and Linux.
Your setup.py does not contain any paths; it only finds the files via find_packages(), so of course the result depends on where you run it from. The setup.py isn't strictly tied to its location. You could do things like chdir to the directory containing the setup file (os.path.dirname(sys.argv[0])), but that's rather ugly.
The question is, WHY do you want to build it that way? It looks more like you would want a structure like
mypackage-source
├── mypackage
│   ├── submodule1
│   └── submodule2
└── setup.py
And then execute setup.py from the work directory. If you want to be able to run it from anywhere, the better workaround would be to put a shell script next to it, like

#!/bin/sh
cd "$(dirname "$0")"
python setup.py "$@"

which separates the task of changing to the right directory (here, the directory containing the script and setup.py) from running setup.py.

Distutils Self Extracting Python Package [duplicate]

How can I make a setup.py file for my own script? I need to make my script global
(add it to /usr/bin) so I can run it from the console by just typing: scriptName arguments.
OS: Linux.
EDIT:
Now my script is installable, but how can I make it global, so that I can run it from the console by just typing its name?
EDIT: This answer deals only with installing executable scripts into /usr/bin. I assume you have basic knowledge on how setup.py files work.
Create your script and place it in your project like this:
yourprojectdir/
├── setup.py
└── scripts/
    └── myscript.sh
In your setup.py file do this:
from setuptools import setup
# (distutils.core's setup would also work here)

setup(
    # basic stuff here
    scripts=[
        'scripts/myscript.sh'
    ]
)
Then type
python setup.py install
Basically that's it. There's a chance that your script will land not exactly in /usr/bin, but in some other directory. If this is the case, type
python setup.py install --help
and search for --install-scripts parameter and friends.
I know this question is quite old, but just in case, I'll post how I solved the problem myself: I wanted to set up a package for PyPI that, when installed with pip, would expose its script as a system-wide command, not just as a Python module.
setup(
    # rest of setup
    entry_points={
        'console_scripts': [
            '<app> = <package>.<app>:main'
        ]
    },
)
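For reference, the '<app> = <package>.<app>:main' string is an entry-point specification. This sketch (with made-up names standing in for the placeholders) shows how it splits into a module and an attribute, which pip uses when generating the launcher script:

```python
from importlib.metadata import EntryPoint

# made-up names, mirroring the '<app> = <package>.<app>:main' placeholder
ep = EntryPoint(name="myapp", value="mypackage.myapp:main",
                group="console_scripts")
print(ep.module, ep.attr)  # → mypackage.myapp main
```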