setup.py duplicates subpackages as separate top-level packages - python

I have the following project structure:
.
└── Project/
    ├── package_1/
    │   ├── package_2
    │   ├── __init__.py
    │   ├── file_1.py
    │   ├── file_2.py
    │   └── file_3.py
    └── __init__.py
Since package_2 contains files ported from another project, I want to install package_1 with setuptools so that it doesn't conflict with the original project, and import it like this:
import package_1.package_2
Here is my setup.py:
setup(
    name="Project",
    ...
    packages=find_packages(exclude=["package_1"]),
)
So far everything works, except that in the \Lib\site-packages directory, in addition to package_1/package_2, I also get package_2 as a separate top-level package, which I don't think is right.
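One likely cause is that `find_packages()` also discovers a second, top-level copy of `package_2` somewhere in the repository. A runnable sketch of the fix (the temp-dir layout mirrors the question; the `exclude` pattern is a suggestion, not code from the question):

```python
import os
import tempfile
from setuptools import find_packages

# Recreate the layout from the question in a temp directory, then show
# that excluding the top-level "package_2" keeps only the nested copy.
root = tempfile.mkdtemp()
for pkg in ("package_1", os.path.join("package_1", "package_2"), "package_2"):
    os.makedirs(os.path.join(root, pkg))
    open(os.path.join(root, pkg, "__init__.py"), "w").close()

# Exclusion patterns match the full dotted name, so "package_2" removes
# only the top-level copy, not "package_1.package_2".
found = find_packages(root, exclude=["package_2"])
print(sorted(found))  # ['package_1', 'package_1.package_2']
```

In a real setup.py, `packages=find_packages(exclude=["package_2", "package_2.*"])` would keep the nested copy installable as `package_1.package_2` while dropping the stray top-level one.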


Import project's subpackages in Python

I'm experimenting with DDD in Python, so I've decided to implement a toy project.
I've created different directories in order to separate shared concepts from concepts specific to each bounded context.
As I try to import these files, I'm getting No module named errors.
For example, with this project structure:
.
└── src/
    ├── Book/
    │   ├── application
    │   ├── domain/
    │   │   ├── Book.py
    │   │   └── __init__.py
    │   ├── infrastructure
    │   └── __init__.py
    └── Shared/
        ├── application
        ├── domain/
        │   ├── Properties/
        │   │   ├── __init__.py
        │   │   └── UuidProperty.py
        │   ├── ValueObjects/
        │   │   ├── __init__.py
        │   │   └── BookId.py
        │   └── __init__.py
        └── infrastructure
In src/Book/domain/Book.py I have:
from Shared.domain.ValueObjects.BookId import BookId

class Book:
    bookId: BookId
    pages: int
As I've seen in other answers (pretty old ones), this can be fixed by adding these folders to PYTHONPATH, or via sys.path.insert(0, *path to folder*), but I'm wondering whether there is a more Pythonic way to achieve that.
I've also tried adding an __init__.py file to src and importing as from src.Shared.domain.ValueObjects.BookId import BookId, but none of my previous attempts worked.
In other repos I've seen that they use setuptools to install the src package in order to import it in unit tests (I can't import them in tests either), but I don't know whether that is recommended or whether it would work for imports inside the package.
In case someone is facing the same issue as me, I managed to import the subpackages and the full package in the tests directory.
Just include an __init__.py file in each subpackage, and inside the package/subpackages use relative imports (this loses some of the semantics of imports, where absolute paths from the root directory tell us where each import comes from, but it works):
from ..Properties import UuidProperty
# inside __init__.py of Properties directory
from .UuidProperty import UuidProperty
And, by including an __init__.py inside src/, we can import them in the tests directory like:
from src.Book.domain import Book
Hope this helps someone!
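For the setuptools route mentioned above, a minimal sketch of a src-layout setup.py (the project name and version are placeholders; only the src/ directory comes from the question):

```python
# setup.py -- a minimal sketch, not the repo's actual file.
from setuptools import setup, find_packages

setup(
    name="ddd-toy",                       # placeholder name
    version="0.1.0",                      # placeholder version
    package_dir={"": "src"},              # packages live under src/
    packages=find_packages(where="src"),  # finds Book, Shared, and subpackages
)
```

After `pip install -e .`, absolute imports such as `from Shared.domain.ValueObjects.BookId import BookId` resolve both inside the package and in the tests directory, with no sys.path manipulation.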

Can't import subfolders from python module on GitHub

I have a simple Python package that I've published on GitHub, and I installed it locally on my machine using pip. I am trying to import a subfolder of the module, but I keep getting a ModuleNotFoundError: No module named 'package_folder.subfolder1'
├── package_name/
│   ├── README.md
│   ├── setup.py
│   └── package_folder
│       ├── __init__.py
│       ├── file1.py
│       ├── file2.py
│       └── subfolder1/
│           ├── __init__.py
│           ├── file11.py
│           └── file12.py
I have the __init__.py files in both directories, so I'm not sure why I am unable to access the subfolder1 files.
I am able to import file1.py and file2.py from the top-level package_folder with from package_folder import file1.
In setup.py you have to include the subfolder in packages as well. So, instead of:
packages=['package_folder']
you have to use the dotted package name:
packages=['package_folder', 'package_folder.subfolder1']
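Alternatively, `find_packages()` discovers subpackages automatically, so the list doesn't have to be maintained by hand. A sketch (name and version are placeholders):

```python
# setup.py -- a sketch; find_packages() returns dotted names such as
# 'package_folder' and 'package_folder.subfolder1', provided each
# directory contains an __init__.py.
from setuptools import setup, find_packages

setup(
    name="package_name",  # placeholder
    version="0.1.0",      # placeholder
    packages=find_packages(),
)
```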

What is the correct way to distribute "bin" and "tests" directories for a Python package?

I have created a Python package.
On the advice of several internet sources (including https://github.com/pypa/sampleproject), I have set up the directory structure like so:
root_dir
├── bin
│   └── do_stuff.py
├── MANIFEST.in
├── README.md
├── my_lib
│   ├── __init__.py
│   ├── __main__.py
│   └── my_lib.py
├── setup.cfg
├── setup.py
├── important_script.py
└── tests
    ├── __init__.py
    └── test_lib.py
I have included tests, bin, and important_script.py in the manifest, and set include_package_data to True in setup.py.
However, after running pip install root_dir, I see that my_lib was installed correctly, but bin and tests were placed directly into Lib/site-packages as if they were separate packages.
I can't find important_script.py at all, and I don't think it was installed.
How do I correctly include these files/directories in my installation?
EDIT
So, it turns out that the bin and tests directories being placed directly into site-packages was caused by something I did previously, but I can't work out what. At some point a build and a dist directory were generated in my root_dir (by pip or setuptools, I assume), and any changes I made to the project after that did not show up in the installed package. After deleting these directories, I am no longer able to reproduce that issue.
The sample project distributes neither bin nor tests; it even explicitly excludes tests.
To include bin you should use scripts or entry_points (as in the sample project). Add this to the setup() call in your setup.py:
scripts=['bin/do_stuff.py'],
To include tests, restructure your tree so that the tests directory sits under the package directory:
root_dir
├── bin
│   └── do_stuff.py
├── MANIFEST.in
├── README.md
├── my_lib
│   ├── __init__.py
│   ├── __main__.py
│   ├── my_lib.py
│   └── tests
│       ├── __init__.py
│       └── test_lib.py
├── setup.cfg
├── setup.py
└── important_script.py
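With that layout, a sketch of the relevant setup() arguments (the commented-out console-script entry point and the main() function it names are assumptions, not from the question):

```python
# setup.py -- a sketch combining the two suggestions above.
from setuptools import setup, find_packages

setup(
    name="my_lib",    # matches the package directory in the question
    version="0.1.0",  # placeholder
    packages=find_packages(),      # picks up my_lib and my_lib.tests
    scripts=["bin/do_stuff.py"],   # ships bin/do_stuff.py as an executable
    # Or, instead of scripts, generate a launcher from a function
    # (assumes do_stuff were moved into the package and defined main()):
    # entry_points={"console_scripts": ["do-stuff = my_lib.do_stuff:main"]},
)
```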

How to use find_packages() to package all files from subdirectory

I am creating a Python package (for the first time). I am able to package the contents, but I am having an issue packaging one of the data files in a subdirectory. My directory structure looks like this:
├── Jenkinsfile
├── MANIFEST.in
├── README.md
├── __init__.py
├── country
│   ├── __init__.py
│   └── folder1
│       ├── file1.py
│       ├── file2.py
│       ├── file3.py
│       └── folder2
│           ├── file4.xlsx
│           └── __init__.py
├── repoman.yaml
├── requirements.txt
├── setup.py
└── test
    └── unit
        ├── __init__.py
        └── test_unit.py
My setup.py has the following get_packages() function:
def get_packages():
    return find_packages(exclude=['doc', 'imgs', 'test'])
When I build the package, it does not include file4.xlsx. Can anyone tell me why that is the case and how I can fix it?
I found this answer, which is similar to what I want to do.
I had to update my setup.py to include package_data, and MANIFEST.in to include the *.xlsx file (by providing the full directory path to the Excel file).
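A sketch of what that change could look like (paths follow the tree above; name and version are placeholders, and package_data keys are dotted package names, so folder1 and folder2 each need an __init__.py):

```python
# setup.py -- a sketch, not the project's actual file.
from setuptools import setup, find_packages

setup(
    name="country",   # placeholder
    version="0.1.0",  # placeholder
    packages=find_packages(exclude=["doc", "imgs", "test", "test.*"]),
    include_package_data=True,
    # Ship any .xlsx files found in country/folder1/folder2 with the wheel.
    package_data={"country.folder1.folder2": ["*.xlsx"]},
)
```

For source distributions, MANIFEST.in also needs a matching line, e.g. `recursive-include country *.xlsx`.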

Why does "pip install" not include my package_data files?

I can't figure out why, when I run pip install ../path_to_my_proj/ (from a virtualenv), none of the data files are copied across to the site-packages/myproj/ folder. The Python packages are copied across correctly.
Python version: 3.4.4
My project directory is like this:
├── myproj
│   ├── __init__.py
│   ├── module1.py
│   └── module2.py
├── data_files
│   ├── subfolder1
│   │   ├── datafile.dll
│   │   └── datafile2.dll
│   └── subfolder2
│       ├── datafile3.dll
│       └── datafile4.dll
├── MANIFEST.in
└── setup.py
And my MANIFEST.in looks like
recursive-include data_files *
include README.md
My setup() call looks like:
setup(
    name='myproj',
    version='0.1.1',
    install_requires=['requirement'],
    packages=['myproj'],
    include_package_data=True,
)
I encountered the same problem and asked about it on https://gitter.im/pypa/setuptools. The result? You just can't do that: the data files must live under myproj.
You can fake it by putting an empty __init__.py in data_files, but then it will be put into PYTHONHOME\Lib\site-packages alongside myproj at the same level, polluting the namespace.
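A runnable sketch of the fix the answer describes: move data_files under myproj, after which the files can be read with importlib.resources on modern Python (3.9+). The installed layout is faked in a temp directory here; the folder and file names come from the question, but the 2-byte file content is made up:

```python
import os
import sys
import tempfile
from importlib import resources

# Build myproj/data_files/subfolder1/datafile.dll inside a temp dir,
# mimicking a site-packages install where the data sits *inside* the package.
root = tempfile.mkdtemp()
data_dir = os.path.join(root, "myproj", "data_files", "subfolder1")
os.makedirs(data_dir)
open(os.path.join(root, "myproj", "__init__.py"), "w").close()
with open(os.path.join(data_dir, "datafile.dll"), "wb") as f:
    f.write(b"\x00\x01")  # made-up placeholder content

# Make the fake install importable, then read the data file via the package.
sys.path.insert(0, root)
payload = (resources.files("myproj")
           .joinpath("data_files/subfolder1/datafile.dll")
           .read_bytes())
print(len(payload))  # 2
```

With this layout, `package_data={"myproj": ["data_files/*/*.dll"]}` (or the existing MANIFEST.in plus include_package_data) gets the files into the wheel, because they now live under a declared package.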
