Alternative package setup using setuptools

I am having some trouble adding packages with my particular setup:
.
├── pkg_a
│   ├── pkg_a
│   │   ├── __init__.py
│   │   └── module_a.py
│   └── run_a.py
├── pkg_b
│   ├── pkg_b
│   │   ├── __init__.py
│   │   └── module_b.py
│   └── run_b.py
└── setup.py
My goal is to be able to import package modules without repeating the package name twice.
For example, in run_a.py I'd like to be able to write from pkg_a import module_a instead of from pkg_a.pkg_a import module_a.
I tried to follow Section 2.1 of the docs here, creating setup.py as follows:
from setuptools import setup

setup(
    name="test",
    packages=['pkg_a', 'pkg_b'],
    package_dir={'pkg_a': 'pkg_a/pkg_a', 'pkg_b': 'pkg_b/pkg_b'}
)
But this does not achieve the desired effect: after running python setup.py develop, python -c 'from pkg_a import module_a' fails.
Is this particular setup achievable, and what am I messing up here? Thanks all!

package_dir modifications do not work with editable (a.k.a. develop) installations. The only package_dir mapping that editable installations accept is the one covering the so-called src-layout:
package_dir={'': 'src'},
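Concretely, one way to make the editable install behave as desired here is to move both inner packages under a single src/ directory and use that mapping. A sketch, reusing the directory names from the question:

```python
# Restructured layout (sketch):
#
# .
# ├── src
# │   ├── pkg_a
# │   │   ├── __init__.py
# │   │   └── module_a.py
# │   └── pkg_b
# │       ├── __init__.py
# │       └── module_b.py
# └── setup.py

# setup.py
from setuptools import setup, find_packages

setup(
    name="test",
    packages=find_packages(where="src"),  # finds pkg_a and pkg_b
    package_dir={"": "src"},              # the one mapping editable installs support
)
```

After pip install -e . (or python setup.py develop), from pkg_a import module_a should resolve as desired.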

Import project's subpackages in Python

I'm experimenting with DDD in Python, so I've decided to implement a toy project.
I've created different directories in order to separate shared concepts from the concepts of specific bounded contexts.
As I try to import these files, I'm facing No module named errors.
For example, with this project structure:
.
└── src/
    ├── Book/
    │   ├── application
    │   ├── domain/
    │   │   ├── Book.py
    │   │   └── __init__.py
    │   ├── infrastructure
    │   └── __init__.py
    └── Shared/
        ├── application
        ├── domain/
        │   ├── Properties/
        │   │   ├── __init__.py
        │   │   └── UuidProperty.py
        │   ├── ValueObjects/
        │   │   ├── __init__.py
        │   │   └── BookId.py
        │   └── __init__.py
        └── infrastructure
In src/Book/domain/Book.py I have:
from Shared.domain.ValueObjects.BookId import BookId

class Book:
    bookId: BookId
    pages: int
As I've seen in other answers (pretty old ones), it can be fixed by adding these folders to PYTHONPATH or to sys.path with sys.path.insert(*path to file*), but I'm wondering if there is a more Pythonic way to achieve that.
I've also tried adding an __init__.py file to src and importing as from src.Shared.domain.ValueObjects.BookId import BookId, but none of my previous attempts worked.
In other repos I've seen that they use setuptools to install the src package in order to import it in unit tests (I can't import them in tests either), but I don't know whether that is recommended, or whether it would work for imports inside the package.
In case someone is facing the same issue as me: I managed to import the subpackages, and the full package in the tests directory.
Just include an __init__.py file in each subpackage and, inside the package/subpackages, use relative imports (this loses some of the semantics of the imports, where we know where each import comes from by its absolute path from the root directory, but it works):
from ..Properties import UuidProperty
# inside __init__.py of Properties directory
from .UuidProperty import UuidProperty
And, by including an __init__.py inside src/, we can import them in the tests directory like:
from src.Book.domain import Book
Hope this helps someone!
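For reference, the setuptools route mentioned in the question can also work: a minimal setup.py at the project root (a sketch only; the distribution name is made up, and it assumes Book/ and Shared/ each have an __init__.py) makes absolute imports work both inside the package and in the tests, without touching sys.path:

```python
# setup.py (project root) -- minimal sketch
from setuptools import setup, find_packages

setup(
    name="ddd-toy-project",               # hypothetical distribution name
    packages=find_packages(where="src"),  # finds Book, Shared and their subpackages
    package_dir={"": "src"},
)
```

After pip install -e ., from Shared.domain.ValueObjects.BookId import BookId resolves from anywhere in the project.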

Setup.py duplicates a subpackage as a separate one

I have a following project structure:
.
└── Project/
    ├── package_1/
    │   ├── package_2
    │   ├── __init__.py
    │   ├── file_1.py
    │   ├── file_2.py
    │   └── file_3.py
    └── __init__.py
As package_2 contains files ported from another project, I want to install package_1 using setuptools in a way that doesn't conflict with the original project, and import it like this:
import package_1.package_2
Here is my setup.py file content:
from setuptools import setup, find_packages

setup(
    name="Project",
    ...
    packages=find_packages(exclude=["package_1"]),
)
So far everything works perfectly, except that in the \Lib\site-packages directory, in addition to package_1/package_2, I also have package_2 as a separate module, which I think is not OK.

Issues with project setup and imports

While working on my first "bigger" Python project, I'm running into a multitude of issues while trying to debug and import various modules/sub-modules. Here's my tree:
netbox-setup/
├── README.md
├── TODO.md
├── netboxsetup
│   ├── __init__.py
│   ├── constants.py
│   ├── helpers
│   │   ├── __init__.py
│   │   ├── custom_napalm
│   │   │   ├── __init__.py
│   │   │   └── ios.py
│   │   ├── infoblox.py
│   │   ├── ise.py
│   │   ├── netbox.py
│   │   ├── solarwinds.py
│   │   └── utilities.py
│   └── main.py
├── requirements.txt
├── setup.py
└── tests
main.py imports:
from netboxsetup.helpers import utilities
from netboxsetup.helpers import solarwinds
utilities.py imports:
from napalm import get_network_driver
from netboxsetup.constants import USER
From what I've been reading, it's recommended to use absolute imports in a package rather than relative ones. If I try to run main.py from within the netboxsetup folder, it states that netboxsetup cannot be found. So I removed that prefix and just called from helpers import utilities. Now running main.py works, but when it imports the utilities file, the imports in the utilities file fail. The information I'm finding about imports in a package/module seems inconsistent about what to do.
Finally, if I run a python3 shell from the netbox-setup folder and use from netboxsetup.helpers import utilities, it imports. If I do the same from the netboxsetup folder, I get "ModuleNotFoundError: No module named 'netboxsetup'". Going back to where it worked, I then followed https://napalm.readthedocs.io/en/latest/tutorials/extend_driver.html to create the same setup exactly, and it states that my new method isn't found when I run the grab_inventory function I defined in utilities.py. And I did appropriately create the respective get_inventory function in the class in the ios.py file, as the NAPALM docs advise.
def grab_inventory(ip, password):
    # make fewer assumptions; account for NX-OS as well
    driver = get_network_driver('ios')
    with driver(ip, USER, password) as client:
        result = client.get_inventory()
    return result
I'm guessing all of my issues are related to pathing - whether absolute or relative, but I'm just having a very difficult time determining what the exact pathing is that works for everything. Is anyone able to point me into a proper source for proper import of modules in custom packages? Thanks.
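The different behaviour between the two folders comes down to how Python initializes sys.path: python -m pkg.module puts the current directory on the path, while python pkg/module.py puts the script's own directory there instead. A minimal reproduction, building a throwaway package with the same names as the tree above (the VALUE constant is invented purely for illustration):

```python
import os
import subprocess
import sys
import tempfile

# Recreate a tiny netboxsetup package in a scratch directory.
with tempfile.TemporaryDirectory() as root:
    helpers = os.path.join(root, "netboxsetup", "helpers")
    os.makedirs(helpers)
    open(os.path.join(root, "netboxsetup", "__init__.py"), "w").close()
    open(os.path.join(helpers, "__init__.py"), "w").close()
    with open(os.path.join(helpers, "utilities.py"), "w") as f:
        f.write("VALUE = 42\n")
    with open(os.path.join(root, "netboxsetup", "main.py"), "w") as f:
        f.write("from netboxsetup.helpers import utilities\n"
                "print(utilities.VALUE)\n")

    # Run as a module from the project root: the CWD is on sys.path,
    # so the absolute import resolves.
    module_run = subprocess.run(
        [sys.executable, "-m", "netboxsetup.main"],
        cwd=root, capture_output=True, text=True)

    # Run the file directly: sys.path[0] is netboxsetup/, not the
    # project root, so the absolute import fails.
    direct_run = subprocess.run(
        [sys.executable, os.path.join("netboxsetup", "main.py")],
        cwd=root, capture_output=True, text=True)

print(module_run.stdout.strip())                   # 42
print("ModuleNotFoundError" in direct_run.stderr)  # True
```

So running python -m netboxsetup.main from netbox-setup/ keeps the absolute imports working, without editing any files.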
P.S. Is it possible to debug an individual function in VSCode, giving it arguments at runtime as well (rather than having to run through all of the main sequence code to get to the function)?

Packaging python project with multiple directories

I need some explanation of how setuptools and the find_packages function work.
I have a project structure like this:
├── project_dir_1
│ ├── module.py
│ ├── __init__.py
├── my_project
│ ├── cli.py
│ ├── subdir1
│ │ ├── __init__.py
│ │ ├── module.py
│ ├── conf
│ │ ├── module.py
│ │ ├── params
│ │ │ ├── config.yml
│ │ ├── __init__.py
│ ├── subdir2
│ │ ├── module.py
│ ├── __init__.py
│ └── version.py
├── project_dir_2
│ ├── subdir1
│ │ ├── module.py
│ │ ├── __init__.py
│ ├── __init__.py
├── README.md
├── requirements.txt
├── setup.py
└── tests
└── test_main.py
All my code is in the my_project dir, and I also have two additional dirs, project_dir_1 and project_dir_2, that contain necessary external modules. I need to import from them both in the package code and in the code of other projects where this package will be installed in a venv.
I have a setup script like this:
from os.path import dirname, join
from setuptools import setup, find_packages

setup(
    name='my_project',
    version='0.0.1',
    description='Python library.',
    license='license',
    author='me',
    author_email='my_email',
    entry_points={'console_scripts': ['my_project=my_project.cli:main']},
    python_requires='>=3.7',
    packages=find_packages(
        include=['my_project', 'project_dir_1', 'project_dir_2', 'my_project.*', 'project_dir_1.*', 'project_dir_2.*']
    ),
    install_requires=list(open(join(dirname(__file__), 'requirements.txt')).read().split()),
)
When I activate the venv in another project folder and install the package from the package root folder with python ..\package_root\setup.py install, everything seems to work fine during the install, and pip list shows all dependencies plus my_project 0.0.1. But if I try to import something from my_project using the venv interpreter, I get ModuleNotFoundError: No module named 'my_project'. The same happens when I try something like from project_dir_1 import module, which is also necessary. Also, when I just run my_project from the shell (invoking the cli entry point), I get an error:
Traceback (most recent call last):
  File "/home/developer/another_project/env/bin/my_project", line 11, in <module>
    load_entry_point('my_project==0.0.1', 'console_scripts', 'my_project')()
  File "/home/developer/another_project/env/lib/python3.8/site-packages/pkg_resources/__init__.py", line 489, in load_entry_point
    return get_distribution(dist).load_entry_point(group, name)
  File "/home/developer/another_project/env/lib/python3.8/site-packages/pkg_resources/__init__.py", line 2852, in load_entry_point
    return ep.load()
  File "/home/developer/another_project/env/lib/python3.8/site-packages/pkg_resources/__init__.py", line 2443, in load
    return self.resolve()
  File "/home/developer/another_project/env/lib/python3.8/site-packages/pkg_resources/__init__.py", line 2449, in resolve
    module = __import__(self.module_name, fromlist=['__name__'], level=0)
ModuleNotFoundError: No module named 'my_project'
So what is the right way to organize this complex project structure, and what should I include in setup.py to get setuptools to install the package correctly? I need a better understanding of Python packaging, but I still can't find answers for this case in the docs.
find_packages resolves paths relative to the current working directory, so calling it from outside the project root dir will effectively install nothing (check whether you see any sources installed, e.g. by running
$ pip show -f my_project
-- I bet nothing will be listed). You have to force switching to the project root dir in the setup script, e.g. add a magic line to it:
# setup.py
import os
from pathlib import Path
from setuptools import setup

# old-style (also works on Python 2):
os.chdir(os.path.normpath(os.path.join(os.path.abspath(__file__), os.pardir)))
# new-style (Python 3) -- either line alone is enough:
os.chdir(Path(__file__).parent.absolute())

setup(...)
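A small demonstration of the point above: find_packages() resolves paths relative to the current working directory, not to setup.py's location. The package name reuses the one from the question; the directories are scratch ones created just for the demo:

```python
import os
import tempfile

from setuptools import find_packages

# Build a fake project root containing my_project/.
scratch = tempfile.mkdtemp()
project_root = os.path.join(scratch, "project")
os.makedirs(os.path.join(project_root, "my_project"))
open(os.path.join(project_root, "my_project", "__init__.py"), "w").close()

empty_dir = os.path.join(scratch, "elsewhere")
os.makedirs(empty_dir)

os.chdir(empty_dir)          # like running setup.py from another folder
found_outside = find_packages()

os.chdir(project_root)       # what the os.chdir(...) line ensures
found_inside = find_packages()

print(found_outside)   # []
print(found_inside)    # ['my_project']
```

The first call silently finds nothing, which is exactly why the install "succeeds" but leaves no importable sources behind.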

Configure python sub-packages

I'm trying to set up sub-packages for a Python project; please refer to the structure below. The main setup.py will call the setup.py in each sub-package.
my_project
├── my_sub_package1
│   ├── foo2.py
│   ├── foo.py
│   └── setup.py
├── my_sub_package2
│   ├── bar2.py
│   ├── bar.py
│   └── setup.py
└── setup.py  [main]
With this structure, if a user of another project only needs one sub-package, they can choose to install my_sub_package1 alone instead of the whole package (which can become bulky over time as the number of packages increases).
Does anyone know if this is the correct way of doing it? Thanks!
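For concreteness, this is the kind of standalone sub-package setup.py I have in mind (a minimal sketch; the module names come from the tree above, and the version number is made up):

```python
# my_sub_package1/setup.py -- minimal sketch
from setuptools import setup

setup(
    name="my_sub_package1",
    version="0.0.1",
    py_modules=["foo", "foo2"],  # the modules sitting next to this setup.py
)
```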
