I have two projects (trysetup1 and trysetup2) with the following structure:
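(roughly; the placement of module2 and package2 below is inferred from the rest of the question, so treat it as an assumption)
trysetup1/
    package1/
        setup.py
        module1.py
        module2.py
    package2/
        ...
trysetup2/
    module3.py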
I want to pip install package1 and use module1 from project trysetup2.
My setup.py, which sits under package1, looks like this:
import setuptools

setuptools.setup(
    name="common",
    version="1.0.2",
    packages=setuptools.find_packages(),
)
The way I want to use module1 is like this: from package1.module1 import ClassOne, because I still need to use it from package2.
When importing it from module2 it works just fine,
but when trying to use it from module3 (in the other project, after pip installing the package) I get an "Unresolved reference 'package1'" error.
I know I could use module1 by putting it inside another package under package1, but I need this exact structure in order to use it from the rest of the project 'trysetup1'.
Thanks!
I found my answer here:
https://docs.python.org/3/distutils/examples.html
Actually, all I needed to do was change my setup.py file to look like this:
setuptools.setup(
    name="common",
    version="1.0.2",
    package_dir={'package1': ''},
    packages=['package1'],
)
By adding the package_dir parameter, setup() is told that the files under my root directory (package1, where setup.py lives) belong to the package1 package, and by adding the packages parameter it actually distributes that package. Then, if you go to:
/..../venv/lib/python3.8/site-packages/common-1.0.2-py3.8.egg-info/top_level.txt
you'll see the following content:
package1
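With that installed (re-run pip install . from inside package1), the import from the question should resolve from the other project; a minimal sanity check, assuming ClassOne really is defined in module1:
from package1.module1 import ClassOne
print(ClassOne)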
I am trying to create multiple Python packages in the same namespace without re-creating the full folder structure for that namespace.
This is my project structure:
some_folder/
    src/
        subpackage1/
            functions.py
            setup.py
another_folder/
    src/
        subpackage2/
            functions.py
            setup.py
The first setup.py looks like this (the second one is the same, only with "subpackage2" instead of "subpackage1" in name and packages):
from setuptools import setup

setup(
    name="foo-bar-subpackage1",
    version="0.1",
    package_dir={"foo.bar": "."},
    packages=["foo.bar.subpackage1"]
)
So basically I want to define multiple different packages in the same shared namespace, a namespace I also don't want to re-create as a folder structure (e.g. foo/bar/subpackage1); I just want to start at the lowest level, where my code is. I am aiming to use "native namespace packaging" as explained here and in PEP 420, but that example doesn't use package_dir.
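For reference, the native namespace packaging example being referred to keeps the full folder layout (foo/bar/subpackage1/ with no __init__.py in foo/ or foo/bar/) and a setup.py sitting next to foo/, roughly like this (adapted to the foo.bar names used here, so treat it as a sketch rather than the guide's exact text):
from setuptools import setup, find_namespace_packages

setup(
    name="foo-bar-subpackage1",
    version="0.1",
    packages=find_namespace_packages(include=["foo.*"]),
)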
What I expect to achieve: that after running pip install . or pip install -e . in both some_folder/src/subpackage1 and another_folder/src/subpackage2 I am able to do this:
from foo.bar.subpackage1.functions import a
from foo.bar.subpackage2.functions import b
# there might be something else under foo.* or foo.bar.* that is provided by other modules in other projects, which should also be able to import
What I get instead:
AttributeError: module 'foo.bar' has no attribute 'subpackage1'
When I remove the package_dir parameter from setup(), change the packages parameter value to ["foo", "foo.bar", "foo.bar.subpackage1"] and re-create the full namespace structure like this:
some_folder/
    src/
        foo/
            bar/
                subpackage1/
                    functions.py
        setup.py
it works, but this structure feels highly excessive. Is there a way to shorten it like I'm intending above?
My directory is structured like this:
>project
    >tests
        >test_to_json.py
    >app.py
    >x.db
In my test_to_json.py I want to access a class from app.py.
How would I import it? I've tried from ..app import myClass but that gives me ImportError: attempted relative import with no known parent package. Is there something I'm missing?
Relative imports like .. only work when the importing module is part of a package that Python actually knows about; because your test is being run without a known parent package, the import fails with exactly that error. The usual fix is to convert your project into a Python package.
Extensive tutorials for doing so can be found here, but I will give an example of how to convert your project into a package.
Step 1
The new file structure should look like this:
>project
    >tests
        >__init__.py  #Note this file
        >test_to_json.py
    >__init__.py  #Note this file
    >setup.py  #Note this file
    >app.py
    >x.db
Step 2
Write your setup.py.
Here is a generic setup.py that should work for your project:
from setuptools import setup, find_packages

setup(
    name='project_package',  # the name of your desired import
    version='0.0.1',
    author='An Awesome Coder',
    packages=find_packages(),
    description='An awesome package that does something',
    install_requires=[],  # a list of python dependencies for your package
)
find_packages() looks for the __init__.py files in your project to work out which packages and subpackages to include.
Step 3
Install your package. In the folder with your new setup.py, run pip install -e . (editable mode). This installs your package on your computer, essentially adding it to your Python path.
Step 4
From any python terminal on your computer you should now be able to import your package using the package_name you specified.
import project_package
from project_package.app import myClass
myClass()
I have a package named 'firstlib' which I want to include completely in another package while keeping the code separated.
For example, the directory structure looks like this:
Proj1/
    setup.py
    firstlib/
        __init__.py
        mod1.py
        subpack/
            ...
Proj2/
    setup.py
    otherlib/
        __init__.py
        mod2.py
        ...
Now I want to be able to use 'firstlib' as a standalone package but also as a subpackage of 'otherlib'. Assuming both packages are installed, what needs to be done to make these lines work:
from firstlib import mod1
from otherlib.firstlib import mod1
So basically I want firstlib's modules to live in the same namespace as otherlib's modules, somewhat like a namespace package.
I have a repository I inherited that is used by a lot of teams; lots of scripts call it, and it seems like it's going to be a real headache to make any structural changes to it. I would like to make this repo installable somehow. It is structured like this:
my_repo/
    scripts.py
If it were my repository, I would change the structure like so, make it installable, and run python setup.py install:
my_repo/
    setup.py
    my_repo/
        __init__.py
        scripts.py
If this is not feasible (and it sounds like it might not be), can I somehow do something like:
my_repo/
    setup.py
    __init__.py
    scripts.py
And add something to setup.py to let it know that the repo is structured funny like this, so that I can install it?
You can do what you suggest.
my_repo/
    setup.py
    __init__.py
    scripts.py
The only thing is that you will need to import modules in your package by their bare module name if they sit at the base level. So, for example, if your structure looked like this:
my_repo/
    setup.py
    __init__.py
    scripts.py
    lib.py
    pkg/
        __init__.py
        pkgmodule.py
Then your imports in scripts.py might look like
from lib import func1, func2
from pkg.pkgmodule import stuff1, stuff2
So in your base directory, imports are essentially by module name, not by package. This could mess up some of your other packages' namespaces if you're not careful, for example if another dependency has a package named lib. So it would be best to run these scripts in a virtualenv, and to test that the namespacing doesn't get broken.
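A quick way to check which lib actually gets picked up at runtime (lib is just the module name from the example above):
import lib
print(lib.__file__)  # if this doesn't point at my_repo's lib.py, another package named lib is shadowing it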
There is a directive in setup.py to set the name of the package to install and the directory it should get its modules from. That lets you keep the desired directory structure. For instance, given a directory structure such as:
my_repo/
    setup.py
    __init__.py
    scripts.py
You could write a setup.py such as:
from setuptools import setup

setup(
    # -- Package structure ----
    packages=['my_repo'],
    package_dir={'my_repo': '.'})
Thus anyone installing the contents of my_repo with the command "./setup.py install" or "pip install ." would end up with an installed copy of my_repo's modules.
As a side note, relative imports work differently in Python 2 and Python 3. In the latter, relative imports have to be written explicitly (with a leading dot). This way of installing my_repo will work in Python 3 when importing in an absolute fashion:
from my_repo import scripts
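For imports between my_repo's own modules under this setup.py, either of these should work in Python 3 once the package is installed (lib.py and func1 are borrowed from the earlier example and are only illustrative):
# inside my_repo/scripts.py
from my_repo.lib import func1   # absolute import
from .lib import func1          # explicit relative import (Python 3)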
I'm trying to create an install package for a Python project with included unit tests. My project layout is as follows:
setup.py
src/
    disttest/
        __init__.py
        core.py
tests/
    disttest/
        __init__.py
        testcore.py
My setup.py looks like this:
from distutils.core import setup
import setuptools

setup(name='disttest',
      version='0.1',
      package_dir={'': 'src'},
      packages=setuptools.find_packages('src'),
      test_suite='nose.collector',
      tests_require=['Nose'],
      )
The file tests/disttest/testcore.py contains the line from disttest.core import DistTestCore.
Running setup.py test now gives an ImportError: No module named core.
After a setup.py install, python -c "from disttest.core import DistTestCore" works fine. It also works if I put import core into src/disttest/__init__.py, but I don't really want to maintain that and it only seems necessary for the tests.
Why is that? And what is the correct way to fix it?
You may want to double-check this, but it looks like your tests are importing the disttest package in the tests/ directory, instead of the package-under-test from the src/ directory.
Why do you need to use a package with the same name as the package-under-test? I'd simply move the testcore module up to the tests directory, or rename the tests/disttest package and avoid the potential naming conflict altogether.
In any case, you want to insert an import pdb; pdb.set_trace() line just before the failing import and play around with different import statements to see what is being imported from where (import sys; sys.modules['modulename'].__file__ is your friend), so you get better insight into what is going wrong.
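A minimal sketch of that kind of poking around, placed just before the failing import in testcore.py (the exact statements are only examples; the names come from the question):
import sys
import pdb; pdb.set_trace()   # drops you into the debugger before the failing import
from disttest.core import DistTestCore

# at the (Pdb) prompt, check what got imported from where, e.g.:
#   import disttest; print(disttest.__file__)
#   print(sys.modules['disttest'].__file__)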