I have two Python projects. The first contains useful packages for file manipulation and the like; they are useful because they can be reused in any other kind of project. The second is one such other project, which needs my useful packages. Here is my projects' file structure:
Python-projects/
    Useful/
        package_parsing/
        package_file_manipulation/
        package_blabla/
    Super-Application/
        pkg1/
            __init__.py
            module1.py
            module2.py
        pkg2/
        setup.py
        MANIFEST.in
        README.rst
First off, I would like to use the Useful/package_parsing package in my module Super-Application/pkg1/module1.py. Is there a more convenient way to do this than copying package_parsing into the Super-Application project?
Depending on the first answer, that is, if there is a way to link a module from a different project, how could I include such an external module in a release package of my Super-Application project? I am not sure that making use of install_requires in setup.py will do.
My main idea here is to avoid duplicating the Useful/package_parsing package in each of my other development projects, especially since I will want to modify this useful package over time and don't want to have to update outdated copies in every project.
=============
EDIT 1
It appears the first part of my question can be dealt with by appending to the path:
import sys
# Insert the directory that contains the package, not the package itself:
sys.path.insert(0, "path/to/Useful")
Moreover, I can simply check the available paths using:
for p in sys.path:
    print(p)
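A slightly more robust variant avoids hard-coding an absolute path by deriving it from the importing file's location; a minimal sketch, assuming module1.py sits at the depth shown in the tree above:
import os
import sys

# From this file (Super-Application/pkg1/module1.py), go up two levels to
# Python-projects/, then into Useful/:
_useful_dir = os.path.normpath(os.path.join(
    os.path.dirname(os.path.abspath(__file__)), "..", "..", "Useful"))
sys.path.insert(0, _useful_dir)

import package_parsing  # now resolvable from Useful/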
Now, for the second part: how can I include such an external module in a release package, possibly using the setup.py installation file?
I have inherited a complex setup.py which uses various Python modules inside a separate folder called buildchain. The entire repository is basically structured like this:
- setup.py
- buildchain
  - __init__.py
  - extension_generator.py
- src
  - my_package
    - __init__.py
    - my_module.py
with setup.py stating:
import setuptools
from buildchain.extension_generator import CppExtensionGenerator

setuptools.setup(
    ...
    ext_modules=CppExtensionGenerator().create_ext_modules()
)
This structure works fine when running python setup.py build_ext and similar things, but when installing the package using pip, for example in a tox environment, I get a ModuleNotFoundError: No module named 'buildchain'. Understandable, but frustrating.
buildchain/ is explicitly excluded in the manifest because it is only needed for compiling the extensions, not after the installation is finished, so I don't want it in the wheel. The working solution is to stick all the code inside setup.py, but that's not easy to navigate. I'm looking for something with the equivalent effect that still doesn't add all this code to the users' Python environments permanently.
How can I have a setup.py split into different modules with imports between them?
@sinoroc and @9769953 pointed out that I do not want buildchain/ inside the wheel, but I do want it inside the sdist. Therefore it should be included in the manifest (which informs the sdist), but not constitute package data.
How to achieve that is answered, e.g., in Include files in sdist but not wheel.
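In outline, that likely means shipping buildchain/ in the sdist via MANIFEST.in (e.g. a graft buildchain line) while never listing it as a package, so it stays out of the wheel. A sketch of the setup.py side, untested:
import setuptools
from buildchain.extension_generator import CppExtensionGenerator

setuptools.setup(
    name="my_package",  # placeholder distribution name
    package_dir={"": "src"},
    # find_packages() only scans src/, so buildchain/ is never listed as a
    # package and therefore never ends up in the wheel; the import above
    # still works when pip builds from the sdist, which contains buildchain/.
    packages=setuptools.find_packages(where="src"),
    ext_modules=CppExtensionGenerator().create_ext_modules(),
)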
[I think. I intend to try it out and then amend this answer describing what exactly I needed to do.]
A project written in Python with some C extensions (not using SWIG, etc.). I am trying to figure out how to structure my project so that:
- imports work, including importing the shared objects;
- I don't need to change PYTHONPATH (I tried figuring it out and failed);
- in the future, distributing the project as a package will be easiest.
Current structure is, as suggested here:
Project\
    docs\               # mainly documentation, right?
    bin\                # empty
    setup.py            # setup for the project, as suggested in the above link
    project\
        __init__.py
        module.py
        tests\
            bla_test.py
        C\              # the package of C files
            file.c
            file.so
            other_c_stuff.c
            header.h
            setup.py    # setup to compile the C files and create .so files
    build\              # contains a bunch of (hopefully) irrelevant stuff
It worked from PyDev, but not from the shell. An ideal answer would address the following:
- A suggested structure for the project.
- How an import would be performed (from, say, the modules in tests).
- Should (can) I keep all the C files in a separate library?
- In which of the setup.py files is the build of the C files done (should I post them here?)
- Is it possible to build automatically when necessary? How?
I tried relative imports, but they don't work for me for some reason. I saw the accepted answer to this question. He says: do whatever. But I can't get the imports to work. I read this answer, but have no clue what all the stuff he has (and I don't have) is. The accepted answer doesn't help me because, again, the imports fail. This blog post gives good advice but, again, the imports!
I don't want to go into detail for a general answer, since you linked to good ones already.
A structure which should work for you could look like this:
Project\
    build\              # directory used by setup.py
    docs\               # mainly documentation, right?
    setup.py            # setup for the project, as suggested in the above link
    project\
        __init__.py
        module.py
        c_package\
            __init__.py
            file.c
            file.so
            other_c_stuff.c
            header.h
    tests\
        __init__.py
        test_bla.py
Within the project package and its subpackages you can then use relative imports, provided you build the C extensions in place:
python setup.py build_ext --inplace
or create a setup.cfg containing
[build_ext]
inplace=True
but only use this for development and don't release it, since installation will fail.
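For reference, a matching setup.py could look roughly like this (a sketch; the extension name and source list are inferred from the tree above):
from setuptools import setup, Extension, find_packages

setup(
    name="project",
    version="0.1",
    packages=find_packages(),
    ext_modules=[
        Extension(
            # Built in place as project/c_package/file.so, importable as
            # project.c_package.file:
            "project.c_package.file",
            sources=[
                "project/c_package/file.c",
                "project/c_package/other_c_stuff.c",
            ],
        ),
    ],
)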
Automating the build is possible, but I don't know of any way other than calling setup.py directly whenever the C sources have changed.
G'day,
Being a total Python noob when it comes to packaging and module organisation...
Given the following (simplified) structure:
.
├── bin
│ └── fos.py
└── lib
└── drac.py
Given that, when installed, the contents of the lib folder will go somewhere into /usr/local/share/pyshared and the contents of the bin folder somewhere into /usr/bin, how do I persuade this whole thing to import my modules from ../lib when running from the VCS checkout, yet work as it should when installed, i.e. from modulename.drac import bla, while preferably keeping the imports the same in both cases?
Yes, I've read the Python docs on module organisation and structure; I just can't seem to wrap my head around some best practices. Asking for best practices on SO is stupid, hence this concrete example, which I run into more or less daily.
Is this structure acceptable, if so, how do I organise the imports? If not, what would be the pythonic way to redo it?
Thanks!
I think you are bucking the idiom here. What you are describing is similar to the old C ld_lib paradigm.
There is nothing wrong with a Python project sourcing modules out of its own local file tree. Alternatively, if your code really is that separate and your lib has a well-defined API, then you should package it separately and import/install it using easy_install, pip, or a setup.py.
Generally, if the code appears to be evolving together, it is best to just leave it together. Install it wherever you install your Python code (/opt, etc.) and symbolically link the executables into /usr/local/bin.
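If you do split the lib out into its own project with a setup.py, an editable install keeps a single live copy on every developer's path (the checkout path here is hypothetical):
pip install -e /path/to/your-lib-checkout
Changes in the checkout are then picked up immediately, without reinstalling.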
I'm working toward adopting Python as part of my team's development tool suite. With the other languages/tools we use, we develop many reusable functions and classes that are specific to the work we do. This standardizes the way we do things and saves a lot of wheel re-inventing.
I can't seem to find any examples of how this is usually handled with Python. Right now I have a development folder on a local drive, with multiple project folders below it, and an additional "common" folder containing packages and modules with reusable classes and functions. These "common" modules are imported by modules within multiple projects.
Development/
    Common/
        Package_a/
        Package_b/
    Project1/
        Package1_1/
        Package1_2/
    Project2/
        Package2_1/
        Package2_2/
In trying to learn how to distribute a Python application, it seems that there is an assumption that all referenced packages are below the top-level project folder, not collateral to it. The thought also occurred to me that perhaps the correct approach is to develop common/framework modules in a separate project, and once tested, deploy those to each developer's environment by installing to the site-packages folder. However, that also raises questions re distribution.
Can anyone shed light on this, or point me to a resource that discusses this issue?
If you have common code that you want to share across multiple projects, it may be worth storing it in a physically separate project, which is then imported as a dependency by your other projects. This is easily achieved if you host your common code project on GitHub or Bitbucket, where you can use pip to install it in any other project. This approach not only helps you share common code across multiple projects easily, but also protects you from inadvertently creating bad dependencies (i.e. ones directed from your common code to your non-common code).
The link below provides a good introduction to using pip and virtualenv to manage dependencies; it is definitely worth a read if you and your team are fairly new to working with Python, as this is a very common toolchain for exactly this kind of problem:
http://dabapps.com/blog/introduction-to-pip-and-virtualenv-python/
And the link below shows you how to pull in dependencies from github using pip:
How to use Python Pip install software, to pull packages from Github?
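For example, a requirements.txt entry that pulls the common code straight from a Git repository could look like this (repository URL, tag, and egg name are all hypothetical):
git+https://github.com/yourteam/common-utils.git@v0.3#egg=common-utils
Running pip install -r requirements.txt then installs it like any other dependency.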
The must-read-first on this kind of stuff is here:
What is the best project structure for a Python application?
in case you haven't seen it (and follow the link in the second answer).
The key is that each major package be importable as if "." were the top-level directory, which means it will also work correctly when installed in site-packages. This implies that the major packages should all be flat within the top directory, as in:
myproject-0.1/
    myproject/
    framework/
    packageA/
        sub_package_in_A/
            module.py
    packageB/
    ...
Then both you (within your other packages) and your users can import them as:
import myproject
import packageA.sub_package_in_A.module
etc
This means you should think hard about @MattAnderson's comment, but if you want something to appear as a separately-distributable package, it needs to be in the top directory.
Note this doesn't stop you (or your users) from doing an:
import packageA.sub_package_in_A as sub_package_in_A
but it does stop you from allowing:
import sub_package_in_A
directly.
...it seems that there is an assumption that all referenced packages
are below the top-level project folder, not collateral to it.
That's mainly because the current working directory is the first entry in sys.path by default, which makes it very convenient to import modules and packages below that directory.
If you remove it, you can't even import stuff from the current working directory...
$ touch foo.py
$ python
>>> import sys
>>> del sys.path[0]
>>> import foo
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ImportError: No module named foo
The thought also occurred to me that perhaps the correct approach is
to develop common/framework modules in a separate project, and once
tested, deploy those to each developer's environment by installing to
the site-packages folder.
It's not really a major issue for development. If you're using version control, and all developers check out the source tree in the same structure, you can easily employ relative path hacks to ensure the code works correctly without having to mess around with environment variables or symbolic links.
However, that also raises questions re distribution.
This is where things can get a bit more complicated, but only if you're planning to release libraries independently of the projects which use them, and/or have multiple project installers share the same libraries. If that's the case, take a look at distutils.
If not, you can simply employ the same relative path hacks used in development to ensure your project works "out of the box".
I think that this is the best reference for creating a distributable python package:
link removed as it leads to a hacked site.
Also, don't feel that you need to nest everything under a single directory. You can do things like:
platform/
    core/
        coremodule
    api/
        apimodule
and then do things like from platform.core import coremodule, etc.
I'm working on a Python project with approximately the following layout
project/
    foo/
        __init__.py
        useful.py
        test/
            __init__.py
            test_useful.py
test_useful.py tries to import project.foo.useful so it can test it, but it doesn't work when I run "python project/foo/test/test_useful.py"; it does work if I copy it into my current directory and run "python test_useful.py".
What is the correct way to handle these imports while developing? It seems like this won't be an issue once installed, because it will be in PYTHONPATH. Should I use distutils to make a build/ folder and add it to my PYTHONPATH?
First of all you need to set up your PYTHONPATH to either include "project" or the parent of "project". This is important while you're developing too :-)
Then you should be able to use an absolute import:
from project.foo import useful
Secondly, I would suggest that instead of running tests by executing the module, you install py.test (pip install pytest). Then you'll be able to use relative imports, as long as your py.test invocation is generic enough (i.e. "py.test foo" will work, but "py.test foo/test/test_useful.py" will not). I would still recommend that you not use relative imports in tests.
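For instance, a py.test-style test using that absolute import might look like this (a sketch; the assertion is a trivial placeholder):
# project/foo/test/test_useful.py
from project.foo import useful

def test_useful_importable():
    # Placeholder check; real tests would exercise useful's actual API.
    assert useful is not None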
Please consider using distutils/setuptools to make your project installable in a Python standard way. (Hint: you'll need to create a setup.py file parallel to the 'foo' directory, also known as a package.)
Doing so will also allow you to use a number of common Python testing frameworks (nose, py.test, etc.) to collect and run tests; most such frameworks automatically ensure 'foo' is an importable package before running the tests. Your test_useful.py tests can then import 'foo.useful' without a problem.
Also worth noting from your example directory structure: it is generally recommended that your tests directory NOT be a Python package, i.e. delete the test/__init__.py file. The framework will ensure the tests are runnable, and not making them a package helps ensure they only get distributed in source distributions and not in binary ones (where they likely aren't wanted).
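A minimal setup.py along those lines might look like this (a sketch; the distribution name and version are placeholders):
# setup.py, placed parallel to the foo/ directory
from setuptools import setup, find_packages

setup(
    name="foo",        # placeholder distribution name
    version="0.1",     # placeholder version
    # Keep the tests out of binary distributions:
    packages=find_packages(exclude=["foo.test", "foo.test.*"]),
)
With that in place, pip install -e . makes foo importable from anywhere, and test runners like py.test or nose can find it without sys.path tweaks.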