Say you want to create a programming project that mixes C++ and Python. The Foo C++ project is built with CMake, and a Python module is generated with SWIG. The tree structure would look something like this:
├── CMakeLists.txt
├── FooConfig.cmake.in
├── FooConfigVersion.cmake.in
├── Makefile
├── README
├── foo
│ ├── CMakeLists.txt
│ ├── config.hpp.in
│ ├── foo.cpp
│ └── foo.hpp
└── swig
└── foo.i
Now you would like to make use of the Foo project within a Python project, say Bar:
├── AUTHORS.rst
├── CONTRIBUTING.rst
├── HISTORY.rst
├── LICENSE
├── MANIFEST.in
├── Makefile
├── README.rst
├── docs
│ ├── Makefile
│ ├── authors.rst
│ ├── conf.py
│ ├── contributing.rst
│ ├── history.rst
│ ├── index.rst
│ ├── installation.rst
│ ├── make.bat
│ ├── readme.rst
│ └── usage.rst
├── bar
│ ├── __init__.py
│ └── bar.py
├── requirements.txt
├── setup.cfg
├── setup.py
├── tests
│ ├── __init__.py
│ └── test_bar.py
└── tox.ini
This structure was created using cookiecutter's pypackage template. A BoilerplatePP template is also available to generate a CMake C++ project using cookiecutter (without the SWIG part).
So now that I have the structure of both projects, and considering that development will take place mainly in Python and that the project will be run on different systems, I need to address the following questions:
What's the best way to mix them? Should I collapse both root directories? Should I have the Foo C++ project as a directory of the Bar project, or the other way around? I am inclined to put the entire C++ structure shown above in a folder at the root level of the Python project, but I would like to know a priori about any pitfalls, as the CMake system is quite powerful and it may be convenient to do it the other way around.
In case I decide to put the Foo project as a directory within Bar, is the Python setuptools package as powerful as the CMake build system? I ask because, when I look at the Bar project, the top level seems to contain only a bunch of scripts, but I don't know whether this is the equivalent of CMake, as I'm new to Python.
The Bar project outlined above has a single bar directory, but I assume that as the project grows, instead of having many other directories at the root level, additional directories containing Python code will be placed within bar. Is this correct (in the Pythonic sense)?
I assume that a single egg will be produced from the entire project, so that it can be installed and run on many different Python systems. Is the integration of the module created by the Foo project easy? I assume that this module will be created in a directory other than bar.
In order for the Python code within the bar directory to work, the module created by SWIG has to be available, so I guess the most straightforward way to do this is to modify the PYTHONPATH environment variable using the CMake system. Is this fine, or is there a better way?
If the C++ application has no use outside the Python package that will contain it:
You can pretty safely place the C++ code within the Python package that owns it, i.e. have the "foo" directory within the "bar" directory in your example. This will make packaging the final Python module a bit easier.
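One way that could look, sketched with the directory names from the question (the exact nesting is a judgment call):
bar
├── setup.py
├── bar
│   ├── __init__.py
│   ├── bar.py
│   └── foo
│       ├── CMakeLists.txt
│       ├── foo
│       └── swig
└── tests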
If the C++ application is reusable:
I would definitely try to think of things in terms of "packages", where independent parts are self-contained. All independent parts live at the same level, and if one part depends on another, you import it from the corresponding "package" at the same level. This is how dependencies typically work.
I would NOT include one within the other, because one does not strictly belong to the other. What if you started a third project that needed "foo" but not "bar"?
I would place both the "foo" and "bar" packages into the same "project" directory (and I would probably give each package its own code repository so that each can be easily maintained and installed).
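For example, bar's setup.py could then declare the dependency explicitly. This is only a sketch; it assumes foo is packaged and installable on its own:
from setuptools import setup

setup(
    name='bar',
    version='0.1',
    packages=['bar'],
    # the SWIG-wrapped C++ package, built and published separately
    install_requires=['foo'],
)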
Files from the utils folder are not included in the package when individually packaging with serverless-python-requirements plugin.
In my Serverless.com AWS Python project I have the following folder structure:
.
├── serverless.yml
├── generate_features
│ ├── requirements.txt
│ └── generate_features.py
├── requirements.txt
├── utils
│ ├── utility.py
│ └── additional_code.py
│
:
The relevant section of my serverless.yml looks as follows:
package:
  individually: true

functions:
  generate-features:
    handler: generate_features.handler
    module: generate_features
    timeout: 400
    ...
I would now like to include everything that is in the utils folder with each individually packaged function (there is more than one function, and they share some code).
Unfortunately, when I use the serverless-python-requirements plugin, it appears that it won't let me do that: it only includes whatever is in the module directory. I would basically like to add additional modules, though.
Any ideas? Am I missing some obvious way to include utils/? Adding
package:
  include:
    - utils/**
at the function level unfortunately doesn't seem to work.
Thanks.
I'm trying to pip install a GitHub project locally, outside of site-packages so that I can modify it, etc.
I've added -e git+git@github.com:Starcross/django-starcross-gallery.git#egg=gallery to my requirements.txt, which brings the relevant part of my project layout to look like this:
/home/mat/venv/proj/
└── src
└── gallery
├── admin.py
├── apps.py
├── build.sh
├── django_starcross_gallery.egg-info
│ ├── dependency_links.txt
│ ├── PKG-INFO
│ ├── requires.txt
│ ├── SOURCES.txt
│ └── top_level.txt
├── forms.py
├── __init__.py
├── LICENSE
├── MANIFEST.in
├── models.py
├── README.rst
├── settings.py
├── setup.py
├── signals.py
├── static
│ └── ...
├── templates
│ └── ...
├── tests
│ └── ...
├── tests.py
├── urls.py
└── views.py
As far as I can see, the problem is that these .egg-link and .pth files point one level too deep:
lib/python3.6/site-packages/django-starcross-gallery.egg-link:
/home/mat/venv/proj/src/gallery
.
lib/python3.6/site-packages/easy-install.pth:
/home/mat/venv/proj/src/gallery
I can fix everything by either moving gallery a level deeper, or changing django-starcross-gallery.egg-link and easy-install.pth to point to src.
Is there a config parameter I can pass in requirements.txt to make this work properly? Or do I have to adjust the project layout to fit?
Since you want to modify it, why not just clone the repo? To make your interpreter able to find and use it, you have some options:
modify your sys.path, appending the path to the repo
create a symlink under your project directory that points to the repo
And in this way, you don't have to pip install every time you modify it.
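For the sys.path option, a minimal sketch (the clone location is illustrative; it should be the directory that contains the cloned package):
import sys

# make the directory containing the cloned 'gallery' package importable
sys.path.append('/home/mat/venv/proj/src')

import gallery  # the cloned app can now be imported and edited in place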
As has been mentioned, the best way to do this is to clone the repo. This goes for most packages, as pip may build extensions and carry out other actions during install that are aimed at using the module in production rather than editing its source.
To explain why I chose this structure: I wanted to be able to develop the package inside a Django project. As the Django docs say, the app should be placed in a separate directory, which enables setuptools to install the package correctly. I could not find a way to make this continue to work inside a project, hence the build script to move the files into a suitable directory and generate the package.
I have the following directory structure:
├── DynamicProgramming
│ ├── 0-1_kp_problem.py
│ ├── b.py
│ ├── largest_contigous_subarray.py
│ ├── longest_common_substring.py
│ ├── min_change_for_given_money.py
│ ├── optimal_matrix_chain.py
│ ├── Readme.md
│ └── wis.py
├── helper
│ ├── a.py
│ └── __init__.py
└── Readme.md
The helper directory contains the library functions which will be used all over the code. How can I import the helper package from the scripts inside DynamicProgramming without adding it to the path?
Edit:
I cannot move the helper directory inside DynamicProgramming because more than one directory may use it.
You could use something like:
from ..helper import a
See the Python docs on packages.
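Note that relative imports only work when the modules are part of a package: both directories need an __init__.py (DynamicProgramming currently lacks one), and the script must be run as a module from the parent directory, for example:
$ python -m DynamicProgramming.wis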
If you run your code from the project root folder, you are likely to succeed with import helper or import helper.a. If not, you will have to add the current directory to PYTHONPATH:
$ export PYTHONPATH="."
Better: use a project setup.py
Instead of playing with PYTHONPATH (which can be a tricky business sometimes), you should create your project as a Python package.
Add a setup.py to your project root, specify the attributes of the package, and build it from there.
setup.py can define multiple packages at once, but it is more common to use only one. For this purpose it would be better to move the helper package into the DynamicProgramming structure and import it from there.
Search for setup.py Python packaging tutorials; it requires some study, but it will pay off.
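A minimal sketch of such a setup.py (the project name here is hypothetical):
from setuptools import setup, find_packages

setup(
    name='algorithms',         # hypothetical project name
    version='0.1',
    packages=find_packages(),  # picks up helper/ and any other directory with an __init__.py
)
After pip install -e ., import helper then works from anywhere in that environment.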
I would like to include a third-party library in my Python script folder so that I can distribute everything together (I am aware of the distribution license, and this library is fine to distribute). This is in order to avoid installing the library on another machine.
Say I have a script (my_script.py) which calls this external library. I tried to copy this library from the site-packages subdirectory of the Python directory into the directory containing my files, but it seems not to be enough (I think the reason is the __init__.py of this library, which probably needs the folder to be in the PYTHONPATH).
Would it be reasonable to insert some lines of code in my_script.py to temporarily append its folder to sys.path in order to make the whole thing work?
For instance, if I have a structure similar to this:
Main_folder
├── my_script.py
└── external_lib_folder
    ├── __init__.py
    └── external_lib.py
and external_lib_folder is the external library I copied from site-packages and inserted in my Main_folder, would it be fine if I write these lines (e.g.) in my_script.py?
import os,sys
main_dir = os.path.dirname(os.path.abspath(__file__))
sys.path.append(main_dir)
EDIT
I ended up choosing the sys.path.append solution. I added these lines to my my_script.py:
import os, sys
# temporarily appends the folder containing this file into sys.path
main_dir = os.path.join(os.path.dirname(os.path.abspath(__file__)),'functions')
sys.path.append(main_dir)
Anyway, I chose to insert this as an edit to my question and to accept Torxed's answer because of the time he spent helping me (and, of course, because his solution works as well).
Python3
import importlib.machinery

namespace = 'external_lib'
loader = importlib.machinery.SourceFileLoader(namespace, '/home/user/external_lib_folder/external_lib.py')
external_lib = loader.load_module(namespace)

# How to use it:
external_lib.function(data_or_something)
This would be an ideal way to load custom paths in Python 3. I'm not entirely sure it's what you wanted, but it's relevant enough to post as an alternative to adding to sys.path.
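Note that loader.load_module() is deprecated in newer Python 3 releases; an equivalent sketch using importlib.util (same illustrative path) would be:
import importlib.util

# build a module object directly from a file path
spec = importlib.util.spec_from_file_location(
    'external_lib', '/home/user/external_lib_folder/external_lib.py')
external_lib = importlib.util.module_from_spec(spec)
spec.loader.exec_module(external_lib)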
Python2
In Python 2 you could just do the following (if I'm not mistaken; it's been a while since I used an older version of Python):
external_lib = __import__('external_lib_folder')
This does, however, require you to keep the __init__.py and a proper declaration of functions in said script, otherwise it will fail.
It's also important that the folder you're trying to import from has the same name that the __init__.py script in said folder uses when importing its sub-libraries. For instance, geopy would be:
./myscript.py
./geopy/
./geopy/__init__.py
./geopy/compat.py
...
And the code of myscript.py would look like this:
handle = __import__('geopy')
print(handle)
Which would produce the following output:
[user@machine project]$ python2 myscript.py
<module 'geopy' from '/home/user/project/geopy/__init__.pyc'>
[user@machine project]$ tree -L 2
.
├── geopy
│ ├── compat.py
│ ├── compat.pyc
│ ├── distance.py
│ ├── distance.pyc
│ ├── exc.py
│ ├── exc.pyc
│ ├── format.py
│ ├── format.pyc
│ ├── geocoders
│ ├── __init__.py
│ ├── __init__.pyc
│ ├── location.py
│ ├── location.pyc
│ ├── point.py
│ ├── point.pyc
│ ├── units.py
│ ├── units.pyc
│ ├── util.py
│ ├── util.pyc
│ └── version.pyc
└── myscript.py
2 directories, 20 files
This is because the __init__.py of geopy defines imports such as from geopy.point import Point, which require a namespace or folder called geopy to be present.
Therefore you can't rename the folder to functions and place a folder called geopy in there, because that won't work; nor will placing the contents of geopy in a folder called functions, because that's not what geopy will look for.
Adding the path to sys.path (Py2 + 3)
As discussed in the comments, you can also add the folder to your sys.path variable prior to imports.
import sys
sys.path.insert(0, './functions')
import geopy
print(geopy)
>>> <module 'geopy' from './functions/geopy/__init__.pyc'>
Why this is a bad idea: it will work, and it is used by many. The problem is that you might shadow system modules, or other modules might get loaded from unexpected folders if you're not careful about where you import things from. Therefore, use .insert(0, ...) with care, and be sure you actually want to risk replacing system built-ins with "shady" path names.
What you suggest is bad practice; it is a fragile arrangement. The best solution (which is also easy to implement) is to package things properly and add an explicit dependency, like this:
from setuptools import setup

setup(name='funniest',
      version='0.1',
      description='The funniest joke in the world',
      url='http://github.com/storborg/funniest',
      author='Flying Circus',
      author_email='flyingcircus@example.com',
      license='MIT',
      packages=['funniest'],
      install_requires=[
          'markdown',
      ],
      zip_safe=False)
This will work if the third-party library is on PyPI. If it's not, use this:
setup(
    ...
    dependency_links=['http://github.com/user/repo/tarball/master#egg=package-1.0']
    ...
)
(See this explanation for packaging).
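Note that modern versions of pip ignore dependency_links; the current equivalent is a PEP 508 direct reference in install_requires. A sketch with the same illustrative names:
from setuptools import setup

setup(
    name='funniest',
    version='0.1',
    packages=['funniest'],
    install_requires=[
        # direct URL reference replacing the dependency_links entry
        'package @ git+https://github.com/user/repo@master',
    ],
)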
I am trying to document a package in Python. At the moment I have the following directory structure:
.
└── project
├── _build
│ ├── doctrees
│ └── html
│ ├── _sources
│ └── _static
├── conf.py
├── index.rst
├── __init__.py
├── make.bat
├── Makefile
├── mod1
│ ├── foo.py
│ └── __init__.py
├── mod2
│ ├── bar.py
│ └── __init__.py
├── _static
└── _templates
This tree is the result of the firing of sphinx-quickstart. In conf.py I uncommented sys.path.insert(0, os.path.abspath('.')) and I have extensions = ['sphinx.ext.autodoc'].
My index.rst is:
.. FooBar documentation master file, created by
   sphinx-quickstart on Thu Aug 28 14:22:57 2014.
   You can adapt this file completely to your liking, but it should at least
   contain the root `toctree` directive.

Welcome to FooBar's documentation!
==================================

Contents:

.. toctree::
   :maxdepth: 2
Indices and tables
==================
* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`
In all the __init__.py files I have a docstring, and the same goes for the modules foo.py and bar.py. However, when running make html in the project I don't see any of the docstrings.
Here is an outline:
Document your package using docstrings in the sources.
Use sphinx-quickstart to create a Sphinx project.
Run sphinx-apidoc to generate .rst sources set up for use with autodoc. More information here.
Using this command with the -F flag also creates a complete Sphinx project. If your API changes a lot, you may need to re-run this command several times.
Build the documentation using sphinx-build.
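Concretely, for the layout in the question, the commands might look like this (the output paths are illustrative):
$ cd project
$ sphinx-apidoc -o . .                  # generate .rst files for the packages found here
$ sphinx-build -b html . _build/html    # build the HTML documentation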
Notes:
Sphinx requires .rst files with directives like automodule or autoclass in order to generate API documentation; it does not automatically extract anything from the Python sources without these files. This is different from how tools like Epydoc or Doxygen work. The differences are elaborated a bit more here: What is the relationship between docutils and Sphinx?. A sketch of such a generated file is shown after these notes.
After you have run sphinx-apidoc, it may be necessary to adjust sys.path in conf.py for autodoc to find your modules.
In order to avoid strange errors like in these questions, How should I solve the conflict of OptionParser and sphinx-build in a large project?, Is OptionParser in conflict with sphinx?, make sure that the code is properly structured, using if __name__ == "__main__": guards when needed.
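As an example of the .rst sources mentioned in the notes above, a page generated by sphinx-apidoc for mod1.foo would look roughly like this:
mod1.foo module
===============

.. automodule:: mod1.foo
   :members:
   :undoc-members:
   :show-inheritance: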