I am creating a tool that I plan to put on GitHub. It is very general; however, I also want to include some examples of its usage. This tool is for Python 3. As such, I have set up the following folder structure:
repository/
    commonTool.py
    commonTool2.py
    specificUsage/
        runTheSpecificUsage.py
        helpRunTheSpecificUsage.py
Both of the scripts in the specificUsage/ folder import functions from commonTool.py and commonTool2.py.
My issue is that, ideally, the user would be able to run:
python repository/specificUsage/runTheSpecificUsage.py
However, I can't get this to work. The script is never able to import the functions from the folder above it. I have tried the suggestions from a variety of posts on how to import a file from a parent folder, with no luck.
How should I be setting up these folders? Should I have one __init__.py or two? Where?
You can create a setup.py at the top level, alongside commonTool.py and commonTool2.py. Inside your setup.py, put the following lines:
import setuptools
setuptools.setup(packages=setuptools.find_packages())
Then put an __init__.py at that same top level, as well as in specificUsage/. You should be able to import from specificUsage like this:
import commonTool
After setting up your files, from the top level run:
pip install -e .
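For reference, a minimal sketch of the resulting tree after these steps (the inline comment marks where the import now works):

repository/
    setup.py
    __init__.py
    commonTool.py
    commonTool2.py
    specificUsage/
        __init__.py
        runTheSpecificUsage.py    # can now do: import commonTool
        helpRunTheSpecificUsage.py

With the editable install in place, python repository/specificUsage/runTheSpecificUsage.py should then be able to resolve commonTool regardless of the current working directory.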
You may also want to consider that Python has strong naming conventions for modules and packages and generally frowns upon camelCase; commonTool.py, for example, would conventionally be named common_tool.py. You can read more in PEP 8 if you like.
I am struggling a bit to get a package to work, so I would like to start with the simplest possible test case.
Here is my current attempt.
Inside a folder called Python_experiment I have two files: a Jupyter notebook with the code
from .pleasejustwork import eat_muffin
and a file called pleasejustwork.py with the code
def eat_muffin():
    print('i ate a muffin')
When I run that line from the Jupyter notebook I get the error "attempted relative import with no known parent package". What am I missing?
The syntax you’re using is for a Python package. To achieve this, place a file named __init__.py in the directory; it can be empty. If you’d like to use that package from anywhere, write a setup.py script to build and install your package. There are a lot of good tutorials on how to do that, for example in the Python docs. Then you could do
from python_experiment.pleasejustwork import eat_muffin
or, from another file in the package,
from .pleasejustwork import eat_muffin
If you’re just trying to learn about modules, you can simply take out that leading period:
from pleasejustwork import eat_muffin
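For instance, a bare-bones setup.py next to the package folder might look like this (the name and version are assumptions for illustration):

from setuptools import setup, find_packages

setup(
    name='python_experiment',  # assumed name, pick your own
    version='0.1',
    packages=find_packages(),
)

After pip install -e . from that folder, the from python_experiment.pleasejustwork import eat_muffin form works from anywhere (note that the package folder itself would need the lowercase name python_experiment to match).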
I have created a Python package which builds on the structure indicated in Kenneth Reitz' "Repository Structure and Python" (1). The main package path is:
/projects-folder (not site-packages)
    /package
        /package
            __init__.py
            Datasets.py
            Draw.py
            Gmaps.py
            ShapeSVG.py
            project.py
        __init__.py
        setup.py
With the current structure, I must use the following module import syntax:
import package.package.Datasets
I would prefer to type the following:
import package.Datasets
I am capable of typing the same word twice, of course, but it feels wrong in a deeper sense, i.e., I am structuring my package incorrectly or misunderstanding how Python interprets that structure.
The outer __init__.py is required for Python to detect this package at all, per the docs (2). But that sets up /package/ as the top level of the package and /package/package/ as a sub-package, forcing me into the unwieldy import syntax above.
To avoid this, it seems that my options are to:
Create a package in which the outer folder contains the top level of package modules.
Add the inner folder to my PYTHONPATH environment variable.
Yet both of these seem like suboptimal workarounds for something that shouldn't be an issue in the first place. What should I do?
You've misunderstood. You have two nested folders named package for some reason, but the source you cite never said to do that. The outer folder, the one with setup.py, is not supposed to be a package.
It sounds like you're running Python in projects-folder and trying to import your package from there. That's not what you should be doing. You have several options to get your package into the import system. (I'll refer to the folder with setup.py in it as setupfolder, to distinguish it from the inner folder):
Build your package with setup.py, for example python setup.py bdist_wheel --universal, and install the built wheel with pip.
Skip the build step and just run pip install path/to/setupfolder. Building the package produces an installer useful if you want to distribute your package, but maybe you don't want to do that.
"Install" the package's source tree in development mode with pip install -e path/to/setupfolder, so the Python import system will locate the package's source tree when performing imports. This is handy because you don't have to rebuild and reinstall if you edit the source repository, although you'll still want to restart any running Python processes that are using the package.
Run Python directly from inside the setupfolder.
Any of these options will cause your package to be importable directly as package instead of package.package, as it should be.
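For instance, after any of these options, a quick check in a hypothetical interpreter session:

import package.Datasets       # the inner folder is now the top-level package
from package import Datasets  # equivalent effect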
While I do not entirely agree with your package structure, you can make use of __all__ and possibly the one legitimate use for star imports I've seen so far. __init__.py can serve more purposes than just identifying your folder as a package or sub-package.
Using a Star Import
In package/package/__init__.py, add a variable __all__ that declares all the public elements you want to export:
__all__ = ['Datasets', 'Draw', 'Gmaps', 'ShapeSVG', 'project']
In package/__init__.py do from package.package import *. Now all the attributes that were available as package.package.x will also be available as package.x.
If you want to directly copy package.package.__all__ to package.__all__ (which is optional, but will allow you to do from package import * properly), you can do something like
from package.package import *
from package.package import __all__ as _all
__all__ = _all
del _all
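With that wiring in place, a quick usage sketch (hypothetical session):

import package

package.Datasets       # the same module as package.package.Datasets
from package import *  # pulls in everything listed in package.__all__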
Not Using a Star Import
You can accomplish the same thing without using package.package.__all__ at all. Just add __all__ directly to package/__init__.py and use from package.package import x-style imports:
from package.package import (
    Datasets, Draw, Gmaps, ShapeSVG, project
)
# As before, package.__all__ is optional
__all__ = ['Datasets', 'Gmaps', 'ShapeSVG', 'project']
I would still recommend having a package.package.__all__ variable, but it is optional for this particular purpose.
Pros and Cons
Both approaches are pretty legitimate and I have seen both used in major projects. The first approach reduces redundancy. You only define the public exports in one place: package.package.__all__. The star imports and package.__all__ reference that definition directly, leading to one place that you really have to maintain. On the other hand, there are times when you want to separate the "full" package.package.x API from what you expose via package.x to the casual user. In that case, go with the second option. The only downside here is that you have to be more careful to keep package.__all__ and the corresponding imports synchronized properly.
Note
A number of projects I've seen (numpy especially comes to mind) export the attributes of the individual modules to the top level using this technique. For example, if you had a function package.package.Datasets.get_data, it would be listed in package.package.Datasets.__all__, which would be imported into package.package.__init__, appended to package.package.__all__, and then referenced by the top-level package and package.__all__.
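A sketch of that chain, using the hypothetical get_data from above:

# package/package/Datasets.py
__all__ = ['get_data']

def get_data():
    ...

# package/package/__init__.py
from .Datasets import *
from .Datasets import __all__ as _datasets_all
__all__ = ['Datasets', 'Draw', 'Gmaps', 'ShapeSVG', 'project'] + list(_datasets_all)
del _datasets_all

The star import in package/__init__.py from earlier then carries get_data up to the top level as package.get_data.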
I used the solution below for importing dependencies.
I found that it works if I run the code in PyCharm, but not in the Terminal, where the error message is "cannot find graphics.primitive".
I'm using a Mac and Python 3.5.
Why do I see different behaviors in the Terminal and in PyCharm?
How can I make the solution work for both?
http://chimera.labs.oreilly.com/books/1230000000393/ch10.html#_solution_169
Making a Hierarchical Package of Modules
Problem
You want to organize your code into a package consisting of a hierarchical collection of modules.
Solution
Making a package structure is simple. Just organize your code as you wish on the file system and make sure that every directory defines an __init__.py file. For example:
graphics/
    __init__.py
    primitive/
        __init__.py
        line.py
        fill.py
        text.py
    formats/
        __init__.py
        png.py
        jpg.py
Once you have done this, you should be able to perform various import statements, such as the following:
import graphics.primitive.line
from graphics.primitive import line
import graphics.formats.jpg as jpg
You need to make sure that the graphics package is in the Python search path. PyCharm does this by extending sys.path as follows:
import sys
sys.path.extend(['/Users/hackworth/Development/graphics_parent_dir', '/Applications/PyCharm.app/Contents/helpers/pycharm', '/Applications/PyCharm.app/Contents/helpers/pydev'])
You can do the same in your code, replacing /Users/hackworth/Development/graphics_parent_dir with the appropriate path, or you can include the full path to graphics_parent_dir in the PYTHONPATH environment variable. See the Python documentation for details.
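For example, a sketch that derives the path at run time instead of hard-coding it (this assumes the running script sits one level below graphics_parent_dir; adjust for your layout):

import os
import sys

# Directory that contains the graphics/ package (assumed layout).
graphics_parent_dir = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
if graphics_parent_dir not in sys.path:
    sys.path.append(graphics_parent_dir)

from graphics.primitive import line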
Another option would be to place the graphics package into a location that is searched by default on your system.
I'm self-taught in the Python world, so some of the structural conventions are still a little hazy to me. However, I've been getting very close to what I want to accomplish, but just ran into a larger problem.
Basically, I have a directory structure like this, which will sit outside of the normal Python installation (it is to be distributed to people who should not have to know what a Python installation is, but who will have the one that comes standard with ArcGIS):
top_directory/
    ArcToolbox.tbx
    scripts/
        ArcGIStool.py (script for the tool in the .tbx)
        pythonmod/
            __init__.py
            general.py
            xlrd/ (copied from my own python installation)
            xlwt/ (copied from my own python installation)
            xlutils/ (copied from my own python installation)
So, I like this directory structure, because all of the ArcGIStool.py scripts call functions within the pythonmod package (like those within general.py), and all of the general.py functions can call xlrd and xlwt functions with simple "import xlrd" statements. This means that if the user desired, he/she could just move the pythonmod folder to the python site-packages folder, and everything would run fine, even if xlrd/xlwt/xlutils are already installed.
THE PROBLEM:
Everything is great, until I try to use xlutils in general.py. Specifically, I need to "from xlutils.copy import copy". However, this sets off a cascade of import errors. One is that xlutils/copy.py uses "from xlutils.filter import process,XLRDReader,XLWTWriter". I solved this by modifying xlutils/copy.py like this:
try:
    from xlutils.filter import process, XLRDReader, XLWTWriter
except ImportError:
    from filter import process, XLRDReader, XLWTWriter
I thought this would work fine for other situations, but there are modules in the xlutils package that need to import xlrd. I tried following this advice, but when I use
try:
    import xlrd
except ImportError:
    import os, sys, imp
    path = os.path.dirname(os.path.dirname(sys.argv[0]))
    xlrd = imp.load_source("pythonmod.xlrd", os.path.join(path, "xlrd", "__init__.py"))
I get a new import error: in xlrd/__init__.py, the info module is imported (from xlrd/info.py), BUT when I use the above code, I get an error saying that the name "info" is not defined.
This leads me to believe that I don't really know what is going on, because I thought that when the __init__.py file was imported it would run just like normal and look within its containing folder for info.py. This does not seem to be the case, unfortunately.
Thanks for your interest, and any help would be greatly appreciated.
p.s. I don't want to have to modify the path variables, as I have no idea who will be using this toolset, and permissions are likely to be an issue, etc.
I realized I was using imp.load_source incorrectly. The first argument is the name the module is registered under in sys.modules, so the bundled copy has to be registered as plain xlrd for its internal imports to resolve. The correct syntax for what I wanted to do should have been:
imp.load_source("xlrd", os.path.join(path, "xlrd", "__init__.py"))
In the end though, I ended up rewriting my code to not need xlutils at all, because I continued to have import errors that were causing many more problems than were worth dealing with.
I have the following structure in a Python program:
my_program/
    main.py
    packages/
        __init__.py
        package_to_share/
            __init__.py
            main_to_share.py
            module_to_share.py
        package_A/
            __init__.py
            main_A.py
            some_module_A.py
        package_B/
            __init__.py
            main_B.py
            some_module_B.py
The package package_to_share provides functionality that every package in the packages folder uses and that main.py at the root folder uses.
I also want to be able to cd into each package and be able to run main_X.py.
So far I figured out how to access functionality from main.py:
import packages.package_A.some_module_A
import packages.package_to_share.module_to_share
but I am having problems accessing the functionality in package_to_share from the regular packages (e.g. package_A).
For example, when in main_A.py or some_module_A.py, typing import packages.package_to_share.module_to_share fails.
This leads me to the following questions:
Given the specifics of my problem, with packages to be shared (accessed) by files at the root folder and by other packages, is there a better way to organize my folders? Does this organization of modules and files in general conform to good standards in Python?
The following looks incredibly hacky to me, but it's the best thing I came up with to make sure my regular modules see the "shared" modules:
import os, sys

# Prepend the parent folder (the one containing packages/) to sys.path.
p_this_file = os.path.dirname(os.path.realpath(__file__))
new_sys_path = [os.path.join(p_this_file, '..')]
new_sys_path.extend(sys.path)
sys.path = new_sys_path
It also does not prevent my regular packages from importing each other.
Rather than manipulating the path for imports (which I don't recommend), you could use relative imports within package_A:
from .. import shared
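Mapped onto the layout in the question, that might look like this (it assumes the modules are imported via main.py at the top level, so that packages is the parent package; running main_A.py directly would break the relative import):

# packages/package_A/some_module_A.py
from ..package_to_share import module_to_share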
That is what I sometimes do, though I generally have my packages installed so I can reference them fully where I need to use them:
from my_module import shared