Currently, I have a Python project I'm working on with a directory structure like this:
tests/
    corpus/
        __init__.py
        tests.py
monkey/
    corpus/
        corpus.py
setup.py
and I want tests.py (in tests/corpus) to import corpus.py (in monkey/corpus).
I've seen many solutions that involve using relative imports and sys.path, but I've also seen people directly import using (for instance)
import monkey.corpus
How can I set up my code to be able to import anything in the root folder like this? I've seen glimpses of ideas that it might be possible through configuring setup.py. Is this true?
Thanks a bunch. My apologies for diluting this wonderful site with one more relative import-esque question. :)
Sincerely,
linuxuser
After doing some research, I found that I needed to add an empty __init__.py in the inner corpus directory and put a line in my .bashrc appending that directory to PYTHONPATH.
This is what my .bashrc looks like now:
...
export PYTHONPATH=$PYTHONPATH:/home/username/monkey/corpus
...
At first it seemed unusual to have to edit my .bashrc to get access to a library, but from what I've heard, that is the typical way to edit your environment, and therefore a proper way to make Python libraries available.
A great resource to find info about PYTHONPATH is this blog post: http://www.stereoplex.com/blog/understanding-imports-and-pythonpath
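For reference, the setup.py route asked about above does work too. Here is a minimal sketch, assuming monkey/ also gets an empty __init__.py so that find_packages() can discover the package:

from setuptools import setup, find_packages

setup(
    name='monkey',
    version='0.1',
    packages=find_packages(exclude=['tests', 'tests.*']),
)

Running pip install -e . from the project root then installs the package in development mode, after which import monkey.corpus works from anywhere without touching PYTHONPATH.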
So I have been fiddling around, as well as conducting some serious work, with Python for quite some time. Still, I run into issues with it every once in a while.
I find it to be the most comfortable using PyCharm CE when working with Python. The typical scenario is that I just create a new virtualenv, launch PyCharm and open up my virtualenv there. And from there on out, it's like on auto-pilot, PyCharm handles all the dirty work related to which site-packages and Python runtime to use.
I always like to keep my virtualenvs clean and organized, so I often find myself semantically organizing my source code into submodules/subfolders. So whenever I want to import some code, or class, or whatever from another folder I just import it.
Imagine I have the following structure in my virtualenv:
├── bin
├── include
├── lib
└── src
    ├── foo.py
    ├── important_scripts
    │   ├── some_script.py
    │   └── some_other_script.py
    └── pip-selfcheck.json
Now, somewhere in foo.py, I want to use a function named A() that is implemented in some_script.py. The obvious way would be to add a simple line to foo.py - something like from some_script import A. Doing so works perfectly when I run and debug my code (foo.py in this case) from PyCharm.
As opposed to the typical scenario I described above, I wanted to do the same from Terminal.app. So I fire up the Terminal, cd to my virtualenv and activate it. Then, using the Python executable under the bin folder of my virtualenv, I try to run foo.py (at least, this is what I think is the equivalent of right-clicking and running foo.py from the PyCharm window). Unfortunately, I get the error ModuleNotFoundError: No module named 'some_script'.
I think I am missing a simple detail or something. Because like I said, it works like magic when run from PyCharm.
Anyways, any advice or help will be highly appreciated. Thanks in advance.
Thanks for all the responses and references to possible solutions. While researching online, I have come across various instances of more or less the same problem that people were having while importing modules and packages. So, this is how I have just resolved it:
Under the important_scripts directory, I have included a file named __init__.py. This basically just tells the Python interpreter that this is indeed a Python package, rather than an ordinary subdirectory.
In this __init__.py, I have added the line
from important_scripts.some_script import A
Then, in the script from which I will be importing the function A (that is, foo.py), I have included the following lines:
import os
import sys
# append the grandparent directory of this file (the virtualenv root) to the module search path
sys.path.append(os.path.dirname(os.path.dirname(__file__)))
which basically appends the virtualenv root to sys.path, so the interpreter searches it for packages much like it searches site-packages.
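Putting the pieces together, foo.py might end up looking like this (the call at the bottom is hypothetical, just to show that A is usable):

import os
import sys

# make the directory above src/ (the virtualenv root) searchable
sys.path.append(os.path.dirname(os.path.dirname(__file__)))

from important_scripts import A  # re-exported by important_scripts/__init__.py

A()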
G'day,
Being a total Python noob when it comes to packaging and module organisation...
Given the following (simplified) structure:
.
├── bin
│   └── fos.py
└── lib
    └── drac.py
And given the fact that, when installed, the contents of the lib folder will go somewhere into /usr/local/share/pyshared and the contents of the bin folder somewhere into /usr/bin, how do I persuade this whole thing to import my modules from ../lib when in VCS mode, and to work like it should when installed, i.e. from modulename.drac import bla, while preferably keeping the imports the same?
Yes, I've read the Python docs on module organisation and structure; I just can't seem to wrap my head around the best practices. Asking for best practices on SO is stupid, hence this concrete example, which I run into on a more or less daily basis.
Is this structure acceptable? If so, how do I organise the imports? If not, what would be the Pythonic way to redo it?
Thanks!
I think you are bucking the idiom here. What you are describing is similar to the old C ld_lib paradigm.
There is nothing wrong with a Python project sourcing modules out of its own local file tree. Alternatively, if your code is really that separate and your lib has a well-defined API, then you should package it separately and import/install it using easy_install, pip, or a setup.py.
Generally, if the code appears to be evolving together, it is best to just leave it together. Install it wherever you install your Python code (/opt, etc.) and symbolically link the executables into /usr/local/bin.
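For completeness, if you do keep the bin/lib split from the question, one common trick is a try/except import fallback, so the same fos.py runs from both a VCS checkout and the installed location. A sketch, assuming the installed package is named modulename:

import os
import sys

try:
    from modulename import drac  # installed layout
except ImportError:
    # VCS checkout: bin/ and lib/ are siblings, so put ../lib on the path
    lib_dir = os.path.join(os.path.dirname(os.path.abspath(__file__)), '..', 'lib')
    sys.path.insert(0, lib_dir)
    import drac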
I am pretty new to Python, so this might sound obvious, but I haven't found it anywhere else.
Say I have an application (module) in the directory A/, and then I start developing an application/module in another directory B/.
So right now I have
source/
|_A/
|_B/
From A I want to use functions or classes defined in B. I might eventually pull them out and put them in a "misc" or "util" module.
In any case, what is the best way to add module B to the PYTHONPATH so that A can see it, taking into account that I will also be making changes to B?
So far I came up with something like:
def setup_paths():
    import sys
    sys.path.append('../B')
when I want to develop something in A that uses B but this just does not feel right.
Normally when you are developing a single application your directory structure will be similar to
src/
|-myapp/
| |-__init__.py
| |-pkg_a/
| | |-__init__.py
| | |-foo.py
| |-pkg_b/
| | |-__init__.py
| | |-bar.py
| |-myapp.py
This lets your whole project be reused as a package by others. In myapp.py you will typically have a short main function.
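For illustration, myapp.py could be as small as this (foo.run() is a hypothetical entry point, not something prescribed by the layout):

from myapp.pkg_a import foo

def main():
    foo.run()  # delegate the real work to the sub-packages

if __name__ == '__main__':
    main()

Run it from the src directory with python -m myapp.myapp, so that the myapp package is importable.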
You can import other modules of your application easily. For example, in pkg_b/bar.py you might have
import myapp.pkg_a.foo
I think it's the preferred way of organising your imports.
You can do relative imports if you really want, they are described in PEP-328.
from ..pkg_a import foo
but personally I think they are a bit ugly and difficult to maintain (that's arguable, of course).
Of course, if one of your modules needs a module from another application it's a completely different story, since this application is an external dependency and you'll have to handle it.
I would recommend using the imp module:
import imp
# load the file directly; this returns the module object and registers it in sys.modules as 'module'
module = imp.load_source('module', '../B/module.py')
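Note that imp is deprecated on newer Pythons; the importlib equivalent of the same trick is:

import importlib.util

# load ../B/module.py directly from its path under the name 'module'
spec = importlib.util.spec_from_file_location('module', '../B/module.py')
module = importlib.util.module_from_spec(spec)
spec.loader.exec_module(module)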
Otherwise, use an absolute path starting from the root:
def setup_paths():
    import sys
    sys.path.append('/path/to/B')
Add the proper folder to PYTHONPATH
First, create a root directory for the package you are creating. Let's call it project.
The subdirectories of project are your apps, that is, A and B. Now you have to add the parent directory of project to PYTHONPATH.
Referring to modules inside the package
Let's say you have other_app.py in B; in A/app_name.py you import it like this:
from project.B.other_app import *
Or, if you want to keep all the symbols in the other_app namespace, import it like this:
from project.B import other_app
A nice way of creating launchers and dynamically changing PYTHONPATH
If you want to create universal app launchers for Python that keep working even when you move your package to another PC or directory, you need some way to dynamically add the package's parent directory to PYTHONPATH. Here is my solution for this case (for Linux, but you can also translate these simple scripts to Windows if you google a bit :) )
In the same folder in which you created project, create pwd_to_pythonpath.sh, a short bash script:
#!/bin/bash
export PYTHONPATH=$PYTHONPATH:`pwd`
Then create a launcher for each of A and B; for example, A.sh would look like:
#!/bin/bash
source pwd_to_pythonpath.sh
python -m project.A.app_name
The app_name should be the same as the module file name in the A folder (app_name.py in this case).
Here is my structure,
main.py
folder1\
    button.py
folder2\
    picturebutton.py
folder3\
    listbox.py
folder4\
    customlistbox.py
folder5\
    hyperlistbox.py
Now,
I have a module called "widget.py", and I would like to make it accessible to all the modules here, so that each module can simply say import widget. After googling, it appears that I have to make a package to do this.
I could not get anywhere with the examples online, as I have no idea how they work, and I am hoping that one of you may be able to help me with my case.
Edit:
All the folders (except for the root one) have an __init__.py file.
Being able to import some other module does not require it to be a package; it requires the widget module to be on your PYTHONPATH. You'd do that typically by installing it (writing a setup.py file; see the standard library's distutils module).
If you did want a package though, every folder that needs to be a package needs to have an __init__.py file in it (empty is fine).
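A minimal sketch of such a setup.py for a single-module distribution, using the distutils module mentioned above (name and version are placeholders):

from distutils.core import setup

setup(
    name='widget',
    version='0.1',
    py_modules=['widget'],  # installs widget.py onto the module search path
)

After installing it (e.g. python setup.py install), every module in the project can simply import widget.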
The proper way is to create a setup.py file for your package, but since that may take time, below is a shortcut.
If you use your module frequently, e.g. in scripts, the easy way is to export PYTHONPATH in your bashrc/zshrc file, giving the path to the directory containing your code.
For example:
export PYTHONPATH=$PYTHONPATH:$HOME/path/to/package
Then check it on the terminal using
echo "$PYTHONPATH"
Happy Coding
What would be the best directory structure strategy to share a utilities module across my Python projects? Since the common modules will keep being updated with new functions, I would not want to put them in the Python install directory.
project1/
project2/
sharedUtils/
From project1 I cannot use "import ..\sharedUtils"; is there any other way? I would rather not hardcode the sharedUtils location.
Thanks in advance
Directory structure:
project1/foo.py
sharedUtils/bar.py
With the directories as you've shown them, from foo.py inside the project1 directory you can add the relative path to sharedUtils as follows:
import sys
sys.path.append("../sharedUtils")  # resolved relative to the current working directory
import bar
This avoids hardcoding a C:/../sharedUtils path, and will work as long as you run foo.py from the project1 directory and don't change the directory structure.
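If you want the import to work no matter which directory you launch foo.py from, a variant that anchors the path to the file's own location is:

import os
import sys

# resolve sharedUtils relative to this file, not the current working directory
here = os.path.dirname(os.path.abspath(__file__))
sys.path.append(os.path.join(here, '..', 'sharedUtils'))

import bar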
Make a separate standalone package, and put it in the site-packages directory of your Python install?
There is also my personal favorite when it comes to development mode: use of symlinks and/or *.pth files.
Suppose you have sharedUtils/utils_foo and sharedUtils/utils_bar.
You could edit your PYTHONPATH to include sharedUtils, then import them in project1 and project2 using
import utils_foo
import utils_bar
etc.
On Linux you could do that by editing ~/.profile with something like this:
PYTHONPATH=/path/to/sharedUtils:/other/paths
export PYTHONPATH
Using the PYTHONPATH environment variable affects the directories that Python searches when looking for modules. Since every user can set their own PYTHONPATH, this solution is good for personal projects.
If you want all users on the machine to be able to import the modules in sharedUtils, you can achieve this with a .pth file. Exactly where you put the .pth file may depend on your Python distribution. See Using .pth files for Python development.
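For example, a file named sharedUtils.pth containing just the single line below (the path is a placeholder), dropped into site-packages, adds that directory to sys.path for every interpreter run:

/path/to/sharedUtils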