I have been fiddling around with Python, as well as doing some serious work with it, for quite some time. Still, I run into issues with it every once in a while.
I find it most comfortable to use PyCharm CE when working with Python. The typical scenario is that I create a new virtualenv, launch PyCharm and open the virtualenv there. From there on out it's like auto-pilot: PyCharm handles all the dirty work of deciding which site-packages and Python runtime to use.
I like to keep my virtualenvs clean and organized, so I often find myself organizing my source code semantically into submodules/subfolders. Whenever I want to use some code, a class, or whatever from another folder, I just import it.
Imagine I have the following structure in my virtualenv:
├── bin
├── include
├── lib
└── src
    ├── foo.py
    ├── important_scripts
    │   ├── some_script.py
    │   └── some_other_script.py
    └── pip-selfcheck.json
Now, somewhere in foo.py, I want to use a function named A() that is implemented in some_script.py. The obvious way would be to add a simple line to foo.py, something like from some_script import A. Doing so works perfectly when I run and debug my code (foo.py in this case) from PyCharm.
In contrast to the typical scenario described above, I wanted to do the same from Terminal.app. So I fire up the Terminal, cd to my virtualenv and activate it. Then, using the Python executable under the bin folder of my virtualenv, I try to run foo.py (at least this is what I think is the equivalent of right-clicking and running foo.py from the PyCharm window). Unfortunately, I get the error ModuleNotFoundError: No module named 'some_script'.
I think I am missing a simple detail or something. Because like I said, it works like magic when run from PyCharm.
Anyways, any advice or help will be highly appreciated. Thanks in advance.
Thanks for all the responses and references to possible solutions. While researching online, I have come across various instances of more or less the same problem that people were having while importing modules and packages. So, this is how I have just resolved it:
Under the important_scripts directory, I have included a file named __init__.py. This basically just tells the Python interpreter that this is indeed a Python package, rather than an ordinary subdirectory.
In this __init__.py, I have added in the line
from important_scripts.some_script import A
Then, in the script from which I will be importing the function A (that is, foo.py), I have included the following lines:
import os
import sys
sys.path.append(os.path.dirname(os.path.dirname(__file__)))
which appends the parent directory of foo.py's folder (the virtualenv root) to Python's module search path (sys.path).
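For reference, foo.py now looks roughly like this (just a sketch of the pieces described above; A() is the example function from earlier):

# foo.py -- rough sketch of the final version
import os
import sys

# append the directory two levels above this file (the virtualenv root)
# to Python's module search path
sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

# A is re-exported by important_scripts/__init__.py
from important_scripts.some_script import A

A()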
Related
In putting together a convolutional model on my computer, I want to use the convert_to_one_hot(...) utility written by Coursera/Ng. I can download their module test_utils.py, where I expect the one-hot converter to be. How do I incorporate test_utils.py into my tools? With import? Install? Where do I put test_utils.py, or similar collections?
I'm guessing you are referring to some built-in code that Andrew Ng gave. In that case, if you know it's a single file, you can just put it wherever you want in your project and make sure that you reference the nearest __init__.py inside your project. So, for example, imagine your project looks something like this:
YourProject
    app.py
    my_module
        __init__.py
        test_utils.py
    config.py
If you were working on the app.py file and wanted to use the test_utils.py function called one_hot_encoder(), you could simply do
from my_module.test_utils import one_hot_encoder
And that's because there is an __init__.py file inside the my_module folder, which makes the Python interpreter treat that folder as a package; it then looks for the module you are specifying (test_utils) and returns whatever you choose to import, which could be a class, a function, or even a variable like a list.
NOTE: For this to work, make sure that you also add any other necessary scripts to your module, or pip install the libraries used in the script.
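For example, app.py could then use it roughly like this (just a sketch; the exact input that one_hot_encoder() expects depends on the Coursera utility, so the label list below is only an assumption):

# app.py -- rough usage sketch; the input format is an assumption about the Coursera utility
from my_module.test_utils import one_hot_encoder

labels = [0, 2, 1, 1]              # hypothetical label data
encoded = one_hot_encoder(labels)  # whatever the utility returns for these labels
print(encoded)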
G'day,
I'm a total Python noob when it comes to packaging and module organisation...
Given the following (simplified) structure:
.
├── bin
│   └── fos.py
└── lib
    └── drac.py
Given that, when installed, the contents of the lib folder will go somewhere into /usr/local/share/pyshared and the contents of the bin folder somewhere into /usr/bin, how do I persuade this whole thing to import my modules from ../lib when in VCS mode, and work like it should when installed, i.e. from modulename.drac import bla, while preferably keeping the imports the same?
Yes, I've read the Python docs on module organisation and structure; I just can't seem to wrap my head around some of the best practices. Asking for best practices on SO is stupid, hence this concrete example, which I run into on a more or less daily basis.
Is this structure acceptable, if so, how do I organise the imports? If not, what would be the pythonic way to redo it?
Thanks!
I think you are bucking the idiom here. What you are describing is similar to the old C ld_lib paradigm.
There is nothing wrong with a Python project sourcing modules out of its own local file tree. Alternatively, if your code is really that separate and your lib has a well-defined API, then you should package it separately and install it using easy_install, pip, or a setup.py.
Generally, if the code appears to be evolving together, it's best to just leave it together. Install it wherever you install your Python code (/opt, etc.) and symbolically link the executables into /usr/local/bin.
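If you do go the packaging route, a minimal setup.py sketch for the layout above could look something like this (the package name "modulename" is just a placeholder, and lib would need an __init__.py for the mapping to work):

# setup.py -- minimal sketch; "modulename" is a placeholder package name
from setuptools import setup

setup(
    name="modulename",
    version="0.1",
    package_dir={"modulename": "lib"},  # expose the lib folder as the "modulename" package
    packages=["modulename"],
    scripts=["bin/fos.py"],             # installed scripts end up on the PATH
)

After installing (e.g. pip install .), from modulename.drac import bla should resolve exactly as in the installed case described above.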
Currently, I have a Python project I'm working on with a directory structure like this:
tests/
    corpus/
        __init__.py
        tests.py
monkey/
    corpus/
        corpus.py
setup.py
and I want tests.py (in tests/corpus) to import corpus.py (in monkey/corpus).
I've seen many solutions that involve using relative imports and sys.path, but I've also seen people directly import using (for instance)
import monkey.corpus
How can I set up my code to be able to import anything in the root folder like this? I've seen glimpses of ideas that it might be possible through configuring setup.py. Is this true?
Thanks a bunch. My apologies for diluting this wonderful site with one more relative import-esque question. :)
Sincerely,
linuxuser
After doing some research, I found that I needed to add an empty __init__.py in the inner corpus directory and put a line in my .bashrc appending it to PYTHONPATH.
This is what my .bashrc looks like now:
...
export PYTHONPATH=$PYTHONPATH:/home/username/monkey/corpus
...
At first it seemed unusual to have to edit my .bashrc to access a library, but from what I've heard it is the typical way to adjust your environment, and therefore a proper way to make Python libraries accessible.
A great resource to find info about PYTHONPATH is this blog post: http://www.stereoplex.com/blog/understanding-imports-and-pythonpath
Here is my structure,
main.py
folder1\
    button.py
    folder2\
        picturebutton.py
        folder3\
            listbox.py
            folder4\
                customlistbox.py
                folder5\
                    hyperlistbox.py
Now,
I have a module called "widget.py" and I would like to make it accessible to all the modules here, so that each module will be able to say import widget or something of the sort. After googling, it appears that I have to make a package to do this.
I could not get anywhere with the examples online, as I have no idea how they work, and I am hoping that one of you may be able to help me with my case.
Edit:
All the folders (except for the root one) have an __init__.py file.
Being able to import some other module does not require it to be a package; it requires the widget module to be on your PYTHONPATH. You'd typically do that by installing it (writing a setup.py file; see the standard library's distutils module).
If you did want a package though, every folder that needs to be a package needs to have an __init__.py file in it (empty is fine).
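For instance, a bare-bones setup.py that installs widget.py as a single module could look roughly like this (name and version are placeholders):

# setup.py -- bare-bones sketch using the standard-library distutils mentioned above
from distutils.core import setup

setup(
    name="widget",
    version="0.1",
    py_modules=["widget"],  # installs widget.py so that "import widget" works everywhere
)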
The proper way is to create a setup.py file for your package, but since that may take time, below is a shortcut.
If you want to use your module frequently, e.g. in scripts, the easy way is to export PYTHONPATH in your bashrc/zshrc file and point it at the directory containing your code.
For example:
export PYTHONPATH=$PYTHONPATH:$HOME/path/to/package
Check it in the terminal using
echo "$PYTHONPATH"
Happy Coding
I'm working on a Python project with approximately the following layout
project/
    foo/
        __init__.py
        useful.py
        test/
            __init__.py
            test_useful.py
test_useful.py tries to import project.foo.useful so it can test it. That doesn't work when I say "python project/foo/test/test_useful.py", but it does work if I copy it into my current directory and run "python test_useful.py".
What is the correct way to handle these imports while developing? It seems like this won't be an issue once installed, because it will be in PYTHONPATH. Should I use distutils to make a build/ folder and add it to my PYTHONPATH?
First of all you need to set up your PYTHONPATH to either include "project" or the parent of "project". This is important while you're developing too :-)
Then you should be able to use an absolute import:
from project.foo import useful
Secondly, I would suggest that instead of running tests by executing the module, you install py.test (pip install pytest). Then you'll be able to use relative imports, as long as your py.test invocation is generic enough (i.e. "py.test foo" will work, but "py.test foo/test/test_useful.py" will not). I would still recommend that you not use relative imports in tests.
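For instance, with the parent of "project" on your PYTHONPATH, a py.test-style test can rely on the absolute import (a sketch; the test body is only illustrative):

# test_useful.py -- sketch of a py.test-style test using the absolute import
from project.foo import useful

def test_useful_is_importable():
    # with the parent of "project" on PYTHONPATH, the import above resolves
    assert useful is not None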
Please consider using distutils/setuptools to make your project installable in the standard Python way. (Hint: you'll need to create a setup.py file parallel to the 'foo' directory, which is also known as a package.)
Doing so will also allow you to use a number of common Python testing frameworks (nose, py.test, etc.) to collect and run tests; most such frameworks automatically ensure 'foo' is an importable package before running the tests. Your test_useful.py tests can then import 'foo.useful' without a problem.
Also worth noting from your example directory structure: it is generally recommended that your tests directory NOT be a Python package, i.e. delete the test/__init__.py file. The framework will ensure the tests are runnable, and not having it as a package helps ensure it only gets distributed in source distributions and not binary ones (where it likely isn't wanted).