I've been developing a project in Python for some time using PyCharm. I have no problem running it in PyCharm, but I want to try running it from the command line (I'm using Windows).
When I try to run my main file using python <filename> from within the root directory, I get a ModuleNotFoundError. What is PyCharm doing, and how can I replicate it?
Also, PyCharm creates __pycache__ folders. If my understanding is correct, it's compiling the files, and that makes the runtime faster. Since my runtime is already long, I'd like to do the same.
I'm using Python 3.6.
Edit
File Structure
-Root\
----scheduler\
------main.py
------Singleton.py
------utils.py
------__init__.py
------apis\
--------acedb.py
--------metrics.py
--------__init__.py
------matrices\
--------distancematrix.py
--------__init__.py
------models\
--------branch.py
--------constants.py
--------customer.py
--------dispatch.py
--------technician.py
--------workorder.py
--------__init__.py
------rhc\
--------pathfinder.py
--------rhc.py
--------schedule.py
--------sched_time.py
--------tech_schedule.py
--------__init__.py
Edit 2
Found the solution: I had to move my main files outside of the modules.
Thanks!
If you have the folder structure below, for instance, add an empty file named __init__.py and import from your app.py; in case you have a Module1Class, you can always import it like this:
from modules.module1 import Module1Class
from modules.module2 import Module2Class
Folder structure
/app.py
/modules
/module1.py
/module2.py
/__init__.py
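For instance, module1.py and app.py might then contain something like this minimal sketch (Module1Class is just the example class name used above, not something from the question):
# modules/module1.py
class Module1Class(object):
    def greet(self):
        return 'hello from module1'

# app.py
from modules.module1 import Module1Class

print(Module1Class().greet())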
The first time you run app.py from the terminal, like python app.py, the .pyc files will be created, leaving you with the following folder structure:
/app.py
/modules
/module1.py
/module1.pyc
/module2.py
/module2.pyc
/__init__.py
Please refer to the Python documentation, as it is very well documented on how to create modules and import them. I'm more used to Python 2.7, so my answer might not be an exact fit for newer Python versions.
https://docs.python.org/3/tutorial/modules.html
From there you can learn more about __init__.py and module creation, exporting and importing.
PS: I use only text editors to develop in Python, as I find PyCharm a bit on the heavy side, so I cannot explain exactly how PyCharm works behind the curtains.
See ModuleNotFoundError: What does it mean __main__ is not a package? for a good description of this kind of ModuleNotFoundError (it was answered quickly).
I bet that PyCharm is configured to use a different Python interpreter or virtual environment.
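One quick way to check is to print the interpreter and the import search path from both PyCharm's run console and your Windows command prompt and compare the output; a small sketch:
import sys

print(sys.executable)  # which python.exe is actually running
print(sys.path)        # which directories are searched for imports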
Related
I have a project I am working on, let's call it Project, which lives in the directory Project somewhere wholly unknown to me (really it lives both on my local system and on a couple of Docker build systems). In that project, I have some source files, source/module1.py and source/module2.py. I also have some example files, some test files, and an __init__.py. So my directory looks something like this:
Project
__init__.py
/source
module1.py
module2.py
/test
testRunner.py
/examples
awesomeExample.py
However, module1 needs some stuff from module2. My naive self thought this could be done by putting an import statement in module1:
import module2
# Do some other interesting stuff
And this works, but only when I am running / importing module1 from the source directory. If I am, for example, running some unit tests from another directory via test/testRunner.py, whether from the test directory or from the main Project directory, the import will fail. The same happens when running an example in the examples directory.
So here is my problem: in general, I don't know where the calling script lives. It might be in the examples directory, it might be in the test directory, or it might be in the main Project directory (for example when trying to import stuff with an __init__.py). How do I ensure that module1 can always import module2 in each of these scenarios?
I am not looking for a solution like "add all those directories to your python path". Initially I just added Project to my python path on my local machine, and then did all my imports relative to that (import Project.source.module2), but this (predictably) caused my builds to fail on the Docker instances. I don't just want this to work on my local machine, but also on the Docker instances I'm using to build and test this software, and on any user's machine that subsequently installs it (i.e. by doing a pip install Project). What is the most robust way to make sure this dependency is satisfied? How can I make sure module1 can import module2 regardless of where module1 itself is imported from? Any Python 3.x.x solution is welcome.
I figured out a way to do it (credit here) - it's a little inelegant, but extremely robust. It works on my local machine regardless of whether I use an import statement or run a script directly, as well as on my build servers, which use GitHub Actions and Travis CI.
Basically, I added a file in the source directory, called context.py with the following contents:
import os
import sys

# Absolute path of the directory that contains this file (Project/source)
fileLocation = os.path.dirname(os.path.abspath(__file__))
# Build the absolute path to Project/source and put it first on the import path
sourceLocation = os.path.abspath(os.path.join(fileLocation, '..', 'source/'))
sys.path.insert(0, sourceLocation)
This finds the location of the file currently being executed by Python, and then uses that to extend the Python path. Then, in my module1.py file, at the top I have:
import context
import module2
Now, whenever module1 is imported, it successfully imports module2. More elegant answers or comments on why this works and in which cases it might fail are appreciated.
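For what it's worth, the same trick can be reused from the test or examples directories; a sketch assuming the layout above, with awesomeExample.py as the entry point:
# examples/awesomeExample.py (a sketch)
import os
import sys

# Put Project/source on the import path, relative to this file, so the import
# below works no matter which directory the script is launched from.
sys.path.insert(0, os.path.abspath(os.path.join(os.path.dirname(__file__), '..', 'source')))

import module1  # which in turn imports context and module2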
I have a python project structured like this:
project/
----project/
--------__init__.py
--------process.py
--------config.py
----tests/
--------test_process.py
__init__.py is empty
config.py looks like this:
name = 'brian'
USAGE
I use the library by running python process.py from the project/project/ directory, or by specifying the python file path absolutely. I'm running Python 2.7 on Amazon EC2 Linux.
When process.py looks like below, everything works fine and process.py prints brian.
import config
print config.name
When process.py looks like below, I get the error ImportError: No module named project.config.
import project.config
print config.name
When process.py looks like below, I get the error ImportError: No module named project. This makes sense as the same behavior from the previous example should be expected.
from project import config
print config.name
If I add these lines to process.py to include the library root in sys.path, all configurations above work fine.
import os
import sys
# Add the repo root (one directory above this file) to the import search path
sys.path.append(os.path.abspath(os.path.join(os.path.dirname(__file__), '..')))
MY CONFUSION
Many resources suggest setting up python libraries to import modules using project.module_name, but it doesn't seem like sys.path appending is standard, and it seems weird that I need it. I can see that the sys.path append added my library root as a path in sys.path, but I thought that's what the __init__.py in my library root was supposed to do. What gives? What am I missing? I know Python importing creates lots of headaches so I've tried to simplify this as much as possible to wrap my head around it. I'm going crazy and it's Friday before a holiday. I'm bummed. Please help!!
QUESTIONS
How should I set up my libraries? How should I import packages? Where should I have __init__.py files? Do I need to append my library root to sys.path in every project? Why is this so confusing?
Your project setup is alright. I renamed the directories just for clarity
in this example, but the structure is the same as yours:
repo_dir/
project_package/
__init__.py
process.py
config.py
# Declare your project in a setup.py file, so that
# it will be installable, both by users and by you.
setup.py
When you have a module that wants to import from another module in
the same project, the best approach is to use relative imports. For example:
# In process.py
from .config import name
...
While working on the code on your dev box, do your work in a Python virtualenv,
and pip install your project in "editable" mode.
# From the root of your repo:
pip install -e .
With that approach, you'll never need to muck around with sys.path -- which
is almost always the wrong approach.
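A minimal setup.py for that layout might look like the sketch below (the name and version here are placeholders, not from the question):
# setup.py, at the root of the repo
from setuptools import setup, find_packages

setup(
    name='project_package',   # placeholder distribution name
    version='0.1.0',
    packages=find_packages(exclude=['tests', 'tests.*']),
)
After pip install -e ., import project_package works from anywhere inside that virtualenv, including the tests directory.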
I think the problem is how you're running your script. If you want the script to live inside a package (the inner project folder), you should run it with python -m project.process, rather than by filename. Then you can use absolute or explicit relative imports to get config from process.
An absolute import would be from project import config or import project.config.
An explicit relative import would be from . import config.
Python 2 also allows implicit relative imports, but they're a really bad misfeature that you should never use. With implicit relative imports, internal package modules can shadow top-level modules. For instance, a project/json.py file would hide the standard library's json module from all the other modules in the package. You can tell Python you want to forbid implicit relative imports by putting from __future__ import absolute_import at the top of the file. It's the standard behavior in Python 3.
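For instance, process.py might start like the sketch below under Python 2.7, run from the outer project directory with python -m project.process:
# project/process.py (a sketch)
from __future__ import absolute_import  # forbid implicit relative imports (the Python 3 default)

from . import config  # explicit relative import of project/config.py

print config.name  # prints 'brian'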
I have a problem similar to the one described in How to fix "ImportError: No module named ..." error in Python?, but I cannot fix it with the suggestion to set PYTHONPATH.
my directory looks like:
- project
- python
- src
- ml
- __init__.py
- classifier_dnn.py
- util.py
- vectorizer
- fv_davison.py
- __init__.py
And I am running classifier_dnn.py from the project folder:
~project$ PYTHONPATH=/home/project/
~project$ python3 /home/project/python/src/ml/classifier_dnn.py /home/project/data/labeled_data_all.csv /home/project/output
But an error is generated when classifier_dnn imports ml.util:
Traceback (most recent call last):
File "/home/project/chase/python/src/ml/classifier_dnn.py", line 5, in <module>
from ml import util
ImportError: No module named 'ml'
I have also tried setting PYTHONPATH=/home/project/python or PYTHONPATH=/home/project/src but the same error happens.
When I test this in PyCharm, it works if I set python/src as the source root, regardless of what the working directory is. But I cannot figure out how to set this up properly when I run it from the command line.
Any help please
Thanks
I have a writeup on this but I'll copy the relevant text inline.
You have two problems:
You need to export PYTHONPATH (export PYTHONPATH=/full/path/to/src)
You should run with python -m module.path instead of python path/to/file.py
The core problem is that python path/to/file.py puts path/to at the beginning of the PYTHONPATH (sys.path).
This causes imports to start from (in your case) src/ml instead of the expected src.
By using -m, you avoid this path munging and it will correctly keep src as the beginning of your sys.path.
When I test this in PyCharm, it works if I set python/src as the source root, regardless of what the working directory is. But I cannot figure out how to set this up properly when I run it from the command line.
You can do this by cd-ing into the src directory.
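Concretely, using the paths from the question, something like this should work regardless of the current directory:
export PYTHONPATH=/home/project/python/src
python3 -m ml.classifier_dnn /home/project/data/labeled_data_all.csv /home/project/output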
You should not put executable scripts into a library structure.
Only library modules should go into the package structure. Executables should stay outside it. This is important for installation and day-to-day use. Once you have installed your package (using distutils, for example), using it would be complicated if you left the executable inside the library: you would have to give the complete installation path into the Python library (and don't forget about -m!) every time, or create shortcuts or other workarounds.
Separate executables and library. Executables are installed to something like /usr/local/bin and are callable without path, -m, or other problems, and your library is installed into the Python library path.
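With setuptools, for instance, a console_scripts entry point lets the installer generate that executable for you; a sketch with placeholder names:
# setup.py (excerpt)
from setuptools import setup, find_packages

setup(
    name='mytool',               # placeholder project name
    packages=find_packages(),
    entry_points={
        'console_scripts': [
            # installs a 'mytool' command (e.g. into /usr/local/bin) that calls main() in mytool/cli.py
            'mytool = mytool.cli:main',
        ],
    },
)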
This is a broad question because no one seems to have found a solution to it yet, so I think asking to see a working example might prove more useful. So here goes:
Has anyone run a nosetests on a python project using imports of multiple files/packages?
What I mean is, do you have a directory listing such as:
project/
|
|____app/
|___main.py
|___2ndFile.py
|___3rdFile.py
|____tests/
|____main_tests.py
Where your main.py imports multiple files and you perform a nosetests run from the project folder, utilizing a test script in the main_tests.py file? If so, please can you screenshot your import section, both of all your main files and of your main_tests.py file?
This seems to be a major issue in nosetests, with no apparent solution:
Nosetests Import Error
A test running with nosetests fails with ImportError, but works with python command
https://github.com/nose-devs/nose/issues/978
https://github.com/nose-devs/nose/issues/964
You can't have python modules starting with a digit, so 2ndFile.py, 3rdFile.py won't actually work (rename them).
You'll need an __init__.py inside the app directory, for it to be considered a package, so add that (it can be empty file).
You don't need an __init__.py in the tests directory!
The import statements in main_tests.py should look like from app.main import blah
The absolute path of the project directory needs to be in your sys.path. To achieve this, set an environment variable: export PYTHONPATH=/path/to/project
Now running nosetests should work.
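For completeness, main_tests.py could then look something like this (blah stands in for whatever main.py actually defines):
# tests/main_tests.py
from app.main import blah

def test_blah_exists():
    assert blah is not None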
I'm setting up some code for unittesting. My directory currently looks like this:
project/
src/
__init__.py
sources.py
test/
__init__.py
sources_test.py
In __init__.py for the test directory, I have these two lines:
import sys
sys.path.insert(0, '../')
In the test files, I have the line import src.sources.
When I use nose to run these tests from the project directory, everything works just fine. If I try to run the tests individually it gives me this error:
ImportError: No module named src.sources
I assume that this is because when I run the test from the command line it isn't using __init__.py. Is there a way I can make sure that it will use those lines even when I try to run the tests individually?
I could take the lines out of __init__.py and put them into my test files, but I'm trying to avoid doing that.
To run the tests individually I am running python sources_test.py
You're really trying to abuse packages here, and that isn't a good idea.
The simple solution is to not run the tests from within the test directory. Just cd up a level, then do python test/sources_test.py.
Of course that in itself isn't going to import test/__init__.py. For that, you really need to import the package. So python -m test.sources_test is probably a better idea… except, of course, that if your package is made to be run as a script but not to be imported, that won't work.
Alternatively, you could (on POSIX platforms, at least) do PYTHONPATH=.. python sources_test.py from within test. This is a bit hacky, but it should work.
Or, better, combine the above, and, from outside of test, do PYTHONPATH=. python test/sources_test.py.
A really hacky workaround is to explicitly import __init__. This should basically work for your simple use case, but everything ends up wrong: in particular, you end up with a module named __init__ instead of one named test, and of course your main module isn't named test.sources_test, and in fact there is no test package at all. Unless you accidentally re-import anything after modifying sys.path, in which case you may get duplicates of the modules.
If you write
import src.sources
the Python interpreter looks into the src directory for an __init__.py file. If it exists, you can use the directory as a package name. If you are not in your project directory, which is the case when you are in the src directory, then Python looks into the directories in the $PYTHONPATH environment variable (at least on Linux; Windows should also have some environment variable, maybe with another name) to see if it can find some directory src with an __init__.py file in it.
Did you set your $PYTHONPATH?
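So one way to make the individual run work is to point $PYTHONPATH at the project directory before running the test, for example (Linux):
export PYTHONPATH=/path/to/project   # the directory that contains src/ and test/
python sources_test.py               # run from within the test directory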