I've added one folder into my project called folderb as follows:
mainproject
    foldera
        __init__.py (empty)
        filea.py (module)
    folderb
        __init__.py (empty)
        fileb.py (module)
My project is located under:
/root/Documents/Repositories/mainproject/
Inside the filea.py module I want to use functions from fileb.py in folderb, so I import as follows:
from folderb.fileb import myfunctionb
Nevertheless I am getting this:
Exception has occurred: ModuleNotFoundError
No module named 'folderb'
What am I doing wrong?
The issue is how sys.path is set up under two different ways of executing the script. Go to the mainproject folder. This would work:
python -m foldera.filea
and this would not
python foldera/filea.py
You can see why by adding this to the beginning of foldera/filea.py, before any other import:
import sys
print(sys.path)
Invocation using the file path will add /root/Documents/Repositories/mainproject/foldera to the path where the Python interpreter looks for modules. Invocation using the package and module name (with the -m option) adds /root/Documents/Repositories/mainproject/ instead.
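With the layout above, the two invocations print roughly the following first entry (the remaining entries depend on your installation):
python foldera/filea.py prints something starting with
['/root/Documents/Repositories/mainproject/foldera', ...]
python -m foldera.filea (run from mainproject) prints something starting with
['/root/Documents/Repositories/mainproject', ...]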
You can make the path variant work by augmenting sys.path, either with
PYTHONPATH=. python foldera/filea.py
or by adding this ugly code to the beginning of filea.py:
import os
import sys

current = os.path.dirname(os.path.realpath(__file__))  # .../mainproject/foldera
parent = os.path.dirname(current)                       # .../mainproject
sys.path.insert(0, parent)
Don't do that, use the first option. More information:
PEP 338 – Executing modules as scripts
Python documentation - Command line and environment, option -m
Assume I have the following file structure:
Package/
    __init__.py
    A.py
    B.py
Inside __init__.py I have the following:
# __init__.py
import numpy as np
import pandas as pd
Then I issue the following in the A.py script:
# A.py
from Package import *
However, I receive an error message that no module named Package is defined.
ModuleNotFoundError: No module named Package
I thought from Package import * meant running everything in __init__.py.
I can run the A.py content and use the __init__ imports from B.py as I expected (using from Package import *).
I am using VSCode and Anaconda and my OS is Windows 10.
I can append the project folder to my PYTHONPATH every time using the following command:
sys.path.append("Path to the Package")
But I do not want to run this piece of code every time.
Can anyone explain what the problem is?
Is this a new problem in Python? I do not recall having such issues in the past.
Because if you run B.py, the parent folder of the Package folder will be added to sys.path, which is equivalent to adding sys.path.append("Path to the Package") in the A.py file.
But when you run A.py, it adds the Package folder instead of the parent folder of the Package folder to sys.path.
sys.path:
A list of strings that specifies the search path for modules. Initialized from the environment variable PYTHONPATH, plus an installation-dependent default.
As initialized upon program startup, the first item of this list, path[0], is the directory containing the script that was used to invoke the Python interpreter.
If you are running the Python file in debug mode (F5), and the Package folder is a subfolder of your workspace, you can configure PYTHONPATH in the launch.json file:
"env": {
"PYTHONPATH": "${workspaceFolder}"
},
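For context, here is roughly where that env block sits inside a .vscode/launch.json debug configuration; the other fields are typical defaults for the VSCode Python extension, so adjust them to your setup:
{
    "version": "0.2.0",
    "configurations": [
        {
            "name": "Python: Current File",
            "type": "python",
            "request": "launch",
            "program": "${file}",
            "console": "integratedTerminal",
            "env": {
                "PYTHONPATH": "${workspaceFolder}"
            }
        }
    ]
}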
In your A.py script, just import the module you need directly (for example import B, without the .py extension) and do not use the star. Or, put all your files in a second Package2 directory and use from Package2 import * from your current A.py file.
I am using python 2.7 and have the following project structure
main-folder
    folder1
        script.py
    folder2
        scr.py
    abc.py
    util.py
I am trying to import abc.py into util.py using
from main-folder import abc
but I am getting an error as below:
ImportError: No module named main-folder
I also tried to append the path to main-folder to sys.path using
sys.path.append(r'path/to main-folder/main-folder')
I also have __init__.py in main-folder, folder1 & folder2.
I'll assume your package is not actually called main-folder because that's a syntax error.
sys.path / PYTHONPATH is where Python looks for modules, so adding a folder to sys.path means what's in it can be imported (as a top-level module); it doesn't make the folder itself importable
when you run a script as a Python file, Python adds that file's folder to the PYTHONPATH, e.g. here if you run main-folder/folder1/script.py, main-folder/folder1 is what's on your PYTHONPATH, and that obviously can't access abc or util no matter how you slice it
import <foo> or from <foo> import <bar> is an absolute import, it starts its search from the PYTHONPATH[0]
you can specify PYTHONPATH on the command line, e.g. PYTHONPATH=. python main-folder/folder1/script.py will *also* add whatever . is to your PYTHONPATH, which may be what you want?
within a package (a directory with an __init__ and a bunch of submodules), it's probably better to use relative imports, e.g. util should use from . import abc if they're supposed to be sibling submodules of the same package (see the sketch after this answer)
[0] that's not actually true for Python 2, as PEP 328 necessarily had to keep the old behaviour working, but you probably want to assume it regardless; you can "opt out" of the old behaviour by using the __future__ stanza listed in the PEP
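As a minimal sketch of that relative-import suggestion, assuming abc.py and util.py end up as sibling submodules of one package (and the package is given a valid name, since main-folder is not importable as written), util.py could look like this:

# util.py — relative import of the sibling submodule abc.py
from . import abc

def use_abc():
    # some_function is hypothetical; call whatever abc.py actually defines
    return abc.some_function()

This only works when util is imported as part of its package (for example via python -m or from another module), not when util.py is executed directly by its file path.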
I have the following directory structure:
app/
    bin/
        script1.py
        script2.py
    lib/
        module1/
            __init__.py
            module1a.py
            module1b.py
        __init__.py
        module2.py
    Dockerfile
My problem is that I want to execute script1.py and script2.py, but inside those scripts, I want to import the modules in lib/.
I run my scripts from the root app/ directory (i.e. adjacent to Dockerfile) by simply executing python bin/script1.py. When I import modules into my scripts using from lib.module1 import module1a, I get ImportError: No module named lib.module1. When I try to import using relative imports, such as from ..lib.module1 import module1a, I get ValueError: Attempted relative import in non-package.
When I simply fire up the interpreter and run import lib.module1 or something, I have no issues.
How can I get this to work?
In general, you need __init__.py under app and bin, then you can do a relative import, but that expects a package
If you structure your Python code as a Python package (egg/wheel), then you could also define an entry point that would become your /bin/ file post-install.
here is an example of a package - https://python-packaging.readthedocs.io/en/latest/minimal.html
and this blog explains entry points quite well - https://chriswarrick.com/blog/2014/09/15/python-apps-the-right-way-entry_points-and-scripts/
That way you could just do python setup.py install on your package and then have those entry points available on your PATH; as part of that, you would structure your code in a way that does not create import issues (a minimal sketch follows).
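For reference, a minimal sketch of such a setup.py, assuming the script logic is moved into a hypothetical lib/cli.py module that exposes a main() function:

from setuptools import setup, find_packages

setup(
    name='app',          # hypothetical distribution name
    version='0.1',
    packages=find_packages(),
    entry_points={
        'console_scripts': [
            # installs a `script1` command on PATH that calls lib.cli:main()
            'script1 = lib.cli:main',
        ],
    },
)

After python setup.py install (or pip install .), script1 is on your PATH, and imports such as from lib.module1 import module1a resolve normally because lib is installed as a package.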
You can add to the Python path at runtime in script1.py:
import sys
sys.path.insert(0, '/path/to/your/app/')
import lib.module1.module1a
You have to add the current directory to the Python path: use export in the terminal, or sys.path.insert in your Python script; both are OK (see the example below).
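For example, the export variant could look like this, assuming you start from the app/ directory as in the question:
export PYTHONPATH="$PWD"
python bin/script1.py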
I have a python project/library called "slingshot" with the following directory structure:
slingshot/
    __init__.py
    __main__.py
    build.py
    deploy.py
    util/
        __init__.py
        prepare_env.py
        cdn_api.py
From __main__.py I would like to import functions from util/prepare_env.py.
I would like to ensure that util refers to the util I have in my project, and not some other util library that may be installed somewhere.
I tried from .util import prepare_env but I get an error.
from util import prepare_env seems to work, but doesn't address the ambiguity of "util".
What am I doing wrong?
__main__.py is as follows:
import os
from .util import prepare_env

if __name__ == '__main__':
    if 'SLINGSHOT_INITIALIZED' not in os.environ:
        prepare_env.pip_install_requirements()
        prepare_env.stub_travis()
        prepare_env.align_branches()
        os.environ['SLINGSHOT_INITIALIZED'] = 'true'
When I type python3 ./slingshot I get the following error:
File "./slingshot/__main__.py", line 2, in <module>
from .util import prepare_env
ImportError: attempted relative import with no known parent package
When I type python3 -m ./slingshot I get the following error:
/usr/local/opt/python3/bin/python3.6: Relative module names not supported
A __main__.py module in a package is what runs as a script when you use the -m command line switch. That switch takes a module name, not a path, so drop the ./ prefix:
python3 -m slingshot
The current working directory is added to the start of the module search path so slingshot is found first, no need to give a relative path specification here.
From the -m switch documentation:
Search sys.path for the named module and execute its contents as the __main__ module.
Since the argument is a module name, you must not give a file extension (.py). The module name should be a valid absolute Python module name[.]
[...]
As with the -c option, the current directory will be added to the start of sys.path.
I met a very strange problem:
My file structure is like: (core and test are directories)
core
    file1.py
    __init__.py
test
    file2.py
In file2, I wrote:
from core import file1
result is:
ImportError: cannot import name file1
You have to create an __init__.py file inside the test dir, because __init__.py files are required to make Python treat the directories as containing packages.
parent/
    child1/
        __init__.py
        file1.py
    child2/
        __init__.py
        file2.py
From the error: if you run the child2/file2.py file directly, you cannot access child1/file1.py from child2/file2.py, because the child directories can only be accessed from the parent directory.
If you have a folder structure like:
parent/
    child1/
        __init__.py
        file1.py
    child2/
        __init__.py
        file2.py
    file3.py
If we run the file3.py file, we can access both child1/file1.py and child2/file2.py in file3.py, because it is running from the parent directory.
If we need to access child1/file1 from child2/file2.py, we need to set the parent directory. We can achieve it by running the command below:
PYTHONPATH=. python child2/file2.py
PYTHONPATH=. refers to the parent path; the command then runs the child2/file2.py file from the shell.
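With that invocation, file2.py can use a plain absolute import (a sketch, assuming the __init__.py files shown above):

# child2/file2.py
from child1 import file1   # works because PYTHONPATH=. puts the parent directory on sys.path

print(file1)                # just to show the module is importable; replace with whatever file1 defines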
It's not a strange problem, imports simply don't work like that.
From the official documentation: https://docs.python.org/3/tutorial/modules.html
When a module named spam is imported, the interpreter first searches for a built-in module with that name. If not found, it then searches for a file named spam.py in a list of directories given by the variable sys.path. sys.path is initialized from these locations:
The directory containing the input script (or the current directory when no file is specified).
PYTHONPATH (a list of directory names, with the same syntax as the shell variable PATH).
The installation-dependent default.
You could look into relative imports, here's a good source: https://stackoverflow.com/a/16985066/4886716
The relevant info from that post is that there's no good way to do it unless you add core to PYTHONPATH like Shawn. L says.
When I tried your case, I got
Traceback (most recent call last):
File "file2.py", line 3, in <module>
from core import file1
ImportError: No module named core
The reason is that Python does not find core. In this case, you need to add the directory that contains core to the system path, as shown below (add this at the very beginning of file2.py):
import sys
sys.path.append('/path/to/directory/containing/core')  # placeholder: the directory that holds the core package
Or, if you run it from the command line, you could simply put the following at the beginning of file2.py:
import sys, os
sys.path.append(os.path.join(os.path.dirname(__file__), '../'))
Here, os.path.join(os.path.dirname(__file__), '../') is the parent of the directory containing file2.py, i.e. the directory that contains core.
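Putting it together, a minimal file2.py could start like this (a sketch; the rest of the file is whatever you already have):

import sys, os

# make the directory that contains core/ importable before the absolute import below
sys.path.append(os.path.join(os.path.dirname(__file__), '../'))

from core import file1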