I ran into a situation where I want to use 2 separate TensorFlow projects. The problem is that I use a (quite heavily loaded) virtual environment and I would like to keep using it. Both projects contain a folder named (not too surprisingly) utils, so I guess a conflict occurs when I import from the second one.
Possible solutions that I could think of:
Create a new virtual environment, or even better clone the existing one as described here, I guess. The con is that it takes about 1.7 GB and I am running out of space (but I could deal with that somehow if that's the best choice).
Try to fix the conflicting cases by prefixing the imports with the project folder, so instead of:
from utils import label_map_util I could use from project2.utils import label_map_util. This seems to work, but the con is that I must search every file for conflicting imports, e.g. inside label_map_util.py, where a similar import crashes (there are various folders in the project).
Is there any better solution? Am I missing something obvious here?
Also (I am fairly new to Python), shouldn't this conflict be dealt with automatically by Python? I mean, when a module in a folder imports another module that exists in the same folder or a subfolder, shouldn't that get priority over modules in other, parallel folders (possibly in other projects, etc.)?
Edit:
To make it clearer: I have 2 projects, say projectA and projectB, meaning 2 folders with various subfolders, which both contain a folder named utils. projectA was running just fine before the installation of projectB.
When I installed projectB and tried to run a module in it, that module contained imports from the utils folder, like:
from utils import label_map_util
from utils import visualization_utils as vis_util
etc. I am using PyCharm and the IDE did not recognize the import as valid. I tried to debug it in interactive mode and my conclusions were:
There is a label_map_util.py module in the utils folder of projectB.
For some reason, the import tries to find this module in projectA. Since there is no module with that name there, it complains with the error:
ImportError: cannot import name 'label_map_util'
In that sense import utils works fine, but it imports the wrong module (the one from projectA).
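(For reference, a quick way to check which utils actually gets picked up, as a sketch:)
import utils
print(utils.__file__)   # shows whether 'utils' resolved to projectA or projectB

import sys
print(sys.path)         # the first entry containing a 'utils' folder wins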
Related
I know that Python's import mechanism doesn't allow relative imports without a known parent package, but I want to know what the reason for this is.
If relative imports worked without a known parent package, it would make developers' lives much easier (I think, correct me if I am wrong).
Example:
From a Python script I have to import a global definitions file. Right now I have to do it like this:
import sys

DEFS_PATH = '../../../'
sys.path.append(DEFS_PATH)
import DEFINITIONS as defs
It would make everything much easier if I could import this file just like this, without having to specify the -m flag when executing the script or create an __init__.py file that collects all the packages:
from .... import DEFINITIONS as defs
Of course, doing this raises the famous import error (attempted relative import with no known parent package).
Obviously this is a toy example, but imagine having to repeat this in hundreds of python scripts...
Is there any workaround for doing relative imports without a known parent package that doesn't involve the hacky, ugly way (sys.path.append(...) or python -m myscript)?
I solved this problem, but in a different way. I have a folder with lots of global functions that I use in different packages. I could turn this folder into a Python package, but then I would have to rebuild it each time I changed something.
The solution that fit me was to add a user-packages.pth file to the site-packages directory of my current environment (it could also be added to the global site-packages folder). Inside this user-packages.pth I added the absolute path to the directory where all the global utils live. Now, from any Python script, I just have to do:
from utils import data_processing as dp
from utils.database import database_connection as dc
and I no longer need to add sys.path.append("path/to/myutils/") in each file.
Note:
The .pth file can have any file name (customName.pth), and the paths inside the file should be separated by newlines ("\n"). Also, the paths should be absolute.
For example:
C:\path\to\utils1
C:\path\to\other\utils2
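In case it's useful, you can locate the site-packages directories of the current environment with the standard site module (a small sketch):
import site
print(site.getsitepackages())       # site-packages folders of the environment
print(site.getusersitepackages())   # per-user site-packages, also scanned for .pth files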
I am trying to understand how import works in jupyter notebook.
My present working directory is "/home/username". I have three python modules.
The path names of these modules are as given below.
"/home/username/module1.py"
"/home/username/folder/module2.py"
"/home/username/anaconda3/lib/python3.7/os.py" (which is an inbuilt python module)
Jupyter Notebook:
cell 1:
import module1
Works just fine
cell 2:
import module2 gives
ModuleNotFoundError: No module named 'module2'
cell 3:
import os
Works just fine
It seems like modules in the working directory can be imported without any problem, so module1.py can be imported. Modules in other directories that are not packages cannot be imported directly, so module2.py throws an error. But if that is the case, how can os.py, which is neither in the working directory nor in a package inside it, be imported directly?
This is really more about how python itself works.
You should be able to import module2 with from folder import module2. You should declare /home/username/folder as a package by creating a blank init file, /home/username/folder/__init__.py. I recommend naming the package something more unique, like potrus_folder, so that you don't get naming conflicts down the line.
To explain: Python keeps track of what modules it has available through its path, which is usually set via your environment variables. To see which folders it looks in for modules, you can do import sys and then print(sys.path). By default your working directory (/home/username/) will be included, with high priority (it sits at the front of sys.path). You can add your own folder with sys.path.append('/some/folder'), although that is frowned upon; you should really add it to your system path, or just keep it as a package in your working directory.
Packages are really just subfolders of paths that have already been added. You access them, as I explained earlier, with the from X import Y syntax, or, if you want to go deeper, from X.Z import Y. Remember the __init__.py file.
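A minimal sketch of what that looks like with the paths from the question (the __init__.py can be left empty):
# /home/username/folder/__init__.py   (empty file, marks 'folder' as a package)

# in the notebook, with /home/username as the working directory:
import sys
print(sys.path)               # /home/username is on this list

import module1                # found directly in the working directory
from folder import module2    # found because 'folder' is now a package there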
The path of the os library is set in the environment.
Whenever you do an import, Python searches all the directories added to your environment plus the present working directory, so you could just add the directory to the environment and that would work.
/home/username/anaconda3/lib/python3.7/ is added by default at installation time, since that is where most of the standard modules live, but you can add yours too.
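For example (a sketch using the directory from the question), you could make module2 importable for the current session like this:
import sys
sys.path.append("/home/username/folder")   # add the extra directory to the search path
import module2                             # now resolvable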
I developed a Python 3 package in PyCharm but am running into some confusing behavior when I try to import modules in my test cases. The problem seems to be with the directory path to the internal package modules. It is a bit difficult to explain the issue, but here is the gist.
So the Python package name is pyugend. Now, when I try to import a module inside the package, named Models, into a test case, PyCharm forces me to reference the path as pyugend.pyugend.Models. So I need to reference pyugend twice.
However, when I build, install, and import this package into a Jupyter notebook or some script, I run into errors about the pyugend package not finding its internal modules. The only way to fix these errors is to change the paths inside the module to references like pyugend.Models.
So basically, to run tests inside PyCharm I have to make sure all of the internal package imports use a path like from pyugend.pyugend.Models import .... But when I want to use the package outside PyCharm, I actually have to go into the package and convert all of the pyugend.pyugend... references to single pyugend.Models... references.
I have included a picture of the directory structure as well as a picture of the __init__.py.
You can add the line sys.path.append(os.path.dirname(os.path.abspath(__file__))) before the imports in __init__.py and then change the imports to from pyugend.Models import Base_model and so on, to get consistent behavior wherever you use the package.
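A sketch of what that __init__.py could look like (the Base_model import is taken from this answer; any other internal imports are assumptions):
# pyugend/__init__.py
import os
import sys

# Put the directory containing this file on the search path before the
# package-internal imports run.
sys.path.append(os.path.dirname(os.path.abspath(__file__)))

from pyugend.Models import Base_model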
I am experimenting with Python, mostly troubleshooting other people's code. I am trying to get a program to run: "path\folderA\program.py".
I am running the program from path\folderA
I am getting an error:
ImportError: No module named fff.ggg.ppp
program.py contains an import:
from fff.ggg.ppp import mmm
In the folder "path\folderB" there are:
"path\folderB\fff\__init__.py"
"path\folderB\fff\ggg"
folder ggg also contains __init__.py, as well as program ppp.py
From reading other posts, like Python error "ImportError: No module named", I understand that having the __init__.py makes a folder a "package", which makes imports from it possible - but it doesn't work, since I am getting an error.
This has been working for other people that worked with these projects, so there is something wrong with my setup.
I read something about the directories having to be in sys.path. Does that mean I have to add them to the environment variable PATH? That would mean adding a lot of directories to the PATH though, so that can't be it.
So I also found the following:
import sys
sys.path.append( <path to FolderB> )
But that means changing the code (which has not been necessary for other people) and hard-coding a path specific to my local machine - which I shouldn't have to do, right?
I can't visualize it - apparently I am not supposed to change the code and hard-code the physical path to the imported module - so how can a program from folderA even know to look in folderB for an import?
How does the magic of __init__.py work?
I can't visualize it - apparently I am not supposed to change the code and hard-code the physical path to the import module - so how can a program from folderA even know to look in folderB for an import?
You are correct. Somehow you have to tell Python to look for imported modules in folderB. There is no __init__.py magic that lets you import from other folders on your hard drive.
Usually, if you've got various different Python packages like that, they work by being installed into Python's library. That way they can be imported from anywhere. This is usually accomplished with a setup.py script. Check if folderB has one; run it with python setup.py install.
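If folderB does have one, it is typically a short script along these lines (purely illustrative; the real name and options may differ):
# path\folderB\setup.py -- illustrative sketch only
from setuptools import setup, find_packages

setup(
    name="fff",                 # assumed name; check the real setup.py
    version="0.1",
    packages=find_packages(),   # picks up fff and fff.ggg via their __init__.py files
)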
If that doesn't work, we'll need more information about how this code is structured.
folderB must be on sys.path, so you would either need to move mmm to folderA, or modify sys.path from within A (not sure if that works). __init__.py tells Python that the folder is a package, so you can have folders with __init__.py inside folders with __init__.py, and Python treats the inner folders as parts of the parent package. Check out sympy or almost any large Python library and you will find such a structure. The __init__.py can also contain code to be run on import, but it can also be empty.
My situation is similar to the one in this question... The difference is,
In our Python/Django project, we have a directory called utils, which holds basic functions...
Sometimes, we need to test some modules by running them from the console, like:
python myproject/some_module.py
It is all good until Python tries to import something from our utils directory...
from utils.custom_modules import some_function
ImportError: No module named custom_modules
I checked my Python path, and our project is on the list; each folder under the project directory has an __init__.py file. When I run ipython within the project directory, everything is OK; otherwise, Python imports from its own utils directory...
My colleagues use the same method without any problem, but it throws an ImportError in my environment... What could be the problem that all of us are missing?
UPDATE: My project directory and each subdirectory have an __init__.py file, and I can import other modules from my project without any problem... When I am in a different folder than my project and I run ipython, such an import works without a problem...
from someothermodule.submodule import blahblahblah
But, when it comes to importing utils, it fails...
UPDATE 2: What caused the problem was the utils directory under the django folder, which is also on the Python path...
See the PEP on absolute and relative imports for the semantics. You probably want
from .utils.custom_modules import some_function
if you're in a file at the top level of your package.
Edit: This can only be done from inside a package. This is for a good reason -- if you're importing something that is part of your project, then you're already treating it like a Python package, and you should actually make it one. You do this by adding an __init__.py file to the project directory.
Edit 2: You've completely changed the question. It may be possible to work around the problem, but the correct thing to do is not to refer to your package the same way as a built-in package. You either need to rename utils, or make it a subpackage of another package, so you can refer to it by a non-conflicting name (like from mydjangoapp.utils.custom_modules import some_function).
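As a sketch, the layout this implies would be something like (names taken from this answer):
# mydjangoapp/
#     __init__.py
#     utils/
#         __init__.py
#         custom_modules.py    # defines some_function

from mydjangoapp.utils.custom_modules import some_function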
I'm not going to bother telling you to try and make sure you don't name your own modules after the stdlib modules;
If you want to leave the name like this, you'll have to use something like this in everything that imports your own utils module:
import sys, imp

# Reuse the already-imported 'utils' if there is one; otherwise load ours explicitly.
utils = sys.modules.get('utils')
if not utils:
    utils = imp.load_module('utils', *imp.find_module('utils/utils'))
However, unless there's a lot of stuff that you'd have to change after renaming it, I'd suggest you rename it.
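(Side note: the imp module is deprecated in Python 3; a rough modern equivalent of the snippet above, using importlib and an assumed path to your own utils package, would be:)
import importlib.util
import sys

if 'utils' not in sys.modules:
    # the path below is an assumption; point it at the utils package you actually want
    spec = importlib.util.spec_from_file_location('utils', 'utils/utils/__init__.py')
    utils = importlib.util.module_from_spec(spec)
    sys.modules['utils'] = utils
    spec.loader.exec_module(utils)
else:
    utils = sys.modules['utils']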