I am building a YOLO model using PyTorch, but I would like to write an activation function myself.
To do that I need to import module.py into my own code, so I wrote from .modules import Module and ran it. It returned attempted relative import with no known parent package.
This is the file location of my yolo:
C:/Users/Lenovo/.spyder-py3/yolo.py
This is the file location of the module.py that I want to call:
C:/Users/Lenovo/anaconda3/Lib/site-packages/torch/nn/modules/module.py
I have tried to rewrite the import statement as below:
__import__("C:/Users/Lenovo/anaconda3/Lib/site-packages/torch/nn/modules/module.py").Module
It also returns the same import error. My understanding is that this happens because module.py itself contains the import statement from ..parameter import Parameter, where the .. resolves relative to C:/Users/Lenovo (instead of C:/Users/Lenovo/anaconda3/Lib/site-packages/torch/nn).
My question is: what should I do to import Module and use it in yolo.py for my own purposes, without getting an import error?
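For reference, a minimal sketch of the usual approach: since torch already lives in site-packages, it can be imported by its package name rather than by file path, and Module can be subclassed directly (the MyActivation class and the x * sigmoid(x) formula are just illustrations, not part of the question):

import torch
from torch.nn import Module   # the same class defined in torch/nn/modules/module.py

class MyActivation(Module):
    def forward(self, x):
        # illustrative activation: x * sigmoid(x)
        return x * torch.sigmoid(x)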
I have a number of functions that I have written myself over time, and I have placed each of these functions in its own module (as I thought this was best practice).
I would like to take the next step and organise these modules so that I can import them in fewer lines of code. However, I'm finding that I need a lot of 'repeated' lines of code to import the functions I want to have access to.
my_functions/
    src/
        __init__.py
        func1.py
        func2.py
        func3.py
I have followed a tutorial to build this collection of modules into a package, thinking that I could use something like
import my_functions as mf
but instead I have to import each module separately:
import func1 as fu1
import func2 as fu2
...
a = fu1.func1()
My question is: what do I have to do to be able to import my functions as a package, the same way that I would import something like pandas?
import my_functions as mf
a = mf.func1(arg)
I'm finding it difficult to find either a tutorial or a clear, simple example of how to structure this, so any guidance at all would be useful. If it's just not doable without building something as complex as pandas or numpy, that's OK too; I just wanted to give it one last shot.
Create an __init__.py file in both the my_functions directory and the src directory.
Then add the imports below to the my_functions __init__.py. There is no need to edit the __init__.py inside src.
from .src.func1 import func1
from .src.func2 import func2
__all__ = [
'func1',
'func2'
]
Now you can use my_functions like below:
import my_functions
my_functions.func1()
__all__ is consulted when you write from my_functions import * and limits that star import to the names you have listed; the dotted access above works regardless.
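For example, with the __init__.py above (a minimal sketch; it assumes func1 and func2 take no arguments, as in the question):

from my_functions import *   # the star import honours __all__
a = func1()                  # func1 and func2 are bound directly
b = func2()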
To run a Python script from the project root, invoke it using dot notation as below:
python3 -m <some_directory>.<file>
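For example, assuming the layout above and running from the directory that contains my_functions, a hypothetical invocation would be:

python3 -m my_functions.src.func1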
I have the following problem. I have this folder structure:
folder
├──script.py
├──utils.py
└──venv
I would like to import functions from utils.py into script.py. When I try from utils import function_a, function_b, everything works. But when I try import utils and then call function_a(), I get an error: NameError: name 'function_a' is not defined.
What's the difference between from utils import function_a, function_b and import utils in this case? And how can I fix it, please?
import module: nice when you are using many bits from the module. The drawback is that you'll need to qualify each reference with the module name.
from module import ...: nice in that imported items are usable directly, without the module-name prefix. The drawbacks are that you must list each thing you use, and that it's not clear in the code where something came from.
If you are importing the module itself, then you have to specify what you want to use from the module with the . operator:
import utils
utils.function_a()
utils.function_b()
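Conversely, a minimal sketch of the other style for the same file, where the listed names are usable without a prefix:

from utils import function_a, function_b
function_a()   # no utils. prefix needed
function_b()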
I would like a way to detect whether my module was imported directly, as in import module or from module import *, rather than via import module.submodule (which also executes module), and to have this information accessible in module's __init__.py.
Here is a use case:
In Python, a common idiom is to add import statements to a module's __init__.py file, so as to "flatten" the module's namespace and make the contents of its submodules accessible directly. Unfortunately, doing so can make loading a specific submodule very slow, as all the other siblings imported in __init__.py will also execute.
For instance:
module/
    __init__.py
    submodule/
        __init__.py
        ...
    sibling/
        __init__.py
        ...
By adding to module/__init__.py:
from .submodule import *
from .sibling import *
It is now possible for users of the module to access definitions in submodules without knowing the details of the package structure (i.e. from module import SomeClass, where SomeClass is defined somewhere in submodule and exposed in its own __init__.py file).
However, if I now load submodule directly (as in import module.submodule, by calling python3 -m module.submodule, or even indirectly via pytest), I will also, unavoidably, execute sibling! If sibling is large, this can slow things down for no reason.
I would instead like to write module/__init__.py something like:
if __???__ == 'module':
    from .submodule import *
    from .sibling import *
Where __???__ gives me the fully qualified name of the import. Any similar mechanism would also work, although I'm mostly interested in the general case (detecting direct execution) rather than this specific example.
What is being desired would, if it were actually possible, result in undefined behavior (in the sense of whether or not the flattened names are importable from module) once we consider how the import system actually works.
Hypothetically, suppose there were some __dunder__ that disambiguated which import statement was used to import module/__init__.py (e.g. import module or from module import *, versus import module.submodule). In the first case, module would trigger the subsequent (slow) imports to produce the "flattened" version of the desired namespace, while in the latter case (import module.submodule) it would skip them, and thus module would not contain any of the "flattened" bindings.
To illustrate a bit more: say one may import SiblingClass from module.sibling simply by doing from module import SiblingClass, because the module/__init__.py file executes the from .sibling import * statement to create that binding. But then, if executing import module.submodule results in that flattened import being skipped, we get the following scenario:
import module.submodule
# module.submodule gets imported
from module import SiblingClass
# ImportError will occur
Why is that? This is simply due to how Python imports a file: the source file is executed in its entirety exactly once, binding imports, function and class declarations to their designated names, and the module is registered in sys.modules under its import name. Importing the module again will not execute the file again. Thus, if the from .sibling import * statement was not executed during the initial import (i.e. import module.submodule), it will never be executed during any subsequent import of the same module, because the module object produced by the initial import and stored in sys.modules is simply returned (unless the module is reloaded manually, in which case its code is executed again).
You may verify this fact by putting a print statement into a file, importing the corresponding module to see the output produced, and observing that no further output is produced on subsequent imports of that module (related: What happens when a module is imported twice?).
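A minimal sketch of that verification (the file name mymod.py is illustrative):

# mymod.py
print("executing mymod")

# interactive session
>>> import mymod
executing mymod
>>> import mymod   # already cached in sys.modules, so nothing is printed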
Effectively, the desired functionality as described in the question cannot be implemented in Python.
A related thread on this topic: How to only import sub module without exec __init__.py in the package
This is not a complete solution, but standalone py.test (ignore __init__.py files) proposes setting a global flag to detect when code is running under test. That fixes the problem for tests at least, provided the modules concerned don't import each other.
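For reference, a sketch of that flag, following the pattern from the py.test documentation (the attribute name _called_from_test is the conventional one there, not a pytest API):

# conftest.py
import sys

def pytest_configure(config):
    sys._called_from_test = True

def pytest_unconfigure(config):
    del sys._called_from_test

# module/__init__.py
import sys

if not hasattr(sys, "_called_from_test"):
    from .submodule import *
    from .sibling import *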
Python local imports between files at the same directory level often confuse me. I have the following directory structure:
/distrib
    __init__.py        # empty
    bases.py           # contains classes MainBase etc.
    extension.py       # contains classes MainExtension etc.
/errors
    __init__.py
    base_error.py
In bases.py, importing classes from extension.py succeeds; for example, simply using from .extension import MainExtension works. On the other hand, importing a class from bases.py inside extension.py runs into problems.
Attempt 1
If I write from bases import MainBase, it complains: ModuleNotFoundError: No module named 'bases'.
Attempt 2
If I make it a local import with from .bases import MainBase, it complains: ImportError: cannot import name 'MainBase'.
Attempt 3
If I import it using from . import bases, there is no error. But subsequently using bases.MainBase triggers the error module 'distrib.bases' has no attribute 'MainBase', and it seems that all the classes defined in bases.py are 'missing'.
However, in a different folder such as errors, I can import classes from distrib.bases normally. What exactly is happening here? Doesn't Python allow circular imports?
You have a circular import. In your bases module you try to import from extension, but extension in turn imports from bases. Because bases has not finished executing when extension is loaded, the names it defines do not exist yet, which results in an ImportError when you try to import anything from it.
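A common way to break such a cycle, sketched here with the file names from the question, is to defer one of the imports to the point of use, so that it only runs after both modules have finished loading (build_base is a hypothetical method, just for illustration):

# extension.py
class MainExtension:
    def build_base(self):
        # deferred import: resolved at call time, after distrib.bases
        # has finished executing
        from .bases import MainBase
        return MainBase()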
I am having a lot of trouble understanding the Python module import system.
I am trying to create a simple folder structure as follows.
SomeModule/
    __init__.py
    AnotherModule/
        AnotherModule.py
        __init__.py
        Utils/
            Utils.py
            __init__.py
To use SomeModule I can do:
SomeModule.Foo()
Now inside AnotherModule.py I would like to import my Utils directory.
How come I have to do
import SomeModule.AnotherModule.Utils.Foo
Why can't I just do
import Utils.Foo
To shorten up the actual function name that you'll have to call in your code, you can always do:
from SomeModule.AnotherModule.Utils import *
While this still won't let you get away with a shorter import statement at the top of your script, you'll be able to access all of the functions within Utils just by calling their function names (i.e. foo(x) instead of SomeModule.AnotherModule.Utils.foo(x)).
Part of the reason for the lengthy import statement goes back to the comment from @wim. Have a look by typing import this in a Python interpreter.
Put the following in SomeModule's __init__.py:
import sys
import SomeModule.AnotherModule
sys.modules['AnotherModule'] = SomeModule.AnotherModule
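After SomeModule has been imported once (which runs the __init__.py above and registers the alias), a top-level import of AnotherModule resolves straight from sys.modules; a minimal usage sketch:

import SomeModule        # side effect: registers the alias
import AnotherModule     # found in sys.modules['AnotherModule']
print(AnotherModule is SomeModule.AnotherModule)   # True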