Import with two files that both need the import - Python

I have two .py files and have simplified my problem to just a few lines.
One:
from Two import PrintTwo

class PrintOne(object):
    print('HelloOne')
Two:
from One import PrintOne

class PrintTwo(object):
    print('HelloTwo')
This raises ImportError: cannot import name 'PrintTwo', as expected.
But my problem is that I need to use some functions of these classes in both files.
I can't find a solution for that; what is the correct workflow for a case like this?
Kind regards

This is called a circular import, and circular imports can work if you set them up properly. However, I'd not recommend using them; rather, refactor the code.
It's hard to say what to change in the code without seeing it. When I run into circular imports, I try to avoid them by refactoring. Possible solutions are:
Move the "shared" bits of code into their own Python module (recommended)
Lazily import a module/component, i.e. only import it where you use it (works, but not really shiny)
I can't show you an example based on the code above, because you only import the modules circularly but never actually use them.
As mentioned before, a workaround is to place the import inside the function that uses it, for example:
class PrintOne:
    def some_magic_method(self):
        from Two import PrintTwo  # imported only when the method runs
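The recommended refactor (the first option above) can be sketched end-to-end. The file names here (shared.py and its greet function) are made up for illustration; the sketch writes the three modules into a temp directory so it runs as a single script:

```python
import os
import sys
import tempfile
import textwrap

# Hypothetical project: both One.py and Two.py need the same helper,
# so it moves into a third module that each of them imports from.
tmp = tempfile.mkdtemp()

def write(name, body):
    with open(os.path.join(tmp, name), "w") as f:
        f.write(textwrap.dedent(body))

write("shared.py", """
    def greet(who):
        return f"Hello{who}"
""")
write("One.py", """
    from shared import greet

    class PrintOne:
        def hello(self):
            return greet("One")
""")
write("Two.py", """
    from shared import greet

    class PrintTwo:
        def hello(self):
            return greet("Two")
""")

sys.path.insert(0, tmp)
from One import PrintOne
from Two import PrintTwo  # no cycle: neither file imports the other

assert PrintOne().hello() == "HelloOne"
assert PrintTwo().hello() == "HelloTwo"
```

Because One.py and Two.py both depend only on shared.py, the import graph becomes a tree and the ImportError disappears.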


Difference between importing python library within function versus importing globally?

Suppose I want to import a python library for use inside a function. Is it better to import the library within the function or import it globally?
Do this
def test_func():
    import pandas as pd
    # code implementation
or have the line below at the top of the python file to import globally?
import pandas as pd
What are the pros and cons of each approach? Which is the best practice in python?
I am using python v3.6
EDIT: Some clarifications to make.
Suppose I have 2 functions.
def func1():
    import pandas as pd
    # code implementation

def func2():
    import pandas as pd
    # code implementation
The Python script runs both functions. Will the library be imported twice, or is the Python interpreter smart enough to import it only once? This has performance implications.
It's a difference in name visibility and execution time. A module-level import is executed when the file containing it is imported or run; a function-local one, obviously, only when the function is called. The imported names are either visible to everything in the file, or only within the function in which the import is executed.
As there is a cost for hitting the import statement (albeit a small one), the local one pays it on every call, not just once. It will not fully re-import the module though; Python caches modules once they are imported the first time (see importlib.reload and sys.modules).
The best practice clearly is to use module level imports, and that's what you see in 99.999% of code. A huge reason is maintainability - if you want to understand what dependencies a module has, it's convenient to just look at the top, instead of having to comb through all code.
So when to use function local imports?
There are three scenarios:
you can't use the import earlier. This happens when e.g. a backend for a db or other system/functionality is chosen at runtime through configuration or system inspection.
you otherwise have circular imports. This is a rare case and also a code-smell, so if that is necessary, consider refactoring.
reducing startup-time by deferring module imports. This is very rarely useful though.
So for your case, the answer is a quick and simple "don't do it".
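The caching behavior described above can be checked directly: a function-local import statement runs on every call, but after the first call it is only a lookup in sys.modules. A minimal sketch, using json as a stand-in for any library:

```python
import sys

def local_import():
    # this statement executes on every call, but after the first call
    # it is only a cheap lookup in the sys.modules cache
    import json
    return json

mod = local_import()
assert "json" in sys.modules                   # cached after the first import
assert local_import() is sys.modules["json"]   # later calls return the same module object
assert local_import() is mod
```

So the repeated cost is the statement itself, not a full re-import of the module.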
A module is loaded when you import it, so if you need a module that is rarely used but costly to initialize, you should import it only when you need it.
Actually, if we cared only about performance and not readability, it might always be better to import a module only when we really need it.
But we need to keep our program maintainable. Importing all modules at the top is the most explicit way to tell readers (and the author) which modules are used.
To sum up: if you really have a very costly but rarely used module, import it locally. Otherwise, import everything at the top.
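The deferred-import pattern for a costly, rarely used module looks like this. A minimal sketch, assuming a fresh interpreter; fractions here is only a stand-in for some hypothetical expensive-to-initialize dependency:

```python
import sys

def exact_ratio(a, b):
    # deferred import: the (hypothetically costly) module only loads
    # the first time this function is actually called
    from fractions import Fraction
    return Fraction(a, b)

assert "fractions" not in sys.modules     # nothing paid at startup
assert exact_ratio(2, 4) == exact_ratio(1, 2)
assert "fractions" in sys.modules         # loaded on first call, cached afterwards
```

Programs that never call the function never pay for the import at all.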

PyCharm import multiple functions from the same module

I have a small annoyance with PyCharm; for example, if I write assert_true without having imported it, I can press Alt-Enter on it and get the suggestion Import from... nose.tools.assert_true, and the IDE automatically adds the line:
from nose.tools import assert_true
That's all good. But if I later write assert_false and press Alt-Enter, I don't seem to have the choice to add it to the existing import automatically. Seems odd to me; am I missing something?
So currently what I do is add the first function like that, and then add any additional functions manually to the existing import lines.
I know you can use * to import the full module, but I am still interested in how to automatically add further functions to existing import lines in some non-manual way.
Update: as per the comments below, this seems to be specific to nose.tools; when importing from other modules it works.

Python "header.py" module

I've split one large Python file containing a bunch of methods into several smaller ones. However, there's one problem: I'd like all of these small files to import almost the same modules.
I tried to create a header.py file where I pasted the common header, and then added from header import * to the others, but it seems to me that this only imports the names defined in header.py, rather than the actual modules.
I know one solution is to figure out which libraries each small file depends on, but isn't there a faster way?
It is important and useful to have all the dependencies of a .py file listed in its import statements. That way we can easily trace the source of every module used.
Say you are using module1.method somewhere and want to check where this method came from: you'll always find it in the import statements at the top. This is the reason why from module1 import * is highly discouraged.
We can do what you need in a neat way.
In your header.py:
    import module1
    import module2
In your other files:
    import header

    header.module1.method()  # make all function calls via the header module
If you think the longer names make writing your code tedious, then try an alias:
    import header as h

    h.module1.method()
But please make sure you don't have untraceable modules in your Python files.
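The pattern above works because importing a module inside header.py binds that module as an attribute of header. A runnable sketch, generating a tiny header.py in a temp directory so it runs as one script (in a real project header.py would simply sit next to your other files; json and os.path stand in for your real dependencies):

```python
import os
import sys
import tempfile
import textwrap

tmp = tempfile.mkdtemp()
with open(os.path.join(tmp, "header.py"), "w") as f:
    f.write(textwrap.dedent("""
        import json
        import os.path
    """))
sys.path.insert(0, tmp)

import header as h  # the alias keeps call sites short

# every call goes through the header module's attributes,
# so the dependency is always traceable back to header.py
assert h.json.loads('{"a": 1}') == {"a": 1}
assert h.os.path.basename("/tmp/x.txt") == "x.txt"
```

Note that the attribute access (h.json) only works for modules imported in header.py itself; plain functions defined there would be reached the same way.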

Is it always a good idea to import very specifically in Python?

This is pretty much a Python question, but asked from a Django user's perspective.
Suppose this is how the Django apps are laid out:
Webclient
    apps
        myapp#1
            library
                library.py
        myapp#2
            views.py
        myapp#3
If I am working in views.py and I want to import library.py, which one seems better?
from webclient.apps.myapp.library import LibraryClass
from webclient.apps.myapp.library.library import LibraryClass
I am using PyCharm, and neither way complains about "unresolved references".
Is it better to import very specifically? Is the second import method more likely to avoid name collisions, if at all (say library/ has several .py files)?
Thanks.
You should always import names from where they're defined. That way if webclient.apps.myapp.library should stop importing LibraryClass one day, you won't break the other imports.
As a follow-up to Ignacio's answer, you should look at the documentation of the libraries you are using to see where it suggests you import things from. It may be that although LibraryClass is defined in webclient.apps.myapp.library.library, it is documented as being in webclient.apps.myapp.library; at some point the definition might be moved there, or to webclient.apps.myapp.library.oldversion, while still being accessible from webclient.apps.myapp.library.

python import depth

I've noticed that importing a module will import its functions and methods, and the functions and methods of those as well. Is there a set rule for how many levels down Python will import when you import an upper-level module?
edit
sorry, I think I've been misunderstood by the answers so far, which respond about multiple imports of shared dependencies. I'm thinking of nested packages, e.g. in Django: if you import django, you can access django.contrib.auth, but you can't access django.contrib.auth.views unless you import that specifically. I was just wondering if it's always two levels down in such a case
second edit
to clarify again: in the Django example, the layout is /django/contrib/auth/views.py, where each of the subfolders has an __init__.py making it a package, none of which define any __all__ attributes. Is my example bad, since maybe you can't use the dot syntax to navigate to a file within a package folder?
No, Python will import what it needs to import. However, each module is only imported once per process. For example, if one module does import sys and another module also does import sys, the second import will not physically load it again.
Not really. A module imports stuff from other modules because it needs to use them in that module, otherwise it'll break.
There is no pre-defined import depth level. Import statements are executed, just like any other python statement.
But, you may wonder, how are cycles avoided? Modules are added to sys.modules (i.e., cached) when they get imported for the first time, and that is the first location examined when an import statement is executed. So each module is loaded just once, although it may appear in many import statements.
