Is it possible to import a test.py file once for the whole project, instead of importing it in each file?
For example, by calling it somewhere in start.py.
No.
However, suppose you have three modules, a.py, b.py, and c.py. Further, assume that c.py imports b, and b.py imports a. Then in c.py, you can refer to items in a using
b.a.foo
since b imported a into its namespace.
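For example, a minimal sketch of that chain (the module names and foo are taken from the lines above):
# a.py
def foo():
    print("foo from a")

# b.py
import a          # a becomes an attribute of module b

# c.py
import b
b.a.foo()         # reaches a's foo through b's namespace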
Even if it is possible (which in programming is almost always the case), a better question is why you would want to.
It would be difficult to do, and it adds nothing... in fact it can severely hinder development, and you may run into very strange bugs that become very hard to track down.
[edit based on OP comment]
Your use case sounds like you just want it to be in builtins.
start.py
import __builtin__
import my_weird_test_class
__builtin__.tester = my_weird_test_class
from main_entry import main
main()
now in any file you can use
tester.do_something()
without importing tester or whatever
But as I said, this tends to be a very bad idea... it is much better to explicitly import it in all your files.
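For completeness, on Python 3 the module is named builtins rather than __builtin__; the same sketch (with the same placeholder names as above) would look like this:
# start.py (Python 3 variant of the sketch above)
import builtins
import my_weird_test_class

builtins.tester = my_weird_test_class   # "tester" is now visible everywhere

from main_entry import main
main()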
Related
I'm building a small Python package. Following https://packaging.python.org/tutorials/packaging-projects/, I have a file structure like so:
pkg/
- src/
- pkg/
- __init__.py
- file.py
with ClassA, ClassB, etc. defined in file.py.
I've installed this package on my system. I can now do things like import pkg.file
in an interpreter, which is great. However, it gives me access to everything in file.py that does not start with _, including all the imports, global variables, etc. that live in that file. I'm happy with pkg.file.ClassA; less so with, for instance, pkg.file.itertools, or pkg.file.PI. It just doesn't feel very clean.
What would be the best practice here? Modifying my import statements in file.py as import itertools as _itertools? Some Pythonic trickery in the __init__.py file? I thought of adding from file import ClassA, ClassB to it, but it doesn't seem very DRY to me. Additionally, file.py is likely to be split into two or more files in the near future.
Alright, so I came up with a two-stage process:
setting __all__ = ['ClassA', 'ClassB'] at the top level in file.py;
adding from .file import * in __init__.py.
This way, on import pkg I have direct access to my classes in the pkg namespace. As a side effect, I'm quite happy with this flattening of the hierarchy!
pkg.file.whatever is still accessible; a way to prevent that would be great (for cleanliness if nothing else), but I can live with it.
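For reference, a minimal sketch of that setup (class bodies elided):
# src/pkg/file.py
__all__ = ['ClassA', 'ClassB']   # only these names are exported by "from .file import *"

import itertools

class ClassA: ...
class ClassB: ...

# src/pkg/__init__.py
from .file import *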
I am a Python beginner, and am currently learning how to import modules in Python.
So my question is:
Suppose I currently have three Python files: module1.py, module2.py, and module3.py.
In module1.py:
def function1():
    print('Hello')
In module2.py, in order to use those functions in module1.py:
import module1

# Also, I have some other public functions in this .py file
def function2():
    print('Goodbye')

# Use the function in module1.py
if __name__ == '__main__':
    module1.function1()
    function2()
In module3.py, I would like to use both the functions from module1.py and module2.py.
import module1
import module2

def function3():
    print('Nice to meet you')

if __name__ == '__main__':
    module1.function1()
    function3()
    module2.function2()
Seems like it works. But my questions are mainly about module3.py: there I imported both module1 and module2, even though module1 is already imported by module2. I am just wondering whether this is a good way to code. Is it effective? Should I do this, or should I avoid it, and why?
Thank you so much. I am just a beginner, so if I ask stupid questions, please forgive me. Thank you!!
There will be no problem if you avoid circular imports, that is, if you never import a module that itself imports the current importing module.
A module does not see the importer's namespace, so imports made in the importer's code do not become globals in the imported module.
Also, module top-level code runs on first import only.
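A tiny illustration of that last point (the file names are hypothetical):
# demo.py
print("demo imported")   # top-level code

# main.py
import demo   # prints "demo imported"
import demo   # no output: the module is cached in sys.modules and not re-run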
Edit 1:
I am answering Filipe's comments here because it's easier.
"There will be no problem if you avoid circular imports" -> This is incorrect, python is fine with circular imports for the most part."
The fact that you sensed some misconception of mine doesn't make that particular statement incorrect. It is correct, and it is good advice.
(Saying it's fine for the most part sounds a bit like saying something will run fine most of the time...)
I see what you mean. I avoid it so much that I even thought your first example would give an error right away (it doesn't). You mean there is no need to avoid it because most of the time (actually given certain conditions) Python will go fine with it. I am also certain that there are cases where circular imports would be the easiest solution. That doesn't mean we should use them if we have a choice. That would promote the use of a bad architecture, where every module starts depending on every other.
It also means the coder has to be aware of the caveats.
This link I found here on SO states some of the worries about circular imports.
The previous link is somewhat old, so the info may be outdated by newer Python versions, but import confusion is even older and still applies to 3.6.2.
The example you give works well because the relevant or initialization module code is wrapped in a function and will not run at import time. Protecting code with an if __name__ == "__main__": guard also keeps it from running when imported.
Something simple like this (the same example from effbot.org) won't work (remember OP says he is a beginner):
# file y.py
import x
x.func1()
# file x.py
import y
def func1():
    print('printing from x.func1')
On your second comment you say:
"This is also incorrect. An imported module will become part of the namespace"
Yes. But I didn't mention that, nor its contrary. I just said that an imported module's code doesn't know the namespace of the code making the import.
To eliminate the ambiguity, this is all I meant:
# w.py
def funcw():
    print(z_var)
# z.py
import w
z_var = 'foo'
w.funcw() # error: z_var undefined in w module namespace
Running z.py gives the stated error. That's all that I meant.
Now going further, to get the access we want, we go circular...
# w.py
import z # go circular
def funcw():
    '''Notice that we gain access not to the z module that imported
    us but to the z module we import (yes, it's the same file, but it
    carries a different namespace). So the reference we obtain
    points to a different object, because it really is in a
    different namespace.'''
    print(z.z_var, id(z.z_var))
...and we protect some code from running with the import:
# z.py
import w

z_var = ['foo']

if __name__ == '__main__':
    print(z_var, id(z_var))
    w.funcw()
By running z.py we confirm the objects are different (they can be the same with immutables, but that is Python interning - an internal optimization, or implementation detail - at work):
['foo'] 139791984046856
['foo'] 139791984046536
Finally, I agree with your third comment about being explicit with imports.
Anyway, I thank you for your comments. I actually improved my understanding of the problem because of them (we don't learn much about something by just avoiding it).
Goal
I want to be able to import (in __init__.py) all functions from every single file inside my package.
Usage
For example, in this folder structure:
manage.py
- scripts/
-- __init__.py
-- tests.py
-- deploy.py
I am currently doing the following:
manage.py:
from scripts import *
scripts/__init__.py:
from .tests import *
from .deploy import *
But every time I add another file to the package I have to add an import line to scripts/__init__.py, which is kind of annoying.
You can do it, manually, but you shouldn't.
Why you really do not want to do this:
You'll end up with a namespace where understanding what is what and where it came from will be extremely hard, with the difficulty increasing as the overall project grows. Apart from being completely unintuitive in Python, think of anybody else who might read your code, or even worse, think of yourself re-reading it after a month and not remembering what's going on. You don't need that in your life.
In addition, any functions you expose to the importer that overlap with functions from other modules will be shadowed by the most recently imported one. As an example, think of two scripts that contain the same function foo() and watch what happens.
>>> from scrpt1 import *
>>> foo()
Script 1
>>> from scrpt2 import *
>>> foo()
Script 2
You don't need that in your life either, especially when it is so easy to avoid by being explicit.
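For contrast, a minimal sketch of the explicit alternative, using the tests.py and deploy.py files from the question (the function names are hypothetical):
# manage.py -- explicit imports instead of "from scripts import *"
from scripts.tests import run_tests        # hypothetical function name
from scripts.deploy import deploy_site     # hypothetical function name

run_tests()
deploy_site()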
Here are some related lines from the Zen of Python (the text shown by import this):
Explicit is better than implicit.
Be explicit about the place where your functions are defined. Don't "spaghetti" your code. You'll want to kick yourself in the future if you opt for a mess of everything in one place.
Special cases aren't special enough to break the rules.
Really self-explanatory.
Namespaces are one honking great idea -- let's do more of those!
"more of those!", not less; don't miss out on how wonderful namespaces are. Python is based on them; segregating your code in different namespaces is the foundation of organizing code.
importlib allows you to import any Python module from its string name. You can automate this by going through the list of files in the package path.
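A minimal sketch of that automation for scripts/__init__.py, assuming every public (non-underscore) name of every submodule should be re-exported:
# scripts/__init__.py -- auto-import every submodule and re-export its
# public names, so new files no longer require editing this file by hand.
import importlib
import pkgutil

for _finder, _name, _is_pkg in pkgutil.iter_modules(__path__):
    _module = importlib.import_module(f".{_name}", __package__)
    for _attr in dir(_module):
        if not _attr.startswith("_"):
            globals()[_attr] = getattr(_module, _attr)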
It's more pythonic to use __all__. Check here for more details.
I want to make use of an existing python module (called "module.py"). I'm only interested in one function from that module ("my_function()"). The module also contains a lot of other functions, which I'm not using. These other functions cause the module to have a lot of imports that are not used in my_function.
"""module.py"""
import useful_import
import useless_import1
import useless_import2
def my_function():
    return useful_import.do()

def other_function1():
    return useless_import1.do()

def other_function2():
    return useless_import2.do()
The code I've written (main.py) imports only my_function, but it still requires me to include/install the other useless modules. I've checked and none of the useless modules run any code on import, so I should be able to safely remove them.
"""main.py"""
from module import my_function
print(my_function())
How do I best deal with this?
Should I include the useless imports in my project anyway?
Should I make a copy of module.py and edit it so that it only contains my_function and the right imports?
Should I copy my_function and its imports into main.py?
(some other option I didn't think/know of)?
It kind of depends on the context, e.g. how this code will be used later, who will maintain it, what kind of code it really is, etc.
But my suggestion under most circumstances would be:
refactor my_function and its needed imports into a new_module.py
use this module in main.py
Either remove module.py from your code base, or have it import from new_module
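A minimal sketch of that refactor, reusing the names from the question (new_module.py is the new file):
# new_module.py -- only what my_function actually needs
import useful_import

def my_function():
    return useful_import.do()


# module.py -- optional shim so existing callers keep working unchanged
from new_module import my_function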
I'm taking a look at how the model system in django works and I noticed something that I don't understand.
I know that you create an empty __init__.py file to specify that the current directory is a package. And that you can set some variable in __init__.py so that import * works properly.
But django adds a bunch of from ... import ... statements and defines a bunch of classes in __init__.py. Why? Doesn't this just make things look messy? Is there a reason that requires this code in __init__.py?
All imports in __init__.py are made available when you import the package (directory) that contains it.
Example:
./dir/__init__.py:
import something
./test.py:
import dir
# can now use dir.something
EDIT: forgot to mention, the code in __init__.py runs the first time you import any module from that directory. So it's normally a good place to put any package-level initialisation code.
EDIT2: dgrant pointed out a possible confusion in my example. In __init__.py, import something can import any module, not necessarily one from the package. For example, we can replace it with import datetime; then, in our top-level test.py, both of these snippets will work:
import dir
print(dir.datetime.datetime.now())
and
import dir.some_module_in_dir
print(dir.datetime.datetime.now())
The bottom line is: all names assigned in __init__.py, be it imported modules, functions or classes, are automatically available in the package namespace whenever you import the package or a module in the package.
It's just personal preference really, and has to do with the layout of your python modules.
Let's say you have a module called erikutils. There are two ways that it can be a module, either you have a file called erikutils.py on your sys.path or you have a directory called erikutils on your sys.path with an empty __init__.py file inside it. Then let's say you have a bunch of modules called fileutils, procutils, parseutils and you want those to be sub-modules under erikutils. So you make some .py files called fileutils.py, procutils.py, and parseutils.py:
erikutils/
    __init__.py
    fileutils.py
    procutils.py
    parseutils.py
Maybe you have a few functions that just don't belong in the fileutils, procutils, or parseutils modules. And let's say you don't feel like creating a new module called miscutils. AND, you'd like to be able to call the function like so:
erikutils.foo()
erikutils.bar()
rather than doing
erikutils.miscutils.foo()
erikutils.miscutils.bar()
So because the erikutils module is a directory, not a file, we have to define its functions inside the __init__.py file.
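A minimal sketch of that __init__.py (the function bodies are placeholders):
# erikutils/__init__.py
def foo():
    print("foo")

def bar():
    print("bar")

# the submodules remain importable as erikutils.fileutils, erikutils.procutils, etc.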
In django, the best example I can think of is django.db.models.fields. ALL the django *Field classes are defined in the __init__.py file of the django/db/models/fields directory. I guess they did this because they didn't want to cram everything into a hypothetical django/db/models/fields.py module, so they split it out into a few submodules (related.py and files.py, for example) and stuck the *Field definitions in the fields module itself (hence, __init__.py).
Using the __init__.py file allows you to make the internal package structure invisible from the outside. If the internal structure changes (e.g. because you split one fat module into two) you only have to adjust the __init__.py file, but not the code that depends on the package. You can also make parts of your package invisible, e.g. if they are not ready for general usage.
Note that you can use the del command, so a typical __init__.py may look like this:
from .somemodule import some_function1, some_function2, SomeObject
del somemodule
Now if you decide to split somemodule the new __init__.py might be:
from .somemodule1 import some_function1, some_function2
from .somemodule2 import SomeObject
del somemodule1
del somemodule2
From the outside the package still looks exactly as before.
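For instance, client code keeps working unchanged across both versions of __init__.py (the package name here is hypothetical):
# client.py -- unaffected by how the package is split internally
from mypackage import some_function1, SomeObject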
"We recommend not putting much code in an __init__.py file, though. Programmers do not expect actual logic to happen in this file, and much like with from x import *, it can trip them up if they are looking for the declaration of a particular piece of code and can't find it until they check __init__.py. "
-- Python Object-Oriented Programming, Fourth Edition, by Steven F. Lott and Dusty Phillips