I have a Python program that starts with a block of code where I basically import some modules, initialize some variables, and call a few functions. Here's part of it:
import os
import math
import numpy as np
import scipy as sp
import scipy.optimize as opt
import scipy.constants as const
import random
import time
if os.name == 'nt': os.system('cls')
if os.name == 'posix': os.system('clear')
rows, columns = os.popen('stty size', 'r').read().split()
Inclination = math.radians(INCLINATION)
Period = PERIOD*const.day
Is there a way I can put all of this into one single module and just call it? I tried putting all of this into an external module and importing it, but as far as I understood, everything does get executed, only locally in that module, not in the main code.
The idea would be to be able to also use this "initialization module" in multiple programs.
Did you try putting all of that into some other .py file, and then just doing from x import *? Then you should have all of those modules and constants in whatever file you imported it into.
EDIT: If you're worried about all of that being executed multiple times, don't be. On an import, Python checks whether a module has already been loaded before it goes and loads it again. For example, say we have these files:
fileA.py => from initializer import *
fileB.py => import initializer
fileC.py => import fileA, fileB
When you run fileC.py, the code in initializer.py is only run once, even though both fileA and fileB successfully load it, and even though they do so in different ways.
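You can see this caching from a single script, too; here the stdlib json module stands in for initializer:

```python
import sys

import json            # first import: runs json's top-level code and caches it
import json as json2   # second import: just a lookup in sys.modules

print(json is json2)                # True -- the very same module object
print(sys.modules['json'] is json)  # True -- served from the import cache
```

However a module is re-imported later (plain import, aliased, or star-import), Python hands back the cached object from sys.modules instead of executing the file again.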
You don't need any special mechanism. When you import this module, Python goes through it, all the values are initialized, and you can use them. Just import it, and that's all.
I have to develop Python code for an in-house framework which can either run in the system's standard Python environment or sometimes requires a custom environment to be active, for example to be able to load certain modules.
Because an ImportError might not be too obvious to every user, I would like to give the user a proper error message explaining the issue.
Some sample code might look like this:
# standard imports...
import sys
import numpy as np
# import which requires special environment
import myspecialmodule
# [...]
One method would be to check for ImportErrors like this:
# standard imports...
import sys
import numpy as np
# import which requires special environment
try:
    import myspecialmodule
except ImportError:
    print('not in the env', file=sys.stderr)
    sys.exit(1)
# [...]
However, that is quite tedious, especially if there are many such scripts or many imports. And if you don't want to repeat the try/except several times, you can no longer tell which import failed.
Now, I wrote a function guard() which checks for the existence of the environment in another way:
import sys
import os
def guard():
    # assume the environment sets this special variable
    if 'MYSPECIALENV' in os.environ:
        # do more checks if the environment is correctly loaded
        # if it is, simply:
        return
    print("The environment is not loaded correctly. "
          "Run this script only in the special environment", file=sys.stderr)
    sys.exit(1)
I then altered the imports in the script:
# standard imports...
import sys
import numpy as np
# import which requires special environment
guard()
import myspecialmodule
# [...]
The advantage over the try/except method is that a genuine ImportError is still raised even when the environment is loaded correctly.
But the issue is that code linters like isort do not like function calls before imports.
Now, one could add configuration to isort to skip this section...
Furthermore, for a reader of the code, it might not be too obvious what is going on.
Of course, I could add some comments explaining it...
Another method I thought of is to write a module which does the job of guard, i.e.:
# file: guard/__init__.py
import sys
import os
if 'MYSPECIALENV' not in os.environ:
    print("The environment is not loaded correctly. "
          "Run this script only in the special environment", file=sys.stderr)
    sys.exit(1)
# standard imports...
import sys
import numpy as np
# import which requires special environment
import guard
import myspecialmodule
# [...]
but this might be even less obvious, especially as the imports might be sorted into a different order (again, thinking about isort).
Is there a better, more pythonic way to do such things before importing a module? Especially one that is obvious for a future developer as well.
Let's say I have a file where I'm importing some packages:
# myfile.py
import os
import re
import pathlib
def func(x, y):
    print(x, y)
If I go into another file and enter
from myfile import *
Not only does it import func, it also imports os, re, and pathlib, but I DO NOT want those modules to be imported when I do import *.
Why is it importing the other packages I'm importing and how do you avoid this?
The reason
Because import * imports every name in the module's namespace. If something is bound to a name inside the module, including other modules imported there, it's a valid candidate for export.
How to avoid
First of all, you should almost never be using import *. It's almost always clearer code to either import the specific methods/variables you're trying to use (from module import func), or to import the whole module and access methods/variables via dot notation (import module; ...; module.func()).
That said, if you must use from module import *, there are a few ways to prevent certain names from being exported from module:
Names starting with _ will not be imported by from module import *. They can still be imported directly (i.e. from module import _name), just not automatically. This means you can rename your imports so that they don't get exported, e.g. import os as _os. However, this also means that all the code in that module has to refer to _os instead of os, so you may have to modify a lot of code.
If a module contains the name __all__ (a list of strings), then import * will export only the names contained in that list. In your example, add the line __all__ = ['func'] to your myfile.py, and then import * will only import func.
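A self-contained way to see the __all__ filtering in action: the example module is written to a temporary directory, and the star-import is emulated with exec into a fresh namespace so it can run anywhere in a script.

```python
import sys, tempfile, pathlib

# Write the example module from the question to a temporary directory.
moddir = tempfile.mkdtemp()
pathlib.Path(moddir, 'myfile.py').write_text(
    "import os\n"
    "import re\n"
    "\n"
    "__all__ = ['func']\n"
    "\n"
    "def func(x, y):\n"
    "    print(x, y)\n"
)
sys.path.insert(0, moddir)

# Emulate "from myfile import *" into a fresh namespace.
ns = {}
exec('from myfile import *', ns)

names = sorted(k for k in ns if not k.startswith('__'))
print(names)  # ['func'] -- os and re are filtered out by __all__
```

Without the __all__ line, os, re, and func would all land in the importing namespace.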
Here is the fix :)
from myfile import func
When you use import *, you import everything from the module, which includes whatever you imported in the file you are importing from.
It has actually been discussed on Medium, but for simplification, I will answer it myself.
from <module/package> import * is a way to import all the names available in that specific module/package. For exactly this reason, most people don't actually use import *, and stick with import <module> instead.
Python's import essentially just runs the file you point it at (it's not quite that, but close enough). So if you import a module, you also import all the things that module imports. If you want to import only specific functions from the module, try:
from myfile import func
...which would import only myfile.func() instead of the other things as well.
So I have two separate Python packages that I'm importing into my Python script for Raspberry Pi, in this case as:
from rflib import *
from Rpi.GPIO import *
However, both packages have their own separate method cleanup(self).
So, at the end of the script when I use the command
cleanup(), how do I a) know which package the method is coming from (they both do utterly different things) and b) control which one is run?
I've looked into similarly named questions, but they seem to deal with inheritance and overloading rather than package imports.
from <module> import * takes all the names in a module (those that don't start with a _, or everything listed in <module>.__all__) and assigns those names as globals in your current module.
If two modules define the same name, that means that the last one imported this way wins; cleanup = rflib.cleanup is replaced by cleanup = Rpi.GPIO.cleanup with the second from Rpi.GPIO import * statement.
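The last-import-wins behavior is easy to demonstrate without the Raspberry Pi libraries: ntpath and posixpath are two stdlib modules that both define a function named join, standing in here for rflib.cleanup and Rpi.GPIO.cleanup. The star-imports are emulated with exec into one shared namespace.

```python
# One namespace receiving two star-imports, as in the question.
ns = {}

exec('from ntpath import *', ns)
print(ns['join'].__module__)   # ntpath

exec('from posixpath import *', ns)
print(ns['join'].__module__)   # posixpath: the later star-import rebound join
```

After the second import, the first module's join is simply no longer reachable under that name, exactly as with the two cleanup functions.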
You generally want to avoid using from <module> import *. Import specific names, or just the module itself.
In this case, you can do:
from rflib import cleanup as rflib_cleanup
from Rpi.GPIO import cleanup as rpigpio_cleanup
which would bind those two functions as different names.
Or you could just import the modules:
import rflib
from Rpi import GPIO
which gives you only the rflib and GPIO names, each a reference to the module object, so now you can reference each of the cleanup functions as attributes on those modules:
rflib.cleanup()
GPIO.cleanup()
If you need to use a lot of names from either module, the latter style is preferred because it limits the number of imports you need to do, keeps your namespace clean and uncluttered, and gives you more context wherever those names are used in your code.
It is not good practice to use * with import. You should be doing something like:
import rflib
from Rpi import GPIO
# Clean Up 1
rflib.cleanup()
#Clean Up 2
GPIO.cleanup()
An additional piece of information:
In case your files/objects have the same name, you should use as with import. For example:
from file1 import utils as file1_utils
from file2 import utils as file2_utils
file1_utils.my_func()
file2_utils.my_func()
When we import a module, say os, aren't we importing everything in it?
Then what's the use of from moduleName import ...? Why does it need to be added to the file in order for us to use its constants and a bunch of other things?
Can anyone explain what exactly from moduleName import does when we have already loaded the module using import?
When you just do import sys, for example, you do import everything in it. When you do from sys import exit, you import that specific name so it can be used without the module name as a prefix. Basically, if you use the from sys import exit statement you can just call:
exit()
Instead of:
sys.exit()
It's just a way to save time by not writing the full sys.exit() statement. If you use it to load constants, you just allow yourself to write shorter statements. If you have questions, just ask!
Suppose I want to use os.path.abspath. I can import os, and type os.path.abspath every time I want to use it. Or I can write from os.path import abspath, and now I just need to type abspath.
The utility of something like:
import os
from os.path import abspath
Is that I can still reference other objects defined in os, like os.path.splitext, but if I use abspath frequently, I only need to type abspath.
Is there a place where I can put default imports for all my modules?
If you want default imports when using the Python shell, you can also set the PYTHONSTARTUP environment variable to point to a Python file that will be executed whenever you start the shell. Put all your default imports in this file.
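For example, a minimal startup file might look like this (the path is hypothetical; point PYTHONSTARTUP at wherever you keep it):

```python
# ~/.pythonstartup -- executed automatically by the interactive shell when
# the PYTHONSTARTUP environment variable points at this file, e.g. put
#   export PYTHONSTARTUP=~/.pythonstartup
# in your shell profile.
import os
import sys
import math
from pprint import pprint
```

Note that PYTHONSTARTUP only affects the interactive interpreter; scripts run with python myscript.py will not execute it.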
Yes, just create a separate module and import it into yours.
Example:
# my_imports.py
'''Here go all of my imports'''
import sys
import functools
from contextlib import contextmanager # This is a long name, no chance to confuse it.
....
# something1.py
'''One of my project files.'''
from my_imports import *
....
# something2.py
'''Another project file.'''
from my_imports import *
....
Note that according to standard guidelines, from module import * should be avoided. If you're managing a small project with several files that need common imports, I think you'll be fine with from module import *, but it still would be a better idea to refactor your code so that different files need different imports.
So do it like this:
# something1.py
'''One of my project files. Takes care of main cycle.'''
import sys
....
# something2.py
'''Another project file. Main program logic.'''
import functools
from contextlib import contextmanager # This is a long name, no chance to confuse it.
....