Python: Write function in module to import packages

I am trying to write a function, itself loaded from a module, to quickly import a bunch of modules globally.
I thought that, essentially, loaded modules could be treated as variables, so I tried:
def loadMods():
    global np
    import numpy as np
and when I then referred to np after calling loadMods(), there was no problem.
What I then did was to create a separate .py file called loadTest containing:
# loadTest module
# coding: utf-8
def loadMod():
    global np
    import numpy as np
Then I attempted to import numpy using this .py file in Python (2.7):
import loadTest
loadTest.loadMod()
but now, when I refer to np, I get:
File "<stdin>", line 1, in <module>
NameError: name 'np' is not defined
Why does this occur? Any help or alternative ways of doing this would be much appreciated. Thanks a bunch :)

Instead of making a function to do this, why not make another module? You could name it something like modules.py and put all of your imports in there:
import numpy as np
import os
import sys
...
Then, all you need to do is a wildcard import:
from modules import *
and everything will be made available.
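For instance, a consumer script (a hypothetical analysis.py, assuming modules.py is on the import path) can then use the names directly:
# analysis.py -- minimal usage sketch
from modules import *

print(np.arange(5))   # numpy is available as np
print(os.getcwd())    # so is os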

You must first define np at the top level of loadTest, then pull it into your own namespace after calling loadMod().
In loadTest:
np = None
Somewhere else:
import loadTest
loadTest.loadMod()
np = loadTest.np
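Put together, a minimal sketch of the two files (keeping the asker's names):
# loadTest.py
np = None              # placeholder at module level

def loadMod():
    global np          # "global" means global to loadTest, not to the importer
    import numpy as np # rebinds loadTest.np to the numpy module

# main script
import loadTest
loadTest.loadMod()
np = loadTest.np       # copy the reference into this module's namespace
print(np.pi)           # 3.141592653589793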

Related

Python - Importing packages by running a script

I have a script which is importing lots of packages, including import numpy as np.
I have lots of scripts which need to import all of these packages (including some of my own). To make my life easier, I have a file called mysetup.py on my path to import all the packages. It contains a function called import_my_stuff(), which includes the statement "import numpy as np".
I run "main.py", which runs the following:
from mysetup import *
import_my_stuff()
np.pi()
"mysetup.py"
def import_my_stuff():
import numpy as np
return
However, I am unable to use numpy in "main.py" - this code will fail. Any suggestions as to why?
The problem you are facing is a consequence of a very important feature of Python: namespaces.
https://docs.python.org/3/tutorial/classes.html#python-scopes-and-namespaces
https://realpython.com/python-namespaces-scope/
Basically, in your case, when you do that numpy import inside the import_my_stuff() function, you are binding the name np inside the function's namespace (scope, if you prefer).
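A minimal sketch of the failure mode (names taken from the question):
def import_my_stuff():
    import numpy as np   # binds "np" only in the function's local scope
    print(np.pi)         # works here, inside the function

import_my_stuff()        # prints 3.141592653589793
print(np.pi)             # NameError: name 'np' is not defined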
To solve your issue (keeping your approach; it is not the only way), simply import everything at the module top level, without a function encapsulating the imports:
mysetup.py:
import numpy as np
# other modules...
main.py:
from mysetup import *
print(np.pi)
Imports in functions are not the best idea.
But you can just define whatever imports you need in the top-level code of mysetup.py:
import numpy as np
and then it will be available when you import * from mysetup
from mysetup import *
print(np.pi)

Why does Python import a module's imports when importing *

Let's say I have a file where I'm importing some packages:
# myfile.py
import os
import re
import pathlib
def func(x, y):
    print(x, y)
If I go into another file and enter
from myfile import *
Not only does it import func, it also imports os, re, and pathlib.
But I DO NOT want those modules to be imported when I do import *.
Why is it importing the other packages I'm importing, and how do I avoid this?
The reason
Because import * imports every name in the module's namespace. If something has a name inside the module, it is a valid export.
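You can see this directly: the imported modules become attributes of myfile just like func does. A quick sketch:
import myfile
print([name for name in dir(myfile) if not name.startswith('_')])
# ['func', 'os', 'pathlib', 're'] -- the imports are names in the namespace too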
How to avoid
First of all, you should almost never be using import *. It's almost always clearer code to either import the specific methods/variables you're trying to use (from module import func), or to import the whole module and access methods/variables via dot notation (import module; ...; module.func()).
That said, if you must use from module import *, there are a few ways to prevent certain names from being exported from module:
Names starting with _ will not be imported by from module import *. They can still be imported directly (i.e. from module import _name), just not automatically. This means you can rename your imports so that they don't get exported, e.g. import os as _os. However, this also means that all the code in that module has to refer to _os instead of os, so you may have to modify a lot of code.
If a module contains the name __all__: List[str], then import * will export only the names contained in that list. In your example, add the line __all__ = ['func'] to your myfile.py, and then import * will only import func. See also this answer.
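A sketch of the asker's myfile.py with the __all__ fix applied:
# myfile.py
import os
import re
import pathlib

__all__ = ['func']   # "from myfile import *" now exports only func

def func(x, y):
    print(x, y)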
Here is the fix :)
from myfile import func
When you import *, you import everything from the module, which includes whatever you imported in the file you are importing from.
It has actually been discussed on Medium, but for simplification, I will answer it myself.
from <module/package> import * is a way to import all the names we can get in that specific module/package. For this very reason, most people don't actually use import *, and stick with import <module> instead.
Python's import essentially just runs the file you point it to import (it's not quite that but close enough). So if you import a module it will also import all the things the module imports. If you want to import only specific functions within the module, try:
from myfile import func
...which binds only myfile.func() in your namespace (the module's other imports still run, but their names are not pulled in).

How to share imports between modules?

My package contains a number of helper modules. Since these helpers all deal with the scipy stack, they all share common imports:
from matplotlib import pyplot as plt
import numpy as np
I'm wondering if it is possible to extract them out, and put it somewhere else, so I can reduce the duplicate code within each module?
You can create a file called my_imports.py which does all your imports and makes them available as * via the __all__ variable (note that the module names are declared as strings):
File my_imports.py:
import os, shutil
__all__ = ['os', 'shutil']
File your_other_file.py:
from my_imports import *
print(os.curdir)
Although you might want to be explicit in your other files:
File your_other_file.py:
from my_imports import os # or whichever you actually need.
print(os.curdir)
Still, this saves you having to specify the various sources each time — and can be done with a one-liner.
Alright, here is my tweak.
Create a file gemfile.py under the package dir, like this:
import numpy as np
from matplotlib import pyplot as plt
import matplotlib as mpl
Then, in other files, like app_helper.py:
from .gemfile import *
This idea comes from Can I use __init__.py to define global variables?
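For reference, the layout this assumes (the package name here is hypothetical):
mypackage/
    __init__.py
    gemfile.py       # the shared imports live here
    app_helper.py    # does "from .gemfile import *"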

Issues with Importing Modules into Python

I'm new to Python programming and have encountered an issue importing modules.
I have a main application (compare.py) with imports as follows:
# import the necessary packages
from skimage.measure import structural_similarity as ssim
import matplotlib.pyplot as plt
import numpy as np
import os
import skimage
from skimage import io
from skimage import color
from epilib import mse
from epilib import compare_images
and I have defined two functions in epilib, one called mse() and one called compare_images().
The code in mse() requires numpy. When I execute 'python compare.py', I get the following error message:
File "C:\Users\Dan\epilib.py", line 7, in mse err = np.sum((imageA.astype("float") - imageB.astype("float")) ** 2)
NameError: name 'np' is not defined
I assumed that because 'import numpy as np' was executed prior to import epilib, that the numpy library would be available to epilib? When I added 'import numpy as np' to the top of epilib, the issue resolved.
I don't see it as very efficient to have to move all the import statements to epilib. I was hoping to have epilib as just a library of functions and I could import into various python programs as required.
Is there a way to accomplish this?
That is not how Python works. If you want to use the numpy library in a module (in this case, the epilib module), you need to import it in that module as well; epilib does not see the numpy import from your compare.py.
You should import numpy in epilib.py as:
import numpy as np
I do not think there would be any efficiency issue: once Python imports a module for the first time, it caches it in sys.modules, so whenever you re-import it (even from a different module), as long as it is the same Python process, Python does not re-import it; it returns the module object from sys.modules.
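A minimal sketch showing the cache at work:
import sys

import numpy as np                     # first import: numpy's init code runs once
print('numpy' in sys.modules)          # True

import numpy                           # later imports are just a dict lookup
print(numpy is sys.modules['numpy'])   # True: the very same module object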

Importing imports within a function - Python 2.6

I have two files, SysDump.py and libApi.py in the same folder.
In SysDump I do:
from libApi._SysDump import *
In libApi I have:
def _SysDump():
    import cPickle as _cPickle
    import math as _math
    from zipfile import ZipFile as _ZipFile
    import re as _re
However I get the error:
from libApi._SysDump import *
ImportError: No module named _SysDump
I use VS2012+PTVS to step through the code; the execution trace reaches def _SysDump() in libApi as I step through, but does not enter it. Question is: how do I make this work, in Python 2.6 only, please?
from libApi._SysDump import *
When writing this, Python looks for a package libApi and a module in it called _SysDump. A package is equivalent to a folder and a module is a single file. From your explanation, that is not the situation you have: you have a module libApi with a function _SysDump. So if anything, you could do this:
from libApi import _SysDump
So you would get a reference to the _SysDump function. Note that running that function will not give you references to all the modules you are trying to import. Inside the function, the modules will be imported and assigned to local variables. After the function ends, those references are gone.
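A sketch of why the references disappear (using just math here, rather than the asker's full list):
def _SysDump():
    import math as _math
    print(_math.pi)    # fine: _math is a local name inside the function

_SysDump()             # prints 3.14159265359
_math                  # NameError: _math only ever existed inside the function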
If you want to have some module take care of all your imports, you could make a file that performs those imports and import everything from that module:
# imports.py
import cPickle as _cPickle
import math as _math
from zipfile import ZipFile as _ZipFile
import re as _re
And then:
from imports import *
