How to share imports between modules? - python

My package contains a number of helper modules. Since these helpers all deal with scipy, they share common imports:
from matplotlib import pyplot as plt
import numpy as np
I'm wondering if it is possible to extract them out, and put it somewhere else, so I can reduce the duplicate code within each module?

You can create a file called my_imports.py which does all your imports and makes them available as * via the __all__ variable (note that the module names are declared as strings):
File my_imports.py:
import os, shutil
__all__ = ['os', 'shutil']
File your_other_file.py:
from my_imports import *
print(os.curdir)
Although you might want to be explicit in your other files:
File your_other_file.py:
from my_imports import os # or whichever you actually need.
print(os.curdir)
Still, this saves you having to specify the various sources each time — and can be done with a one-liner.
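For the imports in the question, a minimal sketch of the same idea (treating app_helper.py as one of the helper modules; the plotting function is just an illustration, not from the question):
File my_imports.py:
import numpy as np
from matplotlib import pyplot as plt
__all__ = ['np', 'plt']
File app_helper.py:
from my_imports import *  # brings in np and plt only
def plot_squares(n):
    xs = np.arange(n)
    plt.plot(xs, xs ** 2)
    plt.show()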

Alright, here is my tweak.
Create a file, e.g. gemfile.py, under the package directory, like this:
import numpy as np
from matplotlib import pyplot as plt
import matplotlib as mpl
Then, in other files, like app_helper.py:
from .gemfile import *
This comes from here: Can I use __init__.py to define global variables?
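Put together, the resulting layout might look like this (the package name mypkg is a placeholder):
mypkg/
    __init__.py
    gemfile.py      # the shared imports: np, plt, mpl
    app_helper.py   # starts with: from .gemfile import *
File app_helper.py:
from .gemfile import *
def quick_plot(data):
    # np and plt come from gemfile via the star import
    plt.plot(np.asarray(data))
    plt.show()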

Related

Python - Importing packages by running a script

I have a script which imports lots of packages, including numpy (import numpy as np).
I have lots of scripts which need to import all of these packages (including some of my own). To make my life easier, I have a file called mysetup.py on my path that imports all the packages. It contains the statement import numpy as np inside a function.
I run "main.py". It runs the following
from mysetup import *
import_my_stuff()
np.pi()
"mysetup.py"
def import_my_stuff():
    import numpy as np
    return
However, I am unable to use numpy in "main.py" - this code will fail. Any suggestions as to why?
The problem you are facing is a consequence of a very important feature of Python: namespaces.
https://docs.python.org/3/tutorial/classes.html#python-scopes-and-namespaces
https://realpython.com/python-namespaces-scope/
Basically, in your case, when you do the numpy import inside the import_my_stuff function, you are binding the name np inside the function's namespace (scope, if you prefer).
To solve your issue (the way you are doing; not the only way), you should simply import everything at the module top level (without a function encapsulating the imports):
mysetup.py:
import numpy as np
# other modules...
main.py:
from mysetup import *
print(np.pi)
Imports in functions are not the best idea.
But you can just put whatever imports you need in the top-level code of mysetup.py
import numpy as np
and then it will be available when you import * from mysetup
from mysetup import *
print(np.pi)

Why does Python import a module's imports when importing *

Let's say I have a file where I'm importing some packages:
# myfile.py
import os
import re
import pathlib
def func(x, y):
    print(x, y)
If I go into another file and enter
from myfile import *
Not only does it import func, but it also imports os, re, and pathlib,
but I DO NOT want those modules to be imported when I do import *.
Why is it importing the other packages I'm importing and how do you avoid this?
The reason
Because import * imports every name in the module's namespace. If something has a name inside the module, it is eligible to be exported.
How to avoid
First of all, you should almost never be using import *. It's almost always clearer code to either import the specific methods/variables you're trying to use (from module import func), or to import the whole module and access methods/variables via dot notation (import module; ...; module.func()).
That said, if you must use from module import *, there are a few ways to prevent certain names from being exported from module:
Names starting with _ will not be imported by from module import *. They can still be imported directly (i.e. from module import _name), just not automatically. This means you can rename your imports so that they don't get exported, e.g. import os as _os. However, this also means that all the code in that module has to refer to _os instead of os, so you may have to modify a lot of code.
If a module contains the name __all__: List[str], then import * will export only the names contained in that list. In your example, add the line __all__ = ['func'] to your myfile.py, and then import * will only import func. See also this answer. Both options are sketched below.
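A minimal sketch combining both options, using the names from the question:
# myfile.py
import os as _os          # underscore-prefixed alias: not exported by a star-import
import re as _re
import pathlib as _pathlib

__all__ = ['func']        # from myfile import * will now export only func

def func(x, y):
    print(x, y)
# another file
from myfile import *
func(1, 2)                # works
# os, re and pathlib are NOT defined here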
from myfile import func
Here is the fix :)
When you import *, you import everything from the module, which includes whatever you imported in the source file.
It has actually been discussed on Medium, but for simplicity, I will answer it myself.
from <module/package> import * is a way to import all the names we can get from that specific module/package. For that reason most people don't actually use import * and stick with import <module> instead.
Python's import essentially just runs the file you point it at (it's not quite that, but close enough). So if you import a module, it will also import all the things the module imports. If you want to import only specific functions from the module, try:
from myfile import func
...which would import only myfile.func() instead of the other things as well.

What's the purpose of the file "pylab.py"

I looked at the file "pylab.py" in matplotlib's directory and found that it contains a great bunch of imports, and then defines a single variable "bytes" on the last line. Here are the last several lines of this file:
from numpy.fft import *
from numpy.random import *
from numpy.linalg import *
import numpy as np
import numpy.ma as ma
# don't let numpy's datetime hide stdlib
import datetime
# This is needed, or bytes will be numpy.random.bytes from
# "from numpy.random import *" above
bytes = six.moves.builtins.bytes
I wonder what the purpose of such a file is when it only seems to define a single, useless variable. And what is the purpose of writing code like from matplotlib import pylab?
The matplotlib docs say:
pylab is a convenience module that bulk imports matplotlib.pyplot (for plotting) and numpy (for mathematics and working with arrays) in a single name space. Although many examples use pylab, it is no longer recommended.
So for example, you can do
>>> from pylab import *
And you have imported all the names imported by pylab into your local namespace. This is convenient when using the interactive shell.
Additionally, pylab imports datetime and bytes. This is because the from numpy.foo import * statements import numpy objects named bytes and datetime which are not the same as the standard python objects with these names, so they need to be overridden with the standard versions.
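A minimal sketch of the shadowing that those last lines guard against (run in a fresh interpreter; six.moves.builtins is just a portable alias for the builtins module):
from numpy.random import *   # brings in a function named bytes, shadowing the builtin
print(bytes(4))              # now calls numpy.random.bytes: 4 random bytes

import builtins
bytes = builtins.bytes       # restore the stdlib type, as pylab does
print(bytes(4))              # back to the builtin: b'\x00\x00\x00\x00'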
The practice of importing names into a module just so other modules can import them from there instead of the original module is not unusual. For example, given this module:
$ cat foo/__init__.py
from bar import *
from baz.quux import *
from spam import eggs
Other modules can do from foo import eggs rather than from foo.spam import eggs. Apart from the convenience of less typing, this approach hides the internal structure of the foo package from its clients. As long as they import from the top level module they need not be concerned that the internal structure of the package may change over time. This is a form of the facade design pattern.
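From the client's side (foo, spam and eggs being the placeholder names from the listing above):
from foo import eggs         # goes through the package facade
# rather than:
# from foo.spam import eggs  # couples the client to foo's internal layout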

Way to run all packages I want in python script

There are many times when I want to use the same packages in my scripts; I mostly copy-paste the imports I want from my last script. I want to stop doing this and run all of them with one simple function. Today I tried this:
def econometrics():
    print("Econometrics is starting")
    import pandas as pd
    import numpy as np
    import statsmodels.formula.api as smf
    import statsmodels.api as sm
    import matplotlib.pyplot as plt
    print("Econometrics is started")
econometrics()
The function runs without error, but when I call some method from the packages, I get errors like this:
name 'plt' is not defined
What is wrong with that code? Is there any way to define a function that does that?
What is wrong with that code?
Simple answer: Variable scope. plt (and the others) are only accessible from within the econometrics function.
Try making one file, named importer.py, for example
import pandas as pd
import numpy as np
import statsmodels.formula.api as smf
import statsmodels.api as sm
import matplotlib.pyplot as plt
Then in your other code (that is in the same directory),
from importer import *
Using an __init__.py is probably the recommended way to approach this, though it wasn't clear whether you have a module/package layout or not. A sketch of that approach follows the list below.
If you do, then use
Relative import (same directory): from . import *
Absolute import (use module name): from some_module import *
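A minimal sketch of the __init__.py approach (the package name tools is a placeholder, not from the question):
File tools/__init__.py:
import pandas as pd
import numpy as np
import matplotlib.pyplot as plt
File your script (next to the tools/ directory):
from tools import *
plt.plot([1, 2, 3])
plt.show()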
Your intent doesn't fit Python's scoping rules. Within your code, the names are scoped to the function: when you do your imports, you're creating a bunch of names within the econometrics function's scope, and thus those names are only reachable inside that function.
So, let's take a simpler example:
>>> def foobar():
...     a = 1
...     b = 2
...
>>> foobar()
>>> a
NameError: name 'a' is not defined
Here, a and b only exist within foobar's function scope, so they are out of reach at the main (module) scope.
To do what you want, the way you want it, you should declare your variable as belonging to the global scope:
def econometrics():
    global pd, np, smf, sm, plt
    print("Econometrics is starting")
    import pandas as pd
    import numpy as np
    import statsmodels.formula.api as smf
    import statsmodels.api as sm
    import matplotlib.pyplot as plt
    print("Econometrics is started")
econometrics()
So to get back to the foobar example:
>>> def foobar():
...     global a, b
...     a = 1
...     b = 2
...
>>> foobar()
>>> a
1
>>> b
2
Though, I do not really like that way of doing things, as it works implicitly. Assuming you have a Python module with just the econometrics function defined, people reading the following code:
from econometrics import econometrics
econometrics()
plt.something()
wouldn't necessarily understand that plt has been made available through the econometrics function call. Adding a comment would help, but it is still an unnecessary extra step.
Generally speaking, using globals in any language is a bad idea, and there is almost always a better way to do it. The "Zen of Python" states that "Explicit is better than implicit", so I believe a more elegant way is to create a module that does the imports, and then import what you need from that module:
econometrics.py:
import pandas as pd
import numpy as np
import statsmodels.formula.api as smf
import statsmodels.api as sm
import matplotlib.pyplot as plt
and in your code you'd then import only what you need:
from econometrics import pd, plt
plt.something()
which would be much more elegant and explicit! Then you'd just have to drop that file into any project that needs your mathematics modules, and you'd have all your beloved modules - and only them - available in your code!
Then, as a step further, you could define your own Python package, with a full-blown setup.py, and with your econometrics.py file becoming the __init__.py of the econometrics package directory, and have it installed as a Python package through:
python setup.py install
at the root of your sources. Then any code you work on can use econometrics as a Python package. You might even consider publishing it on PyPI!
HTH
You imported the packages into the scope of the function. If you want to use them in the global scope, you have to tell Python:
def importfunc():
    global np
    import numpy as np

importfunc()
print(np.version.version)
On a side note: Are you using some kind of toolchain? I'd think it would be better to use an IDE or to write a script which sets up new projects for you.
The various imports are performed when you call the function, but the names pd, np, etc are local to the function, so they can't be referenced outside the function.
I suppose you could return those names, but importing modules in a function like that makes your code a little harder for readers to follow, IMHO.
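If you do want to go the return route, a minimal sketch (which modules to return is up to you; the names below are placeholders):
def import_my_stuff():
    import numpy as np
    import matplotlib.pyplot as plt
    return np, plt

np, plt = import_my_stuff()
print(np.pi)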

Python: Write function in module to import packages

I am trying to write a function, which is itself loaded, to quickly import a bunch of modules globally.
I thought that, essentially, loaded modules could be treated as variables so I tried:
def loadMods():
    global np
    import numpy as np
and when I loaded numpy this way (and then used np) there was no problem.
What I then did was to create a separate .py file called loadTest containing
# loadTest module
# coding: utf-8
def loadMod():
    global np
    import numpy as np
Then attempted to import numpy using this .py file in python (2.7):
import loadTest
loadTest.loadMod()
but now, when I try to use np, I get
File "<stdin>", line 1, in <module>
NameError: name 'np' is not defined
Why does this occur? Any help or alternative ways of doing this would be much appreciated. Thanks a bunch :)
Instead of making a function to do this, why not make another module? You could name it something like modules.py and put all of your imports in there:
import numpy as np
import os
import sys
...
Then, all you need to do is a wildcard import:
from modules import *
and everything will be made available.
You must first define np at the top level of loadTest, like this.
In loadTest:
np = None
Then, somewhere else:
import loadTest
loadTest.loadMod()
np = loadTest.np
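Putting the pieces together, a sketch of the complete files under that approach:
File loadTest.py:
np = None  # placeholder; rebound by loadMod() through the global statement

def loadMod():
    global np
    import numpy as np
In the calling script:
import loadTest

loadTest.loadMod()
np = loadTest.np   # grab the module object from loadTest's namespace
print(np.pi)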
