Is it possible to overload the from/import statement in Python?
For example, assuming jvm is an instance of class JVM, is it possible to write this code:
class JVM(object):
    def import_func(self, cls):
        return something...

jvm = JVM()
# would invoke JVM.import_func
from jvm import Foo
This post demonstrates how to use functionality introduced in PEP-302 to import modules over the web. I post it as an example of how to customize the import statement rather than as suggested usage ;)
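For reference, a PEP-302 hook has roughly this shape; the sketch below uses the old-style find_module/load_module protocol (deprecated in modern Python 3), and the module name remote_hello and its contents are made up rather than taken from the linked post:

import sys
import types

class WebImporter(object):
    # PEP-302 finder: claim the module if we know how to handle it.
    def find_module(self, fullname, path=None):
        if fullname == 'remote_hello':
            return self
        return None

    # PEP-302 loader: build the module object and register it.
    def load_module(self, fullname):
        if fullname in sys.modules:
            return sys.modules[fullname]
        mod = types.ModuleType(fullname)
        mod.__loader__ = self
        mod.greeting = "hello from the hook"  # stand-in for code fetched over the web
        sys.modules[fullname] = mod
        return mod

sys.meta_path.append(WebImporter())

from remote_hello import greeting
print(greeting)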
It's hard to find something which isn't possible in a dynamic language like Python, but do we really need to abuse everything? Anyway, here it is:
from types import ModuleType
import sys
class JVM(ModuleType):
    Foo = 3

sys.modules['JVM'] = JVM

from JVM import Foo
print Foo
But one pattern I've seen in several libraries/projects is some kind of _make_module() function, which creates a ModuleType dynamically and initializes everything in it. After that, the current module is replaced by the new module (by assignment to sys.modules) and the _make_module() function gets deleted. The advantage is that you can loop over the module and even add objects to it inside that loop, which is quite useful sometimes (but use it with caution!).
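A minimal sketch of that pattern; the helper name _make_module and the attributes it creates are illustrative, not from any particular library:

import sys
from types import ModuleType

def _make_module():
    mod = ModuleType(__name__)
    # Loop over some data and add objects to the module as you go --
    # awkward to do in a plain module body.
    for i, name in enumerate(['Foo', 'Bar', 'Baz']):
        setattr(mod, name, i)
    return mod

sys.modules[__name__] = _make_module()
del _make_module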
Related
I've run into a bit of a wall importing modules in a Python script. I'll do my best to describe the error, why I run into it, and why I'm trying this particular approach to solve my problem (which I will describe in a second):
Let's suppose I have a module in which I've defined some utility functions/classes, which refer to entities defined in the namespace into which this auxiliary module will be imported (let "a" be such an entity):
module1:
def f():
    print a
And then I have the main program, where "a" is defined, into which I want to import those utilities:
import module1
a=3
module1.f()
Executing the program will trigger the following error:
Traceback (most recent call last):
File "Z:\Python\main.py", line 10, in <module>
module1.f()
File "Z:\Python\module1.py", line 3, in f
print a
NameError: global name 'a' is not defined
Similar questions have been asked in the past (two days ago, d'uh) and several solutions have been suggested, however I don't really think these fit my requirements. Here's my particular context:
I'm trying to make a Python program which connects to a MySQL database server and displays/modifies data with a GUI. For cleanliness' sake, I've defined the bunch of auxiliary/utility MySQL-related functions in a separate file. However, they all share a common variable, which I had originally defined inside the utilities module: the cursor object from the MySQLdb module.
I later realised that the cursor object (which is used to communicate with the db server) should be defined in the main module, so that both the main module and anything that is imported into it can access that object.
End result would be something like this:
utilities_module.py:
def utility_1(args):
    # code which references a variable named "cur"
def utility_n(args):
    # etcetera
And my main module:
program.py:
import MySQLdb, Tkinter
db = MySQLdb.connect(...)  # blahblah
cur = db.cursor()  # cur is defined!
from utilities_module import *
And then, as soon as I try to call any of the utilities functions, it triggers the aforementioned "global name not defined" error.
A particular suggestion was to have a "from program import cur" statement in the utilities file, such as this:
utilities_module.py:
from program import cur
#rest of function definitions
program.py:
import Tkinter, MySQLdb
db = MySQLdb.connect(...)  # blahblah
cur = db.cursor()  # cur is defined!
from utilities_module import *
But that's a cyclic import (or something like that) and, bottom line, it crashes too. So my question is:
How in hell can I make the "cur" object, defined in the main module, visible to those auxiliary functions which are imported into it?
Thanks for your time and my deepest apologies if the solution has been posted elsewhere. I just can't find the answer myself and I've got no more tricks in my book.
Globals in Python are global to a module, not across all modules. (Many people are confused by this, because in, say, C, a global is the same across all implementation files unless you explicitly make it static.)
There are different ways to solve this, depending on your actual use case.
Before even going down this path, ask yourself whether this really needs to be global. Maybe you really want a class, with f as an instance method, rather than just a free function? Then you could do something like this:
import module1
thingy1 = module1.Thingy(a=3)
thingy1.f()
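For that snippet to run, module1.py would need to define the class; a minimal sketch (the name Thingy is from the snippet above, but its body here is an assumption):

class Thingy(object):
    def __init__(self, a):
        self.a = a

    def f(self):
        print(self.a)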
If you really do want a global, but it's just there to be used by module1, set it in that module.
import module1
module1.a=3
module1.f()
On the other hand, if a is shared by a whole lot of modules, put it somewhere else, and have everyone import it:
import shared_stuff
import module1
shared_stuff.a = 3
module1.f()
… and, in module1.py:
import shared_stuff
def f():
    print shared_stuff.a
Don't use a from import unless the variable is intended to be a constant. from shared_stuff import a would create a new a variable initialized to whatever shared_stuff.a referred to at the time of the import, and this new a variable would not be affected by assignments to shared_stuff.a.
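To make that concrete, here's a small demonstration, assuming shared_stuff.py contains just a = 0:

import shared_stuff
from shared_stuff import a  # binds a new, separate name to the current value

shared_stuff.a = 3
print(a)               # still 0: it was bound at import time
print(shared_stuff.a)  # 3: reading through the module sees the assignment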
Or, in the rare case that you really do need it to be truly global everywhere, like a builtin, add it to the builtin module. The exact details differ between Python 2.x and 3.x (in 2.x, the module is named __builtin__, with no s). In 3.x, it works like this:
import builtins
import module1
builtins.a = 3
module1.f()
As a workaround, you could consider setting environment variables in the outer layer, like this.
main.py:
import os
os.environ['MYVAL'] = str(myintvariable)
mymodule.py:
import os
myval = None
if 'MYVAL' in os.environ:
    myval = os.environ['MYVAL']
As an extra precaution, handle the case when MYVAL is not defined inside the module.
This post is just an observation about Python behaviour I encountered. Maybe the advice you read above doesn't work for you if you did the same thing I did below.
Namely, I have a module which contains global/shared variables (as suggested above):
#sharedstuff.py
globaltimes_randomnode=[]
globalist_randomnode=[]
Then I had the main module which imports the shared stuff with:
import sharedstuff as shared
and some other modules that actually populate these arrays. These are called by the main module. When exiting those other modules, I can clearly see that the arrays are populated, but when reading them back in the main module, they were empty. This was rather strange to me (well, I am new to Python). However, when I changed the way I import sharedstuff.py in the main module to:
from sharedstuff import *
it worked (the arrays were populated).
Just sayin'
A function uses the globals of the module it's defined in. Instead of setting a = 3, for example, you should be setting module1.a = 3. So, if you want cur available as a global in utilities_module, set utilities_module.cur.
A better solution: don't use globals. Pass the variables you need into the functions that need it, or create a class to bundle all the data together, and pass it when initializing the instance.
The easiest solution to this particular problem would have been to add another function within the module that would have stored the cursor in a variable global to the module. Then all the other functions could use it as well.
module1:
cursor = None

def setCursor(cur):
    global cursor
    cursor = cur

def method(some, args):
    do_stuff(cursor, some, args)
main program:
import module1
cursor = get_a_cursor()
module1.setCursor(cursor)
module1.method("some", "args")
Since globals are module-specific, you can add the following function to all imported modules, and then use it to:
- Add singular variables (in dictionary format) as globals for those modules
- Transfer your main module's globals to them
addglobals = lambda x: globals().update(x)
Then all you need to pass on current globals is:
import module
module.addglobals(globals())
Since I haven't seen it in the answers above, I thought I would add my simple workaround, which is just to add a global_dict argument to the function that requires the calling module's globals, and then pass the dict into the function when calling; e.g.:
# external_module
def imported_function(global_dict=None):
    print(global_dict["a"])

# calling_module
a = 12
from external_module import imported_function
imported_function(global_dict=globals())  # prints: 12
The OOP way of doing this would be to make your module a class instead of a set of unbound methods. Then you could use __init__ or a setter method to set the variables from the caller for use in the module methods.
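A minimal sketch of that idea for the cursor example (the class name and method body are assumptions):

class DBUtilities(object):
    def __init__(self, cur):
        self.cur = cur  # injected once by the caller

    def utility_1(self, query):
        self.cur.execute(query)  # every method shares the same cursor

Then the main program would create one instance, e.g. utils = DBUtilities(cur), and call utils.utility_1(...) instead of a free function.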
Update
To test the theory, I created a module and put it on PyPI. It all worked perfectly.
pip install superglobals
Short answer
This works fine in Python 2 or 3:
import inspect

def superglobals():
    _globals = dict(inspect.getmembers(
        inspect.stack()[len(inspect.stack()) - 1][0]))["f_globals"]
    return _globals
save as superglobals.py and employ in another module thusly:
from superglobals import *
superglobals()['var'] = value
Extended answer
You can add some extra functions to make things more attractive.
def superglobals():
    _globals = dict(inspect.getmembers(
        inspect.stack()[len(inspect.stack()) - 1][0]))["f_globals"]
    return _globals

def getglobal(key, default=None):
    """
    getglobal(key[, default]) -> value

    Return the value for key if key is in the global dictionary, else default.
    """
    _globals = dict(inspect.getmembers(
        inspect.stack()[len(inspect.stack()) - 1][0]))["f_globals"]
    return _globals.get(key, default)

def setglobal(key, value):
    _globals = superglobals()
    _globals[key] = value

def defaultglobal(key, value):
    """
    defaultglobal(key, value)

    Set the value of global variable `key` if it is not otherwise set.
    """
    _globals = superglobals()
    if key not in _globals:
        _globals[key] = value
Then use thusly:
from superglobals import *
setglobal('test', 123)
defaultglobal('test', 456)
assert(getglobal('test') == 123)
Justification
The "python purity league" answers that litter this question are perfectly correct, but in some environments (such as IDAPython) which is basically single threaded with a large globally instantiated API, it just doesn't matter as much.
It's still bad form and a bad practice to encourage, but sometimes it's just easier. Especially when the code you are writing isn't going to have a very long life.
After dozens of searches on the subject and a lot of thinking, I leave it to you in this new question:
Is it possible to mock an entire library with Python? I would like the import of this library and all its packages/modules/etc. to be done without having to define each element by hand, with mock and sys.modules ... :(
In my case, I use a library specific to my job, and I would like to be able to work on my code at home without having to rewrite my imports, for code which does not depend on this library.
Example:
"""Main file.
I define the mock here.
"""
mocked = MagicLibraryMock("mylib") # the dream
"""File with lib imports.
I can import anything and use it as a mock.
"""
import mylib
from mylib.a import b
from mylib.z import c
from mylib.a.e.r import x
foo = x()
bar = c.a.e.r.t.d()
bar.side_effect = [1, 2, 3]
bar()
I tried writing a class that inherits from dict to overload the __getitem__ method of sys.modules. But the problem is that the import machinery also uses __iter__, and there it becomes much more complicated to return a MagicMock depending on the lookup, given that it is not recommended to modify the import machinery's source code directly.
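For illustration, a simpler route that avoids touching the import machinery at all is to pre-register MagicMock objects in sys.modules for every dotted path needed; a hedged sketch, where mock_library is a made-up helper rather than a real API:

import sys
from unittest.mock import MagicMock

def mock_library(name, submodules=()):
    # Register a MagicMock for the root and for each dotted submodule path,
    # so that both `import mylib` and `from mylib.a import b` can resolve.
    sys.modules[name] = MagicMock(name=name)
    for sub in submodules:
        path = name + '.' + sub
        sys.modules[path] = MagicMock(name=path)

mock_library('mylib', submodules=['a', 'z', 'a.e', 'a.e.r'])

import mylib
from mylib.a import b      # attribute b is auto-created by the MagicMock
from mylib.a.e.r import x  # resolves because 'mylib.a.e.r' is registered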
In the end, I waste less time by extracting the imports from my application into sub-modules that take care of resolving them. That way I can intercept those imports more easily without cluttering my code, and the design is cleaner.
Thanks for your help.
I'm building a Python module for a fairly specific purpose. What I'd like to do with this is get more functionality behind importing things from it.
I'd like to have a setup by which saying from my_module import foo would run a function and pass the string "foo". This function would return the object that should be imported.
For example, maybe I want to make a cloud-based import system. I'd like to store community scripts in the cloud, and then download them when a user tries to import them.
Maybe I use the code from cloud import test_module. This would check a cache to decide whether test_module had been downloaded. If so, it would return that module. If not, it would download the module before importing it.
How can I accomplish something like this in Python, by which a dynamic range of submodules could be seamlessly imported from the cloud?
Full featured support for what you ask probably requires a bunch of complicated code using importlib and hooking into various parts of the import machinery. However, a more limited solution can be implemented with just a single custom class that pretends to be a module.
When you import a module, Python first checks in the sys.modules dictionary to see if the module is a key. If so, it returns the value associated with the key. It does this regardless of what the value is, so you can put any kind of object in sys.modules and Python will treat it like a module. A module's code can even replace its own entry in sys.modules, and the replacement will be used even the first time it is imported!
So, to implement your fancy module that downloads other modules on demand, replace the module itself with an instance of a custom class, and write that class a __getattr__ or __getattribute__ method that does the work you want.
Here's a trivial example module that returns a string for any attribute you look for in it. The string will always be the same as the requested attribute name. In your code, you'd want to do your fancy web-cache lookups and downloading, and then return the fetched module object instead of just returning a string.
class FakeModule(object):
    def __getattribute__(self, name):
        return name

import sys
sys.modules[__name__] = FakeModule()
On my system I've saved that as fakemodule.py. Now if I do from fakemodule import foo, I get foo with the value 'foo' in my local namespace.
Note that this only works for one level deep imports. If you do from fakemodule.subpackage import name it will not work because there's no fakemodule.subpackage entry in sys.modules.
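If you only need a known set of subpackages, you can register extra sys.modules entries yourself; a hedged sketch extending the example above:

import sys

class FakeModule(object):
    def __getattribute__(self, name):
        return name

sys.modules[__name__] = FakeModule()
# Also register the dotted subpackage path, so that
# `from fakemodule.subpackage import name` finds an entry in sys.modules.
sys.modules[__name__ + '.subpackage'] = FakeModule()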
I have a PL/Python function which does some JSON magic. For this it obviously imports the json library.
Is the import called on every call to the function? Are there any performance implications I have to be aware of?
The import is executed on every function call. This is the same behavior you would get if you wrote a normal Python module with the import statement inside a function body as opposed to at the module level.
Yes, this will affect performance.
You can work around this by caching your imports like this:
CREATE FUNCTION test() RETURNS text
LANGUAGE plpythonu
AS $$
if 'json' in SD:
    json = SD['json']
else:
    import json
    SD['json'] = json
return json.dumps(...)
$$;
This is admittedly not very pretty, and better ways to do this are being discussed, but they won't happen before PostgreSQL 9.4.
The declaration in the body of a PL/Python function will eventually become an ordinary Python function and will thus behave as such. When a Python function imports a module for the first time the module is cached in the sys.modules dictionary (https://docs.python.org/3/reference/import.html#the-module-cache). Subsequent imports of the same module will simply bind the import name to the module object found in the dictionary. In a sense, what I'm saying may cast some doubt on the usefulness of the tip given in the accepted answer, since it makes it somewhat redundant, as Python already does a similar caching for you.
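The cache is easy to observe from ordinary Python, independent of PL/Python:

import sys
import json

assert 'json' in sys.modules           # the first import cached the module
import json as json_again              # second import: just a dict lookup
assert json_again is sys.modules['json']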
To sum things up, I'd say that if you import in the standard way, simply using the import or from [...] import constructs, then you need not worry about repeated imports, in functions or otherwise: Python has got you covered.
On the other hand, Python allows you to bypass its native import semantics and to implement your own (with the __import__() function and importlib module). If this is what you're doing, maybe you should review what's available in the toolbox (https://docs.python.org/3/reference/import.html).
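For completeness, a minimal example of the programmatic route; importlib.import_module consults sys.modules first, just like the import statement:

import importlib

json = importlib.import_module('json')  # equivalent to `import json`
print(json.dumps({'ok': True}))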
I can make this code work, but I am still confused why it won't work the first way I tried.
I am practicing python because my thesis is going to be coded in it (doing some cool things with Arduino and PC interfaces). I'm trying to import a class from another file into my main program so that I can create objects. Both files are in the same directory. It's probably easier if you have a look at the code at this point.
#from ArduinoBot import *
#from ArduinoBot import ArduinoBot
import ArduinoBot
# Create ArduinoBot object
bot1 = ArduinoBot()
# Call toString inside bot1 object
bot1.toString()
input("Press enter to end.")
Here is the very basic ArduinoBot class
class ArduinoBot:
    def toString(self):
        print("ArduinoBot toString")
Either of the first two commented-out import statements will make this work, but not the last one, which to me seems the most intuitive and general. There's not a lot of code for stuff to go wrong here; it's a bit frustrating to be hitting these kinds of finicky, language-specific quirks when I had heard so many good things about Python. Anyway, I must be doing something wrong, but why doesn't the simple 'import ClassName' or 'import FileName' work?
Thank you for your help.
consider a file (example.py):
class foo(object):
    pass

class bar(object):
    pass

class example(object):
    pass
Now in your main program, if you do:
import example
what should be imported from the file example.py? Just the class example? Should the class foo come along too? The meaning would be too ambiguous if import module pulled the whole module's namespace directly into your current namespace.
The idea is that namespaces are wonderful. They let you know where the class/function/data came from. They also let you group related things together (or equivalently, they help you keep unrelated things separate!). A module sets up a namespace and you tell python exactly how you want to bring that namespace into the current context (namespace) by the way you use import.
from ... import * says -- bring everything in that module directly into my namespace.
from ... import ... as ... says, bring only the thing that I specify directly into my namespace, but give it a new name here.
Finally, import ... simply says bring that module into the current namespace, but keep it separate. This is the most common form in production code, for (at least) two reasons:
It prevents name clashes. You can have a local class named foo which won't conflict with the foo in example.py -- You get access to that via example.foo
It makes it easy to trace down which module a class came from for debugging.
consider:
from foo import *
from bar import *
a = AClass() #did this come from foo? bar? ... Hmmm...
In this case, to get access to the class example from example.py, you could also do:
import example
example_instance = example.example()
but you can also get foo:
foo_instance = example.foo()
The simple answer is that modules are things in Python. A module has its own status as a container for classes, functions, and other objects. When you do import ArduinoBot, you import the module. If you want things in that module -- classes, functions, etc. -- you have to explicitly say that you want them. You can either import them directly with from ArduinoBot import ..., or access them via the module with import ArduinoBot and then ArduinoBot.ArduinoBot.
Instead of working against this, you should leverage the container-ness of modules to allow you to group related stuff into a module. It may seem annoying when you only have one class in a file, but when you start putting multiple classes and functions in one file, you'll see that you don't actually want all that stuff being automatically imported when you do import module, because then everything from all modules would conflict with other things. The modules serve a useful function in separating different functionality.
For your example, the question you should ask yourself is: if the code is so simple and compact, why didn't you put it all in one file?
Import doesn't work quite the way you think it does. It does work the way it is documented to work, so there's a very simple remedy for your problem, but nonetheless:
import ArduinoBot
This looks for a module (or package) on the import path, executes the module's code in a new namespace, and then binds the module object itself to the name ArduinoBot in the current namespace. This means a module global variable named ArduinoBot in the ArduinoBot module would now be accessible in the importing namespace as ArduinoBot.ArduinoBot.
from ArduinoBot import ArduinoBot
This loads and executes the module as above, but does not bind the module object to the name ArduinoBot. Instead, it looks for a module global variable ArduinoBot within the module, and binds that object to the name ArduinoBot in the current namespace.
from ArduinoBot import *
Similarly to the above, this loads and executes a module without binding the module object to any name in the current namespace. It then looks for all module global variables, and binds them all to the same name in the current namespace.
This last form is very convenient for interactive work in the python shell, but generally considered bad style in actual development, because it's not clear what names it actually binds. Considering it imports everything global in the imported module, including any names that it imported at global scope, it very quickly becomes extremely difficult to know what names are in scope or where they came from if you use this style pervasively.
The module itself is an object. The last approach does in fact work, if you access your class as a member of the module. Either of the following will work, and either may be appropriate, depending on what else you need from the imported items:
from my_module import MyClass
foo = MyClass()
or
import my_module
foo = my_module.MyClass()
As mentioned in the comments, your module and class usually don't have the same name in python. That's more a Java thing, and can sometimes lead to a little confusion here.