Force use of module name before calling function, in Python - python

I'm creating a module that I'll import in a main script. In the module, called "colors", there's a function: "info()".
colors-module:
def info(function=None):
    print("\ncolors\n Info\n")
The problem I have is that I may also create a function called "info()" in the main script. This won't be a problem as long as I import the colors module as:
import colors
If so, I would call the function in the module by writing:
colors.info()
If I instead import the module as:
from colors import *
I have two functions with the exact same name.
Main script:
from colors import *
def info(): # A
print("Main script's function")
info() # A
colors.info() # Am I able to force use of the module name before calling the
# function, if I import the module as in this script? Can this
# be done from the module, and not from the main script? As said,
# force use of module name, as "colors.info()", shall only apply
# when the module is being imported with "from colors import *".
EDIT 1
The reason why I want to import the module this way is because of the global variables in it:
bwc0 = (0, 0, 0)
bwc1 = (1, 1, 1)
bwc2 = (2, 2, 2)
# All the way to 255
After a few answers, I'll try adding these to a class, and import that class as *, if possible.
I also have a few functions in it that I want imported with *, too. I'm not exactly sure how to import them yet. It's probably easy, I suppose.
def redc(value):
    return value, 0, 0
def greenc(value):
    return 0, value, 0
Thanks for all the help.

No, you can't force how your function is called. If someone is writing code to call your function it's up to them to ensure that they are actually able to call it.
Either:
import colors
Or:
from colors import info as colors_info
will work if they want to call it. Otherwise they just have to avoid creating a function with a conflicting name.
BTW, as a general rule don't use from colors import *, as it means you don't know which functions you're importing. For example:
from colors import *
from foobar import *
Now, is info() coming from colors or foobar? You can't be sure, and somewhere down the line the answer may change. So always import the names you need explicitly; that way at least they're listed in the module that uses them, so you can see where conflicts will arise.

When importing * from the module you won't have two functions with the same name, you'll have one function with the name info. Which of the two (one in colors or one in main script) is used depends on where the definition of info in main is, relative to the from colors import * statement.
You can't "force" the module prefix in any way; if you didn't import colors with import colors you don't have that name bound to something you can refer to.
If typing long names is the thing you're trying to avoid, just rename the info function you bring in from colors by using the as clause of the from import statement:
from colors import info as c_info
This is one of the main reasons star (*) imports are discouraged and why namespaces are such a good idea. With * you throw away the separate namespace (module colors) that holds the name info and place that name in the namespace of the main script, thereby masking objects with the same name.
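For illustration, a minimal sketch (using the colors module from the question) of how the position of the import decides which info survives:

from colors import *   # binds the name info to the function from colors

def info():            # rebinds the name info to this local definition
    print("Main script's function")

info()                 # prints "Main script's function"

# If the import came after the def instead, the module's info would win:
# def info(): ...
# from colors import *
# info()  # would now call colors' info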

You can make colors a class, with info as a method of that class. Then you just create an instance of colors and call its info() method.
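A minimal sketch of that idea, assuming the colors module from the question (the class and attribute names are only illustrative):

# colors.py - hypothetical class-based variant
class Colors:
    bwc0 = (0, 0, 0)
    bwc1 = (1, 1, 1)

    def info(self):
        print("\ncolors\n Info\n")

    def redc(self, value):
        return value, 0, 0

# main script
from colors import Colors
colors = Colors()

def info():
    print("Main script's function")

info()          # the main script's function
colors.info()   # the module's function, always qualified via the instance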

Related

accessing and changing module level variable [duplicate]

I've run into a bit of a wall importing modules in a Python script. I'll do my best to describe the error, why I run into it, and why I'm trying this particular approach to solve my problem (which I will describe in a second):
Let's suppose I have a module in which I've defined some utility functions/classes, which refer to entities defined in the namespace into which this auxiliary module will be imported (let "a" be such an entity):
module1:
def f():
    print a
And then I have the main program, where "a" is defined, into which I want to import those utilities:
import module1
a=3
module1.f()
Executing the program will trigger the following error:
Traceback (most recent call last):
File "Z:\Python\main.py", line 10, in <module>
module1.f()
File "Z:\Python\module1.py", line 3, in f
print a
NameError: global name 'a' is not defined
Similar questions have been asked in the past (two days ago, d'uh) and several solutions have been suggested, however I don't really think these fit my requirements. Here's my particular context:
I'm trying to make a Python program which connects to a MySQL database server and displays/modifies data with a GUI. For cleanliness' sake, I've defined the bunch of auxiliary/utility MySQL-related functions in a separate file. However they all have a common variable, which I had originally defined inside the utilities module, and which is the cursor object from the MySQLdb module.
I later realised that the cursor object (which is used to communicate with the db server) should be defined in the main module, so that both the main module and anything that is imported into it can access that object.
End result would be something like this:
utilities_module.py:
def utility_1(args):
    # code which references a variable named "cur"
def utility_n(args):
    # etcetera
And my main module:
program.py:
import MySQLdb, Tkinter
db=MySQLdb.connect(#blahblah) ; cur=db.cursor() #cur is defined!
from utilities_module import *
And then, as soon as I try to call any of the utilities functions, it triggers the aforementioned "global name not defined" error.
A particular suggestion was to have a "from program import cur" statement in the utilities file, such as this:
utilities_module.py:
from program import cur
#rest of function definitions
program.py:
import Tkinter, MySQLdb
db=MySQLdb.connect(#blahblah) ; cur=db.cursor() #cur is defined!
from utilities_module import *
But that's cyclic import or something like that and, bottom line, it crashes too. So my question is:
How in hell can I make the "cur" object, defined in the main module, visible to those auxiliary functions which are imported into it?
Thanks for your time and my deepest apologies if the solution has been posted elsewhere. I just can't find the answer myself and I've got no more tricks in my book.
Globals in Python are global to a module, not across all modules. (Many people are confused by this, because in, say, C, a global is the same across all implementation files unless you explicitly make it static.)
There are different ways to solve this, depending on your actual use case.
Before even going down this path, ask yourself whether this really needs to be global. Maybe you really want a class, with f as an instance method, rather than just a free function? Then you could do something like this:
import module1
thingy1 = module1.Thingy(a=3)
thingy1.f()
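For completeness, a minimal sketch of what module1 might look like under that approach (Thingy is just the placeholder name used in the snippet above):

# module1.py - hypothetical class-based version
class Thingy(object):
    def __init__(self, a):
        self.a = a

    def f(self):
        print(self.a)  # works under Python 2 and 3 with a single argument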
If you really do want a global, but it's just there to be used by module1, set it in that module.
import module1
module1.a=3
module1.f()
On the other hand, if a is shared by a whole lot of modules, put it somewhere else, and have everyone import it:
import shared_stuff
import module1
shared_stuff.a = 3
module1.f()
… and, in module1.py:
import shared_stuff
def f():
    print shared_stuff.a
Don't use a from import unless the variable is intended to be a constant. from shared_stuff import a would create a new a variable initialized to whatever shared_stuff.a referred to at the time of the import, and this new a variable would not be affected by assignments to shared_stuff.a.
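A quick demonstration of that pitfall (assuming shared_stuff.py starts out with a = 1):

from shared_stuff import a
import shared_stuff

shared_stuff.a = 3
print(a)               # still 1 - the from-import copied the old binding
print(shared_stuff.a)  # 3 - attribute access always sees the current value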
Or, in the rare case that you really do need it to be truly global everywhere, like a builtin, add it to the builtin module. The exact details differ between Python 2.x and 3.x. In 3.x, it works like this:
import builtins
import module1
builtins.a = 3
module1.f()
As a workaround, you could consider setting environment variables in the outer layer, like this.
main.py:
import os
os.environ['MYVAL'] = str(myintvariable)
mymodule.py:
import os
myval = None
if 'MYVAL' in os.environ:
    myval = os.environ['MYVAL']
As an extra precaution, handle the case when MYVAL is not defined inside the module.
This post is just an observation about Python behaviour that I encountered. Maybe the advice you read above doesn't work for you if you made the same mistake I did below.
Namely, I have a module which contains global/shared variables (as suggested above):
#sharedstuff.py
globaltimes_randomnode=[]
globalist_randomnode=[]
Then I had the main module which imports the shared stuff with:
import sharedstuff as shared
and some other modules that actually populated these arrays. These are called by the main module. When exiting these other modules I can clearly see that the arrays are populated. But when reading them back in the main module, they were empty. This was rather strange for me (well, I am new to Python). However, when I change the way I import the sharedstuff.py in the main module to:
from sharedstuff import *
it worked (the arrays were populated).
Just sayin'
A function uses the globals of the module it's defined in. Instead of setting a = 3, for example, you should be setting module1.a = 3. So, if you want cur available as a global in utilities_module, set utilities_module.cur.
A better solution: don't use globals. Pass the variables you need into the functions that need it, or create a class to bundle all the data together, and pass it when initializing the instance.
The easiest solution to this particular problem would have been to add another function within the module that would have stored the cursor in a variable global to the module. Then all the other functions could use it as well.
module1:
cursor = None
def setCursor(cur):
    global cursor
    cursor = cur

def method(some, args):
    global cursor
    do_stuff(cursor, some, args)
main program:
import module1
cursor = get_a_cursor()
module1.setCursor(cursor)
module1.method()
Since globals are module-specific, you can add the following function to all imported modules, and then use it either to add singular variables (in dictionary format) as globals for those modules, or to transfer your main module's globals to them.
addglobals = lambda x: globals().update(x)
Then all you need to pass on current globals is:
import module
module.addglobals(globals())
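A sketch of how the receiving module might use the injected names (the module and variable names are only illustrative):

# module.py - hypothetical receiving module
addglobals = lambda x: globals().update(x)

def f():
    # 'a' only exists here after addglobals(globals()) has been called
    print(a)

# main.py
import module
a = 3
module.addglobals(globals())
module.f()  # prints 3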
Since I haven't seen it in the answers above, I thought I would add my simple workaround, which is just to add a global_dict argument to the function requiring the calling module's globals, and then pass the dict into the function when calling; e.g:
# external_module
def imported_function(global_dict=None):
    print(global_dict["a"])
# calling_module
a = 12
from external_module import imported_function
imported_function(global_dict=globals())
>>> 12
The OOP way of doing this would be to make your module a class instead of a set of unbound methods. Then you could use __init__ or a setter method to set the variables from the caller for use in the module methods.
Update
To test the theory, I created a module and put it on pypi. It all worked perfectly.
pip install superglobals
Short answer
This works fine in Python 2 or 3:
import inspect
def superglobals():
    _globals = dict(inspect.getmembers(
        inspect.stack()[len(inspect.stack()) - 1][0]))["f_globals"]
    return _globals
save as superglobals.py and employ in another module thusly:
from superglobals import *
superglobals()['var'] = value
Extended Answer
You can add some extra functions to make things more attractive.
def superglobals():
    _globals = dict(inspect.getmembers(
        inspect.stack()[len(inspect.stack()) - 1][0]))["f_globals"]
    return _globals

def getglobal(key, default=None):
    """
    getglobal(key[, default]) -> value

    Return the value for key if key is in the global dictionary, else default.
    """
    _globals = dict(inspect.getmembers(
        inspect.stack()[len(inspect.stack()) - 1][0]))["f_globals"]
    return _globals.get(key, default)

def setglobal(key, value):
    _globals = superglobals()
    _globals[key] = value

def defaultglobal(key, value):
    """
    defaultglobal(key, value)

    Set the value of global variable `key` if it is not otherwise set.
    """
    _globals = superglobals()
    if key not in _globals:
        _globals[key] = value
Then use thusly:
from superglobals import *
setglobal('test', 123)
defaultglobal('test', 456)
assert(getglobal('test') == 123)
Justification
The "python purity league" answers that litter this question are perfectly correct, but in some environments (such as IDAPython) which is basically single threaded with a large globally instantiated API, it just doesn't matter as much.
It's still bad form and a bad practice to encourage, but sometimes it's just easier. Especially when the code you are writing isn't going to have a very long life.

What's the correct way for importing the whole module as well as a couple of its functions in Python?

I know that from module import * will import all the functions in current namespace but it is a bad practice. I want to use two functions directly and use module.function when I have to use any other function from the module. What I am doing currently is:
import module
from module import func1, func2
# DO REST OF MY STUFF
Is it a good practice? Does the order of first two statements matter?
Is there a better way using which I can use these two functions directly and use rest of the functions as usual with the module's name prepended to them?
Using just import module results in very long statements with a lot of repetition if I use the same function from the given module five times in a single statement. That's what I want to avoid.
The order doesn't matter, and it's not a Pythonic way of doing it. When you import the module there is no need to import some of its functions separately again. If you are not sure how many of the functions you might need to use, just import the module and access the functions on demand with a simple reference.
# The only import you need
import module
# Use module.funcX when you need any of its functions
That said, if you use some of the functions (much) more often than the others, then since the cost of attribute access is greater than that of an imported name, you'd do better to import those functions separately, as you've done.
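If the repeated attribute lookup is the only concern, you can also bind a local alias inside the hot code path instead of importing the name; a small sketch (module and function names are placeholders from the question):

import module

def hot_loop(items):
    func1 = module.func1   # one attribute lookup, reused for every item
    return [func1(item) for item in items]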
And still, the order doesn't matter. You can do:
import module
from module import func1, func2
For more info read the documentation https://www.python.org/dev/peps/pep-0008/#imports
It is not good to do (may be opinion based):
import module
from module import func1, func2 # `func1` and `func2` are already part of module
Because you already hold a reference to module.
If I were you, I would import it in the form import module. Since your issue is that module.func1() becomes too long, I would import the module and use as to create an alias for the name. For example:
import module as mo
# ^ for illustration purpose. Even the name of
# your actual module wont be `module`.
# Alias should also be self-explanatory
# For example:
import database_manager as db_manager
Now I may access the functions as:
mo.func1()
mo.func2()
Edit: Based on the edit to the actual question
If you are calling the same function several times in the same line, there is a possibility that you are already doing something wrong. It would be great if you could share what that function does.
For example: do you want the return values of those functions to be passed as arguments to another function? As in:
test_func(mo.func1(x), mo.func1(y), mo.func1(z))
could be done as:
params_list = [x, y, z]
func_list = [mo.func1(param) for param in params_list]
test_func(*func_list)

How can I perform `import *` from within a function?

I have the following standard import procedure:
from ROOT import *
Because of the way ROOT handles command line options and arguments, something like the following is required in order to avoid screwing up the script's command line parsing:
argv_tmp = sys.argv
sys.argv = []
from ROOT import *
sys.argv = argv_tmp
I need to perform this operation in many scripts. This operation may change or there might turn out to be a better approach, so I want to centralise this procedure in a single function provided by some imported module, making it easy to change the procedure in future.
def import_ROOT():
# magic
import os
import sys
import_ROOT()
import docopt
How can I import the ROOT module from within a function such that the result of the script's operation is the same as for the from ROOT import * procedure described above?
Due to how local variables are implemented in Python, you cannot do this. And since you don't know all the variables that may be imported, you can't declare them global.
As to why you can't import unknown locals in a function: at compile time Python establishes all the different possible locals that may exist (anything that is directly assigned to and hasn't been declared global or nonlocal). Space for these locals is made in an array associated with each call of the function. All locals are then referenced by their index in the array rather than their name. Thus, the interpreter is unable to make additional space for unknown locals, nor would it know how to refer to them at runtime.
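You can see the restriction directly: a star import inside a function is rejected at compile time (Python 3):

def import_ROOT():
    from ROOT import *   # SyntaxError: import * only allowed at module level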
I believe there's already a similar question here:
Python: how to make global imports from a function
example:
def example_function():
    global module
    import module
This probably misses all sorts of corner cases, but it's a start:
def import_ROOT():
    import ROOT
    globals().update(ROOT.__dict__)
Obligatory disclaimer: if you're importing *, you're probably doing it wrong. But I guess there could be situations where import * is the lesser evil.

Import a module with parameter in python

Is it possible to import a module with some parameter in python ?
All I mean by parameter is that there exists a variable in the module which is not initialized in that module, yet I am still using that variable in that module. In short, I want behaviour similar to a function but, unlike a function, I want the variables of the module to be exposed in the calling code.
eg a.py:
#lists like data, count, prob_distribution are constructed from training_pool (not initialized in this file)
x = pymc.Uniform('x', lower = 0, upper = 1)
rv = [ Multinomial("rv"+str(i), count[i], prob_distribution[i], value = data[i], observed=True) for i in xrange(0, len(count)) ]
b.py:
import a # I want some way to pass the value of training_pool
m = pymc.MCMC(a)
I want all random variables in a.py to be exposed to MCMC. I am open to a better approach for my problem at hand, but I would also like to know whether passing arguments to modules is possible in python or not.
There are various approaches to do so; here is just a silly and simple one:
main.py
"""A silly example - main supplies a parameter
"""
import sys,os
print os.path.basename(__file__)+":Push it by: --myModuleParam "+str(123)
sys.argv.append('--myModuleParam')
sys.argv.append(123)
import module
print os.path.basename(__file__)+":Pushed my param:"+str(module.displayMyParam)
module.py
"""A silly example - module consumes parameter
"""
import sys,os
displayMyParam = 'NotYetInitialized'
for px in sys.argv:
    if px == '--myModuleParam':
        idx = sys.argv.index(px)
        sys.argv.pop(idx) # remove option
        displayMyParam = sys.argv[idx]
        sys.argv.pop(idx) # remove value
print os.path.basename(__file__)+":Got my param:"+str(displayMyParam)
#
# That's it...
#
As #otus already answered, there is no way to pass parameters to modules.
I think you are following some of the introductory examples for PyMC2, which use a pattern where a module wraps all the code for the nodes in a Bayesian model. This approach is good for getting started, but, as you have found, can be limiting, when you want to run your model with a range of variations.
Fortunately, PyMC2 can create an MCMC object from a list or a dictionary as well as a module. What I recommend in this case is just what #oleg-s suggested in the comments: use a function. You can end the function with return locals() to get a dictionary of everything that would have been in the module, and this is suitable input to the pymc.MCMC constructor. Here is an example:
# a.py
from pymc import *
count = [10, 10] # perhaps good to put this stuff in data.py
prob_distribution = [[.5, .5], [.1, .2, .7]]
data = [[2, 8], [2, 3, 5]]
def model(training_pool):
    x = Uniform('x', lower = 0, upper = 1)
    rv = [ Multinomial("rv"+str(i), count[i], prob_distribution[i], value = data[i], observed=True) for i in training_pool ]
    return locals()
# b.py
import pymc, a
training_pool = [0]
m = pymc.MCMC(a.model(training_pool))
I found it helpful to define global variables, and allow these to be set by an init function.
def init(config_filename=CONFIG_FILENAME):
    config = configparser.ConfigParser(interpolation=configparser.ExtendedInterpolation())
    config.read(config_filename)
    global YEARS
    YEARS = config['DEFAULT']['YEARS']
    global FEATURES
    FEATURES = config['DEFAULT']['FEATURES']
Then all the user has to do is remember to initialize the module before using these methods:
import module
module.init('config.ini')
Note, I would NOT use this on a module that I expect to spread publicly. This is more for single-file modules for my own personal use.
There is no way to pass parameters to modules. However, you could use a global in a third module for this:
# a.py
parameter = None
# b.py
import a
a.parameter = 4
import c
# c.py
import a
# use a.parameter
Of course, this only works if nothing else imports c, because modules only get imported once.
Module-wide globals should indeed be enough for most uses, but what if the parameter needs to be evaluated during module initialization, or you need multiple versions of the module with different parameters?
In recent versions of python, it is possible to load in two steps, first the spec, then exec. In the middle, you can set up extra variables.
import importlib
abstractModuleSpec=importlib.util.find_spec('myModule')
module4=importlib.util.module_from_spec(abstractModuleSpec)
module2=importlib.util.module_from_spec(abstractModuleSpec)
module2.parameter="you are version 2"
module4.parameter="you are version 4"
module4.__spec__.loader.exec_module(module4)
module2.__spec__.loader.exec_module(module2)
In the module you can check dir() or similar, to see if the variable is defined.
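For example, the parameterized module itself might guard against a missing value like this (myModule.py and the parameter name are hypothetical):

# myModule.py - loaded via module_from_spec/exec_module as shown above
if 'parameter' not in dir():
    parameter = 'default'   # fall back when nothing was injected
print('loaded with parameter:', parameter)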
I really wonder why nobody has mentioned environment variables. That's the cleanest way I found:
a.py
import os
param = os.getenv('MY_PACKAGE_PARAM', None)
print(param)
b.py
import os
os.environ['MY_PACKAGE_PARAM'] = 'Hello world!'
import a
There is no such way to pass parameters to the module; however, you can revamp your code a bit and import the parameters from another module as global parameters.

Access objects from another module

I'm a very inexperienced programmer creating a game (using Python 3.3) as a learning exercise. I currently have a main module and a combat module.
The people in the game are represented by instances of class "Person", and are created in the main module. However, the combat module obviously needs access to those objects. Furthermore, I'm probably going to create more modules later that will also need access to those objects.
How do I allow other modules to access the Persons from main.py?
As things stand, main.py has
import combat
at the top; adding
import main
to combat.py doesn't seem to help.
Should I instantiate my objects in a separate module (common.py?) and import them to every module that needs to access them?
Yes, you should factor this out. What you tried is circular imports between your modules, and that typically causes more problems than it solves. If combat imports main and main imports combat, then you may get an error because some object definitions will be missing from main when you try to import them. This is because main will not have finished executing when combat starts executing for the import. Assuming main is your start up script, it should do nothing more than start the program by calling a method from another module; it may instantiate an object if the desired method is an instance method on a class. Avoid global variables, too. Even if it doesn't seem like they'll be a problem now, that can bite you later on.
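A minimal sketch of that factoring for the game described in the question (file and attribute names are only illustrative):

# common.py - shared definitions, imported by everything else
class Person:
    def __init__(self, name, hp):
        self.name = name
        self.hp = hp

# combat.py - imports common if it needs the Person class
import common

def attack(attacker, target):
    target.hp -= 1
    print(attacker.name, "hits", target.name)

# main.py - start-up script: builds the objects and passes them around
import common
import combat

hero = common.Person("Hero", 10)
goblin = common.Person("Goblin", 3)
combat.attack(hero, goblin)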
That said, you can reference members of a module like so:
import common
x = common.some_method_in_common()
y = common.SomeClass()
or
from common import SomeClass
y = SomeClass()
Personally, I generally avoid referencing a method from another module without qualifying it with the module name, but this is also legal:
from common import some_method_in_common
x = some_method_in_common()
I typically use from ... import ... for classes, and I typically use the first form for methods. (Yes, this sometimes means I have specific class imports from a module in addition to importing the module itself.) But this is only my personal convention.
An alternate syntax, which is strongly discouraged, is:
from common import *
y = SomeClass()
This will import every member of common into the current scope that does not start with an underscore (_). The reason it's discouraged is because it makes identifying the source of the name harder and it makes breaking things too easy. Consider this pair of imports:
from common import *
from some_other_module import *
y = SomeClass()
Which module does SomeClass come from? There's no way to tell other than to go look at the two modules. Worse, what if both modules define SomeClass or SomeClass is later added to some_other_module?
If you have imported the main module in the combat module with import main, then you should use main.<name> (i.e. things implemented in the main module) to access its classes and methods.
example:
import main
person = main.Person()
You can also use from main import * or from main import Person to avoid the main. prefix in the previous example.
There are some rules for importing modules as described in http://effbot.org/zone/import-confusion.htm :
import X imports the module X, and creates a reference to that module in the current namespace. Or in other words, after you've run this statement, you can use X.name to refer to things defined in module X.
from X import * imports the module X, and creates references in the current namespace to all public objects defined by that module (that is, everything that doesn't have a name starting with "_"). Or in other words, after you've run this statement, you can simply use a plain name to refer to things defined in module X. But X itself is not defined, so X.name doesn't work. And if name was already defined, it is replaced by the new version. And if name in X is changed to point to some other object, your module won't notice.
from X import a, b, c imports the module X, and creates references in the current namespace to the given objects. Or in other words, you can now use a and b and c in your program.
Finally, X = __import__('X') works like import X, with the difference that you 1) pass the module name as a string, and 2) explicitly assign it to a variable in your current namespace.
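A tiny illustration of that last form (importlib.import_module is the recommended modern equivalent):

import importlib

math1 = __import__('math')                # module name passed as a string
math2 = importlib.import_module('math')   # preferred way to do the same
print(math1.sqrt(2), math2.sqrt(2))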
