I'm pretty new to django and came across something that confuses me in this views.py file I've created. I just played around with it a little and came up with something that works, but I don't get why it does.
The class Draft_Order (which I have in another file) requests the NBA stats page, performs some calculations on the backend, and spits out draft lottery odds (for the new draft). The methods initialize, sim_draft, and get_standings all do things on the backend (which work perfectly).
Now, my question is that I don't get why I can create an instance "f" of the class Draft_Order outside all of the functions and still be able to reference it within most of my functions; they are getting called from my urls.py file, so it doesn't seem like they should be able to see it at all. Also, for some reason, the update function can only reference "f" if I don't have an assignment to f in the function, e.g. if I add the line
f = temp
Then all of a sudden it gives me an UnboundLocalError, saying that f is referenced before assignment.
I'd appreciate any help on this. Thanks.
from django.shortcuts import render
from django.http.response import HttpResponse
from simulator.draft_simulator import Draft_Order
from simulator.models import Order
# Create your views here.
f = Draft_Order()
f.initialize()
def index(request):
    return HttpResponse('<p>Hello World</p>')

def init(request):
    return HttpResponse(f.initalodds.to_html())

def table(request):
    f.sim_draft()
    return HttpResponse(f.finaltable.to_html())

def update(request):
    temp = Draft_Order()
    temp.get_standings()
    if temp == f:
        return HttpResponse('Same!')
    else:
        return HttpResponse('updated!')
The UnboundLocalError happens because an assignment to f anywhere inside a function makes f local to that function for its whole body, shadowing the global f. You need to explicitly state that f refers to the global variable:
def update(request):
    global f                 # tell Python that f refers to the module-level name
    temp = Draft_Order()
    temp.get_standings()
    if temp == f:
        return HttpResponse('Same!')
    f = temp                 # rebinding the global works now; no UnboundLocalError
    return HttpResponse('updated!')
But really, you shouldn't rely on global values stored in RAM, because in a production environment you'll have several processes, each with its own f, and you won't be able to control the lifetime of those processes. Better to rely on persistent storage here (a database, key-value store, files, etc.).
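For example, a minimal sketch of the persistent approach using Django's low-level cache API (the cache key and timeout are made up for illustration, and the Draft_Order object has to be picklable for this to work):

from django.core.cache import cache
from simulator.draft_simulator import Draft_Order

def get_draft_order():
    # fetch the shared object from the cache; rebuild it if it is missing or expired
    order = cache.get('draft_order')
    if order is None:
        order = Draft_Order()
        order.initialize()
        cache.set('draft_order', order, timeout=60 * 60)  # keep it for an hour
    return order

Each view would then call get_draft_order() instead of touching a module-level f, so every worker process sees the same data.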
You need to look into Python namespaces and scopes.
But here is how I like to think of it to avoid going crazy (everything in Python is an object).
In simple terms, those .py files are modules; when Python is running, those modules are turned into objects, so you have a urls object, a views object, etc.
So any variable you define at module level turns into an attribute of that object, and any function defined turns into a method.
I believe you do something like this in your urls.py:
from simulator import views
or
from simulator.views import update
which basically means: get the views object, which represents the views.py file.
From the views object you are able to access your methods, like update.
Your update method is able to access f; here's an excerpt from the Python documentation on namespaces and scope:
the global scope of a function defined in a module is that module's namespace, no matter from where or by what alias the function is called.
Basically, your f is an attribute of the views object, meaning any method within the views object can access it.
The reason it works when called from urls.py is that methods can access the attributes of the object they are defined in; since the update method is defined inside views, it is able to access the attributes of views.
Please read more on Python namespaces and scope; this is a very simplified explanation.
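A tiny illustration of that idea (the file and variable names below are hypothetical):

# greetings.py
message = "hello"        # module-level name: becomes an attribute of the greetings module object

def shout():
    # the global scope of shout() is greetings' namespace, no matter where it is called from
    return message.upper()

# caller.py
from greetings import shout

print(shout())           # prints "HELLO" even though caller.py never defines message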
Related
I have a method say _select_warehouse_for_order in api/controllers/orders.py file. The method is not part of any class.
Now, I have a new file, say api/controllers/dispatchers.py, where I need to know which warehouse was selected. I am calling _select_warehouse_for_order from this file to get this information.
Now, in my test cases, I am patching _select_warehouse_for_order like this
from unittest.mock import patch, call
def test_delivery_assignment(self, app_context):
    with patch('api.controllers.orders._select_warehouse_for_order') as mock_selected_wh:
        mock_selected_wh.return_value = {}
        app_context.client.set_url_prefix('/v2')
        response = app_context.client.get('/delivery/dispatch')
        assert response.status_code == 200
The problem I am facing is that my patch is not returning the empty dictionary. When I started debugging, I noticed that it executed the actual code in _select_warehouse_for_order. Am I missing something here?
Update:
Here is the code in dispatchers.py
from api.controllers.orders import _select_warehouse_for_order
@bp.route("/dispatch")
@login_required
def dispatch():
    warehouses = _select_warehouse_for_order(request=request)
    if len(warehouses) == 0:
        logger.info("No warehouse selected")
        return
    logger.info("Selected warehouse: %s", warehouses[0].name)
    # return response
You must patch where the method is used, not where it is declared. In your case, you are patching 'api.controllers.orders._select_warehouse_for_order' which is where the method is declared. Instead, patch 'dispatchers._select_warehouse_for_order' (possibly prefixed with whatever package contains dispatchers).
The reason for this is because when you do
from api.controllers.orders import _select_warehouse_for_order
you declare a name _select_warehouse_for_order in dispatchers.py that refers to the function which is declared in api/controllers/orders.py. Essentially you have created a second reference to the function. Now when you call
warehouses = _select_warehouse_for_order(request=request)
you are using the reference in dispatchers.py, not the one in api/controllers/orders.py. So in order to replace this function with a patch, you have to use dispatchers._select_warehouse_for_order.
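As a sketch, the patch target in the test would change like this (assuming dispatchers.py is importable as api.controllers.dispatchers; adjust the dotted path to wherever the module actually lives):

from unittest.mock import patch

def test_delivery_assignment(self, app_context):
    # patch the name that dispatch() actually looks up, i.e. the copy bound in dispatchers.py
    with patch('api.controllers.dispatchers._select_warehouse_for_order') as mock_selected_wh:
        mock_selected_wh.return_value = {}
        app_context.client.set_url_prefix('/v2')
        response = app_context.client.get('/delivery/dispatch')
        assert response.status_code == 200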
Notice how import in Python differs from Java: in Python, an import creates a new name and binds it to an existing function or class, whereas a Java import merely tells the compiler where to look for a class when it is mentioned in the code.
I've run into a bit of a wall importing modules in a Python script. I'll do my best to describe the error, why I run into it, and why I'm trying this particular approach to solve my problem (which I will describe in a second):
Let's suppose I have a module in which I've defined some utility functions/classes, which refer to entities defined in the namespace into which this auxiliary module will be imported (let "a" be such an entity):
module1:
def f():
    print a
And then I have the main program, where "a" is defined, into which I want to import those utilities:
import module1
a=3
module1.f()
Executing the program will trigger the following error:
Traceback (most recent call last):
File "Z:\Python\main.py", line 10, in <module>
module1.f()
File "Z:\Python\module1.py", line 3, in f
print a
NameError: global name 'a' is not defined
Similar questions have been asked in the past (two days ago, d'uh) and several solutions have been suggested; however, I don't really think these fit my requirements. Here's my particular context:
I'm trying to make a Python program which connects to a MySQL database server and displays/modifies data with a GUI. For cleanliness' sake, I've defined a bunch of auxiliary/utility MySQL-related functions in a separate file. However, they all share a common variable, which I had originally defined inside the utilities module, and which is the cursor object from the MySQLdb module.
I later realised that the cursor object (which is used to communicate with the db server) should be defined in the main module, so that both the main module and anything that is imported into it can access that object.
End result would be something like this:
utilities_module.py:
def utility_1(args):
    # code which references a variable named "cur"

def utility_n(args):
    # etcetera
And my main module:
program.py:
import MySQLdb, Tkinter
db=MySQLdb.connect(#blahblah) ; cur=db.cursor() #cur is defined!
from utilities_module import *
And then, as soon as I try to call any of the utilities functions, it triggers the aforementioned "global name not defined" error.
A particular suggestion was to have a "from program import cur" statement in the utilities file, such as this:
utilities_module.py:
from program import cur
#rest of function definitions
program.py:
import Tkinter, MySQLdb
db=MySQLdb.connect(#blahblah) ; cur=db.cursor() #cur is defined!
from utilities_module import *
But that's cyclic import or something like that and, bottom line, it crashes too. So my question is:
How in hell can I make the "cur" object, defined in the main module, visible to those auxiliary functions which are imported into it?
Thanks for your time and my deepest apologies if the solution has been posted elsewhere. I just can't find the answer myself and I've got no more tricks in my book.
Globals in Python are global to a module, not across all modules. (Many people are confused by this, because in, say, C, a global is the same across all implementation files unless you explicitly make it static.)
There are different ways to solve this, depending on your actual use case.
Before even going down this path, ask yourself whether this really needs to be global. Maybe you really want a class, with f as an instance method, rather than just a free function? Then you could do something like this:
import module1
thingy1 = module1.Thingy(a=3)
thingy1.f()
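For that option, module1.py might look something like this (Thingy is only an illustrative name; the answer shows just the calling side):

# module1.py
class Thingy(object):
    def __init__(self, a):
        self.a = a       # the former global now lives on the instance

    def f(self):
        print self.a     # Python 2 print, matching the question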
If you really do want a global, but it's just there to be used by module1, set it in that module.
import module1
module1.a=3
module1.f()
On the other hand, if a is shared by a whole lot of modules, put it somewhere else, and have everyone import it:
import shared_stuff
import module1
shared_stuff.a = 3
module1.f()
… and, in module1.py:
import shared_stuff
def f():
    print shared_stuff.a
Don't use a from import unless the variable is intended to be a constant. from shared_stuff import a would create a new a variable initialized to whatever shared_stuff.a referred to at the time of the import, and this new a variable would not be affected by assignments to shared_stuff.a.
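A quick illustration of that pitfall, assuming shared_stuff.py starts out with a = 0:

from shared_stuff import a    # binds a local name to the value 0
import shared_stuff

shared_stuff.a = 3
print a                       # still 0: the from-import copy is never rebound
print shared_stuff.a          # 3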
Or, in the rare case that you really do need it to be truly global everywhere, like a builtin, add it to the builtin module. The exact details differ between Python 2.x and 3.x. In 3.x, it works like this:
import builtins
import module1
builtins.a = 3
module1.f()
As a workaround, you could consider setting environment variables in the outer layer, like this.
main.py:
import os
os.environ['MYVAL'] = str(myintvariable)
mymodule.py:
import os
myval = None
if 'MYVAL' in os.environ:
    myval = os.environ['MYVAL']
As an extra precaution, handle the case when MYVAL is not defined inside the module.
This post is just an observation of Python behaviour I encountered. Maybe the advice you read above doesn't work for you if you did the same thing I describe below.
Namely, I have a module which contains global/shared variables (as suggested above):
#sharedstuff.py
globaltimes_randomnode=[]
globalist_randomnode=[]
Then I had the main module which imports the shared stuff with:
import sharedstuff as shared
and some other modules that actually populated these arrays. These are called by the main module. When exiting those other modules I could clearly see that the arrays were populated, but when reading them back in the main module, they were empty. This was rather strange to me (well, I am new to Python). However, when I changed the way I import sharedstuff.py in the main module to:
from sharedstuff import *
it worked (the arrays were populated).
Just sayin'
A function uses the globals of the module it's defined in. Instead of setting a = 3, for example, you should be setting module1.a = 3. So, if you want cur available as a global in utilities_module, set utilities_module.cur.
A better solution: don't use globals. Pass the variables you need into the functions that need it, or create a class to bundle all the data together, and pass it when initializing the instance.
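Applied to the question, a minimal sketch of the first suggestion (connection details and arguments are placeholders):

# program.py
import MySQLdb
import utilities_module

db = MySQLdb.connect(host="localhost")      # placeholder connection details
utilities_module.cur = db.cursor()          # make cur a global *inside* utilities_module
utilities_module.utility_1("some args")     # utility_1 can now see cur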
The easiest solution to this particular problem would have been to add another function within the module that would have stored the cursor in a variable global to the module. Then all the other functions could use it as well.
module1:

cursor = None

def setCursor(cur):
    global cursor
    cursor = cur

def method(some, args):
    global cursor
    do_stuff(cursor, some, args)

main program:

import module1
cursor = get_a_cursor()
module1.setCursor(cursor)
module1.method('some', 'args')
Since globals are module-specific, you can add the following function to all imported modules, and then use it to:
add individual variables (passed as a dictionary) as globals of those modules, or
transfer your main module's globals to them.
addglobals = lambda x: globals().update(x)
Then all you need to do to pass on the current globals is:
import module
module.addglobals(globals())
Since I haven't seen it in the answers above, I thought I would add my simple workaround: add a global_dict argument to the function that needs the calling module's globals, then pass the dict in when calling, e.g.:
# external_module
def imported_function(global_dict=None):
    print(global_dict["a"])
# calling_module
a = 12
from external_module import imported_function
imported_function(global_dict=globals())
>>> 12
The OOP way of doing this would be to make your module a class instead of a set of unbound methods. Then you could use __init__ or a setter method to set the variables from the caller for use in the module methods.
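A minimal sketch of that approach (the class and method names here are made up for illustration):

# utilities.py
class DbUtilities(object):
    def __init__(self, cur):
        self.cur = cur              # the shared cursor becomes instance state

    def utility_1(self, args):
        # use self.cur here instead of a global cur
        pass

# program.py
import MySQLdb
from utilities import DbUtilities

db = MySQLdb.connect(host="localhost")      # placeholder connection details
utils = DbUtilities(db.cursor())
utils.utility_1("some args")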
Update
To test the theory, I created a module and put it on pypi. It all worked perfectly.
pip install superglobals
Short answer
This works fine in Python 2 or 3:
import inspect

def superglobals():
    _globals = dict(inspect.getmembers(
        inspect.stack()[len(inspect.stack()) - 1][0]))["f_globals"]
    return _globals
save as superglobals.py and employ in another module thusly:
from superglobals import *
superglobals()['var'] = value
Extended Answer
You can add some extra functions to make things more attractive.
def superglobals():
    _globals = dict(inspect.getmembers(
        inspect.stack()[len(inspect.stack()) - 1][0]))["f_globals"]
    return _globals

def getglobal(key, default=None):
    """
    getglobal(key[, default]) -> value

    Return the value for key if key is in the global dictionary, else default.
    """
    _globals = dict(inspect.getmembers(
        inspect.stack()[len(inspect.stack()) - 1][0]))["f_globals"]
    return _globals.get(key, default)

def setglobal(key, value):
    _globals = superglobals()
    _globals[key] = value

def defaultglobal(key, value):
    """
    defaultglobal(key, value)

    Set the value of global variable `key` if it is not otherwise set.
    """
    _globals = superglobals()
    if key not in _globals:
        _globals[key] = value
Then use thusly:
from superglobals import *
setglobal('test', 123)
defaultglobal('test', 456)
assert(getglobal('test') == 123)
Justification
The "Python purity league" answers that litter this question are perfectly correct, but in some environments (such as IDAPython), which are basically single-threaded with a large globally instantiated API, it just doesn't matter as much.
It's still bad form and a bad practice to encourage, but sometimes it's just easier. Especially when the code you are writing isn't going to have a very long life.
I have a module which I called entities.py - there are 2 classes within it and 2 global variables, as in the pattern below:
FIRST_VAR = ...
SECOND_VAR = ...
class FirstClass:
    [...]

class SecondClass:
    [...]
I also have another module (let's call it main.py for now) where I import both classes and constants as like here:
from entities import FirstClass, SecondClass, FIRST_VAR, SECOND_VAR
In the same "main.py" module I have another constant: THIRD_VAR = ..., and another class, in which all of imported names are being used.
Now, I have a function which is being called only if a certain condition is met (passing a config file path as a CLI argument, in my case). As my best bet, I've written it as follows:
def update_consts_from_config(config: ConfigParser):
    global FIRST_VAR
    global SECOND_VAR
    global THIRD_VAR
    FIRST_VAR = ...
    SECOND_VAR = ...
    THIRD_VAR = ...
This works perfectly fine, although PyCharm flags two issues which, at least to me, don't seem accurate.
from entities import FirstClass, SecondClass, FIRST_VAR, SECOND_VAR - here it warns me that FIRST_VAR and SECOND_VAR are unused imports, but from my understanding and testing they are used and not re-declared elsewhere unless function update_consts_from_config is invoked.
Also, under update_consts_from_config function:
global FIRST_VAR - at this and the next lines, it says
Global variable FIRST_VAR is undefined at the module level
My question is: should I really care about those warnings (as I think the code is correct and clear), or am I missing something important and should I come up with something different here?
I know I can do something as:
import entities
from entities import FirstClass, SecondClass
FIRST_VAR = entities.FIRST_VAR
SECOND_VAR = entities.SECOND_VAR
and work from there, but this looks like overkill to me; the entities module contains only what I have to import in main.py, which also strictly depends on it, so I would rather keep importing those names explicitly than reference them through entities. just for that reason.
What do you think would be the best practice here? I would like my code to be clear, unambiguous and reasonably optimal.
Import only entities, then refer to variables in its namespace to access/modify them.
Note: this pattern of modifying constants in other modules (which then, to purists, aren't so much constants as globals) can be justified. I have tons of cases where I use constants, rather than magic values, as module-level configuration. However, for testing for example, I might reach in and modify these constants, say to switch a cache expiry from 2 days to 0.1 seconds to test caching, or, like you propose, to override configuration. Tread carefully, but it can be useful.
main.py:
import entities

def update_consts_from_config(FIRST_VAR):
    entities.FIRST_VAR = FIRST_VAR

firstclass = entities.FirstClass()
print(f"{entities.FIRST_VAR=} before override")
firstclass.debug()
entities.debug()

update_consts_from_config("override")

print(f"{entities.FIRST_VAR=} after override")
firstclass.debug()
entities.debug()
entities.py:
FIRST_VAR = "ori"

class FirstClass:
    def debug(self):
        print(f"entities.py:{FIRST_VAR=}")

def debug():
    print(f"making sure no closure/locality effects after object instantation {FIRST_VAR=}")
$ python main.py
entities.FIRST_VAR='ori' before override
entities.py:FIRST_VAR='ori'
making sure no closure/locality effects after object instantation FIRST_VAR='ori'
entities.FIRST_VAR='override' after override
entities.py:FIRST_VAR='override'
making sure no closure/locality effects after object instantation FIRST_VAR='override'
Now, if FIRST_VAR wasn't a string, int or another immutable type, you should, I think, be able to import it separately and mutate it, like SECOND_VAR.append("config override") in main.py. But assigning to a global in main.py will only affect the main.py binding, so if you want to share actual state between main.py, entities and other modules, everyone, not just main.py, needs to import entities and then access entities.FIRST_VAR.
Oh, and if you had:
class SecondClass:
    def __init__(self):
        self.FIRST_VAR = FIRST_VAR
then its instance-level value of that immutable string/int would not be affected by any overrides done after instance creation. Mutables like lists or dictionaries would be affected, because the different bindings all point to the same object.
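A tiny sketch of that mutation-vs-rebinding difference, assuming SECOND_VAR is a list defined in entities.py:

# main.py
import entities
from entities import SECOND_VAR

SECOND_VAR.append("config override")    # mutation: both names point to the same list object
print(entities.SECOND_VAR)              # ['config override']

SECOND_VAR = ["replaced"]               # rebinding: only main.py's name changes
print(entities.SECOND_VAR)              # still ['config override']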
Last, with regard to those "tricky" namespaces: global in your original code means "don't treat FIRST_VAR as a variable to assign in update_consts_from_config's local namespace; instead assign it in main.py's global, script-level namespace".
It does not mean "assign it to some global state magically shared between entities.py and main.py". __builtins__ might be that beast but modifying it is considered extremely bad form in Python.
I have the following:
objects
__init__.py
define.py
define.py:
class Place:
    def __init__(self, name, inhabitants):
        self.name = name
        self.inhabitants = inhabitants
        myFunction.toStoreThings.on.db(name, inhabitants, 'places')

    def someUsefulFunction(self):
        pass
If I run import objects and moon = objects.Place('Moon', []), then close the interpreter and open it again, I obviously lose the moon instance, but I have (u'Moon', u'[]') stored in the database. I already made __init__.py retrieve that information from the database and unstring it, but I'd also like it to instantiate 'Moon' as Moon = Place('Moon', []) so I can use Moon.someUsefulFunction() or objects.Moon.someUsefulFunction() even after I close the interpreter. How can I achieve this?
I was able to do it like this:
__init__.py:
# myFunction() creates a dictionary `objdic` of the stuff in the database
# >>>objects.objdic
# {'places' : [['Moon',[]]]}
instancesdic={}
instancesdic['places']={}
instancesdic['places'][objdic['places'][0][0]]=Place(*objdic['places'][0])
Which gives
>>> objects.instancesdic
{'places': {'Moon': <objects.Place instance at 0x1b29248>}}
This way I can use
objects.instancesdic['places']['Moon'].someUsefulFunction()
Which is OK, but I really wanted objects.Moon.someUsefulFunction(). Any attempt to name that whole thing Moon results either in:
TypeError: 'str' object does not support item assignment
Or in just the key in the dictionary being changed to an instance, instead of the Moon instance being created.
You could use the setattr function to set module attributes on the objects module, or you could update globals within that module. So within your __init__.py you could do:
objDict = {obj[0]: Place(*obj) for obj in objdic['places']}
globals().update(objDict)
This will then let you do objects.Moon, etc.
There is some danger to be aware of, though. If any of your objects have the same name as anything else already created in objects, they will overwrite those things. So if objects has a function called myFunc and then you create an object called myFunc, it could overwrite the function with the object. (Which will overwrite which depends on which order you do things in.)
For this reason, it's probably not a good idea to do this automatically in __init__.py. It can make sense to do this for ease of use in the interactive interpreter, but modifying globals in this way will get ugly if you use it in scripts. It might be a better idea to create a function called initGlobals or something, and then call that function to set up your interactive environment. If you put the code I showed above into such a function, then call it, it will set up the environment. This lets you separate the simple importing of the module from actually creating global objects from the db, because sometimes you might want to do one but not the other.
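A rough sketch of that initGlobals idea inside objects/__init__.py, reusing the objdic dictionary from the question:

# objects/__init__.py
def initGlobals():
    # build Place instances from the database data and expose them as
    # module-level names, e.g. objects.Moon
    objDict = {obj[0]: Place(*obj) for obj in objdic['places']}
    globals().update(objDict)

An interactive session would then run import objects; objects.initGlobals() before using objects.Moon.someUsefulFunction().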
I have a dictionary called fsdata at module level (like a global variable).
The content gets read from the file system. It should load its data once on the first access. Up to now it loads the data during importing the module. This should be optimized.
If no code accesses fsdata, the content should not be read from the file system (save CPU/IO).
Loading should happen, if you check for the boolean value, too:
if mymodule.fsdata:
    ... do_something()
Update: Some code already uses mymodule.fsdata. I don't want to change the other places. It should be a variable, not a function. And "mymodule" needs to remain a module, since it is already used in a lot of code.
I think you should use a Future/Promise, like this: https://gist.github.com/2935416
The main point: you create not the object itself, but a 'promise' of the object, which behaves like the object.
You can replace your module with an object that has descriptor semantics:
class FooModule(object):
    @property
    def bar(self):
        print "get"

import sys
sys.modules[__name__] = FooModule()
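Applied to the question, a sketch might look like this (the _load_fsdata_from_file helper is hypothetical, and the extra reference to the original module guards against it being garbage-collected after the sys.modules swap):

# mymodule.py
import sys

def _load_fsdata_from_file():
    # hypothetical loader: read the real data from the file system here
    return {"loaded": True}

class _LazyModule(object):
    _fsdata = None

    @property
    def fsdata(self):
        if _LazyModule._fsdata is None:            # load only on first access
            _LazyModule._fsdata = _load_fsdata_from_file()
        return _LazyModule._fsdata

_LazyModule._real_module = sys.modules[__name__]   # keep the original module alive
sys.modules[__name__] = _LazyModule()

Existing code that does if mymodule.fsdata: keeps working, because the attribute access triggers the property.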
Take a look at http://pypi.python.org/pypi/apipkg for a packaged approach.
You could just create a simple function that memoizes the data:
fsdata = []

def get_fsdata():
    if not fsdata:
        fsdata.append(load_fsdata_from_file())
    return fsdata[0]
(I'm using a list as that's an easy way to make a variable global without mucking around with the global keyword).
Now instead of referring to module.fsdata you can just call module.get_fsdata().