How to run a large amount of python code from a string? - python

I need to be able to run a large amount of python code from a string. Simply using exec doesn't seem to work: while the code runs perfectly in a normal setting, running it this way throws an error. I also don't think I can just import it, as it is hosted on the internet. Here is the code:
import urllib.request

URL = "https://dl.dropboxusercontent.com/u/127476718/instructions.txt"

def main():
    instructions = urllib.request.urlopen(URL)
    exec(instructions.read().decode())

if __name__ == "__main__":
    main()
This is the error I've been getting:
Traceback (most recent call last):
  File "C:\Python33\rc.py", line 12, in <module>
    main()
  File "C:\Python33\rc.py", line 9, in main
    exec(instructions.read().decode())
  File "<string>", line 144, in <module>
  File "<string>", line 120, in main
NameError: global name 'Player' is not defined
The code I'm trying to run is available in the link in the first code snippet.
If you have any questions I'll answer them. Thank you.

Without specifying globals, the exec function (Python/bltinmodule.c) uses PyEval_GetGlobals() and PyEval_GetLocals(). For the execution frame of a function, the latter creates a new f_locals dict, which will be the target for the IMPORT_NAME, STORE_NAME, LOAD_NAME ops in the compiled code.
At the module level in Python the normal state of affairs is globals() == locals(). In that case STORE_NAME is using the module's globals, which is what a function defined within the module will use as its global namespace. However, using separate dicts for globals and locals obviously breaks that assumption.
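Here is a minimal stand-in (not the OP's actual script) that reproduces the same NameError: the class is bound in exec's separate locals dict, while main(), whose global namespace is the caller's module globals, never sees it.
code = """
class Player:
    pass

def main():
    Player()    # looked up in main's globals at call time

main()
"""

def run():
    exec(code)    # no globals passed: caller's globals plus a fresh locals dict

run()    # NameError: Player was stored in the locals dict, not in globals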
The solution is to manually supply globals, which exec will then also use as locals:
def main():
    instructions = urllib.request.urlopen(URL)
    exec(instructions.read().decode(), globals())
You could also use a new dict that has __name__ defined:
def main():
    instructions = urllib.request.urlopen(URL)
    g = {'__name__': '__main__'}
    exec(instructions.read().decode(), g)
I see in the source that the current directory will need a sound file named "pickup.wav", else you'll just get another error.
Of course, the comments about the security problems with using exec like this still apply. I'm only addressing the namespace technicality.

First I thought you might try __import__ with a StringIO object. Might look something like StackOverflow: Local Import Statements in Python.
... but that's not right.
Then I thought of using the imp module, but that doesn't seem to work either.
Then I looked at: Alex Martelli's answer to Use of Eval in Python --- and tried to use it on a silly piece of code myself.
I can get the ast object, and the results of compile() from that (though it also seems that one can simply call compile(some_string_containing_python_source, 'SomeName', 'exec') without going through the ast.parse() intermediary step if you like). From what I gather, you'd use ast if you wanted to traverse the resulting syntax tree, inspecting and possibly modifying nodes, before compiling it.
At the end it seems that you'll need to exec() the results of your compile() before you have resulting functions, classes or variables defined in your execution namespace.
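A rough sketch of that parse/compile/exec pipeline (the source string and names below are placeholders, not the questioner's code):
import ast

source = "def greet():\n    print('hello')\n"

tree = ast.parse(source)                    # optional stop: inspect or rewrite nodes here
code = compile(tree, '<remote>', 'exec')    # or compile(source, '<remote>', 'exec') directly

namespace = {'__name__': '__main__'}
exec(code, namespace)                       # definitions land in `namespace`
namespace['greet']()                        # prints: hello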

You can use a pipe to feed the strings into a child Python process and read the output back from it.
Look up os.popen or subprocess.Popen.
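For example, a minimal sketch with subprocess (Python 3.7+, assuming a python interpreter is on PATH; the code string stands in for the downloaded source):
import subprocess

code = "print('hello from the child process')"

# "python -" tells the child interpreter to read its program from stdin.
result = subprocess.run(["python", "-"], input=code,
                        capture_output=True, text=True)
print(result.stdout)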

Related

How can I import a .pyc compiled python file and use it

I'm trying to figure out how to include a .pyc file in a python script.
For example my script is called:
myscript.py
and the script I would like to include is called:
included_script.pyc
So, do I just use:
import included_script
And will that automatically execute included_script.pyc? Or is there something further I need to do to get my included_script.pyc to run inside myscript.py?
Do I need to pass the variables used in included_script.pyc also? If so, how might this be achieved?
Unfortunately, no, this cannot be done automatically. You can, of course, do it manually in a gritty ugly way.
Setup:
For demonstration purposes, I'll first generate a .pyc file. In order to do that, we first need a .py file for it. Our sample test.py file will look like:
def foo():
    print("In foo")

if __name__ == "__main__":
    print("Hello World")
Super simple. Generating the .pyc file can be done with the py_compile module found in the standard library. We simply pass in the name of the .py file and the name for our .pyc file in the following way:
import py_compile

py_compile.compile('test.py', 'mypyc.pyc')
This will place mypyc.pyc in our current working directory.
Getting the code from .pyc files:
Now, .pyc files contain bytes that are structured in the following way:
First 4 bytes signalling a 'magic number'
Next 4 bytes holding a modification timestamp
Rest of the contents are a marshalled code object.
(The exact header layout is version dependent: it is 8 bytes as described here in Python 2 and up to Python 3.2, 12 bytes in 3.3-3.6, which add a source-size field, and 16 bytes in 3.7+, which add a flags field.)
What we're after is that marshalled code object, so we need to import marshal to un-marshal it and execute it. Additionally, we really don't care about or need those header bytes, and un-marshalling the .pyc file with them still attached is disallowed, so we'll ignore them (seek past them):
import marshal

s = open('mypyc.pyc', 'rb')
s.seek(8)  # skip the header (8 bytes here; 12 or 16 on newer Pythons, see above)
code_obj = marshal.load(s)
So, now we have our fancy code object for test.py which is valid and ready to be executed as we wish. We have two options here:
Execute it in the current global namespace. This will bind all definitions inside our .pyc file in the current namespace and will act as a sort of: from file import * statement.
Create a new module object and execute the code inside the module. This will be like the import file statement.
Emulating from file import * like behaviour:
Performing this is pretty simple, just do:
exec(code_obj)
This will execute the code contained inside code_obj in the current namespace and bind everything there. After the call we can call foo like any other function:
foo()
# prints: In foo
Note: exec() is a built-in.
Emulating import file like behaviour:
This introduces another requirement: the types module. It contains ModuleType, which we can use to create a new module object. ModuleType takes two arguments, the name for the module (mandatory) and the documentation for it (optional):
import types

m = types.ModuleType("Fancy Name", "Fancy Documentation")
print(m)
<module 'Fancy Name' (built-in)>
Now that we have our module object, we can again use exec to execute the code contained in code_obj inside the module namespace (namely, m.__dict__):
exec(code_obj, m.__dict__)
Now our module m has everything defined in code_obj; you can verify this by running:
m.foo()
# prints: In foo
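If you also want later import statements elsewhere in the program to find this module, you could additionally register it in sys.modules (a small extra step; the name "mypyc" here is just an example):
import sys

sys.modules["mypyc"] = m    # from now on, `import mypyc` returns this module object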
These are the ways you can 'include' a .pyc file in your module. At least, the ways I can think of. I don't really see the practicality in this but hey, I'm not here to judge.

NameError on global variables when multiprocessing, only in subdirectory

I have a main process which uses execfile and runs a script in a child process. This works fine unless the script is in another directory -- then everything breaks down.
This is in mainprocess.py:
from multiprocessing import Process

m = "subdir\\test.py"

if __name__ == '__main__':
    p = Process(target = execfile, args = (m,))
    p.start()
Then in a subdirectory aptly named subdir, I have test.py
import time

def foo():
    print time.time()

foo()
When I run mainprocess.py, I get the error:
NameError: global name 'time' is not defined
but the issue isn't limited to module names -- sometimes I'll get an error on a function name in other pieces of code.
I've tried importing time in mainprocess.py and also inside the if statement there, but neither has any effect.
One way of avoiding the error (I haven't tried this) is to copy test.py into the parent directory and insert a line in the file to os.chdir back to the original directory. However, this seems rather sloppy.
So what is happening?
The solution is to change your Process initialization:
p = Process(target=execfile, args=(m, {}))
Honestly, I'm not entirely sure why this works. I know it has something to do with which dictionary (locals vs. globals) that the time import is added to. It seems like when your import is made in test.py, it's treated like a local variable, because the following works:
import time # no foo() anymore
print(time.time()) # the call to time.time() is in the same scope as the import
However, the following also works:
import time

def foo():
    global time
    print(time.time())

foo()
This second example shows me that the import is still assigned to some kind of global namespace, I just don't know how or why.
If you call execfile() normally, rather than in a subprocess, everything runs fine, and in fact you can then use the time module anywhere after the execfile() call in your main process, because time has been brought into the same namespace. I think that since you're launching it in a subprocess, there is no module-level namespace for the import to be assigned to (execfile doesn't create a module object when called). I think that when we add the empty dictionary to the call to execfile, we're supplying the globals dictionary argument, thus giving the import mechanism a global namespace to assign the name time to.
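Here is a self-contained Python 2 illustration of that globals-vs-locals split (a stand-in snippet, not the OP's files):
src = "import time\ndef foo():\n    print time.time()\nfoo()"
code = compile(src, "<test>", "exec")

exec code in {}            # one dict for both globals and locals: foo() finds time

try:
    exec code in {}, {}    # separate dicts: time lands in locals, foo() looks in globals
except NameError as e:
    print e                # global name 'time' is not defined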
Some links for background:
1) Tutorial page on namespaces and scope
- look here for builtin, global, and local namespace explanations first
2) Python docs on execfile command
3) A very similar question on a non-SO site

NameError when importing a dict

I've got a script where I want to import a dict from a file and then use it to execute functions.
The file codes.py is as follows:
rf_433mhz = {
    "0x471d5c" : sensor_LaundryDoor,
}
And the file that uses it is as follows:
#!/usr/bin/python
import mosquitto
import json
import time

def sensor_LaundryDoor():
    print "Laundry Door Opened"
    mqttc.publish("actuators", json.dumps(["switch_HallLight", "on"]))

from codes import rf_433mhz
but I'm getting a NameError.
Traceback (most recent call last):
  File "sensors.py", line 11, in <module>
    from codes import rf_433mhz
  File "/root/ha/modules/processing/codes.py", line 2, in <module>
    "0x471d5c" : sensor_LaundryDoor,
NameError: name 'sensor_LaundryDoor' is not defined
Is there any way to do what I'm trying to do? It seems to be getting stuck on not having the function inside codes.py
I'm trying to call sensor_LaundryDoor() as follows:
def on_message(msg):
    inbound = json.loads(msg.payload)
    medium = inbound[0]
    content = inbound[1]
    if str(medium) == "433mhz":
        try:
            rf_433mhz[str(content)]()
        except:
            print "Sorry code " + content + " is not setup"
import isn't include: it won't dump the source code of codes.py into your script. Rather, it runs codes.py in its own namespace, almost like a separate script, and then binds either the module object or specific module contents to names in the importing namespace. In the namespace of codes.py, there is no sensor_LaundryDoor variable.
The way you're dividing the code into modules isn't very useful. To understand codes.py, you need to understand the other file to know what sensor_LaundryDoor is. To understand the other file, you need to understand codes.py to know what you're importing. This circular dependency would negate most of the benefit of modularizing your code even if it weren't an error. Reorganize your code to fix the circular dependency, and you'll probably fix the NameError as well.
The problem is that in your dictionary that you're importing, you're setting the value of 0x471d5c to a variable that was either not defined, or not defined in that scope.
An example of this would be like:
codes.py
#!/usr/bin/python
sensor_LaundryDoor = 'foo'

rf_433mhz = {
    "0x471d5c" : sensor_LaundryDoor,
}
Main file
#!/usr/bin/python
from codes import rf_433mhz

print rf_433mhz["0x471d5c"]
There are hacky ways to solve this but it looks like the real issue is that you're trying to write C-style code in Python. The Python way to do things would be to import sensor_LaundryDoor in codes.py before using it, and if this introduces a circular reference then that's a design issue.
Maybe you need three modules, events.py with your main loop which imports the dict from codes.py which imports the functions from sensors.py.
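A sketch of that three-module layout (the file names follow the suggestion above; the bodies are illustrative only):
# sensors.py
def sensor_LaundryDoor():
    print "Laundry Door Opened"

# codes.py
from sensors import sensor_LaundryDoor

rf_433mhz = {
    "0x471d5c": sensor_LaundryDoor,
}

# events.py -- main loop goes here
from codes import rf_433mhz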

Python won't define function

I am getting back into python and I'm having a really basic issue....
My source has the following...
def calrounds(rounds):
    print rounds
When I run this through the shell and try to call calrounds(3) I get:
Traceback (most recent call last):
  File "<pyshell#1>", line 1, in <module>
    calrounds(3)
NameError: name 'calrounds' is not defined
It's been a while since I've used python, humor me.
Did you import your source first?
It says that the first line of your program is calling calrounds with the argument 3. Move that call below your function definition; the definition needs to come before you call the function. If you are using Python 3.0+, you also need parentheses for the print statement:
>>> def calrounds(rounds):
...     print(rounds)
...
>>> calrounds(3)
3
The first thing to do is to look at how you're calling the function. Assuming it's in myModule.py, did you import myModule or did you from myModule import calrounds? If you used the first one, you need to call it as myModule.calrounds().
Next thing I would do is to make sure that you're restarting your interpreter. If you have imported a module, importing it again will not reload the source, but use what is already in memory.
The next possibility is that you're importing a file other than the one you think you are. You might be in a different directory or loading something from the standard library. After you import myModule you should print myModule.__file__ and see if it is the file you think you're working on. After 20 years of programming, I still find myself doing this about once a year, and it's incredibly frustrating.
Finally, there's the chance that Python is just acting up. Next to your myModule.py there will be a myModule.pyc - this is where Python puts the compiled code so it can load modules faster. Normally it's smart enough to tell if your source has been modified but, occasionally, it fails. Delete your .pyc file and restart the interpreter.
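A quick interactive check for the last two points (Python 2, using the hypothetical myModule from above):
import myModule

print myModule.__file__    # confirm which file (and which .pyc) actually got loaded
reload(myModule)           # Python 2 builtin: re-read the source without restarting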

Python import scope issue

I'm trying to do some script reuse for some python build scripts. An abbreviated version of my "reusable" part looks like (_build.py):
Sources = []
Sources += glob('*.c')
Sources += glob('../FreeRTOS/*.c')
...

def addSources(directory, *rest):
    for each in rest:
        Sources += ['../'+directory+'/'+each+'.c']

def addSharedSources(*args):
    addSources('Shared', *args)
Then in the customized part, I have something like (build.py):
#!/usr/bin/env python
from _build import *
...
#Additional source files from ../Shared, FreeRTOS and *.c are already in
addSharedSources('ccpwrite', 'discovery', 'radioToo', 'telemetry', 'utility')
Unfortunately, when I try to run build.py, I get a traceback that looks like:
Traceback (most recent call last):
  File "./build.py", line 8, in <module>
    addSharedSources('ccpwrite', 'discovery', 'radioToo', 'telemetry', 'utility')
  File "/Users/travisg/Projects/treetoo/Twig/_build.py", line 49, in addSharedSources
    addSources('Shared', *args)
  File "/Users/travisg/Projects/treetoo/Twig/_build.py", line 46, in addSources
    Sources += ['../'+directory+'/'+each+'.c']
UnboundLocalError: local variable 'Sources' referenced before assignment
So even though I did the wildcard import, it would appear that when the import function is called, it's not referencing my "global" variable imported from the original. Is there a way to make it work? I toyed around with global, but that didn't seem to do what I wanted.
This has nothing to do with the import. You'll have the same problem if you run _build.py directly. The problem is that the function addSources is modifying the global Sources without declaring it global. Insert a global declaration in the addSources function, and all should be well.
Explanation: it is very easy to write this kind of code by mistake, so Python lets you read a global variable without declaring it global, but not rebind it. An assignment (including +=) inside a function makes the name local unless you declare it global, which is why you get UnboundLocalError here.
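A sketch of the fix inside _build.py (only the global declaration is new):
def addSources(directory, *rest):
    global Sources              # rebind the module-level list, not a new local name
    for each in rest:
        Sources += ['../'+directory+'/'+each+'.c']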
