In Lua there is a function called loadfile that lets a program load a .lua file into the running project. All of the loaded file's functions and variables then work as if they had been written in the same file.
This is useful for loading plugins. I am trying to port a Telegram bot over to Python but cannot find a function that lets me load a .py file and have its functions available in the context of the loading file.
I have tried Python's execfile and importing the file, but neither gives the loaded file access to the functions of the initial file.
(i.e. fileA.py loads fileB.py; fileA has a function "doThis"; fileB can't access "doThis" using execfile)
How can I achieve the same thing in Python as loadfile does in Lua?
I am using Python 2.
You generally shouldn't do this, since wildcard imports can make code harder to debug, but:
from other_module import *
This imports everything defined in other_module into the current module's global namespace. It's not exactly the same, but it's as close as you'll get.
execfile("other_module.py", globals())
is very similar, except it won't cache the module in sys.modules.
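To make this concrete for the fileA/fileB example in the question, here is a minimal Python 2 sketch (the plugin_hello name is made up for illustration):
# fileA.py
def doThis():
    print "doThis was called"

# run fileB.py inside fileA's global namespace, much like Lua's loadfile/dofile
execfile("fileB.py", globals())

# anything fileB defined is now visible here too
plugin_hello()

# fileB.py -- executed in fileA's globals, so it can call doThis directly
doThis()

def plugin_hello():
    print "hello from the plugin"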
I wrote a module that, when imported, automatically changes the error output of my program. It is quite handy to have it in almost any Python code I write.
I don't want to add the line import my_errorhook to every script I write; I want that line to be added automatically.
I found this answer, which states that changing the behavior of Python directly should be avoided. So I thought about changing the command line, something like
python --importModule my_errorhook main.py
and defining an alias in my bashrc to override the python command so that the parameter is added automatically. Is there any way I could achieve such behavior?
There is no such thing as --importModule on the Python command line. The only way to inject the code without explicitly importing it is to put your functions into the builtins module. However, this practice is discouraged because it makes your code hard to maintain without proper design.
Let's assume your Python file main.py is the entry point of the whole program. You can create another file, bootstrap.py, and put the code below into it.
import main

__builtins__.func = lambda x: x >= 0  # register func so every module can see it
main.main()
Then the function func() can be called from any module without being imported. For example, in main.py:
def main():
    ...
    print(func(1))
    ...
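With this in place you start the program as python bootstrap.py instead of python main.py (or point the bashrc alias mentioned in the question at bootstrap.py), so that func is registered before main() runs.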
I'm trying to figure out how to include a .pyc file in a Python script.
For example my script is called:
myscript.py
and the script I would like to include is called:
included_script.pyc
So, do I just use:
import included_script
And will that automatically execute included_script.pyc? Or is there something further I need to do to get included_script.pyc to run inside myscript.py?
Do I need to pass the variables used in included_script.pyc also? If so, how might this be achieved?
Unfortunately, no, this cannot be done automatically. You can, of course, do it manually in a gritty ugly way.
Setup:
For demonstration purposes, I'll first generate a .pyc file. In order to do that, we first need a .py file for it. Our sample test.py file will look like:
def foo():
    print("In foo")

if __name__ == "__main__":
    print("Hello World")
Super simple. Generating the .pyc file can be done with the py_compile module found in the standard library. We simply pass in the name of the .py file and the name for our .pyc file in the following way:
import py_compile

py_compile.compile('test.py', 'mypyc.pyc')
This will place mypyc.pyc in our current working directory.
Getting the code from .pyc files:
Now, .pyc files contain bytes that are structured in the following way:
First 4 bytes: a 'magic number'
Next 4 bytes: a modification timestamp
Rest of the contents: a marshalled code object
(That is the layout for Python 2 and Python 3.2 or earlier; Python 3.3-3.6 add a 4-byte source-size field and 3.7+ add a further 4-byte flags field, so the header grows to 12 and 16 bytes respectively.)
What we're after is that marshalled code object, so we need to import marshal to un-marshal it and execute it. We don't need the first 8 bytes, and un-marshalling the .pyc file with them still attached is disallowed, so we'll ignore them (seek past them):
import marshal

with open('mypyc.pyc', 'rb') as s:
    s.seek(8)  # skip the header (magic number + timestamp; longer on Python 3.3+)
    code_obj = marshal.load(s)
So, now we have our fancy code object for test.py which is valid and ready to be executed as we wish. We have two options here:
Execute it in the current global namespace. This will bind all definitions inside our .pyc file in the current namespace and will act as a sort of: from file import * statement.
Create a new module object and execute the code inside the module. This will be like the import file statement.
Emulating from file import * like behaviour:
Performing this is pretty simple, just do:
exec(code_obj)
This will execute the code contained in code_obj in the current namespace and bind everything there. After the call we can call foo like any other function:
foo()
# prints: In foo
Note: exec() is a built-in.
Emulating import file like behaviour:
This requires one more module, types, which provides ModuleType, the type we can use to create a new module object. It takes two arguments, the name of the module (mandatory) and its documentation string (optional):
import types

m = types.ModuleType("Fancy Name", "Fancy Documentation")
print(m)
<module 'Fancy Name' (built-in)>
Now that we have our module object, we can again use exec to execute the code contained in code_obj inside the module namespace (namely, m.__dict__):
exec(code_obj, m.__dict__)
Now our module m has everything defined in code_obj; you can verify this by running:
m.foo()
# prints: In foo
These are the ways you can 'include' a .pyc file in your module. At least, the ways I can think of. I don't really see the practicality in this but hey, I'm not here to judge.
I have a collection of scripts written in Python. Each of them can be executed independently. However, most of the time they should be executed one after the other, so there is a MainScript.py which calls them in the appropriate order. Each script has some configurable variables (let's call them Root_Dir, Data_Dir and LinWinFlag). If this collection of scripts is moved to a different computer, or different data needs to be processed, these variable values need to be changed. As there are many scripts this duplication is annoying and error-prone. I would like to group all configuration variables into a single file.
I tried making Config.py which would contain them as per this thread, but import Config produces ImportError: No module named Config because they are not part of a package.
Then I tried relying on variable inheritance: define them once in MainScript.py which calls all the others. This works, but I realized that each script would not be able to run on its own. To solve this, I tried adding useGlobal=True in MainScript.py and in other files:
if (useGlobal is None or useGlobal == False):
    # define all variables
But this fails when the scripts are run standalone: NameError: name 'useGlobal' is not defined. The workaround is to define useGlobal and set it to False when running the scripts independently of MainScript.py. Is there a more elegant solution?
The idea is that Python wants to access files, including your Config.py, primarily as part of a module.
The nice thing is that Python makes building modules (i.e. Python packages) really easy: you initialize one by creating an __init__.py file in each directory you want to be a module, a submodule, a subsubmodule, and so on.
So your import should go through if you have created this file.
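For the layout described in the question, that could look roughly like the sketch below (the directory name, the extra script name and the variable values are made up; the variable names are the ones from the question):
# Assumed layout:
#   scripts/
#       __init__.py      <- empty file that marks the directory as a package
#       Config.py        <- shared settings live here
#       MainScript.py
#       step_one.py

# Config.py
Root_Dir = "/data/project"
Data_Dir = "/data/project/input"
LinWinFlag = "linux"

# step_one.py (or any other script in the collection)
import Config
print(Config.Root_Dir)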
If you have further questions, look at the excellent python documentation.
The best way to do this is to use a configuration file placed in your home directory (~/.config/yourscript/config.json).
You can then load the file at start-up and provide default values if the file does not exist:
Example (config.py) :
import json
import os

default_config = {
    "name": "volnt",
    "mail": "oh#hi.com"
}

def load_settings():
    settings = default_config.copy()  # copy so the defaults are not mutated
    try:
        # expanduser turns "~" into the real home directory path
        path = os.path.expanduser("~/.config/yourscript/config.json")
        with open(path, "r") as config_file:
            loaded_config = json.loads(config_file.read())
        for key in loaded_config:
            settings[key] = loaded_config[key]
    except IOError:  # file does not exist
        pass
    return settings
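A quick usage sketch, assuming the file above is saved as config.py next to your scripts:
from config import load_settings

settings = load_settings()
print(settings["name"])  # "volnt" unless overridden in config.json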
For a configuration file it's a good idea to use JSON rather than Python, because it makes the file easy to edit for people using your scripts.
As suggested by cleros, the ConfigParser module seems to be the closest thing to what I wanted (a one-line statement in each file that sets up multiple variables).
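For reference, a minimal sketch of that approach (the settings.ini file name, section name and values are made up; the variable names are the ones from the question):
# settings.ini, shared by all scripts:
#   [paths]
#   Root_Dir = /data/project
#   Data_Dir = /data/project/input
#   LinWinFlag = linux

import ConfigParser  # the module is called configparser in Python 3

config = ConfigParser.ConfigParser()
config.read("settings.ini")

Root_Dir = config.get("paths", "Root_Dir")
Data_Dir = config.get("paths", "Data_Dir")
LinWinFlag = config.get("paths", "LinWinFlag")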
I need to know how to run a Python script from a Python script located in another directory, along the lines of the following pseudo-code:
if option == true
    run /path/to/the/directory/PYTHON SCRIPT
else
ch3ka points out that you can use exec to do this. There are other ways like subprocess or os.system as well.
But Python works well with itself by design - this is the entire concept behind creating and importing modules. I think for most cases you'd be better off just encapsulating the script in a class, and moving the code that was previously in the if __name__ == '__main__' section of the script into the __init__ section of the class:
class PYTHON_SCRIPT:
    def __init__(self):
        pass  # put your logic here
Then you could just import the class:
from PYTHON_SCRIPT import PYTHON_SCRIPT  # assuming the class lives in PYTHON_SCRIPT.py

# no need to say if a boolean is true, just say if boolean
if option:
    PYTHON_SCRIPT()
This would additionally give you the benefit of being able to use properties within your script as you saw fit.
use execfile.
execfile(...)
execfile(filename[, globals[, locals]])
Read and execute a Python script from a file.
The globals and locals are dictionaries, defaulting to the current
globals and locals. If only globals is given, locals defaults to it.
In Python 3, execfile is gone. You can use exec(open('/path/to/file.py').read()) instead.
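If you want to keep your execfile call sites unchanged on Python 3, a small helper along these lines works (a sketch, not a standard-library function):
def execfile(filename, globals=None, locals=None):
    # rough stand-in for the Python 2 built-in; pass globals() explicitly
    # if you want the file to run in the caller's namespace
    with open(filename) as f:
        code = compile(f.read(), filename, 'exec')
    exec(code, globals, locals)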
Already answered here
How do I execute a program from python? os.system fails due to spaces in path
use subprocess module
import subprocess
subprocess.call(['C:\\Temp\\a b c\\Notepad.exe', 'C:\\test.txt'])
Other methods include making system calls using the os library, or execfile as in the other post.
If the script is well designed, it probably just launches a main function (often called main), so the most proper way to do this is to import that main function into your code and call it; this is the Pythonic way. You just need to add the script's directory to your Python path.
If possible, always try to avoid exec, subprocess, os.system, Popen, etc.
Example:
import sys

sys.path.insert(0, 'path/to/the/directory')
import python_script
sys.path.pop(0)

if option:
    python_script.main()
I'd like to save a complex configuration by writing some Python code with variable assignments and functions, which I will later import via a ConfigReader class.
Basically so far I've written my config file:
a=1
And a class that works like
c=ConfigReader("C:\my_file")
print(c.a)
For that I used exec(), but now I also want the config file to know its own filename (since the directory provides information for some variables).
So I need a config file like:
a=parse_project_number_from_dir(__file__)
It seems I need to replace exec() with some imp-module loading in Python 3? Is that the easiest way to execute a simple file while making it aware of its own path?
Moreover, I'd like my ConfigReader class to read all the variables into a dictionary (with exec I just looked at locals()). What should I do now, and can I avoid picking up the auxiliary imports in the config file (like the parse_project_number_from_dir function) and keep just what I actually define there (i.e. a=)?
When you exec a Python script, there is no filename, because you are giving exec a string or a compiled object. The exec command has no way to know where you got this string.
But there is a way to share the knowledge with the config file that you are passing to exec. For example...
Let's say you have a config file that says
print myname
Then run the following Python script:
ns = {}
ns["myname"] = "sample.cfg"
s = open("sample.cfg", "r")
exec s in ns  # in 3.x use exec(s.read(), ns) instead
This creates a new namespace, and in that namespace creates a variable named myname. Since the script knows the name of the file that it is about to open, it can assign this to a variable. The effect is the same as exec 'myname = "sample.cfg"' in ns.
Then, when you exec the config file, you share your script's knowledge by telling it to use the ns namespace. If this seems a bit confusing, do a little reading about globals and locals and namespaces.
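Applied to the ConfigReader class from the question, the same idea might look roughly like this (Python 3; a sketch only, and the filtering of helper imports is just one possible policy):
import types

class ConfigReader(object):
    def __init__(self, path):
        ns = {"__file__": path}      # let the config file see its own path
        with open(path) as f:
            exec(f.read(), ns)       # run the config file in that namespace
        # keep only the values the config file defined, skipping dunder
        # entries and anything that is itself a module
        self.settings = {}
        for name, value in ns.items():
            if name.startswith("__") or isinstance(value, types.ModuleType):
                continue
            self.settings[name] = value
            setattr(self, name, value)

# c = ConfigReader("my_config.py")   # file name is made up
# print(c.a)                         # individual variables ...
# print(c.settings)                  # ... or everything as a dictionary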
import sys
sys.path.insert(0,"path to your file")
from yourFile import *
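Note that with a real import like this, __file__ is set inside yourFile automatically by the import machinery, which is one practical difference from exec'ing the file's contents as a plain string.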