Config python script which knows its filename - python

I'd like to save complex configuration by writing some Python code with variable assignments and functions, which I will later load via a ConfigReader class.
Basically so far I've written my config file:
a=1
And a class that works like
c=ConfigReader("C:\my_file")
print(c.a)
For that I used exec(), but now I also want the config file to know its own filename (since the directory provides information for some variables).
So I need a config file like:
a=parse_project_number_from_dir(__file__)
It seems I need to replace exec() with some imp-style module loading in Python 3? Is that the easiest way to execute a simple file while making it aware of its path?
Moreover, I'd like my ConfigReader class to read all the variables into a dictionary. (With exec I just looked at locals().) What should I do now, and can I avoid picking up the auxiliary imports in the config file (like the parse_project_number_from_dir function) and collect only what I actually define there (i.e. a=)?

When you exec a Python script, there is no filename, because you are giving exec a string or a compiled object; exec has no way to know where you got that string.
But there is a way to share the knowledge with the config file that you are passing to exec. For example...
Let's say you have a config file that says
print myname
Then run the following Python script:
ns = {}
ns["myname"] = "sample.cfg"
s = open("sample.cfg","r")
exec s in ns # in 3.x use exec(s,ns) instead
This creates a new namespace, and in that namespace creates a variable named myname. Since the script knows the name of the file that it is about to open, it can assign this to a variable. The effect is the same as exec 'myname = "sample.cfg"' in ns.
Then, when you exec the config file, you share your script's knowledge by telling it to use the ns namespace. If this seems a bit confusing, do a little reading about globals and locals and namespaces.
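Applied to the ConfigReader from the question, a minimal sketch of this technique could look like the following (the filtering rule is just one possible choice, and it assumes the config file is trusted Python code):

import types

class ConfigReader:
    def __init__(self, path):
        ns = {"__file__": path}   # share the filename with the config code
        with open(path) as f:
            exec(f.read(), ns)    # Python 3 form; Python 2 used the statement: exec source in ns
        # keep plain assignments; drop dunders, helper functions and imported modules
        self.values = {k: v for k, v in ns.items()
                       if not k.startswith("__")
                       and not callable(v)
                       and not isinstance(v, types.ModuleType)}
        self.__dict__.update(self.values)

# usage: c = ConfigReader(r"C:\my_file.py"); print(c.a)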

import sys
sys.path.insert(0, "path to the directory containing your file")
from yourFile import *
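If you prefer a real import over exec, the Python 3 replacement for the imp module mentioned in the question is importlib. A rough sketch that loads the config file by path, so __file__ is set automatically inside it, and returns only the plain values (the module name "project_config" is arbitrary):

import importlib.util
import types

def read_config(path):
    spec = importlib.util.spec_from_file_location("project_config", path)
    module = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(module)   # the config code sees its own __file__
    # return only plain values, skipping helper functions and imported modules
    return {name: value for name, value in vars(module).items()
            if not name.startswith("_")
            and not callable(value)
            and not isinstance(value, types.ModuleType)}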

Related

make function available without import globally

let's say I wanted to make a core library for a project, with functions like:
def foo(x):
    """common, useful function"""
and I want to make these functions globally available in my project so that when I call them in a file, I don't need to import them. I have a virtualenv, so I feel like I should be able to modify my interpreter to make them globally available, but wasn't sure if there was any established methodologies behind this. I am aware it defies some pythonic principles.
It is possible to create a custom "launcher" that sets up some global variables and executes the code in a python file:
from sys import argv

# we read the code of the file passed as the first CLI argument
with open(argv[1]) as fin:
    code = fin.read()

# just an example: this will be available in the executed python file
def my_function():
    return "World"

global_variables = {
    'MY_CONSTANT': "Hello",      # prepare a global variable
    'my_function': my_function,  # prepare a global function
}

exec(code, global_variables)  # run the file with new global variables
Use it like this: python launcher.py my_dsl_file.py.
Example my_dsl_file.py:
# notice: no imports at all
print(MY_CONSTANT)
print(my_function())
Interestingly, Python (at least CPython) uses a different way to set up some useful functions like help. It runs a file called site.py that adds some values to the builtins module.
import builtins

def my_function():
    return "World"

builtins.MY_CONSTANT = "Hello"
builtins.my_function = my_function

# run your file like above or simply import it
import <your file>
I wouldn't recommend either of these ways. A simple from <your library> import * is a much better approach.
The downside of the first two variants is that no tool will know anything about your injected globals. E.g. mypy, flake8 and all IDEs I know of will fail.
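If you go with the recommended from <your library> import *, the library itself can limit what the wildcard exports via __all__. A small sketch (my_library is a hypothetical module name):

# my_library.py
__all__ = ["foo"]   # only names listed here are pulled in by a wildcard import

def foo(x):
    """common, useful function"""
    return x

def _internal_helper():
    # underscore-prefixed names are skipped by * even without __all__
    pass

A from my_library import * in another file then makes foo available, and static tools can still resolve the name.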

How can I import a .pyc compiled python file and use it

I'm trying to figure out how to include a .pyc file in a Python script.
For example my script is called:
myscript.py
and the script I would like to include is called:
included_script.pyc
So, do I just use:
import included_script
And will that automatically execute the included_script.pyc ? Or is there something further I need to do, to get my included_script.pyc to run inside the myscript.py?
Do I need to pass the variables used in included_script.pyc also? If so, how might this be achieved?
Unfortunately, no, this cannot be done automatically. You can, of course, do it manually in a gritty ugly way.
Setup:
For demonstration purposes, I'll first generate a .pyc file. In order to do that, we first need a .py file for it. Our sample test.py file will look like:
def foo():
    print("In foo")

if __name__ == "__main__":
    print("Hello World")
Super simple. Generating the .pyc file can be done with the py_compile module found in the standard library. We simply pass in the name of the .py file and the name for our .pyc file in the following way:
import py_compile
py_compile.compile('test.py', 'mypyc.pyc')
This will place mypyc.pyc in our current working directory.
Getting the code from .pyc files:
Now, .pyc files contain bytes that are structured in the following way:
First 4 bytes signalling a 'magic number'
Next 4 bytes holding a modification timestamp
Rest of the contents are a marshalled code object.
What we're after is that marshalled code object, so we need to import marshal to un-marshal it and execute it. Additionally, we really don't care about or need the first 8 bytes, and un-marshalling the .pyc file with them is disallowed, so we'll ignore them (seek past them). Note that this 8-byte header only applies to older .pyc formats; see the sketch after the following code for newer versions:
import marshal
s = open('mypyc.pyc', 'rb')
s.seek(8) # go past first eight bytes
code_obj = marshal.load(s)
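For reference, Python 3.3-3.6 writes a 12-byte header and 3.7+ writes a 16-byte one, so a small sketch that picks the right offset, assuming the .pyc was written by the running interpreter, could look like this:

import marshal
import sys

def load_pyc_code(path):
    # header size depends on the Python version that wrote the file:
    # 8 bytes before 3.3, 12 bytes for 3.3-3.6, 16 bytes for 3.7+
    if sys.version_info >= (3, 7):
        header = 16
    elif sys.version_info >= (3, 3):
        header = 12
    else:
        header = 8
    with open(path, 'rb') as f:
        f.seek(header)           # skip the header
        return marshal.load(f)   # the rest is the marshalled code object

code_obj = load_pyc_code('mypyc.pyc')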
So, now we have our fancy code object for test.py which is valid and ready to be executed as we wish. We have two options here:
Execute it in the current global namespace. This will bind all definitions inside our .pyc file in the current namespace and will act as a sort of: from file import * statement.
Create a new module object and execute the code inside the module. This will be like the import file statement.
Emulating from file import * like behaviour:
Performing this is pretty simple, just do:
exec(code_obj)
This will execute the code contained inside code_obj in the current namespace and bind everything there. After the call we can call foo like any other function:
foo()
# prints: In foo
Note: exec() is a built-in.
Emulating import file like behaviour:
This includes another requirement, the types module. This contains the type for ModuleType which we can use to create a new module object. It takes two arguments, the name for the module (mandatory) and the documentation for it (optional):
import types

m = types.ModuleType("Fancy Name", "Fancy Documentation")
print(m)
<module 'Fancy Name' (built-in)>
Now that we have our module object, we can again use exec to execute the code contained in code_obj inside the module namespace (namely, m.__dict__):
exec(code_obj, m.__dict__)
Now our module m has everything defined in code_obj; you can verify this by running:
m.foo()
# prints: In foo
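To make this emulation even closer to a real import, you can also register the module object in sys.modules so later import statements find it; a short sketch (the name "mypyc" is just an example):

import sys

sys.modules["mypyc"] = m   # from now on, `import mypyc` returns this module object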
These are the ways you can 'include' a .pyc file in your module. At least, the ways I can think of. I don't really see the practicality in this but hey, I'm not here to judge.

Python equivalent of lua's loadfile

In Lua there is a function called loadfile; it allows the program to load a .lua file into the current project. All the functions and variables work as if they were written in the same file.
This is useful for loading plugins. I am trying to port a telegram bot over to python but cannot find a function that allows me to load a .py file and have the functions be in the context of the file.
I have tried Python's execfile and importing the file, but that does not put the two files in the same scope
(i.e. fileA.py loads fileB.py; fileA has a function "doThis"; fileB can't access "doThis" when loaded via execfile).
How can I achieve the same thing in python as loadfile for lua?
I am using Python 2.
You generally shouldn't do this, since wildcard imports can make code harder to debug, but:
from other_module import *
This imports everything defined in other_module into the current module's global namespace. It's not exactly the same, but it's as close as you'll get.
execfile("other_module.py", globals())
is very similar, except it won't cache the module in sys.modules.
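Putting it together for the plugin use case in the question, a rough Python 2 sketch of a loadfile-style helper (the name loadfile is just borrowed from Lua) runs the plugin against the caller's module globals, so it can both see functions like doThis and add its own definitions:

def loadfile(path, namespace=None):
    # run `path` inside `namespace` (default: this module's globals)
    if namespace is None:
        namespace = globals()
    execfile(path, namespace)   # Python 2; in 3.x: exec(open(path).read(), namespace)
    return namespace

loadfile("fileB.py")   # fileB can call doThis, and its definitions land in this module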

How to run a particular python script present in any other directory after a "if" statement in a python script?

I need to know how to run a Python script from a Python script when the target is in another directory, like the following algorithm:
if option==true
run /path/to/the/directory/PYTHON SCRIPT
else
ch3ka points out that you can use exec to do this. There are other ways like subprocess or os.system as well.
But Python works well with itself by design - this is the entire concept behind creating and importing modules. I think for most cases you'd be better off just encapsulating the script in a class, and moving the code that was previously in the if __name__ == '__main__' section of the script into the __init__ section of the class:
class PYTHON_SCRIPT:
    def __init__(self):
        # put your logic here
        pass
Then you could just import the class:
import PYTHON_SCRIPT
# no need to say if a boolean is true, just say if boolean
if option:
    PYTHON_SCRIPT.PYTHON_SCRIPT()  # the class lives inside the imported module
This would additionally give you the benefit of being able to use properties within your script as you saw fit.
use execfile.
execfile(...)
execfile(filename[, globals[, locals]])
Read and execute a Python script from a file.
The globals and locals are dictionaries, defaulting to the current
globals and locals. If only globals is given, locals defaults to it.
In Python 3, execfile is gone. You can use exec(open('/path/to/file.py').read()) instead.
Already answered here
How do I execute a program from python? os.system fails due to spaces in path
Use the subprocess module:
import subprocess
subprocess.call(['C:\\Temp\\a b c\\Notepad.exe', 'C:\\test.txt'])
Other methods include making system calls using the os library, or execfile as in the other post.
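If you do go the subprocess route for running another Python script, it is safer to launch it with the same interpreter via sys.executable; a minimal sketch (the path is just an example):

import subprocess
import sys

option = True   # example flag
if option:
    subprocess.call([sys.executable, '/path/to/the/directory/python_script.py'])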
If the script is well designed, it probably just launches a main function (often called main), so the most proper way to do this is to import that main function into your code and call it; this is the Pythonic way. You just need to add the directory of the script to your Python path.
If possible, always try to avoid exec, subprocess, os.system, Popen, etc.
Example:
import sys

sys.path.insert(0, 'path/to/the/directory')
import python_script
sys.path.pop(0)

if option:
    python_script.main()

Python, dynamically invoke script

I want to run a Python script from within another. By within I mean any state changes from the child script affect the parent's state. So if a variable is set in the child, it gets changed in the parent.
Normally you could do something like
import module
But the issue here is that the child script being run is an argument to the parent script, and I don't think you can use import with a variable.
Something like this
$python run.py child.py
This would be what I would expect to happen
#run.py
#insert magic to run argv[1]
print a
#child.py
a = 1
$python run.py child.py
1
You can use the __import__ function which allows you to import a module dynamically:
import sys
module = __import__(sys.argv[1])
(You may need to remove the trailing .py or not specify it on the command line.)
From the Python documentation:
Direct use of __import__() is rare, except in cases where you want to import a module whose name is only known at runtime.
While __import__ certainly executes the specified file, it also stores it in sys.modules. If you want to re-execute the same file, you'd have to do a reload.
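To get the behaviour the question expects, where a variable assigned in child.py (such as a) becomes visible to run.py, one rough sketch is to copy the imported module's top-level names into the parent's globals (it assumes the script name is passed without the .py suffix, e.g. python run.py child):

# run.py
import sys

child = __import__(sys.argv[1])
# pull the child's top-level names into this script's namespace
globals().update({k: v for k, v in vars(child).items() if not k.startswith('__')})

print(a)   # prints 1 when child.py contains `a = 1`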
You can also take a look at the Python exec statement, which could be better suited to your needs.
From Python documentation :
This statement supports dynamic execution of Python code. The first expression should evaluate to either a string, an open file object, or a code object.
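Using exec (or execfile in Python 2) instead keeps everything directly in run.py's own namespace, which matches the question's example even more closely; a minimal Python 2 sketch:

# run.py
import sys

execfile(sys.argv[1], globals())   # Python 3: exec(open(sys.argv[1]).read(), globals())
print a                            # `a` was assigned by child.py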
