How to get the caller script name - python

I'm using Python 2.7.6 and I have two scripts:
outer.py
import sys
import os
print "Outer file launching..."
os.system('inner.py')
and the called script, inner.py:
import sys
import os
print "[CALLER GOES HERE]"
I want the second script (inner.py) to print the name of the caller script (outer.py).
I can't pass to inner.py a parameter with the name of the first script because I have tons of called/caller scripts and I can't refactor all the code.
Any idea?

One idea is to use psutil.
#!env/bin/python
import psutil
me = psutil.Process()                        # the current process (inner.py)
parent = psutil.Process(me.ppid())           # typically the shell spawned by os.system()
grandparent = psutil.Process(parent.ppid())  # typically the Python process running outer.py
print grandparent.cmdline()
This is of course dependent on how you start outer.py.
This solution is OS independent.
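Building on the same idea, here is a hedged sketch that walks up the ancestors instead of hard-coding the grandparent; the helper name caller_script is made up, and it assumes the caller was started as python some_script.py so that the script path is the second command-line argument.
import psutil

def caller_script():
    # walk up the process tree until we find something that looks like "python <script>.py"
    proc = psutil.Process().parent()
    while proc is not None:
        cmdline = proc.cmdline()
        if len(cmdline) >= 2 and cmdline[1].endswith('.py'):
            return cmdline[1]
        proc = proc.parent()    # skip intermediate shells spawned by os.system()
    return None

print(caller_script())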

On Linux you can get the parent process ID and then the caller's name like so.
p1.py
import os
os.system('python p2.py')
p2.py
import os
pid = os.getppid()
cmd = open('/proc/%d/cmdline' % (pid,)).read()
# /proc/<pid>/cmdline separates the arguments with NUL bytes, not spaces
caller = ' '.join(cmd.rstrip('\0').split('\0')[1:])
print caller
Running python p1.py will yield p1.py.
I imagine you can do similar things on other operating systems as well.

Another, slightly shorter version, for Unix only:
import os
# os.system() only returns the exit status, so capture the output with os.popen() instead;
# note that /proc/<pid>/exe resolves to the parent's executable (usually the Python interpreter), not the script
parent = os.popen('readlink -f /proc/%d/exe' % os.getppid()).read().strip()

If applicable to your situation you could also simply pass an argument that lets inner.py differentiate:
import sys
import os
print "Outer file launching..."
os.system('inner.py launcher')
inner.py:
import sys
import os
try:
    # sys.argv[0] is the script's own name; the extra argument is sys.argv[1]
    if sys.argv[1] == 'launcher':
        print 'outer.py called us'
except IndexError:
    pass


How to create a new console in Python to print message

I have a Python script running in the console, and I want to create another console for printing important messages, without running another Python script to do that.
I first tried to use win32console.AllocConsole() directly, but it failed with Access is denied
(seemingly because one process can attach to at most one console, according to the docs).
So I tried creating a new process using multiprocessing:
import sys, os
import win32api, win32con, win32console
import multiprocessing
def ShowConsole():
    win32console.FreeConsole()
    win32console.AllocConsole()
    sys.stdout = open("CONOUT$", "w")
    sys.stderr = open("CONOUT$", "w")
    print("Test")
    os.system("pause")

if __name__ == '__main__':
    p = multiprocessing.Process(target=ShowConsole)
    p.start()
But when I ran the code in PowerShell, it exited immediately with no message and no new console was created.
None of the possible solutions I found on Stack Overflow work for me. What should I do?
Update: It turns out that multiprocessing.Process fails to call the ShowConsole function. I used multiprocessing.dummy.Process as an alternative and it works as expected.
The reason why multiprocessing.Process fails to call the target is still unclear.
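For reference, the workaround described in the update is just a change to the __main__ block above: multiprocessing.dummy.Process has the same interface as multiprocessing.Process but runs the target in a thread of the same process rather than in a new process.
import multiprocessing.dummy

if __name__ == '__main__':
    # same API as multiprocessing.Process, but backed by a thread,
    # so ShowConsole (defined above) runs inside this process
    p = multiprocessing.dummy.Process(target=ShowConsole)
    p.start()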
There's nothing wrong with your example above; when I run it, it pops up the console as expected. I added a "hello" in the main section to differentiate.
But since you want to pass values from the first console to the second,
here's a better example. Use put/get on a Queue to pass the information from the first console to the second console.
import win32console
import multiprocessing
import time
def secondconsole(output):
    win32console.FreeConsole()
    win32console.AllocConsole()
    while True:
        print(output.get())

if __name__ == "__main__":
    output = multiprocessing.Queue()
    multiprocessing.Process(target=secondconsole, args=[output]).start()
    while True:
        print("Hello World")
        output.put("Hello to second console") #here you will provide the data to the second console
        time.sleep(3) #sleep for 3 seconds just for testing
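A hedged follow-up on this design (not part of the original answer): output.get() blocks until the main process sends something, so messages appear in the second console exactly as they arrive, and a sentinel value such as None gives it a clean way to shut down. The secondconsole function could become:
def secondconsole(output):
    win32console.FreeConsole()
    win32console.AllocConsole()
    while True:
        msg = output.get()   # blocks until the main process sends something
        if msg is None:      # sentinel: the main process is finished
            break
        print(msg)
and the main process calls output.put(None) when it wants the second console to close.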
It looks like the issue might be with the way you are trying to open the console using sys.stdout and sys.stderr. Try using the following code instead:
import sys, os
import win32api, win32con, win32console
import multiprocessing
def ShowConsole():
    win32console.FreeConsole()
    win32console.AllocConsole()
    os.dup2(win32console.GetStdHandle(win32console.STD_OUTPUT_HANDLE), sys.stdout.fileno())
    os.dup2(win32console.GetStdHandle(win32console.STD_ERROR_HANDLE), sys.stderr.fileno())
    print("Test")
    os.system("pause")

if __name__ == '__main__':
    p = multiprocessing.Process(target=ShowConsole)
    p.start()

Python, create a function to control the process in another file python

I created three files in Python (pool.py, funcA.py, config.py). My intent is to have two Python files interact. Specifically, pool.py has a while loop inside that simulates an always-active process and that should be interrupted when funcA.py is run separately. The config.py file is used to create global variables (cmd0 and cmd1) that are used in both files: funcA.py modifies cmd0 and cmd1, while pool.py repeatedly reads these variables and checks whether they change.
If you want to test how it works, open two terminals: run pool.py in one and funcA.py in the other. You can change the content of the cmd0 and cmd1 variables in funcA.py while pool.py is still running in the other terminal, and you will see that the output of pool.py updates immediately.
Now my question is: how can I turn the contents of funcA.py into a function and run it directly as a function, with the same result? That is, by launching the new function inside funcA.py I want to modify the output of the pool.py loop that is running in the other terminal.
config.py
def init():
    global cmd0, cmd1
    cmd0 = ''
    cmd1 = ''
    return cmd0, cmd1
pool.py
import os
import time
import sys
import numpy as np
import config
import funcA
from importlib import reload
while True:
    funcA = reload(funcA)
    print(str(config.cmd0))
    print(str(config.cmd1))
    if config.cmd0 == 'start':
        print('stop')
    time.sleep(10)
funcA.py
import os
import sys
import numpy as np
import config
config.cmd0 = 'start'
config.cmd1 = 'u1'
If I understood correctly, the following works:
pool.py
import os
import time
import sys
import numpy as np
import config
import funcA
from importlib import reload
while True:
    funcA = reload(funcA)
    funcA.change_values("cmd0", "cmd1")
    print(str(config.cmd0))
    print(str(config.cmd1))
    if config.cmd0 == 'start':
        print('stop')
    time.sleep(10)
funcA.py
import os
import sys
import numpy as np
import config
config.cmd0 = 'start'
config.cmd1 = 'u1'
def change_values(a, b):
    config.cmd0 = a
    config.cmd1 = b
As you can see, the function change_values is called in pool.py without any check, so cmd0 and cmd1 change immediately. Note that this happens inside the pool.py process itself: reload(funcA) re-executes funcA's module-level code and then pool.py calls change_values, which updates config in pool.py's own interpreter; the two terminals are separate processes and do not actually share the config module in memory.
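A brief usage note (my reading, not part of the original answer): whatever strings you pass to change_values become the new values of cmd0 and cmd1, so to reproduce the behaviour of the original funcA.py the call in the loop would be:
funcA.change_values('start', 'u1')   # sets config.cmd0 = 'start' and config.cmd1 = 'u1'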

How to import other script in same dir?

I am trying to schedule a few Python scripts and run them from main.py. Those scripts are in the same folder.
main.py:
import schedule
import time
from test1 import dd
schedule.every(2).seconds.do(dd,fname)
while True:
    schedule.run_pending()
    time.sleep(1)
test1.py:
def dd(fname):
    print('hello' + fname)
dd('Mary')
dd('John')
It prints those two names and then fails with name 'fname' is not defined.
How do I define the argument in the main.py file? If I have more than one def in the script, do I need to import multiple times in main.py?
And the script that I import at the top of main.py runs once before running the schedule, right? That means it runs once while you import it?
You are not defining fname in main.py, so it says name 'fname' is not defined. You are only importing the function into main.py from test1.py.
Here is the modified code:
main.py
import schedule
import time
from test1 import dd
fname="Mary"
schedule.every(2).seconds.do(dd,fname)
while True:
    schedule.run_pending()
    time.sleep(1)
test1.py
def dd(fname):
    print('hello' + fname)
If you want to input more than one string, just use a list! Here is the sample code for test1.py:
def dd(fname:list):
    for n in fname:
        print('hello' + n)
This code was tested with Python 3.7.7.
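As for the other part of the question: you do not need multiple imports for multiple functions; one from ... import statement can bring in several names at once. A small sketch, assuming test1.py also defined a hypothetical second function ee:
import schedule
from test1 import dd, ee   # one import statement covers several functions

fname = "Mary"
schedule.every(2).seconds.do(dd, fname)
schedule.every(5).seconds.do(ee)       # ee is a hypothetical second function
And yes, module-level code in test1.py (like the dd('Mary') and dd('John') calls in your original file) runs once at import time, which is why the modified test1.py above only defines the function.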
Your problem is that you are trying to use a function argument as if it were its own variable. Importing is not the problem here.
Try this:
import schedule
import time
from test1 import dd
# do() forwards any extra positional arguments to dd, so pass the string directly
schedule.every(2).seconds.do(dd, "Any String")
while True:
    schedule.run_pending()
    time.sleep(1)

Python - How to stop child process when the parent script is stopped

How can I exit the child process when the parent process is stopped?
Following is my code.
I want to stop all execution when there is a KeyboardInterrupt.
import os, sys
scripts = ["script_1.py","script_2.py"]
try:
    for script in scripts:
        command = 'python ' + script
        os.system(command)
except KeyboardInterrupt:
    os._exit(1)
except Exception as e:
    raise e
Since you are trying to execute Python scripts within another Python script, let's do this in a Pythonic way and replace os.system with importlib.import_module:
import os
import importlib
scripts = ['script_1.py', 'script_2.py']
for filename in scripts:
    modulename, ext = os.path.splitext(filename)
    importlib.import_module(modulename)
If your scripts are like:
if __name__ == '__main__':
    # code here
    print('hello')
It will not work, because the if __name__ == '__main__': guard is there to ensure the part under the if is executed only when the file is run as a script (and not imported as a module).
So, in this case the better thing to do is something like:
script_1.py:
def main():
    # code here
    print('hello')
And in the main script:
import os
import importlib
scripts = ['script_1.py', 'script_2.py']
for filename in scripts:
    modulename, ext = os.path.splitext(filename)
    module = importlib.import_module(modulename)
    module.main()
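A follow-up on how this connects to the original goal (my reading, not part of the answer): once the scripts are imported instead of launched with os.system, everything runs in a single process, so Ctrl-C raises KeyboardInterrupt right where the loop is and the original try/except works unchanged. A minimal sketch:
import os
import importlib

scripts = ['script_1.py', 'script_2.py']
try:
    for filename in scripts:
        modulename, ext = os.path.splitext(filename)
        module = importlib.import_module(modulename)
        module.main()
except KeyboardInterrupt:
    os._exit(1)   # as in the original snippet; there are no separate child processes left behind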

return value from one python script to another

I have two files: script1.py and script2.py. I need to invoke script2.py from script1.py and return the value from script2.py back to script1.py. But the catch is that script1.py actually runs script2.py through os.system.
script1.py:
import os
print(os.system("script2.py 34"))
script2.py
import sys
def main():
x="Hello World"+str(sys.argv[1])
return x
if __name__ == "__main__":
x= main()
As you can see, I am able to get the value into script2.py, but not back to script1.py. How can I do that? NOTE: script2.py HAS to be called as if it were a command-line execution. That's why I am using os.system.
Ok, if I understand you correctly you want to:
pass an argument to another script
retrieve an output from another script to original caller
I recommend using the subprocess module. The easiest way would be to use the check_output() function:
Run command with arguments and return its output as a byte string.
Sample solution:
script1.py
import sys
import subprocess
s2_out = subprocess.check_output([sys.executable, "script2.py", "34"])
print s2_out
script2.py:
import sys
def main(arg):
print("Hello World"+arg)
if __name__ == "__main__":
main(sys.argv[1])
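A small usage note (for Python 3, where check_output() returns bytes): you would typically decode the result and strip the trailing newline before using it.
import sys
import subprocess

s2_out = subprocess.check_output([sys.executable, "script2.py", "34"])
value = s2_out.decode().strip()   # bytes -> str, e.g. "Hello World34"
print(value)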
The recommended way to return a value from one python "script" to another is to import the script as a Python module and call the functions directly:
import another_module
value = another_module.get_value(34)
where another_module.py is:
#!/usr/bin/env python
def get_value(*args):
    return "Hello World " + ":".join(map(str, args))

def main(argv):
    print(get_value(*argv[1:]))

if __name__ == "__main__":
    import sys
    main(sys.argv)
You could both import another_module and run it as a script from the command line. If you don't need to run it as a command-line script, you could remove the main() function and the if __name__ == "__main__" block.
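For illustration, both ways of using it (the outputs follow from the get_value definition above):
# imported as a module
import another_module
print(another_module.get_value(34))   # -> Hello World 34

# or run as a command-line script:
#   $ python another_module.py 34     # prints: Hello World 34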
See also, Call python script with input with in a python script using subprocess.
