This question already has answers here:
Is there a way to change effective process name in Python?
(10 answers)
Closed 7 years ago.
Is there a way to change the name of a process running a python script on Linux?
When I do a ps, all I get are "python" process names.
There is a simpler (you don't need to import any libraries) but maybe not so elegant way: just avoid using "env" in the shebang line.
In other words, this will be named as "python" in process list:
#!/usr/bin/env python
But this will be named with your scriptname:
#!/usr/bin/python
So you'll be able to find it with something like pidof -x scriptname or ps -C scriptname
Have a look at the procname library: http://code.google.com/p/procname/
Sample usage:
# Let's rename:
>>> import procname
>>> procname.setprocname('My super name')
# Let's check. Press Ctrl+Z to suspend and get back to the shell:
user#comp:~/procname$ ps
PID TTY TIME CMD
13016 pts/2 00:00:00 bash
13128 pts/2 00:00:00 My super name <-- it's here
It will only work on systems where the prctl system call is present and supports the PR_SET_NAME command.
The procname library didn't work for me on Ubuntu. I went with setproctitle instead (pip install setproctitle). This is what gunicorn uses, and it worked for me.
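For reference, a minimal sketch of the setproctitle usage (the title string is just an example):
from setproctitle import setproctitle

# the new title is what ps/top will display for this process
setproctitle('my-worker')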
There is the option of doing the following, though it only works on Linux (it relies on the prctl(2) call):
import sys

if sys.platform.startswith('linux'):
    import ctypes
    libc = ctypes.cdll.LoadLibrary('libc.so.6')
    # 15 is PR_SET_NAME; the name must be passed as bytes on Python 3
    libc.prctl(15, b'My Simple App', 0, 0, 0)
I want to implement a userland command that takes one of its arguments (a path) and changes the directory to that dir. After the program completes, I would like the shell to be in that directory. So I want to implement the cd command, but as an external program.
Can it be done in a Python script, or do I have to write a bash wrapper?
Example:
tdi#bayes:/home/$>python cd.py tdi
tdi#bayes:/home/tdi$>
Others have pointed out that you can't change the working directory of a parent from a child.
But there is a way you can achieve your goal -- if you cd from a shell function, it can change the working dir. Add this to your ~/.bashrc:
go() {
cd "$(python /path/to/cd.py "$1")"
}
Your script should print the path to the directory that you want to change to. For example, this could be your cd.py:
#!/usr/bin/python
import sys, os.path
if sys.argv[1] == 'tdi': print(os.path.expanduser('~/long/tedious/path/to/tdi'))
elif sys.argv[1] == 'xyz': print(os.path.expanduser('~/long/tedious/path/to/xyz'))
Then you can do:
tdi#bayes:/home/$> go tdi
tdi#bayes:/home/tdi$>
That is not going to be possible.
Your script runs in a sub-shell spawned by the parent shell where the command was issued.
Any cding done in the sub-shell does not affect the parent shell.
cd is implemented as a shell built-in command precisely because an external program cannot change the parent shell's current working directory.
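A tiny demo of why: the child process can change its own working directory, but the shell that launched it is unaffected (child.py is a hypothetical name):
# child.py -- changes only its own CWD
import os

os.chdir('/tmp')
print('child process is now in:', os.getcwd())
# after this script exits, `pwd` in the parent shell is unchanged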
As codaddict writes, what happens in your sub-shell does not affect the parent shell. However, if your goal is to present the user with a shell in a different directory, you could always have Python use os.chdir to change the sub-shell's working directory and then launch a new shell from Python. This will not change the working directory of the original shell, but will leave the user with one in a different directory.
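For example, a minimal sketch of that idea (the target path is a placeholder):
import os

# change this Python process's working directory...
os.chdir(os.path.expanduser('~/projects'))  # placeholder path
# ...then drop the user into a new interactive shell started there
os.system(os.environ.get('SHELL', '/bin/bash'))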
As explained by mrdiskodave in "Equivalent of shell 'cd' command to change the working directory?", there is a hack to achieve the desired behavior in pure Python.
I made some modifications to the answer from mrdiskodave to make it work in Python 3:
The pipes.quote() function has moved to shlex.quote().
To mitigate the issue of user input during execution, you can delete any previous user input with the backspace character "\x08".
So my adaptation looks like the following:
import fcntl
import shlex
import termios
from pathlib import Path

def change_directory(path: Path):
    quoted_path = shlex.quote(str(path))
    # Remove up to 32 characters entered by the user.
    backspace = "\x08" * 32
    cmd = f"{backspace}cd {quoted_path}\n"
    # Push each character into the terminal's input queue,
    # as if the user had typed it at the prompt.
    for c in cmd:
        fcntl.ioctl(1, termios.TIOCSTI, c)
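A quick usage sketch; this only works when stdout is an interactive terminal, and note that newer Linux kernels may disable TIOCSTI entirely:
# simulate the user typing "cd /tmp" at their own prompt
change_directory(Path('/tmp'))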
I shall try to show how to set a Bash terminal's working directory to whatever path a Python program wants in a fairly easy way.
Only Bash can set its working directory, so routines are needed for Python and Bash. The Python program has a routine defined as:
fob = open(somefile, "w")
fob.write(dd)
fob.close()
"Somefile" could for convenience be a RAM disk file. Bash "mount" would show tmpfs mounted somewhere like "/run/user/1000", so somefile might be "/run/user/1000/pythonwkdir". "dd" is the full directory path name desired.
The Bash file would look like:
#!/bin/bash
#pysync ---Command ". pysync" will set bash dir to what Python recorded
cd `cat /run/user/1000/pythonwkdir`
For one functionality, I need bash commands (12 lines of bash code). How can I put these 12 lines in between my Python code? At the moment I was using:
import subprocess
command = 'bash 1-line code'
subprocess.call(command, shell=True)
This worked, but I was only using one line of code; now I have 12 and the quoting doesn't seem to work well...
Any suggestions?
Just extend what you were doing. Put all your bash code in a file named, say, script.sh and call it from Python. You can call it as you were calling normal commands, i.e. using the subprocess module:
import subprocess
# the script must be executable first: chmod +x script.sh
subprocess.call(['./script.sh'])
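If you'd rather keep the commands inline instead of in a separate file, subprocess also accepts a multi-line string when shell=True; a sketch (the commands here are placeholders):
import subprocess

script = """
set -e
echo "step 1"
echo "step 2"
"""
subprocess.run(script, shell=True, check=True, executable='/bin/bash')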
This question already has answers here:
subprocess.Popen() error (No such file or directory) when calling command with arguments as a string
(2 answers)
Closed 2 years ago.
I am trying to write a program that opens a gnome-terminal window and executes a python file in it.
When I call the gnome-terminal subprocess with the subprocess module like this:
import subprocess
subprocess.call(['gnome-terminal', '-x', 'python3 '+filename])
I get the following error:
Failed to execute child process "python3 /home/user/Documents/test.py" (No such file or directory)
I have tried to cd to the directory /home/user/Documents first and then run the file, but it didn't work.
You're trying to execute the literal command python3 /home/user/Documents/test.py, which obviously doesn't exist on your system.
When you type that line in a shell, the shell splits it on spaces, and in the end it calls python3 with /home/user/Documents/test.py as its argument.
When using subprocess.call, you have to do the splitting yourself.
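shlex.split can show you what that shell-style splitting looks like:
import shlex

shlex.split('gnome-terminal -x python3 /home/user/Documents/test.py')
# ['gnome-terminal', '-x', 'python3', '/home/user/Documents/test.py']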
I believe you need to pass your filename as another element in the array. I don't have gnome-terminal, but I replicated your issue with plain sh.
import subprocess
subprocess.call(['gnome-terminal', '-x', 'python3', filename])
Try this:
from os import system
system("gnome-terminal -e 'bash -c \"python3 %s\"'"%filename)
Add other commands using a semicolon:
system("gnome-terminal -e 'bash -c \"python3 %s; [second command]\"'")
Try this (I assume that python3 is in your PATH):
from subprocess import Popen
command = "gnome-terminal -x python3 " + filename
proc = Popen(command, shell=True)
If that does not work, try running your Python file on its own first, and see if it works or not:
python filename
This question already has answers here:
Reading and writing environment variables in Python? [duplicate]
(4 answers)
Closed 7 years ago.
In UNIX from command line, I do
setenv HOME <path to home>
Then I pass it as an argument to my Python script:
python hello.py HOME
and do
sys.argv[1] = os.environ["HOME"]
but it still doesn't read the path.
I am new to Python; is os.environ correct for this case?
If your aim is to get the path of the home directory as an argument, you can just have your user send it by letting the shell expand the variable before calling the script.
I have a simple script like -
import sys
print(sys.argv[1])
In Windows I call it as -
set HOME=D:\
python script.py %HOME%
The output I get is -
D:\
In Linux -
$ python hello.py $HOME
output -
/home/random/
If you want to get it from the environment variable, and you want to pass the name of the environment variable to use as the first argument to the script, then you should change your script like -
sys.argv[1] = os.environ.get(sys.argv[1], sys.argv[1])
This would look up the environment variable named by the first argument, falling back to the literal argument if no such variable is set.
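Put together, a minimal hello.py in that style might look like this:
import os
import sys

# treat the first argument as the name of an environment variable,
# falling back to the literal argument if it is not set
sys.argv[1] = os.environ.get(sys.argv[1], sys.argv[1])
print(sys.argv[1])

Running python hello.py HOME would then print your home directory.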
It seems this depends a little on your shell. For example, in Linux using bash 4.3-7ubuntu1.5:
$ export XXX="abc"
$ python
>>> import os
>>> os.environ["XXX"]
'abc'
>>> os.environ["HOME"]
'/home/alan'
I need to make an export like this in Python:
# export MY_DATA="my_export"
I've tried to do :
# -*- python-mode -*-
# -*- coding: utf-8 -*-
import os
os.system('export MY_DATA="my_export"')
But when I list the exports, "MY_DATA" does not appear:
# export
How can I do an export with Python without saving "my_export" into a file?
export is a command that you give directly to the shell (e.g. bash), to tell it to add or modify one of its environment variables. You can't change your shell's environment from a child process (such as Python); it's just not possible.
Here's what's happening when you try os.system('export MY_DATA="my_export"')...
/bin/bash process, command `python yourscript.py` forks python subprocess
  |_ /usr/bin/python process, command `os.system()` forks /bin/sh subprocess
       |_ /bin/sh process, command `export ...` changes its local environment
When the bottom-most /bin/sh subprocess finishes running your export ... command, then it's discarded, along with the environment that you have just changed.
You actually want to do
import os
os.environ["MY_DATA"] = "my_export"
Another way to do this, if you're in a hurry and don't mind the hacky aftertaste, is to execute the output of the Python script in your bash environment: have Python print out the commands that set the environment, and evaluate that output in the shell. Not ideal, but it can get the job done in a pinch. It's not very portable across shells, so YMMV.
$(python -c 'print("export MY_DATA=my_export")')
(you can also enclose the statement in backticks in some shells ``)
Not that simple:
python -c "import os; os.putenv('MY_DATA','1233')"
$ echo $MY_DATA # <- empty
But:
python -c "import os; os.putenv('MY_DATA','123'); os.system('bash')"
$ echo $MY_DATA #<- 123
I have an excellent answer.
#!/bin/bash
output=$(git diff origin/master..origin/develop | \
python -c '
# DO YOUR HACKING
variable1_to_be_exported = "Yo Yo"
variable2_to_be_exported = "Honey Singh"
# ... and so on
magic = ""
magic += "export onShell_var1=\"" + str(variable1_to_be_exported) + "\"\n"
magic += "export onShell_var2=\"" + str(variable2_to_be_exported) + "\""
print(magic)
'
)
eval "$output"
echo "$onShell_var1"  # Output will be Yo Yo
echo "$onShell_var2"  # Output will be Honey Singh
Mr Alex Tingle is correct about the process and sub-process behaviour; the eval trick above is how the effect can still be achieved.
The key concept is:
Whatever Python prints is captured into the catching bash variable [output].
We can execute any command given in the form of a string using eval.
So, prepare your printed output from Python as meaningful bash commands,
use eval to execute it in bash,
and you can see your results.
NOTE
Always execute the eval using double quotes, or else bash will mangle your \ns and the output will be strange.
PS: I don't like bash, but you have to use it.
I've had to do something similar on a CI system recently. My options were to do it entirely in bash (yikes) or use a language like python which would have made programming the logic much simpler.
My workaround was to do the programming in python and write the results to a file.
Then use bash to export the results.
For example:
# do calculations in python
with open("./my_export", "w") as f:
    f.write(your_results)
# then in bash
export MY_DATA="$(cat ./my_export)"
rm ./my_export # if no longer needed
You could try os.environ["MY_DATA"] instead.
Kind of a hack because it's not really python doing anything special here, but if you run the export command in the same sub-shell, you will probably get the result you want.
import os
cmd = "export MY_DATA='1234'; echo $MY_DATA" # or whatever command
os.system(cmd)
In the hope of providing clarity over common confusion...
I have written many python <--> bash <--> elfbin toolchains, and the proper way to see it is this:
Each process (originator) has a state of the environment inherited from whatever invoked it. Any change remains local to that process. Transferring an environment state is a function in itself and runs in two directions, each with its own caveats. The most common thing is to modify the environment before running a sub-process. To go down to the metal, look at the exec() call in C: there is a variant that takes a pointer to environment data, and this is the only actually supported transfer of environment in typical OSes.
Shell scripts will create a state to pass when running children when you do an export; otherwise they just use what they got in the first place.
In all other cases it will be some generic mechanism used to pass a set of data, allowing the calling process itself to update its environment based on the result of the child process's output.
Ex:
ENVUPDATE=$(CMD_THAT_OUTPUTS_KEYVAL_LISTS)
echo "$ENVUPDATE" > "$TMPFILE"
source "$TMPFILE"
The same can of course be done using json, xml or other things as long as you have the tools to interpret and apply.
The need for this may be (50% chance) a sign of misconstruing the basic primitives and that you need a better config or parameter interchange in your solution.....
Oh, in python I would do something like...
(this may need improvement depending on your situation)
import os
import re

RE_KV = re.compile(r'([a-z]\w*)\s*=\s*(.*)')
OUTPUT = RunSomething(...)  # assuming output like 'k1=v1 k2=v2'
for kv in OUTPUT.split(' '):
    try:
        k, v = RE_KV.match(kv).groups()
        os.environ[k] = str(v)
    except AttributeError:
        # not a key=value pair
        pass
One line solution:
eval `python -c 'import sysconfig;print("python_include_path={0}".format(sysconfig.get_path("include")))'`
echo $python_include_path  # prints "/home/<usr>/anaconda3/include/python3.6m" in my case
Breakdown:
Python call
python -c 'import sysconfig;print("python_include_path={0}".format(sysconfig.get_path("include")))'
It's launching a python script that
1. imports sysconfig
2. gets the python include path corresponding to this python binary (use "which python" to see which one is being used)
3. prints the string "python_include_path={0}", with {0} being the path from step 2
Eval call
eval `python -c 'import sysconfig;print("python_include_path={0}".format(sysconfig.get_path("include")))'`
It executes, in the current bash instance, the output from the Python script. In my case, it's executing:
python_include_path=/home/<usr>/anaconda3/include/python3.6m
In other words, it's setting the environment variable "python_include_path" with that path for this shell instance.
Inspired by:
http://blog.tintoy.io/2017/06/exporting-environment-variables-from-python-to-bash/
import os
import shlex
from subprocess import Popen, PIPE

# set whatever variables the child should inherit (key/value are placeholders)
os.environ.update(key=value)

# run the command (a placeholder here) and answer its prompts via stdin
res = Popen(shlex.split("cmd xxx -xxx"), stdin=PIPE, stdout=PIPE, stderr=PIPE,
            env=os.environ).communicate('y\ny\ny\n'.encode('utf8'))
stdout = res[0]
stderr = res[1]
import os
os.system('/home/user1/exportPath.ksh')
exportPath.ksh:
export MY_DATA="my_export"