I have a command that spins off some calculations and stores some output in a folder. This command works at the bash prompt (on Ubuntu):
/home/usr2/AXE/prog1.exe -k 1 -m /home/usr2/AXE/config.txt
However, I would like to "call" this from inside a python program.
The final idea is to generate lots of serial processes of this nature inside python.
/home/usr2/AXE/prog1.exe -k 1 -m /home/usr2/AXE/config.txt
/home/usr2/AXE/prog1.exe -k 2 -m /home/usr2/AXE/config.txt
/home/usr2/AXE/prog1.exe -k 3 -m /home/usr2/AXE/config.txt
...
...
/home/usr2/AXE/prog1.exe -k 2000 -m /home/usr2/AXE/config.txt
SO FAR
import os
os.system('"/home/usr2/AXE/prog1.exe -k 1 -m /home/usr2/AXE/config.txt"')
This gives me Out[11]: 32512 as the return value, but the output file is not produced in the expected folder.
What am I doing wrong?
P.S. No idea about Bash programming. But would be interested in Bash solutions as well.
The immediate problem seems to be an excess of quotation marks. os.system takes a string containing the command you want to execute. So to execute the shell command echo hello, you would call os.system('echo hello') from Python. Calling os.system('"echo hello"') tries to execute the shell command "echo hello"; in other words, it looks for a single executable file in your PATH named echo hello, which presumably doesn't exist and certainly isn't what you want. So get rid of one set of quotes. Either one is fine since Python treats single and double quotes the same.
You can also try replacing your use of os.system with subprocess.run, also from the standard library. Instead of a single string containing the command, you pass it a sequence of strings, each of which is one token of the shell command you'd like to run. So in our example you could do subprocess.run(('echo', 'hello')). The standard library documentation recommends subprocess over os.system for new code.
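Putting that together, a sketch of the serial runs from the question might look like the following. The live part of the demo uses echo so it can run anywhere; the commented-out loop uses the paths from the question, and check=True is an assumption that a non-zero exit should abort.

```python
import subprocess

# Each argument is its own list element; no shell quoting needed.
result = subprocess.run(["echo", "hello"], capture_output=True, text=True)
print(result.stdout, end="")   # -> hello
print(result.returncode)       # -> 0

# The serial runs from the question would follow the same pattern:
# for k in range(1, 2001):
#     subprocess.run(["/home/usr2/AXE/prog1.exe", "-k", str(k),
#                     "-m", "/home/usr2/AXE/config.txt"], check=True)
```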
Related
I would like to run a Python script setting in the shell where the interpreter must look for the modules.
Suppose that myscript.py contains only:
import mymodule ; mymodule.myfunction()
But mymodule is in /home/user/hello, whereas myscript.py is, say, in /home/user/Desktop. I want to run on a terminal something like:
$ python /home/user/Desktop/myscript.py LOCATION_OF_THE_MODULES=/home/user/hello
Would it be possible? I think that an alternative solution is to define the location in the import statement from the code, but this is not what I am looking for. I want to set the location through a variable in the shell.
So, I've been exploring your question a little, and it turns out this isn't really a Python question but a shell question: there is a way to do it, but since Python can't hop from a script into interactive mode on its own, we can't do it with Python alone. The Python interpreter does, however, have some extra command-line options we can use.
see:
python3 -h
for more info.
Specifically, there are two interesting options, -i and -c, which stand for interactive mode and command string respectively. That way we can load the modules with -c and hop into interactive mode with -i, like so:
python3 -i -c "import os"
Obviously, we need to make this more advanced so it can load multiple modules without extra Python scripting. That means building the actual command that runs Python and imports the modules you want. There is a problem, though: since we have to build that command in the shell, the result depends on the shell's syntax, and not all shells use the same syntax. There may be a lower-level answer to this, but I couldn't get to it; instead, I will leave a Bash script for reference, which you can use and/or edit so it works with your shell.
FINAL_VAR=""
cd "$1" || exit 1
for f in *.py; do
FINAL_VAR+="import ${f%.py}"$'\n'
done
python3 -i -c "$FINAL_VAR"
Usage steps:
Copy and save the script
Give it run permissions (chmod +x file_name.sh)
Run it this way: ./file_name.sh "/full/path/to/your/modules"
It will load all the .py files and will hop into an interactive Python shell for your use
Note: You might want to change the last line so it works accordingly to your Python installation
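The import-string construction can also be done from Python itself. Here is a sketch of the equivalent of the Bash loop above; the function name preload_command is made up for illustration:

```python
import os

def preload_command(moddir):
    # Mirror the bash loop: one "import <name>" line per .py file.
    names = [f[:-3] for f in sorted(os.listdir(moddir)) if f.endswith(".py")]
    return "\n".join("import " + n for n in names)

# The result is the string you would pass to: python3 -i -c "<string>"
```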
I have a .txt file listing input arguments for a Python script. I would like to run the script once for each argument in the list from the Linux terminal, but only the last one runs.
I'm trying this:
for p in $(cat list.txt); do eval $(echo script.py -u $p); done;
There are multiple ways to do this, but my recommendation is the first one.
Change your Python script to take an argument that is the path to your list.txt. I would suggest using the argparse module to handle arguments. Once the argument is read, just loop through the contents of the file in Python and use each line the way you want to. This way you can handle the arguments much better, and the program will be more robust.
You can run the python script using cat operation on the list.txt. python script.py $(cat list.txt)
You can use pipe, grep and awk. python script.py $(grep pattern file | awk '{print}' ORS=' ')
But I would still suggest the first option. It's more robust: you can handle arguments better and control errors as well.
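A sketch of the first option with argparse. The positional name listfile is made up, and parse_args is given an explicit list here only so the snippet runs standalone:

```python
import argparse

parser = argparse.ArgumentParser()
parser.add_argument("listfile", help="path to the file of arguments")
args = parser.parse_args(["list.txt"])   # real use: parser.parse_args()
print(args.listfile)                     # -> list.txt

# Then loop over the file's lines inside the script:
# with open(args.listfile) as fh:
#     for arg in fh:
#         run_one(arg.strip())   # run_one stands in for your per-argument logic
```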
The Issue
I have a Python script, and when I run it from the command line I do not want anything recorded in .bash_history.
The reason for this is that the script uses the Python argparse library which allows me to pass in arguments to the python code directly from the command line.
For example I could write the script so that it would use "123456" as a value in the script:
$ ./scriptname.py -n 123456
The issue is that I don't want the value 123456 stored in .bash_history. In fact, I'd rather the entire command was never stored into the .bash_history file in the first place.
What I've Tried
Subprocess & history -c
I've added the subprocess library to the top of my script and then included this directly after to attempt to proactively clear the current history of the shell I am working in:
subprocess.call("history -c", shell=True)
Theoretically this should clear the history of the current shell. I don't see any errors from it, so I'm assuming it runs in some other shell. When I run the same command by hand (directly after invoking the script) it works properly.
Subprocess & unset HISTFILE
I have also used subprocess with the following with no success:
subprocess.call("unset HISTFILE", shell=True)
os.system & history -c
I've also used the os library for Python and included the following in the script:
os.system("history -c")
os.system and unset HISTFILE
I've also tried unset HISTFILE with os.system to no avail.
os.system("unset HISTFILE")
Preferred Solution Characteristics
I realize that I could simply type in unset HISTFILE or history -c after using the command. But I want this to be as much as possible a self-contained script.
Ideally the solution would prevent the ./scomescript.py command from ever being recorded within .bash_history.
I need this script to output text to the terminal based on the input so I can't close the terminal immediately afterwards either.
I imagine there must be a way to do this from within the python script itself - this is my preference.
This really isn't very feasible... Adding the entry to the history file is performed by the interactive shell, and it occurs after the command has completed and the parent shell exits. It is, strictly speaking, possible, if you were to make your Python program spawn a hacky background process that did something like reading the history file in a loop and rewriting it. I really can't advocate anything like this, but you could append your script with something like:
os.system("nohup bash -ic 'while :; do read -d \"\" history < \"$HISTFILE\"; echo \"$history\" | sed -e\"s#^%s.*##\" -e\"/^$/d\" > \"$HISTFILE\"; sleep 1; done &' >/dev/null 2>&1" % sys.argv[0])
I think a much better way to accomplish your goal of not recording any arguments would be to use something like var = raw_input("") instead of passing sensitive argument on the command line.
You could also perhaps create a shell function to wrap your script, something like my_script(){ set +o history; python_script.py "$@"; set -o history; } ?
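The prompt-for-input suggestion can be sketched with getpass, which also suppresses echo so the value never appears on screen. The read_secret helper and its injectable reader parameter are made up for illustration (the injection point just lets the function be exercised without a TTY):

```python
import getpass

def read_secret(prompt="Enter value: ", reader=getpass.getpass):
    # reader defaults to getpass.getpass, which reads without echoing;
    # it is injectable only so the function can be tested non-interactively.
    return reader(prompt).strip()

# Real use: n = read_secret("n: ")
```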
I am new to python and still at the level of basic learning. Recently I tried to write a script to generate new folders according to the number supplied in the input text file. After creating those folders I want to copy a file into all those folders at the same time. I can do it by typing
echo equil{1..x} | xargs -n 1 cp *.txt *
in the terminal, and it works fine. Here x is the number of folders I have in my working directory. But my concern is to make it automatic, i.e. to call it from the script, so that the user doesn't need to type this line every time in the terminal. That is why I tried this
sub2 = subprocess.call(['echo', 'equil{1..x}', '|', 'xargs', '-n', '1', 'cp', '*.txt *'])
Can anyone please guide me and show me the mistake. Actually I am not getting any error, rather it is printing this
equil{1..x} | xargs -n 1 cp *.txt *
in the terminal after executing the rest of the script.
You have to use subprocess.Popen if you want to send data to/from stdin/stdout of your subprocesses. And you have to Popen a subprocess for each of the executables, i.e. in your example, one for echo and one for xargs.
There is an example in the docs: https://docs.python.org/2/library/subprocess.html#replacing-shell-pipeline
Another here: Call a shell command containing a 'pipe' from Python and capture STDOUT
However, instead of running echo to produce some lines, you can directly write them in python to the process stdin.
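That approach might look like the sketch below: the names are written straight into xargs' stdin from Python, replacing the echo half of the pipeline. The cp is swapped for echo so the snippet is harmless to run; note also that globs like *.txt are expanded by the shell, not by subprocess, so in the real version you would expand them in Python with the glob module.

```python
import subprocess

# Feed the folder names into xargs' stdin directly, no echo process needed.
proc = subprocess.Popen(["xargs", "-n", "1", "echo", "item:"],
                        stdin=subprocess.PIPE, stdout=subprocess.PIPE)
out = proc.communicate(b"equil1 equil2 equil3")[0].decode()
print(out)   # one "item: equilN" line per name
```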
I don't think you can use subprocess.call() like this with pipes. For recipes how to use pipes, see
https://docs.python.org/2/library/subprocess.html#replacing-shell-pipeline
I.e. you would use subprocess.communicate() over two processes.
I am trying to create aliases for tcsh from a Python script (running Python 2.7.1).
Once the aliases are created I want to use them in the same shell I ran the python script in.
I tried:
os.system('alias test "echo test"')
but I get the following error:
sh: line 0: alias: test: not found
sh: line 0: alias: echo test: not found
I then tried:
os.system(r"""/bin/csh -i -c 'alias test "echo test"'""")
And then no errors occurred, but the alias did not register, and therefore I could not use it.
The result I'm looking for is this:
tcsh>python my_script.py
tcsh>test
test
Thanks!
os.system executes that command in a subshell (the Bourne shell, by the look of it), so even if your syntax were correct for that shell (alias test="echo test"), the alias would not persist after the call, since the subshell closes.
But this seems like an XY problem: you ask about Y, the solution you had in mind, and not about X, your actual problem.
If you simply want to create a bunch of aliases at once, why not use a c-shell script!? (Why you are torturing yourself with c-shell is another matter entirely).
Your python script cannot execute anything in the context of your shell. While you could use subprocess.call(..., shell=True) this would use a new shell and thus not update your existing shell.
The only way to do what you want is to make your python script write valid shell commands to stdout and then, instead of just executing it, you need to make your shell evaluate the output of your python script.
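A sketch of that evaluate-the-output approach; the alias names are made-up examples:

```python
# my_script.py -- prints tcsh alias definitions on stdout.
def alias_commands():
    # Build the alias lines; these two names are purely illustrative.
    return ['alias test "echo test"',
            'alias greet "echo hello"']

if __name__ == "__main__":
    for line in alias_commands():
        print(line)
```

From tcsh, one way to evaluate the output in the current shell (a sketch, path is arbitrary): `python my_script.py > /tmp/aliases.csh; source /tmp/aliases.csh`.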