How to emulate backtick subshells in plumbum - python

I want to write this command in Plumbum:
foo `date +%H%M%S`
or, equivalently:
foo $(date +%H%M%S)
(think of 'foo' as a command like 'mkdir')
How can I write this in Plumbum? I want to have this as a Plumbum object that I can reuse, i.e. each time I call it, 'foo' is invoked with the current time, which differs on each call.
I tried using a subshell. This example:
#!/usr/bin/env python3
from plumbum import local
import time

s = local["sh"]
f = s["-c", "mkdir", "$(date +%H%M%S)"]
for i in range(0, 10):
    f()
    time.sleep(1)
doesn't work because I get:
plumbum.commands.processes.ProcessExecutionError: Unexpected exit code: 1
Command line: | /usr/bin/sh -c mkdir '$(date +%H%M%S)'
Stderr: | mkdir: missing operand
| Try 'mkdir --help' for more information.
I could use time.localtime() to compute the time in Python, but that would be the same for every evaluation. I could also generate a shell script file with this in, but then I'd have to clean that up afterwards.
How can I write this in a Plumbum-native way?

You came close to having something that would work with sh -c. The trick is to understand that only the one argument directly after the -c is parsed as code by the shell; subsequent arguments go into $0, $1, etc. in the context in which that code is executed.
s = local['sh']
f = s['-c', 'mkdir "$(date +%H%M%S)"']
That said, I'd strongly argue that plumbum is being more of a hindrance than a help here, and suggest getting rid of it outright.
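The $0/$1 behaviour is easy to verify without plumbum; here is a minimal sketch using only the standard library's subprocess module (sh and printf are assumed to be available on the system):

```python
import subprocess

# Only the argument directly after -c is parsed as shell code;
# the arguments after it land in $0, $1, ... of that code.
result = subprocess.run(
    ["sh", "-c", 'printf "%s" "$0:$1"', "first", "second"],
    capture_output=True,
    text=True,
)
print(result.stdout)  # first:second
```

The same mechanism is what makes `sh['-c', 'mkdir "$(date +%H%M%S)"']` work: the entire command substitution lives inside the single argument that follows -c.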

I haven't found a way to do it using the Plumbum API, but introducing a wrapper function works:
import time
from plumbum.cmd import mkdir, date

def mkdate():
    # run date, strip the trailing newline, and invoke mkdir with the result
    return mkdir(date['+%H%M%S']().strip())

for i in range(0, 3):
    mkdate()
    time.sleep(1)
mkdate is a regular function, unfortunately, not a Plumbum object, but I haven't found anything better in the docs right away =)

Related

Bash script loop on python variable

I'm trying to do a simple script, but I don't get how to pass a variable to the command that I need:
#!/bin/bash
for i in 1 2 3
do
    python -c 'print "a"*' $i
done
If you really want to go on with your solution using python -c, you would have to remove the space:
#!/bin/bash
for i in 1 2 3
do
    python -c 'print "a"*'$i
done
But the approach suggested by Asmox makes more sense to me (and this has nothing to do with the question of where you pipe the standard output).
Maybe this topic will help:
How do I run Python script using arguments in windows command line
Have you considered making a whole script? Like this one:
import sys

def f(a):
    print a

if __name__ == "__main__":
    a = int(sys.argv[1])
    f(a)
This example might seem overcomplicated, but I wanted to show you how to pass parameter from console to function
I have tried the solution provided above and had to modify it for Python 3+.
Due to lack of sufficient points I am not allowed to make a comment; that's why I posted my solution separately.
#!/bin/bash
for i in {1..3}
do
    python -c "print ('a'* ${i})"
done
output~>
a
aa
aaa
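An alternative sketch that sidesteps the quoting problem entirely: pass the loop variable as a command-line argument and read it through sys.argv inside the snippet (Python 3 syntax assumed):

```shell
#!/bin/sh
# the shell variable arrives as sys.argv[1], so no string splicing is needed
for i in 1 2 3
do
    python3 -c 'import sys; print("a" * int(sys.argv[1]))' "$i"
done
```

This prints the same a / aa / aaa sequence, and it keeps working even if the variable contains spaces or quote characters.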

Pass a code of Python in a command line

Is it possible to somehow convert string which I pass via command line e.g
python myfile.py "s = list(range(1000))"
into module? And then...
# myfile.py
def printlist(s):
    print(s)
?
Something like a timeit module, but with my own code.
python -m timeit -s "s = list(range(1000))" "sorted(s)"
Save the code
import sys
print(sys.argv)
as filename.py, then run python filename.py Hi there! and see what happens ;)
I'm not sure if I understand correctly, but the sys module has the attribute argv that is a list of all the arguments passed in the command line.
test.py:
#!/usr/bin/env python
import sys

for argument in sys.argv:
    print(argument)
This example will print every argument passed to the script; bear in mind that the first element, sys.argv[0], will always be the name of the script.
$ ./test.py hello there
./test.py
hello
there
There are two options to send short code snippets similar to what you can do via the timeit module or certain other -m module option functionality and these are:
You can use either the -c option of Python e.g.:
python -c "<code here>"
Or you could use piping such as:
echo <code here> | python
You can combine multiple statements using the ; semicolon statement separator. However, if you use a construct that requires a colon, such as while, for, def or if, then you cannot use ;, so the options are limited. There may be clever ways around this limitation, but I have yet to see them.
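One workaround, sketched here, is that the quoted string passed to -c may itself contain newlines, which lets compound statements through:

```shell
# the argument to -c can span several lines inside one pair of quotes,
# so for/if/def blocks work even though ';' cannot introduce them
python3 -c '
for i in range(3):
    print(i)
'
```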

IPython run magic: How create an alias for "run -i"?

I am programming a python user interface to control various instruments in a lab. If a script is not run interactively, the connection with instruments is lost at the end of the script, which can be very bad. I want to help the user to 'remember' running the script interactively.
I'm thinking of two possible ways to do it. First, as specified in the title, I could make an alias for run -i:
%alias_magic lab_run run -i
but this returns an error:
UsageError: unrecognized arguments: -i
Is there a way to get around this?
Alternatively, I could detect inside the script if the -i flag was passed on and raise en error if not. However, it doesn't show up in the sys.argv list:
In [1]: import sys
In [2]: run -i test.py random args
['test.py', 'random', 'args']
I can't use ipy files, because I need to read %run flags, as explained in my previous question here:
How to add a custom flag to IPython's magic commands? (.ipy files)
Anyone sees a solution to this problem?
You can define your own magic function and use %run -i in it:
from IPython.core.magic import register_line_magic

@register_line_magic
def r(line):
    get_ipython().magic('run -i ' + line)
del r
EDIT
As hpaulj points out magic is deprecated. Here a version with the new
run_line_magic:
from IPython.core.magic import register_line_magic

@register_line_magic
def r(line):
    get_ipython().run_line_magic('run', ' -i ' + line)
del r
Now:
%r
does the same as:
%run -i
Thank you @Mike for your answer; it pointed me in the right direction for making a cell magic alias.
I found that %alias_magic [-l] [-c] [-p PARAMS] name target, in the docs, does the trick for me.
In my case:
%alias_magic --cell py_310 script -p C:\Users\myname\AppData\Local\Programs\Python\Python310\python
output
Created `%%py_310` as an alias for `%%script C:\Users\myname\AppData\Local\Programs\Python\Python310\python`.
that can then be used like
%%py_310
import sys
print(sys.version_info)
output
sys.version_info(major=3, minor=10, micro=1, releaselevel='final', serial=0)

Calling a Python function from a shell script

I am trying to figure out how to call a Python function from a shell script.
I have a Python file with multiple functions and I need to use the values returned by them in my shell script. Is there a way to do it?
I am doing this in order to read a config file using Python and get the values in shell. Is there any other, better way to achieve this?
test.py contains:
import ConfigParser

config = ConfigParser.ConfigParser()
config.read("test.conf")

def get_foo():
    return config.get("locations", "foo")

def get_bar():
    return config.get("locations", "bar")
I need to store the values returned by the Python functions in a shell variable.
You can send the result of your functions to the standard output by asking the Python interpreter to print the result:
python -c 'import test; print test.get_foo()'
The -c option simply asks Python to execute some Python commands.
In order to store the result in a variable, you can therefore do:
RESULT_FOO=`python -c 'import test; print test.get_foo()'`
or, equivalently
RESULT=$(python -c 'import test; print test.get_foo()')
since backticks and $(…) evaluate a command and replace it by its output.
PS: Getting the result of each function requires parsing the configuration file each time, with this approach. This can be optimized by returning all the results in one go, with something like:
ALL_RESULTS=$(python -c 'import test; print test.get_foo(), test.get_bar()')
The results can then be split and put in different variables with
RESULT_BAR=$(echo $ALL_RESULTS | cut -d' ' -f2)
which takes the second result and puts it in RESULT_BAR for example (and similarly: -fn for result #n).
PPS: As Pablo Maurin mentioned, it would probably be easier to do everything in a single interpreter (Python, but maybe also the shell), if possible, instead of calculating variables in one program and using them in another one.
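For reference, the same idea in Python 3 syntax, where the module is configparser and print is a function (the [locations] section contents and file name below are made up for the example):

```shell
# create a throwaway config file just for the demonstration
printf '[locations]\nfoo = /data/foo\nbar = /data/bar\n' > test.conf

RESULT_FOO=$(python3 -c 'import configparser
c = configparser.ConfigParser()
c.read("test.conf")
print(c.get("locations", "foo"))')
echo "$RESULT_FOO"   # /data/foo

rm test.conf
```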

Pipe output of a command to an interactive python session?

What I'd like to do is something like
$echo $PATH | python --remain-interactive "x = raw_input().split(':')"
>>>
>>> print x
['/usr/local/bin', '/usr/bin', '/bin']
I suppose ipython solution would be best. If this isn't achievable, what would be your solution for the situation where I want to process output from various other commands? I've used subprocess before to do it when I was desperate, but it is not ideal.
UPDATE: So this is getting closer to the end result:
echo $PATH > /tmp/stdout.txt; ipython -i -c 'stdout = open("/tmp/stdout.txt").read()'
Now how can we go about bending this into a form
echo $PATH | pyout
where pyout is the "magic solution to all my problems". It could be a shell script that writes the piped output and then runs the ipython. Everything done fails for the same reasons bp says.
In IPython you can do this
x = !echo $$$$PATH
The double escape of $ is a pain though
You could do this I guess
PATH="$PATH"
x = !echo $PATH
x[0].split(":")
The --remain-interactive switch you are looking for is -i. You can also use the -c switch to specify the command to execute, such as __import__("sys").stdin.read().split(":"). So what you would try is (do not forget to quote the string!):
echo $PATH | python -i -c 'x = __import__("sys").stdin.read().split(":")'
However, this is all that will be displayed:
>>>
So why doesn't it work? Because you are piping. The Python interpreter is trying to interactively read commands from the same sys.stdin the piped data arrives on. Since echo is done executing, sys.stdin is closed and no further input can happen.
For the same reason, something like:
echo $PATH > spam
python -i -c 'x = __import__("sys").stdin.read().split(":")' < spam
...will fail.
What I would do is:
echo $PATH > spam.bar
python -i my_app.py spam.bar
After all, open("spam.bar") is a file object just like sys.stdin is :)
Due to the Python axiom of "There should be one - and preferably only one - obvious way to do it" I'm reasonably sure that there won't be a better way to interact with other processes than the subprocess module.
It might help if you could say why something like the following "is not ideal":
>>> process = subprocess.Popen(['cmd', '/c', 'echo %PATH%'], stdout=subprocess.PIPE)
>>> print process.communicate()[0].split(';')
(In your specific example you could use os.environ but I realise that's not really what you're asking.)
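On Python 3, the same subprocess pattern looks like this (a portable sketch: a fixed printf output stands in for the Windows-specific echo %PATH% above):

```python
import subprocess

# run a command, capture its stdout as text, and split it like the shell would
output = subprocess.run(
    ["printf", "%s", "/usr/local/bin:/usr/bin:/bin"],
    capture_output=True,
    text=True,
).stdout
print(output.split(":"))  # ['/usr/local/bin', '/usr/bin', '/bin']
```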