How to start a REPL at the end of a Python script? [duplicate]

This question already has answers here:
How to drop into REPL (Read, Eval, Print, Loop) from Python code
(7 answers)
Closed last month.
How can I start a REPL at the end of a Python script for debugging? In Node I can do something like this:
code;
code;
code;
require('repl').start(global);
Is there a Python alternative?

If you execute this from the command prompt, just use -i:
➜ Desktop echo "a = 50" >> scrpt.py
➜ Desktop python -i scrpt.py
>>> a
50
This drops you into the interactive interpreter after the script has executed.
Alternatively, just set the PYTHONINSPECT environment variable in your script:
import os
os.environ['PYTHONINSPECT'] = 'TRUE'
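This can also be done conditionally, so the interpreter only stays open when something worth inspecting happened. A rough sketch (the condition is purely illustrative; any non-empty value enables inspect mode):
import os

a = 50
if a != 42:  # hypothetical condition worth inspecting
    # Any non-empty string is equivalent to passing -i on the command line.
    os.environ['PYTHONINSPECT'] = '1'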

Just use pdb (the Python debugger):
import pdb
print("some code")
x = 50
pdb.set_trace() # this will let you poke around... try "p x"
print("bye")

Related

How to pass arguments to a Python script? [duplicate]

I know that I can run a Python script from my bash script using the following:
python python_script.py
But what if I want to pass a variable or argument to my Python script from the bash script? How can I do that?
Basically, bash will work out a filename and then Python will upload it, but I need to send that filename from bash to Python when I call it.
To execute a Python script from a bash script, call the same command you would use in a terminal. For instance:
> python python_script.py var1 var2
To access these arguments within Python, you will need:
import sys
print(sys.argv[0]) # prints python_script.py
print(sys.argv[1]) # prints var1
print(sys.argv[2]) # prints var2
Besides sys.argv, also take a look at the argparse module, which helps you define options and arguments for scripts.
The argparse module makes it easy to write user-friendly command-line interfaces.
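For instance, a minimal argparse version of the script above might look like this (the --verbose flag is purely illustrative):
import argparse

parser = argparse.ArgumentParser(description="Upload a file")
parser.add_argument("filename", help="path of the file to upload")
parser.add_argument("--verbose", action="store_true", help="print extra output")
args = parser.parse_args()

print(args.filename)
if args.verbose:
    print("verbose mode enabled")
It would be called as python python_script.py /path/to/file --verbose, and argparse generates a --help message for free.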
Use
python python_script.py filename
and in your Python script
import sys
print(sys.argv[1])
Embedded option:
Wrap Python code in a bash function.
#!/bin/bash
function current_datetime {
python - <<END
import datetime
print(datetime.datetime.now())
END
}
# Call it
current_datetime
# Call it and capture the output
DT=$(current_datetime)
echo Current date and time: $DT
Use environment variables to pass data into your embedded Python script.
#!/bin/bash
function line {
PYTHON_ARG="$1" python - <<END
import os
line_len = int(os.environ['PYTHON_ARG'])
print('-' * line_len)
END
}
# Do it one way
line 80
# Do it another way
echo $(line 80)
http://bhfsteve.blogspot.se/2014/07/embedding-python-in-bash-scripts.html
Use in the script:
echo $(python python_script.py arg1 arg2) > /dev/null
or
python python_script.py "string arg" > /dev/null
The script will be executed, but its output will be discarded.
I have a bash script that calls a small Python routine to display a message window. As I need to use killall to stop the Python script, I can't use the method above: that would mean running killall python, which could take out other Python programs. So I use
pythonprog.py "$argument" &   # the & returns control straight to the bash script
As long as the Python script will run from the CLI by name (rather than as python pythonprog.py), this works within the script. If you need more than one argument, just put a space between each one inside the quotes.
Also take a look at the getopt module.
It works quite well for me!
Print all arguments without the script name itself:
import sys

for i in range(1, len(sys.argv)):  # sys.argv[0] is the script name, so skip it
    print(sys.argv[i])

How do I pass a variable created in a Python script to be used in a bash script? [duplicate]

This question already has answers here:
Passing multiple variables from python script to shell script
(3 answers)
Closed 2 years ago.
I am using Python 3.6.3.
I have created a Python script, 'create.py'. This Python script calls and runs a bash script, 'verify.sh'.
The 'verify.sh' script sends an email:
#!/bin/sh
emailGroup="dummy#email.com"
echo "The variable of interest is x(insert here): $1 " | mail -s "The variable of interest is x(insert here)" ${emailGroup}
So x is determined in my Python script. I want to insert x into the 'verify.sh' script above so that it goes out with the email.
If that's really the full extent of your verify.sh script, you can remove it entirely.
import subprocess
sent = subprocess.run(
    ['mail', '-s', 'The variable of interest is x(insert here)', 'dummy@email.com'],
    input='The variable of interest is x(insert here): {0}'.format(x),
    text=True, check=True)
If you are on an earlier version of Python than 3.7, you will want to use universal_newlines=True instead of text=True.
You don't really need mail -s either; see e.g. How to send an email with Python?
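For completeness, a rough sketch of sending the same message with the standard-library smtplib instead of mail, assuming an SMTP server is reachable on localhost (the sender address is a placeholder):
import smtplib
from email.message import EmailMessage

msg = EmailMessage()
msg["Subject"] = "The variable of interest is x(insert here)"
msg["From"] = "noreply@example.com"  # placeholder sender address
msg["To"] = "dummy@email.com"
msg.set_content("The variable of interest is x(insert here): {0}".format(x))

with smtplib.SMTP("localhost") as server:  # assumes a local SMTP server
    server.send_message(msg)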
Here's an example of how to do that.
First the Python script:
#!/usr/bin/python3
import subprocess
recipient = "me#stackoverflow.com"
subject = "Python Script Calls Bash Script"
body = "This is a test, yes it is, yes it is."
notify_script = "/home/me/Scripts/notify"
subprocess.run([notify_script, recipient, subject, body])
Second the Bash script:
#!/bin/bash
recipient="$1"
subject="$2"
body="$3"
dummy_mailer -r "$recipient" -s "$subject" -b "$body"
Any number of args (see ARG_MAX) can be sent from the Python script using subprocess.run(); simply add or remove them from the list given to run(), e.g.
subprocess.run([path_to_script, arg_1])
subprocess.run([path_to_script, arg_1, arg_2])
subprocess.run([path_to_script, arg_1, arg_2, arg_3, arg_4])

Execute a shell script from Python and assign the output to a variable [duplicate]

This question already has answers here:
Running shell command and capturing the output
(21 answers)
Closed 2 years ago.
I am executing the shell command below in Python to get the total number of lines in a file, and was wondering: is there any way to assign the output to a variable in Python? Thanks
import os
cmd = "wc -l" + " " + /path/to/file/text.txt
num_rows = os.system(cmd)
Try the os.exec* family of functions in the os module. They execute external programs, but note that they replace the current Python process, so they cannot return the command's output to your script. Shell scripts can be executed this way too.
https://docs.python.org/2/library/os.html#process-management
You may also want to try the subprocess module.
https://docs.python.org/2/library/subprocess.html
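For example, a sketch using subprocess.run that captures the output of wc -l directly into a Python variable, avoiding the shell entirely (capture_output and text require Python 3.7+):
import subprocess

result = subprocess.run(
    ["wc", "-l", "/path/to/file/text.txt"],
    capture_output=True, text=True, check=True)

# wc prints "<count> <filename>", so take the first field
num_rows = int(result.stdout.split()[0])
print(num_rows)
Note that os.system() only returns the command's exit status, not its output, which is why the original snippet does not work.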

Creating Python Executables for *nix [duplicate]

This question already has answers here:
Why do people write #!/usr/bin/env python on the first line of a Python script?
(22 answers)
Closed 7 years ago.
I have written a library in Python that is kind of like a scripting language, and I had the idea of turning the library into a terminal executable that could be called like:
myprogram /path/to/file
So, just as python runs .py files, this would run the file with the functions defined in my library. Is this at all possible?
Suppose you write a script like this:
#!/usr/bin/python
import sys

if sys.argv[0] == "./funA.py":
    print("Calling A")
    # ...
    sys.exit(0)

if sys.argv[0] == "./funB.py":
    print("Calling B")
    # ...
    sys.exit(0)

print(sys.argv[0])
You call it main.py. Then you make hard links like this:
$ ln main.py funA.py
$ ln main.py funB.py
It is really just one file with three names, and its link count is equal to 3.
If you run it as
./funA.py
it will be dispatched to the funA block in your code, and if you run it as
./funB.py
it will be dispatched to the funB block, and so on.
Is that what you're looking for?
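Regarding the original goal of running myprogram /path/to/file without typing python, the usual approach is the one from the linked duplicate: give the entry script a shebang line, make it executable, and put it somewhere on your PATH. A minimal sketch, where mylibrary and run_file are hypothetical stand-ins for whatever your library actually provides:
#!/usr/bin/env python3
# Save this file as "myprogram" (no .py extension needed), then:
#   chmod +x myprogram
#   mv myprogram /usr/local/bin/    # or any directory on your PATH
import sys
from mylibrary import run_file  # hypothetical: your library's entry point

if len(sys.argv) != 2:
    sys.exit("usage: myprogram /path/to/file")
run_file(sys.argv[1])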

Detect if a Python script is run from the console or by crontab [duplicate]

This question already has answers here:
Checking for interactive shell in a Python script
(5 answers)
Closed 5 years ago.
Imagine a script running under these two sets of "conditions":
live action, set up in sudo crontab
debug, when I run it from the console: ./my-script.py
What I'd like to achieve is automatic detection of "debug mode", without me specifying an argument (e.g. --debug) for the script.
Is there a convention for how to do this? Is there a variable that can tell me who the script owner is? Whether the script has a console at stdout? Should I run ps | grep to find out?
Thank you for your time.
Since sys.stdin will be a TTY in debug mode, you can use the os.isatty() function:
import sys, os
if os.isatty(sys.stdin.fileno()):
    # Debug mode.
    pass
else:
    # Cron mode.
    pass
You could add an environment variable to the crontab line and check, inside your Python application, whether the environment variable is set.
crontab's configuration file:
CRONTAB=true
# run five minutes after midnight, every day
5 0 * * * /path/to/your/pythonscript
Python code:
import os

if os.getenv('CRONTAB') == 'true':
    # do your crontab things
    pass
else:
    # do your debug things
    pass
Use a command-line option that only cron will use, or a symlink to give the script a different name when called by cron. You can then use sys.argv[0] to distinguish between the two ways of calling the script.
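A rough sketch of the symlink idea, assuming the symlink was created with something like ln -s my-script.py my-script-cron.py and crontab points at the new name:
import os
import sys

# Symlink created beforehand, e.g.: ln -s my-script.py my-script-cron.py
if os.path.basename(sys.argv[0]) == "my-script-cron.py":
    cron_mode = True   # invoked through the cron symlink
else:
    cron_mode = False  # invoked directly from the console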
