Calling a shell script from Python - python

I'm currently trying to run a shell script using the os.system method in Python.
Python code:
file = open("History.txt","w")
file.write(history)
os.system('./TFPupload.sh')
Shell script code:
#!/bin/sh
HOST="ftp.finalyearproject95.com"
USER='*****'
PASSWD='*****'
FILE='History.txt'
ftp -n $HOST <<END_SCRIPT
quote USER $USER
quote PASS $PASSWD
put $FILE
quit
END_SCRIPT
echo ">>>uploaded<<<\n"
exit 0
At first, when I tried running the Python code and the shell script one by one, they worked perfectly. However, when I use Python to run the shell script, instead of uploading the 'History.txt' file that contains data to the database, the uploaded file is empty. When I check with 'nano History.txt' it does contain data; it is only the file that gets passed to the database that is empty. Why is that?

Use a with statement to open files whenever possible:
with open("History.txt", "w") as file:
    file.write(history)
os.system('./TFPupload.sh')
The with statement takes care of closing the file on its own, which flushes the buffered data to disk before the shell script runs.
Some ref: What is the python "with" statement designed for?
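The underlying issue is buffering: os.system() launches the upload before the file is closed, so the written data is still sitting in Python's buffer rather than on disk. A minimal sketch of the fixed flow, using subprocess.run (available since Python 3.5) and with `cat` standing in for the upload script so the effect is visible:

```python
import subprocess

history = "example history data\n"  # stand-in for the real history string

# Leaving the with block closes the file, which flushes Python's write
# buffer to disk -- any child process started afterwards sees the
# complete file.
with open("History.txt", "w") as f:
    f.write(history)

# Equivalent of os.system('./TFPupload.sh'); 'cat' stands in for the
# upload script here.
result = subprocess.run(["cat", "History.txt"], capture_output=True, text=True)
print(result.stdout, end="")
```

Had os.system run inside the with block (or before an explicit close()), the child would have seen a partially written or empty file, which matches the symptom in the question.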

Related

Python can't read a temporary file

I have a Python program which is supposed to calculate changes based on a value written in a temporary file (e.g. "12345\n"). It is always an integer.
I have tried different methods to read the file, but Python wasn't able to read it. So I had the idea of executing a shell command ("cat") that returns the content. When I execute this in the shell it works fine, but from Python the output I get is empty. Then I tried writing a bash and then a PHP script which would read the file and return the value. When I call them from Python, the output is empty as well.
I wondered whether this was a general problem in Python, so I made my scripts return the content of other temporary files, which worked fine.
Inside my scripts I was able to do calculations with the value, and in the shell the output is exactly as expected, but not when called via Python. I also noticed that I don't get the value from my extra scripts when they are called by Python (I tried writing it into another file; the file was updated but empty).
The file I am trying to read is in the /tmp directory and is written to several times per second by another script.
I am looking for a solution (open for new ideas) in which I end up having the value of the file in a python variable.
Thanks for the help
Here are my programs:
python:
# python script
import subprocess
stdout = subprocess.Popen(["php /path/to/my/script.php"], shell = True, stdout = subprocess.PIPE).communicate()[0].decode("utf-8")
# other things I tried
#with open("/tmp/value.txt", "r") as file:
# stdout = file.readline() # output = "--"
#stdout = os.popen("cat /tmp/value.txt").read() # output = "--"
#stdout = subprocess.check_output(["php /path/to/my/script.php"], shell = True, stdout = subprocess.PIPE).decode("utf-8") # output = "--"
print(str("-" + stdout + "-")) # output = "--"
php:
# php script
$valueFile = fopen("/tmp/value.txt", "r");
$value = trim(fgets($valueFile), "\n");
fclose($valueFile);
echo $value; # output in the shell is the value of $value
Edit: context: my Python script is started by another Python script, which listens for commands from an Apache server on the Pi. The value I want to read comes from a "1wire" device that listens for S0 signals.
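One plausible explanation, given that the writer truncates and rewrites /tmp/value.txt several times per second, is that the reads race against a momentarily empty file. A small retry loop sidesteps that; the path, retry count, and delay below are illustrative:

```python
import time

def read_value(path="/tmp/value.txt", retries=5, delay=0.05):
    """Read an integer from a file that another process rewrites
    several times per second, retrying if a read races with the
    moment the file has just been truncated."""
    for _ in range(retries):
        try:
            with open(path) as f:
                text = f.read().strip()
            if text:
                return int(text)
        except (OSError, ValueError):
            pass  # file missing or mid-rewrite; try again
        time.sleep(delay)
    return None

value = read_value()  # ends up as an int, or None if every attempt failed
```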

Passing variables/values from python script to shell script

This is my use case: I have a python script that returns the name of a remote server and a file path from a config file. I need a way to use those two or more parameterized variables and their values as input for my shell script which will then sync files by connecting to the remote server. Any tips appreciated.
See if this is what you were looking for. I have a Python script that prints out 3 lines, each with a different value. For you, a[0] would be the server, a[1] the first file, and a[2] the second. The number of lines is arbitrary; this works for any number, one per line, and allows spaces in the file names.
The "<(" executes what is inside and creates something like a pipe, which the readarray command reads (it takes standard input, namely "<").
> readarray a < <(python3 -c 'print("myserver:8080"); print("file1 which may have spaces"); print("another file")')
> echo ${a[0]}
myserver:8080
> echo ${a[1]}
file1 which may have spaces
> echo ${a[2]}
another file
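The hand-off can also go the other way: instead of having the shell parse Python's output, the Python side can invoke the shell script and pass each value as its own argv entry, which likewise preserves spaces. A sketch, with `printf` standing in for the real sync script:

```python
import subprocess

server = "myserver:8080"  # values that would come from the config file
files = ["file1 which may have spaces", "another file"]

# Each list element becomes one positional parameter ($1, $2, ...) of the
# script, so spaces in file names survive without any quoting tricks.
result = subprocess.run(
    ["sh", "-c", 'printf "%s\\n" "$@"', "argv0", server] + files,
    capture_output=True, text=True,
)
print(result.stdout, end="")
```

A real sync script would simply refer to `$1` for the server and `"$2"`, `"$3"`, ... (or `"$@"` after a `shift`) for the files.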

Python to read stdin from other source continously

Is it possible to have Python continuously read stdin fed from another source, such as a file? Basically I'm trying to let my script echo its stdin, and I'd like to use a file or another external source to interact with it (while it remains open).
An example might be (input.py):
#!/usr/bin/python
import sys
line = sys.stdin.readline()
while line:
    print line,
    line = sys.stdin.readline()
Executing this directly, I can continuously enter text and it echoes back while the script stays alive. If I use an external source though, such as a file or input from bash, the script exits immediately after receiving the input:
$ echo "hello" | python input.py
hello
$
Ultimately what I'd like to do is:
$ tail -f file | python input.py
Then, if the file updates, input.py should echo back anything that is added to file while remaining open. Maybe I'm approaching this the wrong way, or I'm simply clueless, but is there a way to do it?
Use the -F option to tail to make it reopen the file if it gets renamed or deleted and a new file is created with the original name. Some editors write the file this way, and logfile rotation scripts also usually work this way (they rename the original file to filename.1, and create a new log file).
$ tail -F file | python input.py

Using Osmconvert with Python

I want to use osmconvert to cut my diff files down to just the area I'm interested in, because osmconvert is much faster than osm2pgsql, which loads the data.
When I call the command using os.system() like such:
cmd = r"""c:\temp\osmconvert.exe 770.osc.gz -b=1,1,3,3 -o=extract.o5m"""
os.system(cmd)
I get osmconvert error: cannot open file
When I run the same exact command from my command prompt in Windows 7, it runs fine. What is python doing to prevent this function from running? The 770.osc.gz file lives in the same directory as osmconvert.exe and the output extract.05m should populate in the same directory as the osmconvert.exe exists.
If I put the command in a batch file, it works, but I want to use python to download the file from the server so I can automate the updates of the database.
Thank you
The 770.osc.gz file lives in the same directory as osmconvert.exe, and the output extract.o5m should populate in the same directory as osmconvert.exe.
That's not what your code is saying. The code says "execute osmconvert.exe from inside c:\temp\, but read 770.osc.gz and write extract.o5m relative to the current working directory".
If you want everything to run inside c:\temp\, then you either have to change to that directory before executing osmconvert, or you have to prepend the path to every file you pass to osmconvert.
Try this call instead:
cmd = r"""c:\temp\osmconvert.exe c:\temp\770.osc.gz -b=1,1,3,3 -o=c:\temp\extract.o5m"""
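An alternative, if you'd rather keep the short file names, is to make the child process itself run inside c:\temp. With subprocess (os.system has no such option), the cwd argument does exactly that; the paths and arguments below are the ones from the question:

```python
import subprocess

def run_osmconvert():
    # cwd= makes the relative paths 770.osc.gz and extract.o5m resolve
    # inside c:\temp, just as if the command were typed in that directory.
    subprocess.run(
        [r"c:\temp\osmconvert.exe", "770.osc.gz", "-b=1,1,3,3", "-o=extract.o5m"],
        cwd=r"c:\temp",
        check=True,
    )
```

check=True also raises a CalledProcessError if osmconvert exits with a non-zero status, instead of failing silently the way os.system can.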

automation: Script to take a mysqldump into a file named by date/time of backup

I tried fabric with a '>' in the command string; it always exits with error code 2. I'm currently dabbling with subprocess.call and subprocess.check_output, keeping stdout="filesocket". Not working: the only thing that gets written to the file is the USAGE text for mysqldump. I'm using shlex to parse 'mysqldump -uroot -ppassword database table1 table2'.
All this because I don't know how to do shell scripting with string variables from the 'date' utility. How do I take the current date and use it to name the backup file in a shell script? Or how do I get this done in Python?
Thanks in advance.
regards.
You can get a custom-formatted date out of the date utility using the following syntax.
CUSTOM_DATE=$(date "+%Y-%m-%d_%H_%M_%S")
The easiest way to accomplish this is to put a script on the remote end that does 'everything'
#!/bin/bash
CUSTOM_DATE=$(date "+%Y-%m-%d_%H_%M_%S")
mysqldump -u admin -ppassword database table1 table2 >/path/to/backups/mysqldump.${CUSTOM_DATE}.db
"How do I take the current date and use it to name the backup file in shell script. OR how do I get this thing done in python?"
from datetime import datetime
filename = 'mysql_backup_{0:%Y%m%d_%H%M}.sql'.format(datetime.now())
# filename == 'mysql_backup_20120227_0952.sql'
My Answer from related stackoverflow post.
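The '>' failures in the question come from the fact that redirection is shell syntax, not something mysqldump (or subprocess without shell=True) understands. Passing a file object as stdout does the redirection in Python instead; the credentials and table names below are the placeholders from the question:

```python
import subprocess
from datetime import datetime

def backup_mysql():
    filename = 'mysql_backup_{0:%Y%m%d_%H%M}.sql'.format(datetime.now())
    # The open file object replaces the shell's '>' redirection: mysqldump
    # writes its dump straight into the date-stamped file.
    with open(filename, "w") as out:
        subprocess.run(
            ["mysqldump", "-uroot", "-ppassword", "database", "table1", "table2"],
            stdout=out,
            check=True,
        )
```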
In Microsoft Windows, run below command in CMD
mysqldump -u USERNAME -pYOURPASSWORD --all-databases > "C:/mysql_backup_%date:~-10,2%-%date:~-7,2%-%date:~-4,4%-%time:~0,2%_%time:~3,2%_%time:~6,2%.sql"
Output file will look like,
mysql_backup_21-02-2015-13_07_18.sql
If you want to automate the backup process, then you can use Windows Task Scheduler, and put above command in .bat file - task scheduler will run the .bat file at specified interval.
