Opening and closing a file while logging data - python

So, I'm logging temperature and humidity data from a DHT22 hooked up to the GPIO on a Raspberry Pi. It logs everything correctly, but I can only see the updated log after I stop logger.py running.
I think the problem is that I'm not closing the file after writing to it, but I'm not sure. Can I just add an f = open(xxx) and f.close() to the loop so that it 'saves' it every time it logs?
import os
import time
import Adafruit_DHT

DHT_SENSOR = Adafruit_DHT.DHT22
DHT_PIN = 4

try:
    f = open('/home/pi/temphumid/log.csv', 'a+')
    if os.stat('/home/pi/temphumid/log.csv').st_size == 0:
        f.write('Date,Time,Temperature,Humidity\r\n')
except:
    pass

while True:
    humidity, temperature = Adafruit_DHT.read_retry(DHT_SENSOR, DHT_PIN)
    if humidity is not None and temperature is not None:
        f.write('{0},{1},{2:0.1f}*C,{3:0.1f}%\r\n'.format(time.strftime('%m/%d/%y'), time.strftime('%H:%M:%S'), temperature, humidity))
    else:
        print("Failed to retrieve data from humidity sensor")
    time.sleep(60)
expected:
log.csv is updated, so that if I use tail log.csv I can see the up-to-date data.
actual:
log.csv doesn't update until I stop logger.py from running (using SIGINT from htop, as it is currently run as a cronjob on boot).

Every time we open a file for writing, we need to close it (or flush it) so that the buffered output is actually pushed to disk:
fp = open("./file.txt", "w")
fp.write("Hello, World")
fp.close()
To avoid calling the close() method every time, we can use open() as a context manager, which will automatically close the file after exiting the block:
with open("./file.txt", "w") as fp:
    fp.write("Hello, World")
Here we do not need to call the close() method ourselves; the data is pushed out to the file when the block ends.

Write the data to the file, call file.flush(), and then os.fsync(file.fileno()) (file objects have no fsync() method of their own), which writes the data through to the disk. You will even be able to open the file with a different program and see the changes in real time.
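A minimal sketch of that flush-plus-fsync pattern in a write loop. The file name here is illustrative, not taken from the question:

```python
import os

# Hypothetical log file for illustration; any append-mode file works the same way.
with open("flush_demo.csv", "a") as f:
    for i in range(3):
        f.write("row %d\n" % i)
        f.flush()                # push Python's internal buffer to the OS
        os.fsync(f.fileno())     # ask the OS to commit its own buffer to disk
```

After each iteration, a `tail flush_demo.csv` from another terminal would show the latest row; flush() alone is usually enough to make the data visible, while fsync() additionally protects against power loss.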

Related

How can I update a Python file by comparing it to a file hosted on my Raspberry Pi?

I am attempting to make a program update itself to the newest version that I have made, e.g. when I have added new functionality to it. It would be useful for me to be able to upload the updated file to a central location like my Raspberry Pi and have the program update itself across all of my computers without updating each one individually.
I have written the code below, but it does not work. It can recognize when the file is up-to-date, but running the new program it downloads fails: it successfully downloads the update and deletes itself, but the new program is not run, with no error messages being shown.
Update test.py:
#updaterV1.py
import time
import requests
import os
import hashlib

time.sleep(5)
cwd = os.getcwd()
URL = r"http://[rasberry pi's ip]/update%20files/dev/hash.txt"
hash_path = os.path.join(cwd, "remote hash.txt")
with open(hash_path, "wb") as f:
    f.write(requests.get(URL).content)
with open(hash_path, "r") as hash_file:
    remotehash = (hash_file.readline()).strip()
os.remove(hash_path)

hasher = hashlib.sha256()
with open(__file__, 'rb') as self_file:
    selfunhashed = self_file.read()
hasher.update(selfunhashed)
selfhash = hasher.hexdigest()
print(selfhash)
print(remotehash)

if (selfhash == remotehash):
    print("program is up to date")
    input()
else:
    update_path = os.path.join(cwd, "temp name update.py")
    URL = r"http://[rasberry pi's ip]/update%20files/dev/update.py"
    with open(update_path, "wb") as f:
        f.write(requests.get(URL).content)
    with open(update_path, "r") as f:
        name = f.readline().strip()
    name = name[1:]  # use the 1st line as "#name.py" not "# name"
    update_path = os.path.join(cwd, name)
    try:
        os.remove(update_path)
    except:
        pass
    os.rename(os.path.join(cwd, "temp name update.py"), update_path)
    os.system("python \"" + update_path + "\"")
    print("removing self file now")
    os.remove(__file__)
It uses a separate TXT file, stored in the same folder on the server, that holds the hash of the program, so the script can check the remote file's hash without downloading the actual file just to hash it locally.
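The hash comparison at the heart of this updater can be isolated into a small helper. This is a sketch, and the function name is my own, but the hashing logic mirrors what the script does with `__file__`:

```python
import hashlib

def sha256_of_file(path):
    """Return the hex SHA-256 digest of a file's bytes."""
    hasher = hashlib.sha256()
    with open(path, "rb") as f:
        hasher.update(f.read())   # small files: read all at once, as the updater does
    return hasher.hexdigest()

# Usage: compare this script's own hash against a remote one, e.g.
#   sha256_of_file(__file__) == remotehash
```

Note that the digest is over the exact bytes of the file, so the hash.txt on the server must be generated from a byte-identical copy (line endings included) or the comparison will always fail.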

How to save datetime to file in Python for use in script automation

I'm creating a script that periodically scrapes a server for "new" files that have been added. To do so, I wish to store the datetime of my last script execution in a file, so that the script can process "all new files" since that datetime. The end goal is to run this script periodically via Windows Task Scheduler.
I'm able to do a basic version of this using the code below. However, I would expect there to be a cleaner, shorter or more robust way of achieving this. Any suggestions are welcome!
import datetime

fmt = "%Y-%m-%d %H:%M:%S"
last_run = ""

# try loading the datetime of the last run, else print warning
try:
    with open("last_run.txt", mode="r") as file:
        last_run = datetime.datetime.strptime(file.read(), fmt)
    print(last_run)
except:
    print("no file available")

# ... run script code using the last_run variable as input ...

# update the script execution time and save it to the file
with open("last_run.txt", mode="w") as file:
    file.write(datetime.datetime.now().strftime(fmt))
Your solution looks fine.
The only thing I would suggest is to pull the reading and writing of the last-run timestamp out into two separate functions, and to move those functions into a separate module file. This is the same as the suggestion from #Tomalak in the reply above. Below is an example.
Module file: last_run.py
import datetime

fmt = "%Y-%m-%d %H:%M:%S"

def get_last_run_time_stamp():
    """
    Get the last run timestamp.

    When this function is called and last_run.txt is present,
    open the file and return the timestamp stored in it.

    When this function is called and last_run.txt is not present,
    fall back to returning the current timestamp.
    """
    # try loading the datetime of the last run
    try:
        with open("last_run.txt", mode="r") as file:
            return datetime.datetime.strptime(file.read(), fmt)
    except IOError:
        # last_run.txt is not present: use the current timestamp instead
        return datetime.datetime.now()

def save_last_run_time_stamp():
    """
    Save the last run timestamp.

    When this function is called and last_run.txt is present,
    open the file and overwrite it with the current timestamp.

    When this function is called and last_run.txt is not present,
    create the file and write the current timestamp to it.
    """
    # update the script execution time and save it to the file
    with open("last_run.txt", mode="w") as file:
        file.write(datetime.datetime.now().strftime(fmt))
Then, below is the file configured and run by the scheduler: run_latest_scaped_files.py
import last_run as lr
last_run = lr.get_last_run_time_stamp()
print(last_run)
# ... run script code using the last_run variable as input ...
lr.save_last_run_time_stamp()
That's it!!!

Autorun python script save output to txt file raspberry pi

I have an issue with my Raspberry Pi that starts up a Python script. How do I save the printed output to a file when it is running on boot? I found the script below on the internet, but it doesn't seem to write the printed text; it creates the file, but the content is empty.
sudo python /home/pi/python.py > /home/pi/output.log
It does write its output to the file, but you cannot see it until the Python file has finished executing, because the buffer is never flushed.
If you redirect the output to a file within your Python script, you can periodically call flush in your code to push the output through to the file as and when you wish, something like this:
import sys
import time

outputFile = "output.txt"

with open(outputFile, "w+") as sys.stdout:
    while True:
        print("some output")
        sys.stdout.flush()  # force buffer content out to file
        time.sleep(5)       # wait 5 seconds
If you want to set the output back to the terminal afterwards, save a reference to the original stdout like this:
import sys
import time

outputFile = "output.txt"
original_stdout = sys.stdout

with open(outputFile, "w+") as sys.stdout:
    print("some output in file")
    sys.stdout.flush()
    time.sleep(5)

sys.stdout = original_stdout
print("back in terminal")
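As an alternative to rebinding sys.stdout, Python 3's print() can be pointed at a file directly, with flush=True pushing each line out immediately. This is a sketch with an illustrative file name, not part of the answer above:

```python
# Instead of redirecting sys.stdout, pass the file object to print() and
# let flush=True force each line out to disk as it is written (Python 3).
with open("flush_print_demo.txt", "w") as logfile:
    for i in range(3):
        print("some output", i, file=logfile, flush=True)
```

This keeps normal terminal printing untouched, so there is nothing to restore afterwards.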

How to record twitch stream in python, preferably using livestreamer?

Currently all I have is:
from livestreamer import Livestreamer
session = Livestreamer()
stream = session.streams('http://www.twitch.tv/riotgames')
stream = stream['source']
fd = stream.open()
As I'm still a newbie to Python, I'm at a complete loss as to what I should do next. How do I continuously save, let's say, the last 40 seconds of the stream to a file?
Here's a start:
from livestreamer import Livestreamer
session = Livestreamer()
stream = session.streams('http://www.twitch.tv/riotgames')
stream = stream['source']
fd = stream.open()
with open("/tmp/stream.dat", 'wb') as f:
    while True:
        data = fd.read(1024)
        f.write(data)
I've tried it. You can open /tmp/stream.dat in VLC, for example. The example will read 1 KB at a time and write it to a file.
The program will run forever, so you have to interrupt it with Ctrl-C or add some logic for that. You probably need to handle errors and the end of a stream somehow.
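For the "last 40 seconds" part, which the answer leaves open, one possible approach is a bounded window of chunks with collections.deque, so old data falls off automatically. This is a sketch under my own assumptions: the function name is mine, the chunk and window sizes are illustrative stand-ins for "40 seconds' worth of data", and an in-memory stream stands in for the file-like object that stream.open() returns:

```python
import io
from collections import deque

CHUNK = 1024   # bytes per read, as in the answer above
KEEP = 5       # number of most recent chunks to retain (stand-in for "40 seconds")

def tail_of_stream(fd, chunk_size=CHUNK, keep=KEEP):
    """Read a stream to the end, keeping only the last `keep` chunks."""
    window = deque(maxlen=keep)   # oldest chunks are dropped automatically
    while True:
        data = fd.read(chunk_size)
        if not data:              # end of stream
            break
        window.append(data)
    return b"".join(window)

# Usage with an in-memory stand-in for the stream's file-like object:
fake = io.BytesIO(b"x" * 10000)
tail = tail_of_stream(fake)
```

For a real time-based window you would record a timestamp alongside each chunk and evict by age rather than by count; this byte-count version just shows the rolling-buffer idea.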

python network file writing in a robust manner

I am looking for a robust way to write out to a network drive. I am stuck with WinXP writing to a share on a Win2003 server. I want to pause writing if the network share goes down... then reconnect and continue writing once the network resource is available. With my initial code below, what happens is the 'except' catches the IOError when the drive goes away, but then when the drive becomes available again, the outf operations continue to IOError.
import serial

with serial.Serial('COM8', 9600, timeout=5) as port, open('m:\\file.txt', 'ab') as outf:
    while True:
        x = port.readline()                  # read one line from serial port
        if x:                                # if there was some data
            print x[0:-1]                    # display the line without extra CR
            try:
                outf.write(x)                # write the line to the output file
                outf.flush()                 # actually write the file
            except IOError:                  # catch an io error
                print 'there was an io error'
I suspect that once an open file goes into an error state because of the IOError that you will need to reopen it. You could try something like this:
import serial

with serial.Serial('COM8', 9600, timeout=5) as port:
    while True:
        try:
            with open('m:\\file.txt', 'ab') as outf:
                while True:
                    x = port.readline()          # read one line from serial port
                    if x:                        # if there was some data
                        print x[0:-1]            # display the line without extra CR
                        outf.write(x)            # write the line to the output file
                        outf.flush()             # actually write the file
        except IOError:                          # an error exits the inner loop and closes the file
            print 'there was an io error'
This puts the exception handling inside an outer loop that will reopen the file (and continue reading from the port) in the event of an exception. In practice you would probably want to add a time.sleep() or something to the except block in order to prevent the code from spinning.
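The reopen-and-retry pattern described above can be sketched in isolation, here in Python 3 syntax. The function name, file path, and retry counts are all illustrative assumptions, and a local file stands in for the network share:

```python
import time

def append_robustly(path, lines, retries=3, delay=0.01):
    """Append `lines` to `path`, reopening the file after an IOError.

    Sketch only: a real network-share writer would likely loop forever
    (waiting for the share to come back) rather than give up after
    a fixed number of retries.
    """
    remaining = list(lines)
    for _ in range(retries):
        try:
            with open(path, "a") as outf:
                while remaining:
                    outf.write(remaining[0] + "\n")
                    outf.flush()          # push each line out immediately
                    remaining.pop(0)      # only drop a line once it is written
            return True
        except IOError:
            time.sleep(delay)             # pause before reopening, to avoid spinning
    return False

# Usage: append_robustly("m:\\file.txt", ["line one", "line two"])
```

Keeping the unwritten lines in a list means nothing is lost across a reopen: a line is only removed from the queue after its write has succeeded.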
