Cronjob doesn't execute python script - python

I want to use cron to execute my Python script every hour of the day. Therefore I created a cron job that looks like: @hourly /home/pi/Desktop/repository/auslastung_download/auslastung.py
The cronjob should execute the following script:
from bs4 import BeautifulSoup
from selenium.webdriver.firefox.options import Options as FirefoxOptions
from selenium import webdriver
from datetime import datetime, date

def get_auslastung_lichtenberg():
    try:
        url = "https://www.mcfit.com/de/fitnessstudios/studiosuche/studiodetails/studio/berlin-lichtenberg/"
        options = FirefoxOptions()
        options.add_argument("--headless")
        driver = webdriver.Firefox(options=options)
        driver.get(url)
        html_content = driver.page_source
        soup = BeautifulSoup(html_content, 'html.parser')
        elems = soup.find_all('div', {'class': 'sc-iJCRLp eDJvQP'})
        #print(elems)
        auslastung = str(elems).split("<span>")[1]
        #print(auslastung)
        auslastung = auslastung[:auslastung.rfind('</span>')]
        #print(auslastung)
        auslastung = str(auslastung).split("Auslastung ")[1]
        #print(auslastung)
        auslastung = auslastung[:auslastung.rfind('%')]
        print(auslastung)
        now = datetime.now()
        current_time = now.strftime("%H:%M:%S")
        #print("Current Time =", current_time)
        today = date.today()
        print(today)
        ergebnis = {'date': today, 'time': current_time, 'studio': "Berlin Lichtenberg", 'auslastung': auslastung}
        return ergebnis
    finally:
        try:
            driver.close()
        except:
            pass

"""
import json
with open('database.json', 'w') as f:
    json.dump(get_auslastung_lichtenberg(), f)
"""

import csv
with open('/home/pi/Desktop/repository/auslastung_download/data.csv', mode='a') as file:
    fieldnames = ['date', 'time', 'studio', 'auslastung']
    writer = csv.DictWriter(file, fieldnames=fieldnames)
    writer.writerow(get_auslastung_lichtenberg())
When executed via python3 auslastung.py everything works fine and the script writes into the data.csv file.
Maybe someone can help me :)

First of all, you must ensure that your script runs.
If you run python3 auslastung.py interactively, why do you invoke your Python script differently in your cron?
Have you tried to run just /home/pi/Desktop/repository/auslastung_download/auslastung.py interactively, without the initial python3? Does it run?
If your script runs with python3 auslastung.py, then in your crontab you should include the full path to both the interpreter and the script:
@hourly /path/to/python3 /full/path/to/script.py
If you made your script run directly, without needing to indicate the interpreter (just /full/path/to/script.py), then in your crontab you should include the full path to the script:
@hourly /full/path/to/script.py
You may include a shebang: the very first line of your script indicates which interpreter is used to execute it. So your first line should be #!/path/to/your/interpreter
And you have to ensure that the script has execute permission, with chmod +x auslastung.py.
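Putting the pieces above together, a complete crontab entry might look like the following sketch. The interpreter path /usr/bin/python3 and the log file location are assumptions (check yours with which python3); redirecting output makes cron failures visible, since cron itself reports nothing to the terminal:

```shell
# Hypothetical full entry: absolute interpreter path, absolute script
# path, and stdout/stderr captured to a log so errors can be inspected.
@hourly /usr/bin/python3 /home/pi/Desktop/repository/auslastung_download/auslastung.py >> /home/pi/auslastung.log 2>&1
```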

Related

Automatic script doesn't output data

I am collecting weather data with my Raspberry Pi, just for fun.
If I execute my Python script in the console, everything works fine. But if I add the Python file to crontab to start it after rebooting, it isn't working. (crontab entry: @reboot python3 /home/pi/Documents/PythonProgramme/WeatherData/weatherdata.py &)
#!/usr/bin/python3
from pyowm import OWM
import csv
import schedule
from datetime import datetime
import time

key = 'XXXXXX'

def weather_request(text):
    owm = OWM(key)
    mgr = owm.weather_manager()
    karlsruhe = mgr.weather_at_place('Karlsruhe, DE').weather
    hamburg = mgr.weather_at_place('Hamburg, DE').weather
    cities = (karlsruhe, hamburg)
    with open('weatherdata.csv', 'a') as file:
        writer = csv.writer(file)
        row = [datetime.now().strftime("%Y-%m-%d %H:%M:%S")]
        for city in cities:
            row.append(city.temperature('celsius')['temp'])
        row.append(round(row[1] - row[2], 2))
        row.append(text)
        writer.writerow(row)

schedule.every().day.at("08:00").do(weather_request, 'morgens')
schedule.every().day.at("13:00").do(weather_request, 'mittags')
schedule.every().day.at("18:00").do(weather_request, 'abends')

while 1:
    schedule.run_pending()
    time.sleep(1)
If I run ps -aef | grep python, it shows that my script is running: pi 337 1 21 10:32 ? 00:00:10 python3 /home/pi/Documents/PythonProgramme/WeatherData/weatherdata.py
But I never get any data. What am I missing?
Thanks in advance!
Where are you checking for the output file?
Have you tried opening the file with its full path?
with open('<fullPath>weatherdata.csv', 'a') as file:
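The likely culprit is the relative path: cron starts the job in the user's home directory, so 'weatherdata.csv' is created there rather than next to the script. A minimal sketch of the fix, anchoring the CSV to the script's own directory (data_path and append_row are illustrative helpers, not from the original script):

```python
import csv
import os
from datetime import datetime

def data_path(script_file, name="weatherdata.csv"):
    # Build an absolute path beside the script itself, so the result
    # does not depend on cron's working directory (usually $HOME).
    return os.path.join(os.path.dirname(os.path.abspath(script_file)), name)

def append_row(path, values):
    # Append one timestamped row, mirroring the original writer.
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow(
            [datetime.now().strftime("%Y-%m-%d %H:%M:%S"), *values])
```

Inside weatherdata.py this would be called as append_row(data_path(__file__), row).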

Python Script doesn´t work when started via other script

I'm currently working on a Raspberry Pi 4 and wrote a script in Python that sends a mail with a picture, then renames the file and puts it in another folder.
The script works fine when I start it with the command
sudo python script.py
but when I start it with another script, it won't execute the renaming part.
Now the question: what is my mistake?
import os
import time
from sendmail import mail
from sendmail import file_rename
from time import sleep

pic = '/home/pi/Monitor/Bewegung.jpg'
movie = '/home/pi/Monitor/Aufnahme.avi'
archiv = '/home/pi/Archiv/'
time = time.strftime('%d.%m.%Y %H:%M')

mail(filename=pic)
file_rename(oldname=pic, name='Serverraum Bild' + time, format='.jpg', place=archiv)
file_rename(oldname=movie, name='Serverraum Video' + time, format='.avi', place=archiv)
I see that you are starting the script as a user with sudo privileges.
"but when I start it with another script, it won't execute the renaming part"
This makes me suspect that the caller script does not have the correct permissions to rename/move a file. You can view the permissions of the script with the following command:
ls -la callerscript.py
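sendmail.file_rename is the asker's own helper, so its body is not shown; a plausible sketch of such a helper (illustrative, not the original code) would use shutil.move, which also works when source and destination are on different filesystems, where a bare os.rename would fail:

```python
import os
import shutil

def file_rename(oldname, name, format, place):
    # Move oldname to place/<name><format>. Signature mirrors the
    # keyword arguments used in the question; shutil.move copies and
    # deletes if a plain rename is not possible.
    newpath = os.path.join(place, name + format)
    shutil.move(oldname, newpath)
    return newpath
```

If this raises PermissionError when run from the caller script, the permissions theory above is confirmed.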

How to run Scrapy Spider with cron Job Scheduling

I'm new to Python and web scraping, so please excuse my ignorance. In this program, I want to run my spider on a schedule. I use Python 3.7 and macOS.
I wrote a cron job using crontab and called a shell script to run the Scrapy spider. However, it executed only once, with the last line "INFO: Closing spider (finished)". It didn't repeat according to the schedule. I executed a simple Python script to test the schedule, and that worked. It seems this issue occurs only with the spider. Please help me understand how to fix this. Any help would be appreciated. Thank you.
import csv
import os
import random
from time import sleep
import scrapy

class spider1(scrapy.Spider):
    name = "amspider"
    with open("data.csv", "a") as filee:
        if os.stat("data.csv").st_size != 0:
            filee.truncate(0)
        filee.close()

    def start_requests(self):
        list = ["https://www.example.com/item1",
                "https://www.example.com/item2",
                "https://www.example.com/item3",
                "https://www.example.com/item4",
                "https://www.example.com/item5"
                ]
        for i in list:
            yield scrapy.Request(i, callback=self.parse)
            sleep(random.randint(0, 5))

    def parse(self, response):
        product_name = response.css('#pd-h1-cartridge::text')[0].extract()
        product_price = response.css(
            '.product-price .is-current, .product-price_total .is-current, .product-price_total ins, .product-price ins').css(
            '::text')[3].extract()
        print(product_name)
        print(product_price)
        with open('data.csv', 'a') as file:
            itemwriter = csv.writer(file, delimiter=',', quotechar='"', quoting=csv.QUOTE_MINIMAL)
            itemwriter.writerow([str(product_name).strip(), str(product_price).strip()])
            file.close()
amsp.sh
#!/bin/sh
cd /Users/amal/PycharmProjects/AmProj2/amazonspider
PATH=$PATH:/usr/local/bin/
export PATH
scrapy crawl amspider
crontab
I tried both ways, but the spider executed only once:
*/2 * * * * /Users/amal/Documents/amsp.sh
*/2 * * * * cd /Users/amal/PycharmProjects/AmProj2/amazonspider && scrapy crawl amspider
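Since cron gives no terminal feedback, a first debugging step is to capture the job's output; the log path below is an assumption, and the entry is a sketch, not a confirmed fix. Repeated runs (or their failures, e.g. a PATH problem or a non-executable amsp.sh) then show up in the log:

```shell
# Hypothetical crontab entry with output captured for inspection;
# also make sure the wrapper script is executable: chmod +x amsp.sh
*/2 * * * * /Users/amal/Documents/amsp.sh >> /tmp/amspider.log 2>&1
```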

How to make simple interacting with running python script for web?

I have a running loop script in Python which I want to interact with from an HTML page. For example:
(HTML) Clicking a button -> magic -> (Python) run some function in
script.py
This response is not appropriate for this.
You can probably use the Selenium Python bindings for the purpose of interacting with a web page from your Python script.
Selenium link
Example:
from selenium import webdriver
from selenium.webdriver.common.keys import Keys
driver = webdriver.Firefox()
driver.get("http://www.python.org")
assert "Python" in driver.title
elem = driver.find_element_by_name("q")
elem.send_keys("pycon")
elem.send_keys(Keys.RETURN)
assert "No results found." not in driver.page_source
driver.close()
I resolved this problem with a named pipe (FIFO) in Python. The Python script reads the file p1.txt line by line, and server.js sends text. For example, here server.js sends the string 'COMMAND' and we get the script.py response.
script.py
import os, sys

fd = "./p1.txt"
try:
    os.mkfifo(fd)
except OSError:
    pass
rp = os.open(fd, os.O_RDONLY)
# Need to create a new thread for background reading of the file
while True:
    response = os.read(rp, 54)
    if response == b"COMMAND":
        # do
        print('ICATCHYOU')
server.js
var fs = require('fs');

var path = '/home/pi/develop/p1.txt';
var streamOptions = { flags: 'w',
                      encoding: 'utf-8',
                      mode: 0666 };

var afterOpen = function(err, fd) {
    var writeStream = fs.createWriteStream(path, streamOptions);
    writeStream.write("COMMAND");
};

fs.open(path, 'w', afterOpen);
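The FIFO handshake above can be exercised end-to-end in pure Python, with a reader thread standing in for script.py and the main thread standing in for server.js. This is a self-contained sketch of the mechanism, not the original two-process setup:

```python
import os
import tempfile
import threading

fifo = os.path.join(tempfile.mkdtemp(), "p1.txt")
os.mkfifo(fifo)

received = []

def reader():
    # Opening a FIFO for reading blocks until a writer connects;
    # read() then returns everything written before the writer closes.
    with open(fifo, "rb") as rp:
        received.append(rp.read().decode())

t = threading.Thread(target=reader)
t.start()

# Stand-in for server.js: open the pipe for writing and send the command.
with open(fifo, "wb") as wp:
    wp.write(b"COMMAND")

t.join()
print(received[0])  # COMMAND
```

Note that both ends must be open at the same time: a FIFO has no storage of its own, which is why the reader runs in a background thread here.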

Cannot get #reboot Cron job to run Python script

I have a cron job that loads a Python script on reboot, but it just will not work.
I have checked the Python script, and it works fine from the CLI.
The .py basically loads a browser to Google and then sends it to full screen.
(It actually loads another website and enters login details too, but that's removed for obvious reasons.)
I've been at this for weeks now and it's driving me crazy. Any ideas?
Raspberry Pi running Raspbian.
$ crontab -e
@reboot DISPLAY=:0 python /prtgboot.py
prtgboot.py
#!/usr/bin/env python
import commands
import time

webbrowser = "iceweasel"
pgrepcmd = "pgrep %s " % (webbrowser)
process = commands.getoutput(pgrepcmd)

if process == "":
    from selenium import webdriver
    from selenium.webdriver.common.keys import Keys
    from selenium.webdriver import ActionChains
    browser = webdriver.Firefox()
    actions = ActionChains(browser)
    browser.get('http://google.co.uk')
    elemFullscreen = browser.find_element_by_tag_name('html')
    time.sleep(30)
    elemFullscreen.send_keys(Keys.F11)
    exit()
else:
    exit()
OK, so Petesh was correct: it was @reboot not working correctly.
I changed the cron entry to * * * * * so my script runs every minute. Normally bad practice, but the script is already set up to exit if the browser is running. Working a treat now.
On a positive note, if the browser crashes it will start again for me :)
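An alternative to the every-minute workaround, if @reboot itself fires but the X display isn't up yet, is to delay the job. The 60-second delay and interpreter path below are assumptions to adapt, not a verified fix:

```shell
# Hypothetical entry: wait for the desktop session before starting,
# instead of re-running the script every minute.
@reboot sleep 60 && DISPLAY=:0 python /prtgboot.py
```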
