I have created a Subversion post-commit hook to send out an email every time a commit is made. I'm calling a Python script from the post-commit file in /var/svn/repos/hooks:
REPOS="$1"
REV="$2"
~/svnnotify.py $REV
But the problem is that the svn commit command takes longer to return because it waits for the Python script to terminate first. Is there any way around this?
Thank You
Try adding an ampersand (&) at the end of the line that calls your script, to put it in the background so the hook returns immediately.
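For example, a minimal version of the hook (one caveat worth adding: Subversion also waits for the hook's stdout/stderr to close, so redirecting the child's output matters; a bare & can still hang the commit if the backgrounded script inherits the pipes):

REPOS="$1"
REV="$2"
# run the notifier detached so the hook can exit immediately
~/svnnotify.py "$REV" > /dev/null 2>&1 &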
Alternatively, call a wrapper script from the hook and, in that wrapper, launch the Python script in the background by adding an ampersand (&) at the end of the command.
Maybe put the update in a simple queue that gets scooped up by a script invoked from cron, which sends a message if something is sitting in the queue.
The queue could be a simple file in /tmp, an SQLite file, or a MySQL table.
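A minimal sketch of the flat-file variant (the queue path and the send_notification helper are hypothetical): the hook appends one revision number per line, and this script, run from cron, drains the file and sends a single message:

# drain_queue.py - run from cron every minute or so
import os

QUEUE = "/tmp/svn-notify-queue"  # the hook appends to this: echo "$REV" >> /tmp/svn-notify-queue

def drain():
    if not os.path.exists(QUEUE):
        return  # nothing committed since the last run
    work = QUEUE + ".work"
    os.rename(QUEUE, work)  # commits landing from now on start a fresh queue file
    with open(work) as f:
        revisions = [line.strip() for line in f if line.strip()]
    os.remove(work)
    if revisions:
        send_notification(revisions)  # hypothetical: build and send one e-mail

if __name__ == "__main__":
    drain()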
If it's taking noticeably long to send the e-mail, maybe there's something up with the code in the notify script. It shouldn't take that long to put an e-mail in the local mail queue.
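For reference, handing the message to a local MTA should return almost instantly. A sketch of what svnnotify.py might boil down to (the addresses are placeholders, and an MTA listening on localhost is assumed):

import smtplib
import sys
from email.mime.text import MIMEText

rev = sys.argv[1]
msg = MIMEText("Revision %s was committed." % rev)
msg["Subject"] = "SVN commit r%s" % rev
msg["From"] = "svn@example.com"   # placeholder addresses
msg["To"] = "team@example.com"

# queuing with the local MTA is fast; a long delay usually means the script
# is talking to a slow remote SMTP server or stuck on DNS instead
server = smtplib.SMTP("localhost")
server.sendmail(msg["From"], [msg["To"]], msg.as_string())
server.quit()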
I have a Python program that is set up as a Linux service.
This service updates itself by downloading a new version of the code from an FTP server and launching a bash file to update the service.
That file contains a line that destroys the current service before recreating it with the new source code.
I run this bash script with:
subprocess.call("sudo bash /home/pi/install.sh", shell=True)
I understand that this subprocess lives inside my Python program, and the bash script stops the Linux service, which stops the Python program, which stops the script itself... so it never finishes.
What are possible solutions to this problem?
I think there are several ways to do it. One of them (maybe not the most elegant) is to have your Python program schedule a cron job for the bash script using python-crontab.
Say it's 13:00 and you want your job to run: have the Python script schedule the cron job for 13:05 (just to add a time buffer).
You can then remove the cron job after the bash job has run, either manually or from the bash script itself (or have it call a Python script that uses python-crontab to remove the entry; that's fairly easy to do).
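A sketch of the scheduling side with python-crontab (pip install python-crontab; the comment tag and the 13:05 timing are assumptions for the example):

from crontab import CronTab

cron = CronTab(user=True)
job = cron.new(command="sudo bash /home/pi/install.sh",
               comment="one-shot-service-update")  # tag it so it can be found later
job.setall("5 13 * * *")  # it's 13:00 now, so run at 13:05 as a buffer
cron.write()

# later, once install.sh has run, drop the one-shot entry again:
# cron.remove_all(comment="one-shot-service-update")
# cron.write()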
Don't let the script stop the service. Just let it exit with a specific exit code if it installed a new version, and restart the service accordingly in the Python code.
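One way to implement that (a sketch; the exit code is arbitrary, and it assumes the service manager, e.g. a systemd unit with Restart=always, relaunches the program after it exits):

import subprocess
import sys

# install.sh no longer stops the service; it only swaps in the new code
rc = subprocess.call(["sudo", "bash", "/home/pi/install.sh"])
if rc == 85:  # arbitrary "new version installed" code returned by install.sh
    sys.exit(0)  # exit cleanly; the service manager restarts us on the new code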
Question about Activiti 5.17.0.
I'm researching how to call an external REST API and thought a Shell Task running Python might be a good solution, but it freezes the process in Activiti Explorer.
Is there any better way?
I read the manual here (Shell Task):
http://www.activiti.org/userguide/#bpmnShellTask
I found a working sample:
https://github.com/Activiti/Activiti/blob/master/modules/activiti-engine/src/test/resources/org/activiti/examples/bpmn/shell/ShellTaskTest.testEchoShellWindows.bpmn20.xml
It works with a simple Windows command (e.g. echo), but running Python freezes the process and the browser.
It waits forever (or until timeout) for the process to finish; the browser shows a waiting icon and Activiti Explorer stops responding.
It behaves as follows:
OK: simple DOS command
OK: simple batch file
OK: batch file calling another batch file (note: without using "call")
The following do not work and cause a freeze:
NG: python (even just --version)
NG: batch file calling Python
NG: batch file calling another batch file with the "call" command
I've tried setting the "wait" option (default: true) to false. The process then returns, but no result value gets populated.
Is there any workaround, or a better way to call an external REST API from Activiti? Any advice is helpful.
Thank you,
Naoki
I am using Python 2.7.
I am using the paramiko module to trigger a remote action. The process is long, typically running for 15-20 minutes, and at the end it generates a file locally.
I would rather not block a thread until the process is finished. Is there a way I can trigger another action once the file is generated?
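One standard-library approach (a sketch; the path and the callback are hypothetical): start the remote command with paramiko's exec_command, which returns immediately, then watch for the file from a daemon thread and fire a callback when it appears:

import os
import threading
import time

def watch_for_file(path, callback, poll_seconds=5):
    """Poll in the background until path exists, then hand it to callback."""
    def _poll():
        while not os.path.exists(path):
            time.sleep(poll_seconds)
        callback(path)
    t = threading.Thread(target=_poll)
    t.daemon = True  # don't keep the interpreter alive just for the watcher
    t.start()
    return t

def on_file_ready(path):  # hypothetical follow-up action
    print("file ready: %s" % path)

# kick off the remote action with paramiko here, then:
watch_for_file("/tmp/remote_job_output.dat", on_file_ready)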
I have created a simple "cron job" (scheduled task) under Windows to run a very simple Python scrape:
scrape webpage
store data into csv file
close file
The scheduled job worked fine under Windows, without any problems. All of a sudden it stopped working and my output file is no longer updated. When I run the scrape manually (double-clicking the Python file or running it from the IDE), I get my "everything ok" debug output in a Windows dialog and the target file gets updated. During the scheduled run I can see the same debug window pop up and print "everything ok", but the file is just not updated.
I tried rebooting my machine and closing/reopening all programs, and I created a new task in the scheduler, but it still wouldn't work. Any suggestions?
Thanks
Peter
After playing around with all the potential settings in Windows and many wasted hours, I found a workaround.
My initial setup in the Task Scheduler called python.exe with my python.py file passed as an argument, just as described here: http://blogs.esri.com/esri/arcgis/2013/07/30/scheduling-a-scrip/. The change that finally made my job work again was to stop passing the Python file as an argument and call the Python file directly instead. I'm not sure why the initial setup stopped working, but this helped. I hope you don't get into the same time-wasting situation.
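In Task Scheduler terms, the change amounts to something like this (paths made up for illustration):

Before (stopped working):
Program/script: C:\Python27\python.exe
Add arguments: C:\scrape\scrape.py
After (works):
Program/script: C:\scrape\scrape.py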
I am using Debian and I have a Python script that I would like to run from rc.local so that it runs on boot. I already have it working with a test file that is meant to run and terminate.
The problem is that the real script should eventually run indefinitely using Scheduler. Its job is to do serial reads, a small amount of processing on those reads, and inserts into a MySQL database. However, I am nervous about not being able to cancel the script to get to my login prompt if changes need to be made, since I was unable to terminate the test script early using Ctrl+C (^C).
My hope is that there is some command I am simply missing that will accomplish this. Is there another key combination that will terminate the Python script and end rc.local?
Thanks.
EDIT: Another possible solution would be a way to start the Python script in the background during boot, so that it keeps running while still allowing login.
I'm starting to think this isn't possible, so other suggestions for accomplishing something similar would be helpful as well.
Thanks again.
Seems like it was just a dumb mistake on my part.
I realized the whole point of this was to let the Python script run as a background process during boot, so I added " &" to the end of the script call, just as you would when running it from the shell, and voilà, I can get to my password prompt by pressing Enter.
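For example, the relevant line in /etc/rc.local would look something like this (script path hypothetical; keep it above the final exit 0):

/usr/bin/python /home/pi/serial_reader.py &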
I wanted to put this answer here just in case it turns out to be something horribly wrong to do, but it accomplishes what I was looking for.
Making scripts run at boot time with Debian
Put your script in /etc/init.d/. So, if your script is in a file called my_script, it should be located at /etc/init.d/my_script.
Run update-rc.d my_script defaults as root.
Don't forget to make your script executable and include the shebang. That means the first line of the script should be #!/usr/bin/python.
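Putting those steps together (run as root; my_script stands in for your file):

cp my_script /etc/init.d/my_script
chmod +x /etc/init.d/my_script     # make it executable
update-rc.d my_script defaults     # install the rc symlinks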