Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
This question does not appear to be about a specific programming problem, a software algorithm, or software tools primarily used by programmers. If you believe the question would be on-topic on another Stack Exchange site, you can leave a comment to explain where the question may be able to be answered.
Closed 9 years ago.
I have a Raspberry Pi that uses a USB wireless adapter and wicd-curses. A Python script that uses a WebSocket runs in the background. When I call sudo reboot, my Python script gets the signal to shut down (SIGTERM) about 20 seconds later. (I don't know why the machine takes 20 seconds to reboot at all; I don't remember it being this way before I installed wicd-curses.)
By the time those 20 seconds have passed, wicd-curses has already disconnected from the wireless network, so my Python script cannot close the WebSocket connection cleanly. The core of my question is this: what Python mechanisms are available to ensure that my script is notified of the system shutdown earlier than it is now? Is there any sort of event I can listen for? Preferably, I want to be able to run the script on demand (python myscript.py) without turning it into a daemon or service or whatever it is called in the Linux world. Thank you.
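For the signal side of this, a SIGTERM handler is the standard hook for running cleanup before exit; it only helps if the signal arrives while the network is still up, but it is the piece the script itself controls. A minimal sketch, where FakeWS is a stand-in for whatever WebSocket object the script actually holds:

```python
import signal
import sys

# FakeWS is a placeholder for the real WebSocket client object.
class FakeWS:
    def __init__(self):
        self.closed = False

    def close(self):
        self.closed = True

ws = FakeWS()

def handle_term(signum, frame):
    # Tear down the connection while (hopefully) the network is still up.
    ws.close()
    sys.exit(0)

# Run the cleanup when the system delivers SIGTERM at shutdown.
signal.signal(signal.SIGTERM, handle_term)
```

The ordering problem (wicd-curses dropping the network before SIGTERM arrives) cannot be fixed from inside the script; the handler only shortens the script's own reaction time.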
I'm trying to optimize a Python test that involves writing some 100k files to a mounted directory using SSH and a simple Bash command.
I'm rather inexperienced at this, so I need some advice on how to minimize the IO time.
Basically, the Python script mounts a directory from a remote server (call it %MOUNTED_DIRECTORY%), then SSHes into the remote host and runs the following Bash command there:
for number in `seq 1 100000`; do touch %MOUNTED_DIRECTORY%/test_file$number; done
I find that a lot of time is spent on this process, waiting for the creation of the files to finish. I need the files to exist before I continue, so I can't do anything else in the meantime; I have to speed up the process itself.
Also, when the directory is mounted it takes much longer to finish than when it isn't, which is why I'm in this situation in the first place.
I thought about multithreading or multiprocessing, but they don't seem to help, either because I'm doing something wrong or because the command actually runs on a remote host and creates the files with Bash, not Python.
With xargs:
seq 1 100000 | sed 's|^|%MOUNTED_DIRECTORY%/test_file|' | xargs touch
This passes as many file names as possible to each invocation of touch, so only a handful of processes are spawned instead of 100,000.
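For completeness, the same batching idea driven from Python (locally, without the SSH hop; the directory and the 1,000-file count here are purely illustrative):

```python
import os
import subprocess
import tempfile

# Create many empty files with a single xargs/touch pipeline instead of
# spawning one touch process per file, as the original loop does.
target = tempfile.mkdtemp()
names = "\n".join(os.path.join(target, f"test_file{i}") for i in range(1, 1001))
subprocess.run(["xargs", "touch"], input=names, text=True, check=True)
print(len(os.listdir(target)))  # 1000
```

Over a mounted filesystem, though, the per-file metadata round trips usually dominate, so creating the files on the remote host (as the question already does) and batching the touch calls is the main win.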
I am running Python code on a WiPy board (MicroPython environment) and Python code on an embedded Linux system. The WiPy board is connected to the Linux system over WiFi. I was wondering how to create bi-directional communication for passing data between the two independent scripts.
I have looked into both threading and multiprocessing, but I do not know if either would be appropriate here, so I am just looking for a conceptual answer that gives me someplace to start.
Threading and multiprocessing have nothing to do with your problem; they are about running multiple programs, or parts of programs, on the same system.
What you want is to use the network to send and receive messages. Please read the WiPy documentation:
the part about your WiFi connection
the part about TCP sockets
The part about TCP sockets should be exactly what you need; the part about WiFi connections will tell you how to adjust the WiFi settings of your board.
The same goes for your embedded Linux system: look for documentation on your system and check the chapter about sockets. I would open a server on one of the devices (or both) and use the other device to connect to it and exchange whatever data each system needs. It might be a good idea to use the device with more resources as the server.
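A minimal sketch of that client/server split in CPython sockets. The thread is used only so both ends fit into one runnable script; in reality the server would run on one device and the client on the other (the WiPy side would use MicroPython's socket module), and the host, port, and message are placeholders:

```python
import socket
import threading

HOST, PORT = "127.0.0.1", 50007  # placeholders for the real addresses

ready = threading.Event()

def serve_once():
    """Accept one connection and reply to the message with an ack prefix."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, PORT))
        srv.listen(1)
        ready.set()  # let the client know the server is listening
        conn, _ = srv.accept()
        with conn:
            data = conn.recv(1024)
            conn.sendall(b"ack: " + data)

server = threading.Thread(target=serve_once)
server.start()

# Client side: connect, send a reading, wait for the acknowledgement.
ready.wait()
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.connect((HOST, PORT))
    cli.sendall(b"sensor reading")
    reply = cli.recv(1024)
server.join()
print(reply.decode())  # ack: sensor reading
```

Bi-directional traffic falls out naturally: once the connection is established, both sides can sendall and recv on it.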
I have a python script that I typically kick off manually. It then runs for several days collecting information until it gets disconnected.
What I want to do is have the Windows scheduler start the job at 12:01 AM and then terminate it at 11:58 PM that same day.
Typically I would do this with cron in the Linux world, but I am unsure how to do it in Windows Task Scheduler.
Thank you for your help!
Open Task Scheduler in Windows: Control Panel -> Administrative Tools -> Task Scheduler
Click 'Create Basic Task' (to start the script)
Set the trigger time
Set Program/Script = [full path to python.exe]
Add Arguments = [full path to your scheduled python script]
Click 'Create Basic Task' again (to end the script)
Set the trigger time
Set Program/Script = taskkill and Add Arguments = /f /im python.exe (note that /im takes the image name, not a full path)
This guide goes into more detail.
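An alternative sketch that avoids the taskkill task entirely: let the script watch the clock itself and exit at the cutoff. The names past_cutoff, collect, and run are illustrative, with collect standing in for the real data-collection step:

```python
import datetime
import time

def past_cutoff(now, cutoff=datetime.time(23, 58)):
    """True once the wall clock reaches the daily cutoff time."""
    return now.time() >= cutoff

def collect():
    time.sleep(1)  # stand-in for one unit of the real data collection

def run():
    # Do one unit of work, then re-check the clock before the next.
    while not past_cutoff(datetime.datetime.now()):
        collect()
```

With this shape only the 12:01 AM start task is needed, and the script shuts down cleanly instead of being force-killed mid-write.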
I made a simple crawler in Python that runs in an infinite loop, so it can't stop on its own.
With a random delay of 17-30 seconds, the crawler fetches the same page, finds the 'href' links that are updated periodically, and stores them in MySQL.
I used an Ubuntu server and started it with
$ nohup python crawer.py &
so the crawler was running in the server's background.
It ran for about 4 hours, I think, but then suddenly stopped.
The next day I tried again, and it worked fine!
What is the problem? Is the web page blocking me? Or does the nohup command have a time limit?
Thanks a lot.
No, nohup will do what it's designed to do. That is:
The nohup utility invokes utility with its arguments and at this time sets the signal SIGHUP to be ignored. If the standard output is a terminal, the standard output is appended to the file nohup.out in the current directory. If standard error is a terminal, it is directed to the same place as the standard output.
Some shells may provide a builtin nohup command which is similar or identical to this utility. Consult the builtin(1) manual page.
Bash's (and other shells') & will background the task. nohup with & effectively lets the process keep running in the background even while you terminate your tty/pty session.
I believe the problem is that your Python program is crashing. You should invest some time in logging and find out, e.g.:
nohup python my_app.py > myapp.log 2>&1 &
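A sketch of what that logging could look like inside the crawler loop itself, so a crash leaves a traceback behind even when the process is detached. fetch_links is a placeholder for the real page fetch:

```python
import logging
import random
import time

# Everything goes to crawler.log with a timestamp per line.
logging.basicConfig(filename="crawler.log", level=logging.INFO,
                    format="%(asctime)s %(levelname)s %(message)s")

def fetch_links():
    return ["http://example.com/a"]  # stand-in for the real page fetch

def crawl_once():
    """One iteration: fetch and log; never let an exception go unrecorded."""
    try:
        links = fetch_links()
        logging.info("found %d links", len(links))
        return len(links)
    except Exception:
        logging.exception("crawl iteration failed")  # full traceback to the log
        return 0

def main():
    while True:
        crawl_once()
        time.sleep(random.uniform(17, 30))  # the original random delay
```

If the log ends with a traceback (a network timeout or a lost MySQL connection are common culprits for a crawler dying after a few hours), that is your answer; nohup itself imposes no time limit.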
I need to serve a static website. So I connected via SSH to a local server where I want to host it, then used Python to serve it:
$ python -m http.server 55550
But if I close the terminal, the Python process is terminated. I want to shut down my own computer but leave the process running on that local server, so other people can still access the website.
How can I do this? And how should I terminate the process later?
Thanks for any help.
Use the nohup utility:
nohup python -m http.server 55550
To terminate the process, simply kill it using the kill command, just like any other process.
You can also launch it in the background:
python -m http.server 55550 &
then enter
disown
to detach the process from the current terminal.
Or use screen:
screen
python -m http.server 55550
press Ctrl+A, then press D (to detach)
exit
shut down your computer
...
start your computer
ssh into your server
screen -r (to reattach)