I have looked at this question, but I am not sure whether I understood it correctly.
I have opened PyCharm and one Python script, and it is running (it does topic modeling).
I have also opened another Python script in another PyCharm window on the same server, and I am running it too.
So now these two programs are running on the same server. I should mention that I have not changed any configuration, neither on the server nor in PyCharm.
Is it OK to do it this way? Or will one script technically not run (I mean it shows as running but makes no actual progress) until the other script has finished?
Edit Configurations -> Allow parallel run. Done
First, PyCharm will create independent processes on the server, so both scripts will run. You can check this with something like htop: search for the processes and verify that they're running.
Second, you don't have to open a second PyCharm window to run the second script. You can run both of them from a single one. There are at least two ways: with run configurations, or by spawning multiple terminal windows and running the scripts from there.
From the Run/Debug Configurations window you can add a Compound configuration that contains multiple configurations to run in parallel. The Allow parallel run option for the child configurations makes no difference in this case.
The default behaviour was changed starting from version 2018.3. You can allow multiple runs by selecting Allow parallel run within the Edit Configurations menu.
Related
I have a python3.9 script I want to have running 24/7. In it, I use python-daemon to keep it running like so:
import daemon

with daemon.DaemonContext():
    %%script%%
And it works fine but after a few hours or days, it just crashes randomly. I always start it with sudo but I can't seem to figure out where to find the log file of the daemon process for debugging. What can I do to ensure logging? How can I keep the script running or auto-restart it after crashing?
You can find the full code here.
If you really want to run a script 24/7 in background, the cleanest and easiest way to do it would surely be to create a systemd service.
There are already many descriptions of how to do that, for example here.
One of the advantages of systemd, in addition to being able to launch a service at startup, is that it can restart the service after a failure:
Restart=on-failure
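As a minimal sketch, a unit file (say /etc/systemd/system/myscript.service; the name and paths are hypothetical) could look like this:

[Unit]
Description=Long-running Python script

[Service]
ExecStart=/usr/bin/python3 /path/to/script.py
Restart=on-failure

[Install]
WantedBy=multi-user.target

Note that under systemd you would drop the daemon.DaemonContext wrapper, since systemd itself puts the process in the background. A side benefit: systemd captures the service's stdout/stderr in the journal, so you can inspect crash output with journalctl -u myscript instead of hunting for a daemon log file.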
If all you want to do is automatically restart the program after a crash, the easiest method would probably be to use a bash script.
You can use the until loop, which is used to execute a given set of commands as long as the given condition evaluates to false.
#!/bin/bash
until python /path/to/script.py; do
    echo "The program crashed at `date +%H:%M:%S`. Restarting the script..."
done
If the command returns a non-zero exit status, the script is restarted.
I would start by familiarizing myself with these two questions:
How to make a Python script run like a service or daemon in Linux
Run a python script with supervisor
Looks like you need a supervisor that will make sure that your script/daemon is still running. You can take a look at supervisord.
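On the logging part of the question: python-daemon's DaemonContext accepts stdout and stderr file objects, so crash tracebacks can at least be captured in a file. A minimal sketch (the log path is hypothetical):

import daemon

# Hypothetical log path; the daemon user needs write permission to it
log_file = open("/var/log/mydaemon.log", "a+")

with daemon.DaemonContext(stdout=log_file, stderr=log_file):
    main()  # your long-running entry point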
I'm using Python's subprocess to spawn new processes. The processes are independent of each other and output some of the data related to the account creation.
import subprocess
from time import sleep

for token in userToken:
    p = subprocess.Popen(['python3', 'create_account.py', token])
    sleep(1)
I'm trying to find a way to have each of the Python scripts run in a separate VS Code terminal so that I can clearly see how each process is running.
For example, in VSCode you can split the terminals as in the screenshot below. It would be great if each of the processes would have its own terminal window.
I've also checked that you can run tasks in VSCode in separate terminals as described here. Is there a way to launch multiple subprocesses in separate terminals like that?
If that's not possible, is there another way I can run subprocess in multiple terminals in VSCode?
Currently, VS Code runs Python code in a single terminal by default.
If you want to run Python code in two or more VS Code terminals separately, rather than sequentially, you can manually enter the run command in each terminal, for example:
The command to run the Python file 'c.py': "..:/.../python.exe ..:/.../c.py".
Apart from manually entering the run commands in two or more newly created terminals so that the scripts run at the same time, VS Code currently has no other built-in support for this.
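If separate OS console windows (rather than VS Code's integrated terminals) are acceptable, one workaround on Windows is subprocess's CREATE_NEW_CONSOLE flag, which gives each child process its own console. A minimal sketch, reusing the loop from the question:

import subprocess
from time import sleep

for token in userToken:  # userToken as defined in the question
    # Windows only: each child process opens in its own console window
    subprocess.Popen(
        ['python', 'create_account.py', token],
        creationflags=subprocess.CREATE_NEW_CONSOLE,
    )
    sleep(1)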
I have submitted a feature request for this on GitHub, and we look forward to it being implemented:
GitHub link: Can VSCode automatically run python scripts in two or more terminals at the same time?
I've written a lot of Python scripts. Now I want to run them on another computer that runs non-stop, crawling and analyzing data and updating an SQL database.
Normally I open a command prompt and run the scripts:
python [script directory]
But with many scripts I have to open many command prompts, and every script starts its own Python interpreter, so it ends up a huge mess using a lot of memory.
What should I do to manage these scripts?
You haven't specified what OS your server is, but assuming that it's a Linux server you should probably research a process management tool such as Supervisord or Systemd. These are tools designed to run and monitor your program automatically, and even restart it if it crashes.
If you're using Ubuntu 16.04 then it comes with Systemd out of the box, however I personally find Supervisord easier to configure and use for simple tasks.
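As a minimal sketch, a Supervisord program section (the name and paths are hypothetical) could look like this:

[program:crawler]
command=/usr/bin/python3 /path/to/crawler.py
autostart=true
autorestart=true
stdout_logfile=/var/log/crawler.out.log
stderr_logfile=/var/log/crawler.err.log

One such section per script keeps all of them under a single supervisord instance, which also addresses the "many command prompts" problem.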
These programs won't necessarily help with your memory consumption issues, however. Sure, you can place caps on memory use for a process, but that's not really going to help if it stops your program from working. You're probably best off re-evaluating your code for ways to reduce its memory footprint, or using a server with more RAM.
EDIT:
You've just added that the OS is Windows 10, which makes the above irrelevant. You can use the Windows Task Scheduler to automatically execute long running tasks.
You can use pythonw *.py, and it will run in the background (pythonw runs without a console window on Windows).
I have a Python script that uses the paramiko module to run various Linux commands on a remote server over SSH. All the outputs are saved to a text file, and the script runs properly. Now I want to run this script automatically twice a day, at 11am and 5pm.
How can I run this script automatically every day at the given times without starting it manually each time? Is there any software or module for this?
Thanks for your help.
If you're running Windows, your best bet would be to create a Scheduled Task to execute Python itself, passing the path to your script as an argument.
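As a sketch, you could create two daily tasks from an elevated command prompt with schtasks (the task names and paths are hypothetical):

schtasks /Create /TN "SshReportAM" /TR "C:\Python39\python.exe C:\scripts\ssh_report.py" /SC DAILY /ST 11:00
schtasks /Create /TN "SshReportPM" /TR "C:\Python39\python.exe C:\scripts\ssh_report.py" /SC DAILY /ST 17:00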
If you're using OSX or Linux, CRON is your friend. There are references abound for how to create scheduled events in crontab. This is a good start for setting up CRON tasks.
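For the 11am and 5pm schedule from the question, the crontab entry (interpreter and script paths hypothetical) is a single line:

0 11,17 * * * /usr/bin/python3 /path/to/ssh_script.py

Edit the crontab with crontab -e for the account that should run the job.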
One thing to mention is permissions. If you're running this from a Linux machine, you'll want to ensure you set up the CRON job to run under the right account (best practice not to use your own).
Assuming you are running on a *nix system, cron is definitely a good option. If you are running a Linux system that uses systemd, you could try creating a timer unit. It is probably more work than cron, but it has some advantages.
I won't go through all the details here, but basically:
Create a service unit that runs your program.
Create a timer unit that activates the service unit at the prescribed times (see the sketch after this list).
Start and enable the timer unit.
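As a minimal sketch of step 2, assuming the service unit is called myscript.service, the matching myscript.timer could look like this:

[Unit]
Description=Run myscript at 11:00 and 17:00

[Timer]
OnCalendar=*-*-* 11:00:00
OnCalendar=*-*-* 17:00:00
Persistent=true

[Install]
WantedBy=timers.target

Then systemctl enable --now myscript.timer covers step 3. Persistent=true also runs a missed activation after the machine was off, which is one of the advantages over cron.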
I'm developing a script that runs a program with other scripts over and over for testing purposes.
How it currently works is I have one Python script which I launch. That script calls the program and loads the other scripts. It kills the program after 60 seconds to launch the program again with the next script.
For some scripts, 60 seconds is too long, so I was wondering whether I could set a FLAG variable (not in the main script) such that when a script finishes it sets FLAG, and the main script can read FLAG and kill the process?
Thanks for the help; my writing may be confusing, so please let me know if anything is unclear.
You could use atexit to write a small file (flag.txt) when script1.py exits. mainscript.py could regularly check for the existence of flag.txt and, when it finds it, kill program.exe and exit.
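A minimal sketch of that idea, using the names from above (flag.txt, script1.py, mainscript.py, program.exe):

# script1.py: create the flag file on exit, however the script ends
import atexit
import pathlib

atexit.register(lambda: pathlib.Path("flag.txt").touch())
# ... rest of script1.py ...

# mainscript.py: launch the program, poll for the flag, then kill the process
import os
import subprocess
import time

proc = subprocess.Popen(["program.exe"])
while not os.path.exists("flag.txt"):
    time.sleep(0.5)
proc.kill()
os.remove("flag.txt")  # clean up so the next run starts fresh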
Edit:
I've set persistent environment variables using this, but I only use it for Python-based installation scripts. Usually I'm pretty shy about messing with the registry. (This is for Windows, by the way.)
This seems like a perfect use case for sockets, in particular asyncore (note that asyncore has since been deprecated and was removed in Python 3.12; the plain socket module is the safer choice today).
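A minimal sketch of the socket approach (the port number is arbitrary; plain socket is used instead of asyncore):

# mainscript.py: wait for a completion signal instead of a fixed 60 seconds
import socket
import subprocess

srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 50007))  # hypothetical local port
srv.listen(1)

proc = subprocess.Popen(["program.exe"])
conn, _ = srv.accept()  # blocks until the test script connects
conn.close()
proc.kill()

# In the test script, signal completion with:
# socket.create_connection(("127.0.0.1", 50007)).close()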
You cannot use environment variables in this way. As you have discovered, the change is not persistent after the setting application completes.