Spawning external processes from Jenkins - spawned but not executing - python

I am attempting to launch a couple of external applications from a Jenkins build step on Windows 7 64-bit. They are essentially programs designed to interact with each other and perform a series of regression tests on some software. Jenkins is run as a Windows service under a user account with admin privileges on the machine. I think that's full disclosure on any weirdness with my Jenkins installation.
I have written a Python 3 script that successfully does what I want when run from the Windows command line. However, when I run this script as a Jenkins build step, I can see via the Task Manager that the applications have been spawned, but there is no CPU activity associated with them and no other evidence that they are actually doing anything (they normally produce log files, etc., but none of these appear). One of the applications typically runs at 25% CPU during the course of the regression tests.
The Python script itself runs to completion as if everything is OK, and Jenkins correctly monitors the output of the script, which I can watch in the job's console output. I'm using os.spawnv(os.P_NOWAIT, ...) for each of the external applications. The subprocess module doesn't do what I want here; I just want these programs to run externally.
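For context, the calls are of this shape (the application path and arguments below are placeholders, not the real test tools):

import os

exe = r"C:\tools\regression_app.exe"  # placeholder path
# P_NOWAIT returns immediately with the new process's PID instead of waiting for it to exit
pid = os.spawnv(os.P_NOWAIT, exe, [exe, "--run-tests"])
print("spawned pid", pid)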
I've even run a bash script via Cygwin that functionally does the same thing as the Python script with the same results. Any idea why these applications spawn but don't execute?
Thanks!

Related

How do you schedule some python scripts to run regularly on a Windows PC?

I have some Python scripts that I'd like to run daily from a Windows PC.
My current workflow is:
The desktop PC stays on all day, every day, except for a weekly restart over the weekend
After the restart I open VS Code and run a little bash script ./start.sh that kicks off the tasks.
The above works reasonably well, but it is also fairly painful. I need to re-run start.sh whenever I close VS Code (e.g. for an update). Also, the processes use some local Python libraries, so I need to stop them whenever I want to update those libraries.
With regards to how to do this properly, 4 tools came to mind:
Windows Scheduler
Airflow
Prefect (https://www.prefect.io/)
Rocketry (https://rocketry.readthedocs.io/en/stable/)
However, I can't quite get my head around the fundamental issue that if Prefect/Airflow/Rocketry run on my PC, then nothing will restart them after the PC reboots. I'm also not sure these tools will give me the isolation I'd prefer.
Docker came to mind; I could put each task into a Docker image and run them via some form of Docker swarm or something like that. But I'm not sure if I'm re-inventing the wheel.
I'm 100% sure I'm not the first person in this situation. Could anyone point me to a guide on how this could be done well?
Note:
I am not considering running the Python scripts in the cloud. They interact with local tools that are only licensed for my PC.
You can definitely use Prefect for that - it's very lightweight and seems to match what you're looking for. You install it with pip install prefect and start the Orion API server with prefect orion start; once you create a Deployment and start an agent with prefect agent start -q default, you can even configure the schedule from the UI.
For more information about Deployments, check our FAQ section.
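As a rough illustration (not the asker's actual code), a minimal Prefect 2.x flow might look like this; the function names are placeholders:

from prefect import flow, task

@task
def run_daily_job():
    print("doing the daily work")  # placeholder for the real script logic

@flow
def daily_flow():
    run_daily_job()

if __name__ == "__main__":
    daily_flow()

A Deployment built from a flow like this can then be given a schedule and picked up by the agent started with prefect agent start -q default.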
It sounds like Rocketry could also be suitable. Rocketry can shut itself down using a task. You could create a task that:
Runs on the main thread and process (blocking the start of new tasks)
Waits for or terminates all the currently running tasks (use the session)
Calls session.shut_down(), which sets a shutdown flag for the scheduler.
There is also an app configuration, shut_cond, which is simply a condition: if this condition is true, the scheduler exits, so alternatively you can use that.
Then, after the line app.run(), you simply have a line that runs the shutdown -r (restart) command in a shell, using the subprocess library for example. Then you need something that starts Rocketry again once the restart is completed. For this, perhaps this could be an answer: https://superuser.com/a/954957, or use the Windows scheduler to create a simple startup task that starts Rocketry.
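A minimal sketch of that restart pattern, assuming a Rocketry app object named app and Windows' shutdown command (task definitions omitted):

import subprocess
from rocketry import Rocketry

app = Rocketry()

# ... task definitions go here ...

if __name__ == "__main__":
    app.run()  # blocks until the scheduler shuts itself down
    # once Rocketry has exited, ask Windows to reboot
    subprocess.run(["shutdown", "/r", "/t", "0"])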
Especially if you had Linux machines (Raspberry Pis, for example), you could integrate Rocketry with FastAPI and make a small cluster in which the Rocketry apps communicate with each other; just set the script with Rocketry up as a startup service. One machine could be a backup that calls another machine's API, which runs the Linux restart command. Then the backup executes tasks until the primary machine responds to requests again (is up and running).
As the author of the library, I'm possibly biased toward my own projects, but Rocketry is very capable with complex scheduling problems; that's the purpose of the project.
You can use schtasks on Windows to schedule tasks like running a bash or Python script, and it's pretty reliable too.
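For example, a daily run could be registered with something like the following (the task name, script path, and time are placeholders):
schtasks /create /tn "DailyPythonJob" /tr "python C:\scripts\start_tasks.py" /sc daily /st 09:00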

Automation of Powershell script, that invokes a Python script, doesn't work (via Task Scheduler)

I am trying to create a bunch of automations on my Windows PC, and I have encountered some obstacles while trying to automate a PowerShell script with Windows Task Scheduler.
Right now, I have managed to set up Task Scheduler to run the actual script, but although the whole script works as intended when I start it manually, it doesn't work right when invoked via Task Scheduler.
The PowerShell script is very simple; it is meant to invoke a certain Python script. I also created a log transcript for testing purposes.
python dataset_creator.py --to_import yes --to_export yes
Start-Transcript -Path "<path>\transcript0.txt"
When invoked manually, the Python script works, but via Task Scheduler the only part that works is the transcript creation (ergo, no Python script is run). Task Scheduler itself reports that the task executed properly. This is how I set up the action:
Program: %SystemRoot%\system32\WindowsPowerShell\v1.0\powershell.exe
Arguments: -NoProfile -NoLogo -NonInteractive -ExecutionPolicy Bypass -File "C:\<path>\powershell_script.ps1"
For now, I have set it to run under the SYSTEM account; I have also tried my admin user account, but that doesn't work either.
Could you suggest potential issues? I have scrolled through several articles and nothing works. I also tried to skip the PowerShell part (i.e. set up the task to rely only on a Python action), but that hasn't worked so far, and I initially wanted to put a number of Python script invocations into a single PowerShell file.
I am also open to suggestions for Windows Task Scheduler alternatives.

What are good ways to deploy and manage Python scripts on a production server?

I've written a lot of Python scripts. Now I want to run them on another computer that runs non-stop, crawling and analyzing data and updating an SQL database.
Normally I open a command prompt and run the scripts:
python [script directory]
But with many scripts I have to open many command prompts, and every script starts its own Python interpreter, so it ends up a huge mess using a lot of memory.
What should I do to manage these scripts?
You haven't specified what OS your server is, but assuming that it's a Linux server you should probably research a process management tool such as Supervisord or Systemd. These are tools designed to run and monitor your program automatically, and even restart it if it crashes.
If you're using Ubuntu 16.04 then it comes with Systemd out of the box, however I personally find Supervisord easier to configure and use for simple tasks.
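As a rough sketch, a Supervisord program entry for one of the scripts might look like this (the program name and path are placeholders):

[program:crawler]
command=python /srv/scripts/crawler.py
autostart=true
autorestart=true

Supervisord then keeps the script running and restarts it if it crashes.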
These programs won't necessarily help with your memory consumption issues, however. Sure, you can place caps on memory use for a process, but that's not really going to help if it stops your program from working. You're probably best off re-evaluating your code and looking for ways to reduce its memory footprint, or using a server with more RAM.
EDIT:
You've just added that the OS is Windows 10, which makes the above irrelevant. You can use the Windows Task Scheduler to automatically execute long-running tasks.
You can use pythonw *.py, and it will run in the background.

Script to run at startup on Windows RT

I have a Python script which basically launches an XML-RPC server. I need this script to run at all times. I have a function that may call for the system to reboot itself, so once the system has rebooted, I need to get the script running again.
How can I add this to the Windows RT startup?
There is no way for a Windows Store app to trigger a system reboot, nor can it run a Python script unless you implement a Python interpreter inside your app. Windows Store apps run in their own sandbox and have very limited means of communication with the rest of the system.
You can create a scheduled task to run at login or boot. The Run registry key and the Startup folder do not function on Windows RT, but the Task Scheduler does.
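For example, assuming the schtasks command is available on your device, a boot-time task could be registered with something like this (the task name and script path are placeholders):
schtasks /create /tn "XmlrpcServer" /tr "python C:\scripts\xmlrpc_server.py" /sc onstart /ru SYSTEM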
Off topic (since I seem unable to add a comment to the other answer): a copy of Python has been ported over to Windows RT using the jailbreak.

Execute remote python script via SSH

I want to execute a Python script on several (15+) remote machines using SSH. After invoking the script/command, I need to disconnect the SSH session and keep the processes running in the background for as long as they are required.
I have used Paramiko and PySSH in the past, so I have no problem using them again. The only thing I need to know is how to disconnect an SSH session in Python (since normally the local script would wait for each remote machine to complete processing before moving on).
This might work, or something similar:
ssh user@remote.host nohup python scriptname.py &
Basically, have a look at the nohup command.
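Since you mention Paramiko, a rough equivalent from Python might look like this (the host, user, and script name are placeholders, and key-based authentication is assumed):

import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect("remote.host", username="user")
# nohup plus & lets the remote script keep running after the session closes
client.exec_command("nohup python scriptname.py > /dev/null 2>&1 &")
client.close()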
On Linux machines, you can run the script with 'at'.
echo "python scriptname.py" | at now
If you are going to perform repetitive tasks on many hosts, for example deploying software and running setup scripts, you should consider using something like Fabric:
Fabric is a Python (2.5 or higher) library and command-line tool for streamlining the use of SSH for application deployment or systems administration tasks.
It provides a basic suite of operations for executing local or remote shell commands (normally or via sudo) and uploading/downloading files, as well as auxiliary functionality such as prompting the running user for input, or aborting execution.
Typical use involves creating a Python module containing one or more functions, then executing them via the fab command-line tool.
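A minimal fabfile along those lines might look like this (Fabric 1.x style; the host names and script name are placeholders):

from fabric.api import env, run

env.hosts = ["user@host1", "user@host2"]

def start_script():
    # nohup keeps the remote process alive after the SSH session closes
    run("nohup python scriptname.py > /dev/null 2>&1 &", pty=False)

You would then invoke it across all hosts with fab start_script.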
You can even use tmux in this scenario.
As per the tmux documentation:
tmux is a terminal multiplexer. It lets you switch easily between several programs in one terminal, detach them (they keep running in the background) and reattach them to a different terminal. And do a lot more
From a tmux session, you can run a script, quit the terminal, log in again, and check back on it, as tmux keeps the session alive until the server restarts.
How to configure tmux on a cloud server
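A typical interaction might look like this (the session and script names are placeholders):
tmux new -s regression        # start a named session
python scriptname.py          # run the script inside it
# detach with Ctrl-b d, log out, then later:
tmux attach -t regression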
