I'd like to run a Python script that uses an external library (BeautifulSoup) on a hosted web server.
What type of web server do I need? I've heard something about CGI.
How do I install the external library on the server and import it in my script?
Also, is cron only for PHP, or is there a "cronjob" equivalent for Python?
cron is an OS-level scheduler for Unix-like systems. It allows you to run any command at set intervals.
Do you actually need a web server, or do you just want to run your script on a schedule? If the latter, there is no need for a web server; cron will suffice.
I believe all you need is SSH and sudo access to a server (Amazon EC2 offers free micro instances for a year).
After that you can install pip, virtualenv, and BeautifulSoup. You can then register the command that executes your script with cron, as sketched below, and you are all set.
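As a rough illustration (the URL, file paths, and schedule are all placeholders, and it assumes the requests library is installed alongside BeautifulSoup), the script you would schedule might look like this, with the crontab entry shown as a comment:

```python
# scrape.py -- minimal BeautifulSoup sketch; URL and paths are placeholders.
# Example crontab entry (run at the top of every hour):
#   0 * * * * /usr/bin/python3 /home/user/scrape.py >> /home/user/scrape.log 2>&1
import requests
from bs4 import BeautifulSoup

def main():
    html = requests.get("https://example.com").text
    soup = BeautifulSoup(html, "html.parser")
    # Print every link found on the page.
    for link in soup.find_all("a"):
        print(link.get("href"))

if __name__ == "__main__":
    main()
```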
I have an API (in Python) that has to alter files inside an EC2 instance that is already running. I've been searching the boto3 documentation, but could only find functions to start new EC2 instances, not to connect to an already existing one.
I am currently thinking of replicating the API's file-altering functions in a script inside the EC2 instance, and having the API simply start that script on the EC2 instance by accessing it with some sort of SSH library.
Would that be the correct approach, or is there a boto3 function (or one in the other Amazon/AWS libraries) that allows me to start a script inside an existing instance?
An Amazon EC2 instance is just like any computer on the Internet. It is running an operating system (e.g. Linux or Windows), and it has standard security built in. The fact that it is an Amazon EC2 instance has no impact.
So, the question really becomes: How do I run a command on a remote computer?
Typical ways of doing this include:
Connecting to the computer (e.g. via SSH) and running a command
Running a service on the computer that listens on a particular port (eg responding to an API request)
Using remote shell commands to run an operation on another computer
Fortunately, AWS offers an additional option: Use the AWS Systems Manager Run Command:
AWS Systems Manager Run Command lets you remotely and securely manage the configuration of your managed instances. A managed instance is any Amazon EC2 instance or on-premises machine in your hybrid environment that has been configured for Systems Manager. Run Command enables you to automate common administrative tasks and perform ad hoc configuration changes at scale. You can use Run Command from the AWS console, the AWS Command Line Interface, AWS Tools for Windows PowerShell, or the AWS SDKs. Run Command is offered at no additional cost.
Administrators use Run Command to perform the following types of tasks on their managed instances: install or bootstrap applications, build a deployment pipeline, capture log files when an instance is terminated from an Auto Scaling group, and join instances to a Windows domain, to name a few.
Basically, it is an agent installed on the instance (or, for that matter, on any computer on the Internet) and commands can be sent to the computer that are executed by the agent. In fact, the same command can be sent to hundreds of computers if desired.
The AWS Systems Manager Run Command can be triggered by an API call, such as a program using boto3.
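A rough sketch with boto3 (the region, instance ID, and script path are placeholders, and the instance must already be registered as a managed instance with the SSM agent installed):

```python
import time
import boto3

ssm = boto3.client("ssm", region_name="us-east-1")  # region is a placeholder

# Run a shell command on an already-running, SSM-managed instance.
response = ssm.send_command(
    InstanceIds=["i-0123456789abcdef0"],             # placeholder instance ID
    DocumentName="AWS-RunShellScript",
    Parameters={"commands": ["python /home/ec2-user/alter_files.py"]},
)
command_id = response["Command"]["CommandId"]

# Give the command a moment to register, then fetch its result.
time.sleep(2)
output = ssm.get_command_invocation(
    CommandId=command_id,
    InstanceId="i-0123456789abcdef0",
)
print(output["Status"], output.get("StandardOutputContent", ""))
```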
Unless you have a specific service running on that machine that lets you modify the files in question, I would attempt to log onto the EC2 instance over the network, as with any other machine.
You can access the EC2 machine via SSH using the paramiko or pexpect libraries.
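For example, a minimal paramiko sketch (host, user, key path, and script path are all placeholders) might look like this:

```python
import paramiko

# Connect to the instance over SSH; all connection details are placeholders.
client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(
    hostname="ec2-203-0-113-10.compute-1.amazonaws.com",
    username="ec2-user",
    key_filename="/path/to/key.pem",
)

# Run the script that alters the files and collect its output.
stdin, stdout, stderr = client.exec_command("python /home/ec2-user/alter_files.py")
print(stdout.read().decode())
print(stderr.read().decode())

client.close()
```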
If you want to execute a script inside an existing EC2 instance, you could use the reference from the existing answer here: Boto Execute shell command on ec2 instance
IMO, to be able to start a script inside the EC2 instance, the script must already be present on that instance.
I have a Django web application on IIS 6 on one server. Is there a way to call, from this website, a Python script that lives on another server, so that the script simply runs itself there?
Calling or running that script in the usual ways the internet suggests is not working.
I always get an error from os.getcwd(), and it also doesn't allow me to change that directory.
I just want to run that Python script on that other server, from this server.
Can anyone help?
Normally, I would recommend using a framework like fabric or winrm if you want to run a python script on another server. Those frameworks use ssh or windows remoting functionality, respectively, to allow a python program to execute other commands (including python scripts) on other systems. If the target machine is a windows machine, be forewarned that you can run into all sorts of UAC issues doing normal operations.
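Since the target here would be a Windows server, a rough sketch with the pywinrm library (host, credentials, and script path are placeholders, and WinRM must already be enabled on the target) might look like this:

```python
import winrm

# Open a WinRM session to the remote Windows server; all details are placeholders.
session = winrm.Session("otherserver.example.com", auth=("administrator", "password"))

# Run the remote Python script and inspect the result.
result = session.run_cmd("python", [r"C:\scripts\remote_task.py"])
print(result.status_code)
print(result.std_out.decode())
print(result.std_err.decode())
```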
I want to manage virtual machines (any flavor) using Python scripts. For example: create a VM, start and stop it, and access my guest OS's resources.
My host machine runs Windows. I have VirtualBox installed. Guest OS: Kali Linux.
I just came across software called libvirt. Do any of you think it would help me?
Any insights on how to do this? Thanks for your help.
For AWS, use boto.
For GCE, use the Google API Python Client Library.
For OpenStack, use python-openstackclient and import its methods directly.
For VMware, google it.
For Opsware, abandon all hope: its API is undocumented and has about 12 years of accumulated abandoned methods to dig through, with an equally insane data model backing it.
For direct libvirt control, there are Python bindings for libvirt (see the sketch below). They work very well and closely mimic the C libraries.
I could go on.
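A rough sketch of the libvirt bindings (the connection URI and domain name are placeholders; this assumes the libvirt-python package is installed and the VM is already defined):

```python
import libvirt

# Connect to the local hypervisor. "qemu:///system" is an assumption;
# for VirtualBox the URI would differ (e.g. "vbox:///session").
conn = libvirt.open("qemu:///system")

# Look up an existing VM by name (placeholder name) and manage it.
dom = conn.lookupByName("kali-linux")
if not dom.isActive():
    dom.create()        # start the VM
print(dom.info())       # state, max memory, CPUs, ...
dom.shutdown()          # ask the guest OS to shut down

conn.close()
```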
Follow the directions here to install Docker: https://docs.docker.com/windows/ (the installer includes Oracle VirtualBox if you don't already have it).
# grab the image
docker pull kalilinux/kali-linux-docker
# run a specific command
docker run kalilinux/kali-linux-docker <some_command>
# open an interactive terminal in the container
docker run -t -i kalilinux/kali-linux-docker /bin/bash
If you want to mount a local directory into the container, use the `-v <host_path>:<container_path>` switch, placed before the image name in your run command:
# mount the local ./training/webapp directory into the container at /webapp (use an absolute host path)
docker run -v /absolute/path/to/training/webapp:/webapp kalilinux/kali-linux-docker <some_command>
Note that these are run from the regular Windows prompt; to use them from Python you would need to wrap them in subprocess calls.
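Here is a rough sketch of such a wrapper (the command run inside the container, `uname -a`, is just a placeholder):

```python
import subprocess

# Minimal sketch: run a command inside the Kali container from Python.
# The image name comes from the answer above; "uname -a" is a placeholder command.
result = subprocess.run(
    ["docker", "run", "--rm", "kalilinux/kali-linux-docker", "uname", "-a"],
    capture_output=True,
    text=True,
)
print(result.stdout)
print(result.stderr)
```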
I have a Python script which basically launches an XML-RPC server. I need this script to run at all times. I have a function that may call for the system to reboot itself, so once the system has rebooted, I need to get the script running again.
How can I add this to the Windows RT startup?
There is no way for a Windows Store app to trigger a system reboot, nor can it run a Python script unless you implement a Python interpreter inside your app. Windows Store apps run in their own sandbox and have very limited means of communication with the rest of the system.
You can create a scheduled task to run on login or boot. The Run registry key and the Startup folder do not function on Windows RT, but the Task Scheduler does.
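A rough sketch of registering such a task from Python (the task name and script path are placeholders; this assumes the schtasks command is available on your device and that you run it with sufficient privileges):

```python
import subprocess

# Register a task that starts the XML-RPC script at system startup.
# Task name and script path are placeholders; run from an elevated prompt.
subprocess.run([
    "schtasks", "/Create",
    "/TN", "StartXmlRpcServer",
    "/TR", r"python C:\scripts\xmlrpc_server.py",
    "/SC", "ONSTART",
    "/RU", "SYSTEM",
], check=True)
```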
Off topic (since I seem unable to add a comment to the other answer): a copy of Python has been ported over to Windows RT using the jailbreak.
I want to execute a Python script on several (15+) remote machines using SSH. After invoking the script/command I need to disconnect the SSH session and keep the processes running in the background for as long as they are required.
I have used Paramiko and PySSH in the past, so I have no problem using them again. The only thing I need to know is how to disconnect an SSH session in Python (since normally the local script would wait for each remote machine to finish processing before moving on).
This might work, or something similar:
ssh user@remote.host 'nohup python scriptname.py > /dev/null 2>&1 &'
Basically, have a look at the nohup command.
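If you are driving this from paramiko rather than from a shell, a rough sketch of the same idea (host list, credentials, key path, and script name are placeholders) might be:

```python
import paramiko

hosts = ["host1.example.com", "host2.example.com"]  # the 15+ machines

for host in hosts:
    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(host, username="user", key_filename="/path/to/key.pem")

    # nohup + background + redirected output so the remote process
    # is not tied to this SSH session.
    client.exec_command("nohup python scriptname.py > /dev/null 2>&1 &")

    # Closing the client disconnects immediately; the remote script keeps running.
    client.close()
```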
On Linux machines, you can run the script with 'at'.
echo "python scriptname.py" ¦ at now
If you are going to perform repetitive tasks on many hosts, for example deploying software and running setup scripts, you should consider using something like Fabric:
Fabric is a Python (2.5 or higher) library and command-line tool for
streamlining the use of SSH for application deployment or systems
administration tasks.
It provides a basic suite of operations for executing local or remote
shell commands (normally or via sudo) and uploading/downloading files,
as well as auxiliary functionality such as prompting the running user
for input, or aborting execution.
Typical use involves creating a Python module containing one or more
functions, then executing them via the fab command-line tool.
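A rough sketch of a fabfile for this use case (Fabric 1.x style, matching the quoted description; the host list and script name are placeholders):

```python
# fabfile.py -- run with:  fab start_script
from fabric.api import env, run

env.hosts = ["user@host1.example.com", "user@host2.example.com"]  # the 15+ machines

def start_script():
    # nohup + background so the process survives after Fabric disconnects.
    run("nohup python scriptname.py > /dev/null 2>&1 &", pty=False)
```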
You can even use tmux in this scenario.
As per the tmux documentation:
tmux is a terminal multiplexer. It lets you switch easily between several programs in one terminal, detach them (they keep running in the background) and reattach them to a different terminal. And do a lot more
From a tmux session, you can run a script, quit the terminal, log in again, and check back later; tmux keeps the session alive until the server restarts.
How to configure tmux on a cloud server