Automate Cloud Foundry Deployment with Python

I am new to Cloud Foundry. I want to automate the application deployment and service binding in Cloud Foundry with Python.
To deploy an application in Cloud Foundry, we use Cloud Foundry CLI commands like:
cf push redis-sample-app
cf create-service redis shared-vm service-example-redis
cf bind-service redis-sample-app service-example-redis
cf restage redis-sample-app
Now I don't want to use the CLI for that; I just want to write a Python/Ruby/(any language) script that does all of these steps.
I have tried Google and ended up with the Python cloudfoundry module, but it's not clear how to proceed. Is there an API for my task, like boto for accessing EC2? I have tried the following code in Python:
from cloudfoundry import CloudFoundryInterface
cf = CloudFoundryInterface(target="api.end.point", username="myusername", password="mypwd")
cf.login()
It's showing the error:
`File "C:\Python27\lib\site-packages\requests\models.py", line 398, in full_url
raise MissingSchema("Invalid URL %r: No schema supplied" % url)
MissingSchema: Invalid URL u'users/kishorekumarnetala%40gmail.com/tokens': No schema supplied`

First, a quick thing: what is the actual API endpoint of your Cloud Foundry deployment? If you're using the cf CLI, what did you enter when you ran cf api API_ENDPOINT? You can run cf target to see what the current API endpoint is set to. It should have a scheme like http or https. If you're actually passing api.end.point, with no scheme, in your Python code, that's why you're getting the error you're seeing.
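Taking the constructor from your snippet at face value, the fix may be as simple as including the scheme in the target (a sketch; api.end.point is still your placeholder endpoint):

from cloudfoundry import CloudFoundryInterface

# Note the explicit https:// scheme; without it, requests raises the
# MissingSchema error shown above.
cf = CloudFoundryInterface(target="https://api.end.point",
                           username="myusername",
                           password="mypwd")
cf.login()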
As for your general question about automating Cloud Foundry interactions, you have a few options:
Write a shell script that directly drives the cf CLI
Write a module in a higher-level language like Ruby or Python that simply wraps calls to the CLI
Write a module in a higher-level language that wraps calls to the RESTful API
Here's a breakdown of those options:
If your list of languages (Ruby/Python/any language) includes things like bash or pure sh, then you can easily use those to write "code" that automates interacting with Cloud Foundry. The CLI is designed to be scriptable and to not require human interaction, so this is the most common approach.
If you want to drive interactions from a different language (e.g. because this is part of a larger project that's already written in one), you can certainly do that. The full suite of highest-level system tests for Cloud Foundry does this in Golang. If you're familiar with navigating Golang projects, you can look at:
the package that drives the CLI
the test suites that use that package
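If you'd rather stay in Python than Golang, a thin wrapper over the CLI can be as small as this (a minimal sketch; it assumes the cf binary is installed, on your PATH, and already targeted at your API endpoint):

import subprocess

def cf(*args):
    # Run a cf CLI command; raises CalledProcessError on a non-zero exit.
    subprocess.check_call(["cf"] + list(args))

# The same deployment steps from the question, driven from Python:
cf("push", "redis-sample-app")
cf("create-service", "redis", "shared-vm", "service-example-redis")
cf("bind-service", "redis-sample-app", "service-example-redis")
cf("restage", "redis-sample-app")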
You can also build a wrapper around the RESTful HTTP API. There are already several in the ecosystem:
Here is a recent thread about an officially supported Java client
Someone in the community has been developing a Node.js client for their own purposes (not sure if it's public, though)
There used to be a Ruby gem, but I believe it is deprecated; you may still be able to find it and look at it for ideas
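If you do go the RESTful route in Python, the core of a wrapper is just two steps: fetch an OAuth token from UAA, then pass it as a bearer token on each call. A rough sketch (it assumes the default cf client with an empty secret and the v2 endpoints; check your deployment's /v2/info for the actual auth endpoint):

import requests

API = "https://api.end.point"  # your real endpoint, scheme included

# Discover the auth endpoint from the platform's info endpoint.
info = requests.get(API + "/v2/info").json()

# OAuth2 password grant; "cf" with an empty secret is the default CLI client.
token = requests.post(
    info["authorization_endpoint"] + "/oauth/token",
    data={"grant_type": "password", "username": "myusername", "password": "mypwd"},
    auth=("cf", ""),
).json()["access_token"]

# Authenticated calls then just carry the bearer token.
apps = requests.get(API + "/v2/apps",
                    headers={"Authorization": "Bearer " + token}).json()
for app in apps["resources"]:
    print(app["entity"]["name"])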

Related

Implementing web service with python on a virtual AWS instance

I need to implement a simple web service in Python - it's my first experience with web services and REST APIs, so I want to understand what environment and tools would fit my needs. In my web service, I need to read some data from a database, do some simple logic, and support a GET call from another application (Qualtrics).
I read and implemented a simple test web service with python using some useful blogs such as: Building a Basic RestFul API in Python | Codementor
but I need a real server so that I could call the API from external applications.
As I'm looking for a long-term solution, I thought that using an AWS EC2 instance might be a good solution for a server. I tried to implement it using guidelines from blogs such as: Deploy a Flask app on AWS EC2 | Codementor
However, as I'm new to this and encountered some implementation/editing errors (e.g. handling of the wsgi file), and as I'm a Windows person and the Ubuntu side is not always easy to get used to, I was wondering: what is the best framework for my needs?
Is there a recommended flow that would let me implement my simple Python code and connect it to a small server (either an AWS EC2 instance or any other recommended one) in a more convenient way?
Another important note - I will need to run it only from time to time; this web server and web service should not be constantly live (that's why I thought an AWS virtual instance would fit best).
To begin, my recommendation would be to look at Elastic Beanstalk, Fargate and API Gateway with Lambda.
You can use Elastic Beanstalk to easily provision an out-of-the-box AWS environment to host your Python app in Flask, with minimal configuration required:
Deploying a flask application to Elastic Beanstalk.
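As a point of reference, the Elastic Beanstalk Python platform expects a module named application.py exposing a WSGI callable named application, so the minimal Flask app it can host looks roughly like this (a sketch; the route and logic are placeholders for your own):

# application.py -- Elastic Beanstalk looks for a callable with this exact name.
from flask import Flask, jsonify

application = Flask(__name__)

@application.route("/data")
def data():
    # Read from your database and apply your logic here.
    return jsonify({"status": "ok"})

if __name__ == "__main__":
    application.run()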
The other thing to consider would be to develop your Python app as a Docker container using, e.g., tiangolo/uwsgi-nginx-flask as the base image. This would allow you to easily work with it on your localhost, and then just move your image to AWS for hosting.
You can host it on Fargate to save time on configuring container instances, or on Beanstalk, which also supports Docker.
Yet another choice is to go fully serverless and develop your Python REST API using API Gateway and Lambda.
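For scale, the Lambda half of that setup is a single handler function; with the API Gateway proxy integration, the HTTP request arrives as the event dict (a minimal sketch):

import json

def lambda_handler(event, context):
    # Query string parameters from the GET call, e.g. from Qualtrics.
    params = event.get("queryStringParameters") or {}
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"echo": params}),
    }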

What is a convenient way to deploy and manage execution of a Python SDK Apache Beam pipeline for Google cloud Dataflow

Once an Apache Beam pipeline has been designed and tested in Google Cloud Dataflow using the Python SDK and DataflowRunner, what is a convenient way to have it in Google Cloud and manage its execution?
What is a convenient way to deploy and manage execution of a Python SDK Apache Beam pipeline for Google Cloud Dataflow?
Should it be somehow packaged? Uploaded to Google Storage? Made into a Dataflow template? How can one schedule its execution beyond a developer executing it from their development environment?
Update
Preferably without third-party tools or a need for additional management tools/infrastructure beyond Google Cloud, and Dataflow in particular.
Intuitively you'd expect that the "deploying a pipeline" section under the How-to guides of the Dataflow documentation would cover this. But you find an explanation of it only eight sections below, in the "templates overview" section.
According to that section:
Cloud Dataflow templates introduce a new development and execution workflow that differs from traditional job execution workflow. The template workflow separates the development step from the staging and execution steps.
In the trivial case, you do not deploy your Dataflow pipeline to Google Cloud at all; you execute it from a development environment. But if you need to share the execution of a pipeline with non-technical members of your project, or simply want to trigger it without depending on a development environment or third-party tools, then Dataflow templates are what you need.
Once a pipeline is developed and tested, you can create a Dataflow job template from it.
Please note that:
To create templates with the Cloud Dataflow SDK 2.x for Python, you must have version 2.0.0 or higher.
You will need to execute your pipeline using DataflowRunner with pipeline options that generate a template in Google Cloud Storage rather than running the job.
For more details, refer to the creating templates documentation section; to run a template, refer to the executing templates section.
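Concretely, the switch from running a job to staging a template is a single pipeline option: when template_location is set, DataflowRunner writes a template to Cloud Storage instead of executing the pipeline. A sketch (the project, bucket paths, and pipeline body are placeholders):

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(
    runner="DataflowRunner",
    project="my-project",                                       # placeholder
    temp_location="gs://my-bucket/temp",                        # placeholder
    staging_location="gs://my-bucket/staging",                  # placeholder
    template_location="gs://my-bucket/templates/my-template",   # template output path
)

p = beam.Pipeline(options=options)
(p
 | "Read" >> beam.Create(["hello", "world"])   # placeholder pipeline body
 | "Format" >> beam.Map(lambda s: s.upper()))
p.run()  # with template_location set, this stages the template instead of launching a job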
I'd say the most convenient way is to use Airflow. This allows you to author, schedule, and monitor workflows. The Dataflow operator can start your designed data pipeline. Airflow can run either on a small VM or via Cloud Composer, a managed Airflow service on Google Cloud Platform.
There are more options to automate your workflow, such as Jenkins, Azkaban, Rundeck, or even a simple cronjob (which I'd discourage). You might want to take a look at these options as well, but Airflow probably fits your needs.
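For instance, with Airflow the scheduling piece reduces to a short DAG. A sketch assuming Airflow 1.x's contrib DataflowTemplateOperator and a template already staged in Cloud Storage (paths and project are placeholders):

from datetime import datetime

from airflow import DAG
from airflow.contrib.operators.dataflow_operator import DataflowTemplateOperator

with DAG("run_my_pipeline",
         start_date=datetime(2019, 1, 1),
         schedule_interval="@daily") as dag:
    run_pipeline = DataflowTemplateOperator(
        task_id="run_my_template",
        template="gs://my-bucket/templates/my-template",      # placeholder
        dataflow_default_options={"project": "my-project"},   # placeholder
    )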

Need advice on how to incorporate Python into an Azure, specifically an ASP.NET web application environment

Need advice on how to incorporate Python into an Azure ASP.NET web application environment. Please excuse this question, but I am new to Azure and I'm not clear on how to proceed. Every option that I look into looks promising, but they all seem to have their own issues. Below is a more thorough explanation, but the deal is that I have an Azure account with all kinds of goodies, a full-fledged ASP.NET (C#) web app running via App Service, I am new to Azure (but not Python), and I'm hoping to add Python functionality to this whole setup. In short:
I want to add Python to this setup mainly to run scheduled jobs and also to trigger Python code from ASP.NET web form submissions
ideally I want a solution that resembles a non-cloud setup. I know this sounds silly, but I'm finding the cloud/Azure functionality nuanced and not straightforward. I want a place to put a bunch of Python scripts, and to run, edit, schedule, and trigger them from ASP.NET
for example: I created a WebJob that runs manually, and it wasn't clear from the documentation how it should be called. I just figured out that you need to POST with Basic Auth (and the credentials provided).
Also, Azure CMD does NOT like files with underscores in their names! You cannot submit a WebJob with a .py file containing an underscore, nor can you write output to a file with an underscore.
Also, I don't see an option for this WebJob to run Python 3.6.4 (which I installed via extension). Right now it is using 2.7.15...
Also, a CRON expression in Azure has six fields (six *), not five fields plus a command. Again, more weird stuff to worry about.
I tried these instructions, but the updates to the web page's Web.config file break the ASP.NET web pages
ideally, the most cost-effective option
Any info is greatly appreciated
MORE DETAILED EXPLANATION
Currently I have an ASP.NET site running via Azure App Service, and I would like to add Python scripts and possibly Flask/REST functionality. Note that I am not expecting to serve any content via Python; I will largely be running Python scripts either on a scheduled basis or by calling them from ASP.NET. As a matter of fact, and this is an important point, I'm hoping to have ASP.NET trigger/run a Python script when a web form is submitted. I realize that I could get a similar effect if I make a web call to a REST API that is running Python. In any event, I can't tell if I should:
add a Python extension to the current App Service running the web page (I tried this) OR
I did install Python 3.6.4 and some packages via pip
These instructions were useful; however, the updates to the web page's Web.config file break the ASP.NET web pages
set up a VM that will have all of the Python code (but how can I have the .NET web page(s) call the Python in the VM?) OR
use Azure Functions (I'm completely new to this and must admit that I prefer my old-school Python environment, although I see the benefit of using Functions. But how do you deal with logging and debugging?)
or what about a custom Windows container (Docker)?
This requires installing VS Code, which is OK, but I'm looking for a solution that another user can pick up with as few interruptions as possible
The idea is to ramp up the use of Python although, like I said, I don't expect Python to serve any of the web content. It will be used to run in the background and to run scheduled jobs. What is the most robust and hopefully easiest way to add Python functionality to Azure, most importantly in a way that lets me trigger/use Python from an App Service running .NET? I've searched online and Stack Overflow so far with interesting finds, but nothing to my liking.
For example, the following link discusses how to schedule WebJobs. I just created a manual one and when I called the webhook I got the message: "No route registered for '/api/triggeredwebjobs/TestPython/run'" How to schedule python web jobs on azure
The Docker method looks very promising; however, I'm looking for a simple solution, as there is another person who will be involved in all of this and he's busy with other projects
Thank you very much!
I found a solution, though I'm open to more info. Like I mentioned in my post, I used the 'add extension' tool to add Python 3.6.4 to my Azure App Service (installed in D:\home\python364x64).
Then I installed a bunch of packages via pip, these installed into D:\home\python364x64\Lib\site-packages.
I created a Python folder in webpages\Python where I put my scripts.
Finally, in ASP.NET I used the Diagnostics.Process call to run my code in ~\webpages\Python\somecode_2.py
The main issue is that Azure came with Python 2.7.15 installed, and for some reason when my Python code was executed it used 3.4 (where that version came from beats me). So for each script, I had to create an _2.py version that simply does the following in order to call the original script via Python 3.6.4. It looks a little nasty but it works. So like I said, I would welcome more info on ways to do this better...
import os

# Call the original script explicitly with the Python 3.6.4 interpreter
# installed by the extension; substitute the script's real arguments below.
os.system("D:\\home\\python364x64\\python.exe SomePython.py {0}".format("arguments here"))
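As one possible improvement (a sketch, not tested on App Service): subprocess surfaces the exit code and output, so failures become visible, and passing the target script as an argument lets a single generic shim replace the per-script _2.py wrappers:

import subprocess

PYTHON = r"D:\home\python364x64\python.exe"

# Run the target script under 3.6.4 and surface its output and exit code.
proc = subprocess.Popen([PYTHON, "SomePython.py", "some-argument"],
                        stdout=subprocess.PIPE, stderr=subprocess.PIPE,
                        universal_newlines=True)
out, err = proc.communicate()
print(out)
if proc.returncode != 0:
    print("script failed: " + err)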

How to use Firebase with a Linux-based client app for bidirectional message communication with a server

I've seen Google's documentation and a lot of examples on the internet about how to use Firebase within Android / iOS applications, but I want to develop a client Firebase application that runs on a Linux machine.
My requirements are:
Client runs in a Linux environment (written in either C++ or Python).
Server is written in JavaScript (Node.js).
Server and client should have bi-directional communication between them using the Firebase Realtime Database.
I've also looked at the Firebase REST API, but I assume it is not good enough for me, since I haven't seen a client-side listener API for something like onValueChanged.
Question:
How can I implement a Linux-based app in C++ or Python that listens to messages from the server (data changes) using a listener, without having to call some get function every few seconds (just like Node.js has ref.on("child_changed", ...) or ref.on("value", ...))?
Help is much appreciated!
Unfortunately there is no official library for the Realtime Database in Python, but there are some third-party wrappers around the REST API.
Specifically, I would look at the Pyrebase library, which also supports listening for live changes: https://github.com/thisbejim/Pyrebase#streaming
For more information, libraries or other languages look at this page https://firebase.google.com/docs/database/rest/start
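For a sense of what Pyrebase's streaming looks like, a minimal listener might be (a sketch following the Pyrebase README; all config values are placeholders):

import pyrebase

config = {
    "apiKey": "your-api-key",                          # placeholder
    "authDomain": "your-app.firebaseapp.com",          # placeholder
    "databaseURL": "https://your-app.firebaseio.com",  # placeholder
    "storageBucket": "your-app.appspot.com",           # placeholder
}

firebase = pyrebase.initialize_app(config)
db = firebase.database()

def stream_handler(message):
    # Invoked on every change the server pushes -- no polling loop needed.
    print(message["event"], message["path"], message["data"])

# Comparable in spirit to ref.on("value", ...) in the Node.js SDK.
my_stream = db.child("messages").stream(stream_handler)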
Google offers a C++ SDK; you can download it here.
There is also a tutorial at the bottom of the page, but if you really want to dig into some code, here is the quickstart code on GitHub from Google, with examples for each Firebase category.
Google has also implemented a game in C++ for desktop OSs for demonstration purposes (here). There you can find more advanced features and code samples.
I have only tried it with an Xcode project on Mac OS X, and it works fine. I have no code for the Realtime Database, but you should find the sample project in the GitHub repo: https://github.com/firebase/quickstart-cpp
I hope this helps!
Greetings.
Have you tried using Realtime Database triggers (currently in beta)? They allow you to simulate the same workflow as in JS.
From my point of view, that is the best way to get it to work the way you want.
https://firebase.google.com/docs/functions/database-events

How to write AWS Deployment script to launch AWS instance

I am looking to launch an AWS instance by deploying a script. However, I do not fully understand what this means. What should be in the script in order to launch an instance, and how do I approach this in order to meet the following requirements?
User specifies AWS credentials in a separate key file;
User invokes the termination script and passes the instance ID from the command line;
Termination script shuts down the AWS instance;
Upon completion, the termination script returns a message indicating whether the termination process has completed successfully.
I would appreciate some help in understanding what exactly a deployment script is and what language I should write it in. I have been coding thus far in Python and have created a script that creates an instance, but I am not sure how this is different from deploying an instance.
The expressions "create an instance" and "deploy an instance" can mean the same thing or different things, depending on the engineer's viewpoint.
Basically, creating an EC2 instance means launching an EC2 instance in the AWS sense. Deploying an EC2 instance may include additional configuration steps such as patching the OS, installing software and applications, etc. It is up to you to decide which is which and how each should be done.
When deploying an EC2 instance, I prefer to configure a machine exactly the way I want, with OS patches, software, and my applications. Then I create an AMI. When I later launch a new EC2 instance, I use my hand-created AMI, and the new instance is exactly what I want, with no long deployment phase.
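With that approach, the launch script itself stays small. A Boto3 sketch (the AMI ID, instance type, and key name are placeholders; credentials come from the standard AWS config files, per the next paragraph):

import boto3

ec2 = boto3.client("ec2")

# Launch one instance from your hand-built AMI.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder: your custom AMI
    InstanceType="t2.micro",           # placeholder
    KeyName="my-key-pair",             # placeholder
    MinCount=1,
    MaxCount=1,
)
print("Launched: " + response["Instances"][0]["InstanceId"])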
Best practice when writing scripts: do not store your Amazon credentials in your scripts, source code, random files, etc. Install the AWS CLI (Command Line Interface) tool and then configure it with your credentials. Your credentials are then stored in a well-defined location, with the added benefit that Amazon SDKs, scripts, etc. will know how to find them and will load and use them automatically.
The easiest way of writing scripts to manage AWS services is to use the AWS CLI. Just about anything that you can do in the AWS Management Console, you can do with the CLI. The CLI works on Windows, Linux, and Mac OS.
AWS Command Line Interface
Here is a CLI example that will terminate an EC2 instance. Replace the instance ID with your own:
aws ec2 terminate-instances --instance-ids i-1234567890abcdef0
Writing your scripts in Python is another good idea. Managing AWS services with Python is very easy, there are lots of examples available on the Internet, and Python makes it quick to develop Amazon apps. Use the Boto3 library, not the older Boto library. I use Python 3.x for all new development, but be aware that there is a lot of existing AWS material on the Internet that runs under Python 2.x.
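Tying that back to the question's requirements, a Boto3 termination script might look like this (a sketch: it takes the instance ID from the command line, loads credentials from the standard AWS credentials file, and reports whether termination succeeded):

import sys

import boto3
from botocore.exceptions import ClientError

def main():
    if len(sys.argv) != 2:
        sys.exit("usage: terminate.py INSTANCE_ID")
    instance_id = sys.argv[1]

    # Credentials come from the standard AWS config, not from the script.
    ec2 = boto3.client("ec2")
    try:
        ec2.terminate_instances(InstanceIds=[instance_id])
        ec2.get_waiter("instance_terminated").wait(InstanceIds=[instance_id])
        print("Instance {0} terminated successfully.".format(instance_id))
    except ClientError as e:
        print("Termination failed: {0}".format(e))

if __name__ == "__main__":
    main()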
CLI EC2 Commands
