Coming from AWS, I am completely new to Azure in general and to Cloud Services specifically.
I want to write a Python application that leverages GPUs on Azure in a PaaS (Platform as a Service) architecture. The application will ideally be deployed somewhere central, and then a number of GPU-enabled nodes will spin up and run the application until it is done before shutting down again.
I want to know: what is the shortest way to accomplish this in Azure?
Is my assumption correct that I will need to use what is called Cloud Services with a worker role, or will I have to create my own infrastructure based on single VMs running in IaaS?
It sounds like you have created an application that needs to do some general-purpose computing on the GPU via CUDA or OpenCL. If so, you need to install a GPGPU driver on Azure to support your Python application, and the Azure NC and NV series VMs are the ones suited to this scenario, much like the GPU instance families on AWS.
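For a quick sanity check once the driver is installed, a minimal sketch like the following (assuming the numba and numpy packages are available on the VM; the kernel itself is purely illustrative) confirms the GPU is reachable from Python:

```python
# Illustrative check that CUDA is reachable from Python on an NC/NV VM,
# using numba's CUDA support (assumes the NVIDIA driver plus numba/numpy
# are already installed on the VM).
import numpy as np
from numba import cuda


@cuda.jit
def vector_add(a, b, out):
    i = cuda.grid(1)
    if i < out.size:
        out[i] = a[i] + b[i]


if __name__ == "__main__":
    if not cuda.is_available():
        raise SystemExit("No CUDA-capable GPU detected - check the GPU driver install")

    n = 1_000_000
    a = np.random.rand(n).astype(np.float32)
    b = np.random.rand(n).astype(np.float32)
    out = np.zeros_like(a)

    threads_per_block = 256
    blocks = (n + threads_per_block - 1) // threads_per_block
    # numba transfers the host arrays to the device automatically here
    vector_add[blocks, threads_per_block](a, b, out)
    print(out[:5], a[:5] + b[:5])
```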
Hope this helps. If you have any concerns, please feel free to let me know.
Related
I have a Python script that runs 24 hours a day on my local system, and the script uses several third-party libraries that were installed with pip:
Libraries
BeautifulSoup
requests
m3u8
My Python script records some live-stream videos from a website and stores them on the system. How can Google Cloud help me run this script 24 hours a day, 7 days a week? I am very new to cloud platforms. I want to host the script on Google Cloud, and I want to be sure it will work there the same as it does on my local system so that my money is not wasted.
Yes, it can. I would recommend familiarizing yourself with the Quickstart: Deploy a Python Service to Cloud Run and What is Cloud Run. When you use Cloud Run, you can provide your own Docker image that uses Python, or select from pre-existing images.
Once you have a Cloud Run instance running, you can tie it into other Cloud Run instances or Cloud Functions, which are scalable functions that use Cloud Run under the hood and allow you to easily scale your app. Additionally, these instances scale down to zero if nobody is using the app, which saves costs greatly. This can of course be configured so that the app is always kept warm.
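As a rough sketch, a Cloud Run container only needs to serve HTTP on the port that Cloud Run injects through the PORT environment variable; Flask below is just one illustrative choice of framework:

```python
# Minimal sketch of a containerizable web service for Cloud Run.
# Cloud Run tells the container which port to listen on via the PORT
# environment variable; everything else here is an illustrative choice.
import os

from flask import Flask

app = Flask(__name__)


@app.route("/")
def index():
    return "service is alive"


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))
```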
In general I highly recommend looking at Cloud Run, but other services such as Compute Engine can also handle the task.
If you want to run a 24/7 application in the cloud, whichever cloud you choose, you must not use a solution with a request timeout (like Cloud Run or Cloud Functions).
You could consider App Engine flex, but that wouldn't be my first recommendation.
The most efficient option for me (low maintenance, cost efficient) is GKE Autopilot: a Kubernetes cluster managed for you, where you pay only for the CPU and memory that your workloads use.
You have to containerize your app to do that.
I have an FX vol model that I have written in Python / QuestDB on my local machine. The standalone model works as an application on my machine, and I also have lots of feeds written that are constantly updating the local QuestDB instance.
Now I want to move to the web and have my application and database on a web server away from my machine.
I am not too familiar with web servers or with how to install QuestDB on one.
From my knowledge I will need to:
1. Get a paid VPS subscription where I have CentOS and Python support.
2. Install QuestDB there (using Docker?).
3. Install all of my Python model there.
4. Start QuestDB on the VPS.
5. Configure my scripts to use the hosted QuestDB.
6. Have the model save its output to a table in QuestDB, while the scripts also keep pushing new feeds into QuestDB.
7. Start a web server that provides web access to the model results saved in QuestDB.
7.a I need to provide username/login for the website.
7.b I need to use some visualization.
7.c I need the users, once they are logged in, to be able to run some simulations.
What I need some guidance on is:
1. What sort of VPS service to look for.
2. Has anyone already installed QuestDB in this way?
3. Which web server is best for Python?
Lots of good questions there. I will try to help with all of them (disclaimer: I work for QuestDB as a developer advocate).
What sort of VPS service to look for:
Anything will work, but depending on the volume you are going to ingest/query, you will greatly benefit from a fast drive. In many cases the default disk drive for a VPS/Cloud provider will be slower than the SSD drive on your local development machine. See if the default works for you and otherwise select a better option when starting your virtual machine.
Has anyone already installed QuestDB in this way?
Sure. If you want to install on AWS, Google Cloud, or DigitalOcean, you have detailed instructions on the QuestDB site. If you prefer to use Docker, it is fully supported, but a popular option is installing the binaries and just starting QuestDB as a service.
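Once QuestDB is running on the VPS, pointing your feed scripts at it is mostly a connection change. Here is a minimal sketch over the Postgres wire protocol that QuestDB exposes on port 8812 by default (default credentials admin/quest; the host, table, and columns below are illustrative, and the table is assumed to already exist):

```python
# Illustrative only: write one feed row to a hosted QuestDB instance over
# its Postgres wire protocol (default port 8812, default user admin/quest).
# Host, table, and column names are placeholders.
import psycopg2

conn = psycopg2.connect(
    host="your-vps-hostname",
    port=8812,
    user="admin",
    password="quest",
    dbname="qdb",
)
conn.autocommit = True

with conn.cursor() as cur:
    cur.execute(
        "INSERT INTO fx_feed (pair, bid, ask, ts) "
        "VALUES (%s, %s, %s, systimestamp())",
        ("EURUSD", 1.0842, 1.0844),
    )

conn.close()
```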
Which web server is best for Python?
This one is tricky, as it really depends on what you will do with it. From your question I get the idea that you want to run a small user-facing web application that will also run some background jobs. For that scenario a suitable framework could be Flask, which also offers plugins for user/login management and for background tasks (using Redis as a dependency).
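As an illustrative sketch only (table and column names are assumptions; login and visualization are left out), a small Flask endpoint reading model results back out of QuestDB could look like this:

```python
# Illustrative only: a Flask endpoint that serves the latest model results
# stored in QuestDB. Table/column names and the host are placeholders.
import psycopg2
from flask import Flask, jsonify

app = Flask(__name__)


@app.route("/results")
def results():
    conn = psycopg2.connect(
        host="your-vps-hostname", port=8812,
        user="admin", password="quest", dbname="qdb",
    )
    try:
        with conn.cursor() as cur:
            cur.execute(
                "SELECT ts, pair, vol FROM model_output ORDER BY ts DESC LIMIT 100"
            )
            rows = cur.fetchall()
    finally:
        conn.close()
    return jsonify([{"ts": str(ts), "pair": pair, "vol": vol} for ts, pair, vol in rows])


if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000)
```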
Context: I'm in the process of designing an event-driven Python application. Various stakeholders have tasked me with investigating options for deploying the application using GraphQL endpoints within a serverless environment running on Azure Functions. The end goal is that, as the underlying data structures grow, we'd like to easily maintain the usability and performance of the application over time. Based on the resources below, it appears this is possible:
(https://azure.microsoft.com/en-us/resources/videos/build-2019-build-scalable-apis-using-graphql-and-serverless/)
(https://azure.microsoft.com/en-us/resources/videos/azure-friday-live-building-serverless-python-apps-with-azure-functions/)
(https://graphene-python.org/)
Question: User requirements dictate that the Azure Functions MUST be for internal use only and cannot be exposed publicly. Reading through the docs below, I haven't found any resources on security configuration options for private endpoints.
https://learn.microsoft.com/en-us/azure/azure-functions/
Private endpoint in Azure
Can someone please point me in the right direction? Are Azure Functions even capable of this? And if they aren't can this be achieved with an alternative like Azure App Service?
Azure Functions has multiple hosting options, including the Consumption plan, the Premium plan, and the App Service plan.
Of these, for complete VNET isolation, an App Service Environment is the only way to go as of now, since Private Endpoints for Azure Web Apps are currently in preview.
But note that Azure Functions can also be deployed into a Kubernetes cluster, which could be the better option if you already have a cluster to deploy to.
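On the GraphQL side of the question, a minimal sketch of wiring a Graphene schema into a Python Azure Function HTTP trigger could look like the following (the schema itself is purely illustrative; keeping the endpoint internal is handled by the hosting plan and networking, not by this code):

```python
# Illustrative only: a Python Azure Function HTTP trigger that executes a
# GraphQL query against a Graphene schema. The schema is a toy example.
import json

import azure.functions as func
import graphene


class Query(graphene.ObjectType):
    hello = graphene.String(name=graphene.String(default_value="world"))

    def resolve_hello(self, info, name):
        return f"Hello {name}"


schema = graphene.Schema(query=Query)


def main(req: func.HttpRequest) -> func.HttpResponse:
    # Expect a JSON body of the form {"query": "{ hello }"}
    try:
        query = req.get_json().get("query")
    except ValueError:
        return func.HttpResponse("Expected a JSON body", status_code=400)

    result = schema.execute(query)
    payload = {"data": result.data}
    if result.errors:
        payload["errors"] = [str(e) for e in result.errors]
    return func.HttpResponse(json.dumps(payload), mimetype="application/json")
```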
Is there a service (free or paid) that allows me to upload my Python program (it would need to support the libraries that I import) and just run it on the server end 24 hours non-stop?
The idea is that this Python program will keep running and relay information back to me.
Many thanks.
You might try Heroku. They have a free tier, and it integrates with Git repositories.
Yes, there are many platforms where you can upload your code to the cloud and run it non-stop.
AWS and GCP (Google Cloud Platform) are among the best-known services that provide these facilities.
For AWS, you can refer to this link: AWS free tier usage.
For GCP, you can refer to this link: GCP free tier usage.
Thanks.
There are many services, but I would recommend using a Linux VPS so you control your own server and can run as many Python programs as you want 24/7.
DigitalOcean provides a great option: a small VPS with good specifications for Python programs, priced as low as $5/month. You can choose your Linux distribution and datacenter region, and it takes only a few seconds to set up your server.
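To keep a script running around the clock on such a VPS, one simple pattern (illustrative only; the script path is a placeholder, and you would typically run this under tmux, screen, or a systemd unit) is a small watchdog that restarts the program whenever it exits:

```python
# Illustrative watchdog: restart the target script whenever it exits,
# so it effectively runs 24/7. The path below is a placeholder.
import subprocess
import time

while True:
    proc = subprocess.run(["python3", "/home/user/my_program.py"])
    print(f"script exited with code {proc.returncode}, restarting in 10s")
    time.sleep(10)
```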
Here is my referral link (Click Here); using my link will give you $10 in credits, and I will receive benefits as well.
Good luck!
I have an app coded in Python/pandas/SciPy that can be launched by anyone authorized. I want to use Google Cloud Platform to host it, but I can't find a good way to set this up.
Since I want my app to be a web app, part of it is hosted on Google App Engine, but since App Engine does not seem compatible with heavy computation and scientific libraries, I think I should send the task to a VM that has the scientific libraries installed.
My questions are:
1. Could I create a VM each time the app is launched, in order to save money? But if I do this, I would have to set up pandas, SciPy, etc. each time, which would take some time.
2. Am I condemned to having a VM running all the time, activated by the app? If two people launch the calculation at the same time, wouldn't performance suffer badly?
3. Should I package my app as a .exe and launch it as a standalone application?
I am totally lost on how to handle such an architecture; can anyone give me some advice?
Thank you!
You can host your application on Managed VMs. Applications that run on managed VMs are not subject to the restrictions imposed on the sandboxed runtimes (Java, Python, PHP, and Go).
You can also choose the hosting environment (sandboxed or managed VM) separately for each module in your application. This means an application can contain a mix of modules that use different hosting environments. For instance, you might use the sandbox environment for your client-facing frontend, and use a module running in a managed VM for backend processing.
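As an illustrative sketch only (the "compute" module name, app ID, and /run endpoint are assumptions, not anything App Engine prescribes), the split could look like a lightweight frontend handler that forwards the heavy pandas/SciPy job to a module running in a managed VM:

```python
# Illustrative only: a lightweight frontend handler that hands the heavy
# pandas/scipy computation off to a backend module running in a managed VM.
# The module name, app ID, and endpoint are placeholders; modules are
# addressed with the <module>-dot-<app-id>.appspot.com convention.
import requests
from flask import Flask, jsonify, request

app = Flask(__name__)
BACKEND_URL = "https://compute-dot-your-app-id.appspot.com/run"


@app.route("/launch", methods=["POST"])
def launch():
    # Forward the job parameters instead of importing pandas/scipy in the
    # sandboxed frontend; the backend module does the actual number crunching.
    resp = requests.post(BACKEND_URL, json=request.get_json(), timeout=300)
    return jsonify(resp.json()), resp.status_code
```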