How to deploy a python docker image to AWS Lambda? - python

I am trying to figure out how to deploy a flask application that I have received with a Dockerfile to AWS Lambda.
Locally, all I have to do to start the app is run docker-compose up. That works great.
But I don't know how to deploy this environment to AWS Lambda and tell it to run docker-compose up to launch the app.
Any help will be highly appreciated. Thanks.

It is not possible to run an arbitrary Docker image with AWS Lambda. Lambda does support container images, but only ones built around its runtime interface: it executes a function handler per invocation, not a long-running server, so a docker-compose setup will not translate directly.
You should use AWS ECS to run Docker containers from images.
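For context, this is a minimal sketch of what Lambda actually invokes - a handler function, not a compose stack (the function body here is purely illustrative):

```python
# handler.py - a minimal, illustrative Lambda-style handler.
# Lambda calls a function like this once per invocation; it does not
# start containers with docker-compose.
import json

def handler(event, context):
    # event carries the request payload; context carries runtime metadata
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": "hello from lambda"}),
    }
```

This handler-per-invocation model is why a Flask app started by docker-compose does not map onto Lambda as-is.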

Related

Changes are not being deployed to AWS console

I am deploying changes to the AWS console through the command
cdk deploy --all
Previously it worked well and created the stack on the AWS console, but now, after creating another stack, when I tried to run the same command cdk deploy all, rather than deploying code to AWS it just shows the following four usage lines:
Usage:
cdk [-vbo] [--toc] [--notransition] [--logo=<logo>] [--theme=<theme>] [--custom-css=<cssfile>] FILE
cdk --install-theme=<theme>
cdk --default-theme=<theme>
cdk --generate=<name>
Something changed in your environment and now cdk is pointing to the Courseware Development Kit instead of the aws-cdk.
You can confirm this by studying the output of which cdk.
To fix this, uninstall Courseware Development Kit or create a shell alias for it (after putting it further down in your $PATH).
Also, cdk deploy all is not the right command - you're looking for cdk deploy --all.
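If you'd rather check the PATH resolution from Python than from the shell, the stdlib equivalent of `which cdk` is `shutil.which` (purely illustrative):

```python
# Resolve which "cdk" executable is first on PATH - the Python
# equivalent of running `which cdk` in the shell.
import shutil

cdk_path = shutil.which("cdk")
print(cdk_path)  # None if no cdk is on PATH; otherwise the resolved binary
# If this points at the Courseware Development Kit's install location,
# that package is shadowing aws-cdk.
```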

How to deploy AWS using CDK, sagemaker?

I want to use this repo and I have created and activated a virtualenv and installed the required dependencies.
I get an error when I run pytest.
Under the file binance_cdk/app.py it describes the following steps:
App (PSVM method) entry point of the program.
Note:
Steps to set up CDK:
1. Install npm
2. cdk init (creates an empty project)
3. Add in your infrastructure code.
4. Run cdk synth
5. cdk bootstrap <aws_account>/
6. Run cdk deploy ---> this creates a CloudFormation template, and the AWS resources will be created as per the mentioned stack.
I'm stuck on step 3: what do I add as infrastructure code? And since I want to use this on Amazon SageMaker, which I am not familiar with, do I even bother doing this on my local terminal, or do I do the whole process on SageMaker?
Thank you in advance for your time and answers!
The infrastructure code is the Python code that defines the resources you want to provision. In the repo you linked, for example, the infra code creates a Lambda function. You can do this locally on your machine; the question is what you want to achieve with SageMaker. If you want to create an endpoint, then follow the CDK Python docs for SageMaker to identify the steps for creating an endpoint. Here are two guides: the first is an introduction to the AWS CDK and getting started; the second is an example of using the CDK with SageMaker to create an endpoint for inference.
CDK Python Starter: https://towardsdatascience.com/build-your-first-aws-cdk-project-18b1fee2ed2d
CDK SageMaker Example: https://github.com/philschmid/cdk-samples/tree/master/sagemaker-serverless-huggingface-endpoint
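To make step 3 concrete, the "infrastructure code" is just a CDK stack written in Python. Below is a minimal sketch (CDK v2, requires aws-cdk-lib installed; the stack name, function name, and asset folder are placeholders, not taken from the linked repo):

```python
# app.py - minimal CDK v2 infrastructure-code sketch (illustrative names).
# Assumes aws-cdk-lib and constructs are installed, and that ./lambda
# contains an index.py with a handler() function.
from aws_cdk import App, Stack, Duration
from aws_cdk import aws_lambda as _lambda
from constructs import Construct

class DemoStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)
        # One Lambda function, mirroring the kind of resource the linked repo creates
        _lambda.Function(
            self, "DemoFn",
            runtime=_lambda.Runtime.PYTHON_3_9,
            handler="index.handler",
            code=_lambda.Code.from_asset("lambda"),
            timeout=Duration.seconds(30),
        )

app = App()
DemoStack(app, "DemoStack")
app.synth()
```

Running cdk synth against a file like this is what produces the CloudFormation template mentioned in step 6.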

Deploy FastAPI microservice in Kubernetes via OpenFaaS

I have a big application structured with FastAPI (with many routers) that runs in AWS Lambda. I want to migrate it to a container inside Kubernetes. From my research, OpenFaaS looks like a great solution.
However, I can't find documentation about how to do this.
Does anyone have references or a better solution?
If you are using Python or Ruby, you can create a Dockerfile, use it to build a Docker image, and simply deploy that to Kubernetes. For example, for Ruby:
FROM ruby:2.7-alpine3.11
WORKDIR /home/app
COPY . .
RUN bundle install
CMD ["ruby", "main.rb"]
For OpenFaaS, they provide good labs with documentation for creating async functions etc.
Labs: https://github.com/openfaas/workshop
If you are looking for examples, you can also check out the official repo: https://github.com/openfaas/faas/tree/master/sample-functions
Extra
There are also other good options: Knative and Kubeless.
You can find a Python Kubeless example and a CI/CD example here: https://github.com/harsh4870/kubeless-kubernetes-ci-cd
Try using a template to build an upstream FastAPI application as an OpenFaaS function. This will create a Docker image you can run and deploy in your Kubernetes cluster.
You can see how to do so in the following github repo
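For reference, OpenFaaS's python3 template expects a handler module of roughly this shape; the template wraps it in a container image for you (the greeting logic here is illustrative, and wiring a full FastAPI app through it depends on the template you pick):

```python
# handler.py - the entry-point shape used by OpenFaaS's python3 template.
# OpenFaaS calls handle() once per invocation with the raw request body;
# whatever you return becomes the response body.
def handle(req):
    """Echo a greeting back; replace with your FastAPI dispatch logic."""
    return "Hello from OpenFaaS: " + req
```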

Azure Functions in VS Code

I am trying to run my Python code with Azure Functions using Visual Studio Code. I have run the "Hello World" project with Azure Functions in VS Code, but I am looking for a way to deploy my Python code with Azure Functions. I would really appreciate it if anyone can help me with this issue by introducing a relevant tutorial or sharing some ideas.
This is Azure for Python Developers.
If you want to deploy Python code to Azure, there are three options to choose from, depending on your situation:
Deploy a web app with VS Code
Deploy Docker containers to Azure App Service with Visual Studio Code
Create and deploy serverless Azure Functions in Python with Visual Studio Code
First, you need to create a function app on Azure. This is the container into which your function is deployed.
Then you have two ways to deploy:
1. Use VS Code: click the upload button, then select the folder you want to deploy, the subscription, and the function app you just created.
2. Use the command line: go to the root folder of your function and run this command:
func azure functionapp publish <FunctionAppName>
Function app names are globally unique, so if you have the authority, your folder will deploy directly to this function app.

How can I "upload and deploy" to Elastic Beanstalk from the command line?

To start, apologies if this is a foolish question - I'm relatively new to AWS and none of my googling has been fruitful thus far.
I have a server that I deploy through Elastic Beanstalk. Currently I do it fairly manually (docker build, push, zip the folder, and click "Upload and deploy" on the Elastic Beanstalk page with the zip I just made).
I'm writing a script to simplify this. Everything seems to be working, but I can't figure out how to do the upload-and-deploy step programmatically. Just to be very explicit, this is the button I'm referring to:
Is there a way I can do this bit using Python? Any help is appreciated.
You can use the EB CLI to deploy and manage your application:
Elastic Beanstalk Command Line Interface (EB CLI)
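Since the question specifically asks about Python: the "Upload and deploy" button corresponds to two Elastic Beanstalk API calls - registering an application version from a zip in S3, then pointing the environment at that version. A hedged sketch: the helpers below only build the request payloads (all names are placeholders); you would pass each to a boto3 elasticbeanstalk client, e.g. eb.create_application_version(**version_request(...)).

```python
# The two API calls behind "Upload and deploy":
#   1. create_application_version - register the zip you uploaded to S3
#   2. update_environment         - switch the environment to that version
# These helpers just assemble the request payloads; unpack them with **
# into boto3.client("elasticbeanstalk") calls (requires boto3 + AWS creds).

def version_request(app_name, version_label, bucket, key):
    return {
        "ApplicationName": app_name,
        "VersionLabel": version_label,
        "SourceBundle": {"S3Bucket": bucket, "S3Key": key},
    }

def deploy_request(env_name, version_label):
    return {
        "EnvironmentName": env_name,
        "VersionLabel": version_label,
    }

req = version_request("my-app", "v42", "my-deploy-bucket", "app-v42.zip")
print(req["SourceBundle"])
```

Using the EB CLI (`eb deploy`) hides both steps behind one command, which is why it is the simpler route if you don't need fine-grained control.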
