Can you call AWS CDK in python without CLI? - python

Is it possible to run the AWS CDK lifecycle directly in python (or any other language), without using the cli? e.g.
app.py
app = cdk.App()
Stack(app, ..)
app.synth()
app.deploy()
run python
python app.py  # instead of cdk deploy...
this could be useful to prepare temporary test scenarios.

What's possible as of today:
Programmatically synth templates without cdk synth?
Yes. CDK's testing constructs work this way. Tests assert against a programmatically synthesized template.
template = Template.from_stack(processor_stack)
template.resource_count_is("AWS::SNS::Subscription", 1)
Programmatically deploy apps without cdk deploy?
Not in the way the OP describes, but you can get close with the CDK's CI/CD constructs. The CDK Pipelines construct synths and deploys CDK apps programmatically. Build a pipeline that builds and tears down a testing environment on each push to your remote repository. Alternatively, a standalone CodeBuild project can be used with commands to cdk deploy an app and perform tests when triggered by an event.
Also worth mentioning is cdk watch for rapid local development iterations. It automatically deploys incremental changes as you code.

Related

How to deploy AWS using CDK, sagemaker?

I want to use this repo and I have created and activated a virtualenv and installed the required dependencies.
I get an error when I run pytest.
And under the file binance_cdk/app.py it describes the following tasks:
App (PSVM method) entry point of the program.
Note:
Steps to set up CDK:
install npm
cdk init (creates an empty project)
Add in your infrastructure code.
Run cdk synth
cdk bootstrap <aws_account>/
Run cdk deploy -> this creates a CloudFormation template file and the AWS resources will be created as per the mentioned stack.
I'm stuck on step 3, what do I add in this infrastructure code, and if I want to use this on amazon sagemaker which I am not familiar with, do I even bother doing this on my local terminal, or do I do the whole process regardless on sagemaker?
Thank you in advance for your time and answers !
The infrastructure code is the Python code you write for the resources you want to provision. In the example you linked, the infra code creates a Lambda function. You can do this locally on your machine; the question is what you want to achieve with SageMaker. If you want to create an endpoint, follow the CDK Python docs for SageMaker to identify the steps. Here are two guides: the first is an introduction to the AWS CDK and getting started; the second is an example of using the CDK with SageMaker to create an endpoint for inference.
CDK Python Starter: https://towardsdatascience.com/build-your-first-aws-cdk-project-18b1fee2ed2d
CDK SageMaker Example: https://github.com/philschmid/cdk-samples/tree/master/sagemaker-serverless-huggingface-endpoint

Remove CodePipeline Update from CloudFormation stack's deploy

I'm currently using CDK Synth in order to generate templates that I deploy with a sam deploy command.
The idea behind my work is to deploy my stacks from a GitLab runner, without using CodePipeline (which is the current deployment method).
I manage to deploy my stack as I want using the template located in cdk.out, after the cdk synth command. However, the deploy triggers the CodePipeline because the template contains a resource of "Type": "AWS::CodePipeline::Pipeline", which leads to a "Modify cdkpipeline AWS::CodePipeline::Pipeline" operation in the sam deploy command of the stack.
I want to remove this operation in the deploy section, so that the CodePipeline is not triggered.
I can't modify the code itself, because I need to keep both deployment methods alive.
I also can't use the --guided prompts from the sam deploy command because the scripts are executed from a GitLab runner.
Do you have any clue on achieving this, by passing parameters to the deploy (other libraries are fine with me if they work), or by using a tool that deletes the CodePipeline section from the templates? Maybe something exists as well to prevent cdk synth from generating a CodePipeline section?
I'm open to any proposal.
Thank you already
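One possible shape for the "tool that deletes the CodePipeline section" idea: since the synthesized template in cdk.out is plain JSON, a small post-processing step can filter it before sam deploy runs. A sketch only — the file path is an assumption (use the template name actually found in your cdk.out), and note that removing only the pipeline resource may leave dependent resources (roles, artifact buckets) that need pruning too:

```python
import json

def strip_pipelines(template: dict) -> dict:
    """Return a copy of a CloudFormation template without CodePipeline resources."""
    filtered = dict(template)
    filtered["Resources"] = {
        name: res
        for name, res in template.get("Resources", {}).items()
        if res.get("Type") != "AWS::CodePipeline::Pipeline"
    }
    return filtered

# Hypothetical usage in the GitLab job, path is an assumption:
# with open("cdk.out/MyStack.template.json") as f:
#     tpl = strip_pipelines(json.load(f))
# with open("cdk.out/MyStack.template.json", "w") as f:
#     json.dump(tpl, f, indent=2)
```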

Azure Function CI/CD deployment using GitHub loses important file in the .gitignore

Odd question, but one for which I have found no answers. A classic "middleman" (Git) screwing the consumer problem.
I have an Azure Function (python, timer trigger), that I can successfully deploy straight from VS Code. It's the usual file structure, etc. BUT, when I assign a deployment slot in the Portal (Deployment Center), connecting to GitHub (using this example), I lose an important .py file that I have listed in my .gitignore file. The .gitignore file is in the root of the project folder.
This file has some API keys, Slack hooks, etc. - so it is essential to my Function. Obviously, I do not want those keys exposed on my GitHub page, but I need them in the Azure Function.
Am I SOL, if I want to do a CI/CD pipeline?
E.g., I'm not always around my personal laptop with VS Code, and I might want to fiddle with the timer schedule straight from GitHub on the fly - and this CI/CD connection is the best way to make sure my Function keeps running (it's a python function on a linux consumption plan, so I can't just edit CRON from the portal).
When using VSCode for development you would normally store your secrets in the local.settings.json for local debugging and testing.
Code and test Azure Functions locally - Local Settings File
When you deploy from Visual Studio Code it publishes the local.settings.json file as app settings but since the file is in your .gitignore your github remote won't know about the file to publish it.
The app settings of your Function App can be found under the Configuration blade of your Function App. Any app settings can then be referenced in your code as env vars.
Azure Functions Python developer guide - Environment Variables
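Reading an app setting from the function code is just an environment-variable lookup — the same code works locally (values from local.settings.json) and in Azure (values from the Configuration blade). The setting names below are examples, not ones the question actually uses:

```python
import os
from typing import Optional

def get_setting(name: str, default: Optional[str] = None) -> Optional[str]:
    # App settings surface as plain environment variables at runtime;
    # locally, the Functions host loads them from local.settings.json.
    return os.environ.get(name, default)

# Example lookups; "MY_API_KEY" and "SLACK_WEBHOOK_URL" are illustrative names.
api_key = get_setting("MY_API_KEY")
slack_hook = get_setting("SLACK_WEBHOOK_URL", "")
```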
Since the local.settings.json file is usually added to the .gitignore file you would need to find another way to seed the app settings as part of your github actions CI/CD process.
You have a couple of options. You could create repo secrets within github, reference those at deployment time and write a script that deploys the secrets to your app settings at the same time as your function code.
Github Actions - Encrypted Secrets
Alternatively you can pull your app settings from a key vault so that you don't need to manage the secrets at all. As long as they are static and not dynamic, you can pre-populate the key vault and use key vault references in the app settings of your Azure function.
Use Key Vault references for App Service and Azure Functions
There is a github action which lets you specify your app settings as JSON, which I've not used but looks like it could be just what you're looking for.
https://github.com/marketplace/actions/azure-app-service-settings

Is there a way to test an application deployed with Zappa?

I have created a Django application which is deployed to AWS using Zappa in a CI/CD pipeline using the following commands:
- zappa update $ENVIRONMENT
- zappa manage $ENVIRONMENT "collectstatic --noinput"
- zappa manage $ENVIRONMENT "migrate"
However I want to add a test step to the CI/CD pipeline that would test the application much like in a way that I would test the application locally using the Django command:
python manage.py test
Is there a way to issue a zappa command to run python manage.py test? Do you have any suggestions?
Here are the docs for a Docker setup that you can use for local testing (docs here).
Also, this post from Ian Whitestone (a core Zappa contributor) uses the official AWS Docker image for the same job, and he gives an example of local testing as well (blog post).

How to deploy AWS python Lambda project locally?

I've got an AWS Python Lambda function which contains a few Python files and also several dependencies.
The app is built using Chalice, so the function will be mapped like any REST function.
Before deployment to the prod env, I want to test it locally, so I need to package this whole project (Python files and dependencies). I tried to look over the web for a solution but couldn't find one.
I managed to figure out how to deploy one Python file, but a whole project did not succeed.
Take a look at Atlassian's LocalStack: https://github.com/atlassian/localstack
It's a full copy of the AWS cloud stack, locally.
I use Travis: I hooked it to my master branch in Git, so that when I push to this branch, Travis tests my lambda with a script that uses pytest, after installing all its dependencies with pip install. If all the tests pass, it then deploys the lambda to AWS in my prod env.
