How does AWS know where my imports are? - python

I'm new to AWS Lambda and pretty new to Python.
I wanted to write a Python Lambda that uses the AWS API.
boto3 is the most popular Python module for this, so I wanted to include it.
Looking at examples online, I put import boto3 at the top of my Lambda and it just worked: I was able to use boto3 in my Lambda.
How does AWS know about boto3? It's a community module. Is there a list of supported modules for Lambdas? Does AWS cache its own copy of community modules?

AWS Lambda's Python environment comes pre-installed with boto3. Any other libraries you want need to be part of the zip you upload. You can install them locally with pip install whatever -t mysrcfolder.
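For example, a minimal handler like the sketch below deploys with no extra packaging, because the runtime already ships with boto3 and botocore (the bucket listing is only an illustration):

import boto3  # pre-installed in the Lambda Python runtime, nothing to bundle

s3 = boto3.client("s3")

def lambda_handler(event, context):
    # List bucket names just to show the pre-installed SDK works.
    buckets = s3.list_buckets()["Buckets"]
    return {"bucket_names": [b["Name"] for b in buckets]}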

The documentation seems to suggest boto3 is provided by default on AWS Lambda:
AWS Lambda includes the AWS SDK for Python (Boto 3), so you don't need to include it in your deployment package. However, if you want to use a version of Boto3 other than the one included by default, you can include it in your deployment package.
As far as I know, you will need to manually install any other dependencies in your deployment package, as shown in the linked documentation, using:
pip install foobar -t <project path>
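Once a dependency has been installed into the project folder this way, it sits next to your handler at the root of the zip and imports resolve normally. A small sketch, assuming a hypothetical dependency installed with pip install requests -t <project path>:

import requests  # bundled at the root of the deployment zip, not provided by Lambda

def lambda_handler(event, context):
    # Placeholder call, just to show the bundled package importing and working.
    response = requests.get("https://httpbin.org/get", timeout=5)
    return {"status": response.status_code}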

AWS Lambda includes the AWS SDK for Python (Boto 3), so you don't need to include it in your deployment package.
This link gives a little more in-depth info on the Lambda environment:
https://aws.amazon.com/blogs/compute/container-reuse-in-lambda/
And this one as well:
https://alestic.com/2014/12/aws-lambda-persistence/

Related

Run awscli from AWS Lambda

I want to execute the awscli for s3 sync from AWS Lambda.
When I use ./aws I get the following error:
('Status : FAIL', 127, 's3: ./aws: No such file or directory\n')
I even tried using the full path but still got an error:
('Status : FAIL', 127, 's3: /Library/Frameworks/Python.framework/Versions/2.7/bin/aws: No such file or directory\n')
I would recommend using the AWS SDK, which is included in AWS Lambda. With it you can access S3 and many other AWS services.
Take a look here for programming in Python.
The other SDKs can be found here.
Remember that you can only use the languages supported by Lambda.
The AWS CLI is not installed on AWS Lambda, so that won't work out of the box.
As the AWS CLI is just a Python package, you could upload it as part of your deployment package if you're using Python as the runtime.
Alternatively, use the Python boto3 SDK, which is already included in your Lambda environment. You can use it to upload to your S3 bucket; as far as I know there is no sync command, but it can do the same work as the aws s3 cp command from the CLI.
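A rough sketch of the aws s3 cp equivalent with boto3 (bucket and key names below are placeholders):

import boto3

s3 = boto3.client("s3")

def lambda_handler(event, context):
    # Equivalent of: aws s3 cp s3://source-bucket/data.csv s3://dest-bucket/data.csv
    # Bucket and key names are placeholders.
    s3.copy_object(
        Bucket="dest-bucket",
        Key="data.csv",
        CopySource={"Bucket": "source-bucket", "Key": "data.csv"},
    )
    return {"status": "copied"}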
I've used the AWS CLI in AWS Lambda by adding it as a Layer. You can add it as a layer by codifying the Lambda infrastructure using CDK constructs.
https://pypi.org/project/aws-cdk.lambda-layer-awscli/
In my use case, I wanted to run the aws s3 sync command.
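A minimal CDK sketch of that setup, written in CDK v2 style and assuming the AwsCliLayer construct from the package above; the stack, handler, and asset names are hypothetical, and inside the function you would shell out to the CLI binary the layer provides (check the layer's docs for the exact path, commonly /opt/awscli/aws):

from aws_cdk import Stack, aws_lambda as lambda_
from aws_cdk.lambda_layer_awscli import AwsCliLayer
from constructs import Construct

class S3SyncStack(Stack):  # hypothetical stack name
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Layer that bundles the AWS CLI so the function can call it.
        cli_layer = AwsCliLayer(self, "AwsCliLayer")

        lambda_.Function(
            self, "S3SyncFunction",  # hypothetical function name
            runtime=lambda_.Runtime.PYTHON_3_9,
            handler="sync.handler",                  # hypothetical module.function
            code=lambda_.Code.from_asset("lambda"),  # hypothetical source folder
            layers=[cli_layer],
        )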

How to use OAuth with jira-python and AWS Lambda

I wish to create issues in JIRA from an AWS Lambda function. I am getting stuck on the first couple of steps for OAuth where I need to use JIRA's AppLinks to pair with an application. Do I need to build a custom client? If so, how? If not, what do I need to do?
Any help is appreciated, thanks!
I found this question via Google while looking for an answer to the same problem. I'm adding an answer in case someone else stumbles upon it.
The jira Python package does not include cryptography, which you'll need to add as a pip requirement.
pycrypto - This is required for the RSA-SHA1 used by OAuth. Please note that it’s not installed automatically, since it’s a fairly cumbersome process in Windows. On Linux and OS X, a pip install pycrypto should do it.
https://jira.readthedocs.io/en/master/installation.html#dependencies
Depending on your platform, it may not be as simple as adding the above and deploying, because of native code requirements:
cryptography contains native code and that code is compiled for the architecture (and OS) of the current machine. The AWS lambda documentation currently appears to claim that zipping up your local directory will work, but that's patently untrue.
https://github.com/pyca/cryptography/issues/3051#issuecomment-234093766
To resolve this, you can compile your function's requirements on an EC2 instance running the Amazon Linux AMI or in a Docker container.
If you are using Serverless, you can use the serverless-python-requirements plugin, which is what I used to resolve this issue.
In your serverless.yml file you can specify compiling packages via docker.
custom:
  pythonRequirements:
    dockerizePip: true
found via https://www.npmjs.com/package/serverless-python-requirements#cross-compiling
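With the dependencies packaged, connecting from the Lambda with jira-python's OAuth support looks roughly like this (server URL, consumer key, tokens, and key file are all placeholders, and the OAuth dance that produces the access token and secret still has to be completed against your JIRA instance beforehand):

from jira import JIRA

def lambda_handler(event, context):
    # Private key used when setting up the Application Link (placeholder path).
    with open("rsa.pem") as f:
        key_cert = f.read()

    jira = JIRA(
        options={"server": "https://jira.example.com"},  # placeholder server
        oauth={
            "access_token": "YOUR_ACCESS_TOKEN",
            "access_token_secret": "YOUR_ACCESS_TOKEN_SECRET",
            "consumer_key": "YOUR_CONSUMER_KEY",
            "key_cert": key_cert,
        },
    )

    # Create a simple issue to confirm the connection works (placeholder fields).
    issue = jira.create_issue(
        project="PROJ",
        summary="Created from AWS Lambda",
        description="Test issue",
        issuetype={"name": "Task"},
    )
    return {"issue_key": issue.key}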

How to use the jwplatform API with Python

I am going to create a search API for Android and iOS developers.
Our client has set up a Lambda function in AWS.
Now we need to fetch data using the jwplatform API based on a search keyword passed as a parameter. For this, I have to install the jwplatform module in the Lambda function or upload a zip file of the code with its dependencies. So I want to run the Python script locally, and after getting the appropriate result I will upload the zip to AWS Lambda.
I want to use the videos/list class (jwplatform API) to search the video library using Python, but I don't know much about Python. So I want to know: how do I run the Python script, and where should I put it?
There are a handful of useful Python script examples here: https://github.com/jwplayer/jwplatform-py
I succeeded in installing the jwplatform module locally.
The steps are as follows:
1. Open a command line.
2. Check that Python is available (type 'python' on the command line).
3. Run 'pip install jwplatform'.
4. Now you can use the jwplatform API.
The above command installs the jwplatform module locally.
My next challenge was to install jwplatform in AWS Lambda.
After some research I succeeded in installing the module in AWS Lambda as well: I bundled the module and my code in one directory, created a zip of that bundle, and uploaded it to AWS Lambda. This makes the jwplatform module available to the Lambda function.
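A rough sketch of what the videos/list call can look like from the Lambda handler, assuming the jwplatform 1.x client with placeholder credentials (check your client version's docs for the exact method names):

import jwplatform  # bundled in the deployment zip

def lambda_handler(event, context):
    # Placeholder credentials; keep real ones in environment variables or a secret store.
    client = jwplatform.Client("YOUR_API_KEY", "YOUR_API_SECRET")

    # Search the video library for the keyword passed in the event.
    keyword = event.get("search", "")
    response = client.videos.list(search=keyword)

    return {"videos": response.get("videos", [])}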

How to deploy AWS python Lambda project locally?

I have an AWS Python Lambda function which contains a few Python files and also several dependencies.
The app is built using Chalice, so the function is mapped like any REST function.
Before deploying to the prod environment, I want to test it locally, so I need to package this whole project (Python files and dependencies). I tried to look over the web for a solution but couldn't find one.
I managed to figure out how to deploy one Python file, but not a whole project.
Take a look at Atlassian's LocalStack: https://github.com/atlassian/localstack
It's a fully functional local copy of the AWS cloud stack.
I use Travis: I hooked it to my master branch in git, so that when I push to this branch, Travis tests my Lambda with a pytest script, after installing all its dependencies with pip install. If all the tests pass, it then deploys the Lambda to AWS in my prod env.
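A minimal sketch of that kind of local test, assuming a hypothetical module app.py exposing a plain lambda_handler (a Chalice app would need Chalice's own test tooling instead):

# test_handler.py - run locally or in CI with: pytest
from app import lambda_handler  # hypothetical module and handler name

def test_handler_returns_a_dict():
    # Fake event and context; the real shape depends on how the function is triggered.
    event = {"path": "/ping"}
    result = lambda_handler(event, None)
    assert isinstance(result, dict)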

How to create AWS Lambda deployment package that uses Couchbase Python client

I'm trying to use AWS Lambda to transfer data from my S3 bucket to a Couchbase server, and I'm writing in Python, so I need to import the couchbase module in my Python script. Usually, if a script uses external modules, I pip install those modules locally, zip the modules and script together, and upload that to Lambda. But this doesn't work this time. The reason is that the Python client for Couchbase depends on the Couchbase C client, libcouchbase. So I'm not clear what I should do. When I simply add in the C client package (with that said, I have 6 package folders in my deployment package; the first 5 are the ones installed when I run "pip install couchbase": couchbase, acouchbase, gcouchbase, txcouchbase, couchbase-2.1.0.dist-info; and the last one is the Couchbase C client I installed: libcouchbase), Lambda doesn't work and says:
"Unable to import module 'lambda_function': libcouchbase.so.2: cannot open shared object file: No such file or directory"
Any idea how I can get this to work? Thanks a lot.
The following two things worked for me:
1. Manually copy /usr/lib64/libcouchbase.so.2 into your project folder and zip it with your code before uploading to AWS Lambda.
2. Use Python 2.7 as the runtime on the AWS Lambda console to connect to Couchbase.
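Assuming the copied libcouchbase.so.2 loads correctly, using the Couchbase Python SDK 2.x from the handler looks roughly like this (host, bucket, and document values are placeholders):

from couchbase.bucket import Bucket  # Couchbase Python SDK 2.x

def lambda_handler(event, context):
    # Placeholder connection string and bucket name.
    bucket = Bucket("couchbase://my-couchbase-host/my-bucket")

    # Store a document built from the incoming event (placeholder shape).
    doc_id = event.get("id", "example-doc")
    bucket.upsert(doc_id, {"source": "s3", "payload": event.get("payload")})

    return {"stored": doc_id}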
AWS Lambda does not support C-based Python modules like the Couchbase SDK out of the box; the native library has to be compiled for, and bundled with, the Lambda (Amazon Linux) environment.
Otherwise, your best bet is to use a pure-Python client. The easiest way to do this is the unofficial memcached client https://github.com/couchbase/couchbase-cli/blob/master/cb_bin_client.py, which uses server-side moxi to handle memcached clients on port 11211.
