AWS Lambda AccessDeniedException when calling another Lambda function - Python

In my project I created a Lambda function in Python that, in one method, has to call another Lambda function using boto3.
In my main Lambda I create the client like this:
client = boto3.client('lambda')
Then I invoke the target function like this:
response = client.invoke(
    FunctionName='arn:aws:lambda:eu-west-1:1577:function:test',
    InvocationType='RequestResponse',
    LogType='None',
    Payload=json.dumps(d)
)
But when I test my main Lambda, the console returns this error:
An error occurred (AccessDeniedException) when calling the Invoke operation: User
I tried setting AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY in my environment variables, but when I try to save, this error is returned:
Lambda was unable to configure your environment variables because the environment variables you have provided contains reserved keys that are currently not supported for modification. Reserved keys used in this request: AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY
How can I make this call from Lambda using an IAM user?
Thanks in advance.

Instead of using an IAM user, attach the Lambda invoke permission to the existing IAM role (the execution role) of your parent Lambda function:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "InvokePermission",
            "Effect": "Allow",
            "Action": [
                "lambda:InvokeFunction"
            ],
            "Resource": "*"
        }
    ]
}
Note: You can specify the ARN of the Lambda function that is being invoked for the Resource.

If possible, restrict the scope so the caller can invoke only your target function, rather than using the "*" resource, which allows it to invoke any Lambda function:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "InvokePermission",
            "Effect": "Allow",
            "Action": [
                "lambda:InvokeFunction"
            ],
            "Resource": "arn:aws:lambda:eu-west-1:1577:function:test"
        }
    ]
}
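If you prefer to attach this policy programmatically rather than through the console, here is a minimal boto3 sketch; the role and policy names are assumptions, so substitute the actual execution role of your parent Lambda.

import json
import boto3

iam = boto3.client('iam')

invoke_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "InvokePermission",
            "Effect": "Allow",
            "Action": ["lambda:InvokeFunction"],
            "Resource": "arn:aws:lambda:eu-west-1:1577:function:test"
        }
    ]
}

# Attach the policy inline to the execution role; equivalent to pasting
# it into the console.
iam.put_role_policy(
    RoleName='my-parent-lambda-role',  # hypothetical role name
    PolicyName='InvokeTargetLambda',
    PolicyDocument=json.dumps(invoke_policy)
)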

Related

AWS CLI Can List S3 Bucket But Access Denied For Python Lambda

I've used Terraform to set up the infrastructure for an S3 bucket and my containerised Lambda. I want to trigger the Lambda to list the items in my S3 bucket. When I run the AWS CLI it's fine:
aws s3 ls
returns
2022-11-08 23:04:19 bucket-name
This is my lambda:
import logging
import boto3

LOGGER = logging.getLogger(__name__)
LOGGER.setLevel(logging.DEBUG)

s3 = boto3.resource('s3')

def lambda_handler(event, context):
    LOGGER.info('Executing function...')
    bucket = s3.Bucket('bucket-name')
    total_objects = 0
    for i in bucket.objects.all():
        total_objects = total_objects + 1
    return {'total_objects': total_objects}
When I run the test in the AWS console, I'm getting this:
[ERROR] ClientError: An error occurred (AccessDenied) when calling the ListObjects operation: Access Denied
No idea why this is happening. These are my Terraform Lambda policies, roles, and the S3 setup:
resource "aws_s3_bucket" "statements_bucket" {
bucket = "bucket-name"
acl = "private"
}
resource "aws_s3_object" "object" {
bucket = aws_s3_bucket.statements_bucket.id
key = "excel/"
}
resource "aws_iam_role" "lambda" {
name = "${local.prefix}-lambda-role"
path = "/"
assume_role_policy = <<EOF
{
"Version": "2012-10-17",
"Statement": [
{
"Action": "sts:AssumeRole",
"Principal": {
"Service": "lambda.amazonaws.com"
},
"Effect": "Allow"
}
]
}
EOF
}
resource "aws_iam_policy" "lambda" {
name = "${local.prefix}-lambda-policy"
description = "S3 specified access"
policy = <<EOF
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Action": [
"s3:ListBucket"
],
"Resource": [
"arn:aws:s3:::bucket-name"
]
},
{
"Effect": "Allow",
"Action": [
"s3:PutObject",
"s3:GetObject",
"s3:DeleteObject"
],
"Resource": [
"arn:aws:s3:::bucket-name/*"
]
}
]
}
EOF
}
As far as I can tell, your Lambda function has the correct IAM role (the one indicated in your Terraform template) but that IAM role has no attached policies.
You need to attach the S3 policy, and any other IAM policies needed, to the IAM role. For example:
resource "aws_iam_role_policy_attachment" "lambda-attach" {
role = aws_iam_role.role.name
policy_arn = aws_iam_policy.policy.arn
}
In order to run aws s3 ls, you would need to authorize the action s3:ListAllMyBuckets. This is because aws s3 ls lists all of your buckets.
You should be able to list the contents of your bucket using aws s3 ls s3://bucket-name. However, you're probably going to have to add "arn:aws:s3:::bucket-name/*" to the resource list for your role as well. Edit: never mind, s3:ListBucket on the bucket ARN is sufficient for listing.
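To make the distinction concrete, here is a small boto3 sketch; the bucket name is the placeholder from the question.

import boto3

s3 = boto3.client('s3')

# Requires s3:ListAllMyBuckets -- this is what a bare `aws s3 ls` calls.
for bucket in s3.list_buckets()['Buckets']:
    print(bucket['Name'])

# Requires only s3:ListBucket on arn:aws:s3:::bucket-name -- this is the
# call that bucket.objects.all() makes under the hood.
paginator = s3.get_paginator('list_objects_v2')
for page in paginator.paginate(Bucket='bucket-name'):
    for obj in page.get('Contents', []):
        print(obj['Key'])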

AWS S3 PutObject Access denied problem in Python

I'm trying to upload an image to AWS S3. This code previously worked fine (and still works for another project). This is a brand new project with a new AWS S3 bucket. I noticed AWS has again changed a lot, and maybe that's the problem.
This is the code:
s3_client.upload_fileobj(
    uploaded_file,
    files_bucket_name,
    key_name,
    ExtraArgs={
        'ContentType': uploaded_file.content_type
    }
)
This is the permission policy for the bucket:
{
    "Version": "2012-10-17",
    "Id": "Policy1204",
    "Statement": [
        {
            "Sid": "Stmt15612",
            "Effect": "Allow",
            "Principal": "*",
            "Action": [
                "s3:GetObject",
                "s3:PutObject"
            ],
            "Resource": "arn:aws:s3:::bucket-name/*"
        }
    ]
}
The upload did not work until I added "s3:PutObject" here, but it was working for another project without it. What I don't like about this policy is that PutObject is now publicly available.
How can I make it so that:
all images are publicly available
but only the owner can upload files?
These are screenshots of the AWS permissions for this bucket:
The problem went away as soon as I created an IAM user and granted it full access to S3. Not sure if this solution is good or not, but at least it's working now.
It appears that your requirement is:
Allow everyone to see the files
Only allow an owner to upload them
There is a nuance in "seeing the files": ListObjects allows listing the objects in a bucket, while GetObject allows downloading an object.
If you want to make all objects available for download assuming that the user knows the name of the object, then you could use a policy like this:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": "*",
            "Action": [
                "s3:GetObject"
            ],
            "Resource": "arn:aws:s3:::bucket-name/*"
        }
    ]
}
Note that this policy will not permit viewing the contents of the bucket.
If you wish to allow a specific IAM User permission to upload files to the bucket, then put this policy on the IAM User (not on the Bucket):
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:PutObject"
            ],
            "Resource": "arn:aws:s3:::bucket-name/*"
        }
    ]
}
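A minimal sketch of the upload side, assuming the IAM user's credentials are available as a local profile; the profile, bucket, and key names are placeholders.

import boto3

session = boto3.Session(profile_name='uploader-user')  # hypothetical profile
s3_client = session.client('s3')

with open('image.png', 'rb') as f:
    s3_client.upload_fileobj(
        f,
        'bucket-name',
        'images/image.png',
        ExtraArgs={'ContentType': 'image/png'}
    )

# With the public GetObject bucket policy above, anyone can now fetch
# https://bucket-name.s3.amazonaws.com/images/image.png without credentials.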

Getting AccessDeniedException from Lambda function when calling AWS SSO Permission set

Following is my Python code to add/update an inline policy for an AWS SSO permission set:
import json

# In the actual code this was a single string full of escape characters;
# building a dict and serialising it with json.dumps is equivalent.
Inline_Policy = json.dumps({
    "Version": "2012-10-17",
    "Statement": [
        {
            "Action": [
                "s3:Get*",
                "s3:List*"
            ],
            "Effect": "Allow",
            "Resource": "*"
        }
    ]
})
response = client.put_inline_policy_to_permission_set(
    InstanceArn='arn:aws:sso:::instance/ssoins-sssss',
    PermissionSetArn='arn:aws:sso:::permissionSet/ssoins-sssss/ps-sssss',
    InlinePolicy=Inline_Policy
)
I am getting the error:
"errorMessage": "An error occurred (AccessDeniedException) when calling the PutInlinePolicyToPermissionSet operation: User: arn:aws:sts::ddddddd:assumed-role/Modify_Permission_Set-role-ssss/Modify_Permission_Set is not authorized to perform: sso:PutInlinePolicyToPermissionSet on resource: arn:aws:sso:::permissionSet/ssoins-sssss/ps-sssss"
I tried adding the Admin policy for the Lambda role executing the function and I still get permission denied.
Is there a different way to handle SSO permission sets than regular IAM permissions?
Admin Policy attached to Lambda
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "*",
            "Resource": "*"
        }
    ]
}
Have you checked whether there is a Service Control Policy (SCP) that denies access to SSO and applies to your account or Organizational Unit (OU)? https://docs.aws.amazon.com/organizations/latest/userguide/orgs_manage_policies_scps.html
If you have ensured that the policy and permissions are correct, it is likely due to your region.
Make sure you are creating the SSO client in the region where your SSO or Identity Center instance is activated,
e.g. for Python: sso = boto3.client('sso-admin', region_name='deployed_sso_region')
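Putting it together, a minimal sketch, assuming Identity Center is deployed in eu-west-1 (a placeholder region) and reusing the redacted ARNs from the question:

import json
import boto3

# Pin the client to the region where Identity Center is activated.
sso = boto3.client('sso-admin', region_name='eu-west-1')

inline_policy = json.dumps({
    "Version": "2012-10-17",
    "Statement": [
        {"Action": ["s3:Get*", "s3:List*"], "Effect": "Allow", "Resource": "*"}
    ]
})

response = sso.put_inline_policy_to_permission_set(
    InstanceArn='arn:aws:sso:::instance/ssoins-sssss',
    PermissionSetArn='arn:aws:sso:::permissionSet/ssoins-sssss/ps-sssss',
    InlinePolicy=inline_policy
)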

Invoke step function in account A from a lambda in account B using cdk

I have a Lambda stack that is deployed in account A, and a Step Functions stack deployed in account B. How do I invoke this step function from the Lambda using the Python CDK? Specifically, what permissions do I need to give them?
The Lambda (Account A) has an IAM role (Role A) assigned. The step function (Account B) has an IAM role (Role B) assigned.
Permissions
The Lambda's IAM role (Role A) should have permission to assume the role from Account B:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "sts:AssumeRole",
            "Resource": "*"
        }
    ]
}
The step function's IAM role (Role B) should have a trust policy that allows the Lambda's IAM role to assume it. In the following trust policy, 123456789012 is the account number of Account A:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "AWS": [
                    "arn:aws:iam::123456789012:root"
                ]
            },
            "Action": "sts:AssumeRole"
        }
    ]
}
Inside the Lambda
The Lambda should have code that assumes the role (Role B) from Account B and gets temporary credentials.
Using those credentials, the Lambda should invoke the step function, as sketched after the link below.
How to assume an IAM role in a different account from lambda
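A minimal sketch of those two steps, assuming placeholder ARNs and region for the Account B state machine:

import boto3

sts = boto3.client('sts')

# Step 1: assume Role B in Account B to get temporary credentials.
creds = sts.assume_role(
    RoleArn='arn:aws:iam::ACCOUNT_B_ID:role/RoleB',  # placeholder ARN
    RoleSessionName='cross-account-sfn'
)['Credentials']

# Step 2: use those credentials to start the state machine execution.
sfn = boto3.client(
    'stepfunctions',
    aws_access_key_id=creds['AccessKeyId'],
    aws_secret_access_key=creds['SecretAccessKey'],
    aws_session_token=creds['SessionToken'],
    region_name='eu-west-1'  # region of the Account B state machine
)

sfn.start_execution(
    stateMachineArn='arn:aws:states:eu-west-1:ACCOUNT_B_ID:stateMachine:MyStateMachine',
    input='{}'
)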
If you want to trigger something when a CDK deployment happens: it seems a bizarre use case, but I think the solution here is to define a custom resource.
Both CDK and CloudFormation support that:
https://docs.aws.amazon.com/cdk/api/latest/docs/custom-resources-readme.html
https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/template-custom-resources.html
Because you are using a custom resource, you need to handle Create, Update, and Delete events yourself and send completion responses. I would advise you to use the cfnresponse module to send back completion responses; otherwise CDK will never be able to tell when your custom resource function completed:
https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/cfn-lambda-function-code-cfnresponsemodule.html
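A minimal sketch of such a handler, with the deployment-time work left as a placeholder:

import cfnresponse

def lambda_handler(event, context):
    try:
        if event['RequestType'] in ('Create', 'Update'):
            pass  # do your deployment-time work here
        # Nothing to clean up on Delete in this sketch.
        cfnresponse.send(event, context, cfnresponse.SUCCESS, {})
    except Exception as e:
        # Report failure so the stack does not hang waiting for a response.
        cfnresponse.send(event, context, cfnresponse.FAILED, {'Error': str(e)})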

How to assume an AWS role from another AWS role?

I have two AWS accounts - let's say A and B.
In account B, I have a role defined that allows access from a role in account A. Let's call it Role-B:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::********:role/RoleA"
            },
            "Action": "sts:AssumeRole"
        }
    ]
}
In account A, I have defined a role that allows the root user to assume it. Let's call it Role-A:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::********:root"
            },
            "Action": "sts:AssumeRole"
        }
    ]
}
Role A has the following policy attached to it:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Action": "sts:AssumeRole",
            "Resource": "arn:aws:iam::****:role/RoleB",
            "Effect": "Allow"
        }
    ]
}
As a user in account A, I assumed Role-A. Now, using those temporary credentials, I want to assume Role-B and access the resources owned by account B. I have the code below:
client = boto3.client('sts')

firewall_role_object = client.assume_role(
    RoleArn=INTERMEDIARY_IAM_ROLE_ARN,
    RoleSessionName=str("default"),
    DurationSeconds=3600)
firewall_credentials = firewall_role_object['Credentials']

firewall_client = boto3.client(
    'sts',
    aws_access_key_id=firewall_credentials['AccessKeyId'],
    aws_secret_access_key=firewall_credentials['SecretAccessKey'],
    aws_session_token=firewall_credentials['SessionToken'],
)

optimizely_role_object = firewall_client.assume_role(
    RoleArn=CUSTOMER_IAM_ROLE_ARN,
    RoleSessionName=str("default"),
    DurationSeconds=3600)

print(optimizely_role_object['Credentials'])
This code works for the set of roles I got from my client, but is not working for the roles I defined between the two AWS accounts I have access to.
Finally got this working. The above configuration is correct. There was a spelling mistake in the policy.
I will keep this question here as it may help someone who wants to achieve double-hop authentication using roles.
