ValidationException when injecting variables into boto3 - Python

I am passing variables from one function to another, where the final step is to use them as part of a boto3 call.
I am getting the needed variable and converting it to a str type.
I checked the type of the output I'm getting, and also tried changing it to Unicode (I'm familiar with the difference between str and Unicode),
checked with hexdump to see if there are any hidden characters,
tried saving it to a file and reading it back,
compared the string I am generating with the one from the AWS console - they are the same!
tried changing the formatting method of the variables ({}.format vs %s),
tried updating the AWS CLI / Python versions.
The annoying part is that when I use the string from the AWS console, the boto3 call works perfectly (so it is not a permissions issue).
def describe_table():
    response = client.list_tables()
    tablename = response['TableNames']
    for n in tablename:
        if 'tname' in n:
            print n
            return n
def tag_dynamo(AID):
    AID = '111222333'
    response = client.tag_resource(
        ResourceArn="arn:aws:dynamodb:region:%s:table/%s" % (AID, n),
        Tags=[
            {
                'Key': 'Key_name',
                'Value': 'value_state'
            }
        ]
    )
This is the error I'm getting:
botocore.exceptions.ClientError: An error occurred (ValidationException) when calling the
TagResource operation: Invalid TableArn: Invalid ResourceArn provided as input
arn:aws:dynamodb:region:AID:table/TABLE_NAME
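For comparison, a minimal sketch of building the ARN from explicit parameters instead of relying on variables from another function's scope (the region, account ID, and table name below are placeholders, and `build_arn` is an illustrative helper, not part of boto3):

```python
# Build the ARN from values passed in explicitly; each component must
# already be a plain str with no hidden characters.
def build_arn(region, account_id, table_name):
    return "arn:aws:dynamodb:%s:%s:table/%s" % (region, account_id, table_name)

arn = build_arn("us-east-1", "111222333", "tname")
print(arn)  # arn:aws:dynamodb:us-east-1:111222333:table/tname
```

Printing the ARN right before the `tag_resource` call makes it easy to spot when a literal placeholder like `AID` has leaked into the string, which is what the error message above shows.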


Keeping quotes from stdout for passing to bash

Okay, this is a bit convoluted, but I've got a Python script that digests a JSON file and prints a string representation of that file like so:
for id in pwds.keys():
    secret += f"\'{id}\' : \'{pwds[id]['username']},{pwds[id]['pswd']}\',"
secret = secret[:-1] + "}\'"
print(secret)
This is taken in by a Jenkins pipeline so it can be passed to a bash script:
def secret_string = sh (script: "python3 syncToSecrets.py", returnStdout: true)
sh label: 'SYNC', script: "bash sync.sh ${ENVIRONMENT} ${secret_string}"
I can see that when Python prints the output, it looks like
'{"key" : "value", "key" : "value"...}'
But when it gets to secret_string, and then to the bash script, it looks like
{key : value, key : value}
This is how the bash script calls it:
ENV=$1; SECRET_STRING=$2;
aws secretsmanager create-secret --name NAME --secret-string "${SECRET_STRING}"
Which technically works; it just uploads the whole thing as one string instead of discrete KV-pairs. I'm trying to run some things with the AWS CLI, and it requires the data to be wrapped in quotes, but so far I've been totally unable to keep the quotes intact between processes. Any advice?
Sample pwds dict data:
import json

pwds = {
    'id001': {
        'username': 'user001',
        'pswd': 'pwd123'
    },
    'id002': {
        'username': 'user002',
        'pswd': 'pwd123'
    }
}
As suggested by SuperStormer, it's better to use Python types (dict, list, etc.) and let json.dumps build the JSON instead of concatenating it yourself.
secrets = [{id: f"{val['username']}, {val['pswd']}"} for id, val in pwds.items()]
json.dumps(secrets)
'[{"id001": "user001, pwd123"}, {"id002": "user002, pwd123"}]'
The JSON string should be usable within Jenkins script blocks.
Try experimenting with single quotes or --secret-string file://secrets.json as alternatives.
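A minimal sketch of the file:// alternative (the file name secrets.json is just an example): build the secret as a real dict, serialize it once with json.dump, and point the CLI at the file so the shell never gets a chance to strip the quotes.

```python
import json

pwds = {
    'id001': {'username': 'user001', 'pswd': 'pwd123'},
    'id002': {'username': 'user002', 'pswd': 'pwd123'},
}

# Build the secret as a dict and serialize it once; quoting is then
# handled by json.dump, not by shell interpolation.
secret = {id_: f"{val['username']},{val['pswd']}" for id_, val in pwds.items()}
with open('secrets.json', 'w') as f:
    json.dump(secret, f)

# The CLI can then read it directly, bypassing the quoting problem:
#   aws secretsmanager create-secret --name NAME --secret-string file://secrets.json
```

This sidesteps the Jenkins-to-bash handoff entirely: the quotes live inside the file, not on the command line.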

Python MySQL Query Not working (Something went wrong format requires a mapping)

I am trying to pull a query from my database and I am receiving this error when trying to run it: "Something went wrong: format requires a mapping".
I'm using flask in Python and pymysql.
This is my class method that is throwing the error:
@classmethod
def get_dojo(cls, data):
    query = 'SELECT * FROM dojos WHERE id = %(id)s;'
    result = connectToMySQL('dojos_and_ninjas').query_db(query, data)
    return cls(result[0])
I thought it might be the data I am passing through but it looks good to me, and the query runs fine in workbench. I tried restarting MySQL, VS Code, and restarting the pipenv.
The data I am passing is:
@app.route('/dojo/<int:id>')
def dojo_page(id):
    dojo_current = Dojo.get_dojo(id)
    return render_template('dojo_page.html', dojo = dojo_current)
My page renders and I receive no error when I enter an id manually instead of passing the variable in.
I figured it out: I needed to pass a data dictionary from the route.
@app.route('/dojo/<int:id>')
def dojo_page(id):
    data = {
        'id': id
    }
    dojo_current = Dojo.get_dojo(data)
    return render_template('dojo_page.html', dojo = dojo_current)
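The underlying error is plain Python string formatting, independent of pymysql: a %(name)s placeholder needs a mapping on the right-hand side of %, not a bare value. A minimal reproduction:

```python
query = 'SELECT * FROM dojos WHERE id = %(id)s;'

# Passing a bare int is what raises "format requires a mapping".
try:
    query % 5
except TypeError as e:
    print(e)  # format requires a mapping

# Passing a dict works, which is why the route needed {'id': id}.
print(query % {'id': 5})  # SELECT * FROM dojos WHERE id = 5;
```

pymysql applies the same %(id)s substitution internally (with proper escaping), so it raises the same TypeError when the parameter is an int instead of a dict.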

S3 InvalidDigest when calling the PutObject operation [duplicate]

I have tried to upload an XML file to S3 using boto3. As recommended by Amazon, I would like to send a base64-encoded, 128-bit MD5 digest (Content-MD5) of the data.
https://docs.aws.amazon.com/AmazonS3/latest/API/RESTObjectPUT.html
https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/s3.html#S3.Object.put
My Code:
with open(file, 'rb') as tempfile:
    body = tempfile.read()
hash_object = hashlib.md5(body)
base64_md5 = base64.encodebytes(hash_object.digest())
response = s3.Object(self.bucket, self.key + file).put(
    Body=body.decode(self.encoding),
    ACL='private',
    Metadata=metadata,
    ContentType=self.content_type,
    ContentEncoding=self.encoding,
    ContentMD5=str(base64_md5)
)
When I try this, str(base64_md5) creates a string like 'b'ZpL06Osuws3qFQJ8ktdBOw==\n''.
In this case, I get this Error Message:
An error occurred (InvalidDigest) when calling the PutObject operation: The Content-MD5 you specified was invalid.
For test purposes I copied only the value without the b prefix: 'ZpL06Osuws3qFQJ8ktdBOw==\n'
Then i get this Error Message:
botocore.exceptions.HTTPClientError: An HTTP Client raised an unhandled exception: Invalid header value b'hvUe19qHj7rMbwOWVPEv6Q==\n'
Can anyone help me how to save Upload a File to S3?
Thanks,
Oliver
Starting with @Isaac Fife's example, stripping it down to identify what's required vs. not, and including imports to make it a fully reproducible example:
(the only change you need to make is to use your own bucket name)
import base64
import hashlib

import boto3

contents = "hello world!"
md = hashlib.md5(contents.encode('utf-8')).digest()
contents_md5 = base64.b64encode(md).decode('utf-8')

boto3.client('s3').put_object(
    Bucket="mybucket",
    Key="test",
    Body=contents,
    ContentMD5=contents_md5
)
Learnings: first, the MD5 that S3 expects will NOT look like what md.hexdigest() returns. hex is base16, which is not base64; you need the base64 encoding of the raw binary digest.
(Python 3.7)
Took me hours to figure this out because the only error you get is "The Content-MD5 you specified was invalid." Super useful for debugging... Anyway, here is the code I used to actually get the file to upload correctly before refactoring.
from botocore.config import Config  # json_converter and md5 below are the poster's own helper modules
import boto3

json_results = json_converter.convert_to_json(result)
json_results_utf8 = json_results.encode('utf-8')
content_md5 = md5.get_content_md5(json_results_utf8)
content_md5_string = content_md5.decode('utf-8')
metadata = {
    "md5chksum": content_md5_string
}
s3 = boto3.resource('s3', config=Config(signature_version='s3v4'))
obj = s3.Object(bucket, 'filename.json')
obj.put(
    Body=json_results_utf8,
    ContentMD5=content_md5_string,
    ServerSideEncryption='aws:kms',
    Metadata=metadata,
    SSEKMSKeyId=key_id)
and the hashing
def get_content_md5(data):
    digest = hashlib.md5(data).digest()
    return base64.b64encode(digest)
The hard part for me was figuring out what encoding you need at each step in the process, not being very familiar at the time with how strings are stored in Python.
get_content_md5 takes a utf-8 encoded bytes-like object and returns bytes. But to pass the MD5 hash to AWS, it needs to be a string, so you have to decode it before you give it to ContentMD5.
Pro-tip: Body, on the other hand, needs to be bytes or a seekable object. If you pass a seekable object, make sure you seek(0) to the beginning of the file before you pass it to AWS, or the MD5 will not match. For that reason, using bytes is less error-prone, imo.

[snowflake python connector] How to use bindings inside a string

I want to use data binding when executing SQL.
I want to bind a value in the middle of a string, but it doesn't work.
I tried the following, but all attempts resulted in execution errors.
Python
param = {
    "env": "dev",
    "s3_credential": "secret"
}
cursor().execute(sql, param)
sql1
CREATE OR REPLACE STAGE my_s3_stage_demo
URL='s3://my-stage-demo-'%(env)s'/tmp/'
credentials = (aws_role = %(s3_credential)s )
FILE_FORMAT = ( TYPE=JSON);
error message1
snowflake.connector.errors.ProgrammingError: 091006 (22000): Bucket name 'my-stage-demo-'dev'' in the stage location is not supported. Valid bucket names must consist of lowercase letters, digits, hyphens '-', and periods '.'.
SQL2
CREATE OR REPLACE STAGE my_s3_stage_demo
URL='s3://my-stage-demo-%(env)s/tmp/'
credentials = (aws_role = %(s3_credential)s )
FILE_FORMAT = ( TYPE=JSON);
error message2
snowflake.connector.errors.ProgrammingError: 001003 (42000): SQL compilation error:
syntax error line 2 at position 32 unexpected ''/tmp/''.
I want to execute the binding result as follows, but how should I specify it?
CREATE OR REPLACE STAGE my_s3_stage_demo
URL='s3://my-stage-demo-dev/tmp/'
credentials = (aws_role = "secret" )
FILE_FORMAT = ( TYPE=JSON);
You cannot bind substrings, only complete syntactical elements.
In Python, you can do something like:
cursor.execute(
    "SELECT t.*, 'P'||:2 p2 FROM IDENTIFIER(:1) t",
    [['"my_db"."my_schema"."my_table"', '2. parameter']]
)
You can only use parts of a value where expressions are allowed (like 'P'||:2 above). There is also some provision for identifiers like the table name above using IDENTIFIER().
Unfortunately, for the CREATE OR REPLACE STAGE command there seems to be no support for expressions (e.g. for an S3 bucket URL), or for bind variables at all.
Which means you have to use replacement for the SQL text, not variable binding.
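A minimal sketch of that text-replacement approach (the function and variable names are illustrative, not part of the connector API). Since plain string formatting reintroduces injection risk, whitelist-validate the values before splicing them into the SQL text:

```python
import re

def render_stage_sql(env, s3_credential):
    # Bind variables are unsupported in this position, so splice the
    # values into the SQL text, but validate them against a whitelist
    # of S3-bucket-safe characters first.
    if not re.fullmatch(r'[a-z0-9.-]+', env):
        raise ValueError('invalid env: %r' % env)
    return (
        "CREATE OR REPLACE STAGE my_s3_stage_demo\n"
        f"URL='s3://my-stage-demo-{env}/tmp/'\n"
        f"credentials = (aws_role = '{s3_credential}' )\n"
        "FILE_FORMAT = ( TYPE=JSON);"
    )

sql = render_stage_sql('dev', 'secret')
# cursor().execute(sql)  # no bind parameters needed
```

The validation step matters because, unlike real bind variables, f-string substitution offers no protection against malformed input.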

Python throwing error when trying to pull order data via Shopify API

I am trying to play with this simple Python script to pull order data from my Shopify admin, but keep getting this error message (it seems to come from line 6, per terminal and Sublime Text): TypeError: not all arguments converted during string formatting.
Here is the script,
import shopify
API_KEY = 'xxxxxxxxxxxxxxx'
PASSWORD = 'xxxxxxxxxxx'
SHOP_NAME = 'Shop name goes here'
shop_url = "https://xxxxxxxxxxxxx@xxxxxxxxxx.myshopify.com/admin" % (API_KEY, PASSWORD, SHOP_NAME)
shopify.ShopifyResource.set_site(shop_url)
shop = shopify.Shop.current()
order = shopify.Order()
num = order.count()
print num
success = order.find()
print success
order.save()
print success
I am at a loss for what I am doing wrong and have tried changing line 6 every which way, as this is apparently where the error comes from (according to terminal/Sublime Text). Any input is appreciated; I am a complete newbie to Python.
Thanks!
shop_url = "https://xxxxxxxxxxxxx@xxxxxxxxxx.myshopify.com/admin" % (API_KEY, PASSWORD, SHOP_NAME)
Replace the above line with:
shop_url = "https://xxxxxxxxxxxxx@xxxxxxxxxx.myshopify.com/admin/%s%s%s" % (API_KEY, PASSWORD, SHOP_NAME)
The correct way to use traditional string formatting with the '%' operator is a printf-style format string (see the Python documentation on printf-style string formatting):
"'%s' is longer than '%s'" % (name1, name2)
However, the '%' operator will probably be deprecated in the future. The new PEP 3101 way of doing things is like this:
"'{0}' is longer than '{1}'".format(name1, name2)
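The original error is easy to reproduce: the question's URL string contains no %s placeholders at all, so every argument in the tuple is left unconverted. A minimal sketch:

```python
# A format string with no placeholders cannot consume any arguments.
shop_url = "https://example.myshopify.com/admin"  # no %s anywhere
try:
    shop_url % ('API_KEY', 'PASSWORD', 'SHOP_NAME')
except TypeError as e:
    print(e)  # not all arguments converted during string formatting

# With matching placeholders the same operation succeeds.
print("https://%s:%s@%s.myshopify.com/admin" % ('key', 'pwd', 'shop'))
# https://key:pwd@shop.myshopify.com/admin
```

The number of %s placeholders must match the number of items in the tuple exactly: too few arguments raises a different TypeError, too many raises the one shown above.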
The line shop_url = "..."
should be in this format:
shop_url = "https://%s:%s@SHOP_NAME" % (API_KEY, PASSWORD)
You are passing API_KEY and PASSWORD into the %s placeholders to build the URL.
