'S3' object has no attribute 'get_object_lock_configuration' - python

I'm trying to implement the Object Lock feature, but the functions (get_object_lock_configuration / put_object_lock_configuration) are not available:
>>> import boto3
>>> boto3.__version__
'1.17.64'
>>> client = boto3.client('s3')
>>> client.get_object_lock_configuration
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/lib/python3.6/site-packages/botocore/client.py", line 553, in __getattr__
self.__class__.__name__, item)
AttributeError: 'S3' object has no attribute 'get_object_lock_configuration'
>>> client.get_object_lock_configuration(Bucket='tst', ExpectedBucketOwner='tst')
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/lib/python3.6/site-packages/botocore/client.py", line 553, in __getattr__
self.__class__.__name__, item)
AttributeError: 'S3' object has no attribute 'get_object_lock_configuration'
Edit:
The object lock functions also don't show up in Python tab completion:
>>> client.get_object_
client.get_object_acl( client.get_object_tagging( client.get_object_torrent(
>>> client.put_object
client.put_object( client.put_object_acl( client.put_object_tagging(

get_object_lock_configuration is a method, not a property. You need to call it:
response = client.get_object_lock_configuration(
    Bucket='string',
    ExpectedBucketOwner='string'
)
The syntax to call client.put_object_lock_configuration is:
response = client.put_object_lock_configuration(
    Bucket='string',
    ObjectLockConfiguration={
        'ObjectLockEnabled': 'Enabled',
        'Rule': {
            'DefaultRetention': {
                'Mode': 'GOVERNANCE'|'COMPLIANCE',
                'Days': 123,
                'Years': 123
            }
        }
    },
    RequestPayer='requester',
    Token='string',
    ContentMD5='string',
    ExpectedBucketOwner='string'
)
To know more please refer to this: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/s3.html#S3.Client.put_object_lock_configuration
Edit:
Example code:
import boto3

client = boto3.client('s3')
response = client.get_object_lock_configuration(Bucket='anynewname')
print(response)
Output syntax:
{
    'ObjectLockConfiguration': {
        'ObjectLockEnabled': 'Enabled',
        'Rule': {
            'DefaultRetention': {
                'Mode': 'GOVERNANCE'|'COMPLIANCE',
                'Days': 123,
                'Years': 123
            }
        }
    }
}
Note: it will raise an error if an Object Lock configuration has not been set on the bucket.
{
    "errorMessage": "An error occurred (ObjectLockConfigurationNotFoundError) when calling the GetObjectLockConfiguration operation: Object Lock configuration does not exist for this bucket",
    "errorType": "ClientError",
    "stackTrace": [
        " File \"/var/task/lambda_function.py\", line 7, in lambda_handler\n response = client.get_object_lock_configuration(\n",
        " File \"/var/runtime/botocore/client.py\", line 357, in _api_call\n return self._make_api_call(operation_name, kwargs)\n",
        " File \"/var/runtime/botocore/client.py\", line 676, in _make_api_call\n raise error_class(parsed_response, operation_name)\n"
    ]
}
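If you want to treat a missing lock configuration as "no lock" instead of crashing, you can branch on the error code. A minimal self-contained sketch; the ClientError class below is only a stand-in for botocore.exceptions.ClientError (in real code, import the real one and pass a real boto3 client):

```python
class ClientError(Exception):
    """Stand-in for botocore.exceptions.ClientError, for illustration only."""
    def __init__(self, error_response, operation_name):
        super().__init__(error_response["Error"]["Message"])
        self.response = error_response
        self.operation_name = operation_name

def get_lock_config(client, bucket):
    """Return the lock configuration response, or None if the bucket has none."""
    try:
        return client.get_object_lock_configuration(Bucket=bucket)
    except ClientError as e:
        if e.response["Error"]["Code"] == "ObjectLockConfigurationNotFoundError":
            return None  # no Object Lock configured on this bucket
        raise  # any other error is unexpected, re-raise it

# Fake client standing in for boto3.client('s3'), simulating the error above:
class FakeS3:
    def get_object_lock_configuration(self, Bucket):
        raise ClientError(
            {"Error": {"Code": "ObjectLockConfigurationNotFoundError",
                       "Message": "Object Lock configuration does not exist"}},
            "GetObjectLockConfiguration",
        )

print(get_lock_config(FakeS3(), "tst"))  # None
```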

Related

Error 'rlp: expected List' when calling a smart contract function using web3py

I'm trying to call a function on a smart contract deployed on SmartBCH.
This is the function ABI:
{
"inputs": [],
"name": "startStake",
"outputs": [],
"stateMutability": "nonpayable",
"type": "function"
}
This is the Python code:
import json

from web3 import Web3

w3 = Web3(Web3.HTTPProvider('https://smartbch.greyh.at'))
if not w3.isConnected():
    w3 = Web3(Web3.HTTPProvider('https://smartbch.fountainhead.cash/mainnet'))

def start_celery_stake():
    import server_settings
    ABI = open("ABIs/CLY-ABI.json", "r")  # ABI for CLY token
    abi = json.loads(ABI.read())
    contract = w3.eth.contract(address="0x7642Df81b5BEAeEb331cc5A104bd13Ba68c34B91", abi=abi)
    nonce = w3.eth.get_transaction_count(portfolio_address)
    stake_cly_tx = contract.functions.startStake().buildTransaction(
        {'chainId': 10000, 'gas': 64243,
         'maxFeePerGas': w3.toWei('2', 'gwei'),
         'maxPriorityFeePerGas': w3.toWei('2', 'gwei'),
         'nonce': nonce})
    private_key = server_settings.PORTFOLIO_PRIV_KEY
    signed_txn = w3.eth.account.sign_transaction(stake_cly_tx, private_key=private_key)
    w3.eth.send_raw_transaction(signed_txn.rawTransaction)
The private key is stored as a string in server_settings.PORTFOLIO_PRIV_KEY.
The error I got is:
Traceback (most recent call last):
File "/usr/lib/python3.8/code.py", line 90, in runcode
exec(code, self.locals)
File "<input>", line 13, in <module>
File "/home/administrador/Descargas/BCH/transparency_portal/venv/lib/python3.8/site-packages/web3/eth.py", line 722, in send_raw_transaction
return self._send_raw_transaction(transaction)
File "/home/administrador/Descargas/BCH/transparency_portal/venv/lib/python3.8/site-packages/web3/module.py", line 57, in caller
result = w3.manager.request_blocking(method_str,
File "/home/administrador/Descargas/BCH/transparency_portal/venv/lib/python3.8/site-packages/web3/manager.py", line 198, in request_blocking
return self.formatted_response(response,
File "/home/administrador/Descargas/BCH/transparency_portal/venv/lib/python3.8/site-packages/web3/manager.py", line 171, in formatted_response
raise ValueError(response["error"])
ValueError: {'code': -32000, 'message': 'rlp: expected List'}
This is the raw transaction, which I got when calling signed_txn.rawTransaction:
HexBytes('0x02f87182271081fa8477359400847735940082faf3947642df81b5beaeeb331cc5a104bd13ba68c34b91808428e9d35bc080a0c5570eba5692b8beb1e1dd58907ab709f35409f95daddc8bf568fcfcbf1a4320a02250b01810c2f801fb7afec9ca3f24ffea84869f42c3c91e2c8df245af8bc2b7')
According to an Ethereum tx decoder, this raw transaction is not correct, so perhaps something isn't formatted properly.
This is the solution:
stake_cly_tx = contract.functions.startStake().buildTransaction(
    {'chainId': 10000,
     'gas': 108287,
     'gasPrice': w3.toWei('1.05', 'gwei'),
     'nonce': nonce})
The problem came from the 'maxFeePerGas' and 'maxPriorityFeePerGas' parameters. Those are EIP-1559 (typed-transaction) fields, and this node evidently only accepts legacy transactions: a type-2 transaction is an opaque byte string prefixed with 0x02 (note the raw transaction above starts with 0x02), which a legacy RLP decoder rejects with 'rlp: expected List'. Switching to the legacy 'gasPrice' field fixes it.
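The fix boils down to replacing the EIP-1559 fee fields with a single legacy gasPrice field. A hypothetical helper (not part of web3.py; the field names match the transaction dicts above) that performs that rewrite:

```python
def to_legacy_fees(tx, gas_price_wei):
    """Strip EIP-1559 fee fields and set a legacy gasPrice instead.

    Useful for chains (like SmartBCH here) whose nodes only accept
    legacy RLP-list transactions. `tx` is a plain transaction dict.
    """
    legacy = {k: v for k, v in tx.items()
              if k not in ("maxFeePerGas", "maxPriorityFeePerGas")}
    legacy["gasPrice"] = gas_price_wei
    return legacy

# Example transaction dict with EIP-1559 fees (values taken from the question):
tx = {"chainId": 10000, "gas": 108287, "nonce": 7,
      "maxFeePerGas": 2_000_000_000, "maxPriorityFeePerGas": 2_000_000_000}
print(to_legacy_fees(tx, 1_050_000_000))  # 1.05 gwei, in wei
```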

jsonb join not working properly in sqlalchemy

I have a query that joins on a jsonb-type column in Postgres, which I want to convert to SQLAlchemy in Django using the aldjemy package:
SELECT anon_1.key AS tag, count(anon_1.value ->> 'polarity') AS count_1, anon_1.value ->> 'polarity' AS anon_2
FROM feedback f
JOIN tagging t ON t.feedback_id = f.id
JOIN jsonb_each(t.json_content -> 'entityMap') AS anon_3 ON true
JOIN jsonb_each(((anon_3.value -> 'data') - 'selectionState') - 'segment') AS anon_1 ON true
where f.id = 2
GROUP BY anon_1.value ->> 'polarity', anon_1.key;
The json_content field stores data in the following format:
{
"entityMap":
{
"0":
{
"data":
{
"people":
{
"labelId": 5,
"polarity": "positive"
},
"segment": "a small segment",
"selectionState":
{
"focusKey": "9xrre",
"hasFocus": true,
"anchorKey": "9xrre",
"isBackward": false,
"focusOffset": 75,
"anchorOffset": 3
}
},
"type": "TAG",
"mutability": "IMMUTABLE"
},
"1":
{
"data":
{
"product":
{
"labelId": 6,
"polarity": "positive"
},
"segment": "another segment",
"selectionState":
{
"focusKey": "9xrre",
"hasFocus": true,
"anchorKey": "9xrre",
"isBackward": false,
"focusOffset": 138,
"anchorOffset": 79
}
},
"type": "TAG",
"mutability": "IMMUTABLE"
}
}
}
I wrote the following sqlalchemy code to achieve the query
first_alias = aliased(func.jsonb_each(Tagging.sa.json_content["entityMap"]))
print(first_alias)
second_alias = aliased(
    func.jsonb_each(
        first_alias.c.value.op("->")("data")
        .op("-")("selectionState")
        .op("-")("segment")
    )
)
polarity = second_alias.c.value.op("->>")("polarity")
p_tag = second_alias.c.key
_count = (
    Feedback.sa.query()
    .join(
        CampaignQuestion,
        CampaignQuestion.sa.question_id == Feedback.sa.question_id,
        isouter=True,
    )
    .join(Tagging)
    .join(first_alias, true())
    .join(second_alias, true())
    .filter(CampaignQuestion.sa.campaign_id == campaign_id)
    .with_entities(p_tag.label("p_tag"), func.count(polarity), polarity)
    .group_by(polarity, p_tag)
    .all()
)
print(_count)
but it gives me a NotImplementedError: Operator 'getitem' is not supported on this expression error when accessing first_alias.c.
the stack trace:
Traceback (most recent call last):
File "/home/.cache/pypoetry/virtualenvs/api-FPSaTdE5-py3.8/lib/python3.8/site-packages/rest_framework/views.py", line 506, in dispatch
response = handler(request, *args, **kwargs)
File "/home/work/api/app/campaign/views.py", line 119, in results_p_tags
d = campaign_service.get_p_tag_count_for_campaign_results(id)
File "/home/work/api/app/campaign/services/campaign.py", line 177, in get_p_tag_count_for_campaign_results
return campaign_selectors.get_p_tag_counts_for_campaign(campaign_id)
File "/home/work/api/app/campaign/selectors.py", line 196, in get_p_tag_counts_for_campaign
polarity = second_alias.c.value.op("->>")("polarity")
File "/home/.cache/pypoetry/virtualenvs/api-FPSaTdE5-py3.8/lib/python3.8/site-packages/sqlalchemy/util/langhelpers.py", line 1093, in __get__
obj.__dict__[self.__name__] = result = self.fget(obj)
File "/home/.cache/pypoetry/virtualenvs/api-FPSaTdE5-py3.8/lib/python3.8/site-packages/sqlalchemy/sql/selectable.py", line 746, in columns
self._populate_column_collection()
File "/home/.cache/pypoetry/virtualenvs/api-FPSaTdE5-py3.8/lib/python3.8/site-packages/sqlalchemy/sql/selectable.py", line 1617, in _populate_column_collection
self.element._generate_fromclause_column_proxies(self)
File "/home/.cache/pypoetry/virtualenvs/api-FPSaTdE5-py3.8/lib/python3.8/site-packages/sqlalchemy/sql/selectable.py", line 703, in _generate_fromclause_column_proxies
fromclause._columns._populate_separate_keys(
File "/home/.cache/pypoetry/virtualenvs/api-FPSaTdE5-py3.8/lib/python3.8/site-packages/sqlalchemy/sql/base.py", line 1216, in _populate_separate_keys
self._colset.update(c for k, c in self._collection)
File "/home/.cache/pypoetry/virtualenvs/api-FPSaTdE5-py3.8/lib/python3.8/site-packages/sqlalchemy/sql/base.py", line 1216, in <genexpr>
self._colset.update(c for k, c in self._collection)
File "/home/.cache/pypoetry/virtualenvs/api-FPSaTdE5-py3.8/lib/python3.8/site-packages/sqlalchemy/sql/operators.py", line 434, in __getitem__
return self.operate(getitem, index)
File "/home/.cache/pypoetry/virtualenvs/api-FPSaTdE5-py3.8/lib/python3.8/site-packages/sqlalchemy/sql/elements.py", line 831, in operate
return op(self.comparator, *other, **kwargs)
File "/home/.cache/pypoetry/virtualenvs/api-FPSaTdE5-py3.8/lib/python3.8/site-packages/sqlalchemy/sql/operators.py", line 434, in __getitem__
return self.operate(getitem, index)
File "/home/.cache/pypoetry/virtualenvs/api-FPSaTdE5-py3.8/lib/python3.8/site-packages/sqlalchemy/sql/type_api.py", line 75, in operate
return o[0](self.expr, op, *(other + o[1:]), **kwargs)
File "/home/.cache/pypoetry/virtualenvs/api-FPSaTdE5-py3.8/lib/python3.8/site-packages/sqlalchemy/sql/default_comparator.py", line 173, in _getitem_impl
_unsupported_impl(expr, op, other, **kw)
File "/home/.cache/pypoetry/virtualenvs/api-FPSaTdE5-py3.8/lib/python3.8/site-packages/sqlalchemy/sql/default_comparator.py", line 177, in _unsupported_impl
raise NotImplementedError(
NotImplementedError: Operator 'getitem' is not supported on this expression
Any help would be greatly appreciated.
PS: the SQLAlchemy version I'm using for this is 1.4.6.
I used the same SQLAlchemy query expression before in a Flask project using SQLAlchemy version 1.3.22, and it was working correctly.
I fixed the issue by using table-valued functions, as mentioned in the docs, and by accessing the ColumnCollection of the function by index instead of by key. The code is as follows:
first_alias = func.jsonb_each(Tagging.sa.json_content["entityMap"]).table_valued(
    "key", "value"
)
second_alias = func.jsonb_each(
    first_alias.c[1].op("->")("data").op("-")("selectionState").op("-")("segment")
).table_valued("key", "value")
polarity = second_alias.c[1].op("->>")("polarity")
p_tag = second_alias.c[0]
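You can see what table_valued produces without touching a database by compiling a minimal statement. A sketch assuming SQLAlchemy 1.4+; the table/column name `t.json_content` is a made-up placeholder standing in for the mapped column above:

```python
from sqlalchemy import func, literal_column, select

# jsonb_each(...) as a table-valued function with two named columns.
jb = func.jsonb_each(literal_column("t.json_content")).table_valued("key", "value")

# Columns are reachable by index (jb.c[0], jb.c[1]) as well as by name.
stmt = select(jb.c[0], jb.c[1].op("->>")("polarity"))

# Compiling with the default dialect shows the generated FROM clause.
print(stmt)
```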

How do you create an AdWords BigQuery transfer and transfer runs using the bigquery_datatransfer Python client?

I have been able to successfully authenticate, and list transfers and transfer runs. But I keep running into the issue of not being able to create a transfer because the transfer config is incorrect.
Here's the Transfer Config I have tried:
transferConfig = {
    'data_refresh_window_days': 1,
    'data_source_id': "adwords",
    'destination_dataset_id': "AdwordsMCC",
    'disabled': False,
    'display_name': "TestR",
    'name': "TestR",
    'schedule': "every day 07:00",
    'params': {
        "customer_id": "999999999"  # changed number
    }
}
response = client.create_transfer_config(parent, transferConfig)
print(response)
And this is the error I get:
Traceback (most recent call last):
File "./create_transfer.py", line 84, in <module>
main()
File "./create_transfer.py", line 61, in main
response = client.create_transfer_config(parent, transferConfig)
File "/usr/local/Cellar/python3/3.6.2/Frameworks/Python.framework/Versions/3.6/lib/python3.6/site-packages/google/cloud/bigquery_datatransfer_v1/gapic/data_transfer_service_client.py", line 438, in create_transfer_config
authorization_code=authorization_code)
ValueError: Protocol message Struct has no "customer_id" field.
I managed to set up a data transfer through the API by defining the params as a google.protobuf.struct_pb2.Struct.
Try whether adding the following works for you:
from google.protobuf.struct_pb2 import Struct
params = Struct()
params["customer_id"] = "999999999"
And then changing your transferConfig to:
transferConfig = {
    'data_refresh_window_days': 1,
    'data_source_id': "adwords",
    'destination_dataset_id': "AdwordsMCC",
    'disabled': False,
    'display_name': "TestR",
    'name': "TestR",
    'schedule': "every day 07:00",
    'params': params
}
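For reference, here is the params/config assembly end to end. A sketch assuming the protobuf package is installed; the customer id and dataset names are placeholders:

```python
from google.protobuf.struct_pb2 import Struct

# The data-transfer client validates `params` as a protobuf Struct,
# which is why a plain dict raises the "Struct has no field" error above.
params = Struct()
params["customer_id"] = "999999999"  # placeholder customer id

transfer_config = {
    "data_refresh_window_days": 1,
    "data_source_id": "adwords",
    "destination_dataset_id": "AdwordsMCC",
    "disabled": False,
    "display_name": "TestR",
    "name": "TestR",
    "schedule": "every day 07:00",
    "params": params,
}

# Struct supports dict-style access, so the value reads back as set.
print(transfer_config["params"]["customer_id"])
```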

JIRA not recognizing dictionary object in cURL call

This is the error I am currently getting while using the jira-python module to automate some logging to JIRA:
Traceback (most recent call last):
File "/home/csd-user/test/libs/utils/butler.py", line 217, in <module>
main()
File "/home/csd-user/test/libs/utils/butler.py", line 214, in main
b.log_bug_exec(url)
File "/home/csd-user/test/libs/utils/butler.py", line 47, in log_bug_exec
cls.process_file(stdout)
File "/home/csd-user/test/libs/utils/butler.py", line 108, in process_file
cls.submit_bug(bug)
File "/home/csd-user/test/libs/utils/butler.py", line 207, in submit_bug
iss = cls.my_server.create_issue(fields=bug.json_dict['fields'])
File "/opt/clearsky/lib/python2.7/site-packages/jira/client.py", line 706, in create_issue
r = self._session.post(url, data=json.dumps(data))
File "/opt/clearsky/lib/python2.7/site-packages/jira/resilientsession.py", line 81, in post
return self.__verb('POST', url, **kwargs)
File "/opt/clearsky/lib/python2.7/site-packages/jira/resilientsession.py", line 74, in __verb
raise_on_error(r, verb=verb, **kwargs)
File "/opt/clearsky/lib/python2.7/site-packages/jira/utils.py", line 120, in raise_on_error
r.status_code, error, r.url, request=request, response=r, **kwargs)
# This is the important part...
jira.utils.JIRAError: JiraError HTTP 400
text: data was not an object
url: https://jira.clearsky-data.net/rest/api/2/issue
My problem is that, as far as I can see, the dict object that I am passing is perfectly valid.
BUG FIELDS :: {'environment': 'node => 62-qa-driver12 (M3 - HA Only)\nversion => \nurl => https://jenkins.clearsky-data.net/job/BugLoggerTest/144/\ntimestamp => 2015-06-29_11-11-15\njob name => BugLoggerTest\nbuild number => 144\nversion number => Not present. Check git hash. Maybe add in processing of full failure list!\n', 'description': '', 'summary': 'Fill in Something', 'project': {'key': 'QABL'}, 'assignee': 'qa-auto', 'issuetype': {'name': 'Bug'}, 'priority': {'name': 'Major'}}
CLASS :: <type 'dict'>
It is formatted by this code:
# creating JSON object (bug should not have to be changed after initialization)
self.json_dict = ({"fields": {
    "project": {'key': self.project},
    "issuetype": {'name': self.issue_type},
    "priority": {'name': self.priority},
    "assignee": self.assignee,
    "environment": self.environment,
    "description": self.description,
    "summary": self.summary}})
This is the call to create the issue where the error is being thrown:
iss = cls.my_server.create_issue(fields=bug.json_dict['fields'])
# This calls a cURL POST or PUT command.
Your assignee is not what JIRA expects:
"assignee": self.assignee,
should read
"assignee": {'name': self.assignee}
I saw it in the update example:
issue.update(summary='new summary', description='A new summary was added')
issue.update(assignee={'name': 'new_user'})
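The fix above can be folded into the dict construction itself. A minimal sketch (the function name and all field values are placeholders, not part of jira-python) showing the shape JIRA expects, with assignee wrapped in a {'name': ...} object like the other sub-objects:

```python
def build_issue_fields(project, issue_type, priority, assignee,
                       environment, description, summary):
    """Build the 'fields' payload for jira's create_issue().

    JIRA expects assignee (like project/issuetype/priority) to be an
    object, not a bare string.
    """
    return {
        "project": {"key": project},
        "issuetype": {"name": issue_type},
        "priority": {"name": priority},
        "assignee": {"name": assignee},  # the fix: wrap the name in an object
        "environment": environment,
        "description": description,
        "summary": summary,
    }

fields = build_issue_fields("QABL", "Bug", "Major", "qa-auto",
                            "node => 62-qa-driver12", "", "Fill in Something")
print(fields["assignee"])  # {'name': 'qa-auto'}
```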

Elastic Search: pyes.exceptions.IndexMissingException exception from search result

This is a question about Elastic-Search python API (pyes).
I ran a very simple test case through curl, and everything seems to work as expected.
Here is the description of the curl test case:
The only document that exists in ES is:
curl 'http://localhost:9200/test/index1' -d '{"page_text":"This is the text that was found on the page!"}'
Then I search ES for all documents in which the word "found" exists. The result seems to be OK:
curl 'http://localhost:9200/test/index1/_search?q=page_text:found&pretty=true'
{
"took" : 1,
"timed_out" : false,
"_shards" : {
"total" : 5,
"successful" : 5,
"failed" : 0
},
"hits" : {
"total" : 1,
"max_score" : 0.15342641,
"hits" : [ {
"_index" : "test",
"_type" : "index1",
"_id" : "uaxRHpQZSpuicawk69Ouwg",
"_score" : 0.15342641, "_source" : {"page_text":"This is the text that was found on the page!"}
} ]
}
}
However, when I run the same query through the Python 2.7 API (pyes), something goes wrong:
>>> import pyes
>>> conn = pyes.ES('localhost:9200')
>>> result = conn.search({"page_text":"found"}, index="index1")
>>> print result
<pyes.es.ResultSet object at 0xd43e50>
>>> result.count()
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/usr/local/pythonbrew/pythons/Python-2.7.3/lib/python2.7/site-packages/pyes/es.py", line 1717, in count
return self.total
File "/usr/local/pythonbrew/pythons/Python-2.7.3/lib/python2.7/site-packages/pyes/es.py", line 1686, in total
self._do_search()
File "/usr/local/pythonbrew/pythons/Python-2.7.3/lib/python2.7/site-packages/pyes/es.py", line 1646, in _do_search
doc_types=self.doc_types, **self.query_params)
File "/usr/local/pythonbrew/pythons/Python-2.7.3/lib/python2.7/site-packages/pyes/es.py", line 1381, in search_raw
return self._query_call("_search", body, indices, doc_types, **query_params)
File "/usr/local/pythonbrew/pythons/Python-2.7.3/lib/python2.7/site-packages/pyes/es.py", line 622, in _query_call
return self._send_request('GET', path, body, params=querystring_args)
File "/usr/local/pythonbrew/pythons/Python-2.7.3/lib/python2.7/site-packages/pyes/es.py", line 603, in _send_request
raise_if_error(response.status, decoded)
File "/usr/local/pythonbrew/pythons/Python-2.7.3/lib/python2.7/site-packages/pyes/convert_errors.py", line 83, in raise_if_error
raise excClass(msg, status, result, request)
pyes.exceptions.IndexMissingException: [_all] missing
As you can see, pyes returns the result object, but for some reason I can't even get the number of results from it.
Does anyone have a guess as to what may be wrong here?
Thanks a lot in advance!
The name of the parameter changed: it's no longer called index, it's called indices, and it's a list:
>>> result = conn.search({"page_text":"found"}, indices=["index1"])