I am trying to verify the signed event webhook from SendGrid:
https://docs.sendgrid.com/for-developers/tracking-events/getting-started-event-webhook-security-features
from sendgrid.helpers.eventwebhook import EventWebhook, EventWebhookHeader

def is_valid_signature(request):
    # event_webhook_signature = request.META['HTTP_X_TWILIO_EMAIL_EVENT_WEBHOOK_SIGNATURE']
    # event_webhook_timestamp = request.META['HTTP_X_TWILIO_EMAIL_EVENT_WEBHOOK_TIMESTAMP']
    event_webhook = EventWebhook()
    key = settings.SENDGRID_HEADER
    ec_public_key = event_webhook.convert_public_key_to_ecdsa(key)
    text = json.dumps(str(request.body))
    return event_webhook.verify_signature(
        text,
        request.headers[EventWebhookHeader.SIGNATURE],
        request.headers[EventWebhookHeader.TIMESTAMP],
        ec_public_key
    )
When I send a test example from SendGrid, it always returns False. I compared the keys and they are correct, so I think the problem is the syntax of the payload:
"b[{\"email\":\"example#test.com\",\"timestamp\":1648560198,\"smtp-id\":\"\\\\u003c14c5d75ce93.dfd.64b469#ismtpd-555\\\\u003e\",\"event\":\"processed\",\"category\":[\"cat facts\"],\"sg_event_id\":\"G6NRn4zC5sGxoV2Hoz7gpw==\",\"sg_message_id\":\"14c5d75ce93.dfd.64b469.filter0001.16648.5515E0B88.0\"},{other tests},\\r\\n]\\r\\n"
I think the issue is that you are calling:
text = json.dumps(str(request.body))
json.dumps serializes an object to a JSON formatted string, but str(request.body) is already a string.
Try just
text = str(request.body)
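For instance, a quick shell check (with a made-up payload) shows how json.dumps adds quotes and escaping around the already-stringified body, so the text being verified no longer matches the raw bytes SendGrid signed:

import json

raw = b'[{"email":"example@test.com","event":"processed"}]'  # made-up payload

print(str(raw))
# b'[{"email":"example@test.com","event":"processed"}]'
print(json.dumps(str(raw)))
# "b'[{\"email\":\"example@test.com\",\"event\":\"processed\"}]'"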
I found the solution; my function now looks like this:
def is_valid_signature(request):
    # event_webhook_signature = request.META['HTTP_X_TWILIO_EMAIL_EVENT_WEBHOOK_SIGNATURE']
    # event_webhook_timestamp = request.META['HTTP_X_TWILIO_EMAIL_EVENT_WEBHOOK_TIMESTAMP']
    event_webhook = EventWebhook()
    key = settings.SENDGRID_HEADER
    ec_public_key = event_webhook.convert_public_key_to_ecdsa(key)
    return event_webhook.verify_signature(
        request.body.decode('latin-1'),
        request.headers[EventWebhookHeader.SIGNATURE],
        request.headers[EventWebhookHeader.TIMESTAMP],
        ec_public_key
    )
I had to decode the body as Latin-1, even though our content is encoded in UTF-8.
Thanks
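For completeness, here is a minimal sketch of how the helper might be wired into a Django view; the view name, the csrf_exempt usage, and the event handling are assumptions for illustration, not part of the original answer.

import json

from django.http import HttpResponse, HttpResponseForbidden
from django.views.decorators.csrf import csrf_exempt

@csrf_exempt  # SendGrid posts from outside the site, so skip the CSRF check
def sendgrid_event_webhook(request):
    if not is_valid_signature(request):
        return HttpResponseForbidden("invalid signature")
    events = json.loads(request.body)  # parse only after the signature is verified
    # ... handle the events ...
    return HttpResponse(status=200)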
(This version does not fail on missing headers, decodes as UTF-8, and converts header values to strings.)
def flask_verifySendgridSignedWebhook(myrequest, expectedKey):
    try:
        if myrequest.is_json:
            sg_verify = EventWebhook()
            msgbody = ""
            # print("JSON FOUND")
            if myrequest.data:
                msgbody = myrequest.get_data().decode('utf-8')
                # print(msgbody)
            if sg_verify.verify_signature(
                    msgbody,
                    str(myrequest.headers.get(EventWebhookHeader.SIGNATURE)),
                    str(myrequest.headers.get(EventWebhookHeader.TIMESTAMP)),
                    sg_verify.convert_public_key_to_ecdsa(expectedKey)):
                return True
        # print("NO JSON SENT")
        # no JSON body, or the signature did not verify
        return False
    except Exception:
        return False
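A minimal usage sketch for the Flask helper above; the route path, the app object, and the SENDGRID_WEBHOOK_PUBLIC_KEY environment variable are assumptions for illustration.

import os

from flask import Flask, request

app = Flask(__name__)
SENDGRID_WEBHOOK_PUBLIC_KEY = os.environ["SENDGRID_WEBHOOK_PUBLIC_KEY"]  # assumed config

@app.route("/sendgrid/events", methods=["POST"])
def sendgrid_events():
    if not flask_verifySendgridSignedWebhook(request, SENDGRID_WEBHOOK_PUBLIC_KEY):
        return "invalid signature", 403
    events = request.get_json()
    # ... handle the events ...
    return "", 204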
In views.py, VENDOR_MAPPER is a list of dictionaries; each dictionary has id, name, placeholder, and autocommit keys. I also tried sending JSON instead of a Response object.
resp_object = {}
resp_object['supported_vendors'] = VENDOR_MAPPER
resp_object['vendor_name'] = ""
resp_object['create_vo_entry'] = False
resp_object['generate_signature_flag'] = False
resp_object['branch_flag'] = False
resp_object['trunk_flag'] = False
resp_object['branch_name'] = ""
resp_object['advisory'] = ""
data = {'data': resp_object}
return Response(data)
In home.html I access supported_vendors, which is a list, and iterate through it; however, instead of an object I get a string as the type of the variable.
var supported_vendors = "{{data.supported_vendors|safe}}";
console.log(supported_vendors);
console.log("Supported_vendors ", supported_vendors);
console.log("Supported_vendors_type:", typeof(supported_vendors));
data.supported_vendors|safe (Django template tagging) is used to remove the unwanted characters in the response. I also tried it without safe, but the type was still string.
I also tried converting as well as parsing the response, but the type is still shown as string:
var supported_vendors = "{{data.supported_vendors}}";
console.log(JSON.parse(supported_vendors));
console.log(JSON.stringify(supported_vendors));
Output generated: I printed the response type and the values I get. Converting with JSON.parse and JSON.stringify did not work either; the output was a string every time.
Screenshot of the console output: https://i.stack.imgur.com/DuSMb.png
I want to convert the property into a JavaScript object and perform some computations on it.
You can try this instead:
return HttpResponse(json.dumps(data),
                    content_type="application/json")
I got the answer:
var supported_vendors = "{{data.supported_vendors}}";
Converted the above line to
var supported_vendors = {{data.supported_vendors}};
That is, I removed the quotes around the template variable.
I tried to generate a uid for a user confirmation email.
'uid':urlsafe_base64_encode(force_bytes(user.pk)),
So it seems to work nicely; it returns something like "Tm9uZQ".
Then, when I tried to decode it using force_text(urlsafe_base64_decode(uidb64)), it returned None.
The following expression:
urlsafe_base64_decode(uidb64)
also returns b'None'.
I googled it and saw different implementations, but the copy-pasted code did not work.
I wrote something like:
b64_string = uidb64
b64_string += "=" * ((4 - len(b64_string) % 4) % 4)
print(b64_string)
print(force_text(base64.urlsafe_b64decode(b64_string)))
and the result is still None:
Tm9uZQ==
None
I don't understand why the default decode doesn't work.
"Tm9uZQ==" is the base64 encoding of the string "None",
>>> from base64 import b64encode, b64decode
>>> s = b'None'
>>>
>>> b64encode(s)
b'Tm9uZQ=='
>>> b64decode(b64encode(s))
b'None'
>>>
It could be that some of your data is missing, e.g. user.pk is not set. I think force_bytes is turning a None user.pk into the bytestring b'None'. From the Django source:
def force_bytes(s, encoding='utf-8', strings_only=False, errors='strict'):
    """
    Similar to smart_bytes, except that lazy instances are resolved to
    strings, rather than kept as lazy objects.

    If strings_only is True, don't convert (some) non-string-like objects.
    """
    # Handle the common case first for performance reasons.
    if isinstance(s, bytes):
        if encoding == 'utf-8':
            return s
        else:
            return s.decode('utf-8', errors).encode(encoding, errors)
    if strings_only and is_protected_type(s):
        return s
    if isinstance(s, memoryview):
        return bytes(s)
    return str(s).encode(encoding, errors)
You might be able to prevent None being turned into b'None' by setting strings_only=True when calling force_bytes.
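A minimal sketch of the usual fix, assuming the user object is created in the same view: make sure the instance has actually been saved (so pk is populated) before building the uid.

from django.utils.encoding import force_bytes
from django.utils.http import urlsafe_base64_encode

user.save()                  # pk is only assigned once the row exists in the database
assert user.pk is not None   # otherwise force_bytes would encode the string "None"
uid = urlsafe_base64_encode(force_bytes(user.pk))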
I always receive a string in my result, even in exported JSON.
I am using a double translate to strip everything except the number. The decimal_serializer was just for testing purposes; I called print(value) inside it and it printed a valid float. In my result it is always a unicode string, whereas add_value('offerCountNew', 1.3) does give a valid float in my result.
I also tried removing any processor or serializer. Any ideas on what I am doing wrong?
Item
offerCountNew = scrapy.Field(output_processor = TakeFirst(), serializer = decimal_serializer)
Spider
l.add_xpath('offerCountNew', 'number(translate(//*[@id="olp_feature_div"]//a[contains(@href, "new")], translate(//*[@id="olp_feature_div"]//a[contains(@href, "new")], "0123456789", ""), ""))')
Result
'offerCountNew': u'1.0',
JSON
"offerCountNew": "1.0",
def process_float_or_int(value):
    try:
        # eval turns '1.0' into the float 1.0 and '12' into the int 12
        return eval(value)
    except:
        return value
offerCountNew = scrapy.Field(input_processor = MapCompose(lambda x: process_float_or_int(x)), output_processor = TakeFirst())
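To make the behaviour concrete, a quick check of what the processor does with a few sample strings (values made up):

print(process_float_or_int('1.0'))   # 1.0 -> exported as a JSON number
print(process_float_or_int('12'))    # 12  -> exported as a JSON number
print(process_float_or_int('n/a'))   # eval raises NameError, so the original string 'n/a' is returned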
In Suds, I use something like
client = suds.client.Client(url)
date_val = client.service.getDate()
and date_val is printed as
2013-11-16
If I use client.last_received(), the raw xml gets printed as
2013-12-11-05:00
How do I get the date returned to date_val to be returned as 2013-11-16-05:00 ?
Apparently this is a known issue with suds. It finds the datetime and returns datetime.date. I couldn't figure out how to change this so I used something like the following:
def getElementFromRawXML(raw_xml, element):
    string_xml = raw_xml.plain()
    begin = string_xml.find("<" + element + ">")
    end = string_xml.find("</" + element + ">")
    if begin == -1 or end == -1:
        return None
    else:
        return string_xml[(begin + len(element) + 2):end]
raw_xml = client.last_received()
print getElementFromRawXML(raw_xml, 'date')
In Python, is there a way to check if a string is valid JSON before trying to parse it?
For example working with things like the Facebook Graph API, sometimes it returns JSON, sometimes it could return an image file.
You can try to do json.loads(), which will throw a ValueError if the string you pass can't be decoded as JSON.
In general, the "Pythonic" philosophy for this kind of situation is called EAFP, for Easier to Ask for Forgiveness than Permission.
Example Python function that returns a boolean indicating whether a string is valid JSON:
import json

def is_json(myjson):
    try:
        json.loads(myjson)
    except ValueError as e:
        return False
    return True
Which prints:
print is_json("{}") #prints True
print is_json("{asdf}") #prints False
print is_json('{ "age":100}') #prints True
print is_json("{'age':100 }") #prints False
print is_json("{\"age\":100 }") #prints True
print is_json('{"age":100 }') #prints True
print is_json('{"foo":[5,6.8],"foo":"bar"}') #prints True
Convert a JSON string to a Python dictionary:
import json
mydict = json.loads('{"foo":"bar"}')
print(mydict['foo']) #prints bar
mylist = json.loads("[5,6,7]")
print(mylist)
[5, 6, 7]
Convert a python object to JSON string:
foo = {}
foo['gummy'] = 'bear'
print(json.dumps(foo)) #prints {"gummy": "bear"}
If you want access to low-level parsing, don't roll your own, use an existing library: http://www.json.org/
Great tutorial on python JSON module: https://pymotw.com/2/json/
Check whether a string is JSON, and show syntax errors and error messages:
sudo cpan JSON::XS
echo '{"foo":[5,6.8],"foo":"bar" bar}' > myjson.json
json_xs -t none < myjson.json
Prints:
, or } expected while parsing object/hash, at character offset 28 (before "bar}
at /usr/local/bin/json_xs line 183, <STDIN> line 1.
json_xs is capable of syntax checking, parsing, prettifying, encoding, decoding and more:
https://metacpan.org/pod/json_xs
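If you want similar diagnostics without leaving Python, json.JSONDecodeError (raised by json.loads on Python 3.5+) carries the message, line, column, and character offset; the sketch below reuses the broken string from above.

import json

broken = '{"foo":[5,6.8],"foo":"bar" bar}'
try:
    json.loads(broken)
except json.JSONDecodeError as e:
    print(f"{e.msg} at line {e.lineno}, column {e.colno} (char {e.pos})")
# Expecting ',' delimiter at line 1, column 28 (char 27)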
I would say parsing it is the only way you can really, entirely tell. Python's json.loads() will (almost certainly) raise an exception if the string is not in the correct format. However, for the purposes of your example you can probably just check the first couple of non-whitespace characters...
I'm not familiar with the JSON that Facebook sends back, but most JSON strings from web apps will start with an open square bracket [ or curly brace {. No image formats I know of start with those characters.
Conversely if you know what image formats might show up, you can check the start of the string for their signatures to identify images, and assume you have JSON if it's not an image.
Another simple hack to identify a graphic rather than a text string, in case you're looking for a graphic, is just to test for non-ASCII characters in the first couple dozen characters of the string (assuming the JSON is ASCII).
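A rough sketch of those heuristics (purely illustrative; the list of image signatures is not exhaustive):

def looks_like_json(data: bytes) -> bool:
    # Cheap pre-check: most JSON web responses start with '{' or '[' after whitespace.
    return data.lstrip()[:1] in (b'{', b'[')

def looks_like_image(data: bytes) -> bool:
    # Check a few well-known image signatures instead.
    return data.startswith((b'\xff\xd8\xff',       # JPEG
                            b'\x89PNG\r\n\x1a\n',  # PNG
                            b'GIF8'))              # GIF87a / GIF89a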
I came up with a generic, interesting solution to this problem:
class SafeInvocator(object):
    def __init__(self, module):
        self._module = module

    def _safe(self, func):
        def inner(*args, **kwargs):
            try:
                return func(*args, **kwargs)
            except:
                return None
        return inner

    def __getattr__(self, item):
        obj = getattr(self._module, item)
        return self._safe(obj) if hasattr(obj, '__call__') else obj
and you can use it like so:
safe_json = SafeInvocator(json)
text = "{'foo':'bar'}"
item = safe_json.loads(text)
if item:
    ...  # do something
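For example, the wrapper returns None for invalid JSON and the parsed object otherwise:

import json

safe_json = SafeInvocator(json)

print(safe_json.loads("{'foo':'bar'}"))   # None  (single quotes are not valid JSON)
print(safe_json.loads('{"foo":"bar"}'))   # {'foo': 'bar'}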
An effective and reliable way to check for valid JSON: if the get accessor doesn't throw an AttributeError, then the JSON is valid.
import json

valid_json = {'type': 'doc', 'version': 1, 'content': [{'type': 'paragraph', 'content': [{'text': 'Request for widget', 'type': 'text'}]}]}
invalid_json = 'opo'

def check_json(p, attr):
    doc = json.loads(json.dumps(p))
    try:
        doc.get(attr)  # we don't care if the value exists, only that get() is accessible
        return True
    except AttributeError:
        return False
To use, we call the function and look for a key.
# Valid JSON
print(check_json(valid_json, 'type'))
Returns 'True'
# Invalid JSON / Key not found
print(check_json(invalid_json, 'type'))
Returns 'False'
Much simpler with a try block. You can then check whether the body is valid JSON:
from fastapi import Request

async def get_body(request: Request):
    try:
        body = await request.json()
    except Exception:
        body = await request.body()
    return body
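A hedged usage sketch, assuming the dependency above sits next to a FastAPI route (the route path and response shape are made up):

from fastapi import Depends, FastAPI

app = FastAPI()

@app.post("/webhook")
async def webhook(body=Depends(get_body)):
    if isinstance(body, (dict, list)):
        # request.json() succeeded, so the payload was valid JSON
        return {"received_json": True}
    # otherwise body holds the raw bytes from request.body()
    return {"received_json": False}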