Within my workflow I query DynamoDB for tables whose load_fail status equals 1.
If there is at least one such table, a Glue job needs to start with that list of tables as the --source_tables argument.
Below is my entire state machine.
{
"Comment": "A description of my state machine",
"StartAt": "Query",
"States": {
"Query": {
"Type": "Task",
"Next": "Choice",
"Parameters": {
"TableName": "source_tables_load_status",
"KeyConditionExpression": "load_fail = :load_fail",
"ExpressionAttributeValues": {
":load_fail": {
"S": "1"
}
}
},
"Resource": "arn:aws:states:::aws-sdk:dynamodb:query",
"ResultSelector": {
"count.$": "$.Count",
"startTime.$": "$$.Execution.StartTime",
"items.$": "$.Items[*].table_name.S"
}
},
"Choice": {
"Type": "Choice",
"Choices": [
{
"Variable": "$.count",
"NumericGreaterThanEquals": 1,
"Next": "start_glue"
}
],
"Default": "Success"
},
"start_glue": {
"Type": "Task",
"Resource": "arn:aws:states:::glue:startJobRun",
"Parameters": {
"JobName": "data-moving-glue",
"Arguments": {
"--dynamodb_metadata_table": "metadata_table",
"--source_tables.$": "$.items"
}
},
"End": true
},
"Success": {
"Type": "Succeed"
}
}
}
Currently I'm getting an error caused by "--source_tables.$": "$.items".
The question is how to make "--source_tables": ["dbo.Table_Two", "dbo.Table_Three"] work in the state machine. The error is:
An error occurred while executing the state 'start_glue' (entered at the event id #9).
The Parameters '{"JobName":"data-moving-glue","Arguments":{"--dynamodb_metadata_table":"metadata_table","--source_tables":["dbo.Table_Two", "dbo.Table_Three"]}}'
could not be used to start the Task: [The value for the field '--source_tables' must be a STRING]
I enclosed the result in quotes, turning it into a string, using States.Format:
https://docs.aws.amazon.com/step-functions/latest/dg/amazon-states-language-intrinsic-functions.html
"--source_tables.$": "States.Format('{}', $.items)"
New output is:
"--source_tables": "[\"dbo.TableOne\",\"dbo.TableTwo\"]"
This, on the other hand, can be handled with a function.
eval is used here only as an example! Don't use it, as it can compromise your code!
lst = "[\"dbo.TableOne\",\"dbo.TableTwo\"]"
for t in (eval(lst)):
print(t)
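A safer alternative, assuming the argument arrives as the JSON-encoded string shown above, is to parse it with json.loads instead:
import json

lst = "[\"dbo.TableOne\",\"dbo.TableTwo\"]"  # the value received via --source_tables
for t in json.loads(lst):  # parses the JSON array without executing arbitrary code
    print(t)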
We are new to the HERE API.
Our team is working on a transportation project in which we need to get the maximum speed limit of the road from a vehicle's (latitude, longitude).
For the last few days we have been trying to figure out which HERE API we should use to achieve what we want.
Per the documentation of HERE Route Matching v8 (https://developer.here.com/documentation/route-matching/api-reference.html),
please send a POST request like:
https://routematching.hereapi.com/v8/match/routelinks?apikey=LXX1Axs75efnlFAlgbPxVekPDR0Hz6rTcRHQMT0EvQs&routeMatch=1&mode=fastest;car;traffic:disabled;&attributes=SPEED_LIMITS_FCn(*),LINK_ATTRIBUTE_FCn(*),TRAFFIC_PATTERN_FCn(*),TRUCK_SPEED_LIMITS_FCn(*),SPEED_LIMITS_VAR_FCn(*),SPEED_LIMITS_COND_FCn(*)
with this in the body:
LATITUDE,LONGITUDE
37.4201866,15.049515
Please see the attached Postman collection for testing the HERE Route Matching API v8 (https://developer.here.com/documentation/route-matching/dev_guide/index.html):
{
"info": {
"_postman_id": "2563b7cc-2d62-4485-bb64-2f9561318aa2",
"name": "RMEspeed_limit",
"schema": "https://schema.getpostman.com/json/collection/v2.1.0/collection.json",
"_exporter_id": "1051680"
},
"item": [
{
"name": "speed_limit",
"request": {
"method": "POST",
"header": [],
"body": {
"mode": "raw",
"raw": "LATITUDE,LONGITUDE\r\n37.4201866,15.049515"
},
"url": {
"raw": "https://routematching.hereapi.com/v8/match/routelinks?apikey=LXX1Axs75efnlFAlgbPxVekPDR0Hz6rTcRHQMT0EvQs&routeMatch=1&mode=fastest;car;traffic:disabled;&attributes=SPEED_LIMITS_FCn(*),LINK_ATTRIBUTE_FCn(*),TRAFFIC_PATTERN_FCn(*),TRUCK_SPEED_LIMITS_FCn(*),SPEED_LIMITS_VAR_FCn(*),SPEED_LIMITS_COND_FCn(*) ",
"protocol": "https",
"host": [
"routematching",
"hereapi",
"com"
],
"path": [
"v8",
"match",
"routelinks"
],
"query": [
{
"key": "apikey",
"value": "LXX1Axs75efnlFAlgbPxVekPDR0Hz6rTcRHQMT0EvQs"
},
{
"key": "routeMatch",
"value": "1"
},
{
"key": "mode",
"value": "fastest;car;traffic:disabled;"
},
{
"key": "attributes",
"value": "SPEED_LIMITS_FCn(*),LINK_ATTRIBUTE_FCn(*),TRAFFIC_PATTERN_FCn(*),TRUCK_SPEED_LIMITS_FCn(*),SPEED_LIMITS_VAR_FCn(*),SPEED_LIMITS_COND_FCn(*) "
}
]
}
},
"response": []
}
]
}
Specify your own API key and coordinates, of course.
For an overview of all layer attributes, see this example: https://demo.support.here.com/pde/maps?url_root=pde.api.here.com
In Python, send a normal POST request using one of the HTTP libraries available for Python:
https://www.geeksforgeeks.org/get-post-requests-using-python/
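For example, here is a minimal sketch using the requests library (the URL, query parameters, and body follow the request shown above; the API key and coordinates are placeholders to replace with your own):
import requests

url = "https://routematching.hereapi.com/v8/match/routelinks"
params = {
    "apikey": "YOUR_API_KEY",  # placeholder: use your own key
    "routeMatch": "1",
    "mode": "fastest;car;traffic:disabled;",
    "attributes": "SPEED_LIMITS_FCn(*),LINK_ATTRIBUTE_FCn(*),TRAFFIC_PATTERN_FCn(*),"
                  "TRUCK_SPEED_LIMITS_FCn(*),SPEED_LIMITS_VAR_FCn(*),SPEED_LIMITS_COND_FCn(*)",
}
# The body is a CSV trace: a header line followed by coordinate rows.
body = "LATITUDE,LONGITUDE\r\n37.4201866,15.049515"

response = requests.post(url, params=params, data=body)
print(response.status_code)
print(response.text)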
I have this document in MongoDB:
{
"_id": {
"$oid": "62644af0368cb0a46d7c2a95"
},
"insertionData": "23/04/2022 19:50:50",
"ipfsMetadata": {
"Name": "data.json",
"Hash": "Qmb3FWgyJHzJA7WCBX1phgkV93GiEQ9UDWUYffDqUCbe7E",
"Size": "431"
},
"metadata": {
"sessionDate": "20220415 17:42:55",
"dataSender": "user345",
"data": {
"height": "180",
"weight": "80"
},
"addtionalInformation": [
{
"name": "poolsize",
"value": "30m"
},
{
"name": "swimStyle",
"value": "mariposa"
},
{
"name": "modality",
"value": "swim"
},
{
"name": "gender-title",
"value": "schoolA"
}
]
},
"fileId": {
"$numberLong": "4"
}
}
I want to update an element of the nested array, for instance the one whose name is gender-title. It currently has the value schoolA, and I want to change it to adultos as given in the body. I pass the fileId number in the request path:
PUT request: localhost/sessionUpdate/4
and body:
{
"name": "gender-title",
"value": "adultos"
}
Flask:
@app.route('/sessionUpdate/<string:a>', methods=['PUT'])
def sessionUpdate(a):
    datas = request.json
    r = str(datas['name'])
    r2 = str(datas['value'])
    print(r, r2)
    r3 = collection.update_one({'fileId': a, 'metadata.addtionalInformation': r},
                               {'$set': {'metadata.addtionalInformation.$.value': r2}})
    return str(r3), 200
I'm getting the 200, but the document doesn't update with the new value.
As you are using the positional operator $ to work with your array, make sure your selection query targets an array element. You can see in the query below that it targets the metadata.addtionalInformation array with the condition name: "gender-title":
db.collection.update({
"fileId": 4,
"metadata.addtionalInformation.name": "gender-title"
},
{
"$set": {
"metadata.addtionalInformation.$.value": "junior"
}
})
Here is the Mongo playground for your reference.
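Applied back to the Flask handler, it might look like this (a sketch: the filter now targets metadata.addtionalInformation.name, and a is cast to int because fileId is stored as a number in the document):
@app.route('/sessionUpdate/<string:a>', methods=['PUT'])
def sessionUpdate(a):
    datas = request.json
    name = str(datas['name'])
    value = str(datas['value'])
    # Match the array element by its "name" field so the positional $
    # operator knows which element to update.
    result = collection.update_one(
        {'fileId': int(a), 'metadata.addtionalInformation.name': name},
        {'$set': {'metadata.addtionalInformation.$.value': value}}
    )
    return str(result.modified_count), 200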
My bot returns an adaptive card in a 1:1 private chat with the user. The adaptive card configuration is like this:
{
"$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
"type": "AdaptiveCard",
"version": "1.0",
"body": [
{
"type": "Container",
"items": [
{
"type": "TextBlock",
"text": f"{jiradetail.summary}",
}
]
}
],
"actions": [
{
"type": "Action.ShowCard",
"title": "Comment",
"card": {
"type": "AdaptiveCard",
"version": "1.0",
"body": [
{
"type": "Input.Text",
"id": "comment",
"isMultiline": True,
"placeholder": "Enter your comment"
}
],
"actions": [
{
"type": "Action.Submit",
"title": "OK",
"data": "**jiraid**"
}
]
}
}
]
}
As you can see, there is a 'comment' textbox and an 'OK' action (type Action.Submit, with hidden data -> jiraid); the card will be as shown below.
Now, on clicking this OK button, I receive the activity in on_message_activity, with the user-entered value from the comment box in the field
turn_context.activity.value
but I couldn't get the hidden data that I mapped to the action button; the picture below shows the inspected value of turn_context.activity.
How can I get the data mapped to this action?
Note: I was also expecting the callback to be on_teams_messaging_extension_submit_action, but this callback is never called; instead, only on_message_activity is called. I assume it's because it's a 1:1 conversation and it's not invoked via message extensions. Can any experts please confirm?
Regarding "on_teams_messaging_extension_submit_action" - it's not because it's a 1-1, rather it's because it is NOT a "message extension", it's just a regular Adaptive Card action.
With regards the main issue, about the data not appearing, try to avoid having a direct string value as the "data" payload, and instead try with an object, like this:
...
"data": {"value": "**jiraid**"}
...
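With an object payload, turn_context.activity.value arrives as a dict that merges the card's input values with the data object, so both can be read in the handler (a sketch, assuming the object form above):
from botbuilder.core import TurnContext

# Inside your bot's ActivityHandler subclass:
async def on_message_activity(self, turn_context: TurnContext):
    submitted = turn_context.activity.value or {}
    comment = submitted.get("comment")  # the Input.Text with id "comment"
    jiraid = submitted.get("value")     # the "value" key from the data object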
Got the answer here:
https://learn.microsoft.com/en-us/microsoftteams/platform/task-modules-and-cards/cards/cards-actions#
For easy reference, this is what we are supposed to do:
Adaptive Cards support three action types:
Action.OpenUrl
Action.Submit
Action.ShowCard
In addition to the actions mentioned above, you can modify the Adaptive Card Action.Submit payload to support existing Bot Framework actions using a msteams property in the data object of Action.Submit. The below sections detail how to use existing Bot Framework actions with Adaptive Cards.
So the updated payload will be as follows; note the msteams property under action -> data:
{
"$schema": "http://adaptivecards.io/schemas/adaptive-card.json",
"type": "AdaptiveCard",
"version": "1.0",
"body": [
{
"type": "Container",
"items": [
{
"type": "TextBlock",
"text": f"{jiradetail.summary}",
}
]
}
],
"actions": [
{
"type": "Action.ShowCard",
"title": "Comment",
"card": {
"type": "AdaptiveCard",
"version": "1.0",
"body": [
{
"type": "Input.Text",
"id": "comment",
"isMultiline": True,
"placeholder": "Enter your comment"
}
],
"actions": [
{
"type": "Action.Submit",
"title": "OK",
"data": {
"msteams": {
"type": "invoke",
"value": {"jiraid":f"{jiradetail.issueid}"}
}
}
}
]
}
}
]
}
I'm using Python to add entries to a local Elasticsearch instance (localhost:9200).
Currently, I use this method:
from elasticsearch import Elasticsearch


def insertintoes(data):
    """
    Insert data into Elasticsearch.
    :param data: dict
    :return:
    """
    timestamp = data.get('#timestamp')
    logstashIndex = 'logstash-' + timestamp.strftime("%Y.%m.%d")
    es = Elasticsearch()
    if not es.indices.exists(logstashIndex):
        # Set mappings for the index
        mapping = '''
        {
          "mappings": {
            "_default_": {
              "_all": {
                "enabled": true,
                "norms": false
              },
              "dynamic_templates": [
                {
                  "message_field": {
                    "path_match": "message",
                    "match_mapping_type": "string",
                    "mapping": {
                      "norms": false,
                      "type": "text"
                    }
                  }
                },
                {
                  "string_fields": {
                    "match": "*",
                    "match_mapping_type": "string",
                    "mapping": {
                      "fields": {
                        "keyword": {
                          "type": "keyword"
                        }
                      },
                      "norms": false,
                      "type": "text"
                    }
                  }
                }
              ],
              "properties": {
                "#timestamp": {
                  "type": "date",
                  "include_in_all": true
                },
                "#version": {
                  "type": "keyword",
                  "include_in_all": true
                }
              }
            }
          }
        }
        '''
        es.indices.create(logstashIndex, ignore=400, body=mapping)
    es.index(index=logstashIndex, doc_type='system', timestamp=timestamp, body=data)
data is a dict structure with a valid #timestamp, defined like this: data['#timestamp'] = datetime.datetime.now()
The problem is, even though there is a timestamp value in my data, Kibana doesn't show the entry in «Discover». :(
Here is an example of a full entry in Elasticsearch:
{
"_index": "logstash-2017.06.25",
"_type": "system",
"_id": "AVzf3QX3iazKBndbIkg4",
"_score": 1,
"_source": {
"priority": 6,
"uid": 0,
"gid": 0,
"systemd_slice": "system.slice",
"cap_effective": "1fffffffff",
"exe": "/usr/bin/bash",
"hostname": "ns3003395",
"syslog_facility": 9,
"comm": "crond",
"systemd_cgroup": "/system.slice/cronie.service",
"systemd_unit": "cronie.service",
"syslog_identifier": "CROND",
"message": "(root) CMD (/usr/local/rtm/bin/rtm 14 > /dev/null 2> /dev/null)",
"systemd_invocation_id": "9228b6c72e6a4624a1806e4c59af8d04",
"syslog_pid": 26652,
"pid": 26652,
"#timestamp": "2017-06-25T17:27:01.734453"
}
}
As you can see, there IS a #timestamp field, but it doesn't seem to be what Kibana expects.
I don't know what to do to make my entries visible in Kibana.
Any idea?
Elasticsearch is not recognizing #timestamp as a date, but as a string. If your data['#timestamp'] is a datetime object, you can try converting it to an ISO string, which is automatically recognized. Try:
timestamp = data.get('#timestamp').isoformat()
timestamp should now be a string, in ISO format.
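Applied to the code above, the fix is a one-line conversion before indexing (a sketch):
timestamp = data.get('#timestamp')          # a datetime.datetime object
data['#timestamp'] = timestamp.isoformat()  # e.g. "2017-06-25T17:27:01.734453"
es.index(index=logstashIndex, doc_type='system', body=data)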
I am attempting to read JSON data from a network port scan and store the results in an Elasticsearch index as a document. However, whenever I try to do this, I get a MapperParsingException on the scan output results. In my mapping, I even tried changing the analysis to not_analyzed and no, but the error doesn't go away. Then I figured that ES might be trying to interpret certain values as dates and attempted to set date_format to 0 or none. That led to a dead end as well, with the mapping throwing an "Unsupported option" exception.
I have a dump of the values that I want to index in Elasticsearch here:
{
"protocol": "tcp",
"service": "ssh",
"state": "open",
"script_out": [
{
"output": "\n 1024 de:4e:50:33:cd:f6:8a:d0:c4:5a:e9:7d:1e:7b:13:12 (DSA)\nssh-dss AAAAB3NzaC1kc3MAAACBANkPx1nphZwsN1SVPPQHwz93abIHuEC4wMEeZiXdBC8RoSUUeCmdgPfIh4or0LvZ1pqaZP/k0qzCLyVxFt/eI7n36Lb9sZdVMf1Ao7E9TSc7lj9wg5ffY58WbWob/GQs1llGZ2K9Gp7oWuwCjKP164MsxMvahoJAAaWfap48ZiXpAAAAFQCnRMwRp8wBzzQU6lia8NegIb5rswAAAIEAxvN66VMDxE5aU8SvwwVmcUNwtVQWZ6pxn2W0gzF6H7JL1BhcnbCwQ3J/S6WdtqL2Dscw8drdAvsrN4XC8RT6Jowsir4q4HSQCybll6fSpNEdlv/nLIlYsH5ZuZZUIMxbTQ9vT0oYvzpDHejIQ/Zl1inYnJ+6XJmOc0LPUsu5PEsAAACAQO+Tsd3inLGskrqyrWSDO0VDD3cApYW7C+uTWXBfIoh/sVw+X9+OPa833w/PQkpacm68kYPXKS7GK8lqhg93dwbUNYFKz9MMNY6WVOjeAX9HtUAbglgLyRIt0CBqmL4snoZeKab22Nlmaf4aU5cHFlG9gnFEcK0vVIwIWp2EM/I=\n 2048 94:5f:86:77:81:39:2e:03:e0:42:d8:7d:10:a5:60:f0 (RSA)\nssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDV9BKj+QSavAr4UcDaCoHADVIaMOpcI5/hx/X9CRLDTxmB/WvEiL42tziMZEx7ipHT28/hl4HOwK64eXZuK75JMrMDutCZ2gmvRmvFKl6mAVbUEOlVkMGZeNJxATCZyWQyrZ6wA9E2ns5+id6l9C8we+bdq39cIR/e+yR8Ht8sfaigDi0gcW67GrHDI/oIgTQ79l+T/xAqCVrtQxqn/6pCuaCWQUVCxgOPXmJPbsd+g+oqZtm0aEjIJvcDJocMkZ2qMMlgMPeJBN27FCTKB80UUbV57iHXHzZF+cD7v+Jlw0fmyMapMkkPH+aabOUy7Kkbty1mucrFxaisLsckEf47",
"elements": {
"null": [
{
"type": "ssh-dss",
"bits": "1024",
"key": "AAAAB3NzaC1kc3MAAACBANkPx1nphZwsN1SVPPQHwz93abIHuEC4wMEeZiXdBC8RoSUUeCmdgPfIh4or0LvZ1pqaZP/k0qzCLyVxFt/eI7n36Lb9sZdVMf1Ao7E9TSc7lj9wg5ffY58WbWob/GQs1llGZ2K9Gp7oWuwCjKP164MsxMvahoJAAaWfap48ZiXpAAAAFQCnRMwRp8wBzzQU6lia8NegIb5rswAAAIEAxvN66VMDxE5aU8SvwwVmcUNwtVQWZ6pxn2W0gzF6H7JL1BhcnbCwQ3J/S6WdtqL2Dscw8drdAvsrN4XC8RT6Jowsir4q4HSQCybll6fSpNEdlv/nLIlYsH5ZuZZUIMxbTQ9vT0oYvzpDHejIQ/Zl1inYnJ+6XJmOc0LPUsu5PEsAAACAQO+Tsd3inLGskrqyrWSDO0VDD3cApYW7C+uTWXBfIoh/sVw+X9+OPa833w/PQkpacm68kYPXKS7GK8lqhg93dwbUNYFKz9MMNY6WVOjeAX9HtUAbglgLyRIt0CBqmL4snoZeKab22Nlmaf4aU5cHFlG9gnFEcK0vVIwIWp2EM/I=",
"fingerprint": "de4e5033cdf68ad0c45ae97d1e7b1312"
},
{
"type": "ssh-rsa",
"bits": "2048",
"key": "AAAAB3NzaC1yc2EAAAADAQABAAABAQDV9BKj+QSavAr4UcDaCoHADVIaMOpcI5/hx/X9CRLDTxmB/WvEiL42tziMZEx7ipHT28/hl4HOwK64eXZuK75JMrMDutCZ2gmvRmvFKl6mAVbUEOlVkMGZeNJxATCZyWQyrZ6wA9E2ns5+id6l9C8we+bdq39cIR/e+yR8Ht8sfaigDi0gcW67GrHDI/oIgTQ79l+T/xAqCVrtQxqn/6pCuaCWQUVCxgOPXmJPbsd+g+oqZtm0aEjIJvcDJocMkZ2qMMlgMPeJBN27FCTKB80UUbV57iHXHzZF+cD7v+Jlw0fmyMapMkkPH+aabOUy7Kkbty1mucrFxaisLsckEf47",
"fingerprint": "945f867781392e03e042d87d10a560f0"
}
]
},
"id": "ssh-hostkey"
}
],
"banner": "product: OpenSSH version: 6.2 extrainfo: protocol 2.0",
"port": "22"
},
Update
I am able to index the content in the "output" key. However, the error appears when I try to index the content in the "elements" key.
Update 2
There's a possibility that there's something wrong with my mapping. This is the Python code that I am using for the mapping:
"scan_info": {
"properties": {
"protocol": {
"type": "string",
"index": "analyzed"
},
"service": {
"type": "string",
"index": "analyzed"
},
"state": {
"type": "string",
"index": "not_analyzed"
},
"banner": {
"type": "string",
"index": "analyzed"
},
"port": {
"type": "string",
"index": "not_analyzed"
},
"script_out": { #is this the problem??
"type": "object",
"dynamic": True
}
}
}
I am drawing a blank here. What do I need to do?