python bot server does not receive bash messages - missing concept - python

I have created a Telegram bot using BotFather, https://t.me/botfather
On a Raspberry Pi server, I have some Python code answering messages written to the bot.
It uses "pyTelegramBotAPI" and is based on this code:
https://www.flopy.es/crea-un-bot-de-telegram-para-tu-raspberry-ordenale-cosas-y-habla-con-ella-a-distancia/
Basically it does "bot.polling()".
It works perfectly when I write messages to the bot using the smartphone Telegram app.
The problem is when I write messages to the bot from another computer,
using "bash" + "curl" + "POST".
The server does not receive the bash message, so it does not answer it.
Can someone shed some light on whatever concept I am missing?
P.S. - the bash+curl code is this one:
#!/bin/bash
TOKEN="1436067683:ABGcHbGWS3ek1UdKvyRWC7Xtuv1DuyvT6A4"
ID="304688070"
MENSAJE="La Raspberry te saluda."
URL="https://api.telegram.org/bot${TOKEN}/sendMessage"
curl -s -X POST ${URL} -d chat_id=${ID} -d text="${MENSAJE}"
P.S. #2 - now I use JSON and have reached an interesting situation:
curl -v -X POST -H 'Content-Type: application/json' -d '{"chat_id":"${ID}", "text":"${MENSAJE}"}' ${URL_sndmsg}
... produces ...
{"ok":false,"error_code":400,"description":"Bad Request: chat not found"}
... but I did not change the ID nor the TOKEN ... and the old code still finds the chat ...
Strange

The sendMessage endpoint of the Bot API is a POST endpoint and, according to the documentation, it accepts JSON as well as form-encoded data. The reason your JSON attempt returns "chat not found" is that bash does not expand ${ID} and ${MENSAJE} inside single quotes, so the literal text ${ID} is sent as the chat_id. With proper quoting, the JSON request from the question would look like this:
#!/bin/bash
TOKEN="1436067683:ABGcHbGWS3ek1UdKvyRWC7Xtuv1DuyvT6A4"
ID="304688070"
MENSAJE="La Raspberry te saluda."
URL="https://api.telegram.org/bot${TOKEN}/sendMessage"
curl -s -X POST -H 'Content-Type: application/json' -d "{\"chat_id\": \"${ID}\", \"text\": \"${MENSAJE}\"}" "${URL}"
I'd recommend not using the -s option while making tests; that way you can see curl's full output and you could have figured it out.
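For completeness, here is a minimal Python sketch of the same sendMessage call using the requests library (an assumption: requests is installed on the Raspberry Pi; TOKEN and CHAT_ID below are placeholders for the values from the question):
# Minimal sketch: send a message through the Telegram Bot API with requests.
# TOKEN and CHAT_ID are placeholders; fill in the values from the question.
import requests

TOKEN = "YOUR_BOT_TOKEN"
CHAT_ID = "304688070"
URL = "https://api.telegram.org/bot{}/sendMessage".format(TOKEN)

# Passing a dict via json= lets requests build the JSON body and set the
# Content-Type header, so there is no shell-quoting problem at all.
response = requests.post(URL, json={"chat_id": CHAT_ID, "text": "La Raspberry te saluda."})
print(response.status_code, response.json())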

Related

Running Job On Airflow Based On Webrequest

I wanted to know if airflow tasks can be executed upon getting a request over HTTP. I am not interested in the scheduling part of Airflow. I just want to use it as a substitute for Celery.
So an example operation would be something like this.
User submits a form requesting some report.
Backend receives the request and sends the user a notification that the request has been received.
The backend then schedules a job using Airflow to run immediately.
Airflow then executes a series of tasks associated with a DAG. For example: pull data from Redshift, pull data from MySQL, perform some operations on the two result sets, combine them, upload the results to Amazon S3, and send an email.
From what I have read online, you can run Airflow jobs by executing airflow ... on the command line. I was wondering if there is a Python API which can do the same thing.
Thanks.
The Airflow REST API Plugin would help you out here. Once you have followed the instructions for installing the plugin, you just need to hit the following URL: http://{HOST}:{PORT}/admin/rest_api/api/v1.0/trigger_dag?dag_id={dag_id}&run_id={run_id}&conf={url_encoded_json_parameters}, replacing dag_id with the id of your DAG, either omitting run_id or specifying a unique id, and passing a URL-encoded JSON for conf (with any of the parameters you need in the triggered DAG).
Here is an example JavaScript function that uses jQuery to call the Airflow api:
function triggerDag(dagId, dagParameters){
    var urlEncodedParameters = encodeURIComponent(dagParameters);
    var dagRunUrl = "http://airflow:8080/admin/rest_api/api/v1.0/trigger_dag?dag_id="+dagId+"&conf="+urlEncodedParameters;
    $.ajax({
        url: dagRunUrl,
        dataType: "json",
        success: function(msg) {
            console.log('Successfully started the dag');
        },
        error: function(e){
            console.log('Failed to start the dag');
        }
    });
}
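Since the question asks for a way to do this from Python, a rough equivalent of the JavaScript call above using the requests library could look like this (a sketch, assuming the same plugin endpoint and an Airflow webserver reachable at airflow:8080):
# Sketch: trigger a DAG through the Airflow REST API Plugin from Python.
# Assumes the plugin is installed and the webserver is reachable at airflow:8080.
import json
import requests

def trigger_dag(dag_id, dag_parameters):
    url = "http://airflow:8080/admin/rest_api/api/v1.0/trigger_dag"
    # The plugin expects conf as a URL-encoded JSON string; requests handles the encoding.
    response = requests.get(url, params={"dag_id": dag_id, "conf": json.dumps(dag_parameters)})
    response.raise_for_status()
    print("Successfully started the dag:", response.json())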
A new option in Airflow is the experimental, but built-in, API endpoint in the more recent builds of 1.7 and 1.8. This allows you to run a REST service on your Airflow server that listens on a port and accepts CLI-style jobs.
I only have limited experience myself, but I have run test DAGs with success. Per the docs:
/api/experimental/dags/<DAG_ID>/dag_runs creates a dag_run for a given dag id (POST).
That will schedule an immediate run of whatever DAG you want to run. It still uses the scheduler, though, waiting for a heartbeat to see that the DAG is running and to pass tasks to the worker. This is exactly the same behavior as the CLI, so I still believe it fits your use case.
Documentation on how to configure it is available here: https://airflow.apache.org/api.html
There are some simple example clients in the GitHub repo, too, under airflow/api/clients
You should look at the Airflow HTTP Sensor for your needs. You can use this to trigger a DAG.
Airflow's experimental REST API interface can be used for this purpose.
The following request will trigger a DAG:
curl -X POST \
http://<HOST>:8080/api/experimental/dags/process_data/dag_runs \
-H 'Cache-Control: no-cache' \
-H 'Content-Type: application/json' \
-d '{"conf":"{\"START_DATE\":\"2018-06-01 03:00:00\", \"STOP_DATE\":\"2018-06-01 23:00:00\"}'
The following request retrieves the list of DAG runs for a specific DAG id:
curl -i -H "Accept: application/json" -H "Content-Type: application/json" -X GET http://<HOST>:8080/api/experimental/dags/process_data/dag_runs
For the GET API to work, set the rbac flag to True in airflow.cfg.
The list of available endpoints can be found in the Airflow API documentation: https://airflow.apache.org/api.html
UPDATE: stable Airflow REST API released:
https://airflow.apache.org/docs/apache-airflow/stable/stable-rest-api-ref.html
Almost everything stays the same, except the API URL changes.
Also, "conf" is now required to be an object, so I added additional wrapping:
def trigger_dag_v2(self, dag_id, run_id=None, conf=None, execution_date=None):
    # "conf" must be an object in the stable API, hence the extra wrapping below.
    endpoint = '/api/v1/dags/{}/dagRuns'.format(dag_id)
    url = urljoin(self._api_base_url, endpoint)
    data = self._request(url, method='POST',
                         json={
                             "run_id": run_id,
                             "conf": {'conf': json.dumps(conf)},
                             "execution_date": execution_date,
                         })
    return data['message']
OLD ANSWER:
Airflow has REST API (currently experimental) - available here:
https://airflow.apache.org/api.html#endpoints
If you do not want to install plugins as suggested in other answers, here is how you can call the API directly:
def trigger_dag(self, dag_id, run_id=None, conf=None, execution_date=None):
    # self._request is a small helper (see the json_client.py link below) that
    # POSTs JSON to the Airflow webserver and returns the decoded response.
    endpoint = '/api/experimental/dags/{}/dag_runs'.format(dag_id)
    url = urljoin(self._api_base_url, endpoint)
    data = self._request(url, method='POST',
                         json={
                             "run_id": run_id,
                             "conf": conf,
                             "execution_date": execution_date,
                         })
    return data['message']
More examples of working with the Airflow API in Python are available here:
https://github.com/apache/airflow/blob/master/airflow/api/client/json_client.py
I found this post while trying to do the same; after further investigation, I switched to Argo Events. It is basically the same, but being based on event-driven flows it is much more suitable for this use case.
Link:
https://argoproj.github.io/argo
Airflow now has support for a stable REST API. Using the stable REST API, you can trigger a DAG as follows:
curl --location --request POST 'localhost:8080/api/v1/dags/unpublished/dagRuns' \
--header 'Content-Type: application/json' \
--header 'Authorization: Basic YWRtaW46YWRtaW4=' \
--data-raw '{
"dag_run_id": "dag_run_1",
"conf": {
"key": "value"
}
}'

How to add flows in mininet with an OpenDayLight controller using Python

I'm working with a simple mininet topology, trying to learn how to manipulate flows with an ODL controller. The topology is:
Host1 -- OFSwitch1 -- OFSwitch2 -- Host 2 -- OFSwitch3 -- OFSwitch4 -- Host 3
I'm trying to achieve no connectivity from Host1 to Host3 by default; however, once a Python script is run, a flow is added that allows Host1 to ping Host3.
I'm just starting to learn ODL and cannot seem to get this basic project working.
Flows can be created through the OpenDaylight controller REST API, which in turn get reflected in the OVS switches where the network is simulated using mininet.
Please follow the steps below to create a flow in ODL and verify it in both ODL and OVS:
1) ODL Flow creation
curl -u admin:admin -H 'Content-Type: application/yang.data+xml' -X PUT -d @flow_data.xml http://192.168.1.196:8181/restconf/config/opendaylight-inventory:nodes/node/openflow:1/table/0/flow/10
flow_data.xml file content:
<flow xmlns="urn:opendaylight:flow:inventory">
    <priority>14865</priority>
    <flow-name>jpsampleFlow</flow-name>
    <idle-timeout>12000</idle-timeout>
    <match>
        <ethernet-match>
            <ethernet-type>
                <type>2048</type>
            </ethernet-type>
        </ethernet-match>
        <ipv4-source>10.0.0.1/32</ipv4-source>
        <ipv4-destination>10.0.0.2/32</ipv4-destination>
        <ip-match>
            <ip-dscp>28</ip-dscp>
        </ip-match>
    </match>
    <id>9</id>
    <table_id>0</table_id>
    <instructions>
        <instruction>
            <order>6555</order>
        </instruction>
        <instruction>
            <order>0</order>
            <apply-actions>
                <action>
                    <order>0</order>
                    <drop-action/>
                    <output-action>
                        <output-node-connector>1</output-node-connector>
                    </output-action>
                </action>
            </apply-actions>
        </instruction>
    </instructions>
</flow>
2) Verify the created flow in ODL:
curl -u admin:admin -X GET http://192.168.1.196:8181/restconf/operational/opendaylight-inventory:nodes/node/openflow:1/table/0/flow/10
3) Verify the same in OVS:
sudo ovs-ofctl dump-flows <switch_id>
Refer to the OpenDaylight wiki to learn more about flow creation in ODL.
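If you want to push the same flow from Python instead of curl, a rough sketch using the requests library (assuming the controller address, credentials, and flow_data.xml file from the steps above) would be:
# Sketch: create the flow above in OpenDaylight from Python instead of curl.
# Controller address and admin/admin credentials are taken from the answer;
# flow_data.xml is assumed to sit next to this script.
import requests

ODL_URL = ("http://192.168.1.196:8181/restconf/config/"
           "opendaylight-inventory:nodes/node/openflow:1/table/0/flow/10")

with open("flow_data.xml") as f:
    flow_xml = f.read()

response = requests.put(
    ODL_URL,
    data=flow_xml,
    auth=("admin", "admin"),
    headers={"Content-Type": "application/yang.data+xml"},
)
print(response.status_code, response.text)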

How to automatically download a file, behind a login page, riddled with javascript?

I want to periodically download a logfile from a device. By inspecting chrome -> inspect element -> network while manually downloading the file, I found the following curl-type command to display the log file:
curl 'http://10.0.0.1:1333/cgi/WebCGI?7043' -H 'Cookie: loginname=admin; password=e5T%7B%5CYnlcIX%7B; OsVer=2.17.0.32' -H 'Connection: keep-alive' --data '' --compressed
Is this command reliable? The username in the 'Cookie:' is the same as the login username, but the password is not (let's say the true password is p#$$w0rd). I guess e5T%7B%5CYnlcIX%7B is only a cookie session password and will expire at some stage. Am I correct?
Question 1
How can I achieve this reliably, with username admin and password p#$$w0rd, using a command-line program or Python library? This will run on a server and I do not want it to periodically break.
Question 2
What are the usual steps towards achieving: "I want to script a file download, which is initiated through a JavaScript button, which is behind a login page"?
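For illustration only, the usual pattern with the Python requests library looks roughly like this; the login endpoint and form field names below are hypothetical placeholders, and the real ones have to be read from the device's login request in the Network tab:
# Generic sketch of "log in, keep the session cookies, download the file".
# The /cgi/login endpoint and the form field names are hypothetical; inspect
# the device's actual login request to find the real ones.
import requests

BASE = "http://10.0.0.1:1333"
session = requests.Session()

# 1) Log in once; the Session object stores whatever cookies the device sets.
session.post(BASE + "/cgi/login", data={"username": "admin", "password": "p#$$w0rd"})

# 2) Reuse the same session (and its cookies) to fetch the protected log file.
log = session.get(BASE + "/cgi/WebCGI?7043")
log.raise_for_status()

with open("device.log", "wb") as f:
    f.write(log.content)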

Automate bugsense proguard mapping file upload using (apitoken/apikey)

Hi, I need to automate the Bugsense ProGuard mapping file upload using Python and the API (apitoken/apikey). I was trying the code from github.com/PanosJee/5004886 but nothing gets uploaded. I am able to curl the URLs specified in the Python code (.../errors.json and .../analytics.json) using my apikey and apitoken, but not any other URLs, which ask me to log in.
You could use curl as in the examples below.
APPTOKEN - the token provided for the application
ACCESSTOKEN - Bugsense access token, found in Account Info -> Integration -> API TOKEN
Bash script examples below:
iOS
export DSYMFILEPATH=file.dSYM
export APPTOKEN="fcccccca"
export ACCESSTOKEN="aaaaa4075aaaa69fbaaaa61"
curl -F "file=#$DSYMFILEPATH" --header "X-Bugsense-apikey: $APPTOKEN" --header "X-BugSense-auth-token: $ACCESSTOKEN" https://symbolicator.splkmobile.com/upload/dsym -i
Android
export PROGUARDMAPPINGFILE=mapping.txt
export APPTOKEN="acccccca"
export ACCESSTOKEN="aaaaa4075aaaa69fbaaaa61"
export APPVERSION="1.1"
curl -F "file=#$PROGUARDMAPPINGFILE" --header "X-Bugsense-apikey: $APPTOKEN" --header "X-BugSense-auth-token: $ACCESSTOKEN" --header "X-Bugsense-appver: $APPVERSION" https://symbolicator.splkmobile.com/upload/mapping -i
For more details: https://github.com/bugsense/docs/blob/master/api/read.md
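If the upload has to happen from Python rather than a bash script, a rough requests equivalent of the Android curl call above would be (a sketch; the tokens, version, and file path are the placeholder values from the answer):
# Sketch: upload a ProGuard mapping file to the Bugsense symbolication endpoint
# from Python, mirroring the Android curl command above. Tokens are placeholders.
import requests

APP_TOKEN = "acccccca"
ACCESS_TOKEN = "aaaaa4075aaaa69fbaaaa61"
APP_VERSION = "1.1"

headers = {
    "X-Bugsense-apikey": APP_TOKEN,
    "X-BugSense-auth-token": ACCESS_TOKEN,
    "X-Bugsense-appver": APP_VERSION,
}

with open("mapping.txt", "rb") as f:
    # files= produces a multipart upload, the equivalent of curl's -F "file=@mapping.txt".
    response = requests.post("https://symbolicator.splkmobile.com/upload/mapping",
                             headers=headers, files={"file": f})

print(response.status_code, response.text)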

How to use urllib2 when users only have a API token?

How would I transform this curl command:
curl -v -u 82xxxxxxxxxxxx63e6:api_token -X GET https://www.toggl.com/api/v6/time_entries.json
into urllib2?
I found this tutorial: http://www.voidspace.org.uk/python/articles/authentication.shtml
but they use a password and username. I can only use an API token.
Thank you.
see also this question:
Urllib2 raises 403 error while the same request in curl works fine
urllib2 is known to suck big time when it comes to authentication. To save some kitten lives, you should use http://docs.python-requests.org/en/latest/index.html
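As a concrete sketch of what that looks like with requests (mirroring the curl command from the question, where the API token is the basic-auth username and the literal string api_token is the password; the token value below is a placeholder):
# Sketch: the curl command from the question translated to the requests library.
# The token is a placeholder; as in the curl call, it is the basic-auth username
# and the literal string "api_token" is the password.
import requests

API_TOKEN = "82xxxxxxxxxxxx63e6"
url = "https://www.toggl.com/api/v6/time_entries.json"

response = requests.get(url, auth=(API_TOKEN, "api_token"))
response.raise_for_status()
print(response.json())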
