curl to xively using python

I'm looking to push json packets to xively using this command:
jstr = '''{ "version":"1.0.0", "datastreams":[{"id":"one","current_value":"100.00"},{"id":"two","current_value":"022.02"},{"id":"three","current_value":"033.03"},{"id":"four","current_value":"044.04"}] }'''
and running it by calling this:
cmd = "curl --request PUT %s --header %s --verbose %s" % (jstr,apiKey,feedId)
I'm doing it this way so I can manipulate the JSON between transmissions (I change it to a dict and back).
It's throwing an error saying that no data was sent. I'm new to curl, Xively and Python, so it's really confusing me. Any help would be greatly appreciated.

The best way to achieve this is to use the official Python module provided by Xively.
Here are a few reasons for not doing it the way you described:
the official library provides a nice and simple API
you don't need to care what the data format actually is
calling the curl command to make an HTTP request every time is inefficient, as it takes time for the OS to spawn a new process
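If you do want to stay with raw HTTP rather than the official library, you can build the PUT request in-process instead of shelling out to curl. A minimal Python 3 sketch with only the standard library - the v2 feed endpoint and the X-ApiKey header are assumptions based on Xively's REST API of the time, and the key/feed values are placeholders:

```python
import json
import urllib.request

API_KEY = "YOUR_API_KEY"  # placeholder
FEED_ID = "YOUR_FEED_ID"  # placeholder

def build_request(data, feed_id=FEED_ID, api_key=API_KEY):
    """Build a PUT request for the (assumed) Xively v2 feed endpoint."""
    body = json.dumps(data).encode("utf-8")
    return urllib.request.Request(
        "https://api.xively.com/v2/feeds/%s.json" % feed_id,
        data=body,
        headers={"X-ApiKey": api_key, "Content-Type": "application/json"},
        method="PUT",
    )

# Keep the payload as a plain dict between transmissions and
# serialize it only when sending, instead of dict -> str -> dict round trips.
payload = {
    "version": "1.0.0",
    "datastreams": [
        {"id": "one", "current_value": "100.00"},
        {"id": "two", "current_value": "022.02"},
    ],
}
req = build_request(payload)
# urllib.request.urlopen(req)  # uncomment to actually send
```

This sidesteps both the process-spawning overhead and the shell-quoting problems in the curl command string.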

Related

How to retrieve CouchDB replication status from Python

In bash I can retrieve a json blob of CouchDB replication status docs easily:
curl -X GET -u admin:password localhost:5984/_scheduler/docs/_replicator
Is it possible to retrieve the same information in Python using the couchdb library? I've tried this:
couch_db.view("_scheduler/docs/_replicator", include_docs=True)
.. but this returns a ('not_found', 'missing') error.
It turns out that I was making this more complicated than required. Using the Python requests library makes this task trivial:
requests.get("http://localhost:5984/_scheduler/docs/_replicator", auth=("admin", "password"))
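Building on that, the scheduler response can be reduced to per-document states in a couple of lines. The shape below (a "docs" list whose entries carry "doc_id" and "state") follows CouchDB's _scheduler/docs output, but treat the exact field names as assumptions to verify against your server:

```python
def replication_states(scheduler_response):
    """Map each replication doc id to its state (e.g. 'running', 'failed')."""
    return {doc["doc_id"]: doc["state"]
            for doc in scheduler_response.get("docs", [])}

# Sample payload shaped like a /_scheduler/docs/_replicator response
sample = {
    "total_rows": 1,
    "docs": [{"doc_id": "my_rep", "database": "_replicator", "state": "running"}],
}
print(replication_states(sample))  # {'my_rep': 'running'}
```

With the requests call above, you would pass `response.json()` into this function.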

Marklogic extract DB status from multiple servers?

I need to create a bash (or Python) script which gives me the availability status of multiple databases on different servers. I found that I can get the status using this URL: "http://marklogic:8002/manage/v2/database/$DBNAME/?view=status". But I have about twenty different DBs. Opening this link generates an XML document with database details. Can you please advise how I can loop over all the links and grep only the status row? Or if you have any other idea, please advise.
It might be worth looking into the MarkLogic Python API project on Github:
https://github.com/marklogic/python_api
HTH!
You can keep the dbnames in a file and then use for loop around it.
for a in `cat dbname.txt`
do
status=$(wget -qO- "http://marklogic:8002/manage/v2/database/${a}/?view=status")
echo "$a, $status"
done
yep, I did it through curl --anyauth --user user:pass "http://marklogic:8002/manage/v2/database/${a}/?view=status"
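The same loop is also straightforward in pure Python. A standard-library sketch - the digest-auth setup mirrors the --anyauth curl call above, the URL pattern is taken verbatim from the question, and the parsing of the returned XML is left to you since the Manage API's element names aren't shown here:

```python
import urllib.request
import xml.etree.ElementTree as ET

def status_url(dbname):
    # URL pattern taken from the question
    return "http://marklogic:8002/manage/v2/database/%s/?view=status" % dbname

def fetch_status(dbname, user, password):
    """Fetch and parse one database's status XML, using digest auth."""
    mgr = urllib.request.HTTPPasswordMgrWithDefaultRealm()
    mgr.add_password(None, status_url(dbname), user, password)
    opener = urllib.request.build_opener(urllib.request.HTTPDigestAuthHandler(mgr))
    with opener.open(status_url(dbname)) as resp:
        return ET.fromstring(resp.read())

# The loop itself -- one database name per line in dbname.txt:
# with open("dbname.txt") as f:
#     for name in f.read().split():
#         tree = fetch_status(name, "user", "pass")
#         print(name, tree)

print(status_url("Documents"))
```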

Compressing JSON sent via CGI in Python

The webApp I'm currently developing requires large JSON files to be requested by the client, built on the server using Python, and sent back to the client. The solution is implemented via CGI and is working correctly in every way.
At this stage I'm just employing various techniques to minimize the size of the resulting JSON objects sent back to the client, which are around 5-10 MB (without going into detail, this is more or less fixed and cannot be lazy-loaded in any way).
The host we're using doesn't support mod_deflate or mod_gzip, so while we can't configure Apache to automatically create gzipped content on the server with .htaccess, I figure we'll still be able to receive it and decode on the client side as long as the Content-encoding header is set correctly.
What I was wondering is: what's the best way to achieve this? Gzipping something in Python is trivial, and I already know how to do that, but the problem is:
How do I compress the data in such a way, that printing it to the output stream to send via CGI will be both compressed, and readable to the client?
The files have to be created on the fly, based upon input data, so storing premade and prezipped files is not an option, and they have to be received via xhr in the webApp.
My initial experiments with compressing the JSON string with gzip and io.StringIO, then printing it to the output stream, caused it to be printed in Python's byte-literal format, e.g. b'\n\x91\x8c\xbc\xd4\xc6\xd2\x19\x98\x14x\x0f1q!\xdc|C\xae\xe0 and such, which bloated the request to twice its normal size...
I was wondering if someone could point me in the right direction here with how I could accomplish this, if it is indeed possible.
I hope I've articulated my problem correctly.
Thank you.
I guess you use print() (which first converts its argument to a string before sending it to stdout) or sys.stdout (which only accepts str objects).
To write directly on stdout, you can use sys.stdout.buffer, a file-like object that supports bytes objects:
import sys
import gzip

s = 'foo' * 100
# sys.stdout only accepts str; its .buffer attribute takes raw bytes
sys.stdout.buffer.write(gzip.compress(s.encode()))
Which gives valid gzip data:
$ python3 foo.py | gunzip
foofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoofoo
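The same check can be done without a shell pipeline, entirely inside Python (a minimal sketch):

```python
import gzip

payload = ('foo' * 100).encode()
compressed = gzip.compress(payload)
# gzip.decompress is the exact inverse of gzip.compress, so the round trip is lossless
assert gzip.decompress(compressed) == payload
print(len(payload), '->', len(compressed))
```

Repetitive input like this compresses dramatically, which is exactly why the 5-10 MB JSON responses above are worth gzipping.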
Thanks for the answers Valentin and Phillip!
I managed to solve the issue, both of you contributed to the final answer. Turns out it was a combination of things.
Here's the final code that works:
response = json.JSONEncoder().encode(loadData)
sys.stdout.write('Content-type: application/octet-stream\n')
sys.stdout.write('Content-Encoding: gzip\n\n')
sys.stdout.flush()  # flush the text headers before writing the binary body
sys.stdout.buffer.write(gzip.compress(response.encode()))
After switching to sys.stdout instead of print for the headers, and flushing the stream, the client managed to read it correctly. Which is pretty curious... always something more to learn.
Thanks again!

POST flask server with XML from python

I have a Flask server up and running on pythonanywhere, and I am trying to write a Python script that I can run locally to trigger a particular response - let's say the server time, for the sake of this discussion.
There is plenty of documentation on how to write the Flask server side of this process, but very little on how to write something that can trigger the Flask app to run.
I have tried sending XML in the form of a simple curl command e.g.
curl -X POST -d '<From>Jack</From><Body>Hello, it worked!</Body>' URL
But this doesn't seem to work (errors about referral headers).
Could someone let me know the correct way to compose some XML which can be sent to a listening Flask server?
Thanks,
Jack
First, I would add -H "Content-Type: text/xml" to the headers in the cURL call so the server knows what to expect. It would be helpful if you posted the server code (not necessarily everything, but at least what's failing).
To debug this I would use:
@app.before_request
def before_request():
    if True:
        print "HEADERS", request.headers
        print "REQ_path", request.path
        print "ARGS", request.args
        print "DATA", request.data
        print "FORM", request.form
It's a bit rough, but helps to see what's going on at each request. Turn it on and off using the if statement as needed while debugging.
Running your request without the xml header in the cURL call sends the data to the request.form dictionary. Adding the xml header definition results in the data appearing in request.data. Without knowing where your server fails, the above should give you at least a hint on how to proceed.
EDIT referring to comment below:
I would use the excellent xmltodict library. Use this to test:
import xmltodict

@app.before_request
def before_request():
    print xmltodict.parse(request.data)['xml']['From']
with this cURL call:
curl -X POST -d '<xml><From>Jack</From><Body>Hello, it worked!</Body></xml>' localhost:5000 -H "Content-Type: text/xml"
'Jack' prints out without issues.
Note that this call has been modified from your question - the 'xml' tag has been added, since XML requires a root node (it's called an XML tree for a reason...). Without this tag you'll get a parsing error from xmltodict (or any other parser you choose).
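If you'd rather not add a dependency, the same extraction works with the standard library's ElementTree. A Python 3 sketch, assuming the same rooted payload as the cURL call above - inside the Flask handler, `body` would simply be `request.data`:

```python
import xml.etree.ElementTree as ET

body = b'<xml><From>Jack</From><Body>Hello, it worked!</Body></xml>'
root = ET.fromstring(body)      # raises ParseError on a rootless payload too
print(root.findtext('From'))    # Jack
```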

Sending messages with Telegram - APIs or CLI?

I would like to be able to send a message to a group chat in Telegram. I want to run a python script (which makes some operations that already works) and then, if some parameters have some values the script should send a message to a group chat through Telegram. I am using Ubuntu, and Python 2.7
I think, if I am not wrong, that I have two ways to do that:
Way One: make the Python script connect to the Telegram APIs directly and send the message (https://core.telegram.org/api).
Way Two: make the Python script call the Telegram's CLI (https://github.com/vysheng/tg), pass some values to this and then the message is sent by the Telegram's CLI.
I think that the first way is longer, so a good idea might be using the Way Two.
In this case I really don't know how to proceed.
I don't know much about scripting in Linux, but I tried this:
#!/bin/bash
cd /home/username/tg
echo "msg user#******** messagehere" | ./telegram
sleep 10
echo "quit" | ./telegram
this half-works: it sends the message correctly, but then the process remains open. The second problem is that I have no clue how to call this from Python and how to pass a value to it. The value I would like to pass is the "messagehere" var: a 100-200 character message defined inside the Python script.
Does anyone have any clues on that?
Thanks for replies, I hope this might be useful for someone else.
Telegram recently released their new Bot API which makes sending/receiving messages trivial. I suggest you also take a look at that and see if it fits your needs, it beats wrapping the client library or integrating with their MTProto API.
import urllib
import urllib2
# Generate a bot ID here: https://core.telegram.org/bots#botfather
bot_id = "{YOUR_BOT_ID}"
# Request latest messages
result = urllib2.urlopen("https://api.telegram.org/bot" + bot_id + "/getUpdates").read()
print result
# Send a message to a chat room (chat room ID retrieved from getUpdates)
result = urllib2.urlopen("https://api.telegram.org/bot" + bot_id + "/sendMessage", urllib.urlencode({ "chat_id": 0, "text": 'my message' })).read()
print result
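For reference, the same two Bot API calls in Python 3, where urllib2 was split into urllib.request and urllib.parse. A sketch - the bot id is a placeholder, and the network calls are wrapped in functions so nothing is sent until you call them:

```python
import urllib.parse
import urllib.request

BOT_ID = "{YOUR_BOT_ID}"  # generate one via BotFather, as above

def api_url(method):
    """Build the Bot API endpoint for a given method name."""
    return "https://api.telegram.org/bot" + BOT_ID + "/" + method

def get_updates():
    """Fetch the latest messages (chat ids appear in this response)."""
    return urllib.request.urlopen(api_url("getUpdates")).read()

def send_message(chat_id, text):
    """POST a message to a chat; form-encoded like the urllib2 version."""
    data = urllib.parse.urlencode({"chat_id": chat_id, "text": text}).encode()
    return urllib.request.urlopen(api_url("sendMessage"), data).read()

print(api_url("getUpdates"))
```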
Unfortunately I haven't seen any Python libraries you can interact directly with, but here is a NodeJS equivalent I worked on for reference.
Since version 1.05 you can use the -P option to accept messages from a socket, which is a third option to solve your problem. Sorry that this is not really an answer to your question, but I am not able to comment on your question because I do not have enough reputation.
First create a bash script for telegram called tg.sh:
#!/bin/bash
now=$(date)
to=$1
subject=$2
body=$3
tgpath=/home/youruser/tg
LOGFILE="/home/youruser/tg.log"
cd ${tgpath}
${tgpath}/telegram -k ${tgpath}/tg-server.pub -W <<EOF
msg $to $subject
safe_quit
EOF
echo "$now Recipient=$to Message=$subject" >> ${LOGFILE}
echo "Finished" >> ${LOGFILE}
Then put the script in the same folder as your Python script, and give it +x permission with chmod +x tg.sh
And finally from python, you can do:
import subprocess
subprocess.call(["./tg.sh", "user#****", "message here"])
I'm working with pytg, which can be found here:
A Python package that wraps around Telegram messenger CLI
It works pretty well; I already have a Python bot based on that project.
You can use safe_quit to terminate the connection instead, since it waits until everything is done before closing the connection and terminating the application:
#!/bin/bash
cd /home/username/tg
printf "msg user#******** messagehere\nsafe_quit\n" | ./telegram
use this as a simple script and call it from python code as the other answer suggested.
I would recommend the first option.
Once you are comfortable with generating an AuthKey, you should start to get a handle on the documentation.
To help, I have written a detailed step-by-step guide of how I wrote the AuthKey generation code from scratch here.
It's in vb.net, but the steps should help you do same in python.
