I am using Python to retrieve data from a Mongo database so I can analyze it. The data is changed by a Meteor app, and I use a Python client to retrieve it in real time. This is my code:
from MeteorClient import MeteorClient

def call_back_meth():
    print("subscribed")

client = MeteorClient('ws://localhost:3000/websocket')
client.connect()
client.subscribe('tasks', [], call_back_meth)
a = client.find('tasks')
print(a)
When I run this script, it only shows me the current data in 'a' and then the console closes.
I want the console to stay open and print the data whenever it changes.
I have used a while True loop to keep the script running and watch for changes, but I guess that's not a good solution. Is there a better, more optimized solution?
To get realtime feedback you need to subscribe to changes, and then monitor those changes. Here is an example of watching tasks:
import time

from MeteorClient import MeteorClient

def call_back_added(collection, id, fields):
    print('* ADDED {} {}'.format(collection, id))
    for key, value in fields.items():
        print('  - FIELD {} {}'.format(key, value))
    # query the data each time something has been added to
    # a collection to see the data `grow`
    all_lists = client.find('lists', selector={})
    print('Lists: {}'.format(all_lists))
    print('Num lists: {}'.format(len(all_lists)))

client = MeteorClient('ws://localhost:3000/websocket')
client.on('added', call_back_added)
client.connect()
client.subscribe('tasks')

# (sort of) hacky way to keep the client alive
# ctrl + c to kill the script
while True:
    try:
        time.sleep(1)
    except KeyboardInterrupt:
        break

client.unsubscribe('tasks')
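The same client also emits 'changed' and 'removed' events, so you can watch updates and deletions the same way. A minimal sketch, assuming the callback signatures documented by python-meteor (verify against your version):

def call_back_changed(collection, id, fields, cleared):
    print('* CHANGED {} {}'.format(collection, id))

def call_back_removed(collection, id):
    print('* REMOVED {} {}'.format(collection, id))

client.on('changed', call_back_changed)
client.on('removed', call_back_removed)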
I'm setting up a WebSocket that receives market data for 33 pairs, processes the data, and inserts it into a local MySQL database.
What I've tried so far:
Setting up the WebSocket works fine. At first I processed the data in the on_message function and inserted it directly into the database.
--> The problem was that with 33 pairs the WebSocket kept stacking up market data in its buffer, and after a few minutes the database would lag behind by at least 10 seconds.
Then I tried processing the data through a thread: the on_message function would start a thread that simply puts the market data into an array, like below:
datas = []

def add_queue(symbol, t, a, b, r_n):
    global datas
    datas.append([symbol, t, a, b, r_n])

# inside on_message, for each currency ("C") event:
if json_msg['ev'] == "C":
    symbol = json_msg['p'].replace("/", "-")
    round_number = pairs_dict_new[symbol]
    t = Thread(target=add_queue, args=(symbol, json_msg['t'], json_msg['a'], json_msg['b'], round_number,))
    t.start()
Then another function, running in its own looping thread, would pick the data up and insert it into the database:
def add_db():
    global datas
    try:
        # db = mysql.connector.connect(
        #     host="104.168.157.164",
        #     user="bvnwurux_noe_dev",
        #     password="Tickprofile333",
        #     database="bvnwurux_tick_values"
        # )
        while True:
            for x in datas:
                database.add_db(x[0], x[1], x[2], x[3], x[4])
                if x in datas:
                    datas.remove(x)
    except KeyboardInterrupt:
        print("program ending..")

t2 = Thread(target=add_db)
t2.start()
This still gave a delay, and the threaded version wasn't actually using much CPU, but rather RAM, and it was even worse.
Instead of using a WebSocket with a thread, I tried plain web requests to the API: one thread per symbol, each looping over a web request and sending the result to the database. My issues here were that MySQL connections don't play well with threads (sometimes two threads would issue a request on the same connection at the same time and crash), and it would still be delayed by the time needed to process the code, even without a buffer. The code took too long to process each response to keep the delay under 10 seconds. A reconstructed sketch of that per-symbol approach is below.
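Roughly, each per-symbol thread looked like this (a reconstructed sketch, not my exact code; API_URL and the response keys are placeholders, and database.add_db is the same helper as above):

import threading
import requests

API_URL = 'https://example.invalid/last_quote'  # placeholder, not the real endpoint

def poll_symbol(symbol, round_number):
    # one loop per pair: fetch the latest quote, write it straight to the db
    while True:
        r = requests.get(API_URL, params={'pair': symbol, 'apiKey': '<API KEY>'})
        q = r.json()
        # the shared MySQL connection inside database.add_db is what the
        # threads would occasionally hit at the same time and crash
        database.add_db(symbol, q['t'], q['a'], q['b'], round_number)

for symbol, round_number in pairs.items():
    threading.Thread(target=poll_symbol, args=(symbol, round_number), daemon=True).start()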
Here is a little example of the basic WebSocket code I used to get the data:
pairs={'AUDCAD':5,'AUDCHF':5,'AUDJPY':3,'AUDNZD':5,'AUDSGD':2,'AUDUSD':5,'CADCHF':5,'CADJPY':3,'CHFJPY':3,'EURAUD':5,'EURCAD':5,'EURCHF':5,'EURGBP':5,'EURJPY':3,'EURNZD':5,'EURSGD':5,'EURUSD':5,'GBPAUD':5,'GBPCAD':5,'GBPCHF':5,'GBPJPY':3,'GBPNZD':5,'GBPSGD':5,'GBPUSD':5,'NZDCAD':5,'NZDCHF':5,'NZDJPY':3,'NZDUSD':5,'USDCAD':5,'USDCHF':5,'USDJPY':3,'USDSGD':5,'SGDJPY':3}
import json

import rel
import websocket

def on_open(ws):
    print("Opened connection")
    ws.send('{"action":"auth","params":"<API KEY>"}')  # connecting with secret api key

def on_message(ws, message):
    print("msg", message)
    json_msg = json.loads(message)[0]
    if json_msg['status'] == "auth_success":  # successfully authenticated
        r = ws.send('{"action":"subscribe","params":"C.*"}')  # subscribing to currencies
        print("should subscribe to {}".format(list(pairs)))
    # once the websocket is connected to all the pairs, process the data
    # --> process json_msg here

if __name__ == "__main__":
    # websocket.enableTrace(True)  # just to show all the requests made (debug mode)
    ws = websocket.WebSocketApp("wss://socket.polygon.io/forex",
                                on_open=on_open,
                                on_message=on_message)
    ws.run_forever(dispatcher=rel)  # set dispatcher to enable automatic reconnection
    rel.signal(2, rel.abort)  # keyboard interrupt
    rel.dispatch()
The last method I tried was multiprocessing, but that in turn crashed my server because it used 100% CPU, and then requests made to the Apache server wouldn't get through or took a long time to load. It's really a balancing problem.
I'm using an Ubuntu server with 32 CPUs, based in London, and the Polygon API is based in NYC.
I also tried with 4 CPUs in Seattle connecting to NYC, but still no luck.
Even with 4 pairs and 32 CPUs, it would eventually reach a 10-second delay. I think this is more of a code structure problem.
OK, so I'm doing a project on retrieving the health details of a remote server using Python, and I'm hosting the main server using Flask. But I don't know how to send the health report, which I have created using Python, to the Flask app. The health report is in the form of a dictionary, and I need to pass the values of the dictionary into columns in my database, where the columns are the keys of the dictionary. Can someone please help me send the health report to the Flask app? The health report is generated on another system and I need to send it to my main server.
import psutil
import time
import json
import requests

'''
This program will be loaded on to the target server.
A flask app will transmit health data to the main flask app.
'''

SERVER_NAME = "test_local_server"

def getHealth():  # function for generating health report. Returns a json object.
    print('generating health report')
    report = {}
    report['sever_name'] = SERVER_NAME
    report['cpupercent'] = psutil.cpu_percent(interval=2.0)
    report['ctime'] = psutil.cpu_times()
    report['cpu_total'] = report['ctime'].user + report['ctime'].system
    report['disk_usages'] = psutil.disk_usage("/")
    report['net'] = psutil.net_io_counters()
    report['bytes_sent'] = report['net'].bytes_sent
    report['bytes_received'] = report['net'].bytes_recv
    report['packets_sent'] = report['net'].packets_sent
    report['packets_received'] = report['net'].packets_recv
    report['mem'] = psutil.virtual_memory()
    report['memory_Free'] = report['mem'].free
    json_report = json.dumps(report)
    return json_report

if __name__ == '__main__':
    print(f'starting health report stream for server :\t{SERVER_NAME}')
    while True:
        getHealth()
This is the code for generating the health details. How do I send this back to my Flask app in the form of a dictionary?
Client
I would start by simplifying that code somewhat:
import psutil

STATS_URL = 'http://localhost:5000/'
SERVER_NAME = "test_local_server"

def get_health():
    print('generating health report')
    cpu_percent = psutil.cpu_percent(interval=2.0)
    cpu_times = psutil.cpu_times()
    disk_usage = psutil.disk_usage("/")
    net_io_counters = psutil.net_io_counters()
    virtual_memory = psutil.virtual_memory()

    # The keys in this dict should match the db cols
    report = dict(
        sever_name=SERVER_NAME,
        ctime=cpu_times.__str__(),
        disk_usages=disk_usage.__str__(),
        net=net_io_counters.__str__(),
        mem=virtual_memory.__str__(),
        cpupercent=cpu_percent,
        cpu_total=cpu_times.user + cpu_times.system,
        bytes_sent=net_io_counters.bytes_sent,
        bytes_received=net_io_counters.bytes_recv,
        packets_sent=net_io_counters.packets_sent,
        packets_received=net_io_counters.packets_recv,
        memory_Free=virtual_memory.free,
    )
    return report
This get_health function builds and returns a report dictionary. Notice that for some of the return values from the psutil functions, I've used the built-in __str__ method. This ensures a friendly type that can be inserted into the database.
If you want to check the types yourself, you can do something like:
for item in report:
    print(item, type(report[item]), report[item])
Next have this function run in a loop, with a desired time delay between requests:
if __name__ == '__main__':
    import time
    import requests

    print(f'starting health report stream for server :\t{SERVER_NAME}')
    while True:
        report = get_health()
        r = requests.post(STATS_URL, json=report)
        print(r, r.json())
        time.sleep(1)
Notice this uses the json argument to requests.post, which automatically sets the Content-Type header that Flask's request.get_json function expects.
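If you ever need the manual equivalent (say, with a client that lacks the json argument), you can serialize and set the header yourself:

import json
import requests

r = requests.post(STATS_URL,
                  data=json.dumps(report),
                  headers={'Content-Type': 'application/json'})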
Server
This is pretty easy to receive:
from flask import Flask, request

app = Flask(__name__)

@app.route('/', methods=['POST'])
def index():
    incoming_report = request.get_json()
    add_to_db(incoming_report)  # We'll build this in a sec.
    return {'message': 'success'}
You can now work with incoming_report which is a dictionary.
This also sends a success message back to the client, so on the client you'll see the output:
starting health report stream for server : test_local_server
generating health report
<Response [200]> {'message': 'success'}
# Repeats until killed
Database
and I need to pass the values of the dictionary into columns which are the keys of the dictionary in my database
Now that you have a dictionary incoming_report it should be easy to add this to your database if you're using an ORM.
Something along the lines of this answer should allow you to simply unpack that dictionary. So assuming your model is called Report you could simply do something like:
def add_to_db(d):
    report = Report(**d)
    db.session.add(report)
    db.session.commit()
Note this could probably use some validation, and authentication if your deployment requires this.
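As a rough sketch of that validation, assuming the column names match the report keys used above, you could whitelist the keys before unpacking:

ALLOWED_KEYS = {
    'sever_name', 'ctime', 'disk_usages', 'net', 'mem', 'cpupercent',
    'cpu_total', 'bytes_sent', 'bytes_received', 'packets_sent',
    'packets_received', 'memory_Free',
}

def add_to_db(d):
    # drop unexpected keys so Report(**d) never receives stray kwargs
    clean = {k: v for k, v in d.items() if k in ALLOWED_KEYS}
    report = Report(**clean)
    db.session.add(report)
    db.session.commit()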
I'm trying to convert a synchronous flow in Python code, which is based on callbacks, to an asynchronous flow using asyncio.
Basically the code interacts a lot with TCP/UNIX sockets. It reads data from the sockets, manipulates it to make decisions and writes stuff back to the other side. This is going on over multiple sockets at once and data is shared between the contexts to make decisions sometimes.
EDIT: The code is currently based mostly on registering a callback with a central entity for a specific socket, and having that entity run the callback when the relevant socket is readable (something like "call this function when that socket has data to be read"). Once the callback is called, a bunch of stuff happens, and eventually a new callback is registered for when new data is available. The central entity runs select over all registered sockets to figure out which callbacks should be called.
I'm trying to do this without refactoring my entire code and making this as seamless as possible to the programmer - so I was trying to think about it like so - all code should run the same way as it does today - but whenever the current code does a socket.recv() to get new data - the process would yield execution to other tasks. When the read returns, it should go back to handling the data from the same point using the new data it got.
To do this, I wrote a new class called AsyncSocket, which interacts with the IO streams of asyncio. I placed the async/await statements almost solely in there, thinking that I would implement the recv method in my class to make it look like a "regular IO socket" to the rest of my code.
So far, this is my understanding of what async programming should allow.
Now to the problem:
My code awaits clients connecting; when one does, each client's context is allowed to read and write over its own connection.
I've simplified the flow to the following to clarify the problem:
import asyncio

class AsyncSocket():
    def __init__(self, reader, writer):
        self.reader = reader
        self.writer = writer

    def recv(self, numBytes):
        print("called recv!")
        data = self.read_mitigator(numBytes)
        return data

    async def read_mitigator(self, numBytes):
        print("Awaiting of AsyncSocket.reader.read")
        data = await self.reader.read(numBytes)
        print("Done Awaiting of AsyncSocket.reader.read data is %s " % data)
        return data

def mit2(aSock):
    return mit3(aSock)

def mit3(aSock):
    return aSock.recv(100)

async def echo_server(reader, writer):
    print("New Connection!")
    aSock = AsyncSocket(reader, writer)  # create a new A-sync socket class and pass it on to the regular code
    while True:
        data = await some_func(aSock)  # this would eventually read from the socket
        print("Data read is %s" % (data))
        if not data:
            break
        writer.write(data)  # echo everything back

async def main(host, port):
    server = await asyncio.start_server(echo_server, host, port)
    await server.serve_forever()

asyncio.run(main('127.0.0.1', 5000))
mit2() and mit3() are synchronous functions that do stuff with the data on the way back before returning to the main client's loop - but here I'm just using them as empty functions.
The problem starts when I play with the implementation of some_func().
A pass-through implementation (edit: kind of works), but it still has issues:
def some_func(aSock):
    try:
        return (mit2(aSock))  # works
    except:
        print("Error!!!!")
While an implementation which reads the data and does something with it - like adding a suffix before returning, throws an error:
def some_func(aSock):
    try:
        return (mit2(aSock) + "something")  # doesn't work
    except:
        print("Error!!!!")
The error (as far as I understand it) means it's not really doing what it should:
New Connection!
called recv!
/Users/user/scripts/asyncServer.py:36: RuntimeWarning: coroutine 'AsyncSocket.read_mitigator' was never awaited
return (mit2(aSock) + "something") # doesn't work
RuntimeWarning: Enable tracemalloc to get the object allocation traceback
Error!!!!
Data read is None
And the echo server obviously doesn't work.
Obviously my real code looks more like option #2, with a lot more stuff in some_func(), mit2() and mit3(), but I can't get this to work. I'm fairly new to using asyncio/async/await, so what (rather basic, I guess) concept am I missing?
This code won't work as envisioned:
def recv(self, numBytes):
    print("called recv!")
    data = self.read_mitigator(numBytes)
    return data

async def read_mitigator(self, numBytes):
    ...
You cannot call an async function from a sync function and get the result; you must await it, which ensures that you return to the event loop in case the data is not yet ready. This mismatch between async and sync code is sometimes referred to as the issue of function color.
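You can see this in isolation: calling a coroutine function only creates a coroutine object, it does not run the body:

import asyncio

async def fetch():
    return 42

print(fetch())               # <coroutine object fetch at 0x...>, not 42,
                             # plus a "never awaited" RuntimeWarning at exit
print(asyncio.run(fetch()))  # 42 - an event loop must drive the coroutine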
Since your code is already using non-blocking sockets and an event loop, a good approach to porting it to asyncio might be to first switch to the asyncio event loop. You can use event loop methods like sock_recv to request data:
import asyncio

def start():
    loop = asyncio.get_event_loop()
    sock = make_socket()  # make sure it's non-blocking
    # on modern Pythons sock_recv returns a coroutine, so schedule it
    # to get a future we can attach a callback to
    future_data = asyncio.ensure_future(loop.sock_recv(sock, 1024))
    future_data.add_done_callback(continue_read)
    # return to the event loop - when some data is ready
    # continue_read will be invoked

def continue_read(future):
    data = future.result()
    print('got', data)
    # ... do something with data, e.g. process it
    # and call sock_sendall with the response

loop = asyncio.get_event_loop()
loop.call_soon(start)  # pass the function itself, don't call it
loop.run_forever()
Once you have the program working in that mode, you can start moving to coroutines, which allow the code to look like sync code, but work in exactly the same way:
async def start():
    loop = asyncio.get_event_loop()
    sock = make_socket()  # make sure it's non-blocking
    data = await loop.sock_recv(sock, 1024)
    # data is available "immediately", meaning the coroutine gets
    # automatically suspended when awaiting data that is not yet
    # ready, and automatically re-scheduled when the data is ready
    print('got', data)

asyncio.run(start())
The next step can be eliminating make_socket and switching to asyncio streams.
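For illustration, a minimal sketch of that streams-based form (host and port here are placeholders):

import asyncio

async def start():
    # open_connection replaces make_socket and the loop.sock_* calls entirely
    reader, writer = await asyncio.open_connection('127.0.0.1', 5000)
    data = await reader.read(1024)
    print('got', data)
    writer.close()
    await writer.wait_closed()

asyncio.run(start())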
I am attempting to create a script that uses a value in a remote database to create a wake-on-LAN (wol) packet.
I need to pass the value of a callback parameter (something I honestly don't know what to call) to a variable. I am unable to just assign it to the variable, and I can't figure out how to import it from somewhere else. I need "payload", which is printed and returned under "def message", to be saved to the variable "data".
Below is my code, and I will link the MQTT code that this relies on.
# Import standard python modules.
import sys

# Import Adafruit IO MQTT client.
from Adafruit_IO import MQTTClient
from Adafruit_IO import *

# Set to your Adafruit IO key & username below.
ADAFRUIT_IO_KEY = 'where the api key goes'
ADAFRUIT_IO_USERNAME = 'my username'  # See https://accounts.adafruit.com
                                      # to find your username.

# Set to the ID of the feed to subscribe to for updates.
FEED_ID = 'test1'

# Define callback functions which will be called when certain events happen.
def connected(client):
    # Connected function will be called when the client is connected to Adafruit IO.
    # This is a good place to subscribe to feed changes. The client parameter
    # passed to this function is the Adafruit IO MQTT client so you can make
    # calls against it easily.
    print('Connected to Adafruit IO! Listening for {0} changes...'.format(FEED_ID))
    # Subscribe to changes on a feed named DemoFeed.
    client.subscribe(FEED_ID)

def disconnected(client):
    # Disconnected function will be called when the client disconnects.
    print('Disconnected from Adafruit IO!')
    sys.exit(1)

def message(client, feed_id, payload):
    # Message function will be called when a subscribed feed has a new value.
    # The feed_id parameter identifies the feed, and the payload parameter has
    print('Feed {0} received new value: {1}'.format(FEED_ID, payload))
    return (payload == payload)

data = message(x, x, x)

# Create an MQTT client instance.
client = MQTTClient(ADAFRUIT_IO_USERNAME, ADAFRUIT_IO_KEY)

# Setup the callback functions defined above.
client.on_connect = connected
client.on_disconnect = disconnected
client.on_message = message

# Connect to the Adafruit IO server.
client.connect()

# Start a message loop that blocks forever waiting for MQTT messages to be
# received. Note there are other options for running the event loop like doing
# so in a background thread--see the mqtt_client.py example to learn more.
if data == '1':
    print('Latest value from Test: {0}'.format(data.value))
    wol.send_magic_packet('my mac addy')
    time.sleep(3)
    # Send a value to the feed 'Test'.
    aio.send('test1', 0)
    print("worked this time")

client.loop_blocking()
Here is the link to the other code it relies on https://github.com/adafruit/io-client-python/blob/master/Adafruit_IO/mqtt_client.py
First of all, you don't need to import the same module twice:
from Adafruit_IO import MQTTClient # this imports part MQTTClient
from Adafruit_IO import * # this imports everything, including MQTTClient again
As for your problem, you've returned (payload == payload), which will always be True. I'm not sure what you're trying to do here, but it should look something like this:
def message(client, feed_id, payload):
    ...
    print('Feed {0} received new value: {1}'.format(FEED_ID, payload))
    return payload == payload_from_database  # what you return will be saved as the variable data

data = message("Steve the happy client", 85, "£100")  # in this case, data will be payload
I'm connecting to an IRC server, but while it's sitting waiting for data I'd like the program to be able to grab input from the terminal and relay it to the server; essentially I'd type JOIN #foobar and the program would send JOIN #foobar. The current code looks like:
def receive(self):
    while True:
        raw = self.socket.recv(4096).decode()
        raw_split = raw.splitlines()
        if not raw:
            break
        for line in raw_split:
            #if line.find('MODE {0} :'.format(self.config['nick'])) > -1:
            #    placeholder for perform
            data = line.split()
            if data[0] == 'PING':
                self.send('PONG {0}'.format(data[1]))
            color_print("-> {0}".format(data), 'yellow')
            #self.plugin.run(data)
Any ideas how to do this?
Take a look at the select module. You can use it to wait on multiple file-like objects including a socket and stdin/stdout/stderr.
There's some example code at this site.
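For instance, a minimal sketch of your receive loop reworked around select (assuming self.socket is the connected IRC socket; note that select on sys.stdin only works on Unix):

import select
import sys

def receive(self):
    while True:
        # block until the IRC socket or stdin is readable
        readable, _, _ = select.select([self.socket, sys.stdin], [], [])
        for source in readable:
            if source is self.socket:
                raw = self.socket.recv(4096).decode()
                if not raw:
                    return
                for line in raw.splitlines():
                    data = line.split()
                    if data and data[0] == 'PING':
                        self.send('PONG {0}'.format(data[1]))
                    color_print("-> {0}".format(data), 'yellow')
            else:
                # a line typed in the terminal - relay it verbatim to the server
                command = sys.stdin.readline().strip()
                if command:
                    self.send(command)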