Thrift server facade for clients - python

I am having a little trouble with a server/client design and wonder if anyone has any advice.
I have a Thrift server that abstracts a data store. The idea is that there will be a number of clients that are essentially out-of-process plugins: they use the interface provided by the server to receive and manipulate the underlying data store, and they also provide their own data.
There will be a number of other clients which simply access the data
provided by the server and its "plugins".
The problem case is when one of these "plugins" wishes to provide its
own data and provide an interface to that data.
The server should have no knowledge of the plugin's data or interface.
I would ideally like all clients to access functionality through the main Thrift server, so it acts as a facade for the plugins. If a client requested some data provided by a plugin, the main server could delegate to the plugin to provide that data. I guess this would mean having each plugin be both a Thrift client and a server. I have written the server in Python, so I could probably handle Thrift calls that are not yet defined, but would it be possible to forward these calls to another Thrift server, i.e. act as a proxy?
An alternative is to have the plugins be clients only and push data to the server. But the format of these messages would be unknown to the server and would have to be generic enough to accommodate different types of data. I'm not sure how I would provide a useful interface to this data for other clients.
As far as I can see, only a plugin knows how to store and manipulate the data it owns, so this idea probably would not work.
Thanks for any advice. Any suggestions welcomed.

Sounds like you need some sort of a mechanism to correlate requests to the different plugins available. Ideally, there should be a different URL path per set of operations published for each plugin.
I would consider implementing a sort of map/dictionary of URL paths to plugins. Then for each request received, do a lookup in the map and get the associated plugin and send it the request accordingly. If there is no entry in the map, then a redirect/proxy could be sent. For example if URL = http://yourThriftServer/path/operation, the operation or the path and operation would map to a plugin.
An extra step would be to implement a sort of meta request, whereby a client could query what URL paths/operations are available in the server.
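A minimal sketch of that dispatch map in Python; the registry, the handler interface, and the call method are all illustrative names, not part of any real Thrift API (in a real system each handler might wrap a Thrift client connected to the plugin's own server):

```python
# Hypothetical registry mapping path prefixes to plugin handlers;
# none of these names come from Thrift itself.

class UnknownPathError(Exception):
    pass

class PluginRegistry:
    def __init__(self):
        self._plugins = {}

    def register(self, path_prefix, handler):
        """Associate a path prefix (e.g. 'inventory') with a plugin handler."""
        self._plugins[path_prefix] = handler

    def paths(self):
        """The 'meta request': list the path prefixes currently served."""
        return sorted(self._plugins)

    def dispatch(self, path, operation, args):
        """Look up the plugin owning `path` and forward the call to it."""
        handler = self._plugins.get(path)
        if handler is None:
            raise UnknownPathError(path)
        # In a real system, handler.call() might issue a Thrift RPC to
        # the plugin's own server, making this server a proxy/facade.
        return handler.call(operation, args)
```

The facade routes every request it does not recognize as its own through dispatch(), so the plugins stay invisible to the other clients.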

Related

How can I create a custom protocol like HTTP, FTP, accessible from a browser?

I have an application which communicates over its own protocol. Suppose a user installs the application and wants to get data; they need to use that protocol. Let my protocol be XYZ, so that the user is able to use it from their existing browser to get the data. Example: XYZ://myApplication/Query. I am not sure how I can achieve this. It should be noted that the application is running on the user's device itself.
Basically, I want to route a specific request to my application and let other requests function normally. I am using Python for the development of my application.
Additionally, if my approach is wrong, what can I do to create my own communication protocol which can be easily used with existing browsers?
You want to create your own scheme?
You can do it, but only if you create the right backend which can respond to requests in your protocol.
You can also find the details here.
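As an illustration of one common approach: on Windows, browsers will hand a custom scheme to a local application if that scheme is registered as a URL protocol handler in the registry. A minimal sketch, assuming Windows; the scheme name and application path are placeholders:

```python
# Register a custom xyz:// URL scheme on Windows so the browser
# launches our application when the user opens an xyz:// link.
# Assumes Windows; the application path below is a placeholder.
import winreg

SCHEME = "xyz"
APP_COMMAND = r'"C:\MyApp\myapp.exe" "%1"'  # %1 receives the full URL

def register_scheme():
    # HKEY_CURRENT_USER\Software\Classes avoids needing admin rights.
    root = winreg.CreateKey(winreg.HKEY_CURRENT_USER,
                            r"Software\Classes\%s" % SCHEME)
    winreg.SetValue(root, "", winreg.REG_SZ, "URL:%s Protocol" % SCHEME)
    # The empty "URL Protocol" value marks this key as a URL scheme.
    winreg.SetValueEx(root, "URL Protocol", 0, winreg.REG_SZ, "")
    winreg.SetValue(root, r"shell\open\command", winreg.REG_SZ, APP_COMMAND)
    winreg.CloseKey(root)

if __name__ == "__main__":
    register_scheme()
```

The application then parses the URL it receives on its command line. Clicks on xyz:// links get routed to your app, while normal http:// links keep working in the browser as before.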

Client-Server framework for python

I'm currently working on a University project that needs to be implemented with a Client - Server model.
I have had experiences in the past where I was managing the communication at the socket level, and that really sucked.
I was wondering if someone could suggest an easy to use python framework that I can use for that purpose.
I don't know what kind of details you may need to answer so I'm just going to describe the project briefly.
Communication should happen over HTTP, possibly HTTPS.
The server does not need to send data back or invoke methods on the clients; it just collects data.
Many clients send data concurrently to the server, which needs to distinguish the sender, process the data accordingly, and put the result in a database.
You can use something like Flask or Django. Both frameworks are fairly easy to work with; Flask is much easier than Django IMO, although Django has a built-in authentication layer that you can use, albeit one that is more difficult to apply in a client/server scenario like yours.
I would personally use Flask and JWT (JSON Web Tokens), which will allow you to give a token to each client for authentication with the server and will also let you differentiate between clients, and you can use HTTPS for your SSL/TLS requirement. It is tons easier to implement this, and although I like Django better for what it brings to the table, it is probably overkill to have you learn it for a single assignment.
For Flask with SSL, here is a quick rundown of that.
For JWT with Flask, here is that.
You can use any database system you would like.
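A minimal sketch of that approach, assuming the PyJWT package for the tokens; the endpoint name, the secret, and how tokens are handed out are illustrative, not prescriptive:

```python
# Minimal Flask server that authenticates clients via a JWT in the
# Authorization header and collects their posted data.
# Assumes: pip install flask pyjwt; keep the secret out of source control.
import jwt
from flask import Flask, request, jsonify

app = Flask(__name__)
SECRET = "change-me"  # placeholder shared secret

def issue_token(client_id):
    """Give each client a token identifying it (run once per client)."""
    return jwt.encode({"client_id": client_id}, SECRET, algorithm="HS256")

@app.route("/data", methods=["POST"])
def collect():
    auth = request.headers.get("Authorization", "")
    if not auth.startswith("Bearer "):
        return jsonify(error="missing token"), 401
    try:
        claims = jwt.decode(auth[len("Bearer "):], SECRET,
                            algorithms=["HS256"])
    except jwt.InvalidTokenError:
        return jsonify(error="bad token"), 401
    # claims["client_id"] identifies the sender; store the data keyed by it.
    payload = request.get_json()
    print("from", claims["client_id"], ":", payload)  # stand-in for a DB write
    return jsonify(status="ok")

if __name__ == "__main__":
    # "adhoc" generates a throwaway self-signed cert (needs pyOpenSSL);
    # use a real certificate in production.
    app.run(ssl_context="adhoc")
```

Each client then sends its token in the Authorization header with every request, which is how the server distinguishes senders before writing to the database.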
If I understood you correctly you can use any web framework in python. For instance, you can use Flask (I use it and I like it). Django is also a popular choice among the python web frameworks. However, you shouldn't be limited to only these two. There are plenty of them out there. Just google for them.
The implementation of the client depends on what kind of communication there will be between the clients and the server - I don't have enough details here. I only know it's unidirectional.
The client can be a browser accessing your web application written in Flask, where users send only POST requests to the server. However, even here the communication will be bidirectional (the clients need to open the page, which means the server sends data back to the client), which violates your initial requirement.
Then it can be a specific client written in Python sending particular requests to your server over HTTP/HTTPS. For instance, your client can use the requests package to send HTTP requests.
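A minimal sketch of such a client; the URL and payload are placeholders:

```python
# Minimal unidirectional client: send one JSON record to the server.
import requests

resp = requests.post("https://example.com/submit",
                     json={"sensor": "A", "value": 42})
resp.raise_for_status()  # fail loudly if the server rejected the data
print(resp.status_code)
```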

Python implementing simple web data storage

I am trying to develop a Python PyQt program that allows users to enter personal particulars and review them at a later time for processing purposes.
The program will be used by fewer than 5 persons at the same time, so I am thinking of using an SQLite3 database, as I believe it should be able to cope with that amount of traffic.
The framework I have in mind is that the clients will have their own copy of my Python PyQt program on each machine. Whenever they perform any operation that requires a data read/write, the program will connect to the server through the internet and read/write from the sqlite.db on the server.
Basically, the server will be nothing but remote data storage.
Currently, I am able to create the required GUI for data input by using various widgets like QLineEdit, QComboBox, QTextEdit and so on.
But I have never done network programming before, so I have no idea how to implement a server that stores the SQLite data file for my software. So my questions are:
(1) If I have a PC that has a 24/7 internet connection, how do I set it up so that it can act as a server that stores the data file for my software?
(2) In what way can/should my program communicate with that server through the internet?
Even if you can't give me an exact answer, I would appreciate it if you could provide some pointers so that I can look them up and study them.
Any constructive advice will be appreciated.
FYI: all the PCs will be running Windows XP SP3 32-bit.
There are different ways for a client to communicate with a server.
You can use
XMLRPC to create an object with methods that are called on the server side
You can use HTTP and REST for the server, with the requests library or urllib for the client
For the latter you can use flask, bottle, django or other frameworks to create a website that serves the content (tutorials)
You can use Pyro to remotely access the objects on the server. Useful if the clients should also communicate with each other.
You can create your own protocol. You will learn a lot and value the other options.
The list is not complete
I suggest that you have a look at XMLRPC if that fits (a minimal sketch follows). For number 2 I can say that many APIs use such an HTTP interface (Twitter, GitHub, Facebook, Google). It is easy for other people to use as well.
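For example, using only the standard library; the host, port, and the store method are illustrative, and the list stands in for your SQLite writes:

```python
# server.py - expose a "store" method over XML-RPC (standard library only)
from xmlrpc.server import SimpleXMLRPCServer

records = []  # stand-in for writing to the SQLite database

def store(person):
    """Accept a dict of personal particulars and keep it; return an id."""
    records.append(person)
    return len(records)

server = SimpleXMLRPCServer(("0.0.0.0", 8000), allow_none=True)
server.register_function(store)
server.serve_forever()
```

```python
# client.py - call the server's method as if it were a local function
import xmlrpc.client

proxy = xmlrpc.client.ServerProxy("http://your-server-ip:8000/")
record_id = proxy.store({"name": "Alice", "age": 30})
print("stored as record", record_id)
```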
Security is important. I am not an expert. If you send a username and password in plain text, then use SSL to encrypt the connection. If you cannot get SSL to work with Python, you can use stunnel.

Storage Backend based on Websockets

I have spent quite some time now researching server backends/APIs/frameworks. I need a solution where I can store user content (JSON & binary data).
The obvious choice would be a REST API. The only missing element is a push feature for when data on the server changes, so that clients can be notified instantly. With more research in this matter I discovered classic approaches (Comet, push, server-sent events, Bayeux, BOSH, …) as well as the "new" league, WebSockets. I would definitely prefer the method with WebSockets or using TCP sockets directly. But this post is not about the pros/cons of these two technologies, so please restrain yourself from getting sidetracked in the comments.
At the moment the following projects exist which are very similar to my needs:
- Simperium (simperium.com): this looks very promising, but the core/server is sadly not open source, and god knows when, if ever, that step will happen
- Realtime.co (framework.realtime.co/storage): hosted service, but same principle
- Some frameworks for building servers, such as Atmosphere (Java, no WAMP), Cometd (Java, project page looks like it's stuck in the '90s), Autobahn (Python, WAMP)
My actual favorite is the Autobahn framework (autobahn.ws), especially using the WAMP protocol (a protocol on top of WebSocket), as it offers exactly what I need. So the idea would be to build a Python backend/server with Autobahn Python (based on the Twisted framework) which manages all socket (WAMP) connections and includes a PostgreSQL database for data storage. WAMP libraries already exist for all desired clients. The server would need to be able to do the typical REST API features:
- Send, update, delete requested data (JSON/Binary) from/to server/clients
- Synchronization & automatic conflict management
- Offline handling when the connection breaks, with automatic restart when the connection is available again
So finally the questions:
- Have I missed an open source project which covers exactly my needs?
- If I would like to develop my own server with Autobahn and a database, could you point me in the right direction? I have a lot of concerns and not enough deep understanding. I know Autobahn already gives you a server, but that one is not very close to my final needs. How do I build a server efficiently so that it can handle all the connected sockets? How do I handle it when a client needs a server push? Are there schemas, models or concepts for how such a server should look?
- Twisted is a very powerful Python framework but not regarded as the most convenient for writing apps. But I guess a socket-based storage server with DB access should be possible? If I run Twisted as a web resource and develop server components with another Python framework, would this compromise the latency/performance much?
- Is such a server backend with a lot of data storage (JSON fields and also binary data such as documents and images) reasonable to build with sockets by a single developer/small team, or is this something which only bigger companies like Dropbox can do at the moment?
Thank you very much for your help & time!
So finally the questions:
Have I missed an open source project which covers exactly my needs?
No, you've covered the open source projects. Open source only gets you about halfway there, though. Implementing a global realtime network requires equal parts implementation and operations. You have to think about dropped messages, retries, what happens if a particular geography gets hot, how you scale your servers, etc. I would argue that an open source solution won't achieve what you want unless you're willing to invest significant resources into operations. I would recommend a service like PubNub: http://pubnub.com
If I would like to develop my own server with Autobahn and a database, could you point me in the right direction? I have a lot of concerns and not enough deep understanding. I know Autobahn already gives you a server, but that one is not very close to my final needs. How do I build a server efficiently so that it can handle all the connected sockets? How do I handle it when a client needs a server push? Are there schemas, models or concepts for how such a server should look?
A good database to back a realtime framework would be Cassandra because it supports high write volumes and handles time series data well: http://cassandra.apache.org/.
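For a sense of what a time-series write looks like with the DataStax Python driver; the keyspace and table below are illustrative, not prescribed:

```python
# Sketch: time-series writes to Cassandra with the DataStax driver.
# Assumes: pip install cassandra-driver, a local node, and a table like
#   CREATE TABLE app.events (user_id text, ts timestamp, payload text,
#                            PRIMARY KEY (user_id, ts));
from datetime import datetime
from cassandra.cluster import Cluster

cluster = Cluster(["127.0.0.1"])
session = cluster.connect("app")
session.execute(
    "INSERT INTO events (user_id, ts, payload) VALUES (%s, %s, %s)",
    ("user-42", datetime.utcnow(), '{"doc": 1}'),
)
```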
Twisted is a very powerful Python framework but not regarded as the most convenient for writing apps. But I guess a socket-based storage server with DB access should be possible? If I run Twisted as a web resource and develop server components with another Python framework, would this compromise the latency/performance much?
I would not use Twisted. I would use gevent: http://www.gevent.org/. It's coroutine-based, so you don't get into callback hell. To support more connections you just increase the size of the greenlet pool that listens on the socket.
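A minimal gevent sketch of that pattern, with a bounded greenlet pool accepting connections; the port and the line-echo "protocol" are placeholders for real WebSocket/WAMP framing:

```python
# Sketch: gevent TCP server with a bounded greenlet pool.
# Each connection gets its own greenlet; the pool caps concurrency.
from gevent.pool import Pool
from gevent.server import StreamServer

def handle(sock, addr):
    # Placeholder protocol: echo lines back. A real server would speak
    # WebSocket/WAMP framing here and push updates when data changes.
    f = sock.makefile(mode="rwb")
    for line in f:
        f.write(line)
        f.flush()
    sock.close()

pool = Pool(10000)  # raise this to accept more concurrent sockets
server = StreamServer(("0.0.0.0", 9000), handle, spawn=pool)
server.serve_forever()
```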
Is such a server backend with a lot of data storage (JSON fields and also binary data such as documents and images) reasonable to build with sockets by a single developer/small team, or is this something which only bigger companies like Dropbox can do at the moment?
Again, I would not build this on your own. A service like PubNub (http://pubnub.com), which takes care of all the operational issues for you and has a clean API, would serve your needs with minimal cost. PubNub takes care of the protocol for you, so if you're on a mobile device that doesn't support WebSockets it will use TCP, HTTP or whatever the best transport is for the device.
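For a feel of the hosted approach, publishing a change notification with PubNub's Python SDK looks roughly like this; the keys, uuid, and channel name are placeholders, and the exact calls should be checked against the current SDK docs:

```python
# Rough sketch of a publish with the PubNub Python SDK (v4-style
# fluent API); the demo keys and channel name are placeholders.
from pubnub.pnconfiguration import PNConfiguration
from pubnub.pubnub import PubNub

config = PNConfiguration()
config.publish_key = "demo"      # placeholder keys
config.subscribe_key = "demo"
config.uuid = "client-1"
pubnub = PubNub(config)

# Push a change notification to every subscriber of the channel.
pubnub.publish().channel("storage-updates").message({"doc_id": 1}).sync()
```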

Fine-grained authorisation with ZODB

I have been looking into using ZODB as a persistence layer for a multiplayer video game. I quite like how seamlessly it integrates with arbitrary object-oriented data structures. However, I am stumbling over one issue where I can't figure out whether ZODB can resolve it for me.
Apparently, one can use the ClientStorage from ZEO to access a remote data storage used for persistence. While this is great in a trusted local network, one can't do this without proper authorization and authentication in an open network.
So I was wondering, if there is any chance to realize the following concept with ZODB:
On the server-side I would like to have a ZEO server running plus a simulation of the game world that might operate as a fully authorized client on the ZEO server (or use the same file storage as the ZEO server).
On the client side I'd need very restricted read/write access to the ZEO server, so that a client can only view the information its user is supposed to know about (e.g. the surrounding area of their character) and can only modify information related to the actions that their character can perform.
These restrictions would have to be imposed by the server using some sort of fine-grained authorisation scheme. So I would need to be able to tell the server whether user A has permissions to read/write object B.
Now, is there a way to do this in ZODB, or are there third-party solutions for this kind of problem? Or is there a way to extend ZEO in this direction?
No, ZEO was never designed for such use.
It is designed for scaling ZODB access across multiple processes instead, with authentication and authorisation left to the application on top of the data.
I would not use ZEO for anything beyond a local network anyway. Use a different protocol to handle communication between game clients and game server instead, keeping the ZODB server side only.
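A sketch of what "authorisation in the application on top of the data" could look like: the game server holds the only ZODB connection and checks permissions before touching the object graph. The permission rule and data layout are purely illustrative:

```python
# Sketch: application-level authorization in front of ZODB.
# Game clients never talk to ZEO; they talk to this server code
# over a separate protocol.
import transaction
import ZODB
from persistent.mapping import PersistentMapping

db = ZODB.connection(None)  # in-memory demo; use a FileStorage path in production
root = db.root()
root["characters"] = PersistentMapping()
root["characters"]["alice"] = PersistentMapping({"hp": 10, "owner": "user-a"})
transaction.commit()

def can_access(user, name, mode):
    """Illustrative rule: anyone may read, only the owner may write."""
    char = root["characters"].get(name)
    return char is not None and (mode == "read" or char["owner"] == user)

def handle_request(user, name, mode, update=None):
    """Every client request funnels through here before touching the DB."""
    if not can_access(user, name, mode):
        raise PermissionError("%s may not %s %s" % (user, mode, name))
    if mode == "write":
        root["characters"][name].update(update)
        transaction.commit()
    return dict(root["characters"][name])
```

The key design point is that the permission check lives in the game server, not in the storage layer, which matches how ZODB/ZEO were intended to be used.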
