send data from app engine (python) to remote server (linux with php) - python

Is it possible to send data from an App Engine server to another, external server running an httpd service? I don't care if the payload is just 1 bit, I just need to make this happen.
I checked all over the place and found this:
"
Can't Open Sockets To Arbitrary Ports
Given that Google App Engine is a constrained runtime environment, it has an understandable limitation of preventing you from opening sockets on arbitrary ports. This restriction is necessary for security and scalability and Google can only be expected to enable these scenarios by providing their own wrapper libraries for each desired scenario. However, this leads to restrictions on important scenarios. For example, if your application wants to incorporate email and connect to an IMAP server, then you have no ability to do this on GAE.
While Google does plan to eventually add additional services to their capabilities, there is no plan for providing a general capability for opening sockets.
"
in here
and from here I have a new question: is it possible to keep a connection open from my Linux server to App Engine (the other way around)? If my Linux box were an Android device, it would be possible. Can my server pretend to be an Android device that listens for events from App Engine? I implemented push notifications on App Engine, and I have another server handling push for iPhones; when App Engine sends to all devices, I need the Linux box to know about it so it can push to all the iPhones as well.
Thanks!

So you have two questions:
Is it possible to send data from an App Engine server to another, external server with httpd service?
Yes, you can make outgoing HTTP connections from your GAE app via the URL Fetch service (link).
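For reference, here is a minimal sketch of such an outbound request with URL Fetch; the endpoint URL and payload are placeholders for your PHP script, not part of the original answer:

```python
# Minimal sketch: an outbound HTTP POST from a GAE (Python) handler using
# the built-in URL Fetch service. The target URL and payload are
# placeholders for the external PHP endpoint.
from google.appengine.api import urlfetch

def notify_external_server(payload):
    # POST a small form-encoded payload to the remote httpd/PHP script
    result = urlfetch.fetch(
        url='http://example.com/receive.php',   # hypothetical endpoint
        payload='data=' + payload,
        method=urlfetch.POST,
        headers={'Content-Type': 'application/x-www-form-urlencoded'},
        deadline=10)
    return result.status_code, result.content
```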
Is it possible to keep a connection from my Linux service to App Engine (the other way around)?
You can if you use the Channel API to stream events to your Linux machine. If I remember correctly, you'll just have to request a new connection token every hour.
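For illustration, the GAE side of that could look like the sketch below (google.appengine.api.channel calls only); the harder part, a channel client running on the Linux box to consume the token, is not shown:

```python
# Sketch of the GAE side only: create a channel token for a listener and
# push events to it. Tokens expire, so the listener has to re-request one
# periodically. client_id is any string your app chooses for the listener.
from google.appengine.api import channel

def open_channel_for(client_id):
    # Returns a token the listener uses to open its channel
    return channel.create_channel(client_id)

def push_event(client_id, message):
    # Called whenever the app pushes to Android; mirror it to the listener
    channel.send_message(client_id, message)
```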

Related

using multiple twisted socket servers together

So I have a single Twisted socket server that serves clients, and eventually I'll need to add more servers. The problem is that connections to the server are unique and cannot be shared among multiple server instances.
This is a problem if the servers are behind a load balancer, or if users of a single chat are spread across multiple server instances, because a message to that chat won't reach everyone.
How would I resolve this?
This can be a difficult task, since how load can be balanced depends on the underlying protocol (like HTTP for web servers).
Are you trying to design a load balancing system for basically any socket-based application? What I mean is that it is one thing to dispatch messages between multiple servers while ensuring correct synchronization; it is another to build a dynamic, self-balancing system for any communication protocol.
To build your load balancer, you can use a "TCP proxy" like HAProxy (http://www.haproxy.org/).
To handle the communication between your application server instances (behind the load balancing server), you can use messaging like ZeroMQ (http://zeromq.org/) or RabbitMQ (http://www.rabbitmq.com/). You'll find some common architecture patterns there.
There are Python libraries for both ZeroMQ and RabbitMQ, so the implementation within your Twisted-based server is not too hard.
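As a rough illustration of that fan-out idea (placeholder endpoints, and not integrated with the Twisted reactor; bindings such as txZMQ exist for that), each chat server instance could publish the messages it receives and subscribe to its peers:

```python
# Rough ZeroMQ sketch: every chat server instance publishes its locally
# received messages and subscribes to the other instances, so a message
# sent to a chat reaches clients connected to any server.
import zmq

ctx = zmq.Context()

# Socket this instance publishes its locally received chat messages on
pub = ctx.socket(zmq.PUB)
pub.bind('tcp://*:5556')

# Subscribe to the other instances (or to a central XPUB/XSUB proxy)
sub = ctx.socket(zmq.SUB)
sub.connect('tcp://chat-server-2:5556')    # hypothetical peer address
sub.setsockopt_string(zmq.SUBSCRIBE, '')   # receive all topics

def broadcast_local_message(chat_id, text):
    # Called when a client connected to *this* instance sends a message
    pub.send_string('%s %s' % (chat_id, text))

def drain_remote_messages(deliver):
    # Poll for messages published by peer instances and hand them to
    # deliver(chat_id, text), which forwards to locally connected clients.
    while sub.poll(timeout=0):
        chat_id, text = sub.recv_string().split(' ', 1)
        deliver(chat_id, text)
```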

Using PythonAnywhere as a game server

I'm building a turn-based game and I'm hoping to implement client-server style networking. I really just need to send the position of a couple of objects and some other easily encodable data. I'm pretty new to networking, although I've coded some basic stuff in socket and twisted. Now, though, I need to be able to send the data to a computer that isn't on my local network, and I can't do port forwarding since I don't have admin access to the router and I'm also not totally sure that would do the trick anyways since I've never done it. So, I was thinking of running some Flask or Bottle or Django, etc. code off PythonAnywhere. The clients would then send data to the server code on PythonAnywhere, and when the turn passed, the other client would just go look up the information it needed on the server. I guess then the server would act as just a data bank with some simple getter and setter methods. My question is how can this be implemented? Can my Socket code on my client program talk to my Flask code on PythonAnywhere?
Yes, client code can talk to your project at PythonAnywhere, as you will be given a unique project URL like http://yourblogname.pythonanywhere.com/. Your server will listen on port 80 at that URL.
It depends what sort of connection your clients need to make to the server. PythonAnywhere supports WSGI, which means "normal" HTTP request/response interactions -- GET, POST, etc. That works well for "traditional" web pages or web apps.
If your client side needs dynamic, two-way connections using non-HTTP protocols, raw sockets, or even WebSockets, PythonAnywhere doesn't support that at present.
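For the plain request/response case, the "data bank" the question describes could be a small Flask app along the lines of the sketch below; the route names and in-memory dict are illustrative assumptions, and a real app would persist state, since a module-level dict won't survive reloads or multiple workers:

```python
# Minimal sketch of a turn-based "data bank" served over WSGI on
# PythonAnywhere. Routes and state layout are illustrative assumptions.
from flask import Flask, request, jsonify

app = Flask(__name__)
game_state = {}  # e.g. {'player1_pos': [3, 4], 'turn': 'player2'}

@app.route('/state', methods=['GET'])
def get_state():
    # Clients poll this while waiting for the other player's turn
    return jsonify(game_state)

@app.route('/state', methods=['POST'])
def set_state():
    # The active player posts its updated positions / turn data here
    game_state.update(request.get_json(force=True))
    return jsonify(ok=True)
```

The client would then swap its raw socket code for plain HTTP calls, for example requests.get('http://yourblogname.pythonanywhere.com/state') while waiting, and a matching requests.post when its turn ends.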

Sync data with Local Computer Architecture

The scenario is
I have multiple local computers running a Python application. They are on separate networks, waiting for data to be sent to them from a web server. These computers have no static IP and generally sit behind a firewall and proxy.
On the other hand, I have a web server which gets updates from the user through a form and sends each update to the correct local computer.
Question
What options do I have to enable this? Currently I am sending CSV files over FTP, but this is not real time.
The application is built in Python, using Django for the web part.
Appreciate your help
Use a REST API. Then you can post information to your Django app over HTTP, using an authentication key if necessary.
http://www.django-rest-framework.org/ should help you get started quickly
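Sketched below is what the local-computer side could look like under this approach; since those machines have no static IP and sit behind firewalls, they poll the Django app outbound rather than accepting inbound connections. The URL, token, and response shape are assumptions, not a real API:

```python
# Hypothetical polling loop for a local computer: fetch pending updates
# from the Django/DRF endpoint using a token, apply them, repeat.
import time
import requests

API_URL = 'https://your-django-app.example.com/api/updates/'   # placeholder
HEADERS = {'Authorization': 'Token <your-auth-key>'}            # placeholder

def handle_update(update):
    # Application-specific logic goes here
    print('received update: %r' % (update,))

while True:
    resp = requests.get(API_URL, headers=HEADERS, timeout=30)
    for update in resp.json():          # assumed to return a JSON list
        handle_update(update)
    time.sleep(5)                       # poll interval; tune for latency
```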
Sounds like you need a message queue.
You would run a separate broker server to which your web app sends tasks. This could be on the same machine. On your local machines you would run queue workers which connect to the broker to receive tasks (so no inbound connection is required), then notify the broker in real time when the tasks are complete.
Examples are RabbitMQ and Oracle Tuxedo. What you choose will depend on your platform & software.
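For the RabbitMQ option, the worker on each local machine could look roughly like the pika consumer below; the broker host and queue name are placeholders, and note that the connection is opened outbound from the local machine to the broker:

```python
# Sketch of a queue worker on a local machine. It connects out to the
# RabbitMQ broker, so no inbound firewall rules are needed locally.
import pika

connection = pika.BlockingConnection(
    pika.ConnectionParameters(host='broker.example.com'))  # placeholder host
channel = connection.channel()
channel.queue_declare(queue='site_updates', durable=True)   # placeholder name

def on_message(ch, method, properties, body):
    print('got update: %s' % body)                  # apply the update locally
    ch.basic_ack(delivery_tag=method.delivery_tag)  # confirm completion

channel.basic_consume(queue='site_updates', on_message_callback=on_message)
channel.start_consuming()
```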

WebRTC LAN p2p VIDEO CHAT using PYZMQ possible?

I have built a messaging/chat application for my local network (all Windows) using pyzmq, with PyQt for the UI; it is based on the Majordomo pattern. It's set up this way:
each machine on the network has a client/worker pair
they connect to a 'server' broker via pyzmq and register sessions
sessions are broadcast by the 'server' broker to clients
when a 'sender' client sends a message to a specific session, the broker routes the message to the corresponding worker destination, a reply is generated by the worker, and it gets routed by the broker back to the 'sender' client (ending the loop and confirming delivery)
Everything is working well; text messages are composed in the 'client' PyQt UI and received by the 'worker' PyQt UI.
Now I'm looking to build upon this skeleton to add video chat to my application... I have been looking into webRTC and would like to find a way to implement it.
This is how WebRTC works, from what I gather (I could be severely wrong here, please correct me):
Machine A's Chrome browser opens a local video/audio stream from the webcam/mic via the JavaScript function webkitGetUserMedia, then creates a (Machine A) URL for the stream via the JavaScript function webkitURL
Sends (Machine A) URL to Machine B's Chrome browser via signaling server
Machine B's Chrome browser accepts and loads the (Machine A) URL, sets up its own local video/audio stream from webcam/mic via the previously mentioned JavaScript functions, and replies with a (Machine B) URL back to Machine A via the signaling server
Machine A's Chrome browser is displaying (Machine B) video/audio | Machine B's Chrome browser is displaying (Machine A) video/audio
Is that the process? Or is this a totally wrong assumption of how peers connect to each other?
If correct, I would like to adapt my current pyzmq application to act as a signaling server for creating connections between machines. Since the IP addresses of my machines are known to me and I can configure my firewall to open the needed ports, I'm trying to eliminate any extra STUN/TURN servers for this setup; I am not planning to go outside of my LAN or access remote machines. And I would like to handle everything (as much as possible) with Python and its included batteries (avoiding Node.js).
So the main question is: how should I go about integrating WebRTC into my setup? Does WebRTC need specific prerequisite libraries or APIs to be built and running on the signaling server or peer machines? Any code examples/advice/links would be appreciated.
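Not an authoritative answer, but as a sketch of the "pyzmq as signaling server" idea: the signaling side only has to relay opaque SDP offers/answers and ICE candidates between named peers, for example with a ROUTER loop like the one below. The message shape, port, and registration scheme are assumptions, and the browsers would still need some bridge (e.g. a WebSocket gateway) to reach this socket:

```python
# Hypothetical signaling relay: DEALER clients register a name, then send
# JSON messages addressed to another peer; the ROUTER forwards them as-is.
import json
import zmq

ctx = zmq.Context()
router = ctx.socket(zmq.ROUTER)
router.bind('tcp://*:5570')          # placeholder port

peers = {}  # peer name -> zmq identity frame

while True:
    identity, raw = router.recv_multipart()
    msg = json.loads(raw)
    if msg['type'] == 'register':
        peers[msg['name']] = identity
    elif msg['type'] in ('offer', 'answer', 'candidate'):
        target = peers.get(msg['to'])
        if target is not None:
            router.send_multipart([target, raw])   # relay untouched
```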

Python twisted client server interface

I'm designing a domotic (home automation) network, which would basically consist of a main node acting as a front-end, where a web server will be hosted, and a bunch of client nodes such as a video entry-phone, irrigation control, alarm, ...
The client nodes would register themselves with the front-end, which would then be able to control their functionality. I would like to use Python for this project, so Twisted and Django would be the two main frameworks.
What's the smoothest way of interfacing the clients with the front-end?
The clients themselves could serve a web page, but I'm not really happy with this solution since I would like to have a single, main web server.
Could a lightweight web server (Twisted's web server, for instance), lazily started by the front-end when the user wants to control a given client, be a solution here?
What about defining some "public" amp.Command classes which could be adapted by the server to render the client's specific web interface? (Some of the client's business logic would need to live on the server side that way.)
Any advice is welcome.
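For what it's worth, a minimal sketch of what such "public" amp.Command classes might look like with Twisted's AMP; the command names and arguments are purely illustrative:

```python
# Illustrative AMP commands shared between the front-end and client nodes.
from twisted.protocols import amp

class RegisterNode(amp.Command):
    # Sent by a client node (irrigation, alarm, ...) to the front-end
    arguments = [('name', amp.String()),
                 ('capabilities', amp.ListOf(amp.String()))]
    response = [('accepted', amp.Boolean())]

class SetState(amp.Command):
    # Sent by the front-end to a registered node to control one capability
    arguments = [('capability', amp.String()),
                 ('state', amp.String())]
    response = [('ok', amp.Boolean())]
```

In that reading, the nodes keep a persistent AMP connection open to the front-end after registering, and the single main web server issues these commands on the user's behalf rather than each node serving its own pages.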
