I am relatively new to programming. I have a project where I need to control things like pumps and read data from sensors. I currently have a Flask web server set up on a Raspberry Pi and can access that website only from within my local network.
My objective is to turn my Flask web server into a Flask REST API. Then I want to connect my Raspberry Pi to a cloud/IoT platform and control the Pi over the cloud via the Flask REST API.
Any idea on how to best do this? I've been researching it a lot and could use some help.
I would set up the API endpoints like you discussed. Those endpoints, I assume, would allow you to control the sensors and whatever other hardware is attached to the GPIO. Then, as you mentioned, you would expose that API to some sort of IoT platform. AWS offers a nice solution, but there are many others.
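For illustration, here is a minimal sketch of what such endpoints could look like on the Pi, assuming Flask and RPi.GPIO are installed; the pin number, routes and the read_temperature() placeholder are example assumptions, not a prescribed layout:

```python
# A minimal sketch, not a complete solution: the pin number, routes and the
# read_temperature() placeholder are example assumptions.
from flask import Flask, jsonify
import RPi.GPIO as GPIO

PUMP_PIN = 17  # example BCM pin driving a pump relay

app = Flask(__name__)
GPIO.setmode(GPIO.BCM)
GPIO.setup(PUMP_PIN, GPIO.OUT)

def read_temperature():
    # Placeholder: replace with however your sensor is actually read.
    return 21.5

@app.route("/pump/<state>", methods=["POST"])
def set_pump(state):
    # Drive the relay pin high for "on", low for anything else.
    GPIO.output(PUMP_PIN, GPIO.HIGH if state == "on" else GPIO.LOW)
    return jsonify({"pump": state})

@app.route("/sensor", methods=["GET"])
def sensor():
    return jsonify({"temperature": read_temperature()})

if __name__ == "__main__":
    # Listen on all interfaces so the API is reachable beyond localhost.
    app.run(host="0.0.0.0", port=5000)
```

Once the API is reachable from outside the local network, an IoT platform (or any HTTP client) can simply hit these endpoints.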
I have come a relatively long way from what I knew when asking this question. For other new programmers trying to figure out how to bring their project/idea to reality, here is what I did to deploy my local Flask web server on the Pi to "the cloud":
I used a company named Linode to host my web server, but there are many other cloud platforms you could choose from, like DigitalOcean and Heroku. All I did was purchase their entry-level plan, connect via SSH to the new server, copy over the files containing my web server, and add security provisions. Then you can use SSH to connect to your server and use the terminal to do what you need. This was the process I followed to deploy to the cloud. There are an abundance of options and ways to do this, but I found this one relatively flexible and cost-effective, giving me the chance to participate in the IoT world.
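One small detail worth noting when moving off the local network: on the cloud server the Flask app has to listen on all interfaces, not just localhost. A minimal sketch, assuming your Flask app object is importable as myapp.app (the module name and port are placeholders; for a real deployment you would normally put a WSGI server such as gunicorn in front of it):

```python
# run.py on the cloud server -- a minimal sketch; "myapp" and the port are
# placeholder assumptions for your own project layout.
from myapp import app  # the Flask app object copied over from the Pi project

if __name__ == "__main__":
    # 0.0.0.0 makes the server reachable from outside the machine itself,
    # not just from localhost as in the original local-network setup.
    app.run(host="0.0.0.0", port=5000)
```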
I know this information is obvious to a lot of experienced programmers, but for the ones just getting started I hope this explanation can provide some clarity on your way forward.
I'm developing a financial app using PySimpleGUI.
This is a desktop app, and it will be sold publicly on my web page. I need a place to store my future clients' data.
Does Google Cloud Storage work for a desktop app, and is it safe? (There will be sensitive financial data stored.) Also, multiple people will be editing the files simultaneously; will this cause problems with Google Cloud Storage?
Would you recommend using something else for storing my data?
Thanks
I have tried connecting to SQL Server, but it only works for computers that are on the same network.
Your choice of components seems out of sync with the problem. My suggestion would be to first work out what the actual requirements are. A few small examples: how many people will access the application at a time, what data access controls will be present, and how will you implement them? Can you use features of GCP, or will you be developing your own? In either scenario, are you involving data masking? What would the design be for expanding the application in the long run, and so on. Also, a small disclaimer: these questions barely scratch the surface of the complexities involved in designing such data systems.
Once done, go through the list of tools available in GCP. See what fits and how an efficient chain can be established.
Also, connecting to GCP via Python works from anywhere, depending on how you set up the environment.
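For example, here is a minimal sketch using the official google-cloud-storage client library; the bucket name, object paths and the service-account key path are placeholder assumptions:

```python
# A minimal sketch using google-cloud-storage (pip install google-cloud-storage).
# The bucket name, object name and key path below are placeholders.
import os
from google.cloud import storage

# Point the client at a service-account key; this is what makes it work
# from any machine, not just ones inside a particular network.
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/path/to/service-account.json"

client = storage.Client()
bucket = client.bucket("my-financial-app-data")

# Upload a client's data file, then download it again.
blob = bucket.blob("clients/client-123.json")
blob.upload_from_filename("client-123.json")
blob.download_to_filename("client-123-copy.json")
```

Whether this is "safe" for sensitive financial data depends mostly on how you manage credentials and access control in front of the bucket, not on Cloud Storage itself; also note that Cloud Storage has no record locking, so simultaneous edits to the same object effectively follow a last-write-wins model unless you use preconditions.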
Disclaimer: I am a novice programmer
I am currently following a tutorial, http://www.raywenderlich.com/3932/networking-tutorial-for-ios-how-to-create-a-socket-based-iphone-app-and-server, to build a chat application on the iPhone using socket servers. For other purposes, I am using Google App Engine to maintain the backend of my app and hold onto other pieces of data. It only makes sense to have all my backend code located in one place, so I was wondering whether Google App Engine will support my socket programming, as there seem to be quite a few restrictions: https://developers.google.com/appengine/docs/python/sockets/#limitations-and-restrictions
In fact, it almost looks as if there are too many restrictions. However, Google says on that page that "Libraries that import socket, such as poplib or nntplib, and that don't violate the limitations and restrictions listed below, should work without modification," which suggests there are things I can do to modify my work so that it runs on Google App Engine.
My question: is it possible to apply what I'm learning about socket programming to run the backend for my chat on Google App Engine? If so, how do I modify my code where necessary? If not, what app server should I look into so I can at least host my chat backend on another server, if not on Google App Engine? If you think I should take another approach altogether to implement chat in my iPhone app, I would love to hear that as well. Thank you for your input.
I think you should not open the socket yourself; use APNs on iPhone and Google Cloud Messaging on Android, so it's not your app that has to manage the TCP socket (send keepalives, reopen it when it closes or when connectivity changes, etc.). You'll also be able to receive data (and be woken up) even when your app is closed.
Received messages (from APNs/GCM) can contain the actual data, or simply be a "hey, go check for new messages on the server" hint. To send a message, you can simply use an HTTP request.
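For the sending side, here is a minimal sketch of the "simple HTTP request" idea using the requests library; the URL, token and payload shape are hypothetical and would be defined by your own backend:

```python
# A minimal sketch: the URL, auth token and payload fields are hypothetical.
import requests

def send_chat_message(sender_id, recipient_id, text):
    resp = requests.post(
        "https://your-backend.example.com/api/messages",
        json={"from": sender_id, "to": recipient_id, "text": text},
        headers={"Authorization": "Bearer <auth-token>"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

# The backend would store the message and then ask APNs/GCM to notify the
# recipient's device, either with the message itself or a "new message" hint.
```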
I am currently working on a project where we need to establish ESB-like communication, on a small scale, between a REST API and the app's services.
Scenario:
Assume a web app front end (e.g. Django/Python or Ruby/Rails) and services that are accessible via HTTP RESTful requests.
How can I:
make it configurable which web services are called for a given web request, depending on the request and without requiring code changes (through keys, for example; see the sketch after this list)
encapsulate or implement the services in a way that makes them easy to manage, e.g. start/stop them, etc.
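To illustrate the first point, here is a minimal, hypothetical sketch of key-based routing in Python/Flask; the service keys and URLs are assumptions and would normally live in a config file or database, so routing changes need no code changes:

```python
# A minimal, hypothetical sketch: the service keys and URLs are assumptions and
# would normally come from a config file or database rather than code.
import requests
from flask import Flask, request, jsonify

SERVICE_MAP = {
    "billing": "http://localhost:5001/api",
    "reports": "http://localhost:5002/api",
}

app = Flask(__name__)

@app.route("/dispatch/<service_key>/<path:resource>", methods=["GET", "POST"])
def dispatch(service_key, resource):
    base_url = SERVICE_MAP.get(service_key)
    if base_url is None:
        return jsonify({"error": "unknown service key"}), 404
    # Forward the incoming request to whichever service the key maps to.
    upstream = requests.request(
        request.method,
        f"{base_url}/{resource}",
        params=request.args,
        json=request.get_json(silent=True),
        timeout=10,
    )
    return upstream.content, upstream.status_code, dict(upstream.headers)
```

Adding or swapping a service then becomes a configuration change rather than a code change; for the start/stop side, running each service under a process supervisor (supervisord, systemd, etc.) is one common approach.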
I have been looking at spring.io, but I can't work out whether it could be used for this.
I am open to all suggestions.
Thanks
From what I understand, you want an authorisation solution.
In Rails, Pundit and CanCanCan are very popular. You could also implement it from scratch. Here is a screencast to help you get started.
I have spent quite some time now researching server backends/APIs/frameworks. I need a solution where I can store user content (JSON and binary data).
The obvious choice would be a REST API. The only missing element is a push feature: when data on the server changes, clients should be notified instantly. Researching this further, I came across the classic approaches (Comet, push, server-sent events, Bayeux, BOSH, ...) as well as the "new" league, WebSockets. I would definitely prefer the WebSocket approach, or using raw TCP sockets directly. But this post is not about the pros/cons of these two technologies, so please refrain from getting sidetracked in the comments.
At the moment, the following projects exist which are very similar to my needs:
- Simperium (simperium.com): this looks very promising, but the core/server is sadly not open source, and who knows when, if ever, that will change
- Realtime.co (framework.realtime.co/storage): a hosted service, but the same principle
- Some frameworks for building such servers, e.g. Atmosphere (Java, no WAMP), CometD (Java, project page looks like it's stuck in the '90s), Autobahn (Python, WAMP)
My current favorite is the Autobahn framework (autobahn.ws), especially using the WAMP protocol (a WebSocket subprotocol), as it offers exactly what I need. So the idea would be to build a Python backend/server with Autobahn|Python (based on the Twisted framework) which manages all socket (WAMP) connections and includes a PostgreSQL database for data storage; a rough sketch of such a component follows the feature list below. WAMP libraries already exist for all the clients I need. The server would need to provide the typical REST API features:
- Send, update and delete requested data (JSON/binary) between server and clients
- Synchronization and automatic conflict management
- Offline handling when the connection breaks, with automatic resumption when the connection is available again
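As a rough illustration of the idea, here is a minimal sketch of an Autobahn|Python (Twisted) WAMP component; the realm, router URL, topic and procedure names are placeholder assumptions, and the PostgreSQL persistence is only indicated in comments:

```python
# A minimal sketch of an Autobahn|Python (Twisted) WAMP component -- the realm,
# router URL and topic/procedure names are placeholder assumptions.
from autobahn.twisted.wamp import ApplicationSession, ApplicationRunner

class StorageBackend(ApplicationSession):

    def onJoin(self, details):
        # Expose a remote procedure that clients can call to store data ...
        def store_item(key, value):
            # ... persist to PostgreSQL here (e.g. via psycopg2/txpostgres) ...
            # ... and push the change to every subscribed client.
            self.publish("com.example.item_changed", key, value)
            return {"status": "ok", "key": key}

        self.register(store_item, "com.example.store_item")

if __name__ == "__main__":
    # Connects to a WAMP router (e.g. Crossbar.io) running locally.
    runner = ApplicationRunner("ws://127.0.0.1:8080/ws", "realm1")
    runner.run(StorageBackend)
```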
So finally the questions:
- Have I missed an open source project which covers exactly my needs?
- If I would like to develop my own server with Autobahn and a database, could you point me in the right direction? I have a lot of concerns and not enough in-depth understanding. I know Autobahn already gives you a server, but it is not very close to my final needs. How do I build the server efficiently so that it can handle all the connected sockets? How do I handle it when a client needs a server push? Are there schemas, models or concepts for how such a server should look?
- Twisted is a very powerful Python framework, but it is not regarded as the most convenient for writing apps. Still, I guess a socket-based storage server with database access should be possible? If I run Twisted as a web resource and develop server components with another Python framework, would this compromise latency/performance much?
- Is such a server backend with a lot of data storage (JSON fields as well as binary data such as documents and images) reasonable for a single developer/small team to build on sockets, or is this something only bigger companies like Dropbox can do at the moment?
Thank you very much for your help & time!
Have I missed an open source project which covers exactly my needs?
No, you've covered the open source projects. Open source only gets you about halfway there, though. Implementing a global realtime network requires equal parts implementation and operations. You have to think about dropped messages, retries, what happens if a particular geography gets hot and how you scale your servers, etc. I would argue that an open source solution won't achieve what you want unless you're willing to invest significant resources into operations. I would recommend a service like PubNub: http://pubnub.com
If I would like to develop my own server with Autobahn and a database, could you point me in the right direction? I have a lot of concerns and not enough in-depth understanding. I know Autobahn already gives you a server, but it is not very close to my final needs. How do I build the server efficiently so that it can handle all the connected sockets? How do I handle it when a client needs a server push? Are there schemas, models or concepts for how such a server should look?
A good database to back a realtime framework would be Cassandra because it supports high write volumes and handles time series data well: http://cassandra.apache.org/.
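As a minimal sketch of that suggestion, using the DataStax cassandra-driver package; the keyspace, table and channel names here are hypothetical:

```python
# A minimal sketch using the DataStax driver (pip install cassandra-driver).
# The keyspace, table and channel names are hypothetical examples.
import time
from cassandra.cluster import Cluster
from cassandra.util import uuid_from_time

cluster = Cluster(["127.0.0.1"])  # contact point(s) of your Cassandra nodes
session = cluster.connect()

session.execute("""
    CREATE KEYSPACE IF NOT EXISTS realtime
    WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1}
""")
session.set_keyspace("realtime")
session.execute("""
    CREATE TABLE IF NOT EXISTS messages (
        channel text, ts timeuuid, payload text,
        PRIMARY KEY (channel, ts)
    ) WITH CLUSTERING ORDER BY (ts DESC)
""")

# Write one message, then read back the latest messages for that channel.
session.execute(
    "INSERT INTO messages (channel, ts, payload) VALUES (%s, %s, %s)",
    ("chat:lobby", uuid_from_time(time.time()), '{"text": "hello"}'),
)
rows = session.execute(
    "SELECT payload FROM messages WHERE channel = %s LIMIT 10", ("chat:lobby",)
)
for row in rows:
    print(row.payload)
```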
Twisted is a very powerful Python framework, but it is not regarded as the most convenient for writing apps. Still, I guess a socket-based storage server with database access should be possible? If I run Twisted as a web resource and develop server components with another Python framework, would this compromise latency/performance much?
I would not use Twisted. I would use gevent: http://www.gevent.org/. It's coroutine-based, so you don't get into callback hell. To support more connections, you just increase the greenlet pool listening on the socket.
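A minimal sketch of that greenlet-pool approach with gevent; the port and pool size are arbitrary example values:

```python
# A minimal sketch of the greenlet-pool approach (pip install gevent).
# The port number and pool size are arbitrary example values.
from gevent import monkey
monkey.patch_all()  # make blocking I/O cooperative

from gevent.pool import Pool
from gevent.server import StreamServer

def handle(sock, address):
    # One greenlet per connection: read lines and echo them back.
    f = sock.makefile(mode="rwb")
    for line in f:
        f.write(b"echo: " + line)
        f.flush()

pool = Pool(10000)  # cap on concurrent connections
server = StreamServer(("0.0.0.0", 9000), handle, spawn=pool)
server.serve_forever()
```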
Is such a server backend with a lot of data storage (JSON fields as well as binary data such as documents and images) reasonable for a single developer/small team to build on sockets, or is this something only bigger companies like Dropbox can do at the moment?
Again, I would not build this on your own. A service like PubNub (http://pubnub.com), which takes care of all the operational issues for you and has a clean API, would serve your needs with minimal cost. PubNub takes care of the protocol for you, so if you're on a mobile device that doesn't support WebSockets it will use TCP, HTTP or whatever the best transport is for that device.
Say I have code written in Python that analyzes files on my computer and returns a result. It works great locally on my hard drive, but now I'd like to turn it into a mobile app. This means I'll require a server of some kind (a cloud server, for instance) where users can access it.
It is my understanding that all that would be required is a way to grant credentials and permissions to the patrons so they can access the "run" command in my analysis program. But honestly, I have zero visibility into this area and don't really know where to begin.
I only have two questions:
Users and their credentials are endless, but they all have to share the same analysis program. I don't know much about servers, but wouldn't this cause long queue times? Generally speaking, what considerations would I have to make in my analysis code to avoid this?
Can someone just point me in the direction of what I'd need to learn in order to answer the above question? This topic is a bottomless pit of information and I don't want to get trapped.
Thanks.
Django is an MVC-style web framework that has all the features required for building web applications with Python. Simply go through the tutorial and you should be up and running in no time on your local machine.
To deploy, there are various options, be it a cloud instance (a lot of providers here, including Rackspace and Amazon; search for "django web hosting") or "traditional" server machines (again, a lot of providers).
The "mobile" part is just the user interface. This affects decisions in the presentation part of your application, and you can restrict it to the view part in Django jargon (i.e. the HTML templates). You can look at frameworks that produce aesthetically decent (or better) HTML user interfaces tailored for mobile/tablet devices, e.g. jQuery Mobile.
So the direction is: start with Django -> deploy on a server "somewhere" -> tailor your user interface for mobile devices.
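As a minimal, hypothetical sketch of how the existing analysis code could sit behind a Django view (the analyze_file import, app/module names and URL path are assumptions, not part of your code):

```python
# myapp/views.py -- a minimal, hypothetical sketch; analyze_file stands in for
# your existing analysis function and is imported from an assumed module.
from django.http import JsonResponse, HttpResponseBadRequest
from django.views.decorators.csrf import csrf_exempt

from myapp.analysis import analyze_file  # assumed wrapper around your local analysis code

@csrf_exempt
def analyze(request):
    if request.method != "POST" or "file" not in request.FILES:
        return HttpResponseBadRequest("POST a file under the 'file' field")
    result = analyze_file(request.FILES["file"])  # run your analysis on the upload
    return JsonResponse({"result": result})

# myproject/urls.py -- route the mobile app's requests to the view above:
# from django.urls import path
# from myapp import views
# urlpatterns = [path("api/analyze/", views.analyze)]
```

The mobile app would then just POST a file to /api/analyze/ and read the JSON response; if the analysis is slow, moving it to a background worker so web requests don't queue behind it is the usual next step.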