I have an Angular app that runs on a Python-Flask server on port 5000. Right now the app works well on localhost, but I want it to be accessible to multiple users. It seems I will have to create sessions, as each user will generate some temporary data and I'd like this data to be stored in a directory whose name is the session ID. How do I proceed with this? Also, how can I test this multi-user functionality on my local machine, since Flask only listens on one port?
Use gunicorn (gunicorn.org) to run your app; it will spawn multiple worker processes to handle requests from multiple users.
You can store the temp data on the client side in local or session storage and send the required data with each request (do not store sensitive data like passwords on the client side).
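A minimal sketch of the per-session directory idea, assuming Flask's built-in signed-cookie session; the temp_data/ base path and the way the ID is generated are illustrative, not prescribed:

import os
import uuid
from flask import Flask, session

app = Flask(__name__)
app.secret_key = 'change-me'  # required for Flask's signed session cookie

def session_dir():
    # Give each visitor an ID and a temp directory named after it.
    if 'sid' not in session:
        session['sid'] = uuid.uuid4().hex
    path = os.path.join('temp_data', session['sid'])  # hypothetical base path
    os.makedirs(path, exist_ok=True)
    return path

@app.route('/')
def index():
    return 'your temp dir is ' + session_dir()

To serve several users in parallel, run it under gunicorn with multiple workers, e.g. gunicorn -w 4 -b 0.0.0.0:5000 app:app (the module name app is an assumption). To test multi-user behaviour locally, open the same port from two different browsers; each gets its own session cookie and therefore its own directory.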
I created a server to receive data from a data_logger. It is connected through a Flask API that I created, and it stores the data in a PostgreSQL database.
When I run the API locally, through the terminal, I can send hundreds of thousands of records per minute. But when I deploy it on IIS on Windows, it limits me a lot.
I already changed "maxUrl" and "maxQueryString".
The sending is done with the POST method, but the data is packed inside a URL parameter.
My web.config looks like this:
<requestFiltering allowDoubleEscaping="true">
  <requestLimits maxAllowedContentLength="4294967295" maxUrl="4294967295" maxQueryString="4294967295" />
</requestFiltering>
Example URL: machineserver.ddns.net:5000/shippingofroute?parameter=RR108102220300020900209002090020900786;RR308102221300715707157071570715700840;RR108102222309876503452987650345200104;RR108102223300320203202032020320200890;
Unfortunately I still haven't found a solution to this problem. I need to send a lot of data from several loggers; locally, the database and the API work perfectly.
Any help is welcome, thanks!
Alright, I managed to resolve this error.
Instead of passing the values in the URL parameters, I now send JSON through the request body.
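For reference, a minimal sketch of that change, assuming the endpoint from the question; the readings key and payload shape are assumptions:

# Client side: send the values as a JSON body instead of a query string.
import requests

readings = [
    'RR108102220300020900209002090020900786',
    'RR308102221300715707157071570715700840',
]
resp = requests.post('http://machineserver.ddns.net:5000/shippingofroute',
                     json={'readings': readings})

# Server side: read the body with request.get_json(); IIS URL-length
# limits no longer apply because nothing travels in the query string.
from flask import Flask, request

app = Flask(__name__)

@app.route('/shippingofroute', methods=['POST'])
def shipping_of_route():
    data = request.get_json()
    for reading in data['readings']:
        pass  # insert each reading into PostgreSQL here
    return {'received': len(data['readings'])}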
I have a really old piece of code that I am planning to refactor.
Basically, what this application does is extract data from Database A that is stored on Server A, process the extracted data and save it to Database Z that is stored on Server Z. The retrieval & saving of data is done via urllib.request.
Besides Database A on Server A, I have several other databases stored on separate servers as well (i.e. Database B on Server B, Database C on Server C), but all of this data goes to Database Z.
Now here's the catch: application.py is deployed on Servers A, B, C, ... and runs on multiple schedules, sometimes simultaneously. Databases A, B, C, ... all have the same table structure; only Database Z differs.
With the current design, I can execute my application wherever I am, as long as I have an internet connection. I just pass the Server X parameter and voilà, process complete.
But this design got me thinking about whether I should change the application to retrieve the data via ODBC instead of getting it over the web.
Should I switch from retrieving data via web requests to querying data via ODBC?
Please shed some light on the pros and cons of using a direct SQL connection versus retrieving data via a web API for my specific scenario.
Note: I don't mind losing the ability to run the application from wherever I am, as long as the transition is for the greater good. I might as well remote into the servers and execute the application from there.
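For concreteness, a sketch of what the two retrieval paths look like; the endpoint, DSN, and table name below are all hypothetical:

# Option 1: over the web, as the current design does with urllib.request.
import json
import urllib.request

with urllib.request.urlopen('http://server-a.example/api/rows') as resp:
    rows = json.loads(resp.read())

# Option 2: a direct database connection via ODBC (pyodbc).
import pyodbc

conn = pyodbc.connect('DSN=database_a;UID=user;PWD=...')
rows = conn.cursor().execute('SELECT * FROM source_table').fetchall()

Roughly speaking, the ODBC path removes the HTTP layer for bulk reads, but it requires direct network access from wherever the application runs to every database server, while the web path keeps the databases behind an API.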
I'm using:
from flask import session

@app.route('/')
def main_page():
    if session.get('key'):
        print("session exists: " + session.get('key'))
    else:
        print("could not find session")
    session['key'] = '34544646###########'
    return render_template('index.html')
I don't have the Flask-Session extension installed, but this still works fine. I'm trying to understand why, and when that extension would be important to me. As far as I can see, the default session works well for me.
The difference is in where the session data is stored.
Flask's sessions are client-side sessions. Any data that you write to the session is written to a cookie and sent to the client to store. The client will send the cookie back to the server with every request, that is how the data that you write in the session remains available in subsequent requests. The data stored in the cookie is cryptographically signed to prevent any tampering. The SECRET_KEY setting from your configuration is used to generate the signature, so the data in your client-side sessions is secure as long as your secret key is kept private. Note that secure in this context means that the data in the session cannot be modified by a potential attacker. The data is still visible to anybody who knows how to look, so you should never write sensitive information in a client-side session.
Flask-Session and Flask-KVSession are two extensions for Flask that implement server-side sessions. From the point of view of your application these sessions work exactly the same way as Flask's native sessions, but the data is stored on the server. The data is never sent to the client, so there is a bit of increased security. The client still receives a signed cookie, but the only data in the cookie is a session ID that references the file or database record on the server where the data is stored.
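To make "visible to anybody who knows how to look" concrete, here is a minimal sketch that decodes a Flask session cookie's payload without the secret key (the key is needed to forge a cookie, not to read one):

import base64
import json
import zlib

def decode_flask_cookie(cookie):
    # A leading '.' marks a zlib-compressed payload.
    compressed = cookie.startswith('.')
    payload = (cookie[1:] if compressed else cookie).split('.')[0]
    payload += '=' * (-len(payload) % 4)  # restore stripped base64 padding
    raw = base64.urlsafe_b64decode(payload)
    if compressed:
        raw = zlib.decompress(raw)
    return json.loads(raw)  # the session data, readable without SECRET_KEY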
from flask import session
All of the session data is stored client-side, in a cookie.
Pros:
Validating and creating sessions is fast (no data storage)
Easy to scale (no need to replicate session data across web servers)
Cons:
Sensitive data cannot be stored in session data, as it's stored on the web browser
Session data is limited by the size of the cookie (usually 4 KB)
Sessions cannot be immediately revoked by the Flask app
from flask_session import Session
Session data is stored server side.
Pros:
Sensitive data is stored on the server, not in the web browser
You can store as much session data as you want without worrying about the cookie size
Sessions can easily be terminated by the Flask app
Cons:
Difficult to set up and scale
Increased complexity since session state must be managed
This information is from Patrick Kennedy's excellent tutorial: https://testdriven.io/blog/flask-server-side-sessions/
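A minimal sketch of enabling server-side sessions with Flask-Session, using the filesystem backend; the secret key is a placeholder:

from flask import Flask, session
from flask_session import Session

app = Flask(__name__)
app.config['SECRET_KEY'] = 'change-me'     # still signs the session-ID cookie
app.config['SESSION_TYPE'] = 'filesystem'  # keep session data in files on the server
Session(app)

@app.route('/')
def index():
    session['key'] = 'value'  # written to a server-side file, not to the cookie
    return 'ok'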
Session
A session makes it possible to remember information from one request to another. The way Flask does this is by using a signed cookie: the cookie cannot be modified without the SECRET_KEY. The data is saved on the client side, and the cookie expires when the browser closes unless session.permanent is set to True, in which case it is kept for 31 days by default, or for whatever PERMANENT_SESSION_LIFETIME is set to in the Flask app.
Flask-Session:
Flask-Session is an extension for Flask that adds support for server-side sessions to your application. Its main goal is to store the session data on the server side.
The available server-side backends are:
- redis: RedisSessionInterface
- memcached: MemcachedSessionInterface
- filesystem: FileSystemSessionInterface
- mongodb: MongoDBSessionInterface
- sqlalchemy: SqlAlchemySessionInterface
Flask-Session builds on Flask's native session machinery: based on the configured backend, it overrides the default session-saving method.
flask.sessions.SessionInterface: SessionInterface is the basic interface you have to implement in order to replace the default session interface, which uses Werkzeug's secure cookie implementation.
The only methods you have to implement are open_session() and save_session(); the others have useful defaults which you don't need to change.
Based on these two methods, each backend reads and updates the session in its selected storage.
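For illustration, a toy SessionInterface that keeps session data in an in-process dict; this is a sketch of the mechanism, not Flask-Session's actual code, and is not suitable for production or multiple workers:

import uuid
from flask.sessions import SessionInterface, SessionMixin

class MemorySession(dict, SessionMixin):
    def __init__(self, sid, initial=None):
        super().__init__(initial or {})
        self.sid = sid

class InMemorySessionInterface(SessionInterface):
    def __init__(self):
        self.store = {}  # sid -> session data; lives only in this process

    def open_session(self, app, request):
        # Read the session ID from the cookie, or mint a new one.
        sid = request.cookies.get(app.config['SESSION_COOKIE_NAME'])
        if not sid:
            sid = uuid.uuid4().hex
        return MemorySession(sid, self.store.get(sid))

    def save_session(self, app, session, response):
        # Persist the data server-side; only the ID travels to the client.
        self.store[session.sid] = dict(session)
        response.set_cookie(app.config['SESSION_COOKIE_NAME'], session.sid)

Activate it with app.session_interface = InMemorySessionInterface().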
I'm hoping to be pointed in the right direction on what tools to use while developing an application that runs on two servers per client.
[Main Server][Client db Server]
Each client has their own server which has a django application managing their respective data, in addition to serving as a simple front end.
The main application server has a more feature-rich front end, using the same models/db schemas. It should have full read/write access to the client's database server.
The final desired effect would be a typical SaaS type deal:
client1.djangoapp.com => connects to MySQL database @ client1_IP
client2.djangoapp.com => connects to MySQL database @ client2_IP
...
Thanks in advance!
You could use different settings files, say settings_client_1.py and settings_client_2.py, and import common settings from a shared settings.py file to keep it DRY. Then add the respective database settings to each.
Do the same with the wsgi files: create one for each settings file, say wsgi_c1.py and wsgi_c2.py.
Then, in your web server, direct requests for client1.djangoapp.com to wsgi_c1.py and client2.djangoapp.com to wsgi_c2.py.
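A minimal sketch of the pair of files for one client; the database name, credentials, and module paths are placeholders:

# settings_client_1.py
from settings import *  # shared settings kept DRY

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.mysql',
        'NAME': 'client1_db',
        'HOST': 'client1_IP',  # the client's own database server
        'USER': 'client1',
        'PASSWORD': '...',
    }
}

# wsgi_c1.py
import os
from django.core.wsgi import get_wsgi_application

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'settings_client_1')
application = get_wsgi_application()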
I'm currently running a t2.micro instance on EC2. I have the HTML/web interface side of it working, along with a MySQL database.
The site allows users to register and stores them in the DB via a PHP script.
I want there to be an actual Python application that queries the MySQL database and returns user data, to then be used by a Python script.
What I cannot find is whether I should host this Python application as a totally separate instance or whether it can exist on the same instance, in a different directory. I ultimately just need to query the database, which makes me think it must exist on the same instance.
Could someone please provide some guidance?
Let me just be clear: this is not a Python web app. The Python backend is entirely separate, apart from making queries against the database.
Either approach is possible, but there are pros & cons to each.
Running separate Python app on the same server:
Pros:
Setting up local access to the database is fairly simple
Only need to handle backups or making snapshots, etc. for a single instance
Cons:
Harder to scale up individual pieces if you need more memory, processing power, etc. in the future
Running the Python app on a separate server:
Pros:
Separate pieces means you can scale up & down the hardware each piece is running on, according to their individual needs
If you're using all micro instances, you get more resources to work with, without any extra costs (assuming you're still meeting all the other 'free tier eligible' criteria)
Cons:
In general, more pieces == more time spent on configuration, administration tasks, etc.
You have to open up the database to non-local access
Simplest: open up the database to access from anywhere (e.g. all remote IP addresses), and have the Python app log in via the internet
Somewhat safer, more complex: set the Python app server up with an elastic IP, open up the database to access only from that address
Much safer, more complex: set up your own virtual private cloud (VPC), and allow connections to the database only from within the VPC. You'd have to configure public access for each of the servers for whatever public traffic you'll have, presumably ports 80 and/or 443.
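Whichever layout you choose, the query side of the Python app looks about the same; a minimal sketch using PyMySQL, where the host, credentials, and table are placeholders (host stays localhost in the same-instance setup, or becomes the database server's address otherwise):

import pymysql

conn = pymysql.connect(
    host='localhost',  # or the DB instance's elastic IP / VPC address
    user='appuser',
    password='...',
    database='site_db',
)
try:
    with conn.cursor() as cur:
        cur.execute("SELECT id, email FROM users WHERE id = %s", (42,))
        print(cur.fetchone())
finally:
    conn.close()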