I am working on a capstone project and I am looking for ideas on how to implement this.
I have a working web application that was created using OpenCV (Python) and deployed on the Django framework. Now I also want to create a hybrid mobile application for the project. The idea is that the mobile app will allow the user to upload an image to the server; the web server will process the image and finally return a response to the mobile app.
===============================
Client (Hybrid Mobile app):
Take image
Web server:
Receive user uploaded image
Call Image Processing class (views.py) and do something more in the backend
Save the result to the database
Display the result to the client/Response to client (could be a page redirect)
============================
I know that it is possible to save an image to the database through a REST API; however, I have no idea whether it is possible for the client side to call a class on the server side through the REST API. If not, is there any other way to implement this? Do you know of any references that could give me some ideas on how to implement it?
You can't call a class on the server from the mobile phone.
You can just post the file to your view (the details will differ based on what architecture the app is running):
import requests

files = {'file': ('file_name.jpeg', file_data)}
response = requests.post(url, files=files, auth=(USERNAME, PASSWORD))
# add error checking etc.
It sounds like you need to read up a bit more about creating an API. Most commonly, when you want data you do an HTTP GET and the data is returned as JSON. When sending data you do an HTTP POST.
Also, don't save file data directly in the database
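For the server side, a minimal sketch of that last point: write the uploaded bytes to disk and store only the resulting path in the database. The directory layout and naming scheme here are just illustrative, not a prescribed convention.

```python
# Sketch: persist the uploaded image on disk and keep only its
# path in the database row (directory/naming are placeholders).
import os
import uuid

def save_upload(file_data: bytes, upload_dir: str) -> str:
    """Write the raw bytes to a uniquely named file and return the
    path you would record in the database."""
    os.makedirs(upload_dir, exist_ok=True)
    filename = f"{uuid.uuid4().hex}.jpeg"
    path = os.path.join(upload_dir, filename)
    with open(path, "wb") as f:
        f.write(file_data)
    return path
```

Your image-processing code can then open the file from that path, and the database only ever holds a short string.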
I'm having a hard time figuring out how to solve a problem with a little project.
Basically, I have a Django application, and on the other hand I have an external Python script running. I would like to create a system where, each time a form in my Django app is submitted, the data submitted in the form is sent to the external Python application.
The external Python service should receive the data, read it, and, depending on who the user is and what they submitted, perform some tasks and then send a response.
Here is what I thought:
1) Connect the external Python app to the same database that Django is using, so that when the form is submitted, it is saved to the database and the data can be 'shared' with the second Python service. The problem with this solution is that the second app would need to query the database every second and perform a lot of queries, which would hurt performance.
2) Create an API endpoint, so that the external Python app would connect to the endpoint and fetch the data saved in the database from there. The problem is the same as with the first solution.
Would a service like Redis or RabbitMQ help in this case?
Importing the external Python process into my Django app is not a solution; it needs to stay separate from the Django app. An important requirement here is speed: when new data is submitted, it needs to reach the second Python app in the shortest time possible.
That said, I'm open to any advice or possible solution to this problem. Thanks in advance :)
You could use a microservices architecture to build this. Instead of sharing databases between two applications you have them communicate with each other through web requests. Django would shoot a request to your other app with the relevant data, and the other server would respond back with the results.
Usually one would use something like Flask (a synchronous server) or Sanic (an asynchronous server) to receive and reply, but you can also look into something like Nameko. I would also recommend looking into Docker, as you'll eventually need it once you set up more of these microservices.
The idea (using Flask here) is to create an endpoint that does some computation on your data and returns the result to the Django server.
computation.py
from flask import Flask, request

app = Flask(__name__)

@app.route("/", methods=["POST"])
def computation():
    # Read the JSON payload sent by the Django server
    data = request.get_json()
    print(data)
    return f"Hey! {data}"

app.run(host="0.0.0.0", port=8090)
The Django server is simply sending a request to your server application.
django_mock.py
import requests
req = requests.post('http://0.0.0.0:8090/', json={"data": "Hello"})
print(req.text)
The above will print out on the computation.py app:
{'data': 'Hello'}
and will print out on the django_mock.py example:
Hey! {'data': 'Hello'}
You should build an API. The second app would then be an application server, and the first app, when it receives a form submission from the user, would persist the data to the DB and then make an API call to the second app. You would include key information in the API request that identifies the record in the DB.
You can use Django (e.g. DRF) or Flask to implement a simple API server in Python.
Now, this requires your app server to be up and running all the time. What if it's down? What should the first app do? If you need this level of flexibility, then you need to decouple these apps in some way: either the first app implements some kind of backoff/retry if it can't send to the second app, or you use a reliable queueing mechanism (something like Amazon SQS).
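The backoff/retry option can be sketched in a few lines of plain Python. Here `send` stands in for a hypothetical callable that delivers the payload to the second app (e.g. a `requests.post` wrapper) and raises on failure:

```python
# Sketch: exponential backoff around a hypothetical `send(payload)`
# delivery function; re-raises only after the final attempt fails.
import time

def send_with_retry(send, payload, attempts=3, base_delay=0.5):
    for attempt in range(attempts):
        try:
            return send(payload)
        except Exception:
            if attempt == attempts - 1:
                raise
            # Wait 0.5s, 1s, 2s, ... between attempts
            time.sleep(base_delay * 2 ** attempt)
```

If even this isn't reliable enough (e.g. the second app is down for minutes), that's the point where a durable queue like SQS or RabbitMQ earns its keep.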
I'm working on a single page application with Django, and would like to use WebSockets, and therefore Channels. To keep things simple, I think I want to handle all server communication over a WebSocket alone, rather than adding XHR (XML HTTP Request) into the mix. I'm using channels from the get-go since there will be a lot of data pushed from the server to the client asynchronously.
With regular Django, a conventional request is made to https://example.com/login or https://example.com/logout or whatever, and the Django URL router decides which view to send it to. Instead, I would like to have the user perform their action in the client, handle it with JavaScript, and use the WebSocket to send the request to the server. Since I'm using django-allauth, I would like to use the provided Django views to handle things like authentication. The server would then update the client with the necessary state information from the view.
My question: how can I process the data received over the WebSocket and submit the HTTP request to the Django view? My channels consumer would then take the rendered HTML and send it back to the client to update the page or section.
I can picture what would happen using XHR, but I'm trying to avoid mixing the two, unless someone can point out the usefulness in using XHR plus WebSockets...? I suppose another option is to use XHR for authentication and other client initiated requests, and use the WebSocket for asynchronously updating the client. Does this make any sense at all?
Update: It occurs to me that I could use requests from PyPi, and make an sync_to_async call to localhost using credentials I received over the WebSocket. However, this would require me to then handle the session data and send it back to the client. This seems like a lot more work. That said, I could maintain the sessions themselves on the server and just associate them with the WebSocket connection itself. Since I'm using a secure WebSocket wss:// is there any possibility for hijacking the WebSocket connection?
Check out this project, which gives you the ability to process a Channels WebSocket request using Django REST Framework views. You can try to adapt it to a normal Django view.
EDIT: I am quoting the following part of the DCRF docs in response to @hobs' comments:
Using your normal views over a websocket connection
from channels.auth import AuthMiddlewareStack
from channels.routing import ProtocolTypeRouter, URLRouter
from django.conf.urls import url
from djangochannelsrestframework.consumers import view_as_consumer

application = ProtocolTypeRouter({
    "websocket": AuthMiddlewareStack(
        URLRouter([
            url(r"^front(end)/$", view_as_consumer(YourDjangoView)),
        ])
    ),
})
In this situation, if your view needs to read the GET query string values, you can provide these using the query option. And if the view method reads parameters from the URL, you can provide these with the parameters option.
@hobs, if you have a problem with the naming of the package, or the functionality is not working as intended, please take it up with the developers on GitHub using their issue tracker.
I have built a Python web services application using Flask.
After creating the Swagger document, I published the API in WSO2 API Manager.
Publishing the API and subscribing to it were both successful.
I am able to get an authentication token as well.
What I am doing
To consume the API:
1. In Postman / an Angular application, I upload 2 Excel files.
2. The files are sent to the WSO2 server URL, which then sends them to the Python server.
3. When the Python server receives the files, it parses them.
4. It does some calculations and returns the response data object.
Now the problem is at step 3. The files received at the Python end are not in Excel format; the data of both files is combined into one FileStorage object.
[Snapshot: instead of two files, one FileStorage object is received in request.files]
I tried all this in Postman as well as with an Angular application, and neither works.
I even tried it via the Swagger page of the published app in WSO2 API Manager, but that also doesn't work.
[Screenshot: sending files via the WSO2 Swagger page]
What does work
When I try the Swagger page of the Python app itself, or when I consume the services via Postman making a request directly to the Python server,
I get both files in the proper required format.
[Screenshot: files received properly via the Python app's Swagger page]
Can you please help me understand what I might be doing wrong?
I am using the content type below:
content-type: multipart/form-data
After some digging, I found my resolution at Multipart form data file upload using WSO2 API manager?
I added the
<messageBuilder contentType="multipart/form-data"
class="org.wso2.carbon.relay.BinaryRelayBuilder"/>
<messageFormatter contentType="multipart/form-data"
class="org.wso2.carbon.relay.ExpandingMessageFormatter"/>
inside the JSON Message Builders and JSON Message Formatters section in axis2.xml file at
<API-M_HOME>repository\conf\axis2\axis2.xml
I have an application hosted on Google App Engine and would like to make uploading data more secure. Currently, I upload data by using a Python script to make POST requests with the relevant data to a URL on the /_ah/spi/... path (and the app takes care of uploading the data correctly). My goal is to make uploading data available only to admins of the application. I tried specifying an admin security-constraint for /_ah/spi/* in my web.xml, but this seemed to block me from being able to upload data at all.
What is the best/easiest way only allow admins to upload data via my python script?
I didn't quite get what I wanted (allowing access to my application endpoints by admins only), but I did find a way to secure it. I followed the steps here. Basically, I had to:
generate a client ID via the Google Developer Console
add the client ID to the @Api annotation (under clientIds) for my endpoint
add a User parameter to all of the protected methods
authenticate myself using OAuth and my public/secret client ID (generated in step 1) in my Python script (sample code here)
This scheme at least requires the script accessing my application to have both the public and secret client IDs. Also, this question had some very useful information.
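To make step 4 concrete, here is a sketch of what the authenticated call from the Python script might look like, assuming you have already obtained `access_token` through the OAuth flow. The endpoint URL and payload below are placeholders, not the real application's paths:

```python
# Sketch: attach the OAuth access token as a Bearer credential on the
# upload request (URL and payload are hypothetical placeholders).
import json
from urllib.request import Request

def build_upload_request(url: str, payload: dict, access_token: str) -> Request:
    """Build a POST request carrying the token; the result can be
    passed to urllib.request.urlopen()."""
    return Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {access_token}",
        },
        method="POST",
    )
```

The server-side endpoint then checks the `User` parameter populated from that credential before accepting the data.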
I have been trying to do something which I think should be pretty simple. The situation is as follows. The client makes a request for a resource on my web server. My flask application processes the request and determines that this resource is located at a certain location on another web server and the client should make a request of that server instead.
I know I can use the redirect function to tell the client to send a request to the remote location, but my problem is that the remote location is the Amazon Glacier servers. These servers require a request to be made in a certain way, with a special signature (see http://docs.aws.amazon.com/amazonglacier/latest/dev/amazon-glacier-signing-requests.html). My flask application knows how to go about the business of making these requests in the required way. I essentially want to know if it's possible to send a response to my client saying, send this request (generated by my application, with all the required signing) to the Amazon server?
Any ideas?
If the request can be encoded with GET params, like
http://www.redirecturl.com/?param1=bla&param2=blub
then it should work, no problem. Just construct the request as a string and pass it to redirect().
As far as I know, you can't tell a client to send specific headers to an HTTP redirect URL.
Hitting the Glacier URL server-side would be the easiest. Using JavaScript on the client side would only work if Glacier implements CORS.
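For the query-parameter approach mentioned above, the redirect target can be assembled with the standard library. Note this is only a sketch with placeholder parameter names, and it only applies if the signature can be carried in the query string (Glacier's SigV4 signing normally lives in headers):

```python
# Sketch: build the redirect URL with URL-encoded query parameters
# (parameter names/values here are placeholders).
from urllib.parse import urlencode

def build_redirect_url(base: str, params: dict) -> str:
    """Serialize `params` into a query string appended to `base`."""
    return f"{base}?{urlencode(params)}"
```

In a Flask view you would then `return redirect(build_redirect_url(...))`.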