Accessing MongoDB Database on nodejs server using HTTP GET requests - python

I have a Node.js server set up on AWS with MongoDB. I want to access the database contents using the GET method. There is another application, written in Python, which needs to access this database on AWS. I searched the internet and came across PycURL, but I don't quite understand how to use it. How should I approach this with PycURL, and what would an alternative solution be?

You can build a RESTful API that handles those GET requests. There is a good tutorial (with the example you want at the bottom):
https://scotch.io/tutorials/build-a-restful-api-using-node-and-express-4
Edit: If you want Python code for GET requests, there is a good answer here: Simple URL GET/POST function in Python
Edit 2: Here is an example of how this would work. First, code your API to handle GET requests on a given route (example: http://localhost:5000/api/getUsers). Then make a GET request to that route using Python:
Example:
import requests

r = requests.get(url="http://localhost:5000/api/getUsers")
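If you do want to use PycURL for the GET request instead, here is a minimal sketch. The route is the same hypothetical one as above; pycurl must be installed separately (pip install pycurl), so its import is deferred, and the JSON parsing is kept in its own helper so it can be tested without a live server.

```python
import io
import json

def decode_body(raw):
    # Parsing kept separate so it can be tested without a live server.
    return json.loads(raw.decode("utf-8"))

def fetch_users_pycurl(url="http://localhost:5000/api/getUsers"):
    # Requires pycurl (pip install pycurl); imported lazily so the
    # helper above works even when pycurl is not installed.
    import pycurl
    buf = io.BytesIO()
    c = pycurl.Curl()
    c.setopt(pycurl.URL, url)
    c.setopt(pycurl.WRITEDATA, buf)
    c.perform()
    c.close()
    return decode_body(buf.getvalue())
```

That said, Requests is usually the simpler choice unless you specifically need libcurl features.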

I had a similar problem a while ago; there is a tutorial here. It can point you in your intended direction. The drawback may be that in the tutorial, to issue the HTTP requests (if I remember correctly), they used Postman, but I'm sure you can still use PyCurl.

Related

Get data from Scroll in http request api elasticsearch

I'm trying to write code in Python to get all the data from an API through an HTTP request.
I am wondering if there is a way to use the _scroll_id and its contents in Python. If so, how do I implement it, or could you share some documentation about it?
All the documentation about Elasticsearch in Python uses a localhost...
Any leads would be highly appreciated.
Elasticsearch has an official Python client library for talking to the cluster.
You can use the scan() helper function. Internally, it calls the scroll API so you don't have to worry about any of that.
For the last part of your question, you'd have to follow the tutorial to see how to connect to different databases.
https://elasticsearch-py.readthedocs.io/en/v8.3.3/helpers.html#scan
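A minimal sketch of the scan() helper, assuming a hypothetical remote host and index name. The elasticsearch package must be installed (pip install elasticsearch), so its imports are deferred; the query builder is a plain function that works without it.

```python
def match_all_query():
    # The simplest possible query body: fetch every document.
    return {"query": {"match_all": {}}}

def iter_all_docs(host, index):
    # host is hypothetical, e.g. "https://my-cluster.example.com:9200".
    # Imports are deferred so match_all_query() is usable without the package.
    from elasticsearch import Elasticsearch
    from elasticsearch.helpers import scan
    es = Elasticsearch(host)
    # scan() calls the scroll API internally, so _scroll_id is managed for you.
    for hit in scan(es, index=index, query=match_all_query()):
        yield hit["_source"]
```

Pointing the client at a remote host instead of localhost is just a matter of passing its URL, as shown.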

Backend script passing data to client-side script

Say that I have a Python Flask server running which has a backend script that produces a number and a string.
How can I pass the number and string from the backend to a script that runs on the client side, so that the user's machine runs it rather than the server?
Example:
backend script data_producer.py produces "asdaslkdjasdlksja" and 18 from its functions
I want to pass "asdaslkdjasdlksja" and 18 to a Brython or JavaScript embed in the HTML (a <script> tag) so that they can be processed in the browser and the results sent back to my server.
Edit: I realized that I can just use Jinja2's "{{ }}" when rendering a template, so that I can use the data in an HTML script embed.
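That Jinja2 approach can be sketched like this. The values are the ones from the example above standing in for the hypothetical data_producer.py functions; the tojson filter safely embeds the string as a JavaScript literal.

```python
from flask import Flask, render_template_string

app = Flask(__name__)

TEMPLATE = """
<script>
  // Values injected server-side by Jinja2 when the template is rendered.
  const text = {{ text|tojson }};
  const number = {{ number }};
  // ...process in the browser, then POST the result back to the server...
</script>
"""

@app.route("/")
def index():
    # Stand-ins for the values data_producer.py would return.
    return render_template_string(TEMPLATE, text="asdaslkdjasdlksja", number=18)
```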
The question you asked is too broad. It's almost equivalent to asking how you can connect two computers. Since you haven't specified any data type, the first thing that comes to mind is sockets, which is about as low-level as you can get.
A more high-level and appropriate approach would be an HTTP REST server (with Flask-RESTful), since you already have a Flask server running.
However, there are a million other ways to transfer data between two Python scripts, from WebSockets, WebRTC, SSH, and IPFS to even email. Most of them would be overkill, so I suggest making a simple REST server and having the client send GET or POST requests to it.
After looking at the new edit, I still think a REST API is the best option, since you can easily make a GET or POST request using the Fetch API in JavaScript. In Brython you can use ajax to do the same thing.
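A minimal sketch of that REST approach with plain Flask (route names and payloads are made up): one route serves the backend's data, another receives the browser's processed result.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/api/data")
def data():
    # Stand-ins for whatever the backend script produces.
    return jsonify(text="asdaslkdjasdlksja", number=18)

@app.route("/api/result", methods=["POST"])
def result():
    payload = request.get_json()
    # Process the client-side result here before responding.
    return jsonify(ok=True, got=payload)
```

From the browser, fetch('/api/data') retrieves the values, and fetch('/api/result', {method: 'POST', ...}) sends the processed result back.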

Getting data from some rest API on Django

So here's the deal:
I have some nodeJS/PHP rest API! And I need to build a Django app which feeds on that API. Everything will be done on the server-side! So I would not use Django's back-end structure.
Basically I would GET some JSONs and POST them back to the server, which in turn would process that data. How should I proceed? I've been looking for tutorials for a while now.
However, everywhere I look, people are using django-rest or something "Django friendly". I tried to start using python-requests, but it is still kind of cloudy: am I leaving my front end unprotected by making direct GET/POST calls to the server (using requests)?
Any guidance would be much appreciated!
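For what it's worth, the GET/process/POST flow described above can be sketched with python-requests alone. The endpoint names are hypothetical, and the transformation is kept separate from the HTTP calls so it can be tested without the network.

```python
import requests

API_ROOT = "http://localhost:3000/api"  # hypothetical Node/PHP REST API

def process(items):
    # Placeholder transformation; replace with your own logic.
    return [{"id": item["id"], "seen": True} for item in items]

def sync():
    # GET the JSON, transform it, then POST it back to the server.
    with requests.Session() as session:
        resp = session.get(f"{API_ROOT}/items", timeout=10)
        resp.raise_for_status()
        result = process(resp.json())
        session.post(f"{API_ROOT}/items", json=result, timeout=10)
        return result
```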

Python Requests & Django to send a picture from app to server

I would like to do a very simple thing but I kept having trouble to get it work.
I have a client and a server, both running Python. At a certain point in its code, the client needs to send a picture to the server; the server uses Python to receive the picture, make some modifications to it, and then save it to disk.
How can I achieve this the easiest way possible? Is Django a good idea?
My problem is that I keep getting an error from the Django server side and it seems it is because I am not managing the cookies.
Can someone give me a sample code for the client and for the server to authenticate then send the file to the server in https?
Also, if you think it is best to use something other than Django, your comments are welcome :). In fact, I managed to get it working very easily with a Python client and a PHP server, but because I have to handle everything in Python on the server, I would have preferred not to install Apache, PHP, etc., and to use only Python to receive the picture as well.
Many thanks for your help,
John.
You don't need Django - a web framework - for this unless you really want to have the features of Django. (Here's a good link. But to sum it up, it would be "a bunch of website stuff".)
You'd probably be best off just using something to transmit data over the network. There are a lot of ways to do this!
If your data is all local (same network) you can use something like ZeroMQ.
If you are not sure whether your data is local, or if you know it won't be, you can use HTTP; the Requests library is awesome for the client side of this, paired with something minimal on the server.
In both these scenarios, you'd need to have a "client" and a "server" which you already have a good handle on.
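A minimal sketch of the HTTP route, assuming Flask on the server (lighter-weight than Django for this; the route and field names are made up), with the matching client call shown in the comments.

```python
from flask import Flask, request

app = Flask(__name__)

@app.route("/upload", methods=["POST"])
def upload():
    picture = request.files["picture"]
    data = picture.read()
    # ...modify the image here, then write it to disk...
    return {"received": len(data)}

# Client side, using Requests (run on the other machine):
#   import requests
#   with open("photo.jpg", "rb") as f:
#       requests.post("http://server:5000/upload", files={"picture": f})
```

Running the client call over https is just a matter of serving the Flask app behind TLS and using an https:// URL.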

Review Board Python Web API

I'm new to using third-party APIs, and I was trying to get the Review Board Web API to work in Python.
However, I was confused about 3 things:
from rbtools.api.client import RBClient
client = RBClient('http://localhost:8080/')
root = client.get_root()
My first question: is http://localhost:8080/ the address of the running Review Board server? Is there some sort of test server I can use instead of running my own?
Again, I don't have much experience with APIs so I was wondering if I needed to do some form of authentication before making these calls.
Finally, if I must set up my own Review Board server to try out the API, would it be possible to get some code for a very simple example of how to make the simplest POST and GET requests you can think of, with minimal setup, if for example my server were running on http://localhost:8080/?
Reference : http://www.reviewboard.org/docs/rbtools/0.5/api/overview/
To answer your first question: the answer seems to be yes, although their docs don't make it entirely clear.
Their docs say:
Here is an example of how to instantiate the client, and retrieve the Root List Resource:
before the code snippet you pasted. That makes me think that the URL being passed is whatever server you're trying to use; e.g., if you had it set up on a networked machine called monty_python running on port 5050, then you would do:
client = RBClient('http://monty_python:5050/')
As for a test server you should check the documentation they have about their Web API.
Their examples don't seem to show any authentication being performed in the overview. If you check in other sections (e.g., Tutorial: Creating a Pull Request) you'll see them demonstrate how to authenticate and what can be done after authenticating.
As for your last question, I'm not 100% sure what you're asking, but you should probably check the docs I found for their Web API
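To add to that, credentials can be passed straight to RBClient. A sketch with hypothetical URL and credentials (requires RBTools installed, so the import is deferred):

```python
def get_authenticated_root(url, username, password):
    # Import deferred so this file loads even without RBTools installed.
    # url and credentials are placeholders for your own server.
    from rbtools.api.client import RBClient
    client = RBClient(url, username=username, password=password)
    return client.get_root()
```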
