I'm trying to create an API using FastAPI and MongoDB that has just a GET method, and I have no clue how to do this.
My application has already populated MongoDB with a huge database: a txt file is converted to CSV and all its data is loaded into MongoDB collections. What I'm trying to do is: given a zip code in the endpoint route (e.g. localhost:8000/cep/123456789), my application should return all information from my MongoDB collections for that zip code, such as street and city.
@app.get('/cep/{cep}')
def find_ceps():
    mongo_uri = pymongo.MongoClient("mongodb://root:example@localhost:27017/")
    db_name = mongo_uri["cep4free"]
    col = db_name["cep4free"]
    for i in col.find({}):
        return i

cep = find_ceps()
return cep
I tried this, but it isn't working, and I have no idea how to return the data using a GET method.
I'm new to Python and FastAPI as well, and I'm coding this API to practice and learn. I would be glad if anyone could help.
Thanks!
The first thing you need to do is separate your db initialization from your actual route. FastAPI shows an example of how to do that; just replace the Couchbase initialization lines with the ones you need for Mongo.
Second, you (usually) will have a database folder with code you'll be using in your route. This is part of a layered architecture approach. From here you'd either define a schema, or just call a method from the defined class in the data layer you just created.
For example (taking the more simplified approach), we would call cep_db.findAll() in the route. This method could live in a class named Cep in the file database/cep.py.
Finally, see this project https://github.com/markqiu/fastapi-mongodb-realworld-example-app for a working example.
swagger-editor has a codegen tool built in, so I can generate server-side code with it. I tried the pet store example, and it works. My next step is to think about data persistence. I googled around; I often use Python/Java, so I'll focus on these 2 languages.
First I would like to try python.
The swagger-codegen tool depends on flask/connexion/flask-swagg and so on, so I think connexion is a good start for me. However, when I read the generated code, I found that swagger-codegen generates its own swagger data model, and I can use connexion with SQLAlchemy to implement a db ORM; for example, Connexion-Example. I also found SAFRS a good place to go: with it I can easily expose all the DB models as a REST API, so I don't need to write duplicate ORM code myself. So I'm trying to find some suggestions that can help me out, because now I am stuck.
I want to create an API specification file (yaml or json in swagger-editor) and also the DB model, then generate server-side code, including a DB model with SQLAlchemy, with all the DB CRUD ops exposed as a REST API. My first thought is to modify the swagger template in swagger-codegen to use SAFRS or connexion. But I would like to discuss this here and get some advice on how you manage this kind of work.
Thanks.
Andes
I’m new to Pyramid. I’ve used Python for a few months. I've created a Python application on Linux to maintain an Oracle database using weekly data feeds from a vendor. To get that done, one of the things I did was to create a customized database wrapper class using the cx_Oracle package. I had specific requirements for maintaining history in the DB. All Oracle access goes through the methods in this wrapper. I now want to use Pyramid to create a simple reporting browser interface to the Oracle DB. To allow me the greatest flexibility, I’d like to use the wrapper I already have to get to the data on Oracle instead of Alchemy (or possibly with it, I'm not sure).
In my Pyramid app, I’ve considered importing my wrapper in my views.py init method but that seems to get executed with every browser submit.
Can anyone suggest how I might create a persistent connection to Oracle that I can use over and over from my reporting application which uses my wrapper class? I’m finding Pyramid a bit opaque. I’m never sure what’s happening behind the scenes but I’m willing to operate on trust until I get the swing of it. I need the benefit of the automatic authorization/ authentication and login.
What I’m really looking for is a good approach from experienced Pyramid users before going down the wrong track.
Many thanks.
You should definitely use SQLAlchemy, as it makes use of connection pooling and such. In your case SQLAlchemy would use cx_Oracle underneath, but let you concentrate on writing actual code instead of maintaining connections, pooling them, and so on.
You should follow the patterns in Wiki2 tutorial to set up basic SQLAlchemy.
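As a rough sketch of what that buys you: the engine is created once at startup and SQLAlchemy keeps a pool of connections behind it, so each request just borrows one. The Oracle DSN shown in the comment is a placeholder.

```python
# Sketch: letting SQLAlchemy manage pooled connections instead of doing it by hand.
from sqlalchemy import create_engine, text

def make_engine(dsn):
    # create_engine() is lazy: no connection is opened until first use, and
    # connections are pooled and reused across requests automatically.
    return create_engine(dsn, pool_pre_ping=True, pool_recycle=3600)

def run_report(engine, sql, **params):
    # Each call borrows a connection from the pool and returns it afterwards.
    with engine.connect() as conn:
        return conn.execute(text(sql), params).fetchall()

# In your Pyramid app you would build the engine once at startup, e.g.:
# engine = make_engine("oracle+cx_oracle://scott:tiger@dbhost:1521/?service_name=ORCLPDB1")
```

The same `run_report` helper then serves every view without any per-request connection management.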
So, basically, the question boils down to "how do I use my existing API with Pyramid", right? This is quite easy, as Pyramid is database-agnostic and very transparent in this area, despite what you say :) Basically, you import it, call its methods, retrieve the data and send it to the template:
from pyramid.httpexceptions import HTTPFound
from pyramid.view import view_config

import pokemon_api as api

@view_config(route_name='list_pokemon', renderer='pokemon_list.mako')
def list_pokemon(request):
    # To illustrate how to get data from the request to send to your API.
    # Query string values arrive as strings, so cast them.
    batch_start = int(request.GET.get('batch_start', 0))
    batch_end = batch_start + int(request.GET.get('batch_size', 0))
    sort_order = request.GET.get('sort_by', 'name')
    # Here we call our API - we don't actually care where it gets the data from,
    # it's a black box
    pokemon = api.retrieve_pokemon(start=batch_start, end=batch_end, sort=sort_order)
    # send the data to the renderer/template
    return {
        'pokemon': pokemon
    }

@view_config(route_name='add_pokemon', request_method='POST')
def add_pokemon(request):
    """
    Add a new Pokemon
    """
    name = request.POST.get('name', '')
    weight = request.POST.get('weight', 0)
    height = request.POST.get('height', 0)
    api.create(name=name, weight=weight, height=height)
    # go back to the listing
    return HTTPFound('/pokemon_list')
and if your API needs some initialization, you can do it at startup time
from pyramid.config import Configurator

import pokemon_api as api

def main(global_config, **settings):
    """ This function returns a Pyramid WSGI application.
    """
    config = Configurator(settings=settings)
    ...
    api.init("MAGIC_CONNECTION_STRING")
    return config.make_wsgi_app()
Of course, this assumes your API already handles transactions, connections, pooling and other boring stuff :)
One last point to mention: in Python you generally don't import things inside methods; you import them at the top of the file, at module-level scope. There are exceptions to that rule, but I don't see why you would need one in this case. Also, importing a module should be free of side effects (which you might have, since "importing my wrapper in my views.py __init__ method" seems to be causing trouble).
I have not worked with Django seriously and my only experience is the tutorials on their site.
I am trying to write my own application now, and what I want is to have some sort of API. My idea is that I will later be able to use it with a client written in any other language.
I have the simplest of all apps, a model that has a name and surname field.
So the idea is that I can now write an app, let's say in C++, that will send two strings to my Django app so they can be saved in the database as name and surname respectively.
What I know so far is how to create a form so a user can enter that information, or to pass the information in the URL, and of course adding the entries myself from the admin menu.
What I want, though, is some other, better way: maybe creating a packet that contains the data. Later my client sends this data to my Django webpage, which extracts the info and saves it as needed. But I do not know how to do this.
If my suggested method is a good idea, I would like an example of how this is done. If not, I would like suggestions for possible things I could try.
Typically, as stated by @DanielRoseman, you certainly want to:
Create a REST API to receive data from another site
Get the data, typically in JSON or XML, containing all the required fields (name and surname)
In the REST controller, convert this data to the Model and save the Model to the database
Send an answer.
More information here: http://www.django-rest-framework.org/
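To make the round trip concrete, here is a framework-agnostic sketch using only the standard library: the client (which could just as well be written in C++) POSTs JSON with name/surname, and the server parses and "saves" it. With django-rest-framework, the handler body would become a serializer plus a model save; the names here are illustrative.

```python
# Sketch of the client/server round trip: POST JSON, parse it, store it.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen

SAVED = []  # stand-in for the database table

class PersonHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers["Content-Length"])
        data = json.loads(self.rfile.read(length))  # e.g. {"name": ..., "surname": ...}
        SAVED.append({"name": data["name"], "surname": data["surname"]})
        body = json.dumps({"status": "created"}).encode()
        self.send_response(201)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass

def post_person(url, name, surname):
    # This is exactly what a non-Python client would do: send a JSON body.
    payload = json.dumps({"name": name, "surname": surname}).encode()
    req = Request(url, data=payload, headers={"Content-Type": "application/json"})
    with urlopen(req) as resp:
        return json.loads(resp.read())

server = HTTPServer(("127.0.0.1", 0), PersonHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
result = post_person(f"http://127.0.0.1:{server.server_port}/person", "Ada", "Lovelace")
server.shutdown()
```

The point is that the transport is just HTTP plus JSON, so any language with an HTTP client can talk to the Django endpoint.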
I am writing a function which creates a new Solr core.
To create a core, you need to post data like this (http://wiki.apache.org/solr/CoreAdmin):
http://localhost:8983/solr/admin/cores?action=CREATE&name=coreX&instanceDir=path_to_instance_directory&config=config_file_name.xml&schema=schem_file_name.xml&dataDir=data
But in this example you need to refer to an existing config and schema.
In my app each core can have a different configuration, so the best way would be to post the config and schema in JSON format to the server together with the create request.
Is this possible?
Thanks for the help!
No, as far as I know, this is not possible at the moment (without creating the files on the server, then creating a core from the files).
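So the workaround is two steps: first place the config and schema files under the new core's instance directory (e.g. via SSH or a shared filesystem), then issue the CoreAdmin CREATE call pointing at them. A small sketch of building that request from the parameters in the question (host and paths are placeholders):

```python
# Sketch: building the CoreAdmin CREATE request from the question's URL.
from urllib.parse import urlencode

def create_core_url(solr_base, name, instance_dir,
                    config="solrconfig.xml", schema="schema.xml"):
    params = urlencode({
        "action": "CREATE",
        "name": name,
        "instanceDir": instance_dir,
        "config": config,   # must already exist under instance_dir/conf
        "schema": schema,   # likewise -- Solr will not accept them inline
    })
    return f"{solr_base}/admin/cores?{params}"

url = create_core_url("http://localhost:8983/solr", "coreX", "/var/solr/coreX")
# Then fetch this URL (e.g. with urllib.request.urlopen) against a running Solr.
```

The config/schema parameters are file names relative to the instance directory, which is why they have to exist on the server before the CREATE call.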
You might want to use a more schema-less structure for your schema if you need this kind of functionality: define a set of field prefixes/postfixes that map to different default settings, then use wildcard (dynamic) field names to avoid having to define each field in your schema.
A truly schemaless alternative based on Lucene could be Elasticsearch as well.
At my work, we use Oracle for our database, which works great. I am not the main db admin, but I do work with it. One thing I like is that the DB has a built-in logic layer using PL/SQL which can handle logic related to saving and retrieving the data. I really like this because it allows our MVC application (PHP/Zend Framework) to be lighter and makes it easier to tie another platform into the data, such as desktop or mobile.
However, I have a personal project where I want to use CouchDB or MongoDB, and I want to try to accomplish a similar goal. Outside of the MVC framework, I want to have an API layer that the main applications talk to; they don't actually talk directly to the database. They specify the design document (CouchDB), or something similar for Mongo, to get the results, and that API layer will validate the incoming data and make sure the data itself is saved and updated properly. For example, when saving a new user, in the framework I only need to send a JSON object with the keys/values that need to be saved, and the API layer saves the data in the proper places where needed.
This API would probably have a UI, but only for administrative purposes and to make my life easier. In general it will always reply with JSON strings, or pre-rendered/cached HTML in some cases, since each API layer would be specific to the application anyway.
I was wondering if anyone has done anything like this, or has any tips on methods I could use to accomplish it. I am currently looking to write my application in Python, and the front end will likely be something like AngularJS, although I am also looking at Node.js for the back end.
We do this exact thing at my current job. We have MongoDB on the back end, a RESTful API on top of it and then PHP/Zend on the front end.
Most of our data is read only, so we import that data into MongoDB and then the RESTful API (in Java) just serves it up.
Some things to think about with this approach:
Write generic sorting/paging logic in your API. You'll need this for lists of data. The user can pass in things like http://yourapi.com/entity/1?pageSize=10&page=3.
Make sure to create appropriate indexes in Mongo to match what people will query on. Imagine you are storing users: make an index in Mongo on the user id field, or just use the _id field, which is already indexed, in all your calls.
Make sure to include all relevant data in a given document. Mongo doesn't do joins like you're used to in Oracle. Just keep in mind modeling data is very different with a document database.
You seem to want to write a layer (the middle tier API) that is database agnostic. That's a good goal. Just be careful not to let Mongo specific terminology creep into your exposed API. Mongo has specific operators/concepts that you'll need to mask with more generic terms. For example, they have a $set operator. Don't expose that directly.
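The generic paging logic from the first point above can be sketched as a small helper that turns the `?pageSize=10&page=3` style query parameters into the skip/limit values you would hand to Mongo's `cursor.skip(...).limit(...)`; the parameter names and bounds are illustrative.

```python
# Sketch: converting query-string paging parameters into Mongo skip/limit values.
def paging_params(query, default_page_size=10, max_page_size=100):
    # Query string values arrive as strings; validate and clamp them so a
    # caller can't request an unbounded page.
    page = max(int(query.get("page", 1)), 1)
    page_size = min(max(int(query.get("pageSize", default_page_size)), 1), max_page_size)
    return {"skip": (page - 1) * page_size, "limit": page_size}

# e.g. for http://yourapi.com/entity?pageSize=10&page=3
params = paging_params({"pageSize": "10", "page": "3"})  # → {"skip": 20, "limit": 10}
```

Because every list endpoint funnels through the same helper, the paging behaviour (and its limits) stays consistent across the whole API.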
Finally after having a decent amount of experience with CouchDB and Mongo, I'd definitely go with Mongo.