I have not worked with Django seriously and my only experience is the tutorials on their site.
I am trying to write my own application now, and what I want is to have some sort of API. My idea is that I will later be able to use it with a client written in any other language.
I have the simplest of all apps, a model that has a name and surname field.
So the idea is that I can now write an app, let's say in C++, that will send two strings to my Django app so they can be saved in the database as name and surname respectively.
What I know so far is how to create a form so a user can enter that information, or to pass the information in the URL, and of course I can add entries myself from the admin menu.
What I want though is some other better way, maybe creating a packet that contains that data. Later my client sends this data to my Django webpage and it will extract the info and save it as needed. But I do not know how to do this.
If my suggested method is a good idea, then I would like an example of how this is done. If not, then I would like suggestions for other approaches I could try.
Typically, as stated by @DanielRoseman, you want to:
Create a REST API so that another application can send data to your site
Receive the data, typically as JSON or XML, containing all the required fields (name and surname)
In the REST controller, convert this data to the model and save the model to the database
Send back a response.
More information here: http://www.django-rest-framework.org/
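As a rough sketch of those steps with Django REST framework (the Person model, field lengths, and the /api/people/ URL here are my assumptions, not something from your project):

# models.py
from django.db import models

class Person(models.Model):
    name = models.CharField(max_length=100)
    surname = models.CharField(max_length=100)

# serializers.py
from rest_framework import serializers
from .models import Person

class PersonSerializer(serializers.ModelSerializer):
    class Meta:
        model = Person
        fields = ['name', 'surname']

# views.py
from rest_framework import generics
from .models import Person
from .serializers import PersonSerializer

class PersonListCreate(generics.ListCreateAPIView):
    queryset = Person.objects.all()
    serializer_class = PersonSerializer

# urls.py
from django.urls import path
from . import views

urlpatterns = [
    path('api/people/', views.PersonListCreate.as_view()),
]

A client written in C++ (or anything else) then just POSTs JSON such as {"name": "John", "surname": "Smith"} to /api/people/, and the serializer validates it and saves a row.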
I would like to create a vcf file on my website that users can download and add the file info to their contacts on their mobile phones.
So far I have made this:
Download
When I click the link it downloads a vcf file. When I open it, it redirects me to my contacts app and throws this error: "No importable cards were found." That's because I haven't set any information in any VCard. I would like to know how I can set/create a VCard with the information I have in my SQLAlchemy database (name, email, phone number, website, etc.). Thanks in advance.
I had to solve this problem recently for work. Here is how I did it!
The broad strokes: I created a Jinja2 template based on my team's needs for vCard output, a data model to feed the template, and a service to render the template from the database query, then used io.BytesIO and flask.send_file to transmit the data as a file to the user.
The gist linked above doesn't have the more contextual parts of the implementation, but does provide an example of how to wire up flask to do this.
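A stripped-down sketch of that wiring (the contact fields, the template string, and the route below are illustrative placeholders rather than my real implementation):

import io
from flask import Flask, send_file
from jinja2 import Template

app = Flask(__name__)

# Minimal vCard 3.0 template; extend it with whatever fields you need.
VCARD_TEMPLATE = Template(
    "BEGIN:VCARD\r\n"
    "VERSION:3.0\r\n"
    "FN:{{ name }}\r\n"
    "EMAIL:{{ email }}\r\n"
    "TEL:{{ phone }}\r\n"
    "URL:{{ website }}\r\n"
    "END:VCARD\r\n"
)

@app.route('/contact/<int:contact_id>.vcf')
def download_vcard(contact_id):
    # Swap this dict for your SQLAlchemy query, e.g. Contact.query.get_or_404(contact_id)
    contact = {'name': 'Jane Doe', 'email': 'jane@example.com',
               'phone': '+1 555 0100', 'website': 'https://example.com'}
    rendered = VCARD_TEMPLATE.render(**contact)
    return send_file(io.BytesIO(rendered.encode('utf-8')),
                     mimetype='text/vcard',
                     as_attachment=True,
                     download_name='contact.vcf')  # attachment_filename= on Flask < 2.0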
Edit: I evaluated the vobject library that I found recommended, but I honestly didn't think it was intuitive to use or very Pythonic, meaning it wasn't something I wanted to depend on in my code base. However, maybe it'll work better for you (or others).
So I have a Django site that works perfectly and displays everything I want it to in the US. It automatically displays the data from the US data model.
What I want to be able to do is basically have an exact clone of my site, maybe under mysite.com/canada for example, that displays the data from Canada.
One approach would be for me to just add all the data into the database and add a field that says which country it's from, but I'd rather keep each country's data in a completely different model.
With pure HTML/CSS this would be easy: I would just copy the entire site directory into a subdirectory and that would be it for that country. I was wondering if there is something similar I can do with Django.
Based on what you're describing, you should probably be setting up parallel stacks and using your DNS, Apache, or whatever your HTTP routing tech of choice is to do the separation.
Use a separate database, possibly even a separate server (or WSGI configuration), and keep your code clean.
Creating duplicate "models" based on the value of a field like you're describing breaks a lot of Python's DRY principles.
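If you want to sketch that in Django terms, one hedged way (the module and database names below are invented for illustration) is a settings module per country, each pointing at its own database and served by its own WSGI process:

# settings_us.py
from .settings_base import *  # shared settings

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'mysite_us',
    }
}

# settings_canada.py
from .settings_base import *

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'mysite_canada',
    }
}

# Each deployment runs with its own DJANGO_SETTINGS_MODULE, for example:
#   DJANGO_SETTINGS_MODULE=mysite.settings_canada gunicorn mysite.wsgi
# and Apache/your DNS routes mysite.com/canada (or canada.mysite.com) to that process.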
I am in the middle of building my personal website and I am using Python to create a comment section where my visitors can leave comments in public (which means everybody can see them, so don't worry about user name registration). I have already set up the SQL database to store that data, but the one thing I haven't figured out yet is how to get the user input (their comments) from the browser. So, is there any Python module that can do that? (Like the CharField thing in Django, but unfortunately I don't use Django.)
For that you would need a web framework like Bottle or Flask. Bottle is a simple WSGI-based web framework for Python.
Using either of these you can write simple REST-based APIs, one to set and the other to get. The "set" one accepts data from your client side and stores it in your database, whereas your "get" API returns the data by reading it from your DB.
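For instance, a tiny Bottle sketch along those lines (the sqlite file, table, and field names are assumptions; create the comments table first):

import json
import sqlite3
from bottle import Bottle, request, response, run

app = Bottle()
DB = 'comments.db'  # assumes: CREATE TABLE comments (author TEXT, body TEXT)

@app.post('/comments')
def add_comment():
    # "set": accept a JSON comment from the browser and store it
    data = request.json or {}
    with sqlite3.connect(DB) as conn:
        conn.execute('INSERT INTO comments (author, body) VALUES (?, ?)',
                     (data.get('author', 'anonymous'), data.get('body', '')))
    return {'status': 'ok'}

@app.get('/comments')
def list_comments():
    # "get": return every stored comment as JSON
    with sqlite3.connect(DB) as conn:
        rows = conn.execute('SELECT author, body FROM comments').fetchall()
    response.content_type = 'application/json'
    return json.dumps([{'author': a, 'body': b} for a, b in rows])

if __name__ == '__main__':
    run(app, host='localhost', port=8080)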
Hope it helps.
At my work, we use Oracle for our database, which works great. I am not the main DB admin, but I do work with it. One thing I like is that the DB has a built-in logic layer using PL/SQL which can handle logic related to saving and retrieving the data. I really like this because it allows our MVC application (PHP/Zend Framework) to be lighter, and makes it easier to tie another platform into the data, such as desktop or mobile.
I have a personal project, though, where I want to use CouchDB or MongoDB, and I want to try to accomplish a similar goal. Outside of the MVC framework, I want to have an API layer that the main applications talk to; they don't talk directly to the database. They specify the design document (CouchDB) or something similar for Mongo to get the results. That API layer will validate the incoming data and make sure the data itself is saved and updated properly. For example, when saving a new user, from the framework I only need to send a JSON object with the keys/values that need to be saved, and the API layer saves the data in the proper places.
This API would probably have a UI, but only for administrative purposes and to make my life easier. In general it will always reply with JSON strings, or pre-rendered/cached HTML in some cases, since each API layer would be specific to the application anyway.
I was wondering if anyone has done anything like this, or has any tips on methods I could use to accomplish it. I am currently looking to write my application in Python, and the front end will likely be something like AngularJS, although I am also looking at Node.js for the back end.
We do this exact thing at my current job. We have MongoDB on the back end, a RESTful API on top of it and then PHP/Zend on the front end.
Most of our data is read only, so we import that data into MongoDB and then the RESTful API (in Java) just serves it up.
Some things to think about with this approach:
Write generic sorting/paging logic in your API. You'll need this for lists of data. The user can pass in things like http://yourapi.com/entity/1?pageSize=10&page=3 (see the sketch after this list).
Make sure to create appropriate indexes in Mongo to match what people will query on. Imagine you are storing users: make an index in Mongo on the user id field, or just use the already-indexed _id field in all your calls.
Make sure to include all relevant data in a given document. Mongo doesn't do joins like you're used to in Oracle. Just keep in mind modeling data is very different with a document database.
You seem to want to write a layer (the middle tier API) that is database agnostic. That's a good goal. Just be careful not to let Mongo specific terminology creep into your exposed API. Mongo has specific operators/concepts that you'll need to mask with more generic terms. For example, they have a $set operator. Don't expose that directly.
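For the paging point above, a rough pymongo sketch (the database, collection, and parameter names are assumptions): translate ?pageSize=10&page=3 into skip/limit before you hit Mongo.

from pymongo import MongoClient

client = MongoClient('mongodb://localhost:27017')
collection = client['mydb']['entities']

def find_page(query, page=1, page_size=10, sort_field='_id'):
    # Generic paging: page/pageSize from the query string become skip/limit.
    cursor = (collection.find(query)
                        .sort(sort_field, 1)
                        .skip((page - 1) * page_size)
                        .limit(page_size))
    return list(cursor)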
Finally, after a decent amount of experience with both CouchDB and Mongo, I'd definitely go with Mongo.
I have a script which scans an email inbox for specific emails. That part's working well and I'm able to acquire the data I'm interested in. I'd now like to take that data and add it to a Django app which will be used to display the information.
I can run the script on a CRON job to periodically grab new information, but how do I then get that data into the Django app?
The Django server is running on a Linux box under Apache / FastCGI if that makes a difference.
[Edit] In response to Srikar's question, "When you are saying 'get that data into the Django app', what exactly do you mean?"...
The Django app will be responsible for storing the data in a convenient form so that it can then be displayed via a series of views. So the app will include a model with suitable members to store the incoming data. I'm just unsure how you hook into Django to make new instances of those model objects and tell Django to store them.
I think Celery is what you are looking for.
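If you go down that road, the rough shape is a periodic task that wraps your existing scanning code and saves through the ORM. Everything below (the app layout, task name, Message model, and the scan_inbox import) is a placeholder sketch, not code from your project:

# yourapp/tasks.py
from celery import shared_task
from yourscript import scan_inbox  # your existing email-parsing function (placeholder name)
from .models import Message        # placeholder model for the scanned data

@shared_task
def import_inbox():
    for sender, body in scan_inbox():
        Message.objects.create(sender=sender, body=body)

Celery beat (or your cron job) can then schedule import_inbox on whatever interval you need.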
You can write a custom admin command to load data according to your needs and run that command through a cron job. You can refer to Writing custom commands.
You can also try the existing loaddata command, but it loads data from fixtures added in your app directory.
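A skeleton for such a command (the app name, command name, Message model, and scan_inbox import are placeholders):

# yourapp/management/commands/import_emails.py
from django.core.management.base import BaseCommand
from yourscript import scan_inbox  # your existing email-parsing function (placeholder name)
from yourapp.models import Message  # placeholder model

class Command(BaseCommand):
    help = 'Import scanned email data into the database'

    def handle(self, *args, **options):
        for sender, body in scan_inbox():
            Message.objects.create(sender=sender, body=body)
        self.stdout.write('Import finished')

Cron then just runs python manage.py import_emails on whatever schedule you like.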
I have done the same thing.
Firstly, my script was already parsing the emails and storing them in a db, so I set the db up in settings.py and used python manage.py inspectdb to create a model based on that db.
Then it's just a matter of building a view to display the information from your db.
If your script doesn't already use a db, it would be simple to create a model with the information you want stored, and then have your script write to the tables described by the model.
Forget about this being a Django app for a second. It is just a load of Python code.
What this means is, your Python script is absolutely free to import the database models you have in your Django app and use them as you would in a standard module in your project.
The only difference here is that you may need to take care to import everything Django needs to work with those models, whereas when a request enters through the normal web interface that is taken care of for you.
Just import Django and the required models.py/any other modules you need for it to work from your app. It is your code, not a black box. You can import it from wherever the hell you want.
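Concretely, a standalone script only needs to point Django at your settings before importing the models. A sketch (the settings path and Message model are assumptions):

import os
import django

os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'mysite.settings')
django.setup()  # required on Django 1.7+ before using the ORM outside a request

from yourapp.models import Message  # placeholder model

def save_scanned_email(sender, body):
    Message.objects.create(sender=sender, body=body)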
EDIT: The link from Rohan's answer to the Django docs for custom management commands is definitely the least painful way to do what I said above.
When you are saying "get that data into the Django app", what exactly do you mean?
I am guessing that you are using some sort of database (like MySQL). Insert whatever data your cron job has collected into the same tables that your Django app, and therefore your users, are accessing. That way your changes are immediately reflected to the users of the app, since they will be reading from the same tables.
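For example, a rough MySQLdb sketch (the credentials and the yourapp_message table name are assumptions; use whatever table Django actually created for your model):

import MySQLdb

conn = MySQLdb.connect(host='localhost', user='dbuser',
                       passwd='dbpass', db='mysite')
cur = conn.cursor()
cur.execute(
    "INSERT INTO yourapp_message (sender, body) VALUES (%s, %s)",
    ("alice@example.com", "scanned email body"),
)
conn.commit()
conn.close()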
Best way?
Make a view on the Django side to handle receiving the data, and have your script do an HTTP POST to a URL registered to that view.
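A bare-bones receiving view might look something like this (the URL, field names, and Message model are assumptions; in real use you would add authentication rather than relying on csrf_exempt alone):

# views.py
import json
from django.http import JsonResponse
from django.views.decorators.csrf import csrf_exempt
from .models import Message  # placeholder model

@csrf_exempt  # the script is not a browser, so it has no CSRF token
def receive_email(request):
    if request.method != 'POST':
        return JsonResponse({'error': 'POST required'}, status=405)
    data = json.loads(request.body)
    Message.objects.create(sender=data['sender'], body=data['body'])
    return JsonResponse({'status': 'saved'})

# urls.py would map something like path('api/emails/', views.receive_email)

Your script then just calls requests.post('http://yoursite.com/api/emails/', json={'sender': 'a@b.com', 'body': 'text'}).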
You could also import the model and such from inside your script, but I don't think that's a very good idea.
Have your script send an HTTP POST request like so (this uses the Requests library):
>>> import requests
>>> url = 'http://yoursite.com/upload'  # placeholder: the URL your receiving view is mapped to
>>> files = {'report.xls': open('report.xls', 'rb')}
>>> r = requests.post(url, files=files)
>>> r.text
Then on the receiving side you can use web.py to process the info like this:
x = web.input()
Then do whatever you want with x.
On the receiving side of the POST request, import web and write a function that handles the POST, for example:
def POST(self):
    x = web.input()
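Putting those pieces together, a minimal web.py receiver could look like this (the URL, class name, and file path are placeholders):

import web

urls = ('/upload', 'Upload')
app = web.application(urls, globals())

class Upload:
    def POST(self):
        x = web.input()  # form fields (and uploaded files) end up here
        with open('info.txt', 'a') as f:
            f.write(repr(dict(x)) + '\n')  # then do whatever you want with x
        return 'ok'

if __name__ == '__main__':
    app.run()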
If you don't want to use HTTP to send messages back and forth, you could just have the script write the email info to a .txt file and then have your Django app open the file and read it.
EDIT:
You could set your cron job to read the emails at, say, 8 AM and write the info to a text file, info.txt. Then in your code write something like:
import time

if time.strftime("%H") == "09":  # %H is zero-padded, so 9 AM is "09"
    with open("info.txt") as f:
        info = f.read()
That will check the file any time the code runs between 9 AM and 10 AM. If you want it to check only once, add the minutes to the if statement as well.