I have an XML file.
Is there a way to generate Django models from the XML file, or should I hardcode them?
I've been searching online for hours, and I guess Django REST framework and django-adaptors do the job, but I am not sure how to go about it.
If there is anyone familiar with Django and XML, any help will be appreciated.
Thank you.
Here's a talk which contains a tutorial on importing an XML file as a class:
http://youtu.be/sPiWg5jSoZI?t=2h30s
If you run Python 3 (I think this particular demo from this awesome and funny talk is also compatible with Python 2.6 and up), you'll get the idea of how to replace your models.py with a models.xml. You'll have to adapt it to your XML format yourself.
(You'll need to watch at least the segment from around 2h00m to 2h32m.)
The talk explains a fun way to do it, though the speaker doesn't seem convinced that it's a great idea.
If you want to generate API models from WSDL, I would still suggest hardcoding your table models, since I assume your database schema isn't dynamic. Simply write methods/static methods to translate API models to DB models.
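For illustration, a minimal sketch of the kind of translation layer I mean; the Customer model, field names, and payload shape are all hypothetical placeholders, not anything from your project:

    from myapp.models import Customer  # hypothetical hardcoded model

    class CustomerTranslator(object):
        """Translates between a parsed API/WSDL payload (a dict) and the DB model."""

        @staticmethod
        def from_api(payload):
            # Map API field names onto the hardcoded model's fields.
            return Customer(
                name=payload["name"],
                email=payload["contact"]["email"],  # hypothetical payload layout
            )

        @staticmethod
        def to_api(customer):
            # Build the API-shaped dict back from a model instance.
            return {"name": customer.name, "contact": {"email": customer.email}}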
I'm working with Maltego and I would like to add entities to a graph from a server through an API.
Is that possible?
I would say that it is not possible, but on a project I am being asked to do exactly that and I don't know how. I've been looking for information for a few days and I can't find anything.
Thanks in advance.
I have not tested anything because, from reading the documentation, I do not see how it would be possible.
I'm brand new on this forum and this is my first post!
At work we're starting a project which uses Apache Solr, and I'm in charge of the frontend system (Django-based).
Our Solr database isn't related to any other DB engine nor to any model class, so Haystack isn't a good fit for us (since it's strictly tied to the models).
I was looking at http://code.google.com/p/pysolr/ and http://code.google.com/p/solrpy/
Basically, they're similar. I like solrpy more, since it uses POST requests and we can mask our users' queries, but this makes its paginator harder to use (I guess...).
On the other side, pysolr, thanks to the GET method, performs better (lower query times), but so far I couldn't execute a query without getting a Bad Request error.
Before choosing one, I wanted to ask the community for opinions. Users only need to do searches, our data is handled by a Java process, no other DB is used (except for storing user information), and we need to use all of Solr's features (faceting, highlighting, stop words, analyzers...).
What would you choose, and why? Any good code examples you can point me at? I was looking through the Haystack source to see how they implemented everything...
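For reference, this is roughly the kind of pysolr query I've been attempting; the Solr URL and field names below are just placeholders, and the extra keyword arguments are passed straight through to Solr as query parameters:

    import pysolr

    # Placeholder URL; point this at your actual Solr core.
    solr = pysolr.Solr("http://localhost:8983/solr/")

    # Extra keyword arguments become Solr query parameters, so faceting
    # and highlighting can be requested directly.
    results = solr.search("django", **{
        "rows": 10,
        "facet": "true",
        "facet.field": "category",   # placeholder field name
        "hl": "true",
        "hl.fl": "text",             # placeholder field name
    })

    for doc in results:
        print(doc)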
Thanks all!
We have used 'solrpy', but encountered some problems with it.
Sunburnt is actually an interesting API:
https://github.com/tow/sunburnt/
Actively developed, and easy to use. Unfortunately it introduces some additional dependencies.
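For what it's worth, a rough sketch of what a query looks like with sunburnt; the Solr URL and field names are placeholders, and the exact chainable methods may vary between versions:

    import sunburnt

    si = sunburnt.SolrInterface("http://localhost:8983/solr/")  # placeholder URL

    # Chainable query interface; facet_by/paginate may differ slightly by version.
    response = si.query(text="django").facet_by("category").paginate(rows=10).execute()

    # The response behaves like a list of matching documents (dicts of stored fields).
    for doc in response:
        print(doc)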
Does anyone know of simple and well-documented APIs with plenty of hand-holding examples that assume very little or no prior knowledge of web development?
I've been messing around with Pyfacebook and Facebook-Python-SDK, trying to create a simple photo display app, but I haven't been able to make much headway after spending the last few days on it. The main reason for this is simply that I wasn't able to find a good tutorial that walks me through all the steps. So I'm putting this mini project on pause and looking for lower-hanging fruit.
In terms of skill level, I'm pretty OK with the basics of Python and Django.
Update
I've done the tutorials at http://www.djangoproject.com/ already. I'm really looking for ideas and suggestions on web app projects that utilise an API, e.g. a Twitter app that displays a user's most frequently used keywords in a tag cloud.
Update2
Side note: having messed around with Twitter's API for a little bit, I would definitely recommend starting with Twitter first as opposed to Facebook. It's easier and better documented.
The best place to start is with the tutorials on djangoproject.com.
Have you tried the Django tutorial? It is pretty basic, but touches on all the important points required to develop your own basic app.
django-basic-apps contains a collection of apps you might enjoy reading.
Edit: Check out this good list of web services I found. :)
As far as I know you can't write Facebook apps with Django; Facebook uses its own API and stuff. They are completely different.
And for the Twitter API thingy, I have an idea:
Develop a Django app which can be used to scrape and back up tweets.
The scenario is that during any FOSS conference, people use a #hashtag to identify tweets related to that conference. But after some time these tweets don't show up even in search. For example, we used the #inpycon2010 tag for the PyCon conference in India, but now when I search for this tag, nothing shows up.
So what you can do is allow users to register a hashtag and set a time interval. Within that time interval your app should scrape all the tweets and back them up. The user should be able to retrieve them later.
If you start this as a FOSS project, I'm ready to jump in :)
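To make the idea concrete, here is a rough sketch of the models such an app might start from; all names are hypothetical, and the actual tweet-fetching code (the Twitter search call) is left out:

    from django.db import models

    class TrackedHashtag(models.Model):
        # e.g. "inpycon2010"; the user registers this together with a time window.
        tag = models.CharField(max_length=140)
        start = models.DateTimeField()
        end = models.DateTimeField()

    class ArchivedTweet(models.Model):
        # on_delete is required on recent Django versions; older ones omit it.
        hashtag = models.ForeignKey(TrackedHashtag, on_delete=models.CASCADE)
        author = models.CharField(max_length=100)
        text = models.CharField(max_length=140)
        created_at = models.DateTimeField()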
Additional questions regarding SilentGhost's initial answer to a problem I'm having parsing Twitter RSS feeds. See also partial code below.
First, could I insert tags[0], tags[1], etc., into the database, or is there a different/better way to do it?
Second, almost all of the entries have a url, but a few don't; likewise, many entries don't have the hashtags. So, would the thing to do be to create default values for url and tags? And if so, do you have any hints on how to do that? :)
Third, when you say the single-table DB design is not optimal, do you mean I should create a separate table for tags? Right now, I have one table for the RSS feed URLs and another table with all the RSS entry data (summary, date, etc.).
I've pasted in a modified version of the code you posted. I had some success in getting a "tinyurl" variable into the SQLite database, but now it isn't working. Not sure why.
Lastly, assuming I can get the whole thing up and running (smile), is there a central site where people might appreciate seeing my solution? Or should I just post something on my own blog?
Best,
Greg
I would suggest reading up on database normalisation, especially on the 1st and 2nd normal forms. Once you're done with it, I hope there won't be a need for default values, and your DB schema will evolve into something more appropriate.
There are plenty of options for sharing your source code on the web; depending on which version control system you're most comfortable with, you might have a look at such well-known sites as Google Code, Bitbucket, GitHub and many others.
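To illustrate with your data (using the stdlib sqlite3 module; table and column names are only suggestions): rather than default values or tags[0]/tags[1] columns, a missing URL is simply NULL, and tags live in their own table linked back to the entries:

    import sqlite3

    conn = sqlite3.connect("tweets.db")
    conn.executescript("""
    CREATE TABLE IF NOT EXISTS entries (
        id INTEGER PRIMARY KEY,
        url TEXT,             -- NULL when the entry has no link
        summary TEXT,
        published TEXT
    );
    CREATE TABLE IF NOT EXISTS tags (
        id INTEGER PRIMARY KEY,
        name TEXT UNIQUE
    );
    CREATE TABLE IF NOT EXISTS entry_tags (  -- one row per (entry, tag) pair
        entry_id INTEGER REFERENCES entries(id),
        tag_id INTEGER REFERENCES tags(id)
    );
    """)
    conn.commit()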
I am wondering if there are any Django-based, or even Python-based, reporting services à la JasperReports or SQL Server Reporting Services?
Basically, I would love to be able to create reports and send them out by email as CSV, HTML, or PDF without having to code the reports. Even if I had to code the reports I wouldn't mind, but a whole framework with schedules and so on would be nice!
PS: I know I could use Django apps to do it, but I was hoping there was an integrated solution, or even a project such as Pinax or Satchmo which brings together the apps needed.
PPS: It would have to work with Postgres.
"I would love to be able to create reports ... without having to code the reports"
So would I. Sadly, however, each report seems to be unique and require custom code.
Going from a Django model to CSV is easy. Start there with a few of your reports.
import csv
import optparse
import sys

from myApp.models import This, That, TheOther

def parseCommandLine():
    # Set up optparse to get the report query parameters.
    parser = optparse.OptionParser()
    parser.add_option("--this", dest="this")
    parser.add_option("--that", dest="that")
    options, args = parser.parse_args()
    return options.this, options.that

def main():
    wtr = csv.DictWriter(sys.stdout, ["Col1", "Col2", "Col3"])
    this, that = parseCommandLine()
    thisList = This.objects.filter(name=this, that__name=that)
    for obj in thisList:
        wtr.writerow({"Col1": obj.col1, "Col2": obj.that.col2, "Col3": obj.theOther.col3})

if __name__ == "__main__":
    main()
HTML is pretty easy -- Django has an HTML template language. Rather than render_to_response, you simply render your template and write it to stdout. And the core of the algorithm, interestingly, is very similar to writing a CSV. Similar enough that -- without much cleverness -- you should have a design pattern that does both.
Once you have the CSV working, add the HTML using Django's templates.
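A minimal sketch of the HTML variant, assuming a template named report.html exists in your template directories; the template name, the context key, and the reuse of the hypothetical This model from the CSV example are all placeholders:

    import sys
    from django.template.loader import render_to_string
    from myApp.models import This  # same hypothetical model as in the CSV example

    def main():
        rows = This.objects.all()
        # Render the template to a string instead of an HttpResponse,
        # then write it to stdout just like the CSV version.
        html = render_to_string("report.html", {"rows": rows})
        sys.stdout.write(html)

    if __name__ == "__main__":
        main()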
PDFs are harder, because you have to actually work out the formatting in some detail. There are a lot of Python libraries for this. Interestingly, however, the overall pattern for PDF writing is very similar to CSV and HTML writing.
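As one example, the xhtml2pdf library mentioned further down can convert the rendered HTML straight to a PDF; a rough sketch (the output path is a placeholder):

    from xhtml2pdf import pisa

    def html_to_pdf(html, path):
        # CreatePDF writes the PDF into the open file object and reports
        # errors on the returned status object.
        with open(path, "wb") as out:
            status = pisa.CreatePDF(html, dest=out)
        return not status.err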
Emailing means using Python's smtplib directly or Django's email package. This isn't too hard. All the pieces are there, you just need to email the output files produced above to some distribution list.
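With Django's email package that might look roughly like this; the addresses and subject are placeholders, and the EMAIL_* settings must be configured:

    from django.core.mail import EmailMessage

    def send_report(csv_data, pdf_data):
        msg = EmailMessage(
            subject="Weekly report",              # placeholder subject
            body="Report attached.",
            from_email="reports@example.com",     # placeholder addresses
            to=["team@example.com"],
        )
        msg.attach("report.csv", csv_data, "text/csv")
        msg.attach("report.pdf", pdf_data, "application/pdf")
        msg.send()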
Scheduling takes a little thinking to make best use of crontab. This -- perhaps -- is the hardest part of the job.
I just thought that, after a fair bit of investigation, I would report my findings...
http://code.google.com/p/django-reporting/ - I think this project looks like an awesome candidate for a lot of the functionality I require, at least in its ability to create reports without too much code. Unfortunately it requires Django 1.1, which as of this writing (29th April 2009) has not been released.
http://code.google.com/p/django-cron/ - Looks promising for scheduling jobs without cron access.
http://www.xhtml2pdf.com/ - This, or ReportLab's PDF libraries, could be used for converting HTML to PDF.
All of these, together with Django's email functionality, could make a nice reporting system.