I'm using SQLAlchemy Automap with reflection to connect to an existing database. Some of the relationships work properly and some do not. I'd like a way to audit the results of prepare() so I can better understand what I'm working with. How can I view the relationship objects produced after I run prepare()?
Base.classes.<classname>.__table__ shows the tables and the included ForeignKey objects as described in the documentation, but no relationships or backreferences are included there, probably because it's at the Table level rather than the class level.
I'm not sure exactly what Automap does, but inspect() may help:
from sqlalchemy.inspection import inspect
relations = inspect(Base.classes.<classname>).relationships.items()
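If you want to audit everything prepare() produced in one pass, you can loop over all the generated classes. A minimal sketch, assuming a typical automap setup (the connection URL is a placeholder):
from sqlalchemy import create_engine
from sqlalchemy.ext.automap import automap_base
from sqlalchemy.inspection import inspect

engine = create_engine("postgresql://user:pass@localhost/mydb")  # placeholder URL
Base = automap_base()
Base.prepare(engine, reflect=True)

# Dump every relationship automap generated, with its target class and direction
for cls in Base.classes:
    mapper = inspect(cls)
    for name, rel in mapper.relationships.items():
        print("{}.{} -> {} ({})".format(
            cls.__name__, name, rel.mapper.class_.__name__, rel.direction.name))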
Recently I've seen an app powered by Django with MongoDB as the backend. The thing is, that app doesn't have a models.py file; all the data is inserted directly in views.py. I just need a little clarification about this particular thing: using Django without models.py with MongoDB.
models.py is where the Django ORM describes a fixed relational schema, and from which it generates the relevant SQL to initialize (or modify) the database. "ORM" stands for "Object-Relational Mapping".
Mongo is not relational, hence you don't need this type of schema.
(Of course, that can cause a lot of other problems if the needs of your project change later...)
But you don't need a relational schema since you're not using a relational DB.
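For what it's worth, the "no models.py" pattern usually just means the views talk to MongoDB directly through a driver. A rough sketch, assuming pymongo and made-up database/collection names:
# views.py -- no models.py, the view writes straight to MongoDB
from django.http import JsonResponse
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # assumed connection string
db = client["myapp"]                               # hypothetical database name

def add_event(request):
    # Whatever dict the view builds is the "schema"; no migration step involved
    db.events.insert_one({"name": request.POST["name"], "city": request.POST["city"]})
    return JsonResponse({"ok": True})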
A short answer
models.py is the ORM that comes free with Django.
An ORM maps your SQL schema onto object-oriented objects.
You can read more about ORM here: https://en.wikipedia.org/wiki/Object-relational_mapping.
When using a NoSQL database, you can push objects directly into the DB, so you do not really need an ORM.
That said, whether to use one or not is debatable.
P.S. Even when using SQL, some people prefer other ORMs over Django's built-in models.
I have a typical Django project with one primary database where I keep all the data I need.
Suppose there is another DB somewhere with some additional information. That DB isn't directly related to my Django project, so let's assume I don't even have control over it.
The problem is that I do not know whether I need to create and maintain a model for this external DB so I can use Django's ORM. Or maybe the best solution is to use raw SQL to fetch data from the external DB and then use that info to filter data from the primary DB using the ORM, or directly in views.
The solution of creating a model seems quite OK, but the fact that the DB isn't part of my project means I won't be aware of possible schema changes, which makes it look like bad practice.
So in the end, if I have some external resources like DBs that are not related to, but are needed for, my project, should I:
Try to create django models for them
Use raw SQL to get info from external DB and then use it for filtering data from the primary DB with ORM as well as using data directly in views if needed
Use raw SQL both for a primary and an external DB where they intersect in app's logic
An alternative is to use SQLAlchemy for the external database. It can use reflection to generate the SQLAlchemy-equivalent of django models during runtime.
It still won't be without issues. If your code depends on a certain column, it would still break if that column is removed or changed in an incompatible way. However, it will add a bit more flexibility to your database interactions, e.g. a Django model would definitely break if an int column is changed to a varchar column, but using database reflection, it will only break if your code depends on the fact that it is an int. If you simply display the data or something, it will remain fully functional. However, there is always a chance that a change doesn't break the system, but causes unexpected behaviour.
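As a rough illustration of what that reflection looks like at runtime (the connection URL and the "venues" table name are placeholders for whatever the external DB actually contains):
from sqlalchemy import create_engine, MetaData

external_engine = create_engine("postgresql://user:pass@external-host/otherdb")
metadata = MetaData()
metadata.reflect(bind=external_engine, only=["venues"])  # reflect only what you need
venues = metadata.tables["venues"]

# Columns and types are discovered at runtime, not declared in code
with external_engine.connect() as conn:
    rows = conn.execute(venues.select()).fetchall()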
If, like Benjamin said, the external system has an API, that would be the preferred choice.
I suggest you read about inspectdb and database routers. It's possible to use the Django ORM to manipulate an external DB.
https://docs.djangoproject.com/en/1.7/ref/django-admin/#inspectdb
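Roughly, the combination looks like this: run inspectdb against the external connection to generate unmanaged models, then add a router so those models always hit the second DATABASES entry. The "external" alias and app label below are made up:
# routers.py -- send the inspectdb-generated models to the external connection
class ExternalRouter(object):
    app_label = "external_app"  # hypothetical app holding the generated models

    def db_for_read(self, model, **hints):
        return "external" if model._meta.app_label == self.app_label else None

    def db_for_write(self, model, **hints):
        return "external" if model._meta.app_label == self.app_label else None

    def allow_migrate(self, db, app_label, **hints):
        # Never try to migrate the external schema from this project
        return False if app_label == self.app_label else None

# settings.py
# DATABASE_ROUTERS = ["myproject.routers.ExternalRouter"]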
I would create minimal Django models for the external database, covering only the tables your code actually interacts with (see the sketch after this list). Several outcomes of this:
If parts of the database you're not interested in change, it won't have an impact on your app.
If the external models you're using change, you probably want to be aware of that as quickly as possible (your app is likely to break in that case too).
All the relational databases queries in your code are handled by the same ORM.
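A minimal sketch of such a model, with invented table and field names, marked unmanaged so Django never tries to create or migrate it:
from django.db import models

class ExternalVenue(models.Model):
    name = models.CharField(max_length=200)
    city = models.CharField(max_length=100)

    class Meta:
        managed = False        # Django will not create or migrate this table
        db_table = "venues"    # the existing table in the external database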
In my models I have a Concert class and a Venue class. Each venue has multiple concerts. I have been linking the Concert class to a Venue with a simple
venue = models.IntegerField(max_length = 10)
...containing the venue object's primary key. A colleague suggested we use venue = models.ForeignKey(Venue) instead. While this also works, I wonder if it's worth the switch, because I have been able to parse out all the concerts for a venue by simply using the venue's ID in Concert.objects.filter(venue=4), the same way I could do this with a ForeignKey: venue_instance.concert_set.all(). I've never had any problems using my method.
The way I see it, using the IntegerField and objects.filter() is just as much of a "ManyToOne" relationship as a ForeignKey, so I want to know where I'm wrong. Why are ForeignKeys advantageous? Are they faster? Is it better database design? Cleaner code?
I would say that the most practical benefit of a foreign key is the ability to query across relationships automatically. Django generates the JOINs automatically.
The automatic reverse relation helpers are great too as you mentioned.
Here are some examples that would be more complicated with only an integer relationship.
concerts = Concert.objects.filter(...)
concerts.order_by('venue__attribute') # ordering beyond PK.
concerts.filter(venue__name='foo') # filter by a value across the relationship
concerts.values_list('venue__name') # get just venue names
concerts.values('venue__city').distinct() # get the unique venue cities
concerts.filter(venue__more__relationships='foo')
Venue.objects.filter(concert__name='Coachella') # reverse lookups work too
# with an integer field for Concert.venue, you'd have to do something like...
Venue.objects.filter(id__in=Concert.objects.filter(name='Coachella').values_list('venue', flat=True))
As others have pointed out... database integrity is useful, cascading deletes (customizable of course), and, facepalm, it just occurred to me that the Django admin and forms framework work amazingly well with foreign keys.
from django.contrib import admin

class ConcertInline(admin.TabularInline):
    model = Concert

class VenueAdmin(admin.ModelAdmin):
    inlines = [ConcertInline]

# that was quick!
I'm sure there are more examples of django features handling foreign keys.
ForeignKey is a database concept implemented in most databases that also enforces referential integrity.
Because Django knows that the column refers to another table, which may itself have a foreign key to yet another table, it can chain the relationship and produce the corresponding joins in the SQL.
Besides the normal one-way chaining, Django also adds an accessor on the opposite side, as you have recognized: when you have a venue instance, you can query venue.concert_set.
The things that bother me the most about not using a FK and rolling your own with an integer field are:
You don't have a referential integrity check.
You lose out on the power of SQL: every moderately deep query now needs multiple hits to the database, since you can't join.
You also lose out on all the levers the framework provides to deal with the SQL.
When using Django, I believe you can swap out the built-in ORM for SQLAlchemy (not sure how, though?).
Are they both basically the same thing, or is there a clear winner between the two?
When using Django, I believe you can swap out the built-in ORM for SQLAlchemy (not sure how, though?).
You can use SQLAlchemy in your Django applications. That doesn't mean you can "swap" out the ORM though. Some of Django's built-in batteries would cease to work if you completely replace Django's ORM with SQLAlchemy. For instance the Admin app wouldn't work.
I have read about people using both. Django's ORM to get the batteries to work and SQLAlchemy to tackle complex SQL relationships and queries.
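One way that mixing looks in practice (just a sketch; it assumes a standard PostgreSQL entry in settings.DATABASES and leaves the actual query as a placeholder):
# Build an SQLAlchemy engine from the same connection Django already uses,
# and keep the Django models/admin untouched.
from django.conf import settings
from sqlalchemy import create_engine, text

db = settings.DATABASES["default"]
engine = create_engine(
    "postgresql://{USER}:{PASSWORD}@{HOST}/{NAME}".format(**db))

with engine.connect() as conn:
    rows = conn.execute(text("SELECT ...")).fetchall()  # complex SQL goes here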
Are they both basically the same thing, or is there a clear winner between the two?
They are not the same thing. SQLAlchemy is more versatile than Django's ORM. For instance while Django's ORM tightly couples the business logic layer and the persistence layer into models, SQLAlchemy will let you keep them separate.
On the other hand SQLAlchemy has a steeper learning curve and would be an overkill for many projects. The existing migration tools for SQLAlchemy may not easily integrate with Django.
All said, I wouldn't presume to declare either a "winner". They are broadly similar tools but with their own strengths, weaknesses and best-fit situations.
While you are on the subject, it wouldn't hurt to read this (dated but relevant) blog post about the shortcomings of the Django ORM and how it compares with SQLAlchemy.
SQLAlchemy is more powerful, but also more complex, than Django ORM.
Elixir provides an interface on top of SQLAlchemy that is closer to Django ORM in terms of complexity, but also lets you use constructs from SQLAlchemy if Elixir alone isn't enough. For example, I've found SQLAlchemy's AssociationProxy class to be useful on several occasions. You just drop an AssociationProxy field into an Elixir model class and you're good to go.
Does anyone have any insight on organizing SQLAlchemy-based projects? I have many tables and classes with foreign keys and relations. What is everyone doing in terms of separating classes, tables, and mappers? I am relatively new to the framework, so any help would be appreciated.
Example:
classA.py # table definition and class A definition
classB.py # table definition and class B definition
### model.py
import classA, classB
mapper(classA.classA, classA.table)
mapper(classB.classB, classB.table)
Including the mappers inside classA and classB works, but poses cross-import issues when building relations. Maybe I am missing something :)
Take a look at how a Pylons project sets up SQLAlchemy:
meta.py holds the engine and metadata objects.
The model package holds the declarative classes (no explicit mapper needed). Inside that package, structure your classes by relevance into modules.
Maybe a good example would be the reddit source code :)
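The Pylons-style layout, very roughly (module names follow that convention; details vary per project):
# model/meta.py -- the one module everything else imports from
from sqlalchemy import MetaData
from sqlalchemy.orm import scoped_session, sessionmaker

Session = scoped_session(sessionmaker())
metadata = MetaData()

def init_model(engine):
    # Called once at application startup with the configured engine
    Session.configure(bind=engine)

# model/venue.py, model/artist.py, ... -- each imports metadata from meta.py
# and defines its own classes, so there are no circular imports between models.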
There are two features in SQLAlchemy's design that help avoid cross imports when defining relations:
The backref argument of relation() lets you define the backward relation from the side that already imports the other class.
Using strings (model class and field names) instead of the classes themselves. Unfortunately this works for declarative only, which is not your case.
See this chapter of the tutorial for more information.
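For the classic (non-declarative) mapping in the question, the backref route looks roughly like this. It assumes classA.py defines metadata, a table named "a" with an integer primary key, and a mapped ClassA; class names are adapted for readability:
# classB.py -- only this module imports classA; the reverse attribute on ClassA
# ("bs") is created by backref, so classA.py never needs to import classB.
from sqlalchemy import Table, Column, Integer, ForeignKey
from sqlalchemy.orm import mapper, relationship

import classA

table = Table(
    "b", classA.metadata,
    Column("id", Integer, primary_key=True),
    Column("a_id", Integer, ForeignKey("a.id")),
)

class ClassB(object):
    pass

mapper(ClassB, table, properties={
    "a": relationship(classA.ClassA, backref="bs"),
})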