I have a MongoDB database named world, which has two collections created earlier: city and languages. I want to show the data of these collections on the web. How can I do it?
Currently I know how to create a collection in models.py and migrate it, like this:
First we have to edit DATABASES in settings.py:
DATABASES = {
    'default': {
        'ENGINE': 'djongo',
        'NAME': 'world',
    }
}
In models.py I created a class and migrated it using python manage.py migrate:
class Destination(models.Model):
    name = models.CharField(max_length=100)
    img = models.ImageField(upload_to='pics')
    desc = models.TextField()
and I'm able to retrieve data from Destination by using the code below in views.py:
from django.shortcuts import render
from .models import Destination

def index(request):
    dests = Destination.objects.all()
    return render(request, 'index.html', {'dests': dests})
My question: my collections (city, languages) already exist in the database and I'm not creating them, unlike Destination, which was defined by me. So how do I show the data of the city collection of the world database on the front end?
Any help is appreciated.
If I understood you properly, you have a MongoDB database called world, where you stored city and languages before you started to set up Django. Then you added a Destination model, thus creating a new collection.
Now you're looking for a way to get the data of the city and languages collections the same way you do with Destination.
There are multiple ways you could handle this:
Create Django models for the city and languages collections (defining the fields that already exist in those collections):
class City(models.Model):
    field1 = ...
    field2 = ...

    class Meta:
        db_table = 'city'  # important, should be your existing collection name

class Language(models.Model):
    field3 = ...
    field4 = ...

    class Meta:
        db_table = 'languages'  # important, should be your existing collection name
Now you're ready to use City and Language the same way as you do with the Destination model.
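For example, a view could query them just as your existing index view queries Destination (a minimal sketch; the field names on City and Language are placeholders for whatever you actually define):
# views.py -- minimal sketch; City/Language fields are whatever you defined above
from django.shortcuts import render
from .models import City, Language

def world_index(request):
    cities = City.objects.all()
    languages = Language.objects.all()
    return render(request, 'index.html', {'cities': cities, 'languages': languages})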
Use PyMongo directly (it is already installed, since you're using Djongo). Your snippet will then look something like this:
from django.shortcuts import render
from .models import Destination
import pymongo

# default localhost connection URL
MONGO_URL = 'mongodb://localhost:27017'
connection = pymongo.MongoClient(MONGO_URL)
mongo_db = connection.world

def index(request):
    collection_city = mongo_db['city']
    collection_languages = mongo_db['languages']
    cities = list(collection_city.find())
    languages = list(collection_languages.find())
    dests = Destination.objects.all()
    return render(
        request,
        'index.html',
        {
            'dests': dests,
            'cities': cities,
            'languages': languages,
        }
    )
I'd use option 1, as it allows you to keep the project coherent.
With a database model described in the simplified toy example below, we are trying to get a list of EAV attributes used for a specific Product.
The sample source code in the Actions section below serves the purpose; however, we feel the statement is overly verbose: we only need columns of the template_attribute table, but the values() arguments need to maintain a fully qualified path starting from the original Product model. See the code below:
# Getting the `id` columns from multiple levels of related models:
attribute_set = template.values(
    "id",                               # the base model, Product
    "template__id",                     # the related model, Template
    "template__templateattribute__id",  # the second related model, TemplateAttribute
)
So, we wonder if there is a way to refer to the columns directly from the containing model, e.g. templateattribute__id, or even better id, instead of template__templateattribute__id.
We are new to the Django ORM and appreciate any hints or suggestions.
Actions:
template = Product.active_objects.filter(id='xxx').select_related('template')
attribute_set = template.values("id", "template__id", "template__templateattribute__id")
for i, attr in enumerate(attribute_set):
    print("{:03}: {}".format(i, attr))
# Output:
# 000: {'id': xxx, 'template__id': xxxx, 'template__templateattribute__id': xxxxx}
# 001: {'id': xxx, 'template__id': xxxx, 'template__templateattribute__id': xxxxx}
# 002: {'id': xxx, 'template__id': xxxx, 'template__templateattribute__id': xxxxx}
# ...
The models:
# Simplified toy example
class Product(models.Model):
    product_template = models.ForeignKey(Template)
    sku = models.CharField(max_length=100)
    ...

class Template(models.Model):
    base_sku = models.CharField(max_length=100)
    ...

class TemplateAttribute(models.Model):
    product_template = models.ForeignKey(Template)
    attribute = models.ForeignKey(eav_models.Attribute)
    ...
# From the open-source Django EAV library
# imported as `eav_models`
#
class Attribute(models.Model):
    name = models.CharField(_(u"name"), max_length=100,
                            help_text=_(u"User-friendly attribute name"))
    ...
    slug = EavSlugField(_(u"slug"), max_length=50, db_index=True,
                        help_text=_(u"Short unique attribute label"))
    ...
Perhaps using a Manager with annotations or aliases could help?
You could add more magic by dynamically adding an annotation for each key during the manager's construction, but really at that point you are writing code that should have been in the EAV library itself.
I would warn you: having attribute names and their values in a table instead of plain model fields (and columns in the DB) will be an uphill battle, and you are already finding areas where your library isn't handling things for you.
from django.db import models
from django.db.models import F

class ProductManager(models.Manager):
    def get_queryset(self):
        qs = super().get_queryset().annotate(
            my_attribute_key=F('template__templateattribute__id')
        )
        return qs
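Assuming this manager is attached to the model (e.g. objects = ProductManager() on Product, which is not shown in the original code), the annotated key can then be used directly in values():
# hypothetical usage, with ProductManager attached as Product.objects
attribute_set = Product.objects.filter(id='xxx').values('id', 'template__id', 'my_attribute_key')
for i, attr in enumerate(attribute_set):
    print("{:03}: {}".format(i, attr))
# 000: {'id': xxx, 'template__id': xxxx, 'my_attribute_key': xxxxx}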
In addition, we also figured out a workaround by defining a base path:
base = "template__templateattribute"
So, instead of
attribute_set = template.values("template__templateattribute__id")
we can do the following:
attribute_set = template.values(base + "__id")
This is just an informal workaround, though. The better way is still to use a Manager class.
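Another option worth noting: on Django 1.11 and newer, values() also accepts keyword arguments with expressions, which both shortens the call site and renames the keys in the output dicts. A sketch under that assumption (the output key names tmpl_id and attr_id are arbitrary):
from django.db.models import F

# keyword arguments to values() act as annotations and rename the output keys (Django 1.11+)
attribute_set = template.values(
    "id",
    tmpl_id=F("template__id"),
    attr_id=F("template__templateattribute__id"),
)
# each dict now looks like {'id': ..., 'tmpl_id': ..., 'attr_id': ...}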
I have a web application in Python Django. I need to import users and display data about them from another database, belonging to another existing application. All I need is for the user to be able to log in and to display information about them. What solutions are there?
You can set 2 DATABASES in settings.py.
DATABASES = {
    'default': {
        ...
    },
    'user_data': {
        ...
    }
}
Then store the User model with authentication and related data in one database, and the rest of the information in the other. You can connect information to a specific User with a field that stores the id of the User from the other database.
If you have multiple databases and create a model, you should declare in which database it is going to be stored. If you don't, it will go to the default one (if you have one declared).
class UserModel(models.Model):
    class Meta:
        db_table = 'default'

class UserDataModel(models.Model):
    class Meta:
        db_table = 'user_data'
The answer from @NixonSparrow is wrong: db_table defines only the table name within a database, not the database itself.
To switch databases you can use Model.objects.using('database_name') for every model; it is well documented here: https://docs.djangoproject.com/en/4.0/topics/db/multi-db/#topics-db-multi-db-routing
In my project I use multiple database routers (see the same link), which helps avoid overriding every query with using(). But in your case:
DATABASES = {
    'default': {
        ...
    },
    'other_users_data': {
        ...
    }
}
and somewhere in views:
other_users = otherUserModel.objects.using('other_users_data')
Probably OtherUserModel should define in its Meta which table you want to use (db_table = 'other_users_table_name'), and it should probably also have managed = False, to hide this model from the migrations machinery.
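A minimal sketch of what that could look like (the field names here are placeholders; the real columns depend on the other application's table):
class OtherUserModel(models.Model):
    # placeholder fields -- map only the columns you actually need
    user_id = models.IntegerField(primary_key=True)
    email = models.CharField(max_length=255)

    class Meta:
        managed = False                      # keep migrations away from this table
        db_table = 'other_users_table_name'  # the existing table in the other app's DB

# somewhere in views:
other_users = OtherUserModel.objects.using('other_users_data')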
I have just started learning Django and I am making a database using SQLite.
I have a Python file called models.py and I am currently creating a .py file for the objects called object.py. However, I am having trouble creating objects in the database I have made. I think there is an issue with the object.py file, because it claims that my FruitInfo model has no objects.
models.py
from django.db import models
from django.core.validators import MinLengthValidator, MaxValueValidator, MinValueValidator

class FruitInfo(models.Model):
    id = models.IntegerField(primary_key=True, max_length=30, validators=[MinLengthValidator("5")])
    name = models.CharField(max_length=30)
    origin = models.CharField(max_length=60)
    price = models.DecimalField(max_digits=4, null=False, decimal_places=2, validators=[MinValueValidator('200')])

    def __str__(self):
        return self.origin + " " + self.name
object.py
from stinky.models import FruitInfo
FruitInfo.objects.all()
FruitInfor.objects.all().values()
record = FruitInfo.objects.create(id = "A0001", name = "Pink Lady Apple", origin= "Washington State", price="$2.00/kg" )
record = FruitInfo.objects.create(id = "A0002", name = "Organic Bananana", origin = "Australia", price = "$4.50/kg")
record = FruitInfo.objects.create(id = "A0003", name = "Orange", origin = "Indonesia", price = "$4/kg")
record.save()
I am currently creating a py file for the objects called object.
object.py is just a "simple" Python file. Unless you load it somewhere, it will not run. But even if it does, this is not the way to construct objects in the database: it means that each time you start the server, it will try to construct new objects. It also means that if you later alter the model, you will each time have to update the logic in that file.
Typically one populates the database like that through a data migration [Django-doc].
You can create such a migration by creating an empty migration file with:
python3 manage.py makemigrations --empty stinky
Here stinky is the name of the app.
Then we can create data in the migration by altering the migration file to something like:
from django.db import migrations

def populate_database(apps, schema_editor):
    FruitInfo = apps.get_model('stinky', 'FruitInfo')
    FruitInfo.objects.create(id='A0001', name='Pink Lady Apple', origin='Washington State', price='$2.00/kg')
    FruitInfo.objects.create(id='A0002', name='Organic Bananana', origin='Australia', price='$4.50/kg')
    FruitInfo.objects.create(id='A0003', name='Orange', origin='Indonesia', price='$4/kg')

class Migration(migrations.Migration):

    dependencies = [
        ('stinky', '1234_some_migration'),
    ]

    operations = [
        migrations.RunPython(populate_database),
    ]
apps.get_model('stinky', 'FruitInfo') will fetch the model as it looks at that specific point in the migration history. This means that if you later, for example, add extra fields, the records will run through that later migration, but the FruitInfo used here will not have that extra field.
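If you also want this data migration to be reversible (so that migrating backwards does not fail), you can pass a reverse function, or migrations.RunPython.noop, as the second argument to RunPython; a small sketch along those lines:
def remove_data(apps, schema_editor):
    # reverse of populate_database: remove the records created above
    FruitInfo = apps.get_model('stinky', 'FruitInfo')
    FruitInfo.objects.filter(id__in=['A0001', 'A0002', 'A0003']).delete()

# and in the Migration class:
#     operations = [
#         migrations.RunPython(populate_database, remove_data),
#     ]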
I need to programmatically generate the CREATE TABLE statement for a given unmanaged model in my Django app (managed = False).
Since I'm working on a legacy database, I don't want to create a migration and use sqlmigrate.
The ./manage.py sql command was useful for this purpose, but it has been removed in Django 1.8.
Do you know about any alternatives?
As suggested, I am posting a complete answer for the case that the question might imply.
Suppose you have an external DB table that you decided to access as a Django model, and therefore described it as an unmanaged model (Meta: managed = False).
Later you need to be able to create it in your code, e.g. for some tests using your local DB. Obviously, Django doesn't make migrations for unmanaged models and therefore won't create it in your test DB.
This can be solved using a Django API without resorting to raw SQL: the SchemaEditor. See a more complete example below, but as a short answer you would use it like this:
from django.db import connections

with connections['db_to_create_a_table_in'].schema_editor() as schema_editor:
    schema_editor.create_model(YourUnmanagedModelClass)
A practical example:
# your_app/models/your_model.py
from django.db import models

class IntegrationView(models.Model):
    """A read-only model to access a view in some external DB."""

    class Meta:
        managed = False
        db_table = 'integration_view'

    name = models.CharField(
        db_column='object_name',
        max_length=255,
        primary_key=True,
        verbose_name='Object Name',
    )
    some_value = models.CharField(
        db_column='some_object_value',
        max_length=255,
        blank=True,
        null=True,
        verbose_name='Some Object Value',
    )

    # Depending on the situation it might be a good idea to redefine
    # some methods as a NOOP as a safety-net.
    # Note that it's not completely safe this way, but it might help with some
    # silly mistakes in user code.
    def save(self, *args, **kwargs):
        """Prevent data modification."""
        pass

    def delete(self, *args, **kwargs):
        """Prevent data deletion."""
        pass
Now, suppose you need to be able to create this model via Django, e.g. for some tests.
# your_app/tests/some_test.py

# This will allow us to access the `SchemaEditor` for the DB
from django.db import connections
from django.test import TestCase

from your_app.models.your_model import IntegrationView

class SomeLogicTestCase(TestCase):
    """Tests some logic that uses `IntegrationView`."""

    # Since it is assumed that `IntegrationView` is read-only for the
    # case being described, it's a good idea to put the setup logic in the
    # class setup fixture, which will run only once for the whole test case
    @classmethod
    def setUpClass(cls):
        """Prepares `IntegrationView` mock data for the test case."""
        super().setUpClass()

        # This is the actual part that will create the table in the DB
        # for the unmanaged model (any model in fact, but managed models will
        # have their tables created already by the Django testing framework).
        # Note: here we're able to choose which DB, defined in your settings,
        # will be used to create the table
        with connections['external_db'].schema_editor() as schema_editor:
            schema_editor.create_model(IntegrationView)

        # That's all you need; after the execution of these statements
        # a DB table for `IntegrationView` will be created in the DB
        # defined as `external_db`.

        # Now suppose we need to add some mock data...
        # Again, if we consider the table to be read-only, the data can be
        # defined here, otherwise it's better to do it in the `setUp()` method.

        # Remember `IntegrationView.save()` is overridden as a NOOP, so simple
        # calls to `IntegrationView.save()` or `IntegrationView.objects.create()`
        # won't do anything, so we need to "Improvise. Adapt. Overcome."

        # One way is to use the `save()` method of the base class,
        # but provide the instance of our class
        integration_view = IntegrationView(
            name='Biggus Dickus',
            some_value='Something really important.',
        )
        super(IntegrationView, integration_view).save(using='external_db')

        # Another one is to use `bulk_create()`, which doesn't use
        # `save()` internally, and in fact is a better solution
        # if we're creating many records
        IntegrationView.objects.using('external_db').bulk_create([
            IntegrationView(
                name='Sillius Soddus',
                some_value='Something important',
            ),
            IntegrationView(
                name='Naughtius Maximus',
                some_value='Whatever',
            ),
        ])

    # Don't forget to clean up afterwards
    @classmethod
    def tearDownClass(cls):
        with connections['external_db'].schema_editor() as schema_editor:
            schema_editor.delete_model(IntegrationView)
        super().tearDownClass()

    def test_some_logic_using_data_from_integration_view(self):
        self.assertTrue(IntegrationView.objects.using('external_db').filter(
            name='Biggus Dickus',
        ))
To make the example more complete: since we're using multiple DBs (default and external_db), Django will try to run migrations on both of them for the tests, and as of now there's no option in the DB settings to prevent this. So we have to use a custom DB router for testing.
# your_app/tests/base.py
class PreventMigrationsDBRouter:
    """DB router to prevent migrations for specific DBs during tests."""

    _NO_MIGRATION_DBS = {'external_db'}

    def allow_migrate(self, db, app_label, model_name=None, **hints):
        """Actually disallows migrations for specific DBs."""
        return db not in self._NO_MIGRATION_DBS
And a test settings file example for the described case:
# settings/test.py
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.oracle',
        'NAME': 'db_name',
        'USER': 'username',
        'HOST': 'localhost',
        'PASSWORD': 'password',
        'PORT': '1521',
    },
    # For production here we would have settings to connect to the external DB,
    # but for testing purposes we could get by with an SQLite DB
    'external_db': {
        'ENGINE': 'django.db.backends.sqlite3',
    },
}

# Not necessary to use a router in the production config, since if the DB
# is unspecified explicitly for some action Django will use the `default` DB
DATABASE_ROUTERS = ['your_app.tests.base.PreventMigrationsDBRouter']
Hope this detailed, new-Django-user-friendly example will help someone and save them some time.
Unfortunately there seems to be no easy way to do this, but luckily I have just succeeded in producing a working snippet for you by digging into the internals of the Django migrations jungle.
Just:
save the code below to get_sql_create_table.py (for example)
do $ export DJANGO_SETTINGS_MODULE=yourproject.settings
launch the script with python get_sql_create_table.py yourapp.yourmodel
and it should output what you need.
Hope it helps!
import django
django.setup()

from django.db.migrations.state import ModelState
from django.db.migrations import operations
from django.db.migrations.migration import Migration
from django.db import connections
from django.db.migrations.state import ProjectState

def get_create_sql_for_model(model):
    model_state = ModelState.from_model(model)

    # Create a fake migration with the CreateModel operation
    cm = operations.CreateModel(name=model_state.name, fields=model_state.fields)
    migration = Migration("fake_migration", "app")
    migration.operations.append(cm)

    # Let the migration framework think that the project is in an initial state
    state = ProjectState()

    # Get the SQL through the schema_editor bound to the connection
    connection = connections['default']
    with connection.schema_editor(collect_sql=True, atomic=migration.atomic) as schema_editor:
        state = migration.apply(state, schema_editor, collect_sql=True)

    # Return the CREATE TABLE statement
    return "\n".join(schema_editor.collected_sql)

if __name__ == "__main__":
    import importlib
    import sys

    if len(sys.argv) < 2:
        print("Usage: {} <app.model>".format(sys.argv[0]))
        sys.exit(100)

    app, model_name = sys.argv[1].split('.')
    models = importlib.import_module("{}.models".format(app))
    model = getattr(models, model_name)
    rv = get_create_sql_for_model(model)
    print(rv)
For Django v4.1.3, the above get_create_sql_for_model source code changes to this:
from django.db.migrations.state import ModelState
from django.db.migrations import operations
from django.db.migrations.migration import Migration
from django.db import connections
from django.db.migrations.state import ProjectState

def get_create_sql_for_model(model):
    model_state = ModelState.from_model(model)
    table_name = model_state.options['db_table']

    # Create a fake migration with the CreateModel operation
    cm = operations.CreateModel(name=model_state.name, fields=model_state.fields.items())
    migration = Migration("fake_migration", "app")
    migration.operations.append(cm)

    # Let the migration framework think that the project is in an initial state
    state = ProjectState()

    # Get the SQL through the schema_editor bound to the connection
    connection = connections['default']
    with connection.schema_editor(collect_sql=True, atomic=migration.atomic) as schema_editor:
        state = migration.apply(state, schema_editor, collect_sql=True)

    sqls = schema_editor.collected_sql
    items = []
    for sql in sqls:
        if sql.startswith('--'):
            continue
        items.append(sql)
    return table_name, items
# EOP
I used it to create all tables (like the syncdb command of old Django versions):
for app in settings.INSTALLED_APPS:
    app_name = app.split('.')[0]
    app_models = apps.get_app_config(app_name).get_models()
    for model in app_models:
        table_name, sqls = get_create_sql_for_model(model)
        if settings.DEBUG:
            s = "SELECT COUNT(*) AS c FROM sqlite_master WHERE name = '%s'" % table_name
        else:
            s = "SELECT COUNT(*) AS c FROM information_schema.TABLES WHERE table_name='%s'" % table_name
        rs = select_by_raw_sql(s)
        if not rs[0]['c']:
            for sql in sqls:
                exec_by_raw_sql(sql)
            print('CREATE TABLE DONE: %s' % table_name)
The full source code can be found at: Django syncdb command came back for v4.1.3 version.
So I have a question. I was thinking of creating a single table that has a foreign key to several other tables, using another field, "type", to say which table the key belongs to.
class Status(models.Model):
    request = models.ForeignKey("Request1", "Request2", "Request3")
    request_type = models.IntegerField()
    ...Some status related data

class Request1(models.Model):
    ...Some data

class Request2(models.Model):
    ...Some other data

class Request3(models.Model):
    ...different data
My question is, is it possible to define a foreign key like this?
Another solution I thought of was to define my model like this:
class Status(models.Model):
    request1 = models.ForeignKey("Request1")
    request2 = models.ForeignKey("Request2")
    request3 = models.ForeignKey("Request3")
    ...Some status related data

class Request1(models.Model):
    ...Some data

class Request2(models.Model):
    ...Some other data

class Request3(models.Model):
    ...different data
But if I do it this way, is it possible to define a constraint via Django that says only one foreign key is allowed to have data and the other two must be null? Or will I have to set this constraint up strictly on the DB side (I'm using Postgres)? I would like to be able to tell Django to do it when it creates the DB, so I don't have to remember it every time someone recreates the DB.
Any input or advice would be greatly appreciated. I am not married to either of these ideas, so if there is another clever way to achieve the same effect I am happy to hear it. Thank you for your time.
Edit: I am using Django 1.7.10.
You should use the contenttypes framework in Django.
There's an example for a generic relation here: https://docs.djangoproject.com/en/1.8/ref/contrib/contenttypes/#generic-relations
For your requirement it could look something like this:
from django.db import models
from django.contrib.contenttypes.fields import GenericForeignKey
from django.contrib.contenttypes.models import ContentType

class Status(models.Model):
    request_type = models.ForeignKey(ContentType)
    request_id = models.PositiveIntegerField()
    request = GenericForeignKey('request_type', 'request_id')
You can then do something like the following:
# the Request objects must be saved before a Status can point at them
request1 = Request1.objects.create()  # e.g. the request displayed as <Request1 "foo">
request2 = Request2.objects.create()  # e.g. the request displayed as <Request2 "bar">

status1 = Status(request=request1)
status1.save()
status2 = Status(request=request2)
status2.save()

status1.request  # <Request1 "foo">
status2.request  # <Request2 "bar">
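If you also need to query in the opposite direction (all Status objects for a given request), you could add a GenericRelation to each request model; a sketch, assuming the Status model defined above:
from django.contrib.contenttypes.fields import GenericRelation

class Request1(models.Model):
    # ...some data
    statuses = GenericRelation(
        'Status',
        content_type_field='request_type',
        object_id_field='request_id',
    )

# then:
# request1.statuses.all()  # all Status objects pointing at this Request1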