This is my first time using MongoEngine, though I know the ORM concept.
I want to know: can I fetch data from one DB and use the same object to store it in another DB?
from mongoengine import Document, EmailField, IntField, connect
from mongoengine.context_managers import switch_db

class User(Document):
    email = EmailField(required=True, unique=True)
    salary = IntField(required=True)

connect(alias='default', db='tumblelog')
connect(alias='testdb', db='testdb')

users = User.objects

with switch_db(User, 'testdb') as User:
    for user in users:
        User(email=user.email, salary=user.salary).save()  # this works
        user.save()  # doesn't work
I've found that when you use the ORM to fetch data from one DB, it creates an object bound to the DB you fetched it from. You won't be able to use the same object to store it into another DB.
What I would suggest is that you initialise an empty object (for example a User) for the other DB, fill it with the values from the original object, and then save it.
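As a minimal sketch of that copy-the-fields approach, using the model and aliases from the question:

users = list(User.objects)  # fetched from the default DB ('tumblelog')
with switch_db(User, 'testdb') as TestUser:
    for user in users:
        # build a fresh object bound to 'testdb' and copy the fields over
        TestUser(email=user.email, salary=user.salary).save()

Newer MongoEngine versions also provide an instance-level user.switch_db('testdb') followed by user.save(), which may spare the manual copy; check your version's documentation.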
You might also want to have a look at this question: Sequelize: Using Multiple Databases
I want to implement the following data structure to store my data in MongoDB with MongoEngine on Flask:
skills = [{"asm": "Assembly",
           "flag": False,
           "date": datetime},
          {"java": "Java",
           "flag": False,
           "date": datetime}]
I don't know how to declare and update this kind of structure.
To update one object I used:
User.objects(skills=form.skills.data).update_one()
However, I don't know how to update several fields in one shot.
I tried the code below, but it doesn't work:
now = datetime.now()
User.objects(skills=form.skills).update_one(set__skills=({'ruby': 'Ruby'}, {'flag': 'true'}, {'date': now}))
What kind of fields should I declare in forms.py?
From what I understood, you need a nested document (skills) embedded inside another one (which refers to User in this case). To do something like this you don't have to update a field atomically; instead you append values to the subdocument list and then save everything.
Following your example, in your case you could do something like this:
user = User.objects(email=current_user.email).get()
This fetches the document for user X through a query filter; in my example, the email of the currently logged-in user.
user.skills.append(SubDocumentClass(skillName="name_of_the_skill", status=True, date=datetime.now()))
This appends a new entry to the subdocument list (I've used your fields; SubDocumentClass stands in for your embedded-document class, and the list field is called skills).
user.save()
This saves everything.
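Putting it together, here is a minimal sketch; the Skill class and its fields mirror the answer above and are assumptions for illustration, not part of the original question (current_user is assumed to come from Flask-Login):

from datetime import datetime
from mongoengine import (Document, EmbeddedDocument, EmbeddedDocumentField,
                         ListField, StringField, BooleanField, DateTimeField,
                         EmailField)

class Skill(EmbeddedDocument):
    # hypothetical field names chosen for this example
    skillName = StringField(required=True)   # e.g. 'Ruby'
    status = BooleanField(default=False)     # your 'flag'
    date = DateTimeField()

class User(Document):
    email = EmailField(required=True, unique=True)
    skills = ListField(EmbeddedDocumentField(Skill))

user = User.objects(email=current_user.email).get()
user.skills.append(Skill(skillName="Ruby", status=True, date=datetime.now()))
user.save()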
I am trying to query my Postgres database from Django. The query I'm using is:
s = Booking.objects.all().filter(modified_at__range=[last_run, current_time], coupon_code__in=l)
Now I am changing this object in some ways in my script, without saving it to the database. What I want to know is: is it possible to query this object now?
Say I changed a field like this:
s.modified_at = '2016-02-22'
Is it still possible to query this object, as in:
s.objects.all()
or something similar?
The Manager/QuerySet API is Django's interface to the database (the ORM). By definition, this means you can only query data that has been stored in the database.
So, in short: no. You cannot run queries against unsaved data.
Thinking about why you are even asking this, especially given the example uses modified_at: why do you not want to save your data?
(You might want to use auto_now=True for your modified_at field, by the way.)
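For example, a minimal sketch (the DateTimeField type is an assumption based on the field name):

from django.db import models

class Booking(models.Model):
    # updated automatically every time save() is called
    modified_at = models.DateTimeField(auto_now=True)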
You could do something like this:
bookings = Booking.objects.all().filter(modified_at__range=[last_run, current_time], coupon_code__in=l)
for booking in bookings:
    booking.modified_at = 'some value'
    booking.save()  # now the booking object will have the updated value
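If you really need to "query" the modified but unsaved objects, the only option is to filter the already-fetched list in plain Python, for example:

# this filters in memory and never touches the database
changed = [b for b in bookings if b.modified_at == 'some value']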
I'm trying to run the Python/Django code below, but I'm getting blank output:
SitePaths = PathsOfDomain.objects.filter(pathToScan__contains="www.myapp.com")
return SitePaths
PathsOfDomain is the object representation of a DB table.
I'm trying to iterate through a DB field named pathToScan and output each value.
If someone can please shed some light on this, thank you.
If you meant to query for matching PathsOfDomain rows in the database, use the .objects attribute to create a query set:
SitePaths = PathsOfDomain.objects.filter(FKtoTld__id=domain_in_session)
See Making queries in the Django documentation.
Alternatively, if there is a foreign key relationship between the Tld and PathsOfDomain objects, use the related objects manager instead:
SitePaths = domain_in_session.pathsofdomain_set.all()
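Either way, to actually output each pathToScan value, iterate over the resulting queryset; a minimal sketch:

site_paths = PathsOfDomain.objects.filter(pathToScan__contains="www.myapp.com")
for p in site_paths:
    print(p.pathToScan)  # prints the field value of each matching row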
I've created a Google App Engine app using Python. The application deals with a lot of usernames.
It has a database of 50K usernames. Each username has a unique hash value, which is also stored in the datastore.
When an app user submits a username, the application first checks whether the username exists in the DB.
If it's a new username, the application calculates a new hash for the name and stores both the name and the hash in the datastore.
If the username already exists in the datastore, it retrieves the old hash from the datastore.
Sample code:
class Names(db.Model):
    name = db.StringProperty(required=True)
    hash = db.StringProperty(required=True)

username = "debasish"
user_db = db.GqlQuery("SELECT * FROM Names WHERE name=:1", username)
user = user_db.get()
if user is None:
    # doesn't exist in the DB, so calculate a new hash for the name and store it
    e = Names(name=username, hash="badasdbashdbhasbdasbdbjasbdjbasjdbasbdbasjdbjasbd")
    e.put()
else:
    # retrieve the old hash
    self.response.out.write('{"name":"' + user.name + '","hash":"' + user.hash + '"}')
The problem I'm facing is GAE's free datastore read-operations quota: it is exceeded too quickly, and my application stops working.
I've also tried to implement memcache, like this, adding the entire DB to memcache. But that was also a failure; the result was even worse.
def get_fresh_all(self):
    all_names = db.GqlQuery("SELECT * FROM Names")
    memcache.add('full_db', all_names, 3600)
    return all_names
So, could you please suggest: am I doing something wrong?
How can I make datastore read operations more efficient?
Thanks in advance.
You can:
switch to NDB, where caching is automatic
query for keys instead of entities: SELECT __key__ FROM ...
reduce the related indexes (this surely decreases the write ops, and perhaps the read ops too)
rewrite all your entities with the username as key_name and use the method get_or_insert(), as in the line below (a sketch follows):
user = Names.get_or_insert("debasish", hash="badasdbashdbhasbd")
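A hedged sketch of the NDB-plus-username-as-key route (the model mirrors the question; the helper name is illustrative):

from google.appengine.ext import ndb

class Names(ndb.Model):
    # the username itself becomes the entity's key id, so lookups are cheap
    # key gets, which NDB caches automatically (in-context and in memcache)
    hash = ndb.StringProperty(required=True)

def get_or_create(username, new_hash):
    entity = ndb.Key(Names, username).get()  # served from cache when possible
    if entity is None:
        entity = Names(id=username, hash=new_hash)
        entity.put()
    return entity

NDB also offers Names.get_or_insert(username, hash=new_hash), which does the same thing transactionally.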
You should cache only the username-to-hash pairs instead of everything. In addition, add an in-memory cache; this is a per-instance-only cache (just create a dict at the global module level) and should help even more. It could grow really quickly depending on your unique hits, but you can add logic to hold only a certain number of entries. Here is a sample:
cache = {}  # per-instance, module-level in-memory cache

def get_user_hash(username):
    if username in cache:              # 1) check the instance cache first
        return cache[username]
    hash = memcache.get(username)      # 2) then memcache
    if not hash:
        user = db.GqlQuery("SELECT * FROM Names WHERE name=:1", username).get()  # 3) then the datastore
        if user:
            hash = user.hash
        else:
            hash = make_new_hash(username)  # hypothetical helper that computes the new hash
            Names(name=username, hash=hash).put()
    cache[username] = hash
    memcache.set(username, hash)
    return hash
@Faisal's method should work well; it adds two levels of caching to the query.
Another option is to store the username and hash in the session: only check the database once per session, and then retrieve the values from the session variables.
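A minimal sketch of the session approach, assuming a dict-like session object (e.g. webapp2_extras.sessions on GAE); the helper name is made up for illustration:

def get_hash_via_session(session, username):
    # 'session' is any dict-like per-user session store
    key = 'hash_' + username
    if key not in session:
        session[key] = get_user_hash(username)  # falls back to the caches/DB above
    return session[key]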
MongoDB uses a string (hash-like ObjectId) _id field instead of an integer, so how do I get a classic integer primary key? By incrementing some variable each time I create an instance of my class?
class Post(Document):
    authors_id = ListField(IntField(required=True), required=True)
    content = StringField(max_length=100000, required=True)
    id = IntField(required=True, primary_key=True)

    def __init__(self):
        # what next?
Trying to create a new user raises an exception:
mongoengine.queryset.OperationError: Tried to save duplicate unique keys
(E11000 duplicate key error index: test.user.$_types_1_username_1
dup key: { : "User", : "admin" })
Code:
user = User.create_user(username='admin', email='example#mail.com',
password='pass')
user.is_superuser = True
user.save()
Why?
There is the SequenceField, which you could use to provide this. But, as stated, incrementing IDs don't scale well, and are they really needed? Can't you use the ObjectId or a slug instead? A sketch with SequenceField follows:
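A minimal sketch using SequenceField with the Post model from the question (assuming a reasonably recent MongoEngine; SequenceField keeps its counter in a separate collection):

from mongoengine import Document, ListField, IntField, StringField, SequenceField

class Post(Document):
    id = SequenceField(primary_key=True)  # auto-incrementing integer _id
    authors_id = ListField(IntField(required=True), required=True)
    content = StringField(max_length=100000, required=True)

post = Post(authors_id=[1], content='hello').save()
print(post.id)  # 1, then 2, 3, ... for subsequent posts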
If you want to use an incrementing integer ID, the method for doing it is described here:
http://www.mongodb.org/display/DOCS/How+to+Make+an+Auto+Incrementing+Field
This won't scale for a very large DB/app, but it works well for small or moderate applications.
1) If you really want to do it, you have to override the MongoEngine save method for your documents, making it look up the document with the highest value for your id and save the new document with that id + 1. This creates overhead (one additional read for every write), so I discourage you from following this path. You could also run into duplicate IDs: if you save two records at exactly the same time, you'll read the last id twice (say, 1) and save the id 1 + 1 = 2 twice, which is really bad. To avoid this you'd need to lock the entire collection on every insert, losing performance.
2) Simply put, you can't save more than one user with the same username (as the error message is telling you), and you already have a user called "admin".
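A hedged sketch of handling the duplicate; NotUniqueError exists in recent MongoEngine versions (older ones raise OperationError, as in the traceback above), and create_user comes from the question's code:

from mongoengine.errors import NotUniqueError, OperationError

try:
    user = User.create_user(username='admin', email='example#mail.com',
                            password='pass')
except (NotUniqueError, OperationError):
    # the username already exists, so reuse the existing document instead
    user = User.objects.get(username='admin')

user.is_superuser = True
user.save()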