I have the following two models:
class Foo(models.Model):
    param1 = ...
    param2 = ...
    ...
    paramN = ...

class Bar(models.Model):
    foo = models.ForeignKey(Foo)
    ...
    ...
GOAL: Compute a QuerySet of all instances of Foo such that more than 1 Bar instance is connected to it
I have been looking for a solution and this seems to work for everybody else
Foo.objects.annotate(num_bar=Count('bar')).filter(num_bar__gt=1)
This gave me a FieldError saying that 'bar' was not a possible field for Foo. I then tried 'bar_set' and got the same error.
Is there a chance I am implementing this wrong, or has this approach been deprecated because those examples are old? Any help would be appreciated!
traceback
Traceback (most recent call last):
File "<console>", line 1, in <module>
File "/home/ryan/.virtualenvs/project/local/lib/python2.7/site-packages/django/db/models/manager.py", line 127, in manager_method
return getattr(self.get_queryset(), name)(*args, **kwargs)
File "/home/ryan/.virtualenvs/project/local/lib/python2.7/site-packages/django/db/models/query.py", line 794, in annotate
obj.query.add_annotation(annotation, alias, is_summary=False)
File "/home/ryan/.virtualenvs/project/local/lib/python2.7/site-packages/django/db/models/sql/query.py", line 982, in add_annotation
summarize=is_summary)
File "/home/ryan/.virtualenvs/project/local/lib/python2.7/site-packages/django/db/models/aggregates.py", line 20, in resolve_expression
c = super(Aggregate, self).resolve_expression(query, allow_joins, reuse, summarize)
File "/home/ryan/.virtualenvs/project/local/lib/python2.7/site-packages/django/db/models/expressions.py", line 491, in resolve_expression
c.source_expressions[pos] = arg.resolve_expression(query, allow_joins, reuse, summarize, for_save)
File "/home/ryan/.virtualenvs/project/local/lib/python2.7/site-packages/django/db/models/expressions.py", line 448, in resolve_expression
return query.resolve_ref(self.name, allow_joins, reuse, summarize)
File "/home/ryan/.virtualenvs/project/local/lib/python2.7/site-packages/django/db/models/sql/query.py", line 1532, in resolve_ref
self.get_initial_alias(), reuse)
File "/home/ryan/.virtualenvs/project/local/lib/python2.7/site-packages/django/db/models/sql/query.py", line 1471, in setup_joins
names, opts, allow_many, fail_on_missing=True)
File "/home/ryan/.virtualenvs/project/local/lib/python2.7/site-packages/django/db/models/sql/query.py", line 1396, in names_to_path
"Choices are: %s" % (name, ", ".join(available)))
FieldError: Cannot resolve keyword 'bar' into field. Choices are: param1, param2, param3, ..., paramN
version
my django version is 1.8.3
There can be multiple reasons for this error; I'll list the probable causes:
Recently upgrading your Django version may cause the problem: clear your migrations and rerun them.
Moving from a local server to a production server can sometimes cause this problem.
Your app name can cause the problem if it starts with "__".
Try other, simpler queries; if they work, change the query so that you don't use the num_bar annotation.
Also check whether you can get the count of them (the dirty way :)):
for foo in Foo.objects.all():
    if foo.bar_set.count() < 2:
        # do sth like: foo.bar_set.get() or temp = temp + 1
Going by your short description of the model (not your actual code), I cannot find other causes. Your query should work.
So after trying a lot, this was a solution that worked:
Bar.objects.values("foo_id").annotate(Count("foo_id")).filter(foo_id__count__gt=1)
Not exactly sure why this worked and the other didn't, but it essentially gets the count of Bar objects sharing the same foo_id and keeps only the groups with more than one.
If somebody would like to explain a potential reason why this works and the other did not, that would be appreciated.
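For what it's worth, the values()/annotate() pair is effectively a SQL GROUP BY on foo_id. The same grouping logic, sketched in plain Python with hypothetical data (no Django required):

```python
from collections import Counter

# Hypothetical data standing in for Bar rows: each entry is one Bar's foo_id.
bar_foo_ids = [1, 1, 2, 3, 3, 3]

# values("foo_id").annotate(Count("foo_id")) groups Bar rows by foo_id and
# counts each group; the filter then keeps groups with more than one row.
counts = Counter(bar_foo_ids)
foo_ids_with_multiple_bars = sorted(
    foo_id for foo_id, n in counts.items() if n > 1)
print(foo_ids_with_multiple_bars)  # [1, 3]
```

The annotate-on-Foo version does the same grouping from the other side of the relation, which is why it depends on the reverse accessor name resolving correctly.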
I had the same issue due to an import mistake. This is the solution for my use case:
# from django.db.models.sql.aggregates import Count  # wrong import
from django.db.models import Count  # correct one
I am struggling trying to understand how to write tests in Flask.
I've inherited an app that already has a bunch of tests that hit routes like /login and test that the response is what's expected.
I have a substantially more complicated situation. I need to test a route/method that depending on the circumstances, hits an external api, figures out whether a path exists in the container the app itself is running in, starts a process that takes 10+ minutes to run on a different machine -- all manner of things. So I can't just hit the route and see if I got what I wanted; I need mocking and patching to mimic the effects of various external world states.
Right now I have a route defined like so in brain_db/views.py:
@app.route('/label_view/<int:scan_number>')
@login_required
def label_view(scan_number):
    <so much complicated logic>
The first route defined in that same file, brain_db/views.py, is
@app.route('/surface_test')
def surface_test():
    <some code>
Here is a simplified version of the file that's throwing the error:
import unittest
from mock import MagicMock, patch
from flask_brain_db.test_helpers import set_up, tear_down
from flask_brain_db.brain_db.models import Scan
from brain_db.views import label_view

class BrainDBTest(unittest.TestCase):
    def setUp(self):
        app, db = set_up()
        scan = Scan(1, '000001', '000001_MR1', 'scan.nii.gz', scan_number=1)
        db.session.add(scan)
        scan = Scan.query.filter(Scan.scan_number == 1).first()
        db.session.commit()

    def tearDown(self):
        tear_down()

    def mock_volume_views_setup(self):
        scan = Scan.query.filter(Scan.scan_number == 1).first()
        container_file_path = '/path/to/file/in/container'
        return scan, container_file_path

    def mock_os_path_exists(self, arg):
        return True

    @patch('brain_db_helpers.volume_views_setup', mock_volume_views_setup)
    @patch('os.path.exists', mock_os_path_exists)
    def test_label_view(self):
        rv = label_view(1)
        assert(True)  # I'll actually write tests when I figure out that I can!
        print rv
Here is the error:
======================================================================
ERROR: brain_db.tests.test (unittest.loader.ModuleImportFailure)
----------------------------------------------------------------------
ImportError: Failed to import test module: brain_db.tests.test
Traceback (most recent call last):
File "/usr/local/lib/python2.7/unittest/loader.py", line 254, in _find_tests
module = self._get_module_from_name(name)
File "/usr/local/lib/python2.7/unittest/loader.py", line 232, in _get_module_from_name
__import__(name)
File "/usr/src/app/flask_brain_db/brain_db/tests/test.py", line 7, in <module>
from brain_db.views import label_view
File "/usr/src/app/flask_brain_db/brain_db/views.py", line 36, in <module>
@app.route('/surface_test')
File "/usr/local/lib/python2.7/site-packages/flask/app.py", line 1250, in decorator
self.add_url_rule(rule, endpoint, f, **options)
File "/usr/local/lib/python2.7/site-packages/flask/app.py", line 66, in wrapper_func
return f(self, *args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/flask/app.py", line 1221, in add_url_rule
'existing endpoint function: %s' % endpoint)
AssertionError: View function mapping is overwriting an existing endpoint function: surface_test
What I have done to try to solve my problem: I've read a bunch of the posts on SO that quote that same AssertionError. E.g. 1, 2. I can see that the general shape of the problem is that my routes have already been defined, and
from brain_db.views import label_view
is executing the views module again, thus redefining the routes, so I get an error thrown.
What I don't understand is how exactly I should avoid this. I need to be able to import a method into another file to be able to test it. Are all the routes supposed to be wrapped in if __name__ == '__main__'? I am brand new to Flask development and haven't yet seen example code where this is the case; I'm dubious that this is the correct solution; it's just the only thing that's offered when you search for preventing code from being executed on import.
The way I'm running my tests right now is via the file manage.py in the top level of my application. It contains the following method:
@manager.command
def test():
    """Runs the tests without coverage"""
    tests = unittest.TestLoader().discover(start_dir='.', pattern='test*.py')
    res = unittest.TextTestRunner(verbosity=2).run(tests)
    sys.exit(not res.wasSuccessful())
I run python manage.py test at the command line.
It also might be relevant that while I've put the test that's failing in a submodule within brain_db, several tests run before it that hit routes defined in the app and test for the expected result. However, commenting out those tests has no effect on the way my test is failing.
Finally, I was initially getting an error at the line from flask_brain_db.brain_db.models import Scan:
ERROR: brain_db.tests.test (unittest.loader.ModuleImportFailure)
----------------------------------------------------------------------
ImportError: Failed to import test module: brain_db.tests.test
Traceback (most recent call last):
File "/usr/local/lib/python2.7/unittest/loader.py", line 254, in _find_tests
module = self._get_module_from_name(name)
File "/usr/local/lib/python2.7/unittest/loader.py", line 232, in _get_module_from_name
__import__(name)
File "/usr/src/app/flask_brain_db/brain_db/tests/test.py", line 5, in <module>
from flask_brain_db.brain_db.models import Scan
File "/usr/src/app/flask_brain_db/brain_db/models.py", line 6, in <module>
class Scan(db.Model):
File "/usr/local/lib/python2.7/site-packages/flask_sqlalchemy/model.py", line 67, in __init__
super(NameMetaMixin, cls).__init__(name, bases, d)
File "/usr/local/lib/python2.7/site-packages/flask_sqlalchemy/model.py", line 121, in __init__
super(BindMetaMixin, cls).__init__(name, bases, d)
File "/usr/local/lib/python2.7/site-packages/sqlalchemy/ext/declarative/api.py", line 65, in __init__
_as_declarative(cls, classname, cls.__dict__)
File "/usr/local/lib/python2.7/site-packages/sqlalchemy/ext/declarative/base.py", line 116, in _as_declarative
_MapperConfig.setup_mapping(cls, classname, dict_)
File "/usr/local/lib/python2.7/site-packages/sqlalchemy/ext/declarative/base.py", line 144, in setup_mapping
cfg_cls(cls_, classname, dict_)
File "/usr/local/lib/python2.7/site-packages/sqlalchemy/ext/declarative/base.py", line 172, in __init__
self._setup_table()
File "/usr/local/lib/python2.7/site-packages/sqlalchemy/ext/declarative/base.py", line 465, in _setup_table
**table_kw)
File "/usr/local/lib/python2.7/site-packages/flask_sqlalchemy/model.py", line 90, in __table_cls__
return sa.Table(*args, **kwargs)
File "/usr/local/lib/python2.7/site-packages/sqlalchemy/sql/schema.py", line 439, in __new__
"existing Table object." % key)
InvalidRequestError: Table 'scan' is already defined for this MetaData instance. Specify 'extend_existing=True' to redefine options and columns on an existing Table object.
I made it go away by including
__table_args__ = {'extend_existing': True}
in the model definition, but I don't know if I should have done that, and I suspect I was just postponing the same problem I have now. It seems like the fundamental problem is that I don't know how to write tests without redefining a bunch of things that have already been defined.
What is the correct way to approach this? Please let me know if I need to provide any other information.
You should be able to access all of your view functions from your app object. Try removing the line from brain_db.views import label_view, and instead look up your label_view method after running set_up(), using the below:
label_view = app.view_functions["label_view"]
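To see why the import blows up in the first place, here is a simplified sketch of Flask's endpoint bookkeeping (hypothetical stand-in code, not Flask's actual implementation): each route decorator registers one function per endpoint name, so executing the views module a second time re-registers surface_test under a new function object and trips the assertion:

```python
# Simplified stand-in for Flask's app.view_functions registry.
view_functions = {}
err = None

def route(rule):
    def decorator(f):
        endpoint = f.__name__
        # Flask's add_url_rule makes an equivalent check.
        if endpoint in view_functions and view_functions[endpoint] is not f:
            raise AssertionError(
                'View function mapping is overwriting an existing '
                'endpoint function: %s' % endpoint)
        view_functions[endpoint] = f
        return f
    return decorator

@route('/surface_test')
def surface_test():
    return 'ok'

# Re-importing the views module re-executes the decorator with a *new*
# function object under the same endpoint name -- the collision:
try:
    @route('/surface_test')
    def surface_test():
        return 'ok'
except AssertionError as e:
    err = e

print(err)
```

Fetching the already-registered function from app.view_functions, as suggested above, sidesteps the second execution entirely.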
Is it possible to extract a sub-key from a JSONField field and annotate the Queryset with its value? I'm trying to extract the value within the query rather than post-processing in the Python code.
Model architecture is:
Django 1.10
The model has a django.contrib.postgres.fields.JSONField called data to store an API response. This example is Twitter.
Other fields are profile_id and screen_name. The rest of the data lives within the data field so it can be queried ad-hoc.
I thought I'd be able to combine annotate and django.db.models.F, but I'm getting the following error:
> models.TwitterUser.objects.annotate(foll_count=F("data__followers_count"))
Traceback (most recent call last):
File "<console>", line 1, in <module>
File "/Virtualenv/env_name/lib/python3.5/site-packages/django/db/models/manager.py", line 85, in manager_method
return getattr(self.get_queryset(), name)(*args, **kwargs)
File "/Virtualenv/env_name/lib/python3.5/site-packages/django/db/models/query.py", line 914, in annotate
clone.query.add_annotation(annotation, alias, is_summary=False)
File "/Virtualenv/env_name/lib/python3.5/site-packages/django/db/models/sql/query.py", line 971, in add_annotation
summarize=is_summary)
File "/Virtualenv/env_name/lib/python3.5/site-packages/django/db/models/expressions.py", line 463, in resolve_expression
return query.resolve_ref(self.name, allow_joins, reuse, summarize)
File "/Virtualenv/env_name/lib/python3.5/site-packages/django/db/models/sql/query.py", line 1462, in resolve_ref
self.get_initial_alias(), reuse)
File "/Virtualenv/env_name/lib/python3.5/site-packages/django/db/models/sql/query.py", line 1402, in setup_joins
names, opts, allow_many, fail_on_missing=True)
File "/Virtualenv/env_name/lib/python3.5/site-packages/django/db/models/sql/query.py", line 1370, in names_to_path
" not permitted." % (names[pos + 1], name))
django.core.exceptions.FieldError: Cannot resolve keyword 'followers_count' into field. Join on 'data' not permitted.
This isn't explicitly documented anywhere, so I'm attempting to reverse-engineer it using the double underscores used elsewhere in Django. I've separately tried accessing the key as if it were a native Python dict (F("data")["followers_count"]) but that didn't work either.
Any direct answers or pointers towards other areas would be appreciated.
I couldn't use F() at the time of writing, so I had to fall back to a RawSQL call to access the field.
Based on the prior work of How to aggregate (min/max etc.) over Django JSONField data? and José San Gil on his blog.
qs = models.TwitterUser.objects.annotate(followers=RawSQL(
    # The Postgres SQL query to access a JSON object field.
    # `data` is the name of the JSONField in my schema.
    # More: https://www.postgresql.org/docs/9.3/static/functions-json.html
    "((data->%s))",
    # The parameter to insert into the query and replace '%s'.
    # This could be a variable, and the query adapted into a
    # reusable universal function.
    ("followers_count",)  # In my example, this is the JSON sub-key I'm after
))
Note: I've bracketed and indented to hopefully aid comprehension.
You can also add a .filter(data__has_key="insert_key_here") before the annotate if you only want to return items that contain the field in question. This is a nice native method for JSON within the ORM and hopefully in time we'll have similar ways of accessing JSON sub-fields directly through the ORM.
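For anyone mapping the SQL back to Python: the Postgres data->'followers_count' operator just pulls one key out of the stored JSON document. The equivalent client-side lookup (the post-processing the question was trying to avoid) looks like this, with hypothetical row data:

```python
import json

# Hypothetical stored row: `data` holds the raw API response as JSON text.
row = {"profile_id": 1, "screen_name": "example",
       "data": json.dumps({"followers_count": 42, "friends_count": 7})}

# What Postgres does server-side with data->'followers_count',
# done client-side instead:
followers = json.loads(row["data"])["followers_count"]
print(followers)  # 42
```

The RawSQL annotation above pushes exactly this lookup into the database, so the extracted value arrives already attached to each queryset row.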
I am integrating django with a legacy system and database and have a model that looks like this
class Label(models.Model):
    iLabelID = models.IntegerField(db_column='LABELID', primary_key=True)
    organization = models.OneToOneField(Organization, related_name='labels', db_column='ORGANIZATION')
    sLabelText = models.CharField(max_length=42)
Using this notation (more or less hungarian notation) is a requirement of the project.
The following will work in django:
>>> Label.objects.get(organization_id=1)
But I want to be able to write this:
>>> Label.objects.get(iOrganizationID=1)
I tried subclassing models.OneToOneField with
class MyOneToOneField(models.OneToOneField):
    def get_attname(self):
        # default is:
        # return '%s_id' % self.name
        return 'i%s%sID' % (self.name[0].upper(), self.name[1:])
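Pulled out as a standalone sketch, the name transformation that get_attname override performs is just this (illustrative only, no Django required):

```python
def hungarian_attname(name):
    # Mirrors the overridden get_attname above:
    # 'organization' -> 'iOrganizationID'
    return 'i%s%sID' % (name[0].upper(), name[1:])

print(hungarian_attname('organization'))  # iOrganizationID
```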
But this is the error I get when trying to use it:
>>> Label.objects.get(iOrganizationID=1)
Traceback (most recent call last):
File "<console>", line 1, in <module>
File "django/db/models/manager.py", line 151, in get
return self.get_queryset().get(*args, **kwargs)
File "django/db/models/query.py", line 301, in get
num = len(clone)
File "django/db/models/query.py", line 77, in __len__
self._fetch_all()
File "django/db/models/query.py", line 854, in _fetch_all
self._result_cache = list(self.iterator())
File "django/db/models/query.py", line 230, in iterator
obj = model(*row_data)
File "django/db/models/base.py", line 347, in __init__
setattr(self, field.attname, val)
AttributeError: can't set attribute
EDIT: here's another pain point:
I wish to generate some JSON. This JSON will be fed to another part of the system on which I have no control (not possible to change names). I wish I could do the following:
json.dumps(list(Label.objects.values('iLabelID', 'iOrganizationID', 'sAnotherValue')))
But this is not possible. I have to do this
list(Label.objects.values('iLabelID', 'organization_id', 'sAnotherValue'))
And then manually map organization_id to iOrganizationID, although this is not a problem for the label's ID. It makes code more difficult to maintain, to read, and slower to execute.
Note that this is not specific to hungarian notation, you may need to suffix with _identifier or _pk or whatever instead of _id.
EDIT2: I must have made another error because as lanzz pointed out get_attname does work -_-
I solved it using some modifications of your code:
class CustomAttrNameForeignKey(djangoFields.related.ForeignKey):
    def __init__(self, *args, **kwargs):
        attname = kwargs.pop('attrname', None)
        super(CustomAttrNameForeignKey, self).__init__(*args, **kwargs)
        self.attname = attname or super(CustomAttrNameForeignKey, self).get_attname()

    def get_attname(self):
        return self.attname

class AModelUsingThis(djangoModels.Model):
    snapshot = CustomAttrNameForeignKey(
        ParentalModel, db_column="timestamp", to_field="timestamp",
        attrname='timestamp', related_name="children")
In this case the model gets an attribute for the FK with no suffix at all, just the name given. I haven't tested it against the DB yet, but instantiating a model with this custom FK works fine.
I'm having a lot of trouble implementing SQLAlchemy for a legacy MSSQL database. It is a big existing database, so I wanted to use sqlautocode to generate the files for me, because using autoload to reflect the database takes too long.
The first problem was that sqlautocode no longer seems to work on SQLAlchemy 0.8. I did still have existing output from an earlier version, so I thought I'd use that, just to test with.
Now sqlautocode outputs the 'classical mapping', which is not really a problem, but whenever I tried to use a foreign key, 'RelationshipProperty' object has no attribute 'c' would show up: an error somewhere deep inside the SQLAlchemy library.
So next, I tried skipping sqlautocode and writing the classes and relations myself, going by this code for SQLAlchemy 0.8. I used two example tables and got the exact same error. Then I commented out most of the columns and all of the relations, and I STILL get the error.
Below is my code:
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy import Column, ForeignKey
from sqlalchemy.orm import relationship, backref
from sqlalchemy.dialects.mssql import *

Base = declarative_base()

class PeopleMemberhip(Base):
    __tablename__ = 'people_memberships'
    ppl_mshp_id = Column(VARCHAR(length=36), primary_key=True, nullable=False)
    ppl_mshp_startdate = Column(DATETIME())
    ppl_mshp_enddate = Column(DATETIME())
    # ppl_mshp_pmsd_id = Column(VARCHAR(length=36), ForeignKey('paymentschedules.pmsd_id'))
    # paymentschedule = relationship("PaymentSchedule", backref=backref('people_memberships'))

    def __repr__(self):
        return "<people_membership('%s','%s')>" % (self.ppl_mshp_id, self.ppl_mshp_startdate)

class PaymentSchedule(Base):
    __tablename__ = 'paymentschedules'
    pmsd_id = Column(VARCHAR(length=36), primary_key=True, nullable=False)
    pmsd_name = Column(NVARCHAR(length=60))
    pmsd_startdate = Column(DATETIME())
    pmsd_enddate = Column(DATETIME())
    # paymentschedule = relationship("PaymentSchedule", backref=backref('people_memberships'))

    def __repr__(self):
        return "<paymentschedule('%s','%s')>" % (self.pmsd_id, self.pmsd_name)
And the resulting Error:
Traceback (most recent call last):
File "C:\Program Files (x86)\JetBrains\PyCharm 2.7\helpers\pydev\pydevd.py", line 1472, in <module>
debugger.run(setup['file'], None, None)
File "C:\Program Files (x86)\JetBrains\PyCharm 2.7\helpers\pydev\pydevd.py", line 1116, in run
pydev_imports.execfile(file, globals, locals) #execute the script
File "C:/Users/erik/workspace/flasktest/test.py", line 16, in <module>
contract = db.session.query(PeopleMemberhip).filter_by(ppl_mshp_id='98ABD7E9-4CFF-4F7B-8537-8E46FD5C79D5').one()
File "C:\Users\erik\workspace\flasktest\lib\site-packages\sqlalchemy\orm\scoping.py", line 149, in do
return getattr(self.registry(), name)(*args, **kwargs)
File "C:\Users\erik\workspace\flasktest\lib\site-packages\sqlalchemy\orm\session.py", line 1105, in query
return self._query_cls(entities, self, **kwargs)
File "C:\Users\erik\workspace\flasktest\lib\site-packages\sqlalchemy\orm\query.py", line 115, in __init__
self._set_entities(entities)
File "C:\Users\erik\workspace\flasktest\lib\site-packages\sqlalchemy\orm\query.py", line 124, in _set_entities
self._set_entity_selectables(self._entities)
File "C:\Users\erik\workspace\flasktest\lib\site-packages\sqlalchemy\orm\query.py", line 157, in _set_entity_selectables
ent.setup_entity(*d[entity])
File "C:\Users\erik\workspace\flasktest\lib\site-packages\sqlalchemy\orm\query.py", line 2744, in setup_entity
self._with_polymorphic = ext_info.with_polymorphic_mappers
File "C:\Users\erik\workspace\flasktest\lib\site-packages\sqlalchemy\util\langhelpers.py", line 582, in __get__
obj.__dict__[self.__name__] = result = self.fget(obj)
File "C:\Users\erik\workspace\flasktest\lib\site-packages\sqlalchemy\orm\mapper.py", line 1425, in _with_polymorphic_mappers
configure_mappers()
File "C:\Users\erik\workspace\flasktest\lib\site-packages\sqlalchemy\orm\mapper.py", line 2106, in configure_mappers
mapper._post_configure_properties()
File "C:\Users\erik\workspace\flasktest\lib\site-packages\sqlalchemy\orm\mapper.py", line 1242, in _post_configure_properties
prop.init()
File "C:\Users\erik\workspace\flasktest\lib\site-packages\sqlalchemy\orm\interfaces.py", line 231, in init
self.do_init()
File "C:\Users\erik\workspace\flasktest\lib\site-packages\sqlalchemy\orm\properties.py", line 1028, in do_init
self._setup_join_conditions()
File "C:\Users\erik\workspace\flasktest\lib\site-packages\sqlalchemy\orm\properties.py", line 1102, in _setup_join_conditions
can_be_synced_fn=self._columns_are_mapped
File "C:\Users\erik\workspace\flasktest\lib\site-packages\sqlalchemy\orm\relationships.py", line 115, in __init__
self._annotate_fks()
File "C:\Users\erik\workspace\flasktest\lib\site-packages\sqlalchemy\orm\relationships.py", line 311, in _annotate_fks
self._annotate_present_fks()
File "C:\Users\erik\workspace\flasktest\lib\site-packages\sqlalchemy\orm\relationships.py", line 331, in _annotate_present_fks
secondarycols = util.column_set(self.secondary.c)
AttributeError: 'RelationshipProperty' object has no attribute 'c'
I'm really at a loss; any help with this error is appreciated, but if someone can suggest a different approach that makes SQLAlchemy work with our legacy MSSQL database, that is also a solution.
As I said, sqlautocode doesn't seem to work any more, but maybe I'm using it the wrong way, or maybe there is an alternative tool I don't know about.
Erik
Okay guys, I figured it out myself.
I had a couple of files I was messing with that had table definitions in them (output from sqlautocode). One was called 'database.py', another 'model.py', and the last one 'ORM.py'.
I had a test.py file that imported 'model.py'. Model.py was the file I had written my table definitions in. However, test.py was also importing the database from within Flask (from app import app, db), and in the __init__() of the Flask app, Flask was still loading 'ORM.py'.
So some of the objects were coming from ORM.py, an old file generated by sqlautocode, instead of from model.py, which I was experimenting with.
Renaming ORM.py gave me a clue. I have since written a very simple script in Python that traverses the MSSQL tables and columns and generates a model.py for me. I exclusively load that model.py file now and the whole thing works!
Sorry if anyone was spending time on this. Hope it helps somebody Googling for the same problem, though.
Erik
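For anyone wanting to write a similar generator, here is a minimal sketch of the idea. The column data is hardcoded for illustration; in practice you would pull it from SQLAlchemy's inspector (e.g. sqlalchemy.inspect(engine).get_columns(table_name)), and the type rendering would need to cover your real MSSQL types:

```python
def render_model(table_name, columns):
    """Render a declarative model class as source text.

    `columns` is a list of (name, type_source, is_primary_key) tuples,
    standing in for what a database inspector would return.
    """
    class_name = ''.join(part.capitalize() for part in table_name.split('_'))
    lines = ["class %s(Base):" % class_name,
             "    __tablename__ = '%s'" % table_name]
    for name, type_source, is_pk in columns:
        args = [type_source]
        if is_pk:
            args.append('primary_key=True')
        lines.append('    %s = Column(%s)' % (name, ', '.join(args)))
    return '\n'.join(lines)

print(render_model('paymentschedules',
                   [('pmsd_id', 'VARCHAR(length=36)', True),
                    ('pmsd_name', 'NVARCHAR(length=60)', False)]))
```

Writing the rendered classes to a model.py and importing only that file avoids the slow autoload reflection at startup, which was the original motivation.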
I've added a list property to an entity model with a large number of existing instances.
class MyModel(db.Model):
    new_property = db.ListProperty(item_type=str, default=None)
Upon deployment to the live environment, the app runs without issues for a short time and then starts throwing BadValueErrors as it tries to retrieve records from the datastore.
The code throwing the error is just a straight call to the datastore:
app_item = db.get(app_item_key)
I'm using 1.7.5 of the Python 2.7 runtime.
Any ideas on what I can do to prevent this, or at least trap it so that I can get data from the store?
Traceback (most recent call last):
File "/base/data/home/apps/app/4-15.365909351579418812/app.py", line 1739, in app_get
app_item = db.get(app_item_key)
File "/python27_runtime/python27_lib/versions/1/google/appengine/ext/db/__init__.py", line 1533, in get
return get_async(keys, **kwargs).get_result()
File "/python27_runtime/python27_lib/versions/1/google/appengine/api/apiproxy_stub_map.py", line 604, in get_result
return self.__get_result_hook(self)
File "/python27_runtime/python27_lib/versions/1/google/appengine/datastore/datastore_rpc.py", line 1459, in __get_hook
entities = rpc.user_data(entities)
File "/python27_runtime/python27_lib/versions/1/google/appengine/api/datastore.py", line 600, in local_extra_hook
return extra_hook(result)
File "/python27_runtime/python27_lib/versions/1/google/appengine/ext/db/__init__.py", line 1503, in extra_hook
model = cls1.from_entity(entity)
File "/python27_runtime/python27_lib/versions/1/google/appengine/ext/db/__init__.py", line 1438, in from_entity
return cls(None, _from_entity=entity, **entity_values)
File "/python27_runtime/python27_lib/versions/1/google/appengine/ext/db/__init__.py", line 970, in __init__
prop.__set__(self, value)
File "/python27_runtime/python27_lib/versions/1/google/appengine/ext/db/__init__.py", line 614, in __set__
value = self.validate(value)
File "/python27_runtime/python27_lib/versions/1/google/appengine/ext/db/__init__.py", line 3460, in validate
value = super(ListProperty, self).validate(value)
File "/python27_runtime/python27_lib/versions/1/google/appengine/ext/db/__init__.py", line 641, in validate
raise BadValueError('Property %s is required' % self.name)
BadValueError: Property new_property is required
For those that follow:
As per Aaron D's suggestion, changing the default value to an empty list resolved this issue, so:
new_property = db.ListProperty(item_type=str, default=None)
Should read:
new_property = db.ListProperty(item_type=str, default=[])
Looking at the source code of the Google App Engine in the __init__.py referenced in your traceback, you can see a comment in the ListProperty doc comments (line 3428) that says:
Note that the only permissible value for 'required' is True.
So, even though you are not providing it, it looks like line 3442 is setting it automatically:
self._require_parameter(kwds, 'required', True)
If you look further into the source code (line 3500), you can see the definition of empty() for a ListProperty:
def empty(self, value):
    """Is list property empty.

    [] is not an empty value.

    Returns:
        True if value is None, else false.
    """
    return value is None
I can think of two issues that might cause this error, but I haven't verified them through testing.
1) If for some reason, you already have data in that field (perhaps you are reusing the new_property name?) and it was empty, then it seems likely to generate the error you have. I am not sure how to fix this problem, except to suggest that you use a unique name for your new_property instead. The post I referenced in my comment explains how to "fix" the data.
2) Since you already have records, your code is trying to populate those using your default value of None, which matches the empty() test and then throws the exception. In that case, if you just provide a default value of [] instead, it should work.
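Put together, the failure path quoted above reduces to the following (a simplified restatement of the SDK's logic for illustration, not the actual code):

```python
class BadValueError(Exception):
    pass

def empty(value):
    # ListProperty's notion of empty: None is empty, [] is not.
    return value is None

def validate(value, name='new_property', required=True):
    # ListProperty forces required=True, so a stored None always fails.
    if required and empty(value):
        raise BadValueError('Property %s is required' % name)
    return value

validate([])      # passes: an empty list is a valid value
try:
    validate(None)
except BadValueError as e:
    print(e)      # Property new_property is required
```

This is why switching the default from None to [] makes the error go away: the stored value never hits the required check's definition of empty.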
I am pretty sure your example code here is not what you are using. I would bet you have required=True in the new property. You are then retrieving an old record which doesn't have a value for the required property. Just dropping required=True will make those errors go away. If you need to have that value required, then you need to add the default value to the field before enforcing the constraint.
* removed some complete garbage about None not being a valid default value for ListProperty *
So I tried to replicate the situation based on the information you have supplied, and I have the answer. I can generate the problem by first creating a model that has a property new_property of type StringProperty with a default of None, calling put() on the record with no value for new_property (so the default of None is written), then changing the model definition of new_property to ListProperty, and fetching the record. We get the same stack trace. See the shell log below.
s~lightning-catfish> class MyModel(db.Model):
... pass
...
s~lightning-catfish> x = MyModel()
s~lightning-catfish> x.put()
datastore_types.Key.from_path(u'MyModel', 1001L, _app=u's~lightning-catfish')
s~lightning-catfish> class MyModel(db.Model):
... new_property = db.ListProperty(item_type=str,default=None)
...
s~lightning-catfish> y = db.get(x.key())
s~lightning-catfish> y
<MyModel object at 0x9e09dcc>
s~lightning-catfish> y.new_property
[]
s~lightning-catfish> new_property = db.StringProperty(defaul
KeyboardInterrupt
s~lightning-catfish> class MyModel(db.Model):
... new_property = db.StringProperty(default=None)
...
s~lightning-catfish> z = MyModel()
s~lightning-catfish> z.put()
datastore_types.Key.from_path(u'MyModel', 2001L, _app=u's~lightning-catfish')
s~lightning-catfish> class MyModel(db.Model):
... new_property = db.ListProperty(item_type=str,default=None)
...
s~lightning-catfish> a1 = db.get(z.key())
Traceback (most recent call last):
File "<console>", line 1, in <module>
File "/home/timh/google_appengine/google/appengine/ext/db/__init__.py", line 1533, in get
return get_async(keys, **kwargs).get_result()
File "/home/timh/google_appengine/google/appengine/api/apiproxy_stub_map.py", line 604, in get_result
return self.__get_result_hook(self)
File "/home/timh/google_appengine/google/appengine/datastore/datastore_rpc.py", line 1459, in __get_hook
entities = rpc.user_data(entities)
File "/home/timh/google_appengine/google/appengine/api/datastore.py", line 600, in local_extra_hook
return extra_hook(result)
File "/home/timh/google_appengine/google/appengine/ext/db/__init__.py", line 1503, in extra_hook
model = cls1.from_entity(entity)
File "/home/timh/google_appengine/google/appengine/ext/db/__init__.py", line 1438, in from_entity
return cls(None, _from_entity=entity, **entity_values)
File "/home/timh/google_appengine/google/appengine/ext/db/__init__.py", line 970, in __init__
prop.__set__(self, value)
File "/home/timh/google_appengine/google/appengine/ext/db/__init__.py", line 614, in __set__
value = self.validate(value)
File "/home/timh/google_appengine/google/appengine/ext/db/__init__.py", line 3460, in validate
value = super(ListProperty, self).validate(value)
File "/home/timh/google_appengine/google/appengine/ext/db/__init__.py", line 641, in validate
raise BadValueError('Property %s is required' % self.name)
BadValueError: Property new_property is required
s~lightning-catfish>
To fix the data you will need to access it at a low level and change the data types stored in the record.
I have code for fetching and putting entities without using models if you want it.
* last thing you should try *
Try using the following code, or something like it, to fetch objects without using the model.
You get the underlying data back, with types etc., in dicts. That will show you what is in the datastore.
from google.appengine.api import datastore
from google.appengine.api import datastore_errors
def get_entities(keys):
    rpc = datastore.GetRpcFromKwargs({})
    keys, multiple = datastore.NormalizeAndTypeCheckKeys(keys)
    entities = None
    try:
        entities = datastore.Get(keys, rpc=rpc)
    except datastore_errors.EntityNotFoundError:
        assert not multiple
    return entities

x = get_entities([some_key])