Missing data for required field in Marshmallow 3.10.0 - python

I am new to Marshmallow (3.10.0) and need help figuring out what causes the following error:
AssertionError: ["Input Error - exten: ['Missing data for required field.']"]
The traceback of the error is the following:
Traceback (most recent call last):
File "/root/wazo_virtualenv_python37/lib/python3.7/site-packages/nose/case.py", line 198, in runTest
self.test(*self.arg)
File "/root/wazo-confd/integration_tests/suite/helpers/wrappers.py", line 81, in decorated
result = func(*new_args, **kwargs)
File "/root/wazo-confd/integration_tests/suite/helpers/wrappers.py", line 81, in decorated
result = func(*new_args, **kwargs)
File "/root/wazo-confd/integration_tests/suite/base/test_call_filter_surrogate_user.py", line 216, in test_get_surrogates_callfilter_exten_when_disabled
confd.extensions.features(feature['id']).put({'enabled': False}).assert_updated()
File "/root/wazo-confd/integration_tests/suite/helpers/client.py", line 272, in assert_updated
self.assert_status(204)
File "/root/wazo-confd/integration_tests/suite/helpers/client.py", line 242, in assert_status
assert_that(self.response.status_code, is_in(statuses), self.response.text)
So it seems that the test function test_get_surrogates_callfilter_exten_when_disabled is failing:
def test_get_surrogates_callfilter_exten_when_disabled(call_filter, user):
    response = confd.extensions.features.get(search="bsfilter")
    feature = response.items[0]
    # line 216 in the traceback:
    confd.extensions.features(feature['id']).put({'enabled': False}).assert_updated()
    with a.call_filter_surrogate_user(call_filter, user):
        response = confd.callfilters(call_filter['id']).get()
        assert_that(
            response.item,
            has_entries(
                surrogates=has_entries(
                    users=contains(has_entries(exten=None, uuid=user['uuid']))
                )
            ),
        )
    confd.extensions.features(feature['id']).put(
        {'enabled': feature['enabled']}
    ).assert_updated()
The feature extension schema is defined as follows:
class ExtensionFeatureSchema(BaseSchema):
    id = fields.Integer(dump_only=True)
    exten = fields.String(validate=Regexp(EXTEN_REGEX), required=True)
    context = fields.String(dump_only=True)
    feature = fields.String(attribute='typeval', dump_only=True)
    enabled = fields.Boolean()
    links = ListLink(Link('extensions_features'))
and the put function:
def put(self):
    form = self.schema().load(request.get_json())
    variables = [self.model(**option) for option in form]
    self.service.edit(self.section_name, variables)
    return '', 204
I have tried several solutions that I found online, but they did not fix the issue for me:
1. Passing partial=True to the load function:
form = self.schema().load(request.get_json(), partial=True)
2. Removing required=True from the field definition; this made the above error go away but broke many of my other tests.
I am currently out of ideas, so I would appreciate any suggestions on how to fix this.
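For reference, here is a standalone sketch of how I understand partial loading to behave in Marshmallow 3 (the schema name and regex below are simplified placeholders, not the actual wazo-confd code):

from marshmallow import Schema, fields, validate

class FeatureSketchSchema(Schema):
    # placeholder regex; the real schema uses EXTEN_REGEX
    exten = fields.String(validate=validate.Regexp(r'^[0-9*#]+$'), required=True)
    enabled = fields.Boolean()

payload = {'enabled': False}

# partial=True skips the required-field check for every field
print(FeatureSketchSchema().load(payload, partial=True))        # {'enabled': False}

# partial=('exten',) skips the check only for 'exten'
print(FeatureSketchSchema().load(payload, partial=('exten',)))  # {'enabled': False}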

Related

Django import-export library ForeignKey error (IntegrityError: FOREIGN KEY constraint failed in django)

First of all, please excuse my English.
I have a problem with the Django import-export library when I try to import data from CSV/XLS/XLSX files into the Django application's database.
Here is how it looks.
Here is my models.py:
class Department(models.Model):
    department_name = models.CharField(max_length = 50, default = '')

    def __str__(self):
        return f'{self.department_name}'

class ITHardware(models.Model):
    it_hardware_model = models.CharField(max_length = 100)
    it_hardware_serial_number = models.CharField(max_length = 100,
        blank = True, default = '')
    it_hardware_department = models.ForeignKey(Department,
        related_name = 'department', on_delete = models.SET_NULL, default = '',
        null = True, blank = True, db_constraint=False)
admin.py:
@admin.register(Department)
class DepartmentAdmin(admin.ModelAdmin):
    list_display = ('department_name', )
    actions = [duplicate_object]

@admin.register(ITHardwareManufacturer)
class ITHardwareManufacturerAdmin(admin.ModelAdmin):
    list_display = ('manufacturer', )
    actions = [duplicate_object]

class ITHardwareImportExportAdmin(ImportExportModelAdmin):
    resource_class = ITHardwareResource
    list_display = ['id', 'it_hardware_manufacturer',
                    'it_hardware_model', 'it_hardware_serial_number',
                    'it_hardware_department']
    actions = [duplicate_object]
resource.py:
class ITHardwareResource(resources.ModelResource):
    it_hardware_department = fields.Field(
        column_name = 'it_hardware_department',
        attribute = 'ITHardware.it_hardware_department',
        widget = widgets.ForeignKeyWidget(Department, field = 'department_name'))

    class Meta():
        model = ITHardware
        fields = (
            'id',
            'it_hardware_model',
            'it_hardware_serial_number',
            'it_hardware_department',
        )
        export_order = (
            'id',
            'it_hardware_model',
            'it_hardware_serial_number',
            'it_hardware_department',
        )
Import file: (screenshot not included)
If I try to import data from the file, I get this error:
String number: 1 - ITHardwareManufacturer matching query does not exist.
None, Canon, BX500CI, 5B1837T00976, Office_1, IT_1
And so on.
OK. When I fill the Department table manually via the admin panel (adding the IT_1, IT_2, IT_3 and IT_4 departments there), I get a preview (screenshot not included), and then I get this error:
IntegrityError: FOREIGN KEY constraint failed (full traceback below)
Please explain what I am doing wrong and how to fix it.
Update
If I understood the concept correctly, "column_name" is the name of the file column from/to which data is imported/exported, and "attribute" is the name of the model field for the same operations.
The export works fine for me.
However, for import, as I understand it, I am describing the ForeignKey mechanism incorrectly. That is, I'm trying to tell Django that when importing data from the "it_hardware_department" column of the file, this data should be written to the "it_hardware_department" field of the Department model using ForeignKeyWidget.
Perhaps some intermediate action is required to define the mechanism for writing data to the Department model, where I would specify something like "Department__it_hardware_department"?
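For context, my mental model of what the widget does for each imported cell is roughly this (a simplified sketch, not the library's actual code):

# Rough sketch of ForeignKeyWidget(Department, field='department_name'):
# look the cell value up on the configured field and return the model instance.
def clean(value):
    if value:
        return Department.objects.get(department_name=value)
    return None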
Update: traceback
Traceback (most recent call last):
File "C:\users\iv\mip\mip_env\lib\site-packages\django\db\backends\base\base.py", line 242, in _commit
return self.connection.commit()
The above exception (FOREIGN KEY constraint failed) was the direct cause of the following exception:
File "C:\users\iv\mip\mip_env\lib\site-packages\django\core\handlers\exception.py", line 47, in inner
response = get_response(request)
File "C:\users\iv\mip\mip_env\lib\site-packages\django\core\handlers\base.py", line 181, in _get_response
response = wrapped_callback(request, *callback_args, **callback_kwargs)
File "C:\users\iv\mip\mip_env\lib\site-packages\django\utils\decorators.py", line 130, in _wrapped_view
response = view_func(request, *args, **kwargs)
File "C:\users\iv\mip\mip_env\lib\site-packages\django\views\decorators\cache.py", line 44, in _wrapped_view_func
response = view_func(request, *args, **kwargs)
File "C:\users\iv\mip\mip_env\lib\site-packages\django\contrib\admin\sites.py", line 232, in inner
return view(request, *args, **kwargs)
File "C:\users\iv\mip\mip_env\lib\site-packages\django\utils\decorators.py", line 43, in _wrapper
return bound_method(*args, **kwargs)
File "C:\users\iv\mip\mip_env\lib\site-packages\django\views\decorators\http.py", line 40, in inner
return func(request, *args, **kwargs)
File "C:\users\iv\mip\mip_env\lib\site-packages\import_export\admin.py", line 113, in process_import
result = self.process_dataset(dataset, confirm_form, request, *args, **kwargs)
File "C:\users\iv\mip\mip_env\lib\site-packages\import_export\admin.py", line 125, in process_dataset
return resource.import_data(dataset,
File "C:\users\iv\mip\mip_env\lib\site-packages\import_export\resources.py", line 771, in import_data
return self.import_data_inner(
File "C:\users\iv\mip\mip_env\lib\site-packages\import_export\utils.py", line 25, in __exit__
self.context_manager.__exit__(*args)
File "C:\users\iv\mip\mip_env\lib\site-packages\django\db\transaction.py", line 246, in __exit__
connection.commit()
File "C:\users\iv\mip\mip_env\lib\site-packages\django\utils\asyncio.py", line 33, in inner
return func(*args, **kwargs)
File "C:\users\iv\mip\mip_env\lib\site-packages\django\db\backends\base\base.py", line 266, in commit
self._commit()
File "C:\users\iv\mip\mip_env\lib\site-packages\django\db\backends\base\base.py", line 242, in _commit
return self.connection.commit()
File "C:\users\iv\mip\mip_env\lib\site-packages\django\db\utils.py", line 90, in __exit__
raise dj_exc_value.with_traceback(traceback) from exc_value
File "C:\users\iv\mip\mip_env\lib\site-packages\django\db\backends\base\base.py", line 242, in _commit
return self.connection.commit()
Exception Type: IntegrityError at /admin/mip_apps/ithardware/process_import/
Exception Value: FOREIGN KEY constraint failed
Based on your code sample, the import process will look for a column called it_hardware_department in the import file (and we can see that it is there). During import, it will attempt to find the FK relation in Department by doing a simple get(department_name=<value in column>). It will try to store this value via the attribute 'ITHardware.it_hardware_department'; however, this won't work because that isn't a valid model field name. You should rename the attribute to it_hardware_department (which you said you did). The fact that it shows up in the preview screen tells me that this part worked OK.
The 'FOREIGN KEY constraint failed' error says that you have tried to write a row with an invalid FK relationship. I suggest running the debugger or adding a print statement to inspect the object before it is saved. It is likely that one of your required foreign keys is empty here. I don't think it is it_hardware_department, because that has a value in the preview. Are there other FKs in your model which are not listed?
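For reference, a sketch of the resource with the attribute pointing at the model field directly (this mirrors the declarations from the question; the models import path is a placeholder, and this is illustrative rather than a verified fix):

from import_export import fields, resources, widgets
from myapp.models import Department, ITHardware  # placeholder path; adjust to your app

class ITHardwareResource(resources.ModelResource):
    # attribute must name the field on ITHardware itself, not 'ITHardware.<field>'
    it_hardware_department = fields.Field(
        column_name='it_hardware_department',
        attribute='it_hardware_department',
        widget=widgets.ForeignKeyWidget(Department, field='department_name'))

    class Meta:
        model = ITHardware
        fields = ('id', 'it_hardware_model', 'it_hardware_serial_number',
                  'it_hardware_department')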

DialogFlow CX V3 - Upsert Entities (Batch Insert)

I am looking for some advice on how I can upsert or replace an existing user entity.
I tried a couple of the APIs documented here and also here.
The entities are read from a database, and the plan is to keep them in sync with the database values via a scheduled job.
Update: Code Snippet
client_options = {"quota_project_id": gcp_default_project_id,
                  "api_endpoint": "us-central1-dialogflow.googleapis.com:443"}
client = EntityTypesClient(credentials=credentials_det, client_options=client_options)

entity_type = v3beta.EntityType()
entity_type.display_name = entity_display_name
entity_type.kind = "KIND_REGEXP"
print(client_options)
entity_type.entities = entity_json

# Initialize request argument(s)
request = v3beta.UpdateEntityTypeRequest(
    entity_type=entity_type,
)
print(request)
response = client.update_entity_type(request=request)
print(response)
entity_json is fetched from the DB and built as a JSON-like list of objects, as shown below.
df = get_data.get_df_details(config_dir, entity_data_source, sql)
username = df['username'].tolist()
entity_json = []
for each in username:
    each_entity_value = {}
    each_entity_value['value'] = each
    each_entity_value['synonyms'] = [each]
    entity_json.append(each_entity_value)
Here's the traceback:
Traceback (most recent call last):
File "/Users/<some_dir>/df_ins_entities/df_ins_entities/ins_entity_val.py", line 116, in
ins_now(config_dir, input_entity_name, entity_data_source)
File "/Users/<some_dir>/df_ins_entities/df_ins_entities/ins_entity_val.py", line 96, in ins_now
response = client.update_entity_type(request=request)
File "/Users/<some_dir>/df_ins_entities/lib/python3.9/site-packages/google/cloud/dialogflowcx_v3beta1/services/entity_types/client.py", line 902, in update_entity_type
response = rpc(request, retry=retry, timeout=timeout, metadata=metadata,)
File "/Users/<some_dir>/df_ins_entities/lib/python3.9/site-packages/google/api_core/gapic_v1/method.py", line 154, in call
return wrapped_func(*args, **kwargs)
File "/Users/<some_dir>/df_ins_entities/lib/python3.9/site-packages/google/api_core/grpc_helpers.py", line 59, in error_remapped_callable
raise exceptions.from_grpc_error(exc) from exc
google.api_core.exceptions.InvalidArgument: 400 Resource name '' does not match 'projects/*/locations/*/agents/*/entityTypes/*'.
Process finished with exit code 1
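The 400 message itself points at the likely problem: the request carries an empty resource name. A minimal sketch of what setting it might look like, assuming the generated client's usual entity_type_path helper and hypothetical agent_id / entity_type_id values (none of these come from the question):

# Hypothetical: update_entity_type needs entity_type.name set to the full
# resource path projects/<p>/locations/<l>/agents/<a>/entityTypes/<id>.
entity_type.name = EntityTypesClient.entity_type_path(
    gcp_default_project_id, "us-central1", agent_id, entity_type_id)

request = v3beta.UpdateEntityTypeRequest(entity_type=entity_type)
response = client.update_entity_type(request=request)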

Testing boto3/botocore with pagination

I've been trying to add unit tests to my AWS scripts. I've been using botocore.stub to stub the API calls.
I needed to add pagination to various calls, and I can't seem to find a way to write the tests to include pagination.
Here's an example of the non-paginated test; I'm wondering how I can refactor this test and the function to use pagination:
# -*- coding: utf-8 -*-
import unittest

import boto3
from botocore.stub import Stubber
from datetime import datetime


def describe_images(client, repository):
    return client.describe_images(repositoryName=repository)


class TestCase(unittest.TestCase):
    def setUp(self):
        self.client = boto3.client('ecr')

    def test_describe_images(self):
        describe_images_response = {
            'imageDetails': [
                {
                    'registryId': 'string',
                    'repositoryName': 'string',
                    'imageDigest': 'string',
                    'imageTags': [
                        'string',
                    ],
                    'imageSizeInBytes': 123,
                    'imagePushedAt': datetime(2015, 1, 1)
                },
            ],
            'nextToken': 'string'
        }
        stubber = Stubber(self.client)
        expected_params = {'repositoryName': 'repo_name'}
        stubber.add_response(
            'describe_images',
            describe_images_response,
            expected_params
        )
        with stubber:
            response = describe_images(self.client, 'repo_name')
            self.assertEqual(describe_images_response, response)


if __name__ == '__main__':
    unittest.main()
If I update the function to include pagination like this:
def describe_images(client, repository):
    paginator = client.get_paginator('describe_images')
    response_iterator = paginator.paginate(
        repositoryName=repository
    )
    return response_iterator
we seem to be getting somewhere. The test fails, as it should, since the two sides of the assertion are no longer equal:
F
======================================================================
FAIL: test_describe_images (__main__.TestCase)
----------------------------------------------------------------------
Traceback (most recent call last):
File "desc_imgs_paginated.py", line 47, in test_describe_images
self.assertEqual(describe_images_response, response)
AssertionError: {'imageDetails': [{'registryId': 'string'[178 chars]ing'} != <botocore.paginate.PageIterator object at 0x1058649b0>
----------------------------------------------------------------------
Ran 1 test in 0.075s
FAILED (failures=1)
When I try to iterate over the response iterator:
def describe_images(client, repository):
    paginator = client.get_paginator('describe_images')
    response_iterator = paginator.paginate(
        repositoryName=repository
    )
    return [r for r in response_iterator]
I get the following error:
E
======================================================================
ERROR: test_describe_images (__main__.TestCase)
----------------------------------------------------------------------
Traceback (most recent call last):
File "desc_imgs_paginated.py", line 45, in test_describe_images
response = describe_images(self.client, repo_name)
File "desc_imgs_paginated.py", line 14, in describe_images
return '.join([r for r in response_iterator])
File "desc_imgs_paginated.py", line 14, in <listcomp>
return '.join([r for r in response_iterator])
File "lib/python3.6/site-packages/botocore/paginate.py", line 255, in __iter__
response = self._make_request(current_kwargs)
File "lib/python3.6/site-packages/botocore/paginate.py", line 332, in _make_request
return self._method(**current_kwargs)
File "lib/python3.6/site-packages/botocore/client.py", line 312, in _api_call
return self._make_api_call(operation_name, kwargs)
File "lib/python3.6/site-packages/botocore/client.py", line 579, in _make_api_call
api_params, operation_model, context=request_context)
File "lib/python3.6/site-packages/botocore/client.py", line 631, in _convert_to_request_dict
params=api_params, model=operation_model, context=context)
File "lib/python3.6/site-packages/botocore/hooks.py", line 227, in emit
return self._emit(event_name, kwargs)
File "lib/python3.6/site-packages/botocore/hooks.py", line 210, in _emit
response = handler(**kwargs)
File "lib/python3.6/site-packages/botocore/stub.py", line 337, in _assert_expected_params
self._assert_expected_call_order(model, params)
File "lib/python3.6/site-packages/botocore/stub.py", line 323, in _assert_expected_call_order
pformat(params)))
botocore.exceptions.StubResponseError: Error getting response stub for operation DescribeImages: Unexpected API Call: called with parameters:
{nextToken: string, repositoryName: repo_name}
----------------------------------------------------------------------
Ran 1 test in 0.051s
FAILED (errors=1)
Am I missing the correct approach to testing this, or is this a bug in boto3/botocore?
It's been a while since this question was asked, but since there isn't an answer yet:
In your setup you provide a response dictionary as below:
describe_images_response = {
    'imageDetails': [
        {
            'registryId': 'string',
            'repositoryName': 'string',
            'imageDigest': 'string',
            'imageTags': [
                'string',
            ],
            'imageSizeInBytes': 123,
            'imagePushedAt': datetime(2015, 1, 1)
        },
    ],
    'nextToken': 'string'
}
The key here is that the first response includes a nextToken value, which will cause the paginator to make a second request. So you have to provide an additional response for the stub; ultimately you need to end with a response that does not include a nextToken.
Now, looking back at your setup, there is only a single add_response call to the stubber:
stubber.add_response(
    'describe_images',
    describe_images_response,
    expected_params
)
The net result is that when the paginator makes the second request, there is no response specified in the setup.
This results in the exception, whose message hopefully now makes more sense:
botocore.exceptions.StubResponseError: Error getting response stub for operation DescribeImages: Unexpected API Call: called with parameters:
{nextToken: string, repositoryName: repo_name}
Since the second response hasn't been set up, you get an exception describing the unexpected request; in that request you can see the nextToken parameter being passed.
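A minimal sketch of what the stub setup could look like with two pages (the second response body here is illustrative, not taken from the question):

stubber = Stubber(self.client)

# First page: the response contains 'nextToken', so the paginator
# will issue a second DescribeImages request.
stubber.add_response(
    'describe_images',
    describe_images_response,                      # includes 'nextToken': 'string'
    {'repositoryName': 'repo_name'}
)

# Second page: expected params now include the token from the first page,
# and the response omits nextToken so the paginator stops.
stubber.add_response(
    'describe_images',
    {'imageDetails': []},                          # illustrative final page
    {'repositoryName': 'repo_name', 'nextToken': 'string'}
)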

Serializing twisted.protocols.amp.AmpList for testing

I have a command as follows:
class AddChatMessages(Command):
    arguments = [
        ('messages', AmpList([('message', Unicode()), ('type', Integer())]))]
And I have a responder for it in a controller:
def add_chat_messages(self, messages):
    for i, m in enumerate(messages):
        messages[i] = (m['message'], m['type'])
    self.main.add_chat_messages(messages)
    return {}
commands.AddChatMessages.responder(add_chat_messages)
I am writing a unit test for it. This is my code:
class AddChatMessagesTest(ProtocolTestMixin, unittest.TestCase):
    command = commands.AddChatMessages
    data = {'messages': [{'message': 'hi', 'type': 'None'}]}

    def assert_callback(self, unused):
        pass
Where ProtocolTestMixin is as follows:
class ProtocolTestMixin(object):
    def setUp(self):
        self.protocol = client.CommandProtocol()

    def assert_callback(self, unused):
        raise NotImplementedError("Has to be implemented!")

    def test_responder(self):
        responder = self.protocol.lookupFunction(
            self.command.commandName)
        d = responder(self.data)
        d.addCallback(self.assert_callback)
        return d
It works if AmpList is not involved, but when it is, I get the following error:
======================================================================
ERROR: test_responder
----------------------------------------------------------------------
Traceback (most recent call last):
File "/Users/<username>/Projects/space/env/lib/python2.7/site-packages/twisted/internet/defer.py", line 139, in maybeDeferred
result = f(*args, **kw)
File "/Users/<username>/Projects/space/env/lib/python2.7/site-packages/twisted/internet/utils.py", line 203, in runWithWarningsSuppressed
reraise(exc_info[1], exc_info[2])
File "/Users/<username>/Projects/space/env/lib/python2.7/site-packages/twisted/internet/utils.py", line 199, in runWithWarningsSuppressed
result = f(*a, **kw)
File "/Users/<username>/Projects/space/tests/client_test.py", line 32, in test_responder
d = responder(self.data)
File "/Users/<username>/Projects/space/env/lib/python2.7/site-packages/twisted/protocols/amp.py", line 1016, in doit
kw = command.parseArguments(box, self)
File "/Users/<username>/Projects/space/env/lib/python2.7/site-packages/twisted/protocols/amp.py", line 1717, in parseArguments
return _stringsToObjects(box, cls.arguments, protocol)
File "/Users/<username>/Projects/space/env/lib/python2.7/site-packages/twisted/protocols/amp.py", line 2510, in _stringsToObjects
argparser.fromBox(argname, myStrings, objects, proto)
File "/Users/<username>/Projects/space/env/lib/python2.7/site-packages/twisted/protocols/amp.py", line 1209, in fromBox
objects[nk] = self.fromStringProto(st, proto)
File "/Users/<username>/Projects/space/env/lib/python2.7/site-packages/twisted/protocols/amp.py", line 1465, in fromStringProto
boxes = parseString(inString)
File "/Users/<username>/Projects/space/env/lib/python2.7/site-packages/twisted/protocols/amp.py", line 2485, in parseString
return cls.parse(StringIO(data))
TypeError: must be string or buffer, not list
Which makes sense, but how do I serialize a list in AddChatMessagesTest.data?
The responder expects to be called with a serialized box. It will then deserialize it, dispatch the objects to application code, take the object the application code returns, serialize it, and then return that serialized form.
For a few AMP types, most notably String, the serialized form is the same as the deserialized form, so it's easy to overlook this.
I think that you'll want to pass your data through Command.makeArguments in order to produce an object suitable to pass to a responder.
For example:
>>> from twisted.protocols.amp import Command, Integer
>>> class Foo(Command):
... arguments = [("bar", Integer())]
...
>>> Foo.makeArguments({"bar": 17}, None)
AmpBox({'bar': '17'})
>>>
If you do this with a Command that uses AmpList, I think you'll find makeArguments returns an encoded string for the value of that argument, and that the responder is happy to accept and parse that kind of string.
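Applied to the test above, that might look something like the sketch below (untested; note that 'type' has to be an actual integer for Integer() to serialize it, and the protocol argument can be None here because these argument types don't use it):

class AddChatMessagesTest(ProtocolTestMixin, unittest.TestCase):
    command = commands.AddChatMessages
    # Serialize the objects into an AmpBox up front; 'type' is an int,
    # since Integer() cannot serialize the string 'None'.
    data = commands.AddChatMessages.makeArguments(
        {'messages': [{'message': u'hi', 'type': 1}]}, None)

    def assert_callback(self, unused):
        pass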

Django 1.2 + South 0.7 + django-annoying's AutoOneToOneField leads to TypeError: 'LegacyConnection' object is not iterable

I'm using Django 1.2 trunk with South 0.7 and an AutoOneToOneField copied from django-annoying. South complained that the field does not have rules defined and the new version of South no longer has an automatic field type parser. So I read the South documentation and wrote the following definition (basically an exact copy of the OneToOneField rules):
rules = [
    (
        (AutoOneToOneField),
        [],
        {
            "to": ["rel.to", {}],
            "to_field": ["rel.field_name", {"default_attr": "rel.to._meta.pk.name"}],
            "related_name": ["rel.related_name", {"default": None}],
            "db_index": ["db_index", {"default": True}],
        },
    )
]

from south.modelsinspector import add_introspection_rules
add_introspection_rules(rules, ["^myapp"])
Now South raises the following error when I run schemamigration:
Traceback (most recent call last):
File "manage.py", line 11, in <module>
execute_manager(settings)
File "django/core/management/__init__.py", line 438, in execute_manager
utility.execute()
File "django/core/management/__init__.py", line 379, in execute
self.fetch_command(subcommand).run_from_argv(self.argv)
File "django/core/management/base.py", line 196, in run_from_argv
self.execute(*args, **options.__dict__)
File "django/core/management/base.py", line 223, in execute
output = self.handle(*args, **options)
File "South-0.7-py2.6.egg/south/management/commands/schemamigration.py", line 92, in handle
(k, v) for k, v in freezer.freeze_apps([migrations.app_label()]).items()
File "South-0.7-py2.6.egg/south/creator/freezer.py", line 33, in freeze_apps
model_defs[model_key(model)] = prep_for_freeze(model)
File "South-0.7-py2.6.egg/south/creator/freezer.py", line 65, in prep_for_freeze
fields = modelsinspector.get_model_fields(model, m2m=True)
File "South-0.7-py2.6.egg/south/modelsinspector.py", line 322, in get_model_fields
args, kwargs = introspector(field)
File "South-0.7-py2.6.egg/south/modelsinspector.py", line 271, in introspector
arg_defs, kwarg_defs = matching_details(field)
File "South-0.7-py2.6.egg/south/modelsinspector.py", line 187, in matching_details
if any([isinstance(field, x) for x in classes]):
TypeError: 'LegacyConnection' object is not iterable
Is this related to a recent change in Django 1.2 trunk? How do I fix this?
I use this field as follows:
class Bar(models.Model):
    foo = AutoOneToOneField("foo.Foo", primary_key=True, related_name="bar")
For reference, the field code from django-annoying:
class AutoSingleRelatedObjectDescriptor(SingleRelatedObjectDescriptor):
    def __get__(self, instance, instance_type=None):
        try:
            return super(AutoSingleRelatedObjectDescriptor, self).__get__(instance, instance_type)
        except self.related.model.DoesNotExist:
            obj = self.related.model(**{self.related.field.name: instance})
            obj.save()
            return obj

class AutoOneToOneField(OneToOneField):
    def contribute_to_related_class(self, cls, related):
        setattr(cls, related.get_accessor_name(), AutoSingleRelatedObjectDescriptor(related))
Try changing this line:
(AutoOneToOneField),
to this:
(AutoOneToOneField,),
Written the way you had it, that is not a tuple at all, so it is not iterable.
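A quick illustration of the difference with generic values (nothing South-specific):

classes = (int)        # no trailing comma: this is just the int class itself
try:
    any(isinstance(3, x) for x in classes)
except TypeError as e:
    print(e)           # 'type' object is not iterable -- the same kind of failure South hit

classes = (int,)       # trailing comma: a real one-element tuple
print(any(isinstance(3, x) for x in classes))   # True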
I solved the problem by removing the rules and adding the following method to AutoOneToOneField:
def south_field_triple(self):
    "Returns a suitable description of this field for South."
    from south.modelsinspector import introspector
    field_class = OneToOneField.__module__ + "." + OneToOneField.__name__
    args, kwargs = introspector(self)
    return (field_class, args, kwargs)
Your rule has a simple Python-related problem: a tuple with only a single item needs a trailing comma.
So change (AutoOneToOneField), to (AutoOneToOneField,),
To be honest, I didn't know that I could use a method on the field instead of rules. I will apply your patch and submit it to the django-annoying repository.
