Python not able to pickle ZabbixAPI class instance

I'm not able to pickle a pyzabbix.ZabbixAPI class instance with the following code:
from pyzabbix import ZabbixAPI
from pickle import dumps
api = ZabbixAPI('http://platform.autuitive.com/monitoring/')
print dumps(api)
It results in the following error:
Traceback (most recent call last):
File "zabbix_test.py", line 8, in <module>
print dumps(api)
File "/usr/lib/python2.7/pickle.py", line 1374, in dumps
Pickler(file, protocol).dump(obj)
File "/usr/lib/python2.7/pickle.py", line 224, in dump
self.save(obj)
File "/usr/lib/python2.7/pickle.py", line 306, in save
rv = reduce(self.proto)
File "/usr/lib/python2.7/copy_reg.py", line 84, in _reduce_ex
dict = getstate()
TypeError: 'ZabbixAPIObjectClass' object is not callable

Well, in the documentation it says:
instances of such classes whose __dict__ or the result of calling __getstate__() is picklable (see section The pickle protocol for details).
And it would appear this class isn't one. If you really need to do this, then consider writing your own pickling routines.
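For example, here is a minimal sketch of that approach, assuming the server URL is all the state you need to rebuild the connection (the wrapper class name and attributes are illustrative, not part of pyzabbix):

import pickle
from pyzabbix import ZabbixAPI

class PicklableZabbixAPI(object):
    """Wraps a ZabbixAPI instance so that only rebuildable state is pickled."""
    def __init__(self, url):
        self.url = url
        self.api = ZabbixAPI(url)

    def __getstate__(self):
        # Pickle only the URL; the live session object is not picklable.
        return {'url': self.url}

    def __setstate__(self, state):
        # Recreate the connection when unpickling.
        self.url = state['url']
        self.api = ZabbixAPI(self.url)

wrapped = PicklableZabbixAPI('http://platform.autuitive.com/monitoring/')
restored = pickle.loads(pickle.dumps(wrapped))

Note that if you authenticate with api.login(...), the restored copy has to log in again, since the session state is deliberately dropped.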

Related

pickle error when serializing a Cython class with a lambda

I use pickle and dill for the following lambda function and it works fine:
import dill
import pickle
f = lambda x,y: x+y
s = pickle.dumps(f)
or even when it is used in a class, for example in the file foo.py:
class Foo(object):
    def __init__(self):
        self.f = lambda x, y: x+y
and in the file test.py:
import dill
import pickle
from foo import Foo
f = Foo()
s = pickle.dumps(f) # or s = dill.dumps(f)
but when I build the same file as a .pyx (foo.pyx) with Cython, I can't serialize it with dill, pickle, or cPickle; I get this error:
Traceback (most recent call last):
File "/home/amin/anaconda2/envs/rllab2/lib/python2.7/site-packages/IPython/core/interactiveshell.py", line 2878, in run_cod
exec(code_obj, self.user_global_ns, self.user_ns)
File "", line 1, in
a = pickle.dumps(c)
File "/home/amin/anaconda2/envs/rllab2/lib/python2.7/pickle.py", line 1380, in dumps
Pickler(file, protocol).dump(obj)
File "/home/amin/anaconda2/envs/rllab2/lib/python2.7/pickle.py", line 224, in dump
self.save(obj)
File "/home/amin/anaconda2/envs/rllab2/lib/python2.7/pickle.py", line 331, in save
self.save_reduce(obj=obj, *rv)
File "/home/amin/anaconda2/envs/rllab2/lib/python2.7/pickle.py", line 425, in save_reduce
save(state)
File "/home/amin/anaconda2/envs/rllab2/lib/python2.7/pickle.py", line 286, in save
f(self, obj) # Call unbound method with explicit self
File "/home/amin/anaconda2/envs/rllab2/lib/python2.7/site-packages/dill/_dill.py", line 912, in save_module_dict
StockPickler.save_dict(pickler, obj)
File "/home/amin/anaconda2/envs/rllab2/lib/python2.7/pickle.py", line 655, in save_dict
self._batch_setitems(obj.iteritems())
File "/home/amin/anaconda2/envs/rllab2/lib/python2.7/pickle.py", line 669, in _batch_setitems
save(v)
File "/home/amin/anaconda2/envs/rllab2/lib/python2.7/pickle.py", line 317, in save
self.save_global(obj, rv)
File "/home/amin/anaconda2/envs/rllab2/lib/python2.7/pickle.py", line 754, in save_global
(obj, module, name))
PicklingError: Can't pickle <function <lambda> at 0x7f9ab1ff07d0>: it's not found as foo.<lambda>
The setup.py file used to build the Cython module:
from distutils.core import setup
from Cython.Build import cythonize
setup(ext_modules=cythonize("foo.pyx"))
then run in terminal:
python setup.py build_ext --inplace
Is there a way to make this work?
I'm the dill author. Expanding on what @DavidW says in the comments -- I believe there are (currently) no known serializers that can pickle cython lambdas, or the vast majority of cython code. Indeed, it is much more difficult for python serializers to pickle objects with C extensions unless the authors of the C-extension code specifically build serialization instructions (as numpy and pandas did). In that vein... instead of a lambda, you could build a class with a __call__ method, so it acts like a function... and then add one or more of the pickle methods (__reduce__, __getstate__, __setstate__, or something similar)... and then you should be able to pickle instances of your class. It's a bit of work, but since this path has been used to pickle classes written in C++ -- I believe you should be able to get it to work for cython-built classes.
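A minimal sketch of that idea in plain Python (the class name Adder and the __reduce__ recipe are illustrative; in your case the class body would live in foo.pyx):

import pickle

class Adder(object):
    """Behaves like lambda x, y: x + y, but is picklable."""
    def __call__(self, x, y):
        return x + y

    def __reduce__(self):
        # Tell pickle to rebuild this object by calling Adder().
        return (Adder, ())

f = Adder()
s = pickle.dumps(f)
g = pickle.loads(s)
assert g(2, 3) == 5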
In this code
import dill
import pickle
f = lambda x,y: x+y
s = pickle.dumps(f)
f is a function.
But in this other code:
import dill
import pickle
from foo import Foo
f = Foo()
s = pickle.dumps(f)
# or
s = dill.dumps(f)
f is an instance of the class Foo, not a plain function.

Any easy solution to resolve a pickling error?

I am using savez to save the weights. Following is my code:
class vgg16:
    def __init__(self, imgs1, imgs2, weights=None, sess=None):
        .........
        self.weight_list = []
        self.keys = []
        ........
        self.SaveWeights()

    ....neural network............

    def SaveWeights(self):
        tmp = file("vgg16_predict.npz", 'wb')
        np.savez(self, **dict(zip(self.keys, self.weight_list)))
        tmp.close
I keep getting a pickling error. There are different solutions out there, but is there a simple way to make this work?
Here is the traceback:
Traceback (most recent call last):
File "f.py", line 350, in <module>
vgg = vgg16(imgs1,imgs2, 'vgg16_weights.npz', sess)
File "f.py", line 43, in __init__
self.SaveWeights()
File "f.py", line 339, in SaveWeights
np.savez(self,**dict(zip(self.keys, self.weight_list)))
File "/usr/local/lib/python2.7/dist-packages/numpy/lib/npyio.py", line 574, in savez
_savez(file, args, kwds, False)
File "/usr/local/lib/python2.7/dist-packages/numpy/lib/npyio.py", line 639, in _savez
pickle_kwargs=pickle_kwargs)
File "/usr/local/lib/python2.7/dist-packages/numpy/lib/format.py", line 573, in write_array
pickle.dump(array, fp, protocol=2, **pickle_kwargs)
cPickle.PicklingError: Can't pickle <type 'module'>: attribute lookup __builtin__.module failed
Exception AttributeError: "vgg16 instance has no attribute 'tell'" in <bound method ZipFile.__del__ of <zipfile.ZipFile object at 0x7f812dec99d0>> ignored
You're passing self as the first argument to np.savez, which expects a filename (or file object) there. Just use the path directly:
np.savez("vgg16_predict.npz", **dict(zip(self.keys, self.weight_list)))
So this should be your full method:
def SaveWeights(self):
    np.savez("vgg16_predict.npz", **dict(zip(self.keys, self.weight_list)))

Python - dill: Can't pickle decorated class

I have the following code, which decorates the class:
import dill
from collections import namedtuple
from multiprocessing import Process
def proxified(to_be_proxied):
    b = namedtuple('d', [])
    class Proxy(to_be_proxied, b):
        pass
    Proxy.__name__ = to_be_proxied.__name__
    return Proxy

@proxified
class MyClass:
    def f(self):
        print('hello!')

pickled_cls = dill.dumps(MyClass)

def new_process(clazz):
    dill.loads(clazz)().f()

p = Process(target=new_process, args=(pickled_cls,))
p.start()
p.join()
When I try to pickle the decorated class, I get the following error:
Traceback (most recent call last):
File "/usr/lib/python3.5/pickle.py", line 907, in save_global
obj2, parent = _getattribute(module, name)
File "/usr/lib/python3.5/pickle.py", line 265, in _getattribute
.format(name, obj))
AttributeError: Can't get local attribute 'proxified.<locals>.Proxy' on <function proxified at 0x7fbf7de4b8c8>
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/carbolymer/example.py", line 108, in <module>
pickled_cls = dill.dumps(MyClass)
File "/usr/lib/python3.5/site-packages/dill/dill.py", line 243, in dumps
dump(obj, file, protocol, byref, fmode, recurse)#, strictio)
File "/usr/lib/python3.5/site-packages/dill/dill.py", line 236, in dump
pik.dump(obj)
File "/usr/lib/python3.5/pickle.py", line 408, in dump
self.save(obj)
File "/usr/lib/python3.5/pickle.py", line 475, in save
f(self, obj) # Call unbound method with explicit self
File "/usr/lib/python3.5/site-packages/dill/dill.py", line 1189, in save_type
StockPickler.save_global(pickler, obj)
File "/usr/lib/python3.5/pickle.py", line 911, in save_global
(obj, module_name, name))
_pickle.PicklingError: Can't pickle <class '__main__.proxified.<locals>.Proxy'>: it's not found as __main__.proxified.<locals>.Proxy
How can I pickle the decorated class using dill? I would like to pass this class to a separate process as an argument - or maybe there is a simpler way to do it?
A good explanation of "Why pickling decorated functions is painful", provided by Gaël Varoquaux, can be found here.
Basically, rewriting the decorator using functools.wraps can avoid these issues :)
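functools.wraps targets functions; for a class decorator, the analogous move is to copy the identity attributes onto the generated class by hand, so pickle can find it by name. A minimal sketch, assuming the decorated name is the only module-level reference you need:

from collections import namedtuple

def proxified(to_be_proxied):
    b = namedtuple('d', [])
    class Proxy(to_be_proxied, b):
        pass
    # pickle serializes classes by reference, so __qualname__ and
    # __module__ must point at the module-level decorated name.
    Proxy.__name__ = to_be_proxied.__name__
    Proxy.__qualname__ = to_be_proxied.__qualname__
    Proxy.__module__ = to_be_proxied.__module__
    return Proxy

@proxified
class MyClass:
    def f(self):
        print('hello!')

import pickle
pickle.loads(pickle.dumps(MyClass))().f()  # prints: hello!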

How do I efficiently retrieve a non-initial slice of a Flywheel scan?

Given a large Dynamo table with lots of items, I would like to be able to start a scan and later resume iteration on it from an unrelated Python context, as if I had continued calling next() on gen() on the scan itself.
What I am trying to avoid:
offset = 500
count = 25
scan_gen = engine.scan(AModel).gen()
for _ in range(offset):
    scan_gen.next()
results = [scan_gen.next() for _ in range(count)]
Because this would require restarting the scan from the top, every single time.
I see that the DynamoDB API normally works in a cursor-like fashion with the LastEvaluatedKey property: http://docs.aws.amazon.com/amazondynamodb/latest/APIReference/API_Scan.html
Is there a way to use this to jump ahead in a scan generator in Flywheel?
Failing that, is there a way to serialize the state of the generator? I have tried pickling the generator, and it causes pickle.PicklingError due to name resolution problems:
>>> with open('/tmp/dump_dynamo_result', 'wb') as out_fp:
... pickle.dump(engine.scan(AModel).gen(), out_fp, pickle.HIGHEST_PROTOCOL)
...
Traceback (most recent call last):
File "<stdin>", line 2, in <module>
File "/usr/lib/python2.7/pickle.py", line 1370, in dump
Pickler(file, protocol).dump(obj)
File "/usr/lib/python2.7/pickle.py", line 224, in dump
self.save(obj)
File "/usr/lib/python2.7/pickle.py", line 331, in save
self.save_reduce(obj=obj, *rv)
File "/usr/lib/python2.7/pickle.py", line 396, in save_reduce
save(cls)
File "/usr/lib/python2.7/pickle.py", line 286, in save
f(self, obj) # Call unbound method with explicit self
File "/usr/lib/python2.7/pickle.py", line 748, in save_global
(obj, module, name))
pickle.PicklingError: Can't pickle <type 'generator'>: it's not found as __builtin__.generator
Yes, you can build the LastEvaluatedKey yourself and pass it in as the ExclusiveStartKey in another scan. It will basically be the key (hash or hash/range) of the last item that was evaluated -- i.e. the item that you want to start scanning from.
If you want to see what it looks like, do a Scan and set the limit to 2 or something small. You'll get a LastEvaluatedKey returned and you can examine it and determine how to build it yourself. Then, choose an item from your database from where you want to start your scan and create a LastEvaluatedKey out of it.
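For illustration, here is a sketch using the low-level API via boto3 (the table name and key shape are assumptions; Flywheel's engine.scan() may not expose ExclusiveStartKey directly, so you may need to drop down a level):

import boto3

dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table('AModel')  # hypothetical table name

# First pass: scan a small page and save the cursor where it stopped.
page = table.scan(Limit=25)
cursor = page.get('LastEvaluatedKey')  # e.g. {'id': 'some-hash-key'}

# Later, in an unrelated context: resume the scan from the saved cursor.
if cursor is not None:
    page = table.scan(Limit=25, ExclusiveStartKey=cursor)
    results = page['Items']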

Python Pickle throwing FileNotFoundError

I have a situation in Python where, in a runnable object executing alongside other processes, the following happens. If the code is simply:
f = open(filename, "rb")
f.close()
There is no error, but then when the code changes to the following, introducing pickle in the middle, it throws a FileNotFoundError:
f = open(filename, "rb")
object = pickle.load(f)
f.close()
I don't understand why, if the file exists, pickle would throw such an error. The complete trace of the error is:
task = pickle.load(f)
File "/usr/lib/python3.4/multiprocessing/managers.py", line 852, in RebuildProxy
return func(token, serializer, incref=incref, **kwds)
File "/usr/lib/python3.4/multiprocessing/managers.py", line 706, in __init__
self._incref()
File "/usr/lib/python3.4/multiprocessing/managers.py", line 756, in _incref
conn = self._Client(self._token.address, authkey=self._authkey)
File "/usr/lib/python3.4/multiprocessing/connection.py", line 495, in Client
c = SocketClient(address)
File "/usr/lib/python3.4/multiprocessing/connection.py", line 624, in SocketClient
s.connect(address)
FileNotFoundError: [Errno 2] No such file or directory
While I'd say that @TesselatingHeckler provided the answer, this comment doesn't really fit in the comments section… so it's an extension of that answer.
You can't pickle multiprocessing Queue and Pipe objects, and indeed, some pickled objects only fail on load.
I have built a fork of multiprocessing that allows most objects to be pickled. Primarily, what was done was to replace pickle with a much more robust serializer (dill). The fork is available as part of the pathos package (on github). I've tried to serialize Pipes before, and it works; it also works on a python socket and a python Queue. However, it apparently still doesn't work on a multiprocessing Queue.
>>> from processing import Pipe
>>> p = Pipe()
>>>
>>> import dill
>>> dill.loads(dill.dumps(p))
(Connection(handle=12), Connection(handle=14))
>>>
>>> from socket import socket
>>> s = socket()
>>> from Queue import Queue as que
>>> w = que()
>>> dill.loads(dill.dumps(s))
<socket._socketobject object at 0x10dae18a0>
>>> dill.loads(dill.dumps(w))
<Queue.Queue instance at 0x10db49f38>
>>>
>>> from processing import Queue
>>> q = Queue()
>>> dill.loads(dill.dumps(q))
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/Users/mmckerns/lib/python2.7/site-packages/dill-0.2.2.dev-py2.7.egg/dill/dill.py", line 180, in dumps
dump(obj, file, protocol, byref, file_mode, safeio)
File "/Users/mmckerns/lib/python2.7/site-packages/dill-0.2.2.dev-py2.7.egg/dill/dill.py", line 173, in dump
pik.dump(obj)
File "/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/pickle.py", line 224, in dump
self.save(obj)
File "/opt/local/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/pickle.py", line 306, in save
rv = reduce(self.proto)
File "/Users/mmckerns/lib/python2.7/site-packages/processing/queue.py", line 62, in __getstate__
assertSpawning(self)
File "/Users/mmckerns/lib/python2.7/site-packages/processing/forking.py", line 24, in assertSpawning
'processes through inheritance' % type(self).__name__)
RuntimeError: Queue objects should only be shared between processes through inheritance
It seems that the __getstate__ method for a multiprocessing Queue is hardwired to throw an error. If you had this buried inside a class, it might not be triggered until the class instance is being reconstituted from the pickle.
Trying to pickle a multiprocessing Queue with pickle should also give you the above error.
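You can see that hardwired error without dill at all; a quick sketch with just the standard library:

import pickle
from multiprocessing import Queue

q = Queue()
try:
    pickle.dumps(q)
except RuntimeError as e:
    # Queue.__getstate__ raises unless the queue is being inherited
    # by a child process at spawn time.
    print(e)  # Queue objects should only be shared between processes through inheritance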
