Error when calling pkg_resources.resource_string - python

I'm a little confused. I'm working on a Python project where I load a resource file, read it as TSV, and transform it into a dictionary with compiled pattern objects as keys for later use. To load the file, I've so far used the pkg_resources package from setuptools. It basically looks like this:
from csv import DictReader
import re
from pkg_resources import resource_string

def make_dict():
    """Make global dictionary."""
    global event_dict
    dictlines = [l.decode('utf8') for l in resource_string(
        'pkgname.resources.tsv', 'event_dict.tsv').splitlines()]
    reader = DictReader(dictlines, dialect='excel-tab')
    for row in reader:
        event = re.compile(r'\b{}\b'.format(re.escape(row['word'])))
        classes = string_to_list(row['id'])
        event_dict[event] = classes
So far, this worked well. However, once I started calling the module from another module, the following error appeared:
---------------------------------------------------------------------------
KeyError Traceback (most recent call last)
C:\Python\Python36\lib\site-packages\pkg_resources\__init__.py in get_provider(moduleOrReq)
430 try:
--> 431 module = sys.modules[moduleOrReq]
432 except KeyError:
KeyError: 'pkgname.resources.tsv'
During handling of the above exception, another exception occurred:
ModuleNotFoundError Traceback (most recent call last)
<ipython-input-22-efa35954c76f> in <module>()
----> 1 make_event_dict()
<ipython-input-21-b318bc78e8fd> in make_event_dict()
4 global event_dict
5 dictlines = [l.decode('utf8') for l in resource_string(
----> 6 'pkgname.resources.tsv', 'event_classes_dict.tsv').splitlines()]
7 reader = DictReader(dictlines, dialect='excel-tab')
8 for row in reader:
C:\Python\Python36\lib\site-packages\pkg_resources\__init__.py in resource_string(self, package_or_requirement, resource_name)
1215 def resource_string(self, package_or_requirement, resource_name):
1216 """Return specified resource as a string"""
-> 1217 return get_provider(package_or_requirement).get_resource_string(
1218 self, resource_name
1219 )
C:\Python\Python36\lib\site-packages\pkg_resources\__init__.py in get_provider(moduleOrReq)
431 module = sys.modules[moduleOrReq]
432 except KeyError:
--> 433 __import__(moduleOrReq)
434 module = sys.modules[moduleOrReq]
435 loader = getattr(module, '__loader__', None)
ModuleNotFoundError: No module named 'pkgname'
Now I'm guessing something's wrong with my project setup, so here is how it's structured:
|Pkg\
|----setup.py
|----pkg\
|--------__init__.py
|--------events.py
|--------resources\
|------------__init__.py
|------------tsv\
|----------------__init__.py
|----------------event_dict.tsv
What could be wrong? I'm not exactly sure whether the __init__.py files in the subfolders are needed, by the way.
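For reference, a minimal setup.py sketch (assuming setuptools, with the package names taken from the tree above) that declares the .tsv file as package data looks like this; note that the first argument to resource_string has to be the importable dotted path of the package containing the resource (pkg.resources.tsv for this layout):
from setuptools import setup, find_packages

setup(
    name='pkg',
    version='0.1',
    packages=find_packages(),
    # ship the non-Python dictionary file together with the package
    package_data={'pkg.resources.tsv': ['event_dict.tsv']},
)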

Related

AttributeError: 'QuantumCircuit' object has no attribute 'config'

I am working on a program for Qiskit, but I am getting a strange error (one that I have not gotten in the past) when I try to simulate a circuit. Here is a minimal example producing the error:
from qiskit.circuit import QuantumCircuit
from qiskit import Aer,transpile
c = QuantumCircuit(2)
simulator = Aer.get_backend('qasm_simulator')
c = transpile(c, simulator)
result = simulator.run(c).result()
plot_histogram(counts, title='Counts')
The error I get is:
---------------------------------------------------------------------------
AttributeError Traceback (most recent call last)
<ipython-input-1-89904ecd5f8e> in <module>()
5 simulator = Aer.get_backend('qasm_simulator')
6 c = transpile(c, simulator)
----> 7 result = simulator.run(c).result()
8 plot_histogram(counts, title='Counts')
/Users/d/anaconda3/envs/qiskit/lib/python3.7/site-packages/qiskit/providers/aer/backends/aerbackend.py in run(self, qobj, backend_options, validate, **run_options)
146 # Add backend options to the Job qobj
147 qobj = self._format_qobj(
--> 148 qobj, backend_options=backend_options, **run_options)
149
150 # Optional validation
/Users/d/anaconda3/envs/qiskit/lib/python3.7/site-packages/qiskit/providers/aer/backends/aerbackend.py in _format_qobj(self, qobj, backend_options, **run_options)
353 """Return execution sim config dict from backend options."""
354 # Add options to qobj config overriding any existing fields
--> 355 config = qobj.config
356
357 # Add options
AttributeError: 'QuantumCircuit' object has no attribute 'config'
Does anyone know what is causing this error?
Thank you!
I believe you need to assemble the transpiled circuit into a qobj before running it:
from qiskit.compiler import assemble
my_qobj = assemble(c)
result = simulator.run(my_qobj).result()
By the way, without any measurements, plot_histogram(result.get_counts()) will raise an error as well.
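A small end-to-end sketch of that (assuming the same Aer qasm_simulator backend as above), with measurements added so that get_counts and plot_histogram have something to show:
from qiskit.circuit import QuantumCircuit
from qiskit import Aer, transpile
from qiskit.compiler import assemble
from qiskit.visualization import plot_histogram

c = QuantumCircuit(2)
c.h(0)            # put qubit 0 into superposition so the counts are non-trivial
c.cx(0, 1)        # entangle the two qubits
c.measure_all()   # without measurements there are no counts to plot

simulator = Aer.get_backend('qasm_simulator')
c = transpile(c, simulator)
result = simulator.run(assemble(c)).result()
plot_histogram(result.get_counts(), title='Counts')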
There is also a dedicated Stack Exchange site for quantum computing, https://quantumcomputing.stackexchange.com; feel free to post any other questions about QC and related topics there :)

KeyError Traceback (most recent call last)

import os
from geopy.geocoders import Here

exif = get_exif('earth_postcard_1599147372.jpg')
geotags = get_geotagging(exif)
coords = get_coordinates(geotags)
geocoder = Here(apikey=os.environ['API_KEY'])
print(geocoder.reverse("%s,%s" % coords))
The error:
KeyError Traceback (most recent call last)
<ipython-input-13-baa26ae24e59> in <module>
4 geotags = get_geotagging(exif)
5 coords = get_coordinates(geotags)
----> 6 geocoder = Here(apikey=os.environ['API_KEY'])
7 print(geocoder.reverse("%s,%s" % coords))
~\anaconda3\lib\os.py in __getitem__(self, key)
677 except KeyError:
678 # raise KeyError with the original key value
--> 679 raise KeyError(key) from None
680 return self.decodevalue(value)
681
KeyError: 'API_KEY'
Here I try to get the location from latitude and longitude using the geopy library, but I'm getting a KeyError.
You should check whether you have set the environment variable. You can enter the Python command line and run:
import os
os.environ.keys()
This will list all environment variables; 'API_KEY' is most likely not among them. Once you set this key in your environment, the error will go away.
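A small sketch of that check, using os.environ.get so a missing key produces a readable message instead of a bare KeyError (the message text is just an example):
import os
from geopy.geocoders import Here

api_key = os.environ.get('API_KEY')  # returns None instead of raising KeyError
if api_key is None:
    raise SystemExit('API_KEY is not set; set it in your environment before running this script')

geocoder = Here(apikey=api_key)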

Why won't pandas read pickle file generated with older pandas version? [duplicate]

Importing pandas didn't throw the error, but rather trying to read a pickled pandas dataframe, as follows:
import numpy as np
import pandas as pd
import matplotlib
import seaborn as sns
sns.set(style="white")
control_data = pd.read_pickle('null_report.pickle')
test_data = pd.read_pickle('test_report.pickle')
The traceback is 165 lines long, with three chained exceptions (whatever that means). Is read_pickle not compatible with the pandas version 0.17.1 I'm running? How do I unpickle my dataframe for use?
Below is a copy of the traceback:
ImportError Traceback (most recent call last)
C:\Users\test\Anaconda3\lib\site-packages\pandas\io\pickle.py in try_read(path, encoding)
45 with open(path, 'rb') as fh:
---> 46 return pkl.load(fh)
47 except (Exception) as e:
ImportError: No module named 'pandas.indexes'
During handling of the above exception, another exception occurred:
ImportError Traceback (most recent call last)
C:\Users\test\Anaconda3\lib\site-packages\pandas\io\pickle.py in try_read(path, encoding)
51 with open(path, 'rb') as fh:
---> 52 return pc.load(fh, encoding=encoding, compat=False)
53
C:\Users\test\Anaconda3\lib\site-packages\pandas\compat\pickle_compat.py in load(fh, encoding, compat, is_verbose)
115
--> 116 return up.load()
117 except:
C:\Users\test\Anaconda3\lib\pickle.py in load(self)
1038 assert isinstance(key, bytes_types)
-> 1039 dispatch[key[0]](self)
1040 except _Stop as stopinst:
C:\Users\test\Anaconda3\lib\pickle.py in load_stack_global(self)
1342 raise UnpicklingError("STACK_GLOBAL requires str")
-> 1343 self.append(self.find_class(module, name))
1344 dispatch[STACK_GLOBAL[0]] = load_stack_global
C:\Users\test\Anaconda3\lib\pickle.py in find_class(self, module, name)
1383 module = _compat_pickle.IMPORT_MAPPING[module]
-> 1384 __import__(module, level=0)
1385 if self.proto >= 4:
ImportError: No module named 'pandas.indexes'
During handling of the above exception, another exception occurred:
ImportError Traceback (most recent call last)
C:\Users\test\Anaconda3\lib\site-packages\pandas\io\pickle.py in read_pickle(path)
59 try:
---> 60 return try_read(path)
61 except:
C:\Users\test\Anaconda3\lib\site-packages\pandas\io\pickle.py in try_read(path, encoding)
56 with open(path, 'rb') as fh:
---> 57 return pc.load(fh, encoding=encoding, compat=True)
58
C:\Users\test\Anaconda3\lib\site-packages\pandas\compat\pickle_compat.py in load(fh, encoding, compat, is_verbose)
115
--> 116 return up.load()
117 except:
C:\Users\test\Anaconda3\lib\pickle.py in load(self)
1038 assert isinstance(key, bytes_types)
-> 1039 dispatch[key[0]](self)
1040 except _Stop as stopinst:
C:\Users\test\Anaconda3\lib\pickle.py in load_stack_global(self)
1342 raise UnpicklingError("STACK_GLOBAL requires str")
-> 1343 self.append(self.find_class(module, name))
1344 dispatch[STACK_GLOBAL[0]] = load_stack_global
C:\Users\test\Anaconda3\lib\pickle.py in find_class(self, module, name)
1383 module = _compat_pickle.IMPORT_MAPPING[module]
-> 1384 __import__(module, level=0)
1385 if self.proto >= 4:
ImportError: No module named 'pandas.indexes'
During handling of the above exception, another exception occurred:
ImportError Traceback (most recent call last)
C:\Users\test\Anaconda3\lib\site-packages\pandas\io\pickle.py in try_read(path, encoding)
45 with open(path, 'rb') as fh:
---> 46 return pkl.load(fh)
47 except (Exception) as e:
ImportError: No module named 'pandas.indexes'
During handling of the above exception, another exception occurred:
ImportError Traceback (most recent call last)
C:\Users\test\Anaconda3\lib\site-packages\pandas\io\pickle.py in try_read(path, encoding)
51 with open(path, 'rb') as fh:
---> 52 return pc.load(fh, encoding=encoding, compat=False)
53
C:\Users\test\Anaconda3\lib\site-packages\pandas\compat\pickle_compat.py in load(fh, encoding, compat, is_verbose)
115
--> 116 return up.load()
117 except:
C:\Users\test\Anaconda3\lib\pickle.py in load(self)
1038 assert isinstance(key, bytes_types)
-> 1039 dispatch[key[0]](self)
1040 except _Stop as stopinst:
C:\Users\test\Anaconda3\lib\pickle.py in load_stack_global(self)
1342 raise UnpicklingError("STACK_GLOBAL requires str")
-> 1343 self.append(self.find_class(module, name))
1344 dispatch[STACK_GLOBAL[0]] = load_stack_global
C:\Users\test\Anaconda3\lib\pickle.py in find_class(self, module, name)
1383 module = _compat_pickle.IMPORT_MAPPING[module]
-> 1384 __import__(module, level=0)
1385 if self.proto >= 4:
ImportError: No module named 'pandas.indexes'
During handling of the above exception, another exception occurred:
ImportError Traceback (most recent call last)
<ipython-input-17-3b05fe7d20a4> in <module>()
3 # test_data = np.genfromtxt(fh, usecols=2)
4
----> 5 control_data = pd.read_pickle('null_report.pickle')
6 test_data = pd.read_pickle('test_report.pickle')
7
C:\Users\test\Anaconda3\lib\site-packages\pandas\io\pickle.py in read_pickle(path)
61 except:
62 if PY3:
---> 63 return try_read(path, encoding='latin1')
64 raise
C:\Users\test\Anaconda3\lib\site-packages\pandas\io\pickle.py in try_read(path, encoding)
55 except:
56 with open(path, 'rb') as fh:
---> 57 return pc.load(fh, encoding=encoding, compat=True)
58
59 try:
C:\Users\test\Anaconda3\lib\site-packages\pandas\compat\pickle_compat.py in load(fh, encoding, compat, is_verbose)
114 up.is_verbose = is_verbose
115
--> 116 return up.load()
117 except:
118 raise
C:\Users\test\Anaconda3\lib\pickle.py in load(self)
1037 raise EOFError
1038 assert isinstance(key, bytes_types)
-> 1039 dispatch[key[0]](self)
1040 except _Stop as stopinst:
1041 return stopinst.value
C:\Users\test\Anaconda3\lib\pickle.py in load_stack_global(self)
1341 if type(name) is not str or type(module) is not str:
1342 raise UnpicklingError("STACK_GLOBAL requires str")
-> 1343 self.append(self.find_class(module, name))
1344 dispatch[STACK_GLOBAL[0]] = load_stack_global
1345
C:\Users\test\Anaconda3\lib\pickle.py in find_class(self, module, name)
1382 elif module in _compat_pickle.IMPORT_MAPPING:
1383 module = _compat_pickle.IMPORT_MAPPING[module]
-> 1384 __import__(module, level=0)
1385 if self.proto >= 4:
1386 return _getattribute(sys.modules[module], name)[0]
ImportError: No module named 'pandas.indexes'
I also tried loading the pickle file from pickle directly:
via_pickle = pickle.load( open( 'null_report.pickle', "rb" ) )
and got the same error:
---------------------------------------------------------------------------
ImportError Traceback (most recent call last)
<ipython-input-23-ba2e3adae1c4> in <module>()
1
----> 2 via_pickle = pickle.load( open( 'null_report.pickle', "rb" ) )
3
4 # control_data = pd.read_pickle('null_report.pickle')
5 # test_data = pd.read_pickle('test_report.pickle')
ImportError: No module named 'pandas.indexes'
I had this error when I created a pkl file with Python 2.7 and was trying to read it with Python 3.6.
I did:
pd.read_pickle('foo.pkl')
and it worked
I had this problem from trying to open a pickled dataframe made with pandas 0.18.1 using pandas 0.17.1.
If you are using pip, upgrade pandas with:
pip install --upgrade pandas
If you are using a distribution like Anaconda, use:
conda upgrade pandas
If you need to have both versions of pandas on your machine, consider using virtualenv
Saving and loading in different versions of pandas using pickle often does not work. Instead, use pandas.HDFStore.
When I needed to update pandas but also needed some data saved with pickle in previous versions, I went back and re-saved that data in HDF format instead, when nothing else would work. No problems anymore.
Works for any sort of pandas data structure it seems, even multi-indexed dataframes! In short, if pickling fails after a version upgrade, try HDFStore; it's more reliable (and more efficient!).
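A minimal sketch of that workflow (hypothetical file and key names; writing HDF5 requires the tables package):
import pandas as pd

df = pd.DataFrame({'a': [1, 2, 3]})

# write in HDF5 format instead of pickle
df.to_hdf('report.h5', key='control_data', mode='w')

# read it back; this tends to survive pandas upgrades much better than pickle
control_data = pd.read_hdf('report.h5', key='control_data')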
Here is the solution without updating pandas or whatever else you're using.
If you're using Python 2:
import cPickle
with open('filename.pkl', 'rb') as fo:
    data = cPickle.load(fo)  # Python 2's cPickle.load takes no encoding argument
If you're using Python 3:
import pickle
with open('filename.pkl', 'rb') as fo:
    data = pickle.load(fo, encoding='latin1')
In pandas 0.23.4 there is a better way to fix the problem: use pandas.read_pickle to read the file object, like:
pd.read_pickle(open('test_report.pickle', 'rb'))
A flexible way to deal with internal API changes that break unpickling is to implement a custom Unpickler.
For example, the pandas.indexes module has been moved to pandas.core.indexes. We can write an Unpickler that adapts the module path accordingly by overriding the method find_class:
import pickle
import sys

class Unpickler(pickle.Unpickler):
    def find_class(self, module, name):
        '''This method gets called for every module pickle tries to load.'''
        # python 2 --> 3 compatibility: __builtin__ has been renamed to builtins
        if module == '__builtin__':
            module = 'builtins'
        # pandas compatibility: in newer versions, pandas.indexes has been moved to pandas.core.indexes
        if 'pandas.indexes' in module:
            module = module.replace('pandas.indexes', 'pandas.core.indexes')
        __import__(module)
        return getattr(sys.modules[module], name)

with open('/path/to/pickle.pkl', 'rb') as file:
    pdf = Unpickler(file).load()
If you want to read pickled text instead of a file, do
import io
pd.read_pickle(io.BytesIO(pickled_text))
If you face the error ValueError: Unrecognized compression type: infer, explicitly specify the compression type. It can be None (no compression), 'gzip', 'bz2', 'xz', or 'zip' (depending on the file extension):
pd.read_pickle(io.BytesIO(pickled_text), compression=None)

pydicom 'Dataset' object has no attribute 'TransferSyntaxUID'

I'm using pydicom 1.0.0a1, downloaded from here. When I run the following code:
import pydicom
ds = pydicom.read_file('./DR/abnormal/abc.dcm', force=True)
ds.pixel_array
this error occurs:
---------------------------------------------------------------------------
AttributeError Traceback (most recent call last)
<ipython-input-6-d4e81d303439> in <module>()
7 ds=pydicom.read_file('./DR/abnormal/abc.dcm',force=True)
8
----> 9 ds.pixel_array
10
/Applications/anaconda/lib/python2.7/site-packages/pydicom-1.0.0a1-py2.7.egg/pydicom/dataset.pyc in __getattr__(self, name)
501 if tag is None: # `name` isn't a DICOM element keyword
502 # Try the base class attribute getter (fix for issue 332)
--> 503 return super(Dataset, self).__getattribute__(name)
504 tag = Tag(tag)
505 if tag not in self: # DICOM DataElement not in the Dataset
/Applications/anaconda/lib/python2.7/site-packages/pydicom-1.0.0a1-py2.7.egg/pydicom/dataset.pyc in pixel_array(self)
1064 The Pixel Data (7FE0,0010) as a NumPy ndarray.
1065 """
-> 1066 return self._get_pixel_array()
1067
1068 # Format strings spec'd according to python string formatting options
/Applications/anaconda/lib/python2.7/site-packages/pydicom-1.0.0a1-py2.7.egg/pydicom/dataset.pyc in _get_pixel_array(self)
1042 elif self._pixel_id != id(self.PixelData):
1043 already_have = False
-> 1044 if not already_have and not self._is_uncompressed_transfer_syntax():
1045 try:
1046 # print("Pixel Data is compressed")
/Applications/anaconda/lib/python2.7/site-packages/pydicom-1.0.0a1-py2.7.egg/pydicom/dataset.pyc in _is_uncompressed_transfer_syntax(self)
662 """Return True if the TransferSyntaxUID is a compressed syntax."""
663 # FIXME uses file_meta here, should really only be thus for FileDataset
--> 664 return self.file_meta.TransferSyntaxUID in NotCompressedPixelTransferSyntaxes
665
666 def __ne__(self, other):
/Applications/anaconda/lib/python2.7/site-packages/pydicom-1.0.0a1-py2.7.egg/pydicom/dataset.pyc in __getattr__(self, name)
505 if tag not in self: # DICOM DataElement not in the Dataset
506 # Try the base class attribute getter (fix for issue 332)
--> 507 return super(Dataset, self).__getattribute__(name)
508 else:
509 return self[tag].value
AttributeError: 'Dataset' object has no attribute 'TransferSyntaxUID'
I read the Google group post, changed the filereader.py file to the posted one, and got this error:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/Applications/anaconda/lib/python2.7/site-packages/pydicom-1.0.0a1-py2.7.egg/pydicom/__init__.py", line 41, in read_file
from pydicom.dicomio import read_file
File "/Applications/anaconda/lib/python2.7/site-packages/pydicom-1.0.0a1-py2.7.egg/pydicom/dicomio.py", line 3, in <module>
from pydicom.filereader import read_file, read_dicomdir
File "/Applications/anaconda/lib/python2.7/site-packages/pydicom-1.0.0a1-py2.7.egg/pydicom/filereader.py", line 35, in <module>
from pydicom.datadict import dictionaryVR
ImportError: cannot import name dictionaryVR
Does anybody know how to solve this problem?
You should set the TransferSyntaxUID after reading the file before trying to get the pixel_array.
import pydicom.uid
ds=pydicom.read_file('./DR/abnormal/abc.dcm',force=True)
ds.file_meta.TransferSyntaxUID = pydicom.uid.ImplicitVRLittleEndian # or whatever is the correct transfer syntax for the file
ds.pixel_array
The correction from the post you referenced was done before some changes in the code to harmonize some naming, so the error is thrown because the current master uses dictionary_VR rather than dictionaryVR. Setting the transfer syntax in user code as above avoids that problem.
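As a small defensive variant of the above (a sketch; choosing ImplicitVRLittleEndian is an assumption about how the pixel data is actually encoded), you can fill in the transfer syntax only when the file meta does not already declare one:
import pydicom
import pydicom.uid

ds = pydicom.read_file('./DR/abnormal/abc.dcm', force=True)

# only guess a transfer syntax if the file meta does not provide one
if getattr(ds.file_meta, 'TransferSyntaxUID', None) is None:
    ds.file_meta.TransferSyntaxUID = pydicom.uid.ImplicitVRLittleEndian

arr = ds.pixel_array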

python cubes analytical workspace list_cubes() fails

The current analytical workspace code seems to have some import error.
I tried this:
from cubes import Workspace
workspace = Workspace(config="slicer.ini")
workspace.import_model("model.json")
workspace.list_cubes()
And I get this error:
---------------------------------------------------------------------------
NameError Traceback (most recent call last)
<ipython-input-6-f80f0974af47> in <module>()
----> 1 workspace.list_cubes()
/home/anand/.virtualenvs/cubes/local/lib/python2.7/site-packages/cubes/workspace.pyc in list_cubes(self, identity)
535 """
536
--> 537 all_cubes = self.namespace.list_cubes(recursive=True)
538
539 if self.authorizer:
/home/anand/.virtualenvs/cubes/local/lib/python2.7/site-packages/cubes/namespace.pyc in list_cubes(self, recursive)
112 name = cube["name"]
113 if name in cube_names:
--> 114 raise ModelError("Duplicate cube '%s'" % name)
115 cube_names.add(name)
116
NameError: global name 'ModelError' is not defined
I'm using version 1.0.1. I have also tried the latest master from GitHub, but that fails even at
from cubes import Workspace
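Until the missing import is fixed upstream, one possible local workaround (an assumption here is that the exception classes live in cubes.errors, as in the 1.x codebase) is to inject the missing name into cubes.namespace before listing the cubes; the underlying ModelError about a duplicate cube should then surface with a proper message:
import cubes.namespace
from cubes.errors import ModelError

# patch the name that cubes/namespace.py references but never imports
cubes.namespace.ModelError = ModelError

from cubes import Workspace
workspace = Workspace(config="slicer.ini")
workspace.import_model("model.json")
print(workspace.list_cubes())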
