Failed to convert .caffemodel to .mlmodel - python

While trying to convert a .caffemodel to a .mlmodel, I can't run my converter-script.py.
This is my converter-script.py file:
import coremltools

caffe_model = ('oxford102.caffemodel', 'deploy.prototxt')
labels = 'flower-labels.txt'

coreml_model = coremltools.converters.caffe.convert(
    caffe_model,
    class_labels=labels,
    image_input_names='data'
)
coreml_model.save('FlowerClassifier.mlmodel')
I run this in a virtualenv with Python 2.7, and I get this error message:
File "convert-script.py", line 1, in
import coremltools
File "/Users/aji/Documents/Environments/python27/lib/python2.7/site-packages/coremltools/init.py", line 28, in
_root_logger_handlers_backup = _root_logger.handlers.copy()
AttributeError: 'list' object has no attribute 'copy'
Can anyone give me a solution?

Use python3 instead of creating and running from a python27 venv.
python3 convert-script.py
worked for me

So the problem here is about coremltools: the most recent version works with Python 3, and you're doing the conversion on Python 2.7.
The easiest way to solve your problem is to downgrade coremltools to a version that works with Python 2.7. You can do that with the following command in your terminal:
pip install coremltools==4.0
Then just run the script the same way you did before :)
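If you want to double-check which interpreter and which coremltools build the venv actually picks up after the downgrade, here is a minimal sanity check (assuming the downgraded package now imports cleanly; if the import still raises the AttributeError above, the newer release is still installed):

import sys
import coremltools

print(sys.version)              # should report 2.7.x inside this venv
print(coremltools.__version__)  # confirms which coremltools release is installed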

Related

module 'minisam' has no attribute 'DiagonalLoss'

I am using the PyICP GitHub repo. I built Sophus from this commit (commit a0fe89a323e20c42d3cecb590937eb7a06b8343a) Reference.
I am using Ubuntu 22 and a virtual environment (venv with Python 3.7.14) to build the PyICP, minisam, and Sophus repos.
Finally, when I run the command python3 main_icp_slam.py, it runs into this error:
(venv) shubham@shubhamubuntu:~/Lidar_Slam/minisam$ python3 main_icp_slam.py
Traceback (most recent call last):
File "main_icp_slam.py", line 50, in <module>
PGM = PoseGraphManager()
File "/home/shubham/Lidar_Slam/minisam/utils/PoseGraphManager.py", line 9, in __init__
self.prior_cov = minisam.DiagonalLoss.Sigmas(np.array([1e-6, 1e-6, 1e-6, 1e-4, 1e-4, 1e-4]))
AttributeError: module 'minisam' has no attribute 'DiagonalLoss'
An image view of this error is HERE.
I am not sure whether it is caused by Sophus, Eigen, the venv, or minisam.
Let me know if any more information is needed.
Thanks for any and all help.
It is probably because you have multiple versions of Python on your system and you have installed minisam for a different Python version. When you run cmake (while installing minisam), look at the version it reports for the Python executable and also check the Python version of your root install (or of the environment you are using); they should match.
If they don't match, either create an environment with the Python version minisam is installed for, or run your code as follows (say it is installed for Python 3.6):
$ python3.6 main_icp_slam.py
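A quick way to see which interpreter and which minisam build the script is actually picking up (a minimal diagnostic sketch; it only uses standard module attributes and assumes minisam imports at all, which the traceback above shows it does):

import sys
import minisam

print(sys.executable)    # the interpreter actually running this script
print(minisam.__file__)  # where this minisam module was loaded from
print(dir(minisam))      # DiagonalLoss should appear here if the right build is installed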

AttributeError: module 'sst' has no attribute 'train_reader'

I am very new to sentiment analysis. Trying to use the Stanford Sentiment Treebank (sst) and ran into an error.
from nltk.tree import Tree
import os
import sst
trees = r"C:\Users\m\data\trees"
tree, score = next(sst.train_reader(trees))
[Output]:
AttributeError Traceback (most recent call last)
<ipython-input-19-4101f90b0b16> in <module>()
----> 1 tree, score = next(sst.train_reader(trees))
AttributeError: module 'sst' has no attribute 'train_reader'
I think you're looking for https://github.com/JonathanRaiman/pytreebank, not https://pypi.org/project/sst/.
On the Python side, that error is pretty clear. Even once you import the right package, though, I'm not sure I saw a train_reader, but I could be wrong.
UPDATE:
I'm not entirely sure why 'sst' is missing the train_reader attribute for you. Make sure you didn't accidentally install the 'sst' package if you're using conda. It looks like 'sst' refers to a privately created module, and that one should work.
I got your import working. What I did was:
1. Installed everything specified in the requirements.txt file.
2. import sst was still giving me an error, so I installed nltk and sklearn to resolve that issue. (FYI, I'm not using conda; I'm just using pip and virtualenv for my own private package setup. I ran pip install nltk and pip install sklearn.)
At this point, import sst worked for me.
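If the pytreebank package linked above is what you actually want, here is a minimal sketch of loading the treebank with it (the load_sst and to_labeled_lines names are taken from its README; treat them as an assumption and check them against the version you install):

import pytreebank

# Downloads and parses the Stanford Sentiment Treebank on first use
dataset = pytreebank.load_sst()
example = dataset["train"][0]

# Each entry is a labeled tree; to_labeled_lines() yields (label, sentence) pairs
for label, sentence in example.to_labeled_lines():
    print(label, sentence)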
I guess you're importing the sst package from selenium-simple-test, which is not what you're looking for.
Try sst.discover(). If you get the error
TypeError: discover() missing 4 required positional arguments: 'test_loader', 'package', 'dir_path', and 'names'
then you are using the selenium-simple-test package.

AttributeError: module 'cupy' has no attribute 'cupyx'

I have this Python code; when I run it, it says
AttributeError: module 'cupy' has no attribute 'cupyx'
Code:
# upload matrix and inverse diagonal GPU
A = cp.cupyx.scipy.sparse.csr_matrix(A)
I've installed CuPy successfully in Docker using
pip install cupy-cuda100
Any help will be appreciated, thanks.
See the discussion in https://github.com/cupy/cupy/issues/2654 and try the following
import cupyx.scipy.sparse
cupyx.scipy.sparse.csr_matrix(A)
The alias cupy.cupyx was unintentionally there in some prereleases, but it has been removed because it is too confusing.
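For a self-contained version of that fix, here is a minimal sketch (the random test matrix and variable names are just for illustration; it needs a CUDA-capable GPU to run):

import cupyx.scipy.sparse
import scipy.sparse

# Build a sparse CSR matrix on the CPU, then copy it to the GPU
A_cpu = scipy.sparse.random(1000, 1000, density=0.01, format='csr')
A_gpu = cupyx.scipy.sparse.csr_matrix(A_cpu)

print(A_gpu.shape, A_gpu.nnz)  # same shape and non-zero count, now stored on the GPU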

AttributeError: module 'shodan' has no attribute 'Shodan'

I need to perform a bulk WHOIS query using the Shodan API.
I came across this code:
import shodan
api = shodan.Shodan('inserted my API-KEY- within single quotes')
info = api.host('8.8.8.8')
After running the module, I get the following error:
Traceback (most recent call last):
File "C:/Users/PIPY/AppData/Local/Programs/Python/Python37/dam.py", line 1, in
import shodan
File "C:/Users/PIPY/AppData/Local/Programs/Python/Python37\shodan.py", line 2, in
api = shodan.Shodan('the above insereted API KEY')
AttributeError: module 'shodan' has no attribute 'Shodan'
I'm learning python and have limited scripting/programming experience.
Could you please help me out?
Cheers
You seem to have both dam.py and shodan.py – Python puts the script's own directory first on the import path, so your local shodan.py masks the installed shodan package.
Try renaming shodan.py to e.g. shodan_test.py (and of course fixing up any imports, etc.).
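One way to confirm which file is actually being imported (a small sketch that only relies on the standard __file__ attribute):

import shodan

# If this prints a path inside your own project folder rather than
# site-packages, your local shodan.py is shadowing the installed library.
print(shodan.__file__)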
I solved the issue by re-installing the shodan module from C:\Users\PIPY\AppData\Local\Programs\Python\Python37\Scripts with pip install shodan.
Thank you for the help, AKX.
I had this same issue, but after renaming my file to something other than shodan.py, I also had to delete the compiled bytecode file shodan.pyc to avoid the error.
Also, if you have more than one version of Python installed, e.g. python2 and python3, use
python -m pip install shodan instead of pip install shodan, to ensure that you are installing the library into the same Python version you use to execute your script.
If you are executing your script with python3 shodan_test.py, then use python3 -m pip install shodan

How to load a pcd file in PyCharm

I use python-pcl and want to load a .pcd file.
My code is:
cloud_blob = pcl.load('./Downloads/table_scene_lms400.pcd')
This code works fine in the shell, but in PyCharm I always get this error:
[pcl::PCDReader::readHeader] Could not find file './Downloads/table_scene_lms400.pcd'
I don't know why.
First install pypcd:
pip install pypcd
Then run:
import pypcd
pc = pypcd.PointCloud.from_path('table_scene_lms400.pcd')
Reference taken from here
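Note that the original error is about the relative path: PyCharm's working directory is usually the project folder rather than your home directory, so './Downloads/...' no longer resolves there. A minimal sketch that builds an absolute path instead (the file location is assumed from the question; the same idea works with pcl.load):

import os
import pypcd

# Resolve the file against the home directory so loading no longer depends on
# whichever working directory PyCharm (or the shell) starts the process in.
pcd_path = os.path.expanduser('~/Downloads/table_scene_lms400.pcd')
pc = pypcd.PointCloud.from_path(pcd_path)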
