AttributeError: module 'utils' has no attribute 'load_data' - python

I am working on a project in which the utils library is used frequently. However, I keep running into a problem:
import numpy as np
import pandas as pd
import featuretools as ft
import utils
data_path = 'dataturbo/train_FD003.txt'
data = utils.load_data(data_path)
data.head()
How can I solve this? I keep encountering this problem.

For those who want to know the solution: I believe he is testing a package from Featuretools,
https://github.com/Featuretools/predict-remaining-useful-life
The utils package here is not something you install from conda or pip; it is a utils.py script you can download from the Featuretools GitHub page.
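As a minimal sketch, assuming you have downloaded utils.py from that repository and saved it next to your script, the import then resolves and load_data works on the data file from the question:
# utils.py downloaded from
# https://github.com/Featuretools/predict-remaining-useful-life
# and placed in the same directory as this script
import utils

data = utils.load_data('dataturbo/train_FD003.txt')  # path from the question
data.head()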

Related

AttributeError: module 'tensorly' has no attribute 'decomposition'

I'm using a Python package (tensorly) in which I don't seem to have access to all the modules.
For example, if I try to use the 'decomposition' module:
Python version: 3.9.12
tensorly version: 0.7
I run:
pip3 install tensorly
python3 main.py
main.py:
### imports ###
import tensorly
### tensor decomposition ###
cp = tensorly.decomposition.CP(n)
Output:
AttributeError: module 'tensorly' has no attribute 'decomposition'
PS: When I go to /.local/lib/python3.9/site-packages/tensorly, the decomposition module is there, and when I print my sys.path the path to this same site-packages directory is listed.
I have the same problem with another package (cobrapy) and on other machines with other Python versions (3.6).
Update:
Now I have the exact same problem with scikit-learn:
from sklearn.preprocessingcessing import StandardScaler
Output:
No module named 'sklearn.preprocessingcessing'
Even though this package worked really well before (no error with .preprocessingcessing), this error popped up randomly today...
You have to explicitly import the submodule you want to use if it isn't loaded by default (you can check the package's __init__.py file to see which modules are imported by default).
In other words, first import decomposition:
import tensorly
import tensorly.decomposition
Or directly import the decomposition methods you want to use:
from tensorly.decomposition import CP
You also have a typo in your scikit-learn example: it should be sklearn.preprocessing, not sklearn.preprocessingcessing.
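For completeness, a minimal sketch of the corrected usage; it assumes tensorly 0.7's class-based API (CP taking a rank argument) and fixes the sklearn spelling:
import numpy as np
import tensorly as tl
from tensorly.decomposition import CP             # explicit submodule import
from sklearn.preprocessing import StandardScaler  # corrected spelling

# rank=3 is an arbitrary choice for this sketch
tensor = tl.tensor(np.random.rand(4, 5, 6))
cp_tensor = CP(rank=3).fit_transform(tensor)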

ModuleNotFoundError: No module named 'modeling'

I'm very new to deep learning and Python, and I'm trying to recreate the project at https://github.com/Nagakiran1/Extending-Google-BERT-as-Question-and-Answering-model-and-Chatbot
Since the file named Bert_QuestionAnswer.ipynb and data.txt are the only differences I see from the original BERT repository, I simply loaded the notebook into my Google Drive and opened it to see it in use.
When I run the first portion, though, I get the ModuleNotFoundError: No module named 'modeling' error.
What library is it part of?
For somebody else this was the problem:
It looks like it's trying to import from the github repo source rather
than the pip package.
If you are running this in a directory that contains the BERT github
repo, try running it elsewhere.
As always many thanks for the help.
This is the code of the file that throws the error:
from __future__ import absolute_import
from __future__ import division
from __future__ import print_function
from IPython.core.debugger import set_trace
import collections
import json
import math
import os
import random
import modeling
import optimization
import tokenization
import six
import os
import tensorflow as tf
import logging
logging.getLogger('tensorflow').disabled = True
os.environ['TF_CPP_MIN_LOG_LEVEL'] = '3'
import warnings
warnings.filterwarnings("ignore")
import time
from pandas import Series
from nltk.tokenize import sent_tokenize
import gensim.downloader as api
from gensim.parsing.preprocessing import remove_stopwords
word_vectors = api.load("glove-wiki-gigaword-100") # load pre-trained word-vectors from gensim-data
You need to tell python where this module is:
import sys
sys.path.append("/path/to/your/bert/repo")
Python searches its system folders plus the directory of the script you run (and anything else on sys.path). If you don't run the script from inside the repo, Python cannot find this module.
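A slightly fuller sketch, assuming the BERT repository (google-research/bert) has been cloned locally; the path below is only an example and must point at your own clone:
import os
import sys

BERT_REPO = os.path.expanduser("~/bert")  # assumed location of the cloned BERT repo
sys.path.append(BERT_REPO)

# modeling.py, optimization.py and tokenization.py sit at the top level of
# that repo, so these imports now resolve:
import modeling
import optimization
import tokenization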

ImportError: No module named deepmolecule.rdkit_utils

When I tried running my Python script, I got this error:
ImportError: No module named deepmolecule.rdkit_utils
I searched for "deepmolecule.rdkit_utils" on Google, but there is no information about that module.
How can I solve this problem?
These are the imports in the Python script:
import csv
import subprocess
import numpy as np
import numpy.random as npr
import matplotlib.pyplot as plt
import copy
from deepmolecule.rdkit_utils import smile_to_fp
from rdkit.Chem import Descriptors
from rdkit import Chem
from rdkit.Chem import rdMolDescriptors
Apparently, the module was renamed to neuralfingerprint and is published as nfp on PyPI. Hence, you can install it by running pip install nfp in your shell. Note that you might need to change the name of the module in your script.

ModuleNotFoundError: No module named 'utils'

I'm trying to run the object_detection API in Tensorflow using my webcam as an input.
The error says: "from utils import label_map_util ModuleNotFoundError: No module named 'utils'"
Which relates to the lines:
from utils import label_map_util
from utils import visualization_utils as vis_util
I've tried "pip install util", which appears to work but doesn't solve the problem. I have also reinstalled multiple versions of protobuf, as other questions online suggest this as the solution. I don't get any errors when I install protoc, so I don't think this is the issue.
I'm using Python 3.6 on Windows 10 with tensorflow-gpu.
Add object_detection in front of utils:
# from utils import label_map_util
# from utils import visualization_utils as vis_util
from object_detection.utils import label_map_util
from object_detection.utils import visualization_utils as vis_util
What folder are you running your python script from?
To be able to access the 'utils' module directly, you need to be running the script inside the <models-master>\research\object_detection folder.
Instead of running the script inside the object detection folder, append the path of the TensorFlow object detection folder in your script:
import sys
sys.path.append('PATH_TO_TENSORFLOW_OBJECT_DETECTION_FOLDER')
e.g. 'PATH_TO_TENSORFLOW_OBJECT_DETECTION_FOLDER' on my Ubuntu system is
/home/dc-335/Documents/Softwares/tensorflow/models/research/object_detection
Cheers, you're done!
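Putting that together with the imports from the question, a minimal sketch (the path is only an example and has to point at your own checkout of tensorflow/models):
import sys

# Example path only: research/object_detection inside your clone of the
# tensorflow/models repository.
sys.path.append('/home/<user>/models/research/object_detection')

from utils import label_map_util
from utils import visualization_utils as vis_util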
I used a quicker method to fix it: I copied the utils folder from models\research\object_detection and pasted it into the same directory as the Python file that required utils.
If the installation didn't go through, you will notice there is no module called model_utils in your project folder. Uninstall it with pip uninstall django-model-utils, then install it again with pip install django-model-utils; you should then get a new app called model_utils in your project folder.
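A quick way to check that the reinstall actually produced the model_utils package (a sketch; run it in the same Python environment you installed into):
# After: pip uninstall django-model-utils && pip install django-model-utils
import model_utils
print(model_utils.__file__)  # should point into site-packages, not your project folder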
I used the quick method:
!pip install utils
and it works properly.

Python ImportError: cannot import name datafunc [PyML]

I have installed the PyML package in order to use some machine learning algorithms, and according to the tutorial, my installation was successful.
I try to run a Python script which includes the following line to import modules from PyML:
from PyML import datafunc,svm,assess,modelSelection,ker
However I get an error message saying:
File "<stdin>", line 1, in <module>
ImportError: cannot import name datafunc
From the terminal I check every module separately:
from PyML import datafunc
from PyML import svm
from PyML import ker
I only get the error message for datafunc. The PyML library is under the site-packages folder of Python 2.7.
I checked this question here, Python error: ImportError: cannot import name Akismet, but I couldn't see how it would help my problem.
Do you have any idea why Python imports some modules but does not import this one?
In PyML-0.7.13.3, the datafunc module lives in the PyML/containers directory.
So it seems that you can import the module as follows:
from PyML.containers import datafunc
However, this raises an error because the datafunc module uses the undefined classes BaseVectorDataSet and SparseDataSet.
Thus you need to modify the source of PyML in order to use the datafunc module.
First, prepend the following two lines to PyML/containers/datafunc.py
and re-install the PyML library.
from PyML.containers.baseDatasets import BaseVectorDataSet
from PyML.containers.vectorDatasets import SparseDataSet
Then you can import the modules as follows:
from PyML import svm, modelSelection, ker
from PyML.containers import datafunc
from PyML.evaluators import assess
BTW, I recommend that you use a better documented and tested machine learning library, such as scikit-learn.
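For reference, a minimal scikit-learn sketch covering roughly the same ground as the PyML imports above (an SVM with cross-validated model selection); the dataset and parameter grid are just placeholders:
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# Grid search over the SVM hyperparameters, analogous to PyML's
# svm + modelSelection workflow.
search = GridSearchCV(SVC(), {"C": [0.1, 1, 10], "kernel": ["linear", "rbf"]}, cv=5)
search.fit(X, y)
print(search.best_params_, search.best_score_)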
