Import ".utils" could not be resolved(reportMissingImports) - python

import torch
from torch import nn
from torch.nn import functional as F
from .utils import (
round_filters,
round_repeats,
drop_connect,
get_same_padding_conv2d,
get_model_params,
efficientnet_params,
load_pretrained_weights,
Swish,
MemoryEfficientSwish,
calculate_output_image_size
)
I tried a lot, but I couldn't find a solution to fix this error.

from .utils import round_filters
It gives the error "attempted relative import with no known parent package"; see the discussion of this error in the related Stack Overflow thread.
You need to install the EfficientNet-PyTorch package:
!pip install EfficientNet-PyTorch
Then use the parent package in the import, i.e. from efficientnet_pytorch.utils import round_filters.
So your code should be:
import torch
from torch import nn
from torch.nn import functional as F
from efficientnet_pytorch.utils import (
round_filters,
round_repeats,
drop_connect,
get_same_padding_conv2d,
get_model_params,
efficientnet_params,
load_pretrained_weights,
Swish,
MemoryEfficientSwish,
calculate_output_image_size
)
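Once the package is installed, a quick sanity check that the absolute imports resolve looks like the sketch below; from_name('efficientnet-b0') builds the smallest variant without downloading pretrained weights, and the printed value is just an arbitrary use of the imported helpers.
from efficientnet_pytorch import EfficientNet
from efficientnet_pytorch.utils import get_model_params, round_filters

# Build an un-pretrained b0 just to confirm the package is importable.
model = EfficientNet.from_name('efficientnet-b0')

# The utility helpers imported above operate on the model's global parameters.
blocks_args, global_params = get_model_params('efficientnet-b0', override_params=None)
print(round_filters(32, global_params))  # width-scaled stem channels for b0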

Related

Cannot import name 'functional_datapipe' from 'torch.utils.data'

When I am running datasets_utils.py from '/usr/local/lib/python3.7/dist-packages/torchtext/data/datasets_utils.py' in Google Colab, the following error occurs even with the most updated versions of Python packages:
ImportError: cannot import name 'functional_datapipe' from 'torch.utils.data' (/usr/local/lib/python3.7/dist-packages/torch/utils/data/__init__.py)
Are there any solutions to such errors? I could not find functional_datapipe even in the official torch.utils.data documentation. The following is an excerpt from datasets_utils.py in the Google Colab environment:
import functools
import inspect
import os
import io
import torch
from torchtext.utils import (
validate_file,
download_from_url,
extract_archive,
)
from torch.utils.data import functional_datapipe, IterDataPipe
from torch.utils.data.datapipes.utils.common import StreamWrapper
import codecs
It might be available only in torchdata.datapipes.
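Taking that suggestion as a sketch (assuming either a torch build new enough to export the name, or a separately installed torchdata whose datapipes module exposes it), a guarded import can fall back to torchdata:
# Newer torch versions export functional_datapipe directly; older ones may only
# have it in the separate torchdata package (assumption: torchdata is installed).
try:
    from torch.utils.data import functional_datapipe, IterDataPipe
except ImportError:
    from torchdata.datapipes import functional_datapipe
    from torchdata.datapipes.iter import IterDataPipe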

Module 'dateutil' has no attribute 'parser' - Dateutil - Python

I am trying to download a pre-trained TensorFlow model. I am using the code below:
import numpy as np
import time
import PIL.Image
import IPython.display as display
import matplotlib.pylab as plt
import tensorflow as tf
import tensorflow_hub as hub
import datetime
from tensorflow.keras.preprocessing import image
from dateutil import parser
from keras.applications.inception_v3 import InceptionV3
model = InceptionV3()
model.summary()
I am getting the following error
AttributeError: module 'dateutil' has no attribute 'parser'
I am using Python 3.7, TF 2.7, and python-dateutil 2.8.1.
Please help me fix this. Thank you :)
The correct import syntax is:
import dateutil.parser
and then:
parser.parse(time_string)
or:
from dateutil.parser import parse
parse(time_string)
Documentation: https://dateutil.readthedocs.io/en/stable/parser.html
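For example, with an arbitrary timestamp string:
from dateutil.parser import parse

dt = parse("2021-12-01T10:30:00")   # returns a datetime.datetime
print(dt.year, dt.month, dt.day)    # 2021 12 1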

cannot import name 'imagenet_utils' from 'tensorflow.keras.applications'

I have the following Python imports within a Jupyter Notebook.
import numpy as np
import tensorflow as tf
from tensorflow import keras
from tensorflow.keras.layers import Dense, Activation
from tensorflow.keras.optimizers import Adam
from tensorflow.keras.metrics import categorical_crossentropy
from tensorflow.keras.preprocessing.image import ImageDataGenerator
from tensorflow.keras.preprocessing import image
from tensorflow.keras.models import Model
from tensorflow.keras.applications import imagenet_utils
from sklearn.metrics import confusion_matrix
import itertools
import os
import shutil
import random
import matplotlib.pyplot as plt
%matplotlib inline
But I keep getting the following error:
ImportError: cannot import name 'imagenet_utils' from 'tensorflow.keras.applications' (C:\ProgramData\Anaconda3\lib\site-packages\tensorflow\python\keras\api\_v2\keras\applications\__init__.py)
When I search for "cannot import name 'imagenet_utils' from 'tensorflow.keras.applications'" in Google, I don't get much helpful information.
Has anyone come across this at all?
change
from tensorflow.keras.applications import imagenet_utils
to
from keras.applications import imagenet_utils
I managed to solve my issue.
First I ran the following to update all modules:
conda update --all
Then I used 'from keras.applications import imagenet_utils'
instead of 'from tensorflow.keras.applications import imagenet_utils'.
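As a rough sketch of how imagenet_utils is typically used once the import resolves (the dummy arrays below are placeholders for a real image batch and real model predictions):
import numpy as np
from keras.applications import imagenet_utils

# Dummy batch standing in for a real 224x224 RGB image tensor.
batch = np.random.rand(1, 224, 224, 3) * 255.0
batch = imagenet_utils.preprocess_input(batch)   # scale/center pixel values

# Map class probabilities from some model to ImageNet labels.
preds = np.random.rand(1, 1000)
preds = preds / preds.sum()
print(imagenet_utils.decode_predictions(preds, top=3))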

Do I have a circular import in Python?

I'm trying to port a Python project from v2.x to v3.x; one of the major changes in Python 3 was the import system.
I am now seeing an error when trying to load my Python notebook, as follows:
package/
__init__.py
bh_tsne.py
Collect Samples.ipynb //imports utils.list_all_files, sees error
Error Output
---------------------------------------------------------------------------
ImportError Traceback (most recent call last)
<ipython-input-2-1339232cd15c> in <module>()
1 import numpy as np
2 from os.path import join
----> 3 from utils.list_all_files import list_all_files
4 from multiprocessing import Pool
/~/AudioNotebooks/utils/__init__.py in <module>()
4 from . import show_array
5 from . import make_mosaic
----> 6 from . import bh_tsne
7 from . import normalize
8 from . import mkdir_p
ImportError: cannot import name 'bh_tsne'
Strangely, I think the problem is a circular dependency, but bh_tsne doesn't rely on any utilities. Could the circularity be coming from my utils.list_all_files and then the __init__.py?
bh_tsne imports
from argparse import ArgumentParser, FileType
from os.path import abspath, dirname, isfile, join as path_join
from shutil import rmtree
from struct import calcsize, pack, unpack
from subprocess import Popen
from sys import stderr, stdin, stdout
from tempfile import mkdtemp
from platform import system
from os import devnull
import numpy as np
import os, sys
import io
Edit
Is that redundant os.path join perhaps the root cause?
I ended up just upgrading the wrapper that was used in the project from its upstream source project. The original owner had already done the upgrades.
https://github.com/lvdmaaten/bhtsne/blob/master/bhtsne.py
and the import worked after that as
import utils.bhtsne as bhtsne
I found out that bh_tsne does NOT seem to work with Python 3. Another version (Multicore TSNE) also only works with Python 2.7.
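One way to diagnose this kind of failure (a small sketch; 'utils.bh_tsne' follows the questioner's package layout) is to import the suspect submodule directly, outside of utils/__init__.py, so the underlying exception is visible:
import importlib
import traceback

# Importing the submodule directly can surface the real cause (for example a
# Python-2-only syntax error) instead of the package-level
# "cannot import name 'bh_tsne'" message.
try:
    importlib.import_module('utils.bh_tsne')
except Exception:
    traceback.print_exc()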

About Graphlab library importing

In Ubuntu 14.04, I have installed Graphlab based on https://dato.com/download/install-graphlab-create-command-line.html and it seems to be working fine.
However, I receive this error when trying to use a recommender module:
import graphlab
from graphlab.recommender import ranking_factorization_recommender
In the first line, graphlab is imported without any error. However, the second line causes this error:
---------------------------------------------------------------------------
ImportError Traceback (most recent call last)
<ipython-input-5-34df81ffb957> in <module>()
----> 1 from graphlab.recommender import ranking_factorization_recommender
ImportError: No module named recommender
How can the problem be solved? Thanks
It's just a namespace issue. recommender actually lives in the toolkits module, so this should work:
import graphlab
from graphlab.toolkits.recommender import ranking_factorization_recommender
Graphlab has already imported everything for you in its __init__.py file.
Just do:
from graphlab import ranking_factorization_recommender
from graphlab import <any_other_recommender>
Here is a snippet of graphlab's __init__.py file:
from graphlab.util import get_runtime_config
from graphlab.util import set_runtime_config
import graphlab.connect as _mt
import graphlab.connect.aws as aws
from . import visualization
import os as _os
import sys as _sys
if _sys.platform != 'win32' or \
(_os.path.exists(_os.path.join(_os.path.dirname(__file__), 'cython', 'libstdc++-6.dll')) and \
_os.path.exists(_os.path.join(_os.path.dirname(__file__), 'cython', 'libgcc_s_seh-1.dll'))):
from graphlab.data_structures.sgraph import Vertex, Edge
from graphlab.data_structures.sgraph import SGraph
from graphlab.data_structures.sarray import SArray
from graphlab.data_structures.sframe import SFrame
from graphlab.data_structures.sketch import Sketch
from graphlab.data_structures.image import Image
from graphlab.data_structures.sgraph import load_sgraph, load_graph
from graphlab.toolkits._model import Model, CustomModel
import graphlab.aggregate
import graphlab.toolkits
import graphlab.toolkits.clustering as clustering
import graphlab.toolkits.distances as distances
...
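Once the import resolves, usage looks roughly like the sketch below (the tiny SFrame and its user_id/item_id column names are made up for illustration):
import graphlab
from graphlab import ranking_factorization_recommender

# Tiny made-up observation table: which users interacted with which items.
sf = graphlab.SFrame({'user_id': ['a', 'a', 'b', 'c'],
                      'item_id': ['x', 'y', 'x', 'z']})

# Train on the implicit observations and recommend 2 items for user 'a'.
model = ranking_factorization_recommender.create(sf, user_id='user_id', item_id='item_id')
print(model.recommend(users=['a'], k=2))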
