ImportError: cannot import name 'AutoModelWithLMHead' from 'transformers'

This is literally all the code that I am trying to run:
from transformers import AutoModelWithLMHead, AutoTokenizer
import torch
tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-small")
model = AutoModelWithLMHead.from_pretrained("microsoft/DialoGPT-small")
I am getting this error:
---------------------------------------------------------------------------
ImportError Traceback (most recent call last)
<ipython-input-14-aad2e7a08a74> in <module>
----> 1 from transformers import AutoModelWithLMHead, AutoTokenizer
2 import torch
3
4 tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-small")
5 model = AutoModelWithLMHead.from_pretrained("microsoft/DialoGPT-small")
ImportError: cannot import name 'AutoModelWithLMHead' from 'transformers' (c:\python38\lib\site-packages\transformers\__init__.py)
What do I do about it?

I solved it! Apparently AutoModelWithLMHead has been removed in my version of transformers.
You now need to use AutoModelForCausalLM for causal language models, AutoModelForMaskedLM for masked language models, and AutoModelForSeq2SeqLM for encoder-decoder models.
So in my case the code looks like this:
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch
tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-small")
model = AutoModelForCausalLM.from_pretrained("microsoft/DialoGPT-small")
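For completeness, a minimal sketch of a single chat turn with the replacement class (the prompt text and max_length below are illustrative choices, not part of the original question):
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-small")
model = AutoModelForCausalLM.from_pretrained("microsoft/DialoGPT-small")

# Encode one user turn; DialoGPT expects the EOS token as a turn separator.
input_ids = tokenizer.encode("Hello, how are you?" + tokenizer.eos_token, return_tensors="pt")

# Generate a reply; pad_token_id is set to EOS because GPT-2-style models have no pad token.
reply_ids = model.generate(input_ids, max_length=100, pad_token_id=tokenizer.eos_token_id)

# Decode only the newly generated tokens (everything after the prompt).
print(tokenizer.decode(reply_ids[0, input_ids.shape[-1]:], skip_special_tokens=True))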

Related

ModuleNotFoundError: No module named 'classification_models.resnet'

from classification_models.resnet import ResNet18, preprocess_input
model = ResNet18((224, 224, 3), weights='imagenet')
ModuleNotFoundError Traceback (most recent call last)
<ipython-input-43-ff8b34ae99fa> in <module>()
----> 1 from classification_models.resnet import ResNet18, preprocess_input
2
3 model = ResNet18((224, 224, 3), weights='imagenet')
ModuleNotFoundError: No module named 'classification_models.resnet'
I am not able to import the pre-trained ResNet18 model on Google Colab. Please help.
You can access all the ResNet models (ResNet50, ResNet101, ResNet152) and their preprocess_input() easily from the tf.keras.applications API, as below:
from tensorflow.keras.applications.resnet50 import ResNet50
from tensorflow.keras.applications.resnet50 import preprocess_input
Please check this link for more details on how to use and apply the ResNet models.
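A minimal sketch of loading ResNet50 with ImageNet weights and classifying one image (the file name 'elephant.jpg' is a placeholder, not from the original question):
import numpy as np
from tensorflow.keras.applications.resnet50 import ResNet50, preprocess_input, decode_predictions
from tensorflow.keras.preprocessing import image

model = ResNet50(weights='imagenet')

# Load and preprocess a single 224x224 RGB image.
img = image.load_img('elephant.jpg', target_size=(224, 224))
x = preprocess_input(np.expand_dims(image.img_to_array(img), axis=0))

# Predict and decode the top-3 ImageNet labels.
preds = model.predict(x)
print(decode_predictions(preds, top=3)[0])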

HuggingFace SciBert AutoModelForMaskedLM cannot be imported

I am trying to use the pretrained SciBERT model (https://huggingface.co/allenai/scibert_scivocab_uncased) from Huggingface to evaluate masked words in scientific/biomedical text for bias using CrowS-Pairs (https://github.com/nyu-mll/crows-pairs/). The CrowS-Pairs code works great with the built-in models like BERT.
I modified the code of metric.py to add the option of using the SciBERT model:
import os
import csv
import json
import math
import torch
import argparse
import difflib
import logging
import numpy as np
import pandas as pd
from transformers import BertTokenizer, BertForMaskedLM
from transformers import AlbertTokenizer, AlbertForMaskedLM
from transformers import RobertaTokenizer, RobertaForMaskedLM
from transformers import AutoTokenizer, AutoModelForMaskedLM
and get the following error
2021-06-21 17:11:38.626413: I tensorflow/stream_executor/platform/default/dso_loader.cc:53] Successfully opened dynamic library libcudart.so.11.0
Traceback (most recent call last):
File "metric.py", line 15, in <module>
from transformers import AutoTokenizer, AutoModelForMaskedLM
ImportError: cannot import name 'AutoModelForMaskedLM' from 'transformers' (/usr/local/lib/python3.7/dist-packages/transformers/__init__.py)
Later in the Python file, the AutoTokenizer and AutoModelForMaskedLM are defined as
tokenizer = AutoTokenizer.from_pretrained("allenai/scibert_scivocab_uncased")
model = AutoModelForMaskedLM.from_pretrained("allenai/scibert_scivocab_uncased")
Libraries
huggingface-hub-0.0.8
sacremoses-0.0.45
tokenizers-0.10.3
transformers-4.7.0
The error occurs with and without GPU support.
Try this:
tokenizer = BertTokenizer.from_pretrained("allenai/scibert_scivocab_uncased", do_lower_case=True)
model = BertForMaskedLM.from_pretrained("allenai/scibert_scivocab_uncased")
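If the BertForMaskedLM import succeeds, a small sketch of querying SciBERT for a masked token looks like this (the example sentence is made up for illustration):
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained("allenai/scibert_scivocab_uncased", do_lower_case=True)
model = BertForMaskedLM.from_pretrained("allenai/scibert_scivocab_uncased")

text = f"The protein binds to the {tokenizer.mask_token} receptor."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Find the position of the [MASK] token and take the five most likely fillers.
mask_idx = (inputs.input_ids == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
top_ids = logits[0, mask_idx].topk(5).indices[0]
print(tokenizer.convert_ids_to_tokens(top_ids.tolist()))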

NLP / ModuleNotFoundError

import numpy as np
%matplotlib notebook
import matplotlib.pyplot as plt
plt.style.use('ggplot')
from sklearn.manifold import TSNE
from sklearn.decomposition import PCA
from gensim.test.utils import datapath, get_tmpfile
from gensim.models import KeyedVectors
from gensim.scripts.glove2word2vec import glove2word2vec
Hello everyone, I hope you are all doing well. I am new to DL and NLP, and as I am learning I came across this error. Can anyone help me solve this issue? Thank you all.
---------------------------------------------------------------------------
ModuleNotFoundError Traceback (most recent call last)
<ipython-input-13-8db7eebe6f3e> in <module>
7 from sklearn.decomposition import PCA
8
----> 9 from gensim.test.utils import datapath, get_tmpfile
10 from gensim.models import KeyedVectors
11 from gensim.scripts.glove2word2vec import glove2word2vec
ModuleNotFoundError: No module named 'gensim'
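The traceback means gensim is simply not installed in the environment the notebook kernel is using; installing it into that same environment (for example, from a notebook cell) usually resolves it:
# Install gensim into the kernel's environment, then restart the kernel and re-run the imports.
%pip install gensim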

cannot import name 'TfidfVectorizer' from 'sklearn.feature_extraction'

I am trying to do a topic modeling project, but when I use
from sklearn.feature_extraction import TfidfVectorizer
I receive this error. My installed scikit-learn version is 0.24.1. I would be grateful if anyone could help me.
ImportError Traceback (most recent call last)
<ipython-input-2-5ae89ed22b7e> in <module>
----> 1 from sklearn.feature_extraction import TfidfVectorizer
ImportError: cannot import name 'TfidfVectorizer' from 'sklearn.feature_extraction' (C:\Users\mozha\Anaconda3\envs\spyder-env\lib\site-packages\sklearn\feature_extraction\__init__.py)
You have to import vectorizers like TfidfVectorizer from sklearn.feature_extraction.text and not sklearn.feature_extraction.
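So the fix is a one-line change to the import (the toy documents below are only for illustration):
from sklearn.feature_extraction.text import TfidfVectorizer

docs = ["topic modeling with tf-idf", "tf-idf weighting for topic models"]
X = TfidfVectorizer().fit_transform(docs)
print(X.shape)  # (2, vocabulary size)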

NameError: name 'gensim' is not defined

I've imported all the packages I need
from gensim import corpora
from gensim import models
from gensim.models import LdaModel
from gensim.models import TfidfModel
from gensim.models import CoherenceModel
and then I need to run the LdaMallet model, so I import it like this:
from gensim.models.wrappers import LdaMallet
When I run the code below, I get a NameError:
mallet_path = 'mallet-2.0.8/bin/mallet' # update this path
ldamallet = gensim.models.wrappers.LdaMallet(mallet_path,corpus=corpus, num_topics=20, id2word=dictionary)
Error occurred:
---------------------------------------------------------------------------
NameError Traceback (most recent call last)
<ipython-input-22-1c656d4f8c21> in <module>()
1 mallet_path = 'mallet-2.0.8/bin/mallet' # update this path
2
----> 3 ldamallet = gensim.models.wrappers.LdaMallet(mallet_path,corpus=corpus, num_topics=20, id2word=dictionary)
NameError: name 'gensim' is not defined
I thought I had imported everything I need, and the LDA model ran well before I tried to use Mallet. So what's the problem?
Because you have this import:
from gensim import models
you would need to refer to wrappers in your code as models.wrappers, etc., not gensim.models.wrappers.
But you're also doing this:
from gensim.models.wrappers import LdaMallet
so you can just refer to LdaMallet directly, as in:
ldamallet = LdaMallet(mallet_path, corpus=corpus, num_topics=20, id2word=dictionary)
Note that I left out the gensim.models.wrappers. here; you don't need it.
Just use LdaMallet(mallet_path, corpus=corpus, num_topics=20, id2word=dictionary) straightaway, because you have already imported the required class from gensim.models.wrappers.
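Putting both answers together, a corrected sketch (mallet_path still has to point at a local Mallet installation, and corpus and dictionary come from the questioner's earlier LDA code):
from gensim.models.wrappers import LdaMallet

mallet_path = 'mallet-2.0.8/bin/mallet'  # update this path to your Mallet install
ldamallet = LdaMallet(mallet_path, corpus=corpus, num_topics=20, id2word=dictionary)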
