I installed Scrapy on Fedora 20, and when I tried to create a new project it gave me the following error:
/usr/lib/python2.7/site-packages/Twisted-14.0.0-py2.7-linux-x86_64.egg/twisted/internet/_sslverify.py:184: UserWarning: You do not have the service_identity module installed. Please install it from <https://pypi.python.org/pypi/service_identity>. Without the service_identity module and a recent enough pyOpenSSL to support it, Twisted can perform only rudimentary TLS client hostname verification. Many valid certificate/hostname mappings may be rejected.
I tried searching for the service_identity module, but in vain.
Could anyone help me with this?
Thanks in advance.
Try pip install service_identity to resolve this issue.
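If the install succeeds, a quick way to confirm the warning is gone is to import the module directly. This is only a minimal sanity check, assuming the packages use their usual import names (service_identity, and OpenSSL for pyOpenSSL):

# sanity_check.py -- confirm the TLS helpers Twisted complains about are importable
import service_identity  # should import cleanly once installed via pip
import OpenSSL           # pyOpenSSL, which Twisted needs alongside service_identity

print("service_identity and pyOpenSSL are importable")

If both imports work, running Scrapy again should no longer print the UserWarning.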
I have Python 3.8 32-bit. When I try to install xgboost using pip, I get the following message. Does anyone know how to resolve it?
ERROR: Files/directories not found in C:\Users\xxxx\AppData\Local\Temp\pycharm-packaging\xgboost\pip-egg-info
Thank you in advance.
I failed to install Twisted with pip, so I manually downloaded the .whl file and installed it (version 18.7.0). Only after I did that could I install Scrapy; however, it seems that the Twisted package is not compatible with Python 3.7, and it keeps saying "syntax error".
I have tried some of the methods posted on GitHub about this issue (https://github.com/scrapy/scrapy/issues/3143), but none of them solved it. I wonder whether I need to switch to Python 3.6, because my spider can only set up its downloader and cannot parse web pages.
Could anyone please give me some advice?
Yes. Twisted does not yet support Python 3.7. Try Python 3.6 or earlier.
One of the comments from the issue helped me:
pip install git+https://github.com/scrapy/scrapy#master --no-dependencies --upgrade
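After installing from the master branch, it can be worth checking which Scrapy and Twisted versions the interpreter actually picks up. A minimal sketch, assuming both packages expose the standard __version__ attribute:

# version_check.py -- show which builds are actually being imported
import scrapy
import twisted

print("Scrapy:", scrapy.__version__)    # should reflect the git master install
print("Twisted:", twisted.__version__)  # 18.7.0 in the setup described above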
Hi, I am new to Python and am learning how to use stacks. I found some web examples that use this import:
from pythonds.basic.stack import Stack
But when I try to use this import, I get this error message back:
ImportError: No module named pythonds.basic.stack
I tried searching Google for where to install this module from, but can't seem to find it. Any help identifying where I can get it, or any other way to use a stack, will be appreciated!
You'll need to install it with pip, like so:
pip install pythonds
or you can download the tarball from PyPI and install it yourself.
This worked for me to fix that error:
sudo pip3 install pythonds
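Once pythonds is installed, the import from the question should work. Here is a small usage sketch; the push/pop/peek/isEmpty method names come from the pythonds Stack class, and a plain Python list is shown as an alternative if you would rather avoid the extra dependency:

# stack_demo.py -- basic stack operations with pythonds
from pythonds.basic.stack import Stack

s = Stack()
s.push(1)
s.push(2)
print(s.peek())     # 2 -- look at the top item without removing it
print(s.pop())      # 2 -- remove and return the top item
print(s.isEmpty())  # False -- the item 1 is still on the stack

# alternative without pythonds: a built-in list used as a stack
stack = []
stack.append(1)  # push
stack.append(2)
print(stack.pop())  # 2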
I just installed Scrapy and am looking to install the Portia UI for Scrapy on Python 2.7.6 (32-bit), Windows 7 (64-bit), with a working IPv4 internet connection.
I get the following warning in the output when I check the version:
C:\> scrapy version
:0: UserWarning: You do not have a working installation of the service_identity module: 'No module named service_identity'. Please install it from <https://pypi.python.org/pypi/service_identity> and make sure all of its dependencies are satisfied. Without the service_identity module and a recent enough pyOpenSSL to support it, Twisted can perform only rudimentary TLS client hostname verification. Many valid certificate/hostname mappings may be rejected.
Scrapy 0.22.2
The project can be created, but Scrapy also seems unresponsive to the spiders.
Scrapy was installed with all the Win32 versions of its dependencies, in the order described on the Scrapy site (http://doc.scrapy.org/en/latest/intro/install.html), using pip install or easy_install wherever required.
How should I go about fixing this problem?
This should help:
pip install service_identity
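If the warning persists after installing, it can also help to check that pyOpenSSL is recent enough alongside service_identity. A small sketch using pkg_resources (which ships with setuptools) to dump the installed versions:

# tls_versions.py -- show the versions of the packages the warning refers to
import pkg_resources

for name in ("pyOpenSSL", "service_identity", "Twisted"):
    try:
        print(name, pkg_resources.get_distribution(name).version)
    except pkg_resources.DistributionNotFound:
        print(name, "is not installed")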
I am new to Python. I installed Scrapy, but it gives the error "ImportError: no module named zope.interface twisted". Please help me.
Thanks in advance.
Scrapy requires a few additional libraries, namely Twisted and libxml2 (and optionally pyopenssl and simplejson). The installation instructions describe how to install these libraries.
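A quick way to see which of those dependencies is missing is to try importing each one and report the failures. This is only a sketch; the import names are assumed to be the usual ones for these packages (twisted, zope.interface, libxml2, OpenSSL for pyopenssl, simplejson):

# dependency_check.py -- report which Scrapy dependencies are importable
import importlib

for module in ("twisted", "zope.interface", "libxml2", "OpenSSL", "simplejson"):
    try:
        importlib.import_module(module)
        print(module, "OK")
    except ImportError as error:
        print(module, "MISSING:", error)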
I had the same error after following the steps Scrapy suggests, mainly using pip install Scrapy. Installing some Scrapy dependencies requires root permissions, and my mistake was creating a virtualenv for my Scrapy project. You may run into the same error after following the Scrapy setup steps inside a virtualenv. Just don't use a virtualenv.