Scrapy Install / Version Error - python

I just installed Scrapy and am looking to install the Portia UI for Scrapy on Python 2.7.6 (32-bit), Windows 7 (64-bit), with IPv4 DNS internet connectivity.
I get the following error output when I run a version check:
C:\> scrapy version
:0: UserWarning: You do not have a working installation of the service_identity module: 'No module named service_identity'. Please install it from <https://pypi.python.org/pypi/service_identity> and make sure all of its dependencies are satisfied. Without the service_identity module and a recent enough pyOpenSSL to support it, Twisted can perform only rudimentary TLS client hostname verification. Many valid certificate/hostname mappings may be rejected.
Scrapy 0.22.2
A project can be created, but Scrapy also seems unresponsive to the spiders.
Scrapy was installed with all the Win32 packages, in the order described on the Scrapy site (http://doc.scrapy.org/en/latest/intro/install.html), using pip install or easy_install wherever required.
How should I go about resolving this problem?

This should help:
pip install service_identity
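Once that finishes, a quick check along these lines can confirm that the two packages the warning mentions are importable (a minimal sketch; it assumes both service_identity and pyOpenSSL installed cleanly and expose a version string):
import service_identity
import OpenSSL
# If either import fails, Twisted falls back to rudimentary hostname verification.
print(service_identity.__version__)
print(OpenSSL.__version__)
Re-running scrapy version afterwards should no longer print the UserWarning.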

Related

Import ldap in python gives "DLL load failed" error

I'm using "import ldap" in Python code on a Windows 10 machine.
I installed the python-ldap module
pip3 install python-ldap
Installed the dependencies based on the instructions at Python Can't install packages
Also resolved all the pip deployment issues based on Installing python-ldap in a virtualenv on Windows
I'm now getting the following error when executing the import ldap statement. Am I missing something here? Any ideas on how to resolve it?
thon39\site-packages\ldap\__init__.py", line 34, in <module>
import _ldap
ImportError: DLL load failed while importing _ldap: The specified module could not be found.
Visit the unofficial Python binaries page:
https://www.lfd.uci.edu/~gohlke/pythonlibs/#python-ldap
Download the appropriate WHL package for your system.
For example, if you're using Python 3.8 on an x64 system, download python_ldap-3.3.1-cp38-cp38-win_amd64.whl
(hint: do NOT download the +sasl version unless you have the Cyrus SASL code running on your system...)
Start the VirtualEnv for your project, if you're using one (C:\Users\youruser\.virtualenv\YourVirtualEnv\Scripts\activate.bat) -- if you're not, skip this step.
Then run pip3 install C:\Path\To\python_ldap_x.x.x-cpXX-cpXX-winXX.whl and this should install the Python DLL (pyd) file for you.
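After the wheel installs, a one-line import is enough to confirm that the _ldap extension DLL now loads (a minimal sketch; it assumes the wheel matches your interpreter's version and architecture):
import ldap  # raises "ImportError: DLL load failed" if the wrong wheel was installed
print(ldap.__version__)
If the import still fails inside a virtualenv, make sure the wheel was installed with that environment's pip rather than the system one.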

Proxy Authentication Required NLTK download

I have installed Python 2.7.3, Sublime Text, and NLTK 3.0.1 on a Windows 8 64-bit machine with the following steps:
Install Setuptools:
http://pypi.python.org/pypi/setuptools and run ez_setup.py from the directory stored in python27 (from CMD prompt)
Install NLTK: http://pypi.python.org/pypi/nltk
In nltk directory run setup.py to install nltk (from CMD prompt)
Change the environment variables to %PYTHONPATH%;C:\Python27;C:\Python27\DLLs;C:\Python27\Lib;C:\Python27\Lib\lib2to3;C:\Python27\Scripts, with PYTHONPATH set to C:\Python27
Test installation: Start > Python34, then type import nltk
In Sublime I type:
import nltk
nltk.set_proxy('xxx.xx.xx.xx:yy',('username','pwd'))
nltk.download()
However, I am met with the following error:
HTTP Error 407: Proxy Authentication Required (The ISA Server requires authorization to fulfill the request. Access to the Web Proxy Filter is denied.)
Despite giving the proxy details, why am I getting this error?
Please help,
Arc.
Try this format instead for setting the proxy details:
import nltk
nltk.set_proxy('https://username:password@proxy.example.com:port')
It worked for me.
This worked for me on macOS 10.15:
Search for 'Install Certificates.command' in Finder and open it.
Now,
import nltk
nltk.download()
It will open the NLTK downloader then you can select and download the required package.
I also faced the same issue, but I was able to solve it with this:
nltk.set_proxy('xxx.xx.xx.xx:yy', 'username', 'passcode')
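For completeness, here is a minimal end-to-end sketch of the download flow (the proxy host, port, and credentials are placeholders; it assumes an NLTK version whose set_proxy accepts the proxy URL plus separate user and password arguments, so check your version's signature):
import nltk
# Placeholder proxy address and credentials -- replace with your own.
nltk.set_proxy('http://proxy.example.com:3128', 'username', 'password')
# Downloading a named package avoids the interactive downloader window.
nltk.download('punkt')
If authentication still fails with a 407 against an ISA Server, double-check whether the proxy expects DOMAIN\username rather than a bare username.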

Error while creating a new project using Scrapy

I installed Scrapy on Fedora 20, and when I tried to create a new project it gave me the following error:
/usr/lib/python2.7/site-packages/Twisted-14.0.0-py2.7-linux-x86_64.egg/twisted/internet/_sslverify.py:184: UserWarning: You do not have the service_identity module installed. Please install it from <https://pypi.python.org/pypi/service_identity>. Without the service_identity module and a recent enough pyOpenSSL to support it, Twisted can perform only rudimentary TLS client hostname verification. Many valid certificate/hostname mappings may be rejected.
I tried searching for the module service_identity but in vain!
Could anyone help me in this regard?
Thanks in advance
Try pip install service_identity to resolve this issue.

Installing Scrapy on windows 7 with python 2.7.3 gives error

Running easy_install scrapy at the command prompt says Scrapy is not a recognized archive type. Help!
Error Message: "error: Not a recognized archive type: Scrapy"
Unfortunately, a few of Scrapy's dependencies won't be installed automatically by easy_install, and you have to install them manually:
Install the binary version of Twisted
Install the binary version of lxml
Install Scrapy
Hope this solves your error message. If not, check this Google group discussion with someone who had the same problem as you.
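Once those are in place, a short import check like the following can confirm that the pieces Scrapy needs are visible to your interpreter (a minimal sketch; it assumes the binary packages were installed into the same Python 2.7 that runs Scrapy):
# Each import fails loudly if the corresponding binary package is missing.
import twisted
import lxml.etree
import scrapy
print(scrapy.__version__)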

python importerror no module named zope.interface twisted

I am new to Python. I installed Scrapy, but it is giving the error "ImportError: No module named zope.interface" from Twisted. Please help me.
Thanks in advance.
Scrapy requires a few additional libraries, namely Twisted and libxml2 (and optionally pyopenssl and simplejson). The installation instructions describe how to install these libraries.
I had the same error after following the steps Scrapy suggests, mainly using pip install Scrapy. Installing some Scrapy dependencies requires root permission, and my mistake was creating a virtualenv for my Scrapy project. You may run into the same error after following the Scrapy setup steps inside a virtualenv. Just don't use a virtualenv.
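If you do keep a virtualenv, a quick way to see which environment is actually in use, and whether zope.interface is visible there, is a check along these lines (a minimal sketch):
import sys
print(sys.executable)  # shows which Python (system or virtualenv) is running
# Raises ImportError if zope.interface is not installed in that environment.
import zope.interface
print('zope.interface is available')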
