Installing Scrapy on Python VirtualEnv

Here's my problem,
I have a shared hosting (GoDaddy Linux Hosting package) account and I'd like to create a .py file to do some scraping for me. To do this I need the scrapy module (scrapy.org). Because of the shared account I can't install new modules system-wide, so I installed VirtualEnv and created a new virtual environment that has pip, wheel, etc. preinstalled.
Running pip install scrapy does NOT complete successfully because scrapy has a lot of dependencies like libxml2, and it also needs the python-dev tools. If I had access to 'sudo apt-get ...' this would be easy, but I don't. I can only use pip and easy_install.
So how do I install the Python dev tools? And how do I install the dependencies? Is this even possible?
Cheers

You can install all the dependencies by activating the python virtual environment first.
Step 1 :
On Linux :
source env/bin/activate
On Windows :
env\Scripts\activate
Step 2:
pip install lxml
I just tried this and it worked for me.
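For completeness, here is a rough end-to-end sequence for getting Scrapy itself into the environment; env/ is just the example name used above, so adjust the path to your own environment:
source env/bin/activate
pip install --upgrade pip wheel   # a newer pip can pull prebuilt lxml wheels, so no libxml2 headers are needed
pip install lxml
pip install scrapy                # the remaining dependencies install the same way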

It's not possible to do what I wanted to do on the GoDaddy plan I had.

I had some of the same issues. I found this and modified it to pip3.7 install lxml==3.4.2. I was able to install successfully.

Related

Install taichi package in Python

I tried to run a script with Python; it uses the taichi package downloaded from GitHub.
I have little knowledge of how Python packages are installed, and now I get this error in the command prompt:
"ModuleNotFoundError: No module named 'taichi'"
I just installed the package downloaded from GitHub: https://pypi.org/project/taichi/#files
I hope someone can show me what I should do to run my script that uses the taichi package.
To install packages in Python you just run the command pip install (name of package) in the command prompt, so in your case that would be pip install taichi.
How to install virtualenv:
Install pip first
sudo apt-get install python3-pip
Then install virtualenv using pip3
sudo pip3 install virtualenv
Now create a virtual environment
virtualenv venv
Activate your virtual environment:
source venv/bin/activate
Now install your package for Python:
pip3 install taichi
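If the install finished, a quick sanity check (a hypothetical one-liner, assuming you run it with the same interpreter the environment uses) is:
python3 -c "import taichi; print('taichi imported OK')"
If this prints the message instead of ModuleNotFoundError, the package is visible to that interpreter.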
I have figured out that this Taichi package needs 64-bit Python; it also needs LLVM downloaded. Besides, Visual Studio should be installed on the local computer.
Then this Taichi package can be installed just by using "pip install taichi" on the command line.
Thanks for everyone's help!
I also encountered this problem. I confirmed that I had already successfully installed taichi, yet it still showed "no module named taichi". Here is my solution: try import sys and sys.path to check whether the directory containing the package is in Python's search path.
For example, after I input sys.path in Python, it shows ['e:\\', 'd:\\anaconda2022\\python39.zip', 'd:\\anaconda2022\\DLLs', 'd:\\anaconda2022\\lib', 'd:\\anaconda2022', '', 'd:\\anaconda2022\\lib\\site-packages', 'd:\\anaconda2022\\lib\\site-packages\\win32', 'd:\\anaconda2022\\lib\\site-packages\\win32\\lib', 'd:\\anaconda2022\\lib\\site-packages\\Pythonwin']
I am using Anaconda as the Python interpreter in VS Code, so the taichi package is installed in d:\\anaconda2022\\lib\\site-packages, and I can import it successfully. But previously my sys.path was incorrectly set to E:\\Python instead of D:\\anaconda2022, and Python could not find the taichi package on the wrong sys.path.
Checking sys.path in Python may help. If it is what causes the problem, there are many ways to edit sys.path. I solved it by uninstalling the extra Python, since I had installed both Python and Anaconda at the same time (- -).
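A minimal sketch of that check, run with the same interpreter your editor uses:
import sys
import sysconfig

# Directories Python will actually search when importing taichi
for p in sys.path:
    print(p)

# The site-packages directory of the interpreter running right now;
# if pip installed taichi into a different interpreter, the import fails
print(sysconfig.get_paths()["purelib"])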

Is it possible to use scrapy without Anaconda?

So I am very new to scraping and trying to learn scrapy. Most of the tutorials on the web I have seen use Anaconda for scrapy projects. I just wish to know whether it is possible to use scrapy without Anaconda.
It is normally possible.
You can install it with pip.
Check here: http://doc.scrapy.org/en/latest/intro/install.html
You should probably use a dedicated virtualenv to avoid conflicting with your system packages.
Pipenv is a good one: https://github.com/pypa/pipenv
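For example, a possible pipenv workflow (assuming pipenv itself was installed with pip install pipenv) looks roughly like this:
pipenv install scrapy        # creates a dedicated virtualenv and records the dependency in Pipfile
pipenv run scrapy version    # run scrapy inside that environment without activating it manually
pipenv shell                 # or drop into a shell with the environment activated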
Yes, it is possible. The following steps would get you there (if you're using Ubuntu):
Install python3-pip (if not installed):
sudo apt-get install python3-pip
Next install scrapy:
sudo pip3 install Scrapy
How to install pip on Windows
Step #2 still remains the same for Windows.
Cheers!

Offline centos install psutil

I am working on a CentOS machine which cannot connect to the internet. For some reasons I need to install the Python module psutil, so I got the psutil-2.1.3 package, but there is no clear instruction on how to manually install it on a CentOS system.
This is usually pretty simple:
Go to PyPI
Download the .whl file for your Python version and copy it to your server
Run pip install path/to/wheel.whl; depending on your settings you might need to install this with sudo or from a virtual environment
???
Profit!
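Sketched out, the transfer looks something like this (directory names are placeholders; with a reasonably recent pip you can also use pip download on the online machine instead of grabbing files by hand):
pip download psutil -d ./pkgs                       # on a machine with internet access
pip install --no-index --find-links=./pkgs psutil   # on the offline CentOS box, after copying ./pkgs over
Note that if only a source tarball is available, the offline machine also needs gcc and python-devel to compile psutil.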

'bz2 is module not available' when installing Pandas with pip in python virtual environment

I am going through this post Numpy, Scipy, and Pandas - Oh My!, installing some python packages, but got stuck at the line for installing Pandas:
pip install -e git+https://github.com/pydata/pandas#egg=pandas
I changed 'wesm' to 'pydata' for the latest version, and the only other difference from the post is that I'm using pythonbrew.
I found this post, related to the error, but where is the Makefile for bz2 mentioned in the answer? Is there another way to resolve this problem?
Any help would be much appreciated. Thanks.
You need to build python with BZIP2 support.
Install the following package before building python:
Red Hat/Fedora/CentOS: yum install bzip2-devel
Debian/Ubuntu: sudo apt-get install libbz2-dev
Extract the Python tarball. Then:
./configure
make
make install
Install pip using the new python.
Alternative:
Install a binary Python distribution using yum or apt that was built with BZIP2 support.
See also: ImportError: No module named bz2 for Python 2.7.2
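Condensed into commands, the rebuild looks roughly like this on a Red Hat style system (the install prefix is just a placeholder):
sudo yum install bzip2-devel
./configure --prefix=$HOME/opt/python    # run inside the extracted Python source directory
make
make install
$HOME/opt/python/bin/python -c "import bz2; print('bz2 ok')"   # verify the new interpreter has the module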
I spent a lot of time on the internet and only got partial answers everywhere. Here is what you need to do to make it work. Follow every step.
sudo apt-get install libbz2-dev Thanks to Freek Wiekmeijer for this.
Now you also need to build Python with bz2 support. The previously installed Python won't work. For that, do the following:
Download a stable Python version from https://www.python.org/downloads/source/ then extract the Gzipped source tarball. You can use wget https://python-tar-file-link.tgz to download it and tar -xvzf python-tar-file.tgz to extract it in the current directory.
Go inside the extracted folder, then run the following commands one at a time:
./configure
make
make install
This will build a Python binary with bz2 support.
Since this Python doesn't have pip installed, the idea is to create a virtual environment with the above-built Python and then install pandas using the previously installed pip.
You will see the python executable in the same directory. Just create a virtual environment:
./python -m venv myenv (create myenv in the same directory or outside, it's your choice)
source myenv/bin/activate (activate the virtual environment)
pip install pandas (install pandas in the current environment)
That's it. Now with this environment, you should be able to use pandas without error.
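A quick way to confirm both pieces are in place inside that environment:
python -c "import bz2; import pandas; print(pandas.__version__)"
If this prints a version and no ImportError about bz2, the rebuild worked.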
pyenv
I noticed that installing Python from source takes a long time (I am doing it on an i7 :/ ); especially the make and make test...
A simpler and shorter solution was to install another version of Python (I did Python 3.7.8) using pyenv; install it using these steps.
It not only solved the problem of using multiple Python instances on the same system but also maintains my virtual environments without virtualenvwrapper (which turned buggy on my newly set up ubuntu-20.04).
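The pyenv route, roughly (this assumes pyenv is already installed and the bz2 headers are present before the build):
sudo apt-get install libbz2-dev      # so the interpreter pyenv builds picks up bz2
pyenv install 3.7.8
pyenv local 3.7.8                    # use this version in the current project directory
python -m venv venv
source venv/bin/activate
pip install pandas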

Have MySQLdb installed, works outside of virtualenv but inside it doesn't exist. How to resolve?

I'm using the most recent versions of all software (Django, Python, virtualenv, MySQLdb) and I can't get this to work. When I run "import MySQLdb" in the python prompt from outside of the virtualenv it works; inside, it says "ImportError: No module named MySQLdb".
I'm trying to learn Python and Linux web development. I know that it's easiest to use SQLite, but I want to learn how to develop larger-scale applications comparable to what I can do in .NET. I've read every blog post on Google and every post here on StackOverflow and they all suggest that I run "sudo pip install mysql-python", but it just says "Requirement already satisfied: mysql-python in /usr/lib/pymodules/python2.7".
Any help would be appreciated! I'm stuck over here and don't want to throw in the towel and just go back to doing this on Microsoft technologies because I can't even get a basic dev environment up and running.
If you have created the virtualenv with the --no-site-packages switch (the default), then system-wide installed additions such as MySQLdb are not included in the virtual environment packages.
You need to install MySQLdb with the pip command installed with the virtualenv. Either activate the virtualenv with the bin/activate script, or use bin/pip from within the virtualenv to install the MySQLdb library locally as well.
Alternatively, create a new virtualenv with system site-packages included by using the --system-site-packages switch.
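Both options as commands, assuming the environment lives at ~/myenv (a placeholder path, adjust to yours):
~/myenv/bin/pip install mysql-python          # Option 1: install the driver inside the existing virtualenv
virtualenv --system-site-packages ~/myenv     # Option 2: recreate the env so it can see system-wide packages such as MySQLdb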
source $ENV_PATH/bin/activate
pip uninstall MySQL-python
pip install MySQL-python
this worked for me.
I went through the same problem, but using pip from the virtualenv didn't solve it, as I got this error:
error: could not delete '/Library/Python/2.7/site-packages/_mysql.so': Permission denied
Earlier I had installed the package with sudo pip install mysql-python.
To solve it, copy the files /Library/Python/2.7/site-packages/MySQL_python-1.2.5-py2.7.egg-info and /Library/Python/2.7/site-packages/_mysql* to ~/v/lib/python-2.7/site-packages and include /usr/local/mysql/lib in the DYLD_LIBRARY_PATH environment variable.
For the second step I am doing export DYLD_LIBRARY_PATH=/usr/local/mysql/lib in ~/.profile.
