I am trying to install Python 3.9 on Linux 4.4 in Cloudera Data Science Workbench (CDSW). I do not have sudo rights and I won't be able to connect to any websites.
The currently installed Python version is 3.6.
I am following the procedure described here.
However, at the step sudo make altinstall I get a "permission denied" error on /usr/local/bin.
Is there any workaround to make this step work?
It is the last step of the whole procedure.
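(For illustration only; the version and paths below are assumptions, not from the linked procedure. A source build can avoid needing sudo for make altinstall by configuring a user-writable prefix:)
cd Python-3.9.0
./configure --prefix=$HOME/python39   # user-writable prefix instead of /usr/local
make
make altinstall                       # no sudo needed with the prefix above
export PATH=$HOME/python39/bin:$PATH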
You can install it using apt; it's the simplest way to do it if you have sudo rights:
sudo apt update
sudo apt install software-properties-common
Then add the deadsnakes repository:
sudo add-apt-repository ppa:deadsnakes/ppa
Accept the changes, then
sudo apt install python3.9
Now, whenever you want to use Python 3.9, invoke python3.9 instead of python3.
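For example:
python3.9 --version
python3.9 myscript.py    # myscript.py is just a placeholder name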
If you are using Cloudera CDSW, you cannot get sudo rights; instead, follow this guide to install packages:
https://docs.cloudera.com/documentation/data-science-workbench/1-8-x/topics/cdsw_extensible_engines.html
Follow the guide above, then change the Dockerfile like this:
# Dockerfile
FROM docker.repository.cloudera.com/cdsw/engine:8
RUN rm /etc/apt/sources.list.d/*
RUN apt-get update
RUN apt-get install -y software-properties-common
RUN add-apt-repository -y ppa:deadsnakes/ppa
RUN apt-get update \
    && apt-get install -y python3.9 python3-pip \
    && rm /etc/apt/sources.list.d/*
RUN pip install pandas numpy
Then follow steps 2-4 of the guide, and you should get your desired outcome.
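The remaining steps amount to building the image, pushing it to a registry your cluster can reach, and registering it in CDSW; the names below are placeholders, not values from the guide:
docker build -t your-registry.example.com/cdsw-python39:v1 .
docker push your-registry.example.com/cdsw-python39:v1
# then add the image as a custom engine under the CDSW admin engine settings, as described in the guide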
I am confronted with a somewhat unexpected problem: installing Python 3.8 in a Docker container.
I have created a Dockerfile that is intended to serve as my test DB. As part of its creation, it needs to run a Python script to populate it with test data. However, I cannot do what I thought would be the easiest step: installing Python.
FROM postgres
# Install Python dependencies ---------
RUN apt-get update && apt dist-upgrade -y
RUN apt install software-properties-common --yes
RUN apt-get install ca-certificates --yes
RUN gpg-agent --daemon --enable-ssh-support
RUN add-apt-repository ppa:deadsnakes/ppa --yes
RUN apt install python3.8 --yes
RUN python3.8 --version
Somewhat to my surprise, only Python 3.7 is available through apt-get. The approved method for getting Python 3.8 is to use deadsnakes, but this creates the following errors:
Step 12/33 : RUN add-apt-repository ppa:deadsnakes/ppa --yes
---> Running in 17d490c0b568
gpg: keybox '/tmp/tmp8n9r_96q/pubring.gpg' created
gpg: /tmp/tmp8n9r_96q/trustdb.gpg: trustdb created
gpg: key BA6932366A755776: public key "Launchpad PPA for deadsnakes" imported
gpg: Total number processed: 1
gpg: imported: 1
Warning: apt-key output should not be parsed (stdout is not a terminal)
gpg: no valid OpenPGP data found.
As per various posts I've found, I've added:
RUN apt-get install ca-certificates --yes
RUN gpg-agent --daemon --enable-ssh-support
And although they appear to do no harm (and the latter appears to get rid of a second error message from gpg), they do not solve the problem...
OK, it seems that installing Python 3.8 on a Debian container (when the distribution's latest packaged version is Python 3.7) is more trouble than it is worth.
My workaround was to create a second Docker container running Python. This populated the Postgres container in a one-off operation.
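A rough sketch of that workaround (container names, network, credentials, and the script name are all placeholders):
docker network create testdb-net
docker run -d --name testdb --network testdb-net -e POSTGRES_PASSWORD=secret postgres
docker run --rm --network testdb-net -v "$PWD":/app -w /app python:3.8 \
    bash -c "pip install psycopg2-binary && python populate_testdb.py"
The Python container reaches the Postgres container by its name (testdb) on the shared network, runs the population script once, and is then removed.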
I am trying to install Python 3.6 in an Ubuntu 16.04 Docker image. It was working fine before, but today it started showing this error:
Step 8/14 : RUN add-apt-repository ppa:jonathonf/python-3.6
---> Running in a27c7c55afef
This PPA has been removed from public access as part of a protest against the abuse of open-source projects by large companies. For more detail visit the main page here: https://launchpad.net/~jonathonf
If you are a company and you would like this PPA to continue then let me know your preferred route for contributions and I will arrange something.
Ign:8 http://ppa.launchpad.net/jonathonf/python-3.6/ubuntu xenial/main all Packages
Err:7 http://ppa.launchpad.net/jonathonf/python-3.6/ubuntu xenial/main amd64 Packages
404 Not Found
Ign:8 http://ppa.launchpad.net/jonathonf/python-3.6/ubuntu xenial/main all Packages
Reading package lists...
W: The repository 'http://ppa.launchpad.net/jonathonf/python-3.6/ubuntu xenial Release' does not have a Release file.
E: Failed to fetch http://ppa.launchpad.net/jonathonf/python-3.6/ubuntu/dists/xenial/main/binary-amd64/Packages 404 Not Found
E: Some index files failed to download. They have been ignored, or old ones used instead.
I am not sure what this means and I don't understand the problem. How can I solve this issue?
My Dockerfile is below:
FROM ubuntu:16.04
COPY requirements.txt /
RUN apt-get update
RUN apt-get install -y software-properties-common vim
RUN add-apt-repository ppa:jonathonf/python-3.6
RUN apt-get update
RUN apt-get install -y build-essential python3.6 python3.6-dev python3-pip python3.6-venv python-dev libssl-dev swig
RUN apt-get install -y git
# update pip
RUN python3.6 -m pip install pip --upgrade
RUN python3.6 -m pip install wheel
RUN pip install -r requirements.txt
Is anyone facing the same issue?
Thanks in advance.
The error you're getting seems pretty obvious:
This PPA has been removed from public access as part of a protest against the abuse of open-source projects by large companies. For more detail visit the main page here: https://launchpad.net/~jonathonf
The author has removed the PPA you're trying to use. You will need to find another PPA, or install Python yourself from source, or use a different base image. For example, you could use the standard python:3.6 base image if you need Python 3.6 (or just python:3.7 or python:3.8, depending on your needs).
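For example, a sketch of the same setup on the python:3.6 base image (adjust the apt packages to whatever your requirements actually need):
FROM python:3.6
COPY requirements.txt /
RUN apt-get update && apt-get install -y build-essential git swig libssl-dev \
    && rm -rf /var/lib/apt/lists/*
RUN python -m pip install --upgrade pip wheel
RUN pip install -r /requirements.txt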
I am trying to install virtualenv on Ubuntu.
First it said command 'pip' not found, so I typed
sudo apt install python-pip
then it said
E: Unable to locate package python-pip
I tried resetting WSL and installing from cmd, but it doesn't work with Ubuntu and I don't know why. Even though I have installed python3, virtualenv, and pip from cmd, it doesn't work on Ubuntu 18.04. It also fails on Ubuntu 14.04.
aiki@LAPTOP-886AEJJG:~$ pip
Command 'pip' not found, but can be installed with:
sudo apt install python-pip
aiki@LAPTOP-886AEJJG:~$ sudo apt install python-pip
[sudo] password for aiki:
Reading package lists... Done
Building dependency tree
Reading state information... Done
E: Unable to locate package python-pip
I'm trying to install Jarvis and Mycroft on Windows 10, but I need to use Ubuntu because they only work on Linux.
ls /bin/python*
Identify the highest version of Python listed.
If the highest version is something like python2.7, then install python-pip.
If it's something like python3.8, then install python3-pip.
Example for python3.8:
sudo apt-get install python3-pip
Try the following command sequence in an Ubuntu terminal:
sudo apt-get install software-properties-common
sudo apt-add-repository universe
sudo apt-get update
sudo apt-get install python3-pip
Try the following commands in the terminal; they bootstrap pip for Python 2.7 directly from get-pip.py:
apt-get install curl
curl https://bootstrap.pypa.io/pip/2.7/get-pip.py -o get-pip.py
python get-pip.py
On some Linux distributions, such as those based on Debian, you should update apt's package indexes first if you are installing python-pip through it:
sudo apt-get update
This helps apt-get refresh its indexes and locate the python-pip package.
After this, you can install it like this:
sudo apt-get install python-pip     # Python 2
sudo apt-get install python3-pip    # Python 3
You might already have pip for Python 3 installed. Instead of pip install, you can use pip3 install.
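For example:
pip3 --version
pip3 install --user virtualenv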
To solve the problem of:
E: Unable to locate package python-pip
you can do the following. This works with Python 2.7 and it won't disappoint you.
Follow the steps mentioned below.
Go to get-pip.py and copy all the code from it.
Open the terminal using CTRL + ALT + T and run:
vi get-pip.py
Paste the copied code, then exit the vi editor by pressing ESC, typing :wq and pressing Enter.
Lastly, run the script and see the magic:
sudo python get-pip.py
This automatically adds the pip command to your Linux system.
I'm using WSL2 on Windows 10 and I had the same issue. Try what helped me fix it; I assume that you are using Python 3:
python3 get-pip.py
sudo apt install python3-pip
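You can then confirm that pip is wired up to Python 3 with:
python3 -m pip --version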
I am trying to run cv2, but when I try to import it, I get the following error:
ImportError: libGL.so.1: cannot open shared object file: No such file or directory
The suggested solution online is installing
apt install libgl1-mesa-glx
but this package is already installed and it is the latest version.
NB: I am actually running this in Docker, and I am not able to check the OpenCV version. I tried importing matplotlib and that imports fine.
Add the following lines to your Dockerfile:
RUN apt-get update && apt-get install ffmpeg libsm6 libxext6 -y
These commands install the cv2 dependencies that are normally present on the local machine, but might be missing in your Docker container causing the issue.
[minor update on 20 Jan 2022: as Docker recommends, never put RUN apt-get update on its own line, as it causes caching issues]
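A sketch of the combined form that also clears the apt cache afterwards (same package list as above):
RUN apt-get update && apt-get install -y ffmpeg libsm6 libxext6 \
    && rm -rf /var/lib/apt/lists/*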
Even though the above solutions work, their package sizes are quite big. libGL.so.1 is provided by the libgl1 package, so the following is sufficient:
apt-get update && apt-get install libgl1
This is a slightly better solution, in my opinion: the python3-opencv package includes all of OpenCV's system dependencies.
RUN apt-get update && apt-get install -y python3-opencv
RUN pip install opencv-python
Try installing the opencv-python-headless Python dependency instead of opencv-python. It includes a precompiled binary wheel with no external dependencies (other than numpy) and is intended for headless environments like Docker. This saved almost 700 MB in my Docker image compared with using the python3-opencv Debian package (with all its dependencies).
The package documentation discusses this and the related (more expansive) opencv-contrib-python-headless pypi package.
Example reproducing the ImportError in the question
# docker run -it python:3.9-slim bash -c "pip -q install opencv-python; python -c 'import cv2'"
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
Traceback (most recent call last):
File "<string>", line 1, in <module>
File "/usr/local/lib/python3.9/site-packages/cv2/__init__.py", line 5, in <module>
from .cv2 import *
ImportError: libGL.so.1: cannot open shared object file: No such file or directory
# docker run -it python:3.9-slim bash -c "pip -q install opencv-python-headless; python -c 'import cv2'"
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
For me, the only workaround that worked is the following:
# These are for libGL.so issues
# RUN apt-get update
# RUN apt install libgl1-mesa-glx
# RUN apt-get install -y python3-opencv
# RUN pip3 install opencv-python
RUN pip3 install opencv-python-headless==4.5.3.56
If you're on CentOS, RHEL, Fedora, or another Linux distro that uses yum, you'll want:
sudo yum install mesa-libGL -y
In my case the following was enough, and it also saves space compared to the solutions above:
RUN apt-get update && apt-get install -y --no-install-recommends \
    libgl1 \
    libglib2.0-0
Put this in the Dockerfile
RUN apt-get update
RUN apt install -y libgl1-mesa-glx
Before the line
COPY requirements.txt requirements.txt
For example
......
RUN apt-get update
RUN apt install -y libgl1-mesa-glx
COPY requirements.txt requirements.txt
......
I was getting the same error when I was trying to use OpenCV in the GCP Appengine Flex server environment. Replacing "opencv-python" by "opencv-python-headless" in the requirements.txt solved the problem.
The OpenCV documentation talks about different packages for desktop vs. server (headless) environments.
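If you manage dependencies through requirements.txt, the swap is just the following (a sketch; pin versions as you see fit):
pip uninstall -y opencv-python
pip install opencv-python-headless
pip freeze > requirements.txt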
I met this problem while using cv2 in a Docker container. I fixed it by installing opencv-contrib-python rather than opencv-python:
pip install opencv-contrib-python
Here is the solution you need:
pip install -U opencv-python
apt update && apt install -y libsm6 libxext6 ffmpeg libfontconfig1 libxrender1 libgl1-mesa-glx
I had the same issue on CentOS 8 after using pip3 install opencv on a non-GUI server that lacks all sorts of graphics libraries.
dnf install opencv
pulls in all needed dependencies.
"installing opencv-python-headless instead of opencv-python"
this works in my case!
I was deploying my website to Azure and this exception popped up:
ImportError: libGL.so.1: cannot open shared object file: No such file or directory
Then I uninstalled the opencv-python package, installed the headless variant instead, froze the requirements, and deployed it again; that solved the problem.
For a Raspberry Pi, this worked for me:
sudo apt-get install ffmpeg libsm6 libxext6 -y
For me, the problem was related to a proxy setting. I was using a Nexus mirror for PyPI, but for OpenCV nothing worked until I connected to a different network.
On Rocky Linux 9, I resolved the error using the command:
dnf install mesa-libGLU
Use opencv-python-headless if you're using Docker or running in a server environment.
I got the same issue on Ubuntu desktop, and none of the other solutions worked for me.
libGL.so.1 was correctly installed but for some reason Python wasn’t able to see it:
$ ldconfig -p | grep libGL.so.1
libGL.so.1 (libc6,x86-64) => /lib/x86_64-linux-gnu/libGL.so.1
The only solution that worked was to force it in LD_LIBRARY_PATH. Add the following in your ~/.bashrc then run source ~/.bashrc or restart your shell:
export LD_LIBRARY_PATH="/lib/x86_64-linux-gnu:$LD_LIBRARY_PATH"
I understand that overriding LD_LIBRARY_PATH is bad practice, but for me this is the only solution that works.