I have globally installed modules on my PC, but when I create a virtualenv, some of the modules appear to be preinstalled in it; yet when I execute 'pip freeze' in my virtualenv, no installed modules are listed. Commands like django-admin and cookiecutter already work in my virtualenv even though I never installed them in it. But other modules like numpy or pandas do not work, even though I installed them globally on my machine just like django or cookiecutter. How do I fix this? I am using Python version 3.9.6.
TL;DR: The django-admin and cookiecutter commands are accessible from your virtual environment because they are on PATH. This isn't related to the Python virtual environment; it applies to your whole system. If you want to make global packages accessible in your virtual environment, see this answer.
django-admin and cookiecutter are executables. They're located in some folder on your system (most likely the Scripts folder of your Python installation), and that folder is on PATH. Therefore, the shell can access them regardless of whether you are in a virtual environment.
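You can check this yourself (a quick sanity check, assuming Windows since you have a Scripts folder): ask the shell where a command resolves from while the virtual environment is active:

where django-admin

On Linux/macOS the equivalent is which django-admin. The printed path points at the global Scripts folder, not at your virtual environment.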
In contrast, numpy and pandas are only libraries. Therefore, when you try to import them in code run inside the virtual environment, they cannot be accessed. This can be changed either by installing them in the virtual environment or by including the system site packages, which you can see how to do in this answer.
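For example, one way of including the global packages (a hedged suggestion; the linked answer covers the details) is to create the environment with the --system-site-packages flag, which lets imports fall back to the global site-packages:

python -m venv --system-site-packages venv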
If you tried to import django or cookiecutter, that wouldn't work either (in your virtual environment), just like numpy or pandas. There's no way to "fix this", because it isn't broken. I wouldn't suggest removing Scripts from PATH, because that would mean those commands would never be accessible.
Related
I wanted to replace Python 3.8 32-bit with the 64-bit version so that I could install the face_recognition module. I deleted the previous version and tried to re-route the project to the new Python version by going to File > Settings > Project Interpreter > Show all > Show Paths for Selected Interpreter, adding all the Python files from the new folder, and getting rid of the old ones.
However, it's still showing me this error when I try to install the module:
(Will2.0) C:\Users\solei\PycharmProjects\Will>pip install face_recognition
No Python at 'C:\Users\solei\AppData\Local\Programs\Python\Python38-32\python.exe'
I've also tried going to the Windows System Properties and changing everything that says "Python38-32" there, but it's still not working. It does work when I make a new environment, though, so at least I know that Python installed properly. It's just this one environment that is tripping me up (I'd prefer not to make a new project for this, btw. I've already installed a lot of modules in it.).
Your selected interpreter is not the system interpreter you've replaced with the 64-bit version, but your project's virtual environment interpreter. The virtual environment's files weren't changed in that process and need to be updated before you can use that environment again.
The system interpreter is your Python interpreter installed using the installation executable. In your case it is located in C:\Users\solei\AppData\Local\Programs\Python\Python38\. You can have multiple system interpreters installed, such as having Python 2.7, Python 3.7 and Python 3.8 side-by-side.
The virtual environment interpreter is a copy of another interpreter created using the venv package from the Python standard library. You can have many virtual environment interpreters on the system (one or more for every project, for example).
The base interpreter is the interpreter that was used as a template for the venv package. Every virtual environment interpreter has its base interpreter (usually a system interpreter) that it requires to run. Changing or upgrading the base interpreter requires updating the virtual environment.
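To illustrate, here is roughly what the pyvenv.cfg file in the root of a virtual environment looks like (paths and version are examples based on your setup, not your actual file); the home key is what ties the environment to its base interpreter:

home = C:\Users\solei\AppData\Local\Programs\Python\Python38
include-system-site-packages = false
version = 3.8.0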
If we take a quick look at the documentation, a virtual environment is described as
a self-contained directory tree that contains a Python installation for a particular version of Python, plus a number of additional packages.
That means you can set up an individual environment for every project, which will contain its own packages. The environment is a very efficient way of managing project packages, which is why PyCharm suggests creating such an environment instead of using the system interpreter by default. In short, it allows you to have two different versions of the same package used by two different projects, without the packages conflicting with each other.
This also explains why your virtual environment files weren't affected by your upgrade.
Now, I am unfortunately no Python expert. I had to spend some time examining how Python handles virtual environments on Windows and Ubuntu. It seems the environment always requires the base system interpreter present in the system. If you remove or change the location of the base interpreter, the environment will fail to function.
As I mentioned before editing this answer, you can in theory simply edit the pyvenv.cfg file located in the root folder of the virtual environment. In practice, that will only work in simple cases, and it is not the intended way of updating virtual environments.
You need to upgrade your virtual environment's files to work with your new system interpreter. That can mean the 64-bit version over the 32-bit version, or even a newer version of Python, such as upgrading from 3.7 to 3.8.
1. Close PyCharm
2. Check if the system interpreter you want to upgrade to is on the system PATH
You can quickly check by running
python -c "import platform; print(platform.architecture())"
For you, the output should look like this
('64bit', 'WindowsPE')
If your output is different, you'll need to prefix the command in step 4 with the absolute path to the Python executable.
3. Navigate to the virtual environment's directory
The directory you're looking for contains the Include, Lib and Scripts directories and the pyvenv.cfg file. From your screenshots, it seems this directory is your project's root directory, so in your case:
cd C:\Users\solei\PycharmProjects\Will2.0\
4. Upgrade the virtual environment
python -m venv --upgrade .
... or if Python is not on your path
C:\Users\solei\AppData\Local\Programs\Python\Python38\python.exe -m venv --upgrade .
The . in the commands refers to the current directory.
5. Open PyCharm and verify your environment is working correctly
... or simply try to run pip directly from the command line. Note you need to first activate the virtual environment by running the Scripts\activate.bat batch file.
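For example (using the project path from your screenshots):

cd C:\Users\solei\PycharmProjects\Will2.0
Scripts\activate.bat
pip --version

If the upgrade worked, pip runs without the "No Python at ..." error and reports a path inside your environment.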
If the above-mentioned method doesn't work, you might have to create a new virtual environment. You can create one easily without making a new PyCharm project; see this PyCharm documentation for reference. However, you'll still need to download all the required packages again.
For simplicity, I recommend creating the new virtual environment in a .venv folder located in the project's root.
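A rough sketch of what that looks like from the project root (the last line assumes you have a requirements.txt; otherwise reinstall your packages manually):

C:\Users\solei\AppData\Local\Programs\Python\Python38\python.exe -m venv .venv
.venv\Scripts\activate.bat
pip install -r requirements.txt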
Disclaimer
I tested only Python's behavior on a fresh Windows installation inside the Windows Sandbox. I was able to install the 32-bit Python, create a virtual environment, replace Python with the 64-bit version, and upgrade the virtual environment to have it launch correctly again.
I'm a huge beginner so I'm not very informed about how packages really work. I know that you should create a virtualenv in your project folder to avoid version conflicts etc, and you're not supposed to put your actual project files in the virtual env. So if your project files are in your project directory on the same level as the virtualenv, can your project files "access" the things installed in the virtualenv? Can files outside of your directory access packages in your virtual env?
Yes, it all depends on the context. Your virtualenv can exist anywhere, be it in your project directory, or somewhere else.
When you want to use the virtualenv, you just have to run the source command on its activate script. Then whatever python command you execute, on whichever file, will have access to the virtualenv. For example, if you store your virtualenv in /home/user/project/virtualenv, then you would do
source /home/user/project/virtualenv/bin/activate
Then whatever you do with python, it will use the version installed in the virtualenv.
You can double-check whether you're using the global python or the virtualenv python by running which python. It will point either to the global python path, which is usually /usr/bin/python, or to /home/user/project/virtualenv/bin/python.
So normally, you first run the source command, then you can pip install whatever packages you need. They will be installed in the virtualenv and will not conflict with other projects.
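Putting it together, a typical session (using the paths from above; the package and script names are just placeholders) looks like this:

source /home/user/project/virtualenv/bin/activate
which python         # prints /home/user/project/virtualenv/bin/python
pip install requests
python my_script.py  # runs with the virtualenv's interpreter and packages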
It was suggested that I create a new conda environment for installing tensorflow.
First question, in general:
Why do environments exist in conda or in Python? (Why) is it preferable to install a new library in a new environment?
Here, in practice:
After the install, the conda shell says $ conda activate test will activate the test environment. Does it mean I can't access the lib in Spyder unless I activate test in the conda shell? Do I need to restart the Python shell to see the lib? I can't access the lib (no module named tensorflow) and I assume it has to do with Python not finding the path.
Have you installed TF within the environment?
I haven't used Spyder in a while, but what usually happens is that you can start a program (like Spyder or Jupyter) from an environment if you have installed the application within it and the environment is active. (Some editors/IDE like VS Code lets you choose the environment for a specific project, once it is able to discover all the environments.)
Also, usually (though perhaps not always) you will not need to restart the shell to import a library after installing it. It's best to refer to the specific library's installation instructions for details like this.
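For example, a minimal sequence (assuming the environment is named test, as in the question, and that you want Spyder to run from inside it) would be:

conda activate test
conda install tensorflow
conda install spyder
spyder

This installs both TensorFlow and Spyder inside the test environment and launches Spyder from it, so import tensorflow should then work.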
A virtual environment is used to manage Python packages for different projects. Using a virtual environment allows you to avoid installing Python packages globally, which could break system tools or other projects. You can install the virtualenv tool using pip.
For example, say you have two projects, and each requires a different version of Tensorflow. This is a real problem for Python, since it can't differentiate between versions in the site-packages directory. Both, say, v1.1 and v2.1 would reside in the same directory with the same name.
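With virtual environments, each project instead gets its own site-packages. A sketch (version numbers borrowed from the example above; the folder names are arbitrary, and such old pins may not install on current interpreters):

python -m venv projectA-env
source projectA-env/bin/activate
pip install tensorflow==1.1
deactivate

python -m venv projectB-env
source projectB-env/bin/activate
pip install tensorflow==2.1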
This also allows for easy clean-up: once you are done with the project, just delete the virtual environment.
Check out more at https://packaging.python.org/guides/installing-using-pip-and-virtual-environments/
Coming from JavaScript I'm familiar with NPM.
There you can install packages globally (by using the -g flag) or locally in a project.
In Python they have these Virtual Environments.
I'm still a bit uncertain why they are needed. I know that it is for having the same package in different versions on one machine.
Is it because Python doesn't have the concept of local project-installations?
All package installations are global and there's no way around that. At least, that's how it seems to me...
And so they have these Virtual Environments instead?
Am I right there?
Virtual environments make it possible for you to encapsulate dependencies by project.
Python has no node_modules equivalent. When you install something with pip it goes to your site-packages folder. To find this folder you can run python -m site and it will print out the folders where it searches for packages.
Example on Fedora 29:
➜ ~ python -m site
sys.path = [
'/home/geckos',
'/usr/lib/python27.zip',
'/usr/lib64/python2.7',
'/usr/lib64/python2.7/plat-linux2',
'/usr/lib64/python2.7/lib-tk',
'/usr/lib64/python2.7/lib-old',
'/usr/lib64/python2.7/lib-dynload',
'/usr/lib64/python2.7/site-packages',
'/usr/lib/python2.7/site-packages',
]
USER_BASE: '/home/geckos/.local' (exists)
USER_SITE: '/home/geckos/.local/lib/python2.7/site-packages' (doesn't exist)
ENABLE_USER_SITE: True
pip vs package manager
If you don't use virtual environments you may end up with packages installed side by side with the operating system's Python packages, and this is where the danger is. Packages may be overwritten and things get messy fast. For example, you install Flask with pip, then try to install Jinja2 with the package manager; later you remove Jinja2 and break Flask. Or you update your system: Jinja2 gets updated but not Flask. Or, even simpler, you install something with the package manager and remove it with pip; now the package manager is in a broken state.
Because of this we always use virtual environments, and even separate virtual environments by project.
Creating and maintaining virtual environments
Nothing prevents you from keeping your virtual environment in the same folder as your project. This way you will have the same feeling that you have with node_modules. You can create it with
virtualenv <SOME_FOLDER> for python 2
or
python3 -m venv <SOME_FOLDER> for python 3
Conventions that I've seen
If you're keeping virtual environments as a subfolder of your project, I usually call them env or venv
Another option is keeping them all in the same folder inside your home directory; I've been using ~/.venv/<PROJECT>
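A node_modules-like workflow (just a sketch, following the conventions above; flask is an example package) would then be:

cd my-project
python3 -m venv venv
source venv/bin/activate
pip install flask   # lands in my-project/venv, not in the system site-packages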
Pipenv
Finally, there is an alternative that I like more than plain pip. Pipenv is a tool that manages virtual environments automatically for you. It feels closer to yarn and has more features.
To create a virtual environment for a project, just run pipenv --three or pipenv --two in your project folder. It will create and manage the virtual environment and write dependencies to a Pipfile. It also supports development packages; I really think it's worth trying. Here are the docs: https://pipenv.kennethreitz.org/en/latest/
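A minimal pipenv session (requests is just an example package) looks like this:

cd my-project
pipenv --three            # create a virtualenv backed by Python 3
pipenv install requests   # installs and records the dependency in the Pipfile
pipenv shell              # spawn a shell inside the virtualenv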
I hope this helps, regards
Is it because Python doesn't have the concept of local project-installations?
Correct.
Well, mostly correct. There's a number of "modern" Python package managers that support project-local package installation. Right now the big two are pipenv and poetry.
However, all of these libraries are fundamentally wrappers over the basic Python virtual environment mechanism. It's the basis of the ecosystem.
Global package management is a little thorny in Python because Unix systems tend to come with a "system Python" installation that supports parts of the operating system. Installing or updating packages in the system Python is a very bad idea, so you always want to be working in a Python you installed yourself: either a fully separate installation, or at least a virtual environment of the system Python.
I have an existing Django project which I developed using Python libraries installed on the system, adding missing ones to the system as I went. But now there is a conflict for python-requests: the system has version 2.2, but I need >2.5. I don't want to uninstall it and put a newer one in its place, as that may break the OS. So now I want to use a virtualenv and install packages there, in complete isolation from the OS.
I think the solution you're looking for is to download a different version of Python without uninstalling your original, then create the environment with virtualenv while passing in the path to the new python.exe file. Like this: virtualenv -p <path-to-executable-here> venv, then activate it as usual (venv\Scripts\activate on Windows, or source venv/bin/activate elsewhere). This starts the virtual environment using the Python executable you passed to virtualenv through the terminal.
Also, this might not be the only way, but there's something called ModuleFinder which enables you to get a list of all the modules your script is importing. That helps if you don't want to type them out manually and you have extra modules installed; otherwise pip freeze > requirements.txt would do the job, and your new virtual environment could install all the packages from requirements.txt.
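For the pip freeze route, the sketch would be (run the freeze with the interpreter that still has your packages, and the rest inside the new environment):

pip freeze > requirements.txt
virtualenv -p <path-to-executable-here> venv
venv\Scripts\activate
pip install -r requirements.txt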