I'm a complete beginner, so I'm not very informed about how packages really work. I know that you should create a virtualenv in your project folder to avoid version conflicts etc., and that you're not supposed to put your actual project files inside the virtualenv. So if your project files are in your project directory at the same level as the virtualenv, can your project files "access" the things installed in the virtualenv? And can files outside of your project directory access packages in your virtualenv?
Yes. Your virtualenv can exist anywhere: in your project directory or somewhere else entirely.
When you want to use the virtualenv, you just source its activate script. From then on, any python command you run, on whichever file, will have access to the virtualenv's packages. For example, if you store your virtualenv in /home/user/project/virtualenv, you would do
source /home/user/project/virtualenv/bin/activate
Then whenever you run python, it will be the version installed in the virtualenv.
You can double-check whether you're using the global Python or the virtualenv Python by running which python. It will point either to the global Python, usually /usr/bin/python, or to /home/user/project/virtualenv/bin/python.
So the usual workflow is: run the source command first, then pip install whatever packages you need. They will be installed in the virtualenv and will not conflict with other projects.
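For example, a full session might look roughly like this (the package name requests and the script name myscript.py are just placeholders):

source /home/user/project/virtualenv/bin/activate
which python          # now prints /home/user/project/virtualenv/bin/python
pip install requests  # goes into the virtualenv's site-packages, not the system's
python myscript.py    # runs with the virtualenv's interpreter and packages
deactivate            # back to the system Python

Your script can live anywhere on disk; what matters is which interpreter you run it with, and activation simply puts the virtualenv's interpreter first on your PATH.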
I have globally installed modules on my PC, but when I create a virtualenv, some of them already seem to be available in it, even though pip freeze inside the virtualenv lists no installed modules. Commands like django-admin and cookiecutter already work in my virtualenv, though I never installed them there. But others like numpy or pandas do not work, even though I installed them globally on my machine just like django and cookiecutter. How do I fix this? I am using Python version 3.9.6.
TL;DR: The django-admin and cookiecutter commands are accessible from your virtual environment because they are on PATH. This isn't specific to the virtual environment; it applies to your whole system. If you want to make global packages accessible in your virtual environment, see this answer.
django-admin and cookiecutter are executables. They live in some folder on your system (most likely the Scripts folder of your global Python installation), and that folder is on PATH. Therefore the shell can find them whether or not a virtual environment is active.
numpy and pandas, by contrast, are only libraries. When you try to import them in code that runs inside the virtual environment, they cannot be found. You can change that either by installing them in the virtual environment or by including the system site-packages, as described in this answer.
If you tried to import django or cookiecutter in your virtual environment, that wouldn't work either, just like numpy or pandas. There's no way to "fix this", because nothing is broken. I wouldn't suggest removing Scripts from PATH, because then those commands would never be accessible.
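To see this for yourself, you can compare how the shell and the interpreter resolve things while the virtual environment is active (the commands below assume Windows, matching the Scripts folder mentioned above):

where django-admin
python -c "import django"
python -c "import numpy"

where django-admin resolves to the Scripts folder of the global installation because that folder is on PATH, while both import attempts fail with ModuleNotFoundError because neither library is installed inside the virtual environment.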
I'm a bit confused on this and can't find a satisfactory answer searching elsewhere - everything is about using external libraries in some way or another.
In my PyCharm projects, I've got the project root directory, which contains a venv. This venv of course contains the Python executable for the project, as well as the relevant site-packages and modules I've downloaded via pip or whatever.
But there's always an External Libraries section that has its own Python executable, venv, .gitignore, site-packages, etc.
Is this normal? What's the purpose of having two venvs in the same project?
Thanks!
You normally only have one venv in your project.
The other Python executable should be the global one.
Normally you use a venv to keep track of the packages needed for that specific project, so that you don't mix them up with the packages of other projects or run into dependency and version issues.
If you want, you can also use the global installation of Python, which would not create a venv at all.
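If you're ever unsure which interpreter a terminal or run configuration is actually using, a quick check is:

python -c "import sys; print(sys.executable)"

Inside the project this should print the python executable inside your venv folder; for the global installation it prints the system-wide python instead. As far as I can tell, the External Libraries node is just PyCharm's view of whatever interpreter is currently selected for the project (its binaries and site-packages), not a second environment.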
I wanted to replace Python 3.8 32-bit with the 64-bit version to install the face_recognition module, so I deleted the previous version and tried to re-route the project to the new Python version by going to File > Settings > Project Interpreter > Show all > Show Paths for Selected Interpreter, and adding all the Python files from the new folder and getting rid of the old ones.
However, it's still showing me this error when I try to install the module:
(Will2.0) C:\Users\solei\PycharmProjects\Will>pip install face_recognition
No Python at 'C:\Users\solei\AppData\Local\Programs\Python\Python38-32\python.exe'
I've also tried going to the Windows System Properties and changing everything that says "Python38-32" there, but it's still not working. It does work when I make a new environment, though, so at least I know that Python installed properly. It's just this one environment that is tripping me up (I'd prefer not to make a new project for this, btw. I've already installed a lot of modules in it.).
Your selected interpreter is not the system interpreter you've replaced with the 64-bit version, but your project's virtual environment interpreter. The virtual environment's files weren't changed in that process and need to be updated before you can use that environment again.
The system interpreter is your Python interpreter installed using the installation executable. In your case it is located in C:\Users\solei\AppData\Local\Programs\Python\Python38\. You can have multiple system interpreters installed, such as having Python 2.7, Python 3.7 and Python 3.8 side-by-side.
The virtual environment interpreter is a copy of another interpreter created using the venv package from the Python standard library. You can have many virtual environment interpreters on the system (one or more for every project, for example).
The base interpreter is the interpreter that was used as a template for the venv package. Every virtual environment interpreter has its base interpreter (usually a system interpreter) that it requires to run. Changing or upgrading the base interpreter requires updating the virtual environment.
If we take a quick look at the documentation, a virtual environment is described as
a self-contained directory tree that contains a Python installation for a particular version of Python, plus a number of additional packages.
That means you can set up an individual environment for every project, each containing its own packages. A virtual environment is a very efficient way of managing project packages, which is why PyCharm suggests creating one instead of using the system interpreter by default. In short, it allows two different projects to use two different versions of the same package without the packages conflicting with each other.
This also explains why your virtual environment files weren't affected by your upgrade.
Now, I am unfortunately no Python expert. I had to spend some time examining how Python handles virtual environments on Windows and Ubuntu. It seems the environment always requires its base interpreter to be present on the system. If you remove the base interpreter or change its location, the environment will fail to function.
As I mentioned before editing this answer, you can in theory simply edit the pyvenv.cfg file located in the root folder of the virtual environment. In practice, that will only work in simple cases and it is not the intended way of updating virtual environments.
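For reference, the pyvenv.cfg of an environment created with venv from a 64-bit Python 3.8 installation looks roughly like this (the exact keys and paths vary with how the environment was created):

home = C:\Users\solei\AppData\Local\Programs\Python\Python38
include-system-site-packages = false
version = 3.8.6

The home line is what points the environment at its base interpreter, which is why editing it by hand can appear to work in simple cases.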
You need to upgrade your virtual environment's files to work with your new system interpreter. That can mean the 64-bit version over the 32-bit version, or even a newer version of Python, such as upgrading from 3.7 to 3.8.
1) Close PyCharm
2) Check if the system interpreter you want to upgrade to is on the system PATH
You can quickly check by running
python -c "import platform; print(platform.architecture())"
For you, the output should look like this
('64bit', 'WindowsPE')
If your output is different, you'll need to use the absolute path to the 64-bit Python executable in step 4).
3) Navigate to the virtual environment's directory
The directory you're looking for contains the Include, Lib and Scripts directories and the pyvenv.cfg file. From your screenshots, it seems this directory is your project's root directory, so in your case:
cd C:\Users\solei\PycharmProjects\Will2.0\
4) Upgrade the virtual environment
python -m venv --upgrade .
... or if Python is not on your path
C:\Users\solei\AppData\Local\Programs\Python\Python38\python.exe -m venv --upgrade .
The . in the commands refers to the current directory.
5) Open PyCharm and verify your environment is working correctly
... or simply try to run pip directly from the command line. Note you need to first activate the virtual environment by running the Scripts\activate.bat batch file.
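For example, from the command line (adjust the path if your environment lives elsewhere):

cd C:\Users\solei\PycharmProjects\Will2.0\
Scripts\activate.bat
python -c "import platform; print(platform.architecture())"
pip --version

After activation, the architecture check should now report ('64bit', 'WindowsPE'), and pip --version should mention the environment's own path rather than the deleted Python38-32 folder.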
If the above-mentioned method doesn't work, you might have to create a new virtual environment. You can create one easily without making a new PyCharm project; see this PyCharm documentation for reference. However, you'll still need to reinstall all the required packages.
For simplicity, I recommend creating the new virtual environment in a .venv folder located in the project's root.
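Creating it from the command line would look roughly like this, assuming you have (or first export) a requirements.txt listing the packages you had installed:

cd C:\Users\solei\PycharmProjects\Will2.0\
C:\Users\solei\AppData\Local\Programs\Python\Python38\python.exe -m venv .venv
.venv\Scripts\activate.bat
pip install -r requirements.txt

Afterwards, point PyCharm at .venv\Scripts\python.exe as the project interpreter.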
Disclaimer
I only tested Python's behavior on a fresh Windows installation inside the Windows Sandbox. I was able to install 32-bit Python, create a virtual environment, replace Python with the 64-bit version, and upgrade the virtual environment so that it launched correctly again.
I have an existing Django project which I have developed using the Python libraries installed on the system, adding missing ones to the system as needed. But now a conflict has come up for python-requests: the system has version 2.2 but I need >2.5. I don't want to uninstall it and put a newer one in its place, as that may break the OS. So now I want to use virtualenv and install packages there in complete isolation from the OS.
I think the solution you're looking for is to download a different version of Python without uninstalling your original, then create the environment with virtualenv, passing it the path to the new Python executable, like this: virtualenv -p <path-to-python-executable> venv. Then just do source venv/bin/activate as usual. This starts the virtual environment using the Python executable you passed to virtualenv on the command line.
Also, this might not be the only way, but there's something called ModuleFinder which lets you get a list of all the modules your script imports, in case you don't want to type them out manually and you have extra modules installed globally (otherwise pip freeze > requirements.txt would do the job, and your new virtual environment could install all the packages in requirements.txt).
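A rough sketch of that workflow, assuming the newly installed interpreter is at /usr/local/bin/python3 (adjust the path to wherever yours actually lives):

virtualenv -p /usr/local/bin/python3 venv
source venv/bin/activate
pip install "requests>=2.5"
python -c "import requests; print(requests.__version__)"

The system copy of python-requests stays at 2.2 and untouched; the newer version exists only inside venv, so nothing in the OS can break because of it.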
I am a newbie using virtualenv and recently tried to create one using PyCharm. During the process, PyCharm asks me to specify the project location, the application name, and the virtualenv name and location. My doubt is: after I specify the name and location of the virtualenv, must the Django project files be located inside the virtualenv, or is it possible to have the virtualenv files in a different location than the Django project files?
Maybe I am not understanding the purpose of virtualenv. Perhaps virtualenv is just a list of the dependencies of my project (the Python version, Django version, pip version, Jinja2 version and all other required files), but not necessarily the Django application files (the website that is being developed).
Thanks in advance.
Ya, I think you misunderstand what virtualenv does:
https://virtualenv.pypa.io/en/latest/
The basic problem being addressed is one of dependencies and versions, and indirectly permissions. Imagine you have an application that needs version 1 of LibFoo, but another application requires version 2. How can you use both these applications? If you install everything into /usr/lib/python2.7/site-packages (or whatever your platform’s standard location is), it’s easy to end up in a situation where you unintentionally upgrade an application that shouldn’t be upgraded.
Your project files don't need to be (and shouldn't be) where the virtualenv files are.
virtualenv installs your app's Python dependencies in a folder belonging to the specific virtualenv that is being used.
If you were not using virtualenv, the dependencies would be installed into the site-packages folder for your system. Either way, the dependencies aren't installed in your project directory, and your project directory isn't in your system's site-packages directory.
Using virtualenv doesn't change that; it just changes where the dependencies are installed.
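One way to see this is to ask the interpreter where it is looking for packages, once outside and once inside an activated virtualenv:

python -c "import sysconfig; print(sysconfig.get_paths()['purelib'])"

Outside a virtualenv this prints the system-wide site-packages directory (for example /usr/lib/python3.x/site-packages, depending on the platform); with the virtualenv active it prints the site-packages folder inside the virtualenv. The project directory itself never shows up in either case.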
virtualenv is not just a list of dependencies! It actually holds all the modules themselves under its umbrella. Think of a virtualenv as a space that isolates all the packages used by your project from the rest of the packages installed previously or at a later time. Yes, there is an option to have the virtualenv also make use of packages that are "outside" of the environment, but that's just an option.
The main purpose of a virtualenv is to let you use the package versions of your choice and keep them isolated from the rest of the system. Usually, the list of packages belonging to a specific virtualenv is captured in a file, requirements.txt. If you want to run the project on a different machine or share it with someone, having requirements.txt makes it easy to recreate the environment via pip install -r requirements.txt from within the virtualenv.
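In practice that round trip looks something like this, run from inside the activated virtualenv:

pip freeze > requirements.txt
pip install -r requirements.txt

The first command captures the exact package versions the project uses; the second, run later inside a fresh virtualenv on another machine, recreates the same set of packages there.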