I have created a virtual environment named knowhere and I activate it in cmd using the command .\knowhere\Scripts\activate. I have installed some libraries into this environment.
I have some Python scripts stored on my PC. When I try to run them, they fail because they are not running in this virtual environment. How do I make these scripts run?
Also, is there any way to make "knowhere" my default environment?
Virtual environments are only necessary when you want to work on two projects that use different versions of the same external dependency, e.g. Django 1.9 and Django 1.10 and so on. In such situations a virtual environment can be really useful for maintaining the dependencies of both projects.
If you simply want your scripts to use Python libraries, just install them on your system and you won't have that problem.
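That said, if you do want a script to use the libraries installed in knowhere without activating the environment each time, you can invoke the environment's interpreter directly (a minimal sketch, assuming the layout from the question; myscript.py is a hypothetical script name):
.\knowhere\Scripts\python.exe myscript.py
Running a script through the environment's own python.exe gives it that environment's packages with no activation step.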
I have globally installed modules on my PC, but when I create a virtualenv, some of the modules are already preinstalled in it; yet when I execute 'pip freeze' in my virtualenv, there are no installed modules. Commands like django-admin and cookiecutter already work in my virtualenv though I have never installed them in it. But other commands like numpy or pandas do not work, though I have installed them globally on my machine just like django or cookiecutter. How do I fix this? I am using Python version 3.9.6.
TL;DR: The django-admin and cookiecutter commands are accessible from your virtual environment because they are on PATH. This isn't related to the Python virtual environment; it's a property of your system as a whole. If you want to make global packages accessible in your virtual environment, see this answer.
django-admin and cookiecutter are executables. They're located in some folder on your system (most likely the Scripts folder of your Python installation), and that folder is on PATH. Therefore, the shell can find them whether or not you are in a virtual environment.
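You can verify this yourself by asking the shell where it finds these executables (where is the Windows lookup command; on macOS/Linux the equivalent is which):
where django-admin
where cookiecutter
Whichever folder gets printed is on PATH, which is why these commands keep working inside the virtual environment.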
By contrast, numpy and pandas are only libraries. When you try to import them in code run inside the virtual environment, they cannot be found. This can be changed either by installing them in the virtual environment or by including system site packages, which you can see how to do in this answer.
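For reference, including system site packages is a flag you set when creating the environment (a sketch; myenv is a placeholder name):
python -m venv --system-site-packages myenv
An environment created this way can import globally installed libraries such as numpy and pandas, while anything you pip install still goes into the environment itself.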
If you tried to import django or cookiecutter, that wouldn't work either (in your virtual environment), just like numpy or pandas. There's no way to "fix this", because it isn't broken. I wouldn't suggest removing Scripts from PATH, because that would mean those commands would never be accessible.
I have created a Python virtual environment using virtualenv for Python 2.7.18 64 bit and another virtual environment using venv for Python 3.5.4 64 bit.
I was hoping to be able to commit these items into version control so that other users of the project could access them without having to set up a Python environment themselves. Another issue is that some of the workstations will not have internet access to easily create a virtual environment from scratch, so using a requirements.txt file is not a valid solution.
It seems like there are a fair amount of issues preventing a virtual environment (whether using virtualenv or venv) from being easily 'copied' and executed on another system.
Is what I am describing even possible? I have tried tinkering with the 'activate' scripts to remove some of the hard-coded paths, but that doesn't seem to do the trick.
Thanks
Have you considered using Docker? If you just ship an image (or use docker-compose for multiple images), the user will not need to set up a virtual environment at all.
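A minimal sketch of what that could look like (requirements.txt and app.py are hypothetical names here; the image is built once on a machine with internet access and then shipped to the offline workstations):
FROM python:3.5
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
CMD ["python", "app.py"]
Building with docker build -t myproject . bakes the interpreter and all dependencies into the image, so docker run myproject works without any Python installation on the target machine.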
I need to make a fully self-contained Python environment for Mac, which can be transferred between different Macs running OSX and function independently of any other Python installations (or lack thereof).
In the environments I have created (using venv), I find that the /bin/python file and others are in fact aliases. When I replace these aliases with the original executables, the environment still does not work on other computers due to references to files within a Python installation, such as
Library not loaded: /Library/Frameworks/Python.framework/Versions/3.9/Python
Is there a way to make the environment fully self-contained?
I would guess so, since tools such as pyinstaller and cx_freeze exist to make executables out of Python applications. So these must contain standalone environments somehow. But for my purposes, it is the environment itself that I need, not an executable as these tools provide.
You can try using a virtual environment. Python comes with venv, which allows you to create virtual environments. You could consider them self-contained, since all the necessary scripts for running Python are in the virtual environment. The same goes for package dependencies: you can have a variety of packages installed on your computer, but the virtual environment will have none of them installed (unless, of course, you install them yourself).
To create a virtual environment, run:
$ python3.9 -m venv virtualenv
You can replace 3.9 with whichever Python version you are using, and you can replace virtualenv with whatever you want to name your virtual environment.
To activate the virtual environment, you would run:
$ source virtualenv/bin/activate
And to deactivate it, you would run:
$ deactivate
Bottom Line:
I can get everything to work by configuring two separate virtual environments, one for pyCharm and one for the CLI. Is this really necessary, or should I be able to use one virtual environment for both, as I expected?
More Detailed explanation:
I'm very new, so this is probably a facepalm type of question; I'll try to be terse.
I'm using Linux Mint, Python 3.6, django 3.0.3, and pyCharm 2019.3.1.
I can create a virtual env using venv in the CLI and it works.
I can also create a NEW virtual env in pyCharm through the Settings: Project: Interpreter interface, and it works; however, it doesn't offer venv as an option, only virtualenv.
But if I try to activate the virtual env I created in pyCharm from the CLI (using virtualenv of course, not venv), it fails hard and thinks I'm using Python 2.7, which isn't even installed on my system. If I try to point pyCharm at the virtual env I set up on the CLI, I get an error 134.
Is this just a known/expected issue? Must I have two virtual environments for every project I want to access via both pyCharm AND the CLI? And I assume this is unrelated, but I also find it odd that pyCharm lists my interpreter as Python 3.7, which also is not installed on my system. I'm using 3.6 alone.
Thanks for your time.
At this time, I'm going to just answer this as: you need a separate virtual env for each (pyCharm and CLI) as this approach is not difficult or time-consuming and I have not had any issues working in this way.
Coming from JavaScript I'm familiar with NPM.
There you can install packages globally (by using the -g flag) or locally in a project.
In Python they have these Virtual Environments.
I'm still a bit uncertain why they are needed. I know that it is for having the same package in different versions on one machine.
Is it because Python doesn't have the concept of local project-installations?
All packages are installed globally and there's no way around that. That's how it seems to me, anyway ...
And so they have these Virtual Environments instead?
Am I right there?
Virtual environments make it possible for you to encapsulate dependencies per project.
Python has no node_modules equivalent. When you install something with pip, it goes to your site-packages folder. To find this folder, you can run python -m site and it will print out the folders where Python searches for packages.
Example on Fedora 29:
➜ ~ python -m site
sys.path = [
'/home/geckos',
'/usr/lib/python27.zip',
'/usr/lib64/python2.7',
'/usr/lib64/python2.7/plat-linux2',
'/usr/lib64/python2.7/lib-tk',
'/usr/lib64/python2.7/lib-old',
'/usr/lib64/python2.7/lib-dynload',
'/usr/lib64/python2.7/site-packages',
'/usr/lib/python2.7/site-packages',
]
USER_BASE: '/home/geckos/.local' (exists)
USER_SITE: '/home/geckos/.local/lib/python2.7/site-packages' (doesn't exist)
ENABLE_USER_SITE: True
pip vs package manager
If you don't use virtual environments, you may end up with packages installed side by side with the operating system's Python packages, and this is where the danger is. Packages may be overwritten and things get messy fast. For example, you install Flask with pip, then install Jinja2 with the package manager; later you remove Jinja2 and break Flask. Or you update your system: Jinja2 gets updated but not Flask. Or, even simpler, you install something with the package manager and remove it with pip; now the package manager is in a broken state.
Because of this, we always use virtual environments, and even separate virtual environments per project.
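As a quick sanity check (a sketch; env is a placeholder name), you can confirm that an activated environment points python at its own folder rather than the system one:
$ python3 -m venv env
$ source env/bin/activate
(env) $ python -c "import sys; print(sys.prefix)"
Inside the environment this prints the path of the env folder; outside it, it prints the system prefix such as /usr.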
Creating and maintaining virtual environments
Nothing prevents you from keeping your virtual environment in the same folder as your project. This way you will have the same feeling that you have with node_modules. You can create it with
virtualenv <SOME_FOLDER> for python 2
or
python3 -m venv <SOME_FOLDER> for python 3
Conventions that I've seen
If you're keeping the virtual environment as a subfolder of your project, I usually call it env or venv.
Another option is keeping them all in the same folder inside your home directory; I've been using ~/.venv/<PROJECT>.
Pipenv
Finally, there is an alternative that I like more than pure pip. Pipenv is a tool that manages virtual environments automatically for you. It feels closer to yarn and has more features.
To create a virtual environment for a project, just run pipenv --three or pipenv --two in your project folder. It will create and manage the virtual environment and write dependencies to the Pipfile. It also supports development packages; I really think it is worth trying. Here are the docs: https://pipenv.kennethreitz.org/en/latest/
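As a sketch of the day-to-day workflow (flask and pytest are just example packages):
$ pipenv --three              # create a Python 3 environment for this folder
$ pipenv install flask        # install a package and record it in the Pipfile
$ pipenv install --dev pytest # record a development-only dependency
$ pipenv shell                # spawn a shell with the environment activated
pipenv install also writes a Pipfile.lock with exact versions, which is what makes installs reproducible across machines.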
I hope this helps, regards
Is it because Python doesn't have the concept of local project-installations?
Correct.
Well, mostly correct. There are a number of "modern" Python package managers that support project-local package installation. Right now the big two are pipenv and poetry.
However, all of these libraries are fundamentally wrappers over the basic Python virtual environment mechanism. It's the basis of the ecosystem.
Global package management is a little thorny in Python because Unix systems tend to come with a "system Python" installation that supports parts of the operating system. Installing or updating packages in the system Python is a very bad idea, so you always want to be working in a Python you installed yourself: either a fully separate installation, or at least a virtual environment of the system Python.