Cleaning up system env of conda packages - python

I forgot to activate a new conda environment I had created, and started installing packages without it. Now my system is littered with packages. How do I clean this up? I only want conda packages in the base environment and in the environments I create.
I checked
How to clean local python environment of conda installs
but that is of no help to me.

Do you have pip installed? If so, run pip freeze and save the output to a file. Review the file and make sure it does not list any packages you want to keep. Then write a script that reads the file into a list, keeping only the package names, and have Python loop over that list and uninstall each entry.
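A minimal sketch of such a script, assuming the pip freeze output was saved to a file named installed.txt (the filename and the use of subprocess to call pip are my own choices, not part of the answer above):

import subprocess
import sys

# Read the saved "pip freeze" output; each line looks like "package==1.2.3".
with open("installed.txt") as f:
    packages = [line.split("==")[0].strip() for line in f if line.strip()]

# Review this list carefully before running, then uninstall each entry without prompting.
for package in packages:
    subprocess.run([sys.executable, "-m", "pip", "uninstall", "-y", package])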

Related

Create new conda environment with latest python AND all the packages that I've added in an existing environment

I want to create a new conda environment with the latest Python (3.10 or later) AND the appropriate versions of all the packages (like matplotlib and pandas) that I've added to an existing environment (it's NOT the base environment). I don't recall what all of these packages are. Is there a way to do this without breaking things?
You can get all package versions with pip freeze.
Put these in a requirements.txt file.
Create a new environment with the appropriate python version.
Now install all packages with pip install -r requirements.txt
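Put together, the pip-based workflow might look something like this (old_env, new_env and python=3.10 are just placeholder names):

conda activate old_env
pip freeze > requirements.txt
conda create -n new_env python=3.10
conda activate new_env
pip install -r requirements.txt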
The Conda way to export all packages is conda env export > environment.yml.
In the environment file, modify the python version if needed, and remove unnecessary packages.
Create the new environment using conda env create -f environment.yml. If there is a conflict (e.g. a dependency conflict), it will tell you.
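For illustration, a trimmed environment.yml might look something like this (the package list here is made up, not the asker's actual environment):

name: new_env
channels:
  - defaults
dependencies:
  - python=3.10
  - pandas
  - matplotlib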
Also, check out this question for dealing with cross-dependencies.

pip and python packages missing after attempting to upgrade a package

I just realized that pip was somehow uninstalled and all my packages are missing. My apps have stopped running in the local environment.
I was attempting to upgrade pandas using pip3 and conda. I had the environment up and running fine until then.
Is there a way to recover installed packages or restore the environment?
When I run pip3 list, I get:
Package Version
---------- -------
pip 10.0.1
setuptools 39.0.1
There probably isn't any easy way to restore the packages. You could inspect your console output, because pip shows which packages were uninstalled, and just install them again.
Good practice for next time is to store the packages needed for each script in a requirements.txt file, and to keep environments separate so that each script has its own virtual environment with packages at the required versions. You can read more about venv here.
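For example, a per-project setup along those lines could look like this (the .venv directory name is just a common convention):

python3 -m venv .venv
source .venv/bin/activate        # on Windows: .venv\Scripts\activate
pip install -r requirements.txt
pip freeze > requirements.txt    # update the file after adding new packages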
It seems your Python has been updated, either manually or by your IDE. If you run into this problem, try changing the order of the environment paths. Search for "environment variables" on Windows and check whether you have two different Python versions installed. If so, move your older Python version above the newly installed one. This may help, but as mentioned in the first answer, using a virtual environment for future projects is best practice.
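To check which interpreters are on PATH and in what order, you can run where python in cmd; the paths below are only an illustration:

where python
C:\Users\you\AppData\Local\Programs\Python\Python310\python.exe
C:\Users\you\AppData\Local\Programs\Python\Python37-32\python.exe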

Creating new Conda environment messes with old environments?

I have two different projects with different dependencies. One requires TensorFlow 1.15 and the other needs 1.14. I first created an environment env1 and pip installed TF 1.14, ran my code, and all went well. Then I created a new environment env2 and pip installed TF 1.15, during which I could see it was uninstalling TF 1.14; I assumed it knew what it was doing. However, now when I run my code in env1 it throws errors, because TF 1.14 is removed and env1 also tries to use TF 1.15!
What am I doing wrong? I thought we use Conda to create completely separate environments for exactly this kind of situation, but I'm confused.
You should not have used pip to install the packages. If you have Anaconda, you should use conda to install them; for example, conda install numpy will install the numpy module. You need:
conda install your-module
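As a sketch of the conda-only approach, one environment per TensorFlow version (assuming both versions are available from your conda channel; python=3.7 is a guess, since TF 1.x needs Python 3.7 or older):

conda create -n env1 python=3.7
conda activate env1
conda install tensorflow=1.14

conda create -n env2 python=3.7
conda activate env2
conda install tensorflow=1.15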

virtualenv - Birds Eye View of Understanding

Using Windows
Learning about virtualenv. Here is my understanding of it and a few questions that I have. Please correct me if my understanding is incorrect.
Virtualenvs are environments where your pip dependencies, at their selected versions, are stored for a particular project. A folder is made for the environment, and the dependencies live inside it.
I was told you would not want to save your .py scripts inside the virtual env. If that's the case, how do I access the virtual env when I want to run that project? Open the command line, run source ENV/bin/activate, then cd my way to where my script is stored?
Does running pip freeze create a requirements.txt file in that project folder that is just a text copy of the dependencies of that virtual env?
If I'm in a second virtualenv, how do I import another virtualenv's requirements? I've been through the documentation but I still don't get it.
$ env1/bin/pip freeze > requirements.txt
$ env2/bin/pip install -r requirements.txt
I guess I'm confused by the "requirements" description. Isn't it best practice to always call our requirements file requirements.txt? If that's the case, how does env2 know I want env1's requirements?
Thank you for any info or suggestions. Really appreciate the assistance.
I created a virtualenv:
C:\Users\admin\Documents\Enviorments>virtualenv django_1
Using base prefix 'c:\\users\\admin\\appdata\\local\\programs\\python\\python37-32'
New python executable in C:\Users\admin\Documents\Enviorments\django_1\Scripts\python.exe
Installing setuptools, pip, wheel...done.
How do I activate it? source django_1/bin/activate doesn't work.
I've tried: source C:\Users\admin\Documents\Enviorments\django_1/bin/activate, and every time I get: 'source' is not recognized as an internal or external command, operable program or batch file.
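On Windows, virtualenv puts the activation script in the Scripts folder rather than bin/, and source is a bash builtin that cmd.exe does not recognize, so activation in cmd would typically look like this (path taken from the output above):

C:\Users\admin\Documents\Enviorments>django_1\Scripts\activate
(django_1) C:\Users\admin\Documents\Enviorments>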
Disclaimer: I mainly use conda environments instead of virtualenv, but I believe most of this is the same across both of them and applies to your case.
You should be able to access your scripts from any environment you are in. If you have virtenvA and virtenvB then you can access your script from inside either of your environments. All you would do is activate one of them and then run python /path/to/my/script.py, but you need to make sure any dependent libraries are installed.
Correct, but for clarity: the requirements file contains a list of the dependencies by name and version only. It doesn't contain any actual code or packages. You can print out a requirements file, but it should just be a list of package names and their version numbers, like pandas==1.0.1, numpy==1.0.1, scipy==1.0.1, and so on.
In the lines of code you have here, you would export the dependency list of env1 and then install those dependencies in env2. If env2 was empty, it will now just be a copy of env1; otherwise it will be the same but with all the packages of env1 added, and if it had different versions of some of the same packages, those would be overwritten.
virtualenv simply creates a new Python environment for your project. Think of it as another copy of Python on your system. A virtual environment is helpful for development, especially if you will need different versions of the same libraries across projects.
The answer to your first question is yes: for each project where you use virtualenv, you need to activate it first. After activating, any Python script you run, not just your project's scripts, will use the dependencies and configuration of the active Python environment.
The answer to the second question: pip freeze > requirements.txt creates the requirements file in the current working directory, not necessarily in your project folder. So if in your cmd/terminal you are in C:\Desktop, the requirements file will be created there; if you're in the C:\Desktop\myproject folder, it will be created there. The requirements file will contain the packages installed in the active virtualenv.
The answer to the third question is related to the second. Simply put, you need to write the full path of the other requirements file. So if you are in the first project and want to install packages from the second virtualenv, you run something like env2/bin/pip install -r /path/to/my/first/requirements.txt. If your terminal is in a folder that does not have a requirements.txt file, then a bare pip install -r requirements.txt will give you an error; the command does not know which requirements file you want to use, so you have to specify it.
Yes, keeping the virtualenv separate from your project files is one approach; virtualenvwrapper and pipenv work that way. But personally, when I use virtualenv in its simplest form, I just create a directory with the same name inside the virtualenv's directory (next to bin/) and keep the project files there.
pip freeze prints to the console the packages (and their versions) you've installed inside your virtualenv using pip. If you want to save those requirements to a file, you should do something like pip freeze > requirements.txt.
There are a few possibilities:
you can activate one virtualenv, then go (cd /path/to/venv2) to another virtualenv.
you can copy your requirements.txt file from one virtualenv and install those requirements in your second virtualenv.

How to move installed packages to a newly created virtual environment?

I've downloaded a lot of packages into the global environment (let's call it that). Now I want to create a new virtual environment and move some of the packages to that environment. How would I do that?
While you could copy files/directories from the site-packages directory of your global installation into the site-packages of your virtual env, you may experience problems (missing files, binary mismatch, or others). Don't do this if you're new to python packaging mechanisms.
I would advise that you run pip freeze from your global installation to get a list of what you installed, and then store that output as a requirements.txt with your source, and put it under source management. Then run pip install -r requirements.txt after activating your virtualenv, and you'll replicate the dependencies (with the same versions) into your virtualenv.
If you try to copy or rename a virtual environment, you will discover that the copied environment does not work. This is because a virtual environment is closely tied to both the Python it was created with and the location it was created in. (The "relocatable" option does not work.)
However, this is very easy to fix. Instead of moving/copying, just create a new environment in the new location. This worked for me (or see the link below):
pip install virtualenv
virtualenv NameOfYourVirtualEnvironment
source NameOfYourVirtualEnvironment/bin/activate
Then run pip freeze > requirements.txt in the old environment (in your case, the global environment) to create a list of the packages installed in it. With that, you can just run pip install -r requirements.txt in the new environment to install the packages from the saved list. Of course, you can also copy requirements.txt between machines. In many cases it will just work; sometimes you might need a few modifications to requirements.txt to remove OS-specific stuff.
Source: https://chriswarrick.com/blog/2018/09/04/python-virtual-environments/
This may also work for you:
How to import a globally installed package to virtualenv folder
https://gist.github.com/k4ml/4080461
