How to Copy/Clone a Virtual Environment from Server to Local Machine - python

I have an existing Python Django project running on a web server. The client now needs some changes made to the existing code, so I need to set the project up on my local machine. All the packages needed for this project are installed in a virtual environment. How can I copy or clone this virtual environment to my local machine to run this project?

Run pip freeze > requirements.txt on the remote machine
Copy that requirements.txt file to your local machine
In your local virtual environment, run pip install -r requirements.txt
And, so long as all of the requirements are well-behaved Python packages, you should be good to go.
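Put together, the whole round trip might look like this (a sketch; the venv directory name is arbitrary and the activation command depends on your OS):
# On the server, inside the project's virtual environment:
pip freeze > requirements.txt

# Copy requirements.txt to your local machine, then:
python -m venv venv
source venv/bin/activate      # Windows: venv\Scripts\activate
pip install -r requirements.txt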

Use the pip freeze command and you will get a text file with all the package versions.
Then install them using easy_install or pip install.

Related

package not being permanently installed inside the python virtual environment

I'm building a REST API using the Django Python framework, and I'm using many external Python packages. I created a Python virtual environment (python -m venv venv) and, after activating it (venv\Scripts\activate), I installed the requests package (python -m pip install requests). Then I pushed my project to my git repo and cloned it onto another machine. When I tried to run my Django project there, it asked me to install the requests package again.
Why does this happen, and how can I permanently install packages into my Python virtual environment (or somewhere else) so that I don't have to install them again on different machines? I'm looking for a solution similar to NodeJS/npm, where all the packages are installed locally into the project's node_modules folder and you don't have to reinstall them on different machines. Thanks
The environment itself is not shareable in the way you describe; I'd recommend using Docker for this use case. If you create a Docker image with the correct dependencies, you can easily work in the same environment on different computers. A Python venv cannot be used this way.
Nevertheless, if your requirements.txt files specify package versions, then the venvs you create on the two machines should be relatively similar (depending, of course, on other parameters like the OS, Python version, etc.).
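For the Docker route, a minimal sketch of such an image could look like this, assuming the project keeps a requirements.txt at its root (the base image, port, and run command here are placeholders to adapt):
# Dockerfile (sketch)
FROM python:3.10-slim
WORKDIR /app
# Install dependencies first so this layer is cached between builds
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
# Placeholder run command for a Django project
CMD ["python", "manage.py", "runserver", "0.0.0.0:8000"]
Build it once with docker build -t myapi . and run it anywhere with docker run -p 8000:8000 myapi; the dependencies travel inside the image.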

Unable to clone Python venv to another PC

I want to clone my existing venv to another PC, but simply copying and pasting it does not work. When I copy the venv to the second machine and run
pip list
it only lists pip and setuptools as installed dependencies.
I tried another way to clone the packages.
I created a new venv on the second machine and copied all the files from the first venv into it, skipping the existing files with the same name in the new venv. Now, when I run
pip list
it shows all the dependencies, but when I try to launch Jupyter Notebook as
jupyter notebook
it gives the following error:
Fatal error in launcher: Unable to create process using '"f:\path\to\first_venv\on_first_machine\scripts\python.exe"
"C:\path\to\new_venv\on_the_second_machine\Scripts\jupyter.exe" notebook': The system cannot find the file specified.
I don't know how to make things work. Please help!
Edit
The problem is that I don't have an internet connection on the second machine. It's actually a remote machine with some security protocols applied, and having no internet connection is part of the security! My bad :'(
You can't copy-paste venvs from one machine to another, since scripts in them may refer to system locations. (The same goes for attempting to move venvs within a machine.)
Instead, recreate the environment on the new machine:
On the old machine, run pip freeze -l > packages.txt in the virtualenv.
Move packages.txt over to the new machine.
Create a new virtualenv on the new machine and enter it.
Install the packages from the txt file: pip install -r packages.txt.
EDIT: If you don't have internet access on the second machine, you can continue from step 2 with the steps below (put together in the sketch after this list):
Run pip wheel -w wheels -r packages.txt in the venv on the first machine. This will download and build *.whl packages for all the packages you require. Note that this assumes both machines are similar in OS and architecture!
Copy the wheel files over to the new machine.
Create a new virtualenv on the new machine and enter it.
Install the packages from wheels in the new virtualenv: pip install *.whl.
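Put together, the offline transfer might look like this (a sketch; directory names are placeholders, and both machines need a similar OS, architecture, and Python version). The --no-index/--find-links form is an alternative to pip install *.whl that also works in shells that don't expand *:
# On the first (online) machine, inside the venv:
pip freeze -l > packages.txt
pip wheel -w wheels -r packages.txt

# Copy packages.txt and the wheels directory to the second machine, then:
python -m venv venv
venv\Scripts\activate         # on Windows; Linux/macOS: source venv/bin/activate
pip install --no-index --find-links=wheels -r packages.txt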
You should never copy a virtual environment between machines. The correct way is to export the dependencies installed in the environment using pip freeze and create a new virtual environment on the other machine.
# On the first machine
pip freeze > requirements.txt
# Copy requirements.txt to the other machine, or store in a source repository
# Then install the requirements in the new virtual environment
pip install -r requirements.txt

Pipenv: Activate virtual environment on new computer after git clone

I copied a repo containing a Python Django project from my old computer to my new one via git clone. I manage the dependencies in the project via pipenv.
After successfully cloning my repo, I wanted to start working on the new computer and tried to select the relevant Python interpreter for my project in VS Code. However, the path was not on the list.
So first I tried the command pipenv --venv, which gave me the feedback: No virtualenv has been created for this project
So I thought I might need to activate the virtual environment first, before being able to select it in VS Code. So I ran pipenv shell in the root directory of my project.
However, this seems to have created a new virtual environment: Creating a virtualenv for this project… Pipfile: C:\path\to\my\cloned\project\Pipfile
My questions:
1.) Is this the correct way to activate a pipenv virtual environment on a new computer after copying the project via git clone? And if not,...
2.1) Does the way I did it cause any problems I should be aware of?
2.2) What would be the correct procedure to activate my virtual environment on a new computer?
In general, the environment itself probably shouldn't be committed to GitHub; you'll get a bunch of unneeded files that clog your repo.
Instead you should create a requirements.txt from your existing environment pip freeze > requirements.txt and commit that file.
Then, when someone else clones your repo, they can set up a new virtual environment using any tool they want and run python -m pip install -r requirements.txt.
That is, requirements.txt is like a recipe for how to create your environment. By providing the recipe, you let users follow it any way they want.
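Concretely, on a fresh clone that recipe might look like this (a sketch; the environment directory name is arbitrary):
python -m venv venv
source venv/bin/activate      # Windows: venv\Scripts\activate
python -m pip install -r requirements.txt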
Use:
pipenv install
It worked on Ubuntu and should also work on a Mac. I tried it on Windows and it triggered some errors.
"If you download a source repository for a project that uses Pipenv for package management, all you need to do is unpack the contents of the repository into a directory and run pipenv install (no package names needed). Pipenv will read the Pipfile and Pipfile.lock files for the project, create the virtual environment, and install all of the dependencies as needed."
https://www.infoworld.com/article/3561758/how-to-manage-python-projects-with-pipenv.html
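Put together, setting up on the new computer might look like this (a sketch; the repository URL is a placeholder):
git clone https://github.com/you/your-project.git
cd your-project
pipenv install    # reads Pipfile/Pipfile.lock, creates the venv, installs dependencies
pipenv shell      # activates the environment
pipenv --venv     # now prints the venv path, e.g. for picking the interpreter in VS Code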

how to commit a virtual environment

I have Python code with a virtual env and libraries on my local machine, but the GitHub repository contains no virtual env or libraries.
I removed the virtual environment from .gitignore, but the libraries are still not getting committed to the repository, and when I clone it onto another system they are missing.
What is the best practice: committing the env and libraries to the repository, or installing them again on each computer? If it's acceptable to commit the virtual env, can you please advise how to do it?
I personally do it by installing through a requirements file and pushing that file to the repo. E.g.:
pip3 freeze > requirements.txt
will make a list of all packages and libraries you've installed in the virtual environment (assuming, of course, that you're in the environment and haven't transferred any from the global environment). If you push the requirements.txt file, you can create a new virtual environment as usual and install the requirements:
pip3 install -r requirements.txt

Do we need to upload virtual env on github too?

This is my GitHub repo
https://github.com/imsaiful/backmyitem
I push from my local machine and pull the changes on Amazon EC2.
Earlier I had not added the virtual env to my repo, but now I have changed some files in the admin directory, which lives inside the virtual env. So should I add the virtual env to my GitHub repo too, or should I instead make the same change manually on my remote server?
As was mentioned in a comment, it is standard to do this through a requirements.txt file instead of including the virtualenv itself.
You can easily generate this file with the following:
pip freeze > requirements.txt
You can then install the virtualenv packages on the target machine with:
pip install -r requirements.txt
It is important to note that including the virtualenv will often not work at all, as it may contain full paths for your local system. It is much better to use a requirements.txt file.
No. Although the environment is 100% there, if someone else were to pull it down, the paths baked into it won't match their system, not to mention that Python version discrepancies will likely crop up.
The best thing to do is to create what is known as a requirements.txt file.
When you have created your environment, you can pip install this and pip install that. You'll start to build up a number of project-specific dependencies.
Once you start to build up a number of project dependencies, I would freeze your local Python environment (analogous to a package.json for Node.js package dependency management). I would recommend doing the following in your terminal:
(local_python_environment) $ pip install django && pip freeze > requirements.txt
(local_python_environment) $ pip install requests && pip freeze > requirements.txt
That is to say, freeze your environment to a requirements.txt file every time a new dependency is installed.
Once a collaborator pulls down your project, they can then install a fresh Python environment:
$ python3 -m venv local_python_environment
(* Please use Python 3 and not Python 2!)
And then activate that environment and install from your requirements.txt which you have included in your version control:
$ source local_python_environment/bin/activate
(local_python_environment) $ pip install -r requirements.txt
Excluding your virtual environment is probably analogous to ignoring node_modules! :)
No, it's not necessary to upload the virtualenv to GitHub. Git will only ignore it when you push your code if you add it to your .gitignore.
Virtual Environment
Basically, a virtual environment is a tool that helps keep the dependencies required by different projects separate by creating an isolated Python environment for each of them. It is one of the most important tools that most Python developers use. Apart from that, you can add a requirements.txt file to your project.
requirements.txt
It is a file that lists which libraries and applications are needed to run the application. You can create a requirements.txt file with this simple command:
pip freeze > requirements.txt
After running this command, all installed applications and libraries are added to this file. Note that if you work on your project without activating a virtualenv, pip will use the system environment instead, and the file will also include packages that are not necessary for your project.
You should add the virtualenv to your .gitignore. In fact, GitHub has a recommended .gitignore format for Python covering which files should be added and which shouldn't:
Github recommendation for gitignore
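As a minimal sketch, the relevant entries typically look like this (the environment directory name depends on what you called yours):
# Virtual environment directories
venv/
env/
# Python bytecode
__pycache__/
*.py[cod]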
