My question is: do I have to install Django every single time in my virtual environment in order to run my Python files? And is this taking up a bunch of space on my machine? My project also uses matplotlib, and every virtual environment I create also requires me to install the matplotlib module. It's getting annoying. Do I have to do this every time?
I'm new to Django. I wanted to run some Python files in Django, but they weren't working, so after some research I found out I needed to run my PyCharm project in a virtual environment in order to run these Python files.
My folders look like this: pycharmProjects -> my project
I enter pycharmProjects and set up a virtual environment using "pipenv shell". Then I run "python3 manage.py runserver". It turns out I must install Django in the virtual environment before the files will run.
Short answer: no, you don't have to use a virtual environment at all and can install your dependencies globally instead. However, you will soon find that this causes a lot of issues. The main reason to create a virtual environment is to keep control of your dependencies and prevent the bugs that come from projects getting their wires crossed with each other.
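Since the question already uses pipenv, here is a minimal sketch of how the dependencies only have to be declared once per project (the package names come from the question; everything else is illustrative):
pipenv install django matplotlib   # records both packages in a Pipfile next to manage.py
pipenv shell                       # activate the project's environment
python3 manage.py runserver
If that environment is later deleted, a plain pipenv install in the project folder recreates it from the Pipfile, so nothing has to be typed in package by package again.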
Short answer: yes.
If you create a virtualenv, you have to install all the packages that your program needs.
Long answer:
You could install Django system-wide and then create the virtualenv with the option
--system-site-packages; then Django would be used from your globally installed Python.
(Or you could install everything just in your global Python, but I personally don't think that is good practice.)
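For example, a rough sketch of that option (the environment name is arbitrary and the commands assume the virtualenv tool is installed):
pip3 install django                          # global/user-wide install
virtualenv --system-site-packages my-env     # the new env can see the globally installed packages
. my-env/bin/activate
python -c "import django; print(django.get_version())"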
If you work with many different projects, I think you will avoid a lot of trouble by using one virtualenv per project.
"Trouble" meaning that one project breaks because a pip install for another project changed the version of some package to one the first project can't handle.
I would recommend creating a requirements.txt file for each project that lists its dependencies; then, after creating and activating the virtualenv, you can install everything with the following command:
pip install -r requirements.txt
If you have requirements.txt files, then you can recreate virtualenvs rather quickly when going back to an old project, and you can delete the virtualenvs whenever you run out of disk space. If you want to be on the safe side, type pip freeze > pipfreeze.txt prior to deleting the virtualenv and use pip install -r pipfreeze.txt if you later want to create one with the same modules and the same versions.
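As an illustration, a requirements.txt for the project above might contain (the version pins are only examples):
Django==4.2
matplotlib==3.7.1
and recreating the environment then takes just a few commands:
virtualenv .venv
. .venv/bin/activate
pip install -r requirements.txt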
You also might want to look at direnv or autoenv if you are working on a Linux-like system.
These will automatically switch to the required virtualenv when you change into a project's working directory.
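With direnv, for instance, a minimal sketch is a one-line .envrc in the project directory (direnv must already be installed and hooked into your shell):
# .envrc
layout python3
After a one-time direnv allow in that directory, direnv creates and activates a virtualenv automatically every time you cd into the project.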
Related
I spent a good deal of time over the past two days trying to use Poetry correctly. I'm not new to Poetry and have used it for a while. But in this Django project I just started with a buddy, after adding packages, some of them needed to be added again via 'python -m pip install '. Once I did that, the Django project passed a check and I could also migrate tables. It really bothers me that I'm using a virtual environment management package that doesn't appear to always work. I like Poetry and want to use it. Some of the packages I had to pip install are django and pymysql. I also use pyenv, and I'm running Python version 3.10.6.

I did things such as deleting the .venv folder (mine gets created in the project directory), checking with pip freeze that these packages are not already installed, and then doing a poetry update to re-create .venv. Any suggestions or ideas? I really need to focus on my Django project and not be farting around all the time with package issues, so I would like to either fix this or at least get a better idea of what is going wrong so I can avoid it.
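For reference, a sketch of the kinds of checks described above, using standard Poetry and pyenv commands (the output will of course differ per machine):
pyenv version          # which Python pyenv has selected for this directory
poetry env info        # which interpreter and virtualenv Poetry is actually using
poetry run pip freeze  # what is installed inside Poetry's environment, not the global one
poetry update          # re-resolve dependencies and refresh the .venv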
After learning how to code in Python, I'm starting to figure out how to structure projects and set up virtual environments, but I can't wrap my head around how packages are managed when the virtual environment is activated. To give an example: I want to make a Django project. So, first of all, I mkdir the project folder, cd into it, and then execute python -m venv [whatever]. I cd into the Scripts folder and execute 'activate'. Then I pip install Django, and pip list shows Django. Finally, I deactivate the virtual environment and run pip list again. Why is Django still listed there? Should it be?
You might have installed Django both inside and outside the venv (outside being the system Python installation). Deactivate the venv, run pip uninstall django, and then try again.
Okay, I finally understood what was happening without realizing it. On one hand, virtual environments are completely independent from the global or system installation, so whatever is installed on one side shouldn't affect the other. On the other hand, what actually happened was that, because I was inside the Scripts folder, when I tried to execute Python commands I was really executing the scripts of the same name that live there, which are the copies the virtual environment uses and which come into play when you call the "activate" script. There was actually no problem; it was me who messed things up.
Thanks to all contributors for their help.
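For future readers, a quick way to see which interpreter and pip are actually being picked up (Windows commands, matching the Scripts folder mentioned above):
where python   # lists every python.exe on the PATH; the first entry is the one that runs
pip -V         # shows which installation's site-packages this pip belongs to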
I have a Python library that I want to help out with and fix some issues in. I just don't know how to test my changes, given the complexity of how Python/pip installs libraries.
I have the library installed with pip, and I can run Python code that uses the library by doing a "from <library> import *". But now that I want to make changes to it, I have pulled the code with git and plan to branch off to work on my changes. That's fine. I will then open a pull request to merge my changes, provided the tests pass.
But after I make a change, how do I integrate my changes into Python to test out the changes I made to the library? Can pip install my custom/modified version of the library?
I have looked around and haven't successfully found an answer to this but perhaps I'm not looking in the right spot.
Can pip install my custom/modified version of the library?
Yes.
There are various ways of approaching this question. A common solution is the use of Python virtual environments. This allows you to create an isolated Python environment that does not share the same packages as your system Python install. You can then install things into it (such as your modified Python library) to test it out.
To get started, you need the virtualenv tool. This is probably available as a package for your distribution, but you can also install it using pip. Once you have it, you can run the following in the same directory as your code:
virtualenv .venv
This creates a virtualenv named .venv. You can call it anything you want, but naming it .venv (or anything starting with a .) means it won't clutter up the output of ls in your workspace.
Next, you need to activate the virtualenv:
. .venv/bin/activate
This modifies your $PATH to place the virtualenv at the front of the list of directories. Now when you type python or pip, you'll be using the virtualenv version.
If your code has a setup.py file, you can install it like this:
pip install -e .
The -e means you want to perform an "editable" install, which means python will use the code "in place", and any changes you make will be immediately visible to the code you use for testing.
When you're done, you can run:
deactivate
This will remove the changes that activate made to your environment.
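Putting those steps together, a rough sketch of the whole flow might look like this (the repository URL, branch name, and test command are placeholders, not taken from the question):
git clone https://github.com/example/somelib.git
cd somelib
git checkout -b my-fix
virtualenv .venv
. .venv/bin/activate
pip install -e .
python -m pytest      # or whatever test runner the project uses
deactivate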
For more information:
Pipenv & Virtual Environments discusses a higher level tool for managing virtual environments.
Virtualenvwrapper is another take on a higher level management tool.
I am working on several projects in the same PyCharm window; I have "attached" them all together. But I recently noticed some weird behavior. For example, when I import a library I haven't installed yet into my script, it shows me a little error as expected. But when I try to install it using python -m pip install my_library, it tells me that it is already installed. I recently noticed that this is because it's using another pip, from another project; it doesn't use the one in the venv folder in the project. Also, to run the scripts it sometimes uses python.exe from Python's original install directory. It's a whole mess and I have no idea how I can solve it. Sometimes my projects require different versions of the same library, and you can imagine what happens when I change the version.
I make sure each project is using its own interpreter. I don't know what else to do other than this. I am using Python 3.6.4 and PyCharm 2018.3.2 running on Windows 10.
It sounds like all your projects are configured to use the system interpreter instead of the virtual environment you set up for each of them.
Follow these instructions to fix it: https://www.jetbrains.com/help/pycharm-edu/creating-virtual-environment.html
In terms of using different versions of the same Python library, you can address that by specifying them in a requirements.txt file, which you can keep with the venv for each project. Then you can just do pip install -r requirements.txt after you set up your venv. (You need to ensure that the venv is activated; you don't need to worry about this if you have configured the project in PyCharm to use the venv's Python interpreter.) You can check this in the Terminal inside PyCharm, where you should see a prompt like (venv_name) username@host:~/project_folder$
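Since the question is on Windows 10, that check-and-install sequence in the PyCharm Terminal might look roughly like this (paths and names are illustrative):
cd C:\Users\you\PycharmProjects\project_one
venv\Scripts\activate                 # the prompt should now start with (venv)
pip install -r requirements.txt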
Let me first outline my desired solution and then elaborate on a specific question about how to achieve this state.
I'm soon starting two coding projects in Python. I've used Python before, but never on such big projects. My ideal scenario would be a setup where I can run virtual environments and different Python versions for various projects. Some research pointed me to virtualenv/virtualenvwrapper and pyenv. It seems that with pyenv-virtualenv or pyenv-virtualenvwrapper there is a nice way to specify the virtual environment and Python version for a specific project.
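For context, a minimal sketch of the pyenv-virtualenv workflow mentioned above (version numbers and names are only illustrative):
pyenv install 3.4.10
pyenv virtualenv 3.4.10 project-a
pyenv local project-a    # writes a .python-version file so this env auto-activates in the project folder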
Question: once I've set up a virtual environment and Python version for a specific project, how easily could I switch to a newer Python version at a later stage? Let's say I've started project A with Python 3.4, and a year in the future I would like to move everything to Python 3.6. Is this possible in a neat way?
Sure:
$ rm -r my-python-3.4-env
$ virtualenv -p python3.6 my-python-3.6-env
$ source my-python-3.6-env/bin/activate
In other words, each virtual environment is just a folder with the necessary files in it. You "activate" an environment with the source .../activate command (in the case of virtualenv) and you leave it just as easily. To switch to a different environment, you simply create a new one with a specific Python executable and activate it.
What you want to be careful about is to keep your installation repeatable, meaning if you depend on external modules (which modern projects typically do), you don't want to install each dependency by hand and instead automate that. For instance, you create a setuptools setup.py file which lists your dependencies, and then have it install them into your new environment automatically:
$ source my-python-3.6-env/bin/activate
(my-python-3.6-env) $ python setup.py develop
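A minimal setup.py along those lines might look like this (the project name, version, and dependency pins are placeholders, not taken from the question):
# setup.py -- minimal sketch; list your real dependencies under install_requires
from setuptools import setup, find_packages

setup(
    name="project_a",
    version="0.1.0",
    packages=find_packages(),
    install_requires=[
        "requests>=2.20",   # illustrative third-party dependencies
        "numpy",
    ],
)
Running python setup.py develop (or pip install -e .) in the activated environment then installs these dependencies and links your own code in editable mode.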