I'm developing a Python app that maintains a local repository through a number of CLI commands. The application writes a settings file to a certain directory, and this file may change for newer versions of the application.
What I thought about is deleting this settings file when the user runs pip uninstall or an upgrade. It isn't appropriate to delete it after each command, since I want to keep its configuration between commands, i.e. for as long as the user has a certain version installed.
As I say, my idea is to delete this settings file whenever the user executes a pip uninstall, leaving the directory clean as well. If the user does an upgrade instead, it should be removed too, since the settings file for the newer version would be created afterwards by the app code when the first command is executed.
Is there any way to run some extra code when pip uninstall or pip install --upgrade is performed? Otherwise, what would be the proper way of dealing with this?
I also thought about writing the version into the same settings file and updating it when the app detects that the installed version is newer. But I think there should be an easier way than this.
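Roughly, this is the kind of version check I have in mind (a minimal sketch only; the package name "myapp", the path and the JSON format are just placeholders):

import json
from importlib.metadata import version  # available in Python 3.8+
from pathlib import Path

SETTINGS_FILE = Path.home() / ".myapp" / "settings.json"  # placeholder path

def load_settings():
    installed = version("myapp")  # version of the installed distribution (placeholder name)
    if SETTINGS_FILE.exists():
        settings = json.loads(SETTINGS_FILE.read_text())
        if settings.get("version") == installed:
            return settings  # written by the same version, keep it
    # first run, or the package was upgraded/downgraded: recreate with defaults
    settings = {"version": installed}
    SETTINGS_FILE.parent.mkdir(parents=True, exist_ok=True)
    SETTINGS_FILE.write_text(json.dumps(settings))
    return settings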
Thanks in advance.
Related
I have already installed a package on my Linux machine via conda install, but I got an error when running it in a Jupyter Notebook. I'm able to make a simple modification to the package's Python script, but the modified code isn't recognized when I call the package again. Do I need to update the package so that the changes are recognized, and if so, how do I do that?
The library's code does not get imported directly. Instead, it is compiled the first time you import it, and the result is put into a __pycache__ folder in the corresponding site-packages directory. In that folder, locate the .pyc file corresponding to the module you modified and delete it.
Now, when you import it again, the source code is compiled anew and your changes take effect.
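If you're not sure exactly where that cached file lives, you can ask Python itself (the module name below is just an example):

import importlib.util

spec = importlib.util.find_spec("some_package.some_module")  # example module name
print(spec.origin)                                    # path of the .py source file
print(importlib.util.cache_from_source(spec.origin))  # path of the cached .pyc file

Note that if the package is already imported in a running Jupyter kernel, you will also need to restart the kernel (or use importlib.reload) before the change shows up.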
If the package has a GitHub page, it might be worthwhile to make a pull request and ask that they also update the conda package.
My question is: do I have to install Django in my virtual environment every single time in order to run my Python files? And is this taking up a bunch of space on my machine? My project also uses matplotlib, and every virtual environment I create asks me to install the matplotlib module too. It's getting annoying. Do I have to do this every time?
I'm new to Django. I wanted to run some Python files in Django but they weren't working, so after some research I found out I needed to run my PyCharm project in a virtual environment in order to run these Python files.
My folders look like this: pycharmProjects -> my project
I enter pycharmProjects and set up a virtual environment using "pipenv shell". Then I run "python3 manage.py runserver". It turns out I must install Django in the virtual environment before the files run.
Short answer is no, you don't have to use a virtual environment at all and can install your dependencies globally instead. However, you will soon find that it causes a lot of issues. The main reason to create a virtual environment is to keep control of your dependencies and prevent bugs caused by different projects getting their wires crossed.
Short answer: yes.
If you create a virtualenv, you have to install all the packages that your program needs.
Long answer:
You could install Django system-wide and then create a virtualenv with the option --system-site-packages; then Django would be used from your globally installed Python.
(Or you could install everything just in your global Python, but I personally don't think that is good practice.)
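For example, with the standard-library venv module (the virtualenv command line accepts an equivalent --system-site-packages flag):

import venv

# create an environment that can also see globally installed packages such as Django
venv.create("env", system_site_packages=True, with_pip=True)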
If you work with many different projects, I think you will avoid a lot of trouble if you use one virtualenv per project.
Trouble meaning that one project breaks because a pip install for another project changed the version of a package to one the first project can't handle.
I would recommend creating a requirements.txt file for each project that lists the dependencies; then, after creating and activating the virtualenv, you can install them all with the following command:
pip install -r requirements.txt
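A requirements.txt is just a plain text file with one dependency per line, optionally pinned to a version; the packages and versions below are only examples:

django==2.2
matplotlib>=3.0
requests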
If you have requirements.txt files, then you can recreate virtualenvs rather quickly when going back to an old project, and you can delete virtualenvs whenever you run out of disk space. If you want to be on the safe side, run pip freeze > pipfreeze.txt before deleting the virtualenv, and use pip install -r pipfreeze.txt later if you want to create one with the same modules and the same versions.
You also might want to look at direnv or autoenv if you are working on a Linux-like system.
This will automatically switch to the required virtualenv when changing to a project's working directory.
I am working on several projects in the same PyCharm window; I have "attached" them all together. But I recently noticed some weird behavior. When I import a library I haven't installed yet into my script, it shows me a little error as expected. But when I try to install it using python -m pip install my_library, it tells me that it is already installed. I noticed that this is because it's using another pip, from another project; it doesn't use the one in the venv folder of the project. Also, to run the scripts it sometimes uses python.exe from Python's original directory. It's a whole mess and I have no idea how to solve it. Sometimes my projects require different versions of the same library, and you can imagine what happens when I change the version.
I make sure each project is using its own interpreter; I don't know what else to do other than this. I am using Python 3.6.4 and PyCharm 2018.3.2 running on Windows 10.
It sounds like all your projects are configured to use the system's interpreter instead of the virtual environment you set up for each of them.
Follow these instructions to fix it: https://www.jetbrains.com/help/pycharm-edu/creating-virtual-environment.html
In terms of using different versions of the same library, you can address that by specifying them in a requirements.txt file, which you can put in the venv folder for each project. Then you can just run pip install -r requirements.txt after you set up your venv. (You need to ensure that the venv is activated; you don't need to worry about this if you have configured the project in PyCharm to use the venv's Python interpreter.) You can check this by going to the Terminal in PyCharm, where you should see a prompt like (venv_name) user@host:~/project_folder$
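To double-check which interpreter (and therefore which pip) a script actually runs under, you can print it from the script itself:

import sys

print(sys.executable)  # full path of the Python interpreter running this script
print(sys.prefix)      # points inside the project's venv folder when the venv is in use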
I was struggling with installing dependencies for an external library (the requirements were already fulfilled) when I read that I should check if the install path is in my PYTHONPATH. It wasn't, so I looked up how to add it.
I came across this answer, and typed the code straight into the Terminal (not ~/.bashrc) before I finished reading.
If you're using bash (on a Mac or GNU/Linux distro), add this to your ~/.bashrc
export PYTHONPATH="${PYTHONPATH}:/my/other/path"
The path I entered was /usr/bin/python.
Surprisingly, this fixed all of my dependency problems.
However, since my Django project depends on a virtualenv, this ruined everything. I can no longer work out how, or to what, to restore my PYTHONPATH.
I tried export PYTHONPATH="/home/[username]/.virtualenvs/[env]/bin/python" and also deleting the virtualenv with rmvirtualenv.
My next plan is to delete the project and pull again.
At the top of your Django settings module, you could include:
import sys
sys.path.append('/your/dependency/path')
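You can also inspect what actually ends up on the import path, and whether a stray PYTHONPATH is still being picked up, from inside the virtualenv's interpreter:

import os
import sys

print(os.environ.get("PYTHONPATH"))  # whatever the shell exported, if anything
print(sys.path)                      # the directories Python actually searches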
I installed Django 1.5.4 on Mac OS X 10.6.8 and created a test project.
I am unable to edit any of those files; it says they are read-only. I can sudo and make the modifications.
But what other files might it try to access at run time? If they are read-only, then my application won't work, for example the SQLite database file "storage.db". Can I change the file permissions at the project root folder level and have them applied to all the files inside?
Even if I could do that, why does Django start a project like this? I tried virtualenv as well, but to no avail; the same thing happens.
I figured out the problem. It's the way I installed Django on my system. Previously I used sudo easy_install, but this time I created a virtual env and installed Django with pip, and it worked well. A virtual environment is the best solution for package installation in Python; it won't mess with system packages.