My Python interpreter settings were deleted by this git clean -d -f, so do I now have to reinstall all package requirements? Is there any way I can undo this?
I was trying to remove __pycache__ files, but by mistake typed the git clean -d -f command.
If I had to install all the packages again I would be in big trouble, because there is a high chance of mismatched module versions.
If the file was private, meaning not added to the index through git add, and not committed, then double-check your editor/IDE: it might still have a local history for that file.
If not, then you need to use a file recovery utility, as detailed in "Can I restore deleted files (undo a git clean -fdx)?".
It is best to have an alias for git clean that deletes the (.gitignore'd) __pycache__ directories while keeping the (.gitignore'd) .iml/.idea project-setting files you want to survive a clean (even an inadvertent one).
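A minimal sketch of such an alias; the name gclean and the exact exclude patterns are my own choices, not from the answer (git clean supports -e to exclude paths from removal):

```shell
# Hypothetical alias: remove untracked and ignored files (-x), including
# __pycache__, but protect IDE settings with -e exclude patterns.
alias gclean="git clean -d -f -x -e .idea -e '*.iml'"
```

Run it from the repository root; anything matching an -e pattern survives the clean.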
I want to use the "newenv" approach, but now I need to make changes to psynet (for example, adding a log line or rpdb.set_trace()). Where is this local copy? And can I simply change it, or do I need to reinstall the environment with reenv?
I believe this is the right workflow:
In a new project, start by creating a new environment, with the command newenv, which currently stands for
alias newenv="python3 -m venv env && source env/bin/activate && pip install -r constraints.txt"
This will install all the dependencies in your constraints file, which should include the PsyNet version too.
If you make any changes in the code (e.g., experiment.py) that do not involve any change in your dependencies/packages, then you do not need to do anything. But if your changes involve the packages you are working with in your environment (a change in PsyNet or one of your working packages, like REPP or melody-experiments), then you will need to first push the changes to GitHub and then reinstall your working environment using the command reenv, which stands for:
alias reenv="rm -rf env && newenv && source env/bin/activate && pip install -r constraints.txt"
There are other situations in which one only wants to make a change in requirements.txt, such as adding a new package or changing PsyNet's version. In this case, after changing your requirements, you should run dallinger generate-constraints, which will update your constraints.txt accordingly. Then, run reenv.
If you want to achieve this, you should edit the source code files within your env directory, which will be located in your project directory. PsyNet will be located somewhere like this:
env/lib/python3.9/site-packages/psynet
Note that any changes will be reset the next time you run newenv.
The simplest way to do this is certainly to edit the file in the venv, but as has been pointed out, it's easy to lose track of that change and overwrite it. I do that anyway if it's a short-term test.
However, my preferred approach when I want to modify a library like this is to clone its source code from Git, make the modifications in my cloned sandbox, and run pip install -e . inside the module directory after activating the venv. That makes the venv replace the previously installed version of the library with my sandbox version.
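The workflow above can be sketched as follows; the repository URL, directory name, and venv path are placeholders, not taken from the answer:

```shell
# Clone the library's source into a sandbox (URL is a placeholder)
git clone https://github.com/example/somelib.git
cd somelib

# Activate the project's venv, then install the sandbox in editable mode
source /path/to/project/env/bin/activate
pip install -e .
# The venv now imports the library from this checkout, so local edits
# take effect without reinstalling.
```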
How is it possible to reduce the size of a Python virtual environment?
This might be:
Removing packages from site-packages, but which ones can be removed?
Removing the *.pyc files
Checking for used files as mentioned here: https://medium.com/@mojodna/slimming-down-lambda-deployment-zips-b3f6083a1dff
...
What else can be removed or stripped down? Or are there other ways?
The use case is, for example, uploading the virtualenv to a server with limited space (e.g., an AWS Lambda function with a 512 MB limit).
If there is a .pyc file, you can remove the corresponding .py file; just be aware that you will lose stack-trace information from those files, which will most likely mess up any error/exception logging you have.
Apart from that there is no universal way of reducing the size of a virtualenv - it will be highly dependent on the packages you have installed, and you will most likely have to resort to trial and error or reading source code to figure out exactly what you can remove.
The best you can do is look for the packages that take up the most disk space and investigate those further. On a *nix system with the standard coreutils available, you can run the following command:
du -ha /path/to/virtualenv | sort -h | tail -20
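For the *.pyc idea from the question, a sketch of the cleanup (assumes the environment lives in env/; bytecode is simply regenerated on the next import, at a small first-run cost):

```shell
# Delete compiled bytecode caches from the environment
find env -name '__pycache__' -type d -prune -exec rm -rf {} +
find env -name '*.pyc' -delete
```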
After you have installed all packages, you could try to remove from the virtualenv everything related to installing packages. Run these from the environment's site-packages directory:
cd env/lib/python3.9/site-packages   # adjust the Python version to yours
rm -r pip*
rm -r pkg_resources*
rm -r setuptools*
Depending on which packages you have installed, the result might still work as desired, since most packages won't have runtime dependencies on these three. Use at your own risk.
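A somewhat safer variant of the same idea is to let pip remove the tooling itself, which also cleans up the matching console scripts and metadata (same assumption: nothing in your code imports pip, setuptools, or pkg_resources at runtime):

```shell
# Run from inside the activated virtualenv; -y skips the confirmation prompt
pip uninstall -y setuptools wheel pip
```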
When you create your virtualenv, you can tell it to use your system site-packages. If you install all required packages globally on the system, then when you create your virtualenv it will essentially be empty.
$ pip install package1 package2 ...
$ virtualenv --system-site-packages venv
$ source venv/bin/activate
(venv) $ # now you can use package1, package2, ...
With this method you can still override a package: if, inside your virtualenv, you install a package, that copy will be used instead of whatever is on the system.
I am really new to Python and the virtualenv needed to set up a project. I don't know whether the directories generated by virtualenv should be gitignored or staged and committed.
I narrowed it down to the myproject/env/bin directory, which doesn't seem to want to be staged. After running git add env/bin once, I get:
[1] 16599 killed git add env/bin
And then after running the same git add env/bin again, I get:
Another git process seems to be running in this repository, e.g.
an editor opened by 'git commit'....
There are no other git processes running. Thanks for your help!
After looking through a few other Python/Django repositories on GitHub, I see that most have added the env/bin, env/include and env/lib directories (generated by virtualenv) to their .gitignore file. I'll take this at face value and move on until I have a better understanding of virtualenv. Thanks!
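Based on that, a minimal .gitignore for this layout might look like the following (the env/ directory name is taken from the question; the bytecode patterns are a common addition):

```
# Keep the virtualenv and compiled bytecode out of version control
env/
__pycache__/
*.pyc
```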
Is there any way to do something like git clean -d -x -f using GitPython?
I need to reset working directories and want to get rid of all unversioned files without deleting the whole folder (except for .git) and checking out again.
You can work around it by using the git command directly from GitPython:
from git import Repo

repo = Repo('/path/to/repo')  # path to your working tree
git = repo.git
git.clean('-xdf')
I have just started learning Django, and I'm facing some problems regarding copying projects.
I have two computers and I would like to use both for my development. When I was learning PHP (at that time I didn't even know how to use GitHub), all I had to do was set up a webserver on both computers, upload all the files to Google Drive from one computer, and then download them on the other.
However, it seems to me that Django is somewhat different, since it is a framework and requires a lot of setting up before starting a project (including a virtual environment; I am following a YouTube tutorial, and it says that I would be better off using virtualenv). I thought it wouldn't work just by downloading the whole project folder to the other computer.
Currently, I have uploaded the whole virtual environment folder to GitHub.
So, to list my questions,
When downloading it on the other computer, should I set up the virtual environment on that computer and then download the folder?...
Is there any way that I can sync or commit only the files that have been changed in the project, automatically?
(That is, I have to change many files in Django projects (views, urls, settings, etc.), but it would be hard to remember all the files that I have changed and then separately commit those.)
Any help would be appreciated.
Edit: Consider using pipenv
I suggest that you also install virtualenvwrapper (here). virtualenvwrapper keeps all files except your project at another location so your project directory contains only your files and you can safely use git add --all.
After it's installed, do:
$ mkdir my-project; cd my-project
$ mkvirtualenv my-env-name
$ pip install django <more-good-stuff>
$ pip freeze > requirements.txt
$ git init; git add --all; git commit -m "Initial Commit"
... push to github ...
Now go to other machine, and install virtualenv and virtualenvwrapper
$ git clone <url> my-project; cd my-project
$ mkvirtualenv my-env-name
$ pip install -r requirements.txt
... continue your work, commit and push push and win at life :D
You usually don't want to commit everything blindly. It's better to use git status to see all the files that you changed and then git add those that you want to commit. Once you've verified that you really want to commit all files, you can simply git add --all (more here). And then you git commit.
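That selective workflow, sketched with example file names from a typical Django project (the names and commit message are illustrative):

```shell
git status                          # review which files changed
git add views.py urls.py            # stage only the files you mean to commit
git commit -m "Describe the change"
```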
And, yes, virtualenv is the way to go. You want to create a virtualenv for your project and have all your dependencies in a requirements.txt file. This will allow you to have only your source code and no libraries in your git repo, making it much cleaner. It also allows you to have a set of verified libraries in production; if you want to try out a new library, you can just install it in your local virtualenv, or even have two virtualenvs and switch, or whatever, and it does not mess up your code repo or other machines' installations.