Migrating dependencies with Django relocation - Python

I've been managing a Django site on my MacBook while syncing changes (via GitHub) to my WebFaction production and staging servers.
I've purchased a new MacBook and would like to start rebuilding my environment. My issue is that the files versioned on GitHub lived deeper in the project--at the same level as manage.py--and included settings.py, template files, MVC files, etc. However, they did not include the files and data created by installing Django apps and dependencies. Those I was installing manually on the prod/staging servers immediately after installing them in my MacBook environment.
What I'm having a hard time understanding is where these dependencies are located. I'm a victim of PHP development and am used to all my files being right there in the public folder.
Now that I have my GitHub repo pulled down, I assume there's a way to copy all this stuff over? I don't think I can remember all of the many dependencies I installed from the very beginning.

The typical way of managing dependencies for a specific project is to use pip and virtualenv, and to record all the dependencies installed in that project's virtualenv by running
pip freeze > requirements.txt
in your project's (root) directory and then committing the requirements.txt file into your project git repository.
You can later reinstall all these dependencies by simply issuing:
pip install -r requirements.txt
Failing that, you will have to figure out manually which dependencies are missing when you try to run your Python project, and pip install each one until your project works.
If you still have your old MacBook (you probably do), you can create your requirements.txt file right now by running pip freeze > requirements.txt. But if you did not use virtualenv, you will essentially be freezing every dependency installed system-wide on your old MacBook into your requirements.txt file.
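For example, the whole migration might then reduce to something like this (the repo URL and directory names are placeholders, and this assumes the old environment was a virtualenv rather than system-wide):
# On the old MacBook, inside the project's virtualenv:
pip freeze > requirements.txt
git add requirements.txt
git commit -m "Pin project dependencies"
git push
# On the new MacBook:
git clone https://github.com/you/yourproject.git
cd yourproject
python3 -m venv venv
source venv/bin/activate
pip install -r requirements.txt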

Related

Forcing Flask to only use libraries contained in requirements.txt?

I have a Flask app currently running on Google App Engine. Locally, the app is running inside a virtual environment and references the appropriate libraries installed in the venv/Lib/site-packages directory.
When the app is updated in GAE, requirements.txt is used to determine what libraries/dependencies need to be installed. I frequently get tedious errors like "Module not found" and have to remember to add said module in my requirements.txt, and then have to redeploy and check the error logs, which takes time.
I have a bunch of dependencies installed in my virtual environment, only some of which need to be referenced in my requirements.txt file, since I only use a few in my Flask app. So I am trying to figure out a way to test my app locally as if it were running on GAE, by forcing Flask to reference only the dependencies in my requirements.txt file. That way, if there is a "Module not found" error, I can catch it quickly on my own machine rather than repeating gcloud app deploy and scouring through the logs all over again.
Hopefully that wasn't too convoluted, lol.
To be clear, not everything installed in your virtual env needs to be declared in your requirements.txt file. Some libraries are installed because they are dependencies of another. For example, just listing Flask will lead to Jinja2 also being installed.
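To illustrate (the version pin is only an example), a requirements.txt containing nothing but:
Flask==2.0.3
will, after pip install -r requirements.txt, leave pip freeze also showing Werkzeug, Jinja2, click, and itsdangerous, because Flask pulls those in as its own dependencies.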
To your specific issue, you're basically saying you did not narrow down the actual libraries you need for your project. This is usually due to copying over installed libraries from another project.
You can use pip3 freeze > requirements.txt or pip2 freeze > requirements.txt to auto-generate your requirements.txt file. The problem with this method is that it will include everything installed in your virtual env and it seems you don't want this.
Some suggest using pipreqs (see this Stack Overflow answer).
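If you want to try that route, usage is roughly as follows (the project path is a placeholder):
pip install pipreqs
pipreqs /path/to/your/project
pipreqs scans your source files for import statements and writes a requirements.txt containing only the packages your code actually uses.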
I normally do it the manual way: remove the existing venv, create a requirements.txt with just the basics (Python, Flask/Django), run your program, manually add each library it complains about to the requirements.txt file, and reinstall the contents of your requirements.txt file. Rinse and repeat until you no longer get errors. Now you have your full requirements, as sketched below.
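A rough sketch of that loop (package names, versions, and the app.py entry point are only placeholders):
# requirements.txt starts with just the framework:
Flask==2.0.3
# Then, in a fresh venv:
pip install -r requirements.txt
python app.py                       # fails: ModuleNotFoundError: No module named 'requests'
echo "requests==2.27.1" >> requirements.txt
pip install -r requirements.txt     # rinse and repeat until the app starts cleanly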
1. Install venv (for Python 3): sudo apt install python3-venv
2. Create one virtual environment for all your Flask servers. Choose a directory and run this command:
python3 -m venv venv
3. Run this command to activate the venv:
source venv/bin/activate
4. Change to your Flask server's directory and run this command:
pip3 freeze > requirements.txt
5. Finally, run this command:
pip3 install -r requirements.txt
6. You can use this venv for all your Flask servers, and you can update it.

Upgrading a project from buildout to pip?

I am trying to upgrade a Pyramid project to Python 3. I am also exploring various ways to improve the build system so that it is more modern.
Currently we are using Buildout to set up an instance. The approach is simple:
All the required eggs (including app - which is my package) are stored in a buildout configuration file with their exact versions specified.
Run buildout to get the exact eggs (including third party stuff) from a local package server.
Run the instance using ./bin/pserve config.ini.
For my new app source code that has Python 3 changes, I am trying to get rid of everything and just use pip instead. This is what I have done now (in a docker container):
git clone git@github.com:org/app.git
cd app
# Our internal components are fetched using the `git` directive inside `requirements.txt`.
pip install -r requirements.txt # Mostly from PyPi.
pip install .
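For reference, a git directive of the kind mentioned above might look like this inside requirements.txt (the URL, tag, and package name are illustrative):
git+https://github.com/org/internal-lib.git@v1.2.0#egg=internal-lib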
It works, but is this the correct way to package an application for deployment?
Will I be able to convert the entire installation to a simple pip install app, and run it using pserve config.ini, if I do the following:
Upload the latest app egg to my package server.
Sync setup.py and requirements.txt so that pip install app pulls in the same dependencies (or their equivalent) internally?
pip install app.
Copy config.ini to the machine where I am going to install.
Run pserve config.ini
I wanted to know if the above approach can be made to work before proceeding with the egg creation, mocking a simple package server, etc. I am not sure if I can really do pip install for a web application, and I think requirements.txt has some significance in this case.
I haven't explored wheels yet, but if the above works, I will try that as well.
Since I am really new to packaging, I would appreciate some suggestions on modernizing my build using the latest tools.
After reading some of the links, like requirements.txt vs setup.py, I think requirements.txt is needed for web apps, especially if you want consistent behaviour for deployment purposes. A project or an application seems to be different from a library, where pip install suffices.
If that is the case, I think the ideal way is to do pip install -r requirements.txt and then pip install app from a local package server, without git cloning? A sketch of how the two files might relate is below.
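To make that concrete, here is a minimal sketch of how the two files might be kept in sync; the package name app, the version numbers, and the dependency list are assumptions for illustration, not the project's real metadata.
setup.py (abstract dependencies, loosely pinned):
from setuptools import setup, find_packages

setup(
    name="app",
    version="1.0.0",
    packages=find_packages(),
    install_requires=[
        "pyramid>=1.10",  # what the package needs to run
        "waitress",
    ],
)
requirements.txt (exact pins for a reproducible deployment):
pyramid==1.10.8
waitress==2.0.0
app==1.0.0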
Resources: install_requires vs requirements files

Do we need to upload virtual env on github too?

This is my GitHub repo
https://github.com/imsaiful/backmyitem
I push from my local machine and pull the changes in Amazon EC2.
Earlier I had not added the virtualenv to my repo, but now I have changed some files in the admin directory, which is contained in the virtualenv. So should I add the virtualenv to my GitHub repo too, or should I make the same change manually on my remote server?
As was mentioned in a comment, it is standard to do this through a requirements.txt file instead of including the virtualenv itself.
You can easily generate this file with the following:
pip freeze > requirements.txt
You can then install the virtualenv packages on the target machine with:
pip install -r requirements.txt
It is important to note that including the virtualenv will often not work at all as it may contain full paths for your local system. It is much better to use a requirements.txt file.
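As a concrete illustration (the path is hypothetical), the scripts inside a virtualenv hard-code the absolute interpreter path of the machine they were created on:
head -1 venv/bin/pip
#!/Users/olduser/myproject/venv/bin/python
That shebang points at a directory that won't exist on a collaborator's machine or on your EC2 instance.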
No - although the environment is 100% there, if someone else were to pull it down, the environment's paths would not match their system, not to mention that Python version discrepancies will likely crop up.
The best thing to do is to create what is known as a requirements.txt file.
When you have created your environment, you can pip install this and pip install that. You'll start to build up a number of project-specific dependencies.
Once you start to build up a number of project dependencies, I would freeze your local Python environment (analogous to a package.json for Node.js package dependency management). I would recommend doing the following in your terminal:
(local_python_environment) $ pip install django && pip freeze > requirements.txt
(local_python_environment) $ pip install requests && pip freeze > requirements.txt
That is to say, freeze your environment to a requirements.txt file every time a new dependency is installed.
Once a collaborator pulls down your project - they can then install a fresh python environment:
$ python3 -m venv local_python_environment
(* Please use Python 3 and not Python 2!)
And then activate that environment and install from your requirements.txt which you have included in your version control:
$ source local_python_environment/bin/activate
(local_python_environment) $ pip install -r requirements.txt
Excluding your virtual environment is probably analogous to ignoring node_modules! :)
No, it's not necessary to upload the virtualenv to GitHub. When you push your code, Git will ignore the virtualenv's files only if you have added them to your ignore file.
Virtual Environment
Basically, a virtual environment is a tool that helps keep the dependencies required by different projects separate by creating isolated Python environments for them. It is one of the most important tools that Python developers use. Apart from that, you can add a requirements.txt file to your project.
requirements.txt
It is a file that lists which libraries the application needs in order to run. You can create a requirements.txt file with this simple command:
pip freeze > requirements.txt
After running this command, every installed library is recorded in the file. And if you build your project without activating any virtualenv, Python will use the system-wide environment, so the file will also include packages that are not necessary for your project.
You should add the virtualenv to your .gitignore. In fact, GitHub has a recommended .gitignore format for Python that covers which files should be added and which shouldn't:
Github recommendation for gitignore
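For instance, a minimal Python .gitignore in that spirit might contain (the venv/ entry should match whatever you named your environment):
venv/
__pycache__/
*.pyc
.env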

Python - how to let users run script with libraries

I plan on sharing a Python program on GitHub.
However, it uses additional libraries like http, Selenium, BeautifulSoup, and the Google Calendar API.
How do I include these libraries in the directory I push to GitHub, so that all users have to do is run python script.py, instead of having to install the libraries?
I thought of generating an executable with PyInstaller, but that didn't work :/
You use a pip requirements.txt file for this.
If you have done your work inside of a virtual environment then run on your command line/terminal:
pip freeze > requirements.txt
Then commit and push the file to your github repository.
If you have not done your script within a virtual environment then run:
pip freeze > requirements.txt
And edit the file so you only have the modules you require.
I'd recommend always using a virtual environment for this, since it makes your application easy to share. In the Django world, employing virtualenvs is very common.
Your collaborators can install your dependencies using:
pip install -r requirements.txt
after cloning your GitHub repo.
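For the libraries named in the question, the file might look something like this; the package names and version pins are assumptions (the Google Calendar API is typically accessed through google-api-python-client):
selenium==4.1.0
beautifulsoup4==4.10.0
google-api-python-client==2.37.0
google-auth-oauthlib==0.4.6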
Usually you don't need to embed your dependencies in your project (not practical, especially when there are many). Instead you may include a requirements.txt inside your project to list the modules (and version numbers) that are required by your application. Then when a user needs to use your script, they can run something like this:
pip install -r requirements.txt
Read more about requirements files here:
https://pip.readthedocs.org/en/1.1/requirements.html#requirements-files

Installing selected packages from requirements.txt

I have a requirements.txt file on my development machine. I have pushed it into a git repo and cloned it on a server.
The way I push changes to the server is as follows:
I freeze the file on my development machine, then I add the file to git, pull it on the server, and do pip install -r requirements.txt.
But doing this installs all the packages again and again, and I don't want that. I only want the packages that are not already installed on the server to be installed.
What's the best way of doing this? I would also like to know other efficient methods of pushing development code to a server.
Use Buildout; this is another method. Buildout checks for packages before installing, so it will not reinstall unneeded packages.
It is a very powerful tool. When you deploy, you just need to do a git push; then on the production server you do:
git pull
bin/buildout
That's it. You can read an article about Buildout and pip+virtualenv differences
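For orientation, a minimal buildout.cfg for this kind of setup might look roughly like this; the part name and egg list are illustrative assumptions:
[buildout]
parts = app

[app]
recipe = zc.recipe.egg
eggs =
    myapp
    pyramid
Running bin/buildout then resolves and installs exactly those eggs, skipping any that are already present.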
EDIT:
You can set a PIP_DOWNLOAD_CACHE path in settings.py to tell pip to store all downloaded packages in some directory ('packages', for example), so it won't download them again:
import os.path

# Root of the project: the directory containing this settings file.
PROJECT_ROOT = os.path.normpath(os.path.dirname(__file__))
# Directory where pip should cache downloaded packages.
PIP_DOWNLOAD_CACHE = os.path.abspath(PROJECT_ROOT + '/packages/')
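Note that pip (in the older releases this answer targets) reads PIP_DOWNLOAD_CACHE from the shell environment, so an equivalent approach is to export it before installing; newer pip releases cache downloads automatically (see pip's --cache-dir option). The path here is a placeholder:
export PIP_DOWNLOAD_CACHE=~/myproject/packages
pip install -r requirements.txt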
