I have a Flask app currently running on Google App Engine. Locally, the app is running inside a virtual environment and references the appropriate libraries installed in the venv/Lib/site-packages directory.
When the app is updated on GAE, requirements.txt is used to determine which libraries/dependencies need to be installed. I frequently get tedious "Module not found" errors, have to remember to add said module to my requirements.txt, and then have to redeploy and check the error logs, which takes time.
I have a bunch of dependencies installed in my virtual environment, but only some of them need to be referenced in my requirements.txt file, since I only use a few in my Flask app. So I am trying to figure out a way to test my app locally as if it were running on GAE, by forcing Flask to reference only the dependencies listed in my requirements.txt file. That way, if there is a "Module not found" error, I can catch it quickly on my own machine instead of repeating gcloud app deploy and scouring through the logs all over again.
Hopefully that wasn't too convoluted, lol.
To be clear, not everything installed in your virtual env needs to be declared in your requirements.txt file. Some libraries are installed because they are dependencies of others. For example, just listing Flask will lead to Jinja also being installed.
To your specific issue, you're basically saying you did not narrow down the actual libraries you need for your project. This is usually due to copying over installed libraries from another project.
You can use pip3 freeze > requirements.txt or pip2 freeze > requirements.txt to auto-generate your requirements.txt file. The problem with this method is that it will include everything installed in your virtual env and it seems you don't want this.
Some suggest using pipreqs (see this Stack Overflow answer).
I normally do it the manual way: remove the existing venv, create a requirements.txt with just the basics (python, flask/django), run your program, manually add each library it complains about to requirements.txt, and reinstall from that file. Rinse and repeat until you no longer get errors. Now you have your full requirements.
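As a sketch, that loop might look like this in a POSIX shell (the entry point main.py and the starting package list are placeholders for your own project):

```shell
# Work in a throwaway directory so nothing outside it is touched.
cd "$(mktemp -d)"
python3 -m venv venv             # fresh environment, nothing preinstalled
. venv/bin/activate
echo "Flask" > requirements.txt  # start with just the basics
pip install -r requirements.txt  # then run: python main.py, add each missing
                                 # package to requirements.txt, and repeat
```

Because the venv starts empty, any import your app makes that isn't pulled in by requirements.txt fails locally with the same ModuleNotFoundError you would otherwise only see after deploying.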
1. Install venv (for Python 3): sudo apt install python3-venv
2. Create a virtual environment to share across all your Flask servers. Choose a directory and run:
python3 -m venv venv
3. Run this command to activate the venv:
source venv/bin/activate
4. In your Flask server's directory, run:
pip3 freeze > requirements.txt
5. Finally, run:
pip3 install -r requirements.txt
6. You can use this venv for all your Flask servers, and update it as needed.
Related
I am trying to upgrade a Pyramid project to Python 3. I am also exploring various options for making the build system more modern.
Currently we are using Python's buildout to setup an instance. The approach is simple:
All the required eggs (including app - which is my package) are stored in a buildout configuration file with their exact versions specified.
Run buildout to get the exact eggs (including third party stuff) from a local package server.
Run the instance using ./bin/pserve config.ini.
For my new app source code that has Python 3 changes, I am trying to get rid of everything and just use pip instead. This is what I have done now (in a docker container):
git clone git@github.com:org/app.git
cd project
# Our internal components are fetched using the `git` directive inside `requirements.txt`.
pip install -r requirements.txt # Mostly from PyPi.
pip install .
It works, but is this the correct way to package and deploy an application?
Will I be able to convert the entire installation to a simple: pip install app and run it using pserve config.ini if I do the following:
Upload the latest app egg to my package server.
Sync setup.py and requirements.txt so that pip install effectively does pip install -r requirements.txt (or its equivalent) internally?
pip install app.
Copy config.ini to the machine where I am going to install.
Run pserve config.ini
I wanted to know if the above approach can be made to work before proceeding with the egg creation, mocking a simple package server etc. I am not sure if I can really do pip install for a web application; and I think requirements.txt has some significance in this case.
I haven't explored wheels yet, but if the above works, I will try that as well.
Since I am really new to packaging, I would appreciate some suggestions for modernizing my build using the latest tools.
After reading some of the links like requirements.txt vs setup.py, I think requirements.txt is needed for web apps, especially if you want consistent behaviour for deployment purposes. A project or an application seems to be different from a library, where pip install suffices.
If that is the case, I think the ideal way is to do pip install -r requirements.txt and then pip install app from a local package server without git cloning?
Resources: install_requires vs requirements files
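Along the lines of that install_requires vs requirements distinction, a rough sketch of the split would be: keep install_requires in setup.py loose and abstract, and let requirements.txt carry the concrete pins (the package names and versions below are hypothetical):

```
# requirements.txt -- concrete, pinned versions for repeatable deployments
# (names and versions are hypothetical examples)
pyramid==2.0.2
waitress==2.1.2
```

while setup.py's install_requires stays loose (e.g. pyramid>=2.0), so that pip install app from the local package server can still resolve dependencies on its own.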
This is my GitHub repo
https://github.com/imsaiful/backmyitem
I push from my local machine and pull the changes in Amazon EC2.
Earlier I had not added the virtual env to my repo, but now I have changed some files in the admin directory, which is contained in the virtual env. So should I add the virtual env to my GitHub repo too, or should I make the same changes manually on my remote server?
As was mentioned in a comment it is standard to do this through a requirements.txt file instead of including the virtualenv itself.
You can easily generate this file with the following:
pip freeze > requirements.txt
You can then install the virtualenv packages on the target machine with:
pip install -r requirements.txt
It is important to note that including the virtualenv will often not work at all as it may contain full paths for your local system. It is much better to use a requirements.txt file.
No. Although the environment is 100% there, if someone else were to pull it down, the path environment hasn't been exported, not to mention Python version discrepancies will likely crop up.
The best thing to do is to create what is known as a requirements.txt file.
When you have created your environment, you can pip install this and pip install that. You'll start to build up a number of project-specific dependencies. Once you do, I would freeze your local Python environment (analogous to a package.json for Node.js package dependency management). I would recommend doing the following in your terminal:
(local_python_environment) $ pip install django && pip freeze > requirements.txt
(local_python_environment) $ pip install requests && pip freeze > requirements.txt
That is to say, freeze your environment to a requirements.txt file every time a new dependency is installed.
Once a collaborator pulls down your project - they can then install a fresh python environment:
$ python3 -m venv local_python_environment
(* Please use Python 3 and not Python 2!)
And then activate that environment and install from your requirements.txt which you have included in your version control:
$ source local_python_environment/bin/activate
(local_python_environment) $ pip install -r requirements.txt
Excluding your virtual environment is probably analogous to ignoring node_modules! :)
No, it's not necessary to upload the virtualenv to GitHub; in fact, it is commonly excluded via your .gitignore file.
Virtual Environment
Basically, a virtual environment is a tool that helps keep the dependencies required by different projects separate by creating isolated Python environments for them. It is one of the most important tools that Python developers use. Apart from that, you can add a requirements.txt file to your project.
Requirements.txt
It is a file that lists the libraries and packages needed to run the application. You can create a requirements.txt file with this simple command:
pip freeze > requirements.txt
After running this command, every package and library in the active environment is added to the file. If you run it without activating a virtualenv, Python will use the system environment instead, and the file will also include packages that are not necessary for your project.
You should add the virtualenv to your .gitignore. In fact, GitHub has a recommended .gitignore format for Python that specifies which files should be added and which shouldn't.
Github recommendation for gitignore
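As a minimal sketch along those lines (assuming your environment directory is named venv):

```
venv/
__pycache__/
*.pyc
```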
I plan on sharing a Python program on GitHub.
However it uses additional libraries like http, Selenium, BeautifulSoup and the Google Calendar API.
How do I include these libraries in the directory I push to GitHub, so that all users have to do is run python script.py, instead of having to install the libraries?
I thought of generating an executable with pyinstaller but that didn't work :/
You use a pip requirements.txt file for this.
If you have done your work inside of a virtual environment then run on your command line/terminal:
pip freeze > requirements.txt
Then commit and push the file to your github repository.
If you have not done your script within a virtual environment then run:
pip freeze > requirements.txt
And edit the file so you only have the modules you require.
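If editing the file by hand gets tedious, a small helper like this sketch can filter the frozen output down to a whitelist (the KEEP set and the frozen text below are hypothetical examples):

```python
# Keep only the chosen distributions from `pip freeze` output.
KEEP = {"selenium", "beautifulsoup4"}  # hypothetical whitelist

def filter_requirements(frozen_text: str, keep: set) -> str:
    kept = []
    for line in frozen_text.splitlines():
        # `pip freeze` lines look like "name==version".
        name = line.split("==")[0].strip().lower()
        if name in keep:
            kept.append(line)
    return "\n".join(kept)

frozen = "beautifulsoup4==4.12.3\nselenium==4.18.1\nurllib3==2.2.1"
print(filter_requirements(frozen, KEEP))
# -> beautifulsoup4==4.12.3
#    selenium==4.18.1
```

Note that this drops transitive dependencies (like urllib3 above) on purpose: pip will resolve those again from the kept packages' own metadata when someone installs the file.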
I'd recommend always using a virtual environment for this, since it makes your application easy to share. In the Django framework, employing virtualenvs is very common.
Your collaborators can install your dependencies using:
pip install -r requirements.txt
After cloning your github repo.
Usually you don't need to embed your dependencies in your project (it's not practical, especially when there are many). Instead you may include a requirements.txt in your project to list the modules (and version numbers) that are required by your application. Then when users need to use your script, they can run something like this:
pip install -r requirements.txt
read more about requirements files here:
https://pip.readthedocs.org/en/1.1/requirements.html#requirements-files
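For reference, a requirements file is just a plain-text list of the distributions your script imports, usually pinned (the versions below are hypothetical):

```
selenium==4.18.1
beautifulsoup4==4.12.3
google-api-python-client==2.120.0
```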
How can I tell if Flask or Python are installed globally? Every time I attempt to push a Flask Python app locally, I need to copy the flask, jinja2, markupsafe, and werkzeug directories along with the file itsdangerous.py.
I have had a little experience with paths before, so I ran the echo $PATH command and got my path:
/home/me/rampup/webapp/venv/bin:/usr/local/heroku/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games
Should I append my $PATH with the path locations of python and flask? If so how would I identify the paths of those applications?
You probably don't want to manually copy your dependencies around (it's tedious and error-prone). Instead, install pip (to manage your dependencies) and virtualenv[1] (to allow you to work on multiple projects with conflicting dependencies). Then:
1. Create a virtual environment: virtualenv venv
2. Activate said virtual environment: . venv/bin/activate
3. Use pip to install your dependencies: pip install Flask
There is no step #4.
For deployments, simply ask pip to produce a manifest of all the dependencies you have with the command pip freeze (you can redirect it to a requirements.txt file with the following command pip freeze > requirements.txt). Then you can install the same dependencies with pip install -r requirements.txt on the remote machine.
[1]: If you are on Python 3.4+ you already have both - although you'll use pyvenv-3.4 instead of virtualenv.
I'm writing a Python app to deploy on Heroku. Per Heroku's guide, I need to list package requirements in a Pip requirements.txt file. The guide instructs me to install the packages locally, then run pip freeze > requirements.txt to write the frozen requirements file.
However, one of the packages I want to use in deployment on Heroku can't be installed locally. It's incompatible with my operating system.
So how do I write a requirements.txt including this package suitable for Heroku?
The only way I can think of is to write it by hand - but this would be tedious, because the package in question has many dependencies of its own. Besides, this defeats the point of the package manager.
When deploying Ruby apps to Heroku, Bundler makes this easy. In my Gemfile I write
gem "pg", :group => :production
gem "sqlite3", :group => :development
The command bundle install then writes a frozen version list Gemfile.lock (analogous to requirements.txt). It doesn't install the packages listed under the 'production' group, but it still freezes a consistent list of versioned packages.
Example: Gemfile and Gemfile.lock
You can have more than one file, and call them different things, but Heroku does expect a requirements.txt. For instance, for dev, you could maintain a dev_requirements.txt
Locally you can run:
$ pip freeze > dev_requirements.txt
etc, and
$ pip install -r dev_requirements.txt
and Heroku will run:
$ pip install -r requirements.txt
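One way to keep several such files consistent is pip's -r include directive inside a requirements file: each file pulls in a shared base and adds only its own extras. A sketch, where psycopg2 stands in for the package that won't install locally:

```
# base_requirements.txt holds the shared, locally-installable set
# (the output of pip freeze).

# requirements.txt (read by Heroku):
-r base_requirements.txt
psycopg2==2.9.9

# dev_requirements.txt (used locally):
-r base_requirements.txt
```

Only the top-level pin for the production package has to be written by hand; pip resolves psycopg2's own dependencies at install time on Heroku.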
It's not possible. The issue has been reported to pip: https://github.com/pypa/pip/issues/747