Global installations - Flask & Python

How can I tell if Flask or Python are installed globally? Every time I attempt to push a Flask Python app locally I need to copy the flask, jinja2, markupsafe, and werkzeug directories along with the file itsdangerous.py.
I have had a little experience with paths before, so I ran the echo $PATH command and got my path:
/home/me/rampup/webapp/venv/bin:/usr/local/heroku/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games
Should I append the path locations of Python and Flask to my $PATH? If so, how would I identify the paths of those applications?

You probably don't want to manually copy your dependencies around (it's tedious and error-prone). Instead, install pip (to manage your dependencies) and virtualenv[1] (to let you work on multiple projects with conflicting dependencies). Then:
1. Create a virtual environment: virtualenv venv
2. Activate said virtual environment: . venv/bin/activate
3. Use pip to install your dependencies: pip install Flask
There is no step #4.
For deployments, simply ask pip to produce a manifest of all the dependencies you have with the command pip freeze (you can redirect it to a requirements.txt file with the following command pip freeze > requirements.txt). Then you can install the same dependencies with pip install -r requirements.txt on the remote machine.
[1]: If you are on Python 3.4+ you already have both - although you'll use pyvenv-3.4 instead of virtualenv.
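Putting the steps together, a minimal local-to-remote flow might look like this (Flask is just an example package):
virtualenv venv
. venv/bin/activate
pip install Flask
pip freeze > requirements.txt
# later, on the remote machine:
pip install -r requirements.txt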

Related

How to correct wrong pip path in venv?

I have a Flask app and am using venv for my virtual environment. For some reason, pip has suddenly stopped installing packages into venv/lib/python3.9/site-packages and instead installs them to a completely different location on my system. How do I redirect pip to install packages to the correct path in venv?
Instead of using pip as a standalone command in the terminal, I suggest you use python -m pip. That way, python is the interpreter associated with the project (in the case of a virtual environment) or with the whole system (in the case of, say, a single-app Docker container), and all package/module operations run against the Python interpreter associated with the project only.
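For example, with the project's virtual environment activated (flask is just a sample package):
python -m pip install flask
python -m pip list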
You can use the -t flag provided by the pip install command.
pip install <package_name> -t <full_location_path>
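For instance, to point -t at the venv layout from the question (the exact path will differ on your system and Python version):
pip install requests -t venv/lib/python3.9/site-packages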

Forcing Flask to only use libraries contained in requirements.txt?

I have a Flask app currently running on Google App Engine. Locally, the app is running inside a virtual environment and references the appropriate libraries installed in the venv/Lib/site-packages directory.
When the app is updated in GAE, requirements.txt is used to determine which libraries/dependencies need to be installed. I frequently get tedious errors like "Module not found", have to remember to add said module to my requirements.txt, and then have to redeploy and check the error logs, which takes time.
I have a bunch of dependencies installed in my virtual environment, but only some of them need to be referenced in my requirements.txt file, since I only use a few in my Flask app. So I am trying to figure out a way to test my app locally as if it were running on GAE, by forcing Flask to reference only the dependencies in my requirements.txt file. That way, if there is a "Module not found" error, I won't have to repeat gcloud app deploy and scour through the logs all over again; I can catch it quickly on my own machine.
Hopefully that wasn't too convoluted, lol.
To be clear, not everything installed in your virtual env needs to be declared in your requirements.txt file. Some libraries are installed because they are dependencies of another. For example, just listing Flask will lead to Jinja also being installed.
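For instance, a hand-written requirements.txt can be as small as the line below; pip will pull in Jinja2, Werkzeug, itsdangerous, etc. on its own as dependencies of Flask (the pinned version is only illustrative):
Flask==2.0.3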
To your specific issue, you're basically saying you did not narrow down the actual libraries you need for your project. This is usually due to copying over installed libraries from another project.
You can use pip3 freeze > requirements.txt or pip2 freeze > requirements.txt to auto-generate your requirements.txt file. The problem with this method is that it will include everything installed in your virtual env and it seems you don't want this.
Some suggest using pipreqs (see this Stack Overflow answer).
I normally do it the manual way: remove the existing venv, create a requirements.txt with just the basics (python, flask/django), run your program, manually add each library it complains about to requirements.txt, and reinstall the contents of the requirements.txt file. Rinse and repeat until you no longer get errors. Now you have your full requirements.
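As a sketch of the pipreqs route mentioned above (it generates requirements.txt from the imports it finds in your code, rather than from everything installed in the environment):
pip install pipreqs
pipreqs /path/to/your/project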
1. Install venv (for Python 3): sudo apt install python3-venv
2. Create one virtual environment for all of your Flask servers. Choose a directory and run this command:
python3 -m venv venv
3. Run this command to activate the venv:
source venv/bin/activate
4. Change to the directory of a Flask server and run this command:
pip3 freeze > requirements.txt
5. Finally, run this command:
pip3 install -r requirements.txt
6. You can use this venv for all of your Flask servers, and you can update it.

Do we need to upload virtual env on github too?

This is my GitHub repo
https://github.com/imsaiful/backmyitem
I push from my local machine and pull the changes in Amazon EC2.
Earlier I had not added the virtual env to my repo, but now I have changed some files in the admin directory, which is contained in the virtual env. So should I add the virtual env to my GitHub repo as well, or should I make the same change on my remote server manually?
As was mentioned in a comment, it is standard to do this through a requirements.txt file instead of including the virtualenv itself.
You can easily generate this file with the following:
pip freeze > requirements.txt
You can then install the virtualenv packages on the target machine with:
pip install -r requirements.txt
It is important to note that including the virtualenv will often not work at all as it may contain full paths for your local system. It is much better to use a requirements.txt file.
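To see why, peek at the scripts inside a venv: they hard-code the absolute path of the interpreter that created them, so copying the directory to another machine breaks them (the path shown is just an example):
head -1 venv/bin/pip
# prints something like: #!/home/me/project/venv/bin/python3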
No - although the environment is 100% there, if someone else were to pull it down, the path environment wouldn't carry over, not to mention that Python version discrepancies would likely crop up.
The best thing to do is to create what is known as a requirements.txt file.
When you have created your environment, you can pip install this and pip install that. You'll start to build up a number of project-specific dependencies.
Once you start to build up a number of project dependencies, I would then freeze your local Python environment (analogous to a package.json for node.js package dependency management). I would recommend doing the following in your terminal:
(local_python_environment) $ pip install django && pip freeze > requirements.txt
(local_python_environment) $ pip install requests && pip freeze > requirements.txt
That is to say, freeze your environment to a requirements.txt file every time a new dependency is installed.
Once a collaborator pulls down your project - they can then install a fresh python environment:
$ python3 -m venv local_python_environment
(* Please use Python 3 and not Python 2!)
And then activate that environment and install from your requirements.txt which you have included in your version control:
$ source local_python_environment/bin/activate
(local_python_environment) $ pip install -r requirements.txt
Excluding your virtual environment is probably analogous to ignoring node_modules! :)
No, it's not necessary to upload the virtualenv to GitHub. In fact, if you add it to your .gitignore, Git will skip those files when you push your code.
Virtual Environment
Basically, a virtual environment is a tool that helps keep the dependencies required by different projects separate, by creating an isolated Python environment for each of them. It is one of the most important tools that Python developers use. Apart from that, you can add a requirements.txt file to your project.
requirements.txt
It is a file that lists the libraries and applications needed to run your application. You can create a requirements.txt file with this simple command:
pip freeze > requirements.txt
After you run this command, every installed application and library is listed in the file. If you generate it without activating a virtualenv, Python falls back to the system environment, so the file will also include packages that are not necessary for your project.
You should add the virtualenv to your .gitignore. In fact, GitHub has a recommended .gitignore format for Python covering which files should be added and which shouldn't:
Github recommendation for gitignore
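A minimal excerpt of what such a .gitignore might contain (GitHub's full Python template covers much more):
venv/
.env
__pycache__/
*.pyc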

Does python pip have the equivalent of node's package.json?

In NodeJS's npm you can create a package.json file to track your project dependencies. When you want to install them you just run npm install and it looks at your package file and installs them all with that single command.
When distributing my code, does python have an equivalent concept or do I need to tell people in my README to install each dependency like so:
pip install package1
pip install package2
Before they can use my code?
Once all necessary packages are added
pip freeze > requirements.txt
creates a requirement file.
pip install -r requirements.txt
installs those packages again, say during production.
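The generated requirements.txt plays the same role as the dependencies section of package.json; its contents are just pinned package names, for example (versions are illustrative):
Flask==2.0.3
requests==2.26.0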
The best way may be pipenv! I personally use it!
However, in this guide I'll first explain how to do it with just Python and pip, without pipenv. That's the first part, and it will give us a good understanding of how pipenv works. The second part covers pipenv; see the section "pipenv (the closest thing to npm)".
Python and pip
To get this working well with plain Python and pip, here are the main elements:
virtual environment
requirements file (listing of packages)
pip freeze command
How to install packages from a requirements file
Virtual environment and why
Note that for this, the venv package is what you should use. It's the official tool, and it ships with the Python 3 installation starting from 3.3+.
To understand what it is and why it matters, check this out:
https://docs.python.org/3/tutorial/venv.html
In short, a virtual environment helps us keep an isolated Python interpreter, and with it an isolated set of installed packages. That way, different projects don't have to depend on the same package installations and conflict with each other. The link above explains and shows it well:
... This means it may not be possible for one Python installation to meet the requirements of every application. If application A needs version 1.0 of a particular module but application B needs version 2.0, then the requirements are in conflict and installing either version 1.0 or 2.0 will leave one application unable to run.
You may also like the explanation in the Flask framework docs:
https://flask.palletsprojects.com/en/1.1.x/installation/#virtual-environments
Why do we care about this and why should we use it? To isolate the projects (each one gets its own environment), so that the freeze command works on a per-project basis. Check the last section.
Usage
Here is a good guide on how to set it up and work with it:
https://packaging.python.org/guides/installing-using-pip-and-virtual-environments/
Check the installation section first!
Then
To create a virtual environment you go to your project directory and run:
On macOS and Linux:
> python3 -m venv env
On Windows:
> py -m venv env
Note: You should exclude your virtual environment directory from your version control system using .gitignore or similar.
To start using the environment in the console, you have to activate it:
https://packaging.python.org/guides/installing-using-pip-and-virtual-environments/#activating-a-virtual-environment
On macOS and Linux:
> source env/bin/activate
On Windows:
> .\env\Scripts\activate
See the part on how to check that you are in the environment (using which on Linux/Unix or where on Windows).
To deactivate you use
> deactivate
Requirement files
https://pip.pypa.io/en/latest/user_guide/#requirements-files
“Requirements files” are files containing a list of dependencies to be installed using pip install like so
(How to Install requirements files)
pip install -r requirements.txt
Requirements files are used to hold the result from pip freeze for the purpose of achieving repeatable installations. In this case, your requirement file contains a pinned version of everything that was installed when pip freeze was run.
python -m pip freeze > requirements.txt
python -m pip install -r requirements.txt
Some of the syntax:
pkg1
pkg2==2.1.0
pkg3>=1.0,<=2.0
== pins an exact version:
requests==2.18.4
google-auth==1.1.0
Force pip to accept earlier versions
ProjectA
ProjectB<1.3
Using git with a tag (fixing a bug yourself and not waiting)
git+https://myvcs.com/some_dependency@sometag#egg=SomeDependency
Again, check the link https://pip.pypa.io/en/latest/user_guide/#requirements-files - I picked all the examples from there; you should read the explanations and details.
For the format details check: https://pip.pypa.io/en/latest/cli/pip_install/#requirements-file-format
Freeze command
Pip can export a list of all installed packages and their versions using the freeze command. When you run it, every package installed in the current environment is listed:
pip freeze
Which will output something like:
cachetools==2.0.1
certifi==2017.7.27.1
chardet==3.0.4
google-auth==1.1.1
idna==2.6
pyasn1==0.3.6
pyasn1-modules==0.1.4
requests==2.18.4
rsa==3.4.2
six==1.11.0
urllib3==1.22
We can write that to a requirements file as such
pip freeze > requirements.txt
https://pip.pypa.io/en/latest/cli/pip_freeze/#pip-freeze
Installing packages: recap
By using a venv (virtual environment) for each project, the projects are isolated, and the freeze command lists only the packages installed in that particular environment, which makes it per-project. The freeze command captures the installed packages at the time it is run, with exact version pins. We generate a requirements file from it (requirements.txt), which we can add to the project repo so the dependencies can be installed from it.
The whole flow goes like this:
Linux/unix
python3 -m venv env
source env/bin/activate
pip3 install -r requirements.txt
Windows
py -m venv env
.\env\Scripts\activate
pip3 install -r requirements.txt
First-time setup after cloning a repo: create the new env, then activate it, then install the needed packages into it.
Otherwise, here is a complete guide about installing packages using requirements files and virtual environments from the official docs: https://packaging.python.org/guides/installing-using-pip-and-virtual-environments/
This second guide shows it all well too: https://docs.python.org/3/tutorial/venv.html
Links listing (already listed):
https://pip.pypa.io/en/latest/user_guide/#requirements-files
https://pip.pypa.io/en/latest/cli/pip_install/#requirements-file-format
https://pip.pypa.io/en/latest/cli/pip_freeze/#pip-freeze
pipenv (the closest thing to npm)
https://pipenv.pypa.io/en/latest/
pipenv is a tool that tries to be like npm for Python. It is a superset of pip.
pipenv creates the virtual environment for us and manages the dependencies.
A good feature, too, is the ability to write package.json-like files, with a scripts section in them:
executing Pipfile scripts
running Python commands through an alias on the command line, just like npm
Installation
https://pipenv.pypa.io/en/latest/install/
virtualenv-mapping-caveat
https://pipenv.pypa.io/en/latest/install/#virtualenv-mapping-caveat
For me, having the env created within the project (just like node_modules) should even be the default. Make sure to enable that behavior by setting the environment variable.
pipenv can simply seem more convenient!
Mainly, managing run scripts is too good to miss out on, and it's one tool that simplifies it all.
Basic usage and comparing to npm
https://pipenv.pypa.io/en/latest/basics/
(make sure to check the guide above to get familiar with the basics)
Note that the equivalent of npm's package.json is the Pipfile.
An example:
[[source]]
url = "https://pypi.org/simple"
verify_ssl = true
name = "pypi"
[packages]
flask = "*"
simplejson = "*"
python-dotenv = "*"
[dev-packages]
watchdog = "*"
[scripts]
start = "python -m flask run"
[requires]
python_version = "3.9"
There is Pipfile.lock, like package-lock.json.
To run the equivalent of npm install, you run pipenv install.
To install a new package:
pipenv install <package>
This will create a Pipfile if one doesn’t exist. If one does exist, it will automatically be edited with the new package you provided.
Just like with npm!
$ pipenv install "requests>=1.4" # will install a version equal or larger than 1.4.0
$ pipenv install "requests<=2.13" # will install a version equal or lower than 2.13.0
$ pipenv install "requests>2.19" # will install 2.19.1 but not 2.19.0
If the PIPENV_VENV_IN_PROJECT=1 environment variable is set, pipenv creates the virtual environment within the project, in a directory named .venv (the equivalent of node_modules).
Also, with that variable set, running pipenv install in a directory with neither a Pipfile nor a virtual environment will create the virtual environment in the .venv directory (node_modules equivalent) and generate a Pipfile and Pipfile.lock.
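A minimal sketch of that in-project setup (PIPENV_VENV_IN_PROJECT is documented in the pipenv link above):
export PIPENV_VENV_IN_PROJECT=1
pipenv install    # creates ./.venv plus Pipfile and Pipfile.lock if they don't exist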
Installing flask example:
pipenv install flask
Installing as a dev dependency:
pipenv install watchdog -d
or
pipenv install watchdog --dev
just like with npm!
pipenv all commands (pipenv -h)
Commands:
check Checks for PyUp Safety security vulnerabilities and against PEP 508 markers provided in Pipfile.
clean Uninstalls all packages not specified in Pipfile.lock.
graph Displays currently-installed dependency graph information.
install Installs provided packages and adds them to Pipfile, or (if no packages are given), installs all packages from Pipfile.
lock Generates Pipfile.lock.
open View a given module in your editor.
run Spawns a command installed into the virtualenv.
scripts Lists scripts in current environment config.
shell Spawns a shell within the virtualenv.
sync Installs all packages specified in Pipfile.lock.
uninstall Uninstalls a provided package and removes it from Pipfile.
update Runs lock, then sync.
Command help
pipenv install -h
importing from requirements.txt
https://pipenv.pypa.io/en/latest/basics/#importing-from-requirements-txt
environment management with pipenv
https://pipenv.pypa.io/en/latest/basics/#environment-management-with-pipenv
pipenv run
To run anything within the project's virtual environment, you need to use pipenv run, as in pipenv run python server.py.
Custom scripts shortcuts
(like scripts in npm)
https://pipenv.pypa.io/en/latest/advanced/#custom-script-shortcuts
[scripts]
start = "python -m flask run"
And to run
pipenv run start
Just like with npm!
If you’d like a requirements.txt output of the lockfile, run $ pipenv lock -r. This will include all hashes, however (which is great!). To get a requirements.txt without hashes, use $ pipenv run pip freeze.
The pipenv CLI output is nicely rendered, too. Make sure to read the basics guide, and you'll see how rich pipenv is!
Yes, it's called the requirements file:
https://pip.pypa.io/en/stable/cli/pip_install/#requirement-specifiers
You can specify the package name & version number.
You can also specify a git url or a local path.
In the usual case, you would specify the package followed by the version number, e.g.
sqlalchemy==1.0.1
You can install all the packages specified in a requirements.txt file through the command
pip install -r requirements.txt
Once all the packages have been installed, run
pip freeze > requirements.txt
This will save the package details in the file requirements.txt.
For installation, run
pip install -r requirements.txt
to install the packages specified by requirements.txt.
I would like to propose pipenv here. Managing packages with pipenv is easier, as it manages the list and the versions of packages for you, whereas with pip you need to run the pip freeze command each time you make changes to your packages.
It uses a Pipfile. This file contains all of your required packages and their versions, just like package.json.
You can add/update/remove packages using pipenv install/update/uninstall <package>.
It also generates a lock file (Pipfile.lock) recording your dependency tree, just like package-lock.json.
Checkout this post on Pipfiles
Learn more about Pipenv
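As a quick sketch of the day-to-day commands mentioned above (flask is just an example package):
pipenv install flask      # add a dependency; updates Pipfile and Pipfile.lock
pipenv uninstall flask    # remove it again
pipenv update flask       # re-lock and sync that package
pipenv graph              # show the installed dependency tree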

Installed Virtualenv and activating virtualenv doesn't work

I cloned my Django Project from Github Account and activated the virtualenv using famous command source nameofenv/bin/activate
And when I run python manage.py runserver
It gives me an error saying:
ImportError: Couldn't import Django. Are you sure it's installed and available on your PYTHONPATH environment variable? Did you forget to activate a virtual environment?
I was thinking that every and each dependency I need, might be present inside virtualenv.
Well, no. By default, a newly created virtualenv comes empty, that is, with no third-party library. (Optionally, you may allow a virtualenv to access libraries installed system-wide, but that's another story.)
Once the virtualenv is created, you need to install the dependencies you need.
(How could virtualenv know what dependencies you need?)
The procedure is to create the virtualenv, activate it, and then install the libraries needed for the project (in your case Django and perhaps others).
If your project has a requirements.txt, you may install every required dependency with the command:
pip install -r requirements.txt
If your project has a setup.py, you may also execute
pip install -e path/to/your/project/clone/.
to install the project in the virtualenv. This should install the dependencies.
Of course, if the only dependency is Django, you can just type
pip install django
On Ubuntu:
#install python pip
sudo apt-get install python-pip
#install python virtualenv
sudo apt-get install python-virtualenv
# create virtual env
virtualenv myenv
#activate the virtualenv
. myenv/bin/activate
#install django inside virtualenv
pip install django
#create a new django project
django-admin.py startproject mysite
#enter to the folder of the new django project
cd mysite
#run the django project
python manage.py runserver
If you have several Pythons on your machine, for example python2.7, python3.4, python3.6, it is important to figure out which version python really refers to, and moreover, which version pip refers to.
The same problem got in my way after I installed Let's Encrypt, when I ran the following command:
(python3 manage.py runserver 0:8000 &)
I inspected the python version and found that python3, python3.4, python3.6, python3.4m were all available.
I just change python3 to python3.6 and solved the problem.
(python3.6 manage.py runserver 0:8000 &)
So, this is probably a version-mismatch problem if everything was OK for a long time and then suddenly broke.
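To check which interpreter and pip you are actually running, something like this helps (use where instead of which on Windows):
which python3
python3 --version
python3 -m pip --version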
I'm guessing you also uploaded the virtual environment from your other PC and hoped that just activating it would work - bzz.
It's not recommended to upload the virtualenv files to your git repository; as @Alain says, it's good practice to have a requirements.txt file containing the project dependencies. You can use pip freeze > requirements.txt (with the environment activated) to generate the project requirements file.
By doing so, when you clone the repository from another computer, you need to create a new virtualenv by issuing the command:
virtualenv nameofenv
then activate it:
source nameofenv/bin/activate
and finally, use the requirements file to install the requirements for your project using:
pip install -r requirements.txt
I had installed Django 2 via pip3 install Django, but I was running python manage.py runserver instead of python3 manage.py runserver. Django 2 only works with python 3+.
