How do I set up a virtual environment with Flask using conda?

I wish to set up a virtual environment that I can use to develop web applications with the Flask framework for Python (3.4.2, Mac OS). I was given instructions on how to do that here, using virtualenv. However, trying to follow these instructions I ran into a problem: I have Python installed via Anaconda, and upon trying:
sudo easy_install virtualenv
I am warned that I should be doing this with the already-installed conda package instead. I can't imagine that the conda way of doing things is much harder, but I also don't want to get bogged down in reading the documentation, because then I might not emerge back out of it again... So my question is: what's a quick way of setting up a virtual environment with Flask using conda? And how can I then add more dependencies into this mix?

Your mileage may vary, but the docs tend to be where the answers are.
conda create -n my_flask_env
source activate my_flask_env
conda install condastuff
pip install otherstuff
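For the Flask case specifically, a concrete version of that recipe might look like the following sketch; flask_env is an arbitrary name, and sqlalchemy / flask-login stand in for whatever extra dependencies you actually need:
conda create -n flask_env python=3.4 flask   # create the env and pull Flask from the conda repos
source activate flask_env                    # newer conda versions use: conda activate flask_env
conda install sqlalchemy                     # further dependencies that conda carries
pip install flask-login                      # fall back to pip for anything conda doesn't have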

Related

Do I need a clean install of Python to start working with virtual environments?

I've been using Python on my system for about a year as a new programmer. The topic of virtual environments hadn't come up until I got to the end of the Django course on Codecademy. I'm now expected to make a Django project on my own system.
I have been just installing packages to Python without making virtual environments in the past as I wasn't aware that it was recommended to create an environment for each project.
Should I have a clean install of Python before I start using virtual environments?
If so, is there a pip command to uninstall all non-standard-library packages and essentially reset the install?
Should I have a clean install of Python before I start using virtual environments?
No, it's not needed. Indeed, doing so would defeat the main purpose of using a virtual environment: virtual environments are used to "isolate" a project's packages from the ones you have installed "globally".
You can just create a new virtual environment and use it, every time you start a brand new project!
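As a sketch of that per-project workflow (the directory name and the Django install are just examples):
cd my_django_project
python3 -m venv venv             # the environment lives inside this project only
source venv/bin/activate
pip install django               # goes into venv, not into the global site-packages
deactivate                       # back to the system Python when you're done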
You don't necessarily need a clean install of Python to have a 'clean' environment using virtualenv.
It used to be that you would specify the --no-site-packages flag to remove visibility of the globally installed packages, like below:
virtualenv --no-site-packages venv_name
However, this is now the default option for virtualenv, and you don't need to do it explicitly unless you are running a very old version.
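One quick way to see the isolation for yourself (exact output varies by virtualenv version) is to list what a brand-new environment contains:
virtualenv venv_name
source venv_name/bin/activate
pip list                         # typically just pip, setuptools and wheel; no global packages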

How to install a Python module local to a single project

I've been looking around but was not able to find a definitive answer...
So here's my question...
I come from a JavaScript background. I'm trying to pick up Python now.
In JavaScript, the basic practice would be to npm install (or use yarn).
This would install the required modules in a specific project.
Now, for Python, I've figured out that pip is the package manager.
I can't seem to figure out how to install packages specific to a project (like how JavaScript does it).
Instead, it's all global. I've found the --user flag, but that's not really what I'm looking for.
I've come to the conclusion that this is just a completely different scheme and I shouldn't try to approach it the way I do in JavaScript.
However, I can't really find a good document on why this method was favored.
It may be just my problem, but I can't stop thinking about how I'm consistently bloating my global pip folder with modules that I'm only ever going to use once for some single project.
Thanks.
A.) Anaconda (the simplest): just download Anaconda, which comes with a lot of Python modules preinstalled, and use them; it also includes code editors. You can create multiple module collections with the GUI.
B.) venv = virtual environments (if you need something light that contains specific packages for every project).
macOS terminal commands:
Install virtualenv (optional; the venv module used below already ships with Python 3)
pip install virtualenv
Set up the venv (inside the base project folder)
python3 -m venv thenameofyourvirtualenvironment
Activate the venv
source thenameofyourvirtualenvironment/bin/activate
Deactivate the venv
deactivate
While it is activated you can install specific packages, e.g.:
pip -q install bcrypt
C.) Use “Docker”: it is great if you want to go in depth and have a solid experience, but it can get complicated.
pip is a program used to manage packages for a Python distribution. You usually have one system distribution, which is by default managed by pip. When you do pip install scipy, you install the package scipy into your system Python. Every time you try to import scipy after that, it will work because your system Python has it.
Project-specific distributions are accomplished by using virtual environments. python -m venv env (or virtualenv env) creates a copy of the system Python interpreter, pip, setuptools and a couple of other essential tools. In other words, a virtual environment created this way is empty.
To use the created virtual environment one should run source env/bin/activate. After that, every time you invoke the python command it will use the activated Python interpreter. When you install packages using pip, they will be installed in the virtual environment rather than in your system Python. To use the system Python again, use deactivate.
Such usage is actually preferred for projects, because some user applications may rely on the system Python and its packages, and installing or updating things there could be potentially dangerous.
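If what you miss from npm is a per-project manifest like package.json, the usual pip convention is a requirements.txt kept inside the project. A rough sketch, with requests standing in for any real dependency:
source env/bin/activate
pip install requests               # installed into the virtual environment only
pip freeze > requirements.txt      # record the project's dependencies, roughly like package.json
pip install -r requirements.txt    # recreate them later, e.g. in a fresh environment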
Further reading: venv documentation

Why doesn't a new Conda environment come with packages like numpy?

I am going through the painful process of learning how to manage packages/different (virtual) environments in Python/Anaconda. I was told that Anaconda is basically a Python installation with all the packages I need (e.g. numpy, scipy, scikit-learn etc.).
However, when I create a new environment, none of these packages is readily available. I cannot import them when using PyCharm with the newly created environment. When I check the PyCharm project interpreter, or the Anaconda Navigator environments tab, it seems that indeed none of these packages are installed in my new environments. Why is this? It doesn't make sense to me to provide all these packages, but then not make them ready for use when creating new environments. Do I have to install all these packages manually in new envs or am I missing something?
Kindest regards, and thanks in advance.
The reason the default Python environment doesn't come with numpy is that you might not want numpy in the environment. Imagine writing an API (or general software package) where your users may or may not have access to numpy. You might want to run tests to make sure your software fails gracefully or has a pure Python fallback if numpy is not installed on your user's machine. Conda environments provide this (insanely useful) benefit. Of course, the package in question doesn't have to be numpy. There are some more esoteric packages where this type of testing is useful.
Furthermore, you can create a conda environment with numpy pre-installed, or any other package you want pre-installed (just add them to the end of the conda create command):
conda create --name my-env-name numpy
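The same idea extends to several packages at once, and versions can be pinned as well; the names and version numbers below are only examples:
conda create --name my-env-name numpy scipy scikit-learn
conda create --name my-pinned-env python=3.6 numpy=1.15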
Anaconda comes with available packages such as numpy, scipy, and scikit-learn, but if you want to use them within your environment, you must:
1) Create the environment:
conda create --name new_env
2) Activate the environment:
source activate new_env
3) Install the desired package using conda install
conda install numpy
If you'd like to create a new environment that includes installations of all available Anaconda packages, see create anaconda python environment with all packages. You can include anaconda in the list of packages to install in the environment, which is a 'meta-package' meaning 'all the packages that come with the Anaconda installation'.
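Concretely, that looks like the line below. Note that it pulls in the full Anaconda package set, so it is much heavier than a minimal environment:
conda create --name new_env anaconda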
I don't know about "conda" environments but in general virtual environments are used to provide you a "unique" environment. This might include different packages, different environment variables etc.
The whole point of making a new virtual environment is to have a separate place where you can install all the binaries ( and other resources ) required for your project. If you have some pre-installed binaries in the environment, doesn't it defeat the purpose of creating one in the first place?
The fact that you can create multiple environments helps you to separate binaries that might be needed by one and not by the other.
For instance, if you are creating a project which requires numpy 1.1 but you have numpy 2.1 installed, then you would have to change it. So basically, by not installing any other packages, they are not making assumptions about your project's requirements.
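To make that concrete, the conflicting-requirements case is usually handled with one environment per project; the pinned version below is only an example:
conda create -n old_project numpy=1.11    # project that needs an older numpy
conda create -n new_project numpy         # project that is happy with the current numpy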
You can check the packages you have in your environment with the command:
conda list
If a package is not listed you just have to add it, with the command:
conda install numpy

How can I manage multiple Pythons in Ubuntu 16.04?

In my Ubuntu 16.04, there are Python 2 and Python 3 by default. In addition, I have installed Anaconda too. I am confused by the 'python' command. Every time I use pip or pip3 install, I don't know where the package installs, Python 2 or Python 3. And I use conda install to install Anaconda packages. I also use Anaconda envs to manage different virtual environments. But I think it mixes with my local Python 2 and 3.
For example, in the directory /usr/bin, I found many soft links. When I try the 'python' command, it just confuses me!
Why is python3m local? Shouldn't it be Anaconda? Why is python3 Anaconda? Shouldn't it be local? Then I found that if I use ./python2 or ./python3, it is correct!
So I know it is caused by environment variables. I echoed $PATH and found it looks like this: /home/kinny/.pyenv/shims:/home/kinny/.pyenv/bin:/home/kinny/anaconda3/bin:/home/kinny/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/opt/ant/bin:/snap/bin:/opt/maven/bin:/usr/lib/jvm/java-8-oracle/bin
I have used update-alternatives --config python to configure the default Python, but it doesn't work! They still seem mixed up with each other.
Now I just want to install TensorFlow 0.11 in the local Python 3, because in Anaconda it is version 0.10 by default. How can I change this? I just want python, python3 and python3m to represent Python 2.7, Python 3.5 and Anaconda Python respectively. How can I do that, and use pip and pip3 for the local Python 2 and Python 3 respectively?
I ran into a similar problem when setting up PyCharm Edu to work with Anaconda. I found that I had several versions of Python installed and it was very hard to keep track of which version the IDE was referencing. My CS professor gave me the advice of simply removing the versions of Python I didn't frequent. I now just have Anaconda installed, and use the Anaconda Prompt as my Python console. I also rely on PyCharm's IPython for the developer console. However, if you still want differing versions of Python installed (say you're doing QA testing for older devices), there is the really helpful command: which python. When entered into a shell or the Anaconda Prompt, which python will display the path of the Python executable that will be run. This enables you to better keep track of which particular python executable the current window is referring to.
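For example, from a shell where Anaconda's bin directory comes first in PATH, the output might look like this (the path below just mirrors the one in the question and is only illustrative):
which python
/home/kinny/anaconda3/bin/python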
Follow up to the comments mentioning using virtualenv and virtualenvwrapper.
Here are the official docs and a good blog post to follow for getting started with virtualenvs:
https://virtualenv.pypa.io/en/stable/installation/
http://virtualenvwrapper.readthedocs.io/en/latest/install.html
http://exponential.io/blog/2015/02/10/install-virtualenv-and-virtualenvwrapper-on-ubuntu/
Also, once you are set up you can create virtualenvs specifying which Python installation you want to use.
which python3
returns
/usr/bin/python3
Then create a virtualenv with that Python path, where example_env is the name of the virtualenv.
mkvirtualenv -p /usr/bin/python3 example_env
Then activate the virtualenv using virtualenvwrapper.
workon example_env
Finally, install tensorflow and other dependencies with pip.
pip install tensorflow
The which command is very useful for finding the path to the executable that is first in your PATH. Zsh also has the where command, which will show you all instances of the given executable that show up in your PATH.
For managing different Python versions, you have a lot of options. The easiest for most people tends to be Anaconda, using conda environments. The installer will ask you to add some stuff to your .bashrc file, which will then make Anaconda's binaries come first in your PATH. Anything else you run after the .bashrc gets sourced will then use that first, including PyCharm. For graphical desktop apps to pick up the change, you may need to log out and back in again.
If you only need one version each of Python 2 and Python 3, you can just use the ones available via apt. Depending on your Ubuntu version, Python 2 is definitely installed by default as it is used by many system utilities, including apt itself. Some newer versions may also install Python 3 by default, but I do not remember for sure.
Another option is to install the versions of Python you need in an alternate location, such as /opt/python/<version>, and then use environment-modules (installed via apt install environment-modules) or Lmod to control which versions are being used, but that may or may not be easy/convenient to use with a desktop application such as PyCharm.
For TensorFlow, 1.11 is available in Anaconda, but I don't remember if it's in the default channel or not.
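If you take the conda-environments route described above, one environment per interpreter version keeps python and pip unambiguous; the names are arbitrary and the TensorFlow install is just the example from the question:
conda create -n py27 python=2.7
conda create -n py35 python=3.5
source activate py35              # python and pip now point at this env's Python 3.5
pip install tensorflow            # installs only into the active environment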

Installing certain packages using virtualenv

So, I want to start using virtualenv this year. I like the no-site-packages option; that is nice. However, I was wondering how to install certain packages into each virtualenv. For example, let's say I want to install Django into each virtualenv... is this possible, and if so, how? Does buildout address this?
Well, it's not so much Django, more like the Django applications... I don't mind installing a version of Django into each virtualenv... I was just wondering if there was some intermediate option to 'no-site-packages'.
I know where you're coming from with the no-site-packages option. I want to use pip freeze to generate requirements lists and don't want a lot of extra cruft in site-packages. I also need to use multiple versions of Django as I have legacy projects I haven't upgraded (some old svn checkouts (pre-1.0), some 1.0, and some new svn checkouts). Installing Django in the global site-packages isn't really an option.
Instead I have a django folder with releases and a couple of different svn versions and just symlink to the appropriate version in the local site-packages. For ease of use I link to the local site-packages at the same level as the environment and then link in the appropriate django directory and any other "system" style packages I need (usually just PIL). So:
$ virtualenv pyenv
$ ln -s ./pyenv/lib/python2.5/site-packages ./installed
$ ln -s /usr/lib/python2.5/site-packages/PIL ./installed
$ ln -s /opt/django/django1.0svn/trunk/django ./installed
Now the following works:
$ source pyenv/bin/activate
$ python
>>> import django
>>> import PIL
If you want Django to be installed in EACH virtualenv, you might as well install it in the site-packages directory? Just a thought.
I'd suggest using virtualenv's bootstrapping support. This allows you to execute arbitrary Python after the virtualenv is created, such as installing new packages.
The other option (one I've used) is to easy_install Django after you've created the virtual environment. This is easily scripted. The penalty you pay is waiting for Django installation in each of your virtual environments.
I'm with Toby, though: Unless there's a compelling reason why you have to have a separate copy of Django in each virtual environment, you should just consider installing it in your main Python area, and allowing each virtual environment to use it from there.
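If you do go the copy-per-environment route, a minimal sketch of scripting it (assuming a Unix shell; myenv is a placeholder, and pip works just as well in place of easy_install):
virtualenv myenv
myenv/bin/easy_install Django     # installs Django into myenv only, no activation needed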
I want to check out this project:
http://www.stereoplex.com/two-voices/fez-djangoskel-django-projects-and-apps-as-eggs
Might be my answer....
