Can Anaconda be used to create virtual environments for Go?

Anaconda is a powerful and popular toolkit normally associated with Python development. Its snake name was probably chosen as a nod to Python, but the impression I get from using it is that what it really does is manage virtual environments, pre-install useful Python libraries, tools and packages, and then (probably most importantly) handle dependencies and installations.
Can I set Anaconda up to create virtual environments for Go, too?
Since Anaconda can install Go, I suspect it is actually possible, but I also get the feeling that all of Anaconda's dependency-solving power was probably designed with only Python in mind.
...Can it be done?

I also get the feeling that all of Anaconda's dependency-solving power was probably designed with only Python in mind.
Absolutely not. Anaconda installs Conda. Conda is a package and environment manager, and not just for Python: it also handles R, Julia, Perl, Scala and more, including Go.
You will need to create a new environment if you want to run Go:
conda create --name go --channel conda-forge go
Then switch to the new environment to run Go:
conda activate go

Since Go supports vendoring, you can use a utility like https://github.com/golang/dep to keep all dependencies inside the project.
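To confirm that the Go toolchain inside the environment is the one being picked up, something like this should work once the environment is active:

go version        # reports the Go version installed from conda-forge
which go          # should point inside the conda environment, e.g. .../envs/go/bin/go
go env GOROOT     # the GOROOT shipped with the conda package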

Related

Installing a second python environment on Mac

I’m trying to figure out how to install a second python environment alongside anaconda.
On Windows I can just install Python in a different folder and reference the desired Python environment using environment variables. I'd like to do the same on Mac.
A virtual env won't do the trick, as it does not copy the standard library and other things. It needs to be a complete, standalone environment. I guess I could compile it, but is there an easier way?
Thank you very much for any input.
You can do that using pyenv.
It allows you to have several python versions, and even different distributions.
It works mostly in user space, so no additional requirements are needed (apart from the tools needed to compile Python).
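A minimal sketch of that workflow, assuming pyenv is already installed (for example via Homebrew); the version number is just an example:

pyenv install 3.6.3       # build and install a separate CPython under ~/.pyenv
pyenv versions            # list every interpreter pyenv knows about, including the system one
pyenv local 3.6.3         # use 3.6.3 whenever you are inside the current directory
python --version          # should now report 3.6.3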

Managing Multiple Python installations

Much modern software depends on the Python language and, as a consequence, installs its own version of Python with the libraries that particular application needs to work properly.
In my case, I have my own Python that I downloaded intentionally using the Anaconda distribution, but I also have the ones that came with ArcGIS, QGIS, and others.
I have difficulty distinguishing which Python, say, I am updating or adding libraries to when reaching them from the command line, and these are not environments but rather full Python installations.
What is the best way to tackle that?
Is there a way to force new software to create new environments within one central Python distribution, instead of losing track of all the copies existing on my computer?
Note that I am aware that QGIS can be downloaded now through conda, which reduces the size of my problem, but doesn't completely solve it. Moreover, that version of QGIS comes with its own issues.
Thank you very much.
As Nukala suggested, that's exactly what virtual environments are for. A virtual environment contains a particular version of a Python interpreter and a set of libraries to be used by a single project (or sometimes several). If you use an IDE such as PyCharm, it handles the venvs for you automatically.
You can use pyenv to manage Python versions on your system. Using pyenv you can easily switch between multiple versions.
And as suggested, each project can create a virtual environment. You have multiple options here: venv, virtualenv, virtualenvwrapper, pipenv, poetry, etc.
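When you are unsure which installation a command-line python or pip actually refers to, it also helps to ask the tools themselves; a few quick checks, for example:

which -a python python3                          # every python on PATH, in precedence order
python -c "import sys; print(sys.executable)"    # where the current interpreter lives
pip --version                                    # shows which Python this pip belongs to
conda env list                                   # environments conda knows about, if conda is installed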

Why and when to use a new anaconda environment? (One environment for everything? A new environment for every project?)

I have a little bit of experience using Anaconda and am about to transition to using it much more, for all of my Python data work. Before embarking on this I have what feels like a simple question: "when should I use a new environment?"
I cannot find any good, practical advice on this on StackOverflow or elsewhere on the web.
I understand what environments are and their benefits, and that if I am working on a project that depends on a specific version of a library that differs from, say, the latest version of that library, then virtual environments are the answer; but I am looking for some advice on how to practically approach their use in my day-to-day work on different data projects.
Logically there appear to be (at least) two approaches:
1. Use one environment until you absolutely need a separate environment for a specific project
2. Use a new environment for every single project
I can see some pros and cons to each approach and am wondering if there is any best practice that can be shared.
If I should use just one environment until I need a second one, should I just use the default "root" environment and load all my required dependent libraries into that or is it best to start off with my own environment that is called something else?
An answer to the question "why create new environment for install" by #codeblooded gives me some hints as to how to use and approach conda environments, and suggests a third way:
Create new environments on an as-needed basis; projects do not "live" inside environments but use environments at runtime. You will end up with as many different virtual environments as you need to run the projects that you regularly use on that machine; that may be just one environment, or it may be more.
Anyway, you can see that I am struggling to get my head around this, any help would be greatly appreciated. Thank you!
As a developer who works with data scientists, I would strongly recommend creating an environment for each project. The benefit of Python environments is that they encapsulate the requirements of a project, keeping it separate from all other Python projects.
In the case above, if you were to use Python36 for 8 different projects it would be very easy to accidentally upgrade a package or install a conflicting package that breaks other projects without you realising it.
In the work you do it might not be a big deal, but given how easy it is to create a separate environment for each project, the benefits outweigh the small time cost.
I can tell you that if any of the developers I work with was found to be using a single python environment for multiple development projects they would be instructed to stop doing that immediately.
Ok, I think I worked this one out for myself. Seems kind of obvious now.
You do NOT need to create an environment for every project.
However if particular projects require particular versions of libraries, a particular version of Python etc then you can create a virtual environment to capture all of those dependencies.
To give an example: let's say you are working on a project that requires a library with a dependency on a particular version of Python, e.g. Python 3.6, and your base (root) environment is Python 3.7. You would create a new Anaconda environment configured to use Python 3.6 (maybe call it "Python36"), install all the required libraries in that environment, and use that environment when running that project.
When you have another project that requires similar libraries, you may reuse the now-existing Python 3.6 environment (named "Python36") to run the new project; you would not have to create a new Python 3.6 environment, just as you would not install multiple copies of Python 3.6 in order to run multiple projects that require Python 3.6.
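In conda terms, the example above might look roughly like this (the library names are only placeholders for whatever the project actually needs):

conda create --name Python36 python=3.6
conda activate Python36
conda install numpy pandas    # the project's real dependencies go here
# Later, any other project that needs Python 3.6 can simply reuse the same environment:
conda activate Python36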

Running pip3 on MacOS Sierra (Python newbie) - do I need a virtualenv?

I'm running MacOS Sierra 10.12.6
By default the system came with Python 2.7.10
I installed Python 3.6.3 (with IDLE) so I can learn Python (3). I understand that this is normal as MacOS may rely on Python 2.x for some programs. Either way, Python3 runs just fine if I run python3 from the command line/terminal, or if I use IDLE (which defaults to Python 3).
Now I want to install some libraries like Beautiful Soup.
And I believe I can install it as follows:
pip3 install beautifulsoup4
which should automatically install it. However, I read that it's recommended to use virtualenv on Mac BEFORE I run the above command. As a newbie, I don't want to mess anything up on my PC, so can anyone point out how I can do this correctly?
For example, I can follow this link: http://sourabhbajaj.com/mac-setup/Python/virtualenv.html
But I just want to write here to make sure I'm following the right article/commands before I do it. Just being super careful!
Also, can I make a folder with my "virtual environment" and then add sub-folders inside it for each project? Meaning, I don't need to do this every time: I have one virtual environment, and any project I do is just a subfolder within that space, so I can use any libraries that I installed. Just trying to grasp the concept.
Thanks!
Sorry to add confusion... this can be a tough subject for someone starting out.
The official docs recommend venv, which is similar to, but slightly different than virtualenv.
I would strongly recommend PyCharm. It will create your venv for you as part of your project, which you might find helpful.
[Edit: Some other virtual environment features of PyCharm that will help you.]
If you type an import statement for a package that isn't installed, it will offer to install it for you.
Typing Alt-F12 will bring up a console with your virtual environment active.
It syncs your requirements.txt file for you.
It manages your virtual environment path for you (as long as you are running inside PyCharm), helping you avoid the import problems that many newcomers have with virtual environments.
I am not affiliated with PyCharm, by the way -- I just think it is a great tool for Python developers, especially newcomers, and its treatment of virtual environments is especially helpful.
You create one virtualenv for each project as a way of keeping track of its specific dependencies and keeping them minimal, which then makes it easier when you want to share projects with other people.
But this is not something you need. No harm comes from installing packages in your real environment as well. So you can safely run
pip3 install beautifulsoup4
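That said, if you do want to try a virtual environment first, here is a minimal sketch with the built-in venv module (the myproject path is just an example):

python3 -m venv ~/myproject/venv       # create the environment
source ~/myproject/venv/bin/activate   # activate it; your prompt changes
pip install beautifulsoup4             # installs only into this environment
deactivate                             # leave the environment when done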

Will Anaconda Python config scripts clash with Homebrew's?

Will Anaconda Python config scripts clash with Homebrew's? Note that I do not use these config scripts in any of my workflows, I'm just wondering if any of these config scripts may get called "behind the scenes". Sample output below (with username replaced by '..'):
$ brew doctor
...
Having additional scripts in your path can confuse software installed via
Homebrew if the config script overrides a system or Homebrew provided
script of the same name. We found the following "config" scripts:
/Users/../anaconda/bin/curl-config
/Users/../anaconda/bin/freetype-config
/Users/../anaconda/bin/libdynd-config
/Users/../anaconda/bin/libpng-config
/Users/../anaconda/bin/libpng15-config
/Users/../anaconda/bin/llvm-config
/Users/../anaconda/bin/python-config
/Users/../anaconda/bin/python2-config
/Users/../anaconda/bin/python2.7-config
/Users/../anaconda/bin/xml2-config
/Users/../anaconda/bin/xslt-config
Clearly some of these clash with some Homebrew-installed packages.
$ ls /usr/local/bin/*-config
/usr/local/bin/Magick++-config /usr/local/bin/libpng-config
/usr/local/bin/Magick-config /usr/local/bin/libpng16-config
/usr/local/bin/MagickCore-config /usr/local/bin/pcre-config
/usr/local/bin/MagickWand-config /usr/local/bin/pkg-config
/usr/local/bin/Wand-config /usr/local/bin/python-config
/usr/local/bin/freetype-config /usr/local/bin/python2-config
/usr/local/bin/gdlib-config /usr/local/bin/python2.7-config
Such clashes are entirely possible. When you install software that depends on Python using Homebrew, you want it to see the Python packages and libraries installed via Homebrew, not those installed by Anaconda.
My solution to this is not putting
export PATH=$HOME/anaconda/bin:$PATH
into .bashrc. Normally, you'll just use the Python and pip installed via Homebrew and the packages installed by that pip. Sometimes, when you are developing Python projects for which Anaconda's environment management mechanism (conda create -n my-env) is convenient, you can temporarily run export PATH=$HOME/anaconda/bin:$PATH to turn it on. From what I gathered, one important benefit of Anaconda over regular Python is that conda create -n my-env anaconda will not duplicate package installations unnecessarily, as virtualenv my-env will when you have a large number of virtual environments. If you do not mind some degree of duplication, you could avoid installing Anaconda altogether and just use virtualenv.
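One way to make that switch convenient is a small shell function in .bashrc that only prepends Anaconda when you ask for it; a sketch, assuming Anaconda lives in $HOME/anaconda:

# Anaconda stays off PATH by default; run "use_anaconda" in a shell that needs conda
use_anaconda() {
    export PATH="$HOME/anaconda/bin:$PATH"
}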
It's entirely possible you won't notice any problems. On the other hand, you may have some pretty frustrating ones. It all depends on what you use and how your $PATH is ordered. Homebrew will take whichever file has precedence in your $PATH; if another Homebrew package needs to use Homebrew-installed config files and it sees the Anaconda versions first, it doesn't know any better than to use the wrong ones. In a sense, that's what you told it to do.
My recommendation is to keep things simple and clean. Unless you have a particular reason to keep Anaconda on your $PATH, you should probably pop it out and alias anything you need. Alternatively you could just install the things you require (e.g., numpy) via Homebrew and eliminate Anaconda altogether. (Actually, that's really what I would do. Anaconda comes with way more stuff than I have any reason to be dumping onto my machine.)
I don't know what your $PATH looks like, but in my experience, keeping it short and systematic has a lot of advantages.
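To see how your $PATH is ordered and which copy of a given config script would actually win, you can check, for example:

echo "$PATH" | tr ':' '\n'    # one PATH entry per line, in precedence order
which -a libpng-config        # every match on PATH; the first one listed is the one used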
