I'm using Vagrant to set up a box with python, pip, virtualenv, virtualenvwrapper and some requirements. A provisioning shell script adds the required lines for virtualenvwrapper to .bashrc. It does a very basic check that they're not already there, so that it doesn't duplicate them with every provision:
if ! grep -Fq "WORKON_HOME" /home/vagrant/.bashrc; then
echo 'export WORKON_HOME=/home/vagrant/.virtualenvs' >> /home/vagrant/.bashrc
echo 'export PROJECT_HOME=/home/vagrant/Devel' >> /home/vagrant/.bashrc
echo 'source /usr/local/bin/virtualenvwrapper.sh' >> /home/vagrant/.bashrc
source /home/vagrant/.bashrc
fi
That seems to work fine; after provisioning is finished, the lines are in .bashrc, and I can ssh to the box and use virtualenvwrapper.
However, virtualenvwrapper doesn't work during provisioning. After the section above, the next part of the script checks for a pip requirements file and tries to install from it with virtualenvwrapper:
if [[ -f /vagrant/requirements.txt ]]; then
mkvirtualenv 'myvirtualenv' -r /vagrant/requirements.txt
fi
But that generates:
==> default: /tmp/vagrant-shell: line 50: mkvirtualenv: command not found
If I try to echo $WORKON_HOME from that shell script, nothing appears.
What am I missing to have those environment variables available, so virtualenvwrapper will run?
UPDATE: Further attempts... it seems that doing source /home/vagrant/.bashrc has no effect in my shell script. I can put echo "hello" in the .bashrc file, and it isn't output during provisioning (but it is if I run source /home/vagrant/.bashrc when logged in).
I've also tried su -c "source /home/vagrant/.bashrc" vagrant in the shell script but that is no different.
UPDATE 2: Removed the $BASHRC_PATH variable, which was confusing the issue.
UPDATE 3: In another question I got the answer as to why source /home/vagrant/.bashrc wasn't working: the first part of the .bashrc file prevented it from doing anything when run "not interactively" in that way.
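(For reference, the stock Ubuntu .bashrc starts with a guard along these lines, which is what makes it a no-op in a non-interactive provisioning shell; the exact form varies by release:)
# If not running interactively, don't do anything
case $- in
    *i*) ;;
      *) return;;
esac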
The Vagrant script provisioner runs as root, so its home dir (~) will be /root. In your script, if you define BASHRC_PATH=/home/vagrant/.bashrc, then I believe your steps will work: appending to, then sourcing from, /home/vagrant/.bashrc.
Update:
Scratching my earlier idea ^^ because BASHRC_PATH is already set correctly.
As an alternative we could use .profile or .bash_profile. Here's a simplified example which sets environment variable FOO, making it available during provisioning and after ssh login:
Vagrantfile
Vagrant.configure(2) do |config|
config.vm.box = "hashicorp/precise32"
$prov_script = <<SCRIPT
if ! grep -q "export FOO" /home/vagrant/.profile; then
sudo echo "export FOO=bar" >> /home/vagrant/.profile
echo "before source, FOO=$FOO"
source /home/vagrant/.profile
echo "after source, FOO=$FOO"
fi
SCRIPT
config.vm.provision "shell", inline: $prov_script
end
Results
$ vagrant up
...
==> default: Running provisioner: shell...
default: Running: inline script
==> default: before source, FOO=
==> default: after source, FOO=bar
$ vagrant ssh -c 'echo $FOO'
bar
$ vagrant ssh -c 'tail -n 1 ~/.profile'
export FOO=bar
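Applied to the original virtualenvwrapper setup, the same .profile pattern would look something like this (just a sketch, not tested against that exact box):
if ! grep -Fq "WORKON_HOME" /home/vagrant/.profile; then
  echo 'export WORKON_HOME=/home/vagrant/.virtualenvs' >> /home/vagrant/.profile
  echo 'export PROJECT_HOME=/home/vagrant/Devel' >> /home/vagrant/.profile
  echo 'source /usr/local/bin/virtualenvwrapper.sh' >> /home/vagrant/.profile
  source /home/vagrant/.profile
fi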
I found a solution, but I don't know if it's the best. It feels slightly wrong as it's repeating things, but...
I still append those lines to .bashrc, so that virtualenvwrapper will work if I ssh into the machine. But, because source /home/vagrant/.bashrc appears to have no effect during the running of the script, I have to explicitly repeat those three commands:
if ! grep -Fq "WORKON_HOME" $BASHRC_PATH; then
echo 'export WORKON_HOME=$HOME/.virtualenvs' >> $BASHRC_PATH
echo 'export PROJECT_HOME=$HOME/Devel' >> $BASHRC_PATH
echo 'source /usr/local/bin/virtualenvwrapper.sh' >> $BASHRC_PATH
fi
WORKON_HOME=/home/vagrant/.virtualenvs
PROJECT_HOME=/home/vagrant/Devel
source /usr/local/bin/virtualenvwrapper.sh
(As an aside, I also realised that during vagrant provisioning $HOME is /root, not the /home/vagrant I was assuming.)
The .bashrc in the Ubuntu box does not work on its own. You have to create a .bash_profile and add:
if [ -f ~/.bashrc ]; then
. ~/.bashrc
fi
As mentioned in your other question, Vagrant prohibits interactive shells during provisioning - apparently only for some boxes (I still need a reference for this). For me, this affects the official Ubuntu Trusty and Xenial boxes.
However, you can simulate an interactive bash shell using sudo -H -u USER_HERE bash -i -c 'YOUR COMMAND HERE'
Answer taken from: https://stackoverflow.com/a/30106828/4186199
This has worked for me installing Ruby via rbenv and Node via nvm when provisioning the Ubuntu/trusty64 and xenial64 boxes.
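Applied to the original question, a sketch might look like this (assuming virtualenvwrapper is already installed and the exports are already in /home/vagrant/.bashrc):
# run the command through an interactive bash as the vagrant user,
# so /home/vagrant/.bashrc (and with it virtualenvwrapper.sh) gets sourced first
sudo -H -u vagrant bash -i -c 'mkvirtualenv myvirtualenv -r /vagrant/requirements.txt'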
Related
On my Windows system I've successfully installed a virtual environment (the Python version is 3.9) using the Windows command prompt:
python -m venv C:\my_path\my_venv
Still using the Windows command prompt, I'm able to activate the created venv via
C:\my_path\my_venv\Scripts\activate.bat
I am sure the venv is correctly activated since:
in the Windows terminal, I see the command line is preceded by (my_venv)
if I start Python from the terminal (python) and run import sys ; sys.path, I can see the desired path in the list of paths: [..., 'C:\\my_path\\my_venv\\lib\\site-packages\\win32\\lib', ...]
if I do things in the activated venv (like installing packages), everything works and is done inside the venv
To sum up, everything is fine so far.
I also have WSL2 (Ubuntu) and I'd like to activate the same venv using the Ubuntu terminal.
If, from the Ubuntu terminal, I activate the venv
source /mnt/c/my_path/my_venv/Scripts/activate
it seems to work, since the command line is preceded by (my_venv), but when I run Python (the python3 command) and then import sys ; sys.path, I see that the system is targeting the base Ubuntu Python installation (version 3.8) and not the venv installation:
['', '/usr/lib/python38.zip', '/usr/lib/python3.8', '/usr/lib/python3.8/lib-dynload', '/usr/local/lib/python3.8/dist-packages', '/usr/lib/python3/dist-packages']
The venv is not really activated. Any suggestions to solve the issue?
In case it helps, here are a couple of additional details.
If I try to create a venv directly using the Ubuntu terminal
python3 -m venv /mnt/c/my_path/my_venv_unix
and activate it via the Ubuntu terminal (source /mnt/c/my_path/my_venv_unix/bin/activate), everything works fine. But that's not what I want: I'd like to use WSL to activate a virtual environment created with the Windows command prompt, since on my machine I have a lot of venvs created with Windows and I don't want to replicate them.
Below is the script C:\my_path\my_venv\Scripts\activate (/mnt/c/my_path/my_venv/Scripts/activate using WSL folder naming). (I had to change the EOL from Windows to Ubuntu, otherwise the command source /mnt/c/my_path/my_venv/Scripts/activate would not have worked.)
# This file must be used with "source bin/activate" *from bash*
# you cannot run it directly
deactivate () {
# reset old environment variables
if [ -n "${_OLD_VIRTUAL_PATH:-}" ] ; then
PATH="${_OLD_VIRTUAL_PATH:-}"
export PATH
unset _OLD_VIRTUAL_PATH
fi
if [ -n "${_OLD_VIRTUAL_PYTHONHOME:-}" ] ; then
PYTHONHOME="${_OLD_VIRTUAL_PYTHONHOME:-}"
export PYTHONHOME
unset _OLD_VIRTUAL_PYTHONHOME
fi
# This should detect bash and zsh, which have a hash command that must
# be called to get it to forget past commands. Without forgetting
# past commands the $PATH changes we made may not be respected
if [ -n "${BASH:-}" -o -n "${ZSH_VERSION:-}" ] ; then
hash -r 2> /dev/null
fi
if [ -n "${_OLD_VIRTUAL_PS1:-}" ] ; then
PS1="${_OLD_VIRTUAL_PS1:-}"
export PS1
unset _OLD_VIRTUAL_PS1
fi
unset VIRTUAL_ENV
if [ ! "${1:-}" = "nondestructive" ] ; then
# Self destruct!
unset -f deactivate
fi
}
# unset irrelevant variables
deactivate nondestructive
VIRTUAL_ENV="C:\my_path\my_venv"
export VIRTUAL_ENV
_OLD_VIRTUAL_PATH="$PATH"
PATH="$VIRTUAL_ENV/Scripts:$PATH"
export PATH
# unset PYTHONHOME if set
# this will fail if PYTHONHOME is set to the empty string (which is bad anyway)
# could use `if (set -u; : $PYTHONHOME) ;` in bash
if [ -n "${PYTHONHOME:-}" ] ; then
_OLD_VIRTUAL_PYTHONHOME="${PYTHONHOME:-}"
unset PYTHONHOME
fi
if [ -z "${VIRTUAL_ENV_DISABLE_PROMPT:-}" ] ; then
_OLD_VIRTUAL_PS1="${PS1:-}"
PS1="(.venv_ml_dl_gen_purpose) ${PS1:-}"
export PS1
fi
# This should detect bash and zsh, which have a hash command that must
# be called to get it to forget past commands. Without forgetting
# past commands the $PATH changes we made may not be respected
if [ -n "${BASH:-}" -o -n "${ZSH_VERSION:-}" ] ; then
hash -r 2> /dev/null
fi
Finally, here also the script /mnt/c/my_path/my_venv_unix/bin/activate
# This file must be used with "source bin/activate" *from bash*
# you cannot run it directly
deactivate () {
# reset old environment variables
if [ -n "${_OLD_VIRTUAL_PATH:-}" ] ; then
PATH="${_OLD_VIRTUAL_PATH:-}"
export PATH
unset _OLD_VIRTUAL_PATH
fi
if [ -n "${_OLD_VIRTUAL_PYTHONHOME:-}" ] ; then
PYTHONHOME="${_OLD_VIRTUAL_PYTHONHOME:-}"
export PYTHONHOME
unset _OLD_VIRTUAL_PYTHONHOME
fi
# This should detect bash and zsh, which have a hash command that must
# be called to get it to forget past commands. Without forgetting
# past commands the $PATH changes we made may not be respected
if [ -n "${BASH:-}" -o -n "${ZSH_VERSION:-}" ] ; then
hash -r
fi
if [ -n "${_OLD_VIRTUAL_PS1:-}" ] ; then
PS1="${_OLD_VIRTUAL_PS1:-}"
export PS1
unset _OLD_VIRTUAL_PS1
fi
unset VIRTUAL_ENV
if [ ! "${1:-}" = "nondestructive" ] ; then
# Self destruct!
unset -f deactivate
fi
}
# unset irrelevant variables
deactivate nondestructive
VIRTUAL_ENV="/mnt/c/my_path/my_venv_unix"
export VIRTUAL_ENV
_OLD_VIRTUAL_PATH="$PATH"
PATH="$VIRTUAL_ENV/bin:$PATH"
export PATH
# unset PYTHONHOME if set
# this will fail if PYTHONHOME is set to the empty string (which is bad anyway)
# could use `if (set -u; : $PYTHONHOME) ;` in bash
if [ -n "${PYTHONHOME:-}" ] ; then
_OLD_VIRTUAL_PYTHONHOME="${PYTHONHOME:-}"
unset PYTHONHOME
fi
if [ -z "${VIRTUAL_ENV_DISABLE_PROMPT:-}" ] ; then
_OLD_VIRTUAL_PS1="${PS1:-}"
if [ "x(venv_unix) " != x ] ; then
PS1="(venv_unix) ${PS1:-}"
else
if [ "`basename \"$VIRTUAL_ENV\"`" = "__" ] ; then
# special case for Aspen magic directories
# see https://aspen.io/
PS1="[`basename \`dirname \"$VIRTUAL_ENV\"\``] $PS1"
else
PS1="(`basename \"$VIRTUAL_ENV\"`)$PS1"
fi
fi
export PS1
fi
# This should detect bash and zsh, which have a hash command that must
# be called to get it to forget past commands. Without forgetting
# past commands the $PATH changes we made may not be respected
if [ -n "${BASH:-}" -o -n "${ZSH_VERSION:-}" ] ; then
hash -r
fi
Thanks to anyone who wants to answer!
Short answer: It's highly recommended to use the Linux version of Python and tools when in WSL. You'll find a number of posts here on Stack Overflow related to this, but your question is different enough (regarding venv) that it deserves its own answer.
More Detail:
Also worth reading this question. In that case, the question was around a dual-boot system and whether or not the same venv could be shared between Windows and Linux.
I know it seems like things might be better on WSL, where you can run Windows .exe executables under Linux, but it really isn't for this particular case.
You've solved the first problem, in the difference in line endings, but the next problem that you are facing is the difference in the directory format. After sourcing activate, do an echo $PATH and you'll see that the Windows style C:\path\to\the\venv path has been prepended to your PATH. For WSL, that would need to be /mnt/c/path/to/the/venv.
That's not going to work.
Once you fix that (again, by editing activate), you are still trying to run python3. The venv executable is actually python.exe. Under WSL, you do have to specify the extension.
So if you:
Change the line-endings from CRLF to LF
Change the path style in activate from Windows to WSL2 format
Use the python.exe executable
Then you can at least launch the Windows Python version. Your import sys; sys.path will show the Windows paths.
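For illustration only, a sketch of what those edits amount to, reusing the paths from the question:
# in /mnt/c/my_path/my_venv/Scripts/activate, after converting CRLF to LF:
VIRTUAL_ENV="/mnt/c/my_path/my_venv"   # was: C:\my_path\my_venv
export VIRTUAL_ENV
_OLD_VIRTUAL_PATH="$PATH"
PATH="$VIRTUAL_ENV/Scripts:$PATH"      # a Windows-created venv uses Scripts/, not bin/
export PATH
# then call the Windows interpreter with its extension:
python.exe -c "import sys; print(sys.path)"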
That said, you are almost certainly going to run into additional problems that don't make it worth doing this. For instance, if a script assumes python or python3, or even pip, then those calls are going to fail because it needs to call, e.g., pip.exe.
Line endings and native code will also be a problem.
For these reasons (and likely more), it's highly recommended to use the Linux version of Python when in WSL.
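As a rough sketch of that recommendation (paths are made up), create and use a native Linux venv from inside WSL:
# inside WSL (Ubuntu): keep the venv on the Linux side rather than under /mnt/c
python3 -m venv ~/venvs/my_venv
source ~/venvs/my_venv/bin/activate
python -c 'import sys; print(sys.path)'   # now shows Linux-side paths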
By default, when I type python in the terminal it goes to Python version 2.7.x. In older versions of macOS, once I typed alias python=python3 it changed permanently, and every time I typed python it went to Python version 3.
But in macOS Catalina, I need to type the statement every time I open the terminal.
Any suggestion?
First up, confirm what your default "python" links to, so you can reference it, and ensure that which and the shell agree:
user:~> which python
/usr/bin/python
user:~> type python
python is /usr/bin/python
user:~> ls -la /usr/bin/python
lrwxrwxrwx 1 root root 7 Oct 8 13:26 /usr/bin/python -> python2
Now you can either add an alias to override this in your shell...
Open your ~/.bash_profile file for bash or your ~/.zshrc file for zsh (look here for the historical reasons behind the files used), as suggested by shahaf, and add a line with the alias. For example, a quick method:
echo "alias python=/usr/bin/python3" >> ~/.bash_profile
echo "alias python=/usr/bin/python3" >> ~/.zshrc
The new alias will be set for the next shell you start. Alternatively, open a new Terminal window and source the profile file to make it active, e.g. in bash:
source ~/.bash_profile
Or, change the symlink to point to python3 by default, and remember the change (I use a simple toggle script, since installing a package that expects the missing python2 can otherwise result in complaints from its configure script, which uses the python symlink directly):
#!/bin/bash
TOGGLE=$HOME/.python3Active
if [ ! -e $TOGGLE ]; then
touch $TOGGLE
sudo ln -fs python3 /usr/bin/python
ls -la /usr/bin/python
echo "Press any key to continue..."
read
else
rm $TOGGLE
sudo ln -fs python2 /usr/bin/python
ls -la /usr/bin/python
echo "Press any key to continue..."
read
fi
Catalina now uses zsh as the default rather than Bash.
To verify which shell you are using type echo $0 in the terminal
Add alias python='python3' to $HOME/.zshrc
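For example (a minimal sketch; adjust the path if your python3 lives elsewhere):
echo "alias python='python3'" >> ~/.zshrc
source ~/.zshrc     # or open a new terminal window
python --version    # should now report Python 3.x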
You will have to edit the terminal's profile file, which usually resides at ~/.profile.
Add the alias line there; this file gets loaded when a terminal session is started and exports environment variables and functions so they are accessible in that session.
I suggest using a more robust and powerful shell enhancement like Z shell (zsh).
I'm using ZSH/oh-my-zsh and the regular OSX terminal, though the same problem occurs in iTerm. I'd been using rbenv and nvm without issue, but recently started working with python and pyenv, and have run into the following issue. On loading a new terminal window, I get the following message at the prompt:
Last login: Sat Apr 1 11:56:46 on ttys001
/Users/jackfuller/.zshenv:3: command not found: pyenv
Since installing pyenv, my machine seems noticeably slower. Obviously loading pyenv will slow things down, but terminal start-up times have gotten much worse.
The catch is that pyenv works perfectly after the terminal is loaded, and as far as I can tell my .zshrc is configured properly:
alias dev="cd ~/development"
alias gow="cd ~/goworkspace"
alias dl="cd ~/downloads"
export PATH=/usr/local/bin:$HOME/bin:$PATH
export EDITOR='atom -n'
export PAGER='less -f'
export PATH=$HOME/.rbenv/shims:$PATH
# RBENV
if which rbenv > /dev/null; then eval "$(rbenv init -)"; fi
# NVM
export NVM_DIR="/Users/jackfuller/.nvm"
[ -s "$NVM_DIR/nvm.sh" ] && . "$NVM_DIR/nvm.sh" # This loads nvm
# PYENV
export PYENV_ROOT=/usr/local/opt/pyenv
eval "$(pyenv init - --no-rehash)"
# if which pyenv > /dev/null; then eval "$(pyenv init - --no-rehash)";
fi
# For go.
export GOPATH="$HOME/goworkspace"
export PATH=$PATH:/usr/local/go/bin
export GOROOT="usr/local/go"
If anyone can offer any advice/solutions, it would be much appreciated. Environment variables and shell config files seem more temperamental /confusing than they should be.
/Users/jackfuller/.zshenv:3: command not found: pyenv indicates that the error is in .zshenv at line 3; maybe you could post your .zshenv. Is there a reason the fi after # PYENV is not commented out? Also, you could do export PATH=... once instead of three times. Try which pyenv to find the path to pyenv and check whether it's in your $PATH. For future use I'd put my aliases into ~/.zsh_aliases and do a source ~/.zsh_aliases in .zshrc; otherwise it gets pretty ugly once you have some more aliases. Sorry for the badly structured answer ;)
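For what it's worth, a sketch of what the tidied PYENV block could look like, essentially un-commenting the guarded form and dropping the stray fi:
# PYENV
export PYENV_ROOT=/usr/local/opt/pyenv
if which pyenv > /dev/null; then
  eval "$(pyenv init - --no-rehash)"
fi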
How do you define a custom prompt to use when activating a Python virtual environment?
I have a bash script for activating a virtualenv I use when calling specific Fabric commands. I want the shell prompt to say something like "(fab)" so I can easily distinguish it from other shells I have open. Following this example, I've tried:
#!/bin/bash
script_dir=`dirname $0`
cd $script_dir
/bin/bash -c ". .env/bin/activate; PS1='(fab) '; exec /bin/bash -i"
but there's no change to the prompt. What am I doing wrong?
The prompt is set in the virtualenv's activate script (located in the bin folder under the virtualenv). If you only want to change the prompt some of the time, you could set an environment variable before calling activate (make sure to clear it in the corresponding deactivate file). If you simply want the prompt to be different all the time, you can do that right in activate at the line that looks like
set "PROMPT=(virtualenvname) %PROMPT%"
If you're using virtualenvwrapper, you could do all of this in the postactivate and postdeactivate scripts as well.
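For instance, a minimal sketch using virtualenvwrapper's global hook files (the "(fab) " prompt string is just the example from the question):
# $WORKON_HOME/postactivate  (sourced after every workon)
_OLD_FAB_PS1="$PS1"
PS1="(fab) "

# $WORKON_HOME/postdeactivate  (sourced after every deactivate)
PS1="$_OLD_FAB_PS1"
unset _OLD_FAB_PS1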
I couldn't find any way to do this via a script executed as a child process. Calling a separate bash process seems to forget any previously set PS1. However, it turned out to be trivial if I just sourced the script:
#!/bin/bash
script_dir=`dirname $0`
cd $script_dir
. .env/bin/activate
PS1="(fab) "
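One detail to watch if the script is sourced rather than executed: $0 then refers to the calling shell, not the script, so dirname $0 may point at the wrong directory. A variant using BASH_SOURCE (the filename is hypothetical):
#!/bin/bash
# intended to be sourced:  . ./fab_env.sh
script_dir=$(dirname "${BASH_SOURCE[0]}")
cd "$script_dir"
. .env/bin/activate
PS1="(fab) "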
It appears the
exec /bin/bash -i
is resetting the PS1 variable. When I run
export PS1="foo "; bash
it resets it too. Curiously, when I look into the bash sources (shell.c and variables.c) it appears to use
set_if_not ("PS1", primary_prompt);
to init it. But I'm not exactly sure what happens between this and main(). Giving up.
I tried it on Cygwin and on Linux (Red Hat / CentOS) as well, and found a solution for both.
CYGWIN
After some investigation I found that the problem is that PS1 is set by /etc/bash.bashrc, which overrides the PS1 environment variable. So you need to stop that file from being run, using:
/bin/bash -c ". .env/bin/activate; PS1='(fab) ' exec /bin/bash -i --norc"
or
/bin/bash -c ". .env/bin/activate; export PS1='(fab) '; exec /bin/bash -i --norc"
LINUX
Here it works more simply:
/bin/bash -c ". .env/bin/activate; PS1='(fab) ' exec /bin/bash -i"
or
/bin/bash -c ". .env/bin/activate; export PS1='(fab) '; exec /bin/bash -i"
If the script you are calling does not export the variables (and I suppose it does not) and the variables it sets do not appear in the environment, then you could try something like this:
/bin/bash -c "PS1='(fab) ' exec /bin/bash --rcfile .env/bin/activate; "
I hope I could help!
I'm trying to run a command that I've installed in my home directory on a remote server. It's already been added to my $PATH in .bash_profile. I'm able to use it when logged in remotely via a normal ssh session, but Fabric doesn't seem to be pulling in my $PATH. Thus, I've tried adding it to my $PATH using Fabric's path context manager like so:
def test_path():
print('My env.path setting: %(path)s' % env)
with path('/path/to/sources/drush'):
run('echo $PATH')
run('drush')
Fabric responds with:
Executing task 'test_path'
My env.path setting:
run: echo $PATH
out: /usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games
out:
run: echo $PATH
out: /usr/local/bin:/usr/bin:/bin:/usr/local/games:/usr/games:/path/to/sources/drush
out:
run: drush
out: /bin/bash: drush: command not found
out:
Fatal error: run() received nonzero return code 127 while executing!
Requested: drush
Executed: /bin/bash -l -c "export PATH=\"\$PATH:\"/path/to/sources/drush\" \" && drush"
Aborting.
Thanks for looking...
The problem is in the way the PATH variable gets set - there is an additional space character at the end of it:
/bin/bash -l -c "export PATH=\"\$PATH:\"/path/to/sources/drush\" \" && drush"
^HERE
The last directory in the search path is interpreted by bash as "/path/to/sources/drush " (trailing space) - an invalid directory.
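A quick throwaway demo of the effect (made-up paths):
mkdir -p /tmp/demo-bin
printf '#!/bin/sh\necho it works\n' > /tmp/demo-bin/demo
chmod +x /tmp/demo-bin/demo
( export PATH="$PATH:/tmp/demo-bin "; demo )   # trailing space: command not found
( export PATH="$PATH:/tmp/demo-bin"; demo )    # works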