We get the error "OSError: No such file or directory" when trying to activate our newly created virtualenv venvCI using the builder steps below with ShellCommand. It seems we can't activate the virtualenv venvCI. We're new to this environment, so please help us. Thanks.
from buildbot.steps.shell import ShellCommand

factory = util.BuildFactory()
# STEPS for example-slave:
factory.addStep(ShellCommand(command=['virtualenv', 'venvCI']))
factory.addStep(ShellCommand(command=['source', 'venvCI/bin/activate']))
factory.addStep(ShellCommand(command=['pip', 'install', '-r', 'development.pip']))
factory.addStep(ShellCommand(command=['pyflakes', 'calculator.py']))
factory.addStep(ShellCommand(command=['python', 'test.py']))

c['builders'] = []
c['builders'].append(
    util.BuilderConfig(name="runtests",
                       slavenames=["example-slave"],
                       factory=factory))
Because the build system spawns a new shell for every ShellCommand, you can't source env/bin/activate: sourcing only modifies the environment of the shell it runs in, and when that ShellCommand's shell exits, the environment is gone.
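A quick way to see the same behaviour from plain Python (FOO is just a throwaway variable for illustration):

import os
import subprocess

# The export happens in a child shell that exits immediately...
subprocess.call('export FOO=bar', shell=True)
# ...so the parent process never sees it:
print(os.environ.get('FOO'))  # prints: None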
Things you can do:
1. Set the environment manually for every ShellCommand via env={...} (read what activate actually does and replicate it); see the sketch after the script example below.
2. Create a bash script that runs all your commands in a single shell (what I've done in other systems), e.g.
myscript.sh:
#!/bin/bash
source env/bin/activate
pip install x
python y.py
Buildbot:
factory.addStep(ShellCommand(command=['bash', 'myscript.sh']))
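For option 1, a minimal sketch of passing the environment explicitly might look like the following. The exact values are assumptions based on what activate does (it prepends the venv's bin directory to PATH and sets VIRTUAL_ENV); I believe Buildbot expands ${PATH} in env values from the worker's environment, but check your version's docs:

factory.addStep(ShellCommand(
    command=['pip', 'install', '-r', 'development.pip'],
    env={'VIRTUAL_ENV': 'venvCI',
         'PATH': 'venvCI/bin:${PATH}'}))  # prepend the venv bin dir, as activate would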
Another option is to call the Python executable inside your virtual environment directly, since many Python tools that provide command-line commands can also be run as modules:
from buildbot.steps.shell import ShellCommand

factory = util.BuildFactory()
# STEPS for example-slave:
factory.addStep(ShellCommand(command=['virtualenv', 'venvCI']))
factory.addStep(ShellCommand(
    command=['./venvCI/bin/python', '-m', 'pip', 'install', '-r', 'development.pip']))
factory.addStep(ShellCommand(
    command=['./venvCI/bin/python', '-m', 'pyflakes', 'calculator.py']))
factory.addStep(ShellCommand(command=['./venvCI/bin/python', 'test.py']))
However, this does get tiresome after a while. You can use string.Template to make helpers:
import shlex
from string import Template

def addstep(cmdline, **kwargs):
    # Substitute the keyword arguments into the command template,
    # then split it into an argv list for ShellCommand.
    tmpl = Template(cmdline)
    factory.addStep(ShellCommand(
        command=shlex.split(tmpl.safe_substitute(**kwargs))
    ))
Then you can do things like this:
addstep('$python -m pip install pytest', python='./venvCI/bin/python')
These are some ideas to get started. Note that the neat thing about shlex is that it will respect spaces inside quoted strings when doing the split.
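For example:

import shlex
shlex.split('$python -m pip install "some package"')
# -> ['$python', '-m', 'pip', 'install', 'some package']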
Related
I have 2 scripts: (1) a config script, which modifies about 20 environment variables and runs another small script inside, and (2) a process script, which uses the environment variables set by script 1 to do something.
I tried to execute them one by one via Python 2.7's subprocess32.Popen() and pass the same env mapping to both Popen calls. No success: the environment variables from script 1 are just empty for script 2.
import os
import subprocess32 as subprocess
my_env = os.environ
subprocess.Popen(script1, env=my_env)
subprocess.Popen(script2, env=my_env)
How I could actually share environment between 2 scripts?
When the first subprocess exits, the changes to its environment are gone; such changes are only propagated to its own child processes. You need to do something like this:
import shlex
# Source script1 and run script2 in the same shell so the environment
# carries over. Note: shlex.quote() is Python 3; on Python 2.7 use
# pipes.quote() instead.
subprocess.Popen(". {}; {}".format(shlex.quote(script1),
                                   shlex.quote(script2)),
                 shell=True)
But as always, using shell=True can introduce security issues.
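If you also need those variables back in the parent Python process, one approach (a sketch, assuming a POSIX shell with GNU env; script1 and script2 stand in for your actual paths) is to source script 1, dump the resulting environment, and parse it:

import subprocess
# Source script1, discard its own output, then print the resulting
# environment NUL-separated (GNU `env -0`) so values containing
# newlines parse safely.
out = subprocess.check_output(". {} >/dev/null 2>&1; env -0".format(script1),
                              shell=True)
env = dict(item.split("=", 1) for item in out.decode().split("\0") if item)
subprocess.Popen([script2], env=env)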
I think that the problem is in my_env=my_env; the second parameter needs to be env:
my_env = os.environ
subprocess.Popen(script1, env=my_env)
my_env = os.environ
subprocess.Popen(script2, env=my_env)
Not sure if this is possible. I have a set of Python scripts and have modified the Linux PATH in ~/.bashrc so that whenever I open a terminal, the Python scripts are available to run as commands.
export PATH=$PATH:/home/user/pythonlib/
my_command.py resides in the above path.
I can run my_command.py (args) from anywhere in a terminal and it will run the Python script.
I'd like to control this functionality from a different Python script, as this would be the quickest way to automate my processing routines. So I need to open a terminal and run my_command.py (args) from within the Python script I'm working on.
I have tried subprocess:
import subprocess
test = subprocess.Popen(["my_command.py"], stdout=subprocess.PIPE)
output = test.communicate()[0]
While my_command.py is typically available in any terminal I launch, here Popen has no access to it; it returns "file not found".
I can start a new terminal using os, then type in my_command.py, and it works:
os.system("x-terminal-emulator -e /bin/bash")
So, is there a way to get the second method to accept a script you want to run from python with args?
Ubuntu 16
Thanks :)
Popen does not load the PATH changes from your ~/.bashrc; it only inherits the environment of the Python process that created it. You have to modify the PATH in the environment you pass so it includes the directory of your project, like so:
import os
import shlex
import subprocess

someterminalcommand = "my_command.py (args)"
my_env = os.environ.copy()
my_env["PATH"] = "/home/usr/mypythonlib/:" + my_env["PATH"]
combine = subprocess.Popen(shlex.split(someterminalcommand), env=my_env)
combine.wait()
This allows me to run my "my_command.py" file from a different python session just like I had a terminal window open.
If you're using Gnome, the gnome-terminal command is rather useful in this situation.
As an example of very basic usage, the following code will spawn a terminal, and run a Python REPL in it:
import subprocess
subprocess.Popen(["gnome-terminal", "-e", "python"])
Now, if you want to run a specific script, you will need to concatenate its path with python, since the last element of that list is the line that will be executed in the new terminal.
For instance:
subprocess.Popen(["gnome-terminal", "-e", "python my_script.py"])
If your script is executable, you can omit python:
subprocess.Popen(["gnome-terminal", "-e", "my_script.py"])
If you want to pass parameters to your script, simply add them to the python command:
subprocess.Popen(["gnome-terminal", "-e", "python my_script.py var1 var2"])
Note that if you want to run your script with a particular version of Python, you should specify it, by explicitly calling "python2" or "python3".
A small example:
# my_script.py
import sys
print(sys.argv)
input()
# main.py
import subprocess
subprocess.Popen(["gnome-terminal", "-e", "python3 my_script.py hello world"])
Running python3 main.py will spawn a new terminal, with ['my_script.py', 'hello', 'world'] printed, waiting for input.
How can I run a Python script with my own command-line name, like myscript, without having to do python myscript.py in the terminal?
Add a shebang line to the top of the script:
#!/usr/bin/env python
Mark the script as executable:
chmod +x myscript.py
Add the dir containing it to your PATH variable. (If you want it to stick, you'll have to do this in .bashrc or .bash_profile in your home dir.)
export PATH=/path/to/script:$PATH
The best way, which is cross-platform, is to create setup.py, define an entry point in it and install with pip.
Say you have the following contents of myscript.py:
def run():
    print('Hello world')
Then you add setup.py with the following:
from setuptools import setup

setup(
    name='myscript',
    version='0.0.1',
    entry_points={
        'console_scripts': [
            'myscript=myscript:run'
        ]
    }
)
The entry point format is terminal_command_name=python_script_name:main_method_name.
Finally install with the following command.
pip install -e /path/to/script/folder
-e stands for editable, meaning you'll be able to work on the script and invoke the latest version without needing to reinstall.
After that you can run myscript from any directory.
I usually do this in the script:
#!/usr/bin/python
... code ...
And in terminal:
$: chmod 755 yourfile.py
$: ./yourfile.py
Another related solution that some people may be interested in: one can also directly embed the contents of myscript.py into your .bashrc file on Linux (it should also work for macOS, I think).
For example, I have the following function defined in my .bashrc for dumping Python pickles to the terminal; note that ${1} is the first argument following the function name:
depickle() {
python << EOPYTHON
import pickle
f = open('${1}', 'rb')
while True:
    try:
        print(pickle.load(f))
    except EOFError:
        break
EOPYTHON
}
With this in place (and after reloading .bashrc), I can now run depickle a.pickle from any terminal or directory on my computer.
The simplest way that comes to my mind is to use "pyinstaller":
1. Create an environment that contains all the libraries you have used in your code.
2. Activate the environment and, in the command window, write pip install pyinstaller.
3. Use the command window to open the main directory where maincode.py is located.
4. Keep the environment active and write pyinstaller maincode.py.
5. Check the folder named "dist" and you will find the executable file there.
I hope that this solution helps you.
GL
I struggled for a few days with the problem that the py -3 command (or any other py-launcher command) could not be found when the script was run by a service created with the NSSM tool, while the same commands worked when run directly from cmd.
What was the solution? Just re-running the Python installer and, at the very end, clicking the option to disable the path length limit.
I'll just leave it here, so that anyone can use this answer and find it helpful.
I want to activate a virtualenv instance from a Python script.
I know it's quite easy to do, but all the examples I've seen use it to run commands within the env and then close the subprocess.
I simply want to activate the virtualenv and return to the shell, the same way that bin/activate does.
Something like this:
$me: my-script.py -d env-name
$(env-name)me:
Is this possible?
Relevant:
virtualenv › Invoking an env from a script
If you want to run a Python subprocess under the virtualenv, you can do that by running the script using the Python interpreter that lives inside virtualenv's /bin/ directory:
import subprocess
# Path to a Python interpreter that runs any Python script
# under the virtualenv /path/to/virtualenv/
python_bin = "/path/to/virtualenv/bin/python"
# Path to the script that must run under the virtualenv
script_file = "must/run/under/virtualenv/script.py"
subprocess.Popen([python_bin, script_file])
However, if you want to activate the virtualenv under the current Python interpreter instead of a subprocess, you can use the activate_this.py script:
# Doing execfile() on this file will alter the current interpreter's
# environment so you can import libraries in the virtualenv
activate_this_file = "/path/to/virtualenv/bin/activate_this.py"
execfile(activate_this_file, dict(__file__=activate_this_file))
The simplest solution to run your script under the virtualenv's interpreter is to replace the default shebang line at the beginning of the script with the path to your virtualenv's interpreter, like so:
#!/path/to/project/venv/bin/python
Make the script executable:
chmod u+x script.py
Run the script:
./script.py
Voila!
It turns out that, yes, the problem is not simple, but the solution is.
First I had to create a shell script to wrap the source command. That said, I used "." instead, because I've read that it's better to use than source in shell scripts:
#!/bin/bash
. /path/to/env/bin/activate
Then from my Python script I can simply do this:
import os
os.system('/bin/bash --rcfile /path/to/myscript.sh')
The whole trick lies within the --rcfile argument.
When the Python interpreter exits it leaves the current shell in the activated environment.
Win!
To run another Python environment, according to the official virtualenv documentation, you can specify the full path to the executable Python binary on the command line, just that (no need to activate the virtualenv first):
/path/to/virtualenv/bin/python
The same applies if you want to invoke a script from the command line with your virtualenv. You don't need to activate it before:
me$ /path/to/virtualenv/bin/python myscript.py
The same for a Windows environment (whether it is from the command line or from a script):
> \path\to\env\Scripts\python.exe myscript.py
Just a simple solution that works for me. I don't know why you need the Bash script, which basically does a useless step (am I wrong?):
import os
os.system('/bin/bash --rcfile flask/bin/activate')
Which basically does what you need:
[hellsing@silence Foundation]$ python2.7 pythonvenv.py
(flask)[hellsing@silence Foundation]$
Then instead of deactivating the virtual environment, just Ctrl + D or exit. Is that a possible solution or isn't that what you wanted?
The top answer only works for Python 2.x
For Python 3.x, use this:
activate_this_file = "/path/to/virtualenv/bin/activate_this.py"
exec(compile(open(activate_this_file, "rb").read(), activate_this_file, 'exec'), dict(__file__=activate_this_file))
Reference: What is an alternative to execfile in Python 3?
The child process environment is lost the moment it ceases to exist, and moving the environment content from there to the parent is somewhat tricky.
You probably need to spawn a shell script (you can generate one dynamically in /tmp) that writes the virtualenv's environment variables to a file, which you then read in the parent Python process and put into os.environ.
Or you can simply parse the activate script, using for line in open("bin/activate"), manually extract things, and put them into os.environ. It's tricky, but not impossible.
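A rough sketch of that parsing idea (deliberately naive: it ignores quoting subtleties, $-expansion, and the deactivate machinery, so treat it as a starting point only):

import os
import re

# Match simple assignments like: export VAR=value  or  VAR="value"
assign = re.compile(r'^(?:export\s+)?([A-Za-z_][A-Za-z0-9_]*)=["\']?([^"\']*)["\']?$')
for line in open("bin/activate"):
    m = assign.match(line.strip())
    if m:
        name, value = m.groups()
        os.environ[name] = value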
For Python 2/3, we can activate a virtual env using the below code snippet:
activate_this = "/home/<--path-->/<--virtual env name -->/bin/activate_this.py"  # for Ubuntu
# activate_this = "D:\<-- path -->\<--virtual env name -->\Scripts\\activate_this.py"  # for Windows
with open(activate_this) as f:
    code = compile(f.read(), activate_this, 'exec')
    exec(code, dict(__file__=activate_this))
I had the same issue and there was no activate_this.py in the Scripts directory of my environment.
activate_this.py
"""By using execfile(this_file, dict(__file__=this_file)) you will
activate this virtualenv environment.
This can be used when you must use an existing Python interpreter, not
the virtualenv bin/python
"""
try:
__file__
except NameError:
raise AssertionError(
"You must run this like execfile('path/to/active_this.py', dict(__file__='path/to/activate_this.py'))")
import sys
import os
base = os.path.dirname(os.path.dirname(os.path.abspath(__file__)))
if(sys.platform=='win32'):
site_packages = os.path.join(base, 'Lib', 'site-packages')
else:
site_packages = os.path.join(base, 'lib', 'python%s' % sys.version[:3], 'site-packages')
prev_sys_path = list(sys.path)
import site
site.addsitedir(site_packages)
sys.real_prefix = sys.prefix
sys.prefix = base
# Move the added items to the front of the path:
new_sys_path = []
for item in list(sys.path):
if item not in prev_sys_path:
new_sys_path.append(item)
sys.path.remove(item)
sys.path[:0] = new_sys_path
Copy the file to the Scripts directory of your environment and use it like this:
import os

def activate_virtual_environment(environment_root):
    """Configures the virtual environment starting at ``environment_root``."""
    activate_script = os.path.join(
        environment_root, 'Scripts', 'activate_this.py')
    execfile(activate_script, {'__file__': activate_script})

activate_virtual_environment('path/to/your/venv')
Reference: https://github.com/dcreager/virtualenv/blob/master/virtualenv_support/activate_this.py
You should create all your virtualenvs in one folder, such as virt.
The script below assumes your virtualenv folder is named virt; if it isn't, change it.
cd
mkdir custom
Copy the below lines...
#!/usr/bin/env bash
ENV_PATH="$HOME/virt/$1/bin/activate"
bash --rcfile "$ENV_PATH" -i
Create a shell script file and paste the above lines...
touch custom/vhelper
nano custom/vhelper
Grant executable permission to your file:
sudo chmod +x custom/vhelper
Now export that custom folder path so that you can find the command on the command line with tab-completion...
export PATH=$PATH:"$HOME/custom"
Now you can use it from anywhere by just typing the below command...
vhelper YOUR_VIRTUAL_ENV_FOLDER_NAME
Suppose it is abc then...
vhelper abc
In the following code snippet (meant to work in an init.d environment) I would like to execute test.ClassPath. However, I'm having trouble setting and passing the CLASSPATH environment variable as defined in the user's .bashrc.
Here is the source of my frustration:
When the below script is run as a regular user, it prints out the CLASSPATH OK (from $HOME/.bashrc).
When I run it as root, it also displays CLASSPATH fine (I've set up /etc/bash.bashrc with CLASSPATH).
BUT when I do "sudo script.py" (to simulate what happens at init.d startup time), the CLASSPATH is missing!
The CLASSPATH is quite large, so I'd like to read it from a file, say $HOME/.classpath.
#!/usr/bin/python
import subprocess
import os.path as osp
import os

user = "USERNAME"
logDir = "/home/USERNAME/temp/"

print os.environ["HOME"]
if "CLASSPATH" in os.environ:
    print os.environ["CLASSPATH"]
else:
    print "Missing CLASSPATH"

procLog = open(osp.join(logDir, 'test.log'), 'w')
cmdStr = 'sudo -u %s -i java test.ClassPath' % (user, )  # run as the given user
proc = subprocess.Popen(cmdStr, shell=True, bufsize=0, stderr=procLog, stdout=procLog)
procLog.close()
sudo will not pass environment variables by default. From the man page:
By default, the env_reset option is enabled. This causes commands to be executed with a minimal environment containing TERM, PATH, HOME, MAIL, SHELL, LOGNAME, USER and USERNAME in addition to variables from the invoking process permitted by the env_check and env_keep options. This is effectively a whitelist for environment variables.
There are a few ways of addressing this.
1. You can edit /etc/sudoers to explicitly pass the CLASSPATH variable using the env_keep configuration directive. That might look something like:
Defaults env_keep += "CLASSPATH"
2. You can run your command using the env command, which lets you set the environment explicitly. A typical command line invocation might look like this:
sudo env CLASSPATH=/path1:/path2 java test.ClassPath
The obvious advantage to option (2) is that it doesn't require mucking about with the sudoers configuration.
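Option (2) also combines nicely with the question's wish to keep the large CLASSPATH in a file. A sketch (the $HOME/.classpath file name comes from the question; the rest mirrors the original script):

import os.path as osp
import subprocess

user = "USERNAME"
# Read the (large) classpath from the file suggested in the question.
with open(osp.expanduser("~/.classpath")) as f:
    classpath = f.read().strip()

subprocess.Popen(["sudo", "-u", user, "env", "CLASSPATH=" + classpath,
                  "java", "test.ClassPath"])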
You could put source ~/.bashrc before starting your Python script to get the environment variables set.