Unable to launch a python script through my batch file - python

I have a batch file that's supposed to set the PATH/PYTHONPATH variables and then invoke my Python script (myscript.py), which is designed as an interactive console. I tried the following:
@echo off
setlocal
if not defined PYTHONHOME (echo warning: PYTHONHOME environment variable is not defined. Using C:\Python24 by default.
SET PYTHONHOME=C:\Python24
if not exist "C:\Python24" ( echo warning: C:\Python24 does not exist. Please specify PYTHONHOME variable manually.))
color 1e
set PYTHONSTARTUP=%~dp0%myscript.py
set PYTHONPATH=%~dp0;%PYTHONPATH%
path %PYTHONHOME%;%PATH%
set PATHEXT=%PATHEXT%;.PY
cd %~dp0%
cmd.exe /k title Interactive Python Console 1.0
cls
%~dp0%myscript.py"
:done
endlocal
Before setting the color pair (1e) for the console, I append the directory containing myscript.py to PATH and PYTHONPATH, and C:\Python24 is set as PYTHONHOME.
My problem is:
I am able to change the default font/background color of the console and set the current window's title, but then neither cls (clear screen) works nor is my script invoked. At the console, my working directory is the script's directory, but when I enter 'python' at the prompt, myscript.py is invoked and I can see my script's interactive console.
Is anything missing from the batch file that would automatically clear the console after setting the color/title and then invoke myscript.py?

This doesn't really have anything to do with Python. cmd /k doesn't "set the window title", it starts a new command shell and leaves you in it, thus stopping your script midway through. Why don't you just do title My New Title? There's no need to use cmd.

Related

Add environment variables in python [duplicate]

I was hoping to write a python script to create some appropriate environmental variables by running the script in whatever directory I'll be executing some simulation code, and I've read that I can't write a script to make these env vars persist in the mac os terminal. So two things:
Is this true?
and
It seems like it would be a useful thing to do; why isn't it possible in general?
You can't do it from python, but some clever bash tricks can do something similar. The basic reasoning is this: environment variables exist in a per-process memory space. When a new process is created with fork() it inherits its parent's environment variables. When you set an environment variable in your shell (e.g. bash) like this:
export VAR="foo"
What you're doing is telling bash to set the variable VAR in its process space to "foo". When you run a program, bash uses fork() and then exec() to run the program, so anything you run from bash inherits the bash environment variables.
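The same one-way inheritance is easy to demonstrate from Python itself; here is a minimal sketch (it assumes a Unix-like system where printenv is available):
import os
import subprocess

os.environ["VAR"] = "foo"             # affects this process and its future children
subprocess.call(["printenv", "VAR"])  # the child inherits it and prints: foo
# ...but once this Python process exits, the shell that launched it still
# has no VAR; the change lived only in the Python process's own copy.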
Now, suppose you want to create a bash command that sets some environment variable DATA with content from a file in your current directory called ".data". First, you need to have a command to get the data out of the file:
cat .data
That prints the data. Now, we want to create a bash command to set that data in an environment variable:
export DATA=`cat .data`
That command takes the contents of .data and puts it in the environment variable DATA. Now, if you put that inside an alias command, you have a bash command that sets your environment variable:
alias set-data='export DATA=$(cat .data)'
(Single quotes matter here: they stop the command substitution from being evaluated once when the alias is defined instead of each time it is run.) You can put that alias command inside the .bashrc or .bash_profile files in your home directory to have that command available in any new bash shell you start.
One workaround is to output export commands, and have the parent shell evaluate this..
thescript.py:
import pipes
import random
r = random.randint(1,100)
print("export BLAHBLAH=%s" % (pipes.quote(str(r))))
..and the bash alias (the same can be done in most shells.. even tcsh!):
alias setblahblahenv='eval $(python thescript.py)'
Usage:
$ echo $BLAHBLAH
$ setblahblahenv
$ echo $BLAHBLAH
72
You can output any arbitrary shell code, including multiple commands like:
export BLAHBLAH=23 SECONDENVVAR='something else' && echo 'everything worked'
Just remember to be careful about escaping any dynamically created output (the pipes.quote function is good for this).
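On Python 3, the pipes module is deprecated in favour of shlex.quote, which does the same job; a sketch of thescript.py under that assumption:
import random
import shlex

r = random.randint(1, 100)
# Quote the value so the shell that evals this output cannot misparse it.
print("export BLAHBLAH=%s" % shlex.quote(str(r)))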
If you set environment variables within a python script (or any other script or program), it won't affect the parent shell.
Edit clarification:
So the answer to your question is yes, it is true.
You can, however, export from within a shell script and source it using the dot invocation.
in fooexport.sh
export FOO="bar"
at the command prompt
$ . ./fooexport.sh
$ echo $FOO
bar
It's not generally possible. The new process created for python cannot affect its parent process's environment. Neither can the parent affect a child that is already running, but the parent does get to set up the child's environment as part of creating the new process.
Perhaps you can set them in .bashrc, .profile or the equivalent "runs on login" or "runs on every new terminal session" script in MacOS.
You can also have python start the simulation program with the desired environment. (use the env parameter to subprocess.Popen (http://docs.python.org/library/subprocess.html) )
import subprocess, os
os.chdir('/home/you/desired/directory')
subprocess.Popen(['desired_program_cmd', 'args', ...], env=dict(SOMEVAR='a_value') )
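Keep in mind that env replaces the child's entire environment; if you only want to add a variable on top of what you already have, a common pattern is to copy os.environ first (a sketch, with desired_program_cmd kept as the placeholder name from above):
import os
import subprocess

env = dict(os.environ)        # start from the current environment
env['SOMEVAR'] = 'a_value'    # add or override just what you need
subprocess.Popen(['desired_program_cmd'], env=env)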
Or you could have python write out a shell script like this to a file with a .sh extension:
export SOMEVAR=a_value
cd /home/you/desired/directory
./desired_program_cmd
and then chmod +x it and run it from anywhere.
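A rough sketch of generating that script from Python (the run_sim.sh filename is just an example, not anything from the question):
import os
import stat

script = """#!/bin/sh
export SOMEVAR=a_value
cd /home/you/desired/directory
./desired_program_cmd
"""
with open("run_sim.sh", "w") as f:
    f.write(script)
# Equivalent of chmod +x: add the owner-execute bit.
os.chmod("run_sim.sh", os.stat("run_sim.sh").st_mode | stat.S_IEXEC)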
What I like to do is use /usr/bin/env in a shell script to "wrap" my command line when I find myself in similar situations:
#!/bin/bash
/usr/bin/env NAME1="VALUE1" NAME2="VALUE2" ${*}
So let's call this script "myappenv". I put it in my $HOME/bin directory which I have in my $PATH.
Now I can invoke any command using that environment by simply prepending "myappenv" as such:
myappenv dosometask -xyz
Other posted solutions work too, but this is my personal preference. One advantage is that the environment is transient, so if I'm working in the shell only the command I invoke is affected by the altered environment.
Modified version based on new comments
#!/bin/bash
/usr/bin/env G4WORKDIR=$PWD ${*}
You could wrap this all up in an alias too. I prefer the wrapper script approach since I tend to have other environment prep in there too, which makes it easier for me to maintain.
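If you prefer to keep the wrapper in Python, a rough equivalent of myappenv would be (NAME1/NAME2 are the same placeholder names as above):
#!/usr/bin/env python
import os
import sys

# Set up the environment, then replace this process with the requested
# command so the child runs with the modified variables.
os.environ.update(NAME1="VALUE1", NAME2="VALUE2")
os.execvp(sys.argv[1], sys.argv[1:])
Usage is the same idea: python myappenv.py dosometask -xyz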
Building on Benson's answer, the best hack-around is to create a simple bash function to preserve arguments:
upsert-env-var (){ eval $(python upsert_env_var.py $*); }
You can do whatever you want in your python script with the arguments. To simply add a variable, use something like this in upsert_env_var.py:
import os
import sys

var = sys.argv[1]
val = sys.argv[2]
if os.environ.get(var, None):
    print "export %s=%s:%s" % (var, val, os.environ[var])
else:
    print "export %s=%s" % (var, val)
Usage:
upsert-env-var VAR VAL
As others have pointed out, the reason this doesn't work is that environment variables live in a per-process memory space and thus die when the Python process exits.
They point out that a solution to this is to define an alias in .bashrc to do what you want such as this:
alias export_my_program='export MY_VAR=$(my_program)'
However, there's another (a tad hacky) method which does not require you to modify .bashrc, nor does it require you to have my_program in $PATH (or to specify the full path to it in the alias). The idea is to run the program in Python if it is invoked normally (./my_program), but in Bash if it is sourced (source my_program). (Using source on a script does not spawn a new process and thus does not kill environment variables created within it.) You can do that as follows:
my_program.py:
#!/usr/bin/env python3
_UNUSED_VAR=0
_UNUSED_VAR=0 \
<< _UNUSED_VAR
#=======================
# Bash code starts here
#=======================
'''
_UNUSED_VAR
export MY_VAR=`$(dirname $0)/my_program.py`
echo $MY_VAR
return
'''
#=========================
# Python code starts here
#=========================
print('Hello environment!')
Running this in Python (./my_program.py), the first 3 lines will not do anything useful and the triple-quotes will comment out the Bash code, allowing Python to run normally without any syntax errors from Bash.
Sourcing this in bash (source my_program.py), the heredoc (<< _UNUSED_VAR) is a hack used to "comment out" the first triple-quote, which would otherwise be a syntax error. The script returns before reaching the second triple-quote, avoiding another syntax error. The export assigns the result of running my_program.py in Python from the correct directory (given by $(dirname $0)) to the environment variable MY_VAR. echo $MY_VAR prints the result on the command line.
Example usage:
$ source my_program.py
Hello environment!
$ echo $MY_VAR
Hello environment!
However, if run normally, the script will still do everything it did before except export the environment variable:
$ ./my_program.py
Hello environment!
$ echo $MY_VAR
<-- Empty line
As noted by other authors, the memory is thrown away when the Python process exits. But during the python process, you can edit the running environment. For example:
>>> import os
>>> os.environ["foo"] = "bar"
>>> import subprocess
>>> subprocess.call(["printenv", "foo"])
bar
0
>>> os.environ["foo"] = "foo"
>>> subprocess.call(["printenv", "foo"])
foo
0

How do I append to the Maya PYTHONPATH?

I am working through Practical Maya Programming and trying to set a 'development root' on my PC. I have followed the instructions (below) exactly, but it is not working. At the point where I type 'mayapy.exe' I get the error "'mayapy.exe' is not recognized as an internal or external command, operable program or batch file."
From the book:
Let's decide where we will do our coding. We'll call this location the development root for the rest of the book. To be concise, I'll choose C:\mayapybook\pylib to house all of our Python code.
Create the development root folder, and inside of it create an empty file named minspect.py.
Now, we need to get C:\mayapybook\pylib onto Python's sys.path so it can be imported. The easiest way to do this is to use the PYTHONPATH environment variable. From a Windows command line you can run the following to add the path, and ensure it worked:
> set PYTHONPATH=%PYTHONPATH%;C:\mayapybook\pylib
> mayapy.exe
>>> import sys
>>> 'C:\\mayapybook\\pylib' in sys.path
True
>>> import minspect
>>> minspect
<module 'minspect' from '...\minspect.py'>
EDIT
This is how it is working for me at the moment:
PS C:\Users\Me> set PYTHONPATH=%PYTHONPATH%;C:\mayapybook\pylib
C:\mayapybook\pylib : The term 'C:\mayapybook\pylib' is not recognized as the name of a cmdlet, function, script file,
or operable program. Check the spelling of the name, or if a path was included, verify that the path is correct and
try again.
At line:1 char:29
+ set PYTHONPATH=%PYTHONPATH%;C:\mayapybook\pylib
+ ~~~~~~~~~~~~~~~~~~~
+ CategoryInfo : ObjectNotFound: (C:\mayapybook\pylib:String) [], CommandNotFoundException
+ FullyQualifiedErrorId : CommandNotFoundException
So the code from the book is not working, but the code from the post by DrHaze seems to work:
PS C:\Users\Me> setx PATH "%PATH%C:\mayapybook\pylib\"
SUCCESS: Specified value was saved.
But when I run the Maya Python Interpreter and check if C:\mayapybook\pylib\ is in sys path it returns false:
>>> 'C:\\mayapybook\\pylib' in sys.path
False
This error "'mayapy.exe' is not recognized as an internal or external command, operable program or batch file." means that the path where mayapy.exe is located is not included in the PATH environment variable. Your system tries to look in all the folders included in the PATH variable but can't find an executable called mayapy.exe.
The executable mayapy.exe is generally located here:
C:\Program Files\Autodesk\Maya(VERSION)\bin\mayapy.exe
on my computer it's located here: C:\Program Files\Autodesk\Maya2014\bin\mayapy.exe
To add the mayapy.exe location to your path, use one of the following commands:
setx PATH "%PATH%;C:\Program Files\Autodesk\Maya2014\bin\" if you want to change it permanently
set PATH "%PATH%;C:\Program Files\Autodesk\Maya2014\bin\" only works for the current instance of the cmd session.
EDIT
The error you show in your edit is the source of the problem: the shell failed to set the environment variable PYTHONPATH. sys.path is populated in part from PYTHONPATH, so when you evaluate 'C:\\mayapybook\\pylib' in sys.path it returns False; the variable was never set before the interpreter started.
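You can confirm this from inside mayapy (or any Python); this is just a quick check, using the book's example path:
import os, sys
print(os.environ.get('PYTHONPATH'))          # what the shell actually passed in, if anything
print(r'C:\mayapybook\pylib' in sys.path)    # True only if PYTHONPATH was set before launch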
Now, why did it fail to set this environment variable?
First, I can see that you are using Windows PowerShell; keep this in mind.
The command I gave you is:
set PATH "%PATH%;C:\Program Files\Autodesk\Maya2014\bin\"
You wrote:
set PYTHONPATH=%PYTHONPATH%;C:\mayapybook\pylib
What it should be:
set PYTHONPATH "%PYTHONPATH%;C:\mayapybook\pylib\"
The syntax is a bit different and this last command should work.
As an explanation, your book gives you commands to type in the old-style Windows terminal, cmd.exe.
As you are using Windows Powershell, some commands might have a different syntax.
Now what you can do is:
Use cmd.exe (right-click on the title bar -> Properties to customize it)
Use Powershell but keep in mind that the syntax might be a bit different than in your book
If you are using PowerShell, there are different commands and strategies for managing environment variables.
You can set a variable permanently with SetEnvironmentVariable.
You can set a variable for the current shell session with $env:VARNAME = VARVALUE.
You can put the commands to set variables in a PowerShell profile file.
I would go with the third option. All three are detailed below:
Option 1. To append the directory "C:\mayapybook\pylib\" to the existing PYTHONPATH permanently for your account:
[Environment]::SetEnvironmentVariable("PYTHONPATH", $env:PYTHONPATH +";C:\mayapybook\pylib\", "User")
Option 2. To append the Maya bin folder to your PATH for only the current shell session:
$env:PATH += ";C:\Program Files\Autodesk\Maya2016\bin\"
Option 3. Create a powershell profile and set your env vars there.
First you'll need to make sure powershell scripts can run locally:
Hit the windows button, start typing powershell, right click and open as administrator. Enter:
Get-ExecutionPolicy
If it says Restricted or AllSigned, set it to RemoteSigned like so:
Set-ExecutionPolicy RemoteSigned
Close that shell. Now in another powershell (not admin) type:
cd ~\Documents
md WindowsPowerShell
cd WindowsPowerShell
New-Item -path "profile.ps1" -type file
notepad.exe profile.ps1
Paste into the file any commands you want to run whenever a new powershell is opened:
Write-Host "Hello From Your Profile"
$env:PATH += ";C:\Program Files\Autodesk\Maya2016\bin\"
$env:PYTHONPATH += ";C:\mayapybook\pylib\"
Now whenever you open a powershell, you'll get a silly message and those paths will be set. You can test by typing:
Write-Host $env:PATH
or to list all env vars:
Get-ChildItem Env:
You should now be able to run commands from the maya bin directory. For example, type: maya to start maya.
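If the profile took effect, re-running the book's check in a fresh PowerShell window should now succeed; roughly:
> mayapy.exe
>>> import sys
>>> 'C:\\mayapybook\\pylib' in sys.path
True
>>> import minspect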
PowerShell has a number of other useful commands for working with environment variables as well.

Batch file runs with "not recognized...command", how to fix this?

I have a batch file which when executed sets PATHs, prompts user for input and loads a script via Python. The python script creates a grid with the size of each cell determined by the user input variable (cellsize). The following is from my .bat file:
@echo off
rem Root OSGEO4W home dir to the following directory
call "C:\OSGeo4W64\bin\o4w_env.bat"
rem List available o4w programs
rem but only if osgeo4w called without parameters
@echo on
set PYTHONPATH=C:\OSGeo4W64\apps\qgis\python
set PATH=C:\OSGeo4W64\apps\qgis\bin;%PATH%
@echo off
echo.
set /p cellsize="Enter cellsize: "
cellsize=1
cmd /k python "Script.py" %cellsize%
@echo on
The .bat works the way it's supposed to, I obtain the correct results, but I receive the following error:
'cellsize' is not recognized as an internal or external command, operable program or batch file
What simple mistake(s) did I make? I am a beginner but still learning.
The line must read:
set cellsize=1
but this line surely seems more useful before the set /p line, as an initialization, since otherwise it cancels out the value the user just entered.
@echo off
echo.
set /p cellsize="Enter cellsize: "
set cellsize=1
cmd /k python "Script.py" %cellsize%
@echo on
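On the Python side, %cellsize% simply arrives as a command-line argument; a minimal sketch of how Script.py might read it (this is an assumption, not the asker's actual script):
import sys

# The batch file passes %cellsize% as the first argument, as a string.
cellsize = float(sys.argv[1]) if len(sys.argv) > 1 else 1.0
print("Building grid with cell size %s" % cellsize)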
You need set in front of cellsize=1.

Launch configured cmd from Python

I'm on Windows. I am trying to write a Python 2.x script (let's call it setup.py) which would enable the following scenario:
User runs cmd to open a console window
In that console window, user runs setup.py
User finds themselves in the same console window, but now the cmd running there has had its environment (env. variables) modified by setup.py
setup.py modifies the environment by adding a new environment variable FOO with value foo, and by prepending something to PATH.
On Linux, I would simply use os.exec*e to replace the Python process with a shell with the environment configured.
I tried the same approach on Windows (like os.exec*e(os.environ['ComSpec'])), but it doesn't work, the environment of the newly executed cmd is messed up like this:
Running just set doesn't list FOO and doesn't show the effect on PATH. Running set FOO, however, shows FOO=foo, and echo %FOO% echoes foo.
Running set PATH or echo %PATH% shows the modified PATH variable. Running set path or echo %path% shows the value without the modification (even though env. vars are normally case insensitive on Windows).
If I type exit, the console remains hanging in a state where it does not accept input, until I hit Ctrl+C. After that, it apparently returns to the cmd which originally called setup.py.
So clearly, os.exec*e doesn't work for this scenario on Windows. Is there a different way to achieve what I want? Is there a combination of subprocess.Popen() flags which would enable me to exit the calling Python process and leave the called cmd running, ideally in the same console? Or would accessing CreateProcess through ctypes help?
If necessary, I would settle for launching a new console window and closing the old one, but I certainly can't afford having the old console window hang in frozen state, waiting for a newly created one to close.
There's a much simpler solution if it's acceptable to use a Windows batch file in addition to your script, since the batch file runs in the calling process, and can therefore modify its environment.
Given a file setup.bat, which looks like this...
@echo off
for /f "tokens=*" %%a in ('python setup.py') do %%a
...and a file setup.py which looks like this...
import os
print 'set FOO=foo'
print 'set PATH=%s;%s' % ('C:\\my_path_dir', os.environ['PATH'])
...and assuming python.exe is in the PATH, then calling setup.bat from the command line will set the environment variables in the calling process, while still allowing you to make the setup.py script as complicated as you like, as long as it prints the commands you want to execute to stdout.
Update based on comments
If your setup.py has multiple modes of operation, you could make the setup.bat a generic wrapper for it. Suppose instead setup.bat looks like this...
@echo off
if "%1" == "setenv" (
for /f "tokens=*" %%a in ('python setup.py %1') do %%a
) else (
python setup.py %*
)
...and setup.py looks like this...
import sys
import os

if len(sys.argv) > 1 and sys.argv[1] == 'setenv':
    print 'set FOO=foo'
    print 'set PATH=%s;%s' % ('C:\\my_path_dir', os.environ['PATH'])
else:
    print "I'm gonna do something else with argv=%r" % sys.argv
...would that not suffice?
