In Fabric, the cd context manager works like this:
with cd("dir"):
    run("command")
and the command is run after changing into the dir directory. This works fine, but the issue is that it uses global state. For example, suppose I have a helper function that needs to use cd:
def helper():
    with cd("foo"):
        run("some command")
If I call helper from another function like
def main_function():
    helper()
    ...
it works fine. But if I do something like
def main_function():
    with cd("bar"):
        helper()
it breaks, because the run("some command") from helper is now run from bar/foo instead of just foo.
Any tips on how to get around this? I tried using absolute paths in cd, but that didn't work. What I really want is for the cd context to only extend to the function scope.
So apparently absolute paths do work. The issue is that paths with ~ do not work (they are treated like relative paths, which IMHO is a bug in Fabric), and that was what I was trying. So you have to write out the full path, e.g. cd('/home/vagrant/foo') if you are using Vagrant.
You can probably get away with relative paths in nested context managers:
def func():
    with cd("/home/vagrant/foo"):
        stuff()
        with cd("bar"):  # cd /home/vagrant/foo/bar
            more_stuff()
because you know exactly what the current working directory is when you call cd('bar'). But for top-level cds, if the function can ever be called from within another function (not just directly from fab), you should use absolute paths.
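For instance, a minimal sketch of how the helper from the question could be made call-site-independent (the /home/vagrant base path is just an example; adjust it for your host):
from fabric.api import cd, run

BASE = "/home/vagrant"  # example base path, not from the original question

def helper():
    # An absolute path makes helper() immune to whatever cd() context
    # happens to be active at the call site.
    with cd(BASE + "/foo"):
        run("some command")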
I have a tool I wrote in Python that works completely fine when running in the Maya script editor. However, I want to be able to import the script from the scripts directory. This should be simple, and I am shocked that I can't find the solution while searching the web.
My script format is like this example:
import maya.cmds as cmds

# GUI code with buttons; they call the functions below.

def function1():
    # commands that do things

def function2():
    # commands that do things

# List of functions continues
Like I said, the program functions perfectly when run in the script editor. But when I save the script to the directory and use this method:
import module
reload(module)
module.function()
The GUI loads fine, but when I push the GUI buttons, it says the functions are not defined. I don't understand what I'm missing. If the script was loaded, shouldn't the functions be defined? Any help would be greatly appreciated, thank you!
Just because the GUI loads doesn't mean that all of your functions loaded properly. You need to put your file (module.py) in a directory that is visible on your PYTHONPATH. If you're in Maya, you can also put it on the MAYA_SCRIPT_PATH.
The PYTHONPATH / MAYA_SCRIPT_PATH are environment variables that you set before launching Maya. In a default Maya installation, some places where you could put your module.py file would be:
(Windows) C:\Users\YOUR_USER_NAME\Documents\maya\scripts
(Linux) ~/maya/scripts
(Mac) - Not sure, but probably also ~/maya/scripts
If you want to know where else you can place it, run this:
import os
print(os.getenv('MAYA_SCRIPT_PATH', ''))
print(os.getenv('PYTHONPATH', ''))
Any location in that list that you have write permission for is a fine place to add your module.py file.
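If you'd rather not set environment variables at all, here is a sketch of the same idea done at runtime from the script editor (the Windows path below is the default location mentioned above; adjust it for your setup):
import sys

scripts_dir = r"C:\Users\YOUR_USER_NAME\Documents\maya\scripts"
if scripts_dir not in sys.path:
    # sys.path is what Python actually searches, so appending here has the
    # same effect as PYTHONPATH, but only for the current Maya session
    sys.path.append(scripts_dir)

import module
reload(module)  # Python 2, as in the question
module.function1()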
Also, it's worth noting that in your example module.function() would fail. It would need to be module.function1() or module.function2(), but I assume you know that. Hope this helps.
Sup guys. I know this is an old question that has already been answered, but I have some extra information that helps with GUI functions not working (same error), and there's basically nothing on this anywhere.
Putting the script in the scripts directory only helps when loading the module through the shelf; you will still get the same error, "function is not defined". This has to do with how the function is called through the UI element.
For example, this will let you call the function in the script editor, but gives you the "function is not defined" error when the module is imported:
def GUI_function():
    pm.button(command="function()")

def function():
    # do stuff
This, on the other hand, works:
def GUI_function():
    pm.button(command=function)

def function(*_):
    # do stuff
I don't know exactly why, but Maya seems to evaluate the string "function()" later in its own namespace rather than your module's, so the name isn't found (passing the function object itself avoids that lookup entirely). So remove the brackets if you're not using arguments and you're good to go.
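If you do need to pass arguments while keeping the callable form, one option (my addition, not from the original answer) is functools.partial:
import functools
import pymel.core as pm

def GUI_function():
    # partial() binds the argument but still hands pm.button a callable,
    # so the name is resolved in this module, not in Maya's own namespace
    pm.button(command=functools.partial(function, "some_arg"))

def function(arg, *_):
    # *_ swallows the extra state argument Maya passes to button callbacks
    print(arg)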
I have one script that calls another - let's call them master.py and fetch.py. I suppose the second script could be integrated into the first, but it does have distinct functionality - so keeping them separate seems like a good way to force myself to learn how to call outside scripts.
Here's the basic structure of fetch.py:
<import block>

infiles = <paths>
arcpy.env.workspace = os.path.dirname(infile)
ws = arcpy.env.workspace
newfile_list = []

def main():
    name = <name>
    if not arcpy.Exists(name + ".gdb"):
        global ws
        new_gdb = DM.CreateFileGDB(ws, name + ".gdb")
        newfile_list.append(new_gdb)
    other_func1()
    other_func2()
    print "\nNew files from fetch.py:"
    for i in newfile_list:
        print "  " + i

def other_func1():
    stuff

def other_func2():
    stuff

if __name__ == '__main__':
    main()
And master.py:
<import block>

infiles = <paths>

def f1():
    stuff

def f2():
    stuff

import fetch
fetch.main()
f1()
f2()
The problems concern the placement of the import block and file definitions in fetch.py:
When I put them inside main() and run it as a standalone script, my arcpy functions don't work because the imports haven't run yet. Putting them ahead of main() and making them global solves this problem.
But when I put them outside of main() as you see here, I get an error saying the local variable ws is referenced before assignment. I think this may have to do with my calling fetch.main(), where the initial lines don't get read. (I was able to make it work by declaring global ws, but I don't know if this is advisable.)
How do I structure fetch.py so that the import statements and file definitions get read, both when run as a standalone script and when called?
Instead of rewriting your code for you, I will try to point you to some ideas from the developer philosophy toolkit to help you refine your approach.
My feeling is that you should contemplate YAGNI and KISS when thinking about your code.
In your comment you write:
I'm keeping the 'fetch' script separate because finding feature orientation seems like a useful stand-alone tool down the road.
Like I say, YAGNI and KISS: if you need it as a standalone tool in the future, then split it out in the future. For now, keep it simple and put the code that belongs together in one module, and make it callable exactly the way you need it. When you work with it, understand the problem better, and see what else is needed, you can add other ways of calling your script. There is no reason why you shouldn't keep all the code in one module while it is still manageable and just add different ways to call it.
If the time comes when you want to organize your code into several modules, you can refactor it so that the things you currently do at module level are pulled out (into functions or maybe even classes). Then you can control exactly what happens when, and you can control the order of things better.
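As a rough sketch of that idea (hedged: DM and the .gdb handling come from the question; the argument names and the call at the bottom are just illustrative):
import os
import arcpy
from arcpy import management as DM

def main(infile, name):
    # Everything that used to run at import time now happens here, so
    # importing fetch has no side effects and callers control the order.
    ws = os.path.dirname(infile)
    arcpy.env.workspace = ws
    newfile_list = []
    if not arcpy.Exists(name + ".gdb"):
        newfile_list.append(DM.CreateFileGDB(ws, name + ".gdb"))
    return newfile_list

if __name__ == '__main__':
    main("C:/data/input.shp", "example")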
As further reading in regards to structuring a Python project I would recommend the pypa sample project and Starting A Python Project The Right Way. The problems you have with imports will likely cease to exist if you structure your code and your project in a sensible, pythonic way.
I am writing a fabfile in which I want to include an environment variable for the virtualenv that I'm using for development. I don't want to have to put
with shell_env(venv=VIRTUALENV):
    ...
in every single function. I'd like to be able to add this at the beginning of the file and have it apply to all tasks globally. Is there a way to do this?
For example, I have the tasks
def setup_dev_env():
    with shell_env(venv=VIRTUALENV):
        local('virtualenv $(venv)')
        with prefix('workon $(venv)'):
            local('pip install -r requirements.txt')
    test()

def test():
    with shell_env(venv=VIRTUALENV):
        with prefix('workon $(venv)'):
            local('python3 manage.py test')
and it would be nice if I didn't have to repeat exactly the same line at the beginning of both functions.
I'm not sure if this is considered a hack of command_prefixes but it works:
env.command_prefixes = ["export myvar='Hello world'"]

def echo_env():
    run(r"echo $myvar")
Output:
C:\Users\swozn\PycharmProjects\aaetuea>fab echo_env
[swozn@localhost] Executing task 'echo_env'
[swozn@localhost] run: echo $myvar
[swozn@localhost] out: Hello world
The advantage is that it is impossible to forget, because it is automatically prepended to all your commands.
If you look at prefix() and shell_env(), you'll notice that they just use _setenv({..}). You can combine everything with settings(), put that in a function and call it once, or just have it all in one line; up to you.
https://github.com/fabric/fabric/blob/5217b12f8aca3bc071206f7f4168e62c003509d1/fabric/context_managers.py#L370
https://github.com/fabric/fabric/blob/5217b12f8aca3bc071206f7f4168e62c003509d1/fabric/context_managers.py#L160
https://github.com/fabric/fabric/blob/5217b12f8aca3bc071206f7f4168e62c003509d1/fabric/context_managers.py#L443
It will look something like:
def setup_dev_env():
    with settings(..all your options..):
        local(..your command...)
or if you really want to be cool:
@with_settings(..all your same options from before..)
def setup_dev_env():
    local(..command..)
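For instance, a hedged sketch of the question's tasks with everything bundled into one reusable settings() call (Fabric 1.x assumed; VIRTUALENV stands in for the question's value):
from fabric.api import local, prefix, settings, shell_env

VIRTUALENV = "myenv"  # stand-in value

def venv_settings():
    # settings() accepts other context managers as positional arguments,
    # so shell_env and prefix can be bundled into one reusable call
    return settings(shell_env(venv=VIRTUALENV), prefix('workon $venv'))

def setup_dev_env():
    with venv_settings():
        local('pip install -r requirements.txt')
    test()

def test():
    with venv_settings():
        local('python3 manage.py test')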
Let's say I have a large Python library of functions and I want these functions (or some large number of them) to be available as commands in Bash.
First, disregarding Bash command options and arguments, how could I get a function in a Python file containing a number of functions to run using a single-word Bash command? I do not want the functions to be available via subcommands of a command 'suite'. So, let's say I have a function called zappo in this Python file (say, called library1.py). I would want to call this function using a single-word Bash command like zappo, not something like library1 zappo.
Second, how could options and arguments be handled? I was thinking that a nice way could be to capture all of the options and arguments of the Bash command and then use them within the Python functions using docopt parsing at the function level.
Yes, but the answer might not be as simple as you hope. No matter what you do, you're going to have to create something in your bash shell for each function you want to run. However, you could have a Python script generate aliases stored in a file which gets sourced.
Here's the basic idea:
#!/usr/bin/python
import sys
import __main__  # <-- This allows us to call methods in __main__
import inspect   # <-- This allows us to look at methods in __main__

########### Function/Class.Method Section ##############
#  Update this with functions you want in your shell   #
########################################################

def takesargs():
    # Just an example that reads args
    print(str(sys.argv))
    return

def noargs():
    # and an example that doesn't
    print("doesn't take args")
    return

########################################################

# Make sure there's at least 1 arg (since arg 0 will always be this file)
if len(sys.argv) > 1:
    # This fetches the function info we need to call it
    func = getattr(__main__, str(sys.argv[1]), None)
    if callable(func):
        # Actually call the function with the name we received
        func()
    else:
        print("No such function")
else:
    # If no args were passed to this script, just output a list of aliases
    # for this script that can be appended to .bashrc or similar.
    funcs = inspect.getmembers(__main__, predicate=inspect.isfunction)
    for func in funcs:
        print("alias {0}='./suite.py {0}'".format(func[0]))
Obviously, if you're using methods in a class instead of functions in __main__, change references from __main__ to your class, and change the predicate in the inspect call to inspect.ismethod. Also, you probably would want to use absolute paths for the aliases, etc.
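A hedged sketch of that class-based variant (the Suite class and its methods are made up for illustration):
import inspect
import sys

class Suite(object):
    def takesargs(self):
        print(str(sys.argv))

    def noargs(self):
        print("doesn't take args")

suite = Suite()
if len(sys.argv) > 1:
    func = getattr(suite, str(sys.argv[1]), None)
    if callable(func):
        func()
    else:
        print("No such function")
else:
    # Bound methods on an instance satisfy inspect.ismethod in both
    # Python 2 and Python 3
    for name, _ in inspect.getmembers(suite, predicate=inspect.ismethod):
        print("alias {0}='./suite.py {0}'".format(name))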
Sample output:
~ ./suite.py
alias noargs='./suite.py noargs'
alias takesargs='./suite.py takesargs'
~ ./suite.py > ~/pyliases
~ echo ". ~/pyliases" >> ~/.bashrc
~ . ~/.bashrc
~ noargs
doesn't take args
~ takesargs blah
['./suite.py', 'takesargs', 'blah']
If you use the method I've suggested above, you can actually have your .bashrc run ~/suite.py > ~/pyliases before it sources the aliases from the file. Then your environment gets updated every time you log in/start a new terminal session. Just edit your python function file, and then . ~/.bashrc and the functions will be available.
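In .bashrc terms, that refresh-on-login idea might look like this (assuming suite.py is executable and lives in your home directory):
# regenerate the aliases, then source them
~/suite.py > ~/pyliases
. ~/pyliases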
I need a few files (scripts) to be interpreted by an embedded Python interpreter concurrently (to be more detailed: one script executes another script via Popen, and my app intercepts that and executes the script itself). I've found that this is called a sub-interpreter, and I'm going to use one. But I've read that a sub-interpreter does not have sys.argv:
The new environment has no sys.argv variable
I need to pass argv anyway, so how can I do it?
You might find it easier to modify each of the scripts to follow this pattern:
def run(*posargs, **argdict):
    """
    This does the work and can be called with:

    import scriptname
    scriptname.run(someargs)
    """
    # Code goes here and uses posargs[n] where it would use sys.argv[n+1]

if __name__ == "__main__":
    import sys
    run(*sys.argv[1:])
Then your main script can call each of the subscripts in turn by simply calling its run function.
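For example (scriptname and the arguments are placeholders):
import scriptname

# roughly equivalent to: python scriptname.py input.txt --verbose
scriptname.run("input.txt", "--verbose")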
You can use environment variables. Have the parent set them by updating the dict os.environ if it's in Python, or with setenv() if it's in C or C++, etc. Then the children can read os.environ to get whatever strings they need.
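A minimal sketch of that handoff (the variable name SCRIPT_ARGS is made up for illustration):
import os

# Parent side, before starting the sub-interpreter:
os.environ["SCRIPT_ARGS"] = "input.txt --verbose"

# Child side, inside the sub-interpreter:
args = os.environ.get("SCRIPT_ARGS", "").split()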