I have a python script that calls out to two Sysinternals tools (sigcheck and accesschk). Is there a way I can bundle these executables into a py2exe so that subprocess.Popen can see it when it runs?
Full explanation: My script is meant to execute over a network share (S:\share\my_script.exe), and it makes hundreds of calls to sigcheck and accesschk. If sigcheck and accesschk also reside on the server, they seem to get transferred to the host, called once, transferred to the host again, called a second time, and so on until the roughly 400-500 calls are complete.
I can probably fall back to copying these two executables to the host (C:) and then deleting them when I'm done... how would you solve this problem?
I could be wrong about this, but I don't believe this is what py2exe was intended for. It's more about what you're distributing than how you're distributing it. I think what you may be looking for is the option to create a Windows installer. You could probably add the executables as data files or scripts using distutils.
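As for the copy-to-the-host fallback mentioned in the question, a stdlib-only sketch of it might look like this (the `local_copies` helper is hypothetical, and the tool paths in the usage comment are the question's; adjust to taste):

```python
import os
import shutil
import tempfile
from contextlib import contextmanager

@contextmanager
def local_copies(*tool_paths):
    """Copy the tools to a local temp dir once, yield their new paths,
    and clean up afterwards -- avoids re-fetching them over the share
    on every subprocess call."""
    workdir = tempfile.mkdtemp(prefix="tools-")
    try:
        yield [shutil.copy(p, workdir) for p in tool_paths]
    finally:
        shutil.rmtree(workdir, ignore_errors=True)

# hypothetical usage: copy once, then make the hundreds of calls locally
# with local_copies(r"S:\share\sigcheck.exe", r"S:\share\accesschk.exe") as (sig, acc):
#     for target in targets:
#         subprocess.check_output([sig, "-q", target])
```

The context manager guarantees the temp directory is removed even if one of the calls raises.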
arggg Why can't my data_files just get bundled into the zipfile?
I've started using paver for this kind of thing. It makes it really easy to override commands or create new commands that will allow you to put some new files into the sdist.
What's the best way to capture files/path info from watchman to pass to 'make' or another app?
Here's what I'm trying to achieve:
When I save a .py file on the dev server, I'd like to retrieve the filename and path, compile the .py to .pyc, then transfer the .pyc file to a staging server.
Should I be using watchman-make, 'heredoc' methods, ansible, etc.?
Because the docs are not very helpful, are there any examples available?
And what's the use case for pywatchman?
Thanks in advance
Hopefully this will help clarify some things:
Watchman runs as a per-user service to monitor your filesystem. It can:
Provide live subscriptions to file changes as they occur
Trigger a command to be run in the background as file changes occur
Answer queries about how files have changed since a given point in time
pywatchman is a python client implementation that allows you to build applications that consume information from watchman. The watchman-make and watchman-wait tools are implemented using pywatchman.
watchman-make is a tool that helps you invoke make (or a similar program) when files change. It is most appropriate in cases where the program you want to run doesn't need the specific list of files that have just changed. make is in this category; make will analyze the dependencies in your Makefile and then build only the pieces that are changed. You could alternatively execute a python distutils or setuptools setup.py script.
Native watchman triggers are a bit harder to use than watchman-make, as they are spawned in the background by the watchman service and are passed the list of changed files. These are most appropriate for completely unattended processes where you don't need to see the output and need the precise list of changed files.
From what you've described, it sounds like the simplest solution is a script that performs the compilation step and then performs the sync, something along the lines of the following; let's call it build-and-sync.sh
#!/bin/sh
python -m compileall .
rsync -avz . host:/path/
(If you don't really need a .pyc file and just need to sync, then you can simply remove the python line from the above script and just let it run rsync)
You can then use watchman-make to execute this when things change:
watchman-make --make='build-and-sync.sh' -p '**/*.py' -t dummy
Then, whenever any .py file (or set of .py files) changes, watchman-make will execute build-and-sync.sh dummy. This should be sufficient unless you have so many python files that the compilation step takes too long each time you make a change. watchman-make keeps running until you hit CTRL-C or otherwise kill the process; it runs in the foreground in your terminal window unless you use something like nohup, tmux or screen to keep it around for longer.
If that is the case, then you can try using make with a pattern rule to compile only the changed python files, or if that is awkward to express using make then perhaps it is worth using pywatchman to establish a subscription and compile the changed files. This is a more advanced use-case and I'd suggest looking at the code for watchman-wait to see how that might be achieved. It may not be worth the additional effort for this unless you have a large number of files or very tight time constraints for syncing.
I'd recommend trying out the simplest solution first and see if that meets your needs before trying one of the more complex options.
Using native triggers
As an alternative, you can use triggers. These run in the background with their output going to the watchman log file. They are a bit harder to work with than using watchman-make.
You need to write a small program, typically a script, to receive the list of changed files from the trigger; the best way to do this is via stdin of the script. You can receive a list of files one-per-line or a JSON object with more structured information. Let's call this script trigger-build-and-sync; it is up to you to implement the contents of the script. Let's assume you just want a list of files on stdin.
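A minimal Python sketch of trigger-build-and-sync, assuming the NAME_PER_LINE stdin format described above (the `compile_changed` helper is hypothetical, and the rsync destination is the one from the earlier script):

```python
#!/usr/bin/env python
import py_compile
import subprocess
import sys

def compile_changed(paths):
    """Byte-compile each changed .py file; return the ones that compiled cleanly."""
    compiled = []
    for path in paths:
        if path.endswith(".py"):
            try:
                py_compile.compile(path, doraise=True)
                compiled.append(path)
            except py_compile.PyCompileError as exc:
                print(exc, file=sys.stderr)
    return compiled

def main():
    # watchman writes one changed filename per line on stdin (NAME_PER_LINE)
    changed = [line.strip() for line in sys.stdin if line.strip()]
    if compile_changed(changed):
        subprocess.call(["rsync", "-avz", ".", "host:/path/"])

# guard so importing (or an interactive run) does not consume stdin
if __name__ == "__main__" and not sys.stdin.isatty():
    main()
```

Because the trigger runs unattended in the background, anything printed to stderr here ends up in the watchman log file rather than your terminal.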
This command will set up the trigger; you invoke it once and it will persist until the watch is removed:
watchman -j <<-EOT
["trigger", "/path/to/root", {
"name": "build-and-sync",
"expression": ["suffix", "py"],
"command": ["/path/to/trigger-build-and-sync"],
"append_files": false,
"stdin": "NAME_PER_LINE"
}]
EOT
The full docs for this can be found at https://facebook.github.io/watchman/docs/cmd/trigger.html#extended-syntax
I am "compiling" my Python application for Windows with PyInstaller 2.1. I initially tried using onefile mode, but onefile takes a long time to startup (decompressing wx and matplotlib). With the onedir mode it's pretty fast (only a little bit slower than native python).
So I want to use onedir mode for faster startup times, but it's difficult for the end-user to find the actual *.exe file among the huge number of files in the main directory (there are 98 files, including the actual executable and its manifest).
I want to make sure a non-tech-savvy user can easily "double-click" the executable and work with this program (ease and portability) without a long disclaimer to "just ignore" the 97 other files there.
Is it possible to move all those "distracting" files into a subfolder? Or are there other ways to make it easy for the end-user to run this program?
Maybe you could use onedir mode. You can put the resulting folder anywhere and create a shortcut for the user wherever it is most convenient.
The easiest way to reduce the number of files created in --onedir mode is to create a virtual environment and install only the necessary modules.
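A sketch of that approach using only the stdlib (`make_build_env` and the package list in the usage comment are hypothetical; the point is that PyInstaller run from a minimal environment has far fewer modules to collect into the dist folder):

```python
import os
import subprocess
import venv

def make_build_env(env_dir, packages, with_pip=True):
    """Create a clean virtual environment and install only what the app needs."""
    venv.create(env_dir, with_pip=with_pip)
    # the venv's own interpreter lives in bin/ (Scripts/ on Windows)
    bindir = "Scripts" if os.name == "nt" else "bin"
    python = os.path.join(env_dir, bindir, "python")
    if packages:
        subprocess.check_call([python, "-m", "pip", "install"] + packages)
    return python

# hypothetical usage: build from the lean environment, not your main install
# python = make_build_env("buildenv", ["wxPython", "matplotlib", "pyinstaller"])
# subprocess.check_call([python, "-m", "PyInstaller", "--onedir", "my_app.py"])
```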
You could use some software to pack it into a one-file installer, such as Inno setup.
You could also try deleting some files one by one (if the executable fails, just restore the file). I found that almost half of the files can be deleted with the executable still working fine.
My python program consists of several files:
the main execution python script
python modules in *.py files
config file
log files
executable scripts in other languages.
All these files should be available only to root. The main script should run on startup, e.g. via upstart.
Where should I put all these files in the Linux filesystem?
What's the best way to distribute my program? pip, easy_install, deb, ...? I haven't worked with any of these tools, so I want something easy for me.
The minimum supported Linux distribution should be Ubuntu.
For sure, if this program is to be available only to root, then the main execution python script should go to /usr/sbin/.
Config files ought to go to /etc/, and log files to /var/log/.
Other python files should be deployed to /usr/share/pyshared/.
Executable scripts of other languages will go either in /usr/bin/ or /usr/sbin/ depending on whether they should be available to all users, or for root only.
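A hedged sketch of an install step following the layout above (the filenames and the `install_all` helper are made up for illustration; a deb package or a distutils data_files entry would encode the same mapping):

```python
import os
import shutil

# destination layout from the answer above; the source names are hypothetical
INSTALL_MAP = {
    "myapp.py":   "/usr/sbin/myapp",                     # main script, root only
    "myapp.conf": "/etc/myapp.conf",                     # config file
    "helpers.py": "/usr/share/pyshared/myapp/helpers.py",
    "worker.sh":  "/usr/sbin/myapp-worker",              # root-only helper script
}

def install_all(src_dir, dest_root="/"):
    """Copy each file into place under dest_root and lock it down to root."""
    for src, dest in INSTALL_MAP.items():
        target = os.path.join(dest_root, dest.lstrip("/"))
        os.makedirs(os.path.dirname(target), exist_ok=True)
        shutil.copy(os.path.join(src_dir, src), target)
        mode = 0o700 if "/sbin/" in dest else 0o600  # executable for root only
        os.chmod(target, mode)
```

Running this as root (or with dest_root pointing at a packaging staging directory) gives you exactly the layout described, with nothing readable by other users.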
If only root should access the scripts, why not put it in /root/ ?
Secondly, if you're going to distribute your application you'll probably need easy_install or something similar; otherwise just tar.gz the stuff if only a few people will access it.
It all depends on your scale..
Pyglet, wxPython and similar have a huge userbase, same for BeautifulSoup, but they still tar.gz the stuff and you just use setuptools to deploy it (which is another option).
I have developed an application for a friend. The application is not that complex; it involves only two .py files, main.py and main_web.py, main being the application code and _web being the web interface for it. As the web part was done later, it's kept in this format; I know it can be done as one app, but to not complicate it too much, I kept it that way. The two communicate through some files, and the web part uses Flask, so there's a "templates" directory too.
Now, I want to make a package or somehow make this easier to distribute on an OSX system. I see that there is a nice py2app thingy, but I am running Windows and I can't really use it since it won't work on Win. I also don't know whether py2app would cause problems, since some configs are in text files in the directory and they change at runtime.
So, I am wondering, is there any other way to make a package of this, some sort of setup like program, or maybe some script or something? Some simple "way" of doing this would be to just copy the files in the directory in the "Documents", and add some shortcuts to the desktop to run those two apps, and that would be it, no need for anything else. DMG would be fine, but not mandatory.
I believe what you are looking for is to add #!/usr/bin/python as the first line of your code; that will allow your friend to just double-click on the file and it should open. Just as a warning, OSX does not tell us which version of python, and thus which standard libraries, will be present.
Also, if they have played around with their settings too much and double-clicking on the file does not work, they will have to choose to open it in Terminal.app in the Utilities folder (/Applications/Utilities/Terminal.app).
The other idea is to borrow a Mac and compile it with the py2app program that you already mentioned. Otherwise there is no generic binary that you can compile on Windows and have run on a Mac.
My Problem is the following:
I want to create a script that can create other executables. These new executables have to be standalone, so they don't require any DLLs, etc.
I know this is possible with PyInstaller, but only from the console/command line.
So essentially, what I want to do is make a python script that imports pyinstaller, creates another .py file, and uses pyinstaller to compile the new script to an .exe, so people who don't have python installed can use this program.
EDIT: The script itself should only use one file, so it can also be a one-file executable
Supposing you have already installed PyInstaller in PYINSTALLER_PATH (you should have run the Configure.py script in the distribution once first), PyInstaller generates a spec file from your main script by calling Makespec.py. You can add flags to generate a one-dir or one-file binary distribution. Finally, you have to call Build.py with the spec file.
This is easy scriptable with a couple of system calls. Something like :
import os
import sys

PYTHON_EXECUTABLE = sys.executable            # interpreter used to run PyInstaller
PYINSTALLER_PATH = r"C:\pyinstaller"          # adjust to your install location
PROJECT_NAME = "test"
PROJECT_MAIN_SCRIPT = "main_script.py"

MAKESPEC_CMD = '"%s" "%s" -X -n %s -F %s' % (
    PYTHON_EXECUTABLE, os.path.join(PYINSTALLER_PATH, "Makespec.py"),
    PROJECT_NAME, PROJECT_MAIN_SCRIPT)
BUILD_CMD = '"%s" "%s" %s.spec' % (
    PYTHON_EXECUTABLE, os.path.join(PYINSTALLER_PATH, "Build.py"),
    PROJECT_NAME)

os.system(MAKESPEC_CMD)
os.system(BUILD_CMD)
You can avoid regenerating the spec file every time and hack it instead, adding embedded resources (e.g. xml or configuration files) and specifying other flags. The spec file is basically a python file containing the definitions of some dictionaries.
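For reference, a spec file from this era of PyInstaller looks roughly like the fragment below. It is not a standalone script: the Analysis, PYZ, and EXE names are provided by PyInstaller's build machinery when Build.py executes the spec, and the exact arguments vary between versions, so treat this as a hedged sketch of the shape rather than a verbatim template.

```python
# test.spec -- sketch of a one-file spec, as Makespec.py -F would emit it
a = Analysis(['main_script.py'])
pyz = PYZ(a.pure)
exe = EXE(pyz,
          a.scripts,
          a.binaries,
          a.datas,
          name='test',
          console=True)
```

Hand-editing this file (for example to add data files to the EXE call) and then re-running Build.py is the "hack it" workflow described above.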
I don't think there is a Pyinstaller module you can use directly, but you can look at Build.py and mimic its behaviour to do the same. Build.py is the main script which does the trick.
You may want to check out cx_Freeze, which can be used to do this kind of thing.
There are three different ways to use cx_Freeze. The first is to use the included cxfreeze script which works well for simple scripts. The second is to create a distutils setup script which can be used for more complicated configuration or to retain the configuration for future use. The third method involves working directly with the classes and modules used internally by cx_Freeze and should be reserved for complicated scripts or extending or embedding.
Source
Try downloading PyInstaller's latest development code. There they are trying to implement a GUI toolkit for building executables.