PyInstaller: Executable to access user-specified source code - python

I'm using PyInstaller to distribute my code as an executable within my team, as most of them are not coding/scripting people and do not have a Python interpreter installed.
For some advanced usage of my tool, I want to make it possible for the user to implement a small custom function to adjust the functionality slightly (for the few experienced people). Hence I want to let them supply a Python file that defines a function with a fixed name and a string as return value.
Is that possible?
I mean, the .py file could be dragged and dropped, for example, and I'd tell them that their user-defined function needs to have a certain name, e.g. "analyze()". Is it possible to import that function from the dropped Python file within my PyInstaller-built script and use it as such?
I know it certainly will not be safe/secure and they could do evil things, delete files and so on... but those are things we don't care about at this point, so please no discussions about it. Thanks!

To answer my own question: yes, it does actually work to import a module/function from a given path/Python file at runtime (that I knew already), even in a PyInstaller executable (that was new to me).
I used this for my Py2.7 program:
f = r'C:\path\to\userdefined\filewithfunction.py'
if os.path.exists(f):
    import imp
    userdefined = imp.load_source('', f)  # Python 2.x only; for 3.x see: https://stackoverflow.com/a/67692/701049
    print userdefined       # just a debugging print
    userdefined.imported()  # here you should use try/except, or check whether a function with the desired name
                            # really exists in "userdefined"; this is only a small demo of how to import
filewithfunction.py:
--------------------
def imported():
    print 'yes it worked :-)'
As written in the comments of the example code, you'll need a slightly different approach in Python 3.x. See this link: https://stackoverflow.com/a/67692/701049
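For reference, a minimal sketch of the Python 3 equivalent using importlib (the path and function name are placeholders matching the example above):
import os
import importlib.util

f = r'C:\path\to\userdefined\filewithfunction.py'
if os.path.exists(f):
    spec = importlib.util.spec_from_file_location('userdefined', f)
    userdefined = importlib.util.module_from_spec(spec)
    spec.loader.exec_module(userdefined)        # run the user's file as a module
    if hasattr(userdefined, 'imported'):        # make sure the expected function exists
        userdefined.imported()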

Related

For what uses do we need the `sys` module in Python?

I'm a bit experienced with other languages, but a novice with Python. I have come across code in Jupyter notebooks where sys is imported.
I can't see any further use of the sys module in the code. Can someone help me understand the purpose of importing sys?
I do know about the module and its uses, but I can't find a concise reason why it is imported in so many code blocks without any further use.
If nothing declared within sys is actually used, then there's no benefit to importing it. There's not a significant amount of cost either.
The sys module is rather useful as it lets you interact with the interpreter and the system it runs on. E.g.:
You can access any command line arguments using sys.argv[1:].
You can see the module search path via sys.path.
You can check the version of your Python interpreter using sys.version.
You can exit the running code with sys.exit().
Mostly you will use it for accessing the command line arguments.
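A minimal sketch of those uses (the script and argument names are arbitrary):
import sys

print(sys.version)              # interpreter version string
print(sys.path)                 # module search path

if len(sys.argv) < 2:           # sys.argv[0] is the script name, the rest are arguments
    sys.exit("usage: demo.py <name>")
print("Hello", sys.argv[1])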
I'm a new Pythonista; I learned to import it whenever I want to exit the program with a nice exit text in red:
import sys

name = input("What's your name? ")
if name == "Vedant":
    print(f"Hello There {name}.")
else:
    sys.exit(f"You're not {name}!")
The sys module includes functions and variables that let you inspect and change the Python runtime environment.
Some examples of this control include:
1- Reading input from other sources via:
sys.stdin
2- Sending output to other destinations via:
sys.stdout
3- Writing error messages automatically when an exception happens, to:
sys.stderr
4- Exiting the program while printing a message, e.g.:
sys.exit("Finish with the calculations.")
5- The built-in variable listing the directories in which the interpreter looks for modules:
sys.path
6- A function that returns the size in bytes of an object:
sys.getsizeof(1)
sys.getsizeof(3.8)
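A small sketch touching the items above (the messages are arbitrary; the stdin read waits for one line of input):
import sys

line = sys.stdin.readline()                  # 1- read a line from standard input
sys.stdout.write("echo: " + line)            # 2- write to standard output
sys.stderr.write("diagnostics go here\n")    # 3- write to the error stream

print(sys.getsizeof(1))                      # 6- size in bytes of an int object
print(sys.getsizeof(3.8))                    # 6- size in bytes of a float object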

How to inspect the type of REPL you're using?

There are many kinds of Python REPLs, like the default REPL, ptpython, ipython, bpython, etc. Is there a way to inspect which REPL I'm currently in, once I'm already inside it?
A little background:
As you may have heard, I made pdir2 to generate pretty dir() printing. A challenge I'm facing is to make it compatible with those third-party REPLs, but first I need to know which REPL the program is running in.
Ok, finally found a simple but super reliable way: checking sys.modules.
A function you could copy and use.
import sys

def get_repl_type():
    if any('ptpython' in key for key in sys.modules):
        return 'PTPYTHON'
    if any('bpython' in key for key in sys.modules):
        return 'BPYTHON'
    try:
        __IPYTHON__
        return 'IPYTHON'
    except NameError:
        return 'PYTHON'
Probably the best you can do is to look at sys.stdin and stdout and compare their types.
Maybe there are also ways for each interpreter to hook in custom completions or formatters.
You can try to find information from the call stack.
Those fancy REPLs use their startup script to initialize.
It's possible to run one REPL inside another, so you need to traverse the call stack from top to bottom until you find a frame that comes from a REPL's init script.
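A rough sketch of that idea using the inspect module; the substrings checked against the frame filenames are assumptions about what the REPLs' init scripts are called:
import inspect

def guess_repl_from_stack():
    # Walk the call stack from the innermost frame outwards and look for
    # a frame whose file belongs to a known REPL package.
    for frame_info in inspect.stack():
        filename = frame_info.filename.lower()
        for name in ('ptpython', 'bpython', 'ipython'):
            if name in filename:
                return name.upper()
    return 'PYTHON'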

Making a GDB debugging helper for the QUuid class

I'm using the QUuid class in my project and for testing and debugging purposes it would be very nice to see the QUuid objects in human readable form instead of their low-level form.
For some reason, the people at Qt have not included a dump method for this type so I attempted to create one on my own, following this documentation and this guide.
I'm not familiar with Python, so unfortunately I could not get anything running. Could someone help me create such a function that does nothing more than display the output of QUuid::toString() in the value column of Qt Creator?
Edit:
Mitko's solution worked perfectly. I expanded it a bit so the details can still be read if so desired:
from dumper import *
import gdb

def qdump__QUuid(d, value):
    this_ = d.makeExpression(value)
    finalValue = gdb.parse_and_eval("%s.toString()" % (this_))
    d.putStringValue(finalValue)
    d.putNumChild(4)
    if d.isExpanded():
        with Children(d):
            d.putSubItem("data1", value["data1"])
            d.putSubItem("data2", value["data2"])
            d.putSubItem("data3", value["data3"])
            d.putSubItem("data4", value["data4"])
The following python script should do the job:
from dumper import *
import gdb

def qdump__QUuid(d, value):
    this = d.makeExpression(value)
    stringValue = gdb.parse_and_eval("%s.toString()" % this)
    d.putStringValue(stringValue)
    d.putNumChild(0)
The easiest way to use it with Qt Creator is to just paste these lines at the end of your <Qt-Creator-Install-Dir>/share/qtcreator/debugger/personaltypes.py file. In this case you can skip the first line, as it's already in the file.
As the personaltypes.py file is overwritten when you update Qt Creator you might want to put the script above in its own file. In that case you'll need to configure Qt Creator to use your file. You can do this by going to Tools > Options... > Debugger > GDB > Extra Debugging Helpers > Browse and selecting your file.
Note:
This script will only work inside Qt Creator, since we use its specific dumper (e.g. putStringValue).
We call QUuid::toString() which creates a QString object. I'm not sure exactly how gdb and python handle this, and if there is a need to clean this up in order to avoid leaking memory. It's probably not a big deal for debugging, but something to be aware of.

Best way to write Python 2 and 3 compatible code using nothing but the standard library

I am trying to make some of my code Python 2 and 3 compatible.
At the moment I am struggling with functions like range/xrange and methods like dict.items/dict.iteritems. Ideally I would like my code to be able to use the former in Python 3.x and the latter in Python 2.x.
Using if/else seems to me to be the easiest way to implement this:
if py >= 3:
    for item in array.items():
        ...
else:
    for item in array.iteritems():
        ...
However, doing it like that results in lots of repeated and ugly code. Is there a better way to do this using only the standard library? Can I just state somewhere at the beginning of the code to always use range/dict.items if py >= 3 and xrange/dict.iteritems if not?
Is it possible to do something like this?
if py < 3:
    use xrange as range
I have looked around and I know that several libraries, like six or futurize, are used to solve this issue. However, I am working on a server that runs only Python 2.7 and I am not allowed to install any extra libraries on it. I have some Python 3 code I would like to use, but I also want to maintain only one version of the code.
The simple, "Don't Make Me Think!" solution I use is to start simple scripts with:
#!/usr/bin/env python
# just make sure that Python 3 code runs fine with 2.7+ too ~98% of the time :)
from __future__ import (division, print_function, absolute_import,
                        unicode_literals)
from builtins import int
try:
    from future_builtins import ascii, filter, hex, map, oct, zip
except:
    pass
import sys
if sys.version_info.major > 2:
    xrange = range
(Extra tip to stop most PEP 8 linters from unnecessarily yelling at you for this: move the last 3 lines inside, at the top of, the try block above.)
But the only case where I use this is basically "shell scripts that were too large and hairy, so I quickly rewrote them in Python and I just want them to run under both Python 2 and 3 with 0 dependencies". Please do NOT use this in real application/library code until you know exactly what the consequences of all the lines above are, and whether they are enough for your use case.
Also, the "solution" in this case for .iteritems is "just don't use it", ignore memory use optimizations and just always use .items instead - if this matters, it means you're not writing a "0 dependencies simple script" anymore, so just pick Python 3 and code for it (or Python 2 if you need to pretend we're in 2008).
Also, check these resources to get a proper understanding:
http://python-future.org/compatible_idioms.html
http://lucumr.pocoo.org/2011/1/22/forwards-compatible-python/
https://wiki.python.org/moin/PortingToPy3k/BilingualQuickRef
(NOTE: I'm answering this already-answered question mainly because the accepted answer roughly translates to "you are stupid and this is dumb", and I find that very rude for an SO answer: no matter how dumb the question, and how "wrong" it may be to actually answer it, a question deserves a real answer.)
import sys
if sys.version_info.major > 2:
    xrange = range
But as Wim implies, this is basically rewriting six yourself.
And as you can see, six does a lot more than handling range. Just look at e.g. the _moved_attributes list in the six source code.
And while Python comes with "batteries included", its standard library is not and cannot be all-encompassing. Nor is it devoid of flaws.
Sometimes there are better batteries out there, and it would be a waste not to use them. Just compare urllib2 with requests. The latter is much nicer to work with.
I would recommend writing for Python 2 or Python 3 in your project's modules, but not mixing them together and not including any sort of 2/3 checks at all. Your program's logic shouldn't have to care about the Python version it runs on, except maybe for avoiding methods on built-in objects that conflict.
Instead, import * from your own compatibility layer that papers over the differences between the versions, and use shadowing to make that transparent to your actual project's modules.
For instance, in the compatibility module you can write Roland Smith's substitution for range/xrange, and in your other modules you add "from compatibility import *". Doing this, every module can use "xrange" and the compatibility layer will manage the 2/3 differences.
Unfortunately it won't solve methods on existing objects such as dict.iteritems; typically you would monkey-patch the dict methods, but that is not possible on built-in types (see https://stackoverflow.com/a/192857/1741414). I can imagine some workarounds:
Function wrappers (essentially sobolevn's answer)
Don't use .items() functions at all; use simple loops on keys and then access the dictionary with those keys:
for key in my_dict:
    value = my_dict[key]
    # rest of code goes here
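A minimal sketch of such a compatibility layer, assuming a hypothetical module named compat.py; project modules would then start with "from compat import *":
# compat.py -- hypothetical compatibility layer
import sys

PY3 = sys.version_info.major > 2

if PY3:
    xrange = range                 # Python 2 style code can keep calling xrange
    def iteritems(d):
        return iter(d.items())     # lazy iteration, like iteritems on Python 2
else:
    def iteritems(d):              # on Python 2, xrange is already a builtin
        return d.iteritems()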
I guess you are mixing up array and dict in this case.
If you are restricted from using third-party libraries for any reason, why not do it like this:
def iterate_items(to_iterate):
    # "py" is assumed to hold the major Python version, e.g. py = sys.version_info.major
    if py >= 3:
        return to_iterate.items()
    else:
        return to_iterate.iteritems()
And then use it:
for item in iterate_items(your_dict):
    ...
import sys
VERSION = float("{}.{}".format(sys.version_info.major, sys.version_info.minor))
And by using this we can write conditional code for desired versions.
if VERSION >= 3.5:
    from subprocess import run as _run
else:
    from subprocess import call as _run
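One caveat: the float conversion above breaks once minor versions reach double digits (Python 3.10 turns into 3.1), so comparing sys.version_info as a tuple is more robust; a small sketch of the same conditional import:
import sys

# Tuple comparison handles 3.10, 3.11, ... correctly, unlike the float approach
if sys.version_info >= (3, 5):
    from subprocess import run as _run
else:
    from subprocess import call as _run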

How to have win32com code completion in IPython?

Via
import win32com.client
wordapp = win32com.client.gencache.EnsureDispatch('Word.Application')
I can get a Word Application object, documented e.g. here. However, IPython's autocompletion is not aware of that API; is there any way to add it?
Quick solution
Perhaps the simplest way to achieve code completion in IPython (tested with 6.2.1, see the answer below for a snippet that works with 7.1) and Jupyter is to run the following snippet:
from IPython.utils.generics import complete_object
import win32com.client

@complete_object.when_type(win32com.client.DispatchBaseClass)
def complete_dispatch_base_class(obj, prev_completions):
    try:
        ole_props = set(obj._prop_map_get_).union(set(obj._prop_map_put_))
        return list(ole_props) + prev_completions
    except AttributeError:
        pass
Short story long
With some more details outlined in this guide: win32com ships with a script, makepy.py, for generating Python types corresponding to the type library of a given COM object.
In the case of Word 2016, we would proceed as follows:
C:\Users\username\AppData\Local\Continuum\Anaconda3\pkgs\pywin32-221-py36h9c10281_0\Lib\site-packages\win32com\client>python makepy.py -i "Microsoft Word 16.0 Object Library"
Microsoft Word 16.0 Object Library
{00020905-0000-0000-C000-000000000046}, lcid=0, major=8, minor=7
>>> # Use these commands in Python code to auto generate .py support
>>> from win32com.client import gencache
>>> gencache.EnsureModule('{00020905-0000-0000-C000-000000000046}', 0, 8, 7)
The location of makepy.py will of course depend on your Python distribution. The script combrowse.py, available in the same directory, can be used to find the names of available type libraries.
With that in place, win32com.client will automatically make use of the generated types, rather than the raw IPyDispatch, and at this point, auto-completion is available in e.g. IPython or Jupyter, given that the COM object of interest actually publishes its available properties and methods (which is not a requirement).
Now, in your case, by invoking EnsureDispatch instead of Dispatch, the makepy part of the process is performed automatically, so you really should be able to obtain code completion in IPython for the published methods.
Note, though, that while this does give code completion for methods, the same will not be true for properties. It is possible to inspect those using the _prop_map_get_ attribute. For example, wordapp.Selection.Range.Font._prop_map_get_ gives all properties available on fonts.
If using IPython is not a strong requirement, note also that the PythonWin shell (located around \pkgs\pywin32\Lib\site-packages\pythonwin\Pythonwin.exe) has built-in code completion support for both properties and methods.
This, by itself, suggests that the same is achievable in IPython.
Concretely, the logic for auto-completion, which in turn relies on _prop_map_get_, can be found in scintilla.view.CScintillaView._AutoComplete. On the other hand, code completion in IPython 6.2.1 is handled by core.completer.IPCompleter. The API for adding custom code completers is provided by IPython.utils.generics.complete_object, as illustrated in the first solution above. One gotcha is that with complete_object being based on simplegeneric, only one completer may be provided for any given type. Luckily, all types generated by makepy will inherit from win32com.client.DispatchBaseClass.
If this turns out to ever be an issue, one can also circumvent complete_object entirely and simply manually patch IPython by adding the following five lines to core.completer.Completion.attr_matches:
try:
    ole_props = set(obj._prop_map_get_).union(set(obj._prop_map_put_))
    words += list(ole_props)
except AttributeError:
    pass
Conversely, IPython bases its code completion on __dir__, so one could also patch gencache, which is where the code generation ultimately happens, to add something like
def __dir__(self):
    return list(set(self._prop_map_get_).union(set(self._prop_map_put_)))
to each generated DispatchBaseClass.
fuglede's answer is great; I just want to update it for the newest versions of IPython (7.1+).
Since IPython.utils.generics has changed from using simplegeneric to using functools, the @complete_object.when_type decorator should be changed to @complete_object.register. So the initial code should be changed to:
from IPython.utils.generics import complete_object
import win32com.client

@complete_object.register(win32com.client.DispatchBaseClass)
def complete_dispatch_base_class(obj, prev_completions):
    try:
        ole_props = set(obj._prop_map_get_).union(set(obj._prop_map_put_))
        return list(ole_props) + prev_completions
    except AttributeError:
        pass
