I have a script I'm trying to compile with PyInstaller (2.1) using Python 2.7.
The script uses a custom package I've written called 'auto_common'.
In the script I import it using:
sys.path.append(path_to_package)
The project folders look like this:

Automation/                # top-level project
    Proj1/
        script1.py         # the script I want to compile
        myspec.spec        # spec file for the script
    Packages/
        auto_common/
            __init__.py    # init module of the package (empty)
            ...            # more modules here
In the PyInstaller log file I get the following warning:
W: no module named auto_common (top-level import by __main__)
How do I create a hook which will include the package (using sys.path.append for example)?
I tried adding the path of the package to 'pathex' in the spec file but it didn't work.
Using "-p" when compiling (or when building a spec file) will add additional paths to python's path.
pyinstaller -p any_path/Automation/Packages script1.py
This mimics the behavior of sys.path.append().
Thanks to the guys at PyInstaller for the solution:
sys.path.append does not work when compiling with PyInstaller 2.1
Related
I am developing a PyQt-based Windows application. I have a lot of folders with many Python files, but when I try to create an executable with the pyinstaller package, the dependencies on files in the other folders cannot be resolved.
For simplicity, this is my folder structure:

gui/
    gui.py
    main.py
libs/
    testlibs.py
utility/
    folderstructure.py
I used the command pyinstaller main.py --onefile.
The problem is that gui.py imports utility.folderstructure, which cannot be resolved after the exe is generated: the exe throws an exception that utility cannot be found.
I then added the additional hooks directory option. For that I added a file named hook-gui.py with the content hiddenimports=[".utility.folderstructure.*"].
Now my folder structure is:

gui/
    gui.py
    main.py
    hook-gui.py
libs/
    testlibs.py
utility/
    folderstructure.py
But while running the command pyinstaller --additional-hooks-dir=. main.py --onefile I get:

INFO: Loading module hook hook-gui.py...
WARNING: Hidden import .utility.folderstructure.* not found!

Could you tell me where I am wrong?
Normally I would use a layout similar to:

main.py
gui/
    __init__.py
    gui.py
    hook-gui.py
libs/
    __init__.py
    testlibs.py
utility/
    __init__.py
    folderstructure.py
PyInstaller has always found the modules for me with this style of layout.
You're just going to have to adjust some imports.
I am trying to generate Python bindings for a C++ shared library with SWIG and distribute the project with conda. The build process seems to work, since I can execute
import mymodule as m
ret = m.myfunctioninmymodule()
in my build directory. Now I want to install the files that are created (namely, mymodule.py and _mymodule.pyd) in my conda environment on Windows so that I can access them from everywhere. But where do I have to place the files?
What I have tried so far is to put both files in a package together with an __init__.py (which is empty, however) and write a setup.py as suggested here. The package has the form
- mypackage
|- __init__.py
|- mymodule.py
|- _mymodule.pyd
and is installed under C:\mypathtoconda\conda\envs\myenvironmentname\Lib\site-packages\mypackage-1.0-py3.6-win-amd64.egg. However, the python import (see above) fails with
ImportError: cannot import name '_mymodule'
It should be noted that under Linux this approach with the package works perfectly fine.
Edit: The __init__.py is empty because this is sufficient to build a package. I am not sure, however, what belongs in there. Do I have to give a path to certain components?
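One common cause of this error (an assumption here, since the generated wrapper code isn't shown) is that the SWIG-generated mymodule.py loads its compiled companion with a plain import _mymodule, which stops working on Python 3 once both files live inside a package; a package-relative import does work. A runnable sketch of the difference, using a pure-Python stand-in for the .pyd (all names below are hypothetical):

```python
import os
import sys
import tempfile

# Build a throwaway package that mimics mypackage; _stub.py stands in
# for the compiled _mymodule.pyd.
pkg = os.path.join(tempfile.mkdtemp(), "mypkg")
os.makedirs(pkg)
open(os.path.join(pkg, "__init__.py"), "w").close()
with open(os.path.join(pkg, "_stub.py"), "w") as f:
    f.write("VALUE = 42\n")
# The wrapper must import its companion *relatively* on Python 3:
with open(os.path.join(pkg, "wrapper.py"), "w") as f:
    f.write("from . import _stub\nVALUE = _stub.VALUE\n")

sys.path.insert(0, os.path.dirname(pkg))
from mypkg import wrapper
print(wrapper.VALUE)  # prints 42
```

If the generated mymodule.py contains a bare import _mymodule instead of from . import _mymodule, editing it (or regenerating with a newer SWIG) may resolve the ImportError.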
I have a large project with multiple packages. These packages use a set of modules in a common package. I am trying to create an exe on Windows using pyinstaller, but it cannot find the common package.
This cut down project exhibits the same issue. My package is organised as shown in this tree:
When I use
python -m my_package
in the top my_package directory it works perfectly.
The module main.py in my_package imports Bar (which is located in foo) from common. The __init__.py file in common includes:
from common.source.foo import Bar
When I build an exe file and run it in a terminal, it fails with 'No module named common'.
My pyinstaller spec includes:
hiddenimports=['../', '../common/', '../common/common/']
Should I try something different?
The hiddenimports are used to specify imports that can't be detected by pyinstaller, not the paths to those imports.
Try adding the necessary paths to the pathex list in the spec file instead (these are paths that will be available in sys.path during analysis).
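A minimal sketch of the relevant part of the spec file (the paths and module name are hypothetical, based on the layout described in the question):

```python
# myproject.spec -- only the Analysis call is shown; '...' elides the
# remaining arguments. pathex entries are searched during analysis,
# while hiddenimports holds real module names, not paths.
a = Analysis(
    ['main.py'],
    pathex=['..', '../common'],
    hiddenimports=['common'],
    ...
)
```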
I am trying to build a Python multi-file code with PyInstaller. For that I have compiled the code with Cython, and am using .so files generated in place of .py files.
Assuming the 1st file is main.py and the imported ones are file_a.py and file_b.py, I get file_a.so and file_b.so after Cython compilation.
When I put main.py, file_a.so and file_b.so in a folder and run it by "python main.py", it works.
But when I build it with PyInstaller and try to run the executable generated, it throws errors for imports done in file_a and file_b.
How can this be fixed? One solution is to import all standard modules in main.py and this works. But if I do not wish to change my code, what can be the solution?
So I got this to work for you.
Please have a look at Bundling Cython extensions with Pyinstaller
Quick Start:
git clone https://github.com/prologic/pyinstaller-cython-bundling.git
cd pyinstaller-cython-bundling
./dist/build.sh
This produces a static binary:
$ du -h dist/hello
4.2M dist/hello
$ ldd dist/hello
not a dynamic executable
And produces the output:
$ ./dist/hello
Hello World!
FooBar
Basically this came down to producing a simple setup.py that builds the extensions file_a.so and file_b.so, and then using pyinstaller to bundle the application and the extensions into a single executable.
Example setup.py:
from glob import glob

from setuptools import setup
from Cython.Build import cythonize

setup(
    name="test",
    scripts=glob("bin/*"),
    ext_modules=cythonize("lib/*.pyx"),
)
Building the extensions:
$ python setup.py develop
Bundling the application:
$ pyinstaller -r file_a.so,dll,file_a.so -r file_b.so,dll,file_b.so -F ./bin/hello
Just in case someone's looking for a quick fix.
I ran into the same situation and found a quick/dirty way to do the job. The issue is that pyinstaller does not add to the .exe file the libraries that are needed to run your program.
All you need to do is import all the libraries (and the .so files) your program needs in your main.py file (the file which calls file_a.py and file_b.py). For example, assume that file_a.py uses the opencv library (cv2) and file_b.py uses the matplotlib library. Now in your main.py file you need to import cv2 and matplotlib as well. Basically, whatever you import in file_a.py and file_b.py, you have to import in main.py too. This tells pyinstaller that the program needs these libraries, and it includes them in the exe file.
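A minimal sketch of the idea, with stdlib modules as stand-ins for the heavier libraries (the file_a/file_b names come from the question; suppose, hypothetically, that file_a.py uses json and file_b.py uses csv):

```python
# main.py -- re-declare in the entry script every import used by the
# compiled extension modules, so PyInstaller's static analysis (which
# cannot look inside .so files) still bundles those libraries.
import json  # stand-in: needed by file_a.so at runtime
import csv   # stand-in: needed by file_b.so at runtime

# the compiled modules would then be imported as usual:
# import file_a
# import file_b
```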
I'm new to Python, coming from Java. I created a folder called 'Project'. In 'Project' I created many packages (with __init__.py files) such as 'test1' and 'test2'. 'test1' contains a Python script x.py that imports a module from 'test2'. I want to run x.py in 'test1' from the command line. How can I do that?
Edit: if you have better recommendations on how I can organize my files, I would be thankful (notice my Java mentality).
Edit: I need to run the script from a bash script, so I need to provide full paths.
There are probably several ways to achieve what you want.
One thing I do when I need to make sure the module paths are correct in an executable script is to get the parent directory and insert it into the module search path (sys.path):
import os
import sys

sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.realpath(__file__))))

import test1  # next imports go here...
from test2 import something
# any import that works from the parent dir will work here
This way you are safe to run your scripts without worrying how the script is called.
Python code is organized into modules and packages. A module is just a .py file that can contain class definitions, function definitions and variables. A package is a directory with a __init__.py file.
A standard Python project might look something like this:
thingsproject/
    README
    setup.py
    doc/
        ...
    things/
        __init__.py
        animals.py
        vegetables.py
        minerals.py
    test/
        test_animals.py
        test_vegetables.py
        test_minerals.py
The setup.py file describes the metadata about your project. See Writing the Setup Script and particularly the section on installing scripts.
Entry points exist to help distribute command line tools in Python. An entry point is defined in setup.py like this:
setup(
    name='thingsproject',
    ...
    entry_points={
        'console_scripts': ['dog = things.animals:dog_main_function']
    },
    ...
)
The effect is that when the package is installed using python setup.py install, a script is automatically created in some reasonable place for your OS, such as /usr/local/bin. The script then calls the dog_main_function in the animals module of the things package.
Yet another Python convention to consider is having a __main__.py file. This signifies the "main" script within a directory or zip file full of Python code, and is a good place to define a command-line interface to your code using the argparse parser for command-line arguments.
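As a sketch of that convention (the package and function names are hypothetical, matching the example project above), a __main__.py for the things package might look like:

```python
# things/__main__.py -- makes the package runnable as `python -m things`
# and defines a small argparse-based command line interface.
import argparse
import sys

def main(argv=None):
    parser = argparse.ArgumentParser(prog="things")
    parser.add_argument("--name", default="world", help="who to greet")
    args = parser.parse_args(argv)
    return "Hello, %s!" % args.name

if __name__ == "__main__":
    print(main(sys.argv[1:]))
```

Running python -m things --name dog would then print the greeting built by main().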
Good and up-to-date information on the somewhat muddled world of Python packaging can be found in the Python Packaging User Guide.