Pyinstaller produces massive folder - python

I'm using PyInstaller to pack a splash screen; these are the imports of the Python script:
import subprocess
import time
import sys
import os
import signal
from multiprocessing import Process, Queue
import gi
gi.require_version('Gtk', '3.0')
from gi.repository import Gtk, Gdk, Pango
I use the following command to create the dist folder:
pyinstaller splash_gui.py
The problem is that the produced folder is 630 MB, which is overkill for just a splash screen program. I investigated further and found that I could remove a LOT of files (the heaviest were the ones in the share folder containing my themes, all of them), and more importantly that the vast majority of the dynamic libraries were unused.
Through trial and error I managed to remove everything that was not necessary (often generating warnings, which I don't care about because it is just a splash screen). Is there a direct way to avoid this mess? In short, I want to keep only the files containing functions that are actually called.
This is the situation before and after cleaning:
630Mb 8Mb
. .
└── splash_gui └── splash_gui
├── array.so ├── binascii.so
... ├── _collections.so
├── share ├── cPickle.so
│   ├── fontconfig ├── cStringIO.so
│   ├── glib-2.0 ├── fcntl.so
│   ├── icons ├── _functools.so
│   ├── locale ├── gi._gi.so
│   ├── mime ├── _io.so
│   └── themes ├── itertools.so
... ├── libpython2.7.so.1.0
├── _sha.so ├── math.so
├── _socket.so ├── _multiprocessing.so
├── splash_gui ├── operator.so
├── _ssl.so ├── _random.so
├── strop.so ├── select.so
├── _struct.so ├── _socket.so
├── termios.so ├── splash_gui
├── time.so ├── _struct.so
├── unicodedata.so ├── time.so
└── zlib.so └── zlib.so
Except for the warnings, the splash screen works normally.

PyInstaller packs all the libraries you have installed with pip, so you need to create a clean new virtual environment and install only the packages your script needs. Once that is done, install PyInstaller inside that environment and run it from there.
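A minimal sketch of that workflow (assuming a Unix-like shell and Python 3's built-in venv module; the question's script runs on Python 2.7, where the virtualenv package would be used instead, and PyGObject may need to come from system packages rather than pip):
# create and activate a fresh, empty environment
python3 -m venv splash_env
source splash_env/bin/activate
# install only what splash_gui.py actually needs, plus PyInstaller itself
pip install PyGObject pyinstaller
# run the PyInstaller that lives inside the clean environment
pyinstaller splash_gui.py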

If a single executable file (and no other folders/files) is acceptable (just try it out and check the resulting file size), here is my solution. Run the command:
pyinstaller --onefile "directory\script_name.py"
In the current directory, a bunch of new folders will be made. Here's what to do with them:
Your file will be in "dist" and will be called script_name (without ".py", of course).
Once you have copied the executable out of it, the "dist" and "build" folders are unnecessary, as is __pycache__ if it comes up.
A file called "script_name.spec" will also be created.
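If the bundle is still larger than you'd like, PyInstaller's --exclude-module option can keep out packages that your script never imports. A hedged example (the module names below are purely illustrative; exclude whatever your own build pulls in unnecessarily):
pyinstaller --onefile --exclude-module tkinter --exclude-module unittest splash_gui.py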

Related

ImportError: No module named unable to import toplevel directory of module in python

My Directory structure:
├── common
│   ├── common.py
│   └── __init__.py
├── project1
│   ├── __init__.py
│   └── scripts
│   ├── example_import.py
│   └── __init__.py
└── project2
├── __init__.py
└── scripts
└── __init__.py
I need to import the common/common.py module in the project1/scripts/example_import.py file.
example_import.py:
import sys
sys.path.append("../common")
from common import Test
print("Module Not import error")
Error:
Traceback (most recent call last):
File "project1/scripts/example_import.py", line 3, in <module>
from common import Test
ImportError: No module named common
How do I fix this issue?
Understanding how Python imports work is tricky in the beginning, but it makes sense once you see how the pieces fit together.
There are different ways to fix your import issue. I would not recommend messing with sys.path. Depending on where you are calling your script from, you have multiple choices at hand.
Your Directory structure.
├── common
│   ├── common.py
│   └── __init__.py
├── project1
│   ├── __init__.py
│   └── scripts
│   ├── example_import.py
│   └── __init__.py
└── project2
├── __init__.py
└── scripts
└── __init__.py
From the root of the directory,
python project1/scripts/example_import.py
will work, assuming that the import in example_import.py looks like
from common.common import Test
If you want to use from common import Test, you need to add from common.common import Test to the __init__.py file in the common folder.
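As a minimal sketch (assuming common.py really does define Test, as the question's import implies), common/__init__.py would contain:
# common/__init__.py
# re-export Test so that `from common import Test` works
from common.common import Test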
You can also use PYTHONPATH environment variable to tell Python where to look for the base module.
PYTHONPATH=/pathto/basefolder python folder/filesx.py
Another way is to create a setup.py at the project base and do a development installation with python -m pip install --editable . in the environment you are working in.
Example of setup.py:
#!/usr/bin/env python
from distutils.core import setup

setup(name='projectX',
      version='1.0',
      description='The World is not Enough',
      author='James Bond',
      author_email='agent@007.net',
      url='https://www.python.org/sigs/distutils-sig/',
      packages=['common', 'project1', 'project2'],
      )
See Python Documentation for more setup.py options.
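Once the package is installed in editable mode, the import resolves from any directory in that environment, for example (again assuming Test is defined in common.py):
# works from anywhere after `python -m pip install --editable .`
from common.common import Test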

importing python modules and packages in different sub-directories of the same project

I'd like to figure out the cleanest, and preferably self-contained, way to use my packages in scripts that live in a different directory from the package itself.
The example problem is as follows:
The modules in lib need to be both importable and runnable as scripts.
My project directory is as below and I'm having two issues:
in lib/api.py, I want to read in data_files/key.txt correctly when api.py is called or imported
in testing_script.py I want to import and use lib/get_data.py
I can't seem to find a clean way to do this. Does this mean my project is structured in a non-Pythonic way?
Thanks for the help.
my-project-git
├── LICENSE
├── README.md
├─── my_project
│   ├── data_files
│   │   ├── key.txt
│   │   ├── mappings.csv
│   ├── lib
│   │   ├── __init__.py
│   │   ├── api.py
│   │   └── get_data.py
│   └── test
│   ├── __init__.py
│   └── testing_script.py
├── requirements.txt
└── setup.py
As far as I know, there isn't a single "Pythonic" way to structure your project.
This is what Kenneth Reitz recommended in 2013, and it's how I use it: https://www.kennethreitz.org/essays/repository-structure-and-python.
README.rst
LICENSE
setup.py
requirements.txt
sample/__init__.py
sample/core.py
sample/helpers.py
docs/conf.py
docs/index.rst
tests/test_basic.py
tests/test_advanced.py
Inside sample (my_project in your case) you can separate into categories as you like. E.g. Utils (common functions), Database (read, write), View (user commands), etc. It depends on your project.
As for calling modules at the same level, you should define them in the __init__.py file of the top-level package, which is sample in this case.
For example:
__init__.py in sample (my_project in your case):
from sample.core import a_Class
from sample.core import a_function
from sample.core import anything
then from /test/test_basic.py you do:
from sample import a_Class
# or import sample
a = a_Class() # use the class from core.py
# or a = sample.a_Class()
Take a look at the sample module repository: https://github.com/navdeep-G/samplemod

Packaging python with cython extension

I'm trying to build a package that uses both Python and Cython modules. The problem I'm having concerns imports after building and installing: I'm not sure how to make files import from the .so file generated by the build process.
Before building my folder structure looks like this
root/
├── c_integrate.c
├── c_integrate.pyx
├── cython_builder.py
├── __init__.py
├── integrator_class.py
├── integrator_modules
│   ├── cython_integrator.py
│   ├── __init__.py
│   ├── integrator.py
│   ├── numba_integrator.py
│   ├── numpy_integrator.py
│   ├── quadratic_error.png
│   ├── report3.txt
│   ├── report4.txt
│   └── report5.txt
├── report6.txt
├── setup.py
└── test
├── __init__.py
└── test_integrator.py
Building with python3.5 setup.py build gives this new folder in root
root/build/
├── lib.linux-x86_64-3.5
│   ├── c_integrate.cpython-35m-x86_64-linux-gnu.so
│   ├── integrator_modules
│   │   ├── cython_integrator.py
│   │   ├── __init__.py
│   │   ├── integrator.py
│   │   ├── numba_integrator.py
│   │   └── numpy_integrator.py
│   └── test
│   ├── __init__.py
│   └── test_integrator.py
The setup.py file looks like this
from setuptools import setup, Extension, find_packages
import numpy
setup(
name = "integrator_package",
author = "foo",
packages = find_packages(),
ext_modules = [Extension("c_integrate", ["c_integrate.c"])],
include_dirs=[numpy.get_include()],
)
My question is then: how do I write import statements for the functions from the .so file in integrator_class.py in root, and in cython_integrator and test_integrator located in the build directory? Appending to sys.path seems like a quick and dirty solution that I don't much like.
EDIT:
As pointed out in the comments, I haven't installed the package. This is because I don't know what to write to import from the .so file.
In no specific order:
The file setup.py is typically located at the top level of a project, next to (not inside) the package directory. Example:
library_name/
    __init__.py
    file1.py
setup.py
README
Then, the build directory appears alongside the project's source and not in the project source.
To import the file c_integrate.cpython-35m-x86_64-linux-gnu.so in Python, just import "c_integrate". The rest of the naming is taken care of automatically, as it is just platform information. See PEP 3149.
A valid module is one of
a directory with a modulename/__init__.py file
a file named modulename.py
a file named modulename.PLATFORMINFO.so
of course located in the Python path. So there is no need for a __init__.py file for a compiled Cython module.
For your situation, move the Cython code into the package directory and either do a relative import (from . import c_integrate) or an absolute from integrator_modules import c_integrate, where the latter only works when your package is installed.
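As a small sketch, the two import styles look like this (module names follow the question's layout; no function names inside the extension are assumed):
# inside another module of integrator_modules/: relative import of the sibling extension
from . import c_integrate

# from outside the package (e.g. test/test_integrator.py), once the package is installed:
from integrator_modules import c_integrate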
Some of this information can be found in my blog post on Cython modules: http://pdebuyl.be/blog/2017/cython-module.html
I believe that this should let you build a proper package, comment below if not.
EDIT: to complete the configuration (see comments below), the poster also:
Fixed the module path in the setup.py file so that it is the full module name starting from the PYTHONPATH: Extension("integrator_package.integrator_modules.c_integrator", ["integrator_package/integrator_modules/c_integrator.c"]) instead of Extension("c_integrate", ["c_integrate.c"]).
Cythonized the module, built it, and used it with the same Python interpreter.
Further comment: the setup.py file can cythonize the file as well. Include the .pyx file instead of the .c file as the source.
cythonize(Extension('integrator_package.integrator_modules.c_integrator',
                    ["integrator_package/integrator_modules/c_integrator.pyx"],
                    include_dirs=[numpy.get_include()]))
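Putting this together, a minimal setup.py along those lines might look like the sketch below (it assumes the layout from the edit above: a package named integrator_package with c_integrator.pyx in integrator_package/integrator_modules/):
from setuptools import setup, Extension, find_packages
from Cython.Build import cythonize
import numpy

setup(
    name="integrator_package",
    author="foo",
    packages=find_packages(),
    # cythonize compiles the .pyx source into the extension module
    ext_modules=cythonize(
        Extension(
            "integrator_package.integrator_modules.c_integrator",
            ["integrator_package/integrator_modules/c_integrator.pyx"],
            include_dirs=[numpy.get_include()],
        )
    ),
)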

Import packages from current project directory in VScode

When I build or debug a particular file in my Python project (which imports a user defined package) I get an import error. How can I solve this problem?
test.py
def sum(a,b):
return a+b
test2.py
from test import sum
sum(3,4)
The above code gives an import error: cannot import test.
Directory tree
├── graphs
│   ├── Dijkstra's\ Algorithm.py
│   ├── Floyd\ Warshall\ DP.py
│   ├── Kruskal's\ algorithm.py
│   ├── Prim's\ Algoritm.py
│   ├── __init__.py
│   └── graph.py
├── heap
│   ├── __init__.py
│   ├── heap.py
│   └── priority_queue.py
Trying to import in graphs:
from heap.heap import Heap
Regarding the heap import, make sure that you are running from the project root folder.
If the test.py files are in the same folder, try adding an empty __init__.py file to that folder.
The __init__.py files are required to make Python treat the directories as containing packages; this is done to prevent directories with a common name, such as string, from unintentionally hiding valid modules that occur later (deeper) on the module search path. In the simplest case, __init__.py can just be an empty file, but it can also execute initialisation code for the package or set the __all__ variable, described later.
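A minimal sketch of how this looks in practice (it assumes empty __init__.py files in both graphs/ and heap/, and that you start the script from the project root, e.g. with python -m graphs.graph):
# graphs/graph.py
# the project root is on sys.path when run as `python -m graphs.graph`,
# so the sibling package `heap` is importable
from heap.heap import Heap

h = Heap()  # Heap comes from heap/heap.py, as in the question's import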

Cannot import modules using Virtualenv in Vagrant

This is my first time trying to set up a Vagrant environment or a Python virtualenv, so forgive me if I am missing something basic.
Right now, I ssh into my vagrant box and in the home directory I have placed my venv folder. I have run
source venv/bin/activate
From my home directory I move to /vagrant, and in there I have my project files laid out something like this:
├──project
├── LICENSE
│
├── project
│   │   ├── exceptions.py
│   │   ├── __init__.py
│   │   ├── resources
│   │   │   ├── base.py
│   │   │   ├── __init__.py
│   │   └── target
│   │   └── __init__.py
│   │   └── test.py
│   ├── README.md
My problem is I am unable to import my modules in different directories. For example, if I am in /vagrant/project/project/target/test.py and I attempt:
import project.exceptions
I will get the error
ImportError: No module named project.exceptions
If I am in the /vagrant/project/project directory and I run
import exceptions
that works fine.
I have read up on similar problems people have experienced on StackOverflow.
Based on this question: Can't import package from virtualenv, I have checked that my sys.executable path is the same both in my Python interpreter and when I run a script (home/vagrant/venv/bin/python).
Based on this question: Import error with virtualenv. I have run ~/venv/bin/python directly and attempted to import, but the import still fails.
Let me know if there is more information I can provide. Thank you.
You have two options:
You can install your project into the virtual environment by writing a setup.py file and calling python setup.py install (a minimal sketch follows after these options). See the Python Packaging User Guide.
You can set the PYTHONPATH environment variable to point to your project, like this:
$ export PYTHONPATH=$PYTHONPATH:/vagrant/project
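For the first option, a minimal setup.py sketch could look like this (the package name project follows the question's layout; the version and other metadata are placeholders):
# /vagrant/project/setup.py
from setuptools import setup, find_packages

setup(
    name="project",
    version="0.1",
    packages=find_packages(),  # picks up project/, project/resources, project/target
)
After python setup.py install (or pip install -e .) in the activated virtualenv, import project.exceptions should resolve from any directory.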
