Does Python require the Microsoft Visual C++ Redistributable to run code?
I'm using PyInstaller to compile my .py code into an .exe. On some systems my .exe asks for the Microsoft Visual C++ Redistributable package in order to run.
Does PyInstaller include the Microsoft Visual C++ Redistributable files when building the .exe? If not, how can I include those files so that I don't need to install the redistributable package on other people's systems to run my software?
What other alternatives are there for building standalone software in Python? I've been reading about using other languages alongside Python.
I saw that Electron JS and Python can be used together to make desktop applications. But how would I distribute such an application as a standalone .exe?
Python itself does not depend on the presence of MSVC. You can download a portable Python package, and it will run wherever you copy it. Those are the embeddable ones from https://www.python.org/downloads/windows/
However, Python modules that contain native extension code can depend on MSVC on multiple levels:
if the native part comes in binary form (a .pyd file on Windows, which is actually a .dll), it may depend on other .dlls, depending on how it was built
if the native part comes as C/C++ source code, it will be built at installation time, typically via a "setup.py", and this step requires a C compiler installed on the system
PyInstaller is a different story. First of all, it has documentation that you may want to read. For example, the page What PyInstaller Does and How It Does It gives a direct answer to at least one of your questions:
Bundling to One File
PyInstaller can bundle your script and all its dependencies into a single executable named myscript (myscript.exe in Windows).
There is also a list of packages with known compatibility status and known issues: https://github.com/pyinstaller/pyinstaller/wiki/Supported-Packages, which you may find useful depending on what packages you need.
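The one-file mode described above boils down to a single command. This is a sketch assuming PyInstaller is installed and your script is named myscript.py (a placeholder):

```shell
# Bundle myscript.py and all its dependencies into one self-contained
# executable; the result appears as dist/myscript.exe on Windows.
pyinstaller --onefile myscript.py
```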
While it is not a duplicate, this question: How to package a linked DLL and a pyd file into one self contained pyd file? (and another one it links) may be interesting to read.
Of course Python requires the MSVC Redistributable: any native Windows program that uses C standard library functions requires it. Python naturally uses lots of them, and it also needs to provide a consistent runtime environment across all extension modules.
However, since Python 3.5 it is bundled with the installer, so there's no need to install it manually. Python installers prior to 3.5 don't include it, and I wasn't able to find any clarification on whether it is downloaded during installation or not.
By default, Python also requires extension modules to be compiled with the same (or, since 3.5, a compatible) version of MSVC as the interpreter itself. So except for some very rare cases, extension modules will also use the same redistributable.
The "embeddable" Python releases referred to by @tevemadar are NOT a "portable Python"! Here's what the documentation says about their usage:
It is intended for acting as part of another application, rather than being directly accessed by end-users.
Note: The embedded distribution does not include the Microsoft C Runtime and it is the responsibility of the application installer to provide this. The runtime may have already been installed on a user’s system previously or automatically via Windows Update, and can be detected by finding ucrtbase.dll in the system directory.
But you still don't need them if you use PyInstaller.
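The detection the quoted docs describe can be sketched in a few lines. The SystemRoot/System32 path logic is my assumption based on that quote; on non-Windows systems this simply returns False.

```python
# Sketch: detect the Universal C Runtime by looking for ucrtbase.dll
# in the Windows system directory, as the embeddable-distribution
# docs suggest.
import os

def has_ucrt(system_root=None):
    """Return True if ucrtbase.dll is present in the system directory."""
    root = system_root or os.environ.get("SystemRoot", r"C:\Windows")
    return os.path.exists(os.path.join(root, "System32", "ucrtbase.dll"))

print(has_ucrt())
```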
To check whether or not the redistributable files are included in your .exe, you could probably open it with any archiver software and see for yourself. My guess is that they can be included at least if Python was installed in single-user mode, since in that case they are installed into the Python directory as well.
Other than that, however, you should really ask your questions separately.
Related
I am taking a course in C, and I decided to write a little script in Python that would compile the homework programs we are given and run them against test files containing input and expected output.
It was straightforward to compile the code with the gcc or clang compilers from the script. I just needed to run shutil.which(compiler) where compiler is either gcc or clang to get the compiler's path, and then invoke it using the subprocess module to compile the code.
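The gcc/clang flow just described can be sketched as below. File names like "prog.c" are placeholders, and the helper is split out so the command construction is separate from the actual invocation.

```python
# Sketch: locate a compiler on PATH with shutil.which, then invoke it
# via subprocess, as described above.
import shutil
import subprocess

def build_command(compiler, source, output):
    """Build the argument list for a gcc/clang-style invocation."""
    path = shutil.which(compiler)
    if path is None:
        raise FileNotFoundError(f"{compiler} not found on PATH")
    return [path, source, "-o", output]

def compile_source(compiler, source, output):
    """Run the compiler and return the CompletedProcess result."""
    cmd = build_command(compiler, source, output)
    return subprocess.run(cmd, capture_output=True, text=True)
```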
The problem I am facing is handling the Microsoft MSVC compiler on Windows. Some students who use Windows like to use Visual Studio for writing code, and I can't blame them for that. I found it quite challenging to invoke the MSVC compiler from a Python script. Unlike gcc or clang, which are binaries on the system's PATH and thus discoverable using shutil.which, the MSVC compiler usually isn't found directly on the PATH. I know that on Windows, if you want to invoke the MSVC compiler (mainly accessible through the cl.exe binary), you have to open a special developer command prompt that sets up the environment for cl.exe to work correctly. This means that a simple shutil.which("cl.exe") won't work, and more effort is required to invoke cl.exe. I read that there are batch scripts, such as vcvarsall.bat, that set up the MSVC environment, but I couldn't find an easy and portable way to invoke them from a script.
The Microsoft documentation specifies the location of these scripts in the Visual Studio installation directory, but there doesn't seem to be a portable way to access these locations. What if Visual Studio isn't installed in C:\Program Files, but elsewhere? How can I determine the version and edition of Visual Studio programmatically? What if only the Visual C++ tools are installed and not Visual Studio itself? Where would the scripts be located then?
There doesn't seem to be an environment variable or other portable tool that holds this information, so it looks like any attempt to find these scripts will be system dependent and is doomed to fail on a system with a custom installation.
Even if I manage to get these scripts programmatically, the documentation doesn't give a clear example of how to use them. I have no experience with using the MSVC compilers outside Visual Studio, so I don't know how to use these scripts and invoke the cl.exe compiler.
Can anyone find an easy and portable way to invoke the MSVC compilers from a Python script? Visual Studio is an extremely common C/C++ IDE on Windows, so I would like my script to be capable of invoking the MSVC compilers. If you can provide an example of how this could be accomplished, I'd really appreciate it.
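One commonly suggested approach is to locate Visual Studio with Microsoft's vswhere.exe (shipped with the VS installer since VS 2017), derive the vcvarsall.bat path from the installation directory, run it, and capture the environment it sets. The sketch below assumes a Windows machine with VS installed; the default ProgramFiles(x86) path and the "x64" architecture are assumptions.

```python
# Sketch: find vcvarsall.bat via vswhere.exe and capture the MSVC
# environment it configures.  Only works on Windows with VS installed.
import os
import subprocess

VSWHERE = os.path.join(
    os.environ.get("ProgramFiles(x86)", r"C:\Program Files (x86)"),
    "Microsoft Visual Studio", "Installer", "vswhere.exe",
)

def parse_env_dump(text):
    """Parse `set` output ("NAME=value" lines) into a dict."""
    return dict(
        line.split("=", 1) for line in text.splitlines() if "=" in line
    )

def find_vcvarsall():
    """Ask vswhere for the latest VS install dir with the C++ tools."""
    install_dir = subprocess.check_output(
        [VSWHERE, "-latest", "-products", "*",
         "-requires", "Microsoft.VisualStudio.Component.VC.Tools.x86.x64",
         "-property", "installationPath"],
        text=True,
    ).strip()
    return os.path.join(install_dir, "VC", "Auxiliary", "Build", "vcvarsall.bat")

def msvc_env(arch="x64"):
    """Run vcvarsall.bat for the given arch and return the env it sets."""
    script = find_vcvarsall()
    out = subprocess.check_output(
        f'call "{script}" {arch} && set', shell=True, text=True
    )
    return parse_env_dump(out)
```

With the returned dict passed as the `env` argument of `subprocess.run`, cl.exe becomes invocable like any other compiler.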
To preface: my code works as I expect when compiled and run on Linux. However, this library needs to be compiled for use on a Windows machine. I looked into a couple of different options and decided that using Cygwin to compile for Windows seemed to be the correct choice. I'm using a setup.py file with the distutils.core library and compiling with python setup.py install. When compiling on Windows under Cygwin, it fails to find pthread.h, arpa/inet.h, netinet/in.h, and sys/socket.h. I was under the impression that Cygwin came prepackaged with these headers, which is why I chose it. The alternative to Cygwin is putting preprocessor conditionals everywhere and using Windows-specific libraries such as winsock2.h, which I want to avoid if at all possible. Is it possible to compile for Windows using Cygwin? If so, what have I done wrong that causes Cygwin not to recognize these headers?
You need to install the proper headers:
$ cygcheck -p usr/include/pthread.h
Found 9 matches for usr/include/pthread.h
cygwin-devel-3.0.7-1 - cygwin-devel: Core development files
..
cygwin-devel-3.1.6-1 - cygwin-devel: Core development files
...
so install the cygwin-devel package
To check all the shared libraries needed by the built dll, you can use cygcheck
$ cygcheck /usr/lib/python3.8/site-packages/Cython/Compiler/FlowControl.cpython-38-x86_64-cygwin.dll
D:\cygwin64\lib\python3.8\site-packages\Cython\Compiler\FlowControl.cpython-38-x86_64-cygwin.dll
D:\cygwin64\bin\cygwin1.dll
C:\WINDOWS\system32\KERNEL32.dll
C:\WINDOWS\system32\ntdll.dll
C:\WINDOWS\system32\KERNELBASE.dll
D:\cygwin64\bin\libpython3.8.dll
D:\cygwin64\bin\cygintl-8.dll
D:\cygwin64\bin\cygiconv-2.dll
D:\cygwin64\bin\cyggcc_s-seh-1.dll
Since it was built with Cygwin's Python, you also need to ship the Cygwin Python runtime alongside it...
Most important, I think, is to follow the instructions in the Python help or on the Python doc website under "Extending and Embedding the Python Interpreter" for the version you are building the extension for. For Windows, the build instructions identify the build environment used to create the binary package you download from python.org, usually something like VS2013 or VS2017. (As an aside, I think the Community editions have everything you need, and I don't think you actually have to use the Visual Studio GUI if you build with nmake from a CMD.EXE terminal.)
To build in Cygwin for use with a Windows version of Python, you may need to install and then use the x86_64-w64-mingw32-gcc etc. Cygwin packages to cross-compile non-Cygwin (i.e. pure Windows) executables and DLLs from Cygwin.
Binary extensions must be built against the source tree for a specific Python major.minor version and bitness. For Windows, you will need to build multiple versions of the extension, one for each major.minor/bitness combination of Python that will import it, e.g. 3.6, 3.7, 3.8, 3.9, in 32-bit and 64-bit. The extension code may not require changes between versions, but it still needs to be compiled with the right compiler and linked against exactly the same shared libraries (in this case .DLL files) as the Python executable. For instance, it must use exactly the same version of Microsoft's C runtime library DLL as the Python executable does. This is more sensitive and restrictive than on Linux, where you can more easily rebuild the Python executable and your extension with the same toolchain from your distro.
Can one use the same Python package (wheel file) for Linux, Windows, etc.? I'm asking because some packages include not only Python files but also .exe files, which I assume are Python code turned into executables (at least pip.exe and the Django admin tool are). .exe files are platform specific, just as there are separate Python interpreters for Windows and Linux, which raises the question.
Some wheel packages are cross-platform; some are platform-specific.
This information is included in the wheel's name. For example:
pytz-2018.4-py2.py3-none-any.whl (510kB)
That py2.py3 means that it works in any Python implementation, both Python 2.x and 3.x, and that none-any means that it works on any platform.
This one is more specific:
numpy-1.14.3-cp36-cp36m-macosx_10_6_intel.macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64.whl
That cp36-cp36m means that it works only in CPython 3.6, and that macosx_10_9_intel.macosx_10_9_x86_64.macosx_10_10_intel.macosx_10_10_x86_64 means that it's built for x86_64 macOS versions 10.9-10.10. (Roughly speaking, that's the minimum and recommended versions of macOS; most other platforms aren't quite as complicated.)
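The tags discussed above follow a fixed filename layout (name-version-pythontag-abitag-platformtag.whl, per the wheel spec), so splitting one apart is straightforward. This sketch ignores the optional build-tag field for simplicity.

```python
# Sketch: split a wheel filename into its compatibility tags.
# Layout: {name}-{version}-{python tag}-{abi tag}-{platform tag}.whl
def wheel_tags(filename):
    stem = filename[: -len(".whl")]
    name, version, py_tag, abi_tag, plat_tag = stem.split("-")
    return {"name": name, "version": version,
            "python": py_tag, "abi": abi_tag, "platform": plat_tag}

print(wheel_tags("pytz-2018.4-py2.py3-none-any.whl"))
# → {'name': 'pytz', 'version': '2018.4', 'python': 'py2.py3',
#    'abi': 'none', 'platform': 'any'}
```

Real tools should use the `packaging` library rather than hand-rolled splitting, since edge cases (build tags, normalization) are handled there.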
The most common reason for a package to be platform-specific is that it includes C API extension modules, as is the case with numpy. But there can be other reasons. For example, it may include a native executable to run as a subprocess, or it may use ctypes to access system APIs, etc.
A Python wheel is a packaging format, NOT an execution format. It's basically a .zip file.
Furthermore:
https://packaging.python.org/discussions/wheel-vs-egg/
...when the distribution only contains Python files (i.e. no compiled extensions), and is compatible with Python 2 and 3, it's possible for a wheel to be "universal", similar to an sdist.
From the same link:
A single wheel archive can indicate its compatibility with a number of Python language versions and implementations, ABIs, and system architectures.
In other words, the "wheel" format is designed to be as portable as possible ... and it also allows you to include platform-specific contents as required.
My program needs to import xxx.so, and this xxx.so file was compiled under Python 2.4.
I want to run my program under both Python 2.7 and Python 2.4, but there is an error when importing xxx.so under Python 2.7. I know this is due to the Python version mismatch.
My question: should I compile a separate xxx.so to match each Python version?
C extension modules are version specific. Each different version of Python requires a different version of the extension module. You need to compile the extension module from source linking against the headers and libraries for the target Python version.
Yes, you should compile it against each matching Python version, using the same compiler to ensure ABI compatibility.
It's not a problem on *nix platforms, where the compiler is bundled with the operating system, but it may give you headaches on Windows, where many different compilers are used (MinGW, Visual Studio, etc.).
The Python C API documentation describes the compilers used for the official builds.
Currently we keep project-related Python packages in a Subversion directory, so when someone adds or removes one, it is immediately available to the others.
This method works well for Python packages that are not platform dependent.
However, I do have quite a few that are platform dependent, and worse, installing them with easy_install requires a compiler to produce the .egg file.
I should mention that the package maintainers do not provide binaries for these modules, so I need to compile them manually. I tried adding the .egg file to the shared directory, but Python doesn't pick it up by default.
Since only a few people on the entire team have compilers, how can we share the packages in an easy way?
To make the problem even more complex, I should point out that even though 99% of the code runs on the same platform (Windows) with the same version of Python (2.5), we still have a few scripts that must run on another platform (Linux).