I'm building a VS Code extension that assumes that the user already has a particular Python package installed on their system.
What are the possible approaches to installing that Python package on the user's machine when the user installs my VS Code extension?
I was able to think of a few ways to do so but couldn't find any resources on the internet validating my approaches:
creating a subprocess from inside the VS Code extension and calling pip install, assuming VS Code is even allowed to perform such an install on the user's machine without any special/admin rights?
adding that package as a dependency in the package.json so it gets installed when the user tries to install the extension? Is there a way to add Python packages as dependencies there?
This doesn't sound like a good idea, because different users will use different Python versions and Python environments. If you tie the extension to a specific Python version, the cost of using it becomes too high for many users.
Moreover, every new Python release would force you to update the extension's pinned dependency versions as well.
I suggest that you let users configure this through the Python interpreter that VS Code already recognizes; otherwise the audience your extension applies to will not be very large.
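If you do go the subprocess route, one pattern worth considering is probing the interpreter the user has selected in VS Code before ever offering to run pip. Note that python -m pip install --user installs into the user's site-packages and does not require admin rights. Below is a minimal check-script sketch; the package name mypackage is a placeholder for your real dependency:

# check_dep.py - run with the interpreter the user selected in VS Code
import importlib.util
import sys

PACKAGE = "mypackage"  # placeholder: substitute your actual dependency

# exit code 0 means the package is importable, 1 means it is missing
sys.exit(0 if importlib.util.find_spec(PACKAGE) is not None else 1)

The extension can then spawn this script, inspect the exit code, and only prompt the user to install when the dependency is actually missing.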
I'm trying to build a speech recognition application that works in a browser. Currently using pyodide with a web worker. I have my own package, built alongside pyaudio, that I use for the web worker.
Is there a way to include a brew installation, specifically portaudio, inside my Python package so that when I build the package, portaudio is included in the wheel file? I need portaudio included for this to work in the browser.
Thank you!
I'm understanding two different questions here, so I'll try to answer them both.
Can I have a Python build fetch a Homebrew package at build time?
To my knowledge, the answer is no. The Python distribution system is separate from Homebrew, and they can't interact in this fashion.
Even if they could, this wouldn't necessarily be desirable:
What happens if the user isn't on macOS (or Linux)? Then the build would fail.
The prefix Homebrew installs packages into isn't deterministic. The user might be using a custom prefix, or they might be on Apple Silicon (which has a different default prefix than Intel).
Your Python package might have difficulty locating the library.
What about if they don't have Homebrew installed? They might have another package manager like MacPorts or Fink, or maybe none at all.
Can I bundle portaudio into the build distribution?
Maybe? Even if you could, I almost certainly wouldn't recommend it.
Bundling dependencies increases the size of the distribution unnecessarily.
It would take a reasonable amount of effort to set up, assuming you can do it at all.
For all these reasons, most projects with a similar setup recommend installing certain libraries with the system package manager first, before building the Python source code.
This lets users choose whichever package manager they have installed, and it should also be a quick and painless process.
Therefore, just change your installation instructions to the following:
# On macOS
brew install portaudio
pip install ...
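If you also want the package itself to fail fast with a clearer message when portaudio is missing, a best-effort runtime check is possible. A sketch (the error message is illustrative, and find_library may miss non-standard install prefixes):

# portaudio_check.py - warn early if the portaudio shared library is absent
import ctypes.util

if ctypes.util.find_library("portaudio") is None:
    raise OSError("portaudio not found; install it first, "
                  "e.g. 'brew install portaudio' on macOS")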
I am new to Python having come from a proprietary compiled language (Xojo) that produces self-contained executables.
I understand that Python is an interpreted language. I understand that it requires an interpreter (let’s stick with CPython) and presumably it requires a number of accessory frameworks/C libraries in order to run. What I don’t understand is why it is so hard to create a folder containing the interpreter and all required files and libraries and simply bundle these up with my script to distribute.
I have discovered that there are a bunch of tools that attempt to do this (py2app, cx_freeze, etc) but many of them seem either broken, not maintained or really buggy.
I guess my question is: is there any documentation that describes the exact things I need to bundle with a “Hello World” script to get it running? This seems to be a really straightforward problem to solve but it hasn’t been (which suggests that it is far more complex than I appreciate).
My understanding is that PyInstaller works fine for making a single exe for distribution. But barring packaging tools like that, in general, there isn't an obvious "bare minimum"; the modules don't have documented dependencies, so it's usually best to ship the whole standard library.
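For completeness, the PyInstaller route is usually just two commands (hello.py standing in for your script):

pip install pyinstaller
pyinstaller --onefile hello.py
# the single-file executable appears in the dist/ folder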
Typically, if you need a redistributable version, you use the embedded Python zip redistributable, shipping Python alongside your main application.
The exact list of files/libraries depends on how the Python interpreter was built. On Windows, for example, you can obtain CPython binaries built with Visual Studio, Cygwin, or Mingw-w64, and they have different dependencies, of course. On Linux distributions, Python is normally installed by default.
Below is the list of .dll and .exe files that you can find in the official CPython release for Windows:

libcrypto-1_1-x64.dll
libssl-1_1-x64.dll
python.exe
pythonw.exe
python3.dll
python37.dll
sqlite3.dll
vcruntime140.dll
The total size of this ZIP file release is only 6.7 MB, so it would be easy to bundle it with your main executable. You can use whatever bundler you have at hand, not necessarily one designed for Python. Quoting from the documentation here:
extracting the embedded distribution to a subdirectory of the application installation is sufficient to provide a loadable Python interpreter.
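For a concrete picture, a bundled layout might look something like this (the names here are hypothetical; only the python\ contents come from the embeddable ZIP listed above):

MyApp\
    MyApp.exe           (your own launcher)
    app\main.py         (your script)
    python\             (the extracted embeddable ZIP)
        python.exe
        python37.dll
        python37.zip    (the pre-compiled standard library)

Your launcher would then start python\python.exe app\main.py.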
I feel the absolute best way for beginners to experience Python is Thonny and an ESP32.
A very good way to get started with Python is to use Anaconda https://www.anaconda.com/distribution/#download-section - this distribution contains the CPython interpreter and the most commonly used packages. For quite a while you will get along without installing any more packages.
To make a simple distributable piece of code, just include a requirements.txt along with your code, listing the packages (and versions) your code uses.
More on that here : https://www.idiotinside.com/2015/05/10/python-auto-generate-requirements-txt/
pip freeze outputs every package in your running environment, a superset of what your project actually needs, so ideally you would go with the second, smarter option in the link: pipreqs
So, in short, an additional requirements.txt alongside your code should be fine; people can then install all required packages with
pip install -r requirements.txt
and they are good to go to run your code.
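For illustration, a requirements.txt is just one package specifier per line; the names and versions below are only examples:

numpy==1.18.1
requests>=2.22
pandas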
For advanced scenarios you might want to look up creating virtual environments using conda.
What is a conda environment?
https://docs.conda.io/projects/conda/en/latest/user-guide/concepts.html#conda-environments
How to create/manage a conda environment
https://docs.conda.io/projects/conda/en/latest/user-guide/tasks/manage-environments.html
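The basic commands are short; for example (the environment name science is arbitrary):

conda create --name science python=3.8
conda activate science
conda install numpy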
All the best in your Python journey!
I thought this was an easy question, but I spent a lot of Google time looking for the answer with no luck. Hope you can help me.
My company has a large SW system on Windows which is portable: copy some folders, add a folder to the Windows path, and you are ready to go.
No registry, no DLLs in the system directory, no shortcuts, nothing!
I want to start using Python 3.x in our system under the same paradigm. I also want the ability to add third-party pip/conda packages to this distribution from time to time.
I don't want to install the Python MSI on all the systems.
I don't want to pack it into a standalone executable like py2exe and PyInstaller do, or use a special Python distribution like PyWin32.
Somehow, I couldn't find a formal official solution for that.
The closest thing was here, but pip is not supported, Python is minimal, and the system isolation is only "almost":
3.8. Embedded Distribution (new in version 3.5)
The embedded distribution is a ZIP file containing a minimal Python
environment. It is intended for acting as part of another application,
rather than being directly accessed by end-users.
When extracted, the embedded distribution is (almost) fully isolated
from the user’s system, including environment variables, system
registry settings, and installed packages. The standard library is
included as pre-compiled and optimized .pyc files in a ZIP, and
python3.dll, python36.dll, python.exe and pythonw.exe are all
provided. Tcl/tk (including all dependants, such as Idle), pip and the
Python documentation are not included.
Note The embedded distribution does not include the Microsoft C
Runtime and it is the responsibility of the application installer to
provide this. The runtime may have already been installed on a user’s
system previously or automatically via Windows Update, and can be
detected by finding ucrtbase.dll in the system directory. Third-party
packages should be installed by the application installer alongside
the embedded distribution. Using pip to manage dependencies as for a
regular Python installation is not supported with this distribution,
though with some care it may be possible to include and use pip for
automatic updates. In general, third-party packages should be treated
as part of the application (“vendoring”) so that the developer can
ensure compatibility with newer versions before providing updates to
users.
Any ideas?
Thanks.
How about... installing Python on one machine and replicating that installation on other computers?
Usually, I install Python in a Windows VirtualBox machine (Microsoft usually gives these away for free, for trying Windows or for testing old Internet Explorer versions).
Then I copy the Python directory to my Windows machine (the real host) and it usually works. This makes it possible to use various Python versions.
Did you try to complete the Python Embedded Distribution? It usually doesn't come with Tkinter, but I was once able to copy the files into this distribution in a way that worked. Try it too.
You can install pip with get-pip.py
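The usual recipe for adding pip to the embedded distribution (with the "some care" the quoted docs mention) is: first edit the pythonXX._pth file in the extracted folder and uncomment the import site line, then run get-pip.py with the embedded interpreter. The version number and package below are only examples:

rem inside the extracted embedded distribution, after editing python37._pth
python.exe get-pip.py
python.exe -m pip install requests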
I'm a Java/Scala dev transitioning to Python for a work project. To dust off the cobwebs on the Python side of my brain, I wrote a webapp that acts as a front-end for Docker when doing local Docker work. I'm now working on packaging it up and, as such, am learning about setup.py and virtualenv. Coming from the JVM world, where dependencies aren't "installed" so much as downloaded to a repository and referenced when needed, the way pip handles things is a bit foreign. It seems like best practice for production Python work is to first create a virtual environment for your project, do your coding work, then package it up with setup.py.
My question is, what happens on the other end when someone needs to install what I've written? They too will have to create a virtual environment for the package, but won't know how to set it up without inspecting the setup.py file to figure out what version of Python to use, etc. Is there a way for me to create a setup.py file that also creates the appropriate virtual environment as part of the install process? If not (or if that's considered a "no", as this respondent to this SO post stated), what is considered "best practice" in this situation?
You can think of virtualenv as an isolation layer for every package you install using pip. It is a simple way to handle different versions of Python and of packages. For instance, suppose you have two projects which use the same packages but in different versions. With virtualenv you can isolate those two projects and install the different package versions separately, not on your working system.
Now, let's say you want to work on a project with your friend. To have the same packages installed, you have to somehow share which packages, and which versions, your project depends on. If you are delivering a reusable package (a library), then you need to distribute it, and here is where setup.py helps. You can learn more in the Quick Start.
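A minimal setup.py is quite small; a sketch with placeholder values:

# setup.py - minimal packaging sketch; name, version and deps are placeholders
from setuptools import setup, find_packages

setup(
    name="mypackage",
    version="0.1.0",
    packages=find_packages(),
    install_requires=["requests>=2.0"],
)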
However, if you work on a web site, all you need is to put the library versions into a separate file. Best practice is to create separate requirements files for tests, development, and production. To see the format of the file, run pip freeze: you will be presented with a list of the packages currently installed on the system (or in the virtualenv). Put it into the file and you can install from it later on another PC, with a completely clean virtualenv, using pip install -r development.txt
One more thing: please do not pin strict versions the way pip freeze shows them; most of the time you want >= some minimum version. The good news is that pip handles dependencies on its own, so you do not have to list transitive dependencies there; pip will sort them out.
Speaking of deployment, you may want to check out tox, a tool for managing virtualenvs. It helps a lot with deployment.
Python's default package path points to the system environment, which requires administrator access to install into. Virtualenv localizes the installation to an isolated environment.
For deployment/distribution of package, you can choose to
Distribute the source code. The user needs to run python setup.py install, or
Pack your Python package and upload it to PyPI or a custom devpi server, so the user can simply run pip install <yourpackage>
However, as you noticed above: without virtualenv, the user needs administrator access to install any Python package.
In addition, the PyPI package world contains a certain number of badly tested packages that don't work out of the box.
Note: virtualenv itself is actually a hack to achieve isolation.
I'm setting up a new system for a group of Python rookies to do a specific kind of scientific work using Python. It's got 2 different pythons on it (32 and 64 bit), and I want to install a set of common modules that users on the system will use.
(a) Some modules work out of the box for both pythons,
(b) some compile code and install differently depending on the python, and
(c) some don't work at all on certain pythons.
I've been told that virtualenv (+ wrapper) is good for this type of situation, but it's not clear to me how.
Can I use virtualenv to set up sandboxed modules across multiple user accounts without having to install each module for each user?
Can I use virtualenv to save me some time for case (a), i.e. install a module, but have all pythons see it?
I like the idea of isolating environments, and then having them just type "workon science32", "workon science64", depending on the issues with case (c).
Any advice is appreciated.
With virtualenv, you can allow each environment to use globally installed system packages simply by omitting the --no-site-packages option. This is the default behavior.
If you want each environment to install all of its own packages, then use --no-site-packages and you will get a bare Python installation in which to install your own modules. This is useful when you do not want packages to conflict with system packages. I normally do this just to keep system upgrades from interfering with working code.
I would be careful about thinking of these as sandboxes, because they are only partially isolated: the paths to the Python binaries and libraries are modified to use the environment, but really that is all that is going on. Virtualenv does nothing to prevent running code from doing destructive things to the system. The best way to sandbox is to set Linux/Unix permissions properly and give users their own accounts.
EDIT For Version 1.7+
The default for 1.7 is to not include system packages, so if you want the behavior of using system packages, use the --system-site-packages option. Check the docs for more info.
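In command form, the two behaviors look like this (the environment names follow the question; mkvirtualenv and workon come from virtualenvwrapper):

virtualenv science64                          # isolated by default in 1.7+
virtualenv --system-site-packages science32   # shares globally installed packages
mkvirtualenv science64                        # virtualenvwrapper equivalent
workon science64                              # switch into an environment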