I am wondering what my options are if I write a Python script that makes use of installed libraries on my computer (like lxml for example) and I want to deploy this script on another computer.
Of course having Python installed on the other machine is a given, but do I also have to install all the libraries I use in my script, or can I just use the *.pyc file?
Are there any options for making an installer for this kind of problem that copies all the dependencies along with the script in question?
I am talking about Windows machines by the way.
Edit -------------------
After looking through your answers I thought I should add this:
The thing is... the library required for this to work won't come along quietly with either pip or easy_install, on account of it requiring either a Windows installer (which I found after some searching) or being rebuilt from source on the target computer (which I'm trying to avoid).
I thought there was some way to port a .pyc file or something to another PC, so that the interpreter on that station would not require the dependencies, on account of the code already being translated to bytecode.
If this is not possible, can anyone show me a guide of some sort for making Windows package installers?
You should create a setup.py for use with setuptools. Dependencies should be listed in the install_requires field. Afterwards, if you install that package using easy_install or pip, the dependencies will be automatically downloaded and installed.
A basic setup.py looks as follows (note that install_requires is a setuptools feature; plain distutils does not support it):

from setuptools import setup

setup(
    name='My App',
    version='1.0',
    # ... snip ...
    install_requires=[
        "somedependency >= 1.2.3"
    ],
)
For differences between distutils and setuptools see this question.
I am developing a python package managed by poetry. The package has some complex requirements that are very difficult to install successfully on my system. I want the ability to install this in editable mode, with the ability to ignore dependencies (something which the developer of poetry frowns on). Unfortunately, I do not have the option of converting this package to a more mature packaging system.
Apparently the simple solution is to create a setup.py for the project and pip install -e that. Since unfortunately poetry has spread like a cancer to many projects now, I will have to employ such a workaround frequently. As such, I want to minimize the tedium by not copying over fields like description, which are irrelevant to developing the package.
What is the minimal setup.py file that I can use as a template for such poetry projects? I assume it must at least include the package name, version and location. Is there anything else?
I am also planning to not put any requirements in the setup.py file, since the whole point is to bypass the requirements defined by poetry and pyproject.toml. I am fine with manually resolving ModuleNotFoundError: No module named 'foo' errors by typing pip install foo.
It appears sufficient to create the following file:
from distutils.core import setup

setup(
    name="<PACKAGE_NAME>",
    version="<PACKAGE_VERSION>"
)
You also have to comment out the entire [build-system] block in the pyproject.toml file (see also How do I configure git to ignore some files locally? so you don't accidentally commit that change).
I think the package name and version can be pulled from the toml file automatically as well, but I am not sure right now how to do it.
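For instance, here is a hedged sketch that reads them from the [tool.poetry] table; it assumes Python 3.11+ (where tomllib is in the standard library) and poetry-style metadata:

from distutils.core import setup
import tomllib  # stdlib TOML parser, Python 3.11+

# Read the name/version that poetry keeps under [tool.poetry],
# so the minimal setup.py never goes stale.
with open("pyproject.toml", "rb") as f:
    meta = tomllib.load(f)["tool"]["poetry"]

setup(name=meta["name"], version=meta["version"])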
As the title says, I was wondering if Google Cloud Functions (where I currently have some pure python code) support cython'd modules?
I guess, more specifically, I'm asking about how I would use said modules. It's a private project; I'm using cython via setup.py and cythonize(files), which creates a bunch of shared-object modules (example.cpython-38-darwin.so, example1.cpython-38-darwin.so, example2.cpython-38-darwin.so).
Those are all built for Mac, so they won't work on Firebase.
Is there any way to get Cloud Functions to run the setup.py and compile some files? Or, better yet, is there some way to pre-compile those files for the appropriate OS and just deploy the shared libs?
I know a variety of libraries I'm installing via pip on Cloud Functions use Cython under the hood, but I don't really know the process of creating a wheel or other pip dependency...
You need to specify cython as a build-time dependency for your private project by adding a pyproject.toml file like:
[build-system]
requires = ["setuptools", "wheel", "cython"]

(setuptools and wheel need to be listed too: once a [build-system] table is declared, pip builds in an isolated environment that contains only what is listed there.) Then, when installing your package with a modern version of pip in the Cloud Functions runtime, cython will be installed into the build environment before your setup.py script is run.
I appear to have been able to (eventually) solve this... I might have several unnecessary steps, but I think they improve my overall build system (again, with the intention of being able to use cython'd shared libraries on Firebase).
From Docker (or in my case, a Linux VM), in my private repo, I cythonize the important code, and turn everything into a wheel. From here, I run auditwheel show over the wheel, to check if it adheres to the manylinux1 tag (or whichever manylinux I want). In this case, it did adhere to manylinux1 off the bat, so there was no need to repair the wheel or do any shenanigans this time.
Simplified repo layout:
... .py   # Other irrelevant .py files
magic.py  # Source code that needs to be cython'd
setup.py
Simplified setup.py:
from setuptools import setup, find_packages
from Cython.Build import cythonize

setup(
    name='magiclib',
    version='0.1.0',
    packages=find_packages(),
    ext_modules=cythonize(
        "magic.py",
        compiler_directives={'language_level': 3}
    )
)
Running python setup.py bdist_wheel creates a wheel named dist/magiclib-0.1.0-cp37-cp37m-linux_x86_64.whl
From here, I run auditwheel show dist/magiclib-0.1.0-cp37-cp37m-linux_x86_64.whl which shows me that the code already adheres to the manylinux1 tag, but I nonetheless run auditwheel repair dist/magiclib-0.1.0-cp37-cp37m-linux_x86_64.whl which creates wheelhouse/magiclib-0.1.0-cp37-cp37m-manylinux1_x86_64.whl.
At this point, I bring this wheel into my GCF project, and use:
pip install -t magiclib magiclib-0.1.0-cp37-cp37m-manylinux1_x86_64.whl
which basically unzips the wheel into a sub-directory that I can vendor and deploy to Google Cloud and call from my Functions.
Works fine on some of my simple code, and I'll be experimenting with some more involved code.
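For completeness, a sketch of how the vendored directory might be consumed from the Function's entry file (the import name magic follows from the setup.py above; the entry-point name and sys.path handling are assumptions):

# main.py -- assumes the wheel was unpacked into the "magiclib"
# sub-directory next to this file, as in the pip install -t step.
import os
import sys

# Put the vendored directory ahead of site-packages before importing.
sys.path.insert(0, os.path.join(os.path.dirname(__file__), "magiclib"))

import magic  # the cython'd extension module built earlier

def handler(request):
    # do_magic() is a hypothetical function inside the compiled module.
    return str(magic.do_magic())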
I have a python script where I am including a third party library:
from docx import Document
Now, I need to run this script in an environment where bare-bones python is present but not this library.
Installing this library in the target environment is beyond my scope, and I tried using distutils but couldn't go far with it. The target environment just needs to run the script, not install a package.
I come from a Java background, where I would have just exported a jar file that included all the libraries I needed. I need to do something similar with Python.
Edit: With distutils, I tried creating a setup.py:
from distutils.core import setup
import docx

setup(name='mymodule',
      version='1.0',
      py_modules=['mymodule', docx]
      )
But I am not sure this works.
PyInstaller won't work if you can't make a .pyc file, and you cannot make a .pyc file unless your code runs without fatal errors.
You could put the import in a try block that catches ImportError, but that will just result in NameErrors wherever the package is referenced. Long story short: if the package is integral to the script, no amount of avoiding it will fix your problem. You need the dependencies.
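To illustrate the pattern being warned against (docx used as the example, matching the question):

# The guarded import no longer kills the script at startup...
try:
    from docx import Document
except ImportError:
    pass

# ...but any code path that actually uses the package now fails with
# NameError: name 'Document' is not defined.
doc = Document()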
You said installing the package is beyond your scope; well then, it is time to expand your scope. docx is an open-source package you can find on GitHub.
You can download it and run its setup.py.
You can include the modules for docx in your application. Just distribute them together.
But docx depends on the lxml package, which contains compiled extensions and needs to have its setup run on the target machine. You can't just copy it over.
I'm not sure PyInstaller supports docx, especially as it has that non-Python dependency.
Really using pip or easy_install is the way to go.
PyInstaller is a program that converts (packages) Python programs into stand-alone executables, under Windows, Linux, Mac OS X, Solaris and AIX.
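For reference, a typical invocation (the script name is a placeholder) is:

pyinstaller --onefile myscript.py

which produces a single self-contained executable in the dist directory.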
I'm new to python and I'm writing my first program. I would like after I finish to be able to run the program from the source code on a windows or mac machine. My program has dependencies on 3rd party modules.
I read about virtualenv but I don't think it helps me because it says it's not relocatable and it's not cross-platform (see Making Environments Relocatable http://pypi.python.org/pypi/virtualenv).
The best scenario is to install the 3rd party modules locally in my project, aka xcopy installation.
I will be really surprised if Python doesn't support this easily, especially since it promotes simplicity and frictionless programming.
You can do what you want, you just have to make sure that the directory containing your third-party modules is on the python path.
There's no requirement to install modules system-wide.
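For example, a minimal sketch of that xcopy-style layout, assuming the third-party modules have been copied into a lib directory next to the entry script (the directory and module names are placeholders):

# run.py -- entry point; bundled third-party modules live in ./lib
import os
import sys

HERE = os.path.dirname(os.path.abspath(__file__))
sys.path.insert(0, os.path.join(HERE, "lib"))

# Imports below now resolve from ./lib before site-packages.
import somemodule  # hypothetical bundled dependency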
Note, while packaging your whole app with py2exe may not be an option, you can use it to make a simple launcher environment. You make a script which imports your module/package/whatever and launches the main() entry point, and package that with py2exe while keeping your application code outside it, as Python code or an egg. I do something similar where I read a .pth text file to learn what paths to add to sys.path in order to import my application code.
Simply put, that's generally not how Python works. Modules are installed site-wide and used that way. Are you familiar with pip and/or easy_install? Those plus PyPI let you automatically install dependencies, whatever you need.
If you want to create a standalone executable typically you'd use py2exe, py2app or something like that. Then you would have no dependencies on python at all.
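As a sketch, a classic py2exe setup.py for a console script looks like the following (the script name is a placeholder); it is built by running python setup.py py2exe:

# setup.py for py2exe (Windows only)
from distutils.core import setup
import py2exe  # importing it registers the py2exe command

setup(console=['myscript.py'])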
I also found out about zc.buildout, which can be used to include dependencies in an automatic way.
I am a newbie.
I am building an rpm package for my own app and decided to use distutils to achieve it. I managed to create some substitute for %post by using advice from this website, which I really am thankful for, but I am having problems with %postun.
Let me describe what I have done. In setup.py I run a command that creates a symbolic link which is needed to run the application. It works fine, but the problem is that when I remove the rpm, the link stays there. So I figured that I should use %postun in the spec file. My question is: is there a way to do this in setup.py, or do I have to edit the spec file manually?
Please advise, or point me to some manuals or anything.
Thank you
Yes, you can specify post-install and post-uninstall scripts; all you need to do is declare, in the bdist_rpm entry of the options argument, the files you want to use:

setup(
    ...
    options={'bdist_rpm': {'post_install': 'post_install',
                           'post_uninstall': 'post_uninstall'}},
    ...)
In the post_uninstall file, put the code you need to remove the link, something like:
rm -f /var/lib/mylink
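A matching post_install file would create the link at install time; assuming the same link path (the target is a placeholder), something like:

ln -sf /usr/lib/myapp/myapp /var/lib/mylink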
Neither distutils nor setuptools has uninstall functionality.
At some point, the python community agreed that uninstall should be handled by the packaging system. In this case you want to use rpm, so there is probably a way inside of rpm system to remove packages, but you will not find that in distutils or setuptools.
At PyCon 2009, there was a presentation on distutils and setuptools. You can find all of the videos here:
Eggs and Buildout Deployment in Python - Part 1
Eggs and Buildout Deployment in Python - Part 2
Eggs and Buildout Deployment in Python - Part 3
There is a video called How to Build Applications Linux Distributions will Package. I have not seen it, but it seems to be appropriate.