I am currently working on a git repo located in
/home/user/bin/git/my-project
so I am saving the project as I work on it. But I have an issue: my project depends on Python libraries, crontab configuration, and .bashrc/.profile variables, and if I need to transfer all of this to another Linux system, I was wondering if it was possible to host the whole directory right from root, like:
git/home/user/bin/git/my-project
so that when I want to retrieve the exact same copy and update the second machine, it works exactly like the first? Is it possible? Are there any downsides?
Thanks in advance for your help.
EDIT: Thanks for your answers. By root I meant the directory which does not have a parent directory; on my Raspberry Pi that would be the place where you find etc, var, usr, boot, dev, and so on. So how would you proceed if you wanted to (for example) have another RPi with the exact same function without having to duplicate what's on the SD card, and have everyone working on a different system get the exact same functionality, up to date?
Thanks!
Downsides of making a git repo in the root (i.e. in /)
Some IDE applications become horribly slow, because they "autodetect" whether they are inside a git repo and then try to run git status to check for changes, which ends up indexing the whole filesystem. I remember a problem with Eclipse and NetBeans (I think NetBeans didn't start at all because it tried to open some git files read-write and they were owned by root, but I'm not sure).
How would you proceed if you wanted to (for example) have another RPi with the exact same function without having to duplicate what's on the SD card?
Create a package for the distribution installed on the platform. Create a publicly available repository with that package. Add that repository to the native package manager of that platform and install it like a normal package.
A package manager is the way of distributing stuff to systems. Package managers have dependency resolution, easy updates, pre-/post-install scripts and hooks, all the functionality you need to distribute programs. And package managers have great support for removing files and for updates.
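For example, on a Debian-based system the rollout for the end user could look roughly like this (the repository URL and package name are made-up placeholders):

# add the project's package repository, then install it like any other package
echo "deb [trusted=yes] https://repo.example.com/apt ./" | sudo tee /etc/apt/sources.list.d/my-project.list
sudo apt update
sudo apt install my-project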
my project depends on python libraries
So install them, via pip, pyenv, or best of all via the package manager available on the system, whichever is the system administrator's choice. Or request the user to set up a venv after installing your application.
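If you go the venv route, a minimal sketch (paths and file names are just placeholders) would be:

# create an isolated environment for the application and install its declared dependencies
python3 -m venv /opt/my-project/venv
/opt/my-project/venv/bin/pip install -r requirements.txt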
, crontab configuration
(Side note: see systemd timers).
, bashrc/.profile variables
If your library depends on my home customization, you have zero chances of installing it on any of my systems. There is zero need to touch home directory stuff if you want to distribute a library to systems. Applications have /etc for configuration, and most distributions have /etc/profile.d/ as the standard directory for drop-in shell configuration. Please research the Linux directory structure; see the Filesystem Hierarchy Standard and the XDG specifications.
Still, requesting users to do manual configuration after installing a package is normal. Some applications need environment variables to work.
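As a sketch, such a drop-in file (the file name and variable are made up for illustration) could look like:

# /etc/profile.d/my-project.sh -- sourced by login shells on most distributions
export MY_PROJECT_HOME=/opt/my-project
export PATH="$PATH:$MY_PROJECT_HOME/bin"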
If you want to propagate changes to different computers and manage that change, then use Ansible, Puppet, Chef and similar automation tools.
Related
Edit: I am self-taught, so I don't know the right terms.
This question is probably a duplicate, but I am not able to find the original. I need to include a Python pip package in my application, say numpy. I don't want the user to run pip install -r requirements.txt; I want to include the module when the user downloads the application.
You don't have to include pip when you build the app, because when you build it, all dependencies will be included in the app resources. I am not sure which kind of app you are building, but in general all dependencies are compiled to .pyc files and bundled when the app is built.
It's a whole world to explore, honestly. You need to look into the subjects of deployment and distribution; those will depend on the target operating systems... I would suggest investigating Docker, which allows you to package the OS (some Unix), the runtime and the dependencies into one "thing".
Vendoring
The only way I can think of besides pip to do this with pure source code would be to vendor the code you need. This means downloading the source code of the package (like numpy) and including it in your project.
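One way to fetch the sources for vendoring (numpy here only as the example package) is pip's download command; this is a rough sketch, not a full recipe:

# grab the source distribution (no wheels) into a vendor/ directory
pip download --no-binary :all: numpy -d vendor/
# unpack it so the package directory can be committed into your project tree
tar -xzf vendor/numpy-*.tar.gz -C vendor/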
This comes with many potential issues including:
licensing issues
issues with installation if there are CPython extension files that need to be compiled
having to manually update the files when new releases come out
increased download size for your source code
etc.
I would not recommend doing this unless there is a hard technical requirement for it, because it's a real hassle to deal with. If there is a particular reason, beyond having to run the extra command, why you want to avoid pip, mentioning it would help this question get a better answer.
Binary distribution
Also, depending on the app, you could look into something like PyInstaller to make a single .exe or binary file out of your app, so users don't even need Python or your dependencies installed. But be warned that this has its own set of complexities to look out for, like having to build for every target platform (Windows, Mac and Linux).
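For example, assuming your entry point is a script called yourapp.py (a placeholder name), a one-file build looks roughly like:

pip install pyinstaller
# bundle the script and its dependencies into a single executable under dist/
pyinstaller --onefile yourapp.py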
I have a Python app with a setup.py that works just fine for installing it through setuptools. I then package it up as DEB and PKGNG using the excellent Effing package management (FPM). I've also made some quick tests with setuptools-pkg and that seems to work too.
Now I have a need to distribute the packages including init scripts to start/stop/manage the service. I have my init scripts in the source repo and, according to what seems to be best practice, I'm not doing anything with them in setuptools and I'm handling them in the os-specific packaging: for debian-based systems I use the --deb-init, --deb-upstart and --deb-systemd FPM options as needed.
How can I build a FreeBSD package that includes the correct rc.d script, using FPM or through any other means?
All the examples I've seen add the rc.d script when building a package through the ports collection, but this is an internal app and is not going to be published to the Ports collection or on PyPI. I want to be able to check out the repository on a FreeBSD system, launch a command that gives me a package, distribute it to other FreeBSD systems, install it using pkg, and have my init script correctly deployed to /usr/local/etc/rc.d/<myappname>. There's no need to keep using FPM for that; anything works as long as it gives me a well-formed package.
I would highly suggest creating your package as if it were any other port, whether it is going to be published or not.
One of the advantages of doing this is that you could also include all your tests and automate the deployment, giving you out of the box the basis for a continuous integration/delivery setup.
Check out poudriere. You could indeed maintain a set of custom ports with your very own settings and distribute them across your environments without any hassle:
pkg install -r your-poudriere yourpkg
If this is too much, or doesn't adapt well to your use case, you can always fall back to Ansible, where you could create a custom rc.d script within a template of an Ansible role.
If you just want to build and deploy something, let's say a microservice, then pkg is probably not the best tool; maybe you just need a supervisor that works on all your platforms (sysutils/immortal) so that you could simply distribute your code and have a single recipe for starting/stopping the service.
nbari's answer is probably the Right Way™ to do this and I'd probably create my own "port" and use that to build the package on a central host.
At the time of my original question I had taken a different approach that I'm reporting here for the sake of completeness.
I am still building the application's package (i.e. myapp-1.0.0.txz) with fpm -s python -t freebsd, which basically uses Python's setuptools infrastructure to get the necessary information, and I don't include any rc.d file in it.
I also build a second package, which I will call myapp-init-1.0.0.txz, with the directory source type (i.e. fpm -s dir -t freebsd), and I include only the init script in that package.
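Roughly, the two builds look like this (the package name, version and rc.d script path below are placeholders for my real ones):

# application package built from the Python metadata
fpm -s python -t freebsd setup.py
# separate package that ships only the rc.d script
fpm -s dir -t freebsd -n myapp-init -v 1.0.0 packaging/myapp.rc=/usr/local/etc/rc.d/myapp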
Both packages get distributed to hosts and installed, thus solving my distribution issue.
First let me explain the current situation:
We have several Python applications which depend on custom (not publicly released) as well as generally known packages. These dependencies are all installed into the system Python installation. Distribution of the applications is done from source via git. All these computers are hidden inside a corporate network and don't have internet access.
This approach is a bit of a pain in the ass since it has the following downsides:
Libs have to be installed manually on each computer :(
How can we deploy an application better? I recently saw virtualenv, which seems to be the solution, but I don't quite see how yet.
virtualenv creates a clean Python instance for my application. How exactly should I deploy this so that users of the software can easily start it?
Should there be a startup script inside the application which creates the virtualenv during start?
The next problem is that the computers don't have internet access. I know that I can specify a custom location for packages (network share?) but is that the right approach? Or should I deploy the zipped packages too?
Would another approach be to ship the whole Python instance, so the user doesn't have to start up the virtualenv? In this Python instance all necessary packages would be pre-installed.
Since our apps are growing fast we have a fast release cycle (2 weeks). Deploying via git was very easy: users could pull from a stable branch via an update script to get the latest release. Would that still be possible, or are there better approaches?
I know these are a lot of questions. Hopefully someone can answer them or give me some advice.
You can use pip to install directly from git:
pip install -e git+http://192.168.1.1/git/packagename#egg=packagename
This applies whether you use virtualenv (which you should) or not.
You can also create a requirements.txt file containing all the stuff you want installed:
-e git+http://192.168.1.1/git/packagename#egg=packagename
-e git+http://192.168.1.1/git/packagename2#egg=packagename2
And then you just do this:
pip install -r requirements.txt
So the deployment procedure would consist of getting the requirements.txt file and then executing the above command. Adding virtualenv would make it cleaner, not easier; without virtualenv you would pollute the systemwide Python installation. virtualenv is meant to provide a solution for running many apps each in its own distinct virtual Python environment; it doesn't have much to do with how to actually install stuff in that environment.
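For machines without internet access, a sketch of the whole procedure (the network-share path is only an example) might be:

# clean environment for the app
virtualenv /opt/myapp/env
# install everything from a local directory of downloaded packages instead of PyPI
/opt/myapp/env/bin/pip install --no-index --find-links=/mnt/share/packages -r requirements.txt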
I'm finally making a legitimate Mac installer for my product. I've made a successful Windows installer with Inno Setup. I'm not sure how to do this on Mac.
This must happen:
- Python is installed
- Wx is installed
- PySerial is installed
- Program is copied
- Shortcut is made
I was doing this with Bash scripts before, but my customers have been complaining about those. Perhaps Xcode's PackageMaker is the solution? I know the recommended method is "just copy files", but these libraries must be installed somehow.
Thanks in advance!
Unless I am using Fink for installing packages, I usually just download the .tar.gz file from the source and install it from the terminal inside the unpacked folder containing the setup.py file. Terminal command:
sudo python ./setup.py install
If you would like, I can show you how to set up and use Fink, which is another easy way to install packages / libraries.
tl;dr: py2app will make a self-contained application bundle out of your Python scripts, making it real easy to employ the 'just copy files' installation process. The libraries you need are bundled into the app bundle itself, so they don't need to be installed systemwide.
Also check out Optimizing for Mac OS X from the wxPython wiki; it gives good tips on using py2app and other useful information on building a Mac-friendly Python application.
On OS X, programs are generally installed through one of three ways: the Mac App Store, a package installer (.pkg/.mpkg), or a copyable application bundle on a disk image (.app in a .dmg). Each has its strengths and weaknesses.
The Mac App Store requires that you subscribe to Apple's restrictions and requirements, and may be a good choice for apps expecting wider distribution (though, nowadays, it can be a good way to reach that wider audience easily). Copyable application bundles are by far the simplest installation method pre-App Store, and still remain one of the more popular ways to install programs. Finally, an Installer package is a user-friendly way to install more complex programs requiring more than a simple application bundle (e.g. system services, files in particular directories, system-dependent components, advanced installation logic, etc.). I should note, though, that there do exist complex applications which ship as application bundles and perform the bulk of their 'installation' at first run.
My experience with the Mac App Store is limited, so I won't talk about it. You can find more details at the official website.
Python is quite amenable to being shipped as an application bundle. You can use py2app to automatically create an application bundle for the program, and then drop that bundle into a Mac disk image (.dmg) using Disk Utility to create a complete installation package. This doesn't support making shortcuts, but on OS X it is much more usual for users to just drop the app into /Applications and make the necessary dock shortcut themselves if they want.
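A rough sketch of that workflow (the script name is a placeholder):

pip install py2app
# generate a setup.py preconfigured for py2app from your main script
py2applet --make-setup MyApp.py
# build the self-contained .app bundle into dist/
python setup.py py2app
# wrap the bundle in a disk image for distribution
hdiutil create -volname MyApp -srcfolder dist -ov -format UDZO MyApp.dmg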
The next way is to make a metapackage (.mpkg) which will be installed using the OS X standard Installer utility. This is in line with what users will expect from a Mac application. IIRC, both Mac Python and wxPython ship as .pkg already, which should make it easier to integrate them into a metapackage. bdist_mpkg can help with making packages for pyserial and your own program, which can be added to the metapackage. Finally, using the third-party dockutil script, you can automatically add a dock shortcut. Note, however, that installers generally do not add shortcuts to the dock; it is more usual to have a program installed to the /Applications directory.
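As a rough sketch of those last two pieces (assuming bdist_mpkg and dockutil are installed; the app name is a placeholder):

# build an Installer package for your own program from its setup.py
python setup.py bdist_mpkg
# optionally add a dock shortcut after installation (not typical installer behaviour)
dockutil --add /Applications/MyApp.app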
I'm in a bit of a discussion with some other developers on an open source project. I'm new to python but it seems to me that site-packages is meant for libraries and not end user applications. Is that true or is site-packages an appropriate place to install an application meant to be run by an end user?
Once you get to the point where your application is ready for distribution, package it up for your favorite distributions/OSes in a way that puts your library code in site-packages and executable scripts on the system path.
Until then (i.e. for all development work), don't do any of the above: save yourself major headaches and use zc.buildout or virtualenv to keep your development code (and, if you like, its dependencies as well) isolated from the rest of the system.
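For the virtualenv option, a minimal sketch (assuming your project has a setup.py) might be:

# isolated interpreter for development; nothing touches the system site-packages
virtualenv env
# install your project in editable mode so code changes are picked up immediately
env/bin/pip install -e .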
We do it like this.
Most stuff we download is in site-packages. These packages come from PyPI or SourceForge or some other external source; they are easy to rebuild, they're highly reused, and they don't change much.
Most stuff we write is in other locations (usually under /opt, or c:\opt) AND is included in the PYTHONPATH.
There's no great reason for keeping our stuff out of site-packages. However, our feeble excuse is that our stuff changes a lot. Pretty much constantly. To reinstall in site-packages every time we think we have something better is a bit of a pain.
Since we're testing out of our working directories or SVN checkout directories, our test environments make heavy use of PYTHONPATH.
The development use of PYTHONPATH bled over into production. We use a setup.py for production installs, but install to an alternate home under /opt and set the PYTHONPATH to include /opt/ourapp-1.1.
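In shell terms that production install is roughly (the exact options depend on the setup; this is a sketch):

# install the package's modules directly under the alternate home
python setup.py install --install-lib=/opt/ourapp-1.1
# make the interpreter find them, as described above
export PYTHONPATH=/opt/ourapp-1.1:$PYTHONPATH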
The program run by the end user is usually somewhere in their path, with most of the code in the module directory, which is often in site-packages.
Many Python programs will have a small script located on the path, which imports the module and calls a "main" method to run the program. This allows the programmer to do some upfront checks and, if needed, modify sys.path to find the needed module. It can also speed up load time for larger programs, because only files that are actually imported will be loaded from bytecode.
Site-packages is for libraries, definitely.
A hybrid approach might work: you can install the libraries required by your application in site-packages and then install the main module elsewhere.
If you can turn part of the application to a library and provide an API, then site-packages is a good place for it. This is actually how many python applications do it.
But from a user's or administrator's point of view that isn't actually the problem. The problem is how we can manage the installed stuff: after I have installed it, how can I upgrade and uninstall it?
I use Fedora. If I use the Python that came with it, I don't like installing things to site-packages outside the RPM system. In some cases I have built an RPM myself to install things.
If I build my own Python outside RPM, then I naturally want to use Python's mechanisms to manage it.
A third way is to use something like easy_install to install such a thing, for example as a user, into the home directory.
So:
Allow packaging for distributions.
Allow selecting the Python to use.
Allow using a Python installed by the distribution where you don't have permissions for site-packages.
Allow using a Python installed outside the distribution where you can use site-packages.