How should I deploy a web application to Debian? - python

Ideally I’d like to build a package to deploy to Debian, where the installation process would check that the system has the required dependencies installed, as well as configure cron jobs, set up users, etc.
I’ve tried googling around and I understand a .deb is the format I can distribute in, but that’s as far as I got, since I’m getting confused by the tooling I need to get up to speed with. The other option is to just git clone on the server and configure the environment manually… but that’s not preferable, for obvious reasons.
How can I get started with building a Debian package, and is that the right direction for deploying web applications? If anyone could point me in the right direction tools-wise, and perhaps to a tutorial, that would be massively appreciated :) If you advise just taking the simple route with git, I’m happy to take that advice as well if you explain why. If it makes any difference, I’m deploying one Node.js and one Python web application.

You can certainly package everything as a Linux application, for example using PyInstaller for your Python web app.
Besides that, it depends on your use case.
I will focus on the second part of your question,
How can I get started with building a Debian package and is that the right direction for deploying web applications?
as that seems to be what you are after, given that you are already considering alternatives to a .deb in your question.
I want to deploy 1-2 websites on my Linux server
In this case, I'd say manually git clone and configure everything. It's totally fine when you know there won't be much more running on the server, and it's pretty hassle-free.
Why spend time packaging when no one will ever need the package again after you've installed it on your server?
I want to distribute my webapps to others on Debian
Here a .deb would make total sense. For example, Plex Media Server and other applications are shipped like this.
If the official Debian wiki is too abstract, there are also other, more hands-on guides to get you started quickly. You could also grab other .deb packages and extract them to see what they are made up of. You mentioned one of your websites uses Python, so I suspect it might be Flask or Django. If it's Django, there is an example repository you might want to check out.
I want to run a lot of stuff on my server / distribute to other devs and platforms / or scale soon
In this case I would turn the web apps into Docker containers. They are easy to build, share, and deploy. On top of that, you can easily bundle all dependencies and scripts to make sure everything is set up right. They are also easy to run and stop, so you have a simple "on/off" switch if your server is running low on resources while you want to run something else. I highly favour this solution, as it also lets you easily control what is running on which IP as you deploy more and more applications to your server. But, as you pointed out, it comes with a bit of overhead and is not the best solution on weak hardware.
Also, if you know for sure what will be running on the server long term and don't need the flexibility, I would probably skip Docker as well.

Related

Deploy flask to a single server

Easy deployment of a Flask API, how do we do that? What is the best way?
I would like to deploy my Flask API on a single server, in the beginning. I just got started with a new project and I don't want to spend too much time on Docker and scalability. I am even a bit scared to use Docker in production at the beginning anyway.
With PHP there are a ton of options, I just saw they even have "deployer" now, which makes things even easier.
What I am looking for:
with one command, deploy my project to the server (using git). Depending on whether it's a "deploy dev" or "deploy prod" command, the server needs to know which branch to pull from. So I do need to merge branches before deploying.
create a new "release" folder on the server and symlink the www folder to the new release.
keep at least 5 release folders, remove the 5th on every deploy.
make it possible to rollback, so change symlink to a previous release folder.
I saw I can use Fabric, but it seems kinda complicated and perhaps overkill (like Capistrano). I searched quite a lot on the web but couldn't find a very clear answer/solution, or a solution most people agree on.
Any thoughts or people who would like share their experience?
I will post an answer, because I see I was given an answer 9 months ago already without the thread actually being answered.
As Sayse already said: there are plenty of ways, but Git and CI are both good ways to implement continuous deployment on a VPS.
I've been trying CI, with much success!
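For reference, if you do want the release-folder / symlink flow described in the question without a full CI pipeline, something along these lines with Fabric would roughly do it. This is only a sketch: it assumes Fabric 2.x, and the host name, repository URL, and paths are placeholders.

    # Sketch of the release/symlink deploy flow (Fabric 2.x assumed).
    # Host, repository, and paths below are placeholders.
    from datetime import datetime
    from fabric import Connection

    REPO = "git@example.com:me/myproject.git"   # hypothetical repository
    BASE = "/var/www/myproject"                 # hypothetical install location
    HOST = "myserver.example.com"               # hypothetical server

    def deploy(env="prod"):
        branch = "master" if env == "prod" else "develop"
        release = datetime.now().strftime("%Y%m%d%H%M%S")
        path = f"{BASE}/releases/{release}"
        c = Connection(HOST)
        c.run(f"git clone --branch {branch} --depth 1 {REPO} {path}")
        # point the live symlink at the new release
        c.run(f"ln -sfn {path} {BASE}/www")
        # keep only the 5 most recent releases
        c.run(f"cd {BASE}/releases && ls -1t | tail -n +6 | xargs -r rm -rf")

    def rollback():
        c = Connection(HOST)
        # re-point the symlink at the previous release
        c.run(f"cd {BASE}/releases && ln -sfn {BASE}/releases/$(ls -1t | sed -n 2p) {BASE}/www")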

Python Web Server - mod_wsgi

I have been looking at setting up a web server to use Python, and I have installed Apache 2.2.22 on Debian 7 Wheezy with mod_wsgi. I have gotten the initial page up and running, and Apache will display the contents of the wsgi file that I have in my directory.
However, I have been researching how to deploy a Python application and I have to admit, I find some of it a little confusing. I am coming from a background in PHP, where it is literally install what you need and you are up and running, with PHP processing the way it should.
Is it the same with Python? I can't seem to get anything to process outside of the wsgi file that I have set up. I can't import anything from other files without the server throwing a "500" error. I have looked on Google and Bing to try to find an answer, but I can't seem to find anything, or I don't know whether what I have been looking at is the answer.
I really appreciate any help that you guys can offer.
Thanks in advance! (If I need to post any coding, I can do that, I just don't know what you guys would need, if anything, as far as coding examples for this...)
Python is different from PHP in that PHP executes your entire program separately for each hit to your website, whereas Python runs "worker processes" that stay resident in memory.
You need some sort of web framework to do this work for you (you could write your own, but using someone else's framework makes it much easier). Flask is an example of a light one; Django is an example of a very heavy one. Pick one and follow that framework's instructions, or look for tutorials for that framework. Since the frameworks differ, most practical documentation on handling web services with Python is focused around a framework rather than just around the language itself.
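As a hedged illustration of the "light framework" route, a minimal Flask app is only a few lines; the file name and route here are just examples, not anything this answer prescribes.

    # app.py -- a minimal Flask application (illustrative names)
    from flask import Flask

    app = Flask(__name__)

    @app.route("/")
    def index():
        return "Hello from Flask"

    if __name__ == "__main__":
        # development server only; use a proper WSGI server in production
        app.run(debug=True)

The app object above is also the WSGI application, so a WSGI server can serve it directly (for example gunicorn app:app, assuming the file is named app.py).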
Nearly any Python web framework will have a development server that you can run locally, so you don't need to worry about deploying yet. When you are ready to deploy, Apache will work, although it's usually easier and better to use Gunicorn or another Python-specific web server, and then, if you need more web server functionality, set up nginx or Apache as a reverse proxy. Apache is a very heavy application to use for nothing but WSGI functionality. You also have the option of deploying to a PaaS such as Heroku (free for development work, costs money for production applications), which will handle a lot of sysadmin work for you.
As an aside, if you're not using virtualenv to set up your Python environment, you should look into it. It will make it much easier to keep track of what you have installed, to install new packages, and to isolate an environment so you can work on multiple projects on the same computer.

Python: tool to keep track of deployments

I'm looking for a tool to keep track of "what's running where". We have a bunch of servers, and on each of those a bunch of projects. These projects may be running on a specific version (hg tag/commit nr) and have their requirements at specific versions as well.
Fabric looks like a great start to do the actual deployments by automating the ssh part. However, once a deployment is done there is no overview of what was done.
Before reinventing the wheel I'd like to check here on SO as well (I did my best with Google but could be looking for the wrong keywords). Is there any such tool already?
(In practice I'm deploying Django projects, but I'm not sure that's relevant for the question; anything that keeps track of pip/virtualenv installs or server state in general should be fine)
many thanks,
Klaas
EDIT (temporary solution):
For now, we've chosen to simply store this information in a simple key-value store (in our case: the filesystem) that we take great care to back up (in our case: using a DVCS). We keep track of this store with the same deployment tool that we use to do the actual deploys (in our case: Fabric).
Passwords are stored inside a TrueCrypt volume that's stored inside our key-value store.
I will still gladly accept an answer if some kind of open-source solution to this problem pops up somewhere. I might share (part of) our solution somewhere myself in the near future.
pip freeze gives you a listing of all installed packages. Bonus: if you redirect the output to a file, you can use it as part of your deployment process to install all those packages (pip can programmatically install all packages from the file).
I see you're already using virtualenv. Good. You can run pip freeze -E myvirtualenv > myproject.reqs to generate a dependency file that doubles as a status report of the Python environment.
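As a rough illustration (not something the answers here prescribe), you could snapshot the environment on every deploy and drop the result into a per-host, per-project directory; all the paths and names below are placeholders.

    # Sketch: record what was deployed where by snapshotting pip freeze output.
    # The store path and naming scheme are placeholders; adapt to your setup.
    import datetime
    import socket
    import subprocess
    import sys
    from pathlib import Path

    STORE = Path("/srv/deploy-store")  # hypothetical directory, e.g. tracked in a DVCS

    def record_deployment(project, revision):
        frozen = subprocess.run(
            [sys.executable, "-m", "pip", "freeze"],
            capture_output=True, text=True, check=True,
        ).stdout
        stamp = datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
        target = STORE / socket.gethostname() / project
        target.mkdir(parents=True, exist_ok=True)
        (target / f"{stamp}-{revision}.reqs").write_text(frozen)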
Perhaps you want something like Opscode Chef.
In their own words:
Chef works by allowing you to write
recipes that describe how you want a
part of your server (such as Apache,
MySQL, or Hadoop) to be configured.
These recipes describe a series of
resources that should be in a
particular state - for example,
packages that should be installed,
services that should be running, or
files that should be written. We then
make sure that each resource is
properly configured, only taking
corrective action when it's
neccessary. The result is a safe,
flexible mechanism for making sure
your servers are always running
exactly how you want them to be.
EDIT: Note that Chef is not a Python tool; it is a general-purpose tool, written in Ruby (it seems). But it is capable of supporting various "cookbooks", including one for installing/maintaining Python apps.

Package a django project and its dependencies for a standalone "product"

I've made a small little "application" utilizing Django as a framework. This is an application that is not meant to be deployed to a server but run locally on a machine, so manage.py runserver works just fine.
I, as a developer, am comfortable with firing up the terminal, running python manage.py runserver and using the app.
But I have some Mac OS X and Windows friends who want to use my application, and they don't have virtualenv, git or anything else.
Is there a way I can package this to be a standalone product? Of course it would depend on Python being installed on the system, but is it possible to package the virtualenv, with Django and everything, and just copy it to another system and make it work?
And maybe even run the runserver in some kind of daemon mode?
Use setuptools and easy_install.
Here's an introductory article.
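For context, a minimal setup.py along those lines might look roughly like this; the project name, dependency list, and the launcher module are placeholders, not anything this answer specifies.

    # setup.py -- minimal packaging sketch (all names are placeholders)
    from setuptools import setup, find_packages

    setup(
        name="myapp",
        version="0.1",
        packages=find_packages(),
        include_package_data=True,
        install_requires=["Django"],  # pin versions as needed
        entry_points={
            "console_scripts": [
                # hypothetical launcher that starts the local server
                "myapp-serve = myapp.launcher:main",
            ],
        },
    )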
Yes, you can package it. Django may not be the easiest framework to do this with, but the principles are the same for others. You need to make an installer that installs everything you need, and that installer needs to be different for different platforms, such as Windows, Ubuntu, OS X, etc. That also means the answer is significantly different for each platform, and only half of the answer depends on Django. :-(
This kinda sucks, but that's life, currently. There is no nice platform independent way to install software for end users.
I also haven't found the perfect solution for this yet.
My current approach is to provide a Docker image, because that's really easy for everyone to use. It's built from an Alpine base image (because it's tiny) plus Python, Django, and the app itself. You can also include a web server like nginx and an app server like uwsgi or gunicorn, and expose a port for it.
So in the end your user would just run the container and access the web app at http://localhost:9000/ or something like that.
This is really handy and also my preferred way of trying out some app I've found.
The "proper" way would be do build a package for every OS and distribution you are targeting and a simple zip bundle so people can also install the app manually.
To build the packages I suggest using fpm. It takes most of the pain of doing the packaging with their native tools away. The packages would then depend on a proper application server like uwsgi or gunicorn.
So in the end you could then install it like apt install your-package and it would depend on python-django, uwsgi etc.
For the location and where to put all the files in the package every distribution has their own way of doing it. I prefer putting everything under /usr/share/webapps/myapp/ and having the config under /etc/myapp/config.py or something like that.
For Windows and macOS there are solutions like PyInstaller. I haven't used it for a Django app yet, but it should do the job (see the sketch below). You should include an app server like uwsgi in there too.
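One pattern for such a launcher is sketched below. Note this is my own hedged substitution: it uses waitress rather than uwsgi as the app server, simply because it is pure Python and easier to bundle with PyInstaller; the project name is a placeholder.

    # launcher.py -- sketch of a standalone entry point for a packaged Django app.
    # Placeholder names; assumes waitress is bundled as the app server.
    import os

    def main():
        os.environ.setdefault("DJANGO_SETTINGS_MODULE", "myproject.settings")
        from django.core.wsgi import get_wsgi_application
        from waitress import serve

        application = get_wsgi_application()
        serve(application, host="127.0.0.1", port=9000)

    if __name__ == "__main__":
        main()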
In general you don't want to run the Django dev server in a production environment, so keep that in mind when packaging.
I hope that helps a bit.
There are several ways of doing this. I think you are looking more for build tools (which includes packaging) rather than just a Python solution. Here are a couple that I've used in the past:
zc.buildout: Used to build and deploy Python modules and applications, but is also able to work with other languages with a little massaging. Easy to use (for a build tool).
make: The software build classic. Works with practically all languages but a little archaic and hard to learn for a first timer.
The new snap package manager for Linux should suit the task perfectly. It solves the things that have been quite a pain for Python apps so far (dependencies, interpreter, etc.) while avoiding the complexity of Docker.
These days Docker is probably a good answer.
The user needs to install Docker first, but it runs on Windows and OS X as well as Linux.
Your Dockerfile takes care of installing all the dependencies and then runs the dev server (or you could even run a proper web server in the container).

Setting up a Python web development environment on OS X

I'm running Mac OS X Leopard and wanted to know the easiest way to set up a web development environment with Python, MySQL, and Apache on my machine, which would allow me to develop on my Mac and then easily move it to a host in the future.
I've been trying to get mod_wsgi installed and configured to work with Django and have a headache now. Are there any web hosts that currently use mod_wsgi besides Google, so I could just develop there?
FWIW, we've found virtualenv [http://pypi.python.org/pypi/virtualenv] to be an invaluable part of our dev setup. We typically work on multiple projects that use different versions of Python libraries etc. It's very difficult to do this on one machine without some way to provide a localized, customized Python environment, as virtualenv does.
Most Python applications are moving away from mod_python. It can vary by framework or provider, but most development effort is going into mod_wsgi.
Using the WSGI standard will make your Python application server-agnostic and allow for other nice additions like WSGI middleware. Other providers may only offer CGI (which won't scale well performance-wise) or FastCGI.
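To give a rough idea of what "WSGI middleware" means in practice, here is a tiny illustrative sketch; the class and header names are made up.

    # Sketch of trivial WSGI middleware: wraps any WSGI app and adds a
    # header to every response. Purely illustrative.
    class AddHeaderMiddleware:
        def __init__(self, app, header=("X-Example", "1")):
            self.app = app
            self.header = header

        def __call__(self, environ, start_response):
            def custom_start_response(status, headers, exc_info=None):
                return start_response(status, headers + [self.header], exc_info)
            return self.app(environ, custom_start_response)

    # usage: application = AddHeaderMiddleware(application)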
I've worked with Django using only the server included via the manage.py script and have not had any trouble moving to a production environment.
If you put your application in a host that does the environment configuration for you (like WebFaction) you should not have problems moving from development to production.
I run a Linux virtual machine on my Mac laptop. This allows me to keep my development environment and production environments perfectly in sync (and make snapshots for easy experimentation / rollback). I've found VMWare Fusion works the best, but there are free open source alternatives such as VirtualBox if you just want to get your feet wet.
I share the source folders from the guest Linux operating system on my Mac and edit them with the Mac source editor of my choosing (I use Eclipse / PyDev because the otherwise excellent TextMate doesn't deal well with Chinese text yet). I've documented the software setup for the guest Linux operating system here; it's optimized for serving multiple Django applications (including geodjango).
For extra added fun, you can edit your Mac's /etc/hosts file to make yourdomainname.com resolve to your guest Linux box's internal IP address, giving you a simple way to work on and test multiple web projects online or offline without too much hassle.
What you're looking for is Mod_Python. It's an Apache-based interpreter for Python. Check it out here:
http://www.modpython.org/
Google App Engine has done it for you. Some limitations, but it works great, and it gives you a path to hosting for free.
Of course Mac OS X, in recent versions, comes with Python and Apache. However, you may want more flexibility in the versions you use, or you may not like the tweaks Apple has made to the way they are configured. A good way to get a more generic set of tools, including MySQL, is to install them anew. This will help with your portability issues. The frameworks can be installed relatively easily with one of these open-source package providers:
Fink
MacPorts
MAMP
mod_wsgi is really, really simple.
Pyerweb is a really simple (~90 lines including comments/whitespace) WSGI-compliant routing framework I wrote. Basically, the WSGI API is just a function that gets passed environ and wsgi_start_response, and it returns a string.
environ is a dict with the request info; for example, environ['PATH_INFO'] is the request URI.
wsgi_start_response is a callable which you execute to set the headers:
wsgi_start_response(output_response, output_headers)
output_response is the string containing the HTTP status you wish to send ("200 OK" etc.), and output_headers is a list of tuples containing your headers (for example, [("Content-type", "text/html")] would set the content type).
Then the function returns a string containing your output. That's all there is to it!
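Put together, a minimal WSGI application along those lines looks roughly like this (one caveat: modern WSGI servers expect the body to be an iterable of bytes rather than a plain string).

    # Minimal WSGI application matching the description above.
    def application(environ, wsgi_start_response):
        path = environ.get("PATH_INFO", "/")            # the request URI
        output_response = "200 OK"                      # HTTP status line
        output_headers = [("Content-type", "text/html")]
        wsgi_start_response(output_response, output_headers)
        body = "You requested %s" % path
        # return an iterable of bytes rather than a bare string
        return [body.encode("utf-8")]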
To run it using Spawning, you can just do spawn scriptname.my_wsgi_function_name and it will start listening on port 8080.
To use it via mod_wsgi, its documentation is good: http://code.google.com/p/modwsgi/wiki/QuickConfigurationGuide and there is a Django-specific section.
The upside to using mod_wsgi is that it's the standard for serving Python web applications. I recently decided to play with Google App Engine and was surprised when Pyerweb (which I linked to at the start of this answer) worked perfectly on it, completely unintentionally. I was even more impressed when I noticed Django applications run on it too. Standardisation is a good thing!
You may want to look into web2py. It includes an administration interface to develop via your browser. All you need in one package, including Python.
Check out WebFaction (although I don't use them, nor am I related to or profit from their business in any way). I've read over and over how great their service is and particularly how Django-friendly they are. There's a specific post in their forums about getting up and running with Django and mod_wsgi.
Like others before me in this thread, I highly recommend using Ian Bicking's virtualenv to isolate your development environment; there's a dedicated page in the mod_wsgi documentation for exactly that sort of setup.
I'd also urge you to check out pip, which is basically a smarter easy_install that knows about virtualenv. Pip does two really nice things for virtualenv-style development:
Knows how to install from source control (SVN, Git, etc...)
Knows how to "freeze" an existing development environement's requirements so that you can create that environment somewhere else—very very nice for multiple developers or deployment.
