I have a webapp that I made using Django that I need to deploy to many Raspberry Pi devices. I'm using Ansible to automate the deployment to the devices. While developing the app I used pipenv to manage my project dependencies in a virtual environment.
My question is: is it necessary to make a virtual environment on the actual Raspberry Pi devices when deploying, or can I just install all my necessary packages into the system environment? What are the advantages of making a virtual environment on the device?
Thank you.
Separating your app's dependencies from the system's is always a good idea. The overhead is minimal and may prevent issues in the future. It makes it much easier to tear down and rebuild your app if you ever need to, rather than potentially having to re-image the Raspberry Pi if anything goes wrong. It also means you can run separate apps on the Pi that don't need to share the same package versions, should you ever want to do that.
However, it's certainly possible to skip it, and you might get away with it without any issues. But if you want to improve the reliability and maintainability of your app and your Pi, and considering how easy a virtual environment is to set up and use, it seems like a poor design decision not to use one.
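Since you're already using Ansible, the virtual environment costs you exactly one extra task. A minimal sketch, assuming hypothetical paths and a requirements.txt exported from your pipenv setup (newer pipenv versions can produce one with pipenv requirements > requirements.txt):

    # Ansible's pip module creates the virtualenv if it doesn't exist yet
    # (all paths below are illustrative)
    - name: Install app dependencies into a virtualenv on the Pi
      ansible.builtin.pip:
        requirements: /home/pi/myapp/requirements.txt
        virtualenv: /home/pi/myapp/venv
        virtualenv_command: python3 -m venv

Tearing the app down later is then just a matter of deleting that one directory.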
Ideally I’d like to build a package to deploy to Debian. The installation process would ideally check that the system has the required dependencies installed, as well as configure cron jobs, set up users, etc.
I’ve tried googling around and I understand a .deb is the format I can distribute in - but that’s as far as I got, since I’m now getting confused by the tooling I need to get up to speed with. The other option is to just git clone on the server and configure the environment manually… but that’s not preferable, for obvious reasons.
How can I get started with building a Debian package, and is that the right direction for deploying web applications? If anyone could point me in the right direction tools-wise, and perhaps to a tutorial, that would be massively appreciated :) Also, if you advise just taking the simple route with git, I’m happy to take that advice as well if you explain why. If it makes any difference, I’m deploying one Node.js and one Python web application.
You can for sure package everything as a Linux application, for example using PyInstaller for your Python webapp.
Besides that, it depends on your use case.
I will focus on the second part of your question,
How can I get started with building a Debian package and is that the right direction for deploying web applications?
as that seems to be what you are after, considering you already mention alternatives to a .deb in your question.
I want to deploy 1-2 websites on my linux server
In this case, I'd say manually git clone and configure everything. It's totally fine when you know there won't be much more running on the server, and it's pretty hassle-free.
Why spend time packaging when no one will ever need the package again after you've installed it on your server?
I want to distribute my webapps to others on Debian
Here a .deb would make total sense. Plex Media Server and other applications are shipped like this, for example.
If the official Debian wiki is too abstract, there are also other, more hands-on guides to get you started quickly. You could also grab other .deb packages and extract them to see what they are made up of. You mentioned one of your websites is using Python, so I suspect it might be Flask or Django. If it's Django, there is an example repository you might want to check out.
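For example, to unpack an existing package, control files and all, and poke around in it:

    # Extract a .deb's payload and control files into ./extracted
    dpkg-deb -R some-package.deb extracted/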
I want to run a lot of stuff on my server / distribute to other devs and platforms / or scale soon
In this case I would make the webapps into Docker containers. They are easy to build, share, and deploy. On top of that, you can easily bundle all dependencies and scripts to make sure everything is set up right. They are also easy to run and stop, so you have a simple "on/off" switch if your server is running low on resources while you want to run something else. I highly favour this solution, as it also lets you easily control what is running on which IP as you deploy more and more applications to your server. But, as you pointed out, it comes with a bit of overhead and is not the best solution on weak hardware.
Also, if you know for sure what will be running on the server long term and don't need the flexibility, I would probably skip Docker as well.
I'm investigating ways to add Vagrant to my development environment. I do most of my web development in Python, and I'm interested in Python-related specifics, but the question is more general.
I like the idea of having all development-related stuff isolated in a virtual machine, but I haven't yet discovered an effective way to work with it. Basically, I see three ways to set it up:
1. Have all services (such as the database server, MQ, etc.) as well as the application under development running in the VM. The developer would ssh to the VM and edit the sources there, running the app, tests, etc., all in an ssh terminal.
2. Same as 1), but edit the sources on the host machine in a mapped directory with a normal GUI editor. Run the application and tests on the Vagrant box via ssh. This seems to be the most popular way to use Vagrant.
3. Host only the external services in the VM. Install the app dependencies into a virtualenv on the host machine and run the app and tests from there.
All of these approaches have their own flaws:
1. Developing in a text console is just too inconvenient, and this is the show-stopper for me. While I'm an experienced Vim user and could live with it, I can't recommend this approach to anyone used to working in a graphical IDE.
2. You can develop with your familiar tools, but you cannot use autocompletion, since all the Python libs are installed in the VM. Your tracebacks will point to non-local files, you will not be able to open library sources in your editor, and ctags will not work.
3. You lose most of the "isolation" feature: you have to install all the compilers and *-dev libraries yourself in order to install the Python dependencies and run the app. That is pretty easy on Linux, but it might be much harder to set it all up on OS X, and on Windows it is next to impossible, I guess.
So, the question is: is there any remedy for the problems of the 2nd and 3rd approaches? More specifically, how is it possible to create an isolated and easily replicable environment, and yet enjoy all the comfort of developing on the host machine?
In most IDEs you can add "library" paths that are outside the project so that your code completion etc. works. As for the tracebacks, I'm unfamiliar with Python, but this sounds like an issue that is resolved by "mapping" paths between the server and the dev machine. This is generally the reason why #2 is often the way to go (except when you have a team willing to do #1).
Hey everyone,
I am expanding my team and have recently added an additional front-end engineer to work on my site. I am currently using Django to run my site, but my site is using a lot of plugins, namely: django-celery, django-mailer, django-notification, and django-socialregistration.
Let me describe my situation:
He is using Mac OS X, and I have no experience installing stuff or configuring things on that platform.
I believe that getting my backend to run on his computer might be somewhat troublesome, i.e. I have to install a bunch of plugins (which are not available via pip or easy_install, as they are the latest versions), and I have also made heavy modifications to django-socialregistration, which I am currently using by symlinking from my Python path to the modified code in my repos.
I tried to look into solutions like pip and easy_install, but I have not been able to get them to install code from GitHub.
I think the easiest way is to get my backend working on his computer so that he can just commit to the repo. Any ideas how I can make this easy?
Another, free, option is to use VirtualBox. I would recommend installing the same OS on it as your production server runs. Then he's developing in the same environment as the live site, and can check into the repo the same as you. In fact, you may want to do the same on your end: then both of your environments are the same, and also the same as the live site.
Get him to set up a virtual machine on his Mac, using VMware Fusion or Parallels, running the same operating system that you currently use for your back end. If he prefers developing with Mac tools, he can still do that by sharing his local changes to the virtual machine via a shared directory.
An alternative, if that's possible, would be to set up a testing/development environment on a machine with an OS you're familiar with, then install something like Dropbox on his local machine, where he can develop the frontend code, and install Dropbox in that other environment with the backend components. Dropbox would sync his local changes to the testing environment for him to run the code on.
That way, he would be able to use that environment to test his code, you wouldn't need to set up a backend on his machine (or keep it up to date), and you'd still be getting the same functionality.
Again, if that's an option.
I've made a small little "application" utilizing Django as a framework. This is an application that is not meant to be deployed to a server but run locally on a machine. Thus manage.py runserver works just fine.
I, as a developer, am comfortable with firing up the terminal, running python manage.py runserver and using the app.
But I have some Mac OS X and Windows friends wanting to use my application, and they don't have virtualenv, git or anything else.
Is there a way I can package this to be a standalone product? Of course it would depend on Python being installed on the system, but is it possible to package the virtualenv, with Django and everything, and just copy it to another system and make it work?
And maybe even run the runserver in some kind of daemon mode?
Use setuptools and easy_install.
Here's an introductory article.
Yes, you can package it. Django may not be the easiest framework to do this with, but the principles are the same for others. You need to make an installer that installs everything you need, and that installer needs to be different for each platform, such as Windows, Ubuntu, OS X, etc. That also means the answer is significantly different for each platform, and only half of the answer depends on Django. :-(
This kinda sucks, but that's life, currently. There is no nice platform-independent way to install software for end users.
I also haven't found the perfect solution for this yet.
My current approach is to provide a Docker image, because that's really easy for everyone to use. The image combines an Alpine base (because it's tiny), Python plus Django, and the app itself. You can also include a webserver like nginx and an app server like uwsgi or gunicorn and expose a port for it.
So in the end your user would just run the container and access the web app under http://localhost:9000/ or something like this.
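A minimal sketch of such an image, assuming a Django project called mysite and gunicorn as the app server (all names here are illustrative):

    # Small Python-on-Alpine base image
    FROM python:3-alpine
    WORKDIR /app
    # Install the app's dependencies plus the app server
    COPY requirements.txt .
    RUN pip install -r requirements.txt gunicorn
    COPY . .
    # Serve the Django project on the exposed port
    EXPOSE 9000
    CMD ["gunicorn", "--bind", "0.0.0.0:9000", "mysite.wsgi:application"]

Build it once with docker build, and anyone can run it with docker run -p 9000:9000 yourimage.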
This is really handy and also my preferred way of trying out some app I've found.
The "proper" way would be do build a package for every OS and distribution you are targeting and a simple zip bundle so people can also install the app manually.
To build the packages I suggest using fpm. It takes away most of the pain of packaging with each distribution's native tools. The packages would then depend on a proper application server like uwsgi or gunicorn.
So in the end you could install it with apt install your-package and it would depend on python-django, uwsgi, etc.
As for where to put all the files in the package, every distribution has its own way of doing it. I prefer putting everything under /usr/share/webapps/myapp/ and having the config under /etc/myapp/config.py or something like that.
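To give you an idea, a hypothetical fpm invocation for that layout could look like this (name, version, and dependencies are made up):

    # Build a .deb straight from a directory tree, declaring runtime dependencies
    fpm -s dir -t deb -n myapp -v 1.0.0 \
        --depends python3 --depends gunicorn \
        myapp/=/usr/share/webapps/myapp/ \
        config.py=/etc/myapp/config.py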
For Windows and macOS there are solutions like PyInstaller. I haven't used it for a Django app yet, but it should do the job. You should include an app server like uwsgi in there too.
In general you don't want to run the Django dev server in a production environment, so keep that in mind when packaging.
I hope that helps a bit.
There are several ways of doing this. I think you are looking more for build tools (which includes packaging) rather than just a Python solution. Here are a couple that I've used in the past:
zc.buildout: Used to build and deploy Python modules and applications, but is also able to work with other languages with a little massaging. Easy to use (for a build tool).
make: The software build classic. Works with practically all languages but a little archaic and hard to learn for a first timer.
The new Snap package manager for Linux should suit the task perfectly. It provides solutions to everything that has been quite a pain for Python apps so far (dependencies, interpreter, etc.) while avoiding the complexity of Docker.
These days Docker is probably a good answer.
The user needs to install Docker first, but it runs on Windows and OS X as well as Linux.
Your Dockerfile takes care of installing all the dependencies and then runs the dev server (or you could even run a proper webserver in the container).
I'm running Mac OS X Leopard and wanted to know the easy way to set up a web development environment using Python, MySQL, and Apache on my machine, which would allow me to develop on my Mac and then easily move it to a host in the future.
I've been trying to get mod_wsgi installed and configured to work with Django and have a headache now. Are there any web hosts that currently use mod_wsgi besides Google, so I could just develop there?
FWIW, we've found virtualenv [http://pypi.python.org/pypi/virtualenv] to be an invaluable part of our dev setup. We typically work on multiple projects that use different versions of Python libraries etc. It's very difficult to do this on one machine without some way to provide a localized, customized Python environment, as virtualenv does.
Most Python applications are moving away from mod_python. It can vary by framework or provider, but most development effort is going into mod_wsgi.
Using the WSGI standard will make your Python application server-agnostic, and allow for other nice additions like WSGI middleware. Other providers may only offer CGI (which won't scale well performance-wise) or FastCGI.
I've worked with Django using only the included server in the manage.py script and have not had any trouble moving to a production environment.
If you put your application in a host that does the environment configuration for you (like WebFaction) you should not have problems moving from development to production.
I run a Linux virtual machine on my Mac laptop. This allows me to keep my development environment and production environments perfectly in sync (and make snapshots for easy experimentation / rollback). I've found VMWare Fusion works the best, but there are free open source alternatives such as VirtualBox if you just want to get your feet wet.
I share the source folders from the guest Linux operating system on my Mac and edit them with the Mac source editor of my choosing (I use Eclipse / PyDev because the otherwise excellent TextMate doesn't deal well with Chinese text yet). I've documented the software setup for the guest Linux operating system here; it's optimized for serving multiple Django applications (including geodjango).
For extra added fun, you can edit your Mac's /etc/hosts file to make yourdomainname.com resolve to your guest Linux box's internal IP address, giving you a simple way to work on and test multiple web projects online or offline without too much hassle.
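A single line like this in /etc/hosts does it (the address is whatever internal IP your VM happens to get):

    # /etc/hosts on the Mac: point the domain at the Linux guest
    192.168.56.101   yourdomainname.com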
What you're looking for is mod_python. It's an Apache-based interpreter for Python. Check it out here:
http://www.modpython.org/
Google App Engine has done it for you. It has some limitations, but it works great, and it gives you a path to free hosting.
Of course Mac OS X, in recent versions, comes with Python and Apache. However, you may want more flexibility in the versions you use, or you may not like the tweaks Apple has made to the way they are configured. A good way to get a more generic set of tools, including MySQL, is to install them anew. This will help with your portability concerns. The frameworks can be installed relatively easily with one of these open-source package providers:
Fink
MacPorts
MAMP
mod_wsgi is really, really simple.
Pyerweb is a really simple (~90 lines including comments/whitespace) WSGI-compliant routing framework I wrote. Basically, the WSGI API is just a function that gets passed environ and wsgi_start_response, and returns a string.
environ is a dict with the request info; for example, environ['PATH_INFO'] is the request URI.
wsgi_start_response is a callable which you execute to set the status and headers:
wsgi_start_response(output_response, output_headers)
output_response is the string containing the HTTP status you wish to send ('200 OK' etc.), and output_headers is a list of tuples containing your headers (for example, [("Content-type", "text/html")] would set the content type).
Then the function returns a string containing your output. That's all there is to it!
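To make that concrete, here is a complete hello-world application following that convention (note that the WSGI spec proper has the function return an iterable of byte strings rather than a single string):

    # A minimal WSGI application: one function, no framework
    def application(environ, wsgi_start_response):
        # environ holds the request data, e.g. the requested path
        path = environ.get('PATH_INFO', '/')
        # send the status and headers first
        wsgi_start_response('200 OK', [('Content-type', 'text/plain')])
        # then return the body as an iterable of byte strings
        return [('You asked for %s\n' % path).encode('utf-8')]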
To run it using Spawning, you can just do spawn scriptname.my_wsgi_function_name and it will start listening on port 8080.
To use it via mod_wsgi, its documentation is good (http://code.google.com/p/modwsgi/wiki/QuickConfigurationGuide), and there is a Django-specific section.
The upside to using mod_wsgi is that it's the standard for serving Python web applications. I recently decided to play with Google App Engine, and was surprised when Pyerweb (which I linked to at the start of this answer) worked perfectly on it, completely unintentionally. I was even more impressed when I noticed that Django applications run on it too. Standardisation is a good thing!
You may want to look into web2py. It includes an administration interface to develop via your browser. All you need in one package, including Python.
Check out WebFaction—although I don't use them (nor am I related to / profit from their business in any way). I've read over and over how great their service is and particularly how Django-friendly they are. There's a specific post in their forums about getting up and running with Django and mod_wsgi.
Like others before me in this thread, I highly recommend using Ian Bicking's virtualenv to isolate your development environment; there's a dedicated page in the mod_wsgi documentation for exactly that sort of setup.
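As a sketch of what that looks like (the directory names are assumptions, and the exact option names have varied across mod_wsgi versions), a daemon-mode Apache config pointing at a virtualenv is roughly:

    # Run the app in its own daemon process, using the virtualenv's Python environment
    WSGIDaemonProcess myapp python-home=/Users/me/envs/myapp
    WSGIProcessGroup myapp
    WSGIScriptAlias / /Users/me/apps/myapp/wsgi.py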
I'd also urge you to check out pip, which is basically a smarter easy_install which knows about virtualenv. Pip does two really nice things for virtualenv-style development:
Knows how to install from source control (SVN, Git, etc...)
Knows how to "freeze" an existing development environment's requirements so that you can recreate that environment somewhere else, which is very, very nice for multiple developers or deployment.
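Both are one-liners with a current pip (the URL and project here are placeholders):

    # Install straight from source control
    pip install git+https://github.com/someuser/someproject.git
    # Capture the current environment's packages...
    pip freeze > requirements.txt
    # ...and recreate them in another virtualenv
    pip install -r requirements.txt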