So I'm setting up a server by myself, and I've run into lots of different approaches to where the packages should be installed.
I'm thinking of the core packages like nginx, gunicorn, python3, postgresql and so on.
I learned that setting up a venv (virtual environment) is a good thing, so I can have several projects running with different versions of packages.
But it's a bit confusing which ones are not supposed to go inside the venv.
Some install PostgreSQL outside the venv but psycopg2 inside, some put gunicorn inside the venv, and so on.
Are there any best practices or rules that are safe to follow?
For info: I'm setting up an Ubuntu 16.04 server with Nginx, gunicorn, PostgreSQL, psycopg2 and python3.
This is what I use for my applications; it works, but maybe there are better options:
Nginx, PostgreSQL, python3 and supervisor installed as system packages.
A virtualenv for each of the applications I run on one server; in there I install gunicorn, psycopg2 and all the other requirements for the Django project (most of the time they're listed in a requirements.txt file).
Supervisor to run gunicorn, and Celery when needed.
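For instance, here's a minimal sketch of the per-application setup; the paths and the project name (myproject) are placeholders:

$ python3 -m venv /srv/myproject/venv   # one virtualenv per application
$ source /srv/myproject/venv/bin/activate
$ pip install gunicorn psycopg2         # project-specific packages go in the venv
$ pip install -r requirements.txt       # plus everything else the Django project lists
$ deactivate

And a sketch of the supervisor program entry that keeps gunicorn running; the file path, port and user are assumptions:

; /etc/supervisor/conf.d/myproject.conf (illustrative)
[program:myproject]
command=/srv/myproject/venv/bin/gunicorn myproject.wsgi:application --bind 127.0.0.1:8000
directory=/srv/myproject
user=www-data
autostart=true
autorestart=true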
If you want to automate the server installation you can use Ansible; it's not that complex, it interfaces well with Python/Django, and there are plenty of code examples around. But it's better to start out doing things on your own, so you know and understand what you're doing.
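As a taste of how simple it can be, here's an ad-hoc Ansible command that installs nginx on all your servers (assuming an inventory with a webservers group):

$ ansible webservers -m apt -a "name=nginx state=present" -b   # -b escalates privileges to root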
Good luck
Mounir's answer is pretty solid, but I wanted to tack on another piece of advice: using playbooks from Ansible Galaxy is another option. Playbooks already exist for lots of use cases (including Django), and they take many of these best practices into account. I'm not saying that all playbooks on Galaxy are good, but some are, and by virtue of being open source they are frequently patched and updated.
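Installing a role from Galaxy is a one-liner; geerlingguy.nginx is just one example of a popular, well-maintained role:

$ ansible-galaxy install geerlingguy.nginx   # fetch a community role from Ansible Galaxy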
Related
I'm moving away from WordPress and into bespoke Python apps.
I've settled on Django as my Python framework; my only problems at the moment concern hosting. My current shared hosting environment is great for WordPress (WHM on CloudLinux), but serving Django on Apache/cPanel appears to be hit and miss, although I haven't tried it yet with my new hosting company, who have Python enabled in cPanel.
What is the easiest way for me to set up a VPS to run a hosting environment for, say, twenty websites? I develop everything in a virtualenv, but I have no experience of running Django in a production environment yet. I would assume that venv isn't secure enough or has scalability issues? I've read some things about people using Docker to set up separate Django instances on a VPS, but I'm not sure whether they wrote their own management system.
It's my understanding that each Python/Django instance needs uWSGI and Nginx residing within its virtual container? I'm looking for a simple and robust solution to host 20 Django sites on a VPS; is there an out-of-the-box solution? I'm also happy to develop one and set up a VPS if I'm pointed in the right direction.
Any wisdom would be gratefully accepted.
Andy :)
Traditional approach
Virtualenv is good enough and perfectly ready for production use. You can have multiple virtualenvs for multiple projects on the same VM.
If you have multiple database engines for multiple projects (say, MySQL for one, PostgreSQL for another), you just need to set each one up individually.
Install Nginx and configure a server block for each project; see the sketch after this list.
Install supervisor to manage (restart/start/stop) each project individually.
Install anything else the project requires.
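A minimal sketch of such a per-project Nginx server block, proxying to the project's app server; the server name, port and paths are placeholders:

# /etc/nginx/sites-available/project1 (illustrative)
server {
    listen 80;
    server_name project1.example.com;

    location /static/ {
        alias /srv/project1/static/;   # serve collected static files directly
    }

    location / {
        proxy_pass http://127.0.0.1:8001;   # each project's app server gets its own port
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}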
This approach has one huge drawback: you can't easily run different versions of a database engine for different projects. For that, containerization is highly recommended.
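For example, with Docker you could run two PostgreSQL versions side by side, each on its own host port (the names, versions and password here are arbitrary):

$ docker run -d --name pg10 -p 5432:5432 -e POSTGRES_PASSWORD=secret postgres:10
$ docker run -d --name pg12 -p 5433:5432 -e POSTGRES_PASSWORD=secret postgres:12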
For a simple and robust solution:
Use Docker (docker-compose) for local and production deployment.
Configure uWSGI with Nginx (both are available as Docker images).
Create a CI/CD pipeline with any tool, like Jenkins.
Monitor your projects using any good tool, like Raygun.
That's it.
I created a bash script that deploys as many websites as you want on your server. It automatically installs all dependencies on your server, creates a virtual environment, configures Gunicorn, Nginx and a database for Django, and so on. Check it out:
https://github.com/jdbit/django-auto-deploy
Python web development newbie question here. I'm coming from PHP/Laravel, where you have Homestead, a pre-configured Vagrant box for local development. In a so-called Homestead file you configure everything, such as the web server, database or PHP version. Are there any similar pre-configured dev environments for Django?
I already googled, and there don't seem to be any official or widely used Vagrant boxes for Django. The official Django tutorial even tells you how to install and set up Apache and your preferred database. This is a lot of work every time you want to create a new Django project, especially if those projects run in different production environments. All the other tutorials I've found just explain how to set up virtual environments with venv or the like. But that doesn't seem sufficient to me. What you obviously want is a dev environment that is as close as possible to your production environment, so you need some kind of virtual machine.
I'm a little bit confused right now. Do you just grab some plain Ubuntu (or other OS) Vagrant box and install everything yourself? Do you not use Vagrant at all but something else? Did I miss something, and the Python web development workflow is completely different?
Typical local development in Django just uses the built-in web server and an SQLite database. The steps to get that up and running are:
Ensure you have the desired version of Python installed.
Create a virtual env to isolate the libraries needed for your project from the rest of the system (this is optional but highly recommended; I'd actually recommend using Poetry).
Install Django, probably via pip.
Run manage.py runserver (and migrate the database and set up a superuser, yada yada).
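Concretely, the whole flow is only a handful of commands; mysite is a placeholder project name:

$ python3 -m venv venv && source venv/bin/activate
$ pip install django
$ django-admin startproject mysite && cd mysite
$ python manage.py migrate            # creates the SQLite database
$ python manage.py createsuperuser
$ python manage.py runserver          # serves on http://127.0.0.1:8000/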
That's pretty much it, and it's sufficient for local development. What you need to be aware of is that some differences exist between SQLite and Postgres, MySQL etc., and if you hit a spot where the difference matters, you'll want to set up your target database as well and develop directly against it. That can happen in a Docker container if that makes sense for you. But there's little reason to put Django itself into a container during development, unless your project is especially complex and requires simulating conditions which the built-in server can't.
Does this help?
$ python3 -m venv my_env # create your virtual environment
$ source my_env/bin/activate # Any package you install will be inside this environment
$ pip install -r requirements.txt # can also install packages individually
$ deactivate # get out of the isolated environment
Here's the doc
I have a Django website on a testing server and I am confused about how the deployment procedure should go.
Locally I have these folders:
code
virtualenv
static
static/app/bower_components
node_modules
Currently I only have the code folder in git.
My initial thought was to do this on the production server:
git clone repo
pip install
npm install
bower install
collectstatic
But I had this problem that sometimes some components in pip, npm or bower fail to install, and then the production deployment fails.
I was thinking of putting everything (static, bower, npm etc.) inside git so that I can fetch it all in production.
Is that the right way to do it? I want to know the right way to tackle this problem.
But I had this problem that sometimes some components in pip, npm or bower fail to install, and then the production deployment fails.
There is no solution for this other than finding out why things are failing in production (or, as a way around it, not installing anything in production and just copying stuff over).
I would caution against the second option, because Python virtual environments are not designed to be portable. If you have components such as PIL/Pillow or database drivers, these need system libraries to be installed and compiled against at build time.
Here is what I would recommend, which is in line with the deployment section in the documentation:
Create an updated requirements file (pip freeze > requirements.txt)
Run collectstatic on your testing environment.
Move the static directory to your frontend/proxy machine, and map it to STATIC_URL. Confirm this works by browsing the static URL (for example: http://example.com/static/images/logo.png)
Clone/copy your codebase to the production server.
Create a blank virtual environment.
Install dependencies with pip install -r requirements.txt
Make sure you run through the deployment checklist, which includes security tips and settings you need to enable for production.
After this point, you can bring up your django server using your favorite method.
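Condensed into commands, the flow looks roughly like this; the repo URL and paths are placeholders:

$ pip freeze > requirements.txt      # on the testing machine
$ python manage.py collectstatic     # then move the static dir to the frontend/proxy machine
# on the production server:
$ git clone <your-repo-url> && cd yourproject
$ python3 -m venv venv && source venv/bin/activate   # blank virtual environment
$ pip install -r requirements.txt    # install dependencies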
There are many, many guides on deploying django and many are customized for particular environments (for example, AWS automation, Heroku deployment tips, Digital Ocean, etc.) You can browse those for ideas (I usually pick out any automation tips) but be careful adopting one strategy without making sure it works with your particular environment/requirements.
In addition this might be helpful for some guidelines on deployment.
Finally made the switch from Windows to Linux (Ubuntu). I am teaching myself Python + Django.
GOAL: I want to set up a local development environment where I can build a Django application and run it locally before deploying it live online.
SO FAR: Python comes installed with Linux. Gedit does also. Ok great. After that I am lost.
I am not sure how to proceed. I am a total Linux noob. I am guessing I need Apache running, MySQL running, and Django to run. I don't know how to set these things up, or how to set the proper directories, or what I need to link to what... really I am not even sure what the right questions are to ask.
Quick answer:
Just install Django via their documentation, and you will not even need Apache running, as Django will run its own server. You can use SQLite as a DB and you won't even have to worry about MySQL. This is if your goal is to learn Django and get things running ASAP.
Otherwise, if you want to take the full route, you'll need to start learning a lot more (which isn't a bad thing). I'd say take a look at some of the Slicehost guides for setting up Apache and MySQL on Ubuntu:
http://articles.slicehost.com/2010/5/19/installing-apache-on-ubuntu
http://articles.slicehost.com/2011/3/10/installing-mysql-server-on-ubuntu
And then pick up with just installing django and going from there. The django tutorial is awesome. There's plenty of other documentation out there on the web & tutorials for setting up dev environments.
Assuming you have Python, you can install pip:
easy_install pip
From then on you can use pip to install your Python libraries; for example, to install Django:
pip install django
For development I would prefer PyDev; it has support for Django with the power of Eclipse.
I've made a small little "application" utilizing Django as a framework. It is not meant to be deployed to a server but run locally on a machine, so manage.py runserver works just fine.
I, as a developer, am comfortable with firing up the terminal, running python manage.py runserver and using it.
But I have some Mac OS X and Windows friends who want to use my application, and they don't have virtualenv, git or anything else.
Is there a way I can package this as a standalone product? Of course it would depend on Python being installed on the system, but is it possible to package the virtualenv, with Django and everything, and just copy it to another system and make it work?
And maybe even run the runserver in some kind of daemon mode?
Use setuptools and easy_install.
Here's an introductory article.
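A rough sketch of that workflow, assuming your app already has a setup.py (the archive name below is hypothetical):

$ python setup.py sdist                  # build a source distribution into dist/
$ easy_install dist/myapp-1.0.tar.gz     # install it, plus its dependencies, on the target machine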
Yes, you can package it. Django may not be the easiest framework to do this with, but the principles are the same as for other frameworks. You need to make an installer that installs everything you need, and that installer needs to be different for different platforms, such as Windows, Ubuntu, OS X etc. That also means that the answer is significantly different for each platform, and only half of the answer depends on Django. :-(
This kinda sucks, but that's life, currently. There is no nice platform-independent way to install software for end users.
I also haven't found the perfect solution for this yet.
My current approach is to provide a Docker image, because that's really easy for everyone to use. It includes an Alpine base image (because it's tiny), Python and Django, and the app itself. You can also include a web server like nginx and an app server like uWSGI or gunicorn, and expose a port for it.
So in the end your user would just run the container and access the web app under http://localhost:9000/ or something like this.
This is really handy and also my preferred way of trying out some app I've found.
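Usage for your friends then boils down to two commands; the image name and port are placeholders:

$ docker pull you/yourapp                  # or: docker build -t you/yourapp . from the source
$ docker run -d -p 9000:9000 you/yourapp   # then open http://localhost:9000/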
The "proper" way would be do build a package for every OS and distribution you are targeting and a simple zip bundle so people can also install the app manually.
To build the packages I suggest using fpm. It takes most of the pain of doing the packaging with their native tools away. The packages would then depend on a proper application server like uwsgi or gunicorn.
So in the end you could then install it like apt install your-package and it would depend on python-django, uwsgi etc.
As for where to put all the files in the package, every distribution has its own way of doing it. I prefer putting everything under /usr/share/webapps/myapp/ and having the config under /etc/myapp/config.py or something like that.
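A hedged example of what building the Debian package with fpm might look like; the package name, version and dependency names are assumptions:

$ fpm -s dir -t deb -n myapp -v 1.0 \
      -d python-django -d uwsgi \
      /usr/share/webapps/myapp /etc/myapp   # package these directories as-is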
For Windows and macOS there are solutions like PyInstaller. I haven't used it for a Django app yet, but it should do the job. You should include an app server like uWSGI in there too.
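I haven't verified this against a real Django project, but the basic PyInstaller invocation would be something like:

$ pip install pyinstaller
$ pyinstaller --onefile manage.py   # bundle the interpreter and dependencies into one executable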
In general you don't want to run the Django dev server in a production environment, so keep that in mind when packaging.
I hope that helps a bit.
There are several ways of doing this. I think you are looking more for build tools (which includes packaging) rather than just a Python solution. Here are a couple that I've used in the past:
zc.buildout: Used to build and deploy Python modules and applications, but also able to work with other languages with a little massaging. Easy to use (for a build tool).
make: The software build classic. Works with practically all languages, but a little archaic and hard to learn for a first-timer.
The new snap package manager for Linux should suit the task perfectly. It provides solutions to all the things that have been quite a pain for Python apps so far (dependencies, the interpreter etc.) while avoiding the complexity of Docker.
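The snapcraft workflow itself is short; the contents of snapcraft.yaml and the snap file name are left as assumptions here:

$ snapcraft init    # creates a skeleton snap/snapcraft.yaml to fill in
$ snapcraft         # builds the .snap package
$ sudo snap install ./myapp_1.0_amd64.snap --dangerous   # install a local, unsigned snap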
These days Docker is probably a good answer.
The user needs to install Docker first, but it runs on Windows and OS X as well as Linux.
Your Dockerfile takes care of installing all the dependencies and then runs the dev server (or you could even run a proper web server in the container).
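A minimal sketch of such a Dockerfile; the base image, port and project layout are assumptions:

# Dockerfile (illustrative)
FROM python:3
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt   # install all the dependencies
COPY . .
EXPOSE 8000
CMD ["python", "manage.py", "runserver", "0.0.0.0:8000"]   # the dev server, as described above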