Sorry I'm new to this specific topic.
I have a website implemented in Django with AskBot, and it has a PostgreSQL database. I want to create a deployment package that can be distributed to any customer, so that each customer can run their own server. The deployment package should be platform independent, meaning it should work on all operating systems.
Can you tell me what are the available tools to achieve this?
virtualenv is a really good tool but I think Vagrant is what you're looking for.
https://www.vagrantup.com/
It should enable you to easily set up your system regardless of the platform; it's free and quite well documented. I'd suggest you give it a look!
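For example, getting a base Ubuntu VM up and running is just a few commands (the box name below is only an example; pick whatever base image suits you):
# grab a base box definition and boot the VM (box name is just an example)
vagrant init ubuntu/trusty64
vagrant up
# log into the VM and set up Django, PostgreSQL, etc. inside it
vagrant ssh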
From my point of view, the database should always be created before deployment, and the database connection details must go into settings.py.
For the application itself, I think virtualenv can be very helpful in these cases, together with a requirements.txt file.
You run the application in your virtual environment and then export your dependencies using:
pip freeze > requirements.txt
Then on the new server you create the database, add its configuration to your settings, and install the dependencies:
pip install -r /path/to/requirements.txt
Run migrations, and you are done.
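Putting it together, the steps on the new server look roughly like this (the database name, user and paths are placeholders):
# create the database and a user for it (names are placeholders)
sudo -u postgres createuser -P myapp_user
sudo -u postgres createdb -O myapp_user myapp_db
# point DATABASES in settings.py at that database, then install and migrate
pip install -r /path/to/requirements.txt
python manage.py migrate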
Related
Python web development newbie question here. I'm coming from PHP/Laravel and there you have Homestead which is a pre-configured Vagrant box for local development. In a so-called Homestead file, you configure everything such as webserver, database or PHP version. Are there any similar pre-configured dev environments for Django?
I already googled and there don't seem to be any official or widely-used Vagrant boxes for Django. The official Django tutorial even tells you how to install and set up Apache and your preferred database. This is a lot of work every time you want to create a new Django project, especially if those projects run in different production environments. All the other tutorials I've found just explain how to set up virtual environments with venv or the like. But that doesn't seem sufficient to me. What you obviously want is a dev environment that is as close as possible to your production environment, so you need some kind of virtual machine.
I'm a little bit confused right now. Do you just grab some plain Ubuntu (or other OS) Vagrant box and install everything yourself? Do you not use Vagrant at all, but something else? Did I miss something, and is the Python web development workflow completely different?
The typical local development in Django just uses the builtin web server and an SQLite database. The steps to get that up and running are:
Ensure you have the desired version of Python installed.
Create a virtual env to isolate the libraries needed for your project from the rest of the system (this is optional but highly recommended; I'd actually recommend using Poetry).
Install Django, probably via pip.
Run manage.py runserver (and migrate the database and set up a superuser, yada yada).
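In shell terms, those steps boil down to roughly this (the project name mysite is just an example):
pip install django                # inside your virtual env
django-admin startproject mysite  # "mysite" is just an example name
cd mysite
python manage.py migrate          # creates the default SQLite database
python manage.py createsuperuser
python manage.py runserver        # serves on http://127.0.0.1:8000/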
That's pretty much it and sufficient for local development. What you need to be aware of is that some differences exist between SQLite and Postgres, MySQL etc., and if you hit the spots where the difference is important, you'll want to set up your targeted database as well to develop directly against it. That can probably happen in a Docker container if that makes sense for you. But there's little reason to put Django into a container during development, unless your project is especially complex and requires simulating certain conditions which the builtin server somehow can't.
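On the Docker point, a throwaway Postgres for local development can be as simple as one command (the container name, password and image tag below are just examples):
# disposable Postgres instance for local development
docker run --name dev-postgres -e POSTGRES_PASSWORD=devpass -p 5432:5432 -d postgres:9.6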
Does this help?
$ python3 -m venv my_env # create your virtual environment
$ source my_env/bin/activate # Any package you install will be inside this environment
$ pip install -r requirements.txt # can also install packages individually
$ deactivate # get out of the isolated environment
Here's the doc
So I'm setting up a server by myself. Now I've run into lots of different opinions about where to install the packages.
I am thinking of the core packages like nginx, gunicorn, python3, postgresql and so on.
I learned that setting up a venv (virtual environment) is a good thing, so I can have several projects running with different versions of packages.
But it's a bit confusing which ones are not going to be inside the venv.
Some install PostgreSQL outside the venv but psycopg2 inside, some put gunicorn inside the venv, and so on.
Are there any best practices or rules that are safe to follow?
For info, I'm setting up an Ubuntu Server 16.04 with Nginx, gunicorn, PostgreSQL, psycopg2 and Python 3.
This is what I use for my applications; it works, but maybe there are better options:
Nginx, PostgreSQL, python3, supervisor installed as system packages
Using a virtualenv for each of the applications I run on one server, and installing gunicorn, psycopg2 and all the other requirements of the Django project there (most of the time they're listed in a requirements.txt file)
Using supervisor to run gunicorn and Celery (when needed)
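On Ubuntu, for example, that split looks roughly like this (paths and names are placeholders; libpq-dev, python3-dev and build-essential are only there so that psycopg2 can compile):
# system-wide packages, once per server
sudo apt-get install nginx postgresql supervisor python3-venv libpq-dev python3-dev build-essential
# per application, inside its own virtualenv
python3 -m venv /srv/myapp/venv
/srv/myapp/venv/bin/pip install gunicorn psycopg2 -r /srv/myapp/requirements.txt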
If you want to automate the server installation you can use Ansible; it's not that complex, it interfaces well with Python/Django, and there are plenty of code examples. But it's better that you start doing things on your own at first, so you know and understand what you're doing.
Good luck
Mounir's answer is pretty solid, but I wanted to tag on another piece of advice: using playbooks from Ansible Galaxy is also an option. Playbooks already exist for lots of use cases (including Django) and they take into account many of these best practices. I am not saying that all playbooks on Galaxy are good, but some are, and by virtue of being open source they are frequently patched and updated.
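Pulling a role from Galaxy and running it is only a couple of commands; the role and file names below are made up, so substitute whatever you actually pick:
# fetch a community role from Ansible Galaxy (role name is only an example)
ansible-galaxy install someauthor.django_role
# reference the role in your own playbook, then run it against your inventory
ansible-playbook -i inventory.ini site.yml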
I have an application using Django 1.9 and Python 2.7. I recently flushed my PostgreSQL database on my production server, and now whenever I try to use the application it tells me there are missing modules. I never faced this issue before, so I am curious: when you put your application on a production server, does your virtual environment go with it? If so, does flushing your database have any effect on your virtual environment?
I have been getting past the issues by downloading each module to a third-party directory in my application and including them in my INSTALLED_APPS list in my settings file, but I wouldn't want to continue doing that if there are 100+ modules I need to download.
I also tried to use pip install on my production server, and it said that the command was not found, although I have the latest version of pip installed on my Mac?
I am curious, when you put your application on a production server, does your virtual environment go with it ?
Not necessarily, unless you copied the virtualenv folder with it, which isn't really good practice; you should create the virtualenv on the production server.
If so, does flushing your database have any effect on your virtual environment ?
No, the database and virtualenv are completely separate
I wouldn't want to continue doing that if there are 100+ modules
Use a requirements.txt file and install them all in one go with pip install -r requirements.txt
I also tried to use pip install on my production server, and it said that the command was not found
You have to install pip on the production server first.
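On a Debian/Ubuntu server, for example, the whole flow looks roughly like this (Python 2 to match your project; package names differ on other distros):
# on your Mac: record the project's dependencies
pip freeze > requirements.txt
# on the production server: install pip/virtualenv, then the requirements
sudo apt-get install python-pip python-virtualenv
virtualenv env
source env/bin/activate
pip install -r requirements.txt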
I have a Django website on a testing server and I am confused about how the deployment procedure should go.
Locally I have these folders:
code
virtualenv
static
static/app/bower_components
node_modules
Currently on git I only have the code folder in there.
My initial thought was to do this on the production server:
git clone repo
pip install
npm install
bower install
collectstatic
But I had this problem where sometimes some components in pip, npm or bower fail to install, and then the production deployment fails.
I was thinking of putting everything (static, bower, npm etc.) inside git so that I can fetch it all in production.
Is that the right way to do it? I want to know the right way to tackle that problem.
But I had this problem where sometimes some components in pip, npm or bower fail to install, and then the production deployment fails.
There is no solution for this other than finding out why things are failing in production (or a way around it would be to not install anything in production and just copy everything over).
I would caution against the second option because Python virtual environments are not designed to be portable. If you have components such as PIL/Pillow or database drivers, these need system libraries to be installed and compiled against at build time.
Here is what I would recommend, which is in-line with the deployment section in the documentation:
Create an updated requirements file (pip freeze > requirements.txt)
Run collectstatic on your testing environment.
Move the static directory to your frontend/proxy machine, and map it to STATIC_URL. Confirm this works by browsing the static URL (for example: http://example.com/static/images/logo.png)
Clone/copy your codebase to the production server.
Create a blank virtual environment.
Install dependencies with pip install -r requirements.txt
Make sure you run through the deployment checklist, which includes security tips and settings you need to enable for production.
After this point, you can bring up your django server using your favorite method.
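Put together, the server-side part looks roughly like this (the repository URL and paths are placeholders):
# on the production server (paths, names and the repo URL are placeholders)
git clone https://example.com/yourproject.git /srv/yourproject
cd /srv/yourproject
python3 -m venv env
source env/bin/activate
pip install -r requirements.txt
python manage.py check --deploy   # flags settings covered by the deployment checklist
python manage.py migrate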
There are many, many guides on deploying django and many are customized for particular environments (for example, AWS automation, Heroku deployment tips, Digital Ocean, etc.) You can browse those for ideas (I usually pick out any automation tips) but be careful adopting one strategy without making sure it works with your particular environment/requirements.
In addition, this might be helpful for some guidelines on deployment.
I'm a long-time Django developer and have just started using Ansible, after using Vagrant for the last 18 months. Historically I've created a single VM for development of all my projects and symlinked the reusable Django apps (Python packages) I create to the site-packages directory.
I've got a working dev box for my latest Django project, but I can't really make changes to my own reusable apps without having to copy those changes back to a Git repo. Here's my ideal scenario:
I checkout all the packages I need to develop as Git submodules within the site I'm working on
I have some way (symlinking or a better method) to tell Ansible to setup the box and install my packages from these Git submodules
I run vagrant up or vagrant provision
It reads requirements.txt and installs the remaining packages (things like South, Pillow, etc), but it skips my set of tools because it knows they're already installed
I hope that makes sense. Basically, imagine I'm developing Django. How do I tell Vagrant (via Ansible I assume) to find my local copy of Django, rather than the one from PyPi?
Currently the only way I can think of doing this is creating individual symlinks for each of those packages I'm developing, but I'm sure there's a more sensible model.
Thanks!
You should probably think of it slightly differently. You create a Vagrantfile which specifies Ansible as a provisioner. In that Vagrantfile you also specify which playbook to use for your vagrant provision step.
If your playbooks are written in an idempotent way, running them multiple times will skip steps that already match the desired state.
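To illustrate what idempotent means in plain shell terms (Ansible modules do this kind of check for you; this is only the idea, not how you would actually write it):
# only act when the system is not already in the desired state
if ! dpkg -s nginx >/dev/null 2>&1; then
    sudo apt-get install -y nginx
fi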
You should also think about what the desired end state of a VM should look like and write playbooks to accomplish that. Unless I'm misunderstanding something, all your playbook actions should be happening inside the VM, not directly on your local machine.