How to set up Git to deploy Python app files to an Ubuntu server? - python

I set up a new Ubuntu 12.10 server on VPS hosting. I have installed all the required pieces: Nginx, Python, MySQL, etc., and configured them to deploy a Flask + Python app using uWSGI. It's working fine.
But to create a basic app I used PuTTY (from Windows) and created the required .py files by hand.
Now I want to set up Git so that I can push my code to a target directory, say /var/www/mysite.com/app_data, so that I don't have to use SSH or FileZilla every time I make a change to my website.
Since I use both Ubuntu and Windows for development, a Git-based workflow would make it easy to push changes to my cloud server.
How can I set this up on Ubuntu? And how would I access it and deploy from tools like Git Bash?
Please suggest.

A modified version of innaM's answer:
Concept
Have three repositories
devel - development on your local development machine
central - repository server - e.g. GitHub, Bitbucket, or anything similar
prod - production server
Then you commit things from devel and push them to central, and as soon as you want to deploy on prod, you ask prod to pull the data from central.
"Asking" the prod server to pull the updates can be managed by cron (then you have to wait a moment) or by other means, such as a one-shot ssh call asking it to do a git pull and possibly restart your app.
Step by step
In more detail, you can go about it this way.
Prepare repo on devel
Develop and test the app on your devel machine.
Put it into local repository:
$ git init
$ git add *
$ git commit -m "initial commit"
Create repo on central server
E.g. bitbucket provides this description: https://confluence.atlassian.com/display/BITBUCKET/Import+code+from+an+existing+project
Generally, you create the project on Bitbucket, find its URL, and then, from your devel repo, call:
$ git remote add origin <bitbucket-repo-url>
$ git push origin
Clone central repo to prod server
Log onto your prod server.
Go to /var/www and clone from Bitbucket:
$ cd /var/www
$ git clone <bitbucket-repo-url>
$ cd mysite.com
and you will have your directory ready.
Trigger publication of updates to prod
There are numerous options, one being a cron task that regularly calls
$ git pull
If your app needs a restart after an update, you have to ensure the restart actually happens. You can detect that an update arrived, e.g. by comparing the current commit before and after the pull (git rev-parse HEAD), or by checking whether git pull reported new commits.
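For illustration only, a minimal script that cron could run might look like this (the repo path and the restart command are assumptions, adapt them to your setup):

#!/bin/sh
# cron-deploy.sh - pull the repo and restart the app only when new commits arrived
# /var/www/mysite.com and "service myapp restart" are placeholders
cd /var/www/mysite.com || exit 1
OLD=$(git rev-parse HEAD)
git pull origin master
NEW=$(git rev-parse HEAD)
if [ "$OLD" != "$NEW" ]; then
    service myapp restart
fi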
Personally, I would use a "one-shot ssh" call (you asked not to use SSH, but I assume you are really asking for a simpler solution, and a one-shot call is simpler than FTP, SCP, or other magic).
From your devel machine (assuming you have SSH access to the prod server):
$ ssh user@prod.server.com "cd /var/www/mysite.com && git pull origin && myapp restart"
The advantage is that you control the moment the update happens.
Discussion
I use a similar workflow.
rsync in many cases serves well enough or better; see the sketch after this list (but be aware of files created at app runtime, and of files that are removed between versions of your app and must be removed on the server too).
Salt (SaltStack) could serve too, but it requires a bit more learning and setup.
I have learned that keeping source code and configuration data in the same repo sometimes makes the situation more difficult (that is why I am working on using Salt).
The fab command from Fabric (Python-based) may be the best option (if installation on Windows proves difficult, look at http://ridingpython.blogspot.cz/2011/07/installing-fabric-on-windows.html).
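For the rsync option above, a hedged one-shot sketch from the devel machine (the host and paths are placeholders):

$ rsync -avz --exclude '.git' ./ user@prod.server.com:/var/www/mysite.com/
# add --delete only once you are sure no runtime-created files live inside the synced tree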

Create a bare repository on your server.
Configure your local repository to use the repository on the server as a remote.
When working on your local workstation, commit your changes and push them to the repository on your server.
Create a post-receive hook in the server repository that calls "git archive" and thus transfers your files to some other directory on the server.
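A hedged sketch of such a post-receive hook, assuming the deploy directory from the question (make the hook executable with chmod +x):

#!/bin/sh
# hooks/post-receive inside the bare repository on the server
# /var/www/mysite.com/app_data is an assumed deploy path
DEPLOY_DIR=/var/www/mysite.com/app_data
# export the current master tree into the deploy directory
git archive master | tar -x -C "$DEPLOY_DIR"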

Related

Setting up docker container so that I can access python packages on ubuntu server

I'm new to using Docker, so I'm either looking for direct help or a link to a relevant guide. I need to train some deep learning models on my school's linux server, but I can't manually install pytorch and other python packages since I don't have root access (sudo). Another student said that he uses docker and has everything ready to go in his container.
I'm wondering how to wrap up my code and relevant packages into a container that I can push to the linux server and then run.
To address your specific problem: the easiest way I've found to get code into a container is to use Git.
Start the container in interactive mode, or ssh to it if it's attached to a network.
git clone <your awesome deep learning code>. Keep a requirements.txt file in your Git repo; change into your clone of the repo and run pip install -r requirements.txt.
Run whatever script you need to run your code. Note that you can easily put the pip install command in one of your run scripts.
It's important to remember that docker containers are stateless/ephemeral. You should not expect the container nor its contents to exist in some durable fashion. This specific issue is addressed by mapping a directory on the host system to a directory in the container.
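A minimal sketch of the steps above; the image name, repo URL, and paths are placeholders, not a known-good setup:

$ docker run -it --rm -v /home/me/results:/workspace/results pytorch/pytorch bash
# then, inside the container:
$ git clone https://github.com/you/your-dl-project.git   # placeholder URL
$ cd your-dl-project
$ pip install -r requirements.txt
$ python train.py    # write anything you need to keep into /workspace/results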
Side note: I recommend starting with the Docker tutorial first. You can easily skip the installation parts if you are working on a system that already has Docker installed and where you have permission to build, start, and stop containers.
I don't have root access (sudo). Another student said that he uses docker
I would like to point out that Docker requires sudo/root permissions.
Instead, I think you should look at something like Google Colab or JupyterLab. This gives you the added benefit of having your code backed up on a remote server.

Docker vs old approach (supervisor, git, your project)

I've been on Docker for the past few weeks and I can say I love it and I get the idea. But what I can't figure out is how to "transfer" my current setup to a Docker solution. I guess I'm not the only one, so here is what I mean.
I'm a Python guy, more specifically a Django one. So I usually have this:
Debian installation
My app on the server (from git repo).
Virtualenv with all the app dependencies
Supervisor that handles Gunicorn that runs my Django app.
The thing is, when I want to upgrade and/or restart the app (I use Fabric for these tasks), I connect to the server, navigate to the app folder, run git pull, and restart the Supervisor task that handles Gunicorn, which reloads my app. Boom, done.
But what is the right (better, more Docker-ish) approach to modify this setup when using Docker? Should I somehow connect to the container's bash every time I want to upgrade the app and run the upgrade there, or (from what I saw) should I expose the app in a folder outside the Docker image and run the standard upgrade process?
Hope you get the confusion of an old-school dude. I bet the Docker guys have thought about this.
Cheers!
For development, docker users will typically mount a folder from their build directory into the container at the same location the Dockerfile would otherwise COPY it. This allows for rapid development where at most you need to bounce the container rather than rebuild the image.
For production, you want to include everything in the image and not change it, only persistent data goes in the volumes, your code is in the image. When you make a change to the code, you build a new image and replace the running container in production.
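As a sketch of both modes (image names, container names, and paths are placeholders):

# development: bind-mount your working copy over the path the Dockerfile would COPY to
$ docker run -d --name myapp-dev -v "$PWD":/app myapp:dev

# production: bake the code into a new image and swap the running container
$ docker build -t myapp:1.0.1 .
$ docker stop myapp && docker rm myapp
$ docker run -d --name myapp myapp:1.0.1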
Logging into the container and manually updating things is something I only do to test while developing the Dockerfile, not to manage a developing application.

Openshift: How to and the Drawbacks of a Hot Deployment

I work with OpenShift, specifically with Python. I have done many projects on it, and I think the most irritating thing is that when you deploy your application, the server goes down and you cannot even show a custom message.
I was shocked when, after months, I found via Google that there is an option to hot deploy an application, i.e. to git push it without the server going down. I am not a computer scientist, so I cannot tell whether this technique has any drawbacks for my application.
Also, until now, when I wanted to update my application, I was doing:
git add .
git commit -a -m 'mycommit'
git push
I read in the manual that I have to enable hot deployment by creating a marker file in the app directory:
C:\app_directory> copy NUL > .openshift\markers\hot_deploy
But after that, how will I (hot) deploy the changes to my server?
Thank you
Once you have added the hot_deploy marker to your Git repository, you follow the same git add, git commit, git push procedure; the only difference is that your site will not shut down while it is being deployed. The new code will be deployed and everything should work as expected.
You need to add the marker file to your Git repository to make the change take effect on the server:
git add .openshift/markers/hot_deploy
git commit -m "Changing application to hot deploy"
After this your subsequent commits (using the git add/commit/push combination) will not restart your server.
Alternatively you can use the following rhc commands to enable and disable auto-deployment.
rhc app-configure <app> auto-deploy
rhc app-configure <app> no-auto-deploy

What is the Best Practice or most efficient way to update custom python modules in pythonanywhere?

For PythonAnywhere:
I am currently building a project where I have to change one of my installed packages frequently (because I am adding to the package as I build out the project). It is very manual and laborious to constantly update the package in the Bash console by reinstalling it every time I make a change locally. Is there a better process for this?
It sounds like you want a single command on your local machine that pushes changes up to PythonAnywhere. One way to go about it would be to use PythonAnywhere as a Git remote. There are some details in this post, but, broadly:
username@PythonAnywhere:~$ mkdir my_repo.git
username@PythonAnywhere:~$ cd my_repo.git
username@PythonAnywhere:~/my_repo.git$ git init --bare
Then, on your PC:
git remote add pythonanywhere username@ssh.pythonanywhere.com:my_repo.git
Then you should be able to push to the bare repository on PA from your machine with a
git push pythonanywhere master
You can then use a Git post-receive hook to update the package on PythonAnywhere, by whatever means you like. One might be to have your package checked out on PythonAnywhere:
username@PythonAnywhere:~$ git clone my_repo.git my_package
And then the post-receive hook could be as simple as
cd ~/my_package && git pull
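One detail worth noting: while a post-receive hook runs, GIT_DIR is set to the bare repository, which confuses a git pull run in another working copy, so a sketch of the full hook (paths as above) would be:

#!/bin/sh
# ~/my_repo.git/hooks/post-receive - remember to chmod +x it
unset GIT_DIR    # otherwise git pull below would operate on the bare repo
cd ~/my_package && git pull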

Using git post-receive hook to deploy python application in virtualenv

My goal is to be able to deploy a Django application to one of two environments (DEV or PROD) based on the Git branch that was committed and pushed to a repository. This repository is hosted on the same server as the Django applications are being run on.
Right now, I have two virtualenvs set up. One for each environment. They are identical. I envision them only changing if the requirements.txt is modified in my repository.
I've seen tutorials around the internet that offer deployments via git by hosting the repository directly in the location where the application will be deployed. This doesn't work for my architecture. I'm using RhodeCode to host/manage the repository. I'd like to be able to use a post-receive (or other if it's more appropriate) hook to trigger the update to the appropriate environment.
Something similar to this answer will allow me to narrow down which environment I want to focus on.
When I put the source activate command in an external script (i.e., my hook), the script stops at that command. The virtualenv is activated correctly, but any further actions in the script (e.g., pip install -r requirements.txt or ./manage.py migrate) aren't executed.
My question is: how can I have that hook run under the associated virtualenv? Or, if the virtualenv is already running, update it appropriately with the new requirements.txt, South migrations, and application code?
Is this workflow overly complicated? Theoretically, it should be as simple as a git push to the appropriate branch.
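For what it's worth, a common workaround for the source activate problem described above is to skip activation entirely and call the virtualenv's binaries by absolute path; a hedged sketch under assumed paths:

#!/bin/sh
# fragment of a deploy hook; all paths below are assumptions
VENV=/home/deploy/envs/myproject-dev
SRC=/srv/myproject-dev
cd "$SRC" || exit 1
"$VENV/bin/pip" install -r requirements.txt
"$VENV/bin/python" manage.py migrate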
