Recently I have begun to use AWS. I've built a Python/Django app, and now I want to deploy it onto AWS EC2.
This is the first time I have tried to deploy a Django application, so please bear with me. I started with the AWS docs to try and get an app up and running, but found them not in-depth enough.
I found this YouTube tutorial - https://www.youtube.com/watch?v=QjrfUO91wfc - which covers everything I want to do: pull files off of GitHub onto EC2 and deploy.
All was going well until:
sudo systemctl daemon-reload
sudo systemctl start gunicorn
sudo systemctl enable gunicorn
sudo vim /etc/nginx/sites-available/python-django-product-review/prodreview
For some reason, no directories have been created within sites-available; the only file I have in there is default.
I have tried to create said files and directories, but permissions seem to be an issue too.
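For reference, this is roughly what I have been trying, based on the usual sites-available/sites-enabled convention (prodreview is the config file name from the tutorial; sudo is needed since /etc requires root):

# sites-available normally holds plain config files, not directories
sudo mkdir -p /etc/nginx/sites-available /etc/nginx/sites-enabled
sudo vim /etc/nginx/sites-available/prodreview

# enable the site by symlinking into sites-enabled, then check and reload nginx
sudo ln -s /etc/nginx/sites-available/prodreview /etc/nginx/sites-enabled/prodreview
sudo nginx -t
sudo systemctl restart nginx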
Does anyone have any advice, or a simpler way of deploying my app to AWS?
Thanks.
I am developing an app with django cookiecutter (with Docker and Heroku setup) and have gotten as far as deploying it. This is my first ever project, so I have no prior experience with Django, Docker or Heroku. I've read up on it in the cookiecutter docs, on Heroku, and a little on the Docker website, but I still don't know how to deploy.
I have downloaded the Heroku CLI, have set up an app on Heroku with my own domain and a Postgres DB, and I am planning on getting the hobby tier to get the automated certificate. All environment variables are set in the .env file and in the Heroku config vars in my app at Heroku. So everything should be alright as far as code and settings go. I am also using git as version control.
Am I supposed to upload the whole project (code, settings, Docker files, etc.) to Heroku with git, or by some other means? I saw there was an option to deploy the project with Docker deploys as well on Heroku's website. Which option is the correct one?
I was thinking initially that I would just upload the project through git and run docker-compose -f production.yml up (in the Heroku bash)... or something like that. I don't know, please help.
If some info is missing or unclear, I will try to edit it as best as I can.
It is better to deploy the project to Heroku using git.
$ heroku login
$ heroku create your_custom_app_name
$ git add -A
$ git commit -m "My custom app deployment to heroku"
$ git push heroku master
and then, once it is deployed:
$ heroku run python manage.py migrate
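Note that for a git-based deploy, Heroku also needs a Procfile in the repo root so it knows how to start your web process. A minimal sketch, assuming gunicorn and cookiecutter-django's default config.wsgi module (substitute your own project module):

web: gunicorn config.wsgi:application --log-file -

With that committed, the git push heroku master above will build and start the app.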
I've been on Docker for the past few weeks, and I can say I love it and I get the idea. But what I can't figure out is how I can "transfer" my current setup to a Docker solution. I guess I'm not the only one, and here is what I mean.
I'm a Python guy, more specifically Django. So I usually have this:
Debian installation
My app on the server (from git repo).
Virtualenv with all the app dependencies
Supervisor handling Gunicorn, which runs my Django app.
The thing is, when I want to upgrade and/or restart the app (I use Fabric for these tasks), I connect to the server, navigate to the app folder, run git pull, and restart the Supervisor task that handles Gunicorn, which reloads my app. Boom, done.
But what is the right (better, more Docker-ish) approach to modify this setup when I use Docker? Should I somehow connect to the Docker container's bash every time I want to upgrade the app and run the upgrade there, or (from what I saw) should I expose the app into a folder outside the Docker image and run the standard upgrade process?
Hope you get the confusion of an old-school dude. I bet the Docker guys have thought about this.
Cheers!
For development, Docker users will typically mount a folder from their build directory into the container at the same location the Dockerfile would otherwise COPY it. This allows for rapid development where at most you need to bounce the container rather than rebuild the image.
For production, you want to include everything in the image and not change it: only persistent data goes in volumes; your code is in the image. When you make a change to the code, you build a new image and replace the running container in production.
Logging into the container and manually updating things is something I only do to test while developing the Dockerfile, not to manage a developing application.
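A rough sketch of both modes (the image names, paths, and ports here are hypothetical; adapt to your project):

# development: mount your working copy over the path the Dockerfile COPYs to,
# so edits appear in the container without a rebuild
docker run -d --name myapp -p 8000:8000 -v "$PWD":/app myapp:dev

# production: bake the code into a new image per release and swap containers
docker build -t myapp:1.0.1 .
docker stop myapp && docker rm myapp
docker run -d --name myapp -p 8000:8000 myapp:1.0.1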
I'm just wondering if I'm doing something wrong, or is developing with AWS really hard/confusing?
Currently I have an EC2 instance with the following address:
ec2-XX-XX-XX-XX.us-west-2.compute.amazonaws.com
And with that, an Elastic Beanstalk application environment:
http://XXX.XXXXXX.us-west-2.elasticbeanstalk.com/
I find that it's really slow and tedious to code something, put it on the server, and test what it looks like by going to http://XXX.XXXXXX.us-west-2.elasticbeanstalk.com/, as what I need to do is this:
1) Upload the files via FTP to ec2-XX-XX-XX-XX.us-west-2.compute.amazonaws.com
2) SSH inside to ec2-XX-XX-XX-XX.us-west-2.compute.amazonaws.com and do eb deploy
3) Wait 2-3 minutes for the server to deploy
4) View the changes at http://XXX.XXXXXX.us-west-2.elasticbeanstalk.com
Is there something I'm doing wrong here? Normally this is what I'm used to doing:
1) Upload file via FTP to http://mywebsite.com
2) SSH inside http://mywebsite.com
3) Do python manage.py runserver or gunicorn mySite.wsgi:application
4) View changes at http://mywebsite.com without having to wait 2-3 minutes for it to deploy.
Can someone guide me on what I might be doing wrong? I'm not too sure on what I'm missing here.
Thank you!
With AWS Elastic Beanstalk you don't exactly "FTP" files to the server. With the EB CLI tools you only need to run eb deploy, and your latest Git commit will deploy all files to your EB servers.
In my case, it only takes 3-4 lines of terminal commands to get everything up and running:
git add -A
git commit -m '04212016_1_east'
./manage.py collectstatic (optional step since I use S3 for static files)
eb deploy
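One note: the EB CLI runs on your local machine, so there is no need to FTP files up or SSH into the instance first. Starting from scratch, the flow looks roughly like this (the environment name is a placeholder):

pip install awsebcli     # installs the eb command locally
eb init                  # one-time: choose region/platform, link this git repo
eb create my-env         # one-time: create the environment
eb deploy                # each change: deploys your latest git commit
eb open                  # opens the environment URL in a browser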
My web host does not have Python, and I am trying to build a machine learning application. I know that Heroku lets you use Python. I was wondering if I could use Heroku as a Python server? As in, I would let Heroku do all of the Python processing for me and use my regular domain for everything else.
Yes, and it may be a pain at first, but once it is set up, I would say Heroku is the easiest platform to continually deploy to. However, it is not intuitive - don't just try and 'take a stab' at it; follow a tutorial and try to understand why Heroku works the way it does.
Following the docs is a good bet; Heroku has great documentation for the most part.
Here's the generalized workflow for deploying to Heroku:
1) Locally, create your project and use virtualenv to install/manage libraries.
2) Initialize a git repository in the base dir for your Python project; create a Heroku remote (heroku create).
3) Create a Procfile for Heroku to use when starting gunicorn (or see the options for using waitress/etc); this is used by Heroku to start your process (the whole cycle is put together in commands after these steps).
4) cd to your base dir; freeze your virtualenv (pip freeze > requirements.txt) and add/commit requirements.txt. This tells Heroku what packages need to be installed, a requirement for your deployment to work. If you are trying to run a Python project and there are required packages missing, the app will be unable to start and Heroku will display an Internal Server Error.
Whenever changes are made, git commit your changes and git push heroku master to push all commits to Heroku. This will cause Heroku to restart the server application with your updated deployment. If there's a failure, you can use heroku rollback to just return to your last deployment.
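Put together, the whole cycle looks something like this (the app, project, and module names are placeholders):

# one-time setup
virtualenv venv && source venv/bin/activate
pip install django gunicorn
git init
heroku create my-app-name                        # adds a "heroku" git remote
echo "web: gunicorn myproject.wsgi" > Procfile   # hypothetical module name
pip freeze > requirements.txt
git add -A && git commit -m "initial deploy"
git push heroku master

# each update
git commit -am "describe the change"
git push heroku master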
In reality, it's not a pain in the ass, just particular. Knowing the rules of Heroku, you are able to manage your deployment with command-line git commands with ease.
One caveat - if deploying Django, Flask applications, etc., there are peculiarities to account for; specifically, non-project files (including assets) should NOT be stored on Heroku, as Heroku periodically restarts your 'dyno' (server instance(s)), loading the whole project from the latest push to Heroku. With Django and Flask, this typically means serving assets/static/media files from an Amazon S3 bucket.
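With Django, the usual route is the django-storages package; a minimal settings sketch, assuming an S3 bucket you have already provisioned (the bucket name is a placeholder, and the keys come from environment variables):

# settings.py -- store uploaded media on S3 instead of the dyno's disk
import os

INSTALLED_APPS += ["storages"]

AWS_ACCESS_KEY_ID = os.environ["AWS_ACCESS_KEY_ID"]          # set with: heroku config:set ...
AWS_SECRET_ACCESS_KEY = os.environ["AWS_SECRET_ACCESS_KEY"]
AWS_STORAGE_BUCKET_NAME = "my-app-media"                     # placeholder bucket

DEFAULT_FILE_STORAGE = "storages.backends.s3boto3.S3Boto3Storage"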
That being said, if you use virtualenv properly, provision your databases, and follow Heroku practices for serving files and committing updates, it is (imho) the absolute best platform out there for ease of use, reliable uptime, and well-oiled rolling deployments.
One last tip - if you are creating a Django app, I'd suggest starting your project out of this boilerplate. I have a custom one I use for new projects and can start and publish a project in minutes.
Yes, you can use Heroku as a Python server. I put a Python Flask server on Heroku, but it was a pain: Heroku seemed to have some difficulties, and there was lots of conflicting advice on getting around them. I eventually got it working; I can't remember which web page had the ultimate answer, but you might look at this one: http://blog.miguelgrinberg.com/post/the-flask-mega-tutorial-part-xviii-deployment-on-the-heroku-cloud
Have you tried running your Python server on Heroku using Twisted?
I don't know if this can help you.
I see the doc 'Getting Started on Heroku with Python' is about Django.
But the docs make it clear that Heroku can use Twisted:
Pure Python applications, such as headless processes and evented web frameworks like Twisted, are fully supported.
django-twisted-server has Twisted in Django, but it isn't on Heroku.
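Still, a pure-Twisted web process should run as an ordinary Heroku web dyno; a minimal sketch (the file name and Procfile line are hypothetical), binding to the port Heroku assigns via $PORT:

# server.py -- tiny Twisted web app suitable for a Heroku web dyno
import os
from twisted.internet import reactor
from twisted.web import resource, server

class Hello(resource.Resource):
    isLeaf = True
    def render_GET(self, request):
        return b"Hello from Twisted on Heroku\n"

port = int(os.environ.get("PORT", "8000"))   # Heroku injects the port via $PORT
reactor.listenTCP(port, server.Site(Hello()))
reactor.run()

with a Procfile entry like: web: python server.py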
I would like to be able to log the number of words in certain files in a GitHub repo whenever there is a new push to the repo. I have set up a hook on GitHub to hit a Django Heroku app URL after each push, but I don't know how to run a git pull in Python from a Django app running on Heroku. Is it possible to write to the local file system in Heroku?
Check out github repo from Heroku?
From the command line you can pull from Heroku easily: git pull heroku master
have set up a hook on Github to hit a Django Heroku app url after each push, but I don't know how to run a git pull in python from a Django app running on Heroku?
Is it a different Heroku app (from the one that was deployed) that will be doing the pull?
Yes? Then you are going to have issues, because the pulling app needs permission (heroku login) to pull... and it won't have it. Also, because of the ephemeral filesystem, even if you log in (via heroku run bash or the like), the pulling app will eventually lose its logged-in session (see why below).
No? Then don't pull; just use the OS filesystem libraries to look into the application directory (a sketch below)...
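For the word counting itself, something like this inside the hooked Django view would do (the view name, directory, and file pattern are assumptions; it reads the files that shipped with the currently deployed code, per the advice above):

# views.py -- hypothetical webhook handler: count words in the app's own files
import glob
import os

from django.conf import settings
from django.http import HttpResponse

def github_webhook(request):
    counts = {}
    # BASE_DIR is the project root, as defined in the default settings template
    for path in glob.glob(os.path.join(str(settings.BASE_DIR), "docs", "*.md")):
        with open(path) as f:
            counts[os.path.basename(path)] = len(f.read().split())
    print(counts)  # log it; don't write it to disk -- the filesystem is ephemeral
    return HttpResponse("ok")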
Is it even possible to write to the local file system in Heroku?
Yes and no. You can write to the local filesystem, but it's going to get nuked. See: https://devcenter.heroku.com/articles/dynos#ephemeral-filesystem
Also, each dyno gets its own ephemeral filesystem, so each web process is in a way sandboxed.