So I'm following this tutorial http://rosslaird.com/blog/building-a-project-with-mezzanine/ for building a project with Mezzanine. I am extremely new to all of this stuff (including Linux and the command line) and frankly do not really know what I am doing. I am at this part of the tutorial:
Run this command within the same directory as your local_settings.py and settings.py files: python manage.py createdb
The tutorial says that after I enter the "python manage" command I will be "asked to create a super-user, to provide details for that user, and to answer a few more questions". When I entered the command none of those questions showed up. Why is this? Thank you very much in advance.
So you are trying to run this on the command line (terminal), right?
sudo -u postgres createuser --superuser $USER
sudo -u postgres psql
postgres=# \password [enter your username]
Enter new password:
Enter it again:
\q
createdb $USER
Replace $USER with your designated username. (Note that createdb is a shell command, not a psql statement, so it doesn't need a trailing semicolon.)
Sorry that the link you provided is 404'd (most likely because this post is over two years old).
But... I think I found it here (sort of). There are a few bits and pieces missing, which might have made it confusing. The "manage.py" should be in the directory one level above where your "settings.py | local_settings.py | urls.py" files reside. Just make sure you are in the appropriate directory when running the manage command. A quick ls or ls -la will show you which files are in the current directory. I myself am a novice Mezzanine user. I've been playing around with it for a year now and hope this info can serve as a quickstart guide for setting up Mezzanine on PostgreSQL while also resolving your issue.
So... Once the following conditions are met you should be able to create a Mezzanine project with a PostgreSQL database instance. But first, make sure you have Mezzanine set up and running without warnings or errors.
For Mezzanine Setup...
Preconditions:
You've installed Python, pip, etc...
Get virtualenv & virtualenvwrapper installed and configured.
pip install virtualenv
pip install virtualenvwrapper
Make sure to add the environment variables for your virtual environments. Just point the paths to where you want your Mezzanine projects to live, as well as the environments themselves. An easy way to do this is to edit your .bashrc in your home directory. I like to keep my virtualenvs separate from my working directory, but adjust the paths to suit yourself. Just run vi ~/.bashrc (no sudo needed for a file in your own home directory) and add the following...
## Virtualenvwrapper Settings
export WORKON_HOME=$HOME/.virtualenvs
export PROJECT_HOME=$HOME/envs/mezzanine/projects/live/here
source /usr/local/bin/virtualenvwrapper.sh
Note: Press i to enter insert mode; when you're done editing, press Esc to leave insert mode, then type :wq! to write (save) and quit. Then close your terminal and open a new one so the changes take effect (or run source ~/.bashrc in the current one).
Next, cd over to your project directory cd ~/envs/mezzanine/projects/live/here and create your virtualenv for your Mezzanine project (It'll activate once created).
mkvirtualenv environment_name
You can deactivate your environment by simply typing deactivate in terminal. To re-activate your environment, type workon environment_name
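Incidentally, mkvirtualenv is a thin wrapper over the same mechanism Python's standard-library venv module exposes. If you're curious what it does under the hood, here's a minimal stdlib sketch (the temp path is illustrative only, not where virtualenvwrapper actually puts environments):

```python
import os
import tempfile
import venv

# Create a throwaway virtual environment programmatically -- roughly what
# `mkvirtualenv environment_name` does, minus the wrapper conveniences.
target = os.path.join(tempfile.mkdtemp(), "environment_name")
venv.create(target, with_pip=False)  # with_pip=False keeps this fast

# The environment's interpreter lives under bin/ (Scripts\ on Windows).
print(os.path.isdir(target))  # → True
```

virtualenvwrapper adds the workon/deactivate bookkeeping on top of this.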
Now you can install Mezzanine...
pip install -U mezzanine
Then create your Mezzanine project and watch those folders get created in your project directory...
mezzanine-project project_name
Collect your templates & static files.
python manage.py collecttemplates
python manage.py collectstatic
Now create your db instance (by default this will be SQLite if you haven't changed anything in settings.py).
Make sure you have your ALLOWED_HOSTS configured and edit your settings.py if you haven't already.
vi ~/envs/mezzanine/projects/live/here/project_name/project_name/settings.py
ALLOWED_HOSTS = [
    '127.0.0.1',  # hostnames only -- ALLOWED_HOSTS entries must not include a port
    'localhost',
    'www.mydomain.com',  # if you want to set that too
]
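One reason ports don't belong in ALLOWED_HOSTS: Django strips the port from the incoming Host header before comparing it against the list. Here is a simplified sketch of that matching logic (an illustration only, not Django's actual implementation):

```python
# Simplified sketch of how Django validates the Host header against
# ALLOWED_HOSTS: the port is stripped first, and a leading dot acts as
# a subdomain wildcard. Illustration only, not Django's real code.
def host_allowed(host, allowed_hosts):
    host = host.rsplit(":", 1)[0].lower()  # drop any :port suffix
    for pattern in allowed_hosts:
        pattern = pattern.lower()
        if pattern.startswith("."):  # '.mydomain.com' matches subdomains too
            if host.endswith(pattern) or host == pattern[1:]:
                return True
        elif host == pattern:
            return True
    return False

print(host_allowed("127.0.0.1:8000", ["127.0.0.1", "localhost"]))  # → True
print(host_allowed("evil.com", ["127.0.0.1", "localhost"]))        # → False
```

So a request to 127.0.0.1:8000 is matched by the entry '127.0.0.1', and '127.0.0.1:8000' as a list entry would never match anything.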
Note: Don't forget to save your changes: press Esc, then type :wq!.
At this point you should be able to go back a directory, run your server with python manage.py runserver, and get a response from your localhost (loopback address) at port 8000 by opening a browser and typing in 127.0.0.1:8000. (Make sure your environment has been activated first.)
Now For Your PostgreSQL Database...
Check this out. It's a pretty good resource and even touches on virtualenv. You can also replace the Django references with Mezzanine (almost). The most important part is the database setup portion...
https://www.digitalocean.com/community/tutorials/how-to-use-postgresql-with-your-django-application-on-ubuntu-14-04
Install Postgres and dependencies. (You might need to run as sudo with -H flag here)
sudo apt-get install libpq-dev python-dev postgresql postgresql-contrib
Install psycopg2 (might need sudo -H as well)
sudo pip install psycopg2
Login as "postgres" user:
sudo -su postgres
Run the psql shell command: psql. You should see the 'postgres=#' text.
Now, create your database (Remember to end psql statements with a semicolon;) CREATE DATABASE mydb;
Create database user: CREATE USER mydbuser WITH PASSWORD 'mypassword';
Set your user roles:
ALTER ROLE mydbuser SET client_encoding TO 'utf8';
ALTER ROLE mydbuser SET default_transaction_isolation TO 'read committed';
ALTER ROLE mydbuser SET timezone TO 'UTC';
Then give the database user access rights: GRANT ALL PRIVILEGES ON DATABASE mydb TO mydbuser;
Press Ctrl+D to exit the psql shell, then type exit to leave the postgres user session.
Now, go to settings.py and local_settings.py in your Mezzanine project's working directory and update the DATABASES settings with the credentials you just created: cd ~/envs/mezzanine/projects/live/here/project_name/project_name/ and then vi settings.py
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.postgresql_psycopg2',
'NAME': 'mydb',
'USER': 'mydbuser',
'PASSWORD': 'mypassword',
'HOST': 'localhost',
'PORT': '5432',
}
}
Note: Don't forget local_settings.py
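To avoid keeping the real password hard-coded in two tracked files, one common pattern is to read the credentials from environment variables in both settings.py and local_settings.py. A sketch of the idea; the DB_* variable names are hypothetical, not anything Mezzanine defines:

```python
import os

# Sketch: pull DB credentials from the environment, falling back to the
# values used earlier in this guide. DB_NAME / DB_USER / DB_PASSWORD /
# DB_HOST / DB_PORT are made-up variable names for illustration.
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': os.environ.get('DB_NAME', 'mydb'),
        'USER': os.environ.get('DB_USER', 'mydbuser'),
        'PASSWORD': os.environ.get('DB_PASSWORD', 'mypassword'),
        'HOST': os.environ.get('DB_HOST', 'localhost'),
        'PORT': os.environ.get('DB_PORT', '5432'),
    }
}
```

That way local_settings.py and settings.py can share the same block, and the secrets live in the shell environment instead of the repo.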
Now you can create your database via manage.py.
python manage.py createdb
The above command should prompt you to do the initial setup of your database along with your site info, as well as create a superuser. Just follow the prompts. To create additional superusers just do
python manage.py createsuperuser
Now go back a directory to the project root (cd ..) and run your server: python manage.py runserver. And now... you should have your new Mezzanine project running on PostgreSQL. Congratulations!! :)
The tutorial is just wrong. The writer has got confused with the Postgres createdb command used earlier, and the actual manage.py command, which is syncdb.
You would be better off using the actual Django tutorial.
After creating a new lightsail django instance on AWS, I found that the folders /opt/bitnami/apps/ does not exist as referenced in the documentation https://aws.amazon.com/getting-started/hands-on/deploy-python-application/. I've created django instances before on AWS and have never encountered this issue.
bitnami@ip-172-26-4-185:~$ ls
bitnami_application_password  bitnami_credentials  htdocs  stack
bitnami@ip-172-26-4-185:~$ cd /
bitnami@ip-172-26-4-185:/$ cd opt
bitnami@ip-172-26-4-185:/opt$ cd bitnami
bitnami@ip-172-26-4-185:/opt/bitnami$ cd apps
-bash: cd: apps: No such file or directory
bitnami@ip-172-26-4-185:/opt/bitnami$ ls
apache   bncert-tool          bnsupport-tool  git      nami        properties.ini  stats
apache2  bnsupport            common          gonit    node       python          var
bncert   bnsupport-regex.ini  ctlscript.sh    mariadb  postgresql  scripts
Additional info:
16 GB RAM, 4 vCPUs, 320 GB SSD
Django
Virginia, Zone A (us-east-1a)
attached static ip address
Bitnami Engineer here,
The apps folder doesn't exist anymore in the Django solution. The guide you are following is not maintained by Bitnami and that's why it's not up to date. To create a new project in the new Bitnami Django solution, you will need to run these commands
sudo mkdir -p /opt/bitnami/projects/PROJECT
sudo chown $USER /opt/bitnami/projects
django-admin startproject PROJECT /opt/bitnami/projects/PROJECT
cd /opt/bitnami/projects/PROJECT
python manage.py migrate
python manage.py startapp helloworld
python manage.py runserver
and access port 8000 to see the new hello world project.
You can learn more about this in our official documentation
https://docs.bitnami.com/aws/infrastructure/django/get-started/start-django-project/
https://docs.bitnami.com/aws/infrastructure/django/get-started/deploy-django-project/
Thanks
I've gone through the same problem (the directories aren't created by the blueprint) and asked about it in the AWS Developer Forum.
The user donleyataws pointed to Bitnami's documentation, and the first thing it says is to create the projects directory and set its ownership.
First, create a new folder to store your Django projects, such as the /opt/bitnami/projects directory, and give write permissions to the current system user. Assuming you're in the bitnami folder (the one containing bitnami_application_password, bitnami_credentials, htdocs, and stack), run:
sudo mkdir projects
sudo chown $USER projects
I'm working to modify a cookiecutter Flask app.
Locally I have deleted the migrations folder and the SQLite db a couple of times during development. I've pushed my changes to Heroku.
When trying to migrate the Heroku Postgres db:
$ heroku run python manage.py db upgrade
.....
alembic.util.CommandError: Multiple head revisions are present for given argument 'head'; please specify a specific target revision, '<branchname>#head' to narrow to a specific head, or 'heads' for all heads
following http://alembic.readthedocs.org/en/latest/branches.html I tried:
$ heroku run python manage.py db merge heads
Running python manage.py db merge heads on myapp... up, run.9635
Generating /app/migrations/versions/35888775_.py ... done
Then I tried:
$ heroku run python manage.py db upgrade
Running python manage.py db upgrade on myapp... up, run.7021
INFO [alembic.runtime.migration] Context impl PostgresqlImpl.
INFO [alembic.runtime.migration] Will assume transactional DDL.
....
"%s#head" % branch_label if branch_label else "head")
alembic.util.CommandError: Multiple head revisions are present for given argument 'head'; please specify a specific target revision, '<branchname>#head' to narrow to a specific head
, or 'heads' for all heads
How can I merge the revision heads into one and make sure this is synced with my development version?
I contacted heroku support and got the following back (which worked for me):
Hi,
To remove a folder from your local repository, git rm needs to be run. Could you please try something like below?
$ git rm -r migrations
$ git commit -m 'Remove migrations directory'
$ git push heroku master
To see differences between actual files and what are registered onto your local repository, git status may be useful.
Please let us know if you have any difficulty here.
I have my Django app set up on Elastic Beanstalk and recently made a change to the DB that I would like to have applied to the live DB now. I understand that I need to set this up as a container command, and after checking the DB I can see that the migration was run, but I can't figure out how to have more control over the migration. For example, I only want a migration to run when necessary, but from my understanding, the container will run the migration on every deploy as long as the command is listed in the config file. Also, on occasion, I will be given options during a migration such as:
Any objects related to these content types by a foreign key will also be deleted.
Are you sure you want to delete these content types?
If you're unsure, answer 'no'
How do I set up the container command to respond to this with a yes during the deployment phase?
This is my current config file
container_commands:
01_migrate:
command: 'source /opt/python/run/venv/bin/actiate && python app/manage.py makemigrations'
command: 'source /opt/python/run/venv/bin/activate && python app/manage.py migrate'
Is there a way to set these 2 commands to only run when necessary and to respond to the yes/no options I receive during a migration?
I'm not sure there is a specific way to answer yes or no, but you can append --noinput to your container command. The --noinput option suppresses all user prompting, such as “Are you sure?” confirmation messages.
try
command: 'source /opt/python/run/venv/bin/activate && python app/manage.py migrate --noinput'
OR..
You can ssh into your Elastic Beanstalk instance and run your command manually.
Then you'll have more control over the migrations.
Install awsebcli with pip install awsebcli
Type eb ssh YourEnvironmentName
Navigate to your eb instance app directory with:
sudo -s
source /opt/python/run/venv/bin/activate
source /opt/python/current/env
cd /opt/python/current/app
then run your command.
./manage.py migrate
I hope this helps
Aside from the automatic migration that you can add to deploy script (which runs every time you update the environment, and may not be desirable if you have long running migration or other Django management commands), you can ssh into an EB instance to run migration manually.
Here is how to manually run migration (and any other Django management commands) while working with Amazon Linux 2 (Python 3.7, 3.8) created by Elastic Beanstalk:
First, from your EB cli: eb ssh to connect an instance.
The virtual environment can be activated by
source /var/app/venv/*/bin/activate
manage.py can be run with
python3 /var/app/current/manage.py
Now the only tricky bit is getting Elastic Beanstalk's environment variables. You can access them with /opt/elasticbeanstalk/bin/get-config. I'm not super familiar with bash scripting, but here is a little script that I use to get and set environment variables; maybe someone can improve it to make it less hard-coded:
#! /bin/bash
export DJANGO_SECRET_KEY=$(/opt/elasticbeanstalk/bin/get-config environment -k DJANGO_SECRET_KEY)
...
More info regarding Amazon Linux 2 platform script tools: https://docs.aws.amazon.com/elasticbeanstalk/latest/dg/custom-platforms-scripts.html
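Assuming get-config environment prints the variables as a single JSON object (which it does on the Amazon Linux 2 platform, as far as I can tell), the per-variable exports can be replaced with one loop. A sketch, with a sample string standing in for the real get-config call:

```python
import json
import os

# On Amazon Linux 2, `/opt/elasticbeanstalk/bin/get-config environment`
# prints the EB environment variables as one JSON object. In a real
# script you would obtain the text via something like:
#   subprocess.check_output(["/opt/elasticbeanstalk/bin/get-config", "environment"])
# Here a sample string stands in so the sketch is self-contained.
sample = '{"DJANGO_SECRET_KEY": "abc123", "DJANGO_SETTINGS_MODULE": "mysite.settings"}'

def load_eb_env(raw_json):
    """Merge EB's environment JSON into os.environ, key by key."""
    for key, value in json.loads(raw_json).items():
        os.environ[key] = value

load_eb_env(sample)
print(os.environ["DJANGO_SECRET_KEY"])  # → abc123
```

This avoids hard-coding one export line per variable as in the bash script above.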
Make sure that the same settings are used when migrating and running!
Thus I would recommend you change this kind of code in django.config
container_commands:
01_migrate:
command: "source /opt/python/run/venv/bin/activate && python manage.py migrate"
leader_only: true
to:
container_commands:
01_migrate:
command: "django-admin migrate"
leader_only: true
option_settings:
aws:elasticbeanstalk:application:environment:
DJANGO_SETTINGS_MODULE: fund.productionSettings
as recommended here. This will help you avoid issues with wrong settings used.
More on manage.py v.s. django-admin.py.
If the django-admin method is not working because it was not configured properly, you can also use python manage.py migrate in
.ebextentions/django.config
container_commands:
01_migrate:
command: "python manage.py migrate"
leader_only: true
In reference to Oscar Chen's answer, you can set environment variables using the eb cli with
eb setenv key1=value1 key2=value2 ...etc
The trick is that the full output of container_commands is in /var/log/cfn-init-cmd.log (Amazon Linux 2 Elastic Beanstalk released November 2020).
To view this you would run:
eb ssh [environment-name]
sudo tail -n 50 -f /var/log/cfn-init-cmd.log
This doesn't seem to be documented anywhere obvious and it's not displayed by eb logs; I found it by hunting around in /var/log.
The Django example management command django-admin.py migrate did not work for me. Instead I had to use something like:
01_migrate:
command: "$PYTHONPATH/python manage.py migrate"
leader_only: true
02_collectstatic:
command: "$PYTHONPATH/python manage.py collectstatic --noinput --verbosity=0 --clear"
To see the values of your environment variables at deploy time, you can create a debug command like:
03_debug:
command: "env"
You can see most of these environment variables with eb ssh; sudo cat /opt/elasticbeanstalk/deployment/env, but there seem to be some subtle differences at deploy time, hence using env above to be sure.
Here you'll see that $PYTHONPATH is being used in a non-typical way, pointing to the virtualenv's bin directory, not the site-packages directory.
This answer looks like it will work for you if you just want to send "yes" to a few prompts.
You might also consider the --noinput flag so that your config looks like:
container_commands:
01_migrate:
command: 'source /opt/python/run/venv/bin/activate && python app/manage.py makemigrations'
command: 'source /opt/python/run/venv/bin/activate && python app/manage.py migrate --noinput'
This takes the default setting, which is "no".
It also appears that there's an open issue/fix to solve this problem a better way.
I am learning Django from the official documentation and while going through the tutorial at https://docs.djangoproject.com/en/1.7/intro/tutorial01/, I am stuck at creating a project part.
When I run django-admin.py startproject mysite I am getting following error
C:\Python34>django-admin.py startproject mysite
Usage: django-admin.py subcommand [options] [args]
Options:
-v VERBOSITY, --verbosity=VERBOSITY
Verbosity level; 0=minimal output, 1=normal output,
2=verbose output, 3=very verbose output
--settings=SETTINGS The Python path to a settings module, e.g.
"myproject.settings.main". If this isn't provided, the
DJANGO_SETTINGS_MODULE environment variable will be
used.
--pythonpath=PYTHONPATH
A directory to add to the Python path, e.g.
"/home/djangoprojects/myproject".
--traceback Raise on exception
--no-color Don't colorize the command output.
--version show program's version number and exit
-h, --help show this help message and exit
Type 'django-admin.py help <subcommand>' for help on a specific subcommand.
Available subcommands:
[django]
check
compilemessages
createcachetable
dbshell
diffsettings
dumpdata
flush
inspectdb
loaddata
makemessages
makemigrations
migrate
runfcgi
runserver
shell
sql
sqlall
sqlclear
sqlcustom
sqldropindexes
sqlflush
sqlindexes
sqlinitialdata
sqlmigrate
sqlsequencereset
squashmigrations
startapp
startproject
syncdb
test
testserver
validate
Note that only Django core commands are listed as settings are not properly configured (error: Requested setting INSTALLED_APPS, but settings are not configured. You must either define the environment variable DJANGO_SETTINGS_MODULE or call settings.configure() before accessing settings.).
I am using Python 3.4.1 and django 1.7. I don't have any other Django version installed and this is the first project I am creating.
You can just run django-admin startproject mysite (note: not django-admin.py), because if you installed Django via pip, an executable named django-admin.exe was added to C:\Python34\Scripts, which is normally already in your PATH environment variable (if not, add it to PATH).
I was facing the same issue while installing Django 2.0.5. This can be resolved using Virtual Environments.
Environment details:
Python version: 3.6
OS: Ubuntu 18.xx.x
Steps:
Install Virtual Environment
pip install virtualenv
Create a Virtual work-space('VEnv')
virtualenv --python=python3 VEnv
Activate the Virtual Environment:
cd VEnv/
source bin/activate
Install Django(Version - 2.0.5)
pip install django==2.0.5
Create Project (Project Name: HelloDot)
django-admin startproject HelloDot
Run the server as below and then access it from "http://127.0.0.1:8000/"
cd HelloDot/
python manage.py runserver 8000
For more details, take a look at this: https://www.howtoforge.com/tutorial/how-to-install-django-on-ubuntu/
Make sure that you follow the troubleshooting guide because it looks like you don't have django-admin.py on your system path correctly. From the docs:
django-admin.py should be on your system path if you installed Django
via python setup.py. If it’s not on your path, you can find it in
site-packages/django/bin, where site-packages is a directory within
your Python installation. Consider symlinking to django-admin.py from
some place on your path, such as /usr/local/bin.
You should also use a virtualenv for each of your projects to allow isolation of dependencies per project and easier management of them. virtualenvwrapper is a useful tool for creating and managing your virtualenvs.
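A quick way to confirm whether django-admin (or django-admin.py) is actually visible on your path is the standard library's shutil.which, which mirrors the shell's which command:

```python
import shutil

def on_path(command):
    """Return the resolved full path of `command` if it is on PATH,
    or None if the shell would not find it."""
    return shutil.which(command)

# Will print the full path if Django's scripts directory is on PATH,
# or None otherwise (exact result depends on your installation).
print(on_path("django-admin"))
```

If this prints None, the symlinking advice quoted above (or adding the Scripts directory to PATH on Windows) is what fixes it.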
I cannot seem to figure out how to integrate my current Django project to run tests on Travis CI. Right now I have PostgreSQL set up to run on my local machine when unit tests are run.
language: python
python:
- 3.4.1
addons:
postgresql: "9.3"
before_script:
- psql -U postgres -c "create extension postgis"
- psql -c 'create database travis_ci_test;' -U postgres
install:
- pip install -r requirements.txt
- pip install coveralls
script:
coverage run --source=calculator manage.py test
after_success:
coveralls
Travis tells me:
$ coverage run --source=calculator manage.py test
Creating test database for alias 'default'...
Got an error creating the test database: permission denied to create database
Type 'yes' if you would like to try deleting the test database 'test_dk5va592r6j0v', or 'no' to cancel:
And right now I have a hacky set-up to deal with switching between a local db and my heroku db:
import dj_database_url
if DEBUG:
DATABASE_URL = 'postgres://localhost/storage'
else:
DATABASE_URL = 'postgres://somerealurl'
DATABASES = {'default': dj_database_url.config(default=DATABASE_URL)}
Does anyone have a good way to fix my issue? It seems I need to be able to create a PostgreSQL database on Travis, and then run my tests so I can get coverage. DEBUG will also have to be set to False whenever the code is checked in.
If you could post a working Travis, Django, and PSQL setup that would be awesome!
What I have done, and what has been successful for me, is setting DATABASE_URL as an environment variable and just using
DATABASES = {'default': dj_database_url.config(default=DATABASE_URL)}
in the code, which will switch gracefully from local to production.
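For reference, dj_database_url essentially just splits the DATABASE_URL into the pieces Django's DATABASES dict expects. A rough stdlib approximation of the idea (a sketch, not the library's real code; use the library itself in practice):

```python
from urllib.parse import urlparse

# Rough sketch of what dj_database_url.config does with a postgres:// URL.
# Real dj_database_url also handles engine detection, query options, etc.
def parse_database_url(url):
    parts = urlparse(url)
    return {
        'ENGINE': 'django.db.backends.postgresql_psycopg2',
        'NAME': parts.path.lstrip('/'),
        'USER': parts.username or '',
        'PASSWORD': parts.password or '',
        'HOST': parts.hostname or '',
        'PORT': str(parts.port) if parts.port else '',
    }

config = parse_database_url('postgres://user:secret@localhost:5432/storage')
print(config['NAME'])  # → storage
```

Because the whole connection is described by one URL, switching between the local database and Heroku's is just a matter of which DATABASE_URL is set in the environment.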
Here is a working travis config using Postgres, Django and deployed on Heroku.
https://github.com/kevgathuku/sermonbuddy/blob/master/.travis.yml