I have to get the names of Postgres databases in Python. How do I do that? I need to be able to put these names into a list.
You can use pg_dumpall and psql as a superuser:
pg_dumpall -h localhost -U postgres -f pg_dumpall.sql
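If the goal is just to get the database names into a Python list, another option is to query the pg_database catalog. A minimal sketch using psycopg2 (the connection parameters are assumptions, adjust them for your server):

import psycopg2

# Connect to the maintenance database; any database in the cluster will do.
conn = psycopg2.connect(host="localhost", dbname="postgres", user="postgres", password="secret")
with conn.cursor() as cur:
    # pg_database lists every database in the cluster; skip the templates.
    cur.execute("SELECT datname FROM pg_database WHERE datistemplate = false;")
    names = [row[0] for row in cur.fetchall()]
conn.close()
print(names)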
Please help, I get this "stdin is not a tty" message when I run the command below in my terminal.
psql -U postgres kdc < kdc.psql
kdc is the database and kdc.psql is the psql file with commands to populate the database. I am in the directory that holds the psql file.
I am not sure what causes that message (it does not happen here), but you should be able to avoid it using the -f option:
psql -U postgres -d kdc -f kdc.psql
I'm developing a website with Django 3 (in a Docker container) using PostgreSQL as the backend; i.e. in the project settings file I have:
DATABASES = {
'default': {
'ENGINE': 'django.db.backends.postgresql',
'NAME': 'postgres',
'USER': 'postgres',
'PASSWORD': 'postgres',
'HOST': 'db',
'PORT': 5432
}
}
I've populated the backend database and can browse the data using the admin. However, I'd like to connect to the database via the command line so I can more easily test queries. I have tried connecting to the database the normal way from the command line:
sudo -u postgres psql
postgres=# \c postgres
The problem is that there is no data found:
postgres=# \dt
Did not find any relations.
Since I'm new to Docker, I thought I'd try connecting other ways as well; specifically, based on another post, I tried:
sudo docker run -d -p 5432 -t postgres/postgresql /bin/su postgres -c '/usr/lib/postgresql/10/bin/postgres -D /var/lib/postgresql/10/main -c config_file=/etc/postgresql/10/main/postgresql.conf'
This throws an error:
pull access denied for psql/postgresql, repository does not exist or may require 'docker login'
Again, I'd like to connect to the database via the command line so I can more easily test queries. Perhaps I'm on the right track, but assistance would be appreciated.
It is a bad idea to use postgres for the database name, as there is a postgres database used by default for maintenance by PostgreSQL itself. I'd recommend calling the database something like my_project, then creating a service account user my_project_user and assigning it a password:
sudo -u postgres psql
postgres=# CREATE USER my_project_user WITH PASSWORD 'blahblah';
postgres=# CREATE DATABASE my_project WITH OWNER my_project_user;
postgres=# \q
Update your Django DATABASES["default"] settings accordingly, and run migrations. After running migrations, you can access your Django database using the following management command:
python manage.py dbshell
You may be able to issue the command above with your current setup, but you may run into problems using postgres as your database name. Good luck!
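For reference, the DATABASES entry from your settings file would then look roughly like this (a sketch: the host and port are kept from your question, the names and password from the commands above):

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'my_project',
        'USER': 'my_project_user',
        'PASSWORD': 'blahblah',
        'HOST': 'db',
        'PORT': 5432
    }
}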
UPDATE: my docker-compose.yml file had the following service:
db:
image: postgres:11
volumes:
- postgres_data:/var/lib/postgresql/data/
I am able to connect to the backend database using the following commands:
sudo docker-compose exec db psql -U postgres
postgres=# \c postgres
As suggested by @FlipperPA, I will start using a different username and database name.
The command below opens the command-line client for PostgreSQL so that you can test queries:
python manage.py dbshell
I am trying to directly export a table from Redshift to my local computer. I am successful in getting data from Redshift, however it doesn't differentiate any of the data types. When I check the pandas dtypes, they all come out as object rather than specific types such as strings or date timestamps. I would also like the column headers to be included straight from the export.
I've successfully exported to my local machine using a psql command from my terminal to access Redshift.
psql -h omaha-prod-cluster.example.us-east-1.redshift.amazonaws.com -d prod -U <username> -p 5439 -A -t -c "select * from l2_survey.survey_customerinsight" -F ',' -o Downloads/survey_customerInsights.csv
I am then running the pandas command to check the data types:
data.dtypes
and it returns every column with the data type object. The psql command above also doesn't give me the column headers.
There is a problem with your command: you are explicitly asking psql to skip the column names by supplying the -t argument, which tells it to export tuples only, without column names. Change it as below and it will include the header.
psql -h <host-values>.redshift.amazonaws.com -U <user> -d <database> -p 5439 -c "select * from your_schema.your_table" > out.txt
Hope it helps you.
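On the pandas side, a CSV file carries no type information, so timestamp columns come back as object unless you tell read_csv how to parse them. Assuming the export now starts with a header row, a rough sketch (the date column name is made up for illustration):

import pandas as pd

# header=0 takes the column names from the first row of the export;
# parse_dates converts the listed columns to datetime64 instead of object.
data = pd.read_csv(
    "Downloads/survey_customerInsights.csv",
    header=0,
    parse_dates=["survey_date"],  # hypothetical timestamp column
)
print(data.dtypes)

Plain text columns will still show as object; that is normal for pandas string data.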
I have an application inside a Docker container which uses MongoDB.
There is a particular db called "device" and it has a collection called "inventory". I want all of this collection in a JSON or CSV file, so that I can export the data into Splunk and analyze it.
When I try to get information for one device id, it works, and I am redirecting it to a file:
sudo /usr/local/bin/docker exec -it ss2-$HOSTNAME mongo device -u marco -p <password> --authenticationDatabase admin --eval 'printjson(db.inventory.findOne({"host_ip": "1.1.1.1"}))' --quiet >> /tmp/json.txt
But I want the whole collection to be displayed and redirected to /tmp/json.txt.
I did try:
sudo /usr/local/bin/docker exec -it ss2-$HOSTNAME mongo device -u marco -p <password> --authenticationDatabase admin --eval 'printjson(db.inventory.find( {} ))' --quiet >> /tmp/json.txt
This displays all the collection data when I run it inside the container, but when I run it outside I don't see any data.
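A couple of guesses, since there is no way to test against your setup here: when redirecting output, the -t (TTY) flag to docker exec can interfere, and the usual way to print every document from the shell is to iterate the cursor, e.g. db.inventory.find({}).forEach(printjson). Alternatively, if the mongod port is reachable from wherever you run the script, a small pymongo sketch can dump the collection to JSON (connection details mirror your command and are assumptions):

from bson import json_util
from pymongo import MongoClient

# Credentials and host are placeholders; adjust to match your deployment.
client = MongoClient("localhost", 27017, username="marco", password="<password>", authSource="admin")
docs = list(client["device"]["inventory"].find({}))

# json_util handles BSON types (ObjectId, ISODate) that the plain json module cannot.
with open("/tmp/json.txt", "w") as f:
    f.write(json_util.dumps(docs, indent=2))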
I have written code in Python which accesses a MySQL database on my computer. My question is: how do I make my program run on other machines, i.e. how do I transfer the database?
Thank you for reading...
Use the tools that come with the MySQL installation, from the command line.
Backup:
mysqldump -u root -ppass21 --databases yourdb > multibackup.sql
Restore:
mysql -u sadmin -ppass21 Customers < multibackup.sql
Backing-up-and-restoring-your-MySQL-Database
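Since the program is already in Python, you can also drive the same tools from a small script, so the transfer becomes part of your setup steps. A sketch using subprocess (user names, passwords, and file names are the placeholders from the commands above):

import subprocess

# Dump the database to a file (equivalent to the mysqldump command above).
with open("multibackup.sql", "w") as out:
    subprocess.run(
        ["mysqldump", "-u", "root", "-ppass21", "--databases", "yourdb"],
        stdout=out, check=True,
    )

# Restore it on another machine (equivalent to the mysql restore above).
with open("multibackup.sql") as dump:
    subprocess.run(
        ["mysql", "-u", "sadmin", "-ppass21", "Customers"],
        stdin=dump, check=True,
    )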