Export Query Result as CSV file from Docker PostgreSQL container to local machine

I'm not sure if this is possible or if I'm doing something wrong, since I'm still pretty new to Docker. Basically, I want to export a query result from inside a PostgreSQL docker container as a csv file to my local machine.
This is where I got so far. Firstly, I run my PostgreSQL docker container with this command:
sudo docker run --rm --name pg-docker -e POSTGRES_PASSWORD=something -d -p 5432:5432 -v $HOME/docker/volumes/postgres:/var/lib/postgresql/data postgres
Then I access the docker container with docker exec and run the PostgreSQL command that should copy the query result to a csv file at a specified location, like this:
\copy (select id,value from test) to 'test_1.csv' with csv;
I thought that should export the query result as a csv file named test_1.csv on my local machine, but I couldn't find the file anywhere. I checked both of these directories: $HOME/docker/volumes/postgres and /var/lib/postgresql/data.

You can export the data to STDOUT and pipe the result to a file on the client machine:
docker exec -it -u database_user_name container_name \
psql -d database_name -c "COPY (SELECT * FROM table) TO STDOUT CSV" > output.csv
-c tells psql to execute the given SQL statement once the connection is established.
So your command should look like this:
docker exec -it -u postgres pg-docker \
psql -d yourdb -c "COPY (SELECT * FROM test) TO STDOUT CSV" > test_1.csv
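Two small notes on this, for what they're worth: COPY also accepts a HEADER option if you want column names in the first row, and the -t flag allocates a pseudo-TTY that can rewrite line endings in redirected output, so it's safer to drop -it when piping to a file. A variant with both changes, using the same yourdb database from above:
docker exec -u postgres pg-docker \
psql -d yourdb -c "COPY (SELECT * FROM test) TO STDOUT WITH CSV HEADER" > test_1.csv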

The /var/lib/postgresql/data directory is where the database server stores its data files. It isn't a directory that users need to manipulate directly, and it isn't where your exported file will turn up.
Paths like test_1.csv are relative to the current working directory. The default directory when you enter the postgres container with docker exec is /, so that's where your file should be. You can also switch to another directory with cd before running psql:
root@b9e5a0572207:/# cd /some/other/path/
root@b9e5a0572207:/some/other/path# psql -U postgres
... or you can provide an absolute path:
\copy (select id,value from test) to '/some/other/path/test_1.csv' with csv;
You can use docker cp to transfer a file from the container to the host:
docker cp pg-docker:/some/other/path/test_1.csv /tmp
... or you can create a volume if this is something you do often.
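For the volume route, a minimal sketch: mount an extra host directory when starting the container ($HOME/pg-exports is a hypothetical path) and point \copy at it:
sudo docker run --rm --name pg-docker -e POSTGRES_PASSWORD=something -d -p 5432:5432 \
-v $HOME/docker/volumes/postgres:/var/lib/postgresql/data \
-v $HOME/pg-exports:/exports \
postgres
\copy (select id,value from test) to '/exports/test_1.csv' with csv;
Since /exports is bind-mounted, the file lands directly in $HOME/pg-exports on the host.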

Related

Restoring database through docker

Currently learning SQL online. I've been trying to restore the database from this link:
http://app.sixweeksql.com:2000/SqlCourse.bak
when I run SQL Server through Docker (Mac user, can't run SSMS unfortunately). I've been following directions from Microsoft here:
https://learn.microsoft.com/en-us/sql/linux/tutorial-restore-backup-in-sql-server-container?view=sql-server-2017
I moved the file into my container and checked the files listed inside it (CourseNew and CourseNew_log) so I could write out its file path:
sudo docker cp SqlCourse.bak container_name:/var/opt/mssql/backup
followed by:
sudo docker exec -it container_name /opt/mssql-tools/bin/sqlcmd -S localhost \
-U SA -P "Password" \
-Q 'RESTORE FILELISTONLY FROM DISK = "/var/opt/mssql/backup/SqlCourse.bak"'
Yet I just don't know how to restore the database. I've tried this:
sudo docker exec -it container_name /opt/mssql-tools/bin/sqlcmd \
-S localhost -U SA -P "Password" \
-Q 'RESTORE DATABASE SqlCourse FROM DISK = "/var/opt/mssql/backup/SqlCourse.bak" WITH MOVE "CourseNew" TO "/var/opt/mssql/data/SqlCourse.mdf", MOVE "CourseNew_log" TO "/var/opt/mssql/data/SqlCourse.ldf"'
and it returns "unexpected argument." Clearly that's not the right call but I'm not sure how else to proceed.
(Running mcr.microsoft.com/mssql/server:2019-CTP3.2-ubuntu)
Single quotes are used to enclose string literals in T-SQL (double quotes are reserved for identifiers), so the resultant RESTORE T-SQL script needs to be:
RESTORE DATABASE SqlCourse
FROM DISK = '/var/opt/mssql/backup/SqlCourse.bak'
WITH
MOVE 'CourseNew' TO '/var/opt/mssql/data/SqlCourse.mdf'
, MOVE 'CourseNew_log' TO '/var/opt/mssql/data/SqlCourse.ldf';
Since you are passing the command as a bash shell command-line argument, you also need to prefix the argument string with $ and escape the single quotes within the string by preceding each with a \:
sudo docker exec -it container_name /opt/mssql-tools/bin/sqlcmd \
-S localhost -U SA -P "Password" \
-Q $'RESTORE DATABASE SqlCourse FROM DISK = \'/var/opt/mssql/backup/SqlCourse.bak\' WITH MOVE \'CourseNew\' TO \'/var/opt/mssql/data/SqlCourse.mdf\', MOVE \'CourseNew_log\' TO \'/var/opt/mssql/data/SqlCourse.ldf\';'
You can avoid the escaping ugliness by copying the normal RESTORE script into the container and running it with the sqlcmd -i argument.
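A sketch of that approach, assuming the RESTORE script above is saved locally as restore.sql (the file name and in-container path are just examples):
docker cp restore.sql container_name:/var/opt/mssql/restore.sql
sudo docker exec -it container_name /opt/mssql-tools/bin/sqlcmd \
-S localhost -U SA -P "Password" \
-i /var/opt/mssql/restore.sql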

How can I import a SQL Server RDS backup into a SQL Server Linux Docker instance?

I've followed the directions from the AWS documentation on importing / exporting a database from RDS using their stored procedures.
The command was similar to:
exec msdb.dbo.rds_backup_database
    @source_db_name='MyDatabase',
    @s3_arn_to_backup_to='my-bucket/myBackup.bak'
This part works fine, and I've done it plenty of times in the past.
However, what I want to achieve now is restoring this database to a local SQL Server instance, and I'm struggling at this point. I'm assuming this isn't a "normal" SQL Server dump, but I'm unsure what the difference is.
I've spun up a new SQL Server for Linux Docker instance, which seems all set. I have made a few changes so that the sqlcmd tool is installed, so technically the image I'm running is built from this Dockerfile; not much different.
FROM microsoft/mssql-server-linux:2017-latest
RUN apt-get update && \
    apt-get install -y curl && \
    curl https://packages.microsoft.com/keys/microsoft.asc | apt-key add - && \
    apt-get update && \
    apt-get install -y mssql-tools unixodbc-dev
This image works fine; I'm building it via docker build -t sql . and running it via docker run -e 'ACCEPT_EULA=Y' -e 'SA_PASSWORD=myPassword1!' -p 1433:1433 -v $(pwd):/backups sql
Within my local folder, I have my backup from RDS downloaded, so this file is now in /backups/myBackup.bak
I now try to run sqlcmd to import the data with the following command, and I'm running into an issue which makes me assume this isn't a traditional SQL dump. I'm unsure what a traditional SQL dump looks like, but the majority of the file looks garbled with ^#^#^#^# and of course other things.
/opt/mssql-tools/bin/sqlcmd -S localhost -i /backups/myBackup.bak -U sa -P myPassword1! -x
And finally; I get this error:
Sqlcmd: Error: Syntax error at line 56048 near command 'GO' in file '/backups/myBackup.bak'.
Final Answer
My final solution for this mainly came from using -Q and running a RESTORE query rather than importing the file with -i, but I also needed to include some MOVE options as they were pointing at Windows file paths.
/opt/mssql-tools/bin/sqlcmd -U SA -P myPassword -Q "RESTORE DATABASE MyDatabase FROM DISK = N'/path/to/my/file.bak' WITH MOVE 'mydatabase' TO '/var/opt/mssql/mydatabase.mdf', MOVE 'mydatabase_log' TO '/var/opt/mssql/mydatabase.ldf', REPLACE"
You should use the RESTORE DATABASE command to interact with your backup file instead of specifying it as an input file of commands to the database:
/opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P myPassword1! -Q "RESTORE DATABASE MyDatabase FROM DISK='/backups/myBackup.bak'"
According to the sqlcmd Docs, the -i flag you used specifies:
The file that contains a batch of SQL statements or stored procedures.
That flag likely won't work properly if given a database backup file as an argument.
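As an aside, if a plain RESTORE fails because the data file paths inside the backup don't exist in the container, the logical names needed for WITH MOVE clauses (as in the final answer above) can be listed first with standard T-SQL:
/opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P 'myPassword1!' \
-Q "RESTORE FILELISTONLY FROM DISK = N'/backups/myBackup.bak'"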

Postgres container does not see user

I have a problem: I'm trying to run a Postgres instance inside a docker container, to be used by a java application. I try running the following command:
docker run --name postgres -e POSTGRES_USER=root -e POSTGRES_PASSWORD=postgres -v postgres:/var/lib/postgresql/data -P -d postgres
The container seems to be created successfully. But when I try to access it to create a DB or table, I do:
docker exec -it postgres /bin/bash
If I run the following:
psql -u postgres -p
The following response is returned:
/usr/lib/postgresql/10/bin/psql: invalid option -- 'u'
And that's no good for my application. I have read on Docker Hub to use -e POSTGRES_USER and -e POSTGRES_PASSWORD to set it, but it doesn't work.
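For what it's worth, the invalid option error comes from psql's flags rather than from the container: psql takes the user with a capital -U and prompts for a password with -W. Assuming the root user created by the run command above, the call inside the container would be:
psql -U root -W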

Docker: Run commands from multiple containers

I want to execute a command that uses commands from multiple containers.
E.g., I want to execute a backup script that uses psql and pg_dump commands.
docker exec db_backup pg_dump
failed to exec: exec: "pg_dump": executable file not found in $PATH
docker run has an option --link. Is there a similar option for exec?
To clear this up, there are 3 containers:
my_app
db
db_backup
I want to use pg commands located in db from my db_backup scripts.
There is no --link option for docker exec. If you want to back up using a special script:
Create a new image db_backup starting from the postgresql one (the one that the db container uses), adding the backup script to some folder.
Do docker run --volumes-from db db_backup your_backup_script.sh, as sketched below.
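A minimal sketch of those two steps, assuming the script is added to the image as /backup.sh (the file name is just an example):
# Dockerfile for the db_backup image, based on the image the db container uses
FROM postgres
COPY backup.sh /backup.sh
# build it, then run the script with the db container's volumes attached:
docker build -t db_backup .
docker run --rm --volumes-from db db_backup /backup.sh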
1) go to the db container's shell using sudo docker exec -ti db /bin/bash
2) type which pg_dump, or locate pg_dump if the first fails
3) use the full path in your command: sudo docker exec db /full_path_to/pg_dump
Steps 1) and 2) happen inside your db container; step 3) is run from the host.
note: on my Fedora, pg_dump points to /usr/bin/pg_dump
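Putting it together with the path found above (the database name mydb is just an example):
sudo docker exec db /usr/bin/pg_dump -U postgres mydb > backup.sql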

Import dump/sql file into my postgresql database on Linode

I recently moved my Ruby on Rails 4 app from Heroku to Linode. Everything has been set up correctly, but I need to populate my database with a file; let's call it movies.sql.
I am not very familiar with postgresql commands or working on a VPS, so I'm having trouble getting this done. I uploaded the file to Dropbox, since I saw in many SO posts that you can use S3/Dropbox.
I saw different commands like this (unsure how to go about it in my situation):
psql -U postgres -d testdb -f /home/you/file.sql
psql -f file.sql dbname
psql -U username -d myDataBase -a -f myInsertFile
So which is the correct one in my situation, and how do I run it when I SSH into my Linode? Thanks
You'll need to get the file onto your server or you'll need to use a different command from your terminal.
If you have the file locally, you can restore without sshing in using the psql command:
psql -h <ip_address_of_server> -U <database_username> -d <name_of_the_database> -f local/path/to/your/file.sql
Otherwise, the command is:
psql -U <database_username> -d <name_of_the_database> < remote/path/to/your/file.sql
-U sets the db username, -h sets the host, -d sets the name of the database, and -f tells psql you're restoring from a file.
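Concretely, for the movies.sql example, one way is to upload the file and run psql over SSH (the user names, IP, and database name here are hypothetical):
scp movies.sql you@your_linode_ip:/home/you/
ssh you@your_linode_ip
psql -U postgres -d your_database -f /home/you/movies.sql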