Restore database from SQL file in Docker

I am trying to restore a database from a SQL file into a dockerized PostgreSQL instance using this command:
cat pathfile.sql | docker exec -i dbcontainer psql -U user
But when I run this command, the console does nothing; it doesn't even throw an error.
I have checked the database, and nothing was created.
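A hedged note: psql without -d connects to the database named after the user, which may not be the intended target. Naming the target database explicitly and telling psql to stop on errors makes failures visible (dbname is a placeholder):
cat pathfile.sql | docker exec -i dbcontainer psql -U user -d dbname -v ON_ERROR_STOP=1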

Related

Run sql scripts using batch file

I am trying to create a batch file that connects to my local Postgres 14 server and creates tables by opening a .sql file that contains all the table scripts.
For now, I was able to connect to the server using this code:
psql -U postgres -h localhost -d test
This command opens the command prompt and asks for the password. After that, how do I update the batch file to supply the password and automatically create the tables from my .sql file?
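A minimal sketch of such a batch file, with placeholder values for the password and script path (PGPASSWORD is the standard way to supply a psql password non-interactively, and -f runs a script file):
@echo off
set PGPASSWORD=your_password
psql -U postgres -h localhost -d test -f "C:\path\to\tables.sql"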

Export Query Result as CSV file from Docker PostgreSQL container to local machine

I'm not sure if this is possible or if I'm doing something wrong, since I'm still pretty new to Docker. Basically, I want to export a query result from inside a PostgreSQL docker container as a CSV file on my local machine.
This is where I got so far. Firstly, I run my PostgreSQL docker container with this command:
sudo docker run --rm --name pg-docker -e POSTGRES_PASSWORD=something -d -p 5432:5432 -v $HOME/docker/volumes/postgres:/var/lib/postgresql/data postgres
Then I access the container with docker exec and run a PostgreSQL command that copies the query result to a CSV file at a specified location, like this:
\copy (select id,value from test) to 'test_1.csv' with csv;
I thought that would export the query result as a CSV file named test_1.csv on my local machine, but I couldn't find the file anywhere. I also checked both of these directories: $HOME/docker/volumes/postgres and /var/lib/postgresql/data.
You can export the data to the STDOUT and pipe the result to a file in the client machine:
docker exec -it -u database_user_name container_name \
psql -d database_name -c "COPY (SELECT * FROM table) TO STDOUT CSV" > output.csv
-c tells psql to execute the given SQL statement once the connection is established.
So your command should look like this:
docker exec -it -u postgres pgdocker \
psql -d yourdb -c "COPY (SELECT * FROM test) TO STDOUT CSV" > test_1.csv
The /var/lib/postgresql/data directory is where the database server stores its data files. It isn't a directory that users need to manipulate directly, and nothing of interest to you will be found there.
Paths like test_1.csv are relative to the working directory. The default directory when you enter the postgres container with docker exec is /, so that's where your file should be. You can also switch to another directory with cd before running psql:
root@b9e5a0572207:/# cd /some/other/path/
root@b9e5a0572207:/some/other/path# psql -U postgres
... or you can provide an absolute path:
\copy (select id,value from test) to '/some/other/path/test_1.csv' with csv;
You can use docker cp to transfer a file from the container to the host:
docker cp pg-docker:/some/other/path/test_1.csv /tmp
... or you can create a volume if this is something you do often.
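For the volume route, a sketch ($HOME/exports is an arbitrary host directory chosen for illustration): bind-mount an export directory when starting the container, then \copy into it.
sudo docker run --rm --name pg-docker -e POSTGRES_PASSWORD=something -d -p 5432:5432 \
  -v $HOME/docker/volumes/postgres:/var/lib/postgresql/data \
  -v $HOME/exports:/exports postgres
Then, inside psql:
\copy (select id,value from test) to '/exports/test_1.csv' with csv;
The file lands in $HOME/exports on the host.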

How can I import a SQL Server RDS backup into a SQL Server Linux Docker instance?

I've followed the directions from the AWS documentation on importing / exporting a database from RDS using their stored procedures.
The command was similar to:
exec msdb.dbo.rds_backup_database
@source_db_name='MyDatabase',
@s3_arn_to_backup_to='my-bucket/myBackup.bak'
This part works fine, and I've done it plenty of times in the past.
What I want to achieve now is restoring this database to a local SQL Server instance, but I'm struggling at this point. I'm assuming this isn't a "normal" SQL Server dump, but I'm unsure what the difference is.
I've spun up a new SQL Server for Linux Docker instance, which seems all set. I made a few changes so that the sqlcmd tool is installed, so the image I'm running is built from this Dockerfile; not much different:
FROM microsoft/mssql-server-linux:2017-latest
RUN apt-get update && \
apt-get install -y curl && \
curl https://packages.microsoft.com/keys/microsoft.asc | apt-key add - && \
apt-get update && \
apt-get install -y mssql-tools unixodbc-dev
This image works fine; I'm building it via docker build -t sql . and running it via docker run -e 'ACCEPT_EULA=Y' -e 'SA_PASSWORD=myPassword1!' -p 1433:1433 -v $(pwd):/backups sql
Within my local folder, I have my backup from RDS downloaded, so this file is now in /backups/myBackup.bak
I now try to run sqlcmd to import the data with the following command, and I'm running into an issue that makes me assume this isn't a traditional SQL dump. I'm not sure what a traditional SQL dump looks like, but the majority of the file is garbled binary content (runs of ^@^@^@^@ and so on).
/opt/mssql-tools/bin/sqlcmd -S localhost -i /backups/myBackup.bak -U sa -P myPassword1! -x
And finally, I get this error:
Sqlcmd: Error: Syntax error at line 56048 near command 'GO' in file '/backups/myBackup.bak'.
Final Answer
My final solution mainly came from using -Q and running a RESTORE query rather than importing the file with -i, but I also needed to include some MOVE options, as the logical files were pointing at Windows file paths.
/opt/mssql-tools/bin/sqlcmd -U SA -P myPassword -Q "RESTORE DATABASE MyDatabase FROM DISK = N'/path/to/my/file.bak' WITH MOVE 'mydatabase' TO '/var/opt/mssql/mydatabase.mdf', MOVE 'mydatabase_log' TO '/var/opt/mssql/mydatabase.ldf', REPLACE"
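If you don't know the logical file names to use in the MOVE clauses, you can list them from the backup first with RESTORE FILELISTONLY (paths as in the example above):
/opt/mssql-tools/bin/sqlcmd -S localhost -U SA -P myPassword \
  -Q "RESTORE FILELISTONLY FROM DISK = N'/path/to/my/file.bak'"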
You should use the RESTORE DATABASE command to interact with your backup file instead of specifying it as an input file of commands to the database:
/opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P myPassword1! -Q "RESTORE DATABASE MyDatabase FROM DISK='/backups/myBackup.bak'"
According to the sqlcmd Docs, the -i flag you used specifies:
The file that contains a batch of SQL statements or stored procedures.
That flag likely won't work properly if given a database backup file as an argument.

Docker: Run commands from multiple containers

I want to execute a command that uses commands from multiple containers.
E.g., I want to execute a backup script that uses the psql and pg_dump commands.
docker exec db_backup pg_dump
failed to exec: exec: "pg_dump": executable file not found in $PATH
docker run has an option --link. Is there a similar option for exec?
To clear this up, there are 3 containers:
my_app
db
db_backup
I want to use pg commands located in db from my db_backup scripts.
There is no --link option for docker exec. If you want to back up using a special script:
Create a new image db_backup starting from the postgresql one (the one that the db container uses), adding the backup script to some folder.
Do docker run --volumes-from db db_backup your_backup_script.sh.
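A rough sketch of that approach (the postgres base-image tag and the script's install path are assumptions for illustration):
# Dockerfile for the db_backup image; base-image tag is an assumption
FROM postgres:15
COPY your_backup_script.sh /usr/local/bin/your_backup_script.sh
RUN chmod +x /usr/local/bin/your_backup_script.sh
Build it and run it with the db container's volumes attached:
docker build -t db_backup .
docker run --rm --volumes-from db db_backup your_backup_script.sh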
1) Get a shell in the db container using sudo docker exec -ti db /bin/bash
2) Type which pg_dump, or locate pg_dump if the first fails
3) Use the full path in your command: sudo docker exec db /full_path_to/pg_dump
Step 3 runs pg_dump inside your db container.
Note: on my Fedora box, pg_dump points to /usr/bin/pg_dump.
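Putting the three steps together (the user, database, and output file names here are placeholders):
sudo docker exec db /usr/bin/pg_dump -U postgres mydb > mydb_backup.sql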

How to execute a generated script (.sql file) with schema and data in SQL Server 2008

Using SSMS 2008, I was able to generate a script for a database with a huge amount of data into the file ABC.sql on my desktop.
The database has approx. 9 GB of data, so I'm unable to open the file. Is there any way to execute the script?
When I try to open it in SSMS I get an error:
The operation could not be completed. not enough storage is available to complete this operation
The template specified cannot be found. Please check that the full path is correct
SQL Server offers two command-prompt utilities that can be used for executing large scripts: osql (deprecated; it will be removed in a future version) and sqlcmd.
osql is located in the Tools\Binn subfolder. To execute a SQL script:
Start the Command Prompt
Navigate to the folder where the osql utility is located
Run the command in the following format:
osql -H <workstation name> -S <server_name[\instance_name]> -U <user login ID> -P <login password> -i <full path to script>
To execute the large.sql file located in D:\test against the Central database on the SQL Server instance Dell\SQL2012, as sa with the password 'sqladmin', run the following command:
osql -H Dell -S Dell\SQL2012 -i D:\test\large.sql -U sa -P sqladmin
The sqlcmd command line utility is also located in the SQL Server’s Tools\Binn sub-directory. To execute a SQL script:
Start the Command Prompt
Navigate to the folder where the sqlcmd utility is located
Run a command in the following format:
sqlcmd -S <server name> -d <database name> -i <full path to script> -U <user login ID> -P <login password>
To execute the same as above, run the following command:
sqlcmd -S Dell\SQL2012 -d Central -i D:\test\large.sql -U sa -P sqladmin
See the Microsoft docs: "Start the sqlcmd Utility" and "Run Transact-SQL Script Files Using sqlcmd".
I use sqlcmd to execute large SQL files.
You can generate a script of your database by right-clicking it and choosing Tasks > Generate Scripts, then clicking Next in the Generate and Script window. Check "Select specific tables", choose the tables you want, and press Next. Click "Advanced" at the end of the General category, and under "Types of data to script" choose the kind of script you want:
Schema only: the script will create your database objects.
Data only: if you have already created the database and tables, the script will insert the data into them.
Press OK, then Next.
By default, the file is saved in C:\Users\[UserName]\Documents\.
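Once generated, the script can be run with sqlcmd as shown earlier; for example (server name and login reused from the example above, database name is a placeholder):
sqlcmd -S Dell\SQL2012 -d MyDatabase -i "C:\Users\[UserName]\Documents\ABC.sql" -U sa -P sqladmin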