I'm trying to dump a database I created in pgAdmin, because I need a .sql file to send to my professor for a university project.
Whenever I try to do a Backup, it stops immediately at this command:
/Library/PostgreSQL/13/bin/pg_dump --file "/Users/fstxfreestyler/CinemaDB" --host "localhost"
--port "5433" --username "postgres" --no-password
--verbose --format=p --create --clean --inserts
--column-inserts --encoding "UTF8" "cinema"
I'm using pgAdmin 4.28 on macOS 10.13.3.
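If it matters, the equivalent command run by hand in Terminal would be something like this (same flags pgAdmin shows, so any error prints straight to the console):
/Library/PostgreSQL/13/bin/pg_dump --host localhost --port 5433 --username postgres \
    --verbose --format=p --create --clean --inserts --column-inserts \
    --encoding UTF8 --file "/Users/fstxfreestyler/CinemaDB" cinema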
Any help?
I am trying to restore a database with a sql file from a dockerized postgresql using this command:
cat pathfile.sql | docker exec -i dbcontainer psql -U user
But when I run this command the console doesn't do anything; it doesn't even throw an error.
I have checked the database and nothing was created.
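One thing worth noting: without -d, psql connects to a database named after the user, so a variant with an explicit target database ('mydb' here is a placeholder for the real name) would be:
cat pathfile.sql | docker exec -i dbcontainer psql -U user -d mydb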
I'm not sure if this is possible or if I'm doing something wrong, since I'm still pretty new to Docker. Basically, I want to export a query result from inside a PostgreSQL Docker container as a csv file on my local machine.
This is where I got so far. Firstly, I run my PostgreSQL docker container with this command:
sudo docker run --rm --name pg-docker -e POSTGRES_PASSWORD=something -d -p 5432:5432 -v $HOME/docker/volumes/postgres:/var/lib/postgresql/data postgres
Then I access the Docker container with docker exec and run a PostgreSQL command that should copy the query result to a csv file at a specified location, like this:
\copy (select id,value from test) to 'test_1.csv' with csv;
I thought that would export the query result as a csv file named test_1.csv on my local machine, but I couldn't find the file anywhere. I also checked both of these directories: $HOME/docker/volumes/postgres and /var/lib/postgresql/data.
You can export the data to STDOUT and pipe the result to a file on the client machine:
docker exec -it -u database_user_name container_name \
psql -d database_name -c "COPY (SELECT * FROM table) TO STDOUT CSV" > output.csv
-c tells psql to execute the given SQL statement as soon as the connection is established.
So your command should look like this:
docker exec -it -u postgres pgdocker \
psql -d yourdb -c "COPY (SELECT * FROM test) TO STDOUT CSV" > test_1.csv
The /var/lib/postgresql/data directory is where the database server stores its data files. It isn't a directory that users need to manipulate directly, and nothing human-readable can be found there.
Paths like test_1.csv are relative to the working directory. The default directory when you enter the postgres container with docker exec is /, so that's where your file should be. You can also switch to another directory with cd before running psql:
root@b9e5a0572207:/# cd /some/other/path/
root@b9e5a0572207:/some/other/path# psql -U postgres
... or you can provide an absolute path:
\copy (select id,value from test) to '/some/other/path/test_1.csv' with csv;
You can use docker cp to transfer a file from the container to the host:
docker cp pg-docker:/some/other/path/test_1.csv /tmp
... or you can create a volume if this is something you do often.
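For example, a sketch of the volume approach, assuming a host directory $HOME/exports (the /exports mount point inside the container is arbitrary):
docker run --rm --name pg-docker -e POSTGRES_PASSWORD=something -d -p 5432:5432 \
    -v $HOME/docker/volumes/postgres:/var/lib/postgresql/data \
    -v $HOME/exports:/exports postgres
Then a \copy to '/exports/test_1.csv' inside psql lands directly in $HOME/exports on the host.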
I've followed the directions from the AWS documentation on importing / exporting a database from RDS using their stored procedures.
The command was similar to:
exec msdb.dbo.rds_backup_database
    @source_db_name='MyDatabase',
    @s3_arn_to_backup_to='my-bucket/myBackup.bak'
This part works fine, and I've done it plenty of times in the past.
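For what it's worth, the same AWS docs also provide a status procedure, which can be used to confirm the backup task has completed before downloading the file:
exec msdb.dbo.rds_task_status @db_name='MyDatabase'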
What I want to achieve now is restoring this backup to a local SQL Server instance, but I'm struggling at this point. I'm assuming this isn't a "normal" SQL Server dump, but I'm unsure what the difference is.
I've spun up a new SQL Server for Linux Docker instance, which seems all set. I made a few changes so that the sqlcmd tool is installed, so technically the image I'm running comes from this Dockerfile, not much different:
FROM microsoft/mssql-server-linux:2017-latest
RUN apt-get update && \
    apt-get install -y curl && \
    curl https://packages.microsoft.com/keys/microsoft.asc | apt-key add - && \
    apt-get update && \
    apt-get install -y mssql-tools unixodbc-dev
This image works fine; I'm building it via docker build -t sql . and running it via docker run -e 'ACCEPT_EULA=Y' -e 'SA_PASSWORD=myPassword1!' -p 1433:1433 -v $(pwd):/backups sql
Within my local folder, I have my backup from RDS downloaded, so this file is now in /backups/myBackup.bak
I now try to run sqlcmd to import the data with the following command, and I run into an issue that makes me assume this isn't a traditional SQL dump. I'm not sure what a traditional SQL dump looks like, but the majority of the file looks garbled with ^#^#^#^# and other binary noise.
/opt/mssql-tools/bin/sqlcmd -S localhost -i /backups/myBackup.bak -U sa -P myPassword1! -x
And finally; I get this error:
Sqlcmd: Error: Syntax error at line 56048 near command 'GO' in file '/backups/myBackup.bak'.
Final Answer
My final solution mainly came from using -Q to run a RESTORE query rather than passing the file to -i, but I also needed to include some MOVE options because the logical files were pointing at Windows file paths.
/opt/mssql-tools/bin/sqlcmd -U SA -P myPassword -Q "RESTORE DATABASE MyDatabase FROM DISK = N'/path/to/my/file.bak' WITH MOVE 'mydatabase' TO '/var/opt/mssql/mydatabase.mdf', MOVE 'mydatabase_log' TO '/var/opt/mssql/mydatabase.ldf', REPLACE"
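In case it helps: the logical names for the MOVE clauses ('mydatabase' and 'mydatabase_log' in my case) can be read out of the backup itself with RESTORE FILELISTONLY, e.g.:
/opt/mssql-tools/bin/sqlcmd -U SA -P myPassword -Q "RESTORE FILELISTONLY FROM DISK = N'/path/to/my/file.bak'"
The LogicalName column in the output is what MOVE expects.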
You should use the RESTORE DATABASE command to interact with your backup file instead of specifying it as an input file of commands to the database:
/opt/mssql-tools/bin/sqlcmd -S localhost -U sa -P myPassword1! -Q "RESTORE DATABASE MyDatabase FROM DISK='/backups/myBackup.bak'"
According to the sqlcmd Docs, the -i flag you used specifies:
The file that contains a batch of SQL statements or stored procedures.
That flag likely won't work properly if given a database backup file as an argument.
Using SSMS 2008 I am able to generate a script for a database with huge amounts of data into a file ABC.sql on my desktop.
The database has approx. 9 GB of data, so I'm unable to open the file. Is there any way to execute the script?
When I try to open it in SSMS I get an error:
The operation could not be completed. not enough storage is available to complete this operation
The template specified cannot be found. Please check that the full path is correct
SQL Server offers two command-line utilities that can be used for executing large scripts: osql (which will be removed in a future version) and sqlcmd.
osql is located in the Tools\Binn subfolder. To execute a SQL script:
Start the Command Prompt
Navigate to the folder where the osql utility is located
Run the command in the following format:
osql -H <workstation name> -S <server_name[\instance_name]> -U <user login ID> -P <login password> -i <full path to script>
To execute the large.sql file located in D:\test against the Central database on the SQL Server instance Dell\SQL2012, as sa with the 'sqladmin' password, run the following command:
osql -H Dell -S Dell\SQL2012 -i D:\test\large.sql -U sa -P sqladmin
The sqlcmd command line utility is also located in the SQL Server’s Tools\Binn sub-directory. To execute a SQL script:
Start the Command Prompt
Navigate to the folder where the sqlcmd utility is located
Run a command in the following format:
sqlcmd -S <server name> -d <database name> -i <full path to script> -U <user login ID> -P <login password>
To execute the same script as above, run the following command:
sqlcmd -S Dell\SQL2012 -d Central -i D:\test\large.sql -U sa -P sqladmin
Start the sqlcmd Utility
Run Transact-SQL Script Files Using sqlcmd
I use sqlcmd to execute large SQL files.
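For example, something like this, assuming a local default instance and Windows authentication (-E); the script path is just a placeholder:
sqlcmd -S localhost -d MyDatabase -E -i C:\scripts\ABC.sql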
You can generate a script of your database by right-clicking your database > Tasks > Generate Scripts, then clicking Next in the Generate and Publish Scripts window. Check 'Select specific database objects', choose the tables you want, and press Next. Click 'Advanced' at the end of the General category and set 'Types of data to script' to the kind you want:
Schema only: the script will create your database.
Data only: if you have already created the database and tables, this will insert the data into them.
Press OK, then Next.
Your file is by default saved in
C:\Users\[UserName]\Documents\
I have downloaded the database from an existing Heroku app onto my local machine. The data I downloaded is in a single file; let's call it herokuapp.db. I also have Postgres installed and I can successfully create a new Postgres database and have my Rails app reference that. What I want to do is move the data I downloaded from Heroku into this database or create a new Postgres database using the downloaded data.
Use pg_restore --verbose --clean --no-acl --no-owner -d [database_name] [herokuapp.db]
See https://devcenter.heroku.com/articles/export-from-heroku-postgres for more information.
I just solved the same problem with a technique similar to what Will suggested (thanks Will!).
$ heroku pgbackups:capture
$ curl -o latest.dump `heroku pgbackups:url`
$ pg_restore --verbose --clean --no-acl --no-owner -d [database_name] latest.dump
The database_name can be found in the development section of the database.yml file.
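For reference, the relevant section of config/database.yml in a stock Rails app looks roughly like this (names vary per project):
development:
  adapter: postgresql
  database: myapp_development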
EDIT
I recently performed this task again with Postgres.app and the last command above did not work. I was getting this error message:
pg_restore: connecting to database for restore
pg_restore: [archiver (db)] connection to database "[database_name]" failed: could not connect to server: No such file or directory
Is the server running locally and accepting
connections on Unix domain socket "/var/pgsql_socket/.s.PGSQL.5432"?
pg_restore: *** aborted because of error
Here is the updated command that worked:
$ pg_restore --verbose --clean --no-acl --no-owner -h localhost -U $USER -d [database_name] latest.dump
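One caveat from my setup: pg_restore -d expects the target database to already exist, so if it doesn't, create it first with createdb (the -h and -U flags mirror the restore command):
$ createdb -h localhost -U $USER [database_name]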