I've only used MySQL before. Postgres is a little different for me. I'm trying to use Postgres.app for OS X. I have a database dump from our development server, and I want to create the correct user roles and import the database to my local machine so I can do development at home (I can't access the database remotely).
I think I've created the user. \du shows the appropriate user with the CreateDB permission. Then I used \i ~/dump.sql which seems to have imported the database. However when I use \l to list databases, it doesn't show up. So then I tried logging in with psql -U username, but then it tells me "FATAL: database username does not exist." Is that not the right switch for login? It's what the help said.
I'm getting frustrated with something simple so I appreciate any help. With the Postgres.app, how can I create the necessary users with passwords and import the database? Thanks for any help!
It sounds like you probably loaded the dump into the database you were connected to. If you didn't specify a database when you started psql it'll be the database named after your username. It depends a bit on the options used with pg_dump when the dump file was created though.
Try:
psql -v ON_ERROR_STOP=1
CREATE DATABASE mynewdb TEMPLATE template0 OWNER whateverowneruser;
\c mynewdb
\i /path/to/dump/file.sql
Personally, I recommend always using pg_dump -Fc to create custom-format dumps instead of SQL dumps. They're a lot easier to work with and pg_restore is a lot nicer than using psql for restoring dumps.
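For illustration, a minimal sketch of that workflow (mydb, mynewdb, and someuser are placeholder names):
pg_dump -Fc -U someuser mydb > mydb.dump
pg_restore -U someuser -d mynewdb --no-owner mydb.dump
pg_restore can also list a custom-format dump's contents with -l and restore objects selectively, which is a big part of why it's nicer to work with.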
Mac users: if you are here in 2023, are on macOS, and have created a database dump using the pg_dump utility
pg_dump -U <USER_NAME> -h <DATABASE_HOST> <DB_NAME> > sample.sql
Then, in order to restore it, use the commands below.
First, create the database manually using the command line/terminal
psql -U <USER_NAME> -h <DATABASE_HOST>
Once connected, create the database using the command:
create database test;
\q
Now restore the dump using the below command.
psql -U <USER_NAME> -d <DATABASE_NAME> -h <DATABASE_HOST> < sample.sql
For localhost use 127.0.0.1 as the database host.
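As an alternative to connecting first and running CREATE DATABASE, the createdb utility that ships with Postgres can do it in one step (a sketch using the same placeholders as above):
createdb -U <USER_NAME> -h 127.0.0.1 test
psql -U <USER_NAME> -d test -h 127.0.0.1 < sample.sql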
-U <username> is correct for specifying the username with psql, but since you gave no database name, psql defaults the database name to the user name, which is why it says "database username does not exist". Add the database name after the options, i.e. psql -U username database.
See the psql doc for more info regarding the myriad switches psql supports.
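For illustration, any of these forms works (username and mydb here are placeholder names):
psql -U username mydb
psql -U username -d mydb
psql also accepts a connection URI, e.g. psql "postgresql://username@localhost:5432/mydb".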
According to this page: https://learn.microsoft.com/en-us/azure/mysql/single-server/concepts-migrate-dump-restore
You can:
Copy the backup files to an Azure blob/store and perform the restore from there, which should be a lot faster than performing the restore across the Internet.
However, there is no information on how to actually achieve this.
I created an Azure storage account and uploaded a large .sql file, but I'm not sure how I would go about importing this using mysql.
To achieve your scenario, please try the following.
To restore a MySQL database to a MySQL flexible server, you can run the command below, from this MsDoc:
mysql -h [hostname] -u [uname] -p[pass] [db_to_restore] < [backupfile.sql]
For a MySQL flexible server:
$ mysql -h testserver.mysql.database.azure.com -u admin -p testdb < testdb_backup.sql
To create a backup file using mysqldump you can try the below command from this MsDoc:
$ mysqldump --opt -u [uname] -p[pass] [dbname] > [backupfile.sql]
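The Blob Storage part of the question isn't covered by those commands; one way to pull the dump down from Blob Storage before restoring is the azcopy CLI (a sketch; the account and container names are placeholders, and the URL is assumed to include a valid SAS token):
azcopy copy "https://<account>.blob.core.windows.net/<container>/backupfile.sql?<SAS>" backupfile.sql
Then run the mysql restore command above against the downloaded file.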
For more detail, please refer to the links below:
Backup Azure Database for MySQL to a Blob Storage by Bashar Hussein
Backup and restore SQL Server to Azure Blob storage by Bijay Kumar Sahoo
Could somebody help with PostgreSQL? I'm trying to access an already existing database, Employees, but it says that I do not have this database. How do I fix this problem? Here are the screenshots.
Also, here is where all my data from the local PostgreSQL server is located.
And here is the database that I'm trying to access through the command line.
You should use "-d" to specify the database name.
Try this:
psql -U your_username -d employees
I have two MySQL databases with identical schemas but different data running on Amazon AWS. I would like to move those databases to my local machine. They are not too big; each is less than 1 GB. I read about mysqldump, but it seemed too complicated and I could not find easy-to-follow instructions.
First, I tried using the MySQL Workbench migration tool but couldn't connect to the source.
Second, I tried connecting to the databases from Workbench directly, but that failed too.
Third, I tried to move the data table by table, but when I export a table to a .csv file and open it, the table formatting is lost.
How can I combine those databases and move them to my local computer efficiently?
Go to your SSH shell (terminal) and run:
mysqldump -u root -p --all-databases > exported.sql
Now move the dump to the target system (your local computer) and do:
mysql -u root -p < exported.sql
Do this for each source DB and you're done.
PS: replace root with your DB admin username if needed.
UPDATE:
You can do this on the fly from source to destination in one line:
mysqldump -h source_hostname_or_ip -u root --password='password' --extended-insert --databases DatabaseName | mysql -u root --password='password' --host=destination_host -C DatabaseName
Why are you not able to connect using Workbench? Fill in your SSH IP (port 22 is the default, so it isn't needed), select the SSH key file (in text format, not .ppk), and fill in your RDS instance endpoint and credentials.
Then TEST CONNECTION...
If successful, you can use the EXPORT option, select your DB, and proceed!
My Qt program is connected to a SQL Server instance that contains many databases. How can I back up a specific database, with all its tables and existing entries, to my local drive? I am trying to make a button that backs up the database I want to a file, and also to be able to restore it. If there is no way to do this, is there a 3rd-party tool I could use at least?
I am giving this a try using QProcess
SqlCmd -S myserver.com -U username -P password -Q "BACKUP DATABASE [Name_of_Database] TO DISK='C:/Backup/[Name_of_Database].bak'"
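For the restore side, I assume the mirror-image command would work (same placeholders; an untested sketch):
SqlCmd -S myserver.com -U username -P password -Q "RESTORE DATABASE [Name_of_Database] FROM DISK='C:/Backup/[Name_of_Database].bak'"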
I want to transfer data from one database to another in SQL Server 2005.
I tried DTS, but it's not working.
Need more information, but if you want to just copy a database, you can back it up, then restore that backup in another database. If you just want to copy individual tables then DTS is your friend. How is it "not working" for you?
select *
into SecondDatabase.dbo.TableName
from FirstDatabase.dbo.TableName
If you want something else, you have to be more specific.
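One caveat: select ... into creates the target table and fails if it already exists. For an existing table, the equivalent is an insert ... select:
insert into SecondDatabase.dbo.TableName
select *
from FirstDatabase.dbo.TableName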
If you're moving a few tables once off then the simplest way is to use the BCP command line utility.
bcp db_name.schema_name.table_name out table_name.dat -c -t, -S source_server -T
bcp db_name.schema_name.table_name in table_name.dat -c -t, -S destination_server -T
Change '-T' to '-U your_username -P your_password' if you're not using trusted connections.
If you're moving data regularly between servers on a LAN then consider using linked servers. http://msdn.microsoft.com/en-us/library/ff772782.aspx
Linked server performance over WANs is often poor, in my experience. Consider doing a BCP out, a secure file transfer to the destination server, then a BCP in if the servers aren't on the same LAN.