My Qt program is connected to a SQL Server instance that contains many databases. How can I back up a specific database, with all its tables and existing entries, to my local drive? I am trying to make a button that backs up the database I want to a file, and I also want to be able to restore it. If there is no way to do this, is there a 3rd-party tool I could use at least?
I am giving this a try using QProcess
SqlCmd -S myserver.com -U username -P password -Q "BACKUP DATABASE [Name_of_Database] TO DISK='C:\Backup\[Name_of_Database].bak'"
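For the QProcess side, here is a minimal sketch of how that call could look; the server name, credentials, and paths are the placeholders from the command above, and it assumes SqlCmd is on the PATH:

#include <QProcess>
#include <QString>
#include <QStringList>

// Hedged sketch: run SqlCmd through QProcess to back up one database.
// Server, credentials, and paths are placeholders, not a tested setup.
bool backupDatabase(const QString &database, const QString &backupFile)
{
    QStringList args;
    args << "-S" << "myserver.com"
         << "-U" << "username"
         << "-P" << "password"
         << "-Q" << QString("BACKUP DATABASE [%1] TO DISK='%2'")
                      .arg(database, backupFile);

    QProcess proc;
    proc.start("SqlCmd", args);   // arguments are passed separately, so no manual quoting
    if (!proc.waitForFinished(-1))
        return false;
    return proc.exitCode() == 0;
}

Restoring works the same way with a RESTORE DATABASE [Name_of_Database] FROM DISK='...' query. One caveat: BACKUP DATABASE writes the .bak on the SQL Server machine's own file system, so the TO DISK path must exist on the server (or be a UNC share the server's service account can write to), not on the client running your Qt program.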
Could somebody help with PostgreSQL? I'm trying to access an already existing database, Employees, but it says that I do not have this database. How can I fix this problem? Here are the screenshots.
Also, here is where all the data from my local PostgreSQL server is located.
And here is the database that I'm trying to access through the command line.
You should use "-d" to specify the database name.
Try this:
psql -U your_username -d employees
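If psql still reports that the database does not exist, it can help to list the databases the server actually has and check the exact name (this listing step is my suggestion, not part of the original answer):
psql -U your_username -l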
I have two MySQL databases with identical structure but different data running on Amazon AWS. I would like to move those databases to my local machine. They are not too big: less than 1 GB. I read about mysqldump, but it seemed too complicated and I could not find easy instructions to follow.
First, I tried using the MySQL Workbench migration tool, but I can't connect to the source.
Second, I tried connecting to the databases from Workbench, but that failed too.
Third, I tried moving the data table by table, but when I export a table to a .csv file and try to open it, the table formatting is lost.
How can I combine those databases and move them to my local computer efficiently?
Go to your SSH shell (terminal):
mysqldump -u root -p --all-databases > exported.sql
Now move the dump to the target system (your local computer) and run:
mysql -u root -p < exported.sql
Do this for each DB source and you're done.
PS: replace root with your DB admin username if needed.
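As a side note (my addition, not the original answerer's): if you only want specific databases rather than everything on the server, mysqldump also accepts an explicit list. The database names below are placeholders:
mysqldump -u root -p --databases db1 db2 > exported.sql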
UPDATE:
You can do this on the fly from source to destination in one line:
mysqldump -h source_hostname_or_ip -u root --password='password' --extended-insert --databases DatabaseName | mysql -u root --password='password' --host=destination_host -C DatabaseName
Why are you not able to connect using Workbench? Fill in your SSH IP (port 22 is not needed), select the SSH key file (in text format, not .ppk), and fill in your RDS instance endpoint and credentials.
Then TEST CONNECTION...
If successful, you can use the EXPORT option, select your DB, and proceed!
I want to back up a Firebird database.
I am using the gbak.exe utility. It works fine.
But when I want to do a backup from a remote computer, the backup file is stored on the server's file system.
Is there a way to force the gbak utility to download the backup file?
Thanks
Backup is stored on the Firebird Server
gbak -b -service remote_fb_ip:service_mgr absolute_path_to_db_file absolute_path_to_backupfile -user SYSDBA -pass masterkey
Backup is stored on the local machine
gbak -b remote_fb_ip:absolute_path_to_db_file path_to_local_file -user SYSDBA -pass masterkey
See: remote server local backup and the gbak documentation
It is always a problem to grab a remote database onto a different remote computer. For this purpose, our institute uses Handy Backup (for Firebird-based apps, too), but if you prefer GBAK, here are some more ways to do it.
The simplest method is to call the remote database directly from a local machine using GBAK (I see it was already described before me). Another method is installing GBAK on the remote machine using the administrative instruments for Windows networks. This can be tricky, as mixed-architecture networks (with domain and non-domain sections) always present some obstacles.
Alternatively, you can write a backup script (batch file) that calls GBAK and then copies the resulting Firebird backup file to some other network destination, using a command-line network file manager or an FTP client like FileZilla. It requires some (minimal) skill and research, but after successful testing it can be reused many times.
Best regards!
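A minimal sketch of such a batch file (my illustration; every path, credential, and share name is a placeholder):
@echo off
rem Back up the database locally with gbak, then copy the file to a network share.
gbak -b -user SYSDBA -pass masterkey C:\data\mydb.fdb C:\backups\mydb.fbk
copy C:\backups\mydb.fbk \\backupserver\share\mydb.fbk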
If you have gbak locally, you can back up over a network. Simply specify the host name before the database.
For example:
gbak -B 192.168.0.10:mydatabase mylocalfile.fbk -user SYSDBA -password masterkey
Try this command:
"C:\Program Files (x86)\Firebird\Firebird_2_5\bin\gbak" -v -t -user SYSDBA -password "masterkey" 192.168.201.10:/database/MyDatabase.fdb E:\Backup\BackupDatabase.fbk
Of course you need to update your paths accordingly :)
I believe you should be able to do this if you use the service manager for the backup, and specify stdout as the backup file. In that case the file should be streamed to the gbak client and you can write it to disk with a redirect.
gbak -backup -service hostname:service_mgr employee stdout > backupfile.fbk
However I am not 100% sure if this actually works, as the gbak documentation doesn't mention this. I will check this and amend my answer later this week.
I've only used MySQL before. Postgres is a little different for me. I'm trying to use the Postgres.app for OSX. I have a database dump from our development server, and I want to create the correct user roles and import the database to my local machine so I can do development at home (can't access the database remotely).
I think I've created the user. \du shows the appropriate user with the CreateDB permission. Then I used \i ~/dump.sql which seems to have imported the database. However when I use \l to list databases, it doesn't show up. So then I tried logging in with psql -U username, but then it tells me "FATAL: database username does not exist." Is that not the right switch for login? It's what the help said.
I'm getting frustrated with something simple so I appreciate any help. With the Postgres.app, how can I create the necessary users with passwords and import the database? Thanks for any help!
It sounds like you probably loaded the dump into the database you were connected to. If you didn't specify a database when you started psql it'll be the database named after your username. It depends a bit on the options used with pg_dump when the dump file was created though.
Try:
psql -v ON_ERROR_STOP=1
CREATE DATABASE mynewdb TEMPLATE template0 OWNER whateverowneruser;
\c mynewdb
\i /path/to/dump/file.sql
Personally, I recommend always using pg_dump -Fc to create custom-format dumps instead of SQL dumps. They're a lot easier to work with and pg_restore is a lot nicer than using psql for restoring dumps.
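For example (my sketch, reusing the placeholder names from above; not part of the original answer):
pg_dump -Fc -U whateverowneruser -f mynewdb.dump mynewdb
pg_restore -U whateverowneruser -d mynewdb mynewdb.dump
With -Fc dumps, pg_restore can also pick out individual tables or schemas with its -t and -n switches, which plain SQL dumps can't do.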
Mac users: if you are here in 2023, are on macOS, and have created a database dump using the pg_dump utility:
pg_dump -U <USER_NAME> -h <DATABASE_HOST> <DB_NAME> > sample.sql
Then, to restore it, use the commands below.
First, create the database manually. Connect using the command line/terminal:
psql -U <USER_NAME> -h <DATABASE_HOST>
Once connected, create the database using the command:
create database test;
\q
Now restore the dump using the below command:
psql -U <USER_NAME> -d <DATABASE_NAME> -h <DATABASE_HOST> < sample.sql
For localhost, use 127.0.0.1 as the database host.
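As a side note (my addition): the createdb wrapper that ships with PostgreSQL can replace the interactive create-database step with a single command:
createdb -U <USER_NAME> -h <DATABASE_HOST> test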
-U <username> is correct for specifying the username with psql, but no database name was given, and psql defaults the database to the username, so it tried to connect to a database called username. Add the database name after, i.e. psql -U username database.
See the psql doc for more info regarding the myriad switches psql supports.
Here I want to transfer data from one database to another database in SQL Server 2005.
I tried DTS, but it's not working.
Need more information, but if you want to just copy a database, you can back it up, then restore that backup in another database. If you just want to copy individual tables then DTS is your friend. How is it "not working" for you?
select *
into SecondDatabase.dbo.TableName
from FirstDatabase.dbo.TableName
If you want something else, you have to be more specific.
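Note that select ... into creates the destination table from scratch and fails if it already exists; when the target table is already there, the usual alternative (my addition, same placeholder names) is:
insert into SecondDatabase.dbo.TableName
select *
from FirstDatabase.dbo.TableName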
If you're moving a few tables as a one-off, then the simplest way is to use the BCP command-line utility.
bcp db_name.schema_name.table_name out table_name.dat -c -t, -S source_server -T
bcp db_name.schema_name.table_name in table_name.dat -c -t, -S destination_server -T
Change '-T' to '-U your_username -P your_password' if you're not using trusted connections.
If you're moving data regularly between servers on a LAN, then consider using linked servers. http://msdn.microsoft.com/en-us/library/ff772782.aspx
Linked server performance over WANs is often poor, in my experience. Consider doing a BCP out, a secure file transfer to the destination server, and then a BCP in if the servers aren't on the same LAN.
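For completeness, a linked-server copy can be written as a four-part name query (a sketch; the linked server and object names are placeholders I made up):
select *
into dbo.TableName
from [LinkedServerName].SourceDatabase.dbo.TableName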