How to import data from a .sql file which has gzipped content on Windows - gzip

I used the command below to compress and back up my data on a Linux server.
mysqldump -u root --triggers --routines --all-databases | gzip > MyDBs.sql
Now I would like to restore all this data on my local machine, which runs Windows 7. When I tried importing the file with the MySQL GUI tools, it gave an error.
Can anyone please tell me how to do it?
Thanks in advance.

Solution from the comments, please accept:
Your file is not really an SQL file; it's a gzip archive with the wrong file
extension. Rename the file extension from .sql to .gz, decompress it, and
import the resulting .sql file into your server.
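For example, on the Windows 7 machine the restore could look roughly like this from a command prompt (a minimal sketch: it assumes the gzip and mysql command-line tools are on the PATH, e.g. from Cygwin and the MySQL client install; 7-Zip can handle the decompression step instead):
REM rename so the compression suffix is recognized
ren MyDBs.sql MyDBs.sql.gz
REM decompress back to a plain-text SQL dump
gzip -d MyDBs.sql.gz
REM load the dump into the local MySQL server (prompts for the root password)
mysql -u root -p < MyDBs.sql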

Related

Tried to import .trn file into APX server using .bat file

I want to import a .trn (transaction) file into an APX server (an Advent product) using a .bat (batch) file. I got the path while exporting a sample file from the server with the import/export utility, but now I don't want to import it manually; I want to do it with a batch file, and I have no idea how exactly that works.

Using azcopy to copy .bak files to Azure storage

I'm trying to copy my SQL Server backup (.bak) files to Azure Storage using AzCopy.
I'm using this command in a batch (.bat) file:
"C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy\AzCopy.exe" ^
/Source:"I:\" /Dest:https://hmlrservicesstorage.****************************/ ^
/DestKey:********************************************** /Pattern:"*.bak" /s /XO
On the I: drive there are two folders, one a normal folder and the other a network share.
When I run the .bat file from a cmd prompt it works fine, copying all .bak files to the storage. Yet when I run exactly the same .bat file through a SQL Server Agent job, it only copies the network share folder across and not the normal folder's contents.
I am copying from an Azure VM to an Azure storage account; SQL Server is installed on that VM.
I can't understand why, and would greatly appreciate any help on this.

Sync clients' files with server - Electron/node.js

My goal is to make an Electron application that synchronizes a client's folder with the server. To explain it more clearly:
If the client doesn't have the files present on the host server, the application downloads all of the files from the server to the client.
If the client has the files, but some files have been updated on the server, the application deletes ONLY the outdated files (leaving the unmodified ones) and downloads the updated files.
If a file has been removed from the host server but is present in the client's folder, the application deletes the file.
Simply, the application has to make sure that the client has an EXACT copy of the host server's folder.
So far I did this via wget -m; however, wget frequently did not recognize that some files had changed and left clients with outdated files.
Recently I've heard of zsync-windows and the webtorrent npm package, but I am not sure which approach is right and how to actually accomplish my goal. Thanks for any help.
rsync is a good approach, but you will need to access it via Node.js.
An npm package like this may help you:
https://github.com/mattijs/node-rsync
But things will get slightly more difficult on Windows systems:
How to get rsync command on windows?
If you have SSH access to the server, one approach could be using rsync through a Node.js package.
There's a good article here on how to implement this.
You can use rsync, which is widely used for backups and mirroring and as an improved copy command for everyday use. It offers a large number of options that control every aspect of its behaviour and permit very flexible specification of the set of files to be copied.
It is famous for its delta-transfer algorithm, which reduces the amount of data sent over the network by sending only the differences between the source files and the existing files in the destination.
For your use case:
If the client doesn't have the files present on the host server, the application downloads all of the files from the server to the client. This can be achieved with a plain rsync.
If the client has the files, but some files have been updated on the server, the application deletes ONLY the outdated files (leaving the unmodified ones) and downloads the updated files. Use --remove-source-files or --delete, depending on whether you want to delete the outdated files from the source or from the destination.
If a file has been removed from the host server but is present in the client's folder, the application deletes the file. Use the --delete option of rsync.
rsync -a --delete source destination
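For the use case in the question (a client pulling from a remote host reachable over SSH), a minimal sketch could be the following; the user, host and paths are placeholders:
rsync -az --delete -e ssh user@example-server:/srv/shared-folder/ /path/to/local/folder/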
Given that it's a folder listing (and therefore has simple filenames without spaces, etc.), you can pick out the filenames with the code below:
# Get last item from each line of FILELIST
awk '{print $NF}' FILELIST | sort >weblist
# Generate a list of your files
find . -type f -print | sort >mylist
# Compare results
comm -23 mylist weblist >diffs
# Remove old files
xargs -r echo rm -fv <diffs
You'll need to remove the echo to let rm actually delete the files.
Next time you want to update your mirror, you can modify the comm line (by swapping the two file arguments) to find the set of files you don't have, and feed those to wget.
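For example (a sketch; the mirror URL is just a placeholder):
# files the server has that we don't: note the swapped comm arguments
comm -23 weblist mylist >missing
# fetch only those files
xargs -r -I{} wget "https://mirror.abcd.org/xyz/xyz-folder/{}" <missing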
or, if the mirror also provides rsync access:
rsync -av --delete rsync://mirror.abcd.org/xyz/xyz-folder/ my-client-xyz-directory/

Phantomjs creates lots of dmp files

Can I stop PhantomJS from creating dump files when it crashes?
Currently, PhantomJS is creating several .dmp files in my /tmp folder on my server. I know that the /tmp folder is cleaned up after each reboot; however, this box never gets rebooted.
Thanks,
This answer worked for me: export an environment variable:
export PHANTOMJS_DISABLE_CRASH_DUMPS="on"
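The variable has to be set in the environment that actually launches PhantomJS. A minimal sketch, assuming PhantomJS is started from a wrapper shell script (the script path is just a placeholder):
export PHANTOMJS_DISABLE_CRASH_DUMPS="on"
phantomjs /path/to/your-script.js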

How to backup source repository and zip destination folders?

All sources are on a Windows machine, and the backup destination is on a Unix system (we are using Samba).
My source repository looks like this:
-Repository
--Folder1
---Files
--Folder2
---Files
etc...
I would like the destination to look like this:
-Repository
--Folder1.zip
--Folder2.zip
etc...
After the first backup, I only want to back up files that have changed since the last backup (or new folders/files that have been created).
Does anyone know of a tool or a script for my backup needs? Can this be done with Robocopy?
You may install Cygwin on your Windows machine and use a simple shell script, like the following one:
#!/bin/bash
# archive each top-level folder of the repository
for i in */ ; do
    tar czf "${i%/}.tar.gz" "$i"
done
# specify your destination here:
rsync *.tar.gz /tmp
rm *.tar.gz
BTW, that's not the most straightforward way, I suppose.
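The script above also re-archives every folder on every run. To cover the "only back up what changed since the last backup" part, a rough incremental variant could look like this (a sketch, assuming a hypothetical timestamp file named .last_backup kept in the repository root and GNU find, as shipped with Cygwin):
#!/bin/bash
STAMP=.last_backup
for dir in */ ; do
    name=${dir%/}
    # re-archive only folders that contain something newer than the last run
    if [ ! -e "$STAMP" ] || [ -n "$(find "$dir" -newer "$STAMP" -print -quit)" ]; then
        tar czf "$name.tar.gz" "$dir"
        rsync "$name.tar.gz" /tmp   # destination placeholder, as in the script above
        rm "$name.tar.gz"
    fi
done
touch "$STAMP"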
If you are open to forgetting about the zipping part, I would advise BackupPC on the Unix system and rsync (Win32 port) or a Samba configuration on the Windows system.
See https://help.ubuntu.com/community/BackupPC and http://backuppc.sourceforge.net/ for more info.