While taking a table backup using the mysqldump command, can we skip a particular column? - sql

I need to know whether there is any option to skip a particular column and back up the rest of the table using the mysqldump command.
If yes, please let me know.

I wanted to move a table from one host to another, but only include some of the columns and replace others with dummy data (like password columns). So I made a shell script that makes it possible to run a SELECT query and get INSERT statements as the result.
You can find the script here: https://gist.github.com/1239299
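For reference, here is a minimal sketch of the same idea, assuming a hypothetical mydb.users table with id, username, email and password columns (the gist above is more complete; real data would also need proper SQL escaping):

# Run a SELECT on the source host, then emit one INSERT per row with the
# password column replaced by a dummy value. -N skips the header row and
# -B produces tab-separated batch output. All names here are hypothetical.
mysql -h source_host -u backup_user -p -N -B \
  -e "SELECT id, username, email FROM mydb.users" |
while IFS=$'\t' read -r id username email; do
  printf "INSERT INTO users (id, username, email, password) VALUES (%s, '%s', '%s', 'dummy');\n" \
    "$id" "$username" "$email"
done > users_inserts.sql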

Related

PostgreSQL: Execute queries in loop - performance issues

I need to copy data from a file into a PostgreSQL database. For that purpose I parse the file in a bash loop and generate the corresponding INSERT queries. The trouble is that this loop takes a long time to run.
1) What can I do to speed up the loop? Should I open a connection before the loop and close it after?
2) Should I write the unique values to a temporary text file inside the loop and search it with a text utility, instead of writing them to the database and searching there?
Does whatever programming language you use commit after every insert? If so, the easiest thing you can do is commit after inserting all rows rather than after every row.
You might also be able to batch inserts, but using the PostgreSQL COPY command is less work and also very fast.
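For illustration, a minimal sketch in bash, assuming a hypothetical two-column table mytable and a CSV file data.csv:

# Wrap all generated INSERTs in a single transaction so psql is invoked
# once and commits once, instead of once per row.
{
  echo "BEGIN;"
  while IFS=',' read -r a b; do
    printf "INSERT INTO mytable (a, b) VALUES ('%s', '%s');\n" "$a" "$b"
  done < data.csv
  echo "COMMIT;"
} | psql -d mydb

# Or skip the INSERTs entirely and let COPY do the bulk load:
psql -d mydb -c "\copy mytable (a, b) FROM 'data.csv' WITH (FORMAT csv)"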
If you insist on using bash, you could split the file into chunks of a defined number of rows and then execute the loads in parallel by putting & at the end of each command, as sketched below.
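A rough sketch of that approach, assuming GNU split and a (hypothetical) inserts.sql with one complete statement per line:

# Split the generated SQL into 10000-line chunks and load them in parallel.
split -l 10000 inserts.sql chunk_
for f in chunk_*; do
  psql -d mydb -f "$f" &
done
wait   # block until all background loads finish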
I strongly suggest you try a different approach or programming language since, as Bill said, bash doesn't talk to Postgres directly. You can also use the pg_dump functionality if your file's source is another Postgres database.

Hive: create table and write it locally at the same time

Is it possible in hive to create a table and have it saved locally at the same time?
When I get data for my analyses, I usually create temporary tables to track down possible mistakes in the queries/scripts. Some of these are just temporary tables, while others contain the data that I actually need for my analyses.
What I usually do is run hive -e "select * from db.table" > filename.tsv to get the data locally; however, when the tables are big this can take quite some time.
I was wondering if there is some way in my script to create the table and save it locally at the same time. Probably this is not possible, but I thought it was worth asking.
Honestly, doing it the way you are is the better of the two options, but it is worth noting that you can perform a similar task in an .hql file for automation.
Using syntax like this:
INSERT OVERWRITE LOCAL DIRECTORY '/home/user/temp' select * from table;
You can run a query and store the result somewhere in the local directory (as long as there is enough space and the correct privileges).
A disadvantage to this is that with the pipe you get the data stored nicely as '|'-delimited and newline-separated, but this method will store the values with Hive's default field delimiter, Ctrl-A ('\001').
A workaround is to do something like this:
INSERT OVERWRITE LOCAL DIRECTORY '/home/user/temp'
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
select books from table;
But this works only in Hive 0.11 or higher.
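For instance, if the statement above is saved in a (hypothetical) export.hql, the whole export can be automated like this:

# Run the export script, then stitch the result files together locally.
hive -f export.hql
# Hive writes one or more files (000000_0, ...) into the target directory:
cat /home/user/temp/* > filename.csv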

Extract specific data from full mysqldump backup

I am making regular backups of my MySQL database with mysqldump. This gives me a .sql file with CREATE TABLE and INSERT statements, allowing me to restore my database on demand. However, I have yet to find a good way to extract specific data from this backup, e.g. extract all rows from a certain table matching certain conditions.
Thus, my current approach is to restore the entire file into a new temporary database, extract the data I actually want with a new mysqldump call, delete the temporary database and then import the extracted lines into my real database.
Is this really the best way to do this? Is there some sort of script that can directly parse the .sql file and extract the relevant lines? I don't think there is an easy solution with grep and friends unfortunately, as mysqldump generates INSERT statements that insert many values per line.
The solution to this just ended up being to import the whole file, extract the data I needed and drop the database again.
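A minimal sketch of that cycle, with hypothetical database/table names and filter condition:

# Restore the dump into a scratch database, pull out just the rows you
# want with a second mysqldump call, then drop the scratch database.
mysql -e "CREATE DATABASE temp_restore"
mysql temp_restore < backup.sql
mysqldump temp_restore mytable --no-create-info \
  --where="created_at > '2013-01-01'" > extracted.sql
mysql -e "DROP DATABASE temp_restore"
mysql mydb < extracted.sql   # import the extracted rows into the real database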

Purging an SQL table

I have an SQL table which is used for logging purposes (there are lakhs, i.e. hundreds of thousands, of records in the table). I need to purge the table: take a backup of the data and then clear the table data.
Is there a standard way of doing it that I can automate?
You can do this within SQL Server Management Studio, by:
right-clicking the database > Tasks > Generate Scripts
You can then select the table you wish to script out and also choose to include any associated objects, such as constraints and indexes.
For more insight, see this Stack Overflow question:
Table-level backup
As for your automation requirement:
You can use the bcp utility, which copies data between an instance of Microsoft SQL Server and a data file in a user-specified format.
Sample syntax to export:
bcp "select * from [MyDatabase].dbo.Customer" queryout "Customer.bcp" -N -S localhost -T -E
You can automate this export using any scheduling mechanism (cron on UNIX, SQL Server Agent, etc.).
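For example, a hypothetical crontab entry that runs the export nightly at 02:00 (the output path is assumed, and % must be escaped in crontab):

# m h dom mon dow  command
0 2 * * * bcp "select * from [MyDatabase].dbo.Customer" queryout "/backups/Customer_$(date +\%F).bcp" -N -S localhost -T -E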
Simply, we can create a job that runs once a month that:
--> backs up the data into another table, such as an archive table
--> then deletes the data from the main table
It's primitive partitioning, I guess; this way it is more flexible when you later need to select data you have deleted from the main table, i.e. it is now in the archive table where you backed it up.
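A minimal sketch of such a job's body, run here through sqlcmd with hypothetical table and column names:

# Copy rows older than 30 days into the archive table, then delete them
# from the main table, all in one transaction.
sqlcmd -S localhost -d MyDatabase -Q "
BEGIN TRANSACTION;
INSERT INTO LogArchive SELECT * FROM LogTable
  WHERE LogDate < DATEADD(day, -30, GETDATE());
DELETE FROM LogTable
  WHERE LogDate < DATEADD(day, -30, GETDATE());
COMMIT TRANSACTION;"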

Batch file to get sql backup scripts

Is there any way I can use batch files to take a backup of selected scripts from the SQL database?
Say I have one stored procedure, one function and one view in a folder:
sp1.sql
vie1.sql
fn1.sql
Before running the batch file I want to take a backup of these files.
Kindly note: I do not want to take an entire database backup, just the provided scripts alone.
Please help me achieve this.
The specific answer depends entirely on the flavor of your database engine. But the general answer is that you need to SELECT the definition from your database's data catalog (metadata). The function and procedure definitions will probably come out intact, but the view definition may come out as just the SELECT statement; you might have to prefix it with the CREATE VIEW XXXXXXX AS part.
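As a sketch, assuming SQL Server (the question doesn't name the engine): OBJECT_DEFINITION() returns each object's source text, and the same sqlcmd calls work from a Windows batch file as from the shell loop shown here.

# Save the current definition of each object before running the batch.
# Object names are the hypothetical ones from the question.
for obj in dbo.sp1 dbo.vie1 dbo.fn1; do
  sqlcmd -S localhost -d MyDatabase -h -1 -y 0 \
    -Q "SET NOCOUNT ON; SELECT OBJECT_DEFINITION(OBJECT_ID('$obj'));" \
    > "backup_${obj#dbo.}.sql"
done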