Hi, I have a requirement to create a database from which no data can be exported — not as CSV and not as a dump file.
If MySQL crashes, the data should simply be gone; no recovery should be possible.
It may look like a stupid idea to implement, but that is exactly what the client requires.
So can anyone tell me how to restrict the mysqldump client program and INTO OUTFILE statements for all users except root? The other users will have database-level privileges such as SELECT, INSERT, UPDATE, and DELETE, but no global-level privileges.
Can anyone help me with this?
I'm not sure exactly what you're looking for, but if you have SSH access to the server, I suggest using a filesystem backup or a tool like innobackupex instead of mysqldump.
For big data, mysqldump isn't a good solution.
You must restrict every new MySQL user's privileges (Select_priv, Lock_tables_priv, File_priv, Alter_priv, Create_tmp_table_priv, Execute_priv, Create_priv, and so on) so that the user can't do much of anything; without those privileges they can't even get an export via mysqldump. Manage this through the mysql.user table, or use a tool like Navicat.
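A sketch of that kind of restricted account (the database name, user name, and password here are placeholders, not from the question); note that without the global FILE privilege the account cannot run SELECT ... INTO OUTFILE, though as the other answer points out, nothing stops it from running plain SELECTs from its own client:

```shell
mysql -u root -p <<'EOF'
CREATE USER 'appuser'@'%' IDENTIFIED BY 'choose-a-strong-password';
-- Database-level privileges only; no global FILE privilege,
-- so SELECT ... INTO OUTFILE is denied for this account.
GRANT SELECT, INSERT, UPDATE, DELETE ON appdb.* TO 'appuser'@'%';
EOF
```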
You can't. From the perspective of the MySQL server, mysqldump is just another client that runs a lot of SELECT statements. What it happens to do with them (generate a dump file) is not something that the server can control.
For what it's worth, this sounds like an incredibly bad idea, as it means there will be no way to restore your client's data from backups if a disaster occurs (e.g., a hard drive fails, MySQL crashes and corrupts tables, someone accidentally deletes a bunch of data). The smartest thing to do here would be to tell the client that you can't do it — anything else is setting yourself up for failure.
I want to practice some SQL locally on specific tables that I have.
What I need is simply to take a table, load it into some software I can run SQL against, and work with it. Nothing more: no servers, no other users.
I've tried a few different products but just can't find one that allows this without creating a server and setting up connections.
Please help :)
Thanks!
I think something like SQLite would work well for your purpose; SQLite is serverless.
You can then use a shell or DOS prompt to create a database, create your table(s), and load your data into them.
https://www.sqlite.org/quickstart.html
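A minimal sketch of that workflow (the table and file names are just examples); the sqlite3 command-line shell can also .import CSV files directly:

```shell
# A throwaway database is just a file; no server, no connections.
sqlite3 practice.db "CREATE TABLE employees (id INTEGER PRIMARY KEY, name TEXT, dept TEXT);"
sqlite3 practice.db "INSERT INTO employees (name, dept) VALUES ('Ada','Eng'), ('Grace','Eng'), ('Linus','Ops');"

# Run ad-hoc SQL against it whenever you like.
sqlite3 practice.db "SELECT dept, COUNT(*) FROM employees GROUP BY dept ORDER BY dept;"
# Eng|2
# Ops|1

rm practice.db   # the whole "database" is gone
```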
SQL Fiddle — maybe this is what you're looking for.
Our website is hosted by DreamHost, and two years ago we switched from Joomla to WordPress.
Problem: there is a table (jos_bannertrack) with 28,000,000 records (yes, seriously). I'm fairly sure it's not used anymore — it should be left over from Joomla — but since the site is big and well visited, I don't want to cause any trouble at all.
I can't make any kind of backup: I've tried with WP Clone and with mysqldump, but it's always too big.
Dreamhost grants me access with ssh and phpmyadmin.
Any idea?
Have you tried the options described on the other Stack Exchange sites?
Serverfault
DBA
Although you haven't given the size of your table (only the record count), I expect you should be able to run a simple mysqldump (over your SSH login), at least for that specific table only.
You could also limit the number of rows per backup using phpMyAdmin:
Browse to the table and open it to view the records.
Click on Export.
At the bottom, under "Save as file", use the options to select how many rows to export and which row to start at.
This will produce separate SQL files containing the data from that one table. You should test to see how many rows you can export at a time.
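If the SSH route works for you, a sketch of dumping (and only afterwards dropping) just that one table; the user and database names below are placeholders, and the table name is taken from the question, so verify it with SHOW TABLES first:

```shell
# Dump only the suspect table, compressed, so the file stays manageable.
mysqldump -u youruser -p yourdb jos_bannertrack | gzip > jos_bannertrack.sql.gz

# Only once you've verified the dump is intact:
mysql -u youruser -p yourdb -e "DROP TABLE jos_bannertrack;"
```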
phpMyAdmin is not the kind of tool that can back up this much data effectively. Given its lack of automation, using an alternative solution to move that much data seems far more reasonable.
My friend and I both have SQL Server 2012. I have a database from which I would like to copy most of the tables into a database on my friend's laptop, but I'm not sure how to go about it.
I know that within SQL Server I can copy data from one table in, say, database A to database B using the line below.
select * into database_b.dbo.MyTable from database_a.dbo.MyTable
What is the best way to connect the two laptops?
You could take a full backup and give it to him to restore. Or you could give him a copy of the *.mdf file and let him attach it to his instance.
There are two main options. One is to back up the database and send it to your friend: right-click the database, then Tasks > Back Up. The problem with this approach is that if you're running even slightly different versions, you may have issues.
The alternative is to script the database: right-click the database, then Tasks > Generate Scripts. Make sure you choose to script both data and schema from the advanced options.
The latter is my preferred approach (as it's much more editable and human readable).
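For the backup route, a minimal command-line sketch instead of the SSMS dialogs (the server name, database name, and paths are placeholders):

```shell
# Back up on your machine...
sqlcmd -S localhost -Q "BACKUP DATABASE MyDb TO DISK = N'C:\backups\MyDb.bak' WITH INIT"

# ...then, after copying the .bak file over, restore on your friend's machine:
sqlcmd -S localhost -Q "RESTORE DATABASE MyDb FROM DISK = N'C:\backups\MyDb.bak'"
```

The same version caveat applies: a backup restores fine onto the same or a newer SQL Server version, but not onto an older one.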
I have several websites hosted on a VPS and am currently performing database backups by running a shell script via cron that looks something like this:
mysqldump -uusername1 -prootpassword dbname1 > /backup/dbname1.bak
mysqldump -uusername2 -prootpassword dbname2 > /backup/dbname2.bak
mysqldump -uusername3 -prootpassword dbname3 > /backup/dbname3.bak
I have a couple of concerns about this process.
Firstly, I'm using the MySQL root password to run mysqldump, and it's stored in clear text in the script on the server (not publicly accessible or anything, but there are obvious concerns if I grant other users access to the server for one reason or another). I'm using root because it's simpler than tracking down everybody who has created a database and asking them for their specific DB passwords.
Secondly, this process only works if people inform me that they've added a database (which is fine for the most part, we're not doing anything super complicated over here). I would prefer to have a backup of everything without worrying that I've overlooked something.
You could always just dump ALL the databases:
mysqldump --all-databases | gzip -9 > /backup/dbs.bak.gz
That'd free you from having to keep track of which dbs there are. The downside is that restoring gets a bit more complicated.
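On the restore side, one way to pull a single database back out of the combined dump is the mysql client's --one-database option (the database and file names are placeholders):

```shell
# Replay only the statements that apply to dbname1; everything else in the dump is skipped.
gunzip < /backup/dbs.bak.gz | mysql -u root -p --one-database dbname1
```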
As for using root, there's no reason you couldn't create another account that has permissions to do backups - you should never use the root account for anything other than initial setup.
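A sketch of such a dedicated backup account, plus a way to keep the password out of the cron script via an option file (the account name, password, and paths are placeholders):

```shell
# One-time setup: a read-only backup account.
mysql -u root -p <<'EOF'
CREATE USER 'backup'@'localhost' IDENTIFIED BY 'choose-a-strong-password';
GRANT SELECT, LOCK TABLES, SHOW VIEW, EVENT, TRIGGER ON *.* TO 'backup'@'localhost';
EOF

# Credentials live in a root-only option file instead of the script:
#   /root/.backup.cnf  (chmod 600)
#   [client]
#   user=backup
#   password=choose-a-strong-password

# The cron job then needs no embedded password:
mysqldump --defaults-extra-file=/root/.backup.cnf --all-databases | gzip -9 > /backup/dbs.bak.gz
```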
I use this script: http://sourceforge.net/projects/automysqlbackup/ It works perfectly. Also, you should add a backup MySQL user that has global SELECT and LOCK TABLES permissions. That way you don't need everyone's username and password.
Is anybody aware of a tool that lets me browse MySQL dump files without having to import them into my database system?
I'm looking for an easy way to inspect MySQL backups quickly without importing them, but still have them nicely displayed — so viewing the raw SQL source is not really an option.
Maybe there's a program that takes the SQL dump, automatically imports it into a temporary database, and then presents it in an interface similar to HeidiSQL (or any other SQL GUI tool).
Why are you eliminating the obvious solution? You just need to load the backup into a mysql database. Either load the backup into a separate mysql instance, or if your backup is of just one database (i.e. you didn't pass --databases or --all-databases to mysqldump), load it into a database of a different name.
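A sketch of that rename trick (the names are placeholders): a dump made without --databases or --all-databases contains no CREATE DATABASE or USE statements, so it loads into whatever default database you pass to the client:

```shell
mysql -u root -p -e "CREATE DATABASE inspect_tmp"
mysql -u root -p inspect_tmp < backup.sql
# Browse inspect_tmp with HeidiSQL (or similar), then throw it away:
mysql -u root -p -e "DROP DATABASE inspect_tmp"
```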
I came here looking for an answer to the same question, because it can be cumbersome to load a 20 GB SQL dump just for inspection and then drop it again. While I had hoped to find a standalone shortcut tool, the best I can recommend is a cocktail of Linux CLI text-manipulation tools like grep, sed, and cut. Some useful things to extract:
What tables are being created/inserted into?
Are the mysqldump INSERTs one line per record, or all stuffed into one line? (This affects how the other checks have to work.)
How many rows are being inserted into table XYZ?
What is some representative data being inserted into table XYZ?
What is the ABC column value for the last row inserted into table XYZ?
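A sketch of that cocktail, using a tiny stand-in dump file so the commands have something to work on (the file and table names are placeholders):

```shell
# A miniature stand-in for a real mysqldump file.
cat > sample_dump.sql <<'EOF'
CREATE TABLE `users` (`id` int, `name` varchar(50));
INSERT INTO `users` VALUES (1,'Ada'),(2,'Grace');
CREATE TABLE `orders` (`id` int, `total` decimal(10,2));
INSERT INTO `orders` VALUES (1,19.99);
EOF

# Which tables are created?
grep -oE 'CREATE TABLE `[^`]+`' sample_dump.sql

# How many INSERT statements target a given table?
grep -c 'INSERT INTO `users`' sample_dump.sql

# Peek at representative data without loading anything.
grep -m1 'INSERT INTO `orders`' sample_dump.sql | cut -c1-120

rm sample_dump.sql
```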
Good luck!