Our website is hosted by DreamHost, and two years ago we switched from Joomla to WordPress.
Problem: there is a table (jos_bannertrack) with 28,000,000 records (yes, seriously). I am pretty sure it's not used anymore; it must be left over from Joomla. But since the website is big and heavily visited, I don't want to cause any trouble at all.
I can't make any kind of backup: I have tried WP Clone and mysqldump, but the result is always too big.
DreamHost gives me SSH and phpMyAdmin access.
Any ideas?
Have you tried the options described on these other Stack Exchange sites?
Server Fault
DBA
Although you haven't given the size of your table (only a record count), I expect you should be able to run a simple mysqldump (over your SSH login), at least for that specific table only.
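For example, a rough sketch over SSH, where the host, user, and database names are placeholders you would replace with your DreamHost values:

# Dump only the suspect table; compressing on the fly keeps the file much smaller.
mysqldump -h mysql.example.com -u USER -p DBNAME jos_bannertrack | gzip > jos_bannertrack.sql.gz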
You could also limit the number of rows per backup using phpMyAdmin:
Browse to the table and open it to view the records.
Click on Export.
At the bottom, under "Save as file", use the options to select how many rows to export and which row to start at.
This will output several SQL files containing the data from that one table. You should test to see how many rows you can export at a time.
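If the phpMyAdmin export times out, the same chunking can be done from the shell. A sketch, assuming the table has an integer primary key (the column name id is hypothetical, as are the credentials):

# Schema once, then the data in row ranges.
mysqldump -u USER -p DBNAME jos_bannertrack --no-data > schema.sql
mysqldump -u USER -p DBNAME jos_bannertrack --no-create-info --where="id BETWEEN 1 AND 1000000" > rows_1.sql
mysqldump -u USER -p DBNAME jos_bannertrack --no-create-info --where="id BETWEEN 1000001 AND 2000000" > rows_2.sql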
I suspect phpMyAdmin is not the kind of tool that can back up this much data effectively. Given its lack of automation, using an alternative solution to migrate such a bulk of data seems far more reasonable.
Hi, I have a requirement to create a database from which no data can leave, neither in CSV format nor as a dump file.
If MySQL crashes, the data should be gone; no recovery should exist.
It may look like a stupid idea to implement, but that is exactly the client's requirement.
So, how can I restrict the mysqldump client program and INTO OUTFILE commands for all users except root? The other users will have SELECT, INSERT, UPDATE, DELETE, and other database-level privileges, but no global-level privileges.
Can anyone help me with this?
I'm not sure exactly what you are looking for, but if you have SSH access to the server, I propose using a filesystem backup or a tool like innobackupex instead of mysqldump.
For big data, mysqldump isn't a good solution.
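For reference, a minimal innobackupex run looks roughly like this (innobackupex ships with Percona XtraBackup; the paths and credentials are placeholders):

# Take a hot physical backup; this writes a timestamped directory under /data/backups.
innobackupex --user=root --password=SECRET /data/backups/
# Prepare that backup so it is consistent and ready to restore.
innobackupex --apply-log /data/backups/TIMESTAMP_DIR/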
You must restrict every new MySQL user by withholding privileges such as Select_priv, Lock_tables_priv, File_priv, Alter_priv, Create_tmp_table_priv, Execute_priv, and Create_priv, so the user can't do much of anything; without them, even mysqldump can't export. You can manage this through the mysql.user table, or with a tool like Navicat.
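A sketch of the privilege-scoping idea, with hypothetical user and database names. The key point is that FILE is a global privilege, so a user who is never granted it cannot run SELECT ... INTO OUTFILE (though, as the next answer explains, this does not stop a client-side mysqldump):

# All names here are hypothetical.
mysql -u root -p <<'SQL'
CREATE USER 'app_user'@'localhost' IDENTIFIED BY 'change_me';
-- Database-level DML only; granting nothing ON *.* means no global privileges such as FILE.
GRANT SELECT, INSERT, UPDATE, DELETE ON appdb.* TO 'app_user'@'localhost';
SQL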
You can't. From the perspective of the MySQL server, mysqldump is just another client that runs a lot of SELECT statements. What it happens to do with them (generate a dump file) is not something that the server can control.
For what it's worth, this sounds like an incredibly stupid idea, as it means there will be no way to restore your client's data from backups if a disaster occurs (e.g., a hard drive fails, MySQL crashes and corrupts tables, someone accidentally deletes a bunch of data, etc.). The smartest thing to do here is to tell the client that you can't do it; anything else is setting yourself up for failure.
I have a SQL Server 2005 instance running, and a client of mine deleted some data that they would like to get back. It is four records. Is there a way to query the backups to see if the data exists without restoring the database?
They just noticed the data was missing; it could have been deleted three months ago or yesterday, so the backups may have been overwritten and the data may not exist at all. I am just trying to cover my bases and see if I can find the data before telling them they should not have clicked OK the second time, when I asked them if they were sure they wanted to delete that record.
RedGate sells Virtual Restore, which can
Rapidly mount live, fully functional databases direct from backups
You could sign up for a trial and check your current backups.
P.S. I haven't used Virtual Restore, but the other RedGate products I've used were of good quality.
No, there is no such possibility. But if your backup media are files (for example, with different names in a folder), you can write a loop script that restores each one and then runs a query for the missing records.
The script just needs to do RESTORE DATABASE ... FROM ... for each file. I have a similar script, but it's not accessible to me right now.
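A rough sketch of that loop, assuming the backups are .bak files, sqlcmd is available, and the backup's file layout matches the server's paths; the scratch database, table, and key values are all hypothetical:

#!/bin/sh
# Restore each backup into a scratch database, then look for the missing rows.
for f in /backups/*.bak; do
  echo "Checking $f"
  sqlcmd -S localhost -U sa -P "$SA_PASSWORD" \
    -Q "RESTORE DATABASE ScratchDB FROM DISK = N'$f' WITH REPLACE"
  sqlcmd -S localhost -U sa -P "$SA_PASSWORD" -d ScratchDB \
    -Q "SELECT * FROM dbo.Orders WHERE OrderID IN (101, 102, 103, 104)"
done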
I was wondering: what is the simplest and easiest way to back up and restore a database in SQLite 3? I have read around, and there are lots of articles detailing methods for complicated situations, but I am struggling to find a basic procedure.
I have one simple database on a site that is basically a news feed of a company's recent activities. The site is just about to be deployed and will have new posts added on a roughly daily basis. I am hoping to write a number of posts before the site goes online, then upload the database to the live server. From then on, new posts will be added online, but it would be nice to have a backup in case something goes wrong.
So, essentially my question is:
Is there a simple way to back up a database in SQLite 3, and also to upload a database? I am aware that I could possibly use seeds to upload the data initially, but ideally I would rather just copy the development database (if possible...) and upload it onto the production server.
Apologies for my ignorance...
I would read the backup documentation here. There are some potential risks in doing file copies, but especially for the initial launch, this approach would be fine. I have done this on a couple of low traffic sites for a number of years and never run into any issues.
The nice thing about SQLite 3 is that it's exclusively a file-based database. As long as you can keep applications from using the database for a bit, backing up and restoring is as simple as copying the database file itself.
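For a copy that is safe even while the database is in use, the sqlite3 shell's .backup command takes a consistent snapshot. A sketch with placeholder file names and paths:

# Consistent online snapshot of the live database.
sqlite3 production.sqlite3 ".backup 'backup-2013-01-01.sqlite3'"
# For the initial launch, a plain copy is fine while nothing is writing to the file.
scp development.sqlite3 user@server:/var/www/app/db/production.sqlite3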
Is anybody aware of a tool that lets me browse MySQL dump files without having to import them into my database system?
I'm looking for an easy way to inspect MySQL backups quickly without having to import them, but with the contents still nicely displayed, so viewing the raw SQL source is not really an option.
Maybe there's a program that takes the SQL dump, automatically imports it into a temporary database, and then presents it in an interface similar to HeidiSQL (or any other SQL GUI tool).
Why are you eliminating the obvious solution? You just need to load the backup into a MySQL database. Either load it into a separate MySQL instance, or, if your backup covers just one database (i.e. you didn't pass --databases or --all-databases to mysqldump), load it into a database with a different name.
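A sketch of that second option, with placeholder names throughout:

# Create a scratch database and load the single-database dump into it.
mysql -u root -p -e "CREATE DATABASE inspect_tmp"
mysql -u root -p inspect_tmp < backup.sql
# Browse it with any GUI tool, then drop it when done.
mysql -u root -p -e "DROP DATABASE inspect_tmp"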
I came here looking for an answer to the same question, because it can be cumbersome to load a 20 GB SQL dump just for inspection and then drop it again. While I'd still hope to find a standalone shortcut tool, the best I can recommend is a cocktail of Linux CLI text-manipulation tools like grep, sed, and cut (a few sketches follow the list below). Some useful output could be:
What tables are being created or inserted into?
Are the mysqldump INSERTs one line per record, or all stuffed into one statement? (This affects other things, such as the next point.)
How many rows are being inserted into table XYZ?
What is some representative data being inserted into table XYZ?
What is the ABC column value for the last row inserted into table XYZ?
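For instance, against a dump file called dump.sql (the table name xyz is hypothetical, and these assume GNU grep):

# Which tables are created or inserted into, with statement counts:
grep -oP '^(CREATE TABLE|INSERT INTO) `\K[^`]+' dump.sql | sort | uniq -c
# One row per INSERT, or extended inserts? Count the INSERT statements for xyz:
grep -c 'INSERT INTO `xyz`' dump.sql
# Peek at representative data without opening the whole file:
grep -m 1 'INSERT INTO `xyz`' dump.sql | cut -c 1-300
# Tail of the final INSERT for xyz (i.e. the last rows written):
grep 'INSERT INTO `xyz`' dump.sql | tail -n 1 | rev | cut -c 1-300 | rev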
Good luck!
I have a webapp that spans many different users, each with selective permissions about what they are able to see. The app is built on top of a MySQL database.
One feature I am interested in providing to my "power users" is an SQL dump of all their data, so that they can run off and do their own things with it. Now, I can't just use mysqldump, because the dump would include things belonging to other users that should not be made available to anybody else for download.
Is there any other easy way to get data in and out of MySQL that lets you selectively specify what to export, without having to jump through all kinds of hoops? Note that I need control at the query level; being able to specify a list of tables is NOT sufficient. In an ideal world, such a tool would automatically find all relationships by traversing foreign keys, but if I have to write queries at the table level I'm willing to, provided it's easy for others to get the data back into MySQL without too much trouble.
Anyone know if such a tool exists, or if I am in "roll my own" territory?
Mysqldump does have a "--where" flag that you can use to selectively return rows. I think you should be able to do something like:
mysqldump -u USER -p --where="foreign_key_id=5" DATABASE TABLE > export.sql
This should return only those specific rows; there is more documentation on the MySQL site.
However, I'm not sure you wouldn't be further ahead doing the export as comma-separated value (CSV) files. CSV files can be imported back into MySQL, and they also give your users many other options for working with their data (spreadsheets, other RDBMSs, text analysis).
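One way to produce such a file, sketched with hypothetical table and column names (note that SELECT ... INTO OUTFILE writes on the server and requires the FILE privilege):

# Per-user CSV export; all names are placeholders.
mysql -u USER -p mydb -e "SELECT * FROM orders WHERE user_id = 5 \
  INTO OUTFILE '/tmp/user5_orders.csv' \
  FIELDS TERMINATED BY ',' ENCLOSED BY '\"' LINES TERMINATED BY '\n'"
# Users can then re-import with LOAD DATA INFILE ... INTO TABLE, using the same field options.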
Here is a tool that can help you export data into CSV/Excel files (but not import data). It has permission management that should provide the access control you require.
You can find it here: https://github.com/mpetcu/report-manager.