mysqldump with --where clause is not working

mysqldump -t -u root -p mytestdb mytable --where=datetime LIKE '2014-09%'
This is what I am doing and it returns:
mysqldump: Couldn't find table: "LIKE"
I am trying to return all the rows where the column datetime starts with 2014-09, meaning "all September rows".

You need to quote the whole --where argument so the shell passes it as a single option:
mysqldump -t -u root -p mytestdb mytable --where="datetime LIKE '2014-09%'"

Selecting dates using LIKE is not a good idea; I have seen this approach in one project. It puts a heavy load on the DBMS and slows the system down, because no index on that column can be used.
If you need to select a date range, use BETWEEN:
where datetime between '2014-09-01' and '2014-09-30 23:59:59'
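Applied to the command from the question, that would look like this (a sketch reusing the same database and table names):
mysqldump -t -u root -p mytestdb mytable --where="datetime BETWEEN '2014-09-01' AND '2014-09-30 23:59:59'" > mytable.sql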

Not the answer, but just a note: mysqldump automatically adds DROP TABLE and CREATE TABLE statements to the export file. If you don't want that, add --skip-add-drop-table and --no-create-info to the command, like:
mysqldump -u root -p database_name table_name --skip-add-drop-table --no-create-info > export.sql

You missed the double quotes around the WHERE clause. Also, datetime is not a recommended column name, since DATETIME is a data type as well.
mysqldump -u root -p mytestdb mytable --where="datetime LIKE '2014-09%'" > mytable.sql
After executing the command, a prompt will ask for the MySQL password. Then check your current folder for the generated mytable.sql.
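To load the dump into another database, pipe it back through the mysql client. Since this dump was created without -t, it includes the CREATE TABLE statement; other_db below is just a hypothetical target name:
mysql -u root -p other_db < mytable.sql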

Related

psql export query output to a new table in a new sqlite3 db

Using psql we can export a query output to a csv file.
psql -d somedb -h localhost -U postgres -p 5432 -c "\COPY (select * from sometable ) TO 'sometable.csv' DELIMITER ',' CSV HEADER;"
However I need to export the query output to a new table in a new sqlite3 database.
I also looked at pg_dump, but haven't been able to figure out a way with it.
The reason I want to export it as a new table in a new sqlite3 db without any intermediate CSV conversion is that the query output is going to run into GBs and I have disk space constraints, so rather than exporting a CSV and then creating a new sqlite3 db, I need to do this in one shot.
My solution uses standard INSERT SQL statements.
It requires the same table schema on the sqlite3 side. The grep command removes problematic lines, such as comments (--) or blank lines.
pg_dump --data-only --inserts --table=sometable DBNAME | grep -v -e '^SET' -e '^$' -e '^--' | sqlite3 ./target.db
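Since --data-only omits CREATE TABLE statements, the table must already exist on the sqlite3 side; a minimal sketch with hypothetical columns:
sqlite3 ./target.db "CREATE TABLE sometable (id INTEGER PRIMARY KEY, name TEXT);"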
I hope this will help you.

MySQL Restore - DB Tables with Prefix and SQL Script with No Prefix

In the command line, I'm trying to restore some (not all) tables of data from a MySQL SQL script file. However, the tables in my database have a prefix and the tables in the SQL file do not.
Is there a way within the command line to specify a prefix on restore?
mysql -uroot -p databasename < script_with_no_prefix.sql
You can pick out the tables you need using a sed command. For example, if your table prefix is "prefix_", you could use this:
$ sed -n -e '/^-- Table structure for table `prefix_/,/^UNLOCK TABLES/p' \
script_with_no_prefix.sql | mysql -uroot -p databasename
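Note that this extracts only tables whose dumped names already carry the prefix. If the dump has unprefixed names and you need to add the prefix during restore, a rough sed rewrite can work, assuming the dump always backtick-quotes table names (sometable is a hypothetical example name):
$ sed -e 's/`sometable`/`prefix_sometable`/g' script_with_no_prefix.sql | mysql -uroot -p databasename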

How to take backup of functions only in Postgres

I want to take a backup of all functions in my Postgres database. How can I back up only the functions in Postgres?
Use pg_get_functiondef; see the system information functions. pg_get_functiondef was added in PostgreSQL 8.4.
SELECT pg_get_functiondef('proc_name'::regproc);
To dump all functions in a schema you can query the system tables in pg_catalog; say if you wanted everything from public:
SELECT pg_get_functiondef(f.oid)
FROM pg_catalog.pg_proc f
INNER JOIN pg_catalog.pg_namespace n ON (f.pronamespace = n.oid)
WHERE n.nspname = 'public';
It's trivial to change the above to, say, "from all schemas except those beginning with pg_" instead, if that's what you want.
Using psql you can dump this to a file with:
psql -At dbname > /path/to/output/file.sql <<"__END__"
... the above SQL ...
__END__
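Put together with the public-schema query above, a concrete invocation looks like this:
psql -At dbname > /path/to/output/file.sql <<"__END__"
SELECT pg_get_functiondef(f.oid)
FROM pg_catalog.pg_proc f
INNER JOIN pg_catalog.pg_namespace n ON (f.pronamespace = n.oid)
WHERE n.nspname = 'public';
__END__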
To run the output in another DB, use something like:
psql -1 -v ON_ERROR_STOP -f /path/to/output/file.sql target_db_name
If you're replicating functions between DBs like this, though, consider storing the authoritative copy of the function definitions as an SQL script in a revision control system like svn or git, preferably packaged as a PostgreSQL extension. See packaging extensions.
You can't tell pg_dump to dump only functions. However, you can make a dump without data (-s or --schema-only) and filter it on restoring. Note the --format=c (also -Fc) part: this will produce a file suitable for pg_restore.
First take the dump:
pg_dump -U username --format=c --schema-only -f dump_test your_database
Then create a list of the functions:
pg_restore --list dump_test | grep FUNCTION > function_list
And finally restore them (-L or --use-list specifies the list file created above):
pg_restore -U username -d your_other_database -L function_list dump_test

dump out some tables of the database

I am using MySQL v5.1 on Ubuntu machine.
I have a database named db_test which contains tables like cars, customers, departments, prices, and so on.
I know I can use the following commands to dump out the db_test database and load it back into a new database:
mysqldump -u username -p -v db_test > db_test.sql
mysqladmin -u username -p create new_database
mysql -u username -p new_database < db_test.sql
But for my new_database, I only need some tables from the db_test database, not all of them.
So, how can I dump out some tables from the db_test database and load them into my new_database?
Please use the code below, listing the table names after the database name, separated by spaces:
mysqldump -u username -p -v db_test table1 table2 > db_test.sql
From the MySQL docs:
shell> mysqldump [options] db_name [tbl_name ...]
List the names of the tables after the database name; listing no table names results in all tables being dumped.
Simply list the tables in the mysqldump command.
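For example, to copy just the cars and customers tables from the question into new_database:
mysqldump -u username -p db_test cars customers > some_tables.sql
mysqladmin -u username -p create new_database
mysql -u username -p new_database < some_tables.sql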

MySQL Dump only certain rows

I am trying to do a mysql dump of a few rows in my database. I can then use the dump to upload those few rows into another database. The code I have is working, but it dumps everything. How can I get mysqldump to only dump certain rows of a table?
Here is my code:
mysqldump --opt --user=username --password=password lmhprogram myResumes --where=date_pulled='2011-05-23' > test.sql
Just fix your --where option. It should be a valid SQL WHERE clause, like:
--where="date_pulled='2011-05-23'"
You have the column name outside of the quotes.
You need to quote the "where" clause.
Try
mysqldump --opt --user=username --password=password lmhprogram myResumes --where="date_pulled='2011-05-23'" > test.sql
Use this code to dump specific table rows, using a LIKE condition.
mysqldump -u root -p sel_db_server case_today --where="date_created LIKE '%2018%'" > few_rows_dump.sql