Sqoop Export Error: Mixed update/insert is not supported against the target database yet - hive

I am trying to export my data from a Hive table to an RDBMS (Microsoft SQL Server 2016) using this command:
sqoop export \
--connect connectionString \
--username name \
--password password \
--table Lab_Orders \
--update-mode allowinsert \
--update-key priKey \
--driver net.sourceforge.jtds.jdbc.Driver \
--hcatalog-table lab_orders \
-m 4
I want to do an incremental export, so I have specified --update-mode and --update-key. However, when I run this command it fails with this error:
ERROR tool.ExportTool: Error during export:
Mixed update/insert is not supported against the target database yet
at org.apache.sqoop.manager.ConnManager.upsertTable(ConnManager.java:684)
at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:73)
at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:99)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
I went through all the suggested solutions, including removing --driver, but if I remove the driver it doesn't recognize the RDBMS table. I am using
Sqoop 1.4.6-cdh5.11.1 on a Cloudera cluster.
Can someone please help with a possible solution?
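The stack trace ends in the base org.apache.sqoop.manager.ConnManager.upsertTable, which is what runs when Sqoop has fallen back to the generic JDBC manager; passing --driver explicitly is one of the things that forces that fallback. A sketch of one commonly suggested workaround, assuming the SQL Server manager in this Sqoop build supports upsert, that the Microsoft JDBC driver jar is on Sqoop's classpath, and with <host> and <db> as placeholders:
sqoop export \
--connect "jdbc:sqlserver://<host>:1433;databaseName=<db>" \
--username name \
--password password \
--connection-manager org.apache.sqoop.manager.SQLServerManager \
--table Lab_Orders \
--update-key priKey \
--update-mode allowinsert \
--hcatalog-table lab_orders \
-m 4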

Related

I have an error trying to run my Sqoop job (trying to copy a table from Oracle to Hive)

I am trying to copy a table from Oracle to Hadoop (Hive) with a Sqoop script (the table does not already exist in Hive). Within PuTTY, I launch a script called "my_script.sh"; a code sample is below. However, after I run it, it echoes my code back followed by a "no such file or directory" error. Can someone please tell me if I am missing something in my code?
Yes, my source and target directories are correct (I made sure to triple-check).
Thank you
#!/bin/bash
sqoop import \
-Dmapred.map.child.java.opts='-Doracle.net.tns_admin=. -Doracle.net.wallet_location=.' \
-files $WALLET_LOCATION/cwallet.sso,$WALLET_LOCATION/ewallet.p12,$TNS_ADMIN/sqlnet.ora,$TNS_ADMIN/tnsnames.ora \
--connect jdbc:oracle:thin:/#MY_ORACLE_DATABASE \
--table orignal_schema.orignal_table \
--hive-drop-import-delims \
--hive-import \
--hive-table new_schema.new_table \
--num-mappers 1 \
--hive-overwrite \
--mapreduce-job-name my_sqoop_job \
--delete-target-dir \
--target-dir /hdfs://myserver/apps/hive/warehouse/new_schema.db \
--create-hive-table
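Since "no such file or directory" comes from the shell rather than from Sqoop, it is worth ruling out the script file itself before touching the Sqoop options. A couple of hedged checks (the script name is from the question; everything else is an assumption):
file my_script.sh          # "with CRLF line terminators" points at Windows line endings
cat -A my_script.sh | head # CRLF endings show up as ^M$ at the end of each line
dos2unix my_script.sh      # strip them, if dos2unix is installed
bash ./my_script.sh        # run with an explicit interpreter and path
Separately, the --target-dir value starts with a stray "/" before the hdfs:// scheme, which is worth removing once the script itself launches.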

How to import-all-tables from MySQL to Hive using Sqoop for a particular database in Hive?

sqoop import-all-tables into Hive with the default database works fine, but sqoop import-all-tables into a specified Hive database is not working.
As --hive-database is deprecated, how do I specify the database name?
sqoop import-all-tables \
--connect "jdbc:mysql://quickstart.cloudera:3306/retail_db" \
--username root \
--password XXX \
--hive-import \
--create-hive-table
The above command creates the tables in /user/hive/warehouse/, i.e. the default directory.
How do I import all tables into /user/hive/warehouse/retail.db/?
You can set the HDFS path of your database using the --warehouse-dir option.
The following example worked for me:
sqoop import-all-tables \
--connect jdbc:mysql://localhost:3306/retail_db \
--username user \
--password password \
--warehouse-dir /apps/hive/warehouse/lina_test.db \
--autoreset-to-one-mapper
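To confirm where the tables actually land, you can list the warehouse path afterwards (path taken from the example above); each imported table should show up as a subdirectory:
hdfs dfs -ls /apps/hive/warehouse/lina_test.db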

Import Data from Postgresql to Hive

I am facing issues while importing a table from PostgreSQL to Hive. The query I am using is:
sqoop import \
--connect jdbc:postgresql://IP:5432/PROD_DB \
--username ABC_Read \
--password ABC#123 \
--table vw_abc_cust_aua \
-- --schema ABC_VIEW \
--target-dir /tmp/hive/raw/test_trade \
--fields-terminated-by "\001" \
--hive-import \
--hive-table vw_abc_cust_aua \
--m 1
The error I am getting:
ERROR tool.ImportTool: Error during import: No primary key could be found for table vw_abc_cust_aua. Please specify one with --split-by or perform a sequential import with '-m 1'.
Please let me know what is wrong with my query.
I am assuming -- --schema ABC_VIEW is a typo; it should be --schema ABC_VIEW.
The other issue is that the option for the number of mappers is either -m or --num-mappers, not --m.
Solution
In your script, change --m to -m or --num-mappers.
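Putting both corrections together, a sketch of the command; if the -- --schema form is kept, connector-specific extra arguments generally have to come after all of the standard options, so it is moved to the end here (that placement is my assumption, the rest is copied from the question):
sqoop import \
--connect jdbc:postgresql://IP:5432/PROD_DB \
--username ABC_Read \
--password ABC#123 \
--table vw_abc_cust_aua \
--target-dir /tmp/hive/raw/test_trade \
--fields-terminated-by "\001" \
--hive-import \
--hive-table vw_abc_cust_aua \
-m 1 \
-- --schema ABC_VIEW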

sqoop hive import "has not been cleaned" exception

When I try to run the following sqoop command
sqoop import \
--connect jdbc:mysql://hostname.com:3306/retail_db \
--username **** \
--password **** \
--table customers \
--hive-import \
--hive-database hariharan_hive \
--hive-table hivecustomers \
--hive-overwrite
I’m getting an exception as:
"Failed with exception Destination directory hdfs://nn01.itversity.com:8020/apps/hive/warehouse/hariharan_hive.db/hivecustomers has not be cleaned up."
But the path given in the exception does not exist. Can anybody help me with this?
How about repairing the table's metadata in the Hive metastore with the following command:
(hive shell)> msck repair table hariharan_hive.hivecustomers;
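If the directory named in the exception really seems to be missing, check it from the NameNode's point of view rather than a local shell, since the exception quotes a fully qualified HDFS path (copied from the error above):
hdfs dfs -ls hdfs://nn01.itversity.com:8020/apps/hive/warehouse/hariharan_hive.db
hdfs dfs -rm -r hdfs://nn01.itversity.com:8020/apps/hive/warehouse/hariharan_hive.db/hivecustomers   # only if a stale directory is listed above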

Sqoop import statement

The sqoop import statement below worked for me the other day, but today the same statement is showing an error. Below are the error and the import statement.
ERROR:
You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'sqoop import
sqoop import \
--connect jdbc:mysql://localhost/loudacre \
--username training --password training \
--table device \
-- target-dir /loudacre/device
-m 1
You are missing 'WHERE $CONDITIONS' in your import command. 'WHERE $CONDITIONS' is mandatory and $CONDITIONS should be in upper case. Please make this correction and try again.
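For context, the WHERE $CONDITIONS clause belongs to free-form --query imports rather than plain --table imports; a sketch of what that looks like, with a hypothetical split column and target directory:
sqoop import \
--connect jdbc:mysql://localhost/loudacre \
--username training --password training \
--query 'SELECT * FROM device WHERE $CONDITIONS' \
--split-by device_id \
--target-dir /loudacre/device_query
The single quotes matter: with double quotes the shell would expand $CONDITIONS before Sqoop ever sees it.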
You have missed a "\" at the end of the --target-dir line (and there is a stray space in "-- target-dir").
Can you run this and check it once:
sqoop import \
--connect jdbc:mysql://localhost/loudacre \
--username training --password training \
--table device \
--target-dir /loudacre/device \
-m 1
Yes, you may face an error saying that --target-dir /loudacre/device already exists. When you run the command for the first time there is no target directory in HDFS, so the script runs fine, but the second time the target directory is already there and it throws an error saying the directory already exists.
Solutions to resolve this error:
1. Give another new directory, or delete the existing directory and try the script again.
2. You can also import the MySQL data in Sqoop append mode or overwrite mode (sketched below). Refer to the link below for reference:
Using sqoop import, How to append rows into existing hive table?
Kindly let me know if it works.
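A sketch of the two re-runnable variants described above, reusing the connection details from the question:
# 1. Wipe the existing target directory before importing again
sqoop import \
--connect jdbc:mysql://localhost/loudacre \
--username training --password training \
--table device \
--target-dir /loudacre/device \
--delete-target-dir \
-m 1
# 2. Or append new rows into the existing directory instead: swap --delete-target-dir for --append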
The error is solved, guys. It turns out that when you copy-paste the command from Notepad to the terminal, the " character changes and results in an error. We need to type the " explicitly in the terminal.
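One way to catch those pasted characters before running the command is to scan the script for anything outside plain ASCII (this assumes GNU grep with -P support and a hypothetical file name):
grep -nP '[^\x00-\x7F]' my_import.sh
Curly quotes copied in from a word processor or Notepad show up immediately in the matched lines.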