I have some databases in my HiveServer2 which I need to export to HDFS and then import into another cluster (another HiveServer2).
I saw this answer, but it exports to CSV format.
I would also like to know whether there are commands like "export $database" and "import $database".
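As far as I know, Hive's EXPORT and IMPORT statements work per table rather than per database, so you would loop over the tables. A minimal sketch using beeline and distcp (host names, database, table, and paths here are placeholders):
# on the source cluster: write table data plus metadata to an HDFS directory
beeline -u jdbc:hive2://source-host:10000 -e "EXPORT TABLE mydb.mytable TO '/user/hive/export/mytable';"
# copy the exported directory to the destination cluster
hadoop distcp hdfs://source-nn:8020/user/hive/export/mytable hdfs://dest-nn:8020/user/hive/export/mytable
# on the destination cluster: recreate the table from the exported directory
beeline -u jdbc:hive2://dest-host:10000 -e "IMPORT TABLE mytable FROM '/user/hive/export/mytable';"
You could generate the EXPORT statements from SHOW TABLES output to cover a whole database.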
Is there a way to have a .csv or .txt imported into SQL Server automatically?
I know how to do it manually using the Data Import and Export tool, but is it possible to do it automatically?
You can use Windows Task Scheduler to run bcp commands automatically. The scheduled command will import your CSV file using the bulk copy program (bcp) utility, which can import or export data to and from files. For example, to import a CSV file into a SQL Server table, you can use a command like this:
bcp.exe dbo.MyTable in "C:\Some Folder\Data.csv" -S MYPC\SQLEXPRESS -d MyDatabase -U LoginName -P StrongP#ssw0rd -c -t,
Where:
dbo.MyTable is the schema and table name, where data should be imported.
in tells the direction (in loads data into the database; out extracts data from it).
"C:\Some Folder\Data.csv" is the name and path to the file holding the data to be imported.
MYPC\SQLEXPRESS is the computer and SQL Server instance name.
MyDatabase is the name of the database, where dbo.MyTable is.
LoginName and StrongP#ssw0rd are the credentials used to connect to the server (or use -E instead of -U and -P to connect using Windows Authentication).
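-c -t, tell bcp that the file contains character data with comma as the field terminator; without a format option, bcp prompts for each column interactively, which would stall a scheduled task.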
Then create a new scheduled task (Start -> Task Scheduler -> Create Basic Task) and set a schedule according to your requirements (e.g. daily at 3:00 AM) to run the command above.
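If you would rather script the task than click through the wizard, here is a minimal sketch, assuming the bcp command above is saved as C:\Scripts\import_csv.bat (the task name and schedule are just examples):
schtasks /Create /TN "ImportCsv" /TR "C:\Scripts\import_csv.bat" /SC DAILY /ST 03:00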
I am using the most recent version of PostgreSQL and pgAdmin 4.
I am attempting to import this database from GitHub: https://github.com/pthom/northwind_psql
I attempted to load the SQL file as a restore and received an error.
This file is not intended for pg_restore. You should just execute it with psql.
pg_restore is a utility for restoring a PostgreSQL database from an archive created by pg_dump in one of the non-plain-text formats.
Take a look at the create_db shell script on GitHub to understand how to import it.
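For reference, a minimal sketch of executing a plain-text dump with psql (the database name and file name are assumed from the repository):
# create an empty database, then run the script against it
createdb -U postgres northwind
psql -U postgres -d northwind -f northwind.sql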
I am trying to import a sample database, "employees.sql", from the official phpMyAdmin webpage. I am using the UwAmp server and getting the following error when using phpMyAdmin's "import" option:
Unrecognized statement type. (near "source" at position 0)
SQL file at the line where the error is reported:
SELECT 'LOADING departments' as 'INFO';
source load_departments.dump ;
I am not sure what to change to successfully import the database. I also tried different things like putting load_departments.dump in quotes, but it still didn't work.
How do you use MySQL's source command to import large files in Windows
is a must-read, and you will definitely get many ideas!
I think you should run the source command from cmd (Command Prompt).
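source is a built-in command of the mysql command-line client, not an SQL statement, which is why phpMyAdmin rejects it. A minimal sketch, assuming you unzipped the dump to C:\employees_db (the .dump files referenced by source are resolved relative to the current directory):
cd C:\employees_db
mysql -u root -p
mysql> source employees.sql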
I suggest you create an empty database and import that SQL file into it. Check it out.
Assumption: MySQL Server is installed and you have downloaded the employees database from GitHub. Unzip the package and go to that directory from the command prompt.
Enter the following command and, at the prompt, provide the SQL password.
mysql -u root -p -t < employees.sql
Verify your installation by entering the following command.
mysql -u root -p -t < test_employees_md5.sql
I am trying to import the contents of a MySQL table from a txt file.
I have already created dumps of the individual tables as .txt files. I then imported these txt files into my local server using LOAD DATA LOCAL INFILE, and it worked fine. But when I tried to import using the same method on the live server, I got #1148 - The used command is not allowed with this MySQL version. I searched and followed the steps from these two threads:
PHPMyAdmin saying: The used command is not allowed with this MySQL version
access denied for load data infile in MySQL
But it didn't work out. Now I am looking for a way to import the contents of those txt files into the table over SSH. Any help on how to do that will be great.
BTW, I am using MediaTemple DV.
TIA
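A minimal sketch of one way to do this after SSHing into the live server (the user, database, and file names are placeholders):
# enable LOCAL INFILE on the client side for this session
mysql --local-infile=1 -u dbuser -p mydb -e "LOAD DATA LOCAL INFILE 'mytable.txt' INTO TABLE mytable;"
# or let mysqlimport load each file into the table matching its name
mysqlimport --local -u dbuser -p mydb /path/to/mytable.txt
If the server still refuses with #1148, local_infile may also need to be enabled on the server side.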
I've been trying to import a table from Vectorwise into Hive using Sqoop. I downloaded the Vectorwise JDBC driver and everything, but it just isn't working.
This is the command I'm using:
sudo -u hdfs sqoop import --driver com.ingres.jdbc.IngresDriver --connect jdbc:ingres://172.16.63.157:VW7/amit --username ingres --password ingres --table vector_table --hive-table vector_table --hive-import --create-hive-table -m 1
And I'm getting the error:
12/06/07 22:08:27 ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.RuntimeException: Could not load db driver class: com.ingres.jdbc.IngresDriver
java.lang.RuntimeException: Could not load db driver class: com.ingres.jdbc.IngresDriver
at com.cloudera.sqoop.manager.SqlManager.makeConnection(SqlManager.java:635)
at com.cloudera.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:53)
at com.cloudera.sqoop.manager.SqlManager.execute(SqlManager.java:524)
at com.cloudera.sqoop.manager.SqlManager.execute(SqlManager.java:547)
at com.cloudera.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:191)
at com.cloudera.sqoop.manager.SqlManager.getColumnTypes(SqlManager.java:175)
at com.cloudera.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:263)
at com.cloudera.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1226)
at com.cloudera.sqoop.orm.ClassWriter.generate(ClassWriter.java:1051)
at com.cloudera.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:84)
at com.cloudera.sqoop.tool.ImportTool.importTable(ImportTool.java:370)
at com.cloudera.sqoop.tool.ImportTool.run(ImportTool.java:456)
at com.cloudera.sqoop.Sqoop.run(Sqoop.java:146)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:182)
at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:221)
at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:230)
at com.cloudera.sqoop.Sqoop.main(Sqoop.java:239)
I'd really appreciate it if someone can help me out here.
Thanks in advance! :)
I can't comment yet, so as an answer:
This is a quote from the documentation:
You can use Sqoop with any other JDBC-compliant database. First, download the appropriate JDBC driver for the type of database you want to import, and install the .jar file in the $SQOOP_HOME/lib directory on your client machine. (This will be /usr/lib/sqoop/lib if you installed from an RPM or Debian package.) Each driver .jar file also has a specific driver class which defines the entry-point to the driver. For example, MySQL's Connector/J library has a driver class of com.mysql.jdbc.Driver. Refer to your database vendor-specific documentation to determine the main driver class. This class must be provided as an argument to Sqoop with --driver.
Do you have the proper jar file in a directory that's accessible by Sqoop?
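For example, something along these lines (the path assumes an RPM/Debian install, and iijdbc.jar is the Ingres/Vectorwise driver jar):
sudo cp iijdbc.jar /usr/lib/sqoop/lib/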
For the future, it is also useful to give a bit more information about your environment, such as which version of Sqoop you are using.
Okay, I got it working. It was a simple permission issue. I changed the owner of iijdbc.jar to hdfs.
sudo chown hdfs /usr/lib/sqoop/lib/iijdbc.jar
Now it's working! :)
I can now import my Vectorwise tables to Hive using Sqoop. Great!