Rename a database in a Db2 server (Db2 LUW)

I have a Db2 server and I need to rename the database on it. I created the configuration file:
DB_NAME=E2P,E2POLD
DB_PATH=/db2/E2P
INSTANCE=db2e2p
NODENUM=0
and then started the database instance and executed the db2relocatedb command:
db2relocatedb -f relocate.cfg
but this is giving me the following error:
DBT1006N The file/device "/db2/E2P/db2e2p/NODE0000/SQL00001/" could not be opened.
I have checked and there are no blank spaces in the configuration file. Please suggest what is going wrong here.
I also checked the diag log and saw the following error there:
2015-04-11-04.22.18.593830-240 I481891E628 LEVEL: Error
PID : 26289 TID : 46931183135040 PROC : db2sysc 0
INSTANCE: db2e2p NODE : 000 DB : E2P
APPHDL : 0-36 APPID: *LOCAL.DB2.150411082246
AUTHID : DB2Q01
EDUID : 88 EDUNAME: db2evmgi (DB2DETAILDEADLOCK) 0
FUNCTION: DB2 UDB, database monitor, sqmEvmonWriter::initTarget, probe:40
MESSAGE : ZRC=0x800D002C=-2146631636=SQLM_RC_EVPATH "path in use"
DATA #1 : String, 17 bytes
DB2DETAILDEADLOCK
DATA #2 : String with size, 60 bytes
/db2/E2P/db2e2p/NODE0000/SQL00001/db2event/db2detaildeadlock
2015-04-11-04.22.18.594489-240 I482520E528 LEVEL: Error
PID : 26289 TID : 46931183135040 PROC : db2sysc 0
INSTANCE: db2e2p NODE : 000 DB : E2P
APPHDL : 0-36 APPID: *LOCAL.DB2.150411082246
AUTHID : DB2Q01
EDUID : 88 EDUNAME: db2evmgi (DB2DETAILDEADLOCK) 0
FUNCTION: DB2 UDB, database monitor, sqmEvmonWriter::activate, probe:40
MESSAGE : ZRC=0x800D002C=-2146631636=SQLM_RC_EVPATH "path in use"
DATA #1 : String, 17 bytes
DB2DETAILDEADLOCK
Would a system reboot be helpful here?

It seems that you need to run an mv command similar to the one below:
mv /home/db2inst1/db2inst1/NODE0000/E2P /home/db2inst1/db2inst1/NODE0000/E2POLD
before executing the db2relocatedb command.
Here is a good article for your situation:
[Db2] Simple test case shell script for db2relocatedb command
https://www.ibm.com/support/pages/node/1099185
It covers the basic usage of the db2relocatedb command.
Hope this helps.
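To make the sequence concrete, here is a sketch that rehearses the rename steps in a scratch directory. The paths and names come from the question, but the exact on-disk layout varies by Db2 version, so treat them as illustrative; the actual db2 commands are shown as comments because they need a live instance.

```shell
# Rehearse the rename in a scratch directory; on the real server the
# layout would live under /db2/E2P and you would skip the mktemp.
BASE=$(mktemp -d)
mkdir -p "$BASE/db2/E2P/db2e2p/NODE0000/E2P"

# 1. The relocate config -- old name, new name; no blank lines or stray spaces.
cat > "$BASE/relocate.cfg" <<'EOF'
DB_NAME=E2P,E2POLD
DB_PATH=/db2/E2P
INSTANCE=db2e2p
NODENUM=0
EOF

# 2. Rename the database directory to the NEW name first; db2relocatedb
#    rewrites internal metadata but does not move files, which is why it
#    fails with DBT1006N while the directory still carries the old name.
mv "$BASE/db2/E2P/db2e2p/NODE0000/E2P" "$BASE/db2/E2P/db2e2p/NODE0000/E2POLD"

# 3. On the real server (not executed here) you would then run:
#    db2 deactivate db E2P      # releases the event-monitor "path in use" from the diag log
#    db2stop && db2start        # if the path stays in use -- cheaper than a reboot
#    db2relocatedb -f relocate.cfg
ls "$BASE/db2/E2P/db2e2p/NODE0000"
```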

Unsuccessful SQL dump import in pgAdmin 4 on macOS

I have to use SQL for the very first time. I downloaded data from French Public Service Open Data (https://cerema.app.box.com/v/dvfplus-opendata/folder/160785684885). I also downloaded Postgres App and PgAdmin 4. The file of interest is a dump, with a .sql extension. I created a database called "DVF" directly in PgAdmin.
Following all the tutorials online and the help here I tried to import the file using "Backup" and "Restore" commands, as well as lines directly from the terminal. I have a "success" message when I use Backup or Restore in PgAdmin, but always an error message from the Terminal (by clicking on DVF in Postgres App).
[DVF=# ./psql -U postgres DVF < /Users/Menica/dpts.sql;
>ERROR : syntax error at or near "."
[DVF=# psql -h localhost -p 5432 -U postgres -d DVF -f dpts.sql;
>ERROR: syntax error at or near "psql"
Furthermore, even with "success" messages from Backup/Restore, my tables aren't in PgAdmin after the Refresh.
I don't know if it can help, but here are some directories:
The file: /Users/Menica/dpts.sql
Postgres App: /Library/PostgreSQL/9.3
=> There are a lot of folders there, like "bin"
Terminal by default: /bin/zsh
UPDATE
To dump the database I used the following in psql :
DVF-# pg_dump DVF > /Users/Menica/dpts.sql;
ERROR: syntax error at or near "pgdump"
LINE 1: pg_dump DVF > /Users/Menica/dpts.sql
^
The first 25 lines of the dpts.sql file are the following :
--
-- PostgreSQL database dump
--
-- Dumped from database version 9.5.25
-- Dumped by pg_dump version 9.5.25
-- Started on 2022-04-14 15:50:09 CEST
SET statement_timeout = 0;
SET lock_timeout = 0;
SET client_encoding = 'UTF8';
SET standard_conforming_strings = on;
SELECT pg_catalog.set_config('search_path', '', false);
SET check_function_bodies = false;
SET xmloption = content;
SET client_min_messages = warning;
SET row_security = off;
--
-- TOC entry 34 (class 2615 OID 631938251)
-- Name: dvf_d2a; Type: SCHEMA; Schema: -; Owner: -
--
CREATE SCHEMA dvf_d2a;
--
-- TOC entry 35 (class 2615 OID 632173840)
-- Name: dvf_d2b; Type: SCHEMA; Schema: -; Owner: -
--
CREATE SCHEMA dvf_d2b;
SET default_tablespace = '';
SET default_with_oids = false;
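A likely reading of the two errors above (a sketch, using the paths and connection details from the question): both failing commands were typed at the psql prompt, but psql and pg_dump are operating-system programs, not SQL.

```shell
# psql and pg_dump are shell programs: run them from the Terminal (zsh)
# prompt, not from inside psql's "DVF=#" prompt -- typed there, psql tries
# to parse them as SQL, hence "syntax error at or near".
# From a plain Terminal window (needs a running server, so not executed here):
#
#   pg_dump -U postgres DVF > /Users/Menica/dpts.sql
#   psql -h localhost -p 5432 -U postgres -d DVF -f /Users/Menica/dpts.sql
#
# Inside psql, only SQL statements and backslash meta-commands work, e.g.:
#
#   DVF=# \i /Users/Menica/dpts.sql
```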

FAILED: ParseException line 1:21 cannot recognize input near '<EOF>' '<EOF>' '<EOF>' in table name

Command:
hive -e "use xxx;DROP TABLE IF EXISTS `xxx.flashsaleeventproducts_hist`;CREATE EXTERNAL TABLE `xxx.flashsaleeventproducts_hist`(`event_id` string,`group_code` string,`id` string,`is_deleted` int,`price` int,`price_guide` int,`product_code` int,`product_id` string,`quantity_each_person_limit` int,`quantity_limit_plan` int,`sort_num` int,`update_time` bigint,`meta_offset` bigint,`meta_status` int,`meta_start_time` bigint)PARTITIONED BY(`cur_date` string,`cur_hour` string) ROW FORMAT SERDE 'org.apache.hadoop.hive.ql.io.parquet.serde.ParquetHiveSerDe'STORED AS INPUTFORMAT 'org.apache.hadoop.hive.ql.io.parquet.MapredParquetInputFormat' OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat'LOCATION '/data/ods/xxx/flashsaleeventproducts_hist';msck repair table flashsaleevents_hist;"
Error:
xxx.flashsaleeventproducts_hist: command not found
xxx.flashsaleeventproducts_hist: command not found
event_id: command not found
group_code: command not found
is_deleted: command not found
Command 'price' not found, did you mean:
command 'rice' from deb golang-rice
Try: sudo apt install <deb name>
price_guide: command not found
product_code: command not found
product_id: command not found
quantity_each_person_limit: command not found
quantity_limit_plan: command not found
sort_num: command not found
update_time: command not found
meta_offset: command not found
meta_status: command not found
meta_start_time: command not found
cur_date: command not found
cur_hour: command not found
/opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/bin/../lib/hive/conf/hive-env.sh: line 24: /opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/lib/hbase/hbase-protocol.jar: Permission denied
/opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/bin/../lib/hive/conf/hive-env.sh: line 25: /opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/lib/hbase/hbase-hadoop-compat.jar: Permission denied
/opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/bin/../lib/hive/conf/hive-env.sh: line 26: /opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/lib/hbase/hbase-hadoop2-compat.jar: Permission denied
/opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/bin/../lib/hive/conf/hive-env.sh: line 27: /opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/lib/hbase/lib/htrace-core.jar: Permission denied
/opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/bin/../lib/hive/conf/hive-env.sh: line 28: /opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/lib/hbase/hbase-client.jar: Permission denied
/opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/bin/../lib/hive/conf/hive-env.sh: line 29: /opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/lib/hbase/hbase-server.jar: Permission denied
/opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/bin/../lib/hadoop/libexec/hadoop-functions.sh: line 2331: HADOOP_ORG.APACHE.HADOOP.HBASE.UTIL.GETJAVAPROPERTY_USER: bad substitution
/opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/bin/../lib/hadoop/libexec/hadoop-functions.sh: line 2426: HADOOP_ORG.APACHE.HADOOP.HBASE.UTIL.GETJAVAPROPERTY_OPTS: bad substitution
WARNING: Use "yarn jar" to launch YARN applications.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/jars/log4j-slf4j-impl-2.8.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/jars/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Logging initialized using configuration in jar:file:/opt/cloudera/parcels/CDH-6.2.0-1.cdh6.2.0.p0.967373/jars/hive-common-2.1.1-cdh6.2.0.jar!/hive-log4j2.properties Async: false
OK
Time taken: 2.277 seconds
NoViableAltException(-1#[199:1: tableName : (db= identifier DOT tab= identifier -> ^( TOK_TABNAME $db $tab) |tab= identifier -> ^( TOK_TABNAME $tab) );])
at org.antlr.runtime.DFA.noViableAlt(DFA.java:158)
at org.antlr.runtime.DFA.predict(DFA.java:144)
at org.apache.hadoop.hive.ql.parse.HiveParser_FromClauseParser.tableName(HiveParser_FromClauseParser.java:3821)
at org.apache.hadoop.hive.ql.parse.HiveParser.tableName(HiveParser.java:40055)
at org.apache.hadoop.hive.ql.parse.HiveParser.dropTableStatement(HiveParser.java:6887)
at org.apache.hadoop.hive.ql.parse.HiveParser.ddlStatement(HiveParser.java:3126)
at org.apache.hadoop.hive.ql.parse.HiveParser.execStatement(HiveParser.java:2266)
at org.apache.hadoop.hive.ql.parse.HiveParser.statement(HiveParser.java:1318)
at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:218)
at org.apache.hadoop.hive.ql.parse.ParseUtils.parse(ParseUtils.java:75)
at org.apache.hadoop.hive.ql.parse.ParseUtils.parse(ParseUtils.java:68)
at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:564)
at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1425)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1493)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1339)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1328)
at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:239)
at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:187)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:409)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:342)
at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:800)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:772)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:699)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:313)
at org.apache.hadoop.util.RunJar.main(RunJar.java:227)
FAILED: ParseException line 1:21 cannot recognize input near '<EOF>' '<EOF>' '<EOF>' in table name
Note: the SQL itself is correct; running it directly in the Hive CLI does not throw an error.
I think the problem may be some special character in the SQL, but I can't figure it out.
Here is a better view of that SQL:
use xxx;
DROP TABLE IF EXISTS `xxx.flashsaleeventproducts`;
CREATE EXTERNAL TABLE `xxx.flashsaleeventproducts`(
`id` string,
`event_id` string,
`product_id` string,
`sort_num` int,
`price_guide` int,
`price` int,
`quantity_limit_plan` int,
`quantity_each_person_limit` int,
`is_deleted` int,
`update_time` bigint,
`group_code` string,
`product_code` int
)PARTITIONED BY(`cur_date` string,`cur_hour` string)
ROW FORMAT SERDE 'org.apache.hadoop.hive.ql.io.parquet.serde.ParquetHiveSerDe'
STORED AS INPUTFORMAT 'org.apache.hadoop.hive.ql.io.parquet.MapredParquetInputFormat'
OUTPUTFORMAT 'org.apache.hadoop.hive.ql.io.parquet.MapredParquetOutputFormat'
LOCATION '/data/ods/xxx/flashsaleeventproducts';
msck repair table flashsaleeventproducts;
Please find below a script to connect to Hive:
import subprocess
import sys

# Run the Hive query and stream its stdout into a CSV file.
query = """ hive -e "set hive.cli.print.header=true;use db;select * from somehivetable;" """
outresultfile = open("query_result.csv", 'w')
p = subprocess.Popen(query, shell=True, stdout=outresultfile, stderr=subprocess.PIPE)
stdout, stderr = p.communicate()
if p.returncode != 0:
    print(stderr)
    sys.exit(1)
I have found that the problem came from the backtick symbol: ` .
Removing that symbol solved the problem.
The only question left is why ` works well in the Hive CLI but fails in hive -e "xxx".
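The remaining question has an answer in bash, not in Hive: inside double quotes, bash still performs command substitution on backticks, so every backticked identifier in the hive -e "..." string was executed as a shell command before Hive ever saw the query. That is exactly the list of "command not found" messages above. A minimal demonstration (the table name is the one from the question):

```shell
# Inside double quotes bash treats `...` as command substitution;
# escaping the backticks (or switching to single quotes) keeps them literal.
sql_dq="DROP TABLE IF EXISTS \`xxx.flashsaleeventproducts\`;"  # escaped backticks
sql_sq='DROP TABLE IF EXISTS `xxx.flashsaleeventproducts`;'    # single-quoted
echo "$sql_dq"
# On the real cluster you would then run, e.g.:  hive -e "$sql_dq"
```

The Hive CLI never sees a problem because by the time you paste the query interactively, no shell is interpreting the backticks.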

How to edit a JSON file that resides on a remote server?

I have a JSON file on my remote server.
Location at the remote host: ".docker/test.json"
{
"key1" : "Value1",
"Key2" : "Value2"
}
I want to add a new element to test.json from my local machine. I am trying the following command, but it is not working.
ssh <test-server> "jq '.key3 = "Value3"' .docker/test.json > .docker/test2.json && mv .docker/test2.json .docker/test.json"
It's giving me the following error:
bash: .docker/test2.json: No such file or directory
You have a shell quoting issue: you didn't escape the inner double quotes.
You can try the following:
ssh <test-server> 'jq ".key3 = \"Value3\"" .docker/test.json > .docker/test2.json && mv .docker/test2.json .docker/test.json'
which replaces the outer double quotes with single ones, because you don't need variable expansion in this statement.
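As a side note (a sketch, not tested against your server): jq's --arg option sidesteps the nested-quoting problem entirely, because the value is passed in as a jq variable instead of being spliced into the filter string.

```shell
# Build a local copy of the file from the question, then add key3 with
# --arg so no inner double quotes are needed in the filter.
cat > test.json <<'EOF'
{
"key1" : "Value1",
"Key2" : "Value2"
}
EOF
jq --arg v Value3 '.key3 = $v' test.json > test2.json && mv test2.json test.json
cat test.json
# The same idea over ssh (note \$v, so the remote shell -- not the local
# one -- leaves $v for jq to resolve):
#   ssh <test-server> "jq --arg v Value3 '.key3 = \$v' .docker/test.json \
#       > .docker/test2.json && mv .docker/test2.json .docker/test.json"
```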

Informatica session failing with error GPWRT_34014

2017-02-06 12:26:23 : ERROR : (16120 | WRITER_1_*_1) : (IS | INF_PW1_ASCII) : NODE_PWDEV_PWNLD1022v04 : GPWRT_34014 : [ERROR] The Integration Service failed to create a pipe because of insufficient memory allocation or because it does not have required permissions to the directory specified for the pipe location in the session.
2017-02-06 12:26:23 : ERROR : (16120 | WRITER_1_*_1) : (IS | INF_PW1_ASCII) : NODE_PWDEV_PWNLD1022v04 : SDKS_38502 : Plug-in #431050's target [ods_ap_invoices_all: Partition 1] failed in method [init].
2017-02-06 12:26:23 : ERROR : (16120 | WRITER_1_*_1) : (IS | INF_PW1_ASCII) : NODE_PWDEV_PWNLD1022v04 : WRT_8068 : Writer initialization failed. Writer terminating.

Getting a value from an Oracle database table into a UNIX variable

I am trying to create a simple script that connects to an Oracle database, executes a select query, and stores the return value in a Unix variable. Below is the script I created by following this post:
#!/bin/sh
VALUE=`sqlplus -silent $DB_USERNAME/"$PASSWORD"@"(DESCRIPTION=(ADDRESS_LIST=(ADDRESS=(PROTOCOL=TCP)(HOST=$HOST_NAME)(PORT=$DB_PORT)))(CONNECT_DATA=(SID=$DB_SID)))" <<END
set pagesize 0 feedback off verify off heading off echo off
SELECT ID FROM TEST_USERS WHERE USER_NAME=$SAMPLE_USER;
exit;
END`
if [ -z "$VALUE" ]; then
echo "No rows returned from database"
exit 0
else
echo $VALUE
fi
Now when I run this script I am facing error as :
ERROR: ORA-12533: TNS:illegal ADDRESS parameters
SP2-0306: Invalid option.
Usage: CONN[ECT] [{logon|/|proxy} [AS {SYSDBA|SYSOPER|SYSASM}] [edition=value]]
where <logon> ::= <username>[/<password>][@<connect_identifier>]
      <proxy> ::= <proxyuser>[<username>][/<password>][@<connect_identifier>]
SP2-0306: Invalid option.
Usage: CONN[ECT] [{logon|/|proxy} [AS {SYSDBA|SYSOPER|SYSASM}] [edition=value]]
where <logon> ::= <username>[/<password>][@<connect_identifier>]
      <proxy> ::= <proxyuser>[<username>][/<password>][@<connect_identifier>]
SP2-0157: unable to CONNECT to ORACLE after 3 attempts, exiting SQL*Plus
Please let me know where I am making a mistake.
Try with this VALUE variable:
VALUE=`sqlplus $DB_USERNAME/$PASSWORD@//$HOST_NAME:$DB_PORT/$DB_SID`
And, of course, you need to define all the variables you use there before this line.
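To spell the fix out (a sketch reusing the question's variable names, with placeholder values): the connect string must be joined to the credentials with @, not #. With #, sqlplus parses the password and connect data as separate, invalid arguments, which is what the SP2-0306/ORA-12533 output indicates. The character literal in the query should also be quoted.

```shell
# Placeholder values; substitute your real connection details.
DB_USERNAME=scott; PASSWORD=tiger; HOST_NAME=dbhost; DB_PORT=1521; DB_SID=ORCL
SAMPLE_USER=alice

# EZConnect form: user/password@//host:port/service -- note the '@'.
CONNECT="$DB_USERNAME/$PASSWORD@//$HOST_NAME:$DB_PORT/$DB_SID"
echo "$CONNECT"

# On a machine with the Oracle client you would then run (not executed here):
# VALUE=$(sqlplus -silent "$CONNECT" <<END
# set pagesize 0 feedback off verify off heading off echo off
# SELECT ID FROM TEST_USERS WHERE USER_NAME = '$SAMPLE_USER';
# exit;
# END
# )
```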