Presto ODBC failing to read timestamp fields in Hive

I am using the Presto ODBC driver to fetch data from Hive into one of our BI tools, but the query fails with the error below. Could you please help me understand what the issue might be?
ODBC version: Simba Presto ODBC Driver 1.02.09.1009
SQL used: select created_ts from stg_tables.vend
Jul 07 08:07:03.919 ERROR 500 Statement::SQLPrepareW: [Simba][Presto] (1070) Unknown Presto data type: timestamp(3)
However, the following SQL works fine:
select to_iso8601(created_ts AT TIME ZONE 'UTC') from stg_tables.ven

Recent versions of Presto added support for variable precision timestamp types. Unfortunately, Simba's ODBC driver makes certain assumptions about how the type names are presented to the client and fails with the new types.
The next release of Presto (coming out this week) will have a configuration option to restore the old behavior while client implementations that made similar assumptions get a chance to upgrade.
Update: this should now be fixed in version 338. You can set the deprecated.omit-datetime-type-precision config option to true to restore the old behavior.
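For anyone hitting this before they can upgrade their driver, that is a one-line change in the coordinator's config file (assuming the usual etc/config.properties location):

deprecated.omit-datetime-type-precision=true

With that set, the server should present the type to clients as plain timestamp again, which older drivers can parse.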

Related

Check HANA edition with SQL?

I have a JDBC connection to a SAP HANA database and I want to query whether it's a SAP HANA Cloud db or not. I know I can find the version with:
SELECT VERSION FROM SYS.M_DATABASE;
and this gives me 4.00.000.00.1608802791 for the cloud and 2.xx for my on-premise Dockerised version, but to avoid hard-coding version numbers everywhere, is there an equivalent query to, say, SQL Server's SELECT SERVERPROPERTY('edition')?
You could use SELECT VALUE FROM M_HOST_INFORMATION WHERE KEY='build_branch'
On premise: fa/hana2sp05
In the cloud: fa/CE2020.36
You might also like M_SYSTEM_OVERVIEW; it has interesting information such as the server start time.
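Building on that, a minimal sketch of an edition check; the 'fa/CE%' pattern for cloud builds is an assumption drawn from the two values shown above, so verify it against your own systems:

SELECT CASE WHEN VALUE LIKE 'fa/CE%' THEN 'SAP HANA Cloud' ELSE 'on-premise' END AS EDITION
FROM M_HOST_INFORMATION
WHERE KEY = 'build_branch';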

Configuration file in Netezza

Is there a configuration file in Netezza like tnsnames.ora in Oracle which contains database names and their connect string names?
If so, what is the default location of the file?
I'm using Informatica PowerCenter to load to a target Netezza table. I want to know the database details of the connect string Informatica uses to connect to the Netezza DB. In Oracle, I could have got that information from the TNS file.
Netezza doesn't have an equivalent to Oracle TNSNames.
ODBC Connection String Example:
Driver={NetezzaSQL};servername=myServerAddress;port=myPortNumber;
database=myDataBase;username=myUsername;password=myPassword;
ODBC connection strings: ConnectionStrings.com
ODBC configuration: IBM
JDBC configuration: IBM
You can check the DSN entry (the connect string name in the Informatica connection) in the odbc.ini file under the LD_LIBRARY_PATH that is defined at the time of the Netezza ODBC driver installation.
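For illustration, a typical Netezza DSN entry in odbc.ini looks roughly like this; the driver path and port are assumptions based on a default Linux installation, and the placeholders mirror the connection string example above:

[myDSNname]
Driver = /usr/local/nz/lib64/libnzodbc.so
Servername = myServerAddress
Port = 5480
Database = myDataBase
Username = myUsername
Password = myPassword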
In PowerCenter, a developer can check the connection details only if a dedicated connector is used. For ODBC, the only information available in Workflow Manager is the name of the ODBC DSN; the details can be checked in the ODBC definition on the server.
A small addition to @Marciejg's answer:
We have only a few ODBC connections compared to PowerCenter connections. Each ODBC entry points to the 'system' database, and in each PowerCenter connection that targets a specific database on that server we run a 'set current_catalog PROD_EDW' in the pre-SQL. That way things are mostly visible and manageable in PowerCenter, and the ODBC entry only points to the server.
And slightly off topic: the pre-SQL has additional 'set CLIENT_*_NAME' statements that record the PowerCenter workflow/session etc. dynamically, based on PowerCenter built-in variables (they are named $PMWorkflowName and similar).
That way we can trace back to the PowerCenter code immediately from a plan file, the pg.log or, most interestingly, the HISTDB.
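A minimal sketch of such a pre-SQL block; the current_catalog statement is verbatim from above, while the CLIENT_*_NAME variable is a hypothetical placeholder (check the IBM SET documentation linked below for the exact names):

set current_catalog PROD_EDW;
-- hypothetical variable name; the real CLIENT_*_NAME variables are listed in the SET docs
set CLIENT_WORKFLOW_NAME '$PMWorkflowName';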
Follow these links if you want to play with it:
- https://www.ibm.com/support/knowledgecenter/SSULQD_7.2.1/com.ibm.nz.dbu.doc/r_dbuser_set.html
- http://dwhlaureate.blogspot.dk/2012/09/built-in-variables-in-informatica.html

SSMA MySQL to MSSQL

I am trying to convert a MySQL database to MSSQL using SSMA.
First I converted the schema from MySQL to MSSQL, then I synchronized it.
Finally I migrated the data and ran into errors like this:
Column 'column1' (for example) does not allow DBNull.Value
Software used:
SQL Server 2016
MySQL Workbench 6.1
SSMA
In this case, I'd suggest you either change the source data to '0000-00-01' (which works well with the 'Zero-date in NOT NULL Columns' setting) or set the destination column to allow NULL so you can process the null data after the migration is complete.
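The second option could look like this hedged T-SQL sketch (table and column names are placeholders):

-- allow NULLs on the destination column so the migration can complete;
-- the NULL rows can then be cleaned up afterwards
ALTER TABLE dbo.myTable ALTER COLUMN column1 datetime NULL;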

How to insert LONG BINARY from SQL Server to Oracle

I need to get a copy of a SQL Server 2008 table into an Oracle RDBMS. I have a database link for the SQL Server, and the database has a table which contains a LONG BINARY column.
When I issue
create table test_ora as select * from mssqltable@dblink
I get the error
Can't convert LONG
I tried to use to_lob, to_char, hextoraw and a ream of other Oracle conversion functions but still haven't resolved the issue. Do you have any ideas?
P.S. I'm away from work right now, so I can't tell you the exact ORA- error number.
There is a way to do that with an undocumented Oracle package:
http://tonguc.wordpress.com/2008/08/28/how-to-transfer-long-datatype-over-dblink/
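If the undocumented package isn't an option, a common interim workaround is to copy everything except the LONG BINARY column over the link first (column names here are placeholders) and move the LONG data separately:

-- copy all columns except the LONG BINARY one
create table test_ora as
select id_col, other_col
from mssqltable@dblink;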
I would recommend a tool called Pentaho Data Integration. It is a free, small and superb ETL tool.
Download page: community.pentaho.com
It will recreate all tables and types for you. How to do it:
pldwh.blogspot.co.uk/2013/03/pentaho-data-integration-create-tables_1.html

TIME datatype in SQL Server 2008 will not accept time from Access form

I am using an Access form on the front end, bound to a SQL Server 2008 table. I have an Arrival Time column of datatype Time.
But I get an ODBC error every time I try to save a record. Error states:
Invalid character value for cast specification.
Time displays as 10:00:00 AM. I have tried with and without a time format in the properties of the field, and with/without an input mask of various types. Does anyone know how to avoid this error?
When I linked a 'Time' field from SQL Server 2012 Express to an Access front end (2010 .accdb) using the old 'SQL Server' ODBC driver, it converted it to a Text field and would not allow any updates. You might consider using a datetime field, or trying a newer ODBC driver. I believe the 'Time' type was first introduced in SQL Server 2008, so I'm guessing older ODBC drivers don't know how to handle it.
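If switching the column to datetime is acceptable, that is a one-line change on the SQL Server side (table and column names are placeholders):

-- datetime maps to an updatable Date/Time field in Access over older ODBC drivers
ALTER TABLE dbo.myTable ALTER COLUMN ArrivalTime datetime NULL;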