AWS Athena Create Table with JSON - sql

I'm trying to create a table from data like the values below:
{"name": "Tommy", "Age": 16, "date":{"string": "2020-10-10"}}
{"name": "Will", "Age": 20, "date":{"string": "2020-10-10"}}
but when I try to access the data with a SELECT, it comes back as:
{"string":"2020-10-10"}
and I just need the date value.
Is there any option to solve this at the create table step? For example, to create a table that looks at the date["string"] value.
I know this is very specific, but if someone knows, I'll be very happy! Thanks

One common way to solve this kind of situation is to use a view. Assuming you have a table called nested_data with name, age, and date columns, where the date column is defined as struct<string:string>, you can create a view like this:
CREATE VIEW flat_data AS
SELECT name, age, date.string AS date
FROM nested_data
When you run a query like SELECT date FROM flat_data, you will get only the date value.
Views are often used like this when the raw data needs a bit of pre-processing and you want to avoid having to include all that pre-processing in every query.
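For completeness, the nested column can also be declared at the CREATE TABLE step by typing it as a struct. A rough sketch of the DDL, assuming the JSON files sit under s3://your-bucket/path/ (the location is a placeholder, and the SerDe can be swapped for the one you already use):
CREATE EXTERNAL TABLE nested_data (
  name string,
  age int,
  `date` struct<string:string>
)
ROW FORMAT SERDE 'org.openx.data.jsonserde.JsonSerDe'
LOCATION 's3://your-bucket/path/'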

Related

Extract key value pair from json column in redshift

I have a table mytable with a column that stores JSON strings, which contain multiple key-value pairs. Now, I want to extract only the value corresponding to one particular key.
The column that stores these strings is of the varchar data type, and it is populated like this:
insert into mytable(empid, json_column) values (1, '{"FIRST_NAME":"TOM", "LAST_NAME":"JENKINS", "DATE_OF_JOINING":"2021-06-10", "SALARY":"1000"}');
As you can see, json_column is created by inserting only a string. Now, I want to do something like:
select json_column.FIRST_NAME from mytable
I just want to extract the value corresponding to key FIRST_NAME.
My actual table is far more complex than this example, and I cannot convert these JSON keys into separate columns. But this example clearly illustrates my issue.
This needs to be done over Redshift, please help me out with any valuable suggestions.
Redshift's json_extract_path_text function can solve this problem easily, as follows:
select json_extract_path_text(json_column, 'FIRST_NAME') from mytable;
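json_extract_path_text also accepts multiple path elements, which lets you reach nested keys in the same call. A small sketch against the example table (the ADDRESS/CITY path is hypothetical, just to show the form):
select empid,
       json_extract_path_text(json_column, 'FIRST_NAME') as first_name,
       json_extract_path_text(json_column, 'LAST_NAME') as last_name,
       json_extract_path_text(json_column, 'ADDRESS', 'CITY') as city
from mytable;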

How can I query a NCLOB column as a table in SAP HANA?

I'd like to do a SELECT query with a WHERE condition on a column of type NCLOB; let's call it the information column. It has the following format:
{
"firstName" : "name1",
"lastName" : "lastName1"
}
I want to do something like this:
Select * from myTable where information.firstName = "targetName"
But I am not sure how to do it.
Any hints please?
Looks like you want a structured database instead of a JSON clob... :)
Why don't you have a look at the JSON functions in HANA?
With JSON_TABLE you can construct a table structure "over" the JSON data in the NCLOB column and then query that table structure.
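A rough sketch of how that could look for the information column above (the alias and column lengths are assumptions; check the JSON_TABLE documentation for your HANA version):
SELECT *
FROM JSON_TABLE(myTable.information, '$'
  COLUMNS (
    firstName NVARCHAR(100) PATH '$.firstName',
    lastName NVARCHAR(100) PATH '$.lastName'
  )
) AS jt
WHERE jt.firstName = 'targetName'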

SQL command like R table function

I am a beginner at SQL.
Suppose I have a table tab with prediction and result_code columns.
In R, I can use table(tab$prediction, tab$result_code) to get a table that looks like a confusion matrix, but I don't know how to create this table with SQL. Is it possible to get a table containing the counts between the prediction column and the result_code column?
From your data, the "prediction" column is used as the actual values and the "result_code" column as the predicted values.
As the comments advise, you can perform a COUNT over the two columns to get the occurrences. This result is used as the base for the confusion matrix and is aliased as Temp.
After building Temp, you perform a PIVOT to restructure the result into the format of a confusion matrix.
SELECT prediction, [0], [1], [2]
FROM
(
    SELECT [prediction], [result_code], COUNT(*) AS Occurrences
    FROM [Your_Database].[dbo].[Your_Table]
    GROUP BY [prediction], [result_code]
) Temp
PIVOT
(
    MAX(Occurrences)
    FOR result_code IN ([0], [1], [2])
) PIV
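If you would rather avoid PIVOT, the same counts can be produced with conditional aggregation, which is also more portable across databases; a sketch assuming the result codes are 0, 1, and 2 and the same table as above:
SELECT prediction,
       SUM(CASE WHEN result_code = 0 THEN 1 ELSE 0 END) AS result_0,
       SUM(CASE WHEN result_code = 1 THEN 1 ELSE 0 END) AS result_1,
       SUM(CASE WHEN result_code = 2 THEN 1 ELSE 0 END) AS result_2
FROM [Your_Database].[dbo].[Your_Table]
GROUP BY prediction
ORDER BY prediction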

Can you concatenate two strings to create dynamic column names in PostgreSQL?

I have a form where people can type in a start and end date, as well as a column name prefix.
In the backend, I want to do something along the lines of
SELECT *, CAST('{{startDate}}' AS TIMESTAMP) AS ({{prefix}} + '_startDate')
Is this possible? Basically, I want to dynamically create the name of the new column. The table is immediately returned to the user, so I don't want to mutate the underlying table itself. Thanks!
You can execute a dynamic query that you have prepared by using the EXECUTE keyword; otherwise, it is not possible to have a dynamic SQL structure.
Since you are preparing your SQL outside the database, you can use something like:
SELECT *, CAST('{{startDate}}' AS TIMESTAMP) AS {{prefix}}_startDate
This assumes that {{prefix}} is replaced with some string by your template before the query is sent to the database.
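If you did want to build the statement inside the database instead, a PL/pgSQL sketch could look like the one below. The function and table names (with_prefixed_start_date, my_table) are hypothetical; format() is used so the generated column name is quoted safely (%I) and the date is passed as a literal (%L):
CREATE OR REPLACE FUNCTION with_prefixed_start_date(prefix text, start_date text)
RETURNS refcursor
LANGUAGE plpgsql AS $$
DECLARE
  result refcursor := 'prefixed_result';
BEGIN
  -- build the query with a dynamically named column, then open a cursor over it
  OPEN result FOR EXECUTE format(
    'SELECT *, CAST(%L AS TIMESTAMP) AS %I FROM my_table',
    start_date,
    prefix || '_startDate'
  );
  RETURN result;
END $$;
The caller would then fetch the rows from the returned cursor (for example FETCH ALL IN "prefixed_result") within the same transaction.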

PLSQL - create type dynamically using function

I would like to implement something like the following pseudo-code. Let's say I have a table named T_TMP_TABLE with the columns 'hour', 'day', and 'year'. The table name is used as an input parameter for the create_types_fn function. Let's assume that I can get the list of column names and store it in column_name_array. Now, I need to create a custom type using those column names. The reason is that the function will return a table as its output, and the columns of the returned table should be the same ('hour', 'day', and 'year').
Briefly speaking, I have a table and I need the output in table format with the same column names.
Am I able to do this? Any suggestion or recommendation would be much appreciated!
CREATE OR REPLACE FUNCTION create_types_fn (table_name in varchar2)
....
begin
array column_name_array = get_column_name_in_array_by_table_name(table_name)
CREATE OR REPLACE TYPE my_type AS OBJECT (
column_name_array(0) NUMBER,
column_name_array(1) NUMBER,
column_name_array(2) VARCHAR2(30)
);
CREATE OR REPLACE type my_table AS TABLE OF my_type ;
select * bulk collect into my_table ;
end
EDIT
Here is what I am trying to do:
I am trying to compare two tables and return rows where there are any differences. So I think the output should be in table format. Since every table has different column names, I think it would be nice if I could make a generic function.
If you are trying to compare the data in two different tables, you would almost certainly want to use the dbms_comparison package rather than writing your own. That populates a generic structure rather than creating new types for each table.
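A minimal sketch of that approach, assuming two local tables T1 and T2 in the current schema that share an indexed primary key (DBMS_COMPARISON needs an index to match rows on); the comparison name CMP_T1_T2 is just a label, and the exact parameters are worth checking against the documentation for your Oracle version:
BEGIN
  -- register the comparison between the two tables
  DBMS_COMPARISON.CREATE_COMPARISON(
    comparison_name    => 'CMP_T1_T2',
    schema_name        => USER,
    object_name        => 'T1',
    remote_schema_name => USER,
    remote_object_name => 'T2'
  );
END;
/

DECLARE
  scan_info  DBMS_COMPARISON.COMPARISON_TYPE;
  consistent BOOLEAN;
BEGIN
  -- run the comparison and record per-row differences
  consistent := DBMS_COMPARISON.COMPARE(
    comparison_name => 'CMP_T1_T2',
    scan_info       => scan_info,
    perform_row_dif => TRUE
  );
  IF NOT consistent THEN
    DBMS_OUTPUT.PUT_LINE('Differences found, scan id ' || scan_info.scan_id);
  END IF;
END;
/
The row-level differences for that scan can then be inspected in the USER_COMPARISON_ROW_DIF view.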