Find table_name that contains two known column names - sql

I want to find the tables that contain both of two known columns in the same table. I tried this:
SELECT COLUMN_NAME, TABLE_NAME
FROM INFORMATION_SCHEMA.COLUMNS
WHERE COLUMN_NAME in ('CurrencyName', 'CurrencyKey');
This query returns all the tables that have either the CurrencyName or the CurrencyKey column.
But I want the tables that have both of these columns together.
Please shoot some ideas.
Thanks!

You are close. You want to use group by and then validate that you have two matches using a having clause:
SELECT TABLE_NAME
FROM INFORMATION_SCHEMA.COLUMNS
WHERE COLUMN_NAME in ('CurrencyName', 'CurrencyKey')
GROUP BY TABLE_NAME
HAVING COUNT(*) = 2;
Note: To be sure you have the right table, you should use TABLE_SCHEMA as well in the query.
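A sketch of what that could look like, grouping by the schema as well so that two tables with the same name in different schemas are not counted together:
SELECT TABLE_SCHEMA, TABLE_NAME
FROM INFORMATION_SCHEMA.COLUMNS
WHERE COLUMN_NAME in ('CurrencyName', 'CurrencyKey')
GROUP BY TABLE_SCHEMA, TABLE_NAME
HAVING COUNT(*) = 2;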

Related

SQL Match table and multiple columns

I want to know a query with which I can locate a specific table that contains two or more given columns.
I tried:
SELECT *
FROM DB
WHERE TableName = 'TableName'
AND ColumnName in ('column1', 'column2')
But this query checks whether any of those columns are there, whereas I want it to return a result only if all of them match.
I hope this question makes sense.
This should work for you in MySQL:
SELECT TABLE_NAME
FROM INFORMATION_SCHEMA.COLUMNS
WHERE COLUMN_NAME in ('column1','column2')
AND TABLE_SCHEMA='your_database'
GROUP BY table_name
HAVING COUNT(COLUMN_NAME) =2;
The IN operator is shorthand for a chain of ORs. However, there is no equally short way to express a chain of ANDs.
See this question.
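To see why the original attempt matched tables with either column, note that the IN filter above is just shorthand for an OR chain; it is the GROUP BY/HAVING step that enforces the "both columns" requirement. The same query written with explicit ORs, for illustration only:
SELECT TABLE_NAME
FROM INFORMATION_SCHEMA.COLUMNS
WHERE (COLUMN_NAME = 'column1' OR COLUMN_NAME = 'column2')
AND TABLE_SCHEMA='your_database'
GROUP BY table_name
HAVING COUNT(COLUMN_NAME) =2;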

Converting one to many relation into a json column in PostgreSQL

I'm trying to query two information_schema tables in PostgreSQL - tables and columns - in order to get the following result:
table_name - columns_as_json_array
Sort of converting this one-to-many relation into a JSON array column.
I tried a lot of different methods and came up with something like this:
SELECT t.table_name, c.json_columns
FROM information_schema.TABLES t
LEFT JOIN LATERAL(
SELECT table_name, json_agg(row_to_json(tbc)) AS json_columns
FROM information_schema.COLUMNS tbc
WHERE t.table_name = tbc.table_name
GROUP BY table_name
) as c ON TRUE;
This results in a list of table_names, but json_columns always contains all of the available columns instead of just the columns of that particular table.
Any ideas?
I don't really see the point of a lateral join here. As far as I can tell, you can get the expected results by aggregating information_schema.columns:
select table_name, json_agg(row_to_json(c)) json_columns
from information_schema.columns c
group by table_name
order by table_name
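If you only want user tables, a small variation (a sketch, assuming you simply want to skip PostgreSQL's own catalogs) is to filter on table_schema before aggregating:
select table_name, json_agg(row_to_json(c)) json_columns
from information_schema.columns c
where table_schema not in ('pg_catalog', 'information_schema')
group by table_name
order by table_name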

Get a list of database tables that contain a specific column field?

How to select all tables that contain a specific column?
Is this what you expect?
demo: db<>fiddle
SELECT table_name
FROM information_schema.columns
WHERE column_name = 'your_column_name'

Vertica Dynamic Max Timestamp from all Tables in a Schema

System is HP VERTICA 7.1
I am trying to create a SQL query which will dynamically find, from the system tables, all tables in a specific schema that have a timestamp column named DWH_CREATE_TIMESTAMP. (I have completed this part successfully.)
Then, pass this list of tables to an outer query or some kind of looping statement which will select the MAX(DWH_CREATE_TIMESTAMP) and TABLE_NAME from all the tables in the list (200+) and union all the results together into one list.
The expected output is a two-column result listing every such table with that timestamp field and the max value from each. Tables are constantly being created and dropped, so the point is to make everything totally dynamic where no TABLE_NAME values are ever hard-coded.
Any ideas for Vertica-specific ways to accomplish this without UDFs would be greatly appreciated.
Inner Query (working):
select distinct(table_name)
from columns
where column_name = 'DWH_CREATE_TIMESTAMP'
and table_name in (select DISTINCT(table_name) from all_tables where schema_name = 'PTG_DWH')
Outer Query (attempted - not working):
SELECT Max(DWH_CREATE_DATE) from
WITH table_name AS (
select distinct(table_name)
from columns
where column_name = 'DWH_CREATE_DATE' and table_name in (select DISTINCT(table_name) from all_tables where schema_name = 'PTG_DWH'))
SELECT MAX(DWH_CREATE_DATE)
FROM table_name
Thanks!!!
There is no way to do that in one SQL statement.
You can use the method below to get the max values of timestamp columns from the node-level storage metadata:
select projections.anchor_table_name, vs_ros.colname, max(max_value)
from vs_ros, vs_ros_min_max_values, storage_containers, projections
where vs_ros.colname ilike 'timestamp'
and vs_ros.salstorageid = storage_containers.sal_storage_id
and vs_ros_min_max_values.rosid = vs_ros.rosid
and storage_containers.projection_name = projections.projection_name
group by projections.anchor_table_name, vs_ros.colname
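Alternatively, if running things in two steps is acceptable, you can let the catalog generate the UNION ALL statement described in the question and then execute the generated text as a second step. A sketch only, assuming the PTG_DWH schema and DWH_CREATE_TIMESTAMP column from the question, and that the columns system table exposes table_schema:
select 'SELECT ''' || table_name || ''' AS table_name, MAX(DWH_CREATE_TIMESTAMP) FROM PTG_DWH.' || table_name || ' UNION ALL'
from columns
where column_name = 'DWH_CREATE_TIMESTAMP'
and table_schema = 'PTG_DWH';
Drop the trailing UNION ALL from the last generated line before running the combined statement.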

How to determine the number of "Column Name" in a table?

I have a table tblEmployeeInfo which has at least 100 columns.
I want to know how many columns are in that table. Is that possible?
NOTE:
tblEmployeeInfo has no data inside yet.
I would recommend using the INFORMATION_SCHEMA views. You can see all the columns and their types by doing:
select c.*
from INFORMATION_SCHEMA.COLUMNS c
where table_name = 'tblEmployeeInfo';
(You might want to include the table_schema as well.)
To get the count, just use COUNT(*):
select count(*)
from INFORMATION_SCHEMA.COLUMNS c
where table_name = 'tblEmployeeInfo';
Alternatively, in SQL Server you can count the columns directly from sys.columns:
SELECT COUNT(*)
FROM sys.columns
WHERE object_id = object_id('tblEmployeeInfo')
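Note that sys.columns is specific to SQL Server, and OBJECT_ID() resolves an unqualified name against your default schema, so schema-qualifying the name is a bit safer (assuming here that the table lives in dbo):
SELECT COUNT(*)
FROM sys.columns
WHERE object_id = OBJECT_ID('dbo.tblEmployeeInfo')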