Can we alter the size of a warehouse in Snowflake through OPENQUERY from SQL Server? - sql-server-2016

I just wanted to know whether we can alter the size of a warehouse in Snowflake through OPENQUERY from SQL Server.
I already have a linked server from SQL Server to Snowflake created, and I am able to run a normal SELECT from SQL Server against Snowflake.

You should be able to change the size of the warehouse using SQL:
ALTER WAREHOUSE my_wh SET WAREHOUSE_SIZE = MEDIUM;
See the ALTER WAREHOUSE documentation for details, provided you have the privileges required to operate the warehouse.
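To run it through the linked server from SQL Server, a pass-through EXEC ... AT statement is usually simpler than OPENQUERY, because OPENQUERY expects a result set. A minimal sketch, assuming the linked server is named SNOWFLAKE_LINK (a placeholder) and allows remote procedure calls:
-- Enable pass-through execution on the linked server (one-time setup).
EXEC sp_serveroption @server = N'SNOWFLAKE_LINK', @optname = N'rpc out', @optvalue = N'true';
-- The statement inside the quotes runs on Snowflake, not on SQL Server.
EXEC ('ALTER WAREHOUSE my_wh SET WAREHOUSE_SIZE = MEDIUM') AT SNOWFLAKE_LINK;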

Related

How to reduce the performance issue of two heterogeneous databases connected via database link

I have connected an Oracle database to an Azure SQL Server database via a database link.
Fetching the Azure SQL Server data from Oracle works fine, but when I join one Oracle table to one Azure SQL Server table it takes too much time to fetch the records.
For example:
I have a table temp_one in the Oracle database and a table temp_two in the Azure SQL Server database, reached through a database link named temp.sql.hk.com.
I joined them:
SELECT "EmpAddress", EMPLOYEE_ID
FROM temp_one a
INNER JOIN temp_two@temp.sql.hk.com b
ON a.EMPLOYEE_ID = b."EmpID";
Even this simple join takes 8 seconds.
Note: I am fetching the data on the Oracle side.
Please help me to resolve it.
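One mitigation to try (a sketch only, not from the original question; temp_two_local is a hypothetical staging table) is to copy the remote rows into a local Oracle table first, so the join runs entirely inside Oracle instead of pulling rows across the link during the join:
-- Materialize the SQL Server rows locally (could also be a materialized view refreshed on a schedule).
CREATE TABLE temp_two_local AS
SELECT "EmpID", "EmpAddress"
FROM temp_two@temp.sql.hk.com;
-- The join now uses only local tables.
SELECT b."EmpAddress", a.EMPLOYEE_ID
FROM temp_one a
INNER JOIN temp_two_local b
ON a.EMPLOYEE_ID = b."EmpID";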

Get data from Oracle to SQL Server after Oracle statements are executed

I'm building a project with SQL Server, but I can only get the data from an Oracle database on my customer's PC, because they want to manage the data before I work with it. I have used a Microsoft linked server and Microsoft SQL Server Migration Assistant for Oracle to get data from the Oracle database. The problem is that I don't know when my customer INSERTs, DELETEs, or UPDATEs Oracle records. Is there any way to migrate data from Oracle to my SQL Server automatically?
I really need help. At the moment I have to do it manually every time I want to refresh the SQL Server database with new records from Oracle.
Thanks in advance!
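One simple approach (a sketch under assumptions: the existing linked server is called ORACLE_LINK, the tables share a key column ID, and dbo.CustomerData / APP.CUSTOMER_DATA are placeholder names) is to schedule a MERGE through the linked server, for example from a SQL Server Agent job:
-- Pull the current Oracle rows through the linked server and upsert/delete locally.
MERGE dbo.CustomerData AS target
USING (
    SELECT ID, NAME
    FROM OPENQUERY(ORACLE_LINK, 'SELECT ID, NAME FROM APP.CUSTOMER_DATA')
) AS source
ON target.ID = source.ID
WHEN MATCHED THEN
    UPDATE SET target.NAME = source.NAME
WHEN NOT MATCHED BY TARGET THEN
    INSERT (ID, NAME) VALUES (source.ID, source.NAME)
WHEN NOT MATCHED BY SOURCE THEN
    DELETE;
Running this on a schedule keeps the copy reasonably current; detecting the exact moment of each INSERT/UPDATE/DELETE would need something on the Oracle side, such as triggers or a replication tool.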

Tableau Load Hive Metadata

Is it possible to read/load Hive metadata in Tableau?
For example, I want to build a Tableau report that has all databases' names. In Hive, I can use show databases to get the full list of databases; but how can I load the result to Tableau?
Hive's metadata is stored in an external RDBMS (the metastore database).
The currently supported backends are MySQL, PostgreSQL, Oracle, MS SQL Server, and Derby.
All you need to do is connect to that RDBMS from Tableau and query it.
https://cwiki.apache.org/confluence/display/Hive/AdminManual+MetastoreAdmin#AdminManualMetastoreAdmin-SupportedBackendDatabasesforMetastore
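Assuming the default metastore schema, the database list that SHOW DATABASES returns lives in the DBS table, so a Tableau custom SQL data source against the metastore could look like this sketch (column names may vary slightly between Hive versions):
-- Run against the metastore RDBMS itself, not through HiveQL.
SELECT NAME AS database_name,
       DB_LOCATION_URI AS location
FROM DBS
ORDER BY NAME;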

Cross-Database Query on Azure SQL Database

I am using Azure SQL Database and I have two databases. I want to select data from a table in the first database and merge it into another (conditional) table in the second database.
Is there any SQL solution to this problem?
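Azure SQL Database does not allow three-part cross-database names, so the usual route is elastic query: expose the remote table as an external table and then query or MERGE it locally. A sketch, with server, database, credential, table, and column names as placeholders:
-- Run in the database where the query executes.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';
CREATE DATABASE SCOPED CREDENTIAL RemoteDbCred
    WITH IDENTITY = '<sql login>', SECRET = '<password>';
CREATE EXTERNAL DATA SOURCE RemoteDb WITH (
    TYPE = RDBMS,
    LOCATION = 'myserver.database.windows.net',
    DATABASE_NAME = 'FirstDatabase',
    CREDENTIAL = RemoteDbCred
);
-- Column definitions must match the table in the other database.
CREATE EXTERNAL TABLE dbo.SourceTable (
    ID INT,
    Payload NVARCHAR(200)
) WITH (DATA_SOURCE = RemoteDb);
-- The external table can now be read like a local one, e.g. as the source of a MERGE.
SELECT * FROM dbo.SourceTable;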

How to capture truncate statement information in SQL Server 2012

I would like to capture information about TRUNCATE statements, along with the user/login information, for all databases on my production server.
Example:
Use mydb
go
truncate table deleteme_table
I would like to capture the information into a table like the one below:
Table           Operation  Database  Login              Time
deleteme_table  Truncate   mydb      sandeep.pulikonda  17-12-2014 17:50:00
If the above scenario is not possible, please suggest possible ways to capture it.
I am using SQL Server 2012 Standard edition, so granular-level audits are not supported for that version.
You can use the SQL Server Audit functionality and add an audit for those statements; this article explains in detail how to set it up.
Another good way of profiling your SQL Server is SQL Server Profiler. Here is an SO question similar to yours, with an answer describing how to use Profiler to achieve the result:
SQL Server Profiler - How to filter trace to only display TSQL containing a DELETE statement?
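Because the question mentions SQL Server 2012 Standard, another option (a different technique from the Audit/Profiler suggestions above, available in every edition) is an Extended Events session that captures statements containing TRUNCATE TABLE together with the login, database, and time. A sketch, with the session and file names as placeholders:
-- Capture completed statements that contain TRUNCATE TABLE, plus who ran them and where.
CREATE EVENT SESSION CaptureTruncate ON SERVER
ADD EVENT sqlserver.sql_statement_completed (
    ACTION (sqlserver.server_principal_name,
            sqlserver.database_name,
            sqlserver.sql_text)
    WHERE sqlserver.like_i_sql_unicode_string(sqlserver.sql_text, N'%TRUNCATE TABLE%')
)
ADD TARGET package0.event_file (SET filename = N'CaptureTruncate.xel');
ALTER EVENT SESSION CaptureTruncate ON SERVER STATE = START;
The resulting .xel file can be read with sys.fn_xe_file_target_read_file and shredded into a table with the columns shown in the question.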