Azure Data Explorer supposedly supports T-SQL queries:
The Kusto.Explorer tool supports T-SQL queries to Kusto. To instruct Kusto.Explorer to execute a query, begin the query with an empty T-SQL comment line (--).
However, I can't get this to work in a Log Analytics Workspace.
For instance, this Kusto query works fine and returns results:
ContainerInstanceLog_CL
| where Message has "Hamlet"
| limit 500
But any attempt to use T-SQL (with a leading empty comment line) ...
--
SELECT * FROM ContainerInstanceLog_CL
...fails with
Query could not be parsed at '-' on line [1,1]
Token: -
Line: 1
Position: 1
Are T-SQL queries not supported in Log Analytics Workspaces?
Unfortunately, you cannot run T-SQL queries in Azure Log Analytics Workspaces.
I would suggest providing feedback on this here:
https://feedback.azure.com/forums/267889-azure-monitor-log-analytics
All of the feedback you share in these forums will be monitored and reviewed by the Microsoft engineering teams responsible for building Azure.
T-SQL queries run only on Azure Data Explorer, as the documentation quoted above describes.
Writing my comment as an answer as suggested.
Log Analytics workspaces support only Kusto (KQL) as of now. You can further integrate with Power BI for better analytics options.
I have a question about getting the detailed query usage logs from Azure Synapse on-demand.
With
SELECT * FROM sys.dm_external_data_processed
I can get the daily/weekly usage. It’s good.
But I need a per-query breakdown of that usage, in an "exportable" way (T-SQL or something I can query to get the usage details for export).
I've tried googling it, but with no luck.
Ok, I found it.
On each serverless DB this query gives the details:
SELECT TOP 1000 *
FROM sys.dm_exec_requests_history
ORDER BY start_time DESC
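If you want that in a more aggregated, export-friendly shape, here is a hedged follow-up sketch over the same DMV. The data_processed_mb column is an assumption on my part; column names can vary by service version, so check with a SELECT TOP 1 * first:

-- Aggregate per-day usage from the request history for export.
-- NOTE: data_processed_mb is assumed; verify the DMV's actual columns.
SELECT CAST(start_time AS date) AS usage_date,
       COUNT(*)                 AS query_count,
       SUM(data_processed_mb)   AS mb_processed
FROM sys.dm_exec_requests_history
GROUP BY CAST(start_time AS date)
ORDER BY usage_date DESC;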
I have tables in Azure Databricks that I interact with via SQL in a notebook. I need to select all columns from a table with 200 columns, but I need to modify some of them for a select-insert (to adjust specific columns for a PK), so I cannot use SELECT *. (There are multiple scenarios; this is just my current objective.)
How can I generate a SELECT statement for a table that spells out all of its column names? This would be the equivalent of SSMS's 'Select Top N Rows', which generates a SELECT for the table that I can then edit.
I have seen commands like DESCRIBE and SHOW, but they can't build a SELECT statement.
I am new to Databricks. Any help is appreciated.
I have the same problem. It is really tough to write and modify SELECT statements for tables like this. I have tried many ways and found that using third-party software to connect to the table on Azure Databricks works fine.
Here is what I do:
1. Download third-party software such as DBeaver.
2. Download the Databricks JDBC driver from this page.
3. Configure the Databricks driver. Luckily there is an official doc for DBeaver.
4. Connect to Databricks and find the table you want a SELECT statement for.
5. Use DBeaver's built-in function to generate it.
That's it!
I found this setup took just 10-15 minutes to complete, saving a lot of time.
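If you'd rather stay inside the notebook, here is a minimal pure-SQL sketch, assuming a Unity Catalog-enabled workspace where information_schema is available; my_catalog, my_schema, and my_table are placeholder names:

-- Build a SELECT listing every column of the table.
-- Structs are sorted by ordinal_position so columns come out in table order.
SELECT concat(
         'SELECT ',
         array_join(
           transform(
             array_sort(collect_list(struct(ordinal_position, column_name))),
             c -> c.column_name
           ),
           ',\n       '
         ),
         '\nFROM my_catalog.my_schema.my_table'
       ) AS generated_select
FROM my_catalog.information_schema.columns
WHERE table_schema = 'my_schema'
  AND table_name   = 'my_table';

Copy the generated_select result into a new cell and edit the columns you need to change.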
I want to copy data from one database table into another database table on the same server in Azure SQL. I have done all of the 'Azure SQL Cross Database Query' steps described here: https://www.mssqltips.com/sqlservertip/6445/azure-sql-cross-database-query/ but I still get the same error whenever I execute a query:
'Reference to database and/or server name in 'db name' is not supported in this version of SQL Server.'
Can you please help me figure this out?
Azure SQL Database doesn't support cross-database queries directly.
USE statements are not supported, which is why you get that error. Statements like SELECT * FROM [other_database].[schema].[table] won't run either.
In Azure SQL Database, only the elastic query feature (preview) can achieve cross-database queries:
The elastic query feature (in preview) enables you to run a Transact-SQL query that spans multiple databases in Azure SQL Database. It allows you to perform cross-database queries to access remote tables, and to connect Microsoft and third-party tools (Excel, Power BI, Tableau, etc.) to query across data tiers with multiple databases.
You can follow the tutorial below; note that it is more involved than on-premises SQL Server:
Get started with cross-database queries (vertical partitioning) (preview)
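As a rough orientation, here is a condensed sketch of the setup, assuming both databases live on the same logical server and using placeholder names (ElasticCred, RemoteSrc, OtherDb, dbo.RemoteTable, dbo.LocalTable); the tutorial above is the authoritative walkthrough:

-- Run these in the database that will issue the query.
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';

CREATE DATABASE SCOPED CREDENTIAL ElasticCred
WITH IDENTITY = '<sql login>', SECRET = '<password>';

CREATE EXTERNAL DATA SOURCE RemoteSrc
WITH (
    TYPE = RDBMS,
    LOCATION = '<server>.database.windows.net',
    DATABASE_NAME = 'OtherDb',
    CREDENTIAL = ElasticCred
);

-- Column definitions must match the table in OtherDb.
CREATE EXTERNAL TABLE dbo.RemoteTable (
    Id INT,
    Name NVARCHAR(100)
)
WITH (DATA_SOURCE = RemoteSrc);

-- The cross-database copy then becomes a local query:
INSERT INTO dbo.LocalTable (Id, Name)
SELECT Id, Name FROM dbo.RemoteTable;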
I am using the BigQuery web UI to run my queries. I want to delete some specific rows from all tables in a dataset. I want to do it by running all the DELETE queries in one go, like below:
DELETE FROM `dataset_name.tabl_name_1` WHERE REGEXP_CONTAINS(user_dim.user_id, r'g_1478_h_1.') = TRUE;
DELETE FROM `dataset_name.tabl_name_2` WHERE REGEXP_CONTAINS(user_dim.user_id, r'g_1478_h_1.') = TRUE;
DELETE FROM `dataset_name.tabl_name_3` WHERE REGEXP_CONTAINS(user_dim.user_id, r'g_1478_h_1.') = TRUE
There are almost 500 tables, so there will be almost 500 queries to run in one go. I have unchecked the 'Use Legacy SQL' option.
But running the above queries (almost 500) returns this error:
Syntax error: Unexpected keyword DELETE at [2:1]
Is there any solution to my problem?
You cannot do this in the BigQuery web UI!
Your best option here is to use the BigQuery client of your preference and script those repetitive statements.
Keep in mind the quotas/limitations for DML.
Edit (October 2019):
Support for scripting and stored procedures is now in beta.
You can submit multiple statements separated by semicolons, and BigQuery can now run them.
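Scripting also lets you avoid writing 500 statements by hand. A hedged sketch, assuming the dataset is dataset_name, every table has the user_dim.user_id column, and your project supports FOR...IN loops (a scripting feature added after the original beta):

-- Loop over all base tables in the dataset and issue one DELETE per table.
FOR tbl IN (
  SELECT table_name
  FROM `dataset_name.INFORMATION_SCHEMA.TABLES`
  WHERE table_type = 'BASE TABLE'
)
DO
  EXECUTE IMMEDIATE FORMAT("""
    DELETE FROM `dataset_name.%s`
    WHERE REGEXP_CONTAINS(user_dim.user_id, r'g_1478_h_1.')
  """, tbl.table_name);
END FOR;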
I get around this by putting the queries in a Cloud Function (using Python) and scheduling it with the new Cloud Scheduler. It works fine, but it would be easier in BQ itself.
We have a closed-source application here that connects to an Informix database (using ODBC).
Is there any way I can see the queries being executed by this application?
Turn on ODBC tracing. Some information can be found here.
Multiple possibilities:
These give all the statements:
- ODBC tracing, as mentioned above.
- SET EXPLAIN ON as a pre-SQL statement (generates huge files with the SQL statements and query plans).
These show the currently running statements:
- Using onstat commands on the server running the database (onstat -g sql).
- Connecting to the monitoring database (sysmaster) and querying the session tables.
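For the last option, a hedged sketch from memory; the pseudo-table and column names (syssessions, syssqexplain, sqx_sessionid, sqx_sqlstatement) are assumptions, so verify them against your server version:

-- Show each session together with the SQL statement it is running.
SELECT s.sid, s.username, sq.sqx_sqlstatement
FROM sysmaster:syssessions AS s,
     sysmaster:syssqexplain AS sq
WHERE sq.sqx_sessionid = s.sid;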