I've been having a very hard time using SPL to query data in Splunk... I wish to replace all of that with some simple SQL. Is that possible? If so, how?
I don't want to pay a lot just to get Splunk training... I'd rather use my SQL skills :)
Hope you all agree and can help me find a solution!
Cheers!
It is not possible to use SQL to query data in Splunk. Introductory training in Splunk's query language is free. Go to https://www.splunk.com/en_us/training.html, click on "Free Courses", and select "Free Splunk Fundamentals 1".
Splunk has a manual to help SQL users transition to SPL. See https://docs.splunk.com/Documentation/Splunk/7.3.0/SearchReference/SQLtoSplunk.
For general help with searching in Splunk, see https://docs.splunk.com/Documentation/Splunk/7.3.0/Search/GetstartedwithSearch.
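To see how a familiar SQL statement maps onto SPL in practice, here is a minimal sketch using the Splunk Python SDK (splunk-sdk); the connection details, the "web" index, and the "status" field are placeholders I've assumed, not anything from the docs above.

# Minimal sketch using the Splunk Python SDK (pip install splunk-sdk).
# Host, credentials, index, and field names are hypothetical placeholders.
import splunklib.client as client
import splunklib.results as results

service = client.connect(
    host="localhost", port=8089,           # Splunk management port
    username="admin", password="changeme"  # placeholder credentials
)

# Roughly the SPL equivalent of:
#   SELECT * FROM web WHERE status = 404 LIMIT 10;
spl = 'search index=web status=404 | head 10'

for event in results.ResultsReader(service.jobs.oneshot(spl)):
    print(event)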
I'm actually working on a connector between Apache Drill and Splunk that will enable you to execute SQL queries against a Splunk installation. The code hasn't been merged into Drill yet, but you can view the pull request here [1]. Here is a link to the documentation [2].
I find myself using the Kusto query language (KQL) via Azure Log Analytics, and I'm struggling to find a way to get any sort of detailed execution report or query plan.
In PostgreSQL I'd use EXPLAIN to produce a report on how the DBMS intends to execute the query, or EXPLAIN ANALYZE for a report on how a query actually got executed. Is there anything akin to that in KQL?
Searches for "kql query plan", "kusto explain query" etc have been largely fruitless, but this probably just means I don't know the right terms.
Supporting the suggestion provided by David דודו Markovitz and posting it as an answer to help other community members who have related questions.
Yes, I agree that Log Analytics has many limitations, and you can't link to or query Log Analytics from the ADX web UI.
Here are a few documents related to Log Analytics and its limitations.
In Azure Elastic Jobs (preview) I am trying to find a way to send an email from within T-SQL in one of the job steps, since msdb.dbo.sp_send_dbmail is not available in Elastic Jobs. I have done some research, and there appears to be no native support for this that I can find; the only suggestion so far is saving the job's output to a table and then using something else (like PowerShell) to send the actual email.
Here is a link to an article explaining one way of achieving this, and another one here.
Has anyone found a better solution to this?
Azure SQL Database doesn't support msdb.dbo.sp_send_dbmail. That's why we must use other tools like PowerShell or a Logic App; for now these are the only solutions we can find, and there isn't a better one. A sketch of that workaround follows below.
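As an illustration of the table-plus-script workaround, here is a minimal sketch that reads job output from a table and emails it with Python instead of PowerShell; the connection string, the dbo.job_output table, and the SMTP details are all hypothetical.

# Sketch of the "save job output to a table, then email it" workaround.
# Connection string, dbo.job_output table, and SMTP details are placeholders.
import smtplib
from email.message import EmailMessage

import pyodbc  # pip install pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.database.windows.net;DATABASE=mydb;"
    "UID=jobreader;PWD=secret"
)
rows = conn.cursor().execute(
    "SELECT run_time, message FROM dbo.job_output ORDER BY run_time"
).fetchall()

msg = EmailMessage()
msg["Subject"] = "Elastic Job output"
msg["From"] = "jobs@example.com"
msg["To"] = "dba@example.com"
msg.set_content("\n".join(f"{r.run_time}: {r.message}" for r in rows))

with smtplib.SMTP("smtp.example.com", 587) as smtp:
    smtp.starttls()
    smtp.login("jobs@example.com", "secret")
    smtp.send_message(msg)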
If you want to use this feature, you can choose Azure SQL Managed Instance or SQL Server in a VM.
HTH.
I need to create a connection between MongoDB and SQL Server, where I want to replicate a subset of my database from SQL Server into MongoDB. Can anyone comment on the feasibility of this, and how to do it?
Right now I am using SymmetricDS for the replication but unable to...
Please suggest whether SymmetricDS is able to serve this purpose.
Here is how you target MongoDB:
http://www.symmetricds.org/doc/3.8/html/user-guide.html#_mongodb
If you need more flexibility than straight table-to-table mapping, then you would write your own data loader, using the MongoDatabaseWriter as a pattern.
https://github.com/JumpMind/symmetric-ds/tree/0c5cc1c24b42a64405f4b79c3cb6b594a35467f2/symmetric-client/src/main/java/org/jumpmind/symmetric/io
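To give a sense of what such a custom loader does, here is a minimal one-shot sketch (not SymmetricDS code) that copies a subset of rows from SQL Server into MongoDB with pyodbc and pymongo; the connection details, table, columns, and collection are hypothetical, and unlike SymmetricDS it has no trigger-based change capture.

# One-shot copy of a row subset from SQL Server into MongoDB.
# Connection details and table/column/collection names are hypothetical;
# SymmetricDS adds the trigger-based change capture this sketch lacks.
import pyodbc                    # pip install pyodbc
from pymongo import MongoClient  # pip install pymongo

sql = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=localhost;DATABASE=shop;UID=sa;PWD=secret"
)
orders = MongoClient("mongodb://localhost:27017")["shop"]["orders"]

cursor = sql.cursor().execute(
    "SELECT order_id, customer, total FROM dbo.orders WHERE total > 100"
)
cols = [c[0] for c in cursor.description]

for row in cursor:
    doc = dict(zip(cols, row))
    # Upsert keyed on order_id so re-runs don't create duplicates.
    orders.replace_one({"order_id": doc["order_id"]}, doc, upsert=True)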
I found an easy way around for the data exchange from SQL to MongoDB using:
SQLtoMongo C# Tool
KNIME Analytics Platform (very easy to implement, open source)
But I'm still looking for something trigger-based to replicate changes easily.
I need to build a Kibana-like dashboard over a SQL database. Is this possible? Or is there an alternative as easy as Kibana (in terms of integration) for SQL?
Siren (http://siren.io) is an extended Kibana which supports connecting directly to SQL (or other APIs) and creating filters and analytics.
Check it out.
There is a blog about how to index SQL databases in Elasticsearch here:
http://blog.comperiosearch.com/blog/2014/01/30/elasticsearch-indexing-sql-databases-the-easy-way/
Once you get it indexed, you can set up your Kibana to view your data.
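The indexing step amounts to reading rows and bulk-posting them as JSON documents. Here is a minimal sketch (my own, not from the linked blog) using sqlite3 and the official elasticsearch Python client; the database file, table, and index names are hypothetical.

# Read rows from a SQL database and bulk-index them into Elasticsearch
# so Kibana can visualise them. Database, table, and index names are
# hypothetical placeholders.
import sqlite3

from elasticsearch import Elasticsearch, helpers  # pip install elasticsearch

db = sqlite3.connect("metrics.db")
db.row_factory = sqlite3.Row  # rows become dict-like

es = Elasticsearch("http://localhost:9200")

actions = (
    {"_index": "sales", "_source": dict(row)}
    for row in db.execute("SELECT region, amount, sold_at FROM sales")
)
helpers.bulk(es, actions)  # the "sales" index is now chartable in Kibana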
You can also find more options suggested here:
http://community.spiceworks.com/topic/377151-generate-a-dashboard-from-sql-database
This could be an alternative for you: https://github.com/KPIWatchdog/DBconnector
It creates an API to the tool, where you can create queries, visualise data, prepare dashboards and share them with others. Depending on the number of metrics, you can opt for the free plan or a small monthly fee.
I am trying to identify slow queries in a large-scale Django 1.3 web application. As it is difficult to match a raw SQL query in the slow query log with the specific ORM statement in the code, I wondered if it is possible to add a SQL comment to the query constructed with the ORM, something like:
Object.objects.filter(Q(pub_date__lte=datetime.now())).comment('query no. 123')
Solution found on the django-users mailing list: use .extra() to attach raw SQL:
Object.objects.filter(Q(pub_date__lte=datetime.now())).extra(where=['1=1 /* query no. 123 */'])
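To avoid repeating that boilerplate, the trick can be wrapped in a small helper; this is a hypothetical convenience function of my own, not part of the mailing-list answer:

def tagged(queryset, tag):
    # Append a no-op WHERE clause carrying a SQL comment, so the query
    # can be matched to its ORM call site in the slow query log.
    # `tag` must be a trusted string: it is interpolated into raw SQL.
    return queryset.extra(where=["1=1 /* %s */" % tag])

# Usage:
# tagged(Object.objects.filter(pub_date__lte=datetime.now()), "query no. 123")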
For those reading in 2022 onwards - there is a much better answer these days:
Google's sqlcommenter project has a Django middleware
[A] Django middleware whose purpose is to augment a SQL statement right before execution, with information about the controller and user code to help with later making database optimization decisions, after those statements are examined from the database server’s logs.
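Enabling it is essentially a one-line middleware addition in settings.py; here is a sketch under the assumption that the package is google-cloud-sqlcommenter (check the project's README, as the module path may differ by version):

# settings.py fragment: enable sqlcommenter's Django middleware so each
# ORM query is annotated with its originating route/view.
# Module path assumed from google-cloud-sqlcommenter; verify for your version.
MIDDLEWARE = [
    "google.cloud.sqlcommenter.django.middleware.SqlCommenter",
    # ... the rest of your middleware ...
]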