Diagnostics from Kubernetes are not sending logs to the storage account - azure-log-analytics

I need to archive my kube-audit logs in a storage account and retain them for 100 days. I followed these steps, but I don't see the logs in my storage account. I see a folder $logs being created, but there is nothing in it. What am I doing wrong? When I go to Logs and run AzureDiagnostics | where category == "kube-audit" for a timespan of less than 30 minutes, I can see the data there, but nothing is landing in the storage account.
1. Created a storage account in the portal.
2. Went to the Kubernetes service for my cluster --> selected Diagnostic settings.
3. Clicked on New diagnostic setting, selected kube-audit from the list of logs, set a retention period of 100 days, and selected the storage account I created in the first step from the dropdown.
4. Clicked on Save.
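For reference, those steps correspond roughly to the sketch below using the azure-mgmt-monitor Python package. The resource IDs, the setting name, and the model usage are illustrative and may differ by SDK version; this is not the poster's actual configuration.
# Rough sketch only; placeholders throughout, and model/field names can vary
# between azure-mgmt-monitor versions.
from azure.identity import DefaultAzureCredential
from azure.mgmt.monitor import MonitorManagementClient
from azure.mgmt.monitor.models import (
    DiagnosticSettingsResource,
    LogSettings,
    RetentionPolicy,
)

subscription_id = "<subscription-id>"
aks_resource_id = (
    "/subscriptions/<subscription-id>/resourceGroups/<rg>"
    "/providers/Microsoft.ContainerService/managedClusters/<cluster>"
)
storage_account_id = (
    "/subscriptions/<subscription-id>/resourceGroups/<rg>"
    "/providers/Microsoft.Storage/storageAccounts/<storage-account>"
)

client = MonitorManagementClient(DefaultAzureCredential(), subscription_id)

# One diagnostic setting: archive the kube-audit category to the storage
# account and keep it for 100 days.
client.diagnostic_settings.create_or_update(
    resource_uri=aks_resource_id,
    name="kube-audit-to-storage",
    parameters=DiagnosticSettingsResource(
        storage_account_id=storage_account_id,
        logs=[
            LogSettings(
                category="kube-audit",
                enabled=True,
                retention_policy=RetentionPolicy(enabled=True, days=100),
            )
        ],
    ),
)
Note that logs archived by a diagnostic setting normally land in a container named insights-logs-<category> (for example insights-logs-kube-audit) rather than in $logs, which is the storage analytics container.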

Related

Log Analytics retention policy and querying on logs

I would like to know how to address this scenario in Azure Log Analytics: I need to collect kube-audit logs from different clusters every week and retain these logs for approximately 400 days. Storing them in Log Analytics will cost more, and it is not an optimized architecture because I will not need the logs that often. So I would like to know from experts the best way to design an architecture where the kube-audit logs can be retained for 400 days and be available for querying when required, without incurring too much cost.
PS: I have also heard from my team that querying 400 days of logs in KQL always times out.
Log Analytics offerings:
Log Analytics now provides the ability to manage different service tiers at table scope. You can set your data to the Archive tier, which cannot be queried directly but costs much less; archive retention spans up to 7 years.
When needed, you can elevate a subset of your data into the Analytics tier, which gives you the ability to query it. This action is called a "Search job".
Another option is to elevate an entire time range to the Analytics tier; this is called "Restore logs".
Table service tiers -
https://learn.microsoft.com/en-us/azure/azure-monitor/logs/data-retention-archive?tabs=api-1%2Capi-2
Search job offering -
https://learn.microsoft.com/en-us/azure/azure-monitor/logs/search-jobs?tabs=api-1%2Capi-2%2Capi-3
Restore logs -
https://learn.microsoft.com/en-us/azure/azure-monitor/logs/restore?tabs=api-1%2Capi-2
All of these are in public preview.
Both offerings - Search jobs and Restore logs - give you the ability to work with your data on demand; I can't comment or make suggestions regarding the actual cost.
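On the PS about 400-day queries timing out: a common workaround is to break a long timespan into smaller windows and query them one at a time. Below is a minimal sketch assuming the azure-monitor-query Python package; the workspace ID, table, and query are placeholders, not part of the original answer.
from datetime import datetime, timedelta, timezone
from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

workspace_id = "<log-analytics-workspace-id>"
client = LogsQueryClient(DefaultAzureCredential())

# Placeholder query; replace with whatever aggregation you actually need.
query = 'AzureDiagnostics | where Category == "kube-audit" | summarize count() by bin(TimeGenerated, 1d)'

end = datetime.now(timezone.utc)
start = end - timedelta(days=400)
window = timedelta(days=7)

cursor = start
while cursor < end:
    window_end = min(cursor + window, end)
    # Query one 7-day slice at a time instead of the whole 400 days at once.
    response = client.query_workspace(workspace_id, query, timespan=(cursor, window_end))
    for table in response.tables:
        for row in table.rows:
            print(row)
    cursor = window_end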
Azure data explorer solution:
Another option is to use Azure Storage to hold your data (as an example). Azure Data Explorer provides the ability to create an external table, which is a logical view on top of your data; the data itself is kept outside of the ADX cluster. You can query your data using ADX, but expect some degradation in query performance.
ADX external table offering -
https://learn.microsoft.com/en-us/azure/data-explorer/kusto/query/schema-entities/externaltables
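For completeness, querying such an external table from Python could look like the sketch below, assuming the azure-kusto-data package; the cluster URI, database, and external table name are made up for illustration.
from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

# Authenticate with whatever method fits your environment; Azure CLI auth is
# just one option.
kcsb = KustoConnectionStringBuilder.with_az_cli_authentication(
    "https://<cluster>.<region>.kusto.windows.net"
)
client = KustoClient(kcsb)

# external_table() exposes the blobs in storage as a queryable logical table.
query = "external_table('KubeAuditExternal') | where TimeGenerated > ago(400d) | take 10"
response = client.execute("<database>", query)

for row in response.primary_results[0]:
    print(row)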

Change time in Azure Log Analytics

In Azure Log Analytics, logs are saved all day. Is there any way to change the time of saving? I mean changing the time so that logs are only saved, for example, between 6:00 AM and 9:00 PM?
It would be great if you could send a screenshot.
Thank you!
Unfortunately, there is no such option when logging into Azure Log Analytics. You can only control the log categories when configuring logs to be sent to Azure Log Analytics via the UI or the REST API.
If you are only interested in data from a specific time range, you can write a query in Azure Log Analytics that fetches just those logs.
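For example, a small sketch of such a query from Python, assuming the azure-monitor-query package; the table is a placeholder, and the KQL hourofday() filter keeps only rows logged between 6:00 AM and 9:00 PM.
from datetime import timedelta
from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

# Filter at query time; this does not change what gets ingested or stored.
query = """
AzureDiagnostics
| where hourofday(TimeGenerated) between (6 .. 20)   // keep 06:00-20:59
| take 50
"""

response = client.query_workspace(
    "<log-analytics-workspace-id>", query, timespan=timedelta(days=1)
)
for table in response.tables:
    for row in table.rows:
        print(row)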

BigQuery user statistics from MicroStrategy

I am using MicroStrategy to connect to BigQuery using a service account. I want to collect user-level job statistics from MSTR, but since I am using a service account, I need a way to track user-level job statistics in BigQuery for all the jobs executed via MicroStrategy.
Since you are using a service account to make the requests from MicroStrategy, you could look up all of your project's jobs by listing them and then, using each job ID in the list, gather the information for that job, which shows the email used for the job ID.
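A rough sketch of that approach with the google-cloud-bigquery Python client; the project ID and time window are placeholders.
from datetime import datetime, timedelta, timezone
from google.cloud import bigquery

client = bigquery.Client(project="<project-id>")

# List jobs from all users in the project over the last 7 days.
since = datetime.now(timezone.utc) - timedelta(days=7)
for job in client.list_jobs(all_users=True, min_creation_time=since):
    # For jobs submitted through MicroStrategy, user_email will be the
    # service account rather than the end user.
    print(job.job_id, job.user_email, job.created, job.state)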
Another workaround would be to use Stackdriver Logging advanced filters with a filter that returns the jobs made by the service account. For instance:
resource.type="bigquery_resource"
protoPayload.authenticationInfo.principalEmail="<your service account>"
Keep in mind this only shows jobs from the last 30 days due to the log retention period.
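If you prefer to pull those entries programmatically, here is a minimal sketch with the google-cloud-logging Python client; the project ID and service account email are placeholders.
from google.cloud import logging

client = logging.Client(project="<project-id>")

# Same advanced filter as above, run through the client library.
log_filter = (
    'resource.type="bigquery_resource" '
    'protoPayload.authenticationInfo.principalEmail="<your service account>"'
)

for entry in client.list_entries(filter_=log_filter, order_by=logging.DESCENDING):
    print(entry.timestamp, entry.log_name)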
Hope it helps.

What are BigQuery audit logs supposed to produce?

I have been looking at the new BigQuery logging feature in the Cloud Platform Console, but it seems a bit inconsistent in what is being logged.
I can see some creates, deletes, inserts, and queries. However, when I ran a few queries and copy jobs through the web UI, they did not show up.
Should activity in the BigQuery web UI also be logged?
Does it differ depending on where the request comes from, e.g. console or API access?
There is no difference between console and API access. Activity in the BigQuery web UI should be logged.
Are you using the Cloud Log viewer to view these logs? In some cases, there might be a delay of a few seconds before these logs show up in the log viewer, and you might have to refresh the logs.
Logs containing information about queries are written to the Data Access log stream, as opposed to the Admin Activity stream. Only users with Project Owner permissions can view the contents of the Data Access logs, which might be why you aren't seeing them.
You should check with the project owner to confirm you have the right permissions to see these logs.
To view the Data Access audit logs, you must have the Cloud IAM role Logging/Private Logs Viewer or Project/Owner. I had a similar issue recently, and after being granted Logging/Private Logs Viewer I was able to see the logs.
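Once you have one of those roles, the Data Access stream can also be read with the google-cloud-logging Python client; a hedged sketch follows, where the project ID is a placeholder and the log name shown is the usual cloudaudit data_access stream.
from google.cloud import logging

project_id = "<project-id>"
client = logging.Client(project=project_id)

# BigQuery entries from the Data Access audit log stream (not Admin Activity).
log_filter = (
    f'logName="projects/{project_id}/logs/cloudaudit.googleapis.com%2Fdata_access" '
    'AND protoPayload.serviceName="bigquery.googleapis.com"'
)

for entry in client.list_entries(filter_=log_filter, order_by=logging.DESCENDING):
    print(entry.timestamp, entry.log_name)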

Monitor data transfer on Windows Azure?

How can I monitor traffic going out of a WCF service (self-hosted) on Windows Azure? The amount of data going into my stress-test app doesn't seem to add up to what I'm seeing on the pricing page (which doesn't seem to be updated live anyway). The service uses HTTPS and the messages are pretty small. Is the SSL handshake traffic negligible? I also have a data-miner worker role that continuously downloads data from the internet, but from what I've read, inbound traffic is free, so it shouldn't count toward the OUT traffic.
How can I get a reliable traffic monitor?
The billing page is usually updated once a day (once in a 24-hour period), so you have to wait a while before the results of your stress test show up on the billing page for your account.
One place you can monitor this (among other KPIs for your application) is the MONITOR tab in the Management Portal. Navigate to the cloud service under test, click the MONITOR menu item, then click Add Metric at the bottom, and finally choose Network Out. This monitoring dashboard gets data every 5 minutes, so it should reflect the network usage you are talking about.
Another option is to use a network performance counter such as Network Interface : Bytes Sent/sec. You have to configure Windows Azure Diagnostics to monitor that specific performance counter. You can then set a scheduled transfer period of 1 minute and dig into the table created by the diagnostics agent for the data.
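If you go the performance-counter route, the table the agent writes to can be read afterwards; here is a rough sketch with the azure-data-tables Python package, assuming the classic agent's default WADPerformanceCountersTable and its usual columns (the connection string is a placeholder).
from azure.data.tables import TableClient

conn_str = "<storage-account-connection-string>"
table = TableClient.from_connection_string(
    conn_str, table_name="WADPerformanceCountersTable"
)

# Iterate the transferred samples and keep only the Bytes Sent/sec counter;
# filtering client-side avoids escaping backslashes in an OData filter.
for entity in table.list_entities():
    if "Bytes Sent/sec" in entity.get("CounterName", ""):
        print(entity.get("RoleInstance"), entity.get("CounterValue"))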
P.S. And yes, you are correct - INBOUND data for Azure is FREE.