Change time in Azure Log Analytics - azure-log-analytics

In Azure Log Analytics, logs are saved all day. Is there any way to change the time of saving? I mean changing the time so that logs are saved, for example, only between 6:00 AM and 9:00 PM?
It would be great if you could send a screenshot.
Thank you!

Unfortunately, there is no such option when ingesting logs into Azure Log Analytics. You can only control the log categories when configuring logs for Azure Log Analytics via the UI or REST API.
If you are only interested in the data in a specified time range, you can write a query to fetch those logs in Azure Log Analytics.
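A minimal sketch of such a query, assuming the logs land in the AzureDiagnostics table (substitute your own table name):

// keep only records whose time of day falls between 6:00 AM and 9:00 PM (UTC)
AzureDiagnostics
| where hourofday(TimeGenerated) >= 6 and hourofday(TimeGenerated) < 21

This filters at query time only; all records are still ingested and retained as usual.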

Related

Which GCP Log Explorer query will show success message of data loading to BigQuery by Dataflow Job so that log sink to pub/sub can be created

I am running a Dataflow streaming job which reads data from a Pub/Sub topic and performs streaming inserts into BigQuery.
Once the data is loaded, I want to capture the success status from Logs Explorer so that an acknowledgement can be sent back to another Pub/Sub topic.
What Logs Explorer query can serve this purpose?
I tried to run the query below, but it did not help.
protoPayload.serviceData.jobCompletedEvent.job.jobConfiguration.load.destinationTable.datasetId="[REPLACE_WITH_YOUR_DATASET_ID]"
protoPayload.serviceData.jobCompletedEvent.job.jobConfiguration.load.destinationTable.projectId="REPLACE_WITH_YOUR_PROJECT_ID"
protoPayload.methodName="jobservice.jobcompleted"
protoPayload.state="DONE"
Please help.
Thanking you,
Santanu

Diagnostics from Kubernetes is not sending logs to the storage account

I need to archive my kube-audit logs in a storage account and retain them for 100 days. I followed the steps below, but I don't see the logs in my storage account. I see a folder $logs being created, but there is nothing in it. What am I doing wrong? When I query in Logs and run the kube-audit query (shown after the steps) for a timespan of less than 30 minutes, I can see the data there, but nothing shows up in the storage account.
Created a storage account in the portal.
Went to the Kubernetes service of my cluster, then selected Diagnostic settings.
Clicked on new diagnostic setting, selected kube-audit from the list of logs, set a retention period of 100 days, and selected the same storage account created in the first step from the dropdown.
Clicked on Save.
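The query I run in Logs looks roughly like this (the table and column names assume the default AzureDiagnostics schema that the AKS diagnostic setting writes to):

// kube-audit records ingested in the last 30 minutes
AzureDiagnostics
| where Category == "kube-audit"
| where TimeGenerated > ago(30m)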

BigQuery resource used every 12h without configuring it

I need some help understanding what happened in our cloud project: a BigQuery resource has been running every 12 hours without us configuring it. It also seems quite intense, because we were charged, on average, one dollar per day for the past month.
After checking in Logs Explorer, I saw several logs regarding the BigQuery resource.
I saw the email of one of our software guys in those logs. Since I removed him from our Firebase project, there are no more requests.
However, that person did not do or configure anything regarding BigQuery, so we are a bit lost here, which is why we are asking for help to investigate and understand what is going on.
Hope you will be able to help. Let me know if you need more information.
Thanks in advance
NB: I have not tried adding the software guy's email back yet. I wanted to see how it goes for the rest of the month.
The most likely causes I've observed for this in the wild:
A scheduled query was set up by a user.
A Data Studio dashboard was set up and configured to periodically refresh data in the background.
Someone set up a workflow that queries BigQuery, such as Cloud Composer or a Cloud Function.
It's also possible it's just something like a script running in a crontab on someone's machine. The audit log should have relevant details, such as the request origin, for cases where it's something running as part of a bespoke process.
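A quick way to narrow it down is to pull up the completed-job audit entries in Logs Explorer and check who submitted them; a minimal sketch of such a filter (this targets the legacy jobservice audit entries, so the exact fields may differ if your project emits the newer BigQueryAuditMetadata format):

resource.type="bigquery_resource"
protoPayload.methodName="jobservice.jobcompleted"

In the matching entries, protoPayload.authenticationInfo.principalEmail and protoPayload.requestMetadata.callerIp usually reveal which account and origin is submitting the 12-hour jobs.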

Why am I missing events in my Log Analytics query?

With the query below, I can only see data from hour 3 up to hour 8; all data for the other timeframes is missing.
The data is being generated by the Azure SQL Log Analytics configuration, where I can't see anything missing.
Any ideas?
Thanks a lot!
I don't think the events are missing for the other times, since you can see the events between hours 3 and 8.
You should check whether there are actually logs at the other times; Azure Log Analytics does not discard your logs.
According to MS support, I was hitting the daily cap limit, and the free tier wasn't available in my region.
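For anyone else hitting this, one way to check the daily billable ingestion volume against the cap is a query along these lines (a sketch; it assumes the standard Usage table, whose Quantity column is reported in MB):

// billable data ingested per day, most recent day first
Usage
| where IsBillable == true
| summarize IngestedMB = sum(Quantity) by bin(TimeGenerated, 1d)
| order by TimeGenerated desc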

What are BigQuery audit logs supposed to produce?

I have been looking at the new BigQuery Logging feature in the Cloud Platform Console, but it seems a bit inconsistent in what is being logged.
I can see some creates, deletes, inserts and queries. However, when I did a few queries and copy jobs through the web UI, they did not show up.
Should activity in the BigQuery web UI also be logged?
Does it depend on where the request comes from, e.g. console or API access?
There is no difference between console and API access. Activity in the BigQuery web UI should be logged.
Are you using the Cloud Log viewer to view these logs? In some cases, there might be a delay of a few seconds before these logs show up in the log viewer, and you might have to refresh the logs.
Logs containing information about queries are written to the Data Access log stream as opposed to the Admin Activity stream. Only users with Project Owner permissions can view the contents of the Data Access logs which might be why you aren't seeing these.
You should check with the project owner to confirm you have the right permissions to see these logs.
To view the Data Access audit logs, you must have the Cloud IAM role Logging/Private Logs Viewer or Project/Owner. I had a similar issue recently, and after enabling the Logging/Private Logs Viewer role I was able to see the logs.
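As a concrete example, once the role is granted, the BigQuery Data Access entries can be pulled up in the log viewer with a filter along these lines (the project ID is a placeholder):

logName="projects/[PROJECT_ID]/logs/cloudaudit.googleapis.com%2Fdata_access"
protoPayload.serviceName="bigquery.googleapis.com"

Query jobs run from the web UI should appear in this stream rather than in the Admin Activity log.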