Best ways to collect ELB access logs into CloudWatch

ELB access logs are stored in S3 buckets. I've been researching but have been unable to find any good examples of how to get these logs into CloudWatch.
Any suggestion is appreciated.
Greg

One option is to trigger an event whenever the S3 bucket receives new log files and send them directly to CloudWatch metrics or CloudWatch Events using a Lambda.
I'm not sure if it is the best approach. I'll investigate and update the answer.
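As a rough illustration of that idea, here is a minimal sketch of such a Lambda, assuming an S3 event notification is configured on the log bucket, the Lambda role has s3:GetObject and logs:PutLogEvents permissions, and the log group/stream names used below (which are hypothetical) already exist:

```python
# Minimal sketch only: forward ELB access log lines from S3 to CloudWatch Logs.
# Assumptions: S3 event notification triggers this Lambda, and the log group
# and stream below already exist (names are hypothetical).
import gzip
import time

import boto3

s3 = boto3.client("s3")
logs = boto3.client("logs")

LOG_GROUP = "/elb/access-logs"   # hypothetical log group name
LOG_STREAM = "from-s3"           # hypothetical log stream name


def handler(event, context):
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        # ALB access logs are gzipped; classic ELB logs are plain text.
        text = gzip.decompress(body).decode() if key.endswith(".gz") else body.decode()
        now = int(time.time() * 1000)
        events = [{"timestamp": now, "message": line}
                  for line in text.splitlines() if line]
        # put_log_events accepts at most 10,000 events per call, so chunk.
        for i in range(0, len(events), 10000):
            logs.put_log_events(logGroupName=LOG_GROUP,
                                logStreamName=LOG_STREAM,
                                logEvents=events[i:i + 10000])
    return {"processed": len(event["Records"])}
```

From there you could keep the raw lines in CloudWatch Logs and build metric filters on them, or have the Lambda emit aggregated metrics instead.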

Related

Send AWS X-Ray traces and segments to AWS CloudWatch

I'm looking for a solution to send logs from AWS X-Ray to AWS CloudWatch to help me with aggregation and metrics.
I was checking whether we can do this directly using the AWS X-Ray daemon, but it seems there is no way to do this from the X-Ray daemon.
The only solution I can see is to get the trace summaries from X-Ray using the AWS X-Ray SDK/API and forward them to other destinations like CloudWatch.
Is there a way to configure the AWS X-Ray daemon to send the logs directly to a CloudWatch log group?
Unfortunately, the X-Ray daemon only talks to the X-Ray endpoint via the PutTraceSegments API; it cannot emit metrics or logs to CloudWatch.
Alternatively, you can choose the ADOT (AWS Distro for OpenTelemetry) collector, which is an all-in-one agent:
https://aws-otel.github.io/docs/getting-started/collector
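To illustrate the SDK/API workaround mentioned in the question (this is not a daemon feature), here is a rough sketch that polls trace summaries with boto3 and writes them to a CloudWatch log group; the group and stream names are hypothetical and must already exist:

```python
# Sketch: pull recent X-Ray trace summaries and push them to CloudWatch Logs.
# Log group/stream names are hypothetical; no batching is done here, and
# put_log_events accepts at most 10,000 events per call.
import json
import time
from datetime import datetime, timedelta, timezone

import boto3

xray = boto3.client("xray")
logs = boto3.client("logs")

LOG_GROUP = "/xray/trace-summaries"   # hypothetical
LOG_STREAM = "summaries"              # hypothetical

end = datetime.now(timezone.utc)
start = end - timedelta(minutes=5)

events = []
paginator = xray.get_paginator("get_trace_summaries")
for page in paginator.paginate(StartTime=start, EndTime=end):
    for summary in page["TraceSummaries"]:
        events.append({
            "timestamp": int(time.time() * 1000),
            "message": json.dumps({
                "id": summary["Id"],
                "duration": summary.get("Duration"),
                "hasError": summary.get("HasError"),
            }),
        })

if events:
    logs.put_log_events(logGroupName=LOG_GROUP,
                        logStreamName=LOG_STREAM,
                        logEvents=events)
```

You would run something like this on a schedule (e.g. an EventBridge-triggered Lambda) rather than relying on the daemon.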

How to get the queries made to AWS/RDS in Grafana

I have a Grafana instance running in a Kubernetes cluster. I have set up a CloudWatch data source with the corresponding credentials and I can retrieve some metrics.
My specific need is to know whether I can retrieve the queries made to the DB (or an SQL digest), like RDS shows under Top SQL in the AWS Console (https://i.stack.imgur.com/H6sO4.png), or something similar, so I can check query performance.
Thank you so much in advance.
You can do this with the following steps:
First, enable query logging for Amazon RDS (e.g. PostgreSQL or MySQL).
Then, publish the logs to Amazon CloudWatch Logs.
Finally, on the Grafana side, add the AWS CloudWatch data source integration.
This way you will be able to see your queries in Grafana, as in the CloudWatch Logs integration example.
If you want, you can analyze/filter your RDS logs using CloudWatch Logs Insights in Grafana.
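As a sketch of the first two steps with boto3 (MySQL shown; parameter names and log types differ for PostgreSQL, and the parameter group and instance identifiers below are hypothetical):

```python
# Sketch: enable the MySQL slow query log and export it to CloudWatch Logs.
# Assumes the instance already uses the custom parameter group named below;
# identifiers are hypothetical.
import boto3

rds = boto3.client("rds")

# 1. Enable slow query logging via the custom parameter group.
rds.modify_db_parameter_group(
    DBParameterGroupName="my-mysql-params",   # hypothetical
    Parameters=[
        {"ParameterName": "slow_query_log", "ParameterValue": "1",
         "ApplyMethod": "immediate"},
        {"ParameterName": "long_query_time", "ParameterValue": "1",
         "ApplyMethod": "immediate"},
        {"ParameterName": "log_output", "ParameterValue": "FILE",
         "ApplyMethod": "immediate"},
    ],
)

# 2. Export the slow query log to CloudWatch Logs.
rds.modify_db_instance(
    DBInstanceIdentifier="my-db-instance",    # hypothetical
    CloudwatchLogsExportConfiguration={"EnableLogTypes": ["slowquery"]},
    ApplyImmediately=True,
)
```

Once the slow query log is flowing into its /aws/rds/instance/<instance>/slowquery log group, the Grafana CloudWatch data source can query it with Logs Insights.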

Kinesis Data Analytics SQL application is not writing logs into CloudWatch

I created a Kinesis Data Analytics application (using SQL) and attached the CloudWatch logging option.
When I run the application, I receive results that match my requirements.
Problem: my Kinesis Data Analytics application is not writing logs into CloudWatch.
Note: I used the CloudWatchFullAccess policy. The configured CloudWatch log group and stream name are also correct.
Please let me know how I can receive the logs.
Regards,
Siva

retrieving Apache log files from AWS Beanstalk

I know that Beanstalk's Snapshot Logs can give you a recent overview of the httpd/access_log files from among the EC2 instances under the ELB for that environment. But does anyone know a good way to get all the logs?
It's a production environment, so I want to do the processing elsewhere. But I don't want to (for obvious reasons) configure root sftp and go around collecting the files manually.
I think I had read something about configuring logging to S3?
In the "Configuration" tab for an Environment, under "Software Configuration", there is a checkbox for enabling log file rotation to S3. These are stored in an S3 bucket used specifically for Elastic Beanstalk.
You can feed your current logs to AWS CloudWatch Logs.
CloudWatch Logs will centralize all the logs of your infrastructure, with a neat way to search and process them as well as to create metrics and alarms based on your logs.
I have a guide on how to store AWS Beanstalk Symfony and Apache logs in CloudWatch Logs. It will help you get up and running fast, and then you can tweak it.
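Elastic Beanstalk can also stream the instance logs (including the Apache access logs on Apache-based platforms) straight to CloudWatch Logs; a sketch of turning that on, again with a hypothetical environment name:

```python
# Sketch: enable Elastic Beanstalk's built-in instance log streaming to
# CloudWatch Logs. Environment name is hypothetical.
import boto3

eb = boto3.client("elasticbeanstalk")

eb.update_environment(
    EnvironmentName="my-prod-env",   # hypothetical
    OptionSettings=[
        {"Namespace": "aws:elasticbeanstalk:cloudwatch:logs",
         "OptionName": "StreamLogs", "Value": "true"},
        {"Namespace": "aws:elasticbeanstalk:cloudwatch:logs",
         "OptionName": "RetentionInDays", "Value": "30"},
    ],
)
```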

Monitoring Amazon S3 logs with Splunk?

We have a large extended network of users that we track using badges. The total traffic is in the neighborhood of 60 Million impressions a month. We are currently considering switching from a fairly slow, database-based logging solution (custom-built on PHP—messy...) to a simple log-based alternative that relies on Amazon S3 logs and Splunk.
After using Splunk for some other analysis tasks, I really like it. But it's not clear how to set up a source like S3 with the system. It seems that remote sources require the Universal Forwarder to be installed, which is not an option there.
Any ideas on this?
Very late answer, but I was looking for the same thing and found a Splunk app that does what you want: http://apps.splunk.com/app/1137/. I have not tried it yet, though.
I would suggest logging JSON-preprocessed data to a document database, for example using Azure Queues or a similar service-bus messaging technology that fits your scenario, in combination with Azure DocumentDB.
So I would keep your database-based approach but modify it to use a schemaless, easy-to-scale, document-based DB.
I use http://www.insight4storage.com/ from AWS Marketplace to track my AWS S3 storage usage totals by prefix, bucket, or storage class over time; it also shows me previous-versions storage by prefix and per bucket. It has a setting to save the S3 data as Splunk-format logs that might work for your use case, in addition to its UI and web service API.
You can use the Splunk Add-on for AWS.
This is what I understand:
Create a Splunk instance. Use the hosted version, or use the on-premise AMI of Splunk to create an EC2 instance where Splunk is running.
Install the Splunk Add-on for AWS application on that EC2 instance.
Based on the input log type (e.g. CloudTrail logs, Config logs, generic logs, etc.), configure the add-on and supply the AWS account ID or IAM role and other parameters (a sketch of the S3 permissions such a role needs follows below).
The add-on will automatically poll the AWS S3 source and fetch the latest logs after a specified amount of time (defaults to 30 seconds).
For a generic use case (like ours), you can try configuring the Generic S3 input for Splunk.
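As an illustration of the IAM side of the steps above, here is a rough sketch of a read-only bucket policy for the role the add-on uses; the role name, policy name, bucket, and exact action list are assumptions, so check the add-on's documentation for the authoritative permissions:

```python
# Sketch only: attach a minimal S3 read policy to the IAM role the add-on
# assumes. Role name, policy name, and bucket are hypothetical, and the
# exact actions the Generic S3 input needs may differ.
import json

import boto3

iam = boto3.client("iam")

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:ListBucket", "s3:GetBucketLocation"],
            "Resource": "arn:aws:s3:::my-log-bucket",      # hypothetical
        },
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject"],
            "Resource": "arn:aws:s3:::my-log-bucket/*",    # hypothetical
        },
    ],
}

iam.put_role_policy(
    RoleName="splunk-addon-for-aws",       # hypothetical
    PolicyName="generic-s3-input-read",    # hypothetical
    PolicyDocument=json.dumps(policy),
)
```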