How to ingest AWS ALB logs in S3 to Loki? - aws-application-load-balancer

I'm attempting to ingest AWS ALB logs into Loki, but I can't seem to find a smooth way to do this. AWS ALB logs end up in S3 for consumption through Athena or other tools, but Loki doesn't have a simple way to ingest logs from S3.
Is there a known way to accomplish this?

lambda-promtail recently gained the ability to ingest ALB logs from S3, in this merged pull request. The Lambda is triggered by an S3 bucket notification, and the repository includes example Terraform and CloudFormation configs for setting it up.
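This isn't lambda-promtail itself (which is written in Go), but a minimal Python sketch of the same mechanism: an S3 bucket notification triggers a Lambda, which fetches the gzipped ALB log object and pushes its lines to Loki's push API. The handler and endpoint names here are hypothetical placeholders.

```python
import gzip
import json
import time
import urllib.request

# Hypothetical endpoint; point this at your Loki deployment.
LOKI_PUSH_URL = "http://loki:3100/loki/api/v1/push"

def build_loki_payload(lines, labels):
    """Build a Loki push-API payload: one stream, one entry per log line.
    Loki expects nanosecond-epoch timestamps as strings."""
    ts = str(time.time_ns())
    return {"streams": [{"stream": labels,
                         "values": [[ts, line] for line in lines]}]}

def handler(event, context):
    """Invoked by the S3 bucket notification for each new ALB log object."""
    import boto3  # provided by the Lambda runtime
    s3 = boto3.client("s3")
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        # ALB access logs are stored gzip-compressed in S3.
        lines = gzip.decompress(body).decode("utf-8").splitlines()
        payload = build_loki_payload(lines, {"job": "alb", "bucket": bucket})
        req = urllib.request.Request(
            LOKI_PUSH_URL,
            data=json.dumps(payload).encode("utf-8"),
            headers={"Content-Type": "application/json"},
        )
        urllib.request.urlopen(req)
```

In the real lambda-promtail setup the Terraform/CloudFormation examples wire up the bucket notification and IAM permissions for you; this sketch only illustrates the data path.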

Related

AWS S3 bucket migration to Cloud Storage

I have to migrate all my AWS S3 buckets and their contents to GCP Cloud Storage using Terraform only. How can I do that?
I haven't found anything suitable, so I'm reaching out here.

S3 event notifications from on-prem to cloud

Looking for technology solutions to this problem: how to trigger S3 bucket notifications from an on-prem NetApp S3 store to the AWS cloud. We're looking at two approaches: StorageGRID event notifications and Qlik S3 options.
Current setup: AWS Direct Connect to on-prem, with NetApp S3-compatible storage on-prem. Events need to reach the AWS cloud.
Thanks

Processing AWS ELB access logs (from S3 bucket to InfluxDB)

We would like to process AWS ELB access logs and write them into InfluxDB, to be used for application metrics and monitoring (e.g. in Grafana).
We configured ELB to store access logs into S3 bucket.
What would be the best way to process those logs and write them to InfluxDB?
What we tried so far was mounting the S3 bucket to the filesystem using s3fs and then using the Telegraf agent for processing. But this approach has some issues: the s3fs mount feels like a hack, and all the files in the bucket are gzip-compressed and need to be unzipped before Telegraf can process them, which makes the task overcomplicated.
Is there any better way?
Thanks,
Oleksandr
Can you just install the telegraf agent on the AWS instance that is generating the logs, and have them sent directly to InfluxDB in real-time?
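If the logs must come from S3 (ELB writes its access logs there directly, so there is no instance to install Telegraf on), an alternative to the s3fs mount is a small script that downloads the gzipped objects, parses each access-log line, and emits InfluxDB line protocol. A minimal sketch, assuming the classic ELB access-log field layout; function names are illustrative:

```python
import gzip
import shlex

def parse_elb_line(line):
    """Parse one classic-ELB access-log line into a dict of useful fields."""
    f = shlex.split(line)  # shlex handles the quoted request and user-agent fields
    return {
        "time": f[0],
        "elb": f[1],
        "backend_processing_time": float(f[5]),
        "elb_status_code": int(f[7]),
        "sent_bytes": int(f[10]),
        "request": f[11],
    }

def to_influx_line(rec):
    """Render a parsed record as InfluxDB line protocol (measurement 'elb_access')."""
    return (
        "elb_access,elb={elb} "
        "backend_processing_time={backend_processing_time},"
        "elb_status_code={elb_status_code}i,sent_bytes={sent_bytes}i"
    ).format(**rec)

def process_object(s3_client, bucket, key):
    """Download one gzipped log object from S3 and yield InfluxDB lines."""
    body = s3_client.get_object(Bucket=bucket, Key=key)["Body"].read()
    for line in gzip.decompress(body).decode("utf-8").splitlines():
        yield to_influx_line(parse_elb_line(line))
```

The lines can then be POSTed to InfluxDB's write endpoint in batches; running this from an S3-notification-triggered Lambda avoids both the s3fs mount and the manual unzipping.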

How to set up a volume linked to S3 in Docker Cloud with AWS?

I'm running my Play! webapp with Docker Cloud (I could also use Rancher) and AWS, and I'd like to store all the logs in S3 (via a volume). Any ideas on how I could achieve that with minimal effort?
Use Docker volumes to store the logs on the host system.
Then use the AWS CLI to sync that local directory with an S3 bucket:
aws s3 sync /var/logs/container-logs s3://bucket/
Create a cron job to run it every minute or so.
Reference: s3 aws-cli
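The cron entry for the sync above might look like this (assuming the AWS CLI and its credentials are available to the cron user; paths are illustrative):

```shell
# m h dom mon dow  command
* * * * * aws s3 sync /var/logs/container-logs s3://bucket/ >> /var/log/s3-sync.log 2>&1
```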

How to specify different AWS credentials for EMR and S3 when using MRJob

I can specify which AWS credentials to use to create an EMR cluster via environment variables. However, I would like to run a MapReduce job against another AWS user's S3 bucket, for which they gave me a different set of AWS credentials.
Does MRJob provide a way to do this, or would I have to first copy the bucket into my own account so that the bucket and EMR credentials are the same?