Upload files from a specific repository in Artifactory to S3

I want to upload files from a single repository (not the whole Artifactory instance) to Amazon S3 the moment a file is uploaded to Artifactory. Is there a way to do it?
I'm using a SaaS Artifactory installation, version 7.27.9.

At least one of the ways it can be done is using webhooks. You can create a webhook and configure it to notify a service you maintain when an artifact is uploaded (deployed) to the specific repository in Artifactory. Your service will receive an HTTP POST request each time a matching artifact is uploaded.
For more details, see Event: deployed in the Webhooks documentation.
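For example, a minimal receiver for those POST requests could copy each new artifact straight to S3. This is only a sketch in Python (Flask, requests, boto3): the instance URL, repository name, bucket, and token are placeholders, and the payload field names (data, repo_key, path) should be verified against the deployed-event payload you actually receive.

    # Sketch: receive Artifactory "deployed" webhook events and mirror the
    # artifact to S3. All names below are placeholders for your own values.
    import boto3
    import requests
    from flask import Flask, request

    app = Flask(__name__)
    s3 = boto3.client("s3")

    ARTIFACTORY = "https://myinstance.jfrog.io/artifactory"  # hypothetical
    REPO = "my-repo"                      # the one repository to mirror
    BUCKET = "my-artifact-mirror"         # hypothetical S3 bucket
    TOKEN = "<artifactory-access-token>"

    @app.route("/artifactory-hook", methods=["POST"])
    def artifactory_hook():
        event = request.get_json(force=True)
        data = event.get("data", {})
        # Ignore events from other repositories (the webhook's own repo
        # filter should already do this, but it's cheap to double-check).
        if data.get("repo_key") != REPO:
            return "", 204
        # Assumes "path" is the repo-relative path including the file name.
        path = data["path"]
        resp = requests.get(
            f"{ARTIFACTORY}/{REPO}/{path}",
            headers={"Authorization": f"Bearer {TOKEN}"},
            stream=True,
        )
        resp.raise_for_status()
        s3.upload_fileobj(resp.raw, BUCKET, f"{REPO}/{path}")
        return "", 200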

Related

How to use CloudWatch after Control Tower version 3.0 update

We have a multi-account setup where we deployed an organization-level CloudTrail via Control Tower in our root account.
With the newest version of Control Tower (3.0), AWS introduced organization-level CloudTrail: it deploys a baseline CloudTrail in each of our accounts and gives them the ability to send logs to a central CloudWatch location in our root account and to a central S3 location in our logging account.
We have concerns about granting access to the root account just so people can view the centralized CloudWatch logs.
I have tried setting up Athena in our logging account so that our team can view the logs in our logging bucket, but that feels like an unnecessary detour (see the sketch after this question).
What is the best way to still be able to access the root account's CloudWatch logs without having to be in the root account?
Any advice would be appreciated!
Thanks in advance!
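For reference, the Athena detour mentioned above usually amounts to pointing a table at the CloudTrail prefix in the logging bucket and querying it, for example with boto3. A minimal sketch, where the database, table, and bucket names are all hypothetical:

    # Sketch: run an Athena query against centralized CloudTrail logs
    # from the logging account. All names below are hypothetical.
    import boto3

    athena = boto3.client("athena")

    QUERY = """
    SELECT eventtime, eventname, useridentity.arn
    FROM cloudtrail_logs  -- a table defined over s3://org-cloudtrail-bucket/AWSLogs/...
    WHERE eventsource = 's3.amazonaws.com'
    LIMIT 50
    """

    resp = athena.start_query_execution(
        QueryString=QUERY,
        QueryExecutionContext={"Database": "default"},
        ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},
    )
    print(resp["QueryExecutionId"])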

OroCommerce application root on AWS

I have installed the OroCommerce Community Edition via the AWS Marketplace. I would like to generate the public and private keys for enabling Web API access to the back-end application. After SSH'ing into the instance, where would I find the root directory of the app? I need to add the keys to the /var directory within the application, as per this doc.
Thanks
The root directory is '/var/www/html/commerce'
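For the key generation itself, the linked doc uses openssl commands; a rough Python equivalent is below (the private.key/public.key file names and the var/ location are assumptions, so verify them against that doc):

    # Sketch: generate the RSA key pair for Oro's Web API (OAuth) in Python,
    # as an alternative to the openssl commands in the docs.
    # File names and the var/ location are assumptions -- check the linked doc.
    from cryptography.hazmat.primitives import serialization
    from cryptography.hazmat.primitives.asymmetric import rsa

    APP_ROOT = "/var/www/html/commerce"  # from the answer above

    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

    with open(f"{APP_ROOT}/var/private.key", "wb") as f:
        f.write(private_key.private_bytes(
            encoding=serialization.Encoding.PEM,
            format=serialization.PrivateFormat.TraditionalOpenSSL,
            encryption_algorithm=serialization.NoEncryption(),
        ))

    with open(f"{APP_ROOT}/var/public.key", "wb") as f:
        f.write(private_key.public_key().public_bytes(
            encoding=serialization.Encoding.PEM,
            format=serialization.PublicFormat.SubjectPublicKeyInfo,
        ))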

How to restrict public user access to S3 buckets or MinIO?

I have a question about MinIO and S3 policies. I am using a standalone MinIO server for my project. Here is the situation:
There is only one admin account, which receives files and uploads them to the MinIO server.
My users need to access just their own uploaded objects; one user is not supposed to see other people's objects publicly (e.g. by visiting a direct URL).
Admin users are allowed to see all objects under any circumstances.
1. How can I implement such policies for my project, considering I have my own database for user authentication, and how can I combine the two to authenticate users?
2. If not, what other options do I have here to ease the process?
Communicate with your storage through the application. Do the policy checks, authentication, and authorization in the app, store and fetch files to and from storage there, and build the proper response. As far as I know, this is the only way to restrict uploading and downloading of files with MinIO.
If you're using a framework like Laravel, its built-in S3 driver works perfectly with MinIO; otherwise it's just a matter of an HTTP call, since MinIO exposes an S3-compatible HTTP API.
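As a rough illustration of that pattern, here is a Python/Flask sketch that authorizes against your own database before streaming the object from MinIO through its S3-compatible API; the endpoint, credentials, bucket, and both helper functions are hypothetical placeholders:

    # Sketch: serve MinIO objects through the app so per-user checks run first.
    # Endpoint, credentials, bucket, and both helpers are hypothetical.
    import boto3
    from flask import Flask, Response, abort

    app = Flask(__name__)

    s3 = boto3.client(
        "s3",
        endpoint_url="http://127.0.0.1:9000",   # your standalone MinIO server
        aws_access_key_id="ADMIN_ACCESS_KEY",   # the single admin account
        aws_secret_access_key="ADMIN_SECRET_KEY",
    )
    BUCKET = "uploads"

    def current_user():
        # Resolve the caller from your own session/auth database (hypothetical).
        raise NotImplementedError

    def may_access(user, key):
        # Owners see their own objects; admins see everything (hypothetical).
        raise NotImplementedError

    @app.route("/files/<path:key>")
    def download(key):
        user = current_user()
        if user is None or not may_access(user, key):
            abort(403)
        obj = s3.get_object(Bucket=BUCKET, Key=key)
        return Response(
            obj["Body"].iter_chunks(),
            content_type=obj.get("ContentType", "application/octet-stream"),
        )

Because objects are only ever reachable through this route, the bucket itself never needs a public policy.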

Expose an expiring URL for a compressed file

The requirements are:
A technical user creates a DB backup from PostgreSQL (pg_dump)
The technical user uploads the file to a bucket in the closest AWS region
The technical user gets a URL that should expire after a week
The technical user sends the URL to 2-4 people with little IT knowledge: the non-technical users
A non-technical user downloads the file via the temporary URL and places it in the local directory that backs a Docker container bind volume
Constraints:
The AWS technical user doesn't have permissions to generate IAM access keys or secret keys
AWS S3 must be used, as the organization uses AWS and the strategic goal is to have everything centralized in AWS infrastructure
I am following this documentation about presigned object URLs
What do you suggest?
I suggest creating an IAM user and consuming its credentials from a small server-side application. AWS already provides SDKs for connecting from pretty much any programming language. Personally I use Symfony, which has bundles to connect to S3 directly. From my perspective, I'd recommend building a simple interface to upload the backup and granting access to people with roles according to your needs.
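A minimal sketch of the URL part in Python with boto3 (bucket and key names are placeholders, and the IAM user's credentials are assumed to come from the environment). Note that SigV4 presigned URLs max out at 7 days (604,800 seconds), which conveniently matches the weekly expiry requirement:

    # Sketch: upload the pg_dump file and return a presigned URL.
    # Bucket/key are placeholders; credentials come from the environment.
    import boto3

    s3 = boto3.client("s3", region_name="eu-west-1")  # your closest region

    def share_backup(local_path, bucket="db-backups", key="backup.dump"):
        s3.upload_file(local_path, bucket, key)
        # 604800 s = 7 days, the maximum lifetime of a SigV4 presigned URL.
        return s3.generate_presigned_url(
            "get_object",
            Params={"Bucket": bucket, "Key": key},
            ExpiresIn=604800,
        )

    print(share_backup("/tmp/backup.dump"))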

How to give access to S3 files to authenticated website users

I have a .NET Core web app; when someone uploads a file to a post in the app, I store it in an S3 bucket. I don't want the S3 bucket to be publicly accessible, I only want logged-in users to be able to download files from it.
Is the recommended solution for this creating temporary links directly to the s3 files when they are requested through the site by authenticated users? I don’t want these links to be accessible later by non-authenticated users.
Or should I download the file to the web server then stream it to the user, in effect doubling my bandwidth usage?
You should generate presigned links from the .NET backend; they expire after a given time, so even if a client copies one and shares it, it won't work for long.
Try this from Amazon documentation:
https://docs.aws.amazon.com/AmazonS3/latest/dev/ShareObjectPreSignedURLDotNetSDK.html
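As a rough sketch of that pattern (in Python with boto3 for brevity; the linked page shows the .NET SDK equivalent), where the user object and its checks stand in for your app's own authentication logic:

    # Sketch: only authenticated users get a short-lived presigned URL.
    # user.is_authenticated / user.may_access are stand-ins for your app's logic.
    import boto3

    s3 = boto3.client("s3")

    def download_url_for(user, bucket, key):
        if not user.is_authenticated or not user.may_access(key):
            raise PermissionError("not allowed")
        # Short expiry: even if the link leaks, it dies within minutes,
        # and no file bandwidth ever flows through your web server.
        return s3.generate_presigned_url(
            "get_object",
            Params={"Bucket": bucket, "Key": key},
            ExpiresIn=300,  # 5 minutes
        )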