API/CLI to get all the services (EC2, S3, RDS, etc.) my account is using? - aws-java-sdk

I tried to look around but couldn't find enough details on how to programmatically get a list of all the services that my AWS account is currently using.
Something like: EC2, S3, DynamoDB, etc.
Is there a Java API or AWS CLI solution that I can use?

AFAIK there is no AWS service that exposes that information programmatically. However, if you're after this information for cost purposes, there is the AWS Budgets service, which allows you to set up budgets for your AWS resources - it has a public API that is accessible from Java and the CLI.
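As a minimal sketch with the AWS SDK for Java (the aws-java-sdk-budgets module), assuming a budget has already been created in the account and that the account id below is a placeholder you replace with your own:

```java
import com.amazonaws.services.budgets.AWSBudgets;
import com.amazonaws.services.budgets.AWSBudgetsClientBuilder;
import com.amazonaws.services.budgets.model.Budget;
import com.amazonaws.services.budgets.model.DescribeBudgetsRequest;
import com.amazonaws.services.budgets.model.DescribeBudgetsResult;

public class ListBudgetsSketch {
    public static void main(String[] args) {
        // Uses the default credential/region chain (env vars, ~/.aws/credentials, etc.).
        AWSBudgets budgets = AWSBudgetsClientBuilder.defaultClient();

        // "123456789012" is a placeholder -- substitute your own account id.
        DescribeBudgetsResult result = budgets.describeBudgets(
                new DescribeBudgetsRequest().withAccountId("123456789012"));

        // Print the name and limit of each budget configured for the account.
        for (Budget b : result.getBudgets()) {
            System.out.println(b.getBudgetName() + " -> " + b.getBudgetLimit());
        }
    }
}
```

The CLI equivalent is `aws budgets describe-budgets --account-id 123456789012`. Keep in mind that Budgets only reports the budgets you have configured, not an inventory of every service in use.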

Related

Is there any way I can build a RESTful web service to hit Amazon EMR

I am trying to build a backend service that contains an endpoint. This endpoint is supposed to hit an Amazon EMR cluster and run a Scala script with the provided params. I know we can build services with AWS Lambda and Amazon API Gateway, but is there any way to do the same with EMR?

Cloud and REST API based Recording and storing to Google bucket or AWS S3

I need to do cloud-based recording. I need to take the video+audio streams of all clients and then push them to AWS S3, a Google Cloud bucket, or Agora storage itself. I need to do it through an API since I have some criteria, and I also need to organize the uploaded data into specific folder names.
I did not see any specific function in the Agora service to do so. Should I take a copy of the local stream and handle it on my own? If so, I hope it does not get in the way of Agora functionality. Please share a sample project if one exists. Just to emphasize: I am not looking for the on-premise SDK; I need to do it through REST. I use Angular 8. I could write my own REST server (to deal with the cloud of my choice) if really needed.
Background study done:
In the Agora.io documentation, I see a lot of material about on-premise recording but no reference to cloud-based recording. I checked the Agora.io documentation and also the Angular code that includes the Agora service component. The Angular sample works fine, but I did not understand how to take the stream and upload it to my cloud storage. I am concerned that doing so might impact Agora's streaming and playback. To handle this, I thought you might have a specific guideline or library.
Thanks
Agora is offering a new Cloud Recording API, currently in beta, that can connect to the channel and record the streams (configured via a RESTful API) directly to your S3 bucket; GCP is not currently supported.
This is an add-on feature, so it's not enabled by default and cannot be enabled through the dashboard; it needs to be enabled by the Agora team. Please join the Agora Community Slack and send me a direct message with the App ID for which you wish to enable this feature, and I can get you into the beta program.
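As a rough illustration only (the exact endpoint, authentication, and request fields should be taken from the Cloud Recording documentation once the feature is enabled for your App ID), the initial "acquire" call might look something like this in plain Java 11 - the URL path, channel name, uid, and customer key/secret below are all placeholders:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class AgoraAcquireSketch {
    public static void main(String[] args) throws Exception {
        // All placeholders -- use your own App ID and RESTful API customer key/secret.
        String appId = "YOUR_APP_ID";
        String customerKey = "YOUR_CUSTOMER_KEY";
        String customerSecret = "YOUR_CUSTOMER_SECRET";

        // Agora's RESTful APIs use HTTP Basic authentication with the customer key/secret.
        String auth = Base64.getEncoder().encodeToString(
                (customerKey + ":" + customerSecret).getBytes(StandardCharsets.UTF_8));

        // The "acquire" call asks for a resource ID before a recording can be started;
        // the exact path and body fields should be taken from the Cloud Recording docs.
        String body = "{\"cname\":\"demoChannel\",\"uid\":\"527841\",\"clientRequest\":{}}";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://api.agora.io/v1/apps/" + appId + "/cloud_recording/acquire"))
                .header("Content-Type", "application/json")
                .header("Authorization", "Basic " + auth)
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}
```

From there, a subsequent "start" call is where the storage details (bucket, region, credentials) for your S3 bucket go; again, defer to the docs the Agora team shares once the feature is enabled.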

Same domain name for Lambda function API endpoints (backend) and for frontend

How can I use the same domain name for the frontend and for Lambda function endpoints with the Serverless Framework?
I am using React for the frontend design, S3 for frontend hosting, and AWS DynamoDB with the Lambda functions.
We do the same in our systems; AWS solved this a long time ago.
The service is CloudFront, which lets you connect multiple origins, including external origins outside of the AWS cloud.
In practice, you put one CloudFront distribution on your domain, point the default behavior at the S3 bucket hosting the React build, and add a cache behavior with a path pattern such as /api/* that routes to the Lambda/API Gateway origin.
I created a simple architecture diagram to help you visualize the setup.
Hope it helps.

Integrate OpenStack Swift storage with Ceph

I am new to software-defined storage. I was looking at possible ways to integrate different object storage implementations, such as AWS S3 and OpenStack Swift, with Ceph.
I am wondering if I can use Ceph API calls to store objects in OpenStack Swift.
The document here specifies that I can use Swift APIs to store objects in Ceph (OSDs), but is it possible the other way around?
Thanks for any help in advance.
Exactly. The Ceph RADOS Gateway is a full replacement for Swift object storage, with some missing functionality as described here:
http://docs.ceph.com/docs/master/radosgw/swift/
To make it clear: in order to store objects in a Ceph cluster, you just need to set up the RADOS Gateway, and that's it. You will then have two options: operate using either the Swift API or the S3 API.
The RADOS Gateway also supports the S3 API for storing objects. You can even configure RGW so that containers are accessed like S3 bucket subdomains, e.g. #bucketname#.s3.example.com
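If you go the S3 API route, the standard AWS SDK can simply be pointed at the RGW endpoint. A minimal sketch with the AWS SDK for Java, assuming a hypothetical RGW endpoint and an access/secret key pair created for an RGW user:

```java
import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.client.builder.AwsClientBuilder.EndpointConfiguration;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

public class RgwS3Sketch {
    public static void main(String[] args) {
        // Hypothetical RGW endpoint and credentials -- replace with your own.
        String rgwEndpoint = "http://rgw.example.com:7480";
        BasicAWSCredentials creds = new BasicAWSCredentials("RGW_ACCESS_KEY", "RGW_SECRET_KEY");

        // Point the standard S3 client at the RADOS Gateway instead of AWS.
        // Path-style access avoids needing the bucket-subdomain DNS setup.
        AmazonS3 s3 = AmazonS3ClientBuilder.standard()
                .withEndpointConfiguration(new EndpointConfiguration(rgwEndpoint, "default"))
                .withCredentials(new AWSStaticCredentialsProvider(creds))
                .withPathStyleAccessEnabled(true)
                .build();

        s3.createBucket("demo-bucket");
        s3.putObject("demo-bucket", "hello.txt", "stored via RGW's S3 API");
        s3.listObjects("demo-bucket").getObjectSummaries()
                .forEach(o -> System.out.println(o.getKey()));
    }
}
```

If you do configure DNS-style bucket access on RGW (the #bucketname#.s3.example.com pattern above), path-style access can be turned off.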

Can Rackspace Cloud Files be accessed using S3 (AWS) APIs?

I learned that Rackspace Cloud Files is based on the OpenStack Object Storage service (Swift).
Since OpenStack allows configuring/manipulating object storage using the S3 API through swift3:
http://docs.openstack.org/trunk/openstack-object-storage/admin/content/configuring-openstack-object-storage-with-s3_api.html
I am wondering whether Rackspace Cloud Files provides S3 API support as well. I have a client written for Amazon Web Services using the S3 RESTful API, so I was hoping to reuse it for Rackspace Cloud Files.
The S3 plugin for Swift is not deployed as part of Rackspace Cloud Files (most production deployments of OpenStack don't deploy it by default). However, if you want more flexibility in the app, you can use a cross-cloud toolkit such as libcloud (Python), fog (Ruby), jclouds (Java), or pkgcloud (Node.js). This gives you a simpler abstraction and lets you support multiple providers within your application.
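For example, a minimal sketch using jclouds' portable BlobStore abstraction, assuming the rackspace-cloudfiles-us provider artifact is on the classpath; the credentials and container name below are placeholders:

```java
import org.jclouds.ContextBuilder;
import org.jclouds.blobstore.BlobStore;
import org.jclouds.blobstore.BlobStoreContext;
import org.jclouds.blobstore.domain.Blob;

public class CloudFilesJcloudsSketch {
    public static void main(String[] args) {
        // Placeholders -- replace with your Rackspace username and API key.
        String username = "myRackspaceUser";
        String apiKey = "myRackspaceApiKey";

        // Build a portable BlobStore view against the Cloud Files provider;
        // swapping the provider id (e.g. to "aws-s3") keeps the rest of the code the same.
        try (BlobStoreContext context = ContextBuilder.newBuilder("rackspace-cloudfiles-us")
                .credentials(username, apiKey)
                .buildView(BlobStoreContext.class)) {

            BlobStore blobStore = context.getBlobStore();
            blobStore.createContainerInLocation(null, "demo-container");

            // Upload a small text object to the container.
            Blob blob = blobStore.blobBuilder("hello.txt")
                    .payload("stored via the portable jclouds BlobStore API")
                    .build();
            blobStore.putBlob("demo-container", blob);
        }
    }
}
```

The point of such a toolkit is that the BlobStore calls stay the same whether the provider underneath is Cloud Files or S3, so you don't depend on the S3 plugin being deployed.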