Is there any way I can build a RESTful web service to hit the Amazon EMR API?

I am trying to build a backend service with a single endpoint. This endpoint is supposed to hit an Amazon EMR cluster and run a Scala script with the provided params. I know we can build such services with AWS Lambda and Amazon API Gateway, but is there any way to do the same with EMR?
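You don't run the web service on EMR itself; instead your endpoint can submit a step to an existing cluster through the EMR API. A minimal sketch with boto3 (the cluster ID, S3 JAR path, and main class are placeholders, and the Scala code is assumed to be packaged as a Spark JAR):

```python
def build_spark_step(jar_path, main_class, args):
    """Build the EMR step definition for spark-submit (pure data, no AWS call)."""
    return {
        "Name": "run-scala-job",
        "ActionOnFailure": "CONTINUE",
        "HadoopJarStep": {
            "Jar": "command-runner.jar",
            "Args": ["spark-submit", "--class", main_class, jar_path] + list(args),
        },
    }

def submit_step(cluster_id, step):
    """Submit the step to a running EMR cluster; needs boto3 and AWS credentials."""
    import boto3  # lazy import so the helper above stays dependency-free
    emr = boto3.client("emr")
    resp = emr.add_job_flow_steps(JobFlowId=cluster_id, Steps=[step])
    return resp["StepIds"][0]
```

Wrapped in a Lambda handler behind API Gateway, `submit_step` gives you exactly the endpoint described: the request params become the `args` of the step.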

Related

Google Cloud Run API - accessing endpoint internally

Bear with me, I am still on training wheels with GCP
The scenario:
1. I have a Cloud Run instance serving an API (to be consumed internally)
2. A middleware running on a Compute Engine instance serving an API, which consumes the API served by (1)
So (2) needs to access (1).
I'm trying to figure out how to discover the internal DNS name or IP, so that the middleware (2) can be configured to access the Cloud Run-served API (1).
The intention is to create templates (environment config files) so that I can eventually automate the deployment of all layers.
With Cloud Run, you can't know the URL of the service before the first deployment. The pattern is
https://<serviceName>-<projectHash>-<regionCode>.a.run.app
You can also optionally have a tag at the beginning, but that's not important here.
The bad part is that the project hash can't be calculated before the deployment (or at least I don't know the hash formula). It's therefore impossible to register the URL in a DNS in advance of the deployment.
However, if you use tools like terraform, you can get, as output, the URL of the service after the deployment and then register it in your DNS, as CNAME (Cloud Run is a managed service, you haven't a static IP).
You can also use the Cloud Run API to request the list of services in a project, pick the service that you want, and read its URL from the response (a GET request to the services API).
A last word before you hit a wall: you talk about an internal endpoint. A Cloud Run URL is public, and public only. Your middleware therefore needs internet access to be able to reach Cloud Run.
If your middleware is deployed in the same project, you can set the Cloud Run service's ingress to internal, so that only resources from the VPCs of the current project can reach the service.
But the URL is still public, and you need internet access to resolve and reach it.
If you don't want to be bothered with service discovery and constructing the full URL, you can use runsd, a solution I developed for Cloud Run that lets you call a service by its name (e.g. http://hello).

Calling REST API in Snowflake

I have a use case where APIs are published internally, not using AWS API Gateway or Azure; the APIs are deployed on-prem. How can I use Snowflake external or internal functions to load data by accessing these internally published REST APIs?

Invoke AWS Step Function from AWS Lambda Proxy (AWS API Gateway) .NET SAM Template

I have an AWS API Gateway deployed using a SAM template. The API request comes to the proxy Lambda function. From there I need to call an AWS Step Function, which invokes multiple Lambda functions. I have multiple solutions, following the microservices pattern, and need to call one microservice from another. Each microservice is in a separate solution, and the startup project is a class library (.NET Core 2.1). I'm using a SAM template and deploying via the AWS Toolkit for Visual Studio, not using Fargate containers or WebApi projects. I need to coordinate between API Gateways.
In your question you say: "The API request comes to the Proxy Lambda Function. From there I need to call a AWS Step Function". It's simple: set up an API Gateway that uses a Lambda as an authorizer, and in the method execution call the Step Function. Inside the Step Function you then manage your flow and the Lambdas you need to execute.
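The question's code is .NET, but the StartExecution call looks the same in any SDK; here is a minimal sketch with boto3 (the state machine ARN and input fields are placeholders). API Gateway can also invoke StartExecution directly through an AWS service integration, which is what the answer above describes:

```python
import json

def build_execution_input(order_id, payload):
    """Build the JSON input document for the execution (pure data)."""
    return json.dumps({"orderId": order_id, "payload": payload})

def start_workflow(state_machine_arn, execution_input):
    """Kick off the Step Function from the proxy Lambda; needs boto3/credentials."""
    import boto3  # lazy import so the builder above stays dependency-free
    sfn = boto3.client("stepfunctions")
    resp = sfn.start_execution(
        stateMachineArn=state_machine_arn,
        input=execution_input,
    )
    return resp["executionArn"]
```

The proxy Lambda would call `start_workflow` with the ARN from an environment variable set by the SAM template.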

API/CLI to get all the services(ec2, s3, rds..etc) my account is using?

I tried to look around but couldn't find enough details on how to programmatically get a list of all the services my AWS account is currently using.
Something like: EC2, S3, DynamoDB..etc
Is there a Java API or AWS CLI solution that I can use?
AFAIK there is no AWS service that exposes that information programmatically. However, if you're after this information for cost purposes, there is the AWS Budgets service, which allows you to set up budgets for your AWS resources; it has a public API that is accessible from Java and the CLI.
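As a sketch of reading from that Budgets API with boto3 (the account ID is a placeholder; the actual call needs boto3 and credentials, so it's kept in its own function):

```python
def budget_names(page):
    """Extract budget names from one DescribeBudgets response page (pure helper)."""
    return [b["BudgetName"] for b in page.get("Budgets", [])]

def list_budget_names(account_id):
    """Fetch all budget names for the account via the AWS Budgets API."""
    import boto3  # lazy import so the helper above stays dependency-free
    client = boto3.client("budgets")
    names, token = [], None
    while True:
        kwargs = {"AccountId": account_id}
        if token:
            kwargs["NextToken"] = token
        page = client.describe_budgets(**kwargs)
        names.extend(budget_names(page))
        token = page.get("NextToken")
        if not token:
            return names
```

Each budget's cost filters then tell you which services it tracks, which is as close as this API gets to "what am I using".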

Can Rackspace Cloud Files be accessed using S3 (AWS) APIs?

I got to know that Rackspace Cloud Files is based on OpenStack Object Storage service (Swift).
As OpenStack allows configuring/manipulating object storage using S3 APIs through swift3
http://docs.openstack.org/trunk/openstack-object-storage/admin/content/configuring-openstack-object-storage-with-s3_api.html
I am wondering whether Rackspace Cloud Files provides S3 API support as well. I have a client written for Amazon Web Services using the S3 RESTful APIs, so I was thinking of reusing it for Rackspace Cloud Files.
The S3 plugin for Swift is not deployed as part of Rackspace Cloud Files (most production deployments of OpenStack don't enable it by default). However, if you want better flexibility in the app, you can use a cross-cloud toolkit such as libcloud (Python), fog (Ruby), jclouds (Java), or pkgcloud (Node.js). That way you can use a simpler abstraction and support multiple providers within your application.
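A sketch of that abstraction with apache-libcloud (username, API key, and region are placeholders; the driver construction needs the libcloud package and real credentials, so it's isolated in its own function):

```python
def connect_cloudfiles(username, api_key, region="ord"):
    """Return a libcloud storage driver for Rackspace Cloud Files."""
    from libcloud.storage.types import Provider  # lazy import; needs apache-libcloud
    from libcloud.storage.providers import get_driver
    cls = get_driver(Provider.CLOUDFILES)
    return cls(username, api_key, region=region)

def container_names(driver):
    """List container names through the provider-neutral driver interface."""
    return [c.name for c in driver.list_containers()]
```

Swapping `Provider.CLOUDFILES` for `Provider.S3` (with AWS credentials) leaves `container_names` unchanged, which is the portability the answer is pointing at.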