Spring Cloud Sleuth: propagating span context

I am trying to add extra fields to the span context as below, with Spring Boot 2.0 and Sleuth 2.0.RC1/2.0.RC2:
Span initialSpan = _tracer.nextSpan().name("span").start();
try (Tracer.SpanInScope ws = _tracer.withSpanInScope(initialSpan)) {
    ExtraFieldPropagation.set("foo", "bar");
    ExtraFieldPropagation.set("UPPER_CASE", "someValue");
}
After setting the fields, trying to retrieve one returns null:
ExtraFieldPropagation.get("foo")
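In Sleuth 2.x, extra propagation fields are dropped unless they are declared up front, so set/get silently returns null for undeclared keys. Presumably whitelisting the field names in configuration fixes this; a sketch of an application.properties entry (property name as in the Sleuth 2.x documentation):

```properties
# Declare the extra fields so ExtraFieldPropagation can set and read them
spring.sleuth.propagation-keys=foo,UPPER_CASE
```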


Spring Webflux: I want to send data to kafka after saving to database

I'm trying to send data to Kafka after my database operation succeeds.
I have a POST endpoint which stores the data in MongoDB and returns the whole object along with the MongoDB UUID.
Now I want to perform an additional task: if the data is successfully saved in MongoDB, I should call my Kafka producer method and send the data.
I'm not sure how to do it.
Current Codebase
public Mono<?> createStock(StockDTO stockDTONBody) {
    // logger.info("Received StockDTO body: {}, ", stockDTONBody);
    Mono<StockDTO> stockDTO = mongoTemplate.save(stockDTONBody);
    // HERE I WANT TO SEND TO KAFKA IF DATA IS SAVED TO MONGO.
    return stockDTO;
}
Thanks @Alex for the help. Adding my answer for others.
public Mono<?> createStock(StockDTO stockDTONBody) {
    // logger.info("Received StockDTO body: {}, ", stockDTONBody);
    Mono<StockDTO> stockDTO = mongoTemplate.save(stockDTONBody);
    // =============== Kafka code added ======================
    return stockDTO.flatMap(data -> sendToKafka(data, "create"));
}

public Mono<?> sendToKafka(StockDTO stockDTO, String eventName) {
    Map<String, Object> data = new HashMap<>();
    data.put("event", eventName);
    data.put("campaign", stockDTO);
    template.send(kafkaTopicName, data.toString()).log().subscribe();
    System.out.println("sending to Kafka " + eventName + data);
    return Mono.just(stockDTO);
}
This can result in dual writes: if your data is saved in Mongo and something goes wrong while publishing to Kafka, the data will be missing in Kafka. Instead, you should use change data capture (CDC) for this. MongoDB provides change streams, which can be used here, or there are open-source Kafka connectors which you can configure to listen to MongoDB's changelog and stream those changes to Kafka.
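Setting the CDC caveat aside, the sequencing pattern in the answer above (run the publish step only after the asynchronous save completes, then hand back the saved value) can be sketched without Spring or Kafka. saveToDb and publish below are hypothetical stand-ins for mongoTemplate.save and the Kafka send:

```java
import java.util.concurrent.CompletableFuture;

public class SaveThenPublish {
    // Stand-in for the async database save; returns the "saved" entity.
    static CompletableFuture<String> saveToDb(String stock) {
        return CompletableFuture.supplyAsync(() -> stock + "-saved");
    }

    // Stand-in for the Kafka send; a side effect with no useful return value.
    static CompletableFuture<Void> publish(String event, String payload) {
        return CompletableFuture.runAsync(() ->
                System.out.println("event=" + event + " payload=" + payload));
    }

    // Equivalent of stockDTO.flatMap(data -> sendToKafka(data, "create")):
    // publish only after the save succeeds, then return the saved value.
    static CompletableFuture<String> createStock(String body) {
        return saveToDb(body)
                .thenCompose(saved -> publish("create", saved).thenApply(v -> saved));
    }
}
```

Note that, unlike the subscribe() call inside sendToKafka above, this composes the publish into the returned chain, so the caller only completes once both steps have run.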

Restricting the OPTIONS method in Javalin

We have Kotlin code like the following. I am trying to disable the OPTIONS method for the APIs using Javalin (3.12.0), but it is blocking all the other methods, like GET and POST, as well. What am I missing here?
val app = Javalin.create {
    it.defaultContentType = "application/json"
    it.enableWebjars()
    it.addStaticFiles("", Location.CLASSPATH)
    it.enableCorsForAllOrigins()
    it.dynamicGzip = true
}
app.options("/*") { ctx -> ctx.status(405) }
app.routes {
    path("/auth") {
        post("/login") {
            Auth.doLogin(it)
        }
        get("/metrics") {
            val results = getData()
            it.json(results)
        }
    }
}
Also, there are two questions:
1. I want to implement a rate limit on the GET APIs of 20 requests per hour, using the code below. How can I achieve it?
app.get("/") { ctx ->
    RateLimit(ctx).requestPerTimeUnit(5, TimeUnit.MINUTES) // throws if rate limit is exceeded
    ctx.result("Hello, rate-limited World!")
}
2. How can I keep the Jetty server version from being displayed when an API call is made?
For Jetty...
There is only one rate-limit concept in Jetty itself, org.eclipse.jetty.server.AcceptRateLimit, added as a Jetty container LifeCycle bean on the ServerConnector. It cannot adjust rates for specific request endpoints, only for the entire connector.
If you want specific endpoint rates, then the org.eclipse.jetty.servlets.QoSFilter is the way that's done with Jetty.
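At the Javalin level, the RateLimit util already shown in the question should presumably just be re-parameterized for the desired rate, e.g. RateLimit(ctx).requestPerTimeUnit(20, TimeUnit.HOURS) for 20 requests per hour. Under the hood this kind of limiter is a fixed-window counter per client; a framework-free sketch (class and method names are made up for illustration):

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical fixed-window rate limiter: at most `limit` requests per
// client key within each window of `windowMillis` milliseconds.
public class FixedWindowRateLimiter {
    private final int limit;
    private final long windowMillis;
    // key -> { window start time, request count in window }
    private final Map<String, long[]> state = new HashMap<>();

    public FixedWindowRateLimiter(int limit, long windowMillis) {
        this.limit = limit;
        this.windowMillis = windowMillis;
    }

    /** Returns true if the request is allowed, false once the limit is hit. */
    public synchronized boolean tryAcquire(String key, long nowMillis) {
        long[] s = state.computeIfAbsent(key, k -> new long[] {nowMillis, 0});
        if (nowMillis - s[0] >= windowMillis) { // window expired: start a new one
            s[0] = nowMillis;
            s[1] = 0;
        }
        if (s[1] >= limit) {
            return false;
        }
        s[1]++;
        return true;
    }
}
```

A handler would construct one limiter (e.g. 20 requests per 3,600,000 ms), call tryAcquire with the client IP as the key, and respond with 429 when it returns false.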
The org.eclipse.jetty.server.HttpConfiguration for the org.eclipse.jetty.server.ServerConnector contains the controls to enable/disable the server announcement.
See
HttpConfiguration.setSendServerVersion(boolean)
HttpConfiguration.setSendXPoweredBy(boolean)
HttpConfiguration.setSendDateHeader(boolean)
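Wiring the two "send" setters into an embedded server is a connector-level configuration; a minimal embedded-Jetty sketch (the port and helper method are illustrative, and with Javalin you would apply this to the Server instance Javalin is configured to use):

```java
import org.eclipse.jetty.server.HttpConfiguration;
import org.eclipse.jetty.server.HttpConnectionFactory;
import org.eclipse.jetty.server.Server;
import org.eclipse.jetty.server.ServerConnector;

public class QuietJetty {
    public static Server create(int port) {
        Server server = new Server();
        HttpConfiguration config = new HttpConfiguration();
        config.setSendServerVersion(false); // no "Server: Jetty(...)" response header
        config.setSendXPoweredBy(false);    // no "X-Powered-By" header
        ServerConnector connector =
                new ServerConnector(server, new HttpConnectionFactory(config));
        connector.setPort(port);
        server.addConnector(connector);
        return server;
    }
}
```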

Kotlin reactor MultiPart processing in RestController

I'm trying to convert a Java Spring Reactive RestController to Kotlin coroutine based RestController. Here's the signature of the Java RestController.
@PostMapping(path = "/{db}/manifest/{asset_id}")
Mono<PushResult> pushManifest(
        @PathVariable(name = "db") String db,
        @PathVariable(name = "asset_id") String assetId,
        @RequestPart(name = "manifest") Mono<FilePart> manifest,
        @RequestPart(name = "head", required = false) Mono<FilePart> head
) {
}
While I can easily change Mono<FilePart> to just FilePart, reading the content of a FilePart means dealing with Flux<DataBuffer>, whereas in Kotlin it would be preferable to always deal with Flow<DataBuffer>.
Is there a Kotlin equivalent for dealing with multipart requests in Spring Reactive that uses the Kotlin native reactive types, such as Flow?

Autofac - share instance in scope and child scopes

I have an ASP.NET Core 2.2 application, and I'd like to configure a service to be a "singleton" for a request. Using InstancePerLifetimeScope works as long as you don't create child scopes, but some processes run in child scopes created from the request's scope.
Using InstancePerRequest (which is basically what I need) doesn't work in ASP.NET Core 2.2.
Did anyone encounter this situation and found a solution?
using (var scope1 = container.BeginLifetimeScope())
{
    var w1 = scope1.Resolve<Worker>(); // should resolve worker 1
    using (var scope2 = scope1.BeginLifetimeScope())
    {
        var w2 = scope2.Resolve<Worker>(); // should resolve the same worker as w1
    }
}
using (var scope3 = container.BeginLifetimeScope())
{
    var w3 = scope3.Resolve<Worker>(); // should resolve another worker
}
It looks like you want to register Worker as InstancePerMatchingLifetimeScope and then tag your outer scope with the same ID you registered Worker with.
See the docs on tagged scopes: https://autofaccn.readthedocs.io/en/latest/lifetime/working-with-scopes.html#tagging-a-lifetime-scope

Spring webflux - Validate Query param & Path param

Is there any better way to do query param validation in Spring webflux handler?
final Optional<String> productIdParam = request.queryParam("product_id");
int productId = 0;
if (productIdParam.isPresent()) {
    productId = Integer.parseInt(productIdParam.get());
}
No, not with the "functional" (router/handler) definition.
You could switch to the annotation-based definition with controllers, which gives you validation of @PathVariable, @RequestParam and the @RequestBody out of the box.
Take a look at the Spring WebFlux docs.
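As an aside, the Optional handling in the question can at least be collapsed into a single expression; a plain-Java sketch (the helper name is hypothetical):

```java
import java.util.Optional;

public class QueryParamParsing {
    // Hypothetical helper: parse an optional query-param string into an int,
    // falling back to a default when the parameter is absent.
    static int parseOrDefault(Optional<String> raw, int fallback) {
        return raw.map(Integer::parseInt).orElse(fallback);
    }
}
```

This still throws NumberFormatException for a present-but-malformed value, which a controller-level validator would turn into a 400 for you.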