Jackson, how to ignore some values

I'd like to exclude all properties with values less than zero. Does Jackson have a ready-to-use solution, or should I create a CustomSerializerFactory and BeanPropertyWriter?

You can do this by using a filter - it's a little verbose but it does the job.
First - you need to specify the filter on your entity:
@JsonFilter("myFilter")
public class MyDtoWithFilter { ...}
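For reference, a minimal version of that DTO could look like this (the intValue field and its accessors are assumptions inferred from the filter code below, not part of the original snippet):
@JsonFilter("myFilter")
public class MyDtoWithFilter {
    private int intValue;

    public int getIntValue() {
        return intValue;
    }

    public void setIntValue(int intValue) {
        this.intValue = intValue;
    }
}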
Then, you need to specify your custom filter, which will look at the values:
PropertyFilter theFilter = new SimpleBeanPropertyFilter() {
@Override
public void serializeAsField(Object pojo, JsonGenerator jgen, SerializerProvider provider, PropertyWriter writer) throws Exception {
if (include(writer)) {
if (writer.getName().equals("intValue")) {
int intValue = ((MyDtoWithFilter) pojo).getIntValue();
if (intValue >= 0) {
writer.serializeAsField(pojo, jgen, provider);
}
} else {
writer.serializeAsField(pojo, jgen, provider);
}
} else if (!jgen.canOmitFields()) { // since 2.3
writer.serializeAsOmittedField(pojo, jgen, provider);
}
}
@Override
protected boolean include(BeanPropertyWriter writer) {
return true;
}
@Override
protected boolean include(PropertyWriter writer) {
return true;
}
};
This is done for a field called intValue but you can do the same for all your fields that need to be positive in a similar way.
Finally, now you can marshal an object and test that it works:
FilterProvider filters = new SimpleFilterProvider().addFilter("myFilter", theFilter);
MyDtoWithFilter dtoObject = new MyDtoWithFilter();
dtoObject.setIntValue(12);
ObjectMapper mapper = new ObjectMapper();
String dtoAsString = mapper.writer(filters).writeValueAsString(dtoObject);
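Since 12 is non-negative, the serialized string should still contain the property; with a negative value the filter would drop it. A quick sanity check (assertion style assumed, e.g. JUnit):
assertTrue(dtoAsString.contains("intValue"));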
Hope this helps.

Related

Deserialize enums using the EnumMember value in AspNet [FromQuery] model binding

I have an endpoint in a .NET 6 Microsoft.NET.Sdk.Web project that deserializes query strings into a .NET object by using the standard [FromQuery] binding:
[Route("[controller]")]
public class SamplesController
: ControllerBase
{
[HttpGet]
public IActionResult Get([FromQuery]QueryModel queryModel)
{
if (!queryModel.Status.HasValue)
{
return BadRequest("Problem in deserialization");
}
return Ok(queryModel.Status.Value.GetEnumDisplayName());
}
}
The model contains an enum
public class QueryModel
{
/// <summary>
/// The foo parameter
/// </summary>
/// <example>bar</example>
public string? Foo { get; init; } = null;
/// <summary>
/// The status
/// </summary>
/// <example>on-hold</example>
public Status? Status { get; set; } = null;
}
And the enum has EnumMember attributes whose values I want to use when deserializing.
public enum Status
{
[EnumMember(Value = "open")]
Open,
[EnumMember(Value = "on-hold")]
OnHold
}
By default, .NET 6 does not take the EnumMember attribute into consideration when deserializing.
The goal is to be able to send requests such as
http://localhost:5000/Samples?Foo=bar&Status=on-hold
and have the controller's action deserialize the QueryModel with the proper Status.OnHold value by using its EnumMember value.
I have tried, without luck, an extension library that contains a converter, but the converter is not triggered when using [FromQuery]. See https://github.com/Macross-Software/core/issues/30
I have added a project to reproduce the problem and to serve as a sandbox for a solution:
https://gitlab.com/sunnyatticsoftware/issues/string-to-enum-mvc/-/tree/feature/1-original-problem
NOTE: I need a solution where the enum and the model do not require any external dependency (just the .NET SDK).
A custom enum converter might be your choice. By leveraging the existing EnumConverter class, what we need is a customized ConvertFrom method:
public class CustomEnumConverter : EnumConverter
{
public CustomEnumConverter([DynamicallyAccessedMembers(DynamicallyAccessedMemberTypes.PublicParameterlessConstructor | DynamicallyAccessedMemberTypes.PublicFields)] Type type) : base(type)
{
}
public override object? ConvertFrom(ITypeDescriptorContext? context, CultureInfo? culture, object value)
{
if (value is string strValue)
{
try
{
foreach (var name in Enum.GetNames(EnumType))
{
var field = EnumType.GetField(name);
if (field != null)
{
var enumMember = (EnumMemberAttribute)(field.GetCustomAttributes(typeof(EnumMemberAttribute), true).Single());
if (strValue.Equals(enumMember.Value, StringComparison.OrdinalIgnoreCase))
{
return Enum.Parse(EnumType, name, true);
}
}
}
}
catch (Exception e)
{
throw new FormatException((string)value, e);
}
}
return base.ConvertFrom(context, culture, value);
}
}
Then apply the converter to your enum type:
[TypeConverter(typeof(CustomEnumConverter))]
public enum Status
{
[EnumMember(Value = "open")]
Open,
[EnumMember(Value = "on-hold")]
OnHold
}
Then we can get "on-hold" parsed. You might also want to override ConvertTo() so the EnumMember value is printed to Swagger. It is a bit hacky, but if you want a pure .NET solution this should be one of the minimal viable solutions.
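For reference, such a ConvertTo override could look roughly like this (a sketch under the same assumptions as the converter above, not part of the original code):
public override object? ConvertTo(ITypeDescriptorContext? context, CultureInfo? culture, object? value, Type destinationType)
{
    if (destinationType == typeof(string) && value != null)
    {
        // Look up the EnumMember attribute for the enum value and emit its Value instead of the name.
        var field = EnumType.GetField(value.ToString()!);
        var enumMember = field?.GetCustomAttributes(typeof(EnumMemberAttribute), true)
            .Cast<EnumMemberAttribute>()
            .SingleOrDefault();
        if (enumMember != null && !string.IsNullOrEmpty(enumMember.Value))
        {
            return enumMember.Value;
        }
    }
    return base.ConvertTo(context, culture, value, destinationType);
}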
Following the documentation guide Custom Model Binding in ASP.NET Core, you can create your own versions of Microsoft's classes EnumTypeModelBinderProvider, EnumTypeModelBinder (and base class SimpleTypeModelBinder) that replace incoming enum value names that have been renamed via EnumMemberAttribute with the original enum names before binding:
// Begin code for enum model binding
public class EnumMemberEnumTypeModelBinderProvider : IModelBinderProvider
{
public EnumMemberEnumTypeModelBinderProvider(MvcOptions options) { }
public IModelBinder? GetBinder(ModelBinderProviderContext context)
{
if (context == null)
throw new ArgumentNullException(nameof(context));
if (context.Metadata.IsEnum)
{
var enumType = context.Metadata.UnderlyingOrModelType;
Debug.Assert(enumType.IsEnum);
var loggerFactory = context.Services.GetRequiredService<ILoggerFactory>();
if (EnumExtensions.TryGetEnumMemberOverridesToOriginals(enumType, out var overridesToOriginals))
return new EnumMemberEnumTypeModelBinder(suppressBindingUndefinedValueToEnumType: true, enumType, loggerFactory, overridesToOriginals);
}
return null;
}
}
public class EnumMemberEnumTypeModelBinder : ExtensibleSimpleTypeModelBinder
{
// Adapted from https://github.com/dotnet/aspnetcore/blob/c85baf8db0c72ae8e68643029d514b2e737c9fae/src/Mvc/Mvc.Core/src/ModelBinding/Binders/EnumTypeModelBinder.cs#L58
readonly Type enumType;
readonly bool isFlagged;
readonly Dictionary<ReadOnlyMemory<char>, string> overridesToOriginals;
readonly TypeConverter typeConverter;
public EnumMemberEnumTypeModelBinder(bool suppressBindingUndefinedValueToEnumType, Type modelType, ILoggerFactory loggerFactory, Dictionary<ReadOnlyMemory<char>, string> overridesToOriginals) : base(modelType, loggerFactory)
{
this.enumType = Nullable.GetUnderlyingType(modelType) ?? modelType;
if (!this.enumType.IsEnum)
throw new ArgumentException();
this.isFlagged = Attribute.IsDefined(enumType, typeof(FlagsAttribute));
this.overridesToOriginals = overridesToOriginals ?? throw new ArgumentNullException(nameof(overridesToOriginals));
this.typeConverter = TypeDescriptor.GetConverter(this.enumType);
}
protected override string? GetValueFromBindingContext(ValueProviderResult valueProviderResult) =>
EnumExtensions.ReplaceRenamedEnumValuesToOriginals(base.GetValueFromBindingContext(valueProviderResult), isFlagged, overridesToOriginals);
protected override void CheckModel(ModelBindingContext bindingContext, ValueProviderResult valueProviderResult, object? model)
{
if (model == null)
{
base.CheckModel(bindingContext, valueProviderResult, model);
}
else if (IsDefinedInEnum(model, bindingContext))
{
bindingContext.Result = ModelBindingResult.Success(model);
}
else
{
bindingContext.ModelState.TryAddModelError(
bindingContext.ModelName,
bindingContext.ModelMetadata.ModelBindingMessageProvider.ValueIsInvalidAccessor(
valueProviderResult.ToString()));
}
}
private bool IsDefinedInEnum(object model, ModelBindingContext bindingContext)
{
// Adapted from https://github.com/dotnet/aspnetcore/blob/c85baf8db0c72ae8e68643029d514b2e737c9fae/src/Mvc/Mvc.Core/src/ModelBinding/Binders/EnumTypeModelBinder.cs#L58
var modelType = bindingContext.ModelMetadata.UnderlyingOrModelType;
// Check if the converted value is indeed defined on the enum as EnumTypeConverter
// converts value to the backing type (ex: integer) and does not check if the value is defined on the enum.
if (bindingContext.ModelMetadata.IsFlagsEnum)
{
var underlying = Convert.ChangeType(
model,
Enum.GetUnderlyingType(modelType),
CultureInfo.InvariantCulture).ToString();
var converted = model.ToString();
return !string.Equals(underlying, converted, StringComparison.OrdinalIgnoreCase);
}
return Enum.IsDefined(modelType, model);
}
}
public class ExtensibleSimpleTypeModelBinder : IModelBinder
{
// Adapted from https://github.com/dotnet/aspnetcore/blob/c85baf8db0c72ae8e68643029d514b2e737c9fae/src/Mvc/Mvc.Core/src/ModelBinding/Binders/SimpleTypeModelBinder.cs
private readonly TypeConverter _typeConverter;
private readonly ILogger _logger;
public ExtensibleSimpleTypeModelBinder(Type type, ILoggerFactory loggerFactory) : this(type, loggerFactory, null) { }
public ExtensibleSimpleTypeModelBinder(Type type, ILoggerFactory loggerFactory, TypeConverter? typeConverter)
{
if (type == null)
throw new ArgumentNullException(nameof(type));
if (loggerFactory == null)
throw new ArgumentNullException(nameof(loggerFactory));
_typeConverter = typeConverter ?? TypeDescriptor.GetConverter(type);
_logger = loggerFactory.CreateLogger<ExtensibleSimpleTypeModelBinder>();
}
protected virtual string? GetValueFromBindingContext(ValueProviderResult valueProviderResult) => valueProviderResult.FirstValue;
/// <inheritdoc />
public Task BindModelAsync(ModelBindingContext bindingContext)
{
if (bindingContext == null)
throw new ArgumentNullException(nameof(bindingContext));
//_logger.AttemptingToBindModel(bindingContext);
var valueProviderResult = bindingContext.ValueProvider.GetValue(bindingContext.ModelName);
if (valueProviderResult == ValueProviderResult.None)
{
//_logger.FoundNoValueInRequest(bindingContext);
// no entry
//_logger.DoneAttemptingToBindModel(bindingContext);
return Task.CompletedTask;
}
bindingContext.ModelState.SetModelValue(bindingContext.ModelName, valueProviderResult);
try
{
var value = GetValueFromBindingContext(valueProviderResult);
object? model;
if (bindingContext.ModelType == typeof(string))
{
// Already have a string. No further conversion required but handle ConvertEmptyStringToNull.
if (bindingContext.ModelMetadata.ConvertEmptyStringToNull && string.IsNullOrWhiteSpace(value))
model = null;
else
model = value;
}
else if (string.IsNullOrWhiteSpace(value))
{
// Other than the StringConverter, converters Trim() the value then throw if the result is empty.
model = null;
}
else
{
model = _typeConverter.ConvertFrom(context: null,culture: valueProviderResult.Culture, value: value);
}
CheckModel(bindingContext, valueProviderResult, model);
//_logger.DoneAttemptingToBindModel(bindingContext);
return Task.CompletedTask;
}
catch (Exception exception)
{
var isFormatException = exception is FormatException;
if (!isFormatException && exception.InnerException != null)
{
// TypeConverter throws System.Exception wrapping the FormatException,
// so we capture the inner exception.
exception = System.Runtime.ExceptionServices.ExceptionDispatchInfo.Capture(exception.InnerException).SourceException;
}
bindingContext.ModelState.TryAddModelError(bindingContext.ModelName,exception, bindingContext.ModelMetadata);
// Were able to find a converter for the type but conversion failed.
return Task.CompletedTask;
}
}
/// <inheritdoc/>
protected virtual void CheckModel(
ModelBindingContext bindingContext,
ValueProviderResult valueProviderResult,
object? model)
{
// When converting newModel a null value may indicate a failed conversion for an otherwise required
// model (can't set a ValueType to null). This detects if a null model value is acceptable given the
// current bindingContext. If not, an error is logged.
if (model == null && !bindingContext.ModelMetadata.IsReferenceOrNullableType)
{
bindingContext.ModelState.TryAddModelError(
bindingContext.ModelName,
bindingContext.ModelMetadata.ModelBindingMessageProvider.ValueMustNotBeNullAccessor(
valueProviderResult.ToString()));
}
else
{
bindingContext.Result = ModelBindingResult.Success(model);
}
}
}
// End code for enum model binding
/********************************************************/
// Begin general enum parsing code
public class CharMemoryComparer : IEqualityComparer<ReadOnlyMemory<char>>
{
public static CharMemoryComparer OrdinalIgnoreCase { get; } = new CharMemoryComparer(StringComparison.OrdinalIgnoreCase);
public static CharMemoryComparer Ordinal { get; } = new CharMemoryComparer(StringComparison.Ordinal);
readonly StringComparison comparison;
CharMemoryComparer(StringComparison comparison) => this.comparison = comparison;
public bool Equals(ReadOnlyMemory<char> x, ReadOnlyMemory<char> y) => MemoryExtensions.Equals(x.Span, y.Span, comparison);
public int GetHashCode(ReadOnlyMemory<char> obj) => String.GetHashCode(obj.Span, comparison);
}
public static partial class EnumExtensions
{
public const char FlagSeparatorChar = ',';
public const string FlagSeparatorString = ", ";
public static bool TryGetEnumMemberOverridesToOriginals(Type enumType, [System.Diagnostics.CodeAnalysis.NotNullWhen(returnValue: true)] out Dictionary<ReadOnlyMemory<char>, string>? overridesToOriginals)
{
if (enumType == null)
throw new ArgumentNullException(nameof(enumType));
if (!enumType.IsEnum)
throw new ArgumentException(nameof(enumType));
overridesToOriginals = null;
foreach (var name in Enum.GetNames(enumType))
{
if (TryGetEnumAttribute<EnumMemberAttribute>(enumType, name, out var attr) && !string.IsNullOrWhiteSpace(attr.Value))
{
overridesToOriginals = overridesToOriginals ?? new(CharMemoryComparer.OrdinalIgnoreCase);
overridesToOriginals.Add(attr.Value.AsMemory(), name);
}
}
return overridesToOriginals != null;
}
public static bool TryGetEnumAttribute<TAttribute>(Type type, string name, [System.Diagnostics.CodeAnalysis.NotNullWhen(returnValue: true)] out TAttribute? attribute) where TAttribute : System.Attribute
{
var member = type.GetMember(name).SingleOrDefault();
attribute = member?.GetCustomAttribute<TAttribute>(false);
return attribute != null;
}
public static string? ReplaceRenamedEnumValuesToOriginals(string? value, bool isFlagged, Dictionary<ReadOnlyMemory<char>, string> overridesToOriginals)
{
if (string.IsNullOrWhiteSpace(value))
return value;
var trimmed = value.AsMemory().Trim();
if (overridesToOriginals.TryGetValue(trimmed, out var @override))
value = @override;
else if (isFlagged && trimmed.Length > 0)
{
var sb = new StringBuilder();
bool replaced = false;
foreach (var n in trimmed.Split(EnumExtensions.FlagSeparatorChar, StringSplitOptions.TrimEntries))
{
ReadOnlySpan<char> toAppend;
if (overridesToOriginals.TryGetValue(n, out var @thisOverride))
{
toAppend = thisOverride.AsSpan();
replaced = true;
}
else
toAppend = n.Span;
sb.Append(sb.Length == 0 ? null : EnumExtensions.FlagSeparatorString).Append(toAppend);
}
if (replaced)
value = sb.ToString();
}
return value;
}
}
public static class StringExtensions
{
public static IEnumerable<ReadOnlyMemory<char>> Split(this ReadOnlyMemory<char> chars, char separator, StringSplitOptions options = StringSplitOptions.None)
{
int index;
while ((index = chars.Span.IndexOf(separator)) >= 0)
{
var slice = chars.Slice(0, index);
if ((options & StringSplitOptions.TrimEntries) == StringSplitOptions.TrimEntries)
slice = slice.Trim();
if ((options & StringSplitOptions.RemoveEmptyEntries) == 0 || slice.Length > 0)
yield return slice;
chars = chars.Slice(index + 1);
}
if ((options & StringSplitOptions.TrimEntries) == StringSplitOptions.TrimEntries)
chars = chars.Trim();
if ((options & StringSplitOptions.RemoveEmptyEntries) == 0 || chars.Length > 0)
yield return chars;
}
}
Then add the binder in ConfigureServices() like so:
public void ConfigureServices(IServiceCollection services)
{
services.AddControllers(options =>
{
options.ModelBinderProviders.Insert(0, new EnumMemberEnumTypeModelBinderProvider(options));
});
}
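If the project uses the .NET 6 minimal hosting model (Program.cs) rather than a Startup class, the equivalent registration would presumably be:
builder.Services.AddControllers(options =>
{
    options.ModelBinderProviders.Insert(0, new EnumMemberEnumTypeModelBinderProvider(options));
});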
Notes:
EnumTypeModelBinder and base class SimpleTypeModelBinder provide no useful extension points to customize the parsing of the incoming value string, thus it was necessary to copy some of their logic.
Precisely emulating the logic of SimpleTypeModelBinder is somewhat difficult because it supports both numeric and textual enum values -- including mixtures of both for flags enums. The binder above retains that capability, but at a cost of also allowing original enum names to be bound successfully. Thus the values on-hold and onhold will be bound to Status.OnHold.
Conversely, if you do not want to support binding of numeric values for enums, you could adapt the code of JsonEnumMemberStringEnumConverter from this answer to System.Text.Json: How do I specify a custom name for an enum value?. Demo fiddle here. This approach also avoids binding to the original, unrenamed enum names.
Matching of override names with original enum names is case-insensitive, so override names that differ only in case are not supported.

Add weights to documents Lucene8+solr 8 while indexing

I am working on migrating Solr from 5.4.3 to 8.11 for one of my search apps and have successfully upgraded to 7.7.3. With the further upgrades, however, the order of the response data has changed from what it was earlier. Here I am trying to use FunctionScoreQuery along with DoubleValuesSource, since CustomScoreQuery is deprecated in 7.7.3 and removed in 8.
Below is my code snippet (I am now using Solr 8.5.2 and Lucene 8.5.2):
public class CustomQueryParser extends QParserPlugin {
@Override
public QParser createParser(final String qstr, final SolrParams localParams, final SolrParams params,
final SolrQueryRequest req) {
return new MyParser(qstr, localParams, params, req);
}
private static class MyParser extends QParser {
private Query innerQuery;
private String queryString;
public MyParser(final String qstr, final SolrParams localParams, final SolrParams params,
final SolrQueryRequest req) {
super(qstr, localParams, params, req);
if (qstr == null || qstr.trim().length() == 0) {
this.queryString = DEFAULT_SEARCH_QUERY;
setString(this.queryString);
} else {
this.queryString = qstr;
}
try {
if (queryString.contains(":")) {
final QParser parser = getParser(queryString, "edismax", getReq());
this.innerQuery = parser.parse();
} else {
final QParser parser = getParser(queryString, "dismax", getReq());
this.innerQuery = parser.parse();
}
} catch (final SyntaxError ex) {
throw new RuntimeException("Error parsing query", ex);
}
}
@Override
public Query parse() throws SyntaxError{
final Query query = new MyCustomQuery(innerQuery);
final CustomValuesSource customValuesSource = new CustomValuesSource(queryString,innerQuery);
final FunctionScoreQuery fsq = FunctionScoreQuery.boostByValue(query, customValuesSource.fromFloatField("score"));
return fsq;
}
}
}
public class MyCustomQuery extends Query {
@Override
public Weight createWeight(final IndexSearcher searcher, final ScoreMode scoreMode, final float boost) throws IOException {
Weight weight;
if(query == null){
weight = new ConstantScoreWeight(this, boost) {
@Override
public Scorer scorer(final LeafReaderContext context) throws IOException {
return new ConstantScoreScorer(this,score(),scoreMode, DocIdSetIterator.all(context.reader().maxDoc()));
}
@Override
public boolean isCacheable(final LeafReaderContext leafReaderContext) {
return false;
}
};
}else {
weight = searcher.createWeight(query,scoreMode,boost);
}
return weight;
}
}
public class CustomValuesSource extends DoubleValuesSource {
@Override
public DoubleValues getValues(final LeafReaderContext leafReaderContext,final DoubleValues doubleValues) throws IOException {
final DoubleValues dv = new CustomDoubleValues(leafReaderContext);
return dv;
}
class CustomDoubleValues extends DoubleValues {
@Override
public boolean advanceExact(final int doc) throws IOException {
final Document document = leafReaderContext.reader().document(doc);
final List<IndexableField> fields = document.getFields();
for (final IndexableField field : fields) {
// total_score is being calculated with my own preferences
document.add(new FloatDocValuesField("score",total_score));
// can we include the score here?
// this custom logic, which includes the score, is not even being called
}
}
}
I have been trying for a long time but have not found a single working example. Can anybody help me out here?
Thank you,
Syamala.

How do you write data into a Redis custom state store using Kafka Streams

I've recently been learning about how to use the Kafka Streams client and one thing that I've been struggling with is how to switch from the default state store (RocksDB) to a custom state store using something like Redis. The Confluent documentation makes it clear you have to implement the StateStore interface for your custom store and you must provide an implementation of StoreBuilder for creating instances of that store.
Here's what I have so far for my custom store. I've also added a simple write method to append a new entry into a specified stream via the Redis XADD command.
public class MyCustomStore<K,V> implements StateStore, MyWriteableCustomStore<K,V> {
private String name;
private volatile boolean open = false;
private boolean loggingEnabled = false;
public MyCustomStore(String name, boolean loggingEnabled) {
this.name = name;
this.loggingEnabled = loggingEnabled;
}
@Override
public String name() {
return this.name;
}
@Override
public void init(ProcessorContext context, StateStore root) {
if (root != null) {
// register the store
context.register(root, (key, value) -> {
write(key.toString(), value.toString());
});
}
this.open = true;
}
@Override
public void flush() {
// TODO Auto-generated method stub
}
@Override
public void close() {
// TODO Auto-generated method stub
}
@Override
public boolean persistent() {
// TODO Auto-generated method stub
return true;
}
@Override
public boolean isOpen() {
// TODO Auto-generated method stub
return false;
}
@Override
public void write(String key, String value) {
try(Jedis jedis = new Jedis("localhost", 6379)) {
Map<String, String> hash = new HashMap<>();
hash.put(key, value);
jedis.xadd("MyStream", StreamEntryID.NEW_ENTRY, hash);
}
}
}
public class MyCustomStoreBuilder implements StoreBuilder<MyCustomStore<String,String>> {
private boolean cached = true;
private String name;
private Map<String,String> logConfig=new HashMap<>();
private boolean loggingEnabled;
public MyCustomStoreBuilder(String name, boolean loggingEnabled){
this.name = name;
this.loggingEnabled = loggingEnabled;
}
@Override
public StoreBuilder<MyCustomStore<String,String>> withCachingEnabled() {
this.cached = true;
return this;
}
@Override
public StoreBuilder<MyCustomStore<String,String>> withCachingDisabled() {
this.cached = false;
return this;
}
@Override
public StoreBuilder<MyCustomStore<String,String>> withLoggingEnabled(Map<String, String> config) {
loggingEnabled=true;
return this;
}
@Override
public StoreBuilder<MyCustomStore<String,String>> withLoggingDisabled() {
this.loggingEnabled = false;
return this;
}
@Override
public MyCustomStore<String,String> build() {
return new MyCustomStore<String,String>(this.name, this.loggingEnabled);
}
@Override
public Map<String, String> logConfig() {
return logConfig;
}
@Override
public boolean loggingEnabled() {
return loggingEnabled;
}
@Override
public String name() {
return name;
}
}
And here's what my setup and topology look like.
@Bean
public KafkaStreams kafkaStreams(KafkaProperties kafkaProperties) {
final Properties props = new Properties();
props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, kafkaProperties.getBootstrapServers());
props.put(StreamsConfig.APPLICATION_ID_CONFIG, appName);
props.put(AbstractKafkaSchemaSerDeConfig.SCHEMA_REGISTRY_URL_CONFIG, schemaRegistryUrl);
props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Long().getClass());
props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Double().getClass());
props.put(StreamsConfig.STATE_DIR_CONFIG, "data");
props.put(StreamsConfig.APPLICATION_SERVER_CONFIG, appServerConfig);
props.put(JsonDeserializer.VALUE_DEFAULT_TYPE, JsonNode.class);
props.put(DEFAULT_DESERIALIZATION_EXCEPTION_HANDLER_CLASS_CONFIG, LogAndContinueExceptionHandler.class);
props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
final String storeName = "the-custome-store";
Topology topology = new Topology();
// Create CustomStoreSupplier for store name the-custom-store
MyCustomStoreBuilder customStoreBuilder = new MyCustomStoreBuilder(storeName, false);
topology.addSource("input","inputTopic");
topology.addProcessor("redis-processor", () -> new RedisProcessor(storeName), "input");
topology.addStateStore(customStoreBuilder, "redis-processor");
KafkaStreams kafkaStreams = new KafkaStreams(topology, props);
kafkaStreams.start();
return kafkaStreams;
}
public class MyCustomStoreType<K,V> implements QueryableStoreType<MyReadableCustomStore<String,String>> {
@Override
public boolean accepts(StateStore stateStore) {
return stateStore instanceof MyCustomStore;
}
@Override
public MyReadableCustomStore<String,String> create(final StateStoreProvider storeProvider, final String storeName) {
return new MyCustomStoreTypeWrapper<>(storeProvider, storeName, this);
}
}
public class MyCustomStoreTypeWrapper<K,V> implements MyReadableCustomStore<K,V> {
private final QueryableStoreType<MyReadableCustomStore<String, String>> customStoreType;
private final String storeName;
private final StateStoreProvider provider;
public MyCustomStoreTypeWrapper(final StateStoreProvider provider,
final String storeName,
final QueryableStoreType<MyReadableCustomStore<String, String>> customStoreType) {
this.provider = provider;
this.storeName = storeName;
this.customStoreType = customStoreType;
}
@Override
public String read(String key) {
try (Jedis jedis = new Jedis("localhost", 6379)) {
StreamEntryID start = new StreamEntryID(0, 0);
StreamEntryID end = null; // null -> until the last item in the stream
int count = 2;
List<StreamEntry> list = jedis.xrange("MyStream", start, end, count);
if (list != null) {
// Get the most recently added item, which is also the last item
StreamEntry streamData = list.get(list.size() - 1);
return streamData.toString();
} else {
System.out.println("No new data in the stream");
}
return "";
}
}
}
// This throws the InvalidStateStoreException when I try to get access to the custom store
MyReadableCustomStore<String,String> store = streams.store("the-custome-store", new MyCustomStoreType<String,String>());
String value = store.read("testKey");
So, my question is how do I actually get the state store data to persist into Redis now? I feel like I'm missing something in the state store initialization or with the StateRestoreCallback. Any help or clarification with this would be greatly appreciated.
It looks to me that you have the store wired up to the topology correctly. But you don't have any processors using the store.
It could look something like this:
final String storeName = "the-custome-store";
MyCustomStoreBuilder customStoreBuilder = new MyCustomStoreBuilder(storeName, false);
Topology topology = new Topology();
topology.addSource("input", "input-topic");
// makes the processor a child of the source node
// the source node forwards its records to the child processor node
topology.addProcessor("redis-processor", () -> new RedisProcessor(storeName), "input");
// add the store and specify the processor(s) that access the store
topology.addStateStore(customStoreBuilder, "redis-processor");
class RedisProcessor implements Processor<byte[], byte[]> {
final String storeName;
MyCustomStore<byte[],byte[]> stateStore;
public RedisProcessor(String storeName) {
this.storeName = storeName;
}
@Override
public void init(ProcessorContext context) {
stateStore = (MyCustomStore<byte[], byte[]>) context.getStateStore(storeName);
}
@Override
public void process(byte[] key, byte[] value) {
stateStore.write(key, value);
}
@Override
public void close() {
}
}
HTH, and let me know how it works out for you.
Update to answer from comments:
I think you need to update MyCustomStore.isOpen() to return the open variable.
Right now it's hardcoded to return false
@Override
public boolean isOpen() {
// TODO Auto-generated method stub
return false;
}
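That is, something like:
@Override
public boolean isOpen() {
    return this.open;
}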

How to create new Principal Object using Jackrabbit/JCR

I'm a new developer trying to tackle the Jackrabbit/JCR library. My team and I have been using the EveryonePrincipal for a couple of months now, but we've been wanting to implement more capabilities per user roles/principals so we can grant them necessary read/write access to each node. However, we're having some difficulties figuring out how to create a new Principal object.
I've been using:
PrincipalImpl newPrincipal = new PrincipalImpl("MyPrincipal");
Then I create a new RolePrincipal class matching the EveryonePrincipal, except the name inside the RolePrincipal class would be "MyPrincipal". Unfortunately, this method doesn't work. Is there anything else we're missing? And how does the 'everyone' principal get stored?
EveryonePrincipal.java
public final class EveryonePrincipal implements JackrabbitPrincipal, java.security.acl.Group {
public static final String NAME = "everyone";
private static final EveryonePrincipal INSTANCE = new EveryonePrincipal();
private EveryonePrincipal() { }
public static EveryonePrincipal getInstance() {
return INSTANCE;
}
//----------------------------------------------------------< Principal >---
@Override
public String getName() {
return NAME;
}
//--------------------------------------------------------------< Group >---
@Override
public boolean addMember(Principal user) {
return false;
}
@Override
public boolean removeMember(Principal user) {
throw new UnsupportedOperationException("Cannot remove a member from the everyone group.");
}
@Override
public boolean isMember(Principal member) {
return !member.equals(this);
}
@Override
public Enumeration<? extends Principal> members() {
throw new UnsupportedOperationException("Not implemented.");
}
//-------------------------------------------------------------< Object >---
@Override
public int hashCode() {
return NAME.hashCode();
}
@Override
public boolean equals(Object obj) {
if (obj == this) {
return true;
} else if (obj instanceof JackrabbitPrincipal && obj instanceof Group) {
JackrabbitPrincipal other = (JackrabbitPrincipal) obj;
return NAME.equals(other.getName());
}
return false;
}
@Override
public String toString() {
return NAME + " principal";
}
}

JSon.Net specific JSonConverter issue, empty array must be ignored

I noticed weird behavior in serialization.
Though I have the settings
var SerializerSettings = new JsonSerializerSettings() {
NullValueHandling = NullValueHandling.Ignore, DefaultValueHandling = DefaultValueHandling.Ignore };
SerializerSettings.Converters.Add(new JsonArrayToNullConverter());
var val = new company()
{
name = "Bobo Company Renamed"
};
var str = JsonConvert.SerializeObject(val, SerializerSettings );
The result would be:
{"document_type":2,"locations":null, ...
Without the custom converter, it would be
{"document_type":2,"locations":[], ...
You get the point?
But since the value becomes null, it -should- be covered by NullValueHandling = NullValueHandling.Ignore,
but obviously Newtonsoft looks at the object value to be serialized, not at the writer.WriteNull() issued by the converter.
:(
Any workaround? I've spent some hours on this. Thanks!
using Newtonsoft.Json;
using System;
using System.Collections;
using System.Collections.Generic;
namespace Company.Model.TypeConverters
{
/// <summary>
/// Undoes the forced array creation because OData needs it; for PATCH we want no empty [] collections, it is considered to be a
/// </summary>
public class JsonArrayToNullConverter : JsonConverter
{
public override bool CanConvert(Type objectType)
{
var canConvert = objectType.IsArray || (objectType.IsGenericType && objectType.GetGenericTypeDefinition().IsAssignableFrom(typeof(IEnumerable<>)));
return canConvert;
}
public override bool CanRead
{
get
{
return false;
}
}
public override bool CanWrite
{
get
{
return true;
}
}
public override object ReadJson(JsonReader reader, Type objectType, object existingValue, JsonSerializer serializer)
{
throw new NotImplementedException();
}
public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
{
var enumerator = (IEnumerable)value;
if (enumerator != null && enumerator.GetEnumerator().MoveNext() == false)
{
writer.WriteNull();
//value = null;
return;
}
serializer.Serialize(writer, value);
}
}
}
The solution seemed to be using a ContractResolver and setting it on the JsonSerializerSettings.
public class NullToEmptyListResolver : DefaultContractResolver
{
protected override IValueProvider CreateMemberValueProvider(MemberInfo member)
{
IValueProvider provider = base.CreateMemberValueProvider(member);
if (member.MemberType == MemberTypes.Property)
{
Type propType = ((PropertyInfo)member).PropertyType;
if (propType.IsArray || ( propType.IsGenericType &&
propType.GetGenericTypeDefinition().IsAssignableFrom(typeof(IEnumerable<>))))
{
return new EmptyListValueProvider(provider, propType);
}
}
return provider;
}
public class EmptyListValueProvider : IValueProvider
{
private readonly IValueProvider innerProvider;
public EmptyListValueProvider(IValueProvider innerProvider, Type listType)
{
this.innerProvider = innerProvider;
}
public void SetValue(object target, object value)
{
throw new NotImplementedException();
}
public object GetValue(object target)
{
var val = innerProvider.GetValue(target) ;
if (val == null) return null;
var enumerator = (IEnumerable)val;
return enumerator.GetEnumerator().MoveNext() == false ? null : val;
}
}
}
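For completeness, wiring the resolver into the serializer settings might look like this (a sketch, reusing the company object from the question):
var settings = new JsonSerializerSettings
{
    NullValueHandling = NullValueHandling.Ignore,
    ContractResolver = new NullToEmptyListResolver()
};
var str = JsonConvert.SerializeObject(val, settings);
// Empty collections now resolve to null and are omitted by NullValueHandling.Ignore.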