Serialization of Google Storage response object to JSON

import com.google.api.gax.paging.Page;
import com.google.cloud.storage.Bucket;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;
import com.google.gson.Gson; // missing in the original snippet

public class ListBuckets {
    public static void listBuckets(String projectId) {
        // The ID of your GCP project
        // String projectId = "your-project-id";
        Storage storage = StorageOptions.newBuilder().setProjectId(projectId).build().getService();
        Page<Bucket> buckets = storage.list();
        Gson gson = new Gson();
        gson.toJson(buckets);
    }
}
How do I serialize the buckets response object to JSON?
The buckets response will be dynamic, depending on the BucketListOption arguments passed to the list() method, so I don't want to call each individual getter, such as bucket.getName(), to retrieve the values.
gson.toJson(buckets); // what am I doing wrong? Can anyone point out the issue?
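For context: storage.list() returns a Page<Bucket>, a lazy paging wrapper, not the bucket data itself, so passing the wrapper straight to Gson tends to serialize pagination internals rather than buckets. A hedged, untested sketch of one workaround is to materialize the pages into a plain list first (iterateAll() is part of the Page API); Gson is not guaranteed to handle every internal field of Bucket either, in which case mapping to a simpler structure would be the fallback:

import com.google.api.gax.paging.Page;
import com.google.cloud.storage.Bucket;
import com.google.gson.Gson;

import java.util.ArrayList;
import java.util.List;

public class BucketListJson {

    // Sketch: collect the lazily-paged buckets into a concrete list, then serialize that.
    static String bucketsToJson(Page<Bucket> buckets) {
        List<Bucket> all = new ArrayList<>();
        buckets.iterateAll().forEach(all::add); // iterateAll() transparently fetches subsequent pages
        return new Gson().toJson(all);
    }
}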

Related

How to request String param together with Flux<Part> in reactive spring

I am trying to upload a file using the reactive Spring framework. If I only request the file via Postman, it works fine. When I add an additional String parameter uploadType, it is always null. I tried the following in Postman:
My controller code:
@RestController
@RequestMapping(path = "/upload")
public class ReactiveUploadResource {

    Logger LOGGER = LoggerFactory.getLogger(ReactiveUploadResource.class);

    @RequestMapping(method = RequestMethod.POST, consumes = MediaType.MULTIPART_FORM_DATA_VALUE, produces = MediaType.APPLICATION_JSON_VALUE)
    public Flux<String> uploadHandler(@RequestBody Flux<Part> parts, String uploadType) {
        return (Flux<String>) parts
                .filter(part -> part instanceof FilePart)
                .ofType(FilePart.class) // convert the flux to FilePart
                .flatMap(item -> saveFile(item, uploadType)); // save each file and flatMap it to a flux of results
    }
}
Postman: (screenshot omitted)
I always get uploadType as null. What is wrong? How can I pass a String or an Enum in this request?
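One commonly suggested direction (a hedged sketch, not verified against this exact setup) is to bind each multipart field explicitly with @RequestPart rather than expecting a plain String parameter to be populated from multipart form data; the part names "file" and "uploadType" below are assumptions and must match the form-data keys used in Postman:

import org.springframework.http.MediaType;
import org.springframework.http.codec.multipart.FilePart;
import org.springframework.http.codec.multipart.Part;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestPart;
import org.springframework.web.bind.annotation.RestController;
import reactor.core.publisher.Flux;

@RestController
@RequestMapping(path = "/upload")
public class ReactiveUploadResourceSketch {

    // Sketch: @RequestPart binds the named multipart fields explicitly.
    @PostMapping(consumes = MediaType.MULTIPART_FORM_DATA_VALUE, produces = MediaType.APPLICATION_JSON_VALUE)
    public Flux<String> uploadHandler(@RequestPart("file") Flux<Part> parts,
                                      @RequestPart("uploadType") String uploadType) {
        return parts
                .ofType(FilePart.class)
                .map(filePart -> filePart.filename() + " uploaded as " + uploadType); // stand-in for saveFile(...)
    }
}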

Google Cloud Functions - Realtime Database Trigger - how to deserialize data JSON to POJO?

As described on the Google Cloud Functions docs, it is possible to trigger a Function based on Firebase Realtime Database events (write/create/update/delete).
The following docs sample explains how to get the delta snapshot.
import com.google.cloud.functions.Context;
import com.google.cloud.functions.RawBackgroundFunction;
import com.google.gson.Gson;
import com.google.gson.JsonObject;

import java.util.logging.Logger;

public class FirebaseRtdb implements RawBackgroundFunction {

    private static final Logger logger = Logger.getLogger(FirebaseRtdb.class.getName());

    // Use GSON (https://github.com/google/gson) to parse JSON content.
    private static final Gson gson = new Gson();

    @Override
    public void accept(String json, Context context) {
        logger.info("Function triggered by change to: " + context.resource());
        JsonObject body = gson.fromJson(json, JsonObject.class);

        boolean isAdmin = false;
        if (body != null && body.has("auth")) {
            JsonObject authObj = body.getAsJsonObject("auth");
            isAdmin = authObj.has("admin") && authObj.get("admin").getAsBoolean();
        }
        logger.info("Admin?: " + isAdmin);

        if (body != null && body.has("delta")) {
            logger.info("Delta:");
            logger.info(body.get("delta").toString());
        }
    }
}
The sample works perfectly but the question is: How can I deserialize this delta to a POJO?
I tried:
val mObject = gson.fromJson(body.get("delta").toString(), MyCustomObject::class.java)
But I am getting:
com.google.gson.JsonSyntaxException: java.lang.IllegalStateException: Expected BEGIN_ARRAY but was BEGIN_OBJECT
As far as I know, this is because the MyCustomObject class has a List<T> field, and Firebase Realtime Database always converts lists to maps with integer keys.
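To illustrate with hypothetical data: Gson expects a JSON array for a List<T>, but the trigger payload carries an index-keyed object, which is what triggers the BEGIN_ARRAY/BEGIN_OBJECT mismatch.
What Gson expects for a List<Payment> field:
"myPaymentList": [ { "amount": 10 }, { "amount": 20 } ]
What the delta actually contains:
"myPaymentList": { "0": { "amount": 10 }, "1": { "amount": 20 } }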
I would prefer not to change every List<T> to Map<Int, T>, because I have a lot of classes :(
Thanks in advance!
So, here is what I ended up doing (maybe not the best solution!):
1) Create a custom JSON deserializer for the lists coming from Firebase:
class ListFirebaseDeserializer<T> : JsonDeserializer<ArrayList<T>> {
    override fun deserialize(json: JsonElement?, typeOfT: Type?, context: JsonDeserializationContext?): ArrayList<T> {
        val result = ArrayList<T>()
        val typeOfElement = (typeOfT as ParameterizedType).actualTypeArguments[0]
        json?.let {
            json.asJsonObject.entrySet().forEach { entry ->
                result.add(Gson().fromJson(entry.value, typeOfElement))
            }
        }
        return result
    }
}
This takes the lists that Firebase turned into maps and converts them back into actual lists.
2) Annotate every list in my POJO with @JsonAdapter(ListFirebaseDeserializer::class), for instance:
class MyCustomObject {
    @JsonAdapter(ListFirebaseDeserializer::class)
    var myPaymentList = ArrayList<Payment>()
}
It could be a pain if you already have lots of lists to annotate, but it is better than having to use maps instead.
Hope it helps!

How to make versioned KV store work with VaultPropertySource

I am trying to make Vault's versioned KV store work with VaultPropertySource so that properties can be accessed using @Value. However, it is not working as expected. I am using version 2.1.2.RELEASE of spring-vault-core. The intention is to make it work with Spring Vault and Spring MVC.
I have already tried @Import(EnvironmentVaultConfiguration.class), to no avail.
import org.springframework.beans.factory.annotation.Autowired; // missing in the original snippet
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Configuration;
import org.springframework.context.annotation.PropertySource;
import org.springframework.core.env.ConfigurableEnvironment;
import org.springframework.core.env.MutablePropertySources;
import org.springframework.vault.authentication.ClientAuthentication;
import org.springframework.vault.authentication.TokenAuthentication;
import org.springframework.vault.client.VaultEndpoint;
import org.springframework.vault.config.AbstractVaultConfiguration;
import org.springframework.vault.core.VaultTemplate;
import org.springframework.vault.core.env.VaultPropertySource;

import javax.annotation.PostConstruct;
import java.net.URI;
import java.util.List;

@Configuration
@PropertySource("vault.properties")
public class AppConfig extends AbstractVaultConfiguration {

    @Value("${vault.uri}")
    private URI vaultUri;

    @Value("${vault.token}")
    private String token;

    @Value("#{'${vault.sources:}'.split(',')}")
    private List<String> vaultSources;

    @Autowired
    private ConfigurableEnvironment environment;

    @Autowired
    private VaultTemplate vaultTemplate;

    /**
     * Specify an endpoint for connecting to Vault.
     */
    @Override
    public VaultEndpoint vaultEndpoint() {
        return VaultEndpoint.from(vaultUri);
    }

    /**
     * Configure a client authentication.
     * Please consider a more secure authentication method
     * for production use.
     */
    @Override
    public ClientAuthentication clientAuthentication() {
        return new TokenAuthentication(token);
    }

    @PostConstruct
    public void setPropertySource() {
        MutablePropertySources sources = environment.getPropertySources();
        vaultSources.forEach(vs -> sources.addFirst(new VaultPropertySource(vaultTemplate, vs)));
    }
}
In the given code, if I provide
vault.sources=secret/data/abcd,secret/data/pqrs
then it works, but the secrets are returned with data. and metadata. prefixes, which means it is using the generic approach to fetch secrets and not the KV one.
If I remove data from the path, i.e. vault.sources=secret/abcd,secret/pqrs, it simply does not connect and throws an exception with a 403. This means it must not be using KV v2.
Can someone please help me with how to use the versioned API of spring-vault in this code?
Key-Value 2 support using VaultPropertySource is not yet released. It will be shipped with Spring Vault 2.2 (see this GitHub issue).
Until then, you can use snapshot builds to verify that the approach works for your use case.
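For reference, pulling in a snapshot build could look roughly like this (a sketch; the exact snapshot version string is an assumption, not taken from the answer):

<!-- Sketch: snapshot coordinates are assumptions, check the project for the current ones -->
<dependency>
    <groupId>org.springframework.vault</groupId>
    <artifactId>spring-vault-core</artifactId>
    <version>2.2.0.BUILD-SNAPSHOT</version>
</dependency>

<repositories>
    <repository>
        <id>spring-snapshots</id>
        <url>https://repo.spring.io/libs-snapshot</url>
    </repository>
</repositories>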
Based on Mark's response above, I decided to use VaultPropertySource with a PropertyTransformer until we get KV version 2 support out of the box.
import org.springframework.util.StringUtils;
import org.springframework.vault.core.util.PropertyTransformer;

import java.util.LinkedHashMap;
import java.util.Map;

public class DataMetadataPrefixRemoverPropertyTransformer implements PropertyTransformer {

    private final String dataPrefix = "data.";
    private final String metadataPrefix = "metadata.";

    public Map<String, Object> transformProperties(Map<String, ? extends Object> inputProperties) {
        Map<String, Object> target = new LinkedHashMap<>(inputProperties.size(), 1.0F);
        for (Map.Entry<String, ? extends Object> entry : inputProperties.entrySet()) {
            String key = entry.getKey();
            // do not add metadata properties to the environment for now - no use case for them as of now
            if (StringUtils.startsWithIgnoreCase(key, metadataPrefix)) {
                continue;
            }
            if (StringUtils.startsWithIgnoreCase(key, dataPrefix)) {
                key = StringUtils.replace(key, dataPrefix, "");
            }
            target.put(key, entry.getValue());
        }
        return target;
    }
}
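To wire the transformer in, the @PostConstruct registration from the question could pass it to VaultPropertySource; this sketch assumes the constructor overload taking a property-source name, a VaultOperations, a path, and a PropertyTransformer:

@PostConstruct
public void setPropertySource() {
    MutablePropertySources sources = environment.getPropertySources();
    PropertyTransformer transformer = new DataMetadataPrefixRemoverPropertyTransformer();
    vaultSources.forEach(vs ->
            // "vault:" + vs is just an illustrative property-source name
            sources.addFirst(new VaultPropertySource("vault:" + vs, vaultTemplate, vs, transformer)));
}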
Hope it can help someone looking for a similar solution.

How to get ArrayList of POJOs from Amazon Lambda (getting only LinkedTreeMap)

I am trying to call my AWS Lambda function (serverless backend) from my Android mobile app client. The AWS Lambda function returns an ArrayList of POJOs (as JSON).
The problem is that the Android client's AWS Lambda (JSON) data binder does not deserialize to my ArrayList of POJOs; I get an ArrayList of LinkedTreeMap instead (see the code at onPostExecute() below).
On the Android client side I'm using the Android AWS SDK: com.amazonaws:aws-android-sdk-core:2.6
Here is some code:
public void readSurveyList(String strUuid, int intLanguageID) {
    // Create an instance of CognitoCachingCredentialsProvider.
    // You have to configure at least an AWS identity pool to get access to your Lambda function.
    CognitoCachingCredentialsProvider credentialsProvider = new CognitoCachingCredentialsProvider(
            this.getApplicationContext(),
            IDENTITY_POOL_ID,
            Regions.EU_CENTRAL_1);

    LambdaInvokerFactory factory = LambdaInvokerFactory.builder()
            .context(this.getApplicationContext())
            .region(Regions.EU_CENTRAL_1)
            .credentialsProvider(credentialsProvider)
            .build();

    // Create the Lambda proxy object with the default JSON data binder.
    myInterface = factory.build(MyInterface.class);

    // Create a request object (depends on your Lambda function).
    SurveyListRequest surveyListRequest = new SurveyListRequest(strUuid, intLanguageID);

    // Invoke the Lambda function in an async task with definition of
    // the request object (-> SurveyListRequest) and
    // the response object (-> ArrayList<SurveyListItem>)
    new AsyncTask<SurveyListRequest, Void, ArrayList<SurveyListItem>>() {
        @Override
        protected ArrayList<SurveyListItem> doInBackground(SurveyListRequest... params) {
            try {
                return myInterface.ReadSurveyList(params[0]);
            } catch (LambdaFunctionException lfe) {
                Log.e("TAG", String.format("echo method failed: error [%s], details [%s].", lfe.getMessage(), lfe.getDetails()));
                return null;
            }
        }

        @Override
        protected void onPostExecute(ArrayList<SurveyListItem> surveyList) {
            // PROBLEM: here I get an ArrayList of LinkedTreeMap
        }
    }.execute(surveyListRequest);
}
Here is the code of my Lambda function interface:
public interface MyInterface {

    @LambdaFunction
    ArrayList<SurveyListItem> ReadSurveyList(SurveyListRequest surveyListRequest);
}
I would expect to get a list of my POJO objects. I found a lot of discussions about Gson and the ArrayList type, and solutions based on TypeToken (e.g. Gson TypeToken with dynamic ArrayList item type). Maybe it's the same problem...
I found a solution using a custom LambdaDataBinder. I specify the type of my POJO class "SurveyListItem" in the deserialize function. Gson then uses the TypeToken definition and converts the JSON string correctly to the list of POJOs (in my case, "SurveyListItem" objects).
Here is the sourcecode of MyLambdaDataBinder:
import com.amazonaws.mobileconnectors.lambdainvoker.LambdaDataBinder;
import com.amazonaws.util.StringUtils;
import com.google.gson.Gson;

import java.io.BufferedReader;
import java.io.ByteArrayInputStream;
import java.io.InputStreamReader;
import java.io.Reader;
import java.lang.reflect.Type;

public class MyLambdaDataBinder implements LambdaDataBinder {

    private final Gson gson;
    Type mType;

    // CUSTOMIZATION: pass the TypeToken type via the class constructor
    public MyLambdaDataBinder(Type type) {
        this.gson = new Gson();
        mType = type;
    }

    @Override
    public <T> T deserialize(byte[] content, Class<T> clazz) {
        if (content == null) {
            return null;
        }
        Reader reader = new BufferedReader(new InputStreamReader(new ByteArrayInputStream(content)));
        // CUSTOMIZATION: original line of code: return gson.fromJson(reader, clazz);
        return gson.fromJson(reader, mType);
    }

    @Override
    public byte[] serialize(Object object) {
        return gson.toJson(object).getBytes(StringUtils.UTF8);
    }
}
Here is how to use the custom MyLambdaDataBinder. Use your POJO instead of "SurveyListItem":
myInterface = factory.build(MyInterface.class, new MyLambdaDataBinder(new TypeToken<ArrayList<SurveyListItem>>() {}.getType()));

Hazelcast 3.6.1 "There is no suitable de-serializer for type" exception

I am using Hazelcast 3.6.1 to read from a Map. The object class stored in the map is called Schedule.
I have configured a custom serializer on the client side like this.
ClientConfig config = new ClientConfig();
SerializationConfig sc = config.getSerializationConfig();
sc.addSerializerConfig(add(new ScheduleSerializer(), Schedule.class));
...
private SerializerConfig add(Serializer serializer, Class<? extends Serializable> clazz) {
    SerializerConfig sc = new SerializerConfig();
    sc.setImplementation(serializer).setTypeClass(clazz);
    return sc;
}
The map is created like this
private final IMap<String, Schedule> map = client.getMap("schedule");
If I get from the map using the schedule id as the key, the map returns the correct value, e.g.
return map.get("zx81");
If I try to use an SQL predicate, e.g.
return new ArrayList<>(map.values(new SqlPredicate("statusActive")));
then I get the following error:
Exception in thread "main" com.hazelcast.nio.serialization.HazelcastSerializationException: There is no suitable de-serializer for type 2. This exception is likely to be caused by differences in the serialization configuration between members or between clients and members.
The custom serializer is using Kryo to serialize (based on this blog http://blog.hazelcast.com/comparing-serialization-methods/)
public class ScheduleSerializer extends CommonSerializer<Schedule> {

    @Override
    public int getTypeId() {
        return 2;
    }

    @Override
    protected Class<Schedule> getClassToSerialize() {
        return Schedule.class;
    }
}
The CommonSerializer is defined as
public abstract class CommonSerializer<T> implements StreamSerializer<T> {

    protected abstract Class<T> getClassToSerialize();

    @Override
    public void write(ObjectDataOutput objectDataOutput, T object) {
        Output output = new Output((OutputStream) objectDataOutput);
        Kryo kryo = KryoInstances.get();
        kryo.writeObject(output, object);
        output.flush(); // do not close!
        KryoInstances.release(kryo);
    }

    @Override
    public T read(ObjectDataInput objectDataInput) {
        Input input = new Input((InputStream) objectDataInput);
        Kryo kryo = KryoInstances.get();
        T result = kryo.readObject(input, getClassToSerialize());
        input.close();
        KryoInstances.release(kryo);
        return result;
    }

    @Override
    public void destroy() {
        // empty
    }
}
Do I need to do any configuration on the server side? I thought that the client config would be enough.
I am using Hazelcast client 3.6.1 and have one node/member running.
Queries require the nodes to know about the classes, as the byte stream has to be deserialized to access the attributes and query them. This means that when you want to query on objects, you have to deploy the model classes (and serializers) on the server side as well.
With key-based access, by contrast, we do not need to look into the values (nor into the keys, since we compare the byte arrays of the keys) and can just send the result back. That way, neither model classes nor serializers have to be available on the Hazelcast nodes.
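For the query case, a minimal sketch of what the member-side registration could look like, assuming Schedule, ScheduleSerializer, and the Kryo classes are also on the member's classpath (the config mirrors the client-side setup from the question):

import com.hazelcast.config.Config;
import com.hazelcast.config.SerializerConfig;
import com.hazelcast.core.Hazelcast;
import com.hazelcast.core.HazelcastInstance;

public class ScheduleMember {
    public static void main(String[] args) {
        Config config = new Config();
        // Register the same ScheduleSerializer for Schedule on the member,
        // so SQL predicates can deserialize values server-side.
        config.getSerializationConfig().addSerializerConfig(
                new SerializerConfig()
                        .setImplementation(new ScheduleSerializer())
                        .setTypeClass(Schedule.class));
        HazelcastInstance member = Hazelcast.newHazelcastInstance(config);
    }
}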
I hope that makes sense.