Compare two flux - spring-webflux

When I try to compare two Flux instances obtained from a reactive Couchbase repository, I face a timeout issue. I am trying to get all the cars from the vehicle collection.
Flux<Vehicles> vehicles = vehicleRepo.findAll();
Flux<Car> cars = carRepo.findAll();
cars.flatMap(car ->
        vehicles
            .filter(vehicle -> vehicle.type.equals("car")
                    && vehicle.id.equals(car.id)) // equals(), not ==, if ids are objects
            .map(vehicle -> car)); // the inner publisher must be returned from flatMap
When I try to compare these two fluxes, the query fails with the error "Query timed out while streaming/receiving rows".
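A likely cause (my assumption; the question doesn't state it): the snippet above subscribes to vehicles once per car, so the Couchbase query behind findAll() is re-executed for every emitted element, which can easily hit the query timeout. A minimal sketch of one way around this, caching the lookup side so its query runs only once:

// Run the vehicle query once and replay the results to every subscriber
Flux<Vehicles> vehicles = vehicleRepo.findAll().cache();

Flux<Car> matchedCars = carRepo.findAll()
        .flatMap(car -> vehicles
                .filter(v -> "car".equals(v.type) && v.id.equals(car.id))
                .map(v -> car));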


How to handle null response of get command using lettuce reactive api

I'm using the Lettuce reactive API to query Redis, and I need to handle the case where Redis doesn't have the key. But it seems that the subscribe callback is never called when the key doesn't exist.
I want to query Redis and then print the result:
if the result exists, it should print "result received:{result}",
and if the result doesn't exist, it should print "result received:null".
import io.lettuce.core.RedisClient;
import io.lettuce.core.RedisURI;
import io.lettuce.core.api.StatefulRedisConnection;
import io.lettuce.core.api.reactive.RedisStringReactiveCommands;

RedisURI redisURI = RedisURI.builder().withHost("10.203.0.114").withPort(6379).withPassword("123456").build();
RedisClient client = RedisClient.create(redisURI);
StatefulRedisConnection<String, String> connection = client.connect();
RedisStringReactiveCommands<String, String> reactive = connection.reactive();

// Key exists: the value is emitted and printed.
reactive.get("hello").onErrorReturn("error").subscribe(res -> {
    System.out.println("result receive:" + res);
}, Throwable::printStackTrace);

// Key missing: the Mono completes empty, so neither the value
// callback nor doOnError fires.
reactive.get("hell").doOnError(throwable -> {
    System.out.println("do on error");
    throwable.printStackTrace();
}).subscribe(r -> {
    System.out.println("result receive:" + r);
});
Redis has the key hello, and its value is world.
I expect the output of the first get call to be "result receive:world", and the actual output matches my expectation.
The output of the second call should be "result receive:null", but nothing is actually printed.
Solution: I found that Lettuce uses the Project Reactor API, whose behaviour differs from RxJava: a missing key yields an empty Mono rather than a null value. Use switchIfEmpty to handle the empty response:
reactive.get("hell").switchIfEmpty(Mono.just("")).doOnError(throwable -> {
System.out.println("do on error");
throwable.printStackTrace();
}).subscribe(r->{
System.out.println("result receive:"+r);
});
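As a side note (my addition, not from the original solution): when the fallback is a plain value rather than another publisher, Reactor's defaultIfEmpty does the same job with less ceremony:

// defaultIfEmpty substitutes a constant value when the Mono completes empty
reactive.get("hell")
        .defaultIfEmpty("")
        .subscribe(r -> System.out.println("result receive:" + r));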

Google Cloud Pub/Sub - Cloud Function & BigQuery - Data insert is not happening

I am using a Google Cloud Platform Function that listens to a Pub/Sub topic and inserts the data into BigQuery.
The input data I am passing from the Pub/Sub console is in JSON format, {"NAME": "ABCD"}, but from the console log I can see that the message arrives as {NAME, ABCD}, and execution errors out as well. Two common errors I faced:
"SyntaxError: Unexpected token n in JSON at position 1 at Object.parse (native) at exports.helloPubSub"
"ERROR: { Error: Invalid value at 'rows[0].json'"
Input given:
gcloud pubsub topics publish pubsubtopic1 --message {"name":"ABCD"}
I tried various formats of input data, with single quotes, square brackets, and other possible options as well; nothing helped.
Workarounds like JSON.parse and JSON.stringify avoid the first issue mentioned above but still end up with the rows[0] issue.
When I pass the JSON input data as hard-coded values inside the Cloud Function, like {"NAME": "ABCD"}, the data is inserted properly.
/** This is working code since I hard-coded the data in JSON format; the lines I tried that did not help are commented out. **/
/**
 * Triggered from a message on a Cloud Pub/Sub topic.
 *
 * @param {!Object} event Event payload and metadata.
 * @param {!Function} callback Callback function to signal completion.
 */
exports.helloPubSub = (event, callback) => {
  const pubsubMessage = event.data;
  console.log(Buffer.from(pubsubMessage.data, 'base64').toString());
  const {BigQuery} = require('@google-cloud/bigquery');
  const bigquery = new BigQuery();
  //console.log(Buffer.from(pubsubMessage.data, 'base64').toString());
  //console.log(JSON.parse(Buffer.from(pubsubMessage.data, 'base64').toString()));
  var myjson = '{"NAME":"ABCD","STATE":"HHHH","AGE":"12"}';
  console.log(myjson);
  bigquery
    .dataset("DEMO")
    .table("EMP")
    .insert(JSON.parse(myjson),
      {'ignoreUnknownValues': true, 'raw': false})
    //.insert(JSON.parse(Buffer.from(pubsubMessage.data, 'base64').toString()),
    .then((data) => {
      console.log('Inserted 1 row');
      console.log(data);
    })
    .catch(err => {
      if (err && err.name === 'PartialFailureError') {
        if (err.errors && err.errors.length > 0) {
          console.log('Insert errors:');
          err.errors.forEach(err => console.error(err));
        }
      } else {
        console.error('ERROR:', err);
      }
    });
};
I ran a quick test using gcloud to publish and to pull the message as well.
Using the syntax you mentioned, I get the following result:
gcloud pubsub topics publish pubsubtopic1 --message {"name":"ABCD"}
gcloud pubsub subscriptions pull pubsubsubscription1
The result is:
DATA | {name:ABCD}
If you use this syntax instead:
gcloud pubsub topics publish pubsubtopic1 --message "{\"name\":\"ABCD\"}"
gcloud pubsub subscriptions pull pubsubsubscription1
The result is:
DATA | {"name":"ABCD"}
EDIT 2019-04-01
The workaround above is for test purposes; the need to use escape characters is a caveat of using the command line. To publish from your real application, you may use a REST call or a client library as listed here. Please note the Pub/Sub API expects the message to be base64-encoded. For example:
POST https://pubsub.googleapis.com/v1/projects/{YOUR_PROJECT_ID}/topics/{YOUR_TOPIC}:publish?key={YOUR_API_KEY}
{
  "messages": [
    {
      "data": "eyJuYW1lIjoiQUJDRCJ9"
    }
  ]
}
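For reference, the data value above is just the JSON payload base64-encoded. A minimal Java sketch (my illustration; the class name is made up) that reproduces it:

import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class EncodePayload {
    public static void main(String[] args) {
        String json = "{\"name\":\"ABCD\"}";
        // The Pub/Sub API expects the "data" field to be base64-encoded
        String data = Base64.getEncoder()
                .encodeToString(json.getBytes(StandardCharsets.UTF_8));
        System.out.println(data); // prints eyJuYW1lIjoiQUJDRCJ9
    }
}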

AngularJS/Ionic How to correctly get data from database to display in view?

What is the correct/best approach to get data from a database for something like a calendar and its events?
I think there are two possible ways:
1) Should I get all data from the database and save it in a $scope object, so that I access the SQLite DB only once and work with the object later on?
2) Should I only get the data for the currently selected day of the calendar, and do another SQL query whenever I want to display another day?
EDIT: the database is local, so network performance is not a concern.
I think it depends on what is more important to you and how big the data is.
If you are working with a local database, keep in mind that you always have to perform async operations to get data from it. So there is always a little time (depending on device performance) in which the promises have to resolve, even with a local DB.
My Ionic application follows option 2 of your two, without using any cache. The trick to rendering correct data is to resolve the relevant data before entering the view.
My stateprovider looks like this:
.state('app.messages', {
    url: "/messages",
    views: {
        'menuContent': {
            templateUrl: "pathToTemplate.html",
            controller: 'ExampleMessageListCtrl'
        }
    },
    resolve: {
        messages: function($q, DatabaseService) {
            var deferred = $q.defer();
            // All db functions are async operations
            DatabaseService.getMessageList().then(function(messageList) {
                deferred.resolve(messageList);
            }, function(err) {
                deferred.reject(err);
            });
            return deferred.promise;
        }
    },
    // Disable cache to always get the values from db
    cache: false
})
In the controller you can access the messages variable via injection:
.controller('ExampleMessageListCtrl', function($scope, messages) {
    var loadMessages = function() {
        $scope.messages = messages;
    };
    ...
});
The benefit of resolving the data before entering the state is that you always get the data that is actually in the DB, without rendering frames of empty default data from the scope variables.
I think option 1 is the right choice if you want to display data very quickly. The challenge in that case is to keep your cached data synchronized with the data in the database without losing much performance, which I think is not easy.

GridGain SQL Transform Query Limitations

I am running into an issue with an SQL transform query. I have a replicated cache set up with thousands of cached items of various classes. When I run a transform query that returns specific (summary) items from classes in the cache, the query appears to execute just fine and returns a Collection. However, when I iterate through the Collection, after 2,048 items the individual items (which could be cast until then) are now simply GridCacheQueryResponseEntry instances, which I can't seem to cast anymore...
Is 2,048 items the limit for a Transform Query Result Set in GridGain?
Here's the code I use to query/transform the cache items (simplified a bit). It works for exactly 2,048 items and then throws an exception:
GridCacheQuery<Map.Entry<UUID, Object>> typeQuery =
        queries.createSqlQuery(Object.class, "from Object where Type = ? and Ident regexp ?");
GridClosure<Map.Entry<UUID, Object>, ReturnObject> trans =
        new GridClosure<Map.Entry<UUID, Object>, ReturnObject>() {
    @Override public ReturnObject apply(Map.Entry<UUID, Object> e) {
        ReturnObject tmp = null; // declared outside the try block so it can be returned below
        try {
            tmp = e.getValue().getReturnObject();
        } catch (Exception ex) {
            System.err.println(ex.getMessage());
        }
        return tmp;
    }
};
Collection<ReturnObject> results = typeQuery.execute(trans, "VarA", "VarB").get();
Iterator iter = results.iterator();
while (iter.hasNext()) {
    try {
        Object item = iter.next();
        ReturnObject point = (ReturnObject) item; // fails after 2,048 items
    } catch (Exception ex) {}
}
There are no such limitations in GridGain. Whenever you execute a query, you have two options:
Call GridCacheQueryFuture.get() to get the whole result set as a collection. This works only for relatively small result sets, because all the rows in the result set have to be loaded into the client node's memory.
Use GridCacheQueryFuture.next() to iterate through the result set (see the sketch below). In this case results are acquired from remote nodes page by page. When you finish iterating through a page, it is discarded and the next one is loaded. You hold only one page at a time, which lets you iterate over result sets of any size.
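A minimal sketch of option 2, assuming the typeQuery/trans objects from the question and the GridCacheQueryFuture type named in option 1 (exception handling elided):

// Execute the transform query, then pull results page by page
GridCacheQueryFuture<ReturnObject> fut = typeQuery.execute(trans, "VarA", "VarB");

ReturnObject item;
// next() returns null once the result set is exhausted; only the
// current page is kept in memory on the client node
while ((item = fut.next()) != null) {
    // process each transformed item here instead of materializing
    // the whole result set with fut.get()
}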
As for GridCacheQueryResponseEntry, you should not cast to it: it's an internal GridGain class, actually a simple implementation of the Map.Entry interface that represents a key-value pair from the GridGain cache.
In case of a transform query you get Map.Entry instances only inside the transformer, while the client node receives already-transformed values, so I'm not sure how it's possible to get them during iteration. Can you provide a small code example of how you execute the query?

WCF Data Services: saving relational data in a single click

I'm using WCF Data Services on Windows Phone 7, and I want to save relational data in a single click. How could I do that?
Think of two tables: Category and Products.
Data I want to save from the UI:
From the Category table:
CategoryId: (auto increment)
CategoryName: abc
From the Products table:
ProductId: (auto increment)
CategoryId: ? (not sure how I could retrieve this)
ProductsName: xyz
On the Save button click, I want to insert the above data into the appropriate tables. How could I do that?
I am using the following code to add data to one table:
try
{
    context = new NorthwindEntities(NorthwindUri);
    context.AddToProducts(product);
    context.BeginSaveChanges(new AsyncCallback((result) =>
    {
        bool errorOccured = false;
        // Use the Dispatcher to ensure that the
        // asynchronous call returns in the correct thread.
        Deployment.Current.Dispatcher.BeginInvoke(() =>
        {
            context = result.AsyncState as NorthwindEntities;
            try
            {
                // Complete the save changes operation and display the response.
                DataServiceResponse response = context.EndSaveChanges(result);
                foreach (ChangeOperationResponse changeResponse in response)
                {
                    if (changeResponse.Error != null) errorOccured = true;
                }
                if (!errorOccured)
                {
                    MessageBox.Show("The changes have been saved to the data service.");
                }
                else
                {
                    MessageBox.Show("An error occurred. One or more changes could not be saved.");
                }
            }
            catch (Exception ex)
            {
                // Display the error from the response.
                MessageBox.Show(string.Format("The following error occurred: {0}", ex.Message));
            }
        });
    }), context);
}
catch (Exception ex)
{
    MessageBox.Show(string.Format("The changes could not be saved to the data service.\n"
        + "The following error occurred: {0}", ex.Message));
}
In OData, relationships are not represented as foreign keys; instead they are represented as navigation properties, and you manipulate them by manipulating links in the client library.
Take a look at this article: http://msdn.microsoft.com/en-us/library/dd756361(v=vs.103).aspx
You can call multiple methods which modify data and then call SaveChanges to send them all to the server.
Note, though, that if the server requires referential integrity and you are, for example, adding two related entities at the same time, you might need to use SaveChanges(SaveChangesOptions.Batch), which makes the client send everything in one request and thus allows the server to process it as a single transaction.