Aerospike ListOperation usage in Java client

I'm getting started with ListOperation in the Java client, and the operations do not work as expected. Here is my code:
AerospikeClient client = new AerospikeClient("10.0.0.1", 3000);
Key key = new Key("test", "demo", "testlist");
Value vll = Value.get(123);
List<Value> itemList = new ArrayList<Value>();
itemList.add(Value.get("s11"));
itemList.add(Value.get("s22222"));
Bin bin01 = new Bin("bin3", Value.get(itemList));
client.put(null, key, bin01);
Record record0 = client.operate(null, key, Operation.put(new Bin("bin4", 90)), ListOperation.insertItems("bin6", 0, itemList) , Operation.get());
System.out.println("record0");
System.out.println(record0);
Record record1 = client.operate(null, key, ListOperation.append("bin3", vll), Operation.get());
System.out.println("record1");
System.out.println(record1);
itemList.add(Value.get("s3333333"));
itemList.add(Value.get("s4444444444"));
itemList.add(Value.get("s5555555555555555"));
Record record2 = client.operate(null, key,
ListOperation.insertItems("bin3", 0, itemList), Operation.get()
);
System.out.println("record2");
System.out.println(record2);
Record record3 = client.get(null, key);
System.out.println("record3");
System.out.println(record3);
And here is the output:
record0 (gen:2),(exp:239021264),(bins:(bin3:[s11, s22222]),(bin4:90))
record1 (gen:3),(exp:239021264),(bins:(bin3:[s11, s22222]),(bin4:90))
record2 (gen:4),(exp:239021265),(bins:(bin3:[s11, s22222]),(bin4:90))
record3 (gen:4),(exp:239021265),(bins:(bin3:[s11, s22222]),(bin4:90))
It seems that none of the ListOperations in my code were applied. Am I using them wrong?
Thanks.

You need server version 3.7.0.1+ for the above ListOperations. You are testing on version 3.3.21.
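As a quick check, you can ask each node for its build from the same client before relying on list operations. This is only a sketch, assuming the Info.request(node, name) helper and client.getNodes() in your Java client version (imports: com.aerospike.client.Info, com.aerospike.client.cluster.Node):
for (Node node : client.getNodes()) {
// "build" reports the server version, e.g. "3.3.21" or "3.7.0.1"
String build = Info.request(node, "build");
System.out.println(node.getName() + " build: " + build);
}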


IntelliJ IDEA LiveTemplate auto increment between usages

I am trying to make my life easier with Live Templates in IntelliJ.
I need to increment some param by 1 every time I use the snippet.
So I tried to write some groovyScript, and I am close, but my Groovy skills are holding me back: the number is not incremented by 1 but by 57 for some reason... (UTF-8?)
here is the script:
File file = new File("out.txt");
int code = Integer.parseInt(file.getText('UTF-8'));
code=code+1;
try{
if(_1){
code = Integer.parseInt(_1);
}
} catch(Exception e){}
file.text = code.toString();
return code
So whenever a param is passed to this script (via _1) the initial value is set; otherwise the stored value is simply incremented.
This script needs to be passed to the live template variable with:
groovyScript("File file = new File(\"out.txt\");int code = Integer.parseInt(file.getText(\'UTF-8\'));code=code+1;String propName = \'_1\';if(this.hasProperty(propName) && this.\"$propName\"){code = Integer.parseInt(_1);};file.text =code.toString();return code", "<optional initial value>")

Grid Events and oldValue versus newValue

When updating entities in my cache, oldValue equals newValue in the event unless I do a put with the original object instance, which of course is not always possible.
Here is a simplified example.
IgnitePredicate<CacheEvent> locLsnr = evt -> {
// do something
return true; // IgnitePredicate must return a boolean; true keeps the listener subscribed
};
ignite.events().localListen(locLsnr,EventType.EVT_CACHE_OBJECT_PUT);
IgniteCache<TradeKey, Trade> cache = ignite.getOrCreateCache("MyCache");
Trade trade1 = new Trade();
trade1.setId(1);
trade1.setSize(10);
cache.put(new TradeKey(trade1.getId()), trade1);
// event is generated
//evt.oldValue is null, no problem
trade1.setSize(20);
cache.put(new TradeKey(trade1.getId()), trade1);
// event is generated
// evt.oldValue().getSize() is 10, evt.newValue().getSize() is 20, this is GOOD
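As a side note, here is a sketch of what the "// do something" body above could do to inspect both sides of the update (depending on marshaller settings the values may arrive as BinaryObject instances rather than Trade objects):
IgnitePredicate<CacheEvent> locLsnr = evt -> {
// CacheEvent exposes both sides of the update
System.out.println("key=" + evt.key() + ", old=" + evt.oldValue() + ", new=" + evt.newValue());
return true; // keep the listener registered
};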
But if I retrieve the trade from the cache again before updating (from another part of the application, for example), I can no longer see what the old value was in the event: the old value just shows the new value.
IgnitePredicate<CacheEvent> locLsnr = evt -> {
// do something
return true; // IgnitePredicate must return a boolean; true keeps the listener subscribed
};
ignite.events().localListen(locLsnr,EventType.EVT_CACHE_OBJECT_PUT);
IgniteCache<TradeKey, Trade> cache = ignite.getOrCreateCache("MyCache");
Trade trade1 = new Trade();
trade1.setId(1);
trade1.setSize(10);
cache.put(new TradeKey(trade1.getId()), trade1);
// event is generated
//evt.oldValue is null, no problem
trade1 = cache.get(new TradeKey(1)); // or could be a query or any search on the cache
trade1.setSize(20);
cache.put(new TradeKey(trade1.getId()), trade1);
// event is generated
// evt.oldValue().getSize() is 20, evt.newValue().getSize() is 20, this is BAD
Any advice?
Thanks.
This is also discussed on Apache Ignite user forum: http://apache-ignite-users.70518.x6.nabble.com/Grid-Events-and-oldValue-versus-newValue-td10577.html

Pentaho Kettle: list of remote Carte objects IDs from Java

I already know how to run and attach to a transformation running on a remote Carte server from Java, given the transformation's Carte object ID:
KettleEnvironment.init();
TransMeta transMeta = new TransMeta("file.ktr");
Trans trans = new Trans(transMeta);
SlaveServer ss = new SlaveServer("test", IP, PORT, "cluster", "cluster");
TransExecutionConfiguration jec = new TransExecutionConfiguration();
jec.setRemoteServer(ss);
String carteObjectId = trans.sendToSlaveServer(transMeta, jec, null, null);
and
KettleEnvironment.init();
SlaveServer ss = new SlaveServer("test", IP, PORT, "cluster", "cluster");
SlaveServerTransStatus state = ss.getTransStatus(transMetaName, carteObjectId, 0);
List<StepStatus> list = state.getStepStatusList();
However, for more general (and usable) remote monitoring I need the whole list of object IDs of the transformations that are running or have run on the remote Carte server. Which methods can I use to get such a list?
List<SlaveServerTransStatus> transStatuses = slave1.getStatus().getTransStatusList();
for (SlaveServerTransStatus transStatus : transStatuses) {
System.out.println(transStatus.getTransName() + "--" + transStatus.getStatusDescription() + "---" + transStatus.getId());
}
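Combining this with the getTransStatus call from the question, a sketch for drilling into each transformation by its Carte object ID could look like this (same SlaveServer instance as above, exceptions omitted):
for (SlaveServerTransStatus ts : slave1.getStatus().getTransStatusList()) {
// ts.getId() is the Carte object ID; pass it to getTransStatus for step-level detail
SlaveServerTransStatus detail = slave1.getTransStatus(ts.getTransName(), ts.getId(), 0);
System.out.println(ts.getTransName() + " [" + ts.getId() + "] " + detail.getStatusDescription());
}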

Kettle (Pentaho PDI): Couldn't find starting point in this job

I'm manually creating a Job using Kettle from Java, but I get the error message Couldn't find starting point in this job.
KettleEnvironment.init();
JobMeta jobMeta = new JobMeta();
JobEntrySpecial start = new JobEntrySpecial("START", true, false);
start.setStart(true);
JobEntryCopy startEntry = new JobEntryCopy(start);
jobMeta.addJobEntry(startEntry);
JobEntryTrans jet1 = new JobEntryTrans("first");
Trans trans1 = jet1.getTrans();
jet1.setFileName("file.ktr");
JobEntryCopy jc1 = new JobEntryCopy(jet1);
jobMeta.addJobEntry(jc1);
jobMeta.addJobHop(new JobHopMeta(startEntry, jc1));
Job job = new Job(null, jobMeta);
job.setInteractive(true);
job.start();
I've discovered that I was missing
job.setStartJobEntryCopy(startEntry);
The org.pentaho.di.job.JobMeta class has a findJobEntry method.
You can use it to look for the entry point called "START".
This is how the original Kettle (PDI) source does it:
private JobMeta jobMeta;
....
// Where do we start?
JobEntryCopy startpoint;
....
if ( startJobEntryCopy == null ) {
startpoint = jobMeta.findJobEntry( JobMeta.STRING_SPECIAL_START, 0, false );
// and then
JobEntrySpecial jes = (JobEntrySpecial) startpoint.getEntry();
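Putting the two answers together, a minimal sketch of the fix against the jobMeta built in the question (waitUntilFinished is assumed to be available on Job in your Kettle version):
// Locate the START entry that was added to the JobMeta
JobEntryCopy startEntry = jobMeta.findJobEntry(JobMeta.STRING_SPECIAL_START, 0, false);
Job job = new Job(null, jobMeta);
// Without this, the engine cannot find its starting point
job.setStartJobEntryCopy(startEntry);
job.start();
job.waitUntilFinished();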

Update a DynamicContent item of a dynamic module in Sitefinity

I am trying to update a DynamicContent Property of a module using the following:
DynamicModuleManager dynamicModuleManager = DynamicModuleManager.GetManager();
Type pollanswerType = TypeResolutionService.ResolveType("Telerik.Sitefinity.DynamicTypes.Model.Poll.Pollanswer");
Guid pollanswerID = new Guid(answerID);
// This is how we get the pollanswer item by ID
DynamicContent pollanswerItem = dynamicModuleManager.GetDataItem(pollanswerType, pollanswerID);
pollanswerItem.SetValue("VoteCount", int.Parse(pollanswerItem.GetValue("VoteCount").ToString()) + 1);
dynamicModuleManager.SaveChanges();
Basically I am getting the current property value, incrementing it by 1,
and calling SaveChanges().
The code runs without errors, but the value is not updated when I check it from the Sitefinity back end.
Any suggestions?
The problem might be caused by the pollanswerID that you are passing.
If pollanswerID is the ID of the live version of the content item, the value won't be set.
Make sure you set the field value on the master version of the content item, not the live one.
If you don't know the ID of the master version, you can get the master item from the ID of the live version of the content item:
var masterItem = dynamicModuleManager.GetDataItems(pollanswerType).Where(dynItem => dynItem.Id == pollanswerItem.OriginalContentId).FirstOrDefault();
if (masterItem != null)
{
masterItem.SetValue("VoteCount", int.Parse(masterItem.GetValue("VoteCount").ToString()) + 5);
}
You should always be doing this:
Get Master
Checkout Master
Checkin Master
Save Changes
Publish changes – this will update live.
Here bookingItemLive is the live record:
var bookingItemMaster = dynamicModuleManager.Lifecycle.Edit(bookingItemLive) as DynamicContent;
//Check out the master to get a temp version.
DynamicContent bookingItemTemp = dynamicModuleManager.Lifecycle.CheckOut(bookingItemMaster) as DynamicContent;
//Make the modifications to the temp version.
bookingItemTemp.SetValue("CleanerId", cleanerId);
bookingItemTemp.SetValue("SubCleanerId", subCleanerId);
bookingItemTemp.LastModified = DateTime.UtcNow;
//Checkin the temp and get the updated master version.
//After the check in the temp version is deleted.
bookingItemMaster = dynamicModuleManager.Lifecycle.CheckIn(bookingItemTemp) as DynamicContent;
dynamicModuleManager.SaveChanges();
//Publish the item now
ILifecycleDataItem publishedBookingItem = dynamicModuleManager.Lifecycle.Publish(bookingItemMaster);
bookingItemMaster.SetWorkflowStatus(dynamicModuleManager.Provider.ApplicationName, "Published");
@Joseph Ghassan: You can follow the steps below.
Step 1: Get Live
var bookingItemLive= manager.GetDataItem(ModuleType, GuidId);
Step 2: Get Master
var bookingItemMaster = manager.Lifecycle.GetMaster(bookingItemLive) as DynamicContent;
Step 3: Checkout Master --> a draft item will be created (you will update the data through the draft)
if (bookingItemMaster == null)
{
return new HttpResponseMessage
{
Content = new JsonContent(new { Result = false })
};
}
var bookingItemDraff = manager.Lifecycle.CheckOut(bookingItemMaster) as DynamicContent;
//Make the modifications to the temp version.
bookingItemDraff.SetValue("CleanerId", cleanerId);
bookingItemDraff.SetValue("SubCleanerId", subCleanerId);
bookingItemDraff.LastModified = DateTime.UtcNow;
Step 4: Checkin Master
// Now we need to check in, so the changes apply
var checkInItem = manager.Lifecycle.CheckIn(bookingItemDraff);
Step 5: Publish changes
manager.Lifecycle.Publish(checkInItem);
Step 6: Save Changes - this will save to the database and update the live item.
bookingItemDraff.SetWorkflowStatus(manager.Provider.ApplicationName, CustomFieldName.PublishedStatus);
manager.SaveChanges();