ThingWorx: Customize `GetImplementingThings` service

I am new to ThingWorx and want to get a practical feel for implementing services with this example.
I have the following data model:
Thing 'Car' has Thing 'Sensor' (as an InfoTable property).
I want a service on CarTemplate that returns all implementing Cars and, instead of the Sensor object, returns the Sensor's 'name' property.
What I have now:
"Car1Name" | SensorObject
What I want:
"Car1Name" | "Accelerator1Name"
Please help me make this happen.

There is no such thing as a "static" service on ThingTemplates; if you want to retrieve all implementing Things of a ThingTemplate along with their property values, you should build a Thing Helper.
What's a Thing Helper? It's another Thing, call it whatever you want, say CarHelpers, which has a service called GetCarsWithSensors that does a ThingTemplates["ThingTemplateName"].GetImplementingThings(), or a GetImplementingThingsWithData(), and returns the desired InfoTable.

Carles' answer is valid, but I would avoid using QueryImplementingThingsWithData.
The problem with QueryImplementingThingsWithData is that ThingWorx will check visibility, then security, for every single property on every single implementing Thing. This is fine if you are running as a user in the Administrators group, but once you have a lot of UserGroups and OrganizationalUnits it will slow down, A LOT.
Instead, do something like this (you'll need to create a DataShape and set it as your service's return DataShape):
// result: the service's output InfoTable, built from your CarDataShape
var result = Resources["InfoTableFunctions"].CreateInfoTableFromDataShape({
    infoTableName: "InfoTable",
    dataShapeName: "CarDataShape"
});
var myThings = ThingTemplates["CarTemplate"].QueryImplementingThings();
for (var i = 0; i < myThings.getRowCount(); i++) {
    var myCar = Things[myThings.rows[i].name];
    for (var j = 0; j < myCar.sensorProperty.getRowCount(); j++) {
        var newRow = {};
        newRow.name = myCar.name;
        newRow.sensor = myCar.sensorProperty.rows[j].sensorName;
        result.AddRow(newRow);
    }
}

Complex function using Parse Server Cloud Code (looping and creating records)

After a night of trial and error I have decided on a much simpler way to explain my issue. Again, I have no JS experience, so I don't really know what I am doing.
I have 5 classes:
game - holds information about my games
classification - holds information about the user classes available in games
game_classifications - creates a one-game-to-many-classifications relationship (lets a game have multiple classes)
mission - holds my mission information
mission_class - creates a one to many relationship between a mission and the classes available for that mission
Using Cloud Code, I want to provide two inputs through my REST API: missionObjectId and gameObjectId.
The actual steps I need the code to perform are:
Get the two inputs provided {"missionObjectId":"VALUE","gameObjectId":"VALUE"}
Search the game_classifications class for all records where game = gameObjectID
For each returned record, create a new record in mission_class with the following information:
mission_id = missionObjectId
classification = result.classification
Here is an image of the tables:
And here is how I have tried to achieve this:
Parse.Cloud.define("activateMission", async (request) => {
Parse.Cloud.useMasterKey();
const query = new Parse.query('game_classifications');
query.equalTo("gameObjectId", request.params.gameObjectId);
for (let i = 0; i < query.length; i ++) {
const mission_classification = Parse.Object.extend("mission_class");
const missionClass = new mission_classification();
missionClass.set("mission_id", request.params.missionObjectId);
missionClass.set("classification_id", query[i].classificationObjectId);
return missionClass.save();
}
});
Does anyone have any advice or input that might help me achieve this goal?
The current error I am getting is:
Parse.query is not a constructor
Thank you all in advance!
Some problems in your current code:
Parse.Cloud.useMasterKey() has not existed for quite a long time. Use the useMasterKey option instead.
It's Parse.Query and not Parse.query.
You need to run the query.findAll() command and iterate over its results (not over the query object).
For performance, move Parse.Object.extend calls to the beginning of the file.
To access the field of an object, use obj.get('fieldName') and not obj.fieldName.
If you return the save operation, it will save the first object, return, and not save the others.
So, the code needs to be something like this:
const mission_classification = Parse.Object.extend("mission_class");
const game = Parse.Object.extend("game");
Parse.Cloud.define("activateMission", async (request) => {
    const query = new Parse.Query('game_classifications');
    const gameObj = new game();
    gameObj.id = request.params.gameObjectId;
    query.equalTo("gameObjectId", gameObj);
    const queryResults = await query.findAll({ useMasterKey: true });
    for (let i = 0; i < queryResults.length; i++) {
        const missionClass = new mission_classification();
        missionClass.set("mission_id", request.params.missionObjectId);
        missionClass.set("classification_id", queryResults[i].get('classificationObjectId'));
        await missionClass.save(null, { useMasterKey: true });
    }
});

How to convert a TStringDynArray to a TStringList

I'm using TDirectory::GetFiles() to get a list of files (obviously).
The result is stored in a TStringDynArray and I want to transfer it to a TStringList for the sole purpose to use the IndexOf() member to see if a string is present in the list or not.
Any solution that will let me know if a certain string is present in the list of files returned from TDirectory::GetFiles() will do fine. Although, it would be interesting to know how to convert the TStringDynArray.
TStringDynArray DynFiles = TDirectory::GetFiles("Foo path");
System::Classes::TStringList *Files = new System::Classes::TStringList;
Files->Assign(DynFiles); // I know this is wrong, but it illustrates what I want to do.
if (Files->IndexOf("Bar") != -1) { // <---- This is my goal, to find "Bar" in the list of files.
}
TStringList and TStringDynArray do not know anything about each other, so you will have to copy the strings manually:
TStringDynArray DynFiles = TDirectory::GetFiles("Foo path");
System::Classes::TStringList *Files = new System::Classes::TStringList;
for (int I = DynFiles.Low; I <= DynFiles.High; ++I)
    Files->Add(DynFiles[I]);
if (Files->IndexOf("Bar") != -1)
{
    //...
}
delete Files;
Since you have to manually loop through the array anyway, you can get rid of the TStringList:
TStringDynArray DynFiles = TDirectory::GetFiles("Foo path");
for (int I = DynFiles.Low; I <= DynFiles.High; ++I)
{
    if (DynFiles[I] == "Bar")
    {
        //...
        break;
    }
}
But, if you are only interested in checking for the existence of a specific file, look at TFile::Exists() instead, or even Sysutils::FileExists().
if (TFile::Exists("Foo path\\Bar"))
{
//...
}
if (FileExists("Foo path\\Bar"))
{
//...
}
* personally, I hate that the IOUtils unit uses dynamic arrays for lists. They are slow, inefficient, and do not integrate well with the rest of the RTL. But that is just my opinion.
TStrings knows TStringDynArray well enough to provide an AddStrings member:
Files->AddStrings(TDirectory::GetFiles("Foo path"));
will do the job.

Delaying writes to SQL Server

I am working on an app and need to keep track of how many views a page has, almost like how SO does it. It is a value used to determine how popular a given page is.
I am concerned that writing to the DB every time a new view needs to be recorded will impact performance. I know this is borderline premature optimization, but I have experienced the problem before. Anyway, the value doesn't need to be real time; it is OK if it is delayed by 10 minutes or so. I was thinking that caching the data and doing one large write every X minutes should help.
I am running on Windows Azure, so the AppFabric cache is available to me. My original plan was to create some sort of compound key (PostID:UserID) and tag the key with "pageview". AppFabric allows you to get all keys by tag. Thus I could let them build up and do one bulk insert into my table instead of many small writes. The table looks like this, but is open to change.
int PageID | guid userID | DateTime ViewTimeStamp
The website would still get the value from the database; writes would just be delayed. Make sense?
I just read that the Windows Azure AppFabric cache does not support tag-based searches, so that pretty much negates my idea.
My question is, how would you accomplish this? I am new to Azure, so I am not sure what my options are. Is there a way to use the cache without tag-based searches? I am just looking for advice on how to delay these writes to SQL.
You might want to take a look at http://www.apathybutton.com (and the Cloud Cover episode it links to), which talks about a highly scalable way to count things. (It might be overkill for your needs, but hopefully it gives you some options.)
You could keep a queue in memory and, on a timer, drain the queue, collapse the queued items by totaling the counts per page, and write them in one SQL batch/round trip. For example, using a table-valued parameter (TVP) you could write the queued totals with one sproc call, as sketched below.
That of course doesn't guarantee the view counts get written, since they sit in memory and are written lazily, but page counts shouldn't be critical data and crashes should be rare.
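Here is a minimal sketch of that TVP idea, assuming a hypothetical table type dbo.PageViewCount, a sproc dbo.AddPageViews, and a PageViews(PageID, ViewCount) table; none of these names come from the question's schema, so adjust to taste:
// Hypothetical SQL-side setup (assumed names, shown for context only):
//   CREATE TYPE dbo.PageViewCount AS TABLE (PageID int, ViewCount int);
//   CREATE PROCEDURE dbo.AddPageViews @views dbo.PageViewCount READONLY AS
//       UPDATE p SET p.ViewCount = p.ViewCount + v.ViewCount
//       FROM dbo.PageViews p JOIN @views v ON v.PageID = p.PageID;
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

static class ViewCountWriter
{
    // Writes all queued per-page totals in a single round trip via a TVP.
    public static void Flush(IDictionary<int, int> countsByPage, string connectionString)
    {
        // Build the table-valued parameter from the in-memory totals.
        var table = new DataTable();
        table.Columns.Add("PageID", typeof(int));
        table.Columns.Add("ViewCount", typeof(int));
        foreach (var pair in countsByPage)
            table.Rows.Add(pair.Key, pair.Value);

        using (var con = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand("dbo.AddPageViews", con))
        {
            cmd.CommandType = CommandType.StoredProcedure;
            var p = cmd.Parameters.AddWithValue("@views", table);
            p.SqlDbType = SqlDbType.Structured;
            p.TypeName = "dbo.PageViewCount";
            con.Open();
            cmd.ExecuteNonQuery(); // one sproc call for the whole batch
        }
    }
}
The sproc receives the whole batch as a single structured parameter, so the flush is one round trip no matter how many pages were viewed.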
You might want to have a look at how the "diagnostics" feature in Azure works. Not because you would use diagnostics for what you are doing at all, but because it deals with a similar problem and may provide some inspiration. I am just about to implement a data auditing feature, and I want to log it to table storage, so I also want to delay and bunch the updates together; I have taken a lot of inspiration from diagnostics.
Now, the way Diagnostics in Azure works is that each role starts a little background "transfer" thread. Whenever you write any traces, they get stored in a list in local memory, and the background thread will (by default) bunch all the requests up and transfer them to table storage every minute.
In your scenario, I would let each role instance keep track of a count of hits and then use a background thread to update the database every minute or so.
I would probably use something like a static ConcurrentDictionary (or one hanging off a singleton) on each web role, with each hit incrementing the counter for the page identifier, as sketched below. You'd need some thread-handling code to allow multiple requests to update the same counter in the list. Alternatively, just allow each "hit" to add a new record to a shared thread-safe list.
Then, have a background thread increment the database once per minute with the number of hits per page since last time, and reset the local counter to 0 (or empty the shared list if you are going with that approach; again, be careful about the multithreading and locking).
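A minimal sketch of that ConcurrentDictionary variant (class and member names here are illustrative only, not an existing API):
using System.Collections.Concurrent;
using System.Collections.Generic;

public static class PageHitCounter
{
    // One dictionary per web role instance; AddOrUpdate handles concurrent hits
    // to the same page without explicit locking.
    private static readonly ConcurrentDictionary<string, int> PageHits =
        new ConcurrentDictionary<string, int>();

    public static void LogHit(string pageId)
    {
        PageHits.AddOrUpdate(pageId, 1, (key, current) => current + 1);
    }

    // Called by the background timer thread: take the accumulated counts and reset them.
    public static IDictionary<string, int> Drain()
    {
        var snapshot = new Dictionary<string, int>();
        foreach (var pageId in PageHits.Keys)
        {
            int count;
            if (PageHits.TryRemove(pageId, out count) && count > 0)
                snapshot[pageId] = count;
        }
        return snapshot;
    }
}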
The important thing is to make sure your database update is atomic: if you read the current count from the database, increment it, and then write it back, you may have two different web role instances doing this at the same time and thus lose an update.
EDIT:
Here is a quick sample of how you could go about this.
using System.Collections.Concurrent;
using System.Data.SqlClient;
using System.Threading;
using System;
using System.Collections.Generic;
using System.Linq;
class Program
{
static void Main(string[] args)
{
// You would put this in your Application_start for the web role
Thread hitTransfer = new Thread(() => HitCounter.Run(new TimeSpan(0, 0, 1))); // You'd probably want the transfer to happen once a minute rather than once a second
hitTransfer.Start();
//Testing code - this just simulates various web threads being hit and adding hits to the counter
RunTestWorkerThreads(5);
Thread.Sleep(5000);
// You would put the following line in your Application shutdown
HitCounter.StopRunning(); // You could do some cleverer stuff with aborting threads, joining the thread etc but you probably won't need to
Console.WriteLine("Finished...");
Console.ReadKey();
}
private static void RunTestWorkerThreads(int workerCount)
{
Thread[] workerThreads = new Thread[workerCount];
for (int i = 0; i < workerCount; i++)
{
workerThreads[i] = new Thread(
(tagname) =>
{
Random rnd = new Random();
for (int j = 0; j < 300; j++)
{
HitCounter.LogHit(tagname.ToString());
Thread.Sleep(rnd.Next(0, 5));
}
});
workerThreads[i].Start("TAG" + i);
}
foreach (var t in workerThreads)
{
t.Join();
}
Console.WriteLine("All threads finished...");
}
}
public static class HitCounter
{
private static System.Collections.Concurrent.ConcurrentQueue<string> hits;
private static object transferlock = new object();
private static volatile bool stopRunning = false;
static HitCounter()
{
hits = new ConcurrentQueue<string>();
}
public static void LogHit(string tag)
{
hits.Enqueue(tag);
}
public static void Run(TimeSpan transferInterval)
{
while (!stopRunning)
{
Transfer();
Thread.Sleep(transferInterval);
}
}
public static void StopRunning()
{
stopRunning = true;
Transfer();
}
private static void Transfer()
{
lock(transferlock)
{
var tags = GetPendingTags();
var hitCounts = from tag in tags
group tag by tag
into g
select new KeyValuePair<string, int>(g.Key, g.Count());
WriteHits(hitCounts);
}
}
private static void WriteHits(IEnumerable<KeyValuePair<string, int>> hitCounts)
{
// NOTE: I don't usually use sql commands directly and have not tested the below
// The idea is that the update should be atomic so even though you have multiple
// web servers all issuing similar update commands, potentially at the same time,
// they should all commit. I do urge you to test this part as I cannot promise this code
// will work as-is
//using (SqlConnection con = new SqlConnection("xyz"))
//{
//    con.Open();
//    foreach (var hitCount in hitCounts.OrderBy(h => h.Key))
//    {
//        var cmd = con.CreateCommand();
//        cmd.CommandText = "update hits set count = count + @count where tag = @tag";
//        cmd.Parameters.AddWithValue("@count", hitCount.Value);
//        cmd.Parameters.AddWithValue("@tag", hitCount.Key);
//        cmd.ExecuteNonQuery();
//    }
//}
Console.WriteLine("Writing....");
foreach (var hitCount in hitCounts.OrderBy(h => h.Key))
{
Console.WriteLine(String.Format("{0}\t{1}", hitCount.Key, hitCount.Value));
}
}
private static IEnumerable<string> GetPendingTags()
{
List<string> hitlist = new List<string>();
var currentCount = hits.Count();
for (int i = 0; i < currentCount; i++)
{
string tag = null;
if (hits.TryDequeue(out tag))
{
hitlist.Add(tag);
}
}
return hitlist;
}
}

given a list of objects using C# push them to ravendb without knowing which ones already exist

Given 1000 documents with a complex data structure, e.g. a Car class that has three properties: Make, Model, and an Id property.
What is the most efficient way in C# to push these documents to RavenDB (preferably in a batch) without having to query the Raven collection individually to find which to update and which to insert? At the moment I am going about it like so, which is totally inefficient.
Note: _session is a wrapper over IDocumentSession where Commit calls SaveChanges and Add calls Store.
private void PublishSalesToRaven(IEnumerable<Sale> sales)
{
var page = 0;
const int total = 30;
do
{
var paged = sales.Skip(page*total).Take(total);
if (!paged.Any()) return;
foreach (var sale in paged)
{
var current = sale;
var existing = _session.Query<Sale>().FirstOrDefault(s => s.Id == current.Id);
if (existing != null)
existing = current;
else
_session.Add(current);
}
_session.Commit();
page++;
} while (true);
}
Your session code doesn't seem to track with the RavenDB API (we don't have Add or Commit).
Here is how you do this in RavenDB
private void PublishSalesToRaven(IEnumerable<Sale> sales)
{
    // Store each sale in the session; SaveChanges sends everything to the
    // server in a single batch.
    foreach (var sale in sales)
        session.Store(sale);
    session.SaveChanges();
}
Your code sample doesn't work at all. The main problem is that you cannot just switch out the references and expect RavenDB to recognize that:
if (existing != null)
existing = current;
Instead you have to update each property one-by-one:
existing.Model = current.Model;
existing.Make = current.Make;
This is the way you can facilitate change-tracking in RavenDB and many other frameworks (e.g. NHibernate). If you want to avoid writing this uninteresting piece of code, I recommend using AutoMapper:
existing = Mapper.Map<Sale>(current, existing);
Another problem with your code is that you use Session.Query where you should use Session.Load. Remember: If you query for a document by its id, you will always want to use Load!
The main difference is that one uses the local cache and the other does not (the same applies to the equivalent NHibernate methods).
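A short sketch of that difference, assuming session is an open IDocumentSession and "sales/1" is a hypothetical document id:
// Load goes through the session's identity map/cache: one GET by id, and a repeat
// Load of the same id within this session does not hit the server again.
var byLoad = session.Load<Sale>("sales/1");

// Query always goes to an index on the server, so it skips the session cache and,
// because indexes are updated asynchronously, may even return stale results.
var byQuery = session.Query<Sale>().FirstOrDefault(s => s.Id == "sales/1");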
Ok, so now I can answer your question:
If I understand you correctly, you want to save a bunch of Sale instances to your database, where each should either be added if it doesn't exist or updated if it does. Right?
One way is to correct your sample code with the hints above and let it work. However, that will issue one unnecessary request (Session.Load(existingId)) for each iteration. You can easily avoid that if you set up an index that selects the Ids of all documents inside your Sales collection. Before you loop through your items, you can then load all the existing Ids.
However, I would like to know what you actually want to do. What is your domain/use-case?
This is what works for me right now. Note: the InjectFrom method comes from Omu.ValueInjecter (NuGet package).
private void PublishSalesToRaven(IEnumerable<Sale> sales)
{
var ids = sales.Select(i => i.Id);
var existingSales = _ravenSession.Load<Sale>(ids);
existingSales.ForEach(s => s.InjectFrom(sales.Single(i => i.Id == s.Id)));
var existingIds = existingSales.Select(i => i.Id);
var nonExistingSales = sales.Where(i => !existingIds.Any(x => x == i.Id));
nonExistingSales.ForEach(i => _ravenSession.Store(i));
_ravenSession.SaveChanges();
}

Silverlight is not fetching data from my WCF RIA service

I just started learning Silverlight by walking through the labs posted on Channel9. When I tried to explore a little bit I found that my queries were not working as I thought they would.
To recreate what I have done, you would need to create a new Silverlight Business Application, create a data entity pointed at the AdventureWorks LT db, and generate the web services for those entities (including edit).
I then simply dragged a RichTextBox onto Home.xaml, and in Home.xaml.cs I added this code, first to OnNavigatedTo and, when that didn't work, to the constructor.
AdventureWorksDomainContext ctx = new AdventureWorksDomainContext();
EntityQuery<Product> query =
from p in ctx.GetProductsQuery()
select p;
LoadOperation<Product> loadOp = ctx.Load(query);
var paragraph = new Paragraph();
foreach (var product in loadOp.Entities)
{
paragraph.Inlines.Add(new Run { Text = product.Name });
}
richTextBox1.Blocks.Add(paragraph);
When I run the page I never see loadOp.Entities contain a value, and the query I expect only goes across the wire after all my code has executed.
I feel like I'm missing something fundamental and this will make more sense if I can find someone to explain it to me.
Thanks,
Eric
The problem is related to how you are loading the data. The actual Load operation is asynchronous, as are all Silverlight network calls. You are calling ctx.Load(query) and then immediately building the paragraph from the entities. You need to use a callback for when Load has completed. Something like this:
AdventureWorksDomainContext ctx = new AdventureWorksDomainContext();
EntityQuery<Product> query =
    from p in ctx.GetProductsQuery()
    select p;
// The callback receives the LoadOperation once the data has arrived.
ctx.Load(query, loadOp =>
{
    var paragraph = new Paragraph();
    foreach (var product in loadOp.Entities)
    {
        paragraph.Inlines.Add(new Run { Text = product.Name });
    }
    richTextBox1.Blocks.Add(paragraph);
}, null);
Since you aren't using the entities directly in a binding and are just iterating them, you need to make sure you wait until they are loaded. I can't remember the actual signature of the Load method, so you may need to modify my lambda to make it work.