Maybe this can be done without StreamInsight, but I'm curious.
I have an application that is populating a table with "messages" (inserts a row in the table).
I want to create a monitoring application that monitors this table for the rate at which messages are "arriving", and how quickly they are "processed" (flag gets updated).
As this is a vendor's application, I don't want to drop in a trigger or anything. But I can query the db, and the table has a PK using an identity column.
How can I get to a hopping window query? I would love to show a line graph for, say, the past 30 minutes showing the rate of messages coming in and the rate at which they are processed.
Depending on what information is captured in this table of messages, I think you could probably do this faster by just running a SQL query.
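Something along these lines would chart arrivals and completions per minute for the last 30 minutes, assuming the table has a datetime column recording arrival and a processed flag; all table and column names here are hypothetical, so substitute the vendor's actual schema:
-- Hypothetical schema: dbo.Messages(MessageId IDENTITY, CreatedDate, Processed)
SELECT
    DATEADD(MINUTE, DATEDIFF(MINUTE, 0, CreatedDate), 0) AS MinuteBucket,
    COUNT(*)                                             AS MessagesIn,
    SUM(CASE WHEN Processed = 1 THEN 1 ELSE 0 END)       AS MessagesProcessed
FROM dbo.Messages
WHERE CreatedDate >= DATEADD(MINUTE, -30, GETDATE())
GROUP BY DATEADD(MINUTE, DATEDIFF(MINUTE, 0, CreatedDate), 0)
ORDER BY MinuteBucket;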
If you are still wanting to use StreamInsight to do this, here's some code to get you started.
var app = Application;
var interval = TimeSpan.FromSeconds(1);
var windowSize = TimeSpan.FromSeconds(10);
var hopSize = TimeSpan.FromSeconds(1);

/* Replace the Observable.Interval with your logic to poll the database and
   convert the messages to instances of TPayload. It just needs to be a class
   that implements the IObservable<TPayload> interface. */
var observable = app.DefineObservable(() => Observable.Interval(interval));

// Convert the observable to a point streamable.
var streamable = observable.ToPointStreamable(
    e => PointEvent.CreateInsert(DateTimeOffset.Now, e),
    AdvanceTimeSettings.IncreasingStartTime);

/* Using the streamable from the step before, write your actual LINQ queries
   to do the analytics you want. */
var query = from win in streamable.HoppingWindow(windowSize, hopSize)
            select new Payload
            {
                Timestamp = DateTime.UtcNow,
                Value = win.Count()
            };

/* Create a sink to output your events (WCF, etc). It just needs to be a
   class that implements the IObserver<TPayload> interface. The
   implementation is highly dependent on your needs. */
// Note: Dump() is a LINQPad extension; replace it with your own output logic.
var observer = app.DefineObserver(() => Observer.Create<Payload>(e => e.Dump()));

query.Bind(observer).Run();
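To replace the Observable.Interval placeholder above, here is a rough sketch of a polling source that uses the identity PK as a watermark, so each poll only returns rows it has not seen before. The connection string, table, and column names are assumptions; adjust them to the vendor's schema (requires System.Data.SqlClient and System.Reactive.Linq).
// Hypothetical watermark state and poller, e.g. as members of your host class.
// (A method is used because expression trees can't contain statement-bodied lambdas.)
private static long _lastSeenId = 0;

private static IEnumerable<long> PollNewMessageIds()
{
    var newIds = new List<long>();
    using (var conn = new SqlConnection("...your connection string..."))
    using (var cmd = new SqlCommand(
        "SELECT MessageId FROM dbo.Messages WHERE MessageId > @lastId ORDER BY MessageId", conn))
    {
        cmd.Parameters.AddWithValue("@lastId", _lastSeenId);
        conn.Open();
        using (var reader = cmd.ExecuteReader())
        {
            while (reader.Read())
                newIds.Add(reader.GetInt64(0));   // use GetInt32 if the identity column is int
        }
    }
    if (newIds.Count > 0)
        _lastSeenId = newIds[newIds.Count - 1];
    return newIds;
}

// Wire the poller in as the observable source:
var observable = app.DefineObservable(() =>
    Observable.Interval(interval).SelectMany(_ => PollNewMessageIds()));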
I need to merge events coming from 2 different event sourcing systems handled by the Akka.Net Persistence module. The merge must sort events based on their timestamp, and I found the MergeSorted operator in Akka.Streams that does exactly what I need (tried with 2 lists of numbers; for events I wrote a custom EventEnvelopComparer).
In my solution I have an actor system (readsystem1) to read from db1, and a second actor system (readsystem2) to read from db2, both created passing the right connection string to the db (a PostGres db).
The problem is: when I use the MergeSorted operator, I need to pass an instance of ActorMaterializer, and if the actor materializer is created in the readsystem1 actor system then only the events from db1 are loaded (and merged with themselves); the opposite happens if I create the actor materializer in readsystem2. I need to load them both.
Here is an example of the code (writing timestamps to a file, just to test them):
var actorMaterializer1 = ActorMaterializer.Create(
    readSystem1,
    ActorMaterializerSettings.Create(readSystem1).WithDebugLogging(true));

var readJournal1 = PersistenceQuery.Get(readSystem1)
    .ReadJournalFor<SqlReadJournal>(SqlReadJournal.Identifier);
var source1 = readJournal1.CurrentEventsByPersistenceId("mypersistenceId", 0L, long.MaxValue);

await source1
    .Select(x => ByteString.FromString($"{x.Timestamp}{Environment.NewLine}"))
    .RunWith(FileIO.ToFile(new FileInfo("c:\\tmp\\timestamps1.txt")), actorMaterializer1);

// just creating the materializer changes the events loaded by the source!!!
var actorMaterializer2 = ActorMaterializer.Create(
    readSystem2,
    ActorMaterializerSettings.Create(readSystem2).WithDebugLogging(true));

var readJournal2 = PersistenceQuery.Get(readSystem2)
    .ReadJournalFor<SqlReadJournal>(SqlReadJournal.Identifier);
var source2 = readJournal2.CurrentEventsByPersistenceId("mypersistenceId", 0L, long.MaxValue);

await source2
    .Select(x => ByteString.FromString($"{x.Timestamp}{Environment.NewLine}"))
    .RunWith(FileIO.ToFile(new FileInfo("c:\\tmp\\timestamps2.txt")), actorMaterializer2);

// RunWith receives actorMaterializer1, so only events coming from db1
// will be loaded and merged with themselves
var source = source1.MergeSorted(source2, new EventEnvelopComparer());
await source
    .Select(x => ByteString.FromString($"{x.Timestamp}{Environment.NewLine}"))
    .RunWith(FileIO.ToFile(new FileInfo("c:\\tmp\\timestamps.txt")), actorMaterializer1);
How can I accomplish this? Is it possible to read 2 different event sourcing tables from the same actor system, in the same or in different dbs? Is there something about the ActorMaterializer that can solve my problem? Is my approach completely wrong?
To use events from two different ActorSystems I think you'd need to use StreamRefs. But what you could do here is configure two ReadJournalIds, each pointing to a different *.db file. That way you can use one ActorSystem and materializer.
var source1 = PersistenceQuery.Get(actorSystem).ReadJournalFor<SqlReadJournal>("read-journal-1")
.CurrentEventsByPersistenceId("sample-id-1", 0L, long.MaxValue);
var source2 = PersistenceQuery.Get(actorSystem).ReadJournalFor<SqlReadJournal>("read-journal-2")
.CurrentEventsByPersistenceId("sample-id-1", 0L, long.MaxValue);
await source1.MergeSorted(source2, new EventEnvelopComparer())
    .RunForeach(x => System.Console.WriteLine($"EVENT: {x.Timestamp}"), actorSystem.Materializer());
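For completeness, a minimal sketch of what that configuration could look like, based on the Akka.Persistence.Query.Sql defaults. The journal ids and write-plugin paths here are assumptions; each write-plugin section (elided) would point at its own database connection string.
// Requires Akka.Actor and Akka.Configuration.
var config = ConfigurationFactory.ParseString(@"
    read-journal-1 {
        class = ""Akka.Persistence.Query.Sql.SqlReadJournalProvider, Akka.Persistence.Query.Sql""
        write-plugin = ""akka.persistence.journal.journal-1""
    }
    read-journal-2 {
        class = ""Akka.Persistence.Query.Sql.SqlReadJournalProvider, Akka.Persistence.Query.Sql""
        write-plugin = ""akka.persistence.journal.journal-2""
    }
    # akka.persistence.journal.journal-1 and journal-2 (not shown) would each
    # be a journal plugin section with its own connection string.
");
var actorSystem = ActorSystem.Create("readSystem", config);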
I think I see what's going on here....
Here's why your choice of Materializer is creating an issue for you - the Materializer is going to compile your Akka.Persistence.Query + Akka.Streams graphs into actors. When you use the Materializer from ActorSystem A, it's going to materialize the actors into that ActorSystem and use its Journal implementation for Akka.Persistence - that's why you only get events from 1 ActorSystem.
However, it looks like you're doing exactly what I would do - pre-materializing each source before merging them together... Would it be possible to create a reproduction of this on GitHub using a dummy SQLite database or some such? If so I'd be happy to debug it.
I currently have Elastic APM set up with app.UseAllElasticApm(Configuration), which is working correctly. I've just been trying to find a way to record exactly how many SQL queries are run via Entity Framework for each transaction.
Ideally when viewing the Apm data in Kibana the metadata tab could just include an EntityFramework.ExecutedSqlQueriesCount.
Currently on .Net Core 2.2.3
One thing you can use is the Filter API for this.
With that you have access to all transactions and spans before they are sent to the APM Server.
Within a filter you can't run through all the spans of a given transaction, so you need some bookkeeping - for this I use a Dictionary in my sample.
var numberOfSqlQueries = new Dictionary<string, int>();
Elastic.Apm.Agent.AddFilter((ITransaction transaction) =>
{
if (numberOfSqlQueries.ContainsKey(transaction.Id))
{
// We make an assumption here: we assume that all SQL requests on a given transaction end before the transaction ends
// this in practice means that you don't do any "fire and forget" type of query. If you do, you need to make sure
// that the numberOfSqlQueries does not leak.
transaction.Labels["NumberOfSqlQueries"] = numberOfSqlQueries[transaction.Id].ToString();
numberOfSqlQueries.Remove(transaction.Id);
}
return transaction;
});
Elastic.Apm.Agent.AddFilter((ISpan span) =>
{
// you can't really tell whether a span came from EF Core or another database library,
// but you have all sorts of other info like the db instance; span.Subtype and span.Action could also help you filter properly
if (span.Context.Db != null && span.Context.Db.Instance == "MyDbInstance")
{
if (numberOfSqlQueries.ContainsKey(span.TransactionId))
numberOfSqlQueries[span.TransactionId]++;
else
numberOfSqlQueries[span.TransactionId] = 1;
}
return span;
});
A couple of things here:
- I assume you don't do "fire and forget" type queries; if you do, you'll need extra handling for those (see the sketch below).
- The counting isn't really specific to EF Core queries, but you have info like the db name and database type (mssql, etc.) - hopefully based on that you'll be able to filter the queries you want.
With transaction.Labels["NumberOfSqlQueries"] we add a label to the given transaction, and you'll be able to see this data on the transaction in Kibana.
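If you do use fire-and-forget queries, here is one hedged way to keep the dictionary from leaking: store a last-touched timestamp next to each count and evict stale entries on a timer. The cutoff and period below are arbitrary, and a ConcurrentDictionary (System.Collections.Concurrent) replaces the plain Dictionary since the filter callbacks and the timer may run on different threads.
var sqlQueryCounts = new ConcurrentDictionary<string, (int Count, DateTime LastTouched)>();

// In the span filter, count and stamp the entry:
sqlQueryCounts.AddOrUpdate(
    span.TransactionId,
    _ => (1, DateTime.UtcNow),
    (_, old) => (old.Count + 1, DateTime.UtcNow));

// Periodically evict entries whose transaction never completed:
var pruneTimer = new System.Threading.Timer(_ =>
{
    var cutoff = DateTime.UtcNow - TimeSpan.FromMinutes(5);  // arbitrary timeout
    foreach (var entry in sqlQueryCounts)
    {
        if (entry.Value.LastTouched < cutoff)
            sqlQueryCounts.TryRemove(entry.Key, out _);
    }
}, null, TimeSpan.FromMinutes(1), TimeSpan.FromMinutes(1));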
I have created 2 SQL models in Google App Maker. For simplicity's sake, let's say Model 1 has all of the information that can be added and edited for each of the records. Model 2 works as a storage model where, once a record in Model 1 is removed, it moves over to Model 2. The idea is that the individual can click on a "removed" boolean which will open a dialog page to add in comments for the removal, and once complete the record will be moved to Model 2 for storage and will no longer be visible in Model 1.
Is there any way to do this? If you need more information let me know and I will try to provide it, but the reason I cannot post the existing app is that the information is confidential.
Thanks for your help!
Updated answer: move to another model
If you want to force users to enter a message, you need to forbid deleting records directly through datasources:
// onBeforeDelete model event
throw new Error('You must provide a message prior to deleting a record');
Then you need to implement the audit itself:
// server script
function archive(itemKey, message) {
  if (!message) {
    throw new Error('Message is required');
  }

  var record = app.models.MyModel.getRecord(itemKey);
  if (!record) {
    throw new Error('Record was not found');
  }

  var archive = app.models.Removed.newRecord();
  archive.Field1 = record.Field1;
  archive.Field2 = record.Field2;
  // ... copy the remaining fields ...
  archive.Message = message;

  app.saveRecords([archive]);
  app.deleteRecords([record]);
}
// client script
google.script.run
  .withSuccessHandler(function() {
    // TODO
  })
  .withFailureHandler(function() {
    // TODO
  })
  .archive(itemKey, message);
If you need to implement auditing for multiple/all models, then you can generalize the snippet by passing the model's name and using Model Metadata: function archive(modelName, itemKey, message) {}
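A rough sketch of that generalization - here passing an explicit list of field names instead of reading them from Model Metadata, to keep it simple (all names are illustrative):
// server script - hypothetical generalized version
function archive(modelName, itemKey, message, fieldNames) {
  if (!message) {
    throw new Error('Message is required');
  }
  var record = app.models[modelName].getRecord(itemKey);
  if (!record) {
    throw new Error('Record was not found');
  }
  var archive = app.models.Removed.newRecord();
  // copy only the fields the caller asked for
  fieldNames.forEach(function(name) {
    archive[name] = record[name];
  });
  archive.Message = message;
  app.saveRecords([archive]);
  app.deleteRecords([record]);
}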
Original answer: move to another DB
Normally I would recommend just adding a boolean field Deleted to the model and ensuring that records marked as deleted are not sent to the client. Implementing a move of data between databases could be tricky, since transactions are not supported across multiple databases.
If you desperately want to make your app more complex and less reliable, you can create a backup of the record in the onBeforeDelete model event using the JDBC Apps Script service (the External Database Sample could be your friend to start with):
// onBeforeDelete model event
var connection = Jdbc.getConnection(dbUrl, user, userPassword);
var statement = connection.prepareStatement('INSERT INTO ' + TABLE_NAME +
    ' (Field1, Field2, ...) values (?, ?, ...)');
statement.setString(1, record.Field1);
statement.setString(2, record.Field2);
// ... bind the remaining fields ...
statement.execute();
Why do you need JDBC? Because App Maker natively doesn't support models attached to different databases.
I was able to do what I needed using a query filter in a client script. This keeps the data on the back end when I export, and only shows the active user whatever is not removed.
var datasource1 = app.datasources.WatchList_Data;
datasource1.query.filters.Remove_from_WatchList._equals = 'No';
datasource1.load();
I am very new to NServiceBus, and in one of our projects we want to accomplish the following:
1. Whenever table data is modified in SQL Server, construct a message and insert it into a SQL Server broker queue.
2. Read the broker queue message using NServiceBus.
3. Publish the message again as another event so that other subscribers can handle it.
It is point 2 that I do not have much clue about how to get done.
I have referred to the following posts, after which I was able to get the message into the broker queue, but I was unable to integrate it with NServiceBus in our project, as the NServiceBus libraries used there are of an older version and many of the methods are deprecated. Using them with current versions is getting very troublesome, or perhaps I was going about it in the wrong way.
http://www.nullreference.se/2010/12/06/using-nservicebus-and-servicebroker-net-part-2
https://github.com/jdaigle/servicebroker.net
Any help on the correct way of doing this would be invaluable.
Thanks.
I'm using the current version of NServiceBus (5), VS2013 and SQL Server 2008. I created a Database Change Listener using this tutorial, which uses SQL Server Service Broker and SqlDependency to monitor the changes to a specific table. (NB: this may be deprecated in later versions of SQL Server.)
SqlDependency allows you to use a broad selection of the basic SQL functionality, although there are some restrictions that you need to be aware of. I modified the code from the tutorial slightly to provide better error information:
void NotifyOnChange(object sender, SqlNotificationEventArgs e)
{
    // Check for any errors
    if (@"Subscribe|Unknown".Contains(e.Type.ToString())) { throw _DisplayErrorDetails(e); }

    var dependency = sender as SqlDependency;
    if (dependency != null) dependency.OnChange -= NotifyOnChange;
    if (OnChange != null) { OnChange(); }
}

private Exception _DisplayErrorDetails(SqlNotificationEventArgs e)
{
    var message = "useful error info";
    var messageInner = string.Format("Type:{0}, Source:{1}, Info:{2}", e.Type, e.Source, e.Info);

    if (@"Subscribe".Contains(e.Type.ToString()) && @"Invalid".Contains(e.Info.ToString()))
        messageInner += "\r\n\nThe subscriber says that the statement is invalid - check your SQL statement conforms to specified requirements (http://stackoverflow.com/questions/7588572/what-are-the-limitations-of-sqldependency/7588660#7588660).\n\n";

    return new Exception(message, new Exception(messageInner));
}
I also created a project with a "database first" Entity Framework data model to allow me to do something with the changed data.
[The relevant part of] My NServiceBus project comprises two "Run as Host" endpoints, one of which publishes event messages. The second endpoint handles the messages. The publisher has been set up with IWantToRunWhenBusStartsAndStops, which instantiates the DBListener and passes it the SQL statement I want to run as my change monitor. The OnChange() handler is assigned an anonymous function to read the changed data and publish a message:
// using statements omitted

namespace Sample4.TestItemRequest
{
    public partial class MyExampleSender : IWantToRunWhenBusStartsAndStops
    {
        private string NOTIFY_SQL = @"SELECT [id] FROM [dbo].[Test] WITH(NOLOCK) WHERE ISNULL([Status], 'N') = 'N'";

        public void Start() { _StartListening(); }
        public void Stop() { throw new NotImplementedException(); }

        private void _StartListening()
        {
            var db = new Models.TestEntities();

            // Instantiate a new DBListener with the specified connection string
            var changeListener = new DatabaseChangeListener(ConfigurationManager.ConnectionStrings["TestConnection"].ConnectionString);

            // Assign the code within the braces to the DBListener's OnChange event
            changeListener.OnChange += () =>
            {
                /* START OF EVENT HANDLING CODE */

                // This uses LINQ against the EF data model to get the changed records
                IEnumerable<Models.TestItems> _NewTestItems = DataAccessLibrary.GetInitialDataSet(db);

                while (_NewTestItems.Count() > 0)
                {
                    foreach (var qq in _NewTestItems)
                    {
                        // Do some processing, if required
                        var newTestItem = new NewTestStarted() { ... set properties from qq object ... };
                        Bus.Publish(newTestItem);
                    }

                    // Because there might be a number of new rows added, I grab them in small batches until finished.
                    // Probably better to use Rx to do this, but this will do for proof of concept
                    _NewTestItems = DataAccessLibrary.GetNextDataChunk(db);
                }

                changeListener.Start(NOTIFY_SQL);
                /* END OF EVENT HANDLING CODE */
            };

            // Now everything has been set up.... start it running.
            changeListener.Start(NOTIFY_SQL);
        }
    }
}
Important: firing the OnChange event causes the listener to stop monitoring; it is basically a single-shot notifier. After you have handled the event, the last thing to do is restart the DBListener. (You can see this in the line preceding the END OF EVENT HANDLING comment.)
You need to add a reference to System.Data and possibly System.Data.DataSetExtensions.
The project at the moment is still proof of concept, so I'm well aware that the above can be somewhat improved. Also bear in mind I had to strip out company specific code, so there may be bugs. Treat it as a template, rather than a working example.
I also don't know if this is the right place to put the code - that's partly why I'm on StackOverflow today; to look for better examples of ServiceBus host code. Whatever the failings of my code, the solution works pretty effectively - so far - and meets your goals, too.
Don't worry too much about the ServiceBroker side of things. Once you have set it up, per the tutorial, SQLDependency takes care of the details for you.
The ServiceBroker Transport is very old and not supported anymore, as far as I can remember.
A possible solution would be to "monitor" the interesting tables from the endpoint code using something like a SqlDependency (http://msdn.microsoft.com/en-us/library/62xk7953(v=vs.110).aspx) and then push messages into the relevant queues.
I have a WCF Data Service (OData) that serves as the data repository for a larger system. I'm trying to fire off specific methods based on operations on Entities in the repository.
Specifically, if someone changes a Message record, I want to hook into the pipeline. I'm using ChangeInterceptors for this.
They work for Add and Delete. However, nothing fires when an entity is updated. I am concerned that the DbContext cannot resolve the fact that the entity has changed, since the request is stateless.
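For reference, a change interceptor of the kind described looks something like this (the entity set and type names are illustrative):
// Hypothetical interceptor for a 'Messages' entity set (System.Data.Services);
// it fires for adds and deletes, but not for the detached update shown below.
[ChangeInterceptor("Messages")]
public void OnChangeMessages(Message item, UpdateOperations operations)
{
    if (operations == UpdateOperations.Change)
    {
        // never reached by the update attempt shown next
    }
}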
This does not trigger the handler:
var whatever = from m in Messages
               where m.MessageKey == 3
               select m;

whatever.First().UpdatedDate = DateTime.Now;
this.SaveChanges();
Has anyone else faced this problem?
So, I was trying to use AttachTo() to handle the fact that my record was detached. This flat out didn't work, and led to runtime exceptions like the following:
This operation requires the entity be of an Entity Type, and has at least one key property.
Parameter name: entity
At any rate, just use the UpdateObject method and the change will be intercepted (and actually applied):
var whatever = (from m in Messages
                where m.MessageKey == 1
                select m).Single();

whatever.UpdatedDate = DateTime.Now;
this.UpdateObject(whatever);
this.SaveChanges();