I have an existing application that uses Sync Framework. It was all working until recent changes added some additional columns to one of the synced tables; now the sync no longer works. I'd like to know how to debug this. The symptom I see is that the WCF service's GetChanges method is called, but ApplyChanges is never called.
I am using Sync Framework over WCF with a WinForms application.
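One way to get more diagnostics is to enable Sync Framework's built-in tracing on both the client and the WCF service host. This is a sketch using the standard `SyncTracer` switch from System.Diagnostics; the log path is illustrative. With verbose tracing, schema-mismatch errors caused by the new columns usually show up in the log around the point where ApplyChanges should have been invoked:

```xml
<configuration>
  <system.diagnostics>
    <switches>
      <!-- 0 = off, 1 = error, 2 = warning, 3 = info, 4 = verbose -->
      <add name="SyncTracer" value="4" />
    </switches>
    <trace autoflush="true">
      <listeners>
        <add name="SyncListener"
             type="System.Diagnostics.TextWriterTraceListener"
             initializeData="c:\logs\sync.log" />
      </listeners>
    </trace>
  </system.diagnostics>
</configuration>
```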
We would like to programmatically ensure that a database table has a certain set of rows (based on a sometimes-changing enum). We are using EF Core 2.2 with code-first migrations and are looking for the right place to seed this data. We thought that adding a seeding method to our Startup.cs would be a good idea, but Microsoft's documentation says:
The seeding code should not be part of the normal app execution as this can cause concurrency issues when multiple instances are running and would also require the app having permission to modify the database schema.
Is the code in Startup.cs considered "part of the normal app execution"?
Our app currently runs as a single instance, but there might be multiple in the future. Plus, we have an Azure Functions app and a console app that might also need to ensure the table has the correct rows before executing. Despite these concerns, I have seen accepted and upvoted answers on other threads saying that initializing in Startup.cs is okay. Will we be shooting ourselves in the foot by doing this?
From the docs:
Depending on the constraints of your deployment the initialization code can be executed in different ways:
Running the initialization app locally.
Deploying the initialization app with the main app, invoking the initialization routine and disabling or removing the initialization app.
My interpretation is that you could deploy a console app (e.g. via a publishing profile) that ensures the database is seeded at launch.
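As a sketch of that approach (all type names here — `SeedDbContext`, `Widget`, `WidgetType` — are illustrative, not from the original question), the console app can apply pending migrations and then idempotently insert one row per enum value, so it is safe to rerun and stays out of the web app's normal execution path as the docs recommend:

```csharp
using System;
using System.Linq;
using Microsoft.EntityFrameworkCore;

public enum WidgetType { Basic = 1, Advanced = 2 }

public class Widget
{
    public int Id { get; set; }        // stores the enum value
    public string Name { get; set; }
}

public static class Program
{
    public static void Main()
    {
        using (var db = new SeedDbContext())   // illustrative DbContext
        {
            db.Database.Migrate();             // apply pending migrations first

            // Upsert one row per enum member; skip rows that already exist.
            foreach (WidgetType t in Enum.GetValues(typeof(WidgetType)))
            {
                if (!db.Widgets.Any(w => w.Id == (int)t))
                    db.Widgets.Add(new Widget { Id = (int)t, Name = t.ToString() });
            }
            db.SaveChanges();
        }
    }
}
```

Because the check-then-insert is idempotent, the same seeder can be invoked from the Azure Functions app or the other console app without coordination, as long as two instances don't race on first deploy.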
I realise that a debug run is normally not visible in the Data Factory v2 UI after closing the browser window, but unfortunately I had to restart my machine unexpectedly and it's a long-running pipeline.
I thought the runs might be available via PowerShell, but I haven't had any luck.
The pipeline is likely still running.
We do have external logging, however ideally I'd like to see how long each activity is taking as I'm load testing.
And more importantly, I don't want to start another run until I'm sure this one has finished. Notably, I'll run it from a trigger next time (just in case!).
EDIT:
It looks like a sandbox ID is used, stored in the browser's local storage, and there appear to be undocumented API endpoints for gathering info using that sandbox ID. But there doesn't appear to be a way of getting old sandbox IDs, so I'm probably out of luck.
There is a button to view all debug runs.
Taken from Microsoft documentation:
To view a historical view of debug runs or see a list of all active debug runs, you can go into the Monitor experience.
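For triggered (non-debug) runs — which the asker plans to use next time — per-activity durations are also available from PowerShell. A sketch with placeholder resource names; note that debug runs are not returned by these cmdlets:

```powershell
# Hypothetical resource group / factory names. Triggered runs only.
$run = Get-AzDataFactoryV2PipelineRun -ResourceGroupName "my-rg" `
    -DataFactoryName "my-adf" `
    -LastUpdatedAfter (Get-Date).AddDays(-1) -LastUpdatedBefore (Get-Date)

# One row per activity, with start/end times for load-test timing.
Get-AzDataFactoryV2ActivityRun -ResourceGroupName "my-rg" -DataFactoryName "my-adf" `
    -PipelineRunId $run.RunId `
    -RunStartedAfter (Get-Date).AddDays(-1) -RunStartedBefore (Get-Date) |
  Select-Object ActivityName, ActivityRunStart, ActivityRunEnd, DurationInMs
```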
We are planning to create an application that reads all the properties files associated with the Mule applications running on a server.
Our intention is to read and update the properties from this custom application instead of providing access to the MMC.
Users could update the Quartz time schedule and reschedule, pause, and resume jobs and triggers.
Can we create an application that runs in parallel to the Mule instance, reads all application properties, and updates them dynamically without affecting the deployment (no restart or redeployment)?
The short answer is no.
Many components have a special lifecycle and bring up server sockets or connect to JMS queues based on that configuration. So even if you change the properties, you would at a bare minimum have to stop and start the respective components so that the previous resources are released and new ones acquired.
A change to a properties file cannot be detected until the Mule XML changes. So an alternate way to get changes in a properties file picked up is to change something in the Mule XML and save it. Whenever the Mule server detects a change in the XML file, the XML is re-read and the Mule context is recreated. This has already been discussed here: http://forum.mulesoft.org/mulesoft/topics/can-we-reload-properties-without-restarting-application
So if you want an application that reads updated values from the properties files, you always need that application to make some change to your Mule XML.
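A minimal shell sketch of that workaround (the `MULE_HOME` layout and app name are illustrative): bumping the config XML's modification time is enough for Mule's hot-deployment watcher to rebuild the application context, which re-reads the properties files.

```shell
# Sketch: force Mule to re-read an app's configuration by updating the
# XML's mtime. Mule's deployment watcher sees the newer timestamp and
# redeploys the app, recreating the Mule context.
reload_mule_app() {
  local mule_home="$1" app_name="$2"
  touch "$mule_home/apps/$app_name/mule-config.xml"
}
```

Note this still causes the app's components to be stopped and restarted; it just avoids a manual redeploy.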
I have four Flowchart workflow services running on IIS, and everything is working fine. Now I'm trying to add persistence to the same project: I want to add an activity that persists the current instance based on some condition. Please provide code to save (persist) the current instance to the SQL store.
If you want to save the current state of your workflow, you need to add a Persist activity to it. When it executes, it will save the workflow's state to SQL Server.
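A minimal sketch of that in code (the `ShouldCheckpoint` condition is illustrative). Persist only does something useful if the host has an instance store configured; for IIS-hosted workflow services that means a `sqlWorkflowInstanceStore` service behavior in web.config pointing at the persistence database:

```csharp
using System.Activities;
using System.Activities.Statements;

// Persist only when some business condition holds.
var body = new Sequence
{
    Activities =
    {
        // ...earlier activities of the workflow...
        new If
        {
            // Illustrative condition; replace with your own expression.
            Condition = new InArgument<bool>(ctx => true),
            // Writes the instance state to the configured SQL instance store.
            Then = new Persist()
        }
    }
};
```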
So I am having problems getting NHibernate integrated into my MVC project. I installed NHProfiler and initialized it in Global.asax.cs (`NHibernateProfiler.Initialize();`).
However, all I can see in NHProf is a session # and the time it took to come up. Selecting it (or doing anything else) doesn't show any information about the connection to the database, or anything at all in the other windows:
- Statements, Entities, Session Usage
The Session Factory Statistics window only shows the start time and execution time, and that's it.
Any thoughts?
Do you have any custom log4net configuration? That might be overwriting NHProf's log4net appender after startup. If you refresh the page (and hence start another session*), does NHProf show another session start? Also verify that your HibernatingRhinos.Profiler.Appender.dll (or HibernatingRhinos.Profiler.Appender.v4.0.dll if you're using .NET 4) is from the same version as your copy of NHProf.
* I'm assuming that you're using Session-per-Request since this is a web app.
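If the custom log4net configuration is the culprit, one thing to try is ordering the initialization so NHProf attaches its appender last. A sketch, assuming log4net is configured in Application_Start (adjust to wherever your configuration actually runs):

```csharp
using HibernatingRhinos.Profiler.Appender.NHibernate;

protected void Application_Start()
{
    // Apply your own log4net configuration first (if you have one)...
    log4net.Config.XmlConfigurator.Configure();

    // ...then initialize NHProf, so its appender isn't wiped out by a
    // later log4net reconfiguration.
    NHibernateProfiler.Initialize();
}
```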