I have two third-party logging providers in my ASP.NET Core gRPC service. Can I somehow "chain" these logging providers? That is, can logging provider 2 perform logging only when logging provider 1 has failed to log?
For example, if I have an Application Insights provider and an NLog file provider, can I log to the file only when my service can't connect to the Application Insights infrastructure?
You can do everything mentioned using NLog targets. There is a target for Azure Application Insights: https://www.nuget.org/packages/Microsoft.ApplicationInsights.NLogTarget/. To use one target only when another fails, NLog provides the FallbackGroup wrapper target.
Moreover, if you suspect that some of your targets may be slow, you can use asynchronous logging for them: https://github.com/NLog/NLog/wiki/AsyncWrapper-target
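For the fallback behavior asked about, NLog's FallbackGroup wrapper target tries its child targets in order and only moves to the next one when a write throws. A minimal NLog.config sketch combining it with the targets above (target names and the file path are placeholders, and the Application Insights target assumes the NuGet package linked above is installed):

```xml
<nlog xmlns="http://www.nlog-project.org/schemas/NLog.xsd"
      xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <extensions>
    <!-- From the Microsoft.ApplicationInsights.NLogTarget package -->
    <add assembly="Microsoft.ApplicationInsights.NLogTarget" />
  </extensions>

  <targets>
    <!-- AsyncWrapper keeps slow targets off the calling thread -->
    <target name="chain" xsi:type="AsyncWrapper">
      <!-- FallbackGroup writes to the first target; the file target
           is used only when the Application Insights write fails -->
      <target xsi:type="FallbackGroup" returnToFirstOnSuccess="true">
        <target name="ai" xsi:type="ApplicationInsightsTarget" />
        <target name="file" xsi:type="File" fileName="fallback.log" />
      </target>
    </target>
  </targets>

  <rules>
    <logger name="*" minlevel="Info" writeTo="chain" />
  </rules>
</nlog>
```

Note that FallbackGroup only falls back when the first target's write actually throws; a target that silently drops messages will not trigger it.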
In my humble opinion, setting up a rule to make two logging providers work as a chain is unnecessary and may not even be possible through the provider abstraction alone.
What we usually do is run them at the same time, for example using NLog both to write logs to files and to send them to Azure Application Insights.
By the way, Application Insights may occasionally be unreachable due to some incident, but that is a low-probability event. A more likely reason for the logging module to stop working is that the application itself has crashed, in which case every module in your application is down, including both logging providers. So even with a feature that switches to NLog when Application Insights fails, there is a good chance that whenever Application Insights is down, NLog can't work either.
Is it possible to use IIS's logging with an ASP.NET Core application? We've developed several .NET Core applications, all hosted under IIS, so I'm trying to take advantage of IIS's logging.
Unfortunately, enabling logging with a No Managed Code app seems to produce no logs. Managed Code applications do produce logs correctly.
We are using the Application Insights SDK for our Azure Web App with Azure SQL Database and Azure Storage. Which Azure service can we use to time specific code, and to track requests and exceptions within the application?
If you are already using the Application Insights SDK, you can use it to track requests, exceptions, dependencies, and so on.
https://learn.microsoft.com/en-us/azure/application-insights/app-insights-asp-net
The Application Insights SDK provides many methods for logging, so you can use built-in methods like TrackException, TrackTrace, and the other TrackXxx methods; details are in the documentation linked above.
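As a sketch of those methods in use, here ProcessOrder is a placeholder for your own code, and the parameterless TelemetryClient constructor is the classic pattern (newer SDK versions prefer injecting a configured client):

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics;
using Microsoft.ApplicationInsights;

var telemetry = new TelemetryClient();

// Free-form diagnostic message
telemetry.TrackTrace("Order pipeline started");

// Time a specific piece of code and record failures
var watch = Stopwatch.StartNew();
try
{
    ProcessOrder(); // placeholder for your own code
}
catch (Exception ex)
{
    telemetry.TrackException(ex);
    throw;
}
finally
{
    watch.Stop();
    telemetry.TrackEvent("ProcessOrder",
        metrics: new Dictionary<string, double>
        {
            ["DurationMs"] = watch.Elapsed.TotalMilliseconds
        });
}
```

The timing shows up in the Application Insights portal as a custom event metric; for calls to SQL or Storage specifically, dependency tracking is collected automatically by the SDK.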
I am confused between ELMAH and Elmah.Contrib.WebApi, and which one is the better option for Web API. I am already using NLog for exception logging, along with tracing, in Web API. How is ELMAH different from all of these? What is the exact need for ELMAH?
Thanks in Advance
I find that it is useful for catching errors that you did not catch and log yourself. I am using it for an MVC application, and it sends me an email when I have an issue that I need to resolve. I can be proactive and work on errors before I hear from the user.
ELMAH addresses your default application error logging. However, when you're using Web API (for example, ASP.NET Web API alongside MVC), you need some extra logic to log your Web API errors to your ELMAH data store.
There are a few ways to address this requirement, one being to use the Elmah.Contrib.WebApi package.
Once the package is added to your project, remember to follow its setup requirements, i.e. registering its error filter at startup.
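A minimal sketch of that registration, assuming the standard WebApiConfig class generated by the project template:

```csharp
using System.Web.Http;
using Elmah.Contrib.WebApi;

public static class WebApiConfig
{
    public static void Register(HttpConfiguration config)
    {
        // Route unhandled Web API exceptions into ELMAH
        config.Filters.Add(new ElmahHandleErrorApiAttribute());

        // ...your routes, formatters, etc.
    }
}
```

Without this filter, exceptions thrown in Web API controllers bypass ELMAH's HTTP module and are never recorded.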
I'm evaluating the WCF Data Service approach for my next project. As I would need to be flexible on logging and authentication I have a couple of questions that maybe you are able to answer.
1) Am I able to log different levels of events, e.g. warnings and errors, and redirect them to different logging sinks such as a database, a text file, or the event log?
2) A link that shows how to perform what's requested in question 1.
3) Is there a way to introduce simple authentication based on user name and password, and how is it done?
4) Have you, through direct experience, discovered any limitations of using Data Services instead of creating a WS-* WCF service as far as logging and authentication are concerned?
Thanks
There's a good series of blogs about auth over OData service here: http://blogs.msdn.com/b/astoriateam/archive/tags/authentication/
For logging you should be able to use your web server's logging facilities (typically IIS I assume), since all errors are reported as error responses by the service.
You can also override the DataService.HandleException method and implement your own logging in any way you want.
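A sketch of such an override, where MyContext and MyLogger are placeholders for your own data context and logging facility:

```csharp
using System.Data.Services;

public class MyDataService : DataService<MyContext>
{
    // Called by WCF Data Services for every unhandled exception
    protected override void HandleException(HandleExceptionArgs args)
    {
        // args.Exception is the failure; args.ResponseStatusCode the HTTP code
        MyLogger.Error(args.Exception,
            "Data service request failed with HTTP " + args.ResponseStatusCode);
        base.HandleException(args);
    }
}
```

Calling base.HandleException at the end keeps the service's default behavior of turning the exception into an error response for the client.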
For logging of general non-error things there's also the processing pipeline (DataService.ProcessingPipeline). You can register a handler and implement your own logging of these as well.
I don't know of a sample of this, though.
A service I have in WCF occasionally goes down due a problem with a COM component. While I am troubleshooting I would like to setup another host to make regular calls to this service to monitor availability.
It is slightly more complicated than a simple HTTP call, though, as the service is secured by SSL and WCF authentication (username/password). I'd also like to be able to parse successful calls to see if they return warning/fail states from my code.
Would you recommend any monitoring providers for this or is it beyond the simple monitoring they normally provide?
Regards
Ryan
You could enable WCF's logging and auditing facilities on either the server or the client to produce a log of all traffic. Then you can analyze the results using the WCF Service Trace Viewer Tool provided in the .NET Framework 3.0 and 3.5 SDKs.
In your situation I would probably enable logging at the message level only. This will reduce the amount of information that ends up in the log file and will help you focus on analyzing the data that's actually being sent back and forth between the services.
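A sketch of the relevant web.config fragment for message-level logging (the listener path is a placeholder):

```xml
<system.serviceModel>
  <diagnostics>
    <!-- Log at the service (message) level only, not the transport level -->
    <messageLogging logEntireMessage="true"
                    logMessagesAtServiceLevel="true"
                    logMessagesAtTransportLevel="false"
                    maxMessagesToLog="3000" />
  </diagnostics>
</system.serviceModel>
<system.diagnostics>
  <sources>
    <source name="System.ServiceModel.MessageLogging">
      <listeners>
        <!-- .svclog files open directly in the Service Trace Viewer -->
        <add name="messages"
             type="System.Diagnostics.XmlWriterTraceListener"
             initializeData="C:\logs\messages.svclog" />
      </listeners>
    </source>
  </sources>
</system.diagnostics>
```

Open the resulting .svclog file in SvcTraceViewer.exe to inspect the logged messages, including the bodies of your service responses.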