Keep history of jobs executed for more than 1 day in Hangfire - hangfire

I've just started using Hangfire, and I am loving it.
I understand that Hangfire maintains the history of succeeded jobs for 1 day and clears it thereafter.
Is there a way to customize this default behavior and persist the history for a longer duration, say 7 days?

To do this, you need to create a job filter and register it through Hangfire's global configuration, as discussed here -
https://discuss.hangfire.io/t/how-to-configure-the-retention-time-of-job/34
Create the job filter -
using Hangfire.Common;
using Hangfire.States;
using Hangfire.Storage;
using System;

namespace HangfireDemo
{
    public class ProlongExpirationTimeAttribute : JobFilterAttribute, IApplyStateFilter
    {
        public void OnStateApplied(ApplyStateContext filterContext, IWriteOnlyTransaction transaction)
        {
            filterContext.JobExpirationTimeout = TimeSpan.FromDays(7);
        }

        public void OnStateUnapplied(ApplyStateContext context, IWriteOnlyTransaction transaction)
        {
            context.JobExpirationTimeout = TimeSpan.FromDays(7);
        }
    }
}
...and register it in the global job filters -
GlobalJobFilters.Filters.Add(new ProlongExpirationTimeAttribute());
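Alternatively, because it is a regular JobFilterAttribute, you should also be able to apply it to individual job classes or methods instead of registering it globally. A minimal sketch (SendNewsletter is just an illustrative job method, not from the question):
public class NewsletterJobs
{
    // Only jobs decorated with the attribute keep their history for 7 days;
    // other jobs keep Hangfire's default retention.
    [ProlongExpirationTime]
    public void SendNewsletter()
    {
        // ... job body ...
    }
}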

Related

How to span a ConcurrentDictionary across load-balancer servers when using SignalR hub with Redis

I have an ASP.NET Core web application set up with SignalR scaled out with Redis.
Using the built-in groups works fine:
Clients.Group("Group_Name");
and survives multiple load-balancers. I'm assuming that SignalR persists those groups in Redis automatically, so all servers know what groups we have and who is subscribed to them.
However, in my situation I can't just rely on Groups (or Users), because there is no way to map a connectionId back to its group (say, when overriding OnDisconnectedAsync and only the connection id is known); you always need the Group_Name to identify the group. I need that mapping to identify which part of the group is online, so that when OnDisconnectedAsync is called I know which group this user belongs to and which side of the conversation he is on.
I've done some research, and all the suggestions (including the Microsoft Docs) are to use something like:
static readonly ConcurrentDictionary<string, ConversationInformation> connectionMaps;
in the hub itself.
Now, this is a great solution (and thread-safe), except that it exists only in the memory of one load-balanced server, and the other servers each have a different instance of this dictionary.
The question is, do I have to persist connectionMaps manually? Using Redis for example?
Something like:
public class ChatHub : Hub
{
    static readonly ConcurrentDictionary<string, ConversationInformation> connectionMaps;

    ChatHub(IDistributedCache distributedCache)
    {
        connectionMaps = distributedCache.Get("ConnectionMaps");
        /// I think connectionMaps should not be static any more.
    }
}
And if yes, is it thread-safe? If not, can you suggest a better solution that works with load-balancing?
I've been battling with the same issue on this end. What I've come up with is to persist the collections in the Redis cache, using a StackExchange.Redis.IDatabaseAsync alongside locks to handle concurrency.
This unfortunately makes the whole process effectively synchronous, but I couldn't quite figure a way around it.
Here's the core of what I'm doing; this acquires a lock and returns the deserialised collection from the cache:
private async Task<ConcurrentDictionary<int, HubMedia>> GetMediaAttributes(bool requireLock)
{
    if (requireLock)
    {
        var retryTime = 0;
        try
        {
            while (!await _redisDatabase.LockTakeAsync(_mediaAttributesLock, _lockValue, _defaultLockDuration))
            {
                // Wait until we can get a lock on the data, retrying every 100 ms
                await Task.Delay(100);
                retryTime += 100;
                if (retryTime > _defaultLockDuration.TotalMilliseconds)
                {
                    _logger.LogError("Failed to get Media Attributes");
                    return null;
                }
            }
        }
        catch (TaskCanceledException e)
        {
            _logger.LogError("Failed to take lock within the default 5 second wait time " + e);
            return null;
        }
    }
    var mediaAttributes = await _redisDatabase.StringGetAsync(MEDIA_ATTRIBUTES_LIST);
    if (!mediaAttributes.HasValue)
    {
        return new ConcurrentDictionary<int, HubMedia>();
    }
    return JsonConvert.DeserializeObject<ConcurrentDictionary<int, HubMedia>>(mediaAttributes);
}
I update the collection like this after I'm done manipulating it:
private async Task<bool> UpdateCollection(string redisCollectionKey, object collection, string lockKey)
{
    var success = false;
    try
    {
        success = await _redisDatabase.StringSetAsync(redisCollectionKey, JsonConvert.SerializeObject(collection, new JsonSerializerSettings
        {
            ReferenceLoopHandling = ReferenceLoopHandling.Ignore
        }));
    }
    finally
    {
        await _redisDatabase.LockReleaseAsync(lockKey, _lockValue);
    }
    return success;
}
...and when I'm done I just make sure the lock is released so other instances can grab and use it:
private async Task ReleaseLock(string lockKey)
{
    await _redisDatabase.LockReleaseAsync(lockKey, _lockValue);
}
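For context, here is roughly how these helpers can be wired into the hub's OnDisconnectedAsync so every server works against the shared Redis copy. This is a sketch, not from my production code as-is: it assumes HubMedia exposes a ConnectionId property (hypothetical) and reuses the MEDIA_ATTRIBUTES_LIST and _mediaAttributesLock fields shown above.
public override async Task OnDisconnectedAsync(Exception exception)
{
    // Take the lock and load the shared collection from Redis.
    var mediaAttributes = await GetMediaAttributes(true);
    if (mediaAttributes != null)
    {
        // Drop any entries owned by the disconnecting connection, then persist
        // the collection back to Redis and release the lock.
        var staleKeys = mediaAttributes
            .Where(kvp => kvp.Value.ConnectionId == Context.ConnectionId)
            .Select(kvp => kvp.Key)
            .ToList();

        foreach (var key in staleKeys)
        {
            mediaAttributes.TryRemove(key, out var removed);
        }

        await UpdateCollection(MEDIA_ATTRIBUTES_LIST, mediaAttributes, _mediaAttributesLock);
    }

    await base.OnDisconnectedAsync(exception);
}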
I'd be happy to hear if you find a better way of doing this. I struggled to find any documentation on scale-out with data retention and sharing.

Save complex object to session ASP .NET CORE 2.0

I am quite new to ASP.NET Core, so please help. I would like to avoid a database round trip in my ASP.NET Core application. I have functionality to dynamically add columns to a datagrid, and the column settings (visibility, enabled, width, caption) are stored in the DB.
So I would like to store a List<PersonColumns> on the server just for the current session, but I am not able to do this. I already use the JsonConvert methods to serialize and deserialize objects to/from the session. This works for List<Int32> or objects with simple properties, but not for a complex object with nested properties.
The object I want to store in the session looks like this:
[Serializable]
public class PersonColumns
{
    public Int64 PersonId { get; set; }

    List<ViewPersonColumns> PersonCols { get; set; }

    public PersonColumns(Int64 personId)
    {
        this.PersonId = personId;
    }

    public void LoadPersonColumns(dbContext dbContext)
    {
        LoadPersonColumns(dbContext, null);
    }

    public void LoadPersonColumns(dbContext dbContext, string code)
    {
        PersonCols = ViewPersonColumns.GetPersonColumns(dbContext, code, PersonId);
    }

    public static List<ViewPersonColumns> GetFormViewColumns(SatisDbContext dbContext, string code, Int64 formId, string viewName, Int64 personId)
    {
        var columns = ViewPersonColumns.GetPersonColumns(dbContext, code, personId);
        return columns.Where(p => p.FormId == formId && p.ObjectName == viewName).ToList();
    }
}
I would also like to ask whether my approach of saving a list of 600 records to the session is bad. Is it better to hit the DB and load the columns each time the user wants to display the grid?
Any advice appreciated
Thanks
EDIT: I have tested storing a List<ViewPersonColumns> in the session and it is saved correctly. When I save an object where the List<ViewPersonColumns> is a property, only the built-in types are saved and the List property is null.
The object I want to save in the session:
[Serializable]
public class UserManagement
{
    public String PersonUserName { get; set; }
    public Int64 PersonId { get; set; }
    public List<ViewPersonColumns> PersonColumns { get; set; } //not saved to session??

    public UserManagement() { }

    public UserManagement(DbContext dbContext, string userName)
    {
        var person = dbContext.Person.Single(p => p.UserName == userName);
        PersonUserName = person.UserName;
        PersonId = person.Id;
    }

    /*public void PrepareUserData(DbContext dbContext)
    {
        LoadPersonColumns(dbContext);
    }*/

    public void LoadPersonColumns(DbContext dbContext)
    {
        LoadPersonColumns(dbContext, null);
    }

    public void LoadPersonColumns(DbContext dbContext, string code)
    {
        PersonColumns = ViewPersonColumns.GetPersonColumns(dbContext, code, PersonId);
    }

    public List<ViewPersonColumns> GetFormViewColumns(Int64 formId, string viewName)
    {
        if (PersonColumns == null)
            return null;
        return PersonColumns.Where(p => p.FormId == formId && p.ObjectName == viewName).ToList();
    }
}
Save columns to the session
UserManagement userManagement = new UserManagement(_context, user.UserName);
userManagement.LoadPersonColumns(_context);
HttpContext.Session.SetObject("ActualPersonContext", userManagement);
HttpContext.Session.SetObject("ActualPersonColumns", userManagement.PersonColumns);
Load columns from the session
//userManagement built-in types are set. The PersonColumns is null - not correct
UserManagement userManagement = session.GetObject<UserManagement>("ActualPersonContext");
//The cols is filled from session with 600 records - correct
List<ViewPersonColumns> cols = session.GetObject<List<ViewPersonColumns>>("ActualPersonColumns");
Using a list for the columns is better than going to the database each time.
You can't create and store sessions in .NET Core the same way as in .NET Framework 4.0.
Try it like this:
Startup.cs
public void ConfigureServices(IServiceCollection services)
{
    //services.AddDbContext<GeneralDBContext>(options => options.UseSqlServer(Configuration.GetConnectionString("DefaultConnection")));
    services.AddMvc().AddSessionStateTempDataProvider();
    services.AddSession();
}
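Note that you also need to enable the session middleware in Configure, otherwise accessing HttpContext.Session throws at runtime. A minimal sketch:
public void Configure(IApplicationBuilder app)
{
    // The session middleware must run before anything that reads or writes
    // session state (MVC in this case).
    app.UseSession();
    app.UseMvcWithDefaultRoute();
}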
Common/SessionExtensions.cs
using Microsoft.AspNetCore.Http;
using Newtonsoft.Json;
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;

namespace IMAPApplication.Common
{
    public static class SessionExtensions
    {
        public static T GetComplexData<T>(this ISession session, string key)
        {
            var data = session.GetString(key);
            if (data == null)
            {
                return default(T);
            }
            return JsonConvert.DeserializeObject<T>(data);
        }

        public static void SetComplexData(this ISession session, string key, object value)
        {
            session.SetString(key, JsonConvert.SerializeObject(value));
        }
    }
}
Usage
==> Create Session
public IActionResult Login([FromBody]LoginViewModel model)
{
    LoggedUserVM user = GetUserDataById(model.userId);

    //Create Session with complex object
    HttpContext.Session.SetComplexData("loggerUser", user);

    return Json(new { status = result.Status, message = result.Message });
}
==> Get Session data
public IActionResult Index()
{
    //Get Session data
    LoggedUserVM loggedUser = HttpContext.Session.GetComplexData<LoggedUserVM>("loggerUser");
    return View(loggedUser);
}
Hope this is helpful. Good luck.
This is an evergreen post, and even though Microsoft recommends serialisation to store objects in session, it is not a correct solution unless your object is read-only. I have a blog post explaining every scenario, and I have also pointed out the issues on the ASP.NET Core GitHub in issue id 18159.
A synopsis of the problems:
A. A serialised copy isn't the same as the object itself. True, it helps in the distributed-server scenario, but it comes with a caveat that Microsoft has failed to highlight: it will work without unpredictable failures only when the object is meant to be read, not written back.
B. If you are looking for a read-write object in the session, every time you change the object that was read from the session after deserialisation, it needs to be serialised and written back to the session again. This alone leads to multiple complexities, as you need to either keep track of the changes or keep writing back to the session after each change to any property. In a single request to the server, you will have scenarios where the object is written back multiple times before the response is sent.
C. For a read-write object in the session, this will fail even on a single server, because the user's actions can trigger multiple rapid requests, and more often than not the system will find itself in a situation where the object is being serialised or deserialised by one thread while being edited and written back by another. The result is that threads end up overwriting each other's object state, and even locks won't help you much, since the object is not one real object but a temporary object created by deserialisation.
D. There are issues with serialising complex objects: it is not just a performance hit, it may even fail in certain scenarios, especially if you have deeply nested objects that refer back to themselves.
A synopsis of the solution is below; the full implementation, along with code, is in the blog link:
First, implement this as a cache object: create one item in IMemoryCache for each unique session.
Second, keep the cache entry in sliding-expiration mode, so that each time it is read the expiry time is renewed, thereby keeping the object in the cache as long as the session is active.
Third, the second point alone is not enough: you will need to implement a heartbeat technique, triggering a call to the session every T minus 1 minute or so from JavaScript. (We did this anyway to keep the session alive while the user is working in the browser, so it won't be any different.)
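To illustrate the first two points, here is a minimal sketch of the per-session cache wrapped in a small manager class (SessionData and the 20-minute window are placeholders, not from the blog):
using System;
using Microsoft.Extensions.Caching.Memory;

public class SessionData
{
    // Whatever mutable state you would otherwise have put in the session.
}

public class SessionManager
{
    private readonly IMemoryCache _cache;

    public SessionManager(IMemoryCache cache)
    {
        _cache = cache;
    }

    public SessionData GetOrCreate(string sessionId)
    {
        // One cache entry per unique session id; sliding expiration means every
        // read (including the heartbeat) pushes the expiry out again.
        return _cache.GetOrCreate(sessionId, entry =>
        {
            entry.SlidingExpiration = TimeSpan.FromMinutes(20);
            return new SessionData();
        });
    }
}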
Additional Recommendations
A. Make an object called SessionManager, so that all your code related to session reads/writes sits in one place.
B. Do not keep a very high value for the session timeout. If you are implementing the heartbeat technique, even a 3-minute session timeout will be enough.

How do I execute a new job on success or failure of Hangfire job?

I'm working on a Web API RESTful service that needs to perform a task when it receives a request. We're using Hangfire to execute that task as a job, and on failure it will attempt to retry the job up to 10 times.
If the job eventually succeeds I want to run an additional job (to send an event to another service). If the job fails even after all of the retry attempts, I want to run a different additional job (to send a failure event to another service).
However, I can't figure out how to do this. I've created the following JobFilterAttribute:
public class HandleEventsAttribute : JobFilterAttribute, IElectStateFilter
{
    public IBackgroundJobClient BackgroundJobClient { get; set; }

    public void OnStateElection(ElectStateContext context)
    {
        var failedState = context.CandidateState as FailedState;
        if (failedState != null)
        {
            BackgroundJobClient.Enqueue<MyJobClass>(x => x.RunJob());
        }
    }
}
The one problem I'm having is injecting the IBackgroundJobClient into this attribute. I can't pass it as a property to the attribute (I get a "Cannot access non-static field 'backgroundJobClient' in static context" error). We're using Autofac for dependency injection, and I tried figuring out how to use property injection, but I'm at a loss. All of this leads me to believe I may be on the wrong track.
I'd think it would be a fairly common pattern to run some additional cleanup code if a Hangfire job fails. How do most people do this?
Thanks for the help. Let me know if there's any additional details I can provide.
Hangfire can build execution chains. If you want to schedule the next job after the first one succeeds, you need to use ContinueWith(string parentId, Expression<Action> methodCall, JobContinuationOptions options) with JobContinuationOptions.OnlyOnSucceededState to run it only after success.
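For example, something along these lines (a sketch; NotificationService.SendSuccessEvent is an illustrative name, not from your code):
// Enqueue the main job, then chain a follow-up that only runs once the
// parent job has reached the Succeeded state (i.e. after its retries, if any).
var parentId = BackgroundJob.Enqueue<MyJobClass>(x => x.RunJob());

BackgroundJob.ContinueWith(
    parentId,
    () => NotificationService.SendSuccessEvent(),
    JobContinuationOptions.OnlyOnSucceededState);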
But you can also create a Hangfire extension like a JobExecutor and run tasks inside it to get more possibilities.
Something like this:
public static JobResult<T> Enqueue<T>(Expression<Action> a, string name)
{
    var exprInfo = GetExpressionInfo(a);
    Guid jGuid = Guid.NewGuid();
    var jobId = BackgroundJob.Enqueue(() => JobExecutor.Execute(jGuid, exprInfo.Method.DeclaringType.AssemblyQualifiedName, exprInfo.Method.Name, exprInfo.Parameters, exprInfo.ParameterTypes));
    JobResult<T> result = new JobResult<T>(jobId, name, jGuid, 0, default(T));
    JobRepository.WriteJobState(new JobResult<T>(jobId, name, jGuid, 0, default(T)));
    return result;
}
You can find more detailed information here: https://indexoutofrange.com/Don%27t-do-it-now!-Part-5.-Hangfire-job-continuation,-ContinueWith/
I haven't been able to verify that this will work, but BackgroundJobClient has no static methods, so you would need a reference to an instance of it.
When I enqueue tasks, I use the static Hangfire.BackgroundJob.Enqueue, which should work without a reference to a job client instance.
Steve

UserNamePasswordValidator and Session Management

I'm using a WCF custom validator with HTTPS (.NET 4.5). On success, Validate produces a Customer object which I would like to use later. Currently I'm able to do this with static variables, which I would like to avoid if possible. I tried to use HttpContext, but it is null in the main thread; my understanding is that Validate runs on a different thread. Is there any way I can share session info without involving the DB or a file share? See the related threads here and here.
In Authentication.cs
public class CustomValidator : UserNamePasswordValidator
{
    public override void Validate(string userName, string password)
    {
        //If User Valid then set Customer object
    }
}
In Service.cs
public class Service
{
    public string SaveData(string XML)
    {
        //Need Customer object here. Without it cannot save XML.
        //HttpContext null here.
    }
}
I can suggest an alternative approach, assuming that the WCF service is running in ASP.NET compatibility mode and you are saving the customer object to session storage. Create a class such as AppContext; the code would look something like this:
public class AppContext
{
    private const string CUSTOMERSESSIONKEY = "CurrentCustomer"; // key name is illustrative
    private static readonly object lockObject = new object();

    public Customer CurrentCustomer
    {
        get
        {
            Customer cachedCustomerDetails = HttpContext.Current.Session[CUSTOMERSESSIONKEY] as Customer;
            if (cachedCustomerDetails != null)
            {
                return cachedCustomerDetails;
            }
            else
            {
                lock (lockObject)
                {
                    if (HttpContext.Current.Session[CUSTOMERSESSIONKEY] != null) //Thread double entry safeguard
                    {
                        return HttpContext.Current.Session[CUSTOMERSESSIONKEY] as Customer;
                    }
                    Customer customerDetails = null; // TODO: load customer details for the logged-in user via HttpContext.Current.User.Identity.Name
                    if (customerDetails != null)
                    {
                        HttpContext.Current.Session[CUSTOMERSESSIONKEY] = customerDetails;
                    }
                    return customerDetails;
                }
            }
        }
    }
}
The basic idea here is to lazy-load the data once both the WCF and ASP.NET pipelines have executed and HttpContext is available.
Hope it helps.
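Usage from the service would then be something like this (a sketch; SaveXmlForCustomer is a hypothetical helper, and AspNetCompatibilityRequirements must be enabled so HttpContext flows into the operation):
public class Service
{
    public string SaveData(string XML)
    {
        // Lazily resolves the customer from session, loading it on first access.
        var customer = new AppContext().CurrentCustomer;
        if (customer == null)
        {
            throw new FaultException("No customer is associated with the current session.");
        }
        return SaveXmlForCustomer(customer, XML);
    }

    private string SaveXmlForCustomer(Customer customer, string xml)
    {
        // Hypothetical: persist the XML against the resolved customer.
        return "OK";
    }
}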
Alright, this should have been easier. Because of the way UserNamePasswordValidator works, I needed to use custom authorization to pass the username/password to the main thread and get the customer info again from the database. This is an additional DB call, but an acceptable workaround for now. Please download the code from Rory Primrose's genius blog entry.

NHibernate not persisting changes to my object

My ASP.NET MVC 4 project is using NHibernate (behind repositories) and Castle Windsor, using the AutoTx and NHibernate facilities. I've followed the guide written by haf, and I can create and read objects.
My PersistenceInstaller looks like this:
public class PersistenceInstaller : IWindsorInstaller
{
    public void Install(Castle.Windsor.IWindsorContainer container, Castle.MicroKernel.SubSystems.Configuration.IConfigurationStore store)
    {
        container.AddFacility<AutoTxFacility>();
        container.Register(Component.For<INHibernateInstaller>().ImplementedBy<NHibernateInstaller>().LifeStyle.Singleton);
        container.AddFacility<NHibernateFacility>(
            f => f.DefaultLifeStyle = DefaultSessionLifeStyleOption.SessionPerWebRequest);
    }
}
The NHibernateInstaller is straight from the NHib Facility Quickstart.
I am using ISessionManager in my base repository...
protected ISession Session
{
    get
    {
        return _sessionManager.OpenSession();
    }
}

public virtual T Commit(T entity)
{
    Session.SaveOrUpdate(entity);
    return entity;
}
Finally, my application code which is causing the problem:
[HttpPost]
[ValidateAntiForgeryToken]
[Transaction]
public ActionResult Maintain(PrescriberMaintainViewModel viewModel)
{
    if (ModelState.IsValid)
    {
        var prescriber = UserRepository.GetPrescriber(User.Identity.Name);
        //var prescriber = new Prescriber { DateJoined = DateTime.Today, Username = "Test" };
        prescriber.SecurityQuestion = viewModel.SecurityQuestion;
        prescriber.SecurityAnswer = viewModel.SecurityAnswer;
        prescriber.EmailAddress = viewModel.Email;
        prescriber.FirstName = viewModel.FirstName;
        prescriber.LastName = viewModel.LastName;
        prescriber.Address = new Address
        {
            Address1 = viewModel.AddressLine1,
            Address2 = viewModel.AddressLine2,
            Address3 = viewModel.AddressLine3,
            Suburb = viewModel.Suburb,
            State = viewModel.State,
            Postcode = viewModel.Postcode,
            Country = string.Empty
        };
        prescriber.MobileNumber = viewModel.MobileNumber;
        prescriber.PhoneNumber = viewModel.PhoneNumber;
        prescriber.DateOfBirth = viewModel.DateOfBirth;
        prescriber.AHPRANumber = viewModel.AhpraNumber;
        prescriber.ClinicName = viewModel.ClinicName;
        prescriber.ClinicWebUrl = viewModel.ClinicWebUrl;
        prescriber.Qualifications = viewModel.Qualifications;
        prescriber.JobTitle = viewModel.JobTitle;
        UserRepository.Commit(prescriber);
    }
    return View(viewModel);
}
The above code will save a new prescriber (tested by uncommenting the commented-out line, etc.).
I am using NHProf and have confirmed that no SQL is sent to the database for the update. I can see the read being performed, but that's it.
It seems to me that NHibernate doesn't recognise the entity as changed and therefore does not generate the SQL. Or possibly the transaction isn't being committed?
I've been scouring the web for a few hours now trying to work this one out, and as a last act of desperation have posted on SO. Any ideas? :)
Oh, and in NHProf I see three sessions (one for the GetPrescriber call from the repo, one I assume for the update (with no SQL), and one for some action in my action filter on the base class). I also get an alert about the use of implicit transactions. This confuses me because I thought I was doing everything I needed to get a transaction, by using AutoTx and the Transaction attribute. I also expected there to be only one session per web request, as per my Windsor config.
UPDATE: After spending the day reading through the source for the NHibernate and AutoTx facilities, it seems that AutoTx is not setting the interceptors on my implementation of INHibernateInstaller. This means that whenever SessionManager calls OpenSession it is calling the default version with no parameters, rather than the one that accepts an interceptor. Internally, AutoTxFacility registers TransactionInterceptor with Windsor so that the interceptor can be added to my INHibernateInstaller concrete by Windsor, making use of AutoTx's TransactionalComponentInspector.
AutoTxFacility source on GitHub
To me it looks like you are creating a session for every call to the repository. A session should span the whole business operation: it should be opened at the beginning and committed and disposed at the end.
There are other strange things in this code.
Commit is a completely different concept from SaveOrUpdate.
And you don't need to tell NH to store changes anyway: you don't need to call session.Save for objects that are already in the session; they are persisted anyway. You only need to call session.Save when you add new objects.
Make sure that you use a transaction for the whole business operation.
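For reference, outside of AutoTx that advice looks roughly like this (a sketch with plain NHibernate, assuming sessionFactory, prescriberId and viewModel are in scope, just to show where the session and transaction boundaries sit):
// One session and one transaction spanning the whole business operation.
using (var session = sessionFactory.OpenSession())
using (var tx = session.BeginTransaction())
{
    var prescriber = session.Get<Prescriber>(prescriberId);  // read
    prescriber.JobTitle = viewModel.JobTitle;                // modify the tracked entity

    // No explicit Save/Update call is needed for an entity already in the
    // session; dirty checking flushes the change when the transaction commits.
    tx.Commit();
}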
There is one most likely unintended part in the code snippet above, proven by the observation made with NHProf:
Oh and in NHProf I see three Sessions (1 for the GetPrescriber call
from the repo, one I assume for the update (with no sql) - and one for
some action in my actionfilter on the base class).
Calling OpenSession() triggers the creation of a new session instance.
protected ISession Session
{
    get { return _sessionManager.OpenSession(); }
}
So whenever the code accesses the Session property, a new session instance is created behind the scenes (again and again): one session for the get, one for the update, one for the filter...
As we can see here, the session returned by SessionManager.OpenSession() must be used for the whole scope (unit of work, web request...):
http://docs.castleproject.org/Windsor.NHibernate-Facility.ashx
What we need is to create one session (when it is first accessed) and reuse it until the end of the scope (and then correctly close it, commit or roll back the transaction, and so on). Anyhow, the first thing to do right now is to change the Session property this way:
ISession _session;

protected ISession Session
{
    get
    {
        if (_session == null)
        {
            _session = _sessionManager.OpenSession();
        }
        return _session;
    }
}
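And at the end of the scope the cached session still has to be closed; a minimal sketch, assuming the repository implements IDisposable:
public void Dispose()
{
    if (_session != null)
    {
        // Close the session at the end of the unit of work; the transaction
        // itself is handled by AutoTx via the [Transaction] attribute.
        _session.Dispose();
        _session = null;
    }
}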
After spending a full day yesterday searching through the AutoTx and NHibernate facilities on GitHub and getting nowhere, I started a clean project in an attempt to replicate the problem. Unfortunately for the replication, everything worked! I ran Update-Package on my source, which brought down a new version of Castle.Transactions, and then it ran correctly. I did make one small adjustment to my own code: removing the UserRepository.Commit line.
I did not need to modify how I opened sessions; that was taken care of by the SessionManager instance. With the update to Castle.Transactions, the Transaction attribute is now recognised and a transaction is created (as evidenced by the alert no longer appearing in NHProf).