I'm trying to convert the OptaPlanner ProjectJobScheduling dates to LocalDateTime, but I'm running into problems with the score calculation. The scores are calculated from the business-calendar minutes between two dates for each resource.
In ProjectJobSchedulingIncrementalScoreCalculator.java the initial soft1Score is set based on only one date. What should I do, and is it feasible?
Can someone provide me with any samples that use real dates?
Regards
Look at the optaplanner-examples NurseRostering and ConferenceScheduling (new in 7.6).
Basically, do a find in path for "java.time".
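The question mentions business-calendar minutes; java.time itself has no business calendar, so a real score calculator would still have to subtract non-working time. Purely as a hedged sketch of the java.time part (class name and values are made up, not taken from the example):

    import java.time.Duration;
    import java.time.LocalDateTime;

    public class BusinessMinutesSketch {

        // Plain duration in minutes between two date-times.
        // A real business calendar would subtract non-working minutes here.
        public static long minutesBetween(LocalDateTime start, LocalDateTime end) {
            return Duration.between(start, end).toMinutes();
        }

        public static void main(String[] args) {
            LocalDateTime start = LocalDateTime.of(2018, 3, 1, 9, 0);
            LocalDateTime end = LocalDateTime.of(2018, 3, 2, 17, 30);
            System.out.println(minutesBetween(start, end)); // prints 1950
        }
    }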
Related
I am applying the OptaPlanner VRP example with time windows, and I get feasible solutions whenever I define time windows within a 24-hour range (00:00 to 23:59). But I need to:
Manage long trips, where I know that the travel time from the depot to the first visit, or between visits, will be more than 24 hours. Currently I get no feasible solutions, because the time windows are in 24-hour format: the "arrivalAfterDueTime" score rule always fires because the "arrivalTime" is always higher than the "dueTime", since the "dueTime" lies within (00:00 to 23:59) while the "arrivalTime" falls on the next day.
I have thought about taking each customer's time window and adding extra time windows to it, one for each day of the planning horizon.
For example, if I am planning a three-day trip and Customer 1 is available from [08:00-10:00], I would also make it available from [32:00-34:00] and [56:00-58:00], which are the equivalent windows on the following days.
I handle the times as long values, converted to milliseconds.
I don't know if this is the right approach; I am mainly looking for ideas on how to handle this constraint. If you have faced a similar problem, any idea would be very much appreciated.
Sorry for the wording, I am a Spanish speaker. Thank you.
Without having checked the example, handling multiple days shouldn't be complicated. It all depends on how you model your time variable.
For example, you could:
model the timestamps as a long value denoting seconds since the epoch. This is how most of the examples are modeled, if I remember correctly. It is not very human-readable, but it is the fastest to compute with.
use a time type, e.g. LocalTime. This is a human-readable time format, but it only works within a 24-hour range and will be slower than a primitive data type.
use a date-time type, e.g. LocalDateTime. This is also human-readable, works over any time range, and will likewise be slower than a primitive data type.
I would strongly encourage you not to simply map the current day or current hour to a zero value and start counting from there. In your example you denote the times as [32:00-34:00], which suggests you are using the current day's midnight as hour 0 and counting from there. You can do this, but it will hurt the debuggability and maintainability of your code. That is just my general advice; you don't have to follow it.
What I would advise is to have your own domain model and map it to an OptaPlanner model in which every timestamp is a long value denoting seconds since the epoch.
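As a minimal sketch of that mapping, under the assumption that the domain uses LocalDateTime and the planning model uses plain longs (class and method names here are made up, not OptaPlanner API):

    import java.time.LocalDateTime;
    import java.time.ZoneOffset;

    // Hypothetical mapper: the domain keeps human-readable LocalDateTime,
    // the planning model works with epoch seconds (long), so score
    // calculation stays cheap and is not limited to a 24-hour range.
    public class TimeWindowMapper {

        // LocalDateTime -> seconds since the epoch (UTC assumed for simplicity).
        public static long toEpochSeconds(LocalDateTime dateTime) {
            return dateTime.toEpochSecond(ZoneOffset.UTC);
        }

        // Seconds since the epoch -> LocalDateTime, for reporting and debugging.
        public static LocalDateTime fromEpochSeconds(long epochSeconds) {
            return LocalDateTime.ofEpochSecond(epochSeconds, 0, ZoneOffset.UTC);
        }

        public static void main(String[] args) {
            long readyTime = toEpochSeconds(LocalDateTime.of(2018, 5, 1, 8, 0));
            long dueTime = toEpochSeconds(LocalDateTime.of(2018, 5, 3, 10, 0));
            // A time window longer than 24 hours is just a larger long range.
            System.out.println(dueTime - readyTime); // prints 180000 (50 hours)
        }
    }

With this, the "arrivalAfterDueTime" comparison becomes a plain long comparison and works across day boundaries.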
I have a database with integer fields (columns) named fSystemDate, fOpenned, fStatusDate, etc. I think they represent dates, but I don't know their format. The values in those fields look like these: 76505, 76530, 76554, 76563.
I do not have examples of the real dates associated with them.
Solved. See answers.
I found that this format is part of a programming language called Clarion, and its date numbering starts at 28 December 1800.
I can convert Clarion dates to SQL dates in two ways:
SELECT DATEADD(day, 76505, '18001228')
where the result would be 2010-06-15 00:00:00.
SELECT CONVERT(DateTime,76505 - 36163)
where the result is the same. The number 36163 is the adjustment for SQL Server: it is the number of days between 1 January 1900 (where SQL Server's datetime numbering starts) and 28 December 1800 (where Clarion's date numbering starts).
The result in my case is correct, because I asked my customer for examples of data from their application and compared the results.
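For completeness, the same conversion outside SQL is just "28 December 1800 plus the stored number of days". A small sketch in Java (java.time), with an illustrative class name:

    import java.time.LocalDate;

    // The Clarion value is a count of days since 28 December 1800.
    public class ClarionDates {

        private static final LocalDate CLARION_EPOCH = LocalDate.of(1800, 12, 28);

        public static LocalDate fromClarion(long clarionDays) {
            return CLARION_EPOCH.plusDays(clarionDays);
        }

        public static void main(String[] args) {
            System.out.println(fromClarion(76505)); // prints 2010-06-15
        }
    }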
It's rather hard to help you given just a number. It looks like your dates are some sort of serial number, but without any other data points it's hard to pin down:
epoch. An epoch is the zero point of a calendrical system.
increment. How big is a tick in the serial number? 1 day? 1 hour? 1 minute? A week? A month?
source hardware/operating system. From what computer system did the value originate? Different systems represent dates differently, using different calendrical systems with different epochs.
source software system. What software created the value? Was it custom software? What language was it written in? When? What is the backing store for the data? Databases, filesystems, etc. might all have their own internal date representation.
the represented value. If 76563 is indeed a representation of a date, what date does it represent? Or at least, does it represent a recent date? a date in the past? a date in the future?
Without that information, it's impossible to answer your question. This page might help you:
http://www.itworld.com/article/2823335/data-center/128452-Just-dating-The-stories-behind-12-computer-system-reference-dates.html
It lists some common epochs for different computer systems.
Edited to note: here's one data point for you: Adding 76,563 days to 1 Jan 1800 yields the date 16 August 2009.
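One way to reproduce that kind of data point is to add the serial number as days to a few candidate epochs and see which result looks plausible. A hedged sketch in Java (the candidate epochs are just examples):

    import java.time.LocalDate;
    import java.util.Map;

    // Treat the value as "days since epoch" for a few candidate epochs
    // and print the resulting dates to see which one looks plausible.
    public class EpochGuess {

        public static void main(String[] args) {
            long serial = 76563;
            Map<String, LocalDate> candidateEpochs = Map.of(
                    "1 Jan 1800", LocalDate.of(1800, 1, 1),
                    "28 Dec 1800 (Clarion)", LocalDate.of(1800, 12, 28),
                    "1 Jan 1900 (SQL Server datetime)", LocalDate.of(1900, 1, 1));
            candidateEpochs.forEach((name, epoch) ->
                    System.out.println(name + " -> " + epoch.plusDays(serial)));
        }
    }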
Let's assume a variation on the Nurse Rostering example in which, instead of assigning a nurse to a shift on a day, the nurse is assigned to a variable number of timeblocks on that day (a day consists of 24 timeblocks). E.g. Nurse1 is assigned to timeblocks [8,9,10,11,12,13,14]. Let's call such a run of consecutive assignments a ShiftPeriod. There is a hard minimum and maximum on the length of these shift periods. However, OptaPlanner has difficulty finding a feasible solution.
When there are hard consecutiveness constraints, is it better to model the planning entity as a startTimeBlock with a duration, instead of my current way of assigning a timeblock and a day and then imposing min/max consecutive constraints?
Take a look at the meeting scheduling example on GitHub master for 6.4.0.Beta1 (the example will work perfectly with 6.3.0.Final too). Video and docs are coming soon. That example uses the TimeGrain design pattern, which I think is what you're looking for.
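As a rough sketch of what that pattern looks like for the shift case above (illustrative names, not the actual meeting scheduling classes; the solution class with a @ValueRangeProvider id of "timeGrainRange" is omitted): only the starting grain is a planning variable, while the duration in grains is a fixed problem fact, so the assigned timeblocks are consecutive by construction and no min/max-consecutive score rule is needed.

    import org.optaplanner.core.api.domain.entity.PlanningEntity;
    import org.optaplanner.core.api.domain.variable.PlanningVariable;

    // One grain per timeblock of the day.
    class TimeGrain {

        private int grainIndex; // e.g. grain 8 = timeblock 8

        public int getGrainIndex() {
            return grainIndex;
        }

        public void setGrainIndex(int grainIndex) {
            this.grainIndex = grainIndex;
        }
    }

    // Only the starting grain moves during solving; the duration is fixed,
    // so the occupied timeblocks are always consecutive.
    @PlanningEntity
    class ShiftPeriod {

        private int durationInGrains; // problem fact, e.g. 7 timeblocks

        private TimeGrain startingTimeGrain;

        @PlanningVariable(valueRangeProviderRefs = {"timeGrainRange"})
        public TimeGrain getStartingTimeGrain() {
            return startingTimeGrain;
        }

        public void setStartingTimeGrain(TimeGrain startingTimeGrain) {
            this.startingTimeGrain = startingTimeGrain;
        }

        public int getDurationInGrains() {
            return durationInGrains;
        }

        public int getLastGrainIndex() {
            return startingTimeGrain.getGrainIndex() + durationInGrains - 1;
        }
    }

If the length itself also has to be decided, it could become a second planning variable whose value range is already limited to the allowed min/max, so consecutiveness still never has to be repaired by the score.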
I have a huge database of records that have been created over the past 5 or so years. I'm thinking it would be cool (and edifying) to try to create some time categories/segments for these records; the unit could be a week or a month or something like that, something to use for a graph.
Anyway, I need to develop a query that, given a datetime attribute on each record in the table, returns all the records with a datetime falling between X and Y (June 1, 2011 and June 7, 2011, for example).
I'm not good at using the time helpers yet and could not find any sufficiently similar questions on SO or elsewhere.
Solutions that use calendar increments like "week" or "month" that Rails can understand would be strongly appreciated. I know how tricky the calendar can get in programming. Or I could just use the lowest common denominator (a day) and draw an extremely fine-grained graph.
Client.where(:created_at => X..Y)
Source: Ruby on Rails Guides
I'm looking for recommendations on a best practice here.
I have a requirement where, on a given day, I must have an arbitrary number of intervals (think buckets of time composed of transactions), with at most N intervals per day. The intervals are time-like but of arbitrary length, i.e. some last seconds, others minutes.
How the intervals are formed is driven by my source data. On any given day we always start with interval 1, and the total number of intervals by end of day is unknown; each interval is defined by a fixed number of transactions. For every interval I will also need to know its end time.
What is the best approach here? Should I bucket my fact table and connect it to a standard hour/minute/second dimension, or should I use my transactional data to build a dimension that accommodates these intervals?
I appreciate your feedback.
If the buckets are based on time, you probably have to do it on one of your dimensions. There is a property on the attributes called "bucket" that can do that for you.