Significance of 631138519 in `Strict-Transport-Security: max-age`

I see that many websites set the value 631138519 (Twitter, for example) for the Strict-Transport-Security: max-age security header.
That converts to roughly 7,304.84 days, or 175,316.26 hours. What's the significance of this number in this context?

631138519 seconds is 20 years, if an average year is 365.2421985 days long. Where does that number of days come from? I'm not sure, but it seems to represent the tropical year to an arbitrary degree of precision.
If I had to guess, I'd say that someone picked 20 years as a really long time, then looked up the number of days in a year and happened to see that value. Then other sites just copied the first one.
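
The arithmetic is easy to verify; a minimal sketch:

```java
public class HstsMaxAge {
    public static void main(String[] args) {
        // 20 years of 365.2421985 days each (roughly the mean tropical year), in seconds
        double maxAge = 20 * 365.2421985 * 86_400;
        System.out.printf("%.0f%n", maxAge); // prints 631138519
    }
}
```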


Using OptaPlanner for long-trip planning of a fleet of vehicles in a Vehicle Routing Problem (VRP)

I am applying the OptaPlanner VRP example with time windows, and I get feasible solutions whenever I define time windows within a 24-hour range (00:00 to 23:59). But I need to:
Manage long trips, where I know that the duration from leaving the depot to the first visit, or between visits, will be more than 24 hours. Currently it does not give me workable solutions, because the time-window format is limited to 24 hours: when the scoring rule "arrivalAfterDueTime" is applied, the "arrivalTime" is always greater than the "dueTime", since the "dueTime" lies in the range (00:00 to 23:59) while the "arrivalTime" falls on the next day.
My idea is to take each time window of each Customer and add more windows to it, one for each day being planned.
For example, if I am planning a trip for 3 days, then each Customer would have 3 time windows. Something like this: if Customer 1 is available from [08:00-10:00], then say it is also available from [32:00-34:00] and [56:00-58:00], which are the equivalents of the same window on the following days.
I handle the times as long values, converted to milliseconds.
I don't know if this is the right approach; I am mainly looking for ideas on how to tackle this constraint. Perhaps you have faced a similar problem, and any suggestion would be much appreciated.
Sorry for the wording; I am a Spanish speaker. Thank you.
Without having checked the example, handling multiple days shouldn't be complicated. It all depends on how you model your time variable.
For example, you could:
model the timestamps as long values denoting seconds since the epoch. This is how most of the examples are modeled, if I remember correctly. Note that this is not very human-readable, but it is the fastest to compute with
use a time data type, e.g. LocalTime; this is human-readable, but it only works within a 24-hour range and will be slower than a primitive data type
use a date-time data type, e.g. LocalDateTime; this is also human-readable, works for any time range, and will likewise be slower than a primitive data type.
I would strongly encourage you not to simply map the current day or current hour to a zero value and start counting from there. In your example you write the times as [32:00-34:00], which suggests you are using midnight of the current day as hour 0 and counting upward. While you can do this, it will hurt the debuggability and maintainability of your code. That is just my general advice; you don't have to follow it.
What I would advise is to have your own domain models and map them to OptaPlanner models, where every timestamp is a long value denoting seconds since the epoch.
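
A minimal sketch of that last suggestion, assuming a hypothetical TimeWindow class (the names are illustrative, not taken from the OptaPlanner example):

```java
import java.time.LocalDateTime;
import java.time.ZoneOffset;

// The domain model keeps human-readable date-times; the planning model sees only longs.
public class TimeWindow {
    private final long readyTime; // seconds since the epoch
    private final long dueTime;   // seconds since the epoch

    public TimeWindow(LocalDateTime ready, LocalDateTime due) {
        this.readyTime = ready.toEpochSecond(ZoneOffset.UTC);
        this.dueTime = due.toEpochSecond(ZoneOffset.UTC);
    }

    public long getReadyTime() { return readyTime; }
    public long getDueTime()   { return dueTime; }
}
```

A window on day 3 of the plan is then simply a later LocalDateTime, so an arrivalTime-versus-dueTime comparison works across day boundaries without any [32:00-34:00] wrap-around arithmetic.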

Google BigQuery table decorators

I need to add decorators that will represent the range from 6 days ago until now.
How should I do it?
Let's say the date is relative, 604800000 ms before now, and its absolute value is 1427061600000:
#-604800000
#1427061600000
#now in millis - 1427061600000
#1427061600000 - now in millis
Is there a difference between using relative and absolute times?
Thanks
#-518400000--1
Will give you data for the last 6 days (or last 144 hours).
I think all you need is to read this.
Basically, you have the choice of #time, which is time since the epoch (your #1427061600000). You can also express it as a negative number, which the system interprets as NOW - time (your #-604800000). Both work, but neither gives the result you want: instead of returning everything that was added in that time range, it returns a snapshot of your table as of 6 days ago.
Although you COULD take that snapshot, eliminate all duplicates between it and your current table, and treat the remainder as what was added during your 6 days, you're better off:
Using time ranges directly, which you cover with your 3rd and 4th lines. I don't know if the order makes a difference, but I've always used #time1-time2 with time1<time2 (in your case, #1427061600000 - now in millis).
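
For illustration, a sketch that computes both forms for a last-6-days range, keeping the # marker used in this post (check the linked documentation for the exact table-reference syntax):

```java
public class DecoratorRange {
    public static void main(String[] args) {
        long sixDaysMs = 6L * 24 * 60 * 60 * 1000; // 518400000 ms = 144 hours
        long now = System.currentTimeMillis();

        // Relative range: negative values mean "milliseconds before now"
        String relative = "#-" + sixDaysMs + "--1";

        // Absolute range: explicit epoch milliseconds, earlier time first
        String absolute = "#" + (now - sixDaysMs) + "-" + now;

        System.out.println(relative); // #-518400000--1
        System.out.println(absolute);
    }
}
```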

Convert average(decimal) into years, months and days

I have been doing some calculations and was able to find my solution, but I want to go back and convert my average (587.3) into years, months and days. I'm using SSRS 2008 for reporting. I use avg(Fields!XXX.Value) to calculate the average inside my text box. Sorry for my English, as this is my second language.
If you really have "a number of days", then you don't need to worry about leap years etc., because this number (of days) doesn't refer to a specific date. You might choose 360 days/year and 30 days/month, as is usually done in financial calculations.
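
With those conventions the conversion is just division and remainders; a minimal sketch for the 587.3 average (the same arithmetic could be written as an SSRS expression):

```java
public class DaysToYearsMonthsDays {
    public static void main(String[] args) {
        double avgDays = 587.3;              // the average from the question
        int years  = (int) (avgDays / 360);  // 360 days per year
        double rem = avgDays - years * 360;
        int months = (int) (rem / 30);       // 30 days per month
        double days = rem - months * 30;
        // 587.3 days -> 1 year, 7 months, 17.3 days
        System.out.printf("%d years, %d months, %.1f days%n", years, months, days);
    }
}
```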

How do you store datetimes in a database on the scale of the history of the universe?

Is there a standard approach to storing datetimes in a database (PostgreSQL/MongoDB/Neo4j) that can handle timescales from milliseconds up to the age of the universe?
Some examples of times would be:
13.7 billion years ago: origin of the universe
Photon epoch: between 10 seconds and 380,000 years after the Big Bang (so 13.7 billion years - 10 seconds ago)
8,000 BCE: end of last ice age
356 BCE: Alexander the Great's birth
Is it possible to build an actual timeline on this scale? The examples above aren't necessarily exact.
If you need to store far past and far future times exactly
Well, 13.7 billion years is about 432043200000000000000 milliseconds. That number requires 69 bits of storage, so I guess you are looking for an integer type with at least 70 bits (1 for the sign) if you want to store the times exactly. PostgreSQL doesn't have one: you need to use NUMERIC instead.
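
A quick check of those figures (assuming 365-day years, which is evidently what the 432043200000000000000 value uses):

```java
import java.math.BigInteger;

public class AgeInMillis {
    public static void main(String[] args) {
        // 13.7 billion years * 365 days * 86400 seconds * 1000 ms
        BigInteger ms = BigInteger.valueOf(13_700_000_000L)
                .multiply(BigInteger.valueOf(365L * 86_400 * 1_000));
        System.out.println(ms);             // 432043200000000000000
        System.out.println(ms.bitLength()); // 69, so 70 bits including a sign bit
    }
}
```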
If you can accept loss of precision for far past and far future times
Since the precise age of the universe isn't known anyway, you can just use floating-point numbers. With double precision, if 0 represents the present, then times 13.7 billion years ago will only be precise to within about a minute: adjacent double values near 4.3 × 10^17 seconds are 64 seconds apart.
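
To see the spacing directly, Math.ulp reports the gap between adjacent doubles at a given magnitude:

```java
public class DoubleSpacing {
    public static void main(String[] args) {
        double age = 13.7e9 * 365.25 * 86_400; // ~4.32e17 seconds before the present
        System.out.println(Math.ulp(age));     // 64.0 -> adjacent doubles differ by 64 s
        System.out.println(age + 20 == age);   // true: a 20-second offset vanishes entirely
    }
}
```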

Interview: Determine how many x in y?

Today I had an interview with a gentleman who asked me to determine how many veterinarians are in the city of Atlanta. The interview was for an entry-level development position.
Assumptions: 1,000,000 people in Atlanta, 500,000 pets in Atlanta. The actual data is irrelevant.
Other than that there were no specifics. He asked me to find this data using only a whiteboard. There was no code required; it was simply a question to determine how well I could "reason" the problem. He said there was no right or wrong answer, and that I should work from the ground up.
After several answers, one of which was ~1,000 veterinarians in Atlanta, he told me he was going to ask other questions and I got the impression I had missed the point entirely.
I tried to work from the assumption that each vet could maybe see five animals a day, over a total of 24 working days per month.
Using those assumptions, I finally calculated (24 * 5) * 12 = 1,440 pets/year per vet, and with 500,000 pets that would come to 500,000 / 1,440 ~= 347 veterinarians.
What steps could I have taken to approach this problem differently, in case I run into this sort of problem in future interviews?
I agree with your approach. The average pet sees a veterinarian so many times a year. The average veterinarian sees so many pets per week. Crunch those numbers and you have your answer.
Just guessing off the top of my head, I would say the average pet sees a veterinarian twice each year. So that's 1,000,000 visits. I'd say the average vet works 48 weeks a year, sees about one pet every 40 minutes, and works 30 hours per working week. That's about 2,160 visits per vet.
1,000,000 / 2,160 ~= 463.
My answer is close enough to yours, given that the numbers are all guesses.
The point of the question, I think, is to clearly define each assumption you have to make in order to produce an estimate. Your assumptions can be wildly inaccurate; in practice, they usually aren't too bad.
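
Spelled out as a quick sketch, with every guessed input named explicitly (all values are the guesses from above, not data):

```java
public class VetEstimate {
    public static void main(String[] args) {
        double pets = 500_000;              // given in the interview
        double visitsPerPetPerYear = 2;     // guess
        double weeksPerYear = 48;           // guess
        double hoursPerWeek = 30;           // guess
        double minutesPerVisit = 40;        // guess

        double totalVisits = pets * visitsPerPetPerYear;                          // 1,000,000
        double visitsPerVet = weeksPerYear * hoursPerWeek * 60 / minutesPerVisit; // 2,160
        System.out.printf("~%.0f veterinarians%n", totalVisits / visitsPerVet);   // ~463
    }
}
```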
Interesting aside...there's a fun board game called Guesstimation built entirely around this kind of estimation problem.
How many of those pets are the kinds of pets that need to see a veterinarian? How many vets see pets rather than large animals?
The point of this question isn't necessarily the Fermi estimate itself: it's to see how you handle ambiguous requirements that could significantly affect your answer.