Windows Server has a registry key, HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\TimeZoneInformation, whose ActiveTimeBias value gives the offset in minutes from GMT for the machine you are running on.
We have a web application that needs to present users with their local time on an HTML page that we generate. We do this by having them set a preference for their time zone and then fetching the above value, so that we can compare server time and client time and do the correct calculation.
On servers that we have built, this works great. We are deploying our application to a tier 1 cloud provider who provides a Windows Server AMI that we configure and use. What we have found is that when you use the clock control panel on a local server, the registry entries for TimeZoneInformation are correct. When we do the same on this virtual machine, the entries are correct with the exception of ActiveTimeBias.
Microsoft, in its usual fashion, cautions against editing this individual value directly.
Question for the community: has anyone else encountered this problem, and if so, how did you fix it?
One usually doesn't program directly against these registry keys. For example, in .NET, we have the TimeZoneInfo class. It uses the same data, but in a much more developer-friendly way.
Regarding your particular question, the ActiveTimeBias value holds the current offset from UTC for the time zone your server is set to. If your server is in a time zone that follows daylight saving time rules, this will change twice a year. If you manually update your time zone and refresh the registry, you will see it change.
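For illustration only, here is a minimal C# sketch of getting the same information through TimeZoneInfo instead of reading ActiveTimeBias (the Windows zone id is just an example):

using System;

// The server's current offset from UTC, DST included, without touching
// HKLM\...\TimeZoneInformation directly.
TimeSpan serverOffset = TimeZoneInfo.Local.GetUtcOffset(DateTimeOffset.UtcNow);

// The same lookup for a user's chosen zone (Windows id shown as an example).
TimeZoneInfo userZone = TimeZoneInfo.FindSystemTimeZoneById("Eastern Standard Time");
TimeSpan userOffset = userZone.GetUtcOffset(DateTimeOffset.UtcNow);

Console.WriteLine($"Server offset: {serverOffset}, user offset: {userOffset}");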
The best advice I can offer is that since the timezone of the server is likely to be unimportant to anyone, you should set it to UTC.
See also: Daylight saving time and time zone best practices
Wow! Tons of posts on converting GMT to local time, including correction for DST. But it seems my need is different.
As the title says, I have a stand-alone embedded system with no O/S. I'm using NTP to get UTC. That is used to tag events with an accurate date/time. I can correct UTC for the current time zone but cannot automatically adjust for DST.
Since there is no O/S, I don't have any of the Windows/Linux data such as time zone. So there is no way to locally adjust for the GMT offset.
It seems the only way for me to do this is to use an HTTP call to find the offset, and the only way I can think of to do that is by lon/lat or address. It would be possible for me to add lon/lat or an address to the configuration, so this seems like the only option.
I've seen references to sites which return the GMT offset based on location. Do these sites also automatically adjust for DST? To do that, they would have to use one of the solutions posted in many places in this forum, but that should be easy enough.
Thanks for the advice and help!
Dave
If you only need to convert a specific single timestamp to local time, then yes - you can use services such as those listed here. At least the ones offered by Microsoft and Google do convert a timestamp to the local time in the given time zone, in addition to providing the IANA time zone id.
Additionally, you'll find that the gettimezonebycoordinates function in the Microsoft Azure LBS Time Zone API returns a PosixTz value, such as "PST+8PDT,M3.2.0,M11.1.0". This is ideal for embedded systems, as you can set your TZ environment variable to this value, and then many local APIs (such as C's, and others) will use it in their conversions. This approach works best when you may be converting many different local time values and don't want to make an HTTP call for each one.
Be aware, however, that using a POSIX time zone string has some limitations, such as being restricted to a single set of DST transition rules. They generally work ok for near-current time ranges, but not for historical purposes.
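As a rough illustration of the single-timestamp approach in C# (the endpoint URL and the utcOffsetMinutes field below are hypothetical placeholders, not any particular provider's API):

using System;
using System.Net.Http;
using System.Text.Json;

DateTimeOffset utcEvent = DateTimeOffset.UtcNow;   // timestamp obtained via NTP, in UTC

// Hypothetical lookup: returns the current UTC offset (DST already applied)
// for the configured coordinates. Substitute the real provider's URL and fields.
using var http = new HttpClient();
string json = await http.GetStringAsync("https://example.com/timezone?lat=47.6&lon=-122.3");
using JsonDocument doc = JsonDocument.Parse(json);
int offsetMinutes = doc.RootElement.GetProperty("utcOffsetMinutes").GetInt32();

// Apply the offset to produce the local wall-clock time used to tag the event.
DateTimeOffset localEvent = utcEvent.ToOffset(TimeSpan.FromMinutes(offsetMinutes));
Console.WriteLine(localEvent);

The key thing to confirm with whichever service you choose is that the returned offset already has DST folded in; otherwise you'd need the full POSIX rule or IANA zone data as described above.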
As a software tester, I ran into an incident regarding testing on platforms with time travel (the time can be set manually to the past/future according to the requirements of the tests).
So the application time doesn't have to be the same as my local time... or should it be the same?
I found a bug that was caused by an inconsistency between my local time and the app time. Simple description: there are two validations. Validation #1 validates user input on the client side (using the local date), and validation #2 validates user input on the server side (using the server date). Both validations follow business rules specified in the project specification (which does not say whether they should run locally or on the server side). When there is an inconsistency between those dates, it produces unexpected results.
However, the bug was rejected by development on the grounds that my test was wrong and that it's the client's responsibility to synchronize those two dates.
Honestly, I don't see what my local time has to do with application behaviour. There is a lot of functionality and many rules, and for all of those the server time is used as the reference point. But because that client-side validation is done in JavaScript, its reference point is local time (that's the default behaviour; it's not intentional).
So I am just asking for your opinion. Do you think it's a bug, or is it my poor understanding of the importance of local time? How do you usually handle these things in your projects (as a tester or a developer)? This is not just an issue of testing and server time travel; what about client "time travel" (e.g. different time zones)? Do you put any effort into handling these things, or do you just assume that "bad local time = client problem" and that it's not development's problem?
In general, it is really going to depend on your application, what it does, and what is required. However, as a best practice, "application time" is always UTC. As another best practice, never trust client times. There are a ton of reasons an end user's computer's time could be completely off or out of sync.
In almost every application I've worked with, we set the application server to UTC. For end users, we have them set their time zone, which we use to determine the time zone offset. So, for example, if an end user selects "Eastern Timezone", we'd store that setting along with a -5 hour offset (ignoring daylight saving time for brevity). If you aren't storing user settings in a database, you can still get their time zone via a client-side preference screen or automatically via JavaScript. But the key factor is ignoring their time and just getting their time zone/offset. Then you send that time zone over to the server so you can TRANSLATE the time using the server's accurate time. This way you always have the accurate server time, plus a reference to the user's local time that you can use for reporting, logic, or display values.
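A minimal sketch of that translation step, using the time zone id stored from the user's preference screen (the id here is illustrative); letting TimeZoneInfo supply the offset also applies the daylight saving adjustment for you rather than hard-coding -5:

using System;

// Stored from the user's preference screen; the client's clock is never trusted.
string userZoneId = "Eastern Standard Time";
TimeZoneInfo userZone = TimeZoneInfo.FindSystemTimeZoneById(userZoneId);

// Start from the server's accurate UTC time, then translate for display or logic.
DateTime utcNow = DateTime.UtcNow;
DateTime userLocal = TimeZoneInfo.ConvertTimeFromUtc(utcNow, userZone);

Console.WriteLine($"UTC: {utcNow:u}  User local: {userLocal}");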
So to answer your question: a bad local time in most cases needs to be accounted for, and is going to be YOUR problem. End-users aren't going to be checking their clocks to make sure they are accurate, they expect your application to work without having an IT person fix their side.
I have a server in the USA, and I have clients in different parts of the world: Australia, South America, the USA, Canada, and Europe.
So I need to send a notification one hour before each event takes place.
In SQL Server I have a table with the different events, stored in UTC (2015-12-27 20:00:00.0000000), and another table with the time zone that belongs to each event ("Australia/Sydney").
So how could I calculate in a query when to send the notifications? Or would I have to do it with a server-side language?
Could anyone help me with a possible solution?
Thanks
You've asked very broadly, so I can only answer with generalities. If you need a more specific answer, please edit your question to be more specific.
A few things to keep in mind:
Time zone conversions are best done in the application layer. Most server-side application platforms have time zone conversion functions, either natively or via libraries, or both.
If you must convert at the database layer (such as when using SSRS or SSAS, or complex stored procs, etc.) and you are using SQL Server, then there are two approaches to consider:
SQL Server 2016 CTP 3.1 adds native support for time zone conversions via the AT TIME ZONE statement. However, it works with Windows time zone identifiers, such as "AUS Eastern Standard Time", rather than IANA/Olson identifiers, such as the "Australia/Sydney" you specified.
You might use third-party support for time zones, such as my SQL Server Time Zone Support project, which does indeed support IANA/Olson time zone identifiers. There are other similar projects out there as well.
Regardless of whether you convert at the DB layer or at the application layer, the time zone of your server should be considered irrelevant. Always get the current time in UTC rather than local time. Always convert between UTC and a specific time zone. Never rely on the server's local time zone setting to be anything in particular. On many servers, the time zone is intentionally set to UTC, but you should not depend on that.
Nothing in your question indicates how you plan on doing scheduling or notifications, but that is actually the harder part. Specifically, scheduling events into the future should not be based on UTC, but rather on the event's specific time zone (see the sketch after this list). More about this here.
You might consider finding a library for your application layer that will handle most of this for you, such as Quartz (Java) or Quartz.Net (.NET). There are probably similar solutions for other platforms.
You should read the large quantity of material already available on this subject here on Stack Overflow, including the timezone tag wiki and Daylight saving time and time zone best practices.
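To make the scheduling point concrete, here is a minimal application-layer sketch in C#, assuming you treat the event's local wall time plus its time zone as the authoritative schedule; the date is made up, and the Windows id "AUS Eastern Standard Time" is used because it corresponds to the "Australia/Sydney" zone you mentioned:

using System;

// Authoritative schedule: the event's local wall-clock time plus its time zone.
DateTime eventLocal = new DateTime(2015, 12, 28, 7, 0, 0);
TimeZoneInfo zone = TimeZoneInfo.FindSystemTimeZoneById("AUS Eastern Standard Time");

// Convert to UTC at decision time, then back off one hour for the notification.
DateTime eventUtc = TimeZoneInfo.ConvertTimeToUtc(eventLocal, zone);
DateTime notifyAtUtc = eventUtc.AddHours(-1);

bool dueNow = DateTime.UtcNow >= notifyAtUtc;
Console.WriteLine($"Event at {eventUtc:u} UTC; send notification now: {dueNow}");

A recurring job (for example, via one of the scheduling libraries mentioned above) would run a check like this and send whichever notifications have come due.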
I have a problem where I can see that the time is not the same in the application and in the database.
For example, a record that has been created from the UI has the time '10/21/2014 07:49:12 AM', whereas the same record, when queried in the database, has the time '10/21/2014 11:49:12 AM'.
As we can see, there is a time difference of 4 hours.
We have set the UTC parameter to FALSE and restarted the Siebel web server, but we haven't restarted the Siebel server or the load-balancing server.
The Siebel server's default time zone is set at the Application > System Preferences level: "Default Time Zone". Siebel will adjust the values from the DB tables accordingly. In addition, users may specify their own time zone at the Employee level. Check these places to see if something is set.
As you've already noticed, Siebel stores date and time values in UTC. When a user enters a date, it is internally translated to UTC before being saved to the database; when it's retrieved in a business component, it's translated back to the current user's time zone. As Ranjith R said in his answer, each user may specify their own time zone in their profile; otherwise the Default Time Zone system preference applies.
The decision whether or not to use UTC times must be made before deploying the servers. Once the application is set up and running, as appears to be your case, the change from UTC to non-UTC is not supported.
There is a full chapter dedicated to UTC dates and times in the Siebel Deployment Guide on the Siebel Bookshelf. Here is the link for Siebel 8.0. Amongst other things, it states the following:
CAUTION: Once you have already deployed your Siebel applications using UTC and created user date-time data in UTC format, it is not supported to stop using UTC by setting the Universal Time Coordinated system preference to FALSE. Otherwise, incorrect time stamp values may be created and displayed.
Apparently, there is a UTC conversion utility intended to update historical data from non-UTC format to UTC, but it doesn't work the other way around.
Everything I am reading says that IIS uses the local machine time and time zone, and as such, to change the effective time zone and/or time, all I need to do is change the time and time zone for the server, and that will change the effective time and time zone that IIS sees. But I have a server running a simple WCF service for which this just does not seem to be working.
This server is located on Pacific turf in a leased farm, but has had an Eastern time zone configured on it since we first set it up. I have tried resetting everything and even tried bouncing the box, so I am sure it is not cached time values or something so simple. However, upon break-pointing my WCF code to try to understand why it is handing out dates that are off by three hours to all consuming services, I found that IIS is convinced it is in the Pacific time zone, despite everything being configured otherwise.
System.TimeZoneInfo.Local reports that it is in Pacific Time, DateTime.Now gives me a timestamp that is off by three hours, and I cannot seem to figure out how to convince IIS that it needs to use Eastern Time as the effective time zone for the records it is creating and handing out.
In general, server-side code should not depend on the local time zone. Calling either TimeZoneInfo.Local or DateTime.Now from a server application is usually a mistake. See The Case Against DateTime.Now.
The best practice would be to leave your server set to Coordinated Universal Time (UTC), and write your application to manage time zones internally. If you are dependent on Eastern time, then your code should do something like:
// Resolve the US Eastern time zone by its Windows id (DST is handled automatically)
TimeZoneInfo tz = TimeZoneInfo.FindSystemTimeZoneById("Eastern Standard Time");
// Convert the current UTC time to Eastern time instead of relying on the server's local zone
DateTime now = TimeZoneInfo.ConvertTimeFromUtc(DateTime.UtcNow, tz);
That said, if TimeZoneInfo.Local.Id is returning Pacific Standard Time, then there are only two possible explanations:
Your system is indeed set for the Pacific time zone.
Your system was set for the Pacific time zone, but you changed it without restarting or calling TimeZoneInfo.ClearCachedData.
Since you've eliminated both of these explanations in how you described the problem, I can only say that there must be something wrong with how you are setting the time zone.
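If the second explanation is even a possibility, a quick check from code looks like this (minimal sketch):

using System;

// If the system time zone was changed while the process was running, the old zone
// stays cached until this is called (or the process restarts).
TimeZoneInfo.ClearCachedData();
Console.WriteLine(TimeZoneInfo.Local.Id);   // should now match the output of tzutil /g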
Try using tzutil.exe from an elevated (Administrator) command prompt. tzutil /g will give you the current time zone setting. tzutil /s "Eastern Standard Time" will set your time zone for US Eastern Time. Be sure to restart your application after changing the time zone, either by recycling the application pool in the management console, using iisreset, restarting IIS, or (if you must) rebooting the server.
You can also just make the change through the time zone control panel.
If you are saying you've done all of that, and you're getting "Eastern Standard Time" back from tzutil /g, but TimeZoneInfo.Local.Id is returning "Pacific Standard Time" even though you've rebooted, then I call BS. That's just not possible. Perhaps there's a simpler explanation, such as maybe you're deploying to multiple servers and you're setting the time zone on a different server than you're getting the results from.