How to convert every time zone to GMT+5:30? - objective-c

I am creating an application with a countdown timer, and I save the time into a database by converting it into a floating-point value (double). Suppose I start the timer at 2:00 PM and set the countdown for 3 hours. After 1 hour I quit the application, and on relaunch I read the stored time from the database. By computing the difference between the stored time and the current time, I restore the timer accordingly. But what if I quit the application and change the time zone to something else?
How do I handle this, or how do I pin a specific time zone so that the timer always works correctly even if the user changes the time zone?

I don't know the iPhone SDK, but the traditional way of doing this is to work with the time in UTC, often expressed as "milliseconds since January 1st, 1970" or something similar.
That way the time zone of the device is irrelevant.
EDIT: Looking at the docs for NSDate, I think timeIntervalSince1970 is what you want. That should always be in terms of UTC.
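The epoch-based idea is language-agnostic, so here is a minimal sketch of it in Python rather than Objective-C (the function names are hypothetical): persist the countdown's end instant as seconds since the Unix epoch, and the device's time zone never enters the calculation.

```python
import time

COUNTDOWN_SECONDS = 3 * 60 * 60  # a 3-hour countdown, as in the question

def start_timer():
    """Return the countdown's end instant as seconds since the Unix epoch.

    time.time() is UTC-based, so this is the value you would persist to
    the database; it does not depend on the device's time zone setting.
    """
    return time.time() + COUNTDOWN_SECONDS

def remaining(stored_end_epoch):
    """Seconds left on the countdown, clamped at zero.

    Subtracting two epoch values stays correct even if the user changed
    the time zone between quitting and relaunching the app.
    """
    return max(0.0, stored_end_epoch - time.time())
```

In the iPhone SDK, the equivalent of time.time() here is [[NSDate date] timeIntervalSince1970], as noted above.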

Related

UTC and daylight savings - job schedule

For daylight saving time, will an ADFV2 schedule update itself, or do we need to go in there and update it when the time comes? I'm guessing the latter because UTC doesn't change, but I don't want to guess.
ADF won’t handle daylight saving time automatically.
Here's a pair of posts that use an Azure Function and a Web activity to make it work; maybe you could take a look:
http://www.aussierobsql.com/azure-data-factory-v2-handling-daylight-savings-p1/
http://www.aussierobsql.com/azure-data-factory-v2-handling-daylight-savings-p2/

Time Difference in Siebel Application and Database

I have a problem where the time is not the same in the application and in the database.
For example, a record created from the UI has the time '10/21/2014 07:49:12 AM', whereas the same record, when queried in the database, has the time '10/21/2014 11:49:12 AM'.
As you can see, there is a time difference of 4 hours.
We have set the UTC parameter to FALSE and restarted the Siebel web server, but we haven't restarted the Siebel server and the load-balancing server.
Siebel server's default timezone is set at Application->System Preference level: "Default Time Zone". Siebel will adjust the values from DB tables accordingly. In addition, users may specify their own timezone at Employee level. Check these places to see if something is set.
As you've already noticed, Siebel stores date and time values in UTC. When a user enters a date, it's internally translated to UTC before being saved to the database; when it's retrieved in a business component, it's translated back to the current user's timezone. As Ranjith R said in his answer, each user may specify their own timezone in their profile; otherwise the Default Time Zone system preference applies.
The decision whether to use UTC times or not must be made before deploying the servers. Once the application is set up and running, as appears to be the case here, changing from UTC to non-UTC is not supported.
There is a full chapter dedicated to UTC dates and times in the Siebel Deployment Guide of the Siebel Bookshelf. Here is the link for Siebel 8.0. Amongst other things, they state the following:
CAUTION: Once you have already deployed your Siebel applications using UTC and created user date-time data in UTC format, it is not supported to stop using UTC by setting the Universal Time Coordinated system preference to FALSE. Otherwise, incorrect time stamp values may be created and displayed.
Apparently, there is a UTC conversion utility intended to update historical data from non-UTC format to UTC, but it doesn't work the other way around.

IIS Reporting wrong Time and TimeZone

Everything I am reading says that IIS uses the local machine's time and time zone, and that to change the effective time zone and/or time, all I need to do is change the time and time zone for the server; IIS should then see the change. But I have a server running a simple WCF service for which this just does not seem to work.
The server is physically located on Pacific turf in a leased farm, but has had an Eastern time zone configured on it since we first set it up. I have tried resetting everything and even tried bouncing the box, so I am sure it is not cached time values or anything so simple. However, upon break-pointing my WCF code to try to understand why it is handing out dates that are off by three hours to all consuming services, I found that IIS is convinced it is in the Pacific time zone, despite everything being configured otherwise.
System.TimeZoneInfo.Local reports that it is in Pacific Time, DateTime.Now gives me a timestamp that is off by three hours, and I cannot seem to figure out how to convince IIS that it needs to use Eastern Time as the effective time zone for the records it is creating and handing out.
In general, server-side code should not depend on the local time zone. Calling either TimeZoneInfo.Local or DateTime.Now from a server application is usually a mistake. See The Case Against DateTime.Now.
The best practice would be to leave your server set to Coordinated Universal Time (UTC), and write your application to manage time zones internally. If you are dependent on Eastern time, then your code should do something like:
TimeZoneInfo tz = TimeZoneInfo.FindSystemTimeZoneById("Eastern Standard Time");
DateTime now = TimeZoneInfo.ConvertTimeFromUtc(DateTime.UtcNow, tz);
That said, if TimeZoneInfo.Local.Id is returning Pacific Standard Time, then there are only two possible explanations:
Your system is indeed set for the Pacific time zone.
Your system was set for the Pacific time zone, but you changed it without restarting or calling TimeZoneInfo.ClearCachedData.
Since you've eliminated both of these explanations in how you described the problem, I can only say that there must be something wrong with how you are setting the time zone.
Try using tzutil.exe on an Administrator elevated command prompt. tzutil /g will give you the current time zone setting. tzutil /s "Eastern Standard Time" will set your time zone for US Eastern Time. Be sure to restart your application after changing the time zone, either by recycling the application pool in the management console, using iisreset, restarting IIS, or (if you must) rebooting the server.
You can also just make the change through the time zone control panel.
If you are saying you've done all of that, and you're getting "Eastern Standard Time" back from tzutil /g, but TimeZoneInfo.Local.Id is returning "Pacific Standard Time" even though you've rebooted, then I call BS. That's just not possible. Perhaps there's a simpler explanation, such as maybe you're deploying to multiple servers and you're setting the time zone on a different server than you're getting the results from.

How to know the TimeZone StandardName or DayLightName from TimeZoneOffset in Sql Server

I am using SQL Server 2008 R2. Is there a way to identify the time zone Standard name or Daylight name from a time zone offset?
For example, I have "2013-09-26 03:00:00.0000000 -04:00" and need
"Eastern Daylight Time" from the above.
How can I accomplish this in SQL Server?
Any suggestions will be appreciated.
If you're talking about getting the local system's time zone, I've investigated that heavily in the past, and it isn't possible without resorting to either "unsafe" SQL CLR calls, or unsupported xp_regread calls to read this out of the registry. The general recommendation is to do that in application code, and not in the database.
However, if what you are saying is that you have an offset of -04:00 in your input value and you want to translate that to a time zone name, I'm afraid that isn't possible at all, neither in SQL nor in your application code.
The reason is that there are many time zones that share the same offset. For example, see this Wikipedia page that shows all of the zones that use -04:00.
Even if you limit the scope to just the United States, it still won't work in certain cases due to daylight saving time. For example, consider the time stamp of 2013-11-03T01:00:00-05:00. Is this Eastern Standard Time? Or Central Daylight Time? There's no way to tell, because at this moment it could be either one.
In the USA (unlike Europe), each time zone transitions at 2 AM in its own local time, so the change is like a wave that moves from east to west. While the US time zones are normally spaced one hour apart, during the spring-forward transition they can be two hours apart, and during the fall-back transition they can share the same exact local time.
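To see the ambiguity concretely, here is a small demonstration (using Python's standard zoneinfo module rather than SQL Server, purely as an illustration) that the single instant 2013-11-03T01:00:00-05:00 is Eastern Standard Time in New York but Central Daylight Time in Chicago:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

# 2013-11-03T01:00:00-05:00 expressed as an unambiguous UTC instant
instant = datetime(2013, 11, 3, 6, 0, tzinfo=timezone.utc)

for zone in ("America/New_York", "America/Chicago"):
    local = instant.astimezone(ZoneInfo(zone))
    # Both zones report a local time of 01:00:00-05:00,
    # but one calls it EST and the other CDT
    print(zone, local.isoformat(), local.tzname())
```

New York has already fallen back to EST (UTC-5) at this instant, while Chicago is still on CDT (also UTC-5) for another hour, so the offset alone cannot tell you which zone produced the timestamp.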

I'm writing software that will be tracking files and if they are changed, should I be using UTC

I'm writing some software to track if a file has been checked out and changed. The file could be checked out and changed by various people in several different time zones.
So, simply put I would be doing something like this:
if ( checked_out_file.last_modified_date > my_database_record_of_the_last_modified_date ) {
// file has changed since last sync so do something
}
The above is just pseudo-code, so don't get hung up on the language and things like that. What I am basically wondering is: should I store my_database_record_of_the_last_modified_date as UTC time, and when I do the checked_out_file.last_modified_date comparison, as illustrated in my pseudo-code above, should I also use UTC time for that?
Time is a pain to work with. Time zones are even worse. There's not much you can do about that. But if you use UTC everywhere, you can make time a little more bearable.
Store all your dates in UTC. Do all your date comparisons in UTC. Do everything you possibly can in UTC. Ideally, the only time a date won't be in UTC is when it's being formatted for display to a user. Time zones are a rabbit hole you do not want to go down.
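To make that advice concrete, here is a minimal sketch in Python (the has_changed helper is hypothetical, standing in for the pseudo-code comparison above). File modification times from the OS are already epoch values tied to UTC, so comparing them directly sidesteps time zones entirely:

```python
import os

def has_changed(path, stored_mtime):
    """True if the file was modified after the stored timestamp.

    os.path.getmtime() returns seconds since the Unix epoch, which is
    anchored to UTC, so this comparison gives the same answer no matter
    which time zone the file was edited in or which zone this machine
    happens to be set to.
    """
    return os.path.getmtime(path) > stored_mtime
```

The key point is to persist stored_mtime as an epoch/UTC value as well; never store a local wall-clock string, or the comparison breaks the first time two machines disagree about the zone.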
Hope this helps!
PS: Yes, I've had bad experiences with time zones in the past. Could you tell?