The time command on the redis-cli returns the current server time (doc here). How can I change the timezone on the redis server?
Note that Redis gives me the time in UTC, whereas date on the Linux terminal shows me UTC+5, which is my correct timezone (Asia/Oral).
The time command is documented as using Unix time, which is UTC, so there's no way to change that. Timezone issues are complicated, so it makes sense for the Redis server not to concern itself with them.
Instead, convert it on the client using the libraries available on your platform.
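For example, here is a minimal sketch using the Python redis-py client (the host, port, and target zone are assumptions; substitute your own client library and time zone):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # Python 3.9+
import redis

r = redis.Redis(host="localhost", port=6379)

# TIME returns (unix_seconds, microseconds); Unix time is always UTC
seconds, microseconds = r.time()
utc_time = datetime.fromtimestamp(seconds, tz=timezone.utc).replace(microsecond=microseconds)

# Do the time zone conversion on the client, not on the server
local_time = utc_time.astimezone(ZoneInfo("Asia/Oral"))
print(local_time.isoformat())
```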
Wow! Tons of posts on converting GMT to local time, including correction for DST. But it seems my need is different.
As the title says, I have a stand-alone embedded system with no O/S. I'm using NTP to get UTC. That is used to tag events with an accurate date/time. I can correct UTC for the current time zone but cannot automatically adjust for DST.
Since there is no O/S, I don't have any of the Windows/Linux data such as time zone. So there is no way to locally adjust for the GMT offset.
It seems the only way for me to do this is to use an http call to find the offset, and the only way I can think of doing this is using the lon/lat or address. It would be possible for me to add lon/lat or address to the configuration so this seems like the only option.
I've seen references to sites which return the GMT offset based on location. Do these sites also automatically adjust for DST? To do that, they would have to use one of the solutions posted in many places in this forum, but that should be easy enough.
Thanks for the advice and help!
Dave
If you only need to convert a specific single timestamp to local time, then yes - you can use services such as those listed here. At least the ones offered by Microsoft and Google do convert a timestamp to the local time in the given time zone, in addition to providing the IANA time zone id.
Additionally, you'll find that the gettimezonebycoordinates function in the Microsoft Azure LBS Time Zone API returns a PosixTz value, such as "PST+8PDT,M3.2.0,M11.1.0". This is ideal for embedded systems, as you can set your TZ environment variable to this value and then many local APIs (such as the C runtime's localtime, among others) will use it in their conversions. This approach works best when you may be converting many different local time values and don't want to make an HTTP call for each one.
Be aware, however, that a POSIX time zone string has some limitations, such as being restricted to a single set of DST transition rules. These strings generally work OK for near-current time ranges, but not for historical purposes.
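To make the TZ idea concrete, here is a minimal sketch of the mechanism in Python, which on Unix-like systems exposes the same C-library behaviour through time.tzset(); on a bare-metal target you would hand the string to whatever localtime-style routine your runtime provides. The TZ string below is just the example value quoted above:

```python
import os
import time

# POSIX TZ string: standard name/offset, DST name, and the two annual transition rules
os.environ["TZ"] = "PST+8PDT,M3.2.0,M11.1.0"
time.tzset()  # re-read TZ; available on Unix-like platforms only

# localtime() now applies both the UTC offset and the DST rule from the string
utc_seconds = time.time()
print(time.strftime("%Y-%m-%d %H:%M:%S %Z", time.localtime(utc_seconds)))
```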
I am using Telegraf as a server to collect StatsD data from Python and send it to InfluxDB. However, the data I am getting on InfluxDB has a different timezone than mine. Where do I have to configure the timezone settings: Telegraf or InfluxDB?
Note: I will use this data with Grafana, in case I have to set something up there too.
Telegraf and InfluxDB both use UTC as the default timezone. As far as I know you can't set another timezone for them. What you want to do is simply use the "Local browser time" option in Grafana.
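If you ever need local times outside of Grafana (say, in an ad-hoc Python script inspecting query results), leave the stored data in UTC and convert only at display time. A minimal sketch, assuming the RFC3339-style UTC timestamps InfluxDB returns (the value below is illustrative):

```python
from datetime import datetime, timezone

# Example RFC3339 timestamp as InfluxDB stores and returns it (value is illustrative)
raw = "2023-05-01T12:34:56.789Z"

utc_time = datetime.strptime(raw, "%Y-%m-%dT%H:%M:%S.%fZ").replace(tzinfo=timezone.utc)
local_time = utc_time.astimezone()  # convert to this machine's local time zone for display
print(local_time.isoformat())
```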
Everything I am reading says that IIS uses the local machine time and timezone, so to change the effective time zone and/or time, all I need to do is change the time and timezone for the server, and that will change the effective time and timezone the IIS server sees. However, I have a server running a simple WCF service for which this just does not seem to be working.
This server is located on Pacific turf in a leased farm, but has had an Eastern timezone configured on it since we first set it up. I have tried resetting everything and even tried bouncing the box, so I am sure it is not cached time values or anything so simple. However, upon break-pointing my WCF code to try to understand why it is handing out dates that are off by three hours to all consuming services, I found that IIS is convinced it is in the Pacific timezone, despite everything being configured otherwise.
System.TimeZoneInfo.Local reports that it is in Pacific Time, and DateTime.Now gives me a timestamp that is off by three hours, and I cannot figure out how to convince IIS that it needs to use Eastern Time as the effective time zone for the records it is creating and handing out.
In general, server-side code should not depend on the local time zone. Calling either TimeZoneInfo.Local or DateTime.Now from a server application is usually a mistake. See The Case Against DateTime.Now.
The best practice would be to leave your server set to Coordinated Universal Time (UTC), and write your application to manage time zones internally. If you are dependent on Eastern time, then your code should do something like:
// Convert the current UTC time to Eastern, without relying on the server's local time zone setting
TimeZoneInfo tz = TimeZoneInfo.FindSystemTimeZoneById("Eastern Standard Time");
DateTime now = TimeZoneInfo.ConvertTimeFromUtc(DateTime.UtcNow, tz);
That said, if TimeZoneInfo.Local.Id is returning Pacific Standard Time, then there are only two possible explanations:
Your system is indeed set for the Pacific time zone.
Your system was set for the Pacific time zone, but you changed it without restarting or calling TimeZoneInfo.ClearCachedData.
Since you've eliminated both of these explanations in how you described the problem, I can only say that there must be something wrong with how you are setting the time zone.
Try using tzutil.exe from an elevated (Administrator) command prompt. tzutil /g will give you the current time zone setting. tzutil /s "Eastern Standard Time" will set your time zone to US Eastern Time. Be sure to restart your application after changing the time zone, either by recycling the application pool in the management console, using iisreset, restarting IIS, or (if you must) rebooting the server.
You can also just make the change through the time zone control panel.
If you are saying you've done all of that, and you're getting "Eastern Standard Time" back from tzutil /g, but TimeZoneInfo.Local.Id is returning "Pacific Standard Time" even though you've rebooted, then I call BS. That's just not possible. Perhaps there's a simpler explanation, such as maybe you're deploying to multiple servers and you're setting the time zone on a different server than you're getting the results from.
Windows Server has a registry key HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\TimeZoneInformation where the field ActiveTimeBias holds the offset in minutes from GMT for the machine that you are running on.
We have a web application that needs to present a user with their local time on an HTML page that we generate. We do this by having them set a preference for their timezone and then fetching the above value so that we can compare server time and client time and do the correct calculation.
On servers that we have built, this works great. We are deploying our application into a tier 1 cloud provider who is providing a Windows Server AMI that we configure and use. What we have found is that when you use the clock control panel on a local server, the registry entries for TimeZoneInformation are correct. When we do the same on this virtual machine, the entries are correct with the exception of ActiveTimeBias.
Microsoft cautions, in their usual fashion, against tinkering with the individual value directly.
Question for the community: has anyone else encountered this problem, and if so, how did you fix it?
One usually doesn't program directly against these registry keys. For example, in .Net, we have the TimeZoneInfo class. It uses the same data, but in a much more developer-friendly way.
Regarding your particular question, the ActiveTimeBias key holds the current offset from UTC for the time zone that your server is set to. If your server is in a time zone that follows daylight saving rules, then this will change twice a year. If you manually update your time zone and refresh the registry, you will see it change.
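Purely to illustrate what that value means (and not as a recommendation to program against the key directly), here is a minimal Python sketch that reads ActiveTimeBias and applies it; the key path is the one quoted in the question, and Windows stores the value as an unsigned DWORD:

```python
import winreg
from datetime import datetime, timedelta, timezone

KEY_PATH = r"SYSTEM\CurrentControlSet\Control\TimeZoneInformation"
with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
    bias, _ = winreg.QueryValueEx(key, "ActiveTimeBias")

# The registry stores the value as an unsigned DWORD; zones east of UTC have a
# negative bias, which shows up wrapped around, so map it back to a signed value.
if bias >= 0x80000000:
    bias -= 0x100000000

# Windows defines UTC = local time + bias (minutes), so local wall-clock = UTC - bias
utc_now = datetime.now(timezone.utc).replace(tzinfo=None)
local_wall_clock = utc_now - timedelta(minutes=bias)
print(f"ActiveTimeBias: {bias} minutes, local wall-clock time: {local_wall_clock}")
```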
The best advice I can offer is that since the timezone of the server is likely to be unimportant to anyone, you should set it to UTC.
See also: Daylight saving time and time zone best practices
I have a very limited environment which has no standard Linux/Unix date/time API and has to synchronize its clock against an NTP server. I can communicate via UDP and get an NTP reply. Now I need to convert the NTP timestamp to the device date/time completely by myself.
I've found several NTP implementations (qntp, C#, Java) but they all use system-wide functions to convert the NTP timestamp to the corresponding DateTime representation.
Any advice on how to do the conversion, or any links where I can read about it, will be very appreciated.
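For anyone landing here later: the arithmetic is small enough to carry out by hand. The NTP timestamp is seconds since 1900-01-01 00:00 UTC plus a 32-bit binary fraction, so subtracting 2208988800 gives Unix seconds, and the calendar date then follows from integer arithmetic (the days-to-date step below is the well-known civil-from-days algorithm). A minimal sketch in Python, purely to show the math to port to your device; the names are illustrative:

```python
NTP_TO_UNIX = 2208988800  # seconds between 1900-01-01 and 1970-01-01 (both UTC)

def ntp_to_utc(ntp_seconds, ntp_fraction):
    """Convert a 64-bit NTP timestamp (seconds, 2^-32 fraction) to a UTC date/time tuple."""
    unix_seconds = ntp_seconds - NTP_TO_UNIX
    microseconds = (ntp_fraction * 1_000_000) >> 32

    days, rem = divmod(unix_seconds, 86400)
    hour, rem = divmod(rem, 3600)
    minute, second = divmod(rem, 60)

    # Days-since-1970 to year/month/day (civil-from-days algorithm)
    days += 719468                                   # shift epoch to 0000-03-01
    era = days // 146097                             # 400-year eras
    doe = days - era * 146097                        # day of era [0, 146096]
    yoe = (doe - doe // 1460 + doe // 36524 - doe // 146096) // 365
    year = yoe + era * 400
    doy = doe - (365 * yoe + yoe // 4 - yoe // 100)  # day of year, March-based
    mp = (5 * doy + 2) // 153                        # March-based month [0, 11]
    day = doy - (153 * mp + 2) // 5 + 1
    month = mp + 3 if mp < 10 else mp - 9
    if month <= 2:
        year += 1
    return year, month, day, hour, minute, second, microseconds
```

As a quick sanity check, ntp_to_utc(2208988800, 0) returns (1970, 1, 1, 0, 0, 0, 0), i.e. the Unix epoch.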