I would like some clarification on tzs for the Jawbone Moves endpoint: https://jawbone.com/up/developer/endpoints/moves. Is this key going to be present on all response items? If not, which types of records will have it and which won't? Additionally, the docs indicate it will be an array of arrays with the following format:
"tzs": [
[1384963500, "America/Phoenix"],
[1385055720, "America/Los_Angeles"]
]
However, I am getting responses that look like the following:
"tzs": [[1468410383, -14400]]
Is the second value an offset, presumably in seconds?
The tzs key will appear in responses from the moves endpoint that provide data for a given day's move. It will always be present, but it will only contain more than one entry if the user changes timezones during the given time period (e.g., the user is travelling).
Here's the explanation from the documentation:
Each entry in the list contains a unix timestamp and a timezone. In most instances the timezone entry is a string containing the Olson timezone.
When the timezone entry is just a number, then you are correct: it's the GMT offset in seconds, so -14400 corresponds to US/Eastern (daylight time).
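If you need to handle both shapes, here is a minimal Python sketch (assuming the standard-library zoneinfo module, Python 3.9+) that normalizes either form into a tzinfo:

from datetime import datetime, timedelta, timezone
from zoneinfo import ZoneInfo  # Python 3.9+

def entry_to_datetime(entry):
    # Each tzs entry is [unix_timestamp, tz], where tz is either an Olson
    # name ("America/Phoenix") or a GMT offset in seconds (e.g. -14400).
    ts, tz = entry
    tzinfo = ZoneInfo(tz) if isinstance(tz, str) else timezone(timedelta(seconds=tz))
    return datetime.fromtimestamp(ts, tzinfo)

# Both forms seen in the responses above:
print(entry_to_datetime([1384963500, "America/Phoenix"]))  # UTC-7 (no DST in Phoenix)
print(entry_to_datetime([1468410383, -14400]))             # fixed UTC-4 offset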
I have indexed data in Splunk, but the _time (which I assumed to be the indexed time) is showing up wrong.
I indexed this data on 19th Oct, but it shows as indexed on 18th Oct.
Please suggest what the solution would be, or do I need to manually overwrite the _time key with the current date/time?
Thanks
_time is not the time the event was indexed - that's _indextime. _time is the time the event happened, which is usually different from when it was indexed (because of transport/processing delays).
From your screenshot, I see that what I presume is the event time (the 'date' field) differs from _time. That often happens when the time zone is incorrect or not interpreted correctly. Were that the case here, however, I would expect the difference between date and _time to be a multiple of 30 minutes.
From what I see in the question, it's possible the props.conf settings are causing Splunk to interpret the wrong field as _time. Closer inspection shows the sourcetype ends with "too_small". This is an indication that Splunk does not have specific settings for the sourcetype so it's trying to guess at where the timestamp is (and getting it wrong, obviously).
The solution is to create a props.conf stanza for the sourcetype. It should be something like this:
[json]
TIME_PREFIX = date:
TIME_FORMAT = %Y-%m-%dT%H:%M:%S.%3N%Z
MAX_TIMESTAMP_LOOKAHEAD = 26
SHOULD_LINEMERGE = false
LINE_BREAKER = ([\r\n]+)
TRUNCATE = 10000
Put these settings on your indexer and restart it. Events that arrive after that should have the right time on them.
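To verify the fix, you can compare the parsed event time against the index time directly in a search; this is just a sketch, and it assumes the stanza name above matches your sourcetype:

sourcetype=json
| eval indexed_at=strftime(_indextime, "%Y-%m-%d %H:%M:%S")
| table _time indexed_at date

If _time now lines up with the date field while indexed_at reflects when the event arrived, the stanza is working.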
My client needs to know how long it will take to ship their time-sensitive product via FedEx and USPS. I'm using the GetRates function of DotNetShipping, but the Commitment Date is coming back as null and being set to 30 days by DotNetShipping, which isn't helpful. Are there particular parameters for the USPS Web Tools API that have to be passed in, in order to get a Commitment Date? I know that when I call the USPS API directly, with the following URL, I do get a Commitment Date in the return data.
http://production.shippingapis.com/ShippingAPI.dll?API=RateV4&XML=<RateV4Request USERID="[USPSUSERID]"><Revision>2</Revision><Package ID="2ND"><Service>PRIORITY</Service><ZipOrigination>44106</ZipOrigination><ZipDestination>20770</ZipDestination><Pounds>1</Pounds><Ounces>0</Ounces><Container>RECTANGULAR</Container><Size>LARGE</Size><Width>11</Width><Length>13</Length><Height>11</Height><Girth>55</Girth><Value>1000</Value><SpecialServices><SpecialService>1</SpecialService></SpecialServices></Package></RateV4Request>
The above URL won't work without replacing [USPSUSERID] with a valid user ID.
I had to modify DotNetShipping to pass in Value and SpecialServices -> SpecialService and remove Machinable in order to get the CommitmentDate returned.
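For anyone who wants to sanity-check the raw API before patching the library, here is a minimal Python sketch of that same RateV4 call; the XML is the request from the question, and where exactly CommitmentDate sits in the response is an assumption based on the behavior described above:

import urllib.parse
import urllib.request
import xml.etree.ElementTree as ET

# The RateV4 request from above; [USPSUSERID] must be replaced with a valid user ID.
xml_request = """<RateV4Request USERID="[USPSUSERID]"><Revision>2</Revision>
<Package ID="2ND"><Service>PRIORITY</Service><ZipOrigination>44106</ZipOrigination>
<ZipDestination>20770</ZipDestination><Pounds>1</Pounds><Ounces>0</Ounces>
<Container>RECTANGULAR</Container><Size>LARGE</Size><Width>11</Width>
<Length>13</Length><Height>11</Height><Girth>55</Girth><Value>1000</Value>
<SpecialServices><SpecialService>1</SpecialService></SpecialServices>
</Package></RateV4Request>"""

url = ("http://production.shippingapis.com/ShippingAPI.dll?API=RateV4&XML="
       + urllib.parse.quote(xml_request))
with urllib.request.urlopen(url) as resp:
    root = ET.fromstring(resp.read())

for package in root.iter("Package"):
    # Assumption: CommitmentDate appears as a descendant of Package when the
    # request includes Revision 2, Value, and SpecialServices.
    print(package.get("ID"), package.findtext(".//CommitmentDate"))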
Are Heart Points available in the REST API for reading? If so, how do we get to them? I'm not seeing them in the documentation. Thanks.
Eric
You should use the Users.dataSources.datasets API endpoint. You can grab the heart points, merged from all sources, by querying the dataSourceId "derived:com.google.heart_minutes:com.google.android.gms:merge_heart_minutes". It returns a JSON object with an array called "point". You'll find each heart point in that list, and if you drill down further into each one you'll get the derived source.
The endpoint takes the form:
https://www.googleapis.com/fitness/v1/users/me/dataSources/dataSourceId/datasets/datasetId
Replace the following in the URL above:
dataSourceId: derived:com.google.heart_minutes:com.google.android.gms:merge_heart_minutes
datasetId: The ID is formatted as "startTime-endTime", where startTime and endTime are 64-bit integers (epoch time in nanoseconds).
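Putting that together, a minimal Python sketch of the request; the access token is a placeholder (you still need to obtain one via OAuth with a Fitness scope), and the dataset ID uses the example values from the answer below:

import urllib.request

DATA_SOURCE = "derived:com.google.heart_minutes:com.google.android.gms:merge_heart_minutes"
DATASET_ID = "1607904000000000000-1608057778000000000"  # startNanos-endNanos

url = (f"https://www.googleapis.com/fitness/v1/users/me/dataSources/"
       f"{DATA_SOURCE}/datasets/{DATASET_ID}")
req = urllib.request.Request(url, headers={"Authorization": "Bearer ACCESS_TOKEN"})
with urllib.request.urlopen(req) as resp:
    print(resp.read().decode())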
Expanding on WiteCastle's answer, this datasource will provide you with the heart points.
"derived:com.google.heart_minutes:com.google.android.gms:merge_heart_minutes"
You will need to specify a timeframe via the datasetId parameter, which is a start time and an end time in epoch time with nanoseconds, e.g.:
1607904000000000000-1608057778000000000
The JSON response includes an array of points, essentially one for each time the sensor detected the user's activity. The heart points are accessible within each point's "fpVal". An example point is below:
{
  "startTimeNanos": "1607970900000000000",
  "endTimeNanos": "1607970960000000000",
  "dataTypeName": "com.google.heart_minutes",
  "originDataSourceId": "derived:com.google.heart_rate.bpm:com.google.android.gms:merge_heart_rate_bpm",
  "value": [
    {
      "fpVal": 2,  <--- 2 heart points recorded during this activity
      "mapVal": []
    }
  ],
  "modifiedTimeMillis": "1607976569329"
},
To get the heart points for today, specify the timeframe (00:00-23:59 in epoch format), then loop through each point, adding up all the "fpVal" values.
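A small Python sketch of that loop, assuming you already have the dataset response as JSON and that the top-level array is named "point", as in the response format above:

import json
from datetime import datetime, time, timedelta, timezone

def today_dataset_id(tz=timezone.utc):
    # Build the "startNanos-endNanos" dataset ID for today's 00:00-23:59 window.
    now = datetime.now(tz)
    start = datetime.combine(now.date(), time.min, tz)
    end = start + timedelta(hours=23, minutes=59, seconds=59)
    ns = lambda dt: int(dt.timestamp()) * 1_000_000_000
    return f"{ns(start)}-{ns(end)}"

def total_heart_points(dataset_json):
    # Sum every fpVal across all data points in the dataset response.
    data = json.loads(dataset_json)
    return sum(value.get("fpVal", 0)
               for point in data.get("point", [])
               for value in point.get("value", []))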
When searching for items in the inbox that were received after a particular timestamp (as in the code below), the search also returns the email with the specified timestamp. I want only the emails after the specified timestamp.
SearchFilter greaterthanfilter = new SearchFilter.IsGreaterThan(ItemSchema.DateTimeReceived,
Convert.ToDateTime(lastUploadedEmailtimeStamp));
mailItems = inbox.FindItems(greaterthanfilter, view);
Has anyone faced a similar issue? Basically, I want to search for items that were received after a particular mm/dd/yyyy hh:mm:ss.
Exchange stores datetimes with a precision down to the millisecond. EWS only gives you second precision on datetimes; the SearchFilters, however, do have millisecond precision. So if the datetime stamps you're using only have a precision of seconds, you need to use something like the following, e.g. where you want all email received after 7:43 and 8 seconds:
SearchFilter sfs = new SearchFilter.IsGreaterThan(ItemSchema.DateTimeReceived, DateTime.ParseExact("2014/12/29 07:43:08.999", "yyyy/MM/dd HH:mm:ss.fff", null));
FindItemsResults<Item> femaa = service.FindItems(WellKnownFolderName.Inbox,sfs, iItemView);
If you want to look at the actual precision of your messages, you need to use a MAPI editor like OutlookSpy or MFCMAPI. You can then look at the PT_SYSTIME values, which are FILETIME: "8 bytes; a 64-bit integer representing the number of 100-nanosecond intervals since January 1, 1601" - see http://msdn.microsoft.com/en-us/library/ee157583(v=EXCHG.80).aspx
Cheers
Glen
I've got a requirement to check whether some objects were modified since the last logon of the current user. There is a table USR02 that contains the last logon date, but it is updated at the moment of logon, so here "last" effectively means "current".
For example, I logged in on 2014.11.21 and then on 2014.11.26, so the date range I want to get is 21…26; but when I enter the system, the date 2014.11.21 in USR02 is overwritten with 2014.11.26.
Of course, I could follow the Z-way and create my own table containing the user name and the previous login date, but maybe there is a standard way to achieve this?
I noticed that you can see the current as well as the last logon date and time in the dialog you can open with System --> Status. I went through the code of the function pool SHSY that contains this dialog and found the following implementation:
DATA: BEGIN OF last_logon,
        date     LIKE sy-datum,
        time     LIKE sy-uzeit,
        date_now LIKE sy-datum,
        time_now LIKE sy-uzeit,
      END OF last_logon.
* ...
* Date and time of the current and last logon
GET PARAMETER ID 'US2' FIELD last_logon.
Certainly not the standard API one would expect, but apparently it's all there is...
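For what it's worth, here is a minimal sketch of reusing that parameter in your own report; the assumption (based on the field names above, not verified) is that date/time hold the previous logon and date_now/time_now the current one:

DATA: BEGIN OF last_logon,
        date     LIKE sy-datum,
        time     LIKE sy-uzeit,
        date_now LIKE sy-datum,
        time_now LIKE sy-uzeit,
      END OF last_logon.

* Fills the structure from the user session parameter 'US2'.
GET PARAMETER ID 'US2' FIELD last_logon.

* Assumption: date/time = previous logon, date_now/time_now = current logon,
* so objects changed in between could be selected with a range like:
*   ... WHERE udate BETWEEN last_logon-date AND last_logon-date_now.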