Salesforce REST API: get a list of updated records

I want to get a list of records under the Account object that were updated between a particular start time and end time.
For example, this is my API query:
https://ax1.salesforce.com/services/data/v29.0/sobjects/Account/updated/?start=2015-06-30T06%3A49%3A00%2B00%3A00&end=2015-06-30T16%3A30%3A26%2B00%3A00
In my Salesforce org the timezone I have chosen is Indian time, which is UTC+5:30.
I created an account at 16:45 on 30th June, Indian time (as per Salesforce, this is the time shown in the Created By field of the account).
In the query above the start time and end time are 06:49 and 16:30 respectively,
yet I got the ID of the record I added at 16:45 Indian time, even though it should not appear in the response.
The following is the response:
"ids": [
"0019000001QeOINAA3"
],
"latestDateCovered": "2015-06-30T09:00:00.000+0000"
}
Also, latestDateCovered says 09:00 only.
I don't understand how this works. Could somebody explain it to me?

The REST API works with UTC DateTime values.
So you searched for records between:
2015-06-30 06:49:00 UTC
2015-06-30 16:30:26 UTC
Which I make to be:
2015-06-30 12:19:00 IST
2015-06-30 22:00:26 IST
So it makes sense that a record created at 16:45 IST on the 30th of June would appear in the results.
Try checking the SystemModstamp and LastModifiedDate fields on the record in question via the API; those values will also be in UTC.
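To make the conversion concrete, here is a minimal Python sketch that converts an IST start/end window to UTC and calls the same getUpdated endpoint. The instance URL and API version are taken from the question's URL, the access token is a placeholder, and the updated_account_ids helper is purely illustrative:

from datetime import datetime
from zoneinfo import ZoneInfo
from urllib.parse import urlencode
import requests

INSTANCE_URL = "https://ax1.salesforce.com"   # instance from the question's URL
ACCESS_TOKEN = "<your OAuth access token>"    # placeholder

def updated_account_ids(start_local, end_local, tz="Asia/Kolkata"):
    # Convert local (IST) datetimes to UTC before building the getUpdated request.
    utc = ZoneInfo("UTC")
    start_utc = start_local.replace(tzinfo=ZoneInfo(tz)).astimezone(utc)
    end_utc = end_local.replace(tzinfo=ZoneInfo(tz)).astimezone(utc)
    params = urlencode({
        "start": start_utc.strftime("%Y-%m-%dT%H:%M:%S+00:00"),
        "end": end_utc.strftime("%Y-%m-%dT%H:%M:%S+00:00"),
    })
    url = f"{INSTANCE_URL}/services/data/v29.0/sobjects/Account/updated/?{params}"
    resp = requests.get(url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
    resp.raise_for_status()
    return resp.json()["ids"]

# 12:19-22:00 IST on 30 June 2015 is the 06:49-16:30 UTC window from the question.
print(updated_account_ids(datetime(2015, 6, 30, 12, 19),
                          datetime(2015, 6, 30, 22, 0, 26)))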

Related

Create additional date, time and timezone columns based on the date and state columns in a Python dataframe

I have 3 columns called 'customer_state', 'call_date' and 'call_time' in my dataframe, and I want to create 3 new columns: 'customer_timezone', 'customer_date' and 'customer_time'.
The possible values for the timezone are:
Eastern Standard Time (EST)
Central Standard Time (CST)
Mountain Standard Time (MST)
Pacific Standard Time (PST)
Note: call_time is in Mountain Standard Time and in 24-hour format.
My dataframe looks like below:
call_date    call_time  customer_state
2019-11-01   13:46      MD
My resultant dataframe should look like this:
call_date    call_time  customer_state  customer_timezone  customer_date  customer_time
2019-11-01   13:46      MD              EST                2019-11-01     16:46
Any help is appreciated!
Additional note: to simplify this solution, my data only has call_time values between 6 am and 4 pm MST, so I don't have to worry about the date changing (for instance, if it were 9 pm in MST, it would be 12 am the next day in EST). I do not have to worry about these edge cases; in fact, call_date and customer_date will always be the same in my scenario. I just need to add +3 hours to the time.
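A minimal pandas sketch of one way to do this: map customer_state to a timezone label, then shift call_time by that timezone's offset relative to MST. The state_to_tz and tz_offset_hours lookups below are illustrative placeholders (only MD/EST is filled in, using the +3 hours from the example above) and would need to be extended for real data:

import pandas as pd

df = pd.DataFrame({
    "call_date": ["2019-11-01"],
    "call_time": ["13:46"],
    "customer_state": ["MD"],
})

# Illustrative lookups: state -> timezone label, timezone -> hour offset from MST.
# Extend both dictionaries for the other states and zones in your data.
state_to_tz = {"MD": "EST"}
tz_offset_hours = {"EST": 3}

df["customer_timezone"] = df["customer_state"].map(state_to_tz)

# Combine date and time, shift by the offset, then split back into date and time columns.
call_dt = pd.to_datetime(df["call_date"] + " " + df["call_time"])
shifted = call_dt + pd.to_timedelta(df["customer_timezone"].map(tz_offset_hours), unit="h")
df["customer_date"] = shifted.dt.strftime("%Y-%m-%d")
df["customer_time"] = shifted.dt.strftime("%H:%M")

print(df)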

Interval Based Report in SQL Server

I need a First Login and Last Logout report for CC agents. I have information on every login and logout they perform during their shift (i.e. logout for lunch/smoke/break time etc.).
We have the following shifts:
S.No  Shift    Possible Login  Possible Logout
1     08 – 04  08:07           16:05
2     10 – 06  10:03           18:09
3     04 – 00  16:08           00:02
4     06 – 02  18:04           02:01
5     00 – 08  23:57           08:04
I have a view that collects the relevant information.
The problem with this report is that if I generate a single-day report for 20/06/2016, I am not able to capture information for shifts 3, 4 and 5, because there is a day change.
For Example:
Agent Login Date/Time is: 20/06/2016 18:10
And
Agent Logout Date/Time is: 21/06/2016 02:05
I need something like an Interval column, where the day starts at 20/06/2016 03:00 and ends at 21/06/2016 03:00. How can I achieve this interval? Or do you have any other idea for this report requirement?
If I understood you correctly, based on the view you've shown there, you need to provide data for all the people who were working on, say, 20/06/2016? If so, I believe you could select from the view all the people whose
CAST(FirstLogin AS date) = '20160620'
If you needed to also include the people who started working on 19/06/2016 and ended on 20/06/2016 then you would do the same for LastLogOut.
In addition, if you were to write it as a dynamic query, if you needed to provide a report for yesterday for example, you'd substitute the 20/06/2016 from the where clause with
CAST(DATEADD(day, -1, GETDATE()) AS date)
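For the 03:00-to-03:00 "day" asked about in the question, a common trick is to shift every timestamp back by three hours before taking its date, so that everything from 03:00 up to the next day's 03:00 lands in the same report day; in T-SQL that would be something along the lines of CAST(DATEADD(hour, -3, <your datetime column>) AS date). A small Python sketch of the same bucketing logic, using the login/logout times from the question:

from datetime import datetime, timedelta

def report_day(ts):
    # Shift back three hours so 03:00 today .. 02:59 tomorrow map to the same date.
    return (ts - timedelta(hours=3)).date()

login = datetime(2016, 6, 20, 18, 10)    # agent login from the question
logout = datetime(2016, 6, 21, 2, 5)     # agent logout from the question

print(report_day(login))   # 2016-06-20
print(report_day(logout))  # 2016-06-20 -- both fall inside the same 03:00-03:00 interval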

ActiveRecord Rails convert created_at timestamp to required timezone in where query

Situation:
I am working on a Rails application with UTC as the application timezone, so data for any user, coming from any zone, is saved in UTC in the database.
Let's suppose one of the users, from Central Time (US & Canada) (a difference of 5 hours from UTC), signs up on my website on 28th July 2014 at 6:45 pm. Thus the created_at for the user in the database will be Mon, 28 Jul 2014 23:45:00 UTC +00:00. Now the same user uploads a photo (Photo model) on the same day at 7:30 pm, so the value of created_at in the Photo record will be Tue, 29 Jul 2014 00:30:30 UTC +00:00.
Thus storing values in UTC introduces a difference of one day between the user's created_at and the photo's created_at, even though the user uploaded the photo on the same day they signed up on the website.
Requirement:
I want to find all the photos uploaded by the user on the same day the user was created, so I implemented the query below. The expected result is 1.
u = User.first
u.created_at
=> Mon, 28 Jul 2014 23:45:00 UTC +00:00
u.photos.where('Date(created_at) = ?', u.created_at).count
=> 0 {Failed, because the DB compares the time in UTC}
After a lot of searching I learned about setting Time.zone to the user's time zone.
u = User.first
Time.zone = u.time_zone
=> Central Time (US & Canada)
u = User.first
u.created_at
=> Mon, 28 Jul 2014 18:45:49 CDT -05:00 {Correct Time as per my requirements}
Photo.where('user_id = ? and Date(created_at) = ?', u.id, u.created_at).count
=> 0 {Failed}
But when I check the created_at of the photo record uploaded by the user, it returns the correct timestamp:
u.photos.first.created_at
=> Mon, 28 Jul 2014 19:30:00 CDT -05:00
Thus setting Time.zone only makes Ruby display times in the specified time zone; ActiveRecord still compares times in UTC.
Can anyone suggest a way of writing ActiveRecord queries so that the timestamp is converted from UTC to the user's time_zone?
I know this is an old post, but in case others read it: I found the local_time gem helpful, which renders the saved date in the user's timezone: https://github.com/basecamp/local_time
Otherwise I think this tutorial may be what you need if you absolutely must change the timezone in the DB: https://gorails.com/episodes/auto-detect-user-time-zones-in-rails
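Another common approach, if you would rather keep the stored values in UTC, is to compare against a range instead of a converted date: compute the start and end of the user's local calendar day and let the database check that created_at falls between those two UTC instants (in ActiveRecord, something like u.photos.where(created_at: day_start..day_end)). A minimal sketch of the range computation, shown here in Python with zoneinfo since the arithmetic is the same in any language:

from datetime import datetime, timedelta
from zoneinfo import ZoneInfo

user_zone = ZoneInfo("America/Chicago")   # Central Time (US & Canada), per the question
user_created_at_utc = datetime(2014, 7, 28, 23, 45, tzinfo=ZoneInfo("UTC"))

# The user's local calendar day...
local_day = user_created_at_utc.astimezone(user_zone).date()

# ...expressed as a UTC range [start, end) the database can compare created_at against.
day_start_utc = datetime(local_day.year, local_day.month, local_day.day,
                         tzinfo=user_zone).astimezone(ZoneInfo("UTC"))
day_end_utc = day_start_utc + timedelta(days=1)

print(day_start_utc, day_end_utc)
# 2014-07-28 05:00:00+00:00 2014-07-29 05:00:00+00:00
# The photo stored at 2014-07-29 00:30 UTC falls inside this range, as expected.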

Shopify API: Get Orders In Shopify by Date

Why does fetching orders from a Shopify store by date with a limit always return the orders with the latest date?
For example, I make a query to get 5 orders from 1 August 2012 onwards, using this query:
/admin/orders.json?status=open&created_at_min=2012-08-01 12:00&limit=5
I have 5 orders on 20 August 2012 and 5 orders on 31 August 2012, but this returns the 5 orders with the latest date (31 August 2012).
It's not documented, but you can do it by adding: order=processed_at+asc to your query.
The Shopify API returns orders from Most Recent to Oldest.
When you submit your query, Shopify will first create an array of your ten orders; the first five are from 31 August and the last five are from 20 August.
Then, by limiting it to five orders, Shopify gives you the first five.
As far as I'm aware, there is no way to specify your own sort order in the Shopify API. You'll need to get all the orders with created_at_min=2012-08-01 and then, in whatever language you're writing in, take the last 5 items of the array.
&created_at_max=2012-08-01 11:59 does the trick
Shopify Order API
The Shopify API works specifically with EST/EDT time. So first convert your local time to EDT, so you can mark the exact time in the EDT timezone.
Let's go through an example:
I'm currently in the JST (Asia/Tokyo) timezone, and I want the records for 2021-10-20 from the Orders API as per Japan time. So I need to build request parameters like this:
API:
orders.json?status=any&limit=250
Valid parameters for the date 2021-10-20 (JST):
If the requirement is based on the updated_at date:
&updated_at_min=2021-10-19T11:00:00-04:00&updated_at_max=2021-10-20T11:00:00-04:00
For the created_at date:
&created_at_min=2021-10-19T11:00:00-04:00&created_at_max=2021-10-20T11:00:00-04:00
Here we have converted the JST (+09:00) date to EDT (-04:00) to make sure that the data returned by the API covers only the 2021-10-20 date (from 2021-10-20 00:00:00 to 2021-10-21 00:00:00 JST).
Similarly, you can build the EDT-formatted dates for your own timezone.
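If you would rather compute those boundary strings than convert them by hand, a small sketch of the same conversion (a local JST calendar day expressed in the shop's -04:00 offset) could look like this; the fixed offsets are assumptions taken from the example above and should match whatever offset your shop actually uses:

from datetime import datetime, timedelta, timezone

def day_bounds(date_str, local_offset_hours, shop_offset_hours):
    # Return ISO 8601 start/end of the local calendar day, shifted to the shop's offset.
    local_tz = timezone(timedelta(hours=local_offset_hours))
    shop_tz = timezone(timedelta(hours=shop_offset_hours))
    start_local = datetime.strptime(date_str, "%Y-%m-%d").replace(tzinfo=local_tz)
    end_local = start_local + timedelta(days=1)
    return (start_local.astimezone(shop_tz).isoformat(),
            end_local.astimezone(shop_tz).isoformat())

created_at_min, created_at_max = day_bounds("2021-10-20", 9, -4)
print(created_at_min)  # 2021-10-19T11:00:00-04:00
print(created_at_max)  # 2021-10-20T11:00:00-04:00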
For a range between two dates, combine both bounds:
/admin/orders.json?created_at_min=2012-08-01 12:00&created_at_max=2012-08-31 12:00

Sorting records based on modified timestamp?

I am trying to sort a list of records that have been created using a screen scraping script. The script adds the following style of date and time (timestamp) to each record:
13 Jan 14:49
The script runs every 15 minutes, but if I set the sort order to 'time DESC' it doesn't really make sense because it lists the records as follows:
13 Jan 14:49
13 Jan 12:32
13 Jan 09:45
08 Feb 01:10
07 Feb 23:31
07 Feb 06:53
06 Feb 23:15
As you can see, it's listing the first figure correctly (the day of the month in number form) but it's putting February after January. To add to the confusion it's putting the latest date in February at the top of the February section.
Is there a better way of sorting these so they are in a more understandable order?
If you are storing the values in a database, simply use the column type datetime when creating the field. The database will treat the field as a time value and sort it chronologically.
Otherwise, if you are storing the values elsewhere, for example in a flat file, convert the formatted time to Unix time. Unix time is an integer, so it is easier to sort.
require 'time'  # Time.parse comes from the standard library's time extension

# Note: with no year in the string, Time.parse assumes the current year.
Time.parse("13 Jan 09:45").to_i
# => 1326444300
Time.parse("08 Feb 01:10").to_i
# => 1328659800
You can always convert a unix time to a Time instance.
Time.at(1328659800).to_s
# => "2012-02-08 01:10:00 +0100"