This question already has answers here:
Subset a dataframe between 2 dates
(8 answers)
Closed 5 years ago.
Please run the code below. The "patients$time" column gives the timestamp. I want to fetch all the records between two times, say the first row's value "2017-01-02 11:41:53" and the 226th row's value "2017-08-07 09:06:07". I tried dbGetQuery but got an error. Please help.
library(bupaR)
patients
Try this (subsetting the data frame directly; since patients is an in-memory data frame rather than a database table, dbGetQuery is not needed):
patients[patients$time > '2017-01-02 11:41:53' & patients$time < '2017-08-07 09:06:07',]
This question already has answers here:
How do I melt a pandas dataframe?
(3 answers)
Closed 8 months ago.
I need to unpivot a dataset whose column names are dates. To un-pivot I would normally have to list the column names, but they change every month, so I cannot hard-code them.
Here is an example of the table:
I need to un-pivot these date columns without referring to them by name, as they may change next month. Here is the desired output:
Could you please help me with a solution for this in Spark SQL or pandas, as I am using Palantir Foundry?
Note:
There are hundreds of rows in the data; this single row is just a sample.
If I rename the columns, changing them back to date columns later would be difficult as well.
Thanks.
You can use melt() to achieve this. Selecting id_vars and value_vars by position means the date column names never have to be hard-coded:
import pandas as pd

# the first three columns stay as identifiers; every column after them is a date column
pd.melt(df, id_vars=df.columns[:3], value_vars=df.columns[3:],
        var_name='Date', value_name='Value')
This question already has answers here:
Add days Oracle SQL
(6 answers)
Closed 12 months ago.
I am having an issue trying to add a condition to my WHERE clause so that it returns only the last 3 days of data rather than grabbing the whole table. The date format comes through as 'dd-Mon-yy'.
Here is my current WHERE clause:
WHERE ("IAINVN00"."REF_LOCN" LIKE '51C%'
OR "IAINVN00"."REF_LOCN" LIKE '511%')
This works fine; it just brings back way too much data. When I add:
and "IAINVN00"."ADJDATE" >= (Date()-3)
This brings back an error of "ODBC--call failed. [Oracle][ODBC][Ora]ORA-00936: missing expression (#936)"
I have tried using this as well and get the same error
DateAdd("d",-3,Date())
To fix this, instead of using Date() (which Oracle does not recognize), I needed to use SYSDATE:
and "IAINVN00"."ADJDATE" >= sysdate - 3
This question already has answers here:
How to get Time from DateTime format in SQL?
(19 answers)
Closed 5 years ago.
I have a datetime column that has data like this:
Appt_DateTime (datetime, not null)
12/30/1899 7:50:00PM
I want to display only the time, in this case 7:50 PM. It can be with or without the seconds, preferably without.
How can I do this in a select?
You can try casting the column to time, something like:
SELECT CAST(<your column name> AS time) AS [time] FROM <your table>
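A few variations, assuming SQL Server (FORMAT needs SQL Server 2012 or later); the column name is taken from the question:
-- hours, minutes and seconds as a time value
SELECT CAST(Appt_DateTime AS time(0)) AS [time] FROM <your table>
-- hours and minutes only (style 108 is hh:mi:ss; varchar(5) keeps just hh:mi)
SELECT CONVERT(varchar(5), Appt_DateTime, 108) AS [time] FROM <your table>
-- 12-hour clock, e.g. 7:50 PM
SELECT FORMAT(Appt_DateTime, 'h:mm tt') AS [time] FROM <your table>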
This question already has answers here:
Simple way to transpose columns and rows in SQL?
(9 answers)
Closed 6 years ago.
I need to turn the values of a column into column headers.
Here is my current query:
select employee_id, reimbursement_type, SUM(amount) as [total amount], reimbursement_status
from md_reimbursement
group by employee_id, reimbursement_type, reimbursement_status
And I want it to be like this:
(Just ignore the Status field.)
I want the reimbursement_type values to become column headers, with the amount summed under each.
I already tried using PIVOT but didn't get what I expected.
Thanks.
Here is a sample prepared according to your requirement. Since you mentioned that Status can be ignored, I have not included it in the query:
select * from
(
select employee_id,reimbursement_type,amount from md_reimbursement
)src
pivot
(
sum(amount) for reimbursement_type in ([Biaya Dinas],[Other],[Transport],[Uang Makan])
)pvt
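If new reimbursement types show up later, the IN list has to be updated (or the pivot built dynamically). If you also prefer 0 instead of NULL for missing combinations, you can wrap the pivoted columns; a sketch, assuming SQL Server's ISNULL:
select employee_id,
       isnull([Biaya Dinas], 0) as [Biaya Dinas],
       isnull([Other], 0) as [Other],
       isnull([Transport], 0) as [Transport],
       isnull([Uang Makan], 0) as [Uang Makan]
from
(
select employee_id,reimbursement_type,amount from md_reimbursement
)src
pivot
(
sum(amount) for reimbursement_type in ([Biaya Dinas],[Other],[Transport],[Uang Makan])
)pvt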
This question already has answers here:
Compare 3 Consecutive rows in a table
(2 answers)
Closed 8 years ago.
I have a large table of transactions identified by user id and date. For each user's last transaction, I would like to calculate the time elapsed since the previous transaction. Is there something like a lag operator I can use to do this?
You can emulate LEAD and LAG in Teradata with window aggregate functions; an example is sketched below.
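A rough sketch, assuming a Teradata release with ordered analytic functions; the table and column names (transactions, user_id, tran_ts) are hypothetical placeholders for your own schema:
SELECT user_id,
       tran_ts,
       prev_tran_ts,
       -- NULL when the user has only one transaction
       (tran_ts - prev_tran_ts) DAY(4) TO SECOND AS elapsed
FROM (
    SELECT user_id,
           tran_ts,
           -- emulate LAG: the latest timestamp in the single row just before this one
           MAX(tran_ts) OVER (PARTITION BY user_id
                              ORDER BY tran_ts
                              ROWS BETWEEN 1 PRECEDING AND 1 PRECEDING) AS prev_tran_ts,
           -- flag each user's most recent transaction
           ROW_NUMBER() OVER (PARTITION BY user_id
                              ORDER BY tran_ts DESC) AS rn
    FROM transactions
) AS t
WHERE rn = 1;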