Dear all,
I want to calculate the price rate of change by dividing today's price by the price 12 periods ago.
"df.close" is my data where I want to calculate the rate of change.
Please guide me.
I was unable to try anything because I felt this needed a window of 12 days rather than a continuous (rolling) window.
I've seen some similar answers, but they are long blocks of code not relevant to me. I simply need one line that fetches the price 12 days ago and divides by it.
Thank you.
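A minimal sketch, assuming df is a pandas DataFrame whose "close" column is sorted in chronological order (the sample prices below are made up):

import pandas as pd

# Hypothetical data; in practice df.close is your existing price series.
df = pd.DataFrame({"close": range(100, 130)})

# The price 12 periods ago is simply the series shifted down 12 rows,
# so the ratio is a one-liner:
df["roc"] = df["close"] / df["close"].shift(12)

# pct_change(12) gives the same information as a fractional change,
# i.e. df["close"] / df["close"].shift(12) - 1:
df["roc_pct"] = df["close"].pct_change(12)

print(df.tail())

The first 12 rows of the new column will be NaN, since there is no price 12 periods earlier for them.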
Related
I have a problem with calculations using the CAPM model (I have to do a finance project using this formula).
When calculating the Rf for 10-year bonds, what time period do I have to use? I stumbled across a video where someone uses a 3-month time period; is that okay? If not, what time period should I use for my calculations?
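For reference, a minimal sketch of the CAPM formula itself; the 4% risk-free rate, 1.2 beta, and 8% market return below are made-up placeholders, not a recommendation on which maturity or look-back period to use:

# CAPM: E(R) = Rf + beta * (E(Rm) - Rf)
def capm_expected_return(risk_free: float, beta: float, market_return: float) -> float:
    return risk_free + beta * (market_return - risk_free)

# Hypothetical annualized inputs, for illustration only.
print(capm_expected_return(risk_free=0.04, beta=1.2, market_return=0.08))  # 0.088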
I'm a data analyst in the insurance industry, and we currently have a program in SAS EG that tracks catastrophe development week by week since the start of the event for all of the catastrophic events that are reported.
(i.e. week 1 is the catastrophe start date + 7 days, week 2 would be the end of week 1 + 7 days, and so on.) All transaction amounts (dollars) for the specific catastrophes are then grouped into the respective weeks based on the date each transaction was made.
The problem we're faced with is that we are moving away from SAS EG to GCP BigQuery, and the current process of calculating those weeks relies on a manually read-in list, which isn't very efficient and doesn't translate easily to BigQuery.
Curious if anybody has an idea that would allow me to calculate each week number in periods of 7 days since the start of an event in SQL, or has an idea specific to BigQuery? There would be different start dates for each event.
It is complex, I know, and I'm willing to give more explanation as needed. Open to any ideas, as I haven't been able to find anything.
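As a sketch of the underlying arithmetic (shown in Python here; in BigQuery standard SQL the same logic should map to DATE_DIFF(transaction_date, start_date, DAY) and integer division by 7 — all names below are hypothetical):

from datetime import date

def event_week(txn_date: date, event_start: date) -> int:
    # Week 1 covers the start date through start + 6 days,
    # week 2 the next 7 days, and so on.
    return (txn_date - event_start).days // 7 + 1

# Hypothetical dates, for illustration only.
print(event_week(date(2024, 1, 10), date(2024, 1, 1)))  # 9 days in -> week 2

Each event's own start date is passed in as event_start, so differing start dates per event are handled naturally.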
For years I've been using webpage requests like the following to retrieve 20 days at a time of minutewise stock data from Google:
http://www.google.com/finance/getprices?q=.INX&i=60&p=20d&f=d,c,h,l,o,v
= Retrieve for .INX (S&P 500 index) 60-second interval data for the last 20 days, with the format Datetime (in Unix format), Close, High, Low, Open, Volume.
The Datetime is in Unix format (seconds since 1/1/1970, prefixed with an "A") for the first entry of each day; subsequent entries show the number of intervals that have passed (so 1 = 60 seconds after the market opened that day).
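For reference, a sketch of decoding that Datetime column under the format just described (the sample rows are made up, and the case handling of the "A" prefix is an assumption):

from datetime import datetime, timezone

INTERVAL_SECONDS = 60
rows = [
    "A1505741400,2500.1,2501.0,2499.5,2500.0,120000",  # hypothetical first row of a day
    "1,2500.5,2501.2,2500.1,2500.2,98000",
    "2,2500.9,2501.5,2500.4,2500.6,87000",
]

anchor = 0
for row in rows:
    first, *rest = row.split(",")
    if first.upper().startswith("A"):
        anchor = int(first[1:])                      # absolute Unix timestamp
        ts = anchor
    else:
        ts = anchor + int(first) * INTERVAL_SECONDS  # offset in intervals from the anchor
    print(datetime.fromtimestamp(ts, tz=timezone.utc), rest)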
That worked up until 9/10/2017, but today (9/17) it only returns day-end data (it even reports the "interval" between samples as 86400). Pooey! I can get that anywhere, in bulk.
But if I ask for fewer days, or broader intervals, it seems to return data - but weird data. Asking for data every 120 seconds returns exactly that - but only for every other market day. Weird!
Has anyone got a clue what might have happened?
Whoa! I think I figured it out.
Google still returns minutewise data under approximately the same limitations (up to 20 calendar days), but instead of d=10 returning all the market data for the last 10 calendar days, it returns the data for the last 10 market days. Previously, to get the last 10 market days you would ask for d=14 (M-F x 2, plus two weekends). Now Google interprets the d variable as market days, and asking for d=20 exceeds the limits on what they will deliver.
It now appears that d=15 is the limit (three weeks of market days). No clue on why I got the very weird every-other-day data for a while... but maybe if you exceed their d-limits the intervals get screwy. Dunno. Don't care. Easy fix.
I used to have a number of queries running on the past 40 days of data using a decorator like [dataset.table#-4123456789-].
However, since September 15 all the decorators return a maximum of 10 days of data.
By the way, [dataset.table#0] returns the whole table and not the past 7 days as stated in the documentation.
Does anyone know what is going on? Do I have to move my table to partitioning in order to get data for a limited period of time, but longer than a week?
Thanks
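For context, a minimal sketch of how such a decorator query might be issued from the Python client; the dataset and table names are hypothetical, decorators only work in legacy SQL, and the documented decorator syntax uses "@" with a millisecond offset relative to now:

from google.cloud import bigquery

client = bigquery.Client()

# Table snapshot/range decorators are a legacy SQL feature.
job_config = bigquery.QueryJobConfig(use_legacy_sql=True)

# Hypothetical table; -864000000 ms is "10 days ago", the trailing "-" means "until now".
query = "SELECT COUNT(*) FROM [mydataset.mytable@-864000000-]"

for row in client.query(query, job_config=job_config).result():
    print(row)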
Basically, I'm trying to calculate the total average outage hours in a worksheet I'm working with; however, I'm trying to have it further broken down.
Here is a picture of the Excel Sheet:
Excel Calculations
(Not allowed to add embedded pictures yet :( - 10 reputation needed, sorry!)
Pretty much, I'm trying to calculate what the average up-time is for the month by calculating the average downtime and then subtracting it from 100.00%.
What I've got works, but I'm trying to work out whether the Outage Hours column can be scrapped and the total calculated with perhaps just a single larger formula.
Here is a link to the spreadsheet: https://www.dropbox.com/s/msowjndootd2hh2/Spreadsheet%20Calculations.xlsx?dl=0
Thanks in advance!
Just to clarify, you're looking for the result from the second column without having to include the second column, correct?
Since the second column is just the remaining difference (from 1.00) of the first column, to get the result all you have to do is subtract the total sum of the first column from the overall maximum (the number of months × 1.00).
Meaning (assuming 12 months)...:
=12-SUM(B4:B15)
(Replace 12 with however many months are being summed.)
EDIT: OP is looking for =AVERAGE(B4:B15)