How to perform math calculations in Active Record Querying in Rails - sql

I'm building an application and I'm finding it necessary to perform some simple math calculations in my query. Essentially, I've got a database with daily values from the S&P 500, and I need to get a listing of days depending on the criteria entered.
The user inputs both a date range and a % range. For instance, if the date range is Jan 1/2013 - Apr 1/2013 and the % range is -1% to 1%, it should return a list of all S&P 500 daily values between those dates where the difference between the opening and closing values falls in the % range.
The problem is that I don't actually have a column for %; I only have columns for the opening/closing values. It is simple enough to calculate the % given only the opening and closing values: (close - open) / open * 100. But I'm not sure how to do this within the query.
Right now the query is successfully searching within the date range. My query is:
@cases = Close.find(:all, conditions: ["date between ? and ?",
  @f_start, @f_end])
But how can I get it to check if the current row's (close-open)/open*100 value is between the two % range values?
Alternatively, if this is not possible or in bad practice, where should I be handling this?

You can calculate the open/close range yourself in Ruby/Rails and pass it in the same way as the date range. Something like:
percent = @f_percent / 100.0 # 5% => 0.05
low = @f_close * (1.0 - percent)
high = @f_close * (1.0 + percent)
Close.where 'date between ? and ? AND close between ? and ?', \
  @f_start, @f_end, low, high
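For clarity, the row-level filter the question asks about is just the percent-change formula checked against the two bounds. A minimal sketch in plain Python (the sample rows and names are made up for illustration; this is not the Rails API):
# Illustrative sketch of the filter described in the question, outside any database.
rows = [
    ("2013-01-02", 1426.19, 1462.42),  # (date, open, close) - made-up sample values
    ("2013-01-03", 1462.42, 1459.37),
]
pct_low, pct_high = -1.0, 1.0  # the user's % range

def pct_change(open_price, close_price):
    # (close - open) / open * 100, as in the question
    return (close_price - open_price) / open_price * 100

matches = [r for r in rows if pct_low <= pct_change(r[1], r[2]) <= pct_high]
print(matches)  # only the rows whose open-to-close move falls inside the % range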

Related

Conversion of MIPS to %

I've been learning T-SQL and need some help converting CPU MIPS into a percentage.
My code already returns the data I'm expecting. In addition to this, I want to add a column that gives the CPU %. I have a column that gives me TOTAL CPU MIPS and want to use it in the code, but expressed as a percentage. For example, I have these values in my TOTAL CPU column:
1623453.66897
0
0
2148441.01573933
3048946.946314
I want to convert these values into percentages and use them. I couldn't find much info on the internet.
I'd appreciate your response.
I assume that you have 5 numeric quantities (2 of them being zero) and you want to find the percentage that each of them represents of the sum of the five quantities. Is that so?
To find the percentage of a particular number within the sum, multiply the number by 100 and divide by the sum; the result is the percentage that the number represents of the total.
The sum: 6820841.631023
The percentage of the first number (of MIPS):
1623453.668970 * 100 / 6820841.631023 = 23.80136876 =>
23.80136876% is the percentage of CPU used by the first program.
To give the answer some SQL form, referring to Mips_Table as the view/table that contains the MIPS data:
select mips, mips / k.TotMips * 100 as Pct_CPU
from Mips_Table,
     (select sum(mips) as TotMips from Mips_Table) k
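As a quick sanity check on the arithmetic, the same calculation in Python using the five values posted in the question:
# Percentage of total CPU for each MIPS value from the question.
mips = [1623453.66897, 0, 0, 2148441.01573933, 3048946.946314]
total = sum(mips)                       # the sum used above
pct = [m * 100 / total for m in mips]   # percentage of the total per value
print(round(total, 6))   # 6820841.631023
print(round(pct[0], 8))  # 23.80136876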

how to calculate monthly log returns

I need to construct two new variables: the monthly log returns of TSX and BTC.
Hint: the log return of each month is calculated with respect to the previous month's value
using the log of prices, i.e. Rt = ln(Pt) − ln(Pt−1).
Alternatively, you can work with percentage-change returns.
My TSX and BTC come from a data frame that includes 3 rows and 60 columns.
I have tried the following:
InputData['S&P/TSX Composite index (^GSPTSE)'] = InputData.pct_change()
InputData['log_ret'] = np.log(InputData.price) - np.log(InputData.price.shift(1))
but that did not work
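For reference, a minimal pandas sketch of the hinted formula Rt = ln(Pt) − ln(Pt−1); the DataFrame and column names below are placeholders for whatever InputData actually contains (since the question's frame is 3 rows by 60 columns, the series may need transposing with .T so each asset is a column):
import numpy as np
import pandas as pd

# Placeholder monthly prices, one asset per column (values made up).
prices = pd.DataFrame({
    "TSX": [19500.0, 19800.0, 19650.0],
    "BTC": [42000.0, 45000.0, 43500.0],
})

# Log return: R_t = ln(P_t) - ln(P_{t-1}) = ln(P_t / P_{t-1})
log_returns = np.log(prices / prices.shift(1))

# Alternative: percentage-change returns
pct_returns = prices.pct_change()

print(log_returns)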

Filter the Google Finance formula to only display the "high" of all time

It's in reference to the Google Finance function in Google Sheets: https://support.google.com/docs/answer/3093281?hl=en
I would like to obtain the "all time LOW" (ATL) and "all time HIGH" (ATH) for a specific ticker (e.g. ABBV or GOOG), but only in 1 cell for each. Basically: "What's the ATL/ATH value for this ticker?"
I've tried to do both formulas for ATL and ATH, but only ATL gives the expected result for now.
To get the ATL, you can use
=GOOGLEFINANCE("ABBV","low","01/12/1980",TODAY(),7)
and to get the ATH you can use:
=GOOGLEFINANCE("ABBV","high","01/12/1980",TODAY(),7)
The output of this is 2 columns of data (timestamp and value).
Please note that column A, containing the timestamp, is the one causing trouble when it comes to computing the MAX function, as it translates into some weird figures.
In order to get the ATL, I'll be using the MIN function which works perfectly fine:
=MIN(GOOGLEFINANCE("ABBV","low","01/01/1980",TODAY(),7))
as it will just scan the 2 columns of data and grab the lowest value which is 32.51 in USD.
BUT when I'm trying to do the same with MAX or MAXA for the ATH using for example
=MAX(GOOGLEFINANCE("ABBV","high","01/12/1980",TODAY(),7))
the result that comes out is 43616.66667 which seems to be a random computation of the column A containing the timestamp.
The expected result of the ATH should be 125.86 in USD.
I've tried using FILTER to exclude values >1000, but FILTER doesn't let me search in column B, so then I tried VLOOKUP using this formula
=VLOOKUP(MAX(GOOGLEFINANCE("ABBV","high","01/12/1980",TODAY(),7)),GOOGLEFINANCE("ABBV","high","01/12/1980",TODAY(),7),2,FALSE)
but again it returns the value of column B based on the MAX value of column A, which ends up giving me 80.1 and not the expected 125.86.
use:
=MAX(INDEX(GOOGLEFINANCE("ABBV", "high", "01/12/1980", TODAY(), 7), , 2))
43616.66667 is not a "random computation"; it's the date 31/05/2019 16:00:00 converted into its date serial value.
The MAX and MIN functions return a single output from all the cells in the included range, which in your case is two columns. A date is treated as a number too, so maxing over both columns will return the maximum value whether it comes from the 1st or the 2nd column. By introducing INDEX you can skip the 1st column and look for the max value only in the 2nd column.
=MAX(INDEX(GOOGLEFINANCE("BTCSGD", "price", "01/12/1980", TODAY(), 7), , 2))
replace BTCSGD with any stock price you want to search up.
You can put ABCXYZ, where ABC is the stock/ETF/crypto ticker and XYZ is the currency.
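As a quick check of the "date serial" explanation above, 43616.66667 can be decoded in Python (Sheets/Excel serials count days from 1899-12-30, with the fractional part as the time of day):
from datetime import datetime, timedelta

serial = 43616.66667            # the value MAX returned from the timestamp column
epoch = datetime(1899, 12, 30)  # Sheets/Excel day zero
print(epoch + timedelta(days=serial))  # ~2019-05-31 16:00:00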

Defining an RDLC chart axis with an aggregate function

The auto axis for one of my embedded charts isn't behaving well, sometimes only showing one other major value besides top and bottom. So I thought I'd set my own boundaries, which seemed pretty easy given that one of the columns on the chart is always going to be larger than any of the others.
<Maximum>=(((Max(Fields!Entered.Value, "Chart1") + 10) \ 50) + 1) * 50</Maximum>
(the other columns detail what happened to the things that entered this process)
Round up to the nearest 50 with a little overage to put the label on top. Then I can put the intervals at this divided by 5 and I'm gold.
Except I'm not gold. The chart groups records by date and the individual bars are Sum(Fields!Entered.Value) et cetera, so it's drastically underscaling when multiple batches get processed on a single date. But hey, it groups records by date, I can use that:
<ChartCategoryHierarchy>
  <ChartMembers>
    <ChartMember>
      <Group Name="Chart1_CategoryGroup">
        <GroupExpressions>
          <GroupExpression>=Fields!Date.Value</GroupExpression>
        </GroupExpressions>
      </Group>
    </ChartMember>
  </ChartMembers>
</ChartCategoryHierarchy>
as:
<Maximum>=(((Max(Fields!Entered.Value, "Chart1_CategoryGroup") + 10) \ 50) + 1) * 50</Maximum>
and it'll aggregate over the group just fine. Right?
The ValueAxis_Primary.Maximum expression for the chart 'Chart1' has a scope parameter that is not valid for an aggregate function. The scope parameter must be set to a string constant that is equal to either the name of a containing group, the name of a containing data region, or the name of a dataset.
Nope! It works just fine for "Chart1" but not for "Chart1_CategoryGroup"!
So, uh:
what scope are the axis calculations operating in, 'cause it ain't the category scope?
is there some way to provide them an aggregate scope that groups the data by date so they can do their calculations proper?
You Have To Nest The Scope
A little extra work gave me this insight:
Max(Fields!Entered.Value, "Chart1_CategoryGroup") returns the maximum of the entered fields within one single category group, which is not the level the Y axis is concerned with. What you're interested in is the maximum value of the summed calculation (within a group) for the whole chart, so specify the scopes to do that:
<Maximum>
=(((Max(
Sum(Fields!Entered.Value, "Chart1_CategoryGroup")
, "Chart1") + 10) \ 50) + 1) * 50
</Maximum>
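For intuition, the nested scope does the same thing as a two-level aggregation over a data frame: sum Entered within each date group first, then take the maximum of those sums over the whole chart. A rough sketch in Python with made-up values:
import pandas as pd

# Made-up rows: two batches processed on the first date, one on the second.
df = pd.DataFrame({
    "Date":    ["2024-01-01", "2024-01-01", "2024-01-02"],
    "Entered": [30, 45, 50],
})

# Sum within each category group (date), then max over the whole chart.
max_of_group_sums = df.groupby("Date")["Entered"].sum().max()  # 75, not 50

# Round up to the nearest 50 with a little headroom, as in the RDLC expression.
axis_max = ((max_of_group_sums + 10) // 50 + 1) * 50
print(axis_max)  # 100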

error in dividing 2 pandas series with decimal values (daily stock price)

I am trying to divide 2 pandas columns (the same column divided by a one-cell shift of itself) but I am getting an error.
This is surprising, as I have done this kind of computation many times before on time series data and never encountered this issue.
Can someone suggest what is going on here? I am computing the daily returns of the Adj Close price of a stock, so I need the answer as a decimal.
I think you need to convert the column to float first, because its dtype is object, which means the values are actually strings:
z = x.astype(float) / y.astype(float)
Or:
data['Adj Close'] = data['Adj Close'].astype(float)
z = data['Adj Close'].shift(-1) / data['Adj Close']
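Putting the two steps together, a small self-contained sketch (the sample prices are made up; pd.to_numeric with errors='coerce' is a slightly more forgiving alternative to astype(float), since unparseable values become NaN instead of raising):
import pandas as pd

# Made-up object-dtype price column, as in the question.
data = pd.DataFrame({"Adj Close": ["100.0", "101.5", "99.8", "102.2"]})

# Coerce the strings to floats; bad values turn into NaN rather than erroring.
data["Adj Close"] = pd.to_numeric(data["Adj Close"], errors="coerce")

# Return as a decimal, following the shift(-1) convention used above:
# next day's close divided by the current close, minus 1.
data["return"] = data["Adj Close"].shift(-1) / data["Adj Close"] - 1
print(data)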