I have an MS SQL database for storing raw data on utility usage (electric, water, and gas), which I implemented to compile data from four disparate automated collection systems. The ultimate goal is to generate invoices from this data.
Different customers have one of a dozen different rate structures, each of which may or may not use a non-linear function to calculate cost per usage based on peak demand and total usage.
It is not unheard of for a particular customer to change from one rate structure to another, or for the rate calculation for a particular rate class to change from year to year, so I would like to put these formulas into new tables within my database where they can be easily referenced and modified.
Ideally, I would like to run one of these dynamic functions as part of a query, rather than relying on the front end to do the calculation, but I have no idea how that would work.
By request, an example formula of the type I am talking about:
All current customers with Electricity Rate Structure A pay $0.005/kW-hr for the first 2,000 kW-hr consumed, $0.004/kW-hr from 2,000 to 15,000 kW-hr, and $0.003/kW-hr for all consumption above 15,000 kW-hr. Additionally, any customer with demand higher than 50 kW is subject to a $0.002/kW-hr surcharge on all consumption. The values of these coefficients and thresholds, the number of thresholds, and whether or not the customer is even charged for peak demand can and do change from year to year and from rate structure to rate structure.
The formula for this (if I was programming it) would be:
min(sum(usage), 2000)*0.005 + min(max(sum(usage)-2000, 0), 13000)*0.004 + max(sum(usage)-15000, 0)*0.003 + (max(usage) > 50)*sum(usage)*0.002
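Since you asked how this could work inside the database: one approach is to store each rate structure's tiers and surcharge parameters as rows, keyed by rate structure and year, and let a query clip the usage into each tier. The following is a minimal sketch; all table and column names are hypothetical, not from your schema.

    -- One row per consumption tier, per rate structure, per year.
    CREATE TABLE RateTier (
        RateStructureId INT           NOT NULL,
        RateYear        INT           NOT NULL,
        TierFloor       DECIMAL(12,2) NOT NULL,  -- kW-hr where the tier starts
        TierCeiling     DECIMAL(12,2) NULL,      -- NULL = no upper bound
        RatePerKWHr     DECIMAL(9,5)  NOT NULL
    );

    -- Optional demand surcharge, per rate structure, per year.
    CREATE TABLE DemandSurcharge (
        RateStructureId   INT          NOT NULL,
        RateYear          INT          NOT NULL,
        DemandThresholdKW DECIMAL(9,2) NOT NULL, -- e.g. 50
        SurchargePerKWHr  DECIMAL(9,5) NOT NULL  -- e.g. 0.002
    );

    -- Rate Structure A for one year would be three RateTier rows
    -- (0-2000 @ .005, 2000-15000 @ .004, 15000+ @ .003) plus one
    -- DemandSurcharge row (50 kW @ .002).

    DECLARE @PeriodStart DATE = '2012-01-01', @PeriodEnd DATE = '2012-02-01';

    WITH PeriodUsage AS (
        -- Total and peak usage per customer for the billing period.
        SELECT CustomerId,
               SUM(UsageKWHr) AS TotalUsage,
               MAX(UsageKWHr) AS PeakDemand
        FROM UsageReading                        -- the raw-data table
        WHERE ReadingDate >= @PeriodStart AND ReadingDate < @PeriodEnd
        GROUP BY CustomerId
    ),
    TierCharges AS (
        -- Clip each customer's total usage to the slice that falls in each tier.
        SELECT u.CustomerId, u.TotalUsage, u.PeakDemand,
               SUM(t.RatePerKWHr *
                   CASE
                       WHEN u.TotalUsage <= t.TierFloor THEN 0
                       WHEN t.TierCeiling IS NULL
                         OR u.TotalUsage <= t.TierCeiling THEN u.TotalUsage - t.TierFloor
                       ELSE t.TierCeiling - t.TierFloor
                   END) AS TierCharge
        FROM PeriodUsage u
        JOIN Customer  c ON c.CustomerId = u.CustomerId
        JOIN RateTier  t ON t.RateStructureId = c.RateStructureId
                        AND t.RateYear        = YEAR(@PeriodStart)
        GROUP BY u.CustomerId, u.TotalUsage, u.PeakDemand
    )
    SELECT tc.CustomerId,
           tc.TierCharge
             + COALESCE(CASE WHEN tc.PeakDemand > s.DemandThresholdKW
                             THEN s.SurchargePerKWHr * tc.TotalUsage END, 0) AS TotalCharge
    FROM TierCharges tc
    JOIN Customer c ON c.CustomerId = tc.CustomerId
    LEFT JOIN DemandSurcharge s ON s.RateStructureId = c.RateStructureId
                               AND s.RateYear        = YEAR(@PeriodStart);

With this shape, a rate change from year to year becomes an INSERT of new tier rows rather than a code change, and a customer switching rate structures is just an update to their RateStructureId.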
I am looking for a way to download daily WTI crude oil (CL) term structure data from Bloomberg in the Excel Add-in.
My goal is to create a table that looks like this: for each futures contract, I want the last price and the days to maturity.
    Date   | CL1                        | CL2                        | ...
           | PX_Last | FUT_ACT_DAYS_EXP | PX_Last | FUT_ACT_DAYS_EXP |
    1.1.12 |         |                  |         |                  |
I have tried the FUT_ACT_DAYS_EXP field, but it is not available for historical price series. Is there maybe a different way to retrieve the term structure data with days to maturity from Bloomberg?
Many thanks!
The generic contract CL1 Comdty has a historic field of FUT_CUR_GEN_TICKER. So you can pull back a timeseries of PX_LAST and FUT_CUR_GEN_TICKER.
Then you can feed these underlying contract tickers into a BDP call for LAST_TRADEABLE_DT and subtract the PX_LAST date. You can hide the intermediate columns if they are not needed.
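As a sketch of the two calls (the cell layout here is illustrative; the fields are the ones named above):

    A2:  =BDH("CL1 Comdty", "PX_LAST,FUT_CUR_GEN_TICKER", "1/1/2012", "12/31/2012")
    D2:  =BDP(C2 & " Comdty", "LAST_TRADEABLE_DT")
    E2:  =D2 - A2

Here the BDH spills dates into column A, PX_LAST into B, and the underlying tickers (e.g. CLG12) into C; E2 is then the days to maturity for that row.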
And the final result, with the intermediate column D hidden:
NB: I'm using array functions here (note the # symbols in the formulae) rather than hard-coding ranges; it makes things more flexible if you want to change the history range. The multiple BDP calls are unnecessary, but the Bloomberg add-in may be caching them in any case. If performance is an issue, you can use the UNIQUE() function to get a list of the underlying contract names into a lookup table.
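For example (spill reference illustrative, assuming the BDH spill is anchored at A2 with the tickers in its third column):

    =UNIQUE(INDEX(A2#, 0, 3))

would return the distinct underlying contract tickers to build that lookup table from.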
It's been a few years since I looked at this, but you are using generic tickers, whereas FUT_ACT_DAYS_EXP most likely expects an actual contract, e.g. CLU1. There is a field that you can use to convert generic to actual as a time series, i.e. with BDH, but you would have to check that on FLDS. Once you have that, you can use BDH to pull in price and days to expiry. Bear in mind that with one BDH per date this would be a very inefficient approach with regard to data limits.
Alternatively, ask HELP HELP and they should give you a solid answer. Unfortunately I don't have access to the terminal anymore, but I used to be very involved in building such analytics.
I am designing a database for an app that sells parking-spot use. A customer can order n parking-spot uses, or pay a flat rate for monthly, weekly, or yearly use. They can also mix the two: for example, they might buy 9 parking-spot uses and later decide to pay for the whole month, in which case their charges are pro-rated.
For this, I have a Customer table, an Order table, and an OrderType table. However, I am having a hard time with the OrderType table.
Can someone please shed light on how to model the rates and how to get the available spot uses for a customer?
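One possible shape for this (a sketch only; every name below is hypothetical): put the rate parameters on the order-type row, and derive the available uses by counting what has been consumed against each order.

    -- Covers both per-use packs and flat-rate passes in one table.
    CREATE TABLE OrderType (
        OrderTypeId  INT PRIMARY KEY,
        Name         VARCHAR(50)  NOT NULL,  -- '10-use pack', 'Monthly pass', ...
        UsesIncluded INT          NULL,      -- NULL = unlimited (flat-rate pass)
        DurationDays INT          NULL,      -- NULL = no expiry (use pack)
        ListPrice    DECIMAL(9,2) NOT NULL
    );

    CREATE TABLE CustomerOrder (
        OrderId     INT PRIMARY KEY,
        CustomerId  INT NOT NULL,            -- FK to Customer
        OrderTypeId INT NOT NULL REFERENCES OrderType(OrderTypeId),
        PurchasedAt DATETIME NOT NULL,
        AmountPaid  DECIMAL(9,2) NOT NULL    -- differs from ListPrice when pro-rated
    );

    CREATE TABLE SpotUse (
        SpotUseId INT PRIMARY KEY,
        OrderId   INT NOT NULL REFERENCES CustomerOrder(OrderId),
        UsedAt    DATETIME NOT NULL
    );

    -- Available uses remaining on a customer's use packs:
    SELECT o.OrderId,
           ot.UsesIncluded - COUNT(su.SpotUseId) AS UsesRemaining
    FROM CustomerOrder o
    JOIN OrderType ot   ON ot.OrderTypeId = o.OrderTypeId
    LEFT JOIN SpotUse su ON su.OrderId = o.OrderId
    WHERE o.CustomerId = @CustomerId
      AND ot.UsesIncluded IS NOT NULL
    GROUP BY o.OrderId, ot.UsesIncluded;

The mixed case (buying 9 uses, then upgrading to a month) can then be handled at order time: credit the unused uses against the pass's list price and record the pro-rated figure in AmountPaid.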
I have searched and searched, but to no avail... has anybody created a payroll module for a U.S.-based company? From what I've seen, most companies use payroll services to process their payroll, but I haven't found anybody using OpenERP 7 for hourly employees under the U.S. tax system (it's not a flat tax rate).
It seems like what I may have to do is create tax tables in PostgreSQL for federal, state, and local taxes, then reference those tables in the deduction calculation. I read one article on using the vendors/suppliers module and implementing a tax structure from that, but then again, those are still flat rates. I have to believe someone else has done this for the U.S. payroll system, and probably done it better than I could, as I am fairly new to OpenERP.
I am in the process of doing something similar for LedgerSMB. The thing is that doing this on an open source model is extremely painful business-model-wise. I am working on solutions to that part but that's outside the scope of your question.
In general, many US taxes are set up as marginal rates with certain minimums and maximums. For example, income tax withholding is a set of marginal rates within tax brackets. The same goes for FICA and FUTA, but FICA taxes are capped at a certain level, so a simple tax table with rates, minimums, maximums, etc., plus a way of handling deductions to determine the correct line, may be sufficient (a sketch follows below).
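A minimal sketch of such a table (the figures below are placeholders, not real IRS numbers, and all names are hypothetical):

    -- One row per bracket, per tax, per year and filing status.
    CREATE TABLE tax_bracket (
        tax_code      VARCHAR(10)   NOT NULL,  -- 'FIT', 'FICA', 'FUTA', 'CA-SIT', ...
        tax_year      INT           NOT NULL,
        filing_status VARCHAR(10)   NOT NULL,  -- 'single', 'married'
        bracket_min   NUMERIC(12,2) NOT NULL,
        bracket_max   NUMERIC(12,2) NULL,      -- NULL = uncapped top bracket
        base_tax      NUMERIC(12,2) NOT NULL,  -- tax owed on wages below bracket_min
        marginal_rate NUMERIC(6,4)  NOT NULL
    );

    -- Determine the correct line for an annualized taxable wage of $52,000:
    SELECT base_tax + (52000.00 - bracket_min) * marginal_rate AS annual_tax
    FROM tax_bracket
    WHERE tax_code = 'FIT'
      AND tax_year = 2013
      AND filing_status = 'single'
      AND bracket_min <= 52000.00
      AND (bracket_max IS NULL OR 52000.00 < bracket_max);

The FICA cap falls out of the same shape: the row above the wage base simply carries a marginal_rate of 0 with base_tax set to the capped amount.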
But users of most open source ERPs use third-party services for payroll.
I have worked on an ERP that did this. Our approach was to calculate FIT (federal income tax) annually for every employee, first subtracting the withholding allowances:

    FIT basis = annualized taxable wages - (number of withholding allowances x exemption amount)

Run the process against the current revision of the annual tables, for single or married filing status only; since everything is annualized, there is no need to maintain tables for every pay frequency. Then divide by the pay frequency from the employee table, e.g.:

    for monthly: FIT / 12
    for daily:   FIT / 365

For SIT (state income tax), you have to follow each state's own documentation, e.g. using a CASE function per state.
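A purely illustrative per-state dispatch (rates, states, and names are placeholders):

    -- Annual SIT per employee; each state's rule comes from its own documentation.
    SELECT employee_id,
           CASE state_code
               WHEN 'TX' THEN 0                        -- no state income tax
               WHEN 'PA' THEN annual_taxable * 0.0307  -- flat-rate state (rate illustrative)
               ELSE NULL  -- bracketed states: look up a state tax table instead
           END AS annual_sit
    FROM employee;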
First some background: I have the typical Date dimension (similar to the one in the Adventure Works cube) and an Account dimension. In my fact table I have daily transaction amounts for the accounts.
I need to calculate cumulative transaction amounts for different accounts over different periods of time. The catch is that whichever period is shown first on the resulting report should get its transaction amount as-is from the fact table, and all the following periods in the report should show cumulative amounts.
For example, I might have a single account on rows, and on columns I could have [Date].[Calendar].[Calendar Year].&[2005]:[Date].[Calendar].[Calendar Year].&[2010]. The transaction amount for 2005 should be the sum of the transaction amounts that took place in 2005 for that specific account. For the following year, 2006, the transaction amount should be TransactionAmountsIn2005 + TransactionAmountsIn2006. The same goes for the remaining years.
My problem is that I don't really know how to specify this kind of calculated member in the cube because the end-user who is responsible for writing the actual MDX queries that produce the reports could use any range of periods on any hierarchy level of the Date dimension.
Hope this made some sense.
Teeri,
I would avoid letting the end-user actually write MDX queries and instead force them to use ranges you define. To clarify: give them a start and end date (a range, if you will) to select, and then go from there. I've worked with accounting and finance developing cubes (General Ledger, etc.) for years, and this is usually what they were ultimately looking for.
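For instance, with the fixed 2005-2010 range from the question, a query-scoped calculated member can sum from the range's first period up to the current member, which gives exactly the "first period as-is, then cumulative" behaviour (the measure, account attribute, and cube names here are hypothetical):

    WITH MEMBER [Measures].[Cumulative Amount] AS
        SUM(
            [Date].[Calendar].[Calendar Year].&[2005]
              : [Date].[Calendar].CurrentMember,
            [Measures].[Transaction Amount]
        )
    SELECT
        [Date].[Calendar].[Calendar Year].&[2005]
          : [Date].[Calendar].[Calendar Year].&[2010] ON COLUMNS,
        [Account].[Account Number].Members ON ROWS
    FROM [Finance]
    WHERE ([Measures].[Cumulative Amount]);

If you generate such a member from the user's chosen start date, the end-user only ever picks the start and end of the range.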
Good luck!
Let's say I have a website that sells widgets. I would like to do something similar to a tag cloud tracking best sellers. However, because I am constantly acquiring and selling new widgets, I would like the sales to decay on a weekly time scale.
I'm having problems puzzling out how to store and manipulate this data, and have it decay properly over time, so that something that was an ultra-hot item two months ago but has since tapered off doesn't outrank the current best sellers. What would be the logic and database design for this?
Part 1: You have to have tables storing the data that you want to report on. Date/time sold is obviously key. If you need to work in decay factors, that raises the question: for how long is the data good and/or relevant? At what point in time has the "value" of the data decayed so much that you no longer care about it? When this point is reached for any given entry in the database, what do you do: keep it there but ensure it gets factored out of all subsequent computations? Or do you archive it, copying it to a "history" table and deleting it from your main "sales" table? This is relevant, as it has to be factored into your decay formula (as well as your capacity planning, annual reporting requirements, and who knows what all else).
Part 2: How much thought has been given to the decay formula that you want to use? There's no end of detail you can work into this. Options and factors to wade through include but are not limited to:
Simple age-based. Everything after the cutoff date counts as 1; everything before counts as 0. Sum and you're done.
What's the cutoff date? Precisely 14 days ago, to the minute? Midnight as of two Saturdays ago from (now)?
Does the cutoff date depend on the item that was sold? If some items are hot but some are not, does that affect things? What if you want to emphasize some things (the expensive/hard to sell ones) over others (the fluff you'd sell anyway)?
Simple age-based decays are trivial, but can be insufficient. Time to go nuclear.
Perhaps you want some kind of half-life, Dr. Freeman?
Everything sold is "worth" X, where the value of X is either always the same or varies on the item sold. And the value of X can decay over time.
Perhaps the value of X decreases by one-half every week. Or every day. Or every month. Or (again) it may vary depending on the item.
If you do half-lives, the value of X may never reach zero, and you're stuck tracking it forever (which is why I wrote "part 1" first). At some point you probably need some kind of cut-off, some point after which you just don't care. X has decreased to one-tenth the initial value? Three months have passed? Either/or, but the "range" depends on the inherent value of the item? (A sketch of this follows below.)
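As a sketch of the half-life variant (names hypothetical; a half-life of one week and a cut-off of three months):

    -- Each sale is worth 0.5 ^ (age_in_days / 7): it loses half its weight every week.
    SELECT s.widget_id,
           SUM(POWER(0.5E0, DATEDIFF(DAY, s.sold_at, GETDATE()) / 7.0)) AS score
    FROM sales s
    WHERE s.sold_at >= DATEADD(MONTH, -3, GETDATE())  -- the cut-off: older rows don't count
    GROUP BY s.widget_id
    ORDER BY score DESC;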
My real point here is that how you calculate your decay rate is far more important than how you store it in the database. So long as the data the formula needs for its calculations is there, you should be good. And if you only need the last month's data for this, you should perhaps move everything older to some kind of archive table.
You could just count the sales for the last month/week/whatever and sort your items according to that.
If you want, you can always add the total amount of sold items into your formula.
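In SQL terms (table and column names hypothetical), that's simply:

    SELECT widget_id, COUNT(*) AS sales_last_week
    FROM sales
    WHERE sold_at >= DATEADD(WEEK, -1, GETDATE())
    GROUP BY widget_id
    ORDER BY sales_last_week DESC;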
You might have a table containing the definitions of the scoring criteria (most sales, most this, most that, etc.), and then, for a given period, store in another table the points attributed for each criterion defined in the criteria table. A historical table can then store the score of each seller for a given period or promotion; call it whatever you want.
Does it help a little?