OpenERP payroll (using previous payslips' data) - Odoo

I want to calculate holiday payment using OpenERP payroll. I managed to configure the salary calculations to my needs, but I'm stuck on using previous payslips' data in a new one.
What I want to do is calculate holiday payments, which requires taking the previous three months' gross (or bruto) salary and the days worked in those three months, and computing an average $/day for the period. I could then use this figure to calculate how much money an employee should get for his holidays.
I just can't find a way to access such data, because everything available in a rule (rules, categories, inputs) refers to the present payslip.
Does anyone know how to do it?
Thanks

Salary rules using a Python expression have a payslip object available.
This object has a sum method:
def sum(self, code, from_date, to_date=None)
Probably you can write a rule containing something like:
payslip.sum('GROSS', a_start_date, a_end_date)
You'll need to add an expression to calculate your start and end dates, but I'm not sure whether the datetime and timedelta objects are available in the evaluated expression...
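For illustration, the Python code of such a rule could look roughly like this. It is only a sketch: it assumes datetime and timedelta are usable in the evaluation context, that dates come through as 'YYYY-MM-DD' strings, that the usual 'GROSS' and 'WORK100' codes exist, and that worked_days exposes a sum method like payslip does (check your hr_payroll source for all of these):

    # three months approximated as 90 days
    start = (datetime.strptime(payslip.date_from, '%Y-%m-%d') - timedelta(days=90)).strftime('%Y-%m-%d')
    end = payslip.date_from
    gross = payslip.sum('GROSS', start, end)        # gross paid over the previous ~3 months
    days = worked_days.sum('WORK100', start, end)   # days worked over the same period
    result = gross / days if days else 0.0          # average gross per worked day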

Great to hear that you have the payroll configured correctly, as the payroll engine is quite strong and technical. Regarding your holiday payment, though, it is purely a customization; the generic implementation does not come with any special feature like this.
But if you want to make it part of the salary slip, you can always add a more complex head (rule) with Python code that calculates your requirement. You have variables like:
# employee: hr.employee object
# rules: object containing the rules code (previously computed)
# worked_days: object containing the computed worked days.
# inputs: object containing the computed inputs.
Hope this will help, thanks.

Commodity Term Structure Data from Bloomberg

I am looking for a way to download daily WTI crude oil (CL) term structure data from Bloomberg via the Excel Add-in.
My goal is to create a table that looks like this: for each futures contract, I want the last price and the days to maturity.
Date   | CL1                       | CL2                       | ...
1.1.12 | PX_Last, FUT_ACT_DAYS_EXP | PX_Last, FUT_ACT_DAYS_EXP | ...
I have tried the FUT_ACT_DAYS_EXP field, but it is not available for historical price series. Is there maybe a different way to retrieve the term structure data with days to maturity from Bloomberg?
Many thanks!
The generic contract CL1 Comdty has a historic field of FUT_CUR_GEN_TICKER. So you can pull back a timeseries of PX_LAST and FUT_CUR_GEN_TICKER.
Then you can feed these underlying contract tickers into a BDP call for LAST_TRADEABLE_DT and subtract the PX_LAST date. You can hide the intermediate columns if they are not needed.
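For illustration, a per-row version of those formulas could look like this (the cell layout and date range here are assumptions). With the BDH call in A2 spilling dates into column A, PX_LAST into column B and the underlying ticker into column C:

    =BDH("CL1 Comdty", "PX_LAST,FUT_CUR_GEN_TICKER", "20120101", "20120331")

then in D2, filled down, the expiry of each underlying contract:

    =BDP($C2 & " Comdty", "LAST_TRADEABLE_DT")

and in E2, filled down, the days to maturity:

    =D2 - $A2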
And that gives the final result, with the intermediate column D hidden.
NB. I'm using array functions here (note the # symbols in the formulae), rather than hard-coding ranges. It makes it more flexible if you want to change the history range. The multiple BDP calls are unnecessary but the Bloomberg addin may be caching them in any case. If performance is an issue you can use the UNIQUE() function to get a list of the underlying contract names into a lookup table.
It's been a few years since I looked at this, but you are using generic tickers, whereas FUT_ACT_DAYS_EXP would most likely expect an actual contract, e.g. CLU1. There is a field that you can use to convert generic to actual as a time series, i.e. with BDH, but you would have to check that on FLDS. Once you have that, you can use BDH to pull in price and days to expiry. Bear in mind that with one BDH per date this would be a very inefficient approach with regard to limits.
Alternatively, ask HELP HELP and they should give you a solid answer. Unfortunately I don't have access to the terminal anymore, but I used to be very involved in building such analytics.

Standing Orders with SQL / HQL

I implemented my own program to manage my incomes and expenses some years ago. However, I realized that I need some kind of "standing orders": incomes or expenses which repeat monthly, quarterly or yearly. I would add them in their own table (with the value, description, start and end date, repetition rate, ...). But how do I query them with SQL/HQL in a smart way? For example: I want all incomes for a given month. Right now I would have to run through all entries and check somehow whether the start date plus a multiple of the repetition rate "hits" the current month. That seems very cumbersome to me. Is there an easy way to implement such operations?
Sorry for answering my own question, but in the meantime I have realized that it is not that difficult. Even in HQL it is possible to calculate the number of months between two dates (if not with an existing function, it can still be done in the HQL expression using an obvious formula). Of course one has to handle correctly the case of days which exist in one month but not in another (e.g., February), but this is a detail which can likely be ignored in my case.
Knowing the number of months between the start date and the current month, a simple modulo expression can check whether the current month is "hit" by the standing order. The rest is simple.
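For example, a plain-SQL sketch of that check (table and column names here are made up, and the exact date functions vary by dialect; HQL has year()/month() equivalents):

    SELECT *
    FROM standing_order so
    WHERE so.start_date <= :month_end
      AND (so.end_date IS NULL OR so.end_date >= :month_start)
      AND MOD((:year - EXTRACT(YEAR FROM so.start_date)) * 12
              + (:month - EXTRACT(MONTH FROM so.start_date)),
              so.repeat_months) = 0

Here :year and :month identify the queried month, and repeat_months is 1, 3 or 12 for monthly, quarterly or yearly orders.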

QlikView gauge chart - static variable

I am looking to create and use a gauge chart, but I am trying to figure out how to compare the current selection vs. the complete universe. For example, let's say I want to measure the monthly spending average of one department vs. the spending average of the entire organization. How do I make the organization average the segment marker? Or make it a static variable that does not change with the selection?
As is the case with a lot of questions that pop up here, you can use set analysis to achieve this. Assuming you are going to select one or more departments in the current state, in your gauge expression you could do something like this to get the percent of total spending:
Avg({$} Spending) / Avg({1} Spending)
The {} syntax is used in conjunction with the set identifiers $ and 1 to denote the set of the current selections and the entire universe respectively. So the above expression will give you the average spending of whatever departments you have selected as a percentage of the average organizational spending.
You can use set modifiers in addition to the set identifiers to further customize what population you are including. See here for more information.
All you need to do is use set analysis and tell the expression what to ignore in the selections box.
Example:
Sum({<Month=, Year=>}Spending)
What the above is telling you is that it's taking the "Month" field and setting the selection for that specific expression to nothing, as well as the "Year" field. If you were to do the following:
Sum({<Month=, Year={"2014"}>}Spending)
It will calculate for that specific year. Or even this:
Sum({<Month=, Year={"2013","2014"}>}Spending)
which will make it calculate for both 2013 and 2014. Read up on the help file as well by pressing F1 and searching for set analysis.
So something like this will work for you
Sum(Spending) / Sum({<Month=, Year=>}Spending)

MDX Calculated Member SubCube

I am relatively new to this depth of MDX, but here is my dilemma. My goal is to implement a calculated member using a .Net Stored Procedure. The calculation (XIRR) will be based on a set of cash flow dates and cash flow amounts. Ideally this would be a calculation in my cube that is available as a measure to Excel/Browser users.
So to start simple, I am just trying to implement my own COUNT calculated member/measure (not even using .Net), to count the number of members in a given dimension based on the current context. So let's say I have a dimension Customer with a CustomerId key, and let's say there are a total of 100 customers in my database. So Count(Customer.CustomerId.AllMembers) would be 100. Now when you start using the browser and, say, filter on Customer.CustomerId.&1, Customer.CustomerId.&2 (customer ids 1 and 2), I would expect my count calculated member to return 2, but it returns the total count of 100. I have tried using Exists. I am sure there is something that I am just fundamentally not understanding yet.
Hopefully this makes sense, would hugely appreciate any help from someone that has a good understanding of SSAS/MDX and calculations. Thanks in advance.
Marty
You may have some issues here; I did when I tried to do a similar thing.
Your calculated member is not honouring the client sub-select, which is normal. What in theory you would do is create a dynamic set, and then use that in the calculated member to force the dimension count to be evaluated in the context of the subcube your filters have created. Mosha has a good article here: http://sqlblog.com/blogs/mosha/archive/2007/08/25/mdx-in-katmai-dynamic-named-sets.aspx
So you'd end up with something like:
CREATE DYNAMIC SET CurrentCube.Customers AS
    EXISTING(Customer.CustomerId.CHILDREN);
CREATE MEMBER CurrentCube.Measures.CustomerCount AS
    Customers.COUNT;
Now the real problem you'll have is a bug in SSAS (https://connect.microsoft.com/SQLServer/feedback/details/484865/calcuated-member-with-a-reference-to-dynamic-named-set-kills-the-cubes-performance), so the code above, which will probably work just fine locally, will kill a production cube. This was an exciting learning experience for me.
See if you can get any of the workarounds to work; I couldn't.
I was able to get what I wanted, but I had to create query-scoped dynamic sets as part of the MDX query; I wasn't able to create them as a cube object:
WITH DYNAMIC SET Customers AS
    EXISTING(Customer.CustomerId.CHILDREN)
MEMBER Measures.CustomerCount AS
    Customers.COUNT
SELECT
    Measures.CustomerCount ON COLUMNS
FROM [Cube]
WHERE Customer.CustomerId.&[1]
Let us know how you get on.

Track sales for week/month and find the best sellers

Let's say I have a website that sells widgets. I would like to do something similar to a tag cloud tracking best sellers. However, because I am constantly acquiring and selling new widgets, I would like the sales to decay on a weekly time scale.
I'm having problems puzzling out how to store and manipulate this data and have it decay properly over time, so that something that was an ultra hot item 2 months ago but has since tapered off doesn't show at the top of the list above the current best sellers. What would be the logic and database design for this?
Part 1: You have to have tables storing the data that you want to report on. Date/time sold is obviously key. If you need to work in decay factors, that raises the question: for how long is the data good and/or relevant? At what point in time has the "value" of the data decayed so much that you no longer care about it? When this point is reached for any given entry in the database, what do you do--keep it there but ensure it gets factored out of all subsequent computations? Or do you archive it--copy it to a "history" table and delete it from your main "sales" table? This is relevant, as it has to be factored into your decay formula (as well as your capacity planning, annual reporting requirements, and who knows what all else).
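If you go the archive route, a minimal SQL sketch of that move (table and column names are assumptions):

    INSERT INTO sales_history
    SELECT * FROM sales WHERE sold_at < :cutoff;
    DELETE FROM sales WHERE sold_at < :cutoff;

Run the two statements in one transaction so no rows are lost between them.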
Part 2: How much thought has been given to the decay formula that you want to use? There's no end of detail you can work into this. Options and factors to wade through include but are not limited to:
Simple age-based. Everything newer than the cutoff date counts as 1; everything older counts as 0. Sum and you're done.
What's the cutoff date? Precisely 14 days ago, to the minute? Midnight as of two Saturdays ago from (now)?
Does the cutoff date depend on the item that was sold? If some items are hot but some are not, does that affect things? What if you want to emphasize some things (the expensive/hard to sell ones) over others (the fluff you'd sell anyway)?
Simple age-based decays are trivial, but can be insufficient. Time to go nuclear.
Perhaps you want some kind of half-life, Dr. Freeman?
Everything sold is "worth" X, where the value of X is either always the same or varies by the item sold. And the value of X can decay over time.
Perhaps the value of X decreases by one-half every week. Or every day. Or every month. Or (again) it may vary depending on the item.
If you do half-lives, the value of X may never reach zero, and you're stuck tracking it forever (which is why I wrote "part 1" first). At some point, you probably need some kind of cut-off, some point after which you just don't care. X has decreased to one-tenth the initial value? Three months have passed? Either/or, but the "range" may depend on the inherent value of the item?
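A minimal sketch of that half-life scoring in Python (the per-sale value and the half-life constant are assumptions you would tune):

    from datetime import datetime

    HALF_LIFE_DAYS = 7.0  # assumed: the "worth" of a sale halves every week

    def decayed_score(sales, now):
        # sales: list of (sold_at: datetime, value: float) tuples for one item
        score = 0.0
        for sold_at, value in sales:
            age_days = (now - sold_at).total_seconds() / 86400.0
            score += value * 0.5 ** (age_days / HALF_LIFE_DAYS)
        return score

    # Rank items by score, highest first (the items/sales structure is hypothetical):
    # best = sorted(items, key=lambda i: decayed_score(i.sales, datetime.now()), reverse=True)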
My real point here is that how you calculate your decay rate is far more important than how you store it in the database. So long as the data the formula needs for its calculations is there, you should be good. And if you only need the last month's data to do this, you should perhaps move everything older to some kind of archive table.
You could just count the sales for the last month/week/whatever, and sort your items according to that.
If you want, you can always add the total amount of sold items into your formula.
You might have a table which contains the definitions of the scoring criteria (most sales, most this, most that, etc.); then, for a given period, store in another table the points attributed for each of the criteria defined in the criteria table. Obviously, a historical table will be used to store the score of each seller for a given period or promotion; call it whatever you want.
Does that help a little?