Using the sum of previous payslips of the same year in salary calculation in Odoo

I am using Odoo Payroll 15 with some custom Python salary structures.
I have some types of taxes with a maximum amount that has to be paid per employee per year. I want to sum a specific line across that year's payslips, so that I can correctly calculate the amount due on the current payslip.
I have tried this, but it seems to use only the current payslip in its calculation:
payslip.sum('THE_TAX', '2022-01-01', '2022-12-31')
How can I access previous payslips within the payslip rules? A solution that requires Odoo 16 would work for me too.

The main difference between your example and the sum implementation is that your code does not check the payslip state. This means it counts cancelled and unfinished payslips as well, while the sum implementation counts only payslips whose state is 'done'.
If you need different behavior, you can add a modified version of the sum function to the payslip model and call that instead.
The implementation below is from v12; the hr_payslip module is now part of the Enterprise edition.
class Payslips(BrowsableObject):
    def sum(self, code, from_date, to_date=None):
        if to_date is None:
            to_date = fields.Date.today()
        self.env.cr.execute("""SELECT sum(pl.total) -- this line differs between v12 and v15
                    FROM hr_payslip as hp, hr_payslip_line as pl
                    WHERE hp.employee_id = %s
                    AND hp.state = 'done'
                    AND hp.date_from >= %s
                    AND hp.date_to <= %s
                    AND hp.id = pl.slip_id
                    AND pl.code = %s""",
                    (self.employee_id, from_date, to_date, code))
        res = self.env.cr.fetchone()
        return res and res[0] or 0.0
https://github.com/odoo/odoo/blob/c53081f10befd4f1c98e46a450ed3bc71a6246ed/addons/hr_payroll/models/hr_payslip.py#L300
Edit:
I think this is a bug in Odoo: the 'paid' state should be included, but it is not. You can use a variation of your code; the filtered and mapped functions should make it faster.
I can't test the code:
already_paid = sum(
    employee.mapped("slip_ids")
    .filtered(lambda s: s.date_from.year == 2022 and s.state in ("done", "paid"))
    .mapped("line_ids")
    .filtered(lambda l: l.code == "THE_TAX")
    .mapped("total")
)

I think I found an ugly workaround that does give the correct answer:
already_paid = 0.0
for slip in employee.slip_ids:
    for line in slip.line_ids:
        if line.code != "THE_TAX":
            continue
        if line.date_from.year != 2022:
            continue
        already_paid += line.total
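For illustration, once already_paid is known, the capped amount for the current payslip can be computed like this. This is a plain-Python sketch outside Odoo; ANNUAL_CAP and RATE are hypothetical values standing in for the real tax parameters:

```python
ANNUAL_CAP = 1200.0   # maximum tax per employee per year (assumed value)
RATE = 0.05           # tax rate applied to the wage (assumed value)

def capped_tax(wage, already_paid):
    """Tax due on this payslip, never letting the yearly total exceed the cap."""
    remaining = max(ANNUAL_CAP - already_paid, 0.0)
    return min(wage * RATE, remaining)
```

With this shape, the salary rule only needs the already-paid total and the current wage; the cap logic itself stays trivial.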

Related

How to write SQL in DAX?

I am trying to write the following SQL code in DAX.
select SUM(amount) from TABLE
where BusinessDate = '2023-02-11'
OR (Segment = 'General' and Days_Without_Payments < 210)
OR (Segment = 'Enterprise' and Days_Without_Payments < 1215)
OR (Segment = 'House' and Days_Without_Payments < 1945)
I got the first part (i.e. the SUM) right, but can't get my head around the filters.
What I tried:
Measure = CALCULATE([Total Outstanding],
    FILTER(VALUES(BILBL_WO[Date]), BILBL_WO[Date] = MAX(BILBL_WO[Date])),
    AND(BILBL_WO[Days_Without_Payments] < 210, BILBL_WO[Segment] = "General") ||
    AND(BILBL_WO[Days_Without_Payments] < 1215, BILBL_WO[Segment] = "Enterprise") ||
    AND(BILBL_WO[Days_Without_Payments] < 1945, BILBL_WO[Segment] = "House")
)
Where Total Outstanding is another Measure which I created for summation of Amount.
Please help as I couldn't find something useful from internet and I am new at this. Thanks!
In general, when people start learning DAX, most think it is like SQL, but that is not true: SQL is a structured query language, while DAX is a formula language used for analysis, similar to Excel formulas.
I will give you a few pointers on how to convert SQL to DAX in simple terms, assuming you have one table imported into Power BI or an SSAS tabular model.
You need to break your query down as follows:
The aggregation function, SUM(amount) in your SQL query, becomes in DAX:
Total Outstanding = sum('BILBL_WO'[amount])
The WHERE condition on a date column:
I usually create a date dimension and relate it to the table that has the date column. You can explore this further when you get into evaluation context in Power BI and star-schema architecture. In your case, if the column's data type is a string, convert it to a date type and then use the date column as a filter on the Power BI report page, so report users can select different dates, not just '2023-02-11'.
For your conditions on Segment and Days_Without_Payments: before converting to DAX, I think your original query is missing parentheses. Please let me know if I have misunderstood.
select SUM(amount)
from TABLE
where BusinessDate = '2023-02-11'
and
(
(Segment = 'General' and Days_Without_Payments < 210)
OR
(Segment = 'Enterprise' and Days_Without_Payments < 1215)
OR
(Segment = 'House' and Days_Without_Payments < 1945)
)
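The parentheses matter because AND binds tighter than OR, so without the outer grouping the BusinessDate filter only applies to the first branch. A quick Python check with a hypothetical row (wrong date, but matching the "General" branch) shows the two predicates disagree:

```python
# Hypothetical row: NOT from 2023-02-11, but matching the "General" branch.
row = {"BusinessDate": "2023-01-01", "Segment": "General", "Days_Without_Payments": 100}

# Original predicate: plain OR chain, date check is just one more branch.
original = (row["BusinessDate"] == "2023-02-11"
            or (row["Segment"] == "General" and row["Days_Without_Payments"] < 210)
            or (row["Segment"] == "Enterprise" and row["Days_Without_Payments"] < 1215)
            or (row["Segment"] == "House" and row["Days_Without_Payments"] < 1945))

# Corrected predicate: date check ANDed with the parenthesized OR group.
corrected = (row["BusinessDate"] == "2023-02-11"
             and ((row["Segment"] == "General" and row["Days_Without_Payments"] < 210)
                  or (row["Segment"] == "Enterprise" and row["Days_Without_Payments"] < 1215)
                  or (row["Segment"] == "House" and row["Days_Without_Payments"] < 1945)))

print(original, corrected)  # True False
```

The original version includes the row despite the wrong date; the corrected version excludes it.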
Altered SQL script
If my above assumption is valid and you need to write a DAX measure with the Segment and Days_Without_Payments conditions, you can write it as follows:
Measure = CALCULATE([Total Outstanding],
    KEEPFILTERS(
        FILTER(
            ALL(BILBL_WO[Segment], BILBL_WO[Days_Without_Payments]),
            (BILBL_WO[Segment] = "General" && BILBL_WO[Days_Without_Payments] < 210)
            || (BILBL_WO[Segment] = "Enterprise" && BILBL_WO[Days_Without_Payments] < 1215)
            || (BILBL_WO[Segment] = "House" && BILBL_WO[Days_Without_Payments] < 1945)
        )
    )
)
For more on filtering DAX with multiple columns, see this YouTube video https://www.youtube.com/watch?v=kQjYG6TJVp8 and follow the SQLBI channel; it will help you a lot.
I hope I helped in a way; if so, please mark this as an answer and vote for it :)

How to get the value of a form field and pass it to another form field

I am developing an HR system and want to calculate how many days off an employee can take.
In the HR table I have three attributes:
1- start_date (when the employee started working at the company).
2- leavestaken (how many days off the employee has already taken).
3- leaves (the number of days available for the employee to take off), which equals (months between the current date and start_date) * 2.5 (because each month grants 2.5 days off) - leavestaken.
The third attribute is automated, not entered by the user. How can I get the values of the first two attributes from the form fields, use them in the calculation, and save the result in the leaves attribute?
I'm using SQL, and this is for the server side only.
Server side:
If your start_date is a unix timestamp:
if ($model->load(Yii::$app->request->post())) {
    $n_month = (time() - $model->start_date) / (30 * 24 * 60 * 60);
    $model->leaves = ceil($n_month * 2.5) - $model->leavestaken;
    // ceil: a partial month still grants the day off, so round up to the nearest integer
}
But if it's not a unix timestamp, you can use the date_diff() function instead.
You can do this in the model file, in the beforeSave or beforeValidate method.
You can use JavaScript to calculate the client side.
My solution was
if ($model->load(Yii::$app->request->post())) {
    $date = date('m/d/Y');
    $datetime1 = date_create($date);
    $datetime2 = date_create($model->start_date);
    // calculate the difference between the DateTime objects
    $interval = $datetime2->diff($datetime1);
    $n_month = $interval->m + 12 * $interval->y;
    $model->leaves = ceil($n_month * 2.5) - $model->leavestaken;
    $model->save();
}
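The same accrual rule is easy to sanity-check outside the framework. A plain-Python sketch of the calculation above (the 2.5 days/month rate comes from the question; the function name is my own):

```python
import math
from datetime import date

def accrued_leaves(start_date, today, leaves_taken, rate=2.5):
    """Available leave days: whole months worked * monthly accrual, minus days taken."""
    months = (today.year - start_date.year) * 12 + (today.month - start_date.month)
    return math.ceil(months * rate) - leaves_taken
```

For example, an employee who started on 15 Jan 2022 and has taken 5 days off has, on 15 Jan 2023, ceil(12 * 2.5) - 5 = 25 days available.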

Changing average price in Material Master (MM02) programmatically

I want to programmatically change the moving/average price(s) of materials for the following special case:
VPRSV = 'S' (Standard price)
MLMAA = 'X' (Material Ledger activated)
MLAST = '3' (Material Price Determination = '3' (Single-/Multilevel))
period = current
It has to work when there is already a material document for the given material in the current period. All other special cases that I need are solved.
I am searching for the function module equivalent of changing the moving average price using MM02, not MR21.
Maybe BAPI_MATVAL_PRICE_CHANGE is what I'm searching for?
What confuses me is that I cannot find a parameter that determines that I want to change the moving average price and not the standard price. Did I miss a parameter? If not, does it change the standard price or moving average price?
And I'm not sure whether this function module is the equivalent of MM02 or MR21.
No, there is no such function module. But you can use the BAPI BAPI_MATVAL_PRICE_CHANGE to post price differences to the Material Ledger. With this you can adjust the price to the value you want.
You should use BAPI_MATERIAL_SAVEDATA to do this. Several mandatory structures must be filled to update the average price successfully:
HEADDATA-MATERIAL     = P_MATNR. "material number
HEADDATA-ACCOUNT_VIEW = 'X'.

VALDATA-VAL_AREA  = P_BWKEY. "valuation area
VALDATA-VAL_TYPE  = P_BWTAR. "valuation type
VALDATA-MOVING_PR = P_STPRS. "new value of the moving price

VALDATAX-VAL_AREA  = P_BWKEY. "valuation area (key field)
VALDATAX-VAL_TYPE  = P_BWTAR. "valuation type (key field)
VALDATAX-MOVING_PR = 'X'.    "update indicator

CALL FUNCTION 'BAPI_MATERIAL_SAVEDATA'
  EXPORTING
    HEADDATA       = HEADDATA
    VALUATIONDATA  = VALDATA
    VALUATIONDATAX = VALDATAX
  IMPORTING
    RETURN = BAPI_RETURN
  TABLES
    MATERIALDESCRIPTION = INT_MAKT.

Mean time to Failure calculation in DAX

I am trying to calculate the mean time to failure for each asset in a job table. At the moment I calculate it as follows:
Previous ID = CALCULATE(MAX('JobTrackDB Job'[JobId]),FILTER('JobTrackDB Job','JobTrackDB Job'[AssetDescriptionID]=EARLIER('JobTrackDB Job'[AssetDescriptionID]) && 'JobTrackDB Job'[JobId]<EARLIER('JobTrackDB Job'[JobId])))
Then I bring back the last finish time for the current job when the JobStatus is 7 (closed);
Finish Time = CALCULATE(MAX('JobTrackDB JobDetail'[FinishTime]),'JobTrackDB JobDetail'[JobId],'JobTrackDB JobDetail'[JobStatus]=7)
Then I bring back the previous jobs finish time where the JobType is 1 (Response rather than comparing it to maintenance calls);
Previous Finish = CALCULATE(MAX('JobTrackDB Job'[Finish Time]),FILTER('JobTrackDB Job','JobTrackDB Job'[AssetDescriptionID]=EARLIER('JobTrackDB Job'[AssetDescriptionID]) && 'JobTrackDB Job'[Finish Time]<EARLIER('JobTrackDB Job'[Finish Time]) && EARLIER('JobTrackDB Job'[JobTypeID])=1))
Then I calculate the Time between failure where I also disregard erroneous values;
Time between failure = IF([Previous Finish]=BLANK(),BLANK(),IF('JobTrackDB Job'[Date Logged]-[Previous Finish]<0,BLANK(),'JobTrackDB Job'[Date Logged]-[Previous Finish]))
The issue is that the calculation sometimes uses previous maintenance jobs even though I specified JobTypeID = 1 in the filter. Also, the current calculation does not account for the time from the start of the records to the first job for each asset, nor from the last job until today. I am scratching my head trying to figure it out.
Any ideas?
Thanks,
Brent
Some base measures:
MaxJobID := MAX( Job[JobID] )
MaxLogDate := MAX ( Job[Date Logged] )
MaxFinishTime := MAX (JobDetail[Finish Time])
Intermediate calculations:
ClosedFinishTime := CALCULATE ( [MaxFinishTime], Job[Status] = 7 )
AssetPreviousJobID := CALCULATE (
    [MaxJobID],
    FILTER(
        ALLEXCEPT(Job, Job[AssetDescriptionID]),
        Job[JobId] < MAX(Job[JobID])
    )
)
PreviousFinishTime := CALCULATE ( [ClosedFinishTime],
    FILTER(
        ALLEXCEPT(Job, Job[AssetDescriptionID]),
        Job[JobId] < MAX(Job[JobID])
        && Job[JobType] = 1
    )
)
FailureTime := IF (
    ISBLANK([PreviousFinishTime]),
    0,
    [MaxLogDate] - [PreviousFinishTime]
)
This should at least get you started. If you want to set some sort of "first day", you can replace the 0 in the FailureTime with a calc like MaxLogDate - [OverallFirstDate], which could be a calculated measure or a constant.
For something that hasn't failed, you'd want to use an entirely different measure since this one is based on lookback only. Something like [Days Since Last Failure] which would just be (basically) TODAY() - [ClosedFinishTime]
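For intuition, what the lookback measures compute per asset boils down to averaging the gaps between consecutive failure finish times. A plain-Python sketch of that calculation (function name my own; not the DAX itself):

```python
from datetime import datetime

def mean_time_between_failures(finish_times):
    """Mean gap, in days, between consecutive failure finish times for one asset.
    Returns None when there are fewer than two failures to compare."""
    times = sorted(finish_times)
    gaps = [(b - a).total_seconds() / 86400 for a, b in zip(times, times[1:])]
    return sum(gaps) / len(gaps) if gaps else None
```

Note this has the same limitation discussed above: it only looks between failures, so the period before the first failure and after the last one need separate handling.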

How would I translate this SQL query into a Raven Map/Reduce query?

Following on from my previous question at When is a groupby query evaluated in RavenDB? I decided to completely restructure the data into a format that is theoretically easier to query on.
Having now created the new data structure, I am struggling to find how to query it.
It took me 30 seconds to write the following SQL query which gives me exactly the results I need:
SELECT GroupCompanyId, AccountCurrency, AccountName, DATEPART(year, Date) AS Year,
    (SELECT SUM(Debit)
     FROM Transactions AS T2
     WHERE T1.GroupCompanyId = GroupCompanyId
       AND T1.AccountCurrency = AccountCurrency
       AND T1.AccountName = AccountName
       AND DATEPART(year, Date) < DATEPART(year, T1.Date)) AS OpeningDebits,
    (SELECT SUM(Credit)
     FROM Transactions AS T2
     WHERE T1.GroupCompanyId = GroupCompanyId
       AND T1.AccountCurrency = AccountCurrency
       AND T1.AccountName = AccountName
       AND DATEPART(year, Date) < DATEPART(year, T1.Date)) AS OpeningCredits,
    SUM(Debit) AS Db, SUM(Credit) AS Cr
FROM Transactions AS T1
WHERE DATEPART(year, Date) = 2011
GROUP BY GroupCompanyId, AccountCurrency, AccountName, DATEPART(year, Date)
ORDER BY GroupCompanyId, AccountCurrency, Year, AccountName
So far I have got the Map/Reduce as follows, which from Studio appears to give the correct results - i.e. it breaks down and groups the data by date.
public Transactions_ByDailyBalance()
{
    Map = transactions => from transaction in transactions
                          select new
                          {
                              transaction.GroupCompanyId,
                              transaction.AccountCurrency,
                              transaction.Account.Category,
                              transaction.Account.GroupType,
                              transaction.AccountId,
                              transaction.AccountName,
                              transaction.Date,
                              transaction.Debit,
                              transaction.Credit,
                          };
    Reduce = results => from result in results
                        group result by new
                        {
                            result.GroupCompanyId,
                            result.AccountCurrency,
                            result.Category,
                            result.GroupType,
                            result.AccountId,
                            result.AccountName,
                            result.Date,
                        }
                        into g
                        select new
                        {
                            GroupCompanyId = g.Select(x => x.GroupCompanyId).FirstOrDefault(),
                            AccountCurrency = g.Select(x => x.AccountCurrency).FirstOrDefault(),
                            Category = g.Select(x => x.Category).FirstOrDefault(),
                            GroupType = g.Select(x => x.GroupType).FirstOrDefault(),
                            AccountId = g.Select(x => x.AccountId).FirstOrDefault(),
                            AccountName = g.Select(x => x.AccountName).FirstOrDefault(),
                            Date = g.Select(x => x.Date).FirstOrDefault(),
                            Debit = g.Sum(x => x.Debit),
                            Credit = g.Sum(x => x.Credit)
                        };

    Index(x => x.GroupCompanyId, FieldIndexing.Analyzed);
    Index(x => x.AccountCurrency, FieldIndexing.Analyzed);
    Index(x => x.Category, FieldIndexing.Analyzed);
    Index(x => x.AccountId, FieldIndexing.Analyzed);
    Index(x => x.AccountName, FieldIndexing.Analyzed);
    Index(x => x.Date, FieldIndexing.Analyzed);
}
However, I can't work out how to query the data at one go.
I need the opening balance as well as the period balance, so I ended up writing this query, which takes the account as a parameter. Following on from Oren's comments on my previous question (that I was mixing Linq with a Lucene query), I rewrote the query, but have basically ended up with a mixed query again.
Even though I am showing in the SQL query above that I am filtering by year, in fact I need to be able to determine the current balance from any day.
private LedgerBalanceDto GetAccountBalance(BaseAccountCode account, DateTime periodFrom, DateTime periodTo, string queryName)
{
    using (var session = MvcApplication.RavenSession)
    {
        var query = session.Query<Transactions_ByDailyBalance.Result, Transactions_ByDailyBalance>()
            .Where(c => c.AccountId == account.Id && c.Date >= periodFrom && c.Date <= periodTo)
            .OrderBy(c => c.Date)
            .ToList();
        var debits = query.Sum(c => c.Debit);
        var credits = query.Sum(c => c.Credit);
        var ledgerBalanceDto = new LedgerBalanceDto
        {
            Account = account,
            Credits = credits,
            Debits = debits,
            Currency = account.Currency,
            CurrencySymbol = account.CurrencySymbol,
            Name = queryName,
            PeriodFrom = periodFrom,
            PeriodTo = periodTo
        };
        return ledgerBalanceDto;
    }
}
Required result:
GroupCompanyId AccountCurrency AccountName Year OpeningDebits OpeningCredits Db Cr
Groupcompanies-2 EUR Customer 1 2011 148584.2393 125869.91 10297.6891 28023.98
Groupcompanies-2 EUR Customer 2 2011 236818.0054 233671.55 50959.85 54323.38
Groupcompanies-2 USD Customer 3 2011 69426.11761 23516.3776 10626.75 0
Groupcompanies-2 USD Customer 4 2011 530587.9223 474960.51 97463.544 131497.16
Groupcompanies-2 USD Customer 5 2011 29542.391 28850.19 4023.688 4231.388
Any suggestions would be greatly appreciated
Jeremy
In answer to the comment:
I basically ended up doing pretty much the same thing. Actually, I wrote an index that does it in only two hits: once for the opening balance and again for the period balance. This is almost instantaneous when grouping by account name, category, etc.
However, my problem now is getting a daily running balance for an individual account. If I bring down all the data for the account and the period, it's not a problem; I can sum the balance on the client. However, when the data is paged and the debits and credits are grouped by Date and Id, the paging cuts across dates, so the opening/closing balance is not correct.
Page 1
Opening balance until 26/7/12 = 0
25/7/12 Acct1 Db 100 Cr 0 Bal +100 Runn Bal +100
26/7/12 Acct1 Db 100 Cr 0 Bal +100 Runn Bal +200
26/7/12 Acct1 Db 200 Cr 0 Bal +200 Runn Bal +400
Closing balance until 26/7/12 = +400
Page 2
Opening balance until 26/7/12 = +450 (this is wrong - it should be the balance at the end of Page 1, but it is the balance until the 26/7/12 - i.e. includes the first item on Page 2)
26/7/12 Acct1 Db 50 Cr 0 Bal +50 Runn Bal +500 (should be +450)
27/7/12 Acct1 Db 60 Cr 0 Bal +60 Runn Bal +560 (should be +510)
I just can't think up an algorithm to handle this.
Any ideas?
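(For illustration, one way to keep the page boundaries consistent is to order rows by a stable (date, id) key, compute the running balance once over that ordered stream, and take each page's opening balance from wherever the previous page left off, rather than re-deriving it from dates. A plain-Python sketch with hypothetical data shapes, not RavenDB code:)

```python
def paged_running_balance(rows, page_size):
    """rows: iterable of (date, id, amount) tuples.
    Returns a list of pages; each page carries the correct opening balance and
    rows of (date, id, amount, running_balance)."""
    ordered = sorted(rows, key=lambda r: (r[0], r[1]))  # stable tiebreak on id
    pages, balance = [], 0.0
    for i in range(0, len(ordered), page_size):
        opening = balance
        page = []
        for d, rid, amt in ordered[i:i + page_size]:
            balance += amt
            page.append((d, rid, amt, balance))
        pages.append({"opening": opening, "rows": page})
    return pages
```

Replaying the example above (100, 100, 200 on page 1; 50, 60 on page 2) gives page 2 an opening balance of 400 and running balances of 450 and 510, matching the "should be" values.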
Hi, this is a problem I have also faced recently with RavenDB, when I needed to retrieve rolling balances as at any imaginable date. I never found a way of doing it all in one go, but I managed to reduce the number of documents I needed to pull back in order to calculate the rolling balance.
I did this by writing multiple map/reduce indexes that summed up the value of transactions within specific periods:
My first index summed up the value of all transactions grouped at the year level.
My second index summed up the value of all transactions at the day level.
So if someone wanted their account balance as at 1st June 2012, I would:
Use the year-level map/reduce index to get the value of transactions for the years up to 2012 and sum them together (so if transactions started being captured in 2009, I should be pulling back 3 documents).
Use the day-level map/reduce index to get all documents from the start of the year to the 1st of June.
I then added the day totals to the year totals for my final rolling balance (I could also have had a monthly map/reduce index, but didn't bother).
Anyway, not as quick as in SQL, but it was the best alternative I could come up with to avoid bringing back every single transaction.
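The two-level scheme above is easy to express in plain Python, with in-memory dicts standing in for the year-level and day-level map/reduce indexes (a sketch, not RavenDB API code):

```python
from collections import defaultdict
from datetime import date

def rolling_balance(transactions, as_at):
    """Balance as at a date: prior-year totals plus day totals within the
    current year up to and including as_at.
    transactions: iterable of (date, amount) pairs."""
    year_totals = defaultdict(float)   # stands in for the year-level index
    day_totals = defaultdict(float)    # stands in for the day-level index
    for d, amount in transactions:
        year_totals[d.year] += amount
        day_totals[d] += amount
    prior_years = sum(v for y, v in year_totals.items() if y < as_at.year)
    this_year = sum(v for d, v in day_totals.items()
                    if d.year == as_at.year and d <= as_at)
    return prior_years + this_year
```

The point of the design is the document count: one document per prior year plus one per active day of the current year, instead of one per transaction.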