Goods receipt creation issue - ABAP

I have a problem:
I create goods receipts using transaction MB1C, filling in Document Date, Posting Date, Movement Type, Plant, Storage Location ... I press New Item and fill in Material, Quantity, Storage Type, Batch, Storage Bin and Manufacture Date ...
Then I can display the created batch in MSC3N... If I open the Classification tab and press Set Classification Status, the status is 'Incomplete'.
I do the steps described above with a Batch Input Session as well ... exactly the same steps ... But when I open the Classification Status, the status is 'Released'.
My question is: why is this happening? I use the same values in both cases; nothing is different!
Thanks.

Technically speaking, there are many reasons why a batch input might behave differently from a manual input, so you should run several tests until you discover the reason.
Please refer to this table (on SCN), which lists the possible reasons and solutions.
Note that this table is generic and is not limited to one transaction code (MB1C in your case).
PS: it's not clear what exactly you did: you talk about running a batch input session and checking the "SY structure" (I guess you mean the return code SY-SUBRC). But a "session" is a recording executed via transaction SM35, and the only way to run it is by calling a report, so you don't get SY-SUBRC back. Therefore, I guess you are not running a session but performing the batch input with the ABAP statement CALL TRANSACTION ... USING (CTU).
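For reference, a minimal CTU skeleton looks like this (just a sketch; it assumes lt_bdcdata has already been filled with the recorded MB1C screens and fields). With this form you do get SY-SUBRC back, plus the collected messages:
DATA: lt_bdcdata  TYPE TABLE OF bdcdata,    " recorded screens/fields
      lt_messages TYPE TABLE OF bdcmsgcoll, " messages raised during CTU
      ls_options  TYPE ctu_params.

" ... fill lt_bdcdata from your recording ...

ls_options-dismode = 'N'. " no screen display
ls_options-updmode = 'S'. " synchronous update

CALL TRANSACTION 'MB1C' USING lt_bdcdata
                        OPTIONS FROM ls_options
                        MESSAGES INTO lt_messages.

IF sy-subrc <> 0.
  " inspect lt_messages to see what went wrong
ENDIF.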

Related

BAPI_GOODSMVT_CREATE with multiple material numbers and same PP order?

As far as I know, when you call BAPI_GOODSMVT_CREATE concurrently (in a loop, or just by coincidence) with the same material number, you get a locked-object error (Material XXXX is locked by USER YYYY).
But, as far as I know, calling BAPI_GOODSMVT_CREATE concurrently with different material numbers for the same production order produces no error.
Issue
Recently I hit error M3/897 (Plant Data of Material XXXX is locked by user XXXX) when calling BAPI_GOODSMVT_CREATE to post goods issues (GI) for a production order via parallel processing, posting different material numbers to the same production order.
Question
So, I'm asking about the constraints of BAPI_GOODSMVT_CREATE.
What I know so far:
A. You can't post GI for a production order (movement type 261) at the same time when you're posting the same material number to different production orders.
B. (I'm not sure about this.) You can't post GI for a production order (movement type 261) at the same time when you're posting different material numbers to the same production order.
Are both right, or just A? Any help from an experienced ABAPer or MM consultant would be appreciated!
To post GI in a loop you need to commit after each run and unlock the objects explicitly, otherwise you will get the PP lock.
Try it like this:
LOOP AT lt_orders ASSIGNING <fs>.
  ...
  CALL FUNCTION 'BAPI_GOODSMVT_CREATE'
    EXPORTING
      goodsmvt_header  = ls_header
      goodsmvt_code    = ls_code
    IMPORTING
      goodsmvt_headret = ls_headret
      materialdocument = ls_retmtd
    TABLES
      goodsmvt_item    = lt_item
      return           = lt_return.

  IF line_exists( lt_return[ type = 'E' ] ).
    CALL FUNCTION 'BAPI_TRANSACTION_ROLLBACK'.
  ELSE.
    COMMIT WORK AND WAIT.
    CALL FUNCTION 'DEQUEUE_ALL'.
  ENDIF.
ENDLOOP.
Always call BAPI_TRANSACTION_COMMIT with the WAIT parameter, or COMMIT WORK AND WAIT, after each BAPI call.
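For completeness, the BAPI equivalent of COMMIT WORK AND WAIT is:
CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
  EXPORTING
    wait = abap_true. " wait until the update task has finished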
Also, there can be tricky issues with GR and implicit GI movements; see SAP Note 369518 about this.
You can check for existing locks at runtime using the function module ENQUE_READ2:
DATA: raw_enq TYPE locksedx_enq_tab, " lock entries
      subrc   TYPE sy-subrc,
      number  TYPE i.                " number of lock entries found

CLEAR: raw_enq, subrc, number.

CALL FUNCTION 'ENQUE_READ2'
  IMPORTING
    subrc  = subrc
    number = number
  TABLES
    enq    = raw_enq.
But if you need to prevent goods movement failures in general, you should instead implement reprocessing logic that stores the errors.
The steps would be: catch the errors --> store the BAPI information or the header document number --> retry later.
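A rough sketch of the storing step, continuing the loop shown earlier (the retry table ZMM_GMVT_RETRY and its fields are hypothetical; design your own):
" ZMM_GMVT_RETRY is a hypothetical Z-table: order, material, error text.
IF line_exists( lt_return[ type = 'E' ] ).
  DATA(ls_retry) = VALUE zmm_gmvt_retry( aufnr = <fs>-aufnr
                                         matnr = <fs>-matnr
                                         msgtx = lt_return[ type = 'E' ]-message ).
  INSERT zmm_gmvt_retry FROM ls_retry.
  COMMIT WORK.
ENDIF.
A background job can then read this table periodically and call the BAPI again for the stored entries.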

Global / Line Discount for Sale/Purchase/Account

[Odoo 14 Community Edition]
I need to add custom Global and Line Discounts (amount & percentage) to Sale / Purchase / Account.
I have done the Sale and Purchase parts: it was just adding fields and a bit of logic here and there, and sending the data to Account (account.move) via the prepare_invoice methods.
Now here's the issue I am facing -- the Account part. I am utterly confused about where I should make my modifications. After trying to understand and trace the flow of the standard invoicing/billing, I am at a loss. There are too many functions/variables for me, someone who doesn't understand accounting, to get the whole idea out of it.
I did add the discount fields that I need. However, the standard calculation of prices / taxes / credit / debit gets messed up when I inherit some of the methods that I think I should modify. I ended up with incorrect taxes, unbalanced credit/debit, and an incorrect total amount.
I tried editing here and there (by inheriting, of course; I can still roll back everything I did).
The point is that I need precise suggestions and/or directions on which functions/methods I should inherit, just enough to make the discounts possible. I have two Line Discount fields (amount and percent) and two Global Discount fields (also amount and percent). The calculation between them is not the point of interest here. The only problem at this point is integrating these fields (from SO, PO, and manually created documents) into all the calculations in Invoicing/Billing.
Here are the fields:
account.move
global_discount_amount = fields.Float(string='Global Discount Amount', compute='compute_global_discount_amount', store=True)
global_discount_percent = fields.Float(string='Global Discount Percent', compute='compute_global_discount_percent', store=True)
account.move.line
discount_line_amount = fields.Float(string='Disc. Line Amount', compute='compute_discount_line_amount', store=True)
discount_line_percent = fields.Float(string='Disc. Line %', compute='compute_discount_line_percent', store=True)
Right now, I am messing with some methods such as (a few examples):
account.move
_recompute_tax_lines
account.move.line
create
_get_fields_onchange_balance_model
_get_price_total_and_subtotal_model
_onchange_price_subtotal
Most of the modifications were made by copying the whole method from the standard module into my new model (which inherits the standard model) and editing some code there -- overriding the standard code according to my understanding.
A computed field is recomputed either when one of the fields it depends on changes, or every time the form/list view loads.
In your case, check what compute_global_discount_amount and compute_global_discount_percent depend on.
For easier development/troubleshooting, remove any @api.depends() declarations from the compute methods, and temporarily remove the store=True attribute as well. This will help you narrow down the issue; make sure you get the correct numbers first.
Once you do, add the dependencies back.
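In other words, while debugging, the field and its compute method would temporarily look something like this (a sketch; the discount logic itself is up to you):
# Temporary debugging version: no @api.depends and no store=True,
# so the field is recomputed on every read and never persisted.
global_discount_amount = fields.Float(
    string='Global Discount Amount',
    compute='compute_global_discount_amount',
)

def compute_global_discount_amount(self):
    for record in self:
        # your discount logic here; 0.0 is just a placeholder
        record.global_discount_amount = 0.0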
Here is an example of a method override (Odoo 14 CE) that is executed when the amounts are computed:
@api.depends(
    'line_ids.matched_debit_ids.debit_move_id.move_id.payment_id.is_matched',
    'line_ids.matched_debit_ids.debit_move_id.move_id.line_ids.amount_residual',
    'line_ids.matched_debit_ids.debit_move_id.move_id.line_ids.amount_residual_currency',
    'line_ids.matched_credit_ids.credit_move_id.move_id.payment_id.is_matched',
    'line_ids.matched_credit_ids.credit_move_id.move_id.line_ids.amount_residual',
    'line_ids.matched_credit_ids.credit_move_id.move_id.line_ids.amount_residual_currency',
    'line_ids.debit',
    'line_ids.credit',
    'line_ids.currency_id',
    'line_ids.amount_currency',
    'line_ids.amount_residual',
    'line_ids.amount_residual_currency',
    'line_ids.payment_id.state',
    'line_ids.full_reconcile_id')
def _compute_amount(self):
    super()._compute_amount()
    for record in self:
        record.compute_global_discount_amount()
        record.compute_global_discount_percent()

def compute_global_discount_amount(self):
    for record in self:
        # Execute your logic for computing global_discount_amount here
        record.global_discount_amount = 0.0  # placeholder
Have a look at the apply_discount function in an inherited class of sale.order (this example uses the old, pre-v8 API):
def apply_discount(self, cr, uid, ids, discount_rate):
    cur_obj = self.pool.get('res.currency')
    res = {}
    line_obj = self.pool.get('sale.order.line')
    for order in self.browse(cr, uid, ids, context=None):
        for line in order.order_line:
            line_obj.write(cr, uid, [line.id], {'discount': discount_rate}, context=None)
    return res
A new column was added to the inherited subclass of sale.order:
'discount_rate': fields.float('Discount rate'),
Then the new discount field was placed on an inherited sale.order view, and an on_change event was fired when its value changed, passing the field's value to the handler.
This way you can apply the discount sequentially to the rows of the order without altering the normal process.
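For illustration, the view wiring could look roughly like this (an assumption based on the description above; old-API on_change attribute):
<field name="discount_rate" on_change="apply_discount(discount_rate)"/>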
First of all, please pardon my English. Now let's get to the core: I guess this is exactly the task that I was once assigned. Luckily I was not alone; I was guided by my senior, who showed me the way to achieve a Global Discount for Invoices and Bills.
Looking at your list of methods, you were already on the right path!
So now, let me help you further... as much as I can.
In my case, I didn't add any new Global Discount field to the account.move.line model, even though the Global Discount fields do exist on Sale Order Line and Purchase Order Line.
All in all, here are the methods that need to be customized:
_onchange_invoice_line_ids
_compute_amount
_recompute_payment_terms_lines
_recompute_tax_lines
I heavily modified the 3rd and 4th methods. However, I think I still have some bugs, so don't hesitate to tell me.

BAPI/FM to search prod orders confirmations by workcenter and date?

I'm trying to figure out which BAPI/FM I could use to find confirmed quantities by date (+ time, if possible) and by the work center where the confirmation was posted.
I would use BAPI_PRODORDCONF_GETDETAIL, which contains this information, but according to the BAPI guide I can only pass in the confirmation number + confirmation counter.
Therefore, the option would be to run BAPI_PRODORDCONF_GETLIST (where I can only input a production order range or a confirmation number range), filter the result for the work center and date I need, pick the confirmation number + counter from those rows, and run them through BAPI_PRODORDCONF_GETDETAIL.
But this procedure of fetching everything without server-side filtering is extremely time-consuming, and outside the SAP GUI I run into a timeout error... Therefore, I need a BAPI/FM that accepts the work center and the date as input and returns data that is already filtered...
Any ideas how to do that?
As far as I know there is no such standard FM, so your only choice is custom development.
I would suggest transaction MCPK, where this info is exposed in a handy form, but since your requirement is to retrieve the info externally, it is not appropriate for you.
The confirmations reside in table AFRU, and the work centers in CRHD, so to find confirmed quantities by work center you need to join these tables, or use the view U_15673 where this info is already linked:
TYPES: BEGIN OF prod_orders,
         rueck TYPE afru-rueck, " confirmation number
         rmzhl TYPE afru-rmzhl, " confirmation counter
         gmnga TYPE afru-gmnga, " confirmed quantity
         arbpl TYPE crhd-arbpl, " work center
       END OF prod_orders.

DATA: orders TYPE TABLE OF prod_orders.

SELECT *
  FROM u_15673
  INTO CORRESPONDING FIELDS OF TABLE orders
  WHERE isdd >= '20180101' AND isdz <= '163000'.
To pull this externally, you must create an RFC-enabled FM, or use RFC_READ_TABLE and fetch this view with parameters; here is a sample.
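A sketch of the RFC_READ_TABLE call (normally issued from the external system over RFC; shown here in ABAP form, and the field names in the OPTIONS line are assumptions about the view):
DATA: lt_options TYPE TABLE OF rfc_db_opt, " WHERE clause, passed line by line
      lt_fields  TYPE TABLE OF rfc_db_fld, " columns to return (empty = all)
      lt_data    TYPE TABLE OF tab512.     " result rows as delimited strings

APPEND VALUE #( text = `ISDD GE '20180101' AND ARBPL EQ '10400001'` ) TO lt_options.

CALL FUNCTION 'RFC_READ_TABLE'
  EXPORTING
    query_table = 'U_15673'
    delimiter   = '|'
  TABLES
    options     = lt_options
    fields      = lt_fields
    data        = lt_data.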
Another approach is to use RFC_ABAP_INSTALL_AND_RUN. You create an ABAP program that uses WRITE to output the results as a standard list.
Send the lines of this program in the PROGRAM parameter of RFC_ABAP_INSTALL_AND_RUN; the code is executed on the remote system, and the FM returns the screen output in the table WRITES.
A possible sample, based on transaction MCPK, to send to RFC_ABAP_INSTALL_AND_RUN:
DATA: li_selection  TYPE TABLE OF rsparams,
      lwa_selection TYPE rsparams.

CLEAR lwa_selection.
lwa_selection-selname = 'SL_SPTAG'.
lwa_selection-sign    = 'I'.
lwa_selection-option  = 'BT'.
lwa_selection-low     = '20180101'.
lwa_selection-high    = '20201231'.
APPEND lwa_selection TO li_selection.

CLEAR lwa_selection.
lwa_selection-selname = 'SL_ARBPL'.
lwa_selection-sign    = 'I'.
lwa_selection-option  = 'EQ'.
lwa_selection-low     = '10400001'.
APPEND lwa_selection TO li_selection.

SUBMIT rmcf0200 WITH SELECTION-TABLE li_selection
                WITH par_stat = abap_true
                EXPORTING LIST TO MEMORY
                AND RETURN.

DATA: xlist TYPE TABLE OF abaplist,
      xtext TYPE TABLE OF char200.

CALL FUNCTION 'LIST_FROM_MEMORY'
  TABLES
    listobject = xlist.

CALL FUNCTION 'LIST_TO_TXT'
  EXPORTING
    list_index = -1
  TABLES
    listtxt    = xtext
    listobject = xlist.

IF sy-subrc = 0.
  LOOP AT xtext ASSIGNING FIELD-SYMBOL(<text>).
    WRITE / <text>.
  ENDLOOP.
ENDIF.
However, this approach is not flexible, because the standard MCPK layout is a bit different from what you want and is not easy to adjust programmatically.
Because of that, I recommend sticking to the RFC_READ_TABLE approach.

Rails: show a different object every day

I want to match my user to a different user in his/her community every day. Currently, I use code like this:
@matched_user = User.near(@user).order("RANDOM()").first
But I want to have a different @matched_user on a daily basis. I haven't been able to find anything on Stack Overflow or in the APIs that has given me insight into how to do it. I feel it should be simpler than resorting to a rake task with cron. (I'm on Postgres.)
Whenever I find myself hankering for shared 'memory' or transient state, I think to myself "this is what (distributed) caches were invented for".
@matched_user = Rails.cache.fetch(@user.cache_key + '/daily_match', expires_in: 1.day) {
  User.near(@user).order("RANDOM()").first
}
NOTE: while specifying a TTL for a cache entry tells Rails/the cache system to try to keep that value for the given timeframe, there's NO guarantee that it will. In particular, a cache that aggressively reclaims memory may expire an entry well before its desired expires_in time.
For this particular use case it shouldn't be a big deal, but in cases where the business/domain logic demands durable, periodically generated values, you really have to persist them in your database.
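For example, a durable variant could persist the match in its own table (a sketch; the DailyMatch model and its user_id, matched_user_id and date columns are illustrative):
class DailyMatch < ApplicationRecord
  belongs_to :user
  belongs_to :matched_user, class_name: 'User'
end

# Reuse today's match if it exists; otherwise pick one and persist it.
@matched_user = DailyMatch.find_or_create_by(user: @user, date: Date.current) do |match|
  match.matched_user = User.near(@user).order("RANDOM()").first
end.matched_user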
How about using PostgreSQL's SETSEED function? I used the date as the seed so that the seed changes every day but stays consistent within a day:
User.connection.execute "SELECT SETSEED(#{Date.today.strftime("%y%d%m").to_i/1000000.0})"
@matched_user = User.near(@user).order("RANDOM()").first
You may want to reseed with a random value afterwards so that any future calls to RANDOM() aren't biased:
random = User.connection.execute("SELECT RANDOM()").to_a.first["random"]
# Same code as above:
User.connection.execute "SELECT SETSEED(#{Date.today.strftime("%y%d%m").to_i/1000000.0})"
@matched_user = User.near(@user).order("RANDOM()").first
# Use the random value fetched before seeding to make a new seed:
User.connection.execute "SELECT SETSEED(#{random})"
I have split these steps into separate sections just for readability; you can optimize the query later.
1) Find all user records created before this morning, so that the count stays frozen for the day.
usrs_till_today_morning = User.where("created_at < ?", DateTime.now.in_time_zone(Time.zone).beginning_of_day)
2) Pluck all IDs.
user_ids = usrs_till_today_morning.pluck(:id)
3) Get today's day of the month. It will be in the range (1..31), but it remains constant throughout the day.
day_today = Time.now.day
4) Select the same ID for the whole day:
todays_user_id = user_ids[day_today % user_ids.count]
@matched_user = User.find(todays_user_id)
So it gives you a pseudo-random user record while keeping the same record throughout the day!

Can I use the power of Generics to solve my issue

I have a weird issue. I am loading 1,000 invoice objects, header first and then details, in my DAL. I am using VB.NET on this project. I can load the invoice headers just fine, but when I load the details for each invoice I get a timeout from SQL Server. I increased the timeout to 5 minutes, but it's still the same. If I reduce the invoice count to 200, it works fine.
Here is what I am doing:
' I already loaded the invoice headers. I am now iterating each invoice to get its details.
For Each invoice As Invoice In invoices
    drInvoiceItems = DBSqlHelperFactory.ExecuteReader(CONNECTION_STRING, CommandType.StoredProcedure, "dbo.getinvoiceitem", _
        New SqlParameter("@invoicenumber", invoice.InvoiceNumber))
    While drInvoiceItems.Read()
        invoice.LineItems.Add(New InvoiceLine(drInvoiceItems("id"), drInvoiceItems("inv_id"), drInvoiceItems("prodid"), drInvoiceItems("name"), drInvoiceItems("barcode"), drInvoiceItems("quantity"), drInvoiceItems("costprice")))
    End While
Next
Return invoices
I am aware that I am firing 1,000 queries at the DB because of the iteration. Can't I load all the line items with one SELECT statement and then do something like this?
For Each invoice As Invoice In invoices
    invoice.Items.Add(invoiceItems.Find(Function(i As InvoiceItem) i.InvoiceNumber = invoice.InvoiceNumber))
Next
I get the error below when using the lambda function above:
Error 1 Value of type 'System.Collections.Generic.List(Of BizComm.InvoiceLine)' cannot be converted to 'BizComm.InvoiceLine'. C:\Projects\BizComm\InvoiceDAL.vb 75 35 BizComm
One thing I have done when iterating through items in the past is to use the same Connection object for all the necessary read activity. It seems to greatly enhance performance.
I'd also look at the database to see whether the dbo.getinvoiceitem procedure can be improved, or whether another procedure can be written that returns all the line items for a group of invoices (perhaps by date or by customer/vendor) rather than for one header at a time. Then you can iterate over the invoice collection more effectively and add the lines to the headers.
You should also check whether there is an effective index on the column that the @invoicenumber parameter references.
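For example, assuming the line items live in a table like dbo.InvoiceItems with an InvoiceNumber column (names are illustrative), the index would be:
-- Supports lookups by invoice number in dbo.getinvoiceitem
CREATE NONCLUSTERED INDEX IX_InvoiceItems_InvoiceNumber
    ON dbo.InvoiceItems (InvoiceNumber);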
From your code, it looks like you are not closing the connections and data readers. See if you can place your connections and data readers in a Using statement:
Using con As New SqlConnection(connectionString)
....
End Using
The DBSqlHelperFactory opens a connection, but it can't close it, since the connection is still needed after it returns. I'd modify the code so that you open one connection yourself and pass it to DBSqlHelperFactory as a parameter.
To quickly pick up these issues, I always debug with:
Max Pool Size=1;
added to the end of the connection string. That will quickly throw an error any time you forget to close a connection.
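For example (the server and database names are placeholders):
Dim connectionString As String = _
    "Server=myServer;Database=myDb;Integrated Security=True;Max Pool Size=1;"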
Why load the InvoiceItems beforehand? Can't you load them on demand?
I.e., when you need the items, call a method on the Invoice instance (myInvoice.GetItems).
EDIT: It would help to understand the full picture of what you are trying to do.
Is it really required to load all the invoices as well?
Why not select all the line items for all the invoices you need in a single query, and then split the results up into multiple invoice objects?
Re: how do I map between collections?
One implementation could be: create the 1,000 anemic Invoice objects and put them in a Dictionary keyed by Id. Then, when you select the line items, include the invoice id in the result set, look up the anemic invoice, and add the line to it. A sketch follows.
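A rough sketch of that, reusing the identifiers from the question (the single stored procedure dbo.getinvoiceitems, returning the line items for all invoices in one result set, is an assumption):
' Index the anemic invoices by their number.
Dim invoicesById As New Dictionary(Of Integer, Invoice)
For Each invoice As Invoice In invoices
    invoicesById(invoice.InvoiceNumber) = invoice
Next

' One round trip: fetch every line item, then route each row to its invoice.
drInvoiceItems = DBSqlHelperFactory.ExecuteReader(CONNECTION_STRING, CommandType.StoredProcedure, "dbo.getinvoiceitems")
While drInvoiceItems.Read()
    Dim invoiceNumber As Integer = CInt(drInvoiceItems("inv_id"))
    invoicesById(invoiceNumber).LineItems.Add( _
        New InvoiceLine(drInvoiceItems("id"), drInvoiceItems("inv_id"), drInvoiceItems("prodid"), _
                        drInvoiceItems("name"), drInvoiceItems("barcode"), drInvoiceItems("quantity"), _
                        drInvoiceItems("costprice")))
End While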