Shipment number XXX is currently being processed - abap

I have the following scenario:
With a Z report I create a shipment number (BAPI_SHIPMENT_CREATE). My report displays an ALV. If I double-click on the created shipment number, I go to VT03N. There I try to "Change" the shipment (Shipment -> Change). But then I get the error message that the shipment is locked by my own user.
To work around this, I tried to unlock the shipment before calling VT03N with DEQUEUE_EVVTTKE. Now I have another problem: "The delivery number YYYY is locked by me". That delivery number is linked to this shipment. I tried the analogous call DEQUEUE_EVVBLKE, but the error message persists.
Can anyone help me? Thanks.

I used the function module LEINT_DELIVERY_UNLOCK to unlock the delivery. It worked.

Related

GA4: total revenue is 0 and Monetization screen not showing data

I'm trying to integrate ecommerce tracking on a website. The events are being captured and I'm sending all the data.
This is the data I'm sending:
{"event":"purchase","currency":"EUR","value":21.85,"items":[{"item_id":"3cd937-debc-416d-955f-8ccc84a751","item_name":"anuy-namer","affiliation":"","coupon":"","currency":"EUR","discount":0,"index":1,"item_brand":"Fontastic","item_category":"THeadset / Lautsprecher","item_category2":"Unterhaltungk - Körer","item_category3":"","item_category4":"","item_category5":"","item_list_id":"31baa491-4c6f-8671-c808f0cb2100","item_list_name":"Fontastic BT In-Ear Headset S1 blau Bluetooth-Kopfhörer","item_variant":"Mit Begleiterliche Musikfans.","location_id":"","price":14.95,"quantity":1}],"transaction_id":"9826314a-46af-4304-a057-dc77f4a799b0","affiliation":"446c6345-9193-4841-9f4a-e06e8cf7220e","tax":"3.4917","shipping":"6.9000","coupon":"","gtm.uniqueEventId":12}
To be honest, I'm not sure whether the data is wrong, but the Monetization reports only show the number of purchasers and nothing else; it's all zeros. In the Conversions screen the total revenue is not calculated.
Check if you have been given permission to see revenue. When someone gives you access, there are several options including one to hide revenue.
It turned out that the setup wasn't correct. I needed to add a custom trigger for the events so they could be captured.
I also found a problem with the structure of the object: currency, value and items needed to be nested inside an ecommerce object.
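For illustration, here is a minimal sketch of the corrected push, assuming a GTM setup where a GA4 event tag is fired by a Custom Event trigger on "purchase" and reads the e-commerce data from the dataLayer; the field values are taken from the example above, everything else is an assumption rather than a verified configuration:

// Sketch: the key change is that currency, value and items are nested
// inside an `ecommerce` object instead of sitting at the top level.
const w = window as any;
w.dataLayer = w.dataLayer || [];
w.dataLayer.push({ ecommerce: null }); // clear any previous ecommerce object
w.dataLayer.push({
  event: "purchase", // must match the Custom Event trigger name in GTM
  ecommerce: {
    transaction_id: "9826314a-46af-4304-a057-dc77f4a799b0",
    currency: "EUR",
    value: 21.85, // numbers, not strings
    tax: 3.4917,
    shipping: 6.9,
    items: [
      {
        item_id: "3cd937-debc-416d-955f-8ccc84a751",
        item_name: "anuy-namer",
        price: 14.95,
        quantity: 1,
      },
    ],
  },
});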

Binance futures: how to close a position at market price (the same as the "Market" button under "Close positions") via API

This question is about the Binance FUTURES API (not the spot exchange, but futures).
GOAL: have the same behaviour as the "Market" button under "Close positions", which closes a position.
NOTE: please don't reply with the endpoint DELETE /fapi/v1/allOpenOrders; that only CANCELS orders that are NOT FILLED, it does not close opened positions.
I want to CLOSE an actual OPENED position.
(Don't forget that the Buy/Long and Sell/Short buttons open positions.) In futures, a sell is not the same as a sell in spot trading; in futures, a sell actually opens a position. To take profit you have to close (not cancel) the position.
I searched all over the forums and it's very hard to find a correct working answer to this.
** I can OPEN a position at market price with this:
symbol=BTCUSDT&side=SELL&positionSide=SHORT&type=MARKET&quantity=0.01
** But when I try to CLOSE it, I always get an error, no matter what parameters I try:
symbol=BTCUSDT&side=SELL&type=STOP_MARKET&closePosition=true
I get Stop price less than zero.
symbol=BTCUSDT&side=SELL&type=STOP_MARKET&closePosition=true&stopPrice=30895.00
I get Order’s position side does not match user’s setting.
symbol=BTCUSDT&side=SELL&type=STOP&closePosition=true&stopPrice=30895.00
ProfitTarget strategy invalid for orderType STOP
symbol=BTCUSDT&side=SELL&quantity=0.01&type=MARKET
Order's position side does not match user's setting.
symbol=BTCUSDT&side=SELL&type=MARKET&closePosition=true
Target strategy invalid for orderType MARKET,closePosition true
symbol=BTCUSDT&side=SELL&type=STOP&closePosition=true
Target strategy invalid for orderType STOP,closePosition true
symbol=BTCUSDT&side=SELL&type=STOP_MARKET&closePosition=true
Stop price less than zero.
symbol=BTCUSDT&side=SELL&type=STOP_MARKET&closePosition=true&stopPrice=30158.30
Order's position side does not match user's setting.
symbol=BTCUSDT&side=SELL&type=TAKE_PROFIT_MARKET&closePosition=true&stopPrice=30131.30
Order's position side does not match user's setting.
symbol=BTCUSDT&side=SELL&positionSide=SHORT&type=TAKE_PROFIT_MARKET&closePosition=true&stopPrice=30271.60
Combination of optional parameters invalid.
What's wrong, or what parameter am I missing?
It's a bit frustrating...
Does anyone know the correct parameters?
I had mixed up the BUY and SELL sides. The correct combinations are below (a signed-request sketch follows this list):
OPEN SHORT
symbol=BTCUSDT&side=SELL&positionSide=SHORT&type=MARKET&quantity=0.01
CLOSE SHORT
symbol=BTCUSDT&side=BUY&positionSide=SHORT&type=MARKET&quantity=0.01
OPEN LONG
symbol=BTCUSDT&side=BUY&positionSide=LONG&type=MARKET&quantity=0.01
CLOSE LONG
symbol=BTCUSDT&side=SELL&positionSide=LONG&type=MARKET&quantity=0.01
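For illustration, here is a minimal TypeScript/Node sketch of the "CLOSE SHORT" call as a signed request. Assumptions: hedge (dual-side) position mode, so positionSide is required; in one-way mode you would send positionSide=BOTH and typically reduceOnly=true. closeShortAtMarket is just an illustrative name, not part of any SDK.

// Sketch only: close an open SHORT at market on Binance USDⓈ-M futures.
// Requires Node 18+ (global fetch) and your own API credentials.
import { createHmac } from "node:crypto";

const BASE = "https://fapi.binance.com";
const API_KEY = process.env.BINANCE_API_KEY!;
const API_SECRET = process.env.BINANCE_API_SECRET!;

async function closeShortAtMarket(symbol: string, quantity: string) {
  const params = new URLSearchParams({
    symbol,
    side: "BUY",           // buying back is what closes a SHORT
    positionSide: "SHORT",
    type: "MARKET",
    quantity,
    timestamp: Date.now().toString(),
  });
  // Signed endpoints expect an HMAC-SHA256 signature of the query string.
  const signature = createHmac("sha256", API_SECRET)
    .update(params.toString())
    .digest("hex");
  params.append("signature", signature);

  const res = await fetch(`${BASE}/fapi/v1/order?${params.toString()}`, {
    method: "POST",
    headers: { "X-MBX-APIKEY": API_KEY },
  });
  return res.json();
}

// Mirrors the CLOSE SHORT parameter line above.
closeShortAtMarket("BTCUSDT", "0.01").then(console.log).catch(console.error);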
I was running into the same issue and worked out that it can be done without needing a socket connection to monitor whether one of the TP/SL orders has been filled in order to cancel the other.
It does require three separate calls to the API to create each order, as the accepted answer points out, but the parameters sent in the calls can be set such that one will cancel the other two if it executes.
For example, the SL being hit will cancel the original order and the TP, and the TP being hit will cancel the original order and the SL, etc. Here are the parameters I used in each call to achieve this (with example values); a sketch of the three requests follows the list:
Original order -
symbol=BNBUSDT
side=BUY
positionSide=BOTH
type=MARKET
quantity=1
reduceOnly=false
SL Order -
symbol=BNBUSDT
side=SELL
positionSide=BOTH
type=STOP_MARKET
timeInForce=GTE_GTC
quantity=1
reduceOnly=true
stopPrice=(your stop price)
workingType=MARK_PRICE
TP Order -
symbol=BNBUSDT
side=SELL
positionSide=BOTH
type=TAKE_PROFIT_MARKET
timeInForce=GTE_GTC
quantity=1
reduceOnly=true
stopPrice=(your take profit price)
workingType=MARK_PRICE
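For reference, a rough sketch of how those three calls might look as signed requests from TypeScript/Node; futuresOrder is an illustrative helper rather than an official client, and the stopPrice values are placeholders:

// Sketch: entry, protective stop and take-profit as three POSTs to /fapi/v1/order.
import { createHmac } from "node:crypto";

async function futuresOrder(params: Record<string, string>) {
  const qs = new URLSearchParams({ ...params, timestamp: Date.now().toString() });
  const sig = createHmac("sha256", process.env.BINANCE_API_SECRET!)
    .update(qs.toString())
    .digest("hex");
  qs.append("signature", sig);
  const res = await fetch(`https://fapi.binance.com/fapi/v1/order?${qs.toString()}`, {
    method: "POST",
    headers: { "X-MBX-APIKEY": process.env.BINANCE_API_KEY! },
  });
  return res.json();
}

async function main() {
  // 1) Original market entry
  await futuresOrder({ symbol: "BNBUSDT", side: "BUY", positionSide: "BOTH",
                       type: "MARKET", quantity: "1", reduceOnly: "false" });
  // 2) Stop-loss: reduce-only, triggered on mark price
  await futuresOrder({ symbol: "BNBUSDT", side: "SELL", positionSide: "BOTH",
                       type: "STOP_MARKET", timeInForce: "GTE_GTC", quantity: "1",
                       reduceOnly: "true", stopPrice: "250.0", workingType: "MARK_PRICE" });
  // 3) Take-profit: reduce-only, triggered on mark price
  await futuresOrder({ symbol: "BNBUSDT", side: "SELL", positionSide: "BOTH",
                       type: "TAKE_PROFIT_MARKET", timeInForce: "GTE_GTC", quantity: "1",
                       reduceOnly: "true", stopPrice: "350.0", workingType: "MARK_PRICE" });
}
main().catch(console.error);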
Hopefully, this helps anyone else running into this issue

Catch Post Goods Issue event during outbound delivery processing

I am using the user exit USEREXIT_SAVE_DOCUMENT_PREPARE to check some positions in a delivery for some specific criteria.
I only want to do this when PGI (post goods issue) is triggered. For this I use the following condition in the user exit:
IF ( sy-tcode EQ 'VL01N' OR
     sy-tcode EQ 'VL02N' ) AND
   sy-ucomm EQ 'WABU_T'.
But now I am afraid that this is not enough for cases like:
posting the goods issue directly from VL02N (without checking the items)
posting via "Edit --> Post Goods Issue"
editing the items and then posting
Are there further options that can be checked to make sure that a goods issue posting is really happening?
How can I be completely sure that a post goods issue was in fact triggered?
You can try to utilize Workflow to cover all possible cases.
Create a workflow event which will be triggered upon delivery creation/change and check for post goods issue there. The delivery BO is LIKP, so go to transaction SWU_EWCD and enter the data accordingly.
Workflow events are based on change documents, so every time something is written to the table the event will be fired. Check in transaction SWEC that your event was properly created.
You can also create events based on NACE conditions. After that, use your event to trigger your own workflow.
The BAdI LE_SHP_DELIVERY_PROC may also be of interest for you; it has the method
SAVE_AND_PUBLISH_BEFORE_OUTPUT, which is executed before the delivery is saved.
You have to check the value of T180-TRTYP. If the value is 'H' then it is a create posting, otherwise it is a change posting. It is better to remove the transaction code and sy-ucomm conditions and use a condition on T180-TRTYP instead.
regards,
Umar Abdullah

Purchase Requisition for asset services

I have a task to modify a purchase requisition (tcode ME51N) with item category (service) and account assignment (asset services), so that the asset field (ESKN-ANLN1) is filled automatically based on a specific service activity number (ESLL-SRVPOS). These fields are on the item details screen, and the asset number is on the account assignment tab.
So far I have managed to link an asset to a specific service in the service master through classification; this information is stored in AUSP. I am reading the activity number via a field symbol and the asset number with a SELECT from AUSP. The problem is that the asset value is not exported back to the screen. I have tried the user exit EXIT_SAPLMLSK_001 from enhancement SRVESLL, exits from enhancement MEREQ001, the exit EXIT_SAPLKACB_002 from ACCOBL01, and also the BAdI ME_PROCESS_REQ_CUST, but I could not manage to pass the asset number back to the screen.
Could any expert help me?
Thanks

SharePoint Workflow Error: "Unable to transform the input lookup data into the requested type" BUT only on New Item Creation

FYI to start, I am aware of how to properly set up an update to a lookup, and am 99% positive I've done this correctly.
I know this because when I set the workflow to automatically start when an item is changed, it works perfectly. But when I simply change this setting so it will automatically start on new item creation, it cancels the workflow and I get "Coercion Failed: Unable to transform the input lookup data into the requested type." If both options are checked then it fails on creation, but simply clicking Edit on the item properties and then Save makes it work.
The workflow is on a document library and works as follows:
The user selects the Work Task lookup from a dropdown in the edit-properties form after uploading, and then saves the item (adding it to the document library). The workflow is supposed to then look at the selected Work Task lookup, pull the Account and Effective Date-Type lookup IDs that the Work Task item has, and set the document's identical fields to the same values.
Here is the code for the workflow if it helps:
If Current Item: Parent Task is not empty
If Current Item: Sub Task is not empty
Log Both are empty to workflow history list
Then Set Account to Work Tasks:Account
Then Log Set Account to workflow history list
Then Set Effective Date and Type to WorkTasks: Effective Date and Type
Then Log Set EffDateType to the workflow history list
This is all done in one step. I also added additional steps to test whether the Account and Effective Date-Type fields have been set properly, and if not, to set them again. But every time I run the workflow on change, it works: it always correctly sets these fields based on the first step (posted above), and the additional checks log to the history that they are not needed.
As an example, the lookup for the integer for Tasks:Account is set to work as follows:
Data Source: Work Tasks (a list)
Field from Source: Account (a lookup)
Return Field as: Lookup ID (as Integer)
Find the List Item
Field: Title (from the Work Tasks list)
Value: Current Item: Parent Task (which is a lookup of the "Title"
field from the Work Tasks list, and is set to return the value as a Lookup Value (as Text))
The Effective Date and Type setting is pretty much identical.
So, anyone have any insight? I've tried running it as an impersonated step, setting a workflow pause (for 1 minute), and changing the lookup types in case I messed them up to start with, but ultimately the above workflow DOES work, but only when I set it to "Automatically start on the Change (edit) of an Item", NOT "Automatically start on New Item Creation" like I need it to.
Oh yes, fyi, I am using SPServices CascadingDropDown on the Work Task and Sub Task fields of the doc Library form, but I honestly do not believe this has anything to do with my issue.
UPDATE:
I've talked with another developer, and he believes the issue is that the workflow is occurring too quickly, before the item creates an ID for itself, which it needs to conduct the lookups. He had me add another "Pause Workflow" to the very top of my workflow code (above the If conditions) and set it for 1 minute.
It then worked properly.
The downside is that we want this labeling to occur as close to item creation as possible, because a view of the library relies on grouping based on Account and Effective Date and Type. To add to this, Microsoft's Pause Workflow only allows 1 minute or more, and the timer used for it is often off, resulting in a pause longer than that. So far, every test is showing a 2-minute minimum on the pause.
A possible alternative solution for instantaneously populating the fields is to use JavaScript and SPServices to do the lookup against the Task list, pull the Account and Effective Date-Type fields, and then populate them, but my JavaScript is not very strong and I would need help doing this. If anyone has any suggestions, I would appreciate them.
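Since the question asks for suggestions on the SPServices route, here is a rough sketch of the lookup part only, under the assumption that jQuery and SPServices are loaded on the form page; the list name ("Work Tasks") and the internal field names ("Account", "EffectiveDateType") are guesses based on the description above and would need to match the real internal names:

// Sketch: read the Account and Effective Date-Type lookup IDs from the
// Work Tasks item whose Title matches the selected Parent Task.
declare const $: any; // jQuery with the SPServices plugin loaded

function pullWorkTaskValues(parentTaskTitle: string) {
  const result: { accountId?: string; effDateTypeId?: string } = {};
  $().SPServices({
    operation: "GetListItems",
    async: false,
    listName: "Work Tasks",
    CAMLViewFields:
      "<ViewFields><FieldRef Name='Account' /><FieldRef Name='EffectiveDateType' /></ViewFields>",
    CAMLQuery:
      "<Query><Where><Eq><FieldRef Name='Title' />" +
      "<Value Type='Text'>" + parentTaskTitle + "</Value></Eq></Where></Query>",
    completefunc: (xData: any, Status: any) => {
      $(xData.responseXML).SPFilterNode("z:row").each(function (this: any) {
        // Lookup columns come back as "ID;#Value"; keep only the ID part.
        result.accountId = ($(this).attr("ows_Account") || "").split(";#")[0];
        result.effDateTypeId = ($(this).attr("ows_EffectiveDateType") || "").split(";#")[0];
      });
    },
  });
  return result;
}

The returned IDs could then be written into the document's own lookup fields (for example with the SPServices UpdateListItems operation once the item exists), instead of relying on the workflow pause.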
(Answered in a question edit. Converted to a community wiki answer. See Question with no answers, but issue solved in the comments (or extended in chat) )
The OP wrote:
After further testing, I don't know if it is the ID for the item. I changed the start of the workflow to wait until a field in the item changes. I set it to wait until the ID field is not 0 (since you cannot set it to wait for null), and it still does not work.
6/14/2012 4:13 PM Comment System Account Waiting on ID ​
6/14/2012 4:13 PM Comment System Account Waiting complete on ID ​
6/14/2012 4:13 PM Error System Account Coercion Failed: Unable to transform the input lookup data into the requested type.
I have tried other fields as well, such as waiting until the Document ID value is not empty, and it will wait, log that the wait finished, and then fail.
UPDATE: This issue has something to do with the Parent Task field. I have solved it, without having to wait for a period of time, by changing the wait condition above to wait until the Parent Task field is not empty. The workflow then completes fine.
Does anyone know why there is a delay, though? I've solved it, but still don't fully understand what takes so long.
The main fault has been solved (hence the answer), and the remaining point about the reasons for the delay would probably be a discussion point or not specific enough for SO. Any further clarification can be edited in here.