I'm dealing with a pretty common issue. Consider the code:
CALL FUNCTION 'BAPI_CREATE_SOMETHING' ... .
CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
  EXPORTING WAIT = 'X'.
CALL FUNCTION 'BAPI_CHANGE_SOMETHING' ... .
CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
  EXPORTING WAIT = 'X'.
Here I use BAPIs to create some object and make some changes in it right after the creation. The problem is that when BAPI_TRANSACTION_COMMIT returns, the object may still be locked for some time, so BAPI_CHANGE_SOMETHING will fail.
The most commonly recommended option I've seen so far is to put a loop right after the first commit that checks whether the object is still locked (WAIT UP TO ... SECONDS together with SELECT/ENQUEUE).
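For reference, that wait-and-retry approach usually looks something like the sketch below. The lock object name EZ_SMTH, its generated ENQUEUE/DEQUEUE function modules, and the key field are all placeholders (every lock object generates its own ENQUEUE_E.../DEQUEUE_E... pair), so substitute whatever lock the creating BAPI actually uses:

```abap
DATA lv_key TYPE matnr.             " placeholder: key of the created object

DO 10 TIMES.
  " try to take the same lock the BAPI uses; success means it was released
  CALL FUNCTION 'ENQUEUE_EZ_SMTH'   " hypothetical generated enqueue FM
    EXPORTING
      objkey         = lv_key
    EXCEPTIONS
      foreign_lock   = 1
      system_failure = 2
      OTHERS         = 3.
  IF sy-subrc = 0.
    " the object is free again; release our probe lock and carry on
    CALL FUNCTION 'DEQUEUE_EZ_SMTH'
      EXPORTING
        objkey = lv_key.
    EXIT.
  ENDIF.
  WAIT UP TO 1 SECONDS.
ENDDO.
```

The obvious drawbacks are the polling delay and the need to pick a sensible retry limit.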
There is one more suggestion, however: wrap the object creation code in a function module and call it with the STARTING NEW TASK addition. The whole thing would then look like this:
FUNCTION z_create_smth.
  CALL FUNCTION 'BAPI_CREATE_SOMETHING' ... .
  CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
    EXPORTING WAIT = 'X'.
ENDFUNCTION.
...
CALL FUNCTION 'Z_CREATE_SMTH' STARTING NEW TASK 'TSK' DESTINATION 'NONE'
  PERFORMING on_done ON END OF TASK.
WAIT UNTIL lv_done = abap_true.
CALL FUNCTION 'BAPI_CHANGE_SOMETHING' ... .
CALL FUNCTION 'BAPI_TRANSACTION_COMMIT'
  EXPORTING WAIT = 'X'.
...
FORM on_done USING p_task.
lv_done = abap_true.
...
ENDFORM.
The main point is that when the asynchronous function call completes, it's guaranteed that all changes are committed and all locks are released. However, I haven't found any proof of that yet, so my question is: does this make sense, or is WAIT + ENQUEUE still the recommended option?
Thank you.
I have a function outside this, which I want to call once this thread is finished.
Public Sub Replace(ByVal p_objStation As voapPlanningStation, _
ByVal p_objGenset As voapPlanningGenset, _
ByVal p_arlValues As ArrayList)
'PUT below in a thread
Dim i As Int32 = 0
Dim tasks As Task() = New Task(4) {}
mobjParty.ReplaceGensetMEL(p_objStation, p_objGenset, p_arlValues)
'put this in a thread
Me._replaceMELThread = New System.Threading.Thread(Sub() Me.ReplaceMELThreadSafe(p_objStation, p_objGenset, p_arlValues))
Me._replaceMELThread.Start()
End Sub
The threading API based around explicit Thread objects is very much mechanism-focused: "I need another thread to run some code, so I'll create a Thread and have it run the code." However, not much thought was put into its design at the time to allow a later rendezvous - Join blocks the current thread until the joined thread exits, so it just blocks you again.
However, wanting to not block the UI is a common desire. So the framework team created BackgroundWorker. This gives you an abstraction of the problem. Different methods in the worker run either on the UI thread or on some other thread, and you don't have to care where that other thread came from or where it goes after you've used it. You just update your UI in, e.g., ProgressChanged or RunWorkerCompleted.
However, that abstraction didn't go far enough. There's a common desire to say "I've got some work to do; I don't care where that code runs, all I care about is being able to find out when it's complete and get its results". This is the Task API. How tasks run and what threads exist behind the scenes are irrelevant implementation details.
And since async/await were added to the language, that's the most "natural" way of interacting with Tasks. The key thing about await is that it frees your current thread (rather than blocking it) if what you're awaiting isn't finished yet. You can also use ContinueWith if your use case doesn't fit await, but most of the time it will.
For CPU-bound work that shouldn't block the UI, use Task.Run.
It's quite simple: use the async/await pattern and Await the function below somewhere. The line immediately after the one containing your Await statement will execute after this method has finished its work:
Async Function Replace(ByVal p_objStation As voapPlanningStation, _
ByVal p_objGenset As voapPlanningGenset, _
ByVal p_arlValues As ArrayList) As Task
    ' run the first call on a thread-pool thread and await it without blocking the caller
    Await Task.Run(Sub() mobjParty.ReplaceGensetMEL(p_objStation, p_objGenset, p_arlValues))
    ' the second call starts only after the first one has completed
    Await Task.Run(Sub() Me.ReplaceMELThreadSafe(p_objStation, p_objGenset, p_arlValues))
End Function
Is there a function module or BAPI or method that nicely performs a material/material ledger consistency check for a given material?
I am aware of report SAPRCKMU which would be very hard to use inside my own program.
I am also aware of and using function module CKML_F_CKML1_PRICES_GET which performs a consistency check.
When this function module finds an inconsistency, it calls MESSAGE E... which means I lose control in my program. This is the core problem that I have.
So I am searching for a way to check consistency before I call CKML_F_CKML1_PRICES_GET, in a way that gives me a return parameter with the error message instead of calling MESSAGE E....
I found a solution that works very well:
add the line error_message = 99 to the function module call:
CALL FUNCTION 'CKML_F_CKML1_PRICES_GET'
....
EXCEPTIONS
...
error_message = 99
others = 98.
Now the program's control flow is no longer interrupted when the function module itself uses MESSAGE E... instead of RAISE ....
Whenever MESSAGE E... is issued inside, it is converted into SY-SUBRC = 99, and the message fields (SY-MSGID, SY-MSGTY, SY-MSGNO, SY-MSGV1 to SY-MSGV4) are also filled.
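After such a call you can then build the message text yourself from the SY fields and hand it back as a parameter instead of letting it abort the program, for example:

```abap
IF sy-subrc = 99.
  DATA lv_msg TYPE string.
  " turn the intercepted MESSAGE E... into plain text for the caller
  MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
          WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4
          INTO lv_msg.
  " lv_msg now holds the error text; return it in your own RETURN parameter
ENDIF.
```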
I have a Source Script Component in SQL2012.
I believe that if you want to set a read/write variable in the Script Component, it must be set in the PostExecute method, and I have done so like this:
public override void PostExecute()
{
base.PostExecute();
Variables.value1 = "some value";
}
I tested the variable after the script component ran and found it hadn't been set. I set a break point in the PostExecute method and confirmed it never gets called.
I even wrote a very simple new package and tested it again with the same results. Can anyone tell me why the PostExecute will not fire.
I'm not aware of any restrictions on setting a variable here (regardless of whether there are records to process or not).
The variable can be set in the script component, but you can only use it after the Data Flow task is finished (in the next task).
After some testing it appears that when you use a Script Component as a "Source", the PostExecute event never fires. It works fine as a "Transformation"; I didn't test the "Destination" type. Nothing in the Microsoft documentation mentions this, so it's misleading.
I ended up using the Row Count component to count records. I'm not sure what I would have done if I had needed to do more in the PostExecute event.
I have used a Script Component as a source many times and PostExecute does fire.
I have a custom-written background task manager that takes an Action as parameter, queues it, waits until an execution thread is free and then invokes it.
Now I wanted to do the following:
Dim lValue As Integer = 0I
For tIndex As Integer = 0 To 10
BackgroundTaskManager.QueueTask(New Action(Sub() DoSomething(lValue)))
lValue += 1
Next
This obviously leads to some strange results, because the lambda expression is invoked at an unpredictable moment (whenever a thread is available), so the DoSomething-method does not have access to the actual value that I want it to have.
However, putting these parameters into the sub - which would require an Action(Of T, ...) - is not really an option, because my background task manager only takes an Action without parameters, and I want it to stay that way (the background task manager should not have any knowledge of what's actually being called).
What I'm looking for is a way to store a function pointer with parameter values already evaluated, ready to be invoked by a simple [FunctionWithParametersEvaluated].[Invoke].
Any idea how to achieve this?
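One way to get that while keeping the manager's plain Action signature is to capture a fresh local copy of the value on each iteration; the lambda then closes over the per-iteration copy rather than the shared lValue. A sketch against your own QueueTask/DoSomething:

```vb
Dim lValue As Integer = 0
For tIndex As Integer = 0 To 10
    ' a block-scoped copy: each iteration gets its own variable,
    ' so each lambda captures its own value
    Dim lCaptured As Integer = lValue
    BackgroundTaskManager.QueueTask(New Action(Sub() DoSomething(lCaptured)))
    lValue += 1
Next
```

In VB.NET a variable declared inside the loop body is lifted into a separate closure instance per iteration, so every queued Action sees the value that was current when it was queued.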
I have the following code, which is called from a Web API. As you can see, I want to return as soon as I can and shift the work onto the thread pool. (The client polls to see when the job is complete, but the polling has nothing to do with this. The purpose of these routines is simply to extract data and write a file in the background while maintaining the progress of the job in a table. The client interrogates this table to determine whether the file is ready, so I'm not trying to push progress messages to the client.)
Public Function Extract(filepath as string, ...) as ExtractResult
dim source = ExtractInternal(filepath, ...)
' works first time it is called only!
Using source.SubscribeOn(ThreadPoolScheduler.Instance) _
    .Subscribe()
End Using
' works every time it is called ...
Dim subscription = source.SubscribeOn(ThreadPoolScheduler.Instance) _
    .Subscribe()
Return New ExtractResult()
End Function
Public Function ExtractInternal(filepath As String, ...) As IObservable(Of Unit)
    Return Observable.Create(Of Unit)(
        Function(observer)
            ....
            ' uses filepath here
            Return Disposable.Empty
        End Function)
End Function
As you can see in my comments, if I use the auto-disposing Using ... block, the observable gets called on the first occasion but not thereafter. Whereas if I assign the subscription to a local variable, it works every time the web call invokes the routine, but I'm concerned that I'm actually leaving stuff hanging around.
Could someone explain why the observable doesn't get re-instantiated on subsequent calls, and how I can get it to work every time and tidy up afterwards properly?
EDIT:
So I ended up using Observable.Defer, which seems to give me what I am after ...
Public Function Extract(filepath as string, ...) As ExtractResult
Observable.Defer(Function() ExtractInternal(filepath, ...)) _
    .SubscribeOn(NewThreadScheduler.Default) _
    .Subscribe()
Return New ExtractResult()
End Function
I'm wondering if this is perhaps the correct way to do it to give me proper disposal whilst also using the current parameter values.
Could anyone confirm or correct?
EDIT 2
That was wrong! In fact if I rewrite it as
Public Function Extract(filepath as string, ...) As ExtractResult
Using Observable.Defer(Function() ExtractInternal(filepath, ...)) _
    .SubscribeOn(NewThreadScheduler.Default) _
    .Subscribe()
End Using
Return New ExtractResult()
End Function
I get the same behaviour as I originally was getting when I wrote the post.
One thing (amongst many) I don't understand is why, if the observable is a local variable, another observable is not created and subscribed to when a second call is made to the Extract method. It seems to go against scoping logic if I am actually referencing the same observable under the hood. I've obviously misunderstood.
Many thx
S
Yes, when you dispose the subscription, it stops receiving notifications.
You should keep it in an instance field and have the class implement IDisposable. Consumers of this class can then dispose of it at their convenience.
In your Dispose implementation, you call subscription.Dispose().
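Sketched in VB (the class and member names are illustrative, not taken from your code), that pattern looks like:

```vb
Public Class Extractor
    Implements IDisposable

    Private _subscription As IDisposable

    Public Function Extract(filepath As String) As ExtractResult
        ' keep the subscription alive in a field instead of disposing it straight away
        _subscription = ExtractInternal(filepath) _
            .SubscribeOn(NewThreadScheduler.Default) _
            .Subscribe()
        Return New ExtractResult()
    End Function

    Public Sub Dispose() Implements IDisposable.Dispose
        ' the consumer disposes the class when it no longer wants notifications
        If _subscription IsNot Nothing Then _subscription.Dispose()
    End Sub
End Class
```

The subscription then lives as long as the Extractor instance, and disposal happens exactly once, at a moment the consumer controls.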