How to find the Transport Request with my custom objects?

I've copied two Function Modules QM06_SEND_PAPER_STEP2 and QM06_FM_TASK_CLAIM_SEND_PAPER to similar Z* Function Modules. I've put these FMs into a ZQM06 Function Group which was created by another developer.
I want to use transaction SCC1 to move my developments from one client to another. In transaction SE01 (Transport Organizer) I cannot find the names of my two function modules anywhere.
How can I find the change request that contains my work?
I copied the FMs in order to modify their functionality, and I know FMs are client-independent.

Function modules, like other ABAP workbench entities, are client-independent. That is, you do not need to copy them between clients on the same instance.
However, you can find the transport request that contains your changes by going to transaction SE37, entering the name of your function module, and then choosing Utilities -> Versions -> Version Management from the menu.
Provided you did not put the changes into a local package (like $TMP), the system will have asked you for a transport request when you saved or activated your changes, unless the function group is already part of a modifiable transport request; in that case it created a new task for your user under that request, and that task contains your changes. To check the package, use Goto -> Object Directory Entry from the menu in SE37.

Function modules are often added to transports under the function group name, especially if they're new.
The easiest way to find the transport is to go to SE37, display the function module, and then go to Version Management.

The answer from mydoghasworms is correct. Alternatively you can also use transaction SE03 -> Search for Objects in Requests/Tasks (top of the transaction screen) -> check the box next to "R3TR FUGR" and type in your function group name.
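If you prefer searching directly in the database, the object lists of all transport requests are stored in table E071, so a quick query also works. A hedged sketch (7.40+ Open SQL; the LIKE pattern for your copied FM names is an assumption, adjust it to your actual Z names):

" Function modules travel either under their function group (R3TR FUGR)
" or as individual objects (LIMU FUNC), so check both kinds of entries.
SELECT trkorr, pgmid, object, obj_name
  FROM e071
  WHERE ( pgmid = 'R3TR' AND object = 'FUGR' AND obj_name = 'ZQM06' )
     OR ( pgmid = 'LIMU' AND object = 'FUNC' AND obj_name LIKE 'ZQM06%' )
  INTO TABLE @DATA(lt_requests).

cl_demo_output=>display( lt_requests ).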


Change RZ11 Profile parameters programmatically

I was asked by the IT department to write an ABAP program that switches the profile parameter login/server_logon_restriction from 0 to 1 and back, triggered automatically (time-based etc.), on all of our SAP servers.
I am not very familiar with the SAP environment yet; so far I have managed to read the parameter using:
RSAN_SYSTEM_PARAMETERS_GET
and
RSAN_SYSTEM_PARAMETERS_READ
But I cannot find anything to change and save them dynamically. Is this even possible?
Cheers, Nils
The login/server_logon_restriction parameter is dynamic, so you can change it via the class method cl_spfl_profile_parameter=>change_value.
You can configure a background job in transaction SM36 to trigger it automatically.
The method does not save the given value to the profile file, so the parameter reverts to the profile value after a system restart.
Users who are already logged on can continue to use the system; you may want to inform them with the TH_POPUP function module before kicking them from the system.
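A minimal sketch of the report you could schedule via SM36 (the report name is hypothetical, and the exact parameter names of change_value should be verified in SE24 on your release):

REPORT z_toggle_logon_restriction.

" Change the dynamic parameter; as noted above, the value is not
" persisted to the profile and reverts after a system restart.
DATA(lv_rc) = cl_spfl_profile_parameter=>change_value(
                name  = 'login/server_logon_restriction'
                value = '1' ).  " schedule a second variant with '0' to switch back

IF lv_rc <> 0.
  MESSAGE 'Could not change login/server_logon_restriction' TYPE 'E'.
ENDIF.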

Check original language of repository objects at creation?

In our company, repository objects must be created with original language EN.
Is there a way to check the logon language when a new object is created in the ABAP repository?
Desired behaviour:
SE80 - Create program/class/data element/table/....
==> a user exit/BAdI checks the logon language; if it is not 'EN', the creation is refused.
regards,
Umar Abdullah
I know there is an exit for this, but I don't remember its exact name. You can use a general-purpose technique for finding exits: go to SE24, open the class CL_EXITHANDLER, find the method GET_INSTANCE and set a breakpoint in it. Then start creating an object; the debugger will stop multiple times, and you can try to find a suitable exit among the candidates.
As @mkysoft suggested, you may implement a check in the BAdI CTS_REQUEST_CHECK, method CHECK_BEFORE_ADD_OBJECTS, which is invoked when the object is about to be attached to a transport request. Raise the exception CANCEL to make the attachment fail (and so the object is not created either).
EDIT: sorry, ignore my answer: "this method is NOT released for customer usage", as stated in note 2150125 - Method CHECK_BEFORE_ADD_OBJECTS not triggered.
DISCLAIMER: THE METHOD DESCRIBED HERE IS ABSOLUTELY NOT RECOMMENDED.
As correctly pointed out by the other members, there is no standard, customer-released way to achieve your requirement, but if you absolutely must enforce this check at creation time, you can use the method below. Like the options offered to you previously, it involves modifying the SAP standard.
There is a system BAdI CTS_TADIR_SUBSCREEN located inside the enhancement spot CTS_ES_TADIR_POPUP. Both are SAP-internal and not released for customer usage, so do this at your own risk.
Implementation procedure:
Step 0. The first thing you need to change is the SAP-internal usage flag, for which you need an object access key; you can obtain one from SAP or from the SAP partner that did the implementation in your organization. In its initial state, this BAdI throws an error if you try to implement it.
Hereinafter we assume that you have already ticked off this internal-usage checkbox in the BAdI settings.
Step 1.
In order to implement the BAdI, you first have to implement its enhancement spot. This is the most complicated part: even though we disabled the internal-usage flag, SAP-namespaced enhancements may only be stored in SAP-namespaced objects. By SAP namespace I mean non-Z, non-Y and non-T (test). So besides modifying the enhancement definition, you need to create an enhancement implementation named, for example, CTS_ES_TADIR, and save it to a non-Z package, which you also need to create.
In the enhancement implementation selector, only such a SAP-namespaced implementation will work; all the Z-namespaced ones will not.
Every non-Z object needs an object access key, remember? Too bad. But just to show the proof of concept, I will proceed.
Step 2. After you have created the enhancement implementation in the SAP namespace, the system will propose that you create the BAdI implementation. The same principle applies here: only a SAP-namespaced container for SAP-namespaced objects, hence CTS_TADIR_SUBSCREEN should have an implementing class named, for example, CL_TADIR_SUBSCREEN. During the creation of the enhancement you will see many warnings, but in the end all the SAP-namespaced objects should be created and the enhancement/BAdI activated.
Step 3. To get the BAdI working, we need to enable this subscreen processing. While playing with the enhancement I found out that the BAdI class is not triggered standalone, i.e. without the screen events also being enhanced, so to make it work you need to touch a screen enhancement for screen 100.
If you do not want to modify the screen element logic, just put a dummy enhancement into the SHOW_TADIR dialog module at the end of the include LSTRDO18:
PROCESS BEFORE OUTPUT.
  MODULE SHOW_TADIR. "<-- create the dummy enhancement here
  CALL SUBSCREEN subs_info INCLUDING gv_badi_prog gv_badi_dynnr.
A simple declaration statement is enough, like the one I used.
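For illustration, a hedged sketch of such a dummy source-code plug-in (the implementation name z_dummy_show_tadir is hypothetical):

ENHANCEMENT 1 z_dummy_show_tadir.
  " No-op: exists only so that the screen flow of SHOW_TADIR is enhanced
  " and the BAdI subscreen processing gets triggered.
  DATA lv_dummy TYPE c LENGTH 1.
ENDENHANCEMENT.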
Step 4. Activate the BAdI class you created and put the necessary logic there. I wasn't able to trigger the method GET_DATA_FROM_SCREEN, but PUT_DATA_TO_SCREEN worked fine.
If we add this simple processing for your requirement
METHOD cts_if_tadir_subscreen~put_data_to_screen.
  " Abort the creation dialog if the master language is not English
  IF object_data-l_mstlang <> 'E'.
    MESSAGE 'Objects in non-English languages are not allowed!' TYPE 'A'.
  ENDIF.
ENDMETHOD.
it will not allow creating objects in languages other than English.
The check in method PUT_DATA_TO_SCREEN is performed before the screen is shown, so the language is determined from the system logon settings. If you experiment with this BAdI further, I suppose the method GET_DATA_FROM_SCREEN can also be enabled, which would make it possible to check user input, i.e. the case where the user changes the default language.

Is it possible to write a plugin for Glimpse's existing SQL tab?

I'm trying to log my SQL queries, and the currently available extensions don't support our in-house SQL library. I have written a custom plugin which logs what I want, but it has limited functionality and it doesn't integrate with the existing SQL tab.
Currently, I'm logging to my custom plugin using a single helper method inside my DAL's base class. This function takes the SqlCommand and duration in order to show data on my custom tab:
// simplified example:
Stopwatch sw = Stopwatch.StartNew();
sqlCommand.Connection = sqlConnection;
sqlConnection.Open();
object result = sqlCommand.ExecuteScalar();
sqlConnection.Close();
sw.Stop();
long duration = sw.ElapsedMilliseconds;
LogSqlActivity(sqlCommand, null, duration);
This works well on my 'custom' tab, but unfortunately it means I don't get metrics shown on Glimpse's HUD.
Is there a way I can provide Glimpse directly with the info it needs (in terms of method names, and parameters) so it displays natively on the SQL tab?
The following advice is based on the fact that you can't use DbProviderFactory and you can't use a proxied SqlCommand, etc.
The data that appears in the "out-of-the-box" SQL tab is based on messages of given types being published through our internal Message Broker (see below for information on this). Because of the above limitations in your case, to get things lighting up correctly (i.e. your data showing up in the HUD and the SQL tab), you will need to simulate the work that we do under the covers when we publish these messages. This shouldn't be that difficult, and once done, it should just work moving forward.
If you have a look at the various proxies we have here, you will be able to see which messages we publish in which circumstances. Here are some highlights:
DbCommand
Log command start - here
Log command error - here
Log command end - here
DbConnection
Log connection open - here
Log connection closed - here
DbTransaction
Log Started - here
Log committed - here
Log rollback - here
Other
Command row count here - Glimpse calculates this at the DbDataReader level, but you could do it elsewhere as well
Now that you have an idea of what messages we are expecting and how we generate them, as long as you pass in the right data when you publish those messages, everything should just light up - if you are interested here is the code that looks for the messages that you will be publishing.
Message Broker: If you look at the GlimpseConfiguration here, you will see how to access the broker. This can be done statically if needed (as we do here). From there you can publish the messages you need.
Helpers: For generating some of the above messages, you can use the helpers inside the Support class here. I would have shifted all the code for publishing the actual messages to this class, but I didn't think there would be too many people doing what you are doing.
Update 1
Starting point: With the above approach you shouldn't need to write your own plugin. You should just be able to access the broker via GlimpseConfiguration.GetConfiguredMessageBroker() (make sure you check whether it's null, which it is if Glimpse is turned off, etc.) and publish your messages.
I would imagine that you would put the inspection code that leverages the broker and publishes the messages wherever you have knowledge of the information that needs to be collected (i.e. inside your custom lib). Normally this would require references to Glimpse inside your lib (which you may not want), so to protect against this, your lib would call a proxy (which could be another VS project) that has the Glimpse dependency. That way your ADO lib only has references to your own code.
To get your toes wet, try just publishing a couple of fake connection and command messages. Assuming the broker you get from GlimpseConfiguration.GetConfiguredMessageBroker() isn't null, these should just show up. Then you can work towards getting real data into it from your lib.
Update 2
Obsolete Broker Access
It's marked as obsolete because it's going to change in v2. You will still be able to do what you need to do, but the way of accessing the broker will change. For what you currently need to do, this is OK.
Sometimes null
As you have found, this really depends on where in the page lifecycle you currently are. To get around this, I would probably change my original recommendation a little.
In the code where you are currently creating messages and pushing them to the message bus, try putting them into HttpContext.Current.Items instead. If you haven't used it before, this is a store that ASP.NET provides out of the box and that lasts for the lifetime of a given request. You could keep a list in there: still create the message objects as you do now, but put them into that list instead of pushing them through the broker.
Then create an HttpModule (it's really simple to do) which taps into the PostLogRequest event. Within this handler, you would pull the list out of the context, iterate through it and push each message into the message broker (accessed the same way you have been).

IBM Worklight - JSONStore logic to refresh data from the server and be able to work offline

Currently the JSONStore API provides a load() method whose documentation says: "This function always stores whatever it gets back from the adapter. If the data exists, it is duplicated in the collection." This means that if you want to avoid duplicates by calling load() on an already populated collection, you need to empty or drop the collection first. But if you want to keep the elements you already have in the collection, in case connectivity is lost and your application goes into offline mode, you also need to keep track of these existing elements.
Since the API doesn't provide an "overwrite" option that would replace the existing elements when the adapter call succeeds, I'm wondering what kind of logic should be put in place to manage both offline availability of data and the ability to refresh at any time. It is not that obvious to handle all the failure cases by nesting the JS code, due to the promises...
Thanks for your advice!
One approach to achieve this:
1. Use enhance to create your own load method (i.e. loadAndOverwrite). You should have access to all the variables kept inside a JSONStore instance (collection name, adapter name, adapter load procedure name, etc.); you will probably use those variables in the invokeProcedure step below.
2. Call push to make sure there are no local changes.
3. Call invokeProcedure to get data; all the variables you need should be provided in the context of enhance.
4. Find out whether each document already exists and, if so, remove it. Use {push: false} so JSONStore won't track that change.
5. Use add to add the new/updated document, again with {push: false} so JSONStore won't track that change. Alternatively, if the document exists you can use replace to update it.
Alternatively, you can use removeCollection and call load again to refresh the data.
There's an example that shows how to use all those API calls here.
Regarding promises, read this from InfoCenter and this from HTML5Rocks. Google can provide more information.

Using search help for different system?

I need a search help for my bukrs field. The problem is that the data should come from a different system. There are two systems, X and Y; I am in system X, running a program.
The bukrs field exists on the selection screen. When I trigger the search help, the data should come from system Y.
I heard that it is possible to pull data into a search help, but I couldn't find enough information online.
Best regards.
Five months of experience with SAP/ABAP :)
To create a search help with a custom data selection, you can define a search help exit in your search help. To select data in a custom way rather than from a database table, go to the "Definition" tab of the search help, remove the content of the field "Selection Method" and enter a function module into the field "Search Help Exit". This function module must have the same signature as the example module F4IF_SHLP_EXIT_EXAMPLE; the comments in the source code of this example module explain how to implement it. Your implementation of this function module can then perform the data acquisition from another system with a remote function call.
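To sketch how the pieces fit together, here is a rough skeleton of such a search help exit. The remote module Z_BUKRS_REMOTE_GET, its table parameter et_t001 and the destination Y_SYSTEM_RFC_DEST are hypothetical placeholders; F4UT_RESULTS_MAP is the standard helper for mapping selected rows into the search help result:

FUNCTION z_shlp_exit_bukrs_remote.
" Signature as in F4IF_SHLP_EXIT_EXAMPLE:
"   TABLES   shlp_tab    TYPE shlp_desct
"            record_tab  STRUCTURE seahlpres
"   CHANGING shlp        TYPE shlp_descr
"            callcontrol LIKE ddshf4ctrl

  DATA lt_bukrs TYPE STANDARD TABLE OF t001.

  " Take over only the data selection step
  IF callcontrol-step = 'SELECT'.
    " Hypothetical RFC-enabled module in system Y
    CALL FUNCTION 'Z_BUKRS_REMOTE_GET'
      DESTINATION 'Y_SYSTEM_RFC_DEST'
      TABLES
        et_t001 = lt_bukrs
      EXCEPTIONS
        communication_failure = 1
        system_failure        = 2.

    IF sy-subrc = 0.
      " Map the remote rows into the search help result table
      CALL FUNCTION 'F4UT_RESULTS_MAP'
        TABLES
          shlp_tab    = shlp_tab
          record_tab  = record_tab
          source_tab  = lt_bukrs
        CHANGING
          shlp        = shlp
          callcontrol = callcontrol.
    ENDIF.

    " Skip the standard selection and proceed to displaying the hit list
    callcontrol-step = 'DISP'.
  ENDIF.
ENDFUNCTION.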
To get data from another system, you have to call a function module in a remote system via RFC (remote function call). To do this you need:
An RFC-capable function module in the remote system that exports the data you need. You can make a function module RFC-capable by selecting the remote-enabled processing type on its attributes tab.
An RFC connection from the local system to the remote system. RFC destinations can be created and configured with the transaction SM59.
To call a function module via RFC, you just have to add DESTINATION [rfc-destination] to the function call.
CALL FUNCTION 'Z_YOUR_RFC_CAPABLE_FUNCTION_MODULE'
  DESTINATION 'my_rfc_destination'
  IMPORTING [...]
  EXCEPTIONS communication_failure = 1
             system_failure        = 2.
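If the remote system cannot be reached, the call raises these exceptions and sets sy-subrc accordingly, so check sy-subrc right after the call before using the imported data.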
The user will have to log on to the remote system in order to call RFC function modules in it, unless you define a username and password in the RFC connection. If you do that, you should create a dedicated system user in the remote system with minimal permissions especially for this RFC connection; if that user has overly broad permissions, the RFC connection can be abused for other purposes.