SAP missing IDoc segment definition

I am integrating a BizTalk application with SAP. I get an error when SAP sends me some data because the required schema is not declared in BizTalk.
The required schema is http://Microsoft.LobServices.Sap/2007/03/Types/Idoc/3/ZCREMAS01//700
I can find this schema in SAP (when consuming an adapter service from BizTalk), but it does not contain the segment E2LFM1M005.
The error I get is
The adapter "WCF-Custom" raised an error message. Details
"Microsoft.ServiceModel.Channels.Common.XmlReaderGenerationException:
The segment or group definition E2LFM1M005 was not found in the IDoc
metadata. The UniqueId of the IDoc type is: IDOCTYP/3/ZCREMAS01//700.
For Receive operations, the SAP adapter does not support unreleased
segments.
Can you tell me where I can find this IDoc definition?

The SAP release can be a bit tricky when it comes to IDocs. In your case, your SAP system is probably on a higher version than 700.
There are 2 things that you can change.
In your SAP receive location, go to the "Binding" tab in the settings and check the "ReceiveIdocRelease" parameter. There should be a syntax hint at the bottom.
In SAP, go to WE20 and check the specific partner (LS, KU, ...) you are using. If you edit the IDoc type you want to change there, you will see a field at the bottom where you can specify a segment release. Put 700 there and try again.
I can't really make screenshots now. If it's not clear, let me know. I'll post a more complete answer next week.
Kind Regards
Tim

Related

Configure .eds file to map channels of a CANopen Client PLC

In order to use a PLC as a client (formerly “slave”), one has to configure the PDO channels, since the manufacturer's default values are often not suitable. In my case, I need the PDOs to send INT values instead of the default UNSIGNED8 (see picture).
Therefore my question: what kind of workflow would you recommend to map the CANopen client PDO channels?
I found the following workflow suitable; however, I appreciate any improvements and recommendations from your side!
Start by locating the .eds file from the manufacturer. The image shows this in the B&R Automation Studio programming environment.
Open the file in an EDS editor. I found the free Vector CANeds editor very useful. Delete all RxPDOs and RxPDO mappings that you don't need.
Assign the needed Data Type (e.g. INTEGER16) and Channel Name (“1 Byte In (1)”).
Add the necessary PDOs and PDO mappings from the database. (This might actually be a bug, but I always received error messages if I just edited the PDOs without deleting and recreating them.)
Map the data to the channels.
Don't forget to write the number of channels into the first entry (in this image: 1601sub0); see the sketch after these steps.
Check the .eds file for errors (press F5) and copy the .eds file back to its original location from the first step.
Add the PLC Client device in Automation Studio and you should see the correct mappings.
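For illustration, here is a rough sketch of what the edited entries in the .eds file might look like. The object indexes (0x1601 for the second RxPDO mapping, 0x2000 for the mapped channel), the parameter names and the mapping value are only examples for a single 16-bit signed channel; DataType 0x0003 is the CANopen code for INTEGER16. Use the indexes your device actually provides and let CANeds verify the result (F5) as described above.

; mapped application object: one 16-bit signed channel (index illustrative)
[2000sub1]
ParameterName=Input Channel 1
ObjectType=0x7
DataType=0x0003
AccessType=rw
PDOMapping=1

; RxPDO mapping object with a single entry
[1601sub0]
ParameterName=Number of mapped objects
ObjectType=0x7
DataType=0x0005
AccessType=rw
DefaultValue=1
PDOMapping=0

[1601sub1]
ParameterName=Mapping entry 1
ObjectType=0x7
DataType=0x0007
AccessType=rw
; encodes index 0x2000, subindex 0x01, length 0x10 (16 bit)
DefaultValue=0x20000110
PDOMapping=0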

Old ABAP code still active for PyRFC even after TR was imported. Why?

I changed an ABAP RFC module in an SAP system X and transported the changes to Y. Now when I call the RFC, SAP still executes the old code.
I compared both versions from X and Y with a diff tool and found no differences, so it looks like the changes were transported. Is there a special step needed to activate my ABAP RFC code?
I use PyRFC as a client library.
We had the same problem with one of our RFC FMs. The reason was that the connection remained open once it was established; in that case the function module metadata is not refreshed in the RFC context. Simply restart the connection and everything should work as desired (see the sketch at the end of this answer).
This is a known issue: https://github.com/SAP/PyRFC/issues/89
Quoting the issue:
After the Python script ended, the connection should be automatically closed and the SAP NW RFC SDK re-initialised. Here is what happens under the hood.
Python interpreters and PyRFC instances share the same SAP NW RFC SDK lib instance, and when the remote-enabled function module (RFM) is called for the 1st time, the RFM metadata is cached inside the SAP NW RFC SDK. When the 2nd call of the same RFM is requested from Python/PyRFC, the SAP NW RFC SDK returns the metadata from the cache rather than reading it again from the ABAP system, saving one Python/ABAP roundtrip and some performance, especially in the case of complex RFMs. If the RFM signature changed in the meantime, the cached RFM metadata is not updated and Python "sees" the old ABAP code.
I hope a more developer-friendly solution will be provided in the future.
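As a minimal sketch of the "restart the connection" workaround with PyRFC (the connection parameters, the RFM name Z_MY_CHANGED_RFM and its parameter IV_INPUT are placeholders):

from pyrfc import Connection

params = dict(ashost="sapy.example.com", sysnr="00",
              client="100", user="RFC_USER", passwd="secret")

# First call: the RFM metadata gets cached inside the SAP NW RFC SDK.
conn = Connection(**params)
result = conn.call("Z_MY_CHANGED_RFM", IV_INPUT="10")
conn.close()

# After the RFM signature changed in ABAP, open a fresh connection
# (or, if that is not enough, restart the Python process) so the stale
# metadata is not reused.
conn = Connection(**params)
result = conn.call("Z_MY_CHANGED_RFM", IV_INPUT="10")
conn.close()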
There is no need to activate anything; it should be fine once you transport it.
You can try these:
Transport everything again (including the other tasks of the same request).
Check the destination field in your call to see if you are calling the correct system.
Clear the buffers with transaction code /$SYNC.

SAPJco invoking BAPI_MATERIAL_DISPLAY

I was trying to invoke the BAPI_MATERIAL_DISPLAY function module from SAP JCo. This is how I pass my input parameter:
function.getImportParameterList().setValue("MATERIAL", "10");
From my program output I got:
The material 10 does not exist or is not activated.
If I execute BAPI_MATERIAL_DISPLAY using SAP Logon, I do get the entry. Using the debugger I found that in that case my input goes in as 00000000000010, and so it returns a response.
I don't know how to handle this properly in SAP JCo.
I then passed the value 00000000000010 directly from SAP JCo, and this time I got an error:
com.sap.conn.jco.JCoException: (104) JCO_ERROR_SYSTEM_FAILURE: Screen output without connection to user.
I assume SAP is trying to open a popup. Let me know how to solve both issues in SAP JCo.
Field Material has a conversion exit routine. See also its domain MATNR in the DDIC.
These conversion exits are always called automatically by SE37, but not when the remote function module is called directly - like here, from outside, from a JCo program.
So if the BAPI expects to get certain parameters in their SAP-internal representation format (I don't know if this is the case here), then you have to do this data transformation yourself beforehand, either purely in a routine of your own on the Java side, or by calling the appropriate conversion routines on the ABAP side via RFC.
For more details on this I recommend to study SAP note 206068.
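For the Java-side variant, a minimal sketch could look like this. It assumes the material number is purely numeric, in which case the MATN1 conversion exit simply left-pads it with zeros to the material number length (18 characters by default, see domain MATNR; adjust the width if your system is configured for a shorter length):

String material = "10";
// Left-pad numeric material numbers with zeros to the internal MATNR length.
String internal = String.format("%18s", material).replace(' ', '0');
function.getImportParameterList().setValue("MATERIAL", internal);

For non-numeric material numbers you would have to call the appropriate conversion routine on the ABAP side instead, as described above.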
Regarding your second question with the error message "Screen output without connection to user", I guess that this BAPI expects to have a connection to an SAP GUI for displaying the selected data. With a remote function call you don't have an SAP GUI connection by default, but you can attach an SAP GUI to your RFC connection with JCo, namely by specifying the additional logon parameter jco.client.use_sapgui=1. For this to work, an SAP GUI front end (either for Windows or for Java) also needs to be installed on the host where JCo is running, of course.
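As a sketch of where that parameter goes with the usual file-based destination setup (host, client and user values are placeholders; the property keys can also be referenced via the DestinationDataProvider constants):

import java.io.FileOutputStream;
import java.util.Properties;

// Writes a *.jcoDestination file that JCoDestinationManager picks up by name.
Properties props = new Properties();
props.setProperty("jco.client.ashost", "sap.example.com");
props.setProperty("jco.client.sysnr", "00");
props.setProperty("jco.client.client", "100");
props.setProperty("jco.client.user", "MYUSER");
props.setProperty("jco.client.passwd", "secret");
// Attach an SAP GUI to the RFC connection so dynpro output has somewhere to go.
props.setProperty("jco.client.use_sapgui", "1");
try (FileOutputStream fos = new FileOutputStream("MY_SAP_SYSTEM.jcoDestination")) {
    props.store(fos, "JCo destination with attached SAP GUI");
}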

Is it possible to write a plugin for Glimpse's existing SQL tab?

I'm trying to log my SQL queries and the currently available extensions don't support our in-house SQL library. I have written a custom plugin which logs what I want, but it has limited functionality and it doesn't integrate with the existing SQL tab.
Currently, I'm logging to my custom plugin using a single helper method inside my DAL's base class. This function takes the SqlCommand and the duration in order to show data on my custom tab:
// simplified example:
Stopwatch sw = Stopwatch.StartNew();
sqlCommand.Connection = sqlConnection;
sqlConnection.Open();
object result = sqlCommand.ExecuteScalar();
sqlConnection.Close();
sw.Stop();
long duration = sw.ElapsedMilliseconds;
LogSqlActivity(sqlCommand, null, duration);
This works well on my 'custom' tab but unfortunately means I don't get metrics shown on Glimpse's HUD.
Is there a way I can provide Glimpse directly with the info it needs (in terms of method names, and parameters) so it displays natively on the SQL tab?
The following advice is based on the fact that you can't use DbProviderFactory and you can't use a proxied SqlCommand, etc.
The data that appears in the "out-of-the-box" SQL tab is based on messages of given types being published through our internal Message Broker (see below for information on this). Because of the above limitations in your case, to get things lighting up correctly (i.e. your data showing up in the HUD and the SQL tab), you will need to simulate the work that we do under the covers when we publish these messages. This shouldn't be that difficult and, once done, should just work moving forward.
If you have a look at the various proxies we have here, you will be able to see what messages we publish in what circumstances. Here are some highlights:
DbCommand
Log command start - here
Log command error - here
Log command end - here
DbConnection:
Log connection open - here
Log connection closed - here
DbTransaction
Log Started - here
Log committed - here
Log rollback - here
Other
Command row count here - Glimpse calculates this at the DbDataReader level, but you could do it elsewhere as well
Now that you have an idea of what messages we are expecting and how we generate them, as long as you pass in the right data when you publish those messages, everything should just light up. If you are interested, here is the code that looks for the messages that you will be publishing.
Message Broker: If you look at the GlimpseConfiguration here you will see how to access the Broker. This can be done statically if needed (as we do here). From here you can publish the messages you need.
Helpers: For generating some of the above messages, you can use the helpers inside the Support class here. I would have shifted all the code for publishing the actual messages to this class, but I didn't think there would be too many people doing what you are doing.
Update 1
Starting point: With the above approach you shouldn't need to write your own plugin. You should just be able to access the broker via GlimpseConfiguration.GetConfiguredMessageBroker() (make sure you check whether it's null, which it is if Glimpse is turned off, etc.) and publish your messages.
I would imagine that you would put the inspection code that leverages the broker and publishes the messages wherever you have knowledge of the information that needs to be collected (i.e. inside your custom lib). Normally this would require references to Glimpse inside your lib (which you may not want), so to protect against this, from your lib you would call a proxy (which could be another VS project) that has the Glimpse dependency. Hence your ADO lib only has references to your own code.
To get your toes wet, try just publishing a couple of fake connection and command messages. Assuming the broker you get from GlimpseConfiguration.GetConfiguredMessageBroker() isn't null, these should just show up. Then you can work towards getting real data into it from your lib.
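As a rough sketch of that starting point (the message type MyFakeCommandMessage and the publisher class are stand-ins invented for illustration; to light up the real SQL tab you would publish the actual ADO message types generated by the proxies linked above, and namespaces may differ slightly between Glimpse versions):

using System;
using Glimpse.Core.Extensibility;   // IMessageBroker
using Glimpse.Core.Framework;       // GlimpseConfiguration

// Placeholder message type - replace with the real Glimpse ADO messages.
public class MyFakeCommandMessage
{
    public string CommandText { get; set; }
    public TimeSpan Duration { get; set; }
}

public static class GlimpseSqlPublisher
{
    public static void Publish(string commandText, TimeSpan duration)
    {
        // Null when Glimpse is turned off or not yet configured.
        IMessageBroker broker = GlimpseConfiguration.GetConfiguredMessageBroker();
        if (broker == null)
        {
            return;
        }

        broker.Publish(new MyFakeCommandMessage
        {
            CommandText = commandText,
            Duration = duration
        });
    }
}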
Update 2
Obsolete Broker Access
It's marked as obsolete because it's going to change in v2. You will still be able to do what you need to do, but the way of accessing the broker has changed. For what you currently need to do, this is OK.
Sometimes null
As you have found, this really depends on where in the page lifecycle you currently are. To get around this, I would probably change my original recommendation a little.
In the code where you are currently creating messages and pushing them to the message bus, try putting them into HttpContext.Current.Items instead. If you haven't used it before, this is a store which ASP.NET provides out of the box and which lasts for the lifetime of a given request. You could keep a list in there: still create the message objects as you are doing, but put them into that list instead of pushing them through the broker.
Then, create an HttpModule (it's really simple to do) which taps into the PostLogRequest event. Within this handler, you would pull the list out of the context, iterate through it and push the messages into the message broker (accessed the same way you have been doing); see the sketch below.
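A minimal sketch of such a module, under the same assumptions as above (the Items key and the buffered message type are placeholders; PostLogRequest requires the IIS integrated pipeline):

using System;
using System.Collections.Generic;
using System.Web;
using Glimpse.Core.Framework;

public class DeferredGlimpseMessageModule : IHttpModule
{
    // Placeholder key under which the DAL buffers its messages per request.
    public const string ItemsKey = "MyDal.BufferedGlimpseMessages";

    public void Init(HttpApplication application)
    {
        application.PostLogRequest += OnPostLogRequest;
    }

    private static void OnPostLogRequest(object sender, EventArgs e)
    {
        var context = ((HttpApplication)sender).Context;
        var buffered = context.Items[ItemsKey] as List<MyFakeCommandMessage>;
        if (buffered == null)
        {
            return;
        }

        var broker = GlimpseConfiguration.GetConfiguredMessageBroker();
        if (broker == null)
        {
            return;
        }

        foreach (var message in buffered)
        {
            broker.Publish(message);
        }
    }

    public void Dispose()
    {
    }
}

Register the module in web.config (system.webServer/modules for the integrated pipeline) and have your DAL add its message objects to HttpContext.Current.Items[DeferredGlimpseMessageModule.ItemsKey] instead of publishing them directly.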

Using a search help for a different system?

I need a search help for my BUKRS field. The problem is that the data should come from a different system. There are two systems, X and Y; I am in system X and running a program.
The BUKRS field is on the selection screen. When I click the search help, the data should come from system Y.
I heard that it is possible to pull data into a search help, but I couldn't find enough information online.
Best Regards.
5 months of experience with SAP/ABAP :)
To create a search help with a custom data selection, you can define a search help exit in your search help. To select data in a custom way and not from a database table, go to the "Definition" tab of the search help, remove the content of the "Selection Method" input and enter a function module into the "Search help exit" input. This function module must have the same signature as the example module F4IF_SHLP_EXIT_EXAMPLE; the comments in the source code of this example module explain how to implement it. Your implementation of this function module can then perform the data acquisition from another system with a remote function call.
To get data from another system, you have to call a function module in a remote system via RFC (remote function call). To do this you need:
An RFC-capable function module in the remote system which exports the data you need. You can mark a function module as remote-enabled on its attributes tab in the Function Builder.
An RFC connection from the local system to the remote system. RFC destinations can be created and configured with the transaction SM59.
To call a function module via RFC, you just have to add DESTINATION [rfc-destination] to the function call:
CALL FUNCTION 'Z_YOUR_RFC_CAPABLE_FUNCTION_MODULE'
  DESTINATION 'my_rfc_destination'
  IMPORTING
    [...].
The user will have to log into the remote system in order to call RFC function modules in it, unless you define a username and password in the RFC connection. If you do that, you should create a dedicated system user in the remote system with minimal permissions specifically for this RFC connection. If that user has overly broad permissions, the RFC connection can be abused for other purposes.
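Putting the two parts together, here is a rough sketch of such a search help exit. The RFC-enabled module Z_BUKRS_GET_REMOTE, the destination SYSTEM_Y and the use of T001 as result structure are placeholders; the interface is the one copied from F4IF_SHLP_EXIT_EXAMPLE, and F4UT_RESULTS_MAP is the standard helper that fills RECORD_TAB from an internal table.

FUNCTION z_bukrs_shlp_exit.
* Same interface as F4IF_SHLP_EXIT_EXAMPLE:
*   TABLES   shlp_tab TYPE shlp_desct
*            record_tab STRUCTURE seahlpres
*   CHANGING value(shlp) TYPE shlp_descr
*            value(callcontrol) LIKE ddshf4ctrl

  DATA lt_bukrs TYPE STANDARD TABLE OF t001.

* Only take over the data selection step.
  IF callcontrol-step <> 'SELECT'.
    EXIT.
  ENDIF.

* Placeholder: RFC-enabled module in system Y that returns the company codes.
  CALL FUNCTION 'Z_BUKRS_GET_REMOTE'
    DESTINATION 'SYSTEM_Y'
    TABLES
      et_t001 = lt_bukrs.

* Map the remotely selected rows into the search help result table.
  CALL FUNCTION 'F4UT_RESULTS_MAP'
    EXPORTING
      source_structure = 'T001'
    TABLES
      shlp_tab    = shlp_tab
      record_tab  = record_tab
      source_tab  = lt_bukrs
    CHANGING
      shlp        = shlp
      callcontrol = callcontrol.

* Skip the standard selection and display the hit list directly.
  callcontrol-step = 'DISP'.

ENDFUNCTION.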