Any plans to support OPENROWSET for dedicated SQL pools?
Advantages:
More (and faster) parser options than External File Format, such as row delimiter
Can auto-infer schema
More convenient to define the file format directly
This is a great feature request, Anders. I see that you have already created a feature request; I would encourage everyone to vote for it here.
Related
Question - Does Informatica PowerCenter provide an API to access session logs? I believe not, but I wanted to throw it out to the forum to be sure.
Objective - Actually, I want to extract session logs, process them through Logstash, and perform reactive analytics periodically.
Alternate - The same could be solved using a Logstash input plugin for Informatica, but I did not find that either.
Usage - This will be used to determine common causes of failure, analyze cache usage at the session level, measure throughput, and find any performance bottlenecks.
You can call Informatica Webservice's getSessionLog. Here's a sample blog post with details: http://www.kpipartners.com/blog/bid/157919/Accessing-Informatica-Web-Services-from-3rd-Party-Apps
I suppose that the correct answer is 'yes', since there is a command-line tool to convert log files to txt or even XML format.
The tool for session/workflow logs is called infacmd with the 'getsessionlog' argument. You can look it up in the help section of your PowerCenter client or here:
https://kb.informatica.com/proddocs/Product%20Documentation/5/IN_101_CommandReference_en.pdf
That has always been enough for my needs.
But there is more to look into: when you run this command-line tool (which is really a BAT file), a java.exe does the bulk of the processing in a sub-process. The jar files used by this process could potentially be utilized by somebody else directly, but I don't know whether that has been documented anywhere publicly available.
Perhaps someone else knows the answer to that.
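To tie this back to the original objective (extracting session logs and feeding them to Logstash), a small wrapper around the infacmd call can be sketched. The exact option names below are placeholders - verify them against the infacmd Command Reference for your PowerCenter version - and the JSON-lines helper just wraps raw log lines for a Logstash file or TCP input:

```python
import json
import subprocess

# NOTE: the flag names below are assumptions; check the infacmd Command
# Reference (getsessionlog) for the exact options in your version.
def build_getsessionlog_cmd(domain, user, password, integration_service,
                            folder, workflow, session, out_file):
    """Assemble the infacmd call that dumps a session log to a file."""
    return [
        "infacmd.sh", "isp", "getSessionLog",
        "-dn", domain, "-un", user, "-pd", password,
        "-is", integration_service, "-fd", folder,
        "-wf", workflow, "-ss", session,
        "-fm", "xml", "-lo", out_file,
    ]

def log_lines_to_json(lines):
    """Wrap raw log lines as JSON documents for Logstash ingestion."""
    return [json.dumps({"message": line.rstrip("\n")}) for line in lines]

def fetch_session_log(cmd):
    # Runs the BAT/shell wrapper; the bulk of the work happens in the
    # java.exe sub-process mentioned above.
    subprocess.run(cmd, check=True)
```

Scheduling this periodically and pointing a Logstash input at the output file would give the periodic reactive analytics the question asks for.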
I need to create a lot of SAP roles and profiles with a little difference between them.
Is there any way to do this using ABAP, or is there any template for the file to be uploaded using the PFCG transaction?
I'm pretty new to SAP, so if you have any documents about that, please send them.
Thanks in advance.
Quite often you can use the Legacy System Migration Workbench (transaction LSMW). The workbench works like a sort of macro recorder: you record the steps in a transaction and replay that recording any number of times, replacing the values you used in your recorded transaction with new ones, for instance read from a text file. There are a few limitations though:
handling table controls is quite tricky
the steps for all iterations have to be the same. You can't just omit some part of your recording because you only need it for some of the records.
A lot more complex would be creating your own batch input (that is the technology used to replay recorded transactions) using ABAP code you write yourself. There you would be more flexible, for instance adding different numbers of privileges to different roles. That batch input would then be executed using the "CALL TRANSACTION ... USING" statement (see here).
If you can manage to restrict the differences to organizational hierarchy fields, you can use the built-in function to derive roles. This way, you can create a master role and a number of derived roles that only differ in specific values. You should be able to use the LSMW mentioned by Dirk Trilsbeek to create the derived roles, if necessary.
If this is not possible, you could try to create the role once, download it and check the contents of the file - it's basically a line-based fixed-width format with the first field of each line describing the line type, IIRC - just compare the contents of each line to the structures named. If you are familiar with any programming environment that is able to handle text output, it's not too hard to generate files containing the new roles with any toolkit you're comfortable with. I've successfully used XText / XPand for this, but it doesn't really matter. You can then upload the roles from the generated text files.
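Generating that line-based, fixed-width file is straightforward in any language with text output, as the answer says. Here is a minimal sketch; the line-type tags and field widths below are hypothetical - derive the real layout by downloading one role from PFCG and inspecting the file:

```python
# Sketch of generating a line-based, fixed-width role file for PFCG upload.
# The line types ("ROLE", "AUTH") and field widths are ASSUMPTIONS; take the
# real layout from a role you downloaded yourself.
def fixed(value, width):
    """Left-justify and pad/truncate a field to its fixed width."""
    return str(value)[:width].ljust(width)

def role_file_lines(role_name, description, auth_values):
    """auth_values: list of (auth_object, field, value) triples."""
    lines = []
    # First field of each line is the (assumed) line-type tag.
    lines.append(fixed("ROLE", 6) + fixed(role_name, 30) + fixed(description, 60))
    for obj, field, value in auth_values:
        lines.append(fixed("AUTH", 6) + fixed(obj, 10) + fixed(field, 10) + fixed(value, 40))
    return lines
```

Looping this over a list of role variants yields one upload file per derived role, which is exactly the "little difference between them" case in the question.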
On my BizTalk server I use several different credentials to connect to internal and external systems. There is an upcoming task to change the passwords for a lot of systems and I'm searching for a solution to simplify this task on my BizTalk server.
Is there a way that I could adjust the File/FTP adapters to extract the information from an XML file so that I can change it only in the XML file and everything will be updated or is there an alternative that I could use such as PowerShell?
Has anyone else had this task as well?
I'd rather not create a custom adapter, but if there is no alternative I will go for that. Using dynamic credentials for the send port can be solved with an orchestration, but I need this for the receive port as well.
You can export the bindings of all your applications. All the passwords for the FTP and File adapters will be masked out with a series of asterisks (*).
You could then trim your binding file down to just those ports you want to update, replace the masked-out passwords with the correct ones, and, when you want the passwords changed, import it.
Unfortunately, unless you have already prepared tokenised binding files, the above is a manual effort.
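Part of that manual effort can be scripted. A sketch of swapping the masked-out passwords in an exported binding file follows; the element names are illustrative, and in a real binding export the password often sits inside the adapter's TransportTypeData as an embedded XML string, so you may need a second parsing pass:

```python
import xml.etree.ElementTree as ET

# Sketch: replace masked-out passwords in an exported BizTalk binding file.
# Element/attribute names ("SendPort", "Name", "Password") are ASSUMPTIONS
# about the export layout; adjust them to match your actual binding file.
def unmask_passwords(binding_xml, passwords_by_port):
    root = ET.fromstring(binding_xml)
    for port in root.iter("SendPort"):
        name = port.get("Name")
        pwd = port.find(".//Password")
        # Only touch values that are still a run of mask asterisks.
        if (pwd is not None and name in passwords_by_port
                and pwd.text and set(pwd.text) == {"*"}):
            pwd.text = passwords_by_port[name]
    return ET.tostring(root, encoding="unicode")
```

The edited XML can then be re-imported, keeping the per-port passwords in one file as the question asks.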
I was going to recommend that you take a look at Enterprise Single Sign-On, but on second thoughts, I think you probably just need to 'bite the bullet' and make the change in the various Adapters.
ESSO would be beneficial if you have a single Adapter with multiple endpoints/credentials, but I infer from your question that this isn't the case (i.e. you're not just using a single adapter). I also don't think re-writing the adapters to include functionality to read usernames/passwords from a file is feasible - just changing the passwords would be much faster, by an order of weeks or months ;-)
One option that is available to you, however, depending on which direction the adapter is being used: if you need to change credentials on Send Adapters, you should consider setting usernames/passwords at runtime via the various Adapter Property Schemas (see http://msdn.microsoft.com/en-us/library/aa560564.aspx for the FTP Adapter Properties, for example). You could then easily create an encoding Send Pipeline Component that reads an XML file containing credentials and updates the message context properties accordingly; the message would then be sent with the appropriate credentials to the required endpoint.
There is also the option of using ESSO as your (encrypted) config store instead of XML files / a database etc. Richard Seroter has a really good post on this from way back in 2007 (it's still perfectly valid, though).
I'm in a situation where I need to query modifications out of a DirX Directory Server (LDAP).
In more common products like OpenDS, Oracle DSEE, etc., there is usually some kind of changelog that can be queried, which gives you the sequence of modifications performed on that server.
Unfortunately, there is basically no information available online that helps me with this question.
Can anybody with some insight into DirX give some hints as to whether DirX provides anything like this?
DirX doesn't provide the cn=changelog node/subtree that you're looking for.
DirX changelogs are written as LDIF change files. These files can simply be dumped to the filesystem for later use/processing, or, as they are written, you can invoke any application/script you like to do something with the LDIF data. For example, you can pipe the LDIF data to ldapmodify and send every change made in DirX out to another LDAP server in real time. You could pipe the data to a custom application or script that filters it for certain types of operations and writes the wanted info to a SQL db, or to whatever output you want. There really aren't any limits here. You just need to read LDIF.
The LDIF data can be written (and piped to your application/script) on change to handle real-time requirements, or on a scheduled basis for batch based processes.
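"You just need to read LDIF" can be as small as this sketch: records are separated by blank lines, and each change record carries a dn: and a changetype: attribute, which is enough to filter by operation type before shipping the data on. (This deliberately ignores LDIF continuation lines and base64-encoded values; a real consumer should handle both per RFC 2849.)

```python
# Minimal LDIF change-record reader for filtering DirX changelog output.
# Ignores folded lines (leading space) and base64 values ("::") -- add
# handling for those before using this on real data.
def parse_ldif_changes(text):
    records = []
    for block in text.strip().split("\n\n"):
        entry = {}
        for line in block.splitlines():
            if ":" in line and not line.startswith(" "):
                key, _, value = line.partition(":")
                entry.setdefault(key.strip(), value.strip())
        if "dn" in entry:
            records.append(entry)
    return records

def only(records, changetype):
    """Keep only records of one operation type, e.g. 'delete' or 'modify'."""
    return [r for r in records if r.get("changetype") == changetype]
```

Run on a schedule against the dumped files, or on change for the real-time case described above.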
BTW, I've seen implementations where the cn=changelog node (like you'd find on Oracle DSEE) is created in DirX using the LDIF changelog data. i.e. as the LDIF data is written on change, the data is piped to a script that creates the entries you expect under cn=changelog. Obviously this was done to provide more familiar changelog functionality for Oracle DSEE users.
Check whether DirX supports the persistent search control. If it does, this provides change notification, but not history like the UnboundID change log or the retro-changelog of DSEE.
Can file operations, like creation of a file, be done in ABAP?
Yes it can be done.
You can use the OPEN DATASET / TRANSFER / CLOSE DATASET statements in ABAP to create files on the application server.
You can also create your file directly for a certain application, e.g. MS Excel, like so.
Also, there are several function modules and classes that can simplify certain tasks like gathering your report output or putting your file on the AS (such as GUI_UPLOAD / GUI_DOWNLOAD / WS_DOWNLOAD / SAP_CONVERT_TO_CSV_FORMAT, etc.) ...
Bear in mind that certain function modules were built for foreground tasks, so they won't work in background job scheduling ...
Yes, it's possible, as nict said before. You should start reading here - that's the official documentation, it covers pretty much everything, including working with files on both the application and the presentation server. It also explains how to use platform-independent filenames - always remember, someday you might encounter an application server running on OS/400 that will not let you write stuff to C:\Temp\MyExport.csv. One more hint: Be careful about the function modules nict mentioned, some of them are not safe to use when unicode content is involved. Always use the methods of class CL_GUI_FRONTEND_SERVICES to be on the safe side.
You can use the CL_GUI_FRONTEND_SERVICES class or the GUI_DOWNLOAD function module. Here is a link
You may use the CL_GUI_FRONTEND_SERVICES class, but these services only work on the front end. Or you can use function modules like GUI_DOWNLOAD, GUI_UPLOAD, etc.
We can create a flat file with the data entered into it, tab-separated.
That data corresponds to SAP table fields, where the tables belong to an application, say, material master.
We can then use the standard FMs to upload the data into the program's internal tables, followed by updating the database.
So, uploading flat-file data can be done.
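The flat file described above is just one tab-separated record per line, columns in the order of the target table fields. A small sketch of producing such a file outside SAP (field names here are illustrative material-master fields; GUI_UPLOAD can then read it with HAS_FIELD_SEPARATOR = 'X'):

```python
# Sketch: build tab-separated flat-file lines matching SAP table fields.
# The field names (MATNR, MAKTX) are illustrative examples only.
def flat_file_lines(field_names, rows):
    """rows: list of dicts keyed by field name; missing fields become empty."""
    return ["\t".join(str(row.get(f, "")) for f in field_names) for row in rows]

def write_flat_file(path, field_names, rows):
    with open(path, "w", encoding="utf-8") as f:
        f.write("\n".join(flat_file_lines(field_names, rows)) + "\n")
```

The upload FM then fills the program's internal table from this file, and the program posts the database update as described above.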