"Install" SAP custom program in SAP System - abap

I've created an SAP program and I want to deploy it in another SAP system.
I know I can import the Transport Request files with the created program to the new system but I'm looking for other options.
Is it possible to "install"/import my program to another SAP system?
Regards

I can only think you don't want to use the transport system because the systems are not part of the same landscape? If so, you can still use the transport system, you just need to manually move the required files around.
But there is another approach you can follow: using SAPlink. It's an open source program that allows you to download ABAP source, dictionary objects, etc. from one system into files and then upload them into another system. Of course, both systems will need to have SAPlink installed for this to work.

This is somewhat by design: SAP is the largest off-the-shelf (OTS) system available, and there have to be some controls to ensure that people cannot install software unless they are specifically given the authorization to do so.
Even using SAPlink (which mjturner suggests) requires you to have the ability to install that software first, and I doubt you will find it in very many productive landscapes, so that likely won't be an option.
Assuming you have developer authorization, you can always download the source code from your development SAP system and then upload it from within the ABAP editor (SE38) using Utilities -> More Utilities -> Upload/Download. Note that this doesn't work in the class editor, so cut and paste is another option.
Later.......

There are three ways to move transports from one system to another.
1. Moving the transport files from the directories “data” and “cofiles” manually.
When the transport is released in the source system, SAP automatically puts the transport files into the transport directory on the file system. These files can easily be copied to the second system and imported via transaction “STMS”.
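For example, releasing transport request DEVK900123 from a system with ID DEV produces two files (the transport directory is typically \usr\sap\trans, though this can differ per installation):
\usr\sap\trans\data\R900123.DEV
\usr\sap\trans\cofiles\K900123.DEV
Copy both files into the corresponding data and cofiles directories of the target system, then add the request to the import queue in STMS (Extras -> Other Requests -> Add) before importing.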
2. Using CAR files
CAR files are packed files, like a zip file. They contain the data and cofiles.
car.exe -cvf packedFile.car data\R900000.XXX cofiles\K900000.XXX
(car.exe is a SAP standard tool, XXX is the system ID)
These CAR files can be imported via transaction SAINT. This allows importing files from the frontend into the data and cofiles directories without direct access to the file system. After importing the file via SAINT, the transport can be imported using STMS. This is the common way to transport software to systems outside the current landscape.
3. Using SAR and PAT files
These files are more special. They allow software to be installed as an Add-On in SAP, which is required if the program should be certified by SAP. They have to be created using the AAK (SAP Add-On Assembly Kit). Unfortunately, I have not created these files myself yet, but it seems to be quite complex to get this running, because there are some checks which have to be passed. The files can be imported via transaction SPAM (upload) and SAINT (import).

Related

Pdf2htmlEX common error "Cannot load font"

Running the pdf2htmlEX.exe Windows binary from the command prompt works as expected. However, running the pdf2htmlEX Windows binary in a wrapper (.NET in my case), I received an error like the one below.
__tmp_font1.ttf is not in a known format (or uses features of that format fontforge does not support, or is so badly corrupted as to be unreadable)
Cannot load font C:\Users\admin\AppData\Local\Temp\pdf2htmlEX-5RLDCX/__tmp_font1.ttf
This is a pretty ambiguous error, and appears to be common among users of the Windows binary version.
Apparently Lu Wang wasn't able to offer a solution for Windows users, as all related posts are marked 'insufficient info'. Unfortunately, the pdf2htmlEX project is also archived and no new comments can be added, so I'm adding this information here in the hope that it may help someone else in the future.
In my scenario, the library is called via an ASP.Net wrapper method using System.Diagnostics.Process to convert uploaded files into HTML versions. The Pdf2htmlEX library would work without issue from the Command Prompt, and for some reason, would also work perfectly in my development environment, but not in a production environment (Both of which are Windows Server 2012R2).
My first assumption, and correctly so, was that there was a permissions issue. Pdf2htmlEX uses FontForge internally to handle fonts, and one or both use the Windows Temp directory by default to store resource files used in the creation of the HTML and/or other files. I believe, although I have not confirmed it, that it may also use the active user's %USERPROFILE%\AppData\Local\Temp folder.
When running test commands from Command Prompt, you are operating under your user context, and everything your user can do, Pdf2htmlEX can do. So everything works as expected.
In a server environment, the process operates under the ApplicationPoolIdentity, a special IIS user type with limited permissions. Here it failed for me: while I could see folders and files created in the Windows Temp folder, Pdf2htmlEX could not open them to create the end files elsewhere.
Solution: (there may be other solutions for your individual case)
In my case, adding a new system user, adding that user to the Users group, and then setting the IIS worker process to run under that account resolved the issue. The reason, I believe, is that the Users group has read/write access to the Windows Temp directory, and potentially to other areas of the system that Pdf2htmlEX needs to complete.
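If changing the worker process account is not possible in your environment, another avenue worth trying (untested, an assumption on my part) is redirecting the temp directory of the child process to a folder the pool identity can write to. A minimal C# sketch; the install path, output folder, and temp folder are placeholders:

using System.Diagnostics;

class Pdf2HtmlRunner
{
    static void Convert(string inputPdf)
    {
        var psi = new ProcessStartInfo
        {
            FileName = @"C:\tools\pdf2htmlEX\pdf2htmlEX.exe", // placeholder install path
            Arguments = "--dest-dir C:\\converted " + inputPdf,
            UseShellExecute = false,       // required to override environment variables
            RedirectStandardError = true
        };

        // Assumption: FontForge resolves its temp folder from TEMP/TMP, so
        // point both at a directory the ApplicationPoolIdentity can read and write.
        psi.EnvironmentVariables["TEMP"] = @"C:\pdf2htmlEX-temp";
        psi.EnvironmentVariables["TMP"] = @"C:\pdf2htmlEX-temp";

        using (var process = Process.Start(psi))
        {
            string errors = process.StandardError.ReadToEnd(); // surface FontForge errors
            process.WaitForExit();
        }
    }
}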

How to manage database credentials for a Mule project

I am using the database connector component, with the vault component to store the database credentials. As per the documentation of both components, I have created a different properties file for each environment to store the encrypted credentials.
Following is the structure of my mule project
Now the problem with this structure is that I have to build a new deployable zip file whenever I have to update the database credentials for any environment.
I need a solution where I can keep all credentials encrypted and centralized, and I don't have to create a build every time the credentials are updated. We can afford to restart the server, but building a new zip and deploying it is really cumbersome.
The second problem with this approach is that a developer needs to know the production DB credentials to put them in the properties file, which is also a security issue.
Please suggest an alternate approach for credentials management in Mule projects.
I'm going to recommend you do NOT try to change the secure solution provided to you by MuleSoft. To alleviate the need for packaging and deployment, you would have to extract the properties files outside of the deployment and this would be a huge risk. Regardless of where you store the property files within the deployment if you change the files, you have to package and re-deploy. I see the only solution to your problem as moving the files outside of the deployment and securely storing them. Mule has provided a solution while it may be cumbersome, they are securing these files first with encryption and secondly within the server container. You can move out the property files but you have to provide a custom implementation and you will be assuming great risk to your protected resources.
Set a VM argument, e.g. environment.type=local, for your local machine in Anypoint Studio.
Read this variable wherever you load your properties file, so that the environment type is resolved dynamically, as below.
<secure-property-placeholder:config location="classpath:properties/sample-app-${environment.type}.properties" doc:name="Secure Property Placeholder"/>
In order to set the environment type on your production server (or wherever you are running the Mule runtime), open <Mule home>\conf\wrapper.conf and add the argument wrapper.java.additional.<n>=-Denvironment.type=production. If you already have properties in this file, you may need to set the value of <n> appropriately, for example 13 or 14.
This way you don't need to generate different deployment artefacts for different environments, because the correct properties file is picked up via the environment-specific VM argument.

How to copy an *.exe file from one computer and paste it to another computer over LAN

For example, I have a client-server application that often gets updated (it's an exe file). If I download the update on the server machine, then the same update should be transferred to the client machines, or vice versa.
At the moment the update is downloaded on all machines individually. My idea is that the update should be downloaded only on the server, and I'm planning to add an option in the client to copy the *.exe file (update) directly from the server and place it in the installation path.
How can I make this happen?
NOTE : the update is a self extracting file.
There is already a technology for achieving this called ClickOnce. The client application can be "published" to a share that is accessible to all the clients, then each time the client is executed a version check is done - if a later version is detected on the share then it is downloaded before execution continues.
You can read more about this here: ClickOnce Security and Deployment. Creating a ClickOnce package and publishing it is a feature already built into Visual Studio, so you do not need to write any code.
You have to write an application that is split into different parts:
Detect file changes on either the client or the server machine
Perform the copy of the exe file to the server or the other clients
In any case, you can't tell an exe file to update itself.
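For illustration, a minimal C# sketch of the copy step, assuming the server exposes the update on a UNC share (both paths are placeholders):

using System;
using System.IO;

class UpdateCopier
{
    // Placeholder paths: adjust to your share and installation layout.
    const string ServerExe = @"\\server\updates\MyApp.exe";
    const string LocalExe = @"C:\MyApp\MyApp.exe";

    static void Main()
    {
        // Copy only when the server copy is newer than the local one.
        if (!File.Exists(LocalExe) ||
            File.GetLastWriteTimeUtc(ServerExe) > File.GetLastWriteTimeUtc(LocalExe))
        {
            // A running exe cannot overwrite itself, so copy to a staging
            // name and swap it in on next start or from a separate updater.
            File.Copy(ServerExe, LocalExe + ".new", true);
        }
    }
}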

Where are the best locations to write an error log in Windows?

Where would you write an error log file, say ErrorLog.txt, in Windows? Keep in mind the path would need to be open to basic users for file write permissions.
I know the eventlog is a possible location for writing errors, but does it work for "user" level permissions?
EDIT: I am targeting Windows 2003, but I was posing the question in such a way as to have a "General Guideline" for where to write error logs.
As for the EventLog, I have had issues before in an ASP.NET application where I wanted to log to the Windows event log, but I had security issues causing me heartache. (I do not recall the issues I had, but remember having them.)
Have you considered logging to the event viewer instead? If you want to write your own log, I suggest the user's local app data directory. Make a product directory under there. It's different on different versions of Windows.
On Vista, you cannot put files like this under c:\program files. You will run into a lot of problems with it.
In .NET, you can find out this folder with this:
Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData)
And the Event Log is fairly simple to use too:
http://msdn.microsoft.com/en-us/library/system.diagnostics.eventlog.aspx
Text files are great for a server application (you did say Windows 2003). You should have a separate log file for each server application, the location is really a matter of convention to agree with administrators. E.g. for ASP.NET apps I've often seen them placed on a separate disk from the application under a folder structure that mimics the virtual directory structure.
For client apps, one disadvantage of text files is that a user may start multiple copies of your application (unless you've taken specific steps to prevent this). So you have the problem of contention if multiple instances attempt to write to the same log file. For this reason I would always prefer the Windows Event Log for client apps. One caveat is that you need to be an administrator to create an event log - this can be done e.g. by the setup package.
If you do use a file, I'd suggest using the folder Environment.SpecialFolder.LocalApplicationData rather than SpecialFolder.ApplicationData as suggested by others. LocalApplicationData is on the local disk: you don't want network problems to stop you from logging when the user has a roaming profile. For a WinForms application, use Application.LocalUserAppDataPath.
In either case, I would use a configuration file to decide where to log, so that you can easily change it. E.g. if you use Log4Net or a similar framework, you can easily configure whether to log to a text file, event log, both or elsewhere (e.g. a database) without changing your app.
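For illustration, a minimal C# sketch of that idea, reading the log location from App.config; the "LogPath" key and the fallback path are assumptions:

using System;
using System.Configuration; // reference the System.Configuration assembly
using System.IO;

static class ErrorLogger
{
    public static void Log(string message)
    {
        // Fall back to LocalApplicationData if no path is configured.
        string path = ConfigurationManager.AppSettings["LogPath"];
        if (string.IsNullOrEmpty(path))
        {
            path = Path.Combine(
                Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData),
                @"MyApp\ErrorLog.txt");
        }

        Directory.CreateDirectory(Path.GetDirectoryName(path));
        File.AppendAllText(path,
            DateTime.Now.ToString("u") + " ERROR " + message + Environment.NewLine);
    }
}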
The standard location(s) are:
C:\Documents and Settings\All Users\Application Data\MyApp
or
C:\Documents and Settings\%Username%\Application Data\MyApp
(aka %UserProfile%\Application Data\MyApp) which would match your user level permission requirement. It also separates logs created by different users.
Using .NET runtime, these can be built as:
AppDir = System.Environment.GetFolderPath(Environment.SpecialFolder.CommonApplicationData)
or
AppDir = System.Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData)
followed by:
MyAppDir = IO.Path.Combine(AppDir, "MyApp")
(Which, hopefully, maps Vista profiles too).
Personally, I would suggest using the Windows event log, it's great. If you can't, then write the file to the ApplicationData directory or the ProgramData (Application Data for all users on Windows XP) directory.
The Windows event log is definitely the way to go for logging of errors. You're not limited to the "Application" log as it's possible to create a new log target (e.g. "My Application"). That may need to be done as part of setup as I'm not sure if it requires administrative privileges or not. There's a Microsoft example in C# at http://support.microsoft.com/kb/307024.
Windows 2008 also has Event Log Forwarding which can be quite handy with server applications.
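A rough self-contained sketch of that approach; the source name "My Application" is an example, and the one-time source creation is the part that needs administrative rights:

using System.Diagnostics;

class EventLogExample
{
    const string Source = "My Application"; // example source name

    static void Main()
    {
        // One-time registration, typically done by the installer; checking
        // for the source may itself require elevated rights on some systems.
        if (!EventLog.SourceExists(Source))
        {
            EventLog.CreateEventSource(Source, "My Application");
        }

        // Normal runtime logging once the source exists.
        EventLog.WriteEntry(Source, "Something went wrong", EventLogEntryType.Error);
    }
}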
I agree with Lou on this, but I prefer to set this up in a configuration file like Joe said. You can use
<file value="${APPDATA}/Test/log-file.txt" />
("Test" could be whatever you want, or removed entirely) in the configuration file, which causes the log file to be written to "/Documents and Settings/LoginUser/Application
Data/Test" on Windows XP and to "/Users/LoginUser/AppData/Roaming/Test on Windows Vista.
I am just adding this as I just spent way too much time figuring how to make this work on Windows Vista...
This works as-is with Windows applications. To use logging in web applications, I found Phil Haack's blog entry on this to be a great resource:
http://haacked.com/archive/2005/03/07/ConfiguringLog4NetForWebApplications.aspx
%TEMP% is always a good location for logs I find.
Going against the grain here - it depends on what you need to do. Sometimes you need to manipulate the results, so log.txt is the way to go. It's simple, mutable, and easy to search.
Take an example from Joel: FogBugz will send a log / dump of error messages via HTTP to their server. You could do the same and not have to worry about the user's access rights on their drive.
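A minimal sketch of that idea, assuming you host a collection endpoint yourself (the URL below is a placeholder):

using System.Net;

class ErrorReporter
{
    static void Report(string details)
    {
        using (var client = new WebClient())
        {
            // Post the error text to your own endpoint instead of writing to disk.
            client.UploadString("https://example.com/error-report", details);
        }
    }
}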
I personally don't like to use the Windows Event Log where I am right now because we do not have access to the production servers, so that would mean we would need to request access every time we wanted to look at the errors. It is not a speedy process, unfortunately, so your troubleshooting is completely halted while you wait for someone else. I also don't like that the entries kind of get lost among the ones from other applications. Sure, you can sort, but it's just a bit of a nuisance scrolling down. What you use will end up being a combination of personal preference coupled with the limitations of the environment you are working in (log file, event log, or database).
Put it in the directory of the application. The users will need access to the folder to run the application, and you can check write access on application startup.
The event log is a pain to use for troubleshooting, but you should still post significant errors there.
EDIT - You should look into the MS Application Blocks for logging if you are using .NET. They really make life easy.
Jeez Karma-killers. Next time I won't even offer a suggestion when the poster puts up an incomplete post.

Accessing a resource file from a filesystem plugin on SymbianOS

I cannot use the Resource File API from within a file system plugin due to a PlatSec issue:
*PlatSec* ERROR - Capability check failed - Can't load filesystemplugin.PXT because it links to bafl.dll which has the following capabilities missing: TCB
My understanding of the issue is that:
File system plugins are dlls which are executed within the context of the file system process. Therefore all file system plugins must have the TCB PlatSec privilege which in turn means they cannot link against a dll that is not in the TCB.
Is there a way around this (without resorting to a text file or an intermediate server)? I suspect not - but it would be good to get a definitive answer.
The Symbian file server has the following capabilities:
TCB ProtServ DiskAdmin AllFiles PowerMgmt CommDD
So any DLL being loaded into the file server process must have at least these capabilities. There is no way around this, short of writing a new proxy process as you allude to.
However, there is a more fundamental reason why you shouldn't be using bafl.dll from within a file server plugin: this DLL provides utility functions which interface to the file server's client API. Attempting to use it from within the file server will not work; at best, it will lead to the file server deadlocking as it attempts to connect to itself.
I'd suggest rethinking what you're trying to do, and investigating an internal file-server API to achieve it instead.
Using RFs/RFile/RDir APIs from within a file server plugin is not safe and can potentially lead to deadlock if you're not very careful.
Symbian 9.5 will introduce new APIs (RFilePlugin, RFsPlugin and RDirPlugin) which should be used instead.
There's a proper mechanism for communicating with plugins: RPlugin.
Do not use RFile. I'm not even sure it would work, as the path is checked in Initialise of the RFile functions, which is called before the plugin stack.
Tell us what kind of data you are storing in the resource file.
Things that usually go into resource files have no place in a file server plugin, even if that means hardcoding a few values.
Technically, you can send data to a file server plugin using RFile.Write() but that's not a great solution (intercept RFile.Open("invalid file name that only your plugin understands") in the plugin).
EDIT: Someone indicated that using an invalid file name will not let you send data to the plugin. Hey, I didn't like that solution either. For the sake of completeness, I should clarify: make up a filename that looks OK enough to get through to your plugin, like using a drive letter that doesn't have a real drive attached to it (but will still be considered correct by filename-parsing code).
Writing code to parse the resource file binary in the plugin, while theoretically possible, isn't a great solution either.