QTP adds objects to the local repository during Update Run mode - automation

I have created a total of 50 test scripts. All these scripts use almost the same objects, so I created a Shared Repository to ease the maintenance work.
But the problem is...
While executing scripts in Update Run mode, QTP adds some objects to the local repository even though they are already present in the Shared Repository. I have checked the properties of these newly added objects and there is no change in them either.
Can anybody please tell me the logic QTP uses to decide whether or not to add an object to the local repository?
Also, please tell me a solution so that no object is added to the local repository.

There are two things at work here.
1. QTP opens Shared Object Repositories (SORs) in read-only mode. Since SORs are by definition shared, QTP wants to make sure that when someone makes a change to a SOR they are aware that many tests may be affected. Therefore, in order to edit a SOR you have to use the Object Repository Manager, and no changes are made to a SOR automatically (Update Run Mode or Maintenance Run Mode).
2. When adding an object to the object repository, QTP first checks whether the object already exists (by comparing properties); if it does, it reuses the existing object and doesn't make a new entry.
This explains why you're getting objects in the Local Object Repository (LOR) and not in the SOR (I'm not sure whether having unmodified objects added to the LOR is the correct behaviour or not).
In order to propagate the changes after the Update Run Mode you should export the LOR to a new SOR and then merge it with your existing SOR.
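If you have to do this for many of your 50 tests, the export step can be scripted against QTP's object repository automation interface. Below is a minimal sketch, assuming the documented "Mercury.ObjectRepositoryUtil" COM object and pywin32; the paths are hypothetical examples, so verify the method names against your QTP version's Object Repository automation documentation.

# Minimal sketch (assumptions: QTP's Object Repository automation object
# "Mercury.ObjectRepositoryUtil" is registered on this machine, pywin32 is
# installed, and the paths below are hypothetical examples).
import win32com.client

repo_util = win32com.client.Dispatch("Mercury.ObjectRepositoryUtil")

# An action's local OR lives in the action folder as ObjectRepository.bdb.
repo_util.Load(r"C:\Tests\MyTest\Action1\ObjectRepository.bdb")

# Export the local objects to XML; the XML can then be imported into a new
# .tsr via the Object Repository Manager and merged with the existing SOR.
repo_util.ExportToXML(r"C:\Repos\Action1_LocalOR.xml")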

I also faced a similar kind of problem. While executing a script, QTP was not identifying an object even though the object was present in the shared OR and was being highlighted in the application. When I checked in Maintenance Run mode, there was a change in one property value, which I updated in the shared OR. The problem was that I had used Object Spy on that object and QTP had saved it in the local OR, so the next time I ran the script it referred to the local OR and the script failed. So I deleted the object (which was showing as Local) from the Object Repository window, and it then displayed the same object from the shared OR (whose path was in QC).

Is there a non-deterministic delay between writing an OID and creating a reference to the target OID, especially when done from separate processes?

I have two separate processes writing to the same git repository. One is responsible for creating the commit object, and one is supposed to wait until the object is found using git_object_lookup_prefix before moving on to actually create the reference.
It seems that occasionally I have some non-determinism, in that the git object is found, I try to write the reference, and the create_reference call fails reporting that the target oid isn't found in the repository.
I even tried to unload the repository, wait 5 seconds, reload the repository and try to create the reference again, but it still failed.
Ideas? Wait longer? Something else?
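One defensive pattern, in case the lookup and the reference creation are racing against the other process's write: retry the whole lookup-plus-create on a freshly opened repository handle, so you never act on a stale view of the object database. A rough sketch using the pygit2 binding of libgit2 (the question may be using a different binding; names and timings here are illustrative, not from the question):

# Sketch of a retry loop, assuming pygit2; attempts/delay values are made up.
import time
import pygit2

def create_ref_when_visible(repo_path, ref_name, oid_hex,
                            attempts=10, delay=0.5):
    """Poll until the object is visible, then create the reference."""
    for _ in range(attempts):
        # Re-open the repository on each attempt so we don't act on a
        # stale in-memory view of the object database.
        repo = pygit2.Repository(repo_path)
        obj = repo.get(oid_hex)  # returns None if the object isn't visible yet
        if obj is not None:
            try:
                return repo.create_reference(ref_name, obj.id)
            except pygit2.GitError:
                pass  # lost a race between lookup and create; retry
        time.sleep(delay)
    raise RuntimeError("object %s never became visible" % oid_hex)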

"Object already exported, no package change is possible" while mass package assignment

I need to change the package for ~250 SAP development objects (ABAP classes, data elements, tables, etc.). I'm getting error message TR242 (Object already exported, no package change is possible) when I try to make the change via the SE24/SE80 transactions or via the RSWBO052 report.
The SAP help docs say that the object must be copied under a new name, the old one deleted, and the new one renamed back to the old name. However, that is not a workable approach for 250 objects.
Is there any way to do a mass package change other than call transaction/LSMW for this case?
The problem occurred because I was trying to move the development objects to a non-transportable package, as @vwegert mentioned above. The target package was marked as non-transportable because it was flagged as a legacy one. This happened because the target package had been moved from a system with a Basis level lower than the current system's Basis level. The following steps are necessary to resolve the issue:
1. Migrate the legacy package via report RS_MIGRATE_PACKAGES (see note 1711900). The 'legacy package' flag will be removed, but the package will still be non-transportable. However, you will be able to recreate the package after the migration.
2. Delete the non-transportable target package and create a new one as a copy of the non-TMS package.
3. Assign all the necessary objects to the package created in step 2 using report RSWBO052.
This message occurs if you try to move objects from a transport-enabled package to a non-transportable package like $TMP. The rationale behind this is:
- The object once was in a transportable package, so it must have been added to at least one transport request.
- The transport request might have been transported to another system (directly or via a ToC), so the other system might have that object.
- The current system is the original system of the object, so it is responsible for notifying the other systems (via transport) when the object is to be deleted.
- Moving the object to a non-transportable package is semantically equivalent to deleting it for the rest of the system landscape.
Since that process happens very infrequently, it's usually sufficient to direct the developer to copy and delete the object.

How do you create persistent variables in run configurations in IntelliJ?

I'm using IntelliJ 15.0.3.
Update: Have also tried updating to 2016.2.4, but the issue persists.
By creating variables under Settings -> Appearance & Behavior -> Path Variables, these can be used in a run configuration with $VARIABLE_NAME$ to indicate, for example, what working directory or program arguments should be used by that run configuration. This is useful if, for example, the same directory is used in many parts of the run configuration but is changed from run to run.
However, when using $VARIABLE_NAME$ in a run configuration it doesn't seem to be persistent. If I close my IntelliJ session, the value of the variable replaces the variable reference. So for example if I have the variable:
FILENAME = somefile.csv
and in my run configuration I put "$FILENAME$" under program arguments, this only persists for that session. When I close and reopen IntelliJ, the program argument has been set to "somefile.csv" instead of retaining the reference "$FILENAME$".
How do I ensure that the variable reference is retained over several sessions?
Edit: Added screenshots showing before and after session reset.
Before a session reset I set my program arguments to reference my FILENAME variable. This also happens if I try to for example use the working directory field instead of the program arguments field.
After restarting IntelliJ the run configuration no longer references FILENAME.
I can't reproduce the issue in v2016.2.4 (i.e. I still get the replacement value after a restart). You may want to upgrade to v15.0.6 which is the latest v15.x available and see if that resolves the issue.
The values you set are simply stored in the file .IntelliJIdea\config\options\path.macros.xml (see "Directories used by the IDE to store settings, caches, plugins and logs" for info on where the config directory is located). So you can take a look at that file and see what's going on. Maybe try editing it outside IDEA (when IDEA is closed) and see if the value holds (in the event a bug is causing that file to not save properly after editing via the IDE).
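For reference, the relevant portion of path.macros.xml typically looks something like the sketch below (the exact wrapper elements may vary by IDE version, so treat this as an illustration and compare with your own file):

<!-- Hypothetical sketch of config/options/path.macros.xml -->
<application>
  <component name="PathMacrosImpl">
    <macro name="FILENAME" value="somefile.csv" />
  </component>
</application>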

Migrating Transformations in Pentaho PDI

We are using two servers, one as pre-prod and the other as production. When we migrate jobs or transformations from pre-prod to prod, the migration copies their connection properties as well, and this affects our production job execution.
Can someone let me know how to migrate transformations without copying their connections to the other server?
From the Tools->Options menu, there are two checkboxes that affect PDI's import behavior: "Replace existing objects on open/import" and "Ask before replacing objects".
Normally when migrating between environments, I set the first option to false. That way if a connection definition already exists, it is silently not replaced. The other way to go is to check both options on and answer 'No' when asked to replace an existing definition.
In this way, a transform/job that runs on pre-prod can simply be exported and imported into prod without changing anything, and it runs against prod in the new environment as long as the connections are named the same.
The only thing to watch out for is importing a new connection definition for the first time. There will be no warning that a new connection object is being created, and after import, it will still point to pre-prod. After each new connection import, you need to change the connection definition to point to the new environment. The good news is you only have to do that once.
I wish they had an option, or just an info dialog to show all new connection objects created as a result of the import; that way you would know exactly what you need to change. But alas -- earwax.
If by 'connection' you mean 'database connection', JNDI allows you to give them a symbolic name independent of your environment: it is when you configure your environment (e.g. biserver or baserver) that you specify which database (JDBC driver, IP and port, ...) this symbolic name maps to.
That way your transformations don't contain any reference to a server address, and you can deploy them "as is".
I use JNDI for my CDE dashboards in biserver too : to deploy a dashboard, I just export it from the dev environment and import it in the preprod environment without modifying anything.
There are a lot of resources on the web about JNDI. Check the Pentaho documentation too.
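For a concrete picture: with PDI's out-of-the-box simple-jndi setup, the symbolic names are defined in data-integration/simple-jndi/jdbc.properties, roughly as sketched below (the connection name and values are made up for illustration). Each environment keeps its own copy of the file, so the same name can point to different databases in pre-prod and prod.

# Hypothetical entry in data-integration/simple-jndi/jdbc.properties.
# Transformations reference only the name "WarehouseDB"; each server's
# copy of this file decides which database that name resolves to.
WarehouseDB/type=javax.sql.DataSource
WarehouseDB/driver=org.postgresql.Driver
WarehouseDB/url=jdbc:postgresql://dbhost:5432/warehouse
WarehouseDB/user=etl_user
WarehouseDB/password=secret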

IBM Urbancode Deploy - Supply values to parameters at runtime using properties

I have created a process in IBM UCD to deploy a .Net application.
My scenario is that I should be able to provide a different application name at run time each time I run the process. How can I do this using a property in IBM UCD?
I have tried enabling the "Prompt on use" option and also created a component property and mapped it to the parameter, say ${p:component/application.name}, but it doesn't seem to work. Maybe I'm missing some sequence of steps.
It would be great if I could get detailed steps for making this work.
I take it that you are on version 4.x (uDeploy)?
I would steer clear of the prompt-on-use approach; that feature was removed in 6.x. While there is a migration in place, it's simpler to just avoid it.
Using a property on the component process itself is the way to go. So go to your process configuration, and go to the properties / configuration tab. Create a property there. You'll be prompted for a value whenever you run an application process that uses this component process.
If the property is named "iis.app.name", you would reference it with just ${p:iis.app.name}.
Don't use the property "application.name". That is an automatically created property that gets the name of the UCD application that you are deploying. If you ever can't find the right way to reference a property, look at your executed process (at the component / application level). The normal view that lists out all the steps that were run and how long they took sits on a tab called "Log". Right next to it is the "Properties" tab. Click that and you'll see what properties were available to the process.
Also, you'll have better luck getting fast answers about UC Deploy using their own forum: https://developer.ibm.com/answers/?community=urbancode
Did you try using a process plugin for updating the property file?
Application >> Process >> Select Process >> Process Editor -- from the left panel you can select from the Utility plugins; try the update property option.