IBM UrbanCode Deploy - Supply values to parameters at runtime using properties

I have created a process in IBM UCD to deploy a .NET application.
My scenario is that I should be able to provide a different application name at run time each time I run the process. How can I do this using a property in IBM UCD?
I have tried enabling the "Prompt on use" option and also created a component property and mapped it to the parameter, say ${p:component/application.name}, but it doesn't seem to work. Maybe I am missing some sequence of steps.
It would be great if I could get detailed steps to make this work.

I take it that you are on version 4.x (uDeploy)?
I would steer clear of the prompt-on-use approach; that feature was removed in 6.x. While there is a migration in place, it's simpler to just avoid it.
Using a property on the component process itself is the way to go. Go to your process configuration, open the properties / configuration tab, and create a property there. You'll be prompted for a value whenever you run an application process that uses this component process.
If the property is named "iis.app.name", you would reference it with just ${p:iis.app.name}.
Don't use the property "application.name". That is an automatically created property that holds the name of the UCD application you are deploying. If you ever can't figure out the right way to reference a property, look at an executed process (at the component or application level). The normal view that lists all the steps that were run and how long they took sits on a tab called "Log"; right next to it is a "Properties" tab. Click that and you'll see which properties were available to the process.
Also, you'll have better luck getting fast answers about UC Deploy using their own forum: https://developer.ibm.com/answers/?community=urbancode

Did you try using a process plugin for updating the property file?
Application >> Process >> Select Process >> Process Editor -- from the left panel you can find the Utility plugins; try the Update Property option.


Check and Notify non-existence of Microsoft.VisualBasic.PowerPacks

In a simple Windows Forms application in VS 2010 I have placed an OvalShape using Power Packs.
This action automatically adds a reference to Microsoft.VisualBasic.PowerPacks.Vs to the project properties.
When deploying this to a different PC, obviously either (a) the Power Packs need to be installed for the application to work, or (b) the reference can be set to "Copy Local = True" in the reference properties so that the DLL sits next to the application.
Assuming (b) is not an option, since it is a solitary executable, (a) is the only option. In that case, if the target machine does not have the Power Packs, the requirement is to notify the user in the first place.
Apparently the DLL is deployed to the following location when using "VisualBasicPowerPacksSetup":
C:\Windows\assembly\GAC_MSIL\Microsoft.VisualBasic.PowerPacks.Vs\10.0.0.0__b03f5f7f11d50a3a\Microsoft.VisualBasic.PowerPacks.Vs.dll
So the blind approach is just to check whether the above file exists and, if it doesn't, prompt the user to install "VisualBasicPowerPacksSetup". But I feel it would be more accurate if the application could actually check at the registry level.
In the registry, "Microsoft.VisualBasic.PowerPacks" is recorded in several locations, which makes it confusing.
How do I identify the correct key, and what is the correct way of checking this reference in VB?
You could just try to create an object defined in the dependency and catch the resulting exception.
Handling that, you could ask the user to install the package. This is probably not considered good practice, but it should get the job done.
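A minimal sketch of that idea in VB.NET, assuming the project references Microsoft.VisualBasic.PowerPacks.Vs (the message text and method names are just examples). Keeping the object creation in its own non-inlined method ensures the assembly load failure surfaces at the call you wrap rather than somewhere earlier:

Imports System.IO
Imports System.Runtime.CompilerServices
Imports System.Windows.Forms

Module PowerPacksCheck
    ' Touching a Power Packs type forces the CLR to resolve the assembly.
    <MethodImpl(MethodImplOptions.NoInlining)>
    Private Sub TouchPowerPacks()
        Dim shape As New Microsoft.VisualBasic.PowerPacks.OvalShape()
    End Sub

    Public Function PowerPacksInstalled() As Boolean
        Try
            TouchPowerPacks()
            Return True
        Catch ex As FileNotFoundException
            ' Microsoft.VisualBasic.PowerPacks.Vs.dll could not be loaded.
            Return False
        End Try
    End Function

    Public Sub EnsurePowerPacks()
        If Not PowerPacksInstalled() Then
            MessageBox.Show("This application needs the Microsoft Visual Basic Power Packs. " &
                            "Please run VisualBasicPowerPacksSetup first.", "Missing dependency")
        End If
    End Sub
End Module

A variant that avoids referencing the type at all is to call System.Reflection.Assembly.Load with the assembly's strong name (visible in the GAC path above) and catch the same exception; that also sidesteps the question of which registry key to read.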

TIBCO Global variables, reverse engineering

I'm currently working on a project where I am at the stage of figuring out what the current implementation is doing. I have been putting in a lot of time (a lot) searching for connections between queues declared as global variables.
Is there a way to get a listing of where a specific global variable is being used, or do I actually need to go through all processes, as I'm doing at the moment?
Thank you :)
In TIBCO Designer 5.8 you can find where a global variable is used via the "Tools -> Find Global Variable usages" menu item.
Please note that all TIBCO process source files are text files, so you can also search inside the project folder with any utility that allows searching inside text files. For Windows I prefer Far Manager.
In Far Manager you can navigate to the project folder, press Alt+F7, and search for
%%GLOBAL_VARIABLE_NAME%%
Please also note that even if you don't have the TIBCO project source code, you can get it from the TIBCO BW server. Example path:
tibco\tra\domain\tibco\datafiles\YOUR_PROJECT_NAME
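If you prefer to script that text search instead of using Far Manager, any recursive scan over the project folder does the same job. A throwaway VB.NET sketch (the project path and variable name are placeholders):

Imports System.IO

Module FindGlobalVariableUsages
    Sub Main()
        Dim projectFolder As String = "C:\tibco\projects\YOUR_PROJECT_NAME"  ' placeholder
        Dim variable As String = "%%GLOBAL_VARIABLE_NAME%%"                  ' placeholder
        ' TIBCO process definitions and most other project artifacts are plain text/XML,
        ' so a simple substring match is enough to list the files that use the variable.
        For Each filePath As String In Directory.EnumerateFiles(projectFolder, "*.*", SearchOption.AllDirectories)
            If File.ReadAllText(filePath).Contains(variable) Then
                Console.WriteLine(filePath)
            End If
        Next
    End Sub
End Module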

Update changes from Developement instance to Production instance in Odoo

I have two instances of Odoo v9 running on the same server (Ubuntu 14.04). I want to make changes (install modules, change source code, or anything else) in the development instance and, after confirming they are OK, move the changes to the production instance. Is there any way of doing that without repeating the whole development process?
Thank you.
As I understand it, you do not want to stop the production instance.
If the changes are only in XML files, you might be able to get away with just updating the module from the frontend (Apps -> Your Module -> Update). However, if you have modified the __openerp__.py file inside your module, you have to enter debug mode and click Update Apps List first.
For changes in files inside the static folder of your module, you do not need to stop the server, although your users must press Ctrl+Shift+R to flush their caches and load the new content in their browsers.
For Python source code I am afraid that you have to stop both instances of the server so that the code can be correctly recompiled.
(See note 1 on this)
In the end you should stop and update everything, because unexpected things might pop up at random times due to resources not being properly updated.
Note 1: The Python documentation about the compilation of Python modules mentions, among other things:
As an important speed-up of the start-up time for short programs that
use a lot of standard modules, if a file called spam.pyc exists in the
directory where spam.py is found, this is assumed to contain an
already-“byte-compiled” version of the module spam. The modification
time of the version of spam.py used to create spam.pyc is recorded in
spam.pyc, and the .pyc file is ignored if these don’t match.
So theoretically, if you modify fileA.py in a module and a new fileA.pyc is generated, the server will be able to interpret and use it. In any case, I had an issue with two running instances where the .py file created a field and the XML file used it, and the server reported that the field had not been created for the XML view; that means the server did pick up and parse the XML file but did not recompile the .py file.

TFS Continuous Deployment for Windows Service?

I have managed to set up continuous deployment for my web project using TFS and MSBuild.
I have googled for a few hours but couldn't find a relevant link on achieving continuous deployment for a Windows service.
Is it possible to do CD for a Windows service using TFS build definitions? That is, for every check-in the steps below should be performed. I am using TFS 2010 with Windows Server 2008 R2.
1] Stop the service,
2] Copy the respective project folder from the (source) build server to the (destination) 'staging server1' or 'staging server2',
3] Start the service (willing to do this step manually)
Any blog or tutorial references for achieving this? My guess is that I need to use PowerShell scripts, but I'm not sure.
Should be OK. You'll need to install a build agent on the box you're deploying to, and you'll need to be able to edit the XAML templates (you'll probably want to copy your existing template that does your build and just add the stop/copy/start steps onto the end of it).
After your CI build, you'll need to edit the XAML template. To start and stop the service you can use the "Invoke Process" activity (you'll probably want to make it generic and pass in the service name as an argument; note that you can change the display names etc. in the Metadata argument so it appears meaningful in your build definition).
As far as copying goes, you can do this fairly easily by accessing properties like the drop location.
Should be fairly straightforward - once you get your head around modifying the templates!
Edit:
Sorry for not responding sooner. I'd have to revise my earlier comment: this isn't as straightforward as it seems unless you really know what you want. I have been thinking about this and, as with skinning cats, there is more than one way to achieve it. I've rewritten this a few times, so I hope the edits make sense :)
Boils down to the following:
1) Pass into your template the build agent/machine you want to run this on (this can be done as a simple string or as an AgentReservationSpec - up to you), since it's unlikely to be the machine that runs your actual CI build. This is done in the Arguments section of the XAML; as noted before, if you want to edit the display name/description you can edit the Metadata argument. This machine needs a TFS build agent installed, of course.
2) Run the task on the remote machine. This is done by adding the Agent Scope activity to your template; you will have to use the info from step 1 to get the ReservationSpec (so it's easier if you add the argument as an AgentReservationSpec, or you'll need to resolve this in the template).
2.1) Run the stop/uninstall. This is done by dropping in an Invoke Process activity (two, actually). Invoke Process can take arguments and you need to point it at the executable you're running, so you'll want one for the NET command (i.e. NET STOP) and one for InstallUtil.exe.
2.2) Copy the files from your CI build to the remote server. You can use the Copy Directory activity for this; it needs a couple of parameters, the main one being the source location. You should be able to drop in a GetBuildDetail activity, give it a name, then reference its .DropLocation property to get this; the destination is wherever you're installing to.
2.3) Install the new service as in step 2.1: use Invoke Process to install the service, then use another one to start it up. A rough sketch of what these steps amount to follows below.
I haven't covered everything, and I haven't set this up myself, so I'm sure there are a few pitfalls or things I haven't thought of. Off the top of my head this makes sense, but maybe someone who knows better can poke a few holes in it :)
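For reference, here is roughly what those stop/copy/install/start steps amount to if you collapse them into a single small VB.NET helper that one Invoke Process activity could call instead of several separate ones. It is a sketch only: the service name, drop folder, install folder and InstallUtil path are placeholders, and it assumes the service is already installed on the target machine (add a reference to System.ServiceProcess).

Imports System.Diagnostics
Imports System.IO
Imports System.ServiceProcess

Module DeployWindowsService
    Sub Main()
        ' Placeholders - adjust to your own environment.
        Dim serviceName As String = "MyWindowsService"
        Dim dropFolder As String = "\\buildserver\drops\MyService\latest"
        Dim installFolder As String = "C:\Services\MyWindowsService"
        Dim installUtil As String = "C:\Windows\Microsoft.NET\Framework\v4.0.30319\InstallUtil.exe"
        Dim serviceExe As String = Path.Combine(installFolder, "MyWindowsService.exe")

        ' 1] Stop and unregister the existing service.
        Using sc As New ServiceController(serviceName)
            If sc.Status <> ServiceControllerStatus.Stopped Then
                sc.Stop()
                sc.WaitForStatus(ServiceControllerStatus.Stopped, TimeSpan.FromMinutes(1))
            End If
        End Using
        Process.Start(installUtil, "/u """ & serviceExe & """").WaitForExit()

        ' 2] Copy the build output from the drop location to the install folder.
        For Each sourceFile As String In Directory.EnumerateFiles(dropFolder, "*", SearchOption.AllDirectories)
            Dim target As String = sourceFile.Replace(dropFolder, installFolder)
            Directory.CreateDirectory(Path.GetDirectoryName(target))
            File.Copy(sourceFile, target, True)
        Next

        ' 3] Re-register and start the new version.
        Process.Start(installUtil, """" & serviceExe & """").WaitForExit()
        Using sc As New ServiceController(serviceName)
            sc.Start()
        End Using
    End Sub
End Module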

SP2010 Client Object Model: Uploading File to Drop Off Library Doesn't Apply Content Organizer Rules

I am currently developing a service using the SharePoint 2010 Client Object Model to programmatically upload Excel worksheets to a Drop Off Library and then set the properties on the file. This process is working well. However, the Drop Off Library is governed by Content Organizer Rules that aren't being applied to the uploaded file. I have examined every property I thought I could have missed:
ContentTypeId is being properly set
_ModerationStatus is being set to 0
The two properties required to invoke the rule are being set to valid values
Update is being called on the ListItem
The file is checked in after the ListItem is updated
The list doesn't have minor versioning enabled so I don't make any calls to publish.
What's most frustrating is that if I edit the document properties using the Web UI and check it back in without making any changes, the file is moved to its final location. What might I have overlooked that is preventing Content Organizer Rules from being applied to newly uploaded files when using SP2010 COM?
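For reference, a condensed sketch of the kind of client object model calls described above (the site URL, file name, content type ID and rule-related field names are placeholders, not the actual values used):

Imports Microsoft.SharePoint.Client

Module DropOffUpload
    Sub Main()
        Using ctx As New ClientContext("http://server/sites/records")            ' placeholder URL
            Dim library As List = ctx.Web.Lists.GetByTitle("Drop Off Library")
            Dim fci As New FileCreationInformation()
            fci.Url = "report.xlsx"                                               ' placeholder file name
            fci.Content = System.IO.File.ReadAllBytes("C:\temp\report.xlsx")
            fci.Overwrite = True
            Dim uploaded As Microsoft.SharePoint.Client.File = library.RootFolder.Files.Add(fci)

            Dim item As ListItem = uploaded.ListItemAllFields
            item("ContentTypeId") = "0x0101..."                                   ' placeholder content type ID
            item("_ModerationStatus") = 0
            item("RuleField1") = "value1"                                         ' placeholder rule properties
            item("RuleField2") = "value2"
            item.Update()

            uploaded.CheckIn("Uploaded programmatically", CheckinType.MajorCheckIn)
            ctx.ExecuteQuery()
        End Using
    End Sub
End Module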
The ultimate answer to this question turned out to be that everything was indeed being set correctly. However, one cannot force the evaluation of Content Organizer rules programmatically. The information I required was provided by a post from Steve Curran on this MSDN thread.
In SharePoint 2010 Central Administration under the "Monitoring" section there is a control panel for "Timer Jobs" that includes an item to "Review job definitions." On this panel, there should be a job named "Content Organizer Processing." This is a nightly task that will run and clean up content according to the rules you have established in your site. After uploading a file to the drop off library programmatically, you will likely find that hitting the "Run Now" button for this job will cause the file to be moved to its final destination if the properties are set correctly.
The solution was to change the frequency of this job under the Recurring Schedule section from a nightly process to one that is executed every 15 minutes (or whatever interval you determine will work best).
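If you prefer to script that schedule change rather than click through Central Administration, the server-side object model can do it. A sketch that must run on a farm server; the job's display name and the web application URL are assumptions, so verify them under "Review job definitions":

Imports Microsoft.SharePoint
Imports Microsoft.SharePoint.Administration

Module SetContentOrganizerSchedule
    Sub Main()
        ' Requires a reference to Microsoft.SharePoint.dll and farm-level permissions.
        Dim webApp As SPWebApplication = SPWebApplication.Lookup(New Uri("http://server"))  ' placeholder URL
        For Each job As SPJobDefinition In webApp.JobDefinitions
            If job.DisplayName = "Content Organizer Processing" Then    ' assumed display name
                Dim schedule As New SPMinuteSchedule()
                schedule.BeginSecond = 0
                schedule.EndSecond = 59
                schedule.Interval = 15                                  ' run every 15 minutes
                job.Schedule = schedule
                job.Update()
            End If
        Next
    End Sub
End Module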
A word of caution: if you send automated e-mail to the site administrator or a mailing list when files are left in the Drop Off Library without their properties set correctly, those messages will start arriving with the same frequency as the job's execution.
This article may help.
Basically, it does not appear to be supported in the 2010 COM so you have to work around it, unfortunately.