How to create SAS EG Flow programmatically? - vb.net

I am writing a solution that manages the execution of complex calculations written in heterogeneous environments (R, SAS, Oracle). One of the features I would like to add is the ability to create a nice SAS Enterprise Guide flow from the SAS branches of the execution, using the exposed COM (.NET-based) interface.
I thought I could use SASEGScripting to insert SAS code items and link them according to the dependencies my solution already manages.
I can insert the code object, but unfortunately I don't know how to create a link.
Using the techniques from ExtractCodeAndLog.vbs.txt, I can get an existing link between two flow items using Set item = myproject.ContainerCollection.Item and item.Items. If I iterate through the Items collection, I get the links and can inspect their properties.
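Below is a minimal late-bound VB.NET sketch of that inspection loop, adapted from the ExtractCodeAndLog.vbs pattern. The ProgID version suffix, the Open signature and the ContainerCollection/Items/Name members are assumptions that vary by EG release, so treat this as a starting point rather than a confirmed API.

Option Strict Off
Imports System
Imports Microsoft.VisualBasic

Module InspectFlowLinks
    Sub Main()
        ' Late-bound COM automation of Enterprise Guide (ProgID suffix must match your EG version).
        Dim app As Object = CreateObject("SASEGObjectModel.Application.7.1")
        Dim project As Object = app.Open("C:\path\to\MyProject.egp", "")

        For Each container As Object In project.ContainerCollection
            Console.WriteLine("Container: " & container.Name)
            For Each item As Object In container.Items
                ' Link objects show up here alongside code items; dumping names and
                ' types shows how EG represents an existing link between two items.
                Console.WriteLine("  Item: " & item.Name & " (" & TypeName(item) & ")")
            Next
        Next

        project.Close()
        app.Quit()
    End Sub
End Module

Dumping what the Items collection contains for a manually linked flow is the quickest way to see which kind of object a link is, and therefore what you would need to create programmatically.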

In your EG project, right-click the object you want to make the link from and select 'Link'. This will show you the list of possible objects to link to; just select one of them.
Example: Step 1, Step 2, Step 3 (screenshots omitted).


How to run process on multiple records?

I want to create an on-demand process of some kind in Dynamics CRM 2013 that will run on multiple records of the same type. The process will create an equal number of records of another type, and all will relate to the same parent record. I can imagine how a workflow would be used to create the new child records but I am not sure how I could create the parent record and associate it with the child records.
If you're running on multiple records, then I presume you are starting from a grid view of some sort. If that's the case, the solution is easy: just create a custom ribbon button that accepts the selected records as a parameter and runs custom JavaScript. That will accomplish what you need in a nice, elegant way.
Because it's running JavaScript, you have full control to do everything you need. One of the features of ribbon buttons is that they can receive the selected records as an array parameter.
But if you don't want to do all the work in JavaScript, you can have the script pass the parameters to a custom Workflow or Action.
As has already been mentioned, a workflow won't be able to do this alone, because it can only run on a single record and cannot accept multiple records as an input parameter.
Jason, I think the point here is to automate the process. Lee, you are correct in your assessment that creating the work order with a workflow step is easy, while creating the child work order items is either difficult or impossible. Even if you managed to hack this together with several workflows triggered by different events during the process, the end result would be a UX/maintenance nightmare.
The simplest and best solution is a piece of plugin logic that you trigger with your workflow. This plugin code would create a new work order and the associated work order items based on the context of the service you run the workflow against; a sketch follows below. If you would like this action to be triggered by a database operation instead of manually, that would be simple to do as well.
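Here is a minimal sketch of that plugin logic, hosted as an IPlugin for brevity; the same IOrganizationService calls work from a custom workflow activity. The entity and field names (new_workorder, new_workorderitem, new_masterworkorderid, new_serviceid) are hypothetical placeholders for your own schema, and how you obtain the related service ids depends on your trigger.

Imports System
Imports System.Collections.Generic
Imports Microsoft.Xrm.Sdk

Public Class CreateWorkOrderWithItemsPlugin
    Implements IPlugin

    Public Sub Execute(serviceProvider As IServiceProvider) Implements IPlugin.Execute
        Dim context = CType(serviceProvider.GetService(GetType(IPluginExecutionContext)), IPluginExecutionContext)
        Dim factory = CType(serviceProvider.GetService(GetType(IOrganizationServiceFactory)), IOrganizationServiceFactory)
        Dim service = factory.CreateOrganizationService(context.UserId)

        ' 1. Create the parent (overall) work order.
        Dim workOrder As New Entity("new_workorder")
        workOrder("new_name") = "Work order created by workflow"
        Dim workOrderId As Guid = service.Create(workOrder)

        ' 2. Create one child work order item per related service record, linking each back to the parent.
        For Each serviceId As Guid In GetRelatedServiceIds(context)
            Dim item As New Entity("new_workorderitem")
            item("new_masterworkorderid") = New EntityReference("new_workorder", workOrderId)
            item("new_serviceid") = New EntityReference("service", serviceId)
            service.Create(item)
        Next
    End Sub

    Private Function GetRelatedServiceIds(context As IPluginExecutionContext) As IEnumerable(Of Guid)
        ' Placeholder: pull the relevant service ids from the execution context,
        ' an input parameter, or a RetrieveMultiple query - whatever fits your trigger.
        Return New List(Of Guid)()
    End Function
End Class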
You aren't going to be able to do this via a CRM dialog because it can only run against one record. You can accomplish this fairly easily by leveraging existing CRM functionality:
If it doesn't already exist, create a field in your service entity (the Work Order Item) called new_MasterWorkOrder (or something similar) which is a lookup to a Master Work Order entity.
Create your master record - this would be your Overall Work Order.
From your Work Order Item record listing, select all the items you want to add to the Master Work Order record created in the previous step. Alternatively, you could use an Advanced Find to locate the target records.
Click the Edit button to initiate the CRM bulk/multi-record edit form.
In the new_MasterWorkOrder field, select the Overall Work Order previously created.
Save.
Once the process completes, all of the Work Order Items you selected will be linked to your Overall Work Order.
It sounds like you might need a step before this to create a Work Order Item from each selected Service entity. You should be able to accomplish this easily with a workflow that takes a Service entity as a parameter and builds a Work Order Item from it. Once you have these built, you can link them to an Overall Work Order using the process above.

How to pass random parameters to SilkTest Workbench or Classic Record&Play Scenario

I am new to SilkTest and I don't have any scripting background. What I need to do is to record some test cases and then play them to check my system. After getting used to it, I plan to learn scripting and dive into it, but first things first.
What I need is to pass randomly generated parameters (or parameters read randomly from a text file, or pre-defined ones) into the recordings, so that every time I run the tests, different parameters are used. For example, there is a component in which I write some letters and the component filters the results based on the text. Then, I select one of the results. Now, instead of recording the same letters every time, how can I use randomly chosen parameters?
Thanks
What you are looking for is called Active Data in Silk Test.
It allows enhancing your visual tests with external data, for example from an Excel file.
ActiveData testing enables you to leverage existing data in external files as input for powerful, comprehensive application testing solutions. ActiveData testing enables you to perform multiple transactions against test applications using a different set of data for each transaction without writing complicated code or compromising existing data.
You can find an introduction to Active Data in the online documentation or in the tutorial video.
I have a question: what version of Silk Test are you using, and which client (Silk Test Workbench, Silk4NET or Silk4J)? Each of these clients can receive parameters from an external source, whether from the command line or from an external data file.
You indicate that you want random data; do you really mean random data, or external data? If it is random data you want, you will probably need to use a random number/string generator for the client you are working with (.NET code for Workbench and Silk4NET, Java code for Silk4J).
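If random data is what you want, a plain .NET helper like the following is enough for Workbench or Silk4NET scripts (which support VB.NET); the filter length and the usage in Main are just an illustration.

Imports System
Imports System.Text

Module RandomInput
    Private ReadOnly Rng As New Random()

    ' Builds a random string of lowercase letters to feed into the recorded filter field.
    Function RandomLetters(length As Integer) As String
        Dim sb As New StringBuilder()
        For i As Integer = 1 To length
            sb.Append(ChrW(Rng.Next(AscW("a"c), AscW("z"c) + 1)))
        Next
        Return sb.ToString()
    End Function

    Sub Main()
        ' Use the generated text wherever the recording types the filter letters.
        Console.WriteLine(RandomLetters(3))
    End Sub
End Module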

Is it possible to set shared variables outside of the plugin pipeline CRM 2011

I want to create a record of an entity, but I need to pass a list of GUIDs to the pre-create plugin. I don't want to create fields or related entities to do this. Can I use shared variables for it?
In other words, is it possible to set shared variables before initiating the action that will trigger the plugins that will consume them?
EDIT:
I may be creating this type of record from different points that integrate with CRM: Silverlight, external pages, or even plugins of other entities. My current problem can be solved with a field on the entity, but that way, if I had to send parameters to control the execution of the plugin for two or more independent actions, I would need one field for each action, or a single field with a complex format/parse pattern to parameterize each action. Using fields to accomplish this looks a bit excessive.
If the shared variables could be set before the call of the action that triggers the plugin, that would solve the problem and I wouldn't have to create fields in the CRM database: the data I want to pass to the plugin is only needed at that moment, like a parameter to a function, with no need to persist it in the database.
But if it is not possible, I will have to stick with the fields :(
Not if they vary by entity/execution of the plugin.
Options:
1. Set them in the plugin configuration if they don't change but need to be updated without a recompile.
2. Apply them as a delimited string in a single field on the entity if they vary per record.
What's the reason for not wanting to use 2?
Nope. The easiest solution that I can think of is to add a BAT (big-ass text) field to the entity and populate it with a comma-delimited list of GUIDs, then access that field in your Create plugin. You could even clear it out if you don't want that extra data in your system.
Edit after your edit:
General comment about your thinking process: you are probably overthinking it. :) Using a single field, you could pass in any kind of "command" as a JSON- or XML-formatted string. As I said above, in the pre-create plugin, after you have extracted this "argument" field, you can clear it out in the Target entity image so that data is never persisted to the database. Technically it achieves the exact result you want, with the only side effect being one extra "argument" field that is always NULL in the database. Don't fight simplicity so hard! :)
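A minimal sketch of that pre-create pattern, assuming a hypothetical string attribute new_argumentdata that carries the comma-delimited GUID list:

Imports System
Imports System.Collections.Generic
Imports Microsoft.Xrm.Sdk

Public Class PreCreateArgumentPlugin
    Implements IPlugin

    Public Sub Execute(serviceProvider As IServiceProvider) Implements IPlugin.Execute
        Dim context = CType(serviceProvider.GetService(GetType(IPluginExecutionContext)), IPluginExecutionContext)

        If Not context.InputParameters.Contains("Target") Then Return
        Dim target As Entity = TryCast(context.InputParameters("Target"), Entity)
        If target Is Nothing OrElse Not target.Contains("new_argumentdata") Then Return

        ' Parse the comma-delimited list of GUIDs passed in the "argument" field.
        Dim ids As New List(Of Guid)()
        For Each part As String In target.GetAttributeValue(Of String)("new_argumentdata").Split(","c)
            ids.Add(Guid.Parse(part.Trim()))
        Next

        ' Clear the field in the pre-create Target so the argument data is never persisted.
        target("new_argumentdata") = Nothing

        ' ... use ids to drive whatever logic this create should perform ...
    End Sub
End Class

Because the field is blanked in the pre-operation stage, the record is saved with that attribute NULL, which is exactly the "always NULL argument field" side effect described above.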

API - How to programmatically merge a list of merge candidates returned by .VersionControlServer.GetMergeCandidates?

I am creating a clone of the default Merge window, in order to add a feature.
I already have the merge candidates in a grid from the command below:
MergeCandidate[] candidates = tfs.GetMergeCandidates(edtSelectedSource.Text, cbxTargetBranchs.Text);
Now the user has selected one or more candidates, and I need to merge them.
But the TFS API VersionControl.Merge requires a source path and a target path.
First, my question: do I need to iterate over each candidate and merge each file of its changesets, one by one?
Second, how can I obtain the target path from a changeset?
First off, I've done a fair amount of programming with the TFS API, but merging is something that I would never blindly trust to automation. Merge conflicts are best dealt with by human beings. Yes, it's painful and can be automated in many cases, but in many others - things can go terribly wrong. I would think twice and then twice again before doing this on Production branches.
Here are some tips that should help:
You need to create a temp Workspace. The Workspace is the sandbox where everything happens. The Workspace can have files, and thus file locations, associated with it. Workspace items have rich metadata.
Have a look at the Workspace and WorkspaceInfo classes.
Then have a look at the workspace client:
http://msdn.microsoft.com/en-us/library/microsoft.teamfoundation.versioncontrol.client.item.aspx
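For the temp workspace, the usual pattern looks roughly like this (the workspace name, server path and local path are placeholders):

Imports System
Imports Microsoft.TeamFoundation.VersionControl.Client

Module TempWorkspaceHelper
    ' Creates a throwaway workspace and maps the target branch into a local folder
    ' so the merge has somewhere to land.
    Function CreateTempWorkspace(vcs As VersionControlServer, targetServerPath As String,
                                 localPath As String) As Workspace
        Dim ws As Workspace = vcs.CreateWorkspace("TempMergeWs_" & Guid.NewGuid().ToString("N"),
                                                  vcs.AuthorizedUser)
        ws.Map(targetServerPath, localPath)
        ws.Get() ' pull the target branch locally
        Return ws
    End Function
End Module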
As long as the changesets are continuous, you can do it in a single merge call. If they are not continuous, you need to submit a separate merge for each continuous block. Let's say they select changesets 10, 15 and 20, and these are continuous (i.e. there are no additional candidates within that range); then you would submit a merge with a versionFrom of 10 and a versionTo of 20.
As far as paths go, you want to use the ones that you passed into QueryMergeCandidates and you'll want to specify the full recursion type as well.
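Putting that together, here is a minimal sketch that merges one continuous block of selected candidates. It assumes the selected candidates are continuous as described above, reuses the same source/target paths you passed when querying the candidates, and checks in only when there are no conflicts; check the exact Merge overloads available in your TFS API version (richer ones also take LockLevel, RecursionType and MergeOptions).

Imports System
Imports System.Collections.Generic
Imports System.Linq
Imports Microsoft.TeamFoundation.VersionControl.Client

Module MergeSelectedCandidates
    Sub MergeBlock(ws As Workspace, sourcePath As String, targetPath As String,
                   selected As IEnumerable(Of MergeCandidate))
        ' One merge call covers the whole continuous range of selected changesets.
        Dim ids As List(Of Integer) =
            selected.Select(Function(c) c.Changeset.ChangesetId).OrderBy(Function(id) id).ToList()
        Dim versionFrom As New ChangesetVersionSpec(ids.First())
        Dim versionTo As New ChangesetVersionSpec(ids.Last())

        ' The pending merge lands in the workspace; resolve any conflicts before checking in.
        Dim status As GetStatus = ws.Merge(sourcePath, targetPath, versionFrom, versionTo)

        If status.NumConflicts = 0 Then
            Dim changes() As PendingChange = ws.GetPendingChanges()
            ws.CheckIn(changes, String.Format("Merged changesets {0}-{1}", ids.First(), ids.Last()))
        End If
    End Sub
End Module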

How to use use if..else in Data Flow based on user variable values in SSIS

I have a fairly straightforward SSIS package with a number of Data Flow Tasks, each with data flows for multiple tables, as shown below:
I want to be able to execute each of these data flows based on some user-defined variable values that I manipulate using a Script Task in the control flow. Something like: if a variable's value (say BESTELLDRUCK) is true, then I want to execute the data flow for this table (source, conversion, destination tasks); otherwise I want to skip this table and proceed to another table (e.g. AKT_FEHLER) in the same Data Flow Task.
How can I do this? Thanks in advance.
You cannot disable or enable transformations within the Data Flow Task. However, you can enable or disable Data Flow Tasks on the Control Flow tab.
Here is one possible way to do this on Control Flow tab:
If it is possible, move the source --> destination transformations to individual Data Flow Tasks, something like what is shown below.
Let's assume you have created variables for each flow to enable or disable the Data Flow Task based on some condition. For this example, I have hard coded some values.
To dynamically enable or disable a Data Flow Task based on a variable, click the Data Flow Task and press F4 to view its Properties. In the Properties pane, click the ellipsis button next to the Expressions property; you will see the Property Expression Editor.
Select the Disable property and use the ellipsis button to enter the expression !@[User::Enable_BESTELLDRUCK]. Notice the exclamation mark: the variable is declared as an Enable flag, but only a Disable property is available, so you need the opposite value.
Repeat the process for other Data Flow Tasks with appropriate variables. Run the package and you will notice that the second Data Flow Task did not execute because the variable Enable_AKT_FEHLER was set to the value False.
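For completeness, here is a hypothetical body for the control-flow Script Task mentioned in the question, which sets the Enable_* variables that the Disable expressions read. The variable names are assumptions, the variables must be listed as ReadWriteVariables in the Script Task editor, and this Sub lives inside the designer-generated ScriptMain class.

Public Sub Main()
    ' Decide per table whether its Data Flow Task should run.
    Dts.Variables("User::Enable_BESTELLDRUCK").Value = True
    Dts.Variables("User::Enable_AKT_FEHLER").Value = False

    Dts.TaskResult = ScriptResults.Success
End Sub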
Hope that helps.
Reference:
To load multiple tables having the same schema within a ForEach Loop container, take a look at the SO answer below. It transfers data from MS Access to SQL Server. Hopefully, that should give you an idea.
How do I programmatically get the list of MS Access tables within an SSIS package?
I guess there are enough pointers here for Agent 007 to resolve the issue. I would like to add a few general comments.
Enabling/disabling the tasks dynamically is not a good practice. A better way to disable a task is to use an expression within a precedence constraint. One such reference: http://www.sqlis.com/sqlis/post/Disabling-tasks-Through-Expressions.aspx
As suggested, convert each STD (Source-Transform-Destination) into its own DFT. Even better, use a parent-child package pattern. This will help when testing future additions of more DFTs.