What are the steps to convert a scenario to BPMN?

I have an exam tomorrow and, to be honest, I still don't know what steps I should go through to model a given scenario.
For example, consider a scenario like this:
Every weekday morning, the database is backed up and then it is checked to see whether the “Account Defaulter” table has new records. If no new records are found, then the process should check the CRM system to see whether new returns have been filed. If new returns exist, then register all defaulting accounts and customers. If the defaulting client codes have not been previously advised, produce another table of defaulting accounts and send to account management. All of this must be completed by 2:30 pm, if it is not, then an alert should be sent to the supervisor. Once the new defaulting account report has been completed, check the CRM system to see whether new returns have been filed. If new returns have been filed, reconcile with the existing account defaulters table. This must be completed by 4:00 pm otherwise a supervisor should be sent a message.
What is your approach to modelling this? I am not asking for the answer to this particular scenario; I am asking about the method. Do you model it sentence by sentence, or do you try to figure out the big picture first and then identify the subprocesses?

There are no exact steps. Use your imagination, Luke!
You can take those funny step-by-step instructions as a starting point, but they were made by dummies, for dummies.
Usually you should sketch the process steps and participants schematically on a sheet of paper and try to build your model from that. There is no other way: only brainstorming.

When BPMN comes to mind, one thinks of people in a conference room discussing how the business does things (creating what you call scenarios and translating them into business processes) and drawing boxes and lines on a whiteboard.
Since 2011, when BPMN 2.0 appeared as an Object Management Group (OMG) specification, we have had a very comprehensive 532-page PDF with pretty much all the information one needs to create process diagrams.
Still, in addition to reading that specification, one can also find many BPMN examples of common modeling problems and patterns, as well as books and research papers, which help in understanding how certain scenarios come to life.
Generally speaking, we first identify who takes part in the process, to understand who the actors are. Then we work out where they get their input (if they get any), what they do with it (if they do anything), and where they forward it once they have completed their work (if they forward it at all). This makes it visible that each actor has specific tasks following a specific flow of work, which we can then draw much more easily.
Then, once a clean and simple diagram is built, one can validate it by visualizing (in real life or not) the users/actors executing the activities.
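If it helps to make the actors/inputs/flow idea concrete, here is a minimal sketch that expresses one fragment of the backup scenario as BPMN elements, using the Activiti model classes (just one possible toolkit; all element ids and names are invented for illustration):

```java
// A minimal sketch, assuming the Activiti "activiti-bpmn-model" classes:
// one fragment of the scenario ("check the table, then finish") expressed
// as start event -> task -> end event. All ids and names are invented.
import org.activiti.bpmn.model.BpmnModel;
import org.activiti.bpmn.model.EndEvent;
import org.activiti.bpmn.model.Process;
import org.activiti.bpmn.model.SequenceFlow;
import org.activiti.bpmn.model.StartEvent;
import org.activiti.bpmn.model.UserTask;

public class BackupProcessSketch {
  public static void main(String[] args) {
    Process process = new Process();
    process.setId("dailyDefaulterCheck");

    StartEvent start = new StartEvent();
    start.setId("start"); // "every weekday morning" would become a timer start event

    UserTask checkTable = new UserTask();
    checkTable.setId("checkTable");
    checkTable.setName("Check Account Defaulter table for new records");

    EndEvent end = new EndEvent();
    end.setId("end");

    // Wire the flow of work: start -> task -> end
    process.addFlowElement(start);
    process.addFlowElement(checkTable);
    process.addFlowElement(end);
    process.addFlowElement(new SequenceFlow("start", "checkTable"));
    process.addFlowElement(new SequenceFlow("checkTable", "end"));

    BpmnModel model = new BpmnModel();
    model.addProcess(process);
  }
}
```

The point is not the code itself but the order of thinking it forces: actors first, then tasks, then the sequence flows that connect them.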


Update old processes with the new process definition - Activiti

I have some processes that ran with old process definitions. But due to a requirement change, the user task data has been updated with new attributes and this new process definition has been deployed. I'm aware that SetProcessDefinitionVersionCmd can be used to point the running processes to the new definition/version.
I would like to know how to migrate the old process data so that the newly added user task attributes are populated in those instances.
There is no easy way to migrate process instance data; however, when you set an instance to the new process definition version, the existing instance data travels with the migrated instance.
What you have to be careful about is including null checks for any data that may not be present in the migrated process instances.
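For illustration, a minimal sketch of what that migration could look like in code (assuming Activiti 5.x; the instance id and the variable name defaulterCode are made up):

```java
// Minimal sketch (Activiti 5.x): move a running instance to a new
// definition version, then backfill an attribute that only exists in
// the new version. "defaulterCode" is a made-up variable name.
import org.activiti.engine.ProcessEngine;
import org.activiti.engine.ProcessEngines;
import org.activiti.engine.impl.cmd.SetProcessDefinitionVersionCmd;

public class MigrateInstance {
  public static void main(String[] args) {
    ProcessEngine engine = ProcessEngines.getDefaultProcessEngine();
    String processInstanceId = "12345"; // the old, running instance

    // Point the instance at version 2 of its process definition
    engine.getManagementService()
          .executeCommand(new SetProcessDefinitionVersionCmd(processInstanceId, 2));

    // Backfill the newly added attribute so later null checks
    // in the new definition have something to work with
    engine.getRuntimeService()
          .setVariable(processInstanceId, "defaulterCode", "UNKNOWN");
  }
}
```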
Hope this helps,
Greg
Indeed, there is no easy way to migrate. However, depending on the differences between the two definitions, and to the extent that you prefer not to use SetProcessDefinitionVersionCmd, you may find DynamicBpmnService useful when combined with detecting the definition version inside your logic.
And yes, another way would be to use SetProcessDefinitionVersionCmd, but be extra cautious with tasks that were active prior to the migration: Activiti's database model holds some redundant data (some of it for performance reasons), so for those tasks you are better off studying the DB tables first and then inspecting the state before and after migration. For example, keeping up with a simple changed attribute is much easier than with a boundary event added to an active user task, which affects the "execution tree".
I would also advise comparing SetProcessDefinitionVersionCmd's implementations in Activiti and Camunda; it is sad to have such enhancement efforts separated, but that is another story.
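To illustrate the DynamicBpmnService route (available from Activiti 6 onwards; the task id reviewTask, the assignee, and the definition id below are made up):

```java
// Sketch of tweaking a deployed definition with DynamicBpmnService
// (Activiti 6+) instead of migrating instances. The task id, assignee,
// and process definition id are invented for illustration.
import com.fasterxml.jackson.databind.node.ObjectNode;
import org.activiti.engine.DynamicBpmnService;
import org.activiti.engine.ProcessEngine;
import org.activiti.engine.ProcessEngines;

public class DynamicChange {
  public static void main(String[] args) {
    ProcessEngine engine = ProcessEngines.getDefaultProcessEngine();
    DynamicBpmnService dynamicService = engine.getDynamicBpmnService();

    String processDefinitionId = "accountDefaulters:2:67890";

    // Override a user task property without redeploying the definition
    ObjectNode infoNode = dynamicService.changeUserTaskAssignee("reviewTask", "supervisor");
    dynamicService.saveProcessDefinitionInfo(processDefinitionId, infoNode);
  }
}
```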

How to illustrate incoming data from one OR two sources in BPMN model?

I've studied BPMN in coursework; this is my first time applying it to real-world scenarios that don't follow any of my textbook examples.
I am trying to illustrate a process where a client can either upload a CSV file, manually enter records, or both. At the end of the day, all records are loaded into a production database via a script. At the moment, I've got it like this:
But unless one reads the notes attached to each object, this says that both uploaded AND manual data will be present.
In BPMN, how would I designate that path "A", path "B", OR both could be valid? How do I label the gateway? I anticipate putting the scripting step between the data input and the production database, but again I'm not quite sure how to specify that the script runs ONCE when data is present from EITHER feed (or both), rather than once per feed.
What would this typically look like? Thanks in advance.
In BPMN, to express that path A, path B, or both could be valid ways forward, you can use an "inclusive or" gateway. I would typically label the split with a question and the outgoing paths with the "answers", in other words the conditions under which each path is activated. If I understand your example correctly, a possible solution could look like the following.
Whether you want to use the task types I used depends a bit on your more specific context. My task types in that example would mean that for the "upload" the process is "waiting for an incoming message", while in the case of manual entry it is "waiting for a user to complete the task" (by entering the required data).
The example also assumes that you know, before you reach the inclusive-or gateway, which channels you will want to use this time.
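For reference, here is a minimal sketch of that split/join wired up with the Activiti BPMN model classes (only an illustration; the ids, the condition variables csvProvided and manualEntry, and the task names are all invented):

```java
// Sketch of the "CSV upload, manual entry, or both" split using an
// inclusive gateway (Activiti model classes). Every path whose condition
// is true is activated; the join waits only for the activated paths.
import org.activiti.bpmn.model.InclusiveGateway;
import org.activiti.bpmn.model.Process;
import org.activiti.bpmn.model.ReceiveTask;
import org.activiti.bpmn.model.SequenceFlow;
import org.activiti.bpmn.model.UserTask;

public class InclusiveSplitSketch {
  public static void main(String[] args) {
    Process process = new Process();
    process.setId("recordIntake");

    InclusiveGateway split = new InclusiveGateway();
    split.setId("split"); // label in the diagram: "Which channels this time?"

    // Upload path: the process waits for an incoming file/message
    ReceiveTask upload = new ReceiveTask();
    upload.setId("uploadCsv");
    upload.setName("Receive uploaded CSV");

    // Manual path: the process waits for a user to enter the records
    UserTask manual = new UserTask();
    manual.setId("enterRecords");
    manual.setName("Enter records manually");

    InclusiveGateway join = new InclusiveGateway();
    join.setId("join"); // after this, the load script runs exactly ONCE

    SequenceFlow toUpload = new SequenceFlow("split", "uploadCsv");
    toUpload.setConditionExpression("${csvProvided}");
    SequenceFlow toManual = new SequenceFlow("split", "enterRecords");
    toManual.setConditionExpression("${manualEntry}");

    process.addFlowElement(split);
    process.addFlowElement(upload);
    process.addFlowElement(manual);
    process.addFlowElement(join);
    process.addFlowElement(toUpload);
    process.addFlowElement(toManual);
    process.addFlowElement(new SequenceFlow("uploadCsv", "join"));
    process.addFlowElement(new SequenceFlow("enterRecords", "join"));
  }
}
```

Placing the single script task after the inclusive join is what guarantees it runs once regardless of which feeds delivered data.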

Noob Guidance on a Parallel Task Workflow (without Visual Studio)

This is going to be my first workflow, and I could use a little guidance.
I have a list I'm using for requests when a user needs their profile changed (e.g., a change of office location). The change has to be made in AD, PeopleSoft, and another database. Right now, I have it set up so requesters submit an item to a list, and alerts go out to the different people responsible for making the updates in AD, PeopleSoft, etc. However, there has been enough frustration with missed emails and the like that I've been asked to track these requests via a workflow.
So essentially, I need to track a request that goes out to multiple users, who will then need to confirm that the task has been completed. I found ![an approval workflow diagram](http://officeimg.vo.msecnd.net/en-us/files/989/238/ZA102615287.jpg), which is a very good representation of what I want to do, but the accompanying article does a very confusing job of explaining how to do it: http://office.microsoft.com/en-us/sharepoint-help/all-about-approval-workflows-HA102771433.aspx
Can someone point me to the workflow type that I need and the steps to implement it? OOB/SPDesigner please; I don't have VS on my machine.
Thanks,
Scott
I will start by saying that implementing parallel tasks in a single workflow is hard.
What you can do is customise the OOB approval workflow (the one mentioned in the article) to suit your needs. This will give you insight into how SharePoint workflows work and how they are designed.
It will look confusing at first (very confusing), since, as I said, it is a complex workflow to set up, until you start to understand how it works.
Make sure you make a copy of the approval workflow before modifying it, so you can still use the original if needed.

How to access results of Sonar metrics for use with applications like PowerPivot

I'm trying to run a number of applications with known failure rates through Sonar, with hopes of deciding which metrics are most valuable in determining whether a particular application will fail. Ultimately I'll be making some sort of algorithm that will look at the outputs of whatever metrics I'm using and generate a score from 1 to 100. I've put about 21 applications through Sonar, and the results have been stored in a MySQL database. I originally planned to use PowerPivot to find relationships in the data, but it seems like the formatting of the tables doesn't lend itself well to that. Other questions on Stack Overflow have told me that Sonar's tables are unformatted and that I should instead use the Web Service API to get the information. I'm unfamiliar with the API and was unsuccessful in trying to do what I wanted by looking through Sonar's API documentation.
From an answer to another question:
http://nemo.sonarsource.org/api/timemachine?resource=org.apache.cxf:cxf&format=csv&metrics=ncloc,violations_density,comment_lines_density,public_documented_api_density,duplicated_lines_density,blocker_violations,critical_violations,major_violations,minor_violations
This looks very similar to what I'd like to have, except that I'm only looking at each application once (I'm analyzing a sample of all the live applications on a grid), which means the timemachine service isn't really what I'm looking for. Would it be possible to generate a similar table, except that instead of the stats for a particular application per date, it showed the statistics for an application and all of its classes, etc.?
If you're not familiar with the WS API, you can also create your own Sonar plugin to achieve whatever you want: it is written in Java and it will execute on every analysis you run. This way, in the code of this custom plugin, you can do whatever you want: flush the metrics you need to an output file, push them into a third-party system, etc.
Just take a look at how to write a plugin (most probably you will create a Decorator). There are also concrete examples to help you get started faster.
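As a very rough sketch of what such a Decorator could look like (assuming the old pre-4.x Sonar plugin API; the class name and the choice of metrics are just illustrative):

```java
// Sketch of a Decorator under the old (pre-4.x) Sonar plugin API:
// it visits every resource of the analyzed project and dumps a couple
// of metrics as CSV rows. Class name and metric choice are illustrative.
import org.sonar.api.batch.Decorator;
import org.sonar.api.batch.DecoratorContext;
import org.sonar.api.measures.CoreMetrics;
import org.sonar.api.measures.Measure;
import org.sonar.api.resources.Project;
import org.sonar.api.resources.Resource;

public class MetricsExportDecorator implements Decorator {

  @Override
  public boolean shouldExecuteOnProject(Project project) {
    return true; // run on every analysis
  }

  @Override
  public void decorate(Resource resource, DecoratorContext context) {
    Measure ncloc = context.getMeasure(CoreMetrics.NCLOC);
    Measure violations = context.getMeasure(CoreMetrics.VIOLATIONS);
    if (ncloc != null) {
      // In a real plugin you would append this to a file or push it to
      // an external system instead of printing it.
      System.out.println(resource.getKey() + ";" + ncloc.getValue()
          + ";" + (violations != null ? violations.getValue() : 0.0));
    }
  }
}
```

A flat per-resource CSV like this is much friendlier to PowerPivot than the raw database tables.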

Automating WebTrends analysis

Every week I access server logs processed by WebTrends (for about 7 profiles) and copy ad clickthrough and visitor information into Excel spreadsheets. A lot of it is just accessing certain sections, finding the right title, and then copying the unique visitor information.
I tried using WebTrends' built-in query tool, but it is really poorly done (it only offers a drag-and-drop system instead of a text-based one), and it has a maximum number of parameters and a maximum query length. As far as I can tell, the tools in WebTrends are not suitable for my purpose of automating the entire web-metrics-gathering process.
I've been given access to the raw server logs, but it seems redundant to parse them given that they are already being processed by WebTrends.
To me this seems very scriptable, but how would I go about doing it? Is screen scraping an option?
I use ODBC for querying metrics and numbers out of WebTrends. We even fill a scorecard with all key performance metrics.
It's in German, but maybe the idea helps you: http://www.web-scorecard.net/
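As a rough sketch of what such an ODBC pull can look like (Java via the old JDBC-ODBC bridge, which only exists up to Java 7; the DSN WebTrends and the table/column names are invented, so check your WebTrends ODBC driver documentation for the real schema):

```java
// Rough sketch: pulling metrics over ODBC through the old JDBC-ODBC
// bridge (available only up to Java 7). The DSN "WebTrends" and the
// table/column names are invented; use your driver's real schema.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class WebTrendsPull {
  public static void main(String[] args) throws Exception {
    Class.forName("sun.jdbc.odbc.JdbcOdbcDriver");
    try (Connection con = DriverManager.getConnection("jdbc:odbc:WebTrends");
         Statement st = con.createStatement();
         ResultSet rs = st.executeQuery(
             "SELECT PageTitle, UniqueVisitors FROM AdClickthrough")) {
      while (rs.next()) {
        // Emit CSV rows that Excel can open directly
        System.out.println(rs.getString(1) + "," + rs.getLong(2));
      }
    }
  }
}
```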
Michael
Which version of WebTrends are you using? Unless this is a very old install, there should be options to schedule these reports to be emailed to you, and also to bookmark queries. Let me know which version it is and I can make some recommendations.