In my two previous blog posts, I described dependency analysis and impact analysis. These two kinds of analysis focus on what you might call the steady state of your enterprise, or the enterprise at rest. But there is also the enterprise in motion, where we look at the behavior of the enterprise, in particular its business processes.
Enterprise Studio supports several kinds of time-based process analysis.
First of all, in BPMN process diagrams you can check the syntactic consistency of your processes. Enterprise Studio performs several dozen such checks, ranging from simple structural rules to more intricate ones.
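To give a feel for what a simple structural check involves, here is a minimal, self-contained sketch in Python. The toy node-and-flow representation and the two rules it checks (tasks need incoming and outgoing sequence flows, and every node should be reachable from a start event) are my own illustration, not Enterprise Studio's internal model or its actual rule set.

```python
# A toy in-memory representation of a small BPMN process: node kinds plus
# sequence flows. This format is an assumption made for this sketch only.
from collections import deque

process = {
    "nodes": {
        "start":  "startEvent",
        "check":  "task",
        "decide": "exclusiveGateway",
        "ship":   "task",
        "end":    "endEvent",
    },
    "flows": [("start", "check"), ("check", "decide"),
              ("decide", "ship"), ("ship", "end")],
}

def structural_issues(proc):
    """Report tasks without incoming/outgoing flows and unreachable nodes."""
    incoming = {n: 0 for n in proc["nodes"]}
    outgoing = {n: 0 for n in proc["nodes"]}
    succ = {n: [] for n in proc["nodes"]}
    for src, dst in proc["flows"]:
        outgoing[src] += 1
        incoming[dst] += 1
        succ[src].append(dst)

    issues = []
    for node, kind in proc["nodes"].items():
        if kind == "task" and incoming[node] == 0:
            issues.append(f"Task '{node}' has no incoming sequence flow")
        if kind == "task" and outgoing[node] == 0:
            issues.append(f"Task '{node}' has no outgoing sequence flow")

    # Reachability from start events via breadth-first search.
    reachable = set()
    queue = deque(n for n, k in proc["nodes"].items() if k == "startEvent")
    while queue:
        node = queue.popleft()
        if node in reachable:
            continue
        reachable.add(node)
        queue.extend(succ[node])
    for node in proc["nodes"]:
        if node not in reachable:
            issues.append(f"Node '{node}' is unreachable from any start event")
    return issues

print(structural_issues(process))  # [] for this well-formed toy process
```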
Second, you can analyze various properties regarding the efficient operation of your process, for instance in the context of Lean process management. Examples include the transfer of work between roles within a process (which you typically want to minimize to avoid time-consuming and error-prone hand-overs), or the amount of rework (i.e., loops in your process).
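As a rough illustration of these two Lean metrics, the following sketch counts hand-overs between roles and detects rework loops on a toy task graph. The role assignments and flow format are assumptions made for the example; they are not how Enterprise Studio represents or computes these properties.

```python
# Toy data: which role performs which task, and the sequence flows between tasks.
roles = {"check": "Clerk", "approve": "Manager", "rework": "Clerk", "ship": "Clerk"}
flows = [("check", "approve"), ("approve", "rework"),
         ("rework", "approve"), ("approve", "ship")]

def count_handovers(flows, roles):
    """Count flows whose source and target tasks are assigned to different roles."""
    return sum(1 for src, dst in flows
               if src in roles and dst in roles and roles[src] != roles[dst])

def has_rework_loop(flows):
    """Detect a cycle (rework loop) with a depth-first search over the flow graph."""
    succ = {}
    for src, dst in flows:
        succ.setdefault(src, []).append(dst)

    def visit(node, path):
        if node in path:
            return True
        return any(visit(nxt, path | {node}) for nxt in succ.get(node, []))

    return any(visit(start, set()) for start in succ)

print(count_handovers(flows, roles))  # 4: every flow crosses the Clerk/Manager boundary
print(has_rework_loop(flows))         # True: approve -> rework -> approve
```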
Third, for compliance and risk management purposes, you can cross-reference tasks and roles, controls and risks, to assess, for example, whether you have implemented separation of duties correctly or taken the right mitigating measures against operational risks.
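A basic separation-of-duties check boils down to cross-referencing task-to-role assignments against a set of conflict rules. The sketch below shows the idea on invented data; the rule format and assignments are hypothetical and do not reflect Enterprise Studio's compliance model.

```python
# Invented task-to-role assignments for illustration.
task_roles = {"create invoice": "Clerk",
              "approve invoice": "Manager",
              "pay invoice": "Clerk"}

# Pairs of tasks that, per a (hypothetical) compliance rule, must be
# performed by different roles.
sod_rules = [("create invoice", "approve invoice"),
             ("create invoice", "pay invoice")]

def sod_violations(task_roles, sod_rules):
    """Return the rules whose two tasks are assigned to the same role."""
    return [(a, b) for a, b in sod_rules
            if task_roles.get(a) is not None
            and task_roles.get(a) == task_roles.get(b)]

print(sod_violations(task_roles, sod_rules))
# [('create invoice', 'pay invoice')] -- both assigned to Clerk
```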
Finally, the most advanced types of analysis assess the behavior of processes over time. In Enterprise Studio, you can add timing information to your BPMN processes, such as the delay, probability distribution, and mean processing time of each task. When you have done that for all tasks in a business process model, you can analyze the critical path of your process and calculate how long it will take on average and what the maximum time will be. The screenshot below illustrates this: the labels of the tasks show their average processing time, and in the lower left-hand corner you see the outcome of a critical path calculation.
Calculate the processing time with process models
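At its core, a critical path calculation is a longest-path computation over the task graph, using the mean processing times as weights. The following sketch shows that idea on a toy process with a parallel branch; it deliberately ignores delays, probability distributions, and exclusive choices, which the real analysis in Enterprise Studio does take into account.

```python
# Mean processing times (in minutes) per task and the flows between tasks.
durations = {"receive": 5, "check credit": 30, "check stock": 10,
             "approve": 15, "ship": 20}
flows = [("receive", "check credit"), ("receive", "check stock"),
         ("check credit", "approve"), ("check stock", "approve"),
         ("approve", "ship")]

def critical_path(durations, flows):
    """Longest path through the task graph, using mean durations as weights."""
    preds = {t: [] for t in durations}
    for src, dst in flows:
        preds[dst].append(src)

    finish, best_pred = {}, {}

    def earliest_finish(task):
        # Memoized: a task finishes after its slowest predecessor plus its own duration.
        if task not in finish:
            before = max((earliest_finish(p) for p in preds[task]), default=0)
            if preds[task]:
                best_pred[task] = max(preds[task], key=earliest_finish)
            finish[task] = before + durations[task]
        return finish[task]

    last = max(durations, key=earliest_finish)
    path, node = [last], last
    while node in best_pred:
        node = best_pred[node]
        path.append(node)
    return list(reversed(path)), finish[last]

path, total = critical_path(durations, flows)
print(path, total)  # ['receive', 'check credit', 'approve', 'ship'] 70
```

Here the slower of the two parallel checks (credit, 30 minutes) determines the path, so speeding up the stock check would not shorten the process at all; that is exactly the kind of insight a critical path analysis gives you.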
This kind of analysis helps you find bottlenecks and unnecessary delays and is therefore very useful for optimizing your business processes. Similar analyses can be made to calculate, for example, the average cost associated with executing the process, helping you to optimize it further.
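The cost variant follows the same pattern: the expected cost per case is the per-task cost times the expected number of times each task runs, summed over all tasks. The numbers below are made up purely to show the arithmetic; they are not tied to any particular model.

```python
# Invented per-task costs for one process instance.
task_cost = {"check": 12.0, "approve": 25.0, "ship": 8.0}

# Expected executions per case: in this toy model only 'check' sits in a
# rework loop with a 20% failure rate, so it runs 1 / (1 - 0.2) = 1.25 times
# on average (standard expected value of a geometric retry).
expected_runs = {"check": 1 / (1 - 0.2), "approve": 1.0, "ship": 1.0}

average_cost = sum(task_cost[t] * expected_runs[t] for t in task_cost)
print(f"Average cost per case: {average_cost:.2f}")  # 12*1.25 + 25 + 8 = 48.00
```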
Naturally, this kind of analysis requires accurate input data, and that can be a sensitive matter, for obvious reasons. For (semi-)automated tasks, timing data can often be acquired from the IT systems used; in call centers, for example, extensive statistics are kept to optimize processes. But for many other types of work, this information is not readily available, and people will often feel threatened if you want to measure it. Given the potential for misuse of this type of data, they cannot be blamed for being wary. Moreover, they may change their behavior when you measure it, so the data will not be accurate anyway. To avoid this, it has to be clear that the outcomes of such an analysis will also be used to their benefit, for example to reduce their workload, and not just to ‘optimize’ the headcount. This is an important aspect of any analysis that involves people.