
 

The Process Intelligence Playbook

An actionable guide to demystifying the buzz, and setting up your process intelligence effort for success.


 

Introduction and Playbook Overview

Thank you for reading the Process Intelligence Playbook. We created it to capture what we have learned alongside our customers on their process intelligence journeys. As the title suggests, The Process Intelligence Playbook is intended to be an actionable guide to demystifying the buzz and setting up your process intelligence effort for success. We are truly excited about the potential for Process Intelligence to revolutionize the way enterprises approach productivity, and we hope readers come away with a better understanding of this emerging technology space and find success improving and optimizing their processes in a data-driven way.

A couple of questions to keep in mind as you read:

• Who is the Process Intelligence Playbook created for? This playbook was created for operations, business and IT/transformation leaders focused on understanding and improving the productivity of processes in their organizations. While the playbook will cover some of the emerging advanced technologies on the forefront of process intelligence, detailed technical knowledge is not necessary to understand any of the sections of the playbook.

• Do I need to read the full playbook? No. There are several sections in this playbook. Depending on your goals and familiarity with different tools and technologies, you might find certain sections more useful than others. Each section is meant to stand on its own, so feel free to jump to the section that is most useful for you without needing to read everything at once.

• Will this playbook be updated? Yes. In fact, we suggest bookmarking this playbook. Just as the Process Intelligence space is continuously evolving, so will this playbook, which will be frequently updated with new insights and content.

We hope you find this playbook valuable and welcome any feedback, suggestions or thoughts for improving The Process Intelligence Playbook in the future.

Section 1:
Process Discovery Overview

 


 

The quest to improve business processes is as old as work itself. From the rise of industrial automation over a century ago to the era of the digital-first enterprise, companies have pursued the same goal: improving the productivity of processes and work.

While the approaches and techniques for improving processes have evolved over the last 100 years, one thing has remained the same: You can’t improve your processes unless you understand your processes.

This is the focus of the Process Intelligence Playbook: Helping enterprises understand how they can develop an accurate view of their processes so that they can improve them.

Process Discovery has become an especially exciting topic within the broader field of process improvement over the last 10 years. New technologies have enabled more automated and scalable methods for analyzing processes, and recent improvements in machine learning, computer vision, and AI have unlocked data-driven approaches to understanding process details in ways not previously possible.

However, understanding and differentiating between each of these new and powerful process discovery techniques can pose its own challenges. This section covers the common and emerging process discovery techniques, with simple-to-understand descriptions and the pros and cons of each approach.

MANUAL PROCESS DISCOVERY | HIRING A CONSULTING FIRM

One of the most common approaches used by enterprises for process discovery is hiring a consulting firm to conduct an internal business assessment. This assessment typically involves the consulting firm interviewing employees about how a process is completed, as well as directly observing the process, as with a time and motion study.

Based on the output of those interviews and observations, business analysts at the consulting firm will build a process map or workflow based on how they believe the process is being done, and will likely recommend opportunities for improvement based on best practices.

This consultative approach is popular because it brings an independent evaluation of the process, and consulting firms can share industry-wide best practices for improving the way a process is being completed.

However, this manual approach to process discovery also has a number of shortcomings:

• Polanyi’s paradox: “We know more than we can tell”. This cognitive phenomenon explains that in many cases, humans have an intuitive understanding of how to perform a task, but cannot verbalize the reasoning behind it. Common examples of this paradox include driving a car, or explaining how one recognizes someone’s face. Polanyi’s Paradox also applies when asking someone how exactly they complete a task or a process. As a result, process discovery interviews often miss critical steps within a process that the employee being interviewed is unable to articulate.

Polanyi's Paradox

"We know more than we can tell"

• Limited Perspective: Because interviews typically only involve a small subset of employees, they can only provide a very narrow snapshot of how a process is being completed. As a result, interviews often miss common variations or permutations of how work is actually being done by employees who have not been interviewed. 

• Disruption to work: Interviews can be disruptive to productivity and work, as they involve pulling a company’s most knowledgeable employees away from their actual day-to-day job to take part in the interview.

• Outdated Results: These consulting projects often take many months to complete. However, processes and work are dynamic and constantly shifting, particularly in the age of digital transformation. As a result, the completed analysis from a consulting project is often already outdated by the time it is delivered.

 

Manual Process Discovery Summary
  Manual process discovery is often led by external consultants, who interview and observe employees completing tasks and work.
  These consultative assessments are popular because they provide an external analysis of processes.
  Interviews have several drawbacks, including "Polanyi's Paradox" ("We know more than we can tell.") and disruption to employees and work.
  Other drawbacks of manual process discovery are that it reveals only a single snapshot of how a process is being done, and that consulting projects often take so long to complete that the results are outdated almost as soon as they are delivered.

 

PROCESS MINING | Analyzing processes using event logs

The term Process Mining became broadly popularized in 2011 with the release of “The Process Mining Manifesto”, published by the IEEE Task Force on Process Mining, led by Dr. Wil van der Aalst (known as the “Godfather of Process Mining”).

The Process Mining Manifesto defines process mining as follows:

Process Mining

"Analysis of operational processes based on software event logs"

“Process mining techniques are able to extract knowledge from event logs commonly available in today’s information systems. These techniques provide new means to discover, monitor, and improve processes in a variety of application domains.”


 

At the core of process mining is the event log, defined as follows:

Event log

"A timestamp produced by a piece of software after a user defined change, such as creating a new account."

Process Mining tools extract these event logs by integrating directly with enterprise software systems. Then, by applying statistical analysis to the event logs collected across these different systems, process mining builds a data-driven view of how a process is being completed.

This approach was a paradigm shift towards understanding how work was being done. Rather than relying on interviews or anecdotal feedback, process mining tools use definitive data from the collected logs. By applying analytics tools and models on top of those data points, process mining can build accurate, data-driven views of how a process has historically been completed.
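To make this concrete, here is a minimal sketch in Python of the core statistical step behind many process mining tools: grouping timestamped events into per-case traces, then counting which activity directly follows which. These "directly-follows" counts are the raw material of a data-driven process map. The toy event log and field names ("case_id", "activity", "timestamp") are illustrative assumptions, not any particular vendor's schema.

```python
from collections import Counter, defaultdict

# Hypothetical event log: one record per timestamped event.
# Schema and data are invented for illustration.
event_log = [
    {"case_id": "A-1", "activity": "Create Account", "timestamp": "2021-03-01T09:00"},
    {"case_id": "A-1", "activity": "Verify Identity", "timestamp": "2021-03-01T09:15"},
    {"case_id": "A-1", "activity": "Approve", "timestamp": "2021-03-02T11:30"},
    {"case_id": "A-2", "activity": "Create Account", "timestamp": "2021-03-01T10:00"},
    {"case_id": "A-2", "activity": "Approve", "timestamp": "2021-03-01T16:45"},
]

# Group events into per-case traces, ordered by timestamp.
traces = defaultdict(list)
for event in sorted(event_log, key=lambda e: e["timestamp"]):
    traces[event["case_id"]].append(event["activity"])

# Count how often activity X is immediately followed by activity Y
# across all cases: the edges of a data-driven process map.
directly_follows = Counter()
for trace in traces.values():
    for current, nxt in zip(trace, trace[1:]):
        directly_follows[(current, nxt)] += 1

for (src, dst), count in directly_follows.most_common():
    print(f"{src} -> {dst}: {count} case(s)")
```

In a real deployment, these counts would be computed over millions of events and rendered as a process graph, but the principle is the same.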

Despite the game-changing shift that process mining has enabled, several challenges have emerged with these event log-based techniques.

Challenges with event log based process mining

• Data quality issues: Like any other database, event logs often suffer from data quality issues such as incorrect timestamps, noisy data, and missing data attributes. Correcting these problems often requires significant up-front effort and support from IT teams to clean up data before the actual process analysis can start. According to the 2020 Gartner Process Mining Market Guide, "Typically, 80% of the efforts and time are spent on locating, selecting, extracting and transforming the data.... Process mining often reveals data quality problems that need to be dealt with urgently”.

• Integration effort: Because process mining works by analyzing software event logs, it needs to be integrated with the software, which requires up front development effort and support from IT teams. This can slow down and complicate the rollout of process mining projects.

• Many process steps are not visible from an event log: Process mining can only identify process steps in software that produces event logs, yet many applications do not. Even within software that does produce logs, those logs do not capture every possible action. As a result, while process mining can provide a great high-level view of a process, it can often miss significant steps and details.


• The paradox of process mining - you need process knowledge to get process knowledge: Successful implementation of process mining often requires existing knowledge of a process, in order to guide the necessary integration. In fact, according to Deloitte's 2021 Global Process Mining Survey, the most critical skill for the success of a process mining implementation was "Process Knowledge". It is counterintuitive that the most critical success factor for getting value from a technology that is supposed to deliver process visibility is existing process visibility.

 

Process Mining Summary

  Process Mining uses software event logs to identify the steps in a process in a data-driven way, rather than relying on interviews or anecdotal feedback.
  Process Mining implementations often struggle due to data quality issues.
  Process mining can only identify process steps visible via software event logs. As a result, while it can provide a great high-level view of a process, it often misses significant steps and details not visible from those logs.

TASK MINING | Analyzing processes using computer vision

Task Mining takes a different approach to data-driven process discovery: rather than analyzing back-end event logs, it analyzes the 'front end' of work, the computer screen. Task Mining can be defined as follows:

Task Mining

"using computer vision to analyze steps of a process from computer screen shots"

Task Mining has emerged over the past couple of years in large part due to an impressive increase in both the effectiveness and speed of computer vision algorithms, dubbed “The Industrialization of Computer Vision” by Stanford University’s Artificial Intelligence Index Report 2021. According to the report, training time for computer vision algorithms decreased by 87% between December 2018 and July 2020, and accuracy improved to almost 99% in 2020 (up from 85% in 2013).

Figure: computer vision training time and accuracy trends, from Stanford University's Artificial Intelligence Index Report 2021


Task Mining tools apply these computer vision algorithms to screenshots of specific process steps, extracting detailed and specific information as data points that an algorithm can interpret to build a process map. Because task mining can observe every action being completed on the desktop, it builds a much more detailed view of a process than process mining, which is limited to observing only those actions that can be identified from back-end software logs. Task Mining can be a powerful approach when organizations only need to know the details of very specific parts of a process ('tasks'). Additionally, because of task mining's focus on the front end and UI, it has held particular attraction for organizations looking to implement RPA-type automation, which is also focused on the front end/UI.
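To give a feel for the interpretation step described above, here is a minimal sketch that assumes the computer vision extraction has already produced a stream of timestamped desktop actions, and shows how those actions might be segmented into discrete tasks using idle-time gaps. The action records and the 5-minute threshold are invented for illustration; real task mining tools use far more sophisticated statistical methods.

```python
from datetime import datetime, timedelta

# Hypothetical desktop actions, as a task mining tool might extract them
# from screenshots. All records below are invented for illustration.
actions = [
    ("2021-03-01 09:00:05", "Open CRM", "crm.example.com"),
    ("2021-03-01 09:00:40", "Copy customer ID", "crm.example.com"),
    ("2021-03-01 09:01:10", "Paste into ERP search", "erp.example.com"),
    ("2021-03-01 09:45:00", "Open CRM", "crm.example.com"),  # after a long gap
    ("2021-03-01 09:45:30", "Update ticket status", "crm.example.com"),
]

IDLE_GAP = timedelta(minutes=5)  # assumed threshold: a new task starts after 5 idle minutes

tasks, current = [], []
previous_time = None
for ts, action, app in actions:
    t = datetime.strptime(ts, "%Y-%m-%d %H:%M:%S")
    # Start a new task whenever the user has been idle longer than the gap.
    if previous_time and t - previous_time > IDLE_GAP:
        tasks.append(current)
        current = []
    current.append((action, app))
    previous_time = t
if current:
    tasks.append(current)

for i, task in enumerate(tasks, 1):
    print(f"Task {i}: " + " -> ".join(a for a, _ in task))
```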

While Task Mining overcomes some of the specific challenges present in process mining, it also comes with its own share of challenges, most notably difficulty stitching together all of the specific data points into longer term processes.

Challenges with task mining

• Difficulty Identifying Long-Term Processes: While task mining tools have had success in accurately building a view of processes that take minutes or possibly days, they have struggled to identify end-to-end processes spanning longer periods of time, as well as processes that involve multiple employees. In fact, this shorter-term focus is the origin of the name “Task Mining” itself, referring to the shorter-term, more tactical focus on “tasks” rather than longer-term “processes”.

• Limited context within a full process: Because task mining provides only limited visibility into the scope of the larger process, it is often unclear where the analyzed tasks sit within that process (and their potential impact on the rest of it), how frequently they occur (particularly across different process variants), and how much they affect the overall efficiency of the process.

• Task Mining is not scalable: Due to the computational intensity of the statistical approaches used in task mining, its scalability is limited both in the length and complexity of the tasks being analyzed and in the number of employees who can be part of the analysis.

• Inconsistency of results: Because of the non-deterministic nature of the statistical approaches used in task mining, it often delivers inconsistent results. This means that the exact same tasks or process steps can be analyzed twice, but deliver varying results. This makes it particularly challenging to accurately measure the impact of process improvements by comparing pre- and post-implementation metrics using task mining.

• Privacy and Security Limit Cloud-Based Task Mining Solutions: Because task mining uses computer vision to analyze screenshots, for security and privacy reasons it almost always needs to be deployed on a customer's own servers. In fact, cloud-based task mining solutions, which need to send those screenshots and other potentially sensitive data to external servers, are often a show-stopper for large enterprises.

Task Mining Summary

  Task mining uses computer vision to analyze the ‘front end’ of processes and tasks: the computer screen.
  Because task mining can observe every action being completed on the desktop, it is able to build a much more detailed view of a process compared with process mining.
  Task mining is popular with organizations interested in deploying RPA, as both technologies are UI focused.
  Task mining is 'short-term' focused and struggles to identify end-to-end processes over longer periods of time (beyond hours or days), as well as processes that involve multiple employees.
  Enterprise privacy and security considerations prevent cloud-based task mining solutions.

PROCESS INTELLIGENCE | The best of both worlds (and beyond)

Process Mining and Task Mining take two fundamentally different approaches towards data-driven process discovery.

• Process Mining focuses on creating a longer term view of processes, but misses critical process details.

• Task Mining provides a much richer view of process details, but sacrifices the longer, end-to-end view of a process.

So, how is Process Intelligence different? Process Intelligence combines the best of both process mining and task mining approaches to create a highly detailed and end-to-end view of a process.

• Process Intelligence builds a detailed, end-to-end view of processes: Process Intelligence takes a novel approach to process discovery by building an end-to-end view of processes (the benefit of process mining) with a high level of detail about each process step (the benefit of task mining). Process Intelligence achieves this by using computer vision to identify each detailed process step, then applying advanced machine learning to connect each data point into a complete, end-to-end process (see the sketch after this list).

• Process Intelligence creates a detailed process 'data model': Process Intelligence goes beyond just ‘mining’ process data points, creating a data model of digital work through the application of advanced machine learning.

• Process Intelligence uncovers actionable data-driven opportunities for improving processes: Process Intelligence analyzes the key factors and drivers of a process to determine data-driven opportunities for improving them.

• Continuous improvement and measurement: Process Intelligence builds a quantitative baseline for continuously measuring the impact and ROI of process improvements on key metrics such as efficiency, throughput, and variable cost.
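As a toy illustration of the "stitching" idea referenced above, the sketch below links detailed desktop-level actions (task mining's strength) to back-end cases (process mining's strength) via a shared identifier. In practice this matching is done with advanced machine learning; here a literal ID match stands in for it, and all records are invented.

```python
from collections import defaultdict

# Hypothetical desktop actions and back-end events, invented for illustration.
desktop_actions = [
    ("09:01", "Copy order number 555-A from email"),
    ("09:02", "Paste 555-A into ERP order screen"),
    ("09:30", "Copy order number 777-B from email"),
]
backend_events = [
    ("555-A", "Order Created"),
    ("555-A", "Order Approved"),
    ("777-B", "Order Created"),
]

# Index back-end events by case ID, then attach matching desktop detail,
# yielding an end-to-end view with step-level detail for each case.
cases = defaultdict(lambda: {"backend": [], "desktop": []})
for case_id, event in backend_events:
    cases[case_id]["backend"].append(event)
for ts, action in desktop_actions:
    for case_id in cases:
        if case_id in action:
            cases[case_id]["desktop"].append((ts, action))

for case_id, detail in sorted(cases.items()):
    print(case_id, detail)
```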

 

Process Intelligence Summary

 Process Intelligence combines the best of both process mining and task mining approaches to create a highly detailed and end-to-end view of a process.
 Process Intelligence applies advanced machine learning algorithms to process data points to create a new data-model of each process.
 Process Intelligence goes beyond simply 'mining' and uncovering process data points, and focuses on transforming those data points into 'intelligence' to identify opportunities for improving processes.
 Process Intelligence applies analytics on those process data models to analyze the key factors and drivers of each process and determine data-driven opportunities for improving them.
 Process Intelligence measures the impact of process improvements such as automation and process redesign to determine the ROI across metrics such as efficiency, throughput, and variable cost.

SECTION 2:
PROCESS INTELLIGENCE USE CASES


Now that we understand how Process Intelligence can deliver a more comprehensive and detailed view of processes and work, how exactly can it help organizations improve their processes?

A few of the most common Process Intelligence use cases include:

• Process Transparency: Before improving a process, it is important to understand the current baseline of how it is being done. Process Transparency focuses on building an initial view of a process and work.

• Automation: Automating basic repetitive tasks using Robotic Process Automation or similar tools.

• Process Optimization: Improving process efficiency metrics by redesigning or otherwise streamlining the way a process is being done.

• Digital Transformation: Replacing manual process steps with new technology and digital tools, or otherwise modernizing or updating existing tools or digital steps within a process.

• Conformance and Compliance:  Ensuring that a process is being completed in a way that adheres to required process steps to meet operational, regulatory, and customer compliance.

 

Process Intelligence Use Cases

USE CASE #1 | Building Process Transparency

According to Deloitte’s 2021 Global Process Mining Survey, the second most common expectation companies have for process mining is “Process Transparency”, leaping ahead of common optimization-focused outcomes such as cost reduction, digital transformation, automation, and even process redesign.

“We’ve seen two years’ worth of digital transformation in two months. From remote teamwork and learning, to sales and customer service, to critical cloud infrastructure and security…”

– Satya Nadella, CEO of Microsoft

In reality, this focus on building process transparency should come as no surprise.

2020 and 2021 resulted in a historically unprecedented acceleration of digital transformation, in many cases completely changing the way enterprises get work done. In particular, as employees and work shifted to remote scenarios, a new era of digital tools became necessary. Employees began collaborating through new online tools, and steps in a process that previously involved paper quickly shifted to a digital format.

A result of this rapid transformation is that organizations' knowledge of their processes has fallen behind. Standard operating procedures have become obsolete, process steps and workflows have fundamentally changed, and existing performance metrics have become outdated and inaccurate.

As operations and business leaders start evaluating opportunities for optimizing their operations and processes in 2022 and beyond, they first need to build an updated understanding of the new baseline of work in their organizations.

Here are ways that Process Intelligence can help business leaders understand the 'new normal' of work in their organizations:

• Building accurate, detailed process maps to reflect the new digital reality of processes and work, including process variations and permutations.

• Creating a data-driven baseline of key process metrics such as utilization and throughput, to understand the exact time and effort involved in completing a process, even across different variations (a minimal sketch follows this list).

• Assessing the impact of process interventions such as redesign and automation, by quantifying the effect those changes have had on KPIs and efficiency metrics.
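As a minimal illustration of such a baseline, the sketch below computes average cycle time and hands-on effort from per-case process data. The schema and numbers are invented for illustration only.

```python
from datetime import datetime

# Hypothetical per-case records: case ID, start, end, and hands-on effort
# in minutes. All values are assumptions for illustration.
cases = [
    ("C-101", "2021-06-01 09:00", "2021-06-02 15:00", 42),
    ("C-102", "2021-06-01 10:30", "2021-06-01 17:00", 35),
    ("C-103", "2021-06-02 08:15", "2021-06-04 11:45", 58),
]

fmt = "%Y-%m-%d %H:%M"
cycle_hours = [
    (datetime.strptime(end, fmt) - datetime.strptime(start, fmt)).total_seconds() / 3600
    for _, start, end, _ in cases
]
efforts = [effort for *_, effort in cases]

# A simple quantitative baseline: average elapsed cycle time versus average
# hands-on effort; the gap between them is often wait time.
print(f"Cases analyzed:        {len(cases)}")
print(f"Avg cycle time (hrs):  {sum(cycle_hours) / len(cycle_hours):.1f}")
print(f"Avg effort (minutes):  {sum(efforts) / len(efforts):.1f}")
```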


USE CASE #2 | Process Optimization and Redesign

Process optimization deals with adjusting and changing the way a process is being completed, or redesigning specific portions of a process in order to maximize the efficiency or output of that process. However, because of the complexity of factors which impact the efficiency of a process, determining the right area to focus on optimizing can be a challenging task. 

Process Intelligence ensures successful process optimization by building detailed visibility into the 'what, why, and how' of a process, so operations leaders can focus on the improvement opportunities that will have the biggest impact. 

Here are some examples of how Process Intelligence can support data-driven process optimization:

• Determining the important factors in a process: Because Process Intelligence quantifies the steps in a process by building a new data model, it is simple to run complex analytics such as root-cause analysis and key driver analysis. These techniques help identify the most impactful factors and variables in a process so leaders can focus on improving the areas that will lead to the largest impact and ROI.

• Forecasting results before implementing process changes: Once it has been decided which part of the process to optimize, Process Intelligence enables forecasting of the impact of process redesign or other changes via powerful analytical techniques such as simulation, forecasting, and what-if analysis (see the sketch after this list). This enables business leaders to validate that proposed changes will drive the expected impact before implementing those optimizations, saving time, reducing rework, and helping deliver on promised ROI.

• Analyzing impact on business KPIs: Process optimization is about more than just improving efficiency: it is about increasing process efficiency while simultaneously improving customer experience and satisfaction. With Process Intelligence, it is easy to connect process metrics to business metrics to analyze not only how process improvements have driven efficiency and cost reduction, but also their impact on customer-facing metrics like SLAs, customer satisfaction, and revenue.
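To illustrate the forecasting idea referenced above, here is a toy Monte Carlo what-if sketch comparing a baseline process against a scenario in which one manual step is automated. The step names and duration ranges are assumptions, not measurements from any real process.

```python
import random

# Assumed step durations in minutes, as (low, high) ranges. Invented for
# illustration; a real tool would derive these from the process data model.
steps = {
    "intake":        (5, 15),
    "manual re-key": (10, 30),
    "review":        (8, 20),
    "approval":      (5, 10),
}

def simulate(step_ranges, runs=10_000):
    """Average end-to-end cycle time over many simulated cases."""
    totals = [
        sum(random.uniform(lo, hi) for lo, hi in step_ranges.values())
        for _ in range(runs)
    ]
    return sum(totals) / len(totals)

baseline = simulate(steps)

# What-if scenario: automation cuts "manual re-key" to roughly a minute.
optimized = dict(steps)
optimized["manual re-key"] = (1, 2)
forecast = simulate(optimized)

print(f"Baseline avg cycle time: {baseline:.1f} min")
print(f"Forecast avg cycle time: {forecast:.1f} min")
print(f"Projected saving:        {baseline - forecast:.1f} min per case")
```

Real Process Intelligence tools would draw these distributions from observed process data rather than assumed ranges, but the what-if mechanics are the same.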


USE CASE #3 | Scaling Process Automation

Tools that enable automation of digital tasks and processes, such as robotic process automation (RPA), have surged into the forefront of transformation and IT budgets over the past several years. According to Gartner, worldwide RPA revenues have grown from $518M in 2017 to nearly $2B in 2021, enabled by RPA's ease of implementation as well as market wide acceleration of digital transformation.

Despite this impressive market growth, RPA has not been without its share of challenges, particularly in growing from pilot projects to scaled, organization-wide implementations. For every RPA success story, there is an example of underwhelming results, unclear ROI, or failed implementation.

Process Intelligence overcomes many of the critical challenges commonly faced with implementing RPA by:

• Letting data guide automation strategy: Automating based on guesswork or 'instinct' is a sure path to failed implementations, error-prone automation workflows, and significant bot rework. Process Intelligence uses data and machine learning to identify the best and most impactful opportunities for automation to ensure that the correct processes and tasks are being automated to maximize ROI.

• Generating a comprehensive automation business case: Building internal support for automation beyond an initial pilot requires a strong and well-defined business case. Process Intelligence makes it simple to define and quantify key baseline operational metrics such as transaction volume, turnaround time, and total effort, as well as quality metrics such as SLA and allowable error rate.

• Accelerating the creation of RPA workflows: With Process Intelligence, RPA developers can leverage exportable, in-depth Process Design Documents to easily and quickly build RPA workflows within leading RPA tools such as UiPath, BluePrism, Automation Anywhere, and more.

• Quantifying and validating ROI of automation efforts: Measure pre- and post-automation operational performance metrics to quantify and demonstrate clear ROI of RPA implementations and efforts, as sketched below.
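A minimal sketch of that pre/post ROI calculation, with all figures (volumes, labor cost, automation cost) invented for illustration:

```python
# Hypothetical pre- and post-automation measurements.
pre = {"cases_per_month": 1200, "minutes_per_case": 18}
post = {"cases_per_month": 1200, "minutes_per_case": 6}
hourly_cost = 40.0          # assumed fully loaded labor cost, $/hour
automation_cost = 36_000.0  # assumed annual license + maintenance, $

def annual_labor_cost(metrics):
    """Annual labor cost implied by monthly volume and per-case effort."""
    hours = metrics["cases_per_month"] * metrics["minutes_per_case"] * 12 / 60
    return hours * hourly_cost

savings = annual_labor_cost(pre) - annual_labor_cost(post)
roi = (savings - automation_cost) / automation_cost

print(f"Annual labor savings: ${savings:,.0f}")
print(f"First-year ROI:       {roi:.0%}")
```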


USE CASE #4 | Accelerating Digital Transformation

The unprecedented wave of digital transformation over the last 18 months has presented new and exciting opportunities for operations and business leaders to modernize the way their organizations are getting work done. Enterprises are changing the way they interact with their customers, how their employees collaborate, and how data is leveraged to make informed and accurate decisions. In the new age of digital transformation, it is not a matter of what, but when.

However, this surge of new digital tools has also created an unexpected challenge. As every organization and department has accelerated its adoption of digital tools, the result has been a bottleneck and competition for limited IT budget and resources to deploy and implement them.

Here are some ways Process Intelligence can help business and operations leaders build a data-driven business case to secure the necessary IT resources to deploy new digital tools to optimize their business:

• Identifying the exact, specific steps in a process that would most benefit from the introduction of new tools by identifying bottlenecks resulting from unnecessary manual process steps or outdated software and tools.

• Building a data-driven business case around the time and effort that would be saved through the introduction of new tools.

• Determining the ROI of new tools that have been deployed by tracking their exact utilization rate, and quantifying their impact on process throughput, efficiency, and other key operational metrics.


Use Case #5 | Ensuring Conformance and Compliance

In the era of accelerated digital transformation, a wave of new digital tools are enabling incredible productivity benefits. However, this sudden shift has also presented new challenges in ensuring processes are being followed correctly and in a compliant manner.

There are three types of compliance that Process Intelligence helps achieve:

• Operational Compliance: Confirming that actual process execution across the organization conforms to established best practices and organization-wide standards, and that key process steps are followed according to internal guidelines.

• Regulatory Compliance: Process Intelligence provides visibility into whether required regulatory steps are being completed in highly critical processes. Process Intelligence also can generate detailed and accurate process maps that can be used to meet state and national requirements for highly regulated industries such as insurance.

• Customer Compliance: Some industries and businesses have highly specific compliance and SLA requirements for individual customers. Process Intelligence makes it easy to ensure the required steps and SLAs are being met without having to introduce additional process steps or complexity to ensure oversight.

Beyond these compliance requirements, Process Intelligence also helps ensure general conformance to any standard operating procedures by providing visibility and validation of how often the required process steps are being followed. Additionally, Process Intelligence can identify when the required steps are not being followed, so operations leaders can identify opportunities for implementing new training and guidance.

SECTION 3:

GETTING STARTED WITH PROCESS INTELLIGENCE


 

BUILDING A PROCESS INTELLIGENCE PROJECT TEAM

Process Intelligence is a powerful tool for improving enterprise productivity. Another benefit of Process Intelligence is that the technology is fairly simple and quick to deploy, and can start delivering insights and value quickly.

However, to set up your Process Intelligence project for long term success, it is best to think of Process Intelligence as a team sport that requires input and consideration from multiple stakeholders across an organization.

Here are key stakeholders that should be part of your Process Intelligence Project Team.

 


Core Stakeholders: The stakeholders who will be directly involved in a Process Intelligence effort, and also responsible for implementing identified opportunities for process improvement.

• Line of business: Impact to business KPIs, implementation of improvements and impact on processes and work

• Transformation and automation team: Impact on tooling and assessing automation opportunities and ROI

• Operations: Impact to operational efficiency, reduction of bottlenecks, etc

• Process Subject Matter Expert: Input into process details and requirements as well as validating outputs of analysis and data governance

Support Stakeholders: Stakeholders who, while not directly involved in the Process Intelligence effort, will be responsible for critical approvals and assessments, particularly early in the project.

• IT: Integration of tools, and centralized rollout (SCCM)

• Security: Information management and control, security of information residency and retention policies

• Procurement and Accounting: Billing and contracting

• Legal: Evaluating compliance, privacy, and data classification requirements

Informed Stakeholders: Stakeholders who have a strong interest in the outcomes and insights from Process Intelligence efforts.

• Executive leadership: CTO and CIO Office - assessment and adoption of new technology

• Human Resources: Work productivity and reporting

 

PROCESS INTELLIGENCE PILOT PLAN AND EVALUATION

The first step of a Process Intelligence effort should be a pilot implementation. A pilot will validate the initial outcomes of the Process Intelligence project, work through technical and implementation details, and build internal support for a scaled effort. A Process Intelligence pilot should follow a three-phase approach:

• Step 1: Setup and Preparation

• Step 2: Process Discovery

• Step 3: Analysis

 


 

STEP 1: SETUP AND PREPARATION

• Identify Target Process: When running a process intelligence pilot, choose a single process to evaluate over the course of the pilot. A best practice is to choose a process of which you already have some understanding and visibility. This will help with validating the outcome of the initial process discovery, ensuring the technology is accurate and working as intended.

• Agree on Pilot Success Criteria: Define and agree on a clear set of success criteria with the Process Intelligence vendor. These criteria should be evaluated by your full project team. An objective set of criteria will not only help in evaluating the pilot, but will also enable an "apples-to-apples" comparison if you are evaluating more than one Process Intelligence tool across different pilots.

• Walk through process: Walk through the process with the Process Intelligence vendor. This will help ensure that the process is a good fit for a pilot. For example, processes that have heavy paper or 'offline' steps would not be a good fit for a Process Intelligence pilot.

• Discuss IT requirements: Discuss and understand the IT requirements with the vendor. Make sure to communicate these requirements to your IT and security teams before starting the pilot, to ensure necessary reviews and approvals are in place. Interrupting the pilot in the middle of analysis, or at a specific process step, can negatively impact the accuracy of the pilot results.

• Determine deployment process with IT: Talk to your IT team about their preferred method of deploying to end users, such as direct download, or centralized deployment via SCCM. Make sure the technology vendor supports the deployment method suggested by your IT team.

• Finalize pilot contract and approvals: Make sure to review the necessary pilot contract and approvals with your procurement and IT team ahead of the pilot kickoff to ensure there are no delays in initial deployment and analysis.

 

STEP 2: PROCESS DISCOVERY

• Analyze one process across 10-20 users: Run process discovery on the initial process across multiple users. Analyzing at least 10 users will ensure sufficient coverage and build visibility into potential variations of the process.

• Run the analysis for 2-3 weeks: While the exact time necessary for the analysis will vary depending on the length and frequency of the process that was selected, 2-3 weeks of analysis will typically be sufficient to build a quantifiable data model for a frequently run process that takes a couple of days to complete.

• Organize Weekly Review Sessions: Meet with key internal stakeholders and the process vendor at least once a week, if not more frequently. This will ensure any questions or issues are discussed and resolved as soon as possible.

• Create first PDD as output of initial discovery: After running analysis, ask the process intelligence vendor to create a Process Design Document (PDD) of the process that was analyzed. A Process Design Document provides a detailed, step-by-step description of the process, often with screenshots of each step. This PDD can be compared against existing process knowledge and expertise in the Analysis stage to validate outputs and learnings.

 

STEP 3: ANALYSIS

• Validate PDD outcomes: Compare the PDD output with existing process knowledge, and review it with internal experts. Note that a difference between the PDD output and internal expertise does not mean that the analysis is incorrect. In some cases, even in a pilot, Process Intelligence can uncover previously unknown insights about how a process is running, as well as new variants and permutations of a process.

• Analyze Process Metrics: Analyze process metrics such as actual throughput and effort. Compare against expected KPIs, and identify areas for potential process improvement based on gaps and bottlenecks.

• Identify potential outcomes: Identify an initial set of potential process improvements, such as process redesign or automation.

• Compare Against Initial Success Criteria: Review the output of the pilot against the initial success criteria. Schedule a formal review meeting with the key internal stakeholders.

• Complete Final Analysis and Report: Organize the results of the pilot, as well as initial findings and learnings, into a final report. Present the report and recommended next steps to the full set of core and informed stakeholders.

 

SECTION 4:

References and Additional Reading