
How to Identify Maximum ROI in 12 Months with Process Intelligence

In this session from PEX Live Process Mining 2023, Preeti Ramesh, VP of Customer Success, Marcin Citak, Senior Director, and Evan Plemenos, Director of Business Development, discuss how Skan's clients are leveraging Process Intelligence to increase process transparency and drive ROI.

Transcript:

Hi, everyone. Welcome to the third session of the day. We're so excited to have you. Once again, I am Elizabeth Jackson, editor at the PEX Network. Another friendly reminder: this is a live session, we are actually here. So if you have any questions, just write them in the Q&A box, and the speakers and I will be able to see them and answer them at the end of the session, if not before.

Anyway, without further ado, let me introduce our next session: 'Beat the Downturn with Process Intelligence: How to Identify Maximum ROI in 12 Months'. Our speakers today are Evan Plemenos, Director of Business Development, Marcin Citak, Senior Director, and Preeti Ramesh, VP of Customer Success at Skan.

Great to be here. Thanks, everyone, for dialing in. I want to give a quick overview of Skan and then turn it over to Preeti and Marcin to discuss some of Skan's best use cases and how enterprises are using us to help beat the downturn and identify greater ROI within a short timespan.

Skan is a process intelligence platform that uses computer vision and AI to give businesses insight into process health and help them operationalize and quantify the impact of their transformation strategies. I know a lot of the PEX Network webinars we've had so far have touched on process mining and other, more traditional solutions. But I also want to talk about how process intelligence starts from the front end in a different way and provides alternative capabilities that can be a competitive advantage.

It is International Women's Day, and on that note, I want to start with Preeti. So Preeti, could you talk to us a little bit about the drawbacks of traditional event logs, how they make process optimization difficult, and where the gaps in the information are as organizations try to optimize?

Thanks, Evan, and thanks for the wishes. Very excited to be here. It's a great question. Process mining is designed to see only committed states of work; that's what event logs capture, and it doesn't really look at the work that happens in between. What that means is that for many organizations and industries where there's a lot of unstructured work and a lot of interaction between humans and digital platforms, only a very small percentage of the data actually gets into the logs. So it works for some industries and some use cases, but it doesn't work for all.

The second big challenge, because the data is all sitting in event logs, is that it requires heavy integration to pull the data out of those logs and then turn it into insights that are meaningful to business users.

What we've heard from customers is that it can be a very lengthy process. It typically takes 9 to 12 months, a lot of scripting, a lot of code, a lot of developers. Process intelligence, on the other hand, goes beyond process mining. It uses computer vision, AI, and ML to stitch the whole process together and show you what is really happening on the ground.

The advantage is that it can now tell us how many application and activity switches are happening, which activities and screens are taking longer, and how many additional touches are happening, all before the data ever hits an event log. So I like to say that, in a way, it actually creates data while still leveraging the existing event logs.
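As an illustration of the kind of signal Preeti describes, here is a minimal sketch of how application switches and per-screen dwell times might be derived from a stream of desktop observation events before anything reaches a system's event log. The field names and values are hypothetical, not Skan's actual schema.

```python
# Hypothetical sketch: deriving app-switch counts and per-screen dwell times
# from an ordered stream of desktop observation events.
from datetime import datetime

events = [  # one row per observed interaction, ordered by time
    {"timestamp": datetime(2023, 3, 8, 8, 19, 0), "app": "CRM", "screen": "CaseView"},
    {"timestamp": datetime(2023, 3, 8, 8, 19, 40), "app": "Email", "screen": "Inbox"},
    {"timestamp": datetime(2023, 3, 8, 8, 21, 5), "app": "CRM", "screen": "CaseView"},
]

app_switches = 0
dwell_seconds = {}  # (app, screen) -> total seconds spent before moving on

for prev, curr in zip(events, events[1:]):
    if curr["app"] != prev["app"]:
        app_switches += 1  # the user hopped to a different application
    gap = (curr["timestamp"] - prev["timestamp"]).total_seconds()
    key = (prev["app"], prev["screen"])
    dwell_seconds[key] = dwell_seconds.get(key, 0) + gap

print(app_switches)   # -> 2
print(dwell_seconds)  # -> {('CRM', 'CaseView'): 40.0, ('Email', 'Inbox'): 85.0}
```

None of this requires integrating with the CRM or email system; the metrics fall out of the observed interaction stream itself.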

That's amazing. Marcin, with that newly created data giving organizations insights they might not have had access to before, how are they leveraging process intelligence to supercharge their optimization efforts?

Thank you, Evan, and good morning, everyone. Before I answer the question directly, let me start by highlighting how process discovery generally happens today.

Based on my experience in consulting and elsewhere, there are three typical ways to document processes. The first is discovery workshops, where you have tens of people, 50 or 60, in various rooms whiteboarding the process from end to end. While that's quite effective and captures information from those individuals, it's limited to the specific individuals who are in the room.

The second way to collect that information is time studies: you physically observe an individual conducting the work, so you can see it, document it, and time it from beginning to end. The limitations there are, first, scalability, because you have to manually observe every single individual doing the work, and second, it's very time-consuming. And the minute you observe someone doing the work, they change the way they do it. Those are big factors that limit the traditional ways of documenting and collecting business processes.

That brings us to the third way, the alternative Preeti touched on from a process intelligence perspective: using technology to observe the work. An agent runs in the background on individuals' computers and observes the work being done by each individual, how it's truly being done.

Now, as Preeti mentioned, people jump between applications. There's hardly a process that runs in one single application from end to end. We see numerous customers that have 50, 60, 100 applications, of which maybe 10 to 15 are true process applications. So how do you stitch that together without integration? How do you connect it using the data that's available to you? That brings an example to mind. Just recently, working with a customer, we identified an application that we had not enabled for data collection. We found that two weeks into the discovery process.

Once we learned that, we were able to enable it and identify the data we wanted to collect from that application in a matter of five business days. It wasn't a one- or two-month integration exercise; it was enabling it, identifying the data, and displaying it.

And the true power of this data is that you now have a dataset collected at the most granular level, which you can expose, summarize, slice, and report, and truly provide leadership the data they need to make decisions based on their strategic initiatives.

And it sounds like that can essentially be customized to the enterprise's specific goals, given that level of granularity. With all of the new capabilities that granular data enables, I think we would all love to hear about some of the key use cases across our clients. Marcin, I know you have some of the deepest insurance experience we have at Skan, so why don't you go first, and then Preeti, perhaps you can talk about a financial services use case.

For context, as I mentioned, I'm very much focused on the insurance space; the majority of my customers are in insurance, so I'll draw on those experiences. Before I jump into specific use cases, I know all of us on this call have interacted in one way or another with an insurance organization, maybe your health insurance, maybe your auto or home policy. Keeping that in mind, to support the number of customers these organizations have, they run dedicated departments and call centers to support those processes and answer those customers' questions.

Many of these organizations are going through transformational change. Ultimately, how do you move from analog phone calls to more digital channels? How do you manage the omni-channel, whether through chats, calls, emails, or other ways of collecting that information?

One specific example I want to highlight: we conducted a four-month study to understand the utilization of the individuals working an end-to-end claim. Those claims can run anywhere from three to nine months, so as you can imagine, the end-to-end process is quite extensive.

What we learned is that the organization identified up to 50% in operational efficiency gains they could achieve over four years. In the first year alone, they themselves put the value at between $2 and $3 million. And it's not from staff reduction; it is truly, simply, optimization, enabling people to do more with less, avoiding the staff hiring they had originally projected because they recognized they already had the capacity to do the expected work.

The second example: as I mentioned, these large call center departments may receive millions of calls a year. The question is, how do you collect insightful information from those calls? Theoretically, you say, I'll go to the system, see what changed, and I'll know what changed and have outcomes for it.

But many times what happens is, for example, I may call and ask: what is my deductible, when is my bill payment due? The individual looks up my policy, I give my information, and they tell me, yep, you're due December 31. What's important to note is that the agent never took any action in the system. So, to Preeti's point, there won't be any logs; there won't be any information, unless the agent documents it via a survey or other means to tell you what happened on that call. So one of the exercises we're working on right now with a customer is enabling the organization to better understand the reasons for the calls and ultimately move their customers from calls to self-service, whether via the website, via newsletters, or via other preemptive things they can offer their end customers.

And again, by leveraging the technology, it can analyze and provide you that insight and intent based on the screens the agents are clicking through. Preeti, I know you have vast experience in the finance space; I would love to kick it to you.

Great, Marcin. Before I get into finance, I just want to add something to your call center example. I know you're working with a lot of insurance customers, but this problem cuts across industries, right? People may call it a call center, customer support, or incident management, but one way or another, everybody is servicing a customer, a broker, or an administrator. So there are issues, complaints, and calls throughout every industry.

You were talking about the reasons for the call; some of the examples I was personally involved in revolve around first pass rate: how do you get the call right on the very first attempt, successfully answering and providing a resolution? The main focus there is, first, to reduce call volume, so people are not calling back again and again over an unresolved incident, ticket, or issue, and second, of course, to improve the customer experience. Metrics like first pass rate and productivity are all key.
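To make the metric concrete, here is a minimal sketch of one common way to compute first pass rate, assuming repeat calls about the same issue share a case ID. The IDs are invented for illustration.

```python
# Illustrative first-pass-rate calculation: a case that generated exactly one
# inbound call is counted as resolved on the first attempt.
from collections import Counter

calls = ["C1", "C2", "C2", "C3", "C4"]  # hypothetical case IDs, one per call

touches = Counter(calls)  # how many calls each case generated
first_pass_rate = sum(1 for n in touches.values() if n == 1) / len(touches)
print(f"first pass rate: {first_pass_rate:.0%}")  # -> 75% (C2 needed a call-back)
```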

We implemented a project recently with a very large health care company, and there the potential ROI savings were 10x to 20x, huge numbers in potential savings. The project itself took only three to four months with the process intelligence tool, and we heard that, on the other hand, if they had done it themselves, it would have taken a team of six to eight Six Sigma black belts anywhere from a couple of months to a year or two to put the whole thing together. Personally, I thought that was a great use case that came across recently.

In the financial sector, again, there's a lot of data, a lot of manual work, and still a lot of unstructured data as well. We're seeing a lot of use cases around accounts payable, accounts receivable, and reconciliations, in addition to the call center work we talked about. Teams are looking to improve end-to-end automation by identifying bottlenecks, exactly where they can reduce manual intervention. The vision there is to achieve more straight-through processing. That alone can easily result in an ROI of 5 to 10x, depending on team size. So a lot of examples are coming from the financial sector.

The third use case I wanted to mention is a little more generic. Any organization that faces customers is bound to measure its SLAs. So a lot of use cases come forward where customers either don't have any benchmark for an SLA and want to create one, or they have an estimated benchmark and want to measure themselves against it, to say: hey, what percentage of my cases are not meeting the benchmark, and why?

This can also have a very direct bearing on any penalties that could be levied, as well as on customer retention, customer experience, NPS scores, CSAT, et cetera. These can add up very quickly to hundreds of thousands of dollars and even loss of reputation.
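A hedged sketch of the SLA measurement Preeti describes: given per-case handling times and a benchmark, what share of cases miss it? The 48-hour benchmark and the durations below are made-up numbers.

```python
# Illustrative SLA-breach measurement over a sample of cases.
SLA_HOURS = 48  # hypothetical benchmark

case_hours = [12, 30, 55, 47, 96, 20]  # end-to-end handling time per case

breaches = [h for h in case_hours if h > SLA_HOURS]
print(f"{len(breaches) / len(case_hours):.0%} of cases miss the {SLA_HOURS}h SLA")
# -> 33% of cases miss the 48h SLA
```

The "why" part then comes from drilling into the breaching cases' activity paths, which is where the granular observation data matters.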

Wow. So what we're seeing, if I break it down, is savings within 12 months worth millions of dollars, whether that comes from extracting more value out of an existing workforce or fixed cost, or from reductions in very common front-of-house systems and processes, such as claims and customer-facing call centers, that a lot of people deal with. That's pretty incredible. One thing we've seen in the market is a lot of people doubling down on an automation strategy, and I'd imagine a few of our listeners on today's webinar are considering that. So Preeti, could you continue and talk a little bit about how process intelligence can help inform, support, and bring success to an automation strategy, whether within a process or across an enterprise?

Sure. This reminds me that very recently our EVP of Strategy and Customer Transformation, Vinay Mummigatti, posted an article on LinkedIn titled 'Achieving Enterprise Automation Success', about the roadmap for achieving exactly that. I thought it was extremely timely and extremely insightful, as he shares firsthand experience from running huge automation centers in his Bank of America and LexisNexis days. People can read more about it there.

One report he referred to there, by Ian White, states that 30% to 50% of automations actually fail. Why? I think it's because of the hammer-and-nail approach. So much money has been invested in automation tools left, right, and center that you now have to look for a project just to make use of the tool. And there are so many tools out there: RPA, chatbots, AI, ML, et cetera. So which one to apply? But I think the more important question is whether to apply one at all. How do we identify that?

I think it's very important to use a process intelligence tool that is focused on what I would call automation discovery. It gives an end-to-end view of the current-state process and analyzes the KPIs and the bottleneck patterns, so you get a more holistic picture of exactly which area needs an automation tool and, if so, which one. And sometimes you may not even need a tool; it could be a simple change in user behavior or a small piece of integration code somewhere in between. So I think that's very important. To share a firsthand story: we were working with a large transformation and automation team in the insurance and healthcare space.

Luckily for us, we were working on two processes side by side. The goal, obviously, was to deploy RPA; we were looking for opportunities where RPA could be deployed. But when we compared the two processes, one of them clearly showed such a huge degree of variation that applying any bot there would have just created technical debt. It wouldn't have solved any big problem, because there was simply too much variation in the process.

With the other one, we could clearly see that 60 to 70% of the cases were following a common path. So if an automation had been deployed there, it would have delivered real bang for the buck and made a significant impact.
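That comparison boils down to variant concentration: what fraction of cases follow the single most common activity path? A minimal sketch, with invented step names and case counts:

```python
# Illustrative variant analysis: measure how dominant the most common path is
# to judge whether a process is a good RPA candidate.
from collections import Counter

cases = {  # case ID -> ordered list of activities observed for that case
    "case1": ["intake", "validate", "approve"],
    "case2": ["intake", "validate", "approve"],
    "case3": ["intake", "rework", "validate", "approve"],
    "case4": ["intake", "validate", "approve"],
}

variants = Counter(tuple(path) for path in cases.values())
top_variant, top_count = variants.most_common(1)[0]
share = top_count / len(cases)
print(f"{share:.0%} of cases follow the dominant path")  # -> 75%
# A high share suggests a bot pays off; a low share (high variation) suggests
# automating now would mostly create technical debt.
```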

I think it's very important to understand that before you act. Simply put, I think it has to be a three-step process: first you define, then you standardize, then you scale, and then you move forward with whatever automation you want to apply.

And with continuous observation, we can actually measure and quantify the impact of those initiatives. So when you're redesigning, standardizing, and then scaling, you're maximizing the return on your automation investment.

That's right.

That's awesome. Marcin, could you tell us a little bit about how continuous observation not only helps with understanding the process and with the redesign and implementation Preeti spoke about, but also quantifies ROI early and makes sure those gains are carried forward in perpetuity?

Maybe I'll tackle two questions at once. I'm seeing an audience question about how our technology works and operates, and I think it weaves into your question.

At the most basic level, there's a virtual agent that runs on individual machines and uses computer vision to collect information from the screen and the interaction: which applications individuals are working with, and the data from those applications.

So if I click a Start button, as an example, it will record that at 8:19 AM Pacific time on March 8th, I clicked Start in the application Zoom. The same happens when I switch between different applications, and our customers are able to decide which applications they want to observe and recognize as part of the process. And because it runs silently on individuals' machines, it can be targeted very specifically at a particular pain point, or it can observe over the long term.
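Skan's internal event format isn't public, but a hypothetical record for Marcin's Zoom example might look something like this sketch:

```python
# Hypothetical shape for a single observation event; not Skan's real schema.
from dataclasses import dataclass
from datetime import datetime, timezone, timedelta

PACIFIC = timezone(timedelta(hours=-8))  # PST, matching the March 8th example

@dataclass
class ObservationEvent:
    timestamp: datetime   # when the interaction happened
    application: str      # which application had focus
    ui_element: str       # what the user interacted with on screen
    action: str           # click, type, switch, ...

event = ObservationEvent(
    timestamp=datetime(2023, 3, 8, 8, 19, tzinfo=PACIFIC),
    application="Zoom",
    ui_element="Start button",
    action="click",
)
```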

Our customers use Skan in both of those modes. For a specific pain point, like the one I mentioned earlier around identifying the reasons for calls: is there a spike in calls because of a policy change in insurance? Is that one of the reasons? They might observe for six to twelve months, collect that information, and get that insight.

Anyone with Six Sigma or Lean Six Sigma experience, or who has been through value realization on larger transformation programs or operational project initiatives, will recognize how difficult it is to evaluate and confirm whether the value was actually reached at the end of the project.

Speaking from personal experience, building a business case, making assumptions, and putting numbers on a page is usually the easier part, right? The question then becomes: how do I truly know that I've gained the value I expected? Normally that requires answering a series of questions. What are the metrics? How will I track them? Who will provide the raw data? How will I establish a baseline before I start collecting? And who will be responsible for providing this data on a monthly, quarterly, or annual basis and reporting it to the director of finance, et cetera, to validate that the savings are actually there?

From a continuous observation perspective, there are really three big areas of benefit. First, you can use process intelligence to baseline the current organization before you make any changes, process improvements, or technical landscape changes.

Second, you can define future targets. And third, using the data available, which as I mentioned is data at the most granular level, the screen the individual is interacting with, you can clearly measure performance against those set targets and quite easily identify and realize the value.
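A minimal sketch of that baseline-target-measure loop, with illustrative numbers only:

```python
# Illustrative value-realization check: baseline before the change, a target
# set by the program, and the value actually measured after go-live.
baseline_handle_min = 42.0   # average handle time before the change
target_handle_min = 30.0     # future-state target
measured_handle_min = 33.5   # observed after go-live via continuous observation

expected_gain = baseline_handle_min - target_handle_min
realized_gain = baseline_handle_min - measured_handle_min
print(f"value realized: {realized_gain / expected_gain:.0%} of target")  # -> 71%
```

The point is that the baseline and the post-change measurement come from the same observation stream, so nobody has to hand-assemble the numbers for the finance review.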

Going back to the second point: how do I know that the project or initiative I launched had the impact I expected, and not some third project or larger strategic initiative? It's about understanding the incremental improvements as they take effect, knowing when they went live in the application landscape, and then seeing what actually changed. Did it change? Did it change as intended, or are you eroding value you expected to gain?

More importantly, many individuals on this call are probably very interested in process documentation and process deviation. That's the other aspect: leveraging the information to document and create the process maps and process flows for you. You can identify and highlight when individuals, groups, or whole organizations deviate from the expected process, from that quote-unquote golden process that, I'm sure, every organization has clearly documented and readily available. A bit of hyperbole there, but you get the point.

That leads to an example. We're currently working on a great case of operationalizing the observation of call center agents: not only establishing the baseline I discussed and identifying opportunities, but also observing and understanding the utilization of individuals, in order to benchmark internally by geography, by region, and by team, or to give individual managers the information they need to guide their direct reports. Is it a training need? Or were they recently hired and actually became effective much more quickly than expected?

So how do you leverage the information to help the individuals who really need it?

That's awesome. I know we're going to hand it over to audience Q&A, but I want to put one thing to Preeti for a moment, though I know Marcin, you've got experience with it too. Could we briefly touch on how the unit cost report has changed the way some of our financial services and insurance clients reimagine their costs deeper into the process, and even renegotiate revenue with that level of data?

Sure, yeah. Unit cost is a very interesting concept. It tells you exactly how much it costs to process a particular case or request. I've seen some very interesting examples. One is in the call center space, which, like I said, is common across finance, insurance, healthcare, everywhere. When you look at the cost of a claim, you find certain types of calls that are more expensive and don't meet your benchmark. For example, you can see that every time someone calls about an account balance, it's very easy, but every time they have an issue with the website or the portal, it takes much longer to process.
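One simple way to compute such a unit cost, sketched with made-up handle times and a made-up loaded labor rate:

```python
# Illustrative unit cost per call reason: observed average handle time
# multiplied by a fully loaded labor rate. All numbers are hypothetical.
LABOR_RATE_PER_MIN = 0.90  # loaded cost of an agent, $/minute

handle_minutes = {          # average observed handle time per call reason
    "account_balance": 3.5,
    "portal_issue": 14.0,
}

unit_cost = {reason: mins * LABOR_RATE_PER_MIN for reason, mins in handle_minutes.items()}
print(unit_cost)  # -> {'account_balance': 3.15, 'portal_issue': 12.6}
# Portal-issue calls cost roughly 4x an account-balance call in this sketch.
```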

Now, a lot of large companies are regionally distributed, so there is a lot of difference in cost structure and in the labor cost itself. So we were discussing with the execs whether it's prudent to optimize routing: where can I send my higher-cost claims, which by nature take a lot of time? Can we send them to a region with a lower cost of processing or a faster application turnaround time? That was one example.

The other example: even when customers call expecting the same service, some of them are using outdated or disconnected systems, compared with others who have probably moved on to more modern ways of interacting with the call center.

So is there an incentive, or can you build something into the premium, the plan, or the kind of service they use, to say: hey, if you come through this route, it's going to be a little more expensive, the annual cost is going to be higher? Basically, you incentivize them to move to a channel where you can control the cost of the work. Those were two very interesting examples that came across recently.

To add a little to that, Preeti: another example we see is when a health insurance organization brings on a new customer, from a business-to-business contracting perspective. Many of those contracts stipulate that services will be paid for based on the number of calls or incidents, or on the number of employees the client organization has. So, again, this goes back to a set of assumptions about how much the customer will be billed, or how much the customer will be charged for those services. What unit cost enables is really understanding how much a given claim costs me, from an individual operator perspective, to maintain and manage. I can then roll that information up from an individual claim, to a set of claims, to a group of claims, and understand: is it driven by a particular employer, the company we serve as a customer? Is it regional? Or is it an internal problem that needs to be fixed?

Really, from a unit cost perspective, it lets you say: take customer A, there are actually more incidents than we expected. Did more items occur than expected, or than the contractual obligations anticipated? Then we need to have a conversation, and here's the data explaining why we want to discuss it. It really opens up transparency and enables smart conversations based on data, not on a set of assumptions that may be inaccurate.
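A rough sketch of the roll-up Marcin describes, comparing observed claim volumes and costs per B2B customer against the contract's assumed volume; all names and numbers are hypothetical:

```python
# Illustrative roll-up: individual claim costs aggregated per B2B customer,
# then compared against the volume assumed in the contract.
from collections import defaultdict

claims = [  # (b2b_customer, observed_unit_cost_dollars)
    ("customer_A", 12.60), ("customer_A", 3.15), ("customer_A", 12.60),
    ("customer_B", 3.15),
]
contract_assumed_claims = {"customer_A": 2, "customer_B": 2}

by_customer = defaultdict(list)
for cust, cost in claims:
    by_customer[cust].append(cost)

for cust, costs in by_customer.items():
    assumed = contract_assumed_claims[cust]
    flag = "over contract assumption" if len(costs) > assumed else "within assumption"
    print(cust, f"claims={len(costs)}", f"cost=${sum(costs):.2f}", flag)
# customer_A claims=3 cost=$28.35 over contract assumption -> data for the
# renegotiation conversation Marcin mentions.
```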

I'm having a little problem with my video here. All right, we only have about one minute left, so I just want to throw out at least one audience question: what are the main differences between process mining and process intelligence, specifically related to the implementation of the two approaches?

Preeti, do you want to go with that?

Yeah, I can take that; that was my first topic. Just to explain a bit more: process intelligence goes beyond process mining. It's not just about event logs; it goes beyond taking data from event logs. I'd describe it as a hybrid between collecting data on the digital interactions happening between the human and the technology, plus the data that may be coming from some of the logs.

So it's a hybrid of both models put together well. A well-integrated hybrid model of task mining and process mining is what I would call process intelligence.

Fantastic. I just want to close briefly by thanking everyone for attending and listening in. We hope it was interesting and informative. Our contact information is here on the screen that Elizabeth is displaying. We have plenty of decks and materials we'd love to send out, including Vinay's article that was mentioned earlier; we're happy to send a link to that, and always happy to get in touch, discuss the challenges you're facing, and maybe show a demo of how we might be able to support you. So thanks again, folks.

Absolutely. Thanks, everyone, for attending.
