This post is the third in a three-part series on critical elements of CloudFactory’s quality management framework for document and data processing. The first post covers Quality Assurance and the three quality assurance workflow models we use to turn unstructured data into high-quality, valuable assets that fuel business growth for financial services and financial technology companies. The second post covers Quality Control—how we handle it and how it differs from Quality Assurance. This post, the final in the series, covers the Continuous Improvement piece of our quality management framework.

“The largest room in the world is the room for improvement.” ~Anonymous

As with change, improvement often happens whether we plan for it or not.

Imagine a financial services company whose largest client is suddenly at risk of churning. The customer success team swoops in to untangle the issues and implement immediate improvements.

Better late than never?

Photo by bohlam

Perhaps.

Consider a business process outsourcing firm handling your financial technology company’s data processing. For the umpteenth time, you tell them that they’re not meeting your expectations.

How do they respond?

Perhaps they apply yet another “improvement” band-aid, attempting to fix ongoing:

  • Quality issues—which ultimately cause your team to lose time.
  • Training issues—which magnify the cascade of poor quality results.
  • Scale issues—which leave them unable to keep up with demand.
  • Timeliness issues—which mean they’re consistently late and behind schedule.

Time is a significant factor for you. You’re swimming in a backlog of data with an urgent need to catch up. You can’t afford to delay finding a new data processing partner.

What do you do?

Look for a partner with a proven quality management framework that incorporates:

  • Quality assurance—to give you the confidence that the work is done well.
  • Quality control—to alert you when quality is at risk.
  • Continuous improvement—to mitigate risk and lift quality over the long haul.

In two previous blog posts, we shared CloudFactory’s quality assurance and quality control approaches. This post details the continuous improvement component of our quality management framework.

To begin, let’s define what we mean by continuous improvement.

Image by Sergey Nivens

What is continuous improvement?

Continuous improvement is the practice of monitoring, evaluating, and improving your data and document processing operations and, ultimately, your products and services.

Depending on what’s being improved, you might also hear continuous improvement referred to as quality improvement, process improvement, or performance improvement.

For our clients in the financial technology and financial services spaces, continuous improvement is a never-ending quest for greater accuracy, higher speed, and overall client satisfaction, which leads to customer retention, growth, and increased profitability.

How do we do it? Read on.

Two common approaches to continuous improvement

At CloudFactory, we primarily rely on two approaches for achieving continuous improvement, both from the Lean methodology: the Plan-Do-Check-Act cycle (PDCA) and Root Cause Analysis.

1. Plan-Do-Check-Act Cycle

Image by Elnur

The PDCA cycle is the most common model for bringing continuous improvement to your data processing practice. We use it to bring ongoing improvements in accuracy, speed, and value for our clients in the finance industry.

Plan: Establish the objectives, processes, and KPIs necessary to obtain expected results.

Do: Carry out the plan as a pilot or a limited study.

Check: Compare the results achieved to the expected results. We typically recommend using error analysis for this phase.

Act: If the analysis reveals a successful pilot, roll out the changes and begin planning for the next cycle and even more improvement. If not, work through the cycle again and continue until you achieve results worth rolling out on a broader scale.
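
The loop structure of the cycle is easy to see in code. Here is a minimal Python sketch, with an invented Plan type and a stubbed pilot measurement, of how PDCA repeats until a pilot meets its KPI target. It illustrates the control flow only, not CloudFactory’s actual tooling.

    from dataclasses import dataclass
    import random

    @dataclass
    class Plan:
        description: str
        target_dpmo: float  # KPI: maximum acceptable defects per million opportunities

    def run_pilot(plan: Plan) -> float:
        """Do: carry out the plan as a limited pilot and measure its DPMO.
        Stubbed with a random draw; in practice this is a real pilot study."""
        return random.uniform(100, 600)

    def pdca(plan: Plan, max_cycles: int = 5) -> bool:
        for cycle in range(1, max_cycles + 1):
            measured = run_pilot(plan)         # Do: limited pilot
            if measured <= plan.target_dpmo:   # Check: compare results to target
                print(f"Cycle {cycle}: {measured:.0f} DPMO meets target; roll out.")
                return True                    # Act: broad rollout, plan the next cycle
            print(f"Cycle {cycle}: {measured:.0f} DPMO misses target; revise and repeat.")
            plan = Plan(plan.description + " (revised)", plan.target_dpmo)  # Act: adjust
        return False

    pdca(Plan("Reduce keying errors in invoice capture", target_dpmo=400))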

One of our AR & AP Automation clients requires an error rate under 400 defects per million opportunities (DPMO). Their team (and ours) is delighted that our error rates routinely hit 220 DPMO.
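
If DPMO is new to you, it is simply defects normalized to a million opportunities for error: DPMO = defects ÷ (units × opportunities per unit) × 1,000,000. A quick worked example with hypothetical volumes:

    # Hypothetical month: 50,000 documents, 20 keyed fields (opportunities)
    # per document, and 220 defective fields caught in QC.
    defects = 220
    units = 50_000
    opportunities_per_unit = 20

    dpmo = defects / (units * opportunities_per_unit) * 1_000_000
    print(f"{dpmo:.0f} DPMO")  # 220 defects across 1,000,000 opportunities -> 220 DPMO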

2. Root Cause Analysis

Root cause analysis is an iterative process by which you drill down into a problem until you uncover the root cause of an error.

We conduct root cause analysis whenever concerns or changes arise. For instance, a client may need to ramp and scale after sudden growth or a new round of funding. Or throughput concerns sneak in—is it the new tool or something else? We might also apply root cause analysis when a client launches a new use case or requests overage hours.

Generally, the steps involved in root cause analysis are as follows:
  1. Define the problem or change.
  2. Collect data.
  3. Identify causal factors.
  4. Drill down to the root cause (sketched just after this list).
  5. Recommend and implement solutions.
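
The drill-down in step 4 is often done with a technique like the “5 Whys”: keep asking why an effect occurred until you reach a cause you can actually fix. Here is a toy Python sketch of that iteration; the cause chain is invented for illustration.

    # Toy "5 Whys" drill-down: follow each effect back to its cause until no
    # deeper cause is recorded. The chain below is invented for illustration.
    cause_of = {
        "invoice error rate spiked": "new analysts mis-keyed vendor codes",
        "new analysts mis-keyed vendor codes": "training omitted the updated code list",
        "training omitted the updated code list": "task guidelines were out of date",
    }

    effect = "invoice error rate spiked"
    while effect in cause_of:
        effect = cause_of[effect]
        print(f"Why? -> {effect}")

    print(f"Root cause: {effect}")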

When accuracy or speed is below target or at risk, we use root cause analysis methods to root out underlying issues rather than treat surface symptoms and put out fires.

Image by Vaeenma

Root cause analysis is most beneficial when quality is below target or at risk.

Practical applications of continuous improvement

The PDCA cycle and root cause analysis are more than good theories. CloudFactory implements those and other continuous improvement techniques for clients in practical ways.

For instance, for a large fintech client with more than 1,000 CloudFactory data analysts on the job, we might recommend developing a knowledge base analysts can turn to as needed for FAQs, task guidelines, performance statistics, and lessons-learned updates.

For another finance industry client in search of greater than 99% accuracy, we might recommend an accuracy-based job dispatcher that directs the next tasks in the queue to analysts with the highest accuracy ratings.
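
The dispatcher’s internals aren’t published here, but the core idea fits in a few lines: keep available analysts in a priority queue ordered by accuracy and assign each incoming task to the best-rated one. A minimal sketch with invented names and ratings:

    import heapq

    # Available analysts as (negated accuracy, name); heapq is a min-heap,
    # so negating the rating pops the most accurate analyst first.
    pool = [(-0.995, "analyst_a"), (-0.991, "analyst_b"), (-0.987, "analyst_c")]
    heapq.heapify(pool)

    for task in ["invoice-1041", "receipt-2217", "statement-309"]:
        neg_acc, analyst = heapq.heappop(pool)  # highest-accuracy available analyst
        print(f"{task} -> {analyst} ({-neg_acc:.1%} accurate)")
        # In production the analyst would rejoin the pool on completion, with a
        # rating continuously re-estimated from QA and QC results.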

And for yet another client, we might recommend directed learning, whereby team leads work alongside struggling analysts as they complete tasks.

In “Brief History of Continuous Improvement,” published in The Journal for Quality and Participation, Six Sigma expert Tom Quick says that this problem-first approach is the best way to apply continuous improvement methods.

If you look closely at the origins [of continuous improvement], you can see that the methods sprang from the need to solve a specific problem. There was not a methodology looking for a problem to solve. We seem to have it backward today; we teach methods and then go looking for problems. The need should drive the method, just as form follows function. We should get back to this. We should identify problems and then apply a method rather than teach a method and search for an application.¹

Of course, Quick’s approach requires that quality practitioners first be aware of the methods and tools at their disposal. And as Nancy Tague’s 600-page tome, The Quality Toolbox², illustrates, there are many.

Continuous improvement for a managed workforce

Image by Leo Wolfert

At CloudFactory, quality improvement is baked into our outsourced data and document processing operations; we pursue it through a rich set of tools, among them:

  • Weekly analyst feedback sessions
  • Instant error feedback for analysts
  • Comprehensive, task-based training
  • Lessons-learned documentation
  • Analyst SLAs
  • Escalation paths
  • Quality improvement trackers
  • Feedback loops for accuracy and throughput

For a more thorough examination of how CloudFactory manages quality from start to finish, download Outsourced Data Processing: A Quality Management Framework for the Finance Industry.

What's next?

How can we help drive continuous improvement for your company? Let us know.

CloudFactory is driving innovation in the financial services and financial technology industries by processing hundreds of millions of documents each year for visionary companies across the finance industry. We bring a clear, well-defined approach to performance management, scaling, and elasticity, so you can move volume up and down while maintaining high speed and accuracy. Learn more about our data and document processing work.

¹ Quick, Tom. “Brief History of Continuous Improvement.” The Journal for Quality and Participation 42, no. 1 (April 2019): 1–2. Accessed December 7, 2021.
² Tague, Nancy R. The Quality Toolbox, Second Edition (e-book). ASQ Quality Press. Available at: https://asq.org/quality-press/display-item?item=E1224. Accessed December 23, 2021.
