Lean Six Sigma is a methodology that aims to improve process performance by minimising waste and reducing variation. It combines Lean Manufacturing and Lean Enterprise principles with Six Sigma to eliminate waste and improve quality.
The origins of Lean Six Sigma can be traced back to 1986, when Motorola developed strategies to compete with higher-quality Japanese products. Japanese manufacturers used the Kaizen approach (continuous improvement) in product development to produce world-class products of high quality.
In the 1990s, the American businessman Larry Bossidy introduced Six Sigma into manufacturing at AlliedSignal, and soon afterwards the concept was adopted at GE.
In the early 2000s, the two concepts of Lean Manufacturing (reduction of waste) and Six Sigma (higher process quality through reduced variability) came together as a single discipline called Lean Six Sigma. The concept then found acceptance in other industries such as healthcare, finance, retail and supply chain.
Lean focuses on eight kinds of waste (muda is the Japanese word for waste) inherent in processes: defects, overproduction, waiting, non-utilised talent, transportation, inventory, motion and extra processing.
Six Sigma focuses on improving the quality of process outputs by identifying and removing the causes of defects and minimising variability in processes.
Lean Six Sigma aims to achieve a continuous flow of quality outcomes by exposing constraints between process steps and reducing variability between and within those steps, through a cycle of iterative improvements. Like Six Sigma, Lean Six Sigma uses the DMAIC (Define, Measure, Analyse, Improve and Control) phases.
Basic Concepts of Six Sigma
Six Sigma quality is a statistical term used to indicate how well a process is controlled in terms of its variability from the mean. It is in the fundamental nature of any process that, over time and scale, variation will creep in due to a variety of reasons or factors. The aim of Six Sigma is to keep the process running within acceptable limits around the mean (the arithmetic average of a process data set).
The word sigma (σ) denotes the standard deviation, that is, the spread of process data around the mean or central tendency. In simple terms, Six Sigma quality performance means no more than 3.4 defects per million opportunities. It is important to note that not all processes, products or systems need to function at the Six Sigma quality level. Other than for critical processes with high safety requirements, such as healthcare, pharmaceuticals, aviation and manufacturing, it is enough for most processes to function at 3 Sigma or 4 Sigma. The trade-off between Six Sigma and lower Sigma levels is simply cost, and it is often not practical or cost-effective to aim for a higher level of Sigma. The table below shows the number of defects per million opportunities (DPMO) at various Sigma levels; these figures include the 1.5 σ long-term shift conventionally applied in Six Sigma practice. The table makes it easy to see how efficient a process must be to operate at the Six Sigma level.
Sigma Level    DPMO
2 σ            308,537
3 σ             66,807
4 σ              6,210
5 σ                233
6 σ                3.4
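The DPMO figures in the table follow from the upper tail of the standard normal distribution once the conventional 1.5 σ long-term shift is applied. As a rough sketch of that conversion (the function name is illustrative, and only the standard library is used):

```python
import math

def sigma_to_dpmo(sigma_level, shift=1.5):
    """Defects per million opportunities for a given Sigma level.

    Takes the upper-tail probability of the standard normal
    distribution at (sigma_level - shift), where shift is the
    conventional 1.5-sigma long-term drift.
    """
    z = sigma_level - shift
    # Upper-tail probability of the standard normal: 0.5 * erfc(z / sqrt(2))
    tail = 0.5 * math.erfc(z / math.sqrt(2))
    return tail * 1_000_000

for level in (2, 3, 4, 5, 6):
    print(f"{level} sigma: {sigma_to_dpmo(level):,.1f} DPMO")
```

Running this reproduces the table above after rounding, e.g. roughly 66,807 DPMO at 3 σ and 3.4 DPMO at 6 σ.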
Lean Six Sigma Case Study
The objective of this case study is to illustrate how to apply Six Sigma thinking and concepts to organizational problems and processes.
Imagine a retail organization that uses disparate core systems, such as CRM (Customer Relationship Management), ERP (Enterprise Resource Planning), Analytics and Financial Accounting, to run its business. This organization has 100,000 unique Customer master records that are regularly referenced in sales, order management, delivery, invoicing and accounts receivable transactions. Each Customer master record has 10 attributes associated with it.
This structure implies that there are 1 million data elements (100,000 records × 10 attributes) associated with Customer master data.
These Customer master records are created, referenced and updated separately by different individuals in different departments and business units, depending on their role and function. For example, the Finance Department may manage elements of Customer master data relating to invoicing and accounts receivable, while the Sales team may manage elements relating to Customer orders. As is typical in many organizations, disparate systems and independent work functions cause the following issues with master data:
Duplicated master data across systems that are out of sync
Wasted effort in data maintenance
Errors that accumulate over time because data changes are made in multiple systems
Business risk arising from poor governance, etc.
DMAIC Approach to improve the process of data management
The DMAIC (Define, Measure, Analyse, Improve and Control) cycle can be applied to the above case problem as follows:
Define (The Problem)
Master Data, maintained separately in multiple business systems, has been observed to contain an unacceptable level of errors (defects) causing unnecessary manual intervention (extra processing) that is costing the organization money, business responsiveness and customer satisfaction.
Measure (The Process parameters and Sigma)
An important step in Six Sigma analysis is to measure the key operating parameters of the process under consideration, in order to understand its current Sigma level (standard deviation from the mean). The table below shows the impact of errors (defects per million elements) in terms of cost and time. An error can be broadly defined as any situation in which a Customer master data element requires a change for a non-business-driven reason. Sigma (standard deviation) can be determined by sampling a specific section or department, or the entire organization, at the level of individual data elements or of entire Customer master records.
It has been assumed that it takes an average of 10 minutes to fix an error at a cost of $50/hour. These assumptions can be validated in the Analysis stage by end users or by Six Sigma experts.
Let us assume that sampling indicates around 200,000 errors (20% of the total volume of Customer master data elements), i.e. 200,000 DPMO. From the table, we can see that the current process of managing Customer master data sits between 2 σ and 3 σ, implying that it is costing the organization at least $556,725 to remediate these process errors (the cost at the 3 σ defect rate of 66,807 DPMO). It is also useful to note that data volume may grow at 20% year on year; as master data volume grows, so does the absolute number of defects, and with it the remediation cost.
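The arithmetic of this Measure step can be sketched in a few lines. The inputs (record count, attribute count, error count, 10-minute fix time, $50/hour rate) are taken from the case study; the implied Sigma level uses the conventional 1.5 σ shift, and the lower-bound cost is computed at the 3 σ defect rate of 66,807 DPMO:

```python
from statistics import NormalDist

# Case-study inputs (from the text)
records = 100_000       # unique Customer master records
attributes = 10         # attributes per record
errors = 200_000        # errors suggested by sampling (20% of elements)
fix_minutes = 10        # average time to fix one error
hourly_rate = 50        # remediation cost per hour, in dollars

elements = records * attributes          # 1,000,000 data elements
dpmo = errors / elements * 1_000_000     # 200,000 DPMO

# Sigma level implied by the observed defect rate,
# with the conventional 1.5-sigma long-term shift added back
sigma_level = NormalDist().inv_cdf(1 - errors / elements) + 1.5

# Lower-bound remediation cost, taken at the 3-sigma defect
# rate (66,807 DPMO), the bottom of the 2-3 sigma range
cost_at_3_sigma = 66_807 * (fix_minutes / 60) * hourly_rate

print(f"elements:        {elements:,}")
print(f"DPMO:            {dpmo:,.0f}")
print(f"sigma level:     {sigma_level:.2f}")
print(f"cost at 3 sigma: ${cost_at_3_sigma:,.0f}")
```

The implied Sigma level comes out at about 2.34, between 2 σ and 3 σ as stated, and the 3 σ cost evaluates to $556,725, matching the figure above.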
Analyse (Why are there errors in the process?)
The errors in the process may be occurring due to one or a