Process quality management beats the rule of ten

The quality of the goods you produce depends on the quality of your production process. If your production process doesn’t meet the required quality standards, the product may be of inferior quality. Especially in areas where safety is a major concern, such as in the aviation and automotive industries, inferior quality can have serious consequences. For example, the frame of an airplane involves a very large number of joints, all of which require their own tightening process. Each of these tightening processes is logged and archived so that in the event of a liability claim, even years later, the company can prove that everything was done by the book.

Quality problems have to be detected as early as possible

Liability claims aren’t the only cost-intensive situations for manufacturing companies; quality problems can surface before the product has even left the factory. And the later a quality problem is detected in the value chain, the higher the company’s costs. The relationship between the two is called the “rule of 10”.

[Figure: the rule of 10, showing error costs increasing tenfold at each stage of the value chain]


At each step of the value chain, fixing a problem is approximately 10 times faster and cheaper than doing so in the next phase. What makes things worse is that problems in the production process are likely to go undetected until a quality issue is found in the produced good. In the meantime, the factory keeps churning out inferior-quality goods.
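The tenfold step can be made concrete with a small sketch. The stage names and the base cost unit below are hypothetical, chosen only to illustrate the arithmetic behind the rule of 10:

```python
# Illustration of the rule of 10: the cost of fixing a defect grows roughly
# tenfold with each stage of the value chain it passes through undetected.
# Stage names and the base cost unit are hypothetical.

BASE_COST = 1.0  # cost of a fix at the earliest stage, in arbitrary cost units
STAGES = ["design", "production", "final inspection", "at the customer"]

def fix_cost(stage_index: int, base_cost: float = BASE_COST) -> float:
    """Approximate cost of fixing a defect first detected at the given stage."""
    return base_cost * 10 ** stage_index

for i, stage in enumerate(STAGES):
    print(f"{stage}: {fix_cost(i):.0f} cost units")
```

A defect that would have cost one unit to fix in design thus costs on the order of a thousand units once it reaches the customer.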

Once a quality problem has been detected, the company engineers want to identify it as soon as possible. Since quality data is already being logged and documented for a number of production processes, the engineers can extract it from the database and analyze it. Subsequently, they can use their findings to define and implement problem-solving measures. Depending on the amount of data and how precisely the time slot when the problem first occurred can be pinpointed, this process can be quite time-consuming (see graphic below). Until the problem is solved, the company has three options: stop production, implement a work-around, or tolerate inferior-quality products.

[Figure: timeline from problem occurrence to resolution without runtime process quality monitoring]
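The analysis step described above amounts to scanning the archived process data for the first record that falls outside the tolerance band. A minimal sketch, assuming a hypothetical log of (timestamp, measured value) records and a made-up tolerance range:

```python
# Sketch: pinpointing when a quality problem first appeared in archived
# process data. The record format and the tolerance band are hypothetical.

from datetime import datetime

TOLERANCE = (9.5, 10.5)  # acceptable range for the logged measurement

def first_out_of_spec(records, tolerance=TOLERANCE):
    """Return the timestamp of the first record outside the tolerance band,
    or None if all records are within spec. Records must be time-ordered."""
    low, high = tolerance
    for timestamp, value in records:
        if not (low <= value <= high):
            return timestamp
    return None

log = [
    (datetime(2015, 7, 1, 8, 0), 10.1),
    (datetime(2015, 7, 1, 9, 0), 10.0),
    (datetime(2015, 7, 1, 10, 0), 11.2),  # first deviation
    (datetime(2015, 7, 1, 11, 0), 11.3),
]
print(first_out_of_spec(log))  # prints the timestamp of the first deviation
```

In practice the data volume and the precision of the suspected time slot determine how long this search takes, which is exactly why the process can be so time-consuming.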


The Industrial Internet has a huge effect on reaction times when it comes to process quality management

Ongoing virtualization allows companies to continuously acquire process quality data, translate it into production-relevant information, and make it available where needed in production. This applies to process quality management in that software makes it possible to monitor process quality at runtime. For a specific example, see our blog post on process quality management for tightening systems. Instead of waiting for a quality report from somewhere along the value chain before identifying and solving problems in production, process quality is checked every time the process is run. If a quality problem is detected (see next graphic), troubleshooting can be started instantly, thus significantly reducing the reaction time. As a result, inferior-quality products don’t proceed along the value chain and production of further inferior-quality products can be prevented.
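Checking process quality at runtime can be sketched as a hook that evaluates every completed process run immediately, here using a tightening operation as in the example above. The function names, torque values, and tolerance are hypothetical:

```python
# Sketch: evaluating process quality at runtime instead of waiting for a
# downstream quality report. Each completed tightening run is checked
# against its target torque. Names and limits are hypothetical.

def within_tolerance(measured_torque: float, target: float,
                     rel_tolerance: float = 0.05) -> bool:
    """Return True if the measured torque lies within the relative tolerance."""
    return abs(measured_torque - target) <= rel_tolerance * target

def on_process_finished(measured_torque: float, target: float) -> None:
    """Hook called after every process run; flags deviations instantly."""
    if not within_tolerance(measured_torque, target):
        # In a real system this would raise an alarm or stop the line,
        # so that no further inferior-quality products are produced.
        print(f"Quality problem: torque {measured_torque} Nm "
              f"outside tolerance around target {target} Nm")

on_process_finished(12.0, 10.0)  # out of tolerance, triggers the alarm path
```

The key point is the placement of the check: it runs once per process execution, so troubleshooting can start with the very first deviating part rather than after a batch has already moved down the value chain.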

[Figure: timeline from problem occurrence to resolution with runtime process quality monitoring]


Is process quality management worth the investment?

It is, if the produced goods are high in price or have short cycle times. In the former case, every single product that comes off the line creates significant costs if quality falls short of what the customer is paying for. In the latter case, it’s the large number of inferior-quality products that cause the costs. Process quality management provides a major advantage by detecting problems in the process instantly, thereby considerably shortening the time it takes from problem detection to resolution and the factory returning to producing its usual level of quality.

In other words, investing in process quality management pays off if the costs incurred by inferior-quality products are greater than the initial investment plus the costs of operating a process quality management solution. The graphic below shows the relationship between time and increasing error costs, with the stepped curve visualizing the costs of quality problems, incorporating the rule of 10. The first section depicts the costs of detecting the quality problem at the production location, with subsequent sections representing the costs for detection at later stages in the value chain. Thanks to the prompt quality problem detection provided by process quality management, no inferior-quality product should leave the plant (zero defects). Depending on how many field complaints a company has, process quality management might already be worth the investment.
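The break-even condition stated above can be written down directly. All figures in the example below are hypothetical, chosen only to show the comparison:

```python
# Sketch of the break-even reasoning: process quality management pays off
# when the error costs it avoids exceed the initial investment plus the
# operating costs of the solution. All figures are hypothetical.

def pays_off(avoided_error_costs: float, investment: float,
             operating_costs: float) -> bool:
    """True if avoided error costs exceed investment plus operating costs."""
    return avoided_error_costs > investment + operating_costs

# E.g. 500 inferior-quality products avoided per year at 300 EUR each,
# against a 100,000 EUR investment and 20,000 EUR/year in operating costs:
print(pays_off(500 * 300, 100_000, 20_000))  # 150,000 > 120,000
```

With the rule of 10, the avoided error costs grow rapidly for every stage at which the problem would otherwise have gone undetected, which is why the comparison often tips in favor of the investment.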

[Figure: error costs over time, with a stepped curve reflecting the rule of 10 across the stages of the value chain]


The product of the future is transparent

Even if your production is already fully optimized, production and products will have to become increasingly transparent if companies are to maintain or even gain market share. With the increasing virtualization of production landscapes, companies can collect data at any point in time and at any location. This means they can capture all product-relevant information throughout the lifecycle and make it available to all stakeholders in the value chain. As a result, a manufacturer may require its suppliers to provide quality data on each of their products to ensure that only high-quality goods are produced. But even before information is shared across different locations in the value chain, process quality management is essential to the continuous improvement of your production operations.

Software for process quality management

About the author

Verena Majuntke

Verena Majuntke is a Senior Solution Architect who joined Bosch Software Innovations in 2012. In her job, she is the first technical point of contact for customers in Industry & Logistics, understanding their requirements, developing solutions, and contributing to the development of the factory of the future. What Verena likes most about her job is “that one can watch how the Internet and manufacturing meet and how software becomes palpable in the context of Industry 4.0 solutions”. Verena holds a diploma (RWTH Aachen) and a PhD (Mannheim University) in Computer Science. Her research work focused on intelligent and highly adaptable environments, e.g., smart factories.