Saturday, February 25, 2006

The Power of the PFMEA

My department supports over one hundred suppliers of direct material. Of the total, seven account for approximately seventy percent of our problems. Our focus is clear: improve these suppliers and our problems will vanish.

The strategy for improving a supplier is complex. It is a mixture of technical and commercial issues and must come from a team representing both interests. One valuable tool is the audit. Before you can start "fixing" any organization, you must know what to fix. This is easy for product: pick a sample of parts, measure each one, and take corrective action if any part violates the specification.

In this case, we aren't talking about product but systems and processes. Product is tangible. Processes are somewhat intangible. Conducting a process audit gives you the objective evidence to make the improvement project tangible. Just as you measure the product, measure the process.

After conducting process audits at three of the seven suppliers, one idea was confirmed for me: the Process Failure Mode and Effects Analysis (PFMEA) is a powerful but poorly used tool.

For those not familiar with the tool, it is used to manage change and to mitigate risk. It entails brainstorming what can go wrong in a process. Once you think through the possible failures, you consider how each one could impact the customer. Next, you identify the causes of the failures and the likelihood of each cause occurring. Finally, you consider the probability of the failure being detected within your plant. With this overview, you have an assessment of risk and a plan for improvement. For example, if you list a failure that can negatively impact your customer, and your data says the cause has a high likelihood of occurrence and a poor probability of detection, you had better do something now to protect your customer.
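
For the arithmetic behind that judgment, a common convention (not spelled out above) is to score severity, occurrence, and detection on 1-to-10 scales and multiply them into a Risk Priority Number, then work the highest scores first. The sketch below assumes that convention; the failure modes, ratings, and action threshold are invented illustrations, not data from any of these audits.

    # Minimal sketch of the common severity x occurrence x detection scoring.
    # All ratings and the threshold here are illustrative assumptions.
    from dataclasses import dataclass

    @dataclass
    class FailureMode:
        description: str
        severity: int    # 1 (no customer effect) .. 10 (severe customer impact)
        occurrence: int  # 1 (cause is rare) .. 10 (cause is almost certain)
        detection: int   # 1 (almost certainly caught in-plant) .. 10 (will escape)

        def rpn(self) -> int:
            # Risk Priority Number: the usual multiplicative convention.
            return self.severity * self.occurrence * self.detection

    modes = [
        FailureMode("undersized bore", severity=8, occurrence=6, detection=7),
        FailureMode("missing label", severity=3, occurrence=2, detection=2),
    ]

    # Work the highest-risk items first; the threshold is a team choice.
    ACTION_THRESHOLD = 100
    for m in sorted(modes, key=lambda m: m.rpn(), reverse=True):
        flag = "act now" if m.rpn() >= ACTION_THRESHOLD else "monitor"
        print(f"{m.description:20s} RPN={m.rpn():3d}  {flag}")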

Back to the suppliers: the common thread you see in poor-performing suppliers is misuse of the PFMEA. My worst-performing supplier has a defective rate of approximately 2500 PPM. We write, on average, three nonconforming reports per week for his product. In our recent process audit, analysis of his PFMEA showed that the majority of his occurrence ratings were two (a low likelihood of occurrence) and his detection ratings congregated at three (a high probability of detection). His PFMEA says, "We know we have problems, but they seldom occur. When they do occur, we will catch them." The objective record of his quality performance negates his PFMEA.
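
To make that mismatch concrete, one option is to convert the nonconformance data into a defect rate and set it next to the rate the claimed occurrence rating implies. In the sketch below, the rating-to-rate table and the part counts are placeholders (published occurrence tables differ between FMEA editions); only the roughly 2500 PPM figure comes from the supplier data above.

    # Sketch: does the observed defect rate support the claimed occurrence rating?
    # The rating-to-PPM mapping is a stand-in assumption; substitute the occurrence
    # table your own team uses. Part counts are invented for the example.
    ASSUMED_OCCURRENCE_TABLE_PPM = {
        1: 1,      # remote
        2: 10,     # very low  <- the rating this supplier claimed
        3: 100,
        4: 500,
        5: 2_000,
        6: 10_000,
    }

    def observed_ppm(defective_parts: int, total_parts: int) -> float:
        """Defect rate expressed as parts per million."""
        return 1_000_000 * defective_parts / total_parts

    actual = observed_ppm(defective_parts=250, total_parts=100_000)  # ~2500 PPM
    claimed = ASSUMED_OCCURRENCE_TABLE_PPM[2]

    print(f"observed: {actual:,.0f} PPM; rating 2 implies roughly {claimed} PPM")
    if actual > claimed:
        print("The data contradicts the PFMEA: re-rate occurrence and rework the controls.")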

Now more than ever, I am convinced that the PFMEA is one of THE tools for improving processes and quality systems. I see it as a funnel. If you want to change a process, pour the change into the PFMEA process. The output of the funnel will be a plan for managing the change. Companies constantly audit their own products and processes (if not, they should!). All the data and information from these self-assessments must be poured into the funnel. The output will be signals for continuous improvement.

2 comments:

Anonymous said...

Stephen,

I want to make a general comment on the field of quality engineering. Naturally, I am speaking from the point of view of my career for the past thirty years, scientific research. What impresses me about the tools of quality engineering is that they combine a keystone of science with effective communication. Modern science practices the analytic or reductionist approach. Complicated situations are broken down into (hopefully) simpler pieces. These pieces are broken down again. Despite the popular conception of scientific minds being able to handle immensely complicated ideas, most science practitioners such as myself need to make things as simple as possible. The PFMEA is thus a good approximation of what my colleagues and I do every day. The difference between research and quality engineering is that we don’t assign numbers to severity, occurrence and detection rates. Perhaps we should.

This difference in assignment of numbers highlights what I feel is a failing of many in the scientific community. We use fancy words to describe all this and thus often do a poor job of communication. As the press is correct to point out, most people outside our limited fields of study have a really tough time knowing what we are doing. Communication is a process like so many others. It is a problem when the customer, the public, does not receive usable information about the product.

Thus I have to be very impressed with quality engineering in its efforts to communicate results in a way that most everyone can understand. We all know how to rate things from one to ten. I think, for my field of image analysis, using phrases like dynamic causal modeling and blood oxygen level dependent contrast (these are actual examples; I am not making them up!) might qualify as defects in the process called communication. They might rate about a seven or eight in severity and a seven or eight in occurrence, but (unfortunately) they go unnoticed by the scientists who produced them, so their detection rating is up at a nine or ten!

Overall, it seems like scientific research could learn a lot from quality engineering in terms of communication of what we have accomplished. The simple, intuitive methods of quality engineering are hard to beat.

Stew Denslow

Stephen said...

What you seem to be saying, in part, is that quality engineering is about communicating with data.

This is very true. Subjective evidence never leads to sound decisions. Only decisions based on objective evidence keep all parties free from needless debates and arguments.