New insight: Quality application sets up visual cues for process flow integrity
When Power-One agreed to be a beta tester for a new quality management application, the results were surprising.
“When we saw our own data in the beta product, we realized we were getting insight we'd never [seen] before,” recalls Chris Rodriguez, strategic planning manager for global test engineering at the Camarillo, Calif.-based manufacturer of power conversion devices.
This new application—Process Flow Visualization—was developed by SigmaQuest, which supplies a range of quality management solutions in the on-demand deployment model. SigmaQuest CEO Nader Fathi says Process Flow Visualization gives manufacturers new insight into production operations because it collects detailed data on each part as it passes through the production process, and then converts that data into easy-to-read charts and graphs that point out the underlying causes of any product quality issues.
In essence, says Fathi, “You can start a batch of a thousand parts off on Monday, and see, one by one, what stage they had reached by Friday, and the route that they had traveled to get there—including test failures, rework, and at which point in the process those reworked parts rejoined the flow.”
It's then possible to drill down into the underlying data and determine how and why apparent anomalies occurred—along with test transaction data highlighting why tests failed, how many times tests were repeated, yield indicators at each stage in the process, and whether any parts skipped testing altogether.
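The article doesn't describe SigmaQuest's internal implementation, but the core idea—deriving per-station yield and retest indicators from a time-ordered log of per-unit test events—can be sketched in a few lines. The event fields and station names below are illustrative assumptions, not SigmaQuest's actual schema.

```python
from collections import defaultdict

# Hypothetical test-event log: (serial_number, station, result) tuples in
# time order. Names are illustrative, not SigmaQuest's actual data model.
events = [
    ("SN001", "ICT", "PASS"), ("SN001", "FCT", "FAIL"),
    ("SN001", "FCT", "PASS"),                      # retest after rework
    ("SN002", "ICT", "PASS"), ("SN002", "FCT", "PASS"),
    ("SN003", "ICT", "FAIL"), ("SN003", "ICT", "PASS"),
    ("SN003", "FCT", "PASS"),
]

def stage_metrics(events):
    """First-pass yield and retest counts per station."""
    first = {}                        # (sn, station) -> first result seen
    attempts = defaultdict(int)       # (sn, station) -> total test attempts
    for sn, station, result in events:
        attempts[(sn, station)] += 1
        first.setdefault((sn, station), result)
    metrics = defaultdict(lambda: {"units": 0, "first_pass": 0, "retests": 0})
    for (sn, station), result in first.items():
        m = metrics[station]
        m["units"] += 1
        m["first_pass"] += result == "PASS"
        m["retests"] += attempts[(sn, station)] - 1
    # fpy = first-pass yield: units passing on their first attempt
    return {s: {**m, "fpy": m["first_pass"] / m["units"]}
            for s, m in metrics.items()}

print(stage_metrics(events))
```

A unit that appears in the log with no record at a given station would show up as having skipped testing there—the same gap the visualization is described as exposing.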
This is the type of information that came as a revelation to Power-One, which had been using SigmaQuest quality management applications since 2006.
“We saw product flows that we didn't know existed—product being moved from a test station on Line B, for instance, to see if it would pass there, after already failing at the equivalent test station on Line A,” Rodriguez says. “You can't see that kind of thing so readily in static data. You have to see the actual product flows to see what's been going on.”
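The "shopping for a pass" pattern Rodriguez describes—a unit failing a test on Line A, then being run through the equivalent station on Line B—is exactly the kind of flow that falls out of per-unit event data. A minimal sketch, with entirely hypothetical event fields:

```python
# Flag units that failed a test on one line, then passed the equivalent
# station on a different line. Fields (serial, line, test, result) are
# illustrative assumptions, not SigmaQuest's schema.
events = [
    ("SN100", "A", "FCT", "FAIL"),
    ("SN100", "B", "FCT", "PASS"),   # retried on Line B after failing on A
    ("SN101", "A", "FCT", "PASS"),
]

def cross_line_retests(events):
    failures = {}                    # (sn, test) -> line of last failure
    flagged = []
    for sn, line, test, result in events:
        key = (sn, test)
        if result == "FAIL":
            failures[key] = line
        elif key in failures and failures[key] != line:
            flagged.append((sn, test, failures[key], line))
    return flagged

print(cross_line_retests(events))   # SN100 failed FCT on A, passed on B
```

Run against static summary reports, the same units would simply count as eventual passes; it is only the per-unit route through the stations that makes the detour visible.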
At a stroke, he relates, it became possible for management to understand why overtime levels were persistently high on some lines, and why certain parts of the plant floor struggled to meet production goals despite careful capacity planning and theoretically ample takt times.
“Some test stations were more popular than others because individual operators had their own favorites, and the line balancing and capacity planning calculations didn't take that into account,” Rodriguez explains.
Rework and retesting also seemed to take place on certain stations in preference to others. The result: Even as some line stations sat idle, backlogs were building up at others—necessitating overtime work.
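The imbalance Power-One found—favored stations overloaded while equivalent ones sat idle—can be surfaced from the same usage data. The sketch below is an assumption about how such a check might look; the station names and counts are invented.

```python
from statistics import mean

# Hypothetical test counts across equivalent stations over one period.
usage = {"FCT-1": 412, "FCT-2": 398, "FCT-3": 95, "FCT-4": 710}

def imbalance(usage, tolerance=0.5):
    """Flag stations whose load deviates from the mean across equivalent
    stations by more than `tolerance` (as a fraction of the mean).
    Returns each flagged station's load as a multiple of the mean."""
    avg = mean(usage.values())
    return {s: round(n / avg, 2) for s, n in usage.items()
            if abs(n - avg) / avg > tolerance}

print(imbalance(usage))   # FCT-3 starved, FCT-4 overloaded
```

With an even split assumed in the capacity plan, a station running at a fraction of the mean load while a sibling runs well above it is the signature of operator preference rather than a capacity shortfall.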
That has since changed. Aware of the problem, engineers were able to drill down to the root causes of operators' station choices, calibrating and fine-tuning the stations to eliminate operator preference as a scheduling factor.
“Overtime has definitely come down, and we're meeting our numbers more consistently,” reports Rodriguez.
Just as important, the company knows its products are being built and tested as intended. “Being able to see product flows as they occur provides a real assurance of process-flow integrity,” Rodriguez concludes.