How FireVisor automates the process of engineering data collection, cleaning, and analysis
The Singapore-based AI platform can be used across industries for all types of visual defect detection and analysis. Its products can also be used by organisations that buy parts from manufacturers and need to verify incoming quality.
Monday, February 21, 2022
Cost of poor quality (COPQ) is the cost to companies that would disappear if systems, processes, and products were perfect. The term was coined by IBM quality expert H. James Harrington in 1987.
To give you an idea of the global cost of poor quality, let me tell you a story. A manufacturing company had annual sales of $250 million. Its quality department calculated the COPQ, which amounted to 20 percent of annual sales, or $50 million a year. This means that during one day of each five-day workweek, the entire company's time and effort went into making scrap, a loss of roughly $200,000 per working day.
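The arithmetic behind that story can be checked in a few lines. The sales figure and COPQ share come from the example above; the 250-working-day year (five-day weeks, roughly 50 weeks) is an assumption:

```python
# COPQ arithmetic from the story above. The 250-working-day year
# is an assumption; sales and COPQ share come from the example.
annual_sales = 250_000_000  # USD
copq_share = 0.20           # COPQ as 20 percent of annual sales
working_days = 250          # assumed: 5-day weeks, ~50 weeks/year

annual_copq = annual_sales * copq_share
daily_copq = annual_copq / working_days

print(f"Annual COPQ: ${annual_copq:,.0f}")  # $50,000,000
print(f"Daily COPQ:  ${daily_copq:,.0f}")   # $200,000
```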
With the high volume of products manufactured and data produced every day, this COPQ and associated scrap generated is only increasing.
Today, an army of skilled individuals is needed to make sure no fault escapes the production line, while process control engineers constantly tweak and monitor machines. The machines, meanwhile, produce an abundance of valuable data, but most of it goes unused and is simply discarded.
I worked in manufacturing before and saw first-hand how difficult it was to handle the enormous amount of data produced every day, and how useful information was getting lost in all of that data. That’s how I saw the need to look beyond existing automation.
One of the reasons is that factories today extensively use the ‘if-else’ type of automation. This creates many islands of intelligent machinery inside the manufacturing line, but none of them talk to each other. So if something goes wrong with machine X, which performs step 1 of the manufacturing process, machine Y at step 3 knows nothing about it and continues processing defective material. This is where cognitive automation comes in: a central platform that connects to the individual machines in a factory and learns from these different data points to make better, well-informed decisions.
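The machine X / machine Y scenario can be sketched with a simple in-process event bus, where downstream steps subscribe to upstream defect reports. This is an illustrative sketch of the central-platform idea, not FireVisor's actual architecture; all names here are hypothetical:

```python
from collections import defaultdict

# Hypothetical sketch: a central event bus connecting process steps,
# so a defect at step 1 is visible to machines further down the line.
class EventBus:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, payload):
        for handler in self._subscribers[topic]:
            handler(payload)

bus = EventBus()
halted_lots = []

# Machine Y (step 3) listens for defects reported anywhere upstream.
bus.subscribe("defect", lambda evt: halted_lots.append(evt["lot_id"]))

# Machine X (step 1) reports a defect; step 3 now knows to hold that lot
# instead of continuing to process defective material.
bus.publish("defect", {"machine": "X", "step": 1, "lot_id": "LOT-42"})
```

With ‘if-else’ islands, the defect report never leaves machine X; with a shared bus, every subscribed step sees it the moment it is published.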
Solving the problem in steps
While cognitive automation is the answer, getting there might not be straightforward. Modern factories are an intricate mesh of machines, legacy systems, processes, and humans. This multi-layered environment needs a multi-layered approach.
Identifying defects with a dependable, repeatable system that is also easy to integrate and use is the first step. The next is building an analytics platform that connects to data sources in the manufacturing line, and automatically performs engineering failure analysis in real-time.
FireVisor’s Defect Analytics does just that. We bring the power of data science to the manufacturing floor, and into the hands of process engineers. You can slice and dice data with a few clicks, getting to the root cause much faster.
Our secret weapon is our ability to deal with data. Our machine learning models work so well because we are able to clean the data, fill in missing information, and then bring it all together in one platform.
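A minimal sketch of that clean-impute-merge pipeline, using made-up field names (the records, schema, and median-imputation choice are all assumptions for illustration, not FireVisor's actual method):

```python
from statistics import median

# Illustrative machine log with a missing sensor reading, plus
# inspection results keyed by part ID. Schema is hypothetical.
machine_log = [
    {"part_id": "A1", "temp_c": 81.0},
    {"part_id": "A2", "temp_c": None},   # missing reading
    {"part_id": "A3", "temp_c": 79.5},
]
inspections = {"A1": "pass", "A2": "fail", "A3": "pass"}

# 1. Fill in missing values with the median of observed readings.
observed = [r["temp_c"] for r in machine_log if r["temp_c"] is not None]
fill = median(observed)
for r in machine_log:
    if r["temp_c"] is None:
        r["temp_c"] = fill

# 2. Bring both sources together into one record per part.
merged = [
    {**r, "result": inspections.get(r["part_id"], "unknown")}
    for r in machine_log
]
```

The point of the merge step is that process data and quality outcomes end up in one place, which is what makes downstream trend analysis possible at all.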
To give you an example, one of our customers is now able to fix machine issues every hour because their engineering team looks at defect trends in real-time. Every part goes through multiple engineering processes. In this customer’s case, they can instantly see the defect trends across machine, operator, and complex image data right after an engineering process is completed. Based on these trends, the engineers in charge of the previous process quickly isolate the root cause. This saves them 37 percent of their time, which they would otherwise spend manually sifting through terabytes of data.
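The kind of trend view described above can be sketched as a simple tally of defects per upstream machine, pointing engineers at the likeliest root cause. The event records here are hypothetical, and real systems would of course work on far larger streams:

```python
from collections import Counter

# Hypothetical defect events emitted after a process step completes.
defect_events = [
    {"machine": "press-2", "defect": "scratch"},
    {"machine": "press-2", "defect": "scratch"},
    {"machine": "press-1", "defect": "dent"},
    {"machine": "press-2", "defect": "burr"},
]

# Tally defects per machine; the top entry is the first suspect
# for the engineers in charge of that process.
by_machine = Counter(evt["machine"] for evt in defect_events)
suspect, count = by_machine.most_common(1)[0]
print(f"Highest defect count: {suspect} ({count} defects)")
```

Even this toy tally shows why real-time grouping beats manual log review: the suspect machine surfaces immediately instead of after hours of sifting.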
Applying AI to manufacturing
I have spent most of my working life implementing manufacturing line automation, improving product yield and quality.
When I met my co-founder, we originally started off automating and improving visual checks, which are the most widely used method for detecting faults and defects. We wanted to do that by improving the machine vision systems used in lines. But after speaking to over 40 industry experts, engineers, and line personnel, we realised that by improving defect detection, we were not fixing the cause. That’s when we knew the answer lay in data.
Our defect detection system can also be used across industries for all types of visual defect detection and analysis. Although our products are tailor-made for the manufacturing line, they can also be used by research labs and institutes that engage in new technology transfer to factories.
We also offer our products on the cloud, which can be utilised by companies buying parts from manufacturing companies to judge incoming quality. This way, we enable quality control for the entire supply chain.
The way forward
With two mature products, our next challenge is implementation at scale. We aim to do that without complex integration infrastructure or the need for heavy consulting budgets. This way, access to manufacturing intelligence is democratised for large and small manufacturers alike.
Cohort 9 of NetApp Excellerator
The NetApp Excellerator program has proved to be an excellent partner for us at this stage of growth. The program helped us better understand product management and enabled us to optimise our cloud costs. They also helped connect us to key customers and growth opportunities. We are honoured and privileged to graduate from one of the best accelerators in the country, alongside marquee companies.
To participate in NetApp Excellerator’s Cohort 10, click here