Closing the Loop – Part 2 of 2

There is a school of thought in manufacturing that quality inspection is not necessary to build a high-quality vehicle. When major programs are launched, the focus is on building the part, and inspection systems are often last in line for installation and commissioning. Additionally, when budgets are trimmed, inspection dollars are the first to be cut or re-allocated. When this happens, the plant engineers pay the price: they rely heavily on inspection data while refining the process and tooling during launch, and later when they scramble to troubleshoot line stoppages. One way to ensure that everyone is served by your manufacturing strategy is to combine the build and the inspection into the same operation. As we covered in our previous blog, Data Rich and Information Rich, Artificial Intelligence (AI) plays an important role in augmenting the automation of data analysis. The same concept applies to building parts: using AI and machine learning makes adaptive robot guidance even more powerful for modern manufacturers.

The Rise of Robot Guidance Solutions (RGS)

In the late 1980s, Perceptron released the world’s first robot guidance system. The system used laser-based machine vision to measure the windshield opening and then guided an industrial robot to center the windshield in the opening. When this system was installed, a whole new industrial capability was born. Today, from simple pick-and-place operations to complicated best-fit panel loading, robot guidance is a mainstay of industrial manufacturing. As manufacturers strive to automate more operations to improve productivity, they can deploy robot guidance systems to maximize speed and ensure high quality in assembly operations. RGS has become a truly versatile tool across multiple industries.

Closed Loop Manufacturing

Robot guidance and quality work in concert throughout the modern manufacturing facility. One way these two technologies work in harmony is in closed loop manufacturing. One example of closed loop processing is what Perceptron termed “deck and check”: applying their technology to load automobile roofs and then verify the dimensions of the roof ditches before the vehicle leaves the build operation. This breakthrough created an in-station process control (ISPC) strategy for loading roofs with higher dimensional quality and immediately verifying the quality of the part before releasing it from the welding station. The benefits of ISPC are significant. ISPC reduces the production line space and cost associated with installing a separate inspection station and ensures the point of discovery for a quality issue is early in the process, before significant value has been added to the vehicle.

Adaptive Feedback and Control

One of the holy grails of machine vision for robot guidance has been true adaptive feedback control (AFC). With AFC, process and quality inputs are monitored and adjusted in real-time, creating a manufacturing process that responds to all the inputs to produce an assembly that is truly custom fit to the individual parts and process inputs. Add machine learning to this process and you could be on the doorstep of “lights out” manufacturing, with an adaptive process that learns as it builds. Utilizing networked data and analytical horsepower creates feed-forward automation and a self-teaching manufacturing process. Harnessing this power could lead to a process that does much of the “heavy lifting” for us, while ensuring the highest quality parts at the required line rates.
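
To make the idea concrete, here is a minimal sketch of an adaptive feedback loop, assuming a hypothetical placement process with a slow drift and a simple proportional correction. The gain, drift model, and noise values are illustrative only and do not describe any specific Perceptron product.

```python
import random

# Simulated closed-loop placement process: the process drifts slowly, a vision
# measurement (with noise) reports the residual error, and the controller
# applies a damped correction each cycle. Gain and drift values are illustrative.

GAIN = 0.5        # fraction of the measured error corrected each cycle
drift_mm = 0.0    # hypothetical slow process drift (e.g. thermal)
offset_mm = 0.0   # correction currently applied by the robot

for cycle in range(20):
    drift_mm += 0.05                               # process drifts a little each cycle
    noise = random.gauss(0.0, 0.02)                # measurement noise, mm
    measured_error = drift_mm - offset_mm + noise  # what the guidance sensor reports
    offset_mm += GAIN * measured_error             # feedback step: damped correction
    print(f"cycle {cycle:2d}: error {measured_error:+.3f} mm, offset {offset_mm:+.3f} mm")
```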

Data Rich and Information Rich – Part 1 of 2

One of the long-standing challenges of managing a 100% data sample is being data rich but information poor. In today’s modern manufacturing facilities, the fast cycle times and large data sets can often paralyze the people responsible for managing the overall quality of the parts produced. This data overload is compounded as companies lose skilled workers to retirement or attrition. The talent and experience gaps create a very reactive culture in many facilities, where personnel can only respond to critical quality alarms. Plant floor personnel have less and less time to be proactive, and even less time to perform the data analysis required to search for trends in common cause and special cause variation. Advances in machine learning and artificial intelligence can aid modern manufacturers by automating the analysis for users and eliminating the reactive approach to quality spills.

Automatic plant floor analytics

It is now a reality for companies to leverage breakthroughs in data analytics to identify root cause faster. Tools like Argus from Perceptron act as “engineers in a box,” crunching countless calculations behind the scenes while production runs uninterrupted. Now the large data sets get processed in real-time, with the software searching for correlations, upstream and downstream contributors to process variation, and special or common causes, providing quality professionals with suggested answers to the problem instead of merely pointing out that there is an issue.
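
As a rough illustration of the kind of correlation screen described above (not the actual Argus implementation), the sketch below ranks hypothetical upstream checkpoints by their correlation with a failing downstream point. The column names and the threshold are made up for the example.

```python
import pandas as pd

# Rank upstream checkpoints by absolute correlation with a failing downstream
# checkpoint. Column names encode a hypothetical station and feature.

def rank_upstream_contributors(df: pd.DataFrame, failing_point: str,
                               threshold: float = 0.7) -> pd.Series:
    """Return upstream checkpoints whose correlation with the failing point is strong."""
    corr = df.corr()[failing_point].drop(failing_point)
    strong = corr[corr.abs() >= threshold]
    return strong.sort_values(key=lambda s: s.abs(), ascending=False)

# Example usage with made-up per-cycle measurements (mm deviation from nominal):
data = pd.DataFrame({
    "st2_pin_y":    [0.10, 0.15, 0.22, 0.30, 0.41, 0.50],
    "st3_flange_z": [0.02, 0.01, 0.03, 0.02, 0.01, 0.02],
    "st5_roof_gap": [0.12, 0.18, 0.25, 0.33, 0.44, 0.55],
})
print(rank_upstream_contributors(data, "st5_roof_gap"))  # st2_pin_y flagged as a contributor
```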

Data rich environments are perfect for machine learning

The amount of data generated during every production cycle in a manufacturing plant can be staggering. The data is often pristine, with highly accurate systems measuring to within 100 microns every second, every cycle. Incorporating a machine learning layer is as simple as purchasing a software package and database that can handle all the data inputs in real-time. From there, the machine learning models take over, looking for trends based on existing measurement data and CAD information and notifying operators of what is happening to their process and product. This is in stark contrast to the traditional methodology, where plant personnel are alerted that there is a problem, but not what the problem is or how to potentially solve it.
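
A minimal sketch of this kind of trend watching is shown below, assuming a hypothetical stream of measurements for a single checkpoint compared against its CAD nominal. The nominal, tolerance, window size, and thresholds are illustrative.

```python
import numpy as np

# Watch one checkpoint's recent measurements and warn when the mean deviation
# from the CAD nominal trends toward the tolerance band. Values are illustrative.

NOMINAL_MM = 25.00
TOLERANCE_MM = 0.50
WINDOW = 5

def check_trend(measurements_mm: list[float]) -> str:
    """Flag the checkpoint if the rolling mean deviation approaches or exceeds tolerance."""
    recent = np.array(measurements_mm[-WINDOW:])
    mean_dev = float(np.mean(recent - NOMINAL_MM))
    if abs(mean_dev) > TOLERANCE_MM:
        return f"ALARM: mean deviation {mean_dev:+.2f} mm exceeds tolerance"
    if abs(mean_dev) > 0.75 * TOLERANCE_MM:
        return f"WARNING: mean deviation {mean_dev:+.2f} mm is trending toward tolerance"
    return "OK"

print(check_trend([25.30, 25.35, 25.40, 25.42, 25.45]))  # -> WARNING
```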

Use machine learning and AI to augment not replace

At the end of the value chain, it is still people who perform most of the corrective actions based on the information provided by these new proactive systems. Think of the efficiencies to be gained in facilities if workers can skip the hours or, in some cases, days of data analysis required to solve problems. Now the system does the analysis and provides guidance on where to go first: Station 5, locating pin. Valuable human resource time gets re-allocated to problem solving, and you become data rich and information rich, which is a major win-win for manufacturing.

Taking information intelligence to the next level – Part 2

Our next blog will discuss how you can use information intelligence to automatically solve problems through proactive ‘machine learning’ and part routing.

Want to know more about how Perceptron uses machine learning? Contact us at info@perceptron.com.

Machine Learning and Industry 4.0 for Proactive Process Control

In our last blog we discussed automation as it relates to metrology. That post focused on the data collection side of the quality equation. Putting sensors on robots to monitor in real-time was just one area where automation could be deployed to benefit the manufacturing plant. Now, with the full arrival of Industry 4.0 through machine learning, automation no longer has to be reserved for data collection; it can also be used for data analysis. Engineers in modern manufacturing facilities can rely on advanced analytical tools to send them the answer, eliminating the time it takes to solve the process variation puzzle manually.

The large amounts of data available in modern, connected factories help engineers keep their processes stable and in control, but there is a time cost associated with managing that data. The time it takes to figure out what database to pull data from, time to pull reports, and the time to analyze and compare everything adds up quickly. The time spent only grows if the problem gets more involved. This time is critical, especially if a process issue or production stoppage is the reason the engineer started the data mining and analysis work to begin with.

Every minute of data mining and analysis could be one minute of lost production, which equals lost profit. Some companies even track a metric called Non-Value-Added Activity (NVAA), the amount of time dedicated to work that does not produce parts. An argument can be made that all the time spent mining and analyzing process data can be put directly in the NVAA cost bucket. However, without data, how can an engineer know where to start to fix a production issue?
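
As a back-of-the-envelope illustration of that argument, the snippet below prices the manual analysis time using made-up figures for analysis minutes per issue, issues per week, and the value of a production minute.

```python
# Rough NVAA cost estimate. All figures are invented for the illustration.

ANALYSIS_MINUTES_PER_ISSUE = 120     # manual data pulls, charting, comparison
ISSUES_PER_WEEK = 3
VALUE_PER_PRODUCTION_MINUTE = 400    # dollars of lost production per minute, illustrative

weekly_nvaa_minutes = ANALYSIS_MINUTES_PER_ISSUE * ISSUES_PER_WEEK
weekly_cost = weekly_nvaa_minutes * VALUE_PER_PRODUCTION_MINUTE
print(f"NVAA: {weekly_nvaa_minutes} min/week, roughly ${weekly_cost:,} in lost production")
# -> NVAA: 360 min/week, roughly $144,000 in lost production
```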

The solution is to leverage machine learning to complete the analysis in real-time and provide answers, not just data, to the engineer. Utilizing tools that aggregate information, provide visibility without excessive keystrokes or mouse clicks, and deliver the answer instead of just a report will shorten time to root cause, reduce NVAA, and ultimately reduce loss.

Machine learning and plant floor analytics

As the connected factory grows and joins the internet of things (IoT), it has become possible to apply technology to eliminate the time associated with traditional data acquisition and analysis. Machine learning can be used to create a comprehensive set of rules that automates the analysis, creates the charts, and sends the information directly to the correct person to fix the issue. Think of it as a dimensional engineer in a box.
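
A minimal sketch of this kind of rule-driven routing is shown below. The station names, root causes, and addresses are hypothetical, and a production system would derive the findings from the analytics layer rather than hard-code them.

```python
# Route an automated finding to the person responsible for that station and
# failure mode. Rule table and addresses are hypothetical.

ROUTING_RULES = {
    ("station_5", "locating_pin"): "dimensional.engineer@example.com",
    ("station_5", "weld_distortion"): "weld.engineer@example.com",
    ("framer", "clamp_sequence"): "tooling.lead@example.com",
}

def route_finding(station: str, root_cause: str) -> str:
    """Return the address that should receive this finding (fallback to the quality lead)."""
    return ROUTING_RULES.get((station, root_cause), "quality.lead@example.com")

finding = {"station": "station_5", "root_cause": "locating_pin",
           "message": "Pin wear suspected; cross-car variation up 0.4 mm over 3 shifts"}
recipient = route_finding(finding["station"], finding["root_cause"])
print(f"send to {recipient}: {finding['message']}")
```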

The technologies that enable continuous real-time analytics powered by machine learning are more affordable and available to the factory floor than ever before. Lower-priced data storage, increased connectivity of machines, and high-speed computing are catalysts for this change in manufacturing. Now, instead of pushing alerts and charts, the software can push answers through email clients, text messages, and even smartphone mobile apps, enabling engineers to spend their time fixing issues instead of analyzing data.

Providing an intelligent IT automation platform that analyzes the collected process data and sends answers creates efficiency on the plant floor: effort can be transitioned from data collection and analysis to fixing issues at a controlled and manageable pace, instead of reacting to production losses in a frenetic and unpredictable fashion. For example, Perceptron’s software product, Argus, provides engineers with 3-D visuals of product build issues and answers, not just charts. The software combines traditional, time-tested tools like statistical process control (SPC) and robust real-time metrology data with high-speed computing, powerful algorithms, and artificial intelligence to enable a truly proactive approach to process management.

Challenges of proactive process control

Enabling proactive control does have its challenges. One of the first things that needs to be addressed is getting the plant personnel to trust the answers coming from the system without having to rely on their own ability to interpret the data. This new paradigm can make engineers and front-line personnel feel like they are not providing as much value without using their know-how to analyze the data. However, these skilled workers can now pivot from data collectors and analyzers to process improvement specialists. These newly empowered and informed personnel can focus on taking the answers to the plant floor to make immediate improvements to their tooling that can be immediately validated with their in-line gauges and automated analytics. The potential for continuous process improvement and world-class quality is limitless.

Next steps

The ability to harness machine learning for proactive process control is already here. Once the right technology is identified and deployed, all it takes is a change in a plant’s culture to fully leverage the efficiencies created by machine learning and automated process control.

Are You Ready to Automate?

Automation considerations

If you have spent any time in a major automotive plant, you understand that automation is a critical part of the infrastructure for building vehicles. Robots, conveyors, and other automation equipment are accepted at a level unseen in many industries. This automation helps auto manufacturers be more efficient, building high-quality parts at rates that can boggle the mind. Today, in some areas of the plant, there are hardly any people as robots and automated guided vehicles (AGVs) transfer parts from station to station. Automation is accepted in most operations, but it is not the first option for many when it comes to metrology and process control. This reluctance to automate has left many quality control strategies in the proverbial dark ages. Applying automation to your metrology processes can help increase measurement throughput, reduce the direct labor involved in measurement, and improve your overall quality. When deciding to automate your metrology process, it is helpful to consider a few key questions prior to making the investment:

  1. Can I achieve the accuracy and repeatability necessary with an automated solution?
  2. How fast does my measurement system need to be?
  3. What uptime do I need?
  4. How flexible does my system need to be?

Can I Achieve the Accuracy and Repeatability Necessary?

Before we go into how accuracy and repeatability relate to metrology performance, one must differentiate between the two concepts, as they are often misunderstood and misapplied. Simply put, a measurement device is accurate if it provides a reading that can be traced back to an accuracy standard by measuring a reference target, such as a sphere, or a more accurate comparison device. In metrology, accuracy is typically measured in microns (millionths of a meter). One example of accuracy to a standard is the testing Perceptron uses for our AccuSite system. AccuSite is tested to the ISO 10360-8 standard, ensuring the measurement device is accurate to 150 microns throughout the entire measurement volume. Accuracy tests like these are typically done on the plant floor as part of a system buyoff, compared against an acceptable tolerance. When you read an accuracy specification, make sure it applies to the entire measurement volume, not just the system’s sweet spot.

Repeatability is the ability of the measurement device to provide the same result, within a certain error, over several measurements. Repeatability is an equally important performance criterion and is subject to the same considerations as accuracy. Automated metrology systems typically have superior repeatability relative to manually operated systems due to the lack of operator influence on the measurement results. It is important that you purchase a device that will be repeatable enough to handle the bulk of your metrology tasks.
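
To illustrate the distinction, the short example below computes a bias (accuracy relative to a certified reference) and a 3-sigma repeatability from a set of repeated readings. The certified value and readings are invented for the example.

```python
import statistics

# Repeated measurements of a reference artifact whose certified value is known.
# Bias captures the systematic (accuracy) error; 3-sigma captures repeatability.

CERTIFIED_VALUE_MM = 50.000
readings_mm = [50.012, 50.009, 50.015, 50.011, 50.013, 50.010, 50.014]

bias_mm = statistics.mean(readings_mm) - CERTIFIED_VALUE_MM    # systematic error
repeatability_3s_mm = 3 * statistics.stdev(readings_mm)        # scatter of the device

print(f"bias: {bias_mm * 1000:.0f} microns")                   # -> 12 microns
print(f"3-sigma repeatability: {repeatability_3s_mm * 1000:.0f} microns")
```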

When trying to determine how accurate your device needs to be, work back from the measurements you are trying to make, and look at the tolerances for those measurements. Remember that higher accuracy devices typically carry a higher price tag and may require more maintenance to keep them as accurate as they were when you first installed them — it may not pay to buy the most accurate device available.

Systems like AccuSite have been designed to provide both high repeatability and micron level accuracy, ensuring that manufacturers do not have to compromise to get the highest accuracy and repeatability in their automated system.

Speed

Speed is essential in automated metrology as it is directly related to measurement throughput, but speed impacts other factors as well. In general, speed can impact the accuracy and repeatability of a measurement system. For example, let us consider a robotic measurement system where a non-contact sensor is mounted to an industrial robot. At slow speeds, the robot will not experience as much backlash, and will produce a better measurement result. Speed the robot up and the measurement quality may degrade significantly. When specifying your system, make sure you include a requirement for accuracy and repeatability at production speeds.

Another key question on speed is “how fast do you need to go?” Your in-line automated measurement cycle should be equal to or slightly faster than your fastest production cycle time, while still allowing you to measure all your KPIs in real-time. When deciding on a robotic measurement solution, consider all components that influence your productivity: cycle time, robot speed, and resource time and availability.
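
A quick feasibility check along those lines is sketched below, using hypothetical numbers for the production cycle, per-checkpoint measurement time, and overhead.

```python
# How many checkpoints fit within one production cycle? All figures are illustrative.

PRODUCTION_CYCLE_S = 60.0       # one part every 60 seconds
OVERHEAD_S = 8.0                # part transfer, clamping, robot home moves
SECONDS_PER_CHECKPOINT = 1.6    # move + settle + measure per checkpoint

available_s = PRODUCTION_CYCLE_S - OVERHEAD_S
max_checkpoints = int(available_s // SECONDS_PER_CHECKPOINT)
print(f"{max_checkpoints} checkpoints fit within the {PRODUCTION_CYCLE_S:.0f} s cycle")
# -> 32 checkpoints fit within the 60 s cycle
```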

Uptime

Uptime is related to throughput but is a standalone metric that is more closely related to performance. Uptime, simply put, is the amount of time that the device is running and available to make measurements. Uptime can be impacted by more than just the overall reliability of the system. Make sure you find out what the maintenance schedule is for the equipment, as you will most likely not be able to measure while preventive maintenance is being performed on the system. For hardware and data confidence, it is critical to ask your vendor what the maintenance requirements are. Uptime in an automated in-line system is even more critical because often the line cannot run unless the measurement system is functioning: measurement system downtime can equal production downtime.
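
As a simple illustration, the snippet below folds planned maintenance and unplanned downtime into a weekly uptime percentage. All figures are invented.

```python
# Weekly uptime percentage for a measurement system. Figures are illustrative.

SCHEDULED_MINUTES_PER_WEEK = 5 * 20 * 60   # five days, two ten-hour shifts
PLANNED_MAINTENANCE_MIN = 120              # weekly preventive maintenance window
UNPLANNED_DOWNTIME_MIN = 60                # faults, recalibration, etc.

available_min = SCHEDULED_MINUTES_PER_WEEK - PLANNED_MAINTENANCE_MIN - UNPLANNED_DOWNTIME_MIN
uptime_pct = 100.0 * available_min / SCHEDULED_MINUTES_PER_WEEK
print(f"uptime: {uptime_pct:.1f}%")
# -> uptime: 97.0%
```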

Flexibility

Flexibility is a key factor in your automation strategy and is often a main reason for automating. Today’s manufacturers are using concepts like palletized build, allowing them to run multiple part types down the same production line. In industrial automation, the word flexibility is often over-used. Everyone markets their solution as “flexible,” but what does flexibility truly mean? The answer is “it depends.” In fact, one could say that “flexibility is in the eye of the beholder.” Since flexibility is relative to the task and the facility where the system will be located, it is important that you don’t take vendor claims at face value when they position their solution as flexible, especially when they are referring to metrology solutions.

For a metrology device to be flexible it should, at a minimum, include the following attributes:

  1. Ease of programmability
  2. Ability to measure multiple parts and feature types with no targets or sprays required

Ease of programmability is key to flexible metrology, but it is often misrepresented. Watching a well-trained and skilled technician program a part on the fly during a demo is not a real representation of programmability, nor are the initial programs completed by a vendor during the install. A metrology device should only be considered easy to program if the programming can be done by trained plant personnel without support from the metrology supplier. If you must call your vendor and cut a purchase order every time you need to make a change, the system is not easy to program and should not be considered flexible. The on-site technician from the vendor makes things look easy because it is their job, and they have a lot of time on task. Think about your employee who takes training at the original install but then does not use the skill for six months. Ease of programmability ensures your employees can maximize the flexibility of the measurement system.

The ability to measure multiple parts and feature types is another key piece of the flexibility puzzle. It is imperative that you select a metrology device that will cover all your critical parts that require inspection. Of course, no system can measure everything, so you may at times need to apply the 80/20 rule to this flexibility requirement. The best way to know for sure if a system will measure what you need is to have the potential vendors measure some of your actual parts during the selection process. Allow the vendor to put together a nice presentation of the results, but also ask for the raw data. If the vendor is not eager to share the raw data with you, it should be a red flag about their solution.

Automation is a very valuable tool for modern manufacturers and every employee should look for areas to automate. Metrology is the perfect area for manufacturers to benefit from automation, but due diligence is required to make sure automation improves your overall productivity.

History of Automated In-Line Accurate Measurement

Looking Back ...

Prior to the advent of automated in-line measurement, the traditional dimensional quality control strategy in the automotive body shop relied on sampling production with off-line CMMs in temperature-controlled measurement rooms. The metrology science and techniques for touch probe contact measurement were developed in the 1970s by metrology engineers in collaboration with the CMM companies. The quality engineers operating the CMMs were highly trained metrology specialists. The absolute accuracy of a typical CMM in the automotive body shop could reach 0.010mm in a local area, but when assessed throughout the machine volume, a maximum error of 0.100mm was more often the reality with dual-arm configurations.

When Perceptron introduced plant-floor-hardened automated in-line measurement in the mid-1980s, the focus was on 100% measurement data and statistical process techniques for process variation reduction. The repeatability of the Perceptron technique was typically less than 0.100mm 3-sigma. The systems were good for relative measurement, typically achieving relative accuracy error on the order of 10% due to the crudely measured relation from sensor coordinates to part coordinates.

And so the debate over 100% measurement versus sampling began. One big question was what to do with the overload of data. Another was how much accuracy is enough. Data confidence also became a big challenge, as the laser optical techniques applying image processing were subject to influences that affected the results differently than the CMM touch probes. The desire to have traceability of the in-line measurements drove a process of correlating and offsetting the in-line measurements relative to the CMM, and this became a major effort for the quality engineers in the measurement rooms.

In the late 1980s, Perceptron invented and patented a technique for calibrating the in-line measurement stations directly into absolute coordinates. The technique made use of theodolites referenced to the part coordinate origin and a calibration target measurable by both the theodolites and the measurement sensor’s laser. The relation from sensor coordinates into absolute part coordinates was generated for each sensor, stored, and applied to the measurements. This technique typically achieved absolute accuracy within 0.250mm when applied to fixed-mounted sensors. This reduced the CMM correlation and offset process, but the differences between optical and touch probe techniques remained.

In the early 1990s, interest in flexible automation and measuring with robots positioning sensors, rather than fixed mounted sensors, for each checkpoint was growing—particularly in Japan and Korea. This was driven partly by the desire to run multiple models on a single line rather than single model dedicated tooling.

Error from robot repeatability and thermal drift had to be overcome, and Perceptron and Nissan developed high-accuracy measurement robots with rectilinear axes to allow straightforward linear thermal drift error correction. The measurement data was processed to optimize the numerically controlled tooling, an early instance of Industry 4.0-level automation and information exchange. This was followed by techniques for applying kinematic model-based thermal compensation to standard industrial robots to reduce measurement error caused by robot thermal drift. Absolute accuracy was initially still achieved by reference measurement techniques at each checkpoint, such as the theodolite or eventually the laser tracker, but results were never as accurate as with fixed-mounted sensors.
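
As a rough sketch of what linear thermal drift correction can look like (not Perceptron’s actual method), the example below subtracts a per-axis linear function of the temperature change from each measured coordinate. The drift coefficients are illustrative and would normally come from a calibration study.

```python
# Linear thermal drift correction: predicted drift scales with the temperature
# change from a reference and is subtracted per axis. Coefficients are hypothetical.

REFERENCE_TEMP_C = 20.0
DRIFT_MM_PER_C = {"x": 0.012, "y": 0.008, "z": 0.015}   # hypothetical per-axis drift rates

def compensate(point_mm: dict[str, float], temp_c: float) -> dict[str, float]:
    """Subtract the predicted thermal drift from a measured point."""
    dt = temp_c - REFERENCE_TEMP_C
    return {axis: value - DRIFT_MM_PER_C[axis] * dt for axis, value in point_mm.items()}

print(compensate({"x": 1250.120, "y": 430.055, "z": 880.310}, temp_c=27.5))
```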

During the early 2000s, techniques to calibrate robots into absolute coordinates and sustain that calibration were developed and refined, with the goal of simplifying the use of measurement robots and increasing the flexibility of the in-line measurement stations. The robot kinematic models and compensation techniques became more sophisticated and accurate. The industry-leading techniques developed by Perceptron to compensate for the absolute error of the robot TCP position and the relation from sensor coordinates to TCP coordinates to part coordinates could be relied on to achieve an absolute volumetric accuracy approaching 0.250mm. Standards were also developed and adopted for validating and comparing the volumetric accuracy of automated systems, such as ISO 10360-8.

More recently, Perceptron has pioneered major advances in optical measurement techniques and 3D point cloud laser sensors, such as the Helix sensor family. Helix was developed to produce measurements that exactly match the CMM touch probe techniques, virtually eliminating this long-standing correlation error factor. Perceptron developed self-learning software for compensating measurements so that plant-floor temperature-induced dimensional changes of the measured part do not influence the measurement results. Software for split-cycle configurations, where different checkpoints are measured on different cycles, has been introduced to maximize in-line checkpoint coverage. And off-line programming techniques, including the use of Digital Twins to fully simulate automated systems, have simplified the programming and maintenance of automated systems.